Mar 10 06:44:04 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 10 06:44:04 crc restorecon[4750]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 06:44:04 crc restorecon[4750]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 06:44:05 crc restorecon[4750]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc 
restorecon[4750]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 06:44:05 crc 
restorecon[4750]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 
06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to
system_u:object_r:container_file_t:s0:c14,c22 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 
06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 06:44:05 crc 
restorecon[4750]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc 
restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 06:44:05 crc restorecon[4750]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc 
restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 06:44:05 crc restorecon[4750]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 
crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc 
restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 06:44:06 crc 
restorecon[4750]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 06:44:06 crc restorecon[4750]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 06:44:06 crc restorecon[4750]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 06:44:06 crc restorecon[4750]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 10 06:44:08 crc kubenswrapper[4825]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 10 06:44:08 crc kubenswrapper[4825]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 10 06:44:08 crc kubenswrapper[4825]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 10 06:44:08 crc kubenswrapper[4825]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 10 06:44:08 crc kubenswrapper[4825]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 10 06:44:08 crc kubenswrapper[4825]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.421656 4825 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466221 4825 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466291 4825 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466301 4825 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466311 4825 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466322 4825 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466332 4825 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466341 4825 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466350 4825 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466359 4825 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466367 4825 
feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466376 4825 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466386 4825 feature_gate.go:330] unrecognized feature gate: Example Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466398 4825 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466409 4825 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466420 4825 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466430 4825 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466442 4825 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466450 4825 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466462 4825 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466475 4825 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466489 4825 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466499 4825 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466507 4825 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466517 4825 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466533 4825 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466542 4825 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466552 4825 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466561 4825 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466572 4825 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466580 4825 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466589 4825 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466597 4825 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466605 4825 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466614 4825 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466622 4825 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466639 4825 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466647 4825 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466657 4825 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466665 4825 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466674 4825 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466682 4825 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466690 4825 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466699 4825 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466707 4825 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466716 4825 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466725 4825 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466733 4825 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466742 4825 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466750 4825 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466761 4825 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466771 4825 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466782 4825 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466792 4825 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466801 4825 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466810 4825 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466818 4825 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466829 4825 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466838 4825 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466846 4825 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466854 4825 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466863 4825 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466873 4825 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466881 4825 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466890 4825 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466898 4825 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466906 4825 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466915 4825 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466923 4825 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466931 4825 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466945 4825 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.466955 4825 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467171 4825 flags.go:64] FLAG: --address="0.0.0.0"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467193 4825 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467212 4825 flags.go:64] FLAG: --anonymous-auth="true"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467224 4825 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467237 4825 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467248 4825 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467270 4825 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467284 4825 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467294 4825 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467304 4825 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467315 4825 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467362 4825 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467375 4825 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467385 4825 flags.go:64] FLAG: --cgroup-root=""
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467395 4825 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467405 4825 flags.go:64] FLAG: --client-ca-file=""
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467414 4825 flags.go:64] FLAG: --cloud-config=""
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467424 4825 flags.go:64] FLAG: --cloud-provider=""
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467434 4825 flags.go:64] FLAG: --cluster-dns="[]"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467446 4825 flags.go:64] FLAG: --cluster-domain=""
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467456 4825 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467466 4825 flags.go:64] FLAG: --config-dir=""
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467477 4825 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467488 4825 flags.go:64] FLAG: --container-log-max-files="5"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467500 4825 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467510 4825 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467520 4825 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467531 4825 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467541 4825 flags.go:64] FLAG: --contention-profiling="false"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467551 4825 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467561 4825 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467572 4825 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467582 4825 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467594 4825 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467604 4825 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467614 4825 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467630 4825 flags.go:64] FLAG: --enable-load-reader="false"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467640 4825 flags.go:64] FLAG: --enable-server="true"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467650 4825 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467664 4825 flags.go:64] FLAG: --event-burst="100"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467675 4825 flags.go:64] FLAG: --event-qps="50"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467685 4825 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467695 4825 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467705 4825 flags.go:64] FLAG: --eviction-hard=""
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467717 4825 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467727 4825 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467737 4825 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467748 4825 flags.go:64] FLAG: --eviction-soft=""
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467758 4825 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467785 4825 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467795 4825 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467804 4825 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467814 4825 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467824 4825 flags.go:64] FLAG: --fail-swap-on="true"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467834 4825 flags.go:64] FLAG: --feature-gates=""
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467845 4825 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467857 4825 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467867 4825 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467878 4825 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467888 4825 flags.go:64] FLAG: --healthz-port="10248"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467898 4825 flags.go:64] FLAG: --help="false"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467908 4825 flags.go:64] FLAG: --hostname-override=""
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467918 4825 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467928 4825 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467938 4825 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467948 4825 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467957 4825 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467967 4825 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467977 4825 flags.go:64] FLAG: --image-service-endpoint=""
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467986 4825 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.467996 4825 flags.go:64] FLAG: --kube-api-burst="100"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468006 4825 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468016 4825 flags.go:64] FLAG: --kube-api-qps="50"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468026 4825 flags.go:64] FLAG: --kube-reserved=""
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468036 4825 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468045 4825 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468056 4825 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468066 4825 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468075 4825 flags.go:64] FLAG: --lock-file=""
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468084 4825 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468094 4825 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468106 4825 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468120 4825 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468167 4825 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468178 4825 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468188 4825 flags.go:64] FLAG: --logging-format="text"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468198 4825 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468209 4825 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468219 4825 flags.go:64] FLAG: --manifest-url=""
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468229 4825 flags.go:64] FLAG: --manifest-url-header=""
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468246 4825 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468256 4825 flags.go:64] FLAG: --max-open-files="1000000"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468268 4825 flags.go:64] FLAG: --max-pods="110"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468278 4825 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468288 4825 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468298 4825 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468309 4825 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468319 4825 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468328 4825 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468338 4825 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468361 4825 flags.go:64] FLAG: --node-status-max-images="50"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468371 4825 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468381 4825 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468391 4825 flags.go:64] FLAG: --pod-cidr=""
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468400 4825 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468414 4825 flags.go:64] FLAG: --pod-manifest-path=""
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468424 4825 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468434 4825 flags.go:64] FLAG: --pods-per-core="0"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468444 4825 flags.go:64] FLAG: --port="10250"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468454 4825 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468464 4825 flags.go:64] FLAG: --provider-id=""
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468475 4825 flags.go:64] FLAG: --qos-reserved=""
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468484 4825 flags.go:64] FLAG: --read-only-port="10255"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468500 4825 flags.go:64] FLAG: --register-node="true"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468510 4825 flags.go:64] FLAG: --register-schedulable="true"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468519 4825 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468536 4825 flags.go:64] FLAG: --registry-burst="10"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468545 4825 flags.go:64] FLAG: --registry-qps="5"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468555 4825 flags.go:64] FLAG: --reserved-cpus=""
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468566 4825 flags.go:64] FLAG: --reserved-memory=""
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468579 4825 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468589 4825 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468599 4825 flags.go:64] FLAG: --rotate-certificates="false"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468609 4825 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468619 4825 flags.go:64] FLAG: --runonce="false"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468629 4825 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468638 4825 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468655 4825 flags.go:64] FLAG: --seccomp-default="false"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468664 4825 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468674 4825 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468684 4825 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468693 4825 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468704 4825 flags.go:64] FLAG: --storage-driver-password="root"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468714 4825 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468723 4825 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468733 4825 flags.go:64] FLAG: --storage-driver-user="root"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468743 4825 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468753 4825 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468763 4825 flags.go:64] FLAG: --system-cgroups=""
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468773 4825 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468788 4825 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468801 4825 flags.go:64] FLAG: --tls-cert-file=""
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468811 4825 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468823 4825 flags.go:64] FLAG: --tls-min-version=""
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468832 4825 flags.go:64] FLAG: --tls-private-key-file=""
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468846 4825 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468857 4825 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468868 4825 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468879 4825 flags.go:64] FLAG: --v="2"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468893 4825 flags.go:64] FLAG: --version="false"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468905 4825 flags.go:64] FLAG: --vmodule=""
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468917 4825 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.468927 4825 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469188 4825 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469199 4825 feature_gate.go:330] unrecognized feature gate: Example
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469211 4825 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469223 4825 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469233 4825 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469243 4825 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469252 4825 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469261 4825 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469269 4825 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469278 4825 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469286 4825 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469295 4825 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469304 4825 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469312 4825 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469321 4825 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469329 4825 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469338 4825 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469347 4825 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469355 4825 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469364 4825 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469374 4825 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469383 4825 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469392 4825 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469400 4825 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469414 4825 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469423 4825 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469432 4825 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469440 4825 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469449 4825 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469464 4825 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469472 4825 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469482 4825 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469490 4825 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469498 4825 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469507 4825 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469515 4825 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469527 4825 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469538 4825 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469548 4825 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469558 4825 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469568 4825 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469576 4825 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469586 4825 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469594 4825 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469603 4825 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469613 4825 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469623 4825 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469632 4825 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469641 4825 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469649 4825 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469657 4825 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469669 4825 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469680 4825 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469689 4825 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469698 4825 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469707 4825 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469720 4825 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469730 4825 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469739 4825 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469748 4825 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469757 4825 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469766 4825 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469775 4825 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469783 4825 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469792 4825 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469801 4825 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469809 4825 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469817 4825 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469826 4825 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469837 4825 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.469848 4825 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.469863 4825 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.513061 4825 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.513591 4825 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.513738 4825 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.513754 4825 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.513764 4825 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 10
06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.513773 4825 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.513782 4825 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.513790 4825 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.513799 4825 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.513808 4825 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.513820 4825 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.513834 4825 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.513843 4825 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.513852 4825 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.513861 4825 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.513869 4825 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.513878 4825 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.513909 4825 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.513922 4825 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. 
It will be removed in a future release. Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.513934 4825 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.513944 4825 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.513954 4825 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.513965 4825 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.513974 4825 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.513984 4825 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.513996 4825 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514007 4825 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514016 4825 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514026 4825 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514035 4825 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514044 4825 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514053 4825 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514062 4825 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514070 4825 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514079 4825 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514089 4825 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514100 4825 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514109 4825 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514118 4825 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514126 4825 feature_gate.go:330] unrecognized feature gate: Example
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514165 4825 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514176 4825 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514188 4825 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514197 4825 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514205 4825 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514213 4825 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514222 4825 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514231 4825 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514240 4825 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514249 4825 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514257 4825 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514268 4825 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514276 4825 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514285 4825 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514294 4825 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514302 4825 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514310 4825 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514320 4825 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514332 4825 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514342 4825 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514351 4825 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514360 4825 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514368 4825 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514377 4825 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514386 4825 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514394 4825 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514403 4825 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514411 4825 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514419 4825 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514428 4825 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514436 4825 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514444 4825 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514455 4825 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.514470 4825 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514782 4825 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514797 4825 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514807 4825 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514816 4825 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514826 4825 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514837 4825 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514849 4825 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514859 4825 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514869 4825 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514880 4825 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514890 4825 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514900 4825 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514909 4825 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514917 4825 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514927 4825 feature_gate.go:330] unrecognized feature gate: Example
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514936 4825 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514945 4825 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514954 4825 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514962 4825 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514971 4825 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514980 4825 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514988 4825 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.514996 4825 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515005 4825 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515013 4825 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515022 4825 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515030 4825 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515039 4825 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515047 4825 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515056 4825 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515064 4825 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515073 4825 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515081 4825 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515090 4825 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515099 4825 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515108 4825 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515119 4825 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515160 4825 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515172 4825 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515184 4825 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515195 4825 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515206 4825 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515219 4825 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515230 4825 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515241 4825 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515249 4825 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515258 4825 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515266 4825 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515275 4825 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515305 4825 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515314 4825 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515322 4825 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515331 4825 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515339 4825 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515347 4825 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515356 4825 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515365 4825 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515373 4825 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515382 4825 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515393 4825 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515404 4825 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515415 4825 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515426 4825 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515436 4825 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515445 4825 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515454 4825 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515463 4825 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515472 4825 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515481 4825 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515489 4825 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 10 06:44:08 crc kubenswrapper[4825]: W0310 06:44:08.515499 4825 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.515511 4825 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.515842 4825 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 10 06:44:08 crc kubenswrapper[4825]: E0310 06:44:08.523118 4825 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.535724 4825 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.535896 4825 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.542430 4825 server.go:997] "Starting client certificate rotation"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.542488 4825 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.542657 4825 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.650672 4825 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.655098 4825 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 10 06:44:08 crc kubenswrapper[4825]: E0310 06:44:08.655109 4825 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.222:6443: connect: connection refused" logger="UnhandledError"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.754758 4825 log.go:25] "Validated CRI v1 runtime API"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.965800 4825 log.go:25] "Validated CRI v1 image API"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.968691 4825 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.993094 4825 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-10-06-39-24-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Mar 10 06:44:08 crc kubenswrapper[4825]: I0310 06:44:08.993158 4825 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.025871 4825 manager.go:217] Machine: {Timestamp:2026-03-10 06:44:09.006946761 +0000 UTC m=+2.036727406 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:4821b499-aedc-4e42-be1d-4a415e5b2f81 BootID:7bc7fa84-cf6d-4b6d-8f9a-118c306e0760 Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:4c:2c:89 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:4c:2c:89 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:70:37:f4 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:9e:fd:9b Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:21:ba:de Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:23:8f:c0 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:ff:f6:78 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:aa:87:1a:f4:7b:f1 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:fe:c1:03:f1:03:1f Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.026158 4825 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.026438 4825 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.027120 4825 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.027406 4825 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.027477 4825 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.027713 4825 topology_manager.go:138] "Creating topology manager with none policy"
Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.027723 4825 container_manager_linux.go:303] "Creating device plugin manager"
Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.028383 4825 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.028419 4825 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.028706 4825 state_mem.go:36] "Initialized new in-memory state store"
Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.028804 4825 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.053010 4825 kubelet.go:418] "Attempting to sync node with API server"
Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.053057 4825 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.053077 4825 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.053092 4825 kubelet.go:324] "Adding apiserver pod source"
Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.053108 4825 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 10 06:44:09 crc kubenswrapper[4825]: W0310 06:44:09.063086 4825 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.222:6443: connect: connection refused
Mar 10 06:44:09 crc kubenswrapper[4825]: E0310 06:44:09.063215 4825 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.222:6443: connect: connection refused" logger="UnhandledError"
Mar 10 06:44:09 crc kubenswrapper[4825]: W0310 06:44:09.063355 4825 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.222:6443: connect: connection refused
Mar 10 06:44:09 crc kubenswrapper[4825]: E0310 06:44:09.063392 4825 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.222:6443: connect: connection refused" logger="UnhandledError"
Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.065110 4825 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.068116 4825 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.072319 4825 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.095081 4825 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.095174 4825 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.095188 4825 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.095198 4825 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.095215 4825 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.095226 4825 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.095236 4825 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.095254 4825 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.095270 4825 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.095283 4825 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.095302 4825 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.095312 4825 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.104906 4825 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.105626 4825 server.go:1280] "Started kubelet" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.105900 4825 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.106076 4825 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.106194 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.222:6443: connect: connection refused Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.110082 4825 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 10 06:44:09 crc systemd[1]: Started Kubernetes Kubelet. Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.115055 4825 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.115154 4825 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 10 06:44:09 crc kubenswrapper[4825]: E0310 06:44:09.115420 4825 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.116235 4825 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.116268 4825 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 10 06:44:09 crc kubenswrapper[4825]: E0310 06:44:09.116271 4825 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.222:6443: connect: 
connection refused" interval="200ms" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.116360 4825 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.116683 4825 server.go:460] "Adding debug handlers to kubelet server" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.116958 4825 factory.go:55] Registering systemd factory Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.117017 4825 factory.go:221] Registration of the systemd container factory successfully Mar 10 06:44:09 crc kubenswrapper[4825]: W0310 06:44:09.117039 4825 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.222:6443: connect: connection refused Mar 10 06:44:09 crc kubenswrapper[4825]: E0310 06:44:09.117245 4825 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.222:6443: connect: connection refused" logger="UnhandledError" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.122600 4825 factory.go:153] Registering CRI-O factory Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.122669 4825 factory.go:221] Registration of the crio container factory successfully Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.122850 4825 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.122892 4825 factory.go:103] Registering Raw factory Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 
06:44:09.122915 4825 manager.go:1196] Started watching for new ooms in manager Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.123806 4825 manager.go:319] Starting recovery of all containers Mar 10 06:44:09 crc kubenswrapper[4825]: E0310 06:44:09.123090 4825 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.222:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189b67d67648e1b8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:09.10557228 +0000 UTC m=+2.135352895,LastTimestamp:2026-03-10 06:44:09.10557228 +0000 UTC m=+2.135352895,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.144074 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.144292 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.144374 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" 
seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.144443 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.144520 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.144590 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.144658 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.144731 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.144895 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.145007 4825 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.145082 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.145171 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.145241 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.145313 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.145380 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.147448 4825 manager.go:324] Recovery completed Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.148223 
4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.148318 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.148374 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.148403 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.148431 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.148476 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.148499 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.148541 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.148572 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.148602 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.148649 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.148710 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.148748 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.148790 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.148818 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.148850 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.148889 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.148918 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.148959 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.148989 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.149014 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.149052 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.149085 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.149176 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.149318 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" 
seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.149344 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.149378 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.149404 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.149436 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.149462 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.149485 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.149517 4825 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.149541 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.149566 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.149601 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.149624 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.149655 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.149687 4825 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.149722 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.149757 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.149785 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.149817 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.149903 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.149935 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.149959 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.149990 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.150011 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.150036 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.150066 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.150089 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" 
volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.150123 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.150177 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.150203 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.150232 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.150254 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.150283 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.150308 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.150332 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.150362 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.150384 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.150405 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.150434 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.150454 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.150485 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.150516 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.150537 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.150565 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.150588 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.150616 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.150639 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.150660 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.150687 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.150742 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.150778 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.150799 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.150822 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.150856 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.150878 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.150905 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.151587 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.151744 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.151881 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.151975 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.152092 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.152250 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.152376 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.152498 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.152621 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.152742 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.153978 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.154113 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.154285 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" 
seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.154383 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.154489 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.154596 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.154715 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.154820 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.154913 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.155010 4825 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.155099 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.155221 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.155313 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.155409 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.155499 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.155594 4825 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.155695 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.155785 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.162360 4825 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.162459 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.162492 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.162515 4825 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.162538 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.162561 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.162582 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.162604 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.162626 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.162645 4825 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.162669 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.162721 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.162743 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.162765 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.162785 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.162807 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.162830 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.162851 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.162873 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.162896 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.162917 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.162939 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" 
seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.162960 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.162983 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.163005 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.163050 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.163073 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.163096 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 10 
06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.163121 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.163203 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.163223 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.163245 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.163268 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.163291 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.163315 4825 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.163336 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.163358 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.163378 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.163398 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.163419 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.163439 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.163459 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.163481 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.163502 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.163524 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.163546 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.163567 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.163589 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.163610 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.163632 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.163654 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.163675 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.163695 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" 
seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.163715 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.163737 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.163758 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.163780 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.163800 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.163821 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: 
I0310 06:44:09.163842 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.163863 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.163884 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.163908 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.163929 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.163950 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.163970 4825 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.163989 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.164011 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.164033 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.164053 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.164076 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.164098 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.164118 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.164168 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.164233 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.164258 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.164279 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.164300 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" 
seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.164321 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.164342 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.164363 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.164386 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.164407 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.164428 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: 
I0310 06:44:09.164448 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.164470 4825 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.164489 4825 reconstruct.go:97] "Volume reconstruction finished" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.164505 4825 reconciler.go:26] "Reconciler: start to sync state" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.168832 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.178789 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.178882 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.178909 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.180391 4825 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.180414 4825 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.180443 4825 state_mem.go:36] "Initialized new in-memory state store" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.191087 4825 kubelet_network_linux.go:50] 
"Initialized iptables rules." protocol="IPv4" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.193385 4825 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 10 06:44:09 crc kubenswrapper[4825]: E0310 06:44:09.215870 4825 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.234978 4825 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.235155 4825 kubelet.go:2335] "Starting kubelet main sync loop" Mar 10 06:44:09 crc kubenswrapper[4825]: E0310 06:44:09.235324 4825 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 10 06:44:09 crc kubenswrapper[4825]: W0310 06:44:09.236799 4825 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.222:6443: connect: connection refused Mar 10 06:44:09 crc kubenswrapper[4825]: E0310 06:44:09.236869 4825 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.222:6443: connect: connection refused" logger="UnhandledError" Mar 10 06:44:09 crc kubenswrapper[4825]: E0310 06:44:09.316404 4825 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 06:44:09 crc kubenswrapper[4825]: E0310 06:44:09.317204 4825 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.222:6443: connect: connection refused" interval="400ms" Mar 10 06:44:09 crc kubenswrapper[4825]: E0310 06:44:09.336198 4825 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.347662 4825 policy_none.go:49] "None policy: Start" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.349029 4825 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.349064 4825 state_mem.go:35] "Initializing new in-memory state store" Mar 10 06:44:09 crc kubenswrapper[4825]: E0310 06:44:09.417193 4825 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.459626 4825 manager.go:334] "Starting Device Plugin manager" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.459807 4825 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.459829 4825 server.go:79] "Starting device plugin registration server" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.460390 4825 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.460411 4825 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 10 06:44:09 crc kubenswrapper[4825]: E0310 06:44:09.469879 4825 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.472498 4825 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" 
Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.472679 4825 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.472695 4825 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.536656 4825 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.536820 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.538998 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.539063 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.539085 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.539385 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.539544 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.539609 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.541279 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.541314 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.541330 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.541659 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.541676 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.541687 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.541839 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.542323 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.542359 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.543396 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.543438 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.543460 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.543538 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.543557 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.543566 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.543661 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.543939 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.544015 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.544560 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.544596 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.544612 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.544807 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.544939 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.544969 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.545077 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.545119 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.545173 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.545673 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.545689 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.545698 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.545892 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.545905 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.545914 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.546027 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.546047 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.547126 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.547191 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.547208 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.560769 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.562756 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.562821 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.562840 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.562880 4825 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 06:44:09 crc kubenswrapper[4825]: E0310 06:44:09.563565 4825 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.222:6443: connect: connection refused" node="crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.702968 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.703019 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.703043 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.703058 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.703075 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.703091 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod 
\"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.703121 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.703163 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.703179 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.703193 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.703447 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.703488 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.703503 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.703523 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.703541 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: E0310 06:44:09.718705 4825 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 
38.102.83.222:6443: connect: connection refused" interval="800ms" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.764120 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.765912 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.765977 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.765991 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.766027 4825 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 06:44:09 crc kubenswrapper[4825]: E0310 06:44:09.766861 4825 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.222:6443: connect: connection refused" node="crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.805612 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.805689 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.805725 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.805760 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.805795 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.805826 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.805854 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.805875 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.805902 4825 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.805934 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.805978 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.806003 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.806002 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.806000 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 06:44:09 crc 
kubenswrapper[4825]: I0310 06:44:09.805895 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.806055 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.806125 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.806210 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.806225 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.806247 4825 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.806259 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.806290 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.806326 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.806334 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.806347 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod 
\"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.806363 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.806401 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.806418 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.806418 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.806507 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.877534 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.899818 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.922020 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.930655 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: I0310 06:44:09.936383 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 06:44:09 crc kubenswrapper[4825]: W0310 06:44:09.968494 4825 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.222:6443: connect: connection refused Mar 10 06:44:09 crc kubenswrapper[4825]: E0310 06:44:09.969014 4825 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.222:6443: connect: connection refused" logger="UnhandledError" Mar 10 06:44:09 crc kubenswrapper[4825]: W0310 06:44:09.980099 4825 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.222:6443: connect: connection refused Mar 10 06:44:09 crc kubenswrapper[4825]: E0310 06:44:09.980200 4825 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.222:6443: connect: connection refused" logger="UnhandledError" Mar 10 06:44:10 crc kubenswrapper[4825]: W0310 06:44:10.045199 4825 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.222:6443: connect: connection refused Mar 10 06:44:10 crc kubenswrapper[4825]: E0310 06:44:10.045314 4825 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.222:6443: connect: connection refused" logger="UnhandledError" Mar 10 06:44:10 crc kubenswrapper[4825]: W0310 06:44:10.108108 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-b5b063bb204a52a5e2e022b34fe62c235d8274c9edd44fec8bdc0dc7b0fac32a WatchSource:0}: Error finding container b5b063bb204a52a5e2e022b34fe62c235d8274c9edd44fec8bdc0dc7b0fac32a: Status 404 returned error can't find the container with id b5b063bb204a52a5e2e022b34fe62c235d8274c9edd44fec8bdc0dc7b0fac32a Mar 10 06:44:10 crc kubenswrapper[4825]: W0310 06:44:10.108662 4825 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-622b177ec448c672113021ba0881625ae2b6ccb87a952e4375eb28a775946019 WatchSource:0}: Error finding container 622b177ec448c672113021ba0881625ae2b6ccb87a952e4375eb28a775946019: Status 404 returned error can't find the container with id 622b177ec448c672113021ba0881625ae2b6ccb87a952e4375eb28a775946019 Mar 10 06:44:10 crc kubenswrapper[4825]: I0310 06:44:10.110489 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.222:6443: connect: connection refused Mar 10 06:44:10 crc kubenswrapper[4825]: W0310 06:44:10.110780 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-61ddee46ffbb64a3261f3ddbf33f0895255ec579c4fe4fcc9ffa1cc42b9a2b07 WatchSource:0}: Error finding container 61ddee46ffbb64a3261f3ddbf33f0895255ec579c4fe4fcc9ffa1cc42b9a2b07: Status 404 returned error can't find the container with id 61ddee46ffbb64a3261f3ddbf33f0895255ec579c4fe4fcc9ffa1cc42b9a2b07 Mar 10 06:44:10 crc kubenswrapper[4825]: W0310 06:44:10.111861 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-bdab08577448b2580557cbc68f2b7a7be0f9ab5dfbd67aac77339a7787e2c451 WatchSource:0}: Error finding container bdab08577448b2580557cbc68f2b7a7be0f9ab5dfbd67aac77339a7787e2c451: Status 404 returned error can't find the container with id bdab08577448b2580557cbc68f2b7a7be0f9ab5dfbd67aac77339a7787e2c451 Mar 10 06:44:10 crc kubenswrapper[4825]: W0310 06:44:10.114898 4825 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-92cd9fb986fdd8fdaa1047f81c34aa349aa6343b0559b53acd26f61a19d1dc65 WatchSource:0}: Error finding container 92cd9fb986fdd8fdaa1047f81c34aa349aa6343b0559b53acd26f61a19d1dc65: Status 404 returned error can't find the container with id 92cd9fb986fdd8fdaa1047f81c34aa349aa6343b0559b53acd26f61a19d1dc65 Mar 10 06:44:10 crc kubenswrapper[4825]: I0310 06:44:10.167957 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:10 crc kubenswrapper[4825]: I0310 06:44:10.170617 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:10 crc kubenswrapper[4825]: I0310 06:44:10.170674 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:10 crc kubenswrapper[4825]: I0310 06:44:10.170693 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:10 crc kubenswrapper[4825]: I0310 06:44:10.170734 4825 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 06:44:10 crc kubenswrapper[4825]: E0310 06:44:10.171372 4825 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.222:6443: connect: connection refused" node="crc" Mar 10 06:44:10 crc kubenswrapper[4825]: I0310 06:44:10.240710 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"61ddee46ffbb64a3261f3ddbf33f0895255ec579c4fe4fcc9ffa1cc42b9a2b07"} Mar 10 06:44:10 crc kubenswrapper[4825]: I0310 06:44:10.242720 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"bdab08577448b2580557cbc68f2b7a7be0f9ab5dfbd67aac77339a7787e2c451"} Mar 10 06:44:10 crc kubenswrapper[4825]: I0310 06:44:10.244501 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"622b177ec448c672113021ba0881625ae2b6ccb87a952e4375eb28a775946019"} Mar 10 06:44:10 crc kubenswrapper[4825]: I0310 06:44:10.246080 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b5b063bb204a52a5e2e022b34fe62c235d8274c9edd44fec8bdc0dc7b0fac32a"} Mar 10 06:44:10 crc kubenswrapper[4825]: I0310 06:44:10.247095 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"92cd9fb986fdd8fdaa1047f81c34aa349aa6343b0559b53acd26f61a19d1dc65"} Mar 10 06:44:10 crc kubenswrapper[4825]: W0310 06:44:10.331474 4825 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.222:6443: connect: connection refused Mar 10 06:44:10 crc kubenswrapper[4825]: E0310 06:44:10.331623 4825 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.222:6443: connect: connection refused" logger="UnhandledError" Mar 10 06:44:10 crc kubenswrapper[4825]: E0310 06:44:10.519821 4825 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.222:6443: connect: connection refused" interval="1.6s" Mar 10 06:44:10 crc kubenswrapper[4825]: I0310 06:44:10.669989 4825 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 10 06:44:10 crc kubenswrapper[4825]: E0310 06:44:10.671561 4825 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.222:6443: connect: connection refused" logger="UnhandledError" Mar 10 06:44:10 crc kubenswrapper[4825]: I0310 06:44:10.971718 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:10 crc kubenswrapper[4825]: I0310 06:44:10.973638 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:10 crc kubenswrapper[4825]: I0310 06:44:10.973713 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:10 crc kubenswrapper[4825]: I0310 06:44:10.973731 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:10 crc kubenswrapper[4825]: I0310 06:44:10.973763 4825 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 06:44:10 crc kubenswrapper[4825]: E0310 06:44:10.974577 4825 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.222:6443: connect: connection refused" node="crc" Mar 10 06:44:11 crc kubenswrapper[4825]: I0310 06:44:11.109931 4825 csi_plugin.go:884] Failed to contact 
API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.222:6443: connect: connection refused Mar 10 06:44:11 crc kubenswrapper[4825]: W0310 06:44:11.949557 4825 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.222:6443: connect: connection refused Mar 10 06:44:11 crc kubenswrapper[4825]: E0310 06:44:11.949656 4825 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.222:6443: connect: connection refused" logger="UnhandledError" Mar 10 06:44:11 crc kubenswrapper[4825]: W0310 06:44:11.981188 4825 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.222:6443: connect: connection refused Mar 10 06:44:11 crc kubenswrapper[4825]: E0310 06:44:11.981335 4825 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.222:6443: connect: connection refused" logger="UnhandledError" Mar 10 06:44:12 crc kubenswrapper[4825]: I0310 06:44:12.110611 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.222:6443: connect: connection refused Mar 10 06:44:12 crc 
kubenswrapper[4825]: E0310 06:44:12.121233 4825 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.222:6443: connect: connection refused" interval="3.2s" Mar 10 06:44:12 crc kubenswrapper[4825]: W0310 06:44:12.244905 4825 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.222:6443: connect: connection refused Mar 10 06:44:12 crc kubenswrapper[4825]: E0310 06:44:12.245025 4825 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.222:6443: connect: connection refused" logger="UnhandledError" Mar 10 06:44:12 crc kubenswrapper[4825]: I0310 06:44:12.255479 4825 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="32bdd291a7cfb7e9cc1a02cdd529f56074dae35c2c242dee8461075541c7edf7" exitCode=0 Mar 10 06:44:12 crc kubenswrapper[4825]: I0310 06:44:12.255761 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:12 crc kubenswrapper[4825]: I0310 06:44:12.256527 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"32bdd291a7cfb7e9cc1a02cdd529f56074dae35c2c242dee8461075541c7edf7"} Mar 10 06:44:12 crc kubenswrapper[4825]: I0310 06:44:12.257616 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:12 crc kubenswrapper[4825]: I0310 
06:44:12.257653 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:12 crc kubenswrapper[4825]: I0310 06:44:12.257669 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:12 crc kubenswrapper[4825]: I0310 06:44:12.260422 4825 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72" exitCode=0 Mar 10 06:44:12 crc kubenswrapper[4825]: I0310 06:44:12.260541 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72"} Mar 10 06:44:12 crc kubenswrapper[4825]: I0310 06:44:12.260626 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:12 crc kubenswrapper[4825]: I0310 06:44:12.262324 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:12 crc kubenswrapper[4825]: I0310 06:44:12.262385 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:12 crc kubenswrapper[4825]: I0310 06:44:12.262401 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:12 crc kubenswrapper[4825]: I0310 06:44:12.264267 4825 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46" exitCode=0 Mar 10 06:44:12 crc kubenswrapper[4825]: I0310 06:44:12.264363 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46"} Mar 10 06:44:12 crc kubenswrapper[4825]: I0310 06:44:12.264407 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:12 crc kubenswrapper[4825]: I0310 06:44:12.266267 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:12 crc kubenswrapper[4825]: I0310 06:44:12.266332 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:12 crc kubenswrapper[4825]: I0310 06:44:12.266351 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:12 crc kubenswrapper[4825]: I0310 06:44:12.269285 4825 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc" exitCode=0 Mar 10 06:44:12 crc kubenswrapper[4825]: I0310 06:44:12.269419 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc"} Mar 10 06:44:12 crc kubenswrapper[4825]: I0310 06:44:12.269453 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:12 crc kubenswrapper[4825]: I0310 06:44:12.271911 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:12 crc kubenswrapper[4825]: I0310 06:44:12.271958 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:12 crc kubenswrapper[4825]: I0310 06:44:12.271974 4825 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:12 crc kubenswrapper[4825]: I0310 06:44:12.278276 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ebd7c7e7538a870d8bf580b8b01161607bae4c06941a4db2895dc1ee86f3b988"} Mar 10 06:44:12 crc kubenswrapper[4825]: I0310 06:44:12.278318 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d5da17071874543156df5adb8dee5973b11684f66e728934c3353e76eaad702e"} Mar 10 06:44:12 crc kubenswrapper[4825]: I0310 06:44:12.278336 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"311d715e54fdfe9fcdda7648d431fa69ca054e330fd9415f9c4819462508b9ae"} Mar 10 06:44:12 crc kubenswrapper[4825]: I0310 06:44:12.281029 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:12 crc kubenswrapper[4825]: I0310 06:44:12.282296 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:12 crc kubenswrapper[4825]: I0310 06:44:12.282341 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:12 crc kubenswrapper[4825]: I0310 06:44:12.282352 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:12 crc kubenswrapper[4825]: W0310 06:44:12.428841 4825 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.222:6443: connect: connection refused Mar 10 06:44:12 crc kubenswrapper[4825]: E0310 06:44:12.429339 4825 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.222:6443: connect: connection refused" logger="UnhandledError" Mar 10 06:44:12 crc kubenswrapper[4825]: I0310 06:44:12.575354 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:12 crc kubenswrapper[4825]: I0310 06:44:12.576842 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:12 crc kubenswrapper[4825]: I0310 06:44:12.576914 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:12 crc kubenswrapper[4825]: I0310 06:44:12.576927 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:12 crc kubenswrapper[4825]: I0310 06:44:12.577140 4825 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 06:44:12 crc kubenswrapper[4825]: E0310 06:44:12.577754 4825 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.222:6443: connect: connection refused" node="crc" Mar 10 06:44:13 crc kubenswrapper[4825]: I0310 06:44:13.110405 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.222:6443: connect: connection refused Mar 10 06:44:13 crc kubenswrapper[4825]: 
I0310 06:44:13.288360 4825 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8" exitCode=0 Mar 10 06:44:13 crc kubenswrapper[4825]: I0310 06:44:13.288529 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8"} Mar 10 06:44:13 crc kubenswrapper[4825]: I0310 06:44:13.288713 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:13 crc kubenswrapper[4825]: I0310 06:44:13.290244 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:13 crc kubenswrapper[4825]: I0310 06:44:13.290306 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:13 crc kubenswrapper[4825]: I0310 06:44:13.290322 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:13 crc kubenswrapper[4825]: I0310 06:44:13.291452 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"4f8b02103b8486962e24ee333889dc83acc4379b367dc12c20ca11602879239e"} Mar 10 06:44:13 crc kubenswrapper[4825]: I0310 06:44:13.291522 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:13 crc kubenswrapper[4825]: I0310 06:44:13.292905 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:13 crc kubenswrapper[4825]: I0310 06:44:13.292957 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 06:44:13 crc kubenswrapper[4825]: I0310 06:44:13.292977 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:13 crc kubenswrapper[4825]: I0310 06:44:13.298850 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce"} Mar 10 06:44:13 crc kubenswrapper[4825]: I0310 06:44:13.298912 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e"} Mar 10 06:44:13 crc kubenswrapper[4825]: I0310 06:44:13.298927 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7"} Mar 10 06:44:13 crc kubenswrapper[4825]: I0310 06:44:13.298938 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3"} Mar 10 06:44:13 crc kubenswrapper[4825]: I0310 06:44:13.316898 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c1f2bc1824721aecbf1494421ff4b33b65f910bb791a4eb7a591aa8eb455b958"} Mar 10 06:44:13 crc kubenswrapper[4825]: I0310 06:44:13.317008 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:13 crc 
kubenswrapper[4825]: I0310 06:44:13.319603 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:13 crc kubenswrapper[4825]: I0310 06:44:13.319644 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:13 crc kubenswrapper[4825]: I0310 06:44:13.319658 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:13 crc kubenswrapper[4825]: I0310 06:44:13.323081 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2487dfed5a84ab11961ba04d4360e5d0e211ddc42e7e88b3b2ad04ae3a9d34e6"} Mar 10 06:44:13 crc kubenswrapper[4825]: I0310 06:44:13.323122 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"46d0e4f89d7c516f3c7534ab4f6edad77bce567d389d9394d2b57425ccffdde7"} Mar 10 06:44:13 crc kubenswrapper[4825]: I0310 06:44:13.323152 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2ac78bed04e5a12d272784fc14809731a05de5be9d952c39ac4ffaa40f8589ef"} Mar 10 06:44:13 crc kubenswrapper[4825]: I0310 06:44:13.323268 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:13 crc kubenswrapper[4825]: I0310 06:44:13.324232 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:13 crc kubenswrapper[4825]: I0310 06:44:13.324268 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 
06:44:13 crc kubenswrapper[4825]: I0310 06:44:13.324278 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:13 crc kubenswrapper[4825]: I0310 06:44:13.669502 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 06:44:13 crc kubenswrapper[4825]: I0310 06:44:13.676969 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 06:44:14 crc kubenswrapper[4825]: I0310 06:44:14.331952 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"54190a224d3b19cbd8fccc79ebb69fefd2a1a07feaec36415cad2c92c72c4de4"} Mar 10 06:44:14 crc kubenswrapper[4825]: I0310 06:44:14.332031 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:14 crc kubenswrapper[4825]: I0310 06:44:14.333293 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:14 crc kubenswrapper[4825]: I0310 06:44:14.333342 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:14 crc kubenswrapper[4825]: I0310 06:44:14.333363 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:14 crc kubenswrapper[4825]: I0310 06:44:14.334964 4825 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0" exitCode=0 Mar 10 06:44:14 crc kubenswrapper[4825]: I0310 06:44:14.335101 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0"} Mar 10 06:44:14 crc kubenswrapper[4825]: I0310 06:44:14.335219 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:14 crc kubenswrapper[4825]: I0310 06:44:14.335245 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:14 crc kubenswrapper[4825]: I0310 06:44:14.335395 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:14 crc kubenswrapper[4825]: I0310 06:44:14.335385 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 06:44:14 crc kubenswrapper[4825]: I0310 06:44:14.336002 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:14 crc kubenswrapper[4825]: I0310 06:44:14.337052 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:14 crc kubenswrapper[4825]: I0310 06:44:14.337098 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:14 crc kubenswrapper[4825]: I0310 06:44:14.337115 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:14 crc kubenswrapper[4825]: I0310 06:44:14.337291 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:14 crc kubenswrapper[4825]: I0310 06:44:14.337343 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:14 crc kubenswrapper[4825]: I0310 06:44:14.337363 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 10 06:44:14 crc kubenswrapper[4825]: I0310 06:44:14.338258 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:14 crc kubenswrapper[4825]: I0310 06:44:14.338295 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:14 crc kubenswrapper[4825]: I0310 06:44:14.338311 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:14 crc kubenswrapper[4825]: I0310 06:44:14.338262 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:14 crc kubenswrapper[4825]: I0310 06:44:14.338387 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:14 crc kubenswrapper[4825]: I0310 06:44:14.338405 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:14 crc kubenswrapper[4825]: I0310 06:44:14.730704 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 06:44:14 crc kubenswrapper[4825]: I0310 06:44:14.779944 4825 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 10 06:44:14 crc kubenswrapper[4825]: I0310 06:44:14.967372 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 06:44:15 crc kubenswrapper[4825]: I0310 06:44:15.343583 4825 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 06:44:15 crc kubenswrapper[4825]: I0310 06:44:15.343692 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:15 crc kubenswrapper[4825]: I0310 06:44:15.344364 4825 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1de8e93ef3747888e32ccae04649a70ef2b3f1ad42133acc0137f78770c9720f"} Mar 10 06:44:15 crc kubenswrapper[4825]: I0310 06:44:15.344461 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ce8afd573e70bc13a4b3d3539d466f0bc54b26139a06494ccb38d980c05f0974"} Mar 10 06:44:15 crc kubenswrapper[4825]: I0310 06:44:15.344487 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"872997317afa0a2274e0b4ce87bdd13fad7a3291521411a8a208bfd53672163e"} Mar 10 06:44:15 crc kubenswrapper[4825]: I0310 06:44:15.344505 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5dd6d39109c49743644fca4e67fd9e0db312e712adbfe3f7a8ae5a4f749c5945"} Mar 10 06:44:15 crc kubenswrapper[4825]: I0310 06:44:15.344549 4825 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 06:44:15 crc kubenswrapper[4825]: I0310 06:44:15.344626 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:15 crc kubenswrapper[4825]: I0310 06:44:15.344703 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:15 crc kubenswrapper[4825]: I0310 06:44:15.345232 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:15 crc kubenswrapper[4825]: I0310 06:44:15.345286 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:15 crc kubenswrapper[4825]: I0310 06:44:15.345306 4825 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:15 crc kubenswrapper[4825]: I0310 06:44:15.345774 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:15 crc kubenswrapper[4825]: I0310 06:44:15.345823 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:15 crc kubenswrapper[4825]: I0310 06:44:15.345838 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:15 crc kubenswrapper[4825]: I0310 06:44:15.346483 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:15 crc kubenswrapper[4825]: I0310 06:44:15.346520 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:15 crc kubenswrapper[4825]: I0310 06:44:15.346532 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:15 crc kubenswrapper[4825]: I0310 06:44:15.778502 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:15 crc kubenswrapper[4825]: I0310 06:44:15.780030 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:15 crc kubenswrapper[4825]: I0310 06:44:15.780081 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:15 crc kubenswrapper[4825]: I0310 06:44:15.780090 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:15 crc kubenswrapper[4825]: I0310 06:44:15.780114 4825 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 06:44:16 crc kubenswrapper[4825]: 
I0310 06:44:16.351255 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1c380bb44c46b881451835d9616e2b3e76e790bbcaccdc979b9c39bb8abdc605"} Mar 10 06:44:16 crc kubenswrapper[4825]: I0310 06:44:16.351302 4825 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 06:44:16 crc kubenswrapper[4825]: I0310 06:44:16.351368 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:16 crc kubenswrapper[4825]: I0310 06:44:16.351415 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:16 crc kubenswrapper[4825]: I0310 06:44:16.351368 4825 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 06:44:16 crc kubenswrapper[4825]: I0310 06:44:16.351500 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:16 crc kubenswrapper[4825]: I0310 06:44:16.352497 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:16 crc kubenswrapper[4825]: I0310 06:44:16.352545 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:16 crc kubenswrapper[4825]: I0310 06:44:16.352561 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:16 crc kubenswrapper[4825]: I0310 06:44:16.353401 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:16 crc kubenswrapper[4825]: I0310 06:44:16.353436 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:16 crc kubenswrapper[4825]: I0310 06:44:16.353403 4825 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:16 crc kubenswrapper[4825]: I0310 06:44:16.353470 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:16 crc kubenswrapper[4825]: I0310 06:44:16.353482 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:16 crc kubenswrapper[4825]: I0310 06:44:16.353449 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:16 crc kubenswrapper[4825]: I0310 06:44:16.508769 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 06:44:16 crc kubenswrapper[4825]: I0310 06:44:16.533771 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 06:44:17 crc kubenswrapper[4825]: I0310 06:44:17.354043 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:17 crc kubenswrapper[4825]: I0310 06:44:17.354067 4825 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 06:44:17 crc kubenswrapper[4825]: I0310 06:44:17.354237 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:17 crc kubenswrapper[4825]: I0310 06:44:17.356719 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:17 crc kubenswrapper[4825]: I0310 06:44:17.362550 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:17 crc kubenswrapper[4825]: I0310 06:44:17.362603 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:17 crc 
kubenswrapper[4825]: I0310 06:44:17.362573 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:17 crc kubenswrapper[4825]: I0310 06:44:17.362649 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:17 crc kubenswrapper[4825]: I0310 06:44:17.362663 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:17 crc kubenswrapper[4825]: I0310 06:44:17.362675 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:17 crc kubenswrapper[4825]: I0310 06:44:17.362686 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:17 crc kubenswrapper[4825]: I0310 06:44:17.362620 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:17 crc kubenswrapper[4825]: I0310 06:44:17.362771 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:18 crc kubenswrapper[4825]: I0310 06:44:18.075722 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 10 06:44:18 crc kubenswrapper[4825]: I0310 06:44:18.357350 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:18 crc kubenswrapper[4825]: I0310 06:44:18.358715 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:18 crc kubenswrapper[4825]: I0310 06:44:18.358775 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:18 crc kubenswrapper[4825]: I0310 06:44:18.358792 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 10 06:44:19 crc kubenswrapper[4825]: I0310 06:44:19.042722 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 06:44:19 crc kubenswrapper[4825]: I0310 06:44:19.043034 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:19 crc kubenswrapper[4825]: I0310 06:44:19.044952 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:19 crc kubenswrapper[4825]: I0310 06:44:19.045007 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:19 crc kubenswrapper[4825]: I0310 06:44:19.045022 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:19 crc kubenswrapper[4825]: I0310 06:44:19.067319 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 10 06:44:19 crc kubenswrapper[4825]: I0310 06:44:19.151967 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 06:44:19 crc kubenswrapper[4825]: I0310 06:44:19.152192 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:19 crc kubenswrapper[4825]: I0310 06:44:19.153834 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:19 crc kubenswrapper[4825]: I0310 06:44:19.153946 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:19 crc kubenswrapper[4825]: I0310 06:44:19.153975 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:19 crc kubenswrapper[4825]: I0310 
06:44:19.360342 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:19 crc kubenswrapper[4825]: I0310 06:44:19.361741 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:19 crc kubenswrapper[4825]: I0310 06:44:19.361809 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:19 crc kubenswrapper[4825]: I0310 06:44:19.361829 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:19 crc kubenswrapper[4825]: E0310 06:44:19.525338 4825 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 06:44:19 crc kubenswrapper[4825]: I0310 06:44:19.534065 4825 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 06:44:19 crc kubenswrapper[4825]: I0310 06:44:19.534160 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 06:44:24 crc kubenswrapper[4825]: I0310 06:44:24.111809 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 10 06:44:24 crc kubenswrapper[4825]: E0310 06:44:24.786204 4825 
certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 10 06:44:24 crc kubenswrapper[4825]: E0310 06:44:24.847099 4825 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:24Z is after 2026-02-23T05:33:13Z" node="crc" Mar 10 06:44:24 crc kubenswrapper[4825]: W0310 06:44:24.849429 4825 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:24Z is after 2026-02-23T05:33:13Z Mar 10 06:44:24 crc kubenswrapper[4825]: E0310 06:44:24.849518 4825 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:24Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 06:44:24 crc kubenswrapper[4825]: W0310 06:44:24.854386 4825 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-10T06:44:24Z is after 2026-02-23T05:33:13Z Mar 10 06:44:24 crc kubenswrapper[4825]: E0310 06:44:24.854466 4825 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:24Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 06:44:24 crc kubenswrapper[4825]: W0310 06:44:24.856776 4825 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:24Z is after 2026-02-23T05:33:13Z Mar 10 06:44:24 crc kubenswrapper[4825]: E0310 06:44:24.856866 4825 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:24Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 06:44:24 crc kubenswrapper[4825]: E0310 06:44:24.856925 4825 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:24Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b67d67648e1b8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:09.10557228 +0000 UTC m=+2.135352895,LastTimestamp:2026-03-10 06:44:09.10557228 +0000 UTC m=+2.135352895,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:24 crc kubenswrapper[4825]: W0310 06:44:24.862099 4825 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:24Z is after 2026-02-23T05:33:13Z Mar 10 06:44:24 crc kubenswrapper[4825]: E0310 06:44:24.862249 4825 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:24Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 06:44:24 crc kubenswrapper[4825]: E0310 06:44:24.862472 4825 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:24Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 10 06:44:24 crc kubenswrapper[4825]: I0310 06:44:24.862573 4825 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver 
namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 10 06:44:24 crc kubenswrapper[4825]: I0310 06:44:24.862825 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 10 06:44:24 crc kubenswrapper[4825]: I0310 06:44:24.867769 4825 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 10 06:44:24 crc kubenswrapper[4825]: I0310 06:44:24.867877 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 10 06:44:24 crc kubenswrapper[4825]: I0310 06:44:24.973621 4825 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 10 06:44:24 crc kubenswrapper[4825]: [+]log ok Mar 10 06:44:24 crc kubenswrapper[4825]: [+]etcd ok Mar 10 06:44:24 crc kubenswrapper[4825]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 10 06:44:24 crc kubenswrapper[4825]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 10 
06:44:24 crc kubenswrapper[4825]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 10 06:44:24 crc kubenswrapper[4825]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 10 06:44:24 crc kubenswrapper[4825]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 10 06:44:24 crc kubenswrapper[4825]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 10 06:44:24 crc kubenswrapper[4825]: [+]poststarthook/generic-apiserver-start-informers ok Mar 10 06:44:24 crc kubenswrapper[4825]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 10 06:44:24 crc kubenswrapper[4825]: [+]poststarthook/priority-and-fairness-filter ok Mar 10 06:44:24 crc kubenswrapper[4825]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 10 06:44:24 crc kubenswrapper[4825]: [+]poststarthook/start-apiextensions-informers ok Mar 10 06:44:24 crc kubenswrapper[4825]: [-]poststarthook/start-apiextensions-controllers failed: reason withheld Mar 10 06:44:24 crc kubenswrapper[4825]: [-]poststarthook/crd-informer-synced failed: reason withheld Mar 10 06:44:24 crc kubenswrapper[4825]: [+]poststarthook/start-system-namespaces-controller ok Mar 10 06:44:24 crc kubenswrapper[4825]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 10 06:44:24 crc kubenswrapper[4825]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 10 06:44:24 crc kubenswrapper[4825]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 10 06:44:24 crc kubenswrapper[4825]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 10 06:44:24 crc kubenswrapper[4825]: [+]poststarthook/start-service-ip-repair-controllers ok Mar 10 06:44:24 crc kubenswrapper[4825]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 10 06:44:24 crc kubenswrapper[4825]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Mar 10 06:44:24 crc kubenswrapper[4825]: 
[+]poststarthook/priority-and-fairness-config-producer ok Mar 10 06:44:24 crc kubenswrapper[4825]: [+]poststarthook/bootstrap-controller ok Mar 10 06:44:24 crc kubenswrapper[4825]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 10 06:44:24 crc kubenswrapper[4825]: [+]poststarthook/start-kube-aggregator-informers ok Mar 10 06:44:24 crc kubenswrapper[4825]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 10 06:44:24 crc kubenswrapper[4825]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 10 06:44:24 crc kubenswrapper[4825]: [+]poststarthook/apiservice-registration-controller ok Mar 10 06:44:24 crc kubenswrapper[4825]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 10 06:44:24 crc kubenswrapper[4825]: [+]poststarthook/apiservice-discovery-controller ok Mar 10 06:44:24 crc kubenswrapper[4825]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 10 06:44:24 crc kubenswrapper[4825]: [+]autoregister-completion ok Mar 10 06:44:24 crc kubenswrapper[4825]: [+]poststarthook/apiservice-openapi-controller ok Mar 10 06:44:24 crc kubenswrapper[4825]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 10 06:44:24 crc kubenswrapper[4825]: livez check failed Mar 10 06:44:24 crc kubenswrapper[4825]: I0310 06:44:24.973689 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 06:44:25 crc kubenswrapper[4825]: I0310 06:44:25.113530 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:25Z is after 2026-02-23T05:33:13Z Mar 10 06:44:25 crc kubenswrapper[4825]: I0310 
06:44:25.381950 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 10 06:44:25 crc kubenswrapper[4825]: I0310 06:44:25.384477 4825 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="54190a224d3b19cbd8fccc79ebb69fefd2a1a07feaec36415cad2c92c72c4de4" exitCode=255 Mar 10 06:44:25 crc kubenswrapper[4825]: I0310 06:44:25.384525 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"54190a224d3b19cbd8fccc79ebb69fefd2a1a07feaec36415cad2c92c72c4de4"} Mar 10 06:44:25 crc kubenswrapper[4825]: I0310 06:44:25.384732 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:25 crc kubenswrapper[4825]: I0310 06:44:25.385596 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:25 crc kubenswrapper[4825]: I0310 06:44:25.385627 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:25 crc kubenswrapper[4825]: I0310 06:44:25.385639 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:25 crc kubenswrapper[4825]: I0310 06:44:25.386172 4825 scope.go:117] "RemoveContainer" containerID="54190a224d3b19cbd8fccc79ebb69fefd2a1a07feaec36415cad2c92c72c4de4" Mar 10 06:44:26 crc kubenswrapper[4825]: I0310 06:44:26.112740 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:26Z is after 
2026-02-23T05:33:13Z Mar 10 06:44:26 crc kubenswrapper[4825]: I0310 06:44:26.390285 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 10 06:44:26 crc kubenswrapper[4825]: I0310 06:44:26.391098 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 10 06:44:26 crc kubenswrapper[4825]: I0310 06:44:26.393513 4825 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c48fdc2d26d514214f1b2733175c401543defc2a018a3c6cc45cbe0feae0fcb5" exitCode=255 Mar 10 06:44:26 crc kubenswrapper[4825]: I0310 06:44:26.393575 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c48fdc2d26d514214f1b2733175c401543defc2a018a3c6cc45cbe0feae0fcb5"} Mar 10 06:44:26 crc kubenswrapper[4825]: I0310 06:44:26.393653 4825 scope.go:117] "RemoveContainer" containerID="54190a224d3b19cbd8fccc79ebb69fefd2a1a07feaec36415cad2c92c72c4de4" Mar 10 06:44:26 crc kubenswrapper[4825]: I0310 06:44:26.393798 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:26 crc kubenswrapper[4825]: I0310 06:44:26.394912 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:26 crc kubenswrapper[4825]: I0310 06:44:26.394942 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:26 crc kubenswrapper[4825]: I0310 06:44:26.394952 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:26 crc kubenswrapper[4825]: I0310 06:44:26.395490 4825 
scope.go:117] "RemoveContainer" containerID="c48fdc2d26d514214f1b2733175c401543defc2a018a3c6cc45cbe0feae0fcb5" Mar 10 06:44:26 crc kubenswrapper[4825]: E0310 06:44:26.395642 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 06:44:26 crc kubenswrapper[4825]: I0310 06:44:26.509438 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 06:44:27 crc kubenswrapper[4825]: I0310 06:44:27.115122 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:27Z is after 2026-02-23T05:33:13Z Mar 10 06:44:27 crc kubenswrapper[4825]: I0310 06:44:27.399217 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 10 06:44:27 crc kubenswrapper[4825]: I0310 06:44:27.402426 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:27 crc kubenswrapper[4825]: I0310 06:44:27.403663 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:27 crc kubenswrapper[4825]: I0310 06:44:27.403701 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:27 crc kubenswrapper[4825]: I0310 06:44:27.403714 4825 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:27 crc kubenswrapper[4825]: I0310 06:44:27.404416 4825 scope.go:117] "RemoveContainer" containerID="c48fdc2d26d514214f1b2733175c401543defc2a018a3c6cc45cbe0feae0fcb5" Mar 10 06:44:27 crc kubenswrapper[4825]: E0310 06:44:27.404626 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 06:44:27 crc kubenswrapper[4825]: I0310 06:44:27.783912 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 06:44:28 crc kubenswrapper[4825]: I0310 06:44:28.115434 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:28Z is after 2026-02-23T05:33:13Z Mar 10 06:44:28 crc kubenswrapper[4825]: I0310 06:44:28.405446 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:28 crc kubenswrapper[4825]: I0310 06:44:28.406834 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:28 crc kubenswrapper[4825]: I0310 06:44:28.406903 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:28 crc kubenswrapper[4825]: I0310 06:44:28.406923 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 10 06:44:28 crc kubenswrapper[4825]: I0310 06:44:28.408105 4825 scope.go:117] "RemoveContainer" containerID="c48fdc2d26d514214f1b2733175c401543defc2a018a3c6cc45cbe0feae0fcb5" Mar 10 06:44:28 crc kubenswrapper[4825]: E0310 06:44:28.408552 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 06:44:29 crc kubenswrapper[4825]: I0310 06:44:29.100052 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 10 06:44:29 crc kubenswrapper[4825]: I0310 06:44:29.100411 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:29 crc kubenswrapper[4825]: I0310 06:44:29.102390 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:29 crc kubenswrapper[4825]: I0310 06:44:29.102459 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:29 crc kubenswrapper[4825]: I0310 06:44:29.102483 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:29 crc kubenswrapper[4825]: I0310 06:44:29.113204 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:29Z is after 2026-02-23T05:33:13Z Mar 10 06:44:29 crc kubenswrapper[4825]: I0310 06:44:29.116411 4825 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 10 06:44:29 crc kubenswrapper[4825]: I0310 06:44:29.159325 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 06:44:29 crc kubenswrapper[4825]: I0310 06:44:29.159524 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:29 crc kubenswrapper[4825]: I0310 06:44:29.161115 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:29 crc kubenswrapper[4825]: I0310 06:44:29.161190 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:29 crc kubenswrapper[4825]: I0310 06:44:29.161204 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:29 crc kubenswrapper[4825]: I0310 06:44:29.408883 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:29 crc kubenswrapper[4825]: I0310 06:44:29.410101 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:29 crc kubenswrapper[4825]: I0310 06:44:29.410178 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:29 crc kubenswrapper[4825]: I0310 06:44:29.410191 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:29 crc kubenswrapper[4825]: E0310 06:44:29.525810 4825 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 06:44:29 crc kubenswrapper[4825]: I0310 06:44:29.535331 4825 patch_prober.go:28] interesting pod/kube-controller-manager-crc 
container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 06:44:29 crc kubenswrapper[4825]: I0310 06:44:29.535421 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 06:44:29 crc kubenswrapper[4825]: I0310 06:44:29.974386 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 06:44:29 crc kubenswrapper[4825]: I0310 06:44:29.974615 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:29 crc kubenswrapper[4825]: I0310 06:44:29.976083 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:29 crc kubenswrapper[4825]: I0310 06:44:29.976173 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:29 crc kubenswrapper[4825]: I0310 06:44:29.976191 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:29 crc kubenswrapper[4825]: I0310 06:44:29.977114 4825 scope.go:117] "RemoveContainer" containerID="c48fdc2d26d514214f1b2733175c401543defc2a018a3c6cc45cbe0feae0fcb5" Mar 10 06:44:29 crc kubenswrapper[4825]: E0310 06:44:29.977478 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: 
\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 06:44:29 crc kubenswrapper[4825]: I0310 06:44:29.981476 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 06:44:30 crc kubenswrapper[4825]: I0310 06:44:30.114774 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:30Z is after 2026-02-23T05:33:13Z Mar 10 06:44:30 crc kubenswrapper[4825]: I0310 06:44:30.412761 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:30 crc kubenswrapper[4825]: I0310 06:44:30.414765 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:30 crc kubenswrapper[4825]: I0310 06:44:30.414816 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:30 crc kubenswrapper[4825]: I0310 06:44:30.414835 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:30 crc kubenswrapper[4825]: I0310 06:44:30.415727 4825 scope.go:117] "RemoveContainer" containerID="c48fdc2d26d514214f1b2733175c401543defc2a018a3c6cc45cbe0feae0fcb5" Mar 10 06:44:30 crc kubenswrapper[4825]: E0310 06:44:30.416008 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 06:44:31 crc kubenswrapper[4825]: I0310 06:44:31.114933 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:31Z is after 2026-02-23T05:33:13Z Mar 10 06:44:31 crc kubenswrapper[4825]: I0310 06:44:31.247604 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:31 crc kubenswrapper[4825]: I0310 06:44:31.249689 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:31 crc kubenswrapper[4825]: I0310 06:44:31.249774 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:31 crc kubenswrapper[4825]: I0310 06:44:31.249793 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:31 crc kubenswrapper[4825]: I0310 06:44:31.249841 4825 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 06:44:31 crc kubenswrapper[4825]: E0310 06:44:31.255671 4825 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:31Z is after 2026-02-23T05:33:13Z" node="crc" Mar 10 06:44:31 crc kubenswrapper[4825]: E0310 06:44:31.269179 4825 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:31Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 10 06:44:32 crc kubenswrapper[4825]: I0310 06:44:32.115331 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:32Z is after 2026-02-23T05:33:13Z Mar 10 06:44:32 crc kubenswrapper[4825]: I0310 06:44:32.808659 4825 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 10 06:44:32 crc kubenswrapper[4825]: E0310 06:44:32.814798 4825 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:32Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 06:44:32 crc kubenswrapper[4825]: W0310 06:44:32.859488 4825 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:32Z is after 2026-02-23T05:33:13Z Mar 10 06:44:32 crc kubenswrapper[4825]: E0310 06:44:32.859614 4825 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to 
list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:32Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 06:44:33 crc kubenswrapper[4825]: I0310 06:44:33.115479 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:33Z is after 2026-02-23T05:33:13Z Mar 10 06:44:34 crc kubenswrapper[4825]: I0310 06:44:34.115702 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:34Z is after 2026-02-23T05:33:13Z Mar 10 06:44:34 crc kubenswrapper[4825]: W0310 06:44:34.275533 4825 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:34Z is after 2026-02-23T05:33:13Z Mar 10 06:44:34 crc kubenswrapper[4825]: E0310 06:44:34.275656 4825 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:34Z is after 2026-02-23T05:33:13Z" 
logger="UnhandledError" Mar 10 06:44:34 crc kubenswrapper[4825]: E0310 06:44:34.862064 4825 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:34Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b67d67648e1b8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:09.10557228 +0000 UTC m=+2.135352895,LastTimestamp:2026-03-10 06:44:09.10557228 +0000 UTC m=+2.135352895,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:34 crc kubenswrapper[4825]: W0310 06:44:34.990413 4825 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:34Z is after 2026-02-23T05:33:13Z Mar 10 06:44:34 crc kubenswrapper[4825]: E0310 06:44:34.990528 4825 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:34Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 06:44:35 crc kubenswrapper[4825]: I0310 06:44:35.114218 4825 csi_plugin.go:884] Failed to contact API server 
when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:35Z is after 2026-02-23T05:33:13Z Mar 10 06:44:36 crc kubenswrapper[4825]: I0310 06:44:36.113961 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:36Z is after 2026-02-23T05:33:13Z Mar 10 06:44:37 crc kubenswrapper[4825]: I0310 06:44:37.114944 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:37Z is after 2026-02-23T05:33:13Z Mar 10 06:44:37 crc kubenswrapper[4825]: W0310 06:44:37.318306 4825 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:37Z is after 2026-02-23T05:33:13Z Mar 10 06:44:37 crc kubenswrapper[4825]: E0310 06:44:37.318424 4825 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:37Z is after 2026-02-23T05:33:13Z" 
logger="UnhandledError" Mar 10 06:44:38 crc kubenswrapper[4825]: I0310 06:44:38.112924 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:38Z is after 2026-02-23T05:33:13Z Mar 10 06:44:38 crc kubenswrapper[4825]: I0310 06:44:38.256410 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:38 crc kubenswrapper[4825]: I0310 06:44:38.258447 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:38 crc kubenswrapper[4825]: I0310 06:44:38.258511 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:38 crc kubenswrapper[4825]: I0310 06:44:38.258535 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:38 crc kubenswrapper[4825]: I0310 06:44:38.258584 4825 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 06:44:38 crc kubenswrapper[4825]: E0310 06:44:38.263969 4825 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:38Z is after 2026-02-23T05:33:13Z" node="crc" Mar 10 06:44:38 crc kubenswrapper[4825]: E0310 06:44:38.276231 4825 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:38Z is after 
2026-02-23T05:33:13Z" interval="7s" Mar 10 06:44:39 crc kubenswrapper[4825]: I0310 06:44:39.114943 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:39Z is after 2026-02-23T05:33:13Z Mar 10 06:44:39 crc kubenswrapper[4825]: E0310 06:44:39.526566 4825 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 06:44:39 crc kubenswrapper[4825]: I0310 06:44:39.534392 4825 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 06:44:39 crc kubenswrapper[4825]: I0310 06:44:39.534459 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 06:44:39 crc kubenswrapper[4825]: I0310 06:44:39.534532 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 06:44:39 crc kubenswrapper[4825]: I0310 06:44:39.534710 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:39 crc kubenswrapper[4825]: I0310 06:44:39.536318 4825 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:39 crc kubenswrapper[4825]: I0310 06:44:39.536352 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:39 crc kubenswrapper[4825]: I0310 06:44:39.536364 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:39 crc kubenswrapper[4825]: I0310 06:44:39.536872 4825 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"d5da17071874543156df5adb8dee5973b11684f66e728934c3353e76eaad702e"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 10 06:44:39 crc kubenswrapper[4825]: I0310 06:44:39.537052 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://d5da17071874543156df5adb8dee5973b11684f66e728934c3353e76eaad702e" gracePeriod=30 Mar 10 06:44:40 crc kubenswrapper[4825]: I0310 06:44:40.113074 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:40Z is after 2026-02-23T05:33:13Z Mar 10 06:44:40 crc kubenswrapper[4825]: I0310 06:44:40.460337 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 10 06:44:40 crc kubenswrapper[4825]: I0310 06:44:40.460977 4825 generic.go:334] "Generic (PLEG): container finished" 
podID="f614b9022728cf315e60c057852e563e" containerID="d5da17071874543156df5adb8dee5973b11684f66e728934c3353e76eaad702e" exitCode=255 Mar 10 06:44:40 crc kubenswrapper[4825]: I0310 06:44:40.461044 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"d5da17071874543156df5adb8dee5973b11684f66e728934c3353e76eaad702e"} Mar 10 06:44:40 crc kubenswrapper[4825]: I0310 06:44:40.461365 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a32345952f87cad97a94ac8654386fd6e5253db4e25da1c4b9d6de141fea7832"} Mar 10 06:44:40 crc kubenswrapper[4825]: I0310 06:44:40.461650 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:40 crc kubenswrapper[4825]: I0310 06:44:40.463061 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:40 crc kubenswrapper[4825]: I0310 06:44:40.463116 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:40 crc kubenswrapper[4825]: I0310 06:44:40.463163 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:41 crc kubenswrapper[4825]: I0310 06:44:41.118506 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:41Z is after 2026-02-23T05:33:13Z Mar 10 06:44:41 crc kubenswrapper[4825]: I0310 06:44:41.463381 4825 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Mar 10 06:44:41 crc kubenswrapper[4825]: I0310 06:44:41.464293 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:41 crc kubenswrapper[4825]: I0310 06:44:41.464315 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:41 crc kubenswrapper[4825]: I0310 06:44:41.464323 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:42 crc kubenswrapper[4825]: I0310 06:44:42.113634 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:42Z is after 2026-02-23T05:33:13Z Mar 10 06:44:43 crc kubenswrapper[4825]: I0310 06:44:43.115232 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:43Z is after 2026-02-23T05:33:13Z Mar 10 06:44:44 crc kubenswrapper[4825]: I0310 06:44:44.114614 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:44Z is after 2026-02-23T05:33:13Z Mar 10 06:44:44 crc kubenswrapper[4825]: I0310 06:44:44.731210 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 06:44:44 crc kubenswrapper[4825]: I0310 
06:44:44.732690 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:44 crc kubenswrapper[4825]: I0310 06:44:44.734352 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:44 crc kubenswrapper[4825]: I0310 06:44:44.734397 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:44 crc kubenswrapper[4825]: I0310 06:44:44.734414 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:44 crc kubenswrapper[4825]: E0310 06:44:44.865925 4825 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:44Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b67d67648e1b8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:09.10557228 +0000 UTC m=+2.135352895,LastTimestamp:2026-03-10 06:44:09.10557228 +0000 UTC m=+2.135352895,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:45 crc kubenswrapper[4825]: I0310 06:44:45.114720 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:45Z is after 2026-02-23T05:33:13Z Mar 10 
06:44:45 crc kubenswrapper[4825]: I0310 06:44:45.235938 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:45 crc kubenswrapper[4825]: I0310 06:44:45.237861 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:45 crc kubenswrapper[4825]: I0310 06:44:45.237952 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:45 crc kubenswrapper[4825]: I0310 06:44:45.237985 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:45 crc kubenswrapper[4825]: I0310 06:44:45.238967 4825 scope.go:117] "RemoveContainer" containerID="c48fdc2d26d514214f1b2733175c401543defc2a018a3c6cc45cbe0feae0fcb5" Mar 10 06:44:45 crc kubenswrapper[4825]: I0310 06:44:45.264118 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:45 crc kubenswrapper[4825]: I0310 06:44:45.265510 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:45 crc kubenswrapper[4825]: I0310 06:44:45.265590 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:45 crc kubenswrapper[4825]: I0310 06:44:45.265609 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:45 crc kubenswrapper[4825]: I0310 06:44:45.265641 4825 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 06:44:45 crc kubenswrapper[4825]: E0310 06:44:45.272612 4825 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:45Z is after 
2026-02-23T05:33:13Z" node="crc" Mar 10 06:44:45 crc kubenswrapper[4825]: E0310 06:44:45.282580 4825 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:45Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 10 06:44:45 crc kubenswrapper[4825]: I0310 06:44:45.477941 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 10 06:44:45 crc kubenswrapper[4825]: I0310 06:44:45.480765 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4e0592c099e85d8ce818112bf02431516ffa3696e764fe5a8a824baaf2e1c0da"} Mar 10 06:44:46 crc kubenswrapper[4825]: I0310 06:44:46.113484 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:46Z is after 2026-02-23T05:33:13Z Mar 10 06:44:46 crc kubenswrapper[4825]: I0310 06:44:46.483429 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:46 crc kubenswrapper[4825]: I0310 06:44:46.485525 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:46 crc kubenswrapper[4825]: I0310 06:44:46.485572 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:46 crc kubenswrapper[4825]: I0310 06:44:46.485581 4825 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:46 crc kubenswrapper[4825]: I0310 06:44:46.508772 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 06:44:46 crc kubenswrapper[4825]: I0310 06:44:46.533912 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 06:44:46 crc kubenswrapper[4825]: I0310 06:44:46.534189 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:46 crc kubenswrapper[4825]: I0310 06:44:46.535618 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:46 crc kubenswrapper[4825]: I0310 06:44:46.535721 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:46 crc kubenswrapper[4825]: I0310 06:44:46.535800 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:47 crc kubenswrapper[4825]: I0310 06:44:47.114360 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:47Z is after 2026-02-23T05:33:13Z Mar 10 06:44:47 crc kubenswrapper[4825]: I0310 06:44:47.488457 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 10 06:44:47 crc kubenswrapper[4825]: I0310 06:44:47.489262 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 10 06:44:47 crc kubenswrapper[4825]: I0310 06:44:47.491777 4825 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4e0592c099e85d8ce818112bf02431516ffa3696e764fe5a8a824baaf2e1c0da" exitCode=255 Mar 10 06:44:47 crc kubenswrapper[4825]: I0310 06:44:47.491829 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"4e0592c099e85d8ce818112bf02431516ffa3696e764fe5a8a824baaf2e1c0da"} Mar 10 06:44:47 crc kubenswrapper[4825]: I0310 06:44:47.491875 4825 scope.go:117] "RemoveContainer" containerID="c48fdc2d26d514214f1b2733175c401543defc2a018a3c6cc45cbe0feae0fcb5" Mar 10 06:44:47 crc kubenswrapper[4825]: I0310 06:44:47.492067 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:47 crc kubenswrapper[4825]: I0310 06:44:47.493098 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:47 crc kubenswrapper[4825]: I0310 06:44:47.493169 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:47 crc kubenswrapper[4825]: I0310 06:44:47.493187 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:47 crc kubenswrapper[4825]: I0310 06:44:47.493939 4825 scope.go:117] "RemoveContainer" containerID="4e0592c099e85d8ce818112bf02431516ffa3696e764fe5a8a824baaf2e1c0da" Mar 10 06:44:47 crc kubenswrapper[4825]: E0310 06:44:47.494192 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 06:44:47 crc kubenswrapper[4825]: I0310 06:44:47.784886 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 06:44:48 crc kubenswrapper[4825]: I0310 06:44:48.115604 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:48Z is after 2026-02-23T05:33:13Z Mar 10 06:44:48 crc kubenswrapper[4825]: W0310 06:44:48.327570 4825 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:48Z is after 2026-02-23T05:33:13Z Mar 10 06:44:48 crc kubenswrapper[4825]: E0310 06:44:48.327685 4825 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:48Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 06:44:48 crc kubenswrapper[4825]: I0310 06:44:48.496504 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 10 06:44:48 crc 
kubenswrapper[4825]: I0310 06:44:48.498432 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:48 crc kubenswrapper[4825]: I0310 06:44:48.499456 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:48 crc kubenswrapper[4825]: I0310 06:44:48.499497 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:48 crc kubenswrapper[4825]: I0310 06:44:48.499511 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:48 crc kubenswrapper[4825]: I0310 06:44:48.500236 4825 scope.go:117] "RemoveContainer" containerID="4e0592c099e85d8ce818112bf02431516ffa3696e764fe5a8a824baaf2e1c0da" Mar 10 06:44:48 crc kubenswrapper[4825]: E0310 06:44:48.500441 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 06:44:49 crc kubenswrapper[4825]: I0310 06:44:49.117359 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 06:44:49 crc kubenswrapper[4825]: W0310 06:44:49.238465 4825 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 10 06:44:49 crc kubenswrapper[4825]: E0310 06:44:49.238550 4825 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 10 06:44:49 crc kubenswrapper[4825]: I0310 06:44:49.500958 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:49 crc kubenswrapper[4825]: I0310 06:44:49.502534 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:49 crc kubenswrapper[4825]: I0310 06:44:49.502609 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:49 crc kubenswrapper[4825]: I0310 06:44:49.502630 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:49 crc kubenswrapper[4825]: I0310 06:44:49.503669 4825 scope.go:117] "RemoveContainer" containerID="4e0592c099e85d8ce818112bf02431516ffa3696e764fe5a8a824baaf2e1c0da" Mar 10 06:44:49 crc kubenswrapper[4825]: E0310 06:44:49.504043 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 06:44:49 crc kubenswrapper[4825]: E0310 06:44:49.527032 4825 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 06:44:49 crc kubenswrapper[4825]: I0310 06:44:49.535318 4825 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe 
status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 06:44:49 crc kubenswrapper[4825]: I0310 06:44:49.535422 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 06:44:50 crc kubenswrapper[4825]: I0310 06:44:50.115387 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 06:44:50 crc kubenswrapper[4825]: I0310 06:44:50.128363 4825 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 10 06:44:50 crc kubenswrapper[4825]: I0310 06:44:50.148269 4825 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 10 06:44:51 crc kubenswrapper[4825]: I0310 06:44:51.118069 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 06:44:52 crc kubenswrapper[4825]: I0310 06:44:52.114737 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 06:44:52 crc kubenswrapper[4825]: I0310 06:44:52.273006 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Mar 10 06:44:52 crc kubenswrapper[4825]: I0310 06:44:52.274998 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:52 crc kubenswrapper[4825]: I0310 06:44:52.275102 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:52 crc kubenswrapper[4825]: I0310 06:44:52.275175 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:52 crc kubenswrapper[4825]: I0310 06:44:52.275292 4825 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 06:44:52 crc kubenswrapper[4825]: E0310 06:44:52.282760 4825 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 10 06:44:52 crc kubenswrapper[4825]: E0310 06:44:52.289938 4825 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 10 06:44:53 crc kubenswrapper[4825]: I0310 06:44:53.117616 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 06:44:54 crc kubenswrapper[4825]: I0310 06:44:54.113540 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 06:44:54 crc kubenswrapper[4825]: E0310 06:44:54.874252 4825 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b67d67648e1b8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:09.10557228 +0000 UTC m=+2.135352895,LastTimestamp:2026-03-10 06:44:09.10557228 +0000 UTC m=+2.135352895,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:54 crc kubenswrapper[4825]: E0310 06:44:54.882422 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b67d67aa707f2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:09.178851314 +0000 UTC m=+2.208631969,LastTimestamp:2026-03-10 06:44:09.178851314 +0000 UTC m=+2.208631969,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:54 crc kubenswrapper[4825]: E0310 06:44:54.890234 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b67d67aa7c20b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:09.178898955 +0000 UTC m=+2.208679620,LastTimestamp:2026-03-10 06:44:09.178898955 +0000 UTC m=+2.208679620,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:54 crc kubenswrapper[4825]: E0310 06:44:54.898106 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b67d67aa8235a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:09.178923866 +0000 UTC m=+2.208704531,LastTimestamp:2026-03-10 06:44:09.178923866 +0000 UTC m=+2.208704531,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:54 crc kubenswrapper[4825]: E0310 06:44:54.904056 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b67d68bbb6617 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across 
pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:09.465398807 +0000 UTC m=+2.495179422,LastTimestamp:2026-03-10 06:44:09.465398807 +0000 UTC m=+2.495179422,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:54 crc kubenswrapper[4825]: E0310 06:44:54.920030 4825 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b67d67aa707f2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b67d67aa707f2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:09.178851314 +0000 UTC m=+2.208631969,LastTimestamp:2026-03-10 06:44:09.539037329 +0000 UTC m=+2.568817974,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:54 crc kubenswrapper[4825]: E0310 06:44:54.927120 4825 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b67d67aa7c20b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b67d67aa7c20b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:09.178898955 +0000 UTC m=+2.208679620,LastTimestamp:2026-03-10 
06:44:09.53907538 +0000 UTC m=+2.568856035,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:54 crc kubenswrapper[4825]: E0310 06:44:54.933854 4825 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b67d67aa8235a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b67d67aa8235a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:09.178923866 +0000 UTC m=+2.208704531,LastTimestamp:2026-03-10 06:44:09.539096481 +0000 UTC m=+2.568877136,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:54 crc kubenswrapper[4825]: E0310 06:44:54.941580 4825 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b67d67aa707f2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b67d67aa707f2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:09.178851314 +0000 UTC m=+2.208631969,LastTimestamp:2026-03-10 06:44:09.541304322 +0000 UTC m=+2.571084937,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:54 crc kubenswrapper[4825]: E0310 06:44:54.948323 4825 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b67d67aa7c20b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b67d67aa7c20b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:09.178898955 +0000 UTC m=+2.208679620,LastTimestamp:2026-03-10 06:44:09.541321652 +0000 UTC m=+2.571102267,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:54 crc kubenswrapper[4825]: E0310 06:44:54.955003 4825 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b67d67aa8235a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b67d67aa8235a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:09.178923866 +0000 UTC m=+2.208704531,LastTimestamp:2026-03-10 06:44:09.541336763 +0000 UTC m=+2.571117378,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:54 crc kubenswrapper[4825]: E0310 06:44:54.962248 4825 event.go:359] 
"Server rejected event (will not retry!)" err="events \"crc.189b67d67aa707f2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b67d67aa707f2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:09.178851314 +0000 UTC m=+2.208631969,LastTimestamp:2026-03-10 06:44:09.54167246 +0000 UTC m=+2.571453075,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:54 crc kubenswrapper[4825]: E0310 06:44:54.969093 4825 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b67d67aa7c20b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b67d67aa7c20b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:09.178898955 +0000 UTC m=+2.208679620,LastTimestamp:2026-03-10 06:44:09.541682241 +0000 UTC m=+2.571462856,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:54 crc kubenswrapper[4825]: E0310 06:44:54.975850 4825 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b67d67aa8235a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group 
\"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b67d67aa8235a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:09.178923866 +0000 UTC m=+2.208704531,LastTimestamp:2026-03-10 06:44:09.541692211 +0000 UTC m=+2.571472826,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:54 crc kubenswrapper[4825]: E0310 06:44:54.983387 4825 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b67d67aa707f2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b67d67aa707f2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:09.178851314 +0000 UTC m=+2.208631969,LastTimestamp:2026-03-10 06:44:09.543422471 +0000 UTC m=+2.573203126,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:54 crc kubenswrapper[4825]: E0310 06:44:54.990435 4825 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b67d67aa7c20b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b67d67aa7c20b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:09.178898955 +0000 UTC m=+2.208679620,LastTimestamp:2026-03-10 06:44:09.543450021 +0000 UTC m=+2.573230676,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:54 crc kubenswrapper[4825]: E0310 06:44:54.998099 4825 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b67d67aa8235a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b67d67aa8235a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:09.178923866 +0000 UTC m=+2.208704531,LastTimestamp:2026-03-10 06:44:09.543469932 +0000 UTC m=+2.573250577,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.005052 4825 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b67d67aa707f2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b67d67aa707f2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status 
is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:09.178851314 +0000 UTC m=+2.208631969,LastTimestamp:2026-03-10 06:44:09.543550464 +0000 UTC m=+2.573331079,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.012921 4825 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b67d67aa7c20b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b67d67aa7c20b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:09.178898955 +0000 UTC m=+2.208679620,LastTimestamp:2026-03-10 06:44:09.543562314 +0000 UTC m=+2.573342929,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.019850 4825 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b67d67aa8235a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b67d67aa8235a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:09.178923866 +0000 UTC 
m=+2.208704531,LastTimestamp:2026-03-10 06:44:09.543571114 +0000 UTC m=+2.573351719,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.027044 4825 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b67d67aa707f2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b67d67aa707f2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:09.178851314 +0000 UTC m=+2.208631969,LastTimestamp:2026-03-10 06:44:09.544587148 +0000 UTC m=+2.574367803,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.034061 4825 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b67d67aa7c20b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b67d67aa7c20b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:09.178898955 +0000 UTC m=+2.208679620,LastTimestamp:2026-03-10 06:44:09.544606408 +0000 UTC m=+2.574387063,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.041365 4825 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b67d67aa8235a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b67d67aa8235a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:09.178923866 +0000 UTC m=+2.208704531,LastTimestamp:2026-03-10 06:44:09.544621079 +0000 UTC m=+2.574401734,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.048409 4825 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b67d67aa707f2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b67d67aa707f2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:09.178851314 +0000 UTC m=+2.208631969,LastTimestamp:2026-03-10 06:44:09.54510595 +0000 UTC m=+2.574886605,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.053494 4825 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b67d67aa7c20b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b67d67aa7c20b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:09.178898955 +0000 UTC m=+2.208679620,LastTimestamp:2026-03-10 06:44:09.545165131 +0000 UTC m=+2.574945786,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.060708 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b67d6b27a62e4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:10.115449572 +0000 UTC m=+3.145230217,LastTimestamp:2026-03-10 06:44:10.115449572 +0000 UTC m=+3.145230217,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 
06:44:55.066976 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b67d6b27ba692 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:10.115532434 +0000 UTC m=+3.145313059,LastTimestamp:2026-03-10 06:44:10.115532434 +0000 UTC m=+3.145313059,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.073850 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b67d6b27cd99c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:10.115611036 +0000 UTC m=+3.145391691,LastTimestamp:2026-03-10 06:44:10.115611036 +0000 UTC m=+3.145391691,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.077172 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b67d6b27e1b06 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:10.115693318 +0000 UTC m=+3.145473943,LastTimestamp:2026-03-10 06:44:10.115693318 +0000 UTC m=+3.145473943,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.079992 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b67d6b2a47754 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:10.118207316 +0000 UTC m=+3.147987941,LastTimestamp:2026-03-10 06:44:10.118207316 +0000 UTC m=+3.147987941,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.083686 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b67d70c3f5ade openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:11.621530334 +0000 UTC m=+4.651310949,LastTimestamp:2026-03-10 06:44:11.621530334 +0000 UTC m=+4.651310949,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.086289 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" 
in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b67d70c56ae9b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:11.623059099 +0000 UTC m=+4.652839754,LastTimestamp:2026-03-10 06:44:11.623059099 +0000 UTC m=+4.652839754,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.090378 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b67d70c80befb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:11.625815803 +0000 UTC m=+4.655596418,LastTimestamp:2026-03-10 06:44:11.625815803 +0000 UTC m=+4.655596418,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.092272 4825 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b67d70cdd34c9 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:11.631875273 +0000 UTC m=+4.661655878,LastTimestamp:2026-03-10 06:44:11.631875273 +0000 UTC m=+4.661655878,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.097986 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b67d70cfbbe95 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:11.633876629 +0000 UTC m=+4.663657254,LastTimestamp:2026-03-10 06:44:11.633876629 +0000 UTC m=+4.663657254,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc 
kubenswrapper[4825]: E0310 06:44:55.104199 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b67d70d03a765 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:11.634394981 +0000 UTC m=+4.664175596,LastTimestamp:2026-03-10 06:44:11.634394981 +0000 UTC m=+4.664175596,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.110417 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b67d70d6e1768 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:11.641370472 +0000 UTC m=+4.671151107,LastTimestamp:2026-03-10 06:44:11.641370472 +0000 UTC m=+4.671151107,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: I0310 06:44:55.111066 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.116201 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b67d70d8598d0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:11.642910928 +0000 UTC m=+4.672691583,LastTimestamp:2026-03-10 06:44:11.642910928 +0000 UTC m=+4.672691583,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.121614 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b67d70d99fcc7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] 
[] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:11.644247239 +0000 UTC m=+4.674027864,LastTimestamp:2026-03-10 06:44:11.644247239 +0000 UTC m=+4.674027864,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.129801 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b67d70de33488 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:11.64904564 +0000 UTC m=+4.678826295,LastTimestamp:2026-03-10 06:44:11.64904564 +0000 UTC m=+4.678826295,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.135938 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b67d70df817f5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:11.650414581 +0000 UTC m=+4.680195226,LastTimestamp:2026-03-10 06:44:11.650414581 +0000 UTC m=+4.680195226,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.142703 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b67d7213a75bd openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:11.973531069 +0000 UTC m=+5.003311684,LastTimestamp:2026-03-10 06:44:11.973531069 +0000 UTC m=+5.003311684,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.149764 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.189b67d721f8cbc9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:11.986004937 +0000 UTC m=+5.015785542,LastTimestamp:2026-03-10 06:44:11.986004937 +0000 UTC m=+5.015785542,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.156727 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b67d722130b79 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:11.987725177 +0000 UTC m=+5.017505832,LastTimestamp:2026-03-10 06:44:11.987725177 +0000 UTC m=+5.017505832,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.162908 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b67d72f58c3e8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:12.210398184 +0000 UTC m=+5.240178829,LastTimestamp:2026-03-10 06:44:12.210398184 +0000 UTC m=+5.240178829,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.169967 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b67d7302e1acd openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container 
kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:12.224379597 +0000 UTC m=+5.254160252,LastTimestamp:2026-03-10 06:44:12.224379597 +0000 UTC m=+5.254160252,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.178818 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b67d73041f92b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:12.225681707 +0000 UTC m=+5.255462342,LastTimestamp:2026-03-10 06:44:12.225681707 +0000 UTC m=+5.255462342,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.186847 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b67d7324e55f1 openshift-kube-scheduler 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:12.260046321 +0000 UTC m=+5.289826946,LastTimestamp:2026-03-10 06:44:12.260046321 +0000 UTC m=+5.289826946,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.194223 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b67d732a1187d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:12.265470077 +0000 UTC m=+5.295250732,LastTimestamp:2026-03-10 06:44:12.265470077 +0000 UTC m=+5.295250732,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.202294 4825 event.go:359] "Server rejected event (will 
not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b67d733039718 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:12.271925016 +0000 UTC m=+5.301705641,LastTimestamp:2026-03-10 06:44:12.271925016 +0000 UTC m=+5.301705641,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.209190 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b67d7338b0f81 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:12.280803201 +0000 UTC 
m=+5.310583826,LastTimestamp:2026-03-10 06:44:12.280803201 +0000 UTC m=+5.310583826,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.216593 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b67d73f970d3b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:12.482915643 +0000 UTC m=+5.512696268,LastTimestamp:2026-03-10 06:44:12.482915643 +0000 UTC m=+5.512696268,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.222574 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b67d740024611 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:12.489942545 +0000 UTC m=+5.519723160,LastTimestamp:2026-03-10 06:44:12.489942545 +0000 UTC m=+5.519723160,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.228890 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b67d7407758cc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:12.497615052 +0000 UTC m=+5.527395657,LastTimestamp:2026-03-10 06:44:12.497615052 +0000 UTC m=+5.527395657,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.234780 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b67d740c81b39 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:12.502907705 +0000 UTC m=+5.532688340,LastTimestamp:2026-03-10 06:44:12.502907705 +0000 UTC m=+5.532688340,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.241074 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b67d740ccf3fe openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:12.503225342 +0000 UTC m=+5.533005967,LastTimestamp:2026-03-10 06:44:12.503225342 +0000 UTC m=+5.533005967,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.248076 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b67d741701149 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:12.513915209 +0000 UTC m=+5.543695824,LastTimestamp:2026-03-10 06:44:12.513915209 +0000 UTC m=+5.543695824,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.255517 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b67d7418ea8cc openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:12.515920076 +0000 UTC m=+5.545700691,LastTimestamp:2026-03-10 06:44:12.515920076 +0000 UTC m=+5.545700691,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.263780 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b67d741fac429 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:12.523004969 +0000 UTC m=+5.552785584,LastTimestamp:2026-03-10 06:44:12.523004969 +0000 UTC m=+5.552785584,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.271115 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b67d74230ff7d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:12.526559101 +0000 UTC m=+5.556339716,LastTimestamp:2026-03-10 06:44:12.526559101 +0000 UTC m=+5.556339716,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.278851 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b67d7426651d8 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:12.530053592 +0000 UTC m=+5.559834217,LastTimestamp:2026-03-10 06:44:12.530053592 +0000 UTC m=+5.559834217,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.285325 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b67d743b114a7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:12.551730343 +0000 UTC m=+5.581510958,LastTimestamp:2026-03-10 06:44:12.551730343 
+0000 UTC m=+5.581510958,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.292528 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b67d743cecdee openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:12.553678318 +0000 UTC m=+5.583458933,LastTimestamp:2026-03-10 06:44:12.553678318 +0000 UTC m=+5.583458933,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.300592 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b67d74c5d2152 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:12.697223506 +0000 UTC m=+5.727004121,LastTimestamp:2026-03-10 06:44:12.697223506 +0000 UTC m=+5.727004121,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.306879 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b67d74cfa331e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:12.707517214 +0000 UTC m=+5.737297829,LastTimestamp:2026-03-10 06:44:12.707517214 +0000 UTC m=+5.737297829,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.313756 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b67d74d0e8ef5 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:12.708851445 +0000 UTC m=+5.738632060,LastTimestamp:2026-03-10 06:44:12.708851445 +0000 UTC m=+5.738632060,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.320910 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b67d7506d0d93 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:12.765375891 +0000 UTC m=+5.795156506,LastTimestamp:2026-03-10 06:44:12.765375891 +0000 UTC m=+5.795156506,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 
06:44:55.326209 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b67d751267950 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:12.777527632 +0000 UTC m=+5.807308247,LastTimestamp:2026-03-10 06:44:12.777527632 +0000 UTC m=+5.807308247,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.333453 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b67d751385843 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:12.778698819 +0000 UTC m=+5.808479434,LastTimestamp:2026-03-10 
06:44:12.778698819 +0000 UTC m=+5.808479434,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.340725 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b67d7597be380 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:12.917343104 +0000 UTC m=+5.947123709,LastTimestamp:2026-03-10 06:44:12.917343104 +0000 UTC m=+5.947123709,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.347169 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b67d75a94a36f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container 
kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:12.935742319 +0000 UTC m=+5.965522934,LastTimestamp:2026-03-10 06:44:12.935742319 +0000 UTC m=+5.965522934,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.354263 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b67d75c3d2712 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:12.963563282 +0000 UTC m=+5.993343917,LastTimestamp:2026-03-10 06:44:12.963563282 +0000 UTC m=+5.993343917,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.363104 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b67d75d24aa75 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:12.978735733 +0000 UTC m=+6.008516358,LastTimestamp:2026-03-10 06:44:12.978735733 +0000 UTC m=+6.008516358,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.369530 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b67d75d3c8335 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:12.980298549 +0000 UTC m=+6.010079164,LastTimestamp:2026-03-10 06:44:12.980298549 +0000 UTC m=+6.010079164,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.376363 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b67d769448273 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:13.182149235 +0000 UTC m=+6.211929860,LastTimestamp:2026-03-10 06:44:13.182149235 +0000 UTC m=+6.211929860,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.385018 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b67d76a61ee59 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:13.200854617 +0000 UTC m=+6.230635232,LastTimestamp:2026-03-10 06:44:13.200854617 +0000 UTC m=+6.230635232,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.392344 
4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b67d76a7638c4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:13.202184388 +0000 UTC m=+6.231965003,LastTimestamp:2026-03-10 06:44:13.202184388 +0000 UTC m=+6.231965003,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.401486 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b67d76fd0a5a8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:13.291996584 +0000 UTC 
m=+6.321777199,LastTimestamp:2026-03-10 06:44:13.291996584 +0000 UTC m=+6.321777199,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.410083 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b67d777459ed9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:13.417103065 +0000 UTC m=+6.446883690,LastTimestamp:2026-03-10 06:44:13.417103065 +0000 UTC m=+6.446883690,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.417963 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b67d777fc2ba6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container 
kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:13.429066662 +0000 UTC m=+6.458847287,LastTimestamp:2026-03-10 06:44:13.429066662 +0000 UTC m=+6.458847287,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.426227 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b67d77df2c9cc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:13.529115084 +0000 UTC m=+6.558895729,LastTimestamp:2026-03-10 06:44:13.529115084 +0000 UTC m=+6.558895729,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.433560 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b67d77edcfa33 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container 
etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:13.544462899 +0000 UTC m=+6.574243544,LastTimestamp:2026-03-10 06:44:13.544462899 +0000 UTC m=+6.574243544,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.444096 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b67d7ae42aa2f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:14.339656239 +0000 UTC m=+7.369436864,LastTimestamp:2026-03-10 06:44:14.339656239 +0000 UTC m=+7.369436864,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.452412 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b67d7baaa1627 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:14.547760679 +0000 UTC m=+7.577541304,LastTimestamp:2026-03-10 06:44:14.547760679 +0000 UTC m=+7.577541304,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.460007 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b67d7bb38568b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:14.557083275 +0000 UTC m=+7.586863900,LastTimestamp:2026-03-10 06:44:14.557083275 +0000 UTC m=+7.586863900,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.467513 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b67d7bb531665 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:14.558836325 +0000 UTC m=+7.588616970,LastTimestamp:2026-03-10 06:44:14.558836325 +0000 UTC m=+7.588616970,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.475591 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b67d7c81cce9e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:14.773382814 +0000 UTC m=+7.803163429,LastTimestamp:2026-03-10 06:44:14.773382814 +0000 UTC m=+7.803163429,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.482795 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b67d7c90420fb openshift-etcd 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:14.788542715 +0000 UTC m=+7.818323370,LastTimestamp:2026-03-10 06:44:14.788542715 +0000 UTC m=+7.818323370,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.490985 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b67d7c918c070 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:14.789894256 +0000 UTC m=+7.819674871,LastTimestamp:2026-03-10 06:44:14.789894256 +0000 UTC m=+7.819674871,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.497967 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189b67d7d67d78b8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:15.01459884 +0000 UTC m=+8.044379465,LastTimestamp:2026-03-10 06:44:15.01459884 +0000 UTC m=+8.044379465,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.505864 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b67d7d77c6db0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:15.031307696 +0000 UTC m=+8.061088351,LastTimestamp:2026-03-10 06:44:15.031307696 +0000 UTC m=+8.061088351,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.512812 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b67d7d792af01 openshift-etcd 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:15.032766209 +0000 UTC m=+8.062546854,LastTimestamp:2026-03-10 06:44:15.032766209 +0000 UTC m=+8.062546854,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.521669 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b67d7e7682f7e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:15.29841651 +0000 UTC m=+8.328197125,LastTimestamp:2026-03-10 06:44:15.29841651 +0000 UTC m=+8.328197125,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.528990 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189b67d7e89eb052 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:15.31876565 +0000 UTC m=+8.348546265,LastTimestamp:2026-03-10 06:44:15.31876565 +0000 UTC m=+8.348546265,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: W0310 06:44:55.539936 4825 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.540005 4825 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.540098 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b67d7e8bbb426 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:15.320667174 +0000 UTC m=+8.350447789,LastTimestamp:2026-03-10 06:44:15.320667174 +0000 UTC m=+8.350447789,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.554589 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b67d7f578b479 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:15.534380153 +0000 UTC m=+8.564160768,LastTimestamp:2026-03-10 06:44:15.534380153 +0000 UTC m=+8.564160768,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.562547 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b67d7f69373d3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:15.552910291 +0000 UTC m=+8.582690906,LastTimestamp:2026-03-10 06:44:15.552910291 +0000 UTC m=+8.582690906,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.571654 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 10 06:44:55 crc kubenswrapper[4825]: &Event{ObjectMeta:{kube-controller-manager-crc.189b67d8e3dfdf38 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 10 06:44:55 crc kubenswrapper[4825]: body: Mar 10 06:44:55 crc kubenswrapper[4825]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:19.534118712 +0000 UTC m=+12.563899327,LastTimestamp:2026-03-10 06:44:19.534118712 +0000 UTC m=+12.563899327,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 06:44:55 crc kubenswrapper[4825]: > Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.578670 4825 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b67d8e3e0ed93 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:19.534187923 +0000 UTC m=+12.563968538,LastTimestamp:2026-03-10 06:44:19.534187923 +0000 UTC m=+12.563968538,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.583745 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 10 06:44:55 crc kubenswrapper[4825]: &Event{ObjectMeta:{kube-apiserver-crc.189b67da217a3c2a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 10 06:44:55 crc kubenswrapper[4825]: body: 
{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 10 06:44:55 crc kubenswrapper[4825]: Mar 10 06:44:55 crc kubenswrapper[4825]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:24.862612522 +0000 UTC m=+17.892393137,LastTimestamp:2026-03-10 06:44:24.862612522 +0000 UTC m=+17.892393137,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 06:44:55 crc kubenswrapper[4825]: > Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.592366 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b67da217dd962 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:24.862849378 +0000 UTC m=+17.892629993,LastTimestamp:2026-03-10 06:44:24.862849378 +0000 UTC m=+17.892629993,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.599020 4825 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b67da217a3c2a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event=< Mar 10 06:44:55 crc kubenswrapper[4825]: &Event{ObjectMeta:{kube-apiserver-crc.189b67da217a3c2a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 10 06:44:55 crc kubenswrapper[4825]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 10 06:44:55 crc kubenswrapper[4825]: Mar 10 06:44:55 crc kubenswrapper[4825]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:24.862612522 +0000 UTC m=+17.892393137,LastTimestamp:2026-03-10 06:44:24.867847065 +0000 UTC m=+17.897627710,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 06:44:55 crc kubenswrapper[4825]: > Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.606053 4825 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b67da217dd962\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b67da217dd962 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 
403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:24.862849378 +0000 UTC m=+17.892629993,LastTimestamp:2026-03-10 06:44:24.867915167 +0000 UTC m=+17.897695812,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.610984 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 10 06:44:55 crc kubenswrapper[4825]: &Event{ObjectMeta:{kube-apiserver-crc.189b67da2818e23a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 500 Mar 10 06:44:55 crc kubenswrapper[4825]: body: [+]ping ok Mar 10 06:44:55 crc kubenswrapper[4825]: [+]log ok Mar 10 06:44:55 crc kubenswrapper[4825]: [+]etcd ok Mar 10 06:44:55 crc kubenswrapper[4825]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 10 06:44:55 crc kubenswrapper[4825]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 10 06:44:55 crc kubenswrapper[4825]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 10 06:44:55 crc kubenswrapper[4825]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 10 06:44:55 crc kubenswrapper[4825]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 10 06:44:55 crc kubenswrapper[4825]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 10 06:44:55 crc kubenswrapper[4825]: [+]poststarthook/generic-apiserver-start-informers ok Mar 10 06:44:55 crc kubenswrapper[4825]: 
[+]poststarthook/priority-and-fairness-config-consumer ok Mar 10 06:44:55 crc kubenswrapper[4825]: [+]poststarthook/priority-and-fairness-filter ok Mar 10 06:44:55 crc kubenswrapper[4825]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 10 06:44:55 crc kubenswrapper[4825]: [+]poststarthook/start-apiextensions-informers ok Mar 10 06:44:55 crc kubenswrapper[4825]: [-]poststarthook/start-apiextensions-controllers failed: reason withheld Mar 10 06:44:55 crc kubenswrapper[4825]: [-]poststarthook/crd-informer-synced failed: reason withheld Mar 10 06:44:55 crc kubenswrapper[4825]: [+]poststarthook/start-system-namespaces-controller ok Mar 10 06:44:55 crc kubenswrapper[4825]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 10 06:44:55 crc kubenswrapper[4825]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 10 06:44:55 crc kubenswrapper[4825]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 10 06:44:55 crc kubenswrapper[4825]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 10 06:44:55 crc kubenswrapper[4825]: [+]poststarthook/start-service-ip-repair-controllers ok Mar 10 06:44:55 crc kubenswrapper[4825]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 10 06:44:55 crc kubenswrapper[4825]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Mar 10 06:44:55 crc kubenswrapper[4825]: [+]poststarthook/priority-and-fairness-config-producer ok Mar 10 06:44:55 crc kubenswrapper[4825]: [+]poststarthook/bootstrap-controller ok Mar 10 06:44:55 crc kubenswrapper[4825]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 10 06:44:55 crc kubenswrapper[4825]: [+]poststarthook/start-kube-aggregator-informers ok Mar 10 06:44:55 crc kubenswrapper[4825]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 10 06:44:55 crc kubenswrapper[4825]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 10 
06:44:55 crc kubenswrapper[4825]: [+]poststarthook/apiservice-registration-controller ok Mar 10 06:44:55 crc kubenswrapper[4825]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 10 06:44:55 crc kubenswrapper[4825]: [+]poststarthook/apiservice-discovery-controller ok Mar 10 06:44:55 crc kubenswrapper[4825]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 10 06:44:55 crc kubenswrapper[4825]: [+]autoregister-completion ok Mar 10 06:44:55 crc kubenswrapper[4825]: [+]poststarthook/apiservice-openapi-controller ok Mar 10 06:44:55 crc kubenswrapper[4825]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 10 06:44:55 crc kubenswrapper[4825]: livez check failed Mar 10 06:44:55 crc kubenswrapper[4825]: Mar 10 06:44:55 crc kubenswrapper[4825]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:24.973673018 +0000 UTC m=+18.003453633,LastTimestamp:2026-03-10 06:44:24.973673018 +0000 UTC m=+18.003453633,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 06:44:55 crc kubenswrapper[4825]: > Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.613615 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b67da28198bb9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:24.973716409 +0000 UTC m=+18.003497024,LastTimestamp:2026-03-10 
06:44:24.973716409 +0000 UTC m=+18.003497024,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.620073 4825 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b67d76a7638c4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b67d76a7638c4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:13.202184388 +0000 UTC m=+6.231965003,LastTimestamp:2026-03-10 06:44:25.387234379 +0000 UTC m=+18.417015014,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.627491 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 10 06:44:55 crc kubenswrapper[4825]: &Event{ObjectMeta:{kube-controller-manager-crc.189b67db37ff4afa openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 10 06:44:55 crc kubenswrapper[4825]: body: Mar 10 06:44:55 crc kubenswrapper[4825]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:29.53539865 +0000 UTC m=+22.565179305,LastTimestamp:2026-03-10 06:44:29.53539865 +0000 UTC m=+22.565179305,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 06:44:55 crc kubenswrapper[4825]: > Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.634691 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b67db38004420 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:29.535462432 +0000 UTC m=+22.565243077,LastTimestamp:2026-03-10 06:44:29.535462432 +0000 UTC 
m=+22.565243077,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.641555 4825 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b67db37ff4afa\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 10 06:44:55 crc kubenswrapper[4825]: &Event{ObjectMeta:{kube-controller-manager-crc.189b67db37ff4afa openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 10 06:44:55 crc kubenswrapper[4825]: body: Mar 10 06:44:55 crc kubenswrapper[4825]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:29.53539865 +0000 UTC m=+22.565179305,LastTimestamp:2026-03-10 06:44:39.534440441 +0000 UTC m=+32.564221066,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 06:44:55 crc kubenswrapper[4825]: > Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.646830 4825 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b67db38004420\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b67db38004420 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:29.535462432 +0000 UTC m=+22.565243077,LastTimestamp:2026-03-10 06:44:39.534500042 +0000 UTC m=+32.564280667,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.653098 4825 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b67dd8c242457 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:39.537034327 +0000 UTC m=+32.566814952,LastTimestamp:2026-03-10 06:44:39.537034327 +0000 UTC m=+32.566814952,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc 
kubenswrapper[4825]: E0310 06:44:55.658288 4825 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b67d70d8598d0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b67d70d8598d0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:11.642910928 +0000 UTC m=+4.672691583,LastTimestamp:2026-03-10 06:44:39.658364304 +0000 UTC m=+32.688144929,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.663473 4825 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b67d7213a75bd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b67d7213a75bd openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created 
container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:11.973531069 +0000 UTC m=+5.003311684,LastTimestamp:2026-03-10 06:44:39.890846798 +0000 UTC m=+32.920627413,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.670083 4825 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b67d721f8cbc9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b67d721f8cbc9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:11.986004937 +0000 UTC m=+5.015785542,LastTimestamp:2026-03-10 06:44:39.89801281 +0000 UTC m=+32.927793425,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.677464 4825 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b67d8e3dfdf38\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 10 06:44:55 crc kubenswrapper[4825]: &Event{ObjectMeta:{kube-controller-manager-crc.189b67d8e3dfdf38 openshift-kube-controller-manager 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 10 06:44:55 crc kubenswrapper[4825]: body: Mar 10 06:44:55 crc kubenswrapper[4825]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:19.534118712 +0000 UTC m=+12.563899327,LastTimestamp:2026-03-10 06:44:49.535372568 +0000 UTC m=+42.565153223,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 06:44:55 crc kubenswrapper[4825]: > Mar 10 06:44:55 crc kubenswrapper[4825]: E0310 06:44:55.683955 4825 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b67d8e3e0ed93\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b67d8e3e0ed93 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:44:19.534187923 +0000 UTC m=+12.563968538,LastTimestamp:2026-03-10 06:44:49.53546029 +0000 UTC 
m=+42.565240945,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:44:56 crc kubenswrapper[4825]: I0310 06:44:56.117276 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 06:44:56 crc kubenswrapper[4825]: W0310 06:44:56.300515 4825 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 10 06:44:56 crc kubenswrapper[4825]: E0310 06:44:56.301034 4825 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 10 06:44:57 crc kubenswrapper[4825]: I0310 06:44:57.113424 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 06:44:58 crc kubenswrapper[4825]: I0310 06:44:58.118642 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 06:44:59 crc kubenswrapper[4825]: I0310 06:44:59.117710 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot 
get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 06:44:59 crc kubenswrapper[4825]: I0310 06:44:59.283633 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:59 crc kubenswrapper[4825]: I0310 06:44:59.285652 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:59 crc kubenswrapper[4825]: I0310 06:44:59.285727 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:59 crc kubenswrapper[4825]: I0310 06:44:59.285756 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:59 crc kubenswrapper[4825]: I0310 06:44:59.285801 4825 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 06:44:59 crc kubenswrapper[4825]: E0310 06:44:59.294177 4825 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 10 06:44:59 crc kubenswrapper[4825]: E0310 06:44:59.300058 4825 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 10 06:44:59 crc kubenswrapper[4825]: I0310 06:44:59.444606 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 06:44:59 crc kubenswrapper[4825]: I0310 06:44:59.444882 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:59 crc kubenswrapper[4825]: I0310 06:44:59.446800 4825 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:59 crc kubenswrapper[4825]: I0310 06:44:59.446998 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:59 crc kubenswrapper[4825]: I0310 06:44:59.447184 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:44:59 crc kubenswrapper[4825]: I0310 06:44:59.453075 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 06:44:59 crc kubenswrapper[4825]: E0310 06:44:59.528218 4825 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 06:44:59 crc kubenswrapper[4825]: I0310 06:44:59.531030 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:44:59 crc kubenswrapper[4825]: I0310 06:44:59.532724 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:44:59 crc kubenswrapper[4825]: I0310 06:44:59.532794 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:44:59 crc kubenswrapper[4825]: I0310 06:44:59.532820 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:00 crc kubenswrapper[4825]: I0310 06:45:00.115566 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 06:45:00 crc kubenswrapper[4825]: I0310 06:45:00.236285 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:45:00 crc kubenswrapper[4825]: I0310 06:45:00.238010 
4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:00 crc kubenswrapper[4825]: I0310 06:45:00.238074 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:00 crc kubenswrapper[4825]: I0310 06:45:00.238096 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:00 crc kubenswrapper[4825]: I0310 06:45:00.238975 4825 scope.go:117] "RemoveContainer" containerID="4e0592c099e85d8ce818112bf02431516ffa3696e764fe5a8a824baaf2e1c0da" Mar 10 06:45:00 crc kubenswrapper[4825]: E0310 06:45:00.239306 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 06:45:01 crc kubenswrapper[4825]: I0310 06:45:01.113580 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 06:45:02 crc kubenswrapper[4825]: I0310 06:45:02.113762 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 06:45:02 crc kubenswrapper[4825]: I0310 06:45:02.500916 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 06:45:02 crc kubenswrapper[4825]: I0310 06:45:02.501194 4825 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Mar 10 06:45:02 crc kubenswrapper[4825]: I0310 06:45:02.502932 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:02 crc kubenswrapper[4825]: I0310 06:45:02.503053 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:02 crc kubenswrapper[4825]: I0310 06:45:02.503124 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:03 crc kubenswrapper[4825]: I0310 06:45:03.116826 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 06:45:04 crc kubenswrapper[4825]: I0310 06:45:04.118456 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 06:45:05 crc kubenswrapper[4825]: I0310 06:45:05.113798 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 06:45:06 crc kubenswrapper[4825]: I0310 06:45:06.117482 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 06:45:06 crc kubenswrapper[4825]: I0310 06:45:06.294428 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:45:06 crc 
kubenswrapper[4825]: I0310 06:45:06.296454 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:06 crc kubenswrapper[4825]: I0310 06:45:06.296516 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:06 crc kubenswrapper[4825]: I0310 06:45:06.296540 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:06 crc kubenswrapper[4825]: I0310 06:45:06.296580 4825 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 06:45:06 crc kubenswrapper[4825]: E0310 06:45:06.302374 4825 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 10 06:45:06 crc kubenswrapper[4825]: E0310 06:45:06.303011 4825 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 10 06:45:07 crc kubenswrapper[4825]: I0310 06:45:07.115998 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 06:45:08 crc kubenswrapper[4825]: I0310 06:45:08.116473 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 06:45:09 crc kubenswrapper[4825]: I0310 06:45:09.115196 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: 
csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 06:45:09 crc kubenswrapper[4825]: E0310 06:45:09.529818 4825 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 06:45:10 crc kubenswrapper[4825]: I0310 06:45:10.118399 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 06:45:11 crc kubenswrapper[4825]: I0310 06:45:11.117812 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 06:45:12 crc kubenswrapper[4825]: I0310 06:45:12.114655 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 06:45:13 crc kubenswrapper[4825]: I0310 06:45:13.117331 4825 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 06:45:13 crc kubenswrapper[4825]: I0310 06:45:13.302820 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:45:13 crc kubenswrapper[4825]: I0310 06:45:13.305095 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:13 crc kubenswrapper[4825]: I0310 06:45:13.305167 4825 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:13 crc kubenswrapper[4825]: I0310 06:45:13.305181 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:13 crc kubenswrapper[4825]: I0310 06:45:13.305210 4825 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 06:45:13 crc kubenswrapper[4825]: E0310 06:45:13.310837 4825 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 10 06:45:13 crc kubenswrapper[4825]: E0310 06:45:13.311563 4825 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 10 06:45:13 crc kubenswrapper[4825]: I0310 06:45:13.651489 4825 csr.go:261] certificate signing request csr-s4thh is approved, waiting to be issued Mar 10 06:45:13 crc kubenswrapper[4825]: I0310 06:45:13.661005 4825 csr.go:257] certificate signing request csr-s4thh is issued Mar 10 06:45:13 crc kubenswrapper[4825]: I0310 06:45:13.693977 4825 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 10 06:45:14 crc kubenswrapper[4825]: I0310 06:45:14.540504 4825 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 10 06:45:14 crc kubenswrapper[4825]: I0310 06:45:14.662819 4825 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-12 09:02:30.523278768 +0000 UTC Mar 10 06:45:14 crc kubenswrapper[4825]: I0310 06:45:14.662889 4825 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 
6650h17m15.860395866s for next certificate rotation Mar 10 06:45:15 crc kubenswrapper[4825]: I0310 06:45:15.235744 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:45:15 crc kubenswrapper[4825]: I0310 06:45:15.237231 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:15 crc kubenswrapper[4825]: I0310 06:45:15.237264 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:15 crc kubenswrapper[4825]: I0310 06:45:15.237273 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:15 crc kubenswrapper[4825]: I0310 06:45:15.237719 4825 scope.go:117] "RemoveContainer" containerID="4e0592c099e85d8ce818112bf02431516ffa3696e764fe5a8a824baaf2e1c0da" Mar 10 06:45:15 crc kubenswrapper[4825]: I0310 06:45:15.578407 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 10 06:45:15 crc kubenswrapper[4825]: I0310 06:45:15.580819 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d"} Mar 10 06:45:15 crc kubenswrapper[4825]: I0310 06:45:15.580969 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:45:15 crc kubenswrapper[4825]: I0310 06:45:15.582267 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:15 crc kubenswrapper[4825]: I0310 06:45:15.582322 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:15 crc 
kubenswrapper[4825]: I0310 06:45:15.582336 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:16 crc kubenswrapper[4825]: I0310 06:45:16.509711 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 06:45:16 crc kubenswrapper[4825]: I0310 06:45:16.585000 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 10 06:45:16 crc kubenswrapper[4825]: I0310 06:45:16.586162 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 10 06:45:16 crc kubenswrapper[4825]: I0310 06:45:16.588554 4825 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d" exitCode=255 Mar 10 06:45:16 crc kubenswrapper[4825]: I0310 06:45:16.588630 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d"} Mar 10 06:45:16 crc kubenswrapper[4825]: I0310 06:45:16.588761 4825 scope.go:117] "RemoveContainer" containerID="4e0592c099e85d8ce818112bf02431516ffa3696e764fe5a8a824baaf2e1c0da" Mar 10 06:45:16 crc kubenswrapper[4825]: I0310 06:45:16.588761 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:45:16 crc kubenswrapper[4825]: I0310 06:45:16.590111 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:16 crc kubenswrapper[4825]: I0310 06:45:16.590233 4825 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:16 crc kubenswrapper[4825]: I0310 06:45:16.590307 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:16 crc kubenswrapper[4825]: I0310 06:45:16.590898 4825 scope.go:117] "RemoveContainer" containerID="414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d" Mar 10 06:45:16 crc kubenswrapper[4825]: E0310 06:45:16.591610 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 06:45:16 crc kubenswrapper[4825]: I0310 06:45:16.947417 4825 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 10 06:45:17 crc kubenswrapper[4825]: I0310 06:45:17.593805 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 10 06:45:17 crc kubenswrapper[4825]: I0310 06:45:17.597682 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:45:17 crc kubenswrapper[4825]: I0310 06:45:17.599243 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:17 crc kubenswrapper[4825]: I0310 06:45:17.599298 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:17 crc kubenswrapper[4825]: I0310 06:45:17.599317 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 
06:45:17 crc kubenswrapper[4825]: I0310 06:45:17.600238 4825 scope.go:117] "RemoveContainer" containerID="414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d" Mar 10 06:45:17 crc kubenswrapper[4825]: E0310 06:45:17.600528 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 06:45:17 crc kubenswrapper[4825]: I0310 06:45:17.784155 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 06:45:18 crc kubenswrapper[4825]: I0310 06:45:18.599494 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:45:18 crc kubenswrapper[4825]: I0310 06:45:18.600563 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:18 crc kubenswrapper[4825]: I0310 06:45:18.600640 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:18 crc kubenswrapper[4825]: I0310 06:45:18.600660 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:18 crc kubenswrapper[4825]: I0310 06:45:18.601820 4825 scope.go:117] "RemoveContainer" containerID="414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d" Mar 10 06:45:18 crc kubenswrapper[4825]: E0310 06:45:18.602188 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 06:45:18 crc kubenswrapper[4825]: I0310 06:45:18.928009 4825 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.094562 4825 apiserver.go:52] "Watching apiserver" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.101665 4825 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.102270 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb"] Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.102890 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.103121 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.104089 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:45:19 crc kubenswrapper[4825]: E0310 06:45:19.104255 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.103168 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.103232 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:45:19 crc kubenswrapper[4825]: E0310 06:45:19.104350 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:45:19 crc kubenswrapper[4825]: E0310 06:45:19.104459 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.104282 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.106592 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.107049 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.107628 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.108075 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.108206 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.108282 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.108786 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.109250 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.109388 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 10 
06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.117752 4825 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.128315 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.128377 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.128418 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.128456 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.128492 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 10 06:45:19 crc 
kubenswrapper[4825]: I0310 06:45:19.128529 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.128561 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.128591 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.128629 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.128726 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.128759 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.128797 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.128819 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.128831 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.128911 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.128939 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.128984 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.129022 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.129060 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.129101 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.129161 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.129200 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.129234 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.129271 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.129292 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.129307 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.129431 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.129474 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.129518 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.129558 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.129596 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.129604 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.129636 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.129679 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.129716 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.129754 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod 
\"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.129788 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.129824 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.129862 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.129883 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.129903 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.129944 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.129981 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.130017 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.130059 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.130096 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.130160 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.130198 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.130230 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.130367 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.130625 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.130719 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.130816 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.130913 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.131040 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.131258 4825 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.131325 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.131383 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.131435 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.131490 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.131629 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.131679 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.131732 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.131787 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.132019 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.132381 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.132672 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.132716 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.132906 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.132988 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.133660 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.133803 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:45:19 crc 
kubenswrapper[4825]: I0310 06:45:19.133982 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.134205 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.134253 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.134466 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.134504 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.134539 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.134577 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.134662 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.134862 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.134969 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.135825 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 
06:45:19.135904 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.135940 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.136033 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.136076 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.136112 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.136175 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: 
\"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.136210 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.136249 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.136289 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.136324 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.136365 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 10 06:45:19 
crc kubenswrapper[4825]: I0310 06:45:19.136402 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.136439 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.136475 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.136513 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.136547 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.136584 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.136621 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.136659 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.136773 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.136872 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.136946 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 
06:45:19.130625 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.131555 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.131818 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.131838 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.131987 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.132045 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.132200 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.132245 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.132397 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.132606 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.132720 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.133648 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.133899 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.133990 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.134055 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.134685 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.134976 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.135269 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.135384 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.140327 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.140582 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.135411 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.135613 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.135746 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.140906 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.137062 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.135778 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.135817 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.137769 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.141014 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.141082 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.141121 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.141195 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.141235 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.141271 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.141304 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.141339 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.141375 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.149658 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.149759 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.149819 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.150036 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.150099 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.150165 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.150216 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.150259 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.150311 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.150348 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.150432 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.150495 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.150534 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.150578 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.150624 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.150662 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.150715 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.150759 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.150841 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.150882 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.150927 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.150968 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.151002 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.151047 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.151097 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.151170 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.151208 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.151253 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.151296 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.151335 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.151376 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.151438 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.151695 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.151754 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.151798 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.151948 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.152060 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.152115 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.152198 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.152244 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.152286 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.152380 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.152459 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.152524 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.152640 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.152695 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.152782 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.152828 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.152877 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.152928 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.152972 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.153025 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.153080 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.153164 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.153212 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.153259 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.153313 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.153376 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.153457 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.153544 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.153604 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.153648 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.153705 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.153771 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.153823 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.153866 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.153911 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.153955 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.154022 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.154069 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.154183 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.154232 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.154278 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.154434 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.154496 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.154647 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.154697 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.154878 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.154937 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.155050 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.155106 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.155304 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.155365 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.155563 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.155727 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.155780 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.155827 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.156105 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.156235 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName:
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.156363 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.156511 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.156567 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.156761 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 06:45:19 crc 
kubenswrapper[4825]: I0310 06:45:19.156818 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.156888 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.157112 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.157194 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.157250 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.157300 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.157508 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.157562 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.158691 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.158733 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.158769 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" 
(UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.158791 4825 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.158814 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.158919 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.158951 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.158974 4825 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.158998 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.159021 4825 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: 
I0310 06:45:19.159049 4825 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.159251 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.159282 4825 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.159308 4825 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.159338 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.159362 4825 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.159384 4825 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.159511 4825 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.159534 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.159557 4825 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.159580 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.159607 4825 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.159632 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.159735 4825 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.159760 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: 
\"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.159785 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.159812 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.159837 4825 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.137983 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.161687 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.161855 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.162343 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.162865 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.162899 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.162869 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.138208 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.163255 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.139033 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.163375 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.163383 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.139742 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.139897 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.140066 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.141177 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.141268 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.141514 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.141543 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.141713 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.145078 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.145230 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.145161 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.145692 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.145749 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.145797 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.146079 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.146316 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.146422 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.146541 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.146655 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.146891 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.147530 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.147561 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.147761 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.148070 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.148067 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.148200 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.148509 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.148758 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.148921 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.149414 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.149461 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.149504 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.149519 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.149548 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.149726 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.149702 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.150291 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.150458 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.150418 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.150645 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.150863 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.151117 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.151286 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.151474 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.151829 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.151594 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.152851 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.154237 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.154960 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.156280 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.156514 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.157193 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.157765 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.158453 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.158907 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.159643 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.138734 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.163929 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.164553 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.164681 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.165191 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.165241 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.165272 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.165763 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.166095 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.166274 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.166292 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.166777 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.166803 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.167296 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.167462 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.167481 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.167740 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.168125 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.168207 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.168207 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.168107 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.167780 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.168764 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.168597 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.169232 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.169100 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.169437 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.169432 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.169858 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.170161 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.170408 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.170784 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.170991 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.171518 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.171434 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.171944 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.171058 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: E0310 06:45:19.172324 4825 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.172328 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.172476 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.172521 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.172807 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: E0310 06:45:19.172834 4825 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.152883 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.174256 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.174380 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.174495 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.174963 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.175003 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.175188 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.175418 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.175807 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.175724 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.175928 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.176386 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.176538 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.176926 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.177831 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: E0310 06:45:19.171933 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:45:19.671594031 +0000 UTC m=+72.701374686 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.177965 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.178694 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.175684 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.178955 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.176604 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.159861 4825 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.179765 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.179896 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.175408 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.182822 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.182893 4825 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.180189 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.180217 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). 
InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.180282 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.180538 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: E0310 06:45:19.181246 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 06:45:19.679673883 +0000 UTC m=+72.709454688 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.181685 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.181928 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.181970 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.182427 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: E0310 06:45:19.190193 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 06:45:19.690164706 +0000 UTC m=+72.719945351 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.190105 4825 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.179400 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.190263 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.190295 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.190314 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.190372 4825 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.182704 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.190393 4825 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.182789 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.182815 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.182840 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.184515 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.190847 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.193831 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.193863 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.194587 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.199058 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.199543 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.202106 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.205362 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.206254 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: E0310 06:45:19.206668 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 06:45:19 crc kubenswrapper[4825]: E0310 06:45:19.206761 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 06:45:19 crc kubenswrapper[4825]: E0310 06:45:19.206828 4825 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 06:45:19 crc kubenswrapper[4825]: E0310 06:45:19.206936 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 06:45:19.706917375 +0000 UTC m=+72.736697990 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.184307 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.209448 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.209588 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 06:45:19 crc kubenswrapper[4825]: E0310 06:45:19.209887 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 06:45:19 crc kubenswrapper[4825]: E0310 06:45:19.210013 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 06:45:19 crc kubenswrapper[4825]: E0310 06:45:19.210164 4825 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 06:45:19 crc kubenswrapper[4825]: E0310 06:45:19.210412 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 06:45:19.710395122 +0000 UTC m=+72.740175947 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.215802 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.215984 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.216340 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.216378 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.216709 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.218938 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.223571 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.226165 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.233759 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.244127 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.244670 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.245368 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.249008 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.250780 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.252486 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.253396 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.254573 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.254967 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.255767 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.257673 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.258897 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.260630 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.261568 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.263522 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.264469 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.265442 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.267105 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.267393 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.268917 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.271111 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.272106 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" 
path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.273250 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.274916 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.275953 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.277412 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.278099 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.279523 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.281608 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.281698 4825 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.282704 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.284748 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.285415 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.286194 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.287271 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.287874 4825 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.288002 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.290774 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.290955 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.291001 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.291072 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath 
\"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.291094 4825 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.291116 4825 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.291161 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.291361 4825 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.291473 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.291525 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.291725 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod 
\"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.291810 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.292760 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.292890 4825 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.292991 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.293095 4825 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.293208 4825 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.293286 4825 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.293364 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.293444 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.293522 4825 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.293600 4825 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.293678 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.293763 4825 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.293824 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.293858 4825 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.294068 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.294179 4825 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.294282 4825 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.294381 4825 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.294464 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.294545 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.294623 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.294697 4825 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.294768 4825 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.294851 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" 
Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.294954 4825 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.295041 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.295115 4825 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.295225 4825 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.295317 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.295393 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.295472 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.295545 4825 reconciler_common.go:293] "Volume detached for 
volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.295623 4825 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.295702 4825 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.295783 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.295855 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.295933 4825 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.296010 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.296087 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: 
\"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.296189 4825 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.296281 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.296358 4825 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.296442 4825 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.296517 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.296590 4825 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.296669 4825 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node 
\"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.296743 4825 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.296820 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.296899 4825 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.296971 4825 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.297057 4825 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.296998 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.297167 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.297329 4825 reconciler_common.go:293] 
"Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.297404 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.297480 4825 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.297559 4825 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.297640 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.297719 4825 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.297797 4825 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.297871 4825 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.297951 4825 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.298029 4825 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.298287 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.298394 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.298469 4825 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.298542 4825 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.298614 4825 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.298701 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.298775 4825 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.298863 4825 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.298958 4825 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.299109 4825 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.299245 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.298516 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.299341 4825 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.299407 4825 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.299433 4825 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.299453 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.299473 4825 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.299495 4825 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.299515 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.299537 4825 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.299560 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.299578 4825 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.299598 4825 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.299617 4825 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.299636 4825 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.299654 4825 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.299672 4825 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.299691 4825 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.299713 4825 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.299735 4825 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.299755 4825 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.299776 4825 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.299794 4825 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.299814 4825 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.299833 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.299879 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.300275 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.300422 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.300511 4825 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.300596 4825 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.300677 4825 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 10 
06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.300755 4825 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.300830 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.300913 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.300991 4825 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.301063 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.301158 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.301239 4825 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.301323 4825 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.301399 4825 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.301563 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.301639 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.301709 4825 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.301789 4825 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.301867 4825 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.301945 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.302125 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.302471 4825 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.302551 4825 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.302632 4825 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.302733 4825 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.302581 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.302820 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc 
kubenswrapper[4825]: I0310 06:45:19.302964 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.303040 4825 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.303116 4825 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.303218 4825 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.303306 4825 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.303391 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.303486 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.303595 4825 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.303691 4825 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.303788 4825 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.303893 4825 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.303997 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.304123 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.304246 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.304339 4825 reconciler_common.go:293] "Volume detached 
for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.304613 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.304761 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.304896 4825 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.304986 4825 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.305070 4825 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.305222 4825 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.305367 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: 
\"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.305484 4825 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.305599 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.305725 4825 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.305859 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.305997 4825 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.303955 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.306129 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 10 
06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.306413 4825 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.306547 4825 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.306652 4825 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.306880 4825 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.307010 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.306950 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.307041 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.308539 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.313206 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.314340 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.315063 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 10 06:45:19 crc 
kubenswrapper[4825]: I0310 06:45:19.316607 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.318106 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.319580 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.320486 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.321925 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.322613 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.323922 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.325459 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.327170 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.327939 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.329304 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.335825 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.348832 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.359406 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.370806 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.424862 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.441933 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 06:45:19 crc kubenswrapper[4825]: W0310 06:45:19.448093 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-185d2b15c2b90331e4661601bfa3c357a0a2de1f02270828c1b536236ce10706 WatchSource:0}: Error finding container 185d2b15c2b90331e4661601bfa3c357a0a2de1f02270828c1b536236ce10706: Status 404 returned error can't find the container with id 185d2b15c2b90331e4661601bfa3c357a0a2de1f02270828c1b536236ce10706 Mar 10 06:45:19 crc kubenswrapper[4825]: E0310 06:45:19.451299 4825 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 06:45:19 crc kubenswrapper[4825]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 10 06:45:19 crc kubenswrapper[4825]: set -o allexport Mar 10 06:45:19 crc kubenswrapper[4825]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 10 06:45:19 crc kubenswrapper[4825]: source /etc/kubernetes/apiserver-url.env Mar 10 06:45:19 crc kubenswrapper[4825]: else Mar 10 06:45:19 crc kubenswrapper[4825]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 10 06:45:19 crc kubenswrapper[4825]: exit 1 Mar 10 06:45:19 crc kubenswrapper[4825]: fi Mar 10 06:45:19 crc kubenswrapper[4825]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 10 06:45:19 crc kubenswrapper[4825]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 06:45:19 crc kubenswrapper[4825]: > logger="UnhandledError" Mar 10 06:45:19 crc kubenswrapper[4825]: E0310 06:45:19.452404 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 10 06:45:19 crc kubenswrapper[4825]: E0310 06:45:19.462812 4825 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 06:45:19 crc kubenswrapper[4825]: container 
&Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 06:45:19 crc kubenswrapper[4825]: if [[ -f "/env/_master" ]]; then Mar 10 06:45:19 crc kubenswrapper[4825]: set -o allexport Mar 10 06:45:19 crc kubenswrapper[4825]: source "/env/_master" Mar 10 06:45:19 crc kubenswrapper[4825]: set +o allexport Mar 10 06:45:19 crc kubenswrapper[4825]: fi Mar 10 06:45:19 crc kubenswrapper[4825]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 10 06:45:19 crc kubenswrapper[4825]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 10 06:45:19 crc kubenswrapper[4825]: ho_enable="--enable-hybrid-overlay" Mar 10 06:45:19 crc kubenswrapper[4825]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 10 06:45:19 crc kubenswrapper[4825]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 10 06:45:19 crc kubenswrapper[4825]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 10 06:45:19 crc kubenswrapper[4825]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 10 06:45:19 crc kubenswrapper[4825]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 10 06:45:19 crc kubenswrapper[4825]: --webhook-host=127.0.0.1 \ Mar 10 06:45:19 crc kubenswrapper[4825]: --webhook-port=9743 \ Mar 10 06:45:19 crc kubenswrapper[4825]: ${ho_enable} \ Mar 10 06:45:19 crc kubenswrapper[4825]: --enable-interconnect \ Mar 10 06:45:19 crc kubenswrapper[4825]: --disable-approver \ Mar 10 06:45:19 crc kubenswrapper[4825]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 10 06:45:19 crc kubenswrapper[4825]: --wait-for-kubernetes-api=200s \ Mar 10 06:45:19 crc kubenswrapper[4825]: 
--pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 10 06:45:19 crc kubenswrapper[4825]: --loglevel="${LOGLEVEL}" Mar 10 06:45:19 crc kubenswrapper[4825]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 06:45:19 crc kubenswrapper[4825]: > logger="UnhandledError" Mar 10 06:45:19 crc kubenswrapper[4825]: E0310 06:45:19.465980 4825 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 06:45:19 crc kubenswrapper[4825]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 06:45:19 crc kubenswrapper[4825]: if [[ -f "/env/_master" ]]; then Mar 10 06:45:19 crc kubenswrapper[4825]: set -o allexport Mar 10 06:45:19 crc kubenswrapper[4825]: source "/env/_master" Mar 10 06:45:19 crc kubenswrapper[4825]: set +o allexport Mar 10 06:45:19 crc kubenswrapper[4825]: fi Mar 10 06:45:19 crc kubenswrapper[4825]: Mar 10 06:45:19 crc kubenswrapper[4825]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 10 06:45:19 crc kubenswrapper[4825]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 10 06:45:19 crc kubenswrapper[4825]: --disable-webhook \ Mar 10 06:45:19 crc kubenswrapper[4825]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 10 06:45:19 crc kubenswrapper[4825]: --loglevel="${LOGLEVEL}" Mar 10 06:45:19 crc kubenswrapper[4825]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 06:45:19 crc kubenswrapper[4825]: > logger="UnhandledError" Mar 10 06:45:19 crc kubenswrapper[4825]: E0310 06:45:19.467492 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.470229 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 06:45:19 crc kubenswrapper[4825]: W0310 06:45:19.489043 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-58e1a34201009700526e56a2b2e2574210221d39bb94b33f9dffe5cecb5b0036 WatchSource:0}: Error finding container 58e1a34201009700526e56a2b2e2574210221d39bb94b33f9dffe5cecb5b0036: Status 404 returned error can't find the container with id 58e1a34201009700526e56a2b2e2574210221d39bb94b33f9dffe5cecb5b0036 Mar 10 06:45:19 crc kubenswrapper[4825]: E0310 06:45:19.492326 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 06:45:19 crc kubenswrapper[4825]: E0310 06:45:19.493548 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.603351 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2eeb477a677c528c9bd35b5751684446282dce4b642d0c971efe8c1a450ec463"} Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.604991 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"185d2b15c2b90331e4661601bfa3c357a0a2de1f02270828c1b536236ce10706"} Mar 10 06:45:19 crc kubenswrapper[4825]: E0310 06:45:19.607803 4825 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 06:45:19 crc kubenswrapper[4825]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 06:45:19 crc kubenswrapper[4825]: if [[ -f "/env/_master" ]]; then Mar 10 06:45:19 crc kubenswrapper[4825]: set -o allexport Mar 10 06:45:19 crc kubenswrapper[4825]: source "/env/_master" Mar 10 06:45:19 crc kubenswrapper[4825]: set +o allexport Mar 10 06:45:19 crc kubenswrapper[4825]: fi Mar 10 06:45:19 crc kubenswrapper[4825]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 10 06:45:19 crc kubenswrapper[4825]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 10 06:45:19 crc kubenswrapper[4825]: ho_enable="--enable-hybrid-overlay" Mar 10 06:45:19 crc kubenswrapper[4825]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 10 06:45:19 crc kubenswrapper[4825]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 10 06:45:19 crc kubenswrapper[4825]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 10 06:45:19 crc kubenswrapper[4825]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 10 06:45:19 crc kubenswrapper[4825]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 10 06:45:19 crc kubenswrapper[4825]: --webhook-host=127.0.0.1 \ Mar 10 06:45:19 crc kubenswrapper[4825]: --webhook-port=9743 \ Mar 10 06:45:19 crc kubenswrapper[4825]: ${ho_enable} \ Mar 10 06:45:19 crc kubenswrapper[4825]: --enable-interconnect \ Mar 10 06:45:19 crc kubenswrapper[4825]: --disable-approver \ Mar 10 06:45:19 crc kubenswrapper[4825]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 10 06:45:19 crc kubenswrapper[4825]: --wait-for-kubernetes-api=200s \ Mar 10 06:45:19 crc kubenswrapper[4825]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 10 06:45:19 crc kubenswrapper[4825]: --loglevel="${LOGLEVEL}" Mar 10 06:45:19 crc kubenswrapper[4825]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 06:45:19 crc kubenswrapper[4825]: > logger="UnhandledError" Mar 10 06:45:19 crc kubenswrapper[4825]: E0310 06:45:19.608292 4825 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 06:45:19 crc kubenswrapper[4825]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 10 
06:45:19 crc kubenswrapper[4825]: set -o allexport Mar 10 06:45:19 crc kubenswrapper[4825]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 10 06:45:19 crc kubenswrapper[4825]: source /etc/kubernetes/apiserver-url.env Mar 10 06:45:19 crc kubenswrapper[4825]: else Mar 10 06:45:19 crc kubenswrapper[4825]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 10 06:45:19 crc kubenswrapper[4825]: exit 1 Mar 10 06:45:19 crc kubenswrapper[4825]: fi Mar 10 06:45:19 crc kubenswrapper[4825]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 10 06:45:19 crc kubenswrapper[4825]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a247
3a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 06:45:19 crc kubenswrapper[4825]: > logger="UnhandledError" Mar 10 
06:45:19 crc kubenswrapper[4825]: E0310 06:45:19.609425 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.610082 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"58e1a34201009700526e56a2b2e2574210221d39bb94b33f9dffe5cecb5b0036"} Mar 10 06:45:19 crc kubenswrapper[4825]: E0310 06:45:19.610642 4825 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 06:45:19 crc kubenswrapper[4825]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 06:45:19 crc kubenswrapper[4825]: if [[ -f "/env/_master" ]]; then Mar 10 06:45:19 crc kubenswrapper[4825]: set -o allexport Mar 10 06:45:19 crc kubenswrapper[4825]: source "/env/_master" Mar 10 06:45:19 crc kubenswrapper[4825]: set +o allexport Mar 10 06:45:19 crc kubenswrapper[4825]: fi Mar 10 06:45:19 crc kubenswrapper[4825]: Mar 10 06:45:19 crc kubenswrapper[4825]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 10 06:45:19 crc kubenswrapper[4825]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 10 06:45:19 crc kubenswrapper[4825]: --disable-webhook \ Mar 10 06:45:19 crc kubenswrapper[4825]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 10 06:45:19 crc kubenswrapper[4825]: --loglevel="${LOGLEVEL}" Mar 10 06:45:19 crc kubenswrapper[4825]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 06:45:19 crc kubenswrapper[4825]: > logger="UnhandledError" Mar 10 06:45:19 crc kubenswrapper[4825]: E0310 06:45:19.611861 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services 
have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 06:45:19 crc kubenswrapper[4825]: E0310 06:45:19.612007 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 10 06:45:19 crc kubenswrapper[4825]: E0310 06:45:19.613075 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.618850 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.634973 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8db1f9b1-6042-44a1-97b8-c0e9269f8ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8b02103b8486962e24ee333889dc83acc4379b367dc12c20ca11602879239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.650719 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.668548 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.680040 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.690815 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.702310 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.711093 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.711227 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") 
pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.711310 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.711338 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:45:19 crc kubenswrapper[4825]: E0310 06:45:19.711415 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:45:20.711335822 +0000 UTC m=+73.741116467 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:45:19 crc kubenswrapper[4825]: E0310 06:45:19.711462 4825 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.711490 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:45:19 crc kubenswrapper[4825]: E0310 06:45:19.711583 4825 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 06:45:19 crc kubenswrapper[4825]: E0310 06:45:19.711649 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 06:45:19 crc kubenswrapper[4825]: E0310 06:45:19.711700 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 06:45:19 crc kubenswrapper[4825]: E0310 06:45:19.711724 4825 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr 
for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 06:45:19 crc kubenswrapper[4825]: E0310 06:45:19.711676 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 06:45:20.711658501 +0000 UTC m=+73.741439146 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 06:45:19 crc kubenswrapper[4825]: E0310 06:45:19.711857 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 06:45:20.711824145 +0000 UTC m=+73.741604980 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 06:45:19 crc kubenswrapper[4825]: E0310 06:45:19.711808 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 06:45:19 crc kubenswrapper[4825]: E0310 06:45:19.711894 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 06:45:19 crc kubenswrapper[4825]: E0310 06:45:19.711909 4825 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 06:45:19 crc kubenswrapper[4825]: E0310 06:45:19.711961 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 06:45:20.711937398 +0000 UTC m=+73.741718283 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 06:45:19 crc kubenswrapper[4825]: E0310 06:45:19.712425 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 06:45:20.712402209 +0000 UTC m=+73.742182834 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.713928 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.725888 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.737640 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8db1f9b1-6042-44a1-97b8-c0e9269f8ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8b02103b8486962e24ee333889dc83acc4379b367dc12c20ca11602879239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.754425 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.767788 4825 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.779317 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:19 crc kubenswrapper[4825]: I0310 06:45:19.791663 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.236413 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.236475 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:45:20 crc kubenswrapper[4825]: E0310 06:45:20.236648 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:45:20 crc kubenswrapper[4825]: E0310 06:45:20.236939 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.311771 4825 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.313872 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.313939 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.313967 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.314072 4825 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.326715 4825 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.326901 4825 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.328564 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.328626 4825 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.328641 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.328662 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.328674 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:20Z","lastTransitionTime":"2026-03-10T06:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:20 crc kubenswrapper[4825]: E0310 06:45:20.354195 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:20Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bc7fa84-cf6d-4b6d-8f9a-118c306e0760\\\",\\\"systemUUID\\\":\\\"4821b499-aedc-4e42-be1d-4a415e5b2f81\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.359120 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.359354 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.359444 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.359568 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 
06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.359669 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:20Z","lastTransitionTime":"2026-03-10T06:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:20 crc kubenswrapper[4825]: E0310 06:45:20.372001 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bc7fa84-cf6d-4b6d-8f9a-118c306e0760\\\",\\\"systemUUID\\\":\\\"4821b499-aedc-4e42-be1d-4a415e5b2f81\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.375918 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.375966 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.375983 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.376004 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.376020 4825 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:20Z","lastTransitionTime":"2026-03-10T06:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:20 crc kubenswrapper[4825]: E0310 06:45:20.396926 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bc7fa84-cf6d-4b6d-8f9a-118c306e0760\\\",\\\"systemUUID\\\":\\\"4821b499-aedc-4e42-be1d-4a415e5b2f81\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.401633 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.401682 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.401693 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.401714 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.401727 4825 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:20Z","lastTransitionTime":"2026-03-10T06:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:20 crc kubenswrapper[4825]: E0310 06:45:20.411760 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bc7fa84-cf6d-4b6d-8f9a-118c306e0760\\\",\\\"systemUUID\\\":\\\"4821b499-aedc-4e42-be1d-4a415e5b2f81\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.416553 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.416587 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.416597 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.416614 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.416626 4825 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:20Z","lastTransitionTime":"2026-03-10T06:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:20 crc kubenswrapper[4825]: E0310 06:45:20.429227 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bc7fa84-cf6d-4b6d-8f9a-118c306e0760\\\",\\\"systemUUID\\\":\\\"4821b499-aedc-4e42-be1d-4a415e5b2f81\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:20 crc kubenswrapper[4825]: E0310 06:45:20.429520 4825 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.431899 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.432000 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.432077 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.432165 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.432232 4825 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:20Z","lastTransitionTime":"2026-03-10T06:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.534841 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.535399 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.535626 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.535810 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.535968 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:20Z","lastTransitionTime":"2026-03-10T06:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.638880 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.638960 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.639004 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.639035 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.639057 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:20Z","lastTransitionTime":"2026-03-10T06:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.722288 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.722510 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:45:20 crc kubenswrapper[4825]: E0310 06:45:20.722538 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:45:22.722499255 +0000 UTC m=+75.752280030 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:45:20 crc kubenswrapper[4825]: E0310 06:45:20.722710 4825 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 06:45:20 crc kubenswrapper[4825]: E0310 06:45:20.722815 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 06:45:22.722789762 +0000 UTC m=+75.752570417 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.723609 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.723698 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.723755 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:45:20 crc kubenswrapper[4825]: E0310 06:45:20.723824 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 06:45:20 crc kubenswrapper[4825]: E0310 06:45:20.723861 4825 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 06:45:20 crc kubenswrapper[4825]: E0310 06:45:20.723881 4825 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 06:45:20 crc kubenswrapper[4825]: E0310 06:45:20.723885 4825 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 06:45:20 crc kubenswrapper[4825]: E0310 06:45:20.723932 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 06:45:20 crc kubenswrapper[4825]: E0310 06:45:20.723949 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 06:45:22.7239329 +0000 UTC m=+75.753713545 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 06:45:20 crc kubenswrapper[4825]: E0310 06:45:20.723956 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 06:45:20 crc kubenswrapper[4825]: E0310 06:45:20.723980 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 06:45:22.723966311 +0000 UTC m=+75.753746966 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 06:45:20 crc kubenswrapper[4825]: E0310 06:45:20.723980 4825 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 06:45:20 crc kubenswrapper[4825]: E0310 06:45:20.724031 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 06:45:22.724019793 +0000 UTC m=+75.753800438 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.741921 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.741957 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.741974 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.742000 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.742020 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:20Z","lastTransitionTime":"2026-03-10T06:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.845288 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.845350 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.845368 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.845393 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.845412 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:20Z","lastTransitionTime":"2026-03-10T06:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.947519 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.947588 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.947610 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.947638 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:20 crc kubenswrapper[4825]: I0310 06:45:20.947656 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:20Z","lastTransitionTime":"2026-03-10T06:45:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.050446 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.050519 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.050537 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.050562 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.050582 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:21Z","lastTransitionTime":"2026-03-10T06:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.153599 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.153676 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.153694 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.153725 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.153749 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:21Z","lastTransitionTime":"2026-03-10T06:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.236000 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:45:21 crc kubenswrapper[4825]: E0310 06:45:21.236282 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.256517 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.256566 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.256584 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.256609 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.256629 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:21Z","lastTransitionTime":"2026-03-10T06:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.359447 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.359502 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.359526 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.359553 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.359571 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:21Z","lastTransitionTime":"2026-03-10T06:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.462774 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.462840 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.462869 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.462885 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.462896 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:21Z","lastTransitionTime":"2026-03-10T06:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.565303 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.565365 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.565379 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.565401 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.565415 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:21Z","lastTransitionTime":"2026-03-10T06:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.668706 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.668768 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.668782 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.668801 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.668814 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:21Z","lastTransitionTime":"2026-03-10T06:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.772160 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.772468 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.772529 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.772598 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.772668 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:21Z","lastTransitionTime":"2026-03-10T06:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.876461 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.876504 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.876513 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.876528 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.876536 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:21Z","lastTransitionTime":"2026-03-10T06:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.979322 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.979386 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.979401 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.979424 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:21 crc kubenswrapper[4825]: I0310 06:45:21.979438 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:21Z","lastTransitionTime":"2026-03-10T06:45:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.081821 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.081920 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.081931 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.081946 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.081959 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:22Z","lastTransitionTime":"2026-03-10T06:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.184332 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.184401 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.184411 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.184429 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.184442 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:22Z","lastTransitionTime":"2026-03-10T06:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.236315 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.236491 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:45:22 crc kubenswrapper[4825]: E0310 06:45:22.236665 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:45:22 crc kubenswrapper[4825]: E0310 06:45:22.236887 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.287741 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.287803 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.287821 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.287850 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.287870 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:22Z","lastTransitionTime":"2026-03-10T06:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.392433 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.392492 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.392506 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.392525 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.392542 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:22Z","lastTransitionTime":"2026-03-10T06:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.499358 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.499420 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.499436 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.499459 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.499475 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:22Z","lastTransitionTime":"2026-03-10T06:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.602171 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.602495 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.602692 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.602932 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.603123 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:22Z","lastTransitionTime":"2026-03-10T06:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.706929 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.707829 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.708040 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.708230 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.708415 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:22Z","lastTransitionTime":"2026-03-10T06:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.742559 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.742669 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.742716 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.742756 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.742820 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:45:22 crc kubenswrapper[4825]: E0310 06:45:22.742993 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 06:45:22 crc kubenswrapper[4825]: E0310 06:45:22.743022 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 06:45:22 crc kubenswrapper[4825]: E0310 06:45:22.743041 4825 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 06:45:22 crc kubenswrapper[4825]: E0310 06:45:22.743110 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 06:45:26.743083787 +0000 UTC m=+79.772864432 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 06:45:22 crc kubenswrapper[4825]: E0310 06:45:22.743689 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:45:26.743667082 +0000 UTC m=+79.773447737 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:45:22 crc kubenswrapper[4825]: E0310 06:45:22.743813 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 06:45:22 crc kubenswrapper[4825]: E0310 06:45:22.743832 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 06:45:22 crc kubenswrapper[4825]: E0310 06:45:22.743850 4825 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 06:45:22 crc kubenswrapper[4825]: E0310 06:45:22.743895 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 06:45:26.743881097 +0000 UTC m=+79.773661742 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 06:45:22 crc kubenswrapper[4825]: E0310 06:45:22.743986 4825 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 06:45:22 crc kubenswrapper[4825]: E0310 06:45:22.744039 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 06:45:26.744014451 +0000 UTC m=+79.773795096 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 06:45:22 crc kubenswrapper[4825]: E0310 06:45:22.744111 4825 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 06:45:22 crc kubenswrapper[4825]: E0310 06:45:22.744212 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 06:45:26.744197045 +0000 UTC m=+79.773977690 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.812026 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.812078 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.812094 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.812118 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 
10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.812163 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:22Z","lastTransitionTime":"2026-03-10T06:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.916948 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.917035 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.917052 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.917077 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:22 crc kubenswrapper[4825]: I0310 06:45:22.917094 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:22Z","lastTransitionTime":"2026-03-10T06:45:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.020912 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.020985 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.021007 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.021035 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.021054 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:23Z","lastTransitionTime":"2026-03-10T06:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.125083 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.125197 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.125224 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.125256 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.125280 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:23Z","lastTransitionTime":"2026-03-10T06:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.229301 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.229387 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.229408 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.229440 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.229462 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:23Z","lastTransitionTime":"2026-03-10T06:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.235801 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:45:23 crc kubenswrapper[4825]: E0310 06:45:23.236214 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.332780 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.332873 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.332893 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.332918 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.332936 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:23Z","lastTransitionTime":"2026-03-10T06:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.436290 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.436373 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.436396 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.436428 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.436453 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:23Z","lastTransitionTime":"2026-03-10T06:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.539726 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.539829 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.539850 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.539886 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.539907 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:23Z","lastTransitionTime":"2026-03-10T06:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.641918 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.641988 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.642008 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.642036 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.642055 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:23Z","lastTransitionTime":"2026-03-10T06:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.745368 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.745439 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.745458 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.745509 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.745531 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:23Z","lastTransitionTime":"2026-03-10T06:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.848749 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.848809 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.848828 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.848857 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.848876 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:23Z","lastTransitionTime":"2026-03-10T06:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.956376 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.956461 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.956519 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.956560 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:23 crc kubenswrapper[4825]: I0310 06:45:23.956612 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:23Z","lastTransitionTime":"2026-03-10T06:45:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.034690 4825 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.059797 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.060192 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.060308 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.060457 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.060564 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:24Z","lastTransitionTime":"2026-03-10T06:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.163687 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.163804 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.163829 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.163865 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.163890 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:24Z","lastTransitionTime":"2026-03-10T06:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.235460 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.235550 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:45:24 crc kubenswrapper[4825]: E0310 06:45:24.235715 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:45:24 crc kubenswrapper[4825]: E0310 06:45:24.236593 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.267590 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.267670 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.267695 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.267728 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.267752 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:24Z","lastTransitionTime":"2026-03-10T06:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.370703 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.370745 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.370755 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.370769 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.370779 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:24Z","lastTransitionTime":"2026-03-10T06:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.473615 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.473670 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.473684 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.473705 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.473719 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:24Z","lastTransitionTime":"2026-03-10T06:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.576431 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.576506 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.576522 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.576546 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.576567 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:24Z","lastTransitionTime":"2026-03-10T06:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.679045 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.679116 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.679159 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.679189 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.679208 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:24Z","lastTransitionTime":"2026-03-10T06:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.782906 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.782995 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.783019 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.783052 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.783077 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:24Z","lastTransitionTime":"2026-03-10T06:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.886920 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.887000 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.887018 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.887047 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.887069 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:24Z","lastTransitionTime":"2026-03-10T06:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.990727 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.990817 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.990833 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.990857 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:24 crc kubenswrapper[4825]: I0310 06:45:24.990879 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:24Z","lastTransitionTime":"2026-03-10T06:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:25 crc kubenswrapper[4825]: I0310 06:45:25.093558 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:25 crc kubenswrapper[4825]: I0310 06:45:25.093627 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:25 crc kubenswrapper[4825]: I0310 06:45:25.093666 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:25 crc kubenswrapper[4825]: I0310 06:45:25.093708 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:25 crc kubenswrapper[4825]: I0310 06:45:25.093733 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:25Z","lastTransitionTime":"2026-03-10T06:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:25 crc kubenswrapper[4825]: I0310 06:45:25.198180 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:25 crc kubenswrapper[4825]: I0310 06:45:25.198280 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:25 crc kubenswrapper[4825]: I0310 06:45:25.198294 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:25 crc kubenswrapper[4825]: I0310 06:45:25.198315 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:25 crc kubenswrapper[4825]: I0310 06:45:25.198329 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:25Z","lastTransitionTime":"2026-03-10T06:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:25 crc kubenswrapper[4825]: I0310 06:45:25.236170 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:45:25 crc kubenswrapper[4825]: E0310 06:45:25.236385 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 10 06:45:25 crc kubenswrapper[4825]: I0310 06:45:25.301429 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 06:45:25 crc kubenswrapper[4825]: I0310 06:45:25.301516 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 06:45:25 crc kubenswrapper[4825]: I0310 06:45:25.301535 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 06:45:25 crc kubenswrapper[4825]: I0310 06:45:25.301595 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 06:45:25 crc kubenswrapper[4825]: I0310 06:45:25.301610 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:25Z","lastTransitionTime":"2026-03-10T06:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 06:45:25 crc kubenswrapper[4825]: I0310 06:45:25.405007 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 06:45:25 crc kubenswrapper[4825]: I0310 06:45:25.405073 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 06:45:25 crc kubenswrapper[4825]: I0310 06:45:25.405094 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 06:45:25 crc kubenswrapper[4825]: I0310 06:45:25.405119 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 06:45:25 crc kubenswrapper[4825]: I0310 06:45:25.405157 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:25Z","lastTransitionTime":"2026-03-10T06:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 06:45:25 crc kubenswrapper[4825]: I0310 06:45:25.508331 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 06:45:25 crc kubenswrapper[4825]: I0310 06:45:25.508406 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 06:45:25 crc kubenswrapper[4825]: I0310 06:45:25.508424 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 06:45:25 crc kubenswrapper[4825]: I0310 06:45:25.508453 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 06:45:25 crc kubenswrapper[4825]: I0310 06:45:25.508476 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:25Z","lastTransitionTime":"2026-03-10T06:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 06:45:25 crc kubenswrapper[4825]: I0310 06:45:25.611983 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 06:45:25 crc kubenswrapper[4825]: I0310 06:45:25.612561 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 06:45:25 crc kubenswrapper[4825]: I0310 06:45:25.612764 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 06:45:25 crc kubenswrapper[4825]: I0310 06:45:25.612908 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 06:45:25 crc kubenswrapper[4825]: I0310 06:45:25.613032 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:25Z","lastTransitionTime":"2026-03-10T06:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 06:45:25 crc kubenswrapper[4825]: I0310 06:45:25.716782 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 06:45:25 crc kubenswrapper[4825]: I0310 06:45:25.716837 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 06:45:25 crc kubenswrapper[4825]: I0310 06:45:25.716854 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 06:45:25 crc kubenswrapper[4825]: I0310 06:45:25.716879 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 06:45:25 crc kubenswrapper[4825]: I0310 06:45:25.716897 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:25Z","lastTransitionTime":"2026-03-10T06:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 06:45:25 crc kubenswrapper[4825]: I0310 06:45:25.820421 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 06:45:25 crc kubenswrapper[4825]: I0310 06:45:25.820487 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 06:45:25 crc kubenswrapper[4825]: I0310 06:45:25.820505 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 06:45:25 crc kubenswrapper[4825]: I0310 06:45:25.820531 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 06:45:25 crc kubenswrapper[4825]: I0310 06:45:25.820549 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:25Z","lastTransitionTime":"2026-03-10T06:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 06:45:25 crc kubenswrapper[4825]: I0310 06:45:25.923983 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 06:45:25 crc kubenswrapper[4825]: I0310 06:45:25.924437 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 06:45:25 crc kubenswrapper[4825]: I0310 06:45:25.924602 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 06:45:25 crc kubenswrapper[4825]: I0310 06:45:25.924852 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 06:45:25 crc kubenswrapper[4825]: I0310 06:45:25.925006 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:25Z","lastTransitionTime":"2026-03-10T06:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.028685 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.028788 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.028805 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.028831 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.028851 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:26Z","lastTransitionTime":"2026-03-10T06:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.132298 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.132371 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.132390 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.132416 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.132437 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:26Z","lastTransitionTime":"2026-03-10T06:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.235612 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.235699 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.235766 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.235808 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.235825 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 06:45:26 crc kubenswrapper[4825]: E0310 06:45:26.235828 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.235851 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.235887 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:26Z","lastTransitionTime":"2026-03-10T06:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 06:45:26 crc kubenswrapper[4825]: E0310 06:45:26.236115 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.339287 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.339362 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.339383 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.339410 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.339428 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:26Z","lastTransitionTime":"2026-03-10T06:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.442729 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.442842 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.442868 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.442894 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.442907 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:26Z","lastTransitionTime":"2026-03-10T06:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.546728 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.546813 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.546835 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.546866 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.546886 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:26Z","lastTransitionTime":"2026-03-10T06:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.649709 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.649823 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.649850 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.649887 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.649912 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:26Z","lastTransitionTime":"2026-03-10T06:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.754384 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.754483 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.754510 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.754543 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.754567 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:26Z","lastTransitionTime":"2026-03-10T06:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.788403 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.788533 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.788579 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.788620 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.788665 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 06:45:26 crc kubenswrapper[4825]: E0310 06:45:26.788746 4825 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 10 06:45:26 crc kubenswrapper[4825]: E0310 06:45:26.788862 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 10 06:45:26 crc kubenswrapper[4825]: E0310 06:45:26.788871 4825 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 10 06:45:26 crc kubenswrapper[4825]: E0310 06:45:26.788895 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 10 06:45:26 crc kubenswrapper[4825]: E0310 06:45:26.788921 4825 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 10 06:45:26 crc kubenswrapper[4825]: E0310 06:45:26.788925 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 10 06:45:26 crc kubenswrapper[4825]: E0310 06:45:26.789001 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 10 06:45:26 crc kubenswrapper[4825]: E0310 06:45:26.789025 4825 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 10 06:45:26 crc kubenswrapper[4825]: E0310 06:45:26.788863 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 06:45:34.788829166 +0000 UTC m=+87.818609781 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 10 06:45:26 crc kubenswrapper[4825]: E0310 06:45:26.789416 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 06:45:34.789402781 +0000 UTC m=+87.819183606 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 10 06:45:26 crc kubenswrapper[4825]: E0310 06:45:26.789442 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 06:45:34.789429751 +0000 UTC m=+87.819210366 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 10 06:45:26 crc kubenswrapper[4825]: E0310 06:45:26.789494 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:45:34.789482733 +0000 UTC m=+87.819263358 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 06:45:26 crc kubenswrapper[4825]: E0310 06:45:26.789515 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 06:45:34.789503663 +0000 UTC m=+87.819284528 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.857995 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.858044 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.858057 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.858079 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.858092 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:26Z","lastTransitionTime":"2026-03-10T06:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.962094 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.962226 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.963080 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.963126 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 06:45:26 crc kubenswrapper[4825]: I0310 06:45:26.963169 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:26Z","lastTransitionTime":"2026-03-10T06:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.066977 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.067418 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.067511 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.067619 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.067722 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:27Z","lastTransitionTime":"2026-03-10T06:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.170745 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.171098 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.171222 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.171313 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.171407 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:27Z","lastTransitionTime":"2026-03-10T06:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.236354 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 06:45:27 crc kubenswrapper[4825]: E0310 06:45:27.236566 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.274614 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.274663 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.274676 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.274698 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.274712 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:27Z","lastTransitionTime":"2026-03-10T06:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.378384 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.378669 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.378703 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.378735 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.378760 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:27Z","lastTransitionTime":"2026-03-10T06:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.482679 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.482740 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.482759 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.482799 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.482832 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:27Z","lastTransitionTime":"2026-03-10T06:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.586832 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.586922 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.586958 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.586994 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.587016 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:27Z","lastTransitionTime":"2026-03-10T06:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.690304 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.690356 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.690371 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.690393 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.690409 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:27Z","lastTransitionTime":"2026-03-10T06:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.793699 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.793768 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.793787 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.793815 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.793837 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:27Z","lastTransitionTime":"2026-03-10T06:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.896840 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.896912 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.896936 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.896968 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.896993 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:27Z","lastTransitionTime":"2026-03-10T06:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.999647 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.999725 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.999748 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.999782 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:27 crc kubenswrapper[4825]: I0310 06:45:27.999806 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:27Z","lastTransitionTime":"2026-03-10T06:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.103556 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.103634 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.103656 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.103688 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.103710 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:28Z","lastTransitionTime":"2026-03-10T06:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.206744 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.206812 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.206829 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.206871 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.206891 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:28Z","lastTransitionTime":"2026-03-10T06:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.236056 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.236168 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:45:28 crc kubenswrapper[4825]: E0310 06:45:28.236383 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:45:28 crc kubenswrapper[4825]: E0310 06:45:28.236532 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.309722 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.309772 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.309784 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.309801 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.309812 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:28Z","lastTransitionTime":"2026-03-10T06:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.413853 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.413984 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.414050 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.414085 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.414185 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:28Z","lastTransitionTime":"2026-03-10T06:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.518314 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.518391 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.518411 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.518442 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.518465 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:28Z","lastTransitionTime":"2026-03-10T06:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.622077 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.622223 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.622246 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.622319 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.622348 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:28Z","lastTransitionTime":"2026-03-10T06:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.726847 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.726930 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.726954 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.727003 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.727024 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:28Z","lastTransitionTime":"2026-03-10T06:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.829860 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.829951 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.829973 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.830001 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.830022 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:28Z","lastTransitionTime":"2026-03-10T06:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.933173 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.933251 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.933267 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.933287 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:28 crc kubenswrapper[4825]: I0310 06:45:28.933298 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:28Z","lastTransitionTime":"2026-03-10T06:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.036056 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.036118 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.036171 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.036198 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.036217 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:29Z","lastTransitionTime":"2026-03-10T06:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.139864 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.139978 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.140004 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.140048 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.140073 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:29Z","lastTransitionTime":"2026-03-10T06:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.235969 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:45:29 crc kubenswrapper[4825]: E0310 06:45:29.236488 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.243520 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.243593 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.243613 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.243638 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.243659 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:29Z","lastTransitionTime":"2026-03-10T06:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.253702 4825 scope.go:117] "RemoveContainer" containerID="414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d" Mar 10 06:45:29 crc kubenswrapper[4825]: E0310 06:45:29.254242 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.254909 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.256680 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.274867 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.291824 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.306467 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.321587 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8db1f9b1-6042-44a1-97b8-c0e9269f8ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8b02103b8486962e24ee333889dc83acc4379b367dc12c20ca11602879239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.338477 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.345951 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.345993 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.346009 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.346031 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.346047 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:29Z","lastTransitionTime":"2026-03-10T06:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.354356 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.449357 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.449434 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.449458 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.449491 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.449512 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:29Z","lastTransitionTime":"2026-03-10T06:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.552617 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.552665 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.552680 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.552698 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.552713 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:29Z","lastTransitionTime":"2026-03-10T06:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.642391 4825 scope.go:117] "RemoveContainer" containerID="414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d" Mar 10 06:45:29 crc kubenswrapper[4825]: E0310 06:45:29.642729 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.655986 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.656071 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.656097 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.656166 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.656197 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:29Z","lastTransitionTime":"2026-03-10T06:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.760028 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.760120 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.760180 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.760221 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.760250 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:29Z","lastTransitionTime":"2026-03-10T06:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.863732 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.863808 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.863827 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.863861 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.863880 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:29Z","lastTransitionTime":"2026-03-10T06:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.966843 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.966929 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.966950 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.966982 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:29 crc kubenswrapper[4825]: I0310 06:45:29.967013 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:29Z","lastTransitionTime":"2026-03-10T06:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.070471 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.070541 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.070560 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.070589 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.070607 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:30Z","lastTransitionTime":"2026-03-10T06:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.174843 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.174929 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.174947 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.174973 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.174991 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:30Z","lastTransitionTime":"2026-03-10T06:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.236359 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.236359 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:45:30 crc kubenswrapper[4825]: E0310 06:45:30.236587 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:45:30 crc kubenswrapper[4825]: E0310 06:45:30.236697 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.283664 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.283760 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.283789 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.283827 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.283865 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:30Z","lastTransitionTime":"2026-03-10T06:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.387470 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.387518 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.387528 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.387545 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.387556 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:30Z","lastTransitionTime":"2026-03-10T06:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.490949 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.491032 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.491054 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.491079 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.491103 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:30Z","lastTransitionTime":"2026-03-10T06:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.594914 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.595013 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.595039 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.595069 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.595099 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:30Z","lastTransitionTime":"2026-03-10T06:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.645098 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.645196 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.645217 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.645248 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.645270 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:30Z","lastTransitionTime":"2026-03-10T06:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:30 crc kubenswrapper[4825]: E0310 06:45:30.660024 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bc7fa84-cf6d-4b6d-8f9a-118c306e0760\\\",\\\"systemUUID\\\":\\\"4821b499-aedc-4e42-be1d-4a415e5b2f81\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.667102 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.667188 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.667207 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.667254 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.667273 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:30Z","lastTransitionTime":"2026-03-10T06:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:30 crc kubenswrapper[4825]: E0310 06:45:30.683734 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bc7fa84-cf6d-4b6d-8f9a-118c306e0760\\\",\\\"systemUUID\\\":\\\"4821b499-aedc-4e42-be1d-4a415e5b2f81\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.688973 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.689015 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.689026 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.689044 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.689059 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:30Z","lastTransitionTime":"2026-03-10T06:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:30 crc kubenswrapper[4825]: E0310 06:45:30.701507 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bc7fa84-cf6d-4b6d-8f9a-118c306e0760\\\",\\\"systemUUID\\\":\\\"4821b499-aedc-4e42-be1d-4a415e5b2f81\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.707683 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.707773 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.707799 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.707835 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.707867 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:30Z","lastTransitionTime":"2026-03-10T06:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:30 crc kubenswrapper[4825]: E0310 06:45:30.725919 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bc7fa84-cf6d-4b6d-8f9a-118c306e0760\\\",\\\"systemUUID\\\":\\\"4821b499-aedc-4e42-be1d-4a415e5b2f81\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.732077 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.732186 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.732212 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.732244 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.732270 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:30Z","lastTransitionTime":"2026-03-10T06:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:30 crc kubenswrapper[4825]: E0310 06:45:30.749972 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bc7fa84-cf6d-4b6d-8f9a-118c306e0760\\\",\\\"systemUUID\\\":\\\"4821b499-aedc-4e42-be1d-4a415e5b2f81\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:30 crc kubenswrapper[4825]: E0310 06:45:30.750378 4825 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.753925 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.753982 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.753997 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.754018 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.754034 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:30Z","lastTransitionTime":"2026-03-10T06:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.857850 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.857886 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.857915 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.857930 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.857943 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:30Z","lastTransitionTime":"2026-03-10T06:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.961656 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.961756 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.961781 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.961818 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:30 crc kubenswrapper[4825]: I0310 06:45:30.961843 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:30Z","lastTransitionTime":"2026-03-10T06:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.064832 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.064961 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.064981 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.065012 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.065035 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:31Z","lastTransitionTime":"2026-03-10T06:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.169420 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.169515 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.169541 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.169577 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.169608 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:31Z","lastTransitionTime":"2026-03-10T06:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.236127 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:45:31 crc kubenswrapper[4825]: E0310 06:45:31.236733 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.274463 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.274837 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.275038 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.275270 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.275412 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:31Z","lastTransitionTime":"2026-03-10T06:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.380992 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.381069 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.381089 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.381122 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.381184 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:31Z","lastTransitionTime":"2026-03-10T06:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.485243 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.485314 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.485334 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.485363 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.485387 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:31Z","lastTransitionTime":"2026-03-10T06:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.588340 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.588382 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.588414 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.588429 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.588438 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:31Z","lastTransitionTime":"2026-03-10T06:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.651830 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2c0cbc1e1c16536c1e26b0c607f0966b3749a7357e4cd130abd24d53903daacb"} Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.651935 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"b86d021111582d0c9ef202b910fe8c744a3cb1a80dd163dd41125a47decec8e3"} Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.670848 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c0cbc1e1c16536c1e26b0c607f0966b3749a7357e4cd130abd24d53903daacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"
lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b86d021111582d0c9ef202b910fe8c744a3cb1a80dd163dd41125a47decec8e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.683234 4825 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.694741 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.695034 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.695286 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.695024 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.695528 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.695709 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:31Z","lastTransitionTime":"2026-03-10T06:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.709881 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8db1f9b1-6042-44a1-97b8-c0e9269f8ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8b02103b8486962e24ee333889dc83acc4379b367dc12c20ca11602879239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.727685 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.744572 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.760991 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.783782 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"468bfefa-2636-4ab4-b62d-2c5c738eb872\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:45:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 06:45:15.848873 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 06:45:15.849102 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 06:45:15.850424 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2193444300/tls.crt::/tmp/serving-cert-2193444300/tls.key\\\\\\\"\\\\nI0310 06:45:16.151765 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 06:45:16.162919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 06:45:16.162961 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 06:45:16.163018 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 06:45:16.163029 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 06:45:16.172887 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 06:45:16.172948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172959 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 06:45:16.172977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 06:45:16.172983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 06:45:16.172989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 06:45:16.172898 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 06:45:16.175833 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.800446 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.800541 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.800566 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.800600 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.800634 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:31Z","lastTransitionTime":"2026-03-10T06:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.904422 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.904920 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.905112 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.905397 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:31 crc kubenswrapper[4825]: I0310 06:45:31.905601 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:31Z","lastTransitionTime":"2026-03-10T06:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.008972 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.009025 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.009044 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.009080 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.009101 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:32Z","lastTransitionTime":"2026-03-10T06:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.112818 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.112875 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.112894 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.112940 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.112961 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:32Z","lastTransitionTime":"2026-03-10T06:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.216272 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.216320 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.216339 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.216367 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.216386 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:32Z","lastTransitionTime":"2026-03-10T06:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.239644 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.239850 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:45:32 crc kubenswrapper[4825]: E0310 06:45:32.240044 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:45:32 crc kubenswrapper[4825]: E0310 06:45:32.240303 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.319952 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.320030 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.320050 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.320082 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.320103 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:32Z","lastTransitionTime":"2026-03-10T06:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.423222 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.423300 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.423324 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.423362 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.423385 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:32Z","lastTransitionTime":"2026-03-10T06:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.527355 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.527422 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.527446 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.527472 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.527494 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:32Z","lastTransitionTime":"2026-03-10T06:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.630411 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.630481 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.630501 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.630531 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.630552 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:32Z","lastTransitionTime":"2026-03-10T06:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.734491 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.734566 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.734586 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.734613 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.734634 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:32Z","lastTransitionTime":"2026-03-10T06:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.837594 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.837691 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.837719 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.837756 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.837780 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:32Z","lastTransitionTime":"2026-03-10T06:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.941040 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.941125 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.941180 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.941215 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:32 crc kubenswrapper[4825]: I0310 06:45:32.941235 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:32Z","lastTransitionTime":"2026-03-10T06:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.045280 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.045348 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.045367 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.045400 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.045421 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:33Z","lastTransitionTime":"2026-03-10T06:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.148969 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.149049 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.149068 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.149098 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.149117 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:33Z","lastTransitionTime":"2026-03-10T06:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.236421 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:45:33 crc kubenswrapper[4825]: E0310 06:45:33.236864 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.252356 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.252421 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.252439 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.252468 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.252487 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:33Z","lastTransitionTime":"2026-03-10T06:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.354533 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.354600 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.354618 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.354646 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.354665 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:33Z","lastTransitionTime":"2026-03-10T06:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.457404 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.457473 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.457494 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.457526 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.457546 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:33Z","lastTransitionTime":"2026-03-10T06:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.559980 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.560014 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.560024 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.560042 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.560053 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:33Z","lastTransitionTime":"2026-03-10T06:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.662330 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.662365 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.662376 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.662394 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.662404 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:33Z","lastTransitionTime":"2026-03-10T06:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.662874 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"281b80a6aa275f1b641833c7bdc845f2def19b224d66671705d96c08b6eed3b2"} Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.688410 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"468bfefa-2636-4ab4-b62d-2c5c738eb872\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:45:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 06:45:15.848873 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 06:45:15.849102 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 06:45:15.850424 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2193444300/tls.crt::/tmp/serving-cert-2193444300/tls.key\\\\\\\"\\\\nI0310 06:45:16.151765 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 06:45:16.162919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 06:45:16.162961 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 06:45:16.163018 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 06:45:16.163029 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 06:45:16.172887 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 06:45:16.172948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172959 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 06:45:16.172977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 06:45:16.172983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 06:45:16.172989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 06:45:16.172898 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 06:45:16.175833 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:33Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.705879 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:33Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.728991 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:33Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.750896 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b80a6aa275f1b641833c7bdc845f2def19b224d66671705d96c08b6eed3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:33Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.766275 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.766325 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.766337 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.766357 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.766372 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:33Z","lastTransitionTime":"2026-03-10T06:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.770555 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c0cbc1e1c16536c1e26b0c607f0966b3749a7357e4cd130abd24d53903daacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b86d021111582d0c9ef202b910fe8c744a3cb1a80dd163dd41125a47decec8e3\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:33Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.792008 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:33Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.809793 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:33Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.826059 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8db1f9b1-6042-44a1-97b8-c0e9269f8ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8b02103b8486962e24ee333889dc83acc4379b367dc12c20ca11602879239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:33Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.869822 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.869893 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.869914 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.869942 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.869963 4825 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:33Z","lastTransitionTime":"2026-03-10T06:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.973733 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.973810 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.973827 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.973860 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:33 crc kubenswrapper[4825]: I0310 06:45:33.973885 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:33Z","lastTransitionTime":"2026-03-10T06:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.076935 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.077001 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.077019 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.077045 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.077065 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:34Z","lastTransitionTime":"2026-03-10T06:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.179652 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.179707 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.179719 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.179738 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.179752 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:34Z","lastTransitionTime":"2026-03-10T06:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.235891 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.235944 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:45:34 crc kubenswrapper[4825]: E0310 06:45:34.236171 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:45:34 crc kubenswrapper[4825]: E0310 06:45:34.236310 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.282757 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.282817 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.282834 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.282858 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.282894 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:34Z","lastTransitionTime":"2026-03-10T06:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.392500 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.392566 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.392582 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.392608 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.392622 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:34Z","lastTransitionTime":"2026-03-10T06:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.495340 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.495707 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.495773 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.495840 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.495907 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:34Z","lastTransitionTime":"2026-03-10T06:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.599192 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.599252 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.599263 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.599281 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.599296 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:34Z","lastTransitionTime":"2026-03-10T06:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.702569 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.702625 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.702635 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.702653 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.702667 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:34Z","lastTransitionTime":"2026-03-10T06:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.806046 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.806588 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.806740 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.806906 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.807049 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:34Z","lastTransitionTime":"2026-03-10T06:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.868433 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.868562 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.868603 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.868632 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.868663 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:45:34 crc kubenswrapper[4825]: E0310 06:45:34.868846 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 06:45:34 crc kubenswrapper[4825]: E0310 06:45:34.868854 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 06:45:34 crc kubenswrapper[4825]: E0310 06:45:34.868898 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 06:45:34 crc kubenswrapper[4825]: E0310 06:45:34.868903 4825 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 06:45:34 crc kubenswrapper[4825]: E0310 06:45:34.868923 4825 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 06:45:34 crc kubenswrapper[4825]: E0310 06:45:34.869050 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 06:45:50.86901294 +0000 UTC m=+103.898793555 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 06:45:34 crc kubenswrapper[4825]: E0310 06:45:34.869125 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 06:45:50.869104813 +0000 UTC m=+103.898885608 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 06:45:34 crc kubenswrapper[4825]: E0310 06:45:34.868873 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 06:45:34 crc kubenswrapper[4825]: E0310 06:45:34.869178 4825 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 06:45:34 crc kubenswrapper[4825]: E0310 06:45:34.869212 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 
nodeName:}" failed. No retries permitted until 2026-03-10 06:45:50.869203795 +0000 UTC m=+103.898984410 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 06:45:34 crc kubenswrapper[4825]: E0310 06:45:34.868912 4825 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 06:45:34 crc kubenswrapper[4825]: E0310 06:45:34.869407 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 06:45:50.869363019 +0000 UTC m=+103.899143794 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 06:45:34 crc kubenswrapper[4825]: E0310 06:45:34.870163 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:45:50.870104848 +0000 UTC m=+103.899885573 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.910203 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.910259 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.910273 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.910294 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:34 crc kubenswrapper[4825]: I0310 06:45:34.910306 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:34Z","lastTransitionTime":"2026-03-10T06:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.014030 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.014091 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.014110 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.014157 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.014176 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:35Z","lastTransitionTime":"2026-03-10T06:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.116662 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.116724 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.116734 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.116753 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.116766 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:35Z","lastTransitionTime":"2026-03-10T06:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.219726 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.219776 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.219786 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.219805 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.219820 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:35Z","lastTransitionTime":"2026-03-10T06:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.236200 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:45:35 crc kubenswrapper[4825]: E0310 06:45:35.236335 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.324122 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.324210 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.324229 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.324255 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.324275 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:35Z","lastTransitionTime":"2026-03-10T06:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.427857 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.427906 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.427921 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.427943 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.427956 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:35Z","lastTransitionTime":"2026-03-10T06:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.530813 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.530856 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.530867 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.530885 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.530899 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:35Z","lastTransitionTime":"2026-03-10T06:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.636019 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.636106 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.636160 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.636197 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.636226 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:35Z","lastTransitionTime":"2026-03-10T06:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.739321 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.739405 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.739424 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.739460 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.739478 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:35Z","lastTransitionTime":"2026-03-10T06:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.842585 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.842661 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.842679 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.842715 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.842735 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:35Z","lastTransitionTime":"2026-03-10T06:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.946204 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.946267 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.946279 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.946300 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:35 crc kubenswrapper[4825]: I0310 06:45:35.946314 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:35Z","lastTransitionTime":"2026-03-10T06:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.048918 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.048979 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.048995 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.049015 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.049029 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:36Z","lastTransitionTime":"2026-03-10T06:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.153559 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.153641 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.153659 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.153691 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.153710 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:36Z","lastTransitionTime":"2026-03-10T06:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.235926 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.235971 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:45:36 crc kubenswrapper[4825]: E0310 06:45:36.236253 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:45:36 crc kubenswrapper[4825]: E0310 06:45:36.236309 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.257275 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.257350 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.257368 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.257395 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.257417 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:36Z","lastTransitionTime":"2026-03-10T06:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.361327 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.361400 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.361421 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.361450 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.361506 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:36Z","lastTransitionTime":"2026-03-10T06:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.464682 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.464756 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.464823 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.464862 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.464889 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:36Z","lastTransitionTime":"2026-03-10T06:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.568515 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.568569 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.568585 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.568608 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.568626 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:36Z","lastTransitionTime":"2026-03-10T06:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.672453 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.672524 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.672545 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.672573 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.672595 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:36Z","lastTransitionTime":"2026-03-10T06:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.674647 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"9ec1a7287733290068585115df1b16748bc415396154735f45a192543405254a"} Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.692318 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8db1f9b1-6042-44a1-97b8-c0e9269f8ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8b02103b8486962e24ee333889dc83acc4379b367dc12c20ca11602879239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:36Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.710851 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b80a6aa275f1b641833c7bdc845f2def19b224d66671705d96c08b6eed3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:36Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.728094 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c0cbc1e1c16536c1e26b0c607f0966b3749a7357e4cd130abd24d53903daacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\
\"containerID\\\":\\\"cri-o://b86d021111582d0c9ef202b910fe8c744a3cb1a80dd163dd41125a47decec8e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:36Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.745040 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:36Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.762610 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ec1a7287733290068585115df1b16748bc415396154735f45a192543405254a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T06:45:36Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.775924 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.775996 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.776014 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.776038 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.776053 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:36Z","lastTransitionTime":"2026-03-10T06:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.783283 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"468bfefa-2636-4ab4-b62d-2c5c738eb872\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:45:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 06:45:15.848873 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 06:45:15.849102 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 06:45:15.850424 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2193444300/tls.crt::/tmp/serving-cert-2193444300/tls.key\\\\\\\"\\\\nI0310 06:45:16.151765 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 06:45:16.162919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 06:45:16.162961 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 06:45:16.163018 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 06:45:16.163029 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 06:45:16.172887 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 06:45:16.172948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172959 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 06:45:16.172977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 06:45:16.172983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 06:45:16.172989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 06:45:16.172898 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 06:45:16.175833 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:36Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.799794 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:36Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.815200 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:36Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.878905 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.878968 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:36 crc 
kubenswrapper[4825]: I0310 06:45:36.878985 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.879012 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.879035 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:36Z","lastTransitionTime":"2026-03-10T06:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.981708 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.981774 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.981794 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.981816 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:36 crc kubenswrapper[4825]: I0310 06:45:36.981835 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:36Z","lastTransitionTime":"2026-03-10T06:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:37 crc kubenswrapper[4825]: I0310 06:45:37.085510 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:37 crc kubenswrapper[4825]: I0310 06:45:37.085576 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:37 crc kubenswrapper[4825]: I0310 06:45:37.085593 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:37 crc kubenswrapper[4825]: I0310 06:45:37.085618 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:37 crc kubenswrapper[4825]: I0310 06:45:37.085633 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:37Z","lastTransitionTime":"2026-03-10T06:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:37 crc kubenswrapper[4825]: I0310 06:45:37.188675 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:37 crc kubenswrapper[4825]: I0310 06:45:37.188755 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:37 crc kubenswrapper[4825]: I0310 06:45:37.188774 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:37 crc kubenswrapper[4825]: I0310 06:45:37.188802 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:37 crc kubenswrapper[4825]: I0310 06:45:37.188822 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:37Z","lastTransitionTime":"2026-03-10T06:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:37 crc kubenswrapper[4825]: I0310 06:45:37.235599 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:45:37 crc kubenswrapper[4825]: E0310 06:45:37.235995 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:45:37 crc kubenswrapper[4825]: I0310 06:45:37.292044 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:37 crc kubenswrapper[4825]: I0310 06:45:37.292178 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:37 crc kubenswrapper[4825]: I0310 06:45:37.292205 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:37 crc kubenswrapper[4825]: I0310 06:45:37.292240 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:37 crc kubenswrapper[4825]: I0310 06:45:37.292264 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:37Z","lastTransitionTime":"2026-03-10T06:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:37 crc kubenswrapper[4825]: I0310 06:45:37.395368 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:37 crc kubenswrapper[4825]: I0310 06:45:37.395436 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:37 crc kubenswrapper[4825]: I0310 06:45:37.395459 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:37 crc kubenswrapper[4825]: I0310 06:45:37.395496 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:37 crc kubenswrapper[4825]: I0310 06:45:37.395513 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:37Z","lastTransitionTime":"2026-03-10T06:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:37 crc kubenswrapper[4825]: I0310 06:45:37.498496 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:37 crc kubenswrapper[4825]: I0310 06:45:37.498551 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:37 crc kubenswrapper[4825]: I0310 06:45:37.498569 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:37 crc kubenswrapper[4825]: I0310 06:45:37.498593 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:37 crc kubenswrapper[4825]: I0310 06:45:37.498611 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:37Z","lastTransitionTime":"2026-03-10T06:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:37 crc kubenswrapper[4825]: I0310 06:45:37.602410 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:37 crc kubenswrapper[4825]: I0310 06:45:37.602483 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:37 crc kubenswrapper[4825]: I0310 06:45:37.602507 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:37 crc kubenswrapper[4825]: I0310 06:45:37.602537 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:37 crc kubenswrapper[4825]: I0310 06:45:37.602561 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:37Z","lastTransitionTime":"2026-03-10T06:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:37 crc kubenswrapper[4825]: I0310 06:45:37.716811 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:37 crc kubenswrapper[4825]: I0310 06:45:37.716868 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:37 crc kubenswrapper[4825]: I0310 06:45:37.716881 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:37 crc kubenswrapper[4825]: I0310 06:45:37.716905 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:37 crc kubenswrapper[4825]: I0310 06:45:37.716917 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:37Z","lastTransitionTime":"2026-03-10T06:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:37 crc kubenswrapper[4825]: I0310 06:45:37.820539 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:37 crc kubenswrapper[4825]: I0310 06:45:37.820612 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:37 crc kubenswrapper[4825]: I0310 06:45:37.820631 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:37 crc kubenswrapper[4825]: I0310 06:45:37.820656 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:37 crc kubenswrapper[4825]: I0310 06:45:37.820676 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:37Z","lastTransitionTime":"2026-03-10T06:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:37 crc kubenswrapper[4825]: I0310 06:45:37.924591 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:37 crc kubenswrapper[4825]: I0310 06:45:37.924679 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:37 crc kubenswrapper[4825]: I0310 06:45:37.924705 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:37 crc kubenswrapper[4825]: I0310 06:45:37.924743 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:37 crc kubenswrapper[4825]: I0310 06:45:37.924774 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:37Z","lastTransitionTime":"2026-03-10T06:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.027855 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.027926 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.027950 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.027985 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.028008 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:38Z","lastTransitionTime":"2026-03-10T06:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.131853 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.131968 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.131995 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.132034 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.132059 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:38Z","lastTransitionTime":"2026-03-10T06:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.235485 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.235500 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:45:38 crc kubenswrapper[4825]: E0310 06:45:38.235685 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.235739 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.235790 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.235806 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.235827 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.235843 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:38Z","lastTransitionTime":"2026-03-10T06:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:38 crc kubenswrapper[4825]: E0310 06:45:38.235948 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.338729 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.338815 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.338831 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.338856 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.338879 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:38Z","lastTransitionTime":"2026-03-10T06:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.442208 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.442283 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.442299 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.442325 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.442344 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:38Z","lastTransitionTime":"2026-03-10T06:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.544855 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.544937 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.544954 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.544982 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.545002 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:38Z","lastTransitionTime":"2026-03-10T06:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.647690 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.647743 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.647759 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.647783 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.647801 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:38Z","lastTransitionTime":"2026-03-10T06:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.751029 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.751107 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.751120 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.751158 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.751173 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:38Z","lastTransitionTime":"2026-03-10T06:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.854195 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.854245 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.854260 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.854283 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.854300 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:38Z","lastTransitionTime":"2026-03-10T06:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.958028 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.958112 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.958175 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.958213 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:38 crc kubenswrapper[4825]: I0310 06:45:38.958237 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:38Z","lastTransitionTime":"2026-03-10T06:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.060966 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.061013 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.061022 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.061038 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.061047 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:39Z","lastTransitionTime":"2026-03-10T06:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.164219 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.164270 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.164282 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.164302 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.164318 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:39Z","lastTransitionTime":"2026-03-10T06:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.236103 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:45:39 crc kubenswrapper[4825]: E0310 06:45:39.236443 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.252604 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:39Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.266803 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.266846 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.266859 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.266878 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.266891 4825 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:39Z","lastTransitionTime":"2026-03-10T06:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.270282 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:39Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.289758 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"468bfefa-2636-4ab4-b62d-2c5c738eb872\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:45:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 06:45:15.848873 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 06:45:15.849102 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 06:45:15.850424 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2193444300/tls.crt::/tmp/serving-cert-2193444300/tls.key\\\\\\\"\\\\nI0310 06:45:16.151765 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 06:45:16.162919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 06:45:16.162961 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 06:45:16.163018 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 06:45:16.163029 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 06:45:16.172887 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 06:45:16.172948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172959 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 06:45:16.172977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 06:45:16.172983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 06:45:16.172989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 06:45:16.172898 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 06:45:16.175833 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:39Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.306637 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:39Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.324331 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ec1a7287733290068585115df1b16748bc415396154735f45a192543405254a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T06:45:39Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.339658 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8db1f9b1-6042-44a1-97b8-c0e9269f8ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8b02103b8486962e24ee333889dc83acc4379b367dc12c20ca11602879239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.1
26.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:39Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.369608 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.369676 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.369691 4825 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.369721 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.369736 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:39Z","lastTransitionTime":"2026-03-10T06:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.376065 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b80a6aa275f1b641833c7bdc845f2def19b224d66671705d96c08b6eed3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b1
7de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:39Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.408570 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c0cbc1e1c16536c1e26b0c607f0966b3749a7357e4cd130abd24d53903daacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b86d021111582d0c9ef202b910fe8c744a3cb1a80dd163dd41125a47decec8e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:39Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.472088 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.472163 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.472176 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.472194 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.472208 4825 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:39Z","lastTransitionTime":"2026-03-10T06:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.575424 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.575475 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.575485 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.575505 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.575518 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:39Z","lastTransitionTime":"2026-03-10T06:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.678500 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.678572 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.678591 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.678618 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.678637 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:39Z","lastTransitionTime":"2026-03-10T06:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.781775 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.781827 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.781843 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.781870 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.781889 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:39Z","lastTransitionTime":"2026-03-10T06:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.885087 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.885157 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.885173 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.885193 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.885206 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:39Z","lastTransitionTime":"2026-03-10T06:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.988474 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.988531 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.988548 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.988571 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:39 crc kubenswrapper[4825]: I0310 06:45:39.988587 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:39Z","lastTransitionTime":"2026-03-10T06:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.091728 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.092094 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.092185 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.092491 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.092563 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:40Z","lastTransitionTime":"2026-03-10T06:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.195865 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.196282 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.196350 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.196481 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.196590 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:40Z","lastTransitionTime":"2026-03-10T06:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.222639 4825 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.236019 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.236429 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:45:40 crc kubenswrapper[4825]: E0310 06:45:40.236569 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:45:40 crc kubenswrapper[4825]: E0310 06:45:40.236645 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.299311 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.299392 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.299418 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.299458 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.299483 4825 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:40Z","lastTransitionTime":"2026-03-10T06:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.402988 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.403681 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.403797 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.403934 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.404057 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:40Z","lastTransitionTime":"2026-03-10T06:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.507507 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.507959 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.508098 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.508522 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.508779 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:40Z","lastTransitionTime":"2026-03-10T06:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.612907 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.613220 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.613340 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.613456 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.613529 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:40Z","lastTransitionTime":"2026-03-10T06:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.717049 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.717369 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.717444 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.717580 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.717655 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:40Z","lastTransitionTime":"2026-03-10T06:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.821026 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.821098 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.821118 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.821176 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.821196 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:40Z","lastTransitionTime":"2026-03-10T06:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.924171 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.924446 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.924557 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.924623 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.924679 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:40Z","lastTransitionTime":"2026-03-10T06:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.981615 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.981682 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.981700 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.981728 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:40 crc kubenswrapper[4825]: I0310 06:45:40.981744 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:40Z","lastTransitionTime":"2026-03-10T06:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:40 crc kubenswrapper[4825]: E0310 06:45:40.995935 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bc7fa84-cf6d-4b6d-8f9a-118c306e0760\\\",\\\"systemUUID\\\":\\\"4821b499-aedc-4e42-be1d-4a415e5b2f81\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:40Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.002281 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.002444 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.002566 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.002677 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.002786 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:41Z","lastTransitionTime":"2026-03-10T06:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:41 crc kubenswrapper[4825]: E0310 06:45:41.022353 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bc7fa84-cf6d-4b6d-8f9a-118c306e0760\\\",\\\"systemUUID\\\":\\\"4821b499-aedc-4e42-be1d-4a415e5b2f81\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:41Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.029433 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.029508 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.029527 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.029553 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.029571 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:41Z","lastTransitionTime":"2026-03-10T06:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.052211 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.052400 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.052536 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.052687 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.052819 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:41Z","lastTransitionTime":"2026-03-10T06:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:41 crc kubenswrapper[4825]: E0310 06:45:41.073252 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bc7fa84-cf6d-4b6d-8f9a-118c306e0760\\\",\\\"systemUUID\\\":\\\"4821b499-aedc-4e42-be1d-4a415e5b2f81\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:41Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.080959 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.082023 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.082109 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.082280 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.082366 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:41Z","lastTransitionTime":"2026-03-10T06:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:41 crc kubenswrapper[4825]: E0310 06:45:41.105694 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bc7fa84-cf6d-4b6d-8f9a-118c306e0760\\\",\\\"systemUUID\\\":\\\"4821b499-aedc-4e42-be1d-4a415e5b2f81\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:41Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:41 crc kubenswrapper[4825]: E0310 06:45:41.105965 4825 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.114234 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.114298 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.114311 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.114331 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.114346 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:41Z","lastTransitionTime":"2026-03-10T06:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.216655 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.216934 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.217050 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.217149 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.217217 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:41Z","lastTransitionTime":"2026-03-10T06:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.236808 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:45:41 crc kubenswrapper[4825]: E0310 06:45:41.237193 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.252736 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.320735 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.321114 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.321248 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.321374 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.321495 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:41Z","lastTransitionTime":"2026-03-10T06:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.427794 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.428419 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.428488 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.428558 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.428636 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:41Z","lastTransitionTime":"2026-03-10T06:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.532662 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.532952 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.533088 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.533201 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.533288 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:41Z","lastTransitionTime":"2026-03-10T06:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.636025 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.636079 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.636092 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.636111 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.636125 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:41Z","lastTransitionTime":"2026-03-10T06:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.740044 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.740801 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.740990 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.741239 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.741444 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:41Z","lastTransitionTime":"2026-03-10T06:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.845277 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.845829 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.846195 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.846837 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.847633 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:41Z","lastTransitionTime":"2026-03-10T06:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.950985 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.951564 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.951842 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.952037 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:41 crc kubenswrapper[4825]: I0310 06:45:41.952254 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:41Z","lastTransitionTime":"2026-03-10T06:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.056473 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.056556 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.056587 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.056617 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.056640 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:42Z","lastTransitionTime":"2026-03-10T06:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.159796 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.160120 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.160286 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.160378 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.160478 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:42Z","lastTransitionTime":"2026-03-10T06:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.235506 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:45:42 crc kubenswrapper[4825]: E0310 06:45:42.235711 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.235541 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:45:42 crc kubenswrapper[4825]: E0310 06:45:42.236167 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.263822 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.263883 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.263905 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.263933 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.263951 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:42Z","lastTransitionTime":"2026-03-10T06:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.366830 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.366886 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.366905 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.366932 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.366949 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:42Z","lastTransitionTime":"2026-03-10T06:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.470683 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.471379 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.471447 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.471486 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.471512 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:42Z","lastTransitionTime":"2026-03-10T06:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.574852 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.574924 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.574944 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.574973 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.574993 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:42Z","lastTransitionTime":"2026-03-10T06:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.677906 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.677961 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.677978 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.677998 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.678014 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:42Z","lastTransitionTime":"2026-03-10T06:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.781317 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.781394 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.781413 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.781442 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.781463 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:42Z","lastTransitionTime":"2026-03-10T06:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.884329 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.884412 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.884431 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.884461 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.884486 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:42Z","lastTransitionTime":"2026-03-10T06:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.987477 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.987533 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.987551 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.987573 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:42 crc kubenswrapper[4825]: I0310 06:45:42.987588 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:42Z","lastTransitionTime":"2026-03-10T06:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:43 crc kubenswrapper[4825]: I0310 06:45:43.091266 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:43 crc kubenswrapper[4825]: I0310 06:45:43.091308 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:43 crc kubenswrapper[4825]: I0310 06:45:43.091320 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:43 crc kubenswrapper[4825]: I0310 06:45:43.091339 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:43 crc kubenswrapper[4825]: I0310 06:45:43.091354 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:43Z","lastTransitionTime":"2026-03-10T06:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:43 crc kubenswrapper[4825]: I0310 06:45:43.194553 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:43 crc kubenswrapper[4825]: I0310 06:45:43.194626 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:43 crc kubenswrapper[4825]: I0310 06:45:43.194644 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:43 crc kubenswrapper[4825]: I0310 06:45:43.194670 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:43 crc kubenswrapper[4825]: I0310 06:45:43.194696 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:43Z","lastTransitionTime":"2026-03-10T06:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:43 crc kubenswrapper[4825]: I0310 06:45:43.236316 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:45:43 crc kubenswrapper[4825]: E0310 06:45:43.236489 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:45:43 crc kubenswrapper[4825]: I0310 06:45:43.297595 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:43 crc kubenswrapper[4825]: I0310 06:45:43.297668 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:43 crc kubenswrapper[4825]: I0310 06:45:43.297689 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:43 crc kubenswrapper[4825]: I0310 06:45:43.297715 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:43 crc kubenswrapper[4825]: I0310 06:45:43.297736 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:43Z","lastTransitionTime":"2026-03-10T06:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:43 crc kubenswrapper[4825]: I0310 06:45:43.401287 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:43 crc kubenswrapper[4825]: I0310 06:45:43.401349 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:43 crc kubenswrapper[4825]: I0310 06:45:43.401367 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:43 crc kubenswrapper[4825]: I0310 06:45:43.401396 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:43 crc kubenswrapper[4825]: I0310 06:45:43.401415 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:43Z","lastTransitionTime":"2026-03-10T06:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:43 crc kubenswrapper[4825]: I0310 06:45:43.504930 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:43 crc kubenswrapper[4825]: I0310 06:45:43.504984 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:43 crc kubenswrapper[4825]: I0310 06:45:43.504998 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:43 crc kubenswrapper[4825]: I0310 06:45:43.505017 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:43 crc kubenswrapper[4825]: I0310 06:45:43.505031 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:43Z","lastTransitionTime":"2026-03-10T06:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:43 crc kubenswrapper[4825]: I0310 06:45:43.609242 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:43 crc kubenswrapper[4825]: I0310 06:45:43.609899 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:43 crc kubenswrapper[4825]: I0310 06:45:43.610034 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:43 crc kubenswrapper[4825]: I0310 06:45:43.610172 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:43 crc kubenswrapper[4825]: I0310 06:45:43.610300 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:43Z","lastTransitionTime":"2026-03-10T06:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:43 crc kubenswrapper[4825]: I0310 06:45:43.713165 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:43 crc kubenswrapper[4825]: I0310 06:45:43.713214 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:43 crc kubenswrapper[4825]: I0310 06:45:43.713229 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:43 crc kubenswrapper[4825]: I0310 06:45:43.713250 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:43 crc kubenswrapper[4825]: I0310 06:45:43.713266 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:43Z","lastTransitionTime":"2026-03-10T06:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:43 crc kubenswrapper[4825]: I0310 06:45:43.816448 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:43 crc kubenswrapper[4825]: I0310 06:45:43.816518 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:43 crc kubenswrapper[4825]: I0310 06:45:43.816545 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:43 crc kubenswrapper[4825]: I0310 06:45:43.816580 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:43 crc kubenswrapper[4825]: I0310 06:45:43.816603 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:43Z","lastTransitionTime":"2026-03-10T06:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:43 crc kubenswrapper[4825]: I0310 06:45:43.920180 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:43 crc kubenswrapper[4825]: I0310 06:45:43.920246 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:43 crc kubenswrapper[4825]: I0310 06:45:43.920274 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:43 crc kubenswrapper[4825]: I0310 06:45:43.920307 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:43 crc kubenswrapper[4825]: I0310 06:45:43.920329 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:43Z","lastTransitionTime":"2026-03-10T06:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.024123 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.024226 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.024244 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.024274 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.024292 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:44Z","lastTransitionTime":"2026-03-10T06:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.127533 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.127613 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.127636 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.127663 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.127681 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:44Z","lastTransitionTime":"2026-03-10T06:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.231126 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.231195 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.231207 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.231228 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.231244 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:44Z","lastTransitionTime":"2026-03-10T06:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.235731 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.235769 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:45:44 crc kubenswrapper[4825]: E0310 06:45:44.235891 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:45:44 crc kubenswrapper[4825]: E0310 06:45:44.236035 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.334692 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.334771 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.334785 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.334806 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.334821 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:44Z","lastTransitionTime":"2026-03-10T06:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.437955 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.438025 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.438041 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.438064 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.438082 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:44Z","lastTransitionTime":"2026-03-10T06:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.541854 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.541926 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.541947 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.541978 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.542000 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:44Z","lastTransitionTime":"2026-03-10T06:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.645594 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.645654 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.645668 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.645695 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.645709 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:44Z","lastTransitionTime":"2026-03-10T06:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.748605 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.748658 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.748672 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.748693 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.748709 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:44Z","lastTransitionTime":"2026-03-10T06:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.852731 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.852790 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.852807 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.852831 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.852853 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:44Z","lastTransitionTime":"2026-03-10T06:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.956083 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.956178 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.956198 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.956227 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:44 crc kubenswrapper[4825]: I0310 06:45:44.956247 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:44Z","lastTransitionTime":"2026-03-10T06:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.059999 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.060091 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.060115 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.060173 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.060194 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:45Z","lastTransitionTime":"2026-03-10T06:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.163931 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.164011 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.164029 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.164056 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.164076 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:45Z","lastTransitionTime":"2026-03-10T06:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.235659 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:45:45 crc kubenswrapper[4825]: E0310 06:45:45.236371 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.236814 4825 scope.go:117] "RemoveContainer" containerID="414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d" Mar 10 06:45:45 crc kubenswrapper[4825]: E0310 06:45:45.237196 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.267174 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.267464 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.267506 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.267542 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.267571 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:45Z","lastTransitionTime":"2026-03-10T06:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.370777 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.370861 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.370885 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.370916 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.370939 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:45Z","lastTransitionTime":"2026-03-10T06:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.473892 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.473962 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.473981 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.474005 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.474025 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:45Z","lastTransitionTime":"2026-03-10T06:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.577066 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.577189 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.577228 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.577262 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.577285 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:45Z","lastTransitionTime":"2026-03-10T06:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.680698 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.680765 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.680789 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.680820 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.680840 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:45Z","lastTransitionTime":"2026-03-10T06:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.784486 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.784568 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.784599 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.784637 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.784664 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:45Z","lastTransitionTime":"2026-03-10T06:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.887447 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.887508 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.887521 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.887543 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.887556 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:45Z","lastTransitionTime":"2026-03-10T06:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.990858 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.990940 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.990965 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.990997 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:45 crc kubenswrapper[4825]: I0310 06:45:45.991018 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:45Z","lastTransitionTime":"2026-03-10T06:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.094033 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.094414 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.094512 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.094599 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.094679 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:46Z","lastTransitionTime":"2026-03-10T06:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.157763 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-pgpc8"] Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.158112 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-pgpc8" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.162086 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.162417 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.163321 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.179942 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:46Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.194467 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:46Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.198058 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.198171 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:46 crc 
kubenswrapper[4825]: I0310 06:45:46.198191 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.198220 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.198245 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:46Z","lastTransitionTime":"2026-03-10T06:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.217054 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"743f3ca5-3ec0-4c5b-bad2-3e64df57da6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872997317afa0a2274e0b4ce87bdd13fad7a3291521411a8a208bfd53672163e\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8afd573e70bc13a4b3d3539d466f0bc54b26139a06494ccb38d980c05f0974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de8e93ef3747888e32ccae04649a70ef2b3f1ad42133acc0137f78770c9720f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c380bb44c46b881451835d9616e2b3e76e790bbcaccdc979b9c39bb8abdc605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd6d39109c49743644fca4e67fd9e0db312e712adbfe3f7a8ae5a4f749c5945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resour
ces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:1
2Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:46Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.236416 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.236459 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:45:46 crc kubenswrapper[4825]: E0310 06:45:46.236868 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:45:46 crc kubenswrapper[4825]: E0310 06:45:46.237995 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.238563 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"468bfefa-2636-4ab4-b62d-2c5c738eb872\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:45:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 06:45:15.848873 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 06:45:15.849102 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 06:45:15.850424 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2193444300/tls.crt::/tmp/serving-cert-2193444300/tls.key\\\\\\\"\\\\nI0310 06:45:16.151765 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 06:45:16.162919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 06:45:16.162961 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 06:45:16.163018 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 06:45:16.163029 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 06:45:16.172887 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 06:45:16.172948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172959 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 06:45:16.172977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 06:45:16.172983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 06:45:16.172989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 06:45:16.172898 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 06:45:16.175833 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da1
63ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:46Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.253571 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ec1a7287733290068585115df1b16748bc415396154735f45a192543405254a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T06:45:46Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.269057 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgpc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdb08f12-1daf-4d35-940c-914e187ffda0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgsbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgpc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:46Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.285524 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8db1f9b1-6042-44a1-97b8-c0e9269f8ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8b02103b8486962e24ee333889dc83acc4379b367dc12c20ca11602879239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:46Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.301932 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.302030 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.302057 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.302091 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.302117 4825 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:46Z","lastTransitionTime":"2026-03-10T06:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.302527 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b80a6aa275f1b641833c7bdc845f2def19b224d66671705d96c08b6eed3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:46Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.307593 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgsbt\" (UniqueName: \"kubernetes.io/projected/bdb08f12-1daf-4d35-940c-914e187ffda0-kube-api-access-kgsbt\") pod \"node-resolver-pgpc8\" (UID: \"bdb08f12-1daf-4d35-940c-914e187ffda0\") " pod="openshift-dns/node-resolver-pgpc8" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.307856 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bdb08f12-1daf-4d35-940c-914e187ffda0-hosts-file\") pod \"node-resolver-pgpc8\" (UID: \"bdb08f12-1daf-4d35-940c-914e187ffda0\") " pod="openshift-dns/node-resolver-pgpc8" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.317773 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c0cbc1e1c16536c1e26b0c607f0966b3749a7357e4cd130abd24d53903daacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b86d021111582d0c9ef202b910fe8c744a3cb1a80dd163dd41125a47decec8e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:46Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.333381 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:46Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.406069 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.406546 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.406726 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.406873 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.407035 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:46Z","lastTransitionTime":"2026-03-10T06:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.409580 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bdb08f12-1daf-4d35-940c-914e187ffda0-hosts-file\") pod \"node-resolver-pgpc8\" (UID: \"bdb08f12-1daf-4d35-940c-914e187ffda0\") " pod="openshift-dns/node-resolver-pgpc8" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.409685 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgsbt\" (UniqueName: \"kubernetes.io/projected/bdb08f12-1daf-4d35-940c-914e187ffda0-kube-api-access-kgsbt\") pod \"node-resolver-pgpc8\" (UID: \"bdb08f12-1daf-4d35-940c-914e187ffda0\") " pod="openshift-dns/node-resolver-pgpc8" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.410029 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bdb08f12-1daf-4d35-940c-914e187ffda0-hosts-file\") pod \"node-resolver-pgpc8\" (UID: \"bdb08f12-1daf-4d35-940c-914e187ffda0\") " pod="openshift-dns/node-resolver-pgpc8" Mar 10 06:45:46 
crc kubenswrapper[4825]: I0310 06:45:46.452867 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgsbt\" (UniqueName: \"kubernetes.io/projected/bdb08f12-1daf-4d35-940c-914e187ffda0-kube-api-access-kgsbt\") pod \"node-resolver-pgpc8\" (UID: \"bdb08f12-1daf-4d35-940c-914e187ffda0\") " pod="openshift-dns/node-resolver-pgpc8" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.477006 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-pgpc8" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.511773 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.511816 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.511826 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.511843 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.511855 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:46Z","lastTransitionTime":"2026-03-10T06:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.544083 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-bvt9j"] Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.545740 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-8dkbt"] Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.545929 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-g445x"] Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.546641 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-g445x" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.547369 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.547636 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.552212 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.552428 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.552791 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.553012 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.553065 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.553312 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.553504 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.553774 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.553882 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.554113 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.556189 4825 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.556421 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.571944 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:46Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.595517 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g445x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc29189-3e91-4d20-8d00-682a9431a8ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g445x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:46Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.611849 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/165351e4-3c96-4a68-8c75-43b001b0ec60-host-var-lib-cni-multus\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.612077 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9beb5814-89d0-47c0-8b0e-24376a358fc3-proxy-tls\") pod \"machine-config-daemon-bvt9j\" (UID: \"9beb5814-89d0-47c0-8b0e-24376a358fc3\") " 
pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.612182 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/165351e4-3c96-4a68-8c75-43b001b0ec60-system-cni-dir\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.612227 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/165351e4-3c96-4a68-8c75-43b001b0ec60-cni-binary-copy\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.612266 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/165351e4-3c96-4a68-8c75-43b001b0ec60-host-run-netns\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.612303 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/165351e4-3c96-4a68-8c75-43b001b0ec60-host-var-lib-kubelet\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.612358 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0fc29189-3e91-4d20-8d00-682a9431a8ef-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g445x\" (UID: \"0fc29189-3e91-4d20-8d00-682a9431a8ef\") " 
pod="openshift-multus/multus-additional-cni-plugins-g445x" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.612406 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/165351e4-3c96-4a68-8c75-43b001b0ec60-os-release\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.612472 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0fc29189-3e91-4d20-8d00-682a9431a8ef-cni-binary-copy\") pod \"multus-additional-cni-plugins-g445x\" (UID: \"0fc29189-3e91-4d20-8d00-682a9431a8ef\") " pod="openshift-multus/multus-additional-cni-plugins-g445x" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.612517 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/165351e4-3c96-4a68-8c75-43b001b0ec60-cnibin\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.612556 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/165351e4-3c96-4a68-8c75-43b001b0ec60-host-run-multus-certs\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.612596 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88dxl\" (UniqueName: \"kubernetes.io/projected/9beb5814-89d0-47c0-8b0e-24376a358fc3-kube-api-access-88dxl\") pod \"machine-config-daemon-bvt9j\" (UID: 
\"9beb5814-89d0-47c0-8b0e-24376a358fc3\") " pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.612650 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/165351e4-3c96-4a68-8c75-43b001b0ec60-multus-conf-dir\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.612710 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gts5z\" (UniqueName: \"kubernetes.io/projected/0fc29189-3e91-4d20-8d00-682a9431a8ef-kube-api-access-gts5z\") pod \"multus-additional-cni-plugins-g445x\" (UID: \"0fc29189-3e91-4d20-8d00-682a9431a8ef\") " pod="openshift-multus/multus-additional-cni-plugins-g445x" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.612748 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/165351e4-3c96-4a68-8c75-43b001b0ec60-hostroot\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.612783 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/165351e4-3c96-4a68-8c75-43b001b0ec60-multus-daemon-config\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.612825 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0fc29189-3e91-4d20-8d00-682a9431a8ef-cnibin\") pod 
\"multus-additional-cni-plugins-g445x\" (UID: \"0fc29189-3e91-4d20-8d00-682a9431a8ef\") " pod="openshift-multus/multus-additional-cni-plugins-g445x" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.612868 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0fc29189-3e91-4d20-8d00-682a9431a8ef-os-release\") pod \"multus-additional-cni-plugins-g445x\" (UID: \"0fc29189-3e91-4d20-8d00-682a9431a8ef\") " pod="openshift-multus/multus-additional-cni-plugins-g445x" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.612918 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/165351e4-3c96-4a68-8c75-43b001b0ec60-host-run-k8s-cni-cncf-io\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.612970 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/9beb5814-89d0-47c0-8b0e-24376a358fc3-rootfs\") pod \"machine-config-daemon-bvt9j\" (UID: \"9beb5814-89d0-47c0-8b0e-24376a358fc3\") " pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.613009 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0fc29189-3e91-4d20-8d00-682a9431a8ef-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g445x\" (UID: \"0fc29189-3e91-4d20-8d00-682a9431a8ef\") " pod="openshift-multus/multus-additional-cni-plugins-g445x" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.613044 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/165351e4-3c96-4a68-8c75-43b001b0ec60-multus-socket-dir-parent\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.612975 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"468bfefa-2636-4ab4-b62d-2c5c738eb872\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:45:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 06:45:15.848873 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 06:45:15.849102 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 06:45:15.850424 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2193444300/tls.crt::/tmp/serving-cert-2193444300/tls.key\\\\\\\"\\\\nI0310 06:45:16.151765 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 06:45:16.162919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 06:45:16.162961 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 06:45:16.163018 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 06:45:16.163029 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 06:45:16.172887 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 06:45:16.172948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172959 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 06:45:16.172977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 06:45:16.172983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 06:45:16.172989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 06:45:16.172898 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 06:45:16.175833 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:46Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.613086 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9beb5814-89d0-47c0-8b0e-24376a358fc3-mcd-auth-proxy-config\") pod \"machine-config-daemon-bvt9j\" (UID: \"9beb5814-89d0-47c0-8b0e-24376a358fc3\") " pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.613123 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/165351e4-3c96-4a68-8c75-43b001b0ec60-multus-cni-dir\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.613190 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/165351e4-3c96-4a68-8c75-43b001b0ec60-host-var-lib-cni-bin\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.613441 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/0fc29189-3e91-4d20-8d00-682a9431a8ef-system-cni-dir\") pod \"multus-additional-cni-plugins-g445x\" (UID: \"0fc29189-3e91-4d20-8d00-682a9431a8ef\") " pod="openshift-multus/multus-additional-cni-plugins-g445x" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.613477 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/165351e4-3c96-4a68-8c75-43b001b0ec60-etc-kubernetes\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.613513 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5xgs\" (UniqueName: \"kubernetes.io/projected/165351e4-3c96-4a68-8c75-43b001b0ec60-kube-api-access-c5xgs\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.615442 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.615490 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.615500 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.615519 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.615532 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:46Z","lastTransitionTime":"2026-03-10T06:45:46Z","reason":"KubeletNotReady","message":"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.635977 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b80a6aa275f1b641833c7bdc845f2def19b224d66671705d96c08b6eed3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:46Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.653671 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:46Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.667023 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ec1a7287733290068585115df1b16748bc415396154735f45a192543405254a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T06:45:46Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.678326 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgpc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdb08f12-1daf-4d35-940c-914e187ffda0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgsbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgpc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:46Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.692429 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8db1f9b1-6042-44a1-97b8-c0e9269f8ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8b02103b8486962e24ee333889dc83acc4379b367dc12c20ca11602879239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:46Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.707380 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:46Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.712244 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-pgpc8" event={"ID":"bdb08f12-1daf-4d35-940c-914e187ffda0","Type":"ContainerStarted","Data":"3d01b31186bf24810cb7b16bd8e50aa4d1fdb1cafcf794377dc5f1980c16087d"} Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 
06:45:46.714677 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/165351e4-3c96-4a68-8c75-43b001b0ec60-hostroot\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.714786 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/165351e4-3c96-4a68-8c75-43b001b0ec60-hostroot\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.714812 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/165351e4-3c96-4a68-8c75-43b001b0ec60-multus-daemon-config\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.714979 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0fc29189-3e91-4d20-8d00-682a9431a8ef-os-release\") pod \"multus-additional-cni-plugins-g445x\" (UID: \"0fc29189-3e91-4d20-8d00-682a9431a8ef\") " pod="openshift-multus/multus-additional-cni-plugins-g445x" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.715068 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/165351e4-3c96-4a68-8c75-43b001b0ec60-host-run-k8s-cni-cncf-io\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.715152 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/0fc29189-3e91-4d20-8d00-682a9431a8ef-os-release\") pod \"multus-additional-cni-plugins-g445x\" (UID: \"0fc29189-3e91-4d20-8d00-682a9431a8ef\") " pod="openshift-multus/multus-additional-cni-plugins-g445x" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.715177 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0fc29189-3e91-4d20-8d00-682a9431a8ef-cnibin\") pod \"multus-additional-cni-plugins-g445x\" (UID: \"0fc29189-3e91-4d20-8d00-682a9431a8ef\") " pod="openshift-multus/multus-additional-cni-plugins-g445x" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.715196 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/165351e4-3c96-4a68-8c75-43b001b0ec60-host-run-k8s-cni-cncf-io\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.715227 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0fc29189-3e91-4d20-8d00-682a9431a8ef-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g445x\" (UID: \"0fc29189-3e91-4d20-8d00-682a9431a8ef\") " pod="openshift-multus/multus-additional-cni-plugins-g445x" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.715314 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/165351e4-3c96-4a68-8c75-43b001b0ec60-multus-socket-dir-parent\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.715360 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/0fc29189-3e91-4d20-8d00-682a9431a8ef-cnibin\") pod \"multus-additional-cni-plugins-g445x\" (UID: \"0fc29189-3e91-4d20-8d00-682a9431a8ef\") " pod="openshift-multus/multus-additional-cni-plugins-g445x" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.715397 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/9beb5814-89d0-47c0-8b0e-24376a358fc3-rootfs\") pod \"machine-config-daemon-bvt9j\" (UID: \"9beb5814-89d0-47c0-8b0e-24376a358fc3\") " pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.715477 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9beb5814-89d0-47c0-8b0e-24376a358fc3-mcd-auth-proxy-config\") pod \"machine-config-daemon-bvt9j\" (UID: \"9beb5814-89d0-47c0-8b0e-24376a358fc3\") " pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.715502 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/165351e4-3c96-4a68-8c75-43b001b0ec60-multus-socket-dir-parent\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.715567 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/165351e4-3c96-4a68-8c75-43b001b0ec60-multus-cni-dir\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.715611 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/165351e4-3c96-4a68-8c75-43b001b0ec60-host-var-lib-cni-bin\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.715698 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0fc29189-3e91-4d20-8d00-682a9431a8ef-system-cni-dir\") pod \"multus-additional-cni-plugins-g445x\" (UID: \"0fc29189-3e91-4d20-8d00-682a9431a8ef\") " pod="openshift-multus/multus-additional-cni-plugins-g445x" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.715730 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/165351e4-3c96-4a68-8c75-43b001b0ec60-multus-cni-dir\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.715742 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/9beb5814-89d0-47c0-8b0e-24376a358fc3-rootfs\") pod \"machine-config-daemon-bvt9j\" (UID: \"9beb5814-89d0-47c0-8b0e-24376a358fc3\") " pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.715767 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/165351e4-3c96-4a68-8c75-43b001b0ec60-host-var-lib-cni-bin\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.715783 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/165351e4-3c96-4a68-8c75-43b001b0ec60-etc-kubernetes\") pod \"multus-8dkbt\" (UID: 
\"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.715806 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/165351e4-3c96-4a68-8c75-43b001b0ec60-multus-daemon-config\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.715836 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0fc29189-3e91-4d20-8d00-682a9431a8ef-system-cni-dir\") pod \"multus-additional-cni-plugins-g445x\" (UID: \"0fc29189-3e91-4d20-8d00-682a9431a8ef\") " pod="openshift-multus/multus-additional-cni-plugins-g445x" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.715885 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5xgs\" (UniqueName: \"kubernetes.io/projected/165351e4-3c96-4a68-8c75-43b001b0ec60-kube-api-access-c5xgs\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.715923 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/165351e4-3c96-4a68-8c75-43b001b0ec60-etc-kubernetes\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.715966 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/165351e4-3c96-4a68-8c75-43b001b0ec60-host-var-lib-cni-multus\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc 
kubenswrapper[4825]: I0310 06:45:46.716071 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/165351e4-3c96-4a68-8c75-43b001b0ec60-cni-binary-copy\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.716095 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/165351e4-3c96-4a68-8c75-43b001b0ec60-host-run-netns\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.716169 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/165351e4-3c96-4a68-8c75-43b001b0ec60-host-run-netns\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.716175 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/165351e4-3c96-4a68-8c75-43b001b0ec60-host-var-lib-kubelet\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.716197 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/165351e4-3c96-4a68-8c75-43b001b0ec60-host-var-lib-kubelet\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.716250 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/9beb5814-89d0-47c0-8b0e-24376a358fc3-proxy-tls\") pod \"machine-config-daemon-bvt9j\" (UID: \"9beb5814-89d0-47c0-8b0e-24376a358fc3\") " pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.716283 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/165351e4-3c96-4a68-8c75-43b001b0ec60-host-var-lib-cni-multus\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.716333 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/165351e4-3c96-4a68-8c75-43b001b0ec60-system-cni-dir\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.716362 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/165351e4-3c96-4a68-8c75-43b001b0ec60-os-release\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.716409 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0fc29189-3e91-4d20-8d00-682a9431a8ef-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g445x\" (UID: \"0fc29189-3e91-4d20-8d00-682a9431a8ef\") " pod="openshift-multus/multus-additional-cni-plugins-g445x" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.716432 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0fc29189-3e91-4d20-8d00-682a9431a8ef-cni-binary-copy\") pod 
\"multus-additional-cni-plugins-g445x\" (UID: \"0fc29189-3e91-4d20-8d00-682a9431a8ef\") " pod="openshift-multus/multus-additional-cni-plugins-g445x" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.716452 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/165351e4-3c96-4a68-8c75-43b001b0ec60-cnibin\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.716490 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/165351e4-3c96-4a68-8c75-43b001b0ec60-host-run-multus-certs\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.716480 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/165351e4-3c96-4a68-8c75-43b001b0ec60-os-release\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.716550 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/165351e4-3c96-4a68-8c75-43b001b0ec60-host-run-multus-certs\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.716574 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/165351e4-3c96-4a68-8c75-43b001b0ec60-system-cni-dir\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 
06:45:46.716602 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88dxl\" (UniqueName: \"kubernetes.io/projected/9beb5814-89d0-47c0-8b0e-24376a358fc3-kube-api-access-88dxl\") pod \"machine-config-daemon-bvt9j\" (UID: \"9beb5814-89d0-47c0-8b0e-24376a358fc3\") " pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.716658 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/165351e4-3c96-4a68-8c75-43b001b0ec60-multus-conf-dir\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.716709 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/165351e4-3c96-4a68-8c75-43b001b0ec60-cni-binary-copy\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.716747 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gts5z\" (UniqueName: \"kubernetes.io/projected/0fc29189-3e91-4d20-8d00-682a9431a8ef-kube-api-access-gts5z\") pod \"multus-additional-cni-plugins-g445x\" (UID: \"0fc29189-3e91-4d20-8d00-682a9431a8ef\") " pod="openshift-multus/multus-additional-cni-plugins-g445x" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.716763 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/165351e4-3c96-4a68-8c75-43b001b0ec60-multus-conf-dir\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.716860 4825 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/165351e4-3c96-4a68-8c75-43b001b0ec60-cnibin\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.716930 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9beb5814-89d0-47c0-8b0e-24376a358fc3-mcd-auth-proxy-config\") pod \"machine-config-daemon-bvt9j\" (UID: \"9beb5814-89d0-47c0-8b0e-24376a358fc3\") " pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.717082 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0fc29189-3e91-4d20-8d00-682a9431a8ef-cni-binary-copy\") pod \"multus-additional-cni-plugins-g445x\" (UID: \"0fc29189-3e91-4d20-8d00-682a9431a8ef\") " pod="openshift-multus/multus-additional-cni-plugins-g445x" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.717174 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0fc29189-3e91-4d20-8d00-682a9431a8ef-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g445x\" (UID: \"0fc29189-3e91-4d20-8d00-682a9431a8ef\") " pod="openshift-multus/multus-additional-cni-plugins-g445x" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.717179 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0fc29189-3e91-4d20-8d00-682a9431a8ef-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g445x\" (UID: \"0fc29189-3e91-4d20-8d00-682a9431a8ef\") " pod="openshift-multus/multus-additional-cni-plugins-g445x" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.719435 4825 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.719466 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.719479 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.719500 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.719514 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:46Z","lastTransitionTime":"2026-03-10T06:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.723879 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9beb5814-89d0-47c0-8b0e-24376a358fc3-proxy-tls\") pod \"machine-config-daemon-bvt9j\" (UID: \"9beb5814-89d0-47c0-8b0e-24376a358fc3\") " pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.733786 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"743f3ca5-3ec0-4c5b-bad2-3e64df57da6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872997317afa0a2274e0b4ce87bdd13fad7a3291521411a8a208bfd53672163e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"stat
e\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8afd573e70bc13a4b3d3539d466f0bc54b26139a06494ccb38d980c05f0974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de8e93ef3747888e32ccae04649a70ef2b3f1ad42133acc0137f78770c9720f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},
{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c380bb44c46b881451835d9616e2b3e76e790bbcaccdc979b9c39bb8abdc605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd6d39109c49743644fca4e67fd9e0db312e712adbfe3f7a8ae5a4f749c5945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee
53e42268550e3d972e09de413c339f78b552a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:46Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.734481 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5xgs\" (UniqueName: \"kubernetes.io/projected/165351e4-3c96-4a68-8c75-43b001b0ec60-kube-api-access-c5xgs\") pod \"multus-8dkbt\" (UID: \"165351e4-3c96-4a68-8c75-43b001b0ec60\") " pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.736580 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gts5z\" (UniqueName: \"kubernetes.io/projected/0fc29189-3e91-4d20-8d00-682a9431a8ef-kube-api-access-gts5z\") pod \"multus-additional-cni-plugins-g445x\" (UID: \"0fc29189-3e91-4d20-8d00-682a9431a8ef\") " 
pod="openshift-multus/multus-additional-cni-plugins-g445x" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.742642 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88dxl\" (UniqueName: \"kubernetes.io/projected/9beb5814-89d0-47c0-8b0e-24376a358fc3-kube-api-access-88dxl\") pod \"machine-config-daemon-bvt9j\" (UID: \"9beb5814-89d0-47c0-8b0e-24376a358fc3\") " pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.746022 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9beb5814-89d0-47c0-8b0e-24376a358fc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bvt9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:46Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.758277 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c0cbc1e1c16536c1e26b0c607f0966b3749a7357e4cd130abd24d53903daacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b86d021111582d0c9ef202b910fe8c744a3cb1a80dd163dd41125a47decec8e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:46Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.779574 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g445x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc29189-3e91-4d20-8d00-682a9431a8ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g445x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:46Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.797793 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"468bfefa-2636-4ab4-b62d-2c5c738eb872\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:45:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 06:45:15.848873 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 06:45:15.849102 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 06:45:15.850424 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2193444300/tls.crt::/tmp/serving-cert-2193444300/tls.key\\\\\\\"\\\\nI0310 06:45:16.151765 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 06:45:16.162919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 06:45:16.162961 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 06:45:16.163018 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 06:45:16.163029 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 06:45:16.172887 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 06:45:16.172948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172959 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 06:45:16.172977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 06:45:16.172983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 06:45:16.172989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 06:45:16.172898 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 06:45:16.175833 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:46Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.817001 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:46Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.822238 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.822316 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.822340 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.822369 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.822469 4825 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:46Z","lastTransitionTime":"2026-03-10T06:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.833803 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ec1a7287733290068585115df1b16748bc415396154735f45a192543405254a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:46Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.848331 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgpc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdb08f12-1daf-4d35-940c-914e187ffda0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgsbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgpc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:46Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.861264 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8db1f9b1-6042-44a1-97b8-c0e9269f8ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8b02103b8486962e24ee333889dc83acc4379b367dc12c20ca11602879239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:46Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.876483 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-g445x" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.879797 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b80a6aa275f1b641833c7bdc845f2def19b224d66671705d96c08b6eed3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"
Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:46Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.887121 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.895122 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:46Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.895445 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-8dkbt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.925492 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.925537 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.925550 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.925569 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.925581 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:46Z","lastTransitionTime":"2026-03-10T06:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.932795 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"743f3ca5-3ec0-4c5b-bad2-3e64df57da6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872997317afa0a2274e0b4ce87bdd13fad7a3291521411a8a208bfd53672163e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8afd573e70bc13a4b3d3539d466f0bc54b26139a06494ccb38d980c05f0974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de8e93ef3747888e32ccae04649a70ef2b3f1ad42133acc0137f78770c9720f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c380bb44c46b881451835d9616e2b3e76e790bbcaccdc979b9c39bb8abdc605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd6d39109c49743644fca4e67fd9e0db312e712adbfe3f7a8ae5a4f749c5945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-10T06:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:46Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.943164 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jhkb9"] Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.958265 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:46Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.960276 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.966646 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.967042 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.967472 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.967940 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.968560 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.968806 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.971512 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.982488 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c0cbc1e1c16536c1e26b0c607f0966b3749a7357e4cd130abd24d53903daacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b86d021111582d0c9ef202b910fe8c744a3cb1a80dd163dd41125a47decec8e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:46Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:46 crc kubenswrapper[4825]: I0310 06:45:46.996885 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9beb5814-89d0-47c0-8b0e-24376a358fc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bvt9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:46Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.015277 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dkbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"165351e4-3c96-4a68-8c75-43b001b0ec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5xgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dkbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:47Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.027901 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.027942 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.027951 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.027968 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.027977 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:47Z","lastTransitionTime":"2026-03-10T06:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.035879 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b80a6aa275f1b641833c7bdc845f2def19b224d66671705d96c08b6eed3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:47Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.052707 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:47Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.066026 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ec1a7287733290068585115df1b16748bc415396154735f45a192543405254a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T06:45:47Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.080930 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgpc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdb08f12-1daf-4d35-940c-914e187ffda0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgsbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgpc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:47Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.094869 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8db1f9b1-6042-44a1-97b8-c0e9269f8ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8b02103b8486962e24ee333889dc83acc4379b367dc12c20ca11602879239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:47Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.117304 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:47Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.119257 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-systemd-units\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.119298 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-host-run-ovn-kubernetes\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.119323 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/79ec9d89-dc71-4f36-9254-00bd86795e43-ovnkube-config\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.119352 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-run-openvswitch\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.119441 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvxrb\" (UniqueName: \"kubernetes.io/projected/79ec9d89-dc71-4f36-9254-00bd86795e43-kube-api-access-rvxrb\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.119535 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.119576 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-host-run-netns\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.119604 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-run-ovn\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.119636 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-var-lib-openvswitch\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.119740 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-etc-openvswitch\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.119823 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-run-systemd\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.119882 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-host-cni-netd\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.119949 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-host-kubelet\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.120053 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-log-socket\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.120114 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-host-cni-bin\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.120251 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/79ec9d89-dc71-4f36-9254-00bd86795e43-ovnkube-script-lib\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.120326 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-node-log\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.120394 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/79ec9d89-dc71-4f36-9254-00bd86795e43-ovn-node-metrics-cert\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.120469 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-host-slash\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.120517 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/79ec9d89-dc71-4f36-9254-00bd86795e43-env-overrides\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.130758 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.130833 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.130849 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.130870 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.130886 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:47Z","lastTransitionTime":"2026-03-10T06:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.152642 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"743f3ca5-3ec0-4c5b-bad2-3e64df57da6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872997317afa0a2274e0b4ce87bdd13fad7a3291521411a8a208bfd53672163e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8afd573e70bc13a4b3d3539d466f0bc54b26139a06494ccb38d980c05f0974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de8e93ef3747888e32ccae04649a70ef2b3f1ad42133acc0137f78770c9720f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c380bb44c46b881451835d9616e2b3e76e790bbcaccdc979b9c39bb8abdc605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd6d39109c49743644fca4e67fd9e0db312e712adbfe3f7a8ae5a4f749c5945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-10T06:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:47Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.168569 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9beb5814-89d0-47c0-8b0e-24376a358fc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bvt9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:47Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.185284 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dkbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"165351e4-3c96-4a68-8c75-43b001b0ec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5xgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dkbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:47Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.204276 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c0cbc1e1c16536c1e26b0c607f0966b3749a7357e4cd130abd24d53903daacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a
2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b86d021111582d0c9ef202b910fe8c744a3cb1a80dd163dd41125a47decec8e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-10T06:45:47Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.219773 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:47Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.221043 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-run-openvswitch\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.221084 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvxrb\" (UniqueName: \"kubernetes.io/projected/79ec9d89-dc71-4f36-9254-00bd86795e43-kube-api-access-rvxrb\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.221107 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-host-run-netns\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.221159 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-run-ovn\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.221179 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.221199 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-var-lib-openvswitch\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.221227 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-etc-openvswitch\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.221251 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" 
(UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-host-cni-netd\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.221270 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-run-systemd\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.221290 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-host-kubelet\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.221319 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-log-socket\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.221336 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-host-cni-bin\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.221353 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/79ec9d89-dc71-4f36-9254-00bd86795e43-ovnkube-script-lib\") 
pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.221371 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-node-log\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.221387 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-etc-openvswitch\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.221401 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-run-openvswitch\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.221420 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-host-slash\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.221396 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-host-slash\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" 
Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.221457 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-run-systemd\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.221463 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.221479 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-host-kubelet\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.221366 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-run-ovn\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.221609 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-var-lib-openvswitch\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.221632 4825 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/79ec9d89-dc71-4f36-9254-00bd86795e43-env-overrides\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.221651 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/79ec9d89-dc71-4f36-9254-00bd86795e43-ovn-node-metrics-cert\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.221684 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-systemd-units\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.221720 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-node-log\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.221754 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-host-cni-netd\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.221773 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-host-cni-bin\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.221805 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-log-socket\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.221858 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-systemd-units\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.222186 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-host-run-netns\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.222410 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-host-run-ovn-kubernetes\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.222646 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/79ec9d89-dc71-4f36-9254-00bd86795e43-env-overrides\") pod \"ovnkube-node-jhkb9\" 
(UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.222732 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/79ec9d89-dc71-4f36-9254-00bd86795e43-ovnkube-script-lib\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.222753 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-host-run-ovn-kubernetes\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.222513 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/79ec9d89-dc71-4f36-9254-00bd86795e43-ovnkube-config\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.223693 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/79ec9d89-dc71-4f36-9254-00bd86795e43-ovnkube-config\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.226467 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/79ec9d89-dc71-4f36-9254-00bd86795e43-ovn-node-metrics-cert\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.236247 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:45:47 crc kubenswrapper[4825]: E0310 06:45:47.236557 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.241206 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvxrb\" (UniqueName: \"kubernetes.io/projected/79ec9d89-dc71-4f36-9254-00bd86795e43-kube-api-access-rvxrb\") pod \"ovnkube-node-jhkb9\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.245632 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g445x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc29189-3e91-4d20-8d00-682a9431a8ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\
"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/
etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g445x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:47Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.249640 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.249687 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.249696 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.249720 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:47 crc 
kubenswrapper[4825]: I0310 06:45:47.249730 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:47Z","lastTransitionTime":"2026-03-10T06:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.271824 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79ec9d89-dc71-4f36-9254-00bd86795e43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa
41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\
\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-node-jhkb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:47Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.292198 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"468bfefa-2636-4ab4-b62d-2c5c738eb872\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:45:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 06:45:15.848873 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 06:45:15.849102 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 06:45:15.850424 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2193444300/tls.crt::/tmp/serving-cert-2193444300/tls.key\\\\\\\"\\\\nI0310 06:45:16.151765 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 06:45:16.162919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 06:45:16.162961 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 06:45:16.163018 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 06:45:16.163029 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 06:45:16.172887 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 06:45:16.172948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172959 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 06:45:16.172977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 06:45:16.172983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 06:45:16.172989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 06:45:16.172898 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 06:45:16.175833 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:47Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.352879 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.353285 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.353299 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.353329 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.353339 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:47Z","lastTransitionTime":"2026-03-10T06:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.359230 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:47 crc kubenswrapper[4825]: W0310 06:45:47.375234 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79ec9d89_dc71_4f36_9254_00bd86795e43.slice/crio-9be67cfb02442189023b4ded525ce954e90db103e45ad55a85fbcbb0c06d905f WatchSource:0}: Error finding container 9be67cfb02442189023b4ded525ce954e90db103e45ad55a85fbcbb0c06d905f: Status 404 returned error can't find the container with id 9be67cfb02442189023b4ded525ce954e90db103e45ad55a85fbcbb0c06d905f Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.460453 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.460497 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.460511 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.460531 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.460545 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:47Z","lastTransitionTime":"2026-03-10T06:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.563629 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.563700 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.563716 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.563744 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.563760 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:47Z","lastTransitionTime":"2026-03-10T06:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.666835 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.666909 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.666928 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.666956 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.666977 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:47Z","lastTransitionTime":"2026-03-10T06:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.724086 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerStarted","Data":"adf9d4029544f4f9c869193aedbe1791dd7602dfb6e86bec4792ce01cf7c6f0f"} Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.724179 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerStarted","Data":"88bb20a12b63dd94732d6f3413fa4c14b8931a2d82e49f45769c15a14202fe1c"} Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.724195 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerStarted","Data":"4d98e3cd2f4c3076e8d5d89207817b673006cd890d5a9647a604ac00ec643f17"} Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.738888 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8dkbt" event={"ID":"165351e4-3c96-4a68-8c75-43b001b0ec60","Type":"ContainerStarted","Data":"420d0f058d1a0590e8feaa3f7f4dc37f6f0b91274142ca8ffd6b92c08ea24c1c"} Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.739290 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8dkbt" event={"ID":"165351e4-3c96-4a68-8c75-43b001b0ec60","Type":"ContainerStarted","Data":"df11781e38844239e6f0cdc1a849a6876c0689fdc6f635b04c1dc04c6f73a4df"} Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.742781 4825 generic.go:334] "Generic (PLEG): container finished" podID="0fc29189-3e91-4d20-8d00-682a9431a8ef" containerID="0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5" exitCode=0 Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.742870 4825 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g445x" event={"ID":"0fc29189-3e91-4d20-8d00-682a9431a8ef","Type":"ContainerDied","Data":"0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5"} Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.743373 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g445x" event={"ID":"0fc29189-3e91-4d20-8d00-682a9431a8ef","Type":"ContainerStarted","Data":"9f5d38f28818fc53a64d23ecbb9501988765ef024ad11a468720cbc5cbd1e8e5"} Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.747248 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-pgpc8" event={"ID":"bdb08f12-1daf-4d35-940c-914e187ffda0","Type":"ContainerStarted","Data":"9d002f7817bf6d5fdb95b7dc90c3eb10f4d949456ad0c472e1671f0ab97278b1"} Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.749545 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c0cbc1e1c16536c1e26b0c607f0966b3749a7357e4cd130abd24d53903daacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b86d021111582d0c9ef202b910fe8c744a3cb1a80dd163dd41125a47decec8e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:47Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.755647 4825 generic.go:334] "Generic (PLEG): container finished" podID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerID="1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b" exitCode=0 Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.755828 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" event={"ID":"79ec9d89-dc71-4f36-9254-00bd86795e43","Type":"ContainerDied","Data":"1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b"} Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.755986 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" event={"ID":"79ec9d89-dc71-4f36-9254-00bd86795e43","Type":"ContainerStarted","Data":"9be67cfb02442189023b4ded525ce954e90db103e45ad55a85fbcbb0c06d905f"} Mar 10 06:45:47 crc 
kubenswrapper[4825]: I0310 06:45:47.765009 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9beb5814-89d0-47c0-8b0e-24376a358fc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf9d4029544f4f9c869193aedbe1791dd7602dfb6e86bec4792ce01cf7c6f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bb20a12b63dd94732d6f3413fa4c14b8931a2d82e49f45769c15a14202fe1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bvt9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:47Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.770224 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.770365 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.770465 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.770582 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.770665 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:47Z","lastTransitionTime":"2026-03-10T06:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.785499 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dkbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"165351e4-3c96-4a68-8c75-43b001b0ec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5xgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dkbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:47Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.808754 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"468bfefa-2636-4ab4-b62d-2c5c738eb872\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:45:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 06:45:15.848873 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 06:45:15.849102 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 06:45:15.850424 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2193444300/tls.crt::/tmp/serving-cert-2193444300/tls.key\\\\\\\"\\\\nI0310 06:45:16.151765 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 06:45:16.162919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 06:45:16.162961 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 06:45:16.163018 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 06:45:16.163029 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 06:45:16.172887 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 06:45:16.172948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172959 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 06:45:16.172977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 06:45:16.172983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 06:45:16.172989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 06:45:16.172898 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 06:45:16.175833 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da1
63ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:47Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.825589 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:47Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.842195 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g445x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc29189-3e91-4d20-8d00-682a9431a8ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g445x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:47Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.864021 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79ec9d89-dc71-4f36-9254-00bd86795e43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jhkb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:47Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.873687 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.873731 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.873741 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.873757 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.873769 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:47Z","lastTransitionTime":"2026-03-10T06:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.880289 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8db1f9b1-6042-44a1-97b8-c0e9269f8ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8b02103b8486962e24ee333889dc83acc4379b367dc12c20ca11602879239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:47Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.896428 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b80a6aa275f1b641833c7bdc845f2def19b224d66671705d96c08b6eed3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:47Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.913078 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:47Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.930074 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ec1a7287733290068585115df1b16748bc415396154735f45a192543405254a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T06:45:47Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.953115 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgpc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdb08f12-1daf-4d35-940c-914e187ffda0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgsbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgpc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:47Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.975153 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"743f3ca5-3ec0-4c5b-bad2-3e64df57da6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872997317afa0a2274e0b4ce87bdd13fad7a3291521411a8a208bfd53672163e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8afd573e70bc13a4b3d3539d466f0bc54b26139a06494ccb38d980c05f0974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de8e93ef3747888e32ccae04649a70ef2b3f1ad42133acc0137f78770c9720f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c380bb44c46b881451835d9616e2b3e76e790bbcaccdc979b9c39bb8abdc605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd6d39109c49743644fca4e67fd9e0db312e712adbfe3f7a8ae5a4f749c5945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:47Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.976565 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.976595 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.976607 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.976625 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.976636 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:47Z","lastTransitionTime":"2026-03-10T06:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:47 crc kubenswrapper[4825]: I0310 06:45:47.995802 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:47Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.011905 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8db1f9b1-6042-44a1-97b8-c0e9269f8ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8b02103b8486962e24ee333889dc83acc4379b367dc12c20ca11602879239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:48Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.029057 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b80a6aa275f1b641833c7bdc845f2def19b224d66671705d96c08b6eed3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:48Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.047046 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:48Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.065790 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ec1a7287733290068585115df1b16748bc415396154735f45a192543405254a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T06:45:48Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.079600 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.079785 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.079871 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.079960 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.080068 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:48Z","lastTransitionTime":"2026-03-10T06:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.086486 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgpc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdb08f12-1daf-4d35-940c-914e187ffda0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d002f7817bf6d5fdb95b7dc90c3eb10f4d949456ad0c472e1671f0ab97278b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgsbt\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgpc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:48Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.116263 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"743f3ca5-3ec0-4c5b-bad2-3e64df57da6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872997317afa0a2274e0b4ce87bdd13fad7a3291521411a8a208bfd53672163e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8afd573e70bc13a4b3d3539d466f0bc54b26139a06494ccb38d980c05f0974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de8e93ef3747888e32ccae04649a70ef2b3f1ad42133acc0137f78770c9720f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c380bb44c46b881451835d9616e2b3e76e790bbcaccdc979b9c39bb8abdc605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd6d39109c49743644fca4e67fd9e0db312e712adbfe3f7a8ae5a4f749c5945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-
dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:48Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.135887 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:48Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.151476 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c0cbc1e1c16536c1e26b0c607f0966b3749a7357e4cd130abd24d53903daacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b86d021111582d0c9ef202b910fe8c744a3cb1a80dd163dd41125a47decec8e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:48Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.162822 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9beb5814-89d0-47c0-8b0e-24376a358fc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf9d4029544f4f9c869193aedbe1791dd7602dfb6e86bec4792ce01cf7c6f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bb20a12b63dd94732d6f3413fa4c14b8931a2d
82e49f45769c15a14202fe1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bvt9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:48Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.182494 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.182623 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.182652 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:48 crc 
kubenswrapper[4825]: I0310 06:45:48.182686 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.182711 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:48Z","lastTransitionTime":"2026-03-10T06:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.184010 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dkbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"165351e4-3c96-4a68-8c75-43b001b0ec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420d0f058d1a0590e8feaa3f7f4dc37f6f0b91274142ca8ffd6b92c08ea24c1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5xgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dkbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:48Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.209607 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"468bfefa-2636-4ab4-b62d-2c5c738eb872\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:45:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 06:45:15.848873 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 06:45:15.849102 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 06:45:15.850424 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2193444300/tls.crt::/tmp/serving-cert-2193444300/tls.key\\\\\\\"\\\\nI0310 06:45:16.151765 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 06:45:16.162919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 06:45:16.162961 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 06:45:16.163018 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 06:45:16.163029 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 06:45:16.172887 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 06:45:16.172948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172959 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 06:45:16.172977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 06:45:16.172983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 06:45:16.172989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 06:45:16.172898 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 06:45:16.175833 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:48Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.230473 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:48Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.235375 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:45:48 crc kubenswrapper[4825]: E0310 06:45:48.235516 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.235792 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:45:48 crc kubenswrapper[4825]: E0310 06:45:48.235996 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.254479 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g445x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc29189-3e91-4d20-8d00-682a9431a8ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g445x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:48Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.278816 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"79ec9d89-dc71-4f36-9254-00bd86795e43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jhkb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:48Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.285835 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.285888 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.285898 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.285920 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.285930 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:48Z","lastTransitionTime":"2026-03-10T06:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.389460 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.389504 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.389515 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.389532 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.389545 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:48Z","lastTransitionTime":"2026-03-10T06:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.492867 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.492922 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.492935 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.492956 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.492969 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:48Z","lastTransitionTime":"2026-03-10T06:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.596029 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.596098 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.596115 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.596203 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.596260 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:48Z","lastTransitionTime":"2026-03-10T06:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.700286 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.700350 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.700367 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.700739 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.700775 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:48Z","lastTransitionTime":"2026-03-10T06:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.767647 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" event={"ID":"79ec9d89-dc71-4f36-9254-00bd86795e43","Type":"ContainerStarted","Data":"e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3"} Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.767704 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" event={"ID":"79ec9d89-dc71-4f36-9254-00bd86795e43","Type":"ContainerStarted","Data":"b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3"} Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.767716 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" event={"ID":"79ec9d89-dc71-4f36-9254-00bd86795e43","Type":"ContainerStarted","Data":"3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841"} Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.769888 4825 generic.go:334] "Generic (PLEG): container finished" podID="0fc29189-3e91-4d20-8d00-682a9431a8ef" containerID="87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7" exitCode=0 Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.771768 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g445x" event={"ID":"0fc29189-3e91-4d20-8d00-682a9431a8ef","Type":"ContainerDied","Data":"87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7"} Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.805161 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.805369 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.805386 4825 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.805284 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"743f3ca5-3ec0-4c5b-bad2-3e64df57da6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872997317afa0a2274e0b4ce87bdd13fad7a3291521411a8a208bfd53672163e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/s
tatic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8afd573e70bc13a4b3d3539d466f0bc54b26139a06494ccb38d980c05f0974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de8e93ef3747888e32ccae04649a70ef2b3f1ad42133acc0137f78770c9720f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c380bb44c46b881451835d9616e2b3e76e790bbcaccdc979b9c39bb8abdc605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866
be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd6d39109c49743644fca4e67fd9e0db312e712adbfe3f7a8ae5a4f749c5945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\
\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:4
4:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:48Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.805409 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.805692 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:48Z","lastTransitionTime":"2026-03-10T06:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.820472 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:48Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.859228 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c0cbc1e1c16536c1e26b0c607f0966b3749a7357e4cd130abd24d53903daacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b86d021111582d0c9ef202b910fe8c744a3cb1a80dd163dd41125a47decec8e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:48Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.895499 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9beb5814-89d0-47c0-8b0e-24376a358fc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf9d4029544f4f9c869193aedbe1791dd7602dfb6e86bec4792ce01cf7c6f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bb20a12b63dd94732d6f3413fa4c14b8931a2d
82e49f45769c15a14202fe1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bvt9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:48Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.911436 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.911478 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.911488 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:48 crc 
kubenswrapper[4825]: I0310 06:45:48.911508 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.911518 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:48Z","lastTransitionTime":"2026-03-10T06:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.924212 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dkbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"165351e4-3c96-4a68-8c75-43b001b0ec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420d0f058d1a0590e8feaa3f7f4dc37f6f0b91274142ca8ffd6b92c08ea24c1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5xgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dkbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:48Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.951858 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g445x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc29189-3e91-4d20-8d00-682a9431a8ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g445x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:48Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.972278 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"79ec9d89-dc71-4f36-9254-00bd86795e43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jhkb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:48Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:48 crc kubenswrapper[4825]: I0310 06:45:48.988056 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"468bfefa-2636-4ab4-b62d-2c5c738eb872\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:45:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 06:45:15.848873 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 06:45:15.849102 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 06:45:15.850424 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2193444300/tls.crt::/tmp/serving-cert-2193444300/tls.key\\\\\\\"\\\\nI0310 06:45:16.151765 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 06:45:16.162919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 06:45:16.162961 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 06:45:16.163018 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 06:45:16.163029 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 06:45:16.172887 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 06:45:16.172948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172959 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 06:45:16.172977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 06:45:16.172983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 06:45:16.172989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 06:45:16.172898 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 06:45:16.175833 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:48Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.009209 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:49Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.013866 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.013895 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.013911 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.013931 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.013943 4825 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:49Z","lastTransitionTime":"2026-03-10T06:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.021334 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ec1a7287733290068585115df1b16748bc415396154735f45a192543405254a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:49Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.037276 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgpc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdb08f12-1daf-4d35-940c-914e187ffda0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d002f7817bf6d5fdb95b7dc90c3eb10f4d949456ad0c472e1671f0ab97278b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgsbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgpc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:49Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.048105 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8db1f9b1-6042-44a1-97b8-c0e9269f8ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8b02103b8486962e24ee333889dc83acc4379b367dc12c20ca11602879239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:49Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.067488 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b80a6aa275f1b641833c7bdc845f2def19b224d66671705d96c08b6eed3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:49Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.085256 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:49Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.116841 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.116887 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.116897 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.116917 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.116930 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:49Z","lastTransitionTime":"2026-03-10T06:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.220231 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.220268 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.220278 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.220294 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.220305 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:49Z","lastTransitionTime":"2026-03-10T06:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.237699 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:45:49 crc kubenswrapper[4825]: E0310 06:45:49.237907 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.253578 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:49Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.268198 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g445x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc29189-3e91-4d20-8d00-682a9431a8ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g445x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-10T06:45:49Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.293333 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79ec9d89-dc71-4f36-9254-00bd86795e43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jhkb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:49Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.307225 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"468bfefa-2636-4ab4-b62d-2c5c738eb872\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:45:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 06:45:15.848873 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 06:45:15.849102 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 06:45:15.850424 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2193444300/tls.crt::/tmp/serving-cert-2193444300/tls.key\\\\\\\"\\\\nI0310 06:45:16.151765 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 06:45:16.162919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 06:45:16.162961 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 06:45:16.163018 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 06:45:16.163029 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 06:45:16.172887 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 06:45:16.172948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172959 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 06:45:16.172977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 06:45:16.172983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 06:45:16.172989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 06:45:16.172898 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 06:45:16.175833 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:49Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.318546 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:49Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.323411 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.323440 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.323450 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.323465 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.323477 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:49Z","lastTransitionTime":"2026-03-10T06:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.334203 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ec1a7287733290068585115df1b16748bc415396154735f45a192543405254a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:49Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.352666 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgpc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdb08f12-1daf-4d35-940c-914e187ffda0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d002f7817bf6d5fdb95b7dc90c3eb10f4d949456ad0c472e1671f0ab97278b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf
2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgsbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgpc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:49Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.363081 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8db1f9b1-6042-44a1-97b8-c0e9269f8ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8b02103b8486962e24ee333889dc83acc4379b367dc12c20ca11602879239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:49Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.376398 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b80a6aa275f1b641833c7bdc845f2def19b224d66671705d96c08b6eed3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:49Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.391827 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:49Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.417836 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"743f3ca5-3ec0-4c5b-bad2-3e64df57da6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872997317afa0a2274e0b4ce87bdd13fad7a3291521411a8a208bfd53672163e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8afd573e70bc13a4b3d3539d466f0bc54b26139a06494ccb38d980c05f0974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de8e93ef3747888e32ccae04649a70ef2b3f1ad42133acc0137f78770c9720f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c380bb44c46b881451835d9616e2b3e76e790bbcaccdc979b9c39bb8abdc605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd6d39109c49743644fca4e67fd9e0db312e712adbfe3f7a8ae5a4f749c5945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:49Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.426442 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.426510 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.426534 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.426564 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.426588 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:49Z","lastTransitionTime":"2026-03-10T06:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.438305 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dkbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"165351e4-3c96-4a68-8c75-43b001b0ec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420d0f058d1a0590e8feaa3f7f4dc37f6f0b91274142ca8ffd6b92c08ea24c1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5xgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dkbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:49Z 
is after 2025-08-24T17:21:41Z" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.455722 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c0cbc1e1c16536c1e26b0c607f0966b3749a7357e4cd130abd24d53903daacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b86d021111582d0c9ef202b910fe8c744a3cb1a80dd163dd41125a47decec8e3\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:49Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.469536 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9beb5814-89d0-47c0-8b0e-24376a358fc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf9d4029544f4f9c869193aedbe1791dd7602dfb6e86bec4792ce01cf7c6f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bb20a12b63dd94732d6f3413fa4c14b8931a2d
82e49f45769c15a14202fe1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bvt9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:49Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.529568 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.529634 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.529653 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:49 crc 
kubenswrapper[4825]: I0310 06:45:49.529684 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.529705 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:49Z","lastTransitionTime":"2026-03-10T06:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.632911 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.632980 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.632999 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.633029 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.633049 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:49Z","lastTransitionTime":"2026-03-10T06:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.736181 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.736251 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.736268 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.736316 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.736332 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:49Z","lastTransitionTime":"2026-03-10T06:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.779652 4825 generic.go:334] "Generic (PLEG): container finished" podID="0fc29189-3e91-4d20-8d00-682a9431a8ef" containerID="81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583" exitCode=0 Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.779779 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g445x" event={"ID":"0fc29189-3e91-4d20-8d00-682a9431a8ef","Type":"ContainerDied","Data":"81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583"} Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.787597 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" event={"ID":"79ec9d89-dc71-4f36-9254-00bd86795e43","Type":"ContainerStarted","Data":"c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952"} Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.787666 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" event={"ID":"79ec9d89-dc71-4f36-9254-00bd86795e43","Type":"ContainerStarted","Data":"8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54"} Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.787685 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" event={"ID":"79ec9d89-dc71-4f36-9254-00bd86795e43","Type":"ContainerStarted","Data":"8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff"} Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.808598 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"468bfefa-2636-4ab4-b62d-2c5c738eb872\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:45:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 06:45:15.848873 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 06:45:15.849102 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 06:45:15.850424 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2193444300/tls.crt::/tmp/serving-cert-2193444300/tls.key\\\\\\\"\\\\nI0310 06:45:16.151765 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 06:45:16.162919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 06:45:16.162961 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 06:45:16.163018 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 06:45:16.163029 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 06:45:16.172887 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 06:45:16.172948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172959 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 06:45:16.172977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 06:45:16.172983 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 06:45:16.172989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 06:45:16.172898 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 06:45:16.175833 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:49Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.826704 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:49Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.840021 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.840060 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.840070 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.840088 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.840103 4825 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:49Z","lastTransitionTime":"2026-03-10T06:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.853496 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g445x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc29189-3e91-4d20-8d00-682a9431a8ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g445x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:49Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 
06:45:49.876664 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79ec9d89-dc71-4f36-9254-00bd86795e43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jhkb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:49Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.892621 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8db1f9b1-6042-44a1-97b8-c0e9269f8ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8b02103b8486962e24ee333889dc83acc4379b367dc12c20ca11602879239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bde
af3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:49Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.909060 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b80a6aa275f1b641833c7bdc845f2def19b224d66671705d96c08b6eed3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:49Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.922787 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:49Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.934772 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ec1a7287733290068585115df1b16748bc415396154735f45a192543405254a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T06:45:49Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.945640 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.945692 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.945711 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.945740 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.945759 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:49Z","lastTransitionTime":"2026-03-10T06:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.947745 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgpc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdb08f12-1daf-4d35-940c-914e187ffda0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d002f7817bf6d5fdb95b7dc90c3eb10f4d949456ad0c472e1671f0ab97278b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgsbt\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgpc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:49Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.976541 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"743f3ca5-3ec0-4c5b-bad2-3e64df57da6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872997317afa0a2274e0b4ce87bdd13fad7a3291521411a8a208bfd53672163e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8afd573e70bc13a4b3d3539d466f0bc54b26139a06494ccb38d980c05f0974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de8e93ef3747888e32ccae04649a70ef2b3f1ad42133acc0137f78770c9720f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c380bb44c46b881451835d9616e2b3e76e790bbcaccdc979b9c39bb8abdc605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd6d39109c49743644fca4e67fd9e0db312e712adbfe3f7a8ae5a4f749c5945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-
dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:49Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:49 crc kubenswrapper[4825]: I0310 06:45:49.993106 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:49Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.008987 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c0cbc1e1c16536c1e26b0c607f0966b3749a7357e4cd130abd24d53903daacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b86d021111582d0c9ef202b910fe8c744a3cb1a80dd163dd41125a47decec8e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:50Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.027257 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9beb5814-89d0-47c0-8b0e-24376a358fc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf9d4029544f4f9c869193aedbe1791dd7602dfb6e86bec4792ce01cf7c6f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bb20a12b63dd94732d6f3413fa4c14b8931a2d
82e49f45769c15a14202fe1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bvt9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:50Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.047904 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dkbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"165351e4-3c96-4a68-8c75-43b001b0ec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420d0f058d1a0590e8feaa3f7f4dc37f6f0b91274142ca8ffd6b92c08ea24c1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5xgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dkbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:50Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.049426 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:50 crc 
kubenswrapper[4825]: I0310 06:45:50.049478 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.049493 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.049512 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.049526 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:50Z","lastTransitionTime":"2026-03-10T06:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.152555 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.152628 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.152692 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.152722 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.152741 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:50Z","lastTransitionTime":"2026-03-10T06:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.235670 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.235670 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:45:50 crc kubenswrapper[4825]: E0310 06:45:50.235868 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:45:50 crc kubenswrapper[4825]: E0310 06:45:50.235967 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.255904 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.255974 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.255993 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.256020 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.256037 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:50Z","lastTransitionTime":"2026-03-10T06:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.360085 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.360202 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.360223 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.360254 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.360272 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:50Z","lastTransitionTime":"2026-03-10T06:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.463925 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.463977 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.463994 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.464014 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.464028 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:50Z","lastTransitionTime":"2026-03-10T06:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.568068 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.568190 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.568214 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.568246 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.568268 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:50Z","lastTransitionTime":"2026-03-10T06:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.672236 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.672294 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.672306 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.672325 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.672341 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:50Z","lastTransitionTime":"2026-03-10T06:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.775562 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.775610 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.775622 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.775641 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.775655 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:50Z","lastTransitionTime":"2026-03-10T06:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.796091 4825 generic.go:334] "Generic (PLEG): container finished" podID="0fc29189-3e91-4d20-8d00-682a9431a8ef" containerID="c8981c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b" exitCode=0 Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.796192 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g445x" event={"ID":"0fc29189-3e91-4d20-8d00-682a9431a8ef","Type":"ContainerDied","Data":"c8981c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b"} Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.821587 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:50Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.847798 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g445x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc29189-3e91-4d20-8d00-682a9431a8ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8981c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8981c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-g445x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:50Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.869649 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.869691 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.869717 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.869739 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:45:50 crc kubenswrapper[4825]: E0310 06:45:50.869877 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 06:45:50 crc kubenswrapper[4825]: E0310 06:45:50.869895 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 06:45:50 crc kubenswrapper[4825]: E0310 06:45:50.869907 4825 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 06:45:50 crc kubenswrapper[4825]: E0310 06:45:50.869924 4825 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 06:45:50 crc kubenswrapper[4825]: E0310 06:45:50.869949 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 06:45:50 crc kubenswrapper[4825]: E0310 06:45:50.870001 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 06:45:50 crc kubenswrapper[4825]: E0310 06:45:50.870025 4825 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 06:45:50 crc kubenswrapper[4825]: E0310 06:45:50.869951 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 06:46:22.869936429 +0000 UTC m=+135.899717044 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 06:45:50 crc kubenswrapper[4825]: E0310 06:45:50.870080 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 06:46:22.870054482 +0000 UTC m=+135.899835107 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 06:45:50 crc kubenswrapper[4825]: E0310 06:45:50.870100 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-03-10 06:46:22.870089543 +0000 UTC m=+135.899870168 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 06:45:50 crc kubenswrapper[4825]: E0310 06:45:50.870186 4825 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 06:45:50 crc kubenswrapper[4825]: E0310 06:45:50.870266 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 06:46:22.870240386 +0000 UTC m=+135.900021151 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.881615 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.881691 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.881717 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.881751 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.881776 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:50Z","lastTransitionTime":"2026-03-10T06:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.882590 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79ec9d89-dc71-4f36-9254-00bd86795e43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jhkb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:50Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.907533 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"468bfefa-2636-4ab4-b62d-2c5c738eb872\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:45:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 06:45:15.848873 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 06:45:15.849102 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 06:45:15.850424 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2193444300/tls.crt::/tmp/serving-cert-2193444300/tls.key\\\\\\\"\\\\nI0310 06:45:16.151765 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 06:45:16.162919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 06:45:16.162961 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 06:45:16.163018 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 06:45:16.163029 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 06:45:16.172887 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 06:45:16.172948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172959 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 06:45:16.172977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 06:45:16.172983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 06:45:16.172989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 06:45:16.172898 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 06:45:16.175833 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:50Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.926369 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b80a6aa275f1b641833c7bdc845f2def19b224d66671705d96c08b6eed3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:50Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.947824 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:50Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.968615 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ec1a7287733290068585115df1b16748bc415396154735f45a192543405254a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T06:45:50Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.971516 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:45:50 crc kubenswrapper[4825]: E0310 06:45:50.971884 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:46:22.97185923 +0000 UTC m=+136.001639865 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.986640 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.986711 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.986732 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.986759 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.986782 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:50Z","lastTransitionTime":"2026-03-10T06:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:50 crc kubenswrapper[4825]: I0310 06:45:50.987878 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgpc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdb08f12-1daf-4d35-940c-914e187ffda0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d002f7817bf6d5fdb95b7dc90c3eb10f4d949456ad0c472e1671f0ab97278b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgsbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgpc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:50Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.007923 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8db1f9b1-6042-44a1-97b8-c0e9269f8ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8b02103b8486962e24ee333889dc83acc4379b367dc12c20ca11602879239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:51Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.025754 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:51Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.063215 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"743f3ca5-3ec0-4c5b-bad2-3e64df57da6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872997317afa0a2274e0b4ce87bdd13fad7a3291521411a8a208bfd53672163e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8afd573e70bc13a4b3d3539d466f0bc54b26139a06494ccb38d980c05f0974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de8e93ef3747888e32ccae04649a70ef2b3f1ad42133acc0137f78770c9720f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c380bb44c46b881451835d9616e2b3e76e790bbcaccdc979b9c39bb8abdc605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd6d39109c49743644fca4e67fd9e0db312e712adbfe3f7a8ae5a4f749c5945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:51Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.081250 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9beb5814-89d0-47c0-8b0e-24376a358fc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf9d4029544f4f9c869193aedbe1791dd7602dfb6e86bec4792ce01cf7c6f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bb20a12b63dd94732d6f3413fa4c14b8931a2d82e49f45769c15a14202fe1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bvt9j\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:51Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.093816 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.093901 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.093921 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.093955 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.093987 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:51Z","lastTransitionTime":"2026-03-10T06:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.096994 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dkbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"165351e4-3c96-4a68-8c75-43b001b0ec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420d0f058d1a0590e8feaa3f7f4dc37f6f0b91274142ca8ffd6b92c08ea24c1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5xgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dkbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:51Z 
is after 2025-08-24T17:21:41Z" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.118495 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c0cbc1e1c16536c1e26b0c607f0966b3749a7357e4cd130abd24d53903daacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b86d021111582d0c9ef202b910fe8c744a3cb1a80dd163dd41125a47decec8e3\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:51Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.198158 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.198241 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.198262 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.198290 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.198312 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:51Z","lastTransitionTime":"2026-03-10T06:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.236004 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:45:51 crc kubenswrapper[4825]: E0310 06:45:51.236170 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.293844 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.293931 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.293959 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.293996 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.294023 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:51Z","lastTransitionTime":"2026-03-10T06:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:51 crc kubenswrapper[4825]: E0310 06:45:51.317910 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bc7fa84-cf6d-4b6d-8f9a-118c306e0760\\\",\\\"systemUUID\\\":\\\"4821b499-aedc-4e42-be1d-4a415e5b2f81\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:51Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.323736 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.323794 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.323814 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.323842 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.323865 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:51Z","lastTransitionTime":"2026-03-10T06:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:51 crc kubenswrapper[4825]: E0310 06:45:51.344288 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bc7fa84-cf6d-4b6d-8f9a-118c306e0760\\\",\\\"systemUUID\\\":\\\"4821b499-aedc-4e42-be1d-4a415e5b2f81\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:51Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.349748 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.349827 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.349854 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.349887 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.349909 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:51Z","lastTransitionTime":"2026-03-10T06:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:51 crc kubenswrapper[4825]: E0310 06:45:51.372396 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bc7fa84-cf6d-4b6d-8f9a-118c306e0760\\\",\\\"systemUUID\\\":\\\"4821b499-aedc-4e42-be1d-4a415e5b2f81\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:51Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.377575 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.377648 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.377717 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.377756 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.377782 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:51Z","lastTransitionTime":"2026-03-10T06:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:51 crc kubenswrapper[4825]: E0310 06:45:51.402219 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bc7fa84-cf6d-4b6d-8f9a-118c306e0760\\\",\\\"systemUUID\\\":\\\"4821b499-aedc-4e42-be1d-4a415e5b2f81\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:51Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.407982 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.408338 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.408449 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.408486 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.408514 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:51Z","lastTransitionTime":"2026-03-10T06:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:51 crc kubenswrapper[4825]: E0310 06:45:51.427709 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bc7fa84-cf6d-4b6d-8f9a-118c306e0760\\\",\\\"systemUUID\\\":\\\"4821b499-aedc-4e42-be1d-4a415e5b2f81\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:51Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:51 crc kubenswrapper[4825]: E0310 06:45:51.427950 4825 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.430257 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.430340 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.430360 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.430391 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.430410 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:51Z","lastTransitionTime":"2026-03-10T06:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.533432 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.533486 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.533499 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.533519 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.533533 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:51Z","lastTransitionTime":"2026-03-10T06:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.637184 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.637571 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.637582 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.637601 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.637615 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:51Z","lastTransitionTime":"2026-03-10T06:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.741324 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.741402 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.741422 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.741448 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.741468 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:51Z","lastTransitionTime":"2026-03-10T06:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.802879 4825 generic.go:334] "Generic (PLEG): container finished" podID="0fc29189-3e91-4d20-8d00-682a9431a8ef" containerID="7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad" exitCode=0 Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.802952 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g445x" event={"ID":"0fc29189-3e91-4d20-8d00-682a9431a8ef","Type":"ContainerDied","Data":"7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad"} Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.808460 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" event={"ID":"79ec9d89-dc71-4f36-9254-00bd86795e43","Type":"ContainerStarted","Data":"c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395"} Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.834837 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"468bfefa-2636-4ab4-b62d-2c5c738eb872\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:45:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 06:45:15.848873 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 06:45:15.849102 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 06:45:15.850424 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2193444300/tls.crt::/tmp/serving-cert-2193444300/tls.key\\\\\\\"\\\\nI0310 06:45:16.151765 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 06:45:16.162919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 06:45:16.162961 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 06:45:16.163018 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 06:45:16.163029 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 06:45:16.172887 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 06:45:16.172948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172959 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 06:45:16.172977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 06:45:16.172983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 06:45:16.172989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 06:45:16.172898 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 06:45:16.175833 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:51Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.847262 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.847333 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.847352 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.847380 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.847397 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:51Z","lastTransitionTime":"2026-03-10T06:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.860935 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:51Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.885898 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g445x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc29189-3e91-4d20-8d00-682a9431a8ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8981c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8981c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g445x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:51Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.916534 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"79ec9d89-dc71-4f36-9254-00bd86795e43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jhkb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:51Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.934427 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8db1f9b1-6042-44a1-97b8-c0e9269f8ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8b02103b8486962e24ee333889dc83acc4379b367dc12c20ca11602879239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bde
af3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:51Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.950963 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.951087 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.951114 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.951179 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.951201 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:51Z","lastTransitionTime":"2026-03-10T06:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.955940 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b80a6aa275f1b641833c7bdc845f2def19b224d66671705d96c08b6eed3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:51Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.972966 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:51Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:51 crc kubenswrapper[4825]: I0310 06:45:51.987748 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ec1a7287733290068585115df1b16748bc415396154735f45a192543405254a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T06:45:51Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.000042 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgpc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdb08f12-1daf-4d35-940c-914e187ffda0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d002f7817bf6d5fdb95b7dc90c3eb10f4d949456ad0c472e1671f0ab97278b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-kgsbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgpc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:51Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.023856 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"743f3ca5-3ec0-4c5b-bad2-3e64df57da6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872997317afa0a2274e0b4ce87bdd13fad7a3291521411a8a208bfd53672163e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68
77441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8afd573e70bc13a4b3d3539d466f0bc54b26139a06494ccb38d980c05f0974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de8e93ef3747888e32ccae04649a70ef2b3f1ad42133acc0137f78770c9720f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e7790
36cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c380bb44c46b881451835d9616e2b3e76e790bbcaccdc979b9c39bb8abdc605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd6d39109c49743644fca4e67fd9e0db312e712adbfe3f7a8ae5a4f749c5945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/e
tc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69f
f51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.042736 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.055821 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.055861 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.055873 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.055895 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.055909 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:52Z","lastTransitionTime":"2026-03-10T06:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.062675 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c0cbc1e1c16536c1e26b0c607f0966b3749a7357e4cd130abd24d53903daacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://b86d021111582d0c9ef202b910fe8c744a3cb1a80dd163dd41125a47decec8e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.076011 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9beb5814-89d0-47c0-8b0e-24376a358fc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf9d4029544f4f9c869193aedbe1791dd7602dfb6e86bec4792ce01cf7c6f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bb20a12b63dd94732d6f3413fa4c14b8931a2d
82e49f45769c15a14202fe1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bvt9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.092499 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dkbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"165351e4-3c96-4a68-8c75-43b001b0ec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420d0f058d1a0590e8feaa3f7f4dc37f6f0b91274142ca8ffd6b92c08ea24c1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5xgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dkbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.160673 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:52 crc 
kubenswrapper[4825]: I0310 06:45:52.160715 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.160729 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.160748 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.160761 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:52Z","lastTransitionTime":"2026-03-10T06:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.235980 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:45:52 crc kubenswrapper[4825]: E0310 06:45:52.236207 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.235975 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:45:52 crc kubenswrapper[4825]: E0310 06:45:52.236737 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.263967 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.264018 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.264035 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.264060 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.264078 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:52Z","lastTransitionTime":"2026-03-10T06:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.367260 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.367349 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.367364 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.367387 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.367431 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:52Z","lastTransitionTime":"2026-03-10T06:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.470047 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.470104 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.470123 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.470187 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.470210 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:52Z","lastTransitionTime":"2026-03-10T06:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.574293 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.574429 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.574442 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.574461 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.574473 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:52Z","lastTransitionTime":"2026-03-10T06:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.677772 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.677826 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.677837 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.677858 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.677870 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:52Z","lastTransitionTime":"2026-03-10T06:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.781204 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.781269 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.781296 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.781329 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.781349 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:52Z","lastTransitionTime":"2026-03-10T06:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.831444 4825 generic.go:334] "Generic (PLEG): container finished" podID="0fc29189-3e91-4d20-8d00-682a9431a8ef" containerID="2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b" exitCode=0 Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.831510 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g445x" event={"ID":"0fc29189-3e91-4d20-8d00-682a9431a8ef","Type":"ContainerDied","Data":"2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b"} Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.854321 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c0cbc1e1c16536c1e26b0c607f0966b3749a7357e4cd130abd24d53903daacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\
\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b86d021111582d0c9ef202b910fe8c744a3cb1a80dd163dd41125a47decec8e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.873332 4825 
status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9beb5814-89d0-47c0-8b0e-24376a358fc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf9d4029544f4f9c869193aedbe1791dd7602dfb6e86bec4792ce01cf7c6f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acces
s-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bb20a12b63dd94732d6f3413fa4c14b8931a2d82e49f45769c15a14202fe1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bvt9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.883719 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.883795 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 
06:45:52.883809 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.883860 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.883876 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:52Z","lastTransitionTime":"2026-03-10T06:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.887098 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dkbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"165351e4-3c96-4a68-8c75-43b001b0ec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420d0f058d1a0590e8feaa
3f7f4dc37f6f0b91274142ca8ffd6b92c08ea24c1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5xgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dkbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.901317 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"468bfefa-2636-4ab4-b62d-2c5c738eb872\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:45:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 06:45:15.848873 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 06:45:15.849102 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 06:45:15.850424 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2193444300/tls.crt::/tmp/serving-cert-2193444300/tls.key\\\\\\\"\\\\nI0310 06:45:16.151765 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 06:45:16.162919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 06:45:16.162961 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 06:45:16.163018 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 06:45:16.163029 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 06:45:16.172887 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 06:45:16.172948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172959 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 06:45:16.172977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 06:45:16.172983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 06:45:16.172989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 06:45:16.172898 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 06:45:16.175833 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.914634 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.929187 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g445x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc29189-3e91-4d20-8d00-682a9431a8ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8981c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8981c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g445x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.946349 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"79ec9d89-dc71-4f36-9254-00bd86795e43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jhkb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.957868 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8db1f9b1-6042-44a1-97b8-c0e9269f8ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8b02103b8486962e24ee333889dc83acc4379b367dc12c20ca11602879239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bde
af3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.980046 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b80a6aa275f1b641833c7bdc845f2def19b224d66671705d96c08b6eed3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.986220 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.986272 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.986290 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.986317 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.986336 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:52Z","lastTransitionTime":"2026-03-10T06:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:52 crc kubenswrapper[4825]: I0310 06:45:52.995270 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:52Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.009476 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ec1a7287733290068585115df1b16748bc415396154735f45a192543405254a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T06:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.020005 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgpc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdb08f12-1daf-4d35-940c-914e187ffda0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d002f7817bf6d5fdb95b7dc90c3eb10f4d949456ad0c472e1671f0ab97278b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-kgsbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgpc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.046286 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"743f3ca5-3ec0-4c5b-bad2-3e64df57da6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872997317afa0a2274e0b4ce87bdd13fad7a3291521411a8a208bfd53672163e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68
77441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8afd573e70bc13a4b3d3539d466f0bc54b26139a06494ccb38d980c05f0974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de8e93ef3747888e32ccae04649a70ef2b3f1ad42133acc0137f78770c9720f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e7790
36cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c380bb44c46b881451835d9616e2b3e76e790bbcaccdc979b9c39bb8abdc605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd6d39109c49743644fca4e67fd9e0db312e712adbfe3f7a8ae5a4f749c5945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/e
tc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69f
f51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.068457 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.090046 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.090084 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.090094 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.090110 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.090120 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:53Z","lastTransitionTime":"2026-03-10T06:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.121497 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-7l6mg"] Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.123365 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7l6mg" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.125714 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.126001 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.126163 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.126920 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.143580 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c0cbc1e1c16536c1e26b0c607f0966b3749a7357e4cd130abd24d53903daacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b86d021111582d0c9ef202b910fe8c744a3cb1a80dd163dd41125a47decec8e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.163636 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9beb5814-89d0-47c0-8b0e-24376a358fc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf9d4029544f4f9c869193aedbe1791dd7602dfb6e86bec4792ce01cf7c6f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bb20a12b63dd94732d6f3413fa4c14b8931a2d
82e49f45769c15a14202fe1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bvt9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.180240 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dkbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"165351e4-3c96-4a68-8c75-43b001b0ec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420d0f058d1a0590e8feaa3f7f4dc37f6f0b91274142ca8ffd6b92c08ea24c1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5xgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dkbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.193263 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:53 crc 
kubenswrapper[4825]: I0310 06:45:53.193313 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.193327 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.193351 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.193365 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:53Z","lastTransitionTime":"2026-03-10T06:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.197363 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g445x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc29189-3e91-4d20-8d00-682a9431a8ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff4
96bdbeb17a863051b4338e64583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8981c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8981c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-10T06:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g445x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.221535 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"79ec9d89-dc71-4f36-9254-00bd86795e43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jhkb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.236343 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:45:53 crc kubenswrapper[4825]: E0310 06:45:53.236521 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.239901 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"468bfefa-2636-4ab4-b62d-2c5c738eb872\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:45:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 06:45:15.848873 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 06:45:15.849102 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 06:45:15.850424 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2193444300/tls.crt::/tmp/serving-cert-2193444300/tls.key\\\\\\\"\\\\nI0310 06:45:16.151765 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 06:45:16.162919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 06:45:16.162961 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 06:45:16.163018 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 06:45:16.163029 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 06:45:16.172887 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 06:45:16.172948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172959 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 06:45:16.172977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 06:45:16.172983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 06:45:16.172989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 06:45:16.172898 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 06:45:16.175833 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.258827 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.275084 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ec1a7287733290068585115df1b16748bc415396154735f45a192543405254a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T06:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.289120 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgpc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdb08f12-1daf-4d35-940c-914e187ffda0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d002f7817bf6d5fdb95b7dc90c3eb10f4d949456ad0c472e1671f0ab97278b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-kgsbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgpc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.296353 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.296387 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.296398 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.296416 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.296428 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:53Z","lastTransitionTime":"2026-03-10T06:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.302559 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/68dff16f-0214-4421-934d-dcb6e9d1af28-host\") pod \"node-ca-7l6mg\" (UID: \"68dff16f-0214-4421-934d-dcb6e9d1af28\") " pod="openshift-image-registry/node-ca-7l6mg" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.302666 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/68dff16f-0214-4421-934d-dcb6e9d1af28-serviceca\") pod \"node-ca-7l6mg\" (UID: \"68dff16f-0214-4421-934d-dcb6e9d1af28\") " pod="openshift-image-registry/node-ca-7l6mg" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.302718 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8ztd\" (UniqueName: \"kubernetes.io/projected/68dff16f-0214-4421-934d-dcb6e9d1af28-kube-api-access-w8ztd\") pod \"node-ca-7l6mg\" (UID: \"68dff16f-0214-4421-934d-dcb6e9d1af28\") " pod="openshift-image-registry/node-ca-7l6mg" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.305423 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l6mg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"68dff16f-0214-4421-934d-dcb6e9d1af28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8ztd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l6mg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.319790 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8db1f9b1-6042-44a1-97b8-c0e9269f8ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8b02103b8486962e24ee333889dc83acc4379b367dc12c20ca11602879239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.336443 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b80a6aa275f1b641833c7bdc845f2def19b224d66671705d96c08b6eed3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.355452 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.379889 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"743f3ca5-3ec0-4c5b-bad2-3e64df57da6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872997317afa0a2274e0b4ce87bdd13fad7a3291521411a8a208bfd53672163e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8afd573e70bc13a4b3d3539d466f0bc54b26139a06494ccb38d980c05f0974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de8e93ef3747888e32ccae04649a70ef2b3f1ad42133acc0137f78770c9720f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c380bb44c46b881451835d9616e2b3e76e790bbcaccdc979b9c39bb8abdc605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd6d39109c49743644fca4e67fd9e0db312e712adbfe3f7a8ae5a4f749c5945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.396445 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.399996 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.400066 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.400085 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 
06:45:53.400116 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.400166 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:53Z","lastTransitionTime":"2026-03-10T06:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.403599 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/68dff16f-0214-4421-934d-dcb6e9d1af28-host\") pod \"node-ca-7l6mg\" (UID: \"68dff16f-0214-4421-934d-dcb6e9d1af28\") " pod="openshift-image-registry/node-ca-7l6mg" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.403651 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/68dff16f-0214-4421-934d-dcb6e9d1af28-serviceca\") pod \"node-ca-7l6mg\" (UID: \"68dff16f-0214-4421-934d-dcb6e9d1af28\") " pod="openshift-image-registry/node-ca-7l6mg" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.403712 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8ztd\" (UniqueName: \"kubernetes.io/projected/68dff16f-0214-4421-934d-dcb6e9d1af28-kube-api-access-w8ztd\") pod \"node-ca-7l6mg\" (UID: \"68dff16f-0214-4421-934d-dcb6e9d1af28\") " pod="openshift-image-registry/node-ca-7l6mg" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.403706 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/68dff16f-0214-4421-934d-dcb6e9d1af28-host\") pod \"node-ca-7l6mg\" (UID: 
\"68dff16f-0214-4421-934d-dcb6e9d1af28\") " pod="openshift-image-registry/node-ca-7l6mg" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.406054 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/68dff16f-0214-4421-934d-dcb6e9d1af28-serviceca\") pod \"node-ca-7l6mg\" (UID: \"68dff16f-0214-4421-934d-dcb6e9d1af28\") " pod="openshift-image-registry/node-ca-7l6mg" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.486656 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8ztd\" (UniqueName: \"kubernetes.io/projected/68dff16f-0214-4421-934d-dcb6e9d1af28-kube-api-access-w8ztd\") pod \"node-ca-7l6mg\" (UID: \"68dff16f-0214-4421-934d-dcb6e9d1af28\") " pod="openshift-image-registry/node-ca-7l6mg" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.503401 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.503462 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.503481 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.503509 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.503526 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:53Z","lastTransitionTime":"2026-03-10T06:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.605864 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.605894 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.605903 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.605918 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.605927 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:53Z","lastTransitionTime":"2026-03-10T06:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.708831 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.708908 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.708934 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.708968 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.708994 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:53Z","lastTransitionTime":"2026-03-10T06:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.745458 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-7l6mg" Mar 10 06:45:53 crc kubenswrapper[4825]: W0310 06:45:53.768735 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68dff16f_0214_4421_934d_dcb6e9d1af28.slice/crio-6f301e6e60f9eb8839b31221f08ab234adc3856b252441f4114055a583bce084 WatchSource:0}: Error finding container 6f301e6e60f9eb8839b31221f08ab234adc3856b252441f4114055a583bce084: Status 404 returned error can't find the container with id 6f301e6e60f9eb8839b31221f08ab234adc3856b252441f4114055a583bce084 Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.812225 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.812272 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.812289 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.812316 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.812331 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:53Z","lastTransitionTime":"2026-03-10T06:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.859423 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7l6mg" event={"ID":"68dff16f-0214-4421-934d-dcb6e9d1af28","Type":"ContainerStarted","Data":"6f301e6e60f9eb8839b31221f08ab234adc3856b252441f4114055a583bce084"} Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.877354 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g445x" event={"ID":"0fc29189-3e91-4d20-8d00-682a9431a8ef","Type":"ContainerStarted","Data":"42cfe45c4f271c015a502b451ffb62ae182be1227109f50815299d20c737c5f8"} Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.920301 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" event={"ID":"79ec9d89-dc71-4f36-9254-00bd86795e43","Type":"ContainerStarted","Data":"39647cc02452b6b9dbd836e1b07a56c25b9818f32103272c5da339e42aa6c240"} Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.921288 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.921297 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.921327 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.921341 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.921356 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.921370 4825 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:53Z","lastTransitionTime":"2026-03-10T06:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.921427 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.921483 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.926552 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"743f3ca5-3ec0-4c5b-bad2-3e64df57da6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872997317afa0a2274e0b4ce87bdd13fad7a3291521411a8a208bfd53672163e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c687
7441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8afd573e70bc13a4b3d3539d466f0bc54b26139a06494ccb38d980c05f0974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de8e93ef3747888e32ccae04649a70ef2b3f1ad42133acc0137f78770c9720f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e77903
6cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c380bb44c46b881451835d9616e2b3e76e790bbcaccdc979b9c39bb8abdc605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd6d39109c49743644fca4e67fd9e0db312e712adbfe3f7a8ae5a4f749c5945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/et
c/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff
51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.942355 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.961835 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.961907 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.963887 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c0cbc1e1c16536c1e26b0c607f0966b3749a7357e4cd130abd24d53903daacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b86d021111582d0c9ef202b910fe8c744a3cb1a80dd163dd41125a47decec8e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.973617 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9beb5814-89d0-47c0-8b0e-24376a358fc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf9d4029544f4f9c869193aedbe1791dd7602dfb6e86bec4792ce01cf7c6f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bb20a12b63dd94732d6f3413fa4c14b8931a2d
82e49f45769c15a14202fe1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bvt9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:53 crc kubenswrapper[4825]: I0310 06:45:53.987281 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dkbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"165351e4-3c96-4a68-8c75-43b001b0ec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420d0f058d1a0590e8feaa3f7f4dc37f6f0b91274142ca8ffd6b92c08ea24c1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5xgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dkbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.001340 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g445x" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc29189-3e91-4d20-8d00-682a9431a8ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cfe45c4f271c015a502b451ffb62ae182be1227109f50815299d20c737c5f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33b
cc091f2ff33f0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:
45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://c8981c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8981c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g445x\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:53Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.018590 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79ec9d89-dc71-4f36-9254-00bd86795e43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jhkb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.024367 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.024410 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.024420 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.024437 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.024447 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:54Z","lastTransitionTime":"2026-03-10T06:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.035245 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"468bfefa-2636-4ab4-b62d-2c5c738eb872\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:45:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 06:45:15.848873 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 06:45:15.849102 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 06:45:15.850424 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2193444300/tls.crt::/tmp/serving-cert-2193444300/tls.key\\\\\\\"\\\\nI0310 06:45:16.151765 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 06:45:16.162919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 06:45:16.162961 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 06:45:16.163018 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 06:45:16.163029 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 06:45:16.172887 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 06:45:16.172948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172959 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 06:45:16.172977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 06:45:16.172983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 06:45:16.172989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 06:45:16.172898 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 06:45:16.175833 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.047919 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.058906 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ec1a7287733290068585115df1b16748bc415396154735f45a192543405254a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T06:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.068353 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgpc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdb08f12-1daf-4d35-940c-914e187ffda0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d002f7817bf6d5fdb95b7dc90c3eb10f4d949456ad0c472e1671f0ab97278b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-kgsbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgpc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.077699 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l6mg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68dff16f-0214-4421-934d-dcb6e9d1af28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8ztd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l6mg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.086896 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8db1f9b1-6042-44a1-97b8-c0e9269f8ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8b02103b8486962e24ee333889dc83acc4379b367dc12c20ca11602879239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.099388 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b80a6aa275f1b641833c7bdc845f2def19b224d66671705d96c08b6eed3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.111454 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.123297 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.128030 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.128092 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.128108 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:54 crc 
kubenswrapper[4825]: I0310 06:45:54.128160 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.128184 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:54Z","lastTransitionTime":"2026-03-10T06:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.135996 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ec1a7287733290068585115df1b16748bc415396154735f45a192543405254a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.146234 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgpc8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdb08f12-1daf-4d35-940c-914e187ffda0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d002f7817bf6d5fdb95b7dc90c3eb10f4d949456ad0c472e1671f0ab97278b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgsbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgpc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.156883 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l6mg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68dff16f-0214-4421-934d-dcb6e9d1af28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8ztd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l6mg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.168643 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8db1f9b1-6042-44a1-97b8-c0e9269f8ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8b02103b8486962e24ee333889dc83acc4379b367dc12c20ca11602879239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.183772 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b80a6aa275f1b641833c7bdc845f2def19b224d66671705d96c08b6eed3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.202685 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.224450 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"743f3ca5-3ec0-4c5b-bad2-3e64df57da6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872997317afa0a2274e0b4ce87bdd13fad7a3291521411a8a208bfd53672163e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8afd573e70bc13a4b3d3539d466f0bc54b26139a06494ccb38d980c05f0974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de8e93ef3747888e32ccae04649a70ef2b3f1ad42133acc0137f78770c9720f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c380bb44c46b881451835d9616e2b3e76e790bbcaccdc979b9c39bb8abdc605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd6d39109c49743644fca4e67fd9e0db312e712adbfe3f7a8ae5a4f749c5945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.231806 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.231841 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.231852 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.231871 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.231884 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:54Z","lastTransitionTime":"2026-03-10T06:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.236058 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.236095 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:45:54 crc kubenswrapper[4825]: E0310 06:45:54.236236 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:45:54 crc kubenswrapper[4825]: E0310 06:45:54.236341 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.240850 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dkbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"165351e4-3c96-4a68-8c75-43b001b0ec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420d0f058d1a0590e8feaa3f7f4dc37f6f0b91274142ca8ffd6b92c08ea24c1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"na
me\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5xgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dkbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-10T06:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.253899 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c0cbc1e1c16536c1e26b0c607f0966b3749a7357e4cd130abd24d53903daacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o:/
/b86d021111582d0c9ef202b910fe8c744a3cb1a80dd163dd41125a47decec8e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.266016 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9beb5814-89d0-47c0-8b0e-24376a358fc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf9d4029544f4f9c869193aedbe1791dd7602dfb6e86bec4792ce01cf7c6f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bb20a12b63dd94732d6f3413fa4c14b8931a2d
82e49f45769c15a14202fe1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bvt9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.283013 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.298209 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g445x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc29189-3e91-4d20-8d00-682a9431a8ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cfe45c4f271c015a502b451ffb62ae182be1227109f50815299d20c737c5f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8981
c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8981c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g445x\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.317418 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79ec9d89-dc71-4f36-9254-00bd86795e43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39647cc02452b6b9dbd836e1b07a56c25b9818f32103272c5da339e42aa6c240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jhkb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.335318 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.335371 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.335384 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.335406 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.335420 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:54Z","lastTransitionTime":"2026-03-10T06:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.341161 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"468bfefa-2636-4ab4-b62d-2c5c738eb872\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:45:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 06:45:15.848873 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 06:45:15.849102 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 06:45:15.850424 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2193444300/tls.crt::/tmp/serving-cert-2193444300/tls.key\\\\\\\"\\\\nI0310 06:45:16.151765 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 06:45:16.162919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 06:45:16.162961 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 06:45:16.163018 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 06:45:16.163029 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 06:45:16.172887 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 06:45:16.172948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172959 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 06:45:16.172977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 06:45:16.172983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 06:45:16.172989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 06:45:16.172898 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 06:45:16.175833 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.438938 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.439025 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.439049 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.439085 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.439115 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:54Z","lastTransitionTime":"2026-03-10T06:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.542834 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.542913 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.542939 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.542976 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.543002 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:54Z","lastTransitionTime":"2026-03-10T06:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.646676 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.646759 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.646780 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.646807 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.646826 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:54Z","lastTransitionTime":"2026-03-10T06:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.750422 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.750475 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.750493 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.750517 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.750537 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:54Z","lastTransitionTime":"2026-03-10T06:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.853727 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.853765 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.853774 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.853790 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.853800 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:54Z","lastTransitionTime":"2026-03-10T06:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.925247 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7l6mg" event={"ID":"68dff16f-0214-4421-934d-dcb6e9d1af28","Type":"ContainerStarted","Data":"df7bb00502b5ab82cbb301089867ff883cbfa0b30fabb0e58451aefeb0197810"} Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.955523 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"743f3ca5-3ec0-4c5b-bad2-3e64df57da6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872997317afa0a2274e0b4ce87bdd13fad7a3291521411a8a208bfd53672163e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8afd573e70bc13a4b3d3539d466f0bc54b26139a06494ccb38d980c05f0974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de8e93ef3747888e32ccae04649a70ef2b3f1ad42133acc0137f78770c9720f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\
\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c380bb44c46b881451835d9616e2b3e76e790bbcaccdc979b9c39bb8abdc605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd6d39109c49743644fca4e67fd9e0db312e712adbfe3f7a8ae5a4f749c5945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"image\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.958216 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.958290 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.958310 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.958339 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.958358 4825 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:54Z","lastTransitionTime":"2026-03-10T06:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:54 crc kubenswrapper[4825]: I0310 06:45:54.976259 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.002720 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c0cbc1e1c16536c1e26b0c607f0966b3749a7357e4cd130abd24d53903daacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b86d021111582d0c9ef202b910fe8c744a3cb1a80dd163dd41125a47decec8e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:54Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.022534 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9beb5814-89d0-47c0-8b0e-24376a358fc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf9d4029544f4f9c869193aedbe1791dd7602dfb6e86bec4792ce01cf7c6f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bb20a12b63dd94732d6f3413fa4c14b8931a2d
82e49f45769c15a14202fe1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bvt9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:55Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.050479 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dkbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"165351e4-3c96-4a68-8c75-43b001b0ec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420d0f058d1a0590e8feaa3f7f4dc37f6f0b91274142ca8ffd6b92c08ea24c1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5xgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dkbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:55Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.061503 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:55 crc 
kubenswrapper[4825]: I0310 06:45:55.061641 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.061665 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.061757 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.061799 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:55Z","lastTransitionTime":"2026-03-10T06:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.086216 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g445x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc29189-3e91-4d20-8d00-682a9431a8ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cfe45c4f271c015a502b451ffb62ae182be1227109f50815299d20c737c5f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8981
c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8981c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g445x\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:55Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.118965 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79ec9d89-dc71-4f36-9254-00bd86795e43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39647cc02452b6b9dbd836e1b07a56c25b9818f32103272c5da339e42aa6c240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jhkb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:55Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.138817 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"468bfefa-2636-4ab4-b62d-2c5c738eb872\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:45:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 06:45:15.848873 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 06:45:15.849102 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 06:45:15.850424 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2193444300/tls.crt::/tmp/serving-cert-2193444300/tls.key\\\\\\\"\\\\nI0310 06:45:16.151765 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 06:45:16.162919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 06:45:16.162961 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 06:45:16.163018 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 06:45:16.163029 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 06:45:16.172887 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 06:45:16.172948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172959 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 06:45:16.172977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 06:45:16.172983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 06:45:16.172989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 06:45:16.172898 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 06:45:16.175833 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:55Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.156200 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:55Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.165526 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.165628 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.165678 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.165708 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.165727 4825 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:55Z","lastTransitionTime":"2026-03-10T06:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.175207 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ec1a7287733290068585115df1b16748bc415396154735f45a192543405254a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:55Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.189432 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgpc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdb08f12-1daf-4d35-940c-914e187ffda0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d002f7817bf6d5fdb95b7dc90c3eb10f4d949456ad0c472e1671f0ab97278b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgsbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgpc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:55Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.201817 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l6mg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"68dff16f-0214-4421-934d-dcb6e9d1af28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7bb00502b5ab82cbb301089867ff883cbfa0b30fabb0e58451aefeb0197810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8ztd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l6mg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:55Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.211011 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8db1f9b1-6042-44a1-97b8-c0e9269f8ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8b02103b8486962e24ee333889dc83acc4379b367dc12c20ca11602879239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:55Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:55 crc 
kubenswrapper[4825]: I0310 06:45:55.229175 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b80a6aa275f1b641833c7bdc845f2def19b224d66671705d96c08b6eed3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:55Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.236513 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:45:55 crc kubenswrapper[4825]: E0310 06:45:55.236806 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.251608 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:55Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.269095 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.269185 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.269207 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:55 crc 
kubenswrapper[4825]: I0310 06:45:55.269240 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.269263 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:55Z","lastTransitionTime":"2026-03-10T06:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.371441 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.371488 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.371501 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.371521 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.371533 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:55Z","lastTransitionTime":"2026-03-10T06:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.475875 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.475940 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.475960 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.475982 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.475995 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:55Z","lastTransitionTime":"2026-03-10T06:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.579545 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.579608 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.579623 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.579644 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.579658 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:55Z","lastTransitionTime":"2026-03-10T06:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.683488 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.683565 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.683585 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.683613 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.683630 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:55Z","lastTransitionTime":"2026-03-10T06:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.787229 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.787267 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.787278 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.787297 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.787309 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:55Z","lastTransitionTime":"2026-03-10T06:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.890434 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.890473 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.890482 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.890499 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.890509 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:55Z","lastTransitionTime":"2026-03-10T06:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.993866 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.993920 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.993935 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.993957 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:55 crc kubenswrapper[4825]: I0310 06:45:55.993977 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:55Z","lastTransitionTime":"2026-03-10T06:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.096875 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.096922 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.096933 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.096951 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.096968 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:56Z","lastTransitionTime":"2026-03-10T06:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.200443 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.200489 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.200498 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.200515 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.200526 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:56Z","lastTransitionTime":"2026-03-10T06:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.235940 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:45:56 crc kubenswrapper[4825]: E0310 06:45:56.236085 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.236299 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:45:56 crc kubenswrapper[4825]: E0310 06:45:56.236348 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.302899 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.302926 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.302935 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.302950 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.302959 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:56Z","lastTransitionTime":"2026-03-10T06:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.405429 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.405466 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.405479 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.405497 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.405509 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:56Z","lastTransitionTime":"2026-03-10T06:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.507367 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.507414 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.507430 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.507454 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.507473 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:56Z","lastTransitionTime":"2026-03-10T06:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.610396 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.610443 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.610457 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.610480 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.610497 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:56Z","lastTransitionTime":"2026-03-10T06:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.714416 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.714489 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.714870 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.716108 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.716259 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:56Z","lastTransitionTime":"2026-03-10T06:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.820227 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.820838 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.821070 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.821348 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.821558 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:56Z","lastTransitionTime":"2026-03-10T06:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.926693 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.926756 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.926775 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.926800 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.926819 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:56Z","lastTransitionTime":"2026-03-10T06:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.935949 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jhkb9_79ec9d89-dc71-4f36-9254-00bd86795e43/ovnkube-controller/0.log" Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.940690 4825 generic.go:334] "Generic (PLEG): container finished" podID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerID="39647cc02452b6b9dbd836e1b07a56c25b9818f32103272c5da339e42aa6c240" exitCode=1 Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.940746 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" event={"ID":"79ec9d89-dc71-4f36-9254-00bd86795e43","Type":"ContainerDied","Data":"39647cc02452b6b9dbd836e1b07a56c25b9818f32103272c5da339e42aa6c240"} Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.941943 4825 scope.go:117] "RemoveContainer" containerID="39647cc02452b6b9dbd836e1b07a56c25b9818f32103272c5da339e42aa6c240" Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.959124 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l6mg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"68dff16f-0214-4421-934d-dcb6e9d1af28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7bb00502b5ab82cbb301089867ff883cbfa0b30fabb0e58451aefeb0197810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8ztd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l6mg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:56Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:56 crc kubenswrapper[4825]: I0310 06:45:56.984680 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8db1f9b1-6042-44a1-97b8-c0e9269f8ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8b02103b8486962e24ee333889dc83acc4379b367dc12c20ca11602879239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:56Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:57 crc 
kubenswrapper[4825]: I0310 06:45:57.008630 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b80a6aa275f1b641833c7bdc845f2def19b224d66671705d96c08b6eed3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:57Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.030535 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.030618 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.030639 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.030674 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.030699 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:57Z","lastTransitionTime":"2026-03-10T06:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.032624 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:57Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.056216 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ec1a7287733290068585115df1b16748bc415396154735f45a192543405254a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T06:45:57Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.075464 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgpc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdb08f12-1daf-4d35-940c-914e187ffda0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d002f7817bf6d5fdb95b7dc90c3eb10f4d949456ad0c472e1671f0ab97278b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-kgsbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgpc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:57Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.111328 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"743f3ca5-3ec0-4c5b-bad2-3e64df57da6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872997317afa0a2274e0b4ce87bdd13fad7a3291521411a8a208bfd53672163e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68
77441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8afd573e70bc13a4b3d3539d466f0bc54b26139a06494ccb38d980c05f0974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de8e93ef3747888e32ccae04649a70ef2b3f1ad42133acc0137f78770c9720f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e7790
36cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c380bb44c46b881451835d9616e2b3e76e790bbcaccdc979b9c39bb8abdc605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd6d39109c49743644fca4e67fd9e0db312e712adbfe3f7a8ae5a4f749c5945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/e
tc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69f
f51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:57Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.133454 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:57Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.134125 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.134401 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.134425 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.134451 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.134469 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:57Z","lastTransitionTime":"2026-03-10T06:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.161987 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c0cbc1e1c16536c1e26b0c607f0966b3749a7357e4cd130abd24d53903daacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://b86d021111582d0c9ef202b910fe8c744a3cb1a80dd163dd41125a47decec8e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:57Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.185353 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9beb5814-89d0-47c0-8b0e-24376a358fc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf9d4029544f4f9c869193aedbe1791dd7602dfb6e86bec4792ce01cf7c6f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bb20a12b63dd94732d6f3413fa4c14b8931a2d
82e49f45769c15a14202fe1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bvt9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:57Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.205779 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dkbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"165351e4-3c96-4a68-8c75-43b001b0ec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420d0f058d1a0590e8feaa3f7f4dc37f6f0b91274142ca8ffd6b92c08ea24c1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5xgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dkbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:57Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.225338 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"468bfefa-2636-4ab4-b62d-2c5c738eb872\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:45:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 06:45:15.848873 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 06:45:15.849102 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 06:45:15.850424 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2193444300/tls.crt::/tmp/serving-cert-2193444300/tls.key\\\\\\\"\\\\nI0310 06:45:16.151765 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 06:45:16.162919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 06:45:16.162961 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 06:45:16.163018 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 06:45:16.163029 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 06:45:16.172887 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 06:45:16.172948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172959 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 06:45:16.172977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 06:45:16.172983 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 06:45:16.172989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 06:45:16.172898 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 06:45:16.175833 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:57Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.235713 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:45:57 crc kubenswrapper[4825]: E0310 06:45:57.235899 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.240603 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.240679 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.240698 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.240732 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.240754 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:57Z","lastTransitionTime":"2026-03-10T06:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.248121 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:57Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.271646 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g445x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc29189-3e91-4d20-8d00-682a9431a8ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cfe45c4f271c015a502b451ffb62ae182be1227109f50815299d20c737c5f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8981
c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8981c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g445x\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:57Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.296174 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79ec9d89-dc71-4f36-9254-00bd86795e43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39647cc02452b6b9dbd836e1b07a56c25b9818f32103272c5da339e42aa6c240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39647cc02452b6b9dbd836e1b07a56c25b9818f32103272c5da339e42aa6c240\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T06:45:56Z\\\",\\\"message\\\":\\\" 6618 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 06:45:56.629453 6618 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 06:45:56.629487 6618 handler.go:190] Sending *v1.Namespace event 
handler 5 for removal\\\\nI0310 06:45:56.629511 6618 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 06:45:56.629532 6618 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 06:45:56.629524 6618 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 06:45:56.629559 6618 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 06:45:56.629584 6618 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 06:45:56.629591 6618 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 06:45:56.629614 6618 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 06:45:56.629619 6618 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 06:45:56.629635 6618 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 06:45:56.629678 6618 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 06:45:56.629686 6618 factory.go:656] Stopping watch factory\\\\nI0310 06:45:56.629697 6618 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 06:45:56.629712 6618 ovnkube.go:599] Stopped 
ovnkube\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9
d0dfae5a5a3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jhkb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:57Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.343896 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.343963 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.343984 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.344013 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.344034 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:57Z","lastTransitionTime":"2026-03-10T06:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.447404 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.447451 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.447473 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.447498 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.447517 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:57Z","lastTransitionTime":"2026-03-10T06:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.550338 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.550380 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.550392 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.550410 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.550426 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:57Z","lastTransitionTime":"2026-03-10T06:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.653144 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.653184 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.653194 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.653212 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.653228 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:57Z","lastTransitionTime":"2026-03-10T06:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.755509 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.755540 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.755548 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.755564 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.755573 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:57Z","lastTransitionTime":"2026-03-10T06:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.858557 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.858592 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.858601 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.858617 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.858625 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:57Z","lastTransitionTime":"2026-03-10T06:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.975074 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.975126 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.975169 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.975196 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.975214 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:57Z","lastTransitionTime":"2026-03-10T06:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.977917 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jhkb9_79ec9d89-dc71-4f36-9254-00bd86795e43/ovnkube-controller/0.log" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.980566 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" event={"ID":"79ec9d89-dc71-4f36-9254-00bd86795e43","Type":"ContainerStarted","Data":"c1f533409ffd69a8e918e7a7a3e7c9bc7babd7e32a79b415ef112b6892debfaf"} Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.980968 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:45:57 crc kubenswrapper[4825]: I0310 06:45:57.998045 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:57Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.008408 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ec1a7287733290068585115df1b16748bc415396154735f45a192543405254a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T06:45:58Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.020308 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgpc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdb08f12-1daf-4d35-940c-914e187ffda0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d002f7817bf6d5fdb95b7dc90c3eb10f4d949456ad0c472e1671f0ab97278b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-kgsbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgpc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:58Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.028826 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l6mg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68dff16f-0214-4421-934d-dcb6e9d1af28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7bb00502b5ab82cbb301089867ff883cbfa0b30fabb0e58451ae
feb0197810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8ztd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l6mg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:58Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.040573 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8db1f9b1-6042-44a1-97b8-c0e9269f8ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8b02103b8486962e24ee333889dc83acc4379b367dc12c20ca11602879239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:58Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.052105 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b80a6aa275f1b641833c7bdc845f2def19b224d66671705d96c08b6eed3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:58Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.062778 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:58Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.077567 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.077644 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.077663 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.078258 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.078322 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:58Z","lastTransitionTime":"2026-03-10T06:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.086479 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"743f3ca5-3ec0-4c5b-bad2-3e64df57da6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872997317afa0a2274e0b4ce87bdd13fad7a3291521411a8a208bfd53672163e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8afd573e70bc13a4b3d3539d466f0bc54b26139a06494ccb38d980c05f0974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de8e93ef3747888e32ccae04649a70ef2b3f1ad42133acc0137f78770c9720f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c380bb44c46b881451835d9616e2b3e76e790bbcaccdc979b9c39bb8abdc605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f
42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd6d39109c49743644fca4e67fd9e0db312e712adbfe3f7a8ae5a4f749c5945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"exitCode\\\":0,\\\"finis
hedAt\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:58Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.101640 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dkbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"165351e4-3c96-4a68-8c75-43b001b0ec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420d0f058d1a0590e8feaa3f7f4dc37f6f0b91274142ca8ffd6b92c08ea24c1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5xgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dkbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:58Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.114692 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c0cbc1e1c16536c1e26b0c607f0966b3749a7357e4cd130abd24d53903daacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b86d021111582d0c9ef202b910fe8c744a3cb1a80dd163dd41125a47decec8e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:58Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.129375 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9beb5814-89d0-47c0-8b0e-24376a358fc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf9d4029544f4f9c869193aedbe1791dd7602dfb6e86bec4792ce01cf7c6f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bb20a12b63dd94732d6f3413fa4c14b8931a2d
82e49f45769c15a14202fe1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bvt9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:58Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.141721 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:58Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.155384 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g445x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc29189-3e91-4d20-8d00-682a9431a8ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cfe45c4f271c015a502b451ffb62ae182be1227109f50815299d20c737c5f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8981
c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8981c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g445x\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:58Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.171852 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79ec9d89-dc71-4f36-9254-00bd86795e43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f533409ffd69a8e918e7a7a3e7c9bc7babd7e32a79b415ef112b6892debfaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39647cc02452b6b9dbd836e1b07a56c25b9818f32103272c5da339e42aa6c240\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T06:45:56Z\\\",\\\"message\\\":\\\" 6618 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 06:45:56.629453 6618 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 06:45:56.629487 6618 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 06:45:56.629511 6618 handler.go:190] Sending *v1.EgressFirewall event handler 9 for 
removal\\\\nI0310 06:45:56.629532 6618 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 06:45:56.629524 6618 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 06:45:56.629559 6618 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 06:45:56.629584 6618 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 06:45:56.629591 6618 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 06:45:56.629614 6618 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 06:45:56.629619 6618 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 06:45:56.629635 6618 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 06:45:56.629678 6618 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 06:45:56.629686 6618 factory.go:656] Stopping watch factory\\\\nI0310 06:45:56.629697 6618 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 06:45:56.629712 6618 ovnkube.go:599] Stopped 
ovnkube\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\
",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jhkb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:58Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.180955 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.180998 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.181008 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.181024 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.181034 4825 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:58Z","lastTransitionTime":"2026-03-10T06:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.185595 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"468bfefa-2636-4ab4-b62d-2c5c738eb872\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:45:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 06:45:15.848873 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 06:45:15.849102 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 06:45:15.850424 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2193444300/tls.crt::/tmp/serving-cert-2193444300/tls.key\\\\\\\"\\\\nI0310 06:45:16.151765 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 06:45:16.162919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 06:45:16.162961 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 06:45:16.163018 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 06:45:16.163029 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 06:45:16.172887 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 06:45:16.172948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172959 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 06:45:16.172977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 06:45:16.172983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 06:45:16.172989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 06:45:16.172898 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 06:45:16.175833 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:58Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.236255 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:45:58 crc kubenswrapper[4825]: E0310 06:45:58.236451 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.236703 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:45:58 crc kubenswrapper[4825]: E0310 06:45:58.236798 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.283428 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.283487 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.283499 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.283518 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.283848 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:58Z","lastTransitionTime":"2026-03-10T06:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.385955 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.385996 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.386007 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.386023 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.386034 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:58Z","lastTransitionTime":"2026-03-10T06:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.489408 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.489500 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.489520 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.489868 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.490174 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:58Z","lastTransitionTime":"2026-03-10T06:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.593591 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.593702 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.593721 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.593755 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.593773 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:58Z","lastTransitionTime":"2026-03-10T06:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.697157 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.697243 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.697280 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.697313 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.697335 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:58Z","lastTransitionTime":"2026-03-10T06:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.800433 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.800470 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.800480 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.800495 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.800504 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:58Z","lastTransitionTime":"2026-03-10T06:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.903621 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.903715 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.903733 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.903769 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.903786 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:58Z","lastTransitionTime":"2026-03-10T06:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.988707 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jhkb9_79ec9d89-dc71-4f36-9254-00bd86795e43/ovnkube-controller/1.log" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.989815 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jhkb9_79ec9d89-dc71-4f36-9254-00bd86795e43/ovnkube-controller/0.log" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.994438 4825 generic.go:334] "Generic (PLEG): container finished" podID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerID="c1f533409ffd69a8e918e7a7a3e7c9bc7babd7e32a79b415ef112b6892debfaf" exitCode=1 Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.994514 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" event={"ID":"79ec9d89-dc71-4f36-9254-00bd86795e43","Type":"ContainerDied","Data":"c1f533409ffd69a8e918e7a7a3e7c9bc7babd7e32a79b415ef112b6892debfaf"} Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.994636 4825 scope.go:117] "RemoveContainer" containerID="39647cc02452b6b9dbd836e1b07a56c25b9818f32103272c5da339e42aa6c240" Mar 10 06:45:58 crc kubenswrapper[4825]: I0310 06:45:58.995273 4825 scope.go:117] "RemoveContainer" containerID="c1f533409ffd69a8e918e7a7a3e7c9bc7babd7e32a79b415ef112b6892debfaf" Mar 10 06:45:58 crc kubenswrapper[4825]: E0310 06:45:58.995476 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-jhkb9_openshift-ovn-kubernetes(79ec9d89-dc71-4f36-9254-00bd86795e43)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.009202 4825 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.009254 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.009271 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.009294 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.009311 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:59Z","lastTransitionTime":"2026-03-10T06:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.018620 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l6mg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68dff16f-0214-4421-934d-dcb6e9d1af28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7bb00502b5ab82cbb301089867ff883cbfa0b30fabb0e58451aefeb0197810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8ztd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l6mg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.040166 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8db1f9b1-6042-44a1-97b8-c0e9269f8ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8b02103b8486962e24ee333889dc83acc4379b367dc12c20ca11602879239e\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.063889 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b80a6aa275f1b641833c7bdc845f2def19b224d66671705d96c08b6eed3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secret
s/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.087047 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.106398 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ec1a7287733290068585115df1b16748bc415396154735f45a192543405254a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.112718 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.112877 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.112952 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.113035 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.113074 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:59Z","lastTransitionTime":"2026-03-10T06:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.123906 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgpc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdb08f12-1daf-4d35-940c-914e187ffda0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d002f7817bf6d5fdb95b7dc90c3eb10f4d949456ad0c472e1671f0ab97278b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgsbt\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgpc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.151692 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"743f3ca5-3ec0-4c5b-bad2-3e64df57da6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872997317afa0a2274e0b4ce87bdd13fad7a3291521411a8a208bfd53672163e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8afd573e70bc13a4b3d3539d466f0bc54b26139a06494ccb38d980c05f0974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de8e93ef3747888e32ccae04649a70ef2b3f1ad42133acc0137f78770c9720f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c380bb44c46b881451835d9616e2b3e76e790bbcaccdc979b9c39bb8abdc605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd6d39109c49743644fca4e67fd9e0db312e712adbfe3f7a8ae5a4f749c5945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-
dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.173429 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.195643 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c0cbc1e1c16536c1e26b0c607f0966b3749a7357e4cd130abd24d53903daacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b86d021111582d0c9ef202b910fe8c744a3cb1a80dd163dd41125a47decec8e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.215063 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9beb5814-89d0-47c0-8b0e-24376a358fc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf9d4029544f4f9c869193aedbe1791dd7602dfb6e86bec4792ce01cf7c6f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bb20a12b63dd94732d6f3413fa4c14b8931a2d
82e49f45769c15a14202fe1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bvt9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.217243 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.217504 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.217693 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:59 crc 
kubenswrapper[4825]: I0310 06:45:59.217890 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.218088 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:59Z","lastTransitionTime":"2026-03-10T06:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.238473 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:45:59 crc kubenswrapper[4825]: E0310 06:45:59.238690 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.239176 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dkbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"165351e4-3c96-4a68-8c75-43b001b0ec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420d0f058d1a0590e8feaa3f7f4dc37f6f0b91274142ca8ffd6b92c08ea24c1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-releas
e\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5xgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dkbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.262886 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rhprr"] Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.263897 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rhprr" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.265788 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"468bfefa-2636-4ab4-b62d-2c5c738eb872\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:45:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 06:45:15.848873 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 06:45:15.849102 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 06:45:15.850424 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2193444300/tls.crt::/tmp/serving-cert-2193444300/tls.key\\\\\\\"\\\\nI0310 06:45:16.151765 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 06:45:16.162919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 06:45:16.162961 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 06:45:16.163018 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 06:45:16.163029 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 06:45:16.172887 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 06:45:16.172948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172959 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 06:45:16.172977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 06:45:16.172983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 06:45:16.172989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 06:45:16.172898 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 06:45:16.175833 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.266808 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.267871 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.288986 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.312545 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g445x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc29189-3e91-4d20-8d00-682a9431a8ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cfe45c4f271c015a502b451ffb62ae182be1227109f50815299d20c737c5f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8981
c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8981c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g445x\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.324771 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.324827 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.324838 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.324857 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.324867 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:59Z","lastTransitionTime":"2026-03-10T06:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.349906 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79ec9d89-dc71-4f36-9254-00bd86795e43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f533409ffd69a8e918e7a7a3e7c9bc7babd7e32a79b415ef112b6892debfaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39647cc02452b6b9dbd836e1b07a56c25b9818f32103272c5da339e42aa6c240\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T06:45:56Z\\\",\\\"message\\\":\\\" 6618 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 06:45:56.629453 6618 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 06:45:56.629487 6618 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 06:45:56.629511 6618 handler.go:190] Sending *v1.EgressFirewall event handler 9 for 
removal\\\\nI0310 06:45:56.629532 6618 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 06:45:56.629524 6618 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 06:45:56.629559 6618 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 06:45:56.629584 6618 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 06:45:56.629591 6618 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 06:45:56.629614 6618 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 06:45:56.629619 6618 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 06:45:56.629635 6618 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 06:45:56.629678 6618 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 06:45:56.629686 6618 factory.go:656] Stopping watch factory\\\\nI0310 06:45:56.629697 6618 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 06:45:56.629712 6618 ovnkube.go:599] Stopped ovnkube\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1f533409ffd69a8e918e7a7a3e7c9bc7babd7e32a79b415ef112b6892debfaf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T06:45:58Z\\\",\\\"message\\\":\\\"Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-authentication/oauth-openshift_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-authentication/oauth-openshift\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, 
AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.222\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0310 06:45:57.980346 6770 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0310 06:45:57.980349 6770 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0310 06:45:57.982720 6770 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5
a3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jhkb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.365153 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.379518 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e5e4a4c2-fcf8-42e2-a74b-50e89997ca17-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rhprr\" (UID: \"e5e4a4c2-fcf8-42e2-a74b-50e89997ca17\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rhprr" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.379618 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e5e4a4c2-fcf8-42e2-a74b-50e89997ca17-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rhprr\" (UID: \"e5e4a4c2-fcf8-42e2-a74b-50e89997ca17\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rhprr" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.379660 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k58v4\" (UniqueName: \"kubernetes.io/projected/e5e4a4c2-fcf8-42e2-a74b-50e89997ca17-kube-api-access-k58v4\") pod \"ovnkube-control-plane-749d76644c-rhprr\" (UID: \"e5e4a4c2-fcf8-42e2-a74b-50e89997ca17\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rhprr" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.379701 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e5e4a4c2-fcf8-42e2-a74b-50e89997ca17-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rhprr\" (UID: \"e5e4a4c2-fcf8-42e2-a74b-50e89997ca17\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rhprr" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.380530 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rhprr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5e4a4c2-fcf8-42e2-a74b-50e89997ca17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k58v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k58v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rhprr\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.413873 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"743f3ca5-3ec0-4c5b-bad2-3e64df57da6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872997317afa0a2274e0b4ce87bdd13fad7a3291521411a8a208bfd53672163e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8afd573e70bc13a4b3d3539d466f0bc54b26139a06494ccb38d980c05f0974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de8e93ef3747888e32ccae04649a70ef2b3f1ad42133acc0137f78770c9720f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c380bb44c46b881451835d96
16e2b3e76e790bbcaccdc979b9c39bb8abdc605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd6d39109c49743644fca4e67fd9e0db312e712adbfe3f7a8ae5a4f749c5945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.427921 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.427988 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.428005 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.428033 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.428051 4825 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:59Z","lastTransitionTime":"2026-03-10T06:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.437280 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dkbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"165351e4-3c96-4a68-8c75-43b001b0ec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420d0f058d1a0590e8feaa3f7f4dc37f6f0b91274142ca8ffd6b92c08ea24c1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5xgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:
45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dkbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.458291 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c0cbc1e1c16536c1e26b0c607f0966b3749a7357e4cd130abd24d53903daacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-c
onfig\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b86d021111582d0c9ef202b910fe8c744a3cb1a80dd163dd41125a47decec8e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.475665 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9beb5814-89d0-47c0-8b0e-24376a358fc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf9d4029544f4f9c869193aedbe1791dd7602dfb6e86bec4792ce01cf7c6f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bb20a12b63dd94732d6f3413fa4c14b8931a2d
82e49f45769c15a14202fe1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bvt9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.481196 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e5e4a4c2-fcf8-42e2-a74b-50e89997ca17-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rhprr\" (UID: \"e5e4a4c2-fcf8-42e2-a74b-50e89997ca17\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rhprr" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.481394 4825 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e5e4a4c2-fcf8-42e2-a74b-50e89997ca17-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rhprr\" (UID: \"e5e4a4c2-fcf8-42e2-a74b-50e89997ca17\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rhprr" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.481531 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e5e4a4c2-fcf8-42e2-a74b-50e89997ca17-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rhprr\" (UID: \"e5e4a4c2-fcf8-42e2-a74b-50e89997ca17\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rhprr" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.481656 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k58v4\" (UniqueName: \"kubernetes.io/projected/e5e4a4c2-fcf8-42e2-a74b-50e89997ca17-kube-api-access-k58v4\") pod \"ovnkube-control-plane-749d76644c-rhprr\" (UID: \"e5e4a4c2-fcf8-42e2-a74b-50e89997ca17\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rhprr" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.481894 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e5e4a4c2-fcf8-42e2-a74b-50e89997ca17-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rhprr\" (UID: \"e5e4a4c2-fcf8-42e2-a74b-50e89997ca17\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rhprr" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.482609 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e5e4a4c2-fcf8-42e2-a74b-50e89997ca17-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rhprr\" (UID: \"e5e4a4c2-fcf8-42e2-a74b-50e89997ca17\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rhprr" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.494417 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.495202 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e5e4a4c2-fcf8-42e2-a74b-50e89997ca17-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rhprr\" (UID: \"e5e4a4c2-fcf8-42e2-a74b-50e89997ca17\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rhprr" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.511883 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k58v4\" (UniqueName: \"kubernetes.io/projected/e5e4a4c2-fcf8-42e2-a74b-50e89997ca17-kube-api-access-k58v4\") pod \"ovnkube-control-plane-749d76644c-rhprr\" (UID: \"e5e4a4c2-fcf8-42e2-a74b-50e89997ca17\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rhprr" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 
06:45:59.515302 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g445x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc29189-3e91-4d20-8d00-682a9431a8ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cfe45c4f271c015a502b451ffb62ae182be1227109f50815299d20c737c5f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{
\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac663
6b4afbf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8981c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8981c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"
cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g445x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.530768 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.530807 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.530819 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.530836 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.530848 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:59Z","lastTransitionTime":"2026-03-10T06:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.541356 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79ec9d89-dc71-4f36-9254-00bd86795e43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f533409ffd69a8e918e7a7a3e7c9bc7babd7e32a79b415ef112b6892debfaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39647cc02452b6b9dbd836e1b07a56c25b9818f32103272c5da339e42aa6c240\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T06:45:56Z\\\",\\\"message\\\":\\\" 6618 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 06:45:56.629453 6618 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 06:45:56.629487 6618 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 06:45:56.629511 6618 handler.go:190] Sending *v1.EgressFirewall event handler 9 for 
removal\\\\nI0310 06:45:56.629532 6618 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 06:45:56.629524 6618 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 06:45:56.629559 6618 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 06:45:56.629584 6618 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 06:45:56.629591 6618 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 06:45:56.629614 6618 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 06:45:56.629619 6618 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 06:45:56.629635 6618 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 06:45:56.629678 6618 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 06:45:56.629686 6618 factory.go:656] Stopping watch factory\\\\nI0310 06:45:56.629697 6618 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 06:45:56.629712 6618 ovnkube.go:599] Stopped ovnkube\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1f533409ffd69a8e918e7a7a3e7c9bc7babd7e32a79b415ef112b6892debfaf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T06:45:58Z\\\",\\\"message\\\":\\\"Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-authentication/oauth-openshift_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-authentication/oauth-openshift\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, 
AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.222\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0310 06:45:57.980346 6770 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0310 06:45:57.980349 6770 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0310 06:45:57.982720 6770 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5
a3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jhkb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.565216 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"468bfefa-2636-4ab4-b62d-2c5c738eb872\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:45:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 06:45:15.848873 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 06:45:15.849102 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 06:45:15.850424 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2193444300/tls.crt::/tmp/serving-cert-2193444300/tls.key\\\\\\\"\\\\nI0310 06:45:16.151765 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 06:45:16.162919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 06:45:16.162961 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 06:45:16.163018 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 06:45:16.163029 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 06:45:16.172887 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 06:45:16.172948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172959 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 06:45:16.172977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 06:45:16.172983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 06:45:16.172989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 06:45:16.172898 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 06:45:16.175833 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.580952 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rhprr" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.585670 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.606979 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ec1a7287733290068585115df1b16748bc415396154735f45a192543405254a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:59 crc kubenswrapper[4825]: W0310 06:45:59.609685 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5e4a4c2_fcf8_42e2_a74b_50e89997ca17.slice/crio-cad28467561e2cc75db45d52be8f8397787c1b1bb2465065df9dbaf731ad7f3c WatchSource:0}: Error finding container cad28467561e2cc75db45d52be8f8397787c1b1bb2465065df9dbaf731ad7f3c: Status 404 returned error can't find the container with id cad28467561e2cc75db45d52be8f8397787c1b1bb2465065df9dbaf731ad7f3c Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.626838 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgpc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdb08f12-1daf-4d35-940c-914e187ffda0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d002f7817bf6d5fdb95b7dc90c3eb10f4d949456ad0c472e1671f0ab97278b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f644
5c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgsbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgpc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.633718 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.633781 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.633800 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.633827 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:59 crc 
kubenswrapper[4825]: I0310 06:45:59.633846 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:59Z","lastTransitionTime":"2026-03-10T06:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.647782 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l6mg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68dff16f-0214-4421-934d-dcb6e9d1af28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7bb00502b5ab82cbb301089867ff883cbfa0b30fabb0e58451aefeb0197810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8ztd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l6mg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.675407 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8db1f9b1-6042-44a1-97b8-c0e9269f8ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8b02103b8486962e24ee333889dc83acc4379b367dc12c20ca11602879239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.700453 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b80a6aa275f1b641833c7bdc845f2def19b224d66671705d96c08b6eed3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.724921 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"468bfefa-2636-4ab4-b62d-2c5c738eb872\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:45:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 06:45:15.848873 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 06:45:15.849102 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 06:45:15.850424 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2193444300/tls.crt::/tmp/serving-cert-2193444300/tls.key\\\\\\\"\\\\nI0310 06:45:16.151765 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 06:45:16.162919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 06:45:16.162961 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 06:45:16.163018 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 06:45:16.163029 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 06:45:16.172887 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 06:45:16.172948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172959 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 06:45:16.172977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 06:45:16.172983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 06:45:16.172989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 06:45:16.172898 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 06:45:16.175833 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.740070 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.740115 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.740151 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.740175 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.740187 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:59Z","lastTransitionTime":"2026-03-10T06:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.743386 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.759704 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g445x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc29189-3e91-4d20-8d00-682a9431a8ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cfe45c4f271c015a502b451ffb62ae182be1227109f50815299d20c737c5f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8981
c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8981c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g445x\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.781715 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79ec9d89-dc71-4f36-9254-00bd86795e43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f533409ffd69a8e918e7a7a3e7c9bc7babd7e32a79b415ef112b6892debfaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39647cc02452b6b9dbd836e1b07a56c25b9818f32103272c5da339e42aa6c240\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T06:45:56Z\\\",\\\"message\\\":\\\" 6618 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 06:45:56.629453 6618 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 06:45:56.629487 6618 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 06:45:56.629511 6618 handler.go:190] Sending *v1.EgressFirewall event handler 9 for 
removal\\\\nI0310 06:45:56.629532 6618 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 06:45:56.629524 6618 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 06:45:56.629559 6618 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 06:45:56.629584 6618 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 06:45:56.629591 6618 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 06:45:56.629614 6618 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 06:45:56.629619 6618 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 06:45:56.629635 6618 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 06:45:56.629678 6618 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 06:45:56.629686 6618 factory.go:656] Stopping watch factory\\\\nI0310 06:45:56.629697 6618 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 06:45:56.629712 6618 ovnkube.go:599] Stopped ovnkube\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1f533409ffd69a8e918e7a7a3e7c9bc7babd7e32a79b415ef112b6892debfaf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T06:45:58Z\\\",\\\"message\\\":\\\"Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-authentication/oauth-openshift_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-authentication/oauth-openshift\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, 
AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.222\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0310 06:45:57.980346 6770 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0310 06:45:57.980349 6770 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0310 06:45:57.982720 6770 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5
a3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jhkb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.794778 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l6mg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"68dff16f-0214-4421-934d-dcb6e9d1af28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7bb00502b5ab82cbb301089867ff883cbfa0b30fabb0e58451aefeb0197810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8ztd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l6mg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.804168 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8db1f9b1-6042-44a1-97b8-c0e9269f8ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8b02103b8486962e24ee333889dc83acc4379b367dc12c20ca11602879239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:59 crc 
kubenswrapper[4825]: I0310 06:45:59.817252 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b80a6aa275f1b641833c7bdc845f2def19b224d66671705d96c08b6eed3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.833780 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.843199 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.843230 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.843241 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.843260 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.843272 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:59Z","lastTransitionTime":"2026-03-10T06:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.847588 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ec1a7287733290068585115df1b16748bc415396154735f45a192543405254a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.862822 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgpc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdb08f12-1daf-4d35-940c-914e187ffda0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d002f7817bf6d5fdb95b7dc90c3eb10f4d949456ad0c472e1671f0ab97278b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgsbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgpc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.892708 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"743f3ca5-3ec0-4c5b-bad2-3e64df57da6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872997317afa0a2274e0b4ce87bdd13fad7a3291521411a8a208bfd53672163e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8afd573e70bc13a4b3d3539d466f0bc54b26139a06494ccb38d980c05f0974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de8e93ef3747888e32ccae04649a70ef2b3f1ad42133acc0137f78770c9720f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c380bb44c46b881451835d9616e2b3e76e790bbcaccdc979b9c39bb8abdc605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd6d39109c49743644fca4e67fd9e0db312e712adbfe3f7a8ae5a4f749c5945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.909647 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.921666 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rhprr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5e4a4c2-fcf8-42e2-a74b-50e89997ca17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k58v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k58v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rhprr\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.940991 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c0cbc1e1c16536c1e26b0c607f0966b3749a7357e4cd130abd24d53903daacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b86d021111582d0c9ef202b910fe8c744a3cb1a80dd163dd41125a47decec8e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.945515 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.945545 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.945557 4825 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.945577 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.945589 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:45:59Z","lastTransitionTime":"2026-03-10T06:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.956236 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9beb5814-89d0-47c0-8b0e-24376a358fc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf9d402954
4f4f9c869193aedbe1791dd7602dfb6e86bec4792ce01cf7c6f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bb20a12b63dd94732d6f3413fa4c14b8931a2d82e49f45769c15a14202fe1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bvt9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:45:59 crc kubenswrapper[4825]: I0310 06:45:59.974072 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dkbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"165351e4-3c96-4a68-8c75-43b001b0ec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420d0f058d1a0590e8feaa3f7f4dc37f6f0b91274142ca8ffd6b92c08ea24c1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5xgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dkbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:45:59Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.004162 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jhkb9_79ec9d89-dc71-4f36-9254-00bd86795e43/ovnkube-controller/1.log" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.008377 4825 scope.go:117] "RemoveContainer" containerID="c1f533409ffd69a8e918e7a7a3e7c9bc7babd7e32a79b415ef112b6892debfaf" Mar 10 06:46:00 crc kubenswrapper[4825]: E0310 06:46:00.008761 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-jhkb9_openshift-ovn-kubernetes(79ec9d89-dc71-4f36-9254-00bd86795e43)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.009200 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rhprr" event={"ID":"e5e4a4c2-fcf8-42e2-a74b-50e89997ca17","Type":"ContainerStarted","Data":"ab5785d3d924225a531d92fc17a9efa444710e8e4f2e75f4308adcd24e80d8fe"} Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.009269 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rhprr" event={"ID":"e5e4a4c2-fcf8-42e2-a74b-50e89997ca17","Type":"ContainerStarted","Data":"bc155a03c62e503236ad80c700aea4abf8ca948b38959b5f99df6965a5ad5bcf"} Mar 10 06:46:00 crc 
kubenswrapper[4825]: I0310 06:46:00.009289 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rhprr" event={"ID":"e5e4a4c2-fcf8-42e2-a74b-50e89997ca17","Type":"ContainerStarted","Data":"cad28467561e2cc75db45d52be8f8397787c1b1bb2465065df9dbaf731ad7f3c"} Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.020779 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8db1f9b1-6042-44a1-97b8-c0e9269f8ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8b02103b8486962e24ee333889dc83acc4379b367dc12c20ca11602879239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10
T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.031703 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-pj5dl"] Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.032444 4825 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:46:00 crc kubenswrapper[4825]: E0310 06:46:00.032548 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.038537 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b80a6aa275f1b641833c7bdc845f2def19b224d66671705d96c08b6eed3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"sta
te\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.048533 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.048579 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.048595 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.048617 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.048634 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:00Z","lastTransitionTime":"2026-03-10T06:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.057820 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.072604 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ec1a7287733290068585115df1b16748bc415396154735f45a192543405254a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T06:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.085945 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgpc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdb08f12-1daf-4d35-940c-914e187ffda0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d002f7817bf6d5fdb95b7dc90c3eb10f4d949456ad0c472e1671f0ab97278b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-kgsbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgpc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.094436 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnbd6\" (UniqueName: \"kubernetes.io/projected/114672c5-c1d0-4f87-b3aa-fb6d8535ffeb-kube-api-access-rnbd6\") pod \"network-metrics-daemon-pj5dl\" (UID: \"114672c5-c1d0-4f87-b3aa-fb6d8535ffeb\") " pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.094499 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/114672c5-c1d0-4f87-b3aa-fb6d8535ffeb-metrics-certs\") pod \"network-metrics-daemon-pj5dl\" (UID: \"114672c5-c1d0-4f87-b3aa-fb6d8535ffeb\") " pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.098943 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l6mg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"68dff16f-0214-4421-934d-dcb6e9d1af28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7bb00502b5ab82cbb301089867ff883cbfa0b30fabb0e58451aefeb0197810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8ztd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l6mg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.126886 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"743f3ca5-3ec0-4c5b-bad2-3e64df57da6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872997317afa0a2274e0b4ce87bdd13fad7a3291521411a8a208bfd53672163e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8afd573e70bc13a4b3d3539d466f0bc54b26139a06494ccb38d980c05f0974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de8e93ef3747888e32ccae04649a70ef2b3f1ad42133acc0137f78770c9720f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06
:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c380bb44c46b881451835d9616e2b3e76e790bbcaccdc979b9c39bb8abdc605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd6d39109c49743644fca4e67fd9e0db312e712adbfe3f7a8ae5a4f749c5945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.147041 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.151816 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.151865 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.151876 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.151896 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.151907 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:00Z","lastTransitionTime":"2026-03-10T06:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.161222 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rhprr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5e4a4c2-fcf8-42e2-a74b-50e89997ca17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k58v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k58v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rhprr\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.175463 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c0cbc1e1c16536c1e26b0c607f0966b3749a7357e4cd130abd24d53903daacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b86d021111582d0c9ef202b910fe8c744a3cb1a80dd163dd41125a47decec8e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.187191 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9beb5814-89d0-47c0-8b0e-24376a358fc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf9d4029544f4f9c869193aedbe1791dd7602dfb6e86bec4792ce01cf7c6f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bb20a12b63dd94732d6f3413fa4c14b8931a2d
82e49f45769c15a14202fe1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bvt9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.196020 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/114672c5-c1d0-4f87-b3aa-fb6d8535ffeb-metrics-certs\") pod \"network-metrics-daemon-pj5dl\" (UID: \"114672c5-c1d0-4f87-b3aa-fb6d8535ffeb\") " pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.196164 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rnbd6\" (UniqueName: \"kubernetes.io/projected/114672c5-c1d0-4f87-b3aa-fb6d8535ffeb-kube-api-access-rnbd6\") pod \"network-metrics-daemon-pj5dl\" (UID: \"114672c5-c1d0-4f87-b3aa-fb6d8535ffeb\") " pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:46:00 crc kubenswrapper[4825]: E0310 06:46:00.196254 4825 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 06:46:00 crc kubenswrapper[4825]: E0310 06:46:00.196348 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/114672c5-c1d0-4f87-b3aa-fb6d8535ffeb-metrics-certs podName:114672c5-c1d0-4f87-b3aa-fb6d8535ffeb nodeName:}" failed. No retries permitted until 2026-03-10 06:46:00.69632514 +0000 UTC m=+113.726105745 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/114672c5-c1d0-4f87-b3aa-fb6d8535ffeb-metrics-certs") pod "network-metrics-daemon-pj5dl" (UID: "114672c5-c1d0-4f87-b3aa-fb6d8535ffeb") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.202340 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dkbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"165351e4-3c96-4a68-8c75-43b001b0ec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420d0f058d1a0590e8feaa3f7f4dc37f6f0b91274142ca8ffd6b92c08ea24c1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5xgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dkbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.214577 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnbd6\" (UniqueName: 
\"kubernetes.io/projected/114672c5-c1d0-4f87-b3aa-fb6d8535ffeb-kube-api-access-rnbd6\") pod \"network-metrics-daemon-pj5dl\" (UID: \"114672c5-c1d0-4f87-b3aa-fb6d8535ffeb\") " pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.217832 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"468bfefa-2636-4ab4-b62d-2c5c738eb872\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:45:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 06:45:15.848873 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 06:45:15.849102 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 06:45:15.850424 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2193444300/tls.crt::/tmp/serving-cert-2193444300/tls.key\\\\\\\"\\\\nI0310 06:45:16.151765 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 06:45:16.162919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 06:45:16.162961 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 06:45:16.163018 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 06:45:16.163029 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 06:45:16.172887 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 06:45:16.172948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172959 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 06:45:16.172977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 06:45:16.172983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 06:45:16.172989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 06:45:16.172898 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 06:45:16.175833 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.235831 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:46:00 crc kubenswrapper[4825]: E0310 06:46:00.236035 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.236087 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:46:00 crc kubenswrapper[4825]: E0310 06:46:00.236448 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.236925 4825 scope.go:117] "RemoveContainer" containerID="414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.237975 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.255514 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.255566 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.255582 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.255603 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.255619 4825 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:00Z","lastTransitionTime":"2026-03-10T06:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.262896 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g445x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc29189-3e91-4d20-8d00-682a9431a8ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cfe45c4f271c015a502b451ffb62ae182be1227109f50815299d20c737c5f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:49Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8981c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8981c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g445x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.304561 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79ec9d89-dc71-4f36-9254-00bd86795e43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f533409ffd69a8e918e7a7a3e7c9bc7babd7e32a79b415ef112b6892debfaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1f533409ffd69a8e918e7a7a3e7c9bc7babd7e32a79b415ef112b6892debfaf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T06:45:58Z\\\",\\\"message\\\":\\\"Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-authentication/oauth-openshift_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-authentication/oauth-openshift\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.222\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0310 06:45:57.980346 6770 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0310 06:45:57.980349 6770 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0310 06:45:57.982720 6770 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jhkb9_openshift-ovn-kubernetes(79ec9d89-dc71-4f36-9254-00bd86795e43)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46
b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jhkb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.342557 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dkbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"165351e4-3c96-4a68-8c75-43b001b0ec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420d0f058d1a0590e8feaa3f7f4dc37f6f0b91274142ca8ffd6b92c08ea24c1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5xgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dkbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.363166 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:00 crc 
kubenswrapper[4825]: I0310 06:46:00.363220 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.363231 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.363250 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.363263 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:00Z","lastTransitionTime":"2026-03-10T06:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.374206 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c0cbc1e1c16536c1e26b0c607f0966b3749a7357e4cd130abd24d53903daacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b86d021111582d0c9ef202b910fe8c744a3cb1a80dd163dd41125a47decec8e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.385463 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9beb5814-89d0-47c0-8b0e-24376a358fc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf9d4029544f4f9c869193aedbe1791dd7602dfb6e86bec4792ce01cf7c6f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bb20a12b63dd94732d6f3413fa4c14b8931a2d
82e49f45769c15a14202fe1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bvt9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.398093 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.416287 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g445x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc29189-3e91-4d20-8d00-682a9431a8ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cfe45c4f271c015a502b451ffb62ae182be1227109f50815299d20c737c5f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8981
c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8981c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g445x\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.439570 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79ec9d89-dc71-4f36-9254-00bd86795e43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f533409ffd69a8e918e7a7a3e7c9bc7babd7e32a79b415ef112b6892debfaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1f533409ffd69a8e918e7a7a3e7c9bc7babd7e32a79b415ef112b6892debfaf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T06:45:58Z\\\",\\\"message\\\":\\\"Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-authentication/oauth-openshift_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-authentication/oauth-openshift\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.222\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0310 06:45:57.980346 6770 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0310 06:45:57.980349 6770 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0310 06:45:57.982720 6770 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jhkb9_openshift-ovn-kubernetes(79ec9d89-dc71-4f36-9254-00bd86795e43)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46
b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jhkb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.455058 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"468bfefa-2636-4ab4-b62d-2c5c738eb872\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:45:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 06:45:15.848873 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 06:45:15.849102 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 06:45:15.850424 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2193444300/tls.crt::/tmp/serving-cert-2193444300/tls.key\\\\\\\"\\\\nI0310 06:45:16.151765 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 06:45:16.162919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 06:45:16.162961 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 06:45:16.163018 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 06:45:16.163029 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 06:45:16.172887 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 06:45:16.172948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172959 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 06:45:16.172977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 06:45:16.172983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 06:45:16.172989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 06:45:16.172898 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 06:45:16.175833 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da1
63ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.466212 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.466274 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.466285 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.466305 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.466316 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:00Z","lastTransitionTime":"2026-03-10T06:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.471229 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.484693 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ec1a7287733290068585115df1b16748bc415396154735f45a192543405254a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T06:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.497520 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgpc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdb08f12-1daf-4d35-940c-914e187ffda0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d002f7817bf6d5fdb95b7dc90c3eb10f4d949456ad0c472e1671f0ab97278b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-kgsbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgpc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.509698 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l6mg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68dff16f-0214-4421-934d-dcb6e9d1af28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7bb00502b5ab82cbb301089867ff883cbfa0b30fabb0e58451ae
feb0197810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8ztd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l6mg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.522802 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8db1f9b1-6042-44a1-97b8-c0e9269f8ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8b02103b8486962e24ee333889dc83acc4379b367dc12c20ca11602879239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.537845 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b80a6aa275f1b641833c7bdc845f2def19b224d66671705d96c08b6eed3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.558619 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.569638 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.569711 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.569734 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.569765 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.569849 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:00Z","lastTransitionTime":"2026-03-10T06:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.579274 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rhprr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5e4a4c2-fcf8-42e2-a74b-50e89997ca17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc155a03c62e503236ad80c700aea4abf8ca948b38959b5f99df6965a5ad5bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"
ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k58v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab5785d3d924225a531d92fc17a9efa444710e8e4f2e75f4308adcd24e80d8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k58v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rhprr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.591432 4825 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-multus/network-metrics-daemon-pj5dl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"114672c5-c1d0-4f87-b3aa-fb6d8535ffeb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnbd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnbd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:46:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pj5dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:00 crc 
kubenswrapper[4825]: I0310 06:46:00.616680 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"743f3ca5-3ec0-4c5b-bad2-3e64df57da6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872997317afa0a2274e0b4ce87bdd13fad7a3291521411a8a208bfd53672163e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://ce8afd573e70bc13a4b3d3539d466f0bc54b26139a06494ccb38d980c05f0974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de8e93ef3747888e32ccae04649a70ef2b3f1ad42133acc0137f78770c9720f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c380bb44c46b881451835d9616e2b3e76e790bbcaccdc979b9c39bb8abdc605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd6d39109c49743644fca4e67fd9e0db312e712adbfe3f7a8ae5a4f749c5945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:00Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.673514 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.673756 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.673962 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.674196 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.674333 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:00Z","lastTransitionTime":"2026-03-10T06:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.702445 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/114672c5-c1d0-4f87-b3aa-fb6d8535ffeb-metrics-certs\") pod \"network-metrics-daemon-pj5dl\" (UID: \"114672c5-c1d0-4f87-b3aa-fb6d8535ffeb\") " pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:46:00 crc kubenswrapper[4825]: E0310 06:46:00.702621 4825 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 06:46:00 crc kubenswrapper[4825]: E0310 06:46:00.702717 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/114672c5-c1d0-4f87-b3aa-fb6d8535ffeb-metrics-certs podName:114672c5-c1d0-4f87-b3aa-fb6d8535ffeb nodeName:}" failed. No retries permitted until 2026-03-10 06:46:01.702687416 +0000 UTC m=+114.732468201 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/114672c5-c1d0-4f87-b3aa-fb6d8535ffeb-metrics-certs") pod "network-metrics-daemon-pj5dl" (UID: "114672c5-c1d0-4f87-b3aa-fb6d8535ffeb") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.777841 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.778435 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.778600 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.778746 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.778883 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:00Z","lastTransitionTime":"2026-03-10T06:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.882051 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.882507 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.882636 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.882973 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.883206 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:00Z","lastTransitionTime":"2026-03-10T06:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.986547 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.986597 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.986615 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.986639 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:00 crc kubenswrapper[4825]: I0310 06:46:00.986656 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:00Z","lastTransitionTime":"2026-03-10T06:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.016361 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.022093 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ddf92f28cf551cde58efe622659d4395d895a1129ac99b065975c991364f03a0"} Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.023837 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.042987 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8db1f9b1-6042-44a1-97b8-c0e9269f8ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8b02103b8486962e24ee333889dc83acc4379b367dc12c20ca11602879239e\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:01Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.066945 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b80a6aa275f1b641833c7bdc845f2def19b224d66671705d96c08b6eed3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:01Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.088980 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:01Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.090383 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.090579 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.090727 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.090870 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.091014 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:01Z","lastTransitionTime":"2026-03-10T06:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.109938 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ec1a7287733290068585115df1b16748bc415396154735f45a192543405254a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:01Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.126355 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgpc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdb08f12-1daf-4d35-940c-914e187ffda0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d002f7817bf6d5fdb95b7dc90c3eb10f4d949456ad0c472e1671f0ab97278b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgsbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgpc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:01Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.143764 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l6mg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"68dff16f-0214-4421-934d-dcb6e9d1af28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7bb00502b5ab82cbb301089867ff883cbfa0b30fabb0e58451aefeb0197810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8ztd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l6mg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:01Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.166574 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"743f3ca5-3ec0-4c5b-bad2-3e64df57da6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872997317afa0a2274e0b4ce87bdd13fad7a3291521411a8a208bfd53672163e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8afd573e70bc13a4b3d3539d466f0bc54b26139a06494ccb38d980c05f0974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de8e93ef3747888e32ccae04649a70ef2b3f1ad42133acc0137f78770c9720f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06
:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c380bb44c46b881451835d9616e2b3e76e790bbcaccdc979b9c39bb8abdc605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd6d39109c49743644fca4e67fd9e0db312e712adbfe3f7a8ae5a4f749c5945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:01Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.186694 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:01Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.194880 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.195074 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.195286 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.195518 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.195664 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:01Z","lastTransitionTime":"2026-03-10T06:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.208431 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rhprr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5e4a4c2-fcf8-42e2-a74b-50e89997ca17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc155a03c62e503236ad80c700aea4abf8ca948b38959b5f99df6965a5ad5bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"
ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k58v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab5785d3d924225a531d92fc17a9efa444710e8e4f2e75f4308adcd24e80d8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k58v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rhprr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:01Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.227334 4825 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-multus/network-metrics-daemon-pj5dl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"114672c5-c1d0-4f87-b3aa-fb6d8535ffeb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnbd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnbd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:46:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pj5dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:01Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:01 crc 
kubenswrapper[4825]: I0310 06:46:01.238868 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:46:01 crc kubenswrapper[4825]: E0310 06:46:01.239263 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.252444 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c0cbc1e1c16536c1e26b0c607f0966b3749a7357e4cd130abd24d53903daacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\
\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b86d021111582d0c9ef202b910fe8c744a3cb1a80dd163dd41125a47decec8e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:01Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:01 crc 
kubenswrapper[4825]: I0310 06:46:01.274119 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9beb5814-89d0-47c0-8b0e-24376a358fc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf9d4029544f4f9c869193aedbe1791dd7602dfb6e86bec4792ce01cf7c6f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bb20a12b63dd94732d6f3413fa4c14b8931a2d82e49f45769c15a14202fe1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bvt9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:01Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.295308 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dkbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"165351e4-3c96-4a68-8c75-43b001b0ec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420d0f058d1a0590e8feaa3f7f4dc37f6f0b91274142ca8ffd6b92c08ea24c1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5xgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dkbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:01Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.299001 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:01 crc 
kubenswrapper[4825]: I0310 06:46:01.299073 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.299219 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.299270 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.299294 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:01Z","lastTransitionTime":"2026-03-10T06:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.320757 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"468bfefa-2636-4ab4-b62d-2c5c738eb872\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf92f28cf551cde58efe622659d4395d895a1129ac99b065975c991364f03a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:45:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 06:45:15.848873 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 06:45:15.849102 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 06:45:15.850424 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2193444300/tls.crt::/tmp/serving-cert-2193444300/tls.key\\\\\\\"\\\\nI0310 06:45:16.151765 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 06:45:16.162919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 06:45:16.162961 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 06:45:16.163018 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 06:45:16.163029 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 06:45:16.172887 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 06:45:16.172948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172959 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 06:45:16.172977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 06:45:16.172983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 06:45:16.172989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 06:45:16.172898 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 06:45:16.175833 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:46:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:01Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.338505 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:01Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.358303 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g445x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc29189-3e91-4d20-8d00-682a9431a8ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cfe45c4f271c015a502b451ffb62ae182be1227109f50815299d20c737c5f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8981
c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8981c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g445x\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:01Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.379775 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79ec9d89-dc71-4f36-9254-00bd86795e43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f533409ffd69a8e918e7a7a3e7c9bc7babd7e32a79b415ef112b6892debfaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1f533409ffd69a8e918e7a7a3e7c9bc7babd7e32a79b415ef112b6892debfaf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T06:45:58Z\\\",\\\"message\\\":\\\"Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-authentication/oauth-openshift_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-authentication/oauth-openshift\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.222\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0310 06:45:57.980346 6770 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0310 06:45:57.980349 6770 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0310 06:45:57.982720 6770 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jhkb9_openshift-ovn-kubernetes(79ec9d89-dc71-4f36-9254-00bd86795e43)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46
b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jhkb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:01Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.401993 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.402073 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.402105 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.402178 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.402206 4825 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:01Z","lastTransitionTime":"2026-03-10T06:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.505439 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.505503 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.505521 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.505548 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.505566 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:01Z","lastTransitionTime":"2026-03-10T06:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.580683 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.580736 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.580773 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.580801 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.580819 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:01Z","lastTransitionTime":"2026-03-10T06:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:01 crc kubenswrapper[4825]: E0310 06:46:01.608666 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bc7fa84-cf6d-4b6d-8f9a-118c306e0760\\\",\\\"systemUUID\\\":\\\"4821b499-aedc-4e42-be1d-4a415e5b2f81\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:01Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.614011 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.614059 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.614075 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.614101 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.614119 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:01Z","lastTransitionTime":"2026-03-10T06:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:01 crc kubenswrapper[4825]: E0310 06:46:01.633204 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bc7fa84-cf6d-4b6d-8f9a-118c306e0760\\\",\\\"systemUUID\\\":\\\"4821b499-aedc-4e42-be1d-4a415e5b2f81\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:01Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.638171 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.638226 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.638246 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.638277 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.638298 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:01Z","lastTransitionTime":"2026-03-10T06:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:01 crc kubenswrapper[4825]: E0310 06:46:01.655779 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bc7fa84-cf6d-4b6d-8f9a-118c306e0760\\\",\\\"systemUUID\\\":\\\"4821b499-aedc-4e42-be1d-4a415e5b2f81\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:01Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.661075 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.661170 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.661195 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.661225 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.661243 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:01Z","lastTransitionTime":"2026-03-10T06:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:01 crc kubenswrapper[4825]: E0310 06:46:01.678732 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bc7fa84-cf6d-4b6d-8f9a-118c306e0760\\\",\\\"systemUUID\\\":\\\"4821b499-aedc-4e42-be1d-4a415e5b2f81\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:01Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.688972 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.689042 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.689061 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.689092 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.689112 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:01Z","lastTransitionTime":"2026-03-10T06:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:01 crc kubenswrapper[4825]: E0310 06:46:01.705843 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bc7fa84-cf6d-4b6d-8f9a-118c306e0760\\\",\\\"systemUUID\\\":\\\"4821b499-aedc-4e42-be1d-4a415e5b2f81\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:01Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:01 crc kubenswrapper[4825]: E0310 06:46:01.705981 4825 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.708581 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.708645 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.708658 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.708680 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.708697 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:01Z","lastTransitionTime":"2026-03-10T06:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.719570 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/114672c5-c1d0-4f87-b3aa-fb6d8535ffeb-metrics-certs\") pod \"network-metrics-daemon-pj5dl\" (UID: \"114672c5-c1d0-4f87-b3aa-fb6d8535ffeb\") " pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:46:01 crc kubenswrapper[4825]: E0310 06:46:01.719816 4825 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 06:46:01 crc kubenswrapper[4825]: E0310 06:46:01.719928 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/114672c5-c1d0-4f87-b3aa-fb6d8535ffeb-metrics-certs podName:114672c5-c1d0-4f87-b3aa-fb6d8535ffeb nodeName:}" failed. No retries permitted until 2026-03-10 06:46:03.719903631 +0000 UTC m=+116.749684246 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/114672c5-c1d0-4f87-b3aa-fb6d8535ffeb-metrics-certs") pod "network-metrics-daemon-pj5dl" (UID: "114672c5-c1d0-4f87-b3aa-fb6d8535ffeb") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.813343 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.813416 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.813439 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.813472 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.813492 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:01Z","lastTransitionTime":"2026-03-10T06:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.916603 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.916684 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.916710 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.916742 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:01 crc kubenswrapper[4825]: I0310 06:46:01.916765 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:01Z","lastTransitionTime":"2026-03-10T06:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.020528 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.020778 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.020844 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.020899 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.020923 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:02Z","lastTransitionTime":"2026-03-10T06:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.125334 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.125445 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.125484 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.125524 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.125555 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:02Z","lastTransitionTime":"2026-03-10T06:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.228357 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.228420 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.228437 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.228458 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.228473 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:02Z","lastTransitionTime":"2026-03-10T06:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.235719 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.235730 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.235872 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:46:02 crc kubenswrapper[4825]: E0310 06:46:02.236089 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:46:02 crc kubenswrapper[4825]: E0310 06:46:02.236297 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb" Mar 10 06:46:02 crc kubenswrapper[4825]: E0310 06:46:02.236374 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.331244 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.331294 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.331313 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.331336 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.331350 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:02Z","lastTransitionTime":"2026-03-10T06:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.434801 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.434895 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.434913 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.434952 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.434977 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:02Z","lastTransitionTime":"2026-03-10T06:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.538474 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.538595 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.538622 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.538658 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.538684 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:02Z","lastTransitionTime":"2026-03-10T06:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.645446 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.645512 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.645540 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.645573 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.645601 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:02Z","lastTransitionTime":"2026-03-10T06:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.748610 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.748705 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.748729 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.748763 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.748788 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:02Z","lastTransitionTime":"2026-03-10T06:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.852933 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.853004 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.853023 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.853055 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.853076 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:02Z","lastTransitionTime":"2026-03-10T06:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.957383 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.957458 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.957482 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.957518 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:02 crc kubenswrapper[4825]: I0310 06:46:02.957544 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:02Z","lastTransitionTime":"2026-03-10T06:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.060325 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.060382 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.060399 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.060429 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.060473 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:03Z","lastTransitionTime":"2026-03-10T06:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.164749 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.164811 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.164829 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.164857 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.164877 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:03Z","lastTransitionTime":"2026-03-10T06:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.235606 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:46:03 crc kubenswrapper[4825]: E0310 06:46:03.235867 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.267692 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.267732 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.267742 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.267765 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.267775 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:03Z","lastTransitionTime":"2026-03-10T06:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.371871 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.371994 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.372013 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.372036 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.372050 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:03Z","lastTransitionTime":"2026-03-10T06:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.475127 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.475230 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.475247 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.475278 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.475296 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:03Z","lastTransitionTime":"2026-03-10T06:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.579027 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.579098 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.579115 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.579171 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.579191 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:03Z","lastTransitionTime":"2026-03-10T06:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.683407 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.683481 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.683498 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.683525 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.683544 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:03Z","lastTransitionTime":"2026-03-10T06:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.749665 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/114672c5-c1d0-4f87-b3aa-fb6d8535ffeb-metrics-certs\") pod \"network-metrics-daemon-pj5dl\" (UID: \"114672c5-c1d0-4f87-b3aa-fb6d8535ffeb\") " pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:46:03 crc kubenswrapper[4825]: E0310 06:46:03.750573 4825 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 06:46:03 crc kubenswrapper[4825]: E0310 06:46:03.750774 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/114672c5-c1d0-4f87-b3aa-fb6d8535ffeb-metrics-certs podName:114672c5-c1d0-4f87-b3aa-fb6d8535ffeb nodeName:}" failed. No retries permitted until 2026-03-10 06:46:07.750739269 +0000 UTC m=+120.780519914 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/114672c5-c1d0-4f87-b3aa-fb6d8535ffeb-metrics-certs") pod "network-metrics-daemon-pj5dl" (UID: "114672c5-c1d0-4f87-b3aa-fb6d8535ffeb") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.786782 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.786868 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.786893 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.786929 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.786958 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:03Z","lastTransitionTime":"2026-03-10T06:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.890173 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.890261 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.890286 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.890314 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.890335 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:03Z","lastTransitionTime":"2026-03-10T06:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.993586 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.993650 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.993669 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.993696 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:03 crc kubenswrapper[4825]: I0310 06:46:03.993715 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:03Z","lastTransitionTime":"2026-03-10T06:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.097483 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.097543 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.097563 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.097593 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.097612 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:04Z","lastTransitionTime":"2026-03-10T06:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.202364 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.202466 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.202487 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.202517 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.202537 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:04Z","lastTransitionTime":"2026-03-10T06:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.236311 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.236406 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.236515 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:46:04 crc kubenswrapper[4825]: E0310 06:46:04.236734 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb" Mar 10 06:46:04 crc kubenswrapper[4825]: E0310 06:46:04.236926 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:46:04 crc kubenswrapper[4825]: E0310 06:46:04.237199 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.305347 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.305449 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.305468 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.305493 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.305512 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:04Z","lastTransitionTime":"2026-03-10T06:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.408816 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.408871 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.408883 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.408903 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.408916 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:04Z","lastTransitionTime":"2026-03-10T06:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.512432 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.512512 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.512530 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.512556 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.512576 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:04Z","lastTransitionTime":"2026-03-10T06:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.616073 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.616123 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.616153 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.616174 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.616190 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:04Z","lastTransitionTime":"2026-03-10T06:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.719682 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.719752 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.719770 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.719797 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.719816 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:04Z","lastTransitionTime":"2026-03-10T06:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.823224 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.823281 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.823289 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.823305 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.823320 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:04Z","lastTransitionTime":"2026-03-10T06:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.927070 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.927158 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.927174 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.927200 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:04 crc kubenswrapper[4825]: I0310 06:46:04.927218 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:04Z","lastTransitionTime":"2026-03-10T06:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.030534 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.030585 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.030599 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.030620 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.030633 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:05Z","lastTransitionTime":"2026-03-10T06:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.133762 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.134057 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.134291 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.134453 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.134598 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:05Z","lastTransitionTime":"2026-03-10T06:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.236499 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:46:05 crc kubenswrapper[4825]: E0310 06:46:05.236695 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.239397 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.239439 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.239458 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.239478 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.239495 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:05Z","lastTransitionTime":"2026-03-10T06:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.342808 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.342881 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.342902 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.342933 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.342957 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:05Z","lastTransitionTime":"2026-03-10T06:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.445914 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.446338 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.446511 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.446659 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.446797 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:05Z","lastTransitionTime":"2026-03-10T06:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.550424 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.550483 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.550500 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.550528 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.550547 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:05Z","lastTransitionTime":"2026-03-10T06:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.654653 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.654736 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.654756 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.654787 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.654811 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:05Z","lastTransitionTime":"2026-03-10T06:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.758366 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.758733 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.758876 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.759021 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.759195 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:05Z","lastTransitionTime":"2026-03-10T06:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.862545 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.862643 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.862661 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.862692 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.862720 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:05Z","lastTransitionTime":"2026-03-10T06:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.966475 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.966831 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.966921 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.967030 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:05 crc kubenswrapper[4825]: I0310 06:46:05.967159 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:05Z","lastTransitionTime":"2026-03-10T06:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.070513 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.070973 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.071176 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.071331 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.071461 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:06Z","lastTransitionTime":"2026-03-10T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.175896 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.175979 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.176001 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.176036 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.176056 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:06Z","lastTransitionTime":"2026-03-10T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.236211 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.236336 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:46:06 crc kubenswrapper[4825]: E0310 06:46:06.236454 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.236228 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:46:06 crc kubenswrapper[4825]: E0310 06:46:06.236742 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:46:06 crc kubenswrapper[4825]: E0310 06:46:06.236875 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb" Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.280838 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.280916 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.280942 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.280973 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.280995 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:06Z","lastTransitionTime":"2026-03-10T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.384184 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.384235 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.384252 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.384276 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.384293 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:06Z","lastTransitionTime":"2026-03-10T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.488645 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.488730 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.488748 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.488777 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.488797 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:06Z","lastTransitionTime":"2026-03-10T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.592604 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.592693 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.592712 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.592743 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.592787 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:06Z","lastTransitionTime":"2026-03-10T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.696610 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.697123 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.697378 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.697531 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.697670 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:06Z","lastTransitionTime":"2026-03-10T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.801812 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.801912 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.801937 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.801972 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.802010 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:06Z","lastTransitionTime":"2026-03-10T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.914933 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.915590 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.915667 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.916002 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:06 crc kubenswrapper[4825]: I0310 06:46:06.916028 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:06Z","lastTransitionTime":"2026-03-10T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.019409 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.019491 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.019511 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.019539 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.019624 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:07Z","lastTransitionTime":"2026-03-10T06:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.123494 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.123563 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.123585 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.123618 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.123642 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:07Z","lastTransitionTime":"2026-03-10T06:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.226818 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.226892 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.226931 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.226969 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.226993 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:07Z","lastTransitionTime":"2026-03-10T06:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.236477 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:46:07 crc kubenswrapper[4825]: E0310 06:46:07.236707 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.329876 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.329950 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.329969 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.329999 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.330020 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:07Z","lastTransitionTime":"2026-03-10T06:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.433722 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.433789 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.433829 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.433857 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.433875 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:07Z","lastTransitionTime":"2026-03-10T06:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.538015 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.538089 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.538109 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.538173 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.538196 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:07Z","lastTransitionTime":"2026-03-10T06:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.642013 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.642070 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.642082 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.642103 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.642121 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:07Z","lastTransitionTime":"2026-03-10T06:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.745465 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.745535 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.745557 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.745589 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.745614 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:07Z","lastTransitionTime":"2026-03-10T06:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.833361 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/114672c5-c1d0-4f87-b3aa-fb6d8535ffeb-metrics-certs\") pod \"network-metrics-daemon-pj5dl\" (UID: \"114672c5-c1d0-4f87-b3aa-fb6d8535ffeb\") " pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:46:07 crc kubenswrapper[4825]: E0310 06:46:07.833582 4825 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 06:46:07 crc kubenswrapper[4825]: E0310 06:46:07.833736 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/114672c5-c1d0-4f87-b3aa-fb6d8535ffeb-metrics-certs podName:114672c5-c1d0-4f87-b3aa-fb6d8535ffeb nodeName:}" failed. No retries permitted until 2026-03-10 06:46:15.83370025 +0000 UTC m=+128.863480895 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/114672c5-c1d0-4f87-b3aa-fb6d8535ffeb-metrics-certs") pod "network-metrics-daemon-pj5dl" (UID: "114672c5-c1d0-4f87-b3aa-fb6d8535ffeb") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.852153 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.852191 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.852199 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.852216 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.852225 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:07Z","lastTransitionTime":"2026-03-10T06:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.956330 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.956387 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.956400 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.956424 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:07 crc kubenswrapper[4825]: I0310 06:46:07.956438 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:07Z","lastTransitionTime":"2026-03-10T06:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.059406 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.059492 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.059513 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.059544 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.059563 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:08Z","lastTransitionTime":"2026-03-10T06:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.162879 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.162982 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.163013 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.163051 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.163080 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:08Z","lastTransitionTime":"2026-03-10T06:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.235647 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.235642 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.235803 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:46:08 crc kubenswrapper[4825]: E0310 06:46:08.235896 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:46:08 crc kubenswrapper[4825]: E0310 06:46:08.236086 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb" Mar 10 06:46:08 crc kubenswrapper[4825]: E0310 06:46:08.236278 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.266421 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.266478 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.266494 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.266520 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.266535 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:08Z","lastTransitionTime":"2026-03-10T06:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.370100 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.370209 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.370227 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.370254 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.370272 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:08Z","lastTransitionTime":"2026-03-10T06:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.474188 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.474229 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.474238 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.474254 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.474264 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:08Z","lastTransitionTime":"2026-03-10T06:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.577317 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.577362 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.577375 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.577395 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.577408 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:08Z","lastTransitionTime":"2026-03-10T06:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.680049 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.680119 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.680169 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.680198 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.680217 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:08Z","lastTransitionTime":"2026-03-10T06:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.782367 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.782424 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.782442 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.782466 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.782483 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:08Z","lastTransitionTime":"2026-03-10T06:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.885853 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.885918 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.885935 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.885961 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.885981 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:08Z","lastTransitionTime":"2026-03-10T06:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.989595 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.989648 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.989665 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.989692 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:08 crc kubenswrapper[4825]: I0310 06:46:08.989709 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:08Z","lastTransitionTime":"2026-03-10T06:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:09 crc kubenswrapper[4825]: I0310 06:46:09.092961 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:09 crc kubenswrapper[4825]: I0310 06:46:09.093054 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:09 crc kubenswrapper[4825]: I0310 06:46:09.093078 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:09 crc kubenswrapper[4825]: I0310 06:46:09.093113 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:09 crc kubenswrapper[4825]: I0310 06:46:09.093169 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:09Z","lastTransitionTime":"2026-03-10T06:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:46:09 crc kubenswrapper[4825]: E0310 06:46:09.193438 4825 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 10 06:46:09 crc kubenswrapper[4825]: I0310 06:46:09.235845 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:46:09 crc kubenswrapper[4825]: E0310 06:46:09.236467 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:46:09 crc kubenswrapper[4825]: I0310 06:46:09.259287 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ec1a7287733290068585115df1b16748bc415396154735f45a192543405254a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:09Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:09 crc kubenswrapper[4825]: I0310 06:46:09.277792 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgpc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdb08f12-1daf-4d35-940c-914e187ffda0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d002f7817bf6d5fdb95b7dc90c3eb10f4d949456ad0c472e1671f0ab97278b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"d
ns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgsbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgpc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:09Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:09 crc kubenswrapper[4825]: I0310 06:46:09.293872 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l6mg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"68dff16f-0214-4421-934d-dcb6e9d1af28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7bb00502b5ab82cbb301089867ff883cbfa0b30fabb0e58451aefeb0197810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8ztd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l6mg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:09Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:09 crc kubenswrapper[4825]: I0310 06:46:09.313535 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8db1f9b1-6042-44a1-97b8-c0e9269f8ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8b02103b8486962e24ee333889dc83acc4379b367dc12c20ca11602879239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:09Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:09 crc 
kubenswrapper[4825]: I0310 06:46:09.335643 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b80a6aa275f1b641833c7bdc845f2def19b224d66671705d96c08b6eed3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:09Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:09 crc kubenswrapper[4825]: I0310 06:46:09.356700 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:09Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:09 crc kubenswrapper[4825]: I0310 06:46:09.374876 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rhprr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5e4a4c2-fcf8-42e2-a74b-50e89997ca17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc155a03c62e503236ad80c700aea4abf8ca948b38959b5f99df6965a5ad5bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k58v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab5785d3d924225a531d92fc17a9efa444710
e8e4f2e75f4308adcd24e80d8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k58v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rhprr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:09Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:09 crc kubenswrapper[4825]: I0310 06:46:09.395348 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pj5dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"114672c5-c1d0-4f87-b3aa-fb6d8535ffeb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnbd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnbd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:46:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pj5dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:09Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:09 crc 
kubenswrapper[4825]: I0310 06:46:09.431473 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"743f3ca5-3ec0-4c5b-bad2-3e64df57da6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872997317afa0a2274e0b4ce87bdd13fad7a3291521411a8a208bfd53672163e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://ce8afd573e70bc13a4b3d3539d466f0bc54b26139a06494ccb38d980c05f0974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de8e93ef3747888e32ccae04649a70ef2b3f1ad42133acc0137f78770c9720f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c380bb44c46b881451835d9616e2b3e76e790bbcaccdc979b9c39bb8abdc605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd6d39109c49743644fca4e67fd9e0db312e712adbfe3f7a8ae5a4f749c5945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:09Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:09 crc kubenswrapper[4825]: I0310 06:46:09.454846 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:09Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:09 crc kubenswrapper[4825]: I0310 06:46:09.478357 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c0cbc1e1c16536c1e26b0c607f0966b3749a7357e4cd130abd24d53903daacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b86d021111582d0c9ef202b910fe8c744a3cb1a80dd163dd41125a47decec8e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:09Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:09 crc kubenswrapper[4825]: E0310 06:46:09.492689 4825 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 06:46:09 crc kubenswrapper[4825]: I0310 06:46:09.513449 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9beb5814-89d0-47c0-8b0e-24376a358fc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf9d4029544f4f9c869193aedbe1791dd7602dfb6e86bec4792ce01cf7c6f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bb20a12b63dd94732d6f3413fa4c14b8931a2d82e49f45769c15a14202fe1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bvt9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:09Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:09 crc kubenswrapper[4825]: I0310 06:46:09.536903 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dkbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"165351e4-3c96-4a68-8c75-43b001b0ec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420d0f058d1a0590e8feaa3f7f4dc37f6f0b91274142ca8ffd6b92c08ea24c1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5xgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dkbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:09Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:09 crc kubenswrapper[4825]: I0310 06:46:09.562115 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g445x" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc29189-3e91-4d20-8d00-682a9431a8ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cfe45c4f271c015a502b451ffb62ae182be1227109f50815299d20c737c5f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33b
cc091f2ff33f0b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:
45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://c8981c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8981c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g445x\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:09Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:09 crc kubenswrapper[4825]: I0310 06:46:09.595350 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79ec9d89-dc71-4f36-9254-00bd86795e43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f533409ffd69a8e918e7a7a3e7c9bc7babd7e32a79b415ef112b6892debfaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1f533409ffd69a8e918e7a7a3e7c9bc7babd7e32a79b415ef112b6892debfaf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T06:45:58Z\\\",\\\"message\\\":\\\"Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-authentication/oauth-openshift_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-authentication/oauth-openshift\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.222\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0310 06:45:57.980346 6770 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0310 06:45:57.980349 6770 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0310 06:45:57.982720 6770 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jhkb9_openshift-ovn-kubernetes(79ec9d89-dc71-4f36-9254-00bd86795e43)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46
b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jhkb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:09Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:09 crc kubenswrapper[4825]: I0310 06:46:09.622432 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"468bfefa-2636-4ab4-b62d-2c5c738eb872\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf92f28cf551cde58efe622659d4395d895a1129ac99b065975c991364f03a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:45:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 06:45:15.848873 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 06:45:15.849102 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 06:45:15.850424 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2193444300/tls.crt::/tmp/serving-cert-2193444300/tls.key\\\\\\\"\\\\nI0310 06:45:16.151765 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 06:45:16.162919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 06:45:16.162961 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 06:45:16.163018 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 06:45:16.163029 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 06:45:16.172887 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 06:45:16.172948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172959 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 06:45:16.172977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 06:45:16.172983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 06:45:16.172989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 06:45:16.172898 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 06:45:16.175833 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:46:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:09Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:09 crc kubenswrapper[4825]: I0310 06:46:09.645485 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:09Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:10 crc kubenswrapper[4825]: I0310 06:46:10.235915 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:46:10 crc kubenswrapper[4825]: I0310 06:46:10.236080 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:46:10 crc kubenswrapper[4825]: E0310 06:46:10.236190 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb" Mar 10 06:46:10 crc kubenswrapper[4825]: E0310 06:46:10.236303 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:46:10 crc kubenswrapper[4825]: I0310 06:46:10.236780 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:46:10 crc kubenswrapper[4825]: E0310 06:46:10.237052 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:46:11 crc kubenswrapper[4825]: I0310 06:46:11.236043 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:46:11 crc kubenswrapper[4825]: I0310 06:46:11.237536 4825 scope.go:117] "RemoveContainer" containerID="c1f533409ffd69a8e918e7a7a3e7c9bc7babd7e32a79b415ef112b6892debfaf" Mar 10 06:46:11 crc kubenswrapper[4825]: E0310 06:46:11.237903 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:46:11 crc kubenswrapper[4825]: I0310 06:46:11.965869 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:11 crc kubenswrapper[4825]: I0310 06:46:11.965929 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:11 crc kubenswrapper[4825]: I0310 06:46:11.965942 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:11 crc kubenswrapper[4825]: I0310 06:46:11.965963 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:11 crc kubenswrapper[4825]: I0310 06:46:11.965982 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:11Z","lastTransitionTime":"2026-03-10T06:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:11 crc kubenswrapper[4825]: E0310 06:46:11.990713 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bc7fa84-cf6d-4b6d-8f9a-118c306e0760\\\",\\\"systemUUID\\\":\\\"4821b499-aedc-4e42-be1d-4a415e5b2f81\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:11Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:11 crc kubenswrapper[4825]: I0310 06:46:11.996278 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:11 crc kubenswrapper[4825]: I0310 06:46:11.996340 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:11 crc kubenswrapper[4825]: I0310 06:46:11.996356 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:11 crc kubenswrapper[4825]: I0310 06:46:11.996386 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:11 crc kubenswrapper[4825]: I0310 06:46:11.996406 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:11Z","lastTransitionTime":"2026-03-10T06:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:12 crc kubenswrapper[4825]: E0310 06:46:12.018337 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bc7fa84-cf6d-4b6d-8f9a-118c306e0760\\\",\\\"systemUUID\\\":\\\"4821b499-aedc-4e42-be1d-4a415e5b2f81\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:12Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:12 crc kubenswrapper[4825]: I0310 06:46:12.023231 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:12 crc kubenswrapper[4825]: I0310 06:46:12.023293 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:12 crc kubenswrapper[4825]: I0310 06:46:12.023310 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:12 crc kubenswrapper[4825]: I0310 06:46:12.023332 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:12 crc kubenswrapper[4825]: I0310 06:46:12.023344 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:12Z","lastTransitionTime":"2026-03-10T06:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:12 crc kubenswrapper[4825]: E0310 06:46:12.040076 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bc7fa84-cf6d-4b6d-8f9a-118c306e0760\\\",\\\"systemUUID\\\":\\\"4821b499-aedc-4e42-be1d-4a415e5b2f81\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:12Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:12 crc kubenswrapper[4825]: I0310 06:46:12.045119 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:12 crc kubenswrapper[4825]: I0310 06:46:12.045218 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:12 crc kubenswrapper[4825]: I0310 06:46:12.045241 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:12 crc kubenswrapper[4825]: I0310 06:46:12.045334 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:12 crc kubenswrapper[4825]: I0310 06:46:12.045356 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:12Z","lastTransitionTime":"2026-03-10T06:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:12 crc kubenswrapper[4825]: I0310 06:46:12.073172 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jhkb9_79ec9d89-dc71-4f36-9254-00bd86795e43/ovnkube-controller/1.log" Mar 10 06:46:12 crc kubenswrapper[4825]: E0310 06:46:12.074194 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae66
9\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\
\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":
485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bc7fa84-cf6d-4b6d-8f9a-118c306e0760\\\",\\\"sys
temUUID\\\":\\\"4821b499-aedc-4e42-be1d-4a415e5b2f81\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:12Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:12 crc kubenswrapper[4825]: I0310 06:46:12.076798 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" event={"ID":"79ec9d89-dc71-4f36-9254-00bd86795e43","Type":"ContainerStarted","Data":"ca803889cd06db9b70ad534a668658ae8d914f540bc74177d6a34726a8e1765e"} Mar 10 06:46:12 crc kubenswrapper[4825]: I0310 06:46:12.078180 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:46:12 crc kubenswrapper[4825]: I0310 06:46:12.081786 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:12 crc kubenswrapper[4825]: I0310 06:46:12.081851 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:12 crc kubenswrapper[4825]: I0310 06:46:12.081869 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:12 crc kubenswrapper[4825]: I0310 06:46:12.081895 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:12 crc kubenswrapper[4825]: I0310 06:46:12.081915 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:12Z","lastTransitionTime":"2026-03-10T06:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 06:46:12 crc kubenswrapper[4825]: I0310 06:46:12.097724 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c0cbc1e1c16536c1e26b0c607f0966b3749a7357e4cd130abd24d53903daacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b86d021111582d0c9ef202
b910fe8c744a3cb1a80dd163dd41125a47decec8e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:12Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:12 crc kubenswrapper[4825]: E0310 06:46:12.109012 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bc7fa84-cf6d-4b6d-8f9a-118c306e0760\\\",\\\"systemUUID\\\":\\\"4821b499-aedc-4e42-be1d-4a415e5b2f81\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:12Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:12 crc kubenswrapper[4825]: E0310 06:46:12.109177 4825 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 06:46:12 crc kubenswrapper[4825]: I0310 06:46:12.116288 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9beb5814-89d0-47c0-8b0e-24376a358fc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf9d4029544f4f9c869193aedbe1791dd7602dfb6e86bec4792ce01cf7c6f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bb20a12b63dd94732d6f3413fa4c14b8931a2d82e49f45769c15a14202fe1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bvt9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:12Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:12 crc kubenswrapper[4825]: I0310 06:46:12.137822 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dkbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"165351e4-3c96-4a68-8c75-43b001b0ec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420d0f058d1a0590e8feaa3f7f4dc37f6f0b91274142ca8ffd6b92c08ea24c1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"na
me\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5xgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dkbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:12Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:12 crc kubenswrapper[4825]: I0310 06:46:12.159975 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79ec9d89-dc71-4f36-9254-00bd86795e43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca803889cd06db9b70ad534a668658ae8d914f540bc74177d6a34726a8e1765e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1f533409ffd69a8e918e7a7a3e7c9bc7babd7e32a79b415ef112b6892debfaf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T06:45:58Z\\\",\\\"message\\\":\\\"Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-authentication/oauth-openshift_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-authentication/oauth-openshift\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.222\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0310 06:45:57.980346 6770 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0310 06:45:57.980349 6770 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0310 06:45:57.982720 6770 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jhkb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:12Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:12 crc kubenswrapper[4825]: I0310 06:46:12.176687 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"468bfefa-2636-4ab4-b62d-2c5c738eb872\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf92f28cf551cde58efe622659d4395d895a1129ac99b065975c991364f03a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:45:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 06:45:15.848873 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 06:45:15.849102 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 06:45:15.850424 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2193444300/tls.crt::/tmp/serving-cert-2193444300/tls.key\\\\\\\"\\\\nI0310 06:45:16.151765 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 06:45:16.162919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 06:45:16.162961 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 06:45:16.163018 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 06:45:16.163029 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 06:45:16.172887 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 06:45:16.172948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172959 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 06:45:16.172977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 06:45:16.172983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 06:45:16.172989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 06:45:16.172898 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 06:45:16.175833 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:46:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:12Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:12 crc kubenswrapper[4825]: I0310 06:46:12.189797 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:12Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:12 crc kubenswrapper[4825]: I0310 06:46:12.206797 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g445x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc29189-3e91-4d20-8d00-682a9431a8ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cfe45c4f271c015a502b451ffb62ae182be1227109f50815299d20c737c5f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8981
c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8981c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g445x\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:12Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:12 crc kubenswrapper[4825]: I0310 06:46:12.219834 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgpc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdb08f12-1daf-4d35-940c-914e187ffda0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d002f7817bf6d5fdb95b7dc90c3eb10f4d949456ad0c472e1671f0ab97278b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-10T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgsbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgpc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:12Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:12 crc kubenswrapper[4825]: I0310 06:46:12.235749 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:46:12 crc kubenswrapper[4825]: I0310 06:46:12.235910 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:46:12 crc kubenswrapper[4825]: E0310 06:46:12.235950 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:46:12 crc kubenswrapper[4825]: I0310 06:46:12.236189 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:46:12 crc kubenswrapper[4825]: E0310 06:46:12.236251 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb" Mar 10 06:46:12 crc kubenswrapper[4825]: E0310 06:46:12.236316 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:46:12 crc kubenswrapper[4825]: I0310 06:46:12.513768 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l6mg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"68dff16f-0214-4421-934d-dcb6e9d1af28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7bb00502b5ab82cbb301089867ff883cbfa0b30fabb0e58451aefeb0197810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8ztd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l6mg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:12Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:12 crc kubenswrapper[4825]: I0310 06:46:12.532349 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8db1f9b1-6042-44a1-97b8-c0e9269f8ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8b02103b8486962e24ee333889dc83acc4379b367dc12c20ca11602879239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:12Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:12 crc 
kubenswrapper[4825]: I0310 06:46:12.554299 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b80a6aa275f1b641833c7bdc845f2def19b224d66671705d96c08b6eed3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:12Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:12 crc kubenswrapper[4825]: I0310 06:46:12.576378 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:12Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:12 crc kubenswrapper[4825]: I0310 06:46:12.598262 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ec1a7287733290068585115df1b16748bc415396154735f45a192543405254a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T06:46:12Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:12 crc kubenswrapper[4825]: I0310 06:46:12.617203 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pj5dl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"114672c5-c1d0-4f87-b3aa-fb6d8535ffeb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnbd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnbd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:46:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pj5dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:12Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:12 crc 
kubenswrapper[4825]: I0310 06:46:12.653795 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"743f3ca5-3ec0-4c5b-bad2-3e64df57da6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872997317afa0a2274e0b4ce87bdd13fad7a3291521411a8a208bfd53672163e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://ce8afd573e70bc13a4b3d3539d466f0bc54b26139a06494ccb38d980c05f0974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de8e93ef3747888e32ccae04649a70ef2b3f1ad42133acc0137f78770c9720f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c380bb44c46b881451835d9616e2b3e76e790bbcaccdc979b9c39bb8abdc605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd6d39109c49743644fca4e67fd9e0db312e712adbfe3f7a8ae5a4f749c5945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:12Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:12 crc kubenswrapper[4825]: I0310 06:46:12.679060 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:12Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:12 crc kubenswrapper[4825]: I0310 06:46:12.700302 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rhprr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5e4a4c2-fcf8-42e2-a74b-50e89997ca17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc155a03c62e503236ad80c700aea4abf8ca948b38959b5f99df6965a5ad5bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k58v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab5785d3d924225a531d92fc17a9efa444710
e8e4f2e75f4308adcd24e80d8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k58v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rhprr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:12Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:13 crc kubenswrapper[4825]: I0310 06:46:13.085164 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jhkb9_79ec9d89-dc71-4f36-9254-00bd86795e43/ovnkube-controller/2.log" Mar 10 06:46:13 crc kubenswrapper[4825]: I0310 06:46:13.086276 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jhkb9_79ec9d89-dc71-4f36-9254-00bd86795e43/ovnkube-controller/1.log" Mar 10 06:46:13 crc kubenswrapper[4825]: I0310 06:46:13.095396 4825 generic.go:334] "Generic (PLEG): container finished" podID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerID="ca803889cd06db9b70ad534a668658ae8d914f540bc74177d6a34726a8e1765e" exitCode=1 Mar 10 06:46:13 crc kubenswrapper[4825]: I0310 06:46:13.095433 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" event={"ID":"79ec9d89-dc71-4f36-9254-00bd86795e43","Type":"ContainerDied","Data":"ca803889cd06db9b70ad534a668658ae8d914f540bc74177d6a34726a8e1765e"} Mar 10 06:46:13 crc kubenswrapper[4825]: I0310 06:46:13.095668 4825 scope.go:117] "RemoveContainer" containerID="c1f533409ffd69a8e918e7a7a3e7c9bc7babd7e32a79b415ef112b6892debfaf" Mar 10 06:46:13 crc kubenswrapper[4825]: I0310 06:46:13.096486 4825 scope.go:117] "RemoveContainer" containerID="ca803889cd06db9b70ad534a668658ae8d914f540bc74177d6a34726a8e1765e" Mar 10 06:46:13 crc kubenswrapper[4825]: E0310 06:46:13.096805 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-jhkb9_openshift-ovn-kubernetes(79ec9d89-dc71-4f36-9254-00bd86795e43)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" Mar 10 06:46:13 crc kubenswrapper[4825]: I0310 06:46:13.121199 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c0cbc1e1c16536c1e26b0c607f0966b3749a7357e4cd130abd24d53903daacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b86d021111582d0c9ef202b910fe8c744a3cb1a80dd163dd41125a47decec8e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:13Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:13 crc kubenswrapper[4825]: I0310 06:46:13.140222 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9beb5814-89d0-47c0-8b0e-24376a358fc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf9d4029544f4f9c869193aedbe1791dd7602dfb6e86bec4792ce01cf7c6f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bb20a12b63dd94732d6f3413fa4c14b8931a2d
82e49f45769c15a14202fe1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bvt9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:13Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:13 crc kubenswrapper[4825]: I0310 06:46:13.154203 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dkbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"165351e4-3c96-4a68-8c75-43b001b0ec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420d0f058d1a0590e8feaa3f7f4dc37f6f0b91274142ca8ffd6b92c08ea24c1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5xgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dkbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:13Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:13 crc kubenswrapper[4825]: I0310 06:46:13.172516 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"468bfefa-2636-4ab4-b62d-2c5c738eb872\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf92f28cf551cde58efe622659d4395d895a1129ac99b065975c991364f03a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:45:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 06:45:15.848873 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 06:45:15.849102 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 06:45:15.850424 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2193444300/tls.crt::/tmp/serving-cert-2193444300/tls.key\\\\\\\"\\\\nI0310 06:45:16.151765 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 06:45:16.162919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 06:45:16.162961 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 06:45:16.163018 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 06:45:16.163029 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 06:45:16.172887 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 06:45:16.172948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172959 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 06:45:16.172977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 06:45:16.172983 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 06:45:16.172989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 06:45:16.172898 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 06:45:16.175833 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:46:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:13Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:13 crc kubenswrapper[4825]: I0310 06:46:13.190312 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:13Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:13 crc kubenswrapper[4825]: I0310 06:46:13.216172 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g445x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc29189-3e91-4d20-8d00-682a9431a8ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cfe45c4f271c015a502b451ffb62ae182be1227109f50815299d20c737c5f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8981
c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8981c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g445x\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:13Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:13 crc kubenswrapper[4825]: I0310 06:46:13.236040 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:46:13 crc kubenswrapper[4825]: E0310 06:46:13.236443 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:46:13 crc kubenswrapper[4825]: I0310 06:46:13.250412 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79ec9d89-dc71-4f36-9254-00bd86795e43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca803889cd06db9b70ad534a668658ae8d914f540bc74177d6a34726a8e1765e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1f533409ffd69a8e918e7a7a3e7c9bc7babd7e32a79b415ef112b6892debfaf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T06:45:58Z\\\",\\\"message\\\":\\\"Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-authentication/oauth-openshift_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-authentication/oauth-openshift\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.222\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0310 06:45:57.980346 6770 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0310 06:45:57.980349 6770 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0310 06:45:57.982720 6770 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca803889cd06db9b70ad534a668658ae8d914f540bc74177d6a34726a8e1765e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T06:46:12Z\\\",\\\"message\\\":\\\"om 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 06:46:12.855516 7040 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 06:46:12.855908 7040 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 06:46:12.855945 7040 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 06:46:12.856623 7040 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 06:46:12.856685 7040 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 06:46:12.856711 7040 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 06:46:12.856767 7040 factory.go:656] Stopping watch factory\\\\nI0310 06:46:12.856790 7040 ovnkube.go:599] Stopped ovnkube\\\\nI0310 06:46:12.856840 7040 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 06:46:12.856855 7040 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 
06:46:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a
5a3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jhkb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:13Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:13 crc kubenswrapper[4825]: I0310 06:46:13.268073 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8db1f9b1-6042-44a1-97b8-c0e9269f8ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8b02103b8486962e24ee333889dc83acc4379b367dc12c20ca11602879239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:13Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:13 crc kubenswrapper[4825]: I0310 06:46:13.284814 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b80a6aa275f1b641833c7bdc845f2def19b224d66671705d96c08b6eed3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:13Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:13 crc kubenswrapper[4825]: I0310 06:46:13.302267 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:13Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:13 crc kubenswrapper[4825]: I0310 06:46:13.315769 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ec1a7287733290068585115df1b16748bc415396154735f45a192543405254a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T06:46:13Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:13 crc kubenswrapper[4825]: I0310 06:46:13.332928 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgpc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdb08f12-1daf-4d35-940c-914e187ffda0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d002f7817bf6d5fdb95b7dc90c3eb10f4d949456ad0c472e1671f0ab97278b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-kgsbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgpc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:13Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:13 crc kubenswrapper[4825]: I0310 06:46:13.346214 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l6mg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68dff16f-0214-4421-934d-dcb6e9d1af28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7bb00502b5ab82cbb301089867ff883cbfa0b30fabb0e58451ae
feb0197810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8ztd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l6mg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:13Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:13 crc kubenswrapper[4825]: I0310 06:46:13.369681 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"743f3ca5-3ec0-4c5b-bad2-3e64df57da6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872997317afa0a2274e0b4ce87bdd13fad7a3291521411a8a208bfd53672163e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8afd573e70bc13a4b3d3539d466f0bc54b26139a06494ccb38d980c05f0974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de8e93ef3747888e32ccae04649a70ef2b3f1ad42133acc0137f78770c9720f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c380bb44c46b881451835d9616e2b3e76e790bbcaccdc979b9c39bb8abdc605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd6d39109c49743644fca4e67fd9e0db312e712adbfe3f7a8ae5a4f749c5945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:13Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:13 crc kubenswrapper[4825]: I0310 06:46:13.388039 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:13Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:13 crc kubenswrapper[4825]: I0310 06:46:13.405168 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rhprr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5e4a4c2-fcf8-42e2-a74b-50e89997ca17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc155a03c62e503236ad80c700aea4abf8ca948b38959b5f99df6965a5ad5bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k58v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab5785d3d924225a531d92fc17a9efa444710
e8e4f2e75f4308adcd24e80d8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k58v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rhprr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:13Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:13 crc kubenswrapper[4825]: I0310 06:46:13.422544 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pj5dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"114672c5-c1d0-4f87-b3aa-fb6d8535ffeb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnbd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnbd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:46:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pj5dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:13Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:14 crc 
kubenswrapper[4825]: I0310 06:46:14.104279 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jhkb9_79ec9d89-dc71-4f36-9254-00bd86795e43/ovnkube-controller/2.log" Mar 10 06:46:14 crc kubenswrapper[4825]: I0310 06:46:14.111414 4825 scope.go:117] "RemoveContainer" containerID="ca803889cd06db9b70ad534a668658ae8d914f540bc74177d6a34726a8e1765e" Mar 10 06:46:14 crc kubenswrapper[4825]: E0310 06:46:14.111743 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-jhkb9_openshift-ovn-kubernetes(79ec9d89-dc71-4f36-9254-00bd86795e43)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" Mar 10 06:46:14 crc kubenswrapper[4825]: I0310 06:46:14.130027 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:14 crc kubenswrapper[4825]: I0310 06:46:14.155636 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g445x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc29189-3e91-4d20-8d00-682a9431a8ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cfe45c4f271c015a502b451ffb62ae182be1227109f50815299d20c737c5f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8981
c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8981c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g445x\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:14 crc kubenswrapper[4825]: I0310 06:46:14.189263 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79ec9d89-dc71-4f36-9254-00bd86795e43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca803889cd06db9b70ad534a668658ae8d914f540bc74177d6a34726a8e1765e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca803889cd06db9b70ad534a668658ae8d914f540bc74177d6a34726a8e1765e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T06:46:12Z\\\",\\\"message\\\":\\\"om github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 06:46:12.855516 7040 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 06:46:12.855908 7040 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 06:46:12.855945 7040 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 06:46:12.856623 7040 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 06:46:12.856685 7040 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 06:46:12.856711 7040 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 06:46:12.856767 7040 factory.go:656] Stopping watch factory\\\\nI0310 06:46:12.856790 7040 ovnkube.go:599] Stopped ovnkube\\\\nI0310 06:46:12.856840 7040 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 06:46:12.856855 7040 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 06:46:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:46:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jhkb9_openshift-ovn-kubernetes(79ec9d89-dc71-4f36-9254-00bd86795e43)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46
b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jhkb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:14 crc kubenswrapper[4825]: I0310 06:46:14.213586 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"468bfefa-2636-4ab4-b62d-2c5c738eb872\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf92f28cf551cde58efe622659d4395d895a1129ac99b065975c991364f03a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:45:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 06:45:15.848873 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 06:45:15.849102 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 06:45:15.850424 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2193444300/tls.crt::/tmp/serving-cert-2193444300/tls.key\\\\\\\"\\\\nI0310 06:45:16.151765 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 06:45:16.162919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 06:45:16.162961 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 06:45:16.163018 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 06:45:16.163029 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 06:45:16.172887 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 06:45:16.172948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172959 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 06:45:16.172977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 06:45:16.172983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 06:45:16.172989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 06:45:16.172898 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 06:45:16.175833 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:46:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:14 crc kubenswrapper[4825]: I0310 06:46:14.235831 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:46:14 crc kubenswrapper[4825]: I0310 06:46:14.235894 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:46:14 crc kubenswrapper[4825]: I0310 06:46:14.235896 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b80a6aa275f1b641833c7bdc845f2def19b224d66671705d96c08b6eed3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":
\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:14 crc kubenswrapper[4825]: E0310 06:46:14.236106 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:46:14 crc kubenswrapper[4825]: I0310 06:46:14.236236 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:46:14 crc kubenswrapper[4825]: E0310 06:46:14.236424 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:46:14 crc kubenswrapper[4825]: E0310 06:46:14.236615 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb" Mar 10 06:46:14 crc kubenswrapper[4825]: I0310 06:46:14.251792 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:14 crc kubenswrapper[4825]: I0310 06:46:14.269334 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ec1a7287733290068585115df1b16748bc415396154735f45a192543405254a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T06:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:14 crc kubenswrapper[4825]: I0310 06:46:14.287092 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgpc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdb08f12-1daf-4d35-940c-914e187ffda0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d002f7817bf6d5fdb95b7dc90c3eb10f4d949456ad0c472e1671f0ab97278b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-kgsbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgpc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:14 crc kubenswrapper[4825]: I0310 06:46:14.306804 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l6mg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68dff16f-0214-4421-934d-dcb6e9d1af28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7bb00502b5ab82cbb301089867ff883cbfa0b30fabb0e58451ae
feb0197810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8ztd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l6mg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:14 crc kubenswrapper[4825]: I0310 06:46:14.324660 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8db1f9b1-6042-44a1-97b8-c0e9269f8ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8b02103b8486962e24ee333889dc83acc4379b367dc12c20ca11602879239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:14 crc kubenswrapper[4825]: I0310 06:46:14.347125 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:14 crc kubenswrapper[4825]: I0310 06:46:14.366173 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rhprr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5e4a4c2-fcf8-42e2-a74b-50e89997ca17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc155a03c62e503236ad80c700aea4abf8ca948b38959b5f99df6965a5ad5bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k58v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab5785d3d924225a531d92fc17a9efa444710
e8e4f2e75f4308adcd24e80d8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k58v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rhprr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:14 crc kubenswrapper[4825]: I0310 06:46:14.381409 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pj5dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"114672c5-c1d0-4f87-b3aa-fb6d8535ffeb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnbd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnbd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:46:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pj5dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:14 crc 
kubenswrapper[4825]: I0310 06:46:14.416637 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"743f3ca5-3ec0-4c5b-bad2-3e64df57da6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872997317afa0a2274e0b4ce87bdd13fad7a3291521411a8a208bfd53672163e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://ce8afd573e70bc13a4b3d3539d466f0bc54b26139a06494ccb38d980c05f0974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de8e93ef3747888e32ccae04649a70ef2b3f1ad42133acc0137f78770c9720f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c380bb44c46b881451835d9616e2b3e76e790bbcaccdc979b9c39bb8abdc605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd6d39109c49743644fca4e67fd9e0db312e712adbfe3f7a8ae5a4f749c5945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:14 crc kubenswrapper[4825]: I0310 06:46:14.436674 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9beb5814-89d0-47c0-8b0e-24376a358fc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf9d4029544f4f9c869193aedbe1791dd7602dfb6e86bec4792ce01cf7c6f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bb20a12b63dd94732d6f3413fa4c14b8931a2d
82e49f45769c15a14202fe1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bvt9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:14 crc kubenswrapper[4825]: I0310 06:46:14.460053 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dkbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"165351e4-3c96-4a68-8c75-43b001b0ec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420d0f058d1a0590e8feaa3f7f4dc37f6f0b91274142ca8ffd6b92c08ea24c1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5xgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dkbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:14 crc kubenswrapper[4825]: I0310 06:46:14.477874 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c0cbc1e1c16536c1e26b0c607f0966b3749a7357e4cd130abd24d53903daacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b86d021111582d0c9ef202b910fe8c744a3cb1a80dd163dd41125a47decec8e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:14Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:14 crc kubenswrapper[4825]: E0310 06:46:14.494812 4825 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 06:46:15 crc kubenswrapper[4825]: I0310 06:46:15.235786 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:46:15 crc kubenswrapper[4825]: E0310 06:46:15.236033 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:46:15 crc kubenswrapper[4825]: I0310 06:46:15.917933 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/114672c5-c1d0-4f87-b3aa-fb6d8535ffeb-metrics-certs\") pod \"network-metrics-daemon-pj5dl\" (UID: \"114672c5-c1d0-4f87-b3aa-fb6d8535ffeb\") " pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:46:15 crc kubenswrapper[4825]: E0310 06:46:15.918552 4825 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 06:46:15 crc kubenswrapper[4825]: E0310 06:46:15.918807 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/114672c5-c1d0-4f87-b3aa-fb6d8535ffeb-metrics-certs podName:114672c5-c1d0-4f87-b3aa-fb6d8535ffeb nodeName:}" failed. No retries permitted until 2026-03-10 06:46:31.918778537 +0000 UTC m=+144.948559182 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/114672c5-c1d0-4f87-b3aa-fb6d8535ffeb-metrics-certs") pod "network-metrics-daemon-pj5dl" (UID: "114672c5-c1d0-4f87-b3aa-fb6d8535ffeb") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 06:46:16 crc kubenswrapper[4825]: I0310 06:46:16.235388 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:46:16 crc kubenswrapper[4825]: I0310 06:46:16.235481 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:46:16 crc kubenswrapper[4825]: E0310 06:46:16.235622 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:46:16 crc kubenswrapper[4825]: I0310 06:46:16.235395 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:46:16 crc kubenswrapper[4825]: E0310 06:46:16.235747 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb" Mar 10 06:46:16 crc kubenswrapper[4825]: E0310 06:46:16.235843 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:46:16 crc kubenswrapper[4825]: I0310 06:46:16.514481 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 06:46:16 crc kubenswrapper[4825]: I0310 06:46:16.539194 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:16Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:16 crc kubenswrapper[4825]: I0310 06:46:16.566545 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g445x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc29189-3e91-4d20-8d00-682a9431a8ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cfe45c4f271c015a502b451ffb62ae182be1227109f50815299d20c737c5f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8981
c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8981c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g445x\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:16Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:16 crc kubenswrapper[4825]: I0310 06:46:16.601719 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79ec9d89-dc71-4f36-9254-00bd86795e43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca803889cd06db9b70ad534a668658ae8d914f540bc74177d6a34726a8e1765e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca803889cd06db9b70ad534a668658ae8d914f540bc74177d6a34726a8e1765e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T06:46:12Z\\\",\\\"message\\\":\\\"om github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 06:46:12.855516 7040 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 06:46:12.855908 7040 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 06:46:12.855945 7040 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 06:46:12.856623 7040 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 06:46:12.856685 7040 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 06:46:12.856711 7040 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 06:46:12.856767 7040 factory.go:656] Stopping watch factory\\\\nI0310 06:46:12.856790 7040 ovnkube.go:599] Stopped ovnkube\\\\nI0310 06:46:12.856840 7040 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 06:46:12.856855 7040 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 06:46:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:46:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jhkb9_openshift-ovn-kubernetes(79ec9d89-dc71-4f36-9254-00bd86795e43)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46
b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jhkb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:16Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:16 crc kubenswrapper[4825]: I0310 06:46:16.634198 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"468bfefa-2636-4ab4-b62d-2c5c738eb872\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf92f28cf551cde58efe622659d4395d895a1129ac99b065975c991364f03a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:45:16Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 06:45:15.848873 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 06:45:15.849102 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 06:45:15.850424 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2193444300/tls.crt::/tmp/serving-cert-2193444300/tls.key\\\\\\\"\\\\nI0310 06:45:16.151765 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 06:45:16.162919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 06:45:16.162961 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 06:45:16.163018 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 06:45:16.163029 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 06:45:16.172887 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 06:45:16.172948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172959 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 06:45:16.172977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 06:45:16.172983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 06:45:16.172989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 06:45:16.172898 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0310 06:45:16.175833 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:46:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9
cfb37d6b8731d18419442294f8ccddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:16Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:16 crc kubenswrapper[4825]: I0310 06:46:16.660945 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b80a6aa275f1b641833c7bdc845f2def19b224d66671705d96c08b6eed3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:16Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:16 crc kubenswrapper[4825]: I0310 06:46:16.683989 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:16Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:16 crc kubenswrapper[4825]: I0310 06:46:16.705494 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ec1a7287733290068585115df1b16748bc415396154735f45a192543405254a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T06:46:16Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:16 crc kubenswrapper[4825]: I0310 06:46:16.723558 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgpc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdb08f12-1daf-4d35-940c-914e187ffda0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d002f7817bf6d5fdb95b7dc90c3eb10f4d949456ad0c472e1671f0ab97278b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-kgsbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgpc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:16Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:16 crc kubenswrapper[4825]: I0310 06:46:16.741197 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l6mg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68dff16f-0214-4421-934d-dcb6e9d1af28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7bb00502b5ab82cbb301089867ff883cbfa0b30fabb0e58451ae
feb0197810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8ztd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l6mg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:16Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:16 crc kubenswrapper[4825]: I0310 06:46:16.758545 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8db1f9b1-6042-44a1-97b8-c0e9269f8ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8b02103b8486962e24ee333889dc83acc4379b367dc12c20ca11602879239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:16Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:16 crc kubenswrapper[4825]: I0310 06:46:16.777349 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:16Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:16 crc kubenswrapper[4825]: I0310 06:46:16.795645 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rhprr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5e4a4c2-fcf8-42e2-a74b-50e89997ca17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc155a03c62e503236ad80c700aea4abf8ca948b38959b5f99df6965a5ad5bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k58v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab5785d3d924225a531d92fc17a9efa444710
e8e4f2e75f4308adcd24e80d8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k58v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rhprr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:16Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:16 crc kubenswrapper[4825]: I0310 06:46:16.813088 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pj5dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"114672c5-c1d0-4f87-b3aa-fb6d8535ffeb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnbd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnbd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:46:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pj5dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:16Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:16 crc 
kubenswrapper[4825]: I0310 06:46:16.846174 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"743f3ca5-3ec0-4c5b-bad2-3e64df57da6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872997317afa0a2274e0b4ce87bdd13fad7a3291521411a8a208bfd53672163e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://ce8afd573e70bc13a4b3d3539d466f0bc54b26139a06494ccb38d980c05f0974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de8e93ef3747888e32ccae04649a70ef2b3f1ad42133acc0137f78770c9720f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c380bb44c46b881451835d9616e2b3e76e790bbcaccdc979b9c39bb8abdc605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd6d39109c49743644fca4e67fd9e0db312e712adbfe3f7a8ae5a4f749c5945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:16Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:16 crc kubenswrapper[4825]: I0310 06:46:16.868192 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9beb5814-89d0-47c0-8b0e-24376a358fc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf9d4029544f4f9c869193aedbe1791dd7602dfb6e86bec4792ce01cf7c6f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bb20a12b63dd94732d6f3413fa4c14b8931a2d
82e49f45769c15a14202fe1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bvt9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:16Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:16 crc kubenswrapper[4825]: I0310 06:46:16.890992 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dkbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"165351e4-3c96-4a68-8c75-43b001b0ec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420d0f058d1a0590e8feaa3f7f4dc37f6f0b91274142ca8ffd6b92c08ea24c1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5xgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dkbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:16Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:16 crc kubenswrapper[4825]: I0310 06:46:16.910894 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c0cbc1e1c16536c1e26b0c607f0966b3749a7357e4cd130abd24d53903daacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b86d021111582d0c9ef202b910fe8c744a3cb1a80dd163dd41125a47decec8e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:16Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:17 crc kubenswrapper[4825]: I0310 06:46:17.236168 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:46:17 crc kubenswrapper[4825]: E0310 06:46:17.236429 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:46:18 crc kubenswrapper[4825]: I0310 06:46:18.235620 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:46:18 crc kubenswrapper[4825]: I0310 06:46:18.235704 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:46:18 crc kubenswrapper[4825]: I0310 06:46:18.235757 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:46:18 crc kubenswrapper[4825]: E0310 06:46:18.235842 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:46:18 crc kubenswrapper[4825]: E0310 06:46:18.235981 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:46:18 crc kubenswrapper[4825]: E0310 06:46:18.236189 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb" Mar 10 06:46:19 crc kubenswrapper[4825]: I0310 06:46:19.236504 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:46:19 crc kubenswrapper[4825]: E0310 06:46:19.236734 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:46:19 crc kubenswrapper[4825]: I0310 06:46:19.255281 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:19Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:19 crc kubenswrapper[4825]: I0310 06:46:19.281061 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g445x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc29189-3e91-4d20-8d00-682a9431a8ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cfe45c4f271c015a502b451ffb62ae182be1227109f50815299d20c737c5f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8981
c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8981c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g445x\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:19Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:19 crc kubenswrapper[4825]: I0310 06:46:19.318351 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79ec9d89-dc71-4f36-9254-00bd86795e43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca803889cd06db9b70ad534a668658ae8d914f540bc74177d6a34726a8e1765e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca803889cd06db9b70ad534a668658ae8d914f540bc74177d6a34726a8e1765e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T06:46:12Z\\\",\\\"message\\\":\\\"om github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 06:46:12.855516 7040 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 06:46:12.855908 7040 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 06:46:12.855945 7040 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 06:46:12.856623 7040 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 06:46:12.856685 7040 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 06:46:12.856711 7040 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 06:46:12.856767 7040 factory.go:656] Stopping watch factory\\\\nI0310 06:46:12.856790 7040 ovnkube.go:599] Stopped ovnkube\\\\nI0310 06:46:12.856840 7040 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 06:46:12.856855 7040 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 06:46:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:46:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jhkb9_openshift-ovn-kubernetes(79ec9d89-dc71-4f36-9254-00bd86795e43)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46
b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jhkb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:19Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:19 crc kubenswrapper[4825]: I0310 06:46:19.343708 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"468bfefa-2636-4ab4-b62d-2c5c738eb872\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf92f28cf551cde58efe622659d4395d895a1129ac99b065975c991364f03a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:45:16Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 06:45:15.848873 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 06:45:15.849102 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 06:45:15.850424 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2193444300/tls.crt::/tmp/serving-cert-2193444300/tls.key\\\\\\\"\\\\nI0310 06:45:16.151765 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 06:45:16.162919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 06:45:16.162961 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 06:45:16.163018 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 06:45:16.163029 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 06:45:16.172887 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 06:45:16.172948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172959 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 06:45:16.172977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 06:45:16.172983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 06:45:16.172989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 06:45:16.172898 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0310 06:45:16.175833 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:46:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9
cfb37d6b8731d18419442294f8ccddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:19Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:19 crc kubenswrapper[4825]: I0310 06:46:19.364049 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:19Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:19 crc kubenswrapper[4825]: I0310 06:46:19.384365 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ec1a7287733290068585115df1b16748bc415396154735f45a192543405254a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T06:46:19Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:19 crc kubenswrapper[4825]: I0310 06:46:19.400846 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgpc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdb08f12-1daf-4d35-940c-914e187ffda0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d002f7817bf6d5fdb95b7dc90c3eb10f4d949456ad0c472e1671f0ab97278b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-kgsbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgpc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:19Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:19 crc kubenswrapper[4825]: I0310 06:46:19.417744 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l6mg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68dff16f-0214-4421-934d-dcb6e9d1af28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7bb00502b5ab82cbb301089867ff883cbfa0b30fabb0e58451ae
feb0197810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8ztd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l6mg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:19Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:19 crc kubenswrapper[4825]: I0310 06:46:19.433897 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8db1f9b1-6042-44a1-97b8-c0e9269f8ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8b02103b8486962e24ee333889dc83acc4379b367dc12c20ca11602879239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:19Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:19 crc kubenswrapper[4825]: I0310 06:46:19.458831 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b80a6aa275f1b641833c7bdc845f2def19b224d66671705d96c08b6eed3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:19Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:19 crc kubenswrapper[4825]: I0310 06:46:19.481095 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:19Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:19 crc kubenswrapper[4825]: E0310 06:46:19.496193 4825 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 06:46:19 crc kubenswrapper[4825]: I0310 06:46:19.510318 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rhprr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5e4a4c2-fcf8-42e2-a74b-50e89997ca17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc155a03c62e503236ad80c700aea4abf8ca948b38959b5f99df6965a5ad5bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k58v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab5785d3d924225a531d92fc17a9efa444710e8e4f2e75f4308adcd24e80d8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k58v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rhprr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:19Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:19 crc kubenswrapper[4825]: I0310 06:46:19.529308 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pj5dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"114672c5-c1d0-4f87-b3aa-fb6d8535ffeb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnbd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnbd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:46:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pj5dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:19Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:19 crc 
kubenswrapper[4825]: I0310 06:46:19.558451 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"743f3ca5-3ec0-4c5b-bad2-3e64df57da6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872997317afa0a2274e0b4ce87bdd13fad7a3291521411a8a208bfd53672163e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://ce8afd573e70bc13a4b3d3539d466f0bc54b26139a06494ccb38d980c05f0974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de8e93ef3747888e32ccae04649a70ef2b3f1ad42133acc0137f78770c9720f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c380bb44c46b881451835d9616e2b3e76e790bbcaccdc979b9c39bb8abdc605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd6d39109c49743644fca4e67fd9e0db312e712adbfe3f7a8ae5a4f749c5945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:19Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:19 crc kubenswrapper[4825]: I0310 06:46:19.577263 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dkbt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"165351e4-3c96-4a68-8c75-43b001b0ec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420d0f058d1a0590e8feaa3
f7f4dc37f6f0b91274142ca8ffd6b92c08ea24c1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5xgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dkbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:19Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:19 crc kubenswrapper[4825]: I0310 06:46:19.592206 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c0cbc1e1c16536c1e26b0c607f0966b3749a7357e4cd130abd24d53903daacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b86d021111582d0c9ef202b910fe8c744a3cb1a80dd163dd41125a47decec8e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:19Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:19 crc kubenswrapper[4825]: I0310 06:46:19.605443 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9beb5814-89d0-47c0-8b0e-24376a358fc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf9d4029544f4f9c869193aedbe1791dd7602dfb6e86bec4792ce01cf7c6f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bb20a12b63dd94732d6f3413fa4c14b8931a2d
82e49f45769c15a14202fe1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bvt9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:19Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:20 crc kubenswrapper[4825]: I0310 06:46:20.235672 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:46:20 crc kubenswrapper[4825]: I0310 06:46:20.235786 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:46:20 crc kubenswrapper[4825]: E0310 06:46:20.235862 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb" Mar 10 06:46:20 crc kubenswrapper[4825]: E0310 06:46:20.235964 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:46:20 crc kubenswrapper[4825]: I0310 06:46:20.235808 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:46:20 crc kubenswrapper[4825]: E0310 06:46:20.236063 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:46:21 crc kubenswrapper[4825]: I0310 06:46:21.235976 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:46:21 crc kubenswrapper[4825]: E0310 06:46:21.236327 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:46:22 crc kubenswrapper[4825]: I0310 06:46:22.235958 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:46:22 crc kubenswrapper[4825]: I0310 06:46:22.235958 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:46:22 crc kubenswrapper[4825]: E0310 06:46:22.236263 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:46:22 crc kubenswrapper[4825]: I0310 06:46:22.236004 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:46:22 crc kubenswrapper[4825]: E0310 06:46:22.236373 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:46:22 crc kubenswrapper[4825]: E0310 06:46:22.236849 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb" Mar 10 06:46:22 crc kubenswrapper[4825]: I0310 06:46:22.258625 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 10 06:46:22 crc kubenswrapper[4825]: I0310 06:46:22.357885 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:22 crc kubenswrapper[4825]: I0310 06:46:22.357935 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:22 crc kubenswrapper[4825]: I0310 06:46:22.357948 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:22 crc kubenswrapper[4825]: I0310 06:46:22.357970 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:22 crc kubenswrapper[4825]: I0310 06:46:22.357988 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:22Z","lastTransitionTime":"2026-03-10T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:22 crc kubenswrapper[4825]: E0310 06:46:22.381003 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bc7fa84-cf6d-4b6d-8f9a-118c306e0760\\\",\\\"systemUUID\\\":\\\"4821b499-aedc-4e42-be1d-4a415e5b2f81\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:22Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:22 crc kubenswrapper[4825]: I0310 06:46:22.387347 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:22 crc kubenswrapper[4825]: I0310 06:46:22.387420 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:22 crc kubenswrapper[4825]: I0310 06:46:22.387441 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:22 crc kubenswrapper[4825]: I0310 06:46:22.387523 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:22 crc kubenswrapper[4825]: I0310 06:46:22.387544 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:22Z","lastTransitionTime":"2026-03-10T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:22 crc kubenswrapper[4825]: E0310 06:46:22.413162 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bc7fa84-cf6d-4b6d-8f9a-118c306e0760\\\",\\\"systemUUID\\\":\\\"4821b499-aedc-4e42-be1d-4a415e5b2f81\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:22Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:22 crc kubenswrapper[4825]: I0310 06:46:22.425051 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:22 crc kubenswrapper[4825]: I0310 06:46:22.425233 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:22 crc kubenswrapper[4825]: I0310 06:46:22.425271 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:22 crc kubenswrapper[4825]: I0310 06:46:22.425533 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:22 crc kubenswrapper[4825]: I0310 06:46:22.425652 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:22Z","lastTransitionTime":"2026-03-10T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:22 crc kubenswrapper[4825]: E0310 06:46:22.453249 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bc7fa84-cf6d-4b6d-8f9a-118c306e0760\\\",\\\"systemUUID\\\":\\\"4821b499-aedc-4e42-be1d-4a415e5b2f81\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:22Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:22 crc kubenswrapper[4825]: I0310 06:46:22.459735 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:22 crc kubenswrapper[4825]: I0310 06:46:22.459817 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:22 crc kubenswrapper[4825]: I0310 06:46:22.459836 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:22 crc kubenswrapper[4825]: I0310 06:46:22.459869 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:22 crc kubenswrapper[4825]: I0310 06:46:22.459889 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:22Z","lastTransitionTime":"2026-03-10T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:22 crc kubenswrapper[4825]: E0310 06:46:22.484620 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bc7fa84-cf6d-4b6d-8f9a-118c306e0760\\\",\\\"systemUUID\\\":\\\"4821b499-aedc-4e42-be1d-4a415e5b2f81\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:22Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:22 crc kubenswrapper[4825]: I0310 06:46:22.492169 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:22 crc kubenswrapper[4825]: I0310 06:46:22.492254 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:22 crc kubenswrapper[4825]: I0310 06:46:22.492275 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:22 crc kubenswrapper[4825]: I0310 06:46:22.492301 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:22 crc kubenswrapper[4825]: I0310 06:46:22.492547 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:22Z","lastTransitionTime":"2026-03-10T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:22 crc kubenswrapper[4825]: E0310 06:46:22.512933 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bc7fa84-cf6d-4b6d-8f9a-118c306e0760\\\",\\\"systemUUID\\\":\\\"4821b499-aedc-4e42-be1d-4a415e5b2f81\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:22Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:22 crc kubenswrapper[4825]: E0310 06:46:22.513303 4825 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 06:46:22 crc kubenswrapper[4825]: I0310 06:46:22.904661 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:46:22 crc kubenswrapper[4825]: I0310 06:46:22.904748 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:46:22 crc kubenswrapper[4825]: I0310 06:46:22.904805 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:46:22 crc kubenswrapper[4825]: I0310 06:46:22.904841 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:46:22 crc kubenswrapper[4825]: E0310 06:46:22.904907 4825 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 06:46:22 crc kubenswrapper[4825]: E0310 06:46:22.905035 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 06:47:26.905001784 +0000 UTC m=+199.934782619 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 06:46:22 crc kubenswrapper[4825]: E0310 06:46:22.905108 4825 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 06:46:22 crc kubenswrapper[4825]: E0310 06:46:22.905235 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 06:46:22 crc kubenswrapper[4825]: E0310 06:46:22.905256 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 06:46:22 crc kubenswrapper[4825]: 
E0310 06:46:22.905319 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 06:46:22 crc kubenswrapper[4825]: E0310 06:46:22.905343 4825 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 06:46:22 crc kubenswrapper[4825]: E0310 06:46:22.905273 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 06:46:22 crc kubenswrapper[4825]: E0310 06:46:22.905417 4825 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 06:46:22 crc kubenswrapper[4825]: E0310 06:46:22.905280 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 06:47:26.90523895 +0000 UTC m=+199.935019605 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 06:46:22 crc kubenswrapper[4825]: E0310 06:46:22.905489 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 06:47:26.905475397 +0000 UTC m=+199.935256212 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 06:46:22 crc kubenswrapper[4825]: E0310 06:46:22.905505 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 06:47:26.905497497 +0000 UTC m=+199.935278372 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 06:46:23 crc kubenswrapper[4825]: E0310 06:46:23.006511 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:47:27.006484858 +0000 UTC m=+200.036265513 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:46:23 crc kubenswrapper[4825]: I0310 06:46:23.006490 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:46:23 crc kubenswrapper[4825]: I0310 06:46:23.235884 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:46:23 crc kubenswrapper[4825]: E0310 06:46:23.236068 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:46:24 crc kubenswrapper[4825]: I0310 06:46:24.236366 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:46:24 crc kubenswrapper[4825]: I0310 06:46:24.236416 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:46:24 crc kubenswrapper[4825]: I0310 06:46:24.236564 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:46:24 crc kubenswrapper[4825]: E0310 06:46:24.236971 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:46:24 crc kubenswrapper[4825]: E0310 06:46:24.237068 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:46:24 crc kubenswrapper[4825]: E0310 06:46:24.237337 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb" Mar 10 06:46:24 crc kubenswrapper[4825]: E0310 06:46:24.498089 4825 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 06:46:25 crc kubenswrapper[4825]: I0310 06:46:25.236432 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:46:25 crc kubenswrapper[4825]: E0310 06:46:25.236693 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:46:26 crc kubenswrapper[4825]: I0310 06:46:26.235884 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:46:26 crc kubenswrapper[4825]: I0310 06:46:26.235936 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:46:26 crc kubenswrapper[4825]: I0310 06:46:26.235946 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:46:26 crc kubenswrapper[4825]: E0310 06:46:26.236990 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb" Mar 10 06:46:26 crc kubenswrapper[4825]: I0310 06:46:26.237661 4825 scope.go:117] "RemoveContainer" containerID="ca803889cd06db9b70ad534a668658ae8d914f540bc74177d6a34726a8e1765e" Mar 10 06:46:26 crc kubenswrapper[4825]: E0310 06:46:26.237985 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-jhkb9_openshift-ovn-kubernetes(79ec9d89-dc71-4f36-9254-00bd86795e43)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" Mar 10 06:46:26 crc kubenswrapper[4825]: E0310 06:46:26.237992 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:46:26 crc kubenswrapper[4825]: E0310 06:46:26.238333 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:46:27 crc kubenswrapper[4825]: I0310 06:46:27.235968 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:46:27 crc kubenswrapper[4825]: E0310 06:46:27.236263 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:46:28 crc kubenswrapper[4825]: I0310 06:46:28.236077 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:46:28 crc kubenswrapper[4825]: I0310 06:46:28.236102 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:46:28 crc kubenswrapper[4825]: E0310 06:46:28.237465 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:46:28 crc kubenswrapper[4825]: I0310 06:46:28.237523 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:46:28 crc kubenswrapper[4825]: E0310 06:46:28.237856 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:46:28 crc kubenswrapper[4825]: E0310 06:46:28.238466 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb" Mar 10 06:46:28 crc kubenswrapper[4825]: I0310 06:46:28.251615 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 10 06:46:29 crc kubenswrapper[4825]: I0310 06:46:29.235969 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:46:29 crc kubenswrapper[4825]: E0310 06:46:29.236227 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:46:29 crc kubenswrapper[4825]: I0310 06:46:29.254103 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8db1f9b1-6042-44a1-97b8-c0e9269f8ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8b02103b8486962e24ee333889dc83acc4379b367dc12c20ca11602879239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:29Z is 
after 2025-08-24T17:21:41Z" Mar 10 06:46:29 crc kubenswrapper[4825]: I0310 06:46:29.290490 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b80a6aa275f1b641833c7bdc845f2def19b224d66671705d96c08b6eed3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:29Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:29 crc kubenswrapper[4825]: I0310 06:46:29.344864 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:29Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:29 crc kubenswrapper[4825]: I0310 06:46:29.360782 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ec1a7287733290068585115df1b16748bc415396154735f45a192543405254a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T06:46:29Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:29 crc kubenswrapper[4825]: I0310 06:46:29.375992 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgpc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdb08f12-1daf-4d35-940c-914e187ffda0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d002f7817bf6d5fdb95b7dc90c3eb10f4d949456ad0c472e1671f0ab97278b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-kgsbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgpc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:29Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:29 crc kubenswrapper[4825]: I0310 06:46:29.389279 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l6mg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68dff16f-0214-4421-934d-dcb6e9d1af28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7bb00502b5ab82cbb301089867ff883cbfa0b30fabb0e58451ae
feb0197810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8ztd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l6mg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:29Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:29 crc kubenswrapper[4825]: I0310 06:46:29.411877 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"743f3ca5-3ec0-4c5b-bad2-3e64df57da6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872997317afa0a2274e0b4ce87bdd13fad7a3291521411a8a208bfd53672163e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8afd573e70bc13a4b3d3539d466f0bc54b26139a06494ccb38d980c05f0974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de8e93ef3747888e32ccae04649a70ef2b3f1ad42133acc0137f78770c9720f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c380bb44c46b881451835d9616e2b3e76e790bbcaccdc979b9c39bb8abdc605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd6d39109c49743644fca4e67fd9e0db312e712adbfe3f7a8ae5a4f749c5945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:29Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:29 crc kubenswrapper[4825]: I0310 06:46:29.427400 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8072d518-c139-457a-8169-4cdd02faae0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac78bed04e5a12d272784fc14809731a05de5be9d952c39ac4ffaa40f8589ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46d0e4f89d7c516f3c7534ab4f6edad77bce567d389d9394d2b57425ccffdde7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2487dfed5a84ab11961ba04d4360e5d0e211ddc42e7e88b3b2ad04ae3a9d34e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32bdd291a7cfb7e9cc1a02cdd529f56074dae35c2c242dee8461075541c7edf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bdd291a7cfb7e9cc1a02cdd529f56074dae35c2c242dee8461075541c7edf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:29Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:29 crc kubenswrapper[4825]: I0310 06:46:29.442826 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:29Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:29 crc kubenswrapper[4825]: I0310 06:46:29.462309 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rhprr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5e4a4c2-fcf8-42e2-a74b-50e89997ca17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc155a03c62e503236ad80c700aea4abf8ca948b38959b5f99df6965a5ad5bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k58v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab5785d3d924225a531d92fc17a9efa444710
e8e4f2e75f4308adcd24e80d8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k58v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rhprr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:29Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:29 crc kubenswrapper[4825]: I0310 06:46:29.482421 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pj5dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"114672c5-c1d0-4f87-b3aa-fb6d8535ffeb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnbd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnbd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:46:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pj5dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:29Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:29 crc 
kubenswrapper[4825]: E0310 06:46:29.498919 4825 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 06:46:29 crc kubenswrapper[4825]: I0310 06:46:29.502854 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c0cbc1e1c16536c1e26b0c607f0966b3749a7357e4cd130abd24d53903daacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b86d021111582d0c9ef202b910fe8c744a3cb1a80dd163dd41125a47decec8e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:29Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:29 crc kubenswrapper[4825]: I0310 06:46:29.520806 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9beb5814-89d0-47c0-8b0e-24376a358fc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf9d4029544f4f9c869193aedbe1791dd7602dfb6e86bec4792ce01cf7c6f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bb20a12b63dd94732d6f3413fa4c14b8931a2d
82e49f45769c15a14202fe1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bvt9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:29Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:29 crc kubenswrapper[4825]: I0310 06:46:29.539890 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dkbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"165351e4-3c96-4a68-8c75-43b001b0ec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420d0f058d1a0590e8feaa3f7f4dc37f6f0b91274142ca8ffd6b92c08ea24c1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5xgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dkbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:29Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:29 crc kubenswrapper[4825]: I0310 06:46:29.560844 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"468bfefa-2636-4ab4-b62d-2c5c738eb872\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf92f28cf551cde58efe622659d4395d895a1129ac99b065975c991364f03a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:45:
16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 06:45:15.848873 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 06:45:15.849102 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 06:45:15.850424 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2193444300/tls.crt::/tmp/serving-cert-2193444300/tls.key\\\\\\\"\\\\nI0310 06:45:16.151765 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 06:45:16.162919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 06:45:16.162961 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 06:45:16.163018 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 06:45:16.163029 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 06:45:16.172887 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 06:45:16.172948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172959 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 06:45:16.172977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 06:45:16.172983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 06:45:16.172989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 06:45:16.172898 1 genericapiserver.go:533] MuxAndDiscoveryComplete 
has all endpoints registered and discovery information is complete\\\\nF0310 06:45:16.175833 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:46:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d5
84875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:29Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:29 crc kubenswrapper[4825]: I0310 06:46:29.576330 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6582d9a4-c4d1-4f0b-b0b2-0e0ae9678f64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a32345952f87cad97a94ac8654386fd6e5253db4e25da1c4b9d6de141fea7832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5da17071874543156df5adb8dee5973b11684f66e728934c3353e76eaad702e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:44:39Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 06:44:12.561772 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 06:44:12.564769 1 observer_polling.go:159] Starting file observer\\\\nI0310 06:44:12.609335 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 06:44:12.614802 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 06:44:39.542688 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 06:44:39.542886 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:38Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://311d715e54fdfe9fcdda7648d431fa69ca054e330fd9415f9c4819462508b9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebd7c7e7538a870d8bf580b8b01161607bae4c06941a4db2895dc1ee86f3b988\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f2bc1824721aecbf1494421ff4b33b65f910bb791a4eb7a591aa8eb455b958\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:29Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:29 crc kubenswrapper[4825]: I0310 06:46:29.591436 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:29Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:29 crc kubenswrapper[4825]: I0310 06:46:29.620241 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g445x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc29189-3e91-4d20-8d00-682a9431a8ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cfe45c4f271c015a502b451ffb62ae182be1227109f50815299d20c737c5f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8981
c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8981c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g445x\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:29Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:29 crc kubenswrapper[4825]: I0310 06:46:29.656240 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79ec9d89-dc71-4f36-9254-00bd86795e43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca803889cd06db9b70ad534a668658ae8d914f540bc74177d6a34726a8e1765e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca803889cd06db9b70ad534a668658ae8d914f540bc74177d6a34726a8e1765e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T06:46:12Z\\\",\\\"message\\\":\\\"om github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 06:46:12.855516 7040 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 06:46:12.855908 7040 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 06:46:12.855945 7040 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 06:46:12.856623 7040 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 06:46:12.856685 7040 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 06:46:12.856711 7040 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 06:46:12.856767 7040 factory.go:656] Stopping watch factory\\\\nI0310 06:46:12.856790 7040 ovnkube.go:599] Stopped ovnkube\\\\nI0310 06:46:12.856840 7040 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 06:46:12.856855 7040 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 06:46:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:46:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jhkb9_openshift-ovn-kubernetes(79ec9d89-dc71-4f36-9254-00bd86795e43)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46
b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jhkb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:29Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:30 crc kubenswrapper[4825]: I0310 06:46:30.236261 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:46:30 crc kubenswrapper[4825]: I0310 06:46:30.236388 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:46:30 crc kubenswrapper[4825]: E0310 06:46:30.236484 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:46:30 crc kubenswrapper[4825]: E0310 06:46:30.236651 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb" Mar 10 06:46:30 crc kubenswrapper[4825]: I0310 06:46:30.236412 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:46:30 crc kubenswrapper[4825]: E0310 06:46:30.236792 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:46:31 crc kubenswrapper[4825]: I0310 06:46:31.236628 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:46:31 crc kubenswrapper[4825]: E0310 06:46:31.236904 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:46:31 crc kubenswrapper[4825]: I0310 06:46:31.923005 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/114672c5-c1d0-4f87-b3aa-fb6d8535ffeb-metrics-certs\") pod \"network-metrics-daemon-pj5dl\" (UID: \"114672c5-c1d0-4f87-b3aa-fb6d8535ffeb\") " pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:46:31 crc kubenswrapper[4825]: E0310 06:46:31.923451 4825 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 06:46:31 crc kubenswrapper[4825]: E0310 06:46:31.923608 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/114672c5-c1d0-4f87-b3aa-fb6d8535ffeb-metrics-certs podName:114672c5-c1d0-4f87-b3aa-fb6d8535ffeb nodeName:}" failed. No retries permitted until 2026-03-10 06:47:03.92357679 +0000 UTC m=+176.953357435 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/114672c5-c1d0-4f87-b3aa-fb6d8535ffeb-metrics-certs") pod "network-metrics-daemon-pj5dl" (UID: "114672c5-c1d0-4f87-b3aa-fb6d8535ffeb") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 06:46:32 crc kubenswrapper[4825]: I0310 06:46:32.235912 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:46:32 crc kubenswrapper[4825]: I0310 06:46:32.236022 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:46:32 crc kubenswrapper[4825]: E0310 06:46:32.236203 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:46:32 crc kubenswrapper[4825]: E0310 06:46:32.236437 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb" Mar 10 06:46:32 crc kubenswrapper[4825]: I0310 06:46:32.235790 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:46:32 crc kubenswrapper[4825]: E0310 06:46:32.237330 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:46:32 crc kubenswrapper[4825]: I0310 06:46:32.605599 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:32 crc kubenswrapper[4825]: I0310 06:46:32.605677 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:32 crc kubenswrapper[4825]: I0310 06:46:32.605697 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:32 crc kubenswrapper[4825]: I0310 06:46:32.605727 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:32 crc kubenswrapper[4825]: I0310 06:46:32.605748 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:32Z","lastTransitionTime":"2026-03-10T06:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:32 crc kubenswrapper[4825]: E0310 06:46:32.626422 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bc7fa84-cf6d-4b6d-8f9a-118c306e0760\\\",\\\"systemUUID\\\":\\\"4821b499-aedc-4e42-be1d-4a415e5b2f81\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:32Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:32 crc kubenswrapper[4825]: I0310 06:46:32.632098 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:32 crc kubenswrapper[4825]: I0310 06:46:32.632200 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:32 crc kubenswrapper[4825]: I0310 06:46:32.632218 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:32 crc kubenswrapper[4825]: I0310 06:46:32.632310 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:32 crc kubenswrapper[4825]: I0310 06:46:32.632360 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:32Z","lastTransitionTime":"2026-03-10T06:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:32 crc kubenswrapper[4825]: E0310 06:46:32.652709 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bc7fa84-cf6d-4b6d-8f9a-118c306e0760\\\",\\\"systemUUID\\\":\\\"4821b499-aedc-4e42-be1d-4a415e5b2f81\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:32Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:32 crc kubenswrapper[4825]: I0310 06:46:32.658196 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:32 crc kubenswrapper[4825]: I0310 06:46:32.658301 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:32 crc kubenswrapper[4825]: I0310 06:46:32.658312 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:32 crc kubenswrapper[4825]: I0310 06:46:32.658331 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:32 crc kubenswrapper[4825]: I0310 06:46:32.658345 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:32Z","lastTransitionTime":"2026-03-10T06:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:32 crc kubenswrapper[4825]: E0310 06:46:32.672501 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bc7fa84-cf6d-4b6d-8f9a-118c306e0760\\\",\\\"systemUUID\\\":\\\"4821b499-aedc-4e42-be1d-4a415e5b2f81\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:32Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:32 crc kubenswrapper[4825]: I0310 06:46:32.677477 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:32 crc kubenswrapper[4825]: I0310 06:46:32.677545 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:32 crc kubenswrapper[4825]: I0310 06:46:32.677566 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:32 crc kubenswrapper[4825]: I0310 06:46:32.677592 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:32 crc kubenswrapper[4825]: I0310 06:46:32.677616 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:32Z","lastTransitionTime":"2026-03-10T06:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:32 crc kubenswrapper[4825]: E0310 06:46:32.697089 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bc7fa84-cf6d-4b6d-8f9a-118c306e0760\\\",\\\"systemUUID\\\":\\\"4821b499-aedc-4e42-be1d-4a415e5b2f81\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:32Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:32 crc kubenswrapper[4825]: I0310 06:46:32.701498 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:32 crc kubenswrapper[4825]: I0310 06:46:32.701556 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:32 crc kubenswrapper[4825]: I0310 06:46:32.701574 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:32 crc kubenswrapper[4825]: I0310 06:46:32.701599 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:32 crc kubenswrapper[4825]: I0310 06:46:32.701618 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:32Z","lastTransitionTime":"2026-03-10T06:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:32 crc kubenswrapper[4825]: E0310 06:46:32.717204 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bc7fa84-cf6d-4b6d-8f9a-118c306e0760\\\",\\\"systemUUID\\\":\\\"4821b499-aedc-4e42-be1d-4a415e5b2f81\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:32Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:32 crc kubenswrapper[4825]: E0310 06:46:32.717443 4825 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 06:46:33 crc kubenswrapper[4825]: I0310 06:46:33.235782 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:46:33 crc kubenswrapper[4825]: E0310 06:46:33.236010 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:46:34 crc kubenswrapper[4825]: I0310 06:46:34.217061 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8dkbt_165351e4-3c96-4a68-8c75-43b001b0ec60/kube-multus/0.log" Mar 10 06:46:34 crc kubenswrapper[4825]: I0310 06:46:34.217182 4825 generic.go:334] "Generic (PLEG): container finished" podID="165351e4-3c96-4a68-8c75-43b001b0ec60" containerID="420d0f058d1a0590e8feaa3f7f4dc37f6f0b91274142ca8ffd6b92c08ea24c1c" exitCode=1 Mar 10 06:46:34 crc kubenswrapper[4825]: I0310 06:46:34.217226 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8dkbt" event={"ID":"165351e4-3c96-4a68-8c75-43b001b0ec60","Type":"ContainerDied","Data":"420d0f058d1a0590e8feaa3f7f4dc37f6f0b91274142ca8ffd6b92c08ea24c1c"} Mar 10 06:46:34 crc kubenswrapper[4825]: I0310 06:46:34.217876 4825 scope.go:117] "RemoveContainer" containerID="420d0f058d1a0590e8feaa3f7f4dc37f6f0b91274142ca8ffd6b92c08ea24c1c" Mar 10 06:46:34 crc kubenswrapper[4825]: I0310 06:46:34.235574 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:46:34 crc kubenswrapper[4825]: I0310 06:46:34.235651 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:46:34 crc kubenswrapper[4825]: I0310 06:46:34.235651 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:46:34 crc kubenswrapper[4825]: E0310 06:46:34.235819 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb" Mar 10 06:46:34 crc kubenswrapper[4825]: E0310 06:46:34.236083 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:46:34 crc kubenswrapper[4825]: E0310 06:46:34.236541 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:46:34 crc kubenswrapper[4825]: I0310 06:46:34.247928 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"468bfefa-2636-4ab4-b62d-2c5c738eb872\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\
\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf92f28cf551cde58efe622659d4395d895a1129ac99b065975c991364f03a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clus
ter-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:45:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 06:45:15.848873 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 06:45:15.849102 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 06:45:15.850424 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2193444300/tls.crt::/tmp/serving-cert-2193444300/tls.key\\\\\\\"\\\\nI0310 06:45:16.151765 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 06:45:16.162919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 06:45:16.162961 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 06:45:16.163018 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 06:45:16.163029 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 06:45:16.172887 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 06:45:16.172948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172959 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 06:45:16.172977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' 
detected.\\\\nW0310 06:45:16.172983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 06:45:16.172989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 06:45:16.172898 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 06:45:16.175833 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:46:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:34Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:34 crc kubenswrapper[4825]: I0310 06:46:34.274989 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6582d9a4-c4d1-4f0b-b0b2-0e0ae9678f64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a32345952f87cad97a94ac8654386fd6e5253db4e25da1c4b9d6de141fea7832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5da17071874543156df5adb8dee5973b11684f66e728934c3353e76eaad702e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:44:39Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 06:44:12.561772 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 06:44:12.564769 1 observer_polling.go:159] Starting file observer\\\\nI0310 06:44:12.609335 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 06:44:12.614802 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 06:44:39.542688 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 06:44:39.542886 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:38Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://311d715e54fdfe9fcdda7648d431fa69ca054e330fd9415f9c4819462508b9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebd7c7e7538a870d8bf580b8b01161607bae4c06941a4db2895dc1ee86f3b988\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f2bc1824721aecbf1494421ff4b33b65f910bb791a4eb7a591aa8eb455b958\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:34Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:34 crc kubenswrapper[4825]: I0310 06:46:34.298712 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:34Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:34 crc kubenswrapper[4825]: I0310 06:46:34.324819 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g445x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc29189-3e91-4d20-8d00-682a9431a8ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cfe45c4f271c015a502b451ffb62ae182be1227109f50815299d20c737c5f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8981
c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8981c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g445x\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:34Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:34 crc kubenswrapper[4825]: I0310 06:46:34.361940 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79ec9d89-dc71-4f36-9254-00bd86795e43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca803889cd06db9b70ad534a668658ae8d914f540bc74177d6a34726a8e1765e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca803889cd06db9b70ad534a668658ae8d914f540bc74177d6a34726a8e1765e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T06:46:12Z\\\",\\\"message\\\":\\\"om github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 06:46:12.855516 7040 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 06:46:12.855908 7040 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 06:46:12.855945 7040 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 06:46:12.856623 7040 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 06:46:12.856685 7040 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 06:46:12.856711 7040 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 06:46:12.856767 7040 factory.go:656] Stopping watch factory\\\\nI0310 06:46:12.856790 7040 ovnkube.go:599] Stopped ovnkube\\\\nI0310 06:46:12.856840 7040 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 06:46:12.856855 7040 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 06:46:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:46:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jhkb9_openshift-ovn-kubernetes(79ec9d89-dc71-4f36-9254-00bd86795e43)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46
b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jhkb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:34Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:34 crc kubenswrapper[4825]: I0310 06:46:34.381964 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l6mg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"68dff16f-0214-4421-934d-dcb6e9d1af28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7bb00502b5ab82cbb301089867ff883cbfa0b30fabb0e58451aefeb0197810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8ztd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l6mg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:34Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:34 crc kubenswrapper[4825]: I0310 06:46:34.399116 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8db1f9b1-6042-44a1-97b8-c0e9269f8ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8b02103b8486962e24ee333889dc83acc4379b367dc12c20ca11602879239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:34Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:34 crc 
kubenswrapper[4825]: I0310 06:46:34.423312 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b80a6aa275f1b641833c7bdc845f2def19b224d66671705d96c08b6eed3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:34Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:34 crc kubenswrapper[4825]: I0310 06:46:34.445358 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:34Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:34 crc kubenswrapper[4825]: I0310 06:46:34.465303 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ec1a7287733290068585115df1b16748bc415396154735f45a192543405254a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T06:46:34Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:34 crc kubenswrapper[4825]: I0310 06:46:34.483797 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgpc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdb08f12-1daf-4d35-940c-914e187ffda0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d002f7817bf6d5fdb95b7dc90c3eb10f4d949456ad0c472e1671f0ab97278b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-kgsbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgpc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:34Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:34 crc kubenswrapper[4825]: E0310 06:46:34.500321 4825 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 06:46:34 crc kubenswrapper[4825]: I0310 06:46:34.531590 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"743f3ca5-3ec0-4c5b-bad2-3e64df57da6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872997317afa0a2274e0b4ce87bdd13fad7a3291521411a8a208bfd53672163e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\
\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8afd573e70bc13a4b3d3539d466f0bc54b26139a06494ccb38d980c05f0974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de8e93ef3747888e32ccae04649a70ef2b3f1ad42133acc0137f78770c9720f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c380bb44c46b881451835d9616e2b3e76e790bbcaccdc979b9c39bb8abdc605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee
1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd6d39109c49743644fca4e67fd9e0db312e712adbfe3f7a8ae5a4f749c5945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:34Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:34 crc kubenswrapper[4825]: I0310 06:46:34.555009 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8072d518-c139-457a-8169-4cdd02faae0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac78bed04e5a12d272784fc14809731a05de5be9d952c39ac4ffaa40f8589ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46d0e4f89d7c516f3c7534ab4f6edad77bce567d389d9394d2b57425ccffdde7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2487dfed5a84ab11961ba04d4360e5d0e211ddc42e7e88b3b2ad04ae3a9d34e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32bdd291a7cfb7e9cc1a02cdd529f56074dae35c2c242dee8461075541c7edf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://32bdd291a7cfb7e9cc1a02cdd529f56074dae35c2c242dee8461075541c7edf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:34Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:34 crc kubenswrapper[4825]: I0310 06:46:34.577622 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:34Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:34 crc kubenswrapper[4825]: I0310 06:46:34.595213 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rhprr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5e4a4c2-fcf8-42e2-a74b-50e89997ca17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc155a03c62e503236ad80c700aea4abf8ca948b38959b5f99df6965a5ad5bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k58v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab5785d3d924225a531d92fc17a9efa444710
e8e4f2e75f4308adcd24e80d8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k58v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rhprr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:34Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:34 crc kubenswrapper[4825]: I0310 06:46:34.612899 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pj5dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"114672c5-c1d0-4f87-b3aa-fb6d8535ffeb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnbd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnbd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:46:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pj5dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:34Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:34 crc 
kubenswrapper[4825]: I0310 06:46:34.631485 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c0cbc1e1c16536c1e26b0c607f0966b3749a7357e4cd130abd24d53903daacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b86d021111582d0c9ef202b910fe8c744a3cb1a80dd163dd41125a47decec8e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:34Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:34 crc kubenswrapper[4825]: I0310 06:46:34.646334 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9beb5814-89d0-47c0-8b0e-24376a358fc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf9d4029544f4f9c869193aedbe1791dd7602dfb6e86bec4792ce01cf7c6f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bb20a12b63dd94732d6f3413fa4c14b8931a2d
82e49f45769c15a14202fe1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bvt9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:34Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:34 crc kubenswrapper[4825]: I0310 06:46:34.667728 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dkbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"165351e4-3c96-4a68-8c75-43b001b0ec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420d0f058d1a0590e8feaa3f7f4dc37f6f0b91274142ca8ffd6b92c08ea24c1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://420d0f058d1a0590e8feaa3f7f4dc37f6f0b91274142ca8ffd6b92c08ea24c1c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T06:46:33Z\\\",\\\"message\\\":\\\"2026-03-10T06:45:48+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d188b728-454b-4c7a-882d-65d49163d93c\\\\n2026-03-10T06:45:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d188b728-454b-4c7a-882d-65d49163d93c to /host/opt/cni/bin/\\\\n2026-03-10T06:45:48Z [verbose] multus-daemon started\\\\n2026-03-10T06:45:48Z [verbose] Readiness Indicator file check\\\\n2026-03-10T06:46:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5xgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dkbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:34Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:35 crc kubenswrapper[4825]: I0310 06:46:35.225446 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8dkbt_165351e4-3c96-4a68-8c75-43b001b0ec60/kube-multus/0.log" Mar 10 06:46:35 crc kubenswrapper[4825]: I0310 06:46:35.226451 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8dkbt" event={"ID":"165351e4-3c96-4a68-8c75-43b001b0ec60","Type":"ContainerStarted","Data":"56708571d26121728c9ab436b6bd8aac9945e7b28e3f4c5d12035fa9809c70ef"} Mar 10 06:46:35 crc kubenswrapper[4825]: I0310 06:46:35.235906 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:46:35 crc kubenswrapper[4825]: E0310 06:46:35.236128 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:46:35 crc kubenswrapper[4825]: I0310 06:46:35.251661 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c0cbc1e1c16536c1e26b0c607f0966b3749a7357e4cd130abd24d53903daacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\
\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b86d021111582d0c9ef202b910fe8c744a3cb1a80dd163dd41125a47decec8e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:35Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:35 crc kubenswrapper[4825]: I0310 06:46:35.272177 4825 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9beb5814-89d0-47c0-8b0e-24376a358fc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf9d4029544f4f9c869193aedbe1791dd7602dfb6e86bec4792ce01cf7c6f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bb20a12b63dd94732d6f3413fa4c14b8931a2d82e49f45769c15a14202fe1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bvt9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:35Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:35 crc kubenswrapper[4825]: I0310 06:46:35.295075 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dkbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"165351e4-3c96-4a68-8c75-43b001b0ec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56708571d26121728c9ab436b6bd8aac9945e7b28e3f4c5d12035fa9809c70ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://420d0f058d1a0590e8feaa3f7f4dc37f6f0b91274142ca8ffd6b92c08ea24c1c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T06:46:33Z\\\",\\\"message\\\":\\\"2026-03-10T06:45:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d188b728-454b-4c7a-882d-65d49163d93c\\\\n2026-03-10T06:45:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d188b728-454b-4c7a-882d-65d49163d93c to /host/opt/cni/bin/\\\\n2026-03-10T06:45:48Z [verbose] multus-daemon started\\\\n2026-03-10T06:45:48Z [verbose] 
Readiness Indicator file check\\\\n2026-03-10T06:46:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5xgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dkbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:35Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:35 crc kubenswrapper[4825]: I0310 06:46:35.319853 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"468bfefa-2636-4ab4-b62d-2c5c738eb872\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-s
yncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf92f28cf551cde58efe622659d4395d895a1129ac99b065975c991364f03a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:45:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 06:45:15.848873 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 06:45:15.849102 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 06:45:15.850424 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2193444300/tls.crt::/tmp/serving-cert-2193444300/tls.key\\\\\\\"\\\\nI0310 06:45:16.151765 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 06:45:16.162919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 06:45:16.162961 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 06:45:16.163018 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0310 06:45:16.163029 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 06:45:16.172887 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 06:45:16.172948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172959 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 06:45:16.172977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 06:45:16.172983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 06:45:16.172989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 06:45:16.172898 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 06:45:16.175833 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:46:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:35Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:35 crc kubenswrapper[4825]: I0310 06:46:35.341552 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6582d9a4-c4d1-4f0b-b0b2-0e0ae9678f64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a32345952f87cad97a94ac8654386fd6e5253db4e25da1c4b9d6de141fea7832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5da17071874543156df5adb8dee5973b11684f66e728934c3353e76eaad702e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:44:39Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 06:44:12.561772 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0310 06:44:12.564769 1 observer_polling.go:159] Starting file observer\\\\nI0310 06:44:12.609335 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 06:44:12.614802 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 06:44:39.542688 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 06:44:39.542886 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:38Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://311d715e54fdfe9fcdda7648d431fa69ca054e330fd9415f9c4819462508b9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebd7c7e7538a870d8bf580b8b01161607bae4c06941a4db2895dc1ee86f3b988\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f2bc1824721aecbf1494421ff4b33b65f910bb791a4eb7a591aa8eb455b958\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:35Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:35 crc kubenswrapper[4825]: I0310 06:46:35.361776 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:35Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:35 crc kubenswrapper[4825]: I0310 06:46:35.390652 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g445x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc29189-3e91-4d20-8d00-682a9431a8ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cfe45c4f271c015a502b451ffb62ae182be1227109f50815299d20c737c5f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8981
c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8981c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g445x\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:35Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:35 crc kubenswrapper[4825]: I0310 06:46:35.422263 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79ec9d89-dc71-4f36-9254-00bd86795e43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca803889cd06db9b70ad534a668658ae8d914f540bc74177d6a34726a8e1765e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca803889cd06db9b70ad534a668658ae8d914f540bc74177d6a34726a8e1765e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T06:46:12Z\\\",\\\"message\\\":\\\"om github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 06:46:12.855516 7040 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 06:46:12.855908 7040 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 06:46:12.855945 7040 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 06:46:12.856623 7040 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 06:46:12.856685 7040 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 06:46:12.856711 7040 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 06:46:12.856767 7040 factory.go:656] Stopping watch factory\\\\nI0310 06:46:12.856790 7040 ovnkube.go:599] Stopped ovnkube\\\\nI0310 06:46:12.856840 7040 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 06:46:12.856855 7040 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 06:46:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:46:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jhkb9_openshift-ovn-kubernetes(79ec9d89-dc71-4f36-9254-00bd86795e43)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46
b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jhkb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:35Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:35 crc kubenswrapper[4825]: I0310 06:46:35.443851 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8db1f9b1-6042-44a1-97b8-c0e9269f8ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8b02103b8486962e24ee333889dc83acc4379b367dc12c20ca11602879239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:35Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:35 crc kubenswrapper[4825]: I0310 06:46:35.468512 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b80a6aa275f1b641833c7bdc845f2def19b224d66671705d96c08b6eed3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:35Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:35 crc kubenswrapper[4825]: I0310 06:46:35.487241 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:35Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:35 crc kubenswrapper[4825]: I0310 06:46:35.509001 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ec1a7287733290068585115df1b16748bc415396154735f45a192543405254a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T06:46:35Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:35 crc kubenswrapper[4825]: I0310 06:46:35.528068 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgpc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdb08f12-1daf-4d35-940c-914e187ffda0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d002f7817bf6d5fdb95b7dc90c3eb10f4d949456ad0c472e1671f0ab97278b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-kgsbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgpc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:35Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:35 crc kubenswrapper[4825]: I0310 06:46:35.546488 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l6mg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68dff16f-0214-4421-934d-dcb6e9d1af28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7bb00502b5ab82cbb301089867ff883cbfa0b30fabb0e58451ae
feb0197810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8ztd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l6mg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:35Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:35 crc kubenswrapper[4825]: I0310 06:46:35.584234 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"743f3ca5-3ec0-4c5b-bad2-3e64df57da6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872997317afa0a2274e0b4ce87bdd13fad7a3291521411a8a208bfd53672163e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8afd573e70bc13a4b3d3539d466f0bc54b26139a06494ccb38d980c05f0974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de8e93ef3747888e32ccae04649a70ef2b3f1ad42133acc0137f78770c9720f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c380bb44c46b881451835d9616e2b3e76e790bbcaccdc979b9c39bb8abdc605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd6d39109c49743644fca4e67fd9e0db312e712adbfe3f7a8ae5a4f749c5945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:35Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:35 crc kubenswrapper[4825]: I0310 06:46:35.607226 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8072d518-c139-457a-8169-4cdd02faae0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac78bed04e5a12d272784fc14809731a05de5be9d952c39ac4ffaa40f8589ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46d0e4f89d7c516f3c7534ab4f6edad77bce567d389d9394d2b57425ccffdde7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2487dfed5a84ab11961ba04d4360e5d0e211ddc42e7e88b3b2ad04ae3a9d34e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32bdd291a7cfb7e9cc1a02cdd529f56074dae35c2c242dee8461075541c7edf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bdd291a7cfb7e9cc1a02cdd529f56074dae35c2c242dee8461075541c7edf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:35Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:35 crc kubenswrapper[4825]: I0310 06:46:35.633674 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:35Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:35 crc kubenswrapper[4825]: I0310 06:46:35.655427 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rhprr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5e4a4c2-fcf8-42e2-a74b-50e89997ca17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc155a03c62e503236ad80c700aea4abf8ca948b38959b5f99df6965a5ad5bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k58v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab5785d3d924225a531d92fc17a9efa444710
e8e4f2e75f4308adcd24e80d8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k58v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rhprr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:35Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:35 crc kubenswrapper[4825]: I0310 06:46:35.674968 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pj5dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"114672c5-c1d0-4f87-b3aa-fb6d8535ffeb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnbd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnbd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:46:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pj5dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:35Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:36 crc 
kubenswrapper[4825]: I0310 06:46:36.235697 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:46:36 crc kubenswrapper[4825]: I0310 06:46:36.235712 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:46:36 crc kubenswrapper[4825]: E0310 06:46:36.235922 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb" Mar 10 06:46:36 crc kubenswrapper[4825]: I0310 06:46:36.235712 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:46:36 crc kubenswrapper[4825]: E0310 06:46:36.236050 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:46:36 crc kubenswrapper[4825]: E0310 06:46:36.236255 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:46:37 crc kubenswrapper[4825]: I0310 06:46:37.235510 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:46:37 crc kubenswrapper[4825]: E0310 06:46:37.236001 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:46:38 crc kubenswrapper[4825]: I0310 06:46:38.235418 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:46:38 crc kubenswrapper[4825]: I0310 06:46:38.235631 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:46:38 crc kubenswrapper[4825]: I0310 06:46:38.235697 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:46:38 crc kubenswrapper[4825]: E0310 06:46:38.235817 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:46:38 crc kubenswrapper[4825]: E0310 06:46:38.235988 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:46:38 crc kubenswrapper[4825]: E0310 06:46:38.236250 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb" Mar 10 06:46:39 crc kubenswrapper[4825]: I0310 06:46:39.235948 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:46:39 crc kubenswrapper[4825]: E0310 06:46:39.236122 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:46:39 crc kubenswrapper[4825]: I0310 06:46:39.237398 4825 scope.go:117] "RemoveContainer" containerID="ca803889cd06db9b70ad534a668658ae8d914f540bc74177d6a34726a8e1765e" Mar 10 06:46:39 crc kubenswrapper[4825]: I0310 06:46:39.267776 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"743f3ca5-3ec0-4c5b-bad2-3e64df57da6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872997317afa0a2274e0b4ce87bdd13fad7a3291521411a8a208bfd53672163e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kube
rnetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8afd573e70bc13a4b3d3539d466f0bc54b26139a06494ccb38d980c05f0974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de8e93ef3747888e32ccae04649a70ef2b3f1ad42133acc0137f78770c9720f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1
c380bb44c46b881451835d9616e2b3e76e790bbcaccdc979b9c39bb8abdc605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd6d39109c49743644fca4e67fd9e0db312e712adbfe3f7a8ae5a4f749c5945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c687744
1ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:39Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:39 crc kubenswrapper[4825]: I0310 06:46:39.286397 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8072d518-c139-457a-8169-4cdd02faae0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac78bed04e5a12d272784fc14809731a05de5be9d952c39ac4ffaa40f8589ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46d0e4f89d7c516f3c7534ab4f6edad77bce567d389d9394d2b57425ccffdde7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2487dfed5a84ab11961ba04d4360e5d0e211ddc42e7e88b3b2ad04ae3a9d34e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32bdd291a7cfb7e9cc1a02cdd529f56074dae35c2c242dee8461075541c7edf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://32bdd291a7cfb7e9cc1a02cdd529f56074dae35c2c242dee8461075541c7edf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:39Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:39 crc kubenswrapper[4825]: I0310 06:46:39.305016 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:39Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:39 crc kubenswrapper[4825]: I0310 06:46:39.317620 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rhprr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5e4a4c2-fcf8-42e2-a74b-50e89997ca17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc155a03c62e503236ad80c700aea4abf8ca948b38959b5f99df6965a5ad5bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k58v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab5785d3d924225a531d92fc17a9efa444710
e8e4f2e75f4308adcd24e80d8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k58v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rhprr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:39Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:39 crc kubenswrapper[4825]: I0310 06:46:39.334964 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pj5dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"114672c5-c1d0-4f87-b3aa-fb6d8535ffeb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnbd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnbd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:46:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pj5dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:39Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:39 crc 
kubenswrapper[4825]: I0310 06:46:39.355340 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c0cbc1e1c16536c1e26b0c607f0966b3749a7357e4cd130abd24d53903daacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b86d021111582d0c9ef202b910fe8c744a3cb1a80dd163dd41125a47decec8e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:39Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:39 crc kubenswrapper[4825]: I0310 06:46:39.381062 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9beb5814-89d0-47c0-8b0e-24376a358fc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf9d4029544f4f9c869193aedbe1791dd7602dfb6e86bec4792ce01cf7c6f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bb20a12b63dd94732d6f3413fa4c14b8931a2d
82e49f45769c15a14202fe1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bvt9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:39Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:39 crc kubenswrapper[4825]: I0310 06:46:39.402778 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dkbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"165351e4-3c96-4a68-8c75-43b001b0ec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56708571d26121728c9ab436b6bd8aac9945e7b28e3f4c5d12035fa9809c70ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://420d0f058d1a0590e8feaa3f7f4dc37f6f0b91274142ca8ffd6b92c08ea24c1c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T06:46:33Z\\\",\\\"message\\\":\\\"2026-03-10T06:45:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d188b728-454b-4c7a-882d-65d49163d93c\\\\n2026-03-10T06:45:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d188b728-454b-4c7a-882d-65d49163d93c to /host/opt/cni/bin/\\\\n2026-03-10T06:45:48Z [verbose] multus-daemon started\\\\n2026-03-10T06:45:48Z [verbose] 
Readiness Indicator file check\\\\n2026-03-10T06:46:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5xgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dkbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:39Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:39 crc kubenswrapper[4825]: I0310 06:46:39.425986 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"468bfefa-2636-4ab4-b62d-2c5c738eb872\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-s
yncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf92f28cf551cde58efe622659d4395d895a1129ac99b065975c991364f03a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:45:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 06:45:15.848873 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 06:45:15.849102 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 06:45:15.850424 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2193444300/tls.crt::/tmp/serving-cert-2193444300/tls.key\\\\\\\"\\\\nI0310 06:45:16.151765 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 06:45:16.162919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 06:45:16.162961 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 06:45:16.163018 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0310 06:45:16.163029 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 06:45:16.172887 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 06:45:16.172948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172959 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 06:45:16.172977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 06:45:16.172983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 06:45:16.172989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 06:45:16.172898 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 06:45:16.175833 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:46:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:39Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:39 crc kubenswrapper[4825]: I0310 06:46:39.440426 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6582d9a4-c4d1-4f0b-b0b2-0e0ae9678f64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a32345952f87cad97a94ac8654386fd6e5253db4e25da1c4b9d6de141fea7832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5da17071874543156df5adb8dee5973b11684f66e728934c3353e76eaad702e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:44:39Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 06:44:12.561772 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0310 06:44:12.564769 1 observer_polling.go:159] Starting file observer\\\\nI0310 06:44:12.609335 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 06:44:12.614802 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 06:44:39.542688 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 06:44:39.542886 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:38Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://311d715e54fdfe9fcdda7648d431fa69ca054e330fd9415f9c4819462508b9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebd7c7e7538a870d8bf580b8b01161607bae4c06941a4db2895dc1ee86f3b988\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f2bc1824721aecbf1494421ff4b33b65f910bb791a4eb7a591aa8eb455b958\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:39Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:39 crc kubenswrapper[4825]: I0310 06:46:39.457316 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:39Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:39 crc kubenswrapper[4825]: I0310 06:46:39.474642 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g445x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc29189-3e91-4d20-8d00-682a9431a8ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cfe45c4f271c015a502b451ffb62ae182be1227109f50815299d20c737c5f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8981
c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8981c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g445x\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:39Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:39 crc kubenswrapper[4825]: I0310 06:46:39.500382 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79ec9d89-dc71-4f36-9254-00bd86795e43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca803889cd06db9b70ad534a668658ae8d914f540bc74177d6a34726a8e1765e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca803889cd06db9b70ad534a668658ae8d914f540bc74177d6a34726a8e1765e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T06:46:12Z\\\",\\\"message\\\":\\\"om github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 06:46:12.855516 7040 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 06:46:12.855908 7040 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 06:46:12.855945 7040 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 06:46:12.856623 7040 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 06:46:12.856685 7040 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 06:46:12.856711 7040 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 06:46:12.856767 7040 factory.go:656] Stopping watch factory\\\\nI0310 06:46:12.856790 7040 ovnkube.go:599] Stopped ovnkube\\\\nI0310 06:46:12.856840 7040 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 06:46:12.856855 7040 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 06:46:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:46:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jhkb9_openshift-ovn-kubernetes(79ec9d89-dc71-4f36-9254-00bd86795e43)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46
b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jhkb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:39Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:39 crc kubenswrapper[4825]: E0310 06:46:39.504598 4825 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 06:46:39 crc kubenswrapper[4825]: I0310 06:46:39.521575 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8db1f9b1-6042-44a1-97b8-c0e9269f8ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8b02103b8486962e24ee333889dc83acc4379b367dc12c20ca11602879239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:39Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:39 crc kubenswrapper[4825]: I0310 06:46:39.539689 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b80a6aa275f1b641833c7bdc845f2def19b224d66671705d96c08b6eed3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:39Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:39 crc kubenswrapper[4825]: I0310 06:46:39.554685 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:39Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:39 crc kubenswrapper[4825]: I0310 06:46:39.569980 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ec1a7287733290068585115df1b16748bc415396154735f45a192543405254a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T06:46:39Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:39 crc kubenswrapper[4825]: I0310 06:46:39.583782 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgpc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdb08f12-1daf-4d35-940c-914e187ffda0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d002f7817bf6d5fdb95b7dc90c3eb10f4d949456ad0c472e1671f0ab97278b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-kgsbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgpc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:39Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:39 crc kubenswrapper[4825]: I0310 06:46:39.600636 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l6mg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68dff16f-0214-4421-934d-dcb6e9d1af28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7bb00502b5ab82cbb301089867ff883cbfa0b30fabb0e58451ae
feb0197810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8ztd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l6mg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:39Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:40 crc kubenswrapper[4825]: I0310 06:46:40.236516 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:46:40 crc kubenswrapper[4825]: I0310 06:46:40.236623 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:46:40 crc kubenswrapper[4825]: I0310 06:46:40.236544 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:46:40 crc kubenswrapper[4825]: E0310 06:46:40.236799 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb" Mar 10 06:46:40 crc kubenswrapper[4825]: E0310 06:46:40.237002 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:46:40 crc kubenswrapper[4825]: E0310 06:46:40.237318 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:46:40 crc kubenswrapper[4825]: I0310 06:46:40.250043 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jhkb9_79ec9d89-dc71-4f36-9254-00bd86795e43/ovnkube-controller/2.log" Mar 10 06:46:40 crc kubenswrapper[4825]: I0310 06:46:40.254580 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" event={"ID":"79ec9d89-dc71-4f36-9254-00bd86795e43","Type":"ContainerStarted","Data":"73fa7532ee0deedea4b9cafa99e07402d830e8e36b82f23918932208aeca220d"} Mar 10 06:46:40 crc kubenswrapper[4825]: I0310 06:46:40.255357 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:46:40 crc kubenswrapper[4825]: I0310 06:46:40.277879 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c0cbc1e1c16536c1e26b0c607f0966b3749a7357e4cd130abd24d53903daacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b86d021111582d0c9ef202b910fe8c744a3cb1a80dd163dd41125a47decec8e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:40Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:40 crc kubenswrapper[4825]: I0310 06:46:40.301671 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9beb5814-89d0-47c0-8b0e-24376a358fc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf9d4029544f4f9c869193aedbe1791dd7602dfb6e86bec4792ce01cf7c6f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bb20a12b63dd94732d6f3413fa4c14b8931a2d
82e49f45769c15a14202fe1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bvt9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:40Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:40 crc kubenswrapper[4825]: I0310 06:46:40.325310 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dkbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"165351e4-3c96-4a68-8c75-43b001b0ec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56708571d26121728c9ab436b6bd8aac9945e7b28e3f4c5d12035fa9809c70ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://420d0f058d1a0590e8feaa3f7f4dc37f6f0b91274142ca8ffd6b92c08ea24c1c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T06:46:33Z\\\",\\\"message\\\":\\\"2026-03-10T06:45:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d188b728-454b-4c7a-882d-65d49163d93c\\\\n2026-03-10T06:45:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d188b728-454b-4c7a-882d-65d49163d93c to /host/opt/cni/bin/\\\\n2026-03-10T06:45:48Z [verbose] multus-daemon started\\\\n2026-03-10T06:45:48Z [verbose] 
Readiness Indicator file check\\\\n2026-03-10T06:46:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5xgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dkbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:40Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:40 crc kubenswrapper[4825]: I0310 06:46:40.359473 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79ec9d89-dc71-4f36-9254-00bd86795e43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73fa7532ee0deedea4b9cafa99e07402d830e8e36b82f23918932208aeca220d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca803889cd06db9b70ad534a668658ae8d914f540bc74177d6a34726a8e1765e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T06:46:12Z\\\",\\\"message\\\":\\\"om 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 06:46:12.855516 7040 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 06:46:12.855908 7040 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 06:46:12.855945 7040 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 06:46:12.856623 7040 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 06:46:12.856685 7040 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 06:46:12.856711 7040 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 06:46:12.856767 7040 factory.go:656] Stopping watch factory\\\\nI0310 06:46:12.856790 7040 ovnkube.go:599] Stopped ovnkube\\\\nI0310 06:46:12.856840 7040 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 06:46:12.856855 7040 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 
06:46:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:46:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jhkb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:40Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:40 crc kubenswrapper[4825]: I0310 06:46:40.385514 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"468bfefa-2636-4ab4-b62d-2c5c738eb872\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf92f28cf551cde58efe622659d4395d895a1129ac99b065975c991364f03a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:45:16Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 06:45:15.848873 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 06:45:15.849102 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 06:45:15.850424 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2193444300/tls.crt::/tmp/serving-cert-2193444300/tls.key\\\\\\\"\\\\nI0310 06:45:16.151765 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 06:45:16.162919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 06:45:16.162961 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 06:45:16.163018 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 06:45:16.163029 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 06:45:16.172887 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 06:45:16.172948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172959 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 06:45:16.172977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 06:45:16.172983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 06:45:16.172989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 06:45:16.172898 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0310 06:45:16.175833 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:46:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9
cfb37d6b8731d18419442294f8ccddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:40Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:40 crc kubenswrapper[4825]: I0310 06:46:40.408822 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6582d9a4-c4d1-4f0b-b0b2-0e0ae9678f64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a32345952f87cad97a94ac8654386fd6e5253db4e25da1c4b9d6de141fea7832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5da17071874543156df5adb8dee5973b11684f66e728934c3353e76eaad702e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:44:39Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 06:44:12.561772 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 06:44:12.564769 1 observer_polling.go:159] Starting file observer\\\\nI0310 06:44:12.609335 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 06:44:12.614802 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 06:44:39.542688 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 06:44:39.542886 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:38Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://311d715e54fdfe9fcdda7648d431fa69ca054e330fd9415f9c4819462508b9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebd7c7e7538a870d8bf580b8b01161607bae4c06941a4db2895dc1ee86f3b988\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f2bc1824721aecbf1494421ff4b33b65f910bb791a4eb7a591aa8eb455b958\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:40Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:40 crc kubenswrapper[4825]: I0310 06:46:40.429466 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:40Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:40 crc kubenswrapper[4825]: I0310 06:46:40.453765 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g445x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc29189-3e91-4d20-8d00-682a9431a8ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cfe45c4f271c015a502b451ffb62ae182be1227109f50815299d20c737c5f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8981
c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8981c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g445x\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:40Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:40 crc kubenswrapper[4825]: I0310 06:46:40.470562 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgpc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdb08f12-1daf-4d35-940c-914e187ffda0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d002f7817bf6d5fdb95b7dc90c3eb10f4d949456ad0c472e1671f0ab97278b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-10T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgsbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgpc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:40Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:40 crc kubenswrapper[4825]: I0310 06:46:40.487434 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l6mg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"68dff16f-0214-4421-934d-dcb6e9d1af28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7bb00502b5ab82cbb301089867ff883cbfa0b30fabb0e58451aefeb0197810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8ztd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l6mg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:40Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:40 crc kubenswrapper[4825]: I0310 06:46:40.506283 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8db1f9b1-6042-44a1-97b8-c0e9269f8ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8b02103b8486962e24ee333889dc83acc4379b367dc12c20ca11602879239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:40Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:40 crc 
kubenswrapper[4825]: I0310 06:46:40.521870 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b80a6aa275f1b641833c7bdc845f2def19b224d66671705d96c08b6eed3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:40Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:40 crc kubenswrapper[4825]: I0310 06:46:40.537579 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:40Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:40 crc kubenswrapper[4825]: I0310 06:46:40.553625 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ec1a7287733290068585115df1b16748bc415396154735f45a192543405254a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T06:46:40Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:40 crc kubenswrapper[4825]: I0310 06:46:40.569476 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pj5dl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"114672c5-c1d0-4f87-b3aa-fb6d8535ffeb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnbd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnbd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:46:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pj5dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:40Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:40 crc 
kubenswrapper[4825]: I0310 06:46:40.602632 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"743f3ca5-3ec0-4c5b-bad2-3e64df57da6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872997317afa0a2274e0b4ce87bdd13fad7a3291521411a8a208bfd53672163e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://ce8afd573e70bc13a4b3d3539d466f0bc54b26139a06494ccb38d980c05f0974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de8e93ef3747888e32ccae04649a70ef2b3f1ad42133acc0137f78770c9720f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c380bb44c46b881451835d9616e2b3e76e790bbcaccdc979b9c39bb8abdc605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd6d39109c49743644fca4e67fd9e0db312e712adbfe3f7a8ae5a4f749c5945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:40Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:40 crc kubenswrapper[4825]: I0310 06:46:40.621101 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8072d518-c139-457a-8169-4cdd02faae0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac78bed04e5a12d272784fc14809731a05de5be9d952c39ac4ffaa40f8589ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46d0e4f89d7c516f3c7534ab4f6edad77bce567d389d9394d2b57425ccffdde7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2487dfed5a84ab11961ba04d4360e5d0e211ddc42e7e88b3b2ad04ae3a9d34e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32bdd291a7cfb7e9cc1a02cdd529f56074dae35c2c242dee8461075541c7edf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://32bdd291a7cfb7e9cc1a02cdd529f56074dae35c2c242dee8461075541c7edf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:40Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:40 crc kubenswrapper[4825]: I0310 06:46:40.639765 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:40Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:40 crc kubenswrapper[4825]: I0310 06:46:40.655693 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rhprr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5e4a4c2-fcf8-42e2-a74b-50e89997ca17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc155a03c62e503236ad80c700aea4abf8ca948b38959b5f99df6965a5ad5bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k58v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab5785d3d924225a531d92fc17a9efa444710
e8e4f2e75f4308adcd24e80d8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k58v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rhprr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:40Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:41 crc kubenswrapper[4825]: I0310 06:46:41.236323 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:46:41 crc kubenswrapper[4825]: E0310 06:46:41.236593 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:46:41 crc kubenswrapper[4825]: I0310 06:46:41.261882 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jhkb9_79ec9d89-dc71-4f36-9254-00bd86795e43/ovnkube-controller/3.log" Mar 10 06:46:41 crc kubenswrapper[4825]: I0310 06:46:41.263086 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jhkb9_79ec9d89-dc71-4f36-9254-00bd86795e43/ovnkube-controller/2.log" Mar 10 06:46:41 crc kubenswrapper[4825]: I0310 06:46:41.268124 4825 generic.go:334] "Generic (PLEG): container finished" podID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerID="73fa7532ee0deedea4b9cafa99e07402d830e8e36b82f23918932208aeca220d" exitCode=1 Mar 10 06:46:41 crc kubenswrapper[4825]: I0310 06:46:41.268221 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" event={"ID":"79ec9d89-dc71-4f36-9254-00bd86795e43","Type":"ContainerDied","Data":"73fa7532ee0deedea4b9cafa99e07402d830e8e36b82f23918932208aeca220d"} Mar 10 06:46:41 crc kubenswrapper[4825]: I0310 06:46:41.268284 4825 scope.go:117] "RemoveContainer" containerID="ca803889cd06db9b70ad534a668658ae8d914f540bc74177d6a34726a8e1765e" Mar 10 06:46:41 crc kubenswrapper[4825]: I0310 06:46:41.269551 4825 scope.go:117] "RemoveContainer" containerID="73fa7532ee0deedea4b9cafa99e07402d830e8e36b82f23918932208aeca220d" Mar 10 06:46:41 
crc kubenswrapper[4825]: E0310 06:46:41.269833 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-jhkb9_openshift-ovn-kubernetes(79ec9d89-dc71-4f36-9254-00bd86795e43)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" Mar 10 06:46:41 crc kubenswrapper[4825]: I0310 06:46:41.310327 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"743f3ca5-3ec0-4c5b-bad2-3e64df57da6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872997317afa0a2274e0b4ce87bdd13fad7a3291521411a8a208bfd53672163e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8afd573e70bc13a4b3d3539d466f0bc54b26139a06494ccb38d980c05f0974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de8e93ef3747888e32ccae04649a70ef2b3f1ad42133acc0137f78770c9720f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-di
r\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c380bb44c46b881451835d9616e2b3e76e790bbcaccdc979b9c39bb8abdc605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd6d39109c49743644fca4e67fd9e0db312e712adbfe3f7a8ae5a4f749c5945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8
678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a673
14731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:41Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:41 crc kubenswrapper[4825]: I0310 06:46:41.332863 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8072d518-c139-457a-8169-4cdd02faae0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac78bed04e5a12d272784fc14809731a05de5be9d952c39ac4ffaa40f8589ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46d0e4f89d7c516f3c7534ab4f6edad77bce567d389d9394d2b57425ccffdde7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2487dfed5a84ab11961ba04d4360e5d0e211ddc42e7e88b3b2ad04ae3a9d34e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32bdd291a7cfb7e9cc1a02cdd529f56074dae35c2c242dee8461075541c7edf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://32bdd291a7cfb7e9cc1a02cdd529f56074dae35c2c242dee8461075541c7edf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:41Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:41 crc kubenswrapper[4825]: I0310 06:46:41.355067 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:41Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:41 crc kubenswrapper[4825]: I0310 06:46:41.375956 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rhprr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5e4a4c2-fcf8-42e2-a74b-50e89997ca17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc155a03c62e503236ad80c700aea4abf8ca948b38959b5f99df6965a5ad5bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k58v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab5785d3d924225a531d92fc17a9efa444710
e8e4f2e75f4308adcd24e80d8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k58v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rhprr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:41Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:41 crc kubenswrapper[4825]: I0310 06:46:41.395451 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pj5dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"114672c5-c1d0-4f87-b3aa-fb6d8535ffeb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnbd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnbd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:46:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pj5dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:41Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:41 crc 
kubenswrapper[4825]: I0310 06:46:41.416650 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c0cbc1e1c16536c1e26b0c607f0966b3749a7357e4cd130abd24d53903daacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b86d021111582d0c9ef202b910fe8c744a3cb1a80dd163dd41125a47decec8e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:41Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:41 crc kubenswrapper[4825]: I0310 06:46:41.437998 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9beb5814-89d0-47c0-8b0e-24376a358fc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf9d4029544f4f9c869193aedbe1791dd7602dfb6e86bec4792ce01cf7c6f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bb20a12b63dd94732d6f3413fa4c14b8931a2d
82e49f45769c15a14202fe1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bvt9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:41Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:41 crc kubenswrapper[4825]: I0310 06:46:41.457945 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dkbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"165351e4-3c96-4a68-8c75-43b001b0ec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56708571d26121728c9ab436b6bd8aac9945e7b28e3f4c5d12035fa9809c70ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://420d0f058d1a0590e8feaa3f7f4dc37f6f0b91274142ca8ffd6b92c08ea24c1c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T06:46:33Z\\\",\\\"message\\\":\\\"2026-03-10T06:45:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d188b728-454b-4c7a-882d-65d49163d93c\\\\n2026-03-10T06:45:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d188b728-454b-4c7a-882d-65d49163d93c to /host/opt/cni/bin/\\\\n2026-03-10T06:45:48Z [verbose] multus-daemon started\\\\n2026-03-10T06:45:48Z [verbose] 
Readiness Indicator file check\\\\n2026-03-10T06:46:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5xgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dkbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:41Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:41 crc kubenswrapper[4825]: I0310 06:46:41.481606 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"468bfefa-2636-4ab4-b62d-2c5c738eb872\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-s
yncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf92f28cf551cde58efe622659d4395d895a1129ac99b065975c991364f03a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:45:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 06:45:15.848873 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 06:45:15.849102 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 06:45:15.850424 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2193444300/tls.crt::/tmp/serving-cert-2193444300/tls.key\\\\\\\"\\\\nI0310 06:45:16.151765 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 06:45:16.162919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 06:45:16.162961 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 06:45:16.163018 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0310 06:45:16.163029 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 06:45:16.172887 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 06:45:16.172948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172959 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 06:45:16.172977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 06:45:16.172983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 06:45:16.172989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 06:45:16.172898 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 06:45:16.175833 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:46:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:41Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:41 crc kubenswrapper[4825]: I0310 06:46:41.502450 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6582d9a4-c4d1-4f0b-b0b2-0e0ae9678f64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a32345952f87cad97a94ac8654386fd6e5253db4e25da1c4b9d6de141fea7832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5da17071874543156df5adb8dee5973b11684f66e728934c3353e76eaad702e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:44:39Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 06:44:12.561772 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0310 06:44:12.564769 1 observer_polling.go:159] Starting file observer\\\\nI0310 06:44:12.609335 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 06:44:12.614802 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 06:44:39.542688 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 06:44:39.542886 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:38Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://311d715e54fdfe9fcdda7648d431fa69ca054e330fd9415f9c4819462508b9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebd7c7e7538a870d8bf580b8b01161607bae4c06941a4db2895dc1ee86f3b988\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f2bc1824721aecbf1494421ff4b33b65f910bb791a4eb7a591aa8eb455b958\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:41Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:41 crc kubenswrapper[4825]: I0310 06:46:41.521781 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:41Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:41 crc kubenswrapper[4825]: I0310 06:46:41.546049 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g445x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc29189-3e91-4d20-8d00-682a9431a8ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cfe45c4f271c015a502b451ffb62ae182be1227109f50815299d20c737c5f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8981
c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8981c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g445x\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:41Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:41 crc kubenswrapper[4825]: I0310 06:46:41.587002 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79ec9d89-dc71-4f36-9254-00bd86795e43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73fa7532ee0deedea4b9cafa99e07402d830e8e36b82f23918932208aeca220d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca803889cd06db9b70ad534a668658ae8d914f540bc74177d6a34726a8e1765e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T06:46:12Z\\\",\\\"message\\\":\\\"om github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 06:46:12.855516 7040 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 06:46:12.855908 7040 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 06:46:12.855945 7040 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 06:46:12.856623 7040 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 06:46:12.856685 7040 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 06:46:12.856711 7040 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 06:46:12.856767 7040 factory.go:656] Stopping watch factory\\\\nI0310 06:46:12.856790 7040 ovnkube.go:599] Stopped ovnkube\\\\nI0310 06:46:12.856840 7040 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 06:46:12.856855 7040 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 06:46:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:46:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73fa7532ee0deedea4b9cafa99e07402d830e8e36b82f23918932208aeca220d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T06:46:40Z\\\",\\\"message\\\":\\\"06:46:40.237939 7337 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0310 06:46:40.238215 7337 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 06:46:40.238264 7337 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 
06:46:40.238273 7337 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 06:46:40.238335 7337 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 06:46:40.238359 7337 factory.go:656] Stopping watch factory\\\\nI0310 06:46:40.238382 7337 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 06:46:40.238396 7337 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 06:46:40.238403 7337 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 06:46:40.238411 7337 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 06:46:40.238418 7337 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0310 06:46:40.238740 7337 ovnkube.go:599] Stopped ovnkube\\\\nI0310 06:46:40.238785 7337 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 06:46:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\
",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\"
:\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jhkb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:41Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:41 crc kubenswrapper[4825]: I0310 
06:46:41.605363 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l6mg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68dff16f-0214-4421-934d-dcb6e9d1af28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7bb00502b5ab82cbb301089867ff883cbfa0b30fabb0e58451aefeb0197810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8ztd\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l6mg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:41Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:41 crc kubenswrapper[4825]: I0310 06:46:41.621967 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8db1f9b1-6042-44a1-97b8-c0e9269f8ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8b02103b8486962e24ee333889dc83acc4379b367dc12c20ca11602879239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d79342
6f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:41Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:41 crc kubenswrapper[4825]: I0310 06:46:41.645272 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b80a6aa275f1b641833c7bdc845f2def19b224d66671705d96c08b6eed3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:41Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:41 crc kubenswrapper[4825]: I0310 06:46:41.666961 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:41Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:41 crc kubenswrapper[4825]: I0310 06:46:41.685282 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ec1a7287733290068585115df1b16748bc415396154735f45a192543405254a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T06:46:41Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:41 crc kubenswrapper[4825]: I0310 06:46:41.698679 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgpc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdb08f12-1daf-4d35-940c-914e187ffda0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d002f7817bf6d5fdb95b7dc90c3eb10f4d949456ad0c472e1671f0ab97278b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-kgsbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgpc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:41Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.236088 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.236115 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:46:42 crc kubenswrapper[4825]: E0310 06:46:42.236563 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb" Mar 10 06:46:42 crc kubenswrapper[4825]: E0310 06:46:42.236823 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.236342 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:46:42 crc kubenswrapper[4825]: E0310 06:46:42.237494 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.276337 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jhkb9_79ec9d89-dc71-4f36-9254-00bd86795e43/ovnkube-controller/3.log" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.282627 4825 scope.go:117] "RemoveContainer" containerID="73fa7532ee0deedea4b9cafa99e07402d830e8e36b82f23918932208aeca220d" Mar 10 06:46:42 crc kubenswrapper[4825]: E0310 06:46:42.282959 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-jhkb9_openshift-ovn-kubernetes(79ec9d89-dc71-4f36-9254-00bd86795e43)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.306183 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"468bfefa-2636-4ab4-b62d-2c5c738eb872\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddf92f28cf551cde58efe622659d4395d895a1129ac99b065975c991364f03a0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:45:16Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 06:45:15.848873 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 06:45:15.849102 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 06:45:15.850424 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2193444300/tls.crt::/tmp/serving-cert-2193444300/tls.key\\\\\\\"\\\\nI0310 06:45:16.151765 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 06:45:16.162919 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 06:45:16.162961 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 06:45:16.163018 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 06:45:16.163029 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 06:45:16.172887 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 06:45:16.172948 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172959 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 06:45:16.172968 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 06:45:16.172977 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 06:45:16.172983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 06:45:16.172989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 06:45:16.172898 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0310 06:45:16.175833 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:46:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da163ca8e501eeae7376055d584875ee9
cfb37d6b8731d18419442294f8ccddc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.324765 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6582d9a4-c4d1-4f0b-b0b2-0e0ae9678f64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a32345952f87cad97a94ac8654386fd6e5253db4e25da1c4b9d6de141fea7832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5da17071874543156df5adb8dee5973b11684f66e728934c3353e76eaad702e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T06:44:39Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 06:44:12.561772 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 06:44:12.564769 1 observer_polling.go:159] Starting file observer\\\\nI0310 06:44:12.609335 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 06:44:12.614802 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 06:44:39.542688 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 06:44:39.542886 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:44:38Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://311d715e54fdfe9fcdda7648d431fa69ca054e330fd9415f9c4819462508b9ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebd7c7e7538a870d8bf580b8b01161607bae4c06941a4db2895dc1ee86f3b988\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f2bc1824721aecbf1494421ff4b33b65f910bb791a4eb7a591aa8eb455b958\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.341209 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.368333 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-g445x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0fc29189-3e91-4d20-8d00-682a9431a8ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cfe45c4f271c015a502b451ffb62ae182be1227109f50815299d20c737c5f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b527f273ba68810b5efbb69bf47ddb7949370ede5703d33bcc091f2ff33f0b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87ec0ac5e27e803d9b0a7b78b07f569e08099fa2e5eaa7c1522ac6636b4afbf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ab3459291e64b4d0cd837afa72ea89b9ff496bdbeb17a863051b4338e64583\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8981
c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8981c88f96d23b1be78bb1fdcb4dd905eb0607abd278fb8f5d6e1d2674bf70b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f734c26dadc7da29326b238402bd448482fd0cafd80fd535f222034a67139ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ece8762f7351aa6510b7035a11c14eeb439a531abfdb6865d6d68195fe89e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gts5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-g445x\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.398727 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79ec9d89-dc71-4f36-9254-00bd86795e43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73fa7532ee0deedea4b9cafa99e07402d830e8e36b82f23918932208aeca220d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73fa7532ee0deedea4b9cafa99e07402d830e8e36b82f23918932208aeca220d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T06:46:40Z\\\",\\\"message\\\":\\\"06:46:40.237939 7337 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0310 06:46:40.238215 7337 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 06:46:40.238264 7337 
handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 06:46:40.238273 7337 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 06:46:40.238335 7337 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 06:46:40.238359 7337 factory.go:656] Stopping watch factory\\\\nI0310 06:46:40.238382 7337 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 06:46:40.238396 7337 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 06:46:40.238403 7337 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 06:46:40.238411 7337 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 06:46:40.238418 7337 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0310 06:46:40.238740 7337 ovnkube.go:599] Stopped ovnkube\\\\nI0310 06:46:40.238785 7337 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 06:46:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:46:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jhkb9_openshift-ovn-kubernetes(79ec9d89-dc71-4f36-9254-00bd86795e43)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ba4509d2279abba46
b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvxrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jhkb9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.414817 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7l6mg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"68dff16f-0214-4421-934d-dcb6e9d1af28\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df7bb00502b5ab82cbb301089867ff883cbfa0b30fabb0e58451aefeb0197810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8ztd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7l6mg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.428290 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8db1f9b1-6042-44a1-97b8-c0e9269f8ab0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8b02103b8486962e24ee333889dc83acc4379b367dc12c20ca11602879239e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0e1e1d05ed83c602837199cd1d976261995e66fb041f3e974e173f271120c46\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:42 crc 
kubenswrapper[4825]: I0310 06:46:42.445093 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281b80a6aa275f1b641833c7bdc845f2def19b224d66671705d96c08b6eed3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.463913 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.485839 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ec1a7287733290068585115df1b16748bc415396154735f45a192543405254a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T06:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.503117 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pgpc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdb08f12-1daf-4d35-940c-914e187ffda0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d002f7817bf6d5fdb95b7dc90c3eb10f4d949456ad0c472e1671f0ab97278b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-kgsbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pgpc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.536906 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"743f3ca5-3ec0-4c5b-bad2-3e64df57da6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://872997317afa0a2274e0b4ce87bdd13fad7a3291521411a8a208bfd53672163e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68
77441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce8afd573e70bc13a4b3d3539d466f0bc54b26139a06494ccb38d980c05f0974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1de8e93ef3747888e32ccae04649a70ef2b3f1ad42133acc0137f78770c9720f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e7790
36cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c380bb44c46b881451835d9616e2b3e76e790bbcaccdc979b9c39bb8abdc605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd6d39109c49743644fca4e67fd9e0db312e712adbfe3f7a8ae5a4f749c5945\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/e
tc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d053208c165e9f8ba8678d8ee53e42268550e3d972e09de413c339f78b552a72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae9d3783a67ec396a0d09d23f029e37047abb9c3d5a28ef5a35ea9f559660ae8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69f
f51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4eaaed6033cc87c8cfdc69ff51a7a308a66916ae543d2a2e9ce7184b2718a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.557510 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8072d518-c139-457a-8169-4cdd02faae0a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:44:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac78bed04e5a12d272784fc14809731a05de5be9d952c39ac4ffaa40f8589ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46d0e4f89d7c516f3c7534ab4f6edad77bce567d389d9394d2b57425ccffdde7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2487dfed5a84ab11961ba04d4360e5d0e211ddc42e7e88b3b2ad04ae3a9d34e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:44:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32bdd291a7cfb7e9cc1a02cdd529f56074dae35c2c242dee8461075541c7edf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://32bdd291a7cfb7e9cc1a02cdd529f56074dae35c2c242dee8461075541c7edf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T06:44:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T06:44:11Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:44:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.580249 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.598657 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rhprr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5e4a4c2-fcf8-42e2-a74b-50e89997ca17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc155a03c62e503236ad80c700aea4abf8ca948b38959b5f99df6965a5ad5bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k58v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab5785d3d924225a531d92fc17a9efa444710
e8e4f2e75f4308adcd24e80d8fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k58v4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rhprr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.616303 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pj5dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"114672c5-c1d0-4f87-b3aa-fb6d8535ffeb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnbd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rnbd6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:46:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pj5dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:42 crc 
kubenswrapper[4825]: I0310 06:46:42.637223 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c0cbc1e1c16536c1e26b0c607f0966b3749a7357e4cd130abd24d53903daacb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b86d021111582d0c9ef202b910fe8c744a3cb1a80dd163dd41125a47decec8e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.655237 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9beb5814-89d0-47c0-8b0e-24376a358fc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adf9d4029544f4f9c869193aedbe1791dd7602dfb6e86bec4792ce01cf7c6f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88bb20a12b63dd94732d6f3413fa4c14b8931a2d
82e49f45769c15a14202fe1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88dxl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bvt9j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.677871 4825 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8dkbt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"165351e4-3c96-4a68-8c75-43b001b0ec60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56708571d26121728c9ab436b6bd8aac9945e7b28e3f4c5d12035fa9809c70ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://420d0f058d1a0590e8feaa3f7f4dc37f6f0b91274142ca8ffd6b92c08ea24c1c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T06:46:33Z\\\",\\\"message\\\":\\\"2026-03-10T06:45:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d188b728-454b-4c7a-882d-65d49163d93c\\\\n2026-03-10T06:45:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d188b728-454b-4c7a-882d-65d49163d93c to /host/opt/cni/bin/\\\\n2026-03-10T06:45:48Z [verbose] multus-daemon started\\\\n2026-03-10T06:45:48Z [verbose] 
Readiness Indicator file check\\\\n2026-03-10T06:46:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T06:45:47Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T06:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c5xgs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T06:45:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8dkbt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.774305 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.774361 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.774373 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.774393 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.774406 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:42Z","lastTransitionTime":"2026-03-10T06:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:42 crc kubenswrapper[4825]: E0310 06:46:42.795507 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bc7fa84-cf6d-4b6d-8f9a-118c306e0760\\\",\\\"systemUUID\\\":\\\"4821b499-aedc-4e42-be1d-4a415e5b2f81\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.800482 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.800546 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.800565 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.800592 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.800614 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:42Z","lastTransitionTime":"2026-03-10T06:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:42 crc kubenswrapper[4825]: E0310 06:46:42.822049 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bc7fa84-cf6d-4b6d-8f9a-118c306e0760\\\",\\\"systemUUID\\\":\\\"4821b499-aedc-4e42-be1d-4a415e5b2f81\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.828308 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.828412 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.828473 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.828503 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.828564 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:42Z","lastTransitionTime":"2026-03-10T06:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:42 crc kubenswrapper[4825]: E0310 06:46:42.849050 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bc7fa84-cf6d-4b6d-8f9a-118c306e0760\\\",\\\"systemUUID\\\":\\\"4821b499-aedc-4e42-be1d-4a415e5b2f81\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.854893 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.854954 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.854977 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.855009 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.855033 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:42Z","lastTransitionTime":"2026-03-10T06:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:42 crc kubenswrapper[4825]: E0310 06:46:42.875600 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bc7fa84-cf6d-4b6d-8f9a-118c306e0760\\\",\\\"systemUUID\\\":\\\"4821b499-aedc-4e42-be1d-4a415e5b2f81\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.880985 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.881044 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.881063 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.881091 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:42 crc kubenswrapper[4825]: I0310 06:46:42.881113 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:42Z","lastTransitionTime":"2026-03-10T06:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:42 crc kubenswrapper[4825]: E0310 06:46:42.901549 4825 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T06:46:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T06:46:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bc7fa84-cf6d-4b6d-8f9a-118c306e0760\\\",\\\"systemUUID\\\":\\\"4821b499-aedc-4e42-be1d-4a415e5b2f81\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T06:46:42Z is after 2025-08-24T17:21:41Z" Mar 10 06:46:42 crc kubenswrapper[4825]: E0310 06:46:42.901916 4825 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 06:46:43 crc kubenswrapper[4825]: I0310 06:46:43.235811 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:46:43 crc kubenswrapper[4825]: E0310 06:46:43.236042 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:46:44 crc kubenswrapper[4825]: I0310 06:46:44.235452 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:46:44 crc kubenswrapper[4825]: I0310 06:46:44.235509 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:46:44 crc kubenswrapper[4825]: I0310 06:46:44.235509 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:46:44 crc kubenswrapper[4825]: E0310 06:46:44.236167 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb" Mar 10 06:46:44 crc kubenswrapper[4825]: E0310 06:46:44.236253 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:46:44 crc kubenswrapper[4825]: E0310 06:46:44.236373 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:46:44 crc kubenswrapper[4825]: E0310 06:46:44.506356 4825 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Mar 10 06:46:45 crc kubenswrapper[4825]: I0310 06:46:45.235730 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:46:45 crc kubenswrapper[4825]: E0310 06:46:45.235939 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:46:46 crc kubenswrapper[4825]: I0310 06:46:46.236280 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:46:46 crc kubenswrapper[4825]: I0310 06:46:46.236296 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:46:46 crc kubenswrapper[4825]: E0310 06:46:46.236689 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:46:46 crc kubenswrapper[4825]: E0310 06:46:46.236758 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:46:46 crc kubenswrapper[4825]: I0310 06:46:46.236328 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:46:46 crc kubenswrapper[4825]: E0310 06:46:46.236847 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb" Mar 10 06:46:47 crc kubenswrapper[4825]: I0310 06:46:47.236306 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:46:47 crc kubenswrapper[4825]: E0310 06:46:47.236543 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:46:48 crc kubenswrapper[4825]: I0310 06:46:48.235453 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:46:48 crc kubenswrapper[4825]: I0310 06:46:48.235657 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:46:48 crc kubenswrapper[4825]: E0310 06:46:48.235701 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:46:48 crc kubenswrapper[4825]: I0310 06:46:48.235747 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:46:48 crc kubenswrapper[4825]: E0310 06:46:48.235959 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb" Mar 10 06:46:48 crc kubenswrapper[4825]: E0310 06:46:48.236176 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:46:49 crc kubenswrapper[4825]: I0310 06:46:49.236286 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:46:49 crc kubenswrapper[4825]: E0310 06:46:49.236510 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:46:49 crc kubenswrapper[4825]: I0310 06:46:49.300489 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rhprr" podStartSLOduration=108.300466785 podStartE2EDuration="1m48.300466785s" podCreationTimestamp="2026-03-10 06:45:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:46:49.299737385 +0000 UTC m=+162.329518030" watchObservedRunningTime="2026-03-10 06:46:49.300466785 +0000 UTC m=+162.330247410" Mar 10 06:46:49 crc kubenswrapper[4825]: I0310 06:46:49.365510 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=68.365470044 podStartE2EDuration="1m8.365470044s" podCreationTimestamp="2026-03-10 06:45:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:46:49.364745194 +0000 UTC m=+162.394525849" watchObservedRunningTime="2026-03-10 06:46:49.365470044 +0000 UTC m=+162.395250689" Mar 10 06:46:49 crc kubenswrapper[4825]: I0310 06:46:49.386226 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=21.386192645 
podStartE2EDuration="21.386192645s" podCreationTimestamp="2026-03-10 06:46:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:46:49.38530065 +0000 UTC m=+162.415081315" watchObservedRunningTime="2026-03-10 06:46:49.386192645 +0000 UTC m=+162.415973290" Mar 10 06:46:49 crc kubenswrapper[4825]: I0310 06:46:49.432891 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-8dkbt" podStartSLOduration=109.4328537 podStartE2EDuration="1m49.4328537s" podCreationTimestamp="2026-03-10 06:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:46:49.412368606 +0000 UTC m=+162.442149311" watchObservedRunningTime="2026-03-10 06:46:49.4328537 +0000 UTC m=+162.462634325" Mar 10 06:46:49 crc kubenswrapper[4825]: I0310 06:46:49.455557 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podStartSLOduration=109.455529684 podStartE2EDuration="1m49.455529684s" podCreationTimestamp="2026-03-10 06:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:46:49.453913979 +0000 UTC m=+162.483694614" watchObservedRunningTime="2026-03-10 06:46:49.455529684 +0000 UTC m=+162.485310309" Mar 10 06:46:49 crc kubenswrapper[4825]: I0310 06:46:49.503561 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-g445x" podStartSLOduration=109.503522795 podStartE2EDuration="1m49.503522795s" podCreationTimestamp="2026-03-10 06:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:46:49.502493297 +0000 UTC 
m=+162.532273952" watchObservedRunningTime="2026-03-10 06:46:49.503522795 +0000 UTC m=+162.533303450" Mar 10 06:46:49 crc kubenswrapper[4825]: E0310 06:46:49.507451 4825 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 06:46:49 crc kubenswrapper[4825]: I0310 06:46:49.593729 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=80.593696778 podStartE2EDuration="1m20.593696778s" podCreationTimestamp="2026-03-10 06:45:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:46:49.573926244 +0000 UTC m=+162.603706879" watchObservedRunningTime="2026-03-10 06:46:49.593696778 +0000 UTC m=+162.623477423" Mar 10 06:46:49 crc kubenswrapper[4825]: I0310 06:46:49.594888 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=27.59487837 podStartE2EDuration="27.59487837s" podCreationTimestamp="2026-03-10 06:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:46:49.592707911 +0000 UTC m=+162.622488526" watchObservedRunningTime="2026-03-10 06:46:49.59487837 +0000 UTC m=+162.624659025" Mar 10 06:46:49 crc kubenswrapper[4825]: I0310 06:46:49.675804 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-pgpc8" podStartSLOduration=109.675777698 podStartE2EDuration="1m49.675777698s" podCreationTimestamp="2026-03-10 06:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-10 06:46:49.674731299 +0000 UTC m=+162.704511914" watchObservedRunningTime="2026-03-10 06:46:49.675777698 +0000 UTC m=+162.705558313" Mar 10 06:46:49 crc kubenswrapper[4825]: I0310 06:46:49.698287 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-7l6mg" podStartSLOduration=109.698264037 podStartE2EDuration="1m49.698264037s" podCreationTimestamp="2026-03-10 06:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:46:49.688662272 +0000 UTC m=+162.718442887" watchObservedRunningTime="2026-03-10 06:46:49.698264037 +0000 UTC m=+162.728044652" Mar 10 06:46:49 crc kubenswrapper[4825]: I0310 06:46:49.713723 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=90.713708562 podStartE2EDuration="1m30.713708562s" podCreationTimestamp="2026-03-10 06:45:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:46:49.700102847 +0000 UTC m=+162.729883452" watchObservedRunningTime="2026-03-10 06:46:49.713708562 +0000 UTC m=+162.743489177" Mar 10 06:46:50 crc kubenswrapper[4825]: I0310 06:46:50.236244 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:46:50 crc kubenswrapper[4825]: I0310 06:46:50.236305 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:46:50 crc kubenswrapper[4825]: I0310 06:46:50.236353 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:46:50 crc kubenswrapper[4825]: E0310 06:46:50.236409 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb" Mar 10 06:46:50 crc kubenswrapper[4825]: E0310 06:46:50.236556 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:46:50 crc kubenswrapper[4825]: E0310 06:46:50.236758 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:46:51 crc kubenswrapper[4825]: I0310 06:46:51.235743 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:46:51 crc kubenswrapper[4825]: E0310 06:46:51.235978 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:46:52 crc kubenswrapper[4825]: I0310 06:46:52.236389 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:46:52 crc kubenswrapper[4825]: I0310 06:46:52.236564 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:46:52 crc kubenswrapper[4825]: I0310 06:46:52.236649 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:46:52 crc kubenswrapper[4825]: E0310 06:46:52.236641 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:46:52 crc kubenswrapper[4825]: E0310 06:46:52.236846 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:46:52 crc kubenswrapper[4825]: E0310 06:46:52.237015 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb" Mar 10 06:46:52 crc kubenswrapper[4825]: I0310 06:46:52.931916 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 06:46:52 crc kubenswrapper[4825]: I0310 06:46:52.931997 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 06:46:52 crc kubenswrapper[4825]: I0310 06:46:52.932025 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 06:46:52 crc kubenswrapper[4825]: I0310 06:46:52.932053 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 06:46:52 crc kubenswrapper[4825]: I0310 06:46:52.932073 4825 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T06:46:52Z","lastTransitionTime":"2026-03-10T06:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 06:46:53 crc kubenswrapper[4825]: I0310 06:46:53.002237 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-8f8ph"] Mar 10 06:46:53 crc kubenswrapper[4825]: I0310 06:46:53.002808 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8f8ph" Mar 10 06:46:53 crc kubenswrapper[4825]: I0310 06:46:53.006215 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 10 06:46:53 crc kubenswrapper[4825]: I0310 06:46:53.006604 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 10 06:46:53 crc kubenswrapper[4825]: I0310 06:46:53.006841 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 10 06:46:53 crc kubenswrapper[4825]: I0310 06:46:53.007039 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 10 06:46:53 crc kubenswrapper[4825]: I0310 06:46:53.133379 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8445004d-71ef-465b-b776-b2e2dd74ad8c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-8f8ph\" (UID: \"8445004d-71ef-465b-b776-b2e2dd74ad8c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8f8ph" Mar 10 06:46:53 crc kubenswrapper[4825]: I0310 06:46:53.133471 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8445004d-71ef-465b-b776-b2e2dd74ad8c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-8f8ph\" (UID: 
\"8445004d-71ef-465b-b776-b2e2dd74ad8c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8f8ph" Mar 10 06:46:53 crc kubenswrapper[4825]: I0310 06:46:53.133529 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8445004d-71ef-465b-b776-b2e2dd74ad8c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-8f8ph\" (UID: \"8445004d-71ef-465b-b776-b2e2dd74ad8c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8f8ph" Mar 10 06:46:53 crc kubenswrapper[4825]: I0310 06:46:53.133656 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8445004d-71ef-465b-b776-b2e2dd74ad8c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-8f8ph\" (UID: \"8445004d-71ef-465b-b776-b2e2dd74ad8c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8f8ph" Mar 10 06:46:53 crc kubenswrapper[4825]: I0310 06:46:53.133718 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8445004d-71ef-465b-b776-b2e2dd74ad8c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-8f8ph\" (UID: \"8445004d-71ef-465b-b776-b2e2dd74ad8c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8f8ph" Mar 10 06:46:53 crc kubenswrapper[4825]: I0310 06:46:53.218362 4825 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 10 06:46:53 crc kubenswrapper[4825]: I0310 06:46:53.229748 4825 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 10 06:46:53 crc kubenswrapper[4825]: I0310 06:46:53.234580 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8445004d-71ef-465b-b776-b2e2dd74ad8c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-8f8ph\" (UID: \"8445004d-71ef-465b-b776-b2e2dd74ad8c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8f8ph" Mar 10 06:46:53 crc kubenswrapper[4825]: I0310 06:46:53.234652 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8445004d-71ef-465b-b776-b2e2dd74ad8c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-8f8ph\" (UID: \"8445004d-71ef-465b-b776-b2e2dd74ad8c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8f8ph" Mar 10 06:46:53 crc kubenswrapper[4825]: I0310 06:46:53.234732 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8445004d-71ef-465b-b776-b2e2dd74ad8c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-8f8ph\" (UID: \"8445004d-71ef-465b-b776-b2e2dd74ad8c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8f8ph" Mar 10 06:46:53 crc kubenswrapper[4825]: I0310 06:46:53.234827 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8445004d-71ef-465b-b776-b2e2dd74ad8c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-8f8ph\" (UID: \"8445004d-71ef-465b-b776-b2e2dd74ad8c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8f8ph" Mar 10 06:46:53 crc kubenswrapper[4825]: I0310 06:46:53.234878 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8445004d-71ef-465b-b776-b2e2dd74ad8c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-8f8ph\" (UID: \"8445004d-71ef-465b-b776-b2e2dd74ad8c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8f8ph" Mar 10 
06:46:53 crc kubenswrapper[4825]: I0310 06:46:53.234883 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8445004d-71ef-465b-b776-b2e2dd74ad8c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-8f8ph\" (UID: \"8445004d-71ef-465b-b776-b2e2dd74ad8c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8f8ph" Mar 10 06:46:53 crc kubenswrapper[4825]: I0310 06:46:53.234832 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8445004d-71ef-465b-b776-b2e2dd74ad8c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-8f8ph\" (UID: \"8445004d-71ef-465b-b776-b2e2dd74ad8c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8f8ph" Mar 10 06:46:53 crc kubenswrapper[4825]: I0310 06:46:53.235723 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:46:53 crc kubenswrapper[4825]: E0310 06:46:53.235938 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:46:53 crc kubenswrapper[4825]: I0310 06:46:53.236888 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8445004d-71ef-465b-b776-b2e2dd74ad8c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-8f8ph\" (UID: \"8445004d-71ef-465b-b776-b2e2dd74ad8c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8f8ph" Mar 10 06:46:53 crc kubenswrapper[4825]: I0310 06:46:53.244532 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8445004d-71ef-465b-b776-b2e2dd74ad8c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-8f8ph\" (UID: \"8445004d-71ef-465b-b776-b2e2dd74ad8c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8f8ph" Mar 10 06:46:53 crc kubenswrapper[4825]: I0310 06:46:53.262788 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8445004d-71ef-465b-b776-b2e2dd74ad8c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-8f8ph\" (UID: \"8445004d-71ef-465b-b776-b2e2dd74ad8c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8f8ph" Mar 10 06:46:53 crc kubenswrapper[4825]: I0310 06:46:53.334694 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8f8ph" Mar 10 06:46:53 crc kubenswrapper[4825]: W0310 06:46:53.362560 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8445004d_71ef_465b_b776_b2e2dd74ad8c.slice/crio-b8b9923002dcff4ddfa9a6512ac9d79fab03b038e1d281a8d2e0992f123271a0 WatchSource:0}: Error finding container b8b9923002dcff4ddfa9a6512ac9d79fab03b038e1d281a8d2e0992f123271a0: Status 404 returned error can't find the container with id b8b9923002dcff4ddfa9a6512ac9d79fab03b038e1d281a8d2e0992f123271a0 Mar 10 06:46:54 crc kubenswrapper[4825]: I0310 06:46:54.235952 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:46:54 crc kubenswrapper[4825]: I0310 06:46:54.236008 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:46:54 crc kubenswrapper[4825]: E0310 06:46:54.236329 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb" Mar 10 06:46:54 crc kubenswrapper[4825]: I0310 06:46:54.236357 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:46:54 crc kubenswrapper[4825]: E0310 06:46:54.236516 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:46:54 crc kubenswrapper[4825]: E0310 06:46:54.236821 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:46:54 crc kubenswrapper[4825]: I0310 06:46:54.339026 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8f8ph" event={"ID":"8445004d-71ef-465b-b776-b2e2dd74ad8c","Type":"ContainerStarted","Data":"050a377a1cdf90a5bc45e47b212b3ea234043035aeaff0ce0812ac2a95d39110"} Mar 10 06:46:54 crc kubenswrapper[4825]: I0310 06:46:54.339110 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8f8ph" event={"ID":"8445004d-71ef-465b-b776-b2e2dd74ad8c","Type":"ContainerStarted","Data":"b8b9923002dcff4ddfa9a6512ac9d79fab03b038e1d281a8d2e0992f123271a0"} Mar 10 06:46:54 crc kubenswrapper[4825]: I0310 06:46:54.361936 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8f8ph" podStartSLOduration=114.361901465 
podStartE2EDuration="1m54.361901465s" podCreationTimestamp="2026-03-10 06:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:46:54.360678052 +0000 UTC m=+167.390458717" watchObservedRunningTime="2026-03-10 06:46:54.361901465 +0000 UTC m=+167.391682120" Mar 10 06:46:54 crc kubenswrapper[4825]: E0310 06:46:54.509233 4825 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 06:46:55 crc kubenswrapper[4825]: I0310 06:46:55.236522 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:46:55 crc kubenswrapper[4825]: E0310 06:46:55.236725 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:46:56 crc kubenswrapper[4825]: I0310 06:46:56.238359 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:46:56 crc kubenswrapper[4825]: I0310 06:46:56.238509 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:46:56 crc kubenswrapper[4825]: E0310 06:46:56.238587 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:46:56 crc kubenswrapper[4825]: I0310 06:46:56.238605 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:46:56 crc kubenswrapper[4825]: E0310 06:46:56.238716 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb" Mar 10 06:46:56 crc kubenswrapper[4825]: E0310 06:46:56.238863 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:46:56 crc kubenswrapper[4825]: I0310 06:46:56.239673 4825 scope.go:117] "RemoveContainer" containerID="73fa7532ee0deedea4b9cafa99e07402d830e8e36b82f23918932208aeca220d" Mar 10 06:46:56 crc kubenswrapper[4825]: E0310 06:46:56.239839 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-jhkb9_openshift-ovn-kubernetes(79ec9d89-dc71-4f36-9254-00bd86795e43)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" Mar 10 06:46:57 crc kubenswrapper[4825]: I0310 06:46:57.236008 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:46:57 crc kubenswrapper[4825]: E0310 06:46:57.236307 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:46:58 crc kubenswrapper[4825]: I0310 06:46:58.235794 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:46:58 crc kubenswrapper[4825]: I0310 06:46:58.235830 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:46:58 crc kubenswrapper[4825]: I0310 06:46:58.235908 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:46:58 crc kubenswrapper[4825]: E0310 06:46:58.236027 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb" Mar 10 06:46:58 crc kubenswrapper[4825]: E0310 06:46:58.236270 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:46:58 crc kubenswrapper[4825]: E0310 06:46:58.236477 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:46:59 crc kubenswrapper[4825]: I0310 06:46:59.236179 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:46:59 crc kubenswrapper[4825]: E0310 06:46:59.238334 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:46:59 crc kubenswrapper[4825]: E0310 06:46:59.511045 4825 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 06:47:00 crc kubenswrapper[4825]: I0310 06:47:00.235526 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:47:00 crc kubenswrapper[4825]: I0310 06:47:00.235615 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:47:00 crc kubenswrapper[4825]: I0310 06:47:00.235776 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:47:00 crc kubenswrapper[4825]: E0310 06:47:00.235887 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:47:00 crc kubenswrapper[4825]: E0310 06:47:00.236081 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:47:00 crc kubenswrapper[4825]: E0310 06:47:00.236183 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb" Mar 10 06:47:01 crc kubenswrapper[4825]: I0310 06:47:01.235972 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:47:01 crc kubenswrapper[4825]: E0310 06:47:01.236260 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:47:02 crc kubenswrapper[4825]: I0310 06:47:02.235481 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:47:02 crc kubenswrapper[4825]: I0310 06:47:02.235564 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:47:02 crc kubenswrapper[4825]: I0310 06:47:02.235573 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:47:02 crc kubenswrapper[4825]: E0310 06:47:02.235835 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:47:02 crc kubenswrapper[4825]: E0310 06:47:02.236057 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb" Mar 10 06:47:02 crc kubenswrapper[4825]: E0310 06:47:02.236294 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:47:03 crc kubenswrapper[4825]: I0310 06:47:03.235865 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:47:03 crc kubenswrapper[4825]: E0310 06:47:03.236048 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:47:03 crc kubenswrapper[4825]: I0310 06:47:03.982193 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/114672c5-c1d0-4f87-b3aa-fb6d8535ffeb-metrics-certs\") pod \"network-metrics-daemon-pj5dl\" (UID: \"114672c5-c1d0-4f87-b3aa-fb6d8535ffeb\") " pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:47:03 crc kubenswrapper[4825]: E0310 06:47:03.982496 4825 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 06:47:03 crc kubenswrapper[4825]: E0310 06:47:03.983297 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/114672c5-c1d0-4f87-b3aa-fb6d8535ffeb-metrics-certs podName:114672c5-c1d0-4f87-b3aa-fb6d8535ffeb nodeName:}" failed. No retries permitted until 2026-03-10 06:48:07.983232247 +0000 UTC m=+241.013012902 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/114672c5-c1d0-4f87-b3aa-fb6d8535ffeb-metrics-certs") pod "network-metrics-daemon-pj5dl" (UID: "114672c5-c1d0-4f87-b3aa-fb6d8535ffeb") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 06:47:04 crc kubenswrapper[4825]: I0310 06:47:04.236207 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:47:04 crc kubenswrapper[4825]: I0310 06:47:04.236268 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:47:04 crc kubenswrapper[4825]: I0310 06:47:04.236377 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:47:04 crc kubenswrapper[4825]: E0310 06:47:04.236458 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:47:04 crc kubenswrapper[4825]: E0310 06:47:04.236545 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:47:04 crc kubenswrapper[4825]: E0310 06:47:04.236664 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb" Mar 10 06:47:04 crc kubenswrapper[4825]: E0310 06:47:04.512209 4825 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 06:47:05 crc kubenswrapper[4825]: I0310 06:47:05.236069 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:47:05 crc kubenswrapper[4825]: E0310 06:47:05.236358 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:47:06 crc kubenswrapper[4825]: I0310 06:47:06.235763 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:47:06 crc kubenswrapper[4825]: I0310 06:47:06.235836 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:47:06 crc kubenswrapper[4825]: I0310 06:47:06.235801 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:47:06 crc kubenswrapper[4825]: E0310 06:47:06.236112 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb" Mar 10 06:47:06 crc kubenswrapper[4825]: E0310 06:47:06.236329 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:47:06 crc kubenswrapper[4825]: E0310 06:47:06.236432 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:47:07 crc kubenswrapper[4825]: I0310 06:47:07.235841 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:47:07 crc kubenswrapper[4825]: E0310 06:47:07.236283 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:47:08 crc kubenswrapper[4825]: I0310 06:47:08.235841 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:47:08 crc kubenswrapper[4825]: I0310 06:47:08.235896 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:47:08 crc kubenswrapper[4825]: I0310 06:47:08.235841 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:47:08 crc kubenswrapper[4825]: E0310 06:47:08.236310 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:47:08 crc kubenswrapper[4825]: E0310 06:47:08.236422 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb" Mar 10 06:47:08 crc kubenswrapper[4825]: E0310 06:47:08.236560 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:47:08 crc kubenswrapper[4825]: I0310 06:47:08.237748 4825 scope.go:117] "RemoveContainer" containerID="73fa7532ee0deedea4b9cafa99e07402d830e8e36b82f23918932208aeca220d" Mar 10 06:47:08 crc kubenswrapper[4825]: E0310 06:47:08.238060 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-jhkb9_openshift-ovn-kubernetes(79ec9d89-dc71-4f36-9254-00bd86795e43)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" Mar 10 06:47:09 crc kubenswrapper[4825]: I0310 06:47:09.236593 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:47:09 crc kubenswrapper[4825]: E0310 06:47:09.240561 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:47:09 crc kubenswrapper[4825]: E0310 06:47:09.512890 4825 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 06:47:10 crc kubenswrapper[4825]: I0310 06:47:10.235893 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:47:10 crc kubenswrapper[4825]: I0310 06:47:10.235893 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:47:10 crc kubenswrapper[4825]: I0310 06:47:10.236069 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:47:10 crc kubenswrapper[4825]: E0310 06:47:10.236392 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:47:10 crc kubenswrapper[4825]: E0310 06:47:10.236513 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:47:10 crc kubenswrapper[4825]: E0310 06:47:10.236612 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb" Mar 10 06:47:11 crc kubenswrapper[4825]: I0310 06:47:11.236343 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:47:11 crc kubenswrapper[4825]: E0310 06:47:11.236516 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:47:12 crc kubenswrapper[4825]: I0310 06:47:12.235813 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:47:12 crc kubenswrapper[4825]: I0310 06:47:12.235895 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:47:12 crc kubenswrapper[4825]: I0310 06:47:12.236327 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 06:47:12 crc kubenswrapper[4825]: E0310 06:47:12.236493 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 10 06:47:12 crc kubenswrapper[4825]: E0310 06:47:12.236804 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb"
Mar 10 06:47:12 crc kubenswrapper[4825]: E0310 06:47:12.236954 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 10 06:47:13 crc kubenswrapper[4825]: I0310 06:47:13.236525 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 06:47:13 crc kubenswrapper[4825]: E0310 06:47:13.236855 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 10 06:47:14 crc kubenswrapper[4825]: I0310 06:47:14.235736 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl"
Mar 10 06:47:14 crc kubenswrapper[4825]: I0310 06:47:14.235832 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 06:47:14 crc kubenswrapper[4825]: I0310 06:47:14.235756 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 06:47:14 crc kubenswrapper[4825]: E0310 06:47:14.235996 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb"
Mar 10 06:47:14 crc kubenswrapper[4825]: E0310 06:47:14.236186 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 10 06:47:14 crc kubenswrapper[4825]: E0310 06:47:14.236270 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 10 06:47:14 crc kubenswrapper[4825]: E0310 06:47:14.515005 4825 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 10 06:47:15 crc kubenswrapper[4825]: I0310 06:47:15.235739 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 06:47:15 crc kubenswrapper[4825]: E0310 06:47:15.235975 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 10 06:47:16 crc kubenswrapper[4825]: I0310 06:47:16.235468 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 06:47:16 crc kubenswrapper[4825]: I0310 06:47:16.235630 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl"
Mar 10 06:47:16 crc kubenswrapper[4825]: E0310 06:47:16.235686 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 10 06:47:16 crc kubenswrapper[4825]: I0310 06:47:16.235468 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 06:47:16 crc kubenswrapper[4825]: E0310 06:47:16.235865 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb"
Mar 10 06:47:16 crc kubenswrapper[4825]: E0310 06:47:16.235918 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 10 06:47:17 crc kubenswrapper[4825]: I0310 06:47:17.235785 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 06:47:17 crc kubenswrapper[4825]: E0310 06:47:17.236043 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 10 06:47:18 crc kubenswrapper[4825]: I0310 06:47:18.236410 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 06:47:18 crc kubenswrapper[4825]: I0310 06:47:18.236436 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl"
Mar 10 06:47:18 crc kubenswrapper[4825]: E0310 06:47:18.236630 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 10 06:47:18 crc kubenswrapper[4825]: I0310 06:47:18.236410 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 06:47:18 crc kubenswrapper[4825]: E0310 06:47:18.236780 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb"
Mar 10 06:47:18 crc kubenswrapper[4825]: E0310 06:47:18.236883 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 10 06:47:19 crc kubenswrapper[4825]: I0310 06:47:19.236620 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 06:47:19 crc kubenswrapper[4825]: E0310 06:47:19.239746 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 10 06:47:19 crc kubenswrapper[4825]: E0310 06:47:19.515584 4825 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 10 06:47:20 crc kubenswrapper[4825]: I0310 06:47:20.235622 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 06:47:20 crc kubenswrapper[4825]: I0310 06:47:20.235689 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl"
Mar 10 06:47:20 crc kubenswrapper[4825]: I0310 06:47:20.235648 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 06:47:20 crc kubenswrapper[4825]: E0310 06:47:20.235877 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 10 06:47:20 crc kubenswrapper[4825]: E0310 06:47:20.236713 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 10 06:47:20 crc kubenswrapper[4825]: E0310 06:47:20.236850 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb"
Mar 10 06:47:20 crc kubenswrapper[4825]: I0310 06:47:20.237583 4825 scope.go:117] "RemoveContainer" containerID="73fa7532ee0deedea4b9cafa99e07402d830e8e36b82f23918932208aeca220d"
Mar 10 06:47:20 crc kubenswrapper[4825]: E0310 06:47:20.237914 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-jhkb9_openshift-ovn-kubernetes(79ec9d89-dc71-4f36-9254-00bd86795e43)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43"
Mar 10 06:47:20 crc kubenswrapper[4825]: I0310 06:47:20.466832 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8dkbt_165351e4-3c96-4a68-8c75-43b001b0ec60/kube-multus/1.log"
Mar 10 06:47:20 crc kubenswrapper[4825]: I0310 06:47:20.467996 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8dkbt_165351e4-3c96-4a68-8c75-43b001b0ec60/kube-multus/0.log"
Mar 10 06:47:20 crc kubenswrapper[4825]: I0310 06:47:20.468076 4825 generic.go:334] "Generic (PLEG): container finished" podID="165351e4-3c96-4a68-8c75-43b001b0ec60" containerID="56708571d26121728c9ab436b6bd8aac9945e7b28e3f4c5d12035fa9809c70ef" exitCode=1
Mar 10 06:47:20 crc kubenswrapper[4825]: I0310 06:47:20.468116 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8dkbt" event={"ID":"165351e4-3c96-4a68-8c75-43b001b0ec60","Type":"ContainerDied","Data":"56708571d26121728c9ab436b6bd8aac9945e7b28e3f4c5d12035fa9809c70ef"}
Mar 10 06:47:20 crc kubenswrapper[4825]: I0310 06:47:20.468190 4825 scope.go:117] "RemoveContainer" containerID="420d0f058d1a0590e8feaa3f7f4dc37f6f0b91274142ca8ffd6b92c08ea24c1c"
Mar 10 06:47:20 crc kubenswrapper[4825]: I0310 06:47:20.471087 4825 scope.go:117] "RemoveContainer" containerID="56708571d26121728c9ab436b6bd8aac9945e7b28e3f4c5d12035fa9809c70ef"
Mar 10 06:47:20 crc kubenswrapper[4825]: E0310 06:47:20.472964 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-8dkbt_openshift-multus(165351e4-3c96-4a68-8c75-43b001b0ec60)\"" pod="openshift-multus/multus-8dkbt" podUID="165351e4-3c96-4a68-8c75-43b001b0ec60"
Mar 10 06:47:21 crc kubenswrapper[4825]: I0310 06:47:21.235935 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 06:47:21 crc kubenswrapper[4825]: E0310 06:47:21.236335 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 10 06:47:21 crc kubenswrapper[4825]: I0310 06:47:21.473163 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8dkbt_165351e4-3c96-4a68-8c75-43b001b0ec60/kube-multus/1.log"
Mar 10 06:47:22 crc kubenswrapper[4825]: I0310 06:47:22.235947 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 06:47:22 crc kubenswrapper[4825]: E0310 06:47:22.236166 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 10 06:47:22 crc kubenswrapper[4825]: I0310 06:47:22.236236 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl"
Mar 10 06:47:22 crc kubenswrapper[4825]: I0310 06:47:22.236286 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 06:47:22 crc kubenswrapper[4825]: E0310 06:47:22.236830 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb"
Mar 10 06:47:22 crc kubenswrapper[4825]: E0310 06:47:22.236967 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 10 06:47:23 crc kubenswrapper[4825]: I0310 06:47:23.236431 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 06:47:23 crc kubenswrapper[4825]: E0310 06:47:23.236638 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 10 06:47:24 crc kubenswrapper[4825]: I0310 06:47:24.236248 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl"
Mar 10 06:47:24 crc kubenswrapper[4825]: E0310 06:47:24.236492 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb"
Mar 10 06:47:24 crc kubenswrapper[4825]: I0310 06:47:24.236647 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 06:47:24 crc kubenswrapper[4825]: E0310 06:47:24.237047 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 10 06:47:24 crc kubenswrapper[4825]: I0310 06:47:24.237485 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 06:47:24 crc kubenswrapper[4825]: E0310 06:47:24.237679 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 10 06:47:24 crc kubenswrapper[4825]: E0310 06:47:24.518182 4825 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 10 06:47:25 crc kubenswrapper[4825]: I0310 06:47:25.235743 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 06:47:25 crc kubenswrapper[4825]: E0310 06:47:25.235985 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 10 06:47:26 crc kubenswrapper[4825]: I0310 06:47:26.235659 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl"
Mar 10 06:47:26 crc kubenswrapper[4825]: I0310 06:47:26.235727 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 06:47:26 crc kubenswrapper[4825]: E0310 06:47:26.235916 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb"
Mar 10 06:47:26 crc kubenswrapper[4825]: I0310 06:47:26.235659 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 06:47:26 crc kubenswrapper[4825]: E0310 06:47:26.236433 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 10 06:47:26 crc kubenswrapper[4825]: E0310 06:47:26.236549 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 10 06:47:26 crc kubenswrapper[4825]: I0310 06:47:26.985386 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 06:47:26 crc kubenswrapper[4825]: I0310 06:47:26.985488 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 06:47:26 crc kubenswrapper[4825]: I0310 06:47:26.985538 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 06:47:26 crc kubenswrapper[4825]: I0310 06:47:26.985576 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 06:47:26 crc kubenswrapper[4825]: E0310 06:47:26.985725 4825 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 10 06:47:26 crc kubenswrapper[4825]: E0310 06:47:26.985762 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 10 06:47:26 crc kubenswrapper[4825]: E0310 06:47:26.985798 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 10 06:47:26 crc kubenswrapper[4825]: E0310 06:47:26.985807 4825 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 10 06:47:26 crc kubenswrapper[4825]: E0310 06:47:26.985872 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 06:49:28.985840115 +0000 UTC m=+322.015620770 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 10 06:47:26 crc kubenswrapper[4825]: E0310 06:47:26.985950 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 06:49:28.985915128 +0000 UTC m=+322.015695943 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 10 06:47:26 crc kubenswrapper[4825]: E0310 06:47:26.985823 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 10 06:47:26 crc kubenswrapper[4825]: E0310 06:47:26.986027 4825 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 10 06:47:26 crc kubenswrapper[4825]: E0310 06:47:26.985828 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 10 06:47:26 crc kubenswrapper[4825]: E0310 06:47:26.986093 4825 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 10 06:47:26 crc kubenswrapper[4825]: E0310 06:47:26.986192 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 06:49:28.986108534 +0000 UTC m=+322.015889369 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 10 06:47:26 crc kubenswrapper[4825]: E0310 06:47:26.986248 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 06:49:28.986227389 +0000 UTC m=+322.016008294 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 10 06:47:27 crc kubenswrapper[4825]: I0310 06:47:27.086353 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 06:47:27 crc kubenswrapper[4825]: E0310 06:47:27.086654 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:49:29.086604788 +0000 UTC m=+322.116385443 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 06:47:27 crc kubenswrapper[4825]: I0310 06:47:27.236544 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 06:47:27 crc kubenswrapper[4825]: E0310 06:47:27.236803 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 10 06:47:28 crc kubenswrapper[4825]: I0310 06:47:28.235564 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 06:47:28 crc kubenswrapper[4825]: I0310 06:47:28.235639 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl"
Mar 10 06:47:28 crc kubenswrapper[4825]: I0310 06:47:28.235687 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 06:47:28 crc kubenswrapper[4825]: E0310 06:47:28.236025 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 10 06:47:28 crc kubenswrapper[4825]: E0310 06:47:28.237669 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 10 06:47:28 crc kubenswrapper[4825]: E0310 06:47:28.237862 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb"
Mar 10 06:47:29 crc kubenswrapper[4825]: I0310 06:47:29.236305 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 06:47:29 crc kubenswrapper[4825]: E0310 06:47:29.238376 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 10 06:47:29 crc kubenswrapper[4825]: E0310 06:47:29.519377 4825 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 10 06:47:30 crc kubenswrapper[4825]: I0310 06:47:30.236387 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl"
Mar 10 06:47:30 crc kubenswrapper[4825]: I0310 06:47:30.236419 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 06:47:30 crc kubenswrapper[4825]: I0310 06:47:30.236523 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 06:47:30 crc kubenswrapper[4825]: E0310 06:47:30.236589 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb"
Mar 10 06:47:30 crc kubenswrapper[4825]: E0310 06:47:30.236724 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 10 06:47:30 crc kubenswrapper[4825]: E0310 06:47:30.236873 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 10 06:47:31 crc kubenswrapper[4825]: I0310 06:47:31.236621 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 10 06:47:31 crc kubenswrapper[4825]: E0310 06:47:31.237199 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 10 06:47:32 crc kubenswrapper[4825]: I0310 06:47:32.235633 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 06:47:32 crc kubenswrapper[4825]: I0310 06:47:32.235753 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl"
Mar 10 06:47:32 crc kubenswrapper[4825]: E0310 06:47:32.235858 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 10 06:47:32 crc kubenswrapper[4825]: I0310 06:47:32.235882 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 10 06:47:32 crc kubenswrapper[4825]: E0310 06:47:32.236062 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb"
Mar 10 06:47:32 crc kubenswrapper[4825]: E0310 06:47:32.236395 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:47:33 crc kubenswrapper[4825]: I0310 06:47:33.236339 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:47:33 crc kubenswrapper[4825]: E0310 06:47:33.236665 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:47:33 crc kubenswrapper[4825]: I0310 06:47:33.236977 4825 scope.go:117] "RemoveContainer" containerID="56708571d26121728c9ab436b6bd8aac9945e7b28e3f4c5d12035fa9809c70ef" Mar 10 06:47:33 crc kubenswrapper[4825]: I0310 06:47:33.238115 4825 scope.go:117] "RemoveContainer" containerID="73fa7532ee0deedea4b9cafa99e07402d830e8e36b82f23918932208aeca220d" Mar 10 06:47:33 crc kubenswrapper[4825]: I0310 06:47:33.524407 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jhkb9_79ec9d89-dc71-4f36-9254-00bd86795e43/ovnkube-controller/3.log" Mar 10 06:47:33 crc kubenswrapper[4825]: I0310 06:47:33.528926 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" event={"ID":"79ec9d89-dc71-4f36-9254-00bd86795e43","Type":"ContainerStarted","Data":"ce0f84219babc6df2d532f37040cc31d4d331fb508fd4468903673595ba31db0"} Mar 10 06:47:33 crc kubenswrapper[4825]: I0310 06:47:33.529633 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:47:33 crc kubenswrapper[4825]: I0310 06:47:33.533772 4825 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-multus_multus-8dkbt_165351e4-3c96-4a68-8c75-43b001b0ec60/kube-multus/1.log" Mar 10 06:47:33 crc kubenswrapper[4825]: I0310 06:47:33.533869 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8dkbt" event={"ID":"165351e4-3c96-4a68-8c75-43b001b0ec60","Type":"ContainerStarted","Data":"948ff761b39a162bd7fe26b7700c7066bff6e2a7519a4d0bab5a4c56c3d2470a"} Mar 10 06:47:33 crc kubenswrapper[4825]: I0310 06:47:33.575188 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" podStartSLOduration=153.57511911 podStartE2EDuration="2m33.57511911s" podCreationTimestamp="2026-03-10 06:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:47:33.572708518 +0000 UTC m=+206.602489153" watchObservedRunningTime="2026-03-10 06:47:33.57511911 +0000 UTC m=+206.604899765" Mar 10 06:47:34 crc kubenswrapper[4825]: I0310 06:47:34.235833 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:47:34 crc kubenswrapper[4825]: I0310 06:47:34.235880 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:47:34 crc kubenswrapper[4825]: I0310 06:47:34.235898 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:47:34 crc kubenswrapper[4825]: E0310 06:47:34.236050 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:47:34 crc kubenswrapper[4825]: E0310 06:47:34.236463 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:47:34 crc kubenswrapper[4825]: E0310 06:47:34.236711 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb" Mar 10 06:47:34 crc kubenswrapper[4825]: I0310 06:47:34.321683 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-pj5dl"] Mar 10 06:47:34 crc kubenswrapper[4825]: E0310 06:47:34.520669 4825 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 06:47:34 crc kubenswrapper[4825]: I0310 06:47:34.537745 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:47:34 crc kubenswrapper[4825]: E0310 06:47:34.537988 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb" Mar 10 06:47:35 crc kubenswrapper[4825]: I0310 06:47:35.236305 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:47:35 crc kubenswrapper[4825]: E0310 06:47:35.237019 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:47:36 crc kubenswrapper[4825]: I0310 06:47:36.235610 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:47:36 crc kubenswrapper[4825]: I0310 06:47:36.235706 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:47:36 crc kubenswrapper[4825]: I0310 06:47:36.235639 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:47:36 crc kubenswrapper[4825]: E0310 06:47:36.236262 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:47:36 crc kubenswrapper[4825]: E0310 06:47:36.236448 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:47:36 crc kubenswrapper[4825]: E0310 06:47:36.236601 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb" Mar 10 06:47:37 crc kubenswrapper[4825]: I0310 06:47:37.236247 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:47:37 crc kubenswrapper[4825]: E0310 06:47:37.237433 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:47:38 crc kubenswrapper[4825]: I0310 06:47:38.236519 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:47:38 crc kubenswrapper[4825]: I0310 06:47:38.236648 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:47:38 crc kubenswrapper[4825]: E0310 06:47:38.236701 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:47:38 crc kubenswrapper[4825]: I0310 06:47:38.236543 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:47:38 crc kubenswrapper[4825]: E0310 06:47:38.236833 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:47:38 crc kubenswrapper[4825]: E0310 06:47:38.236990 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj5dl" podUID="114672c5-c1d0-4f87-b3aa-fb6d8535ffeb" Mar 10 06:47:39 crc kubenswrapper[4825]: I0310 06:47:39.235754 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:47:39 crc kubenswrapper[4825]: E0310 06:47:39.237910 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:47:40 crc kubenswrapper[4825]: I0310 06:47:40.236124 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:47:40 crc kubenswrapper[4825]: I0310 06:47:40.236211 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:47:40 crc kubenswrapper[4825]: I0310 06:47:40.236216 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:47:40 crc kubenswrapper[4825]: I0310 06:47:40.239690 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 10 06:47:40 crc kubenswrapper[4825]: I0310 06:47:40.239715 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 10 06:47:40 crc kubenswrapper[4825]: I0310 06:47:40.239929 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 10 06:47:40 crc kubenswrapper[4825]: I0310 06:47:40.240035 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 10 06:47:41 crc kubenswrapper[4825]: I0310 06:47:41.236570 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:47:41 crc kubenswrapper[4825]: I0310 06:47:41.239965 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 10 06:47:41 crc kubenswrapper[4825]: I0310 06:47:41.240542 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.000838 4825 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.061420 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6j748"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.062098 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6j748" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.064779 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-ksvjd"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.065943 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.067498 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-grsjk"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.068321 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-grsjk" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.070599 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-2kdt2"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.071307 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-2kdt2" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.079261 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-scbgc"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.080207 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-scbgc" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.086826 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-grpt7"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.087534 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-j7fv5"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.088156 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j7fv5" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.088630 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grpt7" Mar 10 06:47:44 crc kubenswrapper[4825]: W0310 06:47:44.088948 4825 reflector.go:561] object-"openshift-apiserver"/"etcd-client": failed to list *v1.Secret: secrets "etcd-client" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 10 06:47:44 crc kubenswrapper[4825]: E0310 06:47:44.089021 4825 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"etcd-client\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"etcd-client\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 10 06:47:44 crc kubenswrapper[4825]: W0310 06:47:44.089177 4825 reflector.go:561] object-"openshift-apiserver"/"audit-1": failed to list *v1.ConfigMap: configmaps "audit-1" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": 
no relationship found between node 'crc' and this object Mar 10 06:47:44 crc kubenswrapper[4825]: E0310 06:47:44.089219 4825 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"audit-1\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"audit-1\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 10 06:47:44 crc kubenswrapper[4825]: W0310 06:47:44.089347 4825 reflector.go:561] object-"openshift-apiserver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 10 06:47:44 crc kubenswrapper[4825]: E0310 06:47:44.089398 4825 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 10 06:47:44 crc kubenswrapper[4825]: W0310 06:47:44.090429 4825 reflector.go:561] object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c": failed to list *v1.Secret: secrets "openshift-controller-manager-sa-dockercfg-msq4c" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Mar 10 06:47:44 crc kubenswrapper[4825]: E0310 06:47:44.090489 4825 reflector.go:158] "Unhandled Error" 
err="object-\"openshift-controller-manager\"/\"openshift-controller-manager-sa-dockercfg-msq4c\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-controller-manager-sa-dockercfg-msq4c\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 10 06:47:44 crc kubenswrapper[4825]: W0310 06:47:44.090543 4825 reflector.go:561] object-"openshift-apiserver"/"encryption-config-1": failed to list *v1.Secret: secrets "encryption-config-1" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 10 06:47:44 crc kubenswrapper[4825]: W0310 06:47:44.090560 4825 reflector.go:561] object-"openshift-controller-manager"/"client-ca": failed to list *v1.ConfigMap: configmaps "client-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Mar 10 06:47:44 crc kubenswrapper[4825]: E0310 06:47:44.090582 4825 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"encryption-config-1\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"encryption-config-1\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 10 06:47:44 crc kubenswrapper[4825]: E0310 06:47:44.090591 4825 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"client-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"client-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group 
\"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.090683 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 10 06:47:44 crc kubenswrapper[4825]: W0310 06:47:44.090705 4825 reflector.go:561] object-"openshift-controller-manager"/"openshift-global-ca": failed to list *v1.ConfigMap: configmaps "openshift-global-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Mar 10 06:47:44 crc kubenswrapper[4825]: W0310 06:47:44.090730 4825 reflector.go:561] object-"openshift-route-controller-manager"/"config": failed to list *v1.ConfigMap: configmaps "config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Mar 10 06:47:44 crc kubenswrapper[4825]: E0310 06:47:44.090777 4825 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"openshift-global-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-global-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.090802 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 10 06:47:44 crc kubenswrapper[4825]: W0310 06:47:44.090852 4825 reflector.go:561] object-"openshift-apiserver"/"trusted-ca-bundle": failed to list *v1.ConfigMap: configmaps "trusted-ca-bundle" is forbidden: User "system:node:crc" 
cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 10 06:47:44 crc kubenswrapper[4825]: E0310 06:47:44.090831 4825 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.090759 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 10 06:47:44 crc kubenswrapper[4825]: E0310 06:47:44.090884 4825 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"trusted-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"trusted-ca-bundle\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 10 06:47:44 crc kubenswrapper[4825]: W0310 06:47:44.090907 4825 reflector.go:561] object-"openshift-controller-manager"/"config": failed to list *v1.ConfigMap: configmaps "config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Mar 10 06:47:44 crc kubenswrapper[4825]: W0310 06:47:44.091002 4825 reflector.go:561] object-"openshift-apiserver"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no 
relationship found between node 'crc' and this object Mar 10 06:47:44 crc kubenswrapper[4825]: E0310 06:47:44.091000 4825 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.091027 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.091051 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.091063 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 10 06:47:44 crc kubenswrapper[4825]: W0310 06:47:44.090948 4825 reflector.go:561] object-"openshift-apiserver"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 10 06:47:44 crc kubenswrapper[4825]: E0310 06:47:44.091046 4825 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 10 06:47:44 crc kubenswrapper[4825]: E0310 06:47:44.091118 4825 
reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 10 06:47:44 crc kubenswrapper[4825]: W0310 06:47:44.090690 4825 reflector.go:561] object-"openshift-apiserver"/"etcd-serving-ca": failed to list *v1.ConfigMap: configmaps "etcd-serving-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Mar 10 06:47:44 crc kubenswrapper[4825]: E0310 06:47:44.091256 4825 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"etcd-serving-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"etcd-serving-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.102788 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xfp4f"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.106843 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.110064 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.118152 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.118699 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 10 06:47:44 crc kubenswrapper[4825]: W0310 06:47:44.118893 4825 reflector.go:561] object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2": failed to list *v1.Secret: secrets "route-controller-manager-sa-dockercfg-h2zr2" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.132770 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.132827 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.132783 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 10 06:47:44 crc kubenswrapper[4825]: E0310 06:47:44.133051 4825 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"route-controller-manager-sa-dockercfg-h2zr2\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"route-controller-manager-sa-dockercfg-h2zr2\" is forbidden: User 
\"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.133090 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 10 06:47:44 crc kubenswrapper[4825]: W0310 06:47:44.133223 4825 reflector.go:561] object-"openshift-route-controller-manager"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Mar 10 06:47:44 crc kubenswrapper[4825]: E0310 06:47:44.133254 4825 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.133317 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.133489 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.133587 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.133616 4825 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.133824 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 06:47:44 crc kubenswrapper[4825]: W0310 06:47:44.133968 4825 reflector.go:561] object-"openshift-controller-manager"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Mar 10 06:47:44 crc kubenswrapper[4825]: E0310 06:47:44.134003 4825 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.134070 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.134239 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.134372 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.134438 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.134473 4825 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.134511 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.134680 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.134782 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.134866 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.134919 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.135025 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.135083 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 10 06:47:44 crc kubenswrapper[4825]: W0310 06:47:44.135089 4825 reflector.go:561] object-"openshift-route-controller-manager"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Mar 10 06:47:44 crc kubenswrapper[4825]: E0310 06:47:44.135157 4825 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is 
forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.135234 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.135283 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.135343 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.135617 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.136820 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4p7vx"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.137465 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4p7vx" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.137883 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8srf6"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.137987 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-etcd-serving-ca\") pod \"apiserver-76f77b778f-ksvjd\" (UID: \"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb\") " pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.138047 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-etcd-client\") pod \"apiserver-76f77b778f-ksvjd\" (UID: \"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb\") " pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.138087 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/960b4f98-8229-42eb-9755-891df394483c-service-ca-bundle\") pod \"authentication-operator-69f744f599-scbgc\" (UID: \"960b4f98-8229-42eb-9755-891df394483c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-scbgc" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.138117 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bab741b-822c-4548-8333-aa3f90ecd8a0-config\") pod \"controller-manager-879f6c89f-6j748\" (UID: \"0bab741b-822c-4548-8333-aa3f90ecd8a0\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-6j748" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.138163 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk5fz\" (UniqueName: \"kubernetes.io/projected/25ad6c89-d2e0-408e-8c6f-a49da5a55bdd-kube-api-access-rk5fz\") pod \"machine-api-operator-5694c8668f-2kdt2\" (UID: \"25ad6c89-d2e0-408e-8c6f-a49da5a55bdd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2kdt2" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.138189 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-serving-cert\") pod \"apiserver-76f77b778f-ksvjd\" (UID: \"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb\") " pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.138216 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/25ad6c89-d2e0-408e-8c6f-a49da5a55bdd-images\") pod \"machine-api-operator-5694c8668f-2kdt2\" (UID: \"25ad6c89-d2e0-408e-8c6f-a49da5a55bdd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2kdt2" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.138239 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbj9x\" (UniqueName: \"kubernetes.io/projected/0bab741b-822c-4548-8333-aa3f90ecd8a0-kube-api-access-gbj9x\") pod \"controller-manager-879f6c89f-6j748\" (UID: \"0bab741b-822c-4548-8333-aa3f90ecd8a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6j748" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.138295 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/0bab741b-822c-4548-8333-aa3f90ecd8a0-client-ca\") pod \"controller-manager-879f6c89f-6j748\" (UID: \"0bab741b-822c-4548-8333-aa3f90ecd8a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6j748" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.138323 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-config\") pod \"apiserver-76f77b778f-ksvjd\" (UID: \"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb\") " pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.138360 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpdhl\" (UniqueName: \"kubernetes.io/projected/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-kube-api-access-lpdhl\") pod \"apiserver-76f77b778f-ksvjd\" (UID: \"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb\") " pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.138387 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bab741b-822c-4548-8333-aa3f90ecd8a0-serving-cert\") pod \"controller-manager-879f6c89f-6j748\" (UID: \"0bab741b-822c-4548-8333-aa3f90ecd8a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6j748" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.138413 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/01203ed2-23cf-4b39-ae6c-6ed22c0d66c0-client-ca\") pod \"route-controller-manager-6576b87f9c-grsjk\" (UID: \"01203ed2-23cf-4b39-ae6c-6ed22c0d66c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-grsjk" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 
06:47:44.138444 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcjrc\" (UniqueName: \"kubernetes.io/projected/01203ed2-23cf-4b39-ae6c-6ed22c0d66c0-kube-api-access-wcjrc\") pod \"route-controller-manager-6576b87f9c-grsjk\" (UID: \"01203ed2-23cf-4b39-ae6c-6ed22c0d66c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-grsjk" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.138469 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hblmv\" (UniqueName: \"kubernetes.io/projected/960b4f98-8229-42eb-9755-891df394483c-kube-api-access-hblmv\") pod \"authentication-operator-69f744f599-scbgc\" (UID: \"960b4f98-8229-42eb-9755-891df394483c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-scbgc" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.138497 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-node-pullsecrets\") pod \"apiserver-76f77b778f-ksvjd\" (UID: \"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb\") " pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.138506 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8srf6" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.138538 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/960b4f98-8229-42eb-9755-891df394483c-config\") pod \"authentication-operator-69f744f599-scbgc\" (UID: \"960b4f98-8229-42eb-9755-891df394483c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-scbgc" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.138565 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-trusted-ca-bundle\") pod \"apiserver-76f77b778f-ksvjd\" (UID: \"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb\") " pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.138610 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-encryption-config\") pod \"apiserver-76f77b778f-ksvjd\" (UID: \"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb\") " pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.138634 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-audit-dir\") pod \"apiserver-76f77b778f-ksvjd\" (UID: \"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb\") " pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.138664 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/960b4f98-8229-42eb-9755-891df394483c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-scbgc\" (UID: \"960b4f98-8229-42eb-9755-891df394483c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-scbgc" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.138690 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01203ed2-23cf-4b39-ae6c-6ed22c0d66c0-config\") pod \"route-controller-manager-6576b87f9c-grsjk\" (UID: \"01203ed2-23cf-4b39-ae6c-6ed22c0d66c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-grsjk" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.138714 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-image-import-ca\") pod \"apiserver-76f77b778f-ksvjd\" (UID: \"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb\") " pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.138741 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0bab741b-822c-4548-8333-aa3f90ecd8a0-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6j748\" (UID: \"0bab741b-822c-4548-8333-aa3f90ecd8a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6j748" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.138766 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01203ed2-23cf-4b39-ae6c-6ed22c0d66c0-serving-cert\") pod \"route-controller-manager-6576b87f9c-grsjk\" (UID: \"01203ed2-23cf-4b39-ae6c-6ed22c0d66c0\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-grsjk" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.138790 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-audit\") pod \"apiserver-76f77b778f-ksvjd\" (UID: \"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb\") " pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.138825 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25ad6c89-d2e0-408e-8c6f-a49da5a55bdd-config\") pod \"machine-api-operator-5694c8668f-2kdt2\" (UID: \"25ad6c89-d2e0-408e-8c6f-a49da5a55bdd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2kdt2" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.138869 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/25ad6c89-d2e0-408e-8c6f-a49da5a55bdd-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-2kdt2\" (UID: \"25ad6c89-d2e0-408e-8c6f-a49da5a55bdd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2kdt2" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.138910 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/960b4f98-8229-42eb-9755-891df394483c-serving-cert\") pod \"authentication-operator-69f744f599-scbgc\" (UID: \"960b4f98-8229-42eb-9755-891df394483c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-scbgc" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.139024 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 
10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.139194 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.139794 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.159227 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.159558 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.159576 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.159777 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.159830 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.159906 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.160087 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.160311 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.160411 4825 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.160514 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.160559 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.160784 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.160995 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-72x4w"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.161824 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-72x4w" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.163402 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.165112 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.165251 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.167203 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-zq969"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.167674 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-zq969" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.168220 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qwnns"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.169012 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-q2lm7"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.169834 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-q2lm7" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.170304 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qwnns" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.191163 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.192889 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.193185 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.193312 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.193532 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.193530 4825 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.198694 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-l5r8d"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.199327 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-l5r8d" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.200928 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.214408 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.214747 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.214932 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.214962 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.218090 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-79576"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.219470 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-79576" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.231076 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.231366 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.233266 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.233405 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.242905 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.243232 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.243334 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.243601 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.243734 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.243922 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.244396 4825 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.245061 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.245263 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.245413 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.245470 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.245408 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.245580 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.245726 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.246790 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.246968 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.247177 4825 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"etcd-operator-config" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.247436 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.247679 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.248421 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.248552 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tc5qk"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.249346 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tc5qk" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.250945 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.251087 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/27d7fca4-c1f9-411f-b705-e36c0c1e8356-machine-approver-tls\") pod \"machine-approver-56656f9798-j7fv5\" (UID: \"27d7fca4-c1f9-411f-b705-e36c0c1e8356\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j7fv5" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.251151 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92a495f2-d3a4-42b9-82f9-48b4dda31caf-config\") pod \"openshift-apiserver-operator-796bbdcf4f-4p7vx\" 
(UID: \"92a495f2-d3a4-42b9-82f9-48b4dda31caf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4p7vx" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.251175 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-xfp4f\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.251233 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-xfp4f\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.251260 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/960b4f98-8229-42eb-9755-891df394483c-service-ca-bundle\") pod \"authentication-operator-69f744f599-scbgc\" (UID: \"960b4f98-8229-42eb-9755-891df394483c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-scbgc" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.251290 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0289197b-e586-477b-bb0a-a8b8ef92b21d-metrics-tls\") pod \"dns-operator-744455d44c-q2lm7\" (UID: \"0289197b-e586-477b-bb0a-a8b8ef92b21d\") " pod="openshift-dns-operator/dns-operator-744455d44c-q2lm7" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.251310 
4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjppg\" (UniqueName: \"kubernetes.io/projected/4282a4a7-f7d9-4c0e-9638-02c793c2e2e6-kube-api-access-mjppg\") pod \"apiserver-7bbb656c7d-grpt7\" (UID: \"4282a4a7-f7d9-4c0e-9638-02c793c2e2e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grpt7" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.251328 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-xfp4f\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.251345 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k294v\" (UniqueName: \"kubernetes.io/projected/cc9419c8-c23a-418b-8fba-9956bed2a193-kube-api-access-k294v\") pod \"oauth-openshift-558db77b4-xfp4f\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.251366 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w24jm\" (UniqueName: \"kubernetes.io/projected/92a495f2-d3a4-42b9-82f9-48b4dda31caf-kube-api-access-w24jm\") pod \"openshift-apiserver-operator-796bbdcf4f-4p7vx\" (UID: \"92a495f2-d3a4-42b9-82f9-48b4dda31caf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4p7vx" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.251388 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bab741b-822c-4548-8333-aa3f90ecd8a0-config\") pod 
\"controller-manager-879f6c89f-6j748\" (UID: \"0bab741b-822c-4548-8333-aa3f90ecd8a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6j748" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.252126 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4282a4a7-f7d9-4c0e-9638-02c793c2e2e6-audit-policies\") pod \"apiserver-7bbb656c7d-grpt7\" (UID: \"4282a4a7-f7d9-4c0e-9638-02c793c2e2e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grpt7" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.252168 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cc9419c8-c23a-418b-8fba-9956bed2a193-audit-dir\") pod \"oauth-openshift-558db77b4-xfp4f\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.252188 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-xfp4f\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.252216 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk5fz\" (UniqueName: \"kubernetes.io/projected/25ad6c89-d2e0-408e-8c6f-a49da5a55bdd-kube-api-access-rk5fz\") pod \"machine-api-operator-5694c8668f-2kdt2\" (UID: \"25ad6c89-d2e0-408e-8c6f-a49da5a55bdd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2kdt2" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.252238 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ced2dcc4-9906-4cbe-b163-41ff41bb4f02-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8srf6\" (UID: \"ced2dcc4-9906-4cbe-b163-41ff41bb4f02\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8srf6" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.252262 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/af8fff67-ed41-4e2d-af5f-cf76cd2a4234-trusted-ca\") pod \"console-operator-58897d9998-zq969\" (UID: \"af8fff67-ed41-4e2d-af5f-cf76cd2a4234\") " pod="openshift-console-operator/console-operator-58897d9998-zq969" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.252279 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4282a4a7-f7d9-4c0e-9638-02c793c2e2e6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-grpt7\" (UID: \"4282a4a7-f7d9-4c0e-9638-02c793c2e2e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grpt7" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.252297 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/25ad6c89-d2e0-408e-8c6f-a49da5a55bdd-images\") pod \"machine-api-operator-5694c8668f-2kdt2\" (UID: \"25ad6c89-d2e0-408e-8c6f-a49da5a55bdd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2kdt2" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.252316 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-serving-cert\") pod \"apiserver-76f77b778f-ksvjd\" (UID: \"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb\") " 
pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.252338 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4282a4a7-f7d9-4c0e-9638-02c793c2e2e6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-grpt7\" (UID: \"4282a4a7-f7d9-4c0e-9638-02c793c2e2e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grpt7" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.252359 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27d7fca4-c1f9-411f-b705-e36c0c1e8356-config\") pod \"machine-approver-56656f9798-j7fv5\" (UID: \"27d7fca4-c1f9-411f-b705-e36c0c1e8356\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j7fv5" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.252379 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbj9x\" (UniqueName: \"kubernetes.io/projected/0bab741b-822c-4548-8333-aa3f90ecd8a0-kube-api-access-gbj9x\") pod \"controller-manager-879f6c89f-6j748\" (UID: \"0bab741b-822c-4548-8333-aa3f90ecd8a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6j748" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.252400 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/45ae05cc-9b3e-40e0-9235-f0cf1f6def5f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-72x4w\" (UID: \"45ae05cc-9b3e-40e0-9235-f0cf1f6def5f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-72x4w" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.252430 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/0bab741b-822c-4548-8333-aa3f90ecd8a0-client-ca\") pod \"controller-manager-879f6c89f-6j748\" (UID: \"0bab741b-822c-4548-8333-aa3f90ecd8a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6j748" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.252447 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-config\") pod \"apiserver-76f77b778f-ksvjd\" (UID: \"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb\") " pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.252466 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92a495f2-d3a4-42b9-82f9-48b4dda31caf-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-4p7vx\" (UID: \"92a495f2-d3a4-42b9-82f9-48b4dda31caf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4p7vx" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.252482 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-xfp4f\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.252502 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpdhl\" (UniqueName: \"kubernetes.io/projected/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-kube-api-access-lpdhl\") pod \"apiserver-76f77b778f-ksvjd\" (UID: \"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb\") " pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 
06:47:44.252497 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/960b4f98-8229-42eb-9755-891df394483c-service-ca-bundle\") pod \"authentication-operator-69f744f599-scbgc\" (UID: \"960b4f98-8229-42eb-9755-891df394483c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-scbgc" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.252519 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4282a4a7-f7d9-4c0e-9638-02c793c2e2e6-etcd-client\") pod \"apiserver-7bbb656c7d-grpt7\" (UID: \"4282a4a7-f7d9-4c0e-9638-02c793c2e2e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grpt7" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.252584 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af8fff67-ed41-4e2d-af5f-cf76cd2a4234-serving-cert\") pod \"console-operator-58897d9998-zq969\" (UID: \"af8fff67-ed41-4e2d-af5f-cf76cd2a4234\") " pod="openshift-console-operator/console-operator-58897d9998-zq969" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.252617 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bab741b-822c-4548-8333-aa3f90ecd8a0-serving-cert\") pod \"controller-manager-879f6c89f-6j748\" (UID: \"0bab741b-822c-4548-8333-aa3f90ecd8a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6j748" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.252636 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/01203ed2-23cf-4b39-ae6c-6ed22c0d66c0-client-ca\") pod \"route-controller-manager-6576b87f9c-grsjk\" (UID: \"01203ed2-23cf-4b39-ae6c-6ed22c0d66c0\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-grsjk" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.252653 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcjrc\" (UniqueName: \"kubernetes.io/projected/01203ed2-23cf-4b39-ae6c-6ed22c0d66c0-kube-api-access-wcjrc\") pod \"route-controller-manager-6576b87f9c-grsjk\" (UID: \"01203ed2-23cf-4b39-ae6c-6ed22c0d66c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-grsjk" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.252671 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hblmv\" (UniqueName: \"kubernetes.io/projected/960b4f98-8229-42eb-9755-891df394483c-kube-api-access-hblmv\") pod \"authentication-operator-69f744f599-scbgc\" (UID: \"960b4f98-8229-42eb-9755-891df394483c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-scbgc" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.252690 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4282a4a7-f7d9-4c0e-9638-02c793c2e2e6-serving-cert\") pod \"apiserver-7bbb656c7d-grpt7\" (UID: \"4282a4a7-f7d9-4c0e-9638-02c793c2e2e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grpt7" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.252714 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-node-pullsecrets\") pod \"apiserver-76f77b778f-ksvjd\" (UID: \"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb\") " pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.252747 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/960b4f98-8229-42eb-9755-891df394483c-config\") pod \"authentication-operator-69f744f599-scbgc\" (UID: \"960b4f98-8229-42eb-9755-891df394483c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-scbgc" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.252764 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-trusted-ca-bundle\") pod \"apiserver-76f77b778f-ksvjd\" (UID: \"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb\") " pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.252783 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4282a4a7-f7d9-4c0e-9638-02c793c2e2e6-audit-dir\") pod \"apiserver-7bbb656c7d-grpt7\" (UID: \"4282a4a7-f7d9-4c0e-9638-02c793c2e2e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grpt7" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.252812 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wj4pk\" (UniqueName: \"kubernetes.io/projected/27d7fca4-c1f9-411f-b705-e36c0c1e8356-kube-api-access-wj4pk\") pod \"machine-approver-56656f9798-j7fv5\" (UID: \"27d7fca4-c1f9-411f-b705-e36c0c1e8356\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j7fv5" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.252829 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-xfp4f\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:47:44 
crc kubenswrapper[4825]: I0310 06:47:44.252849 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-xfp4f\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.252868 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-xfp4f\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.252889 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-encryption-config\") pod \"apiserver-76f77b778f-ksvjd\" (UID: \"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb\") " pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.252905 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-audit-dir\") pod \"apiserver-76f77b778f-ksvjd\" (UID: \"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb\") " pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.252921 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-xfp4f\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.252948 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/960b4f98-8229-42eb-9755-891df394483c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-scbgc\" (UID: \"960b4f98-8229-42eb-9755-891df394483c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-scbgc" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.252971 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttpjj\" (UniqueName: \"kubernetes.io/projected/ced2dcc4-9906-4cbe-b163-41ff41bb4f02-kube-api-access-ttpjj\") pod \"cluster-samples-operator-665b6dd947-8srf6\" (UID: \"ced2dcc4-9906-4cbe-b163-41ff41bb4f02\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8srf6" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.252987 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4dhj\" (UniqueName: \"kubernetes.io/projected/d5867ece-9f04-49e4-8a81-1666fc1849ea-kube-api-access-g4dhj\") pod \"openshift-controller-manager-operator-756b6f6bc6-qwnns\" (UID: \"d5867ece-9f04-49e4-8a81-1666fc1849ea\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qwnns" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.253012 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01203ed2-23cf-4b39-ae6c-6ed22c0d66c0-config\") pod \"route-controller-manager-6576b87f9c-grsjk\" (UID: \"01203ed2-23cf-4b39-ae6c-6ed22c0d66c0\") 
" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-grsjk" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.253040 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-image-import-ca\") pod \"apiserver-76f77b778f-ksvjd\" (UID: \"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb\") " pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.253061 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtsbr\" (UniqueName: \"kubernetes.io/projected/45ae05cc-9b3e-40e0-9235-f0cf1f6def5f-kube-api-access-vtsbr\") pod \"openshift-config-operator-7777fb866f-72x4w\" (UID: \"45ae05cc-9b3e-40e0-9235-f0cf1f6def5f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-72x4w" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.253077 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01203ed2-23cf-4b39-ae6c-6ed22c0d66c0-serving-cert\") pod \"route-controller-manager-6576b87f9c-grsjk\" (UID: \"01203ed2-23cf-4b39-ae6c-6ed22c0d66c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-grsjk" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.253095 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-audit\") pod \"apiserver-76f77b778f-ksvjd\" (UID: \"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb\") " pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.253115 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d5867ece-9f04-49e4-8a81-1666fc1849ea-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-qwnns\" (UID: \"d5867ece-9f04-49e4-8a81-1666fc1849ea\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qwnns" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.253164 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0bab741b-822c-4548-8333-aa3f90ecd8a0-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6j748\" (UID: \"0bab741b-822c-4548-8333-aa3f90ecd8a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6j748" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.253202 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25ad6c89-d2e0-408e-8c6f-a49da5a55bdd-config\") pod \"machine-api-operator-5694c8668f-2kdt2\" (UID: \"25ad6c89-d2e0-408e-8c6f-a49da5a55bdd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2kdt2" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.253229 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/27d7fca4-c1f9-411f-b705-e36c0c1e8356-auth-proxy-config\") pod \"machine-approver-56656f9798-j7fv5\" (UID: \"27d7fca4-c1f9-411f-b705-e36c0c1e8356\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j7fv5" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.253259 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7f54\" (UniqueName: \"kubernetes.io/projected/0289197b-e586-477b-bb0a-a8b8ef92b21d-kube-api-access-n7f54\") pod \"dns-operator-744455d44c-q2lm7\" (UID: \"0289197b-e586-477b-bb0a-a8b8ef92b21d\") " pod="openshift-dns-operator/dns-operator-744455d44c-q2lm7" 
Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.253275 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cc9419c8-c23a-418b-8fba-9956bed2a193-audit-policies\") pod \"oauth-openshift-558db77b4-xfp4f\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.253290 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwgn4\" (UniqueName: \"kubernetes.io/projected/af8fff67-ed41-4e2d-af5f-cf76cd2a4234-kube-api-access-rwgn4\") pod \"console-operator-58897d9998-zq969\" (UID: \"af8fff67-ed41-4e2d-af5f-cf76cd2a4234\") " pod="openshift-console-operator/console-operator-58897d9998-zq969" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.253313 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/25ad6c89-d2e0-408e-8c6f-a49da5a55bdd-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-2kdt2\" (UID: \"25ad6c89-d2e0-408e-8c6f-a49da5a55bdd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2kdt2" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.253337 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-xfp4f\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.253430 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/25ad6c89-d2e0-408e-8c6f-a49da5a55bdd-images\") pod \"machine-api-operator-5694c8668f-2kdt2\" (UID: \"25ad6c89-d2e0-408e-8c6f-a49da5a55bdd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2kdt2" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.253600 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/960b4f98-8229-42eb-9755-891df394483c-serving-cert\") pod \"authentication-operator-69f744f599-scbgc\" (UID: \"960b4f98-8229-42eb-9755-891df394483c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-scbgc" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.253627 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4282a4a7-f7d9-4c0e-9638-02c793c2e2e6-encryption-config\") pod \"apiserver-7bbb656c7d-grpt7\" (UID: \"4282a4a7-f7d9-4c0e-9638-02c793c2e2e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grpt7" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.253651 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5867ece-9f04-49e4-8a81-1666fc1849ea-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-qwnns\" (UID: \"d5867ece-9f04-49e4-8a81-1666fc1849ea\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qwnns" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.253670 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-xfp4f\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.253703 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-etcd-client\") pod \"apiserver-76f77b778f-ksvjd\" (UID: \"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb\") " pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.253719 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-etcd-serving-ca\") pod \"apiserver-76f77b778f-ksvjd\" (UID: \"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb\") " pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.253712 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-audit-dir\") pod \"apiserver-76f77b778f-ksvjd\" (UID: \"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb\") " pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.253739 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af8fff67-ed41-4e2d-af5f-cf76cd2a4234-config\") pod \"console-operator-58897d9998-zq969\" (UID: \"af8fff67-ed41-4e2d-af5f-cf76cd2a4234\") " pod="openshift-console-operator/console-operator-58897d9998-zq969" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.253753 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/01203ed2-23cf-4b39-ae6c-6ed22c0d66c0-client-ca\") pod \"route-controller-manager-6576b87f9c-grsjk\" (UID: \"01203ed2-23cf-4b39-ae6c-6ed22c0d66c0\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-grsjk" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.253760 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45ae05cc-9b3e-40e0-9235-f0cf1f6def5f-serving-cert\") pod \"openshift-config-operator-7777fb866f-72x4w\" (UID: \"45ae05cc-9b3e-40e0-9235-f0cf1f6def5f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-72x4w" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.254240 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-node-pullsecrets\") pod \"apiserver-76f77b778f-ksvjd\" (UID: \"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb\") " pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.254563 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25ad6c89-d2e0-408e-8c6f-a49da5a55bdd-config\") pod \"machine-api-operator-5694c8668f-2kdt2\" (UID: \"25ad6c89-d2e0-408e-8c6f-a49da5a55bdd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2kdt2" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.254812 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-config\") pod \"apiserver-76f77b778f-ksvjd\" (UID: \"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb\") " pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.255025 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/960b4f98-8229-42eb-9755-891df394483c-config\") pod \"authentication-operator-69f744f599-scbgc\" (UID: 
\"960b4f98-8229-42eb-9755-891df394483c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-scbgc" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.255233 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.255258 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/960b4f98-8229-42eb-9755-891df394483c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-scbgc\" (UID: \"960b4f98-8229-42eb-9755-891df394483c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-scbgc" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.255996 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-image-import-ca\") pod \"apiserver-76f77b778f-ksvjd\" (UID: \"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb\") " pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.256581 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-tgcvt"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.257091 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-tgcvt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.259786 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-6bjtt"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.260204 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vlkfb"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.262600 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5xr9"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.262996 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mxs62"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.264097 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-6bjtt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.264243 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.264790 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5xr9" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.264939 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lvqmg"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.265375 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mxs62" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.266015 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lvqmg" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.271663 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.276019 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/960b4f98-8229-42eb-9755-891df394483c-serving-cert\") pod \"authentication-operator-69f744f599-scbgc\" (UID: \"960b4f98-8229-42eb-9755-891df394483c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-scbgc" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.276789 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-txg9q"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.277495 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-txg9q" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.278248 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-8vfsj"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.278333 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/25ad6c89-d2e0-408e-8c6f-a49da5a55bdd-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-2kdt2\" (UID: \"25ad6c89-d2e0-408e-8c6f-a49da5a55bdd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2kdt2" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.280498 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6j748"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.280643 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-8vfsj" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.282706 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8tdjt"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.283334 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8tdjt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.284227 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-csfgr"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.284920 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-csfgr" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.284943 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-8mnj8"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.285638 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8mnj8" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.286003 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-grpt7"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.288051 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.289886 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m78bw"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.290339 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6577d"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.290565 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m78bw" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.291662 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6577d" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.292422 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-d5v8r"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.293187 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-d5v8r" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.294093 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-7wbbf"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.294643 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-9cr4m"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.294708 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7wbbf" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.295035 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-9cr4m" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.295549 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wp526"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.296071 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wp526" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.296218 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552085-npl6x"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.296763 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552085-npl6x" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.297664 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g8v9j"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.298061 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g8v9j" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.299534 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-hbbcq"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.300354 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nblmw"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.302045 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hbbcq" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.305849 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5kj8n"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.306007 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nblmw" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.307910 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552086-rkhdp"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.311294 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5kj8n" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.311743 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-scbgc"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.311768 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8srf6"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.311779 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qwnns"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.311843 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552086-rkhdp" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.318304 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-72x4w"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.322726 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-79576"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.323911 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.325453 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-zq969"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.327115 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vlkfb"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.328337 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-authentication/oauth-openshift-558db77b4-xfp4f"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.329418 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-6bjtt"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.330520 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-ksvjd"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.331488 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-2kdt2"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.332507 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-q2lm7"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.333711 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-d5v8r"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.334852 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m78bw"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.337553 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-csfgr"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.338778 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-t8r7l"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.339840 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-t8r7l" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.340404 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-s68rr"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.342757 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4p7vx"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.342899 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-s68rr" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.344036 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.344616 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-txg9q"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.346856 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-l5r8d"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.348461 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-grsjk"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.350702 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tc5qk"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.351632 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5xr9"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.352758 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-infra/auto-csr-approver-29552086-rkhdp"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.353988 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mxs62"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.354930 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ced2dcc4-9906-4cbe-b163-41ff41bb4f02-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8srf6\" (UID: \"ced2dcc4-9906-4cbe-b163-41ff41bb4f02\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8srf6" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.354977 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-xfp4f\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.356335 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f0e6d3a-b022-42dd-828e-bd5ec395d06c-service-ca-bundle\") pod \"router-default-5444994796-tgcvt\" (UID: \"9f0e6d3a-b022-42dd-828e-bd5ec395d06c\") " pod="openshift-ingress/router-default-5444994796-tgcvt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.355868 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-xfp4f\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.356388 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnsqk\" (UniqueName: \"kubernetes.io/projected/9f0e6d3a-b022-42dd-828e-bd5ec395d06c-kube-api-access-rnsqk\") pod \"router-default-5444994796-tgcvt\" (UID: \"9f0e6d3a-b022-42dd-828e-bd5ec395d06c\") " pod="openshift-ingress/router-default-5444994796-tgcvt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.355894 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lvqmg"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.356420 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e40273f3-dda6-4e38-b940-284ae6f95e41-images\") pod \"machine-config-operator-74547568cd-csfgr\" (UID: \"e40273f3-dda6-4e38-b940-284ae6f95e41\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-csfgr" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.356510 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/af8fff67-ed41-4e2d-af5f-cf76cd2a4234-trusted-ca\") pod \"console-operator-58897d9998-zq969\" (UID: \"af8fff67-ed41-4e2d-af5f-cf76cd2a4234\") " pod="openshift-console-operator/console-operator-58897d9998-zq969" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.356561 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4282a4a7-f7d9-4c0e-9638-02c793c2e2e6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-grpt7\" (UID: \"4282a4a7-f7d9-4c0e-9638-02c793c2e2e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grpt7" Mar 10 06:47:44 crc 
kubenswrapper[4825]: I0310 06:47:44.356603 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4282a4a7-f7d9-4c0e-9638-02c793c2e2e6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-grpt7\" (UID: \"4282a4a7-f7d9-4c0e-9638-02c793c2e2e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grpt7" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.356625 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27d7fca4-c1f9-411f-b705-e36c0c1e8356-config\") pod \"machine-approver-56656f9798-j7fv5\" (UID: \"27d7fca4-c1f9-411f-b705-e36c0c1e8356\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j7fv5" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.356660 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/45ae05cc-9b3e-40e0-9235-f0cf1f6def5f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-72x4w\" (UID: \"45ae05cc-9b3e-40e0-9235-f0cf1f6def5f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-72x4w" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.356691 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9f0e6d3a-b022-42dd-828e-bd5ec395d06c-default-certificate\") pod \"router-default-5444994796-tgcvt\" (UID: \"9f0e6d3a-b022-42dd-828e-bd5ec395d06c\") " pod="openshift-ingress/router-default-5444994796-tgcvt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.356735 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8a265902-a773-4550-b3fa-79f94c82809c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-8vfsj\" 
(UID: \"8a265902-a773-4550-b3fa-79f94c82809c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8vfsj" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.356757 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bde95e4a-12d5-4b7e-bd4e-9b2527faefa0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-79576\" (UID: \"bde95e4a-12d5-4b7e-bd4e-9b2527faefa0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-79576" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.356783 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f3a60327-2809-415b-abde-d1569a2453b6-console-config\") pod \"console-f9d7485db-6bjtt\" (UID: \"f3a60327-2809-415b-abde-d1569a2453b6\") " pod="openshift-console/console-f9d7485db-6bjtt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.356823 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92a495f2-d3a4-42b9-82f9-48b4dda31caf-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-4p7vx\" (UID: \"92a495f2-d3a4-42b9-82f9-48b4dda31caf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4p7vx" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.356845 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-xfp4f\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.356879 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/af8fff67-ed41-4e2d-af5f-cf76cd2a4234-serving-cert\") pod \"console-operator-58897d9998-zq969\" (UID: \"af8fff67-ed41-4e2d-af5f-cf76cd2a4234\") " pod="openshift-console-operator/console-operator-58897d9998-zq969" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.356901 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4282a4a7-f7d9-4c0e-9638-02c793c2e2e6-etcd-client\") pod \"apiserver-7bbb656c7d-grpt7\" (UID: \"4282a4a7-f7d9-4c0e-9638-02c793c2e2e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grpt7" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.356928 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4282a4a7-f7d9-4c0e-9638-02c793c2e2e6-serving-cert\") pod \"apiserver-7bbb656c7d-grpt7\" (UID: \"4282a4a7-f7d9-4c0e-9638-02c793c2e2e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grpt7" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.356949 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3e73a713-a351-419b-903a-e44041c28d6f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-mxs62\" (UID: \"3e73a713-a351-419b-903a-e44041c28d6f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mxs62" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.356971 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mflkh\" (UniqueName: \"kubernetes.io/projected/8a265902-a773-4550-b3fa-79f94c82809c-kube-api-access-mflkh\") pod \"multus-admission-controller-857f4d67dd-8vfsj\" (UID: \"8a265902-a773-4550-b3fa-79f94c82809c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8vfsj" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 
06:47:44.357008 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bde95e4a-12d5-4b7e-bd4e-9b2527faefa0-trusted-ca\") pod \"ingress-operator-5b745b69d9-79576\" (UID: \"bde95e4a-12d5-4b7e-bd4e-9b2527faefa0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-79576" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.357058 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dzl2\" (UniqueName: \"kubernetes.io/projected/f3a60327-2809-415b-abde-d1569a2453b6-kube-api-access-9dzl2\") pod \"console-f9d7485db-6bjtt\" (UID: \"f3a60327-2809-415b-abde-d1569a2453b6\") " pod="openshift-console/console-f9d7485db-6bjtt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.357103 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4282a4a7-f7d9-4c0e-9638-02c793c2e2e6-audit-dir\") pod \"apiserver-7bbb656c7d-grpt7\" (UID: \"4282a4a7-f7d9-4c0e-9638-02c793c2e2e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grpt7" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.357174 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5zkr\" (UniqueName: \"kubernetes.io/projected/230c8679-e321-40bd-844e-e350e48404e3-kube-api-access-j5zkr\") pod \"catalog-operator-68c6474976-8tdjt\" (UID: \"230c8679-e321-40bd-844e-e350e48404e3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8tdjt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.357210 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-xfp4f\" (UID: 
\"cc9419c8-c23a-418b-8fba-9956bed2a193\") " pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.357231 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-xfp4f\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.357253 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4d23b66c-f736-4af2-9dc7-6167ca4d53ef-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-txg9q\" (UID: \"4d23b66c-f736-4af2-9dc7-6167ca4d53ef\") " pod="openshift-marketplace/marketplace-operator-79b997595-txg9q" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.357283 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wj4pk\" (UniqueName: \"kubernetes.io/projected/27d7fca4-c1f9-411f-b705-e36c0c1e8356-kube-api-access-wj4pk\") pod \"machine-approver-56656f9798-j7fv5\" (UID: \"27d7fca4-c1f9-411f-b705-e36c0c1e8356\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j7fv5" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.357304 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-xfp4f\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.357323 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-xfp4f\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.357346 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbkqz\" (UniqueName: \"kubernetes.io/projected/e40273f3-dda6-4e38-b940-284ae6f95e41-kube-api-access-xbkqz\") pod \"machine-config-operator-74547568cd-csfgr\" (UID: \"e40273f3-dda6-4e38-b940-284ae6f95e41\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-csfgr" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.357364 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdj96\" (UniqueName: \"kubernetes.io/projected/bde95e4a-12d5-4b7e-bd4e-9b2527faefa0-kube-api-access-wdj96\") pod \"ingress-operator-5b745b69d9-79576\" (UID: \"bde95e4a-12d5-4b7e-bd4e-9b2527faefa0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-79576" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.357396 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4dhj\" (UniqueName: \"kubernetes.io/projected/d5867ece-9f04-49e4-8a81-1666fc1849ea-kube-api-access-g4dhj\") pod \"openshift-controller-manager-operator-756b6f6bc6-qwnns\" (UID: \"d5867ece-9f04-49e4-8a81-1666fc1849ea\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qwnns" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.357407 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/4282a4a7-f7d9-4c0e-9638-02c793c2e2e6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-grpt7\" (UID: \"4282a4a7-f7d9-4c0e-9638-02c793c2e2e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grpt7" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.357417 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e40273f3-dda6-4e38-b940-284ae6f95e41-proxy-tls\") pod \"machine-config-operator-74547568cd-csfgr\" (UID: \"e40273f3-dda6-4e38-b940-284ae6f95e41\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-csfgr" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.357443 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttpjj\" (UniqueName: \"kubernetes.io/projected/ced2dcc4-9906-4cbe-b163-41ff41bb4f02-kube-api-access-ttpjj\") pod \"cluster-samples-operator-665b6dd947-8srf6\" (UID: \"ced2dcc4-9906-4cbe-b163-41ff41bb4f02\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8srf6" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.357462 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-8vfsj"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.357465 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3a60327-2809-415b-abde-d1569a2453b6-trusted-ca-bundle\") pod \"console-f9d7485db-6bjtt\" (UID: \"f3a60327-2809-415b-abde-d1569a2453b6\") " pod="openshift-console/console-f9d7485db-6bjtt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.357531 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/4d23b66c-f736-4af2-9dc7-6167ca4d53ef-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-txg9q\" (UID: \"4d23b66c-f736-4af2-9dc7-6167ca4d53ef\") " pod="openshift-marketplace/marketplace-operator-79b997595-txg9q" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.357559 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bde95e4a-12d5-4b7e-bd4e-9b2527faefa0-metrics-tls\") pod \"ingress-operator-5b745b69d9-79576\" (UID: \"bde95e4a-12d5-4b7e-bd4e-9b2527faefa0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-79576" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.357592 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtsbr\" (UniqueName: \"kubernetes.io/projected/45ae05cc-9b3e-40e0-9235-f0cf1f6def5f-kube-api-access-vtsbr\") pod \"openshift-config-operator-7777fb866f-72x4w\" (UID: \"45ae05cc-9b3e-40e0-9235-f0cf1f6def5f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-72x4w" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.357647 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5867ece-9f04-49e4-8a81-1666fc1849ea-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-qwnns\" (UID: \"d5867ece-9f04-49e4-8a81-1666fc1849ea\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qwnns" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.357680 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3e73a713-a351-419b-903a-e44041c28d6f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-mxs62\" (UID: \"3e73a713-a351-419b-903a-e44041c28d6f\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mxs62" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.357714 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/27d7fca4-c1f9-411f-b705-e36c0c1e8356-auth-proxy-config\") pod \"machine-approver-56656f9798-j7fv5\" (UID: \"27d7fca4-c1f9-411f-b705-e36c0c1e8356\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j7fv5" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.357739 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e40273f3-dda6-4e38-b940-284ae6f95e41-auth-proxy-config\") pod \"machine-config-operator-74547568cd-csfgr\" (UID: \"e40273f3-dda6-4e38-b940-284ae6f95e41\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-csfgr" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.357764 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/230c8679-e321-40bd-844e-e350e48404e3-profile-collector-cert\") pod \"catalog-operator-68c6474976-8tdjt\" (UID: \"230c8679-e321-40bd-844e-e350e48404e3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8tdjt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.357791 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7f54\" (UniqueName: \"kubernetes.io/projected/0289197b-e586-477b-bb0a-a8b8ef92b21d-kube-api-access-n7f54\") pod \"dns-operator-744455d44c-q2lm7\" (UID: \"0289197b-e586-477b-bb0a-a8b8ef92b21d\") " pod="openshift-dns-operator/dns-operator-744455d44c-q2lm7" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.357802 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/af8fff67-ed41-4e2d-af5f-cf76cd2a4234-trusted-ca\") pod \"console-operator-58897d9998-zq969\" (UID: \"af8fff67-ed41-4e2d-af5f-cf76cd2a4234\") " pod="openshift-console-operator/console-operator-58897d9998-zq969" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.357818 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cc9419c8-c23a-418b-8fba-9956bed2a193-audit-policies\") pod \"oauth-openshift-558db77b4-xfp4f\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.357852 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f3a60327-2809-415b-abde-d1569a2453b6-oauth-serving-cert\") pod \"console-f9d7485db-6bjtt\" (UID: \"f3a60327-2809-415b-abde-d1569a2453b6\") " pod="openshift-console/console-f9d7485db-6bjtt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.357895 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwgn4\" (UniqueName: \"kubernetes.io/projected/af8fff67-ed41-4e2d-af5f-cf76cd2a4234-kube-api-access-rwgn4\") pod \"console-operator-58897d9998-zq969\" (UID: \"af8fff67-ed41-4e2d-af5f-cf76cd2a4234\") " pod="openshift-console-operator/console-operator-58897d9998-zq969" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.357925 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-xfp4f\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 
06:47:44.357954 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c941f1c7-e449-495d-9af7-e40d1b2f2cd5-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-g5xr9\" (UID: \"c941f1c7-e449-495d-9af7-e40d1b2f2cd5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5xr9" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.357982 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2959v\" (UniqueName: \"kubernetes.io/projected/3e73a713-a351-419b-903a-e44041c28d6f-kube-api-access-2959v\") pod \"cluster-image-registry-operator-dc59b4c8b-mxs62\" (UID: \"3e73a713-a351-419b-903a-e44041c28d6f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mxs62" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.358010 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5867ece-9f04-49e4-8a81-1666fc1849ea-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-qwnns\" (UID: \"d5867ece-9f04-49e4-8a81-1666fc1849ea\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qwnns" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.358037 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-xfp4f\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.358068 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c941f1c7-e449-495d-9af7-e40d1b2f2cd5-config\") pod \"kube-apiserver-operator-766d6c64bb-g5xr9\" (UID: \"c941f1c7-e449-495d-9af7-e40d1b2f2cd5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5xr9" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.358095 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/3e73a713-a351-419b-903a-e44041c28d6f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-mxs62\" (UID: \"3e73a713-a351-419b-903a-e44041c28d6f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mxs62" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.358106 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27d7fca4-c1f9-411f-b705-e36c0c1e8356-config\") pod \"machine-approver-56656f9798-j7fv5\" (UID: \"27d7fca4-c1f9-411f-b705-e36c0c1e8356\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j7fv5" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.358122 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f3a60327-2809-415b-abde-d1569a2453b6-console-oauth-config\") pod \"console-f9d7485db-6bjtt\" (UID: \"f3a60327-2809-415b-abde-d1569a2453b6\") " pod="openshift-console/console-f9d7485db-6bjtt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.358187 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4282a4a7-f7d9-4c0e-9638-02c793c2e2e6-audit-dir\") pod \"apiserver-7bbb656c7d-grpt7\" (UID: \"4282a4a7-f7d9-4c0e-9638-02c793c2e2e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grpt7" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.358199 
4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4282a4a7-f7d9-4c0e-9638-02c793c2e2e6-encryption-config\") pod \"apiserver-7bbb656c7d-grpt7\" (UID: \"4282a4a7-f7d9-4c0e-9638-02c793c2e2e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grpt7" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.358227 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f3a60327-2809-415b-abde-d1569a2453b6-service-ca\") pod \"console-f9d7485db-6bjtt\" (UID: \"f3a60327-2809-415b-abde-d1569a2453b6\") " pod="openshift-console/console-f9d7485db-6bjtt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.358234 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/45ae05cc-9b3e-40e0-9235-f0cf1f6def5f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-72x4w\" (UID: \"45ae05cc-9b3e-40e0-9235-f0cf1f6def5f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-72x4w" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.358289 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/230c8679-e321-40bd-844e-e350e48404e3-srv-cert\") pod \"catalog-operator-68c6474976-8tdjt\" (UID: \"230c8679-e321-40bd-844e-e350e48404e3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8tdjt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.358306 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4282a4a7-f7d9-4c0e-9638-02c793c2e2e6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-grpt7\" (UID: \"4282a4a7-f7d9-4c0e-9638-02c793c2e2e6\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grpt7" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.358317 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f3a60327-2809-415b-abde-d1569a2453b6-console-serving-cert\") pod \"console-f9d7485db-6bjtt\" (UID: \"f3a60327-2809-415b-abde-d1569a2453b6\") " pod="openshift-console/console-f9d7485db-6bjtt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.359416 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cc9419c8-c23a-418b-8fba-9956bed2a193-audit-policies\") pod \"oauth-openshift-558db77b4-xfp4f\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.359498 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af8fff67-ed41-4e2d-af5f-cf76cd2a4234-config\") pod \"console-operator-58897d9998-zq969\" (UID: \"af8fff67-ed41-4e2d-af5f-cf76cd2a4234\") " pod="openshift-console-operator/console-operator-58897d9998-zq969" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.359654 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45ae05cc-9b3e-40e0-9235-f0cf1f6def5f-serving-cert\") pod \"openshift-config-operator-7777fb866f-72x4w\" (UID: \"45ae05cc-9b3e-40e0-9235-f0cf1f6def5f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-72x4w" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.359812 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/27d7fca4-c1f9-411f-b705-e36c0c1e8356-machine-approver-tls\") pod 
\"machine-approver-56656f9798-j7fv5\" (UID: \"27d7fca4-c1f9-411f-b705-e36c0c1e8356\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j7fv5" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.359854 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f0e6d3a-b022-42dd-828e-bd5ec395d06c-metrics-certs\") pod \"router-default-5444994796-tgcvt\" (UID: \"9f0e6d3a-b022-42dd-828e-bd5ec395d06c\") " pod="openshift-ingress/router-default-5444994796-tgcvt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.359964 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0289197b-e586-477b-bb0a-a8b8ef92b21d-metrics-tls\") pod \"dns-operator-744455d44c-q2lm7\" (UID: \"0289197b-e586-477b-bb0a-a8b8ef92b21d\") " pod="openshift-dns-operator/dns-operator-744455d44c-q2lm7" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.360059 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjppg\" (UniqueName: \"kubernetes.io/projected/4282a4a7-f7d9-4c0e-9638-02c793c2e2e6-kube-api-access-mjppg\") pod \"apiserver-7bbb656c7d-grpt7\" (UID: \"4282a4a7-f7d9-4c0e-9638-02c793c2e2e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grpt7" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.360119 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92a495f2-d3a4-42b9-82f9-48b4dda31caf-config\") pod \"openshift-apiserver-operator-796bbdcf4f-4p7vx\" (UID: \"92a495f2-d3a4-42b9-82f9-48b4dda31caf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4p7vx" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.360222 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" 
(UniqueName: \"kubernetes.io/configmap/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-xfp4f\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.360251 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/27d7fca4-c1f9-411f-b705-e36c0c1e8356-auth-proxy-config\") pod \"machine-approver-56656f9798-j7fv5\" (UID: \"27d7fca4-c1f9-411f-b705-e36c0c1e8356\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j7fv5" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.360277 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-xfp4f\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.358038 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-7wbbf"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.360334 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-8mnj8"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.360586 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-xfp4f\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:47:44 crc 
kubenswrapper[4825]: I0310 06:47:44.360642 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w24jm\" (UniqueName: \"kubernetes.io/projected/92a495f2-d3a4-42b9-82f9-48b4dda31caf-kube-api-access-w24jm\") pod \"openshift-apiserver-operator-796bbdcf4f-4p7vx\" (UID: \"92a495f2-d3a4-42b9-82f9-48b4dda31caf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4p7vx" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.360677 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k294v\" (UniqueName: \"kubernetes.io/projected/cc9419c8-c23a-418b-8fba-9956bed2a193-kube-api-access-k294v\") pod \"oauth-openshift-558db77b4-xfp4f\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.360711 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c941f1c7-e449-495d-9af7-e40d1b2f2cd5-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-g5xr9\" (UID: \"c941f1c7-e449-495d-9af7-e40d1b2f2cd5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5xr9" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.360741 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc5w9\" (UniqueName: \"kubernetes.io/projected/4d23b66c-f736-4af2-9dc7-6167ca4d53ef-kube-api-access-mc5w9\") pod \"marketplace-operator-79b997595-txg9q\" (UID: \"4d23b66c-f736-4af2-9dc7-6167ca4d53ef\") " pod="openshift-marketplace/marketplace-operator-79b997595-txg9q" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.360985 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/af8fff67-ed41-4e2d-af5f-cf76cd2a4234-config\") pod \"console-operator-58897d9998-zq969\" (UID: \"af8fff67-ed41-4e2d-af5f-cf76cd2a4234\") " pod="openshift-console-operator/console-operator-58897d9998-zq969" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.362465 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ced2dcc4-9906-4cbe-b163-41ff41bb4f02-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8srf6\" (UID: \"ced2dcc4-9906-4cbe-b163-41ff41bb4f02\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8srf6" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.362780 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-xfp4f\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.363098 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hbbcq"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.363152 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8tdjt"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.363714 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-xfp4f\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.363920 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4282a4a7-f7d9-4c0e-9638-02c793c2e2e6-audit-policies\") pod \"apiserver-7bbb656c7d-grpt7\" (UID: \"4282a4a7-f7d9-4c0e-9638-02c793c2e2e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grpt7" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.363955 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cc9419c8-c23a-418b-8fba-9956bed2a193-audit-dir\") pod \"oauth-openshift-558db77b4-xfp4f\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.363982 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9f0e6d3a-b022-42dd-828e-bd5ec395d06c-stats-auth\") pod \"router-default-5444994796-tgcvt\" (UID: \"9f0e6d3a-b022-42dd-828e-bd5ec395d06c\") " pod="openshift-ingress/router-default-5444994796-tgcvt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.364504 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cc9419c8-c23a-418b-8fba-9956bed2a193-audit-dir\") pod \"oauth-openshift-558db77b4-xfp4f\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.365097 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92a495f2-d3a4-42b9-82f9-48b4dda31caf-config\") pod \"openshift-apiserver-operator-796bbdcf4f-4p7vx\" (UID: \"92a495f2-d3a4-42b9-82f9-48b4dda31caf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4p7vx" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.365508 4825 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.365534 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5867ece-9f04-49e4-8a81-1666fc1849ea-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-qwnns\" (UID: \"d5867ece-9f04-49e4-8a81-1666fc1849ea\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qwnns" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.365617 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-xfp4f\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.365620 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-9cr4m"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.367643 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6577d"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.368415 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-xfp4f\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.369021 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-xfp4f\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.369038 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-xfp4f\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.369262 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552085-npl6x"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.370611 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af8fff67-ed41-4e2d-af5f-cf76cd2a4234-serving-cert\") pod \"console-operator-58897d9998-zq969\" (UID: \"af8fff67-ed41-4e2d-af5f-cf76cd2a4234\") " pod="openshift-console-operator/console-operator-58897d9998-zq969" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.370877 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-xfp4f\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.370984 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nblmw"] Mar 10 06:47:44 crc 
kubenswrapper[4825]: I0310 06:47:44.371716 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45ae05cc-9b3e-40e0-9235-f0cf1f6def5f-serving-cert\") pod \"openshift-config-operator-7777fb866f-72x4w\" (UID: \"45ae05cc-9b3e-40e0-9235-f0cf1f6def5f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-72x4w" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.371927 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-xfp4f\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.371912 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5867ece-9f04-49e4-8a81-1666fc1849ea-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-qwnns\" (UID: \"d5867ece-9f04-49e4-8a81-1666fc1849ea\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qwnns" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.372494 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0289197b-e586-477b-bb0a-a8b8ef92b21d-metrics-tls\") pod \"dns-operator-744455d44c-q2lm7\" (UID: \"0289197b-e586-477b-bb0a-a8b8ef92b21d\") " pod="openshift-dns-operator/dns-operator-744455d44c-q2lm7" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.372893 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-system-service-ca\") pod 
\"oauth-openshift-558db77b4-xfp4f\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.373397 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92a495f2-d3a4-42b9-82f9-48b4dda31caf-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-4p7vx\" (UID: \"92a495f2-d3a4-42b9-82f9-48b4dda31caf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4p7vx" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.373686 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-xfp4f\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.373772 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5kj8n"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.374506 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/27d7fca4-c1f9-411f-b705-e36c0c1e8356-machine-approver-tls\") pod \"machine-approver-56656f9798-j7fv5\" (UID: \"27d7fca4-c1f9-411f-b705-e36c0c1e8356\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j7fv5" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.375065 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wp526"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.378437 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-s68rr"] Mar 
10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.378469 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g8v9j"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.378483 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-2r62c"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.380639 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2r62c" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.381715 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2r62c"] Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.383652 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4282a4a7-f7d9-4c0e-9638-02c793c2e2e6-encryption-config\") pod \"apiserver-7bbb656c7d-grpt7\" (UID: \"4282a4a7-f7d9-4c0e-9638-02c793c2e2e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grpt7" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.383756 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4282a4a7-f7d9-4c0e-9638-02c793c2e2e6-audit-policies\") pod \"apiserver-7bbb656c7d-grpt7\" (UID: \"4282a4a7-f7d9-4c0e-9638-02c793c2e2e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grpt7" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.383853 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4282a4a7-f7d9-4c0e-9638-02c793c2e2e6-serving-cert\") pod \"apiserver-7bbb656c7d-grpt7\" (UID: \"4282a4a7-f7d9-4c0e-9638-02c793c2e2e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grpt7" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.383889 4825 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4282a4a7-f7d9-4c0e-9638-02c793c2e2e6-etcd-client\") pod \"apiserver-7bbb656c7d-grpt7\" (UID: \"4282a4a7-f7d9-4c0e-9638-02c793c2e2e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grpt7" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.384001 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.438014 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbj9x\" (UniqueName: \"kubernetes.io/projected/0bab741b-822c-4548-8333-aa3f90ecd8a0-kube-api-access-gbj9x\") pod \"controller-manager-879f6c89f-6j748\" (UID: \"0bab741b-822c-4548-8333-aa3f90ecd8a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6j748" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.466486 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f0e6d3a-b022-42dd-828e-bd5ec395d06c-metrics-certs\") pod \"router-default-5444994796-tgcvt\" (UID: \"9f0e6d3a-b022-42dd-828e-bd5ec395d06c\") " pod="openshift-ingress/router-default-5444994796-tgcvt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.466594 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c941f1c7-e449-495d-9af7-e40d1b2f2cd5-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-g5xr9\" (UID: \"c941f1c7-e449-495d-9af7-e40d1b2f2cd5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5xr9" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.466625 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc5w9\" (UniqueName: 
\"kubernetes.io/projected/4d23b66c-f736-4af2-9dc7-6167ca4d53ef-kube-api-access-mc5w9\") pod \"marketplace-operator-79b997595-txg9q\" (UID: \"4d23b66c-f736-4af2-9dc7-6167ca4d53ef\") " pod="openshift-marketplace/marketplace-operator-79b997595-txg9q" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.466664 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9f0e6d3a-b022-42dd-828e-bd5ec395d06c-stats-auth\") pod \"router-default-5444994796-tgcvt\" (UID: \"9f0e6d3a-b022-42dd-828e-bd5ec395d06c\") " pod="openshift-ingress/router-default-5444994796-tgcvt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.466698 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f0e6d3a-b022-42dd-828e-bd5ec395d06c-service-ca-bundle\") pod \"router-default-5444994796-tgcvt\" (UID: \"9f0e6d3a-b022-42dd-828e-bd5ec395d06c\") " pod="openshift-ingress/router-default-5444994796-tgcvt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.466729 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnsqk\" (UniqueName: \"kubernetes.io/projected/9f0e6d3a-b022-42dd-828e-bd5ec395d06c-kube-api-access-rnsqk\") pod \"router-default-5444994796-tgcvt\" (UID: \"9f0e6d3a-b022-42dd-828e-bd5ec395d06c\") " pod="openshift-ingress/router-default-5444994796-tgcvt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.466782 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e40273f3-dda6-4e38-b940-284ae6f95e41-images\") pod \"machine-config-operator-74547568cd-csfgr\" (UID: \"e40273f3-dda6-4e38-b940-284ae6f95e41\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-csfgr" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.466838 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9f0e6d3a-b022-42dd-828e-bd5ec395d06c-default-certificate\") pod \"router-default-5444994796-tgcvt\" (UID: \"9f0e6d3a-b022-42dd-828e-bd5ec395d06c\") " pod="openshift-ingress/router-default-5444994796-tgcvt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.466863 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8a265902-a773-4550-b3fa-79f94c82809c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-8vfsj\" (UID: \"8a265902-a773-4550-b3fa-79f94c82809c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8vfsj" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.466902 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bde95e4a-12d5-4b7e-bd4e-9b2527faefa0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-79576\" (UID: \"bde95e4a-12d5-4b7e-bd4e-9b2527faefa0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-79576" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.466927 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f3a60327-2809-415b-abde-d1569a2453b6-console-config\") pod \"console-f9d7485db-6bjtt\" (UID: \"f3a60327-2809-415b-abde-d1569a2453b6\") " pod="openshift-console/console-f9d7485db-6bjtt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.466994 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3e73a713-a351-419b-903a-e44041c28d6f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-mxs62\" (UID: \"3e73a713-a351-419b-903a-e44041c28d6f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mxs62" Mar 10 06:47:44 crc 
kubenswrapper[4825]: I0310 06:47:44.467016 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mflkh\" (UniqueName: \"kubernetes.io/projected/8a265902-a773-4550-b3fa-79f94c82809c-kube-api-access-mflkh\") pod \"multus-admission-controller-857f4d67dd-8vfsj\" (UID: \"8a265902-a773-4550-b3fa-79f94c82809c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8vfsj" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.467045 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bde95e4a-12d5-4b7e-bd4e-9b2527faefa0-trusted-ca\") pod \"ingress-operator-5b745b69d9-79576\" (UID: \"bde95e4a-12d5-4b7e-bd4e-9b2527faefa0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-79576" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.467074 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dzl2\" (UniqueName: \"kubernetes.io/projected/f3a60327-2809-415b-abde-d1569a2453b6-kube-api-access-9dzl2\") pod \"console-f9d7485db-6bjtt\" (UID: \"f3a60327-2809-415b-abde-d1569a2453b6\") " pod="openshift-console/console-f9d7485db-6bjtt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.467146 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5zkr\" (UniqueName: \"kubernetes.io/projected/230c8679-e321-40bd-844e-e350e48404e3-kube-api-access-j5zkr\") pod \"catalog-operator-68c6474976-8tdjt\" (UID: \"230c8679-e321-40bd-844e-e350e48404e3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8tdjt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.467205 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4d23b66c-f736-4af2-9dc7-6167ca4d53ef-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-txg9q\" 
(UID: \"4d23b66c-f736-4af2-9dc7-6167ca4d53ef\") " pod="openshift-marketplace/marketplace-operator-79b997595-txg9q" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.467242 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbkqz\" (UniqueName: \"kubernetes.io/projected/e40273f3-dda6-4e38-b940-284ae6f95e41-kube-api-access-xbkqz\") pod \"machine-config-operator-74547568cd-csfgr\" (UID: \"e40273f3-dda6-4e38-b940-284ae6f95e41\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-csfgr" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.467274 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdj96\" (UniqueName: \"kubernetes.io/projected/bde95e4a-12d5-4b7e-bd4e-9b2527faefa0-kube-api-access-wdj96\") pod \"ingress-operator-5b745b69d9-79576\" (UID: \"bde95e4a-12d5-4b7e-bd4e-9b2527faefa0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-79576" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.467338 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e40273f3-dda6-4e38-b940-284ae6f95e41-proxy-tls\") pod \"machine-config-operator-74547568cd-csfgr\" (UID: \"e40273f3-dda6-4e38-b940-284ae6f95e41\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-csfgr" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.467379 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3a60327-2809-415b-abde-d1569a2453b6-trusted-ca-bundle\") pod \"console-f9d7485db-6bjtt\" (UID: \"f3a60327-2809-415b-abde-d1569a2453b6\") " pod="openshift-console/console-f9d7485db-6bjtt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.467413 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/4d23b66c-f736-4af2-9dc7-6167ca4d53ef-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-txg9q\" (UID: \"4d23b66c-f736-4af2-9dc7-6167ca4d53ef\") " pod="openshift-marketplace/marketplace-operator-79b997595-txg9q" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.467444 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bde95e4a-12d5-4b7e-bd4e-9b2527faefa0-metrics-tls\") pod \"ingress-operator-5b745b69d9-79576\" (UID: \"bde95e4a-12d5-4b7e-bd4e-9b2527faefa0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-79576" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.467500 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3e73a713-a351-419b-903a-e44041c28d6f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-mxs62\" (UID: \"3e73a713-a351-419b-903a-e44041c28d6f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mxs62" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.467529 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e40273f3-dda6-4e38-b940-284ae6f95e41-auth-proxy-config\") pod \"machine-config-operator-74547568cd-csfgr\" (UID: \"e40273f3-dda6-4e38-b940-284ae6f95e41\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-csfgr" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.467553 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/230c8679-e321-40bd-844e-e350e48404e3-profile-collector-cert\") pod \"catalog-operator-68c6474976-8tdjt\" (UID: \"230c8679-e321-40bd-844e-e350e48404e3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8tdjt" Mar 10 06:47:44 crc 
kubenswrapper[4825]: I0310 06:47:44.467605 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f3a60327-2809-415b-abde-d1569a2453b6-oauth-serving-cert\") pod \"console-f9d7485db-6bjtt\" (UID: \"f3a60327-2809-415b-abde-d1569a2453b6\") " pod="openshift-console/console-f9d7485db-6bjtt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.467640 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c941f1c7-e449-495d-9af7-e40d1b2f2cd5-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-g5xr9\" (UID: \"c941f1c7-e449-495d-9af7-e40d1b2f2cd5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5xr9" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.467668 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2959v\" (UniqueName: \"kubernetes.io/projected/3e73a713-a351-419b-903a-e44041c28d6f-kube-api-access-2959v\") pod \"cluster-image-registry-operator-dc59b4c8b-mxs62\" (UID: \"3e73a713-a351-419b-903a-e44041c28d6f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mxs62" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.467699 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c941f1c7-e449-495d-9af7-e40d1b2f2cd5-config\") pod \"kube-apiserver-operator-766d6c64bb-g5xr9\" (UID: \"c941f1c7-e449-495d-9af7-e40d1b2f2cd5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5xr9" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.467729 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/3e73a713-a351-419b-903a-e44041c28d6f-image-registry-operator-tls\") pod 
\"cluster-image-registry-operator-dc59b4c8b-mxs62\" (UID: \"3e73a713-a351-419b-903a-e44041c28d6f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mxs62" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.467749 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f3a60327-2809-415b-abde-d1569a2453b6-console-oauth-config\") pod \"console-f9d7485db-6bjtt\" (UID: \"f3a60327-2809-415b-abde-d1569a2453b6\") " pod="openshift-console/console-f9d7485db-6bjtt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.467774 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f3a60327-2809-415b-abde-d1569a2453b6-service-ca\") pod \"console-f9d7485db-6bjtt\" (UID: \"f3a60327-2809-415b-abde-d1569a2453b6\") " pod="openshift-console/console-f9d7485db-6bjtt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.467834 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/230c8679-e321-40bd-844e-e350e48404e3-srv-cert\") pod \"catalog-operator-68c6474976-8tdjt\" (UID: \"230c8679-e321-40bd-844e-e350e48404e3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8tdjt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.467866 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f3a60327-2809-415b-abde-d1569a2453b6-console-serving-cert\") pod \"console-f9d7485db-6bjtt\" (UID: \"f3a60327-2809-415b-abde-d1569a2453b6\") " pod="openshift-console/console-f9d7485db-6bjtt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.470502 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/e40273f3-dda6-4e38-b940-284ae6f95e41-auth-proxy-config\") pod \"machine-config-operator-74547568cd-csfgr\" (UID: \"e40273f3-dda6-4e38-b940-284ae6f95e41\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-csfgr" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.472509 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bde95e4a-12d5-4b7e-bd4e-9b2527faefa0-metrics-tls\") pod \"ingress-operator-5b745b69d9-79576\" (UID: \"bde95e4a-12d5-4b7e-bd4e-9b2527faefa0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-79576" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.475990 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bde95e4a-12d5-4b7e-bd4e-9b2527faefa0-trusted-ca\") pod \"ingress-operator-5b745b69d9-79576\" (UID: \"bde95e4a-12d5-4b7e-bd4e-9b2527faefa0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-79576" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.481625 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hblmv\" (UniqueName: \"kubernetes.io/projected/960b4f98-8229-42eb-9755-891df394483c-kube-api-access-hblmv\") pod \"authentication-operator-69f744f599-scbgc\" (UID: \"960b4f98-8229-42eb-9755-891df394483c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-scbgc" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.500859 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk5fz\" (UniqueName: \"kubernetes.io/projected/25ad6c89-d2e0-408e-8c6f-a49da5a55bdd-kube-api-access-rk5fz\") pod \"machine-api-operator-5694c8668f-2kdt2\" (UID: \"25ad6c89-d2e0-408e-8c6f-a49da5a55bdd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2kdt2" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.505170 4825 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.524296 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.544197 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.552185 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9f0e6d3a-b022-42dd-828e-bd5ec395d06c-default-certificate\") pod \"router-default-5444994796-tgcvt\" (UID: \"9f0e6d3a-b022-42dd-828e-bd5ec395d06c\") " pod="openshift-ingress/router-default-5444994796-tgcvt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.563994 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.572994 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9f0e6d3a-b022-42dd-828e-bd5ec395d06c-stats-auth\") pod \"router-default-5444994796-tgcvt\" (UID: \"9f0e6d3a-b022-42dd-828e-bd5ec395d06c\") " pod="openshift-ingress/router-default-5444994796-tgcvt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.585459 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.591734 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f0e6d3a-b022-42dd-828e-bd5ec395d06c-metrics-certs\") pod \"router-default-5444994796-tgcvt\" (UID: \"9f0e6d3a-b022-42dd-828e-bd5ec395d06c\") " pod="openshift-ingress/router-default-5444994796-tgcvt" Mar 10 
06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.604653 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.609957 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f0e6d3a-b022-42dd-828e-bd5ec395d06c-service-ca-bundle\") pod \"router-default-5444994796-tgcvt\" (UID: \"9f0e6d3a-b022-42dd-828e-bd5ec395d06c\") " pod="openshift-ingress/router-default-5444994796-tgcvt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.625467 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.644663 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.663886 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.670453 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f3a60327-2809-415b-abde-d1569a2453b6-console-config\") pod \"console-f9d7485db-6bjtt\" (UID: \"f3a60327-2809-415b-abde-d1569a2453b6\") " pod="openshift-console/console-f9d7485db-6bjtt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.685088 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.691248 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f3a60327-2809-415b-abde-d1569a2453b6-service-ca\") pod \"console-f9d7485db-6bjtt\" (UID: \"f3a60327-2809-415b-abde-d1569a2453b6\") " 
pod="openshift-console/console-f9d7485db-6bjtt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.705632 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.735586 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.742570 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3e73a713-a351-419b-903a-e44041c28d6f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-mxs62\" (UID: \"3e73a713-a351-419b-903a-e44041c28d6f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mxs62" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.754051 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.760850 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3a60327-2809-415b-abde-d1569a2453b6-trusted-ca-bundle\") pod \"console-f9d7485db-6bjtt\" (UID: \"f3a60327-2809-415b-abde-d1569a2453b6\") " pod="openshift-console/console-f9d7485db-6bjtt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.765363 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.765881 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-2kdt2" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.774880 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f3a60327-2809-415b-abde-d1569a2453b6-console-serving-cert\") pod \"console-f9d7485db-6bjtt\" (UID: \"f3a60327-2809-415b-abde-d1569a2453b6\") " pod="openshift-console/console-f9d7485db-6bjtt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.778449 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-scbgc" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.785296 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.805925 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.826698 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.831567 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f3a60327-2809-415b-abde-d1569a2453b6-oauth-serving-cert\") pod \"console-f9d7485db-6bjtt\" (UID: \"f3a60327-2809-415b-abde-d1569a2453b6\") " pod="openshift-console/console-f9d7485db-6bjtt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.845835 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.856400 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/f3a60327-2809-415b-abde-d1569a2453b6-console-oauth-config\") pod \"console-f9d7485db-6bjtt\" (UID: \"f3a60327-2809-415b-abde-d1569a2453b6\") " pod="openshift-console/console-f9d7485db-6bjtt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.865909 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.884493 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.905277 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.925189 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.944834 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.956029 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c941f1c7-e449-495d-9af7-e40d1b2f2cd5-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-g5xr9\" (UID: \"c941f1c7-e449-495d-9af7-e40d1b2f2cd5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5xr9" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.968012 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.971826 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c941f1c7-e449-495d-9af7-e40d1b2f2cd5-config\") pod \"kube-apiserver-operator-766d6c64bb-g5xr9\" (UID: \"c941f1c7-e449-495d-9af7-e40d1b2f2cd5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5xr9" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.985563 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 10 06:47:44 crc kubenswrapper[4825]: I0310 06:47:44.996671 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/3e73a713-a351-419b-903a-e44041c28d6f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-mxs62\" (UID: \"3e73a713-a351-419b-903a-e44041c28d6f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mxs62" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.004514 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.025739 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.045499 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.063458 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-2kdt2"] Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.065445 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.078234 4825 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-scbgc"] Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.085546 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.105850 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.125366 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.144535 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.165450 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.177314 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4d23b66c-f736-4af2-9dc7-6167ca4d53ef-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-txg9q\" (UID: \"4d23b66c-f736-4af2-9dc7-6167ca4d53ef\") " pod="openshift-marketplace/marketplace-operator-79b997595-txg9q" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.190720 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.200810 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4d23b66c-f736-4af2-9dc7-6167ca4d53ef-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-txg9q\" (UID: 
\"4d23b66c-f736-4af2-9dc7-6167ca4d53ef\") " pod="openshift-marketplace/marketplace-operator-79b997595-txg9q" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.204300 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.223975 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.233564 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8a265902-a773-4550-b3fa-79f94c82809c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-8vfsj\" (UID: \"8a265902-a773-4550-b3fa-79f94c82809c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8vfsj" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.245614 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 10 06:47:45 crc kubenswrapper[4825]: E0310 06:47:45.253479 4825 secret.go:188] Couldn't get secret openshift-apiserver/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 10 06:47:45 crc kubenswrapper[4825]: E0310 06:47:45.253589 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-serving-cert podName:f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb nodeName:}" failed. No retries permitted until 2026-03-10 06:47:45.753558004 +0000 UTC m=+218.783338619 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-serving-cert") pod "apiserver-76f77b778f-ksvjd" (UID: "f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb") : failed to sync secret cache: timed out waiting for the condition Mar 10 06:47:45 crc kubenswrapper[4825]: E0310 06:47:45.254050 4825 secret.go:188] Couldn't get secret openshift-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 10 06:47:45 crc kubenswrapper[4825]: E0310 06:47:45.254111 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0bab741b-822c-4548-8333-aa3f90ecd8a0-serving-cert podName:0bab741b-822c-4548-8333-aa3f90ecd8a0 nodeName:}" failed. No retries permitted until 2026-03-10 06:47:45.754098592 +0000 UTC m=+218.783879207 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/0bab741b-822c-4548-8333-aa3f90ecd8a0-serving-cert") pod "controller-manager-879f6c89f-6j748" (UID: "0bab741b-822c-4548-8333-aa3f90ecd8a0") : failed to sync secret cache: timed out waiting for the condition Mar 10 06:47:45 crc kubenswrapper[4825]: E0310 06:47:45.254194 4825 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition Mar 10 06:47:45 crc kubenswrapper[4825]: E0310 06:47:45.254236 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0bab741b-822c-4548-8333-aa3f90ecd8a0-client-ca podName:0bab741b-822c-4548-8333-aa3f90ecd8a0 nodeName:}" failed. No retries permitted until 2026-03-10 06:47:45.754219696 +0000 UTC m=+218.784000311 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/0bab741b-822c-4548-8333-aa3f90ecd8a0-client-ca") pod "controller-manager-879f6c89f-6j748" (UID: "0bab741b-822c-4548-8333-aa3f90ecd8a0") : failed to sync configmap cache: timed out waiting for the condition Mar 10 06:47:45 crc kubenswrapper[4825]: E0310 06:47:45.254280 4825 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition Mar 10 06:47:45 crc kubenswrapper[4825]: E0310 06:47:45.254322 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0bab741b-822c-4548-8333-aa3f90ecd8a0-config podName:0bab741b-822c-4548-8333-aa3f90ecd8a0 nodeName:}" failed. No retries permitted until 2026-03-10 06:47:45.754308499 +0000 UTC m=+218.784089124 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/0bab741b-822c-4548-8333-aa3f90ecd8a0-config") pod "controller-manager-879f6c89f-6j748" (UID: "0bab741b-822c-4548-8333-aa3f90ecd8a0") : failed to sync configmap cache: timed out waiting for the condition Mar 10 06:47:45 crc kubenswrapper[4825]: E0310 06:47:45.254375 4825 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: failed to sync configmap cache: timed out waiting for the condition Mar 10 06:47:45 crc kubenswrapper[4825]: E0310 06:47:45.254414 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0bab741b-822c-4548-8333-aa3f90ecd8a0-proxy-ca-bundles podName:0bab741b-822c-4548-8333-aa3f90ecd8a0 nodeName:}" failed. No retries permitted until 2026-03-10 06:47:45.754402672 +0000 UTC m=+218.784183277 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/0bab741b-822c-4548-8333-aa3f90ecd8a0-proxy-ca-bundles") pod "controller-manager-879f6c89f-6j748" (UID: "0bab741b-822c-4548-8333-aa3f90ecd8a0") : failed to sync configmap cache: timed out waiting for the condition Mar 10 06:47:45 crc kubenswrapper[4825]: E0310 06:47:45.254444 4825 secret.go:188] Couldn't get secret openshift-route-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 10 06:47:45 crc kubenswrapper[4825]: E0310 06:47:45.254483 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01203ed2-23cf-4b39-ae6c-6ed22c0d66c0-serving-cert podName:01203ed2-23cf-4b39-ae6c-6ed22c0d66c0 nodeName:}" failed. No retries permitted until 2026-03-10 06:47:45.754469044 +0000 UTC m=+218.784249659 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/01203ed2-23cf-4b39-ae6c-6ed22c0d66c0-serving-cert") pod "route-controller-manager-6576b87f9c-grsjk" (UID: "01203ed2-23cf-4b39-ae6c-6ed22c0d66c0") : failed to sync secret cache: timed out waiting for the condition Mar 10 06:47:45 crc kubenswrapper[4825]: E0310 06:47:45.254516 4825 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-1: failed to sync configmap cache: timed out waiting for the condition Mar 10 06:47:45 crc kubenswrapper[4825]: E0310 06:47:45.254563 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-audit podName:f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb nodeName:}" failed. No retries permitted until 2026-03-10 06:47:45.754546897 +0000 UTC m=+218.784327512 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-audit") pod "apiserver-76f77b778f-ksvjd" (UID: "f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb") : failed to sync configmap cache: timed out waiting for the condition Mar 10 06:47:45 crc kubenswrapper[4825]: E0310 06:47:45.254593 4825 secret.go:188] Couldn't get secret openshift-apiserver/encryption-config-1: failed to sync secret cache: timed out waiting for the condition Mar 10 06:47:45 crc kubenswrapper[4825]: E0310 06:47:45.254642 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-encryption-config podName:f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb nodeName:}" failed. No retries permitted until 2026-03-10 06:47:45.75462974 +0000 UTC m=+218.784410355 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "encryption-config" (UniqueName: "kubernetes.io/secret/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-encryption-config") pod "apiserver-76f77b778f-ksvjd" (UID: "f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb") : failed to sync secret cache: timed out waiting for the condition Mar 10 06:47:45 crc kubenswrapper[4825]: E0310 06:47:45.254797 4825 configmap.go:193] Couldn't get configMap openshift-apiserver/etcd-serving-ca: failed to sync configmap cache: timed out waiting for the condition Mar 10 06:47:45 crc kubenswrapper[4825]: E0310 06:47:45.254841 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-etcd-serving-ca podName:f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb nodeName:}" failed. No retries permitted until 2026-03-10 06:47:45.754829767 +0000 UTC m=+218.784610382 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-etcd-serving-ca") pod "apiserver-76f77b778f-ksvjd" (UID: "f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb") : failed to sync configmap cache: timed out waiting for the condition Mar 10 06:47:45 crc kubenswrapper[4825]: E0310 06:47:45.254872 4825 secret.go:188] Couldn't get secret openshift-apiserver/etcd-client: failed to sync secret cache: timed out waiting for the condition Mar 10 06:47:45 crc kubenswrapper[4825]: E0310 06:47:45.254910 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-etcd-client podName:f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb nodeName:}" failed. No retries permitted until 2026-03-10 06:47:45.754901129 +0000 UTC m=+218.784681744 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-etcd-client") pod "apiserver-76f77b778f-ksvjd" (UID: "f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb") : failed to sync secret cache: timed out waiting for the condition Mar 10 06:47:45 crc kubenswrapper[4825]: E0310 06:47:45.254959 4825 configmap.go:193] Couldn't get configMap openshift-apiserver/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 10 06:47:45 crc kubenswrapper[4825]: E0310 06:47:45.254997 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-trusted-ca-bundle podName:f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb nodeName:}" failed. No retries permitted until 2026-03-10 06:47:45.754980742 +0000 UTC m=+218.784761357 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-trusted-ca-bundle") pod "apiserver-76f77b778f-ksvjd" (UID: "f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb") : failed to sync configmap cache: timed out waiting for the condition Mar 10 06:47:45 crc kubenswrapper[4825]: E0310 06:47:45.257395 4825 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition Mar 10 06:47:45 crc kubenswrapper[4825]: E0310 06:47:45.257740 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/01203ed2-23cf-4b39-ae6c-6ed22c0d66c0-config podName:01203ed2-23cf-4b39-ae6c-6ed22c0d66c0 nodeName:}" failed. No retries permitted until 2026-03-10 06:47:45.757536128 +0000 UTC m=+218.787316743 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/01203ed2-23cf-4b39-ae6c-6ed22c0d66c0-config") pod "route-controller-manager-6576b87f9c-grsjk" (UID: "01203ed2-23cf-4b39-ae6c-6ed22c0d66c0") : failed to sync configmap cache: timed out waiting for the condition Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.269625 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.273896 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/230c8679-e321-40bd-844e-e350e48404e3-profile-collector-cert\") pod \"catalog-operator-68c6474976-8tdjt\" (UID: \"230c8679-e321-40bd-844e-e350e48404e3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8tdjt" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.284248 4825 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.302599 4825 request.go:700] Waited for 1.019022924s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.304257 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.324988 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.336747 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/230c8679-e321-40bd-844e-e350e48404e3-srv-cert\") pod \"catalog-operator-68c6474976-8tdjt\" (UID: \"230c8679-e321-40bd-844e-e350e48404e3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8tdjt" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.345828 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.349907 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e40273f3-dda6-4e38-b940-284ae6f95e41-images\") pod \"machine-config-operator-74547568cd-csfgr\" (UID: \"e40273f3-dda6-4e38-b940-284ae6f95e41\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-csfgr" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.365372 4825 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.383762 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.405270 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.413009 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e40273f3-dda6-4e38-b940-284ae6f95e41-proxy-tls\") pod \"machine-config-operator-74547568cd-csfgr\" (UID: \"e40273f3-dda6-4e38-b940-284ae6f95e41\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-csfgr" Mar 10 06:47:45 crc kubenswrapper[4825]: E0310 06:47:45.414654 4825 projected.go:288] Couldn't get configMap openshift-apiserver/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.425694 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.446698 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 10 06:47:45 crc kubenswrapper[4825]: E0310 06:47:45.455738 4825 projected.go:288] Couldn't get configMap openshift-route-controller-manager/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 10 06:47:45 crc kubenswrapper[4825]: E0310 06:47:45.455839 4825 projected.go:194] Error preparing data for projected volume kube-api-access-wcjrc for pod openshift-route-controller-manager/route-controller-manager-6576b87f9c-grsjk: failed to sync 
configmap cache: timed out waiting for the condition Mar 10 06:47:45 crc kubenswrapper[4825]: E0310 06:47:45.455974 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/01203ed2-23cf-4b39-ae6c-6ed22c0d66c0-kube-api-access-wcjrc podName:01203ed2-23cf-4b39-ae6c-6ed22c0d66c0 nodeName:}" failed. No retries permitted until 2026-03-10 06:47:45.955932178 +0000 UTC m=+218.985712843 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-wcjrc" (UniqueName: "kubernetes.io/projected/01203ed2-23cf-4b39-ae6c-6ed22c0d66c0-kube-api-access-wcjrc") pod "route-controller-manager-6576b87f9c-grsjk" (UID: "01203ed2-23cf-4b39-ae6c-6ed22c0d66c0") : failed to sync configmap cache: timed out waiting for the condition Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.466674 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.485034 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.504941 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.524480 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.545323 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.582186 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication-operator/authentication-operator-69f744f599-scbgc" event={"ID":"960b4f98-8229-42eb-9755-891df394483c","Type":"ContainerStarted","Data":"d360f1e82049681a58567d0288e590eaf95cd9d7888b98799e9c72d924380a12"} Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.582290 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-scbgc" event={"ID":"960b4f98-8229-42eb-9755-891df394483c","Type":"ContainerStarted","Data":"c1b606462077f61c5e6a34818852f33ba3e22ddbdb2a5d5b9bd78f66e3bd6310"} Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.584962 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-2kdt2" event={"ID":"25ad6c89-d2e0-408e-8c6f-a49da5a55bdd","Type":"ContainerStarted","Data":"bbb84efc74fe4d1b837854b687f6e7799f3619c6b05b3c16351e7ea7e9aef49c"} Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.585058 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-2kdt2" event={"ID":"25ad6c89-d2e0-408e-8c6f-a49da5a55bdd","Type":"ContainerStarted","Data":"40285bff2920642cfd159498fa446ebd0b184b9a997c18a1ea359daa59c9daf8"} Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.585092 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-2kdt2" event={"ID":"25ad6c89-d2e0-408e-8c6f-a49da5a55bdd","Type":"ContainerStarted","Data":"8d21ccfb66318a95daec4966ace812210f89248045d8477e2115761e4e0e6633"} Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.586032 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.605909 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 10 06:47:45 
crc kubenswrapper[4825]: I0310 06:47:45.624911 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.644530 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.665634 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.685466 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.705285 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.725864 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.745410 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.765627 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.785992 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.789689 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-etcd-client\") pod \"apiserver-76f77b778f-ksvjd\" (UID: \"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb\") " 
pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.789770 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-etcd-serving-ca\") pod \"apiserver-76f77b778f-ksvjd\" (UID: \"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb\") " pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.789954 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bab741b-822c-4548-8333-aa3f90ecd8a0-config\") pod \"controller-manager-879f6c89f-6j748\" (UID: \"0bab741b-822c-4548-8333-aa3f90ecd8a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6j748" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.790024 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-serving-cert\") pod \"apiserver-76f77b778f-ksvjd\" (UID: \"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb\") " pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.790102 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0bab741b-822c-4548-8333-aa3f90ecd8a0-client-ca\") pod \"controller-manager-879f6c89f-6j748\" (UID: \"0bab741b-822c-4548-8333-aa3f90ecd8a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6j748" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.790243 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bab741b-822c-4548-8333-aa3f90ecd8a0-serving-cert\") pod \"controller-manager-879f6c89f-6j748\" (UID: \"0bab741b-822c-4548-8333-aa3f90ecd8a0\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-6j748" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.790308 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-trusted-ca-bundle\") pod \"apiserver-76f77b778f-ksvjd\" (UID: \"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb\") " pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.790406 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-encryption-config\") pod \"apiserver-76f77b778f-ksvjd\" (UID: \"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb\") " pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.790511 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01203ed2-23cf-4b39-ae6c-6ed22c0d66c0-config\") pod \"route-controller-manager-6576b87f9c-grsjk\" (UID: \"01203ed2-23cf-4b39-ae6c-6ed22c0d66c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-grsjk" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.790577 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0bab741b-822c-4548-8333-aa3f90ecd8a0-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6j748\" (UID: \"0bab741b-822c-4548-8333-aa3f90ecd8a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6j748" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.790629 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01203ed2-23cf-4b39-ae6c-6ed22c0d66c0-serving-cert\") pod 
\"route-controller-manager-6576b87f9c-grsjk\" (UID: \"01203ed2-23cf-4b39-ae6c-6ed22c0d66c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-grsjk" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.790670 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-audit\") pod \"apiserver-76f77b778f-ksvjd\" (UID: \"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb\") " pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.805572 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.825301 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.846296 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.865107 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.886300 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.904808 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.925291 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.944258 4825 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.965012 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.985325 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 10 06:47:45 crc kubenswrapper[4825]: I0310 06:47:45.994649 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcjrc\" (UniqueName: \"kubernetes.io/projected/01203ed2-23cf-4b39-ae6c-6ed22c0d66c0-kube-api-access-wcjrc\") pod \"route-controller-manager-6576b87f9c-grsjk\" (UID: \"01203ed2-23cf-4b39-ae6c-6ed22c0d66c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-grsjk" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.005865 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.025515 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.045854 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.065628 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.104888 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.124268 4825 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.144628 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.165284 4825 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.185400 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.204875 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.243051 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwgn4\" (UniqueName: \"kubernetes.io/projected/af8fff67-ed41-4e2d-af5f-cf76cd2a4234-kube-api-access-rwgn4\") pod \"console-operator-58897d9998-zq969\" (UID: \"af8fff67-ed41-4e2d-af5f-cf76cd2a4234\") " pod="openshift-console-operator/console-operator-58897d9998-zq969" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.291562 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7f54\" (UniqueName: \"kubernetes.io/projected/0289197b-e586-477b-bb0a-a8b8ef92b21d-kube-api-access-n7f54\") pod \"dns-operator-744455d44c-q2lm7\" (UID: \"0289197b-e586-477b-bb0a-a8b8ef92b21d\") " pod="openshift-dns-operator/dns-operator-744455d44c-q2lm7" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.301359 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4dhj\" (UniqueName: \"kubernetes.io/projected/d5867ece-9f04-49e4-8a81-1666fc1849ea-kube-api-access-g4dhj\") pod \"openshift-controller-manager-operator-756b6f6bc6-qwnns\" (UID: 
\"d5867ece-9f04-49e4-8a81-1666fc1849ea\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qwnns" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.302876 4825 request.go:700] Waited for 1.942907875s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-config-operator/serviceaccounts/openshift-config-operator/token Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.305776 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttpjj\" (UniqueName: \"kubernetes.io/projected/ced2dcc4-9906-4cbe-b163-41ff41bb4f02-kube-api-access-ttpjj\") pod \"cluster-samples-operator-665b6dd947-8srf6\" (UID: \"ced2dcc4-9906-4cbe-b163-41ff41bb4f02\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8srf6" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.332901 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtsbr\" (UniqueName: \"kubernetes.io/projected/45ae05cc-9b3e-40e0-9235-f0cf1f6def5f-kube-api-access-vtsbr\") pod \"openshift-config-operator-7777fb866f-72x4w\" (UID: \"45ae05cc-9b3e-40e0-9235-f0cf1f6def5f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-72x4w" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.345406 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjppg\" (UniqueName: \"kubernetes.io/projected/4282a4a7-f7d9-4c0e-9638-02c793c2e2e6-kube-api-access-mjppg\") pod \"apiserver-7bbb656c7d-grpt7\" (UID: \"4282a4a7-f7d9-4c0e-9638-02c793c2e2e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grpt7" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.377240 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wj4pk\" (UniqueName: 
\"kubernetes.io/projected/27d7fca4-c1f9-411f-b705-e36c0c1e8356-kube-api-access-wj4pk\") pod \"machine-approver-56656f9798-j7fv5\" (UID: \"27d7fca4-c1f9-411f-b705-e36c0c1e8356\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j7fv5" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.382866 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w24jm\" (UniqueName: \"kubernetes.io/projected/92a495f2-d3a4-42b9-82f9-48b4dda31caf-kube-api-access-w24jm\") pod \"openshift-apiserver-operator-796bbdcf4f-4p7vx\" (UID: \"92a495f2-d3a4-42b9-82f9-48b4dda31caf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4p7vx" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.384854 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8srf6" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.394716 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-72x4w" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.402887 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-zq969" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.405219 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.413026 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k294v\" (UniqueName: \"kubernetes.io/projected/cc9419c8-c23a-418b-8fba-9956bed2a193-kube-api-access-k294v\") pod \"oauth-openshift-558db77b4-xfp4f\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:47:46 crc kubenswrapper[4825]: E0310 06:47:46.415054 4825 projected.go:288] Couldn't get configMap openshift-apiserver/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 10 06:47:46 crc kubenswrapper[4825]: E0310 06:47:46.415088 4825 projected.go:194] Error preparing data for projected volume kube-api-access-lpdhl for pod openshift-apiserver/apiserver-76f77b778f-ksvjd: failed to sync configmap cache: timed out waiting for the condition Mar 10 06:47:46 crc kubenswrapper[4825]: E0310 06:47:46.415168 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-kube-api-access-lpdhl podName:f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb nodeName:}" failed. No retries permitted until 2026-03-10 06:47:46.915148236 +0000 UTC m=+219.944928851 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-lpdhl" (UniqueName: "kubernetes.io/projected/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-kube-api-access-lpdhl") pod "apiserver-76f77b778f-ksvjd" (UID: "f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb") : failed to sync configmap cache: timed out waiting for the condition Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.421012 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-q2lm7" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.424253 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.426395 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qwnns" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.447013 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.465311 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.505691 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c941f1c7-e449-495d-9af7-e40d1b2f2cd5-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-g5xr9\" (UID: \"c941f1c7-e449-495d-9af7-e40d1b2f2cd5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5xr9" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.519213 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5xr9" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.524523 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc5w9\" (UniqueName: \"kubernetes.io/projected/4d23b66c-f736-4af2-9dc7-6167ca4d53ef-kube-api-access-mc5w9\") pod \"marketplace-operator-79b997595-txg9q\" (UID: \"4d23b66c-f736-4af2-9dc7-6167ca4d53ef\") " pod="openshift-marketplace/marketplace-operator-79b997595-txg9q" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.540452 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdj96\" (UniqueName: \"kubernetes.io/projected/bde95e4a-12d5-4b7e-bd4e-9b2527faefa0-kube-api-access-wdj96\") pod \"ingress-operator-5b745b69d9-79576\" (UID: \"bde95e4a-12d5-4b7e-bd4e-9b2527faefa0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-79576" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.552116 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-txg9q" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.562894 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnsqk\" (UniqueName: \"kubernetes.io/projected/9f0e6d3a-b022-42dd-828e-bd5ec395d06c-kube-api-access-rnsqk\") pod \"router-default-5444994796-tgcvt\" (UID: \"9f0e6d3a-b022-42dd-828e-bd5ec395d06c\") " pod="openshift-ingress/router-default-5444994796-tgcvt" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.589292 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbkqz\" (UniqueName: \"kubernetes.io/projected/e40273f3-dda6-4e38-b940-284ae6f95e41-kube-api-access-xbkqz\") pod \"machine-config-operator-74547568cd-csfgr\" (UID: \"e40273f3-dda6-4e38-b940-284ae6f95e41\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-csfgr" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.598409 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j7fv5" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.598691 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5zkr\" (UniqueName: \"kubernetes.io/projected/230c8679-e321-40bd-844e-e350e48404e3-kube-api-access-j5zkr\") pod \"catalog-operator-68c6474976-8tdjt\" (UID: \"230c8679-e321-40bd-844e-e350e48404e3\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8tdjt" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.617530 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grpt7" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.621036 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dzl2\" (UniqueName: \"kubernetes.io/projected/f3a60327-2809-415b-abde-d1569a2453b6-kube-api-access-9dzl2\") pod \"console-f9d7485db-6bjtt\" (UID: \"f3a60327-2809-415b-abde-d1569a2453b6\") " pod="openshift-console/console-f9d7485db-6bjtt" Mar 10 06:47:46 crc kubenswrapper[4825]: W0310 06:47:46.642018 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27d7fca4_c1f9_411f_b705_e36c0c1e8356.slice/crio-c80289a1cb7e97e8bcc018b2ccc38e97897910f1b79bb39baabd1772f287d518 WatchSource:0}: Error finding container c80289a1cb7e97e8bcc018b2ccc38e97897910f1b79bb39baabd1772f287d518: Status 404 returned error can't find the container with id c80289a1cb7e97e8bcc018b2ccc38e97897910f1b79bb39baabd1772f287d518 Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.645538 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bde95e4a-12d5-4b7e-bd4e-9b2527faefa0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-79576\" (UID: \"bde95e4a-12d5-4b7e-bd4e-9b2527faefa0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-79576" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.646566 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.654797 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4p7vx" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.662065 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2959v\" (UniqueName: \"kubernetes.io/projected/3e73a713-a351-419b-903a-e44041c28d6f-kube-api-access-2959v\") pod \"cluster-image-registry-operator-dc59b4c8b-mxs62\" (UID: \"3e73a713-a351-419b-903a-e44041c28d6f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mxs62" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.682502 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3e73a713-a351-419b-903a-e44041c28d6f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-mxs62\" (UID: \"3e73a713-a351-419b-903a-e44041c28d6f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mxs62" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.700875 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mflkh\" (UniqueName: \"kubernetes.io/projected/8a265902-a773-4550-b3fa-79f94c82809c-kube-api-access-mflkh\") pod \"multus-admission-controller-857f4d67dd-8vfsj\" (UID: \"8a265902-a773-4550-b3fa-79f94c82809c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8vfsj" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.706009 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.719232 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qwnns"] Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.725783 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 10 
06:47:46 crc kubenswrapper[4825]: W0310 06:47:46.734448 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5867ece_9f04_49e4_8a81_1666fc1849ea.slice/crio-d66dc3b7a5d7152f1d26ae559a7a0bd625c9bb7242af531501ade606bbfc9cfd WatchSource:0}: Error finding container d66dc3b7a5d7152f1d26ae559a7a0bd625c9bb7242af531501ade606bbfc9cfd: Status 404 returned error can't find the container with id d66dc3b7a5d7152f1d26ae559a7a0bd625c9bb7242af531501ade606bbfc9cfd Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.734554 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-encryption-config\") pod \"apiserver-76f77b778f-ksvjd\" (UID: \"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb\") " pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.743667 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-79576" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.754172 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.755751 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-72x4w"] Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.764925 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.766771 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-tgcvt" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.767918 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0bab741b-822c-4548-8333-aa3f90ecd8a0-client-ca\") pod \"controller-manager-879f6c89f-6j748\" (UID: \"0bab741b-822c-4548-8333-aa3f90ecd8a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6j748" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.772860 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-serving-cert\") pod \"apiserver-76f77b778f-ksvjd\" (UID: \"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb\") " pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.789339 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 10 06:47:46 crc kubenswrapper[4825]: E0310 06:47:46.790658 4825 configmap.go:193] Couldn't get configMap openshift-apiserver/etcd-serving-ca: failed to sync configmap cache: timed out waiting for the condition Mar 10 06:47:46 crc kubenswrapper[4825]: E0310 06:47:46.790774 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-etcd-serving-ca podName:f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb nodeName:}" failed. No retries permitted until 2026-03-10 06:47:47.790745717 +0000 UTC m=+220.820526332 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-etcd-serving-ca") pod "apiserver-76f77b778f-ksvjd" (UID: "f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb") : failed to sync configmap cache: timed out waiting for the condition Mar 10 06:47:46 crc kubenswrapper[4825]: E0310 06:47:46.790843 4825 secret.go:188] Couldn't get secret openshift-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 10 06:47:46 crc kubenswrapper[4825]: E0310 06:47:46.790921 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0bab741b-822c-4548-8333-aa3f90ecd8a0-serving-cert podName:0bab741b-822c-4548-8333-aa3f90ecd8a0 nodeName:}" failed. No retries permitted until 2026-03-10 06:47:47.790899612 +0000 UTC m=+220.820680227 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/0bab741b-822c-4548-8333-aa3f90ecd8a0-serving-cert") pod "controller-manager-879f6c89f-6j748" (UID: "0bab741b-822c-4548-8333-aa3f90ecd8a0") : failed to sync secret cache: timed out waiting for the condition Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.790948 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5xr9"] Mar 10 06:47:46 crc kubenswrapper[4825]: E0310 06:47:46.790989 4825 secret.go:188] Couldn't get secret openshift-apiserver/etcd-client: failed to sync secret cache: timed out waiting for the condition Mar 10 06:47:46 crc kubenswrapper[4825]: E0310 06:47:46.791010 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-etcd-client podName:f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb nodeName:}" failed. No retries permitted until 2026-03-10 06:47:47.791004275 +0000 UTC m=+220.820784890 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-etcd-client") pod "apiserver-76f77b778f-ksvjd" (UID: "f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb") : failed to sync secret cache: timed out waiting for the condition Mar 10 06:47:46 crc kubenswrapper[4825]: E0310 06:47:46.791062 4825 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: failed to sync configmap cache: timed out waiting for the condition Mar 10 06:47:46 crc kubenswrapper[4825]: E0310 06:47:46.791083 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0bab741b-822c-4548-8333-aa3f90ecd8a0-proxy-ca-bundles podName:0bab741b-822c-4548-8333-aa3f90ecd8a0 nodeName:}" failed. No retries permitted until 2026-03-10 06:47:47.791077038 +0000 UTC m=+220.820857653 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/0bab741b-822c-4548-8333-aa3f90ecd8a0-proxy-ca-bundles") pod "controller-manager-879f6c89f-6j748" (UID: "0bab741b-822c-4548-8333-aa3f90ecd8a0") : failed to sync configmap cache: timed out waiting for the condition Mar 10 06:47:46 crc kubenswrapper[4825]: E0310 06:47:46.791107 4825 configmap.go:193] Couldn't get configMap openshift-apiserver/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 10 06:47:46 crc kubenswrapper[4825]: E0310 06:47:46.791109 4825 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition Mar 10 06:47:46 crc kubenswrapper[4825]: E0310 06:47:46.791142 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-trusted-ca-bundle podName:f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb nodeName:}" failed. 
No retries permitted until 2026-03-10 06:47:47.791122609 +0000 UTC m=+220.820903224 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-trusted-ca-bundle") pod "apiserver-76f77b778f-ksvjd" (UID: "f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb") : failed to sync configmap cache: timed out waiting for the condition Mar 10 06:47:46 crc kubenswrapper[4825]: E0310 06:47:46.791159 4825 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-1: failed to sync configmap cache: timed out waiting for the condition Mar 10 06:47:46 crc kubenswrapper[4825]: E0310 06:47:46.791169 4825 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition Mar 10 06:47:46 crc kubenswrapper[4825]: E0310 06:47:46.791196 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0bab741b-822c-4548-8333-aa3f90ecd8a0-config podName:0bab741b-822c-4548-8333-aa3f90ecd8a0 nodeName:}" failed. No retries permitted until 2026-03-10 06:47:47.791172931 +0000 UTC m=+220.820953546 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/0bab741b-822c-4548-8333-aa3f90ecd8a0-config") pod "controller-manager-879f6c89f-6j748" (UID: "0bab741b-822c-4548-8333-aa3f90ecd8a0") : failed to sync configmap cache: timed out waiting for the condition Mar 10 06:47:46 crc kubenswrapper[4825]: E0310 06:47:46.791212 4825 secret.go:188] Couldn't get secret openshift-route-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 10 06:47:46 crc kubenswrapper[4825]: E0310 06:47:46.791218 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-audit podName:f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb nodeName:}" failed. 
No retries permitted until 2026-03-10 06:47:47.791210182 +0000 UTC m=+220.820990797 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-audit") pod "apiserver-76f77b778f-ksvjd" (UID: "f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb") : failed to sync configmap cache: timed out waiting for the condition Mar 10 06:47:46 crc kubenswrapper[4825]: E0310 06:47:46.791292 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/01203ed2-23cf-4b39-ae6c-6ed22c0d66c0-config podName:01203ed2-23cf-4b39-ae6c-6ed22c0d66c0 nodeName:}" failed. No retries permitted until 2026-03-10 06:47:47.791278885 +0000 UTC m=+220.821059670 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/01203ed2-23cf-4b39-ae6c-6ed22c0d66c0-config") pod "route-controller-manager-6576b87f9c-grsjk" (UID: "01203ed2-23cf-4b39-ae6c-6ed22c0d66c0") : failed to sync configmap cache: timed out waiting for the condition Mar 10 06:47:46 crc kubenswrapper[4825]: E0310 06:47:46.791308 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01203ed2-23cf-4b39-ae6c-6ed22c0d66c0-serving-cert podName:01203ed2-23cf-4b39-ae6c-6ed22c0d66c0 nodeName:}" failed. No retries permitted until 2026-03-10 06:47:47.791299465 +0000 UTC m=+220.821080290 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/01203ed2-23cf-4b39-ae6c-6ed22c0d66c0-serving-cert") pod "route-controller-manager-6576b87f9c-grsjk" (UID: "01203ed2-23cf-4b39-ae6c-6ed22c0d66c0") : failed to sync secret cache: timed out waiting for the condition Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.803175 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-6bjtt" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.804546 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.826059 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.827496 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mxs62" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.829729 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcjrc\" (UniqueName: \"kubernetes.io/projected/01203ed2-23cf-4b39-ae6c-6ed22c0d66c0-kube-api-access-wcjrc\") pod \"route-controller-manager-6576b87f9c-grsjk\" (UID: \"01203ed2-23cf-4b39-ae6c-6ed22c0d66c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-grsjk" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.829996 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-txg9q"] Mar 10 06:47:46 crc kubenswrapper[4825]: W0310 06:47:46.842597 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc941f1c7_e449_495d_9af7_e40d1b2f2cd5.slice/crio-5261e48e87abf14ec4272ba8aa42ae59074a624449ace41c2eb887a264c558ff WatchSource:0}: Error finding container 5261e48e87abf14ec4272ba8aa42ae59074a624449ace41c2eb887a264c558ff: Status 404 returned error can't find the container with id 5261e48e87abf14ec4272ba8aa42ae59074a624449ace41c2eb887a264c558ff Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.845552 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 10 06:47:46 crc 
kubenswrapper[4825]: I0310 06:47:46.859003 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-zq969"] Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.859614 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-8vfsj" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.861312 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8srf6"] Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.864688 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.871972 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8tdjt" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.885995 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-csfgr" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.905607 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.912591 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.914096 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xfp4f"] Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.925874 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.939532 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-q2lm7"] Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.943260 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4p7vx"] Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.945030 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.964444 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-grpt7"] Mar 10 06:47:46 crc kubenswrapper[4825]: W0310 06:47:46.965586 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc9419c8_c23a_418b_8fba_9956bed2a193.slice/crio-122b20ffae414c29feb2d4e6155a74121af80bbb8218ba491dbb9381f65968e4 WatchSource:0}: Error finding container 
122b20ffae414c29feb2d4e6155a74121af80bbb8218ba491dbb9381f65968e4: Status 404 returned error can't find the container with id 122b20ffae414c29feb2d4e6155a74121af80bbb8218ba491dbb9381f65968e4 Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.965904 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 10 06:47:46 crc kubenswrapper[4825]: I0310 06:47:46.992466 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.013185 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpfs5\" (UniqueName: \"kubernetes.io/projected/07b348c9-f5aa-4fbd-a44c-3c2e8f82b0a9-kube-api-access-cpfs5\") pod \"migrator-59844c95c7-8mnj8\" (UID: \"07b348c9-f5aa-4fbd-a44c-3c2e8f82b0a9\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8mnj8" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.013222 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c9944ab9-f72f-44bd-8850-2898feff4a28-ca-trust-extracted\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.013240 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75d9bfda-62b1-4488-9b08-cf1c8753d0da-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lvqmg\" (UID: \"75d9bfda-62b1-4488-9b08-cf1c8753d0da\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lvqmg" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.013257 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7e8fd448-83e0-45d3-aecb-73f7e345c3bd-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-tc5qk\" (UID: \"7e8fd448-83e0-45d3-aecb-73f7e345c3bd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tc5qk" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.013300 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpdhl\" (UniqueName: \"kubernetes.io/projected/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-kube-api-access-lpdhl\") pod \"apiserver-76f77b778f-ksvjd\" (UID: \"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb\") " pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.013320 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c9944ab9-f72f-44bd-8850-2898feff4a28-trusted-ca\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.013347 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c9944ab9-f72f-44bd-8850-2898feff4a28-registry-certificates\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.013386 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hskjb\" (UniqueName: \"kubernetes.io/projected/b756af5b-95bd-47f1-a48e-7fe620e67b8c-kube-api-access-hskjb\") pod \"etcd-operator-b45778765-l5r8d\" (UID: 
\"b756af5b-95bd-47f1-a48e-7fe620e67b8c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l5r8d" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.013401 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e8fd448-83e0-45d3-aecb-73f7e345c3bd-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-tc5qk\" (UID: \"7e8fd448-83e0-45d3-aecb-73f7e345c3bd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tc5qk" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.013430 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c9944ab9-f72f-44bd-8850-2898feff4a28-installation-pull-secrets\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.013468 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b756af5b-95bd-47f1-a48e-7fe620e67b8c-config\") pod \"etcd-operator-b45778765-l5r8d\" (UID: \"b756af5b-95bd-47f1-a48e-7fe620e67b8c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l5r8d" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.013524 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b756af5b-95bd-47f1-a48e-7fe620e67b8c-serving-cert\") pod \"etcd-operator-b45778765-l5r8d\" (UID: \"b756af5b-95bd-47f1-a48e-7fe620e67b8c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l5r8d" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.013564 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.013620 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjdzk\" (UniqueName: \"kubernetes.io/projected/c9944ab9-f72f-44bd-8850-2898feff4a28-kube-api-access-wjdzk\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.013646 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75d9bfda-62b1-4488-9b08-cf1c8753d0da-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lvqmg\" (UID: \"75d9bfda-62b1-4488-9b08-cf1c8753d0da\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lvqmg" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.013676 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e8fd448-83e0-45d3-aecb-73f7e345c3bd-config\") pod \"kube-controller-manager-operator-78b949d7b-tc5qk\" (UID: \"7e8fd448-83e0-45d3-aecb-73f7e345c3bd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tc5qk" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.013704 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b756af5b-95bd-47f1-a48e-7fe620e67b8c-etcd-ca\") pod \"etcd-operator-b45778765-l5r8d\" (UID: 
\"b756af5b-95bd-47f1-a48e-7fe620e67b8c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l5r8d" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.013756 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b756af5b-95bd-47f1-a48e-7fe620e67b8c-etcd-client\") pod \"etcd-operator-b45778765-l5r8d\" (UID: \"b756af5b-95bd-47f1-a48e-7fe620e67b8c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l5r8d" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.013874 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/b756af5b-95bd-47f1-a48e-7fe620e67b8c-etcd-service-ca\") pod \"etcd-operator-b45778765-l5r8d\" (UID: \"b756af5b-95bd-47f1-a48e-7fe620e67b8c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l5r8d" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.013902 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/75d9bfda-62b1-4488-9b08-cf1c8753d0da-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lvqmg\" (UID: \"75d9bfda-62b1-4488-9b08-cf1c8753d0da\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lvqmg" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.013973 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c9944ab9-f72f-44bd-8850-2898feff4a28-bound-sa-token\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.013990 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c9944ab9-f72f-44bd-8850-2898feff4a28-registry-tls\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:47 crc kubenswrapper[4825]: E0310 06:47:47.019168 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 06:47:47.519147313 +0000 UTC m=+220.548927928 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlkfb" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.024251 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.027102 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpdhl\" (UniqueName: \"kubernetes.io/projected/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-kube-api-access-lpdhl\") pod \"apiserver-76f77b778f-ksvjd\" (UID: \"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb\") " pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.048017 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.107538 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-79576"] Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.115239 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.115420 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b756af5b-95bd-47f1-a48e-7fe620e67b8c-serving-cert\") pod \"etcd-operator-b45778765-l5r8d\" (UID: \"b756af5b-95bd-47f1-a48e-7fe620e67b8c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l5r8d" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.115477 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3e18f017-e70d-45b7-a7fc-a9398f698980-mountpoint-dir\") pod \"csi-hostpathplugin-s68rr\" (UID: \"3e18f017-e70d-45b7-a7fc-a9398f698980\") " pod="hostpath-provisioner/csi-hostpathplugin-s68rr" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.115495 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/fa3021cc-fba5-4be1-ab8b-da2e1ad68307-signing-cabundle\") pod \"service-ca-9c57cc56f-d5v8r\" (UID: \"fa3021cc-fba5-4be1-ab8b-da2e1ad68307\") " pod="openshift-service-ca/service-ca-9c57cc56f-d5v8r" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.115512 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/caf1fe52-fc04-4251-bdf1-f4cf1d80f45f-tmpfs\") pod \"packageserver-d55dfcdfc-5kj8n\" (UID: 
\"caf1fe52-fc04-4251-bdf1-f4cf1d80f45f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5kj8n" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.115539 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64lkm\" (UniqueName: \"kubernetes.io/projected/bdb77a6d-e89b-4fbf-9e00-93e23571d248-kube-api-access-64lkm\") pod \"kube-storage-version-migrator-operator-b67b599dd-m78bw\" (UID: \"bdb77a6d-e89b-4fbf-9e00-93e23571d248\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m78bw" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.115564 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f4889476-c0d0-4e4b-986a-f4dcdacce72b-config-volume\") pod \"collect-profiles-29552085-npl6x\" (UID: \"f4889476-c0d0-4e4b-986a-f4dcdacce72b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552085-npl6x" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.115581 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gpdq\" (UniqueName: \"kubernetes.io/projected/fa3021cc-fba5-4be1-ab8b-da2e1ad68307-kube-api-access-4gpdq\") pod \"service-ca-9c57cc56f-d5v8r\" (UID: \"fa3021cc-fba5-4be1-ab8b-da2e1ad68307\") " pod="openshift-service-ca/service-ca-9c57cc56f-d5v8r" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.115617 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7c8037e4-1d42-4e69-a841-8042b4de2ffd-srv-cert\") pod \"olm-operator-6b444d44fb-g8v9j\" (UID: \"7c8037e4-1d42-4e69-a841-8042b4de2ffd\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g8v9j" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.116931 4825 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdb77a6d-e89b-4fbf-9e00-93e23571d248-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-m78bw\" (UID: \"bdb77a6d-e89b-4fbf-9e00-93e23571d248\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m78bw" Mar 10 06:47:47 crc kubenswrapper[4825]: E0310 06:47:47.116997 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:47:47.616974706 +0000 UTC m=+220.646755321 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.117116 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/af13f2b9-c077-4470-8b65-0e2c834717e7-proxy-tls\") pod \"machine-config-controller-84d6567774-6577d\" (UID: \"af13f2b9-c077-4470-8b65-0e2c834717e7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6577d" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.117205 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjdzk\" (UniqueName: \"kubernetes.io/projected/c9944ab9-f72f-44bd-8850-2898feff4a28-kube-api-access-wjdzk\") pod 
\"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.117240 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fba5b680-6beb-4ef8-8e8d-91be04d21de9-config\") pod \"service-ca-operator-777779d784-7wbbf\" (UID: \"fba5b680-6beb-4ef8-8e8d-91be04d21de9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7wbbf" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.117256 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3e18f017-e70d-45b7-a7fc-a9398f698980-registration-dir\") pod \"csi-hostpathplugin-s68rr\" (UID: \"3e18f017-e70d-45b7-a7fc-a9398f698980\") " pod="hostpath-provisioner/csi-hostpathplugin-s68rr" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.117279 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75d9bfda-62b1-4488-9b08-cf1c8753d0da-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lvqmg\" (UID: \"75d9bfda-62b1-4488-9b08-cf1c8753d0da\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lvqmg" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.117301 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vp76\" (UniqueName: \"kubernetes.io/projected/d2e4a51d-47a6-45a0-b510-d25924b2e22a-kube-api-access-4vp76\") pod \"dns-default-hbbcq\" (UID: \"d2e4a51d-47a6-45a0-b510-d25924b2e22a\") " pod="openshift-dns/dns-default-hbbcq" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.117355 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/7e8fd448-83e0-45d3-aecb-73f7e345c3bd-config\") pod \"kube-controller-manager-operator-78b949d7b-tc5qk\" (UID: \"7e8fd448-83e0-45d3-aecb-73f7e345c3bd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tc5qk" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.117376 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gszlc\" (UniqueName: \"kubernetes.io/projected/f4889476-c0d0-4e4b-986a-f4dcdacce72b-kube-api-access-gszlc\") pod \"collect-profiles-29552085-npl6x\" (UID: \"f4889476-c0d0-4e4b-986a-f4dcdacce72b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552085-npl6x" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.117392 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3e18f017-e70d-45b7-a7fc-a9398f698980-csi-data-dir\") pod \"csi-hostpathplugin-s68rr\" (UID: \"3e18f017-e70d-45b7-a7fc-a9398f698980\") " pod="hostpath-provisioner/csi-hostpathplugin-s68rr" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.117407 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7c8037e4-1d42-4e69-a841-8042b4de2ffd-profile-collector-cert\") pod \"olm-operator-6b444d44fb-g8v9j\" (UID: \"7c8037e4-1d42-4e69-a841-8042b4de2ffd\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g8v9j" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.117424 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b756af5b-95bd-47f1-a48e-7fe620e67b8c-etcd-ca\") pod \"etcd-operator-b45778765-l5r8d\" (UID: \"b756af5b-95bd-47f1-a48e-7fe620e67b8c\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-l5r8d" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.117443 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/fa3021cc-fba5-4be1-ab8b-da2e1ad68307-signing-key\") pod \"service-ca-9c57cc56f-d5v8r\" (UID: \"fa3021cc-fba5-4be1-ab8b-da2e1ad68307\") " pod="openshift-service-ca/service-ca-9c57cc56f-d5v8r" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.117472 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bdb77a6d-e89b-4fbf-9e00-93e23571d248-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-m78bw\" (UID: \"bdb77a6d-e89b-4fbf-9e00-93e23571d248\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m78bw" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.117493 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75cbx\" (UniqueName: \"kubernetes.io/projected/3e18f017-e70d-45b7-a7fc-a9398f698980-kube-api-access-75cbx\") pod \"csi-hostpathplugin-s68rr\" (UID: \"3e18f017-e70d-45b7-a7fc-a9398f698980\") " pod="hostpath-provisioner/csi-hostpathplugin-s68rr" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.117564 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3e18f017-e70d-45b7-a7fc-a9398f698980-plugins-dir\") pod \"csi-hostpathplugin-s68rr\" (UID: \"3e18f017-e70d-45b7-a7fc-a9398f698980\") " pod="hostpath-provisioner/csi-hostpathplugin-s68rr" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.117582 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/caf1fe52-fc04-4251-bdf1-f4cf1d80f45f-webhook-cert\") pod \"packageserver-d55dfcdfc-5kj8n\" (UID: \"caf1fe52-fc04-4251-bdf1-f4cf1d80f45f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5kj8n" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.117608 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b756af5b-95bd-47f1-a48e-7fe620e67b8c-etcd-client\") pod \"etcd-operator-b45778765-l5r8d\" (UID: \"b756af5b-95bd-47f1-a48e-7fe620e67b8c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l5r8d" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.117629 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/caf1fe52-fc04-4251-bdf1-f4cf1d80f45f-apiservice-cert\") pod \"packageserver-d55dfcdfc-5kj8n\" (UID: \"caf1fe52-fc04-4251-bdf1-f4cf1d80f45f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5kj8n" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.117675 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt8l4\" (UniqueName: \"kubernetes.io/projected/d2c0d6e4-6024-48a0-8bca-5069baf7a8ab-kube-api-access-bt8l4\") pod \"downloads-7954f5f757-9cr4m\" (UID: \"d2c0d6e4-6024-48a0-8bca-5069baf7a8ab\") " pod="openshift-console/downloads-7954f5f757-9cr4m" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.117694 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jh49\" (UniqueName: \"kubernetes.io/projected/19b3fb37-5072-45f0-8349-1296e31a1193-kube-api-access-6jh49\") pod \"auto-csr-approver-29552086-rkhdp\" (UID: \"19b3fb37-5072-45f0-8349-1296e31a1193\") " pod="openshift-infra/auto-csr-approver-29552086-rkhdp" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.117713 4825 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s986l\" (UniqueName: \"kubernetes.io/projected/caf1fe52-fc04-4251-bdf1-f4cf1d80f45f-kube-api-access-s986l\") pod \"packageserver-d55dfcdfc-5kj8n\" (UID: \"caf1fe52-fc04-4251-bdf1-f4cf1d80f45f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5kj8n" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.117728 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/557beeba-985a-429a-9814-55e49dcb4e44-certs\") pod \"machine-config-server-t8r7l\" (UID: \"557beeba-985a-429a-9814-55e49dcb4e44\") " pod="openshift-machine-config-operator/machine-config-server-t8r7l" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.117805 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d2e4a51d-47a6-45a0-b510-d25924b2e22a-config-volume\") pod \"dns-default-hbbcq\" (UID: \"d2e4a51d-47a6-45a0-b510-d25924b2e22a\") " pod="openshift-dns/dns-default-hbbcq" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.117835 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2e4a51d-47a6-45a0-b510-d25924b2e22a-metrics-tls\") pod \"dns-default-hbbcq\" (UID: \"d2e4a51d-47a6-45a0-b510-d25924b2e22a\") " pod="openshift-dns/dns-default-hbbcq" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.117853 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksz7r\" (UniqueName: \"kubernetes.io/projected/14642521-187c-45cf-aa34-cfd4fa40e632-kube-api-access-ksz7r\") pod \"control-plane-machine-set-operator-78cbb6b69f-wp526\" (UID: \"14642521-187c-45cf-aa34-cfd4fa40e632\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wp526" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.117872 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/b756af5b-95bd-47f1-a48e-7fe620e67b8c-etcd-service-ca\") pod \"etcd-operator-b45778765-l5r8d\" (UID: \"b756af5b-95bd-47f1-a48e-7fe620e67b8c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l5r8d" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.117903 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-268z8\" (UniqueName: \"kubernetes.io/projected/fba5b680-6beb-4ef8-8e8d-91be04d21de9-kube-api-access-268z8\") pod \"service-ca-operator-777779d784-7wbbf\" (UID: \"fba5b680-6beb-4ef8-8e8d-91be04d21de9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7wbbf" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.117922 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/14642521-187c-45cf-aa34-cfd4fa40e632-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-wp526\" (UID: \"14642521-187c-45cf-aa34-cfd4fa40e632\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wp526" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.117948 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/75d9bfda-62b1-4488-9b08-cf1c8753d0da-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lvqmg\" (UID: \"75d9bfda-62b1-4488-9b08-cf1c8753d0da\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lvqmg" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.117966 4825 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/af13f2b9-c077-4470-8b65-0e2c834717e7-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6577d\" (UID: \"af13f2b9-c077-4470-8b65-0e2c834717e7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6577d"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.117998 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxn9n\" (UniqueName: \"kubernetes.io/projected/7c8037e4-1d42-4e69-a841-8042b4de2ffd-kube-api-access-gxn9n\") pod \"olm-operator-6b444d44fb-g8v9j\" (UID: \"7c8037e4-1d42-4e69-a841-8042b4de2ffd\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g8v9j"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.118044 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/6b13c3a5-b9b7-421b-8c46-40a7408d6086-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-nblmw\" (UID: \"6b13c3a5-b9b7-421b-8c46-40a7408d6086\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nblmw"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.118062 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prxch\" (UniqueName: \"kubernetes.io/projected/6b13c3a5-b9b7-421b-8c46-40a7408d6086-kube-api-access-prxch\") pod \"package-server-manager-789f6589d5-nblmw\" (UID: \"6b13c3a5-b9b7-421b-8c46-40a7408d6086\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nblmw"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.118081 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7qm7\" (UniqueName: \"kubernetes.io/projected/c83fc408-8240-480a-834d-c791da137960-kube-api-access-g7qm7\") pod \"ingress-canary-2r62c\" (UID: \"c83fc408-8240-480a-834d-c791da137960\") " pod="openshift-ingress-canary/ingress-canary-2r62c"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.118110 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c9944ab9-f72f-44bd-8850-2898feff4a28-bound-sa-token\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.118125 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj6md\" (UniqueName: \"kubernetes.io/projected/557beeba-985a-429a-9814-55e49dcb4e44-kube-api-access-pj6md\") pod \"machine-config-server-t8r7l\" (UID: \"557beeba-985a-429a-9814-55e49dcb4e44\") " pod="openshift-machine-config-operator/machine-config-server-t8r7l"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.118164 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c9944ab9-f72f-44bd-8850-2898feff4a28-registry-tls\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.118184 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpfs5\" (UniqueName: \"kubernetes.io/projected/07b348c9-f5aa-4fbd-a44c-3c2e8f82b0a9-kube-api-access-cpfs5\") pod \"migrator-59844c95c7-8mnj8\" (UID: \"07b348c9-f5aa-4fbd-a44c-3c2e8f82b0a9\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8mnj8"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.118210 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/557beeba-985a-429a-9814-55e49dcb4e44-node-bootstrap-token\") pod \"machine-config-server-t8r7l\" (UID: \"557beeba-985a-429a-9814-55e49dcb4e44\") " pod="openshift-machine-config-operator/machine-config-server-t8r7l"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.118237 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c9944ab9-f72f-44bd-8850-2898feff4a28-ca-trust-extracted\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.118253 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75d9bfda-62b1-4488-9b08-cf1c8753d0da-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lvqmg\" (UID: \"75d9bfda-62b1-4488-9b08-cf1c8753d0da\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lvqmg"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.118271 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7e8fd448-83e0-45d3-aecb-73f7e345c3bd-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-tc5qk\" (UID: \"7e8fd448-83e0-45d3-aecb-73f7e345c3bd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tc5qk"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.118831 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/b756af5b-95bd-47f1-a48e-7fe620e67b8c-etcd-service-ca\") pod \"etcd-operator-b45778765-l5r8d\" (UID: \"b756af5b-95bd-47f1-a48e-7fe620e67b8c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l5r8d"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.119589 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e8fd448-83e0-45d3-aecb-73f7e345c3bd-config\") pod \"kube-controller-manager-operator-78b949d7b-tc5qk\" (UID: \"7e8fd448-83e0-45d3-aecb-73f7e345c3bd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tc5qk"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.119743 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75d9bfda-62b1-4488-9b08-cf1c8753d0da-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lvqmg\" (UID: \"75d9bfda-62b1-4488-9b08-cf1c8753d0da\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lvqmg"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.119850 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c9944ab9-f72f-44bd-8850-2898feff4a28-trusted-ca\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.119881 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f4889476-c0d0-4e4b-986a-f4dcdacce72b-secret-volume\") pod \"collect-profiles-29552085-npl6x\" (UID: \"f4889476-c0d0-4e4b-986a-f4dcdacce72b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552085-npl6x"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.119926 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c9944ab9-f72f-44bd-8850-2898feff4a28-registry-certificates\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.119948 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e8fd448-83e0-45d3-aecb-73f7e345c3bd-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-tc5qk\" (UID: \"7e8fd448-83e0-45d3-aecb-73f7e345c3bd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tc5qk"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.119965 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hskjb\" (UniqueName: \"kubernetes.io/projected/b756af5b-95bd-47f1-a48e-7fe620e67b8c-kube-api-access-hskjb\") pod \"etcd-operator-b45778765-l5r8d\" (UID: \"b756af5b-95bd-47f1-a48e-7fe620e67b8c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l5r8d"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.119995 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt6xr\" (UniqueName: \"kubernetes.io/projected/af13f2b9-c077-4470-8b65-0e2c834717e7-kube-api-access-pt6xr\") pod \"machine-config-controller-84d6567774-6577d\" (UID: \"af13f2b9-c077-4470-8b65-0e2c834717e7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6577d"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.120118 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c9944ab9-f72f-44bd-8850-2898feff4a28-installation-pull-secrets\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.120194 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b756af5b-95bd-47f1-a48e-7fe620e67b8c-config\") pod \"etcd-operator-b45778765-l5r8d\" (UID: \"b756af5b-95bd-47f1-a48e-7fe620e67b8c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l5r8d"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.120261 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b756af5b-95bd-47f1-a48e-7fe620e67b8c-etcd-ca\") pod \"etcd-operator-b45778765-l5r8d\" (UID: \"b756af5b-95bd-47f1-a48e-7fe620e67b8c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l5r8d"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.120277 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fba5b680-6beb-4ef8-8e8d-91be04d21de9-serving-cert\") pod \"service-ca-operator-777779d784-7wbbf\" (UID: \"fba5b680-6beb-4ef8-8e8d-91be04d21de9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7wbbf"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.120334 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c83fc408-8240-480a-834d-c791da137960-cert\") pod \"ingress-canary-2r62c\" (UID: \"c83fc408-8240-480a-834d-c791da137960\") " pod="openshift-ingress-canary/ingress-canary-2r62c"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.120413 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3e18f017-e70d-45b7-a7fc-a9398f698980-socket-dir\") pod \"csi-hostpathplugin-s68rr\" (UID: \"3e18f017-e70d-45b7-a7fc-a9398f698980\") " pod="hostpath-provisioner/csi-hostpathplugin-s68rr"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.121808 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b756af5b-95bd-47f1-a48e-7fe620e67b8c-config\") pod \"etcd-operator-b45778765-l5r8d\" (UID: \"b756af5b-95bd-47f1-a48e-7fe620e67b8c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l5r8d"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.125270 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c9944ab9-f72f-44bd-8850-2898feff4a28-ca-trust-extracted\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.126703 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c9944ab9-f72f-44bd-8850-2898feff4a28-registry-certificates\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.126937 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c9944ab9-f72f-44bd-8850-2898feff4a28-trusted-ca\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.129628 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e8fd448-83e0-45d3-aecb-73f7e345c3bd-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-tc5qk\" (UID: \"7e8fd448-83e0-45d3-aecb-73f7e345c3bd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tc5qk"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.133566 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b756af5b-95bd-47f1-a48e-7fe620e67b8c-serving-cert\") pod \"etcd-operator-b45778765-l5r8d\" (UID: \"b756af5b-95bd-47f1-a48e-7fe620e67b8c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l5r8d"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.134205 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c9944ab9-f72f-44bd-8850-2898feff4a28-registry-tls\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.142558 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b756af5b-95bd-47f1-a48e-7fe620e67b8c-etcd-client\") pod \"etcd-operator-b45778765-l5r8d\" (UID: \"b756af5b-95bd-47f1-a48e-7fe620e67b8c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l5r8d"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.143535 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c9944ab9-f72f-44bd-8850-2898feff4a28-installation-pull-secrets\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.143790 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75d9bfda-62b1-4488-9b08-cf1c8753d0da-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lvqmg\" (UID: \"75d9bfda-62b1-4488-9b08-cf1c8753d0da\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lvqmg"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.144377 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjdzk\" (UniqueName: \"kubernetes.io/projected/c9944ab9-f72f-44bd-8850-2898feff4a28-kube-api-access-wjdzk\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.145027 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mxs62"]
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.179325 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7e8fd448-83e0-45d3-aecb-73f7e345c3bd-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-tc5qk\" (UID: \"7e8fd448-83e0-45d3-aecb-73f7e345c3bd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tc5qk"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.188252 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/75d9bfda-62b1-4488-9b08-cf1c8753d0da-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lvqmg\" (UID: \"75d9bfda-62b1-4488-9b08-cf1c8753d0da\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lvqmg"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.212155 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c9944ab9-f72f-44bd-8850-2898feff4a28-bound-sa-token\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.224494 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7c8037e4-1d42-4e69-a841-8042b4de2ffd-profile-collector-cert\") pod \"olm-operator-6b444d44fb-g8v9j\" (UID: \"7c8037e4-1d42-4e69-a841-8042b4de2ffd\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g8v9j"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.227690 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gszlc\" (UniqueName: \"kubernetes.io/projected/f4889476-c0d0-4e4b-986a-f4dcdacce72b-kube-api-access-gszlc\") pod \"collect-profiles-29552085-npl6x\" (UID: \"f4889476-c0d0-4e4b-986a-f4dcdacce72b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552085-npl6x"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.227746 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3e18f017-e70d-45b7-a7fc-a9398f698980-csi-data-dir\") pod \"csi-hostpathplugin-s68rr\" (UID: \"3e18f017-e70d-45b7-a7fc-a9398f698980\") " pod="hostpath-provisioner/csi-hostpathplugin-s68rr"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.227787 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/fa3021cc-fba5-4be1-ab8b-da2e1ad68307-signing-key\") pod \"service-ca-9c57cc56f-d5v8r\" (UID: \"fa3021cc-fba5-4be1-ab8b-da2e1ad68307\") " pod="openshift-service-ca/service-ca-9c57cc56f-d5v8r"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.227842 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bdb77a6d-e89b-4fbf-9e00-93e23571d248-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-m78bw\" (UID: \"bdb77a6d-e89b-4fbf-9e00-93e23571d248\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m78bw"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.227868 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75cbx\" (UniqueName: \"kubernetes.io/projected/3e18f017-e70d-45b7-a7fc-a9398f698980-kube-api-access-75cbx\") pod \"csi-hostpathplugin-s68rr\" (UID: \"3e18f017-e70d-45b7-a7fc-a9398f698980\") " pod="hostpath-provisioner/csi-hostpathplugin-s68rr"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.227932 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3e18f017-e70d-45b7-a7fc-a9398f698980-plugins-dir\") pod \"csi-hostpathplugin-s68rr\" (UID: \"3e18f017-e70d-45b7-a7fc-a9398f698980\") " pod="hostpath-provisioner/csi-hostpathplugin-s68rr"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.227984 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/caf1fe52-fc04-4251-bdf1-f4cf1d80f45f-webhook-cert\") pod \"packageserver-d55dfcdfc-5kj8n\" (UID: \"caf1fe52-fc04-4251-bdf1-f4cf1d80f45f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5kj8n"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.228018 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/caf1fe52-fc04-4251-bdf1-f4cf1d80f45f-apiservice-cert\") pod \"packageserver-d55dfcdfc-5kj8n\" (UID: \"caf1fe52-fc04-4251-bdf1-f4cf1d80f45f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5kj8n"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.228069 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt8l4\" (UniqueName: \"kubernetes.io/projected/d2c0d6e4-6024-48a0-8bca-5069baf7a8ab-kube-api-access-bt8l4\") pod \"downloads-7954f5f757-9cr4m\" (UID: \"d2c0d6e4-6024-48a0-8bca-5069baf7a8ab\") " pod="openshift-console/downloads-7954f5f757-9cr4m"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.228105 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jh49\" (UniqueName: \"kubernetes.io/projected/19b3fb37-5072-45f0-8349-1296e31a1193-kube-api-access-6jh49\") pod \"auto-csr-approver-29552086-rkhdp\" (UID: \"19b3fb37-5072-45f0-8349-1296e31a1193\") " pod="openshift-infra/auto-csr-approver-29552086-rkhdp"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.228168 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s986l\" (UniqueName: \"kubernetes.io/projected/caf1fe52-fc04-4251-bdf1-f4cf1d80f45f-kube-api-access-s986l\") pod \"packageserver-d55dfcdfc-5kj8n\" (UID: \"caf1fe52-fc04-4251-bdf1-f4cf1d80f45f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5kj8n"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.228196 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/557beeba-985a-429a-9814-55e49dcb4e44-certs\") pod \"machine-config-server-t8r7l\" (UID: \"557beeba-985a-429a-9814-55e49dcb4e44\") " pod="openshift-machine-config-operator/machine-config-server-t8r7l"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.228230 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d2e4a51d-47a6-45a0-b510-d25924b2e22a-config-volume\") pod \"dns-default-hbbcq\" (UID: \"d2e4a51d-47a6-45a0-b510-d25924b2e22a\") " pod="openshift-dns/dns-default-hbbcq"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.228289 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2e4a51d-47a6-45a0-b510-d25924b2e22a-metrics-tls\") pod \"dns-default-hbbcq\" (UID: \"d2e4a51d-47a6-45a0-b510-d25924b2e22a\") " pod="openshift-dns/dns-default-hbbcq"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.228318 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksz7r\" (UniqueName: \"kubernetes.io/projected/14642521-187c-45cf-aa34-cfd4fa40e632-kube-api-access-ksz7r\") pod \"control-plane-machine-set-operator-78cbb6b69f-wp526\" (UID: \"14642521-187c-45cf-aa34-cfd4fa40e632\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wp526"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.228358 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-268z8\" (UniqueName: \"kubernetes.io/projected/fba5b680-6beb-4ef8-8e8d-91be04d21de9-kube-api-access-268z8\") pod \"service-ca-operator-777779d784-7wbbf\" (UID: \"fba5b680-6beb-4ef8-8e8d-91be04d21de9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7wbbf"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.228383 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/14642521-187c-45cf-aa34-cfd4fa40e632-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-wp526\" (UID: \"14642521-187c-45cf-aa34-cfd4fa40e632\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wp526"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.228406 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/af13f2b9-c077-4470-8b65-0e2c834717e7-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6577d\" (UID: \"af13f2b9-c077-4470-8b65-0e2c834717e7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6577d"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.228460 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxn9n\" (UniqueName: \"kubernetes.io/projected/7c8037e4-1d42-4e69-a841-8042b4de2ffd-kube-api-access-gxn9n\") pod \"olm-operator-6b444d44fb-g8v9j\" (UID: \"7c8037e4-1d42-4e69-a841-8042b4de2ffd\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g8v9j"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.228486 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prxch\" (UniqueName: \"kubernetes.io/projected/6b13c3a5-b9b7-421b-8c46-40a7408d6086-kube-api-access-prxch\") pod \"package-server-manager-789f6589d5-nblmw\" (UID: \"6b13c3a5-b9b7-421b-8c46-40a7408d6086\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nblmw"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.228518 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/6b13c3a5-b9b7-421b-8c46-40a7408d6086-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-nblmw\" (UID: \"6b13c3a5-b9b7-421b-8c46-40a7408d6086\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nblmw"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.228539 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7qm7\" (UniqueName: \"kubernetes.io/projected/c83fc408-8240-480a-834d-c791da137960-kube-api-access-g7qm7\") pod \"ingress-canary-2r62c\" (UID: \"c83fc408-8240-480a-834d-c791da137960\") " pod="openshift-ingress-canary/ingress-canary-2r62c"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.228572 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj6md\" (UniqueName: \"kubernetes.io/projected/557beeba-985a-429a-9814-55e49dcb4e44-kube-api-access-pj6md\") pod \"machine-config-server-t8r7l\" (UID: \"557beeba-985a-429a-9814-55e49dcb4e44\") " pod="openshift-machine-config-operator/machine-config-server-t8r7l"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.228617 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/557beeba-985a-429a-9814-55e49dcb4e44-node-bootstrap-token\") pod \"machine-config-server-t8r7l\" (UID: \"557beeba-985a-429a-9814-55e49dcb4e44\") " pod="openshift-machine-config-operator/machine-config-server-t8r7l"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.228650 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f4889476-c0d0-4e4b-986a-f4dcdacce72b-secret-volume\") pod \"collect-profiles-29552085-npl6x\" (UID: \"f4889476-c0d0-4e4b-986a-f4dcdacce72b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552085-npl6x"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.228693 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt6xr\" (UniqueName: \"kubernetes.io/projected/af13f2b9-c077-4470-8b65-0e2c834717e7-kube-api-access-pt6xr\") pod \"machine-config-controller-84d6567774-6577d\" (UID: \"af13f2b9-c077-4470-8b65-0e2c834717e7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6577d"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.228745 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fba5b680-6beb-4ef8-8e8d-91be04d21de9-serving-cert\") pod \"service-ca-operator-777779d784-7wbbf\" (UID: \"fba5b680-6beb-4ef8-8e8d-91be04d21de9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7wbbf"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.228768 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c83fc408-8240-480a-834d-c791da137960-cert\") pod \"ingress-canary-2r62c\" (UID: \"c83fc408-8240-480a-834d-c791da137960\") " pod="openshift-ingress-canary/ingress-canary-2r62c"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.228805 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3e18f017-e70d-45b7-a7fc-a9398f698980-socket-dir\") pod \"csi-hostpathplugin-s68rr\" (UID: \"3e18f017-e70d-45b7-a7fc-a9398f698980\") " pod="hostpath-provisioner/csi-hostpathplugin-s68rr"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.228889 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3e18f017-e70d-45b7-a7fc-a9398f698980-mountpoint-dir\") pod \"csi-hostpathplugin-s68rr\" (UID: \"3e18f017-e70d-45b7-a7fc-a9398f698980\") " pod="hostpath-provisioner/csi-hostpathplugin-s68rr"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.228909 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/fa3021cc-fba5-4be1-ab8b-da2e1ad68307-signing-cabundle\") pod \"service-ca-9c57cc56f-d5v8r\" (UID: \"fa3021cc-fba5-4be1-ab8b-da2e1ad68307\") " pod="openshift-service-ca/service-ca-9c57cc56f-d5v8r"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.228937 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.228959 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64lkm\" (UniqueName: \"kubernetes.io/projected/bdb77a6d-e89b-4fbf-9e00-93e23571d248-kube-api-access-64lkm\") pod \"kube-storage-version-migrator-operator-b67b599dd-m78bw\" (UID: \"bdb77a6d-e89b-4fbf-9e00-93e23571d248\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m78bw"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.228981 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/caf1fe52-fc04-4251-bdf1-f4cf1d80f45f-tmpfs\") pod \"packageserver-d55dfcdfc-5kj8n\" (UID: \"caf1fe52-fc04-4251-bdf1-f4cf1d80f45f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5kj8n"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.229015 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f4889476-c0d0-4e4b-986a-f4dcdacce72b-config-volume\") pod \"collect-profiles-29552085-npl6x\" (UID: \"f4889476-c0d0-4e4b-986a-f4dcdacce72b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552085-npl6x"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.229053 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gpdq\" (UniqueName: \"kubernetes.io/projected/fa3021cc-fba5-4be1-ab8b-da2e1ad68307-kube-api-access-4gpdq\") pod \"service-ca-9c57cc56f-d5v8r\" (UID: \"fa3021cc-fba5-4be1-ab8b-da2e1ad68307\") " pod="openshift-service-ca/service-ca-9c57cc56f-d5v8r"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.229091 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7c8037e4-1d42-4e69-a841-8042b4de2ffd-srv-cert\") pod \"olm-operator-6b444d44fb-g8v9j\" (UID: \"7c8037e4-1d42-4e69-a841-8042b4de2ffd\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g8v9j"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.229112 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdb77a6d-e89b-4fbf-9e00-93e23571d248-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-m78bw\" (UID: \"bdb77a6d-e89b-4fbf-9e00-93e23571d248\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m78bw"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.229150 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/af13f2b9-c077-4470-8b65-0e2c834717e7-proxy-tls\") pod \"machine-config-controller-84d6567774-6577d\" (UID: \"af13f2b9-c077-4470-8b65-0e2c834717e7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6577d"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.229179 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fba5b680-6beb-4ef8-8e8d-91be04d21de9-config\") pod \"service-ca-operator-777779d784-7wbbf\" (UID: \"fba5b680-6beb-4ef8-8e8d-91be04d21de9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7wbbf"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.229199 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3e18f017-e70d-45b7-a7fc-a9398f698980-registration-dir\") pod \"csi-hostpathplugin-s68rr\" (UID: \"3e18f017-e70d-45b7-a7fc-a9398f698980\") " pod="hostpath-provisioner/csi-hostpathplugin-s68rr"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.229223 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vp76\" (UniqueName: \"kubernetes.io/projected/d2e4a51d-47a6-45a0-b510-d25924b2e22a-kube-api-access-4vp76\") pod \"dns-default-hbbcq\" (UID: \"d2e4a51d-47a6-45a0-b510-d25924b2e22a\") " pod="openshift-dns/dns-default-hbbcq"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.229880 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3e18f017-e70d-45b7-a7fc-a9398f698980-plugins-dir\") pod \"csi-hostpathplugin-s68rr\" (UID: \"3e18f017-e70d-45b7-a7fc-a9398f698980\") " pod="hostpath-provisioner/csi-hostpathplugin-s68rr"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.233625 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7c8037e4-1d42-4e69-a841-8042b4de2ffd-profile-collector-cert\") pod \"olm-operator-6b444d44fb-g8v9j\" (UID: \"7c8037e4-1d42-4e69-a841-8042b4de2ffd\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g8v9j"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.240944 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bdb77a6d-e89b-4fbf-9e00-93e23571d248-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-m78bw\" (UID: \"bdb77a6d-e89b-4fbf-9e00-93e23571d248\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m78bw"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.241274 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/557beeba-985a-429a-9814-55e49dcb4e44-node-bootstrap-token\") pod \"machine-config-server-t8r7l\" (UID: \"557beeba-985a-429a-9814-55e49dcb4e44\") " pod="openshift-machine-config-operator/machine-config-server-t8r7l"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.241191 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fba5b680-6beb-4ef8-8e8d-91be04d21de9-config\") pod \"service-ca-operator-777779d784-7wbbf\" (UID: \"fba5b680-6beb-4ef8-8e8d-91be04d21de9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7wbbf"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.241618 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3e18f017-e70d-45b7-a7fc-a9398f698980-registration-dir\") pod \"csi-hostpathplugin-s68rr\" (UID: \"3e18f017-e70d-45b7-a7fc-a9398f698980\") " pod="hostpath-provisioner/csi-hostpathplugin-s68rr"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.241664 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d2e4a51d-47a6-45a0-b510-d25924b2e22a-config-volume\") pod \"dns-default-hbbcq\" (UID: \"d2e4a51d-47a6-45a0-b510-d25924b2e22a\") " pod="openshift-dns/dns-default-hbbcq"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.241815 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bdb77a6d-e89b-4fbf-9e00-93e23571d248-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-m78bw\" (UID: \"bdb77a6d-e89b-4fbf-9e00-93e23571d248\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m78bw"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.242082 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3e18f017-e70d-45b7-a7fc-a9398f698980-csi-data-dir\") pod \"csi-hostpathplugin-s68rr\" (UID: \"3e18f017-e70d-45b7-a7fc-a9398f698980\") " pod="hostpath-provisioner/csi-hostpathplugin-s68rr"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.242347 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/caf1fe52-fc04-4251-bdf1-f4cf1d80f45f-tmpfs\") pod \"packageserver-d55dfcdfc-5kj8n\" (UID: \"caf1fe52-fc04-4251-bdf1-f4cf1d80f45f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5kj8n"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.242546 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/14642521-187c-45cf-aa34-cfd4fa40e632-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-wp526\" (UID: \"14642521-187c-45cf-aa34-cfd4fa40e632\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wp526"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.243072 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3e18f017-e70d-45b7-a7fc-a9398f698980-mountpoint-dir\") pod \"csi-hostpathplugin-s68rr\" (UID: \"3e18f017-e70d-45b7-a7fc-a9398f698980\") " pod="hostpath-provisioner/csi-hostpathplugin-s68rr"
Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.243248 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3e18f017-e70d-45b7-a7fc-a9398f698980-socket-dir\") pod \"csi-hostpathplugin-s68rr\" (UID: \"3e18f017-e70d-45b7-a7fc-a9398f698980\") " pod="hostpath-provisioner/csi-hostpathplugin-s68rr"
Mar 10 06:47:47 crc kubenswrapper[4825]: E0310 06:47:47.244433 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed.
No retries permitted until 2026-03-10 06:47:47.744411862 +0000 UTC m=+220.774192477 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlkfb" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.253362 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/af13f2b9-c077-4470-8b65-0e2c834717e7-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6577d\" (UID: \"af13f2b9-c077-4470-8b65-0e2c834717e7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6577d" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.253799 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fba5b680-6beb-4ef8-8e8d-91be04d21de9-serving-cert\") pod \"service-ca-operator-777779d784-7wbbf\" (UID: \"fba5b680-6beb-4ef8-8e8d-91be04d21de9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7wbbf" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.257091 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/fa3021cc-fba5-4be1-ab8b-da2e1ad68307-signing-key\") pod \"service-ca-9c57cc56f-d5v8r\" (UID: \"fa3021cc-fba5-4be1-ab8b-da2e1ad68307\") " pod="openshift-service-ca/service-ca-9c57cc56f-d5v8r" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.260257 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/fa3021cc-fba5-4be1-ab8b-da2e1ad68307-signing-cabundle\") pod \"service-ca-9c57cc56f-d5v8r\" (UID: \"fa3021cc-fba5-4be1-ab8b-da2e1ad68307\") " pod="openshift-service-ca/service-ca-9c57cc56f-d5v8r" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.260572 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f4889476-c0d0-4e4b-986a-f4dcdacce72b-config-volume\") pod \"collect-profiles-29552085-npl6x\" (UID: \"f4889476-c0d0-4e4b-986a-f4dcdacce72b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552085-npl6x" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.266382 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2e4a51d-47a6-45a0-b510-d25924b2e22a-metrics-tls\") pod \"dns-default-hbbcq\" (UID: \"d2e4a51d-47a6-45a0-b510-d25924b2e22a\") " pod="openshift-dns/dns-default-hbbcq" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.267949 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7c8037e4-1d42-4e69-a841-8042b4de2ffd-srv-cert\") pod \"olm-operator-6b444d44fb-g8v9j\" (UID: \"7c8037e4-1d42-4e69-a841-8042b4de2ffd\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g8v9j" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.270550 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/6b13c3a5-b9b7-421b-8c46-40a7408d6086-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-nblmw\" (UID: \"6b13c3a5-b9b7-421b-8c46-40a7408d6086\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nblmw" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.275243 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/c83fc408-8240-480a-834d-c791da137960-cert\") pod \"ingress-canary-2r62c\" (UID: \"c83fc408-8240-480a-834d-c791da137960\") " pod="openshift-ingress-canary/ingress-canary-2r62c" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.278855 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/af13f2b9-c077-4470-8b65-0e2c834717e7-proxy-tls\") pod \"machine-config-controller-84d6567774-6577d\" (UID: \"af13f2b9-c077-4470-8b65-0e2c834717e7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6577d" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.279883 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/557beeba-985a-429a-9814-55e49dcb4e44-certs\") pod \"machine-config-server-t8r7l\" (UID: \"557beeba-985a-429a-9814-55e49dcb4e44\") " pod="openshift-machine-config-operator/machine-config-server-t8r7l" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.280356 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/caf1fe52-fc04-4251-bdf1-f4cf1d80f45f-webhook-cert\") pod \"packageserver-d55dfcdfc-5kj8n\" (UID: \"caf1fe52-fc04-4251-bdf1-f4cf1d80f45f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5kj8n" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.281604 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/caf1fe52-fc04-4251-bdf1-f4cf1d80f45f-apiservice-cert\") pod \"packageserver-d55dfcdfc-5kj8n\" (UID: \"caf1fe52-fc04-4251-bdf1-f4cf1d80f45f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5kj8n" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.281901 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/f4889476-c0d0-4e4b-986a-f4dcdacce72b-secret-volume\") pod \"collect-profiles-29552085-npl6x\" (UID: \"f4889476-c0d0-4e4b-986a-f4dcdacce72b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552085-npl6x" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.282443 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpfs5\" (UniqueName: \"kubernetes.io/projected/07b348c9-f5aa-4fbd-a44c-3c2e8f82b0a9-kube-api-access-cpfs5\") pod \"migrator-59844c95c7-8mnj8\" (UID: \"07b348c9-f5aa-4fbd-a44c-3c2e8f82b0a9\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8mnj8" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.290511 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hskjb\" (UniqueName: \"kubernetes.io/projected/b756af5b-95bd-47f1-a48e-7fe620e67b8c-kube-api-access-hskjb\") pod \"etcd-operator-b45778765-l5r8d\" (UID: \"b756af5b-95bd-47f1-a48e-7fe620e67b8c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-l5r8d" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.291450 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-6bjtt"] Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.291487 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-8vfsj"] Mar 10 06:47:47 crc kubenswrapper[4825]: W0310 06:47:47.301848 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a265902_a773_4550_b3fa_79f94c82809c.slice/crio-b05216c4cf63d4e15c02b41fc7a35bad59f4f56caf4f1606fa5e47f3adb179b9 WatchSource:0}: Error finding container b05216c4cf63d4e15c02b41fc7a35bad59f4f56caf4f1606fa5e47f3adb179b9: Status 404 returned error can't find the container with id b05216c4cf63d4e15c02b41fc7a35bad59f4f56caf4f1606fa5e47f3adb179b9 Mar 10 06:47:47 crc 
kubenswrapper[4825]: I0310 06:47:47.306664 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vp76\" (UniqueName: \"kubernetes.io/projected/d2e4a51d-47a6-45a0-b510-d25924b2e22a-kube-api-access-4vp76\") pod \"dns-default-hbbcq\" (UID: \"d2e4a51d-47a6-45a0-b510-d25924b2e22a\") " pod="openshift-dns/dns-default-hbbcq" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.328065 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8tdjt"] Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.330704 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.330823 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75cbx\" (UniqueName: \"kubernetes.io/projected/3e18f017-e70d-45b7-a7fc-a9398f698980-kube-api-access-75cbx\") pod \"csi-hostpathplugin-s68rr\" (UID: \"3e18f017-e70d-45b7-a7fc-a9398f698980\") " pod="hostpath-provisioner/csi-hostpathplugin-s68rr" Mar 10 06:47:47 crc kubenswrapper[4825]: E0310 06:47:47.331232 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:47:47.831213792 +0000 UTC m=+220.860994407 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.332970 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-l5r8d" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.347636 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksz7r\" (UniqueName: \"kubernetes.io/projected/14642521-187c-45cf-aa34-cfd4fa40e632-kube-api-access-ksz7r\") pod \"control-plane-machine-set-operator-78cbb6b69f-wp526\" (UID: \"14642521-187c-45cf-aa34-cfd4fa40e632\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wp526" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.354525 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tc5qk" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.362921 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s986l\" (UniqueName: \"kubernetes.io/projected/caf1fe52-fc04-4251-bdf1-f4cf1d80f45f-kube-api-access-s986l\") pod \"packageserver-d55dfcdfc-5kj8n\" (UID: \"caf1fe52-fc04-4251-bdf1-f4cf1d80f45f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5kj8n" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.399961 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-csfgr"] Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.401178 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt8l4\" (UniqueName: \"kubernetes.io/projected/d2c0d6e4-6024-48a0-8bca-5069baf7a8ab-kube-api-access-bt8l4\") pod \"downloads-7954f5f757-9cr4m\" (UID: \"d2c0d6e4-6024-48a0-8bca-5069baf7a8ab\") " pod="openshift-console/downloads-7954f5f757-9cr4m" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.406715 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jh49\" (UniqueName: \"kubernetes.io/projected/19b3fb37-5072-45f0-8349-1296e31a1193-kube-api-access-6jh49\") pod \"auto-csr-approver-29552086-rkhdp\" (UID: \"19b3fb37-5072-45f0-8349-1296e31a1193\") " pod="openshift-infra/auto-csr-approver-29552086-rkhdp" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.406978 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.421781 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gszlc\" (UniqueName: 
\"kubernetes.io/projected/f4889476-c0d0-4e4b-986a-f4dcdacce72b-kube-api-access-gszlc\") pod \"collect-profiles-29552085-npl6x\" (UID: \"f4889476-c0d0-4e4b-986a-f4dcdacce72b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552085-npl6x" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.432506 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:47 crc kubenswrapper[4825]: E0310 06:47:47.433625 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 06:47:47.93360641 +0000 UTC m=+220.963387025 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlkfb" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.434511 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lvqmg" Mar 10 06:47:47 crc kubenswrapper[4825]: W0310 06:47:47.434615 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode40273f3_dda6_4e38_b940_284ae6f95e41.slice/crio-78bf7d5f327e4a6e2a40273b59973ef83019b01fe2002ea4cd7df1f4b8b0226d WatchSource:0}: Error finding container 78bf7d5f327e4a6e2a40273b59973ef83019b01fe2002ea4cd7df1f4b8b0226d: Status 404 returned error can't find the container with id 78bf7d5f327e4a6e2a40273b59973ef83019b01fe2002ea4cd7df1f4b8b0226d Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.442704 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64lkm\" (UniqueName: \"kubernetes.io/projected/bdb77a6d-e89b-4fbf-9e00-93e23571d248-kube-api-access-64lkm\") pod \"kube-storage-version-migrator-operator-b67b599dd-m78bw\" (UID: \"bdb77a6d-e89b-4fbf-9e00-93e23571d248\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m78bw" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.460607 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt6xr\" (UniqueName: \"kubernetes.io/projected/af13f2b9-c077-4470-8b65-0e2c834717e7-kube-api-access-pt6xr\") pod \"machine-config-controller-84d6567774-6577d\" (UID: \"af13f2b9-c077-4470-8b65-0e2c834717e7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6577d" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.487827 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-268z8\" (UniqueName: \"kubernetes.io/projected/fba5b680-6beb-4ef8-8e8d-91be04d21de9-kube-api-access-268z8\") pod \"service-ca-operator-777779d784-7wbbf\" (UID: \"fba5b680-6beb-4ef8-8e8d-91be04d21de9\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-7wbbf" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.494290 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8mnj8" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.500322 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m78bw" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.506646 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxn9n\" (UniqueName: \"kubernetes.io/projected/7c8037e4-1d42-4e69-a841-8042b4de2ffd-kube-api-access-gxn9n\") pod \"olm-operator-6b444d44fb-g8v9j\" (UID: \"7c8037e4-1d42-4e69-a841-8042b4de2ffd\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g8v9j" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.509666 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6577d" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.527572 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7wbbf" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.531638 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-9cr4m" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.533692 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7qm7\" (UniqueName: \"kubernetes.io/projected/c83fc408-8240-480a-834d-c791da137960-kube-api-access-g7qm7\") pod \"ingress-canary-2r62c\" (UID: \"c83fc408-8240-480a-834d-c791da137960\") " pod="openshift-ingress-canary/ingress-canary-2r62c" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.534014 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:47:47 crc kubenswrapper[4825]: E0310 06:47:47.534681 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:47:48.034661013 +0000 UTC m=+221.064441628 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.538886 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wp526" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.548088 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552085-npl6x" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.552986 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj6md\" (UniqueName: \"kubernetes.io/projected/557beeba-985a-429a-9814-55e49dcb4e44-kube-api-access-pj6md\") pod \"machine-config-server-t8r7l\" (UID: \"557beeba-985a-429a-9814-55e49dcb4e44\") " pod="openshift-machine-config-operator/machine-config-server-t8r7l" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.553524 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g8v9j" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.561216 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hbbcq" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.576468 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5kj8n" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.579185 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gpdq\" (UniqueName: \"kubernetes.io/projected/fa3021cc-fba5-4be1-ab8b-da2e1ad68307-kube-api-access-4gpdq\") pod \"service-ca-9c57cc56f-d5v8r\" (UID: \"fa3021cc-fba5-4be1-ab8b-da2e1ad68307\") " pod="openshift-service-ca/service-ca-9c57cc56f-d5v8r" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.586637 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prxch\" (UniqueName: \"kubernetes.io/projected/6b13c3a5-b9b7-421b-8c46-40a7408d6086-kube-api-access-prxch\") pod \"package-server-manager-789f6589d5-nblmw\" (UID: \"6b13c3a5-b9b7-421b-8c46-40a7408d6086\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nblmw" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.601377 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552086-rkhdp" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.610558 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-79576" event={"ID":"bde95e4a-12d5-4b7e-bd4e-9b2527faefa0","Type":"ContainerStarted","Data":"5a88dca32dcfd9654d98eea81bfc974774286a99f3cdd4df9fa85438dd9f4e04"} Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.612837 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-t8r7l" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.627339 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" event={"ID":"cc9419c8-c23a-418b-8fba-9956bed2a193","Type":"ContainerStarted","Data":"122b20ffae414c29feb2d4e6155a74121af80bbb8218ba491dbb9381f65968e4"} Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.627836 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-s68rr" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.638818 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:47 crc kubenswrapper[4825]: E0310 06:47:47.643376 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 06:47:48.143358924 +0000 UTC m=+221.173139539 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlkfb" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.644293 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-l5r8d"] Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.652524 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2r62c" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.659972 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-csfgr" event={"ID":"e40273f3-dda6-4e38-b940-284ae6f95e41","Type":"ContainerStarted","Data":"78bf7d5f327e4a6e2a40273b59973ef83019b01fe2002ea4cd7df1f4b8b0226d"} Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.670232 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j7fv5" event={"ID":"27d7fca4-c1f9-411f-b705-e36c0c1e8356","Type":"ContainerStarted","Data":"16f482d0209c03fbf81ea0aa732770a6a7b6e59f78f8e0e3e2e79d97101f5dbb"} Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.670333 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j7fv5" event={"ID":"27d7fca4-c1f9-411f-b705-e36c0c1e8356","Type":"ContainerStarted","Data":"c80289a1cb7e97e8bcc018b2ccc38e97897910f1b79bb39baabd1772f287d518"} Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.675477 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grpt7" event={"ID":"4282a4a7-f7d9-4c0e-9638-02c793c2e2e6","Type":"ContainerStarted","Data":"9f99d5b02f8c6ff5f7b0d8e0c54938422bcc9b4d7ce67115b1e14777d00853b0"} Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.683303 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-txg9q" event={"ID":"4d23b66c-f736-4af2-9dc7-6167ca4d53ef","Type":"ContainerStarted","Data":"83fb2a979171b266537a6400dec6d73b3a93f176ac49b11d7b0d9cbb89e4c14b"} Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.683694 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-txg9q" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.684547 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-6bjtt" event={"ID":"f3a60327-2809-415b-abde-d1569a2453b6","Type":"ContainerStarted","Data":"646706ff1b6a2ebe6615bad16203e62b5b8c9f40ae343769fcfbb3c7fdfbea43"} Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.686762 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-8vfsj" event={"ID":"8a265902-a773-4550-b3fa-79f94c82809c","Type":"ContainerStarted","Data":"b05216c4cf63d4e15c02b41fc7a35bad59f4f56caf4f1606fa5e47f3adb179b9"} Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.691196 4825 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-txg9q container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.691267 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-txg9q" podUID="4d23b66c-f736-4af2-9dc7-6167ca4d53ef" 
containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.716742 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qwnns" event={"ID":"d5867ece-9f04-49e4-8a81-1666fc1849ea","Type":"ContainerStarted","Data":"d04775e109cc87d92b38bb438ca75d64bdfb64235be421d7ec909ca0bcd5bec1"} Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.716797 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qwnns" event={"ID":"d5867ece-9f04-49e4-8a81-1666fc1849ea","Type":"ContainerStarted","Data":"d66dc3b7a5d7152f1d26ae559a7a0bd625c9bb7242af531501ade606bbfc9cfd"} Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.736864 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-zq969" event={"ID":"af8fff67-ed41-4e2d-af5f-cf76cd2a4234","Type":"ContainerStarted","Data":"a18a3e0702306f82cddaac66c366cd28d7d4df1fc17f4afd001eae3983a66f62"} Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.737896 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-zq969" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.739443 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:47:47 crc kubenswrapper[4825]: E0310 06:47:47.739625 4825 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:47:48.239608074 +0000 UTC m=+221.269388689 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.740144 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mxs62" event={"ID":"3e73a713-a351-419b-903a-e44041c28d6f","Type":"ContainerStarted","Data":"9543b53023dd606a9a7becb099ff123076fb7d0aed427567e9def3799604a06c"} Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.740905 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:47 crc kubenswrapper[4825]: E0310 06:47:47.741188 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 06:47:48.241179248 +0000 UTC m=+221.270959863 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlkfb" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.746016 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4p7vx" event={"ID":"92a495f2-d3a4-42b9-82f9-48b4dda31caf","Type":"ContainerStarted","Data":"a5735de5da15a88c3f2ec2de53861e6524bc6c5f5493923ff7c79d2bf9b5faec"} Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.746711 4825 patch_prober.go:28] interesting pod/console-operator-58897d9998-zq969 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.746891 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-zq969" podUID="af8fff67-ed41-4e2d-af5f-cf76cd2a4234" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.749506 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8tdjt" event={"ID":"230c8679-e321-40bd-844e-e350e48404e3","Type":"ContainerStarted","Data":"7c5d00ee554902642d3e51e77ceb2aad6fd9de466aeeda9a37088a249489c6df"} Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.758667 4825 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-tgcvt" event={"ID":"9f0e6d3a-b022-42dd-828e-bd5ec395d06c","Type":"ContainerStarted","Data":"9e5559fb2a77bc2329fad37d15d5d00ad356a26f000d95686020d0c4da013d02"} Mar 10 06:47:47 crc kubenswrapper[4825]: W0310 06:47:47.761488 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb756af5b_95bd_47f1_a48e_7fe620e67b8c.slice/crio-dc844cbf4e2ecc177477e06773c671960750893ec9eb769e7747edb5aa51af5d WatchSource:0}: Error finding container dc844cbf4e2ecc177477e06773c671960750893ec9eb769e7747edb5aa51af5d: Status 404 returned error can't find the container with id dc844cbf4e2ecc177477e06773c671960750893ec9eb769e7747edb5aa51af5d Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.765095 4825 generic.go:334] "Generic (PLEG): container finished" podID="45ae05cc-9b3e-40e0-9235-f0cf1f6def5f" containerID="12c8b7bfc5ce42dd10cbb26f378233d46a20bdae6129e1baf73c135bf13af0a3" exitCode=0 Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.765178 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-72x4w" event={"ID":"45ae05cc-9b3e-40e0-9235-f0cf1f6def5f","Type":"ContainerDied","Data":"12c8b7bfc5ce42dd10cbb26f378233d46a20bdae6129e1baf73c135bf13af0a3"} Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.765206 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-72x4w" event={"ID":"45ae05cc-9b3e-40e0-9235-f0cf1f6def5f","Type":"ContainerStarted","Data":"4c4711ea2e96d4ea5c406444849f393f13b381de33e551b950511be5a6daf176"} Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.766989 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tc5qk"] Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.768147 4825 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-tgcvt" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.773101 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-q2lm7" event={"ID":"0289197b-e586-477b-bb0a-a8b8ef92b21d","Type":"ContainerStarted","Data":"6d93142089101d187b89ec937cdb1d4c9c59c6b2c17b5920056a3c9a653f3d56"} Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.775342 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8srf6" event={"ID":"ced2dcc4-9906-4cbe-b163-41ff41bb4f02","Type":"ContainerStarted","Data":"8aeb22846a68ad61d1dc203fb7efff5d88a5e77234adbd686a18f0686888d14b"} Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.778665 4825 patch_prober.go:28] interesting pod/router-default-5444994796-tgcvt container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.778693 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tgcvt" podUID="9f0e6d3a-b022-42dd-828e-bd5ec395d06c" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.786028 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5xr9" event={"ID":"c941f1c7-e449-495d-9af7-e40d1b2f2cd5","Type":"ContainerStarted","Data":"5261e48e87abf14ec4272ba8aa42ae59074a624449ace41c2eb887a264c558ff"} Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.817849 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-d5v8r" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.843848 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.844026 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01203ed2-23cf-4b39-ae6c-6ed22c0d66c0-config\") pod \"route-controller-manager-6576b87f9c-grsjk\" (UID: \"01203ed2-23cf-4b39-ae6c-6ed22c0d66c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-grsjk" Mar 10 06:47:47 crc kubenswrapper[4825]: E0310 06:47:47.844080 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:47:48.344055662 +0000 UTC m=+221.373836287 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.844110 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01203ed2-23cf-4b39-ae6c-6ed22c0d66c0-serving-cert\") pod \"route-controller-manager-6576b87f9c-grsjk\" (UID: \"01203ed2-23cf-4b39-ae6c-6ed22c0d66c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-grsjk" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.844181 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-audit\") pod \"apiserver-76f77b778f-ksvjd\" (UID: \"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb\") " pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.844283 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-etcd-serving-ca\") pod \"apiserver-76f77b778f-ksvjd\" (UID: \"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb\") " pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.845466 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" 
(UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.845564 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bab741b-822c-4548-8333-aa3f90ecd8a0-config\") pod \"controller-manager-879f6c89f-6j748\" (UID: \"0bab741b-822c-4548-8333-aa3f90ecd8a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6j748" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.845722 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bab741b-822c-4548-8333-aa3f90ecd8a0-serving-cert\") pod \"controller-manager-879f6c89f-6j748\" (UID: \"0bab741b-822c-4548-8333-aa3f90ecd8a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6j748" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.845804 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0bab741b-822c-4548-8333-aa3f90ecd8a0-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6j748\" (UID: \"0bab741b-822c-4548-8333-aa3f90ecd8a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6j748" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.845898 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-etcd-client\") pod \"apiserver-76f77b778f-ksvjd\" (UID: \"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb\") " pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.846067 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-trusted-ca-bundle\") pod \"apiserver-76f77b778f-ksvjd\" (UID: \"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb\") " pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.848108 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-trusted-ca-bundle\") pod \"apiserver-76f77b778f-ksvjd\" (UID: \"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb\") " pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.849167 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01203ed2-23cf-4b39-ae6c-6ed22c0d66c0-config\") pod \"route-controller-manager-6576b87f9c-grsjk\" (UID: \"01203ed2-23cf-4b39-ae6c-6ed22c0d66c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-grsjk" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.850578 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-etcd-serving-ca\") pod \"apiserver-76f77b778f-ksvjd\" (UID: \"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb\") " pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.851244 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-audit\") pod \"apiserver-76f77b778f-ksvjd\" (UID: \"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb\") " pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:47 crc kubenswrapper[4825]: E0310 06:47:47.851808 4825 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 06:47:48.351786554 +0000 UTC m=+221.381567169 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlkfb" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.852250 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bab741b-822c-4548-8333-aa3f90ecd8a0-config\") pod \"controller-manager-879f6c89f-6j748\" (UID: \"0bab741b-822c-4548-8333-aa3f90ecd8a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6j748" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.860587 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01203ed2-23cf-4b39-ae6c-6ed22c0d66c0-serving-cert\") pod \"route-controller-manager-6576b87f9c-grsjk\" (UID: \"01203ed2-23cf-4b39-ae6c-6ed22c0d66c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-grsjk" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.865762 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0bab741b-822c-4548-8333-aa3f90ecd8a0-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6j748\" (UID: \"0bab741b-822c-4548-8333-aa3f90ecd8a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6j748" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 
06:47:47.867077 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb-etcd-client\") pod \"apiserver-76f77b778f-ksvjd\" (UID: \"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb\") " pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.867998 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bab741b-822c-4548-8333-aa3f90ecd8a0-serving-cert\") pod \"controller-manager-879f6c89f-6j748\" (UID: \"0bab741b-822c-4548-8333-aa3f90ecd8a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6j748" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.869542 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nblmw" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.947655 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:47:47 crc kubenswrapper[4825]: E0310 06:47:47.949484 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:47:48.449465852 +0000 UTC m=+221.479246467 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.970083 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lvqmg"] Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.989221 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-scbgc" podStartSLOduration=167.989205118 podStartE2EDuration="2m47.989205118s" podCreationTimestamp="2026-03-10 06:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:47:47.988707371 +0000 UTC m=+221.018487986" watchObservedRunningTime="2026-03-10 06:47:47.989205118 +0000 UTC m=+221.018985733" Mar 10 06:47:47 crc kubenswrapper[4825]: I0310 06:47:47.991266 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6j748" Mar 10 06:47:48 crc kubenswrapper[4825]: I0310 06:47:48.009284 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-8mnj8"] Mar 10 06:47:48 crc kubenswrapper[4825]: I0310 06:47:48.047967 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:48 crc kubenswrapper[4825]: I0310 06:47:48.050566 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:48 crc kubenswrapper[4825]: E0310 06:47:48.050923 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 06:47:48.550909688 +0000 UTC m=+221.580690303 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlkfb" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:48 crc kubenswrapper[4825]: I0310 06:47:48.053790 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-grsjk" Mar 10 06:47:48 crc kubenswrapper[4825]: I0310 06:47:48.100317 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-tgcvt" podStartSLOduration=168.100298311 podStartE2EDuration="2m48.100298311s" podCreationTimestamp="2026-03-10 06:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:47:48.099515504 +0000 UTC m=+221.129296119" watchObservedRunningTime="2026-03-10 06:47:48.100298311 +0000 UTC m=+221.130078926" Mar 10 06:47:48 crc kubenswrapper[4825]: I0310 06:47:48.152924 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:47:48 crc kubenswrapper[4825]: E0310 06:47:48.154753 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:47:48.654733475 +0000 UTC m=+221.684514090 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:48 crc kubenswrapper[4825]: I0310 06:47:48.193947 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6577d"] Mar 10 06:47:48 crc kubenswrapper[4825]: W0310 06:47:48.197432 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75d9bfda_62b1_4488_9b08_cf1c8753d0da.slice/crio-ff1c866460e7cb46da216c1a72ea6a7248ac43d1c73fc4a28b805a3ef9b62569 WatchSource:0}: Error finding container ff1c866460e7cb46da216c1a72ea6a7248ac43d1c73fc4a28b805a3ef9b62569: Status 404 returned error can't find the container with id ff1c866460e7cb46da216c1a72ea6a7248ac43d1c73fc4a28b805a3ef9b62569 Mar 10 06:47:48 crc kubenswrapper[4825]: I0310 06:47:48.246406 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m78bw"] Mar 10 06:47:48 crc kubenswrapper[4825]: I0310 06:47:48.263862 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:48 crc kubenswrapper[4825]: E0310 06:47:48.268368 4825 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 06:47:48.768349753 +0000 UTC m=+221.798130358 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlkfb" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:48 crc kubenswrapper[4825]: I0310 06:47:48.341981 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4p7vx" podStartSLOduration=168.341959536 podStartE2EDuration="2m48.341959536s" podCreationTimestamp="2026-03-10 06:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:47:48.340981613 +0000 UTC m=+221.370762228" watchObservedRunningTime="2026-03-10 06:47:48.341959536 +0000 UTC m=+221.371740151" Mar 10 06:47:48 crc kubenswrapper[4825]: I0310 06:47:48.364755 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:47:48 crc kubenswrapper[4825]: E0310 06:47:48.364880 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 06:47:48.864862892 +0000 UTC m=+221.894643507 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:48 crc kubenswrapper[4825]: I0310 06:47:48.365244 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:48 crc kubenswrapper[4825]: E0310 06:47:48.365610 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 06:47:48.865602637 +0000 UTC m=+221.895383252 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlkfb" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:48 crc kubenswrapper[4825]: W0310 06:47:48.419536 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod557beeba_985a_429a_9814_55e49dcb4e44.slice/crio-33ba6ee396795479de0e76890a5c5f96d01ce19a4eac751fb6e974d3ab82a4b5 WatchSource:0}: Error finding container 33ba6ee396795479de0e76890a5c5f96d01ce19a4eac751fb6e974d3ab82a4b5: Status 404 returned error can't find the container with id 33ba6ee396795479de0e76890a5c5f96d01ce19a4eac751fb6e974d3ab82a4b5 Mar 10 06:47:48 crc kubenswrapper[4825]: W0310 06:47:48.438671 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdb77a6d_e89b_4fbf_9e00_93e23571d248.slice/crio-5cf17ef83f2f9702125607523b15cadcb549d6bc61fa1f0fcbb622e4dd972c01 WatchSource:0}: Error finding container 5cf17ef83f2f9702125607523b15cadcb549d6bc61fa1f0fcbb622e4dd972c01: Status 404 returned error can't find the container with id 5cf17ef83f2f9702125607523b15cadcb549d6bc61fa1f0fcbb622e4dd972c01 Mar 10 06:47:48 crc kubenswrapper[4825]: I0310 06:47:48.466685 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:47:48 crc kubenswrapper[4825]: E0310 06:47:48.466924 4825 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:47:48.966895427 +0000 UTC m=+221.996676042 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:48 crc kubenswrapper[4825]: I0310 06:47:48.467054 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:48 crc kubenswrapper[4825]: E0310 06:47:48.468558 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 06:47:48.968544883 +0000 UTC m=+221.998325488 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlkfb" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:48 crc kubenswrapper[4825]: I0310 06:47:48.570874 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:47:48 crc kubenswrapper[4825]: E0310 06:47:48.571569 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:47:49.071538112 +0000 UTC m=+222.101318727 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:48 crc kubenswrapper[4825]: I0310 06:47:48.571808 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:48 crc kubenswrapper[4825]: E0310 06:47:48.576784 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 06:47:49.076742138 +0000 UTC m=+222.106522753 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlkfb" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:48 crc kubenswrapper[4825]: I0310 06:47:48.618700 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-7wbbf"] Mar 10 06:47:48 crc kubenswrapper[4825]: I0310 06:47:48.663574 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552085-npl6x"] Mar 10 06:47:48 crc kubenswrapper[4825]: I0310 06:47:48.674127 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:47:48 crc kubenswrapper[4825]: E0310 06:47:48.674532 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:47:49.174513089 +0000 UTC m=+222.204293704 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:48 crc kubenswrapper[4825]: I0310 06:47:48.730820 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-zq969" podStartSLOduration=168.730797496 podStartE2EDuration="2m48.730797496s" podCreationTimestamp="2026-03-10 06:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:47:48.691052439 +0000 UTC m=+221.720833054" watchObservedRunningTime="2026-03-10 06:47:48.730797496 +0000 UTC m=+221.760578111" Mar 10 06:47:48 crc kubenswrapper[4825]: I0310 06:47:48.767796 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hbbcq"] Mar 10 06:47:48 crc kubenswrapper[4825]: I0310 06:47:48.817641 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wp526"] Mar 10 06:47:48 crc kubenswrapper[4825]: I0310 06:47:48.817699 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5kj8n"] Mar 10 06:47:48 crc kubenswrapper[4825]: I0310 06:47:48.789426 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:48 crc kubenswrapper[4825]: I0310 06:47:48.780514 4825 patch_prober.go:28] interesting pod/router-default-5444994796-tgcvt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 06:47:48 crc kubenswrapper[4825]: [-]has-synced failed: reason withheld Mar 10 06:47:48 crc kubenswrapper[4825]: [+]process-running ok Mar 10 06:47:48 crc kubenswrapper[4825]: healthz check failed Mar 10 06:47:48 crc kubenswrapper[4825]: I0310 06:47:48.817988 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tgcvt" podUID="9f0e6d3a-b022-42dd-828e-bd5ec395d06c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 06:47:48 crc kubenswrapper[4825]: E0310 06:47:48.789814 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 06:47:49.289793314 +0000 UTC m=+222.319573929 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlkfb" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:48 crc kubenswrapper[4825]: W0310 06:47:48.829002 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2e4a51d_47a6_45a0_b510_d25924b2e22a.slice/crio-9b450c332858eb2c00cd825160a2e0f7db8f4ce1b719c51e08d28365ea7a06ab WatchSource:0}: Error finding container 9b450c332858eb2c00cd825160a2e0f7db8f4ce1b719c51e08d28365ea7a06ab: Status 404 returned error can't find the container with id 9b450c332858eb2c00cd825160a2e0f7db8f4ce1b719c51e08d28365ea7a06ab Mar 10 06:47:48 crc kubenswrapper[4825]: I0310 06:47:48.854116 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g8v9j"] Mar 10 06:47:48 crc kubenswrapper[4825]: I0310 06:47:48.859672 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qwnns" podStartSLOduration=168.85964799 podStartE2EDuration="2m48.85964799s" podCreationTimestamp="2026-03-10 06:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:47:48.81889769 +0000 UTC m=+221.848678295" watchObservedRunningTime="2026-03-10 06:47:48.85964799 +0000 UTC m=+221.889428605" Mar 10 06:47:48 crc kubenswrapper[4825]: I0310 06:47:48.868606 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-9cr4m"] Mar 10 06:47:48 crc 
kubenswrapper[4825]: I0310 06:47:48.888427 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m78bw" event={"ID":"bdb77a6d-e89b-4fbf-9e00-93e23571d248","Type":"ContainerStarted","Data":"5cf17ef83f2f9702125607523b15cadcb549d6bc61fa1f0fcbb622e4dd972c01"} Mar 10 06:47:48 crc kubenswrapper[4825]: I0310 06:47:48.896327 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5xr9" event={"ID":"c941f1c7-e449-495d-9af7-e40d1b2f2cd5","Type":"ContainerStarted","Data":"8bf8d7c4a1d23dde05ba99f7312fab1d55a053bb8535442bebfd72da3609fd7e"} Mar 10 06:47:48 crc kubenswrapper[4825]: I0310 06:47:48.900499 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j7fv5" event={"ID":"27d7fca4-c1f9-411f-b705-e36c0c1e8356","Type":"ContainerStarted","Data":"3a4073eb7ee5833eee4805e54c7210a359ec978eaaf81b8ff9e5175462372480"} Mar 10 06:47:48 crc kubenswrapper[4825]: I0310 06:47:48.903286 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-t8r7l" event={"ID":"557beeba-985a-429a-9814-55e49dcb4e44","Type":"ContainerStarted","Data":"33ba6ee396795479de0e76890a5c5f96d01ce19a4eac751fb6e974d3ab82a4b5"} Mar 10 06:47:48 crc kubenswrapper[4825]: I0310 06:47:48.919746 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:47:48 crc kubenswrapper[4825]: E0310 06:47:48.920202 4825 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:47:49.42018486 +0000 UTC m=+222.449965475 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:48 crc kubenswrapper[4825]: I0310 06:47:48.967164 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552086-rkhdp"] Mar 10 06:47:48 crc kubenswrapper[4825]: I0310 06:47:48.968581 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4p7vx" event={"ID":"92a495f2-d3a4-42b9-82f9-48b4dda31caf","Type":"ContainerStarted","Data":"0d7c817f222a8803b868e9f136f2b8c98dbc071cd4e77062b65a4cea6839e6e0"} Mar 10 06:47:48 crc kubenswrapper[4825]: I0310 06:47:48.979598 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-s68rr"] Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.023947 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.025348 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-service-ca/service-ca-9c57cc56f-d5v8r"] Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.037439 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lvqmg" event={"ID":"75d9bfda-62b1-4488-9b08-cf1c8753d0da","Type":"ContainerStarted","Data":"ff1c866460e7cb46da216c1a72ea6a7248ac43d1c73fc4a28b805a3ef9b62569"} Mar 10 06:47:49 crc kubenswrapper[4825]: E0310 06:47:49.026335 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 06:47:49.526315915 +0000 UTC m=+222.556096530 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlkfb" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.060979 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8tdjt" event={"ID":"230c8679-e321-40bd-844e-e350e48404e3","Type":"ContainerStarted","Data":"2156ff545e7ca1994cac79280b8366d5ddce8651cad13e089d5c4a1fecae4e4f"} Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.061788 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8tdjt" Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.083233 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8tdjt" Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.089176 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mxs62" podStartSLOduration=169.089115912 podStartE2EDuration="2m49.089115912s" podCreationTimestamp="2026-03-10 06:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:47:49.060149181 +0000 UTC m=+222.089929816" watchObservedRunningTime="2026-03-10 06:47:49.089115912 +0000 UTC m=+222.118896527" Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.126854 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:47:49 crc kubenswrapper[4825]: E0310 06:47:49.137594 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:47:49.637552462 +0000 UTC m=+222.667333077 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.157589 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8srf6" event={"ID":"ced2dcc4-9906-4cbe-b163-41ff41bb4f02","Type":"ContainerStarted","Data":"956665be0bf7935f3f1532f28dfbc220973fc1bcc025f6dbf8b27e5fff1c040b"} Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.194330 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mxs62" event={"ID":"3e73a713-a351-419b-903a-e44041c28d6f","Type":"ContainerStarted","Data":"5c86dbe093628fe960a04e9c55801bb5a948d4c23332330c0bcd21e18a0024e9"} Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.210872 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-8vfsj" event={"ID":"8a265902-a773-4550-b3fa-79f94c82809c","Type":"ContainerStarted","Data":"c97ce29cd0e3240bfbd21ff944a8cdc199f877a7c15a14cc2f46022b6bb55571"} Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.238386 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:49 crc kubenswrapper[4825]: E0310 
06:47:49.238960 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 06:47:49.738946056 +0000 UTC m=+222.768726661 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlkfb" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.251683 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-2kdt2" podStartSLOduration=168.251661947 podStartE2EDuration="2m48.251661947s" podCreationTimestamp="2026-03-10 06:45:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:47:49.245602452 +0000 UTC m=+222.275383057" watchObservedRunningTime="2026-03-10 06:47:49.251661947 +0000 UTC m=+222.281442552" Mar 10 06:47:49 crc kubenswrapper[4825]: W0310 06:47:49.280333 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19b3fb37_5072_45f0_8349_1296e31a1193.slice/crio-917456c05b8f2e9daa4f600662c64fba8c5d0e8f30e258105fe8d3b141677a8a WatchSource:0}: Error finding container 917456c05b8f2e9daa4f600662c64fba8c5d0e8f30e258105fe8d3b141677a8a: Status 404 returned error can't find the container with id 917456c05b8f2e9daa4f600662c64fba8c5d0e8f30e258105fe8d3b141677a8a Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.292146 4825 provider.go:102] 
Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.296538 4825 patch_prober.go:28] interesting pod/console-operator-58897d9998-zq969 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.296601 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-zq969" podUID="af8fff67-ed41-4e2d-af5f-cf76cd2a4234" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.341747 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:47:49 crc kubenswrapper[4825]: E0310 06:47:49.343738 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:47:49.843716055 +0000 UTC m=+222.873496670 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.432978 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-csfgr" event={"ID":"e40273f3-dda6-4e38-b940-284ae6f95e41","Type":"ContainerStarted","Data":"fab36212b263cecd1acd0492426029a2a209e9417a821f6385df742bee125b19"} Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.433370 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-6bjtt" event={"ID":"f3a60327-2809-415b-abde-d1569a2453b6","Type":"ContainerStarted","Data":"e1caf58256b6246018ee52bb36d21e9e27ff8bb5b218d2d8869057a0e006a818"} Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.433384 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6577d" event={"ID":"af13f2b9-c077-4470-8b65-0e2c834717e7","Type":"ContainerStarted","Data":"34a2657e763eec3cd6def6c4dcb75c1f48c86cc958db629747050771e7c09f27"} Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.433403 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-zq969" event={"ID":"af8fff67-ed41-4e2d-af5f-cf76cd2a4234","Type":"ContainerStarted","Data":"cc05731d0648edb95fab8987042e684da21d6791a9278aabd6007cefe1323cf8"} Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.433425 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6j748"] 
Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.445206 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:49 crc kubenswrapper[4825]: E0310 06:47:49.445712 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 06:47:49.945698119 +0000 UTC m=+222.975478734 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlkfb" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.456616 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-txg9q" event={"ID":"4d23b66c-f736-4af2-9dc7-6167ca4d53ef","Type":"ContainerStarted","Data":"4d4ea277f678ddf2345f9186c8aa38db3e06aebef380d7640660e915cd075c41"} Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.463443 4825 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-txg9q container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Mar 10 06:47:49 crc 
kubenswrapper[4825]: I0310 06:47:49.463570 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-txg9q" podUID="4d23b66c-f736-4af2-9dc7-6167ca4d53ef" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.469021 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-g5xr9" podStartSLOduration=168.468992828 podStartE2EDuration="2m48.468992828s" podCreationTimestamp="2026-03-10 06:45:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:47:49.4155957 +0000 UTC m=+222.445376315" watchObservedRunningTime="2026-03-10 06:47:49.468992828 +0000 UTC m=+222.498773443" Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.550033 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-txg9q" podStartSLOduration=168.550015662 podStartE2EDuration="2m48.550015662s" podCreationTimestamp="2026-03-10 06:45:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:47:49.54846192 +0000 UTC m=+222.578242535" watchObservedRunningTime="2026-03-10 06:47:49.550015662 +0000 UTC m=+222.579796277" Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.552569 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2r62c"] Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.553204 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-79576" 
event={"ID":"bde95e4a-12d5-4b7e-bd4e-9b2527faefa0","Type":"ContainerStarted","Data":"e0275aed1cd3f56e6b1fc1edc944960f688bf15402135c13cc994b624a55ea58"} Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.553287 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-79576" event={"ID":"bde95e4a-12d5-4b7e-bd4e-9b2527faefa0","Type":"ContainerStarted","Data":"f3149874787d971df3766d91374f46b9e05a571ad11d06fef412cc4b2928122c"} Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.557300 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:47:49 crc kubenswrapper[4825]: E0310 06:47:49.560288 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:47:50.060261139 +0000 UTC m=+223.090041754 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.571978 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nblmw"] Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.591764 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-grsjk"] Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.591941 4825 generic.go:334] "Generic (PLEG): container finished" podID="4282a4a7-f7d9-4c0e-9638-02c793c2e2e6" containerID="c755f01a99251339916392b88886546545ac656fb8a90072425423bef423c0f9" exitCode=0 Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.593485 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grpt7" event={"ID":"4282a4a7-f7d9-4c0e-9638-02c793c2e2e6","Type":"ContainerDied","Data":"c755f01a99251339916392b88886546545ac656fb8a90072425423bef423c0f9"} Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.593554 4825 ???:1] "http: TLS handshake error from 192.168.126.11:39510: no serving certificate available for the kubelet" Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.621815 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j7fv5" podStartSLOduration=169.621796514 podStartE2EDuration="2m49.621796514s" podCreationTimestamp="2026-03-10 06:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:47:49.593910659 +0000 UTC m=+222.623691274" watchObservedRunningTime="2026-03-10 06:47:49.621796514 +0000 UTC m=+222.651577129" Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.623764 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8srf6" podStartSLOduration=169.62375851 podStartE2EDuration="2m49.62375851s" podCreationTimestamp="2026-03-10 06:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:47:49.620774549 +0000 UTC m=+222.650555164" watchObservedRunningTime="2026-03-10 06:47:49.62375851 +0000 UTC m=+222.653539125" Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.633197 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-tgcvt" event={"ID":"9f0e6d3a-b022-42dd-828e-bd5ec395d06c","Type":"ContainerStarted","Data":"5b2e1aa3a5f722088f686facd9283fe9350afea20dc42be33e48c96ce22b10aa"} Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.637238 4825 ???:1] "http: TLS handshake error from 192.168.126.11:39514: no serving certificate available for the kubelet" Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.660379 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tc5qk" event={"ID":"7e8fd448-83e0-45d3-aecb-73f7e345c3bd","Type":"ContainerStarted","Data":"92424353eccb99e0ccd76d3499becdeb362250f1802d46be7e2bcdff6451a109"} Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.662634 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-6bjtt" podStartSLOduration=169.662610306 podStartE2EDuration="2m49.662610306s" podCreationTimestamp="2026-03-10 06:45:00 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:47:49.662611656 +0000 UTC m=+222.692392261" watchObservedRunningTime="2026-03-10 06:47:49.662610306 +0000 UTC m=+222.692390921" Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.664319 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:49 crc kubenswrapper[4825]: E0310 06:47:49.666666 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 06:47:50.166652713 +0000 UTC m=+223.196433328 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlkfb" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.673518 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" event={"ID":"cc9419c8-c23a-418b-8fba-9956bed2a193","Type":"ContainerStarted","Data":"7a8f8af22b29d41abc26637da30ee7f659dd5470468ff5f4f2286bafd73974ad"} Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.674422 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.679627 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-l5r8d" event={"ID":"b756af5b-95bd-47f1-a48e-7fe620e67b8c","Type":"ContainerStarted","Data":"dc844cbf4e2ecc177477e06773c671960750893ec9eb769e7747edb5aa51af5d"} Mar 10 06:47:49 crc kubenswrapper[4825]: W0310 06:47:49.683313 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc83fc408_8240_480a_834d_c791da137960.slice/crio-1060dd0834262ac30f359e0770f74b89236a16aca5e26e217af680f8f3113658 WatchSource:0}: Error finding container 1060dd0834262ac30f359e0770f74b89236a16aca5e26e217af680f8f3113658: Status 404 returned error can't find the container with id 1060dd0834262ac30f359e0770f74b89236a16aca5e26e217af680f8f3113658 Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.684178 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-72x4w" event={"ID":"45ae05cc-9b3e-40e0-9235-f0cf1f6def5f","Type":"ContainerStarted","Data":"4ed802cbbd0493487bf2ab4ed8174f81d0705a8e09dd041eac5fec9e1ebfd316"} Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.684199 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-72x4w" Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.695306 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8tdjt" podStartSLOduration=168.695286943 podStartE2EDuration="2m48.695286943s" podCreationTimestamp="2026-03-10 06:45:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:47:49.694853268 +0000 UTC m=+222.724633893" watchObservedRunningTime="2026-03-10 06:47:49.695286943 +0000 UTC m=+222.725067558" Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.695851 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-q2lm7" event={"ID":"0289197b-e586-477b-bb0a-a8b8ef92b21d","Type":"ContainerStarted","Data":"cb0a161099189bf3b898c81236bbcf9b3722dfbe125b9de428610ab6f541692b"} Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.708776 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8mnj8" event={"ID":"07b348c9-f5aa-4fbd-a44c-3c2e8f82b0a9","Type":"ContainerStarted","Data":"5408f757f3d68582be462a5af8ebbf999de61d805bb2d68060f97ff4c3cd7721"} Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.709922 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-ksvjd"] Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.717568 4825 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.723527 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-79576" podStartSLOduration=169.723507538 podStartE2EDuration="2m49.723507538s" podCreationTimestamp="2026-03-10 06:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:47:49.720867209 +0000 UTC m=+222.750647824" watchObservedRunningTime="2026-03-10 06:47:49.723507538 +0000 UTC m=+222.753288153" Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.765048 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:47:49 crc kubenswrapper[4825]: E0310 06:47:49.766854 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:47:50.266837316 +0000 UTC m=+223.296617931 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.780948 4825 ???:1] "http: TLS handshake error from 192.168.126.11:39516: no serving certificate available for the kubelet" Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.801840 4825 patch_prober.go:28] interesting pod/router-default-5444994796-tgcvt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 06:47:49 crc kubenswrapper[4825]: [-]has-synced failed: reason withheld Mar 10 06:47:49 crc kubenswrapper[4825]: [+]process-running ok Mar 10 06:47:49 crc kubenswrapper[4825]: healthz check failed Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.802085 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tgcvt" podUID="9f0e6d3a-b022-42dd-828e-bd5ec395d06c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.862368 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tc5qk" podStartSLOduration=168.862349271 podStartE2EDuration="2m48.862349271s" podCreationTimestamp="2026-03-10 06:45:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:47:49.8608459 +0000 UTC m=+222.890626515" 
watchObservedRunningTime="2026-03-10 06:47:49.862349271 +0000 UTC m=+222.892129886" Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.866949 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:49 crc kubenswrapper[4825]: E0310 06:47:49.867381 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 06:47:50.367365971 +0000 UTC m=+223.397146586 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlkfb" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.942988 4825 ???:1] "http: TLS handshake error from 192.168.126.11:39518: no serving certificate available for the kubelet" Mar 10 06:47:49 crc kubenswrapper[4825]: I0310 06:47:49.967813 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:47:49 crc kubenswrapper[4825]: E0310 06:47:49.968683 4825 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:47:50.468642391 +0000 UTC m=+223.498423006 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.018298 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" podStartSLOduration=170.018274352 podStartE2EDuration="2m50.018274352s" podCreationTimestamp="2026-03-10 06:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:47:50.000811471 +0000 UTC m=+223.030592086" watchObservedRunningTime="2026-03-10 06:47:50.018274352 +0000 UTC m=+223.048054967" Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.020095 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6j748"] Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.044853 4825 ???:1] "http: TLS handshake error from 192.168.126.11:39524: no serving certificate available for the kubelet" Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.070273 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:50 crc kubenswrapper[4825]: E0310 06:47:50.070811 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 06:47:50.570786851 +0000 UTC m=+223.600567466 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlkfb" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.074068 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-72x4w" podStartSLOduration=170.074046041 podStartE2EDuration="2m50.074046041s" podCreationTimestamp="2026-03-10 06:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:47:50.038079073 +0000 UTC m=+223.067859698" watchObservedRunningTime="2026-03-10 06:47:50.074046041 +0000 UTC m=+223.103826656" Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.119507 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-grsjk"] Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.137685 4825 ???:1] "http: TLS handshake error from 
192.168.126.11:39538: no serving certificate available for the kubelet" Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.173173 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:47:50 crc kubenswrapper[4825]: E0310 06:47:50.173994 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:47:50.673972524 +0000 UTC m=+223.703753139 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.239515 4825 ???:1] "http: TLS handshake error from 192.168.126.11:39542: no serving certificate available for the kubelet" Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.251793 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-q2lm7" podStartSLOduration=170.251751589 podStartE2EDuration="2m50.251751589s" podCreationTimestamp="2026-03-10 06:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:47:50.25091078 +0000 UTC m=+223.280691395" 
watchObservedRunningTime="2026-03-10 06:47:50.251751589 +0000 UTC m=+223.281532204" Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.289076 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:50 crc kubenswrapper[4825]: E0310 06:47:50.296090 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 06:47:50.796046079 +0000 UTC m=+223.825826694 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlkfb" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.391963 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:47:50 crc kubenswrapper[4825]: E0310 06:47:50.392339 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:47:50.89231866 +0000 UTC m=+223.922099275 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.424646 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9zmmv"] Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.425789 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9zmmv" Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.432700 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.494706 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcqbl\" (UniqueName: \"kubernetes.io/projected/e945d496-847a-4109-ae2d-41e169b241c2-kube-api-access-tcqbl\") pod \"community-operators-9zmmv\" (UID: \"e945d496-847a-4109-ae2d-41e169b241c2\") " pod="openshift-marketplace/community-operators-9zmmv" Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.495231 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e945d496-847a-4109-ae2d-41e169b241c2-catalog-content\") pod \"community-operators-9zmmv\" (UID: \"e945d496-847a-4109-ae2d-41e169b241c2\") " 
pod="openshift-marketplace/community-operators-9zmmv" Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.495421 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.495549 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e945d496-847a-4109-ae2d-41e169b241c2-utilities\") pod \"community-operators-9zmmv\" (UID: \"e945d496-847a-4109-ae2d-41e169b241c2\") " pod="openshift-marketplace/community-operators-9zmmv" Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.495976 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9zmmv"] Mar 10 06:47:50 crc kubenswrapper[4825]: E0310 06:47:50.496456 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 06:47:50.996443297 +0000 UTC m=+224.026223912 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlkfb" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.601301 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.601592 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e945d496-847a-4109-ae2d-41e169b241c2-utilities\") pod \"community-operators-9zmmv\" (UID: \"e945d496-847a-4109-ae2d-41e169b241c2\") " pod="openshift-marketplace/community-operators-9zmmv" Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.601654 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcqbl\" (UniqueName: \"kubernetes.io/projected/e945d496-847a-4109-ae2d-41e169b241c2-kube-api-access-tcqbl\") pod \"community-operators-9zmmv\" (UID: \"e945d496-847a-4109-ae2d-41e169b241c2\") " pod="openshift-marketplace/community-operators-9zmmv" Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.601709 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e945d496-847a-4109-ae2d-41e169b241c2-catalog-content\") pod \"community-operators-9zmmv\" (UID: \"e945d496-847a-4109-ae2d-41e169b241c2\") " 
pod="openshift-marketplace/community-operators-9zmmv" Mar 10 06:47:50 crc kubenswrapper[4825]: E0310 06:47:50.602215 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:47:51.102177328 +0000 UTC m=+224.131957943 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.602369 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e945d496-847a-4109-ae2d-41e169b241c2-catalog-content\") pod \"community-operators-9zmmv\" (UID: \"e945d496-847a-4109-ae2d-41e169b241c2\") " pod="openshift-marketplace/community-operators-9zmmv" Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.606065 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e945d496-847a-4109-ae2d-41e169b241c2-utilities\") pod \"community-operators-9zmmv\" (UID: \"e945d496-847a-4109-ae2d-41e169b241c2\") " pod="openshift-marketplace/community-operators-9zmmv" Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.621604 4825 ???:1] "http: TLS handshake error from 192.168.126.11:39550: no serving certificate available for the kubelet" Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.633742 4825 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-4gwdl"] Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.641244 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4gwdl" Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.645753 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.658462 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4gwdl"] Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.685162 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcqbl\" (UniqueName: \"kubernetes.io/projected/e945d496-847a-4109-ae2d-41e169b241c2-kube-api-access-tcqbl\") pod \"community-operators-9zmmv\" (UID: \"e945d496-847a-4109-ae2d-41e169b241c2\") " pod="openshift-marketplace/community-operators-9zmmv" Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.702830 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5wjz\" (UniqueName: \"kubernetes.io/projected/f7b359b6-5dbd-4270-9195-a355b8ce3dbd-kube-api-access-w5wjz\") pod \"certified-operators-4gwdl\" (UID: \"f7b359b6-5dbd-4270-9195-a355b8ce3dbd\") " pod="openshift-marketplace/certified-operators-4gwdl" Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.702918 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.702956 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7b359b6-5dbd-4270-9195-a355b8ce3dbd-catalog-content\") pod \"certified-operators-4gwdl\" (UID: \"f7b359b6-5dbd-4270-9195-a355b8ce3dbd\") " pod="openshift-marketplace/certified-operators-4gwdl" Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.702991 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7b359b6-5dbd-4270-9195-a355b8ce3dbd-utilities\") pod \"certified-operators-4gwdl\" (UID: \"f7b359b6-5dbd-4270-9195-a355b8ce3dbd\") " pod="openshift-marketplace/certified-operators-4gwdl" Mar 10 06:47:50 crc kubenswrapper[4825]: E0310 06:47:50.703463 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 06:47:51.203435277 +0000 UTC m=+224.233216082 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlkfb" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.760474 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-s68rr" event={"ID":"3e18f017-e70d-45b7-a7fc-a9398f698980","Type":"ContainerStarted","Data":"256fc11d6da573384434d31e221dda392f6f888feeea77b46e3827c72a6beda7"} Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.776051 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8srf6" event={"ID":"ced2dcc4-9906-4cbe-b163-41ff41bb4f02","Type":"ContainerStarted","Data":"54b4aa846f86305474a6bc122dc515cc979856bdebb01041b2b7e13129b88bf5"} Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.777507 4825 patch_prober.go:28] interesting pod/router-default-5444994796-tgcvt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 06:47:50 crc kubenswrapper[4825]: [-]has-synced failed: reason withheld Mar 10 06:47:50 crc kubenswrapper[4825]: [+]process-running ok Mar 10 06:47:50 crc kubenswrapper[4825]: healthz check failed Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.777558 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tgcvt" podUID="9f0e6d3a-b022-42dd-828e-bd5ec395d06c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 06:47:50 crc 
kubenswrapper[4825]: I0310 06:47:50.801956 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m78bw" event={"ID":"bdb77a6d-e89b-4fbf-9e00-93e23571d248","Type":"ContainerStarted","Data":"18605251060e58932faf38bc7a9683533d2655ff6adce54b71583efa3962b703"} Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.804046 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.816396 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7b359b6-5dbd-4270-9195-a355b8ce3dbd-catalog-content\") pod \"certified-operators-4gwdl\" (UID: \"f7b359b6-5dbd-4270-9195-a355b8ce3dbd\") " pod="openshift-marketplace/certified-operators-4gwdl" Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.816510 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7b359b6-5dbd-4270-9195-a355b8ce3dbd-utilities\") pod \"certified-operators-4gwdl\" (UID: \"f7b359b6-5dbd-4270-9195-a355b8ce3dbd\") " pod="openshift-marketplace/certified-operators-4gwdl" Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.816742 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5wjz\" (UniqueName: \"kubernetes.io/projected/f7b359b6-5dbd-4270-9195-a355b8ce3dbd-kube-api-access-w5wjz\") pod \"certified-operators-4gwdl\" (UID: \"f7b359b6-5dbd-4270-9195-a355b8ce3dbd\") " pod="openshift-marketplace/certified-operators-4gwdl" Mar 10 06:47:50 crc kubenswrapper[4825]: E0310 
06:47:50.817257 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:47:51.317236512 +0000 UTC m=+224.347017127 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.817720 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7b359b6-5dbd-4270-9195-a355b8ce3dbd-catalog-content\") pod \"certified-operators-4gwdl\" (UID: \"f7b359b6-5dbd-4270-9195-a355b8ce3dbd\") " pod="openshift-marketplace/certified-operators-4gwdl" Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.817990 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7b359b6-5dbd-4270-9195-a355b8ce3dbd-utilities\") pod \"certified-operators-4gwdl\" (UID: \"f7b359b6-5dbd-4270-9195-a355b8ce3dbd\") " pod="openshift-marketplace/certified-operators-4gwdl" Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.854718 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9zmmv" Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.856845 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hbdjf"] Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.895424 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5wjz\" (UniqueName: \"kubernetes.io/projected/f7b359b6-5dbd-4270-9195-a355b8ce3dbd-kube-api-access-w5wjz\") pod \"certified-operators-4gwdl\" (UID: \"f7b359b6-5dbd-4270-9195-a355b8ce3dbd\") " pod="openshift-marketplace/certified-operators-4gwdl" Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.903428 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2r62c" event={"ID":"c83fc408-8240-480a-834d-c791da137960","Type":"ContainerStarted","Data":"64e900101338748fb1c68a761567c48a29471f90e65d59ee01138837914a9d28"} Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.929038 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hbdjf"] Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.929082 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2r62c" event={"ID":"c83fc408-8240-480a-834d-c791da137960","Type":"ContainerStarted","Data":"1060dd0834262ac30f359e0770f74b89236a16aca5e26e217af680f8f3113658"} Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.929106 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nblmw" event={"ID":"6b13c3a5-b9b7-421b-8c46-40a7408d6086","Type":"ContainerStarted","Data":"30ac993a5d8b27ad7b5c767382a2c893d153bb5302cdde57fda5319a47461dfb"} Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.929119 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nblmw" event={"ID":"6b13c3a5-b9b7-421b-8c46-40a7408d6086","Type":"ContainerStarted","Data":"7becf1f763db2cc677e74f0991c085dfc9dc4e5aed312a6d04fc0d8ae8f44ed1"} Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.918718 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d232f8a7-f013-4a5c-a7dc-2150c4b3040c-catalog-content\") pod \"community-operators-hbdjf\" (UID: \"d232f8a7-f013-4a5c-a7dc-2150c4b3040c\") " pod="openshift-marketplace/community-operators-hbdjf" Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.929264 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.929385 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d232f8a7-f013-4a5c-a7dc-2150c4b3040c-utilities\") pod \"community-operators-hbdjf\" (UID: \"d232f8a7-f013-4a5c-a7dc-2150c4b3040c\") " pod="openshift-marketplace/community-operators-hbdjf" Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.929516 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6bpt\" (UniqueName: \"kubernetes.io/projected/d232f8a7-f013-4a5c-a7dc-2150c4b3040c-kube-api-access-p6bpt\") pod \"community-operators-hbdjf\" (UID: \"d232f8a7-f013-4a5c-a7dc-2150c4b3040c\") " pod="openshift-marketplace/community-operators-hbdjf" Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.909826 
4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hbdjf" Mar 10 06:47:50 crc kubenswrapper[4825]: E0310 06:47:50.933100 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 06:47:51.433079345 +0000 UTC m=+224.462859960 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlkfb" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.940402 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m78bw" podStartSLOduration=169.940373432 podStartE2EDuration="2m49.940373432s" podCreationTimestamp="2026-03-10 06:45:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:47:50.841888927 +0000 UTC m=+223.871669542" watchObservedRunningTime="2026-03-10 06:47:50.940373432 +0000 UTC m=+223.970154057" Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.943229 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-2r62c" podStartSLOduration=6.943212738 podStartE2EDuration="6.943212738s" podCreationTimestamp="2026-03-10 06:47:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-10 06:47:50.908045687 +0000 UTC m=+223.937826312" watchObservedRunningTime="2026-03-10 06:47:50.943212738 +0000 UTC m=+223.972993353" Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.960176 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-9cr4m" event={"ID":"d2c0d6e4-6024-48a0-8bca-5069baf7a8ab","Type":"ContainerStarted","Data":"1be43be9b1536b7aa994452abb00360788470b88426d4bc5a819f87b8e5994d6"} Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.960247 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-9cr4m" event={"ID":"d2c0d6e4-6024-48a0-8bca-5069baf7a8ab","Type":"ContainerStarted","Data":"34e19dc7c54cf8249cddb833e3b641b2f21046e8f274fa787f269b69f2cb5e8e"} Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.960673 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-9cr4m" Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.971997 4825 patch_prober.go:28] interesting pod/downloads-7954f5f757-9cr4m container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.972044 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-9cr4m" podUID="d2c0d6e4-6024-48a0-8bca-5069baf7a8ab" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.36:8080/\": dial tcp 10.217.0.36:8080: connect: connection refused" Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.989755 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6577d" 
event={"ID":"af13f2b9-c077-4470-8b65-0e2c834717e7","Type":"ContainerStarted","Data":"ae3d872ded0502f66e7ded2819069a13af6a3a4cce34c80ea1badaefadf15026"} Mar 10 06:47:50 crc kubenswrapper[4825]: I0310 06:47:50.991599 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6j748" event={"ID":"0bab741b-822c-4548-8333-aa3f90ecd8a0","Type":"ContainerStarted","Data":"055a3fff85827539d578674378c64256b637de94c64ea8a5573bb622d0e62782"} Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:50.995514 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-t8r7l" event={"ID":"557beeba-985a-429a-9814-55e49dcb4e44","Type":"ContainerStarted","Data":"1f57aa1ef7e9a0aa17d3c5a8003177a928bb5d5490a40a1c36a6b218cae5078d"} Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.001117 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7wbbf" event={"ID":"fba5b680-6beb-4ef8-8e8d-91be04d21de9","Type":"ContainerStarted","Data":"4123667053dad12eee0cb3002ae6a48b8bee848c7d64233d9aa1dadb89fe843a"} Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.001198 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7wbbf" event={"ID":"fba5b680-6beb-4ef8-8e8d-91be04d21de9","Type":"ContainerStarted","Data":"fb3ac44c30845636c1d84ecff45f1981f921d761d67d42eaa62191c8c47a2e22"} Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.005920 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-9cr4m" podStartSLOduration=171.005889991 podStartE2EDuration="2m51.005889991s" podCreationTimestamp="2026-03-10 06:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:47:51.005353373 +0000 UTC 
m=+224.035133988" watchObservedRunningTime="2026-03-10 06:47:51.005889991 +0000 UTC m=+224.035670606" Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.019969 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lfhnn"] Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.021103 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lfhnn" Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.024068 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-l5r8d" event={"ID":"b756af5b-95bd-47f1-a48e-7fe620e67b8c","Type":"ContainerStarted","Data":"212c2761a3a5713666b0967ebdcb598dfcb4293e25fbd1502e9c4eda504b26f3"} Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.035537 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.035731 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d232f8a7-f013-4a5c-a7dc-2150c4b3040c-catalog-content\") pod \"community-operators-hbdjf\" (UID: \"d232f8a7-f013-4a5c-a7dc-2150c4b3040c\") " pod="openshift-marketplace/community-operators-hbdjf" Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.035789 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d232f8a7-f013-4a5c-a7dc-2150c4b3040c-utilities\") pod \"community-operators-hbdjf\" (UID: \"d232f8a7-f013-4a5c-a7dc-2150c4b3040c\") " pod="openshift-marketplace/community-operators-hbdjf" Mar 10 06:47:51 crc 
kubenswrapper[4825]: I0310 06:47:51.035836 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6bpt\" (UniqueName: \"kubernetes.io/projected/d232f8a7-f013-4a5c-a7dc-2150c4b3040c-kube-api-access-p6bpt\") pod \"community-operators-hbdjf\" (UID: \"d232f8a7-f013-4a5c-a7dc-2150c4b3040c\") " pod="openshift-marketplace/community-operators-hbdjf" Mar 10 06:47:51 crc kubenswrapper[4825]: E0310 06:47:51.036244 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:47:51.536228329 +0000 UTC m=+224.566008944 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.045419 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d232f8a7-f013-4a5c-a7dc-2150c4b3040c-catalog-content\") pod \"community-operators-hbdjf\" (UID: \"d232f8a7-f013-4a5c-a7dc-2150c4b3040c\") " pod="openshift-marketplace/community-operators-hbdjf" Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.045979 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d232f8a7-f013-4a5c-a7dc-2150c4b3040c-utilities\") pod \"community-operators-hbdjf\" (UID: \"d232f8a7-f013-4a5c-a7dc-2150c4b3040c\") " 
pod="openshift-marketplace/community-operators-hbdjf" Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.048686 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lfhnn"] Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.054650 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lvqmg" event={"ID":"75d9bfda-62b1-4488-9b08-cf1c8753d0da","Type":"ContainerStarted","Data":"d13e985e93002e0c214551306441b33ada2894b7dca73cbd08c3693e84add37b"} Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.056304 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7wbbf" podStartSLOduration=170.056276118 podStartE2EDuration="2m50.056276118s" podCreationTimestamp="2026-03-10 06:45:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:47:51.055941016 +0000 UTC m=+224.085721632" watchObservedRunningTime="2026-03-10 06:47:51.056276118 +0000 UTC m=+224.086056733" Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.061421 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wp526" event={"ID":"14642521-187c-45cf-aa34-cfd4fa40e632","Type":"ContainerStarted","Data":"93f8b426212030d358dd16aff152e2d7dd6c415860a6059174027bb4994935a1"} Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.061475 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wp526" event={"ID":"14642521-187c-45cf-aa34-cfd4fa40e632","Type":"ContainerStarted","Data":"d4c0e91441b3ed040e947dd46a459e045abc0e46bc40b6596707c483ddb74596"} Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.075650 4825 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-dns-operator/dns-operator-744455d44c-q2lm7" event={"ID":"0289197b-e586-477b-bb0a-a8b8ef92b21d","Type":"ContainerStarted","Data":"7d573c181b2770bae8327477725526da5602fdc15c151e1ba36481d71c2743e6"} Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.080786 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-t8r7l" podStartSLOduration=7.080767417 podStartE2EDuration="7.080767417s" podCreationTimestamp="2026-03-10 06:47:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:47:51.07877821 +0000 UTC m=+224.108558825" watchObservedRunningTime="2026-03-10 06:47:51.080767417 +0000 UTC m=+224.110548032" Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.090521 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6bpt\" (UniqueName: \"kubernetes.io/projected/d232f8a7-f013-4a5c-a7dc-2150c4b3040c-kube-api-access-p6bpt\") pod \"community-operators-hbdjf\" (UID: \"d232f8a7-f013-4a5c-a7dc-2150c4b3040c\") " pod="openshift-marketplace/community-operators-hbdjf" Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.097799 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g8v9j" event={"ID":"7c8037e4-1d42-4e69-a841-8042b4de2ffd","Type":"ContainerStarted","Data":"7928d58a65c22e859aeff3bf2954e8ed2a2287541117dd1cc11acc4494d0a91a"} Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.097853 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g8v9j" event={"ID":"7c8037e4-1d42-4e69-a841-8042b4de2ffd","Type":"ContainerStarted","Data":"7d05d2d0a85e25d4935ce605cb60e5f050188f3af2f943475ffcd1cf13f98f6c"} Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.098892 4825 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g8v9j" Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.101352 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5kj8n" event={"ID":"caf1fe52-fc04-4251-bdf1-f4cf1d80f45f","Type":"ContainerStarted","Data":"48a6d8c8ce364717a4d026c1a62107fc84872abdd9f698a17ac096000cf46017"} Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.101404 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5kj8n" event={"ID":"caf1fe52-fc04-4251-bdf1-f4cf1d80f45f","Type":"ContainerStarted","Data":"6c9e36174499675c056aefe7fcb67ab46bdcf5075215ff426e318e6b0a1ed550"} Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.101978 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5kj8n" Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.110802 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-grsjk" event={"ID":"01203ed2-23cf-4b39-ae6c-6ed22c0d66c0","Type":"ContainerStarted","Data":"3e191ea6f070b84b1094f8b78c48e2dee767235e5bc813667132ea7d826260ad"} Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.110926 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-l5r8d" podStartSLOduration=171.110904278 podStartE2EDuration="2m51.110904278s" podCreationTimestamp="2026-03-10 06:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:47:51.109014804 +0000 UTC m=+224.138795429" watchObservedRunningTime="2026-03-10 06:47:51.110904278 +0000 UTC m=+224.140684893" Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 
06:47:51.115935 4825 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-5kj8n container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:5443/healthz\": dial tcp 10.217.0.41:5443: connect: connection refused" start-of-body= Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.116032 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5kj8n" podUID="caf1fe52-fc04-4251-bdf1-f4cf1d80f45f" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.41:5443/healthz\": dial tcp 10.217.0.41:5443: connect: connection refused" Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.116100 4825 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-g8v9j container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" start-of-body= Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.116553 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g8v9j" podUID="7c8037e4-1d42-4e69-a841-8042b4de2ffd" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.135265 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4gwdl" Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.137006 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33ed941e-9788-4c9e-bcd5-6af206460adb-utilities\") pod \"certified-operators-lfhnn\" (UID: \"33ed941e-9788-4c9e-bcd5-6af206460adb\") " pod="openshift-marketplace/certified-operators-lfhnn" Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.137142 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.137298 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l84j\" (UniqueName: \"kubernetes.io/projected/33ed941e-9788-4c9e-bcd5-6af206460adb-kube-api-access-4l84j\") pod \"certified-operators-lfhnn\" (UID: \"33ed941e-9788-4c9e-bcd5-6af206460adb\") " pod="openshift-marketplace/certified-operators-lfhnn" Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.137363 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33ed941e-9788-4c9e-bcd5-6af206460adb-catalog-content\") pod \"certified-operators-lfhnn\" (UID: \"33ed941e-9788-4c9e-bcd5-6af206460adb\") " pod="openshift-marketplace/certified-operators-lfhnn" Mar 10 06:47:51 crc kubenswrapper[4825]: E0310 06:47:51.139877 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-03-10 06:47:51.639858359 +0000 UTC m=+224.669638984 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlkfb" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.140286 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-csfgr" event={"ID":"e40273f3-dda6-4e38-b940-284ae6f95e41","Type":"ContainerStarted","Data":"12383b378c20d486f55c94d323939c5edba30e3c445e316b25596aacbd361a5d"} Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.163530 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" event={"ID":"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb","Type":"ContainerStarted","Data":"c9f6a3554727222e269c9f165af375f07bcad1f76d1a1f46294b1780f45001e8"} Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.182209 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wp526" podStartSLOduration=170.182184012 podStartE2EDuration="2m50.182184012s" podCreationTimestamp="2026-03-10 06:45:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:47:51.139084393 +0000 UTC m=+224.168865008" watchObservedRunningTime="2026-03-10 06:47:51.182184012 +0000 UTC m=+224.211964617" Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.246314 4825 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.246891 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33ed941e-9788-4c9e-bcd5-6af206460adb-utilities\") pod \"certified-operators-lfhnn\" (UID: \"33ed941e-9788-4c9e-bcd5-6af206460adb\") " pod="openshift-marketplace/certified-operators-lfhnn" Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.247061 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l84j\" (UniqueName: \"kubernetes.io/projected/33ed941e-9788-4c9e-bcd5-6af206460adb-kube-api-access-4l84j\") pod \"certified-operators-lfhnn\" (UID: \"33ed941e-9788-4c9e-bcd5-6af206460adb\") " pod="openshift-marketplace/certified-operators-lfhnn" Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.247144 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33ed941e-9788-4c9e-bcd5-6af206460adb-catalog-content\") pod \"certified-operators-lfhnn\" (UID: \"33ed941e-9788-4c9e-bcd5-6af206460adb\") " pod="openshift-marketplace/certified-operators-lfhnn" Mar 10 06:47:51 crc kubenswrapper[4825]: E0310 06:47:51.247545 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:47:51.747527566 +0000 UTC m=+224.777308181 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.254358 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hbdjf" Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.266682 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-d5v8r" event={"ID":"fa3021cc-fba5-4be1-ab8b-da2e1ad68307","Type":"ContainerStarted","Data":"f7445800eef66eed47e86edcf2e2678e1fd40419a6fae87f17b64a2e6eeb3cba"} Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.266727 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-d5v8r" event={"ID":"fa3021cc-fba5-4be1-ab8b-da2e1ad68307","Type":"ContainerStarted","Data":"69c65fdbb09892b3d918f38e9614917414dac763d87206ed2ca25c17b899fafb"} Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.273978 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lvqmg" podStartSLOduration=170.27394669 podStartE2EDuration="2m50.27394669s" podCreationTimestamp="2026-03-10 06:45:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:47:51.267698589 +0000 UTC m=+224.297479204" watchObservedRunningTime="2026-03-10 06:47:51.27394669 +0000 UTC m=+224.303727305" Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.279362 4825 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33ed941e-9788-4c9e-bcd5-6af206460adb-utilities\") pod \"certified-operators-lfhnn\" (UID: \"33ed941e-9788-4c9e-bcd5-6af206460adb\") " pod="openshift-marketplace/certified-operators-lfhnn" Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.282614 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33ed941e-9788-4c9e-bcd5-6af206460adb-catalog-content\") pod \"certified-operators-lfhnn\" (UID: \"33ed941e-9788-4c9e-bcd5-6af206460adb\") " pod="openshift-marketplace/certified-operators-lfhnn" Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.309541 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hbbcq" event={"ID":"d2e4a51d-47a6-45a0-b510-d25924b2e22a","Type":"ContainerStarted","Data":"9b450c332858eb2c00cd825160a2e0f7db8f4ce1b719c51e08d28365ea7a06ab"} Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.343975 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g8v9j" podStartSLOduration=170.34392634 podStartE2EDuration="2m50.34392634s" podCreationTimestamp="2026-03-10 06:45:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:47:51.340095621 +0000 UTC m=+224.369876256" watchObservedRunningTime="2026-03-10 06:47:51.34392634 +0000 UTC m=+224.373706955" Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.359495 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l84j\" (UniqueName: \"kubernetes.io/projected/33ed941e-9788-4c9e-bcd5-6af206460adb-kube-api-access-4l84j\") pod \"certified-operators-lfhnn\" (UID: \"33ed941e-9788-4c9e-bcd5-6af206460adb\") " 
pod="openshift-marketplace/certified-operators-lfhnn" Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.360098 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:51 crc kubenswrapper[4825]: E0310 06:47:51.362014 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 06:47:51.862000793 +0000 UTC m=+224.891781408 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlkfb" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.373415 4825 ???:1] "http: TLS handshake error from 192.168.126.11:39560: no serving certificate available for the kubelet" Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.377914 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-8vfsj" event={"ID":"8a265902-a773-4550-b3fa-79f94c82809c","Type":"ContainerStarted","Data":"ce0dcbc4dc739bed011d55d91818de1fdfdb0239012eb1fc80e21b9d53e437da"} Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.415428 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-csfgr" podStartSLOduration=170.398903793 podStartE2EDuration="2m50.398903793s" podCreationTimestamp="2026-03-10 06:45:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:47:51.387795436 +0000 UTC m=+224.417576051" watchObservedRunningTime="2026-03-10 06:47:51.398903793 +0000 UTC m=+224.428684398" Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.428442 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552086-rkhdp" event={"ID":"19b3fb37-5072-45f0-8349-1296e31a1193","Type":"ContainerStarted","Data":"917456c05b8f2e9daa4f600662c64fba8c5d0e8f30e258105fe8d3b141677a8a"} Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.436022 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-d5v8r" podStartSLOduration=170.435991639 podStartE2EDuration="2m50.435991639s" podCreationTimestamp="2026-03-10 06:45:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:47:51.417177912 +0000 UTC m=+224.446958527" watchObservedRunningTime="2026-03-10 06:47:51.435991639 +0000 UTC m=+224.465772244" Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.461520 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:47:51 crc kubenswrapper[4825]: E0310 06:47:51.463337 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:47:51.963316184 +0000 UTC m=+224.993096789 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.475241 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5kj8n" podStartSLOduration=170.475213667 podStartE2EDuration="2m50.475213667s" podCreationTimestamp="2026-03-10 06:45:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:47:51.46350171 +0000 UTC m=+224.493282325" watchObservedRunningTime="2026-03-10 06:47:51.475213667 +0000 UTC m=+224.504994282" Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.489748 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8mnj8" event={"ID":"07b348c9-f5aa-4fbd-a44c-3c2e8f82b0a9","Type":"ContainerStarted","Data":"67ab50b3d2d9442ddf4c3f4f4de5525b32017f45f85596127f6c6b02d8333151"} Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.520714 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tc5qk" event={"ID":"7e8fd448-83e0-45d3-aecb-73f7e345c3bd","Type":"ContainerStarted","Data":"1c407a5da2395c2094b7a2562f07cf56f02c71fd6e48904d08edc8c24dd2120a"} Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.522942 
4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-8vfsj" podStartSLOduration=170.522929183 podStartE2EDuration="2m50.522929183s" podCreationTimestamp="2026-03-10 06:45:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:47:51.521330949 +0000 UTC m=+224.551111564" watchObservedRunningTime="2026-03-10 06:47:51.522929183 +0000 UTC m=+224.552709798" Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.575937 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.589179 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552085-npl6x" event={"ID":"f4889476-c0d0-4e4b-986a-f4dcdacce72b","Type":"ContainerStarted","Data":"48ead55ecdcc5b367d13e319264acfded320c333d5f630757bdcd4e5589ea7fb"} Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.606536 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552085-npl6x" event={"ID":"f4889476-c0d0-4e4b-986a-f4dcdacce72b","Type":"ContainerStarted","Data":"f044f6937aa8febe11168d12bdff487a5f332bb1cab3d4c838c29042c23fdb09"} Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.590817 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lfhnn" Mar 10 06:47:51 crc kubenswrapper[4825]: E0310 06:47:51.609038 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 06:47:52.109014419 +0000 UTC m=+225.138795034 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlkfb" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.612980 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8mnj8" podStartSLOduration=170.612948912 podStartE2EDuration="2m50.612948912s" podCreationTimestamp="2026-03-10 06:45:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:47:51.608669777 +0000 UTC m=+224.638450392" watchObservedRunningTime="2026-03-10 06:47:51.612948912 +0000 UTC m=+224.642729527" Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.635503 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-zq969" Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.657945 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-txg9q" Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.660334 4825 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-72x4w" Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.708741 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:47:51 crc kubenswrapper[4825]: E0310 06:47:51.708924 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:47:52.208888742 +0000 UTC m=+225.238669367 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.709178 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:51 crc kubenswrapper[4825]: E0310 06:47:51.711863 4825 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 06:47:52.211850282 +0000 UTC m=+225.241630897 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlkfb" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.733942 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29552085-npl6x" podStartSLOduration=171.733916339 podStartE2EDuration="2m51.733916339s" podCreationTimestamp="2026-03-10 06:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:47:51.660535664 +0000 UTC m=+224.690316279" watchObservedRunningTime="2026-03-10 06:47:51.733916339 +0000 UTC m=+224.763696954" Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.797097 4825 patch_prober.go:28] interesting pod/router-default-5444994796-tgcvt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 06:47:51 crc kubenswrapper[4825]: [-]has-synced failed: reason withheld Mar 10 06:47:51 crc kubenswrapper[4825]: [+]process-running ok Mar 10 06:47:51 crc kubenswrapper[4825]: healthz check failed Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.797183 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tgcvt" 
podUID="9f0e6d3a-b022-42dd-828e-bd5ec395d06c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.817984 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:47:51 crc kubenswrapper[4825]: E0310 06:47:51.818412 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:47:52.31839242 +0000 UTC m=+225.348173035 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:51 crc kubenswrapper[4825]: I0310 06:47:51.920867 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:51 crc kubenswrapper[4825]: E0310 06:47:51.921682 4825 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 06:47:52.421666148 +0000 UTC m=+225.451446763 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlkfb" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.023919 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:47:52 crc kubenswrapper[4825]: E0310 06:47:52.024350 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:47:52.524331836 +0000 UTC m=+225.554112441 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.125772 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:52 crc kubenswrapper[4825]: E0310 06:47:52.126237 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 06:47:52.626221727 +0000 UTC m=+225.656002342 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlkfb" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.227351 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:47:52 crc kubenswrapper[4825]: E0310 06:47:52.228045 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:47:52.728023785 +0000 UTC m=+225.757804400 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.295870 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4gwdl"] Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.329680 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:52 crc kubenswrapper[4825]: E0310 06:47:52.330353 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 06:47:52.83032916 +0000 UTC m=+225.860109965 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlkfb" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.348165 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hbdjf"] Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.385943 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9zmmv"] Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.419456 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vkvnj"] Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.431500 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:47:52 crc kubenswrapper[4825]: E0310 06:47:52.432627 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:47:52.932608934 +0000 UTC m=+225.962389549 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.440463 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vkvnj"] Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.440631 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vkvnj" Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.449736 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.470435 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lfhnn"] Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.535586 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.535658 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9p8v\" (UniqueName: \"kubernetes.io/projected/d41ec286-2462-448b-ab27-1acc0e1dab3c-kube-api-access-t9p8v\") pod \"redhat-marketplace-vkvnj\" (UID: \"d41ec286-2462-448b-ab27-1acc0e1dab3c\") 
" pod="openshift-marketplace/redhat-marketplace-vkvnj" Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.535761 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d41ec286-2462-448b-ab27-1acc0e1dab3c-catalog-content\") pod \"redhat-marketplace-vkvnj\" (UID: \"d41ec286-2462-448b-ab27-1acc0e1dab3c\") " pod="openshift-marketplace/redhat-marketplace-vkvnj" Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.535786 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d41ec286-2462-448b-ab27-1acc0e1dab3c-utilities\") pod \"redhat-marketplace-vkvnj\" (UID: \"d41ec286-2462-448b-ab27-1acc0e1dab3c\") " pod="openshift-marketplace/redhat-marketplace-vkvnj" Mar 10 06:47:52 crc kubenswrapper[4825]: E0310 06:47:52.537407 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 06:47:53.037384592 +0000 UTC m=+226.067165207 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlkfb" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.612335 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6j748" event={"ID":"0bab741b-822c-4548-8333-aa3f90ecd8a0","Type":"ContainerStarted","Data":"52bb6a0f470c8056867111ac34610082b578937ba95c6d38d1c41ec2077d0f38"} Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.612439 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-6j748" podUID="0bab741b-822c-4548-8333-aa3f90ecd8a0" containerName="controller-manager" containerID="cri-o://52bb6a0f470c8056867111ac34610082b578937ba95c6d38d1c41ec2077d0f38" gracePeriod=30 Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.612607 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-6j748" Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.625657 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-s68rr" event={"ID":"3e18f017-e70d-45b7-a7fc-a9398f698980","Type":"ContainerStarted","Data":"81de5062cc3ae825b095b344a270a918c2cd3e2aedefd3f91f71215014fb0c84"} Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.626011 4825 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-6j748 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": 
dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.626065 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-6j748" podUID="0bab741b-822c-4548-8333-aa3f90ecd8a0" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.636752 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:47:52 crc kubenswrapper[4825]: E0310 06:47:52.637118 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:47:53.137078139 +0000 UTC m=+226.166858754 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.637767 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d41ec286-2462-448b-ab27-1acc0e1dab3c-catalog-content\") pod \"redhat-marketplace-vkvnj\" (UID: \"d41ec286-2462-448b-ab27-1acc0e1dab3c\") " pod="openshift-marketplace/redhat-marketplace-vkvnj" Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.637800 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d41ec286-2462-448b-ab27-1acc0e1dab3c-utilities\") pod \"redhat-marketplace-vkvnj\" (UID: \"d41ec286-2462-448b-ab27-1acc0e1dab3c\") " pod="openshift-marketplace/redhat-marketplace-vkvnj" Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.637858 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.637885 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9p8v\" (UniqueName: \"kubernetes.io/projected/d41ec286-2462-448b-ab27-1acc0e1dab3c-kube-api-access-t9p8v\") pod \"redhat-marketplace-vkvnj\" (UID: 
\"d41ec286-2462-448b-ab27-1acc0e1dab3c\") " pod="openshift-marketplace/redhat-marketplace-vkvnj" Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.639855 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-6j748" podStartSLOduration=172.639828162 podStartE2EDuration="2m52.639828162s" podCreationTimestamp="2026-03-10 06:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:47:52.630349991 +0000 UTC m=+225.660130606" watchObservedRunningTime="2026-03-10 06:47:52.639828162 +0000 UTC m=+225.669608777" Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.641751 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d41ec286-2462-448b-ab27-1acc0e1dab3c-catalog-content\") pod \"redhat-marketplace-vkvnj\" (UID: \"d41ec286-2462-448b-ab27-1acc0e1dab3c\") " pod="openshift-marketplace/redhat-marketplace-vkvnj" Mar 10 06:47:52 crc kubenswrapper[4825]: E0310 06:47:52.641819 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 06:47:53.141805449 +0000 UTC m=+226.171586064 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlkfb" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.641843 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d41ec286-2462-448b-ab27-1acc0e1dab3c-utilities\") pod \"redhat-marketplace-vkvnj\" (UID: \"d41ec286-2462-448b-ab27-1acc0e1dab3c\") " pod="openshift-marketplace/redhat-marketplace-vkvnj" Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.684816 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9zmmv" event={"ID":"e945d496-847a-4109-ae2d-41e169b241c2","Type":"ContainerStarted","Data":"3b8b29a0243edbb7a048bcd7a324d016e71460515b8950d383ec8fe104a64074"} Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.685351 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9p8v\" (UniqueName: \"kubernetes.io/projected/d41ec286-2462-448b-ab27-1acc0e1dab3c-kube-api-access-t9p8v\") pod \"redhat-marketplace-vkvnj\" (UID: \"d41ec286-2462-448b-ab27-1acc0e1dab3c\") " pod="openshift-marketplace/redhat-marketplace-vkvnj" Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.695533 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hbbcq" event={"ID":"d2e4a51d-47a6-45a0-b510-d25924b2e22a","Type":"ContainerStarted","Data":"d53fad51c39f4905889c9e285cb73e70ea311117e03a356b2a66d76e8b586630"} Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.695603 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/dns-default-hbbcq" event={"ID":"d2e4a51d-47a6-45a0-b510-d25924b2e22a","Type":"ContainerStarted","Data":"b178e82bcb325edd8d503d636dae95cb4c5ba034468cc024f6c9d596056fa027"} Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.696735 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-hbbcq" Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.718673 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nblmw" event={"ID":"6b13c3a5-b9b7-421b-8c46-40a7408d6086","Type":"ContainerStarted","Data":"775eefdd3eff695f91b356b2250976f62afb5ab2a843926c9118e99138482d8f"} Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.754121 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nblmw" Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.756159 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:47:52 crc kubenswrapper[4825]: E0310 06:47:52.757585 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:47:53.25756504 +0000 UTC m=+226.287345655 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.762352 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-hbbcq" podStartSLOduration=8.762324701 podStartE2EDuration="8.762324701s" podCreationTimestamp="2026-03-10 06:47:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:47:52.755393616 +0000 UTC m=+225.785174231" watchObservedRunningTime="2026-03-10 06:47:52.762324701 +0000 UTC m=+225.792105316" Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.771080 4825 ???:1] "http: TLS handshake error from 192.168.126.11:44990: no serving certificate available for the kubelet" Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.772747 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lfhnn" event={"ID":"33ed941e-9788-4c9e-bcd5-6af206460adb","Type":"ContainerStarted","Data":"8f55c48652aa364654b9e17b26a212598ff09f76119329099cc12e8619f83286"} Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.792368 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6577d" event={"ID":"af13f2b9-c077-4470-8b65-0e2c834717e7","Type":"ContainerStarted","Data":"9093e2f61f8c51df1f18c4eb68e95e07da23121500963be48c971a5c4da20a2d"} Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.792781 4825 patch_prober.go:28] interesting 
pod/router-default-5444994796-tgcvt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 06:47:52 crc kubenswrapper[4825]: [-]has-synced failed: reason withheld Mar 10 06:47:52 crc kubenswrapper[4825]: [+]process-running ok Mar 10 06:47:52 crc kubenswrapper[4825]: healthz check failed Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.792851 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tgcvt" podUID="9f0e6d3a-b022-42dd-828e-bd5ec395d06c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.794678 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vkvnj" Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.800119 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nblmw" podStartSLOduration=171.800100291 podStartE2EDuration="2m51.800100291s" podCreationTimestamp="2026-03-10 06:45:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:47:52.796882042 +0000 UTC m=+225.826662647" watchObservedRunningTime="2026-03-10 06:47:52.800100291 +0000 UTC m=+225.829880906" Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.810612 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-8mnj8" event={"ID":"07b348c9-f5aa-4fbd-a44c-3c2e8f82b0a9","Type":"ContainerStarted","Data":"e3e70713f0b8e916bc82cd29e6edc94f3935ee78d3384c93d5ebcc176ead0d54"} Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.815578 4825 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-lfxbn"] Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.817212 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lfxbn" Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.834701 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-grsjk" event={"ID":"01203ed2-23cf-4b39-ae6c-6ed22c0d66c0","Type":"ContainerStarted","Data":"e09b2d4f0c6d3f92dfbff134c27f34ad877b2863779a22e4e1807f9d2c6926bf"} Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.834925 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-grsjk" podUID="01203ed2-23cf-4b39-ae6c-6ed22c0d66c0" containerName="route-controller-manager" containerID="cri-o://e09b2d4f0c6d3f92dfbff134c27f34ad877b2863779a22e4e1807f9d2c6926bf" gracePeriod=30 Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.835409 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-grsjk" Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.852599 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6577d" podStartSLOduration=171.852568468 podStartE2EDuration="2m51.852568468s" podCreationTimestamp="2026-03-10 06:45:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:47:52.826192564 +0000 UTC m=+225.855973189" watchObservedRunningTime="2026-03-10 06:47:52.852568468 +0000 UTC m=+225.882349093" Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.858406 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-lfxbn"] Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.859513 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fn9kg\" (UniqueName: \"kubernetes.io/projected/823b95d6-3aa7-4fda-bd75-9fe7e95a209f-kube-api-access-fn9kg\") pod \"redhat-marketplace-lfxbn\" (UID: \"823b95d6-3aa7-4fda-bd75-9fe7e95a209f\") " pod="openshift-marketplace/redhat-marketplace-lfxbn" Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.859577 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/823b95d6-3aa7-4fda-bd75-9fe7e95a209f-utilities\") pod \"redhat-marketplace-lfxbn\" (UID: \"823b95d6-3aa7-4fda-bd75-9fe7e95a209f\") " pod="openshift-marketplace/redhat-marketplace-lfxbn" Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.859617 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.859680 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/823b95d6-3aa7-4fda-bd75-9fe7e95a209f-catalog-content\") pod \"redhat-marketplace-lfxbn\" (UID: \"823b95d6-3aa7-4fda-bd75-9fe7e95a209f\") " pod="openshift-marketplace/redhat-marketplace-lfxbn" Mar 10 06:47:52 crc kubenswrapper[4825]: E0310 06:47:52.862603 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-10 06:47:53.362588807 +0000 UTC m=+226.392369422 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlkfb" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.871350 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grpt7" event={"ID":"4282a4a7-f7d9-4c0e-9638-02c793c2e2e6","Type":"ContainerStarted","Data":"41b3eb79a2298f411780cd78dc85cdb54d7d6667a12f989ad593a2eaa3681663"} Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.881111 4825 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-grsjk container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": read tcp 10.217.0.2:36020->10.217.0.24:8443: read: connection reset by peer" start-of-body= Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.881193 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-grsjk" podUID="01203ed2-23cf-4b39-ae6c-6ed22c0d66c0" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": read tcp 10.217.0.2:36020->10.217.0.24:8443: read: connection reset by peer" Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.892328 4825 generic.go:334] "Generic (PLEG): container finished" podID="f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb" containerID="763e5c48636dd5e1028997cbd1cbd2f775706c86d2b4a4bb53f712eb576f49a0" exitCode=0 Mar 10 
06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.892473 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" event={"ID":"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb","Type":"ContainerDied","Data":"763e5c48636dd5e1028997cbd1cbd2f775706c86d2b4a4bb53f712eb576f49a0"} Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.894834 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-grsjk" podStartSLOduration=171.894810518 podStartE2EDuration="2m51.894810518s" podCreationTimestamp="2026-03-10 06:45:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:47:52.892151438 +0000 UTC m=+225.921932063" watchObservedRunningTime="2026-03-10 06:47:52.894810518 +0000 UTC m=+225.924591133" Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.939834 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4gwdl" event={"ID":"f7b359b6-5dbd-4270-9195-a355b8ce3dbd","Type":"ContainerStarted","Data":"5b1caaa0c6dce7c64aae35b69d036af11e91e672b4b1c88e6a4b8a2fe2c96d9b"} Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.954050 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grpt7" podStartSLOduration=171.954031304 podStartE2EDuration="2m51.954031304s" podCreationTimestamp="2026-03-10 06:45:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:47:52.951704585 +0000 UTC m=+225.981485210" watchObservedRunningTime="2026-03-10 06:47:52.954031304 +0000 UTC m=+225.983811919" Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.956688 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-hbdjf" event={"ID":"d232f8a7-f013-4a5c-a7dc-2150c4b3040c","Type":"ContainerStarted","Data":"f533bccf4abe8d105de241b927bcf4d846a0d32e66c4cd938d03bfd9ed99ce2d"} Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.961013 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.961409 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/823b95d6-3aa7-4fda-bd75-9fe7e95a209f-catalog-content\") pod \"redhat-marketplace-lfxbn\" (UID: \"823b95d6-3aa7-4fda-bd75-9fe7e95a209f\") " pod="openshift-marketplace/redhat-marketplace-lfxbn" Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.962562 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fn9kg\" (UniqueName: \"kubernetes.io/projected/823b95d6-3aa7-4fda-bd75-9fe7e95a209f-kube-api-access-fn9kg\") pod \"redhat-marketplace-lfxbn\" (UID: \"823b95d6-3aa7-4fda-bd75-9fe7e95a209f\") " pod="openshift-marketplace/redhat-marketplace-lfxbn" Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.962625 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/823b95d6-3aa7-4fda-bd75-9fe7e95a209f-utilities\") pod \"redhat-marketplace-lfxbn\" (UID: \"823b95d6-3aa7-4fda-bd75-9fe7e95a209f\") " pod="openshift-marketplace/redhat-marketplace-lfxbn" Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.965640 4825 patch_prober.go:28] interesting pod/downloads-7954f5f757-9cr4m container/download-server namespace/openshift-console: Readiness probe 
status=failure output="Get \"http://10.217.0.36:8080/\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.966047 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-9cr4m" podUID="d2c0d6e4-6024-48a0-8bca-5069baf7a8ab" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.36:8080/\": dial tcp 10.217.0.36:8080: connect: connection refused" Mar 10 06:47:52 crc kubenswrapper[4825]: E0310 06:47:52.966774 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:47:53.466755295 +0000 UTC m=+226.496535910 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.973614 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/823b95d6-3aa7-4fda-bd75-9fe7e95a209f-catalog-content\") pod \"redhat-marketplace-lfxbn\" (UID: \"823b95d6-3aa7-4fda-bd75-9fe7e95a209f\") " pod="openshift-marketplace/redhat-marketplace-lfxbn" Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.979636 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5kj8n" Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 
06:47:52.986542 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g8v9j" Mar 10 06:47:52 crc kubenswrapper[4825]: I0310 06:47:52.989005 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/823b95d6-3aa7-4fda-bd75-9fe7e95a209f-utilities\") pod \"redhat-marketplace-lfxbn\" (UID: \"823b95d6-3aa7-4fda-bd75-9fe7e95a209f\") " pod="openshift-marketplace/redhat-marketplace-lfxbn" Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.014535 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fn9kg\" (UniqueName: \"kubernetes.io/projected/823b95d6-3aa7-4fda-bd75-9fe7e95a209f-kube-api-access-fn9kg\") pod \"redhat-marketplace-lfxbn\" (UID: \"823b95d6-3aa7-4fda-bd75-9fe7e95a209f\") " pod="openshift-marketplace/redhat-marketplace-lfxbn" Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.072534 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:53 crc kubenswrapper[4825]: E0310 06:47:53.083601 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 06:47:53.583579832 +0000 UTC m=+226.613360447 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlkfb" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.173925 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:47:53 crc kubenswrapper[4825]: E0310 06:47:53.174239 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:47:53.674125799 +0000 UTC m=+226.703906414 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.175769 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:53 crc kubenswrapper[4825]: E0310 06:47:53.176416 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 06:47:53.676400386 +0000 UTC m=+226.706181001 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlkfb" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.235322 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager_controller-manager-879f6c89f-6j748_0bab741b-822c-4548-8333-aa3f90ecd8a0/controller-manager/0.log" Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.235410 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6j748" Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.276677 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0bab741b-822c-4548-8333-aa3f90ecd8a0-proxy-ca-bundles\") pod \"0bab741b-822c-4548-8333-aa3f90ecd8a0\" (UID: \"0bab741b-822c-4548-8333-aa3f90ecd8a0\") " Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.276719 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bab741b-822c-4548-8333-aa3f90ecd8a0-serving-cert\") pod \"0bab741b-822c-4548-8333-aa3f90ecd8a0\" (UID: \"0bab741b-822c-4548-8333-aa3f90ecd8a0\") " Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.276742 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bab741b-822c-4548-8333-aa3f90ecd8a0-config\") pod \"0bab741b-822c-4548-8333-aa3f90ecd8a0\" (UID: \"0bab741b-822c-4548-8333-aa3f90ecd8a0\") " Mar 10 06:47:53 crc 
kubenswrapper[4825]: I0310 06:47:53.276772 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbj9x\" (UniqueName: \"kubernetes.io/projected/0bab741b-822c-4548-8333-aa3f90ecd8a0-kube-api-access-gbj9x\") pod \"0bab741b-822c-4548-8333-aa3f90ecd8a0\" (UID: \"0bab741b-822c-4548-8333-aa3f90ecd8a0\") " Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.276885 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.276931 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0bab741b-822c-4548-8333-aa3f90ecd8a0-client-ca\") pod \"0bab741b-822c-4548-8333-aa3f90ecd8a0\" (UID: \"0bab741b-822c-4548-8333-aa3f90ecd8a0\") " Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.277848 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bab741b-822c-4548-8333-aa3f90ecd8a0-config" (OuterVolumeSpecName: "config") pod "0bab741b-822c-4548-8333-aa3f90ecd8a0" (UID: "0bab741b-822c-4548-8333-aa3f90ecd8a0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.279148 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bab741b-822c-4548-8333-aa3f90ecd8a0-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0bab741b-822c-4548-8333-aa3f90ecd8a0" (UID: "0bab741b-822c-4548-8333-aa3f90ecd8a0"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.279790 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bab741b-822c-4548-8333-aa3f90ecd8a0-client-ca" (OuterVolumeSpecName: "client-ca") pod "0bab741b-822c-4548-8333-aa3f90ecd8a0" (UID: "0bab741b-822c-4548-8333-aa3f90ecd8a0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:47:53 crc kubenswrapper[4825]: E0310 06:47:53.279937 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:47:53.779914352 +0000 UTC m=+226.809694957 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.284858 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6c5b59d588-q6lfm"] Mar 10 06:47:53 crc kubenswrapper[4825]: E0310 06:47:53.285116 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bab741b-822c-4548-8333-aa3f90ecd8a0" containerName="controller-manager" Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.286158 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lfxbn"
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.290683 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bab741b-822c-4548-8333-aa3f90ecd8a0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0bab741b-822c-4548-8333-aa3f90ecd8a0" (UID: "0bab741b-822c-4548-8333-aa3f90ecd8a0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.291218 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bab741b-822c-4548-8333-aa3f90ecd8a0-kube-api-access-gbj9x" (OuterVolumeSpecName: "kube-api-access-gbj9x") pod "0bab741b-822c-4548-8333-aa3f90ecd8a0" (UID: "0bab741b-822c-4548-8333-aa3f90ecd8a0"). InnerVolumeSpecName "kube-api-access-gbj9x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.285127 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bab741b-822c-4548-8333-aa3f90ecd8a0" containerName="controller-manager"
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.294401 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bab741b-822c-4548-8333-aa3f90ecd8a0" containerName="controller-manager"
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.295057 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c5b59d588-q6lfm"
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.305541 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c5b59d588-q6lfm"]
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.379164 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f86d9d2c-0701-4d7a-926f-64f7733767d4-serving-cert\") pod \"controller-manager-6c5b59d588-q6lfm\" (UID: \"f86d9d2c-0701-4d7a-926f-64f7733767d4\") " pod="openshift-controller-manager/controller-manager-6c5b59d588-q6lfm"
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.379622 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb"
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.379693 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f86d9d2c-0701-4d7a-926f-64f7733767d4-client-ca\") pod \"controller-manager-6c5b59d588-q6lfm\" (UID: \"f86d9d2c-0701-4d7a-926f-64f7733767d4\") " pod="openshift-controller-manager/controller-manager-6c5b59d588-q6lfm"
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.379715 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f86d9d2c-0701-4d7a-926f-64f7733767d4-config\") pod \"controller-manager-6c5b59d588-q6lfm\" (UID: \"f86d9d2c-0701-4d7a-926f-64f7733767d4\") " pod="openshift-controller-manager/controller-manager-6c5b59d588-q6lfm"
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.379753 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfmlp\" (UniqueName: \"kubernetes.io/projected/f86d9d2c-0701-4d7a-926f-64f7733767d4-kube-api-access-qfmlp\") pod \"controller-manager-6c5b59d588-q6lfm\" (UID: \"f86d9d2c-0701-4d7a-926f-64f7733767d4\") " pod="openshift-controller-manager/controller-manager-6c5b59d588-q6lfm"
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.379778 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f86d9d2c-0701-4d7a-926f-64f7733767d4-proxy-ca-bundles\") pod \"controller-manager-6c5b59d588-q6lfm\" (UID: \"f86d9d2c-0701-4d7a-926f-64f7733767d4\") " pod="openshift-controller-manager/controller-manager-6c5b59d588-q6lfm"
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.379826 4825 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0bab741b-822c-4548-8333-aa3f90ecd8a0-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.379837 4825 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bab741b-822c-4548-8333-aa3f90ecd8a0-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.379846 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bab741b-822c-4548-8333-aa3f90ecd8a0-config\") on node \"crc\" DevicePath \"\""
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.379857 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbj9x\" (UniqueName: \"kubernetes.io/projected/0bab741b-822c-4548-8333-aa3f90ecd8a0-kube-api-access-gbj9x\") on node \"crc\" DevicePath \"\""
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.379867 4825 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0bab741b-822c-4548-8333-aa3f90ecd8a0-client-ca\") on node \"crc\" DevicePath \"\""
Mar 10 06:47:53 crc kubenswrapper[4825]: E0310 06:47:53.380190 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 06:47:53.880176028 +0000 UTC m=+226.909956643 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlkfb" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.480809 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 06:47:53 crc kubenswrapper[4825]: E0310 06:47:53.480920 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:47:53.980902239 +0000 UTC m=+227.010682854 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.481237 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb"
Mar 10 06:47:53 crc kubenswrapper[4825]: E0310 06:47:53.481600 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 06:47:53.981590882 +0000 UTC m=+227.011371497 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlkfb" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.481815 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f86d9d2c-0701-4d7a-926f-64f7733767d4-client-ca\") pod \"controller-manager-6c5b59d588-q6lfm\" (UID: \"f86d9d2c-0701-4d7a-926f-64f7733767d4\") " pod="openshift-controller-manager/controller-manager-6c5b59d588-q6lfm"
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.481846 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f86d9d2c-0701-4d7a-926f-64f7733767d4-config\") pod \"controller-manager-6c5b59d588-q6lfm\" (UID: \"f86d9d2c-0701-4d7a-926f-64f7733767d4\") " pod="openshift-controller-manager/controller-manager-6c5b59d588-q6lfm"
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.482878 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f86d9d2c-0701-4d7a-926f-64f7733767d4-client-ca\") pod \"controller-manager-6c5b59d588-q6lfm\" (UID: \"f86d9d2c-0701-4d7a-926f-64f7733767d4\") " pod="openshift-controller-manager/controller-manager-6c5b59d588-q6lfm"
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.482997 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfmlp\" (UniqueName: \"kubernetes.io/projected/f86d9d2c-0701-4d7a-926f-64f7733767d4-kube-api-access-qfmlp\") pod \"controller-manager-6c5b59d588-q6lfm\" (UID: \"f86d9d2c-0701-4d7a-926f-64f7733767d4\") " pod="openshift-controller-manager/controller-manager-6c5b59d588-q6lfm"
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.483027 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f86d9d2c-0701-4d7a-926f-64f7733767d4-proxy-ca-bundles\") pod \"controller-manager-6c5b59d588-q6lfm\" (UID: \"f86d9d2c-0701-4d7a-926f-64f7733767d4\") " pod="openshift-controller-manager/controller-manager-6c5b59d588-q6lfm"
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.483334 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f86d9d2c-0701-4d7a-926f-64f7733767d4-config\") pod \"controller-manager-6c5b59d588-q6lfm\" (UID: \"f86d9d2c-0701-4d7a-926f-64f7733767d4\") " pod="openshift-controller-manager/controller-manager-6c5b59d588-q6lfm"
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.483655 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f86d9d2c-0701-4d7a-926f-64f7733767d4-serving-cert\") pod \"controller-manager-6c5b59d588-q6lfm\" (UID: \"f86d9d2c-0701-4d7a-926f-64f7733767d4\") " pod="openshift-controller-manager/controller-manager-6c5b59d588-q6lfm"
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.484986 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f86d9d2c-0701-4d7a-926f-64f7733767d4-proxy-ca-bundles\") pod \"controller-manager-6c5b59d588-q6lfm\" (UID: \"f86d9d2c-0701-4d7a-926f-64f7733767d4\") " pod="openshift-controller-manager/controller-manager-6c5b59d588-q6lfm"
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.499582 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f86d9d2c-0701-4d7a-926f-64f7733767d4-serving-cert\") pod \"controller-manager-6c5b59d588-q6lfm\" (UID: \"f86d9d2c-0701-4d7a-926f-64f7733767d4\") " pod="openshift-controller-manager/controller-manager-6c5b59d588-q6lfm"
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.502281 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfmlp\" (UniqueName: \"kubernetes.io/projected/f86d9d2c-0701-4d7a-926f-64f7733767d4-kube-api-access-qfmlp\") pod \"controller-manager-6c5b59d588-q6lfm\" (UID: \"f86d9d2c-0701-4d7a-926f-64f7733767d4\") " pod="openshift-controller-manager/controller-manager-6c5b59d588-q6lfm"
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.542151 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-grsjk"
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.574562 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vkvnj"]
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.584772 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcjrc\" (UniqueName: \"kubernetes.io/projected/01203ed2-23cf-4b39-ae6c-6ed22c0d66c0-kube-api-access-wcjrc\") pod \"01203ed2-23cf-4b39-ae6c-6ed22c0d66c0\" (UID: \"01203ed2-23cf-4b39-ae6c-6ed22c0d66c0\") "
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.585429 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.585521 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\"
(UniqueName: \"kubernetes.io/configmap/01203ed2-23cf-4b39-ae6c-6ed22c0d66c0-client-ca\") pod \"01203ed2-23cf-4b39-ae6c-6ed22c0d66c0\" (UID: \"01203ed2-23cf-4b39-ae6c-6ed22c0d66c0\") "
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.585542 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01203ed2-23cf-4b39-ae6c-6ed22c0d66c0-serving-cert\") pod \"01203ed2-23cf-4b39-ae6c-6ed22c0d66c0\" (UID: \"01203ed2-23cf-4b39-ae6c-6ed22c0d66c0\") "
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.585646 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01203ed2-23cf-4b39-ae6c-6ed22c0d66c0-config\") pod \"01203ed2-23cf-4b39-ae6c-6ed22c0d66c0\" (UID: \"01203ed2-23cf-4b39-ae6c-6ed22c0d66c0\") "
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.586178 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01203ed2-23cf-4b39-ae6c-6ed22c0d66c0-client-ca" (OuterVolumeSpecName: "client-ca") pod "01203ed2-23cf-4b39-ae6c-6ed22c0d66c0" (UID: "01203ed2-23cf-4b39-ae6c-6ed22c0d66c0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 06:47:53 crc kubenswrapper[4825]: E0310 06:47:53.586482 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:47:54.086459674 +0000 UTC m=+227.116240289 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.587504 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01203ed2-23cf-4b39-ae6c-6ed22c0d66c0-config" (OuterVolumeSpecName: "config") pod "01203ed2-23cf-4b39-ae6c-6ed22c0d66c0" (UID: "01203ed2-23cf-4b39-ae6c-6ed22c0d66c0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.590242 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01203ed2-23cf-4b39-ae6c-6ed22c0d66c0-kube-api-access-wcjrc" (OuterVolumeSpecName: "kube-api-access-wcjrc") pod "01203ed2-23cf-4b39-ae6c-6ed22c0d66c0" (UID: "01203ed2-23cf-4b39-ae6c-6ed22c0d66c0"). InnerVolumeSpecName "kube-api-access-wcjrc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.592061 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01203ed2-23cf-4b39-ae6c-6ed22c0d66c0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01203ed2-23cf-4b39-ae6c-6ed22c0d66c0" (UID: "01203ed2-23cf-4b39-ae6c-6ed22c0d66c0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.602660 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7h5g2"]
Mar 10 06:47:53 crc kubenswrapper[4825]: E0310 06:47:53.603022 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01203ed2-23cf-4b39-ae6c-6ed22c0d66c0" containerName="route-controller-manager"
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.603043 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="01203ed2-23cf-4b39-ae6c-6ed22c0d66c0" containerName="route-controller-manager"
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.603225 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="01203ed2-23cf-4b39-ae6c-6ed22c0d66c0" containerName="route-controller-manager"
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.604211 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7h5g2"
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.606469 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.628001 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7h5g2"]
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.643929 4825 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-controller-manager/controller-manager-6c5b59d588-q6lfm"
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.687907 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27fbaf63-cf93-470e-b012-ad8bb403f65a-utilities\") pod \"redhat-operators-7h5g2\" (UID: \"27fbaf63-cf93-470e-b012-ad8bb403f65a\") " pod="openshift-marketplace/redhat-operators-7h5g2"
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.687972 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcm4d\" (UniqueName: \"kubernetes.io/projected/27fbaf63-cf93-470e-b012-ad8bb403f65a-kube-api-access-pcm4d\") pod \"redhat-operators-7h5g2\" (UID: \"27fbaf63-cf93-470e-b012-ad8bb403f65a\") " pod="openshift-marketplace/redhat-operators-7h5g2"
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.688007 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27fbaf63-cf93-470e-b012-ad8bb403f65a-catalog-content\") pod \"redhat-operators-7h5g2\" (UID: \"27fbaf63-cf93-470e-b012-ad8bb403f65a\") " pod="openshift-marketplace/redhat-operators-7h5g2"
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.688058 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb"
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.688097 4825 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01203ed2-23cf-4b39-ae6c-6ed22c0d66c0-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.688108 4825 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/01203ed2-23cf-4b39-ae6c-6ed22c0d66c0-client-ca\") on node \"crc\" DevicePath \"\""
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.688119 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01203ed2-23cf-4b39-ae6c-6ed22c0d66c0-config\") on node \"crc\" DevicePath \"\""
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.688159 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcjrc\" (UniqueName: \"kubernetes.io/projected/01203ed2-23cf-4b39-ae6c-6ed22c0d66c0-kube-api-access-wcjrc\") on node \"crc\" DevicePath \"\""
Mar 10 06:47:53 crc kubenswrapper[4825]: E0310 06:47:53.689025 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 06:47:54.189001727 +0000 UTC m=+227.218782332 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlkfb" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.702831 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lfxbn"]
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.773511 4825 patch_prober.go:28] interesting pod/router-default-5444994796-tgcvt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 10 06:47:53 crc kubenswrapper[4825]: [-]has-synced failed: reason withheld
Mar 10 06:47:53 crc kubenswrapper[4825]: [+]process-running ok
Mar 10 06:47:53 crc kubenswrapper[4825]: healthz check failed
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.773564 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tgcvt" podUID="9f0e6d3a-b022-42dd-828e-bd5ec395d06c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.797944 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.798153 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27fbaf63-cf93-470e-b012-ad8bb403f65a-utilities\") pod \"redhat-operators-7h5g2\" (UID: \"27fbaf63-cf93-470e-b012-ad8bb403f65a\") " pod="openshift-marketplace/redhat-operators-7h5g2"
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.798192 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcm4d\" (UniqueName: \"kubernetes.io/projected/27fbaf63-cf93-470e-b012-ad8bb403f65a-kube-api-access-pcm4d\") pod \"redhat-operators-7h5g2\" (UID: \"27fbaf63-cf93-470e-b012-ad8bb403f65a\") " pod="openshift-marketplace/redhat-operators-7h5g2"
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.798223 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27fbaf63-cf93-470e-b012-ad8bb403f65a-catalog-content\") pod \"redhat-operators-7h5g2\" (UID: \"27fbaf63-cf93-470e-b012-ad8bb403f65a\") " pod="openshift-marketplace/redhat-operators-7h5g2"
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.798754 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27fbaf63-cf93-470e-b012-ad8bb403f65a-catalog-content\") pod \"redhat-operators-7h5g2\" (UID: \"27fbaf63-cf93-470e-b012-ad8bb403f65a\") " pod="openshift-marketplace/redhat-operators-7h5g2"
Mar 10 06:47:53 crc kubenswrapper[4825]: E0310 06:47:53.798850 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:47:54.298831966 +0000 UTC m=+227.328612591 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.800717 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27fbaf63-cf93-470e-b012-ad8bb403f65a-utilities\") pod \"redhat-operators-7h5g2\" (UID: \"27fbaf63-cf93-470e-b012-ad8bb403f65a\") " pod="openshift-marketplace/redhat-operators-7h5g2"
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.825900 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcm4d\" (UniqueName: \"kubernetes.io/projected/27fbaf63-cf93-470e-b012-ad8bb403f65a-kube-api-access-pcm4d\") pod \"redhat-operators-7h5g2\" (UID: \"27fbaf63-cf93-470e-b012-ad8bb403f65a\") " pod="openshift-marketplace/redhat-operators-7h5g2"
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.900005 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb"
Mar 10 06:47:53 crc kubenswrapper[4825]: E0310 06:47:53.900495 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 06:47:54.400479439 +0000 UTC m=+227.430260044 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlkfb" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 06:47:53 crc kubenswrapper[4825]: I0310 06:47:53.966734 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7h5g2"
Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.000735 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 06:47:54 crc kubenswrapper[4825]: E0310 06:47:54.001406 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:47:54.501372686 +0000 UTC m=+227.531153301 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.010655 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2gg7k"]
Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.012025 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" event={"ID":"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb","Type":"ContainerStarted","Data":"b9b5e746a365f2eaf7067b9f3fa2f38ca99ada10102b262c8eb2c9bf69f41082"}
Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.012208 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2gg7k"
Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.029640 4825 generic.go:334] "Generic (PLEG): container finished" podID="e945d496-847a-4109-ae2d-41e169b241c2" containerID="bf5279fde39c6ec638486a54176ba3fa9d3bf341ae6fb17ff1c0ca8ff14c8288" exitCode=0
Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.029834 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9zmmv" event={"ID":"e945d496-847a-4109-ae2d-41e169b241c2","Type":"ContainerDied","Data":"bf5279fde39c6ec638486a54176ba3fa9d3bf341ae6fb17ff1c0ca8ff14c8288"}
Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.036276 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.037178 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.037277 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.039834 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.040123 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.044332 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2gg7k"]
Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.050387 4825 generic.go:334] "Generic (PLEG): container finished" podID="33ed941e-9788-4c9e-bcd5-6af206460adb" containerID="42ce30d0dd130f168111305056b5f9a7adfb58a8db5677fb87359586bbd611ae" exitCode=0
Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.050578 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lfhnn" event={"ID":"33ed941e-9788-4c9e-bcd5-6af206460adb","Type":"ContainerDied","Data":"42ce30d0dd130f168111305056b5f9a7adfb58a8db5677fb87359586bbd611ae"}
Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.070884 4825 generic.go:334] "Generic (PLEG): container finished" podID="d41ec286-2462-448b-ab27-1acc0e1dab3c" containerID="240cdac1966db2f595a0f9bb13f2f782bb43bbced94342bb7a5e7435dc697408" exitCode=0
Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.071019 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vkvnj" event={"ID":"d41ec286-2462-448b-ab27-1acc0e1dab3c","Type":"ContainerDied","Data":"240cdac1966db2f595a0f9bb13f2f782bb43bbced94342bb7a5e7435dc697408"}
Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.071054 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vkvnj"
event={"ID":"d41ec286-2462-448b-ab27-1acc0e1dab3c","Type":"ContainerStarted","Data":"e2519a5aca1f5139e9a5ba481a45ae0def4a882840d4d3bd1593fea658ba1e00"}
Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.082263 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager_controller-manager-879f6c89f-6j748_0bab741b-822c-4548-8333-aa3f90ecd8a0/controller-manager/0.log"
Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.082328 4825 generic.go:334] "Generic (PLEG): container finished" podID="0bab741b-822c-4548-8333-aa3f90ecd8a0" containerID="52bb6a0f470c8056867111ac34610082b578937ba95c6d38d1c41ec2077d0f38" exitCode=2
Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.083101 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6j748" event={"ID":"0bab741b-822c-4548-8333-aa3f90ecd8a0","Type":"ContainerDied","Data":"52bb6a0f470c8056867111ac34610082b578937ba95c6d38d1c41ec2077d0f38"}
Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.083163 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6j748" event={"ID":"0bab741b-822c-4548-8333-aa3f90ecd8a0","Type":"ContainerDied","Data":"055a3fff85827539d578674378c64256b637de94c64ea8a5573bb622d0e62782"}
Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.083188 4825 scope.go:117] "RemoveContainer" containerID="52bb6a0f470c8056867111ac34610082b578937ba95c6d38d1c41ec2077d0f38"
Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.083925 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6j748"
Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.100839 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lfxbn" event={"ID":"823b95d6-3aa7-4fda-bd75-9fe7e95a209f","Type":"ContainerStarted","Data":"330caed1df0291cf2f9b79b348597ab836872a68dd3151b16fbdaf8f85df95c0"}
Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.102922 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b42ba5a9-16fa-4616-961e-8d663b3f3ac1-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b42ba5a9-16fa-4616-961e-8d663b3f3ac1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.102959 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58d27fca-dd33-4753-9ea6-4b55c191b2b8-utilities\") pod \"redhat-operators-2gg7k\" (UID: \"58d27fca-dd33-4753-9ea6-4b55c191b2b8\") " pod="openshift-marketplace/redhat-operators-2gg7k"
Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.103012 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b42ba5a9-16fa-4616-961e-8d663b3f3ac1-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b42ba5a9-16fa-4616-961e-8d663b3f3ac1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.103083 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58d27fca-dd33-4753-9ea6-4b55c191b2b8-catalog-content\") pod \"redhat-operators-2gg7k\" (UID: \"58d27fca-dd33-4753-9ea6-4b55c191b2b8\") " pod="openshift-marketplace/redhat-operators-2gg7k"
Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.103182 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkc9w\" (UniqueName: \"kubernetes.io/projected/58d27fca-dd33-4753-9ea6-4b55c191b2b8-kube-api-access-hkc9w\") pod \"redhat-operators-2gg7k\" (UID: \"58d27fca-dd33-4753-9ea6-4b55c191b2b8\") " pod="openshift-marketplace/redhat-operators-2gg7k"
Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.103228 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb"
Mar 10 06:47:54 crc kubenswrapper[4825]: E0310 06:47:54.103614 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 06:47:54.603596069 +0000 UTC m=+227.633376684 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlkfb" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.120296 4825 generic.go:334] "Generic (PLEG): container finished" podID="01203ed2-23cf-4b39-ae6c-6ed22c0d66c0" containerID="e09b2d4f0c6d3f92dfbff134c27f34ad877b2863779a22e4e1807f9d2c6926bf" exitCode=0 Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.120386 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-grsjk" event={"ID":"01203ed2-23cf-4b39-ae6c-6ed22c0d66c0","Type":"ContainerDied","Data":"e09b2d4f0c6d3f92dfbff134c27f34ad877b2863779a22e4e1807f9d2c6926bf"} Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.120453 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-grsjk" event={"ID":"01203ed2-23cf-4b39-ae6c-6ed22c0d66c0","Type":"ContainerDied","Data":"3e191ea6f070b84b1094f8b78c48e2dee767235e5bc813667132ea7d826260ad"} Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.120612 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-grsjk" Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.145814 4825 generic.go:334] "Generic (PLEG): container finished" podID="f7b359b6-5dbd-4270-9195-a355b8ce3dbd" containerID="4623f7ccc4c40b9a1a9edc649b908f5ba0b76b3bf82762a14b9a20d479bc2adb" exitCode=0 Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.145918 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4gwdl" event={"ID":"f7b359b6-5dbd-4270-9195-a355b8ce3dbd","Type":"ContainerDied","Data":"4623f7ccc4c40b9a1a9edc649b908f5ba0b76b3bf82762a14b9a20d479bc2adb"} Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.165049 4825 generic.go:334] "Generic (PLEG): container finished" podID="d232f8a7-f013-4a5c-a7dc-2150c4b3040c" containerID="31345fc7f6c0909f76a40e8c716468655688b031af3150f505b158afdeba1bad" exitCode=0 Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.166428 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hbdjf" event={"ID":"d232f8a7-f013-4a5c-a7dc-2150c4b3040c","Type":"ContainerDied","Data":"31345fc7f6c0909f76a40e8c716468655688b031af3150f505b158afdeba1bad"} Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.212397 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.213461 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c5b59d588-q6lfm"] Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.215017 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b42ba5a9-16fa-4616-961e-8d663b3f3ac1-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b42ba5a9-16fa-4616-961e-8d663b3f3ac1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.215087 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58d27fca-dd33-4753-9ea6-4b55c191b2b8-utilities\") pod \"redhat-operators-2gg7k\" (UID: \"58d27fca-dd33-4753-9ea6-4b55c191b2b8\") " pod="openshift-marketplace/redhat-operators-2gg7k" Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.215180 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b42ba5a9-16fa-4616-961e-8d663b3f3ac1-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b42ba5a9-16fa-4616-961e-8d663b3f3ac1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 06:47:54 crc kubenswrapper[4825]: E0310 06:47:54.226468 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:47:54.726440559 +0000 UTC m=+227.756221174 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.226627 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b42ba5a9-16fa-4616-961e-8d663b3f3ac1-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b42ba5a9-16fa-4616-961e-8d663b3f3ac1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.227078 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58d27fca-dd33-4753-9ea6-4b55c191b2b8-utilities\") pod \"redhat-operators-2gg7k\" (UID: \"58d27fca-dd33-4753-9ea6-4b55c191b2b8\") " pod="openshift-marketplace/redhat-operators-2gg7k" Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.228000 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-grsjk"] Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.231547 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58d27fca-dd33-4753-9ea6-4b55c191b2b8-catalog-content\") pod \"redhat-operators-2gg7k\" (UID: \"58d27fca-dd33-4753-9ea6-4b55c191b2b8\") " pod="openshift-marketplace/redhat-operators-2gg7k" Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.231759 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkc9w\" (UniqueName: 
\"kubernetes.io/projected/58d27fca-dd33-4753-9ea6-4b55c191b2b8-kube-api-access-hkc9w\") pod \"redhat-operators-2gg7k\" (UID: \"58d27fca-dd33-4753-9ea6-4b55c191b2b8\") " pod="openshift-marketplace/redhat-operators-2gg7k" Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.231820 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.237865 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58d27fca-dd33-4753-9ea6-4b55c191b2b8-catalog-content\") pod \"redhat-operators-2gg7k\" (UID: \"58d27fca-dd33-4753-9ea6-4b55c191b2b8\") " pod="openshift-marketplace/redhat-operators-2gg7k" Mar 10 06:47:54 crc kubenswrapper[4825]: E0310 06:47:54.240145 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 06:47:54.740110192 +0000 UTC m=+227.769890807 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlkfb" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.245933 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-grsjk"] Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.247309 4825 scope.go:117] "RemoveContainer" containerID="52bb6a0f470c8056867111ac34610082b578937ba95c6d38d1c41ec2077d0f38" Mar 10 06:47:54 crc kubenswrapper[4825]: E0310 06:47:54.248486 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52bb6a0f470c8056867111ac34610082b578937ba95c6d38d1c41ec2077d0f38\": container with ID starting with 52bb6a0f470c8056867111ac34610082b578937ba95c6d38d1c41ec2077d0f38 not found: ID does not exist" containerID="52bb6a0f470c8056867111ac34610082b578937ba95c6d38d1c41ec2077d0f38" Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.248526 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52bb6a0f470c8056867111ac34610082b578937ba95c6d38d1c41ec2077d0f38"} err="failed to get container status \"52bb6a0f470c8056867111ac34610082b578937ba95c6d38d1c41ec2077d0f38\": rpc error: code = NotFound desc = could not find container \"52bb6a0f470c8056867111ac34610082b578937ba95c6d38d1c41ec2077d0f38\": container with ID starting with 52bb6a0f470c8056867111ac34610082b578937ba95c6d38d1c41ec2077d0f38 not found: ID does not exist" Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.248557 4825 scope.go:117] "RemoveContainer" 
containerID="e09b2d4f0c6d3f92dfbff134c27f34ad877b2863779a22e4e1807f9d2c6926bf" Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.258508 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6j748"] Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.263840 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkc9w\" (UniqueName: \"kubernetes.io/projected/58d27fca-dd33-4753-9ea6-4b55c191b2b8-kube-api-access-hkc9w\") pod \"redhat-operators-2gg7k\" (UID: \"58d27fca-dd33-4753-9ea6-4b55c191b2b8\") " pod="openshift-marketplace/redhat-operators-2gg7k" Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.265270 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b42ba5a9-16fa-4616-961e-8d663b3f3ac1-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b42ba5a9-16fa-4616-961e-8d663b3f3ac1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.267851 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6j748"] Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.333680 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:47:54 crc kubenswrapper[4825]: E0310 06:47:54.333911 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 06:47:54.833867998 +0000 UTC m=+227.863648613 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.334235 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:54 crc kubenswrapper[4825]: E0310 06:47:54.334716 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 06:47:54.834709346 +0000 UTC m=+227.864489961 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlkfb" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.344655 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2gg7k" Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.365745 4825 scope.go:117] "RemoveContainer" containerID="e09b2d4f0c6d3f92dfbff134c27f34ad877b2863779a22e4e1807f9d2c6926bf" Mar 10 06:47:54 crc kubenswrapper[4825]: E0310 06:47:54.366218 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e09b2d4f0c6d3f92dfbff134c27f34ad877b2863779a22e4e1807f9d2c6926bf\": container with ID starting with e09b2d4f0c6d3f92dfbff134c27f34ad877b2863779a22e4e1807f9d2c6926bf not found: ID does not exist" containerID="e09b2d4f0c6d3f92dfbff134c27f34ad877b2863779a22e4e1807f9d2c6926bf" Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.366247 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e09b2d4f0c6d3f92dfbff134c27f34ad877b2863779a22e4e1807f9d2c6926bf"} err="failed to get container status \"e09b2d4f0c6d3f92dfbff134c27f34ad877b2863779a22e4e1807f9d2c6926bf\": rpc error: code = NotFound desc = could not find container \"e09b2d4f0c6d3f92dfbff134c27f34ad877b2863779a22e4e1807f9d2c6926bf\": container with ID starting with e09b2d4f0c6d3f92dfbff134c27f34ad877b2863779a22e4e1807f9d2c6926bf not found: ID does not exist" Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.378900 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.442061 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:47:54 crc kubenswrapper[4825]: E0310 06:47:54.442235 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:47:54.942209217 +0000 UTC m=+227.971989832 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.442421 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:54 crc kubenswrapper[4825]: E0310 06:47:54.442775 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-03-10 06:47:54.942762316 +0000 UTC m=+227.972542931 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlkfb" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.546117 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:47:54 crc kubenswrapper[4825]: E0310 06:47:54.546296 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:47:55.046259591 +0000 UTC m=+228.076040206 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.546983 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:54 crc kubenswrapper[4825]: E0310 06:47:54.547385 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 06:47:55.047371199 +0000 UTC m=+228.077151814 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlkfb" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.595861 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7h5g2"] Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.641232 4825 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.648334 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:47:54 crc kubenswrapper[4825]: E0310 06:47:54.648589 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 06:47:55.148548086 +0000 UTC m=+228.178328701 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.648783 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:54 crc kubenswrapper[4825]: E0310 06:47:54.649202 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 06:47:55.149183807 +0000 UTC m=+228.178964412 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vlkfb" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.667779 4825 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-10T06:47:54.641429705Z","Handler":null,"Name":""} Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.681893 4825 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.682267 4825 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.687809 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2gg7k"] Mar 10 06:47:54 crc kubenswrapper[4825]: W0310 06:47:54.708867 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58d27fca_dd33_4753_9ea6_4b55c191b2b8.slice/crio-0c2e196c25e5380e2ecd9888b8afb6756aca7df98c3ac9f3fda07f21268c2417 WatchSource:0}: Error finding container 0c2e196c25e5380e2ecd9888b8afb6756aca7df98c3ac9f3fda07f21268c2417: Status 404 returned error can't find the container with id 0c2e196c25e5380e2ecd9888b8afb6756aca7df98c3ac9f3fda07f21268c2417 Mar 10 06:47:54 crc kubenswrapper[4825]: 
I0310 06:47:54.750761 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.757362 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.768177 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.773068 4825 patch_prober.go:28] interesting pod/router-default-5444994796-tgcvt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 06:47:54 crc kubenswrapper[4825]: [-]has-synced failed: reason withheld Mar 10 06:47:54 crc kubenswrapper[4825]: [+]process-running ok Mar 10 06:47:54 crc kubenswrapper[4825]: healthz check failed Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.773314 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tgcvt" podUID="9f0e6d3a-b022-42dd-828e-bd5ec395d06c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 06:47:54 crc kubenswrapper[4825]: W0310 06:47:54.823223 4825 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-podb42ba5a9_16fa_4616_961e_8d663b3f3ac1.slice/crio-30dc858ef76c65cca13cbf81ae30b510714f5543288e3e6963c6b290ae451fa3 WatchSource:0}: Error finding container 30dc858ef76c65cca13cbf81ae30b510714f5543288e3e6963c6b290ae451fa3: Status 404 returned error can't find the container with id 30dc858ef76c65cca13cbf81ae30b510714f5543288e3e6963c6b290ae451fa3 Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.853445 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.857514 4825 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.857563 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.888985 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vlkfb\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:54 crc kubenswrapper[4825]: I0310 06:47:54.912727 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.334326 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01203ed2-23cf-4b39-ae6c-6ed22c0d66c0" path="/var/lib/kubelet/pods/01203ed2-23cf-4b39-ae6c-6ed22c0d66c0/volumes" Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.339348 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bab741b-822c-4548-8333-aa3f90ecd8a0" path="/var/lib/kubelet/pods/0bab741b-822c-4548-8333-aa3f90ecd8a0/volumes" Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.340060 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.340876 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-s68rr" event={"ID":"3e18f017-e70d-45b7-a7fc-a9398f698980","Type":"ContainerStarted","Data":"0b80e444ba7fb4d976f9c610d30c6981ffd55d8eed1a0e2fbccef8063588afe4"} Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.340960 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-s68rr" event={"ID":"3e18f017-e70d-45b7-a7fc-a9398f698980","Type":"ContainerStarted","Data":"72010b7f1dd6bf1ea149c3629acb06a596cf7d2cdb440fea73f0e95d0d6cd99a"} Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.340978 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b42ba5a9-16fa-4616-961e-8d663b3f3ac1","Type":"ContainerStarted","Data":"30dc858ef76c65cca13cbf81ae30b510714f5543288e3e6963c6b290ae451fa3"} Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.341758 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2gg7k" 
event={"ID":"58d27fca-dd33-4753-9ea6-4b55c191b2b8","Type":"ContainerStarted","Data":"ffea082d781ea4dd1701576ab4dbc553eb6b6655cc4aab424e4722dd61f13604"} Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.341802 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2gg7k" event={"ID":"58d27fca-dd33-4753-9ea6-4b55c191b2b8","Type":"ContainerStarted","Data":"0c2e196c25e5380e2ecd9888b8afb6756aca7df98c3ac9f3fda07f21268c2417"} Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.371092 4825 generic.go:334] "Generic (PLEG): container finished" podID="27fbaf63-cf93-470e-b012-ad8bb403f65a" containerID="7555ba4068a98143157d525135c2b58738e4dcc9550e5fdea5b751522c27c3d1" exitCode=0 Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.371224 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7h5g2" event={"ID":"27fbaf63-cf93-470e-b012-ad8bb403f65a","Type":"ContainerDied","Data":"7555ba4068a98143157d525135c2b58738e4dcc9550e5fdea5b751522c27c3d1"} Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.371260 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7h5g2" event={"ID":"27fbaf63-cf93-470e-b012-ad8bb403f65a","Type":"ContainerStarted","Data":"ccd786935bdb282ed517896e006002ce36d82fbb5f90ecdfedc5b6fca09cad22"} Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.372948 4825 ???:1] "http: TLS handshake error from 192.168.126.11:44996: no serving certificate available for the kubelet" Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.420264 4825 generic.go:334] "Generic (PLEG): container finished" podID="823b95d6-3aa7-4fda-bd75-9fe7e95a209f" containerID="b3da0856c57dd8f0381398216d95ec848f6fa7015fcd8844ccd31233e59c1403" exitCode=0 Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.420385 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lfxbn" 
event={"ID":"823b95d6-3aa7-4fda-bd75-9fe7e95a209f","Type":"ContainerDied","Data":"b3da0856c57dd8f0381398216d95ec848f6fa7015fcd8844ccd31233e59c1403"} Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.423859 4825 generic.go:334] "Generic (PLEG): container finished" podID="f4889476-c0d0-4e4b-986a-f4dcdacce72b" containerID="48ead55ecdcc5b367d13e319264acfded320c333d5f630757bdcd4e5589ea7fb" exitCode=0 Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.423915 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552085-npl6x" event={"ID":"f4889476-c0d0-4e4b-986a-f4dcdacce72b","Type":"ContainerDied","Data":"48ead55ecdcc5b367d13e319264acfded320c333d5f630757bdcd4e5589ea7fb"} Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.456795 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-854d9cb458-vnjfz"] Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.458145 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-854d9cb458-vnjfz" Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.464748 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.464992 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.465840 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.466255 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.466400 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.466574 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.482832 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-854d9cb458-vnjfz"] Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.492384 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" event={"ID":"f0e5fd8e-7ba6-4466-9a04-6c2ec61e41cb","Type":"ContainerStarted","Data":"d6bd8d40774fc9725531260281ef692bc7d128ab10dc6376d9dc4c3e6de763f9"} Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.502980 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c5b59d588-q6lfm" 
event={"ID":"f86d9d2c-0701-4d7a-926f-64f7733767d4","Type":"ContainerStarted","Data":"4b09ddfcbafce7de75ada51b793f536dcb329d0eb9b225fb8295030375245d0f"} Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.503057 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c5b59d588-q6lfm" event={"ID":"f86d9d2c-0701-4d7a-926f-64f7733767d4","Type":"ContainerStarted","Data":"a6802189a97937a7aea9c7cfbb226770a5a7af3dca8b5e09447a545ab92447a7"} Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.504148 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6c5b59d588-q6lfm" Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.545899 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vlkfb"] Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.549657 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6c5b59d588-q6lfm" Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.568911 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f05d28cf-03a0-4bea-8c0e-fe252efc5c49-config\") pod \"route-controller-manager-854d9cb458-vnjfz\" (UID: \"f05d28cf-03a0-4bea-8c0e-fe252efc5c49\") " pod="openshift-route-controller-manager/route-controller-manager-854d9cb458-vnjfz" Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.568980 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f05d28cf-03a0-4bea-8c0e-fe252efc5c49-serving-cert\") pod \"route-controller-manager-854d9cb458-vnjfz\" (UID: \"f05d28cf-03a0-4bea-8c0e-fe252efc5c49\") " pod="openshift-route-controller-manager/route-controller-manager-854d9cb458-vnjfz" Mar 10 06:47:55 
crc kubenswrapper[4825]: I0310 06:47:55.569053 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f05d28cf-03a0-4bea-8c0e-fe252efc5c49-client-ca\") pod \"route-controller-manager-854d9cb458-vnjfz\" (UID: \"f05d28cf-03a0-4bea-8c0e-fe252efc5c49\") " pod="openshift-route-controller-manager/route-controller-manager-854d9cb458-vnjfz" Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.569443 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7qb7\" (UniqueName: \"kubernetes.io/projected/f05d28cf-03a0-4bea-8c0e-fe252efc5c49-kube-api-access-q7qb7\") pod \"route-controller-manager-854d9cb458-vnjfz\" (UID: \"f05d28cf-03a0-4bea-8c0e-fe252efc5c49\") " pod="openshift-route-controller-manager/route-controller-manager-854d9cb458-vnjfz" Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.661030 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6c5b59d588-q6lfm" podStartSLOduration=5.661012208 podStartE2EDuration="5.661012208s" podCreationTimestamp="2026-03-10 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:47:55.660007454 +0000 UTC m=+228.689788079" watchObservedRunningTime="2026-03-10 06:47:55.661012208 +0000 UTC m=+228.690792823" Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.661579 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" podStartSLOduration=175.661575277 podStartE2EDuration="2m55.661575277s" podCreationTimestamp="2026-03-10 06:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:47:55.626033843 +0000 UTC m=+228.655814458" 
watchObservedRunningTime="2026-03-10 06:47:55.661575277 +0000 UTC m=+228.691355892" Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.672067 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f05d28cf-03a0-4bea-8c0e-fe252efc5c49-serving-cert\") pod \"route-controller-manager-854d9cb458-vnjfz\" (UID: \"f05d28cf-03a0-4bea-8c0e-fe252efc5c49\") " pod="openshift-route-controller-manager/route-controller-manager-854d9cb458-vnjfz" Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.672173 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f05d28cf-03a0-4bea-8c0e-fe252efc5c49-config\") pod \"route-controller-manager-854d9cb458-vnjfz\" (UID: \"f05d28cf-03a0-4bea-8c0e-fe252efc5c49\") " pod="openshift-route-controller-manager/route-controller-manager-854d9cb458-vnjfz" Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.672197 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f05d28cf-03a0-4bea-8c0e-fe252efc5c49-client-ca\") pod \"route-controller-manager-854d9cb458-vnjfz\" (UID: \"f05d28cf-03a0-4bea-8c0e-fe252efc5c49\") " pod="openshift-route-controller-manager/route-controller-manager-854d9cb458-vnjfz" Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.672256 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7qb7\" (UniqueName: \"kubernetes.io/projected/f05d28cf-03a0-4bea-8c0e-fe252efc5c49-kube-api-access-q7qb7\") pod \"route-controller-manager-854d9cb458-vnjfz\" (UID: \"f05d28cf-03a0-4bea-8c0e-fe252efc5c49\") " pod="openshift-route-controller-manager/route-controller-manager-854d9cb458-vnjfz" Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.674377 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f05d28cf-03a0-4bea-8c0e-fe252efc5c49-config\") pod \"route-controller-manager-854d9cb458-vnjfz\" (UID: \"f05d28cf-03a0-4bea-8c0e-fe252efc5c49\") " pod="openshift-route-controller-manager/route-controller-manager-854d9cb458-vnjfz" Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.674925 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f05d28cf-03a0-4bea-8c0e-fe252efc5c49-client-ca\") pod \"route-controller-manager-854d9cb458-vnjfz\" (UID: \"f05d28cf-03a0-4bea-8c0e-fe252efc5c49\") " pod="openshift-route-controller-manager/route-controller-manager-854d9cb458-vnjfz" Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.709540 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f05d28cf-03a0-4bea-8c0e-fe252efc5c49-serving-cert\") pod \"route-controller-manager-854d9cb458-vnjfz\" (UID: \"f05d28cf-03a0-4bea-8c0e-fe252efc5c49\") " pod="openshift-route-controller-manager/route-controller-manager-854d9cb458-vnjfz" Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.720517 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7qb7\" (UniqueName: \"kubernetes.io/projected/f05d28cf-03a0-4bea-8c0e-fe252efc5c49-kube-api-access-q7qb7\") pod \"route-controller-manager-854d9cb458-vnjfz\" (UID: \"f05d28cf-03a0-4bea-8c0e-fe252efc5c49\") " pod="openshift-route-controller-manager/route-controller-manager-854d9cb458-vnjfz" Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.771247 4825 patch_prober.go:28] interesting pod/router-default-5444994796-tgcvt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 06:47:55 crc kubenswrapper[4825]: [-]has-synced failed: reason withheld Mar 10 06:47:55 crc kubenswrapper[4825]: 
[+]process-running ok Mar 10 06:47:55 crc kubenswrapper[4825]: healthz check failed Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.771304 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tgcvt" podUID="9f0e6d3a-b022-42dd-828e-bd5ec395d06c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 06:47:55 crc kubenswrapper[4825]: I0310 06:47:55.799502 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-854d9cb458-vnjfz" Mar 10 06:47:56 crc kubenswrapper[4825]: I0310 06:47:56.426476 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-854d9cb458-vnjfz"] Mar 10 06:47:56 crc kubenswrapper[4825]: I0310 06:47:56.434704 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 10 06:47:56 crc kubenswrapper[4825]: I0310 06:47:56.444810 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 06:47:56 crc kubenswrapper[4825]: I0310 06:47:56.490962 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 10 06:47:56 crc kubenswrapper[4825]: I0310 06:47:56.497986 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 10 06:47:56 crc kubenswrapper[4825]: I0310 06:47:56.500106 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5038ec51-bad1-48bf-a91f-ea63f329e294-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5038ec51-bad1-48bf-a91f-ea63f329e294\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 06:47:56 crc kubenswrapper[4825]: I0310 06:47:56.500261 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5038ec51-bad1-48bf-a91f-ea63f329e294-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5038ec51-bad1-48bf-a91f-ea63f329e294\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 06:47:56 crc kubenswrapper[4825]: I0310 06:47:56.516489 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 10 06:47:56 crc kubenswrapper[4825]: I0310 06:47:56.568776 4825 generic.go:334] "Generic (PLEG): container finished" podID="b42ba5a9-16fa-4616-961e-8d663b3f3ac1" containerID="8182176aff5a0b0a897212e0741beb5b7d53a5dd08cc63981e0a6b54d8289aaa" exitCode=0 Mar 10 06:47:56 crc kubenswrapper[4825]: I0310 06:47:56.568934 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"b42ba5a9-16fa-4616-961e-8d663b3f3ac1","Type":"ContainerDied","Data":"8182176aff5a0b0a897212e0741beb5b7d53a5dd08cc63981e0a6b54d8289aaa"} Mar 10 06:47:56 crc kubenswrapper[4825]: I0310 06:47:56.581515 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-854d9cb458-vnjfz" event={"ID":"f05d28cf-03a0-4bea-8c0e-fe252efc5c49","Type":"ContainerStarted","Data":"55ceb7bc912c8371eab44ff6f8a3d6aa4faa4aa19f137d280a5ad6685091c566"} Mar 10 06:47:56 crc kubenswrapper[4825]: I0310 06:47:56.605177 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5038ec51-bad1-48bf-a91f-ea63f329e294-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5038ec51-bad1-48bf-a91f-ea63f329e294\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 06:47:56 crc kubenswrapper[4825]: I0310 06:47:56.605258 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5038ec51-bad1-48bf-a91f-ea63f329e294-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5038ec51-bad1-48bf-a91f-ea63f329e294\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 06:47:56 crc kubenswrapper[4825]: I0310 06:47:56.605553 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5038ec51-bad1-48bf-a91f-ea63f329e294-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5038ec51-bad1-48bf-a91f-ea63f329e294\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 06:47:56 crc kubenswrapper[4825]: I0310 06:47:56.610031 4825 generic.go:334] "Generic (PLEG): container finished" podID="58d27fca-dd33-4753-9ea6-4b55c191b2b8" containerID="ffea082d781ea4dd1701576ab4dbc553eb6b6655cc4aab424e4722dd61f13604" exitCode=0 Mar 10 06:47:56 crc kubenswrapper[4825]: I0310 06:47:56.610154 
4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2gg7k" event={"ID":"58d27fca-dd33-4753-9ea6-4b55c191b2b8","Type":"ContainerDied","Data":"ffea082d781ea4dd1701576ab4dbc553eb6b6655cc4aab424e4722dd61f13604"} Mar 10 06:47:56 crc kubenswrapper[4825]: I0310 06:47:56.620765 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grpt7" Mar 10 06:47:56 crc kubenswrapper[4825]: I0310 06:47:56.620803 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grpt7" Mar 10 06:47:56 crc kubenswrapper[4825]: I0310 06:47:56.631817 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-s68rr" event={"ID":"3e18f017-e70d-45b7-a7fc-a9398f698980","Type":"ContainerStarted","Data":"c0deebc35329fa117fdee04f607aff6899b74229ed958b10073617cfad427d55"} Mar 10 06:47:56 crc kubenswrapper[4825]: I0310 06:47:56.643407 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grpt7" Mar 10 06:47:56 crc kubenswrapper[4825]: I0310 06:47:56.647282 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5038ec51-bad1-48bf-a91f-ea63f329e294-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5038ec51-bad1-48bf-a91f-ea63f329e294\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 06:47:56 crc kubenswrapper[4825]: I0310 06:47:56.650784 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" event={"ID":"c9944ab9-f72f-44bd-8850-2898feff4a28","Type":"ContainerStarted","Data":"b1cb629928a9d8d3d8b2f4e35f90517ad19d163dc03e966429ace1ec7cea2401"} Mar 10 06:47:56 crc kubenswrapper[4825]: I0310 06:47:56.653442 4825 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" event={"ID":"c9944ab9-f72f-44bd-8850-2898feff4a28","Type":"ContainerStarted","Data":"e9b032bda118bca66c38fe9b1f4f89f6237eeb8597e6f62d4dffec6a653474b9"} Mar 10 06:47:56 crc kubenswrapper[4825]: I0310 06:47:56.654456 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:47:56 crc kubenswrapper[4825]: I0310 06:47:56.695663 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-s68rr" podStartSLOduration=12.691015874 podStartE2EDuration="12.691015874s" podCreationTimestamp="2026-03-10 06:47:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:47:56.679167732 +0000 UTC m=+229.708948357" watchObservedRunningTime="2026-03-10 06:47:56.691015874 +0000 UTC m=+229.720796489" Mar 10 06:47:56 crc kubenswrapper[4825]: I0310 06:47:56.758724 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" podStartSLOduration=176.758697566 podStartE2EDuration="2m56.758697566s" podCreationTimestamp="2026-03-10 06:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:47:56.752864618 +0000 UTC m=+229.782645233" watchObservedRunningTime="2026-03-10 06:47:56.758697566 +0000 UTC m=+229.788478181" Mar 10 06:47:56 crc kubenswrapper[4825]: I0310 06:47:56.768012 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-tgcvt" Mar 10 06:47:56 crc kubenswrapper[4825]: I0310 06:47:56.781173 4825 patch_prober.go:28] interesting pod/router-default-5444994796-tgcvt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP 
probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 06:47:56 crc kubenswrapper[4825]: [-]has-synced failed: reason withheld Mar 10 06:47:56 crc kubenswrapper[4825]: [+]process-running ok Mar 10 06:47:56 crc kubenswrapper[4825]: healthz check failed Mar 10 06:47:56 crc kubenswrapper[4825]: I0310 06:47:56.781263 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tgcvt" podUID="9f0e6d3a-b022-42dd-828e-bd5ec395d06c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 06:47:56 crc kubenswrapper[4825]: I0310 06:47:56.805120 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-6bjtt" Mar 10 06:47:56 crc kubenswrapper[4825]: I0310 06:47:56.805723 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-6bjtt" Mar 10 06:47:56 crc kubenswrapper[4825]: I0310 06:47:56.806897 4825 patch_prober.go:28] interesting pod/console-f9d7485db-6bjtt container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.16:8443/health\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Mar 10 06:47:56 crc kubenswrapper[4825]: I0310 06:47:56.806942 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-6bjtt" podUID="f3a60327-2809-415b-abde-d1569a2453b6" containerName="console" probeResult="failure" output="Get \"https://10.217.0.16:8443/health\": dial tcp 10.217.0.16:8443: connect: connection refused" Mar 10 06:47:56 crc kubenswrapper[4825]: I0310 06:47:56.842385 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 06:47:57 crc kubenswrapper[4825]: I0310 06:47:57.425202 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552085-npl6x" Mar 10 06:47:57 crc kubenswrapper[4825]: I0310 06:47:57.454884 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 10 06:47:57 crc kubenswrapper[4825]: I0310 06:47:57.533072 4825 patch_prober.go:28] interesting pod/downloads-7954f5f757-9cr4m container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.36:8080/\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Mar 10 06:47:57 crc kubenswrapper[4825]: I0310 06:47:57.534164 4825 patch_prober.go:28] interesting pod/downloads-7954f5f757-9cr4m container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Mar 10 06:47:57 crc kubenswrapper[4825]: I0310 06:47:57.534257 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-9cr4m" podUID="d2c0d6e4-6024-48a0-8bca-5069baf7a8ab" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.36:8080/\": dial tcp 10.217.0.36:8080: connect: connection refused" Mar 10 06:47:57 crc kubenswrapper[4825]: I0310 06:47:57.538033 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-9cr4m" podUID="d2c0d6e4-6024-48a0-8bca-5069baf7a8ab" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.36:8080/\": dial tcp 10.217.0.36:8080: connect: connection refused" Mar 10 06:47:57 crc kubenswrapper[4825]: I0310 06:47:57.544787 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f4889476-c0d0-4e4b-986a-f4dcdacce72b-config-volume\") pod \"f4889476-c0d0-4e4b-986a-f4dcdacce72b\" (UID: 
\"f4889476-c0d0-4e4b-986a-f4dcdacce72b\") " Mar 10 06:47:57 crc kubenswrapper[4825]: I0310 06:47:57.544851 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gszlc\" (UniqueName: \"kubernetes.io/projected/f4889476-c0d0-4e4b-986a-f4dcdacce72b-kube-api-access-gszlc\") pod \"f4889476-c0d0-4e4b-986a-f4dcdacce72b\" (UID: \"f4889476-c0d0-4e4b-986a-f4dcdacce72b\") " Mar 10 06:47:57 crc kubenswrapper[4825]: I0310 06:47:57.545075 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f4889476-c0d0-4e4b-986a-f4dcdacce72b-secret-volume\") pod \"f4889476-c0d0-4e4b-986a-f4dcdacce72b\" (UID: \"f4889476-c0d0-4e4b-986a-f4dcdacce72b\") " Mar 10 06:47:57 crc kubenswrapper[4825]: I0310 06:47:57.578261 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4889476-c0d0-4e4b-986a-f4dcdacce72b-config-volume" (OuterVolumeSpecName: "config-volume") pod "f4889476-c0d0-4e4b-986a-f4dcdacce72b" (UID: "f4889476-c0d0-4e4b-986a-f4dcdacce72b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:47:57 crc kubenswrapper[4825]: I0310 06:47:57.602672 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4889476-c0d0-4e4b-986a-f4dcdacce72b-kube-api-access-gszlc" (OuterVolumeSpecName: "kube-api-access-gszlc") pod "f4889476-c0d0-4e4b-986a-f4dcdacce72b" (UID: "f4889476-c0d0-4e4b-986a-f4dcdacce72b"). InnerVolumeSpecName "kube-api-access-gszlc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:47:57 crc kubenswrapper[4825]: I0310 06:47:57.607212 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4889476-c0d0-4e4b-986a-f4dcdacce72b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f4889476-c0d0-4e4b-986a-f4dcdacce72b" (UID: "f4889476-c0d0-4e4b-986a-f4dcdacce72b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:47:57 crc kubenswrapper[4825]: I0310 06:47:57.650078 4825 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f4889476-c0d0-4e4b-986a-f4dcdacce72b-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 06:47:57 crc kubenswrapper[4825]: I0310 06:47:57.650123 4825 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f4889476-c0d0-4e4b-986a-f4dcdacce72b-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 06:47:57 crc kubenswrapper[4825]: I0310 06:47:57.650144 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gszlc\" (UniqueName: \"kubernetes.io/projected/f4889476-c0d0-4e4b-986a-f4dcdacce72b-kube-api-access-gszlc\") on node \"crc\" DevicePath \"\"" Mar 10 06:47:57 crc kubenswrapper[4825]: I0310 06:47:57.780760 4825 patch_prober.go:28] interesting pod/router-default-5444994796-tgcvt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 06:47:57 crc kubenswrapper[4825]: [-]has-synced failed: reason withheld Mar 10 06:47:57 crc kubenswrapper[4825]: [+]process-running ok Mar 10 06:47:57 crc kubenswrapper[4825]: healthz check failed Mar 10 06:47:57 crc kubenswrapper[4825]: I0310 06:47:57.780858 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tgcvt" 
podUID="9f0e6d3a-b022-42dd-828e-bd5ec395d06c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 06:47:58 crc kubenswrapper[4825]: I0310 06:47:58.046923 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5038ec51-bad1-48bf-a91f-ea63f329e294","Type":"ContainerStarted","Data":"8a856c30a90b2aedd4ccda408842c9046c6d0188e34249ed7063bb5525890662"} Mar 10 06:47:58 crc kubenswrapper[4825]: I0310 06:47:58.048739 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:58 crc kubenswrapper[4825]: I0310 06:47:58.050375 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:58 crc kubenswrapper[4825]: I0310 06:47:58.051178 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-854d9cb458-vnjfz" event={"ID":"f05d28cf-03a0-4bea-8c0e-fe252efc5c49","Type":"ContainerStarted","Data":"ffe75fbb6c6c36c4c9c3922b2217be7106e13b72145242a4e7a4cd41544d5f91"} Mar 10 06:47:58 crc kubenswrapper[4825]: I0310 06:47:58.051318 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-854d9cb458-vnjfz" Mar 10 06:47:58 crc kubenswrapper[4825]: I0310 06:47:58.058013 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-854d9cb458-vnjfz" Mar 10 06:47:58 crc kubenswrapper[4825]: I0310 06:47:58.058467 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552085-npl6x" Mar 10 06:47:58 crc kubenswrapper[4825]: I0310 06:47:58.058516 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552085-npl6x" event={"ID":"f4889476-c0d0-4e4b-986a-f4dcdacce72b","Type":"ContainerDied","Data":"f044f6937aa8febe11168d12bdff487a5f332bb1cab3d4c838c29042c23fdb09"} Mar 10 06:47:58 crc kubenswrapper[4825]: I0310 06:47:58.058535 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f044f6937aa8febe11168d12bdff487a5f332bb1cab3d4c838c29042c23fdb09" Mar 10 06:47:58 crc kubenswrapper[4825]: I0310 06:47:58.074804 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:58 crc kubenswrapper[4825]: I0310 06:47:58.075588 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-854d9cb458-vnjfz" podStartSLOduration=8.075567627 podStartE2EDuration="8.075567627s" podCreationTimestamp="2026-03-10 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:47:58.072291426 +0000 UTC m=+231.102072051" watchObservedRunningTime="2026-03-10 06:47:58.075567627 +0000 UTC m=+231.105348242" Mar 10 06:47:58 crc kubenswrapper[4825]: I0310 06:47:58.083317 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-grpt7" Mar 10 06:47:58 crc kubenswrapper[4825]: I0310 06:47:58.441168 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 06:47:58 crc kubenswrapper[4825]: I0310 06:47:58.564011 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b42ba5a9-16fa-4616-961e-8d663b3f3ac1-kubelet-dir\") pod \"b42ba5a9-16fa-4616-961e-8d663b3f3ac1\" (UID: \"b42ba5a9-16fa-4616-961e-8d663b3f3ac1\") " Mar 10 06:47:58 crc kubenswrapper[4825]: I0310 06:47:58.564110 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b42ba5a9-16fa-4616-961e-8d663b3f3ac1-kube-api-access\") pod \"b42ba5a9-16fa-4616-961e-8d663b3f3ac1\" (UID: \"b42ba5a9-16fa-4616-961e-8d663b3f3ac1\") " Mar 10 06:47:58 crc kubenswrapper[4825]: I0310 06:47:58.564172 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b42ba5a9-16fa-4616-961e-8d663b3f3ac1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b42ba5a9-16fa-4616-961e-8d663b3f3ac1" (UID: "b42ba5a9-16fa-4616-961e-8d663b3f3ac1"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 06:47:58 crc kubenswrapper[4825]: I0310 06:47:58.564512 4825 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b42ba5a9-16fa-4616-961e-8d663b3f3ac1-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 10 06:47:58 crc kubenswrapper[4825]: I0310 06:47:58.584475 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b42ba5a9-16fa-4616-961e-8d663b3f3ac1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b42ba5a9-16fa-4616-961e-8d663b3f3ac1" (UID: "b42ba5a9-16fa-4616-961e-8d663b3f3ac1"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:47:58 crc kubenswrapper[4825]: I0310 06:47:58.666098 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b42ba5a9-16fa-4616-961e-8d663b3f3ac1-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 06:47:58 crc kubenswrapper[4825]: I0310 06:47:58.770944 4825 patch_prober.go:28] interesting pod/router-default-5444994796-tgcvt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 06:47:58 crc kubenswrapper[4825]: [-]has-synced failed: reason withheld Mar 10 06:47:58 crc kubenswrapper[4825]: [+]process-running ok Mar 10 06:47:58 crc kubenswrapper[4825]: healthz check failed Mar 10 06:47:58 crc kubenswrapper[4825]: I0310 06:47:58.771028 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tgcvt" podUID="9f0e6d3a-b022-42dd-828e-bd5ec395d06c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 06:47:59 crc kubenswrapper[4825]: I0310 06:47:59.084746 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5038ec51-bad1-48bf-a91f-ea63f329e294","Type":"ContainerStarted","Data":"169b820f3c06d18bea50f0dc12055cec3e8b40a5880f357ad980313730c9ebf3"} Mar 10 06:47:59 crc kubenswrapper[4825]: I0310 06:47:59.089885 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 06:47:59 crc kubenswrapper[4825]: I0310 06:47:59.103529 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b42ba5a9-16fa-4616-961e-8d663b3f3ac1","Type":"ContainerDied","Data":"30dc858ef76c65cca13cbf81ae30b510714f5543288e3e6963c6b290ae451fa3"} Mar 10 06:47:59 crc kubenswrapper[4825]: I0310 06:47:59.103598 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30dc858ef76c65cca13cbf81ae30b510714f5543288e3e6963c6b290ae451fa3" Mar 10 06:47:59 crc kubenswrapper[4825]: I0310 06:47:59.108838 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.108817723 podStartE2EDuration="3.108817723s" podCreationTimestamp="2026-03-10 06:47:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:47:59.107447617 +0000 UTC m=+232.137228232" watchObservedRunningTime="2026-03-10 06:47:59.108817723 +0000 UTC m=+232.138598338" Mar 10 06:47:59 crc kubenswrapper[4825]: I0310 06:47:59.113098 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-ksvjd" Mar 10 06:47:59 crc kubenswrapper[4825]: I0310 06:47:59.770404 4825 patch_prober.go:28] interesting pod/router-default-5444994796-tgcvt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 06:47:59 crc kubenswrapper[4825]: [-]has-synced failed: reason withheld Mar 10 06:47:59 crc kubenswrapper[4825]: [+]process-running ok Mar 10 06:47:59 crc kubenswrapper[4825]: healthz check failed Mar 10 06:47:59 crc kubenswrapper[4825]: I0310 06:47:59.770471 4825 prober.go:107] "Probe 
failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tgcvt" podUID="9f0e6d3a-b022-42dd-828e-bd5ec395d06c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 06:48:00 crc kubenswrapper[4825]: I0310 06:48:00.119206 4825 generic.go:334] "Generic (PLEG): container finished" podID="5038ec51-bad1-48bf-a91f-ea63f329e294" containerID="169b820f3c06d18bea50f0dc12055cec3e8b40a5880f357ad980313730c9ebf3" exitCode=0 Mar 10 06:48:00 crc kubenswrapper[4825]: I0310 06:48:00.120336 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5038ec51-bad1-48bf-a91f-ea63f329e294","Type":"ContainerDied","Data":"169b820f3c06d18bea50f0dc12055cec3e8b40a5880f357ad980313730c9ebf3"} Mar 10 06:48:00 crc kubenswrapper[4825]: I0310 06:48:00.132510 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552088-7fs5s"] Mar 10 06:48:00 crc kubenswrapper[4825]: E0310 06:48:00.132746 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b42ba5a9-16fa-4616-961e-8d663b3f3ac1" containerName="pruner" Mar 10 06:48:00 crc kubenswrapper[4825]: I0310 06:48:00.132758 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b42ba5a9-16fa-4616-961e-8d663b3f3ac1" containerName="pruner" Mar 10 06:48:00 crc kubenswrapper[4825]: E0310 06:48:00.132781 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4889476-c0d0-4e4b-986a-f4dcdacce72b" containerName="collect-profiles" Mar 10 06:48:00 crc kubenswrapper[4825]: I0310 06:48:00.132787 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4889476-c0d0-4e4b-986a-f4dcdacce72b" containerName="collect-profiles" Mar 10 06:48:00 crc kubenswrapper[4825]: I0310 06:48:00.132888 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4889476-c0d0-4e4b-986a-f4dcdacce72b" containerName="collect-profiles" Mar 10 06:48:00 crc 
kubenswrapper[4825]: I0310 06:48:00.132902 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="b42ba5a9-16fa-4616-961e-8d663b3f3ac1" containerName="pruner" Mar 10 06:48:00 crc kubenswrapper[4825]: I0310 06:48:00.137245 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552088-7fs5s" Mar 10 06:48:00 crc kubenswrapper[4825]: I0310 06:48:00.139007 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 06:48:00 crc kubenswrapper[4825]: I0310 06:48:00.142240 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552088-7fs5s"] Mar 10 06:48:00 crc kubenswrapper[4825]: I0310 06:48:00.190340 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs2mb\" (UniqueName: \"kubernetes.io/projected/ccb47913-9550-4871-9cf8-47fa8734b406-kube-api-access-bs2mb\") pod \"auto-csr-approver-29552088-7fs5s\" (UID: \"ccb47913-9550-4871-9cf8-47fa8734b406\") " pod="openshift-infra/auto-csr-approver-29552088-7fs5s" Mar 10 06:48:00 crc kubenswrapper[4825]: I0310 06:48:00.292386 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bs2mb\" (UniqueName: \"kubernetes.io/projected/ccb47913-9550-4871-9cf8-47fa8734b406-kube-api-access-bs2mb\") pod \"auto-csr-approver-29552088-7fs5s\" (UID: \"ccb47913-9550-4871-9cf8-47fa8734b406\") " pod="openshift-infra/auto-csr-approver-29552088-7fs5s" Mar 10 06:48:00 crc kubenswrapper[4825]: I0310 06:48:00.309855 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs2mb\" (UniqueName: \"kubernetes.io/projected/ccb47913-9550-4871-9cf8-47fa8734b406-kube-api-access-bs2mb\") pod \"auto-csr-approver-29552088-7fs5s\" (UID: \"ccb47913-9550-4871-9cf8-47fa8734b406\") " pod="openshift-infra/auto-csr-approver-29552088-7fs5s" Mar 10 
06:48:00 crc kubenswrapper[4825]: I0310 06:48:00.492406 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552088-7fs5s" Mar 10 06:48:00 crc kubenswrapper[4825]: I0310 06:48:00.517402 4825 ???:1] "http: TLS handshake error from 192.168.126.11:44998: no serving certificate available for the kubelet" Mar 10 06:48:00 crc kubenswrapper[4825]: I0310 06:48:00.776050 4825 patch_prober.go:28] interesting pod/router-default-5444994796-tgcvt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 06:48:00 crc kubenswrapper[4825]: [-]has-synced failed: reason withheld Mar 10 06:48:00 crc kubenswrapper[4825]: [+]process-running ok Mar 10 06:48:00 crc kubenswrapper[4825]: healthz check failed Mar 10 06:48:00 crc kubenswrapper[4825]: I0310 06:48:00.776149 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tgcvt" podUID="9f0e6d3a-b022-42dd-828e-bd5ec395d06c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 06:48:01 crc kubenswrapper[4825]: I0310 06:48:01.313193 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552088-7fs5s"] Mar 10 06:48:01 crc kubenswrapper[4825]: W0310 06:48:01.331626 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podccb47913_9550_4871_9cf8_47fa8734b406.slice/crio-d0cf47b9651893ad08797a3f6518ec11e7a315a4ff2ae3e2539ce055ddc7aaad WatchSource:0}: Error finding container d0cf47b9651893ad08797a3f6518ec11e7a315a4ff2ae3e2539ce055ddc7aaad: Status 404 returned error can't find the container with id d0cf47b9651893ad08797a3f6518ec11e7a315a4ff2ae3e2539ce055ddc7aaad Mar 10 06:48:01 crc kubenswrapper[4825]: I0310 06:48:01.527277 4825 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 06:48:01 crc kubenswrapper[4825]: I0310 06:48:01.613108 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5038ec51-bad1-48bf-a91f-ea63f329e294-kubelet-dir\") pod \"5038ec51-bad1-48bf-a91f-ea63f329e294\" (UID: \"5038ec51-bad1-48bf-a91f-ea63f329e294\") " Mar 10 06:48:01 crc kubenswrapper[4825]: I0310 06:48:01.613241 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5038ec51-bad1-48bf-a91f-ea63f329e294-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5038ec51-bad1-48bf-a91f-ea63f329e294" (UID: "5038ec51-bad1-48bf-a91f-ea63f329e294"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 06:48:01 crc kubenswrapper[4825]: I0310 06:48:01.613272 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5038ec51-bad1-48bf-a91f-ea63f329e294-kube-api-access\") pod \"5038ec51-bad1-48bf-a91f-ea63f329e294\" (UID: \"5038ec51-bad1-48bf-a91f-ea63f329e294\") " Mar 10 06:48:01 crc kubenswrapper[4825]: I0310 06:48:01.613707 4825 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5038ec51-bad1-48bf-a91f-ea63f329e294-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:01 crc kubenswrapper[4825]: I0310 06:48:01.622244 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5038ec51-bad1-48bf-a91f-ea63f329e294-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5038ec51-bad1-48bf-a91f-ea63f329e294" (UID: "5038ec51-bad1-48bf-a91f-ea63f329e294"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:48:01 crc kubenswrapper[4825]: I0310 06:48:01.714752 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5038ec51-bad1-48bf-a91f-ea63f329e294-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:01 crc kubenswrapper[4825]: I0310 06:48:01.771020 4825 patch_prober.go:28] interesting pod/router-default-5444994796-tgcvt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 06:48:01 crc kubenswrapper[4825]: [-]has-synced failed: reason withheld Mar 10 06:48:01 crc kubenswrapper[4825]: [+]process-running ok Mar 10 06:48:01 crc kubenswrapper[4825]: healthz check failed Mar 10 06:48:01 crc kubenswrapper[4825]: I0310 06:48:01.771104 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tgcvt" podUID="9f0e6d3a-b022-42dd-828e-bd5ec395d06c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 06:48:02 crc kubenswrapper[4825]: I0310 06:48:02.184472 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5038ec51-bad1-48bf-a91f-ea63f329e294","Type":"ContainerDied","Data":"8a856c30a90b2aedd4ccda408842c9046c6d0188e34249ed7063bb5525890662"} Mar 10 06:48:02 crc kubenswrapper[4825]: I0310 06:48:02.184533 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a856c30a90b2aedd4ccda408842c9046c6d0188e34249ed7063bb5525890662" Mar 10 06:48:02 crc kubenswrapper[4825]: I0310 06:48:02.184615 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 06:48:02 crc kubenswrapper[4825]: I0310 06:48:02.235461 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552088-7fs5s" event={"ID":"ccb47913-9550-4871-9cf8-47fa8734b406","Type":"ContainerStarted","Data":"d0cf47b9651893ad08797a3f6518ec11e7a315a4ff2ae3e2539ce055ddc7aaad"} Mar 10 06:48:02 crc kubenswrapper[4825]: I0310 06:48:02.566390 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-hbbcq" Mar 10 06:48:02 crc kubenswrapper[4825]: I0310 06:48:02.770843 4825 patch_prober.go:28] interesting pod/router-default-5444994796-tgcvt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 06:48:02 crc kubenswrapper[4825]: [-]has-synced failed: reason withheld Mar 10 06:48:02 crc kubenswrapper[4825]: [+]process-running ok Mar 10 06:48:02 crc kubenswrapper[4825]: healthz check failed Mar 10 06:48:02 crc kubenswrapper[4825]: I0310 06:48:02.771367 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tgcvt" podUID="9f0e6d3a-b022-42dd-828e-bd5ec395d06c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 06:48:03 crc kubenswrapper[4825]: I0310 06:48:03.525584 4825 ???:1] "http: TLS handshake error from 192.168.126.11:50970: no serving certificate available for the kubelet" Mar 10 06:48:03 crc kubenswrapper[4825]: I0310 06:48:03.770851 4825 patch_prober.go:28] interesting pod/router-default-5444994796-tgcvt container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 06:48:03 crc kubenswrapper[4825]: [+]has-synced ok Mar 10 06:48:03 crc 
kubenswrapper[4825]: [+]process-running ok Mar 10 06:48:03 crc kubenswrapper[4825]: healthz check failed Mar 10 06:48:03 crc kubenswrapper[4825]: I0310 06:48:03.770940 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tgcvt" podUID="9f0e6d3a-b022-42dd-828e-bd5ec395d06c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 06:48:04 crc kubenswrapper[4825]: I0310 06:48:04.770974 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-tgcvt" Mar 10 06:48:04 crc kubenswrapper[4825]: I0310 06:48:04.775203 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-tgcvt" Mar 10 06:48:06 crc kubenswrapper[4825]: I0310 06:48:06.819259 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-6bjtt" Mar 10 06:48:06 crc kubenswrapper[4825]: I0310 06:48:06.824702 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-6bjtt" Mar 10 06:48:07 crc kubenswrapper[4825]: I0310 06:48:07.539412 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-9cr4m" Mar 10 06:48:08 crc kubenswrapper[4825]: I0310 06:48:08.041595 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/114672c5-c1d0-4f87-b3aa-fb6d8535ffeb-metrics-certs\") pod \"network-metrics-daemon-pj5dl\" (UID: \"114672c5-c1d0-4f87-b3aa-fb6d8535ffeb\") " pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:48:08 crc kubenswrapper[4825]: I0310 06:48:08.047448 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/114672c5-c1d0-4f87-b3aa-fb6d8535ffeb-metrics-certs\") 
pod \"network-metrics-daemon-pj5dl\" (UID: \"114672c5-c1d0-4f87-b3aa-fb6d8535ffeb\") " pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:48:08 crc kubenswrapper[4825]: I0310 06:48:08.193515 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj5dl" Mar 10 06:48:09 crc kubenswrapper[4825]: I0310 06:48:09.311229 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c5b59d588-q6lfm"] Mar 10 06:48:09 crc kubenswrapper[4825]: I0310 06:48:09.311894 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6c5b59d588-q6lfm" podUID="f86d9d2c-0701-4d7a-926f-64f7733767d4" containerName="controller-manager" containerID="cri-o://4b09ddfcbafce7de75ada51b793f536dcb329d0eb9b225fb8295030375245d0f" gracePeriod=30 Mar 10 06:48:09 crc kubenswrapper[4825]: I0310 06:48:09.314153 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-854d9cb458-vnjfz"] Mar 10 06:48:09 crc kubenswrapper[4825]: I0310 06:48:09.314304 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-854d9cb458-vnjfz" podUID="f05d28cf-03a0-4bea-8c0e-fe252efc5c49" containerName="route-controller-manager" containerID="cri-o://ffe75fbb6c6c36c4c9c3922b2217be7106e13b72145242a4e7a4cd41544d5f91" gracePeriod=30 Mar 10 06:48:10 crc kubenswrapper[4825]: I0310 06:48:10.379069 4825 generic.go:334] "Generic (PLEG): container finished" podID="f05d28cf-03a0-4bea-8c0e-fe252efc5c49" containerID="ffe75fbb6c6c36c4c9c3922b2217be7106e13b72145242a4e7a4cd41544d5f91" exitCode=0 Mar 10 06:48:10 crc kubenswrapper[4825]: I0310 06:48:10.379193 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-854d9cb458-vnjfz" 
event={"ID":"f05d28cf-03a0-4bea-8c0e-fe252efc5c49","Type":"ContainerDied","Data":"ffe75fbb6c6c36c4c9c3922b2217be7106e13b72145242a4e7a4cd41544d5f91"} Mar 10 06:48:10 crc kubenswrapper[4825]: I0310 06:48:10.381982 4825 generic.go:334] "Generic (PLEG): container finished" podID="f86d9d2c-0701-4d7a-926f-64f7733767d4" containerID="4b09ddfcbafce7de75ada51b793f536dcb329d0eb9b225fb8295030375245d0f" exitCode=0 Mar 10 06:48:10 crc kubenswrapper[4825]: I0310 06:48:10.382027 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c5b59d588-q6lfm" event={"ID":"f86d9d2c-0701-4d7a-926f-64f7733767d4","Type":"ContainerDied","Data":"4b09ddfcbafce7de75ada51b793f536dcb329d0eb9b225fb8295030375245d0f"} Mar 10 06:48:10 crc kubenswrapper[4825]: I0310 06:48:10.790597 4825 ???:1] "http: TLS handshake error from 192.168.126.11:50974: no serving certificate available for the kubelet" Mar 10 06:48:13 crc kubenswrapper[4825]: I0310 06:48:13.645549 4825 patch_prober.go:28] interesting pod/controller-manager-6c5b59d588-q6lfm container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" start-of-body= Mar 10 06:48:13 crc kubenswrapper[4825]: I0310 06:48:13.647273 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6c5b59d588-q6lfm" podUID="f86d9d2c-0701-4d7a-926f-64f7733767d4" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" Mar 10 06:48:14 crc kubenswrapper[4825]: I0310 06:48:14.921367 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:48:16 crc kubenswrapper[4825]: I0310 06:48:16.801850 4825 patch_prober.go:28] interesting 
pod/route-controller-manager-854d9cb458-vnjfz container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.55:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 06:48:16 crc kubenswrapper[4825]: I0310 06:48:16.802429 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-854d9cb458-vnjfz" podUID="f05d28cf-03a0-4bea-8c0e-fe252efc5c49" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.55:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 06:48:16 crc kubenswrapper[4825]: I0310 06:48:16.888300 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 06:48:16 crc kubenswrapper[4825]: I0310 06:48:16.888389 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 06:48:19 crc kubenswrapper[4825]: E0310 06:48:19.471851 4825 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 10 06:48:19 crc kubenswrapper[4825]: E0310 06:48:19.473221 4825 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 06:48:19 crc kubenswrapper[4825]: container 
&Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 10 06:48:19 crc kubenswrapper[4825]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6jh49,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29552086-rkhdp_openshift-infra(19b3fb37-5072-45f0-8349-1296e31a1193): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 10 06:48:19 crc kubenswrapper[4825]: > logger="UnhandledError" Mar 10 06:48:19 crc kubenswrapper[4825]: E0310 06:48:19.474307 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29552086-rkhdp" podUID="19b3fb37-5072-45f0-8349-1296e31a1193" Mar 10 06:48:19 crc kubenswrapper[4825]: I0310 06:48:19.485045 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-854d9cb458-vnjfz" 
event={"ID":"f05d28cf-03a0-4bea-8c0e-fe252efc5c49","Type":"ContainerDied","Data":"55ceb7bc912c8371eab44ff6f8a3d6aa4faa4aa19f137d280a5ad6685091c566"} Mar 10 06:48:19 crc kubenswrapper[4825]: I0310 06:48:19.485170 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55ceb7bc912c8371eab44ff6f8a3d6aa4faa4aa19f137d280a5ad6685091c566" Mar 10 06:48:19 crc kubenswrapper[4825]: I0310 06:48:19.487521 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-854d9cb458-vnjfz" Mar 10 06:48:19 crc kubenswrapper[4825]: I0310 06:48:19.529795 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dc89489b9-b4slr"] Mar 10 06:48:19 crc kubenswrapper[4825]: E0310 06:48:19.530426 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5038ec51-bad1-48bf-a91f-ea63f329e294" containerName="pruner" Mar 10 06:48:19 crc kubenswrapper[4825]: I0310 06:48:19.530519 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="5038ec51-bad1-48bf-a91f-ea63f329e294" containerName="pruner" Mar 10 06:48:19 crc kubenswrapper[4825]: E0310 06:48:19.530624 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f05d28cf-03a0-4bea-8c0e-fe252efc5c49" containerName="route-controller-manager" Mar 10 06:48:19 crc kubenswrapper[4825]: I0310 06:48:19.530696 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f05d28cf-03a0-4bea-8c0e-fe252efc5c49" containerName="route-controller-manager" Mar 10 06:48:19 crc kubenswrapper[4825]: I0310 06:48:19.530900 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="f05d28cf-03a0-4bea-8c0e-fe252efc5c49" containerName="route-controller-manager" Mar 10 06:48:19 crc kubenswrapper[4825]: I0310 06:48:19.530995 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="5038ec51-bad1-48bf-a91f-ea63f329e294" containerName="pruner" Mar 10 
06:48:19 crc kubenswrapper[4825]: I0310 06:48:19.531573 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dc89489b9-b4slr" Mar 10 06:48:19 crc kubenswrapper[4825]: I0310 06:48:19.545864 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dc89489b9-b4slr"] Mar 10 06:48:19 crc kubenswrapper[4825]: I0310 06:48:19.643092 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f05d28cf-03a0-4bea-8c0e-fe252efc5c49-client-ca\") pod \"f05d28cf-03a0-4bea-8c0e-fe252efc5c49\" (UID: \"f05d28cf-03a0-4bea-8c0e-fe252efc5c49\") " Mar 10 06:48:19 crc kubenswrapper[4825]: I0310 06:48:19.643184 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f05d28cf-03a0-4bea-8c0e-fe252efc5c49-config\") pod \"f05d28cf-03a0-4bea-8c0e-fe252efc5c49\" (UID: \"f05d28cf-03a0-4bea-8c0e-fe252efc5c49\") " Mar 10 06:48:19 crc kubenswrapper[4825]: I0310 06:48:19.643213 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f05d28cf-03a0-4bea-8c0e-fe252efc5c49-serving-cert\") pod \"f05d28cf-03a0-4bea-8c0e-fe252efc5c49\" (UID: \"f05d28cf-03a0-4bea-8c0e-fe252efc5c49\") " Mar 10 06:48:19 crc kubenswrapper[4825]: I0310 06:48:19.643243 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7qb7\" (UniqueName: \"kubernetes.io/projected/f05d28cf-03a0-4bea-8c0e-fe252efc5c49-kube-api-access-q7qb7\") pod \"f05d28cf-03a0-4bea-8c0e-fe252efc5c49\" (UID: \"f05d28cf-03a0-4bea-8c0e-fe252efc5c49\") " Mar 10 06:48:19 crc kubenswrapper[4825]: I0310 06:48:19.643424 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c7bf472b-4206-4d33-832b-38b84b2760ae-config\") pod \"route-controller-manager-dc89489b9-b4slr\" (UID: \"c7bf472b-4206-4d33-832b-38b84b2760ae\") " pod="openshift-route-controller-manager/route-controller-manager-dc89489b9-b4slr" Mar 10 06:48:19 crc kubenswrapper[4825]: I0310 06:48:19.643496 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7bf472b-4206-4d33-832b-38b84b2760ae-client-ca\") pod \"route-controller-manager-dc89489b9-b4slr\" (UID: \"c7bf472b-4206-4d33-832b-38b84b2760ae\") " pod="openshift-route-controller-manager/route-controller-manager-dc89489b9-b4slr" Mar 10 06:48:19 crc kubenswrapper[4825]: I0310 06:48:19.643553 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvtsh\" (UniqueName: \"kubernetes.io/projected/c7bf472b-4206-4d33-832b-38b84b2760ae-kube-api-access-lvtsh\") pod \"route-controller-manager-dc89489b9-b4slr\" (UID: \"c7bf472b-4206-4d33-832b-38b84b2760ae\") " pod="openshift-route-controller-manager/route-controller-manager-dc89489b9-b4slr" Mar 10 06:48:19 crc kubenswrapper[4825]: I0310 06:48:19.644205 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f05d28cf-03a0-4bea-8c0e-fe252efc5c49-config" (OuterVolumeSpecName: "config") pod "f05d28cf-03a0-4bea-8c0e-fe252efc5c49" (UID: "f05d28cf-03a0-4bea-8c0e-fe252efc5c49"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:48:19 crc kubenswrapper[4825]: I0310 06:48:19.644199 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f05d28cf-03a0-4bea-8c0e-fe252efc5c49-client-ca" (OuterVolumeSpecName: "client-ca") pod "f05d28cf-03a0-4bea-8c0e-fe252efc5c49" (UID: "f05d28cf-03a0-4bea-8c0e-fe252efc5c49"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:48:19 crc kubenswrapper[4825]: I0310 06:48:19.646197 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7bf472b-4206-4d33-832b-38b84b2760ae-serving-cert\") pod \"route-controller-manager-dc89489b9-b4slr\" (UID: \"c7bf472b-4206-4d33-832b-38b84b2760ae\") " pod="openshift-route-controller-manager/route-controller-manager-dc89489b9-b4slr" Mar 10 06:48:19 crc kubenswrapper[4825]: I0310 06:48:19.646338 4825 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f05d28cf-03a0-4bea-8c0e-fe252efc5c49-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:19 crc kubenswrapper[4825]: I0310 06:48:19.646367 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f05d28cf-03a0-4bea-8c0e-fe252efc5c49-config\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:19 crc kubenswrapper[4825]: I0310 06:48:19.649745 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f05d28cf-03a0-4bea-8c0e-fe252efc5c49-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f05d28cf-03a0-4bea-8c0e-fe252efc5c49" (UID: "f05d28cf-03a0-4bea-8c0e-fe252efc5c49"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:48:19 crc kubenswrapper[4825]: I0310 06:48:19.650034 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f05d28cf-03a0-4bea-8c0e-fe252efc5c49-kube-api-access-q7qb7" (OuterVolumeSpecName: "kube-api-access-q7qb7") pod "f05d28cf-03a0-4bea-8c0e-fe252efc5c49" (UID: "f05d28cf-03a0-4bea-8c0e-fe252efc5c49"). InnerVolumeSpecName "kube-api-access-q7qb7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:48:19 crc kubenswrapper[4825]: I0310 06:48:19.747858 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7bf472b-4206-4d33-832b-38b84b2760ae-serving-cert\") pod \"route-controller-manager-dc89489b9-b4slr\" (UID: \"c7bf472b-4206-4d33-832b-38b84b2760ae\") " pod="openshift-route-controller-manager/route-controller-manager-dc89489b9-b4slr" Mar 10 06:48:19 crc kubenswrapper[4825]: I0310 06:48:19.747943 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7bf472b-4206-4d33-832b-38b84b2760ae-config\") pod \"route-controller-manager-dc89489b9-b4slr\" (UID: \"c7bf472b-4206-4d33-832b-38b84b2760ae\") " pod="openshift-route-controller-manager/route-controller-manager-dc89489b9-b4slr" Mar 10 06:48:19 crc kubenswrapper[4825]: I0310 06:48:19.748018 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7bf472b-4206-4d33-832b-38b84b2760ae-client-ca\") pod \"route-controller-manager-dc89489b9-b4slr\" (UID: \"c7bf472b-4206-4d33-832b-38b84b2760ae\") " pod="openshift-route-controller-manager/route-controller-manager-dc89489b9-b4slr" Mar 10 06:48:19 crc kubenswrapper[4825]: I0310 06:48:19.748079 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvtsh\" (UniqueName: \"kubernetes.io/projected/c7bf472b-4206-4d33-832b-38b84b2760ae-kube-api-access-lvtsh\") pod \"route-controller-manager-dc89489b9-b4slr\" (UID: \"c7bf472b-4206-4d33-832b-38b84b2760ae\") " pod="openshift-route-controller-manager/route-controller-manager-dc89489b9-b4slr" Mar 10 06:48:19 crc kubenswrapper[4825]: I0310 06:48:19.748175 4825 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f05d28cf-03a0-4bea-8c0e-fe252efc5c49-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:19 crc kubenswrapper[4825]: I0310 06:48:19.748194 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7qb7\" (UniqueName: \"kubernetes.io/projected/f05d28cf-03a0-4bea-8c0e-fe252efc5c49-kube-api-access-q7qb7\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:19 crc kubenswrapper[4825]: I0310 06:48:19.749550 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7bf472b-4206-4d33-832b-38b84b2760ae-config\") pod \"route-controller-manager-dc89489b9-b4slr\" (UID: \"c7bf472b-4206-4d33-832b-38b84b2760ae\") " pod="openshift-route-controller-manager/route-controller-manager-dc89489b9-b4slr" Mar 10 06:48:19 crc kubenswrapper[4825]: I0310 06:48:19.749563 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7bf472b-4206-4d33-832b-38b84b2760ae-client-ca\") pod \"route-controller-manager-dc89489b9-b4slr\" (UID: \"c7bf472b-4206-4d33-832b-38b84b2760ae\") " pod="openshift-route-controller-manager/route-controller-manager-dc89489b9-b4slr" Mar 10 06:48:19 crc kubenswrapper[4825]: I0310 06:48:19.752220 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7bf472b-4206-4d33-832b-38b84b2760ae-serving-cert\") pod \"route-controller-manager-dc89489b9-b4slr\" (UID: \"c7bf472b-4206-4d33-832b-38b84b2760ae\") " pod="openshift-route-controller-manager/route-controller-manager-dc89489b9-b4slr" Mar 10 06:48:19 crc kubenswrapper[4825]: I0310 06:48:19.765895 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvtsh\" (UniqueName: \"kubernetes.io/projected/c7bf472b-4206-4d33-832b-38b84b2760ae-kube-api-access-lvtsh\") pod \"route-controller-manager-dc89489b9-b4slr\" (UID: 
\"c7bf472b-4206-4d33-832b-38b84b2760ae\") " pod="openshift-route-controller-manager/route-controller-manager-dc89489b9-b4slr" Mar 10 06:48:19 crc kubenswrapper[4825]: I0310 06:48:19.865997 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dc89489b9-b4slr" Mar 10 06:48:20 crc kubenswrapper[4825]: I0310 06:48:20.491266 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-854d9cb458-vnjfz" Mar 10 06:48:20 crc kubenswrapper[4825]: E0310 06:48:20.496017 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29552086-rkhdp" podUID="19b3fb37-5072-45f0-8349-1296e31a1193" Mar 10 06:48:20 crc kubenswrapper[4825]: I0310 06:48:20.529615 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-854d9cb458-vnjfz"] Mar 10 06:48:20 crc kubenswrapper[4825]: I0310 06:48:20.534604 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-854d9cb458-vnjfz"] Mar 10 06:48:21 crc kubenswrapper[4825]: E0310 06:48:21.157211 4825 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 10 06:48:21 crc kubenswrapper[4825]: E0310 06:48:21.157713 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog 
--cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p6bpt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-hbdjf_openshift-marketplace(d232f8a7-f013-4a5c-a7dc-2150c4b3040c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 06:48:21 crc kubenswrapper[4825]: E0310 06:48:21.158901 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-hbdjf" podUID="d232f8a7-f013-4a5c-a7dc-2150c4b3040c" 
Mar 10 06:48:21 crc kubenswrapper[4825]: I0310 06:48:21.243624 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f05d28cf-03a0-4bea-8c0e-fe252efc5c49" path="/var/lib/kubelet/pods/f05d28cf-03a0-4bea-8c0e-fe252efc5c49/volumes" Mar 10 06:48:23 crc kubenswrapper[4825]: E0310 06:48:23.193313 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-hbdjf" podUID="d232f8a7-f013-4a5c-a7dc-2150c4b3040c" Mar 10 06:48:24 crc kubenswrapper[4825]: I0310 06:48:24.646006 4825 patch_prober.go:28] interesting pod/controller-manager-6c5b59d588-q6lfm container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 06:48:24 crc kubenswrapper[4825]: I0310 06:48:24.646109 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6c5b59d588-q6lfm" podUID="f86d9d2c-0701-4d7a-926f-64f7733767d4" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 06:48:25 crc kubenswrapper[4825]: I0310 06:48:25.015748 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xfp4f"] Mar 10 06:48:27 crc kubenswrapper[4825]: E0310 06:48:27.559564 4825 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 10 06:48:27 crc 
kubenswrapper[4825]: E0310 06:48:27.560162 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pcm4d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-7h5g2_openshift-marketplace(27fbaf63-cf93-470e-b012-ad8bb403f65a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 06:48:27 crc kubenswrapper[4825]: E0310 06:48:27.561857 4825 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-7h5g2" podUID="27fbaf63-cf93-470e-b012-ad8bb403f65a" Mar 10 06:48:27 crc kubenswrapper[4825]: I0310 06:48:27.875335 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nblmw" Mar 10 06:48:28 crc kubenswrapper[4825]: I0310 06:48:28.614910 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 10 06:48:28 crc kubenswrapper[4825]: I0310 06:48:28.618208 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 06:48:28 crc kubenswrapper[4825]: I0310 06:48:28.621663 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 10 06:48:28 crc kubenswrapper[4825]: I0310 06:48:28.622422 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 10 06:48:28 crc kubenswrapper[4825]: I0310 06:48:28.623921 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 10 06:48:28 crc kubenswrapper[4825]: I0310 06:48:28.725767 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7b0c2a90-0982-47a7-a77c-720eea591481-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7b0c2a90-0982-47a7-a77c-720eea591481\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 06:48:28 crc kubenswrapper[4825]: I0310 06:48:28.725923 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7b0c2a90-0982-47a7-a77c-720eea591481-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7b0c2a90-0982-47a7-a77c-720eea591481\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 06:48:28 crc kubenswrapper[4825]: I0310 06:48:28.827444 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7b0c2a90-0982-47a7-a77c-720eea591481-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7b0c2a90-0982-47a7-a77c-720eea591481\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 06:48:28 crc kubenswrapper[4825]: I0310 06:48:28.827522 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7b0c2a90-0982-47a7-a77c-720eea591481-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7b0c2a90-0982-47a7-a77c-720eea591481\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 06:48:28 crc kubenswrapper[4825]: I0310 06:48:28.827957 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7b0c2a90-0982-47a7-a77c-720eea591481-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7b0c2a90-0982-47a7-a77c-720eea591481\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 06:48:28 crc kubenswrapper[4825]: I0310 06:48:28.847838 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7b0c2a90-0982-47a7-a77c-720eea591481-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7b0c2a90-0982-47a7-a77c-720eea591481\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 06:48:28 crc kubenswrapper[4825]: I0310 06:48:28.949597 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 06:48:29 crc kubenswrapper[4825]: E0310 06:48:29.002322 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7h5g2" podUID="27fbaf63-cf93-470e-b012-ad8bb403f65a" Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.050064 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c5b59d588-q6lfm" Mar 10 06:48:29 crc kubenswrapper[4825]: E0310 06:48:29.106255 4825 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 10 06:48:29 crc kubenswrapper[4825]: E0310 06:48:29.106481 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t9p8v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-vkvnj_openshift-marketplace(d41ec286-2462-448b-ab27-1acc0e1dab3c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 06:48:29 crc kubenswrapper[4825]: E0310 06:48:29.107704 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-vkvnj" podUID="d41ec286-2462-448b-ab27-1acc0e1dab3c" Mar 10 06:48:29 crc 
kubenswrapper[4825]: I0310 06:48:29.125382 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-556c9dd674-99mg2"] Mar 10 06:48:29 crc kubenswrapper[4825]: E0310 06:48:29.126179 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f86d9d2c-0701-4d7a-926f-64f7733767d4" containerName="controller-manager" Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.126197 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f86d9d2c-0701-4d7a-926f-64f7733767d4" containerName="controller-manager" Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.126333 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="f86d9d2c-0701-4d7a-926f-64f7733767d4" containerName="controller-manager" Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.127586 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-556c9dd674-99mg2" Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.130551 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-556c9dd674-99mg2"] Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.132083 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f86d9d2c-0701-4d7a-926f-64f7733767d4-config\") pod \"f86d9d2c-0701-4d7a-926f-64f7733767d4\" (UID: \"f86d9d2c-0701-4d7a-926f-64f7733767d4\") " Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.132147 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f86d9d2c-0701-4d7a-926f-64f7733767d4-serving-cert\") pod \"f86d9d2c-0701-4d7a-926f-64f7733767d4\" (UID: \"f86d9d2c-0701-4d7a-926f-64f7733767d4\") " Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.132186 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-qfmlp\" (UniqueName: \"kubernetes.io/projected/f86d9d2c-0701-4d7a-926f-64f7733767d4-kube-api-access-qfmlp\") pod \"f86d9d2c-0701-4d7a-926f-64f7733767d4\" (UID: \"f86d9d2c-0701-4d7a-926f-64f7733767d4\") " Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.132231 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f86d9d2c-0701-4d7a-926f-64f7733767d4-proxy-ca-bundles\") pod \"f86d9d2c-0701-4d7a-926f-64f7733767d4\" (UID: \"f86d9d2c-0701-4d7a-926f-64f7733767d4\") " Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.132367 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f86d9d2c-0701-4d7a-926f-64f7733767d4-client-ca\") pod \"f86d9d2c-0701-4d7a-926f-64f7733767d4\" (UID: \"f86d9d2c-0701-4d7a-926f-64f7733767d4\") " Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.134867 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f86d9d2c-0701-4d7a-926f-64f7733767d4-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "f86d9d2c-0701-4d7a-926f-64f7733767d4" (UID: "f86d9d2c-0701-4d7a-926f-64f7733767d4"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.134877 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f86d9d2c-0701-4d7a-926f-64f7733767d4-client-ca" (OuterVolumeSpecName: "client-ca") pod "f86d9d2c-0701-4d7a-926f-64f7733767d4" (UID: "f86d9d2c-0701-4d7a-926f-64f7733767d4"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.137549 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f86d9d2c-0701-4d7a-926f-64f7733767d4-config" (OuterVolumeSpecName: "config") pod "f86d9d2c-0701-4d7a-926f-64f7733767d4" (UID: "f86d9d2c-0701-4d7a-926f-64f7733767d4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.140295 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f86d9d2c-0701-4d7a-926f-64f7733767d4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f86d9d2c-0701-4d7a-926f-64f7733767d4" (UID: "f86d9d2c-0701-4d7a-926f-64f7733767d4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.145352 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f86d9d2c-0701-4d7a-926f-64f7733767d4-kube-api-access-qfmlp" (OuterVolumeSpecName: "kube-api-access-qfmlp") pod "f86d9d2c-0701-4d7a-926f-64f7733767d4" (UID: "f86d9d2c-0701-4d7a-926f-64f7733767d4"). InnerVolumeSpecName "kube-api-access-qfmlp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:48:29 crc kubenswrapper[4825]: E0310 06:48:29.154523 4825 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 10 06:48:29 crc kubenswrapper[4825]: E0310 06:48:29.154742 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4l84j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},
RestartPolicy:nil,} start failed in pod certified-operators-lfhnn_openshift-marketplace(33ed941e-9788-4c9e-bcd5-6af206460adb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 06:48:29 crc kubenswrapper[4825]: E0310 06:48:29.158003 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-lfhnn" podUID="33ed941e-9788-4c9e-bcd5-6af206460adb" Mar 10 06:48:29 crc kubenswrapper[4825]: E0310 06:48:29.218307 4825 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 10 06:48:29 crc kubenswrapper[4825]: E0310 06:48:29.218988 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hkc9w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-2gg7k_openshift-marketplace(58d27fca-dd33-4753-9ea6-4b55c191b2b8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 06:48:29 crc kubenswrapper[4825]: E0310 06:48:29.221693 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-2gg7k" podUID="58d27fca-dd33-4753-9ea6-4b55c191b2b8" Mar 10 06:48:29 crc 
kubenswrapper[4825]: E0310 06:48:29.235603 4825 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 10 06:48:29 crc kubenswrapper[4825]: E0310 06:48:29.235755 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fn9kg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-lfxbn_openshift-marketplace(823b95d6-3aa7-4fda-bd75-9fe7e95a209f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.236228 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64443acf-ea44-4a9a-92c9-c7daf622140c-config\") pod \"controller-manager-556c9dd674-99mg2\" (UID: \"64443acf-ea44-4a9a-92c9-c7daf622140c\") " pod="openshift-controller-manager/controller-manager-556c9dd674-99mg2" Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.236286 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64443acf-ea44-4a9a-92c9-c7daf622140c-serving-cert\") pod \"controller-manager-556c9dd674-99mg2\" (UID: \"64443acf-ea44-4a9a-92c9-c7daf622140c\") " pod="openshift-controller-manager/controller-manager-556c9dd674-99mg2" Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.236320 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/64443acf-ea44-4a9a-92c9-c7daf622140c-client-ca\") pod \"controller-manager-556c9dd674-99mg2\" (UID: \"64443acf-ea44-4a9a-92c9-c7daf622140c\") " pod="openshift-controller-manager/controller-manager-556c9dd674-99mg2" Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.236379 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/64443acf-ea44-4a9a-92c9-c7daf622140c-proxy-ca-bundles\") pod \"controller-manager-556c9dd674-99mg2\" (UID: \"64443acf-ea44-4a9a-92c9-c7daf622140c\") " pod="openshift-controller-manager/controller-manager-556c9dd674-99mg2" Mar 10 06:48:29 crc 
kubenswrapper[4825]: I0310 06:48:29.236400 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6h7l\" (UniqueName: \"kubernetes.io/projected/64443acf-ea44-4a9a-92c9-c7daf622140c-kube-api-access-b6h7l\") pod \"controller-manager-556c9dd674-99mg2\" (UID: \"64443acf-ea44-4a9a-92c9-c7daf622140c\") " pod="openshift-controller-manager/controller-manager-556c9dd674-99mg2" Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.236480 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f86d9d2c-0701-4d7a-926f-64f7733767d4-config\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.236493 4825 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f86d9d2c-0701-4d7a-926f-64f7733767d4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.236504 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfmlp\" (UniqueName: \"kubernetes.io/projected/f86d9d2c-0701-4d7a-926f-64f7733767d4-kube-api-access-qfmlp\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.236517 4825 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f86d9d2c-0701-4d7a-926f-64f7733767d4-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.236526 4825 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f86d9d2c-0701-4d7a-926f-64f7733767d4-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:29 crc kubenswrapper[4825]: E0310 06:48:29.238231 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying 
system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-lfxbn" podUID="823b95d6-3aa7-4fda-bd75-9fe7e95a209f" Mar 10 06:48:29 crc kubenswrapper[4825]: E0310 06:48:29.257148 4825 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 10 06:48:29 crc kubenswrapper[4825]: E0310 06:48:29.257350 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tcqbl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePoli
cy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-9zmmv_openshift-marketplace(e945d496-847a-4109-ae2d-41e169b241c2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 06:48:29 crc kubenswrapper[4825]: E0310 06:48:29.258945 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-9zmmv" podUID="e945d496-847a-4109-ae2d-41e169b241c2" Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.321333 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-556c9dd674-99mg2"] Mar 10 06:48:29 crc kubenswrapper[4825]: E0310 06:48:29.323483 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config kube-api-access-b6h7l proxy-ca-bundles serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-556c9dd674-99mg2" podUID="64443acf-ea44-4a9a-92c9-c7daf622140c" Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.341594 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/64443acf-ea44-4a9a-92c9-c7daf622140c-proxy-ca-bundles\") pod \"controller-manager-556c9dd674-99mg2\" (UID: \"64443acf-ea44-4a9a-92c9-c7daf622140c\") " pod="openshift-controller-manager/controller-manager-556c9dd674-99mg2" Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.342234 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6h7l\" (UniqueName: 
\"kubernetes.io/projected/64443acf-ea44-4a9a-92c9-c7daf622140c-kube-api-access-b6h7l\") pod \"controller-manager-556c9dd674-99mg2\" (UID: \"64443acf-ea44-4a9a-92c9-c7daf622140c\") " pod="openshift-controller-manager/controller-manager-556c9dd674-99mg2" Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.342386 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64443acf-ea44-4a9a-92c9-c7daf622140c-config\") pod \"controller-manager-556c9dd674-99mg2\" (UID: \"64443acf-ea44-4a9a-92c9-c7daf622140c\") " pod="openshift-controller-manager/controller-manager-556c9dd674-99mg2" Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.342418 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64443acf-ea44-4a9a-92c9-c7daf622140c-serving-cert\") pod \"controller-manager-556c9dd674-99mg2\" (UID: \"64443acf-ea44-4a9a-92c9-c7daf622140c\") " pod="openshift-controller-manager/controller-manager-556c9dd674-99mg2" Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.342534 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/64443acf-ea44-4a9a-92c9-c7daf622140c-client-ca\") pod \"controller-manager-556c9dd674-99mg2\" (UID: \"64443acf-ea44-4a9a-92c9-c7daf622140c\") " pod="openshift-controller-manager/controller-manager-556c9dd674-99mg2" Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.346788 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/64443acf-ea44-4a9a-92c9-c7daf622140c-client-ca\") pod \"controller-manager-556c9dd674-99mg2\" (UID: \"64443acf-ea44-4a9a-92c9-c7daf622140c\") " pod="openshift-controller-manager/controller-manager-556c9dd674-99mg2" Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.350055 4825 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64443acf-ea44-4a9a-92c9-c7daf622140c-config\") pod \"controller-manager-556c9dd674-99mg2\" (UID: \"64443acf-ea44-4a9a-92c9-c7daf622140c\") " pod="openshift-controller-manager/controller-manager-556c9dd674-99mg2" Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.369578 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6h7l\" (UniqueName: \"kubernetes.io/projected/64443acf-ea44-4a9a-92c9-c7daf622140c-kube-api-access-b6h7l\") pod \"controller-manager-556c9dd674-99mg2\" (UID: \"64443acf-ea44-4a9a-92c9-c7daf622140c\") " pod="openshift-controller-manager/controller-manager-556c9dd674-99mg2" Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.369814 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64443acf-ea44-4a9a-92c9-c7daf622140c-serving-cert\") pod \"controller-manager-556c9dd674-99mg2\" (UID: \"64443acf-ea44-4a9a-92c9-c7daf622140c\") " pod="openshift-controller-manager/controller-manager-556c9dd674-99mg2" Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.375090 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/64443acf-ea44-4a9a-92c9-c7daf622140c-proxy-ca-bundles\") pod \"controller-manager-556c9dd674-99mg2\" (UID: \"64443acf-ea44-4a9a-92c9-c7daf622140c\") " pod="openshift-controller-manager/controller-manager-556c9dd674-99mg2" Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.412581 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dc89489b9-b4slr"] Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.556514 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4gwdl" 
event={"ID":"f7b359b6-5dbd-4270-9195-a355b8ce3dbd","Type":"ContainerStarted","Data":"fac4bb211af00d8f9c1dbebd44d9d67ed30eee5c2232504f09c4c1150fa89544"} Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.559578 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dc89489b9-b4slr"] Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.560606 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552088-7fs5s" event={"ID":"ccb47913-9550-4871-9cf8-47fa8734b406","Type":"ContainerStarted","Data":"639a015422630d0721259b232accc01baa4b433bac72a5f3c4e6d9e334208d22"} Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.566860 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c5b59d588-q6lfm" Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.567270 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c5b59d588-q6lfm" event={"ID":"f86d9d2c-0701-4d7a-926f-64f7733767d4","Type":"ContainerDied","Data":"a6802189a97937a7aea9c7cfbb226770a5a7af3dca8b5e09447a545ab92447a7"} Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.567306 4825 scope.go:117] "RemoveContainer" containerID="4b09ddfcbafce7de75ada51b793f536dcb329d0eb9b225fb8295030375245d0f" Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.568656 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-556c9dd674-99mg2" Mar 10 06:48:29 crc kubenswrapper[4825]: E0310 06:48:29.569921 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-2gg7k" podUID="58d27fca-dd33-4753-9ea6-4b55c191b2b8" Mar 10 06:48:29 crc kubenswrapper[4825]: E0310 06:48:29.570169 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-9zmmv" podUID="e945d496-847a-4109-ae2d-41e169b241c2" Mar 10 06:48:29 crc kubenswrapper[4825]: E0310 06:48:29.570317 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-vkvnj" podUID="d41ec286-2462-448b-ab27-1acc0e1dab3c" Mar 10 06:48:29 crc kubenswrapper[4825]: E0310 06:48:29.570472 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-lfxbn" podUID="823b95d6-3aa7-4fda-bd75-9fe7e95a209f" Mar 10 06:48:29 crc kubenswrapper[4825]: E0310 06:48:29.570883 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" 
pod="openshift-marketplace/certified-operators-lfhnn" podUID="33ed941e-9788-4c9e-bcd5-6af206460adb" Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.584663 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-556c9dd674-99mg2" Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.652451 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/64443acf-ea44-4a9a-92c9-c7daf622140c-proxy-ca-bundles\") pod \"64443acf-ea44-4a9a-92c9-c7daf622140c\" (UID: \"64443acf-ea44-4a9a-92c9-c7daf622140c\") " Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.652530 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64443acf-ea44-4a9a-92c9-c7daf622140c-serving-cert\") pod \"64443acf-ea44-4a9a-92c9-c7daf622140c\" (UID: \"64443acf-ea44-4a9a-92c9-c7daf622140c\") " Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.653302 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64443acf-ea44-4a9a-92c9-c7daf622140c-config\") pod \"64443acf-ea44-4a9a-92c9-c7daf622140c\" (UID: \"64443acf-ea44-4a9a-92c9-c7daf622140c\") " Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.653338 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/64443acf-ea44-4a9a-92c9-c7daf622140c-client-ca\") pod \"64443acf-ea44-4a9a-92c9-c7daf622140c\" (UID: \"64443acf-ea44-4a9a-92c9-c7daf622140c\") " Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.653416 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6h7l\" (UniqueName: \"kubernetes.io/projected/64443acf-ea44-4a9a-92c9-c7daf622140c-kube-api-access-b6h7l\") pod 
\"64443acf-ea44-4a9a-92c9-c7daf622140c\" (UID: \"64443acf-ea44-4a9a-92c9-c7daf622140c\") " Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.653399 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64443acf-ea44-4a9a-92c9-c7daf622140c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "64443acf-ea44-4a9a-92c9-c7daf622140c" (UID: "64443acf-ea44-4a9a-92c9-c7daf622140c"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.654008 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64443acf-ea44-4a9a-92c9-c7daf622140c-config" (OuterVolumeSpecName: "config") pod "64443acf-ea44-4a9a-92c9-c7daf622140c" (UID: "64443acf-ea44-4a9a-92c9-c7daf622140c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.654405 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64443acf-ea44-4a9a-92c9-c7daf622140c-client-ca" (OuterVolumeSpecName: "client-ca") pod "64443acf-ea44-4a9a-92c9-c7daf622140c" (UID: "64443acf-ea44-4a9a-92c9-c7daf622140c"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.655066 4825 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/64443acf-ea44-4a9a-92c9-c7daf622140c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.655085 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64443acf-ea44-4a9a-92c9-c7daf622140c-config\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.655094 4825 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/64443acf-ea44-4a9a-92c9-c7daf622140c-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.655799 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552088-7fs5s" podStartSLOduration=1.935852591 podStartE2EDuration="29.655768061s" podCreationTimestamp="2026-03-10 06:48:00 +0000 UTC" firstStartedPulling="2026-03-10 06:48:01.335395897 +0000 UTC m=+234.365176512" lastFinishedPulling="2026-03-10 06:48:29.055311367 +0000 UTC m=+262.085091982" observedRunningTime="2026-03-10 06:48:29.653015077 +0000 UTC m=+262.682795692" watchObservedRunningTime="2026-03-10 06:48:29.655768061 +0000 UTC m=+262.685548696" Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.663243 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64443acf-ea44-4a9a-92c9-c7daf622140c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "64443acf-ea44-4a9a-92c9-c7daf622140c" (UID: "64443acf-ea44-4a9a-92c9-c7daf622140c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.663416 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64443acf-ea44-4a9a-92c9-c7daf622140c-kube-api-access-b6h7l" (OuterVolumeSpecName: "kube-api-access-b6h7l") pod "64443acf-ea44-4a9a-92c9-c7daf622140c" (UID: "64443acf-ea44-4a9a-92c9-c7daf622140c"). InnerVolumeSpecName "kube-api-access-b6h7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.743277 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-pj5dl"] Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.758833 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6h7l\" (UniqueName: \"kubernetes.io/projected/64443acf-ea44-4a9a-92c9-c7daf622140c-kube-api-access-b6h7l\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.758865 4825 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64443acf-ea44-4a9a-92c9-c7daf622140c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.761232 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c5b59d588-q6lfm"] Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.765963 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6c5b59d588-q6lfm"] Mar 10 06:48:29 crc kubenswrapper[4825]: I0310 06:48:29.781944 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 10 06:48:29 crc kubenswrapper[4825]: W0310 06:48:29.844795 4825 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-pod7b0c2a90_0982_47a7_a77c_720eea591481.slice/crio-05625360749fdd658de573fd14ba9c1db43f413a3765e253df47007e0923a4a6 WatchSource:0}: Error finding container 05625360749fdd658de573fd14ba9c1db43f413a3765e253df47007e0923a4a6: Status 404 returned error can't find the container with id 05625360749fdd658de573fd14ba9c1db43f413a3765e253df47007e0923a4a6 Mar 10 06:48:30 crc kubenswrapper[4825]: I0310 06:48:30.026869 4825 csr.go:261] certificate signing request csr-x4c9l is approved, waiting to be issued Mar 10 06:48:30 crc kubenswrapper[4825]: I0310 06:48:30.033845 4825 csr.go:257] certificate signing request csr-x4c9l is issued Mar 10 06:48:30 crc kubenswrapper[4825]: I0310 06:48:30.572959 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"7b0c2a90-0982-47a7-a77c-720eea591481","Type":"ContainerStarted","Data":"22e4774bf3dd7adc454a5558066f8d92ffaf47a28e354f2feb2c6c92d19f79ab"} Mar 10 06:48:30 crc kubenswrapper[4825]: I0310 06:48:30.573041 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"7b0c2a90-0982-47a7-a77c-720eea591481","Type":"ContainerStarted","Data":"05625360749fdd658de573fd14ba9c1db43f413a3765e253df47007e0923a4a6"} Mar 10 06:48:30 crc kubenswrapper[4825]: I0310 06:48:30.575998 4825 generic.go:334] "Generic (PLEG): container finished" podID="ccb47913-9550-4871-9cf8-47fa8734b406" containerID="639a015422630d0721259b232accc01baa4b433bac72a5f3c4e6d9e334208d22" exitCode=0 Mar 10 06:48:30 crc kubenswrapper[4825]: I0310 06:48:30.576104 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552088-7fs5s" event={"ID":"ccb47913-9550-4871-9cf8-47fa8734b406","Type":"ContainerDied","Data":"639a015422630d0721259b232accc01baa4b433bac72a5f3c4e6d9e334208d22"} Mar 10 06:48:30 crc kubenswrapper[4825]: I0310 06:48:30.582379 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-pj5dl" event={"ID":"114672c5-c1d0-4f87-b3aa-fb6d8535ffeb","Type":"ContainerStarted","Data":"05cc562e9aa697f1ed700fbb6bea23a6230b1e87a71f769dc70f63cd80a8c6ec"} Mar 10 06:48:30 crc kubenswrapper[4825]: I0310 06:48:30.582421 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-pj5dl" event={"ID":"114672c5-c1d0-4f87-b3aa-fb6d8535ffeb","Type":"ContainerStarted","Data":"677146a29b0d3c9f14554c7a74a0b2117261e0497188b0ccaf7d9e62c2335ea4"} Mar 10 06:48:30 crc kubenswrapper[4825]: I0310 06:48:30.582438 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-pj5dl" event={"ID":"114672c5-c1d0-4f87-b3aa-fb6d8535ffeb","Type":"ContainerStarted","Data":"f4843fedf2c3024b68a2c7032a0ceb9ef4f02e0db993f3699ea051ab129b536e"} Mar 10 06:48:30 crc kubenswrapper[4825]: I0310 06:48:30.586921 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-dc89489b9-b4slr" event={"ID":"c7bf472b-4206-4d33-832b-38b84b2760ae","Type":"ContainerStarted","Data":"78dcd9bba494d9f8c139148c4beefc5e70a1eca442eb33d8c462b63f28f890a0"} Mar 10 06:48:30 crc kubenswrapper[4825]: I0310 06:48:30.586965 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-dc89489b9-b4slr" event={"ID":"c7bf472b-4206-4d33-832b-38b84b2760ae","Type":"ContainerStarted","Data":"defeedf5ce37519bdd26e06f9c22da77b1ff7e7e004879303ce6b7a79927ce9f"} Mar 10 06:48:30 crc kubenswrapper[4825]: I0310 06:48:30.587104 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-dc89489b9-b4slr" podUID="c7bf472b-4206-4d33-832b-38b84b2760ae" containerName="route-controller-manager" containerID="cri-o://78dcd9bba494d9f8c139148c4beefc5e70a1eca442eb33d8c462b63f28f890a0" gracePeriod=30 Mar 10 06:48:30 crc 
kubenswrapper[4825]: I0310 06:48:30.590741 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-dc89489b9-b4slr" Mar 10 06:48:30 crc kubenswrapper[4825]: I0310 06:48:30.594978 4825 generic.go:334] "Generic (PLEG): container finished" podID="f7b359b6-5dbd-4270-9195-a355b8ce3dbd" containerID="fac4bb211af00d8f9c1dbebd44d9d67ed30eee5c2232504f09c4c1150fa89544" exitCode=0 Mar 10 06:48:30 crc kubenswrapper[4825]: I0310 06:48:30.595073 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-556c9dd674-99mg2" Mar 10 06:48:30 crc kubenswrapper[4825]: I0310 06:48:30.596320 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4gwdl" event={"ID":"f7b359b6-5dbd-4270-9195-a355b8ce3dbd","Type":"ContainerDied","Data":"fac4bb211af00d8f9c1dbebd44d9d67ed30eee5c2232504f09c4c1150fa89544"} Mar 10 06:48:30 crc kubenswrapper[4825]: I0310 06:48:30.597125 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-dc89489b9-b4slr" Mar 10 06:48:30 crc kubenswrapper[4825]: I0310 06:48:30.600921 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.600892704 podStartE2EDuration="2.600892704s" podCreationTimestamp="2026-03-10 06:48:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:48:30.594336115 +0000 UTC m=+263.624116730" watchObservedRunningTime="2026-03-10 06:48:30.600892704 +0000 UTC m=+263.630673319" Mar 10 06:48:30 crc kubenswrapper[4825]: I0310 06:48:30.636699 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-pj5dl" podStartSLOduration=210.636593485 
podStartE2EDuration="3m30.636593485s" podCreationTimestamp="2026-03-10 06:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:48:30.635023947 +0000 UTC m=+263.664804562" watchObservedRunningTime="2026-03-10 06:48:30.636593485 +0000 UTC m=+263.666374100" Mar 10 06:48:30 crc kubenswrapper[4825]: I0310 06:48:30.637727 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-dc89489b9-b4slr" podStartSLOduration=21.637718539 podStartE2EDuration="21.637718539s" podCreationTimestamp="2026-03-10 06:48:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:48:30.621045184 +0000 UTC m=+263.650825799" watchObservedRunningTime="2026-03-10 06:48:30.637718539 +0000 UTC m=+263.667499154" Mar 10 06:48:30 crc kubenswrapper[4825]: I0310 06:48:30.771164 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-78845db449-qk8kh"] Mar 10 06:48:30 crc kubenswrapper[4825]: I0310 06:48:30.772972 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-78845db449-qk8kh" Mar 10 06:48:30 crc kubenswrapper[4825]: I0310 06:48:30.774147 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-556c9dd674-99mg2"] Mar 10 06:48:30 crc kubenswrapper[4825]: I0310 06:48:30.781414 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 10 06:48:30 crc kubenswrapper[4825]: I0310 06:48:30.784494 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 10 06:48:30 crc kubenswrapper[4825]: I0310 06:48:30.784769 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 10 06:48:30 crc kubenswrapper[4825]: I0310 06:48:30.785532 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 10 06:48:30 crc kubenswrapper[4825]: I0310 06:48:30.785591 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 10 06:48:30 crc kubenswrapper[4825]: I0310 06:48:30.785528 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 10 06:48:30 crc kubenswrapper[4825]: I0310 06:48:30.785775 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 10 06:48:30 crc kubenswrapper[4825]: I0310 06:48:30.788245 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-556c9dd674-99mg2"] Mar 10 06:48:30 crc kubenswrapper[4825]: I0310 06:48:30.827538 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-78845db449-qk8kh"] Mar 10 06:48:30 crc 
kubenswrapper[4825]: I0310 06:48:30.881564 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e3bbdbd-ab62-4e56-94b2-925c13ada6f5-config\") pod \"controller-manager-78845db449-qk8kh\" (UID: \"7e3bbdbd-ab62-4e56-94b2-925c13ada6f5\") " pod="openshift-controller-manager/controller-manager-78845db449-qk8kh" Mar 10 06:48:30 crc kubenswrapper[4825]: I0310 06:48:30.881680 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7e3bbdbd-ab62-4e56-94b2-925c13ada6f5-proxy-ca-bundles\") pod \"controller-manager-78845db449-qk8kh\" (UID: \"7e3bbdbd-ab62-4e56-94b2-925c13ada6f5\") " pod="openshift-controller-manager/controller-manager-78845db449-qk8kh" Mar 10 06:48:30 crc kubenswrapper[4825]: I0310 06:48:30.881766 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e3bbdbd-ab62-4e56-94b2-925c13ada6f5-client-ca\") pod \"controller-manager-78845db449-qk8kh\" (UID: \"7e3bbdbd-ab62-4e56-94b2-925c13ada6f5\") " pod="openshift-controller-manager/controller-manager-78845db449-qk8kh" Mar 10 06:48:30 crc kubenswrapper[4825]: I0310 06:48:30.881790 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76msk\" (UniqueName: \"kubernetes.io/projected/7e3bbdbd-ab62-4e56-94b2-925c13ada6f5-kube-api-access-76msk\") pod \"controller-manager-78845db449-qk8kh\" (UID: \"7e3bbdbd-ab62-4e56-94b2-925c13ada6f5\") " pod="openshift-controller-manager/controller-manager-78845db449-qk8kh" Mar 10 06:48:30 crc kubenswrapper[4825]: I0310 06:48:30.881820 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7e3bbdbd-ab62-4e56-94b2-925c13ada6f5-serving-cert\") pod \"controller-manager-78845db449-qk8kh\" (UID: \"7e3bbdbd-ab62-4e56-94b2-925c13ada6f5\") " pod="openshift-controller-manager/controller-manager-78845db449-qk8kh" Mar 10 06:48:30 crc kubenswrapper[4825]: I0310 06:48:30.982948 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e3bbdbd-ab62-4e56-94b2-925c13ada6f5-client-ca\") pod \"controller-manager-78845db449-qk8kh\" (UID: \"7e3bbdbd-ab62-4e56-94b2-925c13ada6f5\") " pod="openshift-controller-manager/controller-manager-78845db449-qk8kh" Mar 10 06:48:30 crc kubenswrapper[4825]: I0310 06:48:30.983056 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76msk\" (UniqueName: \"kubernetes.io/projected/7e3bbdbd-ab62-4e56-94b2-925c13ada6f5-kube-api-access-76msk\") pod \"controller-manager-78845db449-qk8kh\" (UID: \"7e3bbdbd-ab62-4e56-94b2-925c13ada6f5\") " pod="openshift-controller-manager/controller-manager-78845db449-qk8kh" Mar 10 06:48:30 crc kubenswrapper[4825]: I0310 06:48:30.983597 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e3bbdbd-ab62-4e56-94b2-925c13ada6f5-serving-cert\") pod \"controller-manager-78845db449-qk8kh\" (UID: \"7e3bbdbd-ab62-4e56-94b2-925c13ada6f5\") " pod="openshift-controller-manager/controller-manager-78845db449-qk8kh" Mar 10 06:48:30 crc kubenswrapper[4825]: I0310 06:48:30.985450 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e3bbdbd-ab62-4e56-94b2-925c13ada6f5-client-ca\") pod \"controller-manager-78845db449-qk8kh\" (UID: \"7e3bbdbd-ab62-4e56-94b2-925c13ada6f5\") " pod="openshift-controller-manager/controller-manager-78845db449-qk8kh" Mar 10 06:48:30 crc kubenswrapper[4825]: I0310 06:48:30.985926 4825 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e3bbdbd-ab62-4e56-94b2-925c13ada6f5-config\") pod \"controller-manager-78845db449-qk8kh\" (UID: \"7e3bbdbd-ab62-4e56-94b2-925c13ada6f5\") " pod="openshift-controller-manager/controller-manager-78845db449-qk8kh" Mar 10 06:48:30 crc kubenswrapper[4825]: I0310 06:48:30.984061 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e3bbdbd-ab62-4e56-94b2-925c13ada6f5-config\") pod \"controller-manager-78845db449-qk8kh\" (UID: \"7e3bbdbd-ab62-4e56-94b2-925c13ada6f5\") " pod="openshift-controller-manager/controller-manager-78845db449-qk8kh" Mar 10 06:48:30 crc kubenswrapper[4825]: I0310 06:48:30.986224 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7e3bbdbd-ab62-4e56-94b2-925c13ada6f5-proxy-ca-bundles\") pod \"controller-manager-78845db449-qk8kh\" (UID: \"7e3bbdbd-ab62-4e56-94b2-925c13ada6f5\") " pod="openshift-controller-manager/controller-manager-78845db449-qk8kh" Mar 10 06:48:30 crc kubenswrapper[4825]: I0310 06:48:30.987387 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7e3bbdbd-ab62-4e56-94b2-925c13ada6f5-proxy-ca-bundles\") pod \"controller-manager-78845db449-qk8kh\" (UID: \"7e3bbdbd-ab62-4e56-94b2-925c13ada6f5\") " pod="openshift-controller-manager/controller-manager-78845db449-qk8kh" Mar 10 06:48:30 crc kubenswrapper[4825]: I0310 06:48:30.991930 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e3bbdbd-ab62-4e56-94b2-925c13ada6f5-serving-cert\") pod \"controller-manager-78845db449-qk8kh\" (UID: \"7e3bbdbd-ab62-4e56-94b2-925c13ada6f5\") " pod="openshift-controller-manager/controller-manager-78845db449-qk8kh" Mar 10 
06:48:31 crc kubenswrapper[4825]: I0310 06:48:31.002034 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76msk\" (UniqueName: \"kubernetes.io/projected/7e3bbdbd-ab62-4e56-94b2-925c13ada6f5-kube-api-access-76msk\") pod \"controller-manager-78845db449-qk8kh\" (UID: \"7e3bbdbd-ab62-4e56-94b2-925c13ada6f5\") " pod="openshift-controller-manager/controller-manager-78845db449-qk8kh" Mar 10 06:48:31 crc kubenswrapper[4825]: I0310 06:48:31.035814 4825 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-04 18:58:52.720386991 +0000 UTC Mar 10 06:48:31 crc kubenswrapper[4825]: I0310 06:48:31.035868 4825 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7212h10m21.684521624s for next certificate rotation Mar 10 06:48:31 crc kubenswrapper[4825]: I0310 06:48:31.062776 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dc89489b9-b4slr" Mar 10 06:48:31 crc kubenswrapper[4825]: I0310 06:48:31.111978 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-78845db449-qk8kh" Mar 10 06:48:31 crc kubenswrapper[4825]: I0310 06:48:31.190635 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7bf472b-4206-4d33-832b-38b84b2760ae-config\") pod \"c7bf472b-4206-4d33-832b-38b84b2760ae\" (UID: \"c7bf472b-4206-4d33-832b-38b84b2760ae\") " Mar 10 06:48:31 crc kubenswrapper[4825]: I0310 06:48:31.190896 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvtsh\" (UniqueName: \"kubernetes.io/projected/c7bf472b-4206-4d33-832b-38b84b2760ae-kube-api-access-lvtsh\") pod \"c7bf472b-4206-4d33-832b-38b84b2760ae\" (UID: \"c7bf472b-4206-4d33-832b-38b84b2760ae\") " Mar 10 06:48:31 crc kubenswrapper[4825]: I0310 06:48:31.190950 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7bf472b-4206-4d33-832b-38b84b2760ae-client-ca\") pod \"c7bf472b-4206-4d33-832b-38b84b2760ae\" (UID: \"c7bf472b-4206-4d33-832b-38b84b2760ae\") " Mar 10 06:48:31 crc kubenswrapper[4825]: I0310 06:48:31.191033 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7bf472b-4206-4d33-832b-38b84b2760ae-serving-cert\") pod \"c7bf472b-4206-4d33-832b-38b84b2760ae\" (UID: \"c7bf472b-4206-4d33-832b-38b84b2760ae\") " Mar 10 06:48:31 crc kubenswrapper[4825]: I0310 06:48:31.191816 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7bf472b-4206-4d33-832b-38b84b2760ae-config" (OuterVolumeSpecName: "config") pod "c7bf472b-4206-4d33-832b-38b84b2760ae" (UID: "c7bf472b-4206-4d33-832b-38b84b2760ae"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:48:31 crc kubenswrapper[4825]: I0310 06:48:31.191868 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7bf472b-4206-4d33-832b-38b84b2760ae-client-ca" (OuterVolumeSpecName: "client-ca") pod "c7bf472b-4206-4d33-832b-38b84b2760ae" (UID: "c7bf472b-4206-4d33-832b-38b84b2760ae"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:48:31 crc kubenswrapper[4825]: I0310 06:48:31.195595 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7bf472b-4206-4d33-832b-38b84b2760ae-kube-api-access-lvtsh" (OuterVolumeSpecName: "kube-api-access-lvtsh") pod "c7bf472b-4206-4d33-832b-38b84b2760ae" (UID: "c7bf472b-4206-4d33-832b-38b84b2760ae"). InnerVolumeSpecName "kube-api-access-lvtsh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:48:31 crc kubenswrapper[4825]: I0310 06:48:31.196505 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7bf472b-4206-4d33-832b-38b84b2760ae-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c7bf472b-4206-4d33-832b-38b84b2760ae" (UID: "c7bf472b-4206-4d33-832b-38b84b2760ae"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:48:31 crc kubenswrapper[4825]: I0310 06:48:31.248387 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64443acf-ea44-4a9a-92c9-c7daf622140c" path="/var/lib/kubelet/pods/64443acf-ea44-4a9a-92c9-c7daf622140c/volumes" Mar 10 06:48:31 crc kubenswrapper[4825]: I0310 06:48:31.248832 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f86d9d2c-0701-4d7a-926f-64f7733767d4" path="/var/lib/kubelet/pods/f86d9d2c-0701-4d7a-926f-64f7733767d4/volumes" Mar 10 06:48:31 crc kubenswrapper[4825]: I0310 06:48:31.297257 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvtsh\" (UniqueName: \"kubernetes.io/projected/c7bf472b-4206-4d33-832b-38b84b2760ae-kube-api-access-lvtsh\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:31 crc kubenswrapper[4825]: I0310 06:48:31.297310 4825 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7bf472b-4206-4d33-832b-38b84b2760ae-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:31 crc kubenswrapper[4825]: I0310 06:48:31.297321 4825 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7bf472b-4206-4d33-832b-38b84b2760ae-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:31 crc kubenswrapper[4825]: I0310 06:48:31.297330 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7bf472b-4206-4d33-832b-38b84b2760ae-config\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:31 crc kubenswrapper[4825]: I0310 06:48:31.362182 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-78845db449-qk8kh"] Mar 10 06:48:31 crc kubenswrapper[4825]: I0310 06:48:31.604621 4825 generic.go:334] "Generic (PLEG): container finished" podID="c7bf472b-4206-4d33-832b-38b84b2760ae" 
containerID="78dcd9bba494d9f8c139148c4beefc5e70a1eca442eb33d8c462b63f28f890a0" exitCode=0 Mar 10 06:48:31 crc kubenswrapper[4825]: I0310 06:48:31.604719 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-dc89489b9-b4slr" event={"ID":"c7bf472b-4206-4d33-832b-38b84b2760ae","Type":"ContainerDied","Data":"78dcd9bba494d9f8c139148c4beefc5e70a1eca442eb33d8c462b63f28f890a0"} Mar 10 06:48:31 crc kubenswrapper[4825]: I0310 06:48:31.605188 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-dc89489b9-b4slr" event={"ID":"c7bf472b-4206-4d33-832b-38b84b2760ae","Type":"ContainerDied","Data":"defeedf5ce37519bdd26e06f9c22da77b1ff7e7e004879303ce6b7a79927ce9f"} Mar 10 06:48:31 crc kubenswrapper[4825]: I0310 06:48:31.604772 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dc89489b9-b4slr" Mar 10 06:48:31 crc kubenswrapper[4825]: I0310 06:48:31.605215 4825 scope.go:117] "RemoveContainer" containerID="78dcd9bba494d9f8c139148c4beefc5e70a1eca442eb33d8c462b63f28f890a0" Mar 10 06:48:31 crc kubenswrapper[4825]: I0310 06:48:31.611302 4825 generic.go:334] "Generic (PLEG): container finished" podID="7b0c2a90-0982-47a7-a77c-720eea591481" containerID="22e4774bf3dd7adc454a5558066f8d92ffaf47a28e354f2feb2c6c92d19f79ab" exitCode=0 Mar 10 06:48:31 crc kubenswrapper[4825]: I0310 06:48:31.611582 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"7b0c2a90-0982-47a7-a77c-720eea591481","Type":"ContainerDied","Data":"22e4774bf3dd7adc454a5558066f8d92ffaf47a28e354f2feb2c6c92d19f79ab"} Mar 10 06:48:31 crc kubenswrapper[4825]: I0310 06:48:31.617253 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78845db449-qk8kh" 
event={"ID":"7e3bbdbd-ab62-4e56-94b2-925c13ada6f5","Type":"ContainerStarted","Data":"7db23a7b0546bb7ba7d9ac8e3281c144fdc2e557310be04c61bf6ac3a7f13f66"} Mar 10 06:48:31 crc kubenswrapper[4825]: I0310 06:48:31.617524 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78845db449-qk8kh" event={"ID":"7e3bbdbd-ab62-4e56-94b2-925c13ada6f5","Type":"ContainerStarted","Data":"da094da924366153496b4ac0066ab2797ee482423c5f98a49e235487508d1b6e"} Mar 10 06:48:31 crc kubenswrapper[4825]: I0310 06:48:31.618094 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-78845db449-qk8kh" Mar 10 06:48:31 crc kubenswrapper[4825]: I0310 06:48:31.620936 4825 patch_prober.go:28] interesting pod/controller-manager-78845db449-qk8kh container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.62:8443/healthz\": dial tcp 10.217.0.62:8443: connect: connection refused" start-of-body= Mar 10 06:48:31 crc kubenswrapper[4825]: I0310 06:48:31.621003 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-78845db449-qk8kh" podUID="7e3bbdbd-ab62-4e56-94b2-925c13ada6f5" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.62:8443/healthz\": dial tcp 10.217.0.62:8443: connect: connection refused" Mar 10 06:48:31 crc kubenswrapper[4825]: I0310 06:48:31.651950 4825 scope.go:117] "RemoveContainer" containerID="78dcd9bba494d9f8c139148c4beefc5e70a1eca442eb33d8c462b63f28f890a0" Mar 10 06:48:31 crc kubenswrapper[4825]: E0310 06:48:31.652673 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78dcd9bba494d9f8c139148c4beefc5e70a1eca442eb33d8c462b63f28f890a0\": container with ID starting with 78dcd9bba494d9f8c139148c4beefc5e70a1eca442eb33d8c462b63f28f890a0 not 
found: ID does not exist" containerID="78dcd9bba494d9f8c139148c4beefc5e70a1eca442eb33d8c462b63f28f890a0" Mar 10 06:48:31 crc kubenswrapper[4825]: I0310 06:48:31.652740 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78dcd9bba494d9f8c139148c4beefc5e70a1eca442eb33d8c462b63f28f890a0"} err="failed to get container status \"78dcd9bba494d9f8c139148c4beefc5e70a1eca442eb33d8c462b63f28f890a0\": rpc error: code = NotFound desc = could not find container \"78dcd9bba494d9f8c139148c4beefc5e70a1eca442eb33d8c462b63f28f890a0\": container with ID starting with 78dcd9bba494d9f8c139148c4beefc5e70a1eca442eb33d8c462b63f28f890a0 not found: ID does not exist" Mar 10 06:48:31 crc kubenswrapper[4825]: I0310 06:48:31.660109 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dc89489b9-b4slr"] Mar 10 06:48:31 crc kubenswrapper[4825]: I0310 06:48:31.669627 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dc89489b9-b4slr"] Mar 10 06:48:31 crc kubenswrapper[4825]: I0310 06:48:31.689657 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-78845db449-qk8kh" podStartSLOduration=2.689630296 podStartE2EDuration="2.689630296s" podCreationTimestamp="2026-03-10 06:48:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:48:31.684942144 +0000 UTC m=+264.714722769" watchObservedRunningTime="2026-03-10 06:48:31.689630296 +0000 UTC m=+264.719410911" Mar 10 06:48:31 crc kubenswrapper[4825]: I0310 06:48:31.887056 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552088-7fs5s" Mar 10 06:48:32 crc kubenswrapper[4825]: I0310 06:48:32.013068 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bs2mb\" (UniqueName: \"kubernetes.io/projected/ccb47913-9550-4871-9cf8-47fa8734b406-kube-api-access-bs2mb\") pod \"ccb47913-9550-4871-9cf8-47fa8734b406\" (UID: \"ccb47913-9550-4871-9cf8-47fa8734b406\") " Mar 10 06:48:32 crc kubenswrapper[4825]: I0310 06:48:32.028487 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccb47913-9550-4871-9cf8-47fa8734b406-kube-api-access-bs2mb" (OuterVolumeSpecName: "kube-api-access-bs2mb") pod "ccb47913-9550-4871-9cf8-47fa8734b406" (UID: "ccb47913-9550-4871-9cf8-47fa8734b406"). InnerVolumeSpecName "kube-api-access-bs2mb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:48:32 crc kubenswrapper[4825]: I0310 06:48:32.036532 4825 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-16 17:30:42.288916856 +0000 UTC Mar 10 06:48:32 crc kubenswrapper[4825]: I0310 06:48:32.036634 4825 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7498h42m10.252285191s for next certificate rotation Mar 10 06:48:32 crc kubenswrapper[4825]: I0310 06:48:32.115236 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bs2mb\" (UniqueName: \"kubernetes.io/projected/ccb47913-9550-4871-9cf8-47fa8734b406-kube-api-access-bs2mb\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:32 crc kubenswrapper[4825]: I0310 06:48:32.628060 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4gwdl" event={"ID":"f7b359b6-5dbd-4270-9195-a355b8ce3dbd","Type":"ContainerStarted","Data":"962de788726a07017b622672c07b4be37a71c9ca654aec23a2069145d527f6e1"} Mar 10 06:48:32 crc kubenswrapper[4825]: I0310 
06:48:32.629650 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552088-7fs5s" event={"ID":"ccb47913-9550-4871-9cf8-47fa8734b406","Type":"ContainerDied","Data":"d0cf47b9651893ad08797a3f6518ec11e7a315a4ff2ae3e2539ce055ddc7aaad"} Mar 10 06:48:32 crc kubenswrapper[4825]: I0310 06:48:32.629707 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0cf47b9651893ad08797a3f6518ec11e7a315a4ff2ae3e2539ce055ddc7aaad" Mar 10 06:48:32 crc kubenswrapper[4825]: I0310 06:48:32.629685 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552088-7fs5s" Mar 10 06:48:32 crc kubenswrapper[4825]: I0310 06:48:32.639669 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-78845db449-qk8kh" Mar 10 06:48:32 crc kubenswrapper[4825]: I0310 06:48:32.677172 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4gwdl" podStartSLOduration=4.010366073 podStartE2EDuration="42.677124772s" podCreationTimestamp="2026-03-10 06:47:50 +0000 UTC" firstStartedPulling="2026-03-10 06:47:52.986759533 +0000 UTC m=+226.016540148" lastFinishedPulling="2026-03-10 06:48:31.653518232 +0000 UTC m=+264.683298847" observedRunningTime="2026-03-10 06:48:32.65296968 +0000 UTC m=+265.682750305" watchObservedRunningTime="2026-03-10 06:48:32.677124772 +0000 UTC m=+265.706905397" Mar 10 06:48:32 crc kubenswrapper[4825]: I0310 06:48:32.996057 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 06:48:33 crc kubenswrapper[4825]: I0310 06:48:33.134195 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7b0c2a90-0982-47a7-a77c-720eea591481-kubelet-dir\") pod \"7b0c2a90-0982-47a7-a77c-720eea591481\" (UID: \"7b0c2a90-0982-47a7-a77c-720eea591481\") " Mar 10 06:48:33 crc kubenswrapper[4825]: I0310 06:48:33.134337 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7b0c2a90-0982-47a7-a77c-720eea591481-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7b0c2a90-0982-47a7-a77c-720eea591481" (UID: "7b0c2a90-0982-47a7-a77c-720eea591481"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 06:48:33 crc kubenswrapper[4825]: I0310 06:48:33.134434 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7b0c2a90-0982-47a7-a77c-720eea591481-kube-api-access\") pod \"7b0c2a90-0982-47a7-a77c-720eea591481\" (UID: \"7b0c2a90-0982-47a7-a77c-720eea591481\") " Mar 10 06:48:33 crc kubenswrapper[4825]: I0310 06:48:33.134774 4825 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7b0c2a90-0982-47a7-a77c-720eea591481-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:33 crc kubenswrapper[4825]: I0310 06:48:33.147364 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b0c2a90-0982-47a7-a77c-720eea591481-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7b0c2a90-0982-47a7-a77c-720eea591481" (UID: "7b0c2a90-0982-47a7-a77c-720eea591481"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:48:33 crc kubenswrapper[4825]: I0310 06:48:33.236542 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7b0c2a90-0982-47a7-a77c-720eea591481-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:33 crc kubenswrapper[4825]: I0310 06:48:33.242622 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7bf472b-4206-4d33-832b-38b84b2760ae" path="/var/lib/kubelet/pods/c7bf472b-4206-4d33-832b-38b84b2760ae/volumes" Mar 10 06:48:33 crc kubenswrapper[4825]: I0310 06:48:33.477225 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55d9d79d49-6tqpt"] Mar 10 06:48:33 crc kubenswrapper[4825]: E0310 06:48:33.477599 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7bf472b-4206-4d33-832b-38b84b2760ae" containerName="route-controller-manager" Mar 10 06:48:33 crc kubenswrapper[4825]: I0310 06:48:33.477624 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7bf472b-4206-4d33-832b-38b84b2760ae" containerName="route-controller-manager" Mar 10 06:48:33 crc kubenswrapper[4825]: E0310 06:48:33.477637 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccb47913-9550-4871-9cf8-47fa8734b406" containerName="oc" Mar 10 06:48:33 crc kubenswrapper[4825]: I0310 06:48:33.477648 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccb47913-9550-4871-9cf8-47fa8734b406" containerName="oc" Mar 10 06:48:33 crc kubenswrapper[4825]: E0310 06:48:33.477817 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b0c2a90-0982-47a7-a77c-720eea591481" containerName="pruner" Mar 10 06:48:33 crc kubenswrapper[4825]: I0310 06:48:33.477833 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b0c2a90-0982-47a7-a77c-720eea591481" containerName="pruner" Mar 10 06:48:33 crc kubenswrapper[4825]: I0310 
06:48:33.477976 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7bf472b-4206-4d33-832b-38b84b2760ae" containerName="route-controller-manager" Mar 10 06:48:33 crc kubenswrapper[4825]: I0310 06:48:33.477992 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccb47913-9550-4871-9cf8-47fa8734b406" containerName="oc" Mar 10 06:48:33 crc kubenswrapper[4825]: I0310 06:48:33.478005 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b0c2a90-0982-47a7-a77c-720eea591481" containerName="pruner" Mar 10 06:48:33 crc kubenswrapper[4825]: I0310 06:48:33.479103 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55d9d79d49-6tqpt" Mar 10 06:48:33 crc kubenswrapper[4825]: I0310 06:48:33.483992 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 10 06:48:33 crc kubenswrapper[4825]: I0310 06:48:33.484449 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 10 06:48:33 crc kubenswrapper[4825]: I0310 06:48:33.485184 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 10 06:48:33 crc kubenswrapper[4825]: I0310 06:48:33.485243 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 06:48:33 crc kubenswrapper[4825]: I0310 06:48:33.486174 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 10 06:48:33 crc kubenswrapper[4825]: I0310 06:48:33.486698 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 10 06:48:33 crc kubenswrapper[4825]: I0310 06:48:33.496209 4825 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55d9d79d49-6tqpt"] Mar 10 06:48:33 crc kubenswrapper[4825]: I0310 06:48:33.638453 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"7b0c2a90-0982-47a7-a77c-720eea591481","Type":"ContainerDied","Data":"05625360749fdd658de573fd14ba9c1db43f413a3765e253df47007e0923a4a6"} Mar 10 06:48:33 crc kubenswrapper[4825]: I0310 06:48:33.638523 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05625360749fdd658de573fd14ba9c1db43f413a3765e253df47007e0923a4a6" Mar 10 06:48:33 crc kubenswrapper[4825]: I0310 06:48:33.638539 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 06:48:33 crc kubenswrapper[4825]: I0310 06:48:33.642632 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gn2f\" (UniqueName: \"kubernetes.io/projected/49f3cdc9-d395-4b9b-abc5-4de670c497a2-kube-api-access-2gn2f\") pod \"route-controller-manager-55d9d79d49-6tqpt\" (UID: \"49f3cdc9-d395-4b9b-abc5-4de670c497a2\") " pod="openshift-route-controller-manager/route-controller-manager-55d9d79d49-6tqpt" Mar 10 06:48:33 crc kubenswrapper[4825]: I0310 06:48:33.642774 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49f3cdc9-d395-4b9b-abc5-4de670c497a2-config\") pod \"route-controller-manager-55d9d79d49-6tqpt\" (UID: \"49f3cdc9-d395-4b9b-abc5-4de670c497a2\") " pod="openshift-route-controller-manager/route-controller-manager-55d9d79d49-6tqpt" Mar 10 06:48:33 crc kubenswrapper[4825]: I0310 06:48:33.642838 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/49f3cdc9-d395-4b9b-abc5-4de670c497a2-serving-cert\") pod \"route-controller-manager-55d9d79d49-6tqpt\" (UID: \"49f3cdc9-d395-4b9b-abc5-4de670c497a2\") " pod="openshift-route-controller-manager/route-controller-manager-55d9d79d49-6tqpt" Mar 10 06:48:33 crc kubenswrapper[4825]: I0310 06:48:33.642867 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/49f3cdc9-d395-4b9b-abc5-4de670c497a2-client-ca\") pod \"route-controller-manager-55d9d79d49-6tqpt\" (UID: \"49f3cdc9-d395-4b9b-abc5-4de670c497a2\") " pod="openshift-route-controller-manager/route-controller-manager-55d9d79d49-6tqpt" Mar 10 06:48:33 crc kubenswrapper[4825]: I0310 06:48:33.745279 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49f3cdc9-d395-4b9b-abc5-4de670c497a2-serving-cert\") pod \"route-controller-manager-55d9d79d49-6tqpt\" (UID: \"49f3cdc9-d395-4b9b-abc5-4de670c497a2\") " pod="openshift-route-controller-manager/route-controller-manager-55d9d79d49-6tqpt" Mar 10 06:48:33 crc kubenswrapper[4825]: I0310 06:48:33.745368 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/49f3cdc9-d395-4b9b-abc5-4de670c497a2-client-ca\") pod \"route-controller-manager-55d9d79d49-6tqpt\" (UID: \"49f3cdc9-d395-4b9b-abc5-4de670c497a2\") " pod="openshift-route-controller-manager/route-controller-manager-55d9d79d49-6tqpt" Mar 10 06:48:33 crc kubenswrapper[4825]: I0310 06:48:33.745515 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gn2f\" (UniqueName: \"kubernetes.io/projected/49f3cdc9-d395-4b9b-abc5-4de670c497a2-kube-api-access-2gn2f\") pod \"route-controller-manager-55d9d79d49-6tqpt\" (UID: \"49f3cdc9-d395-4b9b-abc5-4de670c497a2\") " 
pod="openshift-route-controller-manager/route-controller-manager-55d9d79d49-6tqpt" Mar 10 06:48:33 crc kubenswrapper[4825]: I0310 06:48:33.745734 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49f3cdc9-d395-4b9b-abc5-4de670c497a2-config\") pod \"route-controller-manager-55d9d79d49-6tqpt\" (UID: \"49f3cdc9-d395-4b9b-abc5-4de670c497a2\") " pod="openshift-route-controller-manager/route-controller-manager-55d9d79d49-6tqpt" Mar 10 06:48:33 crc kubenswrapper[4825]: I0310 06:48:33.746762 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/49f3cdc9-d395-4b9b-abc5-4de670c497a2-client-ca\") pod \"route-controller-manager-55d9d79d49-6tqpt\" (UID: \"49f3cdc9-d395-4b9b-abc5-4de670c497a2\") " pod="openshift-route-controller-manager/route-controller-manager-55d9d79d49-6tqpt" Mar 10 06:48:33 crc kubenswrapper[4825]: I0310 06:48:33.747109 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49f3cdc9-d395-4b9b-abc5-4de670c497a2-config\") pod \"route-controller-manager-55d9d79d49-6tqpt\" (UID: \"49f3cdc9-d395-4b9b-abc5-4de670c497a2\") " pod="openshift-route-controller-manager/route-controller-manager-55d9d79d49-6tqpt" Mar 10 06:48:33 crc kubenswrapper[4825]: I0310 06:48:33.751842 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49f3cdc9-d395-4b9b-abc5-4de670c497a2-serving-cert\") pod \"route-controller-manager-55d9d79d49-6tqpt\" (UID: \"49f3cdc9-d395-4b9b-abc5-4de670c497a2\") " pod="openshift-route-controller-manager/route-controller-manager-55d9d79d49-6tqpt" Mar 10 06:48:33 crc kubenswrapper[4825]: I0310 06:48:33.767945 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gn2f\" (UniqueName: 
\"kubernetes.io/projected/49f3cdc9-d395-4b9b-abc5-4de670c497a2-kube-api-access-2gn2f\") pod \"route-controller-manager-55d9d79d49-6tqpt\" (UID: \"49f3cdc9-d395-4b9b-abc5-4de670c497a2\") " pod="openshift-route-controller-manager/route-controller-manager-55d9d79d49-6tqpt" Mar 10 06:48:33 crc kubenswrapper[4825]: I0310 06:48:33.794093 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55d9d79d49-6tqpt" Mar 10 06:48:34 crc kubenswrapper[4825]: I0310 06:48:34.136342 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55d9d79d49-6tqpt"] Mar 10 06:48:34 crc kubenswrapper[4825]: I0310 06:48:34.644385 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55d9d79d49-6tqpt" event={"ID":"49f3cdc9-d395-4b9b-abc5-4de670c497a2","Type":"ContainerStarted","Data":"ed716500bb2415c3460225d70822efcdf82ca949ebbfb0cf63e53bbb37d85bd7"} Mar 10 06:48:34 crc kubenswrapper[4825]: I0310 06:48:34.644446 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55d9d79d49-6tqpt" event={"ID":"49f3cdc9-d395-4b9b-abc5-4de670c497a2","Type":"ContainerStarted","Data":"b0470ca555a506e96d9cbc8a6e2a89895a55e604ae4b32f551d10d6d34ba1547"} Mar 10 06:48:34 crc kubenswrapper[4825]: I0310 06:48:34.644743 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-55d9d79d49-6tqpt" Mar 10 06:48:34 crc kubenswrapper[4825]: I0310 06:48:34.678274 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-55d9d79d49-6tqpt" podStartSLOduration=5.678243194 podStartE2EDuration="5.678243194s" podCreationTimestamp="2026-03-10 06:48:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:48:34.670767028 +0000 UTC m=+267.700547673" watchObservedRunningTime="2026-03-10 06:48:34.678243194 +0000 UTC m=+267.708023839" Mar 10 06:48:35 crc kubenswrapper[4825]: I0310 06:48:35.162396 4825 patch_prober.go:28] interesting pod/route-controller-manager-55d9d79d49-6tqpt container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 10 06:48:35 crc kubenswrapper[4825]: [+]log ok Mar 10 06:48:35 crc kubenswrapper[4825]: [-]poststarthook/max-in-flight-filter failed: reason withheld Mar 10 06:48:35 crc kubenswrapper[4825]: [-]poststarthook/storage-object-count-tracker-hook failed: reason withheld Mar 10 06:48:35 crc kubenswrapper[4825]: healthz check failed Mar 10 06:48:35 crc kubenswrapper[4825]: I0310 06:48:35.162496 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-55d9d79d49-6tqpt" podUID="49f3cdc9-d395-4b9b-abc5-4de670c497a2" containerName="route-controller-manager" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 06:48:35 crc kubenswrapper[4825]: I0310 06:48:35.655311 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-55d9d79d49-6tqpt" Mar 10 06:48:37 crc kubenswrapper[4825]: I0310 06:48:37.431304 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 10 06:48:37 crc kubenswrapper[4825]: I0310 06:48:37.432105 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 10 06:48:37 crc kubenswrapper[4825]: I0310 06:48:37.436780 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 10 06:48:37 crc kubenswrapper[4825]: I0310 06:48:37.437905 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 10 06:48:37 crc kubenswrapper[4825]: I0310 06:48:37.451409 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 10 06:48:37 crc kubenswrapper[4825]: I0310 06:48:37.627699 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c5d3354d-1386-4f42-9c53-134c6583650d-var-lock\") pod \"installer-9-crc\" (UID: \"c5d3354d-1386-4f42-9c53-134c6583650d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 06:48:37 crc kubenswrapper[4825]: I0310 06:48:37.627765 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5d3354d-1386-4f42-9c53-134c6583650d-kube-api-access\") pod \"installer-9-crc\" (UID: \"c5d3354d-1386-4f42-9c53-134c6583650d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 06:48:37 crc kubenswrapper[4825]: I0310 06:48:37.627823 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c5d3354d-1386-4f42-9c53-134c6583650d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"c5d3354d-1386-4f42-9c53-134c6583650d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 06:48:37 crc kubenswrapper[4825]: I0310 06:48:37.729205 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/c5d3354d-1386-4f42-9c53-134c6583650d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"c5d3354d-1386-4f42-9c53-134c6583650d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 06:48:37 crc kubenswrapper[4825]: I0310 06:48:37.729840 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c5d3354d-1386-4f42-9c53-134c6583650d-var-lock\") pod \"installer-9-crc\" (UID: \"c5d3354d-1386-4f42-9c53-134c6583650d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 06:48:37 crc kubenswrapper[4825]: I0310 06:48:37.729908 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5d3354d-1386-4f42-9c53-134c6583650d-kube-api-access\") pod \"installer-9-crc\" (UID: \"c5d3354d-1386-4f42-9c53-134c6583650d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 06:48:37 crc kubenswrapper[4825]: I0310 06:48:37.729383 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c5d3354d-1386-4f42-9c53-134c6583650d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"c5d3354d-1386-4f42-9c53-134c6583650d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 06:48:37 crc kubenswrapper[4825]: I0310 06:48:37.730481 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c5d3354d-1386-4f42-9c53-134c6583650d-var-lock\") pod \"installer-9-crc\" (UID: \"c5d3354d-1386-4f42-9c53-134c6583650d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 06:48:37 crc kubenswrapper[4825]: I0310 06:48:37.758400 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5d3354d-1386-4f42-9c53-134c6583650d-kube-api-access\") pod \"installer-9-crc\" (UID: \"c5d3354d-1386-4f42-9c53-134c6583650d\") " 
pod="openshift-kube-apiserver/installer-9-crc" Mar 10 06:48:38 crc kubenswrapper[4825]: I0310 06:48:38.051178 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 10 06:48:38 crc kubenswrapper[4825]: I0310 06:48:38.473038 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 10 06:48:38 crc kubenswrapper[4825]: I0310 06:48:38.671175 4825 generic.go:334] "Generic (PLEG): container finished" podID="19b3fb37-5072-45f0-8349-1296e31a1193" containerID="4f3fb183a2ed1f131e03c1dfc1a862219753e0f6b41ec9a24a609a228e351314" exitCode=0 Mar 10 06:48:38 crc kubenswrapper[4825]: I0310 06:48:38.671254 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552086-rkhdp" event={"ID":"19b3fb37-5072-45f0-8349-1296e31a1193","Type":"ContainerDied","Data":"4f3fb183a2ed1f131e03c1dfc1a862219753e0f6b41ec9a24a609a228e351314"} Mar 10 06:48:38 crc kubenswrapper[4825]: I0310 06:48:38.673611 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"c5d3354d-1386-4f42-9c53-134c6583650d","Type":"ContainerStarted","Data":"3cccda6dc9f8f6468ff7c1852aa62c04165f7c89056f5cb797e4cb74c8c06c19"} Mar 10 06:48:39 crc kubenswrapper[4825]: I0310 06:48:39.682853 4825 generic.go:334] "Generic (PLEG): container finished" podID="d232f8a7-f013-4a5c-a7dc-2150c4b3040c" containerID="ef60518a1963bb3bd15a97cfec6bf1d489c92025c5007d70c66942da30e96e37" exitCode=0 Mar 10 06:48:39 crc kubenswrapper[4825]: I0310 06:48:39.682964 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hbdjf" event={"ID":"d232f8a7-f013-4a5c-a7dc-2150c4b3040c","Type":"ContainerDied","Data":"ef60518a1963bb3bd15a97cfec6bf1d489c92025c5007d70c66942da30e96e37"} Mar 10 06:48:39 crc kubenswrapper[4825]: I0310 06:48:39.687976 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"c5d3354d-1386-4f42-9c53-134c6583650d","Type":"ContainerStarted","Data":"726fc181965abc85a7bc538762549b8ee17fd04b8c3ed4c1a01009331b2dfb70"} Mar 10 06:48:39 crc kubenswrapper[4825]: I0310 06:48:39.738585 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.738562584 podStartE2EDuration="2.738562584s" podCreationTimestamp="2026-03-10 06:48:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:48:39.737736119 +0000 UTC m=+272.767516744" watchObservedRunningTime="2026-03-10 06:48:39.738562584 +0000 UTC m=+272.768343209" Mar 10 06:48:40 crc kubenswrapper[4825]: I0310 06:48:40.063363 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552086-rkhdp" Mar 10 06:48:40 crc kubenswrapper[4825]: I0310 06:48:40.172439 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jh49\" (UniqueName: \"kubernetes.io/projected/19b3fb37-5072-45f0-8349-1296e31a1193-kube-api-access-6jh49\") pod \"19b3fb37-5072-45f0-8349-1296e31a1193\" (UID: \"19b3fb37-5072-45f0-8349-1296e31a1193\") " Mar 10 06:48:40 crc kubenswrapper[4825]: I0310 06:48:40.181473 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19b3fb37-5072-45f0-8349-1296e31a1193-kube-api-access-6jh49" (OuterVolumeSpecName: "kube-api-access-6jh49") pod "19b3fb37-5072-45f0-8349-1296e31a1193" (UID: "19b3fb37-5072-45f0-8349-1296e31a1193"). InnerVolumeSpecName "kube-api-access-6jh49". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:48:40 crc kubenswrapper[4825]: I0310 06:48:40.275356 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jh49\" (UniqueName: \"kubernetes.io/projected/19b3fb37-5072-45f0-8349-1296e31a1193-kube-api-access-6jh49\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:40 crc kubenswrapper[4825]: I0310 06:48:40.694281 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552086-rkhdp" Mar 10 06:48:40 crc kubenswrapper[4825]: I0310 06:48:40.694288 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552086-rkhdp" event={"ID":"19b3fb37-5072-45f0-8349-1296e31a1193","Type":"ContainerDied","Data":"917456c05b8f2e9daa4f600662c64fba8c5d0e8f30e258105fe8d3b141677a8a"} Mar 10 06:48:40 crc kubenswrapper[4825]: I0310 06:48:40.694358 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="917456c05b8f2e9daa4f600662c64fba8c5d0e8f30e258105fe8d3b141677a8a" Mar 10 06:48:40 crc kubenswrapper[4825]: I0310 06:48:40.698588 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hbdjf" event={"ID":"d232f8a7-f013-4a5c-a7dc-2150c4b3040c","Type":"ContainerStarted","Data":"067c6aa47d60fae5b55275d9207c32c6890352828f44c81dd8cb15ea46376e6c"} Mar 10 06:48:40 crc kubenswrapper[4825]: I0310 06:48:40.728185 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hbdjf" podStartSLOduration=3.432131651 podStartE2EDuration="50.728162473s" podCreationTimestamp="2026-03-10 06:47:50 +0000 UTC" firstStartedPulling="2026-03-10 06:47:52.986769593 +0000 UTC m=+226.016550198" lastFinishedPulling="2026-03-10 06:48:40.282800405 +0000 UTC m=+273.312581020" observedRunningTime="2026-03-10 06:48:40.723492151 +0000 UTC m=+273.753272786" watchObservedRunningTime="2026-03-10 
06:48:40.728162473 +0000 UTC m=+273.757943088" Mar 10 06:48:41 crc kubenswrapper[4825]: I0310 06:48:41.137325 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4gwdl" Mar 10 06:48:41 crc kubenswrapper[4825]: I0310 06:48:41.137814 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4gwdl" Mar 10 06:48:41 crc kubenswrapper[4825]: I0310 06:48:41.255301 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hbdjf" Mar 10 06:48:41 crc kubenswrapper[4825]: I0310 06:48:41.255349 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hbdjf" Mar 10 06:48:41 crc kubenswrapper[4825]: I0310 06:48:41.287488 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4gwdl" Mar 10 06:48:41 crc kubenswrapper[4825]: I0310 06:48:41.708348 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7h5g2" event={"ID":"27fbaf63-cf93-470e-b012-ad8bb403f65a","Type":"ContainerStarted","Data":"3e301dd8065a4a8ef3343728382f154d549d583f3cfc28e9fc21316a68215461"} Mar 10 06:48:41 crc kubenswrapper[4825]: I0310 06:48:41.711859 4825 generic.go:334] "Generic (PLEG): container finished" podID="e945d496-847a-4109-ae2d-41e169b241c2" containerID="505aed85c350b976d22668271546308dfb7ba915a5c98bc578f6a43d0ea182b0" exitCode=0 Mar 10 06:48:41 crc kubenswrapper[4825]: I0310 06:48:41.711898 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9zmmv" event={"ID":"e945d496-847a-4109-ae2d-41e169b241c2","Type":"ContainerDied","Data":"505aed85c350b976d22668271546308dfb7ba915a5c98bc578f6a43d0ea182b0"} Mar 10 06:48:41 crc kubenswrapper[4825]: I0310 06:48:41.785232 4825 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4gwdl" Mar 10 06:48:42 crc kubenswrapper[4825]: I0310 06:48:42.317552 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-hbdjf" podUID="d232f8a7-f013-4a5c-a7dc-2150c4b3040c" containerName="registry-server" probeResult="failure" output=< Mar 10 06:48:42 crc kubenswrapper[4825]: timeout: failed to connect service ":50051" within 1s Mar 10 06:48:42 crc kubenswrapper[4825]: > Mar 10 06:48:42 crc kubenswrapper[4825]: I0310 06:48:42.720672 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9zmmv" event={"ID":"e945d496-847a-4109-ae2d-41e169b241c2","Type":"ContainerStarted","Data":"58a324ade7abfa6edc714b16d5c4422ff524ed8541e992b8ddf9e249f3eb58a3"} Mar 10 06:48:42 crc kubenswrapper[4825]: I0310 06:48:42.723403 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2gg7k" event={"ID":"58d27fca-dd33-4753-9ea6-4b55c191b2b8","Type":"ContainerStarted","Data":"3d3c788e71e7ecc21bd432e78cfca9d09c70e1fabcd5fc612ecdd9e2c7cd1ef2"} Mar 10 06:48:42 crc kubenswrapper[4825]: I0310 06:48:42.727145 4825 generic.go:334] "Generic (PLEG): container finished" podID="27fbaf63-cf93-470e-b012-ad8bb403f65a" containerID="3e301dd8065a4a8ef3343728382f154d549d583f3cfc28e9fc21316a68215461" exitCode=0 Mar 10 06:48:42 crc kubenswrapper[4825]: I0310 06:48:42.727209 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7h5g2" event={"ID":"27fbaf63-cf93-470e-b012-ad8bb403f65a","Type":"ContainerDied","Data":"3e301dd8065a4a8ef3343728382f154d549d583f3cfc28e9fc21316a68215461"} Mar 10 06:48:42 crc kubenswrapper[4825]: I0310 06:48:42.730336 4825 generic.go:334] "Generic (PLEG): container finished" podID="823b95d6-3aa7-4fda-bd75-9fe7e95a209f" containerID="2b21d0117de4c1b41f35b94dbcdc46feb174d950c22f6f642ce38167b2226e4e" exitCode=0 Mar 10 06:48:42 
crc kubenswrapper[4825]: I0310 06:48:42.730868 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lfxbn" event={"ID":"823b95d6-3aa7-4fda-bd75-9fe7e95a209f","Type":"ContainerDied","Data":"2b21d0117de4c1b41f35b94dbcdc46feb174d950c22f6f642ce38167b2226e4e"} Mar 10 06:48:42 crc kubenswrapper[4825]: I0310 06:48:42.800845 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9zmmv" podStartSLOduration=4.647213392 podStartE2EDuration="52.800817643s" podCreationTimestamp="2026-03-10 06:47:50 +0000 UTC" firstStartedPulling="2026-03-10 06:47:54.044960233 +0000 UTC m=+227.074740848" lastFinishedPulling="2026-03-10 06:48:42.198564484 +0000 UTC m=+275.228345099" observedRunningTime="2026-03-10 06:48:42.759238474 +0000 UTC m=+275.789019079" watchObservedRunningTime="2026-03-10 06:48:42.800817643 +0000 UTC m=+275.830598258" Mar 10 06:48:43 crc kubenswrapper[4825]: I0310 06:48:43.741981 4825 generic.go:334] "Generic (PLEG): container finished" podID="33ed941e-9788-4c9e-bcd5-6af206460adb" containerID="41f362b98d4228d78d6329a4f4232a394052b85cdaf3037fad8d27e0907b4015" exitCode=0 Mar 10 06:48:43 crc kubenswrapper[4825]: I0310 06:48:43.742063 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lfhnn" event={"ID":"33ed941e-9788-4c9e-bcd5-6af206460adb","Type":"ContainerDied","Data":"41f362b98d4228d78d6329a4f4232a394052b85cdaf3037fad8d27e0907b4015"} Mar 10 06:48:43 crc kubenswrapper[4825]: I0310 06:48:43.747185 4825 generic.go:334] "Generic (PLEG): container finished" podID="58d27fca-dd33-4753-9ea6-4b55c191b2b8" containerID="3d3c788e71e7ecc21bd432e78cfca9d09c70e1fabcd5fc612ecdd9e2c7cd1ef2" exitCode=0 Mar 10 06:48:43 crc kubenswrapper[4825]: I0310 06:48:43.747234 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2gg7k" 
event={"ID":"58d27fca-dd33-4753-9ea6-4b55c191b2b8","Type":"ContainerDied","Data":"3d3c788e71e7ecc21bd432e78cfca9d09c70e1fabcd5fc612ecdd9e2c7cd1ef2"} Mar 10 06:48:43 crc kubenswrapper[4825]: I0310 06:48:43.750760 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7h5g2" event={"ID":"27fbaf63-cf93-470e-b012-ad8bb403f65a","Type":"ContainerStarted","Data":"3d3eb1c1fad78b075a8af2242d56cb89c43230c27301c8216b1bdeeb592c227e"} Mar 10 06:48:43 crc kubenswrapper[4825]: I0310 06:48:43.755772 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lfxbn" event={"ID":"823b95d6-3aa7-4fda-bd75-9fe7e95a209f","Type":"ContainerStarted","Data":"9551d524c53c2343b1dfcd7f6a4247a5495d670c3403a8839a30f91df10a8d70"} Mar 10 06:48:43 crc kubenswrapper[4825]: I0310 06:48:43.791410 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lfxbn" podStartSLOduration=4.059216924 podStartE2EDuration="51.791381902s" podCreationTimestamp="2026-03-10 06:47:52 +0000 UTC" firstStartedPulling="2026-03-10 06:47:55.423048178 +0000 UTC m=+228.452828793" lastFinishedPulling="2026-03-10 06:48:43.155213136 +0000 UTC m=+276.184993771" observedRunningTime="2026-03-10 06:48:43.786797723 +0000 UTC m=+276.816578338" watchObservedRunningTime="2026-03-10 06:48:43.791381902 +0000 UTC m=+276.821162527" Mar 10 06:48:43 crc kubenswrapper[4825]: I0310 06:48:43.812466 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7h5g2" podStartSLOduration=2.99876819 podStartE2EDuration="50.812446999s" podCreationTimestamp="2026-03-10 06:47:53 +0000 UTC" firstStartedPulling="2026-03-10 06:47:55.374868906 +0000 UTC m=+228.404649521" lastFinishedPulling="2026-03-10 06:48:43.188547715 +0000 UTC m=+276.218328330" observedRunningTime="2026-03-10 06:48:43.808437228 +0000 UTC m=+276.838217863" 
watchObservedRunningTime="2026-03-10 06:48:43.812446999 +0000 UTC m=+276.842227614" Mar 10 06:48:43 crc kubenswrapper[4825]: I0310 06:48:43.968065 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7h5g2" Mar 10 06:48:43 crc kubenswrapper[4825]: I0310 06:48:43.968156 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7h5g2" Mar 10 06:48:44 crc kubenswrapper[4825]: I0310 06:48:44.766042 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lfhnn" event={"ID":"33ed941e-9788-4c9e-bcd5-6af206460adb","Type":"ContainerStarted","Data":"06cd39ab0a6426205c1b1ac093b01e43bdc28b478f51c54cfd31026294143e76"} Mar 10 06:48:44 crc kubenswrapper[4825]: I0310 06:48:44.768929 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2gg7k" event={"ID":"58d27fca-dd33-4753-9ea6-4b55c191b2b8","Type":"ContainerStarted","Data":"16702d347b624b3bd369486d2ad384ae2237c37d9deaac93249aca1b3e347f67"} Mar 10 06:48:44 crc kubenswrapper[4825]: I0310 06:48:44.791788 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lfhnn" podStartSLOduration=4.706972283 podStartE2EDuration="54.791768397s" podCreationTimestamp="2026-03-10 06:47:50 +0000 UTC" firstStartedPulling="2026-03-10 06:47:54.067891469 +0000 UTC m=+227.097672084" lastFinishedPulling="2026-03-10 06:48:44.152687563 +0000 UTC m=+277.182468198" observedRunningTime="2026-03-10 06:48:44.791402356 +0000 UTC m=+277.821182971" watchObservedRunningTime="2026-03-10 06:48:44.791768397 +0000 UTC m=+277.821549012" Mar 10 06:48:44 crc kubenswrapper[4825]: I0310 06:48:44.814104 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2gg7k" podStartSLOduration=2.925595872 podStartE2EDuration="51.814070493s" 
podCreationTimestamp="2026-03-10 06:47:53 +0000 UTC" firstStartedPulling="2026-03-10 06:47:55.354331701 +0000 UTC m=+228.384112316" lastFinishedPulling="2026-03-10 06:48:44.242806322 +0000 UTC m=+277.272586937" observedRunningTime="2026-03-10 06:48:44.810100902 +0000 UTC m=+277.839881517" watchObservedRunningTime="2026-03-10 06:48:44.814070493 +0000 UTC m=+277.843851108" Mar 10 06:48:45 crc kubenswrapper[4825]: I0310 06:48:45.023882 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7h5g2" podUID="27fbaf63-cf93-470e-b012-ad8bb403f65a" containerName="registry-server" probeResult="failure" output=< Mar 10 06:48:45 crc kubenswrapper[4825]: timeout: failed to connect service ":50051" within 1s Mar 10 06:48:45 crc kubenswrapper[4825]: > Mar 10 06:48:45 crc kubenswrapper[4825]: I0310 06:48:45.778553 4825 generic.go:334] "Generic (PLEG): container finished" podID="d41ec286-2462-448b-ab27-1acc0e1dab3c" containerID="b04271cd9b4e4bdf29345f6d5cb92625e4711bb84e566a466273a97200d21000" exitCode=0 Mar 10 06:48:45 crc kubenswrapper[4825]: I0310 06:48:45.778613 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vkvnj" event={"ID":"d41ec286-2462-448b-ab27-1acc0e1dab3c","Type":"ContainerDied","Data":"b04271cd9b4e4bdf29345f6d5cb92625e4711bb84e566a466273a97200d21000"} Mar 10 06:48:46 crc kubenswrapper[4825]: I0310 06:48:46.788515 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vkvnj" event={"ID":"d41ec286-2462-448b-ab27-1acc0e1dab3c","Type":"ContainerStarted","Data":"d1b239e14465c419a2f7f5c90b87daa5e46a35ae3f43167c2bac4bb852f84614"} Mar 10 06:48:46 crc kubenswrapper[4825]: I0310 06:48:46.888866 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Mar 10 06:48:46 crc kubenswrapper[4825]: I0310 06:48:46.888990 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 06:48:49 crc kubenswrapper[4825]: I0310 06:48:49.361182 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vkvnj" podStartSLOduration=5.26612049 podStartE2EDuration="57.361159819s" podCreationTimestamp="2026-03-10 06:47:52 +0000 UTC" firstStartedPulling="2026-03-10 06:47:54.074516904 +0000 UTC m=+227.104297519" lastFinishedPulling="2026-03-10 06:48:46.169556203 +0000 UTC m=+279.199336848" observedRunningTime="2026-03-10 06:48:46.818851607 +0000 UTC m=+279.848632242" watchObservedRunningTime="2026-03-10 06:48:49.361159819 +0000 UTC m=+282.390940434" Mar 10 06:48:49 crc kubenswrapper[4825]: I0310 06:48:49.362168 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-78845db449-qk8kh"] Mar 10 06:48:49 crc kubenswrapper[4825]: I0310 06:48:49.362508 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-78845db449-qk8kh" podUID="7e3bbdbd-ab62-4e56-94b2-925c13ada6f5" containerName="controller-manager" containerID="cri-o://7db23a7b0546bb7ba7d9ac8e3281c144fdc2e557310be04c61bf6ac3a7f13f66" gracePeriod=30 Mar 10 06:48:49 crc kubenswrapper[4825]: I0310 06:48:49.365079 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55d9d79d49-6tqpt"] Mar 10 06:48:49 crc kubenswrapper[4825]: I0310 06:48:49.365361 4825 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-route-controller-manager/route-controller-manager-55d9d79d49-6tqpt" podUID="49f3cdc9-d395-4b9b-abc5-4de670c497a2" containerName="route-controller-manager" containerID="cri-o://ed716500bb2415c3460225d70822efcdf82ca949ebbfb0cf63e53bbb37d85bd7" gracePeriod=30 Mar 10 06:48:49 crc kubenswrapper[4825]: I0310 06:48:49.810744 4825 generic.go:334] "Generic (PLEG): container finished" podID="7e3bbdbd-ab62-4e56-94b2-925c13ada6f5" containerID="7db23a7b0546bb7ba7d9ac8e3281c144fdc2e557310be04c61bf6ac3a7f13f66" exitCode=0 Mar 10 06:48:49 crc kubenswrapper[4825]: I0310 06:48:49.810962 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78845db449-qk8kh" event={"ID":"7e3bbdbd-ab62-4e56-94b2-925c13ada6f5","Type":"ContainerDied","Data":"7db23a7b0546bb7ba7d9ac8e3281c144fdc2e557310be04c61bf6ac3a7f13f66"} Mar 10 06:48:49 crc kubenswrapper[4825]: I0310 06:48:49.815050 4825 generic.go:334] "Generic (PLEG): container finished" podID="49f3cdc9-d395-4b9b-abc5-4de670c497a2" containerID="ed716500bb2415c3460225d70822efcdf82ca949ebbfb0cf63e53bbb37d85bd7" exitCode=0 Mar 10 06:48:49 crc kubenswrapper[4825]: I0310 06:48:49.815096 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55d9d79d49-6tqpt" event={"ID":"49f3cdc9-d395-4b9b-abc5-4de670c497a2","Type":"ContainerDied","Data":"ed716500bb2415c3460225d70822efcdf82ca949ebbfb0cf63e53bbb37d85bd7"} Mar 10 06:48:49 crc kubenswrapper[4825]: I0310 06:48:49.957843 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55d9d79d49-6tqpt" Mar 10 06:48:49 crc kubenswrapper[4825]: I0310 06:48:49.963416 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-78845db449-qk8kh" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.042154 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gn2f\" (UniqueName: \"kubernetes.io/projected/49f3cdc9-d395-4b9b-abc5-4de670c497a2-kube-api-access-2gn2f\") pod \"49f3cdc9-d395-4b9b-abc5-4de670c497a2\" (UID: \"49f3cdc9-d395-4b9b-abc5-4de670c497a2\") " Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.042257 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/49f3cdc9-d395-4b9b-abc5-4de670c497a2-client-ca\") pod \"49f3cdc9-d395-4b9b-abc5-4de670c497a2\" (UID: \"49f3cdc9-d395-4b9b-abc5-4de670c497a2\") " Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.042305 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76msk\" (UniqueName: \"kubernetes.io/projected/7e3bbdbd-ab62-4e56-94b2-925c13ada6f5-kube-api-access-76msk\") pod \"7e3bbdbd-ab62-4e56-94b2-925c13ada6f5\" (UID: \"7e3bbdbd-ab62-4e56-94b2-925c13ada6f5\") " Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.042336 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e3bbdbd-ab62-4e56-94b2-925c13ada6f5-config\") pod \"7e3bbdbd-ab62-4e56-94b2-925c13ada6f5\" (UID: \"7e3bbdbd-ab62-4e56-94b2-925c13ada6f5\") " Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.042357 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e3bbdbd-ab62-4e56-94b2-925c13ada6f5-serving-cert\") pod \"7e3bbdbd-ab62-4e56-94b2-925c13ada6f5\" (UID: \"7e3bbdbd-ab62-4e56-94b2-925c13ada6f5\") " Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.042384 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49f3cdc9-d395-4b9b-abc5-4de670c497a2-serving-cert\") pod \"49f3cdc9-d395-4b9b-abc5-4de670c497a2\" (UID: \"49f3cdc9-d395-4b9b-abc5-4de670c497a2\") " Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.042449 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49f3cdc9-d395-4b9b-abc5-4de670c497a2-config\") pod \"49f3cdc9-d395-4b9b-abc5-4de670c497a2\" (UID: \"49f3cdc9-d395-4b9b-abc5-4de670c497a2\") " Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.042505 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e3bbdbd-ab62-4e56-94b2-925c13ada6f5-client-ca\") pod \"7e3bbdbd-ab62-4e56-94b2-925c13ada6f5\" (UID: \"7e3bbdbd-ab62-4e56-94b2-925c13ada6f5\") " Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.042530 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7e3bbdbd-ab62-4e56-94b2-925c13ada6f5-proxy-ca-bundles\") pod \"7e3bbdbd-ab62-4e56-94b2-925c13ada6f5\" (UID: \"7e3bbdbd-ab62-4e56-94b2-925c13ada6f5\") " Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.043354 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49f3cdc9-d395-4b9b-abc5-4de670c497a2-config" (OuterVolumeSpecName: "config") pod "49f3cdc9-d395-4b9b-abc5-4de670c497a2" (UID: "49f3cdc9-d395-4b9b-abc5-4de670c497a2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.043434 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49f3cdc9-d395-4b9b-abc5-4de670c497a2-client-ca" (OuterVolumeSpecName: "client-ca") pod "49f3cdc9-d395-4b9b-abc5-4de670c497a2" (UID: "49f3cdc9-d395-4b9b-abc5-4de670c497a2"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.043443 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e3bbdbd-ab62-4e56-94b2-925c13ada6f5-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7e3bbdbd-ab62-4e56-94b2-925c13ada6f5" (UID: "7e3bbdbd-ab62-4e56-94b2-925c13ada6f5"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.043434 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e3bbdbd-ab62-4e56-94b2-925c13ada6f5-client-ca" (OuterVolumeSpecName: "client-ca") pod "7e3bbdbd-ab62-4e56-94b2-925c13ada6f5" (UID: "7e3bbdbd-ab62-4e56-94b2-925c13ada6f5"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.043567 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e3bbdbd-ab62-4e56-94b2-925c13ada6f5-config" (OuterVolumeSpecName: "config") pod "7e3bbdbd-ab62-4e56-94b2-925c13ada6f5" (UID: "7e3bbdbd-ab62-4e56-94b2-925c13ada6f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.043916 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49f3cdc9-d395-4b9b-abc5-4de670c497a2-config\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.043945 4825 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e3bbdbd-ab62-4e56-94b2-925c13ada6f5-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.043955 4825 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7e3bbdbd-ab62-4e56-94b2-925c13ada6f5-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.043965 4825 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/49f3cdc9-d395-4b9b-abc5-4de670c497a2-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.043974 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e3bbdbd-ab62-4e56-94b2-925c13ada6f5-config\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.049877 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49f3cdc9-d395-4b9b-abc5-4de670c497a2-kube-api-access-2gn2f" (OuterVolumeSpecName: "kube-api-access-2gn2f") pod "49f3cdc9-d395-4b9b-abc5-4de670c497a2" (UID: "49f3cdc9-d395-4b9b-abc5-4de670c497a2"). InnerVolumeSpecName "kube-api-access-2gn2f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.049961 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e3bbdbd-ab62-4e56-94b2-925c13ada6f5-kube-api-access-76msk" (OuterVolumeSpecName: "kube-api-access-76msk") pod "7e3bbdbd-ab62-4e56-94b2-925c13ada6f5" (UID: "7e3bbdbd-ab62-4e56-94b2-925c13ada6f5"). InnerVolumeSpecName "kube-api-access-76msk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.050467 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49f3cdc9-d395-4b9b-abc5-4de670c497a2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "49f3cdc9-d395-4b9b-abc5-4de670c497a2" (UID: "49f3cdc9-d395-4b9b-abc5-4de670c497a2"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.051572 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e3bbdbd-ab62-4e56-94b2-925c13ada6f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7e3bbdbd-ab62-4e56-94b2-925c13ada6f5" (UID: "7e3bbdbd-ab62-4e56-94b2-925c13ada6f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.063628 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" podUID="cc9419c8-c23a-418b-8fba-9956bed2a193" containerName="oauth-openshift" containerID="cri-o://7a8f8af22b29d41abc26637da30ee7f659dd5470468ff5f4f2286bafd73974ad" gracePeriod=15 Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.145082 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gn2f\" (UniqueName: \"kubernetes.io/projected/49f3cdc9-d395-4b9b-abc5-4de670c497a2-kube-api-access-2gn2f\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.145144 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76msk\" (UniqueName: \"kubernetes.io/projected/7e3bbdbd-ab62-4e56-94b2-925c13ada6f5-kube-api-access-76msk\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.145156 4825 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e3bbdbd-ab62-4e56-94b2-925c13ada6f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.145168 4825 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49f3cdc9-d395-4b9b-abc5-4de670c497a2-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.502091 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5ffd5f77bc-vh5ll"] Mar 10 06:48:50 crc kubenswrapper[4825]: E0310 06:48:50.502769 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e3bbdbd-ab62-4e56-94b2-925c13ada6f5" containerName="controller-manager" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.502797 4825 
state_mem.go:107] "Deleted CPUSet assignment" podUID="7e3bbdbd-ab62-4e56-94b2-925c13ada6f5" containerName="controller-manager" Mar 10 06:48:50 crc kubenswrapper[4825]: E0310 06:48:50.502815 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49f3cdc9-d395-4b9b-abc5-4de670c497a2" containerName="route-controller-manager" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.502828 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="49f3cdc9-d395-4b9b-abc5-4de670c497a2" containerName="route-controller-manager" Mar 10 06:48:50 crc kubenswrapper[4825]: E0310 06:48:50.502858 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19b3fb37-5072-45f0-8349-1296e31a1193" containerName="oc" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.502870 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="19b3fb37-5072-45f0-8349-1296e31a1193" containerName="oc" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.503041 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="49f3cdc9-d395-4b9b-abc5-4de670c497a2" containerName="route-controller-manager" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.503061 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="19b3fb37-5072-45f0-8349-1296e31a1193" containerName="oc" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.503076 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e3bbdbd-ab62-4e56-94b2-925c13ada6f5" containerName="controller-manager" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.503895 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5ffd5f77bc-vh5ll" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.516550 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57886dd6c5-vszwj"] Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.517829 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57886dd6c5-vszwj" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.526501 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57886dd6c5-vszwj"] Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.531568 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5ffd5f77bc-vh5ll"] Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.551957 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a48b68df-105c-418f-9cb7-525cb3641e20-serving-cert\") pod \"controller-manager-5ffd5f77bc-vh5ll\" (UID: \"a48b68df-105c-418f-9cb7-525cb3641e20\") " pod="openshift-controller-manager/controller-manager-5ffd5f77bc-vh5ll" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.552101 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c6152dc4-04ac-4ace-880d-67f4be272e50-client-ca\") pod \"route-controller-manager-57886dd6c5-vszwj\" (UID: \"c6152dc4-04ac-4ace-880d-67f4be272e50\") " pod="openshift-route-controller-manager/route-controller-manager-57886dd6c5-vszwj" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.552443 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c6152dc4-04ac-4ace-880d-67f4be272e50-config\") pod \"route-controller-manager-57886dd6c5-vszwj\" (UID: \"c6152dc4-04ac-4ace-880d-67f4be272e50\") " pod="openshift-route-controller-manager/route-controller-manager-57886dd6c5-vszwj" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.552519 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a48b68df-105c-418f-9cb7-525cb3641e20-config\") pod \"controller-manager-5ffd5f77bc-vh5ll\" (UID: \"a48b68df-105c-418f-9cb7-525cb3641e20\") " pod="openshift-controller-manager/controller-manager-5ffd5f77bc-vh5ll" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.552569 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxvrj\" (UniqueName: \"kubernetes.io/projected/c6152dc4-04ac-4ace-880d-67f4be272e50-kube-api-access-sxvrj\") pod \"route-controller-manager-57886dd6c5-vszwj\" (UID: \"c6152dc4-04ac-4ace-880d-67f4be272e50\") " pod="openshift-route-controller-manager/route-controller-manager-57886dd6c5-vszwj" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.552723 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6152dc4-04ac-4ace-880d-67f4be272e50-serving-cert\") pod \"route-controller-manager-57886dd6c5-vszwj\" (UID: \"c6152dc4-04ac-4ace-880d-67f4be272e50\") " pod="openshift-route-controller-manager/route-controller-manager-57886dd6c5-vszwj" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.552852 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a48b68df-105c-418f-9cb7-525cb3641e20-client-ca\") pod \"controller-manager-5ffd5f77bc-vh5ll\" (UID: \"a48b68df-105c-418f-9cb7-525cb3641e20\") " 
pod="openshift-controller-manager/controller-manager-5ffd5f77bc-vh5ll" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.552909 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a48b68df-105c-418f-9cb7-525cb3641e20-proxy-ca-bundles\") pod \"controller-manager-5ffd5f77bc-vh5ll\" (UID: \"a48b68df-105c-418f-9cb7-525cb3641e20\") " pod="openshift-controller-manager/controller-manager-5ffd5f77bc-vh5ll" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.552948 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wflvv\" (UniqueName: \"kubernetes.io/projected/a48b68df-105c-418f-9cb7-525cb3641e20-kube-api-access-wflvv\") pod \"controller-manager-5ffd5f77bc-vh5ll\" (UID: \"a48b68df-105c-418f-9cb7-525cb3641e20\") " pod="openshift-controller-manager/controller-manager-5ffd5f77bc-vh5ll" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.655296 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a48b68df-105c-418f-9cb7-525cb3641e20-proxy-ca-bundles\") pod \"controller-manager-5ffd5f77bc-vh5ll\" (UID: \"a48b68df-105c-418f-9cb7-525cb3641e20\") " pod="openshift-controller-manager/controller-manager-5ffd5f77bc-vh5ll" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.655370 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wflvv\" (UniqueName: \"kubernetes.io/projected/a48b68df-105c-418f-9cb7-525cb3641e20-kube-api-access-wflvv\") pod \"controller-manager-5ffd5f77bc-vh5ll\" (UID: \"a48b68df-105c-418f-9cb7-525cb3641e20\") " pod="openshift-controller-manager/controller-manager-5ffd5f77bc-vh5ll" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.655422 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/a48b68df-105c-418f-9cb7-525cb3641e20-serving-cert\") pod \"controller-manager-5ffd5f77bc-vh5ll\" (UID: \"a48b68df-105c-418f-9cb7-525cb3641e20\") " pod="openshift-controller-manager/controller-manager-5ffd5f77bc-vh5ll" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.655458 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c6152dc4-04ac-4ace-880d-67f4be272e50-client-ca\") pod \"route-controller-manager-57886dd6c5-vszwj\" (UID: \"c6152dc4-04ac-4ace-880d-67f4be272e50\") " pod="openshift-route-controller-manager/route-controller-manager-57886dd6c5-vszwj" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.655534 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6152dc4-04ac-4ace-880d-67f4be272e50-config\") pod \"route-controller-manager-57886dd6c5-vszwj\" (UID: \"c6152dc4-04ac-4ace-880d-67f4be272e50\") " pod="openshift-route-controller-manager/route-controller-manager-57886dd6c5-vszwj" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.655595 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a48b68df-105c-418f-9cb7-525cb3641e20-config\") pod \"controller-manager-5ffd5f77bc-vh5ll\" (UID: \"a48b68df-105c-418f-9cb7-525cb3641e20\") " pod="openshift-controller-manager/controller-manager-5ffd5f77bc-vh5ll" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.655621 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxvrj\" (UniqueName: \"kubernetes.io/projected/c6152dc4-04ac-4ace-880d-67f4be272e50-kube-api-access-sxvrj\") pod \"route-controller-manager-57886dd6c5-vszwj\" (UID: \"c6152dc4-04ac-4ace-880d-67f4be272e50\") " pod="openshift-route-controller-manager/route-controller-manager-57886dd6c5-vszwj" Mar 10 06:48:50 crc 
kubenswrapper[4825]: I0310 06:48:50.655661 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6152dc4-04ac-4ace-880d-67f4be272e50-serving-cert\") pod \"route-controller-manager-57886dd6c5-vszwj\" (UID: \"c6152dc4-04ac-4ace-880d-67f4be272e50\") " pod="openshift-route-controller-manager/route-controller-manager-57886dd6c5-vszwj" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.655690 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a48b68df-105c-418f-9cb7-525cb3641e20-client-ca\") pod \"controller-manager-5ffd5f77bc-vh5ll\" (UID: \"a48b68df-105c-418f-9cb7-525cb3641e20\") " pod="openshift-controller-manager/controller-manager-5ffd5f77bc-vh5ll" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.657058 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a48b68df-105c-418f-9cb7-525cb3641e20-client-ca\") pod \"controller-manager-5ffd5f77bc-vh5ll\" (UID: \"a48b68df-105c-418f-9cb7-525cb3641e20\") " pod="openshift-controller-manager/controller-manager-5ffd5f77bc-vh5ll" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.657351 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c6152dc4-04ac-4ace-880d-67f4be272e50-client-ca\") pod \"route-controller-manager-57886dd6c5-vszwj\" (UID: \"c6152dc4-04ac-4ace-880d-67f4be272e50\") " pod="openshift-route-controller-manager/route-controller-manager-57886dd6c5-vszwj" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.657741 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a48b68df-105c-418f-9cb7-525cb3641e20-proxy-ca-bundles\") pod \"controller-manager-5ffd5f77bc-vh5ll\" (UID: \"a48b68df-105c-418f-9cb7-525cb3641e20\") " 
pod="openshift-controller-manager/controller-manager-5ffd5f77bc-vh5ll" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.657773 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6152dc4-04ac-4ace-880d-67f4be272e50-config\") pod \"route-controller-manager-57886dd6c5-vszwj\" (UID: \"c6152dc4-04ac-4ace-880d-67f4be272e50\") " pod="openshift-route-controller-manager/route-controller-manager-57886dd6c5-vszwj" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.657872 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a48b68df-105c-418f-9cb7-525cb3641e20-config\") pod \"controller-manager-5ffd5f77bc-vh5ll\" (UID: \"a48b68df-105c-418f-9cb7-525cb3641e20\") " pod="openshift-controller-manager/controller-manager-5ffd5f77bc-vh5ll" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.664039 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6152dc4-04ac-4ace-880d-67f4be272e50-serving-cert\") pod \"route-controller-manager-57886dd6c5-vszwj\" (UID: \"c6152dc4-04ac-4ace-880d-67f4be272e50\") " pod="openshift-route-controller-manager/route-controller-manager-57886dd6c5-vszwj" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.664285 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a48b68df-105c-418f-9cb7-525cb3641e20-serving-cert\") pod \"controller-manager-5ffd5f77bc-vh5ll\" (UID: \"a48b68df-105c-418f-9cb7-525cb3641e20\") " pod="openshift-controller-manager/controller-manager-5ffd5f77bc-vh5ll" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.676515 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxvrj\" (UniqueName: \"kubernetes.io/projected/c6152dc4-04ac-4ace-880d-67f4be272e50-kube-api-access-sxvrj\") pod 
\"route-controller-manager-57886dd6c5-vszwj\" (UID: \"c6152dc4-04ac-4ace-880d-67f4be272e50\") " pod="openshift-route-controller-manager/route-controller-manager-57886dd6c5-vszwj" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.680064 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wflvv\" (UniqueName: \"kubernetes.io/projected/a48b68df-105c-418f-9cb7-525cb3641e20-kube-api-access-wflvv\") pod \"controller-manager-5ffd5f77bc-vh5ll\" (UID: \"a48b68df-105c-418f-9cb7-525cb3641e20\") " pod="openshift-controller-manager/controller-manager-5ffd5f77bc-vh5ll" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.724152 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.756643 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-system-cliconfig\") pod \"cc9419c8-c23a-418b-8fba-9956bed2a193\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.757043 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-system-session\") pod \"cc9419c8-c23a-418b-8fba-9956bed2a193\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.757240 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cc9419c8-c23a-418b-8fba-9956bed2a193-audit-dir\") pod \"cc9419c8-c23a-418b-8fba-9956bed2a193\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.757319 4825 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cc9419c8-c23a-418b-8fba-9956bed2a193-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "cc9419c8-c23a-418b-8fba-9956bed2a193" (UID: "cc9419c8-c23a-418b-8fba-9956bed2a193"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.757434 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-system-router-certs\") pod \"cc9419c8-c23a-418b-8fba-9956bed2a193\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.757556 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-system-trusted-ca-bundle\") pod \"cc9419c8-c23a-418b-8fba-9956bed2a193\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.757690 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-user-template-provider-selection\") pod \"cc9419c8-c23a-418b-8fba-9956bed2a193\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.757844 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-system-ocp-branding-template\") pod \"cc9419c8-c23a-418b-8fba-9956bed2a193\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " Mar 10 06:48:50 crc kubenswrapper[4825]: 
I0310 06:48:50.757962 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cc9419c8-c23a-418b-8fba-9956bed2a193-audit-policies\") pod \"cc9419c8-c23a-418b-8fba-9956bed2a193\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.758071 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-user-idp-0-file-data\") pod \"cc9419c8-c23a-418b-8fba-9956bed2a193\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.758200 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-user-template-error\") pod \"cc9419c8-c23a-418b-8fba-9956bed2a193\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.758357 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-system-service-ca\") pod \"cc9419c8-c23a-418b-8fba-9956bed2a193\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.758486 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-user-template-login\") pod \"cc9419c8-c23a-418b-8fba-9956bed2a193\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.758610 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-k294v\" (UniqueName: \"kubernetes.io/projected/cc9419c8-c23a-418b-8fba-9956bed2a193-kube-api-access-k294v\") pod \"cc9419c8-c23a-418b-8fba-9956bed2a193\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.758715 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-system-serving-cert\") pod \"cc9419c8-c23a-418b-8fba-9956bed2a193\" (UID: \"cc9419c8-c23a-418b-8fba-9956bed2a193\") " Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.759199 4825 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cc9419c8-c23a-418b-8fba-9956bed2a193-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.758637 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "cc9419c8-c23a-418b-8fba-9956bed2a193" (UID: "cc9419c8-c23a-418b-8fba-9956bed2a193"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.758867 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "cc9419c8-c23a-418b-8fba-9956bed2a193" (UID: "cc9419c8-c23a-418b-8fba-9956bed2a193"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.761212 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "cc9419c8-c23a-418b-8fba-9956bed2a193" (UID: "cc9419c8-c23a-418b-8fba-9956bed2a193"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.761702 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc9419c8-c23a-418b-8fba-9956bed2a193-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "cc9419c8-c23a-418b-8fba-9956bed2a193" (UID: "cc9419c8-c23a-418b-8fba-9956bed2a193"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.761704 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "cc9419c8-c23a-418b-8fba-9956bed2a193" (UID: "cc9419c8-c23a-418b-8fba-9956bed2a193"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.764381 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "cc9419c8-c23a-418b-8fba-9956bed2a193" (UID: "cc9419c8-c23a-418b-8fba-9956bed2a193"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.764674 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc9419c8-c23a-418b-8fba-9956bed2a193-kube-api-access-k294v" (OuterVolumeSpecName: "kube-api-access-k294v") pod "cc9419c8-c23a-418b-8fba-9956bed2a193" (UID: "cc9419c8-c23a-418b-8fba-9956bed2a193"). InnerVolumeSpecName "kube-api-access-k294v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.764770 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "cc9419c8-c23a-418b-8fba-9956bed2a193" (UID: "cc9419c8-c23a-418b-8fba-9956bed2a193"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.765080 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "cc9419c8-c23a-418b-8fba-9956bed2a193" (UID: "cc9419c8-c23a-418b-8fba-9956bed2a193"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.765275 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "cc9419c8-c23a-418b-8fba-9956bed2a193" (UID: "cc9419c8-c23a-418b-8fba-9956bed2a193"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.766303 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "cc9419c8-c23a-418b-8fba-9956bed2a193" (UID: "cc9419c8-c23a-418b-8fba-9956bed2a193"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.767013 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "cc9419c8-c23a-418b-8fba-9956bed2a193" (UID: "cc9419c8-c23a-418b-8fba-9956bed2a193"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.767686 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "cc9419c8-c23a-418b-8fba-9956bed2a193" (UID: "cc9419c8-c23a-418b-8fba-9956bed2a193"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.824573 4825 generic.go:334] "Generic (PLEG): container finished" podID="cc9419c8-c23a-418b-8fba-9956bed2a193" containerID="7a8f8af22b29d41abc26637da30ee7f659dd5470468ff5f4f2286bafd73974ad" exitCode=0 Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.824627 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.824654 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" event={"ID":"cc9419c8-c23a-418b-8fba-9956bed2a193","Type":"ContainerDied","Data":"7a8f8af22b29d41abc26637da30ee7f659dd5470468ff5f4f2286bafd73974ad"} Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.824686 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xfp4f" event={"ID":"cc9419c8-c23a-418b-8fba-9956bed2a193","Type":"ContainerDied","Data":"122b20ffae414c29feb2d4e6155a74121af80bbb8218ba491dbb9381f65968e4"} Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.824703 4825 scope.go:117] "RemoveContainer" containerID="7a8f8af22b29d41abc26637da30ee7f659dd5470468ff5f4f2286bafd73974ad" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.829520 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5ffd5f77bc-vh5ll" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.830882 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-78845db449-qk8kh" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.831093 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78845db449-qk8kh" event={"ID":"7e3bbdbd-ab62-4e56-94b2-925c13ada6f5","Type":"ContainerDied","Data":"da094da924366153496b4ac0066ab2797ee482423c5f98a49e235487508d1b6e"} Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.834767 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55d9d79d49-6tqpt" event={"ID":"49f3cdc9-d395-4b9b-abc5-4de670c497a2","Type":"ContainerDied","Data":"b0470ca555a506e96d9cbc8a6e2a89895a55e604ae4b32f551d10d6d34ba1547"} Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.834847 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55d9d79d49-6tqpt" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.856249 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9zmmv" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.856298 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9zmmv" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.860745 4825 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.860828 4825 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath 
\"\"" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.860882 4825 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.860907 4825 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.860959 4825 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cc9419c8-c23a-418b-8fba-9956bed2a193-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.860979 4825 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.860999 4825 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.861047 4825 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.861062 4825 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.861079 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k294v\" (UniqueName: \"kubernetes.io/projected/cc9419c8-c23a-418b-8fba-9956bed2a193-kube-api-access-k294v\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.861096 4825 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.861148 4825 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.861166 4825 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cc9419c8-c23a-418b-8fba-9956bed2a193-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.863591 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57886dd6c5-vszwj" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.878415 4825 scope.go:117] "RemoveContainer" containerID="7a8f8af22b29d41abc26637da30ee7f659dd5470468ff5f4f2286bafd73974ad" Mar 10 06:48:50 crc kubenswrapper[4825]: E0310 06:48:50.879540 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a8f8af22b29d41abc26637da30ee7f659dd5470468ff5f4f2286bafd73974ad\": container with ID starting with 7a8f8af22b29d41abc26637da30ee7f659dd5470468ff5f4f2286bafd73974ad not found: ID does not exist" containerID="7a8f8af22b29d41abc26637da30ee7f659dd5470468ff5f4f2286bafd73974ad" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.880084 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a8f8af22b29d41abc26637da30ee7f659dd5470468ff5f4f2286bafd73974ad"} err="failed to get container status \"7a8f8af22b29d41abc26637da30ee7f659dd5470468ff5f4f2286bafd73974ad\": rpc error: code = NotFound desc = could not find container \"7a8f8af22b29d41abc26637da30ee7f659dd5470468ff5f4f2286bafd73974ad\": container with ID starting with 7a8f8af22b29d41abc26637da30ee7f659dd5470468ff5f4f2286bafd73974ad not found: ID does not exist" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.880222 4825 scope.go:117] "RemoveContainer" containerID="7db23a7b0546bb7ba7d9ac8e3281c144fdc2e557310be04c61bf6ac3a7f13f66" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.891061 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55d9d79d49-6tqpt"] Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.896506 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55d9d79d49-6tqpt"] Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.904886 4825 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-78845db449-qk8kh"] Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.924022 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9zmmv" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.924212 4825 scope.go:117] "RemoveContainer" containerID="ed716500bb2415c3460225d70822efcdf82ca949ebbfb0cf63e53bbb37d85bd7" Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.927275 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-78845db449-qk8kh"] Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.930490 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xfp4f"] Mar 10 06:48:50 crc kubenswrapper[4825]: I0310 06:48:50.933225 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xfp4f"] Mar 10 06:48:51 crc kubenswrapper[4825]: I0310 06:48:51.245009 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49f3cdc9-d395-4b9b-abc5-4de670c497a2" path="/var/lib/kubelet/pods/49f3cdc9-d395-4b9b-abc5-4de670c497a2/volumes" Mar 10 06:48:51 crc kubenswrapper[4825]: I0310 06:48:51.246437 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e3bbdbd-ab62-4e56-94b2-925c13ada6f5" path="/var/lib/kubelet/pods/7e3bbdbd-ab62-4e56-94b2-925c13ada6f5/volumes" Mar 10 06:48:51 crc kubenswrapper[4825]: I0310 06:48:51.247322 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc9419c8-c23a-418b-8fba-9956bed2a193" path="/var/lib/kubelet/pods/cc9419c8-c23a-418b-8fba-9956bed2a193/volumes" Mar 10 06:48:51 crc kubenswrapper[4825]: I0310 06:48:51.292863 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hbdjf" Mar 10 
06:48:51 crc kubenswrapper[4825]: I0310 06:48:51.352317 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5ffd5f77bc-vh5ll"] Mar 10 06:48:51 crc kubenswrapper[4825]: W0310 06:48:51.364557 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda48b68df_105c_418f_9cb7_525cb3641e20.slice/crio-7673d6e05a681f6445cb3df09b4306b814ea627794e397b0e3b0e41dad3f5eaf WatchSource:0}: Error finding container 7673d6e05a681f6445cb3df09b4306b814ea627794e397b0e3b0e41dad3f5eaf: Status 404 returned error can't find the container with id 7673d6e05a681f6445cb3df09b4306b814ea627794e397b0e3b0e41dad3f5eaf Mar 10 06:48:51 crc kubenswrapper[4825]: I0310 06:48:51.366151 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hbdjf" Mar 10 06:48:51 crc kubenswrapper[4825]: I0310 06:48:51.408080 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57886dd6c5-vszwj"] Mar 10 06:48:51 crc kubenswrapper[4825]: W0310 06:48:51.410765 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6152dc4_04ac_4ace_880d_67f4be272e50.slice/crio-90923028eb71f7a7b7023e77d0d0db15b826b4c3f4891e39b0066aae61b98600 WatchSource:0}: Error finding container 90923028eb71f7a7b7023e77d0d0db15b826b4c3f4891e39b0066aae61b98600: Status 404 returned error can't find the container with id 90923028eb71f7a7b7023e77d0d0db15b826b4c3f4891e39b0066aae61b98600 Mar 10 06:48:51 crc kubenswrapper[4825]: I0310 06:48:51.591657 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lfhnn" Mar 10 06:48:51 crc kubenswrapper[4825]: I0310 06:48:51.592200 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-lfhnn" Mar 10 06:48:51 crc kubenswrapper[4825]: I0310 06:48:51.640315 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lfhnn" Mar 10 06:48:51 crc kubenswrapper[4825]: I0310 06:48:51.843410 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5ffd5f77bc-vh5ll" event={"ID":"a48b68df-105c-418f-9cb7-525cb3641e20","Type":"ContainerStarted","Data":"ffceb64c382508bdd878ad3e3ae44bd2e72c872f3e13b5c45be505c20730c0b2"} Mar 10 06:48:51 crc kubenswrapper[4825]: I0310 06:48:51.843491 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5ffd5f77bc-vh5ll" event={"ID":"a48b68df-105c-418f-9cb7-525cb3641e20","Type":"ContainerStarted","Data":"7673d6e05a681f6445cb3df09b4306b814ea627794e397b0e3b0e41dad3f5eaf"} Mar 10 06:48:51 crc kubenswrapper[4825]: I0310 06:48:51.843882 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5ffd5f77bc-vh5ll" Mar 10 06:48:51 crc kubenswrapper[4825]: I0310 06:48:51.848256 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5ffd5f77bc-vh5ll" Mar 10 06:48:51 crc kubenswrapper[4825]: I0310 06:48:51.850870 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57886dd6c5-vszwj" event={"ID":"c6152dc4-04ac-4ace-880d-67f4be272e50","Type":"ContainerStarted","Data":"a36301ca3ac9dc807e52bfa5b981c9190333c861f97f982cb238f864042460a7"} Mar 10 06:48:51 crc kubenswrapper[4825]: I0310 06:48:51.850930 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57886dd6c5-vszwj" 
event={"ID":"c6152dc4-04ac-4ace-880d-67f4be272e50","Type":"ContainerStarted","Data":"90923028eb71f7a7b7023e77d0d0db15b826b4c3f4891e39b0066aae61b98600"} Mar 10 06:48:51 crc kubenswrapper[4825]: I0310 06:48:51.851580 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-57886dd6c5-vszwj" Mar 10 06:48:51 crc kubenswrapper[4825]: I0310 06:48:51.859557 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-57886dd6c5-vszwj" Mar 10 06:48:51 crc kubenswrapper[4825]: I0310 06:48:51.875438 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5ffd5f77bc-vh5ll" podStartSLOduration=2.8754095509999997 podStartE2EDuration="2.875409551s" podCreationTimestamp="2026-03-10 06:48:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:48:51.871662918 +0000 UTC m=+284.901443533" watchObservedRunningTime="2026-03-10 06:48:51.875409551 +0000 UTC m=+284.905190166" Mar 10 06:48:51 crc kubenswrapper[4825]: I0310 06:48:51.891839 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-57886dd6c5-vszwj" podStartSLOduration=2.891813208 podStartE2EDuration="2.891813208s" podCreationTimestamp="2026-03-10 06:48:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:48:51.890052015 +0000 UTC m=+284.919832630" watchObservedRunningTime="2026-03-10 06:48:51.891813208 +0000 UTC m=+284.921593823" Mar 10 06:48:51 crc kubenswrapper[4825]: I0310 06:48:51.901643 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lfhnn" Mar 10 
06:48:51 crc kubenswrapper[4825]: I0310 06:48:51.916349 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9zmmv" Mar 10 06:48:51 crc kubenswrapper[4825]: I0310 06:48:51.927366 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hbdjf"] Mar 10 06:48:52 crc kubenswrapper[4825]: I0310 06:48:52.800166 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vkvnj" Mar 10 06:48:52 crc kubenswrapper[4825]: I0310 06:48:52.802751 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vkvnj" Mar 10 06:48:52 crc kubenswrapper[4825]: I0310 06:48:52.863548 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hbdjf" podUID="d232f8a7-f013-4a5c-a7dc-2150c4b3040c" containerName="registry-server" containerID="cri-o://067c6aa47d60fae5b55275d9207c32c6890352828f44c81dd8cb15ea46376e6c" gracePeriod=2 Mar 10 06:48:52 crc kubenswrapper[4825]: I0310 06:48:52.899275 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vkvnj" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.287540 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lfxbn" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.288733 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lfxbn" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.316836 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lfhnn"] Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.368497 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-lfxbn" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.510567 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-66c58d94fc-m9h82"] Mar 10 06:48:53 crc kubenswrapper[4825]: E0310 06:48:53.511080 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc9419c8-c23a-418b-8fba-9956bed2a193" containerName="oauth-openshift" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.511120 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc9419c8-c23a-418b-8fba-9956bed2a193" containerName="oauth-openshift" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.511429 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc9419c8-c23a-418b-8fba-9956bed2a193" containerName="oauth-openshift" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.512427 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.523650 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.523857 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.523962 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.524039 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.524086 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 10 06:48:53 crc 
kubenswrapper[4825]: I0310 06:48:53.524427 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.525506 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.527199 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.529252 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.529551 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.529740 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-66c58d94fc-m9h82"] Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.530015 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.533204 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.535067 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.536889 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.547992 4825 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.626962 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4412cbab-3dc4-4a96-9f37-af4fe4a8aeec-audit-dir\") pod \"oauth-openshift-66c58d94fc-m9h82\" (UID: \"4412cbab-3dc4-4a96-9f37-af4fe4a8aeec\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.627058 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4412cbab-3dc4-4a96-9f37-af4fe4a8aeec-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-66c58d94fc-m9h82\" (UID: \"4412cbab-3dc4-4a96-9f37-af4fe4a8aeec\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.627205 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4412cbab-3dc4-4a96-9f37-af4fe4a8aeec-v4-0-config-system-router-certs\") pod \"oauth-openshift-66c58d94fc-m9h82\" (UID: \"4412cbab-3dc4-4a96-9f37-af4fe4a8aeec\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.627272 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4412cbab-3dc4-4a96-9f37-af4fe4a8aeec-v4-0-config-user-template-login\") pod \"oauth-openshift-66c58d94fc-m9h82\" (UID: \"4412cbab-3dc4-4a96-9f37-af4fe4a8aeec\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.627305 4825 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4412cbab-3dc4-4a96-9f37-af4fe4a8aeec-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-66c58d94fc-m9h82\" (UID: \"4412cbab-3dc4-4a96-9f37-af4fe4a8aeec\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.627351 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4412cbab-3dc4-4a96-9f37-af4fe4a8aeec-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-66c58d94fc-m9h82\" (UID: \"4412cbab-3dc4-4a96-9f37-af4fe4a8aeec\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.627409 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4412cbab-3dc4-4a96-9f37-af4fe4a8aeec-v4-0-config-system-cliconfig\") pod \"oauth-openshift-66c58d94fc-m9h82\" (UID: \"4412cbab-3dc4-4a96-9f37-af4fe4a8aeec\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.627458 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4412cbab-3dc4-4a96-9f37-af4fe4a8aeec-v4-0-config-system-serving-cert\") pod \"oauth-openshift-66c58d94fc-m9h82\" (UID: \"4412cbab-3dc4-4a96-9f37-af4fe4a8aeec\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.627523 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4412cbab-3dc4-4a96-9f37-af4fe4a8aeec-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-66c58d94fc-m9h82\" (UID: \"4412cbab-3dc4-4a96-9f37-af4fe4a8aeec\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.627564 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4412cbab-3dc4-4a96-9f37-af4fe4a8aeec-v4-0-config-user-template-error\") pod \"oauth-openshift-66c58d94fc-m9h82\" (UID: \"4412cbab-3dc4-4a96-9f37-af4fe4a8aeec\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.627607 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cgrv\" (UniqueName: \"kubernetes.io/projected/4412cbab-3dc4-4a96-9f37-af4fe4a8aeec-kube-api-access-4cgrv\") pod \"oauth-openshift-66c58d94fc-m9h82\" (UID: \"4412cbab-3dc4-4a96-9f37-af4fe4a8aeec\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.627660 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4412cbab-3dc4-4a96-9f37-af4fe4a8aeec-v4-0-config-system-service-ca\") pod \"oauth-openshift-66c58d94fc-m9h82\" (UID: \"4412cbab-3dc4-4a96-9f37-af4fe4a8aeec\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.627697 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4412cbab-3dc4-4a96-9f37-af4fe4a8aeec-v4-0-config-system-session\") pod 
\"oauth-openshift-66c58d94fc-m9h82\" (UID: \"4412cbab-3dc4-4a96-9f37-af4fe4a8aeec\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.627757 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4412cbab-3dc4-4a96-9f37-af4fe4a8aeec-audit-policies\") pod \"oauth-openshift-66c58d94fc-m9h82\" (UID: \"4412cbab-3dc4-4a96-9f37-af4fe4a8aeec\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.729969 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4412cbab-3dc4-4a96-9f37-af4fe4a8aeec-v4-0-config-system-router-certs\") pod \"oauth-openshift-66c58d94fc-m9h82\" (UID: \"4412cbab-3dc4-4a96-9f37-af4fe4a8aeec\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.730121 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4412cbab-3dc4-4a96-9f37-af4fe4a8aeec-v4-0-config-user-template-login\") pod \"oauth-openshift-66c58d94fc-m9h82\" (UID: \"4412cbab-3dc4-4a96-9f37-af4fe4a8aeec\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.730211 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4412cbab-3dc4-4a96-9f37-af4fe4a8aeec-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-66c58d94fc-m9h82\" (UID: \"4412cbab-3dc4-4a96-9f37-af4fe4a8aeec\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.730280 4825 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4412cbab-3dc4-4a96-9f37-af4fe4a8aeec-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-66c58d94fc-m9h82\" (UID: \"4412cbab-3dc4-4a96-9f37-af4fe4a8aeec\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.730351 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4412cbab-3dc4-4a96-9f37-af4fe4a8aeec-v4-0-config-system-cliconfig\") pod \"oauth-openshift-66c58d94fc-m9h82\" (UID: \"4412cbab-3dc4-4a96-9f37-af4fe4a8aeec\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.730412 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4412cbab-3dc4-4a96-9f37-af4fe4a8aeec-v4-0-config-system-serving-cert\") pod \"oauth-openshift-66c58d94fc-m9h82\" (UID: \"4412cbab-3dc4-4a96-9f37-af4fe4a8aeec\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.730488 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4412cbab-3dc4-4a96-9f37-af4fe4a8aeec-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-66c58d94fc-m9h82\" (UID: \"4412cbab-3dc4-4a96-9f37-af4fe4a8aeec\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.730541 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/4412cbab-3dc4-4a96-9f37-af4fe4a8aeec-v4-0-config-user-template-error\") pod \"oauth-openshift-66c58d94fc-m9h82\" (UID: \"4412cbab-3dc4-4a96-9f37-af4fe4a8aeec\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.730598 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cgrv\" (UniqueName: \"kubernetes.io/projected/4412cbab-3dc4-4a96-9f37-af4fe4a8aeec-kube-api-access-4cgrv\") pod \"oauth-openshift-66c58d94fc-m9h82\" (UID: \"4412cbab-3dc4-4a96-9f37-af4fe4a8aeec\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.730666 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4412cbab-3dc4-4a96-9f37-af4fe4a8aeec-v4-0-config-system-service-ca\") pod \"oauth-openshift-66c58d94fc-m9h82\" (UID: \"4412cbab-3dc4-4a96-9f37-af4fe4a8aeec\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.730722 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4412cbab-3dc4-4a96-9f37-af4fe4a8aeec-v4-0-config-system-session\") pod \"oauth-openshift-66c58d94fc-m9h82\" (UID: \"4412cbab-3dc4-4a96-9f37-af4fe4a8aeec\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.730817 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4412cbab-3dc4-4a96-9f37-af4fe4a8aeec-audit-policies\") pod \"oauth-openshift-66c58d94fc-m9h82\" (UID: \"4412cbab-3dc4-4a96-9f37-af4fe4a8aeec\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" Mar 10 06:48:53 crc 
kubenswrapper[4825]: I0310 06:48:53.730884 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4412cbab-3dc4-4a96-9f37-af4fe4a8aeec-audit-dir\") pod \"oauth-openshift-66c58d94fc-m9h82\" (UID: \"4412cbab-3dc4-4a96-9f37-af4fe4a8aeec\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.730930 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4412cbab-3dc4-4a96-9f37-af4fe4a8aeec-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-66c58d94fc-m9h82\" (UID: \"4412cbab-3dc4-4a96-9f37-af4fe4a8aeec\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.732728 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4412cbab-3dc4-4a96-9f37-af4fe4a8aeec-audit-dir\") pod \"oauth-openshift-66c58d94fc-m9h82\" (UID: \"4412cbab-3dc4-4a96-9f37-af4fe4a8aeec\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.734393 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4412cbab-3dc4-4a96-9f37-af4fe4a8aeec-audit-policies\") pod \"oauth-openshift-66c58d94fc-m9h82\" (UID: \"4412cbab-3dc4-4a96-9f37-af4fe4a8aeec\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.734918 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4412cbab-3dc4-4a96-9f37-af4fe4a8aeec-v4-0-config-system-service-ca\") pod \"oauth-openshift-66c58d94fc-m9h82\" (UID: 
\"4412cbab-3dc4-4a96-9f37-af4fe4a8aeec\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.735848 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4412cbab-3dc4-4a96-9f37-af4fe4a8aeec-v4-0-config-system-cliconfig\") pod \"oauth-openshift-66c58d94fc-m9h82\" (UID: \"4412cbab-3dc4-4a96-9f37-af4fe4a8aeec\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.737152 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4412cbab-3dc4-4a96-9f37-af4fe4a8aeec-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-66c58d94fc-m9h82\" (UID: \"4412cbab-3dc4-4a96-9f37-af4fe4a8aeec\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.737947 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4412cbab-3dc4-4a96-9f37-af4fe4a8aeec-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-66c58d94fc-m9h82\" (UID: \"4412cbab-3dc4-4a96-9f37-af4fe4a8aeec\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.737974 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4412cbab-3dc4-4a96-9f37-af4fe4a8aeec-v4-0-config-user-template-error\") pod \"oauth-openshift-66c58d94fc-m9h82\" (UID: \"4412cbab-3dc4-4a96-9f37-af4fe4a8aeec\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.738326 4825 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4412cbab-3dc4-4a96-9f37-af4fe4a8aeec-v4-0-config-system-session\") pod \"oauth-openshift-66c58d94fc-m9h82\" (UID: \"4412cbab-3dc4-4a96-9f37-af4fe4a8aeec\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.738588 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4412cbab-3dc4-4a96-9f37-af4fe4a8aeec-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-66c58d94fc-m9h82\" (UID: \"4412cbab-3dc4-4a96-9f37-af4fe4a8aeec\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.739056 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4412cbab-3dc4-4a96-9f37-af4fe4a8aeec-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-66c58d94fc-m9h82\" (UID: \"4412cbab-3dc4-4a96-9f37-af4fe4a8aeec\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.739295 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4412cbab-3dc4-4a96-9f37-af4fe4a8aeec-v4-0-config-system-router-certs\") pod \"oauth-openshift-66c58d94fc-m9h82\" (UID: \"4412cbab-3dc4-4a96-9f37-af4fe4a8aeec\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.740278 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4412cbab-3dc4-4a96-9f37-af4fe4a8aeec-v4-0-config-system-serving-cert\") pod \"oauth-openshift-66c58d94fc-m9h82\" (UID: 
\"4412cbab-3dc4-4a96-9f37-af4fe4a8aeec\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.741418 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4412cbab-3dc4-4a96-9f37-af4fe4a8aeec-v4-0-config-user-template-login\") pod \"oauth-openshift-66c58d94fc-m9h82\" (UID: \"4412cbab-3dc4-4a96-9f37-af4fe4a8aeec\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.776062 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cgrv\" (UniqueName: \"kubernetes.io/projected/4412cbab-3dc4-4a96-9f37-af4fe4a8aeec-kube-api-access-4cgrv\") pod \"oauth-openshift-66c58d94fc-m9h82\" (UID: \"4412cbab-3dc4-4a96-9f37-af4fe4a8aeec\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.879764 4825 generic.go:334] "Generic (PLEG): container finished" podID="d232f8a7-f013-4a5c-a7dc-2150c4b3040c" containerID="067c6aa47d60fae5b55275d9207c32c6890352828f44c81dd8cb15ea46376e6c" exitCode=0 Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.879994 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hbdjf" event={"ID":"d232f8a7-f013-4a5c-a7dc-2150c4b3040c","Type":"ContainerDied","Data":"067c6aa47d60fae5b55275d9207c32c6890352828f44c81dd8cb15ea46376e6c"} Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.898070 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.956657 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vkvnj" Mar 10 06:48:53 crc kubenswrapper[4825]: I0310 06:48:53.960353 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lfxbn" Mar 10 06:48:54 crc kubenswrapper[4825]: I0310 06:48:54.041075 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7h5g2" Mar 10 06:48:54 crc kubenswrapper[4825]: I0310 06:48:54.110657 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7h5g2" Mar 10 06:48:54 crc kubenswrapper[4825]: I0310 06:48:54.266416 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hbdjf" Mar 10 06:48:54 crc kubenswrapper[4825]: I0310 06:48:54.343414 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d232f8a7-f013-4a5c-a7dc-2150c4b3040c-catalog-content\") pod \"d232f8a7-f013-4a5c-a7dc-2150c4b3040c\" (UID: \"d232f8a7-f013-4a5c-a7dc-2150c4b3040c\") " Mar 10 06:48:54 crc kubenswrapper[4825]: I0310 06:48:54.343462 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d232f8a7-f013-4a5c-a7dc-2150c4b3040c-utilities\") pod \"d232f8a7-f013-4a5c-a7dc-2150c4b3040c\" (UID: \"d232f8a7-f013-4a5c-a7dc-2150c4b3040c\") " Mar 10 06:48:54 crc kubenswrapper[4825]: I0310 06:48:54.343664 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6bpt\" (UniqueName: 
\"kubernetes.io/projected/d232f8a7-f013-4a5c-a7dc-2150c4b3040c-kube-api-access-p6bpt\") pod \"d232f8a7-f013-4a5c-a7dc-2150c4b3040c\" (UID: \"d232f8a7-f013-4a5c-a7dc-2150c4b3040c\") " Mar 10 06:48:54 crc kubenswrapper[4825]: I0310 06:48:54.344935 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d232f8a7-f013-4a5c-a7dc-2150c4b3040c-utilities" (OuterVolumeSpecName: "utilities") pod "d232f8a7-f013-4a5c-a7dc-2150c4b3040c" (UID: "d232f8a7-f013-4a5c-a7dc-2150c4b3040c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 06:48:54 crc kubenswrapper[4825]: I0310 06:48:54.345114 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2gg7k" Mar 10 06:48:54 crc kubenswrapper[4825]: I0310 06:48:54.345200 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2gg7k" Mar 10 06:48:54 crc kubenswrapper[4825]: I0310 06:48:54.357152 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d232f8a7-f013-4a5c-a7dc-2150c4b3040c-kube-api-access-p6bpt" (OuterVolumeSpecName: "kube-api-access-p6bpt") pod "d232f8a7-f013-4a5c-a7dc-2150c4b3040c" (UID: "d232f8a7-f013-4a5c-a7dc-2150c4b3040c"). InnerVolumeSpecName "kube-api-access-p6bpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:48:54 crc kubenswrapper[4825]: I0310 06:48:54.395625 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2gg7k" Mar 10 06:48:54 crc kubenswrapper[4825]: I0310 06:48:54.403071 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d232f8a7-f013-4a5c-a7dc-2150c4b3040c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d232f8a7-f013-4a5c-a7dc-2150c4b3040c" (UID: "d232f8a7-f013-4a5c-a7dc-2150c4b3040c"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 06:48:54 crc kubenswrapper[4825]: I0310 06:48:54.458364 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6bpt\" (UniqueName: \"kubernetes.io/projected/d232f8a7-f013-4a5c-a7dc-2150c4b3040c-kube-api-access-p6bpt\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:54 crc kubenswrapper[4825]: I0310 06:48:54.458447 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d232f8a7-f013-4a5c-a7dc-2150c4b3040c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:54 crc kubenswrapper[4825]: I0310 06:48:54.458461 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d232f8a7-f013-4a5c-a7dc-2150c4b3040c-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:54 crc kubenswrapper[4825]: I0310 06:48:54.460464 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-66c58d94fc-m9h82"] Mar 10 06:48:54 crc kubenswrapper[4825]: I0310 06:48:54.896150 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" event={"ID":"4412cbab-3dc4-4a96-9f37-af4fe4a8aeec","Type":"ContainerStarted","Data":"4d3ab78f1f964615a1f52644d8f9dfcc0fabbe16bad89c014b2407c88431dca7"} Mar 10 06:48:54 crc kubenswrapper[4825]: I0310 06:48:54.904199 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lfhnn" podUID="33ed941e-9788-4c9e-bcd5-6af206460adb" containerName="registry-server" containerID="cri-o://06cd39ab0a6426205c1b1ac093b01e43bdc28b478f51c54cfd31026294143e76" gracePeriod=2 Mar 10 06:48:54 crc kubenswrapper[4825]: I0310 06:48:54.904418 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hbdjf" Mar 10 06:48:54 crc kubenswrapper[4825]: I0310 06:48:54.905253 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hbdjf" event={"ID":"d232f8a7-f013-4a5c-a7dc-2150c4b3040c","Type":"ContainerDied","Data":"f533bccf4abe8d105de241b927bcf4d846a0d32e66c4cd938d03bfd9ed99ce2d"} Mar 10 06:48:54 crc kubenswrapper[4825]: I0310 06:48:54.905377 4825 scope.go:117] "RemoveContainer" containerID="067c6aa47d60fae5b55275d9207c32c6890352828f44c81dd8cb15ea46376e6c" Mar 10 06:48:54 crc kubenswrapper[4825]: I0310 06:48:54.941765 4825 scope.go:117] "RemoveContainer" containerID="ef60518a1963bb3bd15a97cfec6bf1d489c92025c5007d70c66942da30e96e37" Mar 10 06:48:54 crc kubenswrapper[4825]: I0310 06:48:54.959711 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hbdjf"] Mar 10 06:48:54 crc kubenswrapper[4825]: I0310 06:48:54.964721 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hbdjf"] Mar 10 06:48:54 crc kubenswrapper[4825]: I0310 06:48:54.980912 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2gg7k" Mar 10 06:48:54 crc kubenswrapper[4825]: I0310 06:48:54.986214 4825 scope.go:117] "RemoveContainer" containerID="31345fc7f6c0909f76a40e8c716468655688b031af3150f505b158afdeba1bad" Mar 10 06:48:55 crc kubenswrapper[4825]: I0310 06:48:55.249176 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d232f8a7-f013-4a5c-a7dc-2150c4b3040c" path="/var/lib/kubelet/pods/d232f8a7-f013-4a5c-a7dc-2150c4b3040c/volumes" Mar 10 06:48:55 crc kubenswrapper[4825]: I0310 06:48:55.389948 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lfhnn" Mar 10 06:48:55 crc kubenswrapper[4825]: I0310 06:48:55.481264 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33ed941e-9788-4c9e-bcd5-6af206460adb-utilities\") pod \"33ed941e-9788-4c9e-bcd5-6af206460adb\" (UID: \"33ed941e-9788-4c9e-bcd5-6af206460adb\") " Mar 10 06:48:55 crc kubenswrapper[4825]: I0310 06:48:55.481325 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33ed941e-9788-4c9e-bcd5-6af206460adb-catalog-content\") pod \"33ed941e-9788-4c9e-bcd5-6af206460adb\" (UID: \"33ed941e-9788-4c9e-bcd5-6af206460adb\") " Mar 10 06:48:55 crc kubenswrapper[4825]: I0310 06:48:55.481418 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4l84j\" (UniqueName: \"kubernetes.io/projected/33ed941e-9788-4c9e-bcd5-6af206460adb-kube-api-access-4l84j\") pod \"33ed941e-9788-4c9e-bcd5-6af206460adb\" (UID: \"33ed941e-9788-4c9e-bcd5-6af206460adb\") " Mar 10 06:48:55 crc kubenswrapper[4825]: I0310 06:48:55.482181 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33ed941e-9788-4c9e-bcd5-6af206460adb-utilities" (OuterVolumeSpecName: "utilities") pod "33ed941e-9788-4c9e-bcd5-6af206460adb" (UID: "33ed941e-9788-4c9e-bcd5-6af206460adb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 06:48:55 crc kubenswrapper[4825]: I0310 06:48:55.490525 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33ed941e-9788-4c9e-bcd5-6af206460adb-kube-api-access-4l84j" (OuterVolumeSpecName: "kube-api-access-4l84j") pod "33ed941e-9788-4c9e-bcd5-6af206460adb" (UID: "33ed941e-9788-4c9e-bcd5-6af206460adb"). InnerVolumeSpecName "kube-api-access-4l84j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:48:55 crc kubenswrapper[4825]: I0310 06:48:55.537571 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33ed941e-9788-4c9e-bcd5-6af206460adb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "33ed941e-9788-4c9e-bcd5-6af206460adb" (UID: "33ed941e-9788-4c9e-bcd5-6af206460adb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 06:48:55 crc kubenswrapper[4825]: I0310 06:48:55.583185 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33ed941e-9788-4c9e-bcd5-6af206460adb-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:55 crc kubenswrapper[4825]: I0310 06:48:55.583220 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33ed941e-9788-4c9e-bcd5-6af206460adb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:55 crc kubenswrapper[4825]: I0310 06:48:55.583232 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4l84j\" (UniqueName: \"kubernetes.io/projected/33ed941e-9788-4c9e-bcd5-6af206460adb-kube-api-access-4l84j\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:55 crc kubenswrapper[4825]: I0310 06:48:55.714923 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lfxbn"] Mar 10 06:48:55 crc kubenswrapper[4825]: I0310 06:48:55.917258 4825 generic.go:334] "Generic (PLEG): container finished" podID="33ed941e-9788-4c9e-bcd5-6af206460adb" containerID="06cd39ab0a6426205c1b1ac093b01e43bdc28b478f51c54cfd31026294143e76" exitCode=0 Mar 10 06:48:55 crc kubenswrapper[4825]: I0310 06:48:55.917379 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lfhnn" Mar 10 06:48:55 crc kubenswrapper[4825]: I0310 06:48:55.917366 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lfhnn" event={"ID":"33ed941e-9788-4c9e-bcd5-6af206460adb","Type":"ContainerDied","Data":"06cd39ab0a6426205c1b1ac093b01e43bdc28b478f51c54cfd31026294143e76"} Mar 10 06:48:55 crc kubenswrapper[4825]: I0310 06:48:55.917577 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lfhnn" event={"ID":"33ed941e-9788-4c9e-bcd5-6af206460adb","Type":"ContainerDied","Data":"8f55c48652aa364654b9e17b26a212598ff09f76119329099cc12e8619f83286"} Mar 10 06:48:55 crc kubenswrapper[4825]: I0310 06:48:55.917618 4825 scope.go:117] "RemoveContainer" containerID="06cd39ab0a6426205c1b1ac093b01e43bdc28b478f51c54cfd31026294143e76" Mar 10 06:48:55 crc kubenswrapper[4825]: I0310 06:48:55.921780 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" event={"ID":"4412cbab-3dc4-4a96-9f37-af4fe4a8aeec","Type":"ContainerStarted","Data":"60b7b3323b44ece7aeba45d9b44064e91dbb643c5244e963309356f1516fe5b5"} Mar 10 06:48:55 crc kubenswrapper[4825]: I0310 06:48:55.948281 4825 scope.go:117] "RemoveContainer" containerID="41f362b98d4228d78d6329a4f4232a394052b85cdaf3037fad8d27e0907b4015" Mar 10 06:48:55 crc kubenswrapper[4825]: I0310 06:48:55.959239 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" podStartSLOduration=30.959211767 podStartE2EDuration="30.959211767s" podCreationTimestamp="2026-03-10 06:48:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:48:55.947206774 +0000 UTC m=+288.976987389" watchObservedRunningTime="2026-03-10 06:48:55.959211767 +0000 UTC 
m=+288.988992382" Mar 10 06:48:55 crc kubenswrapper[4825]: I0310 06:48:55.969695 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lfhnn"] Mar 10 06:48:55 crc kubenswrapper[4825]: I0310 06:48:55.981656 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lfhnn"] Mar 10 06:48:55 crc kubenswrapper[4825]: I0310 06:48:55.986455 4825 scope.go:117] "RemoveContainer" containerID="42ce30d0dd130f168111305056b5f9a7adfb58a8db5677fb87359586bbd611ae" Mar 10 06:48:56 crc kubenswrapper[4825]: I0310 06:48:56.014409 4825 scope.go:117] "RemoveContainer" containerID="06cd39ab0a6426205c1b1ac093b01e43bdc28b478f51c54cfd31026294143e76" Mar 10 06:48:56 crc kubenswrapper[4825]: E0310 06:48:56.015006 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06cd39ab0a6426205c1b1ac093b01e43bdc28b478f51c54cfd31026294143e76\": container with ID starting with 06cd39ab0a6426205c1b1ac093b01e43bdc28b478f51c54cfd31026294143e76 not found: ID does not exist" containerID="06cd39ab0a6426205c1b1ac093b01e43bdc28b478f51c54cfd31026294143e76" Mar 10 06:48:56 crc kubenswrapper[4825]: I0310 06:48:56.015040 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06cd39ab0a6426205c1b1ac093b01e43bdc28b478f51c54cfd31026294143e76"} err="failed to get container status \"06cd39ab0a6426205c1b1ac093b01e43bdc28b478f51c54cfd31026294143e76\": rpc error: code = NotFound desc = could not find container \"06cd39ab0a6426205c1b1ac093b01e43bdc28b478f51c54cfd31026294143e76\": container with ID starting with 06cd39ab0a6426205c1b1ac093b01e43bdc28b478f51c54cfd31026294143e76 not found: ID does not exist" Mar 10 06:48:56 crc kubenswrapper[4825]: I0310 06:48:56.015067 4825 scope.go:117] "RemoveContainer" containerID="41f362b98d4228d78d6329a4f4232a394052b85cdaf3037fad8d27e0907b4015" Mar 10 06:48:56 crc kubenswrapper[4825]: 
E0310 06:48:56.015535 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41f362b98d4228d78d6329a4f4232a394052b85cdaf3037fad8d27e0907b4015\": container with ID starting with 41f362b98d4228d78d6329a4f4232a394052b85cdaf3037fad8d27e0907b4015 not found: ID does not exist" containerID="41f362b98d4228d78d6329a4f4232a394052b85cdaf3037fad8d27e0907b4015" Mar 10 06:48:56 crc kubenswrapper[4825]: I0310 06:48:56.015556 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41f362b98d4228d78d6329a4f4232a394052b85cdaf3037fad8d27e0907b4015"} err="failed to get container status \"41f362b98d4228d78d6329a4f4232a394052b85cdaf3037fad8d27e0907b4015\": rpc error: code = NotFound desc = could not find container \"41f362b98d4228d78d6329a4f4232a394052b85cdaf3037fad8d27e0907b4015\": container with ID starting with 41f362b98d4228d78d6329a4f4232a394052b85cdaf3037fad8d27e0907b4015 not found: ID does not exist" Mar 10 06:48:56 crc kubenswrapper[4825]: I0310 06:48:56.015568 4825 scope.go:117] "RemoveContainer" containerID="42ce30d0dd130f168111305056b5f9a7adfb58a8db5677fb87359586bbd611ae" Mar 10 06:48:56 crc kubenswrapper[4825]: E0310 06:48:56.015775 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42ce30d0dd130f168111305056b5f9a7adfb58a8db5677fb87359586bbd611ae\": container with ID starting with 42ce30d0dd130f168111305056b5f9a7adfb58a8db5677fb87359586bbd611ae not found: ID does not exist" containerID="42ce30d0dd130f168111305056b5f9a7adfb58a8db5677fb87359586bbd611ae" Mar 10 06:48:56 crc kubenswrapper[4825]: I0310 06:48:56.015794 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42ce30d0dd130f168111305056b5f9a7adfb58a8db5677fb87359586bbd611ae"} err="failed to get container status \"42ce30d0dd130f168111305056b5f9a7adfb58a8db5677fb87359586bbd611ae\": 
rpc error: code = NotFound desc = could not find container \"42ce30d0dd130f168111305056b5f9a7adfb58a8db5677fb87359586bbd611ae\": container with ID starting with 42ce30d0dd130f168111305056b5f9a7adfb58a8db5677fb87359586bbd611ae not found: ID does not exist" Mar 10 06:48:56 crc kubenswrapper[4825]: I0310 06:48:56.935062 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lfxbn" podUID="823b95d6-3aa7-4fda-bd75-9fe7e95a209f" containerName="registry-server" containerID="cri-o://9551d524c53c2343b1dfcd7f6a4247a5495d670c3403a8839a30f91df10a8d70" gracePeriod=2 Mar 10 06:48:56 crc kubenswrapper[4825]: I0310 06:48:56.935353 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" Mar 10 06:48:56 crc kubenswrapper[4825]: I0310 06:48:56.946651 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-66c58d94fc-m9h82" Mar 10 06:48:57 crc kubenswrapper[4825]: I0310 06:48:57.250221 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33ed941e-9788-4c9e-bcd5-6af206460adb" path="/var/lib/kubelet/pods/33ed941e-9788-4c9e-bcd5-6af206460adb/volumes" Mar 10 06:48:57 crc kubenswrapper[4825]: I0310 06:48:57.448766 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lfxbn" Mar 10 06:48:57 crc kubenswrapper[4825]: I0310 06:48:57.521660 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/823b95d6-3aa7-4fda-bd75-9fe7e95a209f-catalog-content\") pod \"823b95d6-3aa7-4fda-bd75-9fe7e95a209f\" (UID: \"823b95d6-3aa7-4fda-bd75-9fe7e95a209f\") " Mar 10 06:48:57 crc kubenswrapper[4825]: I0310 06:48:57.521804 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fn9kg\" (UniqueName: \"kubernetes.io/projected/823b95d6-3aa7-4fda-bd75-9fe7e95a209f-kube-api-access-fn9kg\") pod \"823b95d6-3aa7-4fda-bd75-9fe7e95a209f\" (UID: \"823b95d6-3aa7-4fda-bd75-9fe7e95a209f\") " Mar 10 06:48:57 crc kubenswrapper[4825]: I0310 06:48:57.521962 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/823b95d6-3aa7-4fda-bd75-9fe7e95a209f-utilities\") pod \"823b95d6-3aa7-4fda-bd75-9fe7e95a209f\" (UID: \"823b95d6-3aa7-4fda-bd75-9fe7e95a209f\") " Mar 10 06:48:57 crc kubenswrapper[4825]: I0310 06:48:57.522995 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/823b95d6-3aa7-4fda-bd75-9fe7e95a209f-utilities" (OuterVolumeSpecName: "utilities") pod "823b95d6-3aa7-4fda-bd75-9fe7e95a209f" (UID: "823b95d6-3aa7-4fda-bd75-9fe7e95a209f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 06:48:57 crc kubenswrapper[4825]: I0310 06:48:57.523403 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/823b95d6-3aa7-4fda-bd75-9fe7e95a209f-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:57 crc kubenswrapper[4825]: I0310 06:48:57.533482 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/823b95d6-3aa7-4fda-bd75-9fe7e95a209f-kube-api-access-fn9kg" (OuterVolumeSpecName: "kube-api-access-fn9kg") pod "823b95d6-3aa7-4fda-bd75-9fe7e95a209f" (UID: "823b95d6-3aa7-4fda-bd75-9fe7e95a209f"). InnerVolumeSpecName "kube-api-access-fn9kg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:48:57 crc kubenswrapper[4825]: I0310 06:48:57.550814 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/823b95d6-3aa7-4fda-bd75-9fe7e95a209f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "823b95d6-3aa7-4fda-bd75-9fe7e95a209f" (UID: "823b95d6-3aa7-4fda-bd75-9fe7e95a209f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 06:48:57 crc kubenswrapper[4825]: I0310 06:48:57.625269 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/823b95d6-3aa7-4fda-bd75-9fe7e95a209f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:57 crc kubenswrapper[4825]: I0310 06:48:57.625302 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fn9kg\" (UniqueName: \"kubernetes.io/projected/823b95d6-3aa7-4fda-bd75-9fe7e95a209f-kube-api-access-fn9kg\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:57 crc kubenswrapper[4825]: I0310 06:48:57.945304 4825 generic.go:334] "Generic (PLEG): container finished" podID="823b95d6-3aa7-4fda-bd75-9fe7e95a209f" containerID="9551d524c53c2343b1dfcd7f6a4247a5495d670c3403a8839a30f91df10a8d70" exitCode=0 Mar 10 06:48:57 crc kubenswrapper[4825]: I0310 06:48:57.945395 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lfxbn" event={"ID":"823b95d6-3aa7-4fda-bd75-9fe7e95a209f","Type":"ContainerDied","Data":"9551d524c53c2343b1dfcd7f6a4247a5495d670c3403a8839a30f91df10a8d70"} Mar 10 06:48:57 crc kubenswrapper[4825]: I0310 06:48:57.945441 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lfxbn" Mar 10 06:48:57 crc kubenswrapper[4825]: I0310 06:48:57.945466 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lfxbn" event={"ID":"823b95d6-3aa7-4fda-bd75-9fe7e95a209f","Type":"ContainerDied","Data":"330caed1df0291cf2f9b79b348597ab836872a68dd3151b16fbdaf8f85df95c0"} Mar 10 06:48:57 crc kubenswrapper[4825]: I0310 06:48:57.945498 4825 scope.go:117] "RemoveContainer" containerID="9551d524c53c2343b1dfcd7f6a4247a5495d670c3403a8839a30f91df10a8d70" Mar 10 06:48:57 crc kubenswrapper[4825]: I0310 06:48:57.969632 4825 scope.go:117] "RemoveContainer" containerID="2b21d0117de4c1b41f35b94dbcdc46feb174d950c22f6f642ce38167b2226e4e" Mar 10 06:48:57 crc kubenswrapper[4825]: I0310 06:48:57.980648 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lfxbn"] Mar 10 06:48:57 crc kubenswrapper[4825]: I0310 06:48:57.983505 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lfxbn"] Mar 10 06:48:58 crc kubenswrapper[4825]: I0310 06:48:58.010054 4825 scope.go:117] "RemoveContainer" containerID="b3da0856c57dd8f0381398216d95ec848f6fa7015fcd8844ccd31233e59c1403" Mar 10 06:48:58 crc kubenswrapper[4825]: I0310 06:48:58.037153 4825 scope.go:117] "RemoveContainer" containerID="9551d524c53c2343b1dfcd7f6a4247a5495d670c3403a8839a30f91df10a8d70" Mar 10 06:48:58 crc kubenswrapper[4825]: E0310 06:48:58.037920 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9551d524c53c2343b1dfcd7f6a4247a5495d670c3403a8839a30f91df10a8d70\": container with ID starting with 9551d524c53c2343b1dfcd7f6a4247a5495d670c3403a8839a30f91df10a8d70 not found: ID does not exist" containerID="9551d524c53c2343b1dfcd7f6a4247a5495d670c3403a8839a30f91df10a8d70" Mar 10 06:48:58 crc kubenswrapper[4825]: I0310 06:48:58.037977 4825 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9551d524c53c2343b1dfcd7f6a4247a5495d670c3403a8839a30f91df10a8d70"} err="failed to get container status \"9551d524c53c2343b1dfcd7f6a4247a5495d670c3403a8839a30f91df10a8d70\": rpc error: code = NotFound desc = could not find container \"9551d524c53c2343b1dfcd7f6a4247a5495d670c3403a8839a30f91df10a8d70\": container with ID starting with 9551d524c53c2343b1dfcd7f6a4247a5495d670c3403a8839a30f91df10a8d70 not found: ID does not exist" Mar 10 06:48:58 crc kubenswrapper[4825]: I0310 06:48:58.038024 4825 scope.go:117] "RemoveContainer" containerID="2b21d0117de4c1b41f35b94dbcdc46feb174d950c22f6f642ce38167b2226e4e" Mar 10 06:48:58 crc kubenswrapper[4825]: E0310 06:48:58.038723 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b21d0117de4c1b41f35b94dbcdc46feb174d950c22f6f642ce38167b2226e4e\": container with ID starting with 2b21d0117de4c1b41f35b94dbcdc46feb174d950c22f6f642ce38167b2226e4e not found: ID does not exist" containerID="2b21d0117de4c1b41f35b94dbcdc46feb174d950c22f6f642ce38167b2226e4e" Mar 10 06:48:58 crc kubenswrapper[4825]: I0310 06:48:58.038835 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b21d0117de4c1b41f35b94dbcdc46feb174d950c22f6f642ce38167b2226e4e"} err="failed to get container status \"2b21d0117de4c1b41f35b94dbcdc46feb174d950c22f6f642ce38167b2226e4e\": rpc error: code = NotFound desc = could not find container \"2b21d0117de4c1b41f35b94dbcdc46feb174d950c22f6f642ce38167b2226e4e\": container with ID starting with 2b21d0117de4c1b41f35b94dbcdc46feb174d950c22f6f642ce38167b2226e4e not found: ID does not exist" Mar 10 06:48:58 crc kubenswrapper[4825]: I0310 06:48:58.038903 4825 scope.go:117] "RemoveContainer" containerID="b3da0856c57dd8f0381398216d95ec848f6fa7015fcd8844ccd31233e59c1403" Mar 10 06:48:58 crc kubenswrapper[4825]: E0310 
06:48:58.039558 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3da0856c57dd8f0381398216d95ec848f6fa7015fcd8844ccd31233e59c1403\": container with ID starting with b3da0856c57dd8f0381398216d95ec848f6fa7015fcd8844ccd31233e59c1403 not found: ID does not exist" containerID="b3da0856c57dd8f0381398216d95ec848f6fa7015fcd8844ccd31233e59c1403" Mar 10 06:48:58 crc kubenswrapper[4825]: I0310 06:48:58.039611 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3da0856c57dd8f0381398216d95ec848f6fa7015fcd8844ccd31233e59c1403"} err="failed to get container status \"b3da0856c57dd8f0381398216d95ec848f6fa7015fcd8844ccd31233e59c1403\": rpc error: code = NotFound desc = could not find container \"b3da0856c57dd8f0381398216d95ec848f6fa7015fcd8844ccd31233e59c1403\": container with ID starting with b3da0856c57dd8f0381398216d95ec848f6fa7015fcd8844ccd31233e59c1403 not found: ID does not exist" Mar 10 06:48:58 crc kubenswrapper[4825]: I0310 06:48:58.113400 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2gg7k"] Mar 10 06:48:58 crc kubenswrapper[4825]: I0310 06:48:58.113734 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2gg7k" podUID="58d27fca-dd33-4753-9ea6-4b55c191b2b8" containerName="registry-server" containerID="cri-o://16702d347b624b3bd369486d2ad384ae2237c37d9deaac93249aca1b3e347f67" gracePeriod=2 Mar 10 06:48:58 crc kubenswrapper[4825]: I0310 06:48:58.645258 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2gg7k" Mar 10 06:48:58 crc kubenswrapper[4825]: I0310 06:48:58.742869 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58d27fca-dd33-4753-9ea6-4b55c191b2b8-catalog-content\") pod \"58d27fca-dd33-4753-9ea6-4b55c191b2b8\" (UID: \"58d27fca-dd33-4753-9ea6-4b55c191b2b8\") " Mar 10 06:48:58 crc kubenswrapper[4825]: I0310 06:48:58.743012 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkc9w\" (UniqueName: \"kubernetes.io/projected/58d27fca-dd33-4753-9ea6-4b55c191b2b8-kube-api-access-hkc9w\") pod \"58d27fca-dd33-4753-9ea6-4b55c191b2b8\" (UID: \"58d27fca-dd33-4753-9ea6-4b55c191b2b8\") " Mar 10 06:48:58 crc kubenswrapper[4825]: I0310 06:48:58.743099 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58d27fca-dd33-4753-9ea6-4b55c191b2b8-utilities\") pod \"58d27fca-dd33-4753-9ea6-4b55c191b2b8\" (UID: \"58d27fca-dd33-4753-9ea6-4b55c191b2b8\") " Mar 10 06:48:58 crc kubenswrapper[4825]: I0310 06:48:58.743937 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58d27fca-dd33-4753-9ea6-4b55c191b2b8-utilities" (OuterVolumeSpecName: "utilities") pod "58d27fca-dd33-4753-9ea6-4b55c191b2b8" (UID: "58d27fca-dd33-4753-9ea6-4b55c191b2b8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 06:48:58 crc kubenswrapper[4825]: I0310 06:48:58.751468 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58d27fca-dd33-4753-9ea6-4b55c191b2b8-kube-api-access-hkc9w" (OuterVolumeSpecName: "kube-api-access-hkc9w") pod "58d27fca-dd33-4753-9ea6-4b55c191b2b8" (UID: "58d27fca-dd33-4753-9ea6-4b55c191b2b8"). InnerVolumeSpecName "kube-api-access-hkc9w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:48:58 crc kubenswrapper[4825]: I0310 06:48:58.845846 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkc9w\" (UniqueName: \"kubernetes.io/projected/58d27fca-dd33-4753-9ea6-4b55c191b2b8-kube-api-access-hkc9w\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:58 crc kubenswrapper[4825]: I0310 06:48:58.845902 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58d27fca-dd33-4753-9ea6-4b55c191b2b8-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:58 crc kubenswrapper[4825]: I0310 06:48:58.872427 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58d27fca-dd33-4753-9ea6-4b55c191b2b8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "58d27fca-dd33-4753-9ea6-4b55c191b2b8" (UID: "58d27fca-dd33-4753-9ea6-4b55c191b2b8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 06:48:58 crc kubenswrapper[4825]: I0310 06:48:58.946818 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58d27fca-dd33-4753-9ea6-4b55c191b2b8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 06:48:58 crc kubenswrapper[4825]: I0310 06:48:58.957273 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2gg7k" event={"ID":"58d27fca-dd33-4753-9ea6-4b55c191b2b8","Type":"ContainerDied","Data":"16702d347b624b3bd369486d2ad384ae2237c37d9deaac93249aca1b3e347f67"} Mar 10 06:48:58 crc kubenswrapper[4825]: I0310 06:48:58.957308 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2gg7k" Mar 10 06:48:58 crc kubenswrapper[4825]: I0310 06:48:58.957371 4825 scope.go:117] "RemoveContainer" containerID="16702d347b624b3bd369486d2ad384ae2237c37d9deaac93249aca1b3e347f67" Mar 10 06:48:58 crc kubenswrapper[4825]: I0310 06:48:58.957036 4825 generic.go:334] "Generic (PLEG): container finished" podID="58d27fca-dd33-4753-9ea6-4b55c191b2b8" containerID="16702d347b624b3bd369486d2ad384ae2237c37d9deaac93249aca1b3e347f67" exitCode=0 Mar 10 06:48:58 crc kubenswrapper[4825]: I0310 06:48:58.957765 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2gg7k" event={"ID":"58d27fca-dd33-4753-9ea6-4b55c191b2b8","Type":"ContainerDied","Data":"0c2e196c25e5380e2ecd9888b8afb6756aca7df98c3ac9f3fda07f21268c2417"} Mar 10 06:48:58 crc kubenswrapper[4825]: I0310 06:48:58.982397 4825 scope.go:117] "RemoveContainer" containerID="3d3c788e71e7ecc21bd432e78cfca9d09c70e1fabcd5fc612ecdd9e2c7cd1ef2" Mar 10 06:48:58 crc kubenswrapper[4825]: I0310 06:48:58.989359 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2gg7k"] Mar 10 06:48:59 crc kubenswrapper[4825]: I0310 06:48:59.000660 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2gg7k"] Mar 10 06:48:59 crc kubenswrapper[4825]: I0310 06:48:59.019361 4825 scope.go:117] "RemoveContainer" containerID="ffea082d781ea4dd1701576ab4dbc553eb6b6655cc4aab424e4722dd61f13604" Mar 10 06:48:59 crc kubenswrapper[4825]: I0310 06:48:59.051646 4825 scope.go:117] "RemoveContainer" containerID="16702d347b624b3bd369486d2ad384ae2237c37d9deaac93249aca1b3e347f67" Mar 10 06:48:59 crc kubenswrapper[4825]: E0310 06:48:59.052297 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16702d347b624b3bd369486d2ad384ae2237c37d9deaac93249aca1b3e347f67\": container with ID starting with 
16702d347b624b3bd369486d2ad384ae2237c37d9deaac93249aca1b3e347f67 not found: ID does not exist" containerID="16702d347b624b3bd369486d2ad384ae2237c37d9deaac93249aca1b3e347f67" Mar 10 06:48:59 crc kubenswrapper[4825]: I0310 06:48:59.052331 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16702d347b624b3bd369486d2ad384ae2237c37d9deaac93249aca1b3e347f67"} err="failed to get container status \"16702d347b624b3bd369486d2ad384ae2237c37d9deaac93249aca1b3e347f67\": rpc error: code = NotFound desc = could not find container \"16702d347b624b3bd369486d2ad384ae2237c37d9deaac93249aca1b3e347f67\": container with ID starting with 16702d347b624b3bd369486d2ad384ae2237c37d9deaac93249aca1b3e347f67 not found: ID does not exist" Mar 10 06:48:59 crc kubenswrapper[4825]: I0310 06:48:59.052357 4825 scope.go:117] "RemoveContainer" containerID="3d3c788e71e7ecc21bd432e78cfca9d09c70e1fabcd5fc612ecdd9e2c7cd1ef2" Mar 10 06:48:59 crc kubenswrapper[4825]: E0310 06:48:59.052693 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d3c788e71e7ecc21bd432e78cfca9d09c70e1fabcd5fc612ecdd9e2c7cd1ef2\": container with ID starting with 3d3c788e71e7ecc21bd432e78cfca9d09c70e1fabcd5fc612ecdd9e2c7cd1ef2 not found: ID does not exist" containerID="3d3c788e71e7ecc21bd432e78cfca9d09c70e1fabcd5fc612ecdd9e2c7cd1ef2" Mar 10 06:48:59 crc kubenswrapper[4825]: I0310 06:48:59.052723 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d3c788e71e7ecc21bd432e78cfca9d09c70e1fabcd5fc612ecdd9e2c7cd1ef2"} err="failed to get container status \"3d3c788e71e7ecc21bd432e78cfca9d09c70e1fabcd5fc612ecdd9e2c7cd1ef2\": rpc error: code = NotFound desc = could not find container \"3d3c788e71e7ecc21bd432e78cfca9d09c70e1fabcd5fc612ecdd9e2c7cd1ef2\": container with ID starting with 3d3c788e71e7ecc21bd432e78cfca9d09c70e1fabcd5fc612ecdd9e2c7cd1ef2 not found: ID does not 
exist" Mar 10 06:48:59 crc kubenswrapper[4825]: I0310 06:48:59.052786 4825 scope.go:117] "RemoveContainer" containerID="ffea082d781ea4dd1701576ab4dbc553eb6b6655cc4aab424e4722dd61f13604" Mar 10 06:48:59 crc kubenswrapper[4825]: E0310 06:48:59.053454 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffea082d781ea4dd1701576ab4dbc553eb6b6655cc4aab424e4722dd61f13604\": container with ID starting with ffea082d781ea4dd1701576ab4dbc553eb6b6655cc4aab424e4722dd61f13604 not found: ID does not exist" containerID="ffea082d781ea4dd1701576ab4dbc553eb6b6655cc4aab424e4722dd61f13604" Mar 10 06:48:59 crc kubenswrapper[4825]: I0310 06:48:59.053529 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffea082d781ea4dd1701576ab4dbc553eb6b6655cc4aab424e4722dd61f13604"} err="failed to get container status \"ffea082d781ea4dd1701576ab4dbc553eb6b6655cc4aab424e4722dd61f13604\": rpc error: code = NotFound desc = could not find container \"ffea082d781ea4dd1701576ab4dbc553eb6b6655cc4aab424e4722dd61f13604\": container with ID starting with ffea082d781ea4dd1701576ab4dbc553eb6b6655cc4aab424e4722dd61f13604 not found: ID does not exist" Mar 10 06:48:59 crc kubenswrapper[4825]: I0310 06:48:59.251094 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58d27fca-dd33-4753-9ea6-4b55c191b2b8" path="/var/lib/kubelet/pods/58d27fca-dd33-4753-9ea6-4b55c191b2b8/volumes" Mar 10 06:48:59 crc kubenswrapper[4825]: I0310 06:48:59.253264 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="823b95d6-3aa7-4fda-bd75-9fe7e95a209f" path="/var/lib/kubelet/pods/823b95d6-3aa7-4fda-bd75-9fe7e95a209f/volumes" Mar 10 06:49:09 crc kubenswrapper[4825]: I0310 06:49:09.312475 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5ffd5f77bc-vh5ll"] Mar 10 06:49:09 crc kubenswrapper[4825]: I0310 06:49:09.313383 
4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5ffd5f77bc-vh5ll" podUID="a48b68df-105c-418f-9cb7-525cb3641e20" containerName="controller-manager" containerID="cri-o://ffceb64c382508bdd878ad3e3ae44bd2e72c872f3e13b5c45be505c20730c0b2" gracePeriod=30 Mar 10 06:49:09 crc kubenswrapper[4825]: I0310 06:49:09.405412 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57886dd6c5-vszwj"] Mar 10 06:49:09 crc kubenswrapper[4825]: I0310 06:49:09.405972 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-57886dd6c5-vszwj" podUID="c6152dc4-04ac-4ace-880d-67f4be272e50" containerName="route-controller-manager" containerID="cri-o://a36301ca3ac9dc807e52bfa5b981c9190333c861f97f982cb238f864042460a7" gracePeriod=30 Mar 10 06:49:09 crc kubenswrapper[4825]: I0310 06:49:09.834498 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5ffd5f77bc-vh5ll" Mar 10 06:49:09 crc kubenswrapper[4825]: I0310 06:49:09.840572 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57886dd6c5-vszwj" Mar 10 06:49:09 crc kubenswrapper[4825]: I0310 06:49:09.940788 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6152dc4-04ac-4ace-880d-67f4be272e50-serving-cert\") pod \"c6152dc4-04ac-4ace-880d-67f4be272e50\" (UID: \"c6152dc4-04ac-4ace-880d-67f4be272e50\") " Mar 10 06:49:09 crc kubenswrapper[4825]: I0310 06:49:09.940867 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxvrj\" (UniqueName: \"kubernetes.io/projected/c6152dc4-04ac-4ace-880d-67f4be272e50-kube-api-access-sxvrj\") pod \"c6152dc4-04ac-4ace-880d-67f4be272e50\" (UID: \"c6152dc4-04ac-4ace-880d-67f4be272e50\") " Mar 10 06:49:09 crc kubenswrapper[4825]: I0310 06:49:09.940932 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a48b68df-105c-418f-9cb7-525cb3641e20-serving-cert\") pod \"a48b68df-105c-418f-9cb7-525cb3641e20\" (UID: \"a48b68df-105c-418f-9cb7-525cb3641e20\") " Mar 10 06:49:09 crc kubenswrapper[4825]: I0310 06:49:09.940963 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a48b68df-105c-418f-9cb7-525cb3641e20-config\") pod \"a48b68df-105c-418f-9cb7-525cb3641e20\" (UID: \"a48b68df-105c-418f-9cb7-525cb3641e20\") " Mar 10 06:49:09 crc kubenswrapper[4825]: I0310 06:49:09.941026 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wflvv\" (UniqueName: \"kubernetes.io/projected/a48b68df-105c-418f-9cb7-525cb3641e20-kube-api-access-wflvv\") pod \"a48b68df-105c-418f-9cb7-525cb3641e20\" (UID: \"a48b68df-105c-418f-9cb7-525cb3641e20\") " Mar 10 06:49:09 crc kubenswrapper[4825]: I0310 06:49:09.941076 4825 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a48b68df-105c-418f-9cb7-525cb3641e20-proxy-ca-bundles\") pod \"a48b68df-105c-418f-9cb7-525cb3641e20\" (UID: \"a48b68df-105c-418f-9cb7-525cb3641e20\") " Mar 10 06:49:09 crc kubenswrapper[4825]: I0310 06:49:09.941099 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6152dc4-04ac-4ace-880d-67f4be272e50-config\") pod \"c6152dc4-04ac-4ace-880d-67f4be272e50\" (UID: \"c6152dc4-04ac-4ace-880d-67f4be272e50\") " Mar 10 06:49:09 crc kubenswrapper[4825]: I0310 06:49:09.941171 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a48b68df-105c-418f-9cb7-525cb3641e20-client-ca\") pod \"a48b68df-105c-418f-9cb7-525cb3641e20\" (UID: \"a48b68df-105c-418f-9cb7-525cb3641e20\") " Mar 10 06:49:09 crc kubenswrapper[4825]: I0310 06:49:09.941202 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c6152dc4-04ac-4ace-880d-67f4be272e50-client-ca\") pod \"c6152dc4-04ac-4ace-880d-67f4be272e50\" (UID: \"c6152dc4-04ac-4ace-880d-67f4be272e50\") " Mar 10 06:49:09 crc kubenswrapper[4825]: I0310 06:49:09.942328 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6152dc4-04ac-4ace-880d-67f4be272e50-client-ca" (OuterVolumeSpecName: "client-ca") pod "c6152dc4-04ac-4ace-880d-67f4be272e50" (UID: "c6152dc4-04ac-4ace-880d-67f4be272e50"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:49:09 crc kubenswrapper[4825]: I0310 06:49:09.943119 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a48b68df-105c-418f-9cb7-525cb3641e20-client-ca" (OuterVolumeSpecName: "client-ca") pod "a48b68df-105c-418f-9cb7-525cb3641e20" (UID: "a48b68df-105c-418f-9cb7-525cb3641e20"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:49:09 crc kubenswrapper[4825]: I0310 06:49:09.943226 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a48b68df-105c-418f-9cb7-525cb3641e20-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a48b68df-105c-418f-9cb7-525cb3641e20" (UID: "a48b68df-105c-418f-9cb7-525cb3641e20"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:49:09 crc kubenswrapper[4825]: I0310 06:49:09.944043 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6152dc4-04ac-4ace-880d-67f4be272e50-config" (OuterVolumeSpecName: "config") pod "c6152dc4-04ac-4ace-880d-67f4be272e50" (UID: "c6152dc4-04ac-4ace-880d-67f4be272e50"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:49:09 crc kubenswrapper[4825]: I0310 06:49:09.945354 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a48b68df-105c-418f-9cb7-525cb3641e20-config" (OuterVolumeSpecName: "config") pod "a48b68df-105c-418f-9cb7-525cb3641e20" (UID: "a48b68df-105c-418f-9cb7-525cb3641e20"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:49:09 crc kubenswrapper[4825]: I0310 06:49:09.948635 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6152dc4-04ac-4ace-880d-67f4be272e50-kube-api-access-sxvrj" (OuterVolumeSpecName: "kube-api-access-sxvrj") pod "c6152dc4-04ac-4ace-880d-67f4be272e50" (UID: "c6152dc4-04ac-4ace-880d-67f4be272e50"). InnerVolumeSpecName "kube-api-access-sxvrj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:49:09 crc kubenswrapper[4825]: I0310 06:49:09.948696 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a48b68df-105c-418f-9cb7-525cb3641e20-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a48b68df-105c-418f-9cb7-525cb3641e20" (UID: "a48b68df-105c-418f-9cb7-525cb3641e20"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:49:09 crc kubenswrapper[4825]: I0310 06:49:09.948879 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a48b68df-105c-418f-9cb7-525cb3641e20-kube-api-access-wflvv" (OuterVolumeSpecName: "kube-api-access-wflvv") pod "a48b68df-105c-418f-9cb7-525cb3641e20" (UID: "a48b68df-105c-418f-9cb7-525cb3641e20"). InnerVolumeSpecName "kube-api-access-wflvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:49:09 crc kubenswrapper[4825]: I0310 06:49:09.951430 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6152dc4-04ac-4ace-880d-67f4be272e50-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c6152dc4-04ac-4ace-880d-67f4be272e50" (UID: "c6152dc4-04ac-4ace-880d-67f4be272e50"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.043525 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wflvv\" (UniqueName: \"kubernetes.io/projected/a48b68df-105c-418f-9cb7-525cb3641e20-kube-api-access-wflvv\") on node \"crc\" DevicePath \"\"" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.044027 4825 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a48b68df-105c-418f-9cb7-525cb3641e20-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.044049 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6152dc4-04ac-4ace-880d-67f4be272e50-config\") on node \"crc\" DevicePath \"\"" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.044063 4825 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a48b68df-105c-418f-9cb7-525cb3641e20-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.044078 4825 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c6152dc4-04ac-4ace-880d-67f4be272e50-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.044090 4825 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6152dc4-04ac-4ace-880d-67f4be272e50-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.044102 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxvrj\" (UniqueName: \"kubernetes.io/projected/c6152dc4-04ac-4ace-880d-67f4be272e50-kube-api-access-sxvrj\") on node \"crc\" DevicePath \"\"" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.044114 4825 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a48b68df-105c-418f-9cb7-525cb3641e20-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.044179 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a48b68df-105c-418f-9cb7-525cb3641e20-config\") on node \"crc\" DevicePath \"\"" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.070422 4825 generic.go:334] "Generic (PLEG): container finished" podID="a48b68df-105c-418f-9cb7-525cb3641e20" containerID="ffceb64c382508bdd878ad3e3ae44bd2e72c872f3e13b5c45be505c20730c0b2" exitCode=0 Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.070529 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5ffd5f77bc-vh5ll" event={"ID":"a48b68df-105c-418f-9cb7-525cb3641e20","Type":"ContainerDied","Data":"ffceb64c382508bdd878ad3e3ae44bd2e72c872f3e13b5c45be505c20730c0b2"} Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.070574 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5ffd5f77bc-vh5ll" event={"ID":"a48b68df-105c-418f-9cb7-525cb3641e20","Type":"ContainerDied","Data":"7673d6e05a681f6445cb3df09b4306b814ea627794e397b0e3b0e41dad3f5eaf"} Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.070604 4825 scope.go:117] "RemoveContainer" containerID="ffceb64c382508bdd878ad3e3ae44bd2e72c872f3e13b5c45be505c20730c0b2" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.070765 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5ffd5f77bc-vh5ll" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.075345 4825 generic.go:334] "Generic (PLEG): container finished" podID="c6152dc4-04ac-4ace-880d-67f4be272e50" containerID="a36301ca3ac9dc807e52bfa5b981c9190333c861f97f982cb238f864042460a7" exitCode=0 Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.075406 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57886dd6c5-vszwj" event={"ID":"c6152dc4-04ac-4ace-880d-67f4be272e50","Type":"ContainerDied","Data":"a36301ca3ac9dc807e52bfa5b981c9190333c861f97f982cb238f864042460a7"} Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.075438 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57886dd6c5-vszwj" event={"ID":"c6152dc4-04ac-4ace-880d-67f4be272e50","Type":"ContainerDied","Data":"90923028eb71f7a7b7023e77d0d0db15b826b4c3f4891e39b0066aae61b98600"} Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.075499 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57886dd6c5-vszwj" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.096402 4825 scope.go:117] "RemoveContainer" containerID="ffceb64c382508bdd878ad3e3ae44bd2e72c872f3e13b5c45be505c20730c0b2" Mar 10 06:49:10 crc kubenswrapper[4825]: E0310 06:49:10.096809 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffceb64c382508bdd878ad3e3ae44bd2e72c872f3e13b5c45be505c20730c0b2\": container with ID starting with ffceb64c382508bdd878ad3e3ae44bd2e72c872f3e13b5c45be505c20730c0b2 not found: ID does not exist" containerID="ffceb64c382508bdd878ad3e3ae44bd2e72c872f3e13b5c45be505c20730c0b2" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.096843 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffceb64c382508bdd878ad3e3ae44bd2e72c872f3e13b5c45be505c20730c0b2"} err="failed to get container status \"ffceb64c382508bdd878ad3e3ae44bd2e72c872f3e13b5c45be505c20730c0b2\": rpc error: code = NotFound desc = could not find container \"ffceb64c382508bdd878ad3e3ae44bd2e72c872f3e13b5c45be505c20730c0b2\": container with ID starting with ffceb64c382508bdd878ad3e3ae44bd2e72c872f3e13b5c45be505c20730c0b2 not found: ID does not exist" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.096866 4825 scope.go:117] "RemoveContainer" containerID="a36301ca3ac9dc807e52bfa5b981c9190333c861f97f982cb238f864042460a7" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.109019 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57886dd6c5-vszwj"] Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.116149 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57886dd6c5-vszwj"] Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.117327 4825 
scope.go:117] "RemoveContainer" containerID="a36301ca3ac9dc807e52bfa5b981c9190333c861f97f982cb238f864042460a7" Mar 10 06:49:10 crc kubenswrapper[4825]: E0310 06:49:10.117945 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a36301ca3ac9dc807e52bfa5b981c9190333c861f97f982cb238f864042460a7\": container with ID starting with a36301ca3ac9dc807e52bfa5b981c9190333c861f97f982cb238f864042460a7 not found: ID does not exist" containerID="a36301ca3ac9dc807e52bfa5b981c9190333c861f97f982cb238f864042460a7" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.117979 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a36301ca3ac9dc807e52bfa5b981c9190333c861f97f982cb238f864042460a7"} err="failed to get container status \"a36301ca3ac9dc807e52bfa5b981c9190333c861f97f982cb238f864042460a7\": rpc error: code = NotFound desc = could not find container \"a36301ca3ac9dc807e52bfa5b981c9190333c861f97f982cb238f864042460a7\": container with ID starting with a36301ca3ac9dc807e52bfa5b981c9190333c861f97f982cb238f864042460a7 not found: ID does not exist" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.127167 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5ffd5f77bc-vh5ll"] Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.133543 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5ffd5f77bc-vh5ll"] Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.538541 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-56467868d6-9z9wj"] Mar 10 06:49:10 crc kubenswrapper[4825]: E0310 06:49:10.540377 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d232f8a7-f013-4a5c-a7dc-2150c4b3040c" containerName="registry-server" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 
06:49:10.540399 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="d232f8a7-f013-4a5c-a7dc-2150c4b3040c" containerName="registry-server" Mar 10 06:49:10 crc kubenswrapper[4825]: E0310 06:49:10.540439 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33ed941e-9788-4c9e-bcd5-6af206460adb" containerName="registry-server" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.540460 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ed941e-9788-4c9e-bcd5-6af206460adb" containerName="registry-server" Mar 10 06:49:10 crc kubenswrapper[4825]: E0310 06:49:10.540485 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33ed941e-9788-4c9e-bcd5-6af206460adb" containerName="extract-utilities" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.540494 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ed941e-9788-4c9e-bcd5-6af206460adb" containerName="extract-utilities" Mar 10 06:49:10 crc kubenswrapper[4825]: E0310 06:49:10.540527 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="823b95d6-3aa7-4fda-bd75-9fe7e95a209f" containerName="extract-content" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.540535 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="823b95d6-3aa7-4fda-bd75-9fe7e95a209f" containerName="extract-content" Mar 10 06:49:10 crc kubenswrapper[4825]: E0310 06:49:10.540547 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58d27fca-dd33-4753-9ea6-4b55c191b2b8" containerName="registry-server" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.540557 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="58d27fca-dd33-4753-9ea6-4b55c191b2b8" containerName="registry-server" Mar 10 06:49:10 crc kubenswrapper[4825]: E0310 06:49:10.540572 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6152dc4-04ac-4ace-880d-67f4be272e50" containerName="route-controller-manager" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 
06:49:10.540580 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6152dc4-04ac-4ace-880d-67f4be272e50" containerName="route-controller-manager" Mar 10 06:49:10 crc kubenswrapper[4825]: E0310 06:49:10.540597 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d232f8a7-f013-4a5c-a7dc-2150c4b3040c" containerName="extract-utilities" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.540605 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="d232f8a7-f013-4a5c-a7dc-2150c4b3040c" containerName="extract-utilities" Mar 10 06:49:10 crc kubenswrapper[4825]: E0310 06:49:10.540622 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58d27fca-dd33-4753-9ea6-4b55c191b2b8" containerName="extract-content" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.540629 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="58d27fca-dd33-4753-9ea6-4b55c191b2b8" containerName="extract-content" Mar 10 06:49:10 crc kubenswrapper[4825]: E0310 06:49:10.540644 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d232f8a7-f013-4a5c-a7dc-2150c4b3040c" containerName="extract-content" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.540652 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="d232f8a7-f013-4a5c-a7dc-2150c4b3040c" containerName="extract-content" Mar 10 06:49:10 crc kubenswrapper[4825]: E0310 06:49:10.540666 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33ed941e-9788-4c9e-bcd5-6af206460adb" containerName="extract-content" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.540674 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ed941e-9788-4c9e-bcd5-6af206460adb" containerName="extract-content" Mar 10 06:49:10 crc kubenswrapper[4825]: E0310 06:49:10.540688 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="823b95d6-3aa7-4fda-bd75-9fe7e95a209f" containerName="extract-utilities" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 
06:49:10.540698 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="823b95d6-3aa7-4fda-bd75-9fe7e95a209f" containerName="extract-utilities" Mar 10 06:49:10 crc kubenswrapper[4825]: E0310 06:49:10.540711 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58d27fca-dd33-4753-9ea6-4b55c191b2b8" containerName="extract-utilities" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.540718 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="58d27fca-dd33-4753-9ea6-4b55c191b2b8" containerName="extract-utilities" Mar 10 06:49:10 crc kubenswrapper[4825]: E0310 06:49:10.540734 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="823b95d6-3aa7-4fda-bd75-9fe7e95a209f" containerName="registry-server" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.540741 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="823b95d6-3aa7-4fda-bd75-9fe7e95a209f" containerName="registry-server" Mar 10 06:49:10 crc kubenswrapper[4825]: E0310 06:49:10.540752 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a48b68df-105c-418f-9cb7-525cb3641e20" containerName="controller-manager" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.540758 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a48b68df-105c-418f-9cb7-525cb3641e20" containerName="controller-manager" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.541285 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="d232f8a7-f013-4a5c-a7dc-2150c4b3040c" containerName="registry-server" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.541315 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="823b95d6-3aa7-4fda-bd75-9fe7e95a209f" containerName="registry-server" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.541328 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6152dc4-04ac-4ace-880d-67f4be272e50" containerName="route-controller-manager" Mar 10 06:49:10 crc 
kubenswrapper[4825]: I0310 06:49:10.541341 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="a48b68df-105c-418f-9cb7-525cb3641e20" containerName="controller-manager" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.541351 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="33ed941e-9788-4c9e-bcd5-6af206460adb" containerName="registry-server" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.541362 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="58d27fca-dd33-4753-9ea6-4b55c191b2b8" containerName="registry-server" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.542002 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-56467868d6-9z9wj" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.547098 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.557984 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.558251 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.559877 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bc5c444d6-8djg5"] Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.560369 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.575795 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.576718 
4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.577016 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7bc5c444d6-8djg5" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.580515 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.580898 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.580994 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.581029 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.581207 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.581275 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.585875 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.596365 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-56467868d6-9z9wj"] Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.601065 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-7bc5c444d6-8djg5"] Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.658584 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f1c5058-50e3-4239-af50-fab46ade0662-client-ca\") pod \"route-controller-manager-7bc5c444d6-8djg5\" (UID: \"6f1c5058-50e3-4239-af50-fab46ade0662\") " pod="openshift-route-controller-manager/route-controller-manager-7bc5c444d6-8djg5" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.658675 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0fa4d04c-7600-4648-9ca9-26e92fa1d281-serving-cert\") pod \"controller-manager-56467868d6-9z9wj\" (UID: \"0fa4d04c-7600-4648-9ca9-26e92fa1d281\") " pod="openshift-controller-manager/controller-manager-56467868d6-9z9wj" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.658705 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fa4d04c-7600-4648-9ca9-26e92fa1d281-config\") pod \"controller-manager-56467868d6-9z9wj\" (UID: \"0fa4d04c-7600-4648-9ca9-26e92fa1d281\") " pod="openshift-controller-manager/controller-manager-56467868d6-9z9wj" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.658757 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0fa4d04c-7600-4648-9ca9-26e92fa1d281-proxy-ca-bundles\") pod \"controller-manager-56467868d6-9z9wj\" (UID: \"0fa4d04c-7600-4648-9ca9-26e92fa1d281\") " pod="openshift-controller-manager/controller-manager-56467868d6-9z9wj" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.658990 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f1c5058-50e3-4239-af50-fab46ade0662-serving-cert\") pod \"route-controller-manager-7bc5c444d6-8djg5\" (UID: \"6f1c5058-50e3-4239-af50-fab46ade0662\") " pod="openshift-route-controller-manager/route-controller-manager-7bc5c444d6-8djg5" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.659144 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x65x5\" (UniqueName: \"kubernetes.io/projected/6f1c5058-50e3-4239-af50-fab46ade0662-kube-api-access-x65x5\") pod \"route-controller-manager-7bc5c444d6-8djg5\" (UID: \"6f1c5058-50e3-4239-af50-fab46ade0662\") " pod="openshift-route-controller-manager/route-controller-manager-7bc5c444d6-8djg5" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.659206 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r69gq\" (UniqueName: \"kubernetes.io/projected/0fa4d04c-7600-4648-9ca9-26e92fa1d281-kube-api-access-r69gq\") pod \"controller-manager-56467868d6-9z9wj\" (UID: \"0fa4d04c-7600-4648-9ca9-26e92fa1d281\") " pod="openshift-controller-manager/controller-manager-56467868d6-9z9wj" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.659250 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f1c5058-50e3-4239-af50-fab46ade0662-config\") pod \"route-controller-manager-7bc5c444d6-8djg5\" (UID: \"6f1c5058-50e3-4239-af50-fab46ade0662\") " pod="openshift-route-controller-manager/route-controller-manager-7bc5c444d6-8djg5" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.659313 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0fa4d04c-7600-4648-9ca9-26e92fa1d281-client-ca\") pod \"controller-manager-56467868d6-9z9wj\" (UID: 
\"0fa4d04c-7600-4648-9ca9-26e92fa1d281\") " pod="openshift-controller-manager/controller-manager-56467868d6-9z9wj" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.761170 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f1c5058-50e3-4239-af50-fab46ade0662-serving-cert\") pod \"route-controller-manager-7bc5c444d6-8djg5\" (UID: \"6f1c5058-50e3-4239-af50-fab46ade0662\") " pod="openshift-route-controller-manager/route-controller-manager-7bc5c444d6-8djg5" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.761251 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x65x5\" (UniqueName: \"kubernetes.io/projected/6f1c5058-50e3-4239-af50-fab46ade0662-kube-api-access-x65x5\") pod \"route-controller-manager-7bc5c444d6-8djg5\" (UID: \"6f1c5058-50e3-4239-af50-fab46ade0662\") " pod="openshift-route-controller-manager/route-controller-manager-7bc5c444d6-8djg5" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.761279 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r69gq\" (UniqueName: \"kubernetes.io/projected/0fa4d04c-7600-4648-9ca9-26e92fa1d281-kube-api-access-r69gq\") pod \"controller-manager-56467868d6-9z9wj\" (UID: \"0fa4d04c-7600-4648-9ca9-26e92fa1d281\") " pod="openshift-controller-manager/controller-manager-56467868d6-9z9wj" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.761305 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f1c5058-50e3-4239-af50-fab46ade0662-config\") pod \"route-controller-manager-7bc5c444d6-8djg5\" (UID: \"6f1c5058-50e3-4239-af50-fab46ade0662\") " pod="openshift-route-controller-manager/route-controller-manager-7bc5c444d6-8djg5" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.761346 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0fa4d04c-7600-4648-9ca9-26e92fa1d281-client-ca\") pod \"controller-manager-56467868d6-9z9wj\" (UID: \"0fa4d04c-7600-4648-9ca9-26e92fa1d281\") " pod="openshift-controller-manager/controller-manager-56467868d6-9z9wj" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.761390 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f1c5058-50e3-4239-af50-fab46ade0662-client-ca\") pod \"route-controller-manager-7bc5c444d6-8djg5\" (UID: \"6f1c5058-50e3-4239-af50-fab46ade0662\") " pod="openshift-route-controller-manager/route-controller-manager-7bc5c444d6-8djg5" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.761432 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0fa4d04c-7600-4648-9ca9-26e92fa1d281-serving-cert\") pod \"controller-manager-56467868d6-9z9wj\" (UID: \"0fa4d04c-7600-4648-9ca9-26e92fa1d281\") " pod="openshift-controller-manager/controller-manager-56467868d6-9z9wj" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.761458 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fa4d04c-7600-4648-9ca9-26e92fa1d281-config\") pod \"controller-manager-56467868d6-9z9wj\" (UID: \"0fa4d04c-7600-4648-9ca9-26e92fa1d281\") " pod="openshift-controller-manager/controller-manager-56467868d6-9z9wj" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.761479 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0fa4d04c-7600-4648-9ca9-26e92fa1d281-proxy-ca-bundles\") pod \"controller-manager-56467868d6-9z9wj\" (UID: \"0fa4d04c-7600-4648-9ca9-26e92fa1d281\") " pod="openshift-controller-manager/controller-manager-56467868d6-9z9wj" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 
06:49:10.762870 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0fa4d04c-7600-4648-9ca9-26e92fa1d281-proxy-ca-bundles\") pod \"controller-manager-56467868d6-9z9wj\" (UID: \"0fa4d04c-7600-4648-9ca9-26e92fa1d281\") " pod="openshift-controller-manager/controller-manager-56467868d6-9z9wj" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.762899 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0fa4d04c-7600-4648-9ca9-26e92fa1d281-client-ca\") pod \"controller-manager-56467868d6-9z9wj\" (UID: \"0fa4d04c-7600-4648-9ca9-26e92fa1d281\") " pod="openshift-controller-manager/controller-manager-56467868d6-9z9wj" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.763164 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f1c5058-50e3-4239-af50-fab46ade0662-client-ca\") pod \"route-controller-manager-7bc5c444d6-8djg5\" (UID: \"6f1c5058-50e3-4239-af50-fab46ade0662\") " pod="openshift-route-controller-manager/route-controller-manager-7bc5c444d6-8djg5" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.763839 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f1c5058-50e3-4239-af50-fab46ade0662-config\") pod \"route-controller-manager-7bc5c444d6-8djg5\" (UID: \"6f1c5058-50e3-4239-af50-fab46ade0662\") " pod="openshift-route-controller-manager/route-controller-manager-7bc5c444d6-8djg5" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.764068 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fa4d04c-7600-4648-9ca9-26e92fa1d281-config\") pod \"controller-manager-56467868d6-9z9wj\" (UID: \"0fa4d04c-7600-4648-9ca9-26e92fa1d281\") " 
pod="openshift-controller-manager/controller-manager-56467868d6-9z9wj" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.767337 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f1c5058-50e3-4239-af50-fab46ade0662-serving-cert\") pod \"route-controller-manager-7bc5c444d6-8djg5\" (UID: \"6f1c5058-50e3-4239-af50-fab46ade0662\") " pod="openshift-route-controller-manager/route-controller-manager-7bc5c444d6-8djg5" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.767490 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0fa4d04c-7600-4648-9ca9-26e92fa1d281-serving-cert\") pod \"controller-manager-56467868d6-9z9wj\" (UID: \"0fa4d04c-7600-4648-9ca9-26e92fa1d281\") " pod="openshift-controller-manager/controller-manager-56467868d6-9z9wj" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.786993 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x65x5\" (UniqueName: \"kubernetes.io/projected/6f1c5058-50e3-4239-af50-fab46ade0662-kube-api-access-x65x5\") pod \"route-controller-manager-7bc5c444d6-8djg5\" (UID: \"6f1c5058-50e3-4239-af50-fab46ade0662\") " pod="openshift-route-controller-manager/route-controller-manager-7bc5c444d6-8djg5" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.789960 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r69gq\" (UniqueName: \"kubernetes.io/projected/0fa4d04c-7600-4648-9ca9-26e92fa1d281-kube-api-access-r69gq\") pod \"controller-manager-56467868d6-9z9wj\" (UID: \"0fa4d04c-7600-4648-9ca9-26e92fa1d281\") " pod="openshift-controller-manager/controller-manager-56467868d6-9z9wj" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.877617 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-56467868d6-9z9wj" Mar 10 06:49:10 crc kubenswrapper[4825]: I0310 06:49:10.895804 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7bc5c444d6-8djg5" Mar 10 06:49:11 crc kubenswrapper[4825]: I0310 06:49:11.251326 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a48b68df-105c-418f-9cb7-525cb3641e20" path="/var/lib/kubelet/pods/a48b68df-105c-418f-9cb7-525cb3641e20/volumes" Mar 10 06:49:11 crc kubenswrapper[4825]: I0310 06:49:11.252228 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6152dc4-04ac-4ace-880d-67f4be272e50" path="/var/lib/kubelet/pods/c6152dc4-04ac-4ace-880d-67f4be272e50/volumes" Mar 10 06:49:11 crc kubenswrapper[4825]: I0310 06:49:11.307266 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-56467868d6-9z9wj"] Mar 10 06:49:11 crc kubenswrapper[4825]: I0310 06:49:11.356692 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bc5c444d6-8djg5"] Mar 10 06:49:11 crc kubenswrapper[4825]: W0310 06:49:11.366728 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f1c5058_50e3_4239_af50_fab46ade0662.slice/crio-dad09d1598d913f5b7d42ad85ed9470121fcdb83538291aa373a9c56f4b71512 WatchSource:0}: Error finding container dad09d1598d913f5b7d42ad85ed9470121fcdb83538291aa373a9c56f4b71512: Status 404 returned error can't find the container with id dad09d1598d913f5b7d42ad85ed9470121fcdb83538291aa373a9c56f4b71512 Mar 10 06:49:12 crc kubenswrapper[4825]: I0310 06:49:12.098989 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7bc5c444d6-8djg5" 
event={"ID":"6f1c5058-50e3-4239-af50-fab46ade0662","Type":"ContainerStarted","Data":"d5e3ad9bf905b21e013bce1011ec709aa7de551954133916ebe9f906be6f9a1c"} Mar 10 06:49:12 crc kubenswrapper[4825]: I0310 06:49:12.099052 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7bc5c444d6-8djg5" event={"ID":"6f1c5058-50e3-4239-af50-fab46ade0662","Type":"ContainerStarted","Data":"dad09d1598d913f5b7d42ad85ed9470121fcdb83538291aa373a9c56f4b71512"} Mar 10 06:49:12 crc kubenswrapper[4825]: I0310 06:49:12.099079 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7bc5c444d6-8djg5" Mar 10 06:49:12 crc kubenswrapper[4825]: I0310 06:49:12.102089 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56467868d6-9z9wj" event={"ID":"0fa4d04c-7600-4648-9ca9-26e92fa1d281","Type":"ContainerStarted","Data":"e95a94f60e0f462b2a13f2e97cbb5ffafc0f89cbc4754c7bc1d4fc0b8a25ab46"} Mar 10 06:49:12 crc kubenswrapper[4825]: I0310 06:49:12.102599 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-56467868d6-9z9wj" Mar 10 06:49:12 crc kubenswrapper[4825]: I0310 06:49:12.102616 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56467868d6-9z9wj" event={"ID":"0fa4d04c-7600-4648-9ca9-26e92fa1d281","Type":"ContainerStarted","Data":"388ac7c3247c1c03336fb9572651ca48776f50e6b2829f3881f2d3c1e439bc63"} Mar 10 06:49:12 crc kubenswrapper[4825]: I0310 06:49:12.105946 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7bc5c444d6-8djg5" Mar 10 06:49:12 crc kubenswrapper[4825]: I0310 06:49:12.107626 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-56467868d6-9z9wj" Mar 10 06:49:12 crc kubenswrapper[4825]: I0310 06:49:12.120689 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7bc5c444d6-8djg5" podStartSLOduration=3.120669229 podStartE2EDuration="3.120669229s" podCreationTimestamp="2026-03-10 06:49:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:49:12.11706226 +0000 UTC m=+305.146842885" watchObservedRunningTime="2026-03-10 06:49:12.120669229 +0000 UTC m=+305.150449844" Mar 10 06:49:16 crc kubenswrapper[4825]: I0310 06:49:16.860127 4825 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 10 06:49:16 crc kubenswrapper[4825]: I0310 06:49:16.861164 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3" gracePeriod=15 Mar 10 06:49:16 crc kubenswrapper[4825]: I0310 06:49:16.861223 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce" gracePeriod=15 Mar 10 06:49:16 crc kubenswrapper[4825]: I0310 06:49:16.861278 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://ddf92f28cf551cde58efe622659d4395d895a1129ac99b065975c991364f03a0" gracePeriod=15 Mar 10 06:49:16 crc 
kubenswrapper[4825]: I0310 06:49:16.861288 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e" gracePeriod=15 Mar 10 06:49:16 crc kubenswrapper[4825]: I0310 06:49:16.863599 4825 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 10 06:49:16 crc kubenswrapper[4825]: E0310 06:49:16.864083 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 06:49:16 crc kubenswrapper[4825]: I0310 06:49:16.864114 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 06:49:16 crc kubenswrapper[4825]: E0310 06:49:16.864186 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 06:49:16 crc kubenswrapper[4825]: I0310 06:49:16.864205 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 06:49:16 crc kubenswrapper[4825]: E0310 06:49:16.864237 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 10 06:49:16 crc kubenswrapper[4825]: I0310 06:49:16.864255 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 10 06:49:16 crc kubenswrapper[4825]: E0310 06:49:16.864282 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-regeneration-controller" Mar 10 06:49:16 crc kubenswrapper[4825]: I0310 06:49:16.864299 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 10 06:49:16 crc kubenswrapper[4825]: E0310 06:49:16.864322 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 10 06:49:16 crc kubenswrapper[4825]: I0310 06:49:16.864338 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 10 06:49:16 crc kubenswrapper[4825]: E0310 06:49:16.864361 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 10 06:49:16 crc kubenswrapper[4825]: I0310 06:49:16.864376 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 10 06:49:16 crc kubenswrapper[4825]: E0310 06:49:16.864399 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 10 06:49:16 crc kubenswrapper[4825]: I0310 06:49:16.864414 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 10 06:49:16 crc kubenswrapper[4825]: E0310 06:49:16.864440 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 06:49:16 crc kubenswrapper[4825]: I0310 06:49:16.864455 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 06:49:16 crc kubenswrapper[4825]: I0310 06:49:16.864696 4825 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 10 06:49:16 crc kubenswrapper[4825]: I0310 06:49:16.864722 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 10 06:49:16 crc kubenswrapper[4825]: I0310 06:49:16.864744 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 06:49:16 crc kubenswrapper[4825]: I0310 06:49:16.864764 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 10 06:49:16 crc kubenswrapper[4825]: I0310 06:49:16.864786 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 06:49:16 crc kubenswrapper[4825]: I0310 06:49:16.864805 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 10 06:49:16 crc kubenswrapper[4825]: I0310 06:49:16.864826 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 06:49:16 crc kubenswrapper[4825]: I0310 06:49:16.864848 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 06:49:16 crc kubenswrapper[4825]: I0310 06:49:16.861387 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7" gracePeriod=15 Mar 10 06:49:16 crc kubenswrapper[4825]: 
E0310 06:49:16.865106 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 06:49:16 crc kubenswrapper[4825]: I0310 06:49:16.865315 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 06:49:16 crc kubenswrapper[4825]: E0310 06:49:16.865381 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 06:49:16 crc kubenswrapper[4825]: I0310 06:49:16.865402 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 06:49:16 crc kubenswrapper[4825]: I0310 06:49:16.865763 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 06:49:16 crc kubenswrapper[4825]: I0310 06:49:16.868151 4825 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 10 06:49:16 crc kubenswrapper[4825]: I0310 06:49:16.869263 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 06:49:16 crc kubenswrapper[4825]: I0310 06:49:16.875858 4825 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Mar 10 06:49:16 crc kubenswrapper[4825]: I0310 06:49:16.888755 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 06:49:16 crc kubenswrapper[4825]: I0310 06:49:16.888821 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 06:49:16 crc kubenswrapper[4825]: I0310 06:49:16.888870 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" Mar 10 06:49:16 crc kubenswrapper[4825]: I0310 06:49:16.889750 4825 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"88bb20a12b63dd94732d6f3413fa4c14b8931a2d82e49f45769c15a14202fe1c"} pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 06:49:16 crc kubenswrapper[4825]: I0310 06:49:16.889824 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" 
podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" containerID="cri-o://88bb20a12b63dd94732d6f3413fa4c14b8931a2d82e49f45769c15a14202fe1c" gracePeriod=600 Mar 10 06:49:16 crc kubenswrapper[4825]: I0310 06:49:16.964841 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 06:49:16 crc kubenswrapper[4825]: I0310 06:49:16.964904 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 06:49:16 crc kubenswrapper[4825]: I0310 06:49:16.964932 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 06:49:16 crc kubenswrapper[4825]: I0310 06:49:16.964953 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 06:49:16 crc kubenswrapper[4825]: I0310 06:49:16.964973 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 06:49:16 crc kubenswrapper[4825]: I0310 06:49:16.965000 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 06:49:16 crc kubenswrapper[4825]: I0310 06:49:16.965076 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 06:49:16 crc kubenswrapper[4825]: I0310 06:49:16.965110 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 06:49:17 crc kubenswrapper[4825]: I0310 06:49:17.066083 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 06:49:17 crc kubenswrapper[4825]: I0310 06:49:17.066673 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 06:49:17 crc kubenswrapper[4825]: I0310 06:49:17.066703 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 06:49:17 crc kubenswrapper[4825]: I0310 06:49:17.066721 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 06:49:17 crc kubenswrapper[4825]: I0310 06:49:17.066742 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 06:49:17 crc kubenswrapper[4825]: I0310 06:49:17.066766 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 06:49:17 crc kubenswrapper[4825]: I0310 06:49:17.066790 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 06:49:17 crc kubenswrapper[4825]: I0310 06:49:17.066843 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 06:49:17 crc kubenswrapper[4825]: I0310 06:49:17.066908 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 06:49:17 crc kubenswrapper[4825]: I0310 06:49:17.066231 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 06:49:17 crc kubenswrapper[4825]: I0310 06:49:17.066965 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 06:49:17 crc kubenswrapper[4825]: I0310 06:49:17.066987 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 06:49:17 crc kubenswrapper[4825]: I0310 06:49:17.067010 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 06:49:17 crc kubenswrapper[4825]: I0310 06:49:17.067034 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 06:49:17 crc kubenswrapper[4825]: I0310 06:49:17.067055 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 06:49:17 crc kubenswrapper[4825]: I0310 06:49:17.067074 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 06:49:17 crc kubenswrapper[4825]: I0310 06:49:17.141790 4825 generic.go:334] "Generic (PLEG): container finished" podID="c5d3354d-1386-4f42-9c53-134c6583650d" containerID="726fc181965abc85a7bc538762549b8ee17fd04b8c3ed4c1a01009331b2dfb70" exitCode=0 Mar 10 
06:49:17 crc kubenswrapper[4825]: I0310 06:49:17.141898 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"c5d3354d-1386-4f42-9c53-134c6583650d","Type":"ContainerDied","Data":"726fc181965abc85a7bc538762549b8ee17fd04b8c3ed4c1a01009331b2dfb70"} Mar 10 06:49:17 crc kubenswrapper[4825]: I0310 06:49:17.142651 4825 status_manager.go:851] "Failed to get status for pod" podUID="c5d3354d-1386-4f42-9c53-134c6583650d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Mar 10 06:49:17 crc kubenswrapper[4825]: I0310 06:49:17.145484 4825 generic.go:334] "Generic (PLEG): container finished" podID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerID="88bb20a12b63dd94732d6f3413fa4c14b8931a2d82e49f45769c15a14202fe1c" exitCode=0 Mar 10 06:49:17 crc kubenswrapper[4825]: I0310 06:49:17.145547 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerDied","Data":"88bb20a12b63dd94732d6f3413fa4c14b8931a2d82e49f45769c15a14202fe1c"} Mar 10 06:49:17 crc kubenswrapper[4825]: I0310 06:49:17.148001 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 10 06:49:17 crc kubenswrapper[4825]: I0310 06:49:17.149870 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 10 06:49:17 crc kubenswrapper[4825]: I0310 06:49:17.152314 4825 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="ddf92f28cf551cde58efe622659d4395d895a1129ac99b065975c991364f03a0" exitCode=0 Mar 10 06:49:17 crc kubenswrapper[4825]: I0310 06:49:17.152699 4825 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce" exitCode=0 Mar 10 06:49:17 crc kubenswrapper[4825]: I0310 06:49:17.152723 4825 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e" exitCode=0 Mar 10 06:49:17 crc kubenswrapper[4825]: I0310 06:49:17.152739 4825 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7" exitCode=2 Mar 10 06:49:17 crc kubenswrapper[4825]: I0310 06:49:17.152392 4825 scope.go:117] "RemoveContainer" containerID="414fcb0fab700a5b75d47b961cb7709479eabd2b736e2cb663c77383bc874a0d" Mar 10 06:49:18 crc kubenswrapper[4825]: I0310 06:49:18.167967 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 10 06:49:18 crc kubenswrapper[4825]: I0310 06:49:18.176593 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerStarted","Data":"530719938ecf342726dd5c9606058576ce615ad39fc2f242ce6515ac6bbc463f"} Mar 10 06:49:18 crc kubenswrapper[4825]: I0310 06:49:18.178102 4825 status_manager.go:851] "Failed to get status for pod" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-bvt9j\": dial tcp 38.102.83.222:6443: 
connect: connection refused" Mar 10 06:49:18 crc kubenswrapper[4825]: I0310 06:49:18.179174 4825 status_manager.go:851] "Failed to get status for pod" podUID="c5d3354d-1386-4f42-9c53-134c6583650d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Mar 10 06:49:18 crc kubenswrapper[4825]: I0310 06:49:18.673208 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 10 06:49:18 crc kubenswrapper[4825]: I0310 06:49:18.674360 4825 status_manager.go:851] "Failed to get status for pod" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-bvt9j\": dial tcp 38.102.83.222:6443: connect: connection refused" Mar 10 06:49:18 crc kubenswrapper[4825]: I0310 06:49:18.675001 4825 status_manager.go:851] "Failed to get status for pod" podUID="c5d3354d-1386-4f42-9c53-134c6583650d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Mar 10 06:49:18 crc kubenswrapper[4825]: I0310 06:49:18.799373 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5d3354d-1386-4f42-9c53-134c6583650d-kube-api-access\") pod \"c5d3354d-1386-4f42-9c53-134c6583650d\" (UID: \"c5d3354d-1386-4f42-9c53-134c6583650d\") " Mar 10 06:49:18 crc kubenswrapper[4825]: I0310 06:49:18.799477 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/c5d3354d-1386-4f42-9c53-134c6583650d-kubelet-dir\") pod \"c5d3354d-1386-4f42-9c53-134c6583650d\" (UID: \"c5d3354d-1386-4f42-9c53-134c6583650d\") " Mar 10 06:49:18 crc kubenswrapper[4825]: I0310 06:49:18.799677 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c5d3354d-1386-4f42-9c53-134c6583650d-var-lock\") pod \"c5d3354d-1386-4f42-9c53-134c6583650d\" (UID: \"c5d3354d-1386-4f42-9c53-134c6583650d\") " Mar 10 06:49:18 crc kubenswrapper[4825]: I0310 06:49:18.799702 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5d3354d-1386-4f42-9c53-134c6583650d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c5d3354d-1386-4f42-9c53-134c6583650d" (UID: "c5d3354d-1386-4f42-9c53-134c6583650d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 06:49:18 crc kubenswrapper[4825]: I0310 06:49:18.799890 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5d3354d-1386-4f42-9c53-134c6583650d-var-lock" (OuterVolumeSpecName: "var-lock") pod "c5d3354d-1386-4f42-9c53-134c6583650d" (UID: "c5d3354d-1386-4f42-9c53-134c6583650d"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 06:49:18 crc kubenswrapper[4825]: I0310 06:49:18.800214 4825 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c5d3354d-1386-4f42-9c53-134c6583650d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 10 06:49:18 crc kubenswrapper[4825]: I0310 06:49:18.800249 4825 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c5d3354d-1386-4f42-9c53-134c6583650d-var-lock\") on node \"crc\" DevicePath \"\"" Mar 10 06:49:18 crc kubenswrapper[4825]: I0310 06:49:18.807604 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5d3354d-1386-4f42-9c53-134c6583650d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c5d3354d-1386-4f42-9c53-134c6583650d" (UID: "c5d3354d-1386-4f42-9c53-134c6583650d"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:49:18 crc kubenswrapper[4825]: I0310 06:49:18.902287 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5d3354d-1386-4f42-9c53-134c6583650d-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 06:49:19 crc kubenswrapper[4825]: I0310 06:49:19.187389 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 10 06:49:19 crc kubenswrapper[4825]: I0310 06:49:19.191203 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"c5d3354d-1386-4f42-9c53-134c6583650d","Type":"ContainerDied","Data":"3cccda6dc9f8f6468ff7c1852aa62c04165f7c89056f5cb797e4cb74c8c06c19"} Mar 10 06:49:19 crc kubenswrapper[4825]: I0310 06:49:19.191262 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3cccda6dc9f8f6468ff7c1852aa62c04165f7c89056f5cb797e4cb74c8c06c19" Mar 10 06:49:19 crc kubenswrapper[4825]: I0310 06:49:19.205232 4825 status_manager.go:851] "Failed to get status for pod" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-bvt9j\": dial tcp 38.102.83.222:6443: connect: connection refused" Mar 10 06:49:19 crc kubenswrapper[4825]: I0310 06:49:19.205796 4825 status_manager.go:851] "Failed to get status for pod" podUID="c5d3354d-1386-4f42-9c53-134c6583650d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Mar 10 06:49:19 crc kubenswrapper[4825]: I0310 06:49:19.240048 4825 status_manager.go:851] "Failed to get status for pod" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-bvt9j\": dial tcp 38.102.83.222:6443: connect: connection refused" Mar 10 06:49:19 crc kubenswrapper[4825]: I0310 06:49:19.240857 4825 status_manager.go:851] "Failed to get status for pod" 
podUID="c5d3354d-1386-4f42-9c53-134c6583650d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Mar 10 06:49:19 crc kubenswrapper[4825]: I0310 06:49:19.336415 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 10 06:49:19 crc kubenswrapper[4825]: I0310 06:49:19.337853 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 06:49:19 crc kubenswrapper[4825]: I0310 06:49:19.338716 4825 status_manager.go:851] "Failed to get status for pod" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-bvt9j\": dial tcp 38.102.83.222:6443: connect: connection refused" Mar 10 06:49:19 crc kubenswrapper[4825]: I0310 06:49:19.339381 4825 status_manager.go:851] "Failed to get status for pod" podUID="c5d3354d-1386-4f42-9c53-134c6583650d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Mar 10 06:49:19 crc kubenswrapper[4825]: I0310 06:49:19.340024 4825 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Mar 10 06:49:19 crc kubenswrapper[4825]: I0310 06:49:19.408406 4825 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 10 06:49:19 crc kubenswrapper[4825]: I0310 06:49:19.408511 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 06:49:19 crc kubenswrapper[4825]: I0310 06:49:19.408730 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 10 06:49:19 crc kubenswrapper[4825]: I0310 06:49:19.408757 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 10 06:49:19 crc kubenswrapper[4825]: I0310 06:49:19.408869 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 06:49:19 crc kubenswrapper[4825]: I0310 06:49:19.408858 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 06:49:19 crc kubenswrapper[4825]: I0310 06:49:19.409256 4825 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 10 06:49:19 crc kubenswrapper[4825]: I0310 06:49:19.409274 4825 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 10 06:49:19 crc kubenswrapper[4825]: I0310 06:49:19.409282 4825 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 10 06:49:20 crc kubenswrapper[4825]: I0310 06:49:20.202073 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 10 06:49:20 crc kubenswrapper[4825]: I0310 06:49:20.204278 4825 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3" exitCode=0 Mar 10 06:49:20 crc kubenswrapper[4825]: I0310 06:49:20.204368 4825 scope.go:117] "RemoveContainer" containerID="ddf92f28cf551cde58efe622659d4395d895a1129ac99b065975c991364f03a0" Mar 10 06:49:20 crc kubenswrapper[4825]: I0310 06:49:20.204433 4825 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 06:49:20 crc kubenswrapper[4825]: I0310 06:49:20.221023 4825 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Mar 10 06:49:20 crc kubenswrapper[4825]: I0310 06:49:20.221747 4825 status_manager.go:851] "Failed to get status for pod" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-bvt9j\": dial tcp 38.102.83.222:6443: connect: connection refused" Mar 10 06:49:20 crc kubenswrapper[4825]: I0310 06:49:20.222321 4825 status_manager.go:851] "Failed to get status for pod" podUID="c5d3354d-1386-4f42-9c53-134c6583650d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Mar 10 06:49:20 crc kubenswrapper[4825]: I0310 06:49:20.231936 4825 scope.go:117] "RemoveContainer" containerID="355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce" Mar 10 06:49:20 crc kubenswrapper[4825]: I0310 06:49:20.249550 4825 scope.go:117] "RemoveContainer" containerID="aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e" Mar 10 06:49:20 crc kubenswrapper[4825]: I0310 06:49:20.268224 4825 scope.go:117] "RemoveContainer" containerID="58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7" Mar 10 06:49:20 crc kubenswrapper[4825]: I0310 06:49:20.287465 4825 scope.go:117] "RemoveContainer" 
containerID="72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3" Mar 10 06:49:20 crc kubenswrapper[4825]: I0310 06:49:20.310549 4825 scope.go:117] "RemoveContainer" containerID="da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc" Mar 10 06:49:20 crc kubenswrapper[4825]: I0310 06:49:20.338017 4825 scope.go:117] "RemoveContainer" containerID="ddf92f28cf551cde58efe622659d4395d895a1129ac99b065975c991364f03a0" Mar 10 06:49:20 crc kubenswrapper[4825]: E0310 06:49:20.338787 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddf92f28cf551cde58efe622659d4395d895a1129ac99b065975c991364f03a0\": container with ID starting with ddf92f28cf551cde58efe622659d4395d895a1129ac99b065975c991364f03a0 not found: ID does not exist" containerID="ddf92f28cf551cde58efe622659d4395d895a1129ac99b065975c991364f03a0" Mar 10 06:49:20 crc kubenswrapper[4825]: I0310 06:49:20.338866 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddf92f28cf551cde58efe622659d4395d895a1129ac99b065975c991364f03a0"} err="failed to get container status \"ddf92f28cf551cde58efe622659d4395d895a1129ac99b065975c991364f03a0\": rpc error: code = NotFound desc = could not find container \"ddf92f28cf551cde58efe622659d4395d895a1129ac99b065975c991364f03a0\": container with ID starting with ddf92f28cf551cde58efe622659d4395d895a1129ac99b065975c991364f03a0 not found: ID does not exist" Mar 10 06:49:20 crc kubenswrapper[4825]: I0310 06:49:20.338892 4825 scope.go:117] "RemoveContainer" containerID="355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce" Mar 10 06:49:20 crc kubenswrapper[4825]: E0310 06:49:20.339680 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce\": container with ID starting with 
355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce not found: ID does not exist" containerID="355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce" Mar 10 06:49:20 crc kubenswrapper[4825]: I0310 06:49:20.339704 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce"} err="failed to get container status \"355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce\": rpc error: code = NotFound desc = could not find container \"355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce\": container with ID starting with 355876c0a27b1bb3e849a28b7567ffe387855fc88f2cd7635bcd4df87f1cf8ce not found: ID does not exist" Mar 10 06:49:20 crc kubenswrapper[4825]: I0310 06:49:20.339722 4825 scope.go:117] "RemoveContainer" containerID="aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e" Mar 10 06:49:20 crc kubenswrapper[4825]: E0310 06:49:20.340620 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e\": container with ID starting with aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e not found: ID does not exist" containerID="aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e" Mar 10 06:49:20 crc kubenswrapper[4825]: I0310 06:49:20.340660 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e"} err="failed to get container status \"aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e\": rpc error: code = NotFound desc = could not find container \"aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e\": container with ID starting with aa001618eec35fd33e1b950931c1569abe2a8831324c3e99c743640d0e92985e not found: ID does not 
exist" Mar 10 06:49:20 crc kubenswrapper[4825]: I0310 06:49:20.340692 4825 scope.go:117] "RemoveContainer" containerID="58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7" Mar 10 06:49:20 crc kubenswrapper[4825]: E0310 06:49:20.343064 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7\": container with ID starting with 58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7 not found: ID does not exist" containerID="58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7" Mar 10 06:49:20 crc kubenswrapper[4825]: I0310 06:49:20.343197 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7"} err="failed to get container status \"58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7\": rpc error: code = NotFound desc = could not find container \"58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7\": container with ID starting with 58bc64b1731ff84f0c0bf721fda5cac80663a966b699bdb9903d585b1b94fcc7 not found: ID does not exist" Mar 10 06:49:20 crc kubenswrapper[4825]: I0310 06:49:20.343281 4825 scope.go:117] "RemoveContainer" containerID="72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3" Mar 10 06:49:20 crc kubenswrapper[4825]: E0310 06:49:20.344537 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3\": container with ID starting with 72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3 not found: ID does not exist" containerID="72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3" Mar 10 06:49:20 crc kubenswrapper[4825]: I0310 06:49:20.344620 4825 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3"} err="failed to get container status \"72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3\": rpc error: code = NotFound desc = could not find container \"72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3\": container with ID starting with 72fd21e27ecde9a0e560706cba79d0bf57b8883a0ade28aae7fdbe2d005670d3 not found: ID does not exist" Mar 10 06:49:20 crc kubenswrapper[4825]: I0310 06:49:20.344657 4825 scope.go:117] "RemoveContainer" containerID="da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc" Mar 10 06:49:20 crc kubenswrapper[4825]: E0310 06:49:20.345532 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\": container with ID starting with da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc not found: ID does not exist" containerID="da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc" Mar 10 06:49:20 crc kubenswrapper[4825]: I0310 06:49:20.345596 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc"} err="failed to get container status \"da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\": rpc error: code = NotFound desc = could not find container \"da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc\": container with ID starting with da163ca8e501eeae7376055d584875ee9cfb37d6b8731d18419442294f8ccddc not found: ID does not exist" Mar 10 06:49:21 crc kubenswrapper[4825]: I0310 06:49:21.246432 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 10 06:49:21 crc 
kubenswrapper[4825]: E0310 06:49:21.909704 4825 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/events\": dial tcp 38.102.83.222:6443: connect: connection refused" event="&Event{ObjectMeta:{machine-config-daemon-bvt9j.189b681e1fa79f50 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:machine-config-daemon-bvt9j,UID:9beb5814-89d0-47c0-8b0e-24376a358fc3,APIVersion:v1,ResourceVersion:26846,FieldPath:spec.containers{machine-config-daemon},},Reason:Killing,Message:Container machine-config-daemon failed liveness probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:49:16.88980872 +0000 UTC m=+309.919589325,LastTimestamp:2026-03-10 06:49:16.88980872 +0000 UTC m=+309.919589325,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:49:21 crc kubenswrapper[4825]: E0310 06:49:21.930846 4825 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.222:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 06:49:21 crc kubenswrapper[4825]: I0310 06:49:21.931567 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 06:49:22 crc kubenswrapper[4825]: I0310 06:49:22.222744 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"fd67a4895a36384937f2a9f2e608f38f4a247d0a702f78b207c978a9cb7731d5"} Mar 10 06:49:22 crc kubenswrapper[4825]: E0310 06:49:22.897412 4825 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/events\": dial tcp 38.102.83.222:6443: connect: connection refused" event="&Event{ObjectMeta:{machine-config-daemon-bvt9j.189b681e1fa79f50 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:machine-config-daemon-bvt9j,UID:9beb5814-89d0-47c0-8b0e-24376a358fc3,APIVersion:v1,ResourceVersion:26846,FieldPath:spec.containers{machine-config-daemon},},Reason:Killing,Message:Container machine-config-daemon failed liveness probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:49:16.88980872 +0000 UTC m=+309.919589325,LastTimestamp:2026-03-10 06:49:16.88980872 +0000 UTC m=+309.919589325,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:49:23 crc kubenswrapper[4825]: E0310 06:49:23.238936 4825 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.222:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 06:49:23 crc kubenswrapper[4825]: I0310 06:49:23.239062 4825 status_manager.go:851] "Failed to get 
status for pod" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-bvt9j\": dial tcp 38.102.83.222:6443: connect: connection refused" Mar 10 06:49:23 crc kubenswrapper[4825]: I0310 06:49:23.240686 4825 status_manager.go:851] "Failed to get status for pod" podUID="c5d3354d-1386-4f42-9c53-134c6583650d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Mar 10 06:49:23 crc kubenswrapper[4825]: I0310 06:49:23.254260 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"2838369c1d0edf34b9393bf06fcfd6b628049a63fe54561783c9e4aa998e88db"} Mar 10 06:49:24 crc kubenswrapper[4825]: E0310 06:49:24.244408 4825 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.222:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 06:49:24 crc kubenswrapper[4825]: E0310 06:49:24.751267 4825 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.222:6443: connect: connection refused" Mar 10 06:49:24 crc kubenswrapper[4825]: E0310 06:49:24.752384 4825 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.222:6443: connect: connection refused" Mar 10 06:49:24 crc 
kubenswrapper[4825]: E0310 06:49:24.753257 4825 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.222:6443: connect: connection refused" Mar 10 06:49:24 crc kubenswrapper[4825]: E0310 06:49:24.753901 4825 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.222:6443: connect: connection refused" Mar 10 06:49:24 crc kubenswrapper[4825]: E0310 06:49:24.754496 4825 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.222:6443: connect: connection refused" Mar 10 06:49:24 crc kubenswrapper[4825]: I0310 06:49:24.754553 4825 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 10 06:49:24 crc kubenswrapper[4825]: E0310 06:49:24.755018 4825 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.222:6443: connect: connection refused" interval="200ms" Mar 10 06:49:24 crc kubenswrapper[4825]: E0310 06:49:24.956959 4825 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.222:6443: connect: connection refused" interval="400ms" Mar 10 06:49:25 crc kubenswrapper[4825]: E0310 06:49:25.358366 4825 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 
38.102.83.222:6443: connect: connection refused" interval="800ms" Mar 10 06:49:26 crc kubenswrapper[4825]: E0310 06:49:26.159535 4825 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.222:6443: connect: connection refused" interval="1.6s" Mar 10 06:49:26 crc kubenswrapper[4825]: E0310 06:49:26.273397 4825 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.222:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" volumeName="registry-storage" Mar 10 06:49:27 crc kubenswrapper[4825]: E0310 06:49:27.761367 4825 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.222:6443: connect: connection refused" interval="3.2s" Mar 10 06:49:29 crc kubenswrapper[4825]: I0310 06:49:29.073102 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:49:29 crc kubenswrapper[4825]: I0310 06:49:29.073230 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:49:29 crc kubenswrapper[4825]: I0310 06:49:29.073285 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:49:29 crc kubenswrapper[4825]: I0310 06:49:29.073342 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:49:29 crc kubenswrapper[4825]: W0310 06:49:29.074302 4825 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin-cert": failed to list *v1.Secret: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27248": dial tcp 38.102.83.222:6443: connect: connection refused Mar 10 06:49:29 crc kubenswrapper[4825]: E0310 06:49:29.074414 4825 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27248\": dial tcp 38.102.83.222:6443: connect: connection refused" logger="UnhandledError" Mar 10 06:49:29 crc kubenswrapper[4825]: W0310 
06:49:29.074579 4825 reflector.go:561] object-"openshift-network-diagnostics"/"kube-root-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27246": dial tcp 38.102.83.222:6443: connect: connection refused Mar 10 06:49:29 crc kubenswrapper[4825]: E0310 06:49:29.074631 4825 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27246\": dial tcp 38.102.83.222:6443: connect: connection refused" logger="UnhandledError" Mar 10 06:49:29 crc kubenswrapper[4825]: W0310 06:49:29.074332 4825 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27246": dial tcp 38.102.83.222:6443: connect: connection refused Mar 10 06:49:29 crc kubenswrapper[4825]: E0310 06:49:29.075243 4825 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27246\": dial tcp 38.102.83.222:6443: connect: connection refused" logger="UnhandledError" Mar 10 06:49:29 crc kubenswrapper[4825]: I0310 06:49:29.239749 4825 status_manager.go:851] "Failed to get status for pod" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-bvt9j\": dial tcp 38.102.83.222:6443: connect: connection refused" Mar 10 06:49:29 crc kubenswrapper[4825]: I0310 06:49:29.240400 4825 status_manager.go:851] "Failed to get status for pod" podUID="c5d3354d-1386-4f42-9c53-134c6583650d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Mar 10 06:49:30 crc kubenswrapper[4825]: E0310 06:49:30.074469 4825 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: failed to sync secret cache: timed out waiting for the condition Mar 10 06:49:30 crc kubenswrapper[4825]: E0310 06:49:30.075477 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 06:51:32.07543506 +0000 UTC m=+445.105215685 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync secret cache: timed out waiting for the condition Mar 10 06:49:30 crc kubenswrapper[4825]: E0310 06:49:30.074593 4825 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: failed to sync configmap cache: timed out waiting for the condition Mar 10 06:49:30 crc kubenswrapper[4825]: E0310 06:49:30.074609 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 10 06:49:30 crc kubenswrapper[4825]: E0310 06:49:30.075656 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 06:51:32.075611676 +0000 UTC m=+445.105392331 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync configmap cache: timed out waiting for the condition Mar 10 06:49:30 crc kubenswrapper[4825]: E0310 06:49:30.074645 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 10 06:49:30 crc kubenswrapper[4825]: W0310 06:49:30.076569 4825 reflector.go:561] object-"openshift-network-diagnostics"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27246": dial tcp 38.102.83.222:6443: connect: connection refused Mar 10 06:49:30 crc kubenswrapper[4825]: E0310 06:49:30.076680 4825 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27246\": dial tcp 38.102.83.222:6443: connect: connection refused" logger="UnhandledError" Mar 10 06:49:30 crc kubenswrapper[4825]: E0310 06:49:30.962452 4825 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.222:6443: connect: connection refused" interval="6.4s" Mar 10 06:49:31 crc kubenswrapper[4825]: E0310 06:49:31.076355 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for 
the condition Mar 10 06:49:31 crc kubenswrapper[4825]: E0310 06:49:31.076421 4825 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: failed to sync configmap cache: timed out waiting for the condition Mar 10 06:49:31 crc kubenswrapper[4825]: E0310 06:49:31.076541 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 06:51:33.076503046 +0000 UTC m=+446.106283701 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : failed to sync configmap cache: timed out waiting for the condition Mar 10 06:49:31 crc kubenswrapper[4825]: E0310 06:49:31.076378 4825 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 10 06:49:31 crc kubenswrapper[4825]: E0310 06:49:31.076589 4825 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: failed to sync configmap cache: timed out waiting for the condition Mar 10 06:49:31 crc kubenswrapper[4825]: E0310 06:49:31.076656 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 06:51:33.07663755 +0000 UTC m=+446.106418205 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : failed to sync configmap cache: timed out waiting for the condition Mar 10 06:49:31 crc kubenswrapper[4825]: W0310 06:49:31.298888 4825 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27246": dial tcp 38.102.83.222:6443: connect: connection refused Mar 10 06:49:31 crc kubenswrapper[4825]: E0310 06:49:31.299015 4825 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27246\": dial tcp 38.102.83.222:6443: connect: connection refused" logger="UnhandledError" Mar 10 06:49:31 crc kubenswrapper[4825]: I0310 06:49:31.311381 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 10 06:49:31 crc kubenswrapper[4825]: I0310 06:49:31.312630 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 10 06:49:31 crc kubenswrapper[4825]: I0310 06:49:31.312719 4825 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="311d715e54fdfe9fcdda7648d431fa69ca054e330fd9415f9c4819462508b9ae" exitCode=1 Mar 10 06:49:31 crc 
kubenswrapper[4825]: I0310 06:49:31.312768 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"311d715e54fdfe9fcdda7648d431fa69ca054e330fd9415f9c4819462508b9ae"} Mar 10 06:49:31 crc kubenswrapper[4825]: I0310 06:49:31.313522 4825 scope.go:117] "RemoveContainer" containerID="311d715e54fdfe9fcdda7648d431fa69ca054e330fd9415f9c4819462508b9ae" Mar 10 06:49:31 crc kubenswrapper[4825]: I0310 06:49:31.314299 4825 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Mar 10 06:49:31 crc kubenswrapper[4825]: I0310 06:49:31.315098 4825 status_manager.go:851] "Failed to get status for pod" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-bvt9j\": dial tcp 38.102.83.222:6443: connect: connection refused" Mar 10 06:49:31 crc kubenswrapper[4825]: I0310 06:49:31.315673 4825 status_manager.go:851] "Failed to get status for pod" podUID="c5d3354d-1386-4f42-9c53-134c6583650d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Mar 10 06:49:31 crc kubenswrapper[4825]: W0310 06:49:31.868023 4825 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin-cert": failed to list *v1.Secret: Get 
"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27248": dial tcp 38.102.83.222:6443: connect: connection refused Mar 10 06:49:31 crc kubenswrapper[4825]: E0310 06:49:31.868180 4825 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27248\": dial tcp 38.102.83.222:6443: connect: connection refused" logger="UnhandledError" Mar 10 06:49:31 crc kubenswrapper[4825]: W0310 06:49:31.871494 4825 reflector.go:561] object-"openshift-network-diagnostics"/"kube-root-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27246": dial tcp 38.102.83.222:6443: connect: connection refused Mar 10 06:49:31 crc kubenswrapper[4825]: E0310 06:49:31.871580 4825 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27246\": dial tcp 38.102.83.222:6443: connect: connection refused" logger="UnhandledError" Mar 10 06:49:32 crc kubenswrapper[4825]: I0310 06:49:32.236018 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 06:49:32 crc kubenswrapper[4825]: I0310 06:49:32.237646 4825 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Mar 10 06:49:32 crc kubenswrapper[4825]: I0310 06:49:32.238556 4825 status_manager.go:851] "Failed to get status for pod" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-bvt9j\": dial tcp 38.102.83.222:6443: connect: connection refused" Mar 10 06:49:32 crc kubenswrapper[4825]: I0310 06:49:32.239174 4825 status_manager.go:851] "Failed to get status for pod" podUID="c5d3354d-1386-4f42-9c53-134c6583650d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Mar 10 06:49:32 crc kubenswrapper[4825]: I0310 06:49:32.259360 4825 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="468bfefa-2636-4ab4-b62d-2c5c738eb872" Mar 10 06:49:32 crc kubenswrapper[4825]: I0310 06:49:32.259593 4825 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="468bfefa-2636-4ab4-b62d-2c5c738eb872" Mar 10 06:49:32 crc kubenswrapper[4825]: E0310 06:49:32.260195 4825 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.222:6443: connect: 
connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 06:49:32 crc kubenswrapper[4825]: I0310 06:49:32.260836 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 06:49:32 crc kubenswrapper[4825]: W0310 06:49:32.288560 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-c48ad35dd0785ac981fd0b23893eab66025288c7452d0f893288a815253ef3e7 WatchSource:0}: Error finding container c48ad35dd0785ac981fd0b23893eab66025288c7452d0f893288a815253ef3e7: Status 404 returned error can't find the container with id c48ad35dd0785ac981fd0b23893eab66025288c7452d0f893288a815253ef3e7 Mar 10 06:49:32 crc kubenswrapper[4825]: I0310 06:49:32.322597 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c48ad35dd0785ac981fd0b23893eab66025288c7452d0f893288a815253ef3e7"} Mar 10 06:49:32 crc kubenswrapper[4825]: I0310 06:49:32.328261 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 10 06:49:32 crc kubenswrapper[4825]: I0310 06:49:32.329620 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 10 06:49:32 crc kubenswrapper[4825]: I0310 06:49:32.329727 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"37000b821eee3687bde64717b5889e84c0d17b0eea6686faa286d283be831ec1"} Mar 10 06:49:32 crc kubenswrapper[4825]: I0310 
06:49:32.330929 4825 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Mar 10 06:49:32 crc kubenswrapper[4825]: I0310 06:49:32.331646 4825 status_manager.go:851] "Failed to get status for pod" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-bvt9j\": dial tcp 38.102.83.222:6443: connect: connection refused" Mar 10 06:49:32 crc kubenswrapper[4825]: I0310 06:49:32.332089 4825 status_manager.go:851] "Failed to get status for pod" podUID="c5d3354d-1386-4f42-9c53-134c6583650d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Mar 10 06:49:32 crc kubenswrapper[4825]: E0310 06:49:32.661887 4825 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-8fc9207fe503b7d2100ee9bbccf4974f9dd28d2e38a6523b051f9e53b83a85c7.scope\": RecentStats: unable to find data in memory cache]" Mar 10 06:49:32 crc kubenswrapper[4825]: E0310 06:49:32.899552 4825 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/events\": dial tcp 38.102.83.222:6443: connect: connection refused" event="&Event{ObjectMeta:{machine-config-daemon-bvt9j.189b681e1fa79f50 openshift-machine-config-operator 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:machine-config-daemon-bvt9j,UID:9beb5814-89d0-47c0-8b0e-24376a358fc3,APIVersion:v1,ResourceVersion:26846,FieldPath:spec.containers{machine-config-daemon},},Reason:Killing,Message:Container machine-config-daemon failed liveness probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 06:49:16.88980872 +0000 UTC m=+309.919589325,LastTimestamp:2026-03-10 06:49:16.88980872 +0000 UTC m=+309.919589325,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 06:49:32 crc kubenswrapper[4825]: W0310 06:49:32.967954 4825 reflector.go:561] object-"openshift-network-diagnostics"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27246": dial tcp 38.102.83.222:6443: connect: connection refused Mar 10 06:49:32 crc kubenswrapper[4825]: E0310 06:49:32.968082 4825 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27246\": dial tcp 38.102.83.222:6443: connect: connection refused" logger="UnhandledError" Mar 10 06:49:33 crc kubenswrapper[4825]: I0310 06:49:33.343022 4825 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="8fc9207fe503b7d2100ee9bbccf4974f9dd28d2e38a6523b051f9e53b83a85c7" exitCode=0 Mar 10 06:49:33 crc kubenswrapper[4825]: I0310 06:49:33.343322 4825 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"8fc9207fe503b7d2100ee9bbccf4974f9dd28d2e38a6523b051f9e53b83a85c7"} Mar 10 06:49:33 crc kubenswrapper[4825]: I0310 06:49:33.343888 4825 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="468bfefa-2636-4ab4-b62d-2c5c738eb872" Mar 10 06:49:33 crc kubenswrapper[4825]: I0310 06:49:33.343925 4825 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="468bfefa-2636-4ab4-b62d-2c5c738eb872" Mar 10 06:49:33 crc kubenswrapper[4825]: E0310 06:49:33.344653 4825 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 06:49:33 crc kubenswrapper[4825]: I0310 06:49:33.344725 4825 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Mar 10 06:49:33 crc kubenswrapper[4825]: I0310 06:49:33.345751 4825 status_manager.go:851] "Failed to get status for pod" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-bvt9j\": dial tcp 38.102.83.222:6443: connect: connection refused" Mar 10 06:49:33 crc kubenswrapper[4825]: I0310 06:49:33.346731 4825 status_manager.go:851] "Failed to get status for pod" podUID="c5d3354d-1386-4f42-9c53-134c6583650d" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Mar 10 06:49:33 crc kubenswrapper[4825]: I0310 06:49:33.669929 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 06:49:33 crc kubenswrapper[4825]: I0310 06:49:33.677502 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 06:49:33 crc kubenswrapper[4825]: I0310 06:49:33.678481 4825 status_manager.go:851] "Failed to get status for pod" podUID="c5d3354d-1386-4f42-9c53-134c6583650d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Mar 10 06:49:33 crc kubenswrapper[4825]: I0310 06:49:33.679030 4825 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Mar 10 06:49:33 crc kubenswrapper[4825]: I0310 06:49:33.679532 4825 status_manager.go:851] "Failed to get status for pod" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-bvt9j\": dial tcp 38.102.83.222:6443: connect: connection refused" Mar 10 06:49:34 crc kubenswrapper[4825]: I0310 06:49:34.359247 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4d8c6959da8d36ff2d21d7c6d60d038eb63adfd59328161fbf72b71ebed44e2e"} Mar 10 06:49:34 crc kubenswrapper[4825]: I0310 06:49:34.359300 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1c4fed894bf37d4b132ee93a94476504f5586274167c918254e6463316296b0e"} Mar 10 06:49:34 crc kubenswrapper[4825]: I0310 06:49:34.359310 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7a11f3ea01580d48469ff3e2f23330a69a9c226101422bb2799c3356653ea1df"} Mar 10 06:49:34 crc kubenswrapper[4825]: I0310 06:49:34.359378 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 06:49:35 crc kubenswrapper[4825]: I0310 06:49:35.370045 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3d8ec2311b1aeef208eac6c8e65b5d66add49f77d8ba8894eb19b474b4131579"} Mar 10 06:49:35 crc kubenswrapper[4825]: I0310 06:49:35.370203 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"908bd0da629c4e2be969f594b972f8e9633624bb3b6e8c71bb80d572d6084377"} Mar 10 06:49:35 crc kubenswrapper[4825]: I0310 06:49:35.370527 4825 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="468bfefa-2636-4ab4-b62d-2c5c738eb872" Mar 10 06:49:35 crc kubenswrapper[4825]: I0310 06:49:35.370564 4825 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="468bfefa-2636-4ab4-b62d-2c5c738eb872" Mar 10 06:49:36 crc kubenswrapper[4825]: I0310 06:49:36.743856 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 10 06:49:37 crc kubenswrapper[4825]: I0310 06:49:37.261407 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 06:49:37 crc kubenswrapper[4825]: I0310 06:49:37.261543 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 06:49:37 crc kubenswrapper[4825]: I0310 06:49:37.272970 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 06:49:40 crc kubenswrapper[4825]: I0310 06:49:40.243775 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 10 06:49:40 crc kubenswrapper[4825]: I0310 06:49:40.338025 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 10 06:49:40 crc kubenswrapper[4825]: I0310 06:49:40.401987 4825 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 06:49:40 crc kubenswrapper[4825]: I0310 06:49:40.501255 4825 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="b01daa62-f11b-4e77-961c-fc6cf6a87869" Mar 10 06:49:41 crc kubenswrapper[4825]: I0310 06:49:41.408114 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 06:49:41 crc kubenswrapper[4825]: I0310 06:49:41.408662 4825 kubelet.go:1909] "Trying to delete pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="468bfefa-2636-4ab4-b62d-2c5c738eb872" Mar 10 06:49:41 crc kubenswrapper[4825]: I0310 06:49:41.408701 4825 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="468bfefa-2636-4ab4-b62d-2c5c738eb872" Mar 10 06:49:41 crc kubenswrapper[4825]: I0310 06:49:41.411680 4825 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="b01daa62-f11b-4e77-961c-fc6cf6a87869" Mar 10 06:49:41 crc kubenswrapper[4825]: I0310 06:49:41.412196 4825 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://7a11f3ea01580d48469ff3e2f23330a69a9c226101422bb2799c3356653ea1df" Mar 10 06:49:41 crc kubenswrapper[4825]: I0310 06:49:41.412217 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 06:49:41 crc kubenswrapper[4825]: I0310 06:49:41.486685 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 10 06:49:42 crc kubenswrapper[4825]: I0310 06:49:42.416736 4825 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="468bfefa-2636-4ab4-b62d-2c5c738eb872" Mar 10 06:49:42 crc kubenswrapper[4825]: I0310 06:49:42.416788 4825 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="468bfefa-2636-4ab4-b62d-2c5c738eb872" Mar 10 06:49:42 crc kubenswrapper[4825]: I0310 06:49:42.422626 4825 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="b01daa62-f11b-4e77-961c-fc6cf6a87869" Mar 10 06:49:43 crc 
kubenswrapper[4825]: E0310 06:49:43.262041 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-cqllr], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 06:49:43 crc kubenswrapper[4825]: E0310 06:49:43.282611 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-s2dwl], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 06:49:44 crc kubenswrapper[4825]: E0310 06:49:44.261791 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert nginx-conf], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 06:49:49 crc kubenswrapper[4825]: I0310 06:49:49.161394 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 06:49:49 crc kubenswrapper[4825]: I0310 06:49:49.602392 4825 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 10 06:49:49 crc kubenswrapper[4825]: I0310 06:49:49.789626 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 10 06:49:50 crc kubenswrapper[4825]: I0310 06:49:50.401933 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 10 06:49:50 crc kubenswrapper[4825]: I0310 06:49:50.510497 4825 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 
10 06:49:51 crc kubenswrapper[4825]: I0310 06:49:51.072124 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 10 06:49:51 crc kubenswrapper[4825]: I0310 06:49:51.309992 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 10 06:49:51 crc kubenswrapper[4825]: I0310 06:49:51.340791 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 10 06:49:51 crc kubenswrapper[4825]: I0310 06:49:51.389125 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 10 06:49:51 crc kubenswrapper[4825]: I0310 06:49:51.434718 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 10 06:49:51 crc kubenswrapper[4825]: I0310 06:49:51.710254 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 10 06:49:51 crc kubenswrapper[4825]: I0310 06:49:51.797952 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 10 06:49:51 crc kubenswrapper[4825]: I0310 06:49:51.982216 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 10 06:49:52 crc kubenswrapper[4825]: I0310 06:49:52.100361 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 10 06:49:52 crc kubenswrapper[4825]: I0310 06:49:52.495985 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 10 06:49:53 crc kubenswrapper[4825]: I0310 06:49:53.254673 4825 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 10 06:49:53 crc kubenswrapper[4825]: I0310 06:49:53.324976 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 10 06:49:53 crc kubenswrapper[4825]: I0310 06:49:53.639011 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 10 06:49:53 crc kubenswrapper[4825]: I0310 06:49:53.705376 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 10 06:49:53 crc kubenswrapper[4825]: I0310 06:49:53.731946 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 10 06:49:53 crc kubenswrapper[4825]: I0310 06:49:53.770923 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 10 06:49:53 crc kubenswrapper[4825]: I0310 06:49:53.793156 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 10 06:49:53 crc kubenswrapper[4825]: I0310 06:49:53.968863 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 10 06:49:53 crc kubenswrapper[4825]: I0310 06:49:53.973409 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 10 06:49:54 crc kubenswrapper[4825]: I0310 06:49:54.007049 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 10 06:49:54 crc kubenswrapper[4825]: I0310 06:49:54.016508 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 10 06:49:54 crc kubenswrapper[4825]: I0310 06:49:54.098929 4825 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 10 06:49:54 crc kubenswrapper[4825]: I0310 06:49:54.100231 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 10 06:49:54 crc kubenswrapper[4825]: I0310 06:49:54.124302 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 10 06:49:54 crc kubenswrapper[4825]: I0310 06:49:54.218482 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 10 06:49:54 crc kubenswrapper[4825]: I0310 06:49:54.236429 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:49:54 crc kubenswrapper[4825]: I0310 06:49:54.330129 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 10 06:49:54 crc kubenswrapper[4825]: I0310 06:49:54.330853 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 10 06:49:54 crc kubenswrapper[4825]: I0310 06:49:54.336454 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 10 06:49:54 crc kubenswrapper[4825]: I0310 06:49:54.341439 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 10 06:49:54 crc kubenswrapper[4825]: I0310 06:49:54.369498 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 10 06:49:54 crc kubenswrapper[4825]: I0310 06:49:54.438064 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 10 06:49:54 crc kubenswrapper[4825]: I0310 
06:49:54.475887 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 10 06:49:54 crc kubenswrapper[4825]: I0310 06:49:54.502429 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 10 06:49:54 crc kubenswrapper[4825]: I0310 06:49:54.550094 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 10 06:49:54 crc kubenswrapper[4825]: I0310 06:49:54.567801 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 10 06:49:54 crc kubenswrapper[4825]: I0310 06:49:54.662784 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 10 06:49:54 crc kubenswrapper[4825]: I0310 06:49:54.684770 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 10 06:49:54 crc kubenswrapper[4825]: I0310 06:49:54.702907 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 10 06:49:54 crc kubenswrapper[4825]: I0310 06:49:54.736796 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 10 06:49:54 crc kubenswrapper[4825]: I0310 06:49:54.761245 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 10 06:49:54 crc kubenswrapper[4825]: I0310 06:49:54.850491 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 10 06:49:54 crc kubenswrapper[4825]: I0310 06:49:54.878618 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 10 06:49:54 crc kubenswrapper[4825]: I0310 06:49:54.892843 
4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 10 06:49:54 crc kubenswrapper[4825]: I0310 06:49:54.900169 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 10 06:49:54 crc kubenswrapper[4825]: I0310 06:49:54.913917 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 10 06:49:54 crc kubenswrapper[4825]: I0310 06:49:54.984558 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 10 06:49:55 crc kubenswrapper[4825]: I0310 06:49:55.041256 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 10 06:49:55 crc kubenswrapper[4825]: I0310 06:49:55.067150 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 10 06:49:55 crc kubenswrapper[4825]: I0310 06:49:55.072291 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 10 06:49:55 crc kubenswrapper[4825]: I0310 06:49:55.073908 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 10 06:49:55 crc kubenswrapper[4825]: I0310 06:49:55.122403 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 10 06:49:55 crc kubenswrapper[4825]: I0310 06:49:55.166570 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 10 06:49:55 crc kubenswrapper[4825]: I0310 06:49:55.204781 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 10 06:49:55 crc 
kubenswrapper[4825]: I0310 06:49:55.223983 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 10 06:49:55 crc kubenswrapper[4825]: I0310 06:49:55.236035 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:49:55 crc kubenswrapper[4825]: I0310 06:49:55.246213 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 10 06:49:55 crc kubenswrapper[4825]: I0310 06:49:55.254065 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 10 06:49:55 crc kubenswrapper[4825]: I0310 06:49:55.333913 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 10 06:49:55 crc kubenswrapper[4825]: I0310 06:49:55.341192 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 10 06:49:55 crc kubenswrapper[4825]: I0310 06:49:55.365396 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 10 06:49:55 crc kubenswrapper[4825]: I0310 06:49:55.367007 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 10 06:49:55 crc kubenswrapper[4825]: I0310 06:49:55.409095 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 10 06:49:55 crc kubenswrapper[4825]: I0310 06:49:55.413223 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 10 06:49:55 crc kubenswrapper[4825]: I0310 06:49:55.416983 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 
10 06:49:55 crc kubenswrapper[4825]: I0310 06:49:55.443006 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 10 06:49:55 crc kubenswrapper[4825]: I0310 06:49:55.534823 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 10 06:49:55 crc kubenswrapper[4825]: I0310 06:49:55.695227 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 10 06:49:55 crc kubenswrapper[4825]: I0310 06:49:55.699994 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 10 06:49:55 crc kubenswrapper[4825]: I0310 06:49:55.708225 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 10 06:49:55 crc kubenswrapper[4825]: I0310 06:49:55.730236 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 10 06:49:55 crc kubenswrapper[4825]: I0310 06:49:55.733668 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 10 06:49:55 crc kubenswrapper[4825]: I0310 06:49:55.820195 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 10 06:49:55 crc kubenswrapper[4825]: I0310 06:49:55.860015 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 10 06:49:55 crc kubenswrapper[4825]: I0310 06:49:55.979272 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 10 06:49:56 crc kubenswrapper[4825]: I0310 06:49:56.015374 4825 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 10 06:49:56 crc kubenswrapper[4825]: I0310 06:49:56.231282 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 10 06:49:56 crc kubenswrapper[4825]: I0310 06:49:56.247906 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 10 06:49:56 crc kubenswrapper[4825]: I0310 06:49:56.265255 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 10 06:49:56 crc kubenswrapper[4825]: I0310 06:49:56.269008 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 10 06:49:56 crc kubenswrapper[4825]: I0310 06:49:56.392303 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 10 06:49:56 crc kubenswrapper[4825]: I0310 06:49:56.415044 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 10 06:49:56 crc kubenswrapper[4825]: I0310 06:49:56.425972 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 10 06:49:56 crc kubenswrapper[4825]: I0310 06:49:56.493935 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 10 06:49:56 crc kubenswrapper[4825]: I0310 06:49:56.529743 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 10 06:49:56 crc kubenswrapper[4825]: I0310 06:49:56.585557 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 10 06:49:56 crc kubenswrapper[4825]: I0310 06:49:56.661807 4825 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 10 06:49:56 crc kubenswrapper[4825]: I0310 06:49:56.763066 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 10 06:49:56 crc kubenswrapper[4825]: I0310 06:49:56.767468 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 10 06:49:56 crc kubenswrapper[4825]: I0310 06:49:56.836940 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 10 06:49:56 crc kubenswrapper[4825]: I0310 06:49:56.908360 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 10 06:49:57 crc kubenswrapper[4825]: I0310 06:49:57.030467 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 10 06:49:57 crc kubenswrapper[4825]: I0310 06:49:57.097439 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 10 06:49:57 crc kubenswrapper[4825]: I0310 06:49:57.159749 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 10 06:49:57 crc kubenswrapper[4825]: I0310 06:49:57.220383 4825 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 10 06:49:57 crc kubenswrapper[4825]: I0310 06:49:57.237660 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:49:57 crc kubenswrapper[4825]: I0310 06:49:57.260178 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 10 06:49:57 crc kubenswrapper[4825]: I0310 06:49:57.262644 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 10 06:49:57 crc kubenswrapper[4825]: I0310 06:49:57.291261 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 10 06:49:57 crc kubenswrapper[4825]: I0310 06:49:57.351527 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 10 06:49:57 crc kubenswrapper[4825]: I0310 06:49:57.362843 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 10 06:49:57 crc kubenswrapper[4825]: I0310 06:49:57.418330 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 10 06:49:57 crc kubenswrapper[4825]: I0310 06:49:57.420596 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 10 06:49:57 crc kubenswrapper[4825]: I0310 06:49:57.426324 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 10 06:49:57 crc kubenswrapper[4825]: I0310 06:49:57.437519 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 10 06:49:57 crc kubenswrapper[4825]: I0310 06:49:57.491036 4825 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 10 06:49:57 crc kubenswrapper[4825]: I0310 06:49:57.492701 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 10 06:49:57 crc kubenswrapper[4825]: I0310 06:49:57.492927 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 10 06:49:57 crc kubenswrapper[4825]: I0310 06:49:57.511787 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 10 06:49:57 crc kubenswrapper[4825]: I0310 06:49:57.609741 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 10 06:49:57 crc kubenswrapper[4825]: I0310 06:49:57.633501 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 10 06:49:57 crc kubenswrapper[4825]: I0310 06:49:57.725182 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 10 06:49:57 crc kubenswrapper[4825]: I0310 06:49:57.725377 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 10 06:49:57 crc kubenswrapper[4825]: I0310 06:49:57.766707 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 10 06:49:57 crc kubenswrapper[4825]: I0310 06:49:57.771012 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 10 06:49:57 crc kubenswrapper[4825]: I0310 06:49:57.843861 4825 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 10 06:49:57 crc kubenswrapper[4825]: I0310 06:49:57.890719 4825 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 10 06:49:57 crc kubenswrapper[4825]: I0310 06:49:57.956990 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 10 06:49:58 crc kubenswrapper[4825]: I0310 06:49:58.011713 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 10 06:49:58 crc kubenswrapper[4825]: I0310 06:49:58.012924 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 10 06:49:58 crc kubenswrapper[4825]: I0310 06:49:58.015758 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 10 06:49:58 crc kubenswrapper[4825]: I0310 06:49:58.222184 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 10 06:49:58 crc kubenswrapper[4825]: I0310 06:49:58.319300 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 10 06:49:58 crc kubenswrapper[4825]: I0310 06:49:58.410566 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 10 06:49:58 crc kubenswrapper[4825]: I0310 06:49:58.430483 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 10 06:49:58 crc kubenswrapper[4825]: I0310 06:49:58.539955 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 10 06:49:58 crc kubenswrapper[4825]: I0310 06:49:58.666057 4825 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console-operator"/"kube-root-ca.crt" Mar 10 06:49:58 crc kubenswrapper[4825]: I0310 06:49:58.705362 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 10 06:49:58 crc kubenswrapper[4825]: I0310 06:49:58.720637 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 10 06:49:58 crc kubenswrapper[4825]: I0310 06:49:58.805226 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 10 06:49:58 crc kubenswrapper[4825]: I0310 06:49:58.808958 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 10 06:49:59 crc kubenswrapper[4825]: I0310 06:49:59.095920 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 10 06:49:59 crc kubenswrapper[4825]: I0310 06:49:59.136436 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 10 06:49:59 crc kubenswrapper[4825]: I0310 06:49:59.136475 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 10 06:49:59 crc kubenswrapper[4825]: I0310 06:49:59.193405 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 10 06:49:59 crc kubenswrapper[4825]: I0310 06:49:59.226895 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 10 06:49:59 crc kubenswrapper[4825]: I0310 06:49:59.303524 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 10 06:49:59 crc 
kubenswrapper[4825]: I0310 06:49:59.303539 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 10 06:49:59 crc kubenswrapper[4825]: I0310 06:49:59.479797 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 10 06:49:59 crc kubenswrapper[4825]: I0310 06:49:59.546649 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 10 06:49:59 crc kubenswrapper[4825]: I0310 06:49:59.637588 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 10 06:49:59 crc kubenswrapper[4825]: I0310 06:49:59.785423 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 10 06:49:59 crc kubenswrapper[4825]: I0310 06:49:59.836867 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 10 06:49:59 crc kubenswrapper[4825]: I0310 06:49:59.856949 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 10 06:49:59 crc kubenswrapper[4825]: I0310 06:49:59.875929 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 10 06:49:59 crc kubenswrapper[4825]: I0310 06:49:59.878345 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 10 06:49:59 crc kubenswrapper[4825]: I0310 06:49:59.921866 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 10 06:50:00 crc kubenswrapper[4825]: I0310 06:50:00.017453 4825 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 10 06:50:00 crc kubenswrapper[4825]: I0310 06:50:00.095635 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 10 06:50:00 crc kubenswrapper[4825]: I0310 06:50:00.149114 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 10 06:50:00 crc kubenswrapper[4825]: I0310 06:50:00.303649 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 10 06:50:00 crc kubenswrapper[4825]: I0310 06:50:00.304390 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 10 06:50:00 crc kubenswrapper[4825]: I0310 06:50:00.439129 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 10 06:50:00 crc kubenswrapper[4825]: I0310 06:50:00.609383 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 10 06:50:00 crc kubenswrapper[4825]: I0310 06:50:00.747684 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 10 06:50:00 crc kubenswrapper[4825]: I0310 06:50:00.801779 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 10 06:50:00 crc kubenswrapper[4825]: I0310 06:50:00.952992 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 10 06:50:00 crc kubenswrapper[4825]: I0310 06:50:00.957629 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 10 06:50:00 crc kubenswrapper[4825]: I0310 
06:50:00.978536 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 10 06:50:00 crc kubenswrapper[4825]: I0310 06:50:00.990431 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 10 06:50:01 crc kubenswrapper[4825]: I0310 06:50:01.002536 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 10 06:50:01 crc kubenswrapper[4825]: I0310 06:50:01.011532 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 10 06:50:01 crc kubenswrapper[4825]: I0310 06:50:01.036426 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 10 06:50:01 crc kubenswrapper[4825]: I0310 06:50:01.064333 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 10 06:50:01 crc kubenswrapper[4825]: I0310 06:50:01.184109 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 10 06:50:01 crc kubenswrapper[4825]: I0310 06:50:01.272701 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 10 06:50:01 crc kubenswrapper[4825]: I0310 06:50:01.315871 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 10 06:50:01 crc kubenswrapper[4825]: I0310 06:50:01.407363 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 10 06:50:01 crc kubenswrapper[4825]: I0310 06:50:01.418831 4825 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 10 06:50:01 crc kubenswrapper[4825]: I0310 06:50:01.497699 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 10 06:50:01 crc kubenswrapper[4825]: I0310 06:50:01.520725 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 10 06:50:01 crc kubenswrapper[4825]: I0310 06:50:01.533224 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 10 06:50:01 crc kubenswrapper[4825]: I0310 06:50:01.543109 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 10 06:50:01 crc kubenswrapper[4825]: I0310 06:50:01.585777 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 10 06:50:01 crc kubenswrapper[4825]: I0310 06:50:01.624558 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 10 06:50:01 crc kubenswrapper[4825]: I0310 06:50:01.728742 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 10 06:50:01 crc kubenswrapper[4825]: I0310 06:50:01.896963 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 10 06:50:01 crc kubenswrapper[4825]: I0310 06:50:01.922642 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 10 06:50:01 crc kubenswrapper[4825]: I0310 06:50:01.951128 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 10 06:50:02 crc kubenswrapper[4825]: I0310 06:50:02.000920 
4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 10 06:50:02 crc kubenswrapper[4825]: I0310 06:50:02.030410 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 10 06:50:02 crc kubenswrapper[4825]: I0310 06:50:02.076842 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 10 06:50:02 crc kubenswrapper[4825]: I0310 06:50:02.320353 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 10 06:50:02 crc kubenswrapper[4825]: I0310 06:50:02.375087 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 10 06:50:02 crc kubenswrapper[4825]: I0310 06:50:02.390545 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 10 06:50:02 crc kubenswrapper[4825]: I0310 06:50:02.418741 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 10 06:50:02 crc kubenswrapper[4825]: I0310 06:50:02.510610 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 10 06:50:02 crc kubenswrapper[4825]: I0310 06:50:02.551240 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 10 06:50:02 crc kubenswrapper[4825]: I0310 06:50:02.585397 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 10 06:50:02 crc kubenswrapper[4825]: I0310 06:50:02.647308 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 10 06:50:02 crc 
kubenswrapper[4825]: I0310 06:50:02.722583 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 10 06:50:03 crc kubenswrapper[4825]: I0310 06:50:03.170729 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 10 06:50:03 crc kubenswrapper[4825]: I0310 06:50:03.237934 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 10 06:50:03 crc kubenswrapper[4825]: I0310 06:50:03.296470 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 10 06:50:03 crc kubenswrapper[4825]: I0310 06:50:03.354320 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 10 06:50:03 crc kubenswrapper[4825]: I0310 06:50:03.391356 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 10 06:50:03 crc kubenswrapper[4825]: I0310 06:50:03.567321 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 10 06:50:03 crc kubenswrapper[4825]: I0310 06:50:03.593318 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 10 06:50:03 crc kubenswrapper[4825]: I0310 06:50:03.631962 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 10 06:50:03 crc kubenswrapper[4825]: I0310 06:50:03.644348 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 10 06:50:03 crc kubenswrapper[4825]: I0310 06:50:03.671077 4825 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 10 06:50:04 crc kubenswrapper[4825]: I0310 06:50:04.046343 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 10 06:50:04 crc kubenswrapper[4825]: I0310 06:50:04.115248 4825 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 10 06:50:04 crc kubenswrapper[4825]: I0310 06:50:04.118178 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-56467868d6-9z9wj" podStartSLOduration=55.118153881 podStartE2EDuration="55.118153881s" podCreationTimestamp="2026-03-10 06:49:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:49:12.170775606 +0000 UTC m=+305.200556221" watchObservedRunningTime="2026-03-10 06:50:04.118153881 +0000 UTC m=+357.147934506" Mar 10 06:50:04 crc kubenswrapper[4825]: I0310 06:50:04.118749 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 10 06:50:04 crc kubenswrapper[4825]: I0310 06:50:04.120957 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 10 06:50:04 crc kubenswrapper[4825]: I0310 06:50:04.121012 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552090-q7kdt","openshift-kube-apiserver/kube-apiserver-crc"] Mar 10 06:50:04 crc kubenswrapper[4825]: E0310 06:50:04.121302 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5d3354d-1386-4f42-9c53-134c6583650d" containerName="installer" Mar 10 06:50:04 crc kubenswrapper[4825]: I0310 06:50:04.121329 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5d3354d-1386-4f42-9c53-134c6583650d" containerName="installer" Mar 
10 06:50:04 crc kubenswrapper[4825]: I0310 06:50:04.121414 4825 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="468bfefa-2636-4ab4-b62d-2c5c738eb872" Mar 10 06:50:04 crc kubenswrapper[4825]: I0310 06:50:04.121433 4825 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="468bfefa-2636-4ab4-b62d-2c5c738eb872" Mar 10 06:50:04 crc kubenswrapper[4825]: I0310 06:50:04.121518 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5d3354d-1386-4f42-9c53-134c6583650d" containerName="installer" Mar 10 06:50:04 crc kubenswrapper[4825]: I0310 06:50:04.121998 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552090-q7kdt" Mar 10 06:50:04 crc kubenswrapper[4825]: I0310 06:50:04.124575 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 06:50:04 crc kubenswrapper[4825]: I0310 06:50:04.124604 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 06:50:04 crc kubenswrapper[4825]: I0310 06:50:04.124702 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 06:50:04 crc kubenswrapper[4825]: I0310 06:50:04.130486 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 06:50:04 crc kubenswrapper[4825]: I0310 06:50:04.133005 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 10 06:50:04 crc kubenswrapper[4825]: I0310 06:50:04.150375 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=24.15035331 podStartE2EDuration="24.15035331s" podCreationTimestamp="2026-03-10 06:49:40 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:50:04.146590116 +0000 UTC m=+357.176370761" watchObservedRunningTime="2026-03-10 06:50:04.15035331 +0000 UTC m=+357.180133925" Mar 10 06:50:04 crc kubenswrapper[4825]: I0310 06:50:04.196965 4825 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 10 06:50:04 crc kubenswrapper[4825]: I0310 06:50:04.285353 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75qb7\" (UniqueName: \"kubernetes.io/projected/b5b8a09c-f74d-40c1-81a7-1e0ed85f5025-kube-api-access-75qb7\") pod \"auto-csr-approver-29552090-q7kdt\" (UID: \"b5b8a09c-f74d-40c1-81a7-1e0ed85f5025\") " pod="openshift-infra/auto-csr-approver-29552090-q7kdt" Mar 10 06:50:04 crc kubenswrapper[4825]: I0310 06:50:04.318970 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 10 06:50:04 crc kubenswrapper[4825]: I0310 06:50:04.341235 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 10 06:50:04 crc kubenswrapper[4825]: I0310 06:50:04.386804 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75qb7\" (UniqueName: \"kubernetes.io/projected/b5b8a09c-f74d-40c1-81a7-1e0ed85f5025-kube-api-access-75qb7\") pod \"auto-csr-approver-29552090-q7kdt\" (UID: \"b5b8a09c-f74d-40c1-81a7-1e0ed85f5025\") " pod="openshift-infra/auto-csr-approver-29552090-q7kdt" Mar 10 06:50:04 crc kubenswrapper[4825]: I0310 06:50:04.418118 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75qb7\" (UniqueName: \"kubernetes.io/projected/b5b8a09c-f74d-40c1-81a7-1e0ed85f5025-kube-api-access-75qb7\") pod \"auto-csr-approver-29552090-q7kdt\" (UID: 
\"b5b8a09c-f74d-40c1-81a7-1e0ed85f5025\") " pod="openshift-infra/auto-csr-approver-29552090-q7kdt" Mar 10 06:50:04 crc kubenswrapper[4825]: I0310 06:50:04.439962 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552090-q7kdt" Mar 10 06:50:04 crc kubenswrapper[4825]: I0310 06:50:04.521264 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 10 06:50:04 crc kubenswrapper[4825]: I0310 06:50:04.669816 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 10 06:50:04 crc kubenswrapper[4825]: I0310 06:50:04.700834 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 10 06:50:04 crc kubenswrapper[4825]: I0310 06:50:04.731368 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 10 06:50:04 crc kubenswrapper[4825]: I0310 06:50:04.852687 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 10 06:50:04 crc kubenswrapper[4825]: I0310 06:50:04.871674 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 10 06:50:04 crc kubenswrapper[4825]: I0310 06:50:04.918750 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 10 06:50:04 crc kubenswrapper[4825]: I0310 06:50:04.929446 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 10 06:50:04 crc kubenswrapper[4825]: I0310 06:50:04.946466 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 
10 06:50:05 crc kubenswrapper[4825]: I0310 06:50:05.088247 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 10 06:50:05 crc kubenswrapper[4825]: I0310 06:50:05.339967 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 06:50:05 crc kubenswrapper[4825]: I0310 06:50:05.469459 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 10 06:50:05 crc kubenswrapper[4825]: I0310 06:50:05.688117 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 10 06:50:05 crc kubenswrapper[4825]: I0310 06:50:05.750803 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 10 06:50:05 crc kubenswrapper[4825]: I0310 06:50:05.771217 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 10 06:50:05 crc kubenswrapper[4825]: I0310 06:50:05.799664 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 10 06:50:05 crc kubenswrapper[4825]: I0310 06:50:05.827208 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 10 06:50:05 crc kubenswrapper[4825]: I0310 06:50:05.850384 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 10 06:50:07 crc kubenswrapper[4825]: E0310 06:50:07.736349 4825 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 10 06:50:07 crc kubenswrapper[4825]: rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_auto-csr-approver-29552090-q7kdt_openshift-infra_b5b8a09c-f74d-40c1-81a7-1e0ed85f5025_0(b8cad262883bb22d9f8061017210d0aa324b3b731ac784b1c8fc44806061696e): error adding pod openshift-infra_auto-csr-approver-29552090-q7kdt to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"b8cad262883bb22d9f8061017210d0aa324b3b731ac784b1c8fc44806061696e" Netns:"/var/run/netns/4e990187-7216-484a-9f7b-4c82fd27592e" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-infra;K8S_POD_NAME=auto-csr-approver-29552090-q7kdt;K8S_POD_INFRA_CONTAINER_ID=b8cad262883bb22d9f8061017210d0aa324b3b731ac784b1c8fc44806061696e;K8S_POD_UID=b5b8a09c-f74d-40c1-81a7-1e0ed85f5025" Path:"" ERRORED: error configuring pod [openshift-infra/auto-csr-approver-29552090-q7kdt] networking: Multus: [openshift-infra/auto-csr-approver-29552090-q7kdt/b5b8a09c-f74d-40c1-81a7-1e0ed85f5025]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod auto-csr-approver-29552090-q7kdt in out of cluster comm: pod "auto-csr-approver-29552090-q7kdt" not found Mar 10 06:50:07 crc kubenswrapper[4825]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 10 06:50:07 crc kubenswrapper[4825]: > Mar 10 06:50:07 crc kubenswrapper[4825]: E0310 06:50:07.736900 4825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 10 06:50:07 crc kubenswrapper[4825]: rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_auto-csr-approver-29552090-q7kdt_openshift-infra_b5b8a09c-f74d-40c1-81a7-1e0ed85f5025_0(b8cad262883bb22d9f8061017210d0aa324b3b731ac784b1c8fc44806061696e): error adding pod openshift-infra_auto-csr-approver-29552090-q7kdt to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"b8cad262883bb22d9f8061017210d0aa324b3b731ac784b1c8fc44806061696e" Netns:"/var/run/netns/4e990187-7216-484a-9f7b-4c82fd27592e" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-infra;K8S_POD_NAME=auto-csr-approver-29552090-q7kdt;K8S_POD_INFRA_CONTAINER_ID=b8cad262883bb22d9f8061017210d0aa324b3b731ac784b1c8fc44806061696e;K8S_POD_UID=b5b8a09c-f74d-40c1-81a7-1e0ed85f5025" Path:"" ERRORED: error configuring pod [openshift-infra/auto-csr-approver-29552090-q7kdt] networking: Multus: [openshift-infra/auto-csr-approver-29552090-q7kdt/b5b8a09c-f74d-40c1-81a7-1e0ed85f5025]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod auto-csr-approver-29552090-q7kdt in out of cluster comm: pod "auto-csr-approver-29552090-q7kdt" not found Mar 10 06:50:07 crc kubenswrapper[4825]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 10 06:50:07 crc kubenswrapper[4825]: > pod="openshift-infra/auto-csr-approver-29552090-q7kdt" Mar 10 06:50:07 crc kubenswrapper[4825]: E0310 06:50:07.736931 4825 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 10 06:50:07 crc kubenswrapper[4825]: rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_auto-csr-approver-29552090-q7kdt_openshift-infra_b5b8a09c-f74d-40c1-81a7-1e0ed85f5025_0(b8cad262883bb22d9f8061017210d0aa324b3b731ac784b1c8fc44806061696e): error adding pod openshift-infra_auto-csr-approver-29552090-q7kdt to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"b8cad262883bb22d9f8061017210d0aa324b3b731ac784b1c8fc44806061696e" Netns:"/var/run/netns/4e990187-7216-484a-9f7b-4c82fd27592e" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-infra;K8S_POD_NAME=auto-csr-approver-29552090-q7kdt;K8S_POD_INFRA_CONTAINER_ID=b8cad262883bb22d9f8061017210d0aa324b3b731ac784b1c8fc44806061696e;K8S_POD_UID=b5b8a09c-f74d-40c1-81a7-1e0ed85f5025" Path:"" ERRORED: error configuring pod [openshift-infra/auto-csr-approver-29552090-q7kdt] networking: Multus: [openshift-infra/auto-csr-approver-29552090-q7kdt/b5b8a09c-f74d-40c1-81a7-1e0ed85f5025]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod auto-csr-approver-29552090-q7kdt in out of cluster comm: pod "auto-csr-approver-29552090-q7kdt" not found Mar 10 06:50:07 crc kubenswrapper[4825]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 10 06:50:07 crc kubenswrapper[4825]: > pod="openshift-infra/auto-csr-approver-29552090-q7kdt" Mar 10 06:50:07 crc kubenswrapper[4825]: E0310 06:50:07.736993 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29552090-q7kdt_openshift-infra(b5b8a09c-f74d-40c1-81a7-1e0ed85f5025)\" with CreatePodSandboxError: 
\"Failed to create sandbox for pod \\\"auto-csr-approver-29552090-q7kdt_openshift-infra(b5b8a09c-f74d-40c1-81a7-1e0ed85f5025)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29552090-q7kdt_openshift-infra_b5b8a09c-f74d-40c1-81a7-1e0ed85f5025_0(b8cad262883bb22d9f8061017210d0aa324b3b731ac784b1c8fc44806061696e): error adding pod openshift-infra_auto-csr-approver-29552090-q7kdt to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"b8cad262883bb22d9f8061017210d0aa324b3b731ac784b1c8fc44806061696e\\\" Netns:\\\"/var/run/netns/4e990187-7216-484a-9f7b-4c82fd27592e\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-infra;K8S_POD_NAME=auto-csr-approver-29552090-q7kdt;K8S_POD_INFRA_CONTAINER_ID=b8cad262883bb22d9f8061017210d0aa324b3b731ac784b1c8fc44806061696e;K8S_POD_UID=b5b8a09c-f74d-40c1-81a7-1e0ed85f5025\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-infra/auto-csr-approver-29552090-q7kdt] networking: Multus: [openshift-infra/auto-csr-approver-29552090-q7kdt/b5b8a09c-f74d-40c1-81a7-1e0ed85f5025]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod auto-csr-approver-29552090-q7kdt in out of cluster comm: pod \\\"auto-csr-approver-29552090-q7kdt\\\" not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" 
pod="openshift-infra/auto-csr-approver-29552090-q7kdt" podUID="b5b8a09c-f74d-40c1-81a7-1e0ed85f5025" Mar 10 06:50:14 crc kubenswrapper[4825]: I0310 06:50:14.344361 4825 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 10 06:50:14 crc kubenswrapper[4825]: I0310 06:50:14.345742 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://2838369c1d0edf34b9393bf06fcfd6b628049a63fe54561783c9e4aa998e88db" gracePeriod=5 Mar 10 06:50:16 crc kubenswrapper[4825]: I0310 06:50:16.016719 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 10 06:50:19 crc kubenswrapper[4825]: I0310 06:50:19.236474 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552090-q7kdt" Mar 10 06:50:19 crc kubenswrapper[4825]: I0310 06:50:19.243489 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552090-q7kdt" Mar 10 06:50:19 crc kubenswrapper[4825]: I0310 06:50:19.691339 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 10 06:50:19 crc kubenswrapper[4825]: I0310 06:50:19.691760 4825 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="2838369c1d0edf34b9393bf06fcfd6b628049a63fe54561783c9e4aa998e88db" exitCode=137 Mar 10 06:50:19 crc kubenswrapper[4825]: I0310 06:50:19.944666 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 10 06:50:19 crc kubenswrapper[4825]: I0310 06:50:19.944767 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 06:50:19 crc kubenswrapper[4825]: I0310 06:50:19.946015 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 06:50:19 crc kubenswrapper[4825]: I0310 06:50:19.946084 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 06:50:19 crc kubenswrapper[4825]: I0310 06:50:19.946211 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: 
"f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 06:50:19 crc kubenswrapper[4825]: I0310 06:50:19.946392 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 06:50:19 crc kubenswrapper[4825]: I0310 06:50:19.946816 4825 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 10 06:50:19 crc kubenswrapper[4825]: I0310 06:50:19.946848 4825 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 10 06:50:20 crc kubenswrapper[4825]: I0310 06:50:20.047427 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 06:50:20 crc kubenswrapper[4825]: I0310 06:50:20.047502 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 06:50:20 crc kubenswrapper[4825]: I0310 06:50:20.047544 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod 
\"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 06:50:20 crc kubenswrapper[4825]: I0310 06:50:20.047649 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 06:50:20 crc kubenswrapper[4825]: I0310 06:50:20.047784 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 06:50:20 crc kubenswrapper[4825]: I0310 06:50:20.048339 4825 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 10 06:50:20 crc kubenswrapper[4825]: I0310 06:50:20.048374 4825 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 10 06:50:20 crc kubenswrapper[4825]: I0310 06:50:20.057769 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 06:50:20 crc kubenswrapper[4825]: I0310 06:50:20.149453 4825 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 10 06:50:20 crc kubenswrapper[4825]: I0310 06:50:20.700794 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 10 06:50:20 crc kubenswrapper[4825]: I0310 06:50:20.700890 4825 scope.go:117] "RemoveContainer" containerID="2838369c1d0edf34b9393bf06fcfd6b628049a63fe54561783c9e4aa998e88db" Mar 10 06:50:20 crc kubenswrapper[4825]: I0310 06:50:20.701078 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 06:50:20 crc kubenswrapper[4825]: I0310 06:50:20.821777 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 10 06:50:21 crc kubenswrapper[4825]: I0310 06:50:21.245466 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 10 06:50:21 crc kubenswrapper[4825]: I0310 06:50:21.711712 4825 generic.go:334] "Generic (PLEG): container finished" podID="4d23b66c-f736-4af2-9dc7-6167ca4d53ef" containerID="4d4ea277f678ddf2345f9186c8aa38db3e06aebef380d7640660e915cd075c41" exitCode=0 Mar 10 06:50:21 crc kubenswrapper[4825]: I0310 06:50:21.711855 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-txg9q" event={"ID":"4d23b66c-f736-4af2-9dc7-6167ca4d53ef","Type":"ContainerDied","Data":"4d4ea277f678ddf2345f9186c8aa38db3e06aebef380d7640660e915cd075c41"} Mar 10 
06:50:21 crc kubenswrapper[4825]: I0310 06:50:21.713600 4825 scope.go:117] "RemoveContainer" containerID="4d4ea277f678ddf2345f9186c8aa38db3e06aebef380d7640660e915cd075c41" Mar 10 06:50:22 crc kubenswrapper[4825]: E0310 06:50:22.391044 4825 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 10 06:50:22 crc kubenswrapper[4825]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29552090-q7kdt_openshift-infra_b5b8a09c-f74d-40c1-81a7-1e0ed85f5025_0(51b85c71afa380779d712b6da80dcf5d043000bff89c94c888f71a843907d272): error adding pod openshift-infra_auto-csr-approver-29552090-q7kdt to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"51b85c71afa380779d712b6da80dcf5d043000bff89c94c888f71a843907d272" Netns:"/var/run/netns/291a1e9d-8a44-47df-8826-003dad4fdb4c" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-infra;K8S_POD_NAME=auto-csr-approver-29552090-q7kdt;K8S_POD_INFRA_CONTAINER_ID=51b85c71afa380779d712b6da80dcf5d043000bff89c94c888f71a843907d272;K8S_POD_UID=b5b8a09c-f74d-40c1-81a7-1e0ed85f5025" Path:"" ERRORED: error configuring pod [openshift-infra/auto-csr-approver-29552090-q7kdt] networking: Multus: [openshift-infra/auto-csr-approver-29552090-q7kdt/b5b8a09c-f74d-40c1-81a7-1e0ed85f5025]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod auto-csr-approver-29552090-q7kdt in out of cluster comm: pod "auto-csr-approver-29552090-q7kdt" not found Mar 10 06:50:22 crc kubenswrapper[4825]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 10 06:50:22 crc kubenswrapper[4825]: > Mar 10 06:50:22 crc kubenswrapper[4825]: E0310 06:50:22.391435 4825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 10 06:50:22 crc kubenswrapper[4825]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29552090-q7kdt_openshift-infra_b5b8a09c-f74d-40c1-81a7-1e0ed85f5025_0(51b85c71afa380779d712b6da80dcf5d043000bff89c94c888f71a843907d272): error adding pod openshift-infra_auto-csr-approver-29552090-q7kdt to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"51b85c71afa380779d712b6da80dcf5d043000bff89c94c888f71a843907d272" Netns:"/var/run/netns/291a1e9d-8a44-47df-8826-003dad4fdb4c" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-infra;K8S_POD_NAME=auto-csr-approver-29552090-q7kdt;K8S_POD_INFRA_CONTAINER_ID=51b85c71afa380779d712b6da80dcf5d043000bff89c94c888f71a843907d272;K8S_POD_UID=b5b8a09c-f74d-40c1-81a7-1e0ed85f5025" Path:"" ERRORED: error configuring pod [openshift-infra/auto-csr-approver-29552090-q7kdt] networking: Multus: [openshift-infra/auto-csr-approver-29552090-q7kdt/b5b8a09c-f74d-40c1-81a7-1e0ed85f5025]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod auto-csr-approver-29552090-q7kdt in out of cluster comm: pod "auto-csr-approver-29552090-q7kdt" not found Mar 10 06:50:22 crc kubenswrapper[4825]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 10 06:50:22 crc kubenswrapper[4825]: > pod="openshift-infra/auto-csr-approver-29552090-q7kdt" Mar 10 06:50:22 crc kubenswrapper[4825]: E0310 06:50:22.391460 4825 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 10 06:50:22 crc kubenswrapper[4825]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29552090-q7kdt_openshift-infra_b5b8a09c-f74d-40c1-81a7-1e0ed85f5025_0(51b85c71afa380779d712b6da80dcf5d043000bff89c94c888f71a843907d272): error adding pod openshift-infra_auto-csr-approver-29552090-q7kdt to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"51b85c71afa380779d712b6da80dcf5d043000bff89c94c888f71a843907d272" Netns:"/var/run/netns/291a1e9d-8a44-47df-8826-003dad4fdb4c" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-infra;K8S_POD_NAME=auto-csr-approver-29552090-q7kdt;K8S_POD_INFRA_CONTAINER_ID=51b85c71afa380779d712b6da80dcf5d043000bff89c94c888f71a843907d272;K8S_POD_UID=b5b8a09c-f74d-40c1-81a7-1e0ed85f5025" Path:"" ERRORED: error configuring pod [openshift-infra/auto-csr-approver-29552090-q7kdt] networking: Multus: [openshift-infra/auto-csr-approver-29552090-q7kdt/b5b8a09c-f74d-40c1-81a7-1e0ed85f5025]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod auto-csr-approver-29552090-q7kdt in out of cluster comm: pod "auto-csr-approver-29552090-q7kdt" not found Mar 10 06:50:22 crc kubenswrapper[4825]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 10 06:50:22 crc kubenswrapper[4825]: > pod="openshift-infra/auto-csr-approver-29552090-q7kdt" Mar 10 06:50:22 crc kubenswrapper[4825]: E0310 06:50:22.391528 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29552090-q7kdt_openshift-infra(b5b8a09c-f74d-40c1-81a7-1e0ed85f5025)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29552090-q7kdt_openshift-infra(b5b8a09c-f74d-40c1-81a7-1e0ed85f5025)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29552090-q7kdt_openshift-infra_b5b8a09c-f74d-40c1-81a7-1e0ed85f5025_0(51b85c71afa380779d712b6da80dcf5d043000bff89c94c888f71a843907d272): error adding pod openshift-infra_auto-csr-approver-29552090-q7kdt to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"51b85c71afa380779d712b6da80dcf5d043000bff89c94c888f71a843907d272\\\" Netns:\\\"/var/run/netns/291a1e9d-8a44-47df-8826-003dad4fdb4c\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-infra;K8S_POD_NAME=auto-csr-approver-29552090-q7kdt;K8S_POD_INFRA_CONTAINER_ID=51b85c71afa380779d712b6da80dcf5d043000bff89c94c888f71a843907d272;K8S_POD_UID=b5b8a09c-f74d-40c1-81a7-1e0ed85f5025\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-infra/auto-csr-approver-29552090-q7kdt] networking: Multus: [openshift-infra/auto-csr-approver-29552090-q7kdt/b5b8a09c-f74d-40c1-81a7-1e0ed85f5025]: error setting the networks 
status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod auto-csr-approver-29552090-q7kdt in out of cluster comm: pod \\\"auto-csr-approver-29552090-q7kdt\\\" not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-infra/auto-csr-approver-29552090-q7kdt" podUID="b5b8a09c-f74d-40c1-81a7-1e0ed85f5025" Mar 10 06:50:22 crc kubenswrapper[4825]: I0310 06:50:22.607604 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 10 06:50:22 crc kubenswrapper[4825]: I0310 06:50:22.723922 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-txg9q" event={"ID":"4d23b66c-f736-4af2-9dc7-6167ca4d53ef","Type":"ContainerStarted","Data":"80d26be988b3dcce8e1ac3516924261ca1e76ea685d249acc7ea264fcdb8111e"} Mar 10 06:50:22 crc kubenswrapper[4825]: I0310 06:50:22.725491 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-txg9q" Mar 10 06:50:22 crc kubenswrapper[4825]: I0310 06:50:22.728539 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-txg9q" Mar 10 06:50:23 crc kubenswrapper[4825]: I0310 06:50:23.652085 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 10 06:50:25 crc kubenswrapper[4825]: I0310 06:50:25.802979 4825 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 10 06:50:26 crc kubenswrapper[4825]: I0310 06:50:26.255310 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 10 06:50:27 crc kubenswrapper[4825]: I0310 06:50:27.669530 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 10 06:50:28 crc kubenswrapper[4825]: I0310 06:50:28.196182 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 10 06:50:28 crc kubenswrapper[4825]: I0310 06:50:28.343877 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 10 06:50:28 crc kubenswrapper[4825]: I0310 06:50:28.903319 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 10 06:50:31 crc kubenswrapper[4825]: I0310 06:50:31.038072 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 10 06:50:33 crc kubenswrapper[4825]: I0310 06:50:33.577918 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 10 06:50:33 crc kubenswrapper[4825]: I0310 06:50:33.974841 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 10 06:50:34 crc kubenswrapper[4825]: I0310 06:50:34.104797 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 10 06:50:34 crc kubenswrapper[4825]: I0310 06:50:34.235876 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552090-q7kdt" Mar 10 06:50:34 crc kubenswrapper[4825]: I0310 06:50:34.236855 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552090-q7kdt" Mar 10 06:50:37 crc kubenswrapper[4825]: E0310 06:50:37.446973 4825 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 10 06:50:37 crc kubenswrapper[4825]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29552090-q7kdt_openshift-infra_b5b8a09c-f74d-40c1-81a7-1e0ed85f5025_0(f0e4affa29dccd66d1ad64c3b3a9bb7b4015a21cd2af3ec0f1bf8cf2fb276422): error adding pod openshift-infra_auto-csr-approver-29552090-q7kdt to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"f0e4affa29dccd66d1ad64c3b3a9bb7b4015a21cd2af3ec0f1bf8cf2fb276422" Netns:"/var/run/netns/844beb2d-ceb7-4229-a127-ec79e96450b6" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-infra;K8S_POD_NAME=auto-csr-approver-29552090-q7kdt;K8S_POD_INFRA_CONTAINER_ID=f0e4affa29dccd66d1ad64c3b3a9bb7b4015a21cd2af3ec0f1bf8cf2fb276422;K8S_POD_UID=b5b8a09c-f74d-40c1-81a7-1e0ed85f5025" Path:"" ERRORED: error configuring pod [openshift-infra/auto-csr-approver-29552090-q7kdt] networking: Multus: [openshift-infra/auto-csr-approver-29552090-q7kdt/b5b8a09c-f74d-40c1-81a7-1e0ed85f5025]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod auto-csr-approver-29552090-q7kdt in out of cluster comm: pod "auto-csr-approver-29552090-q7kdt" not found Mar 10 06:50:37 crc kubenswrapper[4825]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 10 06:50:37 crc kubenswrapper[4825]: > Mar 10 06:50:37 crc kubenswrapper[4825]: E0310 06:50:37.447363 4825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 10 06:50:37 crc kubenswrapper[4825]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29552090-q7kdt_openshift-infra_b5b8a09c-f74d-40c1-81a7-1e0ed85f5025_0(f0e4affa29dccd66d1ad64c3b3a9bb7b4015a21cd2af3ec0f1bf8cf2fb276422): error adding pod openshift-infra_auto-csr-approver-29552090-q7kdt to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"f0e4affa29dccd66d1ad64c3b3a9bb7b4015a21cd2af3ec0f1bf8cf2fb276422" Netns:"/var/run/netns/844beb2d-ceb7-4229-a127-ec79e96450b6" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-infra;K8S_POD_NAME=auto-csr-approver-29552090-q7kdt;K8S_POD_INFRA_CONTAINER_ID=f0e4affa29dccd66d1ad64c3b3a9bb7b4015a21cd2af3ec0f1bf8cf2fb276422;K8S_POD_UID=b5b8a09c-f74d-40c1-81a7-1e0ed85f5025" Path:"" ERRORED: error configuring pod [openshift-infra/auto-csr-approver-29552090-q7kdt] networking: Multus: [openshift-infra/auto-csr-approver-29552090-q7kdt/b5b8a09c-f74d-40c1-81a7-1e0ed85f5025]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod auto-csr-approver-29552090-q7kdt in out of cluster comm: pod "auto-csr-approver-29552090-q7kdt" not found Mar 10 06:50:37 crc kubenswrapper[4825]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 10 06:50:37 crc kubenswrapper[4825]: > pod="openshift-infra/auto-csr-approver-29552090-q7kdt" Mar 10 06:50:37 crc kubenswrapper[4825]: E0310 06:50:37.447388 4825 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 10 06:50:37 crc kubenswrapper[4825]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29552090-q7kdt_openshift-infra_b5b8a09c-f74d-40c1-81a7-1e0ed85f5025_0(f0e4affa29dccd66d1ad64c3b3a9bb7b4015a21cd2af3ec0f1bf8cf2fb276422): error adding pod openshift-infra_auto-csr-approver-29552090-q7kdt to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"f0e4affa29dccd66d1ad64c3b3a9bb7b4015a21cd2af3ec0f1bf8cf2fb276422" Netns:"/var/run/netns/844beb2d-ceb7-4229-a127-ec79e96450b6" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-infra;K8S_POD_NAME=auto-csr-approver-29552090-q7kdt;K8S_POD_INFRA_CONTAINER_ID=f0e4affa29dccd66d1ad64c3b3a9bb7b4015a21cd2af3ec0f1bf8cf2fb276422;K8S_POD_UID=b5b8a09c-f74d-40c1-81a7-1e0ed85f5025" Path:"" ERRORED: error configuring pod [openshift-infra/auto-csr-approver-29552090-q7kdt] networking: Multus: [openshift-infra/auto-csr-approver-29552090-q7kdt/b5b8a09c-f74d-40c1-81a7-1e0ed85f5025]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod auto-csr-approver-29552090-q7kdt in out of cluster comm: pod "auto-csr-approver-29552090-q7kdt" not found Mar 10 06:50:37 crc kubenswrapper[4825]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 10 06:50:37 crc kubenswrapper[4825]: > pod="openshift-infra/auto-csr-approver-29552090-q7kdt" Mar 10 06:50:37 crc kubenswrapper[4825]: E0310 06:50:37.447468 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29552090-q7kdt_openshift-infra(b5b8a09c-f74d-40c1-81a7-1e0ed85f5025)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29552090-q7kdt_openshift-infra(b5b8a09c-f74d-40c1-81a7-1e0ed85f5025)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29552090-q7kdt_openshift-infra_b5b8a09c-f74d-40c1-81a7-1e0ed85f5025_0(f0e4affa29dccd66d1ad64c3b3a9bb7b4015a21cd2af3ec0f1bf8cf2fb276422): error adding pod openshift-infra_auto-csr-approver-29552090-q7kdt to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"f0e4affa29dccd66d1ad64c3b3a9bb7b4015a21cd2af3ec0f1bf8cf2fb276422\\\" Netns:\\\"/var/run/netns/844beb2d-ceb7-4229-a127-ec79e96450b6\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-infra;K8S_POD_NAME=auto-csr-approver-29552090-q7kdt;K8S_POD_INFRA_CONTAINER_ID=f0e4affa29dccd66d1ad64c3b3a9bb7b4015a21cd2af3ec0f1bf8cf2fb276422;K8S_POD_UID=b5b8a09c-f74d-40c1-81a7-1e0ed85f5025\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-infra/auto-csr-approver-29552090-q7kdt] networking: Multus: [openshift-infra/auto-csr-approver-29552090-q7kdt/b5b8a09c-f74d-40c1-81a7-1e0ed85f5025]: error setting the networks 
status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod auto-csr-approver-29552090-q7kdt in out of cluster comm: pod \\\"auto-csr-approver-29552090-q7kdt\\\" not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-infra/auto-csr-approver-29552090-q7kdt" podUID="b5b8a09c-f74d-40c1-81a7-1e0ed85f5025" Mar 10 06:50:39 crc kubenswrapper[4825]: I0310 06:50:39.409776 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 10 06:50:40 crc kubenswrapper[4825]: I0310 06:50:40.218235 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 10 06:50:41 crc kubenswrapper[4825]: I0310 06:50:41.180202 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 10 06:50:45 crc kubenswrapper[4825]: I0310 06:50:45.982699 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 10 06:50:46 crc kubenswrapper[4825]: I0310 06:50:46.260728 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 10 06:50:46 crc kubenswrapper[4825]: I0310 06:50:46.899623 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 10 06:50:47 crc kubenswrapper[4825]: I0310 06:50:47.227084 4825 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 10 06:50:53 crc kubenswrapper[4825]: I0310 06:50:53.236168 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552090-q7kdt" Mar 10 06:50:53 crc kubenswrapper[4825]: I0310 06:50:53.238275 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552090-q7kdt" Mar 10 06:50:53 crc kubenswrapper[4825]: I0310 06:50:53.572121 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552090-q7kdt"] Mar 10 06:50:53 crc kubenswrapper[4825]: W0310 06:50:53.585860 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5b8a09c_f74d_40c1_81a7_1e0ed85f5025.slice/crio-4797536f851b1b0bf8594e460ff65788ba1ac9b9d0f3606f4d4b72737a1a39ef WatchSource:0}: Error finding container 4797536f851b1b0bf8594e460ff65788ba1ac9b9d0f3606f4d4b72737a1a39ef: Status 404 returned error can't find the container with id 4797536f851b1b0bf8594e460ff65788ba1ac9b9d0f3606f4d4b72737a1a39ef Mar 10 06:50:53 crc kubenswrapper[4825]: I0310 06:50:53.992726 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552090-q7kdt" event={"ID":"b5b8a09c-f74d-40c1-81a7-1e0ed85f5025","Type":"ContainerStarted","Data":"4797536f851b1b0bf8594e460ff65788ba1ac9b9d0f3606f4d4b72737a1a39ef"} Mar 10 06:50:55 crc kubenswrapper[4825]: I0310 06:50:55.000856 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552090-q7kdt" event={"ID":"b5b8a09c-f74d-40c1-81a7-1e0ed85f5025","Type":"ContainerStarted","Data":"c26c20226f2bf9036ea8d2ec751fe525cb2dde550a2a42c88c572626afcfaff8"} Mar 10 06:50:55 crc kubenswrapper[4825]: I0310 06:50:55.022239 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-infra/auto-csr-approver-29552090-q7kdt" podStartSLOduration=54.073015082 podStartE2EDuration="55.022215308s" podCreationTimestamp="2026-03-10 06:50:00 +0000 UTC" firstStartedPulling="2026-03-10 06:50:53.589604353 +0000 UTC m=+406.619385008" lastFinishedPulling="2026-03-10 06:50:54.538804599 +0000 UTC m=+407.568585234" observedRunningTime="2026-03-10 06:50:55.021293572 +0000 UTC m=+408.051074217" watchObservedRunningTime="2026-03-10 06:50:55.022215308 +0000 UTC m=+408.051995933" Mar 10 06:50:56 crc kubenswrapper[4825]: I0310 06:50:56.012678 4825 generic.go:334] "Generic (PLEG): container finished" podID="b5b8a09c-f74d-40c1-81a7-1e0ed85f5025" containerID="c26c20226f2bf9036ea8d2ec751fe525cb2dde550a2a42c88c572626afcfaff8" exitCode=0 Mar 10 06:50:56 crc kubenswrapper[4825]: I0310 06:50:56.012787 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552090-q7kdt" event={"ID":"b5b8a09c-f74d-40c1-81a7-1e0ed85f5025","Type":"ContainerDied","Data":"c26c20226f2bf9036ea8d2ec751fe525cb2dde550a2a42c88c572626afcfaff8"} Mar 10 06:50:57 crc kubenswrapper[4825]: I0310 06:50:57.361124 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552090-q7kdt" Mar 10 06:50:57 crc kubenswrapper[4825]: I0310 06:50:57.426414 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75qb7\" (UniqueName: \"kubernetes.io/projected/b5b8a09c-f74d-40c1-81a7-1e0ed85f5025-kube-api-access-75qb7\") pod \"b5b8a09c-f74d-40c1-81a7-1e0ed85f5025\" (UID: \"b5b8a09c-f74d-40c1-81a7-1e0ed85f5025\") " Mar 10 06:50:57 crc kubenswrapper[4825]: I0310 06:50:57.436764 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5b8a09c-f74d-40c1-81a7-1e0ed85f5025-kube-api-access-75qb7" (OuterVolumeSpecName: "kube-api-access-75qb7") pod "b5b8a09c-f74d-40c1-81a7-1e0ed85f5025" (UID: "b5b8a09c-f74d-40c1-81a7-1e0ed85f5025"). InnerVolumeSpecName "kube-api-access-75qb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:50:57 crc kubenswrapper[4825]: I0310 06:50:57.529494 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75qb7\" (UniqueName: \"kubernetes.io/projected/b5b8a09c-f74d-40c1-81a7-1e0ed85f5025-kube-api-access-75qb7\") on node \"crc\" DevicePath \"\"" Mar 10 06:50:58 crc kubenswrapper[4825]: I0310 06:50:58.030412 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552090-q7kdt" event={"ID":"b5b8a09c-f74d-40c1-81a7-1e0ed85f5025","Type":"ContainerDied","Data":"4797536f851b1b0bf8594e460ff65788ba1ac9b9d0f3606f4d4b72737a1a39ef"} Mar 10 06:50:58 crc kubenswrapper[4825]: I0310 06:50:58.030474 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4797536f851b1b0bf8594e460ff65788ba1ac9b9d0f3606f4d4b72737a1a39ef" Mar 10 06:50:58 crc kubenswrapper[4825]: I0310 06:50:58.030546 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552090-q7kdt" Mar 10 06:51:32 crc kubenswrapper[4825]: I0310 06:51:32.112843 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:51:32 crc kubenswrapper[4825]: I0310 06:51:32.113639 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:51:32 crc kubenswrapper[4825]: I0310 06:51:32.117894 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:51:32 crc kubenswrapper[4825]: I0310 06:51:32.127399 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:51:32 crc kubenswrapper[4825]: I0310 06:51:32.339320 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 06:51:33 crc kubenswrapper[4825]: I0310 06:51:33.128513 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:51:33 crc kubenswrapper[4825]: I0310 06:51:33.129504 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:51:33 crc kubenswrapper[4825]: I0310 06:51:33.136787 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:51:33 crc kubenswrapper[4825]: I0310 06:51:33.137240 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:51:33 crc kubenswrapper[4825]: I0310 06:51:33.237303 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:51:33 crc kubenswrapper[4825]: I0310 06:51:33.315720 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"564f9960de376fac0d65a32d487768c8d87f688d5fdc8e5d93033f0dfd661f9e"} Mar 10 06:51:33 crc kubenswrapper[4825]: I0310 06:51:33.315779 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"372b94106ab897de32af8454f4ab8f629d4e29321ef1cc292ca7b3a69eb7ecd5"} Mar 10 06:51:33 crc kubenswrapper[4825]: I0310 06:51:33.337510 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 06:51:33 crc kubenswrapper[4825]: W0310 06:51:33.599940 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-69668818ab541f05f95fc9ace3f48617ccce3cc74e40868f6a38c0f2a21ff37c WatchSource:0}: Error finding container 69668818ab541f05f95fc9ace3f48617ccce3cc74e40868f6a38c0f2a21ff37c: Status 404 returned error can't find the container with id 69668818ab541f05f95fc9ace3f48617ccce3cc74e40868f6a38c0f2a21ff37c Mar 10 06:51:34 crc kubenswrapper[4825]: I0310 06:51:34.324909 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b2068d772e57cffff97c7deaf670616a38a7022fc9a6b46bbf586330fa6eccce"} Mar 10 06:51:34 crc kubenswrapper[4825]: I0310 06:51:34.325553 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"69668818ab541f05f95fc9ace3f48617ccce3cc74e40868f6a38c0f2a21ff37c"} Mar 10 06:51:34 crc kubenswrapper[4825]: I0310 06:51:34.326953 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"ed768edea0b354be76a7a0ed63f40ae94d245bb0869ba02c3be2026d3e39e41f"} Mar 10 06:51:34 crc kubenswrapper[4825]: I0310 06:51:34.327039 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"594751d354978297d5956f479a72ffb7986be6acf71a914b046a01acef16d3e0"} Mar 10 06:51:34 crc kubenswrapper[4825]: I0310 06:51:34.327357 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:51:34 crc kubenswrapper[4825]: I0310 06:51:34.484358 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-b4z2p"] Mar 10 06:51:34 crc kubenswrapper[4825]: E0310 06:51:34.484653 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 10 06:51:34 crc kubenswrapper[4825]: I0310 06:51:34.484675 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 10 06:51:34 crc kubenswrapper[4825]: E0310 06:51:34.484690 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5b8a09c-f74d-40c1-81a7-1e0ed85f5025" containerName="oc" Mar 10 06:51:34 crc kubenswrapper[4825]: I0310 06:51:34.484700 4825 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b5b8a09c-f74d-40c1-81a7-1e0ed85f5025" containerName="oc" Mar 10 06:51:34 crc kubenswrapper[4825]: I0310 06:51:34.484850 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5b8a09c-f74d-40c1-81a7-1e0ed85f5025" containerName="oc" Mar 10 06:51:34 crc kubenswrapper[4825]: I0310 06:51:34.484877 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 10 06:51:34 crc kubenswrapper[4825]: I0310 06:51:34.485370 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-b4z2p" Mar 10 06:51:34 crc kubenswrapper[4825]: I0310 06:51:34.496533 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-b4z2p"] Mar 10 06:51:34 crc kubenswrapper[4825]: I0310 06:51:34.564423 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/23e3a3f1-a817-4079-b684-f280f55bda61-ca-trust-extracted\") pod \"image-registry-66df7c8f76-b4z2p\" (UID: \"23e3a3f1-a817-4079-b684-f280f55bda61\") " pod="openshift-image-registry/image-registry-66df7c8f76-b4z2p" Mar 10 06:51:34 crc kubenswrapper[4825]: I0310 06:51:34.564700 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vszbj\" (UniqueName: \"kubernetes.io/projected/23e3a3f1-a817-4079-b684-f280f55bda61-kube-api-access-vszbj\") pod \"image-registry-66df7c8f76-b4z2p\" (UID: \"23e3a3f1-a817-4079-b684-f280f55bda61\") " pod="openshift-image-registry/image-registry-66df7c8f76-b4z2p" Mar 10 06:51:34 crc kubenswrapper[4825]: I0310 06:51:34.564863 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23e3a3f1-a817-4079-b684-f280f55bda61-trusted-ca\") pod 
\"image-registry-66df7c8f76-b4z2p\" (UID: \"23e3a3f1-a817-4079-b684-f280f55bda61\") " pod="openshift-image-registry/image-registry-66df7c8f76-b4z2p" Mar 10 06:51:34 crc kubenswrapper[4825]: I0310 06:51:34.564981 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/23e3a3f1-a817-4079-b684-f280f55bda61-installation-pull-secrets\") pod \"image-registry-66df7c8f76-b4z2p\" (UID: \"23e3a3f1-a817-4079-b684-f280f55bda61\") " pod="openshift-image-registry/image-registry-66df7c8f76-b4z2p" Mar 10 06:51:34 crc kubenswrapper[4825]: I0310 06:51:34.565105 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/23e3a3f1-a817-4079-b684-f280f55bda61-bound-sa-token\") pod \"image-registry-66df7c8f76-b4z2p\" (UID: \"23e3a3f1-a817-4079-b684-f280f55bda61\") " pod="openshift-image-registry/image-registry-66df7c8f76-b4z2p" Mar 10 06:51:34 crc kubenswrapper[4825]: I0310 06:51:34.565248 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/23e3a3f1-a817-4079-b684-f280f55bda61-registry-certificates\") pod \"image-registry-66df7c8f76-b4z2p\" (UID: \"23e3a3f1-a817-4079-b684-f280f55bda61\") " pod="openshift-image-registry/image-registry-66df7c8f76-b4z2p" Mar 10 06:51:34 crc kubenswrapper[4825]: I0310 06:51:34.565417 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-b4z2p\" (UID: \"23e3a3f1-a817-4079-b684-f280f55bda61\") " pod="openshift-image-registry/image-registry-66df7c8f76-b4z2p" Mar 10 06:51:34 crc kubenswrapper[4825]: I0310 06:51:34.565595 4825 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/23e3a3f1-a817-4079-b684-f280f55bda61-registry-tls\") pod \"image-registry-66df7c8f76-b4z2p\" (UID: \"23e3a3f1-a817-4079-b684-f280f55bda61\") " pod="openshift-image-registry/image-registry-66df7c8f76-b4z2p" Mar 10 06:51:34 crc kubenswrapper[4825]: I0310 06:51:34.612744 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-b4z2p\" (UID: \"23e3a3f1-a817-4079-b684-f280f55bda61\") " pod="openshift-image-registry/image-registry-66df7c8f76-b4z2p" Mar 10 06:51:34 crc kubenswrapper[4825]: I0310 06:51:34.666851 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/23e3a3f1-a817-4079-b684-f280f55bda61-registry-tls\") pod \"image-registry-66df7c8f76-b4z2p\" (UID: \"23e3a3f1-a817-4079-b684-f280f55bda61\") " pod="openshift-image-registry/image-registry-66df7c8f76-b4z2p" Mar 10 06:51:34 crc kubenswrapper[4825]: I0310 06:51:34.666942 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/23e3a3f1-a817-4079-b684-f280f55bda61-ca-trust-extracted\") pod \"image-registry-66df7c8f76-b4z2p\" (UID: \"23e3a3f1-a817-4079-b684-f280f55bda61\") " pod="openshift-image-registry/image-registry-66df7c8f76-b4z2p" Mar 10 06:51:34 crc kubenswrapper[4825]: I0310 06:51:34.666979 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vszbj\" (UniqueName: \"kubernetes.io/projected/23e3a3f1-a817-4079-b684-f280f55bda61-kube-api-access-vszbj\") pod \"image-registry-66df7c8f76-b4z2p\" (UID: \"23e3a3f1-a817-4079-b684-f280f55bda61\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-b4z2p" Mar 10 06:51:34 crc kubenswrapper[4825]: I0310 06:51:34.667024 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23e3a3f1-a817-4079-b684-f280f55bda61-trusted-ca\") pod \"image-registry-66df7c8f76-b4z2p\" (UID: \"23e3a3f1-a817-4079-b684-f280f55bda61\") " pod="openshift-image-registry/image-registry-66df7c8f76-b4z2p" Mar 10 06:51:34 crc kubenswrapper[4825]: I0310 06:51:34.667056 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/23e3a3f1-a817-4079-b684-f280f55bda61-installation-pull-secrets\") pod \"image-registry-66df7c8f76-b4z2p\" (UID: \"23e3a3f1-a817-4079-b684-f280f55bda61\") " pod="openshift-image-registry/image-registry-66df7c8f76-b4z2p" Mar 10 06:51:34 crc kubenswrapper[4825]: I0310 06:51:34.667093 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/23e3a3f1-a817-4079-b684-f280f55bda61-bound-sa-token\") pod \"image-registry-66df7c8f76-b4z2p\" (UID: \"23e3a3f1-a817-4079-b684-f280f55bda61\") " pod="openshift-image-registry/image-registry-66df7c8f76-b4z2p" Mar 10 06:51:34 crc kubenswrapper[4825]: I0310 06:51:34.667151 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/23e3a3f1-a817-4079-b684-f280f55bda61-registry-certificates\") pod \"image-registry-66df7c8f76-b4z2p\" (UID: \"23e3a3f1-a817-4079-b684-f280f55bda61\") " pod="openshift-image-registry/image-registry-66df7c8f76-b4z2p" Mar 10 06:51:34 crc kubenswrapper[4825]: I0310 06:51:34.668733 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/23e3a3f1-a817-4079-b684-f280f55bda61-registry-certificates\") pod 
\"image-registry-66df7c8f76-b4z2p\" (UID: \"23e3a3f1-a817-4079-b684-f280f55bda61\") " pod="openshift-image-registry/image-registry-66df7c8f76-b4z2p" Mar 10 06:51:34 crc kubenswrapper[4825]: I0310 06:51:34.679812 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23e3a3f1-a817-4079-b684-f280f55bda61-trusted-ca\") pod \"image-registry-66df7c8f76-b4z2p\" (UID: \"23e3a3f1-a817-4079-b684-f280f55bda61\") " pod="openshift-image-registry/image-registry-66df7c8f76-b4z2p" Mar 10 06:51:34 crc kubenswrapper[4825]: I0310 06:51:34.686775 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/23e3a3f1-a817-4079-b684-f280f55bda61-registry-tls\") pod \"image-registry-66df7c8f76-b4z2p\" (UID: \"23e3a3f1-a817-4079-b684-f280f55bda61\") " pod="openshift-image-registry/image-registry-66df7c8f76-b4z2p" Mar 10 06:51:34 crc kubenswrapper[4825]: I0310 06:51:34.692261 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/23e3a3f1-a817-4079-b684-f280f55bda61-ca-trust-extracted\") pod \"image-registry-66df7c8f76-b4z2p\" (UID: \"23e3a3f1-a817-4079-b684-f280f55bda61\") " pod="openshift-image-registry/image-registry-66df7c8f76-b4z2p" Mar 10 06:51:34 crc kubenswrapper[4825]: I0310 06:51:34.692490 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/23e3a3f1-a817-4079-b684-f280f55bda61-installation-pull-secrets\") pod \"image-registry-66df7c8f76-b4z2p\" (UID: \"23e3a3f1-a817-4079-b684-f280f55bda61\") " pod="openshift-image-registry/image-registry-66df7c8f76-b4z2p" Mar 10 06:51:34 crc kubenswrapper[4825]: I0310 06:51:34.720976 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vszbj\" (UniqueName: 
\"kubernetes.io/projected/23e3a3f1-a817-4079-b684-f280f55bda61-kube-api-access-vszbj\") pod \"image-registry-66df7c8f76-b4z2p\" (UID: \"23e3a3f1-a817-4079-b684-f280f55bda61\") " pod="openshift-image-registry/image-registry-66df7c8f76-b4z2p" Mar 10 06:51:34 crc kubenswrapper[4825]: I0310 06:51:34.742593 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/23e3a3f1-a817-4079-b684-f280f55bda61-bound-sa-token\") pod \"image-registry-66df7c8f76-b4z2p\" (UID: \"23e3a3f1-a817-4079-b684-f280f55bda61\") " pod="openshift-image-registry/image-registry-66df7c8f76-b4z2p" Mar 10 06:51:34 crc kubenswrapper[4825]: I0310 06:51:34.809965 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-b4z2p" Mar 10 06:51:35 crc kubenswrapper[4825]: I0310 06:51:35.702868 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-b4z2p"] Mar 10 06:51:35 crc kubenswrapper[4825]: W0310 06:51:35.710789 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23e3a3f1_a817_4079_b684_f280f55bda61.slice/crio-ecd770f406b97124b33f549c4d84724cf5b456f802590791ce7c4e8257a982f9 WatchSource:0}: Error finding container ecd770f406b97124b33f549c4d84724cf5b456f802590791ce7c4e8257a982f9: Status 404 returned error can't find the container with id ecd770f406b97124b33f549c4d84724cf5b456f802590791ce7c4e8257a982f9 Mar 10 06:51:36 crc kubenswrapper[4825]: I0310 06:51:36.345390 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-b4z2p" event={"ID":"23e3a3f1-a817-4079-b684-f280f55bda61","Type":"ContainerStarted","Data":"3a41ea5a05a3b5efe4c382d3464f9e614eaccd8991f9a6a36cd3aa1b87b39f03"} Mar 10 06:51:36 crc kubenswrapper[4825]: I0310 06:51:36.346366 4825 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-b4z2p" Mar 10 06:51:36 crc kubenswrapper[4825]: I0310 06:51:36.346522 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-b4z2p" event={"ID":"23e3a3f1-a817-4079-b684-f280f55bda61","Type":"ContainerStarted","Data":"ecd770f406b97124b33f549c4d84724cf5b456f802590791ce7c4e8257a982f9"} Mar 10 06:51:36 crc kubenswrapper[4825]: I0310 06:51:36.375062 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-b4z2p" podStartSLOduration=2.375034663 podStartE2EDuration="2.375034663s" podCreationTimestamp="2026-03-10 06:51:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:51:36.36637667 +0000 UTC m=+449.396157335" watchObservedRunningTime="2026-03-10 06:51:36.375034663 +0000 UTC m=+449.404815318" Mar 10 06:51:40 crc kubenswrapper[4825]: I0310 06:51:40.659601 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4gwdl"] Mar 10 06:51:40 crc kubenswrapper[4825]: I0310 06:51:40.663860 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4gwdl" podUID="f7b359b6-5dbd-4270-9195-a355b8ce3dbd" containerName="registry-server" containerID="cri-o://962de788726a07017b622672c07b4be37a71c9ca654aec23a2069145d527f6e1" gracePeriod=30 Mar 10 06:51:40 crc kubenswrapper[4825]: I0310 06:51:40.673988 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9zmmv"] Mar 10 06:51:40 crc kubenswrapper[4825]: I0310 06:51:40.674888 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9zmmv" podUID="e945d496-847a-4109-ae2d-41e169b241c2" 
containerName="registry-server" containerID="cri-o://58a324ade7abfa6edc714b16d5c4422ff524ed8541e992b8ddf9e249f3eb58a3" gracePeriod=30
Mar 10 06:51:40 crc kubenswrapper[4825]: I0310 06:51:40.699354 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-txg9q"]
Mar 10 06:51:40 crc kubenswrapper[4825]: I0310 06:51:40.699786 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-txg9q" podUID="4d23b66c-f736-4af2-9dc7-6167ca4d53ef" containerName="marketplace-operator" containerID="cri-o://80d26be988b3dcce8e1ac3516924261ca1e76ea685d249acc7ea264fcdb8111e" gracePeriod=30
Mar 10 06:51:40 crc kubenswrapper[4825]: I0310 06:51:40.710979 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vkvnj"]
Mar 10 06:51:40 crc kubenswrapper[4825]: I0310 06:51:40.711465 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vkvnj" podUID="d41ec286-2462-448b-ab27-1acc0e1dab3c" containerName="registry-server" containerID="cri-o://d1b239e14465c419a2f7f5c90b87daa5e46a35ae3f43167c2bac4bb852f84614" gracePeriod=30
Mar 10 06:51:40 crc kubenswrapper[4825]: I0310 06:51:40.718766 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jrdg7"]
Mar 10 06:51:40 crc kubenswrapper[4825]: I0310 06:51:40.720067 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jrdg7"
Mar 10 06:51:40 crc kubenswrapper[4825]: I0310 06:51:40.722649 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7h5g2"]
Mar 10 06:51:40 crc kubenswrapper[4825]: I0310 06:51:40.723461 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7h5g2" podUID="27fbaf63-cf93-470e-b012-ad8bb403f65a" containerName="registry-server" containerID="cri-o://3d3eb1c1fad78b075a8af2242d56cb89c43230c27301c8216b1bdeeb592c227e" gracePeriod=30
Mar 10 06:51:40 crc kubenswrapper[4825]: I0310 06:51:40.769277 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jrdg7"]
Mar 10 06:51:40 crc kubenswrapper[4825]: E0310 06:51:40.860091 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 58a324ade7abfa6edc714b16d5c4422ff524ed8541e992b8ddf9e249f3eb58a3 is running failed: container process not found" containerID="58a324ade7abfa6edc714b16d5c4422ff524ed8541e992b8ddf9e249f3eb58a3" cmd=["grpc_health_probe","-addr=:50051"]
Mar 10 06:51:40 crc kubenswrapper[4825]: E0310 06:51:40.860618 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 58a324ade7abfa6edc714b16d5c4422ff524ed8541e992b8ddf9e249f3eb58a3 is running failed: container process not found" containerID="58a324ade7abfa6edc714b16d5c4422ff524ed8541e992b8ddf9e249f3eb58a3" cmd=["grpc_health_probe","-addr=:50051"]
Mar 10 06:51:40 crc kubenswrapper[4825]: E0310 06:51:40.861314 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 58a324ade7abfa6edc714b16d5c4422ff524ed8541e992b8ddf9e249f3eb58a3 is running failed: container process not found" containerID="58a324ade7abfa6edc714b16d5c4422ff524ed8541e992b8ddf9e249f3eb58a3" cmd=["grpc_health_probe","-addr=:50051"]
Mar 10 06:51:40 crc kubenswrapper[4825]: E0310 06:51:40.861341 4825 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 58a324ade7abfa6edc714b16d5c4422ff524ed8541e992b8ddf9e249f3eb58a3 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-9zmmv" podUID="e945d496-847a-4109-ae2d-41e169b241c2" containerName="registry-server"
Mar 10 06:51:40 crc kubenswrapper[4825]: I0310 06:51:40.873221 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fdaf5903-c728-4085-9c33-30359e2c9af3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jrdg7\" (UID: \"fdaf5903-c728-4085-9c33-30359e2c9af3\") " pod="openshift-marketplace/marketplace-operator-79b997595-jrdg7"
Mar 10 06:51:40 crc kubenswrapper[4825]: I0310 06:51:40.873295 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxl8l\" (UniqueName: \"kubernetes.io/projected/fdaf5903-c728-4085-9c33-30359e2c9af3-kube-api-access-bxl8l\") pod \"marketplace-operator-79b997595-jrdg7\" (UID: \"fdaf5903-c728-4085-9c33-30359e2c9af3\") " pod="openshift-marketplace/marketplace-operator-79b997595-jrdg7"
Mar 10 06:51:40 crc kubenswrapper[4825]: I0310 06:51:40.873348 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fdaf5903-c728-4085-9c33-30359e2c9af3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jrdg7\" (UID: \"fdaf5903-c728-4085-9c33-30359e2c9af3\") " pod="openshift-marketplace/marketplace-operator-79b997595-jrdg7"
Mar 10 06:51:40 crc kubenswrapper[4825]: I0310 06:51:40.984185 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fdaf5903-c728-4085-9c33-30359e2c9af3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jrdg7\" (UID: \"fdaf5903-c728-4085-9c33-30359e2c9af3\") " pod="openshift-marketplace/marketplace-operator-79b997595-jrdg7"
Mar 10 06:51:40 crc kubenswrapper[4825]: I0310 06:51:40.984305 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fdaf5903-c728-4085-9c33-30359e2c9af3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jrdg7\" (UID: \"fdaf5903-c728-4085-9c33-30359e2c9af3\") " pod="openshift-marketplace/marketplace-operator-79b997595-jrdg7"
Mar 10 06:51:40 crc kubenswrapper[4825]: I0310 06:51:40.984333 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxl8l\" (UniqueName: \"kubernetes.io/projected/fdaf5903-c728-4085-9c33-30359e2c9af3-kube-api-access-bxl8l\") pod \"marketplace-operator-79b997595-jrdg7\" (UID: \"fdaf5903-c728-4085-9c33-30359e2c9af3\") " pod="openshift-marketplace/marketplace-operator-79b997595-jrdg7"
Mar 10 06:51:40 crc kubenswrapper[4825]: I0310 06:51:40.986109 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fdaf5903-c728-4085-9c33-30359e2c9af3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jrdg7\" (UID: \"fdaf5903-c728-4085-9c33-30359e2c9af3\") " pod="openshift-marketplace/marketplace-operator-79b997595-jrdg7"
Mar 10 06:51:40 crc kubenswrapper[4825]: I0310 06:51:40.995478 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fdaf5903-c728-4085-9c33-30359e2c9af3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jrdg7\" (UID: \"fdaf5903-c728-4085-9c33-30359e2c9af3\") " pod="openshift-marketplace/marketplace-operator-79b997595-jrdg7"
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.006475 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxl8l\" (UniqueName: \"kubernetes.io/projected/fdaf5903-c728-4085-9c33-30359e2c9af3-kube-api-access-bxl8l\") pod \"marketplace-operator-79b997595-jrdg7\" (UID: \"fdaf5903-c728-4085-9c33-30359e2c9af3\") " pod="openshift-marketplace/marketplace-operator-79b997595-jrdg7"
Mar 10 06:51:41 crc kubenswrapper[4825]: E0310 06:51:41.138497 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 962de788726a07017b622672c07b4be37a71c9ca654aec23a2069145d527f6e1 is running failed: container process not found" containerID="962de788726a07017b622672c07b4be37a71c9ca654aec23a2069145d527f6e1" cmd=["grpc_health_probe","-addr=:50051"]
Mar 10 06:51:41 crc kubenswrapper[4825]: E0310 06:51:41.139366 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 962de788726a07017b622672c07b4be37a71c9ca654aec23a2069145d527f6e1 is running failed: container process not found" containerID="962de788726a07017b622672c07b4be37a71c9ca654aec23a2069145d527f6e1" cmd=["grpc_health_probe","-addr=:50051"]
Mar 10 06:51:41 crc kubenswrapper[4825]: E0310 06:51:41.139812 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 962de788726a07017b622672c07b4be37a71c9ca654aec23a2069145d527f6e1 is running failed: container process not found" containerID="962de788726a07017b622672c07b4be37a71c9ca654aec23a2069145d527f6e1" cmd=["grpc_health_probe","-addr=:50051"]
Mar 10 06:51:41 crc kubenswrapper[4825]: E0310 06:51:41.139878 4825 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 962de788726a07017b622672c07b4be37a71c9ca654aec23a2069145d527f6e1 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-4gwdl" podUID="f7b359b6-5dbd-4270-9195-a355b8ce3dbd" containerName="registry-server"
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.173797 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jrdg7"
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.187852 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4gwdl"
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.195382 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7h5g2"
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.197099 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vkvnj"
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.290678 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5wjz\" (UniqueName: \"kubernetes.io/projected/f7b359b6-5dbd-4270-9195-a355b8ce3dbd-kube-api-access-w5wjz\") pod \"f7b359b6-5dbd-4270-9195-a355b8ce3dbd\" (UID: \"f7b359b6-5dbd-4270-9195-a355b8ce3dbd\") "
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.290750 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7b359b6-5dbd-4270-9195-a355b8ce3dbd-utilities\") pod \"f7b359b6-5dbd-4270-9195-a355b8ce3dbd\" (UID: \"f7b359b6-5dbd-4270-9195-a355b8ce3dbd\") "
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.290807 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7b359b6-5dbd-4270-9195-a355b8ce3dbd-catalog-content\") pod \"f7b359b6-5dbd-4270-9195-a355b8ce3dbd\" (UID: \"f7b359b6-5dbd-4270-9195-a355b8ce3dbd\") "
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.292594 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7b359b6-5dbd-4270-9195-a355b8ce3dbd-utilities" (OuterVolumeSpecName: "utilities") pod "f7b359b6-5dbd-4270-9195-a355b8ce3dbd" (UID: "f7b359b6-5dbd-4270-9195-a355b8ce3dbd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.294766 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7b359b6-5dbd-4270-9195-a355b8ce3dbd-kube-api-access-w5wjz" (OuterVolumeSpecName: "kube-api-access-w5wjz") pod "f7b359b6-5dbd-4270-9195-a355b8ce3dbd" (UID: "f7b359b6-5dbd-4270-9195-a355b8ce3dbd"). InnerVolumeSpecName "kube-api-access-w5wjz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.368440 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7b359b6-5dbd-4270-9195-a355b8ce3dbd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f7b359b6-5dbd-4270-9195-a355b8ce3dbd" (UID: "f7b359b6-5dbd-4270-9195-a355b8ce3dbd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.392213 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d41ec286-2462-448b-ab27-1acc0e1dab3c-catalog-content\") pod \"d41ec286-2462-448b-ab27-1acc0e1dab3c\" (UID: \"d41ec286-2462-448b-ab27-1acc0e1dab3c\") "
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.392310 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcm4d\" (UniqueName: \"kubernetes.io/projected/27fbaf63-cf93-470e-b012-ad8bb403f65a-kube-api-access-pcm4d\") pod \"27fbaf63-cf93-470e-b012-ad8bb403f65a\" (UID: \"27fbaf63-cf93-470e-b012-ad8bb403f65a\") "
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.392336 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27fbaf63-cf93-470e-b012-ad8bb403f65a-catalog-content\") pod \"27fbaf63-cf93-470e-b012-ad8bb403f65a\" (UID: \"27fbaf63-cf93-470e-b012-ad8bb403f65a\") "
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.392368 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d41ec286-2462-448b-ab27-1acc0e1dab3c-utilities\") pod \"d41ec286-2462-448b-ab27-1acc0e1dab3c\" (UID: \"d41ec286-2462-448b-ab27-1acc0e1dab3c\") "
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.392393 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27fbaf63-cf93-470e-b012-ad8bb403f65a-utilities\") pod \"27fbaf63-cf93-470e-b012-ad8bb403f65a\" (UID: \"27fbaf63-cf93-470e-b012-ad8bb403f65a\") "
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.392420 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9p8v\" (UniqueName: \"kubernetes.io/projected/d41ec286-2462-448b-ab27-1acc0e1dab3c-kube-api-access-t9p8v\") pod \"d41ec286-2462-448b-ab27-1acc0e1dab3c\" (UID: \"d41ec286-2462-448b-ab27-1acc0e1dab3c\") "
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.392632 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5wjz\" (UniqueName: \"kubernetes.io/projected/f7b359b6-5dbd-4270-9195-a355b8ce3dbd-kube-api-access-w5wjz\") on node \"crc\" DevicePath \"\""
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.392662 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7b359b6-5dbd-4270-9195-a355b8ce3dbd-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.392674 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7b359b6-5dbd-4270-9195-a355b8ce3dbd-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.393452 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27fbaf63-cf93-470e-b012-ad8bb403f65a-utilities" (OuterVolumeSpecName: "utilities") pod "27fbaf63-cf93-470e-b012-ad8bb403f65a" (UID: "27fbaf63-cf93-470e-b012-ad8bb403f65a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.393894 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d41ec286-2462-448b-ab27-1acc0e1dab3c-utilities" (OuterVolumeSpecName: "utilities") pod "d41ec286-2462-448b-ab27-1acc0e1dab3c" (UID: "d41ec286-2462-448b-ab27-1acc0e1dab3c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.396759 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d41ec286-2462-448b-ab27-1acc0e1dab3c-kube-api-access-t9p8v" (OuterVolumeSpecName: "kube-api-access-t9p8v") pod "d41ec286-2462-448b-ab27-1acc0e1dab3c" (UID: "d41ec286-2462-448b-ab27-1acc0e1dab3c"). InnerVolumeSpecName "kube-api-access-t9p8v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.396979 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27fbaf63-cf93-470e-b012-ad8bb403f65a-kube-api-access-pcm4d" (OuterVolumeSpecName: "kube-api-access-pcm4d") pod "27fbaf63-cf93-470e-b012-ad8bb403f65a" (UID: "27fbaf63-cf93-470e-b012-ad8bb403f65a"). InnerVolumeSpecName "kube-api-access-pcm4d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.397459 4825 generic.go:334] "Generic (PLEG): container finished" podID="d41ec286-2462-448b-ab27-1acc0e1dab3c" containerID="d1b239e14465c419a2f7f5c90b87daa5e46a35ae3f43167c2bac4bb852f84614" exitCode=0
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.397532 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vkvnj" event={"ID":"d41ec286-2462-448b-ab27-1acc0e1dab3c","Type":"ContainerDied","Data":"d1b239e14465c419a2f7f5c90b87daa5e46a35ae3f43167c2bac4bb852f84614"}
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.397576 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vkvnj" event={"ID":"d41ec286-2462-448b-ab27-1acc0e1dab3c","Type":"ContainerDied","Data":"e2519a5aca1f5139e9a5ba481a45ae0def4a882840d4d3bd1593fea658ba1e00"}
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.397600 4825 scope.go:117] "RemoveContainer" containerID="d1b239e14465c419a2f7f5c90b87daa5e46a35ae3f43167c2bac4bb852f84614"
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.397781 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vkvnj"
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.415436 4825 generic.go:334] "Generic (PLEG): container finished" podID="27fbaf63-cf93-470e-b012-ad8bb403f65a" containerID="3d3eb1c1fad78b075a8af2242d56cb89c43230c27301c8216b1bdeeb592c227e" exitCode=0
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.415570 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7h5g2" event={"ID":"27fbaf63-cf93-470e-b012-ad8bb403f65a","Type":"ContainerDied","Data":"3d3eb1c1fad78b075a8af2242d56cb89c43230c27301c8216b1bdeeb592c227e"}
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.415622 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7h5g2" event={"ID":"27fbaf63-cf93-470e-b012-ad8bb403f65a","Type":"ContainerDied","Data":"ccd786935bdb282ed517896e006002ce36d82fbb5f90ecdfedc5b6fca09cad22"}
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.415569 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7h5g2"
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.418089 4825 generic.go:334] "Generic (PLEG): container finished" podID="4d23b66c-f736-4af2-9dc7-6167ca4d53ef" containerID="80d26be988b3dcce8e1ac3516924261ca1e76ea685d249acc7ea264fcdb8111e" exitCode=0
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.418170 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-txg9q" event={"ID":"4d23b66c-f736-4af2-9dc7-6167ca4d53ef","Type":"ContainerDied","Data":"80d26be988b3dcce8e1ac3516924261ca1e76ea685d249acc7ea264fcdb8111e"}
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.419882 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jrdg7"]
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.420641 4825 generic.go:334] "Generic (PLEG): container finished" podID="f7b359b6-5dbd-4270-9195-a355b8ce3dbd" containerID="962de788726a07017b622672c07b4be37a71c9ca654aec23a2069145d527f6e1" exitCode=0
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.420689 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4gwdl" event={"ID":"f7b359b6-5dbd-4270-9195-a355b8ce3dbd","Type":"ContainerDied","Data":"962de788726a07017b622672c07b4be37a71c9ca654aec23a2069145d527f6e1"}
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.420710 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4gwdl" event={"ID":"f7b359b6-5dbd-4270-9195-a355b8ce3dbd","Type":"ContainerDied","Data":"5b1caaa0c6dce7c64aae35b69d036af11e91e672b4b1c88e6a4b8a2fe2c96d9b"}
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.420786 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4gwdl"
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.432461 4825 scope.go:117] "RemoveContainer" containerID="b04271cd9b4e4bdf29345f6d5cb92625e4711bb84e566a466273a97200d21000"
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.433992 4825 generic.go:334] "Generic (PLEG): container finished" podID="e945d496-847a-4109-ae2d-41e169b241c2" containerID="58a324ade7abfa6edc714b16d5c4422ff524ed8541e992b8ddf9e249f3eb58a3" exitCode=0
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.434030 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9zmmv" event={"ID":"e945d496-847a-4109-ae2d-41e169b241c2","Type":"ContainerDied","Data":"58a324ade7abfa6edc714b16d5c4422ff524ed8541e992b8ddf9e249f3eb58a3"}
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.439278 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d41ec286-2462-448b-ab27-1acc0e1dab3c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d41ec286-2462-448b-ab27-1acc0e1dab3c" (UID: "d41ec286-2462-448b-ab27-1acc0e1dab3c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.454247 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4gwdl"]
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.464315 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4gwdl"]
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.489795 4825 scope.go:117] "RemoveContainer" containerID="240cdac1966db2f595a0f9bb13f2f782bb43bbced94342bb7a5e7435dc697408"
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.494729 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d41ec286-2462-448b-ab27-1acc0e1dab3c-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.494774 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcm4d\" (UniqueName: \"kubernetes.io/projected/27fbaf63-cf93-470e-b012-ad8bb403f65a-kube-api-access-pcm4d\") on node \"crc\" DevicePath \"\""
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.494791 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d41ec286-2462-448b-ab27-1acc0e1dab3c-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.494806 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27fbaf63-cf93-470e-b012-ad8bb403f65a-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.494821 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9p8v\" (UniqueName: \"kubernetes.io/projected/d41ec286-2462-448b-ab27-1acc0e1dab3c-kube-api-access-t9p8v\") on node \"crc\" DevicePath \"\""
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.539195 4825 scope.go:117] "RemoveContainer" containerID="d1b239e14465c419a2f7f5c90b87daa5e46a35ae3f43167c2bac4bb852f84614"
Mar 10 06:51:41 crc kubenswrapper[4825]: E0310 06:51:41.539772 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1b239e14465c419a2f7f5c90b87daa5e46a35ae3f43167c2bac4bb852f84614\": container with ID starting with d1b239e14465c419a2f7f5c90b87daa5e46a35ae3f43167c2bac4bb852f84614 not found: ID does not exist" containerID="d1b239e14465c419a2f7f5c90b87daa5e46a35ae3f43167c2bac4bb852f84614"
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.539808 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1b239e14465c419a2f7f5c90b87daa5e46a35ae3f43167c2bac4bb852f84614"} err="failed to get container status \"d1b239e14465c419a2f7f5c90b87daa5e46a35ae3f43167c2bac4bb852f84614\": rpc error: code = NotFound desc = could not find container \"d1b239e14465c419a2f7f5c90b87daa5e46a35ae3f43167c2bac4bb852f84614\": container with ID starting with d1b239e14465c419a2f7f5c90b87daa5e46a35ae3f43167c2bac4bb852f84614 not found: ID does not exist"
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.539832 4825 scope.go:117] "RemoveContainer" containerID="b04271cd9b4e4bdf29345f6d5cb92625e4711bb84e566a466273a97200d21000"
Mar 10 06:51:41 crc kubenswrapper[4825]: E0310 06:51:41.540024 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b04271cd9b4e4bdf29345f6d5cb92625e4711bb84e566a466273a97200d21000\": container with ID starting with b04271cd9b4e4bdf29345f6d5cb92625e4711bb84e566a466273a97200d21000 not found: ID does not exist" containerID="b04271cd9b4e4bdf29345f6d5cb92625e4711bb84e566a466273a97200d21000"
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.540052 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b04271cd9b4e4bdf29345f6d5cb92625e4711bb84e566a466273a97200d21000"} err="failed to get container status \"b04271cd9b4e4bdf29345f6d5cb92625e4711bb84e566a466273a97200d21000\": rpc error: code = NotFound desc = could not find container \"b04271cd9b4e4bdf29345f6d5cb92625e4711bb84e566a466273a97200d21000\": container with ID starting with b04271cd9b4e4bdf29345f6d5cb92625e4711bb84e566a466273a97200d21000 not found: ID does not exist"
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.540065 4825 scope.go:117] "RemoveContainer" containerID="240cdac1966db2f595a0f9bb13f2f782bb43bbced94342bb7a5e7435dc697408"
Mar 10 06:51:41 crc kubenswrapper[4825]: E0310 06:51:41.540260 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"240cdac1966db2f595a0f9bb13f2f782bb43bbced94342bb7a5e7435dc697408\": container with ID starting with 240cdac1966db2f595a0f9bb13f2f782bb43bbced94342bb7a5e7435dc697408 not found: ID does not exist" containerID="240cdac1966db2f595a0f9bb13f2f782bb43bbced94342bb7a5e7435dc697408"
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.540279 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"240cdac1966db2f595a0f9bb13f2f782bb43bbced94342bb7a5e7435dc697408"} err="failed to get container status \"240cdac1966db2f595a0f9bb13f2f782bb43bbced94342bb7a5e7435dc697408\": rpc error: code = NotFound desc = could not find container \"240cdac1966db2f595a0f9bb13f2f782bb43bbced94342bb7a5e7435dc697408\": container with ID starting with 240cdac1966db2f595a0f9bb13f2f782bb43bbced94342bb7a5e7435dc697408 not found: ID does not exist"
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.540292 4825 scope.go:117] "RemoveContainer" containerID="3d3eb1c1fad78b075a8af2242d56cb89c43230c27301c8216b1bdeeb592c227e"
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.580152 4825 scope.go:117] "RemoveContainer" containerID="3e301dd8065a4a8ef3343728382f154d549d583f3cfc28e9fc21316a68215461"
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.592027 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27fbaf63-cf93-470e-b012-ad8bb403f65a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "27fbaf63-cf93-470e-b012-ad8bb403f65a" (UID: "27fbaf63-cf93-470e-b012-ad8bb403f65a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.592110 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9zmmv"
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.595859 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27fbaf63-cf93-470e-b012-ad8bb403f65a-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.605904 4825 scope.go:117] "RemoveContainer" containerID="7555ba4068a98143157d525135c2b58738e4dcc9550e5fdea5b751522c27c3d1"
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.629027 4825 scope.go:117] "RemoveContainer" containerID="3d3eb1c1fad78b075a8af2242d56cb89c43230c27301c8216b1bdeeb592c227e"
Mar 10 06:51:41 crc kubenswrapper[4825]: E0310 06:51:41.629475 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d3eb1c1fad78b075a8af2242d56cb89c43230c27301c8216b1bdeeb592c227e\": container with ID starting with 3d3eb1c1fad78b075a8af2242d56cb89c43230c27301c8216b1bdeeb592c227e not found: ID does not exist" containerID="3d3eb1c1fad78b075a8af2242d56cb89c43230c27301c8216b1bdeeb592c227e"
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.629507 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d3eb1c1fad78b075a8af2242d56cb89c43230c27301c8216b1bdeeb592c227e"} err="failed to get container status \"3d3eb1c1fad78b075a8af2242d56cb89c43230c27301c8216b1bdeeb592c227e\": rpc error: code = NotFound desc = could not find container \"3d3eb1c1fad78b075a8af2242d56cb89c43230c27301c8216b1bdeeb592c227e\": container with ID starting with 3d3eb1c1fad78b075a8af2242d56cb89c43230c27301c8216b1bdeeb592c227e not found: ID does not exist"
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.629535 4825 scope.go:117] "RemoveContainer" containerID="3e301dd8065a4a8ef3343728382f154d549d583f3cfc28e9fc21316a68215461"
Mar 10 06:51:41 crc kubenswrapper[4825]: E0310 06:51:41.629723 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e301dd8065a4a8ef3343728382f154d549d583f3cfc28e9fc21316a68215461\": container with ID starting with 3e301dd8065a4a8ef3343728382f154d549d583f3cfc28e9fc21316a68215461 not found: ID does not exist" containerID="3e301dd8065a4a8ef3343728382f154d549d583f3cfc28e9fc21316a68215461"
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.629746 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e301dd8065a4a8ef3343728382f154d549d583f3cfc28e9fc21316a68215461"} err="failed to get container status \"3e301dd8065a4a8ef3343728382f154d549d583f3cfc28e9fc21316a68215461\": rpc error: code = NotFound desc = could not find container \"3e301dd8065a4a8ef3343728382f154d549d583f3cfc28e9fc21316a68215461\": container with ID starting with 3e301dd8065a4a8ef3343728382f154d549d583f3cfc28e9fc21316a68215461 not found: ID does not exist"
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.629760 4825 scope.go:117] "RemoveContainer" containerID="7555ba4068a98143157d525135c2b58738e4dcc9550e5fdea5b751522c27c3d1"
Mar 10 06:51:41 crc kubenswrapper[4825]: E0310 06:51:41.629917 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7555ba4068a98143157d525135c2b58738e4dcc9550e5fdea5b751522c27c3d1\": container with ID starting with 7555ba4068a98143157d525135c2b58738e4dcc9550e5fdea5b751522c27c3d1 not found: ID does not exist" containerID="7555ba4068a98143157d525135c2b58738e4dcc9550e5fdea5b751522c27c3d1"
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.629936 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7555ba4068a98143157d525135c2b58738e4dcc9550e5fdea5b751522c27c3d1"} err="failed to get container status \"7555ba4068a98143157d525135c2b58738e4dcc9550e5fdea5b751522c27c3d1\": rpc error: code = NotFound desc = could not find container \"7555ba4068a98143157d525135c2b58738e4dcc9550e5fdea5b751522c27c3d1\": container with ID starting with 7555ba4068a98143157d525135c2b58738e4dcc9550e5fdea5b751522c27c3d1 not found: ID does not exist"
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.629948 4825 scope.go:117] "RemoveContainer" containerID="4d4ea277f678ddf2345f9186c8aa38db3e06aebef380d7640660e915cd075c41"
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.649176 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-txg9q"
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.656599 4825 scope.go:117] "RemoveContainer" containerID="962de788726a07017b622672c07b4be37a71c9ca654aec23a2069145d527f6e1"
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.679918 4825 scope.go:117] "RemoveContainer" containerID="fac4bb211af00d8f9c1dbebd44d9d67ed30eee5c2232504f09c4c1150fa89544"
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.696707 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e945d496-847a-4109-ae2d-41e169b241c2-catalog-content\") pod \"e945d496-847a-4109-ae2d-41e169b241c2\" (UID: \"e945d496-847a-4109-ae2d-41e169b241c2\") "
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.696774 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e945d496-847a-4109-ae2d-41e169b241c2-utilities\") pod \"e945d496-847a-4109-ae2d-41e169b241c2\" (UID: \"e945d496-847a-4109-ae2d-41e169b241c2\") "
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.696908 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcqbl\" (UniqueName: \"kubernetes.io/projected/e945d496-847a-4109-ae2d-41e169b241c2-kube-api-access-tcqbl\") pod \"e945d496-847a-4109-ae2d-41e169b241c2\" (UID: \"e945d496-847a-4109-ae2d-41e169b241c2\") "
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.699102 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e945d496-847a-4109-ae2d-41e169b241c2-utilities" (OuterVolumeSpecName: "utilities") pod "e945d496-847a-4109-ae2d-41e169b241c2" (UID: "e945d496-847a-4109-ae2d-41e169b241c2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.703962 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e945d496-847a-4109-ae2d-41e169b241c2-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.705336 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e945d496-847a-4109-ae2d-41e169b241c2-kube-api-access-tcqbl" (OuterVolumeSpecName: "kube-api-access-tcqbl") pod "e945d496-847a-4109-ae2d-41e169b241c2" (UID: "e945d496-847a-4109-ae2d-41e169b241c2"). InnerVolumeSpecName "kube-api-access-tcqbl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.724830 4825 scope.go:117] "RemoveContainer" containerID="4623f7ccc4c40b9a1a9edc649b908f5ba0b76b3bf82762a14b9a20d479bc2adb"
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.754293 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vkvnj"]
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.756120 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vkvnj"]
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.762458 4825 scope.go:117] "RemoveContainer" containerID="962de788726a07017b622672c07b4be37a71c9ca654aec23a2069145d527f6e1"
Mar 10 06:51:41 crc kubenswrapper[4825]: E0310 06:51:41.763111 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"962de788726a07017b622672c07b4be37a71c9ca654aec23a2069145d527f6e1\": container with ID starting with 962de788726a07017b622672c07b4be37a71c9ca654aec23a2069145d527f6e1 not found: ID does not exist" containerID="962de788726a07017b622672c07b4be37a71c9ca654aec23a2069145d527f6e1"
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.763230 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"962de788726a07017b622672c07b4be37a71c9ca654aec23a2069145d527f6e1"} err="failed to get container status \"962de788726a07017b622672c07b4be37a71c9ca654aec23a2069145d527f6e1\": rpc error: code = NotFound desc = could not find container \"962de788726a07017b622672c07b4be37a71c9ca654aec23a2069145d527f6e1\": container with ID starting with 962de788726a07017b622672c07b4be37a71c9ca654aec23a2069145d527f6e1 not found: ID does not exist"
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.763266 4825 scope.go:117] "RemoveContainer" containerID="fac4bb211af00d8f9c1dbebd44d9d67ed30eee5c2232504f09c4c1150fa89544"
Mar 10 06:51:41 crc kubenswrapper[4825]: E0310 06:51:41.763637 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fac4bb211af00d8f9c1dbebd44d9d67ed30eee5c2232504f09c4c1150fa89544\": container with ID starting with fac4bb211af00d8f9c1dbebd44d9d67ed30eee5c2232504f09c4c1150fa89544 not found: ID does not exist" containerID="fac4bb211af00d8f9c1dbebd44d9d67ed30eee5c2232504f09c4c1150fa89544"
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.763699 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fac4bb211af00d8f9c1dbebd44d9d67ed30eee5c2232504f09c4c1150fa89544"} err="failed to get container status \"fac4bb211af00d8f9c1dbebd44d9d67ed30eee5c2232504f09c4c1150fa89544\": rpc error: code = NotFound desc = could not find container \"fac4bb211af00d8f9c1dbebd44d9d67ed30eee5c2232504f09c4c1150fa89544\": container with ID starting with fac4bb211af00d8f9c1dbebd44d9d67ed30eee5c2232504f09c4c1150fa89544 not found: ID does not exist"
Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.763745 4825 scope.go:117] "RemoveContainer" containerID="4623f7ccc4c40b9a1a9edc649b908f5ba0b76b3bf82762a14b9a20d479bc2adb"
Mar 10 06:51:41 crc
kubenswrapper[4825]: E0310 06:51:41.764401 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4623f7ccc4c40b9a1a9edc649b908f5ba0b76b3bf82762a14b9a20d479bc2adb\": container with ID starting with 4623f7ccc4c40b9a1a9edc649b908f5ba0b76b3bf82762a14b9a20d479bc2adb not found: ID does not exist" containerID="4623f7ccc4c40b9a1a9edc649b908f5ba0b76b3bf82762a14b9a20d479bc2adb" Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.764436 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4623f7ccc4c40b9a1a9edc649b908f5ba0b76b3bf82762a14b9a20d479bc2adb"} err="failed to get container status \"4623f7ccc4c40b9a1a9edc649b908f5ba0b76b3bf82762a14b9a20d479bc2adb\": rpc error: code = NotFound desc = could not find container \"4623f7ccc4c40b9a1a9edc649b908f5ba0b76b3bf82762a14b9a20d479bc2adb\": container with ID starting with 4623f7ccc4c40b9a1a9edc649b908f5ba0b76b3bf82762a14b9a20d479bc2adb not found: ID does not exist" Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.767522 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7h5g2"] Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.770637 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e945d496-847a-4109-ae2d-41e169b241c2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e945d496-847a-4109-ae2d-41e169b241c2" (UID: "e945d496-847a-4109-ae2d-41e169b241c2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.770849 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7h5g2"] Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.806996 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mc5w9\" (UniqueName: \"kubernetes.io/projected/4d23b66c-f736-4af2-9dc7-6167ca4d53ef-kube-api-access-mc5w9\") pod \"4d23b66c-f736-4af2-9dc7-6167ca4d53ef\" (UID: \"4d23b66c-f736-4af2-9dc7-6167ca4d53ef\") " Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.807049 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4d23b66c-f736-4af2-9dc7-6167ca4d53ef-marketplace-trusted-ca\") pod \"4d23b66c-f736-4af2-9dc7-6167ca4d53ef\" (UID: \"4d23b66c-f736-4af2-9dc7-6167ca4d53ef\") " Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.807118 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4d23b66c-f736-4af2-9dc7-6167ca4d53ef-marketplace-operator-metrics\") pod \"4d23b66c-f736-4af2-9dc7-6167ca4d53ef\" (UID: \"4d23b66c-f736-4af2-9dc7-6167ca4d53ef\") " Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.807317 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e945d496-847a-4109-ae2d-41e169b241c2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.807331 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcqbl\" (UniqueName: \"kubernetes.io/projected/e945d496-847a-4109-ae2d-41e169b241c2-kube-api-access-tcqbl\") on node \"crc\" DevicePath \"\"" Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.807747 4825 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d23b66c-f736-4af2-9dc7-6167ca4d53ef-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "4d23b66c-f736-4af2-9dc7-6167ca4d53ef" (UID: "4d23b66c-f736-4af2-9dc7-6167ca4d53ef"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.814313 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d23b66c-f736-4af2-9dc7-6167ca4d53ef-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "4d23b66c-f736-4af2-9dc7-6167ca4d53ef" (UID: "4d23b66c-f736-4af2-9dc7-6167ca4d53ef"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.814494 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d23b66c-f736-4af2-9dc7-6167ca4d53ef-kube-api-access-mc5w9" (OuterVolumeSpecName: "kube-api-access-mc5w9") pod "4d23b66c-f736-4af2-9dc7-6167ca4d53ef" (UID: "4d23b66c-f736-4af2-9dc7-6167ca4d53ef"). InnerVolumeSpecName "kube-api-access-mc5w9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.908393 4825 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4d23b66c-f736-4af2-9dc7-6167ca4d53ef-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.908456 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mc5w9\" (UniqueName: \"kubernetes.io/projected/4d23b66c-f736-4af2-9dc7-6167ca4d53ef-kube-api-access-mc5w9\") on node \"crc\" DevicePath \"\"" Mar 10 06:51:41 crc kubenswrapper[4825]: I0310 06:51:41.908476 4825 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4d23b66c-f736-4af2-9dc7-6167ca4d53ef-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 06:51:42 crc kubenswrapper[4825]: I0310 06:51:42.443851 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-txg9q" Mar 10 06:51:42 crc kubenswrapper[4825]: I0310 06:51:42.443821 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-txg9q" event={"ID":"4d23b66c-f736-4af2-9dc7-6167ca4d53ef","Type":"ContainerDied","Data":"83fb2a979171b266537a6400dec6d73b3a93f176ac49b11d7b0d9cbb89e4c14b"} Mar 10 06:51:42 crc kubenswrapper[4825]: I0310 06:51:42.444720 4825 scope.go:117] "RemoveContainer" containerID="80d26be988b3dcce8e1ac3516924261ca1e76ea685d249acc7ea264fcdb8111e" Mar 10 06:51:42 crc kubenswrapper[4825]: I0310 06:51:42.450416 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9zmmv" event={"ID":"e945d496-847a-4109-ae2d-41e169b241c2","Type":"ContainerDied","Data":"3b8b29a0243edbb7a048bcd7a324d016e71460515b8950d383ec8fe104a64074"} Mar 10 06:51:42 crc kubenswrapper[4825]: I0310 06:51:42.451042 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9zmmv" Mar 10 06:51:42 crc kubenswrapper[4825]: I0310 06:51:42.458715 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jrdg7" event={"ID":"fdaf5903-c728-4085-9c33-30359e2c9af3","Type":"ContainerStarted","Data":"f02a61acfb6e613e36adac07320e4868c38621b0e47a0a0b6c9f9bab71e1c0bb"} Mar 10 06:51:42 crc kubenswrapper[4825]: I0310 06:51:42.458749 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jrdg7" event={"ID":"fdaf5903-c728-4085-9c33-30359e2c9af3","Type":"ContainerStarted","Data":"9b7d95c1b247d40662008ba348b113f75b432e52fd00bcc63eedfd06a4c491ca"} Mar 10 06:51:42 crc kubenswrapper[4825]: I0310 06:51:42.462064 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-jrdg7" Mar 10 06:51:42 crc kubenswrapper[4825]: I0310 06:51:42.467991 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-jrdg7" Mar 10 06:51:42 crc kubenswrapper[4825]: I0310 06:51:42.490923 4825 scope.go:117] "RemoveContainer" containerID="58a324ade7abfa6edc714b16d5c4422ff524ed8541e992b8ddf9e249f3eb58a3" Mar 10 06:51:42 crc kubenswrapper[4825]: I0310 06:51:42.516097 4825 scope.go:117] "RemoveContainer" containerID="505aed85c350b976d22668271546308dfb7ba915a5c98bc578f6a43d0ea182b0" Mar 10 06:51:42 crc kubenswrapper[4825]: I0310 06:51:42.517354 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-jrdg7" podStartSLOduration=2.516901062 podStartE2EDuration="2.516901062s" podCreationTimestamp="2026-03-10 06:51:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:51:42.511061032 +0000 UTC 
m=+455.540841677" watchObservedRunningTime="2026-03-10 06:51:42.516901062 +0000 UTC m=+455.546681737" Mar 10 06:51:42 crc kubenswrapper[4825]: I0310 06:51:42.533019 4825 scope.go:117] "RemoveContainer" containerID="bf5279fde39c6ec638486a54176ba3fa9d3bf341ae6fb17ff1c0ca8ff14c8288" Mar 10 06:51:42 crc kubenswrapper[4825]: I0310 06:51:42.567785 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-txg9q"] Mar 10 06:51:42 crc kubenswrapper[4825]: I0310 06:51:42.584219 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-txg9q"] Mar 10 06:51:42 crc kubenswrapper[4825]: I0310 06:51:42.596239 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9zmmv"] Mar 10 06:51:42 crc kubenswrapper[4825]: I0310 06:51:42.599972 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9zmmv"] Mar 10 06:51:42 crc kubenswrapper[4825]: I0310 06:51:42.900045 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-twdqr"] Mar 10 06:51:42 crc kubenswrapper[4825]: E0310 06:51:42.900771 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d41ec286-2462-448b-ab27-1acc0e1dab3c" containerName="extract-utilities" Mar 10 06:51:42 crc kubenswrapper[4825]: I0310 06:51:42.900794 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="d41ec286-2462-448b-ab27-1acc0e1dab3c" containerName="extract-utilities" Mar 10 06:51:42 crc kubenswrapper[4825]: E0310 06:51:42.900823 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27fbaf63-cf93-470e-b012-ad8bb403f65a" containerName="registry-server" Mar 10 06:51:42 crc kubenswrapper[4825]: I0310 06:51:42.900837 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="27fbaf63-cf93-470e-b012-ad8bb403f65a" containerName="registry-server" Mar 10 06:51:42 crc 
kubenswrapper[4825]: E0310 06:51:42.900857 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d23b66c-f736-4af2-9dc7-6167ca4d53ef" containerName="marketplace-operator" Mar 10 06:51:42 crc kubenswrapper[4825]: I0310 06:51:42.900896 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d23b66c-f736-4af2-9dc7-6167ca4d53ef" containerName="marketplace-operator" Mar 10 06:51:42 crc kubenswrapper[4825]: E0310 06:51:42.900916 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d41ec286-2462-448b-ab27-1acc0e1dab3c" containerName="extract-content" Mar 10 06:51:42 crc kubenswrapper[4825]: I0310 06:51:42.900929 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="d41ec286-2462-448b-ab27-1acc0e1dab3c" containerName="extract-content" Mar 10 06:51:42 crc kubenswrapper[4825]: E0310 06:51:42.900947 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e945d496-847a-4109-ae2d-41e169b241c2" containerName="registry-server" Mar 10 06:51:42 crc kubenswrapper[4825]: I0310 06:51:42.900959 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="e945d496-847a-4109-ae2d-41e169b241c2" containerName="registry-server" Mar 10 06:51:42 crc kubenswrapper[4825]: E0310 06:51:42.900979 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7b359b6-5dbd-4270-9195-a355b8ce3dbd" containerName="extract-content" Mar 10 06:51:42 crc kubenswrapper[4825]: I0310 06:51:42.900991 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7b359b6-5dbd-4270-9195-a355b8ce3dbd" containerName="extract-content" Mar 10 06:51:42 crc kubenswrapper[4825]: E0310 06:51:42.901010 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27fbaf63-cf93-470e-b012-ad8bb403f65a" containerName="extract-utilities" Mar 10 06:51:42 crc kubenswrapper[4825]: I0310 06:51:42.901023 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="27fbaf63-cf93-470e-b012-ad8bb403f65a" containerName="extract-utilities" Mar 10 06:51:42 crc 
kubenswrapper[4825]: E0310 06:51:42.901041 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27fbaf63-cf93-470e-b012-ad8bb403f65a" containerName="extract-content" Mar 10 06:51:42 crc kubenswrapper[4825]: I0310 06:51:42.901053 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="27fbaf63-cf93-470e-b012-ad8bb403f65a" containerName="extract-content" Mar 10 06:51:42 crc kubenswrapper[4825]: E0310 06:51:42.901068 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d41ec286-2462-448b-ab27-1acc0e1dab3c" containerName="registry-server" Mar 10 06:51:42 crc kubenswrapper[4825]: I0310 06:51:42.901083 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="d41ec286-2462-448b-ab27-1acc0e1dab3c" containerName="registry-server" Mar 10 06:51:42 crc kubenswrapper[4825]: E0310 06:51:42.901108 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e945d496-847a-4109-ae2d-41e169b241c2" containerName="extract-content" Mar 10 06:51:42 crc kubenswrapper[4825]: I0310 06:51:42.901121 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="e945d496-847a-4109-ae2d-41e169b241c2" containerName="extract-content" Mar 10 06:51:42 crc kubenswrapper[4825]: E0310 06:51:42.901223 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7b359b6-5dbd-4270-9195-a355b8ce3dbd" containerName="extract-utilities" Mar 10 06:51:42 crc kubenswrapper[4825]: I0310 06:51:42.901239 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7b359b6-5dbd-4270-9195-a355b8ce3dbd" containerName="extract-utilities" Mar 10 06:51:42 crc kubenswrapper[4825]: E0310 06:51:42.901253 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7b359b6-5dbd-4270-9195-a355b8ce3dbd" containerName="registry-server" Mar 10 06:51:42 crc kubenswrapper[4825]: I0310 06:51:42.901265 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7b359b6-5dbd-4270-9195-a355b8ce3dbd" containerName="registry-server" Mar 10 06:51:42 crc 
kubenswrapper[4825]: E0310 06:51:42.901281 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e945d496-847a-4109-ae2d-41e169b241c2" containerName="extract-utilities" Mar 10 06:51:42 crc kubenswrapper[4825]: I0310 06:51:42.901293 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="e945d496-847a-4109-ae2d-41e169b241c2" containerName="extract-utilities" Mar 10 06:51:42 crc kubenswrapper[4825]: I0310 06:51:42.901453 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7b359b6-5dbd-4270-9195-a355b8ce3dbd" containerName="registry-server" Mar 10 06:51:42 crc kubenswrapper[4825]: I0310 06:51:42.901470 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d23b66c-f736-4af2-9dc7-6167ca4d53ef" containerName="marketplace-operator" Mar 10 06:51:42 crc kubenswrapper[4825]: I0310 06:51:42.901489 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="e945d496-847a-4109-ae2d-41e169b241c2" containerName="registry-server" Mar 10 06:51:42 crc kubenswrapper[4825]: I0310 06:51:42.901510 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d23b66c-f736-4af2-9dc7-6167ca4d53ef" containerName="marketplace-operator" Mar 10 06:51:42 crc kubenswrapper[4825]: I0310 06:51:42.901528 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="d41ec286-2462-448b-ab27-1acc0e1dab3c" containerName="registry-server" Mar 10 06:51:42 crc kubenswrapper[4825]: I0310 06:51:42.901550 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="27fbaf63-cf93-470e-b012-ad8bb403f65a" containerName="registry-server" Mar 10 06:51:42 crc kubenswrapper[4825]: E0310 06:51:42.901715 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d23b66c-f736-4af2-9dc7-6167ca4d53ef" containerName="marketplace-operator" Mar 10 06:51:42 crc kubenswrapper[4825]: I0310 06:51:42.901729 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d23b66c-f736-4af2-9dc7-6167ca4d53ef" 
containerName="marketplace-operator" Mar 10 06:51:42 crc kubenswrapper[4825]: I0310 06:51:42.902905 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-twdqr" Mar 10 06:51:42 crc kubenswrapper[4825]: I0310 06:51:42.908282 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 10 06:51:42 crc kubenswrapper[4825]: I0310 06:51:42.914345 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-twdqr"] Mar 10 06:51:42 crc kubenswrapper[4825]: I0310 06:51:42.924537 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftll7\" (UniqueName: \"kubernetes.io/projected/1b90190d-cc9e-4a1e-9e24-cb85d5e4fd63-kube-api-access-ftll7\") pod \"redhat-marketplace-twdqr\" (UID: \"1b90190d-cc9e-4a1e-9e24-cb85d5e4fd63\") " pod="openshift-marketplace/redhat-marketplace-twdqr" Mar 10 06:51:42 crc kubenswrapper[4825]: I0310 06:51:42.924630 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b90190d-cc9e-4a1e-9e24-cb85d5e4fd63-utilities\") pod \"redhat-marketplace-twdqr\" (UID: \"1b90190d-cc9e-4a1e-9e24-cb85d5e4fd63\") " pod="openshift-marketplace/redhat-marketplace-twdqr" Mar 10 06:51:42 crc kubenswrapper[4825]: I0310 06:51:42.924691 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b90190d-cc9e-4a1e-9e24-cb85d5e4fd63-catalog-content\") pod \"redhat-marketplace-twdqr\" (UID: \"1b90190d-cc9e-4a1e-9e24-cb85d5e4fd63\") " pod="openshift-marketplace/redhat-marketplace-twdqr" Mar 10 06:51:43 crc kubenswrapper[4825]: I0310 06:51:43.025797 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/1b90190d-cc9e-4a1e-9e24-cb85d5e4fd63-utilities\") pod \"redhat-marketplace-twdqr\" (UID: \"1b90190d-cc9e-4a1e-9e24-cb85d5e4fd63\") " pod="openshift-marketplace/redhat-marketplace-twdqr" Mar 10 06:51:43 crc kubenswrapper[4825]: I0310 06:51:43.026264 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b90190d-cc9e-4a1e-9e24-cb85d5e4fd63-catalog-content\") pod \"redhat-marketplace-twdqr\" (UID: \"1b90190d-cc9e-4a1e-9e24-cb85d5e4fd63\") " pod="openshift-marketplace/redhat-marketplace-twdqr" Mar 10 06:51:43 crc kubenswrapper[4825]: I0310 06:51:43.026473 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftll7\" (UniqueName: \"kubernetes.io/projected/1b90190d-cc9e-4a1e-9e24-cb85d5e4fd63-kube-api-access-ftll7\") pod \"redhat-marketplace-twdqr\" (UID: \"1b90190d-cc9e-4a1e-9e24-cb85d5e4fd63\") " pod="openshift-marketplace/redhat-marketplace-twdqr" Mar 10 06:51:43 crc kubenswrapper[4825]: I0310 06:51:43.026489 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b90190d-cc9e-4a1e-9e24-cb85d5e4fd63-utilities\") pod \"redhat-marketplace-twdqr\" (UID: \"1b90190d-cc9e-4a1e-9e24-cb85d5e4fd63\") " pod="openshift-marketplace/redhat-marketplace-twdqr" Mar 10 06:51:43 crc kubenswrapper[4825]: I0310 06:51:43.027349 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b90190d-cc9e-4a1e-9e24-cb85d5e4fd63-catalog-content\") pod \"redhat-marketplace-twdqr\" (UID: \"1b90190d-cc9e-4a1e-9e24-cb85d5e4fd63\") " pod="openshift-marketplace/redhat-marketplace-twdqr" Mar 10 06:51:43 crc kubenswrapper[4825]: I0310 06:51:43.053496 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftll7\" (UniqueName: 
\"kubernetes.io/projected/1b90190d-cc9e-4a1e-9e24-cb85d5e4fd63-kube-api-access-ftll7\") pod \"redhat-marketplace-twdqr\" (UID: \"1b90190d-cc9e-4a1e-9e24-cb85d5e4fd63\") " pod="openshift-marketplace/redhat-marketplace-twdqr" Mar 10 06:51:43 crc kubenswrapper[4825]: I0310 06:51:43.081672 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8kmh2"] Mar 10 06:51:43 crc kubenswrapper[4825]: I0310 06:51:43.084810 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8kmh2" Mar 10 06:51:43 crc kubenswrapper[4825]: I0310 06:51:43.088242 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 10 06:51:43 crc kubenswrapper[4825]: I0310 06:51:43.094527 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8kmh2"] Mar 10 06:51:43 crc kubenswrapper[4825]: I0310 06:51:43.228774 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-twdqr" Mar 10 06:51:43 crc kubenswrapper[4825]: I0310 06:51:43.229014 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b80fe579-c776-4eee-8b58-04e7ab7ac4cb-catalog-content\") pod \"redhat-operators-8kmh2\" (UID: \"b80fe579-c776-4eee-8b58-04e7ab7ac4cb\") " pod="openshift-marketplace/redhat-operators-8kmh2" Mar 10 06:51:43 crc kubenswrapper[4825]: I0310 06:51:43.229343 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrkdq\" (UniqueName: \"kubernetes.io/projected/b80fe579-c776-4eee-8b58-04e7ab7ac4cb-kube-api-access-mrkdq\") pod \"redhat-operators-8kmh2\" (UID: \"b80fe579-c776-4eee-8b58-04e7ab7ac4cb\") " pod="openshift-marketplace/redhat-operators-8kmh2" Mar 10 06:51:43 crc kubenswrapper[4825]: I0310 06:51:43.229382 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b80fe579-c776-4eee-8b58-04e7ab7ac4cb-utilities\") pod \"redhat-operators-8kmh2\" (UID: \"b80fe579-c776-4eee-8b58-04e7ab7ac4cb\") " pod="openshift-marketplace/redhat-operators-8kmh2" Mar 10 06:51:43 crc kubenswrapper[4825]: I0310 06:51:43.251333 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27fbaf63-cf93-470e-b012-ad8bb403f65a" path="/var/lib/kubelet/pods/27fbaf63-cf93-470e-b012-ad8bb403f65a/volumes" Mar 10 06:51:43 crc kubenswrapper[4825]: I0310 06:51:43.252833 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d23b66c-f736-4af2-9dc7-6167ca4d53ef" path="/var/lib/kubelet/pods/4d23b66c-f736-4af2-9dc7-6167ca4d53ef/volumes" Mar 10 06:51:43 crc kubenswrapper[4825]: I0310 06:51:43.255065 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d41ec286-2462-448b-ab27-1acc0e1dab3c" 
path="/var/lib/kubelet/pods/d41ec286-2462-448b-ab27-1acc0e1dab3c/volumes" Mar 10 06:51:43 crc kubenswrapper[4825]: I0310 06:51:43.257390 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e945d496-847a-4109-ae2d-41e169b241c2" path="/var/lib/kubelet/pods/e945d496-847a-4109-ae2d-41e169b241c2/volumes" Mar 10 06:51:43 crc kubenswrapper[4825]: I0310 06:51:43.258909 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7b359b6-5dbd-4270-9195-a355b8ce3dbd" path="/var/lib/kubelet/pods/f7b359b6-5dbd-4270-9195-a355b8ce3dbd/volumes" Mar 10 06:51:43 crc kubenswrapper[4825]: I0310 06:51:43.336334 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b80fe579-c776-4eee-8b58-04e7ab7ac4cb-catalog-content\") pod \"redhat-operators-8kmh2\" (UID: \"b80fe579-c776-4eee-8b58-04e7ab7ac4cb\") " pod="openshift-marketplace/redhat-operators-8kmh2" Mar 10 06:51:43 crc kubenswrapper[4825]: I0310 06:51:43.336384 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrkdq\" (UniqueName: \"kubernetes.io/projected/b80fe579-c776-4eee-8b58-04e7ab7ac4cb-kube-api-access-mrkdq\") pod \"redhat-operators-8kmh2\" (UID: \"b80fe579-c776-4eee-8b58-04e7ab7ac4cb\") " pod="openshift-marketplace/redhat-operators-8kmh2" Mar 10 06:51:43 crc kubenswrapper[4825]: I0310 06:51:43.336406 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b80fe579-c776-4eee-8b58-04e7ab7ac4cb-utilities\") pod \"redhat-operators-8kmh2\" (UID: \"b80fe579-c776-4eee-8b58-04e7ab7ac4cb\") " pod="openshift-marketplace/redhat-operators-8kmh2" Mar 10 06:51:43 crc kubenswrapper[4825]: I0310 06:51:43.337183 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b80fe579-c776-4eee-8b58-04e7ab7ac4cb-utilities\") pod 
\"redhat-operators-8kmh2\" (UID: \"b80fe579-c776-4eee-8b58-04e7ab7ac4cb\") " pod="openshift-marketplace/redhat-operators-8kmh2" Mar 10 06:51:43 crc kubenswrapper[4825]: I0310 06:51:43.337291 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b80fe579-c776-4eee-8b58-04e7ab7ac4cb-catalog-content\") pod \"redhat-operators-8kmh2\" (UID: \"b80fe579-c776-4eee-8b58-04e7ab7ac4cb\") " pod="openshift-marketplace/redhat-operators-8kmh2" Mar 10 06:51:43 crc kubenswrapper[4825]: I0310 06:51:43.363638 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrkdq\" (UniqueName: \"kubernetes.io/projected/b80fe579-c776-4eee-8b58-04e7ab7ac4cb-kube-api-access-mrkdq\") pod \"redhat-operators-8kmh2\" (UID: \"b80fe579-c776-4eee-8b58-04e7ab7ac4cb\") " pod="openshift-marketplace/redhat-operators-8kmh2" Mar 10 06:51:43 crc kubenswrapper[4825]: I0310 06:51:43.440556 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8kmh2" Mar 10 06:51:43 crc kubenswrapper[4825]: I0310 06:51:43.440840 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-twdqr"] Mar 10 06:51:43 crc kubenswrapper[4825]: W0310 06:51:43.446793 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b90190d_cc9e_4a1e_9e24_cb85d5e4fd63.slice/crio-a7dd7b9bd9d72671c122c96acc71ebc691fc36d052894c29d6f20e52c97a81c4 WatchSource:0}: Error finding container a7dd7b9bd9d72671c122c96acc71ebc691fc36d052894c29d6f20e52c97a81c4: Status 404 returned error can't find the container with id a7dd7b9bd9d72671c122c96acc71ebc691fc36d052894c29d6f20e52c97a81c4 Mar 10 06:51:43 crc kubenswrapper[4825]: I0310 06:51:43.473893 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-twdqr" event={"ID":"1b90190d-cc9e-4a1e-9e24-cb85d5e4fd63","Type":"ContainerStarted","Data":"a7dd7b9bd9d72671c122c96acc71ebc691fc36d052894c29d6f20e52c97a81c4"} Mar 10 06:51:43 crc kubenswrapper[4825]: I0310 06:51:43.767789 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8kmh2"] Mar 10 06:51:43 crc kubenswrapper[4825]: W0310 06:51:43.772736 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb80fe579_c776_4eee_8b58_04e7ab7ac4cb.slice/crio-7b40f751f60e384b7434d22c573c31e71d501f8c8903f8a7149f3586e1681193 WatchSource:0}: Error finding container 7b40f751f60e384b7434d22c573c31e71d501f8c8903f8a7149f3586e1681193: Status 404 returned error can't find the container with id 7b40f751f60e384b7434d22c573c31e71d501f8c8903f8a7149f3586e1681193 Mar 10 06:51:44 crc kubenswrapper[4825]: I0310 06:51:44.493896 4825 generic.go:334] "Generic (PLEG): container finished" podID="b80fe579-c776-4eee-8b58-04e7ab7ac4cb" 
containerID="976c6e1a134c094dd08b36f074c5ad7ba15169aba725cf91209633d3650301b4" exitCode=0 Mar 10 06:51:44 crc kubenswrapper[4825]: I0310 06:51:44.494027 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8kmh2" event={"ID":"b80fe579-c776-4eee-8b58-04e7ab7ac4cb","Type":"ContainerDied","Data":"976c6e1a134c094dd08b36f074c5ad7ba15169aba725cf91209633d3650301b4"} Mar 10 06:51:44 crc kubenswrapper[4825]: I0310 06:51:44.494077 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8kmh2" event={"ID":"b80fe579-c776-4eee-8b58-04e7ab7ac4cb","Type":"ContainerStarted","Data":"7b40f751f60e384b7434d22c573c31e71d501f8c8903f8a7149f3586e1681193"} Mar 10 06:51:44 crc kubenswrapper[4825]: I0310 06:51:44.500913 4825 generic.go:334] "Generic (PLEG): container finished" podID="1b90190d-cc9e-4a1e-9e24-cb85d5e4fd63" containerID="ce790b381ec2b573cdbb6fba91626ec414f40c21a68abcfb8d6b15a42cbbda32" exitCode=0 Mar 10 06:51:44 crc kubenswrapper[4825]: I0310 06:51:44.501267 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-twdqr" event={"ID":"1b90190d-cc9e-4a1e-9e24-cb85d5e4fd63","Type":"ContainerDied","Data":"ce790b381ec2b573cdbb6fba91626ec414f40c21a68abcfb8d6b15a42cbbda32"} Mar 10 06:51:45 crc kubenswrapper[4825]: I0310 06:51:45.286361 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lhnmv"] Mar 10 06:51:45 crc kubenswrapper[4825]: I0310 06:51:45.287890 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lhnmv" Mar 10 06:51:45 crc kubenswrapper[4825]: I0310 06:51:45.296255 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lhnmv"] Mar 10 06:51:45 crc kubenswrapper[4825]: I0310 06:51:45.297903 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 10 06:51:45 crc kubenswrapper[4825]: I0310 06:51:45.368833 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw8kd\" (UniqueName: \"kubernetes.io/projected/3d75eddb-b190-4e4e-8f99-339d7923a4c0-kube-api-access-pw8kd\") pod \"certified-operators-lhnmv\" (UID: \"3d75eddb-b190-4e4e-8f99-339d7923a4c0\") " pod="openshift-marketplace/certified-operators-lhnmv" Mar 10 06:51:45 crc kubenswrapper[4825]: I0310 06:51:45.368885 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d75eddb-b190-4e4e-8f99-339d7923a4c0-utilities\") pod \"certified-operators-lhnmv\" (UID: \"3d75eddb-b190-4e4e-8f99-339d7923a4c0\") " pod="openshift-marketplace/certified-operators-lhnmv" Mar 10 06:51:45 crc kubenswrapper[4825]: I0310 06:51:45.368929 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d75eddb-b190-4e4e-8f99-339d7923a4c0-catalog-content\") pod \"certified-operators-lhnmv\" (UID: \"3d75eddb-b190-4e4e-8f99-339d7923a4c0\") " pod="openshift-marketplace/certified-operators-lhnmv" Mar 10 06:51:45 crc kubenswrapper[4825]: I0310 06:51:45.470768 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d75eddb-b190-4e4e-8f99-339d7923a4c0-catalog-content\") pod \"certified-operators-lhnmv\" (UID: 
\"3d75eddb-b190-4e4e-8f99-339d7923a4c0\") " pod="openshift-marketplace/certified-operators-lhnmv" Mar 10 06:51:45 crc kubenswrapper[4825]: I0310 06:51:45.471226 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw8kd\" (UniqueName: \"kubernetes.io/projected/3d75eddb-b190-4e4e-8f99-339d7923a4c0-kube-api-access-pw8kd\") pod \"certified-operators-lhnmv\" (UID: \"3d75eddb-b190-4e4e-8f99-339d7923a4c0\") " pod="openshift-marketplace/certified-operators-lhnmv" Mar 10 06:51:45 crc kubenswrapper[4825]: I0310 06:51:45.471305 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d75eddb-b190-4e4e-8f99-339d7923a4c0-catalog-content\") pod \"certified-operators-lhnmv\" (UID: \"3d75eddb-b190-4e4e-8f99-339d7923a4c0\") " pod="openshift-marketplace/certified-operators-lhnmv" Mar 10 06:51:45 crc kubenswrapper[4825]: I0310 06:51:45.471355 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d75eddb-b190-4e4e-8f99-339d7923a4c0-utilities\") pod \"certified-operators-lhnmv\" (UID: \"3d75eddb-b190-4e4e-8f99-339d7923a4c0\") " pod="openshift-marketplace/certified-operators-lhnmv" Mar 10 06:51:45 crc kubenswrapper[4825]: I0310 06:51:45.472117 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d75eddb-b190-4e4e-8f99-339d7923a4c0-utilities\") pod \"certified-operators-lhnmv\" (UID: \"3d75eddb-b190-4e4e-8f99-339d7923a4c0\") " pod="openshift-marketplace/certified-operators-lhnmv" Mar 10 06:51:45 crc kubenswrapper[4825]: I0310 06:51:45.487826 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bjjzc"] Mar 10 06:51:45 crc kubenswrapper[4825]: I0310 06:51:45.489232 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bjjzc" Mar 10 06:51:45 crc kubenswrapper[4825]: I0310 06:51:45.492366 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 10 06:51:45 crc kubenswrapper[4825]: I0310 06:51:45.496784 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bjjzc"] Mar 10 06:51:45 crc kubenswrapper[4825]: I0310 06:51:45.513360 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw8kd\" (UniqueName: \"kubernetes.io/projected/3d75eddb-b190-4e4e-8f99-339d7923a4c0-kube-api-access-pw8kd\") pod \"certified-operators-lhnmv\" (UID: \"3d75eddb-b190-4e4e-8f99-339d7923a4c0\") " pod="openshift-marketplace/certified-operators-lhnmv" Mar 10 06:51:45 crc kubenswrapper[4825]: I0310 06:51:45.574108 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4818c2a2-e616-4b5f-87c5-5c1e5c0cea22-utilities\") pod \"community-operators-bjjzc\" (UID: \"4818c2a2-e616-4b5f-87c5-5c1e5c0cea22\") " pod="openshift-marketplace/community-operators-bjjzc" Mar 10 06:51:45 crc kubenswrapper[4825]: I0310 06:51:45.574202 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv68t\" (UniqueName: \"kubernetes.io/projected/4818c2a2-e616-4b5f-87c5-5c1e5c0cea22-kube-api-access-sv68t\") pod \"community-operators-bjjzc\" (UID: \"4818c2a2-e616-4b5f-87c5-5c1e5c0cea22\") " pod="openshift-marketplace/community-operators-bjjzc" Mar 10 06:51:45 crc kubenswrapper[4825]: I0310 06:51:45.574251 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4818c2a2-e616-4b5f-87c5-5c1e5c0cea22-catalog-content\") pod \"community-operators-bjjzc\" (UID: 
\"4818c2a2-e616-4b5f-87c5-5c1e5c0cea22\") " pod="openshift-marketplace/community-operators-bjjzc" Mar 10 06:51:45 crc kubenswrapper[4825]: I0310 06:51:45.614732 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lhnmv" Mar 10 06:51:45 crc kubenswrapper[4825]: I0310 06:51:45.675465 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4818c2a2-e616-4b5f-87c5-5c1e5c0cea22-utilities\") pod \"community-operators-bjjzc\" (UID: \"4818c2a2-e616-4b5f-87c5-5c1e5c0cea22\") " pod="openshift-marketplace/community-operators-bjjzc" Mar 10 06:51:45 crc kubenswrapper[4825]: I0310 06:51:45.675550 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sv68t\" (UniqueName: \"kubernetes.io/projected/4818c2a2-e616-4b5f-87c5-5c1e5c0cea22-kube-api-access-sv68t\") pod \"community-operators-bjjzc\" (UID: \"4818c2a2-e616-4b5f-87c5-5c1e5c0cea22\") " pod="openshift-marketplace/community-operators-bjjzc" Mar 10 06:51:45 crc kubenswrapper[4825]: I0310 06:51:45.675697 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4818c2a2-e616-4b5f-87c5-5c1e5c0cea22-catalog-content\") pod \"community-operators-bjjzc\" (UID: \"4818c2a2-e616-4b5f-87c5-5c1e5c0cea22\") " pod="openshift-marketplace/community-operators-bjjzc" Mar 10 06:51:45 crc kubenswrapper[4825]: I0310 06:51:45.676501 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4818c2a2-e616-4b5f-87c5-5c1e5c0cea22-utilities\") pod \"community-operators-bjjzc\" (UID: \"4818c2a2-e616-4b5f-87c5-5c1e5c0cea22\") " pod="openshift-marketplace/community-operators-bjjzc" Mar 10 06:51:45 crc kubenswrapper[4825]: I0310 06:51:45.676567 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4818c2a2-e616-4b5f-87c5-5c1e5c0cea22-catalog-content\") pod \"community-operators-bjjzc\" (UID: \"4818c2a2-e616-4b5f-87c5-5c1e5c0cea22\") " pod="openshift-marketplace/community-operators-bjjzc" Mar 10 06:51:45 crc kubenswrapper[4825]: I0310 06:51:45.696439 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv68t\" (UniqueName: \"kubernetes.io/projected/4818c2a2-e616-4b5f-87c5-5c1e5c0cea22-kube-api-access-sv68t\") pod \"community-operators-bjjzc\" (UID: \"4818c2a2-e616-4b5f-87c5-5c1e5c0cea22\") " pod="openshift-marketplace/community-operators-bjjzc" Mar 10 06:51:45 crc kubenswrapper[4825]: I0310 06:51:45.839638 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bjjzc" Mar 10 06:51:45 crc kubenswrapper[4825]: I0310 06:51:45.866218 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lhnmv"] Mar 10 06:51:45 crc kubenswrapper[4825]: W0310 06:51:45.869875 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d75eddb_b190_4e4e_8f99_339d7923a4c0.slice/crio-91763806981d23b8e4ffb287254620952fd6de77cf4ffc53122598503e7fc424 WatchSource:0}: Error finding container 91763806981d23b8e4ffb287254620952fd6de77cf4ffc53122598503e7fc424: Status 404 returned error can't find the container with id 91763806981d23b8e4ffb287254620952fd6de77cf4ffc53122598503e7fc424 Mar 10 06:51:46 crc kubenswrapper[4825]: I0310 06:51:46.066147 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bjjzc"] Mar 10 06:51:46 crc kubenswrapper[4825]: W0310 06:51:46.113001 4825 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4818c2a2_e616_4b5f_87c5_5c1e5c0cea22.slice/crio-3c271d8b19f65a6633a30e7db25e0d83c47f416060dfba5e369adc40d6fb350a WatchSource:0}: Error finding container 3c271d8b19f65a6633a30e7db25e0d83c47f416060dfba5e369adc40d6fb350a: Status 404 returned error can't find the container with id 3c271d8b19f65a6633a30e7db25e0d83c47f416060dfba5e369adc40d6fb350a Mar 10 06:51:46 crc kubenswrapper[4825]: I0310 06:51:46.523296 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8kmh2" event={"ID":"b80fe579-c776-4eee-8b58-04e7ab7ac4cb","Type":"ContainerStarted","Data":"01d2b48df14b5e6917c015a795d5839af6b19862724d7241088f342a0e33370b"} Mar 10 06:51:46 crc kubenswrapper[4825]: I0310 06:51:46.529651 4825 generic.go:334] "Generic (PLEG): container finished" podID="3d75eddb-b190-4e4e-8f99-339d7923a4c0" containerID="62a545b5838b74dab930ca1824c5ef81119474d1ef29a7b4e534d6e89abc830f" exitCode=0 Mar 10 06:51:46 crc kubenswrapper[4825]: I0310 06:51:46.529709 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lhnmv" event={"ID":"3d75eddb-b190-4e4e-8f99-339d7923a4c0","Type":"ContainerDied","Data":"62a545b5838b74dab930ca1824c5ef81119474d1ef29a7b4e534d6e89abc830f"} Mar 10 06:51:46 crc kubenswrapper[4825]: I0310 06:51:46.529778 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lhnmv" event={"ID":"3d75eddb-b190-4e4e-8f99-339d7923a4c0","Type":"ContainerStarted","Data":"91763806981d23b8e4ffb287254620952fd6de77cf4ffc53122598503e7fc424"} Mar 10 06:51:46 crc kubenswrapper[4825]: I0310 06:51:46.533915 4825 generic.go:334] "Generic (PLEG): container finished" podID="1b90190d-cc9e-4a1e-9e24-cb85d5e4fd63" containerID="22ef40323512737debfee823e8755e03ea6722b4b3e586e3fa864ae597f240c7" exitCode=0 Mar 10 06:51:46 crc kubenswrapper[4825]: I0310 06:51:46.534027 4825 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-twdqr" event={"ID":"1b90190d-cc9e-4a1e-9e24-cb85d5e4fd63","Type":"ContainerDied","Data":"22ef40323512737debfee823e8755e03ea6722b4b3e586e3fa864ae597f240c7"} Mar 10 06:51:46 crc kubenswrapper[4825]: I0310 06:51:46.562033 4825 generic.go:334] "Generic (PLEG): container finished" podID="4818c2a2-e616-4b5f-87c5-5c1e5c0cea22" containerID="e739bc6bf3649e9c01afccf949453e29ec4e8f93b1b010d4bf89f0c884633678" exitCode=0 Mar 10 06:51:46 crc kubenswrapper[4825]: I0310 06:51:46.562106 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bjjzc" event={"ID":"4818c2a2-e616-4b5f-87c5-5c1e5c0cea22","Type":"ContainerDied","Data":"e739bc6bf3649e9c01afccf949453e29ec4e8f93b1b010d4bf89f0c884633678"} Mar 10 06:51:46 crc kubenswrapper[4825]: I0310 06:51:46.562157 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bjjzc" event={"ID":"4818c2a2-e616-4b5f-87c5-5c1e5c0cea22","Type":"ContainerStarted","Data":"3c271d8b19f65a6633a30e7db25e0d83c47f416060dfba5e369adc40d6fb350a"} Mar 10 06:51:46 crc kubenswrapper[4825]: I0310 06:51:46.888116 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 06:51:46 crc kubenswrapper[4825]: I0310 06:51:46.888207 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 06:51:47 crc kubenswrapper[4825]: I0310 06:51:47.568055 4825 generic.go:334] "Generic (PLEG): container finished" 
podID="b80fe579-c776-4eee-8b58-04e7ab7ac4cb" containerID="01d2b48df14b5e6917c015a795d5839af6b19862724d7241088f342a0e33370b" exitCode=0 Mar 10 06:51:47 crc kubenswrapper[4825]: I0310 06:51:47.568270 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8kmh2" event={"ID":"b80fe579-c776-4eee-8b58-04e7ab7ac4cb","Type":"ContainerDied","Data":"01d2b48df14b5e6917c015a795d5839af6b19862724d7241088f342a0e33370b"} Mar 10 06:51:47 crc kubenswrapper[4825]: I0310 06:51:47.572784 4825 generic.go:334] "Generic (PLEG): container finished" podID="3d75eddb-b190-4e4e-8f99-339d7923a4c0" containerID="2d9b0adb42e5d887e62b0e7beff93f940930ce9dad529d9c836e434605c67c6a" exitCode=0 Mar 10 06:51:47 crc kubenswrapper[4825]: I0310 06:51:47.572845 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lhnmv" event={"ID":"3d75eddb-b190-4e4e-8f99-339d7923a4c0","Type":"ContainerDied","Data":"2d9b0adb42e5d887e62b0e7beff93f940930ce9dad529d9c836e434605c67c6a"} Mar 10 06:51:47 crc kubenswrapper[4825]: I0310 06:51:47.578362 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-twdqr" event={"ID":"1b90190d-cc9e-4a1e-9e24-cb85d5e4fd63","Type":"ContainerStarted","Data":"166701ee20866bdedff58d62c069c4dffb173c39947587f772252becac3c048a"} Mar 10 06:51:47 crc kubenswrapper[4825]: I0310 06:51:47.580250 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bjjzc" event={"ID":"4818c2a2-e616-4b5f-87c5-5c1e5c0cea22","Type":"ContainerStarted","Data":"623684389e82894ec4aa717e21d7701d78946357119d30204550f0a02fd97290"} Mar 10 06:51:47 crc kubenswrapper[4825]: I0310 06:51:47.611533 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-twdqr" podStartSLOduration=2.952910535 podStartE2EDuration="5.611508475s" podCreationTimestamp="2026-03-10 06:51:42 +0000 UTC" 
firstStartedPulling="2026-03-10 06:51:44.503483362 +0000 UTC m=+457.533263987" lastFinishedPulling="2026-03-10 06:51:47.162081302 +0000 UTC m=+460.191861927" observedRunningTime="2026-03-10 06:51:47.606832155 +0000 UTC m=+460.636612780" watchObservedRunningTime="2026-03-10 06:51:47.611508475 +0000 UTC m=+460.641289110" Mar 10 06:51:48 crc kubenswrapper[4825]: I0310 06:51:48.589023 4825 generic.go:334] "Generic (PLEG): container finished" podID="4818c2a2-e616-4b5f-87c5-5c1e5c0cea22" containerID="623684389e82894ec4aa717e21d7701d78946357119d30204550f0a02fd97290" exitCode=0 Mar 10 06:51:48 crc kubenswrapper[4825]: I0310 06:51:48.589146 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bjjzc" event={"ID":"4818c2a2-e616-4b5f-87c5-5c1e5c0cea22","Type":"ContainerDied","Data":"623684389e82894ec4aa717e21d7701d78946357119d30204550f0a02fd97290"} Mar 10 06:51:48 crc kubenswrapper[4825]: I0310 06:51:48.592909 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8kmh2" event={"ID":"b80fe579-c776-4eee-8b58-04e7ab7ac4cb","Type":"ContainerStarted","Data":"c463ebf6fab4960d816bf3fa83f077c07e682a07ebfaae3b5f0ee44f6adcc361"} Mar 10 06:51:48 crc kubenswrapper[4825]: I0310 06:51:48.596444 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lhnmv" event={"ID":"3d75eddb-b190-4e4e-8f99-339d7923a4c0","Type":"ContainerStarted","Data":"416f5fac03ee6b5691657111577415cf63b215c1781ae7967c68b687a08f9a8c"} Mar 10 06:51:48 crc kubenswrapper[4825]: I0310 06:51:48.674458 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8kmh2" podStartSLOduration=2.23517187 podStartE2EDuration="5.674434861s" podCreationTimestamp="2026-03-10 06:51:43 +0000 UTC" firstStartedPulling="2026-03-10 06:51:44.497442526 +0000 UTC m=+457.527223181" lastFinishedPulling="2026-03-10 06:51:47.936705517 +0000 UTC 
m=+460.966486172" observedRunningTime="2026-03-10 06:51:48.674337338 +0000 UTC m=+461.704117953" watchObservedRunningTime="2026-03-10 06:51:48.674434861 +0000 UTC m=+461.704215476" Mar 10 06:51:48 crc kubenswrapper[4825]: I0310 06:51:48.710300 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lhnmv" podStartSLOduration=1.899264968 podStartE2EDuration="3.710274904s" podCreationTimestamp="2026-03-10 06:51:45 +0000 UTC" firstStartedPulling="2026-03-10 06:51:46.533403749 +0000 UTC m=+459.563184404" lastFinishedPulling="2026-03-10 06:51:48.344413685 +0000 UTC m=+461.374194340" observedRunningTime="2026-03-10 06:51:48.708381976 +0000 UTC m=+461.738162591" watchObservedRunningTime="2026-03-10 06:51:48.710274904 +0000 UTC m=+461.740055519" Mar 10 06:51:49 crc kubenswrapper[4825]: I0310 06:51:49.603556 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bjjzc" event={"ID":"4818c2a2-e616-4b5f-87c5-5c1e5c0cea22","Type":"ContainerStarted","Data":"e0b1e61183cacb5ad38d7abf965b7cb62e3f01be9ead070df9b9c91272d7033a"} Mar 10 06:51:53 crc kubenswrapper[4825]: I0310 06:51:53.229082 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-twdqr" Mar 10 06:51:53 crc kubenswrapper[4825]: I0310 06:51:53.229481 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-twdqr" Mar 10 06:51:53 crc kubenswrapper[4825]: I0310 06:51:53.308479 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-twdqr" Mar 10 06:51:53 crc kubenswrapper[4825]: I0310 06:51:53.336606 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bjjzc" podStartSLOduration=5.907529724 podStartE2EDuration="8.336577949s" podCreationTimestamp="2026-03-10 
06:51:45 +0000 UTC" firstStartedPulling="2026-03-10 06:51:46.570074944 +0000 UTC m=+459.599855559" lastFinishedPulling="2026-03-10 06:51:48.999123169 +0000 UTC m=+462.028903784" observedRunningTime="2026-03-10 06:51:49.635741937 +0000 UTC m=+462.665522542" watchObservedRunningTime="2026-03-10 06:51:53.336577949 +0000 UTC m=+466.366358604" Mar 10 06:51:53 crc kubenswrapper[4825]: I0310 06:51:53.441952 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8kmh2" Mar 10 06:51:53 crc kubenswrapper[4825]: I0310 06:51:53.442822 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8kmh2" Mar 10 06:51:53 crc kubenswrapper[4825]: I0310 06:51:53.714364 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-twdqr" Mar 10 06:51:54 crc kubenswrapper[4825]: I0310 06:51:54.519337 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8kmh2" podUID="b80fe579-c776-4eee-8b58-04e7ab7ac4cb" containerName="registry-server" probeResult="failure" output=< Mar 10 06:51:54 crc kubenswrapper[4825]: timeout: failed to connect service ":50051" within 1s Mar 10 06:51:54 crc kubenswrapper[4825]: > Mar 10 06:51:54 crc kubenswrapper[4825]: I0310 06:51:54.818710 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-b4z2p" Mar 10 06:51:54 crc kubenswrapper[4825]: I0310 06:51:54.885813 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vlkfb"] Mar 10 06:51:55 crc kubenswrapper[4825]: I0310 06:51:55.615845 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lhnmv" Mar 10 06:51:55 crc kubenswrapper[4825]: I0310 06:51:55.616217 4825 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lhnmv" Mar 10 06:51:55 crc kubenswrapper[4825]: I0310 06:51:55.684997 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lhnmv" Mar 10 06:51:55 crc kubenswrapper[4825]: I0310 06:51:55.841484 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bjjzc" Mar 10 06:51:55 crc kubenswrapper[4825]: I0310 06:51:55.841572 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bjjzc" Mar 10 06:51:55 crc kubenswrapper[4825]: I0310 06:51:55.911485 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bjjzc" Mar 10 06:51:56 crc kubenswrapper[4825]: I0310 06:51:56.740803 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lhnmv" Mar 10 06:51:56 crc kubenswrapper[4825]: I0310 06:51:56.746785 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bjjzc" Mar 10 06:52:00 crc kubenswrapper[4825]: I0310 06:52:00.144757 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552092-2pgzg"] Mar 10 06:52:00 crc kubenswrapper[4825]: I0310 06:52:00.146040 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552092-2pgzg" Mar 10 06:52:00 crc kubenswrapper[4825]: I0310 06:52:00.148540 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 06:52:00 crc kubenswrapper[4825]: I0310 06:52:00.150780 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 06:52:00 crc kubenswrapper[4825]: I0310 06:52:00.150863 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 06:52:00 crc kubenswrapper[4825]: I0310 06:52:00.164821 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552092-2pgzg"] Mar 10 06:52:00 crc kubenswrapper[4825]: I0310 06:52:00.305307 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nhgd\" (UniqueName: \"kubernetes.io/projected/1941c411-9bd8-4ff4-85ea-a803394e8eb0-kube-api-access-9nhgd\") pod \"auto-csr-approver-29552092-2pgzg\" (UID: \"1941c411-9bd8-4ff4-85ea-a803394e8eb0\") " pod="openshift-infra/auto-csr-approver-29552092-2pgzg" Mar 10 06:52:00 crc kubenswrapper[4825]: I0310 06:52:00.407665 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nhgd\" (UniqueName: \"kubernetes.io/projected/1941c411-9bd8-4ff4-85ea-a803394e8eb0-kube-api-access-9nhgd\") pod \"auto-csr-approver-29552092-2pgzg\" (UID: \"1941c411-9bd8-4ff4-85ea-a803394e8eb0\") " pod="openshift-infra/auto-csr-approver-29552092-2pgzg" Mar 10 06:52:00 crc kubenswrapper[4825]: I0310 06:52:00.432776 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nhgd\" (UniqueName: \"kubernetes.io/projected/1941c411-9bd8-4ff4-85ea-a803394e8eb0-kube-api-access-9nhgd\") pod \"auto-csr-approver-29552092-2pgzg\" (UID: \"1941c411-9bd8-4ff4-85ea-a803394e8eb0\") " 
pod="openshift-infra/auto-csr-approver-29552092-2pgzg" Mar 10 06:52:00 crc kubenswrapper[4825]: I0310 06:52:00.462480 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552092-2pgzg" Mar 10 06:52:00 crc kubenswrapper[4825]: I0310 06:52:00.696682 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552092-2pgzg"] Mar 10 06:52:00 crc kubenswrapper[4825]: W0310 06:52:00.710342 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1941c411_9bd8_4ff4_85ea_a803394e8eb0.slice/crio-49fcdda974294a5e54bfe3d10cc650b4dcf7a73d91c0a8771e0a6e1bfc5baf03 WatchSource:0}: Error finding container 49fcdda974294a5e54bfe3d10cc650b4dcf7a73d91c0a8771e0a6e1bfc5baf03: Status 404 returned error can't find the container with id 49fcdda974294a5e54bfe3d10cc650b4dcf7a73d91c0a8771e0a6e1bfc5baf03 Mar 10 06:52:01 crc kubenswrapper[4825]: I0310 06:52:01.699211 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552092-2pgzg" event={"ID":"1941c411-9bd8-4ff4-85ea-a803394e8eb0","Type":"ContainerStarted","Data":"49fcdda974294a5e54bfe3d10cc650b4dcf7a73d91c0a8771e0a6e1bfc5baf03"} Mar 10 06:52:02 crc kubenswrapper[4825]: I0310 06:52:02.715760 4825 generic.go:334] "Generic (PLEG): container finished" podID="1941c411-9bd8-4ff4-85ea-a803394e8eb0" containerID="d11ea08c246ccf9d91483d0d3e31652cc4fb3ecf7aa6f9a99836a6d2f942ac38" exitCode=0 Mar 10 06:52:02 crc kubenswrapper[4825]: I0310 06:52:02.715819 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552092-2pgzg" event={"ID":"1941c411-9bd8-4ff4-85ea-a803394e8eb0","Type":"ContainerDied","Data":"d11ea08c246ccf9d91483d0d3e31652cc4fb3ecf7aa6f9a99836a6d2f942ac38"} Mar 10 06:52:03 crc kubenswrapper[4825]: I0310 06:52:03.253335 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 06:52:03 crc kubenswrapper[4825]: I0310 06:52:03.515539 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8kmh2" Mar 10 06:52:03 crc kubenswrapper[4825]: I0310 06:52:03.589824 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8kmh2" Mar 10 06:52:04 crc kubenswrapper[4825]: I0310 06:52:04.017460 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552092-2pgzg" Mar 10 06:52:04 crc kubenswrapper[4825]: I0310 06:52:04.185039 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nhgd\" (UniqueName: \"kubernetes.io/projected/1941c411-9bd8-4ff4-85ea-a803394e8eb0-kube-api-access-9nhgd\") pod \"1941c411-9bd8-4ff4-85ea-a803394e8eb0\" (UID: \"1941c411-9bd8-4ff4-85ea-a803394e8eb0\") " Mar 10 06:52:04 crc kubenswrapper[4825]: I0310 06:52:04.194632 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1941c411-9bd8-4ff4-85ea-a803394e8eb0-kube-api-access-9nhgd" (OuterVolumeSpecName: "kube-api-access-9nhgd") pod "1941c411-9bd8-4ff4-85ea-a803394e8eb0" (UID: "1941c411-9bd8-4ff4-85ea-a803394e8eb0"). InnerVolumeSpecName "kube-api-access-9nhgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:52:04 crc kubenswrapper[4825]: I0310 06:52:04.286922 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nhgd\" (UniqueName: \"kubernetes.io/projected/1941c411-9bd8-4ff4-85ea-a803394e8eb0-kube-api-access-9nhgd\") on node \"crc\" DevicePath \"\"" Mar 10 06:52:04 crc kubenswrapper[4825]: I0310 06:52:04.736953 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552092-2pgzg" Mar 10 06:52:04 crc kubenswrapper[4825]: I0310 06:52:04.789076 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552092-2pgzg" event={"ID":"1941c411-9bd8-4ff4-85ea-a803394e8eb0","Type":"ContainerDied","Data":"49fcdda974294a5e54bfe3d10cc650b4dcf7a73d91c0a8771e0a6e1bfc5baf03"} Mar 10 06:52:04 crc kubenswrapper[4825]: I0310 06:52:04.789157 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49fcdda974294a5e54bfe3d10cc650b4dcf7a73d91c0a8771e0a6e1bfc5baf03" Mar 10 06:52:04 crc kubenswrapper[4825]: E0310 06:52:04.936371 4825 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1941c411_9bd8_4ff4_85ea_a803394e8eb0.slice\": RecentStats: unable to find data in memory cache]" Mar 10 06:52:05 crc kubenswrapper[4825]: I0310 06:52:05.105294 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552086-rkhdp"] Mar 10 06:52:05 crc kubenswrapper[4825]: I0310 06:52:05.113894 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552086-rkhdp"] Mar 10 06:52:05 crc kubenswrapper[4825]: I0310 06:52:05.252867 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19b3fb37-5072-45f0-8349-1296e31a1193" path="/var/lib/kubelet/pods/19b3fb37-5072-45f0-8349-1296e31a1193/volumes" Mar 10 06:52:16 crc kubenswrapper[4825]: I0310 06:52:16.887964 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 06:52:16 crc kubenswrapper[4825]: I0310 06:52:16.888753 4825 prober.go:107] 
"Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 06:52:19 crc kubenswrapper[4825]: I0310 06:52:19.934474 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" podUID="c9944ab9-f72f-44bd-8850-2898feff4a28" containerName="registry" containerID="cri-o://b1cb629928a9d8d3d8b2f4e35f90517ad19d163dc03e966429ace1ec7cea2401" gracePeriod=30 Mar 10 06:52:20 crc kubenswrapper[4825]: I0310 06:52:20.365414 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:52:20 crc kubenswrapper[4825]: I0310 06:52:20.444839 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c9944ab9-f72f-44bd-8850-2898feff4a28-installation-pull-secrets\") pod \"c9944ab9-f72f-44bd-8850-2898feff4a28\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " Mar 10 06:52:20 crc kubenswrapper[4825]: I0310 06:52:20.445523 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c9944ab9-f72f-44bd-8850-2898feff4a28-ca-trust-extracted\") pod \"c9944ab9-f72f-44bd-8850-2898feff4a28\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " Mar 10 06:52:20 crc kubenswrapper[4825]: I0310 06:52:20.445669 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c9944ab9-f72f-44bd-8850-2898feff4a28-registry-tls\") pod \"c9944ab9-f72f-44bd-8850-2898feff4a28\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " Mar 10 06:52:20 crc 
kubenswrapper[4825]: I0310 06:52:20.445926 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"c9944ab9-f72f-44bd-8850-2898feff4a28\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " Mar 10 06:52:20 crc kubenswrapper[4825]: I0310 06:52:20.446206 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjdzk\" (UniqueName: \"kubernetes.io/projected/c9944ab9-f72f-44bd-8850-2898feff4a28-kube-api-access-wjdzk\") pod \"c9944ab9-f72f-44bd-8850-2898feff4a28\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " Mar 10 06:52:20 crc kubenswrapper[4825]: I0310 06:52:20.446492 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c9944ab9-f72f-44bd-8850-2898feff4a28-bound-sa-token\") pod \"c9944ab9-f72f-44bd-8850-2898feff4a28\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " Mar 10 06:52:20 crc kubenswrapper[4825]: I0310 06:52:20.447366 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c9944ab9-f72f-44bd-8850-2898feff4a28-registry-certificates\") pod \"c9944ab9-f72f-44bd-8850-2898feff4a28\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " Mar 10 06:52:20 crc kubenswrapper[4825]: I0310 06:52:20.447531 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c9944ab9-f72f-44bd-8850-2898feff4a28-trusted-ca\") pod \"c9944ab9-f72f-44bd-8850-2898feff4a28\" (UID: \"c9944ab9-f72f-44bd-8850-2898feff4a28\") " Mar 10 06:52:20 crc kubenswrapper[4825]: I0310 06:52:20.448471 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/c9944ab9-f72f-44bd-8850-2898feff4a28-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "c9944ab9-f72f-44bd-8850-2898feff4a28" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:52:20 crc kubenswrapper[4825]: I0310 06:52:20.448638 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9944ab9-f72f-44bd-8850-2898feff4a28-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "c9944ab9-f72f-44bd-8850-2898feff4a28" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:52:20 crc kubenswrapper[4825]: I0310 06:52:20.454205 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9944ab9-f72f-44bd-8850-2898feff4a28-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "c9944ab9-f72f-44bd-8850-2898feff4a28" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:52:20 crc kubenswrapper[4825]: I0310 06:52:20.454353 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9944ab9-f72f-44bd-8850-2898feff4a28-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "c9944ab9-f72f-44bd-8850-2898feff4a28" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:52:20 crc kubenswrapper[4825]: I0310 06:52:20.454573 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9944ab9-f72f-44bd-8850-2898feff4a28-kube-api-access-wjdzk" (OuterVolumeSpecName: "kube-api-access-wjdzk") pod "c9944ab9-f72f-44bd-8850-2898feff4a28" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28"). InnerVolumeSpecName "kube-api-access-wjdzk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:52:20 crc kubenswrapper[4825]: I0310 06:52:20.458781 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9944ab9-f72f-44bd-8850-2898feff4a28-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "c9944ab9-f72f-44bd-8850-2898feff4a28" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:52:20 crc kubenswrapper[4825]: I0310 06:52:20.461103 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "c9944ab9-f72f-44bd-8850-2898feff4a28" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 10 06:52:20 crc kubenswrapper[4825]: I0310 06:52:20.472087 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9944ab9-f72f-44bd-8850-2898feff4a28-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "c9944ab9-f72f-44bd-8850-2898feff4a28" (UID: "c9944ab9-f72f-44bd-8850-2898feff4a28"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 06:52:20 crc kubenswrapper[4825]: I0310 06:52:20.550400 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjdzk\" (UniqueName: \"kubernetes.io/projected/c9944ab9-f72f-44bd-8850-2898feff4a28-kube-api-access-wjdzk\") on node \"crc\" DevicePath \"\"" Mar 10 06:52:20 crc kubenswrapper[4825]: I0310 06:52:20.550491 4825 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c9944ab9-f72f-44bd-8850-2898feff4a28-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 10 06:52:20 crc kubenswrapper[4825]: I0310 06:52:20.550519 4825 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c9944ab9-f72f-44bd-8850-2898feff4a28-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 10 06:52:20 crc kubenswrapper[4825]: I0310 06:52:20.550549 4825 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c9944ab9-f72f-44bd-8850-2898feff4a28-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 06:52:20 crc kubenswrapper[4825]: I0310 06:52:20.550576 4825 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c9944ab9-f72f-44bd-8850-2898feff4a28-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 10 06:52:20 crc kubenswrapper[4825]: I0310 06:52:20.550599 4825 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c9944ab9-f72f-44bd-8850-2898feff4a28-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 10 06:52:20 crc kubenswrapper[4825]: I0310 06:52:20.550621 4825 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c9944ab9-f72f-44bd-8850-2898feff4a28-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 10 06:52:20 crc 
kubenswrapper[4825]: I0310 06:52:20.859862 4825 generic.go:334] "Generic (PLEG): container finished" podID="c9944ab9-f72f-44bd-8850-2898feff4a28" containerID="b1cb629928a9d8d3d8b2f4e35f90517ad19d163dc03e966429ace1ec7cea2401" exitCode=0 Mar 10 06:52:20 crc kubenswrapper[4825]: I0310 06:52:20.859926 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" event={"ID":"c9944ab9-f72f-44bd-8850-2898feff4a28","Type":"ContainerDied","Data":"b1cb629928a9d8d3d8b2f4e35f90517ad19d163dc03e966429ace1ec7cea2401"} Mar 10 06:52:20 crc kubenswrapper[4825]: I0310 06:52:20.859968 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" Mar 10 06:52:20 crc kubenswrapper[4825]: I0310 06:52:20.859994 4825 scope.go:117] "RemoveContainer" containerID="b1cb629928a9d8d3d8b2f4e35f90517ad19d163dc03e966429ace1ec7cea2401" Mar 10 06:52:20 crc kubenswrapper[4825]: I0310 06:52:20.859975 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-vlkfb" event={"ID":"c9944ab9-f72f-44bd-8850-2898feff4a28","Type":"ContainerDied","Data":"e9b032bda118bca66c38fe9b1f4f89f6237eeb8597e6f62d4dffec6a653474b9"} Mar 10 06:52:20 crc kubenswrapper[4825]: I0310 06:52:20.882733 4825 scope.go:117] "RemoveContainer" containerID="b1cb629928a9d8d3d8b2f4e35f90517ad19d163dc03e966429ace1ec7cea2401" Mar 10 06:52:20 crc kubenswrapper[4825]: E0310 06:52:20.886932 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1cb629928a9d8d3d8b2f4e35f90517ad19d163dc03e966429ace1ec7cea2401\": container with ID starting with b1cb629928a9d8d3d8b2f4e35f90517ad19d163dc03e966429ace1ec7cea2401 not found: ID does not exist" containerID="b1cb629928a9d8d3d8b2f4e35f90517ad19d163dc03e966429ace1ec7cea2401" Mar 10 06:52:20 crc kubenswrapper[4825]: I0310 06:52:20.887039 4825 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1cb629928a9d8d3d8b2f4e35f90517ad19d163dc03e966429ace1ec7cea2401"} err="failed to get container status \"b1cb629928a9d8d3d8b2f4e35f90517ad19d163dc03e966429ace1ec7cea2401\": rpc error: code = NotFound desc = could not find container \"b1cb629928a9d8d3d8b2f4e35f90517ad19d163dc03e966429ace1ec7cea2401\": container with ID starting with b1cb629928a9d8d3d8b2f4e35f90517ad19d163dc03e966429ace1ec7cea2401 not found: ID does not exist" Mar 10 06:52:20 crc kubenswrapper[4825]: I0310 06:52:20.908400 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vlkfb"] Mar 10 06:52:20 crc kubenswrapper[4825]: I0310 06:52:20.913062 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vlkfb"] Mar 10 06:52:21 crc kubenswrapper[4825]: I0310 06:52:21.248333 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9944ab9-f72f-44bd-8850-2898feff4a28" path="/var/lib/kubelet/pods/c9944ab9-f72f-44bd-8850-2898feff4a28/volumes" Mar 10 06:52:46 crc kubenswrapper[4825]: I0310 06:52:46.888782 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 06:52:46 crc kubenswrapper[4825]: I0310 06:52:46.890172 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 06:52:46 crc kubenswrapper[4825]: I0310 06:52:46.890250 4825 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" Mar 10 06:52:46 crc kubenswrapper[4825]: I0310 06:52:46.892566 4825 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"530719938ecf342726dd5c9606058576ce615ad39fc2f242ce6515ac6bbc463f"} pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 06:52:46 crc kubenswrapper[4825]: I0310 06:52:46.892727 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" containerID="cri-o://530719938ecf342726dd5c9606058576ce615ad39fc2f242ce6515ac6bbc463f" gracePeriod=600 Mar 10 06:52:47 crc kubenswrapper[4825]: I0310 06:52:47.074049 4825 generic.go:334] "Generic (PLEG): container finished" podID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerID="530719938ecf342726dd5c9606058576ce615ad39fc2f242ce6515ac6bbc463f" exitCode=0 Mar 10 06:52:47 crc kubenswrapper[4825]: I0310 06:52:47.074172 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerDied","Data":"530719938ecf342726dd5c9606058576ce615ad39fc2f242ce6515ac6bbc463f"} Mar 10 06:52:47 crc kubenswrapper[4825]: I0310 06:52:47.074275 4825 scope.go:117] "RemoveContainer" containerID="88bb20a12b63dd94732d6f3413fa4c14b8931a2d82e49f45769c15a14202fe1c" Mar 10 06:52:48 crc kubenswrapper[4825]: I0310 06:52:48.088591 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" 
event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerStarted","Data":"879be6c5ce3fe58a67d0da559938f171bc2a9441b209ba47dbc49f9ed90467ac"} Mar 10 06:54:00 crc kubenswrapper[4825]: I0310 06:54:00.153289 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552094-2nwbl"] Mar 10 06:54:00 crc kubenswrapper[4825]: E0310 06:54:00.154555 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9944ab9-f72f-44bd-8850-2898feff4a28" containerName="registry" Mar 10 06:54:00 crc kubenswrapper[4825]: I0310 06:54:00.154579 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9944ab9-f72f-44bd-8850-2898feff4a28" containerName="registry" Mar 10 06:54:00 crc kubenswrapper[4825]: E0310 06:54:00.154595 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1941c411-9bd8-4ff4-85ea-a803394e8eb0" containerName="oc" Mar 10 06:54:00 crc kubenswrapper[4825]: I0310 06:54:00.154607 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="1941c411-9bd8-4ff4-85ea-a803394e8eb0" containerName="oc" Mar 10 06:54:00 crc kubenswrapper[4825]: I0310 06:54:00.154768 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9944ab9-f72f-44bd-8850-2898feff4a28" containerName="registry" Mar 10 06:54:00 crc kubenswrapper[4825]: I0310 06:54:00.154786 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="1941c411-9bd8-4ff4-85ea-a803394e8eb0" containerName="oc" Mar 10 06:54:00 crc kubenswrapper[4825]: I0310 06:54:00.155451 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552094-2nwbl" Mar 10 06:54:00 crc kubenswrapper[4825]: I0310 06:54:00.158590 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 06:54:00 crc kubenswrapper[4825]: I0310 06:54:00.159030 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 06:54:00 crc kubenswrapper[4825]: I0310 06:54:00.159331 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 06:54:00 crc kubenswrapper[4825]: I0310 06:54:00.175216 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552094-2nwbl"] Mar 10 06:54:00 crc kubenswrapper[4825]: I0310 06:54:00.317207 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mp8bl\" (UniqueName: \"kubernetes.io/projected/39b2113e-4a4f-4124-87f7-8439cb92af77-kube-api-access-mp8bl\") pod \"auto-csr-approver-29552094-2nwbl\" (UID: \"39b2113e-4a4f-4124-87f7-8439cb92af77\") " pod="openshift-infra/auto-csr-approver-29552094-2nwbl" Mar 10 06:54:00 crc kubenswrapper[4825]: I0310 06:54:00.420242 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mp8bl\" (UniqueName: \"kubernetes.io/projected/39b2113e-4a4f-4124-87f7-8439cb92af77-kube-api-access-mp8bl\") pod \"auto-csr-approver-29552094-2nwbl\" (UID: \"39b2113e-4a4f-4124-87f7-8439cb92af77\") " pod="openshift-infra/auto-csr-approver-29552094-2nwbl" Mar 10 06:54:00 crc kubenswrapper[4825]: I0310 06:54:00.451241 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mp8bl\" (UniqueName: \"kubernetes.io/projected/39b2113e-4a4f-4124-87f7-8439cb92af77-kube-api-access-mp8bl\") pod \"auto-csr-approver-29552094-2nwbl\" (UID: \"39b2113e-4a4f-4124-87f7-8439cb92af77\") " 
pod="openshift-infra/auto-csr-approver-29552094-2nwbl" Mar 10 06:54:00 crc kubenswrapper[4825]: I0310 06:54:00.491717 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552094-2nwbl" Mar 10 06:54:00 crc kubenswrapper[4825]: I0310 06:54:00.791585 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552094-2nwbl"] Mar 10 06:54:00 crc kubenswrapper[4825]: I0310 06:54:00.835685 4825 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 06:54:01 crc kubenswrapper[4825]: I0310 06:54:01.704408 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552094-2nwbl" event={"ID":"39b2113e-4a4f-4124-87f7-8439cb92af77","Type":"ContainerStarted","Data":"601a1f55988d62c0ab8b78f18a080d37ea1d2cf5034492ed5216020332f10672"} Mar 10 06:54:02 crc kubenswrapper[4825]: I0310 06:54:02.715184 4825 generic.go:334] "Generic (PLEG): container finished" podID="39b2113e-4a4f-4124-87f7-8439cb92af77" containerID="4121a0db8fe028b8d9c0ab22f7649791664ecd76901f0f382df50b36c0ab26ea" exitCode=0 Mar 10 06:54:02 crc kubenswrapper[4825]: I0310 06:54:02.715250 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552094-2nwbl" event={"ID":"39b2113e-4a4f-4124-87f7-8439cb92af77","Type":"ContainerDied","Data":"4121a0db8fe028b8d9c0ab22f7649791664ecd76901f0f382df50b36c0ab26ea"} Mar 10 06:54:03 crc kubenswrapper[4825]: I0310 06:54:03.992762 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552094-2nwbl" Mar 10 06:54:04 crc kubenswrapper[4825]: I0310 06:54:04.187732 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mp8bl\" (UniqueName: \"kubernetes.io/projected/39b2113e-4a4f-4124-87f7-8439cb92af77-kube-api-access-mp8bl\") pod \"39b2113e-4a4f-4124-87f7-8439cb92af77\" (UID: \"39b2113e-4a4f-4124-87f7-8439cb92af77\") " Mar 10 06:54:04 crc kubenswrapper[4825]: I0310 06:54:04.196847 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39b2113e-4a4f-4124-87f7-8439cb92af77-kube-api-access-mp8bl" (OuterVolumeSpecName: "kube-api-access-mp8bl") pod "39b2113e-4a4f-4124-87f7-8439cb92af77" (UID: "39b2113e-4a4f-4124-87f7-8439cb92af77"). InnerVolumeSpecName "kube-api-access-mp8bl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:54:04 crc kubenswrapper[4825]: I0310 06:54:04.290278 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mp8bl\" (UniqueName: \"kubernetes.io/projected/39b2113e-4a4f-4124-87f7-8439cb92af77-kube-api-access-mp8bl\") on node \"crc\" DevicePath \"\"" Mar 10 06:54:04 crc kubenswrapper[4825]: I0310 06:54:04.741554 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552094-2nwbl" event={"ID":"39b2113e-4a4f-4124-87f7-8439cb92af77","Type":"ContainerDied","Data":"601a1f55988d62c0ab8b78f18a080d37ea1d2cf5034492ed5216020332f10672"} Mar 10 06:54:04 crc kubenswrapper[4825]: I0310 06:54:04.741634 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="601a1f55988d62c0ab8b78f18a080d37ea1d2cf5034492ed5216020332f10672" Mar 10 06:54:04 crc kubenswrapper[4825]: I0310 06:54:04.741681 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552094-2nwbl" Mar 10 06:54:05 crc kubenswrapper[4825]: I0310 06:54:05.069020 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552088-7fs5s"] Mar 10 06:54:05 crc kubenswrapper[4825]: I0310 06:54:05.081389 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552088-7fs5s"] Mar 10 06:54:05 crc kubenswrapper[4825]: I0310 06:54:05.249610 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccb47913-9550-4871-9cf8-47fa8734b406" path="/var/lib/kubelet/pods/ccb47913-9550-4871-9cf8-47fa8734b406/volumes" Mar 10 06:54:29 crc kubenswrapper[4825]: I0310 06:54:29.389470 4825 scope.go:117] "RemoveContainer" containerID="639a015422630d0721259b232accc01baa4b433bac72a5f3c4e6d9e334208d22" Mar 10 06:54:29 crc kubenswrapper[4825]: I0310 06:54:29.444828 4825 scope.go:117] "RemoveContainer" containerID="ffe75fbb6c6c36c4c9c3922b2217be7106e13b72145242a4e7a4cd41544d5f91" Mar 10 06:55:16 crc kubenswrapper[4825]: I0310 06:55:16.888052 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 06:55:16 crc kubenswrapper[4825]: I0310 06:55:16.888863 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 06:55:29 crc kubenswrapper[4825]: I0310 06:55:29.502103 4825 scope.go:117] "RemoveContainer" containerID="4f3fb183a2ed1f131e03c1dfc1a862219753e0f6b41ec9a24a609a228e351314" Mar 10 06:55:46 crc 
kubenswrapper[4825]: I0310 06:55:46.888310 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 06:55:46 crc kubenswrapper[4825]: I0310 06:55:46.889230 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 06:56:00 crc kubenswrapper[4825]: I0310 06:56:00.150246 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552096-v78x6"] Mar 10 06:56:00 crc kubenswrapper[4825]: E0310 06:56:00.152302 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39b2113e-4a4f-4124-87f7-8439cb92af77" containerName="oc" Mar 10 06:56:00 crc kubenswrapper[4825]: I0310 06:56:00.152399 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="39b2113e-4a4f-4124-87f7-8439cb92af77" containerName="oc" Mar 10 06:56:00 crc kubenswrapper[4825]: I0310 06:56:00.152601 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="39b2113e-4a4f-4124-87f7-8439cb92af77" containerName="oc" Mar 10 06:56:00 crc kubenswrapper[4825]: I0310 06:56:00.153193 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552096-v78x6" Mar 10 06:56:00 crc kubenswrapper[4825]: I0310 06:56:00.155401 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 06:56:00 crc kubenswrapper[4825]: I0310 06:56:00.155526 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 06:56:00 crc kubenswrapper[4825]: I0310 06:56:00.155402 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 06:56:00 crc kubenswrapper[4825]: I0310 06:56:00.171082 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552096-v78x6"] Mar 10 06:56:00 crc kubenswrapper[4825]: I0310 06:56:00.348234 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n4qp\" (UniqueName: \"kubernetes.io/projected/a7d62c52-2ae1-46a3-8b1d-7f086d612775-kube-api-access-8n4qp\") pod \"auto-csr-approver-29552096-v78x6\" (UID: \"a7d62c52-2ae1-46a3-8b1d-7f086d612775\") " pod="openshift-infra/auto-csr-approver-29552096-v78x6" Mar 10 06:56:00 crc kubenswrapper[4825]: I0310 06:56:00.449945 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8n4qp\" (UniqueName: \"kubernetes.io/projected/a7d62c52-2ae1-46a3-8b1d-7f086d612775-kube-api-access-8n4qp\") pod \"auto-csr-approver-29552096-v78x6\" (UID: \"a7d62c52-2ae1-46a3-8b1d-7f086d612775\") " pod="openshift-infra/auto-csr-approver-29552096-v78x6" Mar 10 06:56:00 crc kubenswrapper[4825]: I0310 06:56:00.486300 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n4qp\" (UniqueName: \"kubernetes.io/projected/a7d62c52-2ae1-46a3-8b1d-7f086d612775-kube-api-access-8n4qp\") pod \"auto-csr-approver-29552096-v78x6\" (UID: \"a7d62c52-2ae1-46a3-8b1d-7f086d612775\") " 
pod="openshift-infra/auto-csr-approver-29552096-v78x6" Mar 10 06:56:00 crc kubenswrapper[4825]: I0310 06:56:00.771033 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552096-v78x6" Mar 10 06:56:01 crc kubenswrapper[4825]: I0310 06:56:01.341493 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552096-v78x6"] Mar 10 06:56:01 crc kubenswrapper[4825]: I0310 06:56:01.698802 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552096-v78x6" event={"ID":"a7d62c52-2ae1-46a3-8b1d-7f086d612775","Type":"ContainerStarted","Data":"d3077687ed16ebf2029fa0657ec2f0254f4905abbc20f5c5cda101fd6db62c1c"} Mar 10 06:56:02 crc kubenswrapper[4825]: I0310 06:56:02.708050 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552096-v78x6" event={"ID":"a7d62c52-2ae1-46a3-8b1d-7f086d612775","Type":"ContainerStarted","Data":"465f1bf1cf4ff5239246dfecc7c2986fca805552a906139108d9a1b433be3ebe"} Mar 10 06:56:02 crc kubenswrapper[4825]: I0310 06:56:02.727397 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552096-v78x6" podStartSLOduration=1.867221104 podStartE2EDuration="2.727366833s" podCreationTimestamp="2026-03-10 06:56:00 +0000 UTC" firstStartedPulling="2026-03-10 06:56:01.325542862 +0000 UTC m=+714.355323477" lastFinishedPulling="2026-03-10 06:56:02.185688591 +0000 UTC m=+715.215469206" observedRunningTime="2026-03-10 06:56:02.726395089 +0000 UTC m=+715.756175714" watchObservedRunningTime="2026-03-10 06:56:02.727366833 +0000 UTC m=+715.757147458" Mar 10 06:56:03 crc kubenswrapper[4825]: I0310 06:56:03.721449 4825 generic.go:334] "Generic (PLEG): container finished" podID="a7d62c52-2ae1-46a3-8b1d-7f086d612775" containerID="465f1bf1cf4ff5239246dfecc7c2986fca805552a906139108d9a1b433be3ebe" exitCode=0 Mar 10 06:56:03 crc 
kubenswrapper[4825]: I0310 06:56:03.721574 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552096-v78x6" event={"ID":"a7d62c52-2ae1-46a3-8b1d-7f086d612775","Type":"ContainerDied","Data":"465f1bf1cf4ff5239246dfecc7c2986fca805552a906139108d9a1b433be3ebe"} Mar 10 06:56:04 crc kubenswrapper[4825]: I0310 06:56:04.994582 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552096-v78x6" Mar 10 06:56:05 crc kubenswrapper[4825]: I0310 06:56:05.027709 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8n4qp\" (UniqueName: \"kubernetes.io/projected/a7d62c52-2ae1-46a3-8b1d-7f086d612775-kube-api-access-8n4qp\") pod \"a7d62c52-2ae1-46a3-8b1d-7f086d612775\" (UID: \"a7d62c52-2ae1-46a3-8b1d-7f086d612775\") " Mar 10 06:56:05 crc kubenswrapper[4825]: I0310 06:56:05.037407 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7d62c52-2ae1-46a3-8b1d-7f086d612775-kube-api-access-8n4qp" (OuterVolumeSpecName: "kube-api-access-8n4qp") pod "a7d62c52-2ae1-46a3-8b1d-7f086d612775" (UID: "a7d62c52-2ae1-46a3-8b1d-7f086d612775"). InnerVolumeSpecName "kube-api-access-8n4qp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:56:05 crc kubenswrapper[4825]: I0310 06:56:05.129872 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8n4qp\" (UniqueName: \"kubernetes.io/projected/a7d62c52-2ae1-46a3-8b1d-7f086d612775-kube-api-access-8n4qp\") on node \"crc\" DevicePath \"\"" Mar 10 06:56:05 crc kubenswrapper[4825]: I0310 06:56:05.736017 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552096-v78x6" event={"ID":"a7d62c52-2ae1-46a3-8b1d-7f086d612775","Type":"ContainerDied","Data":"d3077687ed16ebf2029fa0657ec2f0254f4905abbc20f5c5cda101fd6db62c1c"} Mar 10 06:56:05 crc kubenswrapper[4825]: I0310 06:56:05.736453 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3077687ed16ebf2029fa0657ec2f0254f4905abbc20f5c5cda101fd6db62c1c" Mar 10 06:56:05 crc kubenswrapper[4825]: I0310 06:56:05.736173 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552096-v78x6" Mar 10 06:56:05 crc kubenswrapper[4825]: I0310 06:56:05.793081 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552090-q7kdt"] Mar 10 06:56:05 crc kubenswrapper[4825]: I0310 06:56:05.797532 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552090-q7kdt"] Mar 10 06:56:07 crc kubenswrapper[4825]: I0310 06:56:07.251065 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5b8a09c-f74d-40c1-81a7-1e0ed85f5025" path="/var/lib/kubelet/pods/b5b8a09c-f74d-40c1-81a7-1e0ed85f5025/volumes" Mar 10 06:56:16 crc kubenswrapper[4825]: I0310 06:56:16.888675 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 10 06:56:16 crc kubenswrapper[4825]: I0310 06:56:16.889588 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 06:56:16 crc kubenswrapper[4825]: I0310 06:56:16.889671 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" Mar 10 06:56:16 crc kubenswrapper[4825]: I0310 06:56:16.890735 4825 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"879be6c5ce3fe58a67d0da559938f171bc2a9441b209ba47dbc49f9ed90467ac"} pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 06:56:16 crc kubenswrapper[4825]: I0310 06:56:16.890864 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" containerID="cri-o://879be6c5ce3fe58a67d0da559938f171bc2a9441b209ba47dbc49f9ed90467ac" gracePeriod=600 Mar 10 06:56:17 crc kubenswrapper[4825]: I0310 06:56:17.825847 4825 generic.go:334] "Generic (PLEG): container finished" podID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerID="879be6c5ce3fe58a67d0da559938f171bc2a9441b209ba47dbc49f9ed90467ac" exitCode=0 Mar 10 06:56:17 crc kubenswrapper[4825]: I0310 06:56:17.825926 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" 
event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerDied","Data":"879be6c5ce3fe58a67d0da559938f171bc2a9441b209ba47dbc49f9ed90467ac"} Mar 10 06:56:17 crc kubenswrapper[4825]: I0310 06:56:17.826374 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerStarted","Data":"1b4ca6f9c5b9ae0428c11c821b788c2b1bec41466ba3ff84c82938c6518a2828"} Mar 10 06:56:17 crc kubenswrapper[4825]: I0310 06:56:17.826405 4825 scope.go:117] "RemoveContainer" containerID="530719938ecf342726dd5c9606058576ce615ad39fc2f242ce6515ac6bbc463f" Mar 10 06:57:29 crc kubenswrapper[4825]: I0310 06:57:29.593196 4825 scope.go:117] "RemoveContainer" containerID="c26c20226f2bf9036ea8d2ec751fe525cb2dde550a2a42c88c572626afcfaff8" Mar 10 06:57:58 crc kubenswrapper[4825]: I0310 06:57:58.373281 4825 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 10 06:58:00 crc kubenswrapper[4825]: I0310 06:58:00.202743 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552098-dffsl"] Mar 10 06:58:00 crc kubenswrapper[4825]: E0310 06:58:00.203088 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7d62c52-2ae1-46a3-8b1d-7f086d612775" containerName="oc" Mar 10 06:58:00 crc kubenswrapper[4825]: I0310 06:58:00.203115 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7d62c52-2ae1-46a3-8b1d-7f086d612775" containerName="oc" Mar 10 06:58:00 crc kubenswrapper[4825]: I0310 06:58:00.203367 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7d62c52-2ae1-46a3-8b1d-7f086d612775" containerName="oc" Mar 10 06:58:00 crc kubenswrapper[4825]: I0310 06:58:00.203979 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552098-dffsl" Mar 10 06:58:00 crc kubenswrapper[4825]: I0310 06:58:00.207102 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 06:58:00 crc kubenswrapper[4825]: I0310 06:58:00.207121 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 06:58:00 crc kubenswrapper[4825]: I0310 06:58:00.207442 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 06:58:00 crc kubenswrapper[4825]: I0310 06:58:00.213749 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552098-dffsl"] Mar 10 06:58:00 crc kubenswrapper[4825]: I0310 06:58:00.338425 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h528b\" (UniqueName: \"kubernetes.io/projected/9e427caf-e060-4f22-b5cd-96b47c0cf797-kube-api-access-h528b\") pod \"auto-csr-approver-29552098-dffsl\" (UID: \"9e427caf-e060-4f22-b5cd-96b47c0cf797\") " pod="openshift-infra/auto-csr-approver-29552098-dffsl" Mar 10 06:58:00 crc kubenswrapper[4825]: I0310 06:58:00.439546 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h528b\" (UniqueName: \"kubernetes.io/projected/9e427caf-e060-4f22-b5cd-96b47c0cf797-kube-api-access-h528b\") pod \"auto-csr-approver-29552098-dffsl\" (UID: \"9e427caf-e060-4f22-b5cd-96b47c0cf797\") " pod="openshift-infra/auto-csr-approver-29552098-dffsl" Mar 10 06:58:00 crc kubenswrapper[4825]: I0310 06:58:00.459922 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h528b\" (UniqueName: \"kubernetes.io/projected/9e427caf-e060-4f22-b5cd-96b47c0cf797-kube-api-access-h528b\") pod \"auto-csr-approver-29552098-dffsl\" (UID: \"9e427caf-e060-4f22-b5cd-96b47c0cf797\") " 
pod="openshift-infra/auto-csr-approver-29552098-dffsl" Mar 10 06:58:00 crc kubenswrapper[4825]: I0310 06:58:00.532673 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552098-dffsl" Mar 10 06:58:00 crc kubenswrapper[4825]: I0310 06:58:00.808020 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552098-dffsl"] Mar 10 06:58:01 crc kubenswrapper[4825]: I0310 06:58:01.591759 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552098-dffsl" event={"ID":"9e427caf-e060-4f22-b5cd-96b47c0cf797","Type":"ContainerStarted","Data":"3256a5b75d6308b5a3788dd65103b44a83be54d5a3fb700de181a6c53577d4c1"} Mar 10 06:58:02 crc kubenswrapper[4825]: I0310 06:58:02.598582 4825 generic.go:334] "Generic (PLEG): container finished" podID="9e427caf-e060-4f22-b5cd-96b47c0cf797" containerID="5bb17d36525609847fef21235c27a86842bece8a7b650a7e31d643f4dd7cd3f3" exitCode=0 Mar 10 06:58:02 crc kubenswrapper[4825]: I0310 06:58:02.598832 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552098-dffsl" event={"ID":"9e427caf-e060-4f22-b5cd-96b47c0cf797","Type":"ContainerDied","Data":"5bb17d36525609847fef21235c27a86842bece8a7b650a7e31d643f4dd7cd3f3"} Mar 10 06:58:03 crc kubenswrapper[4825]: I0310 06:58:03.890645 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552098-dffsl" Mar 10 06:58:03 crc kubenswrapper[4825]: I0310 06:58:03.988037 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h528b\" (UniqueName: \"kubernetes.io/projected/9e427caf-e060-4f22-b5cd-96b47c0cf797-kube-api-access-h528b\") pod \"9e427caf-e060-4f22-b5cd-96b47c0cf797\" (UID: \"9e427caf-e060-4f22-b5cd-96b47c0cf797\") " Mar 10 06:58:03 crc kubenswrapper[4825]: I0310 06:58:03.993688 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e427caf-e060-4f22-b5cd-96b47c0cf797-kube-api-access-h528b" (OuterVolumeSpecName: "kube-api-access-h528b") pod "9e427caf-e060-4f22-b5cd-96b47c0cf797" (UID: "9e427caf-e060-4f22-b5cd-96b47c0cf797"). InnerVolumeSpecName "kube-api-access-h528b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:58:04 crc kubenswrapper[4825]: I0310 06:58:04.090093 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h528b\" (UniqueName: \"kubernetes.io/projected/9e427caf-e060-4f22-b5cd-96b47c0cf797-kube-api-access-h528b\") on node \"crc\" DevicePath \"\"" Mar 10 06:58:04 crc kubenswrapper[4825]: I0310 06:58:04.617321 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552098-dffsl" event={"ID":"9e427caf-e060-4f22-b5cd-96b47c0cf797","Type":"ContainerDied","Data":"3256a5b75d6308b5a3788dd65103b44a83be54d5a3fb700de181a6c53577d4c1"} Mar 10 06:58:04 crc kubenswrapper[4825]: I0310 06:58:04.617383 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3256a5b75d6308b5a3788dd65103b44a83be54d5a3fb700de181a6c53577d4c1" Mar 10 06:58:04 crc kubenswrapper[4825]: I0310 06:58:04.617451 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552098-dffsl" Mar 10 06:58:04 crc kubenswrapper[4825]: I0310 06:58:04.973041 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552092-2pgzg"] Mar 10 06:58:04 crc kubenswrapper[4825]: I0310 06:58:04.979280 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552092-2pgzg"] Mar 10 06:58:05 crc kubenswrapper[4825]: I0310 06:58:05.247822 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1941c411-9bd8-4ff4-85ea-a803394e8eb0" path="/var/lib/kubelet/pods/1941c411-9bd8-4ff4-85ea-a803394e8eb0/volumes" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.017879 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jhkb9"] Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.020400 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerName="ovn-controller" containerID="cri-o://3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841" gracePeriod=30 Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.020542 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerName="sbdb" containerID="cri-o://c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395" gracePeriod=30 Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.020641 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff" gracePeriod=30 Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 
06:58:15.020484 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerName="nbdb" containerID="cri-o://c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952" gracePeriod=30 Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.020733 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerName="kube-rbac-proxy-node" containerID="cri-o://e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3" gracePeriod=30 Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.020714 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerName="ovn-acl-logging" containerID="cri-o://b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3" gracePeriod=30 Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.020752 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerName="northd" containerID="cri-o://8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54" gracePeriod=30 Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.068056 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerName="ovnkube-controller" containerID="cri-o://ce0f84219babc6df2d532f37040cc31d4d331fb508fd4468903673595ba31db0" gracePeriod=30 Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.407468 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jhkb9_79ec9d89-dc71-4f36-9254-00bd86795e43/ovnkube-controller/3.log" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.411127 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jhkb9_79ec9d89-dc71-4f36-9254-00bd86795e43/ovn-acl-logging/0.log" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.411903 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jhkb9_79ec9d89-dc71-4f36-9254-00bd86795e43/ovn-controller/0.log" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.412594 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.476927 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xhntk"] Mar 10 06:58:15 crc kubenswrapper[4825]: E0310 06:58:15.477223 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e427caf-e060-4f22-b5cd-96b47c0cf797" containerName="oc" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.477238 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e427caf-e060-4f22-b5cd-96b47c0cf797" containerName="oc" Mar 10 06:58:15 crc kubenswrapper[4825]: E0310 06:58:15.477251 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerName="ovn-acl-logging" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.477260 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerName="ovn-acl-logging" Mar 10 06:58:15 crc kubenswrapper[4825]: E0310 06:58:15.477270 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerName="ovnkube-controller" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.477279 4825 
state_mem.go:107] "Deleted CPUSet assignment" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerName="ovnkube-controller" Mar 10 06:58:15 crc kubenswrapper[4825]: E0310 06:58:15.477288 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerName="kube-rbac-proxy-node" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.477296 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerName="kube-rbac-proxy-node" Mar 10 06:58:15 crc kubenswrapper[4825]: E0310 06:58:15.477304 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerName="ovnkube-controller" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.477312 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerName="ovnkube-controller" Mar 10 06:58:15 crc kubenswrapper[4825]: E0310 06:58:15.477325 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerName="ovn-controller" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.477334 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerName="ovn-controller" Mar 10 06:58:15 crc kubenswrapper[4825]: E0310 06:58:15.477343 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerName="northd" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.477350 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerName="northd" Mar 10 06:58:15 crc kubenswrapper[4825]: E0310 06:58:15.477360 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerName="ovnkube-controller" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.477367 4825 
state_mem.go:107] "Deleted CPUSet assignment" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerName="ovnkube-controller" Mar 10 06:58:15 crc kubenswrapper[4825]: E0310 06:58:15.477380 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerName="kube-rbac-proxy-ovn-metrics" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.477387 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerName="kube-rbac-proxy-ovn-metrics" Mar 10 06:58:15 crc kubenswrapper[4825]: E0310 06:58:15.477402 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerName="sbdb" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.477409 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerName="sbdb" Mar 10 06:58:15 crc kubenswrapper[4825]: E0310 06:58:15.477420 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerName="kubecfg-setup" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.477427 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerName="kubecfg-setup" Mar 10 06:58:15 crc kubenswrapper[4825]: E0310 06:58:15.477436 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerName="nbdb" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.477443 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerName="nbdb" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.477589 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerName="ovnkube-controller" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.477605 4825 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerName="sbdb" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.477614 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerName="northd" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.477621 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerName="ovn-acl-logging" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.477632 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerName="nbdb" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.477643 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerName="kube-rbac-proxy-node" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.477651 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerName="kube-rbac-proxy-ovn-metrics" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.477660 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerName="ovnkube-controller" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.477669 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e427caf-e060-4f22-b5cd-96b47c0cf797" containerName="oc" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.477677 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerName="ovnkube-controller" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.477685 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerName="ovn-controller" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.477695 4825 
memory_manager.go:354] "RemoveStaleState removing state" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerName="ovnkube-controller" Mar 10 06:58:15 crc kubenswrapper[4825]: E0310 06:58:15.477809 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerName="ovnkube-controller" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.477818 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerName="ovnkube-controller" Mar 10 06:58:15 crc kubenswrapper[4825]: E0310 06:58:15.477831 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerName="ovnkube-controller" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.477839 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerName="ovnkube-controller" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.477968 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerName="ovnkube-controller" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.483487 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.499469 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cc987463-f5a0-47f1-883a-83e226574334-host-kubelet\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.499518 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cc987463-f5a0-47f1-883a-83e226574334-ovnkube-script-lib\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.499548 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cc987463-f5a0-47f1-883a-83e226574334-systemd-units\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.499570 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cc987463-f5a0-47f1-883a-83e226574334-run-openvswitch\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.499593 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ps2h7\" (UniqueName: \"kubernetes.io/projected/cc987463-f5a0-47f1-883a-83e226574334-kube-api-access-ps2h7\") pod 
\"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.499625 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cc987463-f5a0-47f1-883a-83e226574334-host-run-ovn-kubernetes\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.499651 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cc987463-f5a0-47f1-883a-83e226574334-host-run-netns\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.499676 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cc987463-f5a0-47f1-883a-83e226574334-env-overrides\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.499708 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cc987463-f5a0-47f1-883a-83e226574334-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.499759 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/cc987463-f5a0-47f1-883a-83e226574334-var-lib-openvswitch\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.499781 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cc987463-f5a0-47f1-883a-83e226574334-host-cni-netd\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.499803 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cc987463-f5a0-47f1-883a-83e226574334-run-ovn\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.499826 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cc987463-f5a0-47f1-883a-83e226574334-etc-openvswitch\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.499851 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cc987463-f5a0-47f1-883a-83e226574334-log-socket\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.499904 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" 
(UniqueName: \"kubernetes.io/configmap/cc987463-f5a0-47f1-883a-83e226574334-ovnkube-config\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.499934 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cc987463-f5a0-47f1-883a-83e226574334-node-log\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.499963 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cc987463-f5a0-47f1-883a-83e226574334-host-slash\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.499985 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cc987463-f5a0-47f1-883a-83e226574334-ovn-node-metrics-cert\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.500009 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cc987463-f5a0-47f1-883a-83e226574334-host-cni-bin\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.500054 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" 
(UniqueName: \"kubernetes.io/host-path/cc987463-f5a0-47f1-883a-83e226574334-run-systemd\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.601084 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-node-log\") pod \"79ec9d89-dc71-4f36-9254-00bd86795e43\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.601357 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-host-var-lib-cni-networks-ovn-kubernetes\") pod \"79ec9d89-dc71-4f36-9254-00bd86795e43\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.601443 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-etc-openvswitch\") pod \"79ec9d89-dc71-4f36-9254-00bd86795e43\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.601655 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-host-slash\") pod \"79ec9d89-dc71-4f36-9254-00bd86795e43\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.601745 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-var-lib-openvswitch\") pod \"79ec9d89-dc71-4f36-9254-00bd86795e43\" (UID: 
\"79ec9d89-dc71-4f36-9254-00bd86795e43\") " Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.601849 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/79ec9d89-dc71-4f36-9254-00bd86795e43-env-overrides\") pod \"79ec9d89-dc71-4f36-9254-00bd86795e43\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.601925 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-systemd-units\") pod \"79ec9d89-dc71-4f36-9254-00bd86795e43\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.601999 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/79ec9d89-dc71-4f36-9254-00bd86795e43-ovnkube-config\") pod \"79ec9d89-dc71-4f36-9254-00bd86795e43\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.602067 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-host-run-netns\") pod \"79ec9d89-dc71-4f36-9254-00bd86795e43\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.602155 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-run-openvswitch\") pod \"79ec9d89-dc71-4f36-9254-00bd86795e43\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.602230 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-host-cni-netd\") pod \"79ec9d89-dc71-4f36-9254-00bd86795e43\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.602309 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/79ec9d89-dc71-4f36-9254-00bd86795e43-ovn-node-metrics-cert\") pod \"79ec9d89-dc71-4f36-9254-00bd86795e43\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.602380 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-run-ovn\") pod \"79ec9d89-dc71-4f36-9254-00bd86795e43\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.602442 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-host-cni-bin\") pod \"79ec9d89-dc71-4f36-9254-00bd86795e43\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.602508 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/79ec9d89-dc71-4f36-9254-00bd86795e43-ovnkube-script-lib\") pod \"79ec9d89-dc71-4f36-9254-00bd86795e43\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.602583 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvxrb\" (UniqueName: \"kubernetes.io/projected/79ec9d89-dc71-4f36-9254-00bd86795e43-kube-api-access-rvxrb\") pod \"79ec9d89-dc71-4f36-9254-00bd86795e43\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " Mar 10 06:58:15 crc 
kubenswrapper[4825]: I0310 06:58:15.602652 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-host-run-ovn-kubernetes\") pod \"79ec9d89-dc71-4f36-9254-00bd86795e43\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.602721 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-host-kubelet\") pod \"79ec9d89-dc71-4f36-9254-00bd86795e43\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.602793 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-run-systemd\") pod \"79ec9d89-dc71-4f36-9254-00bd86795e43\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.602852 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-log-socket\") pod \"79ec9d89-dc71-4f36-9254-00bd86795e43\" (UID: \"79ec9d89-dc71-4f36-9254-00bd86795e43\") " Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.603027 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cc987463-f5a0-47f1-883a-83e226574334-host-run-ovn-kubernetes\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.603105 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/cc987463-f5a0-47f1-883a-83e226574334-host-run-netns\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.603195 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cc987463-f5a0-47f1-883a-83e226574334-env-overrides\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.603271 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cc987463-f5a0-47f1-883a-83e226574334-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.603387 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cc987463-f5a0-47f1-883a-83e226574334-var-lib-openvswitch\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.603456 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cc987463-f5a0-47f1-883a-83e226574334-host-cni-netd\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.603525 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/cc987463-f5a0-47f1-883a-83e226574334-run-ovn\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.603593 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cc987463-f5a0-47f1-883a-83e226574334-etc-openvswitch\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.603665 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cc987463-f5a0-47f1-883a-83e226574334-log-socket\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.603739 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cc987463-f5a0-47f1-883a-83e226574334-ovnkube-config\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.603807 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cc987463-f5a0-47f1-883a-83e226574334-node-log\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.603872 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cc987463-f5a0-47f1-883a-83e226574334-host-slash\") pod \"ovnkube-node-xhntk\" 
(UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.603938 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cc987463-f5a0-47f1-883a-83e226574334-ovn-node-metrics-cert\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.604012 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cc987463-f5a0-47f1-883a-83e226574334-host-cni-bin\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.604087 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cc987463-f5a0-47f1-883a-83e226574334-run-systemd\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.604174 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cc987463-f5a0-47f1-883a-83e226574334-host-kubelet\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.604239 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cc987463-f5a0-47f1-883a-83e226574334-ovnkube-script-lib\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.604311 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cc987463-f5a0-47f1-883a-83e226574334-systemd-units\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.604383 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cc987463-f5a0-47f1-883a-83e226574334-run-openvswitch\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.604461 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ps2h7\" (UniqueName: \"kubernetes.io/projected/cc987463-f5a0-47f1-883a-83e226574334-kube-api-access-ps2h7\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.601568 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-node-log" (OuterVolumeSpecName: "node-log") pod "79ec9d89-dc71-4f36-9254-00bd86795e43" (UID: "79ec9d89-dc71-4f36-9254-00bd86795e43"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.601596 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "79ec9d89-dc71-4f36-9254-00bd86795e43" (UID: "79ec9d89-dc71-4f36-9254-00bd86795e43"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.601618 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "79ec9d89-dc71-4f36-9254-00bd86795e43" (UID: "79ec9d89-dc71-4f36-9254-00bd86795e43"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.601713 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-host-slash" (OuterVolumeSpecName: "host-slash") pod "79ec9d89-dc71-4f36-9254-00bd86795e43" (UID: "79ec9d89-dc71-4f36-9254-00bd86795e43"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.601801 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "79ec9d89-dc71-4f36-9254-00bd86795e43" (UID: "79ec9d89-dc71-4f36-9254-00bd86795e43"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.602250 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79ec9d89-dc71-4f36-9254-00bd86795e43-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "79ec9d89-dc71-4f36-9254-00bd86795e43" (UID: "79ec9d89-dc71-4f36-9254-00bd86795e43"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.602470 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79ec9d89-dc71-4f36-9254-00bd86795e43-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "79ec9d89-dc71-4f36-9254-00bd86795e43" (UID: "79ec9d89-dc71-4f36-9254-00bd86795e43"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.602489 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "79ec9d89-dc71-4f36-9254-00bd86795e43" (UID: "79ec9d89-dc71-4f36-9254-00bd86795e43"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.602508 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "79ec9d89-dc71-4f36-9254-00bd86795e43" (UID: "79ec9d89-dc71-4f36-9254-00bd86795e43"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.602523 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "79ec9d89-dc71-4f36-9254-00bd86795e43" (UID: "79ec9d89-dc71-4f36-9254-00bd86795e43"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.602535 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "79ec9d89-dc71-4f36-9254-00bd86795e43" (UID: "79ec9d89-dc71-4f36-9254-00bd86795e43"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.602553 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "79ec9d89-dc71-4f36-9254-00bd86795e43" (UID: "79ec9d89-dc71-4f36-9254-00bd86795e43"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.604892 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "79ec9d89-dc71-4f36-9254-00bd86795e43" (UID: "79ec9d89-dc71-4f36-9254-00bd86795e43"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.606081 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79ec9d89-dc71-4f36-9254-00bd86795e43-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "79ec9d89-dc71-4f36-9254-00bd86795e43" (UID: "79ec9d89-dc71-4f36-9254-00bd86795e43"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.606607 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cc987463-f5a0-47f1-883a-83e226574334-etc-openvswitch\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.606650 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cc987463-f5a0-47f1-883a-83e226574334-host-cni-bin\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.606669 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cc987463-f5a0-47f1-883a-83e226574334-log-socket\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.606718 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cc987463-f5a0-47f1-883a-83e226574334-host-slash\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 
10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.606739 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cc987463-f5a0-47f1-883a-83e226574334-node-log\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.606754 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cc987463-f5a0-47f1-883a-83e226574334-host-run-netns\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.606773 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "79ec9d89-dc71-4f36-9254-00bd86795e43" (UID: "79ec9d89-dc71-4f36-9254-00bd86795e43"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.606796 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "79ec9d89-dc71-4f36-9254-00bd86795e43" (UID: "79ec9d89-dc71-4f36-9254-00bd86795e43"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.607073 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-log-socket" (OuterVolumeSpecName: "log-socket") pod "79ec9d89-dc71-4f36-9254-00bd86795e43" (UID: "79ec9d89-dc71-4f36-9254-00bd86795e43"). 
InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.607160 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cc987463-f5a0-47f1-883a-83e226574334-host-run-ovn-kubernetes\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.607180 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cc987463-f5a0-47f1-883a-83e226574334-var-lib-openvswitch\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.607264 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cc987463-f5a0-47f1-883a-83e226574334-ovnkube-config\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.607299 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cc987463-f5a0-47f1-883a-83e226574334-host-kubelet\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.607318 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cc987463-f5a0-47f1-883a-83e226574334-run-openvswitch\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 
10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.607363 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cc987463-f5a0-47f1-883a-83e226574334-host-cni-netd\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.607384 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cc987463-f5a0-47f1-883a-83e226574334-run-ovn\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.607400 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cc987463-f5a0-47f1-883a-83e226574334-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.607700 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cc987463-f5a0-47f1-883a-83e226574334-env-overrides\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.607916 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cc987463-f5a0-47f1-883a-83e226574334-ovnkube-script-lib\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.607102 4825 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cc987463-f5a0-47f1-883a-83e226574334-run-systemd\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.607144 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cc987463-f5a0-47f1-883a-83e226574334-systemd-units\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.610898 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cc987463-f5a0-47f1-883a-83e226574334-ovn-node-metrics-cert\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.611162 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79ec9d89-dc71-4f36-9254-00bd86795e43-kube-api-access-rvxrb" (OuterVolumeSpecName: "kube-api-access-rvxrb") pod "79ec9d89-dc71-4f36-9254-00bd86795e43" (UID: "79ec9d89-dc71-4f36-9254-00bd86795e43"). InnerVolumeSpecName "kube-api-access-rvxrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.612380 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79ec9d89-dc71-4f36-9254-00bd86795e43-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "79ec9d89-dc71-4f36-9254-00bd86795e43" (UID: "79ec9d89-dc71-4f36-9254-00bd86795e43"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.627462 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "79ec9d89-dc71-4f36-9254-00bd86795e43" (UID: "79ec9d89-dc71-4f36-9254-00bd86795e43"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.627705 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ps2h7\" (UniqueName: \"kubernetes.io/projected/cc987463-f5a0-47f1-883a-83e226574334-kube-api-access-ps2h7\") pod \"ovnkube-node-xhntk\" (UID: \"cc987463-f5a0-47f1-883a-83e226574334\") " pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.705252 4825 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-host-slash\") on node \"crc\" DevicePath \"\"" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.705283 4825 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.705293 4825 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/79ec9d89-dc71-4f36-9254-00bd86795e43-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.705303 4825 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 
06:58:15.705313 4825 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/79ec9d89-dc71-4f36-9254-00bd86795e43-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.705321 4825 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.705329 4825 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.705338 4825 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.705346 4825 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.705354 4825 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.705362 4825 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/79ec9d89-dc71-4f36-9254-00bd86795e43-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.705371 4825 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/79ec9d89-dc71-4f36-9254-00bd86795e43-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.705379 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvxrb\" (UniqueName: \"kubernetes.io/projected/79ec9d89-dc71-4f36-9254-00bd86795e43-kube-api-access-rvxrb\") on node \"crc\" DevicePath \"\"" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.705388 4825 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.705396 4825 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.705403 4825 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.705411 4825 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-log-socket\") on node \"crc\" DevicePath \"\"" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.705418 4825 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-node-log\") on node \"crc\" DevicePath \"\"" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.705428 4825 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.705437 4825 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/79ec9d89-dc71-4f36-9254-00bd86795e43-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.712652 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jhkb9_79ec9d89-dc71-4f36-9254-00bd86795e43/ovnkube-controller/3.log" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.715260 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jhkb9_79ec9d89-dc71-4f36-9254-00bd86795e43/ovn-acl-logging/0.log" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.716110 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jhkb9_79ec9d89-dc71-4f36-9254-00bd86795e43/ovn-controller/0.log" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.716522 4825 generic.go:334] "Generic (PLEG): container finished" podID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerID="ce0f84219babc6df2d532f37040cc31d4d331fb508fd4468903673595ba31db0" exitCode=0 Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.716561 4825 generic.go:334] "Generic (PLEG): container finished" podID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerID="c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395" exitCode=0 Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.716573 4825 generic.go:334] "Generic (PLEG): container finished" podID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerID="c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952" exitCode=0 Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.716565 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" event={"ID":"79ec9d89-dc71-4f36-9254-00bd86795e43","Type":"ContainerDied","Data":"ce0f84219babc6df2d532f37040cc31d4d331fb508fd4468903673595ba31db0"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.716626 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" event={"ID":"79ec9d89-dc71-4f36-9254-00bd86795e43","Type":"ContainerDied","Data":"c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.716630 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.716643 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" event={"ID":"79ec9d89-dc71-4f36-9254-00bd86795e43","Type":"ContainerDied","Data":"c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.716654 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" event={"ID":"79ec9d89-dc71-4f36-9254-00bd86795e43","Type":"ContainerDied","Data":"8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.716583 4825 generic.go:334] "Generic (PLEG): container finished" podID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerID="8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54" exitCode=0 Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.716675 4825 scope.go:117] "RemoveContainer" containerID="ce0f84219babc6df2d532f37040cc31d4d331fb508fd4468903673595ba31db0" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.716683 4825 generic.go:334] "Generic (PLEG): container finished" podID="79ec9d89-dc71-4f36-9254-00bd86795e43" 
containerID="8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff" exitCode=0 Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.716693 4825 generic.go:334] "Generic (PLEG): container finished" podID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerID="e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3" exitCode=0 Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.716702 4825 generic.go:334] "Generic (PLEG): container finished" podID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerID="b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3" exitCode=143 Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.716711 4825 generic.go:334] "Generic (PLEG): container finished" podID="79ec9d89-dc71-4f36-9254-00bd86795e43" containerID="3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841" exitCode=143 Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.716768 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" event={"ID":"79ec9d89-dc71-4f36-9254-00bd86795e43","Type":"ContainerDied","Data":"8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.716801 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" event={"ID":"79ec9d89-dc71-4f36-9254-00bd86795e43","Type":"ContainerDied","Data":"e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.716818 4825 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"73fa7532ee0deedea4b9cafa99e07402d830e8e36b82f23918932208aeca220d"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.716831 4825 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395"} Mar 
10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.716839 4825 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.716847 4825 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.716854 4825 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.716862 4825 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.716870 4825 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.716877 4825 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.716884 4825 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.716894 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" 
event={"ID":"79ec9d89-dc71-4f36-9254-00bd86795e43","Type":"ContainerDied","Data":"b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.716907 4825 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ce0f84219babc6df2d532f37040cc31d4d331fb508fd4468903673595ba31db0"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.716916 4825 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"73fa7532ee0deedea4b9cafa99e07402d830e8e36b82f23918932208aeca220d"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.716924 4825 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.716931 4825 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.716939 4825 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.716946 4825 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.716954 4825 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.716961 4825 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.716970 4825 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.716977 4825 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.716987 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" event={"ID":"79ec9d89-dc71-4f36-9254-00bd86795e43","Type":"ContainerDied","Data":"3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.717000 4825 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ce0f84219babc6df2d532f37040cc31d4d331fb508fd4468903673595ba31db0"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.717009 4825 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"73fa7532ee0deedea4b9cafa99e07402d830e8e36b82f23918932208aeca220d"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.717016 4825 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.717023 4825 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952"} Mar 10 06:58:15 crc kubenswrapper[4825]: 
I0310 06:58:15.717031 4825 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.717038 4825 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.717045 4825 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.717053 4825 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.717061 4825 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.717068 4825 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.717078 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jhkb9" event={"ID":"79ec9d89-dc71-4f36-9254-00bd86795e43","Type":"ContainerDied","Data":"9be67cfb02442189023b4ded525ce954e90db103e45ad55a85fbcbb0c06d905f"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.717089 4825 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"ce0f84219babc6df2d532f37040cc31d4d331fb508fd4468903673595ba31db0"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.717099 4825 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"73fa7532ee0deedea4b9cafa99e07402d830e8e36b82f23918932208aeca220d"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.717106 4825 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.717115 4825 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.717123 4825 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.717149 4825 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.717157 4825 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.717164 4825 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.717171 4825 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.717178 4825 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.721972 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8dkbt_165351e4-3c96-4a68-8c75-43b001b0ec60/kube-multus/2.log" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.722648 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8dkbt_165351e4-3c96-4a68-8c75-43b001b0ec60/kube-multus/1.log" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.722729 4825 generic.go:334] "Generic (PLEG): container finished" podID="165351e4-3c96-4a68-8c75-43b001b0ec60" containerID="948ff761b39a162bd7fe26b7700c7066bff6e2a7519a4d0bab5a4c56c3d2470a" exitCode=2 Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.722783 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8dkbt" event={"ID":"165351e4-3c96-4a68-8c75-43b001b0ec60","Type":"ContainerDied","Data":"948ff761b39a162bd7fe26b7700c7066bff6e2a7519a4d0bab5a4c56c3d2470a"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.722819 4825 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"56708571d26121728c9ab436b6bd8aac9945e7b28e3f4c5d12035fa9809c70ef"} Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.723521 4825 scope.go:117] "RemoveContainer" containerID="948ff761b39a162bd7fe26b7700c7066bff6e2a7519a4d0bab5a4c56c3d2470a" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.766064 4825 scope.go:117] "RemoveContainer" containerID="73fa7532ee0deedea4b9cafa99e07402d830e8e36b82f23918932208aeca220d" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 
06:58:15.779774 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jhkb9"] Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.786537 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jhkb9"] Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.800786 4825 scope.go:117] "RemoveContainer" containerID="c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.819255 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.840426 4825 scope.go:117] "RemoveContainer" containerID="c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.868228 4825 scope.go:117] "RemoveContainer" containerID="8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.893955 4825 scope.go:117] "RemoveContainer" containerID="8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.920573 4825 scope.go:117] "RemoveContainer" containerID="e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.953593 4825 scope.go:117] "RemoveContainer" containerID="b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.972557 4825 scope.go:117] "RemoveContainer" containerID="3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841" Mar 10 06:58:15 crc kubenswrapper[4825]: I0310 06:58:15.999082 4825 scope.go:117] "RemoveContainer" containerID="1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.057714 4825 scope.go:117] 
"RemoveContainer" containerID="ce0f84219babc6df2d532f37040cc31d4d331fb508fd4468903673595ba31db0" Mar 10 06:58:16 crc kubenswrapper[4825]: E0310 06:58:16.058322 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce0f84219babc6df2d532f37040cc31d4d331fb508fd4468903673595ba31db0\": container with ID starting with ce0f84219babc6df2d532f37040cc31d4d331fb508fd4468903673595ba31db0 not found: ID does not exist" containerID="ce0f84219babc6df2d532f37040cc31d4d331fb508fd4468903673595ba31db0" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.058365 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce0f84219babc6df2d532f37040cc31d4d331fb508fd4468903673595ba31db0"} err="failed to get container status \"ce0f84219babc6df2d532f37040cc31d4d331fb508fd4468903673595ba31db0\": rpc error: code = NotFound desc = could not find container \"ce0f84219babc6df2d532f37040cc31d4d331fb508fd4468903673595ba31db0\": container with ID starting with ce0f84219babc6df2d532f37040cc31d4d331fb508fd4468903673595ba31db0 not found: ID does not exist" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.058400 4825 scope.go:117] "RemoveContainer" containerID="73fa7532ee0deedea4b9cafa99e07402d830e8e36b82f23918932208aeca220d" Mar 10 06:58:16 crc kubenswrapper[4825]: E0310 06:58:16.058770 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73fa7532ee0deedea4b9cafa99e07402d830e8e36b82f23918932208aeca220d\": container with ID starting with 73fa7532ee0deedea4b9cafa99e07402d830e8e36b82f23918932208aeca220d not found: ID does not exist" containerID="73fa7532ee0deedea4b9cafa99e07402d830e8e36b82f23918932208aeca220d" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.058806 4825 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"73fa7532ee0deedea4b9cafa99e07402d830e8e36b82f23918932208aeca220d"} err="failed to get container status \"73fa7532ee0deedea4b9cafa99e07402d830e8e36b82f23918932208aeca220d\": rpc error: code = NotFound desc = could not find container \"73fa7532ee0deedea4b9cafa99e07402d830e8e36b82f23918932208aeca220d\": container with ID starting with 73fa7532ee0deedea4b9cafa99e07402d830e8e36b82f23918932208aeca220d not found: ID does not exist" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.058857 4825 scope.go:117] "RemoveContainer" containerID="c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395" Mar 10 06:58:16 crc kubenswrapper[4825]: E0310 06:58:16.059249 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395\": container with ID starting with c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395 not found: ID does not exist" containerID="c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.059286 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395"} err="failed to get container status \"c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395\": rpc error: code = NotFound desc = could not find container \"c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395\": container with ID starting with c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395 not found: ID does not exist" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.059310 4825 scope.go:117] "RemoveContainer" containerID="c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952" Mar 10 06:58:16 crc kubenswrapper[4825]: E0310 06:58:16.059673 4825 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952\": container with ID starting with c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952 not found: ID does not exist" containerID="c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.059732 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952"} err="failed to get container status \"c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952\": rpc error: code = NotFound desc = could not find container \"c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952\": container with ID starting with c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952 not found: ID does not exist" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.059761 4825 scope.go:117] "RemoveContainer" containerID="8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54" Mar 10 06:58:16 crc kubenswrapper[4825]: E0310 06:58:16.060168 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54\": container with ID starting with 8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54 not found: ID does not exist" containerID="8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.060211 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54"} err="failed to get container status \"8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54\": rpc error: code = NotFound desc = could not find container 
\"8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54\": container with ID starting with 8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54 not found: ID does not exist" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.060236 4825 scope.go:117] "RemoveContainer" containerID="8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff" Mar 10 06:58:16 crc kubenswrapper[4825]: E0310 06:58:16.060634 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff\": container with ID starting with 8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff not found: ID does not exist" containerID="8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.060702 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff"} err="failed to get container status \"8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff\": rpc error: code = NotFound desc = could not find container \"8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff\": container with ID starting with 8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff not found: ID does not exist" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.060728 4825 scope.go:117] "RemoveContainer" containerID="e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3" Mar 10 06:58:16 crc kubenswrapper[4825]: E0310 06:58:16.061060 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3\": container with ID starting with e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3 not found: ID does not exist" 
containerID="e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.061101 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3"} err="failed to get container status \"e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3\": rpc error: code = NotFound desc = could not find container \"e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3\": container with ID starting with e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3 not found: ID does not exist" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.061124 4825 scope.go:117] "RemoveContainer" containerID="b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3" Mar 10 06:58:16 crc kubenswrapper[4825]: E0310 06:58:16.061554 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3\": container with ID starting with b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3 not found: ID does not exist" containerID="b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.061594 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3"} err="failed to get container status \"b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3\": rpc error: code = NotFound desc = could not find container \"b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3\": container with ID starting with b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3 not found: ID does not exist" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.061668 4825 scope.go:117] 
"RemoveContainer" containerID="3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841" Mar 10 06:58:16 crc kubenswrapper[4825]: E0310 06:58:16.062088 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841\": container with ID starting with 3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841 not found: ID does not exist" containerID="3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.062229 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841"} err="failed to get container status \"3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841\": rpc error: code = NotFound desc = could not find container \"3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841\": container with ID starting with 3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841 not found: ID does not exist" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.062307 4825 scope.go:117] "RemoveContainer" containerID="1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b" Mar 10 06:58:16 crc kubenswrapper[4825]: E0310 06:58:16.062671 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\": container with ID starting with 1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b not found: ID does not exist" containerID="1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.062748 4825 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b"} err="failed to get container status \"1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\": rpc error: code = NotFound desc = could not find container \"1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\": container with ID starting with 1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b not found: ID does not exist" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.062816 4825 scope.go:117] "RemoveContainer" containerID="ce0f84219babc6df2d532f37040cc31d4d331fb508fd4468903673595ba31db0" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.063170 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce0f84219babc6df2d532f37040cc31d4d331fb508fd4468903673595ba31db0"} err="failed to get container status \"ce0f84219babc6df2d532f37040cc31d4d331fb508fd4468903673595ba31db0\": rpc error: code = NotFound desc = could not find container \"ce0f84219babc6df2d532f37040cc31d4d331fb508fd4468903673595ba31db0\": container with ID starting with ce0f84219babc6df2d532f37040cc31d4d331fb508fd4468903673595ba31db0 not found: ID does not exist" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.063208 4825 scope.go:117] "RemoveContainer" containerID="73fa7532ee0deedea4b9cafa99e07402d830e8e36b82f23918932208aeca220d" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.063579 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73fa7532ee0deedea4b9cafa99e07402d830e8e36b82f23918932208aeca220d"} err="failed to get container status \"73fa7532ee0deedea4b9cafa99e07402d830e8e36b82f23918932208aeca220d\": rpc error: code = NotFound desc = could not find container \"73fa7532ee0deedea4b9cafa99e07402d830e8e36b82f23918932208aeca220d\": container with ID starting with 73fa7532ee0deedea4b9cafa99e07402d830e8e36b82f23918932208aeca220d not found: ID does not 
exist" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.063614 4825 scope.go:117] "RemoveContainer" containerID="c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.063942 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395"} err="failed to get container status \"c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395\": rpc error: code = NotFound desc = could not find container \"c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395\": container with ID starting with c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395 not found: ID does not exist" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.063978 4825 scope.go:117] "RemoveContainer" containerID="c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.064407 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952"} err="failed to get container status \"c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952\": rpc error: code = NotFound desc = could not find container \"c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952\": container with ID starting with c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952 not found: ID does not exist" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.064449 4825 scope.go:117] "RemoveContainer" containerID="8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.064909 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54"} err="failed to get container status 
\"8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54\": rpc error: code = NotFound desc = could not find container \"8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54\": container with ID starting with 8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54 not found: ID does not exist" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.064944 4825 scope.go:117] "RemoveContainer" containerID="8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.065300 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff"} err="failed to get container status \"8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff\": rpc error: code = NotFound desc = could not find container \"8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff\": container with ID starting with 8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff not found: ID does not exist" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.065339 4825 scope.go:117] "RemoveContainer" containerID="e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.065726 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3"} err="failed to get container status \"e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3\": rpc error: code = NotFound desc = could not find container \"e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3\": container with ID starting with e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3 not found: ID does not exist" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.065764 4825 scope.go:117] "RemoveContainer" 
containerID="b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.066097 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3"} err="failed to get container status \"b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3\": rpc error: code = NotFound desc = could not find container \"b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3\": container with ID starting with b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3 not found: ID does not exist" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.066159 4825 scope.go:117] "RemoveContainer" containerID="3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.066520 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841"} err="failed to get container status \"3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841\": rpc error: code = NotFound desc = could not find container \"3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841\": container with ID starting with 3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841 not found: ID does not exist" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.066556 4825 scope.go:117] "RemoveContainer" containerID="1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.067020 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b"} err="failed to get container status \"1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\": rpc error: code = NotFound desc = could 
not find container \"1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\": container with ID starting with 1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b not found: ID does not exist" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.067055 4825 scope.go:117] "RemoveContainer" containerID="ce0f84219babc6df2d532f37040cc31d4d331fb508fd4468903673595ba31db0" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.067433 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce0f84219babc6df2d532f37040cc31d4d331fb508fd4468903673595ba31db0"} err="failed to get container status \"ce0f84219babc6df2d532f37040cc31d4d331fb508fd4468903673595ba31db0\": rpc error: code = NotFound desc = could not find container \"ce0f84219babc6df2d532f37040cc31d4d331fb508fd4468903673595ba31db0\": container with ID starting with ce0f84219babc6df2d532f37040cc31d4d331fb508fd4468903673595ba31db0 not found: ID does not exist" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.067469 4825 scope.go:117] "RemoveContainer" containerID="73fa7532ee0deedea4b9cafa99e07402d830e8e36b82f23918932208aeca220d" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.067823 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73fa7532ee0deedea4b9cafa99e07402d830e8e36b82f23918932208aeca220d"} err="failed to get container status \"73fa7532ee0deedea4b9cafa99e07402d830e8e36b82f23918932208aeca220d\": rpc error: code = NotFound desc = could not find container \"73fa7532ee0deedea4b9cafa99e07402d830e8e36b82f23918932208aeca220d\": container with ID starting with 73fa7532ee0deedea4b9cafa99e07402d830e8e36b82f23918932208aeca220d not found: ID does not exist" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.067860 4825 scope.go:117] "RemoveContainer" containerID="c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 
06:58:16.068324 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395"} err="failed to get container status \"c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395\": rpc error: code = NotFound desc = could not find container \"c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395\": container with ID starting with c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395 not found: ID does not exist" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.068361 4825 scope.go:117] "RemoveContainer" containerID="c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.068689 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952"} err="failed to get container status \"c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952\": rpc error: code = NotFound desc = could not find container \"c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952\": container with ID starting with c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952 not found: ID does not exist" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.068776 4825 scope.go:117] "RemoveContainer" containerID="8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.069170 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54"} err="failed to get container status \"8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54\": rpc error: code = NotFound desc = could not find container \"8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54\": container with ID starting with 
8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54 not found: ID does not exist" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.069207 4825 scope.go:117] "RemoveContainer" containerID="8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.069572 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff"} err="failed to get container status \"8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff\": rpc error: code = NotFound desc = could not find container \"8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff\": container with ID starting with 8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff not found: ID does not exist" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.069609 4825 scope.go:117] "RemoveContainer" containerID="e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.069989 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3"} err="failed to get container status \"e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3\": rpc error: code = NotFound desc = could not find container \"e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3\": container with ID starting with e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3 not found: ID does not exist" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.070023 4825 scope.go:117] "RemoveContainer" containerID="b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.070360 4825 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3"} err="failed to get container status \"b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3\": rpc error: code = NotFound desc = could not find container \"b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3\": container with ID starting with b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3 not found: ID does not exist" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.070446 4825 scope.go:117] "RemoveContainer" containerID="3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.070855 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841"} err="failed to get container status \"3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841\": rpc error: code = NotFound desc = could not find container \"3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841\": container with ID starting with 3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841 not found: ID does not exist" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.070891 4825 scope.go:117] "RemoveContainer" containerID="1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.071236 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b"} err="failed to get container status \"1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\": rpc error: code = NotFound desc = could not find container \"1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\": container with ID starting with 1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b not found: ID does not 
exist" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.071272 4825 scope.go:117] "RemoveContainer" containerID="ce0f84219babc6df2d532f37040cc31d4d331fb508fd4468903673595ba31db0" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.071586 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce0f84219babc6df2d532f37040cc31d4d331fb508fd4468903673595ba31db0"} err="failed to get container status \"ce0f84219babc6df2d532f37040cc31d4d331fb508fd4468903673595ba31db0\": rpc error: code = NotFound desc = could not find container \"ce0f84219babc6df2d532f37040cc31d4d331fb508fd4468903673595ba31db0\": container with ID starting with ce0f84219babc6df2d532f37040cc31d4d331fb508fd4468903673595ba31db0 not found: ID does not exist" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.071647 4825 scope.go:117] "RemoveContainer" containerID="73fa7532ee0deedea4b9cafa99e07402d830e8e36b82f23918932208aeca220d" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.072004 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73fa7532ee0deedea4b9cafa99e07402d830e8e36b82f23918932208aeca220d"} err="failed to get container status \"73fa7532ee0deedea4b9cafa99e07402d830e8e36b82f23918932208aeca220d\": rpc error: code = NotFound desc = could not find container \"73fa7532ee0deedea4b9cafa99e07402d830e8e36b82f23918932208aeca220d\": container with ID starting with 73fa7532ee0deedea4b9cafa99e07402d830e8e36b82f23918932208aeca220d not found: ID does not exist" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.072067 4825 scope.go:117] "RemoveContainer" containerID="c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.072426 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395"} err="failed to get container status 
\"c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395\": rpc error: code = NotFound desc = could not find container \"c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395\": container with ID starting with c9975dcac8fe2fdeeb7c5ccb45275953265345abd9e57b4b56a43e1a3eca7395 not found: ID does not exist" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.072555 4825 scope.go:117] "RemoveContainer" containerID="c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.072971 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952"} err="failed to get container status \"c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952\": rpc error: code = NotFound desc = could not find container \"c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952\": container with ID starting with c209e1a44950b1a6c8b837db499afacecaa94ff7eedd4f564ec8c7bef266d952 not found: ID does not exist" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.073027 4825 scope.go:117] "RemoveContainer" containerID="8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.073438 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54"} err="failed to get container status \"8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54\": rpc error: code = NotFound desc = could not find container \"8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54\": container with ID starting with 8b86891b9bbe61cd2f08f619f1e976c9b7314661e77dcccae1c534b8e6e8cd54 not found: ID does not exist" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.073477 4825 scope.go:117] "RemoveContainer" 
containerID="8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.073901 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff"} err="failed to get container status \"8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff\": rpc error: code = NotFound desc = could not find container \"8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff\": container with ID starting with 8728fd4b81f51067932f2f74fa53e1c65cdbf887872ea2fe414f071cfe53f5ff not found: ID does not exist" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.073983 4825 scope.go:117] "RemoveContainer" containerID="e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.074372 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3"} err="failed to get container status \"e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3\": rpc error: code = NotFound desc = could not find container \"e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3\": container with ID starting with e33345990e860e3638423e22ceb778d6e6a90544d40e343a05a413dbeb5492b3 not found: ID does not exist" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.074433 4825 scope.go:117] "RemoveContainer" containerID="b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.074865 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3"} err="failed to get container status \"b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3\": rpc error: code = NotFound desc = could 
not find container \"b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3\": container with ID starting with b2561f2ea86508a0001f948594fd3d526fa10bc916f241eb99fc9b51f0ee40d3 not found: ID does not exist" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.074901 4825 scope.go:117] "RemoveContainer" containerID="3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.075268 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841"} err="failed to get container status \"3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841\": rpc error: code = NotFound desc = could not find container \"3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841\": container with ID starting with 3a368f5a6baa72bdbdbf66fc48b3d38652e76fce618173c232d28124de94c841 not found: ID does not exist" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.075304 4825 scope.go:117] "RemoveContainer" containerID="1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.075664 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b"} err="failed to get container status \"1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\": rpc error: code = NotFound desc = could not find container \"1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b\": container with ID starting with 1ba4509d2279abba46b25d6b76af459b2351b513a4aff97552f9d0dfae5a5a3b not found: ID does not exist" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.075698 4825 scope.go:117] "RemoveContainer" containerID="ce0f84219babc6df2d532f37040cc31d4d331fb508fd4468903673595ba31db0" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 
06:58:16.076171 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce0f84219babc6df2d532f37040cc31d4d331fb508fd4468903673595ba31db0"} err="failed to get container status \"ce0f84219babc6df2d532f37040cc31d4d331fb508fd4468903673595ba31db0\": rpc error: code = NotFound desc = could not find container \"ce0f84219babc6df2d532f37040cc31d4d331fb508fd4468903673595ba31db0\": container with ID starting with ce0f84219babc6df2d532f37040cc31d4d331fb508fd4468903673595ba31db0 not found: ID does not exist" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.733809 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8dkbt_165351e4-3c96-4a68-8c75-43b001b0ec60/kube-multus/2.log" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.735883 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8dkbt_165351e4-3c96-4a68-8c75-43b001b0ec60/kube-multus/1.log" Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.736033 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8dkbt" event={"ID":"165351e4-3c96-4a68-8c75-43b001b0ec60","Type":"ContainerStarted","Data":"f30d9a90f6eec18f2a3cbfa0211ef6706c9d331f57bb36f177cadf39bcd1e3d0"} Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.740686 4825 generic.go:334] "Generic (PLEG): container finished" podID="cc987463-f5a0-47f1-883a-83e226574334" containerID="61f70bcf9055775680eb6087d4313b6d7ccdaa1797f51fe1d781840bc318d55f" exitCode=0 Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.740754 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" event={"ID":"cc987463-f5a0-47f1-883a-83e226574334","Type":"ContainerDied","Data":"61f70bcf9055775680eb6087d4313b6d7ccdaa1797f51fe1d781840bc318d55f"} Mar 10 06:58:16 crc kubenswrapper[4825]: I0310 06:58:16.740807 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" event={"ID":"cc987463-f5a0-47f1-883a-83e226574334","Type":"ContainerStarted","Data":"3d156b51f9d543bfcdbdcabf90574f59c6492229ea8356b049e7f61bd42983c7"} Mar 10 06:58:17 crc kubenswrapper[4825]: I0310 06:58:17.259607 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79ec9d89-dc71-4f36-9254-00bd86795e43" path="/var/lib/kubelet/pods/79ec9d89-dc71-4f36-9254-00bd86795e43/volumes" Mar 10 06:58:17 crc kubenswrapper[4825]: I0310 06:58:17.753037 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" event={"ID":"cc987463-f5a0-47f1-883a-83e226574334","Type":"ContainerStarted","Data":"8eeeabba421e1ecf090a5dde052d96e8bdbf8cbf7abd7c900c5250433af1e20a"} Mar 10 06:58:17 crc kubenswrapper[4825]: I0310 06:58:17.753525 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" event={"ID":"cc987463-f5a0-47f1-883a-83e226574334","Type":"ContainerStarted","Data":"dee7b4ffa3f059aee65dc595beafaaa4d3db500e7a30b2af551748e05619774d"} Mar 10 06:58:17 crc kubenswrapper[4825]: I0310 06:58:17.753559 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" event={"ID":"cc987463-f5a0-47f1-883a-83e226574334","Type":"ContainerStarted","Data":"94297e7ecd8139c356f44cc54a17a57c08ad0ca99eb55e9a05579ad27ad6659f"} Mar 10 06:58:17 crc kubenswrapper[4825]: I0310 06:58:17.753584 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" event={"ID":"cc987463-f5a0-47f1-883a-83e226574334","Type":"ContainerStarted","Data":"2dec8fb420421007d72e2b6b67cc5e309b6ac44b3d7699ee41cd9533fe96491f"} Mar 10 06:58:17 crc kubenswrapper[4825]: I0310 06:58:17.753612 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" 
event={"ID":"cc987463-f5a0-47f1-883a-83e226574334","Type":"ContainerStarted","Data":"a9fc61f115a9e58ce9b69e7204363b8ea92496a11cf33f9b91d87f4d98dad744"} Mar 10 06:58:17 crc kubenswrapper[4825]: I0310 06:58:17.753636 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" event={"ID":"cc987463-f5a0-47f1-883a-83e226574334","Type":"ContainerStarted","Data":"a5ada259013921fdd3ad31c677d2bbfc6400d0feef41e72f5c6952e1cd5e1e76"} Mar 10 06:58:20 crc kubenswrapper[4825]: I0310 06:58:20.782233 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" event={"ID":"cc987463-f5a0-47f1-883a-83e226574334","Type":"ContainerStarted","Data":"f30cb30c2caa2c35d14025505dfa6256fc8271c4551be0a20a9972bd180c7e10"} Mar 10 06:58:21 crc kubenswrapper[4825]: I0310 06:58:21.400168 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-nr28s"] Mar 10 06:58:21 crc kubenswrapper[4825]: I0310 06:58:21.401206 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-nr28s" Mar 10 06:58:21 crc kubenswrapper[4825]: I0310 06:58:21.404027 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 10 06:58:21 crc kubenswrapper[4825]: I0310 06:58:21.404199 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 10 06:58:21 crc kubenswrapper[4825]: I0310 06:58:21.405291 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 10 06:58:21 crc kubenswrapper[4825]: I0310 06:58:21.406557 4825 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-8w9nk" Mar 10 06:58:21 crc kubenswrapper[4825]: I0310 06:58:21.586472 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/0ca2b7f9-6ee2-4a55-8029-f956f85c0466-node-mnt\") pod \"crc-storage-crc-nr28s\" (UID: \"0ca2b7f9-6ee2-4a55-8029-f956f85c0466\") " pod="crc-storage/crc-storage-crc-nr28s" Mar 10 06:58:21 crc kubenswrapper[4825]: I0310 06:58:21.586723 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wswdp\" (UniqueName: \"kubernetes.io/projected/0ca2b7f9-6ee2-4a55-8029-f956f85c0466-kube-api-access-wswdp\") pod \"crc-storage-crc-nr28s\" (UID: \"0ca2b7f9-6ee2-4a55-8029-f956f85c0466\") " pod="crc-storage/crc-storage-crc-nr28s" Mar 10 06:58:21 crc kubenswrapper[4825]: I0310 06:58:21.587068 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/0ca2b7f9-6ee2-4a55-8029-f956f85c0466-crc-storage\") pod \"crc-storage-crc-nr28s\" (UID: \"0ca2b7f9-6ee2-4a55-8029-f956f85c0466\") " pod="crc-storage/crc-storage-crc-nr28s" Mar 10 06:58:21 crc kubenswrapper[4825]: I0310 06:58:21.689118 4825 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/0ca2b7f9-6ee2-4a55-8029-f956f85c0466-node-mnt\") pod \"crc-storage-crc-nr28s\" (UID: \"0ca2b7f9-6ee2-4a55-8029-f956f85c0466\") " pod="crc-storage/crc-storage-crc-nr28s" Mar 10 06:58:21 crc kubenswrapper[4825]: I0310 06:58:21.689299 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wswdp\" (UniqueName: \"kubernetes.io/projected/0ca2b7f9-6ee2-4a55-8029-f956f85c0466-kube-api-access-wswdp\") pod \"crc-storage-crc-nr28s\" (UID: \"0ca2b7f9-6ee2-4a55-8029-f956f85c0466\") " pod="crc-storage/crc-storage-crc-nr28s" Mar 10 06:58:21 crc kubenswrapper[4825]: I0310 06:58:21.689376 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/0ca2b7f9-6ee2-4a55-8029-f956f85c0466-crc-storage\") pod \"crc-storage-crc-nr28s\" (UID: \"0ca2b7f9-6ee2-4a55-8029-f956f85c0466\") " pod="crc-storage/crc-storage-crc-nr28s" Mar 10 06:58:21 crc kubenswrapper[4825]: I0310 06:58:21.689882 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/0ca2b7f9-6ee2-4a55-8029-f956f85c0466-node-mnt\") pod \"crc-storage-crc-nr28s\" (UID: \"0ca2b7f9-6ee2-4a55-8029-f956f85c0466\") " pod="crc-storage/crc-storage-crc-nr28s" Mar 10 06:58:21 crc kubenswrapper[4825]: I0310 06:58:21.690915 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/0ca2b7f9-6ee2-4a55-8029-f956f85c0466-crc-storage\") pod \"crc-storage-crc-nr28s\" (UID: \"0ca2b7f9-6ee2-4a55-8029-f956f85c0466\") " pod="crc-storage/crc-storage-crc-nr28s" Mar 10 06:58:21 crc kubenswrapper[4825]: I0310 06:58:21.722669 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wswdp\" (UniqueName: 
\"kubernetes.io/projected/0ca2b7f9-6ee2-4a55-8029-f956f85c0466-kube-api-access-wswdp\") pod \"crc-storage-crc-nr28s\" (UID: \"0ca2b7f9-6ee2-4a55-8029-f956f85c0466\") " pod="crc-storage/crc-storage-crc-nr28s" Mar 10 06:58:21 crc kubenswrapper[4825]: I0310 06:58:21.723662 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-nr28s" Mar 10 06:58:21 crc kubenswrapper[4825]: E0310 06:58:21.789157 4825 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-nr28s_crc-storage_0ca2b7f9-6ee2-4a55-8029-f956f85c0466_0(14123a68a9a41c396cdd715f2b048e0efb92ee9e35f5f8e7e2c004b4f575e918): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 06:58:21 crc kubenswrapper[4825]: E0310 06:58:21.789234 4825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-nr28s_crc-storage_0ca2b7f9-6ee2-4a55-8029-f956f85c0466_0(14123a68a9a41c396cdd715f2b048e0efb92ee9e35f5f8e7e2c004b4f575e918): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-nr28s" Mar 10 06:58:21 crc kubenswrapper[4825]: E0310 06:58:21.789262 4825 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-nr28s_crc-storage_0ca2b7f9-6ee2-4a55-8029-f956f85c0466_0(14123a68a9a41c396cdd715f2b048e0efb92ee9e35f5f8e7e2c004b4f575e918): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-nr28s" Mar 10 06:58:21 crc kubenswrapper[4825]: E0310 06:58:21.789331 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-nr28s_crc-storage(0ca2b7f9-6ee2-4a55-8029-f956f85c0466)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-nr28s_crc-storage(0ca2b7f9-6ee2-4a55-8029-f956f85c0466)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-nr28s_crc-storage_0ca2b7f9-6ee2-4a55-8029-f956f85c0466_0(14123a68a9a41c396cdd715f2b048e0efb92ee9e35f5f8e7e2c004b4f575e918): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-nr28s" podUID="0ca2b7f9-6ee2-4a55-8029-f956f85c0466" Mar 10 06:58:22 crc kubenswrapper[4825]: I0310 06:58:22.801530 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" event={"ID":"cc987463-f5a0-47f1-883a-83e226574334","Type":"ContainerStarted","Data":"6cc452e39a9294999b12d9a2173ad038f703f4d64954bee6694b980fb99c87e3"} Mar 10 06:58:22 crc kubenswrapper[4825]: I0310 06:58:22.801985 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:22 crc kubenswrapper[4825]: I0310 06:58:22.802023 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:22 crc kubenswrapper[4825]: I0310 06:58:22.802038 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:22 crc kubenswrapper[4825]: I0310 06:58:22.835945 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:22 crc kubenswrapper[4825]: I0310 06:58:22.840270 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:22 crc kubenswrapper[4825]: I0310 06:58:22.848308 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" podStartSLOduration=7.84828965 podStartE2EDuration="7.84828965s" podCreationTimestamp="2026-03-10 06:58:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:58:22.843116225 +0000 UTC m=+855.872896880" watchObservedRunningTime="2026-03-10 06:58:22.84828965 +0000 UTC m=+855.878070275" Mar 10 06:58:23 crc kubenswrapper[4825]: I0310 06:58:23.036505 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-nr28s"] Mar 10 06:58:23 crc kubenswrapper[4825]: I0310 06:58:23.036641 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-nr28s" Mar 10 06:58:23 crc kubenswrapper[4825]: I0310 06:58:23.037237 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-nr28s" Mar 10 06:58:23 crc kubenswrapper[4825]: E0310 06:58:23.067902 4825 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-nr28s_crc-storage_0ca2b7f9-6ee2-4a55-8029-f956f85c0466_0(240c64583044916feab9bbe577b37d54b90a70b12ce2886b2616cb6b51920840): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 06:58:23 crc kubenswrapper[4825]: E0310 06:58:23.068010 4825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-nr28s_crc-storage_0ca2b7f9-6ee2-4a55-8029-f956f85c0466_0(240c64583044916feab9bbe577b37d54b90a70b12ce2886b2616cb6b51920840): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="crc-storage/crc-storage-crc-nr28s" Mar 10 06:58:23 crc kubenswrapper[4825]: E0310 06:58:23.068046 4825 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-nr28s_crc-storage_0ca2b7f9-6ee2-4a55-8029-f956f85c0466_0(240c64583044916feab9bbe577b37d54b90a70b12ce2886b2616cb6b51920840): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-nr28s" Mar 10 06:58:23 crc kubenswrapper[4825]: E0310 06:58:23.068119 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-nr28s_crc-storage(0ca2b7f9-6ee2-4a55-8029-f956f85c0466)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-nr28s_crc-storage(0ca2b7f9-6ee2-4a55-8029-f956f85c0466)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-nr28s_crc-storage_0ca2b7f9-6ee2-4a55-8029-f956f85c0466_0(240c64583044916feab9bbe577b37d54b90a70b12ce2886b2616cb6b51920840): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="crc-storage/crc-storage-crc-nr28s" podUID="0ca2b7f9-6ee2-4a55-8029-f956f85c0466" Mar 10 06:58:29 crc kubenswrapper[4825]: I0310 06:58:29.684238 4825 scope.go:117] "RemoveContainer" containerID="d11ea08c246ccf9d91483d0d3e31652cc4fb3ecf7aa6f9a99836a6d2f942ac38" Mar 10 06:58:29 crc kubenswrapper[4825]: I0310 06:58:29.739104 4825 scope.go:117] "RemoveContainer" containerID="56708571d26121728c9ab436b6bd8aac9945e7b28e3f4c5d12035fa9809c70ef" Mar 10 06:58:29 crc kubenswrapper[4825]: I0310 06:58:29.856236 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8dkbt_165351e4-3c96-4a68-8c75-43b001b0ec60/kube-multus/2.log" Mar 10 06:58:36 crc kubenswrapper[4825]: I0310 06:58:36.236340 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-nr28s" Mar 10 06:58:36 crc kubenswrapper[4825]: I0310 06:58:36.237387 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-nr28s" Mar 10 06:58:36 crc kubenswrapper[4825]: I0310 06:58:36.523162 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-nr28s"] Mar 10 06:58:36 crc kubenswrapper[4825]: W0310 06:58:36.528510 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ca2b7f9_6ee2_4a55_8029_f956f85c0466.slice/crio-ffdd01306ffbb77edfae282f8af7171a0570704182b5affcfa32de97b84c92f2 WatchSource:0}: Error finding container ffdd01306ffbb77edfae282f8af7171a0570704182b5affcfa32de97b84c92f2: Status 404 returned error can't find the container with id ffdd01306ffbb77edfae282f8af7171a0570704182b5affcfa32de97b84c92f2 Mar 10 06:58:36 crc kubenswrapper[4825]: I0310 06:58:36.905998 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-nr28s" 
event={"ID":"0ca2b7f9-6ee2-4a55-8029-f956f85c0466","Type":"ContainerStarted","Data":"ffdd01306ffbb77edfae282f8af7171a0570704182b5affcfa32de97b84c92f2"} Mar 10 06:58:37 crc kubenswrapper[4825]: I0310 06:58:37.921826 4825 generic.go:334] "Generic (PLEG): container finished" podID="0ca2b7f9-6ee2-4a55-8029-f956f85c0466" containerID="988e5e73af76d370ca7ebba422484bb17fca7ca10664c878f367817b00f4355b" exitCode=0 Mar 10 06:58:37 crc kubenswrapper[4825]: I0310 06:58:37.921906 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-nr28s" event={"ID":"0ca2b7f9-6ee2-4a55-8029-f956f85c0466","Type":"ContainerDied","Data":"988e5e73af76d370ca7ebba422484bb17fca7ca10664c878f367817b00f4355b"} Mar 10 06:58:39 crc kubenswrapper[4825]: I0310 06:58:39.216389 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-nr28s" Mar 10 06:58:39 crc kubenswrapper[4825]: I0310 06:58:39.296936 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wswdp\" (UniqueName: \"kubernetes.io/projected/0ca2b7f9-6ee2-4a55-8029-f956f85c0466-kube-api-access-wswdp\") pod \"0ca2b7f9-6ee2-4a55-8029-f956f85c0466\" (UID: \"0ca2b7f9-6ee2-4a55-8029-f956f85c0466\") " Mar 10 06:58:39 crc kubenswrapper[4825]: I0310 06:58:39.297067 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/0ca2b7f9-6ee2-4a55-8029-f956f85c0466-node-mnt\") pod \"0ca2b7f9-6ee2-4a55-8029-f956f85c0466\" (UID: \"0ca2b7f9-6ee2-4a55-8029-f956f85c0466\") " Mar 10 06:58:39 crc kubenswrapper[4825]: I0310 06:58:39.297209 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/0ca2b7f9-6ee2-4a55-8029-f956f85c0466-crc-storage\") pod \"0ca2b7f9-6ee2-4a55-8029-f956f85c0466\" (UID: \"0ca2b7f9-6ee2-4a55-8029-f956f85c0466\") " Mar 10 06:58:39 crc kubenswrapper[4825]: 
I0310 06:58:39.298323 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ca2b7f9-6ee2-4a55-8029-f956f85c0466-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "0ca2b7f9-6ee2-4a55-8029-f956f85c0466" (UID: "0ca2b7f9-6ee2-4a55-8029-f956f85c0466"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 06:58:39 crc kubenswrapper[4825]: I0310 06:58:39.304748 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ca2b7f9-6ee2-4a55-8029-f956f85c0466-kube-api-access-wswdp" (OuterVolumeSpecName: "kube-api-access-wswdp") pod "0ca2b7f9-6ee2-4a55-8029-f956f85c0466" (UID: "0ca2b7f9-6ee2-4a55-8029-f956f85c0466"). InnerVolumeSpecName "kube-api-access-wswdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:58:39 crc kubenswrapper[4825]: I0310 06:58:39.321411 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ca2b7f9-6ee2-4a55-8029-f956f85c0466-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "0ca2b7f9-6ee2-4a55-8029-f956f85c0466" (UID: "0ca2b7f9-6ee2-4a55-8029-f956f85c0466"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:58:39 crc kubenswrapper[4825]: I0310 06:58:39.399036 4825 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/0ca2b7f9-6ee2-4a55-8029-f956f85c0466-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 10 06:58:39 crc kubenswrapper[4825]: I0310 06:58:39.399106 4825 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/0ca2b7f9-6ee2-4a55-8029-f956f85c0466-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 10 06:58:39 crc kubenswrapper[4825]: I0310 06:58:39.399164 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wswdp\" (UniqueName: \"kubernetes.io/projected/0ca2b7f9-6ee2-4a55-8029-f956f85c0466-kube-api-access-wswdp\") on node \"crc\" DevicePath \"\"" Mar 10 06:58:39 crc kubenswrapper[4825]: I0310 06:58:39.937831 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-nr28s" event={"ID":"0ca2b7f9-6ee2-4a55-8029-f956f85c0466","Type":"ContainerDied","Data":"ffdd01306ffbb77edfae282f8af7171a0570704182b5affcfa32de97b84c92f2"} Mar 10 06:58:39 crc kubenswrapper[4825]: I0310 06:58:39.937902 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffdd01306ffbb77edfae282f8af7171a0570704182b5affcfa32de97b84c92f2" Mar 10 06:58:39 crc kubenswrapper[4825]: I0310 06:58:39.938357 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-nr28s" Mar 10 06:58:45 crc kubenswrapper[4825]: I0310 06:58:45.863077 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xhntk" Mar 10 06:58:46 crc kubenswrapper[4825]: I0310 06:58:46.888639 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 06:58:46 crc kubenswrapper[4825]: I0310 06:58:46.888685 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 06:58:48 crc kubenswrapper[4825]: I0310 06:58:48.043759 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822p8k6"] Mar 10 06:58:48 crc kubenswrapper[4825]: E0310 06:58:48.044464 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ca2b7f9-6ee2-4a55-8029-f956f85c0466" containerName="storage" Mar 10 06:58:48 crc kubenswrapper[4825]: I0310 06:58:48.044485 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ca2b7f9-6ee2-4a55-8029-f956f85c0466" containerName="storage" Mar 10 06:58:48 crc kubenswrapper[4825]: I0310 06:58:48.044644 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ca2b7f9-6ee2-4a55-8029-f956f85c0466" containerName="storage" Mar 10 06:58:48 crc kubenswrapper[4825]: I0310 06:58:48.045786 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822p8k6" Mar 10 06:58:48 crc kubenswrapper[4825]: I0310 06:58:48.048533 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 10 06:58:48 crc kubenswrapper[4825]: I0310 06:58:48.058863 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822p8k6"] Mar 10 06:58:48 crc kubenswrapper[4825]: I0310 06:58:48.119659 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/174bce7a-2e4f-4dfa-b6a4-d57d028d00de-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822p8k6\" (UID: \"174bce7a-2e4f-4dfa-b6a4-d57d028d00de\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822p8k6" Mar 10 06:58:48 crc kubenswrapper[4825]: I0310 06:58:48.119731 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x745k\" (UniqueName: \"kubernetes.io/projected/174bce7a-2e4f-4dfa-b6a4-d57d028d00de-kube-api-access-x745k\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822p8k6\" (UID: \"174bce7a-2e4f-4dfa-b6a4-d57d028d00de\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822p8k6" Mar 10 06:58:48 crc kubenswrapper[4825]: I0310 06:58:48.119850 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/174bce7a-2e4f-4dfa-b6a4-d57d028d00de-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822p8k6\" (UID: \"174bce7a-2e4f-4dfa-b6a4-d57d028d00de\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822p8k6" Mar 10 06:58:48 crc kubenswrapper[4825]: 
I0310 06:58:48.221498 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/174bce7a-2e4f-4dfa-b6a4-d57d028d00de-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822p8k6\" (UID: \"174bce7a-2e4f-4dfa-b6a4-d57d028d00de\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822p8k6" Mar 10 06:58:48 crc kubenswrapper[4825]: I0310 06:58:48.221600 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/174bce7a-2e4f-4dfa-b6a4-d57d028d00de-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822p8k6\" (UID: \"174bce7a-2e4f-4dfa-b6a4-d57d028d00de\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822p8k6" Mar 10 06:58:48 crc kubenswrapper[4825]: I0310 06:58:48.221633 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x745k\" (UniqueName: \"kubernetes.io/projected/174bce7a-2e4f-4dfa-b6a4-d57d028d00de-kube-api-access-x745k\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822p8k6\" (UID: \"174bce7a-2e4f-4dfa-b6a4-d57d028d00de\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822p8k6" Mar 10 06:58:48 crc kubenswrapper[4825]: I0310 06:58:48.222581 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/174bce7a-2e4f-4dfa-b6a4-d57d028d00de-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822p8k6\" (UID: \"174bce7a-2e4f-4dfa-b6a4-d57d028d00de\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822p8k6" Mar 10 06:58:48 crc kubenswrapper[4825]: I0310 06:58:48.226024 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/174bce7a-2e4f-4dfa-b6a4-d57d028d00de-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822p8k6\" (UID: \"174bce7a-2e4f-4dfa-b6a4-d57d028d00de\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822p8k6" Mar 10 06:58:48 crc kubenswrapper[4825]: I0310 06:58:48.257280 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x745k\" (UniqueName: \"kubernetes.io/projected/174bce7a-2e4f-4dfa-b6a4-d57d028d00de-kube-api-access-x745k\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822p8k6\" (UID: \"174bce7a-2e4f-4dfa-b6a4-d57d028d00de\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822p8k6" Mar 10 06:58:48 crc kubenswrapper[4825]: I0310 06:58:48.368725 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822p8k6" Mar 10 06:58:48 crc kubenswrapper[4825]: I0310 06:58:48.824635 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822p8k6"] Mar 10 06:58:49 crc kubenswrapper[4825]: I0310 06:58:49.005475 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822p8k6" event={"ID":"174bce7a-2e4f-4dfa-b6a4-d57d028d00de","Type":"ContainerStarted","Data":"d0c655403168d4ea2081134ce9f0ca57fd01544ef61c5c084587cbab61283729"} Mar 10 06:58:50 crc kubenswrapper[4825]: I0310 06:58:50.014064 4825 generic.go:334] "Generic (PLEG): container finished" podID="174bce7a-2e4f-4dfa-b6a4-d57d028d00de" containerID="e1caba3c8baa1fc757b73373b21e7e13f1b2e9a048fc5a79a78088ec2b2859ec" exitCode=0 Mar 10 06:58:50 crc kubenswrapper[4825]: I0310 06:58:50.014208 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822p8k6" event={"ID":"174bce7a-2e4f-4dfa-b6a4-d57d028d00de","Type":"ContainerDied","Data":"e1caba3c8baa1fc757b73373b21e7e13f1b2e9a048fc5a79a78088ec2b2859ec"} Mar 10 06:58:50 crc kubenswrapper[4825]: I0310 06:58:50.047420 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n9k94"] Mar 10 06:58:50 crc kubenswrapper[4825]: I0310 06:58:50.050061 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n9k94" Mar 10 06:58:50 crc kubenswrapper[4825]: I0310 06:58:50.063799 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n9k94"] Mar 10 06:58:50 crc kubenswrapper[4825]: I0310 06:58:50.148733 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq4kj\" (UniqueName: \"kubernetes.io/projected/1629bc68-78f4-4222-990d-3107e5d3eac7-kube-api-access-nq4kj\") pod \"redhat-operators-n9k94\" (UID: \"1629bc68-78f4-4222-990d-3107e5d3eac7\") " pod="openshift-marketplace/redhat-operators-n9k94" Mar 10 06:58:50 crc kubenswrapper[4825]: I0310 06:58:50.148866 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1629bc68-78f4-4222-990d-3107e5d3eac7-catalog-content\") pod \"redhat-operators-n9k94\" (UID: \"1629bc68-78f4-4222-990d-3107e5d3eac7\") " pod="openshift-marketplace/redhat-operators-n9k94" Mar 10 06:58:50 crc kubenswrapper[4825]: I0310 06:58:50.148912 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1629bc68-78f4-4222-990d-3107e5d3eac7-utilities\") pod \"redhat-operators-n9k94\" (UID: \"1629bc68-78f4-4222-990d-3107e5d3eac7\") " pod="openshift-marketplace/redhat-operators-n9k94" 
Mar 10 06:58:50 crc kubenswrapper[4825]: I0310 06:58:50.250340 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq4kj\" (UniqueName: \"kubernetes.io/projected/1629bc68-78f4-4222-990d-3107e5d3eac7-kube-api-access-nq4kj\") pod \"redhat-operators-n9k94\" (UID: \"1629bc68-78f4-4222-990d-3107e5d3eac7\") " pod="openshift-marketplace/redhat-operators-n9k94" Mar 10 06:58:50 crc kubenswrapper[4825]: I0310 06:58:50.250396 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1629bc68-78f4-4222-990d-3107e5d3eac7-catalog-content\") pod \"redhat-operators-n9k94\" (UID: \"1629bc68-78f4-4222-990d-3107e5d3eac7\") " pod="openshift-marketplace/redhat-operators-n9k94" Mar 10 06:58:50 crc kubenswrapper[4825]: I0310 06:58:50.250431 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1629bc68-78f4-4222-990d-3107e5d3eac7-utilities\") pod \"redhat-operators-n9k94\" (UID: \"1629bc68-78f4-4222-990d-3107e5d3eac7\") " pod="openshift-marketplace/redhat-operators-n9k94" Mar 10 06:58:50 crc kubenswrapper[4825]: I0310 06:58:50.250826 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1629bc68-78f4-4222-990d-3107e5d3eac7-utilities\") pod \"redhat-operators-n9k94\" (UID: \"1629bc68-78f4-4222-990d-3107e5d3eac7\") " pod="openshift-marketplace/redhat-operators-n9k94" Mar 10 06:58:50 crc kubenswrapper[4825]: I0310 06:58:50.251358 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1629bc68-78f4-4222-990d-3107e5d3eac7-catalog-content\") pod \"redhat-operators-n9k94\" (UID: \"1629bc68-78f4-4222-990d-3107e5d3eac7\") " pod="openshift-marketplace/redhat-operators-n9k94" Mar 10 06:58:50 crc kubenswrapper[4825]: I0310 06:58:50.285514 4825 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq4kj\" (UniqueName: \"kubernetes.io/projected/1629bc68-78f4-4222-990d-3107e5d3eac7-kube-api-access-nq4kj\") pod \"redhat-operators-n9k94\" (UID: \"1629bc68-78f4-4222-990d-3107e5d3eac7\") " pod="openshift-marketplace/redhat-operators-n9k94" Mar 10 06:58:50 crc kubenswrapper[4825]: I0310 06:58:50.377035 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n9k94" Mar 10 06:58:50 crc kubenswrapper[4825]: I0310 06:58:50.607969 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n9k94"] Mar 10 06:58:51 crc kubenswrapper[4825]: I0310 06:58:51.022409 4825 generic.go:334] "Generic (PLEG): container finished" podID="1629bc68-78f4-4222-990d-3107e5d3eac7" containerID="6c36ac854fc03b4772ccf2547674a61060d29f43e1e82e730c4e1e616f1c634b" exitCode=0 Mar 10 06:58:51 crc kubenswrapper[4825]: I0310 06:58:51.022507 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9k94" event={"ID":"1629bc68-78f4-4222-990d-3107e5d3eac7","Type":"ContainerDied","Data":"6c36ac854fc03b4772ccf2547674a61060d29f43e1e82e730c4e1e616f1c634b"} Mar 10 06:58:51 crc kubenswrapper[4825]: I0310 06:58:51.022747 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9k94" event={"ID":"1629bc68-78f4-4222-990d-3107e5d3eac7","Type":"ContainerStarted","Data":"c4fd356dc15edd998f48fb5ac30cee58f74c2218f9742b915f4608bfec29ac68"} Mar 10 06:58:52 crc kubenswrapper[4825]: I0310 06:58:52.031461 4825 generic.go:334] "Generic (PLEG): container finished" podID="174bce7a-2e4f-4dfa-b6a4-d57d028d00de" containerID="70f30957e5508ab2541add8dc4f5ca1c6fff90ad9cf79303b50fe03bd988fa07" exitCode=0 Mar 10 06:58:52 crc kubenswrapper[4825]: I0310 06:58:52.031570 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822p8k6" event={"ID":"174bce7a-2e4f-4dfa-b6a4-d57d028d00de","Type":"ContainerDied","Data":"70f30957e5508ab2541add8dc4f5ca1c6fff90ad9cf79303b50fe03bd988fa07"} Mar 10 06:58:52 crc kubenswrapper[4825]: I0310 06:58:52.034380 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9k94" event={"ID":"1629bc68-78f4-4222-990d-3107e5d3eac7","Type":"ContainerStarted","Data":"72efa95f887e7be37d198e7ef78bf879228a9773f7410c185ead1551d47d40ca"} Mar 10 06:58:53 crc kubenswrapper[4825]: I0310 06:58:53.047330 4825 generic.go:334] "Generic (PLEG): container finished" podID="1629bc68-78f4-4222-990d-3107e5d3eac7" containerID="72efa95f887e7be37d198e7ef78bf879228a9773f7410c185ead1551d47d40ca" exitCode=0 Mar 10 06:58:53 crc kubenswrapper[4825]: I0310 06:58:53.047444 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9k94" event={"ID":"1629bc68-78f4-4222-990d-3107e5d3eac7","Type":"ContainerDied","Data":"72efa95f887e7be37d198e7ef78bf879228a9773f7410c185ead1551d47d40ca"} Mar 10 06:58:53 crc kubenswrapper[4825]: I0310 06:58:53.052676 4825 generic.go:334] "Generic (PLEG): container finished" podID="174bce7a-2e4f-4dfa-b6a4-d57d028d00de" containerID="5a6b7d2bf5d37ef9e43813e250c1571c0691f6d78127868e54331ed07c01446b" exitCode=0 Mar 10 06:58:53 crc kubenswrapper[4825]: I0310 06:58:53.054339 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822p8k6" event={"ID":"174bce7a-2e4f-4dfa-b6a4-d57d028d00de","Type":"ContainerDied","Data":"5a6b7d2bf5d37ef9e43813e250c1571c0691f6d78127868e54331ed07c01446b"} Mar 10 06:58:54 crc kubenswrapper[4825]: I0310 06:58:54.065867 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9k94" 
event={"ID":"1629bc68-78f4-4222-990d-3107e5d3eac7","Type":"ContainerStarted","Data":"7b612319c25b8d4a5435eb058dfd167156cd6aadddd953f302eb799c68c9d445"} Mar 10 06:58:54 crc kubenswrapper[4825]: I0310 06:58:54.093725 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n9k94" podStartSLOduration=1.641611922 podStartE2EDuration="4.093701008s" podCreationTimestamp="2026-03-10 06:58:50 +0000 UTC" firstStartedPulling="2026-03-10 06:58:51.024350818 +0000 UTC m=+884.054131443" lastFinishedPulling="2026-03-10 06:58:53.476439874 +0000 UTC m=+886.506220529" observedRunningTime="2026-03-10 06:58:54.087437215 +0000 UTC m=+887.117217910" watchObservedRunningTime="2026-03-10 06:58:54.093701008 +0000 UTC m=+887.123481633" Mar 10 06:58:54 crc kubenswrapper[4825]: I0310 06:58:54.479567 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822p8k6" Mar 10 06:58:54 crc kubenswrapper[4825]: I0310 06:58:54.613427 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x745k\" (UniqueName: \"kubernetes.io/projected/174bce7a-2e4f-4dfa-b6a4-d57d028d00de-kube-api-access-x745k\") pod \"174bce7a-2e4f-4dfa-b6a4-d57d028d00de\" (UID: \"174bce7a-2e4f-4dfa-b6a4-d57d028d00de\") " Mar 10 06:58:54 crc kubenswrapper[4825]: I0310 06:58:54.613654 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/174bce7a-2e4f-4dfa-b6a4-d57d028d00de-util\") pod \"174bce7a-2e4f-4dfa-b6a4-d57d028d00de\" (UID: \"174bce7a-2e4f-4dfa-b6a4-d57d028d00de\") " Mar 10 06:58:54 crc kubenswrapper[4825]: I0310 06:58:54.613721 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/174bce7a-2e4f-4dfa-b6a4-d57d028d00de-bundle\") pod 
\"174bce7a-2e4f-4dfa-b6a4-d57d028d00de\" (UID: \"174bce7a-2e4f-4dfa-b6a4-d57d028d00de\") " Mar 10 06:58:54 crc kubenswrapper[4825]: I0310 06:58:54.614540 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/174bce7a-2e4f-4dfa-b6a4-d57d028d00de-bundle" (OuterVolumeSpecName: "bundle") pod "174bce7a-2e4f-4dfa-b6a4-d57d028d00de" (UID: "174bce7a-2e4f-4dfa-b6a4-d57d028d00de"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 06:58:54 crc kubenswrapper[4825]: I0310 06:58:54.623502 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/174bce7a-2e4f-4dfa-b6a4-d57d028d00de-kube-api-access-x745k" (OuterVolumeSpecName: "kube-api-access-x745k") pod "174bce7a-2e4f-4dfa-b6a4-d57d028d00de" (UID: "174bce7a-2e4f-4dfa-b6a4-d57d028d00de"). InnerVolumeSpecName "kube-api-access-x745k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:58:54 crc kubenswrapper[4825]: I0310 06:58:54.651363 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/174bce7a-2e4f-4dfa-b6a4-d57d028d00de-util" (OuterVolumeSpecName: "util") pod "174bce7a-2e4f-4dfa-b6a4-d57d028d00de" (UID: "174bce7a-2e4f-4dfa-b6a4-d57d028d00de"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 06:58:54 crc kubenswrapper[4825]: I0310 06:58:54.715279 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x745k\" (UniqueName: \"kubernetes.io/projected/174bce7a-2e4f-4dfa-b6a4-d57d028d00de-kube-api-access-x745k\") on node \"crc\" DevicePath \"\"" Mar 10 06:58:54 crc kubenswrapper[4825]: I0310 06:58:54.715335 4825 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/174bce7a-2e4f-4dfa-b6a4-d57d028d00de-util\") on node \"crc\" DevicePath \"\"" Mar 10 06:58:54 crc kubenswrapper[4825]: I0310 06:58:54.715354 4825 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/174bce7a-2e4f-4dfa-b6a4-d57d028d00de-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 06:58:55 crc kubenswrapper[4825]: I0310 06:58:55.079216 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822p8k6" event={"ID":"174bce7a-2e4f-4dfa-b6a4-d57d028d00de","Type":"ContainerDied","Data":"d0c655403168d4ea2081134ce9f0ca57fd01544ef61c5c084587cbab61283729"} Mar 10 06:58:55 crc kubenswrapper[4825]: I0310 06:58:55.079286 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0c655403168d4ea2081134ce9f0ca57fd01544ef61c5c084587cbab61283729" Mar 10 06:58:55 crc kubenswrapper[4825]: I0310 06:58:55.080221 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822p8k6" Mar 10 06:58:58 crc kubenswrapper[4825]: I0310 06:58:58.655318 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-qttr9"] Mar 10 06:58:58 crc kubenswrapper[4825]: E0310 06:58:58.655884 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="174bce7a-2e4f-4dfa-b6a4-d57d028d00de" containerName="pull" Mar 10 06:58:58 crc kubenswrapper[4825]: I0310 06:58:58.655900 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="174bce7a-2e4f-4dfa-b6a4-d57d028d00de" containerName="pull" Mar 10 06:58:58 crc kubenswrapper[4825]: E0310 06:58:58.655914 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="174bce7a-2e4f-4dfa-b6a4-d57d028d00de" containerName="util" Mar 10 06:58:58 crc kubenswrapper[4825]: I0310 06:58:58.655922 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="174bce7a-2e4f-4dfa-b6a4-d57d028d00de" containerName="util" Mar 10 06:58:58 crc kubenswrapper[4825]: E0310 06:58:58.655939 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="174bce7a-2e4f-4dfa-b6a4-d57d028d00de" containerName="extract" Mar 10 06:58:58 crc kubenswrapper[4825]: I0310 06:58:58.655948 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="174bce7a-2e4f-4dfa-b6a4-d57d028d00de" containerName="extract" Mar 10 06:58:58 crc kubenswrapper[4825]: I0310 06:58:58.656079 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="174bce7a-2e4f-4dfa-b6a4-d57d028d00de" containerName="extract" Mar 10 06:58:58 crc kubenswrapper[4825]: I0310 06:58:58.656624 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-qttr9" Mar 10 06:58:58 crc kubenswrapper[4825]: I0310 06:58:58.660019 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 10 06:58:58 crc kubenswrapper[4825]: I0310 06:58:58.660097 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 10 06:58:58 crc kubenswrapper[4825]: I0310 06:58:58.660574 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-gpqpt" Mar 10 06:58:58 crc kubenswrapper[4825]: I0310 06:58:58.674369 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-qttr9"] Mar 10 06:58:58 crc kubenswrapper[4825]: I0310 06:58:58.787162 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5wrz\" (UniqueName: \"kubernetes.io/projected/8e892956-8421-4579-a5e1-bef91a563c26-kube-api-access-f5wrz\") pod \"nmstate-operator-75c5dccd6c-qttr9\" (UID: \"8e892956-8421-4579-a5e1-bef91a563c26\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-qttr9" Mar 10 06:58:58 crc kubenswrapper[4825]: I0310 06:58:58.888620 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5wrz\" (UniqueName: \"kubernetes.io/projected/8e892956-8421-4579-a5e1-bef91a563c26-kube-api-access-f5wrz\") pod \"nmstate-operator-75c5dccd6c-qttr9\" (UID: \"8e892956-8421-4579-a5e1-bef91a563c26\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-qttr9" Mar 10 06:58:58 crc kubenswrapper[4825]: I0310 06:58:58.910003 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5wrz\" (UniqueName: \"kubernetes.io/projected/8e892956-8421-4579-a5e1-bef91a563c26-kube-api-access-f5wrz\") pod \"nmstate-operator-75c5dccd6c-qttr9\" (UID: 
\"8e892956-8421-4579-a5e1-bef91a563c26\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-qttr9" Mar 10 06:58:58 crc kubenswrapper[4825]: I0310 06:58:58.975504 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-qttr9" Mar 10 06:58:59 crc kubenswrapper[4825]: I0310 06:58:59.283401 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-qttr9"] Mar 10 06:59:00 crc kubenswrapper[4825]: I0310 06:59:00.111149 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-qttr9" event={"ID":"8e892956-8421-4579-a5e1-bef91a563c26","Type":"ContainerStarted","Data":"81ffd701b4796b56bb488d2002c519d0712907eba639eac6b97da7df434cce89"} Mar 10 06:59:00 crc kubenswrapper[4825]: I0310 06:59:00.377511 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n9k94" Mar 10 06:59:00 crc kubenswrapper[4825]: I0310 06:59:00.377571 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n9k94" Mar 10 06:59:01 crc kubenswrapper[4825]: I0310 06:59:01.430070 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n9k94" podUID="1629bc68-78f4-4222-990d-3107e5d3eac7" containerName="registry-server" probeResult="failure" output=< Mar 10 06:59:01 crc kubenswrapper[4825]: timeout: failed to connect service ":50051" within 1s Mar 10 06:59:01 crc kubenswrapper[4825]: > Mar 10 06:59:02 crc kubenswrapper[4825]: I0310 06:59:02.125845 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-qttr9" event={"ID":"8e892956-8421-4579-a5e1-bef91a563c26","Type":"ContainerStarted","Data":"c7074b0c949ae79226f374c74a94929d06fa3ca8dd319d00ae3e8202c2a018df"} Mar 10 06:59:02 crc kubenswrapper[4825]: I0310 06:59:02.152020 4825 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-qttr9" podStartSLOduration=1.777597646 podStartE2EDuration="4.151993629s" podCreationTimestamp="2026-03-10 06:58:58 +0000 UTC" firstStartedPulling="2026-03-10 06:58:59.299754022 +0000 UTC m=+892.329534687" lastFinishedPulling="2026-03-10 06:59:01.674150035 +0000 UTC m=+894.703930670" observedRunningTime="2026-03-10 06:59:02.147831001 +0000 UTC m=+895.177611656" watchObservedRunningTime="2026-03-10 06:59:02.151993629 +0000 UTC m=+895.181774294" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.243863 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-5hjqp"] Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.245827 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-5hjqp" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.248016 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-64bs7" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.257752 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-5hjqp"] Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.281216 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-d8h2s"] Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.282183 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-d8h2s" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.290503 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.302897 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-k8c9x"] Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.303911 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-k8c9x" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.310030 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-d8h2s"] Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.337651 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/d4991d4a-6815-4cc2-84e6-b7be04db45bf-nmstate-lock\") pod \"nmstate-handler-k8c9x\" (UID: \"d4991d4a-6815-4cc2-84e6-b7be04db45bf\") " pod="openshift-nmstate/nmstate-handler-k8c9x" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.337695 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcxgv\" (UniqueName: \"kubernetes.io/projected/3cb1cbce-e37d-43fb-8b62-fc1b414cf7b9-kube-api-access-lcxgv\") pod \"nmstate-metrics-69594cc75-5hjqp\" (UID: \"3cb1cbce-e37d-43fb-8b62-fc1b414cf7b9\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-5hjqp" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.337752 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/d75451be-5545-4c15-ae11-5c795d29494c-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-d8h2s\" (UID: \"d75451be-5545-4c15-ae11-5c795d29494c\") " 
pod="openshift-nmstate/nmstate-webhook-786f45cff4-d8h2s" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.337776 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95n2n\" (UniqueName: \"kubernetes.io/projected/d75451be-5545-4c15-ae11-5c795d29494c-kube-api-access-95n2n\") pod \"nmstate-webhook-786f45cff4-d8h2s\" (UID: \"d75451be-5545-4c15-ae11-5c795d29494c\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-d8h2s" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.337800 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cn99c\" (UniqueName: \"kubernetes.io/projected/d4991d4a-6815-4cc2-84e6-b7be04db45bf-kube-api-access-cn99c\") pod \"nmstate-handler-k8c9x\" (UID: \"d4991d4a-6815-4cc2-84e6-b7be04db45bf\") " pod="openshift-nmstate/nmstate-handler-k8c9x" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.337953 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/d4991d4a-6815-4cc2-84e6-b7be04db45bf-dbus-socket\") pod \"nmstate-handler-k8c9x\" (UID: \"d4991d4a-6815-4cc2-84e6-b7be04db45bf\") " pod="openshift-nmstate/nmstate-handler-k8c9x" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.337984 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/d4991d4a-6815-4cc2-84e6-b7be04db45bf-ovs-socket\") pod \"nmstate-handler-k8c9x\" (UID: \"d4991d4a-6815-4cc2-84e6-b7be04db45bf\") " pod="openshift-nmstate/nmstate-handler-k8c9x" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.388838 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-q777z"] Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.389675 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-q777z" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.394740 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-6rtxx" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.394787 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.394826 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.402076 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-q777z"] Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.439200 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/d75451be-5545-4c15-ae11-5c795d29494c-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-d8h2s\" (UID: \"d75451be-5545-4c15-ae11-5c795d29494c\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-d8h2s" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.439253 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95n2n\" (UniqueName: \"kubernetes.io/projected/d75451be-5545-4c15-ae11-5c795d29494c-kube-api-access-95n2n\") pod \"nmstate-webhook-786f45cff4-d8h2s\" (UID: \"d75451be-5545-4c15-ae11-5c795d29494c\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-d8h2s" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.439284 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7cm4\" (UniqueName: \"kubernetes.io/projected/b267da99-b608-441b-a6de-7a74d1923a8b-kube-api-access-m7cm4\") pod \"nmstate-console-plugin-5dcbbd79cf-q777z\" (UID: \"b267da99-b608-441b-a6de-7a74d1923a8b\") " 
pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-q777z" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.439309 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cn99c\" (UniqueName: \"kubernetes.io/projected/d4991d4a-6815-4cc2-84e6-b7be04db45bf-kube-api-access-cn99c\") pod \"nmstate-handler-k8c9x\" (UID: \"d4991d4a-6815-4cc2-84e6-b7be04db45bf\") " pod="openshift-nmstate/nmstate-handler-k8c9x" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.439337 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b267da99-b608-441b-a6de-7a74d1923a8b-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-q777z\" (UID: \"b267da99-b608-441b-a6de-7a74d1923a8b\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-q777z" Mar 10 06:59:08 crc kubenswrapper[4825]: E0310 06:59:08.439386 4825 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.439391 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/d4991d4a-6815-4cc2-84e6-b7be04db45bf-dbus-socket\") pod \"nmstate-handler-k8c9x\" (UID: \"d4991d4a-6815-4cc2-84e6-b7be04db45bf\") " pod="openshift-nmstate/nmstate-handler-k8c9x" Mar 10 06:59:08 crc kubenswrapper[4825]: E0310 06:59:08.439451 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d75451be-5545-4c15-ae11-5c795d29494c-tls-key-pair podName:d75451be-5545-4c15-ae11-5c795d29494c nodeName:}" failed. No retries permitted until 2026-03-10 06:59:08.939433385 +0000 UTC m=+901.969214000 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/d75451be-5545-4c15-ae11-5c795d29494c-tls-key-pair") pod "nmstate-webhook-786f45cff4-d8h2s" (UID: "d75451be-5545-4c15-ae11-5c795d29494c") : secret "openshift-nmstate-webhook" not found Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.439470 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/d4991d4a-6815-4cc2-84e6-b7be04db45bf-ovs-socket\") pod \"nmstate-handler-k8c9x\" (UID: \"d4991d4a-6815-4cc2-84e6-b7be04db45bf\") " pod="openshift-nmstate/nmstate-handler-k8c9x" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.439514 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/d4991d4a-6815-4cc2-84e6-b7be04db45bf-nmstate-lock\") pod \"nmstate-handler-k8c9x\" (UID: \"d4991d4a-6815-4cc2-84e6-b7be04db45bf\") " pod="openshift-nmstate/nmstate-handler-k8c9x" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.439542 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcxgv\" (UniqueName: \"kubernetes.io/projected/3cb1cbce-e37d-43fb-8b62-fc1b414cf7b9-kube-api-access-lcxgv\") pod \"nmstate-metrics-69594cc75-5hjqp\" (UID: \"3cb1cbce-e37d-43fb-8b62-fc1b414cf7b9\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-5hjqp" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.439546 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/d4991d4a-6815-4cc2-84e6-b7be04db45bf-ovs-socket\") pod \"nmstate-handler-k8c9x\" (UID: \"d4991d4a-6815-4cc2-84e6-b7be04db45bf\") " pod="openshift-nmstate/nmstate-handler-k8c9x" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.439561 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: 
\"kubernetes.io/host-path/d4991d4a-6815-4cc2-84e6-b7be04db45bf-nmstate-lock\") pod \"nmstate-handler-k8c9x\" (UID: \"d4991d4a-6815-4cc2-84e6-b7be04db45bf\") " pod="openshift-nmstate/nmstate-handler-k8c9x" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.439609 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b267da99-b608-441b-a6de-7a74d1923a8b-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-q777z\" (UID: \"b267da99-b608-441b-a6de-7a74d1923a8b\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-q777z" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.439624 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/d4991d4a-6815-4cc2-84e6-b7be04db45bf-dbus-socket\") pod \"nmstate-handler-k8c9x\" (UID: \"d4991d4a-6815-4cc2-84e6-b7be04db45bf\") " pod="openshift-nmstate/nmstate-handler-k8c9x" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.458542 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95n2n\" (UniqueName: \"kubernetes.io/projected/d75451be-5545-4c15-ae11-5c795d29494c-kube-api-access-95n2n\") pod \"nmstate-webhook-786f45cff4-d8h2s\" (UID: \"d75451be-5545-4c15-ae11-5c795d29494c\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-d8h2s" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.459230 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cn99c\" (UniqueName: \"kubernetes.io/projected/d4991d4a-6815-4cc2-84e6-b7be04db45bf-kube-api-access-cn99c\") pod \"nmstate-handler-k8c9x\" (UID: \"d4991d4a-6815-4cc2-84e6-b7be04db45bf\") " pod="openshift-nmstate/nmstate-handler-k8c9x" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.465791 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcxgv\" (UniqueName: 
\"kubernetes.io/projected/3cb1cbce-e37d-43fb-8b62-fc1b414cf7b9-kube-api-access-lcxgv\") pod \"nmstate-metrics-69594cc75-5hjqp\" (UID: \"3cb1cbce-e37d-43fb-8b62-fc1b414cf7b9\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-5hjqp" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.541215 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b267da99-b608-441b-a6de-7a74d1923a8b-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-q777z\" (UID: \"b267da99-b608-441b-a6de-7a74d1923a8b\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-q777z" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.541622 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7cm4\" (UniqueName: \"kubernetes.io/projected/b267da99-b608-441b-a6de-7a74d1923a8b-kube-api-access-m7cm4\") pod \"nmstate-console-plugin-5dcbbd79cf-q777z\" (UID: \"b267da99-b608-441b-a6de-7a74d1923a8b\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-q777z" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.541657 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b267da99-b608-441b-a6de-7a74d1923a8b-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-q777z\" (UID: \"b267da99-b608-441b-a6de-7a74d1923a8b\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-q777z" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.542628 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b267da99-b608-441b-a6de-7a74d1923a8b-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-q777z\" (UID: \"b267da99-b608-441b-a6de-7a74d1923a8b\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-q777z" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.548105 4825 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b267da99-b608-441b-a6de-7a74d1923a8b-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-q777z\" (UID: \"b267da99-b608-441b-a6de-7a74d1923a8b\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-q777z" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.560001 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7cm4\" (UniqueName: \"kubernetes.io/projected/b267da99-b608-441b-a6de-7a74d1923a8b-kube-api-access-m7cm4\") pod \"nmstate-console-plugin-5dcbbd79cf-q777z\" (UID: \"b267da99-b608-441b-a6de-7a74d1923a8b\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-q777z" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.564864 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7c5fccd6f9-ckqzv"] Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.565719 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7c5fccd6f9-ckqzv" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.572902 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-5hjqp" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.577323 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7c5fccd6f9-ckqzv"] Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.622913 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-k8c9x" Mar 10 06:59:08 crc kubenswrapper[4825]: W0310 06:59:08.638642 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4991d4a_6815_4cc2_84e6_b7be04db45bf.slice/crio-19e1a31a8622c9f40c3ddc09f55f28c25f55377b3844ab6c798cda56b53624f7 WatchSource:0}: Error finding container 19e1a31a8622c9f40c3ddc09f55f28c25f55377b3844ab6c798cda56b53624f7: Status 404 returned error can't find the container with id 19e1a31a8622c9f40c3ddc09f55f28c25f55377b3844ab6c798cda56b53624f7 Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.640516 4825 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.642577 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/77e9d2b4-dd2e-4ebf-8820-e85bf749a38c-console-serving-cert\") pod \"console-7c5fccd6f9-ckqzv\" (UID: \"77e9d2b4-dd2e-4ebf-8820-e85bf749a38c\") " pod="openshift-console/console-7c5fccd6f9-ckqzv" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.642620 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/77e9d2b4-dd2e-4ebf-8820-e85bf749a38c-console-config\") pod \"console-7c5fccd6f9-ckqzv\" (UID: \"77e9d2b4-dd2e-4ebf-8820-e85bf749a38c\") " pod="openshift-console/console-7c5fccd6f9-ckqzv" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.642639 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/77e9d2b4-dd2e-4ebf-8820-e85bf749a38c-service-ca\") pod \"console-7c5fccd6f9-ckqzv\" (UID: \"77e9d2b4-dd2e-4ebf-8820-e85bf749a38c\") " pod="openshift-console/console-7c5fccd6f9-ckqzv" 
Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.642656 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/77e9d2b4-dd2e-4ebf-8820-e85bf749a38c-oauth-serving-cert\") pod \"console-7c5fccd6f9-ckqzv\" (UID: \"77e9d2b4-dd2e-4ebf-8820-e85bf749a38c\") " pod="openshift-console/console-7c5fccd6f9-ckqzv" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.642697 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77e9d2b4-dd2e-4ebf-8820-e85bf749a38c-trusted-ca-bundle\") pod \"console-7c5fccd6f9-ckqzv\" (UID: \"77e9d2b4-dd2e-4ebf-8820-e85bf749a38c\") " pod="openshift-console/console-7c5fccd6f9-ckqzv" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.642723 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/77e9d2b4-dd2e-4ebf-8820-e85bf749a38c-console-oauth-config\") pod \"console-7c5fccd6f9-ckqzv\" (UID: \"77e9d2b4-dd2e-4ebf-8820-e85bf749a38c\") " pod="openshift-console/console-7c5fccd6f9-ckqzv" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.642751 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npz8j\" (UniqueName: \"kubernetes.io/projected/77e9d2b4-dd2e-4ebf-8820-e85bf749a38c-kube-api-access-npz8j\") pod \"console-7c5fccd6f9-ckqzv\" (UID: \"77e9d2b4-dd2e-4ebf-8820-e85bf749a38c\") " pod="openshift-console/console-7c5fccd6f9-ckqzv" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.705640 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-q777z" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.744435 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/77e9d2b4-dd2e-4ebf-8820-e85bf749a38c-console-serving-cert\") pod \"console-7c5fccd6f9-ckqzv\" (UID: \"77e9d2b4-dd2e-4ebf-8820-e85bf749a38c\") " pod="openshift-console/console-7c5fccd6f9-ckqzv" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.744736 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/77e9d2b4-dd2e-4ebf-8820-e85bf749a38c-console-config\") pod \"console-7c5fccd6f9-ckqzv\" (UID: \"77e9d2b4-dd2e-4ebf-8820-e85bf749a38c\") " pod="openshift-console/console-7c5fccd6f9-ckqzv" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.744760 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/77e9d2b4-dd2e-4ebf-8820-e85bf749a38c-service-ca\") pod \"console-7c5fccd6f9-ckqzv\" (UID: \"77e9d2b4-dd2e-4ebf-8820-e85bf749a38c\") " pod="openshift-console/console-7c5fccd6f9-ckqzv" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.744777 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/77e9d2b4-dd2e-4ebf-8820-e85bf749a38c-oauth-serving-cert\") pod \"console-7c5fccd6f9-ckqzv\" (UID: \"77e9d2b4-dd2e-4ebf-8820-e85bf749a38c\") " pod="openshift-console/console-7c5fccd6f9-ckqzv" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.744811 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/77e9d2b4-dd2e-4ebf-8820-e85bf749a38c-console-oauth-config\") pod \"console-7c5fccd6f9-ckqzv\" (UID: 
\"77e9d2b4-dd2e-4ebf-8820-e85bf749a38c\") " pod="openshift-console/console-7c5fccd6f9-ckqzv" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.744826 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77e9d2b4-dd2e-4ebf-8820-e85bf749a38c-trusted-ca-bundle\") pod \"console-7c5fccd6f9-ckqzv\" (UID: \"77e9d2b4-dd2e-4ebf-8820-e85bf749a38c\") " pod="openshift-console/console-7c5fccd6f9-ckqzv" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.744847 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npz8j\" (UniqueName: \"kubernetes.io/projected/77e9d2b4-dd2e-4ebf-8820-e85bf749a38c-kube-api-access-npz8j\") pod \"console-7c5fccd6f9-ckqzv\" (UID: \"77e9d2b4-dd2e-4ebf-8820-e85bf749a38c\") " pod="openshift-console/console-7c5fccd6f9-ckqzv" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.745958 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/77e9d2b4-dd2e-4ebf-8820-e85bf749a38c-console-config\") pod \"console-7c5fccd6f9-ckqzv\" (UID: \"77e9d2b4-dd2e-4ebf-8820-e85bf749a38c\") " pod="openshift-console/console-7c5fccd6f9-ckqzv" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.746232 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/77e9d2b4-dd2e-4ebf-8820-e85bf749a38c-service-ca\") pod \"console-7c5fccd6f9-ckqzv\" (UID: \"77e9d2b4-dd2e-4ebf-8820-e85bf749a38c\") " pod="openshift-console/console-7c5fccd6f9-ckqzv" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.746590 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77e9d2b4-dd2e-4ebf-8820-e85bf749a38c-trusted-ca-bundle\") pod \"console-7c5fccd6f9-ckqzv\" (UID: \"77e9d2b4-dd2e-4ebf-8820-e85bf749a38c\") " 
pod="openshift-console/console-7c5fccd6f9-ckqzv" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.748275 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/77e9d2b4-dd2e-4ebf-8820-e85bf749a38c-oauth-serving-cert\") pod \"console-7c5fccd6f9-ckqzv\" (UID: \"77e9d2b4-dd2e-4ebf-8820-e85bf749a38c\") " pod="openshift-console/console-7c5fccd6f9-ckqzv" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.750259 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/77e9d2b4-dd2e-4ebf-8820-e85bf749a38c-console-oauth-config\") pod \"console-7c5fccd6f9-ckqzv\" (UID: \"77e9d2b4-dd2e-4ebf-8820-e85bf749a38c\") " pod="openshift-console/console-7c5fccd6f9-ckqzv" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.751061 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/77e9d2b4-dd2e-4ebf-8820-e85bf749a38c-console-serving-cert\") pod \"console-7c5fccd6f9-ckqzv\" (UID: \"77e9d2b4-dd2e-4ebf-8820-e85bf749a38c\") " pod="openshift-console/console-7c5fccd6f9-ckqzv" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.764170 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-5hjqp"] Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.767470 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npz8j\" (UniqueName: \"kubernetes.io/projected/77e9d2b4-dd2e-4ebf-8820-e85bf749a38c-kube-api-access-npz8j\") pod \"console-7c5fccd6f9-ckqzv\" (UID: \"77e9d2b4-dd2e-4ebf-8820-e85bf749a38c\") " pod="openshift-console/console-7c5fccd6f9-ckqzv" Mar 10 06:59:08 crc kubenswrapper[4825]: W0310 06:59:08.772410 4825 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cb1cbce_e37d_43fb_8b62_fc1b414cf7b9.slice/crio-e468da9a1f4111619c687b21d28b97da42701c54690a5c89d064517686b75266 WatchSource:0}: Error finding container e468da9a1f4111619c687b21d28b97da42701c54690a5c89d064517686b75266: Status 404 returned error can't find the container with id e468da9a1f4111619c687b21d28b97da42701c54690a5c89d064517686b75266 Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.924451 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7c5fccd6f9-ckqzv" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.947512 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/d75451be-5545-4c15-ae11-5c795d29494c-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-d8h2s\" (UID: \"d75451be-5545-4c15-ae11-5c795d29494c\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-d8h2s" Mar 10 06:59:08 crc kubenswrapper[4825]: I0310 06:59:08.950510 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/d75451be-5545-4c15-ae11-5c795d29494c-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-d8h2s\" (UID: \"d75451be-5545-4c15-ae11-5c795d29494c\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-d8h2s" Mar 10 06:59:09 crc kubenswrapper[4825]: I0310 06:59:09.094092 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-q777z"] Mar 10 06:59:09 crc kubenswrapper[4825]: I0310 06:59:09.151860 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7c5fccd6f9-ckqzv"] Mar 10 06:59:09 crc kubenswrapper[4825]: W0310 06:59:09.156514 4825 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77e9d2b4_dd2e_4ebf_8820_e85bf749a38c.slice/crio-03c7866354eedc3d57e2bfd959a644b012cf5a0751281674aaf9a9a42699748d WatchSource:0}: Error finding container 03c7866354eedc3d57e2bfd959a644b012cf5a0751281674aaf9a9a42699748d: Status 404 returned error can't find the container with id 03c7866354eedc3d57e2bfd959a644b012cf5a0751281674aaf9a9a42699748d Mar 10 06:59:09 crc kubenswrapper[4825]: I0310 06:59:09.180454 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c5fccd6f9-ckqzv" event={"ID":"77e9d2b4-dd2e-4ebf-8820-e85bf749a38c","Type":"ContainerStarted","Data":"03c7866354eedc3d57e2bfd959a644b012cf5a0751281674aaf9a9a42699748d"} Mar 10 06:59:09 crc kubenswrapper[4825]: I0310 06:59:09.181707 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-5hjqp" event={"ID":"3cb1cbce-e37d-43fb-8b62-fc1b414cf7b9","Type":"ContainerStarted","Data":"e468da9a1f4111619c687b21d28b97da42701c54690a5c89d064517686b75266"} Mar 10 06:59:09 crc kubenswrapper[4825]: I0310 06:59:09.183381 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-k8c9x" event={"ID":"d4991d4a-6815-4cc2-84e6-b7be04db45bf","Type":"ContainerStarted","Data":"19e1a31a8622c9f40c3ddc09f55f28c25f55377b3844ab6c798cda56b53624f7"} Mar 10 06:59:09 crc kubenswrapper[4825]: I0310 06:59:09.184409 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-q777z" event={"ID":"b267da99-b608-441b-a6de-7a74d1923a8b","Type":"ContainerStarted","Data":"f3e5ec56ac7b7ebb641142d123648ae29604b6739f4bdd40e199d559bf5a29e6"} Mar 10 06:59:09 crc kubenswrapper[4825]: I0310 06:59:09.197587 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-d8h2s" Mar 10 06:59:09 crc kubenswrapper[4825]: I0310 06:59:09.479442 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-d8h2s"] Mar 10 06:59:10 crc kubenswrapper[4825]: I0310 06:59:10.197401 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-d8h2s" event={"ID":"d75451be-5545-4c15-ae11-5c795d29494c","Type":"ContainerStarted","Data":"16ea7dda29588f1a5bf9fb1fbd16493872281eaf9e989ea0ac9aa8cbb604b10f"} Mar 10 06:59:10 crc kubenswrapper[4825]: I0310 06:59:10.200386 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c5fccd6f9-ckqzv" event={"ID":"77e9d2b4-dd2e-4ebf-8820-e85bf749a38c","Type":"ContainerStarted","Data":"039cde189e34b833690eef65af779087edf0d32ca60de6567c07bb5a31764205"} Mar 10 06:59:10 crc kubenswrapper[4825]: I0310 06:59:10.226215 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7c5fccd6f9-ckqzv" podStartSLOduration=2.226183844 podStartE2EDuration="2.226183844s" podCreationTimestamp="2026-03-10 06:59:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 06:59:10.225532737 +0000 UTC m=+903.255313442" watchObservedRunningTime="2026-03-10 06:59:10.226183844 +0000 UTC m=+903.255964489" Mar 10 06:59:10 crc kubenswrapper[4825]: I0310 06:59:10.432421 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n9k94" Mar 10 06:59:10 crc kubenswrapper[4825]: I0310 06:59:10.495521 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n9k94" Mar 10 06:59:10 crc kubenswrapper[4825]: I0310 06:59:10.665840 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-n9k94"] Mar 10 06:59:12 crc kubenswrapper[4825]: I0310 06:59:12.221086 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-q777z" event={"ID":"b267da99-b608-441b-a6de-7a74d1923a8b","Type":"ContainerStarted","Data":"f1bf8ad44b18fa34f715a800259cf0906aadc256f5b079a7abae3357f369ccf5"} Mar 10 06:59:12 crc kubenswrapper[4825]: I0310 06:59:12.227968 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-5hjqp" event={"ID":"3cb1cbce-e37d-43fb-8b62-fc1b414cf7b9","Type":"ContainerStarted","Data":"70f9d6efda41bd34f3a6b1c437d02c9f5f38ce9744544efc2c7fe731022e9033"} Mar 10 06:59:12 crc kubenswrapper[4825]: I0310 06:59:12.229636 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-d8h2s" event={"ID":"d75451be-5545-4c15-ae11-5c795d29494c","Type":"ContainerStarted","Data":"4a1d2fe35dd379a58d0e7906b2fd91c95446c3134fc433526f305e3dcfb0bb55"} Mar 10 06:59:12 crc kubenswrapper[4825]: I0310 06:59:12.229822 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n9k94" podUID="1629bc68-78f4-4222-990d-3107e5d3eac7" containerName="registry-server" containerID="cri-o://7b612319c25b8d4a5435eb058dfd167156cd6aadddd953f302eb799c68c9d445" gracePeriod=2 Mar 10 06:59:12 crc kubenswrapper[4825]: I0310 06:59:12.229948 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-786f45cff4-d8h2s" Mar 10 06:59:12 crc kubenswrapper[4825]: I0310 06:59:12.249681 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-q777z" podStartSLOduration=1.468654677 podStartE2EDuration="4.249656759s" podCreationTimestamp="2026-03-10 06:59:08 +0000 UTC" firstStartedPulling="2026-03-10 06:59:09.102581794 +0000 UTC m=+902.132362409" 
lastFinishedPulling="2026-03-10 06:59:11.883583876 +0000 UTC m=+904.913364491" observedRunningTime="2026-03-10 06:59:12.238268752 +0000 UTC m=+905.268049357" watchObservedRunningTime="2026-03-10 06:59:12.249656759 +0000 UTC m=+905.279437374" Mar 10 06:59:12 crc kubenswrapper[4825]: I0310 06:59:12.274587 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-786f45cff4-d8h2s" podStartSLOduration=1.873141881 podStartE2EDuration="4.274560097s" podCreationTimestamp="2026-03-10 06:59:08 +0000 UTC" firstStartedPulling="2026-03-10 06:59:09.495322402 +0000 UTC m=+902.525103027" lastFinishedPulling="2026-03-10 06:59:11.896740598 +0000 UTC m=+904.926521243" observedRunningTime="2026-03-10 06:59:12.266743694 +0000 UTC m=+905.296524319" watchObservedRunningTime="2026-03-10 06:59:12.274560097 +0000 UTC m=+905.304340732" Mar 10 06:59:12 crc kubenswrapper[4825]: I0310 06:59:12.602108 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n9k94" Mar 10 06:59:12 crc kubenswrapper[4825]: I0310 06:59:12.699583 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nq4kj\" (UniqueName: \"kubernetes.io/projected/1629bc68-78f4-4222-990d-3107e5d3eac7-kube-api-access-nq4kj\") pod \"1629bc68-78f4-4222-990d-3107e5d3eac7\" (UID: \"1629bc68-78f4-4222-990d-3107e5d3eac7\") " Mar 10 06:59:12 crc kubenswrapper[4825]: I0310 06:59:12.699723 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1629bc68-78f4-4222-990d-3107e5d3eac7-catalog-content\") pod \"1629bc68-78f4-4222-990d-3107e5d3eac7\" (UID: \"1629bc68-78f4-4222-990d-3107e5d3eac7\") " Mar 10 06:59:12 crc kubenswrapper[4825]: I0310 06:59:12.699783 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1629bc68-78f4-4222-990d-3107e5d3eac7-utilities\") pod \"1629bc68-78f4-4222-990d-3107e5d3eac7\" (UID: \"1629bc68-78f4-4222-990d-3107e5d3eac7\") " Mar 10 06:59:12 crc kubenswrapper[4825]: I0310 06:59:12.701770 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1629bc68-78f4-4222-990d-3107e5d3eac7-utilities" (OuterVolumeSpecName: "utilities") pod "1629bc68-78f4-4222-990d-3107e5d3eac7" (UID: "1629bc68-78f4-4222-990d-3107e5d3eac7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 06:59:12 crc kubenswrapper[4825]: I0310 06:59:12.705957 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1629bc68-78f4-4222-990d-3107e5d3eac7-kube-api-access-nq4kj" (OuterVolumeSpecName: "kube-api-access-nq4kj") pod "1629bc68-78f4-4222-990d-3107e5d3eac7" (UID: "1629bc68-78f4-4222-990d-3107e5d3eac7"). InnerVolumeSpecName "kube-api-access-nq4kj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:59:12 crc kubenswrapper[4825]: I0310 06:59:12.800952 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nq4kj\" (UniqueName: \"kubernetes.io/projected/1629bc68-78f4-4222-990d-3107e5d3eac7-kube-api-access-nq4kj\") on node \"crc\" DevicePath \"\"" Mar 10 06:59:12 crc kubenswrapper[4825]: I0310 06:59:12.800996 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1629bc68-78f4-4222-990d-3107e5d3eac7-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 06:59:12 crc kubenswrapper[4825]: I0310 06:59:12.833937 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1629bc68-78f4-4222-990d-3107e5d3eac7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1629bc68-78f4-4222-990d-3107e5d3eac7" (UID: "1629bc68-78f4-4222-990d-3107e5d3eac7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 06:59:12 crc kubenswrapper[4825]: I0310 06:59:12.903993 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1629bc68-78f4-4222-990d-3107e5d3eac7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 06:59:13 crc kubenswrapper[4825]: I0310 06:59:13.243910 4825 generic.go:334] "Generic (PLEG): container finished" podID="1629bc68-78f4-4222-990d-3107e5d3eac7" containerID="7b612319c25b8d4a5435eb058dfd167156cd6aadddd953f302eb799c68c9d445" exitCode=0 Mar 10 06:59:13 crc kubenswrapper[4825]: I0310 06:59:13.244070 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n9k94" Mar 10 06:59:13 crc kubenswrapper[4825]: I0310 06:59:13.246629 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-k8c9x" event={"ID":"d4991d4a-6815-4cc2-84e6-b7be04db45bf","Type":"ContainerStarted","Data":"7f13094d3e21563e57fa2baec826da92e4e133ec47ed8427c65c8559ab25c3f2"} Mar 10 06:59:13 crc kubenswrapper[4825]: I0310 06:59:13.246682 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-k8c9x" Mar 10 06:59:13 crc kubenswrapper[4825]: I0310 06:59:13.246701 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9k94" event={"ID":"1629bc68-78f4-4222-990d-3107e5d3eac7","Type":"ContainerDied","Data":"7b612319c25b8d4a5435eb058dfd167156cd6aadddd953f302eb799c68c9d445"} Mar 10 06:59:13 crc kubenswrapper[4825]: I0310 06:59:13.246729 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9k94" event={"ID":"1629bc68-78f4-4222-990d-3107e5d3eac7","Type":"ContainerDied","Data":"c4fd356dc15edd998f48fb5ac30cee58f74c2218f9742b915f4608bfec29ac68"} Mar 10 06:59:13 crc kubenswrapper[4825]: I0310 06:59:13.246757 4825 scope.go:117] 
"RemoveContainer" containerID="7b612319c25b8d4a5435eb058dfd167156cd6aadddd953f302eb799c68c9d445" Mar 10 06:59:13 crc kubenswrapper[4825]: I0310 06:59:13.288458 4825 scope.go:117] "RemoveContainer" containerID="72efa95f887e7be37d198e7ef78bf879228a9773f7410c185ead1551d47d40ca" Mar 10 06:59:13 crc kubenswrapper[4825]: I0310 06:59:13.298443 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-k8c9x" podStartSLOduration=2.044204505 podStartE2EDuration="5.29841466s" podCreationTimestamp="2026-03-10 06:59:08 +0000 UTC" firstStartedPulling="2026-03-10 06:59:08.640281645 +0000 UTC m=+901.670062260" lastFinishedPulling="2026-03-10 06:59:11.89449176 +0000 UTC m=+904.924272415" observedRunningTime="2026-03-10 06:59:13.275600406 +0000 UTC m=+906.305381061" watchObservedRunningTime="2026-03-10 06:59:13.29841466 +0000 UTC m=+906.328195315" Mar 10 06:59:13 crc kubenswrapper[4825]: I0310 06:59:13.307953 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n9k94"] Mar 10 06:59:13 crc kubenswrapper[4825]: I0310 06:59:13.314177 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n9k94"] Mar 10 06:59:13 crc kubenswrapper[4825]: I0310 06:59:13.327563 4825 scope.go:117] "RemoveContainer" containerID="6c36ac854fc03b4772ccf2547674a61060d29f43e1e82e730c4e1e616f1c634b" Mar 10 06:59:13 crc kubenswrapper[4825]: I0310 06:59:13.350901 4825 scope.go:117] "RemoveContainer" containerID="7b612319c25b8d4a5435eb058dfd167156cd6aadddd953f302eb799c68c9d445" Mar 10 06:59:13 crc kubenswrapper[4825]: E0310 06:59:13.351607 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b612319c25b8d4a5435eb058dfd167156cd6aadddd953f302eb799c68c9d445\": container with ID starting with 7b612319c25b8d4a5435eb058dfd167156cd6aadddd953f302eb799c68c9d445 not found: ID does not exist" 
containerID="7b612319c25b8d4a5435eb058dfd167156cd6aadddd953f302eb799c68c9d445" Mar 10 06:59:13 crc kubenswrapper[4825]: I0310 06:59:13.351666 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b612319c25b8d4a5435eb058dfd167156cd6aadddd953f302eb799c68c9d445"} err="failed to get container status \"7b612319c25b8d4a5435eb058dfd167156cd6aadddd953f302eb799c68c9d445\": rpc error: code = NotFound desc = could not find container \"7b612319c25b8d4a5435eb058dfd167156cd6aadddd953f302eb799c68c9d445\": container with ID starting with 7b612319c25b8d4a5435eb058dfd167156cd6aadddd953f302eb799c68c9d445 not found: ID does not exist" Mar 10 06:59:13 crc kubenswrapper[4825]: I0310 06:59:13.351706 4825 scope.go:117] "RemoveContainer" containerID="72efa95f887e7be37d198e7ef78bf879228a9773f7410c185ead1551d47d40ca" Mar 10 06:59:13 crc kubenswrapper[4825]: E0310 06:59:13.352626 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72efa95f887e7be37d198e7ef78bf879228a9773f7410c185ead1551d47d40ca\": container with ID starting with 72efa95f887e7be37d198e7ef78bf879228a9773f7410c185ead1551d47d40ca not found: ID does not exist" containerID="72efa95f887e7be37d198e7ef78bf879228a9773f7410c185ead1551d47d40ca" Mar 10 06:59:13 crc kubenswrapper[4825]: I0310 06:59:13.352692 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72efa95f887e7be37d198e7ef78bf879228a9773f7410c185ead1551d47d40ca"} err="failed to get container status \"72efa95f887e7be37d198e7ef78bf879228a9773f7410c185ead1551d47d40ca\": rpc error: code = NotFound desc = could not find container \"72efa95f887e7be37d198e7ef78bf879228a9773f7410c185ead1551d47d40ca\": container with ID starting with 72efa95f887e7be37d198e7ef78bf879228a9773f7410c185ead1551d47d40ca not found: ID does not exist" Mar 10 06:59:13 crc kubenswrapper[4825]: I0310 06:59:13.352735 4825 scope.go:117] 
"RemoveContainer" containerID="6c36ac854fc03b4772ccf2547674a61060d29f43e1e82e730c4e1e616f1c634b" Mar 10 06:59:13 crc kubenswrapper[4825]: E0310 06:59:13.353360 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c36ac854fc03b4772ccf2547674a61060d29f43e1e82e730c4e1e616f1c634b\": container with ID starting with 6c36ac854fc03b4772ccf2547674a61060d29f43e1e82e730c4e1e616f1c634b not found: ID does not exist" containerID="6c36ac854fc03b4772ccf2547674a61060d29f43e1e82e730c4e1e616f1c634b" Mar 10 06:59:13 crc kubenswrapper[4825]: I0310 06:59:13.353417 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c36ac854fc03b4772ccf2547674a61060d29f43e1e82e730c4e1e616f1c634b"} err="failed to get container status \"6c36ac854fc03b4772ccf2547674a61060d29f43e1e82e730c4e1e616f1c634b\": rpc error: code = NotFound desc = could not find container \"6c36ac854fc03b4772ccf2547674a61060d29f43e1e82e730c4e1e616f1c634b\": container with ID starting with 6c36ac854fc03b4772ccf2547674a61060d29f43e1e82e730c4e1e616f1c634b not found: ID does not exist" Mar 10 06:59:15 crc kubenswrapper[4825]: I0310 06:59:15.249181 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1629bc68-78f4-4222-990d-3107e5d3eac7" path="/var/lib/kubelet/pods/1629bc68-78f4-4222-990d-3107e5d3eac7/volumes" Mar 10 06:59:15 crc kubenswrapper[4825]: I0310 06:59:15.266694 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-5hjqp" event={"ID":"3cb1cbce-e37d-43fb-8b62-fc1b414cf7b9","Type":"ContainerStarted","Data":"d00876cf58548132c96749eca661e63929423e9da1a7d8571ee0c2041b37c655"} Mar 10 06:59:15 crc kubenswrapper[4825]: I0310 06:59:15.288219 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-69594cc75-5hjqp" podStartSLOduration=1.576420623 podStartE2EDuration="7.288207097s" 
podCreationTimestamp="2026-03-10 06:59:08 +0000 UTC" firstStartedPulling="2026-03-10 06:59:08.776072931 +0000 UTC m=+901.805853546" lastFinishedPulling="2026-03-10 06:59:14.487859365 +0000 UTC m=+907.517640020" observedRunningTime="2026-03-10 06:59:15.285367643 +0000 UTC m=+908.315148258" watchObservedRunningTime="2026-03-10 06:59:15.288207097 +0000 UTC m=+908.317987712" Mar 10 06:59:16 crc kubenswrapper[4825]: I0310 06:59:16.888972 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 06:59:16 crc kubenswrapper[4825]: I0310 06:59:16.889451 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 06:59:18 crc kubenswrapper[4825]: I0310 06:59:18.651935 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-k8c9x" Mar 10 06:59:18 crc kubenswrapper[4825]: I0310 06:59:18.924775 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7c5fccd6f9-ckqzv" Mar 10 06:59:18 crc kubenswrapper[4825]: I0310 06:59:18.924887 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7c5fccd6f9-ckqzv" Mar 10 06:59:18 crc kubenswrapper[4825]: I0310 06:59:18.932063 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7c5fccd6f9-ckqzv" Mar 10 06:59:19 crc kubenswrapper[4825]: I0310 06:59:19.304466 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-console/console-7c5fccd6f9-ckqzv" Mar 10 06:59:19 crc kubenswrapper[4825]: I0310 06:59:19.421533 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-6bjtt"] Mar 10 06:59:29 crc kubenswrapper[4825]: I0310 06:59:29.206306 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-786f45cff4-d8h2s" Mar 10 06:59:42 crc kubenswrapper[4825]: I0310 06:59:42.722397 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44gb79"] Mar 10 06:59:42 crc kubenswrapper[4825]: E0310 06:59:42.723249 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1629bc68-78f4-4222-990d-3107e5d3eac7" containerName="registry-server" Mar 10 06:59:42 crc kubenswrapper[4825]: I0310 06:59:42.723270 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="1629bc68-78f4-4222-990d-3107e5d3eac7" containerName="registry-server" Mar 10 06:59:42 crc kubenswrapper[4825]: E0310 06:59:42.723294 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1629bc68-78f4-4222-990d-3107e5d3eac7" containerName="extract-utilities" Mar 10 06:59:42 crc kubenswrapper[4825]: I0310 06:59:42.723306 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="1629bc68-78f4-4222-990d-3107e5d3eac7" containerName="extract-utilities" Mar 10 06:59:42 crc kubenswrapper[4825]: E0310 06:59:42.723341 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1629bc68-78f4-4222-990d-3107e5d3eac7" containerName="extract-content" Mar 10 06:59:42 crc kubenswrapper[4825]: I0310 06:59:42.723354 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="1629bc68-78f4-4222-990d-3107e5d3eac7" containerName="extract-content" Mar 10 06:59:42 crc kubenswrapper[4825]: I0310 06:59:42.723520 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="1629bc68-78f4-4222-990d-3107e5d3eac7" 
containerName="registry-server" Mar 10 06:59:42 crc kubenswrapper[4825]: I0310 06:59:42.724779 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44gb79" Mar 10 06:59:42 crc kubenswrapper[4825]: I0310 06:59:42.730057 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 10 06:59:42 crc kubenswrapper[4825]: I0310 06:59:42.745560 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44gb79"] Mar 10 06:59:42 crc kubenswrapper[4825]: I0310 06:59:42.865442 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvpw6\" (UniqueName: \"kubernetes.io/projected/b30ba77f-d659-4501-94e6-cc3e980b3f41-kube-api-access-zvpw6\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44gb79\" (UID: \"b30ba77f-d659-4501-94e6-cc3e980b3f41\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44gb79" Mar 10 06:59:42 crc kubenswrapper[4825]: I0310 06:59:42.865528 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b30ba77f-d659-4501-94e6-cc3e980b3f41-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44gb79\" (UID: \"b30ba77f-d659-4501-94e6-cc3e980b3f41\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44gb79" Mar 10 06:59:42 crc kubenswrapper[4825]: I0310 06:59:42.865612 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b30ba77f-d659-4501-94e6-cc3e980b3f41-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44gb79\" (UID: 
\"b30ba77f-d659-4501-94e6-cc3e980b3f41\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44gb79" Mar 10 06:59:42 crc kubenswrapper[4825]: I0310 06:59:42.967425 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvpw6\" (UniqueName: \"kubernetes.io/projected/b30ba77f-d659-4501-94e6-cc3e980b3f41-kube-api-access-zvpw6\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44gb79\" (UID: \"b30ba77f-d659-4501-94e6-cc3e980b3f41\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44gb79" Mar 10 06:59:42 crc kubenswrapper[4825]: I0310 06:59:42.967504 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b30ba77f-d659-4501-94e6-cc3e980b3f41-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44gb79\" (UID: \"b30ba77f-d659-4501-94e6-cc3e980b3f41\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44gb79" Mar 10 06:59:42 crc kubenswrapper[4825]: I0310 06:59:42.967560 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b30ba77f-d659-4501-94e6-cc3e980b3f41-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44gb79\" (UID: \"b30ba77f-d659-4501-94e6-cc3e980b3f41\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44gb79" Mar 10 06:59:42 crc kubenswrapper[4825]: I0310 06:59:42.968444 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b30ba77f-d659-4501-94e6-cc3e980b3f41-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44gb79\" (UID: \"b30ba77f-d659-4501-94e6-cc3e980b3f41\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44gb79" Mar 10 06:59:42 crc 
kubenswrapper[4825]: I0310 06:59:42.968486 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b30ba77f-d659-4501-94e6-cc3e980b3f41-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44gb79\" (UID: \"b30ba77f-d659-4501-94e6-cc3e980b3f41\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44gb79" Mar 10 06:59:42 crc kubenswrapper[4825]: I0310 06:59:42.996226 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvpw6\" (UniqueName: \"kubernetes.io/projected/b30ba77f-d659-4501-94e6-cc3e980b3f41-kube-api-access-zvpw6\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44gb79\" (UID: \"b30ba77f-d659-4501-94e6-cc3e980b3f41\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44gb79" Mar 10 06:59:43 crc kubenswrapper[4825]: I0310 06:59:43.066237 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44gb79" Mar 10 06:59:43 crc kubenswrapper[4825]: I0310 06:59:43.310022 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44gb79"] Mar 10 06:59:43 crc kubenswrapper[4825]: I0310 06:59:43.471787 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44gb79" event={"ID":"b30ba77f-d659-4501-94e6-cc3e980b3f41","Type":"ContainerStarted","Data":"e6b6b4882835b4ec5bb632bb0760853434fa68e81622d27967e694848c1db06f"} Mar 10 06:59:44 crc kubenswrapper[4825]: I0310 06:59:44.476314 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-6bjtt" podUID="f3a60327-2809-415b-abde-d1569a2453b6" containerName="console" containerID="cri-o://e1caf58256b6246018ee52bb36d21e9e27ff8bb5b218d2d8869057a0e006a818" gracePeriod=15 Mar 10 06:59:44 crc kubenswrapper[4825]: I0310 06:59:44.485748 4825 generic.go:334] "Generic (PLEG): container finished" podID="b30ba77f-d659-4501-94e6-cc3e980b3f41" containerID="0cf7434aaec6b8b3372f28cca58b6deb42b43b2a560b9620665eb55001b1bf99" exitCode=0 Mar 10 06:59:44 crc kubenswrapper[4825]: I0310 06:59:44.485916 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44gb79" event={"ID":"b30ba77f-d659-4501-94e6-cc3e980b3f41","Type":"ContainerDied","Data":"0cf7434aaec6b8b3372f28cca58b6deb42b43b2a560b9620665eb55001b1bf99"} Mar 10 06:59:44 crc kubenswrapper[4825]: I0310 06:59:44.919478 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-6bjtt_f3a60327-2809-415b-abde-d1569a2453b6/console/0.log" Mar 10 06:59:44 crc kubenswrapper[4825]: I0310 06:59:44.919826 4825 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-console/console-f9d7485db-6bjtt" Mar 10 06:59:45 crc kubenswrapper[4825]: I0310 06:59:45.011101 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f3a60327-2809-415b-abde-d1569a2453b6-console-serving-cert\") pod \"f3a60327-2809-415b-abde-d1569a2453b6\" (UID: \"f3a60327-2809-415b-abde-d1569a2453b6\") " Mar 10 06:59:45 crc kubenswrapper[4825]: I0310 06:59:45.011222 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dzl2\" (UniqueName: \"kubernetes.io/projected/f3a60327-2809-415b-abde-d1569a2453b6-kube-api-access-9dzl2\") pod \"f3a60327-2809-415b-abde-d1569a2453b6\" (UID: \"f3a60327-2809-415b-abde-d1569a2453b6\") " Mar 10 06:59:45 crc kubenswrapper[4825]: I0310 06:59:45.011247 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f3a60327-2809-415b-abde-d1569a2453b6-console-oauth-config\") pod \"f3a60327-2809-415b-abde-d1569a2453b6\" (UID: \"f3a60327-2809-415b-abde-d1569a2453b6\") " Mar 10 06:59:45 crc kubenswrapper[4825]: I0310 06:59:45.011275 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f3a60327-2809-415b-abde-d1569a2453b6-service-ca\") pod \"f3a60327-2809-415b-abde-d1569a2453b6\" (UID: \"f3a60327-2809-415b-abde-d1569a2453b6\") " Mar 10 06:59:45 crc kubenswrapper[4825]: I0310 06:59:45.011300 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f3a60327-2809-415b-abde-d1569a2453b6-oauth-serving-cert\") pod \"f3a60327-2809-415b-abde-d1569a2453b6\" (UID: \"f3a60327-2809-415b-abde-d1569a2453b6\") " Mar 10 06:59:45 crc kubenswrapper[4825]: I0310 06:59:45.011322 4825 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f3a60327-2809-415b-abde-d1569a2453b6-console-config\") pod \"f3a60327-2809-415b-abde-d1569a2453b6\" (UID: \"f3a60327-2809-415b-abde-d1569a2453b6\") " Mar 10 06:59:45 crc kubenswrapper[4825]: I0310 06:59:45.011346 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3a60327-2809-415b-abde-d1569a2453b6-trusted-ca-bundle\") pod \"f3a60327-2809-415b-abde-d1569a2453b6\" (UID: \"f3a60327-2809-415b-abde-d1569a2453b6\") " Mar 10 06:59:45 crc kubenswrapper[4825]: I0310 06:59:45.012697 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3a60327-2809-415b-abde-d1569a2453b6-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f3a60327-2809-415b-abde-d1569a2453b6" (UID: "f3a60327-2809-415b-abde-d1569a2453b6"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:59:45 crc kubenswrapper[4825]: I0310 06:59:45.013044 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3a60327-2809-415b-abde-d1569a2453b6-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f3a60327-2809-415b-abde-d1569a2453b6" (UID: "f3a60327-2809-415b-abde-d1569a2453b6"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:59:45 crc kubenswrapper[4825]: I0310 06:59:45.013383 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3a60327-2809-415b-abde-d1569a2453b6-console-config" (OuterVolumeSpecName: "console-config") pod "f3a60327-2809-415b-abde-d1569a2453b6" (UID: "f3a60327-2809-415b-abde-d1569a2453b6"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:59:45 crc kubenswrapper[4825]: I0310 06:59:45.013724 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3a60327-2809-415b-abde-d1569a2453b6-service-ca" (OuterVolumeSpecName: "service-ca") pod "f3a60327-2809-415b-abde-d1569a2453b6" (UID: "f3a60327-2809-415b-abde-d1569a2453b6"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 06:59:45 crc kubenswrapper[4825]: I0310 06:59:45.018252 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3a60327-2809-415b-abde-d1569a2453b6-kube-api-access-9dzl2" (OuterVolumeSpecName: "kube-api-access-9dzl2") pod "f3a60327-2809-415b-abde-d1569a2453b6" (UID: "f3a60327-2809-415b-abde-d1569a2453b6"). InnerVolumeSpecName "kube-api-access-9dzl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:59:45 crc kubenswrapper[4825]: I0310 06:59:45.019786 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3a60327-2809-415b-abde-d1569a2453b6-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f3a60327-2809-415b-abde-d1569a2453b6" (UID: "f3a60327-2809-415b-abde-d1569a2453b6"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:59:45 crc kubenswrapper[4825]: I0310 06:59:45.019989 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3a60327-2809-415b-abde-d1569a2453b6-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f3a60327-2809-415b-abde-d1569a2453b6" (UID: "f3a60327-2809-415b-abde-d1569a2453b6"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 06:59:45 crc kubenswrapper[4825]: I0310 06:59:45.112893 4825 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f3a60327-2809-415b-abde-d1569a2453b6-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 06:59:45 crc kubenswrapper[4825]: I0310 06:59:45.112967 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dzl2\" (UniqueName: \"kubernetes.io/projected/f3a60327-2809-415b-abde-d1569a2453b6-kube-api-access-9dzl2\") on node \"crc\" DevicePath \"\"" Mar 10 06:59:45 crc kubenswrapper[4825]: I0310 06:59:45.112984 4825 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f3a60327-2809-415b-abde-d1569a2453b6-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 10 06:59:45 crc kubenswrapper[4825]: I0310 06:59:45.113030 4825 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f3a60327-2809-415b-abde-d1569a2453b6-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 06:59:45 crc kubenswrapper[4825]: I0310 06:59:45.113049 4825 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f3a60327-2809-415b-abde-d1569a2453b6-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 06:59:45 crc kubenswrapper[4825]: I0310 06:59:45.113065 4825 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f3a60327-2809-415b-abde-d1569a2453b6-console-config\") on node \"crc\" DevicePath \"\"" Mar 10 06:59:45 crc kubenswrapper[4825]: I0310 06:59:45.113108 4825 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3a60327-2809-415b-abde-d1569a2453b6-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 06:59:45 crc 
kubenswrapper[4825]: I0310 06:59:45.494906 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-6bjtt_f3a60327-2809-415b-abde-d1569a2453b6/console/0.log" Mar 10 06:59:45 crc kubenswrapper[4825]: I0310 06:59:45.494981 4825 generic.go:334] "Generic (PLEG): container finished" podID="f3a60327-2809-415b-abde-d1569a2453b6" containerID="e1caf58256b6246018ee52bb36d21e9e27ff8bb5b218d2d8869057a0e006a818" exitCode=2 Mar 10 06:59:45 crc kubenswrapper[4825]: I0310 06:59:45.495022 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-6bjtt" event={"ID":"f3a60327-2809-415b-abde-d1569a2453b6","Type":"ContainerDied","Data":"e1caf58256b6246018ee52bb36d21e9e27ff8bb5b218d2d8869057a0e006a818"} Mar 10 06:59:45 crc kubenswrapper[4825]: I0310 06:59:45.495059 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-6bjtt" event={"ID":"f3a60327-2809-415b-abde-d1569a2453b6","Type":"ContainerDied","Data":"646706ff1b6a2ebe6615bad16203e62b5b8c9f40ae343769fcfbb3c7fdfbea43"} Mar 10 06:59:45 crc kubenswrapper[4825]: I0310 06:59:45.495086 4825 scope.go:117] "RemoveContainer" containerID="e1caf58256b6246018ee52bb36d21e9e27ff8bb5b218d2d8869057a0e006a818" Mar 10 06:59:45 crc kubenswrapper[4825]: I0310 06:59:45.495187 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-6bjtt" Mar 10 06:59:45 crc kubenswrapper[4825]: I0310 06:59:45.523803 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-6bjtt"] Mar 10 06:59:45 crc kubenswrapper[4825]: I0310 06:59:45.532938 4825 scope.go:117] "RemoveContainer" containerID="e1caf58256b6246018ee52bb36d21e9e27ff8bb5b218d2d8869057a0e006a818" Mar 10 06:59:45 crc kubenswrapper[4825]: E0310 06:59:45.533562 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1caf58256b6246018ee52bb36d21e9e27ff8bb5b218d2d8869057a0e006a818\": container with ID starting with e1caf58256b6246018ee52bb36d21e9e27ff8bb5b218d2d8869057a0e006a818 not found: ID does not exist" containerID="e1caf58256b6246018ee52bb36d21e9e27ff8bb5b218d2d8869057a0e006a818" Mar 10 06:59:45 crc kubenswrapper[4825]: I0310 06:59:45.533604 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1caf58256b6246018ee52bb36d21e9e27ff8bb5b218d2d8869057a0e006a818"} err="failed to get container status \"e1caf58256b6246018ee52bb36d21e9e27ff8bb5b218d2d8869057a0e006a818\": rpc error: code = NotFound desc = could not find container \"e1caf58256b6246018ee52bb36d21e9e27ff8bb5b218d2d8869057a0e006a818\": container with ID starting with e1caf58256b6246018ee52bb36d21e9e27ff8bb5b218d2d8869057a0e006a818 not found: ID does not exist" Mar 10 06:59:45 crc kubenswrapper[4825]: I0310 06:59:45.535061 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-6bjtt"] Mar 10 06:59:46 crc kubenswrapper[4825]: I0310 06:59:46.508173 4825 generic.go:334] "Generic (PLEG): container finished" podID="b30ba77f-d659-4501-94e6-cc3e980b3f41" containerID="90843d1cb8edaa955a6262496a034d0a4bfeaec08af7331dc9d007fa08780476" exitCode=0 Mar 10 06:59:46 crc kubenswrapper[4825]: I0310 06:59:46.508325 4825 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44gb79" event={"ID":"b30ba77f-d659-4501-94e6-cc3e980b3f41","Type":"ContainerDied","Data":"90843d1cb8edaa955a6262496a034d0a4bfeaec08af7331dc9d007fa08780476"} Mar 10 06:59:46 crc kubenswrapper[4825]: I0310 06:59:46.888034 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 06:59:46 crc kubenswrapper[4825]: I0310 06:59:46.888161 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 06:59:46 crc kubenswrapper[4825]: I0310 06:59:46.888238 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" Mar 10 06:59:46 crc kubenswrapper[4825]: I0310 06:59:46.889266 4825 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1b4ca6f9c5b9ae0428c11c821b788c2b1bec41466ba3ff84c82938c6518a2828"} pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 06:59:46 crc kubenswrapper[4825]: I0310 06:59:46.889350 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" 
containerID="cri-o://1b4ca6f9c5b9ae0428c11c821b788c2b1bec41466ba3ff84c82938c6518a2828" gracePeriod=600 Mar 10 06:59:47 crc kubenswrapper[4825]: I0310 06:59:47.246547 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3a60327-2809-415b-abde-d1569a2453b6" path="/var/lib/kubelet/pods/f3a60327-2809-415b-abde-d1569a2453b6/volumes" Mar 10 06:59:47 crc kubenswrapper[4825]: I0310 06:59:47.516004 4825 generic.go:334] "Generic (PLEG): container finished" podID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerID="1b4ca6f9c5b9ae0428c11c821b788c2b1bec41466ba3ff84c82938c6518a2828" exitCode=0 Mar 10 06:59:47 crc kubenswrapper[4825]: I0310 06:59:47.516062 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerDied","Data":"1b4ca6f9c5b9ae0428c11c821b788c2b1bec41466ba3ff84c82938c6518a2828"} Mar 10 06:59:47 crc kubenswrapper[4825]: I0310 06:59:47.516188 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerStarted","Data":"864f20c49ba566b3322366215d4709b7053290c4a77d8bfd2767699cce8f044b"} Mar 10 06:59:47 crc kubenswrapper[4825]: I0310 06:59:47.516227 4825 scope.go:117] "RemoveContainer" containerID="879be6c5ce3fe58a67d0da559938f171bc2a9441b209ba47dbc49f9ed90467ac" Mar 10 06:59:47 crc kubenswrapper[4825]: I0310 06:59:47.518359 4825 generic.go:334] "Generic (PLEG): container finished" podID="b30ba77f-d659-4501-94e6-cc3e980b3f41" containerID="8a04149b8009c9b7fa71f584a84db398eff26794dc4566a8aed63ea1492637ef" exitCode=0 Mar 10 06:59:47 crc kubenswrapper[4825]: I0310 06:59:47.518393 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44gb79" 
event={"ID":"b30ba77f-d659-4501-94e6-cc3e980b3f41","Type":"ContainerDied","Data":"8a04149b8009c9b7fa71f584a84db398eff26794dc4566a8aed63ea1492637ef"} Mar 10 06:59:48 crc kubenswrapper[4825]: I0310 06:59:48.778113 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44gb79" Mar 10 06:59:48 crc kubenswrapper[4825]: I0310 06:59:48.876733 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvpw6\" (UniqueName: \"kubernetes.io/projected/b30ba77f-d659-4501-94e6-cc3e980b3f41-kube-api-access-zvpw6\") pod \"b30ba77f-d659-4501-94e6-cc3e980b3f41\" (UID: \"b30ba77f-d659-4501-94e6-cc3e980b3f41\") " Mar 10 06:59:48 crc kubenswrapper[4825]: I0310 06:59:48.877366 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b30ba77f-d659-4501-94e6-cc3e980b3f41-util\") pod \"b30ba77f-d659-4501-94e6-cc3e980b3f41\" (UID: \"b30ba77f-d659-4501-94e6-cc3e980b3f41\") " Mar 10 06:59:48 crc kubenswrapper[4825]: I0310 06:59:48.877546 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b30ba77f-d659-4501-94e6-cc3e980b3f41-bundle\") pod \"b30ba77f-d659-4501-94e6-cc3e980b3f41\" (UID: \"b30ba77f-d659-4501-94e6-cc3e980b3f41\") " Mar 10 06:59:48 crc kubenswrapper[4825]: I0310 06:59:48.881846 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b30ba77f-d659-4501-94e6-cc3e980b3f41-bundle" (OuterVolumeSpecName: "bundle") pod "b30ba77f-d659-4501-94e6-cc3e980b3f41" (UID: "b30ba77f-d659-4501-94e6-cc3e980b3f41"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 06:59:48 crc kubenswrapper[4825]: I0310 06:59:48.884385 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b30ba77f-d659-4501-94e6-cc3e980b3f41-kube-api-access-zvpw6" (OuterVolumeSpecName: "kube-api-access-zvpw6") pod "b30ba77f-d659-4501-94e6-cc3e980b3f41" (UID: "b30ba77f-d659-4501-94e6-cc3e980b3f41"). InnerVolumeSpecName "kube-api-access-zvpw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 06:59:48 crc kubenswrapper[4825]: I0310 06:59:48.891520 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b30ba77f-d659-4501-94e6-cc3e980b3f41-util" (OuterVolumeSpecName: "util") pod "b30ba77f-d659-4501-94e6-cc3e980b3f41" (UID: "b30ba77f-d659-4501-94e6-cc3e980b3f41"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 06:59:48 crc kubenswrapper[4825]: I0310 06:59:48.980513 4825 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b30ba77f-d659-4501-94e6-cc3e980b3f41-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 06:59:48 crc kubenswrapper[4825]: I0310 06:59:48.980563 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvpw6\" (UniqueName: \"kubernetes.io/projected/b30ba77f-d659-4501-94e6-cc3e980b3f41-kube-api-access-zvpw6\") on node \"crc\" DevicePath \"\"" Mar 10 06:59:48 crc kubenswrapper[4825]: I0310 06:59:48.980586 4825 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b30ba77f-d659-4501-94e6-cc3e980b3f41-util\") on node \"crc\" DevicePath \"\"" Mar 10 06:59:49 crc kubenswrapper[4825]: I0310 06:59:49.548316 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44gb79" 
event={"ID":"b30ba77f-d659-4501-94e6-cc3e980b3f41","Type":"ContainerDied","Data":"e6b6b4882835b4ec5bb632bb0760853434fa68e81622d27967e694848c1db06f"} Mar 10 06:59:49 crc kubenswrapper[4825]: I0310 06:59:49.548376 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6b6b4882835b4ec5bb632bb0760853434fa68e81622d27967e694848c1db06f" Mar 10 06:59:49 crc kubenswrapper[4825]: I0310 06:59:49.548479 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44gb79" Mar 10 06:59:57 crc kubenswrapper[4825]: I0310 06:59:57.813256 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5488d4f4f7-kjl8v"] Mar 10 06:59:57 crc kubenswrapper[4825]: E0310 06:59:57.814002 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b30ba77f-d659-4501-94e6-cc3e980b3f41" containerName="extract" Mar 10 06:59:57 crc kubenswrapper[4825]: I0310 06:59:57.814018 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b30ba77f-d659-4501-94e6-cc3e980b3f41" containerName="extract" Mar 10 06:59:57 crc kubenswrapper[4825]: E0310 06:59:57.814031 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3a60327-2809-415b-abde-d1569a2453b6" containerName="console" Mar 10 06:59:57 crc kubenswrapper[4825]: I0310 06:59:57.814039 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3a60327-2809-415b-abde-d1569a2453b6" containerName="console" Mar 10 06:59:57 crc kubenswrapper[4825]: E0310 06:59:57.814055 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b30ba77f-d659-4501-94e6-cc3e980b3f41" containerName="util" Mar 10 06:59:57 crc kubenswrapper[4825]: I0310 06:59:57.814062 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b30ba77f-d659-4501-94e6-cc3e980b3f41" containerName="util" Mar 10 06:59:57 crc kubenswrapper[4825]: E0310 06:59:57.814082 4825 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b30ba77f-d659-4501-94e6-cc3e980b3f41" containerName="pull" Mar 10 06:59:57 crc kubenswrapper[4825]: I0310 06:59:57.814089 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b30ba77f-d659-4501-94e6-cc3e980b3f41" containerName="pull" Mar 10 06:59:57 crc kubenswrapper[4825]: I0310 06:59:57.814216 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3a60327-2809-415b-abde-d1569a2453b6" containerName="console" Mar 10 06:59:57 crc kubenswrapper[4825]: I0310 06:59:57.814233 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="b30ba77f-d659-4501-94e6-cc3e980b3f41" containerName="extract" Mar 10 06:59:57 crc kubenswrapper[4825]: I0310 06:59:57.814688 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5488d4f4f7-kjl8v" Mar 10 06:59:57 crc kubenswrapper[4825]: I0310 06:59:57.816454 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 10 06:59:57 crc kubenswrapper[4825]: I0310 06:59:57.816619 4825 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 10 06:59:57 crc kubenswrapper[4825]: I0310 06:59:57.816651 4825 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 10 06:59:57 crc kubenswrapper[4825]: I0310 06:59:57.816872 4825 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-tks42" Mar 10 06:59:57 crc kubenswrapper[4825]: I0310 06:59:57.817015 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 10 06:59:57 crc kubenswrapper[4825]: I0310 06:59:57.836507 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["metallb-system/metallb-operator-controller-manager-5488d4f4f7-kjl8v"] Mar 10 06:59:57 crc kubenswrapper[4825]: I0310 06:59:57.923770 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1876f466-8750-4a15-bbe2-e03da6d0df87-apiservice-cert\") pod \"metallb-operator-controller-manager-5488d4f4f7-kjl8v\" (UID: \"1876f466-8750-4a15-bbe2-e03da6d0df87\") " pod="metallb-system/metallb-operator-controller-manager-5488d4f4f7-kjl8v" Mar 10 06:59:57 crc kubenswrapper[4825]: I0310 06:59:57.923909 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z9m7\" (UniqueName: \"kubernetes.io/projected/1876f466-8750-4a15-bbe2-e03da6d0df87-kube-api-access-6z9m7\") pod \"metallb-operator-controller-manager-5488d4f4f7-kjl8v\" (UID: \"1876f466-8750-4a15-bbe2-e03da6d0df87\") " pod="metallb-system/metallb-operator-controller-manager-5488d4f4f7-kjl8v" Mar 10 06:59:57 crc kubenswrapper[4825]: I0310 06:59:57.923948 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1876f466-8750-4a15-bbe2-e03da6d0df87-webhook-cert\") pod \"metallb-operator-controller-manager-5488d4f4f7-kjl8v\" (UID: \"1876f466-8750-4a15-bbe2-e03da6d0df87\") " pod="metallb-system/metallb-operator-controller-manager-5488d4f4f7-kjl8v" Mar 10 06:59:58 crc kubenswrapper[4825]: I0310 06:59:58.025597 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z9m7\" (UniqueName: \"kubernetes.io/projected/1876f466-8750-4a15-bbe2-e03da6d0df87-kube-api-access-6z9m7\") pod \"metallb-operator-controller-manager-5488d4f4f7-kjl8v\" (UID: \"1876f466-8750-4a15-bbe2-e03da6d0df87\") " pod="metallb-system/metallb-operator-controller-manager-5488d4f4f7-kjl8v" Mar 10 06:59:58 crc kubenswrapper[4825]: I0310 06:59:58.025643 4825 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1876f466-8750-4a15-bbe2-e03da6d0df87-webhook-cert\") pod \"metallb-operator-controller-manager-5488d4f4f7-kjl8v\" (UID: \"1876f466-8750-4a15-bbe2-e03da6d0df87\") " pod="metallb-system/metallb-operator-controller-manager-5488d4f4f7-kjl8v" Mar 10 06:59:58 crc kubenswrapper[4825]: I0310 06:59:58.025705 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1876f466-8750-4a15-bbe2-e03da6d0df87-apiservice-cert\") pod \"metallb-operator-controller-manager-5488d4f4f7-kjl8v\" (UID: \"1876f466-8750-4a15-bbe2-e03da6d0df87\") " pod="metallb-system/metallb-operator-controller-manager-5488d4f4f7-kjl8v" Mar 10 06:59:58 crc kubenswrapper[4825]: I0310 06:59:58.032224 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1876f466-8750-4a15-bbe2-e03da6d0df87-apiservice-cert\") pod \"metallb-operator-controller-manager-5488d4f4f7-kjl8v\" (UID: \"1876f466-8750-4a15-bbe2-e03da6d0df87\") " pod="metallb-system/metallb-operator-controller-manager-5488d4f4f7-kjl8v" Mar 10 06:59:58 crc kubenswrapper[4825]: I0310 06:59:58.032224 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1876f466-8750-4a15-bbe2-e03da6d0df87-webhook-cert\") pod \"metallb-operator-controller-manager-5488d4f4f7-kjl8v\" (UID: \"1876f466-8750-4a15-bbe2-e03da6d0df87\") " pod="metallb-system/metallb-operator-controller-manager-5488d4f4f7-kjl8v" Mar 10 06:59:58 crc kubenswrapper[4825]: I0310 06:59:58.053913 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z9m7\" (UniqueName: \"kubernetes.io/projected/1876f466-8750-4a15-bbe2-e03da6d0df87-kube-api-access-6z9m7\") pod \"metallb-operator-controller-manager-5488d4f4f7-kjl8v\" (UID: 
\"1876f466-8750-4a15-bbe2-e03da6d0df87\") " pod="metallb-system/metallb-operator-controller-manager-5488d4f4f7-kjl8v" Mar 10 06:59:58 crc kubenswrapper[4825]: I0310 06:59:58.128591 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-74cf7b6d9d-m2z87"] Mar 10 06:59:58 crc kubenswrapper[4825]: I0310 06:59:58.129608 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-74cf7b6d9d-m2z87" Mar 10 06:59:58 crc kubenswrapper[4825]: I0310 06:59:58.130254 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5488d4f4f7-kjl8v" Mar 10 06:59:58 crc kubenswrapper[4825]: I0310 06:59:58.134718 4825 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 10 06:59:58 crc kubenswrapper[4825]: I0310 06:59:58.135284 4825 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-8w4l5" Mar 10 06:59:58 crc kubenswrapper[4825]: I0310 06:59:58.135494 4825 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 10 06:59:58 crc kubenswrapper[4825]: I0310 06:59:58.165829 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-74cf7b6d9d-m2z87"] Mar 10 06:59:58 crc kubenswrapper[4825]: I0310 06:59:58.229469 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv86z\" (UniqueName: \"kubernetes.io/projected/944a661d-8d36-464b-a9d3-b2477f6e4663-kube-api-access-lv86z\") pod \"metallb-operator-webhook-server-74cf7b6d9d-m2z87\" (UID: \"944a661d-8d36-464b-a9d3-b2477f6e4663\") " pod="metallb-system/metallb-operator-webhook-server-74cf7b6d9d-m2z87" Mar 10 06:59:58 crc kubenswrapper[4825]: I0310 06:59:58.229537 4825 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/944a661d-8d36-464b-a9d3-b2477f6e4663-apiservice-cert\") pod \"metallb-operator-webhook-server-74cf7b6d9d-m2z87\" (UID: \"944a661d-8d36-464b-a9d3-b2477f6e4663\") " pod="metallb-system/metallb-operator-webhook-server-74cf7b6d9d-m2z87" Mar 10 06:59:58 crc kubenswrapper[4825]: I0310 06:59:58.229580 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/944a661d-8d36-464b-a9d3-b2477f6e4663-webhook-cert\") pod \"metallb-operator-webhook-server-74cf7b6d9d-m2z87\" (UID: \"944a661d-8d36-464b-a9d3-b2477f6e4663\") " pod="metallb-system/metallb-operator-webhook-server-74cf7b6d9d-m2z87" Mar 10 06:59:58 crc kubenswrapper[4825]: I0310 06:59:58.331061 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/944a661d-8d36-464b-a9d3-b2477f6e4663-webhook-cert\") pod \"metallb-operator-webhook-server-74cf7b6d9d-m2z87\" (UID: \"944a661d-8d36-464b-a9d3-b2477f6e4663\") " pod="metallb-system/metallb-operator-webhook-server-74cf7b6d9d-m2z87" Mar 10 06:59:58 crc kubenswrapper[4825]: I0310 06:59:58.331191 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lv86z\" (UniqueName: \"kubernetes.io/projected/944a661d-8d36-464b-a9d3-b2477f6e4663-kube-api-access-lv86z\") pod \"metallb-operator-webhook-server-74cf7b6d9d-m2z87\" (UID: \"944a661d-8d36-464b-a9d3-b2477f6e4663\") " pod="metallb-system/metallb-operator-webhook-server-74cf7b6d9d-m2z87" Mar 10 06:59:58 crc kubenswrapper[4825]: I0310 06:59:58.331213 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/944a661d-8d36-464b-a9d3-b2477f6e4663-apiservice-cert\") pod 
\"metallb-operator-webhook-server-74cf7b6d9d-m2z87\" (UID: \"944a661d-8d36-464b-a9d3-b2477f6e4663\") " pod="metallb-system/metallb-operator-webhook-server-74cf7b6d9d-m2z87" Mar 10 06:59:58 crc kubenswrapper[4825]: I0310 06:59:58.335514 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/944a661d-8d36-464b-a9d3-b2477f6e4663-apiservice-cert\") pod \"metallb-operator-webhook-server-74cf7b6d9d-m2z87\" (UID: \"944a661d-8d36-464b-a9d3-b2477f6e4663\") " pod="metallb-system/metallb-operator-webhook-server-74cf7b6d9d-m2z87" Mar 10 06:59:58 crc kubenswrapper[4825]: I0310 06:59:58.347837 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/944a661d-8d36-464b-a9d3-b2477f6e4663-webhook-cert\") pod \"metallb-operator-webhook-server-74cf7b6d9d-m2z87\" (UID: \"944a661d-8d36-464b-a9d3-b2477f6e4663\") " pod="metallb-system/metallb-operator-webhook-server-74cf7b6d9d-m2z87" Mar 10 06:59:58 crc kubenswrapper[4825]: I0310 06:59:58.354236 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv86z\" (UniqueName: \"kubernetes.io/projected/944a661d-8d36-464b-a9d3-b2477f6e4663-kube-api-access-lv86z\") pod \"metallb-operator-webhook-server-74cf7b6d9d-m2z87\" (UID: \"944a661d-8d36-464b-a9d3-b2477f6e4663\") " pod="metallb-system/metallb-operator-webhook-server-74cf7b6d9d-m2z87" Mar 10 06:59:58 crc kubenswrapper[4825]: I0310 06:59:58.407071 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5488d4f4f7-kjl8v"] Mar 10 06:59:58 crc kubenswrapper[4825]: W0310 06:59:58.410791 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1876f466_8750_4a15_bbe2_e03da6d0df87.slice/crio-0eb54c3cb32dba8105d592df43a3975b151b27e970453c02c2cbbc7dcec647d7 WatchSource:0}: Error finding container 
0eb54c3cb32dba8105d592df43a3975b151b27e970453c02c2cbbc7dcec647d7: Status 404 returned error can't find the container with id 0eb54c3cb32dba8105d592df43a3975b151b27e970453c02c2cbbc7dcec647d7 Mar 10 06:59:58 crc kubenswrapper[4825]: I0310 06:59:58.501878 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-74cf7b6d9d-m2z87" Mar 10 06:59:58 crc kubenswrapper[4825]: I0310 06:59:58.604058 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5488d4f4f7-kjl8v" event={"ID":"1876f466-8750-4a15-bbe2-e03da6d0df87","Type":"ContainerStarted","Data":"0eb54c3cb32dba8105d592df43a3975b151b27e970453c02c2cbbc7dcec647d7"} Mar 10 06:59:58 crc kubenswrapper[4825]: W0310 06:59:58.728214 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod944a661d_8d36_464b_a9d3_b2477f6e4663.slice/crio-b6d4564650eb0463ed6242472753075ad36f00eee01ce04768cc13a06fd18c24 WatchSource:0}: Error finding container b6d4564650eb0463ed6242472753075ad36f00eee01ce04768cc13a06fd18c24: Status 404 returned error can't find the container with id b6d4564650eb0463ed6242472753075ad36f00eee01ce04768cc13a06fd18c24 Mar 10 06:59:58 crc kubenswrapper[4825]: I0310 06:59:58.733322 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-74cf7b6d9d-m2z87"] Mar 10 06:59:59 crc kubenswrapper[4825]: I0310 06:59:59.610872 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-74cf7b6d9d-m2z87" event={"ID":"944a661d-8d36-464b-a9d3-b2477f6e4663","Type":"ContainerStarted","Data":"b6d4564650eb0463ed6242472753075ad36f00eee01ce04768cc13a06fd18c24"} Mar 10 07:00:00 crc kubenswrapper[4825]: I0310 07:00:00.141853 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552100-n59dv"] Mar 10 07:00:00 
crc kubenswrapper[4825]: I0310 07:00:00.142757 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552100-n59dv" Mar 10 07:00:00 crc kubenswrapper[4825]: I0310 07:00:00.144226 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 07:00:00 crc kubenswrapper[4825]: I0310 07:00:00.144651 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 07:00:00 crc kubenswrapper[4825]: I0310 07:00:00.144861 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 07:00:00 crc kubenswrapper[4825]: I0310 07:00:00.150547 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552100-5wl6j"] Mar 10 07:00:00 crc kubenswrapper[4825]: I0310 07:00:00.152385 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552100-5wl6j" Mar 10 07:00:00 crc kubenswrapper[4825]: I0310 07:00:00.154175 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552100-n59dv"] Mar 10 07:00:00 crc kubenswrapper[4825]: I0310 07:00:00.155732 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 07:00:00 crc kubenswrapper[4825]: I0310 07:00:00.155999 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 07:00:00 crc kubenswrapper[4825]: I0310 07:00:00.159835 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552100-5wl6j"] Mar 10 07:00:00 crc kubenswrapper[4825]: I0310 07:00:00.258746 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqdrw\" (UniqueName: \"kubernetes.io/projected/e69e652a-0413-42a4-9fea-d20d094111ac-kube-api-access-mqdrw\") pod \"auto-csr-approver-29552100-n59dv\" (UID: \"e69e652a-0413-42a4-9fea-d20d094111ac\") " pod="openshift-infra/auto-csr-approver-29552100-n59dv" Mar 10 07:00:00 crc kubenswrapper[4825]: I0310 07:00:00.258785 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b0f2bc77-d38f-42b6-a789-38843e17bcbb-config-volume\") pod \"collect-profiles-29552100-5wl6j\" (UID: \"b0f2bc77-d38f-42b6-a789-38843e17bcbb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552100-5wl6j" Mar 10 07:00:00 crc kubenswrapper[4825]: I0310 07:00:00.258816 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlvpp\" (UniqueName: 
\"kubernetes.io/projected/b0f2bc77-d38f-42b6-a789-38843e17bcbb-kube-api-access-hlvpp\") pod \"collect-profiles-29552100-5wl6j\" (UID: \"b0f2bc77-d38f-42b6-a789-38843e17bcbb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552100-5wl6j" Mar 10 07:00:00 crc kubenswrapper[4825]: I0310 07:00:00.258845 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b0f2bc77-d38f-42b6-a789-38843e17bcbb-secret-volume\") pod \"collect-profiles-29552100-5wl6j\" (UID: \"b0f2bc77-d38f-42b6-a789-38843e17bcbb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552100-5wl6j" Mar 10 07:00:00 crc kubenswrapper[4825]: I0310 07:00:00.360617 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqdrw\" (UniqueName: \"kubernetes.io/projected/e69e652a-0413-42a4-9fea-d20d094111ac-kube-api-access-mqdrw\") pod \"auto-csr-approver-29552100-n59dv\" (UID: \"e69e652a-0413-42a4-9fea-d20d094111ac\") " pod="openshift-infra/auto-csr-approver-29552100-n59dv" Mar 10 07:00:00 crc kubenswrapper[4825]: I0310 07:00:00.360680 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b0f2bc77-d38f-42b6-a789-38843e17bcbb-config-volume\") pod \"collect-profiles-29552100-5wl6j\" (UID: \"b0f2bc77-d38f-42b6-a789-38843e17bcbb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552100-5wl6j" Mar 10 07:00:00 crc kubenswrapper[4825]: I0310 07:00:00.360720 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlvpp\" (UniqueName: \"kubernetes.io/projected/b0f2bc77-d38f-42b6-a789-38843e17bcbb-kube-api-access-hlvpp\") pod \"collect-profiles-29552100-5wl6j\" (UID: \"b0f2bc77-d38f-42b6-a789-38843e17bcbb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552100-5wl6j" Mar 10 07:00:00 crc 
kubenswrapper[4825]: I0310 07:00:00.360775 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b0f2bc77-d38f-42b6-a789-38843e17bcbb-secret-volume\") pod \"collect-profiles-29552100-5wl6j\" (UID: \"b0f2bc77-d38f-42b6-a789-38843e17bcbb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552100-5wl6j" Mar 10 07:00:00 crc kubenswrapper[4825]: I0310 07:00:00.362886 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b0f2bc77-d38f-42b6-a789-38843e17bcbb-config-volume\") pod \"collect-profiles-29552100-5wl6j\" (UID: \"b0f2bc77-d38f-42b6-a789-38843e17bcbb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552100-5wl6j" Mar 10 07:00:00 crc kubenswrapper[4825]: I0310 07:00:00.376551 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlvpp\" (UniqueName: \"kubernetes.io/projected/b0f2bc77-d38f-42b6-a789-38843e17bcbb-kube-api-access-hlvpp\") pod \"collect-profiles-29552100-5wl6j\" (UID: \"b0f2bc77-d38f-42b6-a789-38843e17bcbb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552100-5wl6j" Mar 10 07:00:00 crc kubenswrapper[4825]: I0310 07:00:00.376775 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b0f2bc77-d38f-42b6-a789-38843e17bcbb-secret-volume\") pod \"collect-profiles-29552100-5wl6j\" (UID: \"b0f2bc77-d38f-42b6-a789-38843e17bcbb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552100-5wl6j" Mar 10 07:00:00 crc kubenswrapper[4825]: I0310 07:00:00.377521 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqdrw\" (UniqueName: \"kubernetes.io/projected/e69e652a-0413-42a4-9fea-d20d094111ac-kube-api-access-mqdrw\") pod \"auto-csr-approver-29552100-n59dv\" (UID: \"e69e652a-0413-42a4-9fea-d20d094111ac\") " 
pod="openshift-infra/auto-csr-approver-29552100-n59dv" Mar 10 07:00:00 crc kubenswrapper[4825]: I0310 07:00:00.464285 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552100-n59dv" Mar 10 07:00:00 crc kubenswrapper[4825]: I0310 07:00:00.472564 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552100-5wl6j" Mar 10 07:00:00 crc kubenswrapper[4825]: I0310 07:00:00.944990 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552100-n59dv"] Mar 10 07:00:01 crc kubenswrapper[4825]: I0310 07:00:01.009914 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552100-5wl6j"] Mar 10 07:00:01 crc kubenswrapper[4825]: I0310 07:00:01.622015 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552100-5wl6j" event={"ID":"b0f2bc77-d38f-42b6-a789-38843e17bcbb","Type":"ContainerStarted","Data":"63e57d7eb4fc75f1130281b18ed6b36a0dae9fd17d4e8d845468b99d5e9ef505"} Mar 10 07:00:01 crc kubenswrapper[4825]: I0310 07:00:01.623016 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552100-n59dv" event={"ID":"e69e652a-0413-42a4-9fea-d20d094111ac","Type":"ContainerStarted","Data":"2082659d362d0d45f81bc87e8b26d8db92c1a7d38ba9b006f8521a00f2028a31"} Mar 10 07:00:02 crc kubenswrapper[4825]: I0310 07:00:02.630211 4825 generic.go:334] "Generic (PLEG): container finished" podID="b0f2bc77-d38f-42b6-a789-38843e17bcbb" containerID="abffb41e22121e7640bd4bbc82fb8e4c99dee0b7b8cf28bc6b3407c6f194645f" exitCode=0 Mar 10 07:00:02 crc kubenswrapper[4825]: I0310 07:00:02.630292 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552100-5wl6j" 
event={"ID":"b0f2bc77-d38f-42b6-a789-38843e17bcbb","Type":"ContainerDied","Data":"abffb41e22121e7640bd4bbc82fb8e4c99dee0b7b8cf28bc6b3407c6f194645f"} Mar 10 07:00:02 crc kubenswrapper[4825]: I0310 07:00:02.632164 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5488d4f4f7-kjl8v" event={"ID":"1876f466-8750-4a15-bbe2-e03da6d0df87","Type":"ContainerStarted","Data":"201ac00f22ab3c5384d21b31591e362e9807d5a7c36eee2d89da590cbb3feb0a"} Mar 10 07:00:02 crc kubenswrapper[4825]: I0310 07:00:02.632280 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5488d4f4f7-kjl8v" Mar 10 07:00:02 crc kubenswrapper[4825]: I0310 07:00:02.671350 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5488d4f4f7-kjl8v" podStartSLOduration=2.453799959 podStartE2EDuration="5.671332318s" podCreationTimestamp="2026-03-10 06:59:57 +0000 UTC" firstStartedPulling="2026-03-10 06:59:58.415878729 +0000 UTC m=+951.445659344" lastFinishedPulling="2026-03-10 07:00:01.633411098 +0000 UTC m=+954.663191703" observedRunningTime="2026-03-10 07:00:02.665910887 +0000 UTC m=+955.695691512" watchObservedRunningTime="2026-03-10 07:00:02.671332318 +0000 UTC m=+955.701112933" Mar 10 07:00:03 crc kubenswrapper[4825]: I0310 07:00:03.641225 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-74cf7b6d9d-m2z87" event={"ID":"944a661d-8d36-464b-a9d3-b2477f6e4663","Type":"ContainerStarted","Data":"35efacbfe97085038590d06b4fd196e263f5679fe2dcf9c7bd267be906b60544"} Mar 10 07:00:03 crc kubenswrapper[4825]: I0310 07:00:03.641730 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-74cf7b6d9d-m2z87" Mar 10 07:00:03 crc kubenswrapper[4825]: I0310 07:00:03.671580 4825 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-74cf7b6d9d-m2z87" podStartSLOduration=1.056383279 podStartE2EDuration="5.671566696s" podCreationTimestamp="2026-03-10 06:59:58 +0000 UTC" firstStartedPulling="2026-03-10 06:59:58.735283757 +0000 UTC m=+951.765064362" lastFinishedPulling="2026-03-10 07:00:03.350467164 +0000 UTC m=+956.380247779" observedRunningTime="2026-03-10 07:00:03.66597553 +0000 UTC m=+956.695756155" watchObservedRunningTime="2026-03-10 07:00:03.671566696 +0000 UTC m=+956.701347311" Mar 10 07:00:03 crc kubenswrapper[4825]: I0310 07:00:03.908794 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552100-5wl6j" Mar 10 07:00:04 crc kubenswrapper[4825]: I0310 07:00:04.007217 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b0f2bc77-d38f-42b6-a789-38843e17bcbb-secret-volume\") pod \"b0f2bc77-d38f-42b6-a789-38843e17bcbb\" (UID: \"b0f2bc77-d38f-42b6-a789-38843e17bcbb\") " Mar 10 07:00:04 crc kubenswrapper[4825]: I0310 07:00:04.007375 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlvpp\" (UniqueName: \"kubernetes.io/projected/b0f2bc77-d38f-42b6-a789-38843e17bcbb-kube-api-access-hlvpp\") pod \"b0f2bc77-d38f-42b6-a789-38843e17bcbb\" (UID: \"b0f2bc77-d38f-42b6-a789-38843e17bcbb\") " Mar 10 07:00:04 crc kubenswrapper[4825]: I0310 07:00:04.007484 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b0f2bc77-d38f-42b6-a789-38843e17bcbb-config-volume\") pod \"b0f2bc77-d38f-42b6-a789-38843e17bcbb\" (UID: \"b0f2bc77-d38f-42b6-a789-38843e17bcbb\") " Mar 10 07:00:04 crc kubenswrapper[4825]: I0310 07:00:04.008708 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/b0f2bc77-d38f-42b6-a789-38843e17bcbb-config-volume" (OuterVolumeSpecName: "config-volume") pod "b0f2bc77-d38f-42b6-a789-38843e17bcbb" (UID: "b0f2bc77-d38f-42b6-a789-38843e17bcbb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:00:04 crc kubenswrapper[4825]: I0310 07:00:04.013599 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0f2bc77-d38f-42b6-a789-38843e17bcbb-kube-api-access-hlvpp" (OuterVolumeSpecName: "kube-api-access-hlvpp") pod "b0f2bc77-d38f-42b6-a789-38843e17bcbb" (UID: "b0f2bc77-d38f-42b6-a789-38843e17bcbb"). InnerVolumeSpecName "kube-api-access-hlvpp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:00:04 crc kubenswrapper[4825]: I0310 07:00:04.014705 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0f2bc77-d38f-42b6-a789-38843e17bcbb-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b0f2bc77-d38f-42b6-a789-38843e17bcbb" (UID: "b0f2bc77-d38f-42b6-a789-38843e17bcbb"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:00:04 crc kubenswrapper[4825]: I0310 07:00:04.109601 4825 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b0f2bc77-d38f-42b6-a789-38843e17bcbb-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 07:00:04 crc kubenswrapper[4825]: I0310 07:00:04.109660 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlvpp\" (UniqueName: \"kubernetes.io/projected/b0f2bc77-d38f-42b6-a789-38843e17bcbb-kube-api-access-hlvpp\") on node \"crc\" DevicePath \"\"" Mar 10 07:00:04 crc kubenswrapper[4825]: I0310 07:00:04.109673 4825 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b0f2bc77-d38f-42b6-a789-38843e17bcbb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 07:00:04 crc kubenswrapper[4825]: I0310 07:00:04.658717 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552100-5wl6j" Mar 10 07:00:04 crc kubenswrapper[4825]: I0310 07:00:04.663739 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552100-5wl6j" event={"ID":"b0f2bc77-d38f-42b6-a789-38843e17bcbb","Type":"ContainerDied","Data":"63e57d7eb4fc75f1130281b18ed6b36a0dae9fd17d4e8d845468b99d5e9ef505"} Mar 10 07:00:04 crc kubenswrapper[4825]: I0310 07:00:04.663986 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63e57d7eb4fc75f1130281b18ed6b36a0dae9fd17d4e8d845468b99d5e9ef505" Mar 10 07:00:05 crc kubenswrapper[4825]: I0310 07:00:05.668022 4825 generic.go:334] "Generic (PLEG): container finished" podID="e69e652a-0413-42a4-9fea-d20d094111ac" containerID="801f701059c4cc548a98f3a4ba85c8a061b2d540f0fc8220d4041016633e9a0a" exitCode=0 Mar 10 07:00:05 crc kubenswrapper[4825]: I0310 07:00:05.668546 4825 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552100-n59dv" event={"ID":"e69e652a-0413-42a4-9fea-d20d094111ac","Type":"ContainerDied","Data":"801f701059c4cc548a98f3a4ba85c8a061b2d540f0fc8220d4041016633e9a0a"} Mar 10 07:00:06 crc kubenswrapper[4825]: I0310 07:00:06.897317 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552100-n59dv" Mar 10 07:00:06 crc kubenswrapper[4825]: I0310 07:00:06.949733 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqdrw\" (UniqueName: \"kubernetes.io/projected/e69e652a-0413-42a4-9fea-d20d094111ac-kube-api-access-mqdrw\") pod \"e69e652a-0413-42a4-9fea-d20d094111ac\" (UID: \"e69e652a-0413-42a4-9fea-d20d094111ac\") " Mar 10 07:00:06 crc kubenswrapper[4825]: I0310 07:00:06.959350 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e69e652a-0413-42a4-9fea-d20d094111ac-kube-api-access-mqdrw" (OuterVolumeSpecName: "kube-api-access-mqdrw") pod "e69e652a-0413-42a4-9fea-d20d094111ac" (UID: "e69e652a-0413-42a4-9fea-d20d094111ac"). InnerVolumeSpecName "kube-api-access-mqdrw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:00:07 crc kubenswrapper[4825]: I0310 07:00:07.050855 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqdrw\" (UniqueName: \"kubernetes.io/projected/e69e652a-0413-42a4-9fea-d20d094111ac-kube-api-access-mqdrw\") on node \"crc\" DevicePath \"\"" Mar 10 07:00:07 crc kubenswrapper[4825]: I0310 07:00:07.682205 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552100-n59dv" event={"ID":"e69e652a-0413-42a4-9fea-d20d094111ac","Type":"ContainerDied","Data":"2082659d362d0d45f81bc87e8b26d8db92c1a7d38ba9b006f8521a00f2028a31"} Mar 10 07:00:07 crc kubenswrapper[4825]: I0310 07:00:07.682472 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2082659d362d0d45f81bc87e8b26d8db92c1a7d38ba9b006f8521a00f2028a31" Mar 10 07:00:07 crc kubenswrapper[4825]: I0310 07:00:07.682300 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552100-n59dv" Mar 10 07:00:07 crc kubenswrapper[4825]: I0310 07:00:07.946256 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552094-2nwbl"] Mar 10 07:00:07 crc kubenswrapper[4825]: I0310 07:00:07.953293 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552094-2nwbl"] Mar 10 07:00:09 crc kubenswrapper[4825]: I0310 07:00:09.246280 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39b2113e-4a4f-4124-87f7-8439cb92af77" path="/var/lib/kubelet/pods/39b2113e-4a4f-4124-87f7-8439cb92af77/volumes" Mar 10 07:00:18 crc kubenswrapper[4825]: I0310 07:00:18.518285 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-74cf7b6d9d-m2z87" Mar 10 07:00:29 crc kubenswrapper[4825]: I0310 07:00:29.882325 4825 scope.go:117] "RemoveContainer" 
containerID="4121a0db8fe028b8d9c0ab22f7649791664ecd76901f0f382df50b36c0ab26ea" Mar 10 07:00:38 crc kubenswrapper[4825]: I0310 07:00:38.134474 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5488d4f4f7-kjl8v" Mar 10 07:00:38 crc kubenswrapper[4825]: I0310 07:00:38.886975 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-lrjtj"] Mar 10 07:00:38 crc kubenswrapper[4825]: E0310 07:00:38.887267 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0f2bc77-d38f-42b6-a789-38843e17bcbb" containerName="collect-profiles" Mar 10 07:00:38 crc kubenswrapper[4825]: I0310 07:00:38.887288 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0f2bc77-d38f-42b6-a789-38843e17bcbb" containerName="collect-profiles" Mar 10 07:00:38 crc kubenswrapper[4825]: E0310 07:00:38.887303 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e69e652a-0413-42a4-9fea-d20d094111ac" containerName="oc" Mar 10 07:00:38 crc kubenswrapper[4825]: I0310 07:00:38.887312 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="e69e652a-0413-42a4-9fea-d20d094111ac" containerName="oc" Mar 10 07:00:38 crc kubenswrapper[4825]: I0310 07:00:38.887433 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="e69e652a-0413-42a4-9fea-d20d094111ac" containerName="oc" Mar 10 07:00:38 crc kubenswrapper[4825]: I0310 07:00:38.887455 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0f2bc77-d38f-42b6-a789-38843e17bcbb" containerName="collect-profiles" Mar 10 07:00:38 crc kubenswrapper[4825]: I0310 07:00:38.889705 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-lrjtj" Mar 10 07:00:38 crc kubenswrapper[4825]: I0310 07:00:38.891904 4825 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-vtj88" Mar 10 07:00:38 crc kubenswrapper[4825]: I0310 07:00:38.892297 4825 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 10 07:00:38 crc kubenswrapper[4825]: I0310 07:00:38.892331 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 10 07:00:38 crc kubenswrapper[4825]: I0310 07:00:38.901630 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-v6sfm"] Mar 10 07:00:38 crc kubenswrapper[4825]: I0310 07:00:38.904632 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-v6sfm" Mar 10 07:00:38 crc kubenswrapper[4825]: I0310 07:00:38.907812 4825 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 10 07:00:38 crc kubenswrapper[4825]: I0310 07:00:38.915962 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-v6sfm"] Mar 10 07:00:38 crc kubenswrapper[4825]: I0310 07:00:38.934849 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/876b5cff-b150-487c-8f79-752c845a44da-frr-sockets\") pod \"frr-k8s-lrjtj\" (UID: \"876b5cff-b150-487c-8f79-752c845a44da\") " pod="metallb-system/frr-k8s-lrjtj" Mar 10 07:00:38 crc kubenswrapper[4825]: I0310 07:00:38.934929 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/876b5cff-b150-487c-8f79-752c845a44da-frr-startup\") pod \"frr-k8s-lrjtj\" (UID: 
\"876b5cff-b150-487c-8f79-752c845a44da\") " pod="metallb-system/frr-k8s-lrjtj" Mar 10 07:00:38 crc kubenswrapper[4825]: I0310 07:00:38.934999 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/876b5cff-b150-487c-8f79-752c845a44da-metrics-certs\") pod \"frr-k8s-lrjtj\" (UID: \"876b5cff-b150-487c-8f79-752c845a44da\") " pod="metallb-system/frr-k8s-lrjtj" Mar 10 07:00:38 crc kubenswrapper[4825]: I0310 07:00:38.935053 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw8rj\" (UniqueName: \"kubernetes.io/projected/0f74a263-0c43-4b78-8d8e-6a3b66166658-kube-api-access-xw8rj\") pod \"frr-k8s-webhook-server-7f989f654f-v6sfm\" (UID: \"0f74a263-0c43-4b78-8d8e-6a3b66166658\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-v6sfm" Mar 10 07:00:38 crc kubenswrapper[4825]: I0310 07:00:38.935096 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/876b5cff-b150-487c-8f79-752c845a44da-frr-conf\") pod \"frr-k8s-lrjtj\" (UID: \"876b5cff-b150-487c-8f79-752c845a44da\") " pod="metallb-system/frr-k8s-lrjtj" Mar 10 07:00:38 crc kubenswrapper[4825]: I0310 07:00:38.935164 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/876b5cff-b150-487c-8f79-752c845a44da-metrics\") pod \"frr-k8s-lrjtj\" (UID: \"876b5cff-b150-487c-8f79-752c845a44da\") " pod="metallb-system/frr-k8s-lrjtj" Mar 10 07:00:38 crc kubenswrapper[4825]: I0310 07:00:38.935200 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/876b5cff-b150-487c-8f79-752c845a44da-reloader\") pod \"frr-k8s-lrjtj\" (UID: \"876b5cff-b150-487c-8f79-752c845a44da\") " 
pod="metallb-system/frr-k8s-lrjtj" Mar 10 07:00:38 crc kubenswrapper[4825]: I0310 07:00:38.935251 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f74a263-0c43-4b78-8d8e-6a3b66166658-cert\") pod \"frr-k8s-webhook-server-7f989f654f-v6sfm\" (UID: \"0f74a263-0c43-4b78-8d8e-6a3b66166658\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-v6sfm" Mar 10 07:00:38 crc kubenswrapper[4825]: I0310 07:00:38.935290 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgkwq\" (UniqueName: \"kubernetes.io/projected/876b5cff-b150-487c-8f79-752c845a44da-kube-api-access-tgkwq\") pod \"frr-k8s-lrjtj\" (UID: \"876b5cff-b150-487c-8f79-752c845a44da\") " pod="metallb-system/frr-k8s-lrjtj" Mar 10 07:00:38 crc kubenswrapper[4825]: I0310 07:00:38.988222 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-tsj92"] Mar 10 07:00:38 crc kubenswrapper[4825]: I0310 07:00:38.989119 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-tsj92" Mar 10 07:00:38 crc kubenswrapper[4825]: I0310 07:00:38.992061 4825 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 10 07:00:38 crc kubenswrapper[4825]: I0310 07:00:38.993056 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 10 07:00:38 crc kubenswrapper[4825]: I0310 07:00:38.993371 4825 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-l66gj" Mar 10 07:00:38 crc kubenswrapper[4825]: I0310 07:00:38.995773 4825 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.018784 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-86ddb6bd46-sb9dl"] Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.020243 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-sb9dl" Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.023661 4825 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.036331 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw8rj\" (UniqueName: \"kubernetes.io/projected/0f74a263-0c43-4b78-8d8e-6a3b66166658-kube-api-access-xw8rj\") pod \"frr-k8s-webhook-server-7f989f654f-v6sfm\" (UID: \"0f74a263-0c43-4b78-8d8e-6a3b66166658\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-v6sfm" Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.036394 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/876b5cff-b150-487c-8f79-752c845a44da-frr-conf\") pod \"frr-k8s-lrjtj\" (UID: \"876b5cff-b150-487c-8f79-752c845a44da\") " pod="metallb-system/frr-k8s-lrjtj" Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.036437 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/876b5cff-b150-487c-8f79-752c845a44da-metrics\") pod \"frr-k8s-lrjtj\" (UID: \"876b5cff-b150-487c-8f79-752c845a44da\") " pod="metallb-system/frr-k8s-lrjtj" Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.036468 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/876b5cff-b150-487c-8f79-752c845a44da-reloader\") pod \"frr-k8s-lrjtj\" (UID: \"876b5cff-b150-487c-8f79-752c845a44da\") " pod="metallb-system/frr-k8s-lrjtj" Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.036509 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f74a263-0c43-4b78-8d8e-6a3b66166658-cert\") pod 
\"frr-k8s-webhook-server-7f989f654f-v6sfm\" (UID: \"0f74a263-0c43-4b78-8d8e-6a3b66166658\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-v6sfm" Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.036539 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgkwq\" (UniqueName: \"kubernetes.io/projected/876b5cff-b150-487c-8f79-752c845a44da-kube-api-access-tgkwq\") pod \"frr-k8s-lrjtj\" (UID: \"876b5cff-b150-487c-8f79-752c845a44da\") " pod="metallb-system/frr-k8s-lrjtj" Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.036579 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j49wf\" (UniqueName: \"kubernetes.io/projected/f9daa544-4d9f-4106-a729-d330dc8b6cc3-kube-api-access-j49wf\") pod \"speaker-tsj92\" (UID: \"f9daa544-4d9f-4106-a729-d330dc8b6cc3\") " pod="metallb-system/speaker-tsj92" Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.036616 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f9daa544-4d9f-4106-a729-d330dc8b6cc3-metrics-certs\") pod \"speaker-tsj92\" (UID: \"f9daa544-4d9f-4106-a729-d330dc8b6cc3\") " pod="metallb-system/speaker-tsj92" Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.036649 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/876b5cff-b150-487c-8f79-752c845a44da-frr-sockets\") pod \"frr-k8s-lrjtj\" (UID: \"876b5cff-b150-487c-8f79-752c845a44da\") " pod="metallb-system/frr-k8s-lrjtj" Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.036686 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/876b5cff-b150-487c-8f79-752c845a44da-frr-startup\") pod \"frr-k8s-lrjtj\" (UID: \"876b5cff-b150-487c-8f79-752c845a44da\") " 
pod="metallb-system/frr-k8s-lrjtj" Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.036721 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f9daa544-4d9f-4106-a729-d330dc8b6cc3-metallb-excludel2\") pod \"speaker-tsj92\" (UID: \"f9daa544-4d9f-4106-a729-d330dc8b6cc3\") " pod="metallb-system/speaker-tsj92" Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.036755 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f9daa544-4d9f-4106-a729-d330dc8b6cc3-memberlist\") pod \"speaker-tsj92\" (UID: \"f9daa544-4d9f-4106-a729-d330dc8b6cc3\") " pod="metallb-system/speaker-tsj92" Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.036789 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/876b5cff-b150-487c-8f79-752c845a44da-metrics-certs\") pod \"frr-k8s-lrjtj\" (UID: \"876b5cff-b150-487c-8f79-752c845a44da\") " pod="metallb-system/frr-k8s-lrjtj" Mar 10 07:00:39 crc kubenswrapper[4825]: E0310 07:00:39.036958 4825 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Mar 10 07:00:39 crc kubenswrapper[4825]: E0310 07:00:39.037026 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/876b5cff-b150-487c-8f79-752c845a44da-metrics-certs podName:876b5cff-b150-487c-8f79-752c845a44da nodeName:}" failed. No retries permitted until 2026-03-10 07:00:39.537005056 +0000 UTC m=+992.566785671 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/876b5cff-b150-487c-8f79-752c845a44da-metrics-certs") pod "frr-k8s-lrjtj" (UID: "876b5cff-b150-487c-8f79-752c845a44da") : secret "frr-k8s-certs-secret" not found Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.037692 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/876b5cff-b150-487c-8f79-752c845a44da-reloader\") pod \"frr-k8s-lrjtj\" (UID: \"876b5cff-b150-487c-8f79-752c845a44da\") " pod="metallb-system/frr-k8s-lrjtj" Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.037890 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/876b5cff-b150-487c-8f79-752c845a44da-frr-conf\") pod \"frr-k8s-lrjtj\" (UID: \"876b5cff-b150-487c-8f79-752c845a44da\") " pod="metallb-system/frr-k8s-lrjtj" Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.038486 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/876b5cff-b150-487c-8f79-752c845a44da-frr-startup\") pod \"frr-k8s-lrjtj\" (UID: \"876b5cff-b150-487c-8f79-752c845a44da\") " pod="metallb-system/frr-k8s-lrjtj" Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.041562 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/876b5cff-b150-487c-8f79-752c845a44da-metrics\") pod \"frr-k8s-lrjtj\" (UID: \"876b5cff-b150-487c-8f79-752c845a44da\") " pod="metallb-system/frr-k8s-lrjtj" Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.042180 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/876b5cff-b150-487c-8f79-752c845a44da-frr-sockets\") pod \"frr-k8s-lrjtj\" (UID: \"876b5cff-b150-487c-8f79-752c845a44da\") " pod="metallb-system/frr-k8s-lrjtj" Mar 10 
07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.055998 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f74a263-0c43-4b78-8d8e-6a3b66166658-cert\") pod \"frr-k8s-webhook-server-7f989f654f-v6sfm\" (UID: \"0f74a263-0c43-4b78-8d8e-6a3b66166658\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-v6sfm" Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.058074 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-sb9dl"] Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.067813 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw8rj\" (UniqueName: \"kubernetes.io/projected/0f74a263-0c43-4b78-8d8e-6a3b66166658-kube-api-access-xw8rj\") pod \"frr-k8s-webhook-server-7f989f654f-v6sfm\" (UID: \"0f74a263-0c43-4b78-8d8e-6a3b66166658\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-v6sfm" Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.081836 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgkwq\" (UniqueName: \"kubernetes.io/projected/876b5cff-b150-487c-8f79-752c845a44da-kube-api-access-tgkwq\") pod \"frr-k8s-lrjtj\" (UID: \"876b5cff-b150-487c-8f79-752c845a44da\") " pod="metallb-system/frr-k8s-lrjtj" Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.138109 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3a458e6-d3de-498f-83c2-215eb477a030-cert\") pod \"controller-86ddb6bd46-sb9dl\" (UID: \"e3a458e6-d3de-498f-83c2-215eb477a030\") " pod="metallb-system/controller-86ddb6bd46-sb9dl" Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.138181 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f9daa544-4d9f-4106-a729-d330dc8b6cc3-metallb-excludel2\") pod 
\"speaker-tsj92\" (UID: \"f9daa544-4d9f-4106-a729-d330dc8b6cc3\") " pod="metallb-system/speaker-tsj92" Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.138201 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqpll\" (UniqueName: \"kubernetes.io/projected/e3a458e6-d3de-498f-83c2-215eb477a030-kube-api-access-hqpll\") pod \"controller-86ddb6bd46-sb9dl\" (UID: \"e3a458e6-d3de-498f-83c2-215eb477a030\") " pod="metallb-system/controller-86ddb6bd46-sb9dl" Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.138218 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f9daa544-4d9f-4106-a729-d330dc8b6cc3-memberlist\") pod \"speaker-tsj92\" (UID: \"f9daa544-4d9f-4106-a729-d330dc8b6cc3\") " pod="metallb-system/speaker-tsj92" Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.138273 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3a458e6-d3de-498f-83c2-215eb477a030-metrics-certs\") pod \"controller-86ddb6bd46-sb9dl\" (UID: \"e3a458e6-d3de-498f-83c2-215eb477a030\") " pod="metallb-system/controller-86ddb6bd46-sb9dl" Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.138312 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j49wf\" (UniqueName: \"kubernetes.io/projected/f9daa544-4d9f-4106-a729-d330dc8b6cc3-kube-api-access-j49wf\") pod \"speaker-tsj92\" (UID: \"f9daa544-4d9f-4106-a729-d330dc8b6cc3\") " pod="metallb-system/speaker-tsj92" Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.138347 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f9daa544-4d9f-4106-a729-d330dc8b6cc3-metrics-certs\") pod \"speaker-tsj92\" (UID: \"f9daa544-4d9f-4106-a729-d330dc8b6cc3\") " 
pod="metallb-system/speaker-tsj92" Mar 10 07:00:39 crc kubenswrapper[4825]: E0310 07:00:39.138928 4825 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 10 07:00:39 crc kubenswrapper[4825]: E0310 07:00:39.139044 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9daa544-4d9f-4106-a729-d330dc8b6cc3-memberlist podName:f9daa544-4d9f-4106-a729-d330dc8b6cc3 nodeName:}" failed. No retries permitted until 2026-03-10 07:00:39.639028936 +0000 UTC m=+992.668809551 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/f9daa544-4d9f-4106-a729-d330dc8b6cc3-memberlist") pod "speaker-tsj92" (UID: "f9daa544-4d9f-4106-a729-d330dc8b6cc3") : secret "metallb-memberlist" not found Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.139412 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f9daa544-4d9f-4106-a729-d330dc8b6cc3-metallb-excludel2\") pod \"speaker-tsj92\" (UID: \"f9daa544-4d9f-4106-a729-d330dc8b6cc3\") " pod="metallb-system/speaker-tsj92" Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.142456 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f9daa544-4d9f-4106-a729-d330dc8b6cc3-metrics-certs\") pod \"speaker-tsj92\" (UID: \"f9daa544-4d9f-4106-a729-d330dc8b6cc3\") " pod="metallb-system/speaker-tsj92" Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.163071 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j49wf\" (UniqueName: \"kubernetes.io/projected/f9daa544-4d9f-4106-a729-d330dc8b6cc3-kube-api-access-j49wf\") pod \"speaker-tsj92\" (UID: \"f9daa544-4d9f-4106-a729-d330dc8b6cc3\") " pod="metallb-system/speaker-tsj92" Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.229480 4825 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-v6sfm" Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.239366 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3a458e6-d3de-498f-83c2-215eb477a030-cert\") pod \"controller-86ddb6bd46-sb9dl\" (UID: \"e3a458e6-d3de-498f-83c2-215eb477a030\") " pod="metallb-system/controller-86ddb6bd46-sb9dl" Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.239424 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqpll\" (UniqueName: \"kubernetes.io/projected/e3a458e6-d3de-498f-83c2-215eb477a030-kube-api-access-hqpll\") pod \"controller-86ddb6bd46-sb9dl\" (UID: \"e3a458e6-d3de-498f-83c2-215eb477a030\") " pod="metallb-system/controller-86ddb6bd46-sb9dl" Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.239492 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3a458e6-d3de-498f-83c2-215eb477a030-metrics-certs\") pod \"controller-86ddb6bd46-sb9dl\" (UID: \"e3a458e6-d3de-498f-83c2-215eb477a030\") " pod="metallb-system/controller-86ddb6bd46-sb9dl" Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.244710 4825 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.244876 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3a458e6-d3de-498f-83c2-215eb477a030-metrics-certs\") pod \"controller-86ddb6bd46-sb9dl\" (UID: \"e3a458e6-d3de-498f-83c2-215eb477a030\") " pod="metallb-system/controller-86ddb6bd46-sb9dl" Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.254745 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/e3a458e6-d3de-498f-83c2-215eb477a030-cert\") pod \"controller-86ddb6bd46-sb9dl\" (UID: \"e3a458e6-d3de-498f-83c2-215eb477a030\") " pod="metallb-system/controller-86ddb6bd46-sb9dl" Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.256850 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqpll\" (UniqueName: \"kubernetes.io/projected/e3a458e6-d3de-498f-83c2-215eb477a030-kube-api-access-hqpll\") pod \"controller-86ddb6bd46-sb9dl\" (UID: \"e3a458e6-d3de-498f-83c2-215eb477a030\") " pod="metallb-system/controller-86ddb6bd46-sb9dl" Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.333387 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-86ddb6bd46-sb9dl" Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.504626 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-v6sfm"] Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.543268 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/876b5cff-b150-487c-8f79-752c845a44da-metrics-certs\") pod \"frr-k8s-lrjtj\" (UID: \"876b5cff-b150-487c-8f79-752c845a44da\") " pod="metallb-system/frr-k8s-lrjtj" Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.548895 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/876b5cff-b150-487c-8f79-752c845a44da-metrics-certs\") pod \"frr-k8s-lrjtj\" (UID: \"876b5cff-b150-487c-8f79-752c845a44da\") " pod="metallb-system/frr-k8s-lrjtj" Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.558729 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-sb9dl"] Mar 10 07:00:39 crc kubenswrapper[4825]: W0310 07:00:39.571283 4825 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3a458e6_d3de_498f_83c2_215eb477a030.slice/crio-a50b35b1929dd80382ebaa9c91b3f180c9c17c8a6fdc8b2fcdc80930a869c286 WatchSource:0}: Error finding container a50b35b1929dd80382ebaa9c91b3f180c9c17c8a6fdc8b2fcdc80930a869c286: Status 404 returned error can't find the container with id a50b35b1929dd80382ebaa9c91b3f180c9c17c8a6fdc8b2fcdc80930a869c286 Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.644295 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f9daa544-4d9f-4106-a729-d330dc8b6cc3-memberlist\") pod \"speaker-tsj92\" (UID: \"f9daa544-4d9f-4106-a729-d330dc8b6cc3\") " pod="metallb-system/speaker-tsj92" Mar 10 07:00:39 crc kubenswrapper[4825]: E0310 07:00:39.644490 4825 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 10 07:00:39 crc kubenswrapper[4825]: E0310 07:00:39.644577 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9daa544-4d9f-4106-a729-d330dc8b6cc3-memberlist podName:f9daa544-4d9f-4106-a729-d330dc8b6cc3 nodeName:}" failed. No retries permitted until 2026-03-10 07:00:40.644556595 +0000 UTC m=+993.674337220 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/f9daa544-4d9f-4106-a729-d330dc8b6cc3-memberlist") pod "speaker-tsj92" (UID: "f9daa544-4d9f-4106-a729-d330dc8b6cc3") : secret "metallb-memberlist" not found Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.818521 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-lrjtj" Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.937924 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-v6sfm" event={"ID":"0f74a263-0c43-4b78-8d8e-6a3b66166658","Type":"ContainerStarted","Data":"37716157418fb3fd79f1faf1aa981738731040f954d2fe522d0d48ee001fc70d"} Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.939826 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-sb9dl" event={"ID":"e3a458e6-d3de-498f-83c2-215eb477a030","Type":"ContainerStarted","Data":"085bc920994781f37c52d559b034566f32c034cae8bf21a3709f638a23e36608"} Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.940349 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-86ddb6bd46-sb9dl" Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.940368 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-sb9dl" event={"ID":"e3a458e6-d3de-498f-83c2-215eb477a030","Type":"ContainerStarted","Data":"40537134c13964ff1027ff34f173cb20b83130d02a3ce56240caf3797f7fd5ba"} Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.940381 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-sb9dl" event={"ID":"e3a458e6-d3de-498f-83c2-215eb477a030","Type":"ContainerStarted","Data":"a50b35b1929dd80382ebaa9c91b3f180c9c17c8a6fdc8b2fcdc80930a869c286"} Mar 10 07:00:39 crc kubenswrapper[4825]: I0310 07:00:39.959704 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-86ddb6bd46-sb9dl" podStartSLOduration=1.95968181 podStartE2EDuration="1.95968181s" podCreationTimestamp="2026-03-10 07:00:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:00:39.95471978 +0000 UTC 
m=+992.984500405" watchObservedRunningTime="2026-03-10 07:00:39.95968181 +0000 UTC m=+992.989462425" Mar 10 07:00:40 crc kubenswrapper[4825]: I0310 07:00:40.668519 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f9daa544-4d9f-4106-a729-d330dc8b6cc3-memberlist\") pod \"speaker-tsj92\" (UID: \"f9daa544-4d9f-4106-a729-d330dc8b6cc3\") " pod="metallb-system/speaker-tsj92" Mar 10 07:00:40 crc kubenswrapper[4825]: I0310 07:00:40.688475 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f9daa544-4d9f-4106-a729-d330dc8b6cc3-memberlist\") pod \"speaker-tsj92\" (UID: \"f9daa544-4d9f-4106-a729-d330dc8b6cc3\") " pod="metallb-system/speaker-tsj92" Mar 10 07:00:40 crc kubenswrapper[4825]: I0310 07:00:40.803494 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-tsj92" Mar 10 07:00:40 crc kubenswrapper[4825]: W0310 07:00:40.860044 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9daa544_4d9f_4106_a729_d330dc8b6cc3.slice/crio-30565fc35a3c226c37710de4cb0c896913e28db30f64e87e78a2504509716938 WatchSource:0}: Error finding container 30565fc35a3c226c37710de4cb0c896913e28db30f64e87e78a2504509716938: Status 404 returned error can't find the container with id 30565fc35a3c226c37710de4cb0c896913e28db30f64e87e78a2504509716938 Mar 10 07:00:40 crc kubenswrapper[4825]: I0310 07:00:40.952193 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-tsj92" event={"ID":"f9daa544-4d9f-4106-a729-d330dc8b6cc3","Type":"ContainerStarted","Data":"30565fc35a3c226c37710de4cb0c896913e28db30f64e87e78a2504509716938"} Mar 10 07:00:40 crc kubenswrapper[4825]: I0310 07:00:40.953453 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lrjtj" 
event={"ID":"876b5cff-b150-487c-8f79-752c845a44da","Type":"ContainerStarted","Data":"e28e648c523267aa1fb2c6098cbef32c91e6f93f06c42d813d0bae6df51cd54d"} Mar 10 07:00:41 crc kubenswrapper[4825]: I0310 07:00:41.965959 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-tsj92" event={"ID":"f9daa544-4d9f-4106-a729-d330dc8b6cc3","Type":"ContainerStarted","Data":"f86c79b8c0d774872d1306c92036396c3f72e2083e32c7a9aca2db47f96c84c3"} Mar 10 07:00:41 crc kubenswrapper[4825]: I0310 07:00:41.966267 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-tsj92" event={"ID":"f9daa544-4d9f-4106-a729-d330dc8b6cc3","Type":"ContainerStarted","Data":"b6555c2f09d5105ee012d6ae4d78600ef6e957ee14cf3acca3c3ec69663d0dcc"} Mar 10 07:00:41 crc kubenswrapper[4825]: I0310 07:00:41.966471 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-tsj92" Mar 10 07:00:41 crc kubenswrapper[4825]: I0310 07:00:41.989458 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-tsj92" podStartSLOduration=3.989438337 podStartE2EDuration="3.989438337s" podCreationTimestamp="2026-03-10 07:00:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:00:41.988451181 +0000 UTC m=+995.018231806" watchObservedRunningTime="2026-03-10 07:00:41.989438337 +0000 UTC m=+995.019218952" Mar 10 07:00:47 crc kubenswrapper[4825]: I0310 07:00:47.014285 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-v6sfm" event={"ID":"0f74a263-0c43-4b78-8d8e-6a3b66166658","Type":"ContainerStarted","Data":"e7e4f94db47e01b97c946398392eb88347bd6bee1a3350b42d1cada14e0a5cfc"} Mar 10 07:00:47 crc kubenswrapper[4825]: I0310 07:00:47.015193 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/frr-k8s-webhook-server-7f989f654f-v6sfm" Mar 10 07:00:47 crc kubenswrapper[4825]: I0310 07:00:47.016834 4825 generic.go:334] "Generic (PLEG): container finished" podID="876b5cff-b150-487c-8f79-752c845a44da" containerID="849ea01631c287c3971d79d65f25d929782c42d1edf621e63e4624e162fec6ee" exitCode=0 Mar 10 07:00:47 crc kubenswrapper[4825]: I0310 07:00:47.016874 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lrjtj" event={"ID":"876b5cff-b150-487c-8f79-752c845a44da","Type":"ContainerDied","Data":"849ea01631c287c3971d79d65f25d929782c42d1edf621e63e4624e162fec6ee"} Mar 10 07:00:47 crc kubenswrapper[4825]: I0310 07:00:47.053157 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-v6sfm" podStartSLOduration=1.984766164 podStartE2EDuration="9.053104207s" podCreationTimestamp="2026-03-10 07:00:38 +0000 UTC" firstStartedPulling="2026-03-10 07:00:39.520549252 +0000 UTC m=+992.550329877" lastFinishedPulling="2026-03-10 07:00:46.588887295 +0000 UTC m=+999.618667920" observedRunningTime="2026-03-10 07:00:47.05130822 +0000 UTC m=+1000.081088855" watchObservedRunningTime="2026-03-10 07:00:47.053104207 +0000 UTC m=+1000.082884862" Mar 10 07:00:48 crc kubenswrapper[4825]: I0310 07:00:48.029730 4825 generic.go:334] "Generic (PLEG): container finished" podID="876b5cff-b150-487c-8f79-752c845a44da" containerID="2b2a0d94b4761fbf5d6d1e9042aaeb49b044130baae6f9838e01d4de19a5e272" exitCode=0 Mar 10 07:00:48 crc kubenswrapper[4825]: I0310 07:00:48.029862 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lrjtj" event={"ID":"876b5cff-b150-487c-8f79-752c845a44da","Type":"ContainerDied","Data":"2b2a0d94b4761fbf5d6d1e9042aaeb49b044130baae6f9838e01d4de19a5e272"} Mar 10 07:00:49 crc kubenswrapper[4825]: I0310 07:00:49.044819 4825 generic.go:334] "Generic (PLEG): container finished" podID="876b5cff-b150-487c-8f79-752c845a44da" 
containerID="c22bcd00658d09483ba4bb11b62b6c08f0d5379e9bd306feec2745faa8cffad9" exitCode=0 Mar 10 07:00:49 crc kubenswrapper[4825]: I0310 07:00:49.044883 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lrjtj" event={"ID":"876b5cff-b150-487c-8f79-752c845a44da","Type":"ContainerDied","Data":"c22bcd00658d09483ba4bb11b62b6c08f0d5379e9bd306feec2745faa8cffad9"} Mar 10 07:00:49 crc kubenswrapper[4825]: I0310 07:00:49.345315 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-86ddb6bd46-sb9dl" Mar 10 07:00:50 crc kubenswrapper[4825]: I0310 07:00:50.058539 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lrjtj" event={"ID":"876b5cff-b150-487c-8f79-752c845a44da","Type":"ContainerStarted","Data":"c59be39ab940f18a5270f35fcf6bf4ec0aa954138b3d307a09d77246a7d29067"} Mar 10 07:00:50 crc kubenswrapper[4825]: I0310 07:00:50.058950 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lrjtj" event={"ID":"876b5cff-b150-487c-8f79-752c845a44da","Type":"ContainerStarted","Data":"e0bf848b47f06e6efb10bfebd05603f5b98781c2f2963e7affffae21df591bed"} Mar 10 07:00:50 crc kubenswrapper[4825]: I0310 07:00:50.058966 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lrjtj" event={"ID":"876b5cff-b150-487c-8f79-752c845a44da","Type":"ContainerStarted","Data":"ab2503fbc89abcef343c920a1b4d64d9774a0823c301035cc94d933c27257da8"} Mar 10 07:00:50 crc kubenswrapper[4825]: I0310 07:00:50.058980 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lrjtj" event={"ID":"876b5cff-b150-487c-8f79-752c845a44da","Type":"ContainerStarted","Data":"8c158cd856888fc775a3df0f1c8ed4bbf3b5c414ec30011c2fe4b0e3d0a5ff09"} Mar 10 07:00:50 crc kubenswrapper[4825]: I0310 07:00:50.058989 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lrjtj" 
event={"ID":"876b5cff-b150-487c-8f79-752c845a44da","Type":"ContainerStarted","Data":"87f74dd491c82b794817aa51d28a188c94774c70d0b33b0d4d02725ca49010c5"} Mar 10 07:00:51 crc kubenswrapper[4825]: I0310 07:00:51.069639 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lrjtj" event={"ID":"876b5cff-b150-487c-8f79-752c845a44da","Type":"ContainerStarted","Data":"d0e877c008f28e05498f4367d47da9e57e223a7ead33824e42f9fa5f4a1fe041"} Mar 10 07:00:51 crc kubenswrapper[4825]: I0310 07:00:51.069899 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-lrjtj" Mar 10 07:00:51 crc kubenswrapper[4825]: I0310 07:00:51.105531 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-lrjtj" podStartSLOduration=6.490096118 podStartE2EDuration="13.105502894s" podCreationTimestamp="2026-03-10 07:00:38 +0000 UTC" firstStartedPulling="2026-03-10 07:00:39.964625229 +0000 UTC m=+992.994405844" lastFinishedPulling="2026-03-10 07:00:46.580031995 +0000 UTC m=+999.609812620" observedRunningTime="2026-03-10 07:00:51.097972167 +0000 UTC m=+1004.127752822" watchObservedRunningTime="2026-03-10 07:00:51.105502894 +0000 UTC m=+1004.135283549" Mar 10 07:00:54 crc kubenswrapper[4825]: I0310 07:00:54.818899 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-lrjtj" Mar 10 07:00:54 crc kubenswrapper[4825]: I0310 07:00:54.871828 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-lrjtj" Mar 10 07:00:59 crc kubenswrapper[4825]: I0310 07:00:59.249493 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-v6sfm" Mar 10 07:00:59 crc kubenswrapper[4825]: I0310 07:00:59.823213 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-lrjtj" Mar 10 07:01:00 crc kubenswrapper[4825]: 
I0310 07:01:00.810382 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-tsj92" Mar 10 07:01:02 crc kubenswrapper[4825]: I0310 07:01:02.216098 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5zw7qp"] Mar 10 07:01:02 crc kubenswrapper[4825]: I0310 07:01:02.217434 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5zw7qp" Mar 10 07:01:02 crc kubenswrapper[4825]: I0310 07:01:02.220500 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 10 07:01:02 crc kubenswrapper[4825]: I0310 07:01:02.231849 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5zw7qp"] Mar 10 07:01:02 crc kubenswrapper[4825]: I0310 07:01:02.299803 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/91c44c21-0e53-4940-a438-d4e4761d50e0-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5zw7qp\" (UID: \"91c44c21-0e53-4940-a438-d4e4761d50e0\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5zw7qp" Mar 10 07:01:02 crc kubenswrapper[4825]: I0310 07:01:02.299959 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzknw\" (UniqueName: \"kubernetes.io/projected/91c44c21-0e53-4940-a438-d4e4761d50e0-kube-api-access-dzknw\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5zw7qp\" (UID: \"91c44c21-0e53-4940-a438-d4e4761d50e0\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5zw7qp" Mar 10 07:01:02 crc kubenswrapper[4825]: I0310 07:01:02.299991 
4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/91c44c21-0e53-4940-a438-d4e4761d50e0-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5zw7qp\" (UID: \"91c44c21-0e53-4940-a438-d4e4761d50e0\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5zw7qp" Mar 10 07:01:02 crc kubenswrapper[4825]: I0310 07:01:02.400757 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzknw\" (UniqueName: \"kubernetes.io/projected/91c44c21-0e53-4940-a438-d4e4761d50e0-kube-api-access-dzknw\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5zw7qp\" (UID: \"91c44c21-0e53-4940-a438-d4e4761d50e0\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5zw7qp" Mar 10 07:01:02 crc kubenswrapper[4825]: I0310 07:01:02.400811 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/91c44c21-0e53-4940-a438-d4e4761d50e0-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5zw7qp\" (UID: \"91c44c21-0e53-4940-a438-d4e4761d50e0\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5zw7qp" Mar 10 07:01:02 crc kubenswrapper[4825]: I0310 07:01:02.400848 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/91c44c21-0e53-4940-a438-d4e4761d50e0-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5zw7qp\" (UID: \"91c44c21-0e53-4940-a438-d4e4761d50e0\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5zw7qp" Mar 10 07:01:02 crc kubenswrapper[4825]: I0310 07:01:02.401421 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/91c44c21-0e53-4940-a438-d4e4761d50e0-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5zw7qp\" (UID: \"91c44c21-0e53-4940-a438-d4e4761d50e0\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5zw7qp" Mar 10 07:01:02 crc kubenswrapper[4825]: I0310 07:01:02.401505 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/91c44c21-0e53-4940-a438-d4e4761d50e0-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5zw7qp\" (UID: \"91c44c21-0e53-4940-a438-d4e4761d50e0\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5zw7qp" Mar 10 07:01:02 crc kubenswrapper[4825]: I0310 07:01:02.446917 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzknw\" (UniqueName: \"kubernetes.io/projected/91c44c21-0e53-4940-a438-d4e4761d50e0-kube-api-access-dzknw\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5zw7qp\" (UID: \"91c44c21-0e53-4940-a438-d4e4761d50e0\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5zw7qp" Mar 10 07:01:02 crc kubenswrapper[4825]: I0310 07:01:02.544051 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5zw7qp" Mar 10 07:01:02 crc kubenswrapper[4825]: I0310 07:01:02.843107 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5zw7qp"] Mar 10 07:01:02 crc kubenswrapper[4825]: W0310 07:01:02.846022 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91c44c21_0e53_4940_a438_d4e4761d50e0.slice/crio-a9515c3a159bcda3a03306eabc70acbe2933335eb2ba0c34d941683c9116abda WatchSource:0}: Error finding container a9515c3a159bcda3a03306eabc70acbe2933335eb2ba0c34d941683c9116abda: Status 404 returned error can't find the container with id a9515c3a159bcda3a03306eabc70acbe2933335eb2ba0c34d941683c9116abda Mar 10 07:01:03 crc kubenswrapper[4825]: I0310 07:01:03.180746 4825 generic.go:334] "Generic (PLEG): container finished" podID="91c44c21-0e53-4940-a438-d4e4761d50e0" containerID="a218bd9241ff07a4fb8092182477e5bd967c96fc68625370183883227988467b" exitCode=0 Mar 10 07:01:03 crc kubenswrapper[4825]: I0310 07:01:03.180823 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5zw7qp" event={"ID":"91c44c21-0e53-4940-a438-d4e4761d50e0","Type":"ContainerDied","Data":"a218bd9241ff07a4fb8092182477e5bd967c96fc68625370183883227988467b"} Mar 10 07:01:03 crc kubenswrapper[4825]: I0310 07:01:03.181323 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5zw7qp" event={"ID":"91c44c21-0e53-4940-a438-d4e4761d50e0","Type":"ContainerStarted","Data":"a9515c3a159bcda3a03306eabc70acbe2933335eb2ba0c34d941683c9116abda"} Mar 10 07:01:07 crc kubenswrapper[4825]: I0310 07:01:07.219322 4825 generic.go:334] "Generic (PLEG): container finished" 
podID="91c44c21-0e53-4940-a438-d4e4761d50e0" containerID="b1eb6eeb81f69111e389cc38e2239c4e6abde4c40d5718c8888ca1acb83fb86b" exitCode=0 Mar 10 07:01:07 crc kubenswrapper[4825]: I0310 07:01:07.219430 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5zw7qp" event={"ID":"91c44c21-0e53-4940-a438-d4e4761d50e0","Type":"ContainerDied","Data":"b1eb6eeb81f69111e389cc38e2239c4e6abde4c40d5718c8888ca1acb83fb86b"} Mar 10 07:01:08 crc kubenswrapper[4825]: I0310 07:01:08.231323 4825 generic.go:334] "Generic (PLEG): container finished" podID="91c44c21-0e53-4940-a438-d4e4761d50e0" containerID="68759e4f768d4ba4da54fb6b39253085d010cfe9e74cd53b35a26bf5d8777ce2" exitCode=0 Mar 10 07:01:08 crc kubenswrapper[4825]: I0310 07:01:08.231393 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5zw7qp" event={"ID":"91c44c21-0e53-4940-a438-d4e4761d50e0","Type":"ContainerDied","Data":"68759e4f768d4ba4da54fb6b39253085d010cfe9e74cd53b35a26bf5d8777ce2"} Mar 10 07:01:09 crc kubenswrapper[4825]: I0310 07:01:09.160615 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cqff6"] Mar 10 07:01:09 crc kubenswrapper[4825]: I0310 07:01:09.164665 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cqff6" Mar 10 07:01:09 crc kubenswrapper[4825]: I0310 07:01:09.183089 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cqff6"] Mar 10 07:01:09 crc kubenswrapper[4825]: I0310 07:01:09.322367 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wczf\" (UniqueName: \"kubernetes.io/projected/eaf4d2ef-21e7-430f-8d0a-b038c15c347d-kube-api-access-4wczf\") pod \"redhat-marketplace-cqff6\" (UID: \"eaf4d2ef-21e7-430f-8d0a-b038c15c347d\") " pod="openshift-marketplace/redhat-marketplace-cqff6" Mar 10 07:01:09 crc kubenswrapper[4825]: I0310 07:01:09.322428 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eaf4d2ef-21e7-430f-8d0a-b038c15c347d-catalog-content\") pod \"redhat-marketplace-cqff6\" (UID: \"eaf4d2ef-21e7-430f-8d0a-b038c15c347d\") " pod="openshift-marketplace/redhat-marketplace-cqff6" Mar 10 07:01:09 crc kubenswrapper[4825]: I0310 07:01:09.322459 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eaf4d2ef-21e7-430f-8d0a-b038c15c347d-utilities\") pod \"redhat-marketplace-cqff6\" (UID: \"eaf4d2ef-21e7-430f-8d0a-b038c15c347d\") " pod="openshift-marketplace/redhat-marketplace-cqff6" Mar 10 07:01:09 crc kubenswrapper[4825]: I0310 07:01:09.424394 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wczf\" (UniqueName: \"kubernetes.io/projected/eaf4d2ef-21e7-430f-8d0a-b038c15c347d-kube-api-access-4wczf\") pod \"redhat-marketplace-cqff6\" (UID: \"eaf4d2ef-21e7-430f-8d0a-b038c15c347d\") " pod="openshift-marketplace/redhat-marketplace-cqff6" Mar 10 07:01:09 crc kubenswrapper[4825]: I0310 07:01:09.424470 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eaf4d2ef-21e7-430f-8d0a-b038c15c347d-catalog-content\") pod \"redhat-marketplace-cqff6\" (UID: \"eaf4d2ef-21e7-430f-8d0a-b038c15c347d\") " pod="openshift-marketplace/redhat-marketplace-cqff6" Mar 10 07:01:09 crc kubenswrapper[4825]: I0310 07:01:09.424510 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eaf4d2ef-21e7-430f-8d0a-b038c15c347d-utilities\") pod \"redhat-marketplace-cqff6\" (UID: \"eaf4d2ef-21e7-430f-8d0a-b038c15c347d\") " pod="openshift-marketplace/redhat-marketplace-cqff6" Mar 10 07:01:09 crc kubenswrapper[4825]: I0310 07:01:09.425234 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eaf4d2ef-21e7-430f-8d0a-b038c15c347d-utilities\") pod \"redhat-marketplace-cqff6\" (UID: \"eaf4d2ef-21e7-430f-8d0a-b038c15c347d\") " pod="openshift-marketplace/redhat-marketplace-cqff6" Mar 10 07:01:09 crc kubenswrapper[4825]: I0310 07:01:09.425962 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eaf4d2ef-21e7-430f-8d0a-b038c15c347d-catalog-content\") pod \"redhat-marketplace-cqff6\" (UID: \"eaf4d2ef-21e7-430f-8d0a-b038c15c347d\") " pod="openshift-marketplace/redhat-marketplace-cqff6" Mar 10 07:01:09 crc kubenswrapper[4825]: I0310 07:01:09.452852 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wczf\" (UniqueName: \"kubernetes.io/projected/eaf4d2ef-21e7-430f-8d0a-b038c15c347d-kube-api-access-4wczf\") pod \"redhat-marketplace-cqff6\" (UID: \"eaf4d2ef-21e7-430f-8d0a-b038c15c347d\") " pod="openshift-marketplace/redhat-marketplace-cqff6" Mar 10 07:01:09 crc kubenswrapper[4825]: I0310 07:01:09.494203 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cqff6" Mar 10 07:01:09 crc kubenswrapper[4825]: I0310 07:01:09.557669 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5zw7qp" Mar 10 07:01:09 crc kubenswrapper[4825]: I0310 07:01:09.729681 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/91c44c21-0e53-4940-a438-d4e4761d50e0-bundle\") pod \"91c44c21-0e53-4940-a438-d4e4761d50e0\" (UID: \"91c44c21-0e53-4940-a438-d4e4761d50e0\") " Mar 10 07:01:09 crc kubenswrapper[4825]: I0310 07:01:09.729754 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/91c44c21-0e53-4940-a438-d4e4761d50e0-util\") pod \"91c44c21-0e53-4940-a438-d4e4761d50e0\" (UID: \"91c44c21-0e53-4940-a438-d4e4761d50e0\") " Mar 10 07:01:09 crc kubenswrapper[4825]: I0310 07:01:09.729876 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzknw\" (UniqueName: \"kubernetes.io/projected/91c44c21-0e53-4940-a438-d4e4761d50e0-kube-api-access-dzknw\") pod \"91c44c21-0e53-4940-a438-d4e4761d50e0\" (UID: \"91c44c21-0e53-4940-a438-d4e4761d50e0\") " Mar 10 07:01:09 crc kubenswrapper[4825]: I0310 07:01:09.731587 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91c44c21-0e53-4940-a438-d4e4761d50e0-bundle" (OuterVolumeSpecName: "bundle") pod "91c44c21-0e53-4940-a438-d4e4761d50e0" (UID: "91c44c21-0e53-4940-a438-d4e4761d50e0"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:01:09 crc kubenswrapper[4825]: I0310 07:01:09.737072 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91c44c21-0e53-4940-a438-d4e4761d50e0-kube-api-access-dzknw" (OuterVolumeSpecName: "kube-api-access-dzknw") pod "91c44c21-0e53-4940-a438-d4e4761d50e0" (UID: "91c44c21-0e53-4940-a438-d4e4761d50e0"). InnerVolumeSpecName "kube-api-access-dzknw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:01:09 crc kubenswrapper[4825]: I0310 07:01:09.760007 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91c44c21-0e53-4940-a438-d4e4761d50e0-util" (OuterVolumeSpecName: "util") pod "91c44c21-0e53-4940-a438-d4e4761d50e0" (UID: "91c44c21-0e53-4940-a438-d4e4761d50e0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:01:09 crc kubenswrapper[4825]: I0310 07:01:09.831513 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzknw\" (UniqueName: \"kubernetes.io/projected/91c44c21-0e53-4940-a438-d4e4761d50e0-kube-api-access-dzknw\") on node \"crc\" DevicePath \"\"" Mar 10 07:01:09 crc kubenswrapper[4825]: I0310 07:01:09.831554 4825 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/91c44c21-0e53-4940-a438-d4e4761d50e0-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:01:09 crc kubenswrapper[4825]: I0310 07:01:09.831565 4825 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/91c44c21-0e53-4940-a438-d4e4761d50e0-util\") on node \"crc\" DevicePath \"\"" Mar 10 07:01:09 crc kubenswrapper[4825]: I0310 07:01:09.921681 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cqff6"] Mar 10 07:01:09 crc kubenswrapper[4825]: W0310 07:01:09.929000 4825 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeaf4d2ef_21e7_430f_8d0a_b038c15c347d.slice/crio-968a693151eced6ec19cb282282de98ffeb383caa00e959cdaa201a82b037645 WatchSource:0}: Error finding container 968a693151eced6ec19cb282282de98ffeb383caa00e959cdaa201a82b037645: Status 404 returned error can't find the container with id 968a693151eced6ec19cb282282de98ffeb383caa00e959cdaa201a82b037645 Mar 10 07:01:10 crc kubenswrapper[4825]: I0310 07:01:10.247815 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5zw7qp" event={"ID":"91c44c21-0e53-4940-a438-d4e4761d50e0","Type":"ContainerDied","Data":"a9515c3a159bcda3a03306eabc70acbe2933335eb2ba0c34d941683c9116abda"} Mar 10 07:01:10 crc kubenswrapper[4825]: I0310 07:01:10.248178 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9515c3a159bcda3a03306eabc70acbe2933335eb2ba0c34d941683c9116abda" Mar 10 07:01:10 crc kubenswrapper[4825]: I0310 07:01:10.247867 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5zw7qp" Mar 10 07:01:10 crc kubenswrapper[4825]: I0310 07:01:10.249901 4825 generic.go:334] "Generic (PLEG): container finished" podID="eaf4d2ef-21e7-430f-8d0a-b038c15c347d" containerID="662829f4979c777190519ae05e7fc99a265ecfc401485a93e987d4413c310339" exitCode=0 Mar 10 07:01:10 crc kubenswrapper[4825]: I0310 07:01:10.249954 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cqff6" event={"ID":"eaf4d2ef-21e7-430f-8d0a-b038c15c347d","Type":"ContainerDied","Data":"662829f4979c777190519ae05e7fc99a265ecfc401485a93e987d4413c310339"} Mar 10 07:01:10 crc kubenswrapper[4825]: I0310 07:01:10.250000 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cqff6" event={"ID":"eaf4d2ef-21e7-430f-8d0a-b038c15c347d","Type":"ContainerStarted","Data":"968a693151eced6ec19cb282282de98ffeb383caa00e959cdaa201a82b037645"} Mar 10 07:01:11 crc kubenswrapper[4825]: I0310 07:01:11.571033 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xnkcr"] Mar 10 07:01:11 crc kubenswrapper[4825]: E0310 07:01:11.572017 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91c44c21-0e53-4940-a438-d4e4761d50e0" containerName="extract" Mar 10 07:01:11 crc kubenswrapper[4825]: I0310 07:01:11.572047 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="91c44c21-0e53-4940-a438-d4e4761d50e0" containerName="extract" Mar 10 07:01:11 crc kubenswrapper[4825]: E0310 07:01:11.572065 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91c44c21-0e53-4940-a438-d4e4761d50e0" containerName="pull" Mar 10 07:01:11 crc kubenswrapper[4825]: I0310 07:01:11.572081 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="91c44c21-0e53-4940-a438-d4e4761d50e0" containerName="pull" Mar 10 07:01:11 crc kubenswrapper[4825]: E0310 
07:01:11.572119 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91c44c21-0e53-4940-a438-d4e4761d50e0" containerName="util" Mar 10 07:01:11 crc kubenswrapper[4825]: I0310 07:01:11.572173 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="91c44c21-0e53-4940-a438-d4e4761d50e0" containerName="util" Mar 10 07:01:11 crc kubenswrapper[4825]: I0310 07:01:11.572411 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="91c44c21-0e53-4940-a438-d4e4761d50e0" containerName="extract" Mar 10 07:01:11 crc kubenswrapper[4825]: I0310 07:01:11.574275 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xnkcr" Mar 10 07:01:11 crc kubenswrapper[4825]: I0310 07:01:11.588553 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xnkcr"] Mar 10 07:01:11 crc kubenswrapper[4825]: I0310 07:01:11.656803 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc8a46ad-c578-4f98-978f-fb0336bbbc07-catalog-content\") pod \"community-operators-xnkcr\" (UID: \"fc8a46ad-c578-4f98-978f-fb0336bbbc07\") " pod="openshift-marketplace/community-operators-xnkcr" Mar 10 07:01:11 crc kubenswrapper[4825]: I0310 07:01:11.656851 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc8a46ad-c578-4f98-978f-fb0336bbbc07-utilities\") pod \"community-operators-xnkcr\" (UID: \"fc8a46ad-c578-4f98-978f-fb0336bbbc07\") " pod="openshift-marketplace/community-operators-xnkcr" Mar 10 07:01:11 crc kubenswrapper[4825]: I0310 07:01:11.656894 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pf2r\" (UniqueName: \"kubernetes.io/projected/fc8a46ad-c578-4f98-978f-fb0336bbbc07-kube-api-access-2pf2r\") 
pod \"community-operators-xnkcr\" (UID: \"fc8a46ad-c578-4f98-978f-fb0336bbbc07\") " pod="openshift-marketplace/community-operators-xnkcr" Mar 10 07:01:11 crc kubenswrapper[4825]: I0310 07:01:11.758760 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc8a46ad-c578-4f98-978f-fb0336bbbc07-catalog-content\") pod \"community-operators-xnkcr\" (UID: \"fc8a46ad-c578-4f98-978f-fb0336bbbc07\") " pod="openshift-marketplace/community-operators-xnkcr" Mar 10 07:01:11 crc kubenswrapper[4825]: I0310 07:01:11.758808 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc8a46ad-c578-4f98-978f-fb0336bbbc07-utilities\") pod \"community-operators-xnkcr\" (UID: \"fc8a46ad-c578-4f98-978f-fb0336bbbc07\") " pod="openshift-marketplace/community-operators-xnkcr" Mar 10 07:01:11 crc kubenswrapper[4825]: I0310 07:01:11.758841 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pf2r\" (UniqueName: \"kubernetes.io/projected/fc8a46ad-c578-4f98-978f-fb0336bbbc07-kube-api-access-2pf2r\") pod \"community-operators-xnkcr\" (UID: \"fc8a46ad-c578-4f98-978f-fb0336bbbc07\") " pod="openshift-marketplace/community-operators-xnkcr" Mar 10 07:01:11 crc kubenswrapper[4825]: I0310 07:01:11.759777 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc8a46ad-c578-4f98-978f-fb0336bbbc07-catalog-content\") pod \"community-operators-xnkcr\" (UID: \"fc8a46ad-c578-4f98-978f-fb0336bbbc07\") " pod="openshift-marketplace/community-operators-xnkcr" Mar 10 07:01:11 crc kubenswrapper[4825]: I0310 07:01:11.760053 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc8a46ad-c578-4f98-978f-fb0336bbbc07-utilities\") pod \"community-operators-xnkcr\" (UID: 
\"fc8a46ad-c578-4f98-978f-fb0336bbbc07\") " pod="openshift-marketplace/community-operators-xnkcr" Mar 10 07:01:11 crc kubenswrapper[4825]: I0310 07:01:11.795890 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pf2r\" (UniqueName: \"kubernetes.io/projected/fc8a46ad-c578-4f98-978f-fb0336bbbc07-kube-api-access-2pf2r\") pod \"community-operators-xnkcr\" (UID: \"fc8a46ad-c578-4f98-978f-fb0336bbbc07\") " pod="openshift-marketplace/community-operators-xnkcr" Mar 10 07:01:11 crc kubenswrapper[4825]: I0310 07:01:11.945861 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xnkcr" Mar 10 07:01:12 crc kubenswrapper[4825]: I0310 07:01:12.214938 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xnkcr"] Mar 10 07:01:12 crc kubenswrapper[4825]: I0310 07:01:12.269379 4825 generic.go:334] "Generic (PLEG): container finished" podID="eaf4d2ef-21e7-430f-8d0a-b038c15c347d" containerID="18a4d1ae07d393aafb11b6cd9dda419d601347bd61cdaaff8e2a66fc53d5398e" exitCode=0 Mar 10 07:01:12 crc kubenswrapper[4825]: I0310 07:01:12.269419 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cqff6" event={"ID":"eaf4d2ef-21e7-430f-8d0a-b038c15c347d","Type":"ContainerDied","Data":"18a4d1ae07d393aafb11b6cd9dda419d601347bd61cdaaff8e2a66fc53d5398e"} Mar 10 07:01:12 crc kubenswrapper[4825]: I0310 07:01:12.271404 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xnkcr" event={"ID":"fc8a46ad-c578-4f98-978f-fb0336bbbc07","Type":"ContainerStarted","Data":"ea2854182a6f2f5ed66038928276209a15ea91c9a4b2a80406778eb59ad8acb8"} Mar 10 07:01:13 crc kubenswrapper[4825]: I0310 07:01:13.286165 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cqff6" 
event={"ID":"eaf4d2ef-21e7-430f-8d0a-b038c15c347d","Type":"ContainerStarted","Data":"619aadfcf1fa9cffc66af047cde7b089400ebe770c8064af3af91a21aa852993"} Mar 10 07:01:13 crc kubenswrapper[4825]: I0310 07:01:13.291621 4825 generic.go:334] "Generic (PLEG): container finished" podID="fc8a46ad-c578-4f98-978f-fb0336bbbc07" containerID="c487043cefa1f99622765f865ff7d86575a498acb8938984608895fb0a588444" exitCode=0 Mar 10 07:01:13 crc kubenswrapper[4825]: I0310 07:01:13.291671 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xnkcr" event={"ID":"fc8a46ad-c578-4f98-978f-fb0336bbbc07","Type":"ContainerDied","Data":"c487043cefa1f99622765f865ff7d86575a498acb8938984608895fb0a588444"} Mar 10 07:01:13 crc kubenswrapper[4825]: I0310 07:01:13.335909 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cqff6" podStartSLOduration=1.887430362 podStartE2EDuration="4.335887614s" podCreationTimestamp="2026-03-10 07:01:09 +0000 UTC" firstStartedPulling="2026-03-10 07:01:10.253428134 +0000 UTC m=+1023.283208759" lastFinishedPulling="2026-03-10 07:01:12.701885376 +0000 UTC m=+1025.731666011" observedRunningTime="2026-03-10 07:01:13.331071549 +0000 UTC m=+1026.360852164" watchObservedRunningTime="2026-03-10 07:01:13.335887614 +0000 UTC m=+1026.365668229" Mar 10 07:01:14 crc kubenswrapper[4825]: I0310 07:01:14.046404 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-t2cx5"] Mar 10 07:01:14 crc kubenswrapper[4825]: I0310 07:01:14.047110 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-t2cx5" Mar 10 07:01:14 crc kubenswrapper[4825]: I0310 07:01:14.049821 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Mar 10 07:01:14 crc kubenswrapper[4825]: I0310 07:01:14.051941 4825 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-27d7w" Mar 10 07:01:14 crc kubenswrapper[4825]: I0310 07:01:14.061189 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Mar 10 07:01:14 crc kubenswrapper[4825]: I0310 07:01:14.066963 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-t2cx5"] Mar 10 07:01:14 crc kubenswrapper[4825]: I0310 07:01:14.194942 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d7342394-6f46-4fea-9dec-06e99e9b9eff-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-t2cx5\" (UID: \"d7342394-6f46-4fea-9dec-06e99e9b9eff\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-t2cx5" Mar 10 07:01:14 crc kubenswrapper[4825]: I0310 07:01:14.194991 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8pj8\" (UniqueName: \"kubernetes.io/projected/d7342394-6f46-4fea-9dec-06e99e9b9eff-kube-api-access-h8pj8\") pod \"cert-manager-operator-controller-manager-66c8bdd694-t2cx5\" (UID: \"d7342394-6f46-4fea-9dec-06e99e9b9eff\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-t2cx5" Mar 10 07:01:14 crc kubenswrapper[4825]: I0310 07:01:14.295946 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" 
(UniqueName: \"kubernetes.io/empty-dir/d7342394-6f46-4fea-9dec-06e99e9b9eff-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-t2cx5\" (UID: \"d7342394-6f46-4fea-9dec-06e99e9b9eff\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-t2cx5" Mar 10 07:01:14 crc kubenswrapper[4825]: I0310 07:01:14.295987 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8pj8\" (UniqueName: \"kubernetes.io/projected/d7342394-6f46-4fea-9dec-06e99e9b9eff-kube-api-access-h8pj8\") pod \"cert-manager-operator-controller-manager-66c8bdd694-t2cx5\" (UID: \"d7342394-6f46-4fea-9dec-06e99e9b9eff\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-t2cx5" Mar 10 07:01:14 crc kubenswrapper[4825]: I0310 07:01:14.296463 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d7342394-6f46-4fea-9dec-06e99e9b9eff-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-t2cx5\" (UID: \"d7342394-6f46-4fea-9dec-06e99e9b9eff\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-t2cx5" Mar 10 07:01:14 crc kubenswrapper[4825]: I0310 07:01:14.301335 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xnkcr" event={"ID":"fc8a46ad-c578-4f98-978f-fb0336bbbc07","Type":"ContainerStarted","Data":"4470548f42a074a30f3cffbd2428a4a442bd6aec8a9278f7d76e53fb88261cbd"} Mar 10 07:01:14 crc kubenswrapper[4825]: I0310 07:01:14.340087 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8pj8\" (UniqueName: \"kubernetes.io/projected/d7342394-6f46-4fea-9dec-06e99e9b9eff-kube-api-access-h8pj8\") pod \"cert-manager-operator-controller-manager-66c8bdd694-t2cx5\" (UID: \"d7342394-6f46-4fea-9dec-06e99e9b9eff\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-t2cx5" Mar 10 07:01:14 crc 
kubenswrapper[4825]: I0310 07:01:14.364556 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-t2cx5" Mar 10 07:01:14 crc kubenswrapper[4825]: I0310 07:01:14.815074 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-t2cx5"] Mar 10 07:01:14 crc kubenswrapper[4825]: W0310 07:01:14.822429 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7342394_6f46_4fea_9dec_06e99e9b9eff.slice/crio-ad6a2e68e9217aba48bf3801d956cf9e25172473c575dcee52fd5c690b986743 WatchSource:0}: Error finding container ad6a2e68e9217aba48bf3801d956cf9e25172473c575dcee52fd5c690b986743: Status 404 returned error can't find the container with id ad6a2e68e9217aba48bf3801d956cf9e25172473c575dcee52fd5c690b986743 Mar 10 07:01:15 crc kubenswrapper[4825]: I0310 07:01:15.310965 4825 generic.go:334] "Generic (PLEG): container finished" podID="fc8a46ad-c578-4f98-978f-fb0336bbbc07" containerID="4470548f42a074a30f3cffbd2428a4a442bd6aec8a9278f7d76e53fb88261cbd" exitCode=0 Mar 10 07:01:15 crc kubenswrapper[4825]: I0310 07:01:15.311039 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xnkcr" event={"ID":"fc8a46ad-c578-4f98-978f-fb0336bbbc07","Type":"ContainerDied","Data":"4470548f42a074a30f3cffbd2428a4a442bd6aec8a9278f7d76e53fb88261cbd"} Mar 10 07:01:15 crc kubenswrapper[4825]: I0310 07:01:15.312970 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-t2cx5" event={"ID":"d7342394-6f46-4fea-9dec-06e99e9b9eff","Type":"ContainerStarted","Data":"ad6a2e68e9217aba48bf3801d956cf9e25172473c575dcee52fd5c690b986743"} Mar 10 07:01:16 crc kubenswrapper[4825]: I0310 07:01:16.323699 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-xnkcr" event={"ID":"fc8a46ad-c578-4f98-978f-fb0336bbbc07","Type":"ContainerStarted","Data":"47fc39ff164f57925dd53db1982b4d95eea5efdef828f6f2731c4b831e92521a"} Mar 10 07:01:16 crc kubenswrapper[4825]: I0310 07:01:16.359539 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xnkcr" podStartSLOduration=2.93754828 podStartE2EDuration="5.359523561s" podCreationTimestamp="2026-03-10 07:01:11 +0000 UTC" firstStartedPulling="2026-03-10 07:01:13.295782749 +0000 UTC m=+1026.325563364" lastFinishedPulling="2026-03-10 07:01:15.71775801 +0000 UTC m=+1028.747538645" observedRunningTime="2026-03-10 07:01:16.355372453 +0000 UTC m=+1029.385153068" watchObservedRunningTime="2026-03-10 07:01:16.359523561 +0000 UTC m=+1029.389304176" Mar 10 07:01:19 crc kubenswrapper[4825]: I0310 07:01:19.346356 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-t2cx5" event={"ID":"d7342394-6f46-4fea-9dec-06e99e9b9eff","Type":"ContainerStarted","Data":"ce8cffd72316d78a42f12cd42849f7a701467c8bba0a75341ef253016316e2a3"} Mar 10 07:01:19 crc kubenswrapper[4825]: I0310 07:01:19.503525 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cqff6" Mar 10 07:01:19 crc kubenswrapper[4825]: I0310 07:01:19.503595 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cqff6" Mar 10 07:01:19 crc kubenswrapper[4825]: I0310 07:01:19.571579 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cqff6" Mar 10 07:01:19 crc kubenswrapper[4825]: I0310 07:01:19.592898 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-t2cx5" 
podStartSLOduration=2.144376361 podStartE2EDuration="5.592880345s" podCreationTimestamp="2026-03-10 07:01:14 +0000 UTC" firstStartedPulling="2026-03-10 07:01:14.823971869 +0000 UTC m=+1027.853752494" lastFinishedPulling="2026-03-10 07:01:18.272475863 +0000 UTC m=+1031.302256478" observedRunningTime="2026-03-10 07:01:19.384427691 +0000 UTC m=+1032.414208326" watchObservedRunningTime="2026-03-10 07:01:19.592880345 +0000 UTC m=+1032.622660960" Mar 10 07:01:20 crc kubenswrapper[4825]: I0310 07:01:20.392189 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cqff6" Mar 10 07:01:21 crc kubenswrapper[4825]: I0310 07:01:21.121986 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-7sr7h"] Mar 10 07:01:21 crc kubenswrapper[4825]: I0310 07:01:21.122984 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-7sr7h" Mar 10 07:01:21 crc kubenswrapper[4825]: I0310 07:01:21.125519 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 10 07:01:21 crc kubenswrapper[4825]: I0310 07:01:21.125738 4825 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-qz8dd" Mar 10 07:01:21 crc kubenswrapper[4825]: I0310 07:01:21.126065 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 10 07:01:21 crc kubenswrapper[4825]: I0310 07:01:21.127057 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/11b4eb4d-13b1-4244-ae96-c77df6e04d59-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-7sr7h\" (UID: \"11b4eb4d-13b1-4244-ae96-c77df6e04d59\") " pod="cert-manager/cert-manager-webhook-6888856db4-7sr7h" Mar 10 07:01:21 crc kubenswrapper[4825]: 
I0310 07:01:21.127106 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppsp4\" (UniqueName: \"kubernetes.io/projected/11b4eb4d-13b1-4244-ae96-c77df6e04d59-kube-api-access-ppsp4\") pod \"cert-manager-webhook-6888856db4-7sr7h\" (UID: \"11b4eb4d-13b1-4244-ae96-c77df6e04d59\") " pod="cert-manager/cert-manager-webhook-6888856db4-7sr7h" Mar 10 07:01:21 crc kubenswrapper[4825]: I0310 07:01:21.133266 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-7sr7h"] Mar 10 07:01:21 crc kubenswrapper[4825]: I0310 07:01:21.228165 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/11b4eb4d-13b1-4244-ae96-c77df6e04d59-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-7sr7h\" (UID: \"11b4eb4d-13b1-4244-ae96-c77df6e04d59\") " pod="cert-manager/cert-manager-webhook-6888856db4-7sr7h" Mar 10 07:01:21 crc kubenswrapper[4825]: I0310 07:01:21.228214 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppsp4\" (UniqueName: \"kubernetes.io/projected/11b4eb4d-13b1-4244-ae96-c77df6e04d59-kube-api-access-ppsp4\") pod \"cert-manager-webhook-6888856db4-7sr7h\" (UID: \"11b4eb4d-13b1-4244-ae96-c77df6e04d59\") " pod="cert-manager/cert-manager-webhook-6888856db4-7sr7h" Mar 10 07:01:21 crc kubenswrapper[4825]: I0310 07:01:21.266799 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/11b4eb4d-13b1-4244-ae96-c77df6e04d59-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-7sr7h\" (UID: \"11b4eb4d-13b1-4244-ae96-c77df6e04d59\") " pod="cert-manager/cert-manager-webhook-6888856db4-7sr7h" Mar 10 07:01:21 crc kubenswrapper[4825]: I0310 07:01:21.267213 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppsp4\" (UniqueName: 
\"kubernetes.io/projected/11b4eb4d-13b1-4244-ae96-c77df6e04d59-kube-api-access-ppsp4\") pod \"cert-manager-webhook-6888856db4-7sr7h\" (UID: \"11b4eb4d-13b1-4244-ae96-c77df6e04d59\") " pod="cert-manager/cert-manager-webhook-6888856db4-7sr7h" Mar 10 07:01:21 crc kubenswrapper[4825]: I0310 07:01:21.441004 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-7sr7h" Mar 10 07:01:21 crc kubenswrapper[4825]: I0310 07:01:21.922575 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-7sr7h"] Mar 10 07:01:21 crc kubenswrapper[4825]: I0310 07:01:21.946331 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xnkcr" Mar 10 07:01:21 crc kubenswrapper[4825]: I0310 07:01:21.946386 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xnkcr" Mar 10 07:01:21 crc kubenswrapper[4825]: I0310 07:01:21.952755 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cqff6"] Mar 10 07:01:21 crc kubenswrapper[4825]: I0310 07:01:21.997174 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xnkcr" Mar 10 07:01:22 crc kubenswrapper[4825]: I0310 07:01:22.364638 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-7sr7h" event={"ID":"11b4eb4d-13b1-4244-ae96-c77df6e04d59","Type":"ContainerStarted","Data":"9fcda5aa5089fb0b7a7cbb628e1f9aa52128457bb35cf59711b2f675df9a2824"} Mar 10 07:01:22 crc kubenswrapper[4825]: I0310 07:01:22.364827 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cqff6" podUID="eaf4d2ef-21e7-430f-8d0a-b038c15c347d" containerName="registry-server" 
containerID="cri-o://619aadfcf1fa9cffc66af047cde7b089400ebe770c8064af3af91a21aa852993" gracePeriod=2 Mar 10 07:01:22 crc kubenswrapper[4825]: I0310 07:01:22.406615 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xnkcr" Mar 10 07:01:23 crc kubenswrapper[4825]: I0310 07:01:23.385789 4825 generic.go:334] "Generic (PLEG): container finished" podID="eaf4d2ef-21e7-430f-8d0a-b038c15c347d" containerID="619aadfcf1fa9cffc66af047cde7b089400ebe770c8064af3af91a21aa852993" exitCode=0 Mar 10 07:01:23 crc kubenswrapper[4825]: I0310 07:01:23.386364 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cqff6" event={"ID":"eaf4d2ef-21e7-430f-8d0a-b038c15c347d","Type":"ContainerDied","Data":"619aadfcf1fa9cffc66af047cde7b089400ebe770c8064af3af91a21aa852993"} Mar 10 07:01:23 crc kubenswrapper[4825]: I0310 07:01:23.386404 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cqff6" event={"ID":"eaf4d2ef-21e7-430f-8d0a-b038c15c347d","Type":"ContainerDied","Data":"968a693151eced6ec19cb282282de98ffeb383caa00e959cdaa201a82b037645"} Mar 10 07:01:23 crc kubenswrapper[4825]: I0310 07:01:23.386419 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="968a693151eced6ec19cb282282de98ffeb383caa00e959cdaa201a82b037645" Mar 10 07:01:23 crc kubenswrapper[4825]: I0310 07:01:23.414881 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cqff6" Mar 10 07:01:23 crc kubenswrapper[4825]: I0310 07:01:23.495125 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-hlzgp"] Mar 10 07:01:23 crc kubenswrapper[4825]: E0310 07:01:23.495391 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaf4d2ef-21e7-430f-8d0a-b038c15c347d" containerName="extract-utilities" Mar 10 07:01:23 crc kubenswrapper[4825]: I0310 07:01:23.495404 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaf4d2ef-21e7-430f-8d0a-b038c15c347d" containerName="extract-utilities" Mar 10 07:01:23 crc kubenswrapper[4825]: E0310 07:01:23.495418 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaf4d2ef-21e7-430f-8d0a-b038c15c347d" containerName="extract-content" Mar 10 07:01:23 crc kubenswrapper[4825]: I0310 07:01:23.495424 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaf4d2ef-21e7-430f-8d0a-b038c15c347d" containerName="extract-content" Mar 10 07:01:23 crc kubenswrapper[4825]: E0310 07:01:23.495435 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaf4d2ef-21e7-430f-8d0a-b038c15c347d" containerName="registry-server" Mar 10 07:01:23 crc kubenswrapper[4825]: I0310 07:01:23.495441 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaf4d2ef-21e7-430f-8d0a-b038c15c347d" containerName="registry-server" Mar 10 07:01:23 crc kubenswrapper[4825]: I0310 07:01:23.495538 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaf4d2ef-21e7-430f-8d0a-b038c15c347d" containerName="registry-server" Mar 10 07:01:23 crc kubenswrapper[4825]: I0310 07:01:23.495926 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-hlzgp" Mar 10 07:01:23 crc kubenswrapper[4825]: I0310 07:01:23.503567 4825 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-p2vpr" Mar 10 07:01:23 crc kubenswrapper[4825]: I0310 07:01:23.518732 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-hlzgp"] Mar 10 07:01:23 crc kubenswrapper[4825]: I0310 07:01:23.578236 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wczf\" (UniqueName: \"kubernetes.io/projected/eaf4d2ef-21e7-430f-8d0a-b038c15c347d-kube-api-access-4wczf\") pod \"eaf4d2ef-21e7-430f-8d0a-b038c15c347d\" (UID: \"eaf4d2ef-21e7-430f-8d0a-b038c15c347d\") " Mar 10 07:01:23 crc kubenswrapper[4825]: I0310 07:01:23.578680 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eaf4d2ef-21e7-430f-8d0a-b038c15c347d-utilities\") pod \"eaf4d2ef-21e7-430f-8d0a-b038c15c347d\" (UID: \"eaf4d2ef-21e7-430f-8d0a-b038c15c347d\") " Mar 10 07:01:23 crc kubenswrapper[4825]: I0310 07:01:23.578804 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eaf4d2ef-21e7-430f-8d0a-b038c15c347d-catalog-content\") pod \"eaf4d2ef-21e7-430f-8d0a-b038c15c347d\" (UID: \"eaf4d2ef-21e7-430f-8d0a-b038c15c347d\") " Mar 10 07:01:23 crc kubenswrapper[4825]: I0310 07:01:23.579122 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfvkq\" (UniqueName: \"kubernetes.io/projected/a5f3c59e-41fb-4805-b696-47f2095503e1-kube-api-access-wfvkq\") pod \"cert-manager-cainjector-5545bd876-hlzgp\" (UID: \"a5f3c59e-41fb-4805-b696-47f2095503e1\") " pod="cert-manager/cert-manager-cainjector-5545bd876-hlzgp" Mar 10 07:01:23 crc 
kubenswrapper[4825]: I0310 07:01:23.580395 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a5f3c59e-41fb-4805-b696-47f2095503e1-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-hlzgp\" (UID: \"a5f3c59e-41fb-4805-b696-47f2095503e1\") " pod="cert-manager/cert-manager-cainjector-5545bd876-hlzgp" Mar 10 07:01:23 crc kubenswrapper[4825]: I0310 07:01:23.580048 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eaf4d2ef-21e7-430f-8d0a-b038c15c347d-utilities" (OuterVolumeSpecName: "utilities") pod "eaf4d2ef-21e7-430f-8d0a-b038c15c347d" (UID: "eaf4d2ef-21e7-430f-8d0a-b038c15c347d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:01:23 crc kubenswrapper[4825]: I0310 07:01:23.580661 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eaf4d2ef-21e7-430f-8d0a-b038c15c347d-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 07:01:23 crc kubenswrapper[4825]: I0310 07:01:23.603309 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaf4d2ef-21e7-430f-8d0a-b038c15c347d-kube-api-access-4wczf" (OuterVolumeSpecName: "kube-api-access-4wczf") pod "eaf4d2ef-21e7-430f-8d0a-b038c15c347d" (UID: "eaf4d2ef-21e7-430f-8d0a-b038c15c347d"). InnerVolumeSpecName "kube-api-access-4wczf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:01:23 crc kubenswrapper[4825]: I0310 07:01:23.610606 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eaf4d2ef-21e7-430f-8d0a-b038c15c347d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eaf4d2ef-21e7-430f-8d0a-b038c15c347d" (UID: "eaf4d2ef-21e7-430f-8d0a-b038c15c347d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:01:23 crc kubenswrapper[4825]: I0310 07:01:23.681567 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfvkq\" (UniqueName: \"kubernetes.io/projected/a5f3c59e-41fb-4805-b696-47f2095503e1-kube-api-access-wfvkq\") pod \"cert-manager-cainjector-5545bd876-hlzgp\" (UID: \"a5f3c59e-41fb-4805-b696-47f2095503e1\") " pod="cert-manager/cert-manager-cainjector-5545bd876-hlzgp" Mar 10 07:01:23 crc kubenswrapper[4825]: I0310 07:01:23.681624 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a5f3c59e-41fb-4805-b696-47f2095503e1-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-hlzgp\" (UID: \"a5f3c59e-41fb-4805-b696-47f2095503e1\") " pod="cert-manager/cert-manager-cainjector-5545bd876-hlzgp" Mar 10 07:01:23 crc kubenswrapper[4825]: I0310 07:01:23.681720 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eaf4d2ef-21e7-430f-8d0a-b038c15c347d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 07:01:23 crc kubenswrapper[4825]: I0310 07:01:23.681733 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wczf\" (UniqueName: \"kubernetes.io/projected/eaf4d2ef-21e7-430f-8d0a-b038c15c347d-kube-api-access-4wczf\") on node \"crc\" DevicePath \"\"" Mar 10 07:01:23 crc kubenswrapper[4825]: I0310 07:01:23.696813 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a5f3c59e-41fb-4805-b696-47f2095503e1-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-hlzgp\" (UID: \"a5f3c59e-41fb-4805-b696-47f2095503e1\") " pod="cert-manager/cert-manager-cainjector-5545bd876-hlzgp" Mar 10 07:01:23 crc kubenswrapper[4825]: I0310 07:01:23.697061 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-wfvkq\" (UniqueName: \"kubernetes.io/projected/a5f3c59e-41fb-4805-b696-47f2095503e1-kube-api-access-wfvkq\") pod \"cert-manager-cainjector-5545bd876-hlzgp\" (UID: \"a5f3c59e-41fb-4805-b696-47f2095503e1\") " pod="cert-manager/cert-manager-cainjector-5545bd876-hlzgp" Mar 10 07:01:23 crc kubenswrapper[4825]: I0310 07:01:23.823485 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-hlzgp" Mar 10 07:01:24 crc kubenswrapper[4825]: I0310 07:01:24.257383 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-hlzgp"] Mar 10 07:01:24 crc kubenswrapper[4825]: I0310 07:01:24.397035 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-hlzgp" event={"ID":"a5f3c59e-41fb-4805-b696-47f2095503e1","Type":"ContainerStarted","Data":"f6674e06eb26d4cfce62e7dcf05d9a690c4ef38e3d1c1bb07e2fecbce0d99371"} Mar 10 07:01:24 crc kubenswrapper[4825]: I0310 07:01:24.397087 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cqff6" Mar 10 07:01:24 crc kubenswrapper[4825]: I0310 07:01:24.425094 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cqff6"] Mar 10 07:01:24 crc kubenswrapper[4825]: I0310 07:01:24.428447 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cqff6"] Mar 10 07:01:25 crc kubenswrapper[4825]: I0310 07:01:25.244615 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eaf4d2ef-21e7-430f-8d0a-b038c15c347d" path="/var/lib/kubelet/pods/eaf4d2ef-21e7-430f-8d0a-b038c15c347d/volumes" Mar 10 07:01:25 crc kubenswrapper[4825]: I0310 07:01:25.548308 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xnkcr"] Mar 10 07:01:25 crc kubenswrapper[4825]: I0310 07:01:25.548620 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xnkcr" podUID="fc8a46ad-c578-4f98-978f-fb0336bbbc07" containerName="registry-server" containerID="cri-o://47fc39ff164f57925dd53db1982b4d95eea5efdef828f6f2731c4b831e92521a" gracePeriod=2 Mar 10 07:01:26 crc kubenswrapper[4825]: I0310 07:01:26.412330 4825 generic.go:334] "Generic (PLEG): container finished" podID="fc8a46ad-c578-4f98-978f-fb0336bbbc07" containerID="47fc39ff164f57925dd53db1982b4d95eea5efdef828f6f2731c4b831e92521a" exitCode=0 Mar 10 07:01:26 crc kubenswrapper[4825]: I0310 07:01:26.412370 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xnkcr" event={"ID":"fc8a46ad-c578-4f98-978f-fb0336bbbc07","Type":"ContainerDied","Data":"47fc39ff164f57925dd53db1982b4d95eea5efdef828f6f2731c4b831e92521a"} Mar 10 07:01:27 crc kubenswrapper[4825]: I0310 07:01:27.222973 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xnkcr" Mar 10 07:01:27 crc kubenswrapper[4825]: I0310 07:01:27.231519 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc8a46ad-c578-4f98-978f-fb0336bbbc07-catalog-content\") pod \"fc8a46ad-c578-4f98-978f-fb0336bbbc07\" (UID: \"fc8a46ad-c578-4f98-978f-fb0336bbbc07\") " Mar 10 07:01:27 crc kubenswrapper[4825]: I0310 07:01:27.231567 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc8a46ad-c578-4f98-978f-fb0336bbbc07-utilities\") pod \"fc8a46ad-c578-4f98-978f-fb0336bbbc07\" (UID: \"fc8a46ad-c578-4f98-978f-fb0336bbbc07\") " Mar 10 07:01:27 crc kubenswrapper[4825]: I0310 07:01:27.232563 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc8a46ad-c578-4f98-978f-fb0336bbbc07-utilities" (OuterVolumeSpecName: "utilities") pod "fc8a46ad-c578-4f98-978f-fb0336bbbc07" (UID: "fc8a46ad-c578-4f98-978f-fb0336bbbc07"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:01:27 crc kubenswrapper[4825]: I0310 07:01:27.280780 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc8a46ad-c578-4f98-978f-fb0336bbbc07-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc8a46ad-c578-4f98-978f-fb0336bbbc07" (UID: "fc8a46ad-c578-4f98-978f-fb0336bbbc07"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:01:27 crc kubenswrapper[4825]: I0310 07:01:27.332753 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pf2r\" (UniqueName: \"kubernetes.io/projected/fc8a46ad-c578-4f98-978f-fb0336bbbc07-kube-api-access-2pf2r\") pod \"fc8a46ad-c578-4f98-978f-fb0336bbbc07\" (UID: \"fc8a46ad-c578-4f98-978f-fb0336bbbc07\") " Mar 10 07:01:27 crc kubenswrapper[4825]: I0310 07:01:27.333317 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc8a46ad-c578-4f98-978f-fb0336bbbc07-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 07:01:27 crc kubenswrapper[4825]: I0310 07:01:27.333333 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc8a46ad-c578-4f98-978f-fb0336bbbc07-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 07:01:27 crc kubenswrapper[4825]: I0310 07:01:27.339240 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc8a46ad-c578-4f98-978f-fb0336bbbc07-kube-api-access-2pf2r" (OuterVolumeSpecName: "kube-api-access-2pf2r") pod "fc8a46ad-c578-4f98-978f-fb0336bbbc07" (UID: "fc8a46ad-c578-4f98-978f-fb0336bbbc07"). InnerVolumeSpecName "kube-api-access-2pf2r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:01:27 crc kubenswrapper[4825]: I0310 07:01:27.420765 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-7sr7h" event={"ID":"11b4eb4d-13b1-4244-ae96-c77df6e04d59","Type":"ContainerStarted","Data":"ac13cb8daf5561a1b7259178bf4906ed41afd74b96578cd14b3fee6c7b9ce1b8"} Mar 10 07:01:27 crc kubenswrapper[4825]: I0310 07:01:27.421090 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-7sr7h" Mar 10 07:01:27 crc kubenswrapper[4825]: I0310 07:01:27.422589 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-hlzgp" event={"ID":"a5f3c59e-41fb-4805-b696-47f2095503e1","Type":"ContainerStarted","Data":"cb790980aae2dfce98a572235aa56a3b1210893da212a9c10ffbd43c75bc1c99"} Mar 10 07:01:27 crc kubenswrapper[4825]: I0310 07:01:27.425402 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xnkcr" event={"ID":"fc8a46ad-c578-4f98-978f-fb0336bbbc07","Type":"ContainerDied","Data":"ea2854182a6f2f5ed66038928276209a15ea91c9a4b2a80406778eb59ad8acb8"} Mar 10 07:01:27 crc kubenswrapper[4825]: I0310 07:01:27.425446 4825 scope.go:117] "RemoveContainer" containerID="47fc39ff164f57925dd53db1982b4d95eea5efdef828f6f2731c4b831e92521a" Mar 10 07:01:27 crc kubenswrapper[4825]: I0310 07:01:27.425464 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xnkcr" Mar 10 07:01:27 crc kubenswrapper[4825]: I0310 07:01:27.434521 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pf2r\" (UniqueName: \"kubernetes.io/projected/fc8a46ad-c578-4f98-978f-fb0336bbbc07-kube-api-access-2pf2r\") on node \"crc\" DevicePath \"\"" Mar 10 07:01:27 crc kubenswrapper[4825]: I0310 07:01:27.444491 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-7sr7h" podStartSLOduration=1.458568335 podStartE2EDuration="6.444464588s" podCreationTimestamp="2026-03-10 07:01:21 +0000 UTC" firstStartedPulling="2026-03-10 07:01:21.931800292 +0000 UTC m=+1034.961580907" lastFinishedPulling="2026-03-10 07:01:26.917696535 +0000 UTC m=+1039.947477160" observedRunningTime="2026-03-10 07:01:27.439117418 +0000 UTC m=+1040.468898033" watchObservedRunningTime="2026-03-10 07:01:27.444464588 +0000 UTC m=+1040.474245223" Mar 10 07:01:27 crc kubenswrapper[4825]: I0310 07:01:27.446387 4825 scope.go:117] "RemoveContainer" containerID="4470548f42a074a30f3cffbd2428a4a442bd6aec8a9278f7d76e53fb88261cbd" Mar 10 07:01:27 crc kubenswrapper[4825]: I0310 07:01:27.456397 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xnkcr"] Mar 10 07:01:27 crc kubenswrapper[4825]: I0310 07:01:27.461479 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xnkcr"] Mar 10 07:01:27 crc kubenswrapper[4825]: I0310 07:01:27.467750 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-hlzgp" podStartSLOduration=1.815708816 podStartE2EDuration="4.467728704s" podCreationTimestamp="2026-03-10 07:01:23 +0000 UTC" firstStartedPulling="2026-03-10 07:01:24.271830527 +0000 UTC m=+1037.301611182" lastFinishedPulling="2026-03-10 07:01:26.923850445 +0000 UTC m=+1039.953631070" 
observedRunningTime="2026-03-10 07:01:27.464208842 +0000 UTC m=+1040.493989477" watchObservedRunningTime="2026-03-10 07:01:27.467728704 +0000 UTC m=+1040.497509339" Mar 10 07:01:27 crc kubenswrapper[4825]: I0310 07:01:27.478767 4825 scope.go:117] "RemoveContainer" containerID="c487043cefa1f99622765f865ff7d86575a498acb8938984608895fb0a588444" Mar 10 07:01:29 crc kubenswrapper[4825]: I0310 07:01:29.253549 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc8a46ad-c578-4f98-978f-fb0336bbbc07" path="/var/lib/kubelet/pods/fc8a46ad-c578-4f98-978f-fb0336bbbc07/volumes" Mar 10 07:01:30 crc kubenswrapper[4825]: I0310 07:01:30.898008 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-58dfk"] Mar 10 07:01:30 crc kubenswrapper[4825]: E0310 07:01:30.898640 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc8a46ad-c578-4f98-978f-fb0336bbbc07" containerName="extract-content" Mar 10 07:01:30 crc kubenswrapper[4825]: I0310 07:01:30.898655 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc8a46ad-c578-4f98-978f-fb0336bbbc07" containerName="extract-content" Mar 10 07:01:30 crc kubenswrapper[4825]: E0310 07:01:30.898675 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc8a46ad-c578-4f98-978f-fb0336bbbc07" containerName="registry-server" Mar 10 07:01:30 crc kubenswrapper[4825]: I0310 07:01:30.898683 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc8a46ad-c578-4f98-978f-fb0336bbbc07" containerName="registry-server" Mar 10 07:01:30 crc kubenswrapper[4825]: E0310 07:01:30.898703 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc8a46ad-c578-4f98-978f-fb0336bbbc07" containerName="extract-utilities" Mar 10 07:01:30 crc kubenswrapper[4825]: I0310 07:01:30.898711 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc8a46ad-c578-4f98-978f-fb0336bbbc07" containerName="extract-utilities" Mar 10 07:01:30 crc kubenswrapper[4825]: 
I0310 07:01:30.898853 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc8a46ad-c578-4f98-978f-fb0336bbbc07" containerName="registry-server" Mar 10 07:01:30 crc kubenswrapper[4825]: I0310 07:01:30.899842 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-58dfk" Mar 10 07:01:30 crc kubenswrapper[4825]: I0310 07:01:30.953229 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-58dfk"] Mar 10 07:01:31 crc kubenswrapper[4825]: I0310 07:01:31.084613 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a79ceb35-a79e-4988-a6e9-9d9694ca27d2-utilities\") pod \"certified-operators-58dfk\" (UID: \"a79ceb35-a79e-4988-a6e9-9d9694ca27d2\") " pod="openshift-marketplace/certified-operators-58dfk" Mar 10 07:01:31 crc kubenswrapper[4825]: I0310 07:01:31.084679 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79rl5\" (UniqueName: \"kubernetes.io/projected/a79ceb35-a79e-4988-a6e9-9d9694ca27d2-kube-api-access-79rl5\") pod \"certified-operators-58dfk\" (UID: \"a79ceb35-a79e-4988-a6e9-9d9694ca27d2\") " pod="openshift-marketplace/certified-operators-58dfk" Mar 10 07:01:31 crc kubenswrapper[4825]: I0310 07:01:31.084746 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a79ceb35-a79e-4988-a6e9-9d9694ca27d2-catalog-content\") pod \"certified-operators-58dfk\" (UID: \"a79ceb35-a79e-4988-a6e9-9d9694ca27d2\") " pod="openshift-marketplace/certified-operators-58dfk" Mar 10 07:01:31 crc kubenswrapper[4825]: I0310 07:01:31.185848 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/a79ceb35-a79e-4988-a6e9-9d9694ca27d2-utilities\") pod \"certified-operators-58dfk\" (UID: \"a79ceb35-a79e-4988-a6e9-9d9694ca27d2\") " pod="openshift-marketplace/certified-operators-58dfk" Mar 10 07:01:31 crc kubenswrapper[4825]: I0310 07:01:31.185938 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79rl5\" (UniqueName: \"kubernetes.io/projected/a79ceb35-a79e-4988-a6e9-9d9694ca27d2-kube-api-access-79rl5\") pod \"certified-operators-58dfk\" (UID: \"a79ceb35-a79e-4988-a6e9-9d9694ca27d2\") " pod="openshift-marketplace/certified-operators-58dfk" Mar 10 07:01:31 crc kubenswrapper[4825]: I0310 07:01:31.186016 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a79ceb35-a79e-4988-a6e9-9d9694ca27d2-catalog-content\") pod \"certified-operators-58dfk\" (UID: \"a79ceb35-a79e-4988-a6e9-9d9694ca27d2\") " pod="openshift-marketplace/certified-operators-58dfk" Mar 10 07:01:31 crc kubenswrapper[4825]: I0310 07:01:31.186565 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a79ceb35-a79e-4988-a6e9-9d9694ca27d2-utilities\") pod \"certified-operators-58dfk\" (UID: \"a79ceb35-a79e-4988-a6e9-9d9694ca27d2\") " pod="openshift-marketplace/certified-operators-58dfk" Mar 10 07:01:31 crc kubenswrapper[4825]: I0310 07:01:31.186629 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a79ceb35-a79e-4988-a6e9-9d9694ca27d2-catalog-content\") pod \"certified-operators-58dfk\" (UID: \"a79ceb35-a79e-4988-a6e9-9d9694ca27d2\") " pod="openshift-marketplace/certified-operators-58dfk" Mar 10 07:01:31 crc kubenswrapper[4825]: I0310 07:01:31.213225 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79rl5\" (UniqueName: 
\"kubernetes.io/projected/a79ceb35-a79e-4988-a6e9-9d9694ca27d2-kube-api-access-79rl5\") pod \"certified-operators-58dfk\" (UID: \"a79ceb35-a79e-4988-a6e9-9d9694ca27d2\") " pod="openshift-marketplace/certified-operators-58dfk" Mar 10 07:01:31 crc kubenswrapper[4825]: I0310 07:01:31.225151 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-58dfk" Mar 10 07:01:31 crc kubenswrapper[4825]: I0310 07:01:31.682523 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-58dfk"] Mar 10 07:01:31 crc kubenswrapper[4825]: W0310 07:01:31.686141 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda79ceb35_a79e_4988_a6e9_9d9694ca27d2.slice/crio-0ef7337547fddd2d1c53e36bd91759bfaafe2a3aefbb52bcc74d5fa680a9980b WatchSource:0}: Error finding container 0ef7337547fddd2d1c53e36bd91759bfaafe2a3aefbb52bcc74d5fa680a9980b: Status 404 returned error can't find the container with id 0ef7337547fddd2d1c53e36bd91759bfaafe2a3aefbb52bcc74d5fa680a9980b Mar 10 07:01:32 crc kubenswrapper[4825]: I0310 07:01:32.469368 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-58dfk" event={"ID":"a79ceb35-a79e-4988-a6e9-9d9694ca27d2","Type":"ContainerStarted","Data":"0ef7337547fddd2d1c53e36bd91759bfaafe2a3aefbb52bcc74d5fa680a9980b"} Mar 10 07:01:32 crc kubenswrapper[4825]: I0310 07:01:32.504280 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-f568q"] Mar 10 07:01:32 crc kubenswrapper[4825]: I0310 07:01:32.506227 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-f568q" Mar 10 07:01:32 crc kubenswrapper[4825]: I0310 07:01:32.511506 4825 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-xpvr6" Mar 10 07:01:32 crc kubenswrapper[4825]: I0310 07:01:32.512231 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-f568q"] Mar 10 07:01:32 crc kubenswrapper[4825]: I0310 07:01:32.614950 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bttrk\" (UniqueName: \"kubernetes.io/projected/2d2dbde8-ed02-48ce-9b9e-103834db8e3a-kube-api-access-bttrk\") pod \"cert-manager-545d4d4674-f568q\" (UID: \"2d2dbde8-ed02-48ce-9b9e-103834db8e3a\") " pod="cert-manager/cert-manager-545d4d4674-f568q" Mar 10 07:01:32 crc kubenswrapper[4825]: I0310 07:01:32.615195 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2d2dbde8-ed02-48ce-9b9e-103834db8e3a-bound-sa-token\") pod \"cert-manager-545d4d4674-f568q\" (UID: \"2d2dbde8-ed02-48ce-9b9e-103834db8e3a\") " pod="cert-manager/cert-manager-545d4d4674-f568q" Mar 10 07:01:32 crc kubenswrapper[4825]: I0310 07:01:32.717354 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2d2dbde8-ed02-48ce-9b9e-103834db8e3a-bound-sa-token\") pod \"cert-manager-545d4d4674-f568q\" (UID: \"2d2dbde8-ed02-48ce-9b9e-103834db8e3a\") " pod="cert-manager/cert-manager-545d4d4674-f568q" Mar 10 07:01:32 crc kubenswrapper[4825]: I0310 07:01:32.717553 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bttrk\" (UniqueName: \"kubernetes.io/projected/2d2dbde8-ed02-48ce-9b9e-103834db8e3a-kube-api-access-bttrk\") pod \"cert-manager-545d4d4674-f568q\" (UID: 
\"2d2dbde8-ed02-48ce-9b9e-103834db8e3a\") " pod="cert-manager/cert-manager-545d4d4674-f568q" Mar 10 07:01:32 crc kubenswrapper[4825]: I0310 07:01:32.753739 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2d2dbde8-ed02-48ce-9b9e-103834db8e3a-bound-sa-token\") pod \"cert-manager-545d4d4674-f568q\" (UID: \"2d2dbde8-ed02-48ce-9b9e-103834db8e3a\") " pod="cert-manager/cert-manager-545d4d4674-f568q" Mar 10 07:01:32 crc kubenswrapper[4825]: I0310 07:01:32.754297 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bttrk\" (UniqueName: \"kubernetes.io/projected/2d2dbde8-ed02-48ce-9b9e-103834db8e3a-kube-api-access-bttrk\") pod \"cert-manager-545d4d4674-f568q\" (UID: \"2d2dbde8-ed02-48ce-9b9e-103834db8e3a\") " pod="cert-manager/cert-manager-545d4d4674-f568q" Mar 10 07:01:32 crc kubenswrapper[4825]: I0310 07:01:32.832617 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-f568q" Mar 10 07:01:33 crc kubenswrapper[4825]: I0310 07:01:33.377321 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-f568q"] Mar 10 07:01:33 crc kubenswrapper[4825]: W0310 07:01:33.386406 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d2dbde8_ed02_48ce_9b9e_103834db8e3a.slice/crio-e304ef9316d38b090a7630bc769779d5a00fb2bfe70d1529651aaa3be2ff1784 WatchSource:0}: Error finding container e304ef9316d38b090a7630bc769779d5a00fb2bfe70d1529651aaa3be2ff1784: Status 404 returned error can't find the container with id e304ef9316d38b090a7630bc769779d5a00fb2bfe70d1529651aaa3be2ff1784 Mar 10 07:01:33 crc kubenswrapper[4825]: I0310 07:01:33.478067 4825 generic.go:334] "Generic (PLEG): container finished" podID="a79ceb35-a79e-4988-a6e9-9d9694ca27d2" 
containerID="b371d5d10558c5d9fa2201ad7ed07749c32fa3f9b2f413516faab89a4ca2d8fb" exitCode=0 Mar 10 07:01:33 crc kubenswrapper[4825]: I0310 07:01:33.478191 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-58dfk" event={"ID":"a79ceb35-a79e-4988-a6e9-9d9694ca27d2","Type":"ContainerDied","Data":"b371d5d10558c5d9fa2201ad7ed07749c32fa3f9b2f413516faab89a4ca2d8fb"} Mar 10 07:01:33 crc kubenswrapper[4825]: I0310 07:01:33.483629 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-f568q" event={"ID":"2d2dbde8-ed02-48ce-9b9e-103834db8e3a","Type":"ContainerStarted","Data":"e304ef9316d38b090a7630bc769779d5a00fb2bfe70d1529651aaa3be2ff1784"} Mar 10 07:01:34 crc kubenswrapper[4825]: I0310 07:01:34.496084 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-f568q" event={"ID":"2d2dbde8-ed02-48ce-9b9e-103834db8e3a","Type":"ContainerStarted","Data":"ff29380564006a1c58adacdc9fb6d1ffc7835305116586bb0e464f1d3f900fc3"} Mar 10 07:01:34 crc kubenswrapper[4825]: I0310 07:01:34.537323 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-f568q" podStartSLOduration=2.537294479 podStartE2EDuration="2.537294479s" podCreationTimestamp="2026-03-10 07:01:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:01:34.518942201 +0000 UTC m=+1047.548722846" watchObservedRunningTime="2026-03-10 07:01:34.537294479 +0000 UTC m=+1047.567075134" Mar 10 07:01:35 crc kubenswrapper[4825]: I0310 07:01:35.515176 4825 generic.go:334] "Generic (PLEG): container finished" podID="a79ceb35-a79e-4988-a6e9-9d9694ca27d2" containerID="769b0c06e66e360f78d6128d674eb4ff5f13babb14fc08125da7261463c9dca2" exitCode=0 Mar 10 07:01:35 crc kubenswrapper[4825]: I0310 07:01:35.515730 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-58dfk" event={"ID":"a79ceb35-a79e-4988-a6e9-9d9694ca27d2","Type":"ContainerDied","Data":"769b0c06e66e360f78d6128d674eb4ff5f13babb14fc08125da7261463c9dca2"} Mar 10 07:01:36 crc kubenswrapper[4825]: I0310 07:01:36.443805 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-7sr7h" Mar 10 07:01:36 crc kubenswrapper[4825]: I0310 07:01:36.524585 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-58dfk" event={"ID":"a79ceb35-a79e-4988-a6e9-9d9694ca27d2","Type":"ContainerStarted","Data":"d4bd8f62309e303de3b9cf2bcff44ed364b619f06b5bbb12e99709a35177cb56"} Mar 10 07:01:36 crc kubenswrapper[4825]: I0310 07:01:36.546572 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-58dfk" podStartSLOduration=4.057848269 podStartE2EDuration="6.546555121s" podCreationTimestamp="2026-03-10 07:01:30 +0000 UTC" firstStartedPulling="2026-03-10 07:01:33.481372941 +0000 UTC m=+1046.511153596" lastFinishedPulling="2026-03-10 07:01:35.970079803 +0000 UTC m=+1048.999860448" observedRunningTime="2026-03-10 07:01:36.540680588 +0000 UTC m=+1049.570461213" watchObservedRunningTime="2026-03-10 07:01:36.546555121 +0000 UTC m=+1049.576335746" Mar 10 07:01:39 crc kubenswrapper[4825]: I0310 07:01:39.780442 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-6g7zc"] Mar 10 07:01:39 crc kubenswrapper[4825]: I0310 07:01:39.781498 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-6g7zc" Mar 10 07:01:39 crc kubenswrapper[4825]: I0310 07:01:39.783967 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 10 07:01:39 crc kubenswrapper[4825]: I0310 07:01:39.784270 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-vklmj" Mar 10 07:01:39 crc kubenswrapper[4825]: I0310 07:01:39.784628 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 10 07:01:39 crc kubenswrapper[4825]: I0310 07:01:39.810765 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-6g7zc"] Mar 10 07:01:39 crc kubenswrapper[4825]: I0310 07:01:39.953527 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hscx\" (UniqueName: \"kubernetes.io/projected/e71cebfd-7873-4481-aeb4-9f896e061e4e-kube-api-access-5hscx\") pod \"openstack-operator-index-6g7zc\" (UID: \"e71cebfd-7873-4481-aeb4-9f896e061e4e\") " pod="openstack-operators/openstack-operator-index-6g7zc" Mar 10 07:01:40 crc kubenswrapper[4825]: I0310 07:01:40.055083 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hscx\" (UniqueName: \"kubernetes.io/projected/e71cebfd-7873-4481-aeb4-9f896e061e4e-kube-api-access-5hscx\") pod \"openstack-operator-index-6g7zc\" (UID: \"e71cebfd-7873-4481-aeb4-9f896e061e4e\") " pod="openstack-operators/openstack-operator-index-6g7zc" Mar 10 07:01:40 crc kubenswrapper[4825]: I0310 07:01:40.073498 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hscx\" (UniqueName: \"kubernetes.io/projected/e71cebfd-7873-4481-aeb4-9f896e061e4e-kube-api-access-5hscx\") pod \"openstack-operator-index-6g7zc\" (UID: 
\"e71cebfd-7873-4481-aeb4-9f896e061e4e\") " pod="openstack-operators/openstack-operator-index-6g7zc" Mar 10 07:01:40 crc kubenswrapper[4825]: I0310 07:01:40.103738 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-6g7zc" Mar 10 07:01:40 crc kubenswrapper[4825]: I0310 07:01:40.619095 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-6g7zc"] Mar 10 07:01:40 crc kubenswrapper[4825]: W0310 07:01:40.629318 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode71cebfd_7873_4481_aeb4_9f896e061e4e.slice/crio-20bdd9df49444c915ef6842d71c567c1c4d48df0d528f94c218357eac28625a9 WatchSource:0}: Error finding container 20bdd9df49444c915ef6842d71c567c1c4d48df0d528f94c218357eac28625a9: Status 404 returned error can't find the container with id 20bdd9df49444c915ef6842d71c567c1c4d48df0d528f94c218357eac28625a9 Mar 10 07:01:41 crc kubenswrapper[4825]: I0310 07:01:41.226056 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-58dfk" Mar 10 07:01:41 crc kubenswrapper[4825]: I0310 07:01:41.226789 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-58dfk" Mar 10 07:01:41 crc kubenswrapper[4825]: I0310 07:01:41.298442 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-58dfk" Mar 10 07:01:41 crc kubenswrapper[4825]: I0310 07:01:41.563013 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-6g7zc" event={"ID":"e71cebfd-7873-4481-aeb4-9f896e061e4e","Type":"ContainerStarted","Data":"48b38c5d5028858a509de24c11dbc5257ce28a91333e6487c6202a4100d5e0e6"} Mar 10 07:01:41 crc kubenswrapper[4825]: I0310 07:01:41.563080 4825 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-6g7zc" event={"ID":"e71cebfd-7873-4481-aeb4-9f896e061e4e","Type":"ContainerStarted","Data":"20bdd9df49444c915ef6842d71c567c1c4d48df0d528f94c218357eac28625a9"} Mar 10 07:01:41 crc kubenswrapper[4825]: I0310 07:01:41.586471 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-6g7zc" podStartSLOduration=1.9076312739999999 podStartE2EDuration="2.586457001s" podCreationTimestamp="2026-03-10 07:01:39 +0000 UTC" firstStartedPulling="2026-03-10 07:01:40.631078905 +0000 UTC m=+1053.660859520" lastFinishedPulling="2026-03-10 07:01:41.309904632 +0000 UTC m=+1054.339685247" observedRunningTime="2026-03-10 07:01:41.582623941 +0000 UTC m=+1054.612404586" watchObservedRunningTime="2026-03-10 07:01:41.586457001 +0000 UTC m=+1054.616237616" Mar 10 07:01:41 crc kubenswrapper[4825]: I0310 07:01:41.634830 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-58dfk" Mar 10 07:01:44 crc kubenswrapper[4825]: I0310 07:01:44.573386 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-6g7zc"] Mar 10 07:01:44 crc kubenswrapper[4825]: I0310 07:01:44.574024 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-6g7zc" podUID="e71cebfd-7873-4481-aeb4-9f896e061e4e" containerName="registry-server" containerID="cri-o://48b38c5d5028858a509de24c11dbc5257ce28a91333e6487c6202a4100d5e0e6" gracePeriod=2 Mar 10 07:01:45 crc kubenswrapper[4825]: I0310 07:01:45.029737 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-6g7zc" Mar 10 07:01:45 crc kubenswrapper[4825]: I0310 07:01:45.128491 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hscx\" (UniqueName: \"kubernetes.io/projected/e71cebfd-7873-4481-aeb4-9f896e061e4e-kube-api-access-5hscx\") pod \"e71cebfd-7873-4481-aeb4-9f896e061e4e\" (UID: \"e71cebfd-7873-4481-aeb4-9f896e061e4e\") " Mar 10 07:01:45 crc kubenswrapper[4825]: I0310 07:01:45.134321 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e71cebfd-7873-4481-aeb4-9f896e061e4e-kube-api-access-5hscx" (OuterVolumeSpecName: "kube-api-access-5hscx") pod "e71cebfd-7873-4481-aeb4-9f896e061e4e" (UID: "e71cebfd-7873-4481-aeb4-9f896e061e4e"). InnerVolumeSpecName "kube-api-access-5hscx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:01:45 crc kubenswrapper[4825]: I0310 07:01:45.230268 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hscx\" (UniqueName: \"kubernetes.io/projected/e71cebfd-7873-4481-aeb4-9f896e061e4e-kube-api-access-5hscx\") on node \"crc\" DevicePath \"\"" Mar 10 07:01:45 crc kubenswrapper[4825]: I0310 07:01:45.384373 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-f9ksf"] Mar 10 07:01:45 crc kubenswrapper[4825]: E0310 07:01:45.384789 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e71cebfd-7873-4481-aeb4-9f896e061e4e" containerName="registry-server" Mar 10 07:01:45 crc kubenswrapper[4825]: I0310 07:01:45.384817 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="e71cebfd-7873-4481-aeb4-9f896e061e4e" containerName="registry-server" Mar 10 07:01:45 crc kubenswrapper[4825]: I0310 07:01:45.385021 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="e71cebfd-7873-4481-aeb4-9f896e061e4e" containerName="registry-server" Mar 10 07:01:45 crc 
kubenswrapper[4825]: I0310 07:01:45.385729 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-f9ksf" Mar 10 07:01:45 crc kubenswrapper[4825]: I0310 07:01:45.394101 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-f9ksf"] Mar 10 07:01:45 crc kubenswrapper[4825]: I0310 07:01:45.535630 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkz2k\" (UniqueName: \"kubernetes.io/projected/38fb683b-3108-46e6-9532-3d971e047bde-kube-api-access-xkz2k\") pod \"openstack-operator-index-f9ksf\" (UID: \"38fb683b-3108-46e6-9532-3d971e047bde\") " pod="openstack-operators/openstack-operator-index-f9ksf" Mar 10 07:01:45 crc kubenswrapper[4825]: I0310 07:01:45.604334 4825 generic.go:334] "Generic (PLEG): container finished" podID="e71cebfd-7873-4481-aeb4-9f896e061e4e" containerID="48b38c5d5028858a509de24c11dbc5257ce28a91333e6487c6202a4100d5e0e6" exitCode=0 Mar 10 07:01:45 crc kubenswrapper[4825]: I0310 07:01:45.604404 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-6g7zc" event={"ID":"e71cebfd-7873-4481-aeb4-9f896e061e4e","Type":"ContainerDied","Data":"48b38c5d5028858a509de24c11dbc5257ce28a91333e6487c6202a4100d5e0e6"} Mar 10 07:01:45 crc kubenswrapper[4825]: I0310 07:01:45.604472 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-6g7zc" event={"ID":"e71cebfd-7873-4481-aeb4-9f896e061e4e","Type":"ContainerDied","Data":"20bdd9df49444c915ef6842d71c567c1c4d48df0d528f94c218357eac28625a9"} Mar 10 07:01:45 crc kubenswrapper[4825]: I0310 07:01:45.604500 4825 scope.go:117] "RemoveContainer" containerID="48b38c5d5028858a509de24c11dbc5257ce28a91333e6487c6202a4100d5e0e6" Mar 10 07:01:45 crc kubenswrapper[4825]: I0310 07:01:45.605760 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-6g7zc" Mar 10 07:01:45 crc kubenswrapper[4825]: I0310 07:01:45.636837 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkz2k\" (UniqueName: \"kubernetes.io/projected/38fb683b-3108-46e6-9532-3d971e047bde-kube-api-access-xkz2k\") pod \"openstack-operator-index-f9ksf\" (UID: \"38fb683b-3108-46e6-9532-3d971e047bde\") " pod="openstack-operators/openstack-operator-index-f9ksf" Mar 10 07:01:45 crc kubenswrapper[4825]: I0310 07:01:45.638674 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-6g7zc"] Mar 10 07:01:45 crc kubenswrapper[4825]: I0310 07:01:45.640506 4825 scope.go:117] "RemoveContainer" containerID="48b38c5d5028858a509de24c11dbc5257ce28a91333e6487c6202a4100d5e0e6" Mar 10 07:01:45 crc kubenswrapper[4825]: E0310 07:01:45.641573 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48b38c5d5028858a509de24c11dbc5257ce28a91333e6487c6202a4100d5e0e6\": container with ID starting with 48b38c5d5028858a509de24c11dbc5257ce28a91333e6487c6202a4100d5e0e6 not found: ID does not exist" containerID="48b38c5d5028858a509de24c11dbc5257ce28a91333e6487c6202a4100d5e0e6" Mar 10 07:01:45 crc kubenswrapper[4825]: I0310 07:01:45.641626 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48b38c5d5028858a509de24c11dbc5257ce28a91333e6487c6202a4100d5e0e6"} err="failed to get container status \"48b38c5d5028858a509de24c11dbc5257ce28a91333e6487c6202a4100d5e0e6\": rpc error: code = NotFound desc = could not find container \"48b38c5d5028858a509de24c11dbc5257ce28a91333e6487c6202a4100d5e0e6\": container with ID starting with 48b38c5d5028858a509de24c11dbc5257ce28a91333e6487c6202a4100d5e0e6 not found: ID does not exist" Mar 10 07:01:45 crc kubenswrapper[4825]: I0310 07:01:45.647754 4825 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-6g7zc"] Mar 10 07:01:45 crc kubenswrapper[4825]: I0310 07:01:45.668434 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkz2k\" (UniqueName: \"kubernetes.io/projected/38fb683b-3108-46e6-9532-3d971e047bde-kube-api-access-xkz2k\") pod \"openstack-operator-index-f9ksf\" (UID: \"38fb683b-3108-46e6-9532-3d971e047bde\") " pod="openstack-operators/openstack-operator-index-f9ksf" Mar 10 07:01:45 crc kubenswrapper[4825]: I0310 07:01:45.717859 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-f9ksf" Mar 10 07:01:45 crc kubenswrapper[4825]: I0310 07:01:45.964077 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-58dfk"] Mar 10 07:01:45 crc kubenswrapper[4825]: I0310 07:01:45.964660 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-58dfk" podUID="a79ceb35-a79e-4988-a6e9-9d9694ca27d2" containerName="registry-server" containerID="cri-o://d4bd8f62309e303de3b9cf2bcff44ed364b619f06b5bbb12e99709a35177cb56" gracePeriod=2 Mar 10 07:01:46 crc kubenswrapper[4825]: I0310 07:01:46.021041 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-f9ksf"] Mar 10 07:01:46 crc kubenswrapper[4825]: W0310 07:01:46.075673 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38fb683b_3108_46e6_9532_3d971e047bde.slice/crio-92c7bcd10fc136a53ebfc5ef63f2711bc8ac22bd421dfc3b616d7efaeb1e7516 WatchSource:0}: Error finding container 92c7bcd10fc136a53ebfc5ef63f2711bc8ac22bd421dfc3b616d7efaeb1e7516: Status 404 returned error can't find the container with id 92c7bcd10fc136a53ebfc5ef63f2711bc8ac22bd421dfc3b616d7efaeb1e7516 Mar 10 07:01:46 crc kubenswrapper[4825]: 
I0310 07:01:46.372094 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-58dfk" Mar 10 07:01:46 crc kubenswrapper[4825]: I0310 07:01:46.550443 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79rl5\" (UniqueName: \"kubernetes.io/projected/a79ceb35-a79e-4988-a6e9-9d9694ca27d2-kube-api-access-79rl5\") pod \"a79ceb35-a79e-4988-a6e9-9d9694ca27d2\" (UID: \"a79ceb35-a79e-4988-a6e9-9d9694ca27d2\") " Mar 10 07:01:46 crc kubenswrapper[4825]: I0310 07:01:46.550639 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a79ceb35-a79e-4988-a6e9-9d9694ca27d2-catalog-content\") pod \"a79ceb35-a79e-4988-a6e9-9d9694ca27d2\" (UID: \"a79ceb35-a79e-4988-a6e9-9d9694ca27d2\") " Mar 10 07:01:46 crc kubenswrapper[4825]: I0310 07:01:46.550829 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a79ceb35-a79e-4988-a6e9-9d9694ca27d2-utilities\") pod \"a79ceb35-a79e-4988-a6e9-9d9694ca27d2\" (UID: \"a79ceb35-a79e-4988-a6e9-9d9694ca27d2\") " Mar 10 07:01:46 crc kubenswrapper[4825]: I0310 07:01:46.553385 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a79ceb35-a79e-4988-a6e9-9d9694ca27d2-utilities" (OuterVolumeSpecName: "utilities") pod "a79ceb35-a79e-4988-a6e9-9d9694ca27d2" (UID: "a79ceb35-a79e-4988-a6e9-9d9694ca27d2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:01:46 crc kubenswrapper[4825]: I0310 07:01:46.561248 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a79ceb35-a79e-4988-a6e9-9d9694ca27d2-kube-api-access-79rl5" (OuterVolumeSpecName: "kube-api-access-79rl5") pod "a79ceb35-a79e-4988-a6e9-9d9694ca27d2" (UID: "a79ceb35-a79e-4988-a6e9-9d9694ca27d2"). InnerVolumeSpecName "kube-api-access-79rl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:01:46 crc kubenswrapper[4825]: I0310 07:01:46.614835 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-f9ksf" event={"ID":"38fb683b-3108-46e6-9532-3d971e047bde","Type":"ContainerStarted","Data":"92c7bcd10fc136a53ebfc5ef63f2711bc8ac22bd421dfc3b616d7efaeb1e7516"} Mar 10 07:01:46 crc kubenswrapper[4825]: I0310 07:01:46.617652 4825 generic.go:334] "Generic (PLEG): container finished" podID="a79ceb35-a79e-4988-a6e9-9d9694ca27d2" containerID="d4bd8f62309e303de3b9cf2bcff44ed364b619f06b5bbb12e99709a35177cb56" exitCode=0 Mar 10 07:01:46 crc kubenswrapper[4825]: I0310 07:01:46.617694 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-58dfk" event={"ID":"a79ceb35-a79e-4988-a6e9-9d9694ca27d2","Type":"ContainerDied","Data":"d4bd8f62309e303de3b9cf2bcff44ed364b619f06b5bbb12e99709a35177cb56"} Mar 10 07:01:46 crc kubenswrapper[4825]: I0310 07:01:46.617711 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-58dfk" event={"ID":"a79ceb35-a79e-4988-a6e9-9d9694ca27d2","Type":"ContainerDied","Data":"0ef7337547fddd2d1c53e36bd91759bfaafe2a3aefbb52bcc74d5fa680a9980b"} Mar 10 07:01:46 crc kubenswrapper[4825]: I0310 07:01:46.617727 4825 scope.go:117] "RemoveContainer" containerID="d4bd8f62309e303de3b9cf2bcff44ed364b619f06b5bbb12e99709a35177cb56" Mar 10 07:01:46 crc kubenswrapper[4825]: I0310 07:01:46.617823 4825 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-58dfk" Mar 10 07:01:46 crc kubenswrapper[4825]: I0310 07:01:46.629158 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a79ceb35-a79e-4988-a6e9-9d9694ca27d2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a79ceb35-a79e-4988-a6e9-9d9694ca27d2" (UID: "a79ceb35-a79e-4988-a6e9-9d9694ca27d2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:01:46 crc kubenswrapper[4825]: I0310 07:01:46.654495 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a79ceb35-a79e-4988-a6e9-9d9694ca27d2-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 07:01:46 crc kubenswrapper[4825]: I0310 07:01:46.654523 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79rl5\" (UniqueName: \"kubernetes.io/projected/a79ceb35-a79e-4988-a6e9-9d9694ca27d2-kube-api-access-79rl5\") on node \"crc\" DevicePath \"\"" Mar 10 07:01:46 crc kubenswrapper[4825]: I0310 07:01:46.654533 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a79ceb35-a79e-4988-a6e9-9d9694ca27d2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 07:01:46 crc kubenswrapper[4825]: I0310 07:01:46.663549 4825 scope.go:117] "RemoveContainer" containerID="769b0c06e66e360f78d6128d674eb4ff5f13babb14fc08125da7261463c9dca2" Mar 10 07:01:46 crc kubenswrapper[4825]: I0310 07:01:46.684557 4825 scope.go:117] "RemoveContainer" containerID="b371d5d10558c5d9fa2201ad7ed07749c32fa3f9b2f413516faab89a4ca2d8fb" Mar 10 07:01:46 crc kubenswrapper[4825]: I0310 07:01:46.712694 4825 scope.go:117] "RemoveContainer" containerID="d4bd8f62309e303de3b9cf2bcff44ed364b619f06b5bbb12e99709a35177cb56" Mar 10 07:01:46 crc kubenswrapper[4825]: E0310 07:01:46.713357 4825 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4bd8f62309e303de3b9cf2bcff44ed364b619f06b5bbb12e99709a35177cb56\": container with ID starting with d4bd8f62309e303de3b9cf2bcff44ed364b619f06b5bbb12e99709a35177cb56 not found: ID does not exist" containerID="d4bd8f62309e303de3b9cf2bcff44ed364b619f06b5bbb12e99709a35177cb56" Mar 10 07:01:46 crc kubenswrapper[4825]: I0310 07:01:46.713442 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4bd8f62309e303de3b9cf2bcff44ed364b619f06b5bbb12e99709a35177cb56"} err="failed to get container status \"d4bd8f62309e303de3b9cf2bcff44ed364b619f06b5bbb12e99709a35177cb56\": rpc error: code = NotFound desc = could not find container \"d4bd8f62309e303de3b9cf2bcff44ed364b619f06b5bbb12e99709a35177cb56\": container with ID starting with d4bd8f62309e303de3b9cf2bcff44ed364b619f06b5bbb12e99709a35177cb56 not found: ID does not exist" Mar 10 07:01:46 crc kubenswrapper[4825]: I0310 07:01:46.713481 4825 scope.go:117] "RemoveContainer" containerID="769b0c06e66e360f78d6128d674eb4ff5f13babb14fc08125da7261463c9dca2" Mar 10 07:01:46 crc kubenswrapper[4825]: E0310 07:01:46.714051 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"769b0c06e66e360f78d6128d674eb4ff5f13babb14fc08125da7261463c9dca2\": container with ID starting with 769b0c06e66e360f78d6128d674eb4ff5f13babb14fc08125da7261463c9dca2 not found: ID does not exist" containerID="769b0c06e66e360f78d6128d674eb4ff5f13babb14fc08125da7261463c9dca2" Mar 10 07:01:46 crc kubenswrapper[4825]: I0310 07:01:46.714082 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"769b0c06e66e360f78d6128d674eb4ff5f13babb14fc08125da7261463c9dca2"} err="failed to get container status \"769b0c06e66e360f78d6128d674eb4ff5f13babb14fc08125da7261463c9dca2\": rpc error: code = NotFound 
desc = could not find container \"769b0c06e66e360f78d6128d674eb4ff5f13babb14fc08125da7261463c9dca2\": container with ID starting with 769b0c06e66e360f78d6128d674eb4ff5f13babb14fc08125da7261463c9dca2 not found: ID does not exist" Mar 10 07:01:46 crc kubenswrapper[4825]: I0310 07:01:46.714102 4825 scope.go:117] "RemoveContainer" containerID="b371d5d10558c5d9fa2201ad7ed07749c32fa3f9b2f413516faab89a4ca2d8fb" Mar 10 07:01:46 crc kubenswrapper[4825]: E0310 07:01:46.714515 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b371d5d10558c5d9fa2201ad7ed07749c32fa3f9b2f413516faab89a4ca2d8fb\": container with ID starting with b371d5d10558c5d9fa2201ad7ed07749c32fa3f9b2f413516faab89a4ca2d8fb not found: ID does not exist" containerID="b371d5d10558c5d9fa2201ad7ed07749c32fa3f9b2f413516faab89a4ca2d8fb" Mar 10 07:01:46 crc kubenswrapper[4825]: I0310 07:01:46.714581 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b371d5d10558c5d9fa2201ad7ed07749c32fa3f9b2f413516faab89a4ca2d8fb"} err="failed to get container status \"b371d5d10558c5d9fa2201ad7ed07749c32fa3f9b2f413516faab89a4ca2d8fb\": rpc error: code = NotFound desc = could not find container \"b371d5d10558c5d9fa2201ad7ed07749c32fa3f9b2f413516faab89a4ca2d8fb\": container with ID starting with b371d5d10558c5d9fa2201ad7ed07749c32fa3f9b2f413516faab89a4ca2d8fb not found: ID does not exist" Mar 10 07:01:46 crc kubenswrapper[4825]: I0310 07:01:46.955201 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-58dfk"] Mar 10 07:01:46 crc kubenswrapper[4825]: I0310 07:01:46.976522 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-58dfk"] Mar 10 07:01:47 crc kubenswrapper[4825]: I0310 07:01:47.248616 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a79ceb35-a79e-4988-a6e9-9d9694ca27d2" 
path="/var/lib/kubelet/pods/a79ceb35-a79e-4988-a6e9-9d9694ca27d2/volumes" Mar 10 07:01:47 crc kubenswrapper[4825]: I0310 07:01:47.250364 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e71cebfd-7873-4481-aeb4-9f896e061e4e" path="/var/lib/kubelet/pods/e71cebfd-7873-4481-aeb4-9f896e061e4e/volumes" Mar 10 07:01:47 crc kubenswrapper[4825]: I0310 07:01:47.633209 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-f9ksf" event={"ID":"38fb683b-3108-46e6-9532-3d971e047bde","Type":"ContainerStarted","Data":"50cd6566a8bf04f9433c8fbd731240e33c3d0b6f745ab65d5b355fb716a11465"} Mar 10 07:01:47 crc kubenswrapper[4825]: I0310 07:01:47.663260 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-f9ksf" podStartSLOduration=2.226040232 podStartE2EDuration="2.66322961s" podCreationTimestamp="2026-03-10 07:01:45 +0000 UTC" firstStartedPulling="2026-03-10 07:01:46.084021061 +0000 UTC m=+1059.113801706" lastFinishedPulling="2026-03-10 07:01:46.521210479 +0000 UTC m=+1059.550991084" observedRunningTime="2026-03-10 07:01:47.653889168 +0000 UTC m=+1060.683669823" watchObservedRunningTime="2026-03-10 07:01:47.66322961 +0000 UTC m=+1060.693010265" Mar 10 07:01:55 crc kubenswrapper[4825]: I0310 07:01:55.718370 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-f9ksf" Mar 10 07:01:55 crc kubenswrapper[4825]: I0310 07:01:55.718668 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-f9ksf" Mar 10 07:01:55 crc kubenswrapper[4825]: I0310 07:01:55.768046 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-f9ksf" Mar 10 07:01:56 crc kubenswrapper[4825]: I0310 07:01:56.734092 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/openstack-operator-index-f9ksf" Mar 10 07:02:00 crc kubenswrapper[4825]: I0310 07:02:00.138972 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552102-5gmst"] Mar 10 07:02:00 crc kubenswrapper[4825]: E0310 07:02:00.139728 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a79ceb35-a79e-4988-a6e9-9d9694ca27d2" containerName="extract-content" Mar 10 07:02:00 crc kubenswrapper[4825]: I0310 07:02:00.139750 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a79ceb35-a79e-4988-a6e9-9d9694ca27d2" containerName="extract-content" Mar 10 07:02:00 crc kubenswrapper[4825]: E0310 07:02:00.139773 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a79ceb35-a79e-4988-a6e9-9d9694ca27d2" containerName="extract-utilities" Mar 10 07:02:00 crc kubenswrapper[4825]: I0310 07:02:00.139785 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a79ceb35-a79e-4988-a6e9-9d9694ca27d2" containerName="extract-utilities" Mar 10 07:02:00 crc kubenswrapper[4825]: E0310 07:02:00.139819 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a79ceb35-a79e-4988-a6e9-9d9694ca27d2" containerName="registry-server" Mar 10 07:02:00 crc kubenswrapper[4825]: I0310 07:02:00.139832 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a79ceb35-a79e-4988-a6e9-9d9694ca27d2" containerName="registry-server" Mar 10 07:02:00 crc kubenswrapper[4825]: I0310 07:02:00.140029 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="a79ceb35-a79e-4988-a6e9-9d9694ca27d2" containerName="registry-server" Mar 10 07:02:00 crc kubenswrapper[4825]: I0310 07:02:00.140736 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552102-5gmst" Mar 10 07:02:00 crc kubenswrapper[4825]: I0310 07:02:00.144816 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 07:02:00 crc kubenswrapper[4825]: I0310 07:02:00.144893 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 07:02:00 crc kubenswrapper[4825]: I0310 07:02:00.146835 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 07:02:00 crc kubenswrapper[4825]: I0310 07:02:00.159724 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552102-5gmst"] Mar 10 07:02:00 crc kubenswrapper[4825]: I0310 07:02:00.191754 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9rbj\" (UniqueName: \"kubernetes.io/projected/31aa0f29-4516-4ee9-b3a3-378723e6945e-kube-api-access-z9rbj\") pod \"auto-csr-approver-29552102-5gmst\" (UID: \"31aa0f29-4516-4ee9-b3a3-378723e6945e\") " pod="openshift-infra/auto-csr-approver-29552102-5gmst" Mar 10 07:02:00 crc kubenswrapper[4825]: I0310 07:02:00.292843 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9rbj\" (UniqueName: \"kubernetes.io/projected/31aa0f29-4516-4ee9-b3a3-378723e6945e-kube-api-access-z9rbj\") pod \"auto-csr-approver-29552102-5gmst\" (UID: \"31aa0f29-4516-4ee9-b3a3-378723e6945e\") " pod="openshift-infra/auto-csr-approver-29552102-5gmst" Mar 10 07:02:00 crc kubenswrapper[4825]: I0310 07:02:00.312568 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9rbj\" (UniqueName: \"kubernetes.io/projected/31aa0f29-4516-4ee9-b3a3-378723e6945e-kube-api-access-z9rbj\") pod \"auto-csr-approver-29552102-5gmst\" (UID: \"31aa0f29-4516-4ee9-b3a3-378723e6945e\") " 
pod="openshift-infra/auto-csr-approver-29552102-5gmst" Mar 10 07:02:00 crc kubenswrapper[4825]: I0310 07:02:00.486297 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552102-5gmst" Mar 10 07:02:01 crc kubenswrapper[4825]: I0310 07:02:01.043781 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552102-5gmst"] Mar 10 07:02:01 crc kubenswrapper[4825]: I0310 07:02:01.618456 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfak69dl"] Mar 10 07:02:01 crc kubenswrapper[4825]: I0310 07:02:01.620487 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfak69dl" Mar 10 07:02:01 crc kubenswrapper[4825]: I0310 07:02:01.626449 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-wj5x5" Mar 10 07:02:01 crc kubenswrapper[4825]: I0310 07:02:01.634064 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfak69dl"] Mar 10 07:02:01 crc kubenswrapper[4825]: I0310 07:02:01.721413 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe29f2e8-8710-42c7-b36b-820eb611fd11-bundle\") pod \"ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfak69dl\" (UID: \"fe29f2e8-8710-42c7-b36b-820eb611fd11\") " pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfak69dl" Mar 10 07:02:01 crc kubenswrapper[4825]: I0310 07:02:01.721504 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe29f2e8-8710-42c7-b36b-820eb611fd11-util\") pod 
\"ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfak69dl\" (UID: \"fe29f2e8-8710-42c7-b36b-820eb611fd11\") " pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfak69dl" Mar 10 07:02:01 crc kubenswrapper[4825]: I0310 07:02:01.721822 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff8bh\" (UniqueName: \"kubernetes.io/projected/fe29f2e8-8710-42c7-b36b-820eb611fd11-kube-api-access-ff8bh\") pod \"ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfak69dl\" (UID: \"fe29f2e8-8710-42c7-b36b-820eb611fd11\") " pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfak69dl" Mar 10 07:02:01 crc kubenswrapper[4825]: I0310 07:02:01.752191 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552102-5gmst" event={"ID":"31aa0f29-4516-4ee9-b3a3-378723e6945e","Type":"ContainerStarted","Data":"e4fd904bfe95fa63368f425a7dbb7322d21590021c35c4f279eba5015146aa67"} Mar 10 07:02:01 crc kubenswrapper[4825]: I0310 07:02:01.823057 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe29f2e8-8710-42c7-b36b-820eb611fd11-bundle\") pod \"ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfak69dl\" (UID: \"fe29f2e8-8710-42c7-b36b-820eb611fd11\") " pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfak69dl" Mar 10 07:02:01 crc kubenswrapper[4825]: I0310 07:02:01.823116 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe29f2e8-8710-42c7-b36b-820eb611fd11-util\") pod \"ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfak69dl\" (UID: \"fe29f2e8-8710-42c7-b36b-820eb611fd11\") " pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfak69dl" Mar 10 07:02:01 crc kubenswrapper[4825]: I0310 07:02:01.823230 
4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff8bh\" (UniqueName: \"kubernetes.io/projected/fe29f2e8-8710-42c7-b36b-820eb611fd11-kube-api-access-ff8bh\") pod \"ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfak69dl\" (UID: \"fe29f2e8-8710-42c7-b36b-820eb611fd11\") " pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfak69dl" Mar 10 07:02:01 crc kubenswrapper[4825]: I0310 07:02:01.824054 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe29f2e8-8710-42c7-b36b-820eb611fd11-bundle\") pod \"ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfak69dl\" (UID: \"fe29f2e8-8710-42c7-b36b-820eb611fd11\") " pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfak69dl" Mar 10 07:02:01 crc kubenswrapper[4825]: I0310 07:02:01.824094 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe29f2e8-8710-42c7-b36b-820eb611fd11-util\") pod \"ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfak69dl\" (UID: \"fe29f2e8-8710-42c7-b36b-820eb611fd11\") " pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfak69dl" Mar 10 07:02:01 crc kubenswrapper[4825]: I0310 07:02:01.851130 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff8bh\" (UniqueName: \"kubernetes.io/projected/fe29f2e8-8710-42c7-b36b-820eb611fd11-kube-api-access-ff8bh\") pod \"ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfak69dl\" (UID: \"fe29f2e8-8710-42c7-b36b-820eb611fd11\") " pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfak69dl" Mar 10 07:02:01 crc kubenswrapper[4825]: I0310 07:02:01.940085 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfak69dl" Mar 10 07:02:02 crc kubenswrapper[4825]: I0310 07:02:02.467746 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfak69dl"] Mar 10 07:02:02 crc kubenswrapper[4825]: W0310 07:02:02.480710 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe29f2e8_8710_42c7_b36b_820eb611fd11.slice/crio-0f3ce370a0b9436230470bc52e3536e05912094a1ef644942e283b0e24f4dc72 WatchSource:0}: Error finding container 0f3ce370a0b9436230470bc52e3536e05912094a1ef644942e283b0e24f4dc72: Status 404 returned error can't find the container with id 0f3ce370a0b9436230470bc52e3536e05912094a1ef644942e283b0e24f4dc72 Mar 10 07:02:02 crc kubenswrapper[4825]: I0310 07:02:02.763071 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552102-5gmst" event={"ID":"31aa0f29-4516-4ee9-b3a3-378723e6945e","Type":"ContainerStarted","Data":"dca2ad7e4a1d246c7cb38e7f11895db1d7d4fb25450a32dd1c07efe0f4d74bf7"} Mar 10 07:02:02 crc kubenswrapper[4825]: I0310 07:02:02.767290 4825 generic.go:334] "Generic (PLEG): container finished" podID="fe29f2e8-8710-42c7-b36b-820eb611fd11" containerID="07864f088701bce2111ac807781763d201f2a123b63d880119c03dd068b90028" exitCode=0 Mar 10 07:02:02 crc kubenswrapper[4825]: I0310 07:02:02.767356 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfak69dl" event={"ID":"fe29f2e8-8710-42c7-b36b-820eb611fd11","Type":"ContainerDied","Data":"07864f088701bce2111ac807781763d201f2a123b63d880119c03dd068b90028"} Mar 10 07:02:02 crc kubenswrapper[4825]: I0310 07:02:02.767399 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfak69dl" event={"ID":"fe29f2e8-8710-42c7-b36b-820eb611fd11","Type":"ContainerStarted","Data":"0f3ce370a0b9436230470bc52e3536e05912094a1ef644942e283b0e24f4dc72"} Mar 10 07:02:02 crc kubenswrapper[4825]: I0310 07:02:02.784452 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552102-5gmst" podStartSLOduration=1.592300654 podStartE2EDuration="2.784432422s" podCreationTimestamp="2026-03-10 07:02:00 +0000 UTC" firstStartedPulling="2026-03-10 07:02:01.054540204 +0000 UTC m=+1074.084320819" lastFinishedPulling="2026-03-10 07:02:02.246671952 +0000 UTC m=+1075.276452587" observedRunningTime="2026-03-10 07:02:02.778689232 +0000 UTC m=+1075.808469867" watchObservedRunningTime="2026-03-10 07:02:02.784432422 +0000 UTC m=+1075.814213047" Mar 10 07:02:03 crc kubenswrapper[4825]: I0310 07:02:03.784916 4825 generic.go:334] "Generic (PLEG): container finished" podID="fe29f2e8-8710-42c7-b36b-820eb611fd11" containerID="3646f9ade14f53394f3cfed44ea8e64789df433910b3ca136c1787d5d23fb670" exitCode=0 Mar 10 07:02:03 crc kubenswrapper[4825]: I0310 07:02:03.785002 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfak69dl" event={"ID":"fe29f2e8-8710-42c7-b36b-820eb611fd11","Type":"ContainerDied","Data":"3646f9ade14f53394f3cfed44ea8e64789df433910b3ca136c1787d5d23fb670"} Mar 10 07:02:03 crc kubenswrapper[4825]: I0310 07:02:03.791508 4825 generic.go:334] "Generic (PLEG): container finished" podID="31aa0f29-4516-4ee9-b3a3-378723e6945e" containerID="dca2ad7e4a1d246c7cb38e7f11895db1d7d4fb25450a32dd1c07efe0f4d74bf7" exitCode=0 Mar 10 07:02:03 crc kubenswrapper[4825]: I0310 07:02:03.791633 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552102-5gmst" 
event={"ID":"31aa0f29-4516-4ee9-b3a3-378723e6945e","Type":"ContainerDied","Data":"dca2ad7e4a1d246c7cb38e7f11895db1d7d4fb25450a32dd1c07efe0f4d74bf7"} Mar 10 07:02:04 crc kubenswrapper[4825]: I0310 07:02:04.801515 4825 generic.go:334] "Generic (PLEG): container finished" podID="fe29f2e8-8710-42c7-b36b-820eb611fd11" containerID="b64cc08679cf0cf3d1779b50079c49b1a07cc51ba52c2113279dd8c58d1b417b" exitCode=0 Mar 10 07:02:04 crc kubenswrapper[4825]: I0310 07:02:04.801716 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfak69dl" event={"ID":"fe29f2e8-8710-42c7-b36b-820eb611fd11","Type":"ContainerDied","Data":"b64cc08679cf0cf3d1779b50079c49b1a07cc51ba52c2113279dd8c58d1b417b"} Mar 10 07:02:05 crc kubenswrapper[4825]: I0310 07:02:05.043356 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552102-5gmst" Mar 10 07:02:05 crc kubenswrapper[4825]: I0310 07:02:05.077112 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9rbj\" (UniqueName: \"kubernetes.io/projected/31aa0f29-4516-4ee9-b3a3-378723e6945e-kube-api-access-z9rbj\") pod \"31aa0f29-4516-4ee9-b3a3-378723e6945e\" (UID: \"31aa0f29-4516-4ee9-b3a3-378723e6945e\") " Mar 10 07:02:05 crc kubenswrapper[4825]: I0310 07:02:05.082632 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31aa0f29-4516-4ee9-b3a3-378723e6945e-kube-api-access-z9rbj" (OuterVolumeSpecName: "kube-api-access-z9rbj") pod "31aa0f29-4516-4ee9-b3a3-378723e6945e" (UID: "31aa0f29-4516-4ee9-b3a3-378723e6945e"). InnerVolumeSpecName "kube-api-access-z9rbj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:02:05 crc kubenswrapper[4825]: I0310 07:02:05.179389 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9rbj\" (UniqueName: \"kubernetes.io/projected/31aa0f29-4516-4ee9-b3a3-378723e6945e-kube-api-access-z9rbj\") on node \"crc\" DevicePath \"\"" Mar 10 07:02:05 crc kubenswrapper[4825]: I0310 07:02:05.811734 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552102-5gmst" event={"ID":"31aa0f29-4516-4ee9-b3a3-378723e6945e","Type":"ContainerDied","Data":"e4fd904bfe95fa63368f425a7dbb7322d21590021c35c4f279eba5015146aa67"} Mar 10 07:02:05 crc kubenswrapper[4825]: I0310 07:02:05.811812 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4fd904bfe95fa63368f425a7dbb7322d21590021c35c4f279eba5015146aa67" Mar 10 07:02:05 crc kubenswrapper[4825]: I0310 07:02:05.811929 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552102-5gmst" Mar 10 07:02:05 crc kubenswrapper[4825]: I0310 07:02:05.902162 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552096-v78x6"] Mar 10 07:02:05 crc kubenswrapper[4825]: I0310 07:02:05.911848 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552096-v78x6"] Mar 10 07:02:06 crc kubenswrapper[4825]: I0310 07:02:06.150161 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfak69dl" Mar 10 07:02:06 crc kubenswrapper[4825]: I0310 07:02:06.193552 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe29f2e8-8710-42c7-b36b-820eb611fd11-bundle\") pod \"fe29f2e8-8710-42c7-b36b-820eb611fd11\" (UID: \"fe29f2e8-8710-42c7-b36b-820eb611fd11\") " Mar 10 07:02:06 crc kubenswrapper[4825]: I0310 07:02:06.193660 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ff8bh\" (UniqueName: \"kubernetes.io/projected/fe29f2e8-8710-42c7-b36b-820eb611fd11-kube-api-access-ff8bh\") pod \"fe29f2e8-8710-42c7-b36b-820eb611fd11\" (UID: \"fe29f2e8-8710-42c7-b36b-820eb611fd11\") " Mar 10 07:02:06 crc kubenswrapper[4825]: I0310 07:02:06.193710 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe29f2e8-8710-42c7-b36b-820eb611fd11-util\") pod \"fe29f2e8-8710-42c7-b36b-820eb611fd11\" (UID: \"fe29f2e8-8710-42c7-b36b-820eb611fd11\") " Mar 10 07:02:06 crc kubenswrapper[4825]: I0310 07:02:06.194899 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe29f2e8-8710-42c7-b36b-820eb611fd11-bundle" (OuterVolumeSpecName: "bundle") pod "fe29f2e8-8710-42c7-b36b-820eb611fd11" (UID: "fe29f2e8-8710-42c7-b36b-820eb611fd11"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:02:06 crc kubenswrapper[4825]: I0310 07:02:06.200542 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe29f2e8-8710-42c7-b36b-820eb611fd11-kube-api-access-ff8bh" (OuterVolumeSpecName: "kube-api-access-ff8bh") pod "fe29f2e8-8710-42c7-b36b-820eb611fd11" (UID: "fe29f2e8-8710-42c7-b36b-820eb611fd11"). InnerVolumeSpecName "kube-api-access-ff8bh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:02:06 crc kubenswrapper[4825]: I0310 07:02:06.223989 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe29f2e8-8710-42c7-b36b-820eb611fd11-util" (OuterVolumeSpecName: "util") pod "fe29f2e8-8710-42c7-b36b-820eb611fd11" (UID: "fe29f2e8-8710-42c7-b36b-820eb611fd11"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:02:06 crc kubenswrapper[4825]: I0310 07:02:06.295843 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ff8bh\" (UniqueName: \"kubernetes.io/projected/fe29f2e8-8710-42c7-b36b-820eb611fd11-kube-api-access-ff8bh\") on node \"crc\" DevicePath \"\"" Mar 10 07:02:06 crc kubenswrapper[4825]: I0310 07:02:06.295895 4825 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe29f2e8-8710-42c7-b36b-820eb611fd11-util\") on node \"crc\" DevicePath \"\"" Mar 10 07:02:06 crc kubenswrapper[4825]: I0310 07:02:06.295915 4825 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe29f2e8-8710-42c7-b36b-820eb611fd11-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:02:06 crc kubenswrapper[4825]: I0310 07:02:06.827993 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfak69dl" event={"ID":"fe29f2e8-8710-42c7-b36b-820eb611fd11","Type":"ContainerDied","Data":"0f3ce370a0b9436230470bc52e3536e05912094a1ef644942e283b0e24f4dc72"} Mar 10 07:02:06 crc kubenswrapper[4825]: I0310 07:02:06.828052 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f3ce370a0b9436230470bc52e3536e05912094a1ef644942e283b0e24f4dc72" Mar 10 07:02:06 crc kubenswrapper[4825]: I0310 07:02:06.828079 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfak69dl" Mar 10 07:02:07 crc kubenswrapper[4825]: I0310 07:02:07.255224 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7d62c52-2ae1-46a3-8b1d-7f086d612775" path="/var/lib/kubelet/pods/a7d62c52-2ae1-46a3-8b1d-7f086d612775/volumes" Mar 10 07:02:08 crc kubenswrapper[4825]: I0310 07:02:08.908560 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-568b7cf6db-7vlc2"] Mar 10 07:02:08 crc kubenswrapper[4825]: E0310 07:02:08.908832 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31aa0f29-4516-4ee9-b3a3-378723e6945e" containerName="oc" Mar 10 07:02:08 crc kubenswrapper[4825]: I0310 07:02:08.908848 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="31aa0f29-4516-4ee9-b3a3-378723e6945e" containerName="oc" Mar 10 07:02:08 crc kubenswrapper[4825]: E0310 07:02:08.908862 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe29f2e8-8710-42c7-b36b-820eb611fd11" containerName="extract" Mar 10 07:02:08 crc kubenswrapper[4825]: I0310 07:02:08.908869 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe29f2e8-8710-42c7-b36b-820eb611fd11" containerName="extract" Mar 10 07:02:08 crc kubenswrapper[4825]: E0310 07:02:08.908887 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe29f2e8-8710-42c7-b36b-820eb611fd11" containerName="pull" Mar 10 07:02:08 crc kubenswrapper[4825]: I0310 07:02:08.908893 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe29f2e8-8710-42c7-b36b-820eb611fd11" containerName="pull" Mar 10 07:02:08 crc kubenswrapper[4825]: E0310 07:02:08.908901 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe29f2e8-8710-42c7-b36b-820eb611fd11" containerName="util" Mar 10 07:02:08 crc kubenswrapper[4825]: I0310 07:02:08.908907 4825 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fe29f2e8-8710-42c7-b36b-820eb611fd11" containerName="util" Mar 10 07:02:08 crc kubenswrapper[4825]: I0310 07:02:08.909009 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe29f2e8-8710-42c7-b36b-820eb611fd11" containerName="extract" Mar 10 07:02:08 crc kubenswrapper[4825]: I0310 07:02:08.909025 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="31aa0f29-4516-4ee9-b3a3-378723e6945e" containerName="oc" Mar 10 07:02:08 crc kubenswrapper[4825]: I0310 07:02:08.909465 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-568b7cf6db-7vlc2" Mar 10 07:02:08 crc kubenswrapper[4825]: I0310 07:02:08.915224 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-6s6qh" Mar 10 07:02:08 crc kubenswrapper[4825]: I0310 07:02:08.936325 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvf9t\" (UniqueName: \"kubernetes.io/projected/5e253cf5-24cb-4d27-ab16-7aaa2cafa25b-kube-api-access-rvf9t\") pod \"openstack-operator-controller-init-568b7cf6db-7vlc2\" (UID: \"5e253cf5-24cb-4d27-ab16-7aaa2cafa25b\") " pod="openstack-operators/openstack-operator-controller-init-568b7cf6db-7vlc2" Mar 10 07:02:08 crc kubenswrapper[4825]: I0310 07:02:08.938823 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-568b7cf6db-7vlc2"] Mar 10 07:02:09 crc kubenswrapper[4825]: I0310 07:02:09.037839 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvf9t\" (UniqueName: \"kubernetes.io/projected/5e253cf5-24cb-4d27-ab16-7aaa2cafa25b-kube-api-access-rvf9t\") pod \"openstack-operator-controller-init-568b7cf6db-7vlc2\" (UID: \"5e253cf5-24cb-4d27-ab16-7aaa2cafa25b\") " 
pod="openstack-operators/openstack-operator-controller-init-568b7cf6db-7vlc2" Mar 10 07:02:09 crc kubenswrapper[4825]: I0310 07:02:09.056419 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvf9t\" (UniqueName: \"kubernetes.io/projected/5e253cf5-24cb-4d27-ab16-7aaa2cafa25b-kube-api-access-rvf9t\") pod \"openstack-operator-controller-init-568b7cf6db-7vlc2\" (UID: \"5e253cf5-24cb-4d27-ab16-7aaa2cafa25b\") " pod="openstack-operators/openstack-operator-controller-init-568b7cf6db-7vlc2" Mar 10 07:02:09 crc kubenswrapper[4825]: I0310 07:02:09.224824 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-6s6qh" Mar 10 07:02:09 crc kubenswrapper[4825]: I0310 07:02:09.233425 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-568b7cf6db-7vlc2" Mar 10 07:02:09 crc kubenswrapper[4825]: I0310 07:02:09.521101 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-568b7cf6db-7vlc2"] Mar 10 07:02:09 crc kubenswrapper[4825]: I0310 07:02:09.851008 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-568b7cf6db-7vlc2" event={"ID":"5e253cf5-24cb-4d27-ab16-7aaa2cafa25b","Type":"ContainerStarted","Data":"30dba196cc19c1623b54fd318dd202cf529a2cff8c3d6277e9d987a5554995b8"} Mar 10 07:02:11 crc kubenswrapper[4825]: I0310 07:02:11.887377 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/speaker-tsj92" podUID="f9daa544-4d9f-4106-a729-d330dc8b6cc3" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 07:02:11 crc kubenswrapper[4825]: I0310 07:02:11.887405 4825 prober.go:107] "Probe failed" probeType="Readiness" 
pod="metallb-system/speaker-tsj92" podUID="f9daa544-4d9f-4106-a729-d330dc8b6cc3" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 07:02:14 crc kubenswrapper[4825]: I0310 07:02:14.894039 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-568b7cf6db-7vlc2" event={"ID":"5e253cf5-24cb-4d27-ab16-7aaa2cafa25b","Type":"ContainerStarted","Data":"69569aaeff34403157d4bbbee6d1c1c3a4c4eaea48d0e4cdcc1e66633176c04e"} Mar 10 07:02:14 crc kubenswrapper[4825]: I0310 07:02:14.894719 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-568b7cf6db-7vlc2" Mar 10 07:02:14 crc kubenswrapper[4825]: I0310 07:02:14.940051 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-568b7cf6db-7vlc2" podStartSLOduration=1.793718224 podStartE2EDuration="6.94002783s" podCreationTimestamp="2026-03-10 07:02:08 +0000 UTC" firstStartedPulling="2026-03-10 07:02:09.528006947 +0000 UTC m=+1082.557787562" lastFinishedPulling="2026-03-10 07:02:14.674316563 +0000 UTC m=+1087.704097168" observedRunningTime="2026-03-10 07:02:14.93351723 +0000 UTC m=+1087.963297855" watchObservedRunningTime="2026-03-10 07:02:14.94002783 +0000 UTC m=+1087.969808445" Mar 10 07:02:16 crc kubenswrapper[4825]: I0310 07:02:16.888209 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 07:02:16 crc kubenswrapper[4825]: I0310 07:02:16.888292 4825 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 07:02:19 crc kubenswrapper[4825]: I0310 07:02:19.245717 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-568b7cf6db-7vlc2" Mar 10 07:02:30 crc kubenswrapper[4825]: I0310 07:02:30.039004 4825 scope.go:117] "RemoveContainer" containerID="465f1bf1cf4ff5239246dfecc7c2986fca805552a906139108d9a1b433be3ebe" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.459598 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-nl9ph"] Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.461210 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-nl9ph" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.462952 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-bvpxf" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.468642 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-nl9ph"] Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.476009 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-sd5jw"] Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.476913 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-sd5jw" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.479173 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-5zttb" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.491101 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-sd5jw"] Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.525060 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-qqskl"] Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.526354 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-qqskl" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.532718 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-tftkb" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.554428 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-qqskl"] Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.576257 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkt5j\" (UniqueName: \"kubernetes.io/projected/53bfb88f-ffff-4945-acc2-ae245147edcb-kube-api-access-gkt5j\") pod \"barbican-operator-controller-manager-6db6876945-nl9ph\" (UID: \"53bfb88f-ffff-4945-acc2-ae245147edcb\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-nl9ph" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.576318 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-5mbzp\" (UniqueName: \"kubernetes.io/projected/acc23278-44ed-4cee-bd08-4562e17175db-kube-api-access-5mbzp\") pod \"cinder-operator-controller-manager-55d77d7b5c-sd5jw\" (UID: \"acc23278-44ed-4cee-bd08-4562e17175db\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-sd5jw" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.596935 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-6jm6b"] Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.597966 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-6jm6b" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.601262 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-st58m" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.604202 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-7lfrk"] Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.605375 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-7lfrk" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.608302 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-69xmd" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.637398 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-zbvvz"] Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.638221 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-zbvvz" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.643414 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-tn2g6" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.662369 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-f7fcc58b9-b5zsk"] Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.663431 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-b5zsk" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.665679 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.665891 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-cjz9z" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.677819 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4dt2\" (UniqueName: \"kubernetes.io/projected/7ed2e288-eed6-49bf-8eb8-be9a1e8415a3-kube-api-access-v4dt2\") pod \"designate-operator-controller-manager-5d87c9d997-qqskl\" (UID: \"7ed2e288-eed6-49bf-8eb8-be9a1e8415a3\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-qqskl" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.677959 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkt5j\" (UniqueName: \"kubernetes.io/projected/53bfb88f-ffff-4945-acc2-ae245147edcb-kube-api-access-gkt5j\") pod \"barbican-operator-controller-manager-6db6876945-nl9ph\" (UID: \"53bfb88f-ffff-4945-acc2-ae245147edcb\") " 
pod="openstack-operators/barbican-operator-controller-manager-6db6876945-nl9ph" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.678010 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mbzp\" (UniqueName: \"kubernetes.io/projected/acc23278-44ed-4cee-bd08-4562e17175db-kube-api-access-5mbzp\") pod \"cinder-operator-controller-manager-55d77d7b5c-sd5jw\" (UID: \"acc23278-44ed-4cee-bd08-4562e17175db\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-sd5jw" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.678729 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-zbvvz"] Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.701842 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-6jm6b"] Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.714612 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mbzp\" (UniqueName: \"kubernetes.io/projected/acc23278-44ed-4cee-bd08-4562e17175db-kube-api-access-5mbzp\") pod \"cinder-operator-controller-manager-55d77d7b5c-sd5jw\" (UID: \"acc23278-44ed-4cee-bd08-4562e17175db\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-sd5jw" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.716220 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-bpf8z"] Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.717321 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-bpf8z" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.720098 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-5s4qr" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.731214 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-f7fcc58b9-b5zsk"] Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.731282 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-7lfrk"] Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.735225 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-pnfls"] Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.737158 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-pnfls" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.739078 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-pnfls"] Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.747496 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-bpf8z"] Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.749536 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-hcm55" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.749925 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkt5j\" (UniqueName: \"kubernetes.io/projected/53bfb88f-ffff-4945-acc2-ae245147edcb-kube-api-access-gkt5j\") pod \"barbican-operator-controller-manager-6db6876945-nl9ph\" (UID: \"53bfb88f-ffff-4945-acc2-ae245147edcb\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-nl9ph" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.763647 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-npxgg"] Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.764503 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-npxgg" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.767820 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-gfqtz" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.775391 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-cmg8t"] Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.776320 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-cmg8t" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.784527 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-jmftg" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.786027 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4dt2\" (UniqueName: \"kubernetes.io/projected/7ed2e288-eed6-49bf-8eb8-be9a1e8415a3-kube-api-access-v4dt2\") pod \"designate-operator-controller-manager-5d87c9d997-qqskl\" (UID: \"7ed2e288-eed6-49bf-8eb8-be9a1e8415a3\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-qqskl" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.786085 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfrfd\" (UniqueName: \"kubernetes.io/projected/3ee3f9b2-09ba-4f80-823f-a4fd8639a14a-kube-api-access-gfrfd\") pod \"glance-operator-controller-manager-64db6967f8-6jm6b\" (UID: \"3ee3f9b2-09ba-4f80-823f-a4fd8639a14a\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-6jm6b" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.786115 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/adbbc5e4-169b-4b18-819f-2ca9136edf9b-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-b5zsk\" (UID: \"adbbc5e4-169b-4b18-819f-2ca9136edf9b\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-b5zsk" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.786356 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qmsk\" (UniqueName: \"kubernetes.io/projected/a16a88fb-4aab-48c8-a540-f999a66f712b-kube-api-access-5qmsk\") pod \"heat-operator-controller-manager-cf99c678f-7lfrk\" (UID: \"a16a88fb-4aab-48c8-a540-f999a66f712b\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-7lfrk" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.786388 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t56bn\" (UniqueName: \"kubernetes.io/projected/2c8e75e0-135a-4d58-9230-a0f18ad89b25-kube-api-access-t56bn\") pod \"horizon-operator-controller-manager-78bc7f9bd9-zbvvz\" (UID: \"2c8e75e0-135a-4d58-9230-a0f18ad89b25\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-zbvvz" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.786412 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66ks4\" (UniqueName: \"kubernetes.io/projected/adbbc5e4-169b-4b18-819f-2ca9136edf9b-kube-api-access-66ks4\") pod \"infra-operator-controller-manager-f7fcc58b9-b5zsk\" (UID: \"adbbc5e4-169b-4b18-819f-2ca9136edf9b\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-b5zsk" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.789211 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-qhwdw"] Mar 10 07:02:38 crc 
kubenswrapper[4825]: I0310 07:02:38.790097 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54688575f-qhwdw" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.793505 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-mpg7r" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.795182 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-nl9ph" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.818598 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-npxgg"] Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.824525 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-cmg8t"] Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.824744 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-sd5jw" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.833299 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-dn5wn"] Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.836139 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-dn5wn" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.850018 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-dn5wn"] Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.854168 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-sjkf7" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.857232 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-qhwdw"] Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.858675 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4dt2\" (UniqueName: \"kubernetes.io/projected/7ed2e288-eed6-49bf-8eb8-be9a1e8415a3-kube-api-access-v4dt2\") pod \"designate-operator-controller-manager-5d87c9d997-qqskl\" (UID: \"7ed2e288-eed6-49bf-8eb8-be9a1e8415a3\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-qqskl" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.866611 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-qqskl" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.890283 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-z8726"] Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.891272 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-z8726" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.892316 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h784z\" (UniqueName: \"kubernetes.io/projected/18fccb01-59df-4020-88c5-9c70b9f0edec-kube-api-access-h784z\") pod \"mariadb-operator-controller-manager-7b6bfb6475-npxgg\" (UID: \"18fccb01-59df-4020-88c5-9c70b9f0edec\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-npxgg" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.892353 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qmsk\" (UniqueName: \"kubernetes.io/projected/a16a88fb-4aab-48c8-a540-f999a66f712b-kube-api-access-5qmsk\") pod \"heat-operator-controller-manager-cf99c678f-7lfrk\" (UID: \"a16a88fb-4aab-48c8-a540-f999a66f712b\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-7lfrk" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.892380 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7zvc\" (UniqueName: \"kubernetes.io/projected/d1dffe25-5486-4aba-932a-41725895d7cf-kube-api-access-p7zvc\") pod \"neutron-operator-controller-manager-54688575f-qhwdw\" (UID: \"d1dffe25-5486-4aba-932a-41725895d7cf\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-qhwdw" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.892398 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6spc9\" (UniqueName: \"kubernetes.io/projected/ff530b27-be8c-40d4-9af7-ed6200fcdcac-kube-api-access-6spc9\") pod \"ironic-operator-controller-manager-545456dc4-bpf8z\" (UID: \"ff530b27-be8c-40d4-9af7-ed6200fcdcac\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-bpf8z" Mar 10 
07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.892424 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t56bn\" (UniqueName: \"kubernetes.io/projected/2c8e75e0-135a-4d58-9230-a0f18ad89b25-kube-api-access-t56bn\") pod \"horizon-operator-controller-manager-78bc7f9bd9-zbvvz\" (UID: \"2c8e75e0-135a-4d58-9230-a0f18ad89b25\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-zbvvz" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.892470 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66ks4\" (UniqueName: \"kubernetes.io/projected/adbbc5e4-169b-4b18-819f-2ca9136edf9b-kube-api-access-66ks4\") pod \"infra-operator-controller-manager-f7fcc58b9-b5zsk\" (UID: \"adbbc5e4-169b-4b18-819f-2ca9136edf9b\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-b5zsk" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.892511 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfrfd\" (UniqueName: \"kubernetes.io/projected/3ee3f9b2-09ba-4f80-823f-a4fd8639a14a-kube-api-access-gfrfd\") pod \"glance-operator-controller-manager-64db6967f8-6jm6b\" (UID: \"3ee3f9b2-09ba-4f80-823f-a4fd8639a14a\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-6jm6b" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.892531 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cch66\" (UniqueName: \"kubernetes.io/projected/c947e95f-2771-4a5e-8adc-adf86fbbcc48-kube-api-access-cch66\") pod \"manila-operator-controller-manager-67d996989d-cmg8t\" (UID: \"c947e95f-2771-4a5e-8adc-adf86fbbcc48\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-cmg8t" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.892562 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/adbbc5e4-169b-4b18-819f-2ca9136edf9b-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-b5zsk\" (UID: \"adbbc5e4-169b-4b18-819f-2ca9136edf9b\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-b5zsk" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.892592 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psz7n\" (UniqueName: \"kubernetes.io/projected/c0c7b817-9526-41fb-9a08-f2951ef9db20-kube-api-access-psz7n\") pod \"keystone-operator-controller-manager-7c789f89c6-pnfls\" (UID: \"c0c7b817-9526-41fb-9a08-f2951ef9db20\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-pnfls" Mar 10 07:02:38 crc kubenswrapper[4825]: E0310 07:02:38.893462 4825 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 10 07:02:38 crc kubenswrapper[4825]: E0310 07:02:38.893509 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/adbbc5e4-169b-4b18-819f-2ca9136edf9b-cert podName:adbbc5e4-169b-4b18-819f-2ca9136edf9b nodeName:}" failed. No retries permitted until 2026-03-10 07:02:39.39349426 +0000 UTC m=+1112.423274875 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/adbbc5e4-169b-4b18-819f-2ca9136edf9b-cert") pod "infra-operator-controller-manager-f7fcc58b9-b5zsk" (UID: "adbbc5e4-169b-4b18-819f-2ca9136edf9b") : secret "infra-operator-webhook-server-cert" not found Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.907089 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-z8726"] Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.907416 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-mx4wq" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.944815 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qmsk\" (UniqueName: \"kubernetes.io/projected/a16a88fb-4aab-48c8-a540-f999a66f712b-kube-api-access-5qmsk\") pod \"heat-operator-controller-manager-cf99c678f-7lfrk\" (UID: \"a16a88fb-4aab-48c8-a540-f999a66f712b\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-7lfrk" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.946179 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-4fxt4"] Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.949821 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-4fxt4" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.955075 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66ks4\" (UniqueName: \"kubernetes.io/projected/adbbc5e4-169b-4b18-819f-2ca9136edf9b-kube-api-access-66ks4\") pod \"infra-operator-controller-manager-f7fcc58b9-b5zsk\" (UID: \"adbbc5e4-169b-4b18-819f-2ca9136edf9b\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-b5zsk" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.960869 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfrfd\" (UniqueName: \"kubernetes.io/projected/3ee3f9b2-09ba-4f80-823f-a4fd8639a14a-kube-api-access-gfrfd\") pod \"glance-operator-controller-manager-64db6967f8-6jm6b\" (UID: \"3ee3f9b2-09ba-4f80-823f-a4fd8639a14a\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-6jm6b" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.962282 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t56bn\" (UniqueName: \"kubernetes.io/projected/2c8e75e0-135a-4d58-9230-a0f18ad89b25-kube-api-access-t56bn\") pod \"horizon-operator-controller-manager-78bc7f9bd9-zbvvz\" (UID: \"2c8e75e0-135a-4d58-9230-a0f18ad89b25\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-zbvvz" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.962630 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-nn25z" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.974221 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-4fxt4"] Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.974557 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-zbvvz" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.993830 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7zvc\" (UniqueName: \"kubernetes.io/projected/d1dffe25-5486-4aba-932a-41725895d7cf-kube-api-access-p7zvc\") pod \"neutron-operator-controller-manager-54688575f-qhwdw\" (UID: \"d1dffe25-5486-4aba-932a-41725895d7cf\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-qhwdw" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.993874 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6spc9\" (UniqueName: \"kubernetes.io/projected/ff530b27-be8c-40d4-9af7-ed6200fcdcac-kube-api-access-6spc9\") pod \"ironic-operator-controller-manager-545456dc4-bpf8z\" (UID: \"ff530b27-be8c-40d4-9af7-ed6200fcdcac\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-bpf8z" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.993918 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g6vk\" (UniqueName: \"kubernetes.io/projected/6667b99e-cca6-4450-b44d-03edafac3e7a-kube-api-access-6g6vk\") pod \"nova-operator-controller-manager-74b6b5dc96-dn5wn\" (UID: \"6667b99e-cca6-4450-b44d-03edafac3e7a\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-dn5wn" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.993950 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwdlz\" (UniqueName: \"kubernetes.io/projected/406e912a-4ca3-4760-ba95-e63b4d342849-kube-api-access-rwdlz\") pod \"octavia-operator-controller-manager-5d86c7ddb7-z8726\" (UID: \"406e912a-4ca3-4760-ba95-e63b4d342849\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-z8726" Mar 10 07:02:38 crc 
kubenswrapper[4825]: I0310 07:02:38.993983 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cch66\" (UniqueName: \"kubernetes.io/projected/c947e95f-2771-4a5e-8adc-adf86fbbcc48-kube-api-access-cch66\") pod \"manila-operator-controller-manager-67d996989d-cmg8t\" (UID: \"c947e95f-2771-4a5e-8adc-adf86fbbcc48\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-cmg8t" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.994036 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psz7n\" (UniqueName: \"kubernetes.io/projected/c0c7b817-9526-41fb-9a08-f2951ef9db20-kube-api-access-psz7n\") pod \"keystone-operator-controller-manager-7c789f89c6-pnfls\" (UID: \"c0c7b817-9526-41fb-9a08-f2951ef9db20\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-pnfls" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.994058 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h784z\" (UniqueName: \"kubernetes.io/projected/18fccb01-59df-4020-88c5-9c70b9f0edec-kube-api-access-h784z\") pod \"mariadb-operator-controller-manager-7b6bfb6475-npxgg\" (UID: \"18fccb01-59df-4020-88c5-9c70b9f0edec\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-npxgg" Mar 10 07:02:38 crc kubenswrapper[4825]: I0310 07:02:38.997860 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-wx259"] Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.001415 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-wx259" Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.019430 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-ffcgn" Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.037943 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h784z\" (UniqueName: \"kubernetes.io/projected/18fccb01-59df-4020-88c5-9c70b9f0edec-kube-api-access-h784z\") pod \"mariadb-operator-controller-manager-7b6bfb6475-npxgg\" (UID: \"18fccb01-59df-4020-88c5-9c70b9f0edec\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-npxgg" Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.040710 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7zvc\" (UniqueName: \"kubernetes.io/projected/d1dffe25-5486-4aba-932a-41725895d7cf-kube-api-access-p7zvc\") pod \"neutron-operator-controller-manager-54688575f-qhwdw\" (UID: \"d1dffe25-5486-4aba-932a-41725895d7cf\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-qhwdw" Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.060653 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psz7n\" (UniqueName: \"kubernetes.io/projected/c0c7b817-9526-41fb-9a08-f2951ef9db20-kube-api-access-psz7n\") pod \"keystone-operator-controller-manager-7c789f89c6-pnfls\" (UID: \"c0c7b817-9526-41fb-9a08-f2951ef9db20\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-pnfls" Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.063852 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cch66\" (UniqueName: \"kubernetes.io/projected/c947e95f-2771-4a5e-8adc-adf86fbbcc48-kube-api-access-cch66\") pod \"manila-operator-controller-manager-67d996989d-cmg8t\" 
(UID: \"c947e95f-2771-4a5e-8adc-adf86fbbcc48\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-cmg8t" Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.063881 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6spc9\" (UniqueName: \"kubernetes.io/projected/ff530b27-be8c-40d4-9af7-ed6200fcdcac-kube-api-access-6spc9\") pod \"ironic-operator-controller-manager-545456dc4-bpf8z\" (UID: \"ff530b27-be8c-40d4-9af7-ed6200fcdcac\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-bpf8z" Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.070405 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-wx259"] Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.091057 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-bpf8z" Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.099293 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6g6vk\" (UniqueName: \"kubernetes.io/projected/6667b99e-cca6-4450-b44d-03edafac3e7a-kube-api-access-6g6vk\") pod \"nova-operator-controller-manager-74b6b5dc96-dn5wn\" (UID: \"6667b99e-cca6-4450-b44d-03edafac3e7a\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-dn5wn" Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.099351 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwdlz\" (UniqueName: \"kubernetes.io/projected/406e912a-4ca3-4760-ba95-e63b4d342849-kube-api-access-rwdlz\") pod \"octavia-operator-controller-manager-5d86c7ddb7-z8726\" (UID: \"406e912a-4ca3-4760-ba95-e63b4d342849\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-z8726" Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.099441 4825 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz2q5\" (UniqueName: \"kubernetes.io/projected/7004ec9c-96b9-41e9-85f2-f57a2ca785ef-kube-api-access-sz2q5\") pod \"ovn-operator-controller-manager-75684d597f-4fxt4\" (UID: \"7004ec9c-96b9-41e9-85f2-f57a2ca785ef\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-4fxt4" Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.121297 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-pnfls" Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.133878 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c75rmqrc"] Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.135617 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g6vk\" (UniqueName: \"kubernetes.io/projected/6667b99e-cca6-4450-b44d-03edafac3e7a-kube-api-access-6g6vk\") pod \"nova-operator-controller-manager-74b6b5dc96-dn5wn\" (UID: \"6667b99e-cca6-4450-b44d-03edafac3e7a\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-dn5wn" Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.147619 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c75rmqrc" Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.154236 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwdlz\" (UniqueName: \"kubernetes.io/projected/406e912a-4ca3-4760-ba95-e63b4d342849-kube-api-access-rwdlz\") pod \"octavia-operator-controller-manager-5d86c7ddb7-z8726\" (UID: \"406e912a-4ca3-4760-ba95-e63b4d342849\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-z8726" Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.184830 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.204384 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bdxtj"] Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.206685 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sz2q5\" (UniqueName: \"kubernetes.io/projected/7004ec9c-96b9-41e9-85f2-f57a2ca785ef-kube-api-access-sz2q5\") pod \"ovn-operator-controller-manager-75684d597f-4fxt4\" (UID: \"7004ec9c-96b9-41e9-85f2-f57a2ca785ef\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-4fxt4" Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.207423 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxt7g\" (UniqueName: \"kubernetes.io/projected/4753d4b8-4f42-480b-bed5-54aa4dd2e77b-kube-api-access-mxt7g\") pod \"placement-operator-controller-manager-648564c9fc-wx259\" (UID: \"4753d4b8-4f42-480b-bed5-54aa4dd2e77b\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-wx259" Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.207981 4825 reflector.go:368] Caches 
populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-kdx76"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.226709 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bdxtj"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.231226 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-hwvlj"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.231974 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-npxgg"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.232496 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-6jm6b"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.241849 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-7lfrk"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.244223 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz2q5\" (UniqueName: \"kubernetes.io/projected/7004ec9c-96b9-41e9-85f2-f57a2ca785ef-kube-api-access-sz2q5\") pod \"ovn-operator-controller-manager-75684d597f-4fxt4\" (UID: \"7004ec9c-96b9-41e9-85f2-f57a2ca785ef\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-4fxt4"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.253808 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-cmg8t"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.270964 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54688575f-qhwdw"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.273088 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bdxtj"]
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.273110 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c75rmqrc"]
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.273121 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-6bw6n"]
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.276442 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-6bw6n"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.277981 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-6bw6n"]
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.283388 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-tjrvc"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.294283 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-dn5wn"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.297519 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-s56gc"]
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.298561 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-s56gc"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.304604 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-l7fqn"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.305200 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-z8726"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.308546 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p568\" (UniqueName: \"kubernetes.io/projected/0db55365-104a-42c2-ba9b-1c084fdd08cf-kube-api-access-4p568\") pod \"openstack-baremetal-operator-controller-manager-6f64bd8c75rmqrc\" (UID: \"0db55365-104a-42c2-ba9b-1c084fdd08cf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c75rmqrc"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.308581 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxt7g\" (UniqueName: \"kubernetes.io/projected/4753d4b8-4f42-480b-bed5-54aa4dd2e77b-kube-api-access-mxt7g\") pod \"placement-operator-controller-manager-648564c9fc-wx259\" (UID: \"4753d4b8-4f42-480b-bed5-54aa4dd2e77b\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-wx259"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.308617 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0db55365-104a-42c2-ba9b-1c084fdd08cf-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64bd8c75rmqrc\" (UID: \"0db55365-104a-42c2-ba9b-1c084fdd08cf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c75rmqrc"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.308757 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vd6k\" (UniqueName: \"kubernetes.io/projected/e5de5a57-9f17-4101-952a-cc33147fc220-kube-api-access-4vd6k\") pod \"swift-operator-controller-manager-9b9ff9f4d-bdxtj\" (UID: \"e5de5a57-9f17-4101-952a-cc33147fc220\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bdxtj"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.334253 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-s56gc"]
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.340125 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxt7g\" (UniqueName: \"kubernetes.io/projected/4753d4b8-4f42-480b-bed5-54aa4dd2e77b-kube-api-access-mxt7g\") pod \"placement-operator-controller-manager-648564c9fc-wx259\" (UID: \"4753d4b8-4f42-480b-bed5-54aa4dd2e77b\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-wx259"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.349252 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-t88nq"]
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.350154 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-t88nq"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.351330 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-4fxt4"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.364765 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-t88nq"]
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.371655 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-bbpng"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.392120 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-wx259"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.409792 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/adbbc5e4-169b-4b18-819f-2ca9136edf9b-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-b5zsk\" (UID: \"adbbc5e4-169b-4b18-819f-2ca9136edf9b\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-b5zsk"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.409837 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p568\" (UniqueName: \"kubernetes.io/projected/0db55365-104a-42c2-ba9b-1c084fdd08cf-kube-api-access-4p568\") pod \"openstack-baremetal-operator-controller-manager-6f64bd8c75rmqrc\" (UID: \"0db55365-104a-42c2-ba9b-1c084fdd08cf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c75rmqrc"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.409878 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlqvt\" (UniqueName: \"kubernetes.io/projected/88d2e717-1281-45d1-a787-74fafe29c95b-kube-api-access-dlqvt\") pod \"telemetry-operator-controller-manager-5fdb694969-6bw6n\" (UID: \"88d2e717-1281-45d1-a787-74fafe29c95b\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-6bw6n"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.409907 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0db55365-104a-42c2-ba9b-1c084fdd08cf-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64bd8c75rmqrc\" (UID: \"0db55365-104a-42c2-ba9b-1c084fdd08cf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c75rmqrc"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.409926 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz4z2\" (UniqueName: \"kubernetes.io/projected/6a3b19ba-bd45-4cae-a466-a4f380a10cd0-kube-api-access-wz4z2\") pod \"test-operator-controller-manager-55b5ff4dbb-s56gc\" (UID: \"6a3b19ba-bd45-4cae-a466-a4f380a10cd0\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-s56gc"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.409949 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vd6k\" (UniqueName: \"kubernetes.io/projected/e5de5a57-9f17-4101-952a-cc33147fc220-kube-api-access-4vd6k\") pod \"swift-operator-controller-manager-9b9ff9f4d-bdxtj\" (UID: \"e5de5a57-9f17-4101-952a-cc33147fc220\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bdxtj"
Mar 10 07:02:39 crc kubenswrapper[4825]: E0310 07:02:39.409966 4825 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 10 07:02:39 crc kubenswrapper[4825]: E0310 07:02:39.410036 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/adbbc5e4-169b-4b18-819f-2ca9136edf9b-cert podName:adbbc5e4-169b-4b18-819f-2ca9136edf9b nodeName:}" failed. No retries permitted until 2026-03-10 07:02:40.410014105 +0000 UTC m=+1113.439794720 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/adbbc5e4-169b-4b18-819f-2ca9136edf9b-cert") pod "infra-operator-controller-manager-f7fcc58b9-b5zsk" (UID: "adbbc5e4-169b-4b18-819f-2ca9136edf9b") : secret "infra-operator-webhook-server-cert" not found
Mar 10 07:02:39 crc kubenswrapper[4825]: E0310 07:02:39.410397 4825 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 10 07:02:39 crc kubenswrapper[4825]: E0310 07:02:39.410424 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0db55365-104a-42c2-ba9b-1c084fdd08cf-cert podName:0db55365-104a-42c2-ba9b-1c084fdd08cf nodeName:}" failed. No retries permitted until 2026-03-10 07:02:39.910415615 +0000 UTC m=+1112.940196230 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0db55365-104a-42c2-ba9b-1c084fdd08cf-cert") pod "openstack-baremetal-operator-controller-manager-6f64bd8c75rmqrc" (UID: "0db55365-104a-42c2-ba9b-1c084fdd08cf") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.438492 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vd6k\" (UniqueName: \"kubernetes.io/projected/e5de5a57-9f17-4101-952a-cc33147fc220-kube-api-access-4vd6k\") pod \"swift-operator-controller-manager-9b9ff9f4d-bdxtj\" (UID: \"e5de5a57-9f17-4101-952a-cc33147fc220\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bdxtj"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.441074 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p568\" (UniqueName: \"kubernetes.io/projected/0db55365-104a-42c2-ba9b-1c084fdd08cf-kube-api-access-4p568\") pod \"openstack-baremetal-operator-controller-manager-6f64bd8c75rmqrc\" (UID: \"0db55365-104a-42c2-ba9b-1c084fdd08cf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c75rmqrc"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.471970 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-59b6c9788f-n2gmk"]
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.473155 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-n2gmk"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.479268 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-k4s75"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.479575 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.481056 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.514872 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rtkc\" (UniqueName: \"kubernetes.io/projected/ab1a7215-c090-4f4e-a994-603bbbafb22e-kube-api-access-2rtkc\") pod \"watcher-operator-controller-manager-bccc79885-t88nq\" (UID: \"ab1a7215-c090-4f4e-a994-603bbbafb22e\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-t88nq"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.515024 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlqvt\" (UniqueName: \"kubernetes.io/projected/88d2e717-1281-45d1-a787-74fafe29c95b-kube-api-access-dlqvt\") pod \"telemetry-operator-controller-manager-5fdb694969-6bw6n\" (UID: \"88d2e717-1281-45d1-a787-74fafe29c95b\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-6bw6n"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.515079 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz4z2\" (UniqueName: \"kubernetes.io/projected/6a3b19ba-bd45-4cae-a466-a4f380a10cd0-kube-api-access-wz4z2\") pod \"test-operator-controller-manager-55b5ff4dbb-s56gc\" (UID: \"6a3b19ba-bd45-4cae-a466-a4f380a10cd0\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-s56gc"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.518874 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-59b6c9788f-n2gmk"]
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.539017 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz4z2\" (UniqueName: \"kubernetes.io/projected/6a3b19ba-bd45-4cae-a466-a4f380a10cd0-kube-api-access-wz4z2\") pod \"test-operator-controller-manager-55b5ff4dbb-s56gc\" (UID: \"6a3b19ba-bd45-4cae-a466-a4f380a10cd0\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-s56gc"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.539091 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlqvt\" (UniqueName: \"kubernetes.io/projected/88d2e717-1281-45d1-a787-74fafe29c95b-kube-api-access-dlqvt\") pod \"telemetry-operator-controller-manager-5fdb694969-6bw6n\" (UID: \"88d2e717-1281-45d1-a787-74fafe29c95b\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-6bw6n"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.558408 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k2l9r"]
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.565873 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k2l9r"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.568643 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-krxcr"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.575629 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bdxtj"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.578341 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k2l9r"]
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.596572 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-6bw6n"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.619505 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rtkc\" (UniqueName: \"kubernetes.io/projected/ab1a7215-c090-4f4e-a994-603bbbafb22e-kube-api-access-2rtkc\") pod \"watcher-operator-controller-manager-bccc79885-t88nq\" (UID: \"ab1a7215-c090-4f4e-a994-603bbbafb22e\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-t88nq"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.619603 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgk9r\" (UniqueName: \"kubernetes.io/projected/5e92a016-c5ae-4ae4-a853-f69e701639fd-kube-api-access-wgk9r\") pod \"openstack-operator-controller-manager-59b6c9788f-n2gmk\" (UID: \"5e92a016-c5ae-4ae4-a853-f69e701639fd\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-n2gmk"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.619668 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5e92a016-c5ae-4ae4-a853-f69e701639fd-webhook-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-n2gmk\" (UID: \"5e92a016-c5ae-4ae4-a853-f69e701639fd\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-n2gmk"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.619697 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e92a016-c5ae-4ae4-a853-f69e701639fd-metrics-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-n2gmk\" (UID: \"5e92a016-c5ae-4ae4-a853-f69e701639fd\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-n2gmk"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.630914 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-s56gc"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.636875 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rtkc\" (UniqueName: \"kubernetes.io/projected/ab1a7215-c090-4f4e-a994-603bbbafb22e-kube-api-access-2rtkc\") pod \"watcher-operator-controller-manager-bccc79885-t88nq\" (UID: \"ab1a7215-c090-4f4e-a994-603bbbafb22e\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-t88nq"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.689972 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-sd5jw"]
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.701083 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-t88nq"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.720460 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5e92a016-c5ae-4ae4-a853-f69e701639fd-webhook-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-n2gmk\" (UID: \"5e92a016-c5ae-4ae4-a853-f69e701639fd\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-n2gmk"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.720509 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e92a016-c5ae-4ae4-a853-f69e701639fd-metrics-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-n2gmk\" (UID: \"5e92a016-c5ae-4ae4-a853-f69e701639fd\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-n2gmk"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.720594 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-988q5\" (UniqueName: \"kubernetes.io/projected/41dde8f5-d43b-4cdf-beb8-56e67290e024-kube-api-access-988q5\") pod \"rabbitmq-cluster-operator-manager-668c99d594-k2l9r\" (UID: \"41dde8f5-d43b-4cdf-beb8-56e67290e024\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k2l9r"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.720617 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgk9r\" (UniqueName: \"kubernetes.io/projected/5e92a016-c5ae-4ae4-a853-f69e701639fd-kube-api-access-wgk9r\") pod \"openstack-operator-controller-manager-59b6c9788f-n2gmk\" (UID: \"5e92a016-c5ae-4ae4-a853-f69e701639fd\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-n2gmk"
Mar 10 07:02:39 crc kubenswrapper[4825]: E0310 07:02:39.721020 4825 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 10 07:02:39 crc kubenswrapper[4825]: E0310 07:02:39.721069 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e92a016-c5ae-4ae4-a853-f69e701639fd-webhook-certs podName:5e92a016-c5ae-4ae4-a853-f69e701639fd nodeName:}" failed. No retries permitted until 2026-03-10 07:02:40.221053137 +0000 UTC m=+1113.250833752 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5e92a016-c5ae-4ae4-a853-f69e701639fd-webhook-certs") pod "openstack-operator-controller-manager-59b6c9788f-n2gmk" (UID: "5e92a016-c5ae-4ae4-a853-f69e701639fd") : secret "webhook-server-cert" not found
Mar 10 07:02:39 crc kubenswrapper[4825]: E0310 07:02:39.721234 4825 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 10 07:02:39 crc kubenswrapper[4825]: E0310 07:02:39.721261 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e92a016-c5ae-4ae4-a853-f69e701639fd-metrics-certs podName:5e92a016-c5ae-4ae4-a853-f69e701639fd nodeName:}" failed. No retries permitted until 2026-03-10 07:02:40.221254892 +0000 UTC m=+1113.251035507 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5e92a016-c5ae-4ae4-a853-f69e701639fd-metrics-certs") pod "openstack-operator-controller-manager-59b6c9788f-n2gmk" (UID: "5e92a016-c5ae-4ae4-a853-f69e701639fd") : secret "metrics-server-cert" not found
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.770657 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgk9r\" (UniqueName: \"kubernetes.io/projected/5e92a016-c5ae-4ae4-a853-f69e701639fd-kube-api-access-wgk9r\") pod \"openstack-operator-controller-manager-59b6c9788f-n2gmk\" (UID: \"5e92a016-c5ae-4ae4-a853-f69e701639fd\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-n2gmk"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.815314 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-qqskl"]
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.821526 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-988q5\" (UniqueName: \"kubernetes.io/projected/41dde8f5-d43b-4cdf-beb8-56e67290e024-kube-api-access-988q5\") pod \"rabbitmq-cluster-operator-manager-668c99d594-k2l9r\" (UID: \"41dde8f5-d43b-4cdf-beb8-56e67290e024\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k2l9r"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.862800 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-988q5\" (UniqueName: \"kubernetes.io/projected/41dde8f5-d43b-4cdf-beb8-56e67290e024-kube-api-access-988q5\") pod \"rabbitmq-cluster-operator-manager-668c99d594-k2l9r\" (UID: \"41dde8f5-d43b-4cdf-beb8-56e67290e024\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k2l9r"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.909469 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k2l9r"
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.925420 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0db55365-104a-42c2-ba9b-1c084fdd08cf-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64bd8c75rmqrc\" (UID: \"0db55365-104a-42c2-ba9b-1c084fdd08cf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c75rmqrc"
Mar 10 07:02:39 crc kubenswrapper[4825]: E0310 07:02:39.925651 4825 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 10 07:02:39 crc kubenswrapper[4825]: E0310 07:02:39.925719 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0db55365-104a-42c2-ba9b-1c084fdd08cf-cert podName:0db55365-104a-42c2-ba9b-1c084fdd08cf nodeName:}" failed. No retries permitted until 2026-03-10 07:02:40.925703148 +0000 UTC m=+1113.955483763 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0db55365-104a-42c2-ba9b-1c084fdd08cf-cert") pod "openstack-baremetal-operator-controller-manager-6f64bd8c75rmqrc" (UID: "0db55365-104a-42c2-ba9b-1c084fdd08cf") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 10 07:02:39 crc kubenswrapper[4825]: I0310 07:02:39.986723 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-zbvvz"]
Mar 10 07:02:40 crc kubenswrapper[4825]: I0310 07:02:40.048093 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-bpf8z"]
Mar 10 07:02:40 crc kubenswrapper[4825]: I0310 07:02:40.103575 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-bpf8z" event={"ID":"ff530b27-be8c-40d4-9af7-ed6200fcdcac","Type":"ContainerStarted","Data":"5dd9da56d1c87e0593ce7bf788a88ee538521eda39a77ed5b54f5f0c248c1de6"}
Mar 10 07:02:40 crc kubenswrapper[4825]: I0310 07:02:40.105300 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-zbvvz" event={"ID":"2c8e75e0-135a-4d58-9230-a0f18ad89b25","Type":"ContainerStarted","Data":"d8b05041de31b6560cc1975df556370f87bb7c436318162936170d0303669451"}
Mar 10 07:02:40 crc kubenswrapper[4825]: I0310 07:02:40.106362 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-sd5jw" event={"ID":"acc23278-44ed-4cee-bd08-4562e17175db","Type":"ContainerStarted","Data":"9b3ac764991f4fa6ed0659df7e8d2ac70aef6ee43d5a937549a1bd32f5ec4097"}
Mar 10 07:02:40 crc kubenswrapper[4825]: I0310 07:02:40.107594 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-qqskl" event={"ID":"7ed2e288-eed6-49bf-8eb8-be9a1e8415a3","Type":"ContainerStarted","Data":"52a26f7bc12ab89a7a0b7aece1802e9fd79a11f0af51f3fe44dcc23d836e9766"}
Mar 10 07:02:40 crc kubenswrapper[4825]: I0310 07:02:40.151621 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-nl9ph"]
Mar 10 07:02:40 crc kubenswrapper[4825]: I0310 07:02:40.253820 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-npxgg"]
Mar 10 07:02:40 crc kubenswrapper[4825]: I0310 07:02:40.259038 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-6jm6b"]
Mar 10 07:02:40 crc kubenswrapper[4825]: I0310 07:02:40.263308 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5e92a016-c5ae-4ae4-a853-f69e701639fd-webhook-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-n2gmk\" (UID: \"5e92a016-c5ae-4ae4-a853-f69e701639fd\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-n2gmk"
Mar 10 07:02:40 crc kubenswrapper[4825]: I0310 07:02:40.263348 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e92a016-c5ae-4ae4-a853-f69e701639fd-metrics-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-n2gmk\" (UID: \"5e92a016-c5ae-4ae4-a853-f69e701639fd\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-n2gmk"
Mar 10 07:02:40 crc kubenswrapper[4825]: E0310 07:02:40.263534 4825 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 10 07:02:40 crc kubenswrapper[4825]: E0310 07:02:40.263574 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e92a016-c5ae-4ae4-a853-f69e701639fd-metrics-certs podName:5e92a016-c5ae-4ae4-a853-f69e701639fd nodeName:}" failed. No retries permitted until 2026-03-10 07:02:41.263559519 +0000 UTC m=+1114.293340134 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5e92a016-c5ae-4ae4-a853-f69e701639fd-metrics-certs") pod "openstack-operator-controller-manager-59b6c9788f-n2gmk" (UID: "5e92a016-c5ae-4ae4-a853-f69e701639fd") : secret "metrics-server-cert" not found
Mar 10 07:02:40 crc kubenswrapper[4825]: E0310 07:02:40.263771 4825 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 10 07:02:40 crc kubenswrapper[4825]: E0310 07:02:40.263838 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e92a016-c5ae-4ae4-a853-f69e701639fd-webhook-certs podName:5e92a016-c5ae-4ae4-a853-f69e701639fd nodeName:}" failed. No retries permitted until 2026-03-10 07:02:41.263819406 +0000 UTC m=+1114.293600021 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5e92a016-c5ae-4ae4-a853-f69e701639fd-webhook-certs") pod "openstack-operator-controller-manager-59b6c9788f-n2gmk" (UID: "5e92a016-c5ae-4ae4-a853-f69e701639fd") : secret "webhook-server-cert" not found
Mar 10 07:02:40 crc kubenswrapper[4825]: W0310 07:02:40.264559 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ee3f9b2_09ba_4f80_823f_a4fd8639a14a.slice/crio-da814202d6da2c05cdf70dda80adeac89ea5b52380286a16fa067ae0694188b6 WatchSource:0}: Error finding container da814202d6da2c05cdf70dda80adeac89ea5b52380286a16fa067ae0694188b6: Status 404 returned error can't find the container with id da814202d6da2c05cdf70dda80adeac89ea5b52380286a16fa067ae0694188b6
Mar 10 07:02:40 crc kubenswrapper[4825]: W0310 07:02:40.441270 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1dffe25_5486_4aba_932a_41725895d7cf.slice/crio-cfe28cb588b3aace32b3c1d5930b6d762fcff5e6946788bf8f762b990082159e WatchSource:0}: Error finding container cfe28cb588b3aace32b3c1d5930b6d762fcff5e6946788bf8f762b990082159e: Status 404 returned error can't find the container with id cfe28cb588b3aace32b3c1d5930b6d762fcff5e6946788bf8f762b990082159e
Mar 10 07:02:40 crc kubenswrapper[4825]: W0310 07:02:40.445311 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0c7b817_9526_41fb_9a08_f2951ef9db20.slice/crio-fe2d2090632199fc76f3881de5ae30cb642176158b31d4b191aba96633b63b62 WatchSource:0}: Error finding container fe2d2090632199fc76f3881de5ae30cb642176158b31d4b191aba96633b63b62: Status 404 returned error can't find the container with id fe2d2090632199fc76f3881de5ae30cb642176158b31d4b191aba96633b63b62
Mar 10 07:02:40 crc kubenswrapper[4825]: I0310 07:02:40.447795 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-qhwdw"]
Mar 10 07:02:40 crc kubenswrapper[4825]: I0310 07:02:40.459499 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-pnfls"]
Mar 10 07:02:40 crc kubenswrapper[4825]: I0310 07:02:40.469270 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/adbbc5e4-169b-4b18-819f-2ca9136edf9b-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-b5zsk\" (UID: \"adbbc5e4-169b-4b18-819f-2ca9136edf9b\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-b5zsk"
Mar 10 07:02:40 crc kubenswrapper[4825]: E0310 07:02:40.469430 4825 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 10 07:02:40 crc kubenswrapper[4825]: E0310 07:02:40.469499 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/adbbc5e4-169b-4b18-819f-2ca9136edf9b-cert podName:adbbc5e4-169b-4b18-819f-2ca9136edf9b nodeName:}" failed. No retries permitted until 2026-03-10 07:02:42.469481383 +0000 UTC m=+1115.499261998 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/adbbc5e4-169b-4b18-819f-2ca9136edf9b-cert") pod "infra-operator-controller-manager-f7fcc58b9-b5zsk" (UID: "adbbc5e4-169b-4b18-819f-2ca9136edf9b") : secret "infra-operator-webhook-server-cert" not found
Mar 10 07:02:40 crc kubenswrapper[4825]: W0310 07:02:40.480847 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod406e912a_4ca3_4760_ba95_e63b4d342849.slice/crio-64d580e0c9014a982ac4080358e5d6c0876f54c855a69488dbcff065b936ff9c WatchSource:0}: Error finding container 64d580e0c9014a982ac4080358e5d6c0876f54c855a69488dbcff065b936ff9c: Status 404 returned error can't find the container with id 64d580e0c9014a982ac4080358e5d6c0876f54c855a69488dbcff065b936ff9c
Mar 10 07:02:40 crc kubenswrapper[4825]: I0310 07:02:40.485092 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-z8726"]
Mar 10 07:02:40 crc kubenswrapper[4825]: I0310 07:02:40.493988 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-7lfrk"]
Mar 10 07:02:40 crc kubenswrapper[4825]: W0310 07:02:40.494881 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda16a88fb_4aab_48c8_a540_f999a66f712b.slice/crio-04e6a9e8ab79857d21ecb94afbc498e27343583ba30fdbbf14eef1cd0939cf6d WatchSource:0}: Error finding container 04e6a9e8ab79857d21ecb94afbc498e27343583ba30fdbbf14eef1cd0939cf6d: Status 404 returned error can't find the container with id 04e6a9e8ab79857d21ecb94afbc498e27343583ba30fdbbf14eef1cd0939cf6d
Mar 10 07:02:40 crc kubenswrapper[4825]: I0310 07:02:40.503438 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-wx259"]
Mar 10 07:02:40 crc kubenswrapper[4825]: W0310 07:02:40.504722 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc947e95f_2771_4a5e_8adc_adf86fbbcc48.slice/crio-f7825645be47077383290d3a7fdcc512d5df277d4228546c5ac04ed6f47e1e12 WatchSource:0}: Error finding container f7825645be47077383290d3a7fdcc512d5df277d4228546c5ac04ed6f47e1e12: Status 404 returned error can't find the container with id f7825645be47077383290d3a7fdcc512d5df277d4228546c5ac04ed6f47e1e12
Mar 10 07:02:40 crc kubenswrapper[4825]: E0310 07:02:40.504983 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:ee642fcf655f9897d480460008cba2e98b497d3ffdf7ab1d48ea460eb20c2053,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5qmsk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-cf99c678f-7lfrk_openstack-operators(a16a88fb-4aab-48c8-a540-f999a66f712b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 07:02:40 crc kubenswrapper[4825]: E0310 07:02:40.506168 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-7lfrk" podUID="a16a88fb-4aab-48c8-a540-f999a66f712b" Mar 10 07:02:40 crc kubenswrapper[4825]: E0310 07:02:40.506546 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:f1158ec4d879c4646eee4323bc501eba4d377beb2ad6fbe08ed30070c441ac26,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cch66,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-67d996989d-cmg8t_openstack-operators(c947e95f-2771-4a5e-8adc-adf86fbbcc48): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 07:02:40 crc kubenswrapper[4825]: E0310 07:02:40.507806 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/manila-operator-controller-manager-67d996989d-cmg8t" podUID="c947e95f-2771-4a5e-8adc-adf86fbbcc48" Mar 10 07:02:40 crc kubenswrapper[4825]: I0310 07:02:40.510499 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-4fxt4"] Mar 10 07:02:40 crc kubenswrapper[4825]: I0310 07:02:40.524515 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-dn5wn"] Mar 10 07:02:40 crc kubenswrapper[4825]: I0310 07:02:40.538960 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-cmg8t"] Mar 10 07:02:40 crc kubenswrapper[4825]: I0310 07:02:40.627770 4825 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-6bw6n"] Mar 10 07:02:40 crc kubenswrapper[4825]: I0310 07:02:40.641550 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k2l9r"] Mar 10 07:02:40 crc kubenswrapper[4825]: E0310 07:02:40.643341 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:9d03f03aa9a460f1fcac8875064808c03e4ecd0388873bbfb9c7dc58331f3968,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wz4z2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-55b5ff4dbb-s56gc_openstack-operators(6a3b19ba-bd45-4cae-a466-a4f380a10cd0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 07:02:40 crc kubenswrapper[4825]: E0310 07:02:40.643048 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:1b9074a4ce16396d8bd2d30a475fc8c2f004f75a023e3eef8950661e89c0bcc6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dlqvt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-5fdb694969-6bw6n_openstack-operators(88d2e717-1281-45d1-a787-74fafe29c95b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 07:02:40 crc kubenswrapper[4825]: E0310 07:02:40.644829 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-s56gc" podUID="6a3b19ba-bd45-4cae-a466-a4f380a10cd0" Mar 10 07:02:40 crc 
kubenswrapper[4825]: E0310 07:02:40.645845 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-6bw6n" podUID="88d2e717-1281-45d1-a787-74fafe29c95b" Mar 10 07:02:40 crc kubenswrapper[4825]: I0310 07:02:40.647306 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-s56gc"] Mar 10 07:02:40 crc kubenswrapper[4825]: I0310 07:02:40.652238 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bdxtj"] Mar 10 07:02:40 crc kubenswrapper[4825]: I0310 07:02:40.655921 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-t88nq"] Mar 10 07:02:40 crc kubenswrapper[4825]: W0310 07:02:40.656768 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5de5a57_9f17_4101_952a_cc33147fc220.slice/crio-a3e45334c30b2e6276c6bd400a2e856cffdf3fd2d5ad5716cf5de9fdd501f2fb WatchSource:0}: Error finding container a3e45334c30b2e6276c6bd400a2e856cffdf3fd2d5ad5716cf5de9fdd501f2fb: Status 404 returned error can't find the container with id a3e45334c30b2e6276c6bd400a2e856cffdf3fd2d5ad5716cf5de9fdd501f2fb Mar 10 07:02:40 crc kubenswrapper[4825]: E0310 07:02:40.664349 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:f309cdea8084a4b1e8cbcd732d6e250fd93c55cfd1b48ba9026907c8591faab7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4vd6k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-9b9ff9f4d-bdxtj_openstack-operators(e5de5a57-9f17-4101-952a-cc33147fc220): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 07:02:40 crc kubenswrapper[4825]: E0310 07:02:40.665613 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bdxtj" podUID="e5de5a57-9f17-4101-952a-cc33147fc220" Mar 10 07:02:40 crc kubenswrapper[4825]: E0310 07:02:40.668909 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2rtkc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-bccc79885-t88nq_openstack-operators(ab1a7215-c090-4f4e-a994-603bbbafb22e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 07:02:40 crc kubenswrapper[4825]: E0310 07:02:40.670276 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-t88nq" podUID="ab1a7215-c090-4f4e-a994-603bbbafb22e" Mar 10 07:02:40 crc kubenswrapper[4825]: I0310 07:02:40.977580 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0db55365-104a-42c2-ba9b-1c084fdd08cf-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64bd8c75rmqrc\" (UID: \"0db55365-104a-42c2-ba9b-1c084fdd08cf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c75rmqrc" Mar 10 07:02:40 crc kubenswrapper[4825]: E0310 07:02:40.977805 4825 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found 
Mar 10 07:02:40 crc kubenswrapper[4825]: E0310 07:02:40.977895 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0db55365-104a-42c2-ba9b-1c084fdd08cf-cert podName:0db55365-104a-42c2-ba9b-1c084fdd08cf nodeName:}" failed. No retries permitted until 2026-03-10 07:02:42.977855326 +0000 UTC m=+1116.007635941 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0db55365-104a-42c2-ba9b-1c084fdd08cf-cert") pod "openstack-baremetal-operator-controller-manager-6f64bd8c75rmqrc" (UID: "0db55365-104a-42c2-ba9b-1c084fdd08cf") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 07:02:41 crc kubenswrapper[4825]: I0310 07:02:41.119408 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-s56gc" event={"ID":"6a3b19ba-bd45-4cae-a466-a4f380a10cd0","Type":"ContainerStarted","Data":"0abcbf9ff9cb437f86d695d31ea24b6c9d78f24e4f3672755b5007a1104f7a7e"} Mar 10 07:02:41 crc kubenswrapper[4825]: I0310 07:02:41.122378 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-nl9ph" event={"ID":"53bfb88f-ffff-4945-acc2-ae245147edcb","Type":"ContainerStarted","Data":"10da7652fcb03efa6e47090e31b7ea93d237b14b4e5122a3144a1393b39d6c7d"} Mar 10 07:02:41 crc kubenswrapper[4825]: E0310 07:02:41.123831 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:9d03f03aa9a460f1fcac8875064808c03e4ecd0388873bbfb9c7dc58331f3968\\\"\"" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-s56gc" podUID="6a3b19ba-bd45-4cae-a466-a4f380a10cd0" Mar 10 07:02:41 crc kubenswrapper[4825]: E0310 07:02:41.128288 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:f309cdea8084a4b1e8cbcd732d6e250fd93c55cfd1b48ba9026907c8591faab7\\\"\"" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bdxtj" podUID="e5de5a57-9f17-4101-952a-cc33147fc220" Mar 10 07:02:41 crc kubenswrapper[4825]: I0310 07:02:41.130075 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-npxgg" event={"ID":"18fccb01-59df-4020-88c5-9c70b9f0edec","Type":"ContainerStarted","Data":"3f1ced4c7119769b68ccea3b9bcb5891aab07ca423de0aceb3b93cf094b6acd7"} Mar 10 07:02:41 crc kubenswrapper[4825]: I0310 07:02:41.130415 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bdxtj" event={"ID":"e5de5a57-9f17-4101-952a-cc33147fc220","Type":"ContainerStarted","Data":"a3e45334c30b2e6276c6bd400a2e856cffdf3fd2d5ad5716cf5de9fdd501f2fb"} Mar 10 07:02:41 crc kubenswrapper[4825]: I0310 07:02:41.130577 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-wx259" event={"ID":"4753d4b8-4f42-480b-bed5-54aa4dd2e77b","Type":"ContainerStarted","Data":"3f2cd56549775ac09141f05a5a21c83b427ee3c4a1a4428568a99682b31e598e"} Mar 10 07:02:41 crc kubenswrapper[4825]: I0310 07:02:41.136259 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-4fxt4" event={"ID":"7004ec9c-96b9-41e9-85f2-f57a2ca785ef","Type":"ContainerStarted","Data":"6864a40df297298143f6d834d1d324111df9c1a09414bbe66be340feb0f6cb53"} Mar 10 07:02:41 crc kubenswrapper[4825]: I0310 07:02:41.137978 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-t88nq" 
event={"ID":"ab1a7215-c090-4f4e-a994-603bbbafb22e","Type":"ContainerStarted","Data":"3d5ca67af0345f0c681522d1a3b1e90d4b6535f56d8f009514d5d8fd3cd0a4fa"} Mar 10 07:02:41 crc kubenswrapper[4825]: E0310 07:02:41.142883 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-t88nq" podUID="ab1a7215-c090-4f4e-a994-603bbbafb22e" Mar 10 07:02:41 crc kubenswrapper[4825]: I0310 07:02:41.146720 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-pnfls" event={"ID":"c0c7b817-9526-41fb-9a08-f2951ef9db20","Type":"ContainerStarted","Data":"fe2d2090632199fc76f3881de5ae30cb642176158b31d4b191aba96633b63b62"} Mar 10 07:02:41 crc kubenswrapper[4825]: I0310 07:02:41.180806 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-6jm6b" event={"ID":"3ee3f9b2-09ba-4f80-823f-a4fd8639a14a","Type":"ContainerStarted","Data":"da814202d6da2c05cdf70dda80adeac89ea5b52380286a16fa067ae0694188b6"} Mar 10 07:02:41 crc kubenswrapper[4825]: I0310 07:02:41.182940 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-7lfrk" event={"ID":"a16a88fb-4aab-48c8-a540-f999a66f712b","Type":"ContainerStarted","Data":"04e6a9e8ab79857d21ecb94afbc498e27343583ba30fdbbf14eef1cd0939cf6d"} Mar 10 07:02:41 crc kubenswrapper[4825]: E0310 07:02:41.184660 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/heat-operator@sha256:ee642fcf655f9897d480460008cba2e98b497d3ffdf7ab1d48ea460eb20c2053\\\"\"" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-7lfrk" podUID="a16a88fb-4aab-48c8-a540-f999a66f712b" Mar 10 07:02:41 crc kubenswrapper[4825]: I0310 07:02:41.185277 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-z8726" event={"ID":"406e912a-4ca3-4760-ba95-e63b4d342849","Type":"ContainerStarted","Data":"64d580e0c9014a982ac4080358e5d6c0876f54c855a69488dbcff065b936ff9c"} Mar 10 07:02:41 crc kubenswrapper[4825]: I0310 07:02:41.187201 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k2l9r" event={"ID":"41dde8f5-d43b-4cdf-beb8-56e67290e024","Type":"ContainerStarted","Data":"24e4199dd1f3149a3c039ae40950e95b7dcadae1202ee7ba7fb7186d4da83dae"} Mar 10 07:02:41 crc kubenswrapper[4825]: I0310 07:02:41.190216 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-6bw6n" event={"ID":"88d2e717-1281-45d1-a787-74fafe29c95b","Type":"ContainerStarted","Data":"52279098ea71b50e3b177475d2c805fd9963b75a6352db71b40c47314633bd75"} Mar 10 07:02:41 crc kubenswrapper[4825]: E0310 07:02:41.192533 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:1b9074a4ce16396d8bd2d30a475fc8c2f004f75a023e3eef8950661e89c0bcc6\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-6bw6n" podUID="88d2e717-1281-45d1-a787-74fafe29c95b" Mar 10 07:02:41 crc kubenswrapper[4825]: I0310 07:02:41.192907 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-dn5wn" 
event={"ID":"6667b99e-cca6-4450-b44d-03edafac3e7a","Type":"ContainerStarted","Data":"15f4496677d1633fe7959ccc6083283dc416e7b277ab8f50d003372f09169423"} Mar 10 07:02:41 crc kubenswrapper[4825]: I0310 07:02:41.194427 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-cmg8t" event={"ID":"c947e95f-2771-4a5e-8adc-adf86fbbcc48","Type":"ContainerStarted","Data":"f7825645be47077383290d3a7fdcc512d5df277d4228546c5ac04ed6f47e1e12"} Mar 10 07:02:41 crc kubenswrapper[4825]: E0310 07:02:41.198028 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:f1158ec4d879c4646eee4323bc501eba4d377beb2ad6fbe08ed30070c441ac26\\\"\"" pod="openstack-operators/manila-operator-controller-manager-67d996989d-cmg8t" podUID="c947e95f-2771-4a5e-8adc-adf86fbbcc48" Mar 10 07:02:41 crc kubenswrapper[4825]: I0310 07:02:41.198318 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54688575f-qhwdw" event={"ID":"d1dffe25-5486-4aba-932a-41725895d7cf","Type":"ContainerStarted","Data":"cfe28cb588b3aace32b3c1d5930b6d762fcff5e6946788bf8f762b990082159e"} Mar 10 07:02:41 crc kubenswrapper[4825]: I0310 07:02:41.284805 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5e92a016-c5ae-4ae4-a853-f69e701639fd-webhook-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-n2gmk\" (UID: \"5e92a016-c5ae-4ae4-a853-f69e701639fd\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-n2gmk" Mar 10 07:02:41 crc kubenswrapper[4825]: I0310 07:02:41.284869 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/5e92a016-c5ae-4ae4-a853-f69e701639fd-metrics-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-n2gmk\" (UID: \"5e92a016-c5ae-4ae4-a853-f69e701639fd\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-n2gmk" Mar 10 07:02:41 crc kubenswrapper[4825]: E0310 07:02:41.285038 4825 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 10 07:02:41 crc kubenswrapper[4825]: E0310 07:02:41.285099 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e92a016-c5ae-4ae4-a853-f69e701639fd-metrics-certs podName:5e92a016-c5ae-4ae4-a853-f69e701639fd nodeName:}" failed. No retries permitted until 2026-03-10 07:02:43.285082699 +0000 UTC m=+1116.314863314 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5e92a016-c5ae-4ae4-a853-f69e701639fd-metrics-certs") pod "openstack-operator-controller-manager-59b6c9788f-n2gmk" (UID: "5e92a016-c5ae-4ae4-a853-f69e701639fd") : secret "metrics-server-cert" not found Mar 10 07:02:41 crc kubenswrapper[4825]: E0310 07:02:41.286341 4825 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 07:02:41 crc kubenswrapper[4825]: E0310 07:02:41.286401 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e92a016-c5ae-4ae4-a853-f69e701639fd-webhook-certs podName:5e92a016-c5ae-4ae4-a853-f69e701639fd nodeName:}" failed. No retries permitted until 2026-03-10 07:02:43.286369603 +0000 UTC m=+1116.316150218 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5e92a016-c5ae-4ae4-a853-f69e701639fd-webhook-certs") pod "openstack-operator-controller-manager-59b6c9788f-n2gmk" (UID: "5e92a016-c5ae-4ae4-a853-f69e701639fd") : secret "webhook-server-cert" not found Mar 10 07:02:42 crc kubenswrapper[4825]: E0310 07:02:42.219358 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-t88nq" podUID="ab1a7215-c090-4f4e-a994-603bbbafb22e" Mar 10 07:02:42 crc kubenswrapper[4825]: E0310 07:02:42.219443 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:f309cdea8084a4b1e8cbcd732d6e250fd93c55cfd1b48ba9026907c8591faab7\\\"\"" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bdxtj" podUID="e5de5a57-9f17-4101-952a-cc33147fc220" Mar 10 07:02:42 crc kubenswrapper[4825]: E0310 07:02:42.220520 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:9d03f03aa9a460f1fcac8875064808c03e4ecd0388873bbfb9c7dc58331f3968\\\"\"" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-s56gc" podUID="6a3b19ba-bd45-4cae-a466-a4f380a10cd0" Mar 10 07:02:42 crc kubenswrapper[4825]: E0310 07:02:42.227905 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/heat-operator@sha256:ee642fcf655f9897d480460008cba2e98b497d3ffdf7ab1d48ea460eb20c2053\\\"\"" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-7lfrk" podUID="a16a88fb-4aab-48c8-a540-f999a66f712b" Mar 10 07:02:42 crc kubenswrapper[4825]: E0310 07:02:42.227970 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:1b9074a4ce16396d8bd2d30a475fc8c2f004f75a023e3eef8950661e89c0bcc6\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-6bw6n" podUID="88d2e717-1281-45d1-a787-74fafe29c95b" Mar 10 07:02:42 crc kubenswrapper[4825]: E0310 07:02:42.228602 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:f1158ec4d879c4646eee4323bc501eba4d377beb2ad6fbe08ed30070c441ac26\\\"\"" pod="openstack-operators/manila-operator-controller-manager-67d996989d-cmg8t" podUID="c947e95f-2771-4a5e-8adc-adf86fbbcc48" Mar 10 07:02:42 crc kubenswrapper[4825]: I0310 07:02:42.523070 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/adbbc5e4-169b-4b18-819f-2ca9136edf9b-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-b5zsk\" (UID: \"adbbc5e4-169b-4b18-819f-2ca9136edf9b\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-b5zsk" Mar 10 07:02:42 crc kubenswrapper[4825]: E0310 07:02:42.523459 4825 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 10 07:02:42 crc kubenswrapper[4825]: E0310 07:02:42.523541 4825 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/adbbc5e4-169b-4b18-819f-2ca9136edf9b-cert podName:adbbc5e4-169b-4b18-819f-2ca9136edf9b nodeName:}" failed. No retries permitted until 2026-03-10 07:02:46.523523309 +0000 UTC m=+1119.553303924 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/adbbc5e4-169b-4b18-819f-2ca9136edf9b-cert") pod "infra-operator-controller-manager-f7fcc58b9-b5zsk" (UID: "adbbc5e4-169b-4b18-819f-2ca9136edf9b") : secret "infra-operator-webhook-server-cert" not found Mar 10 07:02:43 crc kubenswrapper[4825]: I0310 07:02:43.034519 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0db55365-104a-42c2-ba9b-1c084fdd08cf-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64bd8c75rmqrc\" (UID: \"0db55365-104a-42c2-ba9b-1c084fdd08cf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c75rmqrc" Mar 10 07:02:43 crc kubenswrapper[4825]: E0310 07:02:43.034673 4825 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 07:02:43 crc kubenswrapper[4825]: E0310 07:02:43.034841 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0db55365-104a-42c2-ba9b-1c084fdd08cf-cert podName:0db55365-104a-42c2-ba9b-1c084fdd08cf nodeName:}" failed. No retries permitted until 2026-03-10 07:02:47.034825269 +0000 UTC m=+1120.064605884 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0db55365-104a-42c2-ba9b-1c084fdd08cf-cert") pod "openstack-baremetal-operator-controller-manager-6f64bd8c75rmqrc" (UID: "0db55365-104a-42c2-ba9b-1c084fdd08cf") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 07:02:43 crc kubenswrapper[4825]: I0310 07:02:43.337938 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e92a016-c5ae-4ae4-a853-f69e701639fd-metrics-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-n2gmk\" (UID: \"5e92a016-c5ae-4ae4-a853-f69e701639fd\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-n2gmk" Mar 10 07:02:43 crc kubenswrapper[4825]: E0310 07:02:43.338114 4825 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 10 07:02:43 crc kubenswrapper[4825]: E0310 07:02:43.338323 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e92a016-c5ae-4ae4-a853-f69e701639fd-metrics-certs podName:5e92a016-c5ae-4ae4-a853-f69e701639fd nodeName:}" failed. No retries permitted until 2026-03-10 07:02:47.338299204 +0000 UTC m=+1120.368079869 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5e92a016-c5ae-4ae4-a853-f69e701639fd-metrics-certs") pod "openstack-operator-controller-manager-59b6c9788f-n2gmk" (UID: "5e92a016-c5ae-4ae4-a853-f69e701639fd") : secret "metrics-server-cert" not found Mar 10 07:02:43 crc kubenswrapper[4825]: I0310 07:02:43.338358 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5e92a016-c5ae-4ae4-a853-f69e701639fd-webhook-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-n2gmk\" (UID: \"5e92a016-c5ae-4ae4-a853-f69e701639fd\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-n2gmk" Mar 10 07:02:43 crc kubenswrapper[4825]: E0310 07:02:43.338570 4825 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 07:02:43 crc kubenswrapper[4825]: E0310 07:02:43.338603 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e92a016-c5ae-4ae4-a853-f69e701639fd-webhook-certs podName:5e92a016-c5ae-4ae4-a853-f69e701639fd nodeName:}" failed. No retries permitted until 2026-03-10 07:02:47.338594322 +0000 UTC m=+1120.368374997 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5e92a016-c5ae-4ae4-a853-f69e701639fd-webhook-certs") pod "openstack-operator-controller-manager-59b6c9788f-n2gmk" (UID: "5e92a016-c5ae-4ae4-a853-f69e701639fd") : secret "webhook-server-cert" not found Mar 10 07:02:46 crc kubenswrapper[4825]: I0310 07:02:46.587841 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/adbbc5e4-169b-4b18-819f-2ca9136edf9b-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-b5zsk\" (UID: \"adbbc5e4-169b-4b18-819f-2ca9136edf9b\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-b5zsk" Mar 10 07:02:46 crc kubenswrapper[4825]: E0310 07:02:46.588053 4825 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 10 07:02:46 crc kubenswrapper[4825]: E0310 07:02:46.588349 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/adbbc5e4-169b-4b18-819f-2ca9136edf9b-cert podName:adbbc5e4-169b-4b18-819f-2ca9136edf9b nodeName:}" failed. No retries permitted until 2026-03-10 07:02:54.588334285 +0000 UTC m=+1127.618114900 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/adbbc5e4-169b-4b18-819f-2ca9136edf9b-cert") pod "infra-operator-controller-manager-f7fcc58b9-b5zsk" (UID: "adbbc5e4-169b-4b18-819f-2ca9136edf9b") : secret "infra-operator-webhook-server-cert" not found Mar 10 07:02:46 crc kubenswrapper[4825]: I0310 07:02:46.887863 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 07:02:46 crc kubenswrapper[4825]: I0310 07:02:46.887934 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 07:02:47 crc kubenswrapper[4825]: I0310 07:02:47.094990 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0db55365-104a-42c2-ba9b-1c084fdd08cf-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64bd8c75rmqrc\" (UID: \"0db55365-104a-42c2-ba9b-1c084fdd08cf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c75rmqrc" Mar 10 07:02:47 crc kubenswrapper[4825]: E0310 07:02:47.095255 4825 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 07:02:47 crc kubenswrapper[4825]: E0310 07:02:47.095315 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0db55365-104a-42c2-ba9b-1c084fdd08cf-cert podName:0db55365-104a-42c2-ba9b-1c084fdd08cf nodeName:}" failed. 
No retries permitted until 2026-03-10 07:02:55.095299281 +0000 UTC m=+1128.125079896 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0db55365-104a-42c2-ba9b-1c084fdd08cf-cert") pod "openstack-baremetal-operator-controller-manager-6f64bd8c75rmqrc" (UID: "0db55365-104a-42c2-ba9b-1c084fdd08cf") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 07:02:47 crc kubenswrapper[4825]: I0310 07:02:47.399326 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5e92a016-c5ae-4ae4-a853-f69e701639fd-webhook-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-n2gmk\" (UID: \"5e92a016-c5ae-4ae4-a853-f69e701639fd\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-n2gmk" Mar 10 07:02:47 crc kubenswrapper[4825]: I0310 07:02:47.399380 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e92a016-c5ae-4ae4-a853-f69e701639fd-metrics-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-n2gmk\" (UID: \"5e92a016-c5ae-4ae4-a853-f69e701639fd\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-n2gmk" Mar 10 07:02:47 crc kubenswrapper[4825]: E0310 07:02:47.399528 4825 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 10 07:02:47 crc kubenswrapper[4825]: E0310 07:02:47.399588 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e92a016-c5ae-4ae4-a853-f69e701639fd-metrics-certs podName:5e92a016-c5ae-4ae4-a853-f69e701639fd nodeName:}" failed. No retries permitted until 2026-03-10 07:02:55.399569107 +0000 UTC m=+1128.429349722 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5e92a016-c5ae-4ae4-a853-f69e701639fd-metrics-certs") pod "openstack-operator-controller-manager-59b6c9788f-n2gmk" (UID: "5e92a016-c5ae-4ae4-a853-f69e701639fd") : secret "metrics-server-cert" not found Mar 10 07:02:47 crc kubenswrapper[4825]: E0310 07:02:47.399592 4825 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 07:02:47 crc kubenswrapper[4825]: E0310 07:02:47.399676 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e92a016-c5ae-4ae4-a853-f69e701639fd-webhook-certs podName:5e92a016-c5ae-4ae4-a853-f69e701639fd nodeName:}" failed. No retries permitted until 2026-03-10 07:02:55.39965782 +0000 UTC m=+1128.429438435 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5e92a016-c5ae-4ae4-a853-f69e701639fd-webhook-certs") pod "openstack-operator-controller-manager-59b6c9788f-n2gmk" (UID: "5e92a016-c5ae-4ae4-a853-f69e701639fd") : secret "webhook-server-cert" not found Mar 10 07:02:53 crc kubenswrapper[4825]: E0310 07:02:53.491290 4825 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Mar 10 07:02:53 crc kubenswrapper[4825]: E0310 07:02:53.492203 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-988q5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-k2l9r_openstack-operators(41dde8f5-d43b-4cdf-beb8-56e67290e024): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 07:02:53 crc kubenswrapper[4825]: E0310 07:02:53.493431 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k2l9r" podUID="41dde8f5-d43b-4cdf-beb8-56e67290e024" Mar 10 07:02:54 crc kubenswrapper[4825]: E0310 07:02:54.157202 4825 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:172f24bd4603ac3498536a8a2c8fffb07cf9113dd52bc132778ea0aa275c6b84" Mar 10 07:02:54 crc kubenswrapper[4825]: E0310 07:02:54.157739 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:172f24bd4603ac3498536a8a2c8fffb07cf9113dd52bc132778ea0aa275c6b84,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6g6vk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-74b6b5dc96-dn5wn_openstack-operators(6667b99e-cca6-4450-b44d-03edafac3e7a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 07:02:54 crc kubenswrapper[4825]: E0310 07:02:54.158924 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-dn5wn" podUID="6667b99e-cca6-4450-b44d-03edafac3e7a" Mar 10 07:02:54 crc kubenswrapper[4825]: E0310 07:02:54.334074 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k2l9r" podUID="41dde8f5-d43b-4cdf-beb8-56e67290e024" Mar 10 07:02:54 crc kubenswrapper[4825]: E0310 07:02:54.334487 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:172f24bd4603ac3498536a8a2c8fffb07cf9113dd52bc132778ea0aa275c6b84\\\"\"" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-dn5wn" podUID="6667b99e-cca6-4450-b44d-03edafac3e7a" Mar 10 07:02:54 crc kubenswrapper[4825]: I0310 07:02:54.657622 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/adbbc5e4-169b-4b18-819f-2ca9136edf9b-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-b5zsk\" (UID: \"adbbc5e4-169b-4b18-819f-2ca9136edf9b\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-b5zsk" Mar 10 07:02:54 crc kubenswrapper[4825]: I0310 07:02:54.678182 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/adbbc5e4-169b-4b18-819f-2ca9136edf9b-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-b5zsk\" (UID: \"adbbc5e4-169b-4b18-819f-2ca9136edf9b\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-b5zsk" Mar 10 07:02:54 crc kubenswrapper[4825]: I0310 07:02:54.898446 4825 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-b5zsk" Mar 10 07:02:55 crc kubenswrapper[4825]: E0310 07:02:55.008606 4825 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:9d723ab33964ee44704eed3223b64e828349d45dee04695434a6fcf4b6807d4c" Mar 10 07:02:55 crc kubenswrapper[4825]: E0310 07:02:55.008785 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:9d723ab33964ee44704eed3223b64e828349d45dee04695434a6fcf4b6807d4c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-psz7n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7c789f89c6-pnfls_openstack-operators(c0c7b817-9526-41fb-9a08-f2951ef9db20): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 07:02:55 crc kubenswrapper[4825]: E0310 07:02:55.010109 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-pnfls" podUID="c0c7b817-9526-41fb-9a08-f2951ef9db20" Mar 10 07:02:55 crc kubenswrapper[4825]: I0310 07:02:55.164860 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0db55365-104a-42c2-ba9b-1c084fdd08cf-cert\") pod 
\"openstack-baremetal-operator-controller-manager-6f64bd8c75rmqrc\" (UID: \"0db55365-104a-42c2-ba9b-1c084fdd08cf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c75rmqrc" Mar 10 07:02:55 crc kubenswrapper[4825]: I0310 07:02:55.177199 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0db55365-104a-42c2-ba9b-1c084fdd08cf-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64bd8c75rmqrc\" (UID: \"0db55365-104a-42c2-ba9b-1c084fdd08cf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c75rmqrc" Mar 10 07:02:55 crc kubenswrapper[4825]: E0310 07:02:55.340791 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:9d723ab33964ee44704eed3223b64e828349d45dee04695434a6fcf4b6807d4c\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-pnfls" podUID="c0c7b817-9526-41fb-9a08-f2951ef9db20" Mar 10 07:02:55 crc kubenswrapper[4825]: I0310 07:02:55.442035 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c75rmqrc" Mar 10 07:02:55 crc kubenswrapper[4825]: I0310 07:02:55.477750 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5e92a016-c5ae-4ae4-a853-f69e701639fd-webhook-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-n2gmk\" (UID: \"5e92a016-c5ae-4ae4-a853-f69e701639fd\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-n2gmk" Mar 10 07:02:55 crc kubenswrapper[4825]: I0310 07:02:55.477802 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e92a016-c5ae-4ae4-a853-f69e701639fd-metrics-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-n2gmk\" (UID: \"5e92a016-c5ae-4ae4-a853-f69e701639fd\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-n2gmk" Mar 10 07:02:55 crc kubenswrapper[4825]: E0310 07:02:55.477954 4825 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 10 07:02:55 crc kubenswrapper[4825]: E0310 07:02:55.477990 4825 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 07:02:55 crc kubenswrapper[4825]: E0310 07:02:55.478091 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e92a016-c5ae-4ae4-a853-f69e701639fd-metrics-certs podName:5e92a016-c5ae-4ae4-a853-f69e701639fd nodeName:}" failed. No retries permitted until 2026-03-10 07:03:11.478074647 +0000 UTC m=+1144.507855262 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5e92a016-c5ae-4ae4-a853-f69e701639fd-metrics-certs") pod "openstack-operator-controller-manager-59b6c9788f-n2gmk" (UID: "5e92a016-c5ae-4ae4-a853-f69e701639fd") : secret "metrics-server-cert" not found Mar 10 07:02:55 crc kubenswrapper[4825]: E0310 07:02:55.478229 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e92a016-c5ae-4ae4-a853-f69e701639fd-webhook-certs podName:5e92a016-c5ae-4ae4-a853-f69e701639fd nodeName:}" failed. No retries permitted until 2026-03-10 07:03:11.47819337 +0000 UTC m=+1144.507974185 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5e92a016-c5ae-4ae4-a853-f69e701639fd-webhook-certs") pod "openstack-operator-controller-manager-59b6c9788f-n2gmk" (UID: "5e92a016-c5ae-4ae4-a853-f69e701639fd") : secret "webhook-server-cert" not found Mar 10 07:02:56 crc kubenswrapper[4825]: I0310 07:02:56.350207 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-sd5jw" event={"ID":"acc23278-44ed-4cee-bd08-4562e17175db","Type":"ContainerStarted","Data":"284e95bab10e23ab0eb73214c69af4f9da75ca858b40e0d033968288a61b34dd"} Mar 10 07:02:56 crc kubenswrapper[4825]: I0310 07:02:56.352488 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-sd5jw" Mar 10 07:02:56 crc kubenswrapper[4825]: I0310 07:02:56.437847 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-sd5jw" podStartSLOduration=4.733624702 podStartE2EDuration="18.437829827s" podCreationTimestamp="2026-03-10 07:02:38 +0000 UTC" firstStartedPulling="2026-03-10 07:02:39.772102577 +0000 UTC m=+1112.801883182" lastFinishedPulling="2026-03-10 07:02:53.476307652 +0000 UTC 
m=+1126.506088307" observedRunningTime="2026-03-10 07:02:56.37152545 +0000 UTC m=+1129.401306065" watchObservedRunningTime="2026-03-10 07:02:56.437829827 +0000 UTC m=+1129.467610442" Mar 10 07:02:56 crc kubenswrapper[4825]: I0310 07:02:56.442386 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-f7fcc58b9-b5zsk"] Mar 10 07:02:56 crc kubenswrapper[4825]: W0310 07:02:56.813452 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadbbc5e4_169b_4b18_819f_2ca9136edf9b.slice/crio-0b0aa30f4ed99b50a42469d652fec2d64ed2c29620d00ca1b3ff5b078e0b7fa0 WatchSource:0}: Error finding container 0b0aa30f4ed99b50a42469d652fec2d64ed2c29620d00ca1b3ff5b078e0b7fa0: Status 404 returned error can't find the container with id 0b0aa30f4ed99b50a42469d652fec2d64ed2c29620d00ca1b3ff5b078e0b7fa0 Mar 10 07:02:57 crc kubenswrapper[4825]: I0310 07:02:57.359237 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-b5zsk" event={"ID":"adbbc5e4-169b-4b18-819f-2ca9136edf9b","Type":"ContainerStarted","Data":"0b0aa30f4ed99b50a42469d652fec2d64ed2c29620d00ca1b3ff5b078e0b7fa0"} Mar 10 07:02:58 crc kubenswrapper[4825]: I0310 07:02:58.393407 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c75rmqrc"] Mar 10 07:02:59 crc kubenswrapper[4825]: I0310 07:02:59.399680 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-zbvvz" event={"ID":"2c8e75e0-135a-4d58-9230-a0f18ad89b25","Type":"ContainerStarted","Data":"a5dfc77f14ff1dd772128db71d2bf6a240f703b96e38841b50c9c8530be56bf7"} Mar 10 07:02:59 crc kubenswrapper[4825]: I0310 07:02:59.401377 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-zbvvz" Mar 10 07:02:59 crc kubenswrapper[4825]: I0310 07:02:59.423810 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-zbvvz" podStartSLOduration=7.317834639 podStartE2EDuration="21.42378912s" podCreationTimestamp="2026-03-10 07:02:38 +0000 UTC" firstStartedPulling="2026-03-10 07:02:40.021682178 +0000 UTC m=+1113.051462793" lastFinishedPulling="2026-03-10 07:02:54.127636649 +0000 UTC m=+1127.157417274" observedRunningTime="2026-03-10 07:02:59.41688793 +0000 UTC m=+1132.446668555" watchObservedRunningTime="2026-03-10 07:02:59.42378912 +0000 UTC m=+1132.453569745" Mar 10 07:03:00 crc kubenswrapper[4825]: W0310 07:03:00.005418 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0db55365_104a_42c2_ba9b_1c084fdd08cf.slice/crio-f4892f6996e830f0d5fa61dbee92a3c1dd0d8d55425e8b1bed169b1e5a96099d WatchSource:0}: Error finding container f4892f6996e830f0d5fa61dbee92a3c1dd0d8d55425e8b1bed169b1e5a96099d: Status 404 returned error can't find the container with id f4892f6996e830f0d5fa61dbee92a3c1dd0d8d55425e8b1bed169b1e5a96099d Mar 10 07:03:00 crc kubenswrapper[4825]: I0310 07:03:00.406265 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54688575f-qhwdw" event={"ID":"d1dffe25-5486-4aba-932a-41725895d7cf","Type":"ContainerStarted","Data":"95a4c850753e2db6d9bfc1281bd7790ab621ef6c704243f8221a111d6ffcb5a6"} Mar 10 07:03:00 crc kubenswrapper[4825]: I0310 07:03:00.407357 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-54688575f-qhwdw" Mar 10 07:03:00 crc kubenswrapper[4825]: I0310 07:03:00.411387 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c75rmqrc" event={"ID":"0db55365-104a-42c2-ba9b-1c084fdd08cf","Type":"ContainerStarted","Data":"f4892f6996e830f0d5fa61dbee92a3c1dd0d8d55425e8b1bed169b1e5a96099d"} Mar 10 07:03:00 crc kubenswrapper[4825]: I0310 07:03:00.412695 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-6jm6b" event={"ID":"3ee3f9b2-09ba-4f80-823f-a4fd8639a14a","Type":"ContainerStarted","Data":"2adb5cd2480543dc28898fb704144c4a7dd2794b26223e57133056e83b5a7634"} Mar 10 07:03:00 crc kubenswrapper[4825]: I0310 07:03:00.412885 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-6jm6b" Mar 10 07:03:00 crc kubenswrapper[4825]: I0310 07:03:00.414805 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-wx259" event={"ID":"4753d4b8-4f42-480b-bed5-54aa4dd2e77b","Type":"ContainerStarted","Data":"72fd1e8cbe16a7359865717e475247f78de458a06f76c779bd77b656a80e6751"} Mar 10 07:03:00 crc kubenswrapper[4825]: I0310 07:03:00.415019 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-wx259" Mar 10 07:03:00 crc kubenswrapper[4825]: I0310 07:03:00.416262 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-4fxt4" event={"ID":"7004ec9c-96b9-41e9-85f2-f57a2ca785ef","Type":"ContainerStarted","Data":"b38797f6cd850ae8f37db8ef7f91ea54cefac9a7563e2a5543f06abc5a84020d"} Mar 10 07:03:00 crc kubenswrapper[4825]: I0310 07:03:00.417074 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-4fxt4" Mar 10 07:03:00 crc kubenswrapper[4825]: I0310 07:03:00.419681 4825 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-z8726" event={"ID":"406e912a-4ca3-4760-ba95-e63b4d342849","Type":"ContainerStarted","Data":"9d7d5a6f7958780bc7a1361c929db3b404e08eee56acb9647c54236a6e664cd5"} Mar 10 07:03:00 crc kubenswrapper[4825]: I0310 07:03:00.419729 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-z8726" Mar 10 07:03:00 crc kubenswrapper[4825]: I0310 07:03:00.424400 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-54688575f-qhwdw" podStartSLOduration=9.394221218 podStartE2EDuration="22.424384345s" podCreationTimestamp="2026-03-10 07:02:38 +0000 UTC" firstStartedPulling="2026-03-10 07:02:40.445368495 +0000 UTC m=+1113.475149110" lastFinishedPulling="2026-03-10 07:02:53.475531612 +0000 UTC m=+1126.505312237" observedRunningTime="2026-03-10 07:03:00.419786515 +0000 UTC m=+1133.449567130" watchObservedRunningTime="2026-03-10 07:03:00.424384345 +0000 UTC m=+1133.454164960" Mar 10 07:03:00 crc kubenswrapper[4825]: I0310 07:03:00.436022 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-4fxt4" podStartSLOduration=9.462768505 podStartE2EDuration="22.436006828s" podCreationTimestamp="2026-03-10 07:02:38 +0000 UTC" firstStartedPulling="2026-03-10 07:02:40.502274018 +0000 UTC m=+1113.532054623" lastFinishedPulling="2026-03-10 07:02:53.475512321 +0000 UTC m=+1126.505292946" observedRunningTime="2026-03-10 07:03:00.435476394 +0000 UTC m=+1133.465257009" watchObservedRunningTime="2026-03-10 07:03:00.436006828 +0000 UTC m=+1133.465787443" Mar 10 07:03:00 crc kubenswrapper[4825]: I0310 07:03:00.454215 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/placement-operator-controller-manager-648564c9fc-wx259" podStartSLOduration=8.816346735 podStartE2EDuration="22.454201362s" podCreationTimestamp="2026-03-10 07:02:38 +0000 UTC" firstStartedPulling="2026-03-10 07:02:40.489788772 +0000 UTC m=+1113.519569387" lastFinishedPulling="2026-03-10 07:02:54.127643399 +0000 UTC m=+1127.157424014" observedRunningTime="2026-03-10 07:03:00.454165671 +0000 UTC m=+1133.483946296" watchObservedRunningTime="2026-03-10 07:03:00.454201362 +0000 UTC m=+1133.483981977" Mar 10 07:03:00 crc kubenswrapper[4825]: I0310 07:03:00.480940 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-z8726" podStartSLOduration=9.493883115 podStartE2EDuration="22.480920268s" podCreationTimestamp="2026-03-10 07:02:38 +0000 UTC" firstStartedPulling="2026-03-10 07:02:40.489367761 +0000 UTC m=+1113.519148376" lastFinishedPulling="2026-03-10 07:02:53.476404894 +0000 UTC m=+1126.506185529" observedRunningTime="2026-03-10 07:03:00.478972187 +0000 UTC m=+1133.508752812" watchObservedRunningTime="2026-03-10 07:03:00.480920268 +0000 UTC m=+1133.510700883" Mar 10 07:03:00 crc kubenswrapper[4825]: I0310 07:03:00.504541 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-6jm6b" podStartSLOduration=9.294905311 podStartE2EDuration="22.504521772s" podCreationTimestamp="2026-03-10 07:02:38 +0000 UTC" firstStartedPulling="2026-03-10 07:02:40.266799994 +0000 UTC m=+1113.296580609" lastFinishedPulling="2026-03-10 07:02:53.476416455 +0000 UTC m=+1126.506197070" observedRunningTime="2026-03-10 07:03:00.501082653 +0000 UTC m=+1133.530863268" watchObservedRunningTime="2026-03-10 07:03:00.504521772 +0000 UTC m=+1133.534302387" Mar 10 07:03:02 crc kubenswrapper[4825]: I0310 07:03:02.441932 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-npxgg" event={"ID":"18fccb01-59df-4020-88c5-9c70b9f0edec","Type":"ContainerStarted","Data":"5fc0234096eef8d9cd34e55d1114abc9c6352383b9f22ac07707d13f7a8af2a3"} Mar 10 07:03:02 crc kubenswrapper[4825]: I0310 07:03:02.442357 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-npxgg" Mar 10 07:03:02 crc kubenswrapper[4825]: I0310 07:03:02.494475 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bdxtj" event={"ID":"e5de5a57-9f17-4101-952a-cc33147fc220","Type":"ContainerStarted","Data":"44dee38c554b45fc9209372a4740c785799249ab66a5eccdb5bac8bf4462dc93"} Mar 10 07:03:02 crc kubenswrapper[4825]: I0310 07:03:02.495022 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bdxtj" Mar 10 07:03:02 crc kubenswrapper[4825]: I0310 07:03:02.514539 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-t88nq" event={"ID":"ab1a7215-c090-4f4e-a994-603bbbafb22e","Type":"ContainerStarted","Data":"27409002031be2daf282f21d78a83cbb4bef3d8b827b8efbe4ab41a970286af4"} Mar 10 07:03:02 crc kubenswrapper[4825]: I0310 07:03:02.515231 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-t88nq" Mar 10 07:03:02 crc kubenswrapper[4825]: I0310 07:03:02.529950 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-s56gc" event={"ID":"6a3b19ba-bd45-4cae-a466-a4f380a10cd0","Type":"ContainerStarted","Data":"68af52884dd88753e7d2c944fe9e61b4212d2e47b99f1926e1a8987e520b999d"} Mar 10 07:03:02 crc kubenswrapper[4825]: I0310 07:03:02.530646 4825 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-s56gc" Mar 10 07:03:02 crc kubenswrapper[4825]: I0310 07:03:02.548275 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-nl9ph" event={"ID":"53bfb88f-ffff-4945-acc2-ae245147edcb","Type":"ContainerStarted","Data":"a75ac781e6286e8cbb91abcd0273855beaf2320562d2ea68c70e4534096f435c"} Mar 10 07:03:02 crc kubenswrapper[4825]: I0310 07:03:02.548814 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-nl9ph" Mar 10 07:03:02 crc kubenswrapper[4825]: I0310 07:03:02.549637 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-bpf8z" event={"ID":"ff530b27-be8c-40d4-9af7-ed6200fcdcac","Type":"ContainerStarted","Data":"245928aa5bf10381ca02af8e04f9bb5abe0712fbc29a80cc7c708aec9ef20f1f"} Mar 10 07:03:02 crc kubenswrapper[4825]: I0310 07:03:02.550274 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-bpf8z" Mar 10 07:03:02 crc kubenswrapper[4825]: I0310 07:03:02.551239 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-6bw6n" event={"ID":"88d2e717-1281-45d1-a787-74fafe29c95b","Type":"ContainerStarted","Data":"0dd79eda399195325cc6fc718a35c6286a5b2c629eb6a5567f1ec679db35d600"} Mar 10 07:03:02 crc kubenswrapper[4825]: I0310 07:03:02.551631 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-6bw6n" Mar 10 07:03:02 crc kubenswrapper[4825]: I0310 07:03:02.552455 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-qqskl" 
event={"ID":"7ed2e288-eed6-49bf-8eb8-be9a1e8415a3","Type":"ContainerStarted","Data":"ef564bb87865a61a4a3068ed8a4ea5cfa5b1c26a4f38c115ac236cf74dc1ef47"} Mar 10 07:03:02 crc kubenswrapper[4825]: I0310 07:03:02.552803 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-qqskl" Mar 10 07:03:02 crc kubenswrapper[4825]: I0310 07:03:02.561732 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-7lfrk" event={"ID":"a16a88fb-4aab-48c8-a540-f999a66f712b","Type":"ContainerStarted","Data":"04d3d0797ed55f2227e2f47d53e84748f4ea3f3ddb41b6f7824aed426a34c732"} Mar 10 07:03:02 crc kubenswrapper[4825]: I0310 07:03:02.562173 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-7lfrk" Mar 10 07:03:02 crc kubenswrapper[4825]: I0310 07:03:02.564645 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-b5zsk" event={"ID":"adbbc5e4-169b-4b18-819f-2ca9136edf9b","Type":"ContainerStarted","Data":"4a8e9e01233965691a1541b6f004800e12d1b92e877333b0a60d1c08eeb87787"} Mar 10 07:03:02 crc kubenswrapper[4825]: I0310 07:03:02.565334 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-b5zsk" Mar 10 07:03:02 crc kubenswrapper[4825]: I0310 07:03:02.576062 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-cmg8t" event={"ID":"c947e95f-2771-4a5e-8adc-adf86fbbcc48","Type":"ContainerStarted","Data":"407f1c8ce39304aa551b9521a0b9bcaa470e4943385eca3545fed5fa769035a1"} Mar 10 07:03:02 crc kubenswrapper[4825]: I0310 07:03:02.576476 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/manila-operator-controller-manager-67d996989d-cmg8t" Mar 10 07:03:02 crc kubenswrapper[4825]: I0310 07:03:02.591818 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-npxgg" podStartSLOduration=9.870012452 podStartE2EDuration="24.591789515s" podCreationTimestamp="2026-03-10 07:02:38 +0000 UTC" firstStartedPulling="2026-03-10 07:02:40.260883159 +0000 UTC m=+1113.290663774" lastFinishedPulling="2026-03-10 07:02:54.982660222 +0000 UTC m=+1128.012440837" observedRunningTime="2026-03-10 07:03:02.587927994 +0000 UTC m=+1135.617708619" watchObservedRunningTime="2026-03-10 07:03:02.591789515 +0000 UTC m=+1135.621570130" Mar 10 07:03:02 crc kubenswrapper[4825]: I0310 07:03:02.628487 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-b5zsk" podStartSLOduration=20.215170316 podStartE2EDuration="24.62846976s" podCreationTimestamp="2026-03-10 07:02:38 +0000 UTC" firstStartedPulling="2026-03-10 07:02:56.826282316 +0000 UTC m=+1129.856062931" lastFinishedPulling="2026-03-10 07:03:01.23958176 +0000 UTC m=+1134.269362375" observedRunningTime="2026-03-10 07:03:02.623886391 +0000 UTC m=+1135.653666996" watchObservedRunningTime="2026-03-10 07:03:02.62846976 +0000 UTC m=+1135.658250375" Mar 10 07:03:02 crc kubenswrapper[4825]: I0310 07:03:02.657209 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-qqskl" podStartSLOduration=9.60849646 podStartE2EDuration="24.657193008s" podCreationTimestamp="2026-03-10 07:02:38 +0000 UTC" firstStartedPulling="2026-03-10 07:02:39.936545681 +0000 UTC m=+1112.966326296" lastFinishedPulling="2026-03-10 07:02:54.985242229 +0000 UTC m=+1128.015022844" observedRunningTime="2026-03-10 07:03:02.653280296 +0000 UTC m=+1135.683060911" 
watchObservedRunningTime="2026-03-10 07:03:02.657193008 +0000 UTC m=+1135.686973623" Mar 10 07:03:02 crc kubenswrapper[4825]: I0310 07:03:02.725975 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-s56gc" podStartSLOduration=7.920288034 podStartE2EDuration="24.72595951s" podCreationTimestamp="2026-03-10 07:02:38 +0000 UTC" firstStartedPulling="2026-03-10 07:02:40.643210129 +0000 UTC m=+1113.672990744" lastFinishedPulling="2026-03-10 07:02:57.448881605 +0000 UTC m=+1130.478662220" observedRunningTime="2026-03-10 07:03:02.696468421 +0000 UTC m=+1135.726249056" watchObservedRunningTime="2026-03-10 07:03:02.72595951 +0000 UTC m=+1135.755740125" Mar 10 07:03:02 crc kubenswrapper[4825]: I0310 07:03:02.729738 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-bpf8z" podStartSLOduration=9.765970412 podStartE2EDuration="24.729718577s" podCreationTimestamp="2026-03-10 07:02:38 +0000 UTC" firstStartedPulling="2026-03-10 07:02:40.02250213 +0000 UTC m=+1113.052282745" lastFinishedPulling="2026-03-10 07:02:54.986250295 +0000 UTC m=+1128.016030910" observedRunningTime="2026-03-10 07:03:02.723798223 +0000 UTC m=+1135.753578838" watchObservedRunningTime="2026-03-10 07:03:02.729718577 +0000 UTC m=+1135.759499192" Mar 10 07:03:02 crc kubenswrapper[4825]: I0310 07:03:02.766918 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bdxtj" podStartSLOduration=4.208262208 podStartE2EDuration="24.766890566s" podCreationTimestamp="2026-03-10 07:02:38 +0000 UTC" firstStartedPulling="2026-03-10 07:02:40.663705093 +0000 UTC m=+1113.693485708" lastFinishedPulling="2026-03-10 07:03:01.222333441 +0000 UTC m=+1134.252114066" observedRunningTime="2026-03-10 07:03:02.760182661 +0000 UTC m=+1135.789963276" 
watchObservedRunningTime="2026-03-10 07:03:02.766890566 +0000 UTC m=+1135.796671181" Mar 10 07:03:02 crc kubenswrapper[4825]: I0310 07:03:02.790674 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-7lfrk" podStartSLOduration=4.082428439 podStartE2EDuration="24.790660575s" podCreationTimestamp="2026-03-10 07:02:38 +0000 UTC" firstStartedPulling="2026-03-10 07:02:40.504841594 +0000 UTC m=+1113.534622209" lastFinishedPulling="2026-03-10 07:03:01.21307373 +0000 UTC m=+1134.242854345" observedRunningTime="2026-03-10 07:03:02.785815339 +0000 UTC m=+1135.815595954" watchObservedRunningTime="2026-03-10 07:03:02.790660575 +0000 UTC m=+1135.820441190" Mar 10 07:03:02 crc kubenswrapper[4825]: I0310 07:03:02.821932 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-t88nq" podStartSLOduration=3.312262687 podStartE2EDuration="23.821915599s" podCreationTimestamp="2026-03-10 07:02:39 +0000 UTC" firstStartedPulling="2026-03-10 07:02:40.668735654 +0000 UTC m=+1113.698516269" lastFinishedPulling="2026-03-10 07:03:01.178388566 +0000 UTC m=+1134.208169181" observedRunningTime="2026-03-10 07:03:02.817965556 +0000 UTC m=+1135.847746161" watchObservedRunningTime="2026-03-10 07:03:02.821915599 +0000 UTC m=+1135.851696214" Mar 10 07:03:02 crc kubenswrapper[4825]: I0310 07:03:02.839400 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-67d996989d-cmg8t" podStartSLOduration=4.106056994 podStartE2EDuration="24.839216739s" podCreationTimestamp="2026-03-10 07:02:38 +0000 UTC" firstStartedPulling="2026-03-10 07:02:40.506444076 +0000 UTC m=+1113.536224691" lastFinishedPulling="2026-03-10 07:03:01.239603811 +0000 UTC m=+1134.269384436" observedRunningTime="2026-03-10 07:03:02.837300539 +0000 UTC m=+1135.867081154" 
watchObservedRunningTime="2026-03-10 07:03:02.839216739 +0000 UTC m=+1135.868997354" Mar 10 07:03:02 crc kubenswrapper[4825]: I0310 07:03:02.855640 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-nl9ph" podStartSLOduration=10.047361341 podStartE2EDuration="24.855623296s" podCreationTimestamp="2026-03-10 07:02:38 +0000 UTC" firstStartedPulling="2026-03-10 07:02:40.174579421 +0000 UTC m=+1113.204360036" lastFinishedPulling="2026-03-10 07:02:54.982841376 +0000 UTC m=+1128.012621991" observedRunningTime="2026-03-10 07:03:02.850840451 +0000 UTC m=+1135.880621066" watchObservedRunningTime="2026-03-10 07:03:02.855623296 +0000 UTC m=+1135.885403911" Mar 10 07:03:02 crc kubenswrapper[4825]: I0310 07:03:02.874450 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-6bw6n" podStartSLOduration=4.304394791 podStartE2EDuration="24.874415306s" podCreationTimestamp="2026-03-10 07:02:38 +0000 UTC" firstStartedPulling="2026-03-10 07:02:40.642903711 +0000 UTC m=+1113.672684326" lastFinishedPulling="2026-03-10 07:03:01.212924216 +0000 UTC m=+1134.242704841" observedRunningTime="2026-03-10 07:03:02.874030966 +0000 UTC m=+1135.903811581" watchObservedRunningTime="2026-03-10 07:03:02.874415306 +0000 UTC m=+1135.904195921" Mar 10 07:03:08 crc kubenswrapper[4825]: I0310 07:03:08.798439 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-nl9ph" Mar 10 07:03:08 crc kubenswrapper[4825]: I0310 07:03:08.832366 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-sd5jw" Mar 10 07:03:08 crc kubenswrapper[4825]: I0310 07:03:08.875076 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-qqskl" Mar 10 07:03:08 crc kubenswrapper[4825]: I0310 07:03:08.979296 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-zbvvz" Mar 10 07:03:09 crc kubenswrapper[4825]: I0310 07:03:09.095965 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-bpf8z" Mar 10 07:03:09 crc kubenswrapper[4825]: I0310 07:03:09.235030 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-npxgg" Mar 10 07:03:09 crc kubenswrapper[4825]: I0310 07:03:09.246417 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-6jm6b" Mar 10 07:03:09 crc kubenswrapper[4825]: I0310 07:03:09.247944 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-7lfrk" Mar 10 07:03:09 crc kubenswrapper[4825]: I0310 07:03:09.269593 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-67d996989d-cmg8t" Mar 10 07:03:09 crc kubenswrapper[4825]: I0310 07:03:09.281835 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-54688575f-qhwdw" Mar 10 07:03:09 crc kubenswrapper[4825]: I0310 07:03:09.321491 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-z8726" Mar 10 07:03:09 crc kubenswrapper[4825]: I0310 07:03:09.370581 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/ovn-operator-controller-manager-75684d597f-4fxt4" Mar 10 07:03:09 crc kubenswrapper[4825]: I0310 07:03:09.401848 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-wx259" Mar 10 07:03:09 crc kubenswrapper[4825]: I0310 07:03:09.579182 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-bdxtj" Mar 10 07:03:09 crc kubenswrapper[4825]: I0310 07:03:09.601805 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-6bw6n" Mar 10 07:03:09 crc kubenswrapper[4825]: I0310 07:03:09.636402 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-s56gc" Mar 10 07:03:09 crc kubenswrapper[4825]: I0310 07:03:09.706854 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-t88nq" Mar 10 07:03:11 crc kubenswrapper[4825]: I0310 07:03:11.490660 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5e92a016-c5ae-4ae4-a853-f69e701639fd-webhook-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-n2gmk\" (UID: \"5e92a016-c5ae-4ae4-a853-f69e701639fd\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-n2gmk" Mar 10 07:03:11 crc kubenswrapper[4825]: I0310 07:03:11.492041 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e92a016-c5ae-4ae4-a853-f69e701639fd-metrics-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-n2gmk\" (UID: \"5e92a016-c5ae-4ae4-a853-f69e701639fd\") " 
pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-n2gmk" Mar 10 07:03:11 crc kubenswrapper[4825]: I0310 07:03:11.500027 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5e92a016-c5ae-4ae4-a853-f69e701639fd-webhook-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-n2gmk\" (UID: \"5e92a016-c5ae-4ae4-a853-f69e701639fd\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-n2gmk" Mar 10 07:03:11 crc kubenswrapper[4825]: I0310 07:03:11.500445 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e92a016-c5ae-4ae4-a853-f69e701639fd-metrics-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-n2gmk\" (UID: \"5e92a016-c5ae-4ae4-a853-f69e701639fd\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-n2gmk" Mar 10 07:03:11 crc kubenswrapper[4825]: I0310 07:03:11.641350 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-k4s75" Mar 10 07:03:11 crc kubenswrapper[4825]: I0310 07:03:11.647793 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-n2gmk" Mar 10 07:03:12 crc kubenswrapper[4825]: I0310 07:03:12.087249 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-59b6c9788f-n2gmk"] Mar 10 07:03:12 crc kubenswrapper[4825]: W0310 07:03:12.092700 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e92a016_c5ae_4ae4_a853_f69e701639fd.slice/crio-bab4600bc51d84db31c85369767344a50f98499e4412f158961587f6db4a3853 WatchSource:0}: Error finding container bab4600bc51d84db31c85369767344a50f98499e4412f158961587f6db4a3853: Status 404 returned error can't find the container with id bab4600bc51d84db31c85369767344a50f98499e4412f158961587f6db4a3853 Mar 10 07:03:12 crc kubenswrapper[4825]: I0310 07:03:12.683240 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-n2gmk" event={"ID":"5e92a016-c5ae-4ae4-a853-f69e701639fd","Type":"ContainerStarted","Data":"2df4de8b93fc5b9f1c8399fdcbda6854dd000a2e5075a99bb890944f66779070"} Mar 10 07:03:12 crc kubenswrapper[4825]: I0310 07:03:12.683300 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-n2gmk" event={"ID":"5e92a016-c5ae-4ae4-a853-f69e701639fd","Type":"ContainerStarted","Data":"bab4600bc51d84db31c85369767344a50f98499e4412f158961587f6db4a3853"} Mar 10 07:03:12 crc kubenswrapper[4825]: I0310 07:03:12.683397 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-n2gmk" Mar 10 07:03:12 crc kubenswrapper[4825]: I0310 07:03:12.734993 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-n2gmk" 
podStartSLOduration=33.734968777 podStartE2EDuration="33.734968777s" podCreationTimestamp="2026-03-10 07:02:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:03:12.733100158 +0000 UTC m=+1145.762880763" watchObservedRunningTime="2026-03-10 07:03:12.734968777 +0000 UTC m=+1145.764749392" Mar 10 07:03:13 crc kubenswrapper[4825]: I0310 07:03:13.691873 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k2l9r" event={"ID":"41dde8f5-d43b-4cdf-beb8-56e67290e024","Type":"ContainerStarted","Data":"f63204769201f78be52fec0d3c078afa87f54828c183a6c4e1bce43555d137fa"} Mar 10 07:03:13 crc kubenswrapper[4825]: I0310 07:03:13.695433 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-dn5wn" event={"ID":"6667b99e-cca6-4450-b44d-03edafac3e7a","Type":"ContainerStarted","Data":"c0a884f48b288352dc347593e840651b794dcdf6df13d14ab9f0e775b0764cda"} Mar 10 07:03:13 crc kubenswrapper[4825]: I0310 07:03:13.695710 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-dn5wn" Mar 10 07:03:13 crc kubenswrapper[4825]: I0310 07:03:13.697905 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-pnfls" event={"ID":"c0c7b817-9526-41fb-9a08-f2951ef9db20","Type":"ContainerStarted","Data":"13c6a536fd95a06bcb8ea0ec472140112b471b4fba85456f7d50e6c2e4f33899"} Mar 10 07:03:13 crc kubenswrapper[4825]: I0310 07:03:13.698179 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-pnfls" Mar 10 07:03:13 crc kubenswrapper[4825]: I0310 07:03:13.700054 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c75rmqrc" event={"ID":"0db55365-104a-42c2-ba9b-1c084fdd08cf","Type":"ContainerStarted","Data":"789aa94fb21aa42c7eafc033e78499e2d09c17d2f91202dfdcb2203285611d2b"} Mar 10 07:03:13 crc kubenswrapper[4825]: I0310 07:03:13.700286 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c75rmqrc" Mar 10 07:03:13 crc kubenswrapper[4825]: I0310 07:03:13.706776 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-k2l9r" podStartSLOduration=2.375618077 podStartE2EDuration="34.70675591s" podCreationTimestamp="2026-03-10 07:02:39 +0000 UTC" firstStartedPulling="2026-03-10 07:02:40.642702886 +0000 UTC m=+1113.672483501" lastFinishedPulling="2026-03-10 07:03:12.973840699 +0000 UTC m=+1146.003621334" observedRunningTime="2026-03-10 07:03:13.705563029 +0000 UTC m=+1146.735343684" watchObservedRunningTime="2026-03-10 07:03:13.70675591 +0000 UTC m=+1146.736536535" Mar 10 07:03:13 crc kubenswrapper[4825]: I0310 07:03:13.738367 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c75rmqrc" podStartSLOduration=22.77816911 podStartE2EDuration="35.738345513s" podCreationTimestamp="2026-03-10 07:02:38 +0000 UTC" firstStartedPulling="2026-03-10 07:03:00.011478269 +0000 UTC m=+1133.041258884" lastFinishedPulling="2026-03-10 07:03:12.971654672 +0000 UTC m=+1146.001435287" observedRunningTime="2026-03-10 07:03:13.733845746 +0000 UTC m=+1146.763626381" watchObservedRunningTime="2026-03-10 07:03:13.738345513 +0000 UTC m=+1146.768126128" Mar 10 07:03:13 crc kubenswrapper[4825]: I0310 07:03:13.750390 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-dn5wn" podStartSLOduration=3.280394166 podStartE2EDuration="35.750372826s" podCreationTimestamp="2026-03-10 07:02:38 +0000 UTC" firstStartedPulling="2026-03-10 07:02:40.502216136 +0000 UTC m=+1113.531996751" lastFinishedPulling="2026-03-10 07:03:12.972194786 +0000 UTC m=+1146.001975411" observedRunningTime="2026-03-10 07:03:13.749629147 +0000 UTC m=+1146.779409782" watchObservedRunningTime="2026-03-10 07:03:13.750372826 +0000 UTC m=+1146.780153441" Mar 10 07:03:13 crc kubenswrapper[4825]: I0310 07:03:13.768475 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-pnfls" podStartSLOduration=3.136275691 podStartE2EDuration="35.768455147s" podCreationTimestamp="2026-03-10 07:02:38 +0000 UTC" firstStartedPulling="2026-03-10 07:02:40.449516313 +0000 UTC m=+1113.479296918" lastFinishedPulling="2026-03-10 07:03:13.081695759 +0000 UTC m=+1146.111476374" observedRunningTime="2026-03-10 07:03:13.767067921 +0000 UTC m=+1146.796848536" watchObservedRunningTime="2026-03-10 07:03:13.768455147 +0000 UTC m=+1146.798235762" Mar 10 07:03:14 crc kubenswrapper[4825]: I0310 07:03:14.909556 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-b5zsk" Mar 10 07:03:16 crc kubenswrapper[4825]: I0310 07:03:16.888265 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 07:03:16 crc kubenswrapper[4825]: I0310 07:03:16.889454 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 07:03:16 crc kubenswrapper[4825]: I0310 07:03:16.889547 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" Mar 10 07:03:16 crc kubenswrapper[4825]: I0310 07:03:16.890259 4825 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"864f20c49ba566b3322366215d4709b7053290c4a77d8bfd2767699cce8f044b"} pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 07:03:16 crc kubenswrapper[4825]: I0310 07:03:16.890319 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" containerID="cri-o://864f20c49ba566b3322366215d4709b7053290c4a77d8bfd2767699cce8f044b" gracePeriod=600 Mar 10 07:03:17 crc kubenswrapper[4825]: I0310 07:03:17.738713 4825 generic.go:334] "Generic (PLEG): container finished" podID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerID="864f20c49ba566b3322366215d4709b7053290c4a77d8bfd2767699cce8f044b" exitCode=0 Mar 10 07:03:17 crc kubenswrapper[4825]: I0310 07:03:17.738782 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerDied","Data":"864f20c49ba566b3322366215d4709b7053290c4a77d8bfd2767699cce8f044b"} Mar 10 07:03:17 crc kubenswrapper[4825]: I0310 07:03:17.739111 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" 
event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerStarted","Data":"5fc08ef292fcd7ed8a89f718e5e3157a2a220984fb44d362f65a63f59128261b"} Mar 10 07:03:17 crc kubenswrapper[4825]: I0310 07:03:17.739194 4825 scope.go:117] "RemoveContainer" containerID="1b4ca6f9c5b9ae0428c11c821b788c2b1bec41466ba3ff84c82938c6518a2828" Mar 10 07:03:19 crc kubenswrapper[4825]: I0310 07:03:19.127860 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-pnfls" Mar 10 07:03:19 crc kubenswrapper[4825]: I0310 07:03:19.297762 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-dn5wn" Mar 10 07:03:21 crc kubenswrapper[4825]: I0310 07:03:21.655686 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-n2gmk" Mar 10 07:03:25 crc kubenswrapper[4825]: I0310 07:03:25.455825 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c75rmqrc" Mar 10 07:03:41 crc kubenswrapper[4825]: I0310 07:03:41.202167 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-qzmg5"] Mar 10 07:03:41 crc kubenswrapper[4825]: I0310 07:03:41.204262 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-qzmg5" Mar 10 07:03:41 crc kubenswrapper[4825]: I0310 07:03:41.207794 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 10 07:03:41 crc kubenswrapper[4825]: I0310 07:03:41.207924 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 10 07:03:41 crc kubenswrapper[4825]: I0310 07:03:41.207938 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-j99jk" Mar 10 07:03:41 crc kubenswrapper[4825]: I0310 07:03:41.207928 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 10 07:03:41 crc kubenswrapper[4825]: I0310 07:03:41.228205 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-qzmg5"] Mar 10 07:03:41 crc kubenswrapper[4825]: I0310 07:03:41.229507 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rztrw\" (UniqueName: \"kubernetes.io/projected/a364111b-fbb3-4eb4-9acf-14d491d15519-kube-api-access-rztrw\") pod \"dnsmasq-dns-589db6c89c-qzmg5\" (UID: \"a364111b-fbb3-4eb4-9acf-14d491d15519\") " pod="openstack/dnsmasq-dns-589db6c89c-qzmg5" Mar 10 07:03:41 crc kubenswrapper[4825]: I0310 07:03:41.229690 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a364111b-fbb3-4eb4-9acf-14d491d15519-config\") pod \"dnsmasq-dns-589db6c89c-qzmg5\" (UID: \"a364111b-fbb3-4eb4-9acf-14d491d15519\") " pod="openstack/dnsmasq-dns-589db6c89c-qzmg5" Mar 10 07:03:41 crc kubenswrapper[4825]: I0310 07:03:41.304960 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-fnzcj"] Mar 10 07:03:41 crc kubenswrapper[4825]: I0310 07:03:41.312096 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-fnzcj" Mar 10 07:03:41 crc kubenswrapper[4825]: I0310 07:03:41.316543 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 10 07:03:41 crc kubenswrapper[4825]: I0310 07:03:41.320875 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-fnzcj"] Mar 10 07:03:41 crc kubenswrapper[4825]: I0310 07:03:41.332089 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rztrw\" (UniqueName: \"kubernetes.io/projected/a364111b-fbb3-4eb4-9acf-14d491d15519-kube-api-access-rztrw\") pod \"dnsmasq-dns-589db6c89c-qzmg5\" (UID: \"a364111b-fbb3-4eb4-9acf-14d491d15519\") " pod="openstack/dnsmasq-dns-589db6c89c-qzmg5" Mar 10 07:03:41 crc kubenswrapper[4825]: I0310 07:03:41.332215 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrvml\" (UniqueName: \"kubernetes.io/projected/8750bff5-ab3f-4cbe-9dbb-4378b39c93a3-kube-api-access-hrvml\") pod \"dnsmasq-dns-86bbd886cf-fnzcj\" (UID: \"8750bff5-ab3f-4cbe-9dbb-4378b39c93a3\") " pod="openstack/dnsmasq-dns-86bbd886cf-fnzcj" Mar 10 07:03:41 crc kubenswrapper[4825]: I0310 07:03:41.332282 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8750bff5-ab3f-4cbe-9dbb-4378b39c93a3-config\") pod \"dnsmasq-dns-86bbd886cf-fnzcj\" (UID: \"8750bff5-ab3f-4cbe-9dbb-4378b39c93a3\") " pod="openstack/dnsmasq-dns-86bbd886cf-fnzcj" Mar 10 07:03:41 crc kubenswrapper[4825]: I0310 07:03:41.332368 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8750bff5-ab3f-4cbe-9dbb-4378b39c93a3-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-fnzcj\" (UID: \"8750bff5-ab3f-4cbe-9dbb-4378b39c93a3\") " pod="openstack/dnsmasq-dns-86bbd886cf-fnzcj" 
Mar 10 07:03:41 crc kubenswrapper[4825]: I0310 07:03:41.332444 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a364111b-fbb3-4eb4-9acf-14d491d15519-config\") pod \"dnsmasq-dns-589db6c89c-qzmg5\" (UID: \"a364111b-fbb3-4eb4-9acf-14d491d15519\") " pod="openstack/dnsmasq-dns-589db6c89c-qzmg5" Mar 10 07:03:41 crc kubenswrapper[4825]: I0310 07:03:41.334649 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a364111b-fbb3-4eb4-9acf-14d491d15519-config\") pod \"dnsmasq-dns-589db6c89c-qzmg5\" (UID: \"a364111b-fbb3-4eb4-9acf-14d491d15519\") " pod="openstack/dnsmasq-dns-589db6c89c-qzmg5" Mar 10 07:03:41 crc kubenswrapper[4825]: I0310 07:03:41.357771 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rztrw\" (UniqueName: \"kubernetes.io/projected/a364111b-fbb3-4eb4-9acf-14d491d15519-kube-api-access-rztrw\") pod \"dnsmasq-dns-589db6c89c-qzmg5\" (UID: \"a364111b-fbb3-4eb4-9acf-14d491d15519\") " pod="openstack/dnsmasq-dns-589db6c89c-qzmg5" Mar 10 07:03:41 crc kubenswrapper[4825]: I0310 07:03:41.434703 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrvml\" (UniqueName: \"kubernetes.io/projected/8750bff5-ab3f-4cbe-9dbb-4378b39c93a3-kube-api-access-hrvml\") pod \"dnsmasq-dns-86bbd886cf-fnzcj\" (UID: \"8750bff5-ab3f-4cbe-9dbb-4378b39c93a3\") " pod="openstack/dnsmasq-dns-86bbd886cf-fnzcj" Mar 10 07:03:41 crc kubenswrapper[4825]: I0310 07:03:41.434896 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8750bff5-ab3f-4cbe-9dbb-4378b39c93a3-config\") pod \"dnsmasq-dns-86bbd886cf-fnzcj\" (UID: \"8750bff5-ab3f-4cbe-9dbb-4378b39c93a3\") " pod="openstack/dnsmasq-dns-86bbd886cf-fnzcj" Mar 10 07:03:41 crc kubenswrapper[4825]: I0310 07:03:41.434960 4825 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8750bff5-ab3f-4cbe-9dbb-4378b39c93a3-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-fnzcj\" (UID: \"8750bff5-ab3f-4cbe-9dbb-4378b39c93a3\") " pod="openstack/dnsmasq-dns-86bbd886cf-fnzcj" Mar 10 07:03:41 crc kubenswrapper[4825]: I0310 07:03:41.436113 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8750bff5-ab3f-4cbe-9dbb-4378b39c93a3-config\") pod \"dnsmasq-dns-86bbd886cf-fnzcj\" (UID: \"8750bff5-ab3f-4cbe-9dbb-4378b39c93a3\") " pod="openstack/dnsmasq-dns-86bbd886cf-fnzcj" Mar 10 07:03:41 crc kubenswrapper[4825]: I0310 07:03:41.436324 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8750bff5-ab3f-4cbe-9dbb-4378b39c93a3-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-fnzcj\" (UID: \"8750bff5-ab3f-4cbe-9dbb-4378b39c93a3\") " pod="openstack/dnsmasq-dns-86bbd886cf-fnzcj" Mar 10 07:03:41 crc kubenswrapper[4825]: I0310 07:03:41.452578 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrvml\" (UniqueName: \"kubernetes.io/projected/8750bff5-ab3f-4cbe-9dbb-4378b39c93a3-kube-api-access-hrvml\") pod \"dnsmasq-dns-86bbd886cf-fnzcj\" (UID: \"8750bff5-ab3f-4cbe-9dbb-4378b39c93a3\") " pod="openstack/dnsmasq-dns-86bbd886cf-fnzcj" Mar 10 07:03:41 crc kubenswrapper[4825]: I0310 07:03:41.531103 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-qzmg5" Mar 10 07:03:41 crc kubenswrapper[4825]: I0310 07:03:41.629238 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-fnzcj" Mar 10 07:03:41 crc kubenswrapper[4825]: I0310 07:03:41.879242 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-fnzcj"] Mar 10 07:03:41 crc kubenswrapper[4825]: I0310 07:03:41.944837 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-qzmg5"] Mar 10 07:03:41 crc kubenswrapper[4825]: W0310 07:03:41.945598 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda364111b_fbb3_4eb4_9acf_14d491d15519.slice/crio-e0304e221aa975d40ae52bf1b3d4acf750f7a7376ea1b4f036497fc4c040fa3b WatchSource:0}: Error finding container e0304e221aa975d40ae52bf1b3d4acf750f7a7376ea1b4f036497fc4c040fa3b: Status 404 returned error can't find the container with id e0304e221aa975d40ae52bf1b3d4acf750f7a7376ea1b4f036497fc4c040fa3b Mar 10 07:03:41 crc kubenswrapper[4825]: I0310 07:03:41.959791 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589db6c89c-qzmg5" event={"ID":"a364111b-fbb3-4eb4-9acf-14d491d15519","Type":"ContainerStarted","Data":"e0304e221aa975d40ae52bf1b3d4acf750f7a7376ea1b4f036497fc4c040fa3b"} Mar 10 07:03:41 crc kubenswrapper[4825]: I0310 07:03:41.961259 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86bbd886cf-fnzcj" event={"ID":"8750bff5-ab3f-4cbe-9dbb-4378b39c93a3","Type":"ContainerStarted","Data":"7f3bed86760bffd711f4956231a848c1af2683458981237d5a3c3c69d5c777f6"} Mar 10 07:03:43 crc kubenswrapper[4825]: I0310 07:03:43.317583 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-qzmg5"] Mar 10 07:03:43 crc kubenswrapper[4825]: I0310 07:03:43.334079 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79f9fc56ff-gxl2b"] Mar 10 07:03:43 crc kubenswrapper[4825]: I0310 07:03:43.364470 4825 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/dnsmasq-dns-79f9fc56ff-gxl2b"] Mar 10 07:03:43 crc kubenswrapper[4825]: I0310 07:03:43.364657 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79f9fc56ff-gxl2b" Mar 10 07:03:43 crc kubenswrapper[4825]: I0310 07:03:43.471538 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b53862f-ef06-4fc8-9e9b-d409a3eb3671-config\") pod \"dnsmasq-dns-79f9fc56ff-gxl2b\" (UID: \"7b53862f-ef06-4fc8-9e9b-d409a3eb3671\") " pod="openstack/dnsmasq-dns-79f9fc56ff-gxl2b" Mar 10 07:03:43 crc kubenswrapper[4825]: I0310 07:03:43.471605 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b53862f-ef06-4fc8-9e9b-d409a3eb3671-dns-svc\") pod \"dnsmasq-dns-79f9fc56ff-gxl2b\" (UID: \"7b53862f-ef06-4fc8-9e9b-d409a3eb3671\") " pod="openstack/dnsmasq-dns-79f9fc56ff-gxl2b" Mar 10 07:03:43 crc kubenswrapper[4825]: I0310 07:03:43.471664 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5vcp\" (UniqueName: \"kubernetes.io/projected/7b53862f-ef06-4fc8-9e9b-d409a3eb3671-kube-api-access-w5vcp\") pod \"dnsmasq-dns-79f9fc56ff-gxl2b\" (UID: \"7b53862f-ef06-4fc8-9e9b-d409a3eb3671\") " pod="openstack/dnsmasq-dns-79f9fc56ff-gxl2b" Mar 10 07:03:43 crc kubenswrapper[4825]: I0310 07:03:43.572574 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b53862f-ef06-4fc8-9e9b-d409a3eb3671-config\") pod \"dnsmasq-dns-79f9fc56ff-gxl2b\" (UID: \"7b53862f-ef06-4fc8-9e9b-d409a3eb3671\") " pod="openstack/dnsmasq-dns-79f9fc56ff-gxl2b" Mar 10 07:03:43 crc kubenswrapper[4825]: I0310 07:03:43.572634 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/7b53862f-ef06-4fc8-9e9b-d409a3eb3671-dns-svc\") pod \"dnsmasq-dns-79f9fc56ff-gxl2b\" (UID: \"7b53862f-ef06-4fc8-9e9b-d409a3eb3671\") " pod="openstack/dnsmasq-dns-79f9fc56ff-gxl2b" Mar 10 07:03:43 crc kubenswrapper[4825]: I0310 07:03:43.572746 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5vcp\" (UniqueName: \"kubernetes.io/projected/7b53862f-ef06-4fc8-9e9b-d409a3eb3671-kube-api-access-w5vcp\") pod \"dnsmasq-dns-79f9fc56ff-gxl2b\" (UID: \"7b53862f-ef06-4fc8-9e9b-d409a3eb3671\") " pod="openstack/dnsmasq-dns-79f9fc56ff-gxl2b" Mar 10 07:03:43 crc kubenswrapper[4825]: I0310 07:03:43.573765 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b53862f-ef06-4fc8-9e9b-d409a3eb3671-dns-svc\") pod \"dnsmasq-dns-79f9fc56ff-gxl2b\" (UID: \"7b53862f-ef06-4fc8-9e9b-d409a3eb3671\") " pod="openstack/dnsmasq-dns-79f9fc56ff-gxl2b" Mar 10 07:03:43 crc kubenswrapper[4825]: I0310 07:03:43.574531 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b53862f-ef06-4fc8-9e9b-d409a3eb3671-config\") pod \"dnsmasq-dns-79f9fc56ff-gxl2b\" (UID: \"7b53862f-ef06-4fc8-9e9b-d409a3eb3671\") " pod="openstack/dnsmasq-dns-79f9fc56ff-gxl2b" Mar 10 07:03:43 crc kubenswrapper[4825]: I0310 07:03:43.595367 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5vcp\" (UniqueName: \"kubernetes.io/projected/7b53862f-ef06-4fc8-9e9b-d409a3eb3671-kube-api-access-w5vcp\") pod \"dnsmasq-dns-79f9fc56ff-gxl2b\" (UID: \"7b53862f-ef06-4fc8-9e9b-d409a3eb3671\") " pod="openstack/dnsmasq-dns-79f9fc56ff-gxl2b" Mar 10 07:03:43 crc kubenswrapper[4825]: I0310 07:03:43.708703 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79f9fc56ff-gxl2b" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.178122 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-fnzcj"] Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.198026 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-w92qr"] Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.199230 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-w92qr" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.213824 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-w92qr"] Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.222991 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79f9fc56ff-gxl2b"] Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.386359 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9dbf9084-ea14-4c47-8d2e-682719c4f27c-config\") pod \"dnsmasq-dns-7c47bcb9f9-w92qr\" (UID: \"9dbf9084-ea14-4c47-8d2e-682719c4f27c\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-w92qr" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.386435 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9dbf9084-ea14-4c47-8d2e-682719c4f27c-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-w92qr\" (UID: \"9dbf9084-ea14-4c47-8d2e-682719c4f27c\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-w92qr" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.386775 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl6vm\" (UniqueName: \"kubernetes.io/projected/9dbf9084-ea14-4c47-8d2e-682719c4f27c-kube-api-access-xl6vm\") pod 
\"dnsmasq-dns-7c47bcb9f9-w92qr\" (UID: \"9dbf9084-ea14-4c47-8d2e-682719c4f27c\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-w92qr" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.474242 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.475703 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.486215 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.486523 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.486597 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.486734 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.486792 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.486930 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.487866 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-qj5tf" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.488379 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl6vm\" (UniqueName: \"kubernetes.io/projected/9dbf9084-ea14-4c47-8d2e-682719c4f27c-kube-api-access-xl6vm\") pod \"dnsmasq-dns-7c47bcb9f9-w92qr\" (UID: \"9dbf9084-ea14-4c47-8d2e-682719c4f27c\") " 
pod="openstack/dnsmasq-dns-7c47bcb9f9-w92qr" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.488446 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9dbf9084-ea14-4c47-8d2e-682719c4f27c-config\") pod \"dnsmasq-dns-7c47bcb9f9-w92qr\" (UID: \"9dbf9084-ea14-4c47-8d2e-682719c4f27c\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-w92qr" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.488483 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9dbf9084-ea14-4c47-8d2e-682719c4f27c-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-w92qr\" (UID: \"9dbf9084-ea14-4c47-8d2e-682719c4f27c\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-w92qr" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.489602 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9dbf9084-ea14-4c47-8d2e-682719c4f27c-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-w92qr\" (UID: \"9dbf9084-ea14-4c47-8d2e-682719c4f27c\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-w92qr" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.490454 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9dbf9084-ea14-4c47-8d2e-682719c4f27c-config\") pod \"dnsmasq-dns-7c47bcb9f9-w92qr\" (UID: \"9dbf9084-ea14-4c47-8d2e-682719c4f27c\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-w92qr" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.505963 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.530877 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl6vm\" (UniqueName: \"kubernetes.io/projected/9dbf9084-ea14-4c47-8d2e-682719c4f27c-kube-api-access-xl6vm\") pod \"dnsmasq-dns-7c47bcb9f9-w92qr\" (UID: 
\"9dbf9084-ea14-4c47-8d2e-682719c4f27c\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-w92qr" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.542464 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-w92qr" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.590081 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"40efa241-98cc-4dec-9ae8-8a892b367ebc\") " pod="openstack/rabbitmq-server-0" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.590204 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/40efa241-98cc-4dec-9ae8-8a892b367ebc-pod-info\") pod \"rabbitmq-server-0\" (UID: \"40efa241-98cc-4dec-9ae8-8a892b367ebc\") " pod="openstack/rabbitmq-server-0" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.590244 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/40efa241-98cc-4dec-9ae8-8a892b367ebc-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"40efa241-98cc-4dec-9ae8-8a892b367ebc\") " pod="openstack/rabbitmq-server-0" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.590267 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/40efa241-98cc-4dec-9ae8-8a892b367ebc-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"40efa241-98cc-4dec-9ae8-8a892b367ebc\") " pod="openstack/rabbitmq-server-0" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.590306 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/40efa241-98cc-4dec-9ae8-8a892b367ebc-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"40efa241-98cc-4dec-9ae8-8a892b367ebc\") " pod="openstack/rabbitmq-server-0" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.590325 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/40efa241-98cc-4dec-9ae8-8a892b367ebc-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"40efa241-98cc-4dec-9ae8-8a892b367ebc\") " pod="openstack/rabbitmq-server-0" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.590358 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/40efa241-98cc-4dec-9ae8-8a892b367ebc-config-data\") pod \"rabbitmq-server-0\" (UID: \"40efa241-98cc-4dec-9ae8-8a892b367ebc\") " pod="openstack/rabbitmq-server-0" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.590392 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/40efa241-98cc-4dec-9ae8-8a892b367ebc-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"40efa241-98cc-4dec-9ae8-8a892b367ebc\") " pod="openstack/rabbitmq-server-0" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.590420 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2nzn\" (UniqueName: \"kubernetes.io/projected/40efa241-98cc-4dec-9ae8-8a892b367ebc-kube-api-access-p2nzn\") pod \"rabbitmq-server-0\" (UID: \"40efa241-98cc-4dec-9ae8-8a892b367ebc\") " pod="openstack/rabbitmq-server-0" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.590438 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/40efa241-98cc-4dec-9ae8-8a892b367ebc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"40efa241-98cc-4dec-9ae8-8a892b367ebc\") " pod="openstack/rabbitmq-server-0" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.590492 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/40efa241-98cc-4dec-9ae8-8a892b367ebc-server-conf\") pod \"rabbitmq-server-0\" (UID: \"40efa241-98cc-4dec-9ae8-8a892b367ebc\") " pod="openstack/rabbitmq-server-0" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.691836 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/40efa241-98cc-4dec-9ae8-8a892b367ebc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"40efa241-98cc-4dec-9ae8-8a892b367ebc\") " pod="openstack/rabbitmq-server-0" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.692157 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/40efa241-98cc-4dec-9ae8-8a892b367ebc-server-conf\") pod \"rabbitmq-server-0\" (UID: \"40efa241-98cc-4dec-9ae8-8a892b367ebc\") " pod="openstack/rabbitmq-server-0" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.692187 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"40efa241-98cc-4dec-9ae8-8a892b367ebc\") " pod="openstack/rabbitmq-server-0" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.692235 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/40efa241-98cc-4dec-9ae8-8a892b367ebc-pod-info\") pod \"rabbitmq-server-0\" (UID: \"40efa241-98cc-4dec-9ae8-8a892b367ebc\") " 
pod="openstack/rabbitmq-server-0" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.692254 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/40efa241-98cc-4dec-9ae8-8a892b367ebc-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"40efa241-98cc-4dec-9ae8-8a892b367ebc\") " pod="openstack/rabbitmq-server-0" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.692274 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/40efa241-98cc-4dec-9ae8-8a892b367ebc-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"40efa241-98cc-4dec-9ae8-8a892b367ebc\") " pod="openstack/rabbitmq-server-0" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.692294 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/40efa241-98cc-4dec-9ae8-8a892b367ebc-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"40efa241-98cc-4dec-9ae8-8a892b367ebc\") " pod="openstack/rabbitmq-server-0" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.692318 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/40efa241-98cc-4dec-9ae8-8a892b367ebc-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"40efa241-98cc-4dec-9ae8-8a892b367ebc\") " pod="openstack/rabbitmq-server-0" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.692347 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/40efa241-98cc-4dec-9ae8-8a892b367ebc-config-data\") pod \"rabbitmq-server-0\" (UID: \"40efa241-98cc-4dec-9ae8-8a892b367ebc\") " pod="openstack/rabbitmq-server-0" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.692363 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/40efa241-98cc-4dec-9ae8-8a892b367ebc-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"40efa241-98cc-4dec-9ae8-8a892b367ebc\") " pod="openstack/rabbitmq-server-0" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.692386 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2nzn\" (UniqueName: \"kubernetes.io/projected/40efa241-98cc-4dec-9ae8-8a892b367ebc-kube-api-access-p2nzn\") pod \"rabbitmq-server-0\" (UID: \"40efa241-98cc-4dec-9ae8-8a892b367ebc\") " pod="openstack/rabbitmq-server-0" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.693055 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/40efa241-98cc-4dec-9ae8-8a892b367ebc-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"40efa241-98cc-4dec-9ae8-8a892b367ebc\") " pod="openstack/rabbitmq-server-0" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.693480 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"40efa241-98cc-4dec-9ae8-8a892b367ebc\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.693590 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/40efa241-98cc-4dec-9ae8-8a892b367ebc-server-conf\") pod \"rabbitmq-server-0\" (UID: \"40efa241-98cc-4dec-9ae8-8a892b367ebc\") " pod="openstack/rabbitmq-server-0" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.693734 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/40efa241-98cc-4dec-9ae8-8a892b367ebc-plugins-conf\") pod \"rabbitmq-server-0\" (UID: 
\"40efa241-98cc-4dec-9ae8-8a892b367ebc\") " pod="openstack/rabbitmq-server-0" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.694196 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/40efa241-98cc-4dec-9ae8-8a892b367ebc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"40efa241-98cc-4dec-9ae8-8a892b367ebc\") " pod="openstack/rabbitmq-server-0" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.712712 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/40efa241-98cc-4dec-9ae8-8a892b367ebc-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"40efa241-98cc-4dec-9ae8-8a892b367ebc\") " pod="openstack/rabbitmq-server-0" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.713229 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/40efa241-98cc-4dec-9ae8-8a892b367ebc-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"40efa241-98cc-4dec-9ae8-8a892b367ebc\") " pod="openstack/rabbitmq-server-0" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.721715 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/40efa241-98cc-4dec-9ae8-8a892b367ebc-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"40efa241-98cc-4dec-9ae8-8a892b367ebc\") " pod="openstack/rabbitmq-server-0" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.722668 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/40efa241-98cc-4dec-9ae8-8a892b367ebc-pod-info\") pod \"rabbitmq-server-0\" (UID: \"40efa241-98cc-4dec-9ae8-8a892b367ebc\") " pod="openstack/rabbitmq-server-0" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.730885 4825 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-p2nzn\" (UniqueName: \"kubernetes.io/projected/40efa241-98cc-4dec-9ae8-8a892b367ebc-kube-api-access-p2nzn\") pod \"rabbitmq-server-0\" (UID: \"40efa241-98cc-4dec-9ae8-8a892b367ebc\") " pod="openstack/rabbitmq-server-0" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.732874 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/40efa241-98cc-4dec-9ae8-8a892b367ebc-config-data\") pod \"rabbitmq-server-0\" (UID: \"40efa241-98cc-4dec-9ae8-8a892b367ebc\") " pod="openstack/rabbitmq-server-0" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.739967 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"40efa241-98cc-4dec-9ae8-8a892b367ebc\") " pod="openstack/rabbitmq-server-0" Mar 10 07:03:44 crc kubenswrapper[4825]: I0310 07:03:44.818564 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 07:03:45 crc kubenswrapper[4825]: I0310 07:03:45.002102 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79f9fc56ff-gxl2b" event={"ID":"7b53862f-ef06-4fc8-9e9b-d409a3eb3671","Type":"ContainerStarted","Data":"8566d0fe6392bf6233c4618fcc2501036e6a0c4258fbe143075560fe4f1b3529"} Mar 10 07:03:45 crc kubenswrapper[4825]: I0310 07:03:45.064452 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-w92qr"] Mar 10 07:03:45 crc kubenswrapper[4825]: W0310 07:03:45.070211 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9dbf9084_ea14_4c47_8d2e_682719c4f27c.slice/crio-34f5eeb6d64e0edf4b4811dbad76921becdef54e33773de603b151286039dde2 WatchSource:0}: Error finding container 34f5eeb6d64e0edf4b4811dbad76921becdef54e33773de603b151286039dde2: Status 404 returned error can't find the container with id 34f5eeb6d64e0edf4b4811dbad76921becdef54e33773de603b151286039dde2 Mar 10 07:03:45 crc kubenswrapper[4825]: I0310 07:03:45.306260 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 07:03:45 crc kubenswrapper[4825]: I0310 07:03:45.337352 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 07:03:45 crc kubenswrapper[4825]: I0310 07:03:45.339680 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 07:03:45 crc kubenswrapper[4825]: I0310 07:03:45.350415 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 10 07:03:45 crc kubenswrapper[4825]: I0310 07:03:45.350996 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 10 07:03:45 crc kubenswrapper[4825]: I0310 07:03:45.351032 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 10 07:03:45 crc kubenswrapper[4825]: I0310 07:03:45.351395 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 10 07:03:45 crc kubenswrapper[4825]: I0310 07:03:45.351577 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 10 07:03:45 crc kubenswrapper[4825]: I0310 07:03:45.351734 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-4gzwr" Mar 10 07:03:45 crc kubenswrapper[4825]: I0310 07:03:45.352030 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 10 07:03:45 crc kubenswrapper[4825]: I0310 07:03:45.354888 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 07:03:45 crc kubenswrapper[4825]: W0310 07:03:45.363822 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40efa241_98cc_4dec_9ae8_8a892b367ebc.slice/crio-f182dd5eece1a0a99b1b33073d409d5bdc7dbaa561efb7a7b4099bcd91d756b7 WatchSource:0}: Error finding container f182dd5eece1a0a99b1b33073d409d5bdc7dbaa561efb7a7b4099bcd91d756b7: Status 404 returned error can't find the container with id f182dd5eece1a0a99b1b33073d409d5bdc7dbaa561efb7a7b4099bcd91d756b7 Mar 10 07:03:45 
crc kubenswrapper[4825]: I0310 07:03:45.506101 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9ba0e3ee-1309-4411-a927-866b35c2776b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ba0e3ee-1309-4411-a927-866b35c2776b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 07:03:45 crc kubenswrapper[4825]: I0310 07:03:45.506156 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9ba0e3ee-1309-4411-a927-866b35c2776b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ba0e3ee-1309-4411-a927-866b35c2776b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 07:03:45 crc kubenswrapper[4825]: I0310 07:03:45.506198 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7t4t\" (UniqueName: \"kubernetes.io/projected/9ba0e3ee-1309-4411-a927-866b35c2776b-kube-api-access-d7t4t\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ba0e3ee-1309-4411-a927-866b35c2776b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 07:03:45 crc kubenswrapper[4825]: I0310 07:03:45.506231 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ba0e3ee-1309-4411-a927-866b35c2776b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 07:03:45 crc kubenswrapper[4825]: I0310 07:03:45.506246 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9ba0e3ee-1309-4411-a927-866b35c2776b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ba0e3ee-1309-4411-a927-866b35c2776b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 07:03:45 crc 
kubenswrapper[4825]: I0310 07:03:45.506261 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9ba0e3ee-1309-4411-a927-866b35c2776b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ba0e3ee-1309-4411-a927-866b35c2776b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 07:03:45 crc kubenswrapper[4825]: I0310 07:03:45.506295 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9ba0e3ee-1309-4411-a927-866b35c2776b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ba0e3ee-1309-4411-a927-866b35c2776b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 07:03:45 crc kubenswrapper[4825]: I0310 07:03:45.506312 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9ba0e3ee-1309-4411-a927-866b35c2776b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ba0e3ee-1309-4411-a927-866b35c2776b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 07:03:45 crc kubenswrapper[4825]: I0310 07:03:45.506337 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9ba0e3ee-1309-4411-a927-866b35c2776b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ba0e3ee-1309-4411-a927-866b35c2776b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 07:03:45 crc kubenswrapper[4825]: I0310 07:03:45.506358 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9ba0e3ee-1309-4411-a927-866b35c2776b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ba0e3ee-1309-4411-a927-866b35c2776b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 07:03:45 crc 
kubenswrapper[4825]: I0310 07:03:45.506384 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9ba0e3ee-1309-4411-a927-866b35c2776b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ba0e3ee-1309-4411-a927-866b35c2776b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 07:03:45 crc kubenswrapper[4825]: I0310 07:03:45.607988 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9ba0e3ee-1309-4411-a927-866b35c2776b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ba0e3ee-1309-4411-a927-866b35c2776b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 07:03:45 crc kubenswrapper[4825]: I0310 07:03:45.608042 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9ba0e3ee-1309-4411-a927-866b35c2776b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ba0e3ee-1309-4411-a927-866b35c2776b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 07:03:45 crc kubenswrapper[4825]: I0310 07:03:45.608061 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9ba0e3ee-1309-4411-a927-866b35c2776b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ba0e3ee-1309-4411-a927-866b35c2776b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 07:03:45 crc kubenswrapper[4825]: I0310 07:03:45.608099 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7t4t\" (UniqueName: \"kubernetes.io/projected/9ba0e3ee-1309-4411-a927-866b35c2776b-kube-api-access-d7t4t\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ba0e3ee-1309-4411-a927-866b35c2776b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 07:03:45 crc kubenswrapper[4825]: I0310 07:03:45.608127 4825 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ba0e3ee-1309-4411-a927-866b35c2776b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 07:03:45 crc kubenswrapper[4825]: I0310 07:03:45.608155 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9ba0e3ee-1309-4411-a927-866b35c2776b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ba0e3ee-1309-4411-a927-866b35c2776b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 07:03:45 crc kubenswrapper[4825]: I0310 07:03:45.608186 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9ba0e3ee-1309-4411-a927-866b35c2776b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ba0e3ee-1309-4411-a927-866b35c2776b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 07:03:45 crc kubenswrapper[4825]: I0310 07:03:45.608222 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9ba0e3ee-1309-4411-a927-866b35c2776b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ba0e3ee-1309-4411-a927-866b35c2776b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 07:03:45 crc kubenswrapper[4825]: I0310 07:03:45.608238 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9ba0e3ee-1309-4411-a927-866b35c2776b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ba0e3ee-1309-4411-a927-866b35c2776b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 07:03:45 crc kubenswrapper[4825]: I0310 07:03:45.608262 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/9ba0e3ee-1309-4411-a927-866b35c2776b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ba0e3ee-1309-4411-a927-866b35c2776b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 07:03:45 crc kubenswrapper[4825]: I0310 07:03:45.608290 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9ba0e3ee-1309-4411-a927-866b35c2776b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ba0e3ee-1309-4411-a927-866b35c2776b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 07:03:45 crc kubenswrapper[4825]: I0310 07:03:45.609734 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9ba0e3ee-1309-4411-a927-866b35c2776b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ba0e3ee-1309-4411-a927-866b35c2776b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 07:03:45 crc kubenswrapper[4825]: I0310 07:03:45.610020 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9ba0e3ee-1309-4411-a927-866b35c2776b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ba0e3ee-1309-4411-a927-866b35c2776b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 07:03:45 crc kubenswrapper[4825]: I0310 07:03:45.613086 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ba0e3ee-1309-4411-a927-866b35c2776b\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Mar 10 07:03:45 crc kubenswrapper[4825]: I0310 07:03:45.613863 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9ba0e3ee-1309-4411-a927-866b35c2776b-server-conf\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"9ba0e3ee-1309-4411-a927-866b35c2776b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 07:03:45 crc kubenswrapper[4825]: I0310 07:03:45.613917 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9ba0e3ee-1309-4411-a927-866b35c2776b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ba0e3ee-1309-4411-a927-866b35c2776b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 07:03:45 crc kubenswrapper[4825]: I0310 07:03:45.614245 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9ba0e3ee-1309-4411-a927-866b35c2776b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ba0e3ee-1309-4411-a927-866b35c2776b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 07:03:45 crc kubenswrapper[4825]: I0310 07:03:45.616655 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9ba0e3ee-1309-4411-a927-866b35c2776b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ba0e3ee-1309-4411-a927-866b35c2776b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 07:03:45 crc kubenswrapper[4825]: I0310 07:03:45.616727 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9ba0e3ee-1309-4411-a927-866b35c2776b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ba0e3ee-1309-4411-a927-866b35c2776b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 07:03:45 crc kubenswrapper[4825]: I0310 07:03:45.618446 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9ba0e3ee-1309-4411-a927-866b35c2776b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ba0e3ee-1309-4411-a927-866b35c2776b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 07:03:45 crc kubenswrapper[4825]: I0310 
07:03:45.622046 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9ba0e3ee-1309-4411-a927-866b35c2776b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ba0e3ee-1309-4411-a927-866b35c2776b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 07:03:45 crc kubenswrapper[4825]: I0310 07:03:45.644767 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7t4t\" (UniqueName: \"kubernetes.io/projected/9ba0e3ee-1309-4411-a927-866b35c2776b-kube-api-access-d7t4t\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ba0e3ee-1309-4411-a927-866b35c2776b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 07:03:45 crc kubenswrapper[4825]: I0310 07:03:45.652059 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ba0e3ee-1309-4411-a927-866b35c2776b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 07:03:45 crc kubenswrapper[4825]: I0310 07:03:45.681523 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 07:03:46 crc kubenswrapper[4825]: I0310 07:03:46.012175 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-w92qr" event={"ID":"9dbf9084-ea14-4c47-8d2e-682719c4f27c","Type":"ContainerStarted","Data":"34f5eeb6d64e0edf4b4811dbad76921becdef54e33773de603b151286039dde2"} Mar 10 07:03:46 crc kubenswrapper[4825]: I0310 07:03:46.014028 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"40efa241-98cc-4dec-9ae8-8a892b367ebc","Type":"ContainerStarted","Data":"f182dd5eece1a0a99b1b33073d409d5bdc7dbaa561efb7a7b4099bcd91d756b7"} Mar 10 07:03:46 crc kubenswrapper[4825]: I0310 07:03:46.214415 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 07:03:46 crc kubenswrapper[4825]: I0310 07:03:46.545625 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 10 07:03:46 crc kubenswrapper[4825]: I0310 07:03:46.547930 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 10 07:03:46 crc kubenswrapper[4825]: I0310 07:03:46.552817 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 10 07:03:46 crc kubenswrapper[4825]: I0310 07:03:46.552988 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 10 07:03:46 crc kubenswrapper[4825]: I0310 07:03:46.553033 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 10 07:03:46 crc kubenswrapper[4825]: I0310 07:03:46.553408 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-w8jxw" Mar 10 07:03:46 crc kubenswrapper[4825]: I0310 07:03:46.558693 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 10 07:03:46 crc kubenswrapper[4825]: I0310 07:03:46.560275 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 10 07:03:46 crc kubenswrapper[4825]: I0310 07:03:46.620209 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b5b179f-f4fb-479c-9720-25587566c518-operator-scripts\") pod \"openstack-galera-0\" (UID: \"1b5b179f-f4fb-479c-9720-25587566c518\") " pod="openstack/openstack-galera-0" Mar 10 07:03:46 crc kubenswrapper[4825]: I0310 07:03:46.620280 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1b5b179f-f4fb-479c-9720-25587566c518-config-data-generated\") pod \"openstack-galera-0\" (UID: \"1b5b179f-f4fb-479c-9720-25587566c518\") " pod="openstack/openstack-galera-0" Mar 10 07:03:46 crc kubenswrapper[4825]: I0310 07:03:46.620320 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"1b5b179f-f4fb-479c-9720-25587566c518\") " pod="openstack/openstack-galera-0" Mar 10 07:03:46 crc kubenswrapper[4825]: I0310 07:03:46.620425 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b5b179f-f4fb-479c-9720-25587566c518-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"1b5b179f-f4fb-479c-9720-25587566c518\") " pod="openstack/openstack-galera-0" Mar 10 07:03:46 crc kubenswrapper[4825]: I0310 07:03:46.620497 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b5b179f-f4fb-479c-9720-25587566c518-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"1b5b179f-f4fb-479c-9720-25587566c518\") " pod="openstack/openstack-galera-0" Mar 10 07:03:46 crc kubenswrapper[4825]: I0310 07:03:46.620536 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1b5b179f-f4fb-479c-9720-25587566c518-config-data-default\") pod \"openstack-galera-0\" (UID: \"1b5b179f-f4fb-479c-9720-25587566c518\") " pod="openstack/openstack-galera-0" Mar 10 07:03:46 crc kubenswrapper[4825]: I0310 07:03:46.620615 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1b5b179f-f4fb-479c-9720-25587566c518-kolla-config\") pod \"openstack-galera-0\" (UID: \"1b5b179f-f4fb-479c-9720-25587566c518\") " pod="openstack/openstack-galera-0" Mar 10 07:03:46 crc kubenswrapper[4825]: I0310 07:03:46.620653 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqzfd\" (UniqueName: 
\"kubernetes.io/projected/1b5b179f-f4fb-479c-9720-25587566c518-kube-api-access-pqzfd\") pod \"openstack-galera-0\" (UID: \"1b5b179f-f4fb-479c-9720-25587566c518\") " pod="openstack/openstack-galera-0" Mar 10 07:03:46 crc kubenswrapper[4825]: I0310 07:03:46.721729 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b5b179f-f4fb-479c-9720-25587566c518-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"1b5b179f-f4fb-479c-9720-25587566c518\") " pod="openstack/openstack-galera-0" Mar 10 07:03:46 crc kubenswrapper[4825]: I0310 07:03:46.721774 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1b5b179f-f4fb-479c-9720-25587566c518-config-data-default\") pod \"openstack-galera-0\" (UID: \"1b5b179f-f4fb-479c-9720-25587566c518\") " pod="openstack/openstack-galera-0" Mar 10 07:03:46 crc kubenswrapper[4825]: I0310 07:03:46.721794 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1b5b179f-f4fb-479c-9720-25587566c518-kolla-config\") pod \"openstack-galera-0\" (UID: \"1b5b179f-f4fb-479c-9720-25587566c518\") " pod="openstack/openstack-galera-0" Mar 10 07:03:46 crc kubenswrapper[4825]: I0310 07:03:46.721820 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqzfd\" (UniqueName: \"kubernetes.io/projected/1b5b179f-f4fb-479c-9720-25587566c518-kube-api-access-pqzfd\") pod \"openstack-galera-0\" (UID: \"1b5b179f-f4fb-479c-9720-25587566c518\") " pod="openstack/openstack-galera-0" Mar 10 07:03:46 crc kubenswrapper[4825]: I0310 07:03:46.721848 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b5b179f-f4fb-479c-9720-25587566c518-operator-scripts\") pod \"openstack-galera-0\" (UID: 
\"1b5b179f-f4fb-479c-9720-25587566c518\") " pod="openstack/openstack-galera-0" Mar 10 07:03:46 crc kubenswrapper[4825]: I0310 07:03:46.721866 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1b5b179f-f4fb-479c-9720-25587566c518-config-data-generated\") pod \"openstack-galera-0\" (UID: \"1b5b179f-f4fb-479c-9720-25587566c518\") " pod="openstack/openstack-galera-0" Mar 10 07:03:46 crc kubenswrapper[4825]: I0310 07:03:46.721883 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"1b5b179f-f4fb-479c-9720-25587566c518\") " pod="openstack/openstack-galera-0" Mar 10 07:03:46 crc kubenswrapper[4825]: I0310 07:03:46.721922 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b5b179f-f4fb-479c-9720-25587566c518-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"1b5b179f-f4fb-479c-9720-25587566c518\") " pod="openstack/openstack-galera-0" Mar 10 07:03:46 crc kubenswrapper[4825]: I0310 07:03:46.723424 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"1b5b179f-f4fb-479c-9720-25587566c518\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-galera-0" Mar 10 07:03:46 crc kubenswrapper[4825]: I0310 07:03:46.729956 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1b5b179f-f4fb-479c-9720-25587566c518-config-data-default\") pod \"openstack-galera-0\" (UID: \"1b5b179f-f4fb-479c-9720-25587566c518\") " pod="openstack/openstack-galera-0" Mar 10 07:03:46 crc kubenswrapper[4825]: I0310 07:03:46.730109 
4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1b5b179f-f4fb-479c-9720-25587566c518-config-data-generated\") pod \"openstack-galera-0\" (UID: \"1b5b179f-f4fb-479c-9720-25587566c518\") " pod="openstack/openstack-galera-0" Mar 10 07:03:46 crc kubenswrapper[4825]: I0310 07:03:46.731295 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b5b179f-f4fb-479c-9720-25587566c518-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"1b5b179f-f4fb-479c-9720-25587566c518\") " pod="openstack/openstack-galera-0" Mar 10 07:03:46 crc kubenswrapper[4825]: I0310 07:03:46.731551 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1b5b179f-f4fb-479c-9720-25587566c518-kolla-config\") pod \"openstack-galera-0\" (UID: \"1b5b179f-f4fb-479c-9720-25587566c518\") " pod="openstack/openstack-galera-0" Mar 10 07:03:46 crc kubenswrapper[4825]: I0310 07:03:46.732562 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b5b179f-f4fb-479c-9720-25587566c518-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"1b5b179f-f4fb-479c-9720-25587566c518\") " pod="openstack/openstack-galera-0" Mar 10 07:03:46 crc kubenswrapper[4825]: I0310 07:03:46.739773 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b5b179f-f4fb-479c-9720-25587566c518-operator-scripts\") pod \"openstack-galera-0\" (UID: \"1b5b179f-f4fb-479c-9720-25587566c518\") " pod="openstack/openstack-galera-0" Mar 10 07:03:46 crc kubenswrapper[4825]: I0310 07:03:46.742011 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqzfd\" (UniqueName: 
\"kubernetes.io/projected/1b5b179f-f4fb-479c-9720-25587566c518-kube-api-access-pqzfd\") pod \"openstack-galera-0\" (UID: \"1b5b179f-f4fb-479c-9720-25587566c518\") " pod="openstack/openstack-galera-0" Mar 10 07:03:46 crc kubenswrapper[4825]: I0310 07:03:46.765051 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"1b5b179f-f4fb-479c-9720-25587566c518\") " pod="openstack/openstack-galera-0" Mar 10 07:03:46 crc kubenswrapper[4825]: I0310 07:03:46.882036 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 10 07:03:47 crc kubenswrapper[4825]: I0310 07:03:47.996413 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 07:03:47 crc kubenswrapper[4825]: I0310 07:03:47.999005 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.002731 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.002761 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.004466 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.004648 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-dgp86" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.007963 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.045514 4825 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/757635f4-9aaf-48bb-bd50-60398db738d4-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"757635f4-9aaf-48bb-bd50-60398db738d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.045584 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/757635f4-9aaf-48bb-bd50-60398db738d4-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"757635f4-9aaf-48bb-bd50-60398db738d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.045618 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj8r5\" (UniqueName: \"kubernetes.io/projected/757635f4-9aaf-48bb-bd50-60398db738d4-kube-api-access-zj8r5\") pod \"openstack-cell1-galera-0\" (UID: \"757635f4-9aaf-48bb-bd50-60398db738d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.045655 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/757635f4-9aaf-48bb-bd50-60398db738d4-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"757635f4-9aaf-48bb-bd50-60398db738d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.045689 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/757635f4-9aaf-48bb-bd50-60398db738d4-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"757635f4-9aaf-48bb-bd50-60398db738d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 07:03:48 crc 
kubenswrapper[4825]: I0310 07:03:48.045711 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"757635f4-9aaf-48bb-bd50-60398db738d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.045726 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/757635f4-9aaf-48bb-bd50-60398db738d4-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"757635f4-9aaf-48bb-bd50-60398db738d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.045818 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/757635f4-9aaf-48bb-bd50-60398db738d4-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"757635f4-9aaf-48bb-bd50-60398db738d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.147562 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/757635f4-9aaf-48bb-bd50-60398db738d4-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"757635f4-9aaf-48bb-bd50-60398db738d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.147629 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/757635f4-9aaf-48bb-bd50-60398db738d4-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"757635f4-9aaf-48bb-bd50-60398db738d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.147654 
4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zj8r5\" (UniqueName: \"kubernetes.io/projected/757635f4-9aaf-48bb-bd50-60398db738d4-kube-api-access-zj8r5\") pod \"openstack-cell1-galera-0\" (UID: \"757635f4-9aaf-48bb-bd50-60398db738d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.147681 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/757635f4-9aaf-48bb-bd50-60398db738d4-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"757635f4-9aaf-48bb-bd50-60398db738d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.147704 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/757635f4-9aaf-48bb-bd50-60398db738d4-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"757635f4-9aaf-48bb-bd50-60398db738d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.147721 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"757635f4-9aaf-48bb-bd50-60398db738d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.147740 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/757635f4-9aaf-48bb-bd50-60398db738d4-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"757635f4-9aaf-48bb-bd50-60398db738d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.147778 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-default\" (UniqueName: \"kubernetes.io/configmap/757635f4-9aaf-48bb-bd50-60398db738d4-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"757635f4-9aaf-48bb-bd50-60398db738d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.148854 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/757635f4-9aaf-48bb-bd50-60398db738d4-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"757635f4-9aaf-48bb-bd50-60398db738d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.150440 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"757635f4-9aaf-48bb-bd50-60398db738d4\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-cell1-galera-0" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.151610 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/757635f4-9aaf-48bb-bd50-60398db738d4-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"757635f4-9aaf-48bb-bd50-60398db738d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.151629 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/757635f4-9aaf-48bb-bd50-60398db738d4-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"757635f4-9aaf-48bb-bd50-60398db738d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.152607 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/757635f4-9aaf-48bb-bd50-60398db738d4-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"757635f4-9aaf-48bb-bd50-60398db738d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.158101 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/757635f4-9aaf-48bb-bd50-60398db738d4-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"757635f4-9aaf-48bb-bd50-60398db738d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.158194 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/757635f4-9aaf-48bb-bd50-60398db738d4-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"757635f4-9aaf-48bb-bd50-60398db738d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.173906 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj8r5\" (UniqueName: \"kubernetes.io/projected/757635f4-9aaf-48bb-bd50-60398db738d4-kube-api-access-zj8r5\") pod \"openstack-cell1-galera-0\" (UID: \"757635f4-9aaf-48bb-bd50-60398db738d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.207400 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"757635f4-9aaf-48bb-bd50-60398db738d4\") " pod="openstack/openstack-cell1-galera-0" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.336372 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.367634 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.368768 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.378822 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.378854 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.379167 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-rvx6s" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.389009 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.455816 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5d4252ca-f279-4c95-8f10-205339d028a5-kolla-config\") pod \"memcached-0\" (UID: \"5d4252ca-f279-4c95-8f10-205339d028a5\") " pod="openstack/memcached-0" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.456094 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cqq4\" (UniqueName: \"kubernetes.io/projected/5d4252ca-f279-4c95-8f10-205339d028a5-kube-api-access-2cqq4\") pod \"memcached-0\" (UID: \"5d4252ca-f279-4c95-8f10-205339d028a5\") " pod="openstack/memcached-0" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.456390 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5d4252ca-f279-4c95-8f10-205339d028a5-combined-ca-bundle\") pod \"memcached-0\" (UID: \"5d4252ca-f279-4c95-8f10-205339d028a5\") " pod="openstack/memcached-0" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.456463 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d4252ca-f279-4c95-8f10-205339d028a5-config-data\") pod \"memcached-0\" (UID: \"5d4252ca-f279-4c95-8f10-205339d028a5\") " pod="openstack/memcached-0" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.456738 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d4252ca-f279-4c95-8f10-205339d028a5-memcached-tls-certs\") pod \"memcached-0\" (UID: \"5d4252ca-f279-4c95-8f10-205339d028a5\") " pod="openstack/memcached-0" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.558640 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d4252ca-f279-4c95-8f10-205339d028a5-combined-ca-bundle\") pod \"memcached-0\" (UID: \"5d4252ca-f279-4c95-8f10-205339d028a5\") " pod="openstack/memcached-0" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.558698 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d4252ca-f279-4c95-8f10-205339d028a5-config-data\") pod \"memcached-0\" (UID: \"5d4252ca-f279-4c95-8f10-205339d028a5\") " pod="openstack/memcached-0" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.558746 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d4252ca-f279-4c95-8f10-205339d028a5-memcached-tls-certs\") pod \"memcached-0\" (UID: \"5d4252ca-f279-4c95-8f10-205339d028a5\") " pod="openstack/memcached-0" 
Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.558777 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5d4252ca-f279-4c95-8f10-205339d028a5-kolla-config\") pod \"memcached-0\" (UID: \"5d4252ca-f279-4c95-8f10-205339d028a5\") " pod="openstack/memcached-0" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.558824 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cqq4\" (UniqueName: \"kubernetes.io/projected/5d4252ca-f279-4c95-8f10-205339d028a5-kube-api-access-2cqq4\") pod \"memcached-0\" (UID: \"5d4252ca-f279-4c95-8f10-205339d028a5\") " pod="openstack/memcached-0" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.560625 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5d4252ca-f279-4c95-8f10-205339d028a5-kolla-config\") pod \"memcached-0\" (UID: \"5d4252ca-f279-4c95-8f10-205339d028a5\") " pod="openstack/memcached-0" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.560979 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d4252ca-f279-4c95-8f10-205339d028a5-config-data\") pod \"memcached-0\" (UID: \"5d4252ca-f279-4c95-8f10-205339d028a5\") " pod="openstack/memcached-0" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.566525 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d4252ca-f279-4c95-8f10-205339d028a5-combined-ca-bundle\") pod \"memcached-0\" (UID: \"5d4252ca-f279-4c95-8f10-205339d028a5\") " pod="openstack/memcached-0" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.574089 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5d4252ca-f279-4c95-8f10-205339d028a5-memcached-tls-certs\") pod \"memcached-0\" (UID: \"5d4252ca-f279-4c95-8f10-205339d028a5\") " pod="openstack/memcached-0" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.576651 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cqq4\" (UniqueName: \"kubernetes.io/projected/5d4252ca-f279-4c95-8f10-205339d028a5-kube-api-access-2cqq4\") pod \"memcached-0\" (UID: \"5d4252ca-f279-4c95-8f10-205339d028a5\") " pod="openstack/memcached-0" Mar 10 07:03:48 crc kubenswrapper[4825]: I0310 07:03:48.700952 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 10 07:03:50 crc kubenswrapper[4825]: I0310 07:03:50.572616 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 07:03:50 crc kubenswrapper[4825]: I0310 07:03:50.575697 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 07:03:50 crc kubenswrapper[4825]: I0310 07:03:50.579170 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-fnjqt" Mar 10 07:03:50 crc kubenswrapper[4825]: I0310 07:03:50.587636 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 07:03:50 crc kubenswrapper[4825]: I0310 07:03:50.704741 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdrkz\" (UniqueName: \"kubernetes.io/projected/29b6d990-12b7-446f-9cd8-53c1111b1512-kube-api-access-wdrkz\") pod \"kube-state-metrics-0\" (UID: \"29b6d990-12b7-446f-9cd8-53c1111b1512\") " pod="openstack/kube-state-metrics-0" Mar 10 07:03:50 crc kubenswrapper[4825]: I0310 07:03:50.806444 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdrkz\" (UniqueName: 
\"kubernetes.io/projected/29b6d990-12b7-446f-9cd8-53c1111b1512-kube-api-access-wdrkz\") pod \"kube-state-metrics-0\" (UID: \"29b6d990-12b7-446f-9cd8-53c1111b1512\") " pod="openstack/kube-state-metrics-0" Mar 10 07:03:50 crc kubenswrapper[4825]: I0310 07:03:50.836142 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdrkz\" (UniqueName: \"kubernetes.io/projected/29b6d990-12b7-446f-9cd8-53c1111b1512-kube-api-access-wdrkz\") pod \"kube-state-metrics-0\" (UID: \"29b6d990-12b7-446f-9cd8-53c1111b1512\") " pod="openstack/kube-state-metrics-0" Mar 10 07:03:50 crc kubenswrapper[4825]: I0310 07:03:50.900113 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 07:03:51 crc kubenswrapper[4825]: I0310 07:03:51.094804 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9ba0e3ee-1309-4411-a927-866b35c2776b","Type":"ContainerStarted","Data":"21bde652f73dcd32c4af6e16297b56cef02a99aea3f4f774ca6588596babae0f"} Mar 10 07:03:53 crc kubenswrapper[4825]: I0310 07:03:53.916710 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-jctbb"] Mar 10 07:03:53 crc kubenswrapper[4825]: I0310 07:03:53.918268 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-jctbb" Mar 10 07:03:53 crc kubenswrapper[4825]: I0310 07:03:53.921065 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-9mzxj" Mar 10 07:03:53 crc kubenswrapper[4825]: I0310 07:03:53.921487 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 10 07:03:53 crc kubenswrapper[4825]: I0310 07:03:53.924074 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 10 07:03:53 crc kubenswrapper[4825]: I0310 07:03:53.943730 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-jctbb"] Mar 10 07:03:53 crc kubenswrapper[4825]: I0310 07:03:53.971350 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b7468b7-ceaa-44b4-8364-5d3601f43c1b-scripts\") pod \"ovn-controller-jctbb\" (UID: \"2b7468b7-ceaa-44b4-8364-5d3601f43c1b\") " pod="openstack/ovn-controller-jctbb" Mar 10 07:03:53 crc kubenswrapper[4825]: I0310 07:03:53.971418 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b7468b7-ceaa-44b4-8364-5d3601f43c1b-ovn-controller-tls-certs\") pod \"ovn-controller-jctbb\" (UID: \"2b7468b7-ceaa-44b4-8364-5d3601f43c1b\") " pod="openstack/ovn-controller-jctbb" Mar 10 07:03:53 crc kubenswrapper[4825]: I0310 07:03:53.971458 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcwbj\" (UniqueName: \"kubernetes.io/projected/2b7468b7-ceaa-44b4-8364-5d3601f43c1b-kube-api-access-qcwbj\") pod \"ovn-controller-jctbb\" (UID: \"2b7468b7-ceaa-44b4-8364-5d3601f43c1b\") " pod="openstack/ovn-controller-jctbb" Mar 10 07:03:53 crc kubenswrapper[4825]: I0310 
07:03:53.971486 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2b7468b7-ceaa-44b4-8364-5d3601f43c1b-var-run-ovn\") pod \"ovn-controller-jctbb\" (UID: \"2b7468b7-ceaa-44b4-8364-5d3601f43c1b\") " pod="openstack/ovn-controller-jctbb" Mar 10 07:03:53 crc kubenswrapper[4825]: I0310 07:03:53.971551 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2b7468b7-ceaa-44b4-8364-5d3601f43c1b-var-run\") pod \"ovn-controller-jctbb\" (UID: \"2b7468b7-ceaa-44b4-8364-5d3601f43c1b\") " pod="openstack/ovn-controller-jctbb" Mar 10 07:03:53 crc kubenswrapper[4825]: I0310 07:03:53.971576 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2b7468b7-ceaa-44b4-8364-5d3601f43c1b-var-log-ovn\") pod \"ovn-controller-jctbb\" (UID: \"2b7468b7-ceaa-44b4-8364-5d3601f43c1b\") " pod="openstack/ovn-controller-jctbb" Mar 10 07:03:53 crc kubenswrapper[4825]: I0310 07:03:53.971602 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b7468b7-ceaa-44b4-8364-5d3601f43c1b-combined-ca-bundle\") pod \"ovn-controller-jctbb\" (UID: \"2b7468b7-ceaa-44b4-8364-5d3601f43c1b\") " pod="openstack/ovn-controller-jctbb" Mar 10 07:03:54 crc kubenswrapper[4825]: I0310 07:03:54.017920 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-8x4pm"] Mar 10 07:03:54 crc kubenswrapper[4825]: I0310 07:03:54.019522 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-8x4pm" Mar 10 07:03:54 crc kubenswrapper[4825]: I0310 07:03:54.030888 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-8x4pm"] Mar 10 07:03:54 crc kubenswrapper[4825]: I0310 07:03:54.073479 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f04e11a7-e387-4e51-b878-8633ca528b1a-var-run\") pod \"ovn-controller-ovs-8x4pm\" (UID: \"f04e11a7-e387-4e51-b878-8633ca528b1a\") " pod="openstack/ovn-controller-ovs-8x4pm" Mar 10 07:03:54 crc kubenswrapper[4825]: I0310 07:03:54.073554 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f04e11a7-e387-4e51-b878-8633ca528b1a-var-lib\") pod \"ovn-controller-ovs-8x4pm\" (UID: \"f04e11a7-e387-4e51-b878-8633ca528b1a\") " pod="openstack/ovn-controller-ovs-8x4pm" Mar 10 07:03:54 crc kubenswrapper[4825]: I0310 07:03:54.073649 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f04e11a7-e387-4e51-b878-8633ca528b1a-etc-ovs\") pod \"ovn-controller-ovs-8x4pm\" (UID: \"f04e11a7-e387-4e51-b878-8633ca528b1a\") " pod="openstack/ovn-controller-ovs-8x4pm" Mar 10 07:03:54 crc kubenswrapper[4825]: I0310 07:03:54.073750 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b7468b7-ceaa-44b4-8364-5d3601f43c1b-scripts\") pod \"ovn-controller-jctbb\" (UID: \"2b7468b7-ceaa-44b4-8364-5d3601f43c1b\") " pod="openstack/ovn-controller-jctbb" Mar 10 07:03:54 crc kubenswrapper[4825]: I0310 07:03:54.073794 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2b7468b7-ceaa-44b4-8364-5d3601f43c1b-ovn-controller-tls-certs\") pod \"ovn-controller-jctbb\" (UID: \"2b7468b7-ceaa-44b4-8364-5d3601f43c1b\") " pod="openstack/ovn-controller-jctbb" Mar 10 07:03:54 crc kubenswrapper[4825]: I0310 07:03:54.073835 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmnpc\" (UniqueName: \"kubernetes.io/projected/f04e11a7-e387-4e51-b878-8633ca528b1a-kube-api-access-bmnpc\") pod \"ovn-controller-ovs-8x4pm\" (UID: \"f04e11a7-e387-4e51-b878-8633ca528b1a\") " pod="openstack/ovn-controller-ovs-8x4pm" Mar 10 07:03:54 crc kubenswrapper[4825]: I0310 07:03:54.073869 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcwbj\" (UniqueName: \"kubernetes.io/projected/2b7468b7-ceaa-44b4-8364-5d3601f43c1b-kube-api-access-qcwbj\") pod \"ovn-controller-jctbb\" (UID: \"2b7468b7-ceaa-44b4-8364-5d3601f43c1b\") " pod="openstack/ovn-controller-jctbb" Mar 10 07:03:54 crc kubenswrapper[4825]: I0310 07:03:54.073900 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2b7468b7-ceaa-44b4-8364-5d3601f43c1b-var-run-ovn\") pod \"ovn-controller-jctbb\" (UID: \"2b7468b7-ceaa-44b4-8364-5d3601f43c1b\") " pod="openstack/ovn-controller-jctbb" Mar 10 07:03:54 crc kubenswrapper[4825]: I0310 07:03:54.073932 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f04e11a7-e387-4e51-b878-8633ca528b1a-var-log\") pod \"ovn-controller-ovs-8x4pm\" (UID: \"f04e11a7-e387-4e51-b878-8633ca528b1a\") " pod="openstack/ovn-controller-ovs-8x4pm" Mar 10 07:03:54 crc kubenswrapper[4825]: I0310 07:03:54.073979 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/f04e11a7-e387-4e51-b878-8633ca528b1a-scripts\") pod \"ovn-controller-ovs-8x4pm\" (UID: \"f04e11a7-e387-4e51-b878-8633ca528b1a\") " pod="openstack/ovn-controller-ovs-8x4pm" Mar 10 07:03:54 crc kubenswrapper[4825]: I0310 07:03:54.074013 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2b7468b7-ceaa-44b4-8364-5d3601f43c1b-var-run\") pod \"ovn-controller-jctbb\" (UID: \"2b7468b7-ceaa-44b4-8364-5d3601f43c1b\") " pod="openstack/ovn-controller-jctbb" Mar 10 07:03:54 crc kubenswrapper[4825]: I0310 07:03:54.074054 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2b7468b7-ceaa-44b4-8364-5d3601f43c1b-var-log-ovn\") pod \"ovn-controller-jctbb\" (UID: \"2b7468b7-ceaa-44b4-8364-5d3601f43c1b\") " pod="openstack/ovn-controller-jctbb" Mar 10 07:03:54 crc kubenswrapper[4825]: I0310 07:03:54.074088 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b7468b7-ceaa-44b4-8364-5d3601f43c1b-combined-ca-bundle\") pod \"ovn-controller-jctbb\" (UID: \"2b7468b7-ceaa-44b4-8364-5d3601f43c1b\") " pod="openstack/ovn-controller-jctbb" Mar 10 07:03:54 crc kubenswrapper[4825]: I0310 07:03:54.077424 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2b7468b7-ceaa-44b4-8364-5d3601f43c1b-var-run-ovn\") pod \"ovn-controller-jctbb\" (UID: \"2b7468b7-ceaa-44b4-8364-5d3601f43c1b\") " pod="openstack/ovn-controller-jctbb" Mar 10 07:03:54 crc kubenswrapper[4825]: I0310 07:03:54.077633 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2b7468b7-ceaa-44b4-8364-5d3601f43c1b-var-log-ovn\") pod \"ovn-controller-jctbb\" (UID: \"2b7468b7-ceaa-44b4-8364-5d3601f43c1b\") " 
pod="openstack/ovn-controller-jctbb" Mar 10 07:03:54 crc kubenswrapper[4825]: I0310 07:03:54.077546 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2b7468b7-ceaa-44b4-8364-5d3601f43c1b-var-run\") pod \"ovn-controller-jctbb\" (UID: \"2b7468b7-ceaa-44b4-8364-5d3601f43c1b\") " pod="openstack/ovn-controller-jctbb" Mar 10 07:03:54 crc kubenswrapper[4825]: I0310 07:03:54.080405 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b7468b7-ceaa-44b4-8364-5d3601f43c1b-scripts\") pod \"ovn-controller-jctbb\" (UID: \"2b7468b7-ceaa-44b4-8364-5d3601f43c1b\") " pod="openstack/ovn-controller-jctbb" Mar 10 07:03:54 crc kubenswrapper[4825]: I0310 07:03:54.087011 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b7468b7-ceaa-44b4-8364-5d3601f43c1b-ovn-controller-tls-certs\") pod \"ovn-controller-jctbb\" (UID: \"2b7468b7-ceaa-44b4-8364-5d3601f43c1b\") " pod="openstack/ovn-controller-jctbb" Mar 10 07:03:54 crc kubenswrapper[4825]: I0310 07:03:54.091070 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b7468b7-ceaa-44b4-8364-5d3601f43c1b-combined-ca-bundle\") pod \"ovn-controller-jctbb\" (UID: \"2b7468b7-ceaa-44b4-8364-5d3601f43c1b\") " pod="openstack/ovn-controller-jctbb" Mar 10 07:03:54 crc kubenswrapper[4825]: I0310 07:03:54.099393 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcwbj\" (UniqueName: \"kubernetes.io/projected/2b7468b7-ceaa-44b4-8364-5d3601f43c1b-kube-api-access-qcwbj\") pod \"ovn-controller-jctbb\" (UID: \"2b7468b7-ceaa-44b4-8364-5d3601f43c1b\") " pod="openstack/ovn-controller-jctbb" Mar 10 07:03:54 crc kubenswrapper[4825]: I0310 07:03:54.176172 4825 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f04e11a7-e387-4e51-b878-8633ca528b1a-etc-ovs\") pod \"ovn-controller-ovs-8x4pm\" (UID: \"f04e11a7-e387-4e51-b878-8633ca528b1a\") " pod="openstack/ovn-controller-ovs-8x4pm" Mar 10 07:03:54 crc kubenswrapper[4825]: I0310 07:03:54.176273 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmnpc\" (UniqueName: \"kubernetes.io/projected/f04e11a7-e387-4e51-b878-8633ca528b1a-kube-api-access-bmnpc\") pod \"ovn-controller-ovs-8x4pm\" (UID: \"f04e11a7-e387-4e51-b878-8633ca528b1a\") " pod="openstack/ovn-controller-ovs-8x4pm" Mar 10 07:03:54 crc kubenswrapper[4825]: I0310 07:03:54.176321 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f04e11a7-e387-4e51-b878-8633ca528b1a-var-log\") pod \"ovn-controller-ovs-8x4pm\" (UID: \"f04e11a7-e387-4e51-b878-8633ca528b1a\") " pod="openstack/ovn-controller-ovs-8x4pm" Mar 10 07:03:54 crc kubenswrapper[4825]: I0310 07:03:54.176375 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f04e11a7-e387-4e51-b878-8633ca528b1a-scripts\") pod \"ovn-controller-ovs-8x4pm\" (UID: \"f04e11a7-e387-4e51-b878-8633ca528b1a\") " pod="openstack/ovn-controller-ovs-8x4pm" Mar 10 07:03:54 crc kubenswrapper[4825]: I0310 07:03:54.176463 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f04e11a7-e387-4e51-b878-8633ca528b1a-var-run\") pod \"ovn-controller-ovs-8x4pm\" (UID: \"f04e11a7-e387-4e51-b878-8633ca528b1a\") " pod="openstack/ovn-controller-ovs-8x4pm" Mar 10 07:03:54 crc kubenswrapper[4825]: I0310 07:03:54.176485 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f04e11a7-e387-4e51-b878-8633ca528b1a-var-lib\") pod 
\"ovn-controller-ovs-8x4pm\" (UID: \"f04e11a7-e387-4e51-b878-8633ca528b1a\") " pod="openstack/ovn-controller-ovs-8x4pm" Mar 10 07:03:54 crc kubenswrapper[4825]: I0310 07:03:54.176732 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f04e11a7-e387-4e51-b878-8633ca528b1a-var-lib\") pod \"ovn-controller-ovs-8x4pm\" (UID: \"f04e11a7-e387-4e51-b878-8633ca528b1a\") " pod="openstack/ovn-controller-ovs-8x4pm" Mar 10 07:03:54 crc kubenswrapper[4825]: I0310 07:03:54.176904 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f04e11a7-e387-4e51-b878-8633ca528b1a-etc-ovs\") pod \"ovn-controller-ovs-8x4pm\" (UID: \"f04e11a7-e387-4e51-b878-8633ca528b1a\") " pod="openstack/ovn-controller-ovs-8x4pm" Mar 10 07:03:54 crc kubenswrapper[4825]: I0310 07:03:54.177360 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f04e11a7-e387-4e51-b878-8633ca528b1a-var-log\") pod \"ovn-controller-ovs-8x4pm\" (UID: \"f04e11a7-e387-4e51-b878-8633ca528b1a\") " pod="openstack/ovn-controller-ovs-8x4pm" Mar 10 07:03:54 crc kubenswrapper[4825]: I0310 07:03:54.178055 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f04e11a7-e387-4e51-b878-8633ca528b1a-var-run\") pod \"ovn-controller-ovs-8x4pm\" (UID: \"f04e11a7-e387-4e51-b878-8633ca528b1a\") " pod="openstack/ovn-controller-ovs-8x4pm" Mar 10 07:03:54 crc kubenswrapper[4825]: I0310 07:03:54.179624 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f04e11a7-e387-4e51-b878-8633ca528b1a-scripts\") pod \"ovn-controller-ovs-8x4pm\" (UID: \"f04e11a7-e387-4e51-b878-8633ca528b1a\") " pod="openstack/ovn-controller-ovs-8x4pm" Mar 10 07:03:54 crc kubenswrapper[4825]: I0310 07:03:54.193602 4825 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmnpc\" (UniqueName: \"kubernetes.io/projected/f04e11a7-e387-4e51-b878-8633ca528b1a-kube-api-access-bmnpc\") pod \"ovn-controller-ovs-8x4pm\" (UID: \"f04e11a7-e387-4e51-b878-8633ca528b1a\") " pod="openstack/ovn-controller-ovs-8x4pm" Mar 10 07:03:54 crc kubenswrapper[4825]: I0310 07:03:54.247122 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-jctbb" Mar 10 07:03:54 crc kubenswrapper[4825]: I0310 07:03:54.338552 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-8x4pm" Mar 10 07:03:56 crc kubenswrapper[4825]: I0310 07:03:56.821868 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 10 07:03:56 crc kubenswrapper[4825]: I0310 07:03:56.824723 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 10 07:03:56 crc kubenswrapper[4825]: I0310 07:03:56.827994 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 10 07:03:56 crc kubenswrapper[4825]: I0310 07:03:56.828346 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-2z426" Mar 10 07:03:56 crc kubenswrapper[4825]: I0310 07:03:56.828343 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 10 07:03:56 crc kubenswrapper[4825]: I0310 07:03:56.828544 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 10 07:03:56 crc kubenswrapper[4825]: I0310 07:03:56.828816 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 10 07:03:56 crc kubenswrapper[4825]: I0310 07:03:56.847033 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovsdbserver-sb-0"] Mar 10 07:03:56 crc kubenswrapper[4825]: I0310 07:03:56.935690 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e6301c97-5fb5-4f12-9181-ae937aa01b33-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e6301c97-5fb5-4f12-9181-ae937aa01b33\") " pod="openstack/ovsdbserver-sb-0" Mar 10 07:03:56 crc kubenswrapper[4825]: I0310 07:03:56.935774 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6301c97-5fb5-4f12-9181-ae937aa01b33-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e6301c97-5fb5-4f12-9181-ae937aa01b33\") " pod="openstack/ovsdbserver-sb-0" Mar 10 07:03:56 crc kubenswrapper[4825]: I0310 07:03:56.935811 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whptl\" (UniqueName: \"kubernetes.io/projected/e6301c97-5fb5-4f12-9181-ae937aa01b33-kube-api-access-whptl\") pod \"ovsdbserver-sb-0\" (UID: \"e6301c97-5fb5-4f12-9181-ae937aa01b33\") " pod="openstack/ovsdbserver-sb-0" Mar 10 07:03:56 crc kubenswrapper[4825]: I0310 07:03:56.935851 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6301c97-5fb5-4f12-9181-ae937aa01b33-config\") pod \"ovsdbserver-sb-0\" (UID: \"e6301c97-5fb5-4f12-9181-ae937aa01b33\") " pod="openstack/ovsdbserver-sb-0" Mar 10 07:03:56 crc kubenswrapper[4825]: I0310 07:03:56.935885 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6301c97-5fb5-4f12-9181-ae937aa01b33-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e6301c97-5fb5-4f12-9181-ae937aa01b33\") " pod="openstack/ovsdbserver-sb-0" Mar 10 07:03:56 crc 
kubenswrapper[4825]: I0310 07:03:56.935913 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e6301c97-5fb5-4f12-9181-ae937aa01b33\") " pod="openstack/ovsdbserver-sb-0" Mar 10 07:03:56 crc kubenswrapper[4825]: I0310 07:03:56.936260 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e6301c97-5fb5-4f12-9181-ae937aa01b33-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e6301c97-5fb5-4f12-9181-ae937aa01b33\") " pod="openstack/ovsdbserver-sb-0" Mar 10 07:03:56 crc kubenswrapper[4825]: I0310 07:03:56.936395 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6301c97-5fb5-4f12-9181-ae937aa01b33-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e6301c97-5fb5-4f12-9181-ae937aa01b33\") " pod="openstack/ovsdbserver-sb-0" Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.038021 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e6301c97-5fb5-4f12-9181-ae937aa01b33-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e6301c97-5fb5-4f12-9181-ae937aa01b33\") " pod="openstack/ovsdbserver-sb-0" Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.038086 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6301c97-5fb5-4f12-9181-ae937aa01b33-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e6301c97-5fb5-4f12-9181-ae937aa01b33\") " pod="openstack/ovsdbserver-sb-0" Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.038110 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-whptl\" (UniqueName: \"kubernetes.io/projected/e6301c97-5fb5-4f12-9181-ae937aa01b33-kube-api-access-whptl\") pod \"ovsdbserver-sb-0\" (UID: \"e6301c97-5fb5-4f12-9181-ae937aa01b33\") " pod="openstack/ovsdbserver-sb-0" Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.038152 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6301c97-5fb5-4f12-9181-ae937aa01b33-config\") pod \"ovsdbserver-sb-0\" (UID: \"e6301c97-5fb5-4f12-9181-ae937aa01b33\") " pod="openstack/ovsdbserver-sb-0" Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.038177 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6301c97-5fb5-4f12-9181-ae937aa01b33-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e6301c97-5fb5-4f12-9181-ae937aa01b33\") " pod="openstack/ovsdbserver-sb-0" Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.038198 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e6301c97-5fb5-4f12-9181-ae937aa01b33\") " pod="openstack/ovsdbserver-sb-0" Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.038220 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e6301c97-5fb5-4f12-9181-ae937aa01b33-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e6301c97-5fb5-4f12-9181-ae937aa01b33\") " pod="openstack/ovsdbserver-sb-0" Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.038248 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6301c97-5fb5-4f12-9181-ae937aa01b33-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: 
\"e6301c97-5fb5-4f12-9181-ae937aa01b33\") " pod="openstack/ovsdbserver-sb-0" Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.039686 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e6301c97-5fb5-4f12-9181-ae937aa01b33\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/ovsdbserver-sb-0" Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.040729 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e6301c97-5fb5-4f12-9181-ae937aa01b33-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e6301c97-5fb5-4f12-9181-ae937aa01b33\") " pod="openstack/ovsdbserver-sb-0" Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.040965 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6301c97-5fb5-4f12-9181-ae937aa01b33-config\") pod \"ovsdbserver-sb-0\" (UID: \"e6301c97-5fb5-4f12-9181-ae937aa01b33\") " pod="openstack/ovsdbserver-sb-0" Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.043285 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e6301c97-5fb5-4f12-9181-ae937aa01b33-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e6301c97-5fb5-4f12-9181-ae937aa01b33\") " pod="openstack/ovsdbserver-sb-0" Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.045812 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6301c97-5fb5-4f12-9181-ae937aa01b33-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e6301c97-5fb5-4f12-9181-ae937aa01b33\") " pod="openstack/ovsdbserver-sb-0" Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.049973 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6301c97-5fb5-4f12-9181-ae937aa01b33-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e6301c97-5fb5-4f12-9181-ae937aa01b33\") " pod="openstack/ovsdbserver-sb-0" Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.052709 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6301c97-5fb5-4f12-9181-ae937aa01b33-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e6301c97-5fb5-4f12-9181-ae937aa01b33\") " pod="openstack/ovsdbserver-sb-0" Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.061623 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whptl\" (UniqueName: \"kubernetes.io/projected/e6301c97-5fb5-4f12-9181-ae937aa01b33-kube-api-access-whptl\") pod \"ovsdbserver-sb-0\" (UID: \"e6301c97-5fb5-4f12-9181-ae937aa01b33\") " pod="openstack/ovsdbserver-sb-0" Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.070917 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e6301c97-5fb5-4f12-9181-ae937aa01b33\") " pod="openstack/ovsdbserver-sb-0" Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.161342 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.648777 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.650475 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.653048 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.653295 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.653407 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-ss7h8" Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.655603 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.666345 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.849499 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d0be589-d069-4183-b9f6-bde715ad716b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4d0be589-d069-4183-b9f6-bde715ad716b\") " pod="openstack/ovsdbserver-nb-0" Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.849598 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"4d0be589-d069-4183-b9f6-bde715ad716b\") " pod="openstack/ovsdbserver-nb-0" Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.849757 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d0be589-d069-4183-b9f6-bde715ad716b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: 
\"4d0be589-d069-4183-b9f6-bde715ad716b\") " pod="openstack/ovsdbserver-nb-0" Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.849986 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d0be589-d069-4183-b9f6-bde715ad716b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4d0be589-d069-4183-b9f6-bde715ad716b\") " pod="openstack/ovsdbserver-nb-0" Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.850296 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtbvq\" (UniqueName: \"kubernetes.io/projected/4d0be589-d069-4183-b9f6-bde715ad716b-kube-api-access-dtbvq\") pod \"ovsdbserver-nb-0\" (UID: \"4d0be589-d069-4183-b9f6-bde715ad716b\") " pod="openstack/ovsdbserver-nb-0" Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.850402 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4d0be589-d069-4183-b9f6-bde715ad716b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"4d0be589-d069-4183-b9f6-bde715ad716b\") " pod="openstack/ovsdbserver-nb-0" Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.850637 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4d0be589-d069-4183-b9f6-bde715ad716b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"4d0be589-d069-4183-b9f6-bde715ad716b\") " pod="openstack/ovsdbserver-nb-0" Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.851986 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d0be589-d069-4183-b9f6-bde715ad716b-config\") pod \"ovsdbserver-nb-0\" (UID: \"4d0be589-d069-4183-b9f6-bde715ad716b\") " pod="openstack/ovsdbserver-nb-0" Mar 10 
07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.953627 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"4d0be589-d069-4183-b9f6-bde715ad716b\") " pod="openstack/ovsdbserver-nb-0" Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.953689 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d0be589-d069-4183-b9f6-bde715ad716b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"4d0be589-d069-4183-b9f6-bde715ad716b\") " pod="openstack/ovsdbserver-nb-0" Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.953741 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d0be589-d069-4183-b9f6-bde715ad716b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4d0be589-d069-4183-b9f6-bde715ad716b\") " pod="openstack/ovsdbserver-nb-0" Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.953790 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtbvq\" (UniqueName: \"kubernetes.io/projected/4d0be589-d069-4183-b9f6-bde715ad716b-kube-api-access-dtbvq\") pod \"ovsdbserver-nb-0\" (UID: \"4d0be589-d069-4183-b9f6-bde715ad716b\") " pod="openstack/ovsdbserver-nb-0" Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.953834 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4d0be589-d069-4183-b9f6-bde715ad716b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"4d0be589-d069-4183-b9f6-bde715ad716b\") " pod="openstack/ovsdbserver-nb-0" Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.953884 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/4d0be589-d069-4183-b9f6-bde715ad716b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"4d0be589-d069-4183-b9f6-bde715ad716b\") " pod="openstack/ovsdbserver-nb-0" Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.953926 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d0be589-d069-4183-b9f6-bde715ad716b-config\") pod \"ovsdbserver-nb-0\" (UID: \"4d0be589-d069-4183-b9f6-bde715ad716b\") " pod="openstack/ovsdbserver-nb-0" Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.953984 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d0be589-d069-4183-b9f6-bde715ad716b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4d0be589-d069-4183-b9f6-bde715ad716b\") " pod="openstack/ovsdbserver-nb-0" Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.953986 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"4d0be589-d069-4183-b9f6-bde715ad716b\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-nb-0" Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.955947 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d0be589-d069-4183-b9f6-bde715ad716b-config\") pod \"ovsdbserver-nb-0\" (UID: \"4d0be589-d069-4183-b9f6-bde715ad716b\") " pod="openstack/ovsdbserver-nb-0" Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.958758 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d0be589-d069-4183-b9f6-bde715ad716b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4d0be589-d069-4183-b9f6-bde715ad716b\") " 
pod="openstack/ovsdbserver-nb-0" Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.959451 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4d0be589-d069-4183-b9f6-bde715ad716b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"4d0be589-d069-4183-b9f6-bde715ad716b\") " pod="openstack/ovsdbserver-nb-0" Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.959973 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d0be589-d069-4183-b9f6-bde715ad716b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"4d0be589-d069-4183-b9f6-bde715ad716b\") " pod="openstack/ovsdbserver-nb-0" Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.960622 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4d0be589-d069-4183-b9f6-bde715ad716b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"4d0be589-d069-4183-b9f6-bde715ad716b\") " pod="openstack/ovsdbserver-nb-0" Mar 10 07:03:57 crc kubenswrapper[4825]: I0310 07:03:57.972587 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d0be589-d069-4183-b9f6-bde715ad716b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4d0be589-d069-4183-b9f6-bde715ad716b\") " pod="openstack/ovsdbserver-nb-0" Mar 10 07:03:58 crc kubenswrapper[4825]: I0310 07:03:58.001159 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtbvq\" (UniqueName: \"kubernetes.io/projected/4d0be589-d069-4183-b9f6-bde715ad716b-kube-api-access-dtbvq\") pod \"ovsdbserver-nb-0\" (UID: \"4d0be589-d069-4183-b9f6-bde715ad716b\") " pod="openstack/ovsdbserver-nb-0" Mar 10 07:03:58 crc kubenswrapper[4825]: I0310 07:03:58.002200 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"4d0be589-d069-4183-b9f6-bde715ad716b\") " pod="openstack/ovsdbserver-nb-0" Mar 10 07:03:58 crc kubenswrapper[4825]: I0310 07:03:58.283986 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 10 07:04:00 crc kubenswrapper[4825]: I0310 07:04:00.136992 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552104-z7jkb"] Mar 10 07:04:00 crc kubenswrapper[4825]: I0310 07:04:00.139674 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552104-z7jkb" Mar 10 07:04:00 crc kubenswrapper[4825]: I0310 07:04:00.143555 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 07:04:00 crc kubenswrapper[4825]: I0310 07:04:00.146829 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 07:04:00 crc kubenswrapper[4825]: I0310 07:04:00.147463 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 07:04:00 crc kubenswrapper[4825]: I0310 07:04:00.160366 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552104-z7jkb"] Mar 10 07:04:00 crc kubenswrapper[4825]: I0310 07:04:00.298513 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6s7z\" (UniqueName: \"kubernetes.io/projected/2f5ca3f7-1d76-4749-a5c9-1063d3d44886-kube-api-access-c6s7z\") pod \"auto-csr-approver-29552104-z7jkb\" (UID: \"2f5ca3f7-1d76-4749-a5c9-1063d3d44886\") " pod="openshift-infra/auto-csr-approver-29552104-z7jkb" Mar 10 07:04:00 crc kubenswrapper[4825]: I0310 07:04:00.400848 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-c6s7z\" (UniqueName: \"kubernetes.io/projected/2f5ca3f7-1d76-4749-a5c9-1063d3d44886-kube-api-access-c6s7z\") pod \"auto-csr-approver-29552104-z7jkb\" (UID: \"2f5ca3f7-1d76-4749-a5c9-1063d3d44886\") " pod="openshift-infra/auto-csr-approver-29552104-z7jkb" Mar 10 07:04:00 crc kubenswrapper[4825]: I0310 07:04:00.422501 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6s7z\" (UniqueName: \"kubernetes.io/projected/2f5ca3f7-1d76-4749-a5c9-1063d3d44886-kube-api-access-c6s7z\") pod \"auto-csr-approver-29552104-z7jkb\" (UID: \"2f5ca3f7-1d76-4749-a5c9-1063d3d44886\") " pod="openshift-infra/auto-csr-approver-29552104-z7jkb" Mar 10 07:04:00 crc kubenswrapper[4825]: I0310 07:04:00.471183 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552104-z7jkb" Mar 10 07:04:01 crc kubenswrapper[4825]: I0310 07:04:01.436114 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 07:04:01 crc kubenswrapper[4825]: I0310 07:04:01.451983 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 10 07:04:01 crc kubenswrapper[4825]: I0310 07:04:01.469926 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-jctbb"] Mar 10 07:04:01 crc kubenswrapper[4825]: I0310 07:04:01.591743 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 10 07:04:01 crc kubenswrapper[4825]: I0310 07:04:01.745647 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 07:04:01 crc kubenswrapper[4825]: I0310 07:04:01.767780 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552104-z7jkb"] Mar 10 07:04:01 crc kubenswrapper[4825]: I0310 07:04:01.793846 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 10 07:04:01 crc 
kubenswrapper[4825]: W0310 07:04:01.862489 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f5ca3f7_1d76_4749_a5c9_1063d3d44886.slice/crio-c6a4bc10acce355af606ccc9123ca390115d0d0755791776631ec89e993df16c WatchSource:0}: Error finding container c6a4bc10acce355af606ccc9123ca390115d0d0755791776631ec89e993df16c: Status 404 returned error can't find the container with id c6a4bc10acce355af606ccc9123ca390115d0d0755791776631ec89e993df16c Mar 10 07:04:01 crc kubenswrapper[4825]: W0310 07:04:01.867366 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d4252ca_f279_4c95_8f10_205339d028a5.slice/crio-07bcbd9d0c0735b9472b5e79d9c83d84abc98a52a384930a35a5190919aa256c WatchSource:0}: Error finding container 07bcbd9d0c0735b9472b5e79d9c83d84abc98a52a384930a35a5190919aa256c: Status 404 returned error can't find the container with id 07bcbd9d0c0735b9472b5e79d9c83d84abc98a52a384930a35a5190919aa256c Mar 10 07:04:02 crc kubenswrapper[4825]: I0310 07:04:02.134550 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 10 07:04:02 crc kubenswrapper[4825]: W0310 07:04:02.138958 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6301c97_5fb5_4f12_9181_ae937aa01b33.slice/crio-6db8497e28aa6376de67f3e7e22a09e5d46729a54a85d651a2e5a34cfc2e8f11 WatchSource:0}: Error finding container 6db8497e28aa6376de67f3e7e22a09e5d46729a54a85d651a2e5a34cfc2e8f11: Status 404 returned error can't find the container with id 6db8497e28aa6376de67f3e7e22a09e5d46729a54a85d651a2e5a34cfc2e8f11 Mar 10 07:04:02 crc kubenswrapper[4825]: I0310 07:04:02.193886 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552104-z7jkb" 
event={"ID":"2f5ca3f7-1d76-4749-a5c9-1063d3d44886","Type":"ContainerStarted","Data":"c6a4bc10acce355af606ccc9123ca390115d0d0755791776631ec89e993df16c"} Mar 10 07:04:02 crc kubenswrapper[4825]: I0310 07:04:02.196248 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e6301c97-5fb5-4f12-9181-ae937aa01b33","Type":"ContainerStarted","Data":"6db8497e28aa6376de67f3e7e22a09e5d46729a54a85d651a2e5a34cfc2e8f11"} Mar 10 07:04:02 crc kubenswrapper[4825]: I0310 07:04:02.199502 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"757635f4-9aaf-48bb-bd50-60398db738d4","Type":"ContainerStarted","Data":"8edd9f19672976cb56f4c38fafcec17938c94445105f5ca564ed9e0d46b66553"} Mar 10 07:04:02 crc kubenswrapper[4825]: I0310 07:04:02.201425 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"1b5b179f-f4fb-479c-9720-25587566c518","Type":"ContainerStarted","Data":"c951929b4cab3e3155d3ea8f2442bc5d1e335eb49649bc66959b618a6bb7f4fb"} Mar 10 07:04:02 crc kubenswrapper[4825]: I0310 07:04:02.203043 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"4d0be589-d069-4183-b9f6-bde715ad716b","Type":"ContainerStarted","Data":"b8310da36e0e36ef0978b2d77b8b278dccb06b90cb2d218f30ee6f9afc8ccd7b"} Mar 10 07:04:02 crc kubenswrapper[4825]: I0310 07:04:02.206197 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jctbb" event={"ID":"2b7468b7-ceaa-44b4-8364-5d3601f43c1b","Type":"ContainerStarted","Data":"374e92a40ee4ebc87a6630a572595e29f1d285aa432e2d541165266f1ae0d414"} Mar 10 07:04:02 crc kubenswrapper[4825]: I0310 07:04:02.208264 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"5d4252ca-f279-4c95-8f10-205339d028a5","Type":"ContainerStarted","Data":"07bcbd9d0c0735b9472b5e79d9c83d84abc98a52a384930a35a5190919aa256c"} Mar 10 07:04:02 crc 
kubenswrapper[4825]: I0310 07:04:02.210198 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"29b6d990-12b7-446f-9cd8-53c1111b1512","Type":"ContainerStarted","Data":"b0b1f70ca4f0682b0de278256fe393fd457125b5c684916d6fa60bb244771f7a"} Mar 10 07:04:02 crc kubenswrapper[4825]: I0310 07:04:02.238416 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-8x4pm"] Mar 10 07:04:02 crc kubenswrapper[4825]: W0310 07:04:02.250708 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf04e11a7_e387_4e51_b878_8633ca528b1a.slice/crio-754ad60a5b8873bafdcc44f51b7f245d114da8d4e8574e9082261004cb8fcd7e WatchSource:0}: Error finding container 754ad60a5b8873bafdcc44f51b7f245d114da8d4e8574e9082261004cb8fcd7e: Status 404 returned error can't find the container with id 754ad60a5b8873bafdcc44f51b7f245d114da8d4e8574e9082261004cb8fcd7e Mar 10 07:04:03 crc kubenswrapper[4825]: I0310 07:04:03.230503 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8x4pm" event={"ID":"f04e11a7-e387-4e51-b878-8633ca528b1a","Type":"ContainerStarted","Data":"754ad60a5b8873bafdcc44f51b7f245d114da8d4e8574e9082261004cb8fcd7e"} Mar 10 07:04:05 crc kubenswrapper[4825]: E0310 07:04:05.348070 4825 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514" Mar 10 07:04:05 crc kubenswrapper[4825]: E0310 07:04:05.348859 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d 
--hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rztrw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-589db6c89c-qzmg5_openstack(a364111b-fbb3-4eb4-9acf-14d491d15519): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 07:04:05 crc kubenswrapper[4825]: E0310 
07:04:05.351260 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-589db6c89c-qzmg5" podUID="a364111b-fbb3-4eb4-9acf-14d491d15519" Mar 10 07:04:05 crc kubenswrapper[4825]: E0310 07:04:05.390829 4825 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514" Mar 10 07:04:05 crc kubenswrapper[4825]: E0310 07:04:05.391070 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hrvml,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-86bbd886cf-fnzcj_openstack(8750bff5-ab3f-4cbe-9dbb-4378b39c93a3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 07:04:05 crc kubenswrapper[4825]: E0310 07:04:05.392728 4825 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-86bbd886cf-fnzcj" podUID="8750bff5-ab3f-4cbe-9dbb-4378b39c93a3" Mar 10 07:04:05 crc kubenswrapper[4825]: E0310 07:04:05.449792 4825 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514" Mar 10 07:04:05 crc kubenswrapper[4825]: E0310 07:04:05.450038 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xl6vm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-7c47bcb9f9-w92qr_openstack(9dbf9084-ea14-4c47-8d2e-682719c4f27c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 07:04:05 crc kubenswrapper[4825]: E0310 07:04:05.451327 4825 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-7c47bcb9f9-w92qr" podUID="9dbf9084-ea14-4c47-8d2e-682719c4f27c" Mar 10 07:04:05 crc kubenswrapper[4825]: E0310 07:04:05.535386 4825 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514" Mar 10 07:04:05 crc kubenswrapper[4825]: E0310 07:04:05.535714 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w5vcp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-79f9fc56ff-gxl2b_openstack(7b53862f-ef06-4fc8-9e9b-d409a3eb3671): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 07:04:05 crc kubenswrapper[4825]: E0310 07:04:05.537069 4825 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-79f9fc56ff-gxl2b" podUID="7b53862f-ef06-4fc8-9e9b-d409a3eb3671" Mar 10 07:04:06 crc kubenswrapper[4825]: I0310 07:04:06.265415 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"40efa241-98cc-4dec-9ae8-8a892b367ebc","Type":"ContainerStarted","Data":"afabc77fd2eb1f467ac3dc9c5bc7ab28337af9e10ce2b6cf66724e5f31f41951"} Mar 10 07:04:06 crc kubenswrapper[4825]: E0310 07:04:06.269943 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514\\\"\"" pod="openstack/dnsmasq-dns-79f9fc56ff-gxl2b" podUID="7b53862f-ef06-4fc8-9e9b-d409a3eb3671" Mar 10 07:04:06 crc kubenswrapper[4825]: E0310 07:04:06.296291 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514\\\"\"" pod="openstack/dnsmasq-dns-7c47bcb9f9-w92qr" podUID="9dbf9084-ea14-4c47-8d2e-682719c4f27c" Mar 10 07:04:06 crc kubenswrapper[4825]: I0310 07:04:06.790407 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-qzmg5" Mar 10 07:04:06 crc kubenswrapper[4825]: I0310 07:04:06.793374 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-fnzcj" Mar 10 07:04:06 crc kubenswrapper[4825]: I0310 07:04:06.936152 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a364111b-fbb3-4eb4-9acf-14d491d15519-config\") pod \"a364111b-fbb3-4eb4-9acf-14d491d15519\" (UID: \"a364111b-fbb3-4eb4-9acf-14d491d15519\") " Mar 10 07:04:06 crc kubenswrapper[4825]: I0310 07:04:06.936263 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rztrw\" (UniqueName: \"kubernetes.io/projected/a364111b-fbb3-4eb4-9acf-14d491d15519-kube-api-access-rztrw\") pod \"a364111b-fbb3-4eb4-9acf-14d491d15519\" (UID: \"a364111b-fbb3-4eb4-9acf-14d491d15519\") " Mar 10 07:04:06 crc kubenswrapper[4825]: I0310 07:04:06.936355 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrvml\" (UniqueName: \"kubernetes.io/projected/8750bff5-ab3f-4cbe-9dbb-4378b39c93a3-kube-api-access-hrvml\") pod \"8750bff5-ab3f-4cbe-9dbb-4378b39c93a3\" (UID: \"8750bff5-ab3f-4cbe-9dbb-4378b39c93a3\") " Mar 10 07:04:06 crc kubenswrapper[4825]: I0310 07:04:06.936477 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8750bff5-ab3f-4cbe-9dbb-4378b39c93a3-config\") pod \"8750bff5-ab3f-4cbe-9dbb-4378b39c93a3\" (UID: \"8750bff5-ab3f-4cbe-9dbb-4378b39c93a3\") " Mar 10 07:04:06 crc kubenswrapper[4825]: I0310 07:04:06.936600 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8750bff5-ab3f-4cbe-9dbb-4378b39c93a3-dns-svc\") pod \"8750bff5-ab3f-4cbe-9dbb-4378b39c93a3\" (UID: \"8750bff5-ab3f-4cbe-9dbb-4378b39c93a3\") " Mar 10 07:04:06 crc kubenswrapper[4825]: I0310 07:04:06.936591 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/a364111b-fbb3-4eb4-9acf-14d491d15519-config" (OuterVolumeSpecName: "config") pod "a364111b-fbb3-4eb4-9acf-14d491d15519" (UID: "a364111b-fbb3-4eb4-9acf-14d491d15519"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:04:06 crc kubenswrapper[4825]: I0310 07:04:06.936956 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8750bff5-ab3f-4cbe-9dbb-4378b39c93a3-config" (OuterVolumeSpecName: "config") pod "8750bff5-ab3f-4cbe-9dbb-4378b39c93a3" (UID: "8750bff5-ab3f-4cbe-9dbb-4378b39c93a3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:04:06 crc kubenswrapper[4825]: I0310 07:04:06.936983 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8750bff5-ab3f-4cbe-9dbb-4378b39c93a3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8750bff5-ab3f-4cbe-9dbb-4378b39c93a3" (UID: "8750bff5-ab3f-4cbe-9dbb-4378b39c93a3"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:04:06 crc kubenswrapper[4825]: I0310 07:04:06.937205 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8750bff5-ab3f-4cbe-9dbb-4378b39c93a3-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:06 crc kubenswrapper[4825]: I0310 07:04:06.937237 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a364111b-fbb3-4eb4-9acf-14d491d15519-config\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:06 crc kubenswrapper[4825]: I0310 07:04:06.937252 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8750bff5-ab3f-4cbe-9dbb-4378b39c93a3-config\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:06 crc kubenswrapper[4825]: I0310 07:04:06.942544 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8750bff5-ab3f-4cbe-9dbb-4378b39c93a3-kube-api-access-hrvml" (OuterVolumeSpecName: "kube-api-access-hrvml") pod "8750bff5-ab3f-4cbe-9dbb-4378b39c93a3" (UID: "8750bff5-ab3f-4cbe-9dbb-4378b39c93a3"). InnerVolumeSpecName "kube-api-access-hrvml". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:04:06 crc kubenswrapper[4825]: I0310 07:04:06.954420 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a364111b-fbb3-4eb4-9acf-14d491d15519-kube-api-access-rztrw" (OuterVolumeSpecName: "kube-api-access-rztrw") pod "a364111b-fbb3-4eb4-9acf-14d491d15519" (UID: "a364111b-fbb3-4eb4-9acf-14d491d15519"). InnerVolumeSpecName "kube-api-access-rztrw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:04:07 crc kubenswrapper[4825]: I0310 07:04:07.040033 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rztrw\" (UniqueName: \"kubernetes.io/projected/a364111b-fbb3-4eb4-9acf-14d491d15519-kube-api-access-rztrw\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:07 crc kubenswrapper[4825]: I0310 07:04:07.040409 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrvml\" (UniqueName: \"kubernetes.io/projected/8750bff5-ab3f-4cbe-9dbb-4378b39c93a3-kube-api-access-hrvml\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:07 crc kubenswrapper[4825]: I0310 07:04:07.274045 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-qzmg5" Mar 10 07:04:07 crc kubenswrapper[4825]: I0310 07:04:07.274093 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589db6c89c-qzmg5" event={"ID":"a364111b-fbb3-4eb4-9acf-14d491d15519","Type":"ContainerDied","Data":"e0304e221aa975d40ae52bf1b3d4acf750f7a7376ea1b4f036497fc4c040fa3b"} Mar 10 07:04:07 crc kubenswrapper[4825]: I0310 07:04:07.276518 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86bbd886cf-fnzcj" event={"ID":"8750bff5-ab3f-4cbe-9dbb-4378b39c93a3","Type":"ContainerDied","Data":"7f3bed86760bffd711f4956231a848c1af2683458981237d5a3c3c69d5c777f6"} Mar 10 07:04:07 crc kubenswrapper[4825]: I0310 07:04:07.276749 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-fnzcj" Mar 10 07:04:07 crc kubenswrapper[4825]: I0310 07:04:07.283524 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9ba0e3ee-1309-4411-a927-866b35c2776b","Type":"ContainerStarted","Data":"85a2b6ae62f4562edc4361ee48380de641fa71091bc57a5172633641c40144a7"} Mar 10 07:04:07 crc kubenswrapper[4825]: I0310 07:04:07.328430 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-fnzcj"] Mar 10 07:04:07 crc kubenswrapper[4825]: I0310 07:04:07.334036 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-fnzcj"] Mar 10 07:04:07 crc kubenswrapper[4825]: I0310 07:04:07.388389 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-qzmg5"] Mar 10 07:04:07 crc kubenswrapper[4825]: I0310 07:04:07.396627 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-qzmg5"] Mar 10 07:04:09 crc kubenswrapper[4825]: I0310 07:04:09.259485 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8750bff5-ab3f-4cbe-9dbb-4378b39c93a3" path="/var/lib/kubelet/pods/8750bff5-ab3f-4cbe-9dbb-4378b39c93a3/volumes" Mar 10 07:04:09 crc kubenswrapper[4825]: I0310 07:04:09.261751 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a364111b-fbb3-4eb4-9acf-14d491d15519" path="/var/lib/kubelet/pods/a364111b-fbb3-4eb4-9acf-14d491d15519/volumes" Mar 10 07:04:13 crc kubenswrapper[4825]: I0310 07:04:13.588840 4825 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 07:04:14 crc kubenswrapper[4825]: I0310 07:04:14.342009 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"29b6d990-12b7-446f-9cd8-53c1111b1512","Type":"ContainerStarted","Data":"ab4092c2592a4c7eb52ffe086ca172917d888418a22ed5bfde36459943bc1706"} Mar 10 07:04:14 crc kubenswrapper[4825]: I0310 07:04:14.342451 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 10 07:04:14 crc kubenswrapper[4825]: I0310 07:04:14.345060 4825 generic.go:334] "Generic (PLEG): container finished" podID="2f5ca3f7-1d76-4749-a5c9-1063d3d44886" containerID="edf716e37dfc32124d097f2a38586d83b24327f8a4221726babeb8c34418ddad" exitCode=0 Mar 10 07:04:14 crc kubenswrapper[4825]: I0310 07:04:14.345231 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552104-z7jkb" event={"ID":"2f5ca3f7-1d76-4749-a5c9-1063d3d44886","Type":"ContainerDied","Data":"edf716e37dfc32124d097f2a38586d83b24327f8a4221726babeb8c34418ddad"} Mar 10 07:04:14 crc kubenswrapper[4825]: I0310 07:04:14.354464 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e6301c97-5fb5-4f12-9181-ae937aa01b33","Type":"ContainerStarted","Data":"890a4d9166366f896b7b573d660a3e9559cfb944d463e538a8cb9f89a3568bf9"} Mar 10 07:04:14 crc kubenswrapper[4825]: I0310 07:04:14.367230 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"757635f4-9aaf-48bb-bd50-60398db738d4","Type":"ContainerStarted","Data":"6289da05c1ca1af51b3d7bc91bf14b2681c592f4f4c2d675793e7468ec8ede52"} Mar 10 07:04:14 crc kubenswrapper[4825]: I0310 07:04:14.371449 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=12.791786423 podStartE2EDuration="24.371430684s" podCreationTimestamp="2026-03-10 07:03:50 +0000 UTC" firstStartedPulling="2026-03-10 07:04:01.739980473 +0000 UTC m=+1194.769761088" lastFinishedPulling="2026-03-10 07:04:13.319624724 +0000 UTC m=+1206.349405349" observedRunningTime="2026-03-10 
07:04:14.363907308 +0000 UTC m=+1207.393687923" watchObservedRunningTime="2026-03-10 07:04:14.371430684 +0000 UTC m=+1207.401211299" Mar 10 07:04:14 crc kubenswrapper[4825]: I0310 07:04:14.373253 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"4d0be589-d069-4183-b9f6-bde715ad716b","Type":"ContainerStarted","Data":"287e2ed2fe7cd936cd37546d987ad73fd2626cb93c8a46da8028302a3c2cd37d"} Mar 10 07:04:14 crc kubenswrapper[4825]: I0310 07:04:14.382453 4825 generic.go:334] "Generic (PLEG): container finished" podID="f04e11a7-e387-4e51-b878-8633ca528b1a" containerID="0629a4d8ca064194d19a4f47c25dd149490196967be2ed943f6370d705b5cad2" exitCode=0 Mar 10 07:04:14 crc kubenswrapper[4825]: I0310 07:04:14.382565 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8x4pm" event={"ID":"f04e11a7-e387-4e51-b878-8633ca528b1a","Type":"ContainerDied","Data":"0629a4d8ca064194d19a4f47c25dd149490196967be2ed943f6370d705b5cad2"} Mar 10 07:04:14 crc kubenswrapper[4825]: I0310 07:04:14.397569 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"1b5b179f-f4fb-479c-9720-25587566c518","Type":"ContainerStarted","Data":"8f908b02b36a68508bfbba63d180143eca8f4c5bb9968039511ca87e022f56c8"} Mar 10 07:04:14 crc kubenswrapper[4825]: I0310 07:04:14.418350 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jctbb" event={"ID":"2b7468b7-ceaa-44b4-8364-5d3601f43c1b","Type":"ContainerStarted","Data":"dbfa45eee1893e39936ca7ed89cd4006d13a24a21be39d324c10431ed74834ee"} Mar 10 07:04:14 crc kubenswrapper[4825]: I0310 07:04:14.418723 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-jctbb" Mar 10 07:04:14 crc kubenswrapper[4825]: I0310 07:04:14.423235 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" 
event={"ID":"5d4252ca-f279-4c95-8f10-205339d028a5","Type":"ContainerStarted","Data":"4c5195f2b444f93836dcb7df7bcc67ecb17ab72fc077c910233e60c95d78edb3"} Mar 10 07:04:14 crc kubenswrapper[4825]: I0310 07:04:14.423596 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 10 07:04:14 crc kubenswrapper[4825]: I0310 07:04:14.462484 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-jctbb" podStartSLOduration=9.756198914 podStartE2EDuration="21.462464454s" podCreationTimestamp="2026-03-10 07:03:53 +0000 UTC" firstStartedPulling="2026-03-10 07:04:01.481750486 +0000 UTC m=+1194.511531101" lastFinishedPulling="2026-03-10 07:04:13.188016026 +0000 UTC m=+1206.217796641" observedRunningTime="2026-03-10 07:04:14.460445531 +0000 UTC m=+1207.490226156" watchObservedRunningTime="2026-03-10 07:04:14.462464454 +0000 UTC m=+1207.492245069" Mar 10 07:04:15 crc kubenswrapper[4825]: I0310 07:04:15.436313 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8x4pm" event={"ID":"f04e11a7-e387-4e51-b878-8633ca528b1a","Type":"ContainerStarted","Data":"1f073cb0d55b119007da7e16a09131032c9832ee0f4faf6e839adba2f4de4803"} Mar 10 07:04:15 crc kubenswrapper[4825]: I0310 07:04:15.436754 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8x4pm" event={"ID":"f04e11a7-e387-4e51-b878-8633ca528b1a","Type":"ContainerStarted","Data":"dbdb4c597ea9262b0a3b7810260c5ec8336d5f6419c63f838f27d493f6678134"} Mar 10 07:04:15 crc kubenswrapper[4825]: I0310 07:04:15.459159 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=16.14068033 podStartE2EDuration="27.459125987s" podCreationTimestamp="2026-03-10 07:03:48 +0000 UTC" firstStartedPulling="2026-03-10 07:04:01.871808487 +0000 UTC m=+1194.901589102" lastFinishedPulling="2026-03-10 07:04:13.190254144 +0000 UTC m=+1206.220034759" 
observedRunningTime="2026-03-10 07:04:14.490591617 +0000 UTC m=+1207.520372232" watchObservedRunningTime="2026-03-10 07:04:15.459125987 +0000 UTC m=+1208.488906602" Mar 10 07:04:15 crc kubenswrapper[4825]: I0310 07:04:15.462166 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-8x4pm" podStartSLOduration=11.525710339 podStartE2EDuration="22.462153296s" podCreationTimestamp="2026-03-10 07:03:53 +0000 UTC" firstStartedPulling="2026-03-10 07:04:02.253252823 +0000 UTC m=+1195.283033458" lastFinishedPulling="2026-03-10 07:04:13.18969581 +0000 UTC m=+1206.219476415" observedRunningTime="2026-03-10 07:04:15.455562674 +0000 UTC m=+1208.485343329" watchObservedRunningTime="2026-03-10 07:04:15.462153296 +0000 UTC m=+1208.491933901" Mar 10 07:04:16 crc kubenswrapper[4825]: I0310 07:04:16.236179 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552104-z7jkb" Mar 10 07:04:16 crc kubenswrapper[4825]: I0310 07:04:16.360448 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6s7z\" (UniqueName: \"kubernetes.io/projected/2f5ca3f7-1d76-4749-a5c9-1063d3d44886-kube-api-access-c6s7z\") pod \"2f5ca3f7-1d76-4749-a5c9-1063d3d44886\" (UID: \"2f5ca3f7-1d76-4749-a5c9-1063d3d44886\") " Mar 10 07:04:16 crc kubenswrapper[4825]: I0310 07:04:16.367243 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f5ca3f7-1d76-4749-a5c9-1063d3d44886-kube-api-access-c6s7z" (OuterVolumeSpecName: "kube-api-access-c6s7z") pod "2f5ca3f7-1d76-4749-a5c9-1063d3d44886" (UID: "2f5ca3f7-1d76-4749-a5c9-1063d3d44886"). InnerVolumeSpecName "kube-api-access-c6s7z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:04:16 crc kubenswrapper[4825]: I0310 07:04:16.447040 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552104-z7jkb" event={"ID":"2f5ca3f7-1d76-4749-a5c9-1063d3d44886","Type":"ContainerDied","Data":"c6a4bc10acce355af606ccc9123ca390115d0d0755791776631ec89e993df16c"} Mar 10 07:04:16 crc kubenswrapper[4825]: I0310 07:04:16.447088 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6a4bc10acce355af606ccc9123ca390115d0d0755791776631ec89e993df16c" Mar 10 07:04:16 crc kubenswrapper[4825]: I0310 07:04:16.447055 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552104-z7jkb" Mar 10 07:04:16 crc kubenswrapper[4825]: I0310 07:04:16.447291 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-8x4pm" Mar 10 07:04:16 crc kubenswrapper[4825]: I0310 07:04:16.447323 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-8x4pm" Mar 10 07:04:16 crc kubenswrapper[4825]: I0310 07:04:16.463745 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6s7z\" (UniqueName: \"kubernetes.io/projected/2f5ca3f7-1d76-4749-a5c9-1063d3d44886-kube-api-access-c6s7z\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:17 crc kubenswrapper[4825]: I0310 07:04:17.310320 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552098-dffsl"] Mar 10 07:04:17 crc kubenswrapper[4825]: I0310 07:04:17.319314 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552098-dffsl"] Mar 10 07:04:17 crc kubenswrapper[4825]: I0310 07:04:17.464591 4825 generic.go:334] "Generic (PLEG): container finished" podID="757635f4-9aaf-48bb-bd50-60398db738d4" 
containerID="6289da05c1ca1af51b3d7bc91bf14b2681c592f4f4c2d675793e7468ec8ede52" exitCode=0 Mar 10 07:04:17 crc kubenswrapper[4825]: I0310 07:04:17.464728 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"757635f4-9aaf-48bb-bd50-60398db738d4","Type":"ContainerDied","Data":"6289da05c1ca1af51b3d7bc91bf14b2681c592f4f4c2d675793e7468ec8ede52"} Mar 10 07:04:17 crc kubenswrapper[4825]: I0310 07:04:17.468653 4825 generic.go:334] "Generic (PLEG): container finished" podID="1b5b179f-f4fb-479c-9720-25587566c518" containerID="8f908b02b36a68508bfbba63d180143eca8f4c5bb9968039511ca87e022f56c8" exitCode=0 Mar 10 07:04:17 crc kubenswrapper[4825]: I0310 07:04:17.468724 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"1b5b179f-f4fb-479c-9720-25587566c518","Type":"ContainerDied","Data":"8f908b02b36a68508bfbba63d180143eca8f4c5bb9968039511ca87e022f56c8"} Mar 10 07:04:17 crc kubenswrapper[4825]: I0310 07:04:17.472264 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"4d0be589-d069-4183-b9f6-bde715ad716b","Type":"ContainerStarted","Data":"445d3424a112177b35839673ae3ebc80192ee7a09e556dccf91f7607c8d6b87a"} Mar 10 07:04:17 crc kubenswrapper[4825]: I0310 07:04:17.474733 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e6301c97-5fb5-4f12-9181-ae937aa01b33","Type":"ContainerStarted","Data":"6d4ba65a7a74c4f484262169282d716735558f576441bf3306734527b8d19733"} Mar 10 07:04:17 crc kubenswrapper[4825]: I0310 07:04:17.562797 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=7.66772276 podStartE2EDuration="22.562777116s" podCreationTimestamp="2026-03-10 07:03:55 +0000 UTC" firstStartedPulling="2026-03-10 07:04:02.141343368 +0000 UTC m=+1195.171123993" lastFinishedPulling="2026-03-10 07:04:17.036397724 +0000 UTC 
m=+1210.066178349" observedRunningTime="2026-03-10 07:04:17.553384291 +0000 UTC m=+1210.583164936" watchObservedRunningTime="2026-03-10 07:04:17.562777116 +0000 UTC m=+1210.592557751" Mar 10 07:04:17 crc kubenswrapper[4825]: I0310 07:04:17.584047 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=6.268330216 podStartE2EDuration="21.584027609s" podCreationTimestamp="2026-03-10 07:03:56 +0000 UTC" firstStartedPulling="2026-03-10 07:04:01.709376336 +0000 UTC m=+1194.739156951" lastFinishedPulling="2026-03-10 07:04:17.025073709 +0000 UTC m=+1210.054854344" observedRunningTime="2026-03-10 07:04:17.579088391 +0000 UTC m=+1210.608869026" watchObservedRunningTime="2026-03-10 07:04:17.584027609 +0000 UTC m=+1210.613808224" Mar 10 07:04:18 crc kubenswrapper[4825]: I0310 07:04:18.162127 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 10 07:04:18 crc kubenswrapper[4825]: I0310 07:04:18.231796 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 10 07:04:18 crc kubenswrapper[4825]: I0310 07:04:18.284625 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 10 07:04:18 crc kubenswrapper[4825]: I0310 07:04:18.490619 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"757635f4-9aaf-48bb-bd50-60398db738d4","Type":"ContainerStarted","Data":"64283b83ccd27989eef21b478fef4edaf5b901a9f55b243380a9b8e95c69504f"} Mar 10 07:04:18 crc kubenswrapper[4825]: I0310 07:04:18.494032 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"1b5b179f-f4fb-479c-9720-25587566c518","Type":"ContainerStarted","Data":"c5251bd347e4e2a0285d072a7e65894076feb5953eaeecf9032c40e2a8952418"} Mar 10 07:04:18 crc kubenswrapper[4825]: I0310 07:04:18.494743 4825 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 10 07:04:18 crc kubenswrapper[4825]: I0310 07:04:18.538801 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=20.733873069 podStartE2EDuration="32.538773479s" podCreationTimestamp="2026-03-10 07:03:46 +0000 UTC" firstStartedPulling="2026-03-10 07:04:01.469283271 +0000 UTC m=+1194.499063886" lastFinishedPulling="2026-03-10 07:04:13.274183681 +0000 UTC m=+1206.303964296" observedRunningTime="2026-03-10 07:04:18.528498351 +0000 UTC m=+1211.558279036" watchObservedRunningTime="2026-03-10 07:04:18.538773479 +0000 UTC m=+1211.568554134" Mar 10 07:04:18 crc kubenswrapper[4825]: I0310 07:04:18.559362 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 10 07:04:18 crc kubenswrapper[4825]: I0310 07:04:18.579629 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=21.85943196 podStartE2EDuration="33.579597943s" podCreationTimestamp="2026-03-10 07:03:45 +0000 UTC" firstStartedPulling="2026-03-10 07:04:01.467526685 +0000 UTC m=+1194.497307300" lastFinishedPulling="2026-03-10 07:04:13.187692668 +0000 UTC m=+1206.217473283" observedRunningTime="2026-03-10 07:04:18.56528241 +0000 UTC m=+1211.595063085" watchObservedRunningTime="2026-03-10 07:04:18.579597943 +0000 UTC m=+1211.609378598" Mar 10 07:04:18 crc kubenswrapper[4825]: I0310 07:04:18.704780 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 10 07:04:18 crc kubenswrapper[4825]: I0310 07:04:18.869930 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79f9fc56ff-gxl2b"] Mar 10 07:04:18 crc kubenswrapper[4825]: I0310 07:04:18.922170 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-795cf8b45c-rb9zl"] 
Mar 10 07:04:18 crc kubenswrapper[4825]: E0310 07:04:18.923412 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f5ca3f7-1d76-4749-a5c9-1063d3d44886" containerName="oc" Mar 10 07:04:18 crc kubenswrapper[4825]: I0310 07:04:18.923491 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f5ca3f7-1d76-4749-a5c9-1063d3d44886" containerName="oc" Mar 10 07:04:18 crc kubenswrapper[4825]: I0310 07:04:18.924007 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f5ca3f7-1d76-4749-a5c9-1063d3d44886" containerName="oc" Mar 10 07:04:18 crc kubenswrapper[4825]: I0310 07:04:18.932793 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-795cf8b45c-rb9zl" Mar 10 07:04:18 crc kubenswrapper[4825]: I0310 07:04:18.943963 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 10 07:04:18 crc kubenswrapper[4825]: I0310 07:04:18.959273 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-cnt4k"] Mar 10 07:04:18 crc kubenswrapper[4825]: I0310 07:04:18.962939 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-cnt4k" Mar 10 07:04:18 crc kubenswrapper[4825]: I0310 07:04:18.974883 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 10 07:04:18 crc kubenswrapper[4825]: I0310 07:04:18.993056 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-795cf8b45c-rb9zl"] Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.010519 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-cnt4k"] Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.015654 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcb8g\" (UniqueName: \"kubernetes.io/projected/d2e08f36-91c9-4bad-b1fb-88a0938c4d25-kube-api-access-dcb8g\") pod \"ovn-controller-metrics-cnt4k\" (UID: \"d2e08f36-91c9-4bad-b1fb-88a0938c4d25\") " pod="openstack/ovn-controller-metrics-cnt4k" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.015901 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mrwb\" (UniqueName: \"kubernetes.io/projected/73078060-8fff-41f1-baa7-17ce9929faec-kube-api-access-8mrwb\") pod \"dnsmasq-dns-795cf8b45c-rb9zl\" (UID: \"73078060-8fff-41f1-baa7-17ce9929faec\") " pod="openstack/dnsmasq-dns-795cf8b45c-rb9zl" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.016615 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2e08f36-91c9-4bad-b1fb-88a0938c4d25-combined-ca-bundle\") pod \"ovn-controller-metrics-cnt4k\" (UID: \"d2e08f36-91c9-4bad-b1fb-88a0938c4d25\") " pod="openstack/ovn-controller-metrics-cnt4k" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.016782 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/73078060-8fff-41f1-baa7-17ce9929faec-config\") pod \"dnsmasq-dns-795cf8b45c-rb9zl\" (UID: \"73078060-8fff-41f1-baa7-17ce9929faec\") " pod="openstack/dnsmasq-dns-795cf8b45c-rb9zl" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.016916 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2e08f36-91c9-4bad-b1fb-88a0938c4d25-config\") pod \"ovn-controller-metrics-cnt4k\" (UID: \"d2e08f36-91c9-4bad-b1fb-88a0938c4d25\") " pod="openstack/ovn-controller-metrics-cnt4k" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.017018 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2e08f36-91c9-4bad-b1fb-88a0938c4d25-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-cnt4k\" (UID: \"d2e08f36-91c9-4bad-b1fb-88a0938c4d25\") " pod="openstack/ovn-controller-metrics-cnt4k" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.017124 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73078060-8fff-41f1-baa7-17ce9929faec-ovsdbserver-sb\") pod \"dnsmasq-dns-795cf8b45c-rb9zl\" (UID: \"73078060-8fff-41f1-baa7-17ce9929faec\") " pod="openstack/dnsmasq-dns-795cf8b45c-rb9zl" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.017260 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d2e08f36-91c9-4bad-b1fb-88a0938c4d25-ovs-rundir\") pod \"ovn-controller-metrics-cnt4k\" (UID: \"d2e08f36-91c9-4bad-b1fb-88a0938c4d25\") " pod="openstack/ovn-controller-metrics-cnt4k" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.017429 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d2e08f36-91c9-4bad-b1fb-88a0938c4d25-ovn-rundir\") pod \"ovn-controller-metrics-cnt4k\" (UID: \"d2e08f36-91c9-4bad-b1fb-88a0938c4d25\") " pod="openstack/ovn-controller-metrics-cnt4k" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.017533 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73078060-8fff-41f1-baa7-17ce9929faec-dns-svc\") pod \"dnsmasq-dns-795cf8b45c-rb9zl\" (UID: \"73078060-8fff-41f1-baa7-17ce9929faec\") " pod="openstack/dnsmasq-dns-795cf8b45c-rb9zl" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.118941 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcb8g\" (UniqueName: \"kubernetes.io/projected/d2e08f36-91c9-4bad-b1fb-88a0938c4d25-kube-api-access-dcb8g\") pod \"ovn-controller-metrics-cnt4k\" (UID: \"d2e08f36-91c9-4bad-b1fb-88a0938c4d25\") " pod="openstack/ovn-controller-metrics-cnt4k" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.118986 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mrwb\" (UniqueName: \"kubernetes.io/projected/73078060-8fff-41f1-baa7-17ce9929faec-kube-api-access-8mrwb\") pod \"dnsmasq-dns-795cf8b45c-rb9zl\" (UID: \"73078060-8fff-41f1-baa7-17ce9929faec\") " pod="openstack/dnsmasq-dns-795cf8b45c-rb9zl" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.119032 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2e08f36-91c9-4bad-b1fb-88a0938c4d25-combined-ca-bundle\") pod \"ovn-controller-metrics-cnt4k\" (UID: \"d2e08f36-91c9-4bad-b1fb-88a0938c4d25\") " pod="openstack/ovn-controller-metrics-cnt4k" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.119076 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/73078060-8fff-41f1-baa7-17ce9929faec-config\") pod \"dnsmasq-dns-795cf8b45c-rb9zl\" (UID: \"73078060-8fff-41f1-baa7-17ce9929faec\") " pod="openstack/dnsmasq-dns-795cf8b45c-rb9zl" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.119113 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2e08f36-91c9-4bad-b1fb-88a0938c4d25-config\") pod \"ovn-controller-metrics-cnt4k\" (UID: \"d2e08f36-91c9-4bad-b1fb-88a0938c4d25\") " pod="openstack/ovn-controller-metrics-cnt4k" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.119160 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2e08f36-91c9-4bad-b1fb-88a0938c4d25-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-cnt4k\" (UID: \"d2e08f36-91c9-4bad-b1fb-88a0938c4d25\") " pod="openstack/ovn-controller-metrics-cnt4k" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.119194 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73078060-8fff-41f1-baa7-17ce9929faec-ovsdbserver-sb\") pod \"dnsmasq-dns-795cf8b45c-rb9zl\" (UID: \"73078060-8fff-41f1-baa7-17ce9929faec\") " pod="openstack/dnsmasq-dns-795cf8b45c-rb9zl" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.119231 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d2e08f36-91c9-4bad-b1fb-88a0938c4d25-ovs-rundir\") pod \"ovn-controller-metrics-cnt4k\" (UID: \"d2e08f36-91c9-4bad-b1fb-88a0938c4d25\") " pod="openstack/ovn-controller-metrics-cnt4k" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.119254 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/host-path/d2e08f36-91c9-4bad-b1fb-88a0938c4d25-ovn-rundir\") pod \"ovn-controller-metrics-cnt4k\" (UID: \"d2e08f36-91c9-4bad-b1fb-88a0938c4d25\") " pod="openstack/ovn-controller-metrics-cnt4k" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.119276 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73078060-8fff-41f1-baa7-17ce9929faec-dns-svc\") pod \"dnsmasq-dns-795cf8b45c-rb9zl\" (UID: \"73078060-8fff-41f1-baa7-17ce9929faec\") " pod="openstack/dnsmasq-dns-795cf8b45c-rb9zl" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.120266 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73078060-8fff-41f1-baa7-17ce9929faec-dns-svc\") pod \"dnsmasq-dns-795cf8b45c-rb9zl\" (UID: \"73078060-8fff-41f1-baa7-17ce9929faec\") " pod="openstack/dnsmasq-dns-795cf8b45c-rb9zl" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.121419 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2e08f36-91c9-4bad-b1fb-88a0938c4d25-config\") pod \"ovn-controller-metrics-cnt4k\" (UID: \"d2e08f36-91c9-4bad-b1fb-88a0938c4d25\") " pod="openstack/ovn-controller-metrics-cnt4k" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.121834 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d2e08f36-91c9-4bad-b1fb-88a0938c4d25-ovs-rundir\") pod \"ovn-controller-metrics-cnt4k\" (UID: \"d2e08f36-91c9-4bad-b1fb-88a0938c4d25\") " pod="openstack/ovn-controller-metrics-cnt4k" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.121955 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73078060-8fff-41f1-baa7-17ce9929faec-config\") pod \"dnsmasq-dns-795cf8b45c-rb9zl\" (UID: 
\"73078060-8fff-41f1-baa7-17ce9929faec\") " pod="openstack/dnsmasq-dns-795cf8b45c-rb9zl" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.122018 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d2e08f36-91c9-4bad-b1fb-88a0938c4d25-ovn-rundir\") pod \"ovn-controller-metrics-cnt4k\" (UID: \"d2e08f36-91c9-4bad-b1fb-88a0938c4d25\") " pod="openstack/ovn-controller-metrics-cnt4k" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.122508 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73078060-8fff-41f1-baa7-17ce9929faec-ovsdbserver-sb\") pod \"dnsmasq-dns-795cf8b45c-rb9zl\" (UID: \"73078060-8fff-41f1-baa7-17ce9929faec\") " pod="openstack/dnsmasq-dns-795cf8b45c-rb9zl" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.125768 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2e08f36-91c9-4bad-b1fb-88a0938c4d25-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-cnt4k\" (UID: \"d2e08f36-91c9-4bad-b1fb-88a0938c4d25\") " pod="openstack/ovn-controller-metrics-cnt4k" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.125818 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2e08f36-91c9-4bad-b1fb-88a0938c4d25-combined-ca-bundle\") pod \"ovn-controller-metrics-cnt4k\" (UID: \"d2e08f36-91c9-4bad-b1fb-88a0938c4d25\") " pod="openstack/ovn-controller-metrics-cnt4k" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.137257 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mrwb\" (UniqueName: \"kubernetes.io/projected/73078060-8fff-41f1-baa7-17ce9929faec-kube-api-access-8mrwb\") pod \"dnsmasq-dns-795cf8b45c-rb9zl\" (UID: \"73078060-8fff-41f1-baa7-17ce9929faec\") " 
pod="openstack/dnsmasq-dns-795cf8b45c-rb9zl" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.137645 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcb8g\" (UniqueName: \"kubernetes.io/projected/d2e08f36-91c9-4bad-b1fb-88a0938c4d25-kube-api-access-dcb8g\") pod \"ovn-controller-metrics-cnt4k\" (UID: \"d2e08f36-91c9-4bad-b1fb-88a0938c4d25\") " pod="openstack/ovn-controller-metrics-cnt4k" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.162581 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-w92qr"] Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.202706 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-5lkdr"] Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.204151 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b57d9888c-5lkdr" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.206163 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.212671 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-5lkdr"] Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.220409 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec1ec5e1-8623-412f-a622-52b10a2c798d-config\") pod \"dnsmasq-dns-7b57d9888c-5lkdr\" (UID: \"ec1ec5e1-8623-412f-a622-52b10a2c798d\") " pod="openstack/dnsmasq-dns-7b57d9888c-5lkdr" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.220460 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec1ec5e1-8623-412f-a622-52b10a2c798d-dns-svc\") pod \"dnsmasq-dns-7b57d9888c-5lkdr\" (UID: 
\"ec1ec5e1-8623-412f-a622-52b10a2c798d\") " pod="openstack/dnsmasq-dns-7b57d9888c-5lkdr" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.220487 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtkpn\" (UniqueName: \"kubernetes.io/projected/ec1ec5e1-8623-412f-a622-52b10a2c798d-kube-api-access-gtkpn\") pod \"dnsmasq-dns-7b57d9888c-5lkdr\" (UID: \"ec1ec5e1-8623-412f-a622-52b10a2c798d\") " pod="openstack/dnsmasq-dns-7b57d9888c-5lkdr" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.220507 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec1ec5e1-8623-412f-a622-52b10a2c798d-ovsdbserver-sb\") pod \"dnsmasq-dns-7b57d9888c-5lkdr\" (UID: \"ec1ec5e1-8623-412f-a622-52b10a2c798d\") " pod="openstack/dnsmasq-dns-7b57d9888c-5lkdr" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.220526 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec1ec5e1-8623-412f-a622-52b10a2c798d-ovsdbserver-nb\") pod \"dnsmasq-dns-7b57d9888c-5lkdr\" (UID: \"ec1ec5e1-8623-412f-a622-52b10a2c798d\") " pod="openstack/dnsmasq-dns-7b57d9888c-5lkdr" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.257274 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e427caf-e060-4f22-b5cd-96b47c0cf797" path="/var/lib/kubelet/pods/9e427caf-e060-4f22-b5cd-96b47c0cf797/volumes" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.276376 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-795cf8b45c-rb9zl" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.284389 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.305617 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-cnt4k" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.322390 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec1ec5e1-8623-412f-a622-52b10a2c798d-config\") pod \"dnsmasq-dns-7b57d9888c-5lkdr\" (UID: \"ec1ec5e1-8623-412f-a622-52b10a2c798d\") " pod="openstack/dnsmasq-dns-7b57d9888c-5lkdr" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.322465 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec1ec5e1-8623-412f-a622-52b10a2c798d-dns-svc\") pod \"dnsmasq-dns-7b57d9888c-5lkdr\" (UID: \"ec1ec5e1-8623-412f-a622-52b10a2c798d\") " pod="openstack/dnsmasq-dns-7b57d9888c-5lkdr" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.322501 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtkpn\" (UniqueName: \"kubernetes.io/projected/ec1ec5e1-8623-412f-a622-52b10a2c798d-kube-api-access-gtkpn\") pod \"dnsmasq-dns-7b57d9888c-5lkdr\" (UID: \"ec1ec5e1-8623-412f-a622-52b10a2c798d\") " pod="openstack/dnsmasq-dns-7b57d9888c-5lkdr" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.322526 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec1ec5e1-8623-412f-a622-52b10a2c798d-ovsdbserver-sb\") pod \"dnsmasq-dns-7b57d9888c-5lkdr\" (UID: \"ec1ec5e1-8623-412f-a622-52b10a2c798d\") " pod="openstack/dnsmasq-dns-7b57d9888c-5lkdr" Mar 10 07:04:19 crc 
kubenswrapper[4825]: I0310 07:04:19.322552 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec1ec5e1-8623-412f-a622-52b10a2c798d-ovsdbserver-nb\") pod \"dnsmasq-dns-7b57d9888c-5lkdr\" (UID: \"ec1ec5e1-8623-412f-a622-52b10a2c798d\") " pod="openstack/dnsmasq-dns-7b57d9888c-5lkdr" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.324259 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec1ec5e1-8623-412f-a622-52b10a2c798d-config\") pod \"dnsmasq-dns-7b57d9888c-5lkdr\" (UID: \"ec1ec5e1-8623-412f-a622-52b10a2c798d\") " pod="openstack/dnsmasq-dns-7b57d9888c-5lkdr" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.324545 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec1ec5e1-8623-412f-a622-52b10a2c798d-ovsdbserver-nb\") pod \"dnsmasq-dns-7b57d9888c-5lkdr\" (UID: \"ec1ec5e1-8623-412f-a622-52b10a2c798d\") " pod="openstack/dnsmasq-dns-7b57d9888c-5lkdr" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.325126 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec1ec5e1-8623-412f-a622-52b10a2c798d-dns-svc\") pod \"dnsmasq-dns-7b57d9888c-5lkdr\" (UID: \"ec1ec5e1-8623-412f-a622-52b10a2c798d\") " pod="openstack/dnsmasq-dns-7b57d9888c-5lkdr" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.326611 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec1ec5e1-8623-412f-a622-52b10a2c798d-ovsdbserver-sb\") pod \"dnsmasq-dns-7b57d9888c-5lkdr\" (UID: \"ec1ec5e1-8623-412f-a622-52b10a2c798d\") " pod="openstack/dnsmasq-dns-7b57d9888c-5lkdr" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.335736 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/ovsdbserver-nb-0" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.341575 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtkpn\" (UniqueName: \"kubernetes.io/projected/ec1ec5e1-8623-412f-a622-52b10a2c798d-kube-api-access-gtkpn\") pod \"dnsmasq-dns-7b57d9888c-5lkdr\" (UID: \"ec1ec5e1-8623-412f-a622-52b10a2c798d\") " pod="openstack/dnsmasq-dns-7b57d9888c-5lkdr" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.491510 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-w92qr" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.511497 4825 generic.go:334] "Generic (PLEG): container finished" podID="7b53862f-ef06-4fc8-9e9b-d409a3eb3671" containerID="9487c7298cd8603fb8c58df036200c1a428fdf6ea3dc68082722cabfecf19b29" exitCode=0 Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.511640 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79f9fc56ff-gxl2b" event={"ID":"7b53862f-ef06-4fc8-9e9b-d409a3eb3671","Type":"ContainerDied","Data":"9487c7298cd8603fb8c58df036200c1a428fdf6ea3dc68082722cabfecf19b29"} Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.520220 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-w92qr" event={"ID":"9dbf9084-ea14-4c47-8d2e-682719c4f27c","Type":"ContainerDied","Data":"34f5eeb6d64e0edf4b4811dbad76921becdef54e33773de603b151286039dde2"} Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.520607 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-w92qr" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.526187 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xl6vm\" (UniqueName: \"kubernetes.io/projected/9dbf9084-ea14-4c47-8d2e-682719c4f27c-kube-api-access-xl6vm\") pod \"9dbf9084-ea14-4c47-8d2e-682719c4f27c\" (UID: \"9dbf9084-ea14-4c47-8d2e-682719c4f27c\") " Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.526241 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9dbf9084-ea14-4c47-8d2e-682719c4f27c-config\") pod \"9dbf9084-ea14-4c47-8d2e-682719c4f27c\" (UID: \"9dbf9084-ea14-4c47-8d2e-682719c4f27c\") " Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.526429 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9dbf9084-ea14-4c47-8d2e-682719c4f27c-dns-svc\") pod \"9dbf9084-ea14-4c47-8d2e-682719c4f27c\" (UID: \"9dbf9084-ea14-4c47-8d2e-682719c4f27c\") " Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.526869 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9dbf9084-ea14-4c47-8d2e-682719c4f27c-config" (OuterVolumeSpecName: "config") pod "9dbf9084-ea14-4c47-8d2e-682719c4f27c" (UID: "9dbf9084-ea14-4c47-8d2e-682719c4f27c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.527093 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9dbf9084-ea14-4c47-8d2e-682719c4f27c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9dbf9084-ea14-4c47-8d2e-682719c4f27c" (UID: "9dbf9084-ea14-4c47-8d2e-682719c4f27c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.533052 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dbf9084-ea14-4c47-8d2e-682719c4f27c-kube-api-access-xl6vm" (OuterVolumeSpecName: "kube-api-access-xl6vm") pod "9dbf9084-ea14-4c47-8d2e-682719c4f27c" (UID: "9dbf9084-ea14-4c47-8d2e-682719c4f27c"). InnerVolumeSpecName "kube-api-access-xl6vm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.560183 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b57d9888c-5lkdr" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.586303 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.637184 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9dbf9084-ea14-4c47-8d2e-682719c4f27c-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.637224 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xl6vm\" (UniqueName: \"kubernetes.io/projected/9dbf9084-ea14-4c47-8d2e-682719c4f27c-kube-api-access-xl6vm\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.637237 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9dbf9084-ea14-4c47-8d2e-682719c4f27c-config\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.769767 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-795cf8b45c-rb9zl"] Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.810411 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 10 07:04:19 crc 
kubenswrapper[4825]: I0310 07:04:19.811734 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.818685 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.818813 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.818849 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.819094 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-dlgk9" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.829943 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.852376 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d77fced6-dee7-49fc-9088-e173a5be3cee-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d77fced6-dee7-49fc-9088-e173a5be3cee\") " pod="openstack/ovn-northd-0" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.852541 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d77fced6-dee7-49fc-9088-e173a5be3cee-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d77fced6-dee7-49fc-9088-e173a5be3cee\") " pod="openstack/ovn-northd-0" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.853094 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/d77fced6-dee7-49fc-9088-e173a5be3cee-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d77fced6-dee7-49fc-9088-e173a5be3cee\") " pod="openstack/ovn-northd-0" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.853194 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d77fced6-dee7-49fc-9088-e173a5be3cee-scripts\") pod \"ovn-northd-0\" (UID: \"d77fced6-dee7-49fc-9088-e173a5be3cee\") " pod="openstack/ovn-northd-0" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.853263 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d77fced6-dee7-49fc-9088-e173a5be3cee-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d77fced6-dee7-49fc-9088-e173a5be3cee\") " pod="openstack/ovn-northd-0" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.853335 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d77fced6-dee7-49fc-9088-e173a5be3cee-config\") pod \"ovn-northd-0\" (UID: \"d77fced6-dee7-49fc-9088-e173a5be3cee\") " pod="openstack/ovn-northd-0" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.853462 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv4pm\" (UniqueName: \"kubernetes.io/projected/d77fced6-dee7-49fc-9088-e173a5be3cee-kube-api-access-fv4pm\") pod \"ovn-northd-0\" (UID: \"d77fced6-dee7-49fc-9088-e173a5be3cee\") " pod="openstack/ovn-northd-0" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.891038 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-cnt4k"] Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.954786 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d77fced6-dee7-49fc-9088-e173a5be3cee-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d77fced6-dee7-49fc-9088-e173a5be3cee\") " pod="openstack/ovn-northd-0" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.954837 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d77fced6-dee7-49fc-9088-e173a5be3cee-scripts\") pod \"ovn-northd-0\" (UID: \"d77fced6-dee7-49fc-9088-e173a5be3cee\") " pod="openstack/ovn-northd-0" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.954862 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d77fced6-dee7-49fc-9088-e173a5be3cee-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d77fced6-dee7-49fc-9088-e173a5be3cee\") " pod="openstack/ovn-northd-0" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.954890 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d77fced6-dee7-49fc-9088-e173a5be3cee-config\") pod \"ovn-northd-0\" (UID: \"d77fced6-dee7-49fc-9088-e173a5be3cee\") " pod="openstack/ovn-northd-0" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.954941 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fv4pm\" (UniqueName: \"kubernetes.io/projected/d77fced6-dee7-49fc-9088-e173a5be3cee-kube-api-access-fv4pm\") pod \"ovn-northd-0\" (UID: \"d77fced6-dee7-49fc-9088-e173a5be3cee\") " pod="openstack/ovn-northd-0" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.954978 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d77fced6-dee7-49fc-9088-e173a5be3cee-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d77fced6-dee7-49fc-9088-e173a5be3cee\") " pod="openstack/ovn-northd-0" Mar 
10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.955043 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d77fced6-dee7-49fc-9088-e173a5be3cee-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d77fced6-dee7-49fc-9088-e173a5be3cee\") " pod="openstack/ovn-northd-0" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.955667 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d77fced6-dee7-49fc-9088-e173a5be3cee-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d77fced6-dee7-49fc-9088-e173a5be3cee\") " pod="openstack/ovn-northd-0" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.955714 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-w92qr"] Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.956457 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d77fced6-dee7-49fc-9088-e173a5be3cee-config\") pod \"ovn-northd-0\" (UID: \"d77fced6-dee7-49fc-9088-e173a5be3cee\") " pod="openstack/ovn-northd-0" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.957617 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d77fced6-dee7-49fc-9088-e173a5be3cee-scripts\") pod \"ovn-northd-0\" (UID: \"d77fced6-dee7-49fc-9088-e173a5be3cee\") " pod="openstack/ovn-northd-0" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.959764 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d77fced6-dee7-49fc-9088-e173a5be3cee-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d77fced6-dee7-49fc-9088-e173a5be3cee\") " pod="openstack/ovn-northd-0" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.960607 4825 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d77fced6-dee7-49fc-9088-e173a5be3cee-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d77fced6-dee7-49fc-9088-e173a5be3cee\") " pod="openstack/ovn-northd-0" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.961714 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-w92qr"] Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.962389 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d77fced6-dee7-49fc-9088-e173a5be3cee-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d77fced6-dee7-49fc-9088-e173a5be3cee\") " pod="openstack/ovn-northd-0" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.977150 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv4pm\" (UniqueName: \"kubernetes.io/projected/d77fced6-dee7-49fc-9088-e173a5be3cee-kube-api-access-fv4pm\") pod \"ovn-northd-0\" (UID: \"d77fced6-dee7-49fc-9088-e173a5be3cee\") " pod="openstack/ovn-northd-0" Mar 10 07:04:19 crc kubenswrapper[4825]: I0310 07:04:19.981160 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79f9fc56ff-gxl2b" Mar 10 07:04:20 crc kubenswrapper[4825]: I0310 07:04:20.056307 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5vcp\" (UniqueName: \"kubernetes.io/projected/7b53862f-ef06-4fc8-9e9b-d409a3eb3671-kube-api-access-w5vcp\") pod \"7b53862f-ef06-4fc8-9e9b-d409a3eb3671\" (UID: \"7b53862f-ef06-4fc8-9e9b-d409a3eb3671\") " Mar 10 07:04:20 crc kubenswrapper[4825]: I0310 07:04:20.056405 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b53862f-ef06-4fc8-9e9b-d409a3eb3671-dns-svc\") pod \"7b53862f-ef06-4fc8-9e9b-d409a3eb3671\" (UID: \"7b53862f-ef06-4fc8-9e9b-d409a3eb3671\") " Mar 10 07:04:20 crc kubenswrapper[4825]: I0310 07:04:20.056473 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b53862f-ef06-4fc8-9e9b-d409a3eb3671-config\") pod \"7b53862f-ef06-4fc8-9e9b-d409a3eb3671\" (UID: \"7b53862f-ef06-4fc8-9e9b-d409a3eb3671\") " Mar 10 07:04:20 crc kubenswrapper[4825]: I0310 07:04:20.060014 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b53862f-ef06-4fc8-9e9b-d409a3eb3671-kube-api-access-w5vcp" (OuterVolumeSpecName: "kube-api-access-w5vcp") pod "7b53862f-ef06-4fc8-9e9b-d409a3eb3671" (UID: "7b53862f-ef06-4fc8-9e9b-d409a3eb3671"). InnerVolumeSpecName "kube-api-access-w5vcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:04:20 crc kubenswrapper[4825]: I0310 07:04:20.083111 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b53862f-ef06-4fc8-9e9b-d409a3eb3671-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7b53862f-ef06-4fc8-9e9b-d409a3eb3671" (UID: "7b53862f-ef06-4fc8-9e9b-d409a3eb3671"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:04:20 crc kubenswrapper[4825]: I0310 07:04:20.090671 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b53862f-ef06-4fc8-9e9b-d409a3eb3671-config" (OuterVolumeSpecName: "config") pod "7b53862f-ef06-4fc8-9e9b-d409a3eb3671" (UID: "7b53862f-ef06-4fc8-9e9b-d409a3eb3671"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:04:20 crc kubenswrapper[4825]: I0310 07:04:20.115153 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-5lkdr"] Mar 10 07:04:20 crc kubenswrapper[4825]: I0310 07:04:20.157850 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5vcp\" (UniqueName: \"kubernetes.io/projected/7b53862f-ef06-4fc8-9e9b-d409a3eb3671-kube-api-access-w5vcp\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:20 crc kubenswrapper[4825]: I0310 07:04:20.157914 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b53862f-ef06-4fc8-9e9b-d409a3eb3671-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:20 crc kubenswrapper[4825]: I0310 07:04:20.157924 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b53862f-ef06-4fc8-9e9b-d409a3eb3671-config\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:20 crc kubenswrapper[4825]: I0310 07:04:20.228173 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 10 07:04:20 crc kubenswrapper[4825]: I0310 07:04:20.534465 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79f9fc56ff-gxl2b" event={"ID":"7b53862f-ef06-4fc8-9e9b-d409a3eb3671","Type":"ContainerDied","Data":"8566d0fe6392bf6233c4618fcc2501036e6a0c4258fbe143075560fe4f1b3529"} Mar 10 07:04:20 crc kubenswrapper[4825]: I0310 07:04:20.534789 4825 scope.go:117] "RemoveContainer" containerID="9487c7298cd8603fb8c58df036200c1a428fdf6ea3dc68082722cabfecf19b29" Mar 10 07:04:20 crc kubenswrapper[4825]: I0310 07:04:20.534556 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79f9fc56ff-gxl2b" Mar 10 07:04:20 crc kubenswrapper[4825]: I0310 07:04:20.538156 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-cnt4k" event={"ID":"d2e08f36-91c9-4bad-b1fb-88a0938c4d25","Type":"ContainerStarted","Data":"a0f14677c076a4d6fea4cbdb391654ebcd047d9f5740b2422a0cfce1097e8da1"} Mar 10 07:04:20 crc kubenswrapper[4825]: I0310 07:04:20.538203 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-cnt4k" event={"ID":"d2e08f36-91c9-4bad-b1fb-88a0938c4d25","Type":"ContainerStarted","Data":"14d95d2f7730bc87f1858043cd3e90fb5b3130d6474c421938b5770e215eb821"} Mar 10 07:04:20 crc kubenswrapper[4825]: I0310 07:04:20.539891 4825 generic.go:334] "Generic (PLEG): container finished" podID="73078060-8fff-41f1-baa7-17ce9929faec" containerID="e46942393fa8f3de1b8f8f0eb176b04af90ff2f0dd813223c2609739a7cd6f00" exitCode=0 Mar 10 07:04:20 crc kubenswrapper[4825]: I0310 07:04:20.539958 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-795cf8b45c-rb9zl" event={"ID":"73078060-8fff-41f1-baa7-17ce9929faec","Type":"ContainerDied","Data":"e46942393fa8f3de1b8f8f0eb176b04af90ff2f0dd813223c2609739a7cd6f00"} Mar 10 07:04:20 crc kubenswrapper[4825]: I0310 07:04:20.539989 
4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-795cf8b45c-rb9zl" event={"ID":"73078060-8fff-41f1-baa7-17ce9929faec","Type":"ContainerStarted","Data":"2731dd3d98a0adbc0a0638a538d118e1e7a216ef225513c0a458250a6b7ef570"} Mar 10 07:04:20 crc kubenswrapper[4825]: I0310 07:04:20.544229 4825 generic.go:334] "Generic (PLEG): container finished" podID="ec1ec5e1-8623-412f-a622-52b10a2c798d" containerID="12fe27040f0212ebc74a1a63a75c026abefdf592a1a0b379c2df95abc8857cd1" exitCode=0 Mar 10 07:04:20 crc kubenswrapper[4825]: I0310 07:04:20.544454 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-5lkdr" event={"ID":"ec1ec5e1-8623-412f-a622-52b10a2c798d","Type":"ContainerDied","Data":"12fe27040f0212ebc74a1a63a75c026abefdf592a1a0b379c2df95abc8857cd1"} Mar 10 07:04:20 crc kubenswrapper[4825]: I0310 07:04:20.544543 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-5lkdr" event={"ID":"ec1ec5e1-8623-412f-a622-52b10a2c798d","Type":"ContainerStarted","Data":"615c124cfb840ed1378e6bb8868174391c9571ebf8c6fb6273e7c70031f3825f"} Mar 10 07:04:20 crc kubenswrapper[4825]: I0310 07:04:20.574153 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-cnt4k" podStartSLOduration=2.574107728 podStartE2EDuration="2.574107728s" podCreationTimestamp="2026-03-10 07:04:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:04:20.567944228 +0000 UTC m=+1213.597724873" watchObservedRunningTime="2026-03-10 07:04:20.574107728 +0000 UTC m=+1213.603888403" Mar 10 07:04:20 crc kubenswrapper[4825]: I0310 07:04:20.706310 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79f9fc56ff-gxl2b"] Mar 10 07:04:20 crc kubenswrapper[4825]: I0310 07:04:20.716681 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-79f9fc56ff-gxl2b"] Mar 10 07:04:20 crc kubenswrapper[4825]: I0310 07:04:20.718822 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 10 07:04:20 crc kubenswrapper[4825]: I0310 07:04:20.905931 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 10 07:04:20 crc kubenswrapper[4825]: I0310 07:04:20.983695 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-795cf8b45c-rb9zl"] Mar 10 07:04:21 crc kubenswrapper[4825]: I0310 07:04:21.038790 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-r9bqz"] Mar 10 07:04:21 crc kubenswrapper[4825]: E0310 07:04:21.039154 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b53862f-ef06-4fc8-9e9b-d409a3eb3671" containerName="init" Mar 10 07:04:21 crc kubenswrapper[4825]: I0310 07:04:21.039169 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b53862f-ef06-4fc8-9e9b-d409a3eb3671" containerName="init" Mar 10 07:04:21 crc kubenswrapper[4825]: I0310 07:04:21.039334 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b53862f-ef06-4fc8-9e9b-d409a3eb3671" containerName="init" Mar 10 07:04:21 crc kubenswrapper[4825]: I0310 07:04:21.042975 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f7dd995-r9bqz" Mar 10 07:04:21 crc kubenswrapper[4825]: I0310 07:04:21.054085 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-r9bqz"] Mar 10 07:04:21 crc kubenswrapper[4825]: I0310 07:04:21.178799 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43dacc02-935b-4836-8863-e175536b0cd2-config\") pod \"dnsmasq-dns-675f7dd995-r9bqz\" (UID: \"43dacc02-935b-4836-8863-e175536b0cd2\") " pod="openstack/dnsmasq-dns-675f7dd995-r9bqz" Mar 10 07:04:21 crc kubenswrapper[4825]: I0310 07:04:21.178880 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43dacc02-935b-4836-8863-e175536b0cd2-dns-svc\") pod \"dnsmasq-dns-675f7dd995-r9bqz\" (UID: \"43dacc02-935b-4836-8863-e175536b0cd2\") " pod="openstack/dnsmasq-dns-675f7dd995-r9bqz" Mar 10 07:04:21 crc kubenswrapper[4825]: I0310 07:04:21.178950 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlfjm\" (UniqueName: \"kubernetes.io/projected/43dacc02-935b-4836-8863-e175536b0cd2-kube-api-access-qlfjm\") pod \"dnsmasq-dns-675f7dd995-r9bqz\" (UID: \"43dacc02-935b-4836-8863-e175536b0cd2\") " pod="openstack/dnsmasq-dns-675f7dd995-r9bqz" Mar 10 07:04:21 crc kubenswrapper[4825]: I0310 07:04:21.179217 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43dacc02-935b-4836-8863-e175536b0cd2-ovsdbserver-sb\") pod \"dnsmasq-dns-675f7dd995-r9bqz\" (UID: \"43dacc02-935b-4836-8863-e175536b0cd2\") " pod="openstack/dnsmasq-dns-675f7dd995-r9bqz" Mar 10 07:04:21 crc kubenswrapper[4825]: I0310 07:04:21.179270 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43dacc02-935b-4836-8863-e175536b0cd2-ovsdbserver-nb\") pod \"dnsmasq-dns-675f7dd995-r9bqz\" (UID: \"43dacc02-935b-4836-8863-e175536b0cd2\") " pod="openstack/dnsmasq-dns-675f7dd995-r9bqz" Mar 10 07:04:21 crc kubenswrapper[4825]: I0310 07:04:21.245959 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b53862f-ef06-4fc8-9e9b-d409a3eb3671" path="/var/lib/kubelet/pods/7b53862f-ef06-4fc8-9e9b-d409a3eb3671/volumes" Mar 10 07:04:21 crc kubenswrapper[4825]: I0310 07:04:21.246539 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dbf9084-ea14-4c47-8d2e-682719c4f27c" path="/var/lib/kubelet/pods/9dbf9084-ea14-4c47-8d2e-682719c4f27c/volumes" Mar 10 07:04:21 crc kubenswrapper[4825]: I0310 07:04:21.281206 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43dacc02-935b-4836-8863-e175536b0cd2-dns-svc\") pod \"dnsmasq-dns-675f7dd995-r9bqz\" (UID: \"43dacc02-935b-4836-8863-e175536b0cd2\") " pod="openstack/dnsmasq-dns-675f7dd995-r9bqz" Mar 10 07:04:21 crc kubenswrapper[4825]: I0310 07:04:21.281306 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlfjm\" (UniqueName: \"kubernetes.io/projected/43dacc02-935b-4836-8863-e175536b0cd2-kube-api-access-qlfjm\") pod \"dnsmasq-dns-675f7dd995-r9bqz\" (UID: \"43dacc02-935b-4836-8863-e175536b0cd2\") " pod="openstack/dnsmasq-dns-675f7dd995-r9bqz" Mar 10 07:04:21 crc kubenswrapper[4825]: I0310 07:04:21.281350 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43dacc02-935b-4836-8863-e175536b0cd2-ovsdbserver-sb\") pod \"dnsmasq-dns-675f7dd995-r9bqz\" (UID: \"43dacc02-935b-4836-8863-e175536b0cd2\") " pod="openstack/dnsmasq-dns-675f7dd995-r9bqz" Mar 10 07:04:21 crc kubenswrapper[4825]: I0310 07:04:21.281407 4825 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43dacc02-935b-4836-8863-e175536b0cd2-ovsdbserver-nb\") pod \"dnsmasq-dns-675f7dd995-r9bqz\" (UID: \"43dacc02-935b-4836-8863-e175536b0cd2\") " pod="openstack/dnsmasq-dns-675f7dd995-r9bqz" Mar 10 07:04:21 crc kubenswrapper[4825]: I0310 07:04:21.281470 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43dacc02-935b-4836-8863-e175536b0cd2-config\") pod \"dnsmasq-dns-675f7dd995-r9bqz\" (UID: \"43dacc02-935b-4836-8863-e175536b0cd2\") " pod="openstack/dnsmasq-dns-675f7dd995-r9bqz" Mar 10 07:04:21 crc kubenswrapper[4825]: I0310 07:04:21.282542 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43dacc02-935b-4836-8863-e175536b0cd2-dns-svc\") pod \"dnsmasq-dns-675f7dd995-r9bqz\" (UID: \"43dacc02-935b-4836-8863-e175536b0cd2\") " pod="openstack/dnsmasq-dns-675f7dd995-r9bqz" Mar 10 07:04:21 crc kubenswrapper[4825]: I0310 07:04:21.282743 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43dacc02-935b-4836-8863-e175536b0cd2-config\") pod \"dnsmasq-dns-675f7dd995-r9bqz\" (UID: \"43dacc02-935b-4836-8863-e175536b0cd2\") " pod="openstack/dnsmasq-dns-675f7dd995-r9bqz" Mar 10 07:04:21 crc kubenswrapper[4825]: I0310 07:04:21.282776 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43dacc02-935b-4836-8863-e175536b0cd2-ovsdbserver-sb\") pod \"dnsmasq-dns-675f7dd995-r9bqz\" (UID: \"43dacc02-935b-4836-8863-e175536b0cd2\") " pod="openstack/dnsmasq-dns-675f7dd995-r9bqz" Mar 10 07:04:21 crc kubenswrapper[4825]: I0310 07:04:21.283355 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/43dacc02-935b-4836-8863-e175536b0cd2-ovsdbserver-nb\") pod \"dnsmasq-dns-675f7dd995-r9bqz\" (UID: \"43dacc02-935b-4836-8863-e175536b0cd2\") " pod="openstack/dnsmasq-dns-675f7dd995-r9bqz" Mar 10 07:04:21 crc kubenswrapper[4825]: I0310 07:04:21.331454 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlfjm\" (UniqueName: \"kubernetes.io/projected/43dacc02-935b-4836-8863-e175536b0cd2-kube-api-access-qlfjm\") pod \"dnsmasq-dns-675f7dd995-r9bqz\" (UID: \"43dacc02-935b-4836-8863-e175536b0cd2\") " pod="openstack/dnsmasq-dns-675f7dd995-r9bqz" Mar 10 07:04:21 crc kubenswrapper[4825]: I0310 07:04:21.367581 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f7dd995-r9bqz" Mar 10 07:04:21 crc kubenswrapper[4825]: I0310 07:04:21.561276 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-795cf8b45c-rb9zl" event={"ID":"73078060-8fff-41f1-baa7-17ce9929faec","Type":"ContainerStarted","Data":"ec326de215a24c2df15b1e284360a058d24ccf18ce7b6475e96b228d6f3a911e"} Mar 10 07:04:21 crc kubenswrapper[4825]: I0310 07:04:21.562299 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-795cf8b45c-rb9zl" Mar 10 07:04:21 crc kubenswrapper[4825]: I0310 07:04:21.567503 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-5lkdr" event={"ID":"ec1ec5e1-8623-412f-a622-52b10a2c798d","Type":"ContainerStarted","Data":"a90dbf51fbed901941157edef9fd7fbe40dc81e7559edb2eac2bdf74e6b8a785"} Mar 10 07:04:21 crc kubenswrapper[4825]: I0310 07:04:21.567627 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b57d9888c-5lkdr" Mar 10 07:04:21 crc kubenswrapper[4825]: I0310 07:04:21.570339 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"d77fced6-dee7-49fc-9088-e173a5be3cee","Type":"ContainerStarted","Data":"e80f70e559e88ab7e4a0b13d4c1292a7c68c490c36f06ac9365c83bd1c1a3bd8"} Mar 10 07:04:21 crc kubenswrapper[4825]: I0310 07:04:21.591780 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-795cf8b45c-rb9zl" podStartSLOduration=3.591762867 podStartE2EDuration="3.591762867s" podCreationTimestamp="2026-03-10 07:04:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:04:21.584637441 +0000 UTC m=+1214.614418056" watchObservedRunningTime="2026-03-10 07:04:21.591762867 +0000 UTC m=+1214.621543482" Mar 10 07:04:21 crc kubenswrapper[4825]: I0310 07:04:21.611278 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b57d9888c-5lkdr" podStartSLOduration=2.611253265 podStartE2EDuration="2.611253265s" podCreationTimestamp="2026-03-10 07:04:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:04:21.610022202 +0000 UTC m=+1214.639802817" watchObservedRunningTime="2026-03-10 07:04:21.611253265 +0000 UTC m=+1214.641033880" Mar 10 07:04:21 crc kubenswrapper[4825]: I0310 07:04:21.954306 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-r9bqz"] Mar 10 07:04:22 crc kubenswrapper[4825]: I0310 07:04:22.096533 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 10 07:04:22 crc kubenswrapper[4825]: I0310 07:04:22.108056 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 10 07:04:22 crc kubenswrapper[4825]: I0310 07:04:22.110037 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 10 07:04:22 crc kubenswrapper[4825]: I0310 07:04:22.110844 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-wbls8" Mar 10 07:04:22 crc kubenswrapper[4825]: I0310 07:04:22.110869 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 10 07:04:22 crc kubenswrapper[4825]: I0310 07:04:22.110919 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 10 07:04:22 crc kubenswrapper[4825]: I0310 07:04:22.110953 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 10 07:04:22 crc kubenswrapper[4825]: I0310 07:04:22.200498 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/632168fa-0532-4c7e-b688-fb361ee89ec8-lock\") pod \"swift-storage-0\" (UID: \"632168fa-0532-4c7e-b688-fb361ee89ec8\") " pod="openstack/swift-storage-0" Mar 10 07:04:22 crc kubenswrapper[4825]: I0310 07:04:22.200561 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/632168fa-0532-4c7e-b688-fb361ee89ec8-etc-swift\") pod \"swift-storage-0\" (UID: \"632168fa-0532-4c7e-b688-fb361ee89ec8\") " pod="openstack/swift-storage-0" Mar 10 07:04:22 crc kubenswrapper[4825]: I0310 07:04:22.200597 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7gvh\" (UniqueName: \"kubernetes.io/projected/632168fa-0532-4c7e-b688-fb361ee89ec8-kube-api-access-v7gvh\") pod \"swift-storage-0\" (UID: \"632168fa-0532-4c7e-b688-fb361ee89ec8\") " 
pod="openstack/swift-storage-0" Mar 10 07:04:22 crc kubenswrapper[4825]: I0310 07:04:22.200620 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"632168fa-0532-4c7e-b688-fb361ee89ec8\") " pod="openstack/swift-storage-0" Mar 10 07:04:22 crc kubenswrapper[4825]: I0310 07:04:22.200703 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/632168fa-0532-4c7e-b688-fb361ee89ec8-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"632168fa-0532-4c7e-b688-fb361ee89ec8\") " pod="openstack/swift-storage-0" Mar 10 07:04:22 crc kubenswrapper[4825]: I0310 07:04:22.200745 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/632168fa-0532-4c7e-b688-fb361ee89ec8-cache\") pod \"swift-storage-0\" (UID: \"632168fa-0532-4c7e-b688-fb361ee89ec8\") " pod="openstack/swift-storage-0" Mar 10 07:04:22 crc kubenswrapper[4825]: I0310 07:04:22.302566 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/632168fa-0532-4c7e-b688-fb361ee89ec8-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"632168fa-0532-4c7e-b688-fb361ee89ec8\") " pod="openstack/swift-storage-0" Mar 10 07:04:22 crc kubenswrapper[4825]: I0310 07:04:22.304665 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/632168fa-0532-4c7e-b688-fb361ee89ec8-cache\") pod \"swift-storage-0\" (UID: \"632168fa-0532-4c7e-b688-fb361ee89ec8\") " pod="openstack/swift-storage-0" Mar 10 07:04:22 crc kubenswrapper[4825]: I0310 07:04:22.304831 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"lock\" (UniqueName: \"kubernetes.io/empty-dir/632168fa-0532-4c7e-b688-fb361ee89ec8-lock\") pod \"swift-storage-0\" (UID: \"632168fa-0532-4c7e-b688-fb361ee89ec8\") " pod="openstack/swift-storage-0" Mar 10 07:04:22 crc kubenswrapper[4825]: I0310 07:04:22.304990 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/632168fa-0532-4c7e-b688-fb361ee89ec8-etc-swift\") pod \"swift-storage-0\" (UID: \"632168fa-0532-4c7e-b688-fb361ee89ec8\") " pod="openstack/swift-storage-0" Mar 10 07:04:22 crc kubenswrapper[4825]: I0310 07:04:22.305092 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7gvh\" (UniqueName: \"kubernetes.io/projected/632168fa-0532-4c7e-b688-fb361ee89ec8-kube-api-access-v7gvh\") pod \"swift-storage-0\" (UID: \"632168fa-0532-4c7e-b688-fb361ee89ec8\") " pod="openstack/swift-storage-0" Mar 10 07:04:22 crc kubenswrapper[4825]: I0310 07:04:22.305234 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"632168fa-0532-4c7e-b688-fb361ee89ec8\") " pod="openstack/swift-storage-0" Mar 10 07:04:22 crc kubenswrapper[4825]: I0310 07:04:22.305640 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"632168fa-0532-4c7e-b688-fb361ee89ec8\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/swift-storage-0" Mar 10 07:04:22 crc kubenswrapper[4825]: I0310 07:04:22.306479 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/632168fa-0532-4c7e-b688-fb361ee89ec8-lock\") pod \"swift-storage-0\" (UID: \"632168fa-0532-4c7e-b688-fb361ee89ec8\") " pod="openstack/swift-storage-0" Mar 10 07:04:22 
crc kubenswrapper[4825]: I0310 07:04:22.306870 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/632168fa-0532-4c7e-b688-fb361ee89ec8-cache\") pod \"swift-storage-0\" (UID: \"632168fa-0532-4c7e-b688-fb361ee89ec8\") " pod="openstack/swift-storage-0" Mar 10 07:04:22 crc kubenswrapper[4825]: E0310 07:04:22.307114 4825 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 10 07:04:22 crc kubenswrapper[4825]: E0310 07:04:22.307153 4825 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 10 07:04:22 crc kubenswrapper[4825]: E0310 07:04:22.307197 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/632168fa-0532-4c7e-b688-fb361ee89ec8-etc-swift podName:632168fa-0532-4c7e-b688-fb361ee89ec8 nodeName:}" failed. No retries permitted until 2026-03-10 07:04:22.807179263 +0000 UTC m=+1215.836959878 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/632168fa-0532-4c7e-b688-fb361ee89ec8-etc-swift") pod "swift-storage-0" (UID: "632168fa-0532-4c7e-b688-fb361ee89ec8") : configmap "swift-ring-files" not found Mar 10 07:04:22 crc kubenswrapper[4825]: I0310 07:04:22.308209 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/632168fa-0532-4c7e-b688-fb361ee89ec8-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"632168fa-0532-4c7e-b688-fb361ee89ec8\") " pod="openstack/swift-storage-0" Mar 10 07:04:22 crc kubenswrapper[4825]: I0310 07:04:22.329996 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7gvh\" (UniqueName: \"kubernetes.io/projected/632168fa-0532-4c7e-b688-fb361ee89ec8-kube-api-access-v7gvh\") pod \"swift-storage-0\" (UID: \"632168fa-0532-4c7e-b688-fb361ee89ec8\") " pod="openstack/swift-storage-0" Mar 10 07:04:22 crc kubenswrapper[4825]: I0310 07:04:22.337555 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"632168fa-0532-4c7e-b688-fb361ee89ec8\") " pod="openstack/swift-storage-0" Mar 10 07:04:22 crc kubenswrapper[4825]: I0310 07:04:22.588808 4825 generic.go:334] "Generic (PLEG): container finished" podID="43dacc02-935b-4836-8863-e175536b0cd2" containerID="4021e97ed12d312ed4f3a61752f965d4945892a16e4d8c80f26b392bf7588282" exitCode=0 Mar 10 07:04:22 crc kubenswrapper[4825]: I0310 07:04:22.588878 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f7dd995-r9bqz" event={"ID":"43dacc02-935b-4836-8863-e175536b0cd2","Type":"ContainerDied","Data":"4021e97ed12d312ed4f3a61752f965d4945892a16e4d8c80f26b392bf7588282"} Mar 10 07:04:22 crc kubenswrapper[4825]: I0310 07:04:22.588909 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-675f7dd995-r9bqz" event={"ID":"43dacc02-935b-4836-8863-e175536b0cd2","Type":"ContainerStarted","Data":"54eafc7ff9ea176efcecca587236b832093c6df9500eb83f8afd60ed5dfb8850"} Mar 10 07:04:22 crc kubenswrapper[4825]: I0310 07:04:22.623230 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d77fced6-dee7-49fc-9088-e173a5be3cee","Type":"ContainerStarted","Data":"4ceda535e48bc00d5f96a25e503044a292df24630498d86c8a415946cacf0904"} Mar 10 07:04:22 crc kubenswrapper[4825]: I0310 07:04:22.623285 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d77fced6-dee7-49fc-9088-e173a5be3cee","Type":"ContainerStarted","Data":"6a4f962505809098311731c7df1e9e9acaa99f0d4d2a16ebefd3083955087398"} Mar 10 07:04:22 crc kubenswrapper[4825]: I0310 07:04:22.624146 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 10 07:04:22 crc kubenswrapper[4825]: I0310 07:04:22.624272 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-795cf8b45c-rb9zl" podUID="73078060-8fff-41f1-baa7-17ce9929faec" containerName="dnsmasq-dns" containerID="cri-o://ec326de215a24c2df15b1e284360a058d24ccf18ce7b6475e96b228d6f3a911e" gracePeriod=10 Mar 10 07:04:22 crc kubenswrapper[4825]: I0310 07:04:22.719833 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.410716712 podStartE2EDuration="3.719803442s" podCreationTimestamp="2026-03-10 07:04:19 +0000 UTC" firstStartedPulling="2026-03-10 07:04:20.723373127 +0000 UTC m=+1213.753153742" lastFinishedPulling="2026-03-10 07:04:22.032459857 +0000 UTC m=+1215.062240472" observedRunningTime="2026-03-10 07:04:22.663944787 +0000 UTC m=+1215.693725402" watchObservedRunningTime="2026-03-10 07:04:22.719803442 +0000 UTC m=+1215.749584057" Mar 10 07:04:22 crc kubenswrapper[4825]: I0310 07:04:22.722301 4825 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-2sxbl"] Mar 10 07:04:22 crc kubenswrapper[4825]: I0310 07:04:22.727424 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-2sxbl" Mar 10 07:04:22 crc kubenswrapper[4825]: I0310 07:04:22.731883 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 10 07:04:22 crc kubenswrapper[4825]: I0310 07:04:22.732097 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 10 07:04:22 crc kubenswrapper[4825]: I0310 07:04:22.734363 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 10 07:04:22 crc kubenswrapper[4825]: I0310 07:04:22.741559 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-2sxbl"] Mar 10 07:04:22 crc kubenswrapper[4825]: I0310 07:04:22.823310 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3b166cbd-0add-4896-92ab-caed991a0a06-scripts\") pod \"swift-ring-rebalance-2sxbl\" (UID: \"3b166cbd-0add-4896-92ab-caed991a0a06\") " pod="openstack/swift-ring-rebalance-2sxbl" Mar 10 07:04:22 crc kubenswrapper[4825]: I0310 07:04:22.823941 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3b166cbd-0add-4896-92ab-caed991a0a06-swiftconf\") pod \"swift-ring-rebalance-2sxbl\" (UID: \"3b166cbd-0add-4896-92ab-caed991a0a06\") " pod="openstack/swift-ring-rebalance-2sxbl" Mar 10 07:04:22 crc kubenswrapper[4825]: I0310 07:04:22.823982 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/3b166cbd-0add-4896-92ab-caed991a0a06-ring-data-devices\") pod \"swift-ring-rebalance-2sxbl\" (UID: \"3b166cbd-0add-4896-92ab-caed991a0a06\") " pod="openstack/swift-ring-rebalance-2sxbl" Mar 10 07:04:22 crc kubenswrapper[4825]: I0310 07:04:22.824021 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3b166cbd-0add-4896-92ab-caed991a0a06-etc-swift\") pod \"swift-ring-rebalance-2sxbl\" (UID: \"3b166cbd-0add-4896-92ab-caed991a0a06\") " pod="openstack/swift-ring-rebalance-2sxbl" Mar 10 07:04:22 crc kubenswrapper[4825]: I0310 07:04:22.824053 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x4mg\" (UniqueName: \"kubernetes.io/projected/3b166cbd-0add-4896-92ab-caed991a0a06-kube-api-access-2x4mg\") pod \"swift-ring-rebalance-2sxbl\" (UID: \"3b166cbd-0add-4896-92ab-caed991a0a06\") " pod="openstack/swift-ring-rebalance-2sxbl" Mar 10 07:04:22 crc kubenswrapper[4825]: I0310 07:04:22.824105 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/632168fa-0532-4c7e-b688-fb361ee89ec8-etc-swift\") pod \"swift-storage-0\" (UID: \"632168fa-0532-4c7e-b688-fb361ee89ec8\") " pod="openstack/swift-storage-0" Mar 10 07:04:22 crc kubenswrapper[4825]: I0310 07:04:22.824189 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3b166cbd-0add-4896-92ab-caed991a0a06-dispersionconf\") pod \"swift-ring-rebalance-2sxbl\" (UID: \"3b166cbd-0add-4896-92ab-caed991a0a06\") " pod="openstack/swift-ring-rebalance-2sxbl" Mar 10 07:04:22 crc kubenswrapper[4825]: I0310 07:04:22.824227 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3b166cbd-0add-4896-92ab-caed991a0a06-combined-ca-bundle\") pod \"swift-ring-rebalance-2sxbl\" (UID: \"3b166cbd-0add-4896-92ab-caed991a0a06\") " pod="openstack/swift-ring-rebalance-2sxbl" Mar 10 07:04:22 crc kubenswrapper[4825]: E0310 07:04:22.824473 4825 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 10 07:04:22 crc kubenswrapper[4825]: E0310 07:04:22.824500 4825 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 10 07:04:22 crc kubenswrapper[4825]: E0310 07:04:22.824547 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/632168fa-0532-4c7e-b688-fb361ee89ec8-etc-swift podName:632168fa-0532-4c7e-b688-fb361ee89ec8 nodeName:}" failed. No retries permitted until 2026-03-10 07:04:23.82452936 +0000 UTC m=+1216.854309975 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/632168fa-0532-4c7e-b688-fb361ee89ec8-etc-swift") pod "swift-storage-0" (UID: "632168fa-0532-4c7e-b688-fb361ee89ec8") : configmap "swift-ring-files" not found Mar 10 07:04:23 crc kubenswrapper[4825]: I0310 07:04:22.926168 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3b166cbd-0add-4896-92ab-caed991a0a06-etc-swift\") pod \"swift-ring-rebalance-2sxbl\" (UID: \"3b166cbd-0add-4896-92ab-caed991a0a06\") " pod="openstack/swift-ring-rebalance-2sxbl" Mar 10 07:04:23 crc kubenswrapper[4825]: I0310 07:04:22.926208 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x4mg\" (UniqueName: \"kubernetes.io/projected/3b166cbd-0add-4896-92ab-caed991a0a06-kube-api-access-2x4mg\") pod \"swift-ring-rebalance-2sxbl\" (UID: \"3b166cbd-0add-4896-92ab-caed991a0a06\") " pod="openstack/swift-ring-rebalance-2sxbl" Mar 
10 07:04:23 crc kubenswrapper[4825]: I0310 07:04:22.926279 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3b166cbd-0add-4896-92ab-caed991a0a06-dispersionconf\") pod \"swift-ring-rebalance-2sxbl\" (UID: \"3b166cbd-0add-4896-92ab-caed991a0a06\") " pod="openstack/swift-ring-rebalance-2sxbl" Mar 10 07:04:23 crc kubenswrapper[4825]: I0310 07:04:22.926295 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b166cbd-0add-4896-92ab-caed991a0a06-combined-ca-bundle\") pod \"swift-ring-rebalance-2sxbl\" (UID: \"3b166cbd-0add-4896-92ab-caed991a0a06\") " pod="openstack/swift-ring-rebalance-2sxbl" Mar 10 07:04:23 crc kubenswrapper[4825]: I0310 07:04:22.926333 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3b166cbd-0add-4896-92ab-caed991a0a06-scripts\") pod \"swift-ring-rebalance-2sxbl\" (UID: \"3b166cbd-0add-4896-92ab-caed991a0a06\") " pod="openstack/swift-ring-rebalance-2sxbl" Mar 10 07:04:23 crc kubenswrapper[4825]: I0310 07:04:22.926385 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3b166cbd-0add-4896-92ab-caed991a0a06-swiftconf\") pod \"swift-ring-rebalance-2sxbl\" (UID: \"3b166cbd-0add-4896-92ab-caed991a0a06\") " pod="openstack/swift-ring-rebalance-2sxbl" Mar 10 07:04:23 crc kubenswrapper[4825]: I0310 07:04:22.926406 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3b166cbd-0add-4896-92ab-caed991a0a06-ring-data-devices\") pod \"swift-ring-rebalance-2sxbl\" (UID: \"3b166cbd-0add-4896-92ab-caed991a0a06\") " pod="openstack/swift-ring-rebalance-2sxbl" Mar 10 07:04:23 crc kubenswrapper[4825]: I0310 07:04:22.927540 4825 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3b166cbd-0add-4896-92ab-caed991a0a06-scripts\") pod \"swift-ring-rebalance-2sxbl\" (UID: \"3b166cbd-0add-4896-92ab-caed991a0a06\") " pod="openstack/swift-ring-rebalance-2sxbl" Mar 10 07:04:23 crc kubenswrapper[4825]: I0310 07:04:22.928334 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3b166cbd-0add-4896-92ab-caed991a0a06-etc-swift\") pod \"swift-ring-rebalance-2sxbl\" (UID: \"3b166cbd-0add-4896-92ab-caed991a0a06\") " pod="openstack/swift-ring-rebalance-2sxbl" Mar 10 07:04:23 crc kubenswrapper[4825]: I0310 07:04:22.928760 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3b166cbd-0add-4896-92ab-caed991a0a06-ring-data-devices\") pod \"swift-ring-rebalance-2sxbl\" (UID: \"3b166cbd-0add-4896-92ab-caed991a0a06\") " pod="openstack/swift-ring-rebalance-2sxbl" Mar 10 07:04:23 crc kubenswrapper[4825]: I0310 07:04:22.932372 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b166cbd-0add-4896-92ab-caed991a0a06-combined-ca-bundle\") pod \"swift-ring-rebalance-2sxbl\" (UID: \"3b166cbd-0add-4896-92ab-caed991a0a06\") " pod="openstack/swift-ring-rebalance-2sxbl" Mar 10 07:04:23 crc kubenswrapper[4825]: I0310 07:04:22.932585 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3b166cbd-0add-4896-92ab-caed991a0a06-swiftconf\") pod \"swift-ring-rebalance-2sxbl\" (UID: \"3b166cbd-0add-4896-92ab-caed991a0a06\") " pod="openstack/swift-ring-rebalance-2sxbl" Mar 10 07:04:23 crc kubenswrapper[4825]: I0310 07:04:22.933012 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/3b166cbd-0add-4896-92ab-caed991a0a06-dispersionconf\") pod \"swift-ring-rebalance-2sxbl\" (UID: \"3b166cbd-0add-4896-92ab-caed991a0a06\") " pod="openstack/swift-ring-rebalance-2sxbl" Mar 10 07:04:23 crc kubenswrapper[4825]: I0310 07:04:22.953051 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x4mg\" (UniqueName: \"kubernetes.io/projected/3b166cbd-0add-4896-92ab-caed991a0a06-kube-api-access-2x4mg\") pod \"swift-ring-rebalance-2sxbl\" (UID: \"3b166cbd-0add-4896-92ab-caed991a0a06\") " pod="openstack/swift-ring-rebalance-2sxbl" Mar 10 07:04:23 crc kubenswrapper[4825]: I0310 07:04:23.042866 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-2sxbl" Mar 10 07:04:23 crc kubenswrapper[4825]: I0310 07:04:23.127035 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-795cf8b45c-rb9zl" Mar 10 07:04:23 crc kubenswrapper[4825]: I0310 07:04:23.230033 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mrwb\" (UniqueName: \"kubernetes.io/projected/73078060-8fff-41f1-baa7-17ce9929faec-kube-api-access-8mrwb\") pod \"73078060-8fff-41f1-baa7-17ce9929faec\" (UID: \"73078060-8fff-41f1-baa7-17ce9929faec\") " Mar 10 07:04:23 crc kubenswrapper[4825]: I0310 07:04:23.230409 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73078060-8fff-41f1-baa7-17ce9929faec-dns-svc\") pod \"73078060-8fff-41f1-baa7-17ce9929faec\" (UID: \"73078060-8fff-41f1-baa7-17ce9929faec\") " Mar 10 07:04:23 crc kubenswrapper[4825]: I0310 07:04:23.230464 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73078060-8fff-41f1-baa7-17ce9929faec-config\") pod \"73078060-8fff-41f1-baa7-17ce9929faec\" (UID: 
\"73078060-8fff-41f1-baa7-17ce9929faec\") " Mar 10 07:04:23 crc kubenswrapper[4825]: I0310 07:04:23.230509 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73078060-8fff-41f1-baa7-17ce9929faec-ovsdbserver-sb\") pod \"73078060-8fff-41f1-baa7-17ce9929faec\" (UID: \"73078060-8fff-41f1-baa7-17ce9929faec\") " Mar 10 07:04:23 crc kubenswrapper[4825]: I0310 07:04:23.234615 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73078060-8fff-41f1-baa7-17ce9929faec-kube-api-access-8mrwb" (OuterVolumeSpecName: "kube-api-access-8mrwb") pod "73078060-8fff-41f1-baa7-17ce9929faec" (UID: "73078060-8fff-41f1-baa7-17ce9929faec"). InnerVolumeSpecName "kube-api-access-8mrwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:04:23 crc kubenswrapper[4825]: I0310 07:04:23.280822 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73078060-8fff-41f1-baa7-17ce9929faec-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "73078060-8fff-41f1-baa7-17ce9929faec" (UID: "73078060-8fff-41f1-baa7-17ce9929faec"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:04:23 crc kubenswrapper[4825]: I0310 07:04:23.283864 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73078060-8fff-41f1-baa7-17ce9929faec-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "73078060-8fff-41f1-baa7-17ce9929faec" (UID: "73078060-8fff-41f1-baa7-17ce9929faec"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:04:23 crc kubenswrapper[4825]: I0310 07:04:23.312705 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73078060-8fff-41f1-baa7-17ce9929faec-config" (OuterVolumeSpecName: "config") pod "73078060-8fff-41f1-baa7-17ce9929faec" (UID: "73078060-8fff-41f1-baa7-17ce9929faec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:04:23 crc kubenswrapper[4825]: I0310 07:04:23.332901 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mrwb\" (UniqueName: \"kubernetes.io/projected/73078060-8fff-41f1-baa7-17ce9929faec-kube-api-access-8mrwb\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:23 crc kubenswrapper[4825]: I0310 07:04:23.332919 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73078060-8fff-41f1-baa7-17ce9929faec-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:23 crc kubenswrapper[4825]: I0310 07:04:23.332929 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73078060-8fff-41f1-baa7-17ce9929faec-config\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:23 crc kubenswrapper[4825]: I0310 07:04:23.332936 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73078060-8fff-41f1-baa7-17ce9929faec-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:23 crc kubenswrapper[4825]: I0310 07:04:23.640633 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f7dd995-r9bqz" event={"ID":"43dacc02-935b-4836-8863-e175536b0cd2","Type":"ContainerStarted","Data":"e38be62763739c1a7ecde94467ed37f7af960422776051d7c021bd000a095079"} Mar 10 07:04:23 crc kubenswrapper[4825]: I0310 07:04:23.641201 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-675f7dd995-r9bqz" Mar 10 07:04:23 crc kubenswrapper[4825]: I0310 07:04:23.649325 4825 generic.go:334] "Generic (PLEG): container finished" podID="73078060-8fff-41f1-baa7-17ce9929faec" containerID="ec326de215a24c2df15b1e284360a058d24ccf18ce7b6475e96b228d6f3a911e" exitCode=0 Mar 10 07:04:23 crc kubenswrapper[4825]: I0310 07:04:23.649634 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-795cf8b45c-rb9zl" Mar 10 07:04:23 crc kubenswrapper[4825]: I0310 07:04:23.650099 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-795cf8b45c-rb9zl" event={"ID":"73078060-8fff-41f1-baa7-17ce9929faec","Type":"ContainerDied","Data":"ec326de215a24c2df15b1e284360a058d24ccf18ce7b6475e96b228d6f3a911e"} Mar 10 07:04:23 crc kubenswrapper[4825]: I0310 07:04:23.650153 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-795cf8b45c-rb9zl" event={"ID":"73078060-8fff-41f1-baa7-17ce9929faec","Type":"ContainerDied","Data":"2731dd3d98a0adbc0a0638a538d118e1e7a216ef225513c0a458250a6b7ef570"} Mar 10 07:04:23 crc kubenswrapper[4825]: I0310 07:04:23.650182 4825 scope.go:117] "RemoveContainer" containerID="ec326de215a24c2df15b1e284360a058d24ccf18ce7b6475e96b228d6f3a911e" Mar 10 07:04:23 crc kubenswrapper[4825]: I0310 07:04:23.684295 4825 scope.go:117] "RemoveContainer" containerID="e46942393fa8f3de1b8f8f0eb176b04af90ff2f0dd813223c2609739a7cd6f00" Mar 10 07:04:23 crc kubenswrapper[4825]: I0310 07:04:23.686253 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-675f7dd995-r9bqz" podStartSLOduration=2.686232047 podStartE2EDuration="2.686232047s" podCreationTimestamp="2026-03-10 07:04:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:04:23.662515899 +0000 UTC m=+1216.692296534" watchObservedRunningTime="2026-03-10 
07:04:23.686232047 +0000 UTC m=+1216.716012662" Mar 10 07:04:23 crc kubenswrapper[4825]: I0310 07:04:23.706190 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-795cf8b45c-rb9zl"] Mar 10 07:04:23 crc kubenswrapper[4825]: I0310 07:04:23.713680 4825 scope.go:117] "RemoveContainer" containerID="ec326de215a24c2df15b1e284360a058d24ccf18ce7b6475e96b228d6f3a911e" Mar 10 07:04:23 crc kubenswrapper[4825]: I0310 07:04:23.713819 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-795cf8b45c-rb9zl"] Mar 10 07:04:23 crc kubenswrapper[4825]: E0310 07:04:23.714329 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec326de215a24c2df15b1e284360a058d24ccf18ce7b6475e96b228d6f3a911e\": container with ID starting with ec326de215a24c2df15b1e284360a058d24ccf18ce7b6475e96b228d6f3a911e not found: ID does not exist" containerID="ec326de215a24c2df15b1e284360a058d24ccf18ce7b6475e96b228d6f3a911e" Mar 10 07:04:23 crc kubenswrapper[4825]: I0310 07:04:23.714364 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec326de215a24c2df15b1e284360a058d24ccf18ce7b6475e96b228d6f3a911e"} err="failed to get container status \"ec326de215a24c2df15b1e284360a058d24ccf18ce7b6475e96b228d6f3a911e\": rpc error: code = NotFound desc = could not find container \"ec326de215a24c2df15b1e284360a058d24ccf18ce7b6475e96b228d6f3a911e\": container with ID starting with ec326de215a24c2df15b1e284360a058d24ccf18ce7b6475e96b228d6f3a911e not found: ID does not exist" Mar 10 07:04:23 crc kubenswrapper[4825]: I0310 07:04:23.714399 4825 scope.go:117] "RemoveContainer" containerID="e46942393fa8f3de1b8f8f0eb176b04af90ff2f0dd813223c2609739a7cd6f00" Mar 10 07:04:23 crc kubenswrapper[4825]: E0310 07:04:23.714967 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e46942393fa8f3de1b8f8f0eb176b04af90ff2f0dd813223c2609739a7cd6f00\": container with ID starting with e46942393fa8f3de1b8f8f0eb176b04af90ff2f0dd813223c2609739a7cd6f00 not found: ID does not exist" containerID="e46942393fa8f3de1b8f8f0eb176b04af90ff2f0dd813223c2609739a7cd6f00" Mar 10 07:04:23 crc kubenswrapper[4825]: I0310 07:04:23.715054 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e46942393fa8f3de1b8f8f0eb176b04af90ff2f0dd813223c2609739a7cd6f00"} err="failed to get container status \"e46942393fa8f3de1b8f8f0eb176b04af90ff2f0dd813223c2609739a7cd6f00\": rpc error: code = NotFound desc = could not find container \"e46942393fa8f3de1b8f8f0eb176b04af90ff2f0dd813223c2609739a7cd6f00\": container with ID starting with e46942393fa8f3de1b8f8f0eb176b04af90ff2f0dd813223c2609739a7cd6f00 not found: ID does not exist" Mar 10 07:04:23 crc kubenswrapper[4825]: I0310 07:04:23.844684 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/632168fa-0532-4c7e-b688-fb361ee89ec8-etc-swift\") pod \"swift-storage-0\" (UID: \"632168fa-0532-4c7e-b688-fb361ee89ec8\") " pod="openstack/swift-storage-0" Mar 10 07:04:23 crc kubenswrapper[4825]: E0310 07:04:23.845077 4825 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 10 07:04:23 crc kubenswrapper[4825]: E0310 07:04:23.845099 4825 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 10 07:04:23 crc kubenswrapper[4825]: E0310 07:04:23.845172 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/632168fa-0532-4c7e-b688-fb361ee89ec8-etc-swift podName:632168fa-0532-4c7e-b688-fb361ee89ec8 nodeName:}" failed. No retries permitted until 2026-03-10 07:04:25.845155477 +0000 UTC m=+1218.874936092 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/632168fa-0532-4c7e-b688-fb361ee89ec8-etc-swift") pod "swift-storage-0" (UID: "632168fa-0532-4c7e-b688-fb361ee89ec8") : configmap "swift-ring-files" not found
Mar 10 07:04:23 crc kubenswrapper[4825]: I0310 07:04:23.868297 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-2sxbl"]
Mar 10 07:04:23 crc kubenswrapper[4825]: W0310 07:04:23.872472 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b166cbd_0add_4896_92ab_caed991a0a06.slice/crio-471a479090ebfe92bcdcc8dac58152fd527bc9056890cd6225b60f3d237edb51 WatchSource:0}: Error finding container 471a479090ebfe92bcdcc8dac58152fd527bc9056890cd6225b60f3d237edb51: Status 404 returned error can't find the container with id 471a479090ebfe92bcdcc8dac58152fd527bc9056890cd6225b60f3d237edb51
Mar 10 07:04:24 crc kubenswrapper[4825]: I0310 07:04:24.658660 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2sxbl" event={"ID":"3b166cbd-0add-4896-92ab-caed991a0a06","Type":"ContainerStarted","Data":"471a479090ebfe92bcdcc8dac58152fd527bc9056890cd6225b60f3d237edb51"}
Mar 10 07:04:25 crc kubenswrapper[4825]: I0310 07:04:25.248113 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73078060-8fff-41f1-baa7-17ce9929faec" path="/var/lib/kubelet/pods/73078060-8fff-41f1-baa7-17ce9929faec/volumes"
Mar 10 07:04:25 crc kubenswrapper[4825]: I0310 07:04:25.887334 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/632168fa-0532-4c7e-b688-fb361ee89ec8-etc-swift\") pod \"swift-storage-0\" (UID: \"632168fa-0532-4c7e-b688-fb361ee89ec8\") " pod="openstack/swift-storage-0"
Mar 10 07:04:25 crc kubenswrapper[4825]: E0310 07:04:25.887550 4825 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 10 07:04:25 crc kubenswrapper[4825]: E0310 07:04:25.887566 4825 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 10 07:04:25 crc kubenswrapper[4825]: E0310 07:04:25.887606 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/632168fa-0532-4c7e-b688-fb361ee89ec8-etc-swift podName:632168fa-0532-4c7e-b688-fb361ee89ec8 nodeName:}" failed. No retries permitted until 2026-03-10 07:04:29.8875919 +0000 UTC m=+1222.917372515 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/632168fa-0532-4c7e-b688-fb361ee89ec8-etc-swift") pod "swift-storage-0" (UID: "632168fa-0532-4c7e-b688-fb361ee89ec8") : configmap "swift-ring-files" not found
Mar 10 07:04:26 crc kubenswrapper[4825]: I0310 07:04:26.883022 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Mar 10 07:04:26 crc kubenswrapper[4825]: I0310 07:04:26.883449 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Mar 10 07:04:27 crc kubenswrapper[4825]: I0310 07:04:27.018813 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Mar 10 07:04:27 crc kubenswrapper[4825]: I0310 07:04:27.693338 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2sxbl" event={"ID":"3b166cbd-0add-4896-92ab-caed991a0a06","Type":"ContainerStarted","Data":"7f5c45f549267da3853bd7ddfebfbea9374ef2c39b83d88180442cd2939efb42"}
Mar 10 07:04:27 crc kubenswrapper[4825]: I0310 07:04:27.723282 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-2sxbl" podStartSLOduration=2.3405083810000002 podStartE2EDuration="5.723259149s" podCreationTimestamp="2026-03-10 07:04:22 +0000 UTC" firstStartedPulling="2026-03-10 07:04:23.874731027 +0000 UTC m=+1216.904511642" lastFinishedPulling="2026-03-10 07:04:27.257481785 +0000 UTC m=+1220.287262410" observedRunningTime="2026-03-10 07:04:27.721162484 +0000 UTC m=+1220.750943109" watchObservedRunningTime="2026-03-10 07:04:27.723259149 +0000 UTC m=+1220.753039774"
Mar 10 07:04:27 crc kubenswrapper[4825]: I0310 07:04:27.817924 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Mar 10 07:04:28 crc kubenswrapper[4825]: I0310 07:04:28.339977 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Mar 10 07:04:28 crc kubenswrapper[4825]: I0310 07:04:28.340029 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Mar 10 07:04:28 crc kubenswrapper[4825]: I0310 07:04:28.453305 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Mar 10 07:04:28 crc kubenswrapper[4825]: I0310 07:04:28.801371 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Mar 10 07:04:29 crc kubenswrapper[4825]: I0310 07:04:29.009428 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-e713-account-create-update-8c9hv"]
Mar 10 07:04:29 crc kubenswrapper[4825]: E0310 07:04:29.009853 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73078060-8fff-41f1-baa7-17ce9929faec" containerName="dnsmasq-dns"
Mar 10 07:04:29 crc kubenswrapper[4825]: I0310 07:04:29.009875 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="73078060-8fff-41f1-baa7-17ce9929faec" containerName="dnsmasq-dns"
Mar 10 07:04:29 crc kubenswrapper[4825]: E0310 07:04:29.009896 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73078060-8fff-41f1-baa7-17ce9929faec" containerName="init"
Mar 10 07:04:29 crc kubenswrapper[4825]: I0310 07:04:29.009905 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="73078060-8fff-41f1-baa7-17ce9929faec" containerName="init"
Mar 10 07:04:29 crc kubenswrapper[4825]: I0310 07:04:29.011342 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="73078060-8fff-41f1-baa7-17ce9929faec" containerName="dnsmasq-dns"
Mar 10 07:04:29 crc kubenswrapper[4825]: I0310 07:04:29.012055 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-e713-account-create-update-8c9hv"
Mar 10 07:04:29 crc kubenswrapper[4825]: I0310 07:04:29.019582 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Mar 10 07:04:29 crc kubenswrapper[4825]: I0310 07:04:29.024188 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-vd86v"]
Mar 10 07:04:29 crc kubenswrapper[4825]: I0310 07:04:29.027473 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-vd86v"
Mar 10 07:04:29 crc kubenswrapper[4825]: I0310 07:04:29.036036 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-vd86v"]
Mar 10 07:04:29 crc kubenswrapper[4825]: I0310 07:04:29.050883 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-e713-account-create-update-8c9hv"]
Mar 10 07:04:29 crc kubenswrapper[4825]: I0310 07:04:29.057952 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sc8s\" (UniqueName: \"kubernetes.io/projected/10fac422-52dd-409a-adb6-156fe67b68d3-kube-api-access-6sc8s\") pod \"glance-e713-account-create-update-8c9hv\" (UID: \"10fac422-52dd-409a-adb6-156fe67b68d3\") " pod="openstack/glance-e713-account-create-update-8c9hv"
Mar 10 07:04:29 crc kubenswrapper[4825]: I0310 07:04:29.058008 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzsfs\" (UniqueName: \"kubernetes.io/projected/758e4473-67ca-4d59-bf8c-3584b92a8663-kube-api-access-bzsfs\") pod \"glance-db-create-vd86v\" (UID: \"758e4473-67ca-4d59-bf8c-3584b92a8663\") " pod="openstack/glance-db-create-vd86v"
Mar 10 07:04:29 crc kubenswrapper[4825]: I0310 07:04:29.058076 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10fac422-52dd-409a-adb6-156fe67b68d3-operator-scripts\") pod \"glance-e713-account-create-update-8c9hv\" (UID: \"10fac422-52dd-409a-adb6-156fe67b68d3\") " pod="openstack/glance-e713-account-create-update-8c9hv"
Mar 10 07:04:29 crc kubenswrapper[4825]: I0310 07:04:29.058204 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/758e4473-67ca-4d59-bf8c-3584b92a8663-operator-scripts\") pod \"glance-db-create-vd86v\" (UID: \"758e4473-67ca-4d59-bf8c-3584b92a8663\") " pod="openstack/glance-db-create-vd86v"
Mar 10 07:04:29 crc kubenswrapper[4825]: I0310 07:04:29.160033 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sc8s\" (UniqueName: \"kubernetes.io/projected/10fac422-52dd-409a-adb6-156fe67b68d3-kube-api-access-6sc8s\") pod \"glance-e713-account-create-update-8c9hv\" (UID: \"10fac422-52dd-409a-adb6-156fe67b68d3\") " pod="openstack/glance-e713-account-create-update-8c9hv"
Mar 10 07:04:29 crc kubenswrapper[4825]: I0310 07:04:29.160099 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzsfs\" (UniqueName: \"kubernetes.io/projected/758e4473-67ca-4d59-bf8c-3584b92a8663-kube-api-access-bzsfs\") pod \"glance-db-create-vd86v\" (UID: \"758e4473-67ca-4d59-bf8c-3584b92a8663\") " pod="openstack/glance-db-create-vd86v"
Mar 10 07:04:29 crc kubenswrapper[4825]: I0310 07:04:29.160170 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10fac422-52dd-409a-adb6-156fe67b68d3-operator-scripts\") pod \"glance-e713-account-create-update-8c9hv\" (UID: \"10fac422-52dd-409a-adb6-156fe67b68d3\") " pod="openstack/glance-e713-account-create-update-8c9hv"
Mar 10 07:04:29 crc kubenswrapper[4825]: I0310 07:04:29.160342 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/758e4473-67ca-4d59-bf8c-3584b92a8663-operator-scripts\") pod \"glance-db-create-vd86v\" (UID: \"758e4473-67ca-4d59-bf8c-3584b92a8663\") " pod="openstack/glance-db-create-vd86v"
Mar 10 07:04:29 crc kubenswrapper[4825]: I0310 07:04:29.165420 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/758e4473-67ca-4d59-bf8c-3584b92a8663-operator-scripts\") pod \"glance-db-create-vd86v\" (UID: \"758e4473-67ca-4d59-bf8c-3584b92a8663\") " pod="openstack/glance-db-create-vd86v"
Mar 10 07:04:29 crc kubenswrapper[4825]: I0310 07:04:29.182729 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10fac422-52dd-409a-adb6-156fe67b68d3-operator-scripts\") pod \"glance-e713-account-create-update-8c9hv\" (UID: \"10fac422-52dd-409a-adb6-156fe67b68d3\") " pod="openstack/glance-e713-account-create-update-8c9hv"
Mar 10 07:04:29 crc kubenswrapper[4825]: I0310 07:04:29.209274 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzsfs\" (UniqueName: \"kubernetes.io/projected/758e4473-67ca-4d59-bf8c-3584b92a8663-kube-api-access-bzsfs\") pod \"glance-db-create-vd86v\" (UID: \"758e4473-67ca-4d59-bf8c-3584b92a8663\") " pod="openstack/glance-db-create-vd86v"
Mar 10 07:04:29 crc kubenswrapper[4825]: I0310 07:04:29.217393 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sc8s\" (UniqueName: \"kubernetes.io/projected/10fac422-52dd-409a-adb6-156fe67b68d3-kube-api-access-6sc8s\") pod \"glance-e713-account-create-update-8c9hv\" (UID: \"10fac422-52dd-409a-adb6-156fe67b68d3\") " pod="openstack/glance-e713-account-create-update-8c9hv"
Mar 10 07:04:29 crc kubenswrapper[4825]: I0310 07:04:29.335118 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-e713-account-create-update-8c9hv"
Mar 10 07:04:29 crc kubenswrapper[4825]: I0310 07:04:29.372422 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-vd86v"
Mar 10 07:04:29 crc kubenswrapper[4825]: I0310 07:04:29.575556 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b57d9888c-5lkdr"
Mar 10 07:04:29 crc kubenswrapper[4825]: I0310 07:04:29.788749 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-gpq78"]
Mar 10 07:04:29 crc kubenswrapper[4825]: I0310 07:04:29.790488 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-gpq78"
Mar 10 07:04:29 crc kubenswrapper[4825]: I0310 07:04:29.797103 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-gpq78"]
Mar 10 07:04:29 crc kubenswrapper[4825]: I0310 07:04:29.840179 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-e713-account-create-update-8c9hv"]
Mar 10 07:04:29 crc kubenswrapper[4825]: I0310 07:04:29.864273 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5fb2-account-create-update-vpbd9"]
Mar 10 07:04:29 crc kubenswrapper[4825]: I0310 07:04:29.865858 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5fb2-account-create-update-vpbd9"
Mar 10 07:04:29 crc kubenswrapper[4825]: I0310 07:04:29.880698 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5fb2-account-create-update-vpbd9"]
Mar 10 07:04:29 crc kubenswrapper[4825]: I0310 07:04:29.883540 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncp8d\" (UniqueName: \"kubernetes.io/projected/fafc3bbc-dcca-4650-8c76-e1bbda061f75-kube-api-access-ncp8d\") pod \"keystone-db-create-gpq78\" (UID: \"fafc3bbc-dcca-4650-8c76-e1bbda061f75\") " pod="openstack/keystone-db-create-gpq78"
Mar 10 07:04:29 crc kubenswrapper[4825]: I0310 07:04:29.883813 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fafc3bbc-dcca-4650-8c76-e1bbda061f75-operator-scripts\") pod \"keystone-db-create-gpq78\" (UID: \"fafc3bbc-dcca-4650-8c76-e1bbda061f75\") " pod="openstack/keystone-db-create-gpq78"
Mar 10 07:04:29 crc kubenswrapper[4825]: I0310 07:04:29.883881 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8530cfd9-9a8b-4d03-93cd-52edad33a965-operator-scripts\") pod \"keystone-5fb2-account-create-update-vpbd9\" (UID: \"8530cfd9-9a8b-4d03-93cd-52edad33a965\") " pod="openstack/keystone-5fb2-account-create-update-vpbd9"
Mar 10 07:04:29 crc kubenswrapper[4825]: I0310 07:04:29.884157 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvq7n\" (UniqueName: \"kubernetes.io/projected/8530cfd9-9a8b-4d03-93cd-52edad33a965-kube-api-access-jvq7n\") pod \"keystone-5fb2-account-create-update-vpbd9\" (UID: \"8530cfd9-9a8b-4d03-93cd-52edad33a965\") " pod="openstack/keystone-5fb2-account-create-update-vpbd9"
Mar 10 07:04:29 crc kubenswrapper[4825]: I0310 07:04:29.913036 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Mar 10 07:04:29 crc kubenswrapper[4825]: I0310 07:04:29.928324 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-vd86v"]
Mar 10 07:04:29 crc kubenswrapper[4825]: I0310 07:04:29.986991 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fafc3bbc-dcca-4650-8c76-e1bbda061f75-operator-scripts\") pod \"keystone-db-create-gpq78\" (UID: \"fafc3bbc-dcca-4650-8c76-e1bbda061f75\") " pod="openstack/keystone-db-create-gpq78"
Mar 10 07:04:29 crc kubenswrapper[4825]: I0310 07:04:29.987044 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8530cfd9-9a8b-4d03-93cd-52edad33a965-operator-scripts\") pod \"keystone-5fb2-account-create-update-vpbd9\" (UID: \"8530cfd9-9a8b-4d03-93cd-52edad33a965\") " pod="openstack/keystone-5fb2-account-create-update-vpbd9"
Mar 10 07:04:29 crc kubenswrapper[4825]: I0310 07:04:29.987111 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvq7n\" (UniqueName: \"kubernetes.io/projected/8530cfd9-9a8b-4d03-93cd-52edad33a965-kube-api-access-jvq7n\") pod \"keystone-5fb2-account-create-update-vpbd9\" (UID: \"8530cfd9-9a8b-4d03-93cd-52edad33a965\") " pod="openstack/keystone-5fb2-account-create-update-vpbd9"
Mar 10 07:04:29 crc kubenswrapper[4825]: I0310 07:04:29.987181 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/632168fa-0532-4c7e-b688-fb361ee89ec8-etc-swift\") pod \"swift-storage-0\" (UID: \"632168fa-0532-4c7e-b688-fb361ee89ec8\") " pod="openstack/swift-storage-0"
Mar 10 07:04:29 crc kubenswrapper[4825]: I0310 07:04:29.987203 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncp8d\" (UniqueName: \"kubernetes.io/projected/fafc3bbc-dcca-4650-8c76-e1bbda061f75-kube-api-access-ncp8d\") pod \"keystone-db-create-gpq78\" (UID: \"fafc3bbc-dcca-4650-8c76-e1bbda061f75\") " pod="openstack/keystone-db-create-gpq78"
Mar 10 07:04:29 crc kubenswrapper[4825]: E0310 07:04:29.988280 4825 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 10 07:04:29 crc kubenswrapper[4825]: E0310 07:04:29.988316 4825 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 10 07:04:29 crc kubenswrapper[4825]: E0310 07:04:29.988381 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/632168fa-0532-4c7e-b688-fb361ee89ec8-etc-swift podName:632168fa-0532-4c7e-b688-fb361ee89ec8 nodeName:}" failed. No retries permitted until 2026-03-10 07:04:37.988360063 +0000 UTC m=+1231.018140688 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/632168fa-0532-4c7e-b688-fb361ee89ec8-etc-swift") pod "swift-storage-0" (UID: "632168fa-0532-4c7e-b688-fb361ee89ec8") : configmap "swift-ring-files" not found
Mar 10 07:04:29 crc kubenswrapper[4825]: I0310 07:04:29.988394 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fafc3bbc-dcca-4650-8c76-e1bbda061f75-operator-scripts\") pod \"keystone-db-create-gpq78\" (UID: \"fafc3bbc-dcca-4650-8c76-e1bbda061f75\") " pod="openstack/keystone-db-create-gpq78"
Mar 10 07:04:29 crc kubenswrapper[4825]: I0310 07:04:29.989745 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8530cfd9-9a8b-4d03-93cd-52edad33a965-operator-scripts\") pod \"keystone-5fb2-account-create-update-vpbd9\" (UID: \"8530cfd9-9a8b-4d03-93cd-52edad33a965\") " pod="openstack/keystone-5fb2-account-create-update-vpbd9"
Mar 10 07:04:30 crc kubenswrapper[4825]: I0310 07:04:30.008467 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncp8d\" (UniqueName: \"kubernetes.io/projected/fafc3bbc-dcca-4650-8c76-e1bbda061f75-kube-api-access-ncp8d\") pod \"keystone-db-create-gpq78\" (UID: \"fafc3bbc-dcca-4650-8c76-e1bbda061f75\") " pod="openstack/keystone-db-create-gpq78"
Mar 10 07:04:30 crc kubenswrapper[4825]: I0310 07:04:30.008959 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvq7n\" (UniqueName: \"kubernetes.io/projected/8530cfd9-9a8b-4d03-93cd-52edad33a965-kube-api-access-jvq7n\") pod \"keystone-5fb2-account-create-update-vpbd9\" (UID: \"8530cfd9-9a8b-4d03-93cd-52edad33a965\") " pod="openstack/keystone-5fb2-account-create-update-vpbd9"
Mar 10 07:04:30 crc kubenswrapper[4825]: I0310 07:04:30.068305 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-2jsr2"]
Mar 10 07:04:30 crc kubenswrapper[4825]: I0310 07:04:30.069693 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-2jsr2"
Mar 10 07:04:30 crc kubenswrapper[4825]: I0310 07:04:30.077012 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-1e7f-account-create-update-cxt8s"]
Mar 10 07:04:30 crc kubenswrapper[4825]: I0310 07:04:30.078092 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-1e7f-account-create-update-cxt8s"
Mar 10 07:04:30 crc kubenswrapper[4825]: I0310 07:04:30.079689 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Mar 10 07:04:30 crc kubenswrapper[4825]: I0310 07:04:30.090772 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swpr8\" (UniqueName: \"kubernetes.io/projected/f3950c3f-9b1c-495c-a837-9d2e13500042-kube-api-access-swpr8\") pod \"placement-1e7f-account-create-update-cxt8s\" (UID: \"f3950c3f-9b1c-495c-a837-9d2e13500042\") " pod="openstack/placement-1e7f-account-create-update-cxt8s"
Mar 10 07:04:30 crc kubenswrapper[4825]: I0310 07:04:30.090866 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75tg5\" (UniqueName: \"kubernetes.io/projected/7d35d77e-fb8b-4aff-82f2-be95d6eae583-kube-api-access-75tg5\") pod \"placement-db-create-2jsr2\" (UID: \"7d35d77e-fb8b-4aff-82f2-be95d6eae583\") " pod="openstack/placement-db-create-2jsr2"
Mar 10 07:04:30 crc kubenswrapper[4825]: I0310 07:04:30.090926 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d35d77e-fb8b-4aff-82f2-be95d6eae583-operator-scripts\") pod \"placement-db-create-2jsr2\" (UID: \"7d35d77e-fb8b-4aff-82f2-be95d6eae583\") " pod="openstack/placement-db-create-2jsr2"
Mar 10 07:04:30 crc kubenswrapper[4825]: I0310 07:04:30.090986 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3950c3f-9b1c-495c-a837-9d2e13500042-operator-scripts\") pod \"placement-1e7f-account-create-update-cxt8s\" (UID: \"f3950c3f-9b1c-495c-a837-9d2e13500042\") " pod="openstack/placement-1e7f-account-create-update-cxt8s"
Mar 10 07:04:30 crc kubenswrapper[4825]: I0310 07:04:30.091089 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-2jsr2"]
Mar 10 07:04:30 crc kubenswrapper[4825]: I0310 07:04:30.101226 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-1e7f-account-create-update-cxt8s"]
Mar 10 07:04:30 crc kubenswrapper[4825]: I0310 07:04:30.110820 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-gpq78"
Mar 10 07:04:30 crc kubenswrapper[4825]: I0310 07:04:30.153435 4825 scope.go:117] "RemoveContainer" containerID="5bb17d36525609847fef21235c27a86842bece8a7b650a7e31d643f4dd7cd3f3"
Mar 10 07:04:30 crc kubenswrapper[4825]: I0310 07:04:30.191919 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swpr8\" (UniqueName: \"kubernetes.io/projected/f3950c3f-9b1c-495c-a837-9d2e13500042-kube-api-access-swpr8\") pod \"placement-1e7f-account-create-update-cxt8s\" (UID: \"f3950c3f-9b1c-495c-a837-9d2e13500042\") " pod="openstack/placement-1e7f-account-create-update-cxt8s"
Mar 10 07:04:30 crc kubenswrapper[4825]: I0310 07:04:30.192003 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75tg5\" (UniqueName: \"kubernetes.io/projected/7d35d77e-fb8b-4aff-82f2-be95d6eae583-kube-api-access-75tg5\") pod \"placement-db-create-2jsr2\" (UID: \"7d35d77e-fb8b-4aff-82f2-be95d6eae583\") " pod="openstack/placement-db-create-2jsr2"
Mar 10 07:04:30 crc kubenswrapper[4825]: I0310 07:04:30.192056 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d35d77e-fb8b-4aff-82f2-be95d6eae583-operator-scripts\") pod \"placement-db-create-2jsr2\" (UID: \"7d35d77e-fb8b-4aff-82f2-be95d6eae583\") " pod="openstack/placement-db-create-2jsr2"
Mar 10 07:04:30 crc kubenswrapper[4825]: I0310 07:04:30.192106 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3950c3f-9b1c-495c-a837-9d2e13500042-operator-scripts\") pod \"placement-1e7f-account-create-update-cxt8s\" (UID: \"f3950c3f-9b1c-495c-a837-9d2e13500042\") " pod="openstack/placement-1e7f-account-create-update-cxt8s"
Mar 10 07:04:30 crc kubenswrapper[4825]: I0310 07:04:30.193015 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3950c3f-9b1c-495c-a837-9d2e13500042-operator-scripts\") pod \"placement-1e7f-account-create-update-cxt8s\" (UID: \"f3950c3f-9b1c-495c-a837-9d2e13500042\") " pod="openstack/placement-1e7f-account-create-update-cxt8s"
Mar 10 07:04:30 crc kubenswrapper[4825]: I0310 07:04:30.193148 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d35d77e-fb8b-4aff-82f2-be95d6eae583-operator-scripts\") pod \"placement-db-create-2jsr2\" (UID: \"7d35d77e-fb8b-4aff-82f2-be95d6eae583\") " pod="openstack/placement-db-create-2jsr2"
Mar 10 07:04:30 crc kubenswrapper[4825]: I0310 07:04:30.222759 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swpr8\" (UniqueName: \"kubernetes.io/projected/f3950c3f-9b1c-495c-a837-9d2e13500042-kube-api-access-swpr8\") pod \"placement-1e7f-account-create-update-cxt8s\" (UID: \"f3950c3f-9b1c-495c-a837-9d2e13500042\") " pod="openstack/placement-1e7f-account-create-update-cxt8s"
Mar 10 07:04:30 crc kubenswrapper[4825]: I0310 07:04:30.226379 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75tg5\" (UniqueName: \"kubernetes.io/projected/7d35d77e-fb8b-4aff-82f2-be95d6eae583-kube-api-access-75tg5\") pod \"placement-db-create-2jsr2\" (UID: \"7d35d77e-fb8b-4aff-82f2-be95d6eae583\") " pod="openstack/placement-db-create-2jsr2"
Mar 10 07:04:30 crc kubenswrapper[4825]: I0310 07:04:30.232925 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5fb2-account-create-update-vpbd9"
Mar 10 07:04:30 crc kubenswrapper[4825]: I0310 07:04:30.395240 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-2jsr2"
Mar 10 07:04:30 crc kubenswrapper[4825]: I0310 07:04:30.406909 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-1e7f-account-create-update-cxt8s"
Mar 10 07:04:30 crc kubenswrapper[4825]: I0310 07:04:30.587488 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-gpq78"]
Mar 10 07:04:30 crc kubenswrapper[4825]: W0310 07:04:30.588745 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfafc3bbc_dcca_4650_8c76_e1bbda061f75.slice/crio-e0ccdd685336a3d71a90e7b7be4ec29443ea772f76f2e86f856b7637c310c51f WatchSource:0}: Error finding container e0ccdd685336a3d71a90e7b7be4ec29443ea772f76f2e86f856b7637c310c51f: Status 404 returned error can't find the container with id e0ccdd685336a3d71a90e7b7be4ec29443ea772f76f2e86f856b7637c310c51f
Mar 10 07:04:30 crc kubenswrapper[4825]: I0310 07:04:30.722459 4825 generic.go:334] "Generic (PLEG): container finished" podID="10fac422-52dd-409a-adb6-156fe67b68d3" containerID="19eeac237e74a5220a577c3d082d7b8b22aaedda3a7f589b4665325a76f1f281" exitCode=0
Mar 10 07:04:30 crc kubenswrapper[4825]: I0310 07:04:30.722562 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-e713-account-create-update-8c9hv" event={"ID":"10fac422-52dd-409a-adb6-156fe67b68d3","Type":"ContainerDied","Data":"19eeac237e74a5220a577c3d082d7b8b22aaedda3a7f589b4665325a76f1f281"}
Mar 10 07:04:30 crc kubenswrapper[4825]: I0310 07:04:30.722617 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-e713-account-create-update-8c9hv" event={"ID":"10fac422-52dd-409a-adb6-156fe67b68d3","Type":"ContainerStarted","Data":"b501e5a472df767f10d200c2ce311c493b3da9cab7d29b636eca89533ee7a005"}
Mar 10 07:04:30 crc kubenswrapper[4825]: I0310 07:04:30.725586 4825 generic.go:334] "Generic (PLEG): container finished" podID="758e4473-67ca-4d59-bf8c-3584b92a8663" containerID="a6005c6bc4da885ce83884c2c94dd7496cf366fd8be1169072b634c4a4fd77de" exitCode=0
Mar 10 07:04:30 crc kubenswrapper[4825]: I0310 07:04:30.725651 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-vd86v" event={"ID":"758e4473-67ca-4d59-bf8c-3584b92a8663","Type":"ContainerDied","Data":"a6005c6bc4da885ce83884c2c94dd7496cf366fd8be1169072b634c4a4fd77de"}
Mar 10 07:04:30 crc kubenswrapper[4825]: I0310 07:04:30.725693 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-vd86v" event={"ID":"758e4473-67ca-4d59-bf8c-3584b92a8663","Type":"ContainerStarted","Data":"cf6182ad5fa10c4ffec2d50c5f4798694cf335151e28b29ebc2204a7cf1616d2"}
Mar 10 07:04:30 crc kubenswrapper[4825]: I0310 07:04:30.726777 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-gpq78" event={"ID":"fafc3bbc-dcca-4650-8c76-e1bbda061f75","Type":"ContainerStarted","Data":"e0ccdd685336a3d71a90e7b7be4ec29443ea772f76f2e86f856b7637c310c51f"}
Mar 10 07:04:30 crc kubenswrapper[4825]: W0310 07:04:30.742071 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8530cfd9_9a8b_4d03_93cd_52edad33a965.slice/crio-a082106a190922c95f76a20d6c08e0f2bb3902287114c45854f08b23fe504cb7 WatchSource:0}: Error finding container a082106a190922c95f76a20d6c08e0f2bb3902287114c45854f08b23fe504cb7: Status 404 returned error can't find the container with id a082106a190922c95f76a20d6c08e0f2bb3902287114c45854f08b23fe504cb7
Mar 10 07:04:30 crc kubenswrapper[4825]: I0310 07:04:30.745577 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5fb2-account-create-update-vpbd9"]
Mar 10 07:04:30 crc kubenswrapper[4825]: I0310 07:04:30.860329 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-2jsr2"]
Mar 10 07:04:30 crc kubenswrapper[4825]: W0310 07:04:30.861462 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d35d77e_fb8b_4aff_82f2_be95d6eae583.slice/crio-7be76b847dbe7cf9e79edecb721d447429332ac027ade43c4b45ce08bf7c30bf WatchSource:0}: Error finding container 7be76b847dbe7cf9e79edecb721d447429332ac027ade43c4b45ce08bf7c30bf: Status 404 returned error can't find the container with id 7be76b847dbe7cf9e79edecb721d447429332ac027ade43c4b45ce08bf7c30bf
Mar 10 07:04:30 crc kubenswrapper[4825]: I0310 07:04:30.972724 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-1e7f-account-create-update-cxt8s"]
Mar 10 07:04:31 crc kubenswrapper[4825]: I0310 07:04:31.376395 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-675f7dd995-r9bqz"
Mar 10 07:04:31 crc kubenswrapper[4825]: I0310 07:04:31.479588 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-5lkdr"]
Mar 10 07:04:31 crc kubenswrapper[4825]: I0310 07:04:31.480281 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b57d9888c-5lkdr" podUID="ec1ec5e1-8623-412f-a622-52b10a2c798d" containerName="dnsmasq-dns" containerID="cri-o://a90dbf51fbed901941157edef9fd7fbe40dc81e7559edb2eac2bdf74e6b8a785" gracePeriod=10
Mar 10 07:04:31 crc kubenswrapper[4825]: I0310 07:04:31.736014 4825 generic.go:334] "Generic (PLEG): container finished" podID="f3950c3f-9b1c-495c-a837-9d2e13500042" containerID="3c1a26aa489fef14c83ee2f8dab937b7ecb3974bfcc67e5ac84499f654860d6b" exitCode=0
Mar 10 07:04:31 crc kubenswrapper[4825]: I0310 07:04:31.736211 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1e7f-account-create-update-cxt8s" event={"ID":"f3950c3f-9b1c-495c-a837-9d2e13500042","Type":"ContainerDied","Data":"3c1a26aa489fef14c83ee2f8dab937b7ecb3974bfcc67e5ac84499f654860d6b"}
Mar 10 07:04:31 crc kubenswrapper[4825]: I0310 07:04:31.736243 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1e7f-account-create-update-cxt8s" event={"ID":"f3950c3f-9b1c-495c-a837-9d2e13500042","Type":"ContainerStarted","Data":"893fe7b7c21ae7fe81926feb1088ac57aa51fac4479b22a205e88af977a911df"}
Mar 10 07:04:31 crc kubenswrapper[4825]: I0310 07:04:31.740866 4825 generic.go:334] "Generic (PLEG): container finished" podID="8530cfd9-9a8b-4d03-93cd-52edad33a965" containerID="940fb8b9f6cdbf71b704bd64a4630d8473c614b5066613b8ce93f6601e6420ac" exitCode=0
Mar 10 07:04:31 crc kubenswrapper[4825]: I0310 07:04:31.740934 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5fb2-account-create-update-vpbd9" event={"ID":"8530cfd9-9a8b-4d03-93cd-52edad33a965","Type":"ContainerDied","Data":"940fb8b9f6cdbf71b704bd64a4630d8473c614b5066613b8ce93f6601e6420ac"}
Mar 10 07:04:31 crc kubenswrapper[4825]: I0310 07:04:31.740960 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5fb2-account-create-update-vpbd9" event={"ID":"8530cfd9-9a8b-4d03-93cd-52edad33a965","Type":"ContainerStarted","Data":"a082106a190922c95f76a20d6c08e0f2bb3902287114c45854f08b23fe504cb7"}
Mar 10 07:04:31 crc kubenswrapper[4825]: I0310 07:04:31.743498 4825 generic.go:334] "Generic (PLEG): container finished" podID="ec1ec5e1-8623-412f-a622-52b10a2c798d" containerID="a90dbf51fbed901941157edef9fd7fbe40dc81e7559edb2eac2bdf74e6b8a785" exitCode=0
Mar 10 07:04:31 crc kubenswrapper[4825]: I0310 07:04:31.743547 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-5lkdr" event={"ID":"ec1ec5e1-8623-412f-a622-52b10a2c798d","Type":"ContainerDied","Data":"a90dbf51fbed901941157edef9fd7fbe40dc81e7559edb2eac2bdf74e6b8a785"}
Mar 10 07:04:31 crc kubenswrapper[4825]: I0310 07:04:31.745629 4825 generic.go:334] "Generic (PLEG): container finished" podID="fafc3bbc-dcca-4650-8c76-e1bbda061f75" containerID="d511a3187b10669bd4c998f3a57faa64870755649eb5a3cb0b79fc7f46b76cf2" exitCode=0
Mar 10 07:04:31 crc kubenswrapper[4825]: I0310 07:04:31.745811 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-gpq78" event={"ID":"fafc3bbc-dcca-4650-8c76-e1bbda061f75","Type":"ContainerDied","Data":"d511a3187b10669bd4c998f3a57faa64870755649eb5a3cb0b79fc7f46b76cf2"}
Mar 10 07:04:31 crc kubenswrapper[4825]: I0310 07:04:31.747428 4825 generic.go:334] "Generic (PLEG): container finished" podID="7d35d77e-fb8b-4aff-82f2-be95d6eae583" containerID="d1e097358167e39ddccce6b130e413b76f204a644ef1a982d1f7345519f8fd77" exitCode=0
Mar 10 07:04:31 crc kubenswrapper[4825]: I0310 07:04:31.747518 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-2jsr2" event={"ID":"7d35d77e-fb8b-4aff-82f2-be95d6eae583","Type":"ContainerDied","Data":"d1e097358167e39ddccce6b130e413b76f204a644ef1a982d1f7345519f8fd77"}
Mar 10 07:04:31 crc kubenswrapper[4825]: I0310 07:04:31.747541 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-2jsr2" event={"ID":"7d35d77e-fb8b-4aff-82f2-be95d6eae583","Type":"ContainerStarted","Data":"7be76b847dbe7cf9e79edecb721d447429332ac027ade43c4b45ce08bf7c30bf"}
Mar 10 07:04:31 crc kubenswrapper[4825]: I0310 07:04:31.993944 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b57d9888c-5lkdr"
Mar 10 07:04:32 crc kubenswrapper[4825]: I0310 07:04:32.034639 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec1ec5e1-8623-412f-a622-52b10a2c798d-dns-svc\") pod \"ec1ec5e1-8623-412f-a622-52b10a2c798d\" (UID: \"ec1ec5e1-8623-412f-a622-52b10a2c798d\") "
Mar 10 07:04:32 crc kubenswrapper[4825]: I0310 07:04:32.034698 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtkpn\" (UniqueName: \"kubernetes.io/projected/ec1ec5e1-8623-412f-a622-52b10a2c798d-kube-api-access-gtkpn\") pod \"ec1ec5e1-8623-412f-a622-52b10a2c798d\" (UID: \"ec1ec5e1-8623-412f-a622-52b10a2c798d\") "
Mar 10 07:04:32 crc kubenswrapper[4825]: I0310 07:04:32.034735 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec1ec5e1-8623-412f-a622-52b10a2c798d-ovsdbserver-sb\") pod \"ec1ec5e1-8623-412f-a622-52b10a2c798d\" (UID: \"ec1ec5e1-8623-412f-a622-52b10a2c798d\") "
Mar 10 07:04:32 crc kubenswrapper[4825]: I0310 07:04:32.034823 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec1ec5e1-8623-412f-a622-52b10a2c798d-ovsdbserver-nb\") pod \"ec1ec5e1-8623-412f-a622-52b10a2c798d\" (UID: \"ec1ec5e1-8623-412f-a622-52b10a2c798d\") "
Mar 10 07:04:32 crc kubenswrapper[4825]: I0310 07:04:32.034858 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec1ec5e1-8623-412f-a622-52b10a2c798d-config\") pod \"ec1ec5e1-8623-412f-a622-52b10a2c798d\" (UID: \"ec1ec5e1-8623-412f-a622-52b10a2c798d\") "
Mar 10 07:04:32 crc kubenswrapper[4825]: I0310 07:04:32.058448 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec1ec5e1-8623-412f-a622-52b10a2c798d-kube-api-access-gtkpn" (OuterVolumeSpecName: "kube-api-access-gtkpn") pod "ec1ec5e1-8623-412f-a622-52b10a2c798d" (UID: "ec1ec5e1-8623-412f-a622-52b10a2c798d"). InnerVolumeSpecName "kube-api-access-gtkpn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 07:04:32 crc kubenswrapper[4825]: I0310 07:04:32.120127 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec1ec5e1-8623-412f-a622-52b10a2c798d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ec1ec5e1-8623-412f-a622-52b10a2c798d" (UID: "ec1ec5e1-8623-412f-a622-52b10a2c798d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 07:04:32 crc kubenswrapper[4825]: I0310 07:04:32.135013 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec1ec5e1-8623-412f-a622-52b10a2c798d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ec1ec5e1-8623-412f-a622-52b10a2c798d" (UID: "ec1ec5e1-8623-412f-a622-52b10a2c798d"). InnerVolumeSpecName "ovsdbserver-sb".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:04:32 crc kubenswrapper[4825]: I0310 07:04:32.136832 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtkpn\" (UniqueName: \"kubernetes.io/projected/ec1ec5e1-8623-412f-a622-52b10a2c798d-kube-api-access-gtkpn\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:32 crc kubenswrapper[4825]: I0310 07:04:32.136867 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec1ec5e1-8623-412f-a622-52b10a2c798d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:32 crc kubenswrapper[4825]: I0310 07:04:32.136880 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec1ec5e1-8623-412f-a622-52b10a2c798d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:32 crc kubenswrapper[4825]: I0310 07:04:32.149677 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec1ec5e1-8623-412f-a622-52b10a2c798d-config" (OuterVolumeSpecName: "config") pod "ec1ec5e1-8623-412f-a622-52b10a2c798d" (UID: "ec1ec5e1-8623-412f-a622-52b10a2c798d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:04:32 crc kubenswrapper[4825]: I0310 07:04:32.154159 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-vd86v" Mar 10 07:04:32 crc kubenswrapper[4825]: I0310 07:04:32.233735 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec1ec5e1-8623-412f-a622-52b10a2c798d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ec1ec5e1-8623-412f-a622-52b10a2c798d" (UID: "ec1ec5e1-8623-412f-a622-52b10a2c798d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:04:32 crc kubenswrapper[4825]: I0310 07:04:32.243845 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzsfs\" (UniqueName: \"kubernetes.io/projected/758e4473-67ca-4d59-bf8c-3584b92a8663-kube-api-access-bzsfs\") pod \"758e4473-67ca-4d59-bf8c-3584b92a8663\" (UID: \"758e4473-67ca-4d59-bf8c-3584b92a8663\") " Mar 10 07:04:32 crc kubenswrapper[4825]: I0310 07:04:32.244039 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/758e4473-67ca-4d59-bf8c-3584b92a8663-operator-scripts\") pod \"758e4473-67ca-4d59-bf8c-3584b92a8663\" (UID: \"758e4473-67ca-4d59-bf8c-3584b92a8663\") " Mar 10 07:04:32 crc kubenswrapper[4825]: I0310 07:04:32.244490 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec1ec5e1-8623-412f-a622-52b10a2c798d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:32 crc kubenswrapper[4825]: I0310 07:04:32.244502 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec1ec5e1-8623-412f-a622-52b10a2c798d-config\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:32 crc kubenswrapper[4825]: I0310 07:04:32.245218 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/758e4473-67ca-4d59-bf8c-3584b92a8663-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "758e4473-67ca-4d59-bf8c-3584b92a8663" (UID: "758e4473-67ca-4d59-bf8c-3584b92a8663"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:04:32 crc kubenswrapper[4825]: I0310 07:04:32.263393 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/758e4473-67ca-4d59-bf8c-3584b92a8663-kube-api-access-bzsfs" (OuterVolumeSpecName: "kube-api-access-bzsfs") pod "758e4473-67ca-4d59-bf8c-3584b92a8663" (UID: "758e4473-67ca-4d59-bf8c-3584b92a8663"). InnerVolumeSpecName "kube-api-access-bzsfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:04:32 crc kubenswrapper[4825]: I0310 07:04:32.348680 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzsfs\" (UniqueName: \"kubernetes.io/projected/758e4473-67ca-4d59-bf8c-3584b92a8663-kube-api-access-bzsfs\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:32 crc kubenswrapper[4825]: I0310 07:04:32.349668 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/758e4473-67ca-4d59-bf8c-3584b92a8663-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:32 crc kubenswrapper[4825]: I0310 07:04:32.415976 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-e713-account-create-update-8c9hv" Mar 10 07:04:32 crc kubenswrapper[4825]: I0310 07:04:32.450429 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6sc8s\" (UniqueName: \"kubernetes.io/projected/10fac422-52dd-409a-adb6-156fe67b68d3-kube-api-access-6sc8s\") pod \"10fac422-52dd-409a-adb6-156fe67b68d3\" (UID: \"10fac422-52dd-409a-adb6-156fe67b68d3\") " Mar 10 07:04:32 crc kubenswrapper[4825]: I0310 07:04:32.450691 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10fac422-52dd-409a-adb6-156fe67b68d3-operator-scripts\") pod \"10fac422-52dd-409a-adb6-156fe67b68d3\" (UID: \"10fac422-52dd-409a-adb6-156fe67b68d3\") " Mar 10 07:04:32 crc kubenswrapper[4825]: I0310 07:04:32.451804 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10fac422-52dd-409a-adb6-156fe67b68d3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "10fac422-52dd-409a-adb6-156fe67b68d3" (UID: "10fac422-52dd-409a-adb6-156fe67b68d3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:04:32 crc kubenswrapper[4825]: I0310 07:04:32.471846 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10fac422-52dd-409a-adb6-156fe67b68d3-kube-api-access-6sc8s" (OuterVolumeSpecName: "kube-api-access-6sc8s") pod "10fac422-52dd-409a-adb6-156fe67b68d3" (UID: "10fac422-52dd-409a-adb6-156fe67b68d3"). InnerVolumeSpecName "kube-api-access-6sc8s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:04:32 crc kubenswrapper[4825]: I0310 07:04:32.552938 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6sc8s\" (UniqueName: \"kubernetes.io/projected/10fac422-52dd-409a-adb6-156fe67b68d3-kube-api-access-6sc8s\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:32 crc kubenswrapper[4825]: I0310 07:04:32.553612 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10fac422-52dd-409a-adb6-156fe67b68d3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:32 crc kubenswrapper[4825]: I0310 07:04:32.764439 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-e713-account-create-update-8c9hv" event={"ID":"10fac422-52dd-409a-adb6-156fe67b68d3","Type":"ContainerDied","Data":"b501e5a472df767f10d200c2ce311c493b3da9cab7d29b636eca89533ee7a005"} Mar 10 07:04:32 crc kubenswrapper[4825]: I0310 07:04:32.764646 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b501e5a472df767f10d200c2ce311c493b3da9cab7d29b636eca89533ee7a005" Mar 10 07:04:32 crc kubenswrapper[4825]: I0310 07:04:32.764772 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-e713-account-create-update-8c9hv" Mar 10 07:04:32 crc kubenswrapper[4825]: I0310 07:04:32.770256 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-5lkdr" event={"ID":"ec1ec5e1-8623-412f-a622-52b10a2c798d","Type":"ContainerDied","Data":"615c124cfb840ed1378e6bb8868174391c9571ebf8c6fb6273e7c70031f3825f"} Mar 10 07:04:32 crc kubenswrapper[4825]: I0310 07:04:32.770329 4825 scope.go:117] "RemoveContainer" containerID="a90dbf51fbed901941157edef9fd7fbe40dc81e7559edb2eac2bdf74e6b8a785" Mar 10 07:04:32 crc kubenswrapper[4825]: I0310 07:04:32.770341 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b57d9888c-5lkdr" Mar 10 07:04:32 crc kubenswrapper[4825]: I0310 07:04:32.774437 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-vd86v" event={"ID":"758e4473-67ca-4d59-bf8c-3584b92a8663","Type":"ContainerDied","Data":"cf6182ad5fa10c4ffec2d50c5f4798694cf335151e28b29ebc2204a7cf1616d2"} Mar 10 07:04:32 crc kubenswrapper[4825]: I0310 07:04:32.774487 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf6182ad5fa10c4ffec2d50c5f4798694cf335151e28b29ebc2204a7cf1616d2" Mar 10 07:04:32 crc kubenswrapper[4825]: I0310 07:04:32.774645 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-vd86v" Mar 10 07:04:32 crc kubenswrapper[4825]: I0310 07:04:32.799431 4825 scope.go:117] "RemoveContainer" containerID="12fe27040f0212ebc74a1a63a75c026abefdf592a1a0b379c2df95abc8857cd1" Mar 10 07:04:32 crc kubenswrapper[4825]: I0310 07:04:32.818856 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-5lkdr"] Mar 10 07:04:32 crc kubenswrapper[4825]: I0310 07:04:32.826150 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-5lkdr"] Mar 10 07:04:33 crc kubenswrapper[4825]: I0310 07:04:33.100604 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-1e7f-account-create-update-cxt8s" Mar 10 07:04:33 crc kubenswrapper[4825]: I0310 07:04:33.177735 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swpr8\" (UniqueName: \"kubernetes.io/projected/f3950c3f-9b1c-495c-a837-9d2e13500042-kube-api-access-swpr8\") pod \"f3950c3f-9b1c-495c-a837-9d2e13500042\" (UID: \"f3950c3f-9b1c-495c-a837-9d2e13500042\") " Mar 10 07:04:33 crc kubenswrapper[4825]: I0310 07:04:33.177819 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3950c3f-9b1c-495c-a837-9d2e13500042-operator-scripts\") pod \"f3950c3f-9b1c-495c-a837-9d2e13500042\" (UID: \"f3950c3f-9b1c-495c-a837-9d2e13500042\") " Mar 10 07:04:33 crc kubenswrapper[4825]: I0310 07:04:33.178538 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3950c3f-9b1c-495c-a837-9d2e13500042-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f3950c3f-9b1c-495c-a837-9d2e13500042" (UID: "f3950c3f-9b1c-495c-a837-9d2e13500042"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:04:33 crc kubenswrapper[4825]: I0310 07:04:33.194624 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3950c3f-9b1c-495c-a837-9d2e13500042-kube-api-access-swpr8" (OuterVolumeSpecName: "kube-api-access-swpr8") pod "f3950c3f-9b1c-495c-a837-9d2e13500042" (UID: "f3950c3f-9b1c-495c-a837-9d2e13500042"). InnerVolumeSpecName "kube-api-access-swpr8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:04:33 crc kubenswrapper[4825]: I0310 07:04:33.251226 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec1ec5e1-8623-412f-a622-52b10a2c798d" path="/var/lib/kubelet/pods/ec1ec5e1-8623-412f-a622-52b10a2c798d/volumes" Mar 10 07:04:33 crc kubenswrapper[4825]: I0310 07:04:33.285428 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swpr8\" (UniqueName: \"kubernetes.io/projected/f3950c3f-9b1c-495c-a837-9d2e13500042-kube-api-access-swpr8\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:33 crc kubenswrapper[4825]: I0310 07:04:33.285516 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3950c3f-9b1c-495c-a837-9d2e13500042-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:33 crc kubenswrapper[4825]: I0310 07:04:33.352288 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-2jsr2" Mar 10 07:04:33 crc kubenswrapper[4825]: I0310 07:04:33.357409 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5fb2-account-create-update-vpbd9" Mar 10 07:04:33 crc kubenswrapper[4825]: I0310 07:04:33.363309 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-gpq78" Mar 10 07:04:33 crc kubenswrapper[4825]: I0310 07:04:33.390226 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75tg5\" (UniqueName: \"kubernetes.io/projected/7d35d77e-fb8b-4aff-82f2-be95d6eae583-kube-api-access-75tg5\") pod \"7d35d77e-fb8b-4aff-82f2-be95d6eae583\" (UID: \"7d35d77e-fb8b-4aff-82f2-be95d6eae583\") " Mar 10 07:04:33 crc kubenswrapper[4825]: I0310 07:04:33.390301 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvq7n\" (UniqueName: \"kubernetes.io/projected/8530cfd9-9a8b-4d03-93cd-52edad33a965-kube-api-access-jvq7n\") pod \"8530cfd9-9a8b-4d03-93cd-52edad33a965\" (UID: \"8530cfd9-9a8b-4d03-93cd-52edad33a965\") " Mar 10 07:04:33 crc kubenswrapper[4825]: I0310 07:04:33.390360 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d35d77e-fb8b-4aff-82f2-be95d6eae583-operator-scripts\") pod \"7d35d77e-fb8b-4aff-82f2-be95d6eae583\" (UID: \"7d35d77e-fb8b-4aff-82f2-be95d6eae583\") " Mar 10 07:04:33 crc kubenswrapper[4825]: I0310 07:04:33.390396 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8530cfd9-9a8b-4d03-93cd-52edad33a965-operator-scripts\") pod \"8530cfd9-9a8b-4d03-93cd-52edad33a965\" (UID: \"8530cfd9-9a8b-4d03-93cd-52edad33a965\") " Mar 10 07:04:33 crc kubenswrapper[4825]: I0310 07:04:33.390434 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncp8d\" (UniqueName: \"kubernetes.io/projected/fafc3bbc-dcca-4650-8c76-e1bbda061f75-kube-api-access-ncp8d\") pod \"fafc3bbc-dcca-4650-8c76-e1bbda061f75\" (UID: \"fafc3bbc-dcca-4650-8c76-e1bbda061f75\") " Mar 10 07:04:33 crc kubenswrapper[4825]: I0310 07:04:33.393547 4825 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d35d77e-fb8b-4aff-82f2-be95d6eae583-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7d35d77e-fb8b-4aff-82f2-be95d6eae583" (UID: "7d35d77e-fb8b-4aff-82f2-be95d6eae583"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:04:33 crc kubenswrapper[4825]: I0310 07:04:33.394060 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8530cfd9-9a8b-4d03-93cd-52edad33a965-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8530cfd9-9a8b-4d03-93cd-52edad33a965" (UID: "8530cfd9-9a8b-4d03-93cd-52edad33a965"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:04:33 crc kubenswrapper[4825]: I0310 07:04:33.395463 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8530cfd9-9a8b-4d03-93cd-52edad33a965-kube-api-access-jvq7n" (OuterVolumeSpecName: "kube-api-access-jvq7n") pod "8530cfd9-9a8b-4d03-93cd-52edad33a965" (UID: "8530cfd9-9a8b-4d03-93cd-52edad33a965"). InnerVolumeSpecName "kube-api-access-jvq7n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:04:33 crc kubenswrapper[4825]: I0310 07:04:33.395504 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d35d77e-fb8b-4aff-82f2-be95d6eae583-kube-api-access-75tg5" (OuterVolumeSpecName: "kube-api-access-75tg5") pod "7d35d77e-fb8b-4aff-82f2-be95d6eae583" (UID: "7d35d77e-fb8b-4aff-82f2-be95d6eae583"). InnerVolumeSpecName "kube-api-access-75tg5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:04:33 crc kubenswrapper[4825]: I0310 07:04:33.398067 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fafc3bbc-dcca-4650-8c76-e1bbda061f75-kube-api-access-ncp8d" (OuterVolumeSpecName: "kube-api-access-ncp8d") pod "fafc3bbc-dcca-4650-8c76-e1bbda061f75" (UID: "fafc3bbc-dcca-4650-8c76-e1bbda061f75"). InnerVolumeSpecName "kube-api-access-ncp8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:04:33 crc kubenswrapper[4825]: I0310 07:04:33.491716 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fafc3bbc-dcca-4650-8c76-e1bbda061f75-operator-scripts\") pod \"fafc3bbc-dcca-4650-8c76-e1bbda061f75\" (UID: \"fafc3bbc-dcca-4650-8c76-e1bbda061f75\") " Mar 10 07:04:33 crc kubenswrapper[4825]: I0310 07:04:33.492058 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75tg5\" (UniqueName: \"kubernetes.io/projected/7d35d77e-fb8b-4aff-82f2-be95d6eae583-kube-api-access-75tg5\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:33 crc kubenswrapper[4825]: I0310 07:04:33.492072 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvq7n\" (UniqueName: \"kubernetes.io/projected/8530cfd9-9a8b-4d03-93cd-52edad33a965-kube-api-access-jvq7n\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:33 crc kubenswrapper[4825]: I0310 07:04:33.492083 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d35d77e-fb8b-4aff-82f2-be95d6eae583-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:33 crc kubenswrapper[4825]: I0310 07:04:33.492091 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8530cfd9-9a8b-4d03-93cd-52edad33a965-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 
07:04:33 crc kubenswrapper[4825]: I0310 07:04:33.492099 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncp8d\" (UniqueName: \"kubernetes.io/projected/fafc3bbc-dcca-4650-8c76-e1bbda061f75-kube-api-access-ncp8d\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:33 crc kubenswrapper[4825]: I0310 07:04:33.492606 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fafc3bbc-dcca-4650-8c76-e1bbda061f75-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fafc3bbc-dcca-4650-8c76-e1bbda061f75" (UID: "fafc3bbc-dcca-4650-8c76-e1bbda061f75"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:04:33 crc kubenswrapper[4825]: I0310 07:04:33.594210 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fafc3bbc-dcca-4650-8c76-e1bbda061f75-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:33 crc kubenswrapper[4825]: I0310 07:04:33.786053 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-gpq78" event={"ID":"fafc3bbc-dcca-4650-8c76-e1bbda061f75","Type":"ContainerDied","Data":"e0ccdd685336a3d71a90e7b7be4ec29443ea772f76f2e86f856b7637c310c51f"} Mar 10 07:04:33 crc kubenswrapper[4825]: I0310 07:04:33.786100 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0ccdd685336a3d71a90e7b7be4ec29443ea772f76f2e86f856b7637c310c51f" Mar 10 07:04:33 crc kubenswrapper[4825]: I0310 07:04:33.786124 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-gpq78" Mar 10 07:04:33 crc kubenswrapper[4825]: I0310 07:04:33.787989 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-2jsr2" event={"ID":"7d35d77e-fb8b-4aff-82f2-be95d6eae583","Type":"ContainerDied","Data":"7be76b847dbe7cf9e79edecb721d447429332ac027ade43c4b45ce08bf7c30bf"} Mar 10 07:04:33 crc kubenswrapper[4825]: I0310 07:04:33.788012 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7be76b847dbe7cf9e79edecb721d447429332ac027ade43c4b45ce08bf7c30bf" Mar 10 07:04:33 crc kubenswrapper[4825]: I0310 07:04:33.788071 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-2jsr2" Mar 10 07:04:33 crc kubenswrapper[4825]: I0310 07:04:33.789803 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1e7f-account-create-update-cxt8s" event={"ID":"f3950c3f-9b1c-495c-a837-9d2e13500042","Type":"ContainerDied","Data":"893fe7b7c21ae7fe81926feb1088ac57aa51fac4479b22a205e88af977a911df"} Mar 10 07:04:33 crc kubenswrapper[4825]: I0310 07:04:33.789919 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="893fe7b7c21ae7fe81926feb1088ac57aa51fac4479b22a205e88af977a911df" Mar 10 07:04:33 crc kubenswrapper[4825]: I0310 07:04:33.790041 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-1e7f-account-create-update-cxt8s" Mar 10 07:04:33 crc kubenswrapper[4825]: I0310 07:04:33.792985 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5fb2-account-create-update-vpbd9" event={"ID":"8530cfd9-9a8b-4d03-93cd-52edad33a965","Type":"ContainerDied","Data":"a082106a190922c95f76a20d6c08e0f2bb3902287114c45854f08b23fe504cb7"} Mar 10 07:04:33 crc kubenswrapper[4825]: I0310 07:04:33.793487 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a082106a190922c95f76a20d6c08e0f2bb3902287114c45854f08b23fe504cb7" Mar 10 07:04:33 crc kubenswrapper[4825]: I0310 07:04:33.793267 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5fb2-account-create-update-vpbd9" Mar 10 07:04:34 crc kubenswrapper[4825]: I0310 07:04:34.260171 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-7zgw8"] Mar 10 07:04:34 crc kubenswrapper[4825]: E0310 07:04:34.260689 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec1ec5e1-8623-412f-a622-52b10a2c798d" containerName="init" Mar 10 07:04:34 crc kubenswrapper[4825]: I0310 07:04:34.260720 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec1ec5e1-8623-412f-a622-52b10a2c798d" containerName="init" Mar 10 07:04:34 crc kubenswrapper[4825]: E0310 07:04:34.260757 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8530cfd9-9a8b-4d03-93cd-52edad33a965" containerName="mariadb-account-create-update" Mar 10 07:04:34 crc kubenswrapper[4825]: I0310 07:04:34.260772 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="8530cfd9-9a8b-4d03-93cd-52edad33a965" containerName="mariadb-account-create-update" Mar 10 07:04:34 crc kubenswrapper[4825]: E0310 07:04:34.260791 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec1ec5e1-8623-412f-a622-52b10a2c798d" containerName="dnsmasq-dns" Mar 10 07:04:34 
crc kubenswrapper[4825]: I0310 07:04:34.260802 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec1ec5e1-8623-412f-a622-52b10a2c798d" containerName="dnsmasq-dns" Mar 10 07:04:34 crc kubenswrapper[4825]: E0310 07:04:34.260827 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="758e4473-67ca-4d59-bf8c-3584b92a8663" containerName="mariadb-database-create" Mar 10 07:04:34 crc kubenswrapper[4825]: I0310 07:04:34.260838 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="758e4473-67ca-4d59-bf8c-3584b92a8663" containerName="mariadb-database-create" Mar 10 07:04:34 crc kubenswrapper[4825]: E0310 07:04:34.260859 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d35d77e-fb8b-4aff-82f2-be95d6eae583" containerName="mariadb-database-create" Mar 10 07:04:34 crc kubenswrapper[4825]: I0310 07:04:34.260872 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d35d77e-fb8b-4aff-82f2-be95d6eae583" containerName="mariadb-database-create" Mar 10 07:04:34 crc kubenswrapper[4825]: E0310 07:04:34.260886 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10fac422-52dd-409a-adb6-156fe67b68d3" containerName="mariadb-account-create-update" Mar 10 07:04:34 crc kubenswrapper[4825]: I0310 07:04:34.260899 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="10fac422-52dd-409a-adb6-156fe67b68d3" containerName="mariadb-account-create-update" Mar 10 07:04:34 crc kubenswrapper[4825]: E0310 07:04:34.260916 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fafc3bbc-dcca-4650-8c76-e1bbda061f75" containerName="mariadb-database-create" Mar 10 07:04:34 crc kubenswrapper[4825]: I0310 07:04:34.260925 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="fafc3bbc-dcca-4650-8c76-e1bbda061f75" containerName="mariadb-database-create" Mar 10 07:04:34 crc kubenswrapper[4825]: E0310 07:04:34.260943 4825 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f3950c3f-9b1c-495c-a837-9d2e13500042" containerName="mariadb-account-create-update" Mar 10 07:04:34 crc kubenswrapper[4825]: I0310 07:04:34.260955 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3950c3f-9b1c-495c-a837-9d2e13500042" containerName="mariadb-account-create-update" Mar 10 07:04:34 crc kubenswrapper[4825]: I0310 07:04:34.261222 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="758e4473-67ca-4d59-bf8c-3584b92a8663" containerName="mariadb-database-create" Mar 10 07:04:34 crc kubenswrapper[4825]: I0310 07:04:34.261257 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="10fac422-52dd-409a-adb6-156fe67b68d3" containerName="mariadb-account-create-update" Mar 10 07:04:34 crc kubenswrapper[4825]: I0310 07:04:34.261283 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="8530cfd9-9a8b-4d03-93cd-52edad33a965" containerName="mariadb-account-create-update" Mar 10 07:04:34 crc kubenswrapper[4825]: I0310 07:04:34.261305 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d35d77e-fb8b-4aff-82f2-be95d6eae583" containerName="mariadb-database-create" Mar 10 07:04:34 crc kubenswrapper[4825]: I0310 07:04:34.261323 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec1ec5e1-8623-412f-a622-52b10a2c798d" containerName="dnsmasq-dns" Mar 10 07:04:34 crc kubenswrapper[4825]: I0310 07:04:34.261336 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3950c3f-9b1c-495c-a837-9d2e13500042" containerName="mariadb-account-create-update" Mar 10 07:04:34 crc kubenswrapper[4825]: I0310 07:04:34.261354 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="fafc3bbc-dcca-4650-8c76-e1bbda061f75" containerName="mariadb-database-create" Mar 10 07:04:34 crc kubenswrapper[4825]: I0310 07:04:34.262203 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-7zgw8" Mar 10 07:04:34 crc kubenswrapper[4825]: I0310 07:04:34.264614 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 10 07:04:34 crc kubenswrapper[4825]: I0310 07:04:34.266650 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-lwfpg" Mar 10 07:04:34 crc kubenswrapper[4825]: I0310 07:04:34.277342 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-7zgw8"] Mar 10 07:04:34 crc kubenswrapper[4825]: I0310 07:04:34.422971 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff-db-sync-config-data\") pod \"glance-db-sync-7zgw8\" (UID: \"31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff\") " pod="openstack/glance-db-sync-7zgw8" Mar 10 07:04:34 crc kubenswrapper[4825]: I0310 07:04:34.423498 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65w7p\" (UniqueName: \"kubernetes.io/projected/31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff-kube-api-access-65w7p\") pod \"glance-db-sync-7zgw8\" (UID: \"31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff\") " pod="openstack/glance-db-sync-7zgw8" Mar 10 07:04:34 crc kubenswrapper[4825]: I0310 07:04:34.423685 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff-config-data\") pod \"glance-db-sync-7zgw8\" (UID: \"31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff\") " pod="openstack/glance-db-sync-7zgw8" Mar 10 07:04:34 crc kubenswrapper[4825]: I0310 07:04:34.424579 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff-combined-ca-bundle\") pod \"glance-db-sync-7zgw8\" (UID: \"31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff\") " pod="openstack/glance-db-sync-7zgw8" Mar 10 07:04:34 crc kubenswrapper[4825]: I0310 07:04:34.526069 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff-db-sync-config-data\") pod \"glance-db-sync-7zgw8\" (UID: \"31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff\") " pod="openstack/glance-db-sync-7zgw8" Mar 10 07:04:34 crc kubenswrapper[4825]: I0310 07:04:34.526142 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65w7p\" (UniqueName: \"kubernetes.io/projected/31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff-kube-api-access-65w7p\") pod \"glance-db-sync-7zgw8\" (UID: \"31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff\") " pod="openstack/glance-db-sync-7zgw8" Mar 10 07:04:34 crc kubenswrapper[4825]: I0310 07:04:34.526189 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff-config-data\") pod \"glance-db-sync-7zgw8\" (UID: \"31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff\") " pod="openstack/glance-db-sync-7zgw8" Mar 10 07:04:34 crc kubenswrapper[4825]: I0310 07:04:34.526218 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff-combined-ca-bundle\") pod \"glance-db-sync-7zgw8\" (UID: \"31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff\") " pod="openstack/glance-db-sync-7zgw8" Mar 10 07:04:34 crc kubenswrapper[4825]: I0310 07:04:34.531036 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff-db-sync-config-data\") pod \"glance-db-sync-7zgw8\" (UID: 
\"31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff\") " pod="openstack/glance-db-sync-7zgw8" Mar 10 07:04:34 crc kubenswrapper[4825]: I0310 07:04:34.531197 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff-combined-ca-bundle\") pod \"glance-db-sync-7zgw8\" (UID: \"31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff\") " pod="openstack/glance-db-sync-7zgw8" Mar 10 07:04:34 crc kubenswrapper[4825]: I0310 07:04:34.533191 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff-config-data\") pod \"glance-db-sync-7zgw8\" (UID: \"31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff\") " pod="openstack/glance-db-sync-7zgw8" Mar 10 07:04:34 crc kubenswrapper[4825]: I0310 07:04:34.548033 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65w7p\" (UniqueName: \"kubernetes.io/projected/31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff-kube-api-access-65w7p\") pod \"glance-db-sync-7zgw8\" (UID: \"31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff\") " pod="openstack/glance-db-sync-7zgw8" Mar 10 07:04:34 crc kubenswrapper[4825]: I0310 07:04:34.582151 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-7zgw8" Mar 10 07:04:35 crc kubenswrapper[4825]: I0310 07:04:35.208591 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-7zgw8"] Mar 10 07:04:35 crc kubenswrapper[4825]: W0310 07:04:35.209153 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31d721ea_21f5_4f7b_a5c2_cee3aa78f6ff.slice/crio-6d18a0ad45c18f6101409e5f30e5d6e51550a0a09f4c57f78982f3234faf52d4 WatchSource:0}: Error finding container 6d18a0ad45c18f6101409e5f30e5d6e51550a0a09f4c57f78982f3234faf52d4: Status 404 returned error can't find the container with id 6d18a0ad45c18f6101409e5f30e5d6e51550a0a09f4c57f78982f3234faf52d4 Mar 10 07:04:35 crc kubenswrapper[4825]: I0310 07:04:35.551804 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-ks6c8"] Mar 10 07:04:35 crc kubenswrapper[4825]: I0310 07:04:35.552872 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-ks6c8" Mar 10 07:04:35 crc kubenswrapper[4825]: I0310 07:04:35.554668 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 10 07:04:35 crc kubenswrapper[4825]: I0310 07:04:35.564794 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-ks6c8"] Mar 10 07:04:35 crc kubenswrapper[4825]: I0310 07:04:35.646372 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvrll\" (UniqueName: \"kubernetes.io/projected/3da4390e-9a55-4bfb-a028-b8bf06342cf5-kube-api-access-qvrll\") pod \"root-account-create-update-ks6c8\" (UID: \"3da4390e-9a55-4bfb-a028-b8bf06342cf5\") " pod="openstack/root-account-create-update-ks6c8" Mar 10 07:04:35 crc kubenswrapper[4825]: I0310 07:04:35.646782 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3da4390e-9a55-4bfb-a028-b8bf06342cf5-operator-scripts\") pod \"root-account-create-update-ks6c8\" (UID: \"3da4390e-9a55-4bfb-a028-b8bf06342cf5\") " pod="openstack/root-account-create-update-ks6c8" Mar 10 07:04:35 crc kubenswrapper[4825]: I0310 07:04:35.748525 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvrll\" (UniqueName: \"kubernetes.io/projected/3da4390e-9a55-4bfb-a028-b8bf06342cf5-kube-api-access-qvrll\") pod \"root-account-create-update-ks6c8\" (UID: \"3da4390e-9a55-4bfb-a028-b8bf06342cf5\") " pod="openstack/root-account-create-update-ks6c8" Mar 10 07:04:35 crc kubenswrapper[4825]: I0310 07:04:35.748939 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3da4390e-9a55-4bfb-a028-b8bf06342cf5-operator-scripts\") pod \"root-account-create-update-ks6c8\" (UID: 
\"3da4390e-9a55-4bfb-a028-b8bf06342cf5\") " pod="openstack/root-account-create-update-ks6c8" Mar 10 07:04:35 crc kubenswrapper[4825]: I0310 07:04:35.750157 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3da4390e-9a55-4bfb-a028-b8bf06342cf5-operator-scripts\") pod \"root-account-create-update-ks6c8\" (UID: \"3da4390e-9a55-4bfb-a028-b8bf06342cf5\") " pod="openstack/root-account-create-update-ks6c8" Mar 10 07:04:35 crc kubenswrapper[4825]: I0310 07:04:35.779275 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvrll\" (UniqueName: \"kubernetes.io/projected/3da4390e-9a55-4bfb-a028-b8bf06342cf5-kube-api-access-qvrll\") pod \"root-account-create-update-ks6c8\" (UID: \"3da4390e-9a55-4bfb-a028-b8bf06342cf5\") " pod="openstack/root-account-create-update-ks6c8" Mar 10 07:04:35 crc kubenswrapper[4825]: I0310 07:04:35.816925 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-7zgw8" event={"ID":"31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff","Type":"ContainerStarted","Data":"6d18a0ad45c18f6101409e5f30e5d6e51550a0a09f4c57f78982f3234faf52d4"} Mar 10 07:04:35 crc kubenswrapper[4825]: I0310 07:04:35.818816 4825 generic.go:334] "Generic (PLEG): container finished" podID="3b166cbd-0add-4896-92ab-caed991a0a06" containerID="7f5c45f549267da3853bd7ddfebfbea9374ef2c39b83d88180442cd2939efb42" exitCode=0 Mar 10 07:04:35 crc kubenswrapper[4825]: I0310 07:04:35.818865 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2sxbl" event={"ID":"3b166cbd-0add-4896-92ab-caed991a0a06","Type":"ContainerDied","Data":"7f5c45f549267da3853bd7ddfebfbea9374ef2c39b83d88180442cd2939efb42"} Mar 10 07:04:35 crc kubenswrapper[4825]: I0310 07:04:35.868502 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-ks6c8" Mar 10 07:04:36 crc kubenswrapper[4825]: I0310 07:04:36.293951 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-ks6c8"] Mar 10 07:04:36 crc kubenswrapper[4825]: W0310 07:04:36.303404 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3da4390e_9a55_4bfb_a028_b8bf06342cf5.slice/crio-5bfd5b579e0a0771f4740673964337c264827442fa36d1758df6da3759a0a536 WatchSource:0}: Error finding container 5bfd5b579e0a0771f4740673964337c264827442fa36d1758df6da3759a0a536: Status 404 returned error can't find the container with id 5bfd5b579e0a0771f4740673964337c264827442fa36d1758df6da3759a0a536 Mar 10 07:04:36 crc kubenswrapper[4825]: I0310 07:04:36.830294 4825 generic.go:334] "Generic (PLEG): container finished" podID="3da4390e-9a55-4bfb-a028-b8bf06342cf5" containerID="703a298507850b13b032c3baeba76b06d9175a69c35e6fb7102fe7069f2b5cb9" exitCode=0 Mar 10 07:04:36 crc kubenswrapper[4825]: I0310 07:04:36.830444 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ks6c8" event={"ID":"3da4390e-9a55-4bfb-a028-b8bf06342cf5","Type":"ContainerDied","Data":"703a298507850b13b032c3baeba76b06d9175a69c35e6fb7102fe7069f2b5cb9"} Mar 10 07:04:36 crc kubenswrapper[4825]: I0310 07:04:36.830619 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ks6c8" event={"ID":"3da4390e-9a55-4bfb-a028-b8bf06342cf5","Type":"ContainerStarted","Data":"5bfd5b579e0a0771f4740673964337c264827442fa36d1758df6da3759a0a536"} Mar 10 07:04:37 crc kubenswrapper[4825]: I0310 07:04:37.193373 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-2sxbl" Mar 10 07:04:37 crc kubenswrapper[4825]: I0310 07:04:37.292673 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3b166cbd-0add-4896-92ab-caed991a0a06-etc-swift\") pod \"3b166cbd-0add-4896-92ab-caed991a0a06\" (UID: \"3b166cbd-0add-4896-92ab-caed991a0a06\") " Mar 10 07:04:37 crc kubenswrapper[4825]: I0310 07:04:37.292920 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3b166cbd-0add-4896-92ab-caed991a0a06-scripts\") pod \"3b166cbd-0add-4896-92ab-caed991a0a06\" (UID: \"3b166cbd-0add-4896-92ab-caed991a0a06\") " Mar 10 07:04:37 crc kubenswrapper[4825]: I0310 07:04:37.292976 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3b166cbd-0add-4896-92ab-caed991a0a06-swiftconf\") pod \"3b166cbd-0add-4896-92ab-caed991a0a06\" (UID: \"3b166cbd-0add-4896-92ab-caed991a0a06\") " Mar 10 07:04:37 crc kubenswrapper[4825]: I0310 07:04:37.293082 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b166cbd-0add-4896-92ab-caed991a0a06-combined-ca-bundle\") pod \"3b166cbd-0add-4896-92ab-caed991a0a06\" (UID: \"3b166cbd-0add-4896-92ab-caed991a0a06\") " Mar 10 07:04:37 crc kubenswrapper[4825]: I0310 07:04:37.293173 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2x4mg\" (UniqueName: \"kubernetes.io/projected/3b166cbd-0add-4896-92ab-caed991a0a06-kube-api-access-2x4mg\") pod \"3b166cbd-0add-4896-92ab-caed991a0a06\" (UID: \"3b166cbd-0add-4896-92ab-caed991a0a06\") " Mar 10 07:04:37 crc kubenswrapper[4825]: I0310 07:04:37.293230 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/3b166cbd-0add-4896-92ab-caed991a0a06-dispersionconf\") pod \"3b166cbd-0add-4896-92ab-caed991a0a06\" (UID: \"3b166cbd-0add-4896-92ab-caed991a0a06\") " Mar 10 07:04:37 crc kubenswrapper[4825]: I0310 07:04:37.293349 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3b166cbd-0add-4896-92ab-caed991a0a06-ring-data-devices\") pod \"3b166cbd-0add-4896-92ab-caed991a0a06\" (UID: \"3b166cbd-0add-4896-92ab-caed991a0a06\") " Mar 10 07:04:37 crc kubenswrapper[4825]: I0310 07:04:37.293617 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b166cbd-0add-4896-92ab-caed991a0a06-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "3b166cbd-0add-4896-92ab-caed991a0a06" (UID: "3b166cbd-0add-4896-92ab-caed991a0a06"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:04:37 crc kubenswrapper[4825]: I0310 07:04:37.294055 4825 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3b166cbd-0add-4896-92ab-caed991a0a06-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:37 crc kubenswrapper[4825]: I0310 07:04:37.295721 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b166cbd-0add-4896-92ab-caed991a0a06-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "3b166cbd-0add-4896-92ab-caed991a0a06" (UID: "3b166cbd-0add-4896-92ab-caed991a0a06"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:04:37 crc kubenswrapper[4825]: I0310 07:04:37.328345 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b166cbd-0add-4896-92ab-caed991a0a06-scripts" (OuterVolumeSpecName: "scripts") pod "3b166cbd-0add-4896-92ab-caed991a0a06" (UID: "3b166cbd-0add-4896-92ab-caed991a0a06"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:04:37 crc kubenswrapper[4825]: I0310 07:04:37.328403 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b166cbd-0add-4896-92ab-caed991a0a06-kube-api-access-2x4mg" (OuterVolumeSpecName: "kube-api-access-2x4mg") pod "3b166cbd-0add-4896-92ab-caed991a0a06" (UID: "3b166cbd-0add-4896-92ab-caed991a0a06"). InnerVolumeSpecName "kube-api-access-2x4mg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:04:37 crc kubenswrapper[4825]: E0310 07:04:37.331922 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b166cbd-0add-4896-92ab-caed991a0a06-swiftconf podName:3b166cbd-0add-4896-92ab-caed991a0a06 nodeName:}" failed. No retries permitted until 2026-03-10 07:04:37.83189214 +0000 UTC m=+1230.861672755 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "swiftconf" (UniqueName: "kubernetes.io/secret/3b166cbd-0add-4896-92ab-caed991a0a06-swiftconf") pod "3b166cbd-0add-4896-92ab-caed991a0a06" (UID: "3b166cbd-0add-4896-92ab-caed991a0a06") : error deleting /var/lib/kubelet/pods/3b166cbd-0add-4896-92ab-caed991a0a06/volume-subpaths: remove /var/lib/kubelet/pods/3b166cbd-0add-4896-92ab-caed991a0a06/volume-subpaths: no such file or directory Mar 10 07:04:37 crc kubenswrapper[4825]: I0310 07:04:37.332474 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b166cbd-0add-4896-92ab-caed991a0a06-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "3b166cbd-0add-4896-92ab-caed991a0a06" (UID: "3b166cbd-0add-4896-92ab-caed991a0a06"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:04:37 crc kubenswrapper[4825]: I0310 07:04:37.336593 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b166cbd-0add-4896-92ab-caed991a0a06-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3b166cbd-0add-4896-92ab-caed991a0a06" (UID: "3b166cbd-0add-4896-92ab-caed991a0a06"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:04:37 crc kubenswrapper[4825]: I0310 07:04:37.395268 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3b166cbd-0add-4896-92ab-caed991a0a06-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:37 crc kubenswrapper[4825]: I0310 07:04:37.395341 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b166cbd-0add-4896-92ab-caed991a0a06-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:37 crc kubenswrapper[4825]: I0310 07:04:37.395504 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2x4mg\" (UniqueName: \"kubernetes.io/projected/3b166cbd-0add-4896-92ab-caed991a0a06-kube-api-access-2x4mg\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:37 crc kubenswrapper[4825]: I0310 07:04:37.395585 4825 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3b166cbd-0add-4896-92ab-caed991a0a06-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:37 crc kubenswrapper[4825]: I0310 07:04:37.395632 4825 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3b166cbd-0add-4896-92ab-caed991a0a06-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:37 crc kubenswrapper[4825]: I0310 07:04:37.844621 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2sxbl" event={"ID":"3b166cbd-0add-4896-92ab-caed991a0a06","Type":"ContainerDied","Data":"471a479090ebfe92bcdcc8dac58152fd527bc9056890cd6225b60f3d237edb51"} Mar 10 07:04:37 crc kubenswrapper[4825]: I0310 07:04:37.844692 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="471a479090ebfe92bcdcc8dac58152fd527bc9056890cd6225b60f3d237edb51" Mar 10 07:04:37 crc kubenswrapper[4825]: I0310 07:04:37.844685 4825 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-2sxbl" Mar 10 07:04:37 crc kubenswrapper[4825]: I0310 07:04:37.905280 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3b166cbd-0add-4896-92ab-caed991a0a06-swiftconf\") pod \"3b166cbd-0add-4896-92ab-caed991a0a06\" (UID: \"3b166cbd-0add-4896-92ab-caed991a0a06\") " Mar 10 07:04:37 crc kubenswrapper[4825]: I0310 07:04:37.910682 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b166cbd-0add-4896-92ab-caed991a0a06-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "3b166cbd-0add-4896-92ab-caed991a0a06" (UID: "3b166cbd-0add-4896-92ab-caed991a0a06"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:04:38 crc kubenswrapper[4825]: I0310 07:04:38.013432 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/632168fa-0532-4c7e-b688-fb361ee89ec8-etc-swift\") pod \"swift-storage-0\" (UID: \"632168fa-0532-4c7e-b688-fb361ee89ec8\") " pod="openstack/swift-storage-0" Mar 10 07:04:38 crc kubenswrapper[4825]: I0310 07:04:38.013502 4825 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3b166cbd-0add-4896-92ab-caed991a0a06-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:38 crc kubenswrapper[4825]: I0310 07:04:38.018288 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/632168fa-0532-4c7e-b688-fb361ee89ec8-etc-swift\") pod \"swift-storage-0\" (UID: \"632168fa-0532-4c7e-b688-fb361ee89ec8\") " pod="openstack/swift-storage-0" Mar 10 07:04:38 crc kubenswrapper[4825]: I0310 07:04:38.124434 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 10 07:04:38 crc kubenswrapper[4825]: I0310 07:04:38.302360 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-ks6c8" Mar 10 07:04:38 crc kubenswrapper[4825]: I0310 07:04:38.432618 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvrll\" (UniqueName: \"kubernetes.io/projected/3da4390e-9a55-4bfb-a028-b8bf06342cf5-kube-api-access-qvrll\") pod \"3da4390e-9a55-4bfb-a028-b8bf06342cf5\" (UID: \"3da4390e-9a55-4bfb-a028-b8bf06342cf5\") " Mar 10 07:04:38 crc kubenswrapper[4825]: I0310 07:04:38.432705 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3da4390e-9a55-4bfb-a028-b8bf06342cf5-operator-scripts\") pod \"3da4390e-9a55-4bfb-a028-b8bf06342cf5\" (UID: \"3da4390e-9a55-4bfb-a028-b8bf06342cf5\") " Mar 10 07:04:38 crc kubenswrapper[4825]: I0310 07:04:38.433734 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3da4390e-9a55-4bfb-a028-b8bf06342cf5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3da4390e-9a55-4bfb-a028-b8bf06342cf5" (UID: "3da4390e-9a55-4bfb-a028-b8bf06342cf5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:04:38 crc kubenswrapper[4825]: I0310 07:04:38.439032 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3da4390e-9a55-4bfb-a028-b8bf06342cf5-kube-api-access-qvrll" (OuterVolumeSpecName: "kube-api-access-qvrll") pod "3da4390e-9a55-4bfb-a028-b8bf06342cf5" (UID: "3da4390e-9a55-4bfb-a028-b8bf06342cf5"). InnerVolumeSpecName "kube-api-access-qvrll". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:04:38 crc kubenswrapper[4825]: I0310 07:04:38.534173 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvrll\" (UniqueName: \"kubernetes.io/projected/3da4390e-9a55-4bfb-a028-b8bf06342cf5-kube-api-access-qvrll\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:38 crc kubenswrapper[4825]: I0310 07:04:38.534209 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3da4390e-9a55-4bfb-a028-b8bf06342cf5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:38 crc kubenswrapper[4825]: I0310 07:04:38.830823 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 10 07:04:38 crc kubenswrapper[4825]: I0310 07:04:38.854384 4825 generic.go:334] "Generic (PLEG): container finished" podID="40efa241-98cc-4dec-9ae8-8a892b367ebc" containerID="afabc77fd2eb1f467ac3dc9c5bc7ab28337af9e10ce2b6cf66724e5f31f41951" exitCode=0 Mar 10 07:04:38 crc kubenswrapper[4825]: I0310 07:04:38.854473 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"40efa241-98cc-4dec-9ae8-8a892b367ebc","Type":"ContainerDied","Data":"afabc77fd2eb1f467ac3dc9c5bc7ab28337af9e10ce2b6cf66724e5f31f41951"} Mar 10 07:04:38 crc kubenswrapper[4825]: I0310 07:04:38.859252 4825 generic.go:334] "Generic (PLEG): container finished" podID="9ba0e3ee-1309-4411-a927-866b35c2776b" containerID="85a2b6ae62f4562edc4361ee48380de641fa71091bc57a5172633641c40144a7" exitCode=0 Mar 10 07:04:38 crc kubenswrapper[4825]: I0310 07:04:38.859348 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9ba0e3ee-1309-4411-a927-866b35c2776b","Type":"ContainerDied","Data":"85a2b6ae62f4562edc4361ee48380de641fa71091bc57a5172633641c40144a7"} Mar 10 07:04:38 crc kubenswrapper[4825]: I0310 07:04:38.864744 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/root-account-create-update-ks6c8" event={"ID":"3da4390e-9a55-4bfb-a028-b8bf06342cf5","Type":"ContainerDied","Data":"5bfd5b579e0a0771f4740673964337c264827442fa36d1758df6da3759a0a536"} Mar 10 07:04:38 crc kubenswrapper[4825]: I0310 07:04:38.864784 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bfd5b579e0a0771f4740673964337c264827442fa36d1758df6da3759a0a536" Mar 10 07:04:38 crc kubenswrapper[4825]: I0310 07:04:38.865012 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-ks6c8" Mar 10 07:04:38 crc kubenswrapper[4825]: I0310 07:04:38.876553 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"632168fa-0532-4c7e-b688-fb361ee89ec8","Type":"ContainerStarted","Data":"5d31298fc11e6c4ffc4b3bf96856e7a83277dceb2c2cca7b069e69b3b80cf543"} Mar 10 07:04:39 crc kubenswrapper[4825]: I0310 07:04:39.889921 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"40efa241-98cc-4dec-9ae8-8a892b367ebc","Type":"ContainerStarted","Data":"a98a5e36d3a3124ffe2ffa323d51e71ad524e44ce610050ac778bcba5e77fd17"} Mar 10 07:04:39 crc kubenswrapper[4825]: I0310 07:04:39.890606 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 10 07:04:39 crc kubenswrapper[4825]: I0310 07:04:39.892332 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9ba0e3ee-1309-4411-a927-866b35c2776b","Type":"ContainerStarted","Data":"33787b88c881e69ba026aff5962a5455c1e4b2f94327fcc119388ebe1cd30211"} Mar 10 07:04:39 crc kubenswrapper[4825]: I0310 07:04:39.892879 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 10 07:04:39 crc kubenswrapper[4825]: I0310 07:04:39.925539 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/rabbitmq-server-0" podStartSLOduration=41.656459021 podStartE2EDuration="56.925520079s" podCreationTimestamp="2026-03-10 07:03:43 +0000 UTC" firstStartedPulling="2026-03-10 07:03:45.37630033 +0000 UTC m=+1178.406080955" lastFinishedPulling="2026-03-10 07:04:00.645361358 +0000 UTC m=+1193.675142013" observedRunningTime="2026-03-10 07:04:39.917807958 +0000 UTC m=+1232.947588573" watchObservedRunningTime="2026-03-10 07:04:39.925520079 +0000 UTC m=+1232.955300694" Mar 10 07:04:39 crc kubenswrapper[4825]: I0310 07:04:39.938889 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=41.265892736 podStartE2EDuration="55.938865947s" podCreationTimestamp="2026-03-10 07:03:44 +0000 UTC" firstStartedPulling="2026-03-10 07:03:50.93306979 +0000 UTC m=+1183.962850415" lastFinishedPulling="2026-03-10 07:04:05.606043001 +0000 UTC m=+1198.635823626" observedRunningTime="2026-03-10 07:04:39.935655823 +0000 UTC m=+1232.965436458" watchObservedRunningTime="2026-03-10 07:04:39.938865947 +0000 UTC m=+1232.968646582" Mar 10 07:04:40 crc kubenswrapper[4825]: I0310 07:04:40.282168 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 10 07:04:40 crc kubenswrapper[4825]: I0310 07:04:40.910238 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"632168fa-0532-4c7e-b688-fb361ee89ec8","Type":"ContainerStarted","Data":"49e3de3dc2efd76ad3dd592ab843057ad3883dc58b1b23503b05369a2fb81259"} Mar 10 07:04:40 crc kubenswrapper[4825]: I0310 07:04:40.910306 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"632168fa-0532-4c7e-b688-fb361ee89ec8","Type":"ContainerStarted","Data":"001ce68a7777bd6940dec1d7b9d36b3306b90993cc66dd9412186267450b2a7a"} Mar 10 07:04:40 crc kubenswrapper[4825]: I0310 07:04:40.910322 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"632168fa-0532-4c7e-b688-fb361ee89ec8","Type":"ContainerStarted","Data":"b6da432e4ef6e11dcf09322fd99510f39d8b7a43cde3cc5579134c6692dd38d0"} Mar 10 07:04:41 crc kubenswrapper[4825]: I0310 07:04:41.923722 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"632168fa-0532-4c7e-b688-fb361ee89ec8","Type":"ContainerStarted","Data":"dbff9d40c47a9b57b2e18d794a415a9500d5967b3986aef885713d9846f81e1f"} Mar 10 07:04:42 crc kubenswrapper[4825]: I0310 07:04:42.020355 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-ks6c8"] Mar 10 07:04:42 crc kubenswrapper[4825]: I0310 07:04:42.030391 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-ks6c8"] Mar 10 07:04:43 crc kubenswrapper[4825]: I0310 07:04:43.259890 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3da4390e-9a55-4bfb-a028-b8bf06342cf5" path="/var/lib/kubelet/pods/3da4390e-9a55-4bfb-a028-b8bf06342cf5/volumes" Mar 10 07:04:44 crc kubenswrapper[4825]: I0310 07:04:44.288092 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-jctbb" podUID="2b7468b7-ceaa-44b4-8364-5d3601f43c1b" containerName="ovn-controller" probeResult="failure" output=< Mar 10 07:04:44 crc kubenswrapper[4825]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 10 07:04:44 crc kubenswrapper[4825]: > Mar 10 07:04:44 crc kubenswrapper[4825]: I0310 07:04:44.388793 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-8x4pm" Mar 10 07:04:44 crc kubenswrapper[4825]: I0310 07:04:44.399796 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-8x4pm" Mar 10 07:04:44 crc kubenswrapper[4825]: I0310 07:04:44.648478 4825 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/ovn-controller-jctbb-config-hxnml"] Mar 10 07:04:44 crc kubenswrapper[4825]: E0310 07:04:44.648896 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3da4390e-9a55-4bfb-a028-b8bf06342cf5" containerName="mariadb-account-create-update" Mar 10 07:04:44 crc kubenswrapper[4825]: I0310 07:04:44.648917 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="3da4390e-9a55-4bfb-a028-b8bf06342cf5" containerName="mariadb-account-create-update" Mar 10 07:04:44 crc kubenswrapper[4825]: E0310 07:04:44.648942 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b166cbd-0add-4896-92ab-caed991a0a06" containerName="swift-ring-rebalance" Mar 10 07:04:44 crc kubenswrapper[4825]: I0310 07:04:44.648948 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b166cbd-0add-4896-92ab-caed991a0a06" containerName="swift-ring-rebalance" Mar 10 07:04:44 crc kubenswrapper[4825]: I0310 07:04:44.649112 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="3da4390e-9a55-4bfb-a028-b8bf06342cf5" containerName="mariadb-account-create-update" Mar 10 07:04:44 crc kubenswrapper[4825]: I0310 07:04:44.649148 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b166cbd-0add-4896-92ab-caed991a0a06" containerName="swift-ring-rebalance" Mar 10 07:04:44 crc kubenswrapper[4825]: I0310 07:04:44.649691 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-jctbb-config-hxnml" Mar 10 07:04:44 crc kubenswrapper[4825]: I0310 07:04:44.658074 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 10 07:04:44 crc kubenswrapper[4825]: I0310 07:04:44.660640 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-jctbb-config-hxnml"] Mar 10 07:04:44 crc kubenswrapper[4825]: I0310 07:04:44.674515 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b81795a7-9260-45ee-ac6d-b714ed894ce2-var-run-ovn\") pod \"ovn-controller-jctbb-config-hxnml\" (UID: \"b81795a7-9260-45ee-ac6d-b714ed894ce2\") " pod="openstack/ovn-controller-jctbb-config-hxnml" Mar 10 07:04:44 crc kubenswrapper[4825]: I0310 07:04:44.674660 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b81795a7-9260-45ee-ac6d-b714ed894ce2-scripts\") pod \"ovn-controller-jctbb-config-hxnml\" (UID: \"b81795a7-9260-45ee-ac6d-b714ed894ce2\") " pod="openstack/ovn-controller-jctbb-config-hxnml" Mar 10 07:04:44 crc kubenswrapper[4825]: I0310 07:04:44.674797 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58x57\" (UniqueName: \"kubernetes.io/projected/b81795a7-9260-45ee-ac6d-b714ed894ce2-kube-api-access-58x57\") pod \"ovn-controller-jctbb-config-hxnml\" (UID: \"b81795a7-9260-45ee-ac6d-b714ed894ce2\") " pod="openstack/ovn-controller-jctbb-config-hxnml" Mar 10 07:04:44 crc kubenswrapper[4825]: I0310 07:04:44.674922 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b81795a7-9260-45ee-ac6d-b714ed894ce2-var-run\") pod \"ovn-controller-jctbb-config-hxnml\" (UID: 
\"b81795a7-9260-45ee-ac6d-b714ed894ce2\") " pod="openstack/ovn-controller-jctbb-config-hxnml" Mar 10 07:04:44 crc kubenswrapper[4825]: I0310 07:04:44.674958 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b81795a7-9260-45ee-ac6d-b714ed894ce2-additional-scripts\") pod \"ovn-controller-jctbb-config-hxnml\" (UID: \"b81795a7-9260-45ee-ac6d-b714ed894ce2\") " pod="openstack/ovn-controller-jctbb-config-hxnml" Mar 10 07:04:44 crc kubenswrapper[4825]: I0310 07:04:44.674991 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b81795a7-9260-45ee-ac6d-b714ed894ce2-var-log-ovn\") pod \"ovn-controller-jctbb-config-hxnml\" (UID: \"b81795a7-9260-45ee-ac6d-b714ed894ce2\") " pod="openstack/ovn-controller-jctbb-config-hxnml" Mar 10 07:04:44 crc kubenswrapper[4825]: I0310 07:04:44.776873 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b81795a7-9260-45ee-ac6d-b714ed894ce2-scripts\") pod \"ovn-controller-jctbb-config-hxnml\" (UID: \"b81795a7-9260-45ee-ac6d-b714ed894ce2\") " pod="openstack/ovn-controller-jctbb-config-hxnml" Mar 10 07:04:44 crc kubenswrapper[4825]: I0310 07:04:44.777025 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58x57\" (UniqueName: \"kubernetes.io/projected/b81795a7-9260-45ee-ac6d-b714ed894ce2-kube-api-access-58x57\") pod \"ovn-controller-jctbb-config-hxnml\" (UID: \"b81795a7-9260-45ee-ac6d-b714ed894ce2\") " pod="openstack/ovn-controller-jctbb-config-hxnml" Mar 10 07:04:44 crc kubenswrapper[4825]: I0310 07:04:44.777159 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b81795a7-9260-45ee-ac6d-b714ed894ce2-var-run\") pod 
\"ovn-controller-jctbb-config-hxnml\" (UID: \"b81795a7-9260-45ee-ac6d-b714ed894ce2\") " pod="openstack/ovn-controller-jctbb-config-hxnml" Mar 10 07:04:44 crc kubenswrapper[4825]: I0310 07:04:44.777201 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b81795a7-9260-45ee-ac6d-b714ed894ce2-additional-scripts\") pod \"ovn-controller-jctbb-config-hxnml\" (UID: \"b81795a7-9260-45ee-ac6d-b714ed894ce2\") " pod="openstack/ovn-controller-jctbb-config-hxnml" Mar 10 07:04:44 crc kubenswrapper[4825]: I0310 07:04:44.777244 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b81795a7-9260-45ee-ac6d-b714ed894ce2-var-log-ovn\") pod \"ovn-controller-jctbb-config-hxnml\" (UID: \"b81795a7-9260-45ee-ac6d-b714ed894ce2\") " pod="openstack/ovn-controller-jctbb-config-hxnml" Mar 10 07:04:44 crc kubenswrapper[4825]: I0310 07:04:44.777276 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b81795a7-9260-45ee-ac6d-b714ed894ce2-var-run-ovn\") pod \"ovn-controller-jctbb-config-hxnml\" (UID: \"b81795a7-9260-45ee-ac6d-b714ed894ce2\") " pod="openstack/ovn-controller-jctbb-config-hxnml" Mar 10 07:04:44 crc kubenswrapper[4825]: I0310 07:04:44.777626 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b81795a7-9260-45ee-ac6d-b714ed894ce2-var-run-ovn\") pod \"ovn-controller-jctbb-config-hxnml\" (UID: \"b81795a7-9260-45ee-ac6d-b714ed894ce2\") " pod="openstack/ovn-controller-jctbb-config-hxnml" Mar 10 07:04:44 crc kubenswrapper[4825]: I0310 07:04:44.777867 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b81795a7-9260-45ee-ac6d-b714ed894ce2-var-run\") pod \"ovn-controller-jctbb-config-hxnml\" (UID: 
\"b81795a7-9260-45ee-ac6d-b714ed894ce2\") " pod="openstack/ovn-controller-jctbb-config-hxnml" Mar 10 07:04:44 crc kubenswrapper[4825]: I0310 07:04:44.778556 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b81795a7-9260-45ee-ac6d-b714ed894ce2-var-log-ovn\") pod \"ovn-controller-jctbb-config-hxnml\" (UID: \"b81795a7-9260-45ee-ac6d-b714ed894ce2\") " pod="openstack/ovn-controller-jctbb-config-hxnml" Mar 10 07:04:44 crc kubenswrapper[4825]: I0310 07:04:44.779378 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b81795a7-9260-45ee-ac6d-b714ed894ce2-additional-scripts\") pod \"ovn-controller-jctbb-config-hxnml\" (UID: \"b81795a7-9260-45ee-ac6d-b714ed894ce2\") " pod="openstack/ovn-controller-jctbb-config-hxnml" Mar 10 07:04:44 crc kubenswrapper[4825]: I0310 07:04:44.781279 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b81795a7-9260-45ee-ac6d-b714ed894ce2-scripts\") pod \"ovn-controller-jctbb-config-hxnml\" (UID: \"b81795a7-9260-45ee-ac6d-b714ed894ce2\") " pod="openstack/ovn-controller-jctbb-config-hxnml" Mar 10 07:04:44 crc kubenswrapper[4825]: I0310 07:04:44.809001 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58x57\" (UniqueName: \"kubernetes.io/projected/b81795a7-9260-45ee-ac6d-b714ed894ce2-kube-api-access-58x57\") pod \"ovn-controller-jctbb-config-hxnml\" (UID: \"b81795a7-9260-45ee-ac6d-b714ed894ce2\") " pod="openstack/ovn-controller-jctbb-config-hxnml" Mar 10 07:04:44 crc kubenswrapper[4825]: I0310 07:04:44.969766 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-jctbb-config-hxnml" Mar 10 07:04:47 crc kubenswrapper[4825]: I0310 07:04:47.002731 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-z4g4q"] Mar 10 07:04:47 crc kubenswrapper[4825]: I0310 07:04:47.004341 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-z4g4q" Mar 10 07:04:47 crc kubenswrapper[4825]: I0310 07:04:47.006505 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 10 07:04:47 crc kubenswrapper[4825]: I0310 07:04:47.012240 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-z4g4q"] Mar 10 07:04:47 crc kubenswrapper[4825]: I0310 07:04:47.122547 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b31a6071-5d6d-4ea5-9639-8fc8ea220862-operator-scripts\") pod \"root-account-create-update-z4g4q\" (UID: \"b31a6071-5d6d-4ea5-9639-8fc8ea220862\") " pod="openstack/root-account-create-update-z4g4q" Mar 10 07:04:47 crc kubenswrapper[4825]: I0310 07:04:47.122796 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk4zh\" (UniqueName: \"kubernetes.io/projected/b31a6071-5d6d-4ea5-9639-8fc8ea220862-kube-api-access-wk4zh\") pod \"root-account-create-update-z4g4q\" (UID: \"b31a6071-5d6d-4ea5-9639-8fc8ea220862\") " pod="openstack/root-account-create-update-z4g4q" Mar 10 07:04:47 crc kubenswrapper[4825]: I0310 07:04:47.223967 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b31a6071-5d6d-4ea5-9639-8fc8ea220862-operator-scripts\") pod \"root-account-create-update-z4g4q\" (UID: \"b31a6071-5d6d-4ea5-9639-8fc8ea220862\") " 
pod="openstack/root-account-create-update-z4g4q" Mar 10 07:04:47 crc kubenswrapper[4825]: I0310 07:04:47.224153 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk4zh\" (UniqueName: \"kubernetes.io/projected/b31a6071-5d6d-4ea5-9639-8fc8ea220862-kube-api-access-wk4zh\") pod \"root-account-create-update-z4g4q\" (UID: \"b31a6071-5d6d-4ea5-9639-8fc8ea220862\") " pod="openstack/root-account-create-update-z4g4q" Mar 10 07:04:47 crc kubenswrapper[4825]: I0310 07:04:47.225247 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b31a6071-5d6d-4ea5-9639-8fc8ea220862-operator-scripts\") pod \"root-account-create-update-z4g4q\" (UID: \"b31a6071-5d6d-4ea5-9639-8fc8ea220862\") " pod="openstack/root-account-create-update-z4g4q" Mar 10 07:04:47 crc kubenswrapper[4825]: I0310 07:04:47.247186 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk4zh\" (UniqueName: \"kubernetes.io/projected/b31a6071-5d6d-4ea5-9639-8fc8ea220862-kube-api-access-wk4zh\") pod \"root-account-create-update-z4g4q\" (UID: \"b31a6071-5d6d-4ea5-9639-8fc8ea220862\") " pod="openstack/root-account-create-update-z4g4q" Mar 10 07:04:47 crc kubenswrapper[4825]: I0310 07:04:47.322305 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-z4g4q" Mar 10 07:04:49 crc kubenswrapper[4825]: I0310 07:04:49.283609 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-jctbb" podUID="2b7468b7-ceaa-44b4-8364-5d3601f43c1b" containerName="ovn-controller" probeResult="failure" output=< Mar 10 07:04:49 crc kubenswrapper[4825]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 10 07:04:49 crc kubenswrapper[4825]: > Mar 10 07:04:50 crc kubenswrapper[4825]: I0310 07:04:50.360183 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-z4g4q"] Mar 10 07:04:50 crc kubenswrapper[4825]: I0310 07:04:50.367909 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-jctbb-config-hxnml"] Mar 10 07:04:50 crc kubenswrapper[4825]: W0310 07:04:50.386408 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb81795a7_9260_45ee_ac6d_b714ed894ce2.slice/crio-d5b413c0f067181dc9dc2ef4a760b176e2534b0ed5cedc5cb527b3ac27762a2e WatchSource:0}: Error finding container d5b413c0f067181dc9dc2ef4a760b176e2534b0ed5cedc5cb527b3ac27762a2e: Status 404 returned error can't find the container with id d5b413c0f067181dc9dc2ef4a760b176e2534b0ed5cedc5cb527b3ac27762a2e Mar 10 07:04:51 crc kubenswrapper[4825]: I0310 07:04:51.006668 4825 generic.go:334] "Generic (PLEG): container finished" podID="b81795a7-9260-45ee-ac6d-b714ed894ce2" containerID="3bd00aaa4f5f332f9aa4d37bf582366cd2e28121d59d7ff53887a4d10a61b3e2" exitCode=0 Mar 10 07:04:51 crc kubenswrapper[4825]: I0310 07:04:51.006734 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jctbb-config-hxnml" event={"ID":"b81795a7-9260-45ee-ac6d-b714ed894ce2","Type":"ContainerDied","Data":"3bd00aaa4f5f332f9aa4d37bf582366cd2e28121d59d7ff53887a4d10a61b3e2"} Mar 10 07:04:51 crc 
kubenswrapper[4825]: I0310 07:04:51.006972 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jctbb-config-hxnml" event={"ID":"b81795a7-9260-45ee-ac6d-b714ed894ce2","Type":"ContainerStarted","Data":"d5b413c0f067181dc9dc2ef4a760b176e2534b0ed5cedc5cb527b3ac27762a2e"} Mar 10 07:04:51 crc kubenswrapper[4825]: I0310 07:04:51.008434 4825 generic.go:334] "Generic (PLEG): container finished" podID="b31a6071-5d6d-4ea5-9639-8fc8ea220862" containerID="9c0002c4d7294538321f66a795d740ab3d1a89b545a70e9228503bb72c6cd2b1" exitCode=0 Mar 10 07:04:51 crc kubenswrapper[4825]: I0310 07:04:51.008461 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-z4g4q" event={"ID":"b31a6071-5d6d-4ea5-9639-8fc8ea220862","Type":"ContainerDied","Data":"9c0002c4d7294538321f66a795d740ab3d1a89b545a70e9228503bb72c6cd2b1"} Mar 10 07:04:51 crc kubenswrapper[4825]: I0310 07:04:51.008490 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-z4g4q" event={"ID":"b31a6071-5d6d-4ea5-9639-8fc8ea220862","Type":"ContainerStarted","Data":"d33a9817140f7fdc04367360ec8d9c06c1290624de4686eceb931d4df57d7b28"} Mar 10 07:04:51 crc kubenswrapper[4825]: I0310 07:04:51.012352 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"632168fa-0532-4c7e-b688-fb361ee89ec8","Type":"ContainerStarted","Data":"76563e2f197f8efff20de4d4a18477c3d06424bc7c12b84fc84e0bc0a8e03877"} Mar 10 07:04:51 crc kubenswrapper[4825]: I0310 07:04:51.012397 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"632168fa-0532-4c7e-b688-fb361ee89ec8","Type":"ContainerStarted","Data":"3c0a38e663ef3eeb91871a7cbf3ab45bc29c396569c4dd16e651b9a699527022"} Mar 10 07:04:51 crc kubenswrapper[4825]: I0310 07:04:51.012410 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"632168fa-0532-4c7e-b688-fb361ee89ec8","Type":"ContainerStarted","Data":"13e1e93f8ea417f0086cb72d1b5c3c472463fcb9b293144a2e6d095358753ed6"} Mar 10 07:04:51 crc kubenswrapper[4825]: I0310 07:04:51.013994 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-7zgw8" event={"ID":"31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff","Type":"ContainerStarted","Data":"9e2afec1031575eab3dc1ec0e1af73607aeb7fdff1f66af902b177103558df55"} Mar 10 07:04:51 crc kubenswrapper[4825]: I0310 07:04:51.074643 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-7zgw8" podStartSLOduration=2.054086263 podStartE2EDuration="17.074623793s" podCreationTimestamp="2026-03-10 07:04:34 +0000 UTC" firstStartedPulling="2026-03-10 07:04:35.211147353 +0000 UTC m=+1228.240927968" lastFinishedPulling="2026-03-10 07:04:50.231684883 +0000 UTC m=+1243.261465498" observedRunningTime="2026-03-10 07:04:51.071582294 +0000 UTC m=+1244.101362929" watchObservedRunningTime="2026-03-10 07:04:51.074623793 +0000 UTC m=+1244.104404418" Mar 10 07:04:52 crc kubenswrapper[4825]: I0310 07:04:52.030814 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"632168fa-0532-4c7e-b688-fb361ee89ec8","Type":"ContainerStarted","Data":"ec1f01797e26b3ab6c17de9ed25bdc13caaa429fb99d4c291f3d27585657d67a"} Mar 10 07:04:52 crc kubenswrapper[4825]: I0310 07:04:52.447756 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-z4g4q" Mar 10 07:04:52 crc kubenswrapper[4825]: I0310 07:04:52.519776 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b31a6071-5d6d-4ea5-9639-8fc8ea220862-operator-scripts\") pod \"b31a6071-5d6d-4ea5-9639-8fc8ea220862\" (UID: \"b31a6071-5d6d-4ea5-9639-8fc8ea220862\") " Mar 10 07:04:52 crc kubenswrapper[4825]: I0310 07:04:52.519898 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wk4zh\" (UniqueName: \"kubernetes.io/projected/b31a6071-5d6d-4ea5-9639-8fc8ea220862-kube-api-access-wk4zh\") pod \"b31a6071-5d6d-4ea5-9639-8fc8ea220862\" (UID: \"b31a6071-5d6d-4ea5-9639-8fc8ea220862\") " Mar 10 07:04:52 crc kubenswrapper[4825]: I0310 07:04:52.520767 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b31a6071-5d6d-4ea5-9639-8fc8ea220862-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b31a6071-5d6d-4ea5-9639-8fc8ea220862" (UID: "b31a6071-5d6d-4ea5-9639-8fc8ea220862"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:04:52 crc kubenswrapper[4825]: I0310 07:04:52.525903 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b31a6071-5d6d-4ea5-9639-8fc8ea220862-kube-api-access-wk4zh" (OuterVolumeSpecName: "kube-api-access-wk4zh") pod "b31a6071-5d6d-4ea5-9639-8fc8ea220862" (UID: "b31a6071-5d6d-4ea5-9639-8fc8ea220862"). InnerVolumeSpecName "kube-api-access-wk4zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:04:52 crc kubenswrapper[4825]: I0310 07:04:52.581156 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-jctbb-config-hxnml" Mar 10 07:04:52 crc kubenswrapper[4825]: I0310 07:04:52.621554 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b81795a7-9260-45ee-ac6d-b714ed894ce2-var-run\") pod \"b81795a7-9260-45ee-ac6d-b714ed894ce2\" (UID: \"b81795a7-9260-45ee-ac6d-b714ed894ce2\") " Mar 10 07:04:52 crc kubenswrapper[4825]: I0310 07:04:52.621742 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b81795a7-9260-45ee-ac6d-b714ed894ce2-var-log-ovn\") pod \"b81795a7-9260-45ee-ac6d-b714ed894ce2\" (UID: \"b81795a7-9260-45ee-ac6d-b714ed894ce2\") " Mar 10 07:04:52 crc kubenswrapper[4825]: I0310 07:04:52.621799 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58x57\" (UniqueName: \"kubernetes.io/projected/b81795a7-9260-45ee-ac6d-b714ed894ce2-kube-api-access-58x57\") pod \"b81795a7-9260-45ee-ac6d-b714ed894ce2\" (UID: \"b81795a7-9260-45ee-ac6d-b714ed894ce2\") " Mar 10 07:04:52 crc kubenswrapper[4825]: I0310 07:04:52.621876 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b81795a7-9260-45ee-ac6d-b714ed894ce2-additional-scripts\") pod \"b81795a7-9260-45ee-ac6d-b714ed894ce2\" (UID: \"b81795a7-9260-45ee-ac6d-b714ed894ce2\") " Mar 10 07:04:52 crc kubenswrapper[4825]: I0310 07:04:52.621957 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b81795a7-9260-45ee-ac6d-b714ed894ce2-var-run-ovn\") pod \"b81795a7-9260-45ee-ac6d-b714ed894ce2\" (UID: \"b81795a7-9260-45ee-ac6d-b714ed894ce2\") " Mar 10 07:04:52 crc kubenswrapper[4825]: I0310 07:04:52.621987 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/b81795a7-9260-45ee-ac6d-b714ed894ce2-scripts\") pod \"b81795a7-9260-45ee-ac6d-b714ed894ce2\" (UID: \"b81795a7-9260-45ee-ac6d-b714ed894ce2\") " Mar 10 07:04:52 crc kubenswrapper[4825]: I0310 07:04:52.622548 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b31a6071-5d6d-4ea5-9639-8fc8ea220862-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:52 crc kubenswrapper[4825]: I0310 07:04:52.622575 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wk4zh\" (UniqueName: \"kubernetes.io/projected/b31a6071-5d6d-4ea5-9639-8fc8ea220862-kube-api-access-wk4zh\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:52 crc kubenswrapper[4825]: I0310 07:04:52.623427 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b81795a7-9260-45ee-ac6d-b714ed894ce2-var-run" (OuterVolumeSpecName: "var-run") pod "b81795a7-9260-45ee-ac6d-b714ed894ce2" (UID: "b81795a7-9260-45ee-ac6d-b714ed894ce2"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 07:04:52 crc kubenswrapper[4825]: I0310 07:04:52.623489 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b81795a7-9260-45ee-ac6d-b714ed894ce2-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "b81795a7-9260-45ee-ac6d-b714ed894ce2" (UID: "b81795a7-9260-45ee-ac6d-b714ed894ce2"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 07:04:52 crc kubenswrapper[4825]: I0310 07:04:52.623527 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b81795a7-9260-45ee-ac6d-b714ed894ce2-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "b81795a7-9260-45ee-ac6d-b714ed894ce2" (UID: "b81795a7-9260-45ee-ac6d-b714ed894ce2"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 07:04:52 crc kubenswrapper[4825]: I0310 07:04:52.624022 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b81795a7-9260-45ee-ac6d-b714ed894ce2-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "b81795a7-9260-45ee-ac6d-b714ed894ce2" (UID: "b81795a7-9260-45ee-ac6d-b714ed894ce2"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:04:52 crc kubenswrapper[4825]: I0310 07:04:52.624228 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b81795a7-9260-45ee-ac6d-b714ed894ce2-scripts" (OuterVolumeSpecName: "scripts") pod "b81795a7-9260-45ee-ac6d-b714ed894ce2" (UID: "b81795a7-9260-45ee-ac6d-b714ed894ce2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:04:52 crc kubenswrapper[4825]: I0310 07:04:52.631420 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b81795a7-9260-45ee-ac6d-b714ed894ce2-kube-api-access-58x57" (OuterVolumeSpecName: "kube-api-access-58x57") pod "b81795a7-9260-45ee-ac6d-b714ed894ce2" (UID: "b81795a7-9260-45ee-ac6d-b714ed894ce2"). InnerVolumeSpecName "kube-api-access-58x57". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:04:52 crc kubenswrapper[4825]: I0310 07:04:52.725326 4825 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b81795a7-9260-45ee-ac6d-b714ed894ce2-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:52 crc kubenswrapper[4825]: I0310 07:04:52.725368 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58x57\" (UniqueName: \"kubernetes.io/projected/b81795a7-9260-45ee-ac6d-b714ed894ce2-kube-api-access-58x57\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:52 crc kubenswrapper[4825]: I0310 07:04:52.725405 4825 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b81795a7-9260-45ee-ac6d-b714ed894ce2-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:52 crc kubenswrapper[4825]: I0310 07:04:52.725418 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b81795a7-9260-45ee-ac6d-b714ed894ce2-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:52 crc kubenswrapper[4825]: I0310 07:04:52.725432 4825 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b81795a7-9260-45ee-ac6d-b714ed894ce2-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:52 crc kubenswrapper[4825]: I0310 07:04:52.725444 4825 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b81795a7-9260-45ee-ac6d-b714ed894ce2-var-run\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:53 crc kubenswrapper[4825]: I0310 07:04:53.050666 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jctbb-config-hxnml" event={"ID":"b81795a7-9260-45ee-ac6d-b714ed894ce2","Type":"ContainerDied","Data":"d5b413c0f067181dc9dc2ef4a760b176e2534b0ed5cedc5cb527b3ac27762a2e"} Mar 10 07:04:53 crc 
kubenswrapper[4825]: I0310 07:04:53.051217 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5b413c0f067181dc9dc2ef4a760b176e2534b0ed5cedc5cb527b3ac27762a2e" Mar 10 07:04:53 crc kubenswrapper[4825]: I0310 07:04:53.050711 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-jctbb-config-hxnml" Mar 10 07:04:53 crc kubenswrapper[4825]: I0310 07:04:53.053967 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-z4g4q" event={"ID":"b31a6071-5d6d-4ea5-9639-8fc8ea220862","Type":"ContainerDied","Data":"d33a9817140f7fdc04367360ec8d9c06c1290624de4686eceb931d4df57d7b28"} Mar 10 07:04:53 crc kubenswrapper[4825]: I0310 07:04:53.053997 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-z4g4q" Mar 10 07:04:53 crc kubenswrapper[4825]: I0310 07:04:53.054026 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d33a9817140f7fdc04367360ec8d9c06c1290624de4686eceb931d4df57d7b28" Mar 10 07:04:53 crc kubenswrapper[4825]: I0310 07:04:53.677624 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-jctbb-config-hxnml"] Mar 10 07:04:53 crc kubenswrapper[4825]: I0310 07:04:53.688877 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-jctbb-config-hxnml"] Mar 10 07:04:53 crc kubenswrapper[4825]: I0310 07:04:53.845989 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-jctbb-config-nq5vn"] Mar 10 07:04:53 crc kubenswrapper[4825]: E0310 07:04:53.846418 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b31a6071-5d6d-4ea5-9639-8fc8ea220862" containerName="mariadb-account-create-update" Mar 10 07:04:53 crc kubenswrapper[4825]: I0310 07:04:53.846443 4825 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b31a6071-5d6d-4ea5-9639-8fc8ea220862" containerName="mariadb-account-create-update" Mar 10 07:04:53 crc kubenswrapper[4825]: E0310 07:04:53.846455 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b81795a7-9260-45ee-ac6d-b714ed894ce2" containerName="ovn-config" Mar 10 07:04:53 crc kubenswrapper[4825]: I0310 07:04:53.846463 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b81795a7-9260-45ee-ac6d-b714ed894ce2" containerName="ovn-config" Mar 10 07:04:53 crc kubenswrapper[4825]: I0310 07:04:53.846645 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="b31a6071-5d6d-4ea5-9639-8fc8ea220862" containerName="mariadb-account-create-update" Mar 10 07:04:53 crc kubenswrapper[4825]: I0310 07:04:53.846662 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="b81795a7-9260-45ee-ac6d-b714ed894ce2" containerName="ovn-config" Mar 10 07:04:53 crc kubenswrapper[4825]: I0310 07:04:53.847228 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-jctbb-config-nq5vn" Mar 10 07:04:53 crc kubenswrapper[4825]: I0310 07:04:53.853740 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 10 07:04:53 crc kubenswrapper[4825]: I0310 07:04:53.855768 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-jctbb-config-nq5vn"] Mar 10 07:04:53 crc kubenswrapper[4825]: I0310 07:04:53.943077 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/44afa3f7-22c9-4881-86ca-8690b07a9972-var-run-ovn\") pod \"ovn-controller-jctbb-config-nq5vn\" (UID: \"44afa3f7-22c9-4881-86ca-8690b07a9972\") " pod="openstack/ovn-controller-jctbb-config-nq5vn" Mar 10 07:04:53 crc kubenswrapper[4825]: I0310 07:04:53.943192 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqjlw\" (UniqueName: \"kubernetes.io/projected/44afa3f7-22c9-4881-86ca-8690b07a9972-kube-api-access-bqjlw\") pod \"ovn-controller-jctbb-config-nq5vn\" (UID: \"44afa3f7-22c9-4881-86ca-8690b07a9972\") " pod="openstack/ovn-controller-jctbb-config-nq5vn" Mar 10 07:04:53 crc kubenswrapper[4825]: I0310 07:04:53.943224 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/44afa3f7-22c9-4881-86ca-8690b07a9972-var-run\") pod \"ovn-controller-jctbb-config-nq5vn\" (UID: \"44afa3f7-22c9-4881-86ca-8690b07a9972\") " pod="openstack/ovn-controller-jctbb-config-nq5vn" Mar 10 07:04:53 crc kubenswrapper[4825]: I0310 07:04:53.943255 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/44afa3f7-22c9-4881-86ca-8690b07a9972-var-log-ovn\") pod \"ovn-controller-jctbb-config-nq5vn\" (UID: 
\"44afa3f7-22c9-4881-86ca-8690b07a9972\") " pod="openstack/ovn-controller-jctbb-config-nq5vn" Mar 10 07:04:53 crc kubenswrapper[4825]: I0310 07:04:53.943295 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/44afa3f7-22c9-4881-86ca-8690b07a9972-scripts\") pod \"ovn-controller-jctbb-config-nq5vn\" (UID: \"44afa3f7-22c9-4881-86ca-8690b07a9972\") " pod="openstack/ovn-controller-jctbb-config-nq5vn" Mar 10 07:04:53 crc kubenswrapper[4825]: I0310 07:04:53.943360 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/44afa3f7-22c9-4881-86ca-8690b07a9972-additional-scripts\") pod \"ovn-controller-jctbb-config-nq5vn\" (UID: \"44afa3f7-22c9-4881-86ca-8690b07a9972\") " pod="openstack/ovn-controller-jctbb-config-nq5vn" Mar 10 07:04:54 crc kubenswrapper[4825]: I0310 07:04:54.043947 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/44afa3f7-22c9-4881-86ca-8690b07a9972-additional-scripts\") pod \"ovn-controller-jctbb-config-nq5vn\" (UID: \"44afa3f7-22c9-4881-86ca-8690b07a9972\") " pod="openstack/ovn-controller-jctbb-config-nq5vn" Mar 10 07:04:54 crc kubenswrapper[4825]: I0310 07:04:54.044004 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/44afa3f7-22c9-4881-86ca-8690b07a9972-var-run-ovn\") pod \"ovn-controller-jctbb-config-nq5vn\" (UID: \"44afa3f7-22c9-4881-86ca-8690b07a9972\") " pod="openstack/ovn-controller-jctbb-config-nq5vn" Mar 10 07:04:54 crc kubenswrapper[4825]: I0310 07:04:54.044050 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqjlw\" (UniqueName: \"kubernetes.io/projected/44afa3f7-22c9-4881-86ca-8690b07a9972-kube-api-access-bqjlw\") pod 
\"ovn-controller-jctbb-config-nq5vn\" (UID: \"44afa3f7-22c9-4881-86ca-8690b07a9972\") " pod="openstack/ovn-controller-jctbb-config-nq5vn" Mar 10 07:04:54 crc kubenswrapper[4825]: I0310 07:04:54.044074 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/44afa3f7-22c9-4881-86ca-8690b07a9972-var-run\") pod \"ovn-controller-jctbb-config-nq5vn\" (UID: \"44afa3f7-22c9-4881-86ca-8690b07a9972\") " pod="openstack/ovn-controller-jctbb-config-nq5vn" Mar 10 07:04:54 crc kubenswrapper[4825]: I0310 07:04:54.044095 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/44afa3f7-22c9-4881-86ca-8690b07a9972-var-log-ovn\") pod \"ovn-controller-jctbb-config-nq5vn\" (UID: \"44afa3f7-22c9-4881-86ca-8690b07a9972\") " pod="openstack/ovn-controller-jctbb-config-nq5vn" Mar 10 07:04:54 crc kubenswrapper[4825]: I0310 07:04:54.044124 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/44afa3f7-22c9-4881-86ca-8690b07a9972-scripts\") pod \"ovn-controller-jctbb-config-nq5vn\" (UID: \"44afa3f7-22c9-4881-86ca-8690b07a9972\") " pod="openstack/ovn-controller-jctbb-config-nq5vn" Mar 10 07:04:54 crc kubenswrapper[4825]: I0310 07:04:54.044599 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/44afa3f7-22c9-4881-86ca-8690b07a9972-var-run\") pod \"ovn-controller-jctbb-config-nq5vn\" (UID: \"44afa3f7-22c9-4881-86ca-8690b07a9972\") " pod="openstack/ovn-controller-jctbb-config-nq5vn" Mar 10 07:04:54 crc kubenswrapper[4825]: I0310 07:04:54.044718 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/44afa3f7-22c9-4881-86ca-8690b07a9972-var-log-ovn\") pod \"ovn-controller-jctbb-config-nq5vn\" (UID: 
\"44afa3f7-22c9-4881-86ca-8690b07a9972\") " pod="openstack/ovn-controller-jctbb-config-nq5vn" Mar 10 07:04:54 crc kubenswrapper[4825]: I0310 07:04:54.044783 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/44afa3f7-22c9-4881-86ca-8690b07a9972-var-run-ovn\") pod \"ovn-controller-jctbb-config-nq5vn\" (UID: \"44afa3f7-22c9-4881-86ca-8690b07a9972\") " pod="openstack/ovn-controller-jctbb-config-nq5vn" Mar 10 07:04:54 crc kubenswrapper[4825]: I0310 07:04:54.045265 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/44afa3f7-22c9-4881-86ca-8690b07a9972-additional-scripts\") pod \"ovn-controller-jctbb-config-nq5vn\" (UID: \"44afa3f7-22c9-4881-86ca-8690b07a9972\") " pod="openstack/ovn-controller-jctbb-config-nq5vn" Mar 10 07:04:54 crc kubenswrapper[4825]: I0310 07:04:54.046442 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/44afa3f7-22c9-4881-86ca-8690b07a9972-scripts\") pod \"ovn-controller-jctbb-config-nq5vn\" (UID: \"44afa3f7-22c9-4881-86ca-8690b07a9972\") " pod="openstack/ovn-controller-jctbb-config-nq5vn" Mar 10 07:04:54 crc kubenswrapper[4825]: I0310 07:04:54.061445 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqjlw\" (UniqueName: \"kubernetes.io/projected/44afa3f7-22c9-4881-86ca-8690b07a9972-kube-api-access-bqjlw\") pod \"ovn-controller-jctbb-config-nq5vn\" (UID: \"44afa3f7-22c9-4881-86ca-8690b07a9972\") " pod="openstack/ovn-controller-jctbb-config-nq5vn" Mar 10 07:04:54 crc kubenswrapper[4825]: I0310 07:04:54.072370 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"632168fa-0532-4c7e-b688-fb361ee89ec8","Type":"ContainerStarted","Data":"6a94411539794d68dfeebd72122285b13eda3087dbbeb9b315116582797f9ae7"} Mar 10 07:04:54 crc 
kubenswrapper[4825]: I0310 07:04:54.072418 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"632168fa-0532-4c7e-b688-fb361ee89ec8","Type":"ContainerStarted","Data":"24e7978cba137e69c6bcc8c4236d6cec9f0db399c5f65bdbb35f3ca5a0021d8b"} Mar 10 07:04:54 crc kubenswrapper[4825]: I0310 07:04:54.072428 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"632168fa-0532-4c7e-b688-fb361ee89ec8","Type":"ContainerStarted","Data":"2696dee8a458e4895f3ba7693b75ddaad87852936a2c94221d87c43242a8d9c0"} Mar 10 07:04:54 crc kubenswrapper[4825]: I0310 07:04:54.072436 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"632168fa-0532-4c7e-b688-fb361ee89ec8","Type":"ContainerStarted","Data":"1b4b9f5c44b3bd7e6bdce3108794f5acc85e48ec01796fe34d6bd0e82000c702"} Mar 10 07:04:54 crc kubenswrapper[4825]: I0310 07:04:54.072443 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"632168fa-0532-4c7e-b688-fb361ee89ec8","Type":"ContainerStarted","Data":"6b6649c00a311af982a228be67d8b620ca48866ed93599303fea93062ada078a"} Mar 10 07:04:54 crc kubenswrapper[4825]: I0310 07:04:54.198856 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-jctbb-config-nq5vn" Mar 10 07:04:54 crc kubenswrapper[4825]: I0310 07:04:54.309396 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-jctbb" Mar 10 07:04:54 crc kubenswrapper[4825]: I0310 07:04:54.673994 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-jctbb-config-nq5vn"] Mar 10 07:04:54 crc kubenswrapper[4825]: W0310 07:04:54.680744 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44afa3f7_22c9_4881_86ca_8690b07a9972.slice/crio-8e7aee3fd22a8e98753973fadd022c8fc2a1ab35b46c547f0d0330df7ae66486 WatchSource:0}: Error finding container 8e7aee3fd22a8e98753973fadd022c8fc2a1ab35b46c547f0d0330df7ae66486: Status 404 returned error can't find the container with id 8e7aee3fd22a8e98753973fadd022c8fc2a1ab35b46c547f0d0330df7ae66486 Mar 10 07:04:54 crc kubenswrapper[4825]: I0310 07:04:54.821299 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.092192 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"632168fa-0532-4c7e-b688-fb361ee89ec8","Type":"ContainerStarted","Data":"0eb3526b2f9a3dc4c181eae70d6e3f55e5aedd2b23585785d07ab9b47c86be86"} Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.092533 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"632168fa-0532-4c7e-b688-fb361ee89ec8","Type":"ContainerStarted","Data":"bbfb6db4ff1cf66b5cefbdf85aba01f8712fe08811899971bacb92c4cffa4c80"} Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.113610 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jctbb-config-nq5vn" 
event={"ID":"44afa3f7-22c9-4881-86ca-8690b07a9972","Type":"ContainerStarted","Data":"c7bdda645430db45f7b8e88e993abb18f845999fa4f97858f9f4c8547b9bd21f"} Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.113679 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jctbb-config-nq5vn" event={"ID":"44afa3f7-22c9-4881-86ca-8690b07a9972","Type":"ContainerStarted","Data":"8e7aee3fd22a8e98753973fadd022c8fc2a1ab35b46c547f0d0330df7ae66486"} Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.214303 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-j22lk"] Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.215294 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-j22lk" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.261392 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b81795a7-9260-45ee-ac6d-b714ed894ce2" path="/var/lib/kubelet/pods/b81795a7-9260-45ee-ac6d-b714ed894ce2/volumes" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.262097 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-j22lk"] Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.299280 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.297472836 podStartE2EDuration="34.29926542s" podCreationTimestamp="2026-03-10 07:04:21 +0000 UTC" firstStartedPulling="2026-03-10 07:04:38.839463743 +0000 UTC m=+1231.869244368" lastFinishedPulling="2026-03-10 07:04:52.841256337 +0000 UTC m=+1245.871036952" observedRunningTime="2026-03-10 07:04:55.235523399 +0000 UTC m=+1248.265304014" watchObservedRunningTime="2026-03-10 07:04:55.29926542 +0000 UTC m=+1248.329046035" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.358328 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ovn-controller-jctbb-config-nq5vn" podStartSLOduration=2.358310309 podStartE2EDuration="2.358310309s" podCreationTimestamp="2026-03-10 07:04:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:04:55.357540269 +0000 UTC m=+1248.387320894" watchObservedRunningTime="2026-03-10 07:04:55.358310309 +0000 UTC m=+1248.388090924" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.377207 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b061d3ea-7025-4381-a6b3-232d5998b15f-operator-scripts\") pod \"cinder-db-create-j22lk\" (UID: \"b061d3ea-7025-4381-a6b3-232d5998b15f\") " pod="openstack/cinder-db-create-j22lk" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.377267 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdnnl\" (UniqueName: \"kubernetes.io/projected/b061d3ea-7025-4381-a6b3-232d5998b15f-kube-api-access-sdnnl\") pod \"cinder-db-create-j22lk\" (UID: \"b061d3ea-7025-4381-a6b3-232d5998b15f\") " pod="openstack/cinder-db-create-j22lk" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.478944 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b061d3ea-7025-4381-a6b3-232d5998b15f-operator-scripts\") pod \"cinder-db-create-j22lk\" (UID: \"b061d3ea-7025-4381-a6b3-232d5998b15f\") " pod="openstack/cinder-db-create-j22lk" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.479034 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdnnl\" (UniqueName: \"kubernetes.io/projected/b061d3ea-7025-4381-a6b3-232d5998b15f-kube-api-access-sdnnl\") pod \"cinder-db-create-j22lk\" (UID: \"b061d3ea-7025-4381-a6b3-232d5998b15f\") " 
pod="openstack/cinder-db-create-j22lk" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.480039 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b061d3ea-7025-4381-a6b3-232d5998b15f-operator-scripts\") pod \"cinder-db-create-j22lk\" (UID: \"b061d3ea-7025-4381-a6b3-232d5998b15f\") " pod="openstack/cinder-db-create-j22lk" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.499642 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdnnl\" (UniqueName: \"kubernetes.io/projected/b061d3ea-7025-4381-a6b3-232d5998b15f-kube-api-access-sdnnl\") pod \"cinder-db-create-j22lk\" (UID: \"b061d3ea-7025-4381-a6b3-232d5998b15f\") " pod="openstack/cinder-db-create-j22lk" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.541363 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-l5vf2"] Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.548480 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-j22lk" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.562416 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-l5vf2" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.581639 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-l5vf2"] Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.643475 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-833c-account-create-update-stmjm"] Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.647318 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-833c-account-create-update-stmjm" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.657534 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.682754 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-833c-account-create-update-stmjm"] Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.694000 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-mcpm6"] Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.695791 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-mcpm6" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.701988 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.702315 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.702517 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.702566 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.703524 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkcxf\" (UniqueName: \"kubernetes.io/projected/489d7ca6-7e05-482e-b880-69bea3b57c62-kube-api-access-zkcxf\") pod \"barbican-db-create-l5vf2\" (UID: \"489d7ca6-7e05-482e-b880-69bea3b57c62\") " pod="openstack/barbican-db-create-l5vf2" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.705659 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/489d7ca6-7e05-482e-b880-69bea3b57c62-operator-scripts\") pod \"barbican-db-create-l5vf2\" (UID: \"489d7ca6-7e05-482e-b880-69bea3b57c62\") " pod="openstack/barbican-db-create-l5vf2" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.705726 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b2s2\" (UniqueName: \"kubernetes.io/projected/178e89cb-daf7-488f-8188-de4e98bc1047-kube-api-access-7b2s2\") pod \"cinder-833c-account-create-update-stmjm\" (UID: \"178e89cb-daf7-488f-8188-de4e98bc1047\") " pod="openstack/cinder-833c-account-create-update-stmjm" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.705840 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/178e89cb-daf7-488f-8188-de4e98bc1047-operator-scripts\") pod \"cinder-833c-account-create-update-stmjm\" (UID: \"178e89cb-daf7-488f-8188-de4e98bc1047\") " pod="openstack/cinder-833c-account-create-update-stmjm" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.711029 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rs4jw" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.711234 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-mcpm6"] Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.754230 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-6cw56"] Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.755951 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-6cw56" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.807142 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/178e89cb-daf7-488f-8188-de4e98bc1047-operator-scripts\") pod \"cinder-833c-account-create-update-stmjm\" (UID: \"178e89cb-daf7-488f-8188-de4e98bc1047\") " pod="openstack/cinder-833c-account-create-update-stmjm" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.807233 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2f77e29-2f56-4415-b778-376f67322f69-config-data\") pod \"keystone-db-sync-mcpm6\" (UID: \"e2f77e29-2f56-4415-b778-376f67322f69\") " pod="openstack/keystone-db-sync-mcpm6" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.807330 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkcxf\" (UniqueName: \"kubernetes.io/projected/489d7ca6-7e05-482e-b880-69bea3b57c62-kube-api-access-zkcxf\") pod \"barbican-db-create-l5vf2\" (UID: \"489d7ca6-7e05-482e-b880-69bea3b57c62\") " pod="openstack/barbican-db-create-l5vf2" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.807373 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e22d1830-21e9-40f0-ab2b-83335a568a18-operator-scripts\") pod \"neutron-db-create-6cw56\" (UID: \"e22d1830-21e9-40f0-ab2b-83335a568a18\") " pod="openstack/neutron-db-create-6cw56" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.807396 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/489d7ca6-7e05-482e-b880-69bea3b57c62-operator-scripts\") pod \"barbican-db-create-l5vf2\" (UID: 
\"489d7ca6-7e05-482e-b880-69bea3b57c62\") " pod="openstack/barbican-db-create-l5vf2" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.807415 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgrgn\" (UniqueName: \"kubernetes.io/projected/e2f77e29-2f56-4415-b778-376f67322f69-kube-api-access-mgrgn\") pod \"keystone-db-sync-mcpm6\" (UID: \"e2f77e29-2f56-4415-b778-376f67322f69\") " pod="openstack/keystone-db-sync-mcpm6" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.807432 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b2s2\" (UniqueName: \"kubernetes.io/projected/178e89cb-daf7-488f-8188-de4e98bc1047-kube-api-access-7b2s2\") pod \"cinder-833c-account-create-update-stmjm\" (UID: \"178e89cb-daf7-488f-8188-de4e98bc1047\") " pod="openstack/cinder-833c-account-create-update-stmjm" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.807452 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glskw\" (UniqueName: \"kubernetes.io/projected/e22d1830-21e9-40f0-ab2b-83335a568a18-kube-api-access-glskw\") pod \"neutron-db-create-6cw56\" (UID: \"e22d1830-21e9-40f0-ab2b-83335a568a18\") " pod="openstack/neutron-db-create-6cw56" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.807484 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f77e29-2f56-4415-b778-376f67322f69-combined-ca-bundle\") pod \"keystone-db-sync-mcpm6\" (UID: \"e2f77e29-2f56-4415-b778-376f67322f69\") " pod="openstack/keystone-db-sync-mcpm6" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.809159 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/178e89cb-daf7-488f-8188-de4e98bc1047-operator-scripts\") pod 
\"cinder-833c-account-create-update-stmjm\" (UID: \"178e89cb-daf7-488f-8188-de4e98bc1047\") " pod="openstack/cinder-833c-account-create-update-stmjm" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.811902 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/489d7ca6-7e05-482e-b880-69bea3b57c62-operator-scripts\") pod \"barbican-db-create-l5vf2\" (UID: \"489d7ca6-7e05-482e-b880-69bea3b57c62\") " pod="openstack/barbican-db-create-l5vf2" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.821782 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-6cw56"] Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.843762 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b2s2\" (UniqueName: \"kubernetes.io/projected/178e89cb-daf7-488f-8188-de4e98bc1047-kube-api-access-7b2s2\") pod \"cinder-833c-account-create-update-stmjm\" (UID: \"178e89cb-daf7-488f-8188-de4e98bc1047\") " pod="openstack/cinder-833c-account-create-update-stmjm" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.850530 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkcxf\" (UniqueName: \"kubernetes.io/projected/489d7ca6-7e05-482e-b880-69bea3b57c62-kube-api-access-zkcxf\") pod \"barbican-db-create-l5vf2\" (UID: \"489d7ca6-7e05-482e-b880-69bea3b57c62\") " pod="openstack/barbican-db-create-l5vf2" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.854946 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b684fb9f5-8pqxx"] Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.856955 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b684fb9f5-8pqxx" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.860539 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.886047 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b684fb9f5-8pqxx"] Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.910718 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e22d1830-21e9-40f0-ab2b-83335a568a18-operator-scripts\") pod \"neutron-db-create-6cw56\" (UID: \"e22d1830-21e9-40f0-ab2b-83335a568a18\") " pod="openstack/neutron-db-create-6cw56" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.910766 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgrgn\" (UniqueName: \"kubernetes.io/projected/e2f77e29-2f56-4415-b778-376f67322f69-kube-api-access-mgrgn\") pod \"keystone-db-sync-mcpm6\" (UID: \"e2f77e29-2f56-4415-b778-376f67322f69\") " pod="openstack/keystone-db-sync-mcpm6" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.910796 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glskw\" (UniqueName: \"kubernetes.io/projected/e22d1830-21e9-40f0-ab2b-83335a568a18-kube-api-access-glskw\") pod \"neutron-db-create-6cw56\" (UID: \"e22d1830-21e9-40f0-ab2b-83335a568a18\") " pod="openstack/neutron-db-create-6cw56" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.910819 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34986ed7-9e8c-467b-8be5-613c7f3f8e4b-ovsdbserver-nb\") pod \"dnsmasq-dns-7b684fb9f5-8pqxx\" (UID: \"34986ed7-9e8c-467b-8be5-613c7f3f8e4b\") " pod="openstack/dnsmasq-dns-7b684fb9f5-8pqxx" Mar 10 07:04:55 crc 
kubenswrapper[4825]: I0310 07:04:55.910850 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34986ed7-9e8c-467b-8be5-613c7f3f8e4b-config\") pod \"dnsmasq-dns-7b684fb9f5-8pqxx\" (UID: \"34986ed7-9e8c-467b-8be5-613c7f3f8e4b\") " pod="openstack/dnsmasq-dns-7b684fb9f5-8pqxx" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.910874 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f77e29-2f56-4415-b778-376f67322f69-combined-ca-bundle\") pod \"keystone-db-sync-mcpm6\" (UID: \"e2f77e29-2f56-4415-b778-376f67322f69\") " pod="openstack/keystone-db-sync-mcpm6" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.910902 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc2wb\" (UniqueName: \"kubernetes.io/projected/34986ed7-9e8c-467b-8be5-613c7f3f8e4b-kube-api-access-vc2wb\") pod \"dnsmasq-dns-7b684fb9f5-8pqxx\" (UID: \"34986ed7-9e8c-467b-8be5-613c7f3f8e4b\") " pod="openstack/dnsmasq-dns-7b684fb9f5-8pqxx" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.910935 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34986ed7-9e8c-467b-8be5-613c7f3f8e4b-ovsdbserver-sb\") pod \"dnsmasq-dns-7b684fb9f5-8pqxx\" (UID: \"34986ed7-9e8c-467b-8be5-613c7f3f8e4b\") " pod="openstack/dnsmasq-dns-7b684fb9f5-8pqxx" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.910960 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2f77e29-2f56-4415-b778-376f67322f69-config-data\") pod \"keystone-db-sync-mcpm6\" (UID: \"e2f77e29-2f56-4415-b778-376f67322f69\") " pod="openstack/keystone-db-sync-mcpm6" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 
07:04:55.911007 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/34986ed7-9e8c-467b-8be5-613c7f3f8e4b-dns-swift-storage-0\") pod \"dnsmasq-dns-7b684fb9f5-8pqxx\" (UID: \"34986ed7-9e8c-467b-8be5-613c7f3f8e4b\") " pod="openstack/dnsmasq-dns-7b684fb9f5-8pqxx" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.911036 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34986ed7-9e8c-467b-8be5-613c7f3f8e4b-dns-svc\") pod \"dnsmasq-dns-7b684fb9f5-8pqxx\" (UID: \"34986ed7-9e8c-467b-8be5-613c7f3f8e4b\") " pod="openstack/dnsmasq-dns-7b684fb9f5-8pqxx" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.911687 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e22d1830-21e9-40f0-ab2b-83335a568a18-operator-scripts\") pod \"neutron-db-create-6cw56\" (UID: \"e22d1830-21e9-40f0-ab2b-83335a568a18\") " pod="openstack/neutron-db-create-6cw56" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.932019 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2f77e29-2f56-4415-b778-376f67322f69-config-data\") pod \"keystone-db-sync-mcpm6\" (UID: \"e2f77e29-2f56-4415-b778-376f67322f69\") " pod="openstack/keystone-db-sync-mcpm6" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.953204 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f77e29-2f56-4415-b778-376f67322f69-combined-ca-bundle\") pod \"keystone-db-sync-mcpm6\" (UID: \"e2f77e29-2f56-4415-b778-376f67322f69\") " pod="openstack/keystone-db-sync-mcpm6" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.959645 4825 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-mgrgn\" (UniqueName: \"kubernetes.io/projected/e2f77e29-2f56-4415-b778-376f67322f69-kube-api-access-mgrgn\") pod \"keystone-db-sync-mcpm6\" (UID: \"e2f77e29-2f56-4415-b778-376f67322f69\") " pod="openstack/keystone-db-sync-mcpm6" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.959745 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-b26b-account-create-update-8l4dm"] Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.977908 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-l5vf2" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.980473 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b26b-account-create-update-8l4dm" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.980915 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b26b-account-create-update-8l4dm"] Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.988525 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 10 07:04:55 crc kubenswrapper[4825]: I0310 07:04:55.988771 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glskw\" (UniqueName: \"kubernetes.io/projected/e22d1830-21e9-40f0-ab2b-83335a568a18-kube-api-access-glskw\") pod \"neutron-db-create-6cw56\" (UID: \"e22d1830-21e9-40f0-ab2b-83335a568a18\") " pod="openstack/neutron-db-create-6cw56" Mar 10 07:04:56 crc kubenswrapper[4825]: I0310 07:04:56.005043 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-833c-account-create-update-stmjm" Mar 10 07:04:56 crc kubenswrapper[4825]: I0310 07:04:56.015492 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc2wb\" (UniqueName: \"kubernetes.io/projected/34986ed7-9e8c-467b-8be5-613c7f3f8e4b-kube-api-access-vc2wb\") pod \"dnsmasq-dns-7b684fb9f5-8pqxx\" (UID: \"34986ed7-9e8c-467b-8be5-613c7f3f8e4b\") " pod="openstack/dnsmasq-dns-7b684fb9f5-8pqxx" Mar 10 07:04:56 crc kubenswrapper[4825]: I0310 07:04:56.016958 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e9617ba-17da-4946-8c5b-b1060683237d-operator-scripts\") pod \"barbican-b26b-account-create-update-8l4dm\" (UID: \"4e9617ba-17da-4946-8c5b-b1060683237d\") " pod="openstack/barbican-b26b-account-create-update-8l4dm" Mar 10 07:04:56 crc kubenswrapper[4825]: I0310 07:04:56.017008 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34986ed7-9e8c-467b-8be5-613c7f3f8e4b-ovsdbserver-sb\") pod \"dnsmasq-dns-7b684fb9f5-8pqxx\" (UID: \"34986ed7-9e8c-467b-8be5-613c7f3f8e4b\") " pod="openstack/dnsmasq-dns-7b684fb9f5-8pqxx" Mar 10 07:04:56 crc kubenswrapper[4825]: I0310 07:04:56.017106 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/34986ed7-9e8c-467b-8be5-613c7f3f8e4b-dns-swift-storage-0\") pod \"dnsmasq-dns-7b684fb9f5-8pqxx\" (UID: \"34986ed7-9e8c-467b-8be5-613c7f3f8e4b\") " pod="openstack/dnsmasq-dns-7b684fb9f5-8pqxx" Mar 10 07:04:56 crc kubenswrapper[4825]: I0310 07:04:56.017155 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5wf4\" (UniqueName: \"kubernetes.io/projected/4e9617ba-17da-4946-8c5b-b1060683237d-kube-api-access-d5wf4\") 
pod \"barbican-b26b-account-create-update-8l4dm\" (UID: \"4e9617ba-17da-4946-8c5b-b1060683237d\") " pod="openstack/barbican-b26b-account-create-update-8l4dm" Mar 10 07:04:56 crc kubenswrapper[4825]: I0310 07:04:56.017185 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34986ed7-9e8c-467b-8be5-613c7f3f8e4b-dns-svc\") pod \"dnsmasq-dns-7b684fb9f5-8pqxx\" (UID: \"34986ed7-9e8c-467b-8be5-613c7f3f8e4b\") " pod="openstack/dnsmasq-dns-7b684fb9f5-8pqxx" Mar 10 07:04:56 crc kubenswrapper[4825]: I0310 07:04:56.017247 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34986ed7-9e8c-467b-8be5-613c7f3f8e4b-ovsdbserver-nb\") pod \"dnsmasq-dns-7b684fb9f5-8pqxx\" (UID: \"34986ed7-9e8c-467b-8be5-613c7f3f8e4b\") " pod="openstack/dnsmasq-dns-7b684fb9f5-8pqxx" Mar 10 07:04:56 crc kubenswrapper[4825]: I0310 07:04:56.017269 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34986ed7-9e8c-467b-8be5-613c7f3f8e4b-config\") pod \"dnsmasq-dns-7b684fb9f5-8pqxx\" (UID: \"34986ed7-9e8c-467b-8be5-613c7f3f8e4b\") " pod="openstack/dnsmasq-dns-7b684fb9f5-8pqxx" Mar 10 07:04:56 crc kubenswrapper[4825]: I0310 07:04:56.018071 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34986ed7-9e8c-467b-8be5-613c7f3f8e4b-config\") pod \"dnsmasq-dns-7b684fb9f5-8pqxx\" (UID: \"34986ed7-9e8c-467b-8be5-613c7f3f8e4b\") " pod="openstack/dnsmasq-dns-7b684fb9f5-8pqxx" Mar 10 07:04:56 crc kubenswrapper[4825]: I0310 07:04:56.018870 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34986ed7-9e8c-467b-8be5-613c7f3f8e4b-ovsdbserver-sb\") pod \"dnsmasq-dns-7b684fb9f5-8pqxx\" (UID: \"34986ed7-9e8c-467b-8be5-613c7f3f8e4b\") " 
pod="openstack/dnsmasq-dns-7b684fb9f5-8pqxx" Mar 10 07:04:56 crc kubenswrapper[4825]: I0310 07:04:56.024310 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34986ed7-9e8c-467b-8be5-613c7f3f8e4b-dns-svc\") pod \"dnsmasq-dns-7b684fb9f5-8pqxx\" (UID: \"34986ed7-9e8c-467b-8be5-613c7f3f8e4b\") " pod="openstack/dnsmasq-dns-7b684fb9f5-8pqxx" Mar 10 07:04:56 crc kubenswrapper[4825]: I0310 07:04:56.024875 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34986ed7-9e8c-467b-8be5-613c7f3f8e4b-ovsdbserver-nb\") pod \"dnsmasq-dns-7b684fb9f5-8pqxx\" (UID: \"34986ed7-9e8c-467b-8be5-613c7f3f8e4b\") " pod="openstack/dnsmasq-dns-7b684fb9f5-8pqxx" Mar 10 07:04:56 crc kubenswrapper[4825]: I0310 07:04:56.026065 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-mcpm6" Mar 10 07:04:56 crc kubenswrapper[4825]: I0310 07:04:56.045030 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/34986ed7-9e8c-467b-8be5-613c7f3f8e4b-dns-swift-storage-0\") pod \"dnsmasq-dns-7b684fb9f5-8pqxx\" (UID: \"34986ed7-9e8c-467b-8be5-613c7f3f8e4b\") " pod="openstack/dnsmasq-dns-7b684fb9f5-8pqxx" Mar 10 07:04:56 crc kubenswrapper[4825]: I0310 07:04:56.055157 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc2wb\" (UniqueName: \"kubernetes.io/projected/34986ed7-9e8c-467b-8be5-613c7f3f8e4b-kube-api-access-vc2wb\") pod \"dnsmasq-dns-7b684fb9f5-8pqxx\" (UID: \"34986ed7-9e8c-467b-8be5-613c7f3f8e4b\") " pod="openstack/dnsmasq-dns-7b684fb9f5-8pqxx" Mar 10 07:04:56 crc kubenswrapper[4825]: I0310 07:04:56.062503 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-10c2-account-create-update-7fscp"] Mar 10 07:04:56 crc kubenswrapper[4825]: I0310 07:04:56.069223 4825 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-10c2-account-create-update-7fscp" Mar 10 07:04:56 crc kubenswrapper[4825]: I0310 07:04:56.076251 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 10 07:04:56 crc kubenswrapper[4825]: I0310 07:04:56.093211 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-10c2-account-create-update-7fscp"] Mar 10 07:04:56 crc kubenswrapper[4825]: I0310 07:04:56.119473 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5wf4\" (UniqueName: \"kubernetes.io/projected/4e9617ba-17da-4946-8c5b-b1060683237d-kube-api-access-d5wf4\") pod \"barbican-b26b-account-create-update-8l4dm\" (UID: \"4e9617ba-17da-4946-8c5b-b1060683237d\") " pod="openstack/barbican-b26b-account-create-update-8l4dm" Mar 10 07:04:56 crc kubenswrapper[4825]: I0310 07:04:56.119640 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e9617ba-17da-4946-8c5b-b1060683237d-operator-scripts\") pod \"barbican-b26b-account-create-update-8l4dm\" (UID: \"4e9617ba-17da-4946-8c5b-b1060683237d\") " pod="openstack/barbican-b26b-account-create-update-8l4dm" Mar 10 07:04:56 crc kubenswrapper[4825]: I0310 07:04:56.120465 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e9617ba-17da-4946-8c5b-b1060683237d-operator-scripts\") pod \"barbican-b26b-account-create-update-8l4dm\" (UID: \"4e9617ba-17da-4946-8c5b-b1060683237d\") " pod="openstack/barbican-b26b-account-create-update-8l4dm" Mar 10 07:04:56 crc kubenswrapper[4825]: I0310 07:04:56.120560 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/37023b62-fd36-44d5-9c0f-45cf1b640da6-operator-scripts\") pod \"neutron-10c2-account-create-update-7fscp\" (UID: \"37023b62-fd36-44d5-9c0f-45cf1b640da6\") " pod="openstack/neutron-10c2-account-create-update-7fscp" Mar 10 07:04:56 crc kubenswrapper[4825]: I0310 07:04:56.120691 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jslv5\" (UniqueName: \"kubernetes.io/projected/37023b62-fd36-44d5-9c0f-45cf1b640da6-kube-api-access-jslv5\") pod \"neutron-10c2-account-create-update-7fscp\" (UID: \"37023b62-fd36-44d5-9c0f-45cf1b640da6\") " pod="openstack/neutron-10c2-account-create-update-7fscp" Mar 10 07:04:56 crc kubenswrapper[4825]: I0310 07:04:56.140926 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-6cw56" Mar 10 07:04:56 crc kubenswrapper[4825]: I0310 07:04:56.140941 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5wf4\" (UniqueName: \"kubernetes.io/projected/4e9617ba-17da-4946-8c5b-b1060683237d-kube-api-access-d5wf4\") pod \"barbican-b26b-account-create-update-8l4dm\" (UID: \"4e9617ba-17da-4946-8c5b-b1060683237d\") " pod="openstack/barbican-b26b-account-create-update-8l4dm" Mar 10 07:04:56 crc kubenswrapper[4825]: I0310 07:04:56.224307 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37023b62-fd36-44d5-9c0f-45cf1b640da6-operator-scripts\") pod \"neutron-10c2-account-create-update-7fscp\" (UID: \"37023b62-fd36-44d5-9c0f-45cf1b640da6\") " pod="openstack/neutron-10c2-account-create-update-7fscp" Mar 10 07:04:56 crc kubenswrapper[4825]: I0310 07:04:56.224469 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jslv5\" (UniqueName: \"kubernetes.io/projected/37023b62-fd36-44d5-9c0f-45cf1b640da6-kube-api-access-jslv5\") pod 
\"neutron-10c2-account-create-update-7fscp\" (UID: \"37023b62-fd36-44d5-9c0f-45cf1b640da6\") " pod="openstack/neutron-10c2-account-create-update-7fscp" Mar 10 07:04:56 crc kubenswrapper[4825]: I0310 07:04:56.227549 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37023b62-fd36-44d5-9c0f-45cf1b640da6-operator-scripts\") pod \"neutron-10c2-account-create-update-7fscp\" (UID: \"37023b62-fd36-44d5-9c0f-45cf1b640da6\") " pod="openstack/neutron-10c2-account-create-update-7fscp" Mar 10 07:04:56 crc kubenswrapper[4825]: I0310 07:04:56.253013 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jslv5\" (UniqueName: \"kubernetes.io/projected/37023b62-fd36-44d5-9c0f-45cf1b640da6-kube-api-access-jslv5\") pod \"neutron-10c2-account-create-update-7fscp\" (UID: \"37023b62-fd36-44d5-9c0f-45cf1b640da6\") " pod="openstack/neutron-10c2-account-create-update-7fscp" Mar 10 07:04:56 crc kubenswrapper[4825]: I0310 07:04:56.273672 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b684fb9f5-8pqxx" Mar 10 07:04:56 crc kubenswrapper[4825]: I0310 07:04:56.305485 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b26b-account-create-update-8l4dm" Mar 10 07:04:56 crc kubenswrapper[4825]: I0310 07:04:56.350424 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-j22lk"] Mar 10 07:04:56 crc kubenswrapper[4825]: I0310 07:04:56.390190 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-10c2-account-create-update-7fscp" Mar 10 07:04:56 crc kubenswrapper[4825]: I0310 07:04:56.544556 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-l5vf2"] Mar 10 07:04:56 crc kubenswrapper[4825]: I0310 07:04:56.640035 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-833c-account-create-update-stmjm"] Mar 10 07:04:56 crc kubenswrapper[4825]: I0310 07:04:56.734911 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-mcpm6"] Mar 10 07:04:56 crc kubenswrapper[4825]: W0310 07:04:56.746619 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2f77e29_2f56_4415_b778_376f67322f69.slice/crio-943d11a8aea80b98ec5deca750fd89f33e406d7da48ec8cb363c284e9c0fa47c WatchSource:0}: Error finding container 943d11a8aea80b98ec5deca750fd89f33e406d7da48ec8cb363c284e9c0fa47c: Status 404 returned error can't find the container with id 943d11a8aea80b98ec5deca750fd89f33e406d7da48ec8cb363c284e9c0fa47c Mar 10 07:04:56 crc kubenswrapper[4825]: I0310 07:04:56.810713 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-6cw56"] Mar 10 07:04:56 crc kubenswrapper[4825]: I0310 07:04:56.862581 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b684fb9f5-8pqxx"] Mar 10 07:04:56 crc kubenswrapper[4825]: I0310 07:04:56.975102 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b26b-account-create-update-8l4dm"] Mar 10 07:04:57 crc kubenswrapper[4825]: I0310 07:04:57.115123 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-10c2-account-create-update-7fscp"] Mar 10 07:04:57 crc kubenswrapper[4825]: I0310 07:04:57.149374 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-j22lk" 
event={"ID":"b061d3ea-7025-4381-a6b3-232d5998b15f","Type":"ContainerStarted","Data":"6ea644254c4be2d8dccc8488e346df3272810cf37b2c4aec97ada14fd9383c97"} Mar 10 07:04:57 crc kubenswrapper[4825]: I0310 07:04:57.149424 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-j22lk" event={"ID":"b061d3ea-7025-4381-a6b3-232d5998b15f","Type":"ContainerStarted","Data":"bf974f0e28be815f5fc5b914581b08d8027b90a416aad7b7a829fdedb914c2d8"} Mar 10 07:04:57 crc kubenswrapper[4825]: I0310 07:04:57.155104 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-mcpm6" event={"ID":"e2f77e29-2f56-4415-b778-376f67322f69","Type":"ContainerStarted","Data":"943d11a8aea80b98ec5deca750fd89f33e406d7da48ec8cb363c284e9c0fa47c"} Mar 10 07:04:57 crc kubenswrapper[4825]: I0310 07:04:57.157014 4825 generic.go:334] "Generic (PLEG): container finished" podID="44afa3f7-22c9-4881-86ca-8690b07a9972" containerID="c7bdda645430db45f7b8e88e993abb18f845999fa4f97858f9f4c8547b9bd21f" exitCode=0 Mar 10 07:04:57 crc kubenswrapper[4825]: I0310 07:04:57.157158 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jctbb-config-nq5vn" event={"ID":"44afa3f7-22c9-4881-86ca-8690b07a9972","Type":"ContainerDied","Data":"c7bdda645430db45f7b8e88e993abb18f845999fa4f97858f9f4c8547b9bd21f"} Mar 10 07:04:57 crc kubenswrapper[4825]: I0310 07:04:57.159386 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-l5vf2" event={"ID":"489d7ca6-7e05-482e-b880-69bea3b57c62","Type":"ContainerStarted","Data":"f7ac3927b544e90a8dc6d089575a7df963d2e88feeb45b58caabdeb8c821d487"} Mar 10 07:04:57 crc kubenswrapper[4825]: I0310 07:04:57.159448 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-l5vf2" event={"ID":"489d7ca6-7e05-482e-b880-69bea3b57c62","Type":"ContainerStarted","Data":"3958a9d49bf18212f293f88c67f12ca6cb8b83424bddcc13205fea4497852be3"} Mar 10 07:04:57 crc 
kubenswrapper[4825]: I0310 07:04:57.168357 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-6cw56" event={"ID":"e22d1830-21e9-40f0-ab2b-83335a568a18","Type":"ContainerStarted","Data":"ee6a2dd99ef1498477db567018498314266ac3980ae3e3dff6c88841a37b0704"} Mar 10 07:04:57 crc kubenswrapper[4825]: I0310 07:04:57.168408 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-6cw56" event={"ID":"e22d1830-21e9-40f0-ab2b-83335a568a18","Type":"ContainerStarted","Data":"fe7e718f6503ced41f7a67cfa4957670a966065f192666acd65c49722a0ae7d9"} Mar 10 07:04:57 crc kubenswrapper[4825]: I0310 07:04:57.175100 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b684fb9f5-8pqxx" event={"ID":"34986ed7-9e8c-467b-8be5-613c7f3f8e4b","Type":"ContainerStarted","Data":"af609656607d4d616d603727acade28f85fa01330ca7fa8971782c85b3ea9b61"} Mar 10 07:04:57 crc kubenswrapper[4825]: I0310 07:04:57.176702 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-j22lk" podStartSLOduration=2.176682962 podStartE2EDuration="2.176682962s" podCreationTimestamp="2026-03-10 07:04:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:04:57.168715295 +0000 UTC m=+1250.198495910" watchObservedRunningTime="2026-03-10 07:04:57.176682962 +0000 UTC m=+1250.206463577" Mar 10 07:04:57 crc kubenswrapper[4825]: I0310 07:04:57.179510 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-833c-account-create-update-stmjm" event={"ID":"178e89cb-daf7-488f-8188-de4e98bc1047","Type":"ContainerStarted","Data":"d09ee82d5aa4a7a17bf2b371f7100ef94c15ec19c7501d5875a9d27e816de091"} Mar 10 07:04:57 crc kubenswrapper[4825]: I0310 07:04:57.179549 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-833c-account-create-update-stmjm" 
event={"ID":"178e89cb-daf7-488f-8188-de4e98bc1047","Type":"ContainerStarted","Data":"960520181a145a8b6710b17dedef1691caf6b1a97e5d9a9ca874846f77928f17"} Mar 10 07:04:57 crc kubenswrapper[4825]: I0310 07:04:57.182744 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b26b-account-create-update-8l4dm" event={"ID":"4e9617ba-17da-4946-8c5b-b1060683237d","Type":"ContainerStarted","Data":"c9c1609b553fc626f099138385c622344d8095e540866f8342dba6b161ff67e7"} Mar 10 07:04:57 crc kubenswrapper[4825]: I0310 07:04:57.238504 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-l5vf2" podStartSLOduration=2.238485753 podStartE2EDuration="2.238485753s" podCreationTimestamp="2026-03-10 07:04:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:04:57.220306889 +0000 UTC m=+1250.250087504" watchObservedRunningTime="2026-03-10 07:04:57.238485753 +0000 UTC m=+1250.268266368" Mar 10 07:04:57 crc kubenswrapper[4825]: I0310 07:04:57.239246 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-6cw56" podStartSLOduration=2.239240343 podStartE2EDuration="2.239240343s" podCreationTimestamp="2026-03-10 07:04:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:04:57.23568655 +0000 UTC m=+1250.265467165" watchObservedRunningTime="2026-03-10 07:04:57.239240343 +0000 UTC m=+1250.269020958" Mar 10 07:04:57 crc kubenswrapper[4825]: I0310 07:04:57.300716 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-833c-account-create-update-stmjm" podStartSLOduration=2.300690054 podStartE2EDuration="2.300690054s" podCreationTimestamp="2026-03-10 07:04:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:04:57.288758393 +0000 UTC m=+1250.318539008" watchObservedRunningTime="2026-03-10 07:04:57.300690054 +0000 UTC m=+1250.330470669" Mar 10 07:04:58 crc kubenswrapper[4825]: I0310 07:04:58.194670 4825 generic.go:334] "Generic (PLEG): container finished" podID="37023b62-fd36-44d5-9c0f-45cf1b640da6" containerID="5e77a76f7d16b2669bd7d5e44e7c46216e39d10ff92e441fc9810b89690b1df6" exitCode=0 Mar 10 07:04:58 crc kubenswrapper[4825]: I0310 07:04:58.195360 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-10c2-account-create-update-7fscp" event={"ID":"37023b62-fd36-44d5-9c0f-45cf1b640da6","Type":"ContainerDied","Data":"5e77a76f7d16b2669bd7d5e44e7c46216e39d10ff92e441fc9810b89690b1df6"} Mar 10 07:04:58 crc kubenswrapper[4825]: I0310 07:04:58.195391 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-10c2-account-create-update-7fscp" event={"ID":"37023b62-fd36-44d5-9c0f-45cf1b640da6","Type":"ContainerStarted","Data":"b56bb6cf15673f3ba5da4fa8838e5869231f228758378fd5abbac2bc9cd78ef8"} Mar 10 07:04:58 crc kubenswrapper[4825]: I0310 07:04:58.199021 4825 generic.go:334] "Generic (PLEG): container finished" podID="b061d3ea-7025-4381-a6b3-232d5998b15f" containerID="6ea644254c4be2d8dccc8488e346df3272810cf37b2c4aec97ada14fd9383c97" exitCode=0 Mar 10 07:04:58 crc kubenswrapper[4825]: I0310 07:04:58.199195 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-j22lk" event={"ID":"b061d3ea-7025-4381-a6b3-232d5998b15f","Type":"ContainerDied","Data":"6ea644254c4be2d8dccc8488e346df3272810cf37b2c4aec97ada14fd9383c97"} Mar 10 07:04:58 crc kubenswrapper[4825]: I0310 07:04:58.203714 4825 generic.go:334] "Generic (PLEG): container finished" podID="4e9617ba-17da-4946-8c5b-b1060683237d" containerID="59e7d3f59f28bf9a85655e65cea5774e18262e1f33b27b8421e9a03b0b4731ad" exitCode=0 Mar 10 07:04:58 crc kubenswrapper[4825]: I0310 
07:04:58.203861 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b26b-account-create-update-8l4dm" event={"ID":"4e9617ba-17da-4946-8c5b-b1060683237d","Type":"ContainerDied","Data":"59e7d3f59f28bf9a85655e65cea5774e18262e1f33b27b8421e9a03b0b4731ad"} Mar 10 07:04:58 crc kubenswrapper[4825]: I0310 07:04:58.206894 4825 generic.go:334] "Generic (PLEG): container finished" podID="489d7ca6-7e05-482e-b880-69bea3b57c62" containerID="f7ac3927b544e90a8dc6d089575a7df963d2e88feeb45b58caabdeb8c821d487" exitCode=0 Mar 10 07:04:58 crc kubenswrapper[4825]: I0310 07:04:58.206970 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-l5vf2" event={"ID":"489d7ca6-7e05-482e-b880-69bea3b57c62","Type":"ContainerDied","Data":"f7ac3927b544e90a8dc6d089575a7df963d2e88feeb45b58caabdeb8c821d487"} Mar 10 07:04:58 crc kubenswrapper[4825]: I0310 07:04:58.210312 4825 generic.go:334] "Generic (PLEG): container finished" podID="e22d1830-21e9-40f0-ab2b-83335a568a18" containerID="ee6a2dd99ef1498477db567018498314266ac3980ae3e3dff6c88841a37b0704" exitCode=0 Mar 10 07:04:58 crc kubenswrapper[4825]: I0310 07:04:58.210393 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-6cw56" event={"ID":"e22d1830-21e9-40f0-ab2b-83335a568a18","Type":"ContainerDied","Data":"ee6a2dd99ef1498477db567018498314266ac3980ae3e3dff6c88841a37b0704"} Mar 10 07:04:58 crc kubenswrapper[4825]: I0310 07:04:58.220529 4825 generic.go:334] "Generic (PLEG): container finished" podID="34986ed7-9e8c-467b-8be5-613c7f3f8e4b" containerID="e1e36514caebef2f964c884dfc0275ec341667edb650d1c5097a843350f609c2" exitCode=0 Mar 10 07:04:58 crc kubenswrapper[4825]: I0310 07:04:58.220585 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b684fb9f5-8pqxx" event={"ID":"34986ed7-9e8c-467b-8be5-613c7f3f8e4b","Type":"ContainerDied","Data":"e1e36514caebef2f964c884dfc0275ec341667edb650d1c5097a843350f609c2"} Mar 10 07:04:58 crc 
kubenswrapper[4825]: I0310 07:04:58.222452 4825 generic.go:334] "Generic (PLEG): container finished" podID="178e89cb-daf7-488f-8188-de4e98bc1047" containerID="d09ee82d5aa4a7a17bf2b371f7100ef94c15ec19c7501d5875a9d27e816de091" exitCode=0 Mar 10 07:04:58 crc kubenswrapper[4825]: I0310 07:04:58.222685 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-833c-account-create-update-stmjm" event={"ID":"178e89cb-daf7-488f-8188-de4e98bc1047","Type":"ContainerDied","Data":"d09ee82d5aa4a7a17bf2b371f7100ef94c15ec19c7501d5875a9d27e816de091"} Mar 10 07:04:58 crc kubenswrapper[4825]: I0310 07:04:58.851449 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-jctbb-config-nq5vn" Mar 10 07:04:58 crc kubenswrapper[4825]: I0310 07:04:58.887911 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/44afa3f7-22c9-4881-86ca-8690b07a9972-var-run-ovn\") pod \"44afa3f7-22c9-4881-86ca-8690b07a9972\" (UID: \"44afa3f7-22c9-4881-86ca-8690b07a9972\") " Mar 10 07:04:58 crc kubenswrapper[4825]: I0310 07:04:58.887976 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/44afa3f7-22c9-4881-86ca-8690b07a9972-scripts\") pod \"44afa3f7-22c9-4881-86ca-8690b07a9972\" (UID: \"44afa3f7-22c9-4881-86ca-8690b07a9972\") " Mar 10 07:04:58 crc kubenswrapper[4825]: I0310 07:04:58.888055 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqjlw\" (UniqueName: \"kubernetes.io/projected/44afa3f7-22c9-4881-86ca-8690b07a9972-kube-api-access-bqjlw\") pod \"44afa3f7-22c9-4881-86ca-8690b07a9972\" (UID: \"44afa3f7-22c9-4881-86ca-8690b07a9972\") " Mar 10 07:04:58 crc kubenswrapper[4825]: I0310 07:04:58.888201 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/44afa3f7-22c9-4881-86ca-8690b07a9972-var-run\") pod \"44afa3f7-22c9-4881-86ca-8690b07a9972\" (UID: \"44afa3f7-22c9-4881-86ca-8690b07a9972\") " Mar 10 07:04:58 crc kubenswrapper[4825]: I0310 07:04:58.888238 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/44afa3f7-22c9-4881-86ca-8690b07a9972-var-log-ovn\") pod \"44afa3f7-22c9-4881-86ca-8690b07a9972\" (UID: \"44afa3f7-22c9-4881-86ca-8690b07a9972\") " Mar 10 07:04:58 crc kubenswrapper[4825]: I0310 07:04:58.888272 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/44afa3f7-22c9-4881-86ca-8690b07a9972-additional-scripts\") pod \"44afa3f7-22c9-4881-86ca-8690b07a9972\" (UID: \"44afa3f7-22c9-4881-86ca-8690b07a9972\") " Mar 10 07:04:58 crc kubenswrapper[4825]: I0310 07:04:58.889561 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44afa3f7-22c9-4881-86ca-8690b07a9972-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "44afa3f7-22c9-4881-86ca-8690b07a9972" (UID: "44afa3f7-22c9-4881-86ca-8690b07a9972"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:04:58 crc kubenswrapper[4825]: I0310 07:04:58.889612 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44afa3f7-22c9-4881-86ca-8690b07a9972-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "44afa3f7-22c9-4881-86ca-8690b07a9972" (UID: "44afa3f7-22c9-4881-86ca-8690b07a9972"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 07:04:58 crc kubenswrapper[4825]: I0310 07:04:58.890435 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44afa3f7-22c9-4881-86ca-8690b07a9972-scripts" (OuterVolumeSpecName: "scripts") pod "44afa3f7-22c9-4881-86ca-8690b07a9972" (UID: "44afa3f7-22c9-4881-86ca-8690b07a9972"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:04:58 crc kubenswrapper[4825]: I0310 07:04:58.891418 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44afa3f7-22c9-4881-86ca-8690b07a9972-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "44afa3f7-22c9-4881-86ca-8690b07a9972" (UID: "44afa3f7-22c9-4881-86ca-8690b07a9972"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 07:04:58 crc kubenswrapper[4825]: I0310 07:04:58.891463 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44afa3f7-22c9-4881-86ca-8690b07a9972-var-run" (OuterVolumeSpecName: "var-run") pod "44afa3f7-22c9-4881-86ca-8690b07a9972" (UID: "44afa3f7-22c9-4881-86ca-8690b07a9972"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 07:04:58 crc kubenswrapper[4825]: I0310 07:04:58.908544 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44afa3f7-22c9-4881-86ca-8690b07a9972-kube-api-access-bqjlw" (OuterVolumeSpecName: "kube-api-access-bqjlw") pod "44afa3f7-22c9-4881-86ca-8690b07a9972" (UID: "44afa3f7-22c9-4881-86ca-8690b07a9972"). InnerVolumeSpecName "kube-api-access-bqjlw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:04:58 crc kubenswrapper[4825]: I0310 07:04:58.989838 4825 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/44afa3f7-22c9-4881-86ca-8690b07a9972-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:58 crc kubenswrapper[4825]: I0310 07:04:58.989874 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/44afa3f7-22c9-4881-86ca-8690b07a9972-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:58 crc kubenswrapper[4825]: I0310 07:04:58.989888 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqjlw\" (UniqueName: \"kubernetes.io/projected/44afa3f7-22c9-4881-86ca-8690b07a9972-kube-api-access-bqjlw\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:58 crc kubenswrapper[4825]: I0310 07:04:58.989903 4825 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/44afa3f7-22c9-4881-86ca-8690b07a9972-var-run\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:58 crc kubenswrapper[4825]: I0310 07:04:58.989916 4825 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/44afa3f7-22c9-4881-86ca-8690b07a9972-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:58 crc kubenswrapper[4825]: I0310 07:04:58.989929 4825 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/44afa3f7-22c9-4881-86ca-8690b07a9972-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:59 crc kubenswrapper[4825]: I0310 07:04:59.248967 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-jctbb-config-nq5vn" Mar 10 07:04:59 crc kubenswrapper[4825]: I0310 07:04:59.265807 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jctbb-config-nq5vn" event={"ID":"44afa3f7-22c9-4881-86ca-8690b07a9972","Type":"ContainerDied","Data":"8e7aee3fd22a8e98753973fadd022c8fc2a1ab35b46c547f0d0330df7ae66486"} Mar 10 07:04:59 crc kubenswrapper[4825]: I0310 07:04:59.265848 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e7aee3fd22a8e98753973fadd022c8fc2a1ab35b46c547f0d0330df7ae66486" Mar 10 07:04:59 crc kubenswrapper[4825]: I0310 07:04:59.270278 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b684fb9f5-8pqxx" event={"ID":"34986ed7-9e8c-467b-8be5-613c7f3f8e4b","Type":"ContainerStarted","Data":"3512388ff231c62f2433e8641aaa018bfa8bfb7fee8b63c6dcfa8faaad232c79"} Mar 10 07:04:59 crc kubenswrapper[4825]: I0310 07:04:59.270513 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b684fb9f5-8pqxx" Mar 10 07:04:59 crc kubenswrapper[4825]: I0310 07:04:59.293802 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-jctbb-config-nq5vn"] Mar 10 07:04:59 crc kubenswrapper[4825]: I0310 07:04:59.303062 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-jctbb-config-nq5vn"] Mar 10 07:04:59 crc kubenswrapper[4825]: I0310 07:04:59.336747 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b684fb9f5-8pqxx" podStartSLOduration=4.33673106 podStartE2EDuration="4.33673106s" podCreationTimestamp="2026-03-10 07:04:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:04:59.333940017 +0000 UTC m=+1252.363720632" watchObservedRunningTime="2026-03-10 07:04:59.33673106 +0000 UTC 
m=+1252.366511665" Mar 10 07:04:59 crc kubenswrapper[4825]: I0310 07:04:59.643208 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-l5vf2" Mar 10 07:04:59 crc kubenswrapper[4825]: I0310 07:04:59.809881 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkcxf\" (UniqueName: \"kubernetes.io/projected/489d7ca6-7e05-482e-b880-69bea3b57c62-kube-api-access-zkcxf\") pod \"489d7ca6-7e05-482e-b880-69bea3b57c62\" (UID: \"489d7ca6-7e05-482e-b880-69bea3b57c62\") " Mar 10 07:04:59 crc kubenswrapper[4825]: I0310 07:04:59.810028 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/489d7ca6-7e05-482e-b880-69bea3b57c62-operator-scripts\") pod \"489d7ca6-7e05-482e-b880-69bea3b57c62\" (UID: \"489d7ca6-7e05-482e-b880-69bea3b57c62\") " Mar 10 07:04:59 crc kubenswrapper[4825]: I0310 07:04:59.811026 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/489d7ca6-7e05-482e-b880-69bea3b57c62-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "489d7ca6-7e05-482e-b880-69bea3b57c62" (UID: "489d7ca6-7e05-482e-b880-69bea3b57c62"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:04:59 crc kubenswrapper[4825]: I0310 07:04:59.820290 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/489d7ca6-7e05-482e-b880-69bea3b57c62-kube-api-access-zkcxf" (OuterVolumeSpecName: "kube-api-access-zkcxf") pod "489d7ca6-7e05-482e-b880-69bea3b57c62" (UID: "489d7ca6-7e05-482e-b880-69bea3b57c62"). InnerVolumeSpecName "kube-api-access-zkcxf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:04:59 crc kubenswrapper[4825]: I0310 07:04:59.912593 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkcxf\" (UniqueName: \"kubernetes.io/projected/489d7ca6-7e05-482e-b880-69bea3b57c62-kube-api-access-zkcxf\") on node \"crc\" DevicePath \"\"" Mar 10 07:04:59 crc kubenswrapper[4825]: I0310 07:04:59.912657 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/489d7ca6-7e05-482e-b880-69bea3b57c62-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:00 crc kubenswrapper[4825]: I0310 07:05:00.279121 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-l5vf2" Mar 10 07:05:00 crc kubenswrapper[4825]: I0310 07:05:00.279445 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-l5vf2" event={"ID":"489d7ca6-7e05-482e-b880-69bea3b57c62","Type":"ContainerDied","Data":"3958a9d49bf18212f293f88c67f12ca6cb8b83424bddcc13205fea4497852be3"} Mar 10 07:05:00 crc kubenswrapper[4825]: I0310 07:05:00.279490 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3958a9d49bf18212f293f88c67f12ca6cb8b83424bddcc13205fea4497852be3" Mar 10 07:05:00 crc kubenswrapper[4825]: I0310 07:05:00.285338 4825 generic.go:334] "Generic (PLEG): container finished" podID="31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff" containerID="9e2afec1031575eab3dc1ec0e1af73607aeb7fdff1f66af902b177103558df55" exitCode=0 Mar 10 07:05:00 crc kubenswrapper[4825]: I0310 07:05:00.285424 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-7zgw8" event={"ID":"31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff","Type":"ContainerDied","Data":"9e2afec1031575eab3dc1ec0e1af73607aeb7fdff1f66af902b177103558df55"} Mar 10 07:05:01 crc kubenswrapper[4825]: I0310 07:05:01.249635 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="44afa3f7-22c9-4881-86ca-8690b07a9972" path="/var/lib/kubelet/pods/44afa3f7-22c9-4881-86ca-8690b07a9972/volumes" Mar 10 07:05:02 crc kubenswrapper[4825]: I0310 07:05:02.960103 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-j22lk" Mar 10 07:05:02 crc kubenswrapper[4825]: I0310 07:05:02.979385 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdnnl\" (UniqueName: \"kubernetes.io/projected/b061d3ea-7025-4381-a6b3-232d5998b15f-kube-api-access-sdnnl\") pod \"b061d3ea-7025-4381-a6b3-232d5998b15f\" (UID: \"b061d3ea-7025-4381-a6b3-232d5998b15f\") " Mar 10 07:05:02 crc kubenswrapper[4825]: I0310 07:05:02.979861 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b061d3ea-7025-4381-a6b3-232d5998b15f-operator-scripts\") pod \"b061d3ea-7025-4381-a6b3-232d5998b15f\" (UID: \"b061d3ea-7025-4381-a6b3-232d5998b15f\") " Mar 10 07:05:02 crc kubenswrapper[4825]: I0310 07:05:02.981697 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b061d3ea-7025-4381-a6b3-232d5998b15f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b061d3ea-7025-4381-a6b3-232d5998b15f" (UID: "b061d3ea-7025-4381-a6b3-232d5998b15f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:05:02 crc kubenswrapper[4825]: I0310 07:05:02.987371 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b061d3ea-7025-4381-a6b3-232d5998b15f-kube-api-access-sdnnl" (OuterVolumeSpecName: "kube-api-access-sdnnl") pod "b061d3ea-7025-4381-a6b3-232d5998b15f" (UID: "b061d3ea-7025-4381-a6b3-232d5998b15f"). InnerVolumeSpecName "kube-api-access-sdnnl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.082346 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdnnl\" (UniqueName: \"kubernetes.io/projected/b061d3ea-7025-4381-a6b3-232d5998b15f-kube-api-access-sdnnl\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.082380 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b061d3ea-7025-4381-a6b3-232d5998b15f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.089717 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-6cw56" Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.096958 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-7zgw8" Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.117492 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-10c2-account-create-update-7fscp" Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.180324 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-833c-account-create-update-stmjm" Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.183648 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37023b62-fd36-44d5-9c0f-45cf1b640da6-operator-scripts\") pod \"37023b62-fd36-44d5-9c0f-45cf1b640da6\" (UID: \"37023b62-fd36-44d5-9c0f-45cf1b640da6\") " Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.183829 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jslv5\" (UniqueName: \"kubernetes.io/projected/37023b62-fd36-44d5-9c0f-45cf1b640da6-kube-api-access-jslv5\") pod \"37023b62-fd36-44d5-9c0f-45cf1b640da6\" (UID: \"37023b62-fd36-44d5-9c0f-45cf1b640da6\") " Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.183869 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/178e89cb-daf7-488f-8188-de4e98bc1047-operator-scripts\") pod \"178e89cb-daf7-488f-8188-de4e98bc1047\" (UID: \"178e89cb-daf7-488f-8188-de4e98bc1047\") " Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.183920 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e22d1830-21e9-40f0-ab2b-83335a568a18-operator-scripts\") pod \"e22d1830-21e9-40f0-ab2b-83335a568a18\" (UID: \"e22d1830-21e9-40f0-ab2b-83335a568a18\") " Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.183953 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff-config-data\") pod \"31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff\" (UID: \"31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff\") " Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.183986 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-7b2s2\" (UniqueName: \"kubernetes.io/projected/178e89cb-daf7-488f-8188-de4e98bc1047-kube-api-access-7b2s2\") pod \"178e89cb-daf7-488f-8188-de4e98bc1047\" (UID: \"178e89cb-daf7-488f-8188-de4e98bc1047\") " Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.184036 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff-db-sync-config-data\") pod \"31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff\" (UID: \"31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff\") " Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.184094 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff-combined-ca-bundle\") pod \"31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff\" (UID: \"31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff\") " Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.184175 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65w7p\" (UniqueName: \"kubernetes.io/projected/31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff-kube-api-access-65w7p\") pod \"31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff\" (UID: \"31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff\") " Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.184239 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glskw\" (UniqueName: \"kubernetes.io/projected/e22d1830-21e9-40f0-ab2b-83335a568a18-kube-api-access-glskw\") pod \"e22d1830-21e9-40f0-ab2b-83335a568a18\" (UID: \"e22d1830-21e9-40f0-ab2b-83335a568a18\") " Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.184287 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37023b62-fd36-44d5-9c0f-45cf1b640da6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"37023b62-fd36-44d5-9c0f-45cf1b640da6" (UID: "37023b62-fd36-44d5-9c0f-45cf1b640da6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.184615 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37023b62-fd36-44d5-9c0f-45cf1b640da6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.185188 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/178e89cb-daf7-488f-8188-de4e98bc1047-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "178e89cb-daf7-488f-8188-de4e98bc1047" (UID: "178e89cb-daf7-488f-8188-de4e98bc1047"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.190153 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff" (UID: "31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.192210 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37023b62-fd36-44d5-9c0f-45cf1b640da6-kube-api-access-jslv5" (OuterVolumeSpecName: "kube-api-access-jslv5") pod "37023b62-fd36-44d5-9c0f-45cf1b640da6" (UID: "37023b62-fd36-44d5-9c0f-45cf1b640da6"). InnerVolumeSpecName "kube-api-access-jslv5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.192254 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/178e89cb-daf7-488f-8188-de4e98bc1047-kube-api-access-7b2s2" (OuterVolumeSpecName: "kube-api-access-7b2s2") pod "178e89cb-daf7-488f-8188-de4e98bc1047" (UID: "178e89cb-daf7-488f-8188-de4e98bc1047"). InnerVolumeSpecName "kube-api-access-7b2s2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.192282 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e22d1830-21e9-40f0-ab2b-83335a568a18-kube-api-access-glskw" (OuterVolumeSpecName: "kube-api-access-glskw") pod "e22d1830-21e9-40f0-ab2b-83335a568a18" (UID: "e22d1830-21e9-40f0-ab2b-83335a568a18"). InnerVolumeSpecName "kube-api-access-glskw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.193436 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b26b-account-create-update-8l4dm" Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.195636 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e22d1830-21e9-40f0-ab2b-83335a568a18-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e22d1830-21e9-40f0-ab2b-83335a568a18" (UID: "e22d1830-21e9-40f0-ab2b-83335a568a18"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.210460 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff-kube-api-access-65w7p" (OuterVolumeSpecName: "kube-api-access-65w7p") pod "31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff" (UID: "31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff"). 
InnerVolumeSpecName "kube-api-access-65w7p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.236260 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff" (UID: "31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.270271 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff-config-data" (OuterVolumeSpecName: "config-data") pod "31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff" (UID: "31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.285552 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e9617ba-17da-4946-8c5b-b1060683237d-operator-scripts\") pod \"4e9617ba-17da-4946-8c5b-b1060683237d\" (UID: \"4e9617ba-17da-4946-8c5b-b1060683237d\") " Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.285613 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5wf4\" (UniqueName: \"kubernetes.io/projected/4e9617ba-17da-4946-8c5b-b1060683237d-kube-api-access-d5wf4\") pod \"4e9617ba-17da-4946-8c5b-b1060683237d\" (UID: \"4e9617ba-17da-4946-8c5b-b1060683237d\") " Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.286037 4825 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:03 crc 
kubenswrapper[4825]: I0310 07:05:03.286054 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.286068 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65w7p\" (UniqueName: \"kubernetes.io/projected/31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff-kube-api-access-65w7p\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.286078 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glskw\" (UniqueName: \"kubernetes.io/projected/e22d1830-21e9-40f0-ab2b-83335a568a18-kube-api-access-glskw\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.286088 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jslv5\" (UniqueName: \"kubernetes.io/projected/37023b62-fd36-44d5-9c0f-45cf1b640da6-kube-api-access-jslv5\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.286097 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/178e89cb-daf7-488f-8188-de4e98bc1047-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.286106 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e22d1830-21e9-40f0-ab2b-83335a568a18-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.286116 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.286124 4825 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-7b2s2\" (UniqueName: \"kubernetes.io/projected/178e89cb-daf7-488f-8188-de4e98bc1047-kube-api-access-7b2s2\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.287253 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e9617ba-17da-4946-8c5b-b1060683237d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4e9617ba-17da-4946-8c5b-b1060683237d" (UID: "4e9617ba-17da-4946-8c5b-b1060683237d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.290104 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e9617ba-17da-4946-8c5b-b1060683237d-kube-api-access-d5wf4" (OuterVolumeSpecName: "kube-api-access-d5wf4") pod "4e9617ba-17da-4946-8c5b-b1060683237d" (UID: "4e9617ba-17da-4946-8c5b-b1060683237d"). InnerVolumeSpecName "kube-api-access-d5wf4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.312176 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-833c-account-create-update-stmjm" Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.312179 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-833c-account-create-update-stmjm" event={"ID":"178e89cb-daf7-488f-8188-de4e98bc1047","Type":"ContainerDied","Data":"960520181a145a8b6710b17dedef1691caf6b1a97e5d9a9ca874846f77928f17"} Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.312247 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="960520181a145a8b6710b17dedef1691caf6b1a97e5d9a9ca874846f77928f17" Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.313889 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-10c2-account-create-update-7fscp" event={"ID":"37023b62-fd36-44d5-9c0f-45cf1b640da6","Type":"ContainerDied","Data":"b56bb6cf15673f3ba5da4fa8838e5869231f228758378fd5abbac2bc9cd78ef8"} Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.313925 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b56bb6cf15673f3ba5da4fa8838e5869231f228758378fd5abbac2bc9cd78ef8" Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.313899 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-10c2-account-create-update-7fscp" Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.315811 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b26b-account-create-update-8l4dm" event={"ID":"4e9617ba-17da-4946-8c5b-b1060683237d","Type":"ContainerDied","Data":"c9c1609b553fc626f099138385c622344d8095e540866f8342dba6b161ff67e7"} Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.315888 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9c1609b553fc626f099138385c622344d8095e540866f8342dba6b161ff67e7" Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.315946 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b26b-account-create-update-8l4dm" Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.320090 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-j22lk" Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.320091 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-j22lk" event={"ID":"b061d3ea-7025-4381-a6b3-232d5998b15f","Type":"ContainerDied","Data":"bf974f0e28be815f5fc5b914581b08d8027b90a416aad7b7a829fdedb914c2d8"} Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.320175 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf974f0e28be815f5fc5b914581b08d8027b90a416aad7b7a829fdedb914c2d8" Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.323320 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-mcpm6" event={"ID":"e2f77e29-2f56-4415-b778-376f67322f69","Type":"ContainerStarted","Data":"d9fbc166daf6efaadd2da5517d14469cb803ed05f0994d79c6b487e8b8563e3e"} Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.329421 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-7zgw8" Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.331959 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-7zgw8" event={"ID":"31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff","Type":"ContainerDied","Data":"6d18a0ad45c18f6101409e5f30e5d6e51550a0a09f4c57f78982f3234faf52d4"} Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.331996 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d18a0ad45c18f6101409e5f30e5d6e51550a0a09f4c57f78982f3234faf52d4" Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.336044 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-6cw56" event={"ID":"e22d1830-21e9-40f0-ab2b-83335a568a18","Type":"ContainerDied","Data":"fe7e718f6503ced41f7a67cfa4957670a966065f192666acd65c49722a0ae7d9"} Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.336086 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe7e718f6503ced41f7a67cfa4957670a966065f192666acd65c49722a0ae7d9" Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.336097 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-6cw56" Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.348909 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-mcpm6" podStartSLOduration=2.217916268 podStartE2EDuration="8.348889171s" podCreationTimestamp="2026-03-10 07:04:55 +0000 UTC" firstStartedPulling="2026-03-10 07:04:56.75145955 +0000 UTC m=+1249.781240165" lastFinishedPulling="2026-03-10 07:05:02.882432453 +0000 UTC m=+1255.912213068" observedRunningTime="2026-03-10 07:05:03.342245088 +0000 UTC m=+1256.372025703" watchObservedRunningTime="2026-03-10 07:05:03.348889171 +0000 UTC m=+1256.378669786" Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.386804 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e9617ba-17da-4946-8c5b-b1060683237d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:03 crc kubenswrapper[4825]: I0310 07:05:03.386838 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5wf4\" (UniqueName: \"kubernetes.io/projected/4e9617ba-17da-4946-8c5b-b1060683237d-kube-api-access-d5wf4\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:04 crc kubenswrapper[4825]: I0310 07:05:04.638206 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b684fb9f5-8pqxx"] Mar 10 07:05:04 crc kubenswrapper[4825]: I0310 07:05:04.638722 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b684fb9f5-8pqxx" podUID="34986ed7-9e8c-467b-8be5-613c7f3f8e4b" containerName="dnsmasq-dns" containerID="cri-o://3512388ff231c62f2433e8641aaa018bfa8bfb7fee8b63c6dcfa8faaad232c79" gracePeriod=10 Mar 10 07:05:04 crc kubenswrapper[4825]: I0310 07:05:04.640351 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b684fb9f5-8pqxx" Mar 10 07:05:04 crc kubenswrapper[4825]: I0310 
07:05:04.721686 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75c886f8b5-bjbqz"] Mar 10 07:05:04 crc kubenswrapper[4825]: E0310 07:05:04.722055 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e9617ba-17da-4946-8c5b-b1060683237d" containerName="mariadb-account-create-update" Mar 10 07:05:04 crc kubenswrapper[4825]: I0310 07:05:04.722074 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e9617ba-17da-4946-8c5b-b1060683237d" containerName="mariadb-account-create-update" Mar 10 07:05:04 crc kubenswrapper[4825]: E0310 07:05:04.722092 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff" containerName="glance-db-sync" Mar 10 07:05:04 crc kubenswrapper[4825]: I0310 07:05:04.722099 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff" containerName="glance-db-sync" Mar 10 07:05:04 crc kubenswrapper[4825]: E0310 07:05:04.722109 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="489d7ca6-7e05-482e-b880-69bea3b57c62" containerName="mariadb-database-create" Mar 10 07:05:04 crc kubenswrapper[4825]: I0310 07:05:04.722115 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="489d7ca6-7e05-482e-b880-69bea3b57c62" containerName="mariadb-database-create" Mar 10 07:05:04 crc kubenswrapper[4825]: E0310 07:05:04.722145 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e22d1830-21e9-40f0-ab2b-83335a568a18" containerName="mariadb-database-create" Mar 10 07:05:04 crc kubenswrapper[4825]: I0310 07:05:04.722151 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="e22d1830-21e9-40f0-ab2b-83335a568a18" containerName="mariadb-database-create" Mar 10 07:05:04 crc kubenswrapper[4825]: E0310 07:05:04.722167 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="178e89cb-daf7-488f-8188-de4e98bc1047" containerName="mariadb-account-create-update" Mar 10 07:05:04 crc 
kubenswrapper[4825]: I0310 07:05:04.722174 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="178e89cb-daf7-488f-8188-de4e98bc1047" containerName="mariadb-account-create-update" Mar 10 07:05:04 crc kubenswrapper[4825]: E0310 07:05:04.722182 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37023b62-fd36-44d5-9c0f-45cf1b640da6" containerName="mariadb-account-create-update" Mar 10 07:05:04 crc kubenswrapper[4825]: I0310 07:05:04.722188 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="37023b62-fd36-44d5-9c0f-45cf1b640da6" containerName="mariadb-account-create-update" Mar 10 07:05:04 crc kubenswrapper[4825]: E0310 07:05:04.722207 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b061d3ea-7025-4381-a6b3-232d5998b15f" containerName="mariadb-database-create" Mar 10 07:05:04 crc kubenswrapper[4825]: I0310 07:05:04.722213 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b061d3ea-7025-4381-a6b3-232d5998b15f" containerName="mariadb-database-create" Mar 10 07:05:04 crc kubenswrapper[4825]: E0310 07:05:04.722225 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44afa3f7-22c9-4881-86ca-8690b07a9972" containerName="ovn-config" Mar 10 07:05:04 crc kubenswrapper[4825]: I0310 07:05:04.722230 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="44afa3f7-22c9-4881-86ca-8690b07a9972" containerName="ovn-config" Mar 10 07:05:04 crc kubenswrapper[4825]: I0310 07:05:04.722367 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="489d7ca6-7e05-482e-b880-69bea3b57c62" containerName="mariadb-database-create" Mar 10 07:05:04 crc kubenswrapper[4825]: I0310 07:05:04.722377 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="b061d3ea-7025-4381-a6b3-232d5998b15f" containerName="mariadb-database-create" Mar 10 07:05:04 crc kubenswrapper[4825]: I0310 07:05:04.722389 4825 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff" containerName="glance-db-sync" Mar 10 07:05:04 crc kubenswrapper[4825]: I0310 07:05:04.722397 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="44afa3f7-22c9-4881-86ca-8690b07a9972" containerName="ovn-config" Mar 10 07:05:04 crc kubenswrapper[4825]: I0310 07:05:04.722410 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="178e89cb-daf7-488f-8188-de4e98bc1047" containerName="mariadb-account-create-update" Mar 10 07:05:04 crc kubenswrapper[4825]: I0310 07:05:04.722418 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e9617ba-17da-4946-8c5b-b1060683237d" containerName="mariadb-account-create-update" Mar 10 07:05:04 crc kubenswrapper[4825]: I0310 07:05:04.722428 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="e22d1830-21e9-40f0-ab2b-83335a568a18" containerName="mariadb-database-create" Mar 10 07:05:04 crc kubenswrapper[4825]: I0310 07:05:04.722439 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="37023b62-fd36-44d5-9c0f-45cf1b640da6" containerName="mariadb-account-create-update" Mar 10 07:05:04 crc kubenswrapper[4825]: I0310 07:05:04.723275 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75c886f8b5-bjbqz" Mar 10 07:05:04 crc kubenswrapper[4825]: I0310 07:05:04.744237 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c886f8b5-bjbqz"] Mar 10 07:05:04 crc kubenswrapper[4825]: I0310 07:05:04.817239 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87-dns-svc\") pod \"dnsmasq-dns-75c886f8b5-bjbqz\" (UID: \"963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87\") " pod="openstack/dnsmasq-dns-75c886f8b5-bjbqz" Mar 10 07:05:04 crc kubenswrapper[4825]: I0310 07:05:04.817284 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87-ovsdbserver-nb\") pod \"dnsmasq-dns-75c886f8b5-bjbqz\" (UID: \"963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87\") " pod="openstack/dnsmasq-dns-75c886f8b5-bjbqz" Mar 10 07:05:04 crc kubenswrapper[4825]: I0310 07:05:04.817311 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87-dns-swift-storage-0\") pod \"dnsmasq-dns-75c886f8b5-bjbqz\" (UID: \"963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87\") " pod="openstack/dnsmasq-dns-75c886f8b5-bjbqz" Mar 10 07:05:04 crc kubenswrapper[4825]: I0310 07:05:04.817336 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdsmd\" (UniqueName: \"kubernetes.io/projected/963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87-kube-api-access-gdsmd\") pod \"dnsmasq-dns-75c886f8b5-bjbqz\" (UID: \"963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87\") " pod="openstack/dnsmasq-dns-75c886f8b5-bjbqz" Mar 10 07:05:04 crc kubenswrapper[4825]: I0310 07:05:04.817358 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87-ovsdbserver-sb\") pod \"dnsmasq-dns-75c886f8b5-bjbqz\" (UID: \"963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87\") " pod="openstack/dnsmasq-dns-75c886f8b5-bjbqz" Mar 10 07:05:04 crc kubenswrapper[4825]: I0310 07:05:04.817416 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87-config\") pod \"dnsmasq-dns-75c886f8b5-bjbqz\" (UID: \"963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87\") " pod="openstack/dnsmasq-dns-75c886f8b5-bjbqz" Mar 10 07:05:04 crc kubenswrapper[4825]: I0310 07:05:04.917847 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87-dns-svc\") pod \"dnsmasq-dns-75c886f8b5-bjbqz\" (UID: \"963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87\") " pod="openstack/dnsmasq-dns-75c886f8b5-bjbqz" Mar 10 07:05:04 crc kubenswrapper[4825]: I0310 07:05:04.917906 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87-ovsdbserver-nb\") pod \"dnsmasq-dns-75c886f8b5-bjbqz\" (UID: \"963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87\") " pod="openstack/dnsmasq-dns-75c886f8b5-bjbqz" Mar 10 07:05:04 crc kubenswrapper[4825]: I0310 07:05:04.917933 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87-dns-swift-storage-0\") pod \"dnsmasq-dns-75c886f8b5-bjbqz\" (UID: \"963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87\") " pod="openstack/dnsmasq-dns-75c886f8b5-bjbqz" Mar 10 07:05:04 crc kubenswrapper[4825]: I0310 07:05:04.917966 4825 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-gdsmd\" (UniqueName: \"kubernetes.io/projected/963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87-kube-api-access-gdsmd\") pod \"dnsmasq-dns-75c886f8b5-bjbqz\" (UID: \"963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87\") " pod="openstack/dnsmasq-dns-75c886f8b5-bjbqz"
Mar 10 07:05:04 crc kubenswrapper[4825]: I0310 07:05:04.917989 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87-ovsdbserver-sb\") pod \"dnsmasq-dns-75c886f8b5-bjbqz\" (UID: \"963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87\") " pod="openstack/dnsmasq-dns-75c886f8b5-bjbqz"
Mar 10 07:05:04 crc kubenswrapper[4825]: I0310 07:05:04.918046 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87-config\") pod \"dnsmasq-dns-75c886f8b5-bjbqz\" (UID: \"963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87\") " pod="openstack/dnsmasq-dns-75c886f8b5-bjbqz"
Mar 10 07:05:04 crc kubenswrapper[4825]: I0310 07:05:04.918894 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87-dns-svc\") pod \"dnsmasq-dns-75c886f8b5-bjbqz\" (UID: \"963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87\") " pod="openstack/dnsmasq-dns-75c886f8b5-bjbqz"
Mar 10 07:05:04 crc kubenswrapper[4825]: I0310 07:05:04.918926 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87-ovsdbserver-nb\") pod \"dnsmasq-dns-75c886f8b5-bjbqz\" (UID: \"963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87\") " pod="openstack/dnsmasq-dns-75c886f8b5-bjbqz"
Mar 10 07:05:04 crc kubenswrapper[4825]: I0310 07:05:04.919216 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87-config\") pod \"dnsmasq-dns-75c886f8b5-bjbqz\" (UID: \"963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87\") " pod="openstack/dnsmasq-dns-75c886f8b5-bjbqz"
Mar 10 07:05:04 crc kubenswrapper[4825]: I0310 07:05:04.919490 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87-ovsdbserver-sb\") pod \"dnsmasq-dns-75c886f8b5-bjbqz\" (UID: \"963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87\") " pod="openstack/dnsmasq-dns-75c886f8b5-bjbqz"
Mar 10 07:05:04 crc kubenswrapper[4825]: I0310 07:05:04.919725 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87-dns-swift-storage-0\") pod \"dnsmasq-dns-75c886f8b5-bjbqz\" (UID: \"963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87\") " pod="openstack/dnsmasq-dns-75c886f8b5-bjbqz"
Mar 10 07:05:04 crc kubenswrapper[4825]: I0310 07:05:04.947483 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdsmd\" (UniqueName: \"kubernetes.io/projected/963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87-kube-api-access-gdsmd\") pod \"dnsmasq-dns-75c886f8b5-bjbqz\" (UID: \"963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87\") " pod="openstack/dnsmasq-dns-75c886f8b5-bjbqz"
Mar 10 07:05:05 crc kubenswrapper[4825]: I0310 07:05:05.037001 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c886f8b5-bjbqz"
Mar 10 07:05:05 crc kubenswrapper[4825]: I0310 07:05:05.357861 4825 generic.go:334] "Generic (PLEG): container finished" podID="34986ed7-9e8c-467b-8be5-613c7f3f8e4b" containerID="3512388ff231c62f2433e8641aaa018bfa8bfb7fee8b63c6dcfa8faaad232c79" exitCode=0
Mar 10 07:05:05 crc kubenswrapper[4825]: I0310 07:05:05.358064 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b684fb9f5-8pqxx" event={"ID":"34986ed7-9e8c-467b-8be5-613c7f3f8e4b","Type":"ContainerDied","Data":"3512388ff231c62f2433e8641aaa018bfa8bfb7fee8b63c6dcfa8faaad232c79"}
Mar 10 07:05:05 crc kubenswrapper[4825]: I0310 07:05:05.520801 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c886f8b5-bjbqz"]
Mar 10 07:05:05 crc kubenswrapper[4825]: I0310 07:05:05.597360 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b684fb9f5-8pqxx"
Mar 10 07:05:05 crc kubenswrapper[4825]: I0310 07:05:05.732236 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34986ed7-9e8c-467b-8be5-613c7f3f8e4b-config\") pod \"34986ed7-9e8c-467b-8be5-613c7f3f8e4b\" (UID: \"34986ed7-9e8c-467b-8be5-613c7f3f8e4b\") "
Mar 10 07:05:05 crc kubenswrapper[4825]: I0310 07:05:05.732802 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vc2wb\" (UniqueName: \"kubernetes.io/projected/34986ed7-9e8c-467b-8be5-613c7f3f8e4b-kube-api-access-vc2wb\") pod \"34986ed7-9e8c-467b-8be5-613c7f3f8e4b\" (UID: \"34986ed7-9e8c-467b-8be5-613c7f3f8e4b\") "
Mar 10 07:05:05 crc kubenswrapper[4825]: I0310 07:05:05.733599 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34986ed7-9e8c-467b-8be5-613c7f3f8e4b-ovsdbserver-sb\") pod \"34986ed7-9e8c-467b-8be5-613c7f3f8e4b\" (UID: \"34986ed7-9e8c-467b-8be5-613c7f3f8e4b\") "
Mar 10 07:05:05 crc kubenswrapper[4825]: I0310 07:05:05.734013 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34986ed7-9e8c-467b-8be5-613c7f3f8e4b-dns-svc\") pod \"34986ed7-9e8c-467b-8be5-613c7f3f8e4b\" (UID: \"34986ed7-9e8c-467b-8be5-613c7f3f8e4b\") "
Mar 10 07:05:05 crc kubenswrapper[4825]: I0310 07:05:05.734065 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/34986ed7-9e8c-467b-8be5-613c7f3f8e4b-dns-swift-storage-0\") pod \"34986ed7-9e8c-467b-8be5-613c7f3f8e4b\" (UID: \"34986ed7-9e8c-467b-8be5-613c7f3f8e4b\") "
Mar 10 07:05:05 crc kubenswrapper[4825]: I0310 07:05:05.734085 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34986ed7-9e8c-467b-8be5-613c7f3f8e4b-ovsdbserver-nb\") pod \"34986ed7-9e8c-467b-8be5-613c7f3f8e4b\" (UID: \"34986ed7-9e8c-467b-8be5-613c7f3f8e4b\") "
Mar 10 07:05:05 crc kubenswrapper[4825]: I0310 07:05:05.736760 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34986ed7-9e8c-467b-8be5-613c7f3f8e4b-kube-api-access-vc2wb" (OuterVolumeSpecName: "kube-api-access-vc2wb") pod "34986ed7-9e8c-467b-8be5-613c7f3f8e4b" (UID: "34986ed7-9e8c-467b-8be5-613c7f3f8e4b"). InnerVolumeSpecName "kube-api-access-vc2wb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 07:05:05 crc kubenswrapper[4825]: I0310 07:05:05.803367 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34986ed7-9e8c-467b-8be5-613c7f3f8e4b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "34986ed7-9e8c-467b-8be5-613c7f3f8e4b" (UID: "34986ed7-9e8c-467b-8be5-613c7f3f8e4b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 07:05:05 crc kubenswrapper[4825]: I0310 07:05:05.803523 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34986ed7-9e8c-467b-8be5-613c7f3f8e4b-config" (OuterVolumeSpecName: "config") pod "34986ed7-9e8c-467b-8be5-613c7f3f8e4b" (UID: "34986ed7-9e8c-467b-8be5-613c7f3f8e4b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 07:05:05 crc kubenswrapper[4825]: I0310 07:05:05.809256 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34986ed7-9e8c-467b-8be5-613c7f3f8e4b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "34986ed7-9e8c-467b-8be5-613c7f3f8e4b" (UID: "34986ed7-9e8c-467b-8be5-613c7f3f8e4b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 07:05:05 crc kubenswrapper[4825]: I0310 07:05:05.810694 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34986ed7-9e8c-467b-8be5-613c7f3f8e4b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "34986ed7-9e8c-467b-8be5-613c7f3f8e4b" (UID: "34986ed7-9e8c-467b-8be5-613c7f3f8e4b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 07:05:05 crc kubenswrapper[4825]: I0310 07:05:05.816761 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34986ed7-9e8c-467b-8be5-613c7f3f8e4b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "34986ed7-9e8c-467b-8be5-613c7f3f8e4b" (UID: "34986ed7-9e8c-467b-8be5-613c7f3f8e4b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 07:05:05 crc kubenswrapper[4825]: I0310 07:05:05.836315 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34986ed7-9e8c-467b-8be5-613c7f3f8e4b-config\") on node \"crc\" DevicePath \"\""
Mar 10 07:05:05 crc kubenswrapper[4825]: I0310 07:05:05.836347 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vc2wb\" (UniqueName: \"kubernetes.io/projected/34986ed7-9e8c-467b-8be5-613c7f3f8e4b-kube-api-access-vc2wb\") on node \"crc\" DevicePath \"\""
Mar 10 07:05:05 crc kubenswrapper[4825]: I0310 07:05:05.836358 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34986ed7-9e8c-467b-8be5-613c7f3f8e4b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 10 07:05:05 crc kubenswrapper[4825]: I0310 07:05:05.836367 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34986ed7-9e8c-467b-8be5-613c7f3f8e4b-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 10 07:05:05 crc kubenswrapper[4825]: I0310 07:05:05.836383 4825 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/34986ed7-9e8c-467b-8be5-613c7f3f8e4b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 10 07:05:05 crc kubenswrapper[4825]: I0310 07:05:05.836391 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34986ed7-9e8c-467b-8be5-613c7f3f8e4b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 10 07:05:06 crc kubenswrapper[4825]: I0310 07:05:06.371492 4825 generic.go:334] "Generic (PLEG): container finished" podID="963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87" containerID="63cff044124ca8958cdbbb486a150093237eb4a99328588069bf2dca6451e893" exitCode=0
Mar 10 07:05:06 crc kubenswrapper[4825]: I0310 07:05:06.371557 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c886f8b5-bjbqz" event={"ID":"963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87","Type":"ContainerDied","Data":"63cff044124ca8958cdbbb486a150093237eb4a99328588069bf2dca6451e893"}
Mar 10 07:05:06 crc kubenswrapper[4825]: I0310 07:05:06.372292 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c886f8b5-bjbqz" event={"ID":"963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87","Type":"ContainerStarted","Data":"0a39ed9d49e785ae2dd014410c8fb924eafde3f3935321d36b1a44d071a7d51c"}
Mar 10 07:05:06 crc kubenswrapper[4825]: I0310 07:05:06.375226 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b684fb9f5-8pqxx" event={"ID":"34986ed7-9e8c-467b-8be5-613c7f3f8e4b","Type":"ContainerDied","Data":"af609656607d4d616d603727acade28f85fa01330ca7fa8971782c85b3ea9b61"}
Mar 10 07:05:06 crc kubenswrapper[4825]: I0310 07:05:06.375359 4825 scope.go:117] "RemoveContainer" containerID="3512388ff231c62f2433e8641aaa018bfa8bfb7fee8b63c6dcfa8faaad232c79"
Mar 10 07:05:06 crc kubenswrapper[4825]: I0310 07:05:06.375617 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b684fb9f5-8pqxx"
Mar 10 07:05:06 crc kubenswrapper[4825]: I0310 07:05:06.485970 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b684fb9f5-8pqxx"]
Mar 10 07:05:06 crc kubenswrapper[4825]: I0310 07:05:06.491731 4825 scope.go:117] "RemoveContainer" containerID="e1e36514caebef2f964c884dfc0275ec341667edb650d1c5097a843350f609c2"
Mar 10 07:05:06 crc kubenswrapper[4825]: I0310 07:05:06.500540 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b684fb9f5-8pqxx"]
Mar 10 07:05:07 crc kubenswrapper[4825]: I0310 07:05:07.252072 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34986ed7-9e8c-467b-8be5-613c7f3f8e4b" path="/var/lib/kubelet/pods/34986ed7-9e8c-467b-8be5-613c7f3f8e4b/volumes"
Mar 10 07:05:07 crc kubenswrapper[4825]: I0310 07:05:07.390808 4825 generic.go:334] "Generic (PLEG): container finished" podID="e2f77e29-2f56-4415-b778-376f67322f69" containerID="d9fbc166daf6efaadd2da5517d14469cb803ed05f0994d79c6b487e8b8563e3e" exitCode=0
Mar 10 07:05:07 crc kubenswrapper[4825]: I0310 07:05:07.390904 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-mcpm6" event={"ID":"e2f77e29-2f56-4415-b778-376f67322f69","Type":"ContainerDied","Data":"d9fbc166daf6efaadd2da5517d14469cb803ed05f0994d79c6b487e8b8563e3e"}
Mar 10 07:05:07 crc kubenswrapper[4825]: I0310 07:05:07.393801 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c886f8b5-bjbqz" event={"ID":"963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87","Type":"ContainerStarted","Data":"20b0a95b9f8805a48bc588a2be463ff0b2e9a348534ee2b072e0e98a74abfa9e"}
Mar 10 07:05:07 crc kubenswrapper[4825]: I0310 07:05:07.393955 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75c886f8b5-bjbqz"
Mar 10 07:05:07 crc kubenswrapper[4825]: I0310 07:05:07.459510 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75c886f8b5-bjbqz" podStartSLOduration=3.459487917 podStartE2EDuration="3.459487917s" podCreationTimestamp="2026-03-10 07:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:05:07.450411311 +0000 UTC m=+1260.480191936" watchObservedRunningTime="2026-03-10 07:05:07.459487917 +0000 UTC m=+1260.489268532"
Mar 10 07:05:08 crc kubenswrapper[4825]: I0310 07:05:08.693983 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-mcpm6"
Mar 10 07:05:08 crc kubenswrapper[4825]: I0310 07:05:08.804700 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgrgn\" (UniqueName: \"kubernetes.io/projected/e2f77e29-2f56-4415-b778-376f67322f69-kube-api-access-mgrgn\") pod \"e2f77e29-2f56-4415-b778-376f67322f69\" (UID: \"e2f77e29-2f56-4415-b778-376f67322f69\") "
Mar 10 07:05:08 crc kubenswrapper[4825]: I0310 07:05:08.804820 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f77e29-2f56-4415-b778-376f67322f69-combined-ca-bundle\") pod \"e2f77e29-2f56-4415-b778-376f67322f69\" (UID: \"e2f77e29-2f56-4415-b778-376f67322f69\") "
Mar 10 07:05:08 crc kubenswrapper[4825]: I0310 07:05:08.805038 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2f77e29-2f56-4415-b778-376f67322f69-config-data\") pod \"e2f77e29-2f56-4415-b778-376f67322f69\" (UID: \"e2f77e29-2f56-4415-b778-376f67322f69\") "
Mar 10 07:05:08 crc kubenswrapper[4825]: I0310 07:05:08.812555 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2f77e29-2f56-4415-b778-376f67322f69-kube-api-access-mgrgn" (OuterVolumeSpecName: "kube-api-access-mgrgn") pod "e2f77e29-2f56-4415-b778-376f67322f69" (UID: "e2f77e29-2f56-4415-b778-376f67322f69"). InnerVolumeSpecName "kube-api-access-mgrgn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 07:05:08 crc kubenswrapper[4825]: I0310 07:05:08.837215 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2f77e29-2f56-4415-b778-376f67322f69-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e2f77e29-2f56-4415-b778-376f67322f69" (UID: "e2f77e29-2f56-4415-b778-376f67322f69"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 07:05:08 crc kubenswrapper[4825]: I0310 07:05:08.858276 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2f77e29-2f56-4415-b778-376f67322f69-config-data" (OuterVolumeSpecName: "config-data") pod "e2f77e29-2f56-4415-b778-376f67322f69" (UID: "e2f77e29-2f56-4415-b778-376f67322f69"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 07:05:08 crc kubenswrapper[4825]: I0310 07:05:08.907600 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgrgn\" (UniqueName: \"kubernetes.io/projected/e2f77e29-2f56-4415-b778-376f67322f69-kube-api-access-mgrgn\") on node \"crc\" DevicePath \"\""
Mar 10 07:05:08 crc kubenswrapper[4825]: I0310 07:05:08.907636 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f77e29-2f56-4415-b778-376f67322f69-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 07:05:08 crc kubenswrapper[4825]: I0310 07:05:08.907648 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2f77e29-2f56-4415-b778-376f67322f69-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.419501 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-mcpm6" event={"ID":"e2f77e29-2f56-4415-b778-376f67322f69","Type":"ContainerDied","Data":"943d11a8aea80b98ec5deca750fd89f33e406d7da48ec8cb363c284e9c0fa47c"}
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.419935 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="943d11a8aea80b98ec5deca750fd89f33e406d7da48ec8cb363c284e9c0fa47c"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.419634 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-mcpm6"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.640882 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c886f8b5-bjbqz"]
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.641296 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75c886f8b5-bjbqz" podUID="963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87" containerName="dnsmasq-dns" containerID="cri-o://20b0a95b9f8805a48bc588a2be463ff0b2e9a348534ee2b072e0e98a74abfa9e" gracePeriod=10
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.649310 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-9dv74"]
Mar 10 07:05:09 crc kubenswrapper[4825]: E0310 07:05:09.649648 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34986ed7-9e8c-467b-8be5-613c7f3f8e4b" containerName="dnsmasq-dns"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.649667 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="34986ed7-9e8c-467b-8be5-613c7f3f8e4b" containerName="dnsmasq-dns"
Mar 10 07:05:09 crc kubenswrapper[4825]: E0310 07:05:09.649679 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34986ed7-9e8c-467b-8be5-613c7f3f8e4b" containerName="init"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.649686 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="34986ed7-9e8c-467b-8be5-613c7f3f8e4b" containerName="init"
Mar 10 07:05:09 crc kubenswrapper[4825]: E0310 07:05:09.649693 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2f77e29-2f56-4415-b778-376f67322f69" containerName="keystone-db-sync"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.649699 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2f77e29-2f56-4415-b778-376f67322f69" containerName="keystone-db-sync"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.649867 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2f77e29-2f56-4415-b778-376f67322f69" containerName="keystone-db-sync"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.649892 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="34986ed7-9e8c-467b-8be5-613c7f3f8e4b" containerName="dnsmasq-dns"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.650454 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-9dv74"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.655696 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.655955 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.656076 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.656229 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.656367 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rs4jw"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.670882 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-9dv74"]
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.712788 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5985c59c55-995cj"]
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.714158 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5985c59c55-995cj"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.724678 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsqbp\" (UniqueName: \"kubernetes.io/projected/1c42468d-409a-4aad-b52f-6553acdc5179-kube-api-access-tsqbp\") pod \"keystone-bootstrap-9dv74\" (UID: \"1c42468d-409a-4aad-b52f-6553acdc5179\") " pod="openstack/keystone-bootstrap-9dv74"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.724750 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c42468d-409a-4aad-b52f-6553acdc5179-combined-ca-bundle\") pod \"keystone-bootstrap-9dv74\" (UID: \"1c42468d-409a-4aad-b52f-6553acdc5179\") " pod="openstack/keystone-bootstrap-9dv74"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.724812 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1c42468d-409a-4aad-b52f-6553acdc5179-credential-keys\") pod \"keystone-bootstrap-9dv74\" (UID: \"1c42468d-409a-4aad-b52f-6553acdc5179\") " pod="openstack/keystone-bootstrap-9dv74"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.724871 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c42468d-409a-4aad-b52f-6553acdc5179-config-data\") pod \"keystone-bootstrap-9dv74\" (UID: \"1c42468d-409a-4aad-b52f-6553acdc5179\") " pod="openstack/keystone-bootstrap-9dv74"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.724949 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c42468d-409a-4aad-b52f-6553acdc5179-scripts\") pod \"keystone-bootstrap-9dv74\" (UID: \"1c42468d-409a-4aad-b52f-6553acdc5179\") " pod="openstack/keystone-bootstrap-9dv74"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.725014 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1c42468d-409a-4aad-b52f-6553acdc5179-fernet-keys\") pod \"keystone-bootstrap-9dv74\" (UID: \"1c42468d-409a-4aad-b52f-6553acdc5179\") " pod="openstack/keystone-bootstrap-9dv74"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.737821 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5985c59c55-995cj"]
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.804426 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-j9wfx"]
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.805460 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-j9wfx"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.814820 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.815014 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-7s7j4"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.815105 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.826686 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c3f58042-5c26-4b0b-9f85-d35d9305115e-db-sync-config-data\") pod \"cinder-db-sync-j9wfx\" (UID: \"c3f58042-5c26-4b0b-9f85-d35d9305115e\") " pod="openstack/cinder-db-sync-j9wfx"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.826741 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3f58042-5c26-4b0b-9f85-d35d9305115e-config-data\") pod \"cinder-db-sync-j9wfx\" (UID: \"c3f58042-5c26-4b0b-9f85-d35d9305115e\") " pod="openstack/cinder-db-sync-j9wfx"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.826762 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3f58042-5c26-4b0b-9f85-d35d9305115e-combined-ca-bundle\") pod \"cinder-db-sync-j9wfx\" (UID: \"c3f58042-5c26-4b0b-9f85-d35d9305115e\") " pod="openstack/cinder-db-sync-j9wfx"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.826784 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5a08816-8aac-4c49-abab-deff21e61f4b-dns-svc\") pod \"dnsmasq-dns-5985c59c55-995cj\" (UID: \"f5a08816-8aac-4c49-abab-deff21e61f4b\") " pod="openstack/dnsmasq-dns-5985c59c55-995cj"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.826807 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5a08816-8aac-4c49-abab-deff21e61f4b-ovsdbserver-nb\") pod \"dnsmasq-dns-5985c59c55-995cj\" (UID: \"f5a08816-8aac-4c49-abab-deff21e61f4b\") " pod="openstack/dnsmasq-dns-5985c59c55-995cj"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.826829 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsqbp\" (UniqueName: \"kubernetes.io/projected/1c42468d-409a-4aad-b52f-6553acdc5179-kube-api-access-tsqbp\") pod \"keystone-bootstrap-9dv74\" (UID: \"1c42468d-409a-4aad-b52f-6553acdc5179\") " pod="openstack/keystone-bootstrap-9dv74"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.826857 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c42468d-409a-4aad-b52f-6553acdc5179-combined-ca-bundle\") pod \"keystone-bootstrap-9dv74\" (UID: \"1c42468d-409a-4aad-b52f-6553acdc5179\") " pod="openstack/keystone-bootstrap-9dv74"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.826876 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hnvt\" (UniqueName: \"kubernetes.io/projected/c3f58042-5c26-4b0b-9f85-d35d9305115e-kube-api-access-2hnvt\") pod \"cinder-db-sync-j9wfx\" (UID: \"c3f58042-5c26-4b0b-9f85-d35d9305115e\") " pod="openstack/cinder-db-sync-j9wfx"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.826914 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5a08816-8aac-4c49-abab-deff21e61f4b-config\") pod \"dnsmasq-dns-5985c59c55-995cj\" (UID: \"f5a08816-8aac-4c49-abab-deff21e61f4b\") " pod="openstack/dnsmasq-dns-5985c59c55-995cj"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.826934 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3f58042-5c26-4b0b-9f85-d35d9305115e-scripts\") pod \"cinder-db-sync-j9wfx\" (UID: \"c3f58042-5c26-4b0b-9f85-d35d9305115e\") " pod="openstack/cinder-db-sync-j9wfx"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.826955 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1c42468d-409a-4aad-b52f-6553acdc5179-credential-keys\") pod \"keystone-bootstrap-9dv74\" (UID: \"1c42468d-409a-4aad-b52f-6553acdc5179\") " pod="openstack/keystone-bootstrap-9dv74"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.826977 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c42468d-409a-4aad-b52f-6553acdc5179-config-data\") pod \"keystone-bootstrap-9dv74\" (UID: \"1c42468d-409a-4aad-b52f-6553acdc5179\") " pod="openstack/keystone-bootstrap-9dv74"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.827020 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c42468d-409a-4aad-b52f-6553acdc5179-scripts\") pod \"keystone-bootstrap-9dv74\" (UID: \"1c42468d-409a-4aad-b52f-6553acdc5179\") " pod="openstack/keystone-bootstrap-9dv74"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.827047 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5a08816-8aac-4c49-abab-deff21e61f4b-dns-swift-storage-0\") pod \"dnsmasq-dns-5985c59c55-995cj\" (UID: \"f5a08816-8aac-4c49-abab-deff21e61f4b\") " pod="openstack/dnsmasq-dns-5985c59c55-995cj"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.827065 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1c42468d-409a-4aad-b52f-6553acdc5179-fernet-keys\") pod \"keystone-bootstrap-9dv74\" (UID: \"1c42468d-409a-4aad-b52f-6553acdc5179\") " pod="openstack/keystone-bootstrap-9dv74"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.827083 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5drg\" (UniqueName: \"kubernetes.io/projected/f5a08816-8aac-4c49-abab-deff21e61f4b-kube-api-access-h5drg\") pod \"dnsmasq-dns-5985c59c55-995cj\" (UID: \"f5a08816-8aac-4c49-abab-deff21e61f4b\") " pod="openstack/dnsmasq-dns-5985c59c55-995cj"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.827100 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c3f58042-5c26-4b0b-9f85-d35d9305115e-etc-machine-id\") pod \"cinder-db-sync-j9wfx\" (UID: \"c3f58042-5c26-4b0b-9f85-d35d9305115e\") " pod="openstack/cinder-db-sync-j9wfx"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.827152 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5a08816-8aac-4c49-abab-deff21e61f4b-ovsdbserver-sb\") pod \"dnsmasq-dns-5985c59c55-995cj\" (UID: \"f5a08816-8aac-4c49-abab-deff21e61f4b\") " pod="openstack/dnsmasq-dns-5985c59c55-995cj"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.837290 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c42468d-409a-4aad-b52f-6553acdc5179-scripts\") pod \"keystone-bootstrap-9dv74\" (UID: \"1c42468d-409a-4aad-b52f-6553acdc5179\") " pod="openstack/keystone-bootstrap-9dv74"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.843144 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-j9wfx"]
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.845776 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1c42468d-409a-4aad-b52f-6553acdc5179-credential-keys\") pod \"keystone-bootstrap-9dv74\" (UID: \"1c42468d-409a-4aad-b52f-6553acdc5179\") " pod="openstack/keystone-bootstrap-9dv74"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.846742 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1c42468d-409a-4aad-b52f-6553acdc5179-fernet-keys\") pod \"keystone-bootstrap-9dv74\" (UID: \"1c42468d-409a-4aad-b52f-6553acdc5179\") " pod="openstack/keystone-bootstrap-9dv74"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.853030 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c42468d-409a-4aad-b52f-6553acdc5179-combined-ca-bundle\") pod \"keystone-bootstrap-9dv74\" (UID: \"1c42468d-409a-4aad-b52f-6553acdc5179\") " pod="openstack/keystone-bootstrap-9dv74"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.859113 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c42468d-409a-4aad-b52f-6553acdc5179-config-data\") pod \"keystone-bootstrap-9dv74\" (UID: \"1c42468d-409a-4aad-b52f-6553acdc5179\") " pod="openstack/keystone-bootstrap-9dv74"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.876964 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsqbp\" (UniqueName: \"kubernetes.io/projected/1c42468d-409a-4aad-b52f-6553acdc5179-kube-api-access-tsqbp\") pod \"keystone-bootstrap-9dv74\" (UID: \"1c42468d-409a-4aad-b52f-6553acdc5179\") " pod="openstack/keystone-bootstrap-9dv74"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.929517 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5a08816-8aac-4c49-abab-deff21e61f4b-dns-swift-storage-0\") pod \"dnsmasq-dns-5985c59c55-995cj\" (UID: \"f5a08816-8aac-4c49-abab-deff21e61f4b\") " pod="openstack/dnsmasq-dns-5985c59c55-995cj"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.929568 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5drg\" (UniqueName: \"kubernetes.io/projected/f5a08816-8aac-4c49-abab-deff21e61f4b-kube-api-access-h5drg\") pod \"dnsmasq-dns-5985c59c55-995cj\" (UID: \"f5a08816-8aac-4c49-abab-deff21e61f4b\") " pod="openstack/dnsmasq-dns-5985c59c55-995cj"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.929589 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c3f58042-5c26-4b0b-9f85-d35d9305115e-etc-machine-id\") pod \"cinder-db-sync-j9wfx\" (UID: \"c3f58042-5c26-4b0b-9f85-d35d9305115e\") " pod="openstack/cinder-db-sync-j9wfx"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.929609 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5a08816-8aac-4c49-abab-deff21e61f4b-ovsdbserver-sb\") pod \"dnsmasq-dns-5985c59c55-995cj\" (UID: \"f5a08816-8aac-4c49-abab-deff21e61f4b\") " pod="openstack/dnsmasq-dns-5985c59c55-995cj"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.929633 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c3f58042-5c26-4b0b-9f85-d35d9305115e-db-sync-config-data\") pod \"cinder-db-sync-j9wfx\" (UID: \"c3f58042-5c26-4b0b-9f85-d35d9305115e\") " pod="openstack/cinder-db-sync-j9wfx"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.929655 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3f58042-5c26-4b0b-9f85-d35d9305115e-config-data\") pod \"cinder-db-sync-j9wfx\" (UID: \"c3f58042-5c26-4b0b-9f85-d35d9305115e\") " pod="openstack/cinder-db-sync-j9wfx"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.929672 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3f58042-5c26-4b0b-9f85-d35d9305115e-combined-ca-bundle\") pod \"cinder-db-sync-j9wfx\" (UID: \"c3f58042-5c26-4b0b-9f85-d35d9305115e\") " pod="openstack/cinder-db-sync-j9wfx"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.929695 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5a08816-8aac-4c49-abab-deff21e61f4b-dns-svc\") pod \"dnsmasq-dns-5985c59c55-995cj\" (UID: \"f5a08816-8aac-4c49-abab-deff21e61f4b\") " pod="openstack/dnsmasq-dns-5985c59c55-995cj"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.929719 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5a08816-8aac-4c49-abab-deff21e61f4b-ovsdbserver-nb\") pod \"dnsmasq-dns-5985c59c55-995cj\" (UID: \"f5a08816-8aac-4c49-abab-deff21e61f4b\") " pod="openstack/dnsmasq-dns-5985c59c55-995cj"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.929746 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hnvt\" (UniqueName: \"kubernetes.io/projected/c3f58042-5c26-4b0b-9f85-d35d9305115e-kube-api-access-2hnvt\") pod \"cinder-db-sync-j9wfx\" (UID: \"c3f58042-5c26-4b0b-9f85-d35d9305115e\") " pod="openstack/cinder-db-sync-j9wfx"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.929773 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5a08816-8aac-4c49-abab-deff21e61f4b-config\") pod \"dnsmasq-dns-5985c59c55-995cj\" (UID: \"f5a08816-8aac-4c49-abab-deff21e61f4b\") " pod="openstack/dnsmasq-dns-5985c59c55-995cj"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.929790 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3f58042-5c26-4b0b-9f85-d35d9305115e-scripts\") pod \"cinder-db-sync-j9wfx\" (UID: \"c3f58042-5c26-4b0b-9f85-d35d9305115e\") " pod="openstack/cinder-db-sync-j9wfx"
Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.933583 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5a08816-8aac-4c49-abab-deff21e61f4b-dns-svc\") pod \"dnsmasq-dns-5985c59c55-995cj\" (UID: \"f5a08816-8aac-4c49-abab-deff21e61f4b\") " pod="openstack/dnsmasq-dns-5985c59c55-995cj"
Mar 10 07:05:09 crc
kubenswrapper[4825]: I0310 07:05:09.933608 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5a08816-8aac-4c49-abab-deff21e61f4b-config\") pod \"dnsmasq-dns-5985c59c55-995cj\" (UID: \"f5a08816-8aac-4c49-abab-deff21e61f4b\") " pod="openstack/dnsmasq-dns-5985c59c55-995cj" Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.934184 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-qlkjv"] Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.935316 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-qlkjv" Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.935987 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c3f58042-5c26-4b0b-9f85-d35d9305115e-etc-machine-id\") pod \"cinder-db-sync-j9wfx\" (UID: \"c3f58042-5c26-4b0b-9f85-d35d9305115e\") " pod="openstack/cinder-db-sync-j9wfx" Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.938195 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5a08816-8aac-4c49-abab-deff21e61f4b-dns-swift-storage-0\") pod \"dnsmasq-dns-5985c59c55-995cj\" (UID: \"f5a08816-8aac-4c49-abab-deff21e61f4b\") " pod="openstack/dnsmasq-dns-5985c59c55-995cj" Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.938246 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5a08816-8aac-4c49-abab-deff21e61f4b-ovsdbserver-nb\") pod \"dnsmasq-dns-5985c59c55-995cj\" (UID: \"f5a08816-8aac-4c49-abab-deff21e61f4b\") " pod="openstack/dnsmasq-dns-5985c59c55-995cj" Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.938722 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 10 07:05:09 crc 
kubenswrapper[4825]: I0310 07:05:09.939115 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.939282 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-zsh45" Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.940080 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3f58042-5c26-4b0b-9f85-d35d9305115e-combined-ca-bundle\") pod \"cinder-db-sync-j9wfx\" (UID: \"c3f58042-5c26-4b0b-9f85-d35d9305115e\") " pod="openstack/cinder-db-sync-j9wfx" Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.947375 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5a08816-8aac-4c49-abab-deff21e61f4b-ovsdbserver-sb\") pod \"dnsmasq-dns-5985c59c55-995cj\" (UID: \"f5a08816-8aac-4c49-abab-deff21e61f4b\") " pod="openstack/dnsmasq-dns-5985c59c55-995cj" Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.955295 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3f58042-5c26-4b0b-9f85-d35d9305115e-scripts\") pod \"cinder-db-sync-j9wfx\" (UID: \"c3f58042-5c26-4b0b-9f85-d35d9305115e\") " pod="openstack/cinder-db-sync-j9wfx" Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.959453 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3f58042-5c26-4b0b-9f85-d35d9305115e-config-data\") pod \"cinder-db-sync-j9wfx\" (UID: \"c3f58042-5c26-4b0b-9f85-d35d9305115e\") " pod="openstack/cinder-db-sync-j9wfx" Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.965774 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/c3f58042-5c26-4b0b-9f85-d35d9305115e-db-sync-config-data\") pod \"cinder-db-sync-j9wfx\" (UID: \"c3f58042-5c26-4b0b-9f85-d35d9305115e\") " pod="openstack/cinder-db-sync-j9wfx" Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.975004 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-9dv74" Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.978687 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.982665 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.991538 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5drg\" (UniqueName: \"kubernetes.io/projected/f5a08816-8aac-4c49-abab-deff21e61f4b-kube-api-access-h5drg\") pod \"dnsmasq-dns-5985c59c55-995cj\" (UID: \"f5a08816-8aac-4c49-abab-deff21e61f4b\") " pod="openstack/dnsmasq-dns-5985c59c55-995cj" Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.992150 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 07:05:09 crc kubenswrapper[4825]: I0310 07:05:09.992334 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.013353 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hnvt\" (UniqueName: \"kubernetes.io/projected/c3f58042-5c26-4b0b-9f85-d35d9305115e-kube-api-access-2hnvt\") pod \"cinder-db-sync-j9wfx\" (UID: \"c3f58042-5c26-4b0b-9f85-d35d9305115e\") " pod="openstack/cinder-db-sync-j9wfx" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.025206 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-qlkjv"] Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 
07:05:10.035946 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5985c59c55-995cj" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.037101 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f0862f63-fd04-41f5-88a2-638189435867-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f0862f63-fd04-41f5-88a2-638189435867\") " pod="openstack/ceilometer-0" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.037233 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ce0e86d1-5b0d-491f-bbd9-59fdd8024236-config\") pod \"neutron-db-sync-qlkjv\" (UID: \"ce0e86d1-5b0d-491f-bbd9-59fdd8024236\") " pod="openstack/neutron-db-sync-qlkjv" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.037266 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0862f63-fd04-41f5-88a2-638189435867-log-httpd\") pod \"ceilometer-0\" (UID: \"f0862f63-fd04-41f5-88a2-638189435867\") " pod="openstack/ceilometer-0" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.037303 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g868g\" (UniqueName: \"kubernetes.io/projected/ce0e86d1-5b0d-491f-bbd9-59fdd8024236-kube-api-access-g868g\") pod \"neutron-db-sync-qlkjv\" (UID: \"ce0e86d1-5b0d-491f-bbd9-59fdd8024236\") " pod="openstack/neutron-db-sync-qlkjv" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.037329 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0862f63-fd04-41f5-88a2-638189435867-config-data\") pod \"ceilometer-0\" (UID: 
\"f0862f63-fd04-41f5-88a2-638189435867\") " pod="openstack/ceilometer-0" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.037345 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0862f63-fd04-41f5-88a2-638189435867-run-httpd\") pod \"ceilometer-0\" (UID: \"f0862f63-fd04-41f5-88a2-638189435867\") " pod="openstack/ceilometer-0" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.037363 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0862f63-fd04-41f5-88a2-638189435867-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f0862f63-fd04-41f5-88a2-638189435867\") " pod="openstack/ceilometer-0" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.037381 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z2x2\" (UniqueName: \"kubernetes.io/projected/f0862f63-fd04-41f5-88a2-638189435867-kube-api-access-2z2x2\") pod \"ceilometer-0\" (UID: \"f0862f63-fd04-41f5-88a2-638189435867\") " pod="openstack/ceilometer-0" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.037400 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0862f63-fd04-41f5-88a2-638189435867-scripts\") pod \"ceilometer-0\" (UID: \"f0862f63-fd04-41f5-88a2-638189435867\") " pod="openstack/ceilometer-0" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.037417 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce0e86d1-5b0d-491f-bbd9-59fdd8024236-combined-ca-bundle\") pod \"neutron-db-sync-qlkjv\" (UID: \"ce0e86d1-5b0d-491f-bbd9-59fdd8024236\") " pod="openstack/neutron-db-sync-qlkjv" Mar 10 07:05:10 crc 
kubenswrapper[4825]: I0310 07:05:10.038266 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.095746 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-j9wfx" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.139079 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5985c59c55-995cj"] Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.140106 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce0e86d1-5b0d-491f-bbd9-59fdd8024236-combined-ca-bundle\") pod \"neutron-db-sync-qlkjv\" (UID: \"ce0e86d1-5b0d-491f-bbd9-59fdd8024236\") " pod="openstack/neutron-db-sync-qlkjv" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.140208 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f0862f63-fd04-41f5-88a2-638189435867-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f0862f63-fd04-41f5-88a2-638189435867\") " pod="openstack/ceilometer-0" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.140251 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ce0e86d1-5b0d-491f-bbd9-59fdd8024236-config\") pod \"neutron-db-sync-qlkjv\" (UID: \"ce0e86d1-5b0d-491f-bbd9-59fdd8024236\") " pod="openstack/neutron-db-sync-qlkjv" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.140278 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0862f63-fd04-41f5-88a2-638189435867-log-httpd\") pod \"ceilometer-0\" (UID: \"f0862f63-fd04-41f5-88a2-638189435867\") " pod="openstack/ceilometer-0" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.140331 4825 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g868g\" (UniqueName: \"kubernetes.io/projected/ce0e86d1-5b0d-491f-bbd9-59fdd8024236-kube-api-access-g868g\") pod \"neutron-db-sync-qlkjv\" (UID: \"ce0e86d1-5b0d-491f-bbd9-59fdd8024236\") " pod="openstack/neutron-db-sync-qlkjv" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.140355 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0862f63-fd04-41f5-88a2-638189435867-config-data\") pod \"ceilometer-0\" (UID: \"f0862f63-fd04-41f5-88a2-638189435867\") " pod="openstack/ceilometer-0" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.140369 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0862f63-fd04-41f5-88a2-638189435867-run-httpd\") pod \"ceilometer-0\" (UID: \"f0862f63-fd04-41f5-88a2-638189435867\") " pod="openstack/ceilometer-0" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.140406 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0862f63-fd04-41f5-88a2-638189435867-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f0862f63-fd04-41f5-88a2-638189435867\") " pod="openstack/ceilometer-0" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.140425 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z2x2\" (UniqueName: \"kubernetes.io/projected/f0862f63-fd04-41f5-88a2-638189435867-kube-api-access-2z2x2\") pod \"ceilometer-0\" (UID: \"f0862f63-fd04-41f5-88a2-638189435867\") " pod="openstack/ceilometer-0" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.140444 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0862f63-fd04-41f5-88a2-638189435867-scripts\") pod 
\"ceilometer-0\" (UID: \"f0862f63-fd04-41f5-88a2-638189435867\") " pod="openstack/ceilometer-0" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.141643 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0862f63-fd04-41f5-88a2-638189435867-log-httpd\") pod \"ceilometer-0\" (UID: \"f0862f63-fd04-41f5-88a2-638189435867\") " pod="openstack/ceilometer-0" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.142347 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0862f63-fd04-41f5-88a2-638189435867-run-httpd\") pod \"ceilometer-0\" (UID: \"f0862f63-fd04-41f5-88a2-638189435867\") " pod="openstack/ceilometer-0" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.150515 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce0e86d1-5b0d-491f-bbd9-59fdd8024236-combined-ca-bundle\") pod \"neutron-db-sync-qlkjv\" (UID: \"ce0e86d1-5b0d-491f-bbd9-59fdd8024236\") " pod="openstack/neutron-db-sync-qlkjv" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.155633 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f0862f63-fd04-41f5-88a2-638189435867-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f0862f63-fd04-41f5-88a2-638189435867\") " pod="openstack/ceilometer-0" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.165213 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0862f63-fd04-41f5-88a2-638189435867-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f0862f63-fd04-41f5-88a2-638189435867\") " pod="openstack/ceilometer-0" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.172552 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/f0862f63-fd04-41f5-88a2-638189435867-config-data\") pod \"ceilometer-0\" (UID: \"f0862f63-fd04-41f5-88a2-638189435867\") " pod="openstack/ceilometer-0" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.172958 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ce0e86d1-5b0d-491f-bbd9-59fdd8024236-config\") pod \"neutron-db-sync-qlkjv\" (UID: \"ce0e86d1-5b0d-491f-bbd9-59fdd8024236\") " pod="openstack/neutron-db-sync-qlkjv" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.173075 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0862f63-fd04-41f5-88a2-638189435867-scripts\") pod \"ceilometer-0\" (UID: \"f0862f63-fd04-41f5-88a2-638189435867\") " pod="openstack/ceilometer-0" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.179203 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g868g\" (UniqueName: \"kubernetes.io/projected/ce0e86d1-5b0d-491f-bbd9-59fdd8024236-kube-api-access-g868g\") pod \"neutron-db-sync-qlkjv\" (UID: \"ce0e86d1-5b0d-491f-bbd9-59fdd8024236\") " pod="openstack/neutron-db-sync-qlkjv" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.196087 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z2x2\" (UniqueName: \"kubernetes.io/projected/f0862f63-fd04-41f5-88a2-638189435867-kube-api-access-2z2x2\") pod \"ceilometer-0\" (UID: \"f0862f63-fd04-41f5-88a2-638189435867\") " pod="openstack/ceilometer-0" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.198334 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-ccd7c9f8f-cznm2"] Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.212411 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-ccd7c9f8f-cznm2" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.223100 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-qd4jq"] Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.228818 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-qd4jq" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.233506 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-qkjz5" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.234722 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.252555 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmmg7\" (UniqueName: \"kubernetes.io/projected/5f59ca51-fd91-41a3-ad9c-6b04ccd93288-kube-api-access-rmmg7\") pod \"barbican-db-sync-qd4jq\" (UID: \"5f59ca51-fd91-41a3-ad9c-6b04ccd93288\") " pod="openstack/barbican-db-sync-qd4jq" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.254882 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/359d68ef-affb-48e7-861d-ab0b8d397f47-dns-svc\") pod \"dnsmasq-dns-ccd7c9f8f-cznm2\" (UID: \"359d68ef-affb-48e7-861d-ab0b8d397f47\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-cznm2" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.255308 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f59ca51-fd91-41a3-ad9c-6b04ccd93288-combined-ca-bundle\") pod \"barbican-db-sync-qd4jq\" (UID: \"5f59ca51-fd91-41a3-ad9c-6b04ccd93288\") " pod="openstack/barbican-db-sync-qd4jq" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 
07:05:10.255445 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/359d68ef-affb-48e7-861d-ab0b8d397f47-config\") pod \"dnsmasq-dns-ccd7c9f8f-cznm2\" (UID: \"359d68ef-affb-48e7-861d-ab0b8d397f47\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-cznm2" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.255606 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/359d68ef-affb-48e7-861d-ab0b8d397f47-ovsdbserver-sb\") pod \"dnsmasq-dns-ccd7c9f8f-cznm2\" (UID: \"359d68ef-affb-48e7-861d-ab0b8d397f47\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-cznm2" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.255775 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5f59ca51-fd91-41a3-ad9c-6b04ccd93288-db-sync-config-data\") pod \"barbican-db-sync-qd4jq\" (UID: \"5f59ca51-fd91-41a3-ad9c-6b04ccd93288\") " pod="openstack/barbican-db-sync-qd4jq" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.255881 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/359d68ef-affb-48e7-861d-ab0b8d397f47-ovsdbserver-nb\") pod \"dnsmasq-dns-ccd7c9f8f-cznm2\" (UID: \"359d68ef-affb-48e7-861d-ab0b8d397f47\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-cznm2" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.256042 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/359d68ef-affb-48e7-861d-ab0b8d397f47-dns-swift-storage-0\") pod \"dnsmasq-dns-ccd7c9f8f-cznm2\" (UID: \"359d68ef-affb-48e7-861d-ab0b8d397f47\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-cznm2" Mar 10 07:05:10 crc 
kubenswrapper[4825]: I0310 07:05:10.258477 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtg8c\" (UniqueName: \"kubernetes.io/projected/359d68ef-affb-48e7-861d-ab0b8d397f47-kube-api-access-dtg8c\") pod \"dnsmasq-dns-ccd7c9f8f-cznm2\" (UID: \"359d68ef-affb-48e7-861d-ab0b8d397f47\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-cznm2" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.261674 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-ccd7c9f8f-cznm2"] Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.290436 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-qd4jq"] Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.337103 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-b6jlj"] Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.338278 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-b6jlj" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.341175 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-hvqft" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.341472 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.341482 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.364212 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-b6jlj"] Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.365265 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmmg7\" (UniqueName: \"kubernetes.io/projected/5f59ca51-fd91-41a3-ad9c-6b04ccd93288-kube-api-access-rmmg7\") pod 
\"barbican-db-sync-qd4jq\" (UID: \"5f59ca51-fd91-41a3-ad9c-6b04ccd93288\") " pod="openstack/barbican-db-sync-qd4jq" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.365295 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/359d68ef-affb-48e7-861d-ab0b8d397f47-dns-svc\") pod \"dnsmasq-dns-ccd7c9f8f-cznm2\" (UID: \"359d68ef-affb-48e7-861d-ab0b8d397f47\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-cznm2" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.365358 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f59ca51-fd91-41a3-ad9c-6b04ccd93288-combined-ca-bundle\") pod \"barbican-db-sync-qd4jq\" (UID: \"5f59ca51-fd91-41a3-ad9c-6b04ccd93288\") " pod="openstack/barbican-db-sync-qd4jq" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.365374 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/359d68ef-affb-48e7-861d-ab0b8d397f47-config\") pod \"dnsmasq-dns-ccd7c9f8f-cznm2\" (UID: \"359d68ef-affb-48e7-861d-ab0b8d397f47\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-cznm2" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.365402 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/359d68ef-affb-48e7-861d-ab0b8d397f47-ovsdbserver-sb\") pod \"dnsmasq-dns-ccd7c9f8f-cznm2\" (UID: \"359d68ef-affb-48e7-861d-ab0b8d397f47\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-cznm2" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.365440 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5f59ca51-fd91-41a3-ad9c-6b04ccd93288-db-sync-config-data\") pod \"barbican-db-sync-qd4jq\" (UID: \"5f59ca51-fd91-41a3-ad9c-6b04ccd93288\") " 
pod="openstack/barbican-db-sync-qd4jq" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.365461 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/359d68ef-affb-48e7-861d-ab0b8d397f47-ovsdbserver-nb\") pod \"dnsmasq-dns-ccd7c9f8f-cznm2\" (UID: \"359d68ef-affb-48e7-861d-ab0b8d397f47\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-cznm2" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.365491 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/359d68ef-affb-48e7-861d-ab0b8d397f47-dns-swift-storage-0\") pod \"dnsmasq-dns-ccd7c9f8f-cznm2\" (UID: \"359d68ef-affb-48e7-861d-ab0b8d397f47\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-cznm2" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.365519 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtg8c\" (UniqueName: \"kubernetes.io/projected/359d68ef-affb-48e7-861d-ab0b8d397f47-kube-api-access-dtg8c\") pod \"dnsmasq-dns-ccd7c9f8f-cznm2\" (UID: \"359d68ef-affb-48e7-861d-ab0b8d397f47\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-cznm2" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.368795 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/359d68ef-affb-48e7-861d-ab0b8d397f47-config\") pod \"dnsmasq-dns-ccd7c9f8f-cznm2\" (UID: \"359d68ef-affb-48e7-861d-ab0b8d397f47\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-cznm2" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.369072 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/359d68ef-affb-48e7-861d-ab0b8d397f47-ovsdbserver-sb\") pod \"dnsmasq-dns-ccd7c9f8f-cznm2\" (UID: \"359d68ef-affb-48e7-861d-ab0b8d397f47\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-cznm2" Mar 10 07:05:10 crc 
kubenswrapper[4825]: I0310 07:05:10.369151 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/359d68ef-affb-48e7-861d-ab0b8d397f47-ovsdbserver-nb\") pod \"dnsmasq-dns-ccd7c9f8f-cznm2\" (UID: \"359d68ef-affb-48e7-861d-ab0b8d397f47\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-cznm2" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.369679 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/359d68ef-affb-48e7-861d-ab0b8d397f47-dns-swift-storage-0\") pod \"dnsmasq-dns-ccd7c9f8f-cznm2\" (UID: \"359d68ef-affb-48e7-861d-ab0b8d397f47\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-cznm2" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.371390 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/359d68ef-affb-48e7-861d-ab0b8d397f47-dns-svc\") pod \"dnsmasq-dns-ccd7c9f8f-cznm2\" (UID: \"359d68ef-affb-48e7-861d-ab0b8d397f47\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-cznm2" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.376655 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f59ca51-fd91-41a3-ad9c-6b04ccd93288-combined-ca-bundle\") pod \"barbican-db-sync-qd4jq\" (UID: \"5f59ca51-fd91-41a3-ad9c-6b04ccd93288\") " pod="openstack/barbican-db-sync-qd4jq" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.380568 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5f59ca51-fd91-41a3-ad9c-6b04ccd93288-db-sync-config-data\") pod \"barbican-db-sync-qd4jq\" (UID: \"5f59ca51-fd91-41a3-ad9c-6b04ccd93288\") " pod="openstack/barbican-db-sync-qd4jq" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.384297 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-dtg8c\" (UniqueName: \"kubernetes.io/projected/359d68ef-affb-48e7-861d-ab0b8d397f47-kube-api-access-dtg8c\") pod \"dnsmasq-dns-ccd7c9f8f-cznm2\" (UID: \"359d68ef-affb-48e7-861d-ab0b8d397f47\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-cznm2" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.387649 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmmg7\" (UniqueName: \"kubernetes.io/projected/5f59ca51-fd91-41a3-ad9c-6b04ccd93288-kube-api-access-rmmg7\") pod \"barbican-db-sync-qd4jq\" (UID: \"5f59ca51-fd91-41a3-ad9c-6b04ccd93288\") " pod="openstack/barbican-db-sync-qd4jq" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.431104 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-qlkjv" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.440219 4825 generic.go:334] "Generic (PLEG): container finished" podID="963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87" containerID="20b0a95b9f8805a48bc588a2be463ff0b2e9a348534ee2b072e0e98a74abfa9e" exitCode=0 Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.440289 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c886f8b5-bjbqz" event={"ID":"963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87","Type":"ContainerDied","Data":"20b0a95b9f8805a48bc588a2be463ff0b2e9a348534ee2b072e0e98a74abfa9e"} Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.450344 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.467907 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a762be8c-156d-453d-bcd6-ae8571d9133f-scripts\") pod \"placement-db-sync-b6jlj\" (UID: \"a762be8c-156d-453d-bcd6-ae8571d9133f\") " pod="openstack/placement-db-sync-b6jlj" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.468087 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a762be8c-156d-453d-bcd6-ae8571d9133f-combined-ca-bundle\") pod \"placement-db-sync-b6jlj\" (UID: \"a762be8c-156d-453d-bcd6-ae8571d9133f\") " pod="openstack/placement-db-sync-b6jlj" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.468180 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a762be8c-156d-453d-bcd6-ae8571d9133f-logs\") pod \"placement-db-sync-b6jlj\" (UID: \"a762be8c-156d-453d-bcd6-ae8571d9133f\") " pod="openstack/placement-db-sync-b6jlj" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.468250 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a762be8c-156d-453d-bcd6-ae8571d9133f-config-data\") pod \"placement-db-sync-b6jlj\" (UID: \"a762be8c-156d-453d-bcd6-ae8571d9133f\") " pod="openstack/placement-db-sync-b6jlj" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.468306 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d2ws\" (UniqueName: \"kubernetes.io/projected/a762be8c-156d-453d-bcd6-ae8571d9133f-kube-api-access-5d2ws\") pod \"placement-db-sync-b6jlj\" (UID: \"a762be8c-156d-453d-bcd6-ae8571d9133f\") " 
pod="openstack/placement-db-sync-b6jlj" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.505603 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c886f8b5-bjbqz" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.571931 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87-ovsdbserver-sb\") pod \"963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87\" (UID: \"963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87\") " Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.571993 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87-config\") pod \"963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87\" (UID: \"963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87\") " Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.572566 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87-ovsdbserver-nb\") pod \"963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87\" (UID: \"963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87\") " Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.573301 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87-dns-svc\") pod \"963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87\" (UID: \"963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87\") " Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.573352 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87-dns-swift-storage-0\") pod \"963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87\" (UID: \"963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87\") " Mar 10 07:05:10 crc 
kubenswrapper[4825]: I0310 07:05:10.573388 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdsmd\" (UniqueName: \"kubernetes.io/projected/963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87-kube-api-access-gdsmd\") pod \"963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87\" (UID: \"963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87\") " Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.573626 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5d2ws\" (UniqueName: \"kubernetes.io/projected/a762be8c-156d-453d-bcd6-ae8571d9133f-kube-api-access-5d2ws\") pod \"placement-db-sync-b6jlj\" (UID: \"a762be8c-156d-453d-bcd6-ae8571d9133f\") " pod="openstack/placement-db-sync-b6jlj" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.573677 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a762be8c-156d-453d-bcd6-ae8571d9133f-scripts\") pod \"placement-db-sync-b6jlj\" (UID: \"a762be8c-156d-453d-bcd6-ae8571d9133f\") " pod="openstack/placement-db-sync-b6jlj" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.573758 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a762be8c-156d-453d-bcd6-ae8571d9133f-combined-ca-bundle\") pod \"placement-db-sync-b6jlj\" (UID: \"a762be8c-156d-453d-bcd6-ae8571d9133f\") " pod="openstack/placement-db-sync-b6jlj" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.573813 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a762be8c-156d-453d-bcd6-ae8571d9133f-logs\") pod \"placement-db-sync-b6jlj\" (UID: \"a762be8c-156d-453d-bcd6-ae8571d9133f\") " pod="openstack/placement-db-sync-b6jlj" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.574583 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/a762be8c-156d-453d-bcd6-ae8571d9133f-config-data\") pod \"placement-db-sync-b6jlj\" (UID: \"a762be8c-156d-453d-bcd6-ae8571d9133f\") " pod="openstack/placement-db-sync-b6jlj" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.584595 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a762be8c-156d-453d-bcd6-ae8571d9133f-logs\") pod \"placement-db-sync-b6jlj\" (UID: \"a762be8c-156d-453d-bcd6-ae8571d9133f\") " pod="openstack/placement-db-sync-b6jlj" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.589721 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a762be8c-156d-453d-bcd6-ae8571d9133f-combined-ca-bundle\") pod \"placement-db-sync-b6jlj\" (UID: \"a762be8c-156d-453d-bcd6-ae8571d9133f\") " pod="openstack/placement-db-sync-b6jlj" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.589930 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87-kube-api-access-gdsmd" (OuterVolumeSpecName: "kube-api-access-gdsmd") pod "963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87" (UID: "963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87"). InnerVolumeSpecName "kube-api-access-gdsmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.590181 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-qd4jq" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.590270 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a762be8c-156d-453d-bcd6-ae8571d9133f-scripts\") pod \"placement-db-sync-b6jlj\" (UID: \"a762be8c-156d-453d-bcd6-ae8571d9133f\") " pod="openstack/placement-db-sync-b6jlj" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.592415 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ccd7c9f8f-cznm2" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.593349 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a762be8c-156d-453d-bcd6-ae8571d9133f-config-data\") pod \"placement-db-sync-b6jlj\" (UID: \"a762be8c-156d-453d-bcd6-ae8571d9133f\") " pod="openstack/placement-db-sync-b6jlj" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.608255 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d2ws\" (UniqueName: \"kubernetes.io/projected/a762be8c-156d-453d-bcd6-ae8571d9133f-kube-api-access-5d2ws\") pod \"placement-db-sync-b6jlj\" (UID: \"a762be8c-156d-453d-bcd6-ae8571d9133f\") " pod="openstack/placement-db-sync-b6jlj" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.660981 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87" (UID: "963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.671866 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87" (UID: "963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.676939 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.676962 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdsmd\" (UniqueName: \"kubernetes.io/projected/963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87-kube-api-access-gdsmd\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.676970 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.683329 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-b6jlj" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.694196 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87-config" (OuterVolumeSpecName: "config") pod "963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87" (UID: "963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.694297 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87" (UID: "963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.718229 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87" (UID: "963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.724370 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-9dv74"] Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.738368 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5985c59c55-995cj"] Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.778181 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.778215 4825 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.778226 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87-config\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.781586 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 07:05:10 crc kubenswrapper[4825]: E0310 07:05:10.781925 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87" containerName="init" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.781940 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87" containerName="init" Mar 10 07:05:10 crc kubenswrapper[4825]: E0310 07:05:10.781959 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87" containerName="dnsmasq-dns" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.781965 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87" containerName="dnsmasq-dns" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.782110 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87" containerName="dnsmasq-dns" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.782916 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.787947 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.788164 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.788260 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.788360 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-lwfpg" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.813869 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.858288 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-j9wfx"] Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.903118 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2d5ff589-3823-4ba2-b383-55663e00179d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2d5ff589-3823-4ba2-b383-55663e00179d\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.903207 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d5ff589-3823-4ba2-b383-55663e00179d-logs\") pod \"glance-default-external-api-0\" (UID: \"2d5ff589-3823-4ba2-b383-55663e00179d\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.903240 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"2d5ff589-3823-4ba2-b383-55663e00179d\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.903345 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d5ff589-3823-4ba2-b383-55663e00179d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2d5ff589-3823-4ba2-b383-55663e00179d\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.903390 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d5ff589-3823-4ba2-b383-55663e00179d-config-data\") pod \"glance-default-external-api-0\" (UID: \"2d5ff589-3823-4ba2-b383-55663e00179d\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.903427 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zvjx\" (UniqueName: \"kubernetes.io/projected/2d5ff589-3823-4ba2-b383-55663e00179d-kube-api-access-9zvjx\") pod \"glance-default-external-api-0\" (UID: \"2d5ff589-3823-4ba2-b383-55663e00179d\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.903460 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d5ff589-3823-4ba2-b383-55663e00179d-scripts\") pod \"glance-default-external-api-0\" (UID: \"2d5ff589-3823-4ba2-b383-55663e00179d\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.903496 4825 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d5ff589-3823-4ba2-b383-55663e00179d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2d5ff589-3823-4ba2-b383-55663e00179d\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.920410 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.923428 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.930484 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.930707 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 10 07:05:10 crc kubenswrapper[4825]: I0310 07:05:10.947763 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.005513 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d5ff589-3823-4ba2-b383-55663e00179d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2d5ff589-3823-4ba2-b383-55663e00179d\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.005784 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"5dec2179-f022-4cc2-9cb6-7d2dc326bc10\") " pod="openstack/glance-default-internal-api-0" Mar 
10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.005822 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d5ff589-3823-4ba2-b383-55663e00179d-config-data\") pod \"glance-default-external-api-0\" (UID: \"2d5ff589-3823-4ba2-b383-55663e00179d\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.005839 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5dec2179-f022-4cc2-9cb6-7d2dc326bc10-logs\") pod \"glance-default-internal-api-0\" (UID: \"5dec2179-f022-4cc2-9cb6-7d2dc326bc10\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.005866 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dec2179-f022-4cc2-9cb6-7d2dc326bc10-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5dec2179-f022-4cc2-9cb6-7d2dc326bc10\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.005887 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zvjx\" (UniqueName: \"kubernetes.io/projected/2d5ff589-3823-4ba2-b383-55663e00179d-kube-api-access-9zvjx\") pod \"glance-default-external-api-0\" (UID: \"2d5ff589-3823-4ba2-b383-55663e00179d\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.005906 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d5ff589-3823-4ba2-b383-55663e00179d-scripts\") pod \"glance-default-external-api-0\" (UID: \"2d5ff589-3823-4ba2-b383-55663e00179d\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 
07:05:11.005927 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dec2179-f022-4cc2-9cb6-7d2dc326bc10-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5dec2179-f022-4cc2-9cb6-7d2dc326bc10\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.005955 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d5ff589-3823-4ba2-b383-55663e00179d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2d5ff589-3823-4ba2-b383-55663e00179d\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.005981 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dec2179-f022-4cc2-9cb6-7d2dc326bc10-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5dec2179-f022-4cc2-9cb6-7d2dc326bc10\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.006009 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5dec2179-f022-4cc2-9cb6-7d2dc326bc10-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5dec2179-f022-4cc2-9cb6-7d2dc326bc10\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.006036 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2d5ff589-3823-4ba2-b383-55663e00179d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2d5ff589-3823-4ba2-b383-55663e00179d\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.006056 
4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d5ff589-3823-4ba2-b383-55663e00179d-logs\") pod \"glance-default-external-api-0\" (UID: \"2d5ff589-3823-4ba2-b383-55663e00179d\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.006077 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"2d5ff589-3823-4ba2-b383-55663e00179d\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.006139 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dec2179-f022-4cc2-9cb6-7d2dc326bc10-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5dec2179-f022-4cc2-9cb6-7d2dc326bc10\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.006173 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7s8n\" (UniqueName: \"kubernetes.io/projected/5dec2179-f022-4cc2-9cb6-7d2dc326bc10-kube-api-access-s7s8n\") pod \"glance-default-internal-api-0\" (UID: \"5dec2179-f022-4cc2-9cb6-7d2dc326bc10\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.006765 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2d5ff589-3823-4ba2-b383-55663e00179d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2d5ff589-3823-4ba2-b383-55663e00179d\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.006966 4825 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d5ff589-3823-4ba2-b383-55663e00179d-logs\") pod \"glance-default-external-api-0\" (UID: \"2d5ff589-3823-4ba2-b383-55663e00179d\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.007218 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"2d5ff589-3823-4ba2-b383-55663e00179d\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.012367 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d5ff589-3823-4ba2-b383-55663e00179d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2d5ff589-3823-4ba2-b383-55663e00179d\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.022989 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d5ff589-3823-4ba2-b383-55663e00179d-config-data\") pod \"glance-default-external-api-0\" (UID: \"2d5ff589-3823-4ba2-b383-55663e00179d\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.040297 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zvjx\" (UniqueName: \"kubernetes.io/projected/2d5ff589-3823-4ba2-b383-55663e00179d-kube-api-access-9zvjx\") pod \"glance-default-external-api-0\" (UID: \"2d5ff589-3823-4ba2-b383-55663e00179d\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.042121 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-qlkjv"] Mar 10 
07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.047104 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d5ff589-3823-4ba2-b383-55663e00179d-scripts\") pod \"glance-default-external-api-0\" (UID: \"2d5ff589-3823-4ba2-b383-55663e00179d\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.049022 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d5ff589-3823-4ba2-b383-55663e00179d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2d5ff589-3823-4ba2-b383-55663e00179d\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.049884 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.062027 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"2d5ff589-3823-4ba2-b383-55663e00179d\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.107362 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7s8n\" (UniqueName: \"kubernetes.io/projected/5dec2179-f022-4cc2-9cb6-7d2dc326bc10-kube-api-access-s7s8n\") pod \"glance-default-internal-api-0\" (UID: \"5dec2179-f022-4cc2-9cb6-7d2dc326bc10\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.107438 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"5dec2179-f022-4cc2-9cb6-7d2dc326bc10\") " 
pod="openstack/glance-default-internal-api-0" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.107471 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5dec2179-f022-4cc2-9cb6-7d2dc326bc10-logs\") pod \"glance-default-internal-api-0\" (UID: \"5dec2179-f022-4cc2-9cb6-7d2dc326bc10\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.107504 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dec2179-f022-4cc2-9cb6-7d2dc326bc10-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5dec2179-f022-4cc2-9cb6-7d2dc326bc10\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.107535 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dec2179-f022-4cc2-9cb6-7d2dc326bc10-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5dec2179-f022-4cc2-9cb6-7d2dc326bc10\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.107570 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dec2179-f022-4cc2-9cb6-7d2dc326bc10-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5dec2179-f022-4cc2-9cb6-7d2dc326bc10\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.107596 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5dec2179-f022-4cc2-9cb6-7d2dc326bc10-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5dec2179-f022-4cc2-9cb6-7d2dc326bc10\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 
07:05:11.107679 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dec2179-f022-4cc2-9cb6-7d2dc326bc10-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5dec2179-f022-4cc2-9cb6-7d2dc326bc10\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.108463 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"5dec2179-f022-4cc2-9cb6-7d2dc326bc10\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.113036 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dec2179-f022-4cc2-9cb6-7d2dc326bc10-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5dec2179-f022-4cc2-9cb6-7d2dc326bc10\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.113748 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5dec2179-f022-4cc2-9cb6-7d2dc326bc10-logs\") pod \"glance-default-internal-api-0\" (UID: \"5dec2179-f022-4cc2-9cb6-7d2dc326bc10\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.114735 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5dec2179-f022-4cc2-9cb6-7d2dc326bc10-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5dec2179-f022-4cc2-9cb6-7d2dc326bc10\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.118264 4825 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dec2179-f022-4cc2-9cb6-7d2dc326bc10-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5dec2179-f022-4cc2-9cb6-7d2dc326bc10\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.120390 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dec2179-f022-4cc2-9cb6-7d2dc326bc10-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5dec2179-f022-4cc2-9cb6-7d2dc326bc10\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.123745 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7s8n\" (UniqueName: \"kubernetes.io/projected/5dec2179-f022-4cc2-9cb6-7d2dc326bc10-kube-api-access-s7s8n\") pod \"glance-default-internal-api-0\" (UID: \"5dec2179-f022-4cc2-9cb6-7d2dc326bc10\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.153499 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dec2179-f022-4cc2-9cb6-7d2dc326bc10-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5dec2179-f022-4cc2-9cb6-7d2dc326bc10\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.158000 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"5dec2179-f022-4cc2-9cb6-7d2dc326bc10\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.316981 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.333146 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-qd4jq"] Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.348551 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.465077 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0862f63-fd04-41f5-88a2-638189435867","Type":"ContainerStarted","Data":"cd63a26a1a8105fb150a76f6abee7d5c9466889f520fe6a87bec5da6bba60a33"} Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.490918 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-ccd7c9f8f-cznm2"] Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.501239 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-qlkjv" event={"ID":"ce0e86d1-5b0d-491f-bbd9-59fdd8024236","Type":"ContainerStarted","Data":"801097cc7df21da7704fabf68817db4f65ee4b352a35ff8f2dde518c9ebccf2d"} Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.509397 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-j9wfx" event={"ID":"c3f58042-5c26-4b0b-9f85-d35d9305115e","Type":"ContainerStarted","Data":"eaf7ff81460399f981f5998ee5ed6115093fe16e7bc8a700f771cce129535ec1"} Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.531264 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-qd4jq" event={"ID":"5f59ca51-fd91-41a3-ad9c-6b04ccd93288","Type":"ContainerStarted","Data":"3625fb732ac4b285e45c6be2d531d6cc9d16cbef14c46b6dff18ccf0115116e7"} Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.543250 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-b6jlj"] Mar 10 07:05:11 crc kubenswrapper[4825]: 
I0310 07:05:11.550017 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-qlkjv" podStartSLOduration=2.549999571 podStartE2EDuration="2.549999571s" podCreationTimestamp="2026-03-10 07:05:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:05:11.524499536 +0000 UTC m=+1264.554280151" watchObservedRunningTime="2026-03-10 07:05:11.549999571 +0000 UTC m=+1264.579780186" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.554015 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5985c59c55-995cj" event={"ID":"f5a08816-8aac-4c49-abab-deff21e61f4b","Type":"ContainerStarted","Data":"13b65f99fb590290a0cdbeb2d60da933f27a18df7b0f2452edb2c694202caf71"} Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.564027 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9dv74" event={"ID":"1c42468d-409a-4aad-b52f-6553acdc5179","Type":"ContainerStarted","Data":"df94e0024265987c9b30cca9ac8ccb167c7c44ec2783fb8fb4ffc5f32a852855"} Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.584747 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c886f8b5-bjbqz" event={"ID":"963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87","Type":"ContainerDied","Data":"0a39ed9d49e785ae2dd014410c8fb924eafde3f3935321d36b1a44d071a7d51c"} Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.584805 4825 scope.go:117] "RemoveContainer" containerID="20b0a95b9f8805a48bc588a2be463ff0b2e9a348534ee2b072e0e98a74abfa9e" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.584962 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75c886f8b5-bjbqz" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.626575 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-9dv74" podStartSLOduration=2.626555286 podStartE2EDuration="2.626555286s" podCreationTimestamp="2026-03-10 07:05:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:05:11.609088451 +0000 UTC m=+1264.638869076" watchObservedRunningTime="2026-03-10 07:05:11.626555286 +0000 UTC m=+1264.656335901" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.662351 4825 scope.go:117] "RemoveContainer" containerID="63cff044124ca8958cdbbb486a150093237eb4a99328588069bf2dca6451e893" Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.711965 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c886f8b5-bjbqz"] Mar 10 07:05:11 crc kubenswrapper[4825]: I0310 07:05:11.746897 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75c886f8b5-bjbqz"] Mar 10 07:05:12 crc kubenswrapper[4825]: I0310 07:05:12.082696 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 07:05:12 crc kubenswrapper[4825]: W0310 07:05:12.104344 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d5ff589_3823_4ba2_b383_55663e00179d.slice/crio-44b10bf6f10f3ea8a4f61ee9cbf4d90e288ce8d80a75a3bf9ddc73c4149534ab WatchSource:0}: Error finding container 44b10bf6f10f3ea8a4f61ee9cbf4d90e288ce8d80a75a3bf9ddc73c4149534ab: Status 404 returned error can't find the container with id 44b10bf6f10f3ea8a4f61ee9cbf4d90e288ce8d80a75a3bf9ddc73c4149534ab Mar 10 07:05:12 crc kubenswrapper[4825]: I0310 07:05:12.161261 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Mar 10 07:05:12 crc kubenswrapper[4825]: W0310 07:05:12.257935 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5dec2179_f022_4cc2_9cb6_7d2dc326bc10.slice/crio-e3d8af8deba0a28a00b24c6eb3ae76eb86dec0ede786754ab9f4009df37323cf WatchSource:0}: Error finding container e3d8af8deba0a28a00b24c6eb3ae76eb86dec0ede786754ab9f4009df37323cf: Status 404 returned error can't find the container with id e3d8af8deba0a28a00b24c6eb3ae76eb86dec0ede786754ab9f4009df37323cf Mar 10 07:05:12 crc kubenswrapper[4825]: I0310 07:05:12.418678 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5985c59c55-995cj" Mar 10 07:05:12 crc kubenswrapper[4825]: I0310 07:05:12.468014 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5a08816-8aac-4c49-abab-deff21e61f4b-ovsdbserver-nb\") pod \"f5a08816-8aac-4c49-abab-deff21e61f4b\" (UID: \"f5a08816-8aac-4c49-abab-deff21e61f4b\") " Mar 10 07:05:12 crc kubenswrapper[4825]: I0310 07:05:12.468113 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5a08816-8aac-4c49-abab-deff21e61f4b-config\") pod \"f5a08816-8aac-4c49-abab-deff21e61f4b\" (UID: \"f5a08816-8aac-4c49-abab-deff21e61f4b\") " Mar 10 07:05:12 crc kubenswrapper[4825]: I0310 07:05:12.468251 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5a08816-8aac-4c49-abab-deff21e61f4b-ovsdbserver-sb\") pod \"f5a08816-8aac-4c49-abab-deff21e61f4b\" (UID: \"f5a08816-8aac-4c49-abab-deff21e61f4b\") " Mar 10 07:05:12 crc kubenswrapper[4825]: I0310 07:05:12.468345 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/f5a08816-8aac-4c49-abab-deff21e61f4b-dns-svc\") pod \"f5a08816-8aac-4c49-abab-deff21e61f4b\" (UID: \"f5a08816-8aac-4c49-abab-deff21e61f4b\") " Mar 10 07:05:12 crc kubenswrapper[4825]: I0310 07:05:12.468472 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5a08816-8aac-4c49-abab-deff21e61f4b-dns-swift-storage-0\") pod \"f5a08816-8aac-4c49-abab-deff21e61f4b\" (UID: \"f5a08816-8aac-4c49-abab-deff21e61f4b\") " Mar 10 07:05:12 crc kubenswrapper[4825]: I0310 07:05:12.468516 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5drg\" (UniqueName: \"kubernetes.io/projected/f5a08816-8aac-4c49-abab-deff21e61f4b-kube-api-access-h5drg\") pod \"f5a08816-8aac-4c49-abab-deff21e61f4b\" (UID: \"f5a08816-8aac-4c49-abab-deff21e61f4b\") " Mar 10 07:05:12 crc kubenswrapper[4825]: I0310 07:05:12.495944 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5a08816-8aac-4c49-abab-deff21e61f4b-kube-api-access-h5drg" (OuterVolumeSpecName: "kube-api-access-h5drg") pod "f5a08816-8aac-4c49-abab-deff21e61f4b" (UID: "f5a08816-8aac-4c49-abab-deff21e61f4b"). InnerVolumeSpecName "kube-api-access-h5drg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:05:12 crc kubenswrapper[4825]: I0310 07:05:12.516913 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5a08816-8aac-4c49-abab-deff21e61f4b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f5a08816-8aac-4c49-abab-deff21e61f4b" (UID: "f5a08816-8aac-4c49-abab-deff21e61f4b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:05:12 crc kubenswrapper[4825]: I0310 07:05:12.520035 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5a08816-8aac-4c49-abab-deff21e61f4b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f5a08816-8aac-4c49-abab-deff21e61f4b" (UID: "f5a08816-8aac-4c49-abab-deff21e61f4b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:05:12 crc kubenswrapper[4825]: I0310 07:05:12.525749 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5a08816-8aac-4c49-abab-deff21e61f4b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f5a08816-8aac-4c49-abab-deff21e61f4b" (UID: "f5a08816-8aac-4c49-abab-deff21e61f4b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:05:12 crc kubenswrapper[4825]: I0310 07:05:12.526618 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5a08816-8aac-4c49-abab-deff21e61f4b-config" (OuterVolumeSpecName: "config") pod "f5a08816-8aac-4c49-abab-deff21e61f4b" (UID: "f5a08816-8aac-4c49-abab-deff21e61f4b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:05:12 crc kubenswrapper[4825]: I0310 07:05:12.527092 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5a08816-8aac-4c49-abab-deff21e61f4b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f5a08816-8aac-4c49-abab-deff21e61f4b" (UID: "f5a08816-8aac-4c49-abab-deff21e61f4b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:05:12 crc kubenswrapper[4825]: I0310 07:05:12.570485 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5a08816-8aac-4c49-abab-deff21e61f4b-config\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:12 crc kubenswrapper[4825]: I0310 07:05:12.570519 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5a08816-8aac-4c49-abab-deff21e61f4b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:12 crc kubenswrapper[4825]: I0310 07:05:12.570529 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5a08816-8aac-4c49-abab-deff21e61f4b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:12 crc kubenswrapper[4825]: I0310 07:05:12.570538 4825 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5a08816-8aac-4c49-abab-deff21e61f4b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:12 crc kubenswrapper[4825]: I0310 07:05:12.570551 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5drg\" (UniqueName: \"kubernetes.io/projected/f5a08816-8aac-4c49-abab-deff21e61f4b-kube-api-access-h5drg\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:12 crc kubenswrapper[4825]: I0310 07:05:12.570559 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5a08816-8aac-4c49-abab-deff21e61f4b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:12 crc kubenswrapper[4825]: I0310 07:05:12.630915 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2d5ff589-3823-4ba2-b383-55663e00179d","Type":"ContainerStarted","Data":"44b10bf6f10f3ea8a4f61ee9cbf4d90e288ce8d80a75a3bf9ddc73c4149534ab"} Mar 10 07:05:12 crc 
kubenswrapper[4825]: I0310 07:05:12.635939 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-b6jlj" event={"ID":"a762be8c-156d-453d-bcd6-ae8571d9133f","Type":"ContainerStarted","Data":"ee6bc960daabb8e76e322319ef0f1c19d731cfed1b246df08e79d019a7ca5a71"} Mar 10 07:05:12 crc kubenswrapper[4825]: I0310 07:05:12.639717 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5dec2179-f022-4cc2-9cb6-7d2dc326bc10","Type":"ContainerStarted","Data":"e3d8af8deba0a28a00b24c6eb3ae76eb86dec0ede786754ab9f4009df37323cf"} Mar 10 07:05:12 crc kubenswrapper[4825]: I0310 07:05:12.646436 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-qlkjv" event={"ID":"ce0e86d1-5b0d-491f-bbd9-59fdd8024236","Type":"ContainerStarted","Data":"c05058771a30272a89ebdf20f49d8eaf62dce4f91bce9a630a67ad47518d7ccc"} Mar 10 07:05:12 crc kubenswrapper[4825]: I0310 07:05:12.651186 4825 generic.go:334] "Generic (PLEG): container finished" podID="f5a08816-8aac-4c49-abab-deff21e61f4b" containerID="a8d122ed14a7b57bafe284ffebe2f50d89e382ccf57508c7c911d3f337fe35db" exitCode=0 Mar 10 07:05:12 crc kubenswrapper[4825]: I0310 07:05:12.651323 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5985c59c55-995cj" Mar 10 07:05:12 crc kubenswrapper[4825]: I0310 07:05:12.652357 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5985c59c55-995cj" event={"ID":"f5a08816-8aac-4c49-abab-deff21e61f4b","Type":"ContainerDied","Data":"a8d122ed14a7b57bafe284ffebe2f50d89e382ccf57508c7c911d3f337fe35db"} Mar 10 07:05:12 crc kubenswrapper[4825]: I0310 07:05:12.652390 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5985c59c55-995cj" event={"ID":"f5a08816-8aac-4c49-abab-deff21e61f4b","Type":"ContainerDied","Data":"13b65f99fb590290a0cdbeb2d60da933f27a18df7b0f2452edb2c694202caf71"} Mar 10 07:05:12 crc kubenswrapper[4825]: I0310 07:05:12.652414 4825 scope.go:117] "RemoveContainer" containerID="a8d122ed14a7b57bafe284ffebe2f50d89e382ccf57508c7c911d3f337fe35db" Mar 10 07:05:12 crc kubenswrapper[4825]: I0310 07:05:12.659662 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9dv74" event={"ID":"1c42468d-409a-4aad-b52f-6553acdc5179","Type":"ContainerStarted","Data":"38a51644caf8c9a32ccb226b34fbd7261ce3347b38b99707dc8437f9e1f32f8a"} Mar 10 07:05:12 crc kubenswrapper[4825]: I0310 07:05:12.669343 4825 generic.go:334] "Generic (PLEG): container finished" podID="359d68ef-affb-48e7-861d-ab0b8d397f47" containerID="07ee4bffa41bb926277c5961dd88a98dee5f10f593bd6aa7cafdefe008bae570" exitCode=0 Mar 10 07:05:12 crc kubenswrapper[4825]: I0310 07:05:12.669393 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccd7c9f8f-cznm2" event={"ID":"359d68ef-affb-48e7-861d-ab0b8d397f47","Type":"ContainerDied","Data":"07ee4bffa41bb926277c5961dd88a98dee5f10f593bd6aa7cafdefe008bae570"} Mar 10 07:05:12 crc kubenswrapper[4825]: I0310 07:05:12.669427 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccd7c9f8f-cznm2" 
event={"ID":"359d68ef-affb-48e7-861d-ab0b8d397f47","Type":"ContainerStarted","Data":"7ced23c92af8cd6c52399ed72893cea26284bf43ddac1ccfc767f31d13ca8fe3"} Mar 10 07:05:12 crc kubenswrapper[4825]: I0310 07:05:12.853737 4825 scope.go:117] "RemoveContainer" containerID="a8d122ed14a7b57bafe284ffebe2f50d89e382ccf57508c7c911d3f337fe35db" Mar 10 07:05:12 crc kubenswrapper[4825]: E0310 07:05:12.863329 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8d122ed14a7b57bafe284ffebe2f50d89e382ccf57508c7c911d3f337fe35db\": container with ID starting with a8d122ed14a7b57bafe284ffebe2f50d89e382ccf57508c7c911d3f337fe35db not found: ID does not exist" containerID="a8d122ed14a7b57bafe284ffebe2f50d89e382ccf57508c7c911d3f337fe35db" Mar 10 07:05:12 crc kubenswrapper[4825]: I0310 07:05:12.863384 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8d122ed14a7b57bafe284ffebe2f50d89e382ccf57508c7c911d3f337fe35db"} err="failed to get container status \"a8d122ed14a7b57bafe284ffebe2f50d89e382ccf57508c7c911d3f337fe35db\": rpc error: code = NotFound desc = could not find container \"a8d122ed14a7b57bafe284ffebe2f50d89e382ccf57508c7c911d3f337fe35db\": container with ID starting with a8d122ed14a7b57bafe284ffebe2f50d89e382ccf57508c7c911d3f337fe35db not found: ID does not exist" Mar 10 07:05:12 crc kubenswrapper[4825]: I0310 07:05:12.875741 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5985c59c55-995cj"] Mar 10 07:05:12 crc kubenswrapper[4825]: I0310 07:05:12.910802 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5985c59c55-995cj"] Mar 10 07:05:12 crc kubenswrapper[4825]: I0310 07:05:12.942777 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:05:12 crc kubenswrapper[4825]: I0310 07:05:12.958608 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-external-api-0"] Mar 10 07:05:12 crc kubenswrapper[4825]: I0310 07:05:12.967958 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 07:05:13 crc kubenswrapper[4825]: I0310 07:05:13.288047 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87" path="/var/lib/kubelet/pods/963cfb0f-3fe6-4a00-b09f-a4d9e86f4b87/volumes" Mar 10 07:05:13 crc kubenswrapper[4825]: I0310 07:05:13.290566 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5a08816-8aac-4c49-abab-deff21e61f4b" path="/var/lib/kubelet/pods/f5a08816-8aac-4c49-abab-deff21e61f4b/volumes" Mar 10 07:05:13 crc kubenswrapper[4825]: I0310 07:05:13.728573 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccd7c9f8f-cznm2" event={"ID":"359d68ef-affb-48e7-861d-ab0b8d397f47","Type":"ContainerStarted","Data":"e4421e7c232a25e39615dea0e0fdb279fe983715220552a5e6b6936a1d0a963b"} Mar 10 07:05:13 crc kubenswrapper[4825]: I0310 07:05:13.728942 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-ccd7c9f8f-cznm2" Mar 10 07:05:13 crc kubenswrapper[4825]: I0310 07:05:13.731780 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2d5ff589-3823-4ba2-b383-55663e00179d","Type":"ContainerStarted","Data":"cbae43b98b7d3c0a7c7e2b9d1aa471ccf547cd1d4bce9f6deadbe9ee371f7cdd"} Mar 10 07:05:13 crc kubenswrapper[4825]: I0310 07:05:13.735687 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5dec2179-f022-4cc2-9cb6-7d2dc326bc10","Type":"ContainerStarted","Data":"37fe47bae01a2a5c8eccc7ac97f930212c36c51609e5617476dbffa57770e020"} Mar 10 07:05:14 crc kubenswrapper[4825]: I0310 07:05:14.768465 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"5dec2179-f022-4cc2-9cb6-7d2dc326bc10","Type":"ContainerStarted","Data":"da9b57e52caaf44be121346a02932df9fc330bbbc80d1b36297cb8eff08ae0c2"} Mar 10 07:05:14 crc kubenswrapper[4825]: I0310 07:05:14.769832 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="5dec2179-f022-4cc2-9cb6-7d2dc326bc10" containerName="glance-httpd" containerID="cri-o://da9b57e52caaf44be121346a02932df9fc330bbbc80d1b36297cb8eff08ae0c2" gracePeriod=30 Mar 10 07:05:14 crc kubenswrapper[4825]: I0310 07:05:14.770005 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="5dec2179-f022-4cc2-9cb6-7d2dc326bc10" containerName="glance-log" containerID="cri-o://37fe47bae01a2a5c8eccc7ac97f930212c36c51609e5617476dbffa57770e020" gracePeriod=30 Mar 10 07:05:14 crc kubenswrapper[4825]: I0310 07:05:14.788379 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2d5ff589-3823-4ba2-b383-55663e00179d" containerName="glance-log" containerID="cri-o://cbae43b98b7d3c0a7c7e2b9d1aa471ccf547cd1d4bce9f6deadbe9ee371f7cdd" gracePeriod=30 Mar 10 07:05:14 crc kubenswrapper[4825]: I0310 07:05:14.788710 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2d5ff589-3823-4ba2-b383-55663e00179d" containerName="glance-httpd" containerID="cri-o://0bd9da29b5ce443489d3e524014de1f4ad395e34701bf0dec9b364edfdd57c58" gracePeriod=30 Mar 10 07:05:14 crc kubenswrapper[4825]: I0310 07:05:14.789111 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2d5ff589-3823-4ba2-b383-55663e00179d","Type":"ContainerStarted","Data":"0bd9da29b5ce443489d3e524014de1f4ad395e34701bf0dec9b364edfdd57c58"} Mar 10 07:05:14 crc kubenswrapper[4825]: I0310 07:05:14.791100 4825 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.791089144 podStartE2EDuration="5.791089144s" podCreationTimestamp="2026-03-10 07:05:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:05:14.788746123 +0000 UTC m=+1267.818526748" watchObservedRunningTime="2026-03-10 07:05:14.791089144 +0000 UTC m=+1267.820869759" Mar 10 07:05:14 crc kubenswrapper[4825]: I0310 07:05:14.799269 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-ccd7c9f8f-cznm2" podStartSLOduration=4.799252327 podStartE2EDuration="4.799252327s" podCreationTimestamp="2026-03-10 07:05:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:05:13.754379424 +0000 UTC m=+1266.784160039" watchObservedRunningTime="2026-03-10 07:05:14.799252327 +0000 UTC m=+1267.829032942" Mar 10 07:05:15 crc kubenswrapper[4825]: I0310 07:05:15.809375 4825 generic.go:334] "Generic (PLEG): container finished" podID="5dec2179-f022-4cc2-9cb6-7d2dc326bc10" containerID="da9b57e52caaf44be121346a02932df9fc330bbbc80d1b36297cb8eff08ae0c2" exitCode=0 Mar 10 07:05:15 crc kubenswrapper[4825]: I0310 07:05:15.809594 4825 generic.go:334] "Generic (PLEG): container finished" podID="5dec2179-f022-4cc2-9cb6-7d2dc326bc10" containerID="37fe47bae01a2a5c8eccc7ac97f930212c36c51609e5617476dbffa57770e020" exitCode=143 Mar 10 07:05:15 crc kubenswrapper[4825]: I0310 07:05:15.809483 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5dec2179-f022-4cc2-9cb6-7d2dc326bc10","Type":"ContainerDied","Data":"da9b57e52caaf44be121346a02932df9fc330bbbc80d1b36297cb8eff08ae0c2"} Mar 10 07:05:15 crc kubenswrapper[4825]: I0310 07:05:15.809679 4825 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5dec2179-f022-4cc2-9cb6-7d2dc326bc10","Type":"ContainerDied","Data":"37fe47bae01a2a5c8eccc7ac97f930212c36c51609e5617476dbffa57770e020"}
Mar 10 07:05:15 crc kubenswrapper[4825]: I0310 07:05:15.813323 4825 generic.go:334] "Generic (PLEG): container finished" podID="2d5ff589-3823-4ba2-b383-55663e00179d" containerID="0bd9da29b5ce443489d3e524014de1f4ad395e34701bf0dec9b364edfdd57c58" exitCode=0
Mar 10 07:05:15 crc kubenswrapper[4825]: I0310 07:05:15.813390 4825 generic.go:334] "Generic (PLEG): container finished" podID="2d5ff589-3823-4ba2-b383-55663e00179d" containerID="cbae43b98b7d3c0a7c7e2b9d1aa471ccf547cd1d4bce9f6deadbe9ee371f7cdd" exitCode=143
Mar 10 07:05:15 crc kubenswrapper[4825]: I0310 07:05:15.813420 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2d5ff589-3823-4ba2-b383-55663e00179d","Type":"ContainerDied","Data":"0bd9da29b5ce443489d3e524014de1f4ad395e34701bf0dec9b364edfdd57c58"}
Mar 10 07:05:15 crc kubenswrapper[4825]: I0310 07:05:15.813480 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2d5ff589-3823-4ba2-b383-55663e00179d","Type":"ContainerDied","Data":"cbae43b98b7d3c0a7c7e2b9d1aa471ccf547cd1d4bce9f6deadbe9ee371f7cdd"}
Mar 10 07:05:16 crc kubenswrapper[4825]: I0310 07:05:16.828810 4825 generic.go:334] "Generic (PLEG): container finished" podID="1c42468d-409a-4aad-b52f-6553acdc5179" containerID="38a51644caf8c9a32ccb226b34fbd7261ce3347b38b99707dc8437f9e1f32f8a" exitCode=0
Mar 10 07:05:16 crc kubenswrapper[4825]: I0310 07:05:16.828909 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9dv74" event={"ID":"1c42468d-409a-4aad-b52f-6553acdc5179","Type":"ContainerDied","Data":"38a51644caf8c9a32ccb226b34fbd7261ce3347b38b99707dc8437f9e1f32f8a"}
Mar 10 07:05:16 crc kubenswrapper[4825]: I0310 07:05:16.852452 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.8524251 podStartE2EDuration="7.8524251s" podCreationTimestamp="2026-03-10 07:05:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:05:14.818130599 +0000 UTC m=+1267.847911224" watchObservedRunningTime="2026-03-10 07:05:16.8524251 +0000 UTC m=+1269.882205715"
Mar 10 07:05:20 crc kubenswrapper[4825]: I0310 07:05:20.594440 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-ccd7c9f8f-cznm2"
Mar 10 07:05:20 crc kubenswrapper[4825]: I0310 07:05:20.666794 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-r9bqz"]
Mar 10 07:05:20 crc kubenswrapper[4825]: I0310 07:05:20.667105 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-675f7dd995-r9bqz" podUID="43dacc02-935b-4836-8863-e175536b0cd2" containerName="dnsmasq-dns" containerID="cri-o://e38be62763739c1a7ecde94467ed37f7af960422776051d7c021bd000a095079" gracePeriod=10
Mar 10 07:05:20 crc kubenswrapper[4825]: I0310 07:05:20.892639 4825 generic.go:334] "Generic (PLEG): container finished" podID="43dacc02-935b-4836-8863-e175536b0cd2" containerID="e38be62763739c1a7ecde94467ed37f7af960422776051d7c021bd000a095079" exitCode=0
Mar 10 07:05:20 crc kubenswrapper[4825]: I0310 07:05:20.892693 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f7dd995-r9bqz" event={"ID":"43dacc02-935b-4836-8863-e175536b0cd2","Type":"ContainerDied","Data":"e38be62763739c1a7ecde94467ed37f7af960422776051d7c021bd000a095079"}
Mar 10 07:05:21 crc kubenswrapper[4825]: I0310 07:05:21.369285 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-675f7dd995-r9bqz" podUID="43dacc02-935b-4836-8863-e175536b0cd2" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.119:5353: connect: connection refused"
Mar 10 07:05:23 crc kubenswrapper[4825]: I0310 07:05:23.292993 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-9dv74"
Mar 10 07:05:23 crc kubenswrapper[4825]: I0310 07:05:23.400630 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c42468d-409a-4aad-b52f-6553acdc5179-combined-ca-bundle\") pod \"1c42468d-409a-4aad-b52f-6553acdc5179\" (UID: \"1c42468d-409a-4aad-b52f-6553acdc5179\") "
Mar 10 07:05:23 crc kubenswrapper[4825]: I0310 07:05:23.400705 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsqbp\" (UniqueName: \"kubernetes.io/projected/1c42468d-409a-4aad-b52f-6553acdc5179-kube-api-access-tsqbp\") pod \"1c42468d-409a-4aad-b52f-6553acdc5179\" (UID: \"1c42468d-409a-4aad-b52f-6553acdc5179\") "
Mar 10 07:05:23 crc kubenswrapper[4825]: I0310 07:05:23.400763 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c42468d-409a-4aad-b52f-6553acdc5179-scripts\") pod \"1c42468d-409a-4aad-b52f-6553acdc5179\" (UID: \"1c42468d-409a-4aad-b52f-6553acdc5179\") "
Mar 10 07:05:23 crc kubenswrapper[4825]: I0310 07:05:23.400808 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1c42468d-409a-4aad-b52f-6553acdc5179-credential-keys\") pod \"1c42468d-409a-4aad-b52f-6553acdc5179\" (UID: \"1c42468d-409a-4aad-b52f-6553acdc5179\") "
Mar 10 07:05:23 crc kubenswrapper[4825]: I0310 07:05:23.400870 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c42468d-409a-4aad-b52f-6553acdc5179-config-data\") pod \"1c42468d-409a-4aad-b52f-6553acdc5179\" (UID: \"1c42468d-409a-4aad-b52f-6553acdc5179\") "
Mar 10 07:05:23 crc kubenswrapper[4825]: I0310 07:05:23.400887 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1c42468d-409a-4aad-b52f-6553acdc5179-fernet-keys\") pod \"1c42468d-409a-4aad-b52f-6553acdc5179\" (UID: \"1c42468d-409a-4aad-b52f-6553acdc5179\") "
Mar 10 07:05:23 crc kubenswrapper[4825]: I0310 07:05:23.407179 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c42468d-409a-4aad-b52f-6553acdc5179-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "1c42468d-409a-4aad-b52f-6553acdc5179" (UID: "1c42468d-409a-4aad-b52f-6553acdc5179"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 07:05:23 crc kubenswrapper[4825]: I0310 07:05:23.407366 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c42468d-409a-4aad-b52f-6553acdc5179-scripts" (OuterVolumeSpecName: "scripts") pod "1c42468d-409a-4aad-b52f-6553acdc5179" (UID: "1c42468d-409a-4aad-b52f-6553acdc5179"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 07:05:23 crc kubenswrapper[4825]: I0310 07:05:23.416926 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c42468d-409a-4aad-b52f-6553acdc5179-kube-api-access-tsqbp" (OuterVolumeSpecName: "kube-api-access-tsqbp") pod "1c42468d-409a-4aad-b52f-6553acdc5179" (UID: "1c42468d-409a-4aad-b52f-6553acdc5179"). InnerVolumeSpecName "kube-api-access-tsqbp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 07:05:23 crc kubenswrapper[4825]: I0310 07:05:23.416977 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c42468d-409a-4aad-b52f-6553acdc5179-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "1c42468d-409a-4aad-b52f-6553acdc5179" (UID: "1c42468d-409a-4aad-b52f-6553acdc5179"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 07:05:23 crc kubenswrapper[4825]: I0310 07:05:23.426381 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c42468d-409a-4aad-b52f-6553acdc5179-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c42468d-409a-4aad-b52f-6553acdc5179" (UID: "1c42468d-409a-4aad-b52f-6553acdc5179"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 07:05:23 crc kubenswrapper[4825]: I0310 07:05:23.429957 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c42468d-409a-4aad-b52f-6553acdc5179-config-data" (OuterVolumeSpecName: "config-data") pod "1c42468d-409a-4aad-b52f-6553acdc5179" (UID: "1c42468d-409a-4aad-b52f-6553acdc5179"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 07:05:23 crc kubenswrapper[4825]: I0310 07:05:23.502826 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsqbp\" (UniqueName: \"kubernetes.io/projected/1c42468d-409a-4aad-b52f-6553acdc5179-kube-api-access-tsqbp\") on node \"crc\" DevicePath \"\""
Mar 10 07:05:23 crc kubenswrapper[4825]: I0310 07:05:23.503273 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c42468d-409a-4aad-b52f-6553acdc5179-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 07:05:23 crc kubenswrapper[4825]: I0310 07:05:23.503284 4825 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1c42468d-409a-4aad-b52f-6553acdc5179-credential-keys\") on node \"crc\" DevicePath \"\""
Mar 10 07:05:23 crc kubenswrapper[4825]: I0310 07:05:23.503293 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c42468d-409a-4aad-b52f-6553acdc5179-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 07:05:23 crc kubenswrapper[4825]: I0310 07:05:23.503303 4825 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1c42468d-409a-4aad-b52f-6553acdc5179-fernet-keys\") on node \"crc\" DevicePath \"\""
Mar 10 07:05:23 crc kubenswrapper[4825]: I0310 07:05:23.503312 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c42468d-409a-4aad-b52f-6553acdc5179-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 07:05:23 crc kubenswrapper[4825]: I0310 07:05:23.921251 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9dv74" event={"ID":"1c42468d-409a-4aad-b52f-6553acdc5179","Type":"ContainerDied","Data":"df94e0024265987c9b30cca9ac8ccb167c7c44ec2783fb8fb4ffc5f32a852855"}
Mar 10 07:05:23 crc kubenswrapper[4825]: I0310 07:05:23.921303 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-9dv74"
Mar 10 07:05:23 crc kubenswrapper[4825]: I0310 07:05:23.921307 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df94e0024265987c9b30cca9ac8ccb167c7c44ec2783fb8fb4ffc5f32a852855"
Mar 10 07:05:24 crc kubenswrapper[4825]: I0310 07:05:24.381822 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-9dv74"]
Mar 10 07:05:24 crc kubenswrapper[4825]: I0310 07:05:24.396146 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-9dv74"]
Mar 10 07:05:24 crc kubenswrapper[4825]: I0310 07:05:24.485178 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-t7ktm"]
Mar 10 07:05:24 crc kubenswrapper[4825]: E0310 07:05:24.485618 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5a08816-8aac-4c49-abab-deff21e61f4b" containerName="init"
Mar 10 07:05:24 crc kubenswrapper[4825]: I0310 07:05:24.485639 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5a08816-8aac-4c49-abab-deff21e61f4b" containerName="init"
Mar 10 07:05:24 crc kubenswrapper[4825]: E0310 07:05:24.485665 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c42468d-409a-4aad-b52f-6553acdc5179" containerName="keystone-bootstrap"
Mar 10 07:05:24 crc kubenswrapper[4825]: I0310 07:05:24.485673 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c42468d-409a-4aad-b52f-6553acdc5179" containerName="keystone-bootstrap"
Mar 10 07:05:24 crc kubenswrapper[4825]: I0310 07:05:24.485873 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c42468d-409a-4aad-b52f-6553acdc5179" containerName="keystone-bootstrap"
Mar 10 07:05:24 crc kubenswrapper[4825]: I0310 07:05:24.485899 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5a08816-8aac-4c49-abab-deff21e61f4b" containerName="init"
Mar 10 07:05:24 crc kubenswrapper[4825]: I0310 07:05:24.486656 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-t7ktm"
Mar 10 07:05:24 crc kubenswrapper[4825]: I0310 07:05:24.490949 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 10 07:05:24 crc kubenswrapper[4825]: I0310 07:05:24.491253 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 10 07:05:24 crc kubenswrapper[4825]: I0310 07:05:24.491556 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rs4jw"
Mar 10 07:05:24 crc kubenswrapper[4825]: I0310 07:05:24.491724 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 10 07:05:24 crc kubenswrapper[4825]: I0310 07:05:24.491741 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Mar 10 07:05:24 crc kubenswrapper[4825]: I0310 07:05:24.499537 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-t7ktm"]
Mar 10 07:05:24 crc kubenswrapper[4825]: I0310 07:05:24.626231 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ff86ab8-a122-4256-8529-12265e6177e4-combined-ca-bundle\") pod \"keystone-bootstrap-t7ktm\" (UID: \"0ff86ab8-a122-4256-8529-12265e6177e4\") " pod="openstack/keystone-bootstrap-t7ktm"
Mar 10 07:05:24 crc kubenswrapper[4825]: I0310 07:05:24.626284 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0ff86ab8-a122-4256-8529-12265e6177e4-credential-keys\") pod \"keystone-bootstrap-t7ktm\" (UID: \"0ff86ab8-a122-4256-8529-12265e6177e4\") " pod="openstack/keystone-bootstrap-t7ktm"
Mar 10 07:05:24 crc kubenswrapper[4825]: I0310 07:05:24.626373 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ff86ab8-a122-4256-8529-12265e6177e4-config-data\") pod \"keystone-bootstrap-t7ktm\" (UID: \"0ff86ab8-a122-4256-8529-12265e6177e4\") " pod="openstack/keystone-bootstrap-t7ktm"
Mar 10 07:05:24 crc kubenswrapper[4825]: I0310 07:05:24.626459 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0ff86ab8-a122-4256-8529-12265e6177e4-fernet-keys\") pod \"keystone-bootstrap-t7ktm\" (UID: \"0ff86ab8-a122-4256-8529-12265e6177e4\") " pod="openstack/keystone-bootstrap-t7ktm"
Mar 10 07:05:24 crc kubenswrapper[4825]: I0310 07:05:24.626499 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ff86ab8-a122-4256-8529-12265e6177e4-scripts\") pod \"keystone-bootstrap-t7ktm\" (UID: \"0ff86ab8-a122-4256-8529-12265e6177e4\") " pod="openstack/keystone-bootstrap-t7ktm"
Mar 10 07:05:24 crc kubenswrapper[4825]: I0310 07:05:24.626535 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgnq5\" (UniqueName: \"kubernetes.io/projected/0ff86ab8-a122-4256-8529-12265e6177e4-kube-api-access-xgnq5\") pod \"keystone-bootstrap-t7ktm\" (UID: \"0ff86ab8-a122-4256-8529-12265e6177e4\") " pod="openstack/keystone-bootstrap-t7ktm"
Mar 10 07:05:24 crc kubenswrapper[4825]: I0310 07:05:24.728757 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ff86ab8-a122-4256-8529-12265e6177e4-config-data\") pod \"keystone-bootstrap-t7ktm\" (UID: \"0ff86ab8-a122-4256-8529-12265e6177e4\") " pod="openstack/keystone-bootstrap-t7ktm"
Mar 10 07:05:24 crc kubenswrapper[4825]: I0310 07:05:24.728844 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ff86ab8-a122-4256-8529-12265e6177e4-scripts\") pod \"keystone-bootstrap-t7ktm\" (UID: \"0ff86ab8-a122-4256-8529-12265e6177e4\") " pod="openstack/keystone-bootstrap-t7ktm"
Mar 10 07:05:24 crc kubenswrapper[4825]: I0310 07:05:24.728870 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0ff86ab8-a122-4256-8529-12265e6177e4-fernet-keys\") pod \"keystone-bootstrap-t7ktm\" (UID: \"0ff86ab8-a122-4256-8529-12265e6177e4\") " pod="openstack/keystone-bootstrap-t7ktm"
Mar 10 07:05:24 crc kubenswrapper[4825]: I0310 07:05:24.728907 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgnq5\" (UniqueName: \"kubernetes.io/projected/0ff86ab8-a122-4256-8529-12265e6177e4-kube-api-access-xgnq5\") pod \"keystone-bootstrap-t7ktm\" (UID: \"0ff86ab8-a122-4256-8529-12265e6177e4\") " pod="openstack/keystone-bootstrap-t7ktm"
Mar 10 07:05:24 crc kubenswrapper[4825]: I0310 07:05:24.729058 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ff86ab8-a122-4256-8529-12265e6177e4-combined-ca-bundle\") pod \"keystone-bootstrap-t7ktm\" (UID: \"0ff86ab8-a122-4256-8529-12265e6177e4\") " pod="openstack/keystone-bootstrap-t7ktm"
Mar 10 07:05:24 crc kubenswrapper[4825]: I0310 07:05:24.729091 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0ff86ab8-a122-4256-8529-12265e6177e4-credential-keys\") pod \"keystone-bootstrap-t7ktm\" (UID: \"0ff86ab8-a122-4256-8529-12265e6177e4\") " pod="openstack/keystone-bootstrap-t7ktm"
Mar 10 07:05:24 crc kubenswrapper[4825]: I0310 07:05:24.733867 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ff86ab8-a122-4256-8529-12265e6177e4-scripts\") pod \"keystone-bootstrap-t7ktm\" (UID: \"0ff86ab8-a122-4256-8529-12265e6177e4\") " pod="openstack/keystone-bootstrap-t7ktm"
Mar 10 07:05:24 crc kubenswrapper[4825]: I0310 07:05:24.736848 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ff86ab8-a122-4256-8529-12265e6177e4-config-data\") pod \"keystone-bootstrap-t7ktm\" (UID: \"0ff86ab8-a122-4256-8529-12265e6177e4\") " pod="openstack/keystone-bootstrap-t7ktm"
Mar 10 07:05:24 crc kubenswrapper[4825]: I0310 07:05:24.740634 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0ff86ab8-a122-4256-8529-12265e6177e4-credential-keys\") pod \"keystone-bootstrap-t7ktm\" (UID: \"0ff86ab8-a122-4256-8529-12265e6177e4\") " pod="openstack/keystone-bootstrap-t7ktm"
Mar 10 07:05:24 crc kubenswrapper[4825]: I0310 07:05:24.744045 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ff86ab8-a122-4256-8529-12265e6177e4-combined-ca-bundle\") pod \"keystone-bootstrap-t7ktm\" (UID: \"0ff86ab8-a122-4256-8529-12265e6177e4\") " pod="openstack/keystone-bootstrap-t7ktm"
Mar 10 07:05:24 crc kubenswrapper[4825]: I0310 07:05:24.747617 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0ff86ab8-a122-4256-8529-12265e6177e4-fernet-keys\") pod \"keystone-bootstrap-t7ktm\" (UID: \"0ff86ab8-a122-4256-8529-12265e6177e4\") " pod="openstack/keystone-bootstrap-t7ktm"
Mar 10 07:05:24 crc kubenswrapper[4825]: I0310 07:05:24.748423 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgnq5\" (UniqueName: \"kubernetes.io/projected/0ff86ab8-a122-4256-8529-12265e6177e4-kube-api-access-xgnq5\") pod \"keystone-bootstrap-t7ktm\" (UID: \"0ff86ab8-a122-4256-8529-12265e6177e4\") " pod="openstack/keystone-bootstrap-t7ktm"
Mar 10 07:05:24 crc kubenswrapper[4825]: I0310 07:05:24.817106 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-t7ktm"
Mar 10 07:05:25 crc kubenswrapper[4825]: I0310 07:05:25.260652 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c42468d-409a-4aad-b52f-6553acdc5179" path="/var/lib/kubelet/pods/1c42468d-409a-4aad-b52f-6553acdc5179/volumes"
Mar 10 07:05:26 crc kubenswrapper[4825]: I0310 07:05:26.368736 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-675f7dd995-r9bqz" podUID="43dacc02-935b-4836-8863-e175536b0cd2" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.119:5353: connect: connection refused"
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.594971 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.613834 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.748971 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d5ff589-3823-4ba2-b383-55663e00179d-config-data\") pod \"2d5ff589-3823-4ba2-b383-55663e00179d\" (UID: \"2d5ff589-3823-4ba2-b383-55663e00179d\") "
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.749035 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d5ff589-3823-4ba2-b383-55663e00179d-logs\") pod \"2d5ff589-3823-4ba2-b383-55663e00179d\" (UID: \"2d5ff589-3823-4ba2-b383-55663e00179d\") "
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.749061 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"2d5ff589-3823-4ba2-b383-55663e00179d\" (UID: \"2d5ff589-3823-4ba2-b383-55663e00179d\") "
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.749122 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d5ff589-3823-4ba2-b383-55663e00179d-combined-ca-bundle\") pod \"2d5ff589-3823-4ba2-b383-55663e00179d\" (UID: \"2d5ff589-3823-4ba2-b383-55663e00179d\") "
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.749179 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d5ff589-3823-4ba2-b383-55663e00179d-scripts\") pod \"2d5ff589-3823-4ba2-b383-55663e00179d\" (UID: \"2d5ff589-3823-4ba2-b383-55663e00179d\") "
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.749329 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dec2179-f022-4cc2-9cb6-7d2dc326bc10-scripts\") pod \"5dec2179-f022-4cc2-9cb6-7d2dc326bc10\" (UID: \"5dec2179-f022-4cc2-9cb6-7d2dc326bc10\") "
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.749615 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7s8n\" (UniqueName: \"kubernetes.io/projected/5dec2179-f022-4cc2-9cb6-7d2dc326bc10-kube-api-access-s7s8n\") pod \"5dec2179-f022-4cc2-9cb6-7d2dc326bc10\" (UID: \"5dec2179-f022-4cc2-9cb6-7d2dc326bc10\") "
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.749710 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dec2179-f022-4cc2-9cb6-7d2dc326bc10-config-data\") pod \"5dec2179-f022-4cc2-9cb6-7d2dc326bc10\" (UID: \"5dec2179-f022-4cc2-9cb6-7d2dc326bc10\") "
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.749782 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dec2179-f022-4cc2-9cb6-7d2dc326bc10-combined-ca-bundle\") pod \"5dec2179-f022-4cc2-9cb6-7d2dc326bc10\" (UID: \"5dec2179-f022-4cc2-9cb6-7d2dc326bc10\") "
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.749819 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dec2179-f022-4cc2-9cb6-7d2dc326bc10-internal-tls-certs\") pod \"5dec2179-f022-4cc2-9cb6-7d2dc326bc10\" (UID: \"5dec2179-f022-4cc2-9cb6-7d2dc326bc10\") "
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.749847 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"5dec2179-f022-4cc2-9cb6-7d2dc326bc10\" (UID: \"5dec2179-f022-4cc2-9cb6-7d2dc326bc10\") "
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.749883 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2d5ff589-3823-4ba2-b383-55663e00179d-httpd-run\") pod \"2d5ff589-3823-4ba2-b383-55663e00179d\" (UID: \"2d5ff589-3823-4ba2-b383-55663e00179d\") "
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.749958 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5dec2179-f022-4cc2-9cb6-7d2dc326bc10-httpd-run\") pod \"5dec2179-f022-4cc2-9cb6-7d2dc326bc10\" (UID: \"5dec2179-f022-4cc2-9cb6-7d2dc326bc10\") "
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.750002 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d5ff589-3823-4ba2-b383-55663e00179d-public-tls-certs\") pod \"2d5ff589-3823-4ba2-b383-55663e00179d\" (UID: \"2d5ff589-3823-4ba2-b383-55663e00179d\") "
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.750066 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zvjx\" (UniqueName: \"kubernetes.io/projected/2d5ff589-3823-4ba2-b383-55663e00179d-kube-api-access-9zvjx\") pod \"2d5ff589-3823-4ba2-b383-55663e00179d\" (UID: \"2d5ff589-3823-4ba2-b383-55663e00179d\") "
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.750104 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5dec2179-f022-4cc2-9cb6-7d2dc326bc10-logs\") pod \"5dec2179-f022-4cc2-9cb6-7d2dc326bc10\" (UID: \"5dec2179-f022-4cc2-9cb6-7d2dc326bc10\") "
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.751041 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dec2179-f022-4cc2-9cb6-7d2dc326bc10-logs" (OuterVolumeSpecName: "logs") pod "5dec2179-f022-4cc2-9cb6-7d2dc326bc10" (UID: "5dec2179-f022-4cc2-9cb6-7d2dc326bc10"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.754703 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d5ff589-3823-4ba2-b383-55663e00179d-logs" (OuterVolumeSpecName: "logs") pod "2d5ff589-3823-4ba2-b383-55663e00179d" (UID: "2d5ff589-3823-4ba2-b383-55663e00179d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.755470 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dec2179-f022-4cc2-9cb6-7d2dc326bc10-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5dec2179-f022-4cc2-9cb6-7d2dc326bc10" (UID: "5dec2179-f022-4cc2-9cb6-7d2dc326bc10"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.756855 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d5ff589-3823-4ba2-b383-55663e00179d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2d5ff589-3823-4ba2-b383-55663e00179d" (UID: "2d5ff589-3823-4ba2-b383-55663e00179d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.758996 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dec2179-f022-4cc2-9cb6-7d2dc326bc10-kube-api-access-s7s8n" (OuterVolumeSpecName: "kube-api-access-s7s8n") pod "5dec2179-f022-4cc2-9cb6-7d2dc326bc10" (UID: "5dec2179-f022-4cc2-9cb6-7d2dc326bc10"). InnerVolumeSpecName "kube-api-access-s7s8n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.760282 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dec2179-f022-4cc2-9cb6-7d2dc326bc10-scripts" (OuterVolumeSpecName: "scripts") pod "5dec2179-f022-4cc2-9cb6-7d2dc326bc10" (UID: "5dec2179-f022-4cc2-9cb6-7d2dc326bc10"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.761261 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "5dec2179-f022-4cc2-9cb6-7d2dc326bc10" (UID: "5dec2179-f022-4cc2-9cb6-7d2dc326bc10"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.765733 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d5ff589-3823-4ba2-b383-55663e00179d-kube-api-access-9zvjx" (OuterVolumeSpecName: "kube-api-access-9zvjx") pod "2d5ff589-3823-4ba2-b383-55663e00179d" (UID: "2d5ff589-3823-4ba2-b383-55663e00179d"). InnerVolumeSpecName "kube-api-access-9zvjx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.780722 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d5ff589-3823-4ba2-b383-55663e00179d-scripts" (OuterVolumeSpecName: "scripts") pod "2d5ff589-3823-4ba2-b383-55663e00179d" (UID: "2d5ff589-3823-4ba2-b383-55663e00179d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.782524 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "2d5ff589-3823-4ba2-b383-55663e00179d" (UID: "2d5ff589-3823-4ba2-b383-55663e00179d"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.794780 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dec2179-f022-4cc2-9cb6-7d2dc326bc10-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5dec2179-f022-4cc2-9cb6-7d2dc326bc10" (UID: "5dec2179-f022-4cc2-9cb6-7d2dc326bc10"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.805085 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d5ff589-3823-4ba2-b383-55663e00179d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d5ff589-3823-4ba2-b383-55663e00179d" (UID: "2d5ff589-3823-4ba2-b383-55663e00179d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.842723 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d5ff589-3823-4ba2-b383-55663e00179d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2d5ff589-3823-4ba2-b383-55663e00179d" (UID: "2d5ff589-3823-4ba2-b383-55663e00179d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.845166 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dec2179-f022-4cc2-9cb6-7d2dc326bc10-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5dec2179-f022-4cc2-9cb6-7d2dc326bc10" (UID: "5dec2179-f022-4cc2-9cb6-7d2dc326bc10"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.852795 4825 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" "
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.852845 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d5ff589-3823-4ba2-b383-55663e00179d-logs\") on node \"crc\" DevicePath \"\""
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.852860 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d5ff589-3823-4ba2-b383-55663e00179d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.852875 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d5ff589-3823-4ba2-b383-55663e00179d-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.852889 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dec2179-f022-4cc2-9cb6-7d2dc326bc10-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.852900 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7s8n\" (UniqueName: \"kubernetes.io/projected/5dec2179-f022-4cc2-9cb6-7d2dc326bc10-kube-api-access-s7s8n\") on node \"crc\" DevicePath \"\""
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.852913 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dec2179-f022-4cc2-9cb6-7d2dc326bc10-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.852923 4825 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dec2179-f022-4cc2-9cb6-7d2dc326bc10-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.852951 4825 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" "
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.852962 4825 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2d5ff589-3823-4ba2-b383-55663e00179d-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.852975 4825 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5dec2179-f022-4cc2-9cb6-7d2dc326bc10-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.852985 4825 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d5ff589-3823-4ba2-b383-55663e00179d-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.852995 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zvjx\" (UniqueName: \"kubernetes.io/projected/2d5ff589-3823-4ba2-b383-55663e00179d-kube-api-access-9zvjx\") on node \"crc\" DevicePath \"\""
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.853006 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5dec2179-f022-4cc2-9cb6-7d2dc326bc10-logs\") on node \"crc\" DevicePath \"\""
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.860465 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dec2179-f022-4cc2-9cb6-7d2dc326bc10-config-data" (OuterVolumeSpecName: "config-data") pod "5dec2179-f022-4cc2-9cb6-7d2dc326bc10" (UID: "5dec2179-f022-4cc2-9cb6-7d2dc326bc10"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.867331 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d5ff589-3823-4ba2-b383-55663e00179d-config-data" (OuterVolumeSpecName: "config-data") pod "2d5ff589-3823-4ba2-b383-55663e00179d" (UID: "2d5ff589-3823-4ba2-b383-55663e00179d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.873285 4825 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc"
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.875860 4825 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc"
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.955302 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d5ff589-3823-4ba2-b383-55663e00179d-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.955334 4825 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\""
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.955343 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dec2179-f022-4cc2-9cb6-7d2dc326bc10-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.955352 4825 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\""
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.983764 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5dec2179-f022-4cc2-9cb6-7d2dc326bc10","Type":"ContainerDied","Data":"e3d8af8deba0a28a00b24c6eb3ae76eb86dec0ede786754ab9f4009df37323cf"}
Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.983819 4825 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.983832 4825 scope.go:117] "RemoveContainer" containerID="da9b57e52caaf44be121346a02932df9fc330bbbc80d1b36297cb8eff08ae0c2" Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.987504 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2d5ff589-3823-4ba2-b383-55663e00179d","Type":"ContainerDied","Data":"44b10bf6f10f3ea8a4f61ee9cbf4d90e288ce8d80a75a3bf9ddc73c4149534ab"} Mar 10 07:05:30 crc kubenswrapper[4825]: I0310 07:05:30.987565 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.030162 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.060282 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.071291 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.077160 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.100778 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 07:05:31 crc kubenswrapper[4825]: E0310 07:05:31.101475 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d5ff589-3823-4ba2-b383-55663e00179d" containerName="glance-log" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.101496 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d5ff589-3823-4ba2-b383-55663e00179d" containerName="glance-log" Mar 10 07:05:31 crc 
kubenswrapper[4825]: E0310 07:05:31.101527 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dec2179-f022-4cc2-9cb6-7d2dc326bc10" containerName="glance-httpd" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.101537 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dec2179-f022-4cc2-9cb6-7d2dc326bc10" containerName="glance-httpd" Mar 10 07:05:31 crc kubenswrapper[4825]: E0310 07:05:31.101550 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dec2179-f022-4cc2-9cb6-7d2dc326bc10" containerName="glance-log" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.101556 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dec2179-f022-4cc2-9cb6-7d2dc326bc10" containerName="glance-log" Mar 10 07:05:31 crc kubenswrapper[4825]: E0310 07:05:31.101580 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d5ff589-3823-4ba2-b383-55663e00179d" containerName="glance-httpd" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.101587 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d5ff589-3823-4ba2-b383-55663e00179d" containerName="glance-httpd" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.101938 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d5ff589-3823-4ba2-b383-55663e00179d" containerName="glance-log" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.101967 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dec2179-f022-4cc2-9cb6-7d2dc326bc10" containerName="glance-log" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.101986 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dec2179-f022-4cc2-9cb6-7d2dc326bc10" containerName="glance-httpd" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.102005 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d5ff589-3823-4ba2-b383-55663e00179d" containerName="glance-httpd" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 
07:05:31.103386 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.103497 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.112960 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-lwfpg" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.113202 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.114641 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.114874 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.143973 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.148804 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.152485 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.152845 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.167549 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.250221 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d5ff589-3823-4ba2-b383-55663e00179d" path="/var/lib/kubelet/pods/2d5ff589-3823-4ba2-b383-55663e00179d/volumes" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.253231 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dec2179-f022-4cc2-9cb6-7d2dc326bc10" path="/var/lib/kubelet/pods/5dec2179-f022-4cc2-9cb6-7d2dc326bc10/volumes" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.262059 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3011e2a4-8b30-4770-9008-ba269097abfd-logs\") pod \"glance-default-external-api-0\" (UID: \"3011e2a4-8b30-4770-9008-ba269097abfd\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.262121 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6f252a1-68f2-4f9a-ade7-0b979581b8c6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d6f252a1-68f2-4f9a-ade7-0b979581b8c6\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.262238 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3011e2a4-8b30-4770-9008-ba269097abfd-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3011e2a4-8b30-4770-9008-ba269097abfd\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.262372 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6f252a1-68f2-4f9a-ade7-0b979581b8c6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d6f252a1-68f2-4f9a-ade7-0b979581b8c6\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.262407 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6f252a1-68f2-4f9a-ade7-0b979581b8c6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d6f252a1-68f2-4f9a-ade7-0b979581b8c6\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.262450 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3011e2a4-8b30-4770-9008-ba269097abfd-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3011e2a4-8b30-4770-9008-ba269097abfd\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.262529 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6f252a1-68f2-4f9a-ade7-0b979581b8c6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d6f252a1-68f2-4f9a-ade7-0b979581b8c6\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 
07:05:31.262621 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3011e2a4-8b30-4770-9008-ba269097abfd-scripts\") pod \"glance-default-external-api-0\" (UID: \"3011e2a4-8b30-4770-9008-ba269097abfd\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.262676 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3011e2a4-8b30-4770-9008-ba269097abfd-config-data\") pod \"glance-default-external-api-0\" (UID: \"3011e2a4-8b30-4770-9008-ba269097abfd\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.262714 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"3011e2a4-8b30-4770-9008-ba269097abfd\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.262742 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d6f252a1-68f2-4f9a-ade7-0b979581b8c6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d6f252a1-68f2-4f9a-ade7-0b979581b8c6\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.262815 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6f252a1-68f2-4f9a-ade7-0b979581b8c6-logs\") pod \"glance-default-internal-api-0\" (UID: \"d6f252a1-68f2-4f9a-ade7-0b979581b8c6\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.262864 4825 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3011e2a4-8b30-4770-9008-ba269097abfd-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3011e2a4-8b30-4770-9008-ba269097abfd\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.263059 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6nbm\" (UniqueName: \"kubernetes.io/projected/3011e2a4-8b30-4770-9008-ba269097abfd-kube-api-access-q6nbm\") pod \"glance-default-external-api-0\" (UID: \"3011e2a4-8b30-4770-9008-ba269097abfd\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.263218 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"d6f252a1-68f2-4f9a-ade7-0b979581b8c6\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.263302 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-796sf\" (UniqueName: \"kubernetes.io/projected/d6f252a1-68f2-4f9a-ade7-0b979581b8c6-kube-api-access-796sf\") pod \"glance-default-internal-api-0\" (UID: \"d6f252a1-68f2-4f9a-ade7-0b979581b8c6\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.365604 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6f252a1-68f2-4f9a-ade7-0b979581b8c6-logs\") pod \"glance-default-internal-api-0\" (UID: \"d6f252a1-68f2-4f9a-ade7-0b979581b8c6\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.365927 
4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3011e2a4-8b30-4770-9008-ba269097abfd-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3011e2a4-8b30-4770-9008-ba269097abfd\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.366046 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6nbm\" (UniqueName: \"kubernetes.io/projected/3011e2a4-8b30-4770-9008-ba269097abfd-kube-api-access-q6nbm\") pod \"glance-default-external-api-0\" (UID: \"3011e2a4-8b30-4770-9008-ba269097abfd\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.366112 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"d6f252a1-68f2-4f9a-ade7-0b979581b8c6\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.366163 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-796sf\" (UniqueName: \"kubernetes.io/projected/d6f252a1-68f2-4f9a-ade7-0b979581b8c6-kube-api-access-796sf\") pod \"glance-default-internal-api-0\" (UID: \"d6f252a1-68f2-4f9a-ade7-0b979581b8c6\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.366562 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3011e2a4-8b30-4770-9008-ba269097abfd-logs\") pod \"glance-default-external-api-0\" (UID: \"3011e2a4-8b30-4770-9008-ba269097abfd\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.366623 4825 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6f252a1-68f2-4f9a-ade7-0b979581b8c6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d6f252a1-68f2-4f9a-ade7-0b979581b8c6\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.366648 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3011e2a4-8b30-4770-9008-ba269097abfd-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3011e2a4-8b30-4770-9008-ba269097abfd\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.366729 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6f252a1-68f2-4f9a-ade7-0b979581b8c6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d6f252a1-68f2-4f9a-ade7-0b979581b8c6\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.366763 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6f252a1-68f2-4f9a-ade7-0b979581b8c6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d6f252a1-68f2-4f9a-ade7-0b979581b8c6\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.366791 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3011e2a4-8b30-4770-9008-ba269097abfd-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3011e2a4-8b30-4770-9008-ba269097abfd\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.366940 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d6f252a1-68f2-4f9a-ade7-0b979581b8c6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d6f252a1-68f2-4f9a-ade7-0b979581b8c6\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.367004 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3011e2a4-8b30-4770-9008-ba269097abfd-scripts\") pod \"glance-default-external-api-0\" (UID: \"3011e2a4-8b30-4770-9008-ba269097abfd\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.367033 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3011e2a4-8b30-4770-9008-ba269097abfd-config-data\") pod \"glance-default-external-api-0\" (UID: \"3011e2a4-8b30-4770-9008-ba269097abfd\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.367092 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6f252a1-68f2-4f9a-ade7-0b979581b8c6-logs\") pod \"glance-default-internal-api-0\" (UID: \"d6f252a1-68f2-4f9a-ade7-0b979581b8c6\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.366731 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"d6f252a1-68f2-4f9a-ade7-0b979581b8c6\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.367432 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod 
\"glance-default-external-api-0\" (UID: \"3011e2a4-8b30-4770-9008-ba269097abfd\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.367469 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3011e2a4-8b30-4770-9008-ba269097abfd-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3011e2a4-8b30-4770-9008-ba269097abfd\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.367702 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"3011e2a4-8b30-4770-9008-ba269097abfd\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.367869 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d6f252a1-68f2-4f9a-ade7-0b979581b8c6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d6f252a1-68f2-4f9a-ade7-0b979581b8c6\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.369004 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d6f252a1-68f2-4f9a-ade7-0b979581b8c6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d6f252a1-68f2-4f9a-ade7-0b979581b8c6\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.371502 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3011e2a4-8b30-4770-9008-ba269097abfd-logs\") pod \"glance-default-external-api-0\" (UID: \"3011e2a4-8b30-4770-9008-ba269097abfd\") " 
pod="openstack/glance-default-external-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.374245 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6f252a1-68f2-4f9a-ade7-0b979581b8c6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d6f252a1-68f2-4f9a-ade7-0b979581b8c6\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.376714 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6f252a1-68f2-4f9a-ade7-0b979581b8c6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d6f252a1-68f2-4f9a-ade7-0b979581b8c6\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.377965 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3011e2a4-8b30-4770-9008-ba269097abfd-scripts\") pod \"glance-default-external-api-0\" (UID: \"3011e2a4-8b30-4770-9008-ba269097abfd\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.378291 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6f252a1-68f2-4f9a-ade7-0b979581b8c6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d6f252a1-68f2-4f9a-ade7-0b979581b8c6\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.379008 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3011e2a4-8b30-4770-9008-ba269097abfd-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3011e2a4-8b30-4770-9008-ba269097abfd\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.379198 4825 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3011e2a4-8b30-4770-9008-ba269097abfd-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3011e2a4-8b30-4770-9008-ba269097abfd\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.381741 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6f252a1-68f2-4f9a-ade7-0b979581b8c6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d6f252a1-68f2-4f9a-ade7-0b979581b8c6\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.387357 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3011e2a4-8b30-4770-9008-ba269097abfd-config-data\") pod \"glance-default-external-api-0\" (UID: \"3011e2a4-8b30-4770-9008-ba269097abfd\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.389883 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-796sf\" (UniqueName: \"kubernetes.io/projected/d6f252a1-68f2-4f9a-ade7-0b979581b8c6-kube-api-access-796sf\") pod \"glance-default-internal-api-0\" (UID: \"d6f252a1-68f2-4f9a-ade7-0b979581b8c6\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.392447 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6nbm\" (UniqueName: \"kubernetes.io/projected/3011e2a4-8b30-4770-9008-ba269097abfd-kube-api-access-q6nbm\") pod \"glance-default-external-api-0\" (UID: \"3011e2a4-8b30-4770-9008-ba269097abfd\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.402503 4825 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"3011e2a4-8b30-4770-9008-ba269097abfd\") " pod="openstack/glance-default-external-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.404233 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"d6f252a1-68f2-4f9a-ade7-0b979581b8c6\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.440841 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.477454 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 07:05:31 crc kubenswrapper[4825]: E0310 07:05:31.991022 4825 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838" Mar 10 07:05:31 crc kubenswrapper[4825]: E0310 07:05:31.991228 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2hnvt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-j9wfx_openstack(c3f58042-5c26-4b0b-9f85-d35d9305115e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 07:05:31 crc kubenswrapper[4825]: E0310 07:05:31.993183 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-j9wfx" podUID="c3f58042-5c26-4b0b-9f85-d35d9305115e" Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.999380 4825 generic.go:334] "Generic (PLEG): container finished" podID="ce0e86d1-5b0d-491f-bbd9-59fdd8024236" containerID="c05058771a30272a89ebdf20f49d8eaf62dce4f91bce9a630a67ad47518d7ccc" exitCode=0 Mar 10 07:05:31 crc kubenswrapper[4825]: I0310 07:05:31.999488 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-qlkjv" event={"ID":"ce0e86d1-5b0d-491f-bbd9-59fdd8024236","Type":"ContainerDied","Data":"c05058771a30272a89ebdf20f49d8eaf62dce4f91bce9a630a67ad47518d7ccc"} Mar 10 07:05:32 crc kubenswrapper[4825]: I0310 07:05:32.002385 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f7dd995-r9bqz" event={"ID":"43dacc02-935b-4836-8863-e175536b0cd2","Type":"ContainerDied","Data":"54eafc7ff9ea176efcecca587236b832093c6df9500eb83f8afd60ed5dfb8850"} Mar 10 07:05:32 crc kubenswrapper[4825]: I0310 07:05:32.002442 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54eafc7ff9ea176efcecca587236b832093c6df9500eb83f8afd60ed5dfb8850" Mar 10 07:05:32 crc kubenswrapper[4825]: I0310 07:05:32.023660 4825 scope.go:117] "RemoveContainer" 
containerID="37fe47bae01a2a5c8eccc7ac97f930212c36c51609e5617476dbffa57770e020" Mar 10 07:05:32 crc kubenswrapper[4825]: I0310 07:05:32.117752 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f7dd995-r9bqz" Mar 10 07:05:32 crc kubenswrapper[4825]: I0310 07:05:32.156820 4825 scope.go:117] "RemoveContainer" containerID="0bd9da29b5ce443489d3e524014de1f4ad395e34701bf0dec9b364edfdd57c58" Mar 10 07:05:32 crc kubenswrapper[4825]: I0310 07:05:32.182840 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43dacc02-935b-4836-8863-e175536b0cd2-config\") pod \"43dacc02-935b-4836-8863-e175536b0cd2\" (UID: \"43dacc02-935b-4836-8863-e175536b0cd2\") " Mar 10 07:05:32 crc kubenswrapper[4825]: I0310 07:05:32.182993 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43dacc02-935b-4836-8863-e175536b0cd2-ovsdbserver-sb\") pod \"43dacc02-935b-4836-8863-e175536b0cd2\" (UID: \"43dacc02-935b-4836-8863-e175536b0cd2\") " Mar 10 07:05:32 crc kubenswrapper[4825]: I0310 07:05:32.183068 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43dacc02-935b-4836-8863-e175536b0cd2-dns-svc\") pod \"43dacc02-935b-4836-8863-e175536b0cd2\" (UID: \"43dacc02-935b-4836-8863-e175536b0cd2\") " Mar 10 07:05:32 crc kubenswrapper[4825]: I0310 07:05:32.183109 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlfjm\" (UniqueName: \"kubernetes.io/projected/43dacc02-935b-4836-8863-e175536b0cd2-kube-api-access-qlfjm\") pod \"43dacc02-935b-4836-8863-e175536b0cd2\" (UID: \"43dacc02-935b-4836-8863-e175536b0cd2\") " Mar 10 07:05:32 crc kubenswrapper[4825]: I0310 07:05:32.183185 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43dacc02-935b-4836-8863-e175536b0cd2-ovsdbserver-nb\") pod \"43dacc02-935b-4836-8863-e175536b0cd2\" (UID: \"43dacc02-935b-4836-8863-e175536b0cd2\") " Mar 10 07:05:32 crc kubenswrapper[4825]: I0310 07:05:32.207428 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43dacc02-935b-4836-8863-e175536b0cd2-kube-api-access-qlfjm" (OuterVolumeSpecName: "kube-api-access-qlfjm") pod "43dacc02-935b-4836-8863-e175536b0cd2" (UID: "43dacc02-935b-4836-8863-e175536b0cd2"). InnerVolumeSpecName "kube-api-access-qlfjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:05:32 crc kubenswrapper[4825]: I0310 07:05:32.260654 4825 scope.go:117] "RemoveContainer" containerID="cbae43b98b7d3c0a7c7e2b9d1aa471ccf547cd1d4bce9f6deadbe9ee371f7cdd" Mar 10 07:05:32 crc kubenswrapper[4825]: I0310 07:05:32.262051 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43dacc02-935b-4836-8863-e175536b0cd2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "43dacc02-935b-4836-8863-e175536b0cd2" (UID: "43dacc02-935b-4836-8863-e175536b0cd2"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:05:32 crc kubenswrapper[4825]: I0310 07:05:32.286840 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43dacc02-935b-4836-8863-e175536b0cd2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:32 crc kubenswrapper[4825]: I0310 07:05:32.286866 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlfjm\" (UniqueName: \"kubernetes.io/projected/43dacc02-935b-4836-8863-e175536b0cd2-kube-api-access-qlfjm\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:32 crc kubenswrapper[4825]: I0310 07:05:32.295060 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43dacc02-935b-4836-8863-e175536b0cd2-config" (OuterVolumeSpecName: "config") pod "43dacc02-935b-4836-8863-e175536b0cd2" (UID: "43dacc02-935b-4836-8863-e175536b0cd2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:05:32 crc kubenswrapper[4825]: I0310 07:05:32.305617 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43dacc02-935b-4836-8863-e175536b0cd2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "43dacc02-935b-4836-8863-e175536b0cd2" (UID: "43dacc02-935b-4836-8863-e175536b0cd2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:05:32 crc kubenswrapper[4825]: I0310 07:05:32.306303 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43dacc02-935b-4836-8863-e175536b0cd2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "43dacc02-935b-4836-8863-e175536b0cd2" (UID: "43dacc02-935b-4836-8863-e175536b0cd2"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:05:32 crc kubenswrapper[4825]: I0310 07:05:32.389050 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43dacc02-935b-4836-8863-e175536b0cd2-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:32 crc kubenswrapper[4825]: I0310 07:05:32.389097 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43dacc02-935b-4836-8863-e175536b0cd2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:32 crc kubenswrapper[4825]: I0310 07:05:32.389109 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43dacc02-935b-4836-8863-e175536b0cd2-config\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:32 crc kubenswrapper[4825]: I0310 07:05:32.627782 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 07:05:32 crc kubenswrapper[4825]: W0310 07:05:32.629401 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6f252a1_68f2_4f9a_ade7_0b979581b8c6.slice/crio-fb059dba5c7178f8d2bfa979ed022c99c306d84c82afdb0fba8800ac06115a93 WatchSource:0}: Error finding container fb059dba5c7178f8d2bfa979ed022c99c306d84c82afdb0fba8800ac06115a93: Status 404 returned error can't find the container with id fb059dba5c7178f8d2bfa979ed022c99c306d84c82afdb0fba8800ac06115a93 Mar 10 07:05:32 crc kubenswrapper[4825]: I0310 07:05:32.662457 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-t7ktm"] Mar 10 07:05:33 crc kubenswrapper[4825]: I0310 07:05:33.052408 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-qd4jq" 
event={"ID":"5f59ca51-fd91-41a3-ad9c-6b04ccd93288","Type":"ContainerStarted","Data":"33eb4c1cd9c309922b3b82212e200be97e525a98f1adfa545e20e9f5febfc35a"} Mar 10 07:05:33 crc kubenswrapper[4825]: I0310 07:05:33.057676 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-t7ktm" event={"ID":"0ff86ab8-a122-4256-8529-12265e6177e4","Type":"ContainerStarted","Data":"314c7056d3c2823b0ecdeb62329c9c412680ceadc703f4aa197bcd466ad43054"} Mar 10 07:05:33 crc kubenswrapper[4825]: I0310 07:05:33.057729 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-t7ktm" event={"ID":"0ff86ab8-a122-4256-8529-12265e6177e4","Type":"ContainerStarted","Data":"3a3265d7310bb905af713e565b8b0de72ce7abb7da75d063e70196316e73e6a4"} Mar 10 07:05:33 crc kubenswrapper[4825]: I0310 07:05:33.075738 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-qd4jq" podStartSLOduration=2.451680671 podStartE2EDuration="23.075711505s" podCreationTimestamp="2026-03-10 07:05:10 +0000 UTC" firstStartedPulling="2026-03-10 07:05:11.352551034 +0000 UTC m=+1264.382331639" lastFinishedPulling="2026-03-10 07:05:31.976581858 +0000 UTC m=+1285.006362473" observedRunningTime="2026-03-10 07:05:33.071346541 +0000 UTC m=+1286.101127156" watchObservedRunningTime="2026-03-10 07:05:33.075711505 +0000 UTC m=+1286.105492120" Mar 10 07:05:33 crc kubenswrapper[4825]: I0310 07:05:33.080242 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0862f63-fd04-41f5-88a2-638189435867","Type":"ContainerStarted","Data":"5220de8c97a07f6973f7771ca6c9152a7febd443c4da86431842c9c1571664c3"} Mar 10 07:05:33 crc kubenswrapper[4825]: I0310 07:05:33.089678 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-t7ktm" podStartSLOduration=9.089583107 podStartE2EDuration="9.089583107s" podCreationTimestamp="2026-03-10 07:05:24 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:05:33.085523011 +0000 UTC m=+1286.115303626" watchObservedRunningTime="2026-03-10 07:05:33.089583107 +0000 UTC m=+1286.119363722" Mar 10 07:05:33 crc kubenswrapper[4825]: I0310 07:05:33.097078 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d6f252a1-68f2-4f9a-ade7-0b979581b8c6","Type":"ContainerStarted","Data":"fb059dba5c7178f8d2bfa979ed022c99c306d84c82afdb0fba8800ac06115a93"} Mar 10 07:05:33 crc kubenswrapper[4825]: I0310 07:05:33.099248 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f7dd995-r9bqz" Mar 10 07:05:33 crc kubenswrapper[4825]: I0310 07:05:33.099745 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-b6jlj" event={"ID":"a762be8c-156d-453d-bcd6-ae8571d9133f","Type":"ContainerStarted","Data":"8522014e2b89450213c7c7bf32a784e2ab3f811e78e3dd033eb199f51fcc0e91"} Mar 10 07:05:33 crc kubenswrapper[4825]: E0310 07:05:33.103271 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838\\\"\"" pod="openstack/cinder-db-sync-j9wfx" podUID="c3f58042-5c26-4b0b-9f85-d35d9305115e" Mar 10 07:05:33 crc kubenswrapper[4825]: I0310 07:05:33.152841 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-b6jlj" podStartSLOduration=2.770953903 podStartE2EDuration="23.152821025s" podCreationTimestamp="2026-03-10 07:05:10 +0000 UTC" firstStartedPulling="2026-03-10 07:05:11.588462243 +0000 UTC m=+1264.618242858" lastFinishedPulling="2026-03-10 07:05:31.970329365 +0000 UTC m=+1285.000109980" 
observedRunningTime="2026-03-10 07:05:33.148992735 +0000 UTC m=+1286.178773350" watchObservedRunningTime="2026-03-10 07:05:33.152821025 +0000 UTC m=+1286.182601660" Mar 10 07:05:33 crc kubenswrapper[4825]: I0310 07:05:33.183774 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-r9bqz"] Mar 10 07:05:33 crc kubenswrapper[4825]: I0310 07:05:33.193006 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-r9bqz"] Mar 10 07:05:33 crc kubenswrapper[4825]: I0310 07:05:33.330110 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43dacc02-935b-4836-8863-e175536b0cd2" path="/var/lib/kubelet/pods/43dacc02-935b-4836-8863-e175536b0cd2/volumes" Mar 10 07:05:33 crc kubenswrapper[4825]: I0310 07:05:33.569769 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 07:05:33 crc kubenswrapper[4825]: W0310 07:05:33.746554 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3011e2a4_8b30_4770_9008_ba269097abfd.slice/crio-f291696fd5354c722891e1cfc989cf8c9f2e5b135ebc85075447c078350582be WatchSource:0}: Error finding container f291696fd5354c722891e1cfc989cf8c9f2e5b135ebc85075447c078350582be: Status 404 returned error can't find the container with id f291696fd5354c722891e1cfc989cf8c9f2e5b135ebc85075447c078350582be Mar 10 07:05:33 crc kubenswrapper[4825]: I0310 07:05:33.965040 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-qlkjv" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.019695 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g868g\" (UniqueName: \"kubernetes.io/projected/ce0e86d1-5b0d-491f-bbd9-59fdd8024236-kube-api-access-g868g\") pod \"ce0e86d1-5b0d-491f-bbd9-59fdd8024236\" (UID: \"ce0e86d1-5b0d-491f-bbd9-59fdd8024236\") " Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.019823 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ce0e86d1-5b0d-491f-bbd9-59fdd8024236-config\") pod \"ce0e86d1-5b0d-491f-bbd9-59fdd8024236\" (UID: \"ce0e86d1-5b0d-491f-bbd9-59fdd8024236\") " Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.019972 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce0e86d1-5b0d-491f-bbd9-59fdd8024236-combined-ca-bundle\") pod \"ce0e86d1-5b0d-491f-bbd9-59fdd8024236\" (UID: \"ce0e86d1-5b0d-491f-bbd9-59fdd8024236\") " Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.025669 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce0e86d1-5b0d-491f-bbd9-59fdd8024236-kube-api-access-g868g" (OuterVolumeSpecName: "kube-api-access-g868g") pod "ce0e86d1-5b0d-491f-bbd9-59fdd8024236" (UID: "ce0e86d1-5b0d-491f-bbd9-59fdd8024236"). InnerVolumeSpecName "kube-api-access-g868g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.080501 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce0e86d1-5b0d-491f-bbd9-59fdd8024236-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce0e86d1-5b0d-491f-bbd9-59fdd8024236" (UID: "ce0e86d1-5b0d-491f-bbd9-59fdd8024236"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.097616 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce0e86d1-5b0d-491f-bbd9-59fdd8024236-config" (OuterVolumeSpecName: "config") pod "ce0e86d1-5b0d-491f-bbd9-59fdd8024236" (UID: "ce0e86d1-5b0d-491f-bbd9-59fdd8024236"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.122742 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g868g\" (UniqueName: \"kubernetes.io/projected/ce0e86d1-5b0d-491f-bbd9-59fdd8024236-kube-api-access-g868g\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.123255 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ce0e86d1-5b0d-491f-bbd9-59fdd8024236-config\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.123275 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce0e86d1-5b0d-491f-bbd9-59fdd8024236-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.134710 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3011e2a4-8b30-4770-9008-ba269097abfd","Type":"ContainerStarted","Data":"f291696fd5354c722891e1cfc989cf8c9f2e5b135ebc85075447c078350582be"} Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.137164 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0862f63-fd04-41f5-88a2-638189435867","Type":"ContainerStarted","Data":"e27c06202d1e8962a583ae240525b20457246cdcf4407b7147b58552bfbf1108"} Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.138937 4825 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d6f252a1-68f2-4f9a-ade7-0b979581b8c6","Type":"ContainerStarted","Data":"7ece94efb23a5c9ce51f2fc5f49bcfaf81a9ad61d4de7ab5ac8289cd5f6303ba"} Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.138962 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d6f252a1-68f2-4f9a-ade7-0b979581b8c6","Type":"ContainerStarted","Data":"65812fc5fcac830b784d74cae3b58dafd700fcc4d4623a7b4f5a9fd5e827bda9"} Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.146470 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-qlkjv" event={"ID":"ce0e86d1-5b0d-491f-bbd9-59fdd8024236","Type":"ContainerDied","Data":"801097cc7df21da7704fabf68817db4f65ee4b352a35ff8f2dde518c9ebccf2d"} Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.146517 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="801097cc7df21da7704fabf68817db4f65ee4b352a35ff8f2dde518c9ebccf2d" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.146690 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-qlkjv" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.164307 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.164286556 podStartE2EDuration="3.164286556s" podCreationTimestamp="2026-03-10 07:05:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:05:34.162657633 +0000 UTC m=+1287.192438248" watchObservedRunningTime="2026-03-10 07:05:34.164286556 +0000 UTC m=+1287.194067171" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.335772 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7859c7799c-tw65b"] Mar 10 07:05:34 crc kubenswrapper[4825]: E0310 07:05:34.336344 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43dacc02-935b-4836-8863-e175536b0cd2" containerName="dnsmasq-dns" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.336358 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="43dacc02-935b-4836-8863-e175536b0cd2" containerName="dnsmasq-dns" Mar 10 07:05:34 crc kubenswrapper[4825]: E0310 07:05:34.336367 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce0e86d1-5b0d-491f-bbd9-59fdd8024236" containerName="neutron-db-sync" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.336374 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce0e86d1-5b0d-491f-bbd9-59fdd8024236" containerName="neutron-db-sync" Mar 10 07:05:34 crc kubenswrapper[4825]: E0310 07:05:34.336402 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43dacc02-935b-4836-8863-e175536b0cd2" containerName="init" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.336409 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="43dacc02-935b-4836-8863-e175536b0cd2" containerName="init" Mar 10 07:05:34 crc kubenswrapper[4825]: 
I0310 07:05:34.336610 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="43dacc02-935b-4836-8863-e175536b0cd2" containerName="dnsmasq-dns" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.336626 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce0e86d1-5b0d-491f-bbd9-59fdd8024236" containerName="neutron-db-sync" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.337705 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7859c7799c-tw65b" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.345885 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7859c7799c-tw65b"] Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.361087 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9xwc\" (UniqueName: \"kubernetes.io/projected/824f93b4-983c-404b-a380-af829a45e941-kube-api-access-w9xwc\") pod \"dnsmasq-dns-7859c7799c-tw65b\" (UID: \"824f93b4-983c-404b-a380-af829a45e941\") " pod="openstack/dnsmasq-dns-7859c7799c-tw65b" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.361165 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/824f93b4-983c-404b-a380-af829a45e941-ovsdbserver-sb\") pod \"dnsmasq-dns-7859c7799c-tw65b\" (UID: \"824f93b4-983c-404b-a380-af829a45e941\") " pod="openstack/dnsmasq-dns-7859c7799c-tw65b" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.361209 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/824f93b4-983c-404b-a380-af829a45e941-dns-svc\") pod \"dnsmasq-dns-7859c7799c-tw65b\" (UID: \"824f93b4-983c-404b-a380-af829a45e941\") " pod="openstack/dnsmasq-dns-7859c7799c-tw65b" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.361280 
4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/824f93b4-983c-404b-a380-af829a45e941-dns-swift-storage-0\") pod \"dnsmasq-dns-7859c7799c-tw65b\" (UID: \"824f93b4-983c-404b-a380-af829a45e941\") " pod="openstack/dnsmasq-dns-7859c7799c-tw65b" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.361308 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/824f93b4-983c-404b-a380-af829a45e941-ovsdbserver-nb\") pod \"dnsmasq-dns-7859c7799c-tw65b\" (UID: \"824f93b4-983c-404b-a380-af829a45e941\") " pod="openstack/dnsmasq-dns-7859c7799c-tw65b" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.361335 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/824f93b4-983c-404b-a380-af829a45e941-config\") pod \"dnsmasq-dns-7859c7799c-tw65b\" (UID: \"824f93b4-983c-404b-a380-af829a45e941\") " pod="openstack/dnsmasq-dns-7859c7799c-tw65b" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.439389 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-75c968ccd4-ln762"] Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.442510 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-75c968ccd4-ln762" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.450594 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.453988 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.456856 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-zsh45" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.458913 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-75c968ccd4-ln762"] Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.459314 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.479011 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/824f93b4-983c-404b-a380-af829a45e941-dns-swift-storage-0\") pod \"dnsmasq-dns-7859c7799c-tw65b\" (UID: \"824f93b4-983c-404b-a380-af829a45e941\") " pod="openstack/dnsmasq-dns-7859c7799c-tw65b" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.479094 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/824f93b4-983c-404b-a380-af829a45e941-ovsdbserver-nb\") pod \"dnsmasq-dns-7859c7799c-tw65b\" (UID: \"824f93b4-983c-404b-a380-af829a45e941\") " pod="openstack/dnsmasq-dns-7859c7799c-tw65b" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.479178 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/824f93b4-983c-404b-a380-af829a45e941-config\") pod \"dnsmasq-dns-7859c7799c-tw65b\" (UID: 
\"824f93b4-983c-404b-a380-af829a45e941\") " pod="openstack/dnsmasq-dns-7859c7799c-tw65b" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.479229 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48fa8e66-5488-44b6-aa54-910c521e0060-combined-ca-bundle\") pod \"neutron-75c968ccd4-ln762\" (UID: \"48fa8e66-5488-44b6-aa54-910c521e0060\") " pod="openstack/neutron-75c968ccd4-ln762" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.479433 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/48fa8e66-5488-44b6-aa54-910c521e0060-ovndb-tls-certs\") pod \"neutron-75c968ccd4-ln762\" (UID: \"48fa8e66-5488-44b6-aa54-910c521e0060\") " pod="openstack/neutron-75c968ccd4-ln762" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.479490 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9xwc\" (UniqueName: \"kubernetes.io/projected/824f93b4-983c-404b-a380-af829a45e941-kube-api-access-w9xwc\") pod \"dnsmasq-dns-7859c7799c-tw65b\" (UID: \"824f93b4-983c-404b-a380-af829a45e941\") " pod="openstack/dnsmasq-dns-7859c7799c-tw65b" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.479555 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/824f93b4-983c-404b-a380-af829a45e941-ovsdbserver-sb\") pod \"dnsmasq-dns-7859c7799c-tw65b\" (UID: \"824f93b4-983c-404b-a380-af829a45e941\") " pod="openstack/dnsmasq-dns-7859c7799c-tw65b" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.479589 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm8wk\" (UniqueName: \"kubernetes.io/projected/48fa8e66-5488-44b6-aa54-910c521e0060-kube-api-access-mm8wk\") pod 
\"neutron-75c968ccd4-ln762\" (UID: \"48fa8e66-5488-44b6-aa54-910c521e0060\") " pod="openstack/neutron-75c968ccd4-ln762" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.479652 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/48fa8e66-5488-44b6-aa54-910c521e0060-httpd-config\") pod \"neutron-75c968ccd4-ln762\" (UID: \"48fa8e66-5488-44b6-aa54-910c521e0060\") " pod="openstack/neutron-75c968ccd4-ln762" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.479681 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/824f93b4-983c-404b-a380-af829a45e941-dns-svc\") pod \"dnsmasq-dns-7859c7799c-tw65b\" (UID: \"824f93b4-983c-404b-a380-af829a45e941\") " pod="openstack/dnsmasq-dns-7859c7799c-tw65b" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.479727 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/48fa8e66-5488-44b6-aa54-910c521e0060-config\") pod \"neutron-75c968ccd4-ln762\" (UID: \"48fa8e66-5488-44b6-aa54-910c521e0060\") " pod="openstack/neutron-75c968ccd4-ln762" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.483915 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/824f93b4-983c-404b-a380-af829a45e941-dns-swift-storage-0\") pod \"dnsmasq-dns-7859c7799c-tw65b\" (UID: \"824f93b4-983c-404b-a380-af829a45e941\") " pod="openstack/dnsmasq-dns-7859c7799c-tw65b" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.485582 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/824f93b4-983c-404b-a380-af829a45e941-config\") pod \"dnsmasq-dns-7859c7799c-tw65b\" (UID: \"824f93b4-983c-404b-a380-af829a45e941\") " 
pod="openstack/dnsmasq-dns-7859c7799c-tw65b" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.486227 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/824f93b4-983c-404b-a380-af829a45e941-dns-svc\") pod \"dnsmasq-dns-7859c7799c-tw65b\" (UID: \"824f93b4-983c-404b-a380-af829a45e941\") " pod="openstack/dnsmasq-dns-7859c7799c-tw65b" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.486507 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/824f93b4-983c-404b-a380-af829a45e941-ovsdbserver-sb\") pod \"dnsmasq-dns-7859c7799c-tw65b\" (UID: \"824f93b4-983c-404b-a380-af829a45e941\") " pod="openstack/dnsmasq-dns-7859c7799c-tw65b" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.490719 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/824f93b4-983c-404b-a380-af829a45e941-ovsdbserver-nb\") pod \"dnsmasq-dns-7859c7799c-tw65b\" (UID: \"824f93b4-983c-404b-a380-af829a45e941\") " pod="openstack/dnsmasq-dns-7859c7799c-tw65b" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.527392 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9xwc\" (UniqueName: \"kubernetes.io/projected/824f93b4-983c-404b-a380-af829a45e941-kube-api-access-w9xwc\") pod \"dnsmasq-dns-7859c7799c-tw65b\" (UID: \"824f93b4-983c-404b-a380-af829a45e941\") " pod="openstack/dnsmasq-dns-7859c7799c-tw65b" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.581076 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm8wk\" (UniqueName: \"kubernetes.io/projected/48fa8e66-5488-44b6-aa54-910c521e0060-kube-api-access-mm8wk\") pod \"neutron-75c968ccd4-ln762\" (UID: \"48fa8e66-5488-44b6-aa54-910c521e0060\") " pod="openstack/neutron-75c968ccd4-ln762" Mar 10 07:05:34 crc 
kubenswrapper[4825]: I0310 07:05:34.581504 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/48fa8e66-5488-44b6-aa54-910c521e0060-httpd-config\") pod \"neutron-75c968ccd4-ln762\" (UID: \"48fa8e66-5488-44b6-aa54-910c521e0060\") " pod="openstack/neutron-75c968ccd4-ln762" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.581546 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/48fa8e66-5488-44b6-aa54-910c521e0060-config\") pod \"neutron-75c968ccd4-ln762\" (UID: \"48fa8e66-5488-44b6-aa54-910c521e0060\") " pod="openstack/neutron-75c968ccd4-ln762" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.581617 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48fa8e66-5488-44b6-aa54-910c521e0060-combined-ca-bundle\") pod \"neutron-75c968ccd4-ln762\" (UID: \"48fa8e66-5488-44b6-aa54-910c521e0060\") " pod="openstack/neutron-75c968ccd4-ln762" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.581689 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/48fa8e66-5488-44b6-aa54-910c521e0060-ovndb-tls-certs\") pod \"neutron-75c968ccd4-ln762\" (UID: \"48fa8e66-5488-44b6-aa54-910c521e0060\") " pod="openstack/neutron-75c968ccd4-ln762" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.585649 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/48fa8e66-5488-44b6-aa54-910c521e0060-httpd-config\") pod \"neutron-75c968ccd4-ln762\" (UID: \"48fa8e66-5488-44b6-aa54-910c521e0060\") " pod="openstack/neutron-75c968ccd4-ln762" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.586717 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/48fa8e66-5488-44b6-aa54-910c521e0060-ovndb-tls-certs\") pod \"neutron-75c968ccd4-ln762\" (UID: \"48fa8e66-5488-44b6-aa54-910c521e0060\") " pod="openstack/neutron-75c968ccd4-ln762" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.590025 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48fa8e66-5488-44b6-aa54-910c521e0060-combined-ca-bundle\") pod \"neutron-75c968ccd4-ln762\" (UID: \"48fa8e66-5488-44b6-aa54-910c521e0060\") " pod="openstack/neutron-75c968ccd4-ln762" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.597269 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/48fa8e66-5488-44b6-aa54-910c521e0060-config\") pod \"neutron-75c968ccd4-ln762\" (UID: \"48fa8e66-5488-44b6-aa54-910c521e0060\") " pod="openstack/neutron-75c968ccd4-ln762" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.602896 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm8wk\" (UniqueName: \"kubernetes.io/projected/48fa8e66-5488-44b6-aa54-910c521e0060-kube-api-access-mm8wk\") pod \"neutron-75c968ccd4-ln762\" (UID: \"48fa8e66-5488-44b6-aa54-910c521e0060\") " pod="openstack/neutron-75c968ccd4-ln762" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.682714 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7859c7799c-tw65b" Mar 10 07:05:34 crc kubenswrapper[4825]: I0310 07:05:34.884602 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-75c968ccd4-ln762" Mar 10 07:05:35 crc kubenswrapper[4825]: I0310 07:05:35.172538 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3011e2a4-8b30-4770-9008-ba269097abfd","Type":"ContainerStarted","Data":"8d2883cad525f02553d47fdbbfc805d6ed15f1d5770b9ca52425168368349330"} Mar 10 07:05:35 crc kubenswrapper[4825]: I0310 07:05:35.279260 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7859c7799c-tw65b"] Mar 10 07:05:35 crc kubenswrapper[4825]: I0310 07:05:35.607717 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-75c968ccd4-ln762"] Mar 10 07:05:35 crc kubenswrapper[4825]: W0310 07:05:35.623989 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48fa8e66_5488_44b6_aa54_910c521e0060.slice/crio-7381bfb4553887c1b9ec30f844dcffc1a5b290a510e400267251853793044422 WatchSource:0}: Error finding container 7381bfb4553887c1b9ec30f844dcffc1a5b290a510e400267251853793044422: Status 404 returned error can't find the container with id 7381bfb4553887c1b9ec30f844dcffc1a5b290a510e400267251853793044422 Mar 10 07:05:36 crc kubenswrapper[4825]: I0310 07:05:36.192226 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75c968ccd4-ln762" event={"ID":"48fa8e66-5488-44b6-aa54-910c521e0060","Type":"ContainerStarted","Data":"856f4ec0ebd37a9f123f95905685e8420cf83491aa030fcabc9f1867c6e6162a"} Mar 10 07:05:36 crc kubenswrapper[4825]: I0310 07:05:36.192847 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75c968ccd4-ln762" event={"ID":"48fa8e66-5488-44b6-aa54-910c521e0060","Type":"ContainerStarted","Data":"7381bfb4553887c1b9ec30f844dcffc1a5b290a510e400267251853793044422"} Mar 10 07:05:36 crc kubenswrapper[4825]: I0310 07:05:36.194833 4825 generic.go:334] "Generic (PLEG): container finished" 
podID="a762be8c-156d-453d-bcd6-ae8571d9133f" containerID="8522014e2b89450213c7c7bf32a784e2ab3f811e78e3dd033eb199f51fcc0e91" exitCode=0 Mar 10 07:05:36 crc kubenswrapper[4825]: I0310 07:05:36.194905 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-b6jlj" event={"ID":"a762be8c-156d-453d-bcd6-ae8571d9133f","Type":"ContainerDied","Data":"8522014e2b89450213c7c7bf32a784e2ab3f811e78e3dd033eb199f51fcc0e91"} Mar 10 07:05:36 crc kubenswrapper[4825]: I0310 07:05:36.196792 4825 generic.go:334] "Generic (PLEG): container finished" podID="824f93b4-983c-404b-a380-af829a45e941" containerID="e14e57419fe4e69bc99826f3d6d2b81b839fda8522de9020183047a134388d7a" exitCode=0 Mar 10 07:05:36 crc kubenswrapper[4825]: I0310 07:05:36.196883 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7859c7799c-tw65b" event={"ID":"824f93b4-983c-404b-a380-af829a45e941","Type":"ContainerDied","Data":"e14e57419fe4e69bc99826f3d6d2b81b839fda8522de9020183047a134388d7a"} Mar 10 07:05:36 crc kubenswrapper[4825]: I0310 07:05:36.196939 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7859c7799c-tw65b" event={"ID":"824f93b4-983c-404b-a380-af829a45e941","Type":"ContainerStarted","Data":"081070501e956a05b39e21ad7fc091fa1a31aec2ecc5e299e6a4645a43a82e2d"} Mar 10 07:05:36 crc kubenswrapper[4825]: I0310 07:05:36.200639 4825 generic.go:334] "Generic (PLEG): container finished" podID="5f59ca51-fd91-41a3-ad9c-6b04ccd93288" containerID="33eb4c1cd9c309922b3b82212e200be97e525a98f1adfa545e20e9f5febfc35a" exitCode=0 Mar 10 07:05:36 crc kubenswrapper[4825]: I0310 07:05:36.200735 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-qd4jq" event={"ID":"5f59ca51-fd91-41a3-ad9c-6b04ccd93288","Type":"ContainerDied","Data":"33eb4c1cd9c309922b3b82212e200be97e525a98f1adfa545e20e9f5febfc35a"} Mar 10 07:05:36 crc kubenswrapper[4825]: I0310 07:05:36.206151 4825 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/glance-default-external-api-0" event={"ID":"3011e2a4-8b30-4770-9008-ba269097abfd","Type":"ContainerStarted","Data":"fb587c895e312cd855a14b0eceb839242e93bb126433f037ed201fac8c44bb35"} Mar 10 07:05:36 crc kubenswrapper[4825]: I0310 07:05:36.286032 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.286003995 podStartE2EDuration="5.286003995s" podCreationTimestamp="2026-03-10 07:05:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:05:36.264484965 +0000 UTC m=+1289.294265600" watchObservedRunningTime="2026-03-10 07:05:36.286003995 +0000 UTC m=+1289.315784610" Mar 10 07:05:36 crc kubenswrapper[4825]: I0310 07:05:36.369251 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-675f7dd995-r9bqz" podUID="43dacc02-935b-4836-8863-e175536b0cd2" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.119:5353: i/o timeout" Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.236393 4825 generic.go:334] "Generic (PLEG): container finished" podID="0ff86ab8-a122-4256-8529-12265e6177e4" containerID="314c7056d3c2823b0ecdeb62329c9c412680ceadc703f4aa197bcd466ad43054" exitCode=0 Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.236763 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-t7ktm" event={"ID":"0ff86ab8-a122-4256-8529-12265e6177e4","Type":"ContainerDied","Data":"314c7056d3c2823b0ecdeb62329c9c412680ceadc703f4aa197bcd466ad43054"} Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.272933 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75c968ccd4-ln762" event={"ID":"48fa8e66-5488-44b6-aa54-910c521e0060","Type":"ContainerStarted","Data":"efe3b17926b62fe634f669ee81be39143fab352af89c32bf35e4700fe1395613"} Mar 10 07:05:37 crc kubenswrapper[4825]: 
I0310 07:05:37.272976 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7859c7799c-tw65b" event={"ID":"824f93b4-983c-404b-a380-af829a45e941","Type":"ContainerStarted","Data":"ef0d89d3e073acf2bf9e3ba9146d48e2fb04787cc89c60552a89f784cc317afc"} Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.272996 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-75c968ccd4-ln762" Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.273006 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7859c7799c-tw65b" Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.277486 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7454bbb9bc-7l262"] Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.279061 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7454bbb9bc-7l262" Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.280794 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.281559 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.324637 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7454bbb9bc-7l262"] Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.326290 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-75c968ccd4-ln762" podStartSLOduration=3.326278378 podStartE2EDuration="3.326278378s" podCreationTimestamp="2026-03-10 07:05:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:05:37.284391036 +0000 UTC m=+1290.314171671" watchObservedRunningTime="2026-03-10 07:05:37.326278378 
+0000 UTC m=+1290.356058993" Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.340395 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7859c7799c-tw65b" podStartSLOduration=3.340376925 podStartE2EDuration="3.340376925s" podCreationTimestamp="2026-03-10 07:05:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:05:37.317311104 +0000 UTC m=+1290.347091719" watchObservedRunningTime="2026-03-10 07:05:37.340376925 +0000 UTC m=+1290.370157540" Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.375086 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45xdg\" (UniqueName: \"kubernetes.io/projected/082f6f7c-700c-41f1-a10e-58223570c18c-kube-api-access-45xdg\") pod \"neutron-7454bbb9bc-7l262\" (UID: \"082f6f7c-700c-41f1-a10e-58223570c18c\") " pod="openstack/neutron-7454bbb9bc-7l262" Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.375121 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/082f6f7c-700c-41f1-a10e-58223570c18c-internal-tls-certs\") pod \"neutron-7454bbb9bc-7l262\" (UID: \"082f6f7c-700c-41f1-a10e-58223570c18c\") " pod="openstack/neutron-7454bbb9bc-7l262" Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.375248 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/082f6f7c-700c-41f1-a10e-58223570c18c-config\") pod \"neutron-7454bbb9bc-7l262\" (UID: \"082f6f7c-700c-41f1-a10e-58223570c18c\") " pod="openstack/neutron-7454bbb9bc-7l262" Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.375304 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/082f6f7c-700c-41f1-a10e-58223570c18c-public-tls-certs\") pod \"neutron-7454bbb9bc-7l262\" (UID: \"082f6f7c-700c-41f1-a10e-58223570c18c\") " pod="openstack/neutron-7454bbb9bc-7l262" Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.375389 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/082f6f7c-700c-41f1-a10e-58223570c18c-httpd-config\") pod \"neutron-7454bbb9bc-7l262\" (UID: \"082f6f7c-700c-41f1-a10e-58223570c18c\") " pod="openstack/neutron-7454bbb9bc-7l262" Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.375421 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/082f6f7c-700c-41f1-a10e-58223570c18c-combined-ca-bundle\") pod \"neutron-7454bbb9bc-7l262\" (UID: \"082f6f7c-700c-41f1-a10e-58223570c18c\") " pod="openstack/neutron-7454bbb9bc-7l262" Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.375466 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/082f6f7c-700c-41f1-a10e-58223570c18c-ovndb-tls-certs\") pod \"neutron-7454bbb9bc-7l262\" (UID: \"082f6f7c-700c-41f1-a10e-58223570c18c\") " pod="openstack/neutron-7454bbb9bc-7l262" Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.477099 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/082f6f7c-700c-41f1-a10e-58223570c18c-internal-tls-certs\") pod \"neutron-7454bbb9bc-7l262\" (UID: \"082f6f7c-700c-41f1-a10e-58223570c18c\") " pod="openstack/neutron-7454bbb9bc-7l262" Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.477474 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/082f6f7c-700c-41f1-a10e-58223570c18c-config\") pod \"neutron-7454bbb9bc-7l262\" (UID: \"082f6f7c-700c-41f1-a10e-58223570c18c\") " pod="openstack/neutron-7454bbb9bc-7l262" Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.477511 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/082f6f7c-700c-41f1-a10e-58223570c18c-public-tls-certs\") pod \"neutron-7454bbb9bc-7l262\" (UID: \"082f6f7c-700c-41f1-a10e-58223570c18c\") " pod="openstack/neutron-7454bbb9bc-7l262" Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.477571 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/082f6f7c-700c-41f1-a10e-58223570c18c-httpd-config\") pod \"neutron-7454bbb9bc-7l262\" (UID: \"082f6f7c-700c-41f1-a10e-58223570c18c\") " pod="openstack/neutron-7454bbb9bc-7l262" Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.477610 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/082f6f7c-700c-41f1-a10e-58223570c18c-combined-ca-bundle\") pod \"neutron-7454bbb9bc-7l262\" (UID: \"082f6f7c-700c-41f1-a10e-58223570c18c\") " pod="openstack/neutron-7454bbb9bc-7l262" Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.477652 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/082f6f7c-700c-41f1-a10e-58223570c18c-ovndb-tls-certs\") pod \"neutron-7454bbb9bc-7l262\" (UID: \"082f6f7c-700c-41f1-a10e-58223570c18c\") " pod="openstack/neutron-7454bbb9bc-7l262" Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.477736 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45xdg\" (UniqueName: \"kubernetes.io/projected/082f6f7c-700c-41f1-a10e-58223570c18c-kube-api-access-45xdg\") pod 
\"neutron-7454bbb9bc-7l262\" (UID: \"082f6f7c-700c-41f1-a10e-58223570c18c\") " pod="openstack/neutron-7454bbb9bc-7l262" Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.484666 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/082f6f7c-700c-41f1-a10e-58223570c18c-combined-ca-bundle\") pod \"neutron-7454bbb9bc-7l262\" (UID: \"082f6f7c-700c-41f1-a10e-58223570c18c\") " pod="openstack/neutron-7454bbb9bc-7l262" Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.485082 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/082f6f7c-700c-41f1-a10e-58223570c18c-internal-tls-certs\") pod \"neutron-7454bbb9bc-7l262\" (UID: \"082f6f7c-700c-41f1-a10e-58223570c18c\") " pod="openstack/neutron-7454bbb9bc-7l262" Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.503505 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/082f6f7c-700c-41f1-a10e-58223570c18c-config\") pod \"neutron-7454bbb9bc-7l262\" (UID: \"082f6f7c-700c-41f1-a10e-58223570c18c\") " pod="openstack/neutron-7454bbb9bc-7l262" Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.504047 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/082f6f7c-700c-41f1-a10e-58223570c18c-ovndb-tls-certs\") pod \"neutron-7454bbb9bc-7l262\" (UID: \"082f6f7c-700c-41f1-a10e-58223570c18c\") " pod="openstack/neutron-7454bbb9bc-7l262" Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.506442 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45xdg\" (UniqueName: \"kubernetes.io/projected/082f6f7c-700c-41f1-a10e-58223570c18c-kube-api-access-45xdg\") pod \"neutron-7454bbb9bc-7l262\" (UID: \"082f6f7c-700c-41f1-a10e-58223570c18c\") " pod="openstack/neutron-7454bbb9bc-7l262" Mar 10 
07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.507896 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/082f6f7c-700c-41f1-a10e-58223570c18c-httpd-config\") pod \"neutron-7454bbb9bc-7l262\" (UID: \"082f6f7c-700c-41f1-a10e-58223570c18c\") " pod="openstack/neutron-7454bbb9bc-7l262" Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.520048 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/082f6f7c-700c-41f1-a10e-58223570c18c-public-tls-certs\") pod \"neutron-7454bbb9bc-7l262\" (UID: \"082f6f7c-700c-41f1-a10e-58223570c18c\") " pod="openstack/neutron-7454bbb9bc-7l262" Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.616223 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7454bbb9bc-7l262" Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.799053 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-qd4jq" Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.803015 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-b6jlj" Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.900267 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a762be8c-156d-453d-bcd6-ae8571d9133f-scripts\") pod \"a762be8c-156d-453d-bcd6-ae8571d9133f\" (UID: \"a762be8c-156d-453d-bcd6-ae8571d9133f\") " Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.900380 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f59ca51-fd91-41a3-ad9c-6b04ccd93288-combined-ca-bundle\") pod \"5f59ca51-fd91-41a3-ad9c-6b04ccd93288\" (UID: \"5f59ca51-fd91-41a3-ad9c-6b04ccd93288\") " Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.900416 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a762be8c-156d-453d-bcd6-ae8571d9133f-config-data\") pod \"a762be8c-156d-453d-bcd6-ae8571d9133f\" (UID: \"a762be8c-156d-453d-bcd6-ae8571d9133f\") " Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.900473 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a762be8c-156d-453d-bcd6-ae8571d9133f-logs\") pod \"a762be8c-156d-453d-bcd6-ae8571d9133f\" (UID: \"a762be8c-156d-453d-bcd6-ae8571d9133f\") " Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.900529 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5d2ws\" (UniqueName: \"kubernetes.io/projected/a762be8c-156d-453d-bcd6-ae8571d9133f-kube-api-access-5d2ws\") pod \"a762be8c-156d-453d-bcd6-ae8571d9133f\" (UID: \"a762be8c-156d-453d-bcd6-ae8571d9133f\") " Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.900630 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmmg7\" (UniqueName: 
\"kubernetes.io/projected/5f59ca51-fd91-41a3-ad9c-6b04ccd93288-kube-api-access-rmmg7\") pod \"5f59ca51-fd91-41a3-ad9c-6b04ccd93288\" (UID: \"5f59ca51-fd91-41a3-ad9c-6b04ccd93288\") " Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.900699 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5f59ca51-fd91-41a3-ad9c-6b04ccd93288-db-sync-config-data\") pod \"5f59ca51-fd91-41a3-ad9c-6b04ccd93288\" (UID: \"5f59ca51-fd91-41a3-ad9c-6b04ccd93288\") " Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.900810 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a762be8c-156d-453d-bcd6-ae8571d9133f-combined-ca-bundle\") pod \"a762be8c-156d-453d-bcd6-ae8571d9133f\" (UID: \"a762be8c-156d-453d-bcd6-ae8571d9133f\") " Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.902175 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a762be8c-156d-453d-bcd6-ae8571d9133f-logs" (OuterVolumeSpecName: "logs") pod "a762be8c-156d-453d-bcd6-ae8571d9133f" (UID: "a762be8c-156d-453d-bcd6-ae8571d9133f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.906364 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a762be8c-156d-453d-bcd6-ae8571d9133f-scripts" (OuterVolumeSpecName: "scripts") pod "a762be8c-156d-453d-bcd6-ae8571d9133f" (UID: "a762be8c-156d-453d-bcd6-ae8571d9133f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.908922 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a762be8c-156d-453d-bcd6-ae8571d9133f-kube-api-access-5d2ws" (OuterVolumeSpecName: "kube-api-access-5d2ws") pod "a762be8c-156d-453d-bcd6-ae8571d9133f" (UID: "a762be8c-156d-453d-bcd6-ae8571d9133f"). InnerVolumeSpecName "kube-api-access-5d2ws". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.911036 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f59ca51-fd91-41a3-ad9c-6b04ccd93288-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "5f59ca51-fd91-41a3-ad9c-6b04ccd93288" (UID: "5f59ca51-fd91-41a3-ad9c-6b04ccd93288"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.911324 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f59ca51-fd91-41a3-ad9c-6b04ccd93288-kube-api-access-rmmg7" (OuterVolumeSpecName: "kube-api-access-rmmg7") pod "5f59ca51-fd91-41a3-ad9c-6b04ccd93288" (UID: "5f59ca51-fd91-41a3-ad9c-6b04ccd93288"). InnerVolumeSpecName "kube-api-access-rmmg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.927311 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f59ca51-fd91-41a3-ad9c-6b04ccd93288-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5f59ca51-fd91-41a3-ad9c-6b04ccd93288" (UID: "5f59ca51-fd91-41a3-ad9c-6b04ccd93288"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.937573 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a762be8c-156d-453d-bcd6-ae8571d9133f-config-data" (OuterVolumeSpecName: "config-data") pod "a762be8c-156d-453d-bcd6-ae8571d9133f" (UID: "a762be8c-156d-453d-bcd6-ae8571d9133f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:05:37 crc kubenswrapper[4825]: I0310 07:05:37.943034 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a762be8c-156d-453d-bcd6-ae8571d9133f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a762be8c-156d-453d-bcd6-ae8571d9133f" (UID: "a762be8c-156d-453d-bcd6-ae8571d9133f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.004306 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a762be8c-156d-453d-bcd6-ae8571d9133f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.004378 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a762be8c-156d-453d-bcd6-ae8571d9133f-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.004394 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f59ca51-fd91-41a3-ad9c-6b04ccd93288-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.004409 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a762be8c-156d-453d-bcd6-ae8571d9133f-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:38 
crc kubenswrapper[4825]: I0310 07:05:38.004423 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a762be8c-156d-453d-bcd6-ae8571d9133f-logs\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.004439 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5d2ws\" (UniqueName: \"kubernetes.io/projected/a762be8c-156d-453d-bcd6-ae8571d9133f-kube-api-access-5d2ws\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.004459 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmmg7\" (UniqueName: \"kubernetes.io/projected/5f59ca51-fd91-41a3-ad9c-6b04ccd93288-kube-api-access-rmmg7\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.004476 4825 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5f59ca51-fd91-41a3-ad9c-6b04ccd93288-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.269568 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-b6jlj" event={"ID":"a762be8c-156d-453d-bcd6-ae8571d9133f","Type":"ContainerDied","Data":"ee6bc960daabb8e76e322319ef0f1c19d731cfed1b246df08e79d019a7ca5a71"} Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.269610 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee6bc960daabb8e76e322319ef0f1c19d731cfed1b246df08e79d019a7ca5a71" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.279495 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-b6jlj" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.298243 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-qd4jq" event={"ID":"5f59ca51-fd91-41a3-ad9c-6b04ccd93288","Type":"ContainerDied","Data":"3625fb732ac4b285e45c6be2d531d6cc9d16cbef14c46b6dff18ccf0115116e7"} Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.298302 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3625fb732ac4b285e45c6be2d531d6cc9d16cbef14c46b6dff18ccf0115116e7" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.298438 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-qd4jq" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.318517 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7454bbb9bc-7l262"] Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.404553 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6b684b6dd6-r5vgn"] Mar 10 07:05:38 crc kubenswrapper[4825]: E0310 07:05:38.404925 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f59ca51-fd91-41a3-ad9c-6b04ccd93288" containerName="barbican-db-sync" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.404941 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f59ca51-fd91-41a3-ad9c-6b04ccd93288" containerName="barbican-db-sync" Mar 10 07:05:38 crc kubenswrapper[4825]: E0310 07:05:38.404952 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a762be8c-156d-453d-bcd6-ae8571d9133f" containerName="placement-db-sync" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.404959 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a762be8c-156d-453d-bcd6-ae8571d9133f" containerName="placement-db-sync" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.405121 4825 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="5f59ca51-fd91-41a3-ad9c-6b04ccd93288" containerName="barbican-db-sync" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.405165 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="a762be8c-156d-453d-bcd6-ae8571d9133f" containerName="placement-db-sync" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.406042 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6b684b6dd6-r5vgn" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.412287 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.412489 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-hvqft" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.412502 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.412556 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.412502 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.419553 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6b684b6dd6-r5vgn"] Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.487204 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-b6df5fb85-rp95h"] Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.489271 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-b6df5fb85-rp95h" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.493232 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-qkjz5" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.493515 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.508316 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.510665 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-8646874786-6nnq4"] Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.513061 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d23be6c-5ecc-495c-b45c-46b79ccbdcf5-combined-ca-bundle\") pod \"placement-6b684b6dd6-r5vgn\" (UID: \"2d23be6c-5ecc-495c-b45c-46b79ccbdcf5\") " pod="openstack/placement-6b684b6dd6-r5vgn" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.513090 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d23be6c-5ecc-495c-b45c-46b79ccbdcf5-config-data\") pod \"placement-6b684b6dd6-r5vgn\" (UID: \"2d23be6c-5ecc-495c-b45c-46b79ccbdcf5\") " pod="openstack/placement-6b684b6dd6-r5vgn" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.513111 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d23be6c-5ecc-495c-b45c-46b79ccbdcf5-internal-tls-certs\") pod \"placement-6b684b6dd6-r5vgn\" (UID: \"2d23be6c-5ecc-495c-b45c-46b79ccbdcf5\") " pod="openstack/placement-6b684b6dd6-r5vgn" Mar 10 07:05:38 crc 
kubenswrapper[4825]: I0310 07:05:38.513200 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbsgr\" (UniqueName: \"kubernetes.io/projected/2d23be6c-5ecc-495c-b45c-46b79ccbdcf5-kube-api-access-xbsgr\") pod \"placement-6b684b6dd6-r5vgn\" (UID: \"2d23be6c-5ecc-495c-b45c-46b79ccbdcf5\") " pod="openstack/placement-6b684b6dd6-r5vgn" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.513218 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d23be6c-5ecc-495c-b45c-46b79ccbdcf5-public-tls-certs\") pod \"placement-6b684b6dd6-r5vgn\" (UID: \"2d23be6c-5ecc-495c-b45c-46b79ccbdcf5\") " pod="openstack/placement-6b684b6dd6-r5vgn" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.513239 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d23be6c-5ecc-495c-b45c-46b79ccbdcf5-scripts\") pod \"placement-6b684b6dd6-r5vgn\" (UID: \"2d23be6c-5ecc-495c-b45c-46b79ccbdcf5\") " pod="openstack/placement-6b684b6dd6-r5vgn" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.513266 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d23be6c-5ecc-495c-b45c-46b79ccbdcf5-logs\") pod \"placement-6b684b6dd6-r5vgn\" (UID: \"2d23be6c-5ecc-495c-b45c-46b79ccbdcf5\") " pod="openstack/placement-6b684b6dd6-r5vgn" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.513393 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-8646874786-6nnq4" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.516739 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.535845 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-b6df5fb85-rp95h"] Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.549280 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-8646874786-6nnq4"] Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.615773 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d23be6c-5ecc-495c-b45c-46b79ccbdcf5-scripts\") pod \"placement-6b684b6dd6-r5vgn\" (UID: \"2d23be6c-5ecc-495c-b45c-46b79ccbdcf5\") " pod="openstack/placement-6b684b6dd6-r5vgn" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.615833 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d23be6c-5ecc-495c-b45c-46b79ccbdcf5-logs\") pod \"placement-6b684b6dd6-r5vgn\" (UID: \"2d23be6c-5ecc-495c-b45c-46b79ccbdcf5\") " pod="openstack/placement-6b684b6dd6-r5vgn" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.615871 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psdl6\" (UniqueName: \"kubernetes.io/projected/6fd14279-d503-4cda-a6b4-14bfd6945596-kube-api-access-psdl6\") pod \"barbican-worker-b6df5fb85-rp95h\" (UID: \"6fd14279-d503-4cda-a6b4-14bfd6945596\") " pod="openstack/barbican-worker-b6df5fb85-rp95h" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.615902 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/beaf0874-59e2-4ec2-9425-4c11184a7e3d-logs\") pod \"barbican-keystone-listener-8646874786-6nnq4\" (UID: \"beaf0874-59e2-4ec2-9425-4c11184a7e3d\") " pod="openstack/barbican-keystone-listener-8646874786-6nnq4" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.615926 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d23be6c-5ecc-495c-b45c-46b79ccbdcf5-combined-ca-bundle\") pod \"placement-6b684b6dd6-r5vgn\" (UID: \"2d23be6c-5ecc-495c-b45c-46b79ccbdcf5\") " pod="openstack/placement-6b684b6dd6-r5vgn" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.615946 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d23be6c-5ecc-495c-b45c-46b79ccbdcf5-config-data\") pod \"placement-6b684b6dd6-r5vgn\" (UID: \"2d23be6c-5ecc-495c-b45c-46b79ccbdcf5\") " pod="openstack/placement-6b684b6dd6-r5vgn" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.615964 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d23be6c-5ecc-495c-b45c-46b79ccbdcf5-internal-tls-certs\") pod \"placement-6b684b6dd6-r5vgn\" (UID: \"2d23be6c-5ecc-495c-b45c-46b79ccbdcf5\") " pod="openstack/placement-6b684b6dd6-r5vgn" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.615980 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/beaf0874-59e2-4ec2-9425-4c11184a7e3d-config-data\") pod \"barbican-keystone-listener-8646874786-6nnq4\" (UID: \"beaf0874-59e2-4ec2-9425-4c11184a7e3d\") " pod="openstack/barbican-keystone-listener-8646874786-6nnq4" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.616005 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/6fd14279-d503-4cda-a6b4-14bfd6945596-config-data\") pod \"barbican-worker-b6df5fb85-rp95h\" (UID: \"6fd14279-d503-4cda-a6b4-14bfd6945596\") " pod="openstack/barbican-worker-b6df5fb85-rp95h" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.616027 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beaf0874-59e2-4ec2-9425-4c11184a7e3d-combined-ca-bundle\") pod \"barbican-keystone-listener-8646874786-6nnq4\" (UID: \"beaf0874-59e2-4ec2-9425-4c11184a7e3d\") " pod="openstack/barbican-keystone-listener-8646874786-6nnq4" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.616047 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6fd14279-d503-4cda-a6b4-14bfd6945596-config-data-custom\") pod \"barbican-worker-b6df5fb85-rp95h\" (UID: \"6fd14279-d503-4cda-a6b4-14bfd6945596\") " pod="openstack/barbican-worker-b6df5fb85-rp95h" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.616075 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6fd14279-d503-4cda-a6b4-14bfd6945596-logs\") pod \"barbican-worker-b6df5fb85-rp95h\" (UID: \"6fd14279-d503-4cda-a6b4-14bfd6945596\") " pod="openstack/barbican-worker-b6df5fb85-rp95h" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.616097 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8b56\" (UniqueName: \"kubernetes.io/projected/beaf0874-59e2-4ec2-9425-4c11184a7e3d-kube-api-access-n8b56\") pod \"barbican-keystone-listener-8646874786-6nnq4\" (UID: \"beaf0874-59e2-4ec2-9425-4c11184a7e3d\") " pod="openstack/barbican-keystone-listener-8646874786-6nnq4" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.616152 
4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fd14279-d503-4cda-a6b4-14bfd6945596-combined-ca-bundle\") pod \"barbican-worker-b6df5fb85-rp95h\" (UID: \"6fd14279-d503-4cda-a6b4-14bfd6945596\") " pod="openstack/barbican-worker-b6df5fb85-rp95h" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.616180 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbsgr\" (UniqueName: \"kubernetes.io/projected/2d23be6c-5ecc-495c-b45c-46b79ccbdcf5-kube-api-access-xbsgr\") pod \"placement-6b684b6dd6-r5vgn\" (UID: \"2d23be6c-5ecc-495c-b45c-46b79ccbdcf5\") " pod="openstack/placement-6b684b6dd6-r5vgn" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.616195 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/beaf0874-59e2-4ec2-9425-4c11184a7e3d-config-data-custom\") pod \"barbican-keystone-listener-8646874786-6nnq4\" (UID: \"beaf0874-59e2-4ec2-9425-4c11184a7e3d\") " pod="openstack/barbican-keystone-listener-8646874786-6nnq4" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.616216 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d23be6c-5ecc-495c-b45c-46b79ccbdcf5-public-tls-certs\") pod \"placement-6b684b6dd6-r5vgn\" (UID: \"2d23be6c-5ecc-495c-b45c-46b79ccbdcf5\") " pod="openstack/placement-6b684b6dd6-r5vgn" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.617565 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7859c7799c-tw65b"] Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.620850 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d23be6c-5ecc-495c-b45c-46b79ccbdcf5-logs\") pod 
\"placement-6b684b6dd6-r5vgn\" (UID: \"2d23be6c-5ecc-495c-b45c-46b79ccbdcf5\") " pod="openstack/placement-6b684b6dd6-r5vgn" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.646881 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d23be6c-5ecc-495c-b45c-46b79ccbdcf5-public-tls-certs\") pod \"placement-6b684b6dd6-r5vgn\" (UID: \"2d23be6c-5ecc-495c-b45c-46b79ccbdcf5\") " pod="openstack/placement-6b684b6dd6-r5vgn" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.655997 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d23be6c-5ecc-495c-b45c-46b79ccbdcf5-internal-tls-certs\") pod \"placement-6b684b6dd6-r5vgn\" (UID: \"2d23be6c-5ecc-495c-b45c-46b79ccbdcf5\") " pod="openstack/placement-6b684b6dd6-r5vgn" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.657216 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d23be6c-5ecc-495c-b45c-46b79ccbdcf5-config-data\") pod \"placement-6b684b6dd6-r5vgn\" (UID: \"2d23be6c-5ecc-495c-b45c-46b79ccbdcf5\") " pod="openstack/placement-6b684b6dd6-r5vgn" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.664083 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbsgr\" (UniqueName: \"kubernetes.io/projected/2d23be6c-5ecc-495c-b45c-46b79ccbdcf5-kube-api-access-xbsgr\") pod \"placement-6b684b6dd6-r5vgn\" (UID: \"2d23be6c-5ecc-495c-b45c-46b79ccbdcf5\") " pod="openstack/placement-6b684b6dd6-r5vgn" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.682791 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d23be6c-5ecc-495c-b45c-46b79ccbdcf5-combined-ca-bundle\") pod \"placement-6b684b6dd6-r5vgn\" (UID: \"2d23be6c-5ecc-495c-b45c-46b79ccbdcf5\") " 
pod="openstack/placement-6b684b6dd6-r5vgn" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.685244 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8449d68f4f-zldfp"] Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.686610 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8449d68f4f-zldfp" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.700415 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d23be6c-5ecc-495c-b45c-46b79ccbdcf5-scripts\") pod \"placement-6b684b6dd6-r5vgn\" (UID: \"2d23be6c-5ecc-495c-b45c-46b79ccbdcf5\") " pod="openstack/placement-6b684b6dd6-r5vgn" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.720248 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8b56\" (UniqueName: \"kubernetes.io/projected/beaf0874-59e2-4ec2-9425-4c11184a7e3d-kube-api-access-n8b56\") pod \"barbican-keystone-listener-8646874786-6nnq4\" (UID: \"beaf0874-59e2-4ec2-9425-4c11184a7e3d\") " pod="openstack/barbican-keystone-listener-8646874786-6nnq4" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.720332 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fd14279-d503-4cda-a6b4-14bfd6945596-combined-ca-bundle\") pod \"barbican-worker-b6df5fb85-rp95h\" (UID: \"6fd14279-d503-4cda-a6b4-14bfd6945596\") " pod="openstack/barbican-worker-b6df5fb85-rp95h" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.720360 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/beaf0874-59e2-4ec2-9425-4c11184a7e3d-config-data-custom\") pod \"barbican-keystone-listener-8646874786-6nnq4\" (UID: \"beaf0874-59e2-4ec2-9425-4c11184a7e3d\") " 
pod="openstack/barbican-keystone-listener-8646874786-6nnq4" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.720417 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psdl6\" (UniqueName: \"kubernetes.io/projected/6fd14279-d503-4cda-a6b4-14bfd6945596-kube-api-access-psdl6\") pod \"barbican-worker-b6df5fb85-rp95h\" (UID: \"6fd14279-d503-4cda-a6b4-14bfd6945596\") " pod="openstack/barbican-worker-b6df5fb85-rp95h" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.720442 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/beaf0874-59e2-4ec2-9425-4c11184a7e3d-logs\") pod \"barbican-keystone-listener-8646874786-6nnq4\" (UID: \"beaf0874-59e2-4ec2-9425-4c11184a7e3d\") " pod="openstack/barbican-keystone-listener-8646874786-6nnq4" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.720478 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/beaf0874-59e2-4ec2-9425-4c11184a7e3d-config-data\") pod \"barbican-keystone-listener-8646874786-6nnq4\" (UID: \"beaf0874-59e2-4ec2-9425-4c11184a7e3d\") " pod="openstack/barbican-keystone-listener-8646874786-6nnq4" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.720500 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fd14279-d503-4cda-a6b4-14bfd6945596-config-data\") pod \"barbican-worker-b6df5fb85-rp95h\" (UID: \"6fd14279-d503-4cda-a6b4-14bfd6945596\") " pod="openstack/barbican-worker-b6df5fb85-rp95h" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.720518 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beaf0874-59e2-4ec2-9425-4c11184a7e3d-combined-ca-bundle\") pod \"barbican-keystone-listener-8646874786-6nnq4\" (UID: 
\"beaf0874-59e2-4ec2-9425-4c11184a7e3d\") " pod="openstack/barbican-keystone-listener-8646874786-6nnq4" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.720536 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6fd14279-d503-4cda-a6b4-14bfd6945596-config-data-custom\") pod \"barbican-worker-b6df5fb85-rp95h\" (UID: \"6fd14279-d503-4cda-a6b4-14bfd6945596\") " pod="openstack/barbican-worker-b6df5fb85-rp95h" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.720564 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6fd14279-d503-4cda-a6b4-14bfd6945596-logs\") pod \"barbican-worker-b6df5fb85-rp95h\" (UID: \"6fd14279-d503-4cda-a6b4-14bfd6945596\") " pod="openstack/barbican-worker-b6df5fb85-rp95h" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.720927 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6fd14279-d503-4cda-a6b4-14bfd6945596-logs\") pod \"barbican-worker-b6df5fb85-rp95h\" (UID: \"6fd14279-d503-4cda-a6b4-14bfd6945596\") " pod="openstack/barbican-worker-b6df5fb85-rp95h" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.726499 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8449d68f4f-zldfp"] Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.726682 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6b684b6dd6-r5vgn" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.727759 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/beaf0874-59e2-4ec2-9425-4c11184a7e3d-logs\") pod \"barbican-keystone-listener-8646874786-6nnq4\" (UID: \"beaf0874-59e2-4ec2-9425-4c11184a7e3d\") " pod="openstack/barbican-keystone-listener-8646874786-6nnq4" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.729410 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fd14279-d503-4cda-a6b4-14bfd6945596-combined-ca-bundle\") pod \"barbican-worker-b6df5fb85-rp95h\" (UID: \"6fd14279-d503-4cda-a6b4-14bfd6945596\") " pod="openstack/barbican-worker-b6df5fb85-rp95h" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.741161 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fd14279-d503-4cda-a6b4-14bfd6945596-config-data\") pod \"barbican-worker-b6df5fb85-rp95h\" (UID: \"6fd14279-d503-4cda-a6b4-14bfd6945596\") " pod="openstack/barbican-worker-b6df5fb85-rp95h" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.747933 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6fd14279-d503-4cda-a6b4-14bfd6945596-config-data-custom\") pod \"barbican-worker-b6df5fb85-rp95h\" (UID: \"6fd14279-d503-4cda-a6b4-14bfd6945596\") " pod="openstack/barbican-worker-b6df5fb85-rp95h" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.748261 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/beaf0874-59e2-4ec2-9425-4c11184a7e3d-config-data\") pod \"barbican-keystone-listener-8646874786-6nnq4\" (UID: \"beaf0874-59e2-4ec2-9425-4c11184a7e3d\") " 
pod="openstack/barbican-keystone-listener-8646874786-6nnq4" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.751298 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/beaf0874-59e2-4ec2-9425-4c11184a7e3d-config-data-custom\") pod \"barbican-keystone-listener-8646874786-6nnq4\" (UID: \"beaf0874-59e2-4ec2-9425-4c11184a7e3d\") " pod="openstack/barbican-keystone-listener-8646874786-6nnq4" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.752950 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psdl6\" (UniqueName: \"kubernetes.io/projected/6fd14279-d503-4cda-a6b4-14bfd6945596-kube-api-access-psdl6\") pod \"barbican-worker-b6df5fb85-rp95h\" (UID: \"6fd14279-d503-4cda-a6b4-14bfd6945596\") " pod="openstack/barbican-worker-b6df5fb85-rp95h" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.754572 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beaf0874-59e2-4ec2-9425-4c11184a7e3d-combined-ca-bundle\") pod \"barbican-keystone-listener-8646874786-6nnq4\" (UID: \"beaf0874-59e2-4ec2-9425-4c11184a7e3d\") " pod="openstack/barbican-keystone-listener-8646874786-6nnq4" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.770712 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8b56\" (UniqueName: \"kubernetes.io/projected/beaf0874-59e2-4ec2-9425-4c11184a7e3d-kube-api-access-n8b56\") pod \"barbican-keystone-listener-8646874786-6nnq4\" (UID: \"beaf0874-59e2-4ec2-9425-4c11184a7e3d\") " pod="openstack/barbican-keystone-listener-8646874786-6nnq4" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.773020 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6d84cdb96-k2sbq"] Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.777692 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6d84cdb96-k2sbq" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.781404 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6d84cdb96-k2sbq"] Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.788736 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.824599 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b960a8d7-4706-4657-9991-84a87645ab8a-ovsdbserver-sb\") pod \"dnsmasq-dns-8449d68f4f-zldfp\" (UID: \"b960a8d7-4706-4657-9991-84a87645ab8a\") " pod="openstack/dnsmasq-dns-8449d68f4f-zldfp" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.824746 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b960a8d7-4706-4657-9991-84a87645ab8a-config\") pod \"dnsmasq-dns-8449d68f4f-zldfp\" (UID: \"b960a8d7-4706-4657-9991-84a87645ab8a\") " pod="openstack/dnsmasq-dns-8449d68f4f-zldfp" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.824788 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b960a8d7-4706-4657-9991-84a87645ab8a-dns-swift-storage-0\") pod \"dnsmasq-dns-8449d68f4f-zldfp\" (UID: \"b960a8d7-4706-4657-9991-84a87645ab8a\") " pod="openstack/dnsmasq-dns-8449d68f4f-zldfp" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.824834 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b960a8d7-4706-4657-9991-84a87645ab8a-dns-svc\") pod \"dnsmasq-dns-8449d68f4f-zldfp\" (UID: \"b960a8d7-4706-4657-9991-84a87645ab8a\") " 
pod="openstack/dnsmasq-dns-8449d68f4f-zldfp" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.824891 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b960a8d7-4706-4657-9991-84a87645ab8a-ovsdbserver-nb\") pod \"dnsmasq-dns-8449d68f4f-zldfp\" (UID: \"b960a8d7-4706-4657-9991-84a87645ab8a\") " pod="openstack/dnsmasq-dns-8449d68f4f-zldfp" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.824920 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwbc6\" (UniqueName: \"kubernetes.io/projected/b960a8d7-4706-4657-9991-84a87645ab8a-kube-api-access-mwbc6\") pod \"dnsmasq-dns-8449d68f4f-zldfp\" (UID: \"b960a8d7-4706-4657-9991-84a87645ab8a\") " pod="openstack/dnsmasq-dns-8449d68f4f-zldfp" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.858692 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-b6df5fb85-rp95h" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.869510 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-8646874786-6nnq4" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.926144 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/360e5ec8-ccd5-4e2e-8bea-f6773e7d2792-config-data-custom\") pod \"barbican-api-6d84cdb96-k2sbq\" (UID: \"360e5ec8-ccd5-4e2e-8bea-f6773e7d2792\") " pod="openstack/barbican-api-6d84cdb96-k2sbq" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.926523 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/360e5ec8-ccd5-4e2e-8bea-f6773e7d2792-combined-ca-bundle\") pod \"barbican-api-6d84cdb96-k2sbq\" (UID: \"360e5ec8-ccd5-4e2e-8bea-f6773e7d2792\") " pod="openstack/barbican-api-6d84cdb96-k2sbq" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.926562 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b960a8d7-4706-4657-9991-84a87645ab8a-ovsdbserver-nb\") pod \"dnsmasq-dns-8449d68f4f-zldfp\" (UID: \"b960a8d7-4706-4657-9991-84a87645ab8a\") " pod="openstack/dnsmasq-dns-8449d68f4f-zldfp" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.926587 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwbc6\" (UniqueName: \"kubernetes.io/projected/b960a8d7-4706-4657-9991-84a87645ab8a-kube-api-access-mwbc6\") pod \"dnsmasq-dns-8449d68f4f-zldfp\" (UID: \"b960a8d7-4706-4657-9991-84a87645ab8a\") " pod="openstack/dnsmasq-dns-8449d68f4f-zldfp" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.926636 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b960a8d7-4706-4657-9991-84a87645ab8a-ovsdbserver-sb\") pod \"dnsmasq-dns-8449d68f4f-zldfp\" 
(UID: \"b960a8d7-4706-4657-9991-84a87645ab8a\") " pod="openstack/dnsmasq-dns-8449d68f4f-zldfp" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.926663 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/360e5ec8-ccd5-4e2e-8bea-f6773e7d2792-logs\") pod \"barbican-api-6d84cdb96-k2sbq\" (UID: \"360e5ec8-ccd5-4e2e-8bea-f6773e7d2792\") " pod="openstack/barbican-api-6d84cdb96-k2sbq" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.926734 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b960a8d7-4706-4657-9991-84a87645ab8a-config\") pod \"dnsmasq-dns-8449d68f4f-zldfp\" (UID: \"b960a8d7-4706-4657-9991-84a87645ab8a\") " pod="openstack/dnsmasq-dns-8449d68f4f-zldfp" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.926765 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b960a8d7-4706-4657-9991-84a87645ab8a-dns-swift-storage-0\") pod \"dnsmasq-dns-8449d68f4f-zldfp\" (UID: \"b960a8d7-4706-4657-9991-84a87645ab8a\") " pod="openstack/dnsmasq-dns-8449d68f4f-zldfp" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.926791 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndxsc\" (UniqueName: \"kubernetes.io/projected/360e5ec8-ccd5-4e2e-8bea-f6773e7d2792-kube-api-access-ndxsc\") pod \"barbican-api-6d84cdb96-k2sbq\" (UID: \"360e5ec8-ccd5-4e2e-8bea-f6773e7d2792\") " pod="openstack/barbican-api-6d84cdb96-k2sbq" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.926824 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b960a8d7-4706-4657-9991-84a87645ab8a-dns-svc\") pod \"dnsmasq-dns-8449d68f4f-zldfp\" (UID: 
\"b960a8d7-4706-4657-9991-84a87645ab8a\") " pod="openstack/dnsmasq-dns-8449d68f4f-zldfp" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.926841 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/360e5ec8-ccd5-4e2e-8bea-f6773e7d2792-config-data\") pod \"barbican-api-6d84cdb96-k2sbq\" (UID: \"360e5ec8-ccd5-4e2e-8bea-f6773e7d2792\") " pod="openstack/barbican-api-6d84cdb96-k2sbq" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.927788 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b960a8d7-4706-4657-9991-84a87645ab8a-ovsdbserver-nb\") pod \"dnsmasq-dns-8449d68f4f-zldfp\" (UID: \"b960a8d7-4706-4657-9991-84a87645ab8a\") " pod="openstack/dnsmasq-dns-8449d68f4f-zldfp" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.927849 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b960a8d7-4706-4657-9991-84a87645ab8a-config\") pod \"dnsmasq-dns-8449d68f4f-zldfp\" (UID: \"b960a8d7-4706-4657-9991-84a87645ab8a\") " pod="openstack/dnsmasq-dns-8449d68f4f-zldfp" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.928514 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b960a8d7-4706-4657-9991-84a87645ab8a-dns-swift-storage-0\") pod \"dnsmasq-dns-8449d68f4f-zldfp\" (UID: \"b960a8d7-4706-4657-9991-84a87645ab8a\") " pod="openstack/dnsmasq-dns-8449d68f4f-zldfp" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.928732 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b960a8d7-4706-4657-9991-84a87645ab8a-ovsdbserver-sb\") pod \"dnsmasq-dns-8449d68f4f-zldfp\" (UID: \"b960a8d7-4706-4657-9991-84a87645ab8a\") " 
pod="openstack/dnsmasq-dns-8449d68f4f-zldfp" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.929457 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b960a8d7-4706-4657-9991-84a87645ab8a-dns-svc\") pod \"dnsmasq-dns-8449d68f4f-zldfp\" (UID: \"b960a8d7-4706-4657-9991-84a87645ab8a\") " pod="openstack/dnsmasq-dns-8449d68f4f-zldfp" Mar 10 07:05:38 crc kubenswrapper[4825]: I0310 07:05:38.943820 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwbc6\" (UniqueName: \"kubernetes.io/projected/b960a8d7-4706-4657-9991-84a87645ab8a-kube-api-access-mwbc6\") pod \"dnsmasq-dns-8449d68f4f-zldfp\" (UID: \"b960a8d7-4706-4657-9991-84a87645ab8a\") " pod="openstack/dnsmasq-dns-8449d68f4f-zldfp" Mar 10 07:05:39 crc kubenswrapper[4825]: I0310 07:05:39.028490 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndxsc\" (UniqueName: \"kubernetes.io/projected/360e5ec8-ccd5-4e2e-8bea-f6773e7d2792-kube-api-access-ndxsc\") pod \"barbican-api-6d84cdb96-k2sbq\" (UID: \"360e5ec8-ccd5-4e2e-8bea-f6773e7d2792\") " pod="openstack/barbican-api-6d84cdb96-k2sbq" Mar 10 07:05:39 crc kubenswrapper[4825]: I0310 07:05:39.028554 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/360e5ec8-ccd5-4e2e-8bea-f6773e7d2792-config-data\") pod \"barbican-api-6d84cdb96-k2sbq\" (UID: \"360e5ec8-ccd5-4e2e-8bea-f6773e7d2792\") " pod="openstack/barbican-api-6d84cdb96-k2sbq" Mar 10 07:05:39 crc kubenswrapper[4825]: I0310 07:05:39.028578 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/360e5ec8-ccd5-4e2e-8bea-f6773e7d2792-config-data-custom\") pod \"barbican-api-6d84cdb96-k2sbq\" (UID: \"360e5ec8-ccd5-4e2e-8bea-f6773e7d2792\") " pod="openstack/barbican-api-6d84cdb96-k2sbq" Mar 10 07:05:39 
crc kubenswrapper[4825]: I0310 07:05:39.028597 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/360e5ec8-ccd5-4e2e-8bea-f6773e7d2792-combined-ca-bundle\") pod \"barbican-api-6d84cdb96-k2sbq\" (UID: \"360e5ec8-ccd5-4e2e-8bea-f6773e7d2792\") " pod="openstack/barbican-api-6d84cdb96-k2sbq" Mar 10 07:05:39 crc kubenswrapper[4825]: I0310 07:05:39.028650 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/360e5ec8-ccd5-4e2e-8bea-f6773e7d2792-logs\") pod \"barbican-api-6d84cdb96-k2sbq\" (UID: \"360e5ec8-ccd5-4e2e-8bea-f6773e7d2792\") " pod="openstack/barbican-api-6d84cdb96-k2sbq" Mar 10 07:05:39 crc kubenswrapper[4825]: I0310 07:05:39.029210 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/360e5ec8-ccd5-4e2e-8bea-f6773e7d2792-logs\") pod \"barbican-api-6d84cdb96-k2sbq\" (UID: \"360e5ec8-ccd5-4e2e-8bea-f6773e7d2792\") " pod="openstack/barbican-api-6d84cdb96-k2sbq" Mar 10 07:05:39 crc kubenswrapper[4825]: I0310 07:05:39.034695 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/360e5ec8-ccd5-4e2e-8bea-f6773e7d2792-config-data-custom\") pod \"barbican-api-6d84cdb96-k2sbq\" (UID: \"360e5ec8-ccd5-4e2e-8bea-f6773e7d2792\") " pod="openstack/barbican-api-6d84cdb96-k2sbq" Mar 10 07:05:39 crc kubenswrapper[4825]: I0310 07:05:39.035105 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/360e5ec8-ccd5-4e2e-8bea-f6773e7d2792-combined-ca-bundle\") pod \"barbican-api-6d84cdb96-k2sbq\" (UID: \"360e5ec8-ccd5-4e2e-8bea-f6773e7d2792\") " pod="openstack/barbican-api-6d84cdb96-k2sbq" Mar 10 07:05:39 crc kubenswrapper[4825]: I0310 07:05:39.040945 4825 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/360e5ec8-ccd5-4e2e-8bea-f6773e7d2792-config-data\") pod \"barbican-api-6d84cdb96-k2sbq\" (UID: \"360e5ec8-ccd5-4e2e-8bea-f6773e7d2792\") " pod="openstack/barbican-api-6d84cdb96-k2sbq" Mar 10 07:05:39 crc kubenswrapper[4825]: I0310 07:05:39.049291 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndxsc\" (UniqueName: \"kubernetes.io/projected/360e5ec8-ccd5-4e2e-8bea-f6773e7d2792-kube-api-access-ndxsc\") pod \"barbican-api-6d84cdb96-k2sbq\" (UID: \"360e5ec8-ccd5-4e2e-8bea-f6773e7d2792\") " pod="openstack/barbican-api-6d84cdb96-k2sbq" Mar 10 07:05:39 crc kubenswrapper[4825]: I0310 07:05:39.117156 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8449d68f4f-zldfp" Mar 10 07:05:39 crc kubenswrapper[4825]: I0310 07:05:39.141123 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6d84cdb96-k2sbq" Mar 10 07:05:39 crc kubenswrapper[4825]: I0310 07:05:39.309252 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7859c7799c-tw65b" podUID="824f93b4-983c-404b-a380-af829a45e941" containerName="dnsmasq-dns" containerID="cri-o://ef0d89d3e073acf2bf9e3ba9146d48e2fb04787cc89c60552a89f784cc317afc" gracePeriod=10 Mar 10 07:05:40 crc kubenswrapper[4825]: I0310 07:05:40.320183 4825 generic.go:334] "Generic (PLEG): container finished" podID="824f93b4-983c-404b-a380-af829a45e941" containerID="ef0d89d3e073acf2bf9e3ba9146d48e2fb04787cc89c60552a89f784cc317afc" exitCode=0 Mar 10 07:05:40 crc kubenswrapper[4825]: I0310 07:05:40.320406 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7859c7799c-tw65b" event={"ID":"824f93b4-983c-404b-a380-af829a45e941","Type":"ContainerDied","Data":"ef0d89d3e073acf2bf9e3ba9146d48e2fb04787cc89c60552a89f784cc317afc"} Mar 10 07:05:40 crc kubenswrapper[4825]: W0310 
07:05:40.415392 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod082f6f7c_700c_41f1_a10e_58223570c18c.slice/crio-e46e37ce674680346f695cdc7c3357bd4c0e06959a6f94ff7c361bc7997e5aa8 WatchSource:0}: Error finding container e46e37ce674680346f695cdc7c3357bd4c0e06959a6f94ff7c361bc7997e5aa8: Status 404 returned error can't find the container with id e46e37ce674680346f695cdc7c3357bd4c0e06959a6f94ff7c361bc7997e5aa8 Mar 10 07:05:40 crc kubenswrapper[4825]: I0310 07:05:40.706470 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-t7ktm" Mar 10 07:05:40 crc kubenswrapper[4825]: I0310 07:05:40.812080 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7859c7799c-tw65b" Mar 10 07:05:40 crc kubenswrapper[4825]: I0310 07:05:40.874716 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ff86ab8-a122-4256-8529-12265e6177e4-combined-ca-bundle\") pod \"0ff86ab8-a122-4256-8529-12265e6177e4\" (UID: \"0ff86ab8-a122-4256-8529-12265e6177e4\") " Mar 10 07:05:40 crc kubenswrapper[4825]: I0310 07:05:40.874793 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0ff86ab8-a122-4256-8529-12265e6177e4-credential-keys\") pod \"0ff86ab8-a122-4256-8529-12265e6177e4\" (UID: \"0ff86ab8-a122-4256-8529-12265e6177e4\") " Mar 10 07:05:40 crc kubenswrapper[4825]: I0310 07:05:40.874853 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0ff86ab8-a122-4256-8529-12265e6177e4-fernet-keys\") pod \"0ff86ab8-a122-4256-8529-12265e6177e4\" (UID: \"0ff86ab8-a122-4256-8529-12265e6177e4\") " Mar 10 07:05:40 crc kubenswrapper[4825]: I0310 07:05:40.874897 4825 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgnq5\" (UniqueName: \"kubernetes.io/projected/0ff86ab8-a122-4256-8529-12265e6177e4-kube-api-access-xgnq5\") pod \"0ff86ab8-a122-4256-8529-12265e6177e4\" (UID: \"0ff86ab8-a122-4256-8529-12265e6177e4\") " Mar 10 07:05:40 crc kubenswrapper[4825]: I0310 07:05:40.874951 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ff86ab8-a122-4256-8529-12265e6177e4-scripts\") pod \"0ff86ab8-a122-4256-8529-12265e6177e4\" (UID: \"0ff86ab8-a122-4256-8529-12265e6177e4\") " Mar 10 07:05:40 crc kubenswrapper[4825]: I0310 07:05:40.875052 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ff86ab8-a122-4256-8529-12265e6177e4-config-data\") pod \"0ff86ab8-a122-4256-8529-12265e6177e4\" (UID: \"0ff86ab8-a122-4256-8529-12265e6177e4\") " Mar 10 07:05:40 crc kubenswrapper[4825]: I0310 07:05:40.880835 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ff86ab8-a122-4256-8529-12265e6177e4-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0ff86ab8-a122-4256-8529-12265e6177e4" (UID: "0ff86ab8-a122-4256-8529-12265e6177e4"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:05:40 crc kubenswrapper[4825]: I0310 07:05:40.884045 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ff86ab8-a122-4256-8529-12265e6177e4-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "0ff86ab8-a122-4256-8529-12265e6177e4" (UID: "0ff86ab8-a122-4256-8529-12265e6177e4"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:05:40 crc kubenswrapper[4825]: I0310 07:05:40.889314 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ff86ab8-a122-4256-8529-12265e6177e4-scripts" (OuterVolumeSpecName: "scripts") pod "0ff86ab8-a122-4256-8529-12265e6177e4" (UID: "0ff86ab8-a122-4256-8529-12265e6177e4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:05:40 crc kubenswrapper[4825]: I0310 07:05:40.898414 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ff86ab8-a122-4256-8529-12265e6177e4-kube-api-access-xgnq5" (OuterVolumeSpecName: "kube-api-access-xgnq5") pod "0ff86ab8-a122-4256-8529-12265e6177e4" (UID: "0ff86ab8-a122-4256-8529-12265e6177e4"). InnerVolumeSpecName "kube-api-access-xgnq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:05:40 crc kubenswrapper[4825]: I0310 07:05:40.915473 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ff86ab8-a122-4256-8529-12265e6177e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ff86ab8-a122-4256-8529-12265e6177e4" (UID: "0ff86ab8-a122-4256-8529-12265e6177e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:05:40 crc kubenswrapper[4825]: I0310 07:05:40.924370 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ff86ab8-a122-4256-8529-12265e6177e4-config-data" (OuterVolumeSpecName: "config-data") pod "0ff86ab8-a122-4256-8529-12265e6177e4" (UID: "0ff86ab8-a122-4256-8529-12265e6177e4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:05:40 crc kubenswrapper[4825]: I0310 07:05:40.976849 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/824f93b4-983c-404b-a380-af829a45e941-dns-swift-storage-0\") pod \"824f93b4-983c-404b-a380-af829a45e941\" (UID: \"824f93b4-983c-404b-a380-af829a45e941\") " Mar 10 07:05:40 crc kubenswrapper[4825]: I0310 07:05:40.976976 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9xwc\" (UniqueName: \"kubernetes.io/projected/824f93b4-983c-404b-a380-af829a45e941-kube-api-access-w9xwc\") pod \"824f93b4-983c-404b-a380-af829a45e941\" (UID: \"824f93b4-983c-404b-a380-af829a45e941\") " Mar 10 07:05:40 crc kubenswrapper[4825]: I0310 07:05:40.977011 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/824f93b4-983c-404b-a380-af829a45e941-config\") pod \"824f93b4-983c-404b-a380-af829a45e941\" (UID: \"824f93b4-983c-404b-a380-af829a45e941\") " Mar 10 07:05:40 crc kubenswrapper[4825]: I0310 07:05:40.977031 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/824f93b4-983c-404b-a380-af829a45e941-dns-svc\") pod \"824f93b4-983c-404b-a380-af829a45e941\" (UID: \"824f93b4-983c-404b-a380-af829a45e941\") " Mar 10 07:05:40 crc kubenswrapper[4825]: I0310 07:05:40.977101 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/824f93b4-983c-404b-a380-af829a45e941-ovsdbserver-sb\") pod \"824f93b4-983c-404b-a380-af829a45e941\" (UID: \"824f93b4-983c-404b-a380-af829a45e941\") " Mar 10 07:05:40 crc kubenswrapper[4825]: I0310 07:05:40.977189 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/824f93b4-983c-404b-a380-af829a45e941-ovsdbserver-nb\") pod \"824f93b4-983c-404b-a380-af829a45e941\" (UID: \"824f93b4-983c-404b-a380-af829a45e941\") " Mar 10 07:05:40 crc kubenswrapper[4825]: I0310 07:05:40.978090 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ff86ab8-a122-4256-8529-12265e6177e4-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:40 crc kubenswrapper[4825]: I0310 07:05:40.978118 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ff86ab8-a122-4256-8529-12265e6177e4-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:40 crc kubenswrapper[4825]: I0310 07:05:40.978141 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ff86ab8-a122-4256-8529-12265e6177e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:40 crc kubenswrapper[4825]: I0310 07:05:40.978153 4825 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0ff86ab8-a122-4256-8529-12265e6177e4-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:40 crc kubenswrapper[4825]: I0310 07:05:40.978161 4825 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0ff86ab8-a122-4256-8529-12265e6177e4-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:40 crc kubenswrapper[4825]: I0310 07:05:40.978170 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgnq5\" (UniqueName: \"kubernetes.io/projected/0ff86ab8-a122-4256-8529-12265e6177e4-kube-api-access-xgnq5\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:40 crc kubenswrapper[4825]: I0310 07:05:40.983287 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/824f93b4-983c-404b-a380-af829a45e941-kube-api-access-w9xwc" 
(OuterVolumeSpecName: "kube-api-access-w9xwc") pod "824f93b4-983c-404b-a380-af829a45e941" (UID: "824f93b4-983c-404b-a380-af829a45e941"). InnerVolumeSpecName "kube-api-access-w9xwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.027689 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/824f93b4-983c-404b-a380-af829a45e941-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "824f93b4-983c-404b-a380-af829a45e941" (UID: "824f93b4-983c-404b-a380-af829a45e941"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.035420 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/824f93b4-983c-404b-a380-af829a45e941-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "824f93b4-983c-404b-a380-af829a45e941" (UID: "824f93b4-983c-404b-a380-af829a45e941"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.037534 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/824f93b4-983c-404b-a380-af829a45e941-config" (OuterVolumeSpecName: "config") pod "824f93b4-983c-404b-a380-af829a45e941" (UID: "824f93b4-983c-404b-a380-af829a45e941"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.045094 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/824f93b4-983c-404b-a380-af829a45e941-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "824f93b4-983c-404b-a380-af829a45e941" (UID: "824f93b4-983c-404b-a380-af829a45e941"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.079366 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9xwc\" (UniqueName: \"kubernetes.io/projected/824f93b4-983c-404b-a380-af829a45e941-kube-api-access-w9xwc\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.079406 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/824f93b4-983c-404b-a380-af829a45e941-config\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.079417 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/824f93b4-983c-404b-a380-af829a45e941-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.079427 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/824f93b4-983c-404b-a380-af829a45e941-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.079439 4825 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/824f93b4-983c-404b-a380-af829a45e941-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.092699 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/824f93b4-983c-404b-a380-af829a45e941-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "824f93b4-983c-404b-a380-af829a45e941" (UID: "824f93b4-983c-404b-a380-af829a45e941"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.119150 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8449d68f4f-zldfp"] Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.182797 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/824f93b4-983c-404b-a380-af829a45e941-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.269621 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6b684b6dd6-r5vgn"] Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.283358 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-8646874786-6nnq4"] Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.322619 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6d84cdb96-k2sbq"] Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.331510 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7454bbb9bc-7l262" event={"ID":"082f6f7c-700c-41f1-a10e-58223570c18c","Type":"ContainerStarted","Data":"f41b2155d8729db5968bc0758def9b1914d2c9b44cd66a96c2353428211928d7"} Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.331582 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7454bbb9bc-7l262" event={"ID":"082f6f7c-700c-41f1-a10e-58223570c18c","Type":"ContainerStarted","Data":"e46e37ce674680346f695cdc7c3357bd4c0e06959a6f94ff7c361bc7997e5aa8"} Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.340558 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7859c7799c-tw65b" event={"ID":"824f93b4-983c-404b-a380-af829a45e941","Type":"ContainerDied","Data":"081070501e956a05b39e21ad7fc091fa1a31aec2ecc5e299e6a4645a43a82e2d"} Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.340664 4825 
scope.go:117] "RemoveContainer" containerID="ef0d89d3e073acf2bf9e3ba9146d48e2fb04787cc89c60552a89f784cc317afc" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.340846 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7859c7799c-tw65b" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.349552 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-t7ktm" event={"ID":"0ff86ab8-a122-4256-8529-12265e6177e4","Type":"ContainerDied","Data":"3a3265d7310bb905af713e565b8b0de72ce7abb7da75d063e70196316e73e6a4"} Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.349651 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a3265d7310bb905af713e565b8b0de72ce7abb7da75d063e70196316e73e6a4" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.351597 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-t7ktm" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.356491 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8449d68f4f-zldfp" event={"ID":"b960a8d7-4706-4657-9991-84a87645ab8a","Type":"ContainerStarted","Data":"f92b204f9a371e5c6f00262aab6ca478581ae4b6a826a651ac62bba9beabbd93"} Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.358714 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b684b6dd6-r5vgn" event={"ID":"2d23be6c-5ecc-495c-b45c-46b79ccbdcf5","Type":"ContainerStarted","Data":"82e626f0b660d292f5477324a181ee8e984ccdb1cafb8fefcd25bd72f2263bae"} Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.380011 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7859c7799c-tw65b"] Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.388195 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7859c7799c-tw65b"] Mar 10 07:05:41 crc 
kubenswrapper[4825]: I0310 07:05:41.388416 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0862f63-fd04-41f5-88a2-638189435867","Type":"ContainerStarted","Data":"dd6c629e4ee997cf8d87fb453e10f65c343d3bd0d25e0cc7e06cde200a99b7d6"} Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.396647 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-dcb459ff6-b58pb"] Mar 10 07:05:41 crc kubenswrapper[4825]: E0310 07:05:41.397070 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="824f93b4-983c-404b-a380-af829a45e941" containerName="init" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.397093 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="824f93b4-983c-404b-a380-af829a45e941" containerName="init" Mar 10 07:05:41 crc kubenswrapper[4825]: E0310 07:05:41.397104 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="824f93b4-983c-404b-a380-af829a45e941" containerName="dnsmasq-dns" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.397110 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="824f93b4-983c-404b-a380-af829a45e941" containerName="dnsmasq-dns" Mar 10 07:05:41 crc kubenswrapper[4825]: E0310 07:05:41.397128 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ff86ab8-a122-4256-8529-12265e6177e4" containerName="keystone-bootstrap" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.397148 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ff86ab8-a122-4256-8529-12265e6177e4" containerName="keystone-bootstrap" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.397325 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="824f93b4-983c-404b-a380-af829a45e941" containerName="dnsmasq-dns" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.397336 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ff86ab8-a122-4256-8529-12265e6177e4" 
containerName="keystone-bootstrap" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.398320 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-dcb459ff6-b58pb" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.404007 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.404477 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.406673 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-dcb459ff6-b58pb"] Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.441833 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.443283 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.478222 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.478420 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.481511 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.489684 4825 scope.go:117] "RemoveContainer" containerID="e14e57419fe4e69bc99826f3d6d2b81b839fda8522de9020183047a134388d7a" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.492509 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c93bcce6-e8c8-4916-af01-e85f3dfded64-public-tls-certs\") pod \"barbican-api-dcb459ff6-b58pb\" (UID: \"c93bcce6-e8c8-4916-af01-e85f3dfded64\") " pod="openstack/barbican-api-dcb459ff6-b58pb" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.492602 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzsng\" (UniqueName: \"kubernetes.io/projected/c93bcce6-e8c8-4916-af01-e85f3dfded64-kube-api-access-kzsng\") pod \"barbican-api-dcb459ff6-b58pb\" (UID: \"c93bcce6-e8c8-4916-af01-e85f3dfded64\") " pod="openstack/barbican-api-dcb459ff6-b58pb" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.492675 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c93bcce6-e8c8-4916-af01-e85f3dfded64-config-data-custom\") pod \"barbican-api-dcb459ff6-b58pb\" (UID: \"c93bcce6-e8c8-4916-af01-e85f3dfded64\") " pod="openstack/barbican-api-dcb459ff6-b58pb" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.492703 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c93bcce6-e8c8-4916-af01-e85f3dfded64-logs\") pod \"barbican-api-dcb459ff6-b58pb\" (UID: \"c93bcce6-e8c8-4916-af01-e85f3dfded64\") " pod="openstack/barbican-api-dcb459ff6-b58pb" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.492767 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c93bcce6-e8c8-4916-af01-e85f3dfded64-combined-ca-bundle\") pod \"barbican-api-dcb459ff6-b58pb\" (UID: \"c93bcce6-e8c8-4916-af01-e85f3dfded64\") " pod="openstack/barbican-api-dcb459ff6-b58pb" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.492792 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c93bcce6-e8c8-4916-af01-e85f3dfded64-config-data\") pod \"barbican-api-dcb459ff6-b58pb\" (UID: \"c93bcce6-e8c8-4916-af01-e85f3dfded64\") " pod="openstack/barbican-api-dcb459ff6-b58pb" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.492833 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c93bcce6-e8c8-4916-af01-e85f3dfded64-internal-tls-certs\") pod \"barbican-api-dcb459ff6-b58pb\" (UID: \"c93bcce6-e8c8-4916-af01-e85f3dfded64\") " pod="openstack/barbican-api-dcb459ff6-b58pb" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.530208 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.540975 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.553525 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-b6df5fb85-rp95h"] Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.556147 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.594972 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c93bcce6-e8c8-4916-af01-e85f3dfded64-internal-tls-certs\") pod \"barbican-api-dcb459ff6-b58pb\" (UID: \"c93bcce6-e8c8-4916-af01-e85f3dfded64\") " pod="openstack/barbican-api-dcb459ff6-b58pb" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.595382 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/c93bcce6-e8c8-4916-af01-e85f3dfded64-public-tls-certs\") pod \"barbican-api-dcb459ff6-b58pb\" (UID: \"c93bcce6-e8c8-4916-af01-e85f3dfded64\") " pod="openstack/barbican-api-dcb459ff6-b58pb" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.595488 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzsng\" (UniqueName: \"kubernetes.io/projected/c93bcce6-e8c8-4916-af01-e85f3dfded64-kube-api-access-kzsng\") pod \"barbican-api-dcb459ff6-b58pb\" (UID: \"c93bcce6-e8c8-4916-af01-e85f3dfded64\") " pod="openstack/barbican-api-dcb459ff6-b58pb" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.595617 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c93bcce6-e8c8-4916-af01-e85f3dfded64-config-data-custom\") pod \"barbican-api-dcb459ff6-b58pb\" (UID: \"c93bcce6-e8c8-4916-af01-e85f3dfded64\") " pod="openstack/barbican-api-dcb459ff6-b58pb" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.595702 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c93bcce6-e8c8-4916-af01-e85f3dfded64-logs\") pod \"barbican-api-dcb459ff6-b58pb\" (UID: \"c93bcce6-e8c8-4916-af01-e85f3dfded64\") " pod="openstack/barbican-api-dcb459ff6-b58pb" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.595812 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c93bcce6-e8c8-4916-af01-e85f3dfded64-combined-ca-bundle\") pod \"barbican-api-dcb459ff6-b58pb\" (UID: \"c93bcce6-e8c8-4916-af01-e85f3dfded64\") " pod="openstack/barbican-api-dcb459ff6-b58pb" Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.595879 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c93bcce6-e8c8-4916-af01-e85f3dfded64-config-data\") pod \"barbican-api-dcb459ff6-b58pb\" (UID: \"c93bcce6-e8c8-4916-af01-e85f3dfded64\") " pod="openstack/barbican-api-dcb459ff6-b58pb"
Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.597242 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c93bcce6-e8c8-4916-af01-e85f3dfded64-logs\") pod \"barbican-api-dcb459ff6-b58pb\" (UID: \"c93bcce6-e8c8-4916-af01-e85f3dfded64\") " pod="openstack/barbican-api-dcb459ff6-b58pb"
Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.610749 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c93bcce6-e8c8-4916-af01-e85f3dfded64-config-data-custom\") pod \"barbican-api-dcb459ff6-b58pb\" (UID: \"c93bcce6-e8c8-4916-af01-e85f3dfded64\") " pod="openstack/barbican-api-dcb459ff6-b58pb"
Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.610770 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c93bcce6-e8c8-4916-af01-e85f3dfded64-combined-ca-bundle\") pod \"barbican-api-dcb459ff6-b58pb\" (UID: \"c93bcce6-e8c8-4916-af01-e85f3dfded64\") " pod="openstack/barbican-api-dcb459ff6-b58pb"
Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.611665 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c93bcce6-e8c8-4916-af01-e85f3dfded64-config-data\") pod \"barbican-api-dcb459ff6-b58pb\" (UID: \"c93bcce6-e8c8-4916-af01-e85f3dfded64\") " pod="openstack/barbican-api-dcb459ff6-b58pb"
Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.624304 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c93bcce6-e8c8-4916-af01-e85f3dfded64-public-tls-certs\") pod \"barbican-api-dcb459ff6-b58pb\" (UID: \"c93bcce6-e8c8-4916-af01-e85f3dfded64\") " pod="openstack/barbican-api-dcb459ff6-b58pb"
Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.632712 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c93bcce6-e8c8-4916-af01-e85f3dfded64-internal-tls-certs\") pod \"barbican-api-dcb459ff6-b58pb\" (UID: \"c93bcce6-e8c8-4916-af01-e85f3dfded64\") " pod="openstack/barbican-api-dcb459ff6-b58pb"
Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.635348 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzsng\" (UniqueName: \"kubernetes.io/projected/c93bcce6-e8c8-4916-af01-e85f3dfded64-kube-api-access-kzsng\") pod \"barbican-api-dcb459ff6-b58pb\" (UID: \"c93bcce6-e8c8-4916-af01-e85f3dfded64\") " pod="openstack/barbican-api-dcb459ff6-b58pb"
Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.715611 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-dcb459ff6-b58pb"
Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.832687 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-599df5898d-bqcpr"]
Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.834070 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-599df5898d-bqcpr"
Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.839298 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.839551 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.839942 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.840096 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rs4jw"
Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.841047 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.841195 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.866214 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-599df5898d-bqcpr"]
Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.908749 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-combined-ca-bundle\") pod \"keystone-599df5898d-bqcpr\" (UID: \"86dc5ac4-dac8-4fdf-bba9-06d13efacd53\") " pod="openstack/keystone-599df5898d-bqcpr"
Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.908823 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-public-tls-certs\") pod \"keystone-599df5898d-bqcpr\" (UID: \"86dc5ac4-dac8-4fdf-bba9-06d13efacd53\") " pod="openstack/keystone-599df5898d-bqcpr"
Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.908865 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-fernet-keys\") pod \"keystone-599df5898d-bqcpr\" (UID: \"86dc5ac4-dac8-4fdf-bba9-06d13efacd53\") " pod="openstack/keystone-599df5898d-bqcpr"
Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.908909 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-config-data\") pod \"keystone-599df5898d-bqcpr\" (UID: \"86dc5ac4-dac8-4fdf-bba9-06d13efacd53\") " pod="openstack/keystone-599df5898d-bqcpr"
Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.908938 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t87rc\" (UniqueName: \"kubernetes.io/projected/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-kube-api-access-t87rc\") pod \"keystone-599df5898d-bqcpr\" (UID: \"86dc5ac4-dac8-4fdf-bba9-06d13efacd53\") " pod="openstack/keystone-599df5898d-bqcpr"
Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.908988 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-internal-tls-certs\") pod \"keystone-599df5898d-bqcpr\" (UID: \"86dc5ac4-dac8-4fdf-bba9-06d13efacd53\") " pod="openstack/keystone-599df5898d-bqcpr"
Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.909035 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-scripts\") pod \"keystone-599df5898d-bqcpr\" (UID: \"86dc5ac4-dac8-4fdf-bba9-06d13efacd53\") " pod="openstack/keystone-599df5898d-bqcpr"
Mar 10 07:05:41 crc kubenswrapper[4825]: I0310 07:05:41.909155 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-credential-keys\") pod \"keystone-599df5898d-bqcpr\" (UID: \"86dc5ac4-dac8-4fdf-bba9-06d13efacd53\") " pod="openstack/keystone-599df5898d-bqcpr"
Mar 10 07:05:42 crc kubenswrapper[4825]: I0310 07:05:42.011763 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-credential-keys\") pod \"keystone-599df5898d-bqcpr\" (UID: \"86dc5ac4-dac8-4fdf-bba9-06d13efacd53\") " pod="openstack/keystone-599df5898d-bqcpr"
Mar 10 07:05:42 crc kubenswrapper[4825]: I0310 07:05:42.012028 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-combined-ca-bundle\") pod \"keystone-599df5898d-bqcpr\" (UID: \"86dc5ac4-dac8-4fdf-bba9-06d13efacd53\") " pod="openstack/keystone-599df5898d-bqcpr"
Mar 10 07:05:42 crc kubenswrapper[4825]: I0310 07:05:42.012052 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-public-tls-certs\") pod \"keystone-599df5898d-bqcpr\" (UID: \"86dc5ac4-dac8-4fdf-bba9-06d13efacd53\") " pod="openstack/keystone-599df5898d-bqcpr"
Mar 10 07:05:42 crc kubenswrapper[4825]: I0310 07:05:42.012088 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-fernet-keys\") pod \"keystone-599df5898d-bqcpr\" (UID: \"86dc5ac4-dac8-4fdf-bba9-06d13efacd53\") " pod="openstack/keystone-599df5898d-bqcpr"
Mar 10 07:05:42 crc kubenswrapper[4825]: I0310 07:05:42.012110 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-config-data\") pod \"keystone-599df5898d-bqcpr\" (UID: \"86dc5ac4-dac8-4fdf-bba9-06d13efacd53\") " pod="openstack/keystone-599df5898d-bqcpr"
Mar 10 07:05:42 crc kubenswrapper[4825]: I0310 07:05:42.012161 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-internal-tls-certs\") pod \"keystone-599df5898d-bqcpr\" (UID: \"86dc5ac4-dac8-4fdf-bba9-06d13efacd53\") " pod="openstack/keystone-599df5898d-bqcpr"
Mar 10 07:05:42 crc kubenswrapper[4825]: I0310 07:05:42.012177 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t87rc\" (UniqueName: \"kubernetes.io/projected/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-kube-api-access-t87rc\") pod \"keystone-599df5898d-bqcpr\" (UID: \"86dc5ac4-dac8-4fdf-bba9-06d13efacd53\") " pod="openstack/keystone-599df5898d-bqcpr"
Mar 10 07:05:42 crc kubenswrapper[4825]: I0310 07:05:42.012220 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-scripts\") pod \"keystone-599df5898d-bqcpr\" (UID: \"86dc5ac4-dac8-4fdf-bba9-06d13efacd53\") " pod="openstack/keystone-599df5898d-bqcpr"
Mar 10 07:05:42 crc kubenswrapper[4825]: I0310 07:05:42.016400 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-scripts\") pod \"keystone-599df5898d-bqcpr\" (UID: \"86dc5ac4-dac8-4fdf-bba9-06d13efacd53\") " pod="openstack/keystone-599df5898d-bqcpr"
Mar 10 07:05:42 crc kubenswrapper[4825]: I0310 07:05:42.022731 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-public-tls-certs\") pod \"keystone-599df5898d-bqcpr\" (UID: \"86dc5ac4-dac8-4fdf-bba9-06d13efacd53\") " pod="openstack/keystone-599df5898d-bqcpr"
Mar 10 07:05:42 crc kubenswrapper[4825]: I0310 07:05:42.030305 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-combined-ca-bundle\") pod \"keystone-599df5898d-bqcpr\" (UID: \"86dc5ac4-dac8-4fdf-bba9-06d13efacd53\") " pod="openstack/keystone-599df5898d-bqcpr"
Mar 10 07:05:42 crc kubenswrapper[4825]: I0310 07:05:42.030965 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-fernet-keys\") pod \"keystone-599df5898d-bqcpr\" (UID: \"86dc5ac4-dac8-4fdf-bba9-06d13efacd53\") " pod="openstack/keystone-599df5898d-bqcpr"
Mar 10 07:05:42 crc kubenswrapper[4825]: I0310 07:05:42.038816 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-config-data\") pod \"keystone-599df5898d-bqcpr\" (UID: \"86dc5ac4-dac8-4fdf-bba9-06d13efacd53\") " pod="openstack/keystone-599df5898d-bqcpr"
Mar 10 07:05:42 crc kubenswrapper[4825]: I0310 07:05:42.041636 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-internal-tls-certs\") pod \"keystone-599df5898d-bqcpr\" (UID: \"86dc5ac4-dac8-4fdf-bba9-06d13efacd53\") " pod="openstack/keystone-599df5898d-bqcpr"
Mar 10 07:05:42 crc kubenswrapper[4825]: I0310 07:05:42.061157 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t87rc\" (UniqueName: \"kubernetes.io/projected/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-kube-api-access-t87rc\") pod \"keystone-599df5898d-bqcpr\" (UID: \"86dc5ac4-dac8-4fdf-bba9-06d13efacd53\") " pod="openstack/keystone-599df5898d-bqcpr"
Mar 10 07:05:42 crc kubenswrapper[4825]: I0310 07:05:42.072641 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-credential-keys\") pod \"keystone-599df5898d-bqcpr\" (UID: \"86dc5ac4-dac8-4fdf-bba9-06d13efacd53\") " pod="openstack/keystone-599df5898d-bqcpr"
Mar 10 07:05:42 crc kubenswrapper[4825]: I0310 07:05:42.166258 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-599df5898d-bqcpr"
Mar 10 07:05:42 crc kubenswrapper[4825]: I0310 07:05:42.379964 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-dcb459ff6-b58pb"]
Mar 10 07:05:42 crc kubenswrapper[4825]: W0310 07:05:42.391377 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc93bcce6_e8c8_4916_af01_e85f3dfded64.slice/crio-be6c22749106b274fbe03fb5640bba5610567305756596ad82dc2b42365f6b46 WatchSource:0}: Error finding container be6c22749106b274fbe03fb5640bba5610567305756596ad82dc2b42365f6b46: Status 404 returned error can't find the container with id be6c22749106b274fbe03fb5640bba5610567305756596ad82dc2b42365f6b46
Mar 10 07:05:42 crc kubenswrapper[4825]: I0310 07:05:42.409805 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-b6df5fb85-rp95h" event={"ID":"6fd14279-d503-4cda-a6b4-14bfd6945596","Type":"ContainerStarted","Data":"f21e9888ccc1516588069e812e5c52c3b58a35a4014c0d4b75a764a59f934ab4"}
Mar 10 07:05:42 crc kubenswrapper[4825]: I0310 07:05:42.422578 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7454bbb9bc-7l262" event={"ID":"082f6f7c-700c-41f1-a10e-58223570c18c","Type":"ContainerStarted","Data":"ea7fc0200b81cf0f53d33c7f999d45d1a95c059b3c16e5b55b69ae7fd61d4a52"}
Mar 10 07:05:42 crc kubenswrapper[4825]: I0310 07:05:42.423925 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7454bbb9bc-7l262"
Mar 10 07:05:42 crc kubenswrapper[4825]: I0310 07:05:42.441397 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6d84cdb96-k2sbq" event={"ID":"360e5ec8-ccd5-4e2e-8bea-f6773e7d2792","Type":"ContainerStarted","Data":"aa924c2f0665e6994624c651ef160599dd8014ee01e883f7a0cece8680569823"}
Mar 10 07:05:42 crc kubenswrapper[4825]: I0310 07:05:42.441443 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6d84cdb96-k2sbq" event={"ID":"360e5ec8-ccd5-4e2e-8bea-f6773e7d2792","Type":"ContainerStarted","Data":"a752c292a0ae40a81df7bf9c73349a256e2f9ef0fb232b0a1ab397f7115ade30"}
Mar 10 07:05:42 crc kubenswrapper[4825]: I0310 07:05:42.441456 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6d84cdb96-k2sbq" event={"ID":"360e5ec8-ccd5-4e2e-8bea-f6773e7d2792","Type":"ContainerStarted","Data":"c7a651b5d91b5136ef4112a6cf4bffd1ef11daee346eee79d6a3737450b667bf"}
Mar 10 07:05:42 crc kubenswrapper[4825]: I0310 07:05:42.443144 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6d84cdb96-k2sbq"
Mar 10 07:05:42 crc kubenswrapper[4825]: I0310 07:05:42.443173 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6d84cdb96-k2sbq"
Mar 10 07:05:42 crc kubenswrapper[4825]: I0310 07:05:42.450911 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7454bbb9bc-7l262" podStartSLOduration=5.450892614 podStartE2EDuration="5.450892614s" podCreationTimestamp="2026-03-10 07:05:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:05:42.444060736 +0000 UTC m=+1295.473841351" watchObservedRunningTime="2026-03-10 07:05:42.450892614 +0000 UTC m=+1295.480673229"
Mar 10 07:05:42 crc kubenswrapper[4825]: I0310 07:05:42.464634 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8646874786-6nnq4" event={"ID":"beaf0874-59e2-4ec2-9425-4c11184a7e3d","Type":"ContainerStarted","Data":"b2d6d851cdbee1e512f8930b58ce72522fb335db88aba4e5cf7b89c7c3847e60"}
Mar 10 07:05:42 crc kubenswrapper[4825]: I0310 07:05:42.471834 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6d84cdb96-k2sbq" podStartSLOduration=4.471816819 podStartE2EDuration="4.471816819s" podCreationTimestamp="2026-03-10 07:05:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:05:42.467538418 +0000 UTC m=+1295.497319033" watchObservedRunningTime="2026-03-10 07:05:42.471816819 +0000 UTC m=+1295.501597434"
Mar 10 07:05:42 crc kubenswrapper[4825]: I0310 07:05:42.474107 4825 generic.go:334] "Generic (PLEG): container finished" podID="b960a8d7-4706-4657-9991-84a87645ab8a" containerID="16414e715215838cd5c5f664a3215cd34f3e6389a5976ded0f744bb1b68662f9" exitCode=0
Mar 10 07:05:42 crc kubenswrapper[4825]: I0310 07:05:42.474208 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8449d68f4f-zldfp" event={"ID":"b960a8d7-4706-4657-9991-84a87645ab8a","Type":"ContainerDied","Data":"16414e715215838cd5c5f664a3215cd34f3e6389a5976ded0f744bb1b68662f9"}
Mar 10 07:05:42 crc kubenswrapper[4825]: I0310 07:05:42.486990 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b684b6dd6-r5vgn" event={"ID":"2d23be6c-5ecc-495c-b45c-46b79ccbdcf5","Type":"ContainerStarted","Data":"1dd1b73af2989fabf8ad6057b797d997323d290591c75c2def399b0331c06c79"}
Mar 10 07:05:42 crc kubenswrapper[4825]: I0310 07:05:42.487287 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b684b6dd6-r5vgn" event={"ID":"2d23be6c-5ecc-495c-b45c-46b79ccbdcf5","Type":"ContainerStarted","Data":"98cef233b9fa81351c21de237de8599c342df210bb40854490688bca71f84674"}
Mar 10 07:05:42 crc kubenswrapper[4825]: I0310 07:05:42.487632 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6b684b6dd6-r5vgn"
Mar 10 07:05:42 crc kubenswrapper[4825]: I0310 07:05:42.487726 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6b684b6dd6-r5vgn"
Mar 10 07:05:42 crc kubenswrapper[4825]: I0310 07:05:42.490336 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 10 07:05:42 crc kubenswrapper[4825]: I0310 07:05:42.490398 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 10 07:05:42 crc kubenswrapper[4825]: I0310 07:05:42.490410 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 10 07:05:42 crc kubenswrapper[4825]: I0310 07:05:42.490420 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 10 07:05:42 crc kubenswrapper[4825]: I0310 07:05:42.734294 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6b684b6dd6-r5vgn" podStartSLOduration=4.734266649 podStartE2EDuration="4.734266649s" podCreationTimestamp="2026-03-10 07:05:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:05:42.551311121 +0000 UTC m=+1295.581091736" watchObservedRunningTime="2026-03-10 07:05:42.734266649 +0000 UTC m=+1295.764047264"
Mar 10 07:05:42 crc kubenswrapper[4825]: I0310 07:05:42.749457 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-599df5898d-bqcpr"]
Mar 10 07:05:43 crc kubenswrapper[4825]: I0310 07:05:43.259290 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="824f93b4-983c-404b-a380-af829a45e941" path="/var/lib/kubelet/pods/824f93b4-983c-404b-a380-af829a45e941/volumes"
Mar 10 07:05:43 crc kubenswrapper[4825]: I0310 07:05:43.503879 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-dcb459ff6-b58pb" event={"ID":"c93bcce6-e8c8-4916-af01-e85f3dfded64","Type":"ContainerStarted","Data":"13eef2fd23b3a4387f866485b870a26a32edd760fbc68889410c962fef58826e"}
Mar 10 07:05:43 crc kubenswrapper[4825]: I0310 07:05:43.503940 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-dcb459ff6-b58pb" event={"ID":"c93bcce6-e8c8-4916-af01-e85f3dfded64","Type":"ContainerStarted","Data":"1838604b40253b47912466b869a4dbb0ed7bea3659d5063f2cc0ce282f9d473f"}
Mar 10 07:05:43 crc kubenswrapper[4825]: I0310 07:05:43.503954 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-dcb459ff6-b58pb" event={"ID":"c93bcce6-e8c8-4916-af01-e85f3dfded64","Type":"ContainerStarted","Data":"be6c22749106b274fbe03fb5640bba5610567305756596ad82dc2b42365f6b46"}
Mar 10 07:05:43 crc kubenswrapper[4825]: I0310 07:05:43.505320 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-dcb459ff6-b58pb"
Mar 10 07:05:43 crc kubenswrapper[4825]: I0310 07:05:43.505343 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-dcb459ff6-b58pb"
Mar 10 07:05:43 crc kubenswrapper[4825]: I0310 07:05:43.516954 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-599df5898d-bqcpr" event={"ID":"86dc5ac4-dac8-4fdf-bba9-06d13efacd53","Type":"ContainerStarted","Data":"878047a39225dab631a6decdc7de8a0629c77fdf4b999f47c2c1d0353e8c9e86"}
Mar 10 07:05:43 crc kubenswrapper[4825]: I0310 07:05:43.517008 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-599df5898d-bqcpr" event={"ID":"86dc5ac4-dac8-4fdf-bba9-06d13efacd53","Type":"ContainerStarted","Data":"45d34bb2ca07c04238a805768d304ffe30e1488ed0f99c18fc8607e89c89570d"}
Mar 10 07:05:43 crc kubenswrapper[4825]: I0310 07:05:43.517269 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-599df5898d-bqcpr"
Mar 10 07:05:43 crc kubenswrapper[4825]: I0310 07:05:43.541630 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-dcb459ff6-b58pb" podStartSLOduration=2.541607702 podStartE2EDuration="2.541607702s" podCreationTimestamp="2026-03-10 07:05:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:05:43.533375617 +0000 UTC m=+1296.563156242" watchObservedRunningTime="2026-03-10 07:05:43.541607702 +0000 UTC m=+1296.571388317"
Mar 10 07:05:43 crc kubenswrapper[4825]: I0310 07:05:43.544373 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8449d68f4f-zldfp" event={"ID":"b960a8d7-4706-4657-9991-84a87645ab8a","Type":"ContainerStarted","Data":"1a209b2e2e157803fd09a01c067e18e18efe719eae8e01be688bef95b8efab6f"}
Mar 10 07:05:43 crc kubenswrapper[4825]: I0310 07:05:43.544415 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8449d68f4f-zldfp"
Mar 10 07:05:43 crc kubenswrapper[4825]: I0310 07:05:43.590443 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-599df5898d-bqcpr" podStartSLOduration=2.590416164 podStartE2EDuration="2.590416164s" podCreationTimestamp="2026-03-10 07:05:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:05:43.548090221 +0000 UTC m=+1296.577870836" watchObservedRunningTime="2026-03-10 07:05:43.590416164 +0000 UTC m=+1296.620196809"
Mar 10 07:05:43 crc kubenswrapper[4825]: I0310 07:05:43.605232 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8449d68f4f-zldfp" podStartSLOduration=5.605202009 podStartE2EDuration="5.605202009s" podCreationTimestamp="2026-03-10 07:05:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:05:43.587406985 +0000 UTC m=+1296.617187610" watchObservedRunningTime="2026-03-10 07:05:43.605202009 +0000 UTC m=+1296.634982644"
Mar 10 07:05:44 crc kubenswrapper[4825]: I0310 07:05:44.553090 4825 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 10 07:05:44 crc kubenswrapper[4825]: I0310 07:05:44.553377 4825 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 10 07:05:45 crc kubenswrapper[4825]: I0310 07:05:45.074785 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Mar 10 07:05:45 crc kubenswrapper[4825]: I0310 07:05:45.075147 4825 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 10 07:05:45 crc kubenswrapper[4825]: I0310 07:05:45.117808 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Mar 10 07:05:45 crc kubenswrapper[4825]: I0310 07:05:45.134609 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Mar 10 07:05:45 crc kubenswrapper[4825]: I0310 07:05:45.234469 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Mar 10 07:05:45 crc kubenswrapper[4825]: I0310 07:05:45.564434 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8646874786-6nnq4" event={"ID":"beaf0874-59e2-4ec2-9425-4c11184a7e3d","Type":"ContainerStarted","Data":"99223e0c7d50ee9fd6f248dfcb3f76a75aa508bb660de7f5dcbf4cda04138d4d"}
Mar 10 07:05:45 crc kubenswrapper[4825]: I0310 07:05:45.564492 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8646874786-6nnq4" event={"ID":"beaf0874-59e2-4ec2-9425-4c11184a7e3d","Type":"ContainerStarted","Data":"1c204c0dd92e2e8c589476254c0066dc443ca2290d53873337c9657d6386b775"}
Mar 10 07:05:45 crc kubenswrapper[4825]: I0310 07:05:45.569179 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-b6df5fb85-rp95h" event={"ID":"6fd14279-d503-4cda-a6b4-14bfd6945596","Type":"ContainerStarted","Data":"5ba93edcb4a95504083c440ac77fb9ff72be024ad46e8a250d21b4b8dba98d3f"}
Mar 10 07:05:45 crc kubenswrapper[4825]: I0310 07:05:45.569217 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-b6df5fb85-rp95h" event={"ID":"6fd14279-d503-4cda-a6b4-14bfd6945596","Type":"ContainerStarted","Data":"7d1abc0040c690dba66e527f4090056039b02be17578050e52f970cade647ad7"}
Mar 10 07:05:45 crc kubenswrapper[4825]: I0310 07:05:45.589755 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-8646874786-6nnq4" podStartSLOduration=4.570333646 podStartE2EDuration="7.589733012s" podCreationTimestamp="2026-03-10 07:05:38 +0000 UTC" firstStartedPulling="2026-03-10 07:05:41.4896423 +0000 UTC m=+1294.519422915" lastFinishedPulling="2026-03-10 07:05:44.509041666 +0000 UTC m=+1297.538822281" observedRunningTime="2026-03-10 07:05:45.584573978 +0000 UTC m=+1298.614354593" watchObservedRunningTime="2026-03-10 07:05:45.589733012 +0000 UTC m=+1298.619513627"
Mar 10 07:05:45 crc kubenswrapper[4825]: I0310 07:05:45.604952 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-b6df5fb85-rp95h" podStartSLOduration=4.648648898 podStartE2EDuration="7.604933519s" podCreationTimestamp="2026-03-10 07:05:38 +0000 UTC" firstStartedPulling="2026-03-10 07:05:41.557411716 +0000 UTC m=+1294.587192331" lastFinishedPulling="2026-03-10 07:05:44.513696337 +0000 UTC m=+1297.543476952" observedRunningTime="2026-03-10 07:05:45.600979436 +0000 UTC m=+1298.630760051" watchObservedRunningTime="2026-03-10 07:05:45.604933519 +0000 UTC m=+1298.634714134"
Mar 10 07:05:45 crc kubenswrapper[4825]: I0310 07:05:45.785485 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-54d455c957-2jn47"]
Mar 10 07:05:45 crc kubenswrapper[4825]: I0310 07:05:45.787614 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-54d455c957-2jn47"
Mar 10 07:05:45 crc kubenswrapper[4825]: I0310 07:05:45.793259 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-75884b96b6-fqkns"]
Mar 10 07:05:45 crc kubenswrapper[4825]: I0310 07:05:45.794738 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-75884b96b6-fqkns"
Mar 10 07:05:45 crc kubenswrapper[4825]: I0310 07:05:45.823270 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-54d455c957-2jn47"]
Mar 10 07:05:45 crc kubenswrapper[4825]: I0310 07:05:45.836106 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-75884b96b6-fqkns"]
Mar 10 07:05:45 crc kubenswrapper[4825]: I0310 07:05:45.916429 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79caf655-43c2-401d-9967-97d9a35d9741-config-data\") pod \"barbican-worker-54d455c957-2jn47\" (UID: \"79caf655-43c2-401d-9967-97d9a35d9741\") " pod="openstack/barbican-worker-54d455c957-2jn47"
Mar 10 07:05:45 crc kubenswrapper[4825]: I0310 07:05:45.916499 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d907281e-010a-403c-ba2a-b8d178dacbb0-config-data\") pod \"barbican-keystone-listener-75884b96b6-fqkns\" (UID: \"d907281e-010a-403c-ba2a-b8d178dacbb0\") " pod="openstack/barbican-keystone-listener-75884b96b6-fqkns"
Mar 10 07:05:45 crc kubenswrapper[4825]: I0310 07:05:45.916558 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d907281e-010a-403c-ba2a-b8d178dacbb0-logs\") pod \"barbican-keystone-listener-75884b96b6-fqkns\" (UID: \"d907281e-010a-403c-ba2a-b8d178dacbb0\") " pod="openstack/barbican-keystone-listener-75884b96b6-fqkns"
Mar 10 07:05:45 crc kubenswrapper[4825]: I0310 07:05:45.916592 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79caf655-43c2-401d-9967-97d9a35d9741-combined-ca-bundle\") pod \"barbican-worker-54d455c957-2jn47\" (UID: \"79caf655-43c2-401d-9967-97d9a35d9741\") " pod="openstack/barbican-worker-54d455c957-2jn47"
Mar 10 07:05:45 crc kubenswrapper[4825]: I0310 07:05:45.916720 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79caf655-43c2-401d-9967-97d9a35d9741-logs\") pod \"barbican-worker-54d455c957-2jn47\" (UID: \"79caf655-43c2-401d-9967-97d9a35d9741\") " pod="openstack/barbican-worker-54d455c957-2jn47"
Mar 10 07:05:45 crc kubenswrapper[4825]: I0310 07:05:45.916773 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g6d7\" (UniqueName: \"kubernetes.io/projected/d907281e-010a-403c-ba2a-b8d178dacbb0-kube-api-access-7g6d7\") pod \"barbican-keystone-listener-75884b96b6-fqkns\" (UID: \"d907281e-010a-403c-ba2a-b8d178dacbb0\") " pod="openstack/barbican-keystone-listener-75884b96b6-fqkns"
Mar 10 07:05:45 crc kubenswrapper[4825]: I0310 07:05:45.916812 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79caf655-43c2-401d-9967-97d9a35d9741-config-data-custom\") pod \"barbican-worker-54d455c957-2jn47\" (UID: \"79caf655-43c2-401d-9967-97d9a35d9741\") " pod="openstack/barbican-worker-54d455c957-2jn47"
Mar 10 07:05:45 crc kubenswrapper[4825]: I0310 07:05:45.916872 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77l52\" (UniqueName: \"kubernetes.io/projected/79caf655-43c2-401d-9967-97d9a35d9741-kube-api-access-77l52\") pod \"barbican-worker-54d455c957-2jn47\" (UID: \"79caf655-43c2-401d-9967-97d9a35d9741\") " pod="openstack/barbican-worker-54d455c957-2jn47"
Mar 10 07:05:45 crc kubenswrapper[4825]: I0310 07:05:45.917004 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d907281e-010a-403c-ba2a-b8d178dacbb0-config-data-custom\") pod \"barbican-keystone-listener-75884b96b6-fqkns\" (UID: \"d907281e-010a-403c-ba2a-b8d178dacbb0\") " pod="openstack/barbican-keystone-listener-75884b96b6-fqkns"
Mar 10 07:05:45 crc kubenswrapper[4825]: I0310 07:05:45.917089 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d907281e-010a-403c-ba2a-b8d178dacbb0-combined-ca-bundle\") pod \"barbican-keystone-listener-75884b96b6-fqkns\" (UID: \"d907281e-010a-403c-ba2a-b8d178dacbb0\") " pod="openstack/barbican-keystone-listener-75884b96b6-fqkns"
Mar 10 07:05:45 crc kubenswrapper[4825]: I0310 07:05:45.969283 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6d84cdb96-k2sbq"]
Mar 10 07:05:45 crc kubenswrapper[4825]: I0310 07:05:45.969860 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6d84cdb96-k2sbq" podUID="360e5ec8-ccd5-4e2e-8bea-f6773e7d2792" containerName="barbican-api" containerID="cri-o://aa924c2f0665e6994624c651ef160599dd8014ee01e883f7a0cece8680569823" gracePeriod=30
Mar 10 07:05:45 crc kubenswrapper[4825]: I0310 07:05:45.969973 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6d84cdb96-k2sbq" podUID="360e5ec8-ccd5-4e2e-8bea-f6773e7d2792" containerName="barbican-api-log" containerID="cri-o://a752c292a0ae40a81df7bf9c73349a256e2f9ef0fb232b0a1ab397f7115ade30" gracePeriod=30
Mar 10 07:05:45 crc kubenswrapper[4825]: I0310 07:05:45.986677 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-9c98f44d4-hm6qz"]
Mar 10 07:05:45 crc kubenswrapper[4825]: I0310 07:05:45.988337 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-9c98f44d4-hm6qz"
Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.007798 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-9c98f44d4-hm6qz"]
Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.018402 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79caf655-43c2-401d-9967-97d9a35d9741-combined-ca-bundle\") pod \"barbican-worker-54d455c957-2jn47\" (UID: \"79caf655-43c2-401d-9967-97d9a35d9741\") " pod="openstack/barbican-worker-54d455c957-2jn47"
Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.018474 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79caf655-43c2-401d-9967-97d9a35d9741-logs\") pod \"barbican-worker-54d455c957-2jn47\" (UID: \"79caf655-43c2-401d-9967-97d9a35d9741\") " pod="openstack/barbican-worker-54d455c957-2jn47"
Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.018501 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7g6d7\" (UniqueName: \"kubernetes.io/projected/d907281e-010a-403c-ba2a-b8d178dacbb0-kube-api-access-7g6d7\") pod \"barbican-keystone-listener-75884b96b6-fqkns\" (UID: \"d907281e-010a-403c-ba2a-b8d178dacbb0\") " pod="openstack/barbican-keystone-listener-75884b96b6-fqkns"
Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.018533 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79caf655-43c2-401d-9967-97d9a35d9741-config-data-custom\") pod \"barbican-worker-54d455c957-2jn47\" (UID: \"79caf655-43c2-401d-9967-97d9a35d9741\") " pod="openstack/barbican-worker-54d455c957-2jn47"
Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.018561 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume
\"kube-api-access-77l52\" (UniqueName: \"kubernetes.io/projected/79caf655-43c2-401d-9967-97d9a35d9741-kube-api-access-77l52\") pod \"barbican-worker-54d455c957-2jn47\" (UID: \"79caf655-43c2-401d-9967-97d9a35d9741\") " pod="openstack/barbican-worker-54d455c957-2jn47" Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.018620 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d907281e-010a-403c-ba2a-b8d178dacbb0-config-data-custom\") pod \"barbican-keystone-listener-75884b96b6-fqkns\" (UID: \"d907281e-010a-403c-ba2a-b8d178dacbb0\") " pod="openstack/barbican-keystone-listener-75884b96b6-fqkns" Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.018654 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d907281e-010a-403c-ba2a-b8d178dacbb0-combined-ca-bundle\") pod \"barbican-keystone-listener-75884b96b6-fqkns\" (UID: \"d907281e-010a-403c-ba2a-b8d178dacbb0\") " pod="openstack/barbican-keystone-listener-75884b96b6-fqkns" Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.018710 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79caf655-43c2-401d-9967-97d9a35d9741-config-data\") pod \"barbican-worker-54d455c957-2jn47\" (UID: \"79caf655-43c2-401d-9967-97d9a35d9741\") " pod="openstack/barbican-worker-54d455c957-2jn47" Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.018727 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d907281e-010a-403c-ba2a-b8d178dacbb0-config-data\") pod \"barbican-keystone-listener-75884b96b6-fqkns\" (UID: \"d907281e-010a-403c-ba2a-b8d178dacbb0\") " pod="openstack/barbican-keystone-listener-75884b96b6-fqkns" Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.018747 4825 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d907281e-010a-403c-ba2a-b8d178dacbb0-logs\") pod \"barbican-keystone-listener-75884b96b6-fqkns\" (UID: \"d907281e-010a-403c-ba2a-b8d178dacbb0\") " pod="openstack/barbican-keystone-listener-75884b96b6-fqkns" Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.019114 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d907281e-010a-403c-ba2a-b8d178dacbb0-logs\") pod \"barbican-keystone-listener-75884b96b6-fqkns\" (UID: \"d907281e-010a-403c-ba2a-b8d178dacbb0\") " pod="openstack/barbican-keystone-listener-75884b96b6-fqkns" Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.020595 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79caf655-43c2-401d-9967-97d9a35d9741-logs\") pod \"barbican-worker-54d455c957-2jn47\" (UID: \"79caf655-43c2-401d-9967-97d9a35d9741\") " pod="openstack/barbican-worker-54d455c957-2jn47" Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.026455 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79caf655-43c2-401d-9967-97d9a35d9741-combined-ca-bundle\") pod \"barbican-worker-54d455c957-2jn47\" (UID: \"79caf655-43c2-401d-9967-97d9a35d9741\") " pod="openstack/barbican-worker-54d455c957-2jn47" Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.030312 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d907281e-010a-403c-ba2a-b8d178dacbb0-combined-ca-bundle\") pod \"barbican-keystone-listener-75884b96b6-fqkns\" (UID: \"d907281e-010a-403c-ba2a-b8d178dacbb0\") " pod="openstack/barbican-keystone-listener-75884b96b6-fqkns" Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.031542 4825 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d907281e-010a-403c-ba2a-b8d178dacbb0-config-data\") pod \"barbican-keystone-listener-75884b96b6-fqkns\" (UID: \"d907281e-010a-403c-ba2a-b8d178dacbb0\") " pod="openstack/barbican-keystone-listener-75884b96b6-fqkns" Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.031817 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79caf655-43c2-401d-9967-97d9a35d9741-config-data\") pod \"barbican-worker-54d455c957-2jn47\" (UID: \"79caf655-43c2-401d-9967-97d9a35d9741\") " pod="openstack/barbican-worker-54d455c957-2jn47" Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.033753 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d907281e-010a-403c-ba2a-b8d178dacbb0-config-data-custom\") pod \"barbican-keystone-listener-75884b96b6-fqkns\" (UID: \"d907281e-010a-403c-ba2a-b8d178dacbb0\") " pod="openstack/barbican-keystone-listener-75884b96b6-fqkns" Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.035502 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79caf655-43c2-401d-9967-97d9a35d9741-config-data-custom\") pod \"barbican-worker-54d455c957-2jn47\" (UID: \"79caf655-43c2-401d-9967-97d9a35d9741\") " pod="openstack/barbican-worker-54d455c957-2jn47" Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.036359 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g6d7\" (UniqueName: \"kubernetes.io/projected/d907281e-010a-403c-ba2a-b8d178dacbb0-kube-api-access-7g6d7\") pod \"barbican-keystone-listener-75884b96b6-fqkns\" (UID: \"d907281e-010a-403c-ba2a-b8d178dacbb0\") " pod="openstack/barbican-keystone-listener-75884b96b6-fqkns" Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.036440 4825 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77l52\" (UniqueName: \"kubernetes.io/projected/79caf655-43c2-401d-9967-97d9a35d9741-kube-api-access-77l52\") pod \"barbican-worker-54d455c957-2jn47\" (UID: \"79caf655-43c2-401d-9967-97d9a35d9741\") " pod="openstack/barbican-worker-54d455c957-2jn47" Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.120503 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42d8b8af-9916-4aba-b4e7-f825ed30f182-config-data\") pod \"barbican-api-9c98f44d4-hm6qz\" (UID: \"42d8b8af-9916-4aba-b4e7-f825ed30f182\") " pod="openstack/barbican-api-9c98f44d4-hm6qz" Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.120850 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42d8b8af-9916-4aba-b4e7-f825ed30f182-logs\") pod \"barbican-api-9c98f44d4-hm6qz\" (UID: \"42d8b8af-9916-4aba-b4e7-f825ed30f182\") " pod="openstack/barbican-api-9c98f44d4-hm6qz" Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.120970 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42d8b8af-9916-4aba-b4e7-f825ed30f182-public-tls-certs\") pod \"barbican-api-9c98f44d4-hm6qz\" (UID: \"42d8b8af-9916-4aba-b4e7-f825ed30f182\") " pod="openstack/barbican-api-9c98f44d4-hm6qz" Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.121038 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42d8b8af-9916-4aba-b4e7-f825ed30f182-config-data-custom\") pod \"barbican-api-9c98f44d4-hm6qz\" (UID: \"42d8b8af-9916-4aba-b4e7-f825ed30f182\") " pod="openstack/barbican-api-9c98f44d4-hm6qz" Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 
07:05:46.121061 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42d8b8af-9916-4aba-b4e7-f825ed30f182-internal-tls-certs\") pod \"barbican-api-9c98f44d4-hm6qz\" (UID: \"42d8b8af-9916-4aba-b4e7-f825ed30f182\") " pod="openstack/barbican-api-9c98f44d4-hm6qz" Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.121198 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m55qr\" (UniqueName: \"kubernetes.io/projected/42d8b8af-9916-4aba-b4e7-f825ed30f182-kube-api-access-m55qr\") pod \"barbican-api-9c98f44d4-hm6qz\" (UID: \"42d8b8af-9916-4aba-b4e7-f825ed30f182\") " pod="openstack/barbican-api-9c98f44d4-hm6qz" Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.121246 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42d8b8af-9916-4aba-b4e7-f825ed30f182-combined-ca-bundle\") pod \"barbican-api-9c98f44d4-hm6qz\" (UID: \"42d8b8af-9916-4aba-b4e7-f825ed30f182\") " pod="openstack/barbican-api-9c98f44d4-hm6qz" Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.128511 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-54d455c957-2jn47" Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.155737 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-75884b96b6-fqkns" Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.223678 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42d8b8af-9916-4aba-b4e7-f825ed30f182-config-data\") pod \"barbican-api-9c98f44d4-hm6qz\" (UID: \"42d8b8af-9916-4aba-b4e7-f825ed30f182\") " pod="openstack/barbican-api-9c98f44d4-hm6qz" Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.223752 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42d8b8af-9916-4aba-b4e7-f825ed30f182-logs\") pod \"barbican-api-9c98f44d4-hm6qz\" (UID: \"42d8b8af-9916-4aba-b4e7-f825ed30f182\") " pod="openstack/barbican-api-9c98f44d4-hm6qz" Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.223829 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42d8b8af-9916-4aba-b4e7-f825ed30f182-public-tls-certs\") pod \"barbican-api-9c98f44d4-hm6qz\" (UID: \"42d8b8af-9916-4aba-b4e7-f825ed30f182\") " pod="openstack/barbican-api-9c98f44d4-hm6qz" Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.223858 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42d8b8af-9916-4aba-b4e7-f825ed30f182-config-data-custom\") pod \"barbican-api-9c98f44d4-hm6qz\" (UID: \"42d8b8af-9916-4aba-b4e7-f825ed30f182\") " pod="openstack/barbican-api-9c98f44d4-hm6qz" Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.223886 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42d8b8af-9916-4aba-b4e7-f825ed30f182-internal-tls-certs\") pod \"barbican-api-9c98f44d4-hm6qz\" (UID: \"42d8b8af-9916-4aba-b4e7-f825ed30f182\") " 
pod="openstack/barbican-api-9c98f44d4-hm6qz" Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.223921 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m55qr\" (UniqueName: \"kubernetes.io/projected/42d8b8af-9916-4aba-b4e7-f825ed30f182-kube-api-access-m55qr\") pod \"barbican-api-9c98f44d4-hm6qz\" (UID: \"42d8b8af-9916-4aba-b4e7-f825ed30f182\") " pod="openstack/barbican-api-9c98f44d4-hm6qz" Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.223946 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42d8b8af-9916-4aba-b4e7-f825ed30f182-combined-ca-bundle\") pod \"barbican-api-9c98f44d4-hm6qz\" (UID: \"42d8b8af-9916-4aba-b4e7-f825ed30f182\") " pod="openstack/barbican-api-9c98f44d4-hm6qz" Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.226105 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42d8b8af-9916-4aba-b4e7-f825ed30f182-logs\") pod \"barbican-api-9c98f44d4-hm6qz\" (UID: \"42d8b8af-9916-4aba-b4e7-f825ed30f182\") " pod="openstack/barbican-api-9c98f44d4-hm6qz" Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.227318 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42d8b8af-9916-4aba-b4e7-f825ed30f182-public-tls-certs\") pod \"barbican-api-9c98f44d4-hm6qz\" (UID: \"42d8b8af-9916-4aba-b4e7-f825ed30f182\") " pod="openstack/barbican-api-9c98f44d4-hm6qz" Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.228567 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42d8b8af-9916-4aba-b4e7-f825ed30f182-config-data-custom\") pod \"barbican-api-9c98f44d4-hm6qz\" (UID: \"42d8b8af-9916-4aba-b4e7-f825ed30f182\") " pod="openstack/barbican-api-9c98f44d4-hm6qz" Mar 10 07:05:46 crc 
kubenswrapper[4825]: I0310 07:05:46.230522 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42d8b8af-9916-4aba-b4e7-f825ed30f182-combined-ca-bundle\") pod \"barbican-api-9c98f44d4-hm6qz\" (UID: \"42d8b8af-9916-4aba-b4e7-f825ed30f182\") " pod="openstack/barbican-api-9c98f44d4-hm6qz" Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.230919 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42d8b8af-9916-4aba-b4e7-f825ed30f182-internal-tls-certs\") pod \"barbican-api-9c98f44d4-hm6qz\" (UID: \"42d8b8af-9916-4aba-b4e7-f825ed30f182\") " pod="openstack/barbican-api-9c98f44d4-hm6qz" Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.236353 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42d8b8af-9916-4aba-b4e7-f825ed30f182-config-data\") pod \"barbican-api-9c98f44d4-hm6qz\" (UID: \"42d8b8af-9916-4aba-b4e7-f825ed30f182\") " pod="openstack/barbican-api-9c98f44d4-hm6qz" Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.253983 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m55qr\" (UniqueName: \"kubernetes.io/projected/42d8b8af-9916-4aba-b4e7-f825ed30f182-kube-api-access-m55qr\") pod \"barbican-api-9c98f44d4-hm6qz\" (UID: \"42d8b8af-9916-4aba-b4e7-f825ed30f182\") " pod="openstack/barbican-api-9c98f44d4-hm6qz" Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.312592 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-9c98f44d4-hm6qz" Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.578643 4825 generic.go:334] "Generic (PLEG): container finished" podID="360e5ec8-ccd5-4e2e-8bea-f6773e7d2792" containerID="aa924c2f0665e6994624c651ef160599dd8014ee01e883f7a0cece8680569823" exitCode=0 Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.578678 4825 generic.go:334] "Generic (PLEG): container finished" podID="360e5ec8-ccd5-4e2e-8bea-f6773e7d2792" containerID="a752c292a0ae40a81df7bf9c73349a256e2f9ef0fb232b0a1ab397f7115ade30" exitCode=143 Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.580681 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6d84cdb96-k2sbq" event={"ID":"360e5ec8-ccd5-4e2e-8bea-f6773e7d2792","Type":"ContainerDied","Data":"aa924c2f0665e6994624c651ef160599dd8014ee01e883f7a0cece8680569823"} Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.580716 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6d84cdb96-k2sbq" event={"ID":"360e5ec8-ccd5-4e2e-8bea-f6773e7d2792","Type":"ContainerDied","Data":"a752c292a0ae40a81df7bf9c73349a256e2f9ef0fb232b0a1ab397f7115ade30"} Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.888491 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 07:05:46 crc kubenswrapper[4825]: I0310 07:05:46.888557 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 07:05:49 crc kubenswrapper[4825]: 
I0310 07:05:49.119625 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8449d68f4f-zldfp" Mar 10 07:05:49 crc kubenswrapper[4825]: I0310 07:05:49.165951 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6d84cdb96-k2sbq" podUID="360e5ec8-ccd5-4e2e-8bea-f6773e7d2792" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.162:9311/healthcheck\": dial tcp 10.217.0.162:9311: connect: connection refused" Mar 10 07:05:49 crc kubenswrapper[4825]: I0310 07:05:49.166306 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6d84cdb96-k2sbq" podUID="360e5ec8-ccd5-4e2e-8bea-f6773e7d2792" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.162:9311/healthcheck\": dial tcp 10.217.0.162:9311: connect: connection refused" Mar 10 07:05:49 crc kubenswrapper[4825]: I0310 07:05:49.271590 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-ccd7c9f8f-cznm2"] Mar 10 07:05:49 crc kubenswrapper[4825]: I0310 07:05:49.275767 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-ccd7c9f8f-cznm2" podUID="359d68ef-affb-48e7-861d-ab0b8d397f47" containerName="dnsmasq-dns" containerID="cri-o://e4421e7c232a25e39615dea0e0fdb279fe983715220552a5e6b6936a1d0a963b" gracePeriod=10 Mar 10 07:05:49 crc kubenswrapper[4825]: I0310 07:05:49.613632 4825 generic.go:334] "Generic (PLEG): container finished" podID="359d68ef-affb-48e7-861d-ab0b8d397f47" containerID="e4421e7c232a25e39615dea0e0fdb279fe983715220552a5e6b6936a1d0a963b" exitCode=0 Mar 10 07:05:49 crc kubenswrapper[4825]: I0310 07:05:49.614168 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccd7c9f8f-cznm2" event={"ID":"359d68ef-affb-48e7-861d-ab0b8d397f47","Type":"ContainerDied","Data":"e4421e7c232a25e39615dea0e0fdb279fe983715220552a5e6b6936a1d0a963b"} Mar 10 07:05:50 crc 
kubenswrapper[4825]: I0310 07:05:50.723630 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-54d455c957-2jn47"] Mar 10 07:05:50 crc kubenswrapper[4825]: W0310 07:05:50.828252 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79caf655_43c2_401d_9967_97d9a35d9741.slice/crio-53553d1e2e85651fb2a187ec88eff96e47b92e66639e428ccf1bf578323e6b0d WatchSource:0}: Error finding container 53553d1e2e85651fb2a187ec88eff96e47b92e66639e428ccf1bf578323e6b0d: Status 404 returned error can't find the container with id 53553d1e2e85651fb2a187ec88eff96e47b92e66639e428ccf1bf578323e6b0d Mar 10 07:05:50 crc kubenswrapper[4825]: I0310 07:05:50.972833 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6d84cdb96-k2sbq" Mar 10 07:05:50 crc kubenswrapper[4825]: I0310 07:05:50.983637 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ccd7c9f8f-cznm2" Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.154073 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/360e5ec8-ccd5-4e2e-8bea-f6773e7d2792-config-data-custom\") pod \"360e5ec8-ccd5-4e2e-8bea-f6773e7d2792\" (UID: \"360e5ec8-ccd5-4e2e-8bea-f6773e7d2792\") " Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.154599 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndxsc\" (UniqueName: \"kubernetes.io/projected/360e5ec8-ccd5-4e2e-8bea-f6773e7d2792-kube-api-access-ndxsc\") pod \"360e5ec8-ccd5-4e2e-8bea-f6773e7d2792\" (UID: \"360e5ec8-ccd5-4e2e-8bea-f6773e7d2792\") " Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.154705 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/359d68ef-affb-48e7-861d-ab0b8d397f47-dns-swift-storage-0\") pod \"359d68ef-affb-48e7-861d-ab0b8d397f47\" (UID: \"359d68ef-affb-48e7-861d-ab0b8d397f47\") " Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.154734 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/359d68ef-affb-48e7-861d-ab0b8d397f47-ovsdbserver-nb\") pod \"359d68ef-affb-48e7-861d-ab0b8d397f47\" (UID: \"359d68ef-affb-48e7-861d-ab0b8d397f47\") " Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.154816 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/359d68ef-affb-48e7-861d-ab0b8d397f47-dns-svc\") pod \"359d68ef-affb-48e7-861d-ab0b8d397f47\" (UID: \"359d68ef-affb-48e7-861d-ab0b8d397f47\") " Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.154860 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/359d68ef-affb-48e7-861d-ab0b8d397f47-config\") pod \"359d68ef-affb-48e7-861d-ab0b8d397f47\" (UID: \"359d68ef-affb-48e7-861d-ab0b8d397f47\") " Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.154911 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/360e5ec8-ccd5-4e2e-8bea-f6773e7d2792-config-data\") pod \"360e5ec8-ccd5-4e2e-8bea-f6773e7d2792\" (UID: \"360e5ec8-ccd5-4e2e-8bea-f6773e7d2792\") " Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.154954 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/360e5ec8-ccd5-4e2e-8bea-f6773e7d2792-combined-ca-bundle\") pod \"360e5ec8-ccd5-4e2e-8bea-f6773e7d2792\" (UID: \"360e5ec8-ccd5-4e2e-8bea-f6773e7d2792\") " Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.154974 4825 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/360e5ec8-ccd5-4e2e-8bea-f6773e7d2792-logs\") pod \"360e5ec8-ccd5-4e2e-8bea-f6773e7d2792\" (UID: \"360e5ec8-ccd5-4e2e-8bea-f6773e7d2792\") " Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.154993 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/359d68ef-affb-48e7-861d-ab0b8d397f47-ovsdbserver-sb\") pod \"359d68ef-affb-48e7-861d-ab0b8d397f47\" (UID: \"359d68ef-affb-48e7-861d-ab0b8d397f47\") " Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.155015 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtg8c\" (UniqueName: \"kubernetes.io/projected/359d68ef-affb-48e7-861d-ab0b8d397f47-kube-api-access-dtg8c\") pod \"359d68ef-affb-48e7-861d-ab0b8d397f47\" (UID: \"359d68ef-affb-48e7-861d-ab0b8d397f47\") " Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.156206 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/360e5ec8-ccd5-4e2e-8bea-f6773e7d2792-logs" (OuterVolumeSpecName: "logs") pod "360e5ec8-ccd5-4e2e-8bea-f6773e7d2792" (UID: "360e5ec8-ccd5-4e2e-8bea-f6773e7d2792"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.197871 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/359d68ef-affb-48e7-861d-ab0b8d397f47-kube-api-access-dtg8c" (OuterVolumeSpecName: "kube-api-access-dtg8c") pod "359d68ef-affb-48e7-861d-ab0b8d397f47" (UID: "359d68ef-affb-48e7-861d-ab0b8d397f47"). InnerVolumeSpecName "kube-api-access-dtg8c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.204370 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/360e5ec8-ccd5-4e2e-8bea-f6773e7d2792-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "360e5ec8-ccd5-4e2e-8bea-f6773e7d2792" (UID: "360e5ec8-ccd5-4e2e-8bea-f6773e7d2792"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.213813 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/360e5ec8-ccd5-4e2e-8bea-f6773e7d2792-kube-api-access-ndxsc" (OuterVolumeSpecName: "kube-api-access-ndxsc") pod "360e5ec8-ccd5-4e2e-8bea-f6773e7d2792" (UID: "360e5ec8-ccd5-4e2e-8bea-f6773e7d2792"). InnerVolumeSpecName "kube-api-access-ndxsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.264646 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/359d68ef-affb-48e7-861d-ab0b8d397f47-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "359d68ef-affb-48e7-861d-ab0b8d397f47" (UID: "359d68ef-affb-48e7-861d-ab0b8d397f47"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.266407 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/359d68ef-affb-48e7-861d-ab0b8d397f47-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.266429 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/360e5ec8-ccd5-4e2e-8bea-f6773e7d2792-logs\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.266438 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtg8c\" (UniqueName: \"kubernetes.io/projected/359d68ef-affb-48e7-861d-ab0b8d397f47-kube-api-access-dtg8c\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.266448 4825 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/360e5ec8-ccd5-4e2e-8bea-f6773e7d2792-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.266456 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndxsc\" (UniqueName: \"kubernetes.io/projected/360e5ec8-ccd5-4e2e-8bea-f6773e7d2792-kube-api-access-ndxsc\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.278745 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/360e5ec8-ccd5-4e2e-8bea-f6773e7d2792-config-data" (OuterVolumeSpecName: "config-data") pod "360e5ec8-ccd5-4e2e-8bea-f6773e7d2792" (UID: "360e5ec8-ccd5-4e2e-8bea-f6773e7d2792"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.280738 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/359d68ef-affb-48e7-861d-ab0b8d397f47-config" (OuterVolumeSpecName: "config") pod "359d68ef-affb-48e7-861d-ab0b8d397f47" (UID: "359d68ef-affb-48e7-861d-ab0b8d397f47"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.308130 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/360e5ec8-ccd5-4e2e-8bea-f6773e7d2792-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "360e5ec8-ccd5-4e2e-8bea-f6773e7d2792" (UID: "360e5ec8-ccd5-4e2e-8bea-f6773e7d2792"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.330967 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/359d68ef-affb-48e7-861d-ab0b8d397f47-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "359d68ef-affb-48e7-861d-ab0b8d397f47" (UID: "359d68ef-affb-48e7-861d-ab0b8d397f47"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.355474 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/359d68ef-affb-48e7-861d-ab0b8d397f47-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "359d68ef-affb-48e7-861d-ab0b8d397f47" (UID: "359d68ef-affb-48e7-861d-ab0b8d397f47"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.368436 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/359d68ef-affb-48e7-861d-ab0b8d397f47-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.368458 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/359d68ef-affb-48e7-861d-ab0b8d397f47-config\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.368468 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/360e5ec8-ccd5-4e2e-8bea-f6773e7d2792-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.368477 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/360e5ec8-ccd5-4e2e-8bea-f6773e7d2792-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.368485 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/359d68ef-affb-48e7-861d-ab0b8d397f47-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.372545 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/359d68ef-affb-48e7-861d-ab0b8d397f47-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "359d68ef-affb-48e7-861d-ab0b8d397f47" (UID: "359d68ef-affb-48e7-861d-ab0b8d397f47"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.414267 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-9c98f44d4-hm6qz"] Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.470695 4825 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/359d68ef-affb-48e7-861d-ab0b8d397f47-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.597314 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-75884b96b6-fqkns"] Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.635058 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-54d455c957-2jn47" event={"ID":"79caf655-43c2-401d-9967-97d9a35d9741","Type":"ContainerStarted","Data":"b0dc5fbc17a28febd97f83275e735288d32e837f8d7e53a674a6790d1cb06167"} Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.635123 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-54d455c957-2jn47" event={"ID":"79caf655-43c2-401d-9967-97d9a35d9741","Type":"ContainerStarted","Data":"53553d1e2e85651fb2a187ec88eff96e47b92e66639e428ccf1bf578323e6b0d"} Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.639288 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0862f63-fd04-41f5-88a2-638189435867","Type":"ContainerStarted","Data":"85f191b32d8eb72bf45bd6404b6c04323fc8f5928580d6347d89da5deaf61e86"} Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.639506 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f0862f63-fd04-41f5-88a2-638189435867" containerName="ceilometer-central-agent" containerID="cri-o://5220de8c97a07f6973f7771ca6c9152a7febd443c4da86431842c9c1571664c3" gracePeriod=30 Mar 10 07:05:51 crc 
kubenswrapper[4825]: I0310 07:05:51.639864 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f0862f63-fd04-41f5-88a2-638189435867" containerName="proxy-httpd" containerID="cri-o://85f191b32d8eb72bf45bd6404b6c04323fc8f5928580d6347d89da5deaf61e86" gracePeriod=30 Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.639921 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.639940 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f0862f63-fd04-41f5-88a2-638189435867" containerName="ceilometer-notification-agent" containerID="cri-o://e27c06202d1e8962a583ae240525b20457246cdcf4407b7147b58552bfbf1108" gracePeriod=30 Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.639928 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f0862f63-fd04-41f5-88a2-638189435867" containerName="sg-core" containerID="cri-o://dd6c629e4ee997cf8d87fb453e10f65c343d3bd0d25e0cc7e06cde200a99b7d6" gracePeriod=30 Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.646506 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9c98f44d4-hm6qz" event={"ID":"42d8b8af-9916-4aba-b4e7-f825ed30f182","Type":"ContainerStarted","Data":"6a1691cba1fef49f1aa2f7c44e95ac06b21fe92d69b0eadbad46361719368876"} Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.646555 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9c98f44d4-hm6qz" event={"ID":"42d8b8af-9916-4aba-b4e7-f825ed30f182","Type":"ContainerStarted","Data":"113dd6b9c04855397b2f1ec5c7cdc93dc6e65cfadd3838790802d833cf902ff4"} Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.648651 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-75884b96b6-fqkns" 
event={"ID":"d907281e-010a-403c-ba2a-b8d178dacbb0","Type":"ContainerStarted","Data":"b06ac46e842de2fe235b4bf3a5e34fbf009cceee6b2267a3ea7854b38f4849fe"} Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.675738 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6d84cdb96-k2sbq" event={"ID":"360e5ec8-ccd5-4e2e-8bea-f6773e7d2792","Type":"ContainerDied","Data":"c7a651b5d91b5136ef4112a6cf4bffd1ef11daee346eee79d6a3737450b667bf"} Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.675801 4825 scope.go:117] "RemoveContainer" containerID="aa924c2f0665e6994624c651ef160599dd8014ee01e883f7a0cece8680569823" Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.675972 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6d84cdb96-k2sbq" Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.682475 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccd7c9f8f-cznm2" event={"ID":"359d68ef-affb-48e7-861d-ab0b8d397f47","Type":"ContainerDied","Data":"7ced23c92af8cd6c52399ed72893cea26284bf43ddac1ccfc767f31d13ca8fe3"} Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.682564 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-ccd7c9f8f-cznm2" Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.689198 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.717973378 podStartE2EDuration="42.689172545s" podCreationTimestamp="2026-03-10 07:05:09 +0000 UTC" firstStartedPulling="2026-03-10 07:05:11.060472952 +0000 UTC m=+1264.090253567" lastFinishedPulling="2026-03-10 07:05:51.031672119 +0000 UTC m=+1304.061452734" observedRunningTime="2026-03-10 07:05:51.677002677 +0000 UTC m=+1304.706783292" watchObservedRunningTime="2026-03-10 07:05:51.689172545 +0000 UTC m=+1304.718953160" Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.755069 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6d84cdb96-k2sbq"] Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.756389 4825 scope.go:117] "RemoveContainer" containerID="a752c292a0ae40a81df7bf9c73349a256e2f9ef0fb232b0a1ab397f7115ade30" Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.785459 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6d84cdb96-k2sbq"] Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.800942 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-ccd7c9f8f-cznm2"] Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.810463 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-ccd7c9f8f-cznm2"] Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.815362 4825 scope.go:117] "RemoveContainer" containerID="e4421e7c232a25e39615dea0e0fdb279fe983715220552a5e6b6936a1d0a963b" Mar 10 07:05:51 crc kubenswrapper[4825]: I0310 07:05:51.844278 4825 scope.go:117] "RemoveContainer" containerID="07ee4bffa41bb926277c5961dd88a98dee5f10f593bd6aa7cafdefe008bae570" Mar 10 07:05:52 crc kubenswrapper[4825]: I0310 07:05:52.696070 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-worker-54d455c957-2jn47" event={"ID":"79caf655-43c2-401d-9967-97d9a35d9741","Type":"ContainerStarted","Data":"e0c5268cbaf4fcdd48619feb929fd81e110e9f60040fedec074af3e0608a9a8d"} Mar 10 07:05:52 crc kubenswrapper[4825]: I0310 07:05:52.700117 4825 generic.go:334] "Generic (PLEG): container finished" podID="f0862f63-fd04-41f5-88a2-638189435867" containerID="85f191b32d8eb72bf45bd6404b6c04323fc8f5928580d6347d89da5deaf61e86" exitCode=0 Mar 10 07:05:52 crc kubenswrapper[4825]: I0310 07:05:52.700190 4825 generic.go:334] "Generic (PLEG): container finished" podID="f0862f63-fd04-41f5-88a2-638189435867" containerID="dd6c629e4ee997cf8d87fb453e10f65c343d3bd0d25e0cc7e06cde200a99b7d6" exitCode=2 Mar 10 07:05:52 crc kubenswrapper[4825]: I0310 07:05:52.700207 4825 generic.go:334] "Generic (PLEG): container finished" podID="f0862f63-fd04-41f5-88a2-638189435867" containerID="5220de8c97a07f6973f7771ca6c9152a7febd443c4da86431842c9c1571664c3" exitCode=0 Mar 10 07:05:52 crc kubenswrapper[4825]: I0310 07:05:52.700262 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0862f63-fd04-41f5-88a2-638189435867","Type":"ContainerDied","Data":"85f191b32d8eb72bf45bd6404b6c04323fc8f5928580d6347d89da5deaf61e86"} Mar 10 07:05:52 crc kubenswrapper[4825]: I0310 07:05:52.700292 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0862f63-fd04-41f5-88a2-638189435867","Type":"ContainerDied","Data":"dd6c629e4ee997cf8d87fb453e10f65c343d3bd0d25e0cc7e06cde200a99b7d6"} Mar 10 07:05:52 crc kubenswrapper[4825]: I0310 07:05:52.700311 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0862f63-fd04-41f5-88a2-638189435867","Type":"ContainerDied","Data":"5220de8c97a07f6973f7771ca6c9152a7febd443c4da86431842c9c1571664c3"} Mar 10 07:05:52 crc kubenswrapper[4825]: I0310 07:05:52.705946 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-api-9c98f44d4-hm6qz" event={"ID":"42d8b8af-9916-4aba-b4e7-f825ed30f182","Type":"ContainerStarted","Data":"45339b16b80b77a1ec497dc0cfdf7cad94e760f0b42f91faed72b2b7acaeea0f"} Mar 10 07:05:52 crc kubenswrapper[4825]: I0310 07:05:52.706020 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-9c98f44d4-hm6qz" Mar 10 07:05:52 crc kubenswrapper[4825]: I0310 07:05:52.706489 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-9c98f44d4-hm6qz" Mar 10 07:05:52 crc kubenswrapper[4825]: I0310 07:05:52.708813 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-75884b96b6-fqkns" event={"ID":"d907281e-010a-403c-ba2a-b8d178dacbb0","Type":"ContainerStarted","Data":"c5e15cd589465e5cfd321144406581f242067dc2f205033f4254ac94393e1ff6"} Mar 10 07:05:52 crc kubenswrapper[4825]: I0310 07:05:52.708857 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-75884b96b6-fqkns" event={"ID":"d907281e-010a-403c-ba2a-b8d178dacbb0","Type":"ContainerStarted","Data":"f0fabece877d02360c9f8c4a495f79b17208aceb1c8e9d8dac1c99cae589b5d0"} Mar 10 07:05:52 crc kubenswrapper[4825]: I0310 07:05:52.719312 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-j9wfx" event={"ID":"c3f58042-5c26-4b0b-9f85-d35d9305115e","Type":"ContainerStarted","Data":"7c7257e4058fc753891db09721efcb315ea6caeeab1b10d2c3b36746db13b108"} Mar 10 07:05:52 crc kubenswrapper[4825]: I0310 07:05:52.729765 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-54d455c957-2jn47" podStartSLOduration=7.729741136 podStartE2EDuration="7.729741136s" podCreationTimestamp="2026-03-10 07:05:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:05:52.729440018 +0000 UTC 
m=+1305.759220643" watchObservedRunningTime="2026-03-10 07:05:52.729741136 +0000 UTC m=+1305.759521751" Mar 10 07:05:52 crc kubenswrapper[4825]: I0310 07:05:52.753927 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-75884b96b6-fqkns" podStartSLOduration=7.753904966 podStartE2EDuration="7.753904966s" podCreationTimestamp="2026-03-10 07:05:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:05:52.752034887 +0000 UTC m=+1305.781815512" watchObservedRunningTime="2026-03-10 07:05:52.753904966 +0000 UTC m=+1305.783685601" Mar 10 07:05:52 crc kubenswrapper[4825]: I0310 07:05:52.776774 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-b6df5fb85-rp95h"] Mar 10 07:05:52 crc kubenswrapper[4825]: I0310 07:05:52.777040 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-b6df5fb85-rp95h" podUID="6fd14279-d503-4cda-a6b4-14bfd6945596" containerName="barbican-worker-log" containerID="cri-o://7d1abc0040c690dba66e527f4090056039b02be17578050e52f970cade647ad7" gracePeriod=30 Mar 10 07:05:52 crc kubenswrapper[4825]: I0310 07:05:52.777123 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-b6df5fb85-rp95h" podUID="6fd14279-d503-4cda-a6b4-14bfd6945596" containerName="barbican-worker" containerID="cri-o://5ba93edcb4a95504083c440ac77fb9ff72be024ad46e8a250d21b4b8dba98d3f" gracePeriod=30 Mar 10 07:05:52 crc kubenswrapper[4825]: I0310 07:05:52.791198 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-9c98f44d4-hm6qz" podStartSLOduration=7.7911818969999995 podStartE2EDuration="7.791181897s" podCreationTimestamp="2026-03-10 07:05:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-03-10 07:05:52.784080272 +0000 UTC m=+1305.813860897" watchObservedRunningTime="2026-03-10 07:05:52.791181897 +0000 UTC m=+1305.820962512" Mar 10 07:05:52 crc kubenswrapper[4825]: I0310 07:05:52.820754 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-8646874786-6nnq4"] Mar 10 07:05:52 crc kubenswrapper[4825]: I0310 07:05:52.821037 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-8646874786-6nnq4" podUID="beaf0874-59e2-4ec2-9425-4c11184a7e3d" containerName="barbican-keystone-listener-log" containerID="cri-o://1c204c0dd92e2e8c589476254c0066dc443ca2290d53873337c9657d6386b775" gracePeriod=30 Mar 10 07:05:52 crc kubenswrapper[4825]: I0310 07:05:52.821182 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-8646874786-6nnq4" podUID="beaf0874-59e2-4ec2-9425-4c11184a7e3d" containerName="barbican-keystone-listener" containerID="cri-o://99223e0c7d50ee9fd6f248dfcb3f76a75aa508bb660de7f5dcbf4cda04138d4d" gracePeriod=30 Mar 10 07:05:52 crc kubenswrapper[4825]: I0310 07:05:52.841120 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-j9wfx" podStartSLOduration=3.845859704 podStartE2EDuration="43.841096408s" podCreationTimestamp="2026-03-10 07:05:09 +0000 UTC" firstStartedPulling="2026-03-10 07:05:10.987288214 +0000 UTC m=+1264.017068829" lastFinishedPulling="2026-03-10 07:05:50.982524918 +0000 UTC m=+1304.012305533" observedRunningTime="2026-03-10 07:05:52.823347416 +0000 UTC m=+1305.853128031" watchObservedRunningTime="2026-03-10 07:05:52.841096408 +0000 UTC m=+1305.870877023" Mar 10 07:05:53 crc kubenswrapper[4825]: I0310 07:05:53.256669 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="359d68ef-affb-48e7-861d-ab0b8d397f47" path="/var/lib/kubelet/pods/359d68ef-affb-48e7-861d-ab0b8d397f47/volumes" 
Mar 10 07:05:53 crc kubenswrapper[4825]: I0310 07:05:53.257651 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="360e5ec8-ccd5-4e2e-8bea-f6773e7d2792" path="/var/lib/kubelet/pods/360e5ec8-ccd5-4e2e-8bea-f6773e7d2792/volumes" Mar 10 07:05:53 crc kubenswrapper[4825]: I0310 07:05:53.450533 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-dcb459ff6-b58pb" Mar 10 07:05:53 crc kubenswrapper[4825]: I0310 07:05:53.568566 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-dcb459ff6-b58pb" Mar 10 07:05:53 crc kubenswrapper[4825]: I0310 07:05:53.742124 4825 generic.go:334] "Generic (PLEG): container finished" podID="f0862f63-fd04-41f5-88a2-638189435867" containerID="e27c06202d1e8962a583ae240525b20457246cdcf4407b7147b58552bfbf1108" exitCode=0 Mar 10 07:05:53 crc kubenswrapper[4825]: I0310 07:05:53.742187 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0862f63-fd04-41f5-88a2-638189435867","Type":"ContainerDied","Data":"e27c06202d1e8962a583ae240525b20457246cdcf4407b7147b58552bfbf1108"} Mar 10 07:05:53 crc kubenswrapper[4825]: I0310 07:05:53.744726 4825 generic.go:334] "Generic (PLEG): container finished" podID="6fd14279-d503-4cda-a6b4-14bfd6945596" containerID="7d1abc0040c690dba66e527f4090056039b02be17578050e52f970cade647ad7" exitCode=143 Mar 10 07:05:53 crc kubenswrapper[4825]: I0310 07:05:53.744806 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-b6df5fb85-rp95h" event={"ID":"6fd14279-d503-4cda-a6b4-14bfd6945596","Type":"ContainerDied","Data":"7d1abc0040c690dba66e527f4090056039b02be17578050e52f970cade647ad7"} Mar 10 07:05:53 crc kubenswrapper[4825]: I0310 07:05:53.747027 4825 generic.go:334] "Generic (PLEG): container finished" podID="beaf0874-59e2-4ec2-9425-4c11184a7e3d" containerID="1c204c0dd92e2e8c589476254c0066dc443ca2290d53873337c9657d6386b775" exitCode=143 
Mar 10 07:05:53 crc kubenswrapper[4825]: I0310 07:05:53.747904 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8646874786-6nnq4" event={"ID":"beaf0874-59e2-4ec2-9425-4c11184a7e3d","Type":"ContainerDied","Data":"1c204c0dd92e2e8c589476254c0066dc443ca2290d53873337c9657d6386b775"} Mar 10 07:05:53 crc kubenswrapper[4825]: I0310 07:05:53.908733 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.030967 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2z2x2\" (UniqueName: \"kubernetes.io/projected/f0862f63-fd04-41f5-88a2-638189435867-kube-api-access-2z2x2\") pod \"f0862f63-fd04-41f5-88a2-638189435867\" (UID: \"f0862f63-fd04-41f5-88a2-638189435867\") " Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.031453 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0862f63-fd04-41f5-88a2-638189435867-config-data\") pod \"f0862f63-fd04-41f5-88a2-638189435867\" (UID: \"f0862f63-fd04-41f5-88a2-638189435867\") " Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.031532 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f0862f63-fd04-41f5-88a2-638189435867-sg-core-conf-yaml\") pod \"f0862f63-fd04-41f5-88a2-638189435867\" (UID: \"f0862f63-fd04-41f5-88a2-638189435867\") " Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.031570 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0862f63-fd04-41f5-88a2-638189435867-scripts\") pod \"f0862f63-fd04-41f5-88a2-638189435867\" (UID: \"f0862f63-fd04-41f5-88a2-638189435867\") " Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.031607 4825 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0862f63-fd04-41f5-88a2-638189435867-log-httpd\") pod \"f0862f63-fd04-41f5-88a2-638189435867\" (UID: \"f0862f63-fd04-41f5-88a2-638189435867\") " Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.031648 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0862f63-fd04-41f5-88a2-638189435867-combined-ca-bundle\") pod \"f0862f63-fd04-41f5-88a2-638189435867\" (UID: \"f0862f63-fd04-41f5-88a2-638189435867\") " Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.031718 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0862f63-fd04-41f5-88a2-638189435867-run-httpd\") pod \"f0862f63-fd04-41f5-88a2-638189435867\" (UID: \"f0862f63-fd04-41f5-88a2-638189435867\") " Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.032358 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0862f63-fd04-41f5-88a2-638189435867-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f0862f63-fd04-41f5-88a2-638189435867" (UID: "f0862f63-fd04-41f5-88a2-638189435867"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.032376 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0862f63-fd04-41f5-88a2-638189435867-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f0862f63-fd04-41f5-88a2-638189435867" (UID: "f0862f63-fd04-41f5-88a2-638189435867"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.037897 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0862f63-fd04-41f5-88a2-638189435867-kube-api-access-2z2x2" (OuterVolumeSpecName: "kube-api-access-2z2x2") pod "f0862f63-fd04-41f5-88a2-638189435867" (UID: "f0862f63-fd04-41f5-88a2-638189435867"). InnerVolumeSpecName "kube-api-access-2z2x2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.052031 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0862f63-fd04-41f5-88a2-638189435867-scripts" (OuterVolumeSpecName: "scripts") pod "f0862f63-fd04-41f5-88a2-638189435867" (UID: "f0862f63-fd04-41f5-88a2-638189435867"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.079972 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0862f63-fd04-41f5-88a2-638189435867-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f0862f63-fd04-41f5-88a2-638189435867" (UID: "f0862f63-fd04-41f5-88a2-638189435867"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.107898 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0862f63-fd04-41f5-88a2-638189435867-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0862f63-fd04-41f5-88a2-638189435867" (UID: "f0862f63-fd04-41f5-88a2-638189435867"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.142772 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2z2x2\" (UniqueName: \"kubernetes.io/projected/f0862f63-fd04-41f5-88a2-638189435867-kube-api-access-2z2x2\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.142808 4825 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f0862f63-fd04-41f5-88a2-638189435867-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.142818 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0862f63-fd04-41f5-88a2-638189435867-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.142826 4825 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0862f63-fd04-41f5-88a2-638189435867-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.142836 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0862f63-fd04-41f5-88a2-638189435867-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.142846 4825 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0862f63-fd04-41f5-88a2-638189435867-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.181327 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0862f63-fd04-41f5-88a2-638189435867-config-data" (OuterVolumeSpecName: "config-data") pod "f0862f63-fd04-41f5-88a2-638189435867" (UID: "f0862f63-fd04-41f5-88a2-638189435867"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.244316 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0862f63-fd04-41f5-88a2-638189435867-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.759759 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0862f63-fd04-41f5-88a2-638189435867","Type":"ContainerDied","Data":"cd63a26a1a8105fb150a76f6abee7d5c9466889f520fe6a87bec5da6bba60a33"} Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.759829 4825 scope.go:117] "RemoveContainer" containerID="85f191b32d8eb72bf45bd6404b6c04323fc8f5928580d6347d89da5deaf61e86" Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.759771 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.807379 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.816069 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.850085 4825 scope.go:117] "RemoveContainer" containerID="dd6c629e4ee997cf8d87fb453e10f65c343d3bd0d25e0cc7e06cde200a99b7d6" Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.859803 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:05:54 crc kubenswrapper[4825]: E0310 07:05:54.860235 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="359d68ef-affb-48e7-861d-ab0b8d397f47" containerName="dnsmasq-dns" Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.860256 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="359d68ef-affb-48e7-861d-ab0b8d397f47" 
containerName="dnsmasq-dns" Mar 10 07:05:54 crc kubenswrapper[4825]: E0310 07:05:54.860272 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="360e5ec8-ccd5-4e2e-8bea-f6773e7d2792" containerName="barbican-api-log" Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.860278 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="360e5ec8-ccd5-4e2e-8bea-f6773e7d2792" containerName="barbican-api-log" Mar 10 07:05:54 crc kubenswrapper[4825]: E0310 07:05:54.860289 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0862f63-fd04-41f5-88a2-638189435867" containerName="ceilometer-central-agent" Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.860295 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0862f63-fd04-41f5-88a2-638189435867" containerName="ceilometer-central-agent" Mar 10 07:05:54 crc kubenswrapper[4825]: E0310 07:05:54.860306 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0862f63-fd04-41f5-88a2-638189435867" containerName="ceilometer-notification-agent" Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.860312 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0862f63-fd04-41f5-88a2-638189435867" containerName="ceilometer-notification-agent" Mar 10 07:05:54 crc kubenswrapper[4825]: E0310 07:05:54.860322 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="359d68ef-affb-48e7-861d-ab0b8d397f47" containerName="init" Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.860327 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="359d68ef-affb-48e7-861d-ab0b8d397f47" containerName="init" Mar 10 07:05:54 crc kubenswrapper[4825]: E0310 07:05:54.860343 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0862f63-fd04-41f5-88a2-638189435867" containerName="sg-core" Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.860348 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0862f63-fd04-41f5-88a2-638189435867" 
containerName="sg-core" Mar 10 07:05:54 crc kubenswrapper[4825]: E0310 07:05:54.860357 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="360e5ec8-ccd5-4e2e-8bea-f6773e7d2792" containerName="barbican-api" Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.860363 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="360e5ec8-ccd5-4e2e-8bea-f6773e7d2792" containerName="barbican-api" Mar 10 07:05:54 crc kubenswrapper[4825]: E0310 07:05:54.860377 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0862f63-fd04-41f5-88a2-638189435867" containerName="proxy-httpd" Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.860383 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0862f63-fd04-41f5-88a2-638189435867" containerName="proxy-httpd" Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.860561 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="360e5ec8-ccd5-4e2e-8bea-f6773e7d2792" containerName="barbican-api-log" Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.860576 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0862f63-fd04-41f5-88a2-638189435867" containerName="sg-core" Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.860585 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0862f63-fd04-41f5-88a2-638189435867" containerName="proxy-httpd" Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.860599 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0862f63-fd04-41f5-88a2-638189435867" containerName="ceilometer-notification-agent" Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.860609 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0862f63-fd04-41f5-88a2-638189435867" containerName="ceilometer-central-agent" Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.860623 4825 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="360e5ec8-ccd5-4e2e-8bea-f6773e7d2792" containerName="barbican-api" Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.860636 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="359d68ef-affb-48e7-861d-ab0b8d397f47" containerName="dnsmasq-dns" Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.862386 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.864628 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.864907 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.872617 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.885593 4825 scope.go:117] "RemoveContainer" containerID="e27c06202d1e8962a583ae240525b20457246cdcf4407b7147b58552bfbf1108" Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.947556 4825 scope.go:117] "RemoveContainer" containerID="5220de8c97a07f6973f7771ca6c9152a7febd443c4da86431842c9c1571664c3" Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.958862 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52379ed3-93d8-4f2e-b15e-5ad0cb49ce83-run-httpd\") pod \"ceilometer-0\" (UID: \"52379ed3-93d8-4f2e-b15e-5ad0cb49ce83\") " pod="openstack/ceilometer-0" Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.958933 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52379ed3-93d8-4f2e-b15e-5ad0cb49ce83-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"52379ed3-93d8-4f2e-b15e-5ad0cb49ce83\") " 
pod="openstack/ceilometer-0" Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.958978 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52379ed3-93d8-4f2e-b15e-5ad0cb49ce83-scripts\") pod \"ceilometer-0\" (UID: \"52379ed3-93d8-4f2e-b15e-5ad0cb49ce83\") " pod="openstack/ceilometer-0" Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.959038 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52379ed3-93d8-4f2e-b15e-5ad0cb49ce83-config-data\") pod \"ceilometer-0\" (UID: \"52379ed3-93d8-4f2e-b15e-5ad0cb49ce83\") " pod="openstack/ceilometer-0" Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.959076 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52379ed3-93d8-4f2e-b15e-5ad0cb49ce83-log-httpd\") pod \"ceilometer-0\" (UID: \"52379ed3-93d8-4f2e-b15e-5ad0cb49ce83\") " pod="openstack/ceilometer-0" Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.959098 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw8tx\" (UniqueName: \"kubernetes.io/projected/52379ed3-93d8-4f2e-b15e-5ad0cb49ce83-kube-api-access-cw8tx\") pod \"ceilometer-0\" (UID: \"52379ed3-93d8-4f2e-b15e-5ad0cb49ce83\") " pod="openstack/ceilometer-0" Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.959114 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/52379ed3-93d8-4f2e-b15e-5ad0cb49ce83-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"52379ed3-93d8-4f2e-b15e-5ad0cb49ce83\") " pod="openstack/ceilometer-0" Mar 10 07:05:54 crc kubenswrapper[4825]: I0310 07:05:54.987062 4825 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/ceilometer-0"] Mar 10 07:05:54 crc kubenswrapper[4825]: E0310 07:05:54.987810 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-cw8tx log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="52379ed3-93d8-4f2e-b15e-5ad0cb49ce83" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.060427 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52379ed3-93d8-4f2e-b15e-5ad0cb49ce83-run-httpd\") pod \"ceilometer-0\" (UID: \"52379ed3-93d8-4f2e-b15e-5ad0cb49ce83\") " pod="openstack/ceilometer-0" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.060506 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52379ed3-93d8-4f2e-b15e-5ad0cb49ce83-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"52379ed3-93d8-4f2e-b15e-5ad0cb49ce83\") " pod="openstack/ceilometer-0" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.060555 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52379ed3-93d8-4f2e-b15e-5ad0cb49ce83-scripts\") pod \"ceilometer-0\" (UID: \"52379ed3-93d8-4f2e-b15e-5ad0cb49ce83\") " pod="openstack/ceilometer-0" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.060652 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52379ed3-93d8-4f2e-b15e-5ad0cb49ce83-config-data\") pod \"ceilometer-0\" (UID: \"52379ed3-93d8-4f2e-b15e-5ad0cb49ce83\") " pod="openstack/ceilometer-0" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.060691 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/52379ed3-93d8-4f2e-b15e-5ad0cb49ce83-log-httpd\") pod \"ceilometer-0\" (UID: \"52379ed3-93d8-4f2e-b15e-5ad0cb49ce83\") " pod="openstack/ceilometer-0" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.060715 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cw8tx\" (UniqueName: \"kubernetes.io/projected/52379ed3-93d8-4f2e-b15e-5ad0cb49ce83-kube-api-access-cw8tx\") pod \"ceilometer-0\" (UID: \"52379ed3-93d8-4f2e-b15e-5ad0cb49ce83\") " pod="openstack/ceilometer-0" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.060733 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/52379ed3-93d8-4f2e-b15e-5ad0cb49ce83-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"52379ed3-93d8-4f2e-b15e-5ad0cb49ce83\") " pod="openstack/ceilometer-0" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.061411 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52379ed3-93d8-4f2e-b15e-5ad0cb49ce83-run-httpd\") pod \"ceilometer-0\" (UID: \"52379ed3-93d8-4f2e-b15e-5ad0cb49ce83\") " pod="openstack/ceilometer-0" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.062008 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52379ed3-93d8-4f2e-b15e-5ad0cb49ce83-log-httpd\") pod \"ceilometer-0\" (UID: \"52379ed3-93d8-4f2e-b15e-5ad0cb49ce83\") " pod="openstack/ceilometer-0" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.065817 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52379ed3-93d8-4f2e-b15e-5ad0cb49ce83-scripts\") pod \"ceilometer-0\" (UID: \"52379ed3-93d8-4f2e-b15e-5ad0cb49ce83\") " pod="openstack/ceilometer-0" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.066154 4825 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52379ed3-93d8-4f2e-b15e-5ad0cb49ce83-config-data\") pod \"ceilometer-0\" (UID: \"52379ed3-93d8-4f2e-b15e-5ad0cb49ce83\") " pod="openstack/ceilometer-0" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.066808 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/52379ed3-93d8-4f2e-b15e-5ad0cb49ce83-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"52379ed3-93d8-4f2e-b15e-5ad0cb49ce83\") " pod="openstack/ceilometer-0" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.067786 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52379ed3-93d8-4f2e-b15e-5ad0cb49ce83-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"52379ed3-93d8-4f2e-b15e-5ad0cb49ce83\") " pod="openstack/ceilometer-0" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.087020 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw8tx\" (UniqueName: \"kubernetes.io/projected/52379ed3-93d8-4f2e-b15e-5ad0cb49ce83-kube-api-access-cw8tx\") pod \"ceilometer-0\" (UID: \"52379ed3-93d8-4f2e-b15e-5ad0cb49ce83\") " pod="openstack/ceilometer-0" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.259102 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0862f63-fd04-41f5-88a2-638189435867" path="/var/lib/kubelet/pods/f0862f63-fd04-41f5-88a2-638189435867/volumes" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.593587 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-ccd7c9f8f-cznm2" podUID="359d68ef-affb-48e7-861d-ab0b8d397f47" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.147:5353: i/o timeout" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.635850 4825 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-8646874786-6nnq4" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.698270 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-b6df5fb85-rp95h" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.771818 4825 generic.go:334] "Generic (PLEG): container finished" podID="6fd14279-d503-4cda-a6b4-14bfd6945596" containerID="5ba93edcb4a95504083c440ac77fb9ff72be024ad46e8a250d21b4b8dba98d3f" exitCode=0 Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.771885 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-b6df5fb85-rp95h" event={"ID":"6fd14279-d503-4cda-a6b4-14bfd6945596","Type":"ContainerDied","Data":"5ba93edcb4a95504083c440ac77fb9ff72be024ad46e8a250d21b4b8dba98d3f"} Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.771922 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-b6df5fb85-rp95h" event={"ID":"6fd14279-d503-4cda-a6b4-14bfd6945596","Type":"ContainerDied","Data":"f21e9888ccc1516588069e812e5c52c3b58a35a4014c0d4b75a764a59f934ab4"} Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.771931 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-b6df5fb85-rp95h" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.771948 4825 scope.go:117] "RemoveContainer" containerID="5ba93edcb4a95504083c440ac77fb9ff72be024ad46e8a250d21b4b8dba98d3f" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.776269 4825 generic.go:334] "Generic (PLEG): container finished" podID="beaf0874-59e2-4ec2-9425-4c11184a7e3d" containerID="99223e0c7d50ee9fd6f248dfcb3f76a75aa508bb660de7f5dcbf4cda04138d4d" exitCode=0 Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.776292 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8646874786-6nnq4" event={"ID":"beaf0874-59e2-4ec2-9425-4c11184a7e3d","Type":"ContainerDied","Data":"99223e0c7d50ee9fd6f248dfcb3f76a75aa508bb660de7f5dcbf4cda04138d4d"} Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.776325 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8646874786-6nnq4" event={"ID":"beaf0874-59e2-4ec2-9425-4c11184a7e3d","Type":"ContainerDied","Data":"b2d6d851cdbee1e512f8930b58ce72522fb335db88aba4e5cf7b89c7c3847e60"} Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.776403 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-8646874786-6nnq4" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.777974 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.781642 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beaf0874-59e2-4ec2-9425-4c11184a7e3d-combined-ca-bundle\") pod \"beaf0874-59e2-4ec2-9425-4c11184a7e3d\" (UID: \"beaf0874-59e2-4ec2-9425-4c11184a7e3d\") " Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.781708 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/beaf0874-59e2-4ec2-9425-4c11184a7e3d-config-data-custom\") pod \"beaf0874-59e2-4ec2-9425-4c11184a7e3d\" (UID: \"beaf0874-59e2-4ec2-9425-4c11184a7e3d\") " Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.781800 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/beaf0874-59e2-4ec2-9425-4c11184a7e3d-config-data\") pod \"beaf0874-59e2-4ec2-9425-4c11184a7e3d\" (UID: \"beaf0874-59e2-4ec2-9425-4c11184a7e3d\") " Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.781981 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8b56\" (UniqueName: \"kubernetes.io/projected/beaf0874-59e2-4ec2-9425-4c11184a7e3d-kube-api-access-n8b56\") pod \"beaf0874-59e2-4ec2-9425-4c11184a7e3d\" (UID: \"beaf0874-59e2-4ec2-9425-4c11184a7e3d\") " Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.782011 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/beaf0874-59e2-4ec2-9425-4c11184a7e3d-logs\") pod \"beaf0874-59e2-4ec2-9425-4c11184a7e3d\" (UID: \"beaf0874-59e2-4ec2-9425-4c11184a7e3d\") " Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.782850 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/beaf0874-59e2-4ec2-9425-4c11184a7e3d-logs" (OuterVolumeSpecName: "logs") pod "beaf0874-59e2-4ec2-9425-4c11184a7e3d" (UID: "beaf0874-59e2-4ec2-9425-4c11184a7e3d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.787609 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beaf0874-59e2-4ec2-9425-4c11184a7e3d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "beaf0874-59e2-4ec2-9425-4c11184a7e3d" (UID: "beaf0874-59e2-4ec2-9425-4c11184a7e3d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.790442 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/beaf0874-59e2-4ec2-9425-4c11184a7e3d-kube-api-access-n8b56" (OuterVolumeSpecName: "kube-api-access-n8b56") pod "beaf0874-59e2-4ec2-9425-4c11184a7e3d" (UID: "beaf0874-59e2-4ec2-9425-4c11184a7e3d"). InnerVolumeSpecName "kube-api-access-n8b56". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.792577 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.805420 4825 scope.go:117] "RemoveContainer" containerID="7d1abc0040c690dba66e527f4090056039b02be17578050e52f970cade647ad7" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.808809 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beaf0874-59e2-4ec2-9425-4c11184a7e3d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "beaf0874-59e2-4ec2-9425-4c11184a7e3d" (UID: "beaf0874-59e2-4ec2-9425-4c11184a7e3d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.826477 4825 scope.go:117] "RemoveContainer" containerID="5ba93edcb4a95504083c440ac77fb9ff72be024ad46e8a250d21b4b8dba98d3f" Mar 10 07:05:55 crc kubenswrapper[4825]: E0310 07:05:55.826832 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ba93edcb4a95504083c440ac77fb9ff72be024ad46e8a250d21b4b8dba98d3f\": container with ID starting with 5ba93edcb4a95504083c440ac77fb9ff72be024ad46e8a250d21b4b8dba98d3f not found: ID does not exist" containerID="5ba93edcb4a95504083c440ac77fb9ff72be024ad46e8a250d21b4b8dba98d3f" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.826868 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ba93edcb4a95504083c440ac77fb9ff72be024ad46e8a250d21b4b8dba98d3f"} err="failed to get container status \"5ba93edcb4a95504083c440ac77fb9ff72be024ad46e8a250d21b4b8dba98d3f\": rpc error: code = NotFound desc = could not find container \"5ba93edcb4a95504083c440ac77fb9ff72be024ad46e8a250d21b4b8dba98d3f\": container with ID starting with 5ba93edcb4a95504083c440ac77fb9ff72be024ad46e8a250d21b4b8dba98d3f not found: ID does not exist" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.826901 4825 scope.go:117] "RemoveContainer" containerID="7d1abc0040c690dba66e527f4090056039b02be17578050e52f970cade647ad7" Mar 10 07:05:55 crc kubenswrapper[4825]: E0310 07:05:55.827179 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d1abc0040c690dba66e527f4090056039b02be17578050e52f970cade647ad7\": container with ID starting with 7d1abc0040c690dba66e527f4090056039b02be17578050e52f970cade647ad7 not found: ID does not exist" containerID="7d1abc0040c690dba66e527f4090056039b02be17578050e52f970cade647ad7" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.827205 
4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d1abc0040c690dba66e527f4090056039b02be17578050e52f970cade647ad7"} err="failed to get container status \"7d1abc0040c690dba66e527f4090056039b02be17578050e52f970cade647ad7\": rpc error: code = NotFound desc = could not find container \"7d1abc0040c690dba66e527f4090056039b02be17578050e52f970cade647ad7\": container with ID starting with 7d1abc0040c690dba66e527f4090056039b02be17578050e52f970cade647ad7 not found: ID does not exist" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.827224 4825 scope.go:117] "RemoveContainer" containerID="99223e0c7d50ee9fd6f248dfcb3f76a75aa508bb660de7f5dcbf4cda04138d4d" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.842737 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beaf0874-59e2-4ec2-9425-4c11184a7e3d-config-data" (OuterVolumeSpecName: "config-data") pod "beaf0874-59e2-4ec2-9425-4c11184a7e3d" (UID: "beaf0874-59e2-4ec2-9425-4c11184a7e3d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.883016 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fd14279-d503-4cda-a6b4-14bfd6945596-combined-ca-bundle\") pod \"6fd14279-d503-4cda-a6b4-14bfd6945596\" (UID: \"6fd14279-d503-4cda-a6b4-14bfd6945596\") " Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.883075 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6fd14279-d503-4cda-a6b4-14bfd6945596-config-data-custom\") pod \"6fd14279-d503-4cda-a6b4-14bfd6945596\" (UID: \"6fd14279-d503-4cda-a6b4-14bfd6945596\") " Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.883203 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fd14279-d503-4cda-a6b4-14bfd6945596-config-data\") pod \"6fd14279-d503-4cda-a6b4-14bfd6945596\" (UID: \"6fd14279-d503-4cda-a6b4-14bfd6945596\") " Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.883279 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psdl6\" (UniqueName: \"kubernetes.io/projected/6fd14279-d503-4cda-a6b4-14bfd6945596-kube-api-access-psdl6\") pod \"6fd14279-d503-4cda-a6b4-14bfd6945596\" (UID: \"6fd14279-d503-4cda-a6b4-14bfd6945596\") " Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.883362 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6fd14279-d503-4cda-a6b4-14bfd6945596-logs\") pod \"6fd14279-d503-4cda-a6b4-14bfd6945596\" (UID: \"6fd14279-d503-4cda-a6b4-14bfd6945596\") " Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.884079 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/beaf0874-59e2-4ec2-9425-4c11184a7e3d-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.884110 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8b56\" (UniqueName: \"kubernetes.io/projected/beaf0874-59e2-4ec2-9425-4c11184a7e3d-kube-api-access-n8b56\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.884124 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/beaf0874-59e2-4ec2-9425-4c11184a7e3d-logs\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.884152 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beaf0874-59e2-4ec2-9425-4c11184a7e3d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.884163 4825 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/beaf0874-59e2-4ec2-9425-4c11184a7e3d-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.884307 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fd14279-d503-4cda-a6b4-14bfd6945596-logs" (OuterVolumeSpecName: "logs") pod "6fd14279-d503-4cda-a6b4-14bfd6945596" (UID: "6fd14279-d503-4cda-a6b4-14bfd6945596"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.889749 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fd14279-d503-4cda-a6b4-14bfd6945596-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6fd14279-d503-4cda-a6b4-14bfd6945596" (UID: "6fd14279-d503-4cda-a6b4-14bfd6945596"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.890603 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fd14279-d503-4cda-a6b4-14bfd6945596-kube-api-access-psdl6" (OuterVolumeSpecName: "kube-api-access-psdl6") pod "6fd14279-d503-4cda-a6b4-14bfd6945596" (UID: "6fd14279-d503-4cda-a6b4-14bfd6945596"). InnerVolumeSpecName "kube-api-access-psdl6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.922645 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fd14279-d503-4cda-a6b4-14bfd6945596-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6fd14279-d503-4cda-a6b4-14bfd6945596" (UID: "6fd14279-d503-4cda-a6b4-14bfd6945596"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.940407 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fd14279-d503-4cda-a6b4-14bfd6945596-config-data" (OuterVolumeSpecName: "config-data") pod "6fd14279-d503-4cda-a6b4-14bfd6945596" (UID: "6fd14279-d503-4cda-a6b4-14bfd6945596"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.984964 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52379ed3-93d8-4f2e-b15e-5ad0cb49ce83-run-httpd\") pod \"52379ed3-93d8-4f2e-b15e-5ad0cb49ce83\" (UID: \"52379ed3-93d8-4f2e-b15e-5ad0cb49ce83\") " Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.985082 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52379ed3-93d8-4f2e-b15e-5ad0cb49ce83-combined-ca-bundle\") pod \"52379ed3-93d8-4f2e-b15e-5ad0cb49ce83\" (UID: \"52379ed3-93d8-4f2e-b15e-5ad0cb49ce83\") " Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.985125 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cw8tx\" (UniqueName: \"kubernetes.io/projected/52379ed3-93d8-4f2e-b15e-5ad0cb49ce83-kube-api-access-cw8tx\") pod \"52379ed3-93d8-4f2e-b15e-5ad0cb49ce83\" (UID: \"52379ed3-93d8-4f2e-b15e-5ad0cb49ce83\") " Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.985213 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52379ed3-93d8-4f2e-b15e-5ad0cb49ce83-scripts\") pod \"52379ed3-93d8-4f2e-b15e-5ad0cb49ce83\" (UID: \"52379ed3-93d8-4f2e-b15e-5ad0cb49ce83\") " Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.985308 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/52379ed3-93d8-4f2e-b15e-5ad0cb49ce83-sg-core-conf-yaml\") pod \"52379ed3-93d8-4f2e-b15e-5ad0cb49ce83\" (UID: \"52379ed3-93d8-4f2e-b15e-5ad0cb49ce83\") " Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.985334 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/52379ed3-93d8-4f2e-b15e-5ad0cb49ce83-log-httpd\") pod \"52379ed3-93d8-4f2e-b15e-5ad0cb49ce83\" (UID: \"52379ed3-93d8-4f2e-b15e-5ad0cb49ce83\") " Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.985365 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52379ed3-93d8-4f2e-b15e-5ad0cb49ce83-config-data\") pod \"52379ed3-93d8-4f2e-b15e-5ad0cb49ce83\" (UID: \"52379ed3-93d8-4f2e-b15e-5ad0cb49ce83\") " Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.985740 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fd14279-d503-4cda-a6b4-14bfd6945596-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.985754 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psdl6\" (UniqueName: \"kubernetes.io/projected/6fd14279-d503-4cda-a6b4-14bfd6945596-kube-api-access-psdl6\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.985764 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6fd14279-d503-4cda-a6b4-14bfd6945596-logs\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.985774 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fd14279-d503-4cda-a6b4-14bfd6945596-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.985782 4825 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6fd14279-d503-4cda-a6b4-14bfd6945596-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.986094 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/52379ed3-93d8-4f2e-b15e-5ad0cb49ce83-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "52379ed3-93d8-4f2e-b15e-5ad0cb49ce83" (UID: "52379ed3-93d8-4f2e-b15e-5ad0cb49ce83"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.986433 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52379ed3-93d8-4f2e-b15e-5ad0cb49ce83-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "52379ed3-93d8-4f2e-b15e-5ad0cb49ce83" (UID: "52379ed3-93d8-4f2e-b15e-5ad0cb49ce83"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.990514 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52379ed3-93d8-4f2e-b15e-5ad0cb49ce83-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "52379ed3-93d8-4f2e-b15e-5ad0cb49ce83" (UID: "52379ed3-93d8-4f2e-b15e-5ad0cb49ce83"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.991250 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52379ed3-93d8-4f2e-b15e-5ad0cb49ce83-kube-api-access-cw8tx" (OuterVolumeSpecName: "kube-api-access-cw8tx") pod "52379ed3-93d8-4f2e-b15e-5ad0cb49ce83" (UID: "52379ed3-93d8-4f2e-b15e-5ad0cb49ce83"). InnerVolumeSpecName "kube-api-access-cw8tx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.991407 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52379ed3-93d8-4f2e-b15e-5ad0cb49ce83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52379ed3-93d8-4f2e-b15e-5ad0cb49ce83" (UID: "52379ed3-93d8-4f2e-b15e-5ad0cb49ce83"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.991577 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52379ed3-93d8-4f2e-b15e-5ad0cb49ce83-config-data" (OuterVolumeSpecName: "config-data") pod "52379ed3-93d8-4f2e-b15e-5ad0cb49ce83" (UID: "52379ed3-93d8-4f2e-b15e-5ad0cb49ce83"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:05:55 crc kubenswrapper[4825]: I0310 07:05:55.991622 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52379ed3-93d8-4f2e-b15e-5ad0cb49ce83-scripts" (OuterVolumeSpecName: "scripts") pod "52379ed3-93d8-4f2e-b15e-5ad0cb49ce83" (UID: "52379ed3-93d8-4f2e-b15e-5ad0cb49ce83"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:05:56 crc kubenswrapper[4825]: I0310 07:05:56.060498 4825 scope.go:117] "RemoveContainer" containerID="1c204c0dd92e2e8c589476254c0066dc443ca2290d53873337c9657d6386b775" Mar 10 07:05:56 crc kubenswrapper[4825]: I0310 07:05:56.088013 4825 scope.go:117] "RemoveContainer" containerID="99223e0c7d50ee9fd6f248dfcb3f76a75aa508bb660de7f5dcbf4cda04138d4d" Mar 10 07:05:56 crc kubenswrapper[4825]: I0310 07:05:56.088100 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52379ed3-93d8-4f2e-b15e-5ad0cb49ce83-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:56 crc kubenswrapper[4825]: I0310 07:05:56.088205 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cw8tx\" (UniqueName: \"kubernetes.io/projected/52379ed3-93d8-4f2e-b15e-5ad0cb49ce83-kube-api-access-cw8tx\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:56 crc kubenswrapper[4825]: I0310 07:05:56.088226 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/52379ed3-93d8-4f2e-b15e-5ad0cb49ce83-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:56 crc kubenswrapper[4825]: I0310 07:05:56.088247 4825 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/52379ed3-93d8-4f2e-b15e-5ad0cb49ce83-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:56 crc kubenswrapper[4825]: I0310 07:05:56.088259 4825 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52379ed3-93d8-4f2e-b15e-5ad0cb49ce83-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:56 crc kubenswrapper[4825]: I0310 07:05:56.088269 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52379ed3-93d8-4f2e-b15e-5ad0cb49ce83-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:56 crc kubenswrapper[4825]: I0310 07:05:56.088280 4825 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52379ed3-93d8-4f2e-b15e-5ad0cb49ce83-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:56 crc kubenswrapper[4825]: E0310 07:05:56.088983 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99223e0c7d50ee9fd6f248dfcb3f76a75aa508bb660de7f5dcbf4cda04138d4d\": container with ID starting with 99223e0c7d50ee9fd6f248dfcb3f76a75aa508bb660de7f5dcbf4cda04138d4d not found: ID does not exist" containerID="99223e0c7d50ee9fd6f248dfcb3f76a75aa508bb660de7f5dcbf4cda04138d4d" Mar 10 07:05:56 crc kubenswrapper[4825]: I0310 07:05:56.089022 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99223e0c7d50ee9fd6f248dfcb3f76a75aa508bb660de7f5dcbf4cda04138d4d"} err="failed to get container status \"99223e0c7d50ee9fd6f248dfcb3f76a75aa508bb660de7f5dcbf4cda04138d4d\": rpc error: code = NotFound desc = could not 
find container \"99223e0c7d50ee9fd6f248dfcb3f76a75aa508bb660de7f5dcbf4cda04138d4d\": container with ID starting with 99223e0c7d50ee9fd6f248dfcb3f76a75aa508bb660de7f5dcbf4cda04138d4d not found: ID does not exist" Mar 10 07:05:56 crc kubenswrapper[4825]: I0310 07:05:56.089044 4825 scope.go:117] "RemoveContainer" containerID="1c204c0dd92e2e8c589476254c0066dc443ca2290d53873337c9657d6386b775" Mar 10 07:05:56 crc kubenswrapper[4825]: E0310 07:05:56.089308 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c204c0dd92e2e8c589476254c0066dc443ca2290d53873337c9657d6386b775\": container with ID starting with 1c204c0dd92e2e8c589476254c0066dc443ca2290d53873337c9657d6386b775 not found: ID does not exist" containerID="1c204c0dd92e2e8c589476254c0066dc443ca2290d53873337c9657d6386b775" Mar 10 07:05:56 crc kubenswrapper[4825]: I0310 07:05:56.089330 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c204c0dd92e2e8c589476254c0066dc443ca2290d53873337c9657d6386b775"} err="failed to get container status \"1c204c0dd92e2e8c589476254c0066dc443ca2290d53873337c9657d6386b775\": rpc error: code = NotFound desc = could not find container \"1c204c0dd92e2e8c589476254c0066dc443ca2290d53873337c9657d6386b775\": container with ID starting with 1c204c0dd92e2e8c589476254c0066dc443ca2290d53873337c9657d6386b775 not found: ID does not exist" Mar 10 07:05:56 crc kubenswrapper[4825]: I0310 07:05:56.126909 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-8646874786-6nnq4"] Mar 10 07:05:56 crc kubenswrapper[4825]: I0310 07:05:56.139404 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-8646874786-6nnq4"] Mar 10 07:05:56 crc kubenswrapper[4825]: I0310 07:05:56.166563 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-b6df5fb85-rp95h"] Mar 10 07:05:56 crc 
kubenswrapper[4825]: I0310 07:05:56.180474 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-b6df5fb85-rp95h"] Mar 10 07:05:56 crc kubenswrapper[4825]: I0310 07:05:56.798608 4825 generic.go:334] "Generic (PLEG): container finished" podID="c3f58042-5c26-4b0b-9f85-d35d9305115e" containerID="7c7257e4058fc753891db09721efcb315ea6caeeab1b10d2c3b36746db13b108" exitCode=0 Mar 10 07:05:56 crc kubenswrapper[4825]: I0310 07:05:56.798695 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-j9wfx" event={"ID":"c3f58042-5c26-4b0b-9f85-d35d9305115e","Type":"ContainerDied","Data":"7c7257e4058fc753891db09721efcb315ea6caeeab1b10d2c3b36746db13b108"} Mar 10 07:05:56 crc kubenswrapper[4825]: I0310 07:05:56.800953 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 07:05:56 crc kubenswrapper[4825]: I0310 07:05:56.907537 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:05:56 crc kubenswrapper[4825]: I0310 07:05:56.924557 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:05:56 crc kubenswrapper[4825]: I0310 07:05:56.950307 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:05:56 crc kubenswrapper[4825]: E0310 07:05:56.950890 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fd14279-d503-4cda-a6b4-14bfd6945596" containerName="barbican-worker" Mar 10 07:05:56 crc kubenswrapper[4825]: I0310 07:05:56.950912 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fd14279-d503-4cda-a6b4-14bfd6945596" containerName="barbican-worker" Mar 10 07:05:56 crc kubenswrapper[4825]: E0310 07:05:56.950922 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="beaf0874-59e2-4ec2-9425-4c11184a7e3d" containerName="barbican-keystone-listener" Mar 10 07:05:56 crc kubenswrapper[4825]: I0310 07:05:56.950929 4825 
state_mem.go:107] "Deleted CPUSet assignment" podUID="beaf0874-59e2-4ec2-9425-4c11184a7e3d" containerName="barbican-keystone-listener" Mar 10 07:05:56 crc kubenswrapper[4825]: E0310 07:05:56.950939 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="beaf0874-59e2-4ec2-9425-4c11184a7e3d" containerName="barbican-keystone-listener-log" Mar 10 07:05:56 crc kubenswrapper[4825]: I0310 07:05:56.950947 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="beaf0874-59e2-4ec2-9425-4c11184a7e3d" containerName="barbican-keystone-listener-log" Mar 10 07:05:56 crc kubenswrapper[4825]: E0310 07:05:56.950960 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fd14279-d503-4cda-a6b4-14bfd6945596" containerName="barbican-worker-log" Mar 10 07:05:56 crc kubenswrapper[4825]: I0310 07:05:56.950970 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fd14279-d503-4cda-a6b4-14bfd6945596" containerName="barbican-worker-log" Mar 10 07:05:56 crc kubenswrapper[4825]: I0310 07:05:56.951164 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="beaf0874-59e2-4ec2-9425-4c11184a7e3d" containerName="barbican-keystone-listener" Mar 10 07:05:56 crc kubenswrapper[4825]: I0310 07:05:56.951187 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="beaf0874-59e2-4ec2-9425-4c11184a7e3d" containerName="barbican-keystone-listener-log" Mar 10 07:05:56 crc kubenswrapper[4825]: I0310 07:05:56.951198 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fd14279-d503-4cda-a6b4-14bfd6945596" containerName="barbican-worker-log" Mar 10 07:05:56 crc kubenswrapper[4825]: I0310 07:05:56.951209 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fd14279-d503-4cda-a6b4-14bfd6945596" containerName="barbican-worker" Mar 10 07:05:56 crc kubenswrapper[4825]: I0310 07:05:56.953017 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 07:05:56 crc kubenswrapper[4825]: I0310 07:05:56.958863 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 07:05:56 crc kubenswrapper[4825]: I0310 07:05:56.959297 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 07:05:56 crc kubenswrapper[4825]: I0310 07:05:56.959901 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:05:57 crc kubenswrapper[4825]: I0310 07:05:57.017770 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6zpp\" (UniqueName: \"kubernetes.io/projected/b0a61399-4007-4b70-9067-7f6d04ef56af-kube-api-access-c6zpp\") pod \"ceilometer-0\" (UID: \"b0a61399-4007-4b70-9067-7f6d04ef56af\") " pod="openstack/ceilometer-0" Mar 10 07:05:57 crc kubenswrapper[4825]: I0310 07:05:57.017869 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0a61399-4007-4b70-9067-7f6d04ef56af-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b0a61399-4007-4b70-9067-7f6d04ef56af\") " pod="openstack/ceilometer-0" Mar 10 07:05:57 crc kubenswrapper[4825]: I0310 07:05:57.017955 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0a61399-4007-4b70-9067-7f6d04ef56af-run-httpd\") pod \"ceilometer-0\" (UID: \"b0a61399-4007-4b70-9067-7f6d04ef56af\") " pod="openstack/ceilometer-0" Mar 10 07:05:57 crc kubenswrapper[4825]: I0310 07:05:57.018005 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b0a61399-4007-4b70-9067-7f6d04ef56af-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"b0a61399-4007-4b70-9067-7f6d04ef56af\") " pod="openstack/ceilometer-0" Mar 10 07:05:57 crc kubenswrapper[4825]: I0310 07:05:57.018165 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0a61399-4007-4b70-9067-7f6d04ef56af-scripts\") pod \"ceilometer-0\" (UID: \"b0a61399-4007-4b70-9067-7f6d04ef56af\") " pod="openstack/ceilometer-0" Mar 10 07:05:57 crc kubenswrapper[4825]: I0310 07:05:57.018198 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0a61399-4007-4b70-9067-7f6d04ef56af-config-data\") pod \"ceilometer-0\" (UID: \"b0a61399-4007-4b70-9067-7f6d04ef56af\") " pod="openstack/ceilometer-0" Mar 10 07:05:57 crc kubenswrapper[4825]: I0310 07:05:57.018267 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0a61399-4007-4b70-9067-7f6d04ef56af-log-httpd\") pod \"ceilometer-0\" (UID: \"b0a61399-4007-4b70-9067-7f6d04ef56af\") " pod="openstack/ceilometer-0" Mar 10 07:05:57 crc kubenswrapper[4825]: I0310 07:05:57.119797 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0a61399-4007-4b70-9067-7f6d04ef56af-scripts\") pod \"ceilometer-0\" (UID: \"b0a61399-4007-4b70-9067-7f6d04ef56af\") " pod="openstack/ceilometer-0" Mar 10 07:05:57 crc kubenswrapper[4825]: I0310 07:05:57.119858 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0a61399-4007-4b70-9067-7f6d04ef56af-config-data\") pod \"ceilometer-0\" (UID: \"b0a61399-4007-4b70-9067-7f6d04ef56af\") " pod="openstack/ceilometer-0" Mar 10 07:05:57 crc kubenswrapper[4825]: I0310 07:05:57.119893 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0a61399-4007-4b70-9067-7f6d04ef56af-log-httpd\") pod \"ceilometer-0\" (UID: \"b0a61399-4007-4b70-9067-7f6d04ef56af\") " pod="openstack/ceilometer-0" Mar 10 07:05:57 crc kubenswrapper[4825]: I0310 07:05:57.119917 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6zpp\" (UniqueName: \"kubernetes.io/projected/b0a61399-4007-4b70-9067-7f6d04ef56af-kube-api-access-c6zpp\") pod \"ceilometer-0\" (UID: \"b0a61399-4007-4b70-9067-7f6d04ef56af\") " pod="openstack/ceilometer-0" Mar 10 07:05:57 crc kubenswrapper[4825]: I0310 07:05:57.119962 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0a61399-4007-4b70-9067-7f6d04ef56af-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b0a61399-4007-4b70-9067-7f6d04ef56af\") " pod="openstack/ceilometer-0" Mar 10 07:05:57 crc kubenswrapper[4825]: I0310 07:05:57.120002 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0a61399-4007-4b70-9067-7f6d04ef56af-run-httpd\") pod \"ceilometer-0\" (UID: \"b0a61399-4007-4b70-9067-7f6d04ef56af\") " pod="openstack/ceilometer-0" Mar 10 07:05:57 crc kubenswrapper[4825]: I0310 07:05:57.120044 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b0a61399-4007-4b70-9067-7f6d04ef56af-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b0a61399-4007-4b70-9067-7f6d04ef56af\") " pod="openstack/ceilometer-0" Mar 10 07:05:57 crc kubenswrapper[4825]: I0310 07:05:57.120526 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0a61399-4007-4b70-9067-7f6d04ef56af-log-httpd\") pod \"ceilometer-0\" (UID: \"b0a61399-4007-4b70-9067-7f6d04ef56af\") " pod="openstack/ceilometer-0" Mar 10 07:05:57 
crc kubenswrapper[4825]: I0310 07:05:57.120675 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0a61399-4007-4b70-9067-7f6d04ef56af-run-httpd\") pod \"ceilometer-0\" (UID: \"b0a61399-4007-4b70-9067-7f6d04ef56af\") " pod="openstack/ceilometer-0" Mar 10 07:05:57 crc kubenswrapper[4825]: I0310 07:05:57.126244 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0a61399-4007-4b70-9067-7f6d04ef56af-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b0a61399-4007-4b70-9067-7f6d04ef56af\") " pod="openstack/ceilometer-0" Mar 10 07:05:57 crc kubenswrapper[4825]: I0310 07:05:57.127566 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0a61399-4007-4b70-9067-7f6d04ef56af-scripts\") pod \"ceilometer-0\" (UID: \"b0a61399-4007-4b70-9067-7f6d04ef56af\") " pod="openstack/ceilometer-0" Mar 10 07:05:57 crc kubenswrapper[4825]: I0310 07:05:57.129186 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0a61399-4007-4b70-9067-7f6d04ef56af-config-data\") pod \"ceilometer-0\" (UID: \"b0a61399-4007-4b70-9067-7f6d04ef56af\") " pod="openstack/ceilometer-0" Mar 10 07:05:57 crc kubenswrapper[4825]: I0310 07:05:57.141482 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b0a61399-4007-4b70-9067-7f6d04ef56af-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b0a61399-4007-4b70-9067-7f6d04ef56af\") " pod="openstack/ceilometer-0" Mar 10 07:05:57 crc kubenswrapper[4825]: I0310 07:05:57.145967 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6zpp\" (UniqueName: \"kubernetes.io/projected/b0a61399-4007-4b70-9067-7f6d04ef56af-kube-api-access-c6zpp\") pod \"ceilometer-0\" (UID: 
\"b0a61399-4007-4b70-9067-7f6d04ef56af\") " pod="openstack/ceilometer-0" Mar 10 07:05:57 crc kubenswrapper[4825]: I0310 07:05:57.251866 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52379ed3-93d8-4f2e-b15e-5ad0cb49ce83" path="/var/lib/kubelet/pods/52379ed3-93d8-4f2e-b15e-5ad0cb49ce83/volumes" Mar 10 07:05:57 crc kubenswrapper[4825]: I0310 07:05:57.252534 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fd14279-d503-4cda-a6b4-14bfd6945596" path="/var/lib/kubelet/pods/6fd14279-d503-4cda-a6b4-14bfd6945596/volumes" Mar 10 07:05:57 crc kubenswrapper[4825]: I0310 07:05:57.253434 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="beaf0874-59e2-4ec2-9425-4c11184a7e3d" path="/var/lib/kubelet/pods/beaf0874-59e2-4ec2-9425-4c11184a7e3d/volumes" Mar 10 07:05:57 crc kubenswrapper[4825]: I0310 07:05:57.282839 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 07:05:57 crc kubenswrapper[4825]: I0310 07:05:57.772494 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-9c98f44d4-hm6qz" Mar 10 07:05:57 crc kubenswrapper[4825]: I0310 07:05:57.804713 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-9c98f44d4-hm6qz" Mar 10 07:05:57 crc kubenswrapper[4825]: I0310 07:05:57.809777 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:05:57 crc kubenswrapper[4825]: W0310 07:05:57.831435 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0a61399_4007_4b70_9067_7f6d04ef56af.slice/crio-b63a87ac7ab11c44ec9bd287fcff71aa35533525cd5dab4ed72b524177815804 WatchSource:0}: Error finding container b63a87ac7ab11c44ec9bd287fcff71aa35533525cd5dab4ed72b524177815804: Status 404 returned error can't find the container with id 
b63a87ac7ab11c44ec9bd287fcff71aa35533525cd5dab4ed72b524177815804 Mar 10 07:05:57 crc kubenswrapper[4825]: I0310 07:05:57.894270 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-dcb459ff6-b58pb"] Mar 10 07:05:57 crc kubenswrapper[4825]: I0310 07:05:57.895089 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-dcb459ff6-b58pb" podUID="c93bcce6-e8c8-4916-af01-e85f3dfded64" containerName="barbican-api-log" containerID="cri-o://1838604b40253b47912466b869a4dbb0ed7bea3659d5063f2cc0ce282f9d473f" gracePeriod=30 Mar 10 07:05:57 crc kubenswrapper[4825]: I0310 07:05:57.895319 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-dcb459ff6-b58pb" podUID="c93bcce6-e8c8-4916-af01-e85f3dfded64" containerName="barbican-api" containerID="cri-o://13eef2fd23b3a4387f866485b870a26a32edd760fbc68889410c962fef58826e" gracePeriod=30 Mar 10 07:05:58 crc kubenswrapper[4825]: I0310 07:05:58.313206 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-j9wfx" Mar 10 07:05:58 crc kubenswrapper[4825]: I0310 07:05:58.477181 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c3f58042-5c26-4b0b-9f85-d35d9305115e-etc-machine-id\") pod \"c3f58042-5c26-4b0b-9f85-d35d9305115e\" (UID: \"c3f58042-5c26-4b0b-9f85-d35d9305115e\") " Mar 10 07:05:58 crc kubenswrapper[4825]: I0310 07:05:58.477632 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3f58042-5c26-4b0b-9f85-d35d9305115e-combined-ca-bundle\") pod \"c3f58042-5c26-4b0b-9f85-d35d9305115e\" (UID: \"c3f58042-5c26-4b0b-9f85-d35d9305115e\") " Mar 10 07:05:58 crc kubenswrapper[4825]: I0310 07:05:58.477710 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c3f58042-5c26-4b0b-9f85-d35d9305115e-db-sync-config-data\") pod \"c3f58042-5c26-4b0b-9f85-d35d9305115e\" (UID: \"c3f58042-5c26-4b0b-9f85-d35d9305115e\") " Mar 10 07:05:58 crc kubenswrapper[4825]: I0310 07:05:58.477757 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3f58042-5c26-4b0b-9f85-d35d9305115e-config-data\") pod \"c3f58042-5c26-4b0b-9f85-d35d9305115e\" (UID: \"c3f58042-5c26-4b0b-9f85-d35d9305115e\") " Mar 10 07:05:58 crc kubenswrapper[4825]: I0310 07:05:58.478045 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3f58042-5c26-4b0b-9f85-d35d9305115e-scripts\") pod \"c3f58042-5c26-4b0b-9f85-d35d9305115e\" (UID: \"c3f58042-5c26-4b0b-9f85-d35d9305115e\") " Mar 10 07:05:58 crc kubenswrapper[4825]: I0310 07:05:58.478193 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hnvt\" 
(UniqueName: \"kubernetes.io/projected/c3f58042-5c26-4b0b-9f85-d35d9305115e-kube-api-access-2hnvt\") pod \"c3f58042-5c26-4b0b-9f85-d35d9305115e\" (UID: \"c3f58042-5c26-4b0b-9f85-d35d9305115e\") " Mar 10 07:05:58 crc kubenswrapper[4825]: I0310 07:05:58.477388 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3f58042-5c26-4b0b-9f85-d35d9305115e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c3f58042-5c26-4b0b-9f85-d35d9305115e" (UID: "c3f58042-5c26-4b0b-9f85-d35d9305115e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 07:05:58 crc kubenswrapper[4825]: I0310 07:05:58.483674 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3f58042-5c26-4b0b-9f85-d35d9305115e-scripts" (OuterVolumeSpecName: "scripts") pod "c3f58042-5c26-4b0b-9f85-d35d9305115e" (UID: "c3f58042-5c26-4b0b-9f85-d35d9305115e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:05:58 crc kubenswrapper[4825]: I0310 07:05:58.484534 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3f58042-5c26-4b0b-9f85-d35d9305115e-kube-api-access-2hnvt" (OuterVolumeSpecName: "kube-api-access-2hnvt") pod "c3f58042-5c26-4b0b-9f85-d35d9305115e" (UID: "c3f58042-5c26-4b0b-9f85-d35d9305115e"). InnerVolumeSpecName "kube-api-access-2hnvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:05:58 crc kubenswrapper[4825]: I0310 07:05:58.485980 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3f58042-5c26-4b0b-9f85-d35d9305115e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c3f58042-5c26-4b0b-9f85-d35d9305115e" (UID: "c3f58042-5c26-4b0b-9f85-d35d9305115e"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:05:58 crc kubenswrapper[4825]: I0310 07:05:58.508504 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3f58042-5c26-4b0b-9f85-d35d9305115e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c3f58042-5c26-4b0b-9f85-d35d9305115e" (UID: "c3f58042-5c26-4b0b-9f85-d35d9305115e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:05:58 crc kubenswrapper[4825]: I0310 07:05:58.541950 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3f58042-5c26-4b0b-9f85-d35d9305115e-config-data" (OuterVolumeSpecName: "config-data") pod "c3f58042-5c26-4b0b-9f85-d35d9305115e" (UID: "c3f58042-5c26-4b0b-9f85-d35d9305115e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:05:58 crc kubenswrapper[4825]: I0310 07:05:58.580879 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3f58042-5c26-4b0b-9f85-d35d9305115e-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:58 crc kubenswrapper[4825]: I0310 07:05:58.580910 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hnvt\" (UniqueName: \"kubernetes.io/projected/c3f58042-5c26-4b0b-9f85-d35d9305115e-kube-api-access-2hnvt\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:58 crc kubenswrapper[4825]: I0310 07:05:58.580923 4825 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c3f58042-5c26-4b0b-9f85-d35d9305115e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:58 crc kubenswrapper[4825]: I0310 07:05:58.580935 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3f58042-5c26-4b0b-9f85-d35d9305115e-combined-ca-bundle\") on node \"crc\" DevicePath 
\"\"" Mar 10 07:05:58 crc kubenswrapper[4825]: I0310 07:05:58.580945 4825 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c3f58042-5c26-4b0b-9f85-d35d9305115e-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:58 crc kubenswrapper[4825]: I0310 07:05:58.580953 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3f58042-5c26-4b0b-9f85-d35d9305115e-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:05:58 crc kubenswrapper[4825]: I0310 07:05:58.823354 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0a61399-4007-4b70-9067-7f6d04ef56af","Type":"ContainerStarted","Data":"e3f046df83dd1dc0fe9210b20a9e0d04085a4819187f404caf1145c464c250e1"} Mar 10 07:05:58 crc kubenswrapper[4825]: I0310 07:05:58.823405 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0a61399-4007-4b70-9067-7f6d04ef56af","Type":"ContainerStarted","Data":"b63a87ac7ab11c44ec9bd287fcff71aa35533525cd5dab4ed72b524177815804"} Mar 10 07:05:58 crc kubenswrapper[4825]: I0310 07:05:58.826374 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-j9wfx" Mar 10 07:05:58 crc kubenswrapper[4825]: I0310 07:05:58.827108 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-j9wfx" event={"ID":"c3f58042-5c26-4b0b-9f85-d35d9305115e","Type":"ContainerDied","Data":"eaf7ff81460399f981f5998ee5ed6115093fe16e7bc8a700f771cce129535ec1"} Mar 10 07:05:58 crc kubenswrapper[4825]: I0310 07:05:58.827175 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eaf7ff81460399f981f5998ee5ed6115093fe16e7bc8a700f771cce129535ec1" Mar 10 07:05:58 crc kubenswrapper[4825]: I0310 07:05:58.830362 4825 generic.go:334] "Generic (PLEG): container finished" podID="c93bcce6-e8c8-4916-af01-e85f3dfded64" containerID="1838604b40253b47912466b869a4dbb0ed7bea3659d5063f2cc0ce282f9d473f" exitCode=143 Mar 10 07:05:58 crc kubenswrapper[4825]: I0310 07:05:58.830391 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-dcb459ff6-b58pb" event={"ID":"c93bcce6-e8c8-4916-af01-e85f3dfded64","Type":"ContainerDied","Data":"1838604b40253b47912466b869a4dbb0ed7bea3659d5063f2cc0ce282f9d473f"} Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.227192 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 07:05:59 crc kubenswrapper[4825]: E0310 07:05:59.227589 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3f58042-5c26-4b0b-9f85-d35d9305115e" containerName="cinder-db-sync" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.227605 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3f58042-5c26-4b0b-9f85-d35d9305115e" containerName="cinder-db-sync" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.227789 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3f58042-5c26-4b0b-9f85-d35d9305115e" containerName="cinder-db-sync" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.228699 4825 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.241290 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-7s7j4" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.241710 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.242125 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.242524 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.265600 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.292650 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0c5ff0f-ee41-4f20-b87d-4db0855c9161-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e0c5ff0f-ee41-4f20-b87d-4db0855c9161\") " pod="openstack/cinder-scheduler-0" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.318560 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e0c5ff0f-ee41-4f20-b87d-4db0855c9161-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e0c5ff0f-ee41-4f20-b87d-4db0855c9161\") " pod="openstack/cinder-scheduler-0" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.320339 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mclx\" (UniqueName: \"kubernetes.io/projected/e0c5ff0f-ee41-4f20-b87d-4db0855c9161-kube-api-access-2mclx\") pod \"cinder-scheduler-0\" 
(UID: \"e0c5ff0f-ee41-4f20-b87d-4db0855c9161\") " pod="openstack/cinder-scheduler-0" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.320462 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0c5ff0f-ee41-4f20-b87d-4db0855c9161-scripts\") pod \"cinder-scheduler-0\" (UID: \"e0c5ff0f-ee41-4f20-b87d-4db0855c9161\") " pod="openstack/cinder-scheduler-0" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.320579 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0c5ff0f-ee41-4f20-b87d-4db0855c9161-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e0c5ff0f-ee41-4f20-b87d-4db0855c9161\") " pod="openstack/cinder-scheduler-0" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.320754 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0c5ff0f-ee41-4f20-b87d-4db0855c9161-config-data\") pod \"cinder-scheduler-0\" (UID: \"e0c5ff0f-ee41-4f20-b87d-4db0855c9161\") " pod="openstack/cinder-scheduler-0" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.390201 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b8fcc65cc-n7rdd"] Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.391770 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b8fcc65cc-n7rdd" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.417052 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b8fcc65cc-n7rdd"] Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.432957 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mclx\" (UniqueName: \"kubernetes.io/projected/e0c5ff0f-ee41-4f20-b87d-4db0855c9161-kube-api-access-2mclx\") pod \"cinder-scheduler-0\" (UID: \"e0c5ff0f-ee41-4f20-b87d-4db0855c9161\") " pod="openstack/cinder-scheduler-0" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.433002 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0c5ff0f-ee41-4f20-b87d-4db0855c9161-scripts\") pod \"cinder-scheduler-0\" (UID: \"e0c5ff0f-ee41-4f20-b87d-4db0855c9161\") " pod="openstack/cinder-scheduler-0" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.433030 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmc2f\" (UniqueName: \"kubernetes.io/projected/b9c1ef21-4ae0-4693-baf2-06678f62411e-kube-api-access-pmc2f\") pod \"dnsmasq-dns-7b8fcc65cc-n7rdd\" (UID: \"b9c1ef21-4ae0-4693-baf2-06678f62411e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-n7rdd" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.433055 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0c5ff0f-ee41-4f20-b87d-4db0855c9161-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e0c5ff0f-ee41-4f20-b87d-4db0855c9161\") " pod="openstack/cinder-scheduler-0" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.433071 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/b9c1ef21-4ae0-4693-baf2-06678f62411e-dns-svc\") pod \"dnsmasq-dns-7b8fcc65cc-n7rdd\" (UID: \"b9c1ef21-4ae0-4693-baf2-06678f62411e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-n7rdd" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.433100 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b9c1ef21-4ae0-4693-baf2-06678f62411e-ovsdbserver-nb\") pod \"dnsmasq-dns-7b8fcc65cc-n7rdd\" (UID: \"b9c1ef21-4ae0-4693-baf2-06678f62411e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-n7rdd" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.433148 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9c1ef21-4ae0-4693-baf2-06678f62411e-config\") pod \"dnsmasq-dns-7b8fcc65cc-n7rdd\" (UID: \"b9c1ef21-4ae0-4693-baf2-06678f62411e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-n7rdd" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.433169 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0c5ff0f-ee41-4f20-b87d-4db0855c9161-config-data\") pod \"cinder-scheduler-0\" (UID: \"e0c5ff0f-ee41-4f20-b87d-4db0855c9161\") " pod="openstack/cinder-scheduler-0" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.433197 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0c5ff0f-ee41-4f20-b87d-4db0855c9161-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e0c5ff0f-ee41-4f20-b87d-4db0855c9161\") " pod="openstack/cinder-scheduler-0" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.433252 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e0c5ff0f-ee41-4f20-b87d-4db0855c9161-etc-machine-id\") 
pod \"cinder-scheduler-0\" (UID: \"e0c5ff0f-ee41-4f20-b87d-4db0855c9161\") " pod="openstack/cinder-scheduler-0" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.433272 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b9c1ef21-4ae0-4693-baf2-06678f62411e-dns-swift-storage-0\") pod \"dnsmasq-dns-7b8fcc65cc-n7rdd\" (UID: \"b9c1ef21-4ae0-4693-baf2-06678f62411e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-n7rdd" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.433299 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b9c1ef21-4ae0-4693-baf2-06678f62411e-ovsdbserver-sb\") pod \"dnsmasq-dns-7b8fcc65cc-n7rdd\" (UID: \"b9c1ef21-4ae0-4693-baf2-06678f62411e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-n7rdd" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.441215 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e0c5ff0f-ee41-4f20-b87d-4db0855c9161-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e0c5ff0f-ee41-4f20-b87d-4db0855c9161\") " pod="openstack/cinder-scheduler-0" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.442824 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0c5ff0f-ee41-4f20-b87d-4db0855c9161-config-data\") pod \"cinder-scheduler-0\" (UID: \"e0c5ff0f-ee41-4f20-b87d-4db0855c9161\") " pod="openstack/cinder-scheduler-0" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.443676 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0c5ff0f-ee41-4f20-b87d-4db0855c9161-scripts\") pod \"cinder-scheduler-0\" (UID: \"e0c5ff0f-ee41-4f20-b87d-4db0855c9161\") " pod="openstack/cinder-scheduler-0" Mar 
10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.445728 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0c5ff0f-ee41-4f20-b87d-4db0855c9161-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e0c5ff0f-ee41-4f20-b87d-4db0855c9161\") " pod="openstack/cinder-scheduler-0" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.450399 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0c5ff0f-ee41-4f20-b87d-4db0855c9161-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e0c5ff0f-ee41-4f20-b87d-4db0855c9161\") " pod="openstack/cinder-scheduler-0" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.492375 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.497233 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.514971 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.519103 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mclx\" (UniqueName: \"kubernetes.io/projected/e0c5ff0f-ee41-4f20-b87d-4db0855c9161-kube-api-access-2mclx\") pod \"cinder-scheduler-0\" (UID: \"e0c5ff0f-ee41-4f20-b87d-4db0855c9161\") " pod="openstack/cinder-scheduler-0" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.535034 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b9c1ef21-4ae0-4693-baf2-06678f62411e-ovsdbserver-nb\") pod \"dnsmasq-dns-7b8fcc65cc-n7rdd\" (UID: \"b9c1ef21-4ae0-4693-baf2-06678f62411e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-n7rdd" Mar 10 07:05:59 crc 
kubenswrapper[4825]: I0310 07:05:59.535096 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9c1ef21-4ae0-4693-baf2-06678f62411e-config\") pod \"dnsmasq-dns-7b8fcc65cc-n7rdd\" (UID: \"b9c1ef21-4ae0-4693-baf2-06678f62411e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-n7rdd" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.535155 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb8652d3-6561-469d-b489-ee8006518277-logs\") pod \"cinder-api-0\" (UID: \"eb8652d3-6561-469d-b489-ee8006518277\") " pod="openstack/cinder-api-0" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.535189 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb8652d3-6561-469d-b489-ee8006518277-scripts\") pod \"cinder-api-0\" (UID: \"eb8652d3-6561-469d-b489-ee8006518277\") " pod="openstack/cinder-api-0" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.535256 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgbkq\" (UniqueName: \"kubernetes.io/projected/eb8652d3-6561-469d-b489-ee8006518277-kube-api-access-vgbkq\") pod \"cinder-api-0\" (UID: \"eb8652d3-6561-469d-b489-ee8006518277\") " pod="openstack/cinder-api-0" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.535294 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb8652d3-6561-469d-b489-ee8006518277-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"eb8652d3-6561-469d-b489-ee8006518277\") " pod="openstack/cinder-api-0" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.535318 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eb8652d3-6561-469d-b489-ee8006518277-etc-machine-id\") pod \"cinder-api-0\" (UID: \"eb8652d3-6561-469d-b489-ee8006518277\") " pod="openstack/cinder-api-0" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.535349 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b9c1ef21-4ae0-4693-baf2-06678f62411e-dns-swift-storage-0\") pod \"dnsmasq-dns-7b8fcc65cc-n7rdd\" (UID: \"b9c1ef21-4ae0-4693-baf2-06678f62411e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-n7rdd" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.535374 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb8652d3-6561-469d-b489-ee8006518277-config-data\") pod \"cinder-api-0\" (UID: \"eb8652d3-6561-469d-b489-ee8006518277\") " pod="openstack/cinder-api-0" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.535391 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b9c1ef21-4ae0-4693-baf2-06678f62411e-ovsdbserver-sb\") pod \"dnsmasq-dns-7b8fcc65cc-n7rdd\" (UID: \"b9c1ef21-4ae0-4693-baf2-06678f62411e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-n7rdd" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.535413 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eb8652d3-6561-469d-b489-ee8006518277-config-data-custom\") pod \"cinder-api-0\" (UID: \"eb8652d3-6561-469d-b489-ee8006518277\") " pod="openstack/cinder-api-0" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.535463 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmc2f\" (UniqueName: 
\"kubernetes.io/projected/b9c1ef21-4ae0-4693-baf2-06678f62411e-kube-api-access-pmc2f\") pod \"dnsmasq-dns-7b8fcc65cc-n7rdd\" (UID: \"b9c1ef21-4ae0-4693-baf2-06678f62411e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-n7rdd" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.535488 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9c1ef21-4ae0-4693-baf2-06678f62411e-dns-svc\") pod \"dnsmasq-dns-7b8fcc65cc-n7rdd\" (UID: \"b9c1ef21-4ae0-4693-baf2-06678f62411e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-n7rdd" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.536899 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b9c1ef21-4ae0-4693-baf2-06678f62411e-ovsdbserver-nb\") pod \"dnsmasq-dns-7b8fcc65cc-n7rdd\" (UID: \"b9c1ef21-4ae0-4693-baf2-06678f62411e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-n7rdd" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.537542 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9c1ef21-4ae0-4693-baf2-06678f62411e-config\") pod \"dnsmasq-dns-7b8fcc65cc-n7rdd\" (UID: \"b9c1ef21-4ae0-4693-baf2-06678f62411e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-n7rdd" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.538153 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b9c1ef21-4ae0-4693-baf2-06678f62411e-dns-swift-storage-0\") pod \"dnsmasq-dns-7b8fcc65cc-n7rdd\" (UID: \"b9c1ef21-4ae0-4693-baf2-06678f62411e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-n7rdd" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.538719 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b9c1ef21-4ae0-4693-baf2-06678f62411e-ovsdbserver-sb\") pod 
\"dnsmasq-dns-7b8fcc65cc-n7rdd\" (UID: \"b9c1ef21-4ae0-4693-baf2-06678f62411e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-n7rdd" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.538772 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9c1ef21-4ae0-4693-baf2-06678f62411e-dns-svc\") pod \"dnsmasq-dns-7b8fcc65cc-n7rdd\" (UID: \"b9c1ef21-4ae0-4693-baf2-06678f62411e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-n7rdd" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.564842 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.567563 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.575255 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmc2f\" (UniqueName: \"kubernetes.io/projected/b9c1ef21-4ae0-4693-baf2-06678f62411e-kube-api-access-pmc2f\") pod \"dnsmasq-dns-7b8fcc65cc-n7rdd\" (UID: \"b9c1ef21-4ae0-4693-baf2-06678f62411e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-n7rdd" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.640276 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgbkq\" (UniqueName: \"kubernetes.io/projected/eb8652d3-6561-469d-b489-ee8006518277-kube-api-access-vgbkq\") pod \"cinder-api-0\" (UID: \"eb8652d3-6561-469d-b489-ee8006518277\") " pod="openstack/cinder-api-0" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.640783 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb8652d3-6561-469d-b489-ee8006518277-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"eb8652d3-6561-469d-b489-ee8006518277\") " pod="openstack/cinder-api-0" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 
07:05:59.640960 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eb8652d3-6561-469d-b489-ee8006518277-etc-machine-id\") pod \"cinder-api-0\" (UID: \"eb8652d3-6561-469d-b489-ee8006518277\") " pod="openstack/cinder-api-0" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.641175 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb8652d3-6561-469d-b489-ee8006518277-config-data\") pod \"cinder-api-0\" (UID: \"eb8652d3-6561-469d-b489-ee8006518277\") " pod="openstack/cinder-api-0" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.643602 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eb8652d3-6561-469d-b489-ee8006518277-config-data-custom\") pod \"cinder-api-0\" (UID: \"eb8652d3-6561-469d-b489-ee8006518277\") " pod="openstack/cinder-api-0" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.642085 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eb8652d3-6561-469d-b489-ee8006518277-etc-machine-id\") pod \"cinder-api-0\" (UID: \"eb8652d3-6561-469d-b489-ee8006518277\") " pod="openstack/cinder-api-0" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.646529 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb8652d3-6561-469d-b489-ee8006518277-logs\") pod \"cinder-api-0\" (UID: \"eb8652d3-6561-469d-b489-ee8006518277\") " pod="openstack/cinder-api-0" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.646706 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb8652d3-6561-469d-b489-ee8006518277-scripts\") pod \"cinder-api-0\" (UID: 
\"eb8652d3-6561-469d-b489-ee8006518277\") " pod="openstack/cinder-api-0" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.647237 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb8652d3-6561-469d-b489-ee8006518277-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"eb8652d3-6561-469d-b489-ee8006518277\") " pod="openstack/cinder-api-0" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.647525 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb8652d3-6561-469d-b489-ee8006518277-logs\") pod \"cinder-api-0\" (UID: \"eb8652d3-6561-469d-b489-ee8006518277\") " pod="openstack/cinder-api-0" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.648010 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eb8652d3-6561-469d-b489-ee8006518277-config-data-custom\") pod \"cinder-api-0\" (UID: \"eb8652d3-6561-469d-b489-ee8006518277\") " pod="openstack/cinder-api-0" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.649492 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb8652d3-6561-469d-b489-ee8006518277-config-data\") pod \"cinder-api-0\" (UID: \"eb8652d3-6561-469d-b489-ee8006518277\") " pod="openstack/cinder-api-0" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.650576 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb8652d3-6561-469d-b489-ee8006518277-scripts\") pod \"cinder-api-0\" (UID: \"eb8652d3-6561-469d-b489-ee8006518277\") " pod="openstack/cinder-api-0" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.668897 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgbkq\" (UniqueName: 
\"kubernetes.io/projected/eb8652d3-6561-469d-b489-ee8006518277-kube-api-access-vgbkq\") pod \"cinder-api-0\" (UID: \"eb8652d3-6561-469d-b489-ee8006518277\") " pod="openstack/cinder-api-0" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.847165 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b8fcc65cc-n7rdd" Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.862670 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0a61399-4007-4b70-9067-7f6d04ef56af","Type":"ContainerStarted","Data":"6cf51432c3e3d819a5b3762341501279a4238698130e1747021831154fb129a3"} Mar 10 07:05:59 crc kubenswrapper[4825]: I0310 07:05:59.919958 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 10 07:06:00 crc kubenswrapper[4825]: I0310 07:06:00.161525 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552106-kjbwd"] Mar 10 07:06:00 crc kubenswrapper[4825]: I0310 07:06:00.163654 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552106-kjbwd" Mar 10 07:06:00 crc kubenswrapper[4825]: I0310 07:06:00.186543 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 07:06:00 crc kubenswrapper[4825]: I0310 07:06:00.186760 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 07:06:00 crc kubenswrapper[4825]: I0310 07:06:00.186799 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 07:06:00 crc kubenswrapper[4825]: I0310 07:06:00.201118 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552106-kjbwd"] Mar 10 07:06:00 crc kubenswrapper[4825]: I0310 07:06:00.267809 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlv5c\" (UniqueName: \"kubernetes.io/projected/7ead0475-146d-4055-9e56-ab3ba945041c-kube-api-access-jlv5c\") pod \"auto-csr-approver-29552106-kjbwd\" (UID: \"7ead0475-146d-4055-9e56-ab3ba945041c\") " pod="openshift-infra/auto-csr-approver-29552106-kjbwd" Mar 10 07:06:00 crc kubenswrapper[4825]: I0310 07:06:00.275867 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 07:06:00 crc kubenswrapper[4825]: W0310 07:06:00.285918 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0c5ff0f_ee41_4f20_b87d_4db0855c9161.slice/crio-8b10207fc4fb452bbd657124f0f25d2fe603777f48ff2291ee370ef8f9be358e WatchSource:0}: Error finding container 8b10207fc4fb452bbd657124f0f25d2fe603777f48ff2291ee370ef8f9be358e: Status 404 returned error can't find the container with id 8b10207fc4fb452bbd657124f0f25d2fe603777f48ff2291ee370ef8f9be358e Mar 10 07:06:00 crc kubenswrapper[4825]: I0310 07:06:00.373444 4825 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlv5c\" (UniqueName: \"kubernetes.io/projected/7ead0475-146d-4055-9e56-ab3ba945041c-kube-api-access-jlv5c\") pod \"auto-csr-approver-29552106-kjbwd\" (UID: \"7ead0475-146d-4055-9e56-ab3ba945041c\") " pod="openshift-infra/auto-csr-approver-29552106-kjbwd" Mar 10 07:06:00 crc kubenswrapper[4825]: I0310 07:06:00.412922 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlv5c\" (UniqueName: \"kubernetes.io/projected/7ead0475-146d-4055-9e56-ab3ba945041c-kube-api-access-jlv5c\") pod \"auto-csr-approver-29552106-kjbwd\" (UID: \"7ead0475-146d-4055-9e56-ab3ba945041c\") " pod="openshift-infra/auto-csr-approver-29552106-kjbwd" Mar 10 07:06:00 crc kubenswrapper[4825]: I0310 07:06:00.521662 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552106-kjbwd" Mar 10 07:06:00 crc kubenswrapper[4825]: I0310 07:06:00.540395 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 10 07:06:00 crc kubenswrapper[4825]: I0310 07:06:00.553507 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b8fcc65cc-n7rdd"] Mar 10 07:06:00 crc kubenswrapper[4825]: W0310 07:06:00.571880 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9c1ef21_4ae0_4693_baf2_06678f62411e.slice/crio-3f91dfef15d972a20c51f7ab8eb6d49c3cef5ec6a1c022bd311a84911847dee3 WatchSource:0}: Error finding container 3f91dfef15d972a20c51f7ab8eb6d49c3cef5ec6a1c022bd311a84911847dee3: Status 404 returned error can't find the container with id 3f91dfef15d972a20c51f7ab8eb6d49c3cef5ec6a1c022bd311a84911847dee3 Mar 10 07:06:00 crc kubenswrapper[4825]: I0310 07:06:00.891379 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"eb8652d3-6561-469d-b489-ee8006518277","Type":"ContainerStarted","Data":"f6762325f46b199501c36f193633e4ff37929190877be3bb1fee46bffc00b023"} Mar 10 07:06:00 crc kubenswrapper[4825]: I0310 07:06:00.905432 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b8fcc65cc-n7rdd" event={"ID":"b9c1ef21-4ae0-4693-baf2-06678f62411e","Type":"ContainerStarted","Data":"3f91dfef15d972a20c51f7ab8eb6d49c3cef5ec6a1c022bd311a84911847dee3"} Mar 10 07:06:00 crc kubenswrapper[4825]: I0310 07:06:00.909814 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e0c5ff0f-ee41-4f20-b87d-4db0855c9161","Type":"ContainerStarted","Data":"8b10207fc4fb452bbd657124f0f25d2fe603777f48ff2291ee370ef8f9be358e"} Mar 10 07:06:00 crc kubenswrapper[4825]: I0310 07:06:00.949525 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0a61399-4007-4b70-9067-7f6d04ef56af","Type":"ContainerStarted","Data":"ff228c3d6867b3d2585c3193264c033f65a5967d49dba70fdc735472e9d6d59c"} Mar 10 07:06:01 crc kubenswrapper[4825]: I0310 07:06:01.206390 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552106-kjbwd"] Mar 10 07:06:01 crc kubenswrapper[4825]: W0310 07:06:01.218884 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ead0475_146d_4055_9e56_ab3ba945041c.slice/crio-fa64375df7a65fe4c6865978bbdc0b86a2c8ef43079e682bf445b7375ede8e40 WatchSource:0}: Error finding container fa64375df7a65fe4c6865978bbdc0b86a2c8ef43079e682bf445b7375ede8e40: Status 404 returned error can't find the container with id fa64375df7a65fe4c6865978bbdc0b86a2c8ef43079e682bf445b7375ede8e40 Mar 10 07:06:01 crc kubenswrapper[4825]: I0310 07:06:01.641895 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-dcb459ff6-b58pb" Mar 10 07:06:01 crc kubenswrapper[4825]: I0310 07:06:01.727751 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c93bcce6-e8c8-4916-af01-e85f3dfded64-public-tls-certs\") pod \"c93bcce6-e8c8-4916-af01-e85f3dfded64\" (UID: \"c93bcce6-e8c8-4916-af01-e85f3dfded64\") " Mar 10 07:06:01 crc kubenswrapper[4825]: I0310 07:06:01.728038 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c93bcce6-e8c8-4916-af01-e85f3dfded64-logs\") pod \"c93bcce6-e8c8-4916-af01-e85f3dfded64\" (UID: \"c93bcce6-e8c8-4916-af01-e85f3dfded64\") " Mar 10 07:06:01 crc kubenswrapper[4825]: I0310 07:06:01.728183 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c93bcce6-e8c8-4916-af01-e85f3dfded64-config-data-custom\") pod \"c93bcce6-e8c8-4916-af01-e85f3dfded64\" (UID: \"c93bcce6-e8c8-4916-af01-e85f3dfded64\") " Mar 10 07:06:01 crc kubenswrapper[4825]: I0310 07:06:01.728263 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c93bcce6-e8c8-4916-af01-e85f3dfded64-config-data\") pod \"c93bcce6-e8c8-4916-af01-e85f3dfded64\" (UID: \"c93bcce6-e8c8-4916-af01-e85f3dfded64\") " Mar 10 07:06:01 crc kubenswrapper[4825]: I0310 07:06:01.728348 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c93bcce6-e8c8-4916-af01-e85f3dfded64-internal-tls-certs\") pod \"c93bcce6-e8c8-4916-af01-e85f3dfded64\" (UID: \"c93bcce6-e8c8-4916-af01-e85f3dfded64\") " Mar 10 07:06:01 crc kubenswrapper[4825]: I0310 07:06:01.728419 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/c93bcce6-e8c8-4916-af01-e85f3dfded64-combined-ca-bundle\") pod \"c93bcce6-e8c8-4916-af01-e85f3dfded64\" (UID: \"c93bcce6-e8c8-4916-af01-e85f3dfded64\") " Mar 10 07:06:01 crc kubenswrapper[4825]: I0310 07:06:01.728545 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzsng\" (UniqueName: \"kubernetes.io/projected/c93bcce6-e8c8-4916-af01-e85f3dfded64-kube-api-access-kzsng\") pod \"c93bcce6-e8c8-4916-af01-e85f3dfded64\" (UID: \"c93bcce6-e8c8-4916-af01-e85f3dfded64\") " Mar 10 07:06:01 crc kubenswrapper[4825]: I0310 07:06:01.731052 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c93bcce6-e8c8-4916-af01-e85f3dfded64-logs" (OuterVolumeSpecName: "logs") pod "c93bcce6-e8c8-4916-af01-e85f3dfded64" (UID: "c93bcce6-e8c8-4916-af01-e85f3dfded64"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:06:01 crc kubenswrapper[4825]: I0310 07:06:01.733529 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c93bcce6-e8c8-4916-af01-e85f3dfded64-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c93bcce6-e8c8-4916-af01-e85f3dfded64" (UID: "c93bcce6-e8c8-4916-af01-e85f3dfded64"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:01 crc kubenswrapper[4825]: I0310 07:06:01.737928 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c93bcce6-e8c8-4916-af01-e85f3dfded64-kube-api-access-kzsng" (OuterVolumeSpecName: "kube-api-access-kzsng") pod "c93bcce6-e8c8-4916-af01-e85f3dfded64" (UID: "c93bcce6-e8c8-4916-af01-e85f3dfded64"). InnerVolumeSpecName "kube-api-access-kzsng". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:06:01 crc kubenswrapper[4825]: I0310 07:06:01.797378 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c93bcce6-e8c8-4916-af01-e85f3dfded64-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c93bcce6-e8c8-4916-af01-e85f3dfded64" (UID: "c93bcce6-e8c8-4916-af01-e85f3dfded64"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:01 crc kubenswrapper[4825]: I0310 07:06:01.828989 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 10 07:06:01 crc kubenswrapper[4825]: I0310 07:06:01.830438 4825 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c93bcce6-e8c8-4916-af01-e85f3dfded64-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:01 crc kubenswrapper[4825]: I0310 07:06:01.830460 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c93bcce6-e8c8-4916-af01-e85f3dfded64-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:01 crc kubenswrapper[4825]: I0310 07:06:01.830470 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzsng\" (UniqueName: \"kubernetes.io/projected/c93bcce6-e8c8-4916-af01-e85f3dfded64-kube-api-access-kzsng\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:01 crc kubenswrapper[4825]: I0310 07:06:01.830480 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c93bcce6-e8c8-4916-af01-e85f3dfded64-logs\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:01 crc kubenswrapper[4825]: I0310 07:06:01.838050 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c93bcce6-e8c8-4916-af01-e85f3dfded64-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod 
"c93bcce6-e8c8-4916-af01-e85f3dfded64" (UID: "c93bcce6-e8c8-4916-af01-e85f3dfded64"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:01 crc kubenswrapper[4825]: I0310 07:06:01.850274 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c93bcce6-e8c8-4916-af01-e85f3dfded64-config-data" (OuterVolumeSpecName: "config-data") pod "c93bcce6-e8c8-4916-af01-e85f3dfded64" (UID: "c93bcce6-e8c8-4916-af01-e85f3dfded64"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:01 crc kubenswrapper[4825]: I0310 07:06:01.861498 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c93bcce6-e8c8-4916-af01-e85f3dfded64-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c93bcce6-e8c8-4916-af01-e85f3dfded64" (UID: "c93bcce6-e8c8-4916-af01-e85f3dfded64"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:01 crc kubenswrapper[4825]: I0310 07:06:01.931887 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c93bcce6-e8c8-4916-af01-e85f3dfded64-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:01 crc kubenswrapper[4825]: I0310 07:06:01.931922 4825 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c93bcce6-e8c8-4916-af01-e85f3dfded64-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:01 crc kubenswrapper[4825]: I0310 07:06:01.931933 4825 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c93bcce6-e8c8-4916-af01-e85f3dfded64-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:01 crc kubenswrapper[4825]: I0310 07:06:01.985745 4825 generic.go:334] "Generic (PLEG): container finished" 
podID="c93bcce6-e8c8-4916-af01-e85f3dfded64" containerID="13eef2fd23b3a4387f866485b870a26a32edd760fbc68889410c962fef58826e" exitCode=0 Mar 10 07:06:01 crc kubenswrapper[4825]: I0310 07:06:01.985834 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-dcb459ff6-b58pb" event={"ID":"c93bcce6-e8c8-4916-af01-e85f3dfded64","Type":"ContainerDied","Data":"13eef2fd23b3a4387f866485b870a26a32edd760fbc68889410c962fef58826e"} Mar 10 07:06:01 crc kubenswrapper[4825]: I0310 07:06:01.985881 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-dcb459ff6-b58pb" event={"ID":"c93bcce6-e8c8-4916-af01-e85f3dfded64","Type":"ContainerDied","Data":"be6c22749106b274fbe03fb5640bba5610567305756596ad82dc2b42365f6b46"} Mar 10 07:06:01 crc kubenswrapper[4825]: I0310 07:06:01.985901 4825 scope.go:117] "RemoveContainer" containerID="13eef2fd23b3a4387f866485b870a26a32edd760fbc68889410c962fef58826e" Mar 10 07:06:01 crc kubenswrapper[4825]: I0310 07:06:01.986068 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-dcb459ff6-b58pb" Mar 10 07:06:01 crc kubenswrapper[4825]: I0310 07:06:01.998271 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"eb8652d3-6561-469d-b489-ee8006518277","Type":"ContainerStarted","Data":"608423ce51f37b012c818a571335a63e76f3e20a4ea9257f23364cd9d5e5373d"} Mar 10 07:06:01 crc kubenswrapper[4825]: I0310 07:06:01.999806 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552106-kjbwd" event={"ID":"7ead0475-146d-4055-9e56-ab3ba945041c","Type":"ContainerStarted","Data":"fa64375df7a65fe4c6865978bbdc0b86a2c8ef43079e682bf445b7375ede8e40"} Mar 10 07:06:02 crc kubenswrapper[4825]: I0310 07:06:02.017422 4825 generic.go:334] "Generic (PLEG): container finished" podID="b9c1ef21-4ae0-4693-baf2-06678f62411e" containerID="3e7b56e402fc52c37eaffba0cead25e2e2f123e3f084b20fe95806e05bfb33dd" exitCode=0 Mar 10 07:06:02 crc kubenswrapper[4825]: I0310 07:06:02.017525 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b8fcc65cc-n7rdd" event={"ID":"b9c1ef21-4ae0-4693-baf2-06678f62411e","Type":"ContainerDied","Data":"3e7b56e402fc52c37eaffba0cead25e2e2f123e3f084b20fe95806e05bfb33dd"} Mar 10 07:06:02 crc kubenswrapper[4825]: I0310 07:06:02.039339 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e0c5ff0f-ee41-4f20-b87d-4db0855c9161","Type":"ContainerStarted","Data":"634ce193db17752ea91a7a337afdbfe210ec554d1465d0cdbcd5504db7321043"} Mar 10 07:06:02 crc kubenswrapper[4825]: I0310 07:06:02.063152 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-dcb459ff6-b58pb"] Mar 10 07:06:02 crc kubenswrapper[4825]: I0310 07:06:02.069911 4825 scope.go:117] "RemoveContainer" containerID="1838604b40253b47912466b869a4dbb0ed7bea3659d5063f2cc0ce282f9d473f" Mar 10 07:06:02 crc kubenswrapper[4825]: I0310 07:06:02.093449 4825 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-dcb459ff6-b58pb"] Mar 10 07:06:02 crc kubenswrapper[4825]: I0310 07:06:02.137220 4825 scope.go:117] "RemoveContainer" containerID="13eef2fd23b3a4387f866485b870a26a32edd760fbc68889410c962fef58826e" Mar 10 07:06:02 crc kubenswrapper[4825]: E0310 07:06:02.137756 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13eef2fd23b3a4387f866485b870a26a32edd760fbc68889410c962fef58826e\": container with ID starting with 13eef2fd23b3a4387f866485b870a26a32edd760fbc68889410c962fef58826e not found: ID does not exist" containerID="13eef2fd23b3a4387f866485b870a26a32edd760fbc68889410c962fef58826e" Mar 10 07:06:02 crc kubenswrapper[4825]: I0310 07:06:02.137813 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13eef2fd23b3a4387f866485b870a26a32edd760fbc68889410c962fef58826e"} err="failed to get container status \"13eef2fd23b3a4387f866485b870a26a32edd760fbc68889410c962fef58826e\": rpc error: code = NotFound desc = could not find container \"13eef2fd23b3a4387f866485b870a26a32edd760fbc68889410c962fef58826e\": container with ID starting with 13eef2fd23b3a4387f866485b870a26a32edd760fbc68889410c962fef58826e not found: ID does not exist" Mar 10 07:06:02 crc kubenswrapper[4825]: I0310 07:06:02.137861 4825 scope.go:117] "RemoveContainer" containerID="1838604b40253b47912466b869a4dbb0ed7bea3659d5063f2cc0ce282f9d473f" Mar 10 07:06:02 crc kubenswrapper[4825]: E0310 07:06:02.138304 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1838604b40253b47912466b869a4dbb0ed7bea3659d5063f2cc0ce282f9d473f\": container with ID starting with 1838604b40253b47912466b869a4dbb0ed7bea3659d5063f2cc0ce282f9d473f not found: ID does not exist" containerID="1838604b40253b47912466b869a4dbb0ed7bea3659d5063f2cc0ce282f9d473f" Mar 10 07:06:02 crc 
kubenswrapper[4825]: I0310 07:06:02.138352 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1838604b40253b47912466b869a4dbb0ed7bea3659d5063f2cc0ce282f9d473f"} err="failed to get container status \"1838604b40253b47912466b869a4dbb0ed7bea3659d5063f2cc0ce282f9d473f\": rpc error: code = NotFound desc = could not find container \"1838604b40253b47912466b869a4dbb0ed7bea3659d5063f2cc0ce282f9d473f\": container with ID starting with 1838604b40253b47912466b869a4dbb0ed7bea3659d5063f2cc0ce282f9d473f not found: ID does not exist" Mar 10 07:06:03 crc kubenswrapper[4825]: I0310 07:06:03.054369 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0a61399-4007-4b70-9067-7f6d04ef56af","Type":"ContainerStarted","Data":"2d8e1b42dfd789d8be822497314dd1e42aa7804644acb3d68f8345dcc655110f"} Mar 10 07:06:03 crc kubenswrapper[4825]: I0310 07:06:03.054789 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 07:06:03 crc kubenswrapper[4825]: I0310 07:06:03.058442 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"eb8652d3-6561-469d-b489-ee8006518277","Type":"ContainerStarted","Data":"a8e084d8a89a3c7adc52accc2e9c96c8af2eeb1eacee98bcf2a57e272d3332a1"} Mar 10 07:06:03 crc kubenswrapper[4825]: I0310 07:06:03.058568 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="eb8652d3-6561-469d-b489-ee8006518277" containerName="cinder-api-log" containerID="cri-o://608423ce51f37b012c818a571335a63e76f3e20a4ea9257f23364cd9d5e5373d" gracePeriod=30 Mar 10 07:06:03 crc kubenswrapper[4825]: I0310 07:06:03.058797 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 10 07:06:03 crc kubenswrapper[4825]: I0310 07:06:03.058833 4825 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/cinder-api-0" podUID="eb8652d3-6561-469d-b489-ee8006518277" containerName="cinder-api" containerID="cri-o://a8e084d8a89a3c7adc52accc2e9c96c8af2eeb1eacee98bcf2a57e272d3332a1" gracePeriod=30 Mar 10 07:06:03 crc kubenswrapper[4825]: I0310 07:06:03.065107 4825 generic.go:334] "Generic (PLEG): container finished" podID="7ead0475-146d-4055-9e56-ab3ba945041c" containerID="d070d8dc5655990ee6a75c959e20121a68217564d45799abf7aa61ef7c8cb442" exitCode=0 Mar 10 07:06:03 crc kubenswrapper[4825]: I0310 07:06:03.065183 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552106-kjbwd" event={"ID":"7ead0475-146d-4055-9e56-ab3ba945041c","Type":"ContainerDied","Data":"d070d8dc5655990ee6a75c959e20121a68217564d45799abf7aa61ef7c8cb442"} Mar 10 07:06:03 crc kubenswrapper[4825]: I0310 07:06:03.068274 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b8fcc65cc-n7rdd" event={"ID":"b9c1ef21-4ae0-4693-baf2-06678f62411e","Type":"ContainerStarted","Data":"9bf08af71a6a3a7937ba9529a3e4285427d5f544f773c8146847f811e42d30f5"} Mar 10 07:06:03 crc kubenswrapper[4825]: I0310 07:06:03.068734 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b8fcc65cc-n7rdd" Mar 10 07:06:03 crc kubenswrapper[4825]: I0310 07:06:03.070690 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e0c5ff0f-ee41-4f20-b87d-4db0855c9161","Type":"ContainerStarted","Data":"8a4d30188885f55be9bc509363bf84961ab41e834420534de1614a2d5da7a44e"} Mar 10 07:06:03 crc kubenswrapper[4825]: I0310 07:06:03.082374 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.908309891 podStartE2EDuration="7.082360561s" podCreationTimestamp="2026-03-10 07:05:56 +0000 UTC" firstStartedPulling="2026-03-10 07:05:57.838262742 +0000 UTC m=+1310.868043367" lastFinishedPulling="2026-03-10 07:06:02.012313422 
+0000 UTC m=+1315.042094037" observedRunningTime="2026-03-10 07:06:03.077965696 +0000 UTC m=+1316.107746311" watchObservedRunningTime="2026-03-10 07:06:03.082360561 +0000 UTC m=+1316.112141176" Mar 10 07:06:03 crc kubenswrapper[4825]: I0310 07:06:03.102570 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b8fcc65cc-n7rdd" podStartSLOduration=4.102550537 podStartE2EDuration="4.102550537s" podCreationTimestamp="2026-03-10 07:05:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:06:03.100941225 +0000 UTC m=+1316.130721840" watchObservedRunningTime="2026-03-10 07:06:03.102550537 +0000 UTC m=+1316.132331152" Mar 10 07:06:03 crc kubenswrapper[4825]: I0310 07:06:03.121206 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.121186053 podStartE2EDuration="4.121186053s" podCreationTimestamp="2026-03-10 07:05:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:06:03.121030149 +0000 UTC m=+1316.150810774" watchObservedRunningTime="2026-03-10 07:06:03.121186053 +0000 UTC m=+1316.150966668" Mar 10 07:06:03 crc kubenswrapper[4825]: I0310 07:06:03.154832 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.382422538 podStartE2EDuration="4.154812839s" podCreationTimestamp="2026-03-10 07:05:59 +0000 UTC" firstStartedPulling="2026-03-10 07:06:00.314185613 +0000 UTC m=+1313.343966228" lastFinishedPulling="2026-03-10 07:06:01.086575914 +0000 UTC m=+1314.116356529" observedRunningTime="2026-03-10 07:06:03.153267379 +0000 UTC m=+1316.183048004" watchObservedRunningTime="2026-03-10 07:06:03.154812839 +0000 UTC m=+1316.184593454" Mar 10 07:06:03 crc kubenswrapper[4825]: I0310 07:06:03.252572 
4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c93bcce6-e8c8-4916-af01-e85f3dfded64" path="/var/lib/kubelet/pods/c93bcce6-e8c8-4916-af01-e85f3dfded64/volumes" Mar 10 07:06:03 crc kubenswrapper[4825]: I0310 07:06:03.848674 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 10 07:06:03 crc kubenswrapper[4825]: I0310 07:06:03.982574 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eb8652d3-6561-469d-b489-ee8006518277-config-data-custom\") pod \"eb8652d3-6561-469d-b489-ee8006518277\" (UID: \"eb8652d3-6561-469d-b489-ee8006518277\") " Mar 10 07:06:03 crc kubenswrapper[4825]: I0310 07:06:03.982684 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb8652d3-6561-469d-b489-ee8006518277-combined-ca-bundle\") pod \"eb8652d3-6561-469d-b489-ee8006518277\" (UID: \"eb8652d3-6561-469d-b489-ee8006518277\") " Mar 10 07:06:03 crc kubenswrapper[4825]: I0310 07:06:03.982747 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb8652d3-6561-469d-b489-ee8006518277-logs\") pod \"eb8652d3-6561-469d-b489-ee8006518277\" (UID: \"eb8652d3-6561-469d-b489-ee8006518277\") " Mar 10 07:06:03 crc kubenswrapper[4825]: I0310 07:06:03.982793 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb8652d3-6561-469d-b489-ee8006518277-scripts\") pod \"eb8652d3-6561-469d-b489-ee8006518277\" (UID: \"eb8652d3-6561-469d-b489-ee8006518277\") " Mar 10 07:06:03 crc kubenswrapper[4825]: I0310 07:06:03.982917 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgbkq\" (UniqueName: 
\"kubernetes.io/projected/eb8652d3-6561-469d-b489-ee8006518277-kube-api-access-vgbkq\") pod \"eb8652d3-6561-469d-b489-ee8006518277\" (UID: \"eb8652d3-6561-469d-b489-ee8006518277\") " Mar 10 07:06:03 crc kubenswrapper[4825]: I0310 07:06:03.983047 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eb8652d3-6561-469d-b489-ee8006518277-etc-machine-id\") pod \"eb8652d3-6561-469d-b489-ee8006518277\" (UID: \"eb8652d3-6561-469d-b489-ee8006518277\") " Mar 10 07:06:03 crc kubenswrapper[4825]: I0310 07:06:03.983082 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb8652d3-6561-469d-b489-ee8006518277-config-data\") pod \"eb8652d3-6561-469d-b489-ee8006518277\" (UID: \"eb8652d3-6561-469d-b489-ee8006518277\") " Mar 10 07:06:03 crc kubenswrapper[4825]: I0310 07:06:03.983159 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb8652d3-6561-469d-b489-ee8006518277-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "eb8652d3-6561-469d-b489-ee8006518277" (UID: "eb8652d3-6561-469d-b489-ee8006518277"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 07:06:03 crc kubenswrapper[4825]: I0310 07:06:03.983453 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb8652d3-6561-469d-b489-ee8006518277-logs" (OuterVolumeSpecName: "logs") pod "eb8652d3-6561-469d-b489-ee8006518277" (UID: "eb8652d3-6561-469d-b489-ee8006518277"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:06:03 crc kubenswrapper[4825]: I0310 07:06:03.984124 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb8652d3-6561-469d-b489-ee8006518277-logs\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:03 crc kubenswrapper[4825]: I0310 07:06:03.984182 4825 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eb8652d3-6561-469d-b489-ee8006518277-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:03 crc kubenswrapper[4825]: I0310 07:06:03.990548 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb8652d3-6561-469d-b489-ee8006518277-kube-api-access-vgbkq" (OuterVolumeSpecName: "kube-api-access-vgbkq") pod "eb8652d3-6561-469d-b489-ee8006518277" (UID: "eb8652d3-6561-469d-b489-ee8006518277"). InnerVolumeSpecName "kube-api-access-vgbkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:06:03 crc kubenswrapper[4825]: I0310 07:06:03.996172 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb8652d3-6561-469d-b489-ee8006518277-scripts" (OuterVolumeSpecName: "scripts") pod "eb8652d3-6561-469d-b489-ee8006518277" (UID: "eb8652d3-6561-469d-b489-ee8006518277"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.012260 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb8652d3-6561-469d-b489-ee8006518277-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "eb8652d3-6561-469d-b489-ee8006518277" (UID: "eb8652d3-6561-469d-b489-ee8006518277"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.020703 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb8652d3-6561-469d-b489-ee8006518277-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb8652d3-6561-469d-b489-ee8006518277" (UID: "eb8652d3-6561-469d-b489-ee8006518277"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.049750 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb8652d3-6561-469d-b489-ee8006518277-config-data" (OuterVolumeSpecName: "config-data") pod "eb8652d3-6561-469d-b489-ee8006518277" (UID: "eb8652d3-6561-469d-b489-ee8006518277"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.081692 4825 generic.go:334] "Generic (PLEG): container finished" podID="eb8652d3-6561-469d-b489-ee8006518277" containerID="a8e084d8a89a3c7adc52accc2e9c96c8af2eeb1eacee98bcf2a57e272d3332a1" exitCode=0 Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.081738 4825 generic.go:334] "Generic (PLEG): container finished" podID="eb8652d3-6561-469d-b489-ee8006518277" containerID="608423ce51f37b012c818a571335a63e76f3e20a4ea9257f23364cd9d5e5373d" exitCode=143 Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.081766 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.081843 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"eb8652d3-6561-469d-b489-ee8006518277","Type":"ContainerDied","Data":"a8e084d8a89a3c7adc52accc2e9c96c8af2eeb1eacee98bcf2a57e272d3332a1"} Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.081891 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"eb8652d3-6561-469d-b489-ee8006518277","Type":"ContainerDied","Data":"608423ce51f37b012c818a571335a63e76f3e20a4ea9257f23364cd9d5e5373d"} Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.081911 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"eb8652d3-6561-469d-b489-ee8006518277","Type":"ContainerDied","Data":"f6762325f46b199501c36f193633e4ff37929190877be3bb1fee46bffc00b023"} Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.081937 4825 scope.go:117] "RemoveContainer" containerID="a8e084d8a89a3c7adc52accc2e9c96c8af2eeb1eacee98bcf2a57e272d3332a1" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.086982 4825 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eb8652d3-6561-469d-b489-ee8006518277-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.087009 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb8652d3-6561-469d-b489-ee8006518277-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.087023 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb8652d3-6561-469d-b489-ee8006518277-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.087038 4825 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgbkq\" (UniqueName: \"kubernetes.io/projected/eb8652d3-6561-469d-b489-ee8006518277-kube-api-access-vgbkq\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.087050 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb8652d3-6561-469d-b489-ee8006518277-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.112623 4825 scope.go:117] "RemoveContainer" containerID="608423ce51f37b012c818a571335a63e76f3e20a4ea9257f23364cd9d5e5373d" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.157516 4825 scope.go:117] "RemoveContainer" containerID="a8e084d8a89a3c7adc52accc2e9c96c8af2eeb1eacee98bcf2a57e272d3332a1" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.161213 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 10 07:06:04 crc kubenswrapper[4825]: E0310 07:06:04.161718 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8e084d8a89a3c7adc52accc2e9c96c8af2eeb1eacee98bcf2a57e272d3332a1\": container with ID starting with a8e084d8a89a3c7adc52accc2e9c96c8af2eeb1eacee98bcf2a57e272d3332a1 not found: ID does not exist" containerID="a8e084d8a89a3c7adc52accc2e9c96c8af2eeb1eacee98bcf2a57e272d3332a1" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.161762 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8e084d8a89a3c7adc52accc2e9c96c8af2eeb1eacee98bcf2a57e272d3332a1"} err="failed to get container status \"a8e084d8a89a3c7adc52accc2e9c96c8af2eeb1eacee98bcf2a57e272d3332a1\": rpc error: code = NotFound desc = could not find container \"a8e084d8a89a3c7adc52accc2e9c96c8af2eeb1eacee98bcf2a57e272d3332a1\": container with ID starting with 
a8e084d8a89a3c7adc52accc2e9c96c8af2eeb1eacee98bcf2a57e272d3332a1 not found: ID does not exist" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.161793 4825 scope.go:117] "RemoveContainer" containerID="608423ce51f37b012c818a571335a63e76f3e20a4ea9257f23364cd9d5e5373d" Mar 10 07:06:04 crc kubenswrapper[4825]: E0310 07:06:04.162674 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"608423ce51f37b012c818a571335a63e76f3e20a4ea9257f23364cd9d5e5373d\": container with ID starting with 608423ce51f37b012c818a571335a63e76f3e20a4ea9257f23364cd9d5e5373d not found: ID does not exist" containerID="608423ce51f37b012c818a571335a63e76f3e20a4ea9257f23364cd9d5e5373d" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.162721 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"608423ce51f37b012c818a571335a63e76f3e20a4ea9257f23364cd9d5e5373d"} err="failed to get container status \"608423ce51f37b012c818a571335a63e76f3e20a4ea9257f23364cd9d5e5373d\": rpc error: code = NotFound desc = could not find container \"608423ce51f37b012c818a571335a63e76f3e20a4ea9257f23364cd9d5e5373d\": container with ID starting with 608423ce51f37b012c818a571335a63e76f3e20a4ea9257f23364cd9d5e5373d not found: ID does not exist" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.162757 4825 scope.go:117] "RemoveContainer" containerID="a8e084d8a89a3c7adc52accc2e9c96c8af2eeb1eacee98bcf2a57e272d3332a1" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.163039 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8e084d8a89a3c7adc52accc2e9c96c8af2eeb1eacee98bcf2a57e272d3332a1"} err="failed to get container status \"a8e084d8a89a3c7adc52accc2e9c96c8af2eeb1eacee98bcf2a57e272d3332a1\": rpc error: code = NotFound desc = could not find container \"a8e084d8a89a3c7adc52accc2e9c96c8af2eeb1eacee98bcf2a57e272d3332a1\": container with ID 
starting with a8e084d8a89a3c7adc52accc2e9c96c8af2eeb1eacee98bcf2a57e272d3332a1 not found: ID does not exist" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.163066 4825 scope.go:117] "RemoveContainer" containerID="608423ce51f37b012c818a571335a63e76f3e20a4ea9257f23364cd9d5e5373d" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.164938 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"608423ce51f37b012c818a571335a63e76f3e20a4ea9257f23364cd9d5e5373d"} err="failed to get container status \"608423ce51f37b012c818a571335a63e76f3e20a4ea9257f23364cd9d5e5373d\": rpc error: code = NotFound desc = could not find container \"608423ce51f37b012c818a571335a63e76f3e20a4ea9257f23364cd9d5e5373d\": container with ID starting with 608423ce51f37b012c818a571335a63e76f3e20a4ea9257f23364cd9d5e5373d not found: ID does not exist" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.176233 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.186192 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 10 07:06:04 crc kubenswrapper[4825]: E0310 07:06:04.186632 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c93bcce6-e8c8-4916-af01-e85f3dfded64" containerName="barbican-api-log" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.186651 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="c93bcce6-e8c8-4916-af01-e85f3dfded64" containerName="barbican-api-log" Mar 10 07:06:04 crc kubenswrapper[4825]: E0310 07:06:04.186663 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb8652d3-6561-469d-b489-ee8006518277" containerName="cinder-api" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.186669 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb8652d3-6561-469d-b489-ee8006518277" containerName="cinder-api" Mar 10 07:06:04 crc 
kubenswrapper[4825]: E0310 07:06:04.186682 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c93bcce6-e8c8-4916-af01-e85f3dfded64" containerName="barbican-api" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.186689 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="c93bcce6-e8c8-4916-af01-e85f3dfded64" containerName="barbican-api" Mar 10 07:06:04 crc kubenswrapper[4825]: E0310 07:06:04.186710 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb8652d3-6561-469d-b489-ee8006518277" containerName="cinder-api-log" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.186716 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb8652d3-6561-469d-b489-ee8006518277" containerName="cinder-api-log" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.186871 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="c93bcce6-e8c8-4916-af01-e85f3dfded64" containerName="barbican-api-log" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.186884 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb8652d3-6561-469d-b489-ee8006518277" containerName="cinder-api-log" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.186898 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb8652d3-6561-469d-b489-ee8006518277" containerName="cinder-api" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.186911 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="c93bcce6-e8c8-4916-af01-e85f3dfded64" containerName="barbican-api" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.187890 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.192726 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.192949 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.193080 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.221563 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.290388 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/209fbdac-3b3c-451a-ae30-5888e1cbb891-config-data\") pod \"cinder-api-0\" (UID: \"209fbdac-3b3c-451a-ae30-5888e1cbb891\") " pod="openstack/cinder-api-0" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.290636 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/209fbdac-3b3c-451a-ae30-5888e1cbb891-config-data-custom\") pod \"cinder-api-0\" (UID: \"209fbdac-3b3c-451a-ae30-5888e1cbb891\") " pod="openstack/cinder-api-0" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.290828 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/209fbdac-3b3c-451a-ae30-5888e1cbb891-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"209fbdac-3b3c-451a-ae30-5888e1cbb891\") " pod="openstack/cinder-api-0" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.290898 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-727td\" (UniqueName: \"kubernetes.io/projected/209fbdac-3b3c-451a-ae30-5888e1cbb891-kube-api-access-727td\") pod \"cinder-api-0\" (UID: \"209fbdac-3b3c-451a-ae30-5888e1cbb891\") " pod="openstack/cinder-api-0" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.291064 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/209fbdac-3b3c-451a-ae30-5888e1cbb891-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"209fbdac-3b3c-451a-ae30-5888e1cbb891\") " pod="openstack/cinder-api-0" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.291276 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/209fbdac-3b3c-451a-ae30-5888e1cbb891-logs\") pod \"cinder-api-0\" (UID: \"209fbdac-3b3c-451a-ae30-5888e1cbb891\") " pod="openstack/cinder-api-0" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.291385 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/209fbdac-3b3c-451a-ae30-5888e1cbb891-scripts\") pod \"cinder-api-0\" (UID: \"209fbdac-3b3c-451a-ae30-5888e1cbb891\") " pod="openstack/cinder-api-0" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.291540 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/209fbdac-3b3c-451a-ae30-5888e1cbb891-etc-machine-id\") pod \"cinder-api-0\" (UID: \"209fbdac-3b3c-451a-ae30-5888e1cbb891\") " pod="openstack/cinder-api-0" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.291673 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/209fbdac-3b3c-451a-ae30-5888e1cbb891-public-tls-certs\") pod \"cinder-api-0\" 
(UID: \"209fbdac-3b3c-451a-ae30-5888e1cbb891\") " pod="openstack/cinder-api-0" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.393980 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/209fbdac-3b3c-451a-ae30-5888e1cbb891-logs\") pod \"cinder-api-0\" (UID: \"209fbdac-3b3c-451a-ae30-5888e1cbb891\") " pod="openstack/cinder-api-0" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.394073 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/209fbdac-3b3c-451a-ae30-5888e1cbb891-scripts\") pod \"cinder-api-0\" (UID: \"209fbdac-3b3c-451a-ae30-5888e1cbb891\") " pod="openstack/cinder-api-0" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.394111 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/209fbdac-3b3c-451a-ae30-5888e1cbb891-etc-machine-id\") pod \"cinder-api-0\" (UID: \"209fbdac-3b3c-451a-ae30-5888e1cbb891\") " pod="openstack/cinder-api-0" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.394164 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/209fbdac-3b3c-451a-ae30-5888e1cbb891-public-tls-certs\") pod \"cinder-api-0\" (UID: \"209fbdac-3b3c-451a-ae30-5888e1cbb891\") " pod="openstack/cinder-api-0" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.394195 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/209fbdac-3b3c-451a-ae30-5888e1cbb891-config-data\") pod \"cinder-api-0\" (UID: \"209fbdac-3b3c-451a-ae30-5888e1cbb891\") " pod="openstack/cinder-api-0" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.394223 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/209fbdac-3b3c-451a-ae30-5888e1cbb891-config-data-custom\") pod \"cinder-api-0\" (UID: \"209fbdac-3b3c-451a-ae30-5888e1cbb891\") " pod="openstack/cinder-api-0" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.394327 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/209fbdac-3b3c-451a-ae30-5888e1cbb891-etc-machine-id\") pod \"cinder-api-0\" (UID: \"209fbdac-3b3c-451a-ae30-5888e1cbb891\") " pod="openstack/cinder-api-0" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.394416 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/209fbdac-3b3c-451a-ae30-5888e1cbb891-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"209fbdac-3b3c-451a-ae30-5888e1cbb891\") " pod="openstack/cinder-api-0" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.394948 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/209fbdac-3b3c-451a-ae30-5888e1cbb891-logs\") pod \"cinder-api-0\" (UID: \"209fbdac-3b3c-451a-ae30-5888e1cbb891\") " pod="openstack/cinder-api-0" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.395098 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-727td\" (UniqueName: \"kubernetes.io/projected/209fbdac-3b3c-451a-ae30-5888e1cbb891-kube-api-access-727td\") pod \"cinder-api-0\" (UID: \"209fbdac-3b3c-451a-ae30-5888e1cbb891\") " pod="openstack/cinder-api-0" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.395458 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/209fbdac-3b3c-451a-ae30-5888e1cbb891-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"209fbdac-3b3c-451a-ae30-5888e1cbb891\") " pod="openstack/cinder-api-0" Mar 10 07:06:04 crc kubenswrapper[4825]: 
I0310 07:06:04.400064 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/209fbdac-3b3c-451a-ae30-5888e1cbb891-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"209fbdac-3b3c-451a-ae30-5888e1cbb891\") " pod="openstack/cinder-api-0" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.401465 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/209fbdac-3b3c-451a-ae30-5888e1cbb891-public-tls-certs\") pod \"cinder-api-0\" (UID: \"209fbdac-3b3c-451a-ae30-5888e1cbb891\") " pod="openstack/cinder-api-0" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.403409 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/209fbdac-3b3c-451a-ae30-5888e1cbb891-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"209fbdac-3b3c-451a-ae30-5888e1cbb891\") " pod="openstack/cinder-api-0" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.409792 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/209fbdac-3b3c-451a-ae30-5888e1cbb891-config-data-custom\") pod \"cinder-api-0\" (UID: \"209fbdac-3b3c-451a-ae30-5888e1cbb891\") " pod="openstack/cinder-api-0" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.411869 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/209fbdac-3b3c-451a-ae30-5888e1cbb891-scripts\") pod \"cinder-api-0\" (UID: \"209fbdac-3b3c-451a-ae30-5888e1cbb891\") " pod="openstack/cinder-api-0" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.412554 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-727td\" (UniqueName: \"kubernetes.io/projected/209fbdac-3b3c-451a-ae30-5888e1cbb891-kube-api-access-727td\") pod \"cinder-api-0\" (UID: 
\"209fbdac-3b3c-451a-ae30-5888e1cbb891\") " pod="openstack/cinder-api-0" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.412862 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/209fbdac-3b3c-451a-ae30-5888e1cbb891-config-data\") pod \"cinder-api-0\" (UID: \"209fbdac-3b3c-451a-ae30-5888e1cbb891\") " pod="openstack/cinder-api-0" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.504415 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552106-kjbwd" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.542666 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.569506 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.600002 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlv5c\" (UniqueName: \"kubernetes.io/projected/7ead0475-146d-4055-9e56-ab3ba945041c-kube-api-access-jlv5c\") pod \"7ead0475-146d-4055-9e56-ab3ba945041c\" (UID: \"7ead0475-146d-4055-9e56-ab3ba945041c\") " Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.612284 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ead0475-146d-4055-9e56-ab3ba945041c-kube-api-access-jlv5c" (OuterVolumeSpecName: "kube-api-access-jlv5c") pod "7ead0475-146d-4055-9e56-ab3ba945041c" (UID: "7ead0475-146d-4055-9e56-ab3ba945041c"). InnerVolumeSpecName "kube-api-access-jlv5c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.702243 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlv5c\" (UniqueName: \"kubernetes.io/projected/7ead0475-146d-4055-9e56-ab3ba945041c-kube-api-access-jlv5c\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:04 crc kubenswrapper[4825]: I0310 07:06:04.905248 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-75c968ccd4-ln762" Mar 10 07:06:05 crc kubenswrapper[4825]: I0310 07:06:05.050069 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 10 07:06:05 crc kubenswrapper[4825]: I0310 07:06:05.114393 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552106-kjbwd" event={"ID":"7ead0475-146d-4055-9e56-ab3ba945041c","Type":"ContainerDied","Data":"fa64375df7a65fe4c6865978bbdc0b86a2c8ef43079e682bf445b7375ede8e40"} Mar 10 07:06:05 crc kubenswrapper[4825]: I0310 07:06:05.114450 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa64375df7a65fe4c6865978bbdc0b86a2c8ef43079e682bf445b7375ede8e40" Mar 10 07:06:05 crc kubenswrapper[4825]: I0310 07:06:05.114613 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552106-kjbwd" Mar 10 07:06:05 crc kubenswrapper[4825]: I0310 07:06:05.150851 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"209fbdac-3b3c-451a-ae30-5888e1cbb891","Type":"ContainerStarted","Data":"198f9a8326763796da8dab9aa851fe9756310d926e22948824d3289fcfb0b98f"} Mar 10 07:06:05 crc kubenswrapper[4825]: I0310 07:06:05.227572 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7454bbb9bc-7l262"] Mar 10 07:06:05 crc kubenswrapper[4825]: I0310 07:06:05.228000 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7454bbb9bc-7l262" podUID="082f6f7c-700c-41f1-a10e-58223570c18c" containerName="neutron-api" containerID="cri-o://f41b2155d8729db5968bc0758def9b1914d2c9b44cd66a96c2353428211928d7" gracePeriod=30 Mar 10 07:06:05 crc kubenswrapper[4825]: I0310 07:06:05.228174 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7454bbb9bc-7l262" podUID="082f6f7c-700c-41f1-a10e-58223570c18c" containerName="neutron-httpd" containerID="cri-o://ea7fc0200b81cf0f53d33c7f999d45d1a95c059b3c16e5b55b69ae7fd61d4a52" gracePeriod=30 Mar 10 07:06:05 crc kubenswrapper[4825]: I0310 07:06:05.236669 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-7454bbb9bc-7l262" podUID="082f6f7c-700c-41f1-a10e-58223570c18c" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.157:9696/\": read tcp 10.217.0.2:60272->10.217.0.157:9696: read: connection reset by peer" Mar 10 07:06:05 crc kubenswrapper[4825]: I0310 07:06:05.253569 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb8652d3-6561-469d-b489-ee8006518277" path="/var/lib/kubelet/pods/eb8652d3-6561-469d-b489-ee8006518277/volumes" Mar 10 07:06:05 crc kubenswrapper[4825]: I0310 07:06:05.254347 4825 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/neutron-67bc54d95c-r8n6n"] Mar 10 07:06:05 crc kubenswrapper[4825]: E0310 07:06:05.254686 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ead0475-146d-4055-9e56-ab3ba945041c" containerName="oc" Mar 10 07:06:05 crc kubenswrapper[4825]: I0310 07:06:05.254708 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ead0475-146d-4055-9e56-ab3ba945041c" containerName="oc" Mar 10 07:06:05 crc kubenswrapper[4825]: I0310 07:06:05.254877 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ead0475-146d-4055-9e56-ab3ba945041c" containerName="oc" Mar 10 07:06:05 crc kubenswrapper[4825]: I0310 07:06:05.256079 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-67bc54d95c-r8n6n"] Mar 10 07:06:05 crc kubenswrapper[4825]: I0310 07:06:05.256275 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-67bc54d95c-r8n6n" Mar 10 07:06:05 crc kubenswrapper[4825]: I0310 07:06:05.419675 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6-combined-ca-bundle\") pod \"neutron-67bc54d95c-r8n6n\" (UID: \"f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6\") " pod="openstack/neutron-67bc54d95c-r8n6n" Mar 10 07:06:05 crc kubenswrapper[4825]: I0310 07:06:05.419798 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6-httpd-config\") pod \"neutron-67bc54d95c-r8n6n\" (UID: \"f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6\") " pod="openstack/neutron-67bc54d95c-r8n6n" Mar 10 07:06:05 crc kubenswrapper[4825]: I0310 07:06:05.419864 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6-ovndb-tls-certs\") pod \"neutron-67bc54d95c-r8n6n\" (UID: \"f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6\") " pod="openstack/neutron-67bc54d95c-r8n6n" Mar 10 07:06:05 crc kubenswrapper[4825]: I0310 07:06:05.419902 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6-internal-tls-certs\") pod \"neutron-67bc54d95c-r8n6n\" (UID: \"f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6\") " pod="openstack/neutron-67bc54d95c-r8n6n" Mar 10 07:06:05 crc kubenswrapper[4825]: I0310 07:06:05.420039 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6-public-tls-certs\") pod \"neutron-67bc54d95c-r8n6n\" (UID: \"f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6\") " pod="openstack/neutron-67bc54d95c-r8n6n" Mar 10 07:06:05 crc kubenswrapper[4825]: I0310 07:06:05.420082 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6-config\") pod \"neutron-67bc54d95c-r8n6n\" (UID: \"f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6\") " pod="openstack/neutron-67bc54d95c-r8n6n" Mar 10 07:06:05 crc kubenswrapper[4825]: I0310 07:06:05.420181 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlznw\" (UniqueName: \"kubernetes.io/projected/f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6-kube-api-access-mlznw\") pod \"neutron-67bc54d95c-r8n6n\" (UID: \"f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6\") " pod="openstack/neutron-67bc54d95c-r8n6n" Mar 10 07:06:05 crc kubenswrapper[4825]: I0310 07:06:05.522234 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6-public-tls-certs\") pod \"neutron-67bc54d95c-r8n6n\" (UID: \"f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6\") " pod="openstack/neutron-67bc54d95c-r8n6n" Mar 10 07:06:05 crc kubenswrapper[4825]: I0310 07:06:05.522317 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6-config\") pod \"neutron-67bc54d95c-r8n6n\" (UID: \"f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6\") " pod="openstack/neutron-67bc54d95c-r8n6n" Mar 10 07:06:05 crc kubenswrapper[4825]: I0310 07:06:05.522380 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlznw\" (UniqueName: \"kubernetes.io/projected/f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6-kube-api-access-mlznw\") pod \"neutron-67bc54d95c-r8n6n\" (UID: \"f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6\") " pod="openstack/neutron-67bc54d95c-r8n6n" Mar 10 07:06:05 crc kubenswrapper[4825]: I0310 07:06:05.522434 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6-combined-ca-bundle\") pod \"neutron-67bc54d95c-r8n6n\" (UID: \"f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6\") " pod="openstack/neutron-67bc54d95c-r8n6n" Mar 10 07:06:05 crc kubenswrapper[4825]: I0310 07:06:05.522464 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6-httpd-config\") pod \"neutron-67bc54d95c-r8n6n\" (UID: \"f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6\") " pod="openstack/neutron-67bc54d95c-r8n6n" Mar 10 07:06:05 crc kubenswrapper[4825]: I0310 07:06:05.522508 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6-ovndb-tls-certs\") pod 
\"neutron-67bc54d95c-r8n6n\" (UID: \"f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6\") " pod="openstack/neutron-67bc54d95c-r8n6n" Mar 10 07:06:05 crc kubenswrapper[4825]: I0310 07:06:05.522546 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6-internal-tls-certs\") pod \"neutron-67bc54d95c-r8n6n\" (UID: \"f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6\") " pod="openstack/neutron-67bc54d95c-r8n6n" Mar 10 07:06:05 crc kubenswrapper[4825]: I0310 07:06:05.530537 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6-httpd-config\") pod \"neutron-67bc54d95c-r8n6n\" (UID: \"f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6\") " pod="openstack/neutron-67bc54d95c-r8n6n" Mar 10 07:06:05 crc kubenswrapper[4825]: I0310 07:06:05.531118 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6-config\") pod \"neutron-67bc54d95c-r8n6n\" (UID: \"f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6\") " pod="openstack/neutron-67bc54d95c-r8n6n" Mar 10 07:06:05 crc kubenswrapper[4825]: I0310 07:06:05.545908 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6-ovndb-tls-certs\") pod \"neutron-67bc54d95c-r8n6n\" (UID: \"f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6\") " pod="openstack/neutron-67bc54d95c-r8n6n" Mar 10 07:06:05 crc kubenswrapper[4825]: I0310 07:06:05.546076 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6-internal-tls-certs\") pod \"neutron-67bc54d95c-r8n6n\" (UID: \"f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6\") " pod="openstack/neutron-67bc54d95c-r8n6n" Mar 10 07:06:05 crc 
kubenswrapper[4825]: I0310 07:06:05.551340 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6-public-tls-certs\") pod \"neutron-67bc54d95c-r8n6n\" (UID: \"f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6\") " pod="openstack/neutron-67bc54d95c-r8n6n" Mar 10 07:06:05 crc kubenswrapper[4825]: I0310 07:06:05.553226 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlznw\" (UniqueName: \"kubernetes.io/projected/f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6-kube-api-access-mlznw\") pod \"neutron-67bc54d95c-r8n6n\" (UID: \"f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6\") " pod="openstack/neutron-67bc54d95c-r8n6n" Mar 10 07:06:05 crc kubenswrapper[4825]: I0310 07:06:05.553283 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6-combined-ca-bundle\") pod \"neutron-67bc54d95c-r8n6n\" (UID: \"f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6\") " pod="openstack/neutron-67bc54d95c-r8n6n" Mar 10 07:06:05 crc kubenswrapper[4825]: I0310 07:06:05.580587 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552100-n59dv"] Mar 10 07:06:05 crc kubenswrapper[4825]: I0310 07:06:05.585775 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-67bc54d95c-r8n6n" Mar 10 07:06:05 crc kubenswrapper[4825]: I0310 07:06:05.601992 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552100-n59dv"] Mar 10 07:06:06 crc kubenswrapper[4825]: I0310 07:06:06.193391 4825 generic.go:334] "Generic (PLEG): container finished" podID="082f6f7c-700c-41f1-a10e-58223570c18c" containerID="ea7fc0200b81cf0f53d33c7f999d45d1a95c059b3c16e5b55b69ae7fd61d4a52" exitCode=0 Mar 10 07:06:06 crc kubenswrapper[4825]: I0310 07:06:06.193729 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7454bbb9bc-7l262" event={"ID":"082f6f7c-700c-41f1-a10e-58223570c18c","Type":"ContainerDied","Data":"ea7fc0200b81cf0f53d33c7f999d45d1a95c059b3c16e5b55b69ae7fd61d4a52"} Mar 10 07:06:06 crc kubenswrapper[4825]: I0310 07:06:06.199314 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"209fbdac-3b3c-451a-ae30-5888e1cbb891","Type":"ContainerStarted","Data":"41fab81e9dbca0e39235649b019d1b34334b13bd38163f9b2e2254c248abda96"} Mar 10 07:06:06 crc kubenswrapper[4825]: I0310 07:06:06.205654 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-67bc54d95c-r8n6n"] Mar 10 07:06:07 crc kubenswrapper[4825]: I0310 07:06:07.215806 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67bc54d95c-r8n6n" event={"ID":"f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6","Type":"ContainerStarted","Data":"3ad7566f82944df07d73d40d26dee49f887701e2a6b35d333784838aa475be9b"} Mar 10 07:06:07 crc kubenswrapper[4825]: I0310 07:06:07.216778 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-67bc54d95c-r8n6n" Mar 10 07:06:07 crc kubenswrapper[4825]: I0310 07:06:07.216826 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67bc54d95c-r8n6n" 
event={"ID":"f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6","Type":"ContainerStarted","Data":"2281ab9c75c402ac8b53f6007d3f795cf1ff09aad4c21d1c2d1f97b2f04c2605"} Mar 10 07:06:07 crc kubenswrapper[4825]: I0310 07:06:07.216868 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67bc54d95c-r8n6n" event={"ID":"f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6","Type":"ContainerStarted","Data":"e38d9882bdf6e2fd0a27032188b5caf3fbe81d2335e92cda553f3e1084342906"} Mar 10 07:06:07 crc kubenswrapper[4825]: I0310 07:06:07.222466 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"209fbdac-3b3c-451a-ae30-5888e1cbb891","Type":"ContainerStarted","Data":"bb56c1153d981c89befdcab3c321e793fa3da2a54c78cde62ebc8522df37f74c"} Mar 10 07:06:07 crc kubenswrapper[4825]: I0310 07:06:07.223450 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 10 07:06:07 crc kubenswrapper[4825]: I0310 07:06:07.246833 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-67bc54d95c-r8n6n" podStartSLOduration=2.246813761 podStartE2EDuration="2.246813761s" podCreationTimestamp="2026-03-10 07:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:06:07.237603151 +0000 UTC m=+1320.267383766" watchObservedRunningTime="2026-03-10 07:06:07.246813761 +0000 UTC m=+1320.276594376" Mar 10 07:06:07 crc kubenswrapper[4825]: I0310 07:06:07.253296 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e69e652a-0413-42a4-9fea-d20d094111ac" path="/var/lib/kubelet/pods/e69e652a-0413-42a4-9fea-d20d094111ac/volumes" Mar 10 07:06:07 crc kubenswrapper[4825]: I0310 07:06:07.273680 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.273658241 podStartE2EDuration="3.273658241s" 
podCreationTimestamp="2026-03-10 07:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:06:07.262220773 +0000 UTC m=+1320.292001398" watchObservedRunningTime="2026-03-10 07:06:07.273658241 +0000 UTC m=+1320.303438856" Mar 10 07:06:07 crc kubenswrapper[4825]: I0310 07:06:07.618404 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-7454bbb9bc-7l262" podUID="082f6f7c-700c-41f1-a10e-58223570c18c" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.157:9696/\": dial tcp 10.217.0.157:9696: connect: connection refused" Mar 10 07:06:07 crc kubenswrapper[4825]: I0310 07:06:07.952262 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7454bbb9bc-7l262" Mar 10 07:06:08 crc kubenswrapper[4825]: I0310 07:06:08.083341 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/082f6f7c-700c-41f1-a10e-58223570c18c-config\") pod \"082f6f7c-700c-41f1-a10e-58223570c18c\" (UID: \"082f6f7c-700c-41f1-a10e-58223570c18c\") " Mar 10 07:06:08 crc kubenswrapper[4825]: I0310 07:06:08.083464 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45xdg\" (UniqueName: \"kubernetes.io/projected/082f6f7c-700c-41f1-a10e-58223570c18c-kube-api-access-45xdg\") pod \"082f6f7c-700c-41f1-a10e-58223570c18c\" (UID: \"082f6f7c-700c-41f1-a10e-58223570c18c\") " Mar 10 07:06:08 crc kubenswrapper[4825]: I0310 07:06:08.083579 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/082f6f7c-700c-41f1-a10e-58223570c18c-httpd-config\") pod \"082f6f7c-700c-41f1-a10e-58223570c18c\" (UID: \"082f6f7c-700c-41f1-a10e-58223570c18c\") " Mar 10 07:06:08 crc kubenswrapper[4825]: I0310 07:06:08.083738 4825 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/082f6f7c-700c-41f1-a10e-58223570c18c-public-tls-certs\") pod \"082f6f7c-700c-41f1-a10e-58223570c18c\" (UID: \"082f6f7c-700c-41f1-a10e-58223570c18c\") " Mar 10 07:06:08 crc kubenswrapper[4825]: I0310 07:06:08.083830 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/082f6f7c-700c-41f1-a10e-58223570c18c-ovndb-tls-certs\") pod \"082f6f7c-700c-41f1-a10e-58223570c18c\" (UID: \"082f6f7c-700c-41f1-a10e-58223570c18c\") " Mar 10 07:06:08 crc kubenswrapper[4825]: I0310 07:06:08.083897 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/082f6f7c-700c-41f1-a10e-58223570c18c-internal-tls-certs\") pod \"082f6f7c-700c-41f1-a10e-58223570c18c\" (UID: \"082f6f7c-700c-41f1-a10e-58223570c18c\") " Mar 10 07:06:08 crc kubenswrapper[4825]: I0310 07:06:08.084007 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/082f6f7c-700c-41f1-a10e-58223570c18c-combined-ca-bundle\") pod \"082f6f7c-700c-41f1-a10e-58223570c18c\" (UID: \"082f6f7c-700c-41f1-a10e-58223570c18c\") " Mar 10 07:06:08 crc kubenswrapper[4825]: I0310 07:06:08.091074 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/082f6f7c-700c-41f1-a10e-58223570c18c-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "082f6f7c-700c-41f1-a10e-58223570c18c" (UID: "082f6f7c-700c-41f1-a10e-58223570c18c"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:08 crc kubenswrapper[4825]: I0310 07:06:08.092052 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/082f6f7c-700c-41f1-a10e-58223570c18c-kube-api-access-45xdg" (OuterVolumeSpecName: "kube-api-access-45xdg") pod "082f6f7c-700c-41f1-a10e-58223570c18c" (UID: "082f6f7c-700c-41f1-a10e-58223570c18c"). InnerVolumeSpecName "kube-api-access-45xdg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:06:08 crc kubenswrapper[4825]: I0310 07:06:08.150490 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/082f6f7c-700c-41f1-a10e-58223570c18c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "082f6f7c-700c-41f1-a10e-58223570c18c" (UID: "082f6f7c-700c-41f1-a10e-58223570c18c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:08 crc kubenswrapper[4825]: I0310 07:06:08.164052 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/082f6f7c-700c-41f1-a10e-58223570c18c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "082f6f7c-700c-41f1-a10e-58223570c18c" (UID: "082f6f7c-700c-41f1-a10e-58223570c18c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:08 crc kubenswrapper[4825]: I0310 07:06:08.166228 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/082f6f7c-700c-41f1-a10e-58223570c18c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "082f6f7c-700c-41f1-a10e-58223570c18c" (UID: "082f6f7c-700c-41f1-a10e-58223570c18c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:08 crc kubenswrapper[4825]: I0310 07:06:08.178963 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/082f6f7c-700c-41f1-a10e-58223570c18c-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "082f6f7c-700c-41f1-a10e-58223570c18c" (UID: "082f6f7c-700c-41f1-a10e-58223570c18c"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:08 crc kubenswrapper[4825]: I0310 07:06:08.185921 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/082f6f7c-700c-41f1-a10e-58223570c18c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:08 crc kubenswrapper[4825]: I0310 07:06:08.185948 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45xdg\" (UniqueName: \"kubernetes.io/projected/082f6f7c-700c-41f1-a10e-58223570c18c-kube-api-access-45xdg\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:08 crc kubenswrapper[4825]: I0310 07:06:08.185960 4825 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/082f6f7c-700c-41f1-a10e-58223570c18c-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:08 crc kubenswrapper[4825]: I0310 07:06:08.185968 4825 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/082f6f7c-700c-41f1-a10e-58223570c18c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:08 crc kubenswrapper[4825]: I0310 07:06:08.185976 4825 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/082f6f7c-700c-41f1-a10e-58223570c18c-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:08 crc kubenswrapper[4825]: I0310 07:06:08.185984 4825 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/082f6f7c-700c-41f1-a10e-58223570c18c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:08 crc kubenswrapper[4825]: I0310 07:06:08.190795 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/082f6f7c-700c-41f1-a10e-58223570c18c-config" (OuterVolumeSpecName: "config") pod "082f6f7c-700c-41f1-a10e-58223570c18c" (UID: "082f6f7c-700c-41f1-a10e-58223570c18c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:08 crc kubenswrapper[4825]: I0310 07:06:08.236239 4825 generic.go:334] "Generic (PLEG): container finished" podID="082f6f7c-700c-41f1-a10e-58223570c18c" containerID="f41b2155d8729db5968bc0758def9b1914d2c9b44cd66a96c2353428211928d7" exitCode=0 Mar 10 07:06:08 crc kubenswrapper[4825]: I0310 07:06:08.236299 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7454bbb9bc-7l262" Mar 10 07:06:08 crc kubenswrapper[4825]: I0310 07:06:08.236310 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7454bbb9bc-7l262" event={"ID":"082f6f7c-700c-41f1-a10e-58223570c18c","Type":"ContainerDied","Data":"f41b2155d8729db5968bc0758def9b1914d2c9b44cd66a96c2353428211928d7"} Mar 10 07:06:08 crc kubenswrapper[4825]: I0310 07:06:08.236356 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7454bbb9bc-7l262" event={"ID":"082f6f7c-700c-41f1-a10e-58223570c18c","Type":"ContainerDied","Data":"e46e37ce674680346f695cdc7c3357bd4c0e06959a6f94ff7c361bc7997e5aa8"} Mar 10 07:06:08 crc kubenswrapper[4825]: I0310 07:06:08.236378 4825 scope.go:117] "RemoveContainer" containerID="ea7fc0200b81cf0f53d33c7f999d45d1a95c059b3c16e5b55b69ae7fd61d4a52" Mar 10 07:06:08 crc kubenswrapper[4825]: I0310 07:06:08.298570 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/082f6f7c-700c-41f1-a10e-58223570c18c-config\") on 
node \"crc\" DevicePath \"\"" Mar 10 07:06:08 crc kubenswrapper[4825]: I0310 07:06:08.311576 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7454bbb9bc-7l262"] Mar 10 07:06:08 crc kubenswrapper[4825]: I0310 07:06:08.321790 4825 scope.go:117] "RemoveContainer" containerID="f41b2155d8729db5968bc0758def9b1914d2c9b44cd66a96c2353428211928d7" Mar 10 07:06:08 crc kubenswrapper[4825]: I0310 07:06:08.331691 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7454bbb9bc-7l262"] Mar 10 07:06:08 crc kubenswrapper[4825]: I0310 07:06:08.347998 4825 scope.go:117] "RemoveContainer" containerID="ea7fc0200b81cf0f53d33c7f999d45d1a95c059b3c16e5b55b69ae7fd61d4a52" Mar 10 07:06:08 crc kubenswrapper[4825]: E0310 07:06:08.348544 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea7fc0200b81cf0f53d33c7f999d45d1a95c059b3c16e5b55b69ae7fd61d4a52\": container with ID starting with ea7fc0200b81cf0f53d33c7f999d45d1a95c059b3c16e5b55b69ae7fd61d4a52 not found: ID does not exist" containerID="ea7fc0200b81cf0f53d33c7f999d45d1a95c059b3c16e5b55b69ae7fd61d4a52" Mar 10 07:06:08 crc kubenswrapper[4825]: I0310 07:06:08.348591 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea7fc0200b81cf0f53d33c7f999d45d1a95c059b3c16e5b55b69ae7fd61d4a52"} err="failed to get container status \"ea7fc0200b81cf0f53d33c7f999d45d1a95c059b3c16e5b55b69ae7fd61d4a52\": rpc error: code = NotFound desc = could not find container \"ea7fc0200b81cf0f53d33c7f999d45d1a95c059b3c16e5b55b69ae7fd61d4a52\": container with ID starting with ea7fc0200b81cf0f53d33c7f999d45d1a95c059b3c16e5b55b69ae7fd61d4a52 not found: ID does not exist" Mar 10 07:06:08 crc kubenswrapper[4825]: I0310 07:06:08.348623 4825 scope.go:117] "RemoveContainer" containerID="f41b2155d8729db5968bc0758def9b1914d2c9b44cd66a96c2353428211928d7" Mar 10 07:06:08 crc kubenswrapper[4825]: E0310 
07:06:08.349467 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f41b2155d8729db5968bc0758def9b1914d2c9b44cd66a96c2353428211928d7\": container with ID starting with f41b2155d8729db5968bc0758def9b1914d2c9b44cd66a96c2353428211928d7 not found: ID does not exist" containerID="f41b2155d8729db5968bc0758def9b1914d2c9b44cd66a96c2353428211928d7" Mar 10 07:06:08 crc kubenswrapper[4825]: I0310 07:06:08.349556 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f41b2155d8729db5968bc0758def9b1914d2c9b44cd66a96c2353428211928d7"} err="failed to get container status \"f41b2155d8729db5968bc0758def9b1914d2c9b44cd66a96c2353428211928d7\": rpc error: code = NotFound desc = could not find container \"f41b2155d8729db5968bc0758def9b1914d2c9b44cd66a96c2353428211928d7\": container with ID starting with f41b2155d8729db5968bc0758def9b1914d2c9b44cd66a96c2353428211928d7 not found: ID does not exist" Mar 10 07:06:09 crc kubenswrapper[4825]: I0310 07:06:09.255625 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="082f6f7c-700c-41f1-a10e-58223570c18c" path="/var/lib/kubelet/pods/082f6f7c-700c-41f1-a10e-58223570c18c/volumes" Mar 10 07:06:09 crc kubenswrapper[4825]: I0310 07:06:09.741009 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6b684b6dd6-r5vgn" Mar 10 07:06:09 crc kubenswrapper[4825]: I0310 07:06:09.743087 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6b684b6dd6-r5vgn" Mar 10 07:06:09 crc kubenswrapper[4825]: I0310 07:06:09.840492 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 10 07:06:09 crc kubenswrapper[4825]: I0310 07:06:09.851604 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b8fcc65cc-n7rdd" Mar 10 07:06:09 crc 
kubenswrapper[4825]: I0310 07:06:09.907451 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 07:06:09 crc kubenswrapper[4825]: I0310 07:06:09.952109 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8449d68f4f-zldfp"] Mar 10 07:06:09 crc kubenswrapper[4825]: I0310 07:06:09.952357 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8449d68f4f-zldfp" podUID="b960a8d7-4706-4657-9991-84a87645ab8a" containerName="dnsmasq-dns" containerID="cri-o://1a209b2e2e157803fd09a01c067e18e18efe719eae8e01be688bef95b8efab6f" gracePeriod=10 Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.020354 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-55b54bdfdb-8b9ns"] Mar 10 07:06:10 crc kubenswrapper[4825]: E0310 07:06:10.021007 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="082f6f7c-700c-41f1-a10e-58223570c18c" containerName="neutron-httpd" Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.021026 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="082f6f7c-700c-41f1-a10e-58223570c18c" containerName="neutron-httpd" Mar 10 07:06:10 crc kubenswrapper[4825]: E0310 07:06:10.021049 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="082f6f7c-700c-41f1-a10e-58223570c18c" containerName="neutron-api" Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.021056 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="082f6f7c-700c-41f1-a10e-58223570c18c" containerName="neutron-api" Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.021296 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="082f6f7c-700c-41f1-a10e-58223570c18c" containerName="neutron-httpd" Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.021316 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="082f6f7c-700c-41f1-a10e-58223570c18c" containerName="neutron-api" 
Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.022279 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-55b54bdfdb-8b9ns" Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.043840 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-55b54bdfdb-8b9ns"] Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.146490 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2kp2\" (UniqueName: \"kubernetes.io/projected/56c74365-656c-4362-8358-bbb17d0c8be0-kube-api-access-x2kp2\") pod \"placement-55b54bdfdb-8b9ns\" (UID: \"56c74365-656c-4362-8358-bbb17d0c8be0\") " pod="openstack/placement-55b54bdfdb-8b9ns" Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.146562 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/56c74365-656c-4362-8358-bbb17d0c8be0-internal-tls-certs\") pod \"placement-55b54bdfdb-8b9ns\" (UID: \"56c74365-656c-4362-8358-bbb17d0c8be0\") " pod="openstack/placement-55b54bdfdb-8b9ns" Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.146682 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56c74365-656c-4362-8358-bbb17d0c8be0-combined-ca-bundle\") pod \"placement-55b54bdfdb-8b9ns\" (UID: \"56c74365-656c-4362-8358-bbb17d0c8be0\") " pod="openstack/placement-55b54bdfdb-8b9ns" Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.146737 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56c74365-656c-4362-8358-bbb17d0c8be0-config-data\") pod \"placement-55b54bdfdb-8b9ns\" (UID: \"56c74365-656c-4362-8358-bbb17d0c8be0\") " pod="openstack/placement-55b54bdfdb-8b9ns" Mar 10 07:06:10 crc 
kubenswrapper[4825]: I0310 07:06:10.147036 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56c74365-656c-4362-8358-bbb17d0c8be0-logs\") pod \"placement-55b54bdfdb-8b9ns\" (UID: \"56c74365-656c-4362-8358-bbb17d0c8be0\") " pod="openstack/placement-55b54bdfdb-8b9ns" Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.147544 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56c74365-656c-4362-8358-bbb17d0c8be0-scripts\") pod \"placement-55b54bdfdb-8b9ns\" (UID: \"56c74365-656c-4362-8358-bbb17d0c8be0\") " pod="openstack/placement-55b54bdfdb-8b9ns" Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.148292 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/56c74365-656c-4362-8358-bbb17d0c8be0-public-tls-certs\") pod \"placement-55b54bdfdb-8b9ns\" (UID: \"56c74365-656c-4362-8358-bbb17d0c8be0\") " pod="openstack/placement-55b54bdfdb-8b9ns" Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.251654 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2kp2\" (UniqueName: \"kubernetes.io/projected/56c74365-656c-4362-8358-bbb17d0c8be0-kube-api-access-x2kp2\") pod \"placement-55b54bdfdb-8b9ns\" (UID: \"56c74365-656c-4362-8358-bbb17d0c8be0\") " pod="openstack/placement-55b54bdfdb-8b9ns" Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.251990 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/56c74365-656c-4362-8358-bbb17d0c8be0-internal-tls-certs\") pod \"placement-55b54bdfdb-8b9ns\" (UID: \"56c74365-656c-4362-8358-bbb17d0c8be0\") " pod="openstack/placement-55b54bdfdb-8b9ns" Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 
07:06:10.252024 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56c74365-656c-4362-8358-bbb17d0c8be0-combined-ca-bundle\") pod \"placement-55b54bdfdb-8b9ns\" (UID: \"56c74365-656c-4362-8358-bbb17d0c8be0\") " pod="openstack/placement-55b54bdfdb-8b9ns" Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.252050 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56c74365-656c-4362-8358-bbb17d0c8be0-config-data\") pod \"placement-55b54bdfdb-8b9ns\" (UID: \"56c74365-656c-4362-8358-bbb17d0c8be0\") " pod="openstack/placement-55b54bdfdb-8b9ns" Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.252110 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56c74365-656c-4362-8358-bbb17d0c8be0-logs\") pod \"placement-55b54bdfdb-8b9ns\" (UID: \"56c74365-656c-4362-8358-bbb17d0c8be0\") " pod="openstack/placement-55b54bdfdb-8b9ns" Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.252159 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56c74365-656c-4362-8358-bbb17d0c8be0-scripts\") pod \"placement-55b54bdfdb-8b9ns\" (UID: \"56c74365-656c-4362-8358-bbb17d0c8be0\") " pod="openstack/placement-55b54bdfdb-8b9ns" Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.252229 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/56c74365-656c-4362-8358-bbb17d0c8be0-public-tls-certs\") pod \"placement-55b54bdfdb-8b9ns\" (UID: \"56c74365-656c-4362-8358-bbb17d0c8be0\") " pod="openstack/placement-55b54bdfdb-8b9ns" Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.253191 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/56c74365-656c-4362-8358-bbb17d0c8be0-logs\") pod \"placement-55b54bdfdb-8b9ns\" (UID: \"56c74365-656c-4362-8358-bbb17d0c8be0\") " pod="openstack/placement-55b54bdfdb-8b9ns" Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.259858 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/56c74365-656c-4362-8358-bbb17d0c8be0-internal-tls-certs\") pod \"placement-55b54bdfdb-8b9ns\" (UID: \"56c74365-656c-4362-8358-bbb17d0c8be0\") " pod="openstack/placement-55b54bdfdb-8b9ns" Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.273838 4825 generic.go:334] "Generic (PLEG): container finished" podID="b960a8d7-4706-4657-9991-84a87645ab8a" containerID="1a209b2e2e157803fd09a01c067e18e18efe719eae8e01be688bef95b8efab6f" exitCode=0 Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.273948 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8449d68f4f-zldfp" event={"ID":"b960a8d7-4706-4657-9991-84a87645ab8a","Type":"ContainerDied","Data":"1a209b2e2e157803fd09a01c067e18e18efe719eae8e01be688bef95b8efab6f"} Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.274960 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e0c5ff0f-ee41-4f20-b87d-4db0855c9161" containerName="cinder-scheduler" containerID="cri-o://634ce193db17752ea91a7a337afdbfe210ec554d1465d0cdbcd5504db7321043" gracePeriod=30 Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.275429 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e0c5ff0f-ee41-4f20-b87d-4db0855c9161" containerName="probe" containerID="cri-o://8a4d30188885f55be9bc509363bf84961ab41e834420534de1614a2d5da7a44e" gracePeriod=30 Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.290075 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-x2kp2\" (UniqueName: \"kubernetes.io/projected/56c74365-656c-4362-8358-bbb17d0c8be0-kube-api-access-x2kp2\") pod \"placement-55b54bdfdb-8b9ns\" (UID: \"56c74365-656c-4362-8358-bbb17d0c8be0\") " pod="openstack/placement-55b54bdfdb-8b9ns" Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.303716 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56c74365-656c-4362-8358-bbb17d0c8be0-scripts\") pod \"placement-55b54bdfdb-8b9ns\" (UID: \"56c74365-656c-4362-8358-bbb17d0c8be0\") " pod="openstack/placement-55b54bdfdb-8b9ns" Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.304344 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56c74365-656c-4362-8358-bbb17d0c8be0-combined-ca-bundle\") pod \"placement-55b54bdfdb-8b9ns\" (UID: \"56c74365-656c-4362-8358-bbb17d0c8be0\") " pod="openstack/placement-55b54bdfdb-8b9ns" Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.307473 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/56c74365-656c-4362-8358-bbb17d0c8be0-public-tls-certs\") pod \"placement-55b54bdfdb-8b9ns\" (UID: \"56c74365-656c-4362-8358-bbb17d0c8be0\") " pod="openstack/placement-55b54bdfdb-8b9ns" Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.322165 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56c74365-656c-4362-8358-bbb17d0c8be0-config-data\") pod \"placement-55b54bdfdb-8b9ns\" (UID: \"56c74365-656c-4362-8358-bbb17d0c8be0\") " pod="openstack/placement-55b54bdfdb-8b9ns" Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.408165 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-55b54bdfdb-8b9ns" Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.442816 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8449d68f4f-zldfp" Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.576430 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b960a8d7-4706-4657-9991-84a87645ab8a-ovsdbserver-nb\") pod \"b960a8d7-4706-4657-9991-84a87645ab8a\" (UID: \"b960a8d7-4706-4657-9991-84a87645ab8a\") " Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.578509 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b960a8d7-4706-4657-9991-84a87645ab8a-dns-svc\") pod \"b960a8d7-4706-4657-9991-84a87645ab8a\" (UID: \"b960a8d7-4706-4657-9991-84a87645ab8a\") " Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.578667 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b960a8d7-4706-4657-9991-84a87645ab8a-config\") pod \"b960a8d7-4706-4657-9991-84a87645ab8a\" (UID: \"b960a8d7-4706-4657-9991-84a87645ab8a\") " Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.578687 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b960a8d7-4706-4657-9991-84a87645ab8a-ovsdbserver-sb\") pod \"b960a8d7-4706-4657-9991-84a87645ab8a\" (UID: \"b960a8d7-4706-4657-9991-84a87645ab8a\") " Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.578707 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b960a8d7-4706-4657-9991-84a87645ab8a-dns-swift-storage-0\") pod \"b960a8d7-4706-4657-9991-84a87645ab8a\" (UID: 
\"b960a8d7-4706-4657-9991-84a87645ab8a\") " Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.578745 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwbc6\" (UniqueName: \"kubernetes.io/projected/b960a8d7-4706-4657-9991-84a87645ab8a-kube-api-access-mwbc6\") pod \"b960a8d7-4706-4657-9991-84a87645ab8a\" (UID: \"b960a8d7-4706-4657-9991-84a87645ab8a\") " Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.584919 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b960a8d7-4706-4657-9991-84a87645ab8a-kube-api-access-mwbc6" (OuterVolumeSpecName: "kube-api-access-mwbc6") pod "b960a8d7-4706-4657-9991-84a87645ab8a" (UID: "b960a8d7-4706-4657-9991-84a87645ab8a"). InnerVolumeSpecName "kube-api-access-mwbc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.625950 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b960a8d7-4706-4657-9991-84a87645ab8a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b960a8d7-4706-4657-9991-84a87645ab8a" (UID: "b960a8d7-4706-4657-9991-84a87645ab8a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.643411 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b960a8d7-4706-4657-9991-84a87645ab8a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b960a8d7-4706-4657-9991-84a87645ab8a" (UID: "b960a8d7-4706-4657-9991-84a87645ab8a"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.643608 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b960a8d7-4706-4657-9991-84a87645ab8a-config" (OuterVolumeSpecName: "config") pod "b960a8d7-4706-4657-9991-84a87645ab8a" (UID: "b960a8d7-4706-4657-9991-84a87645ab8a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.647289 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b960a8d7-4706-4657-9991-84a87645ab8a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b960a8d7-4706-4657-9991-84a87645ab8a" (UID: "b960a8d7-4706-4657-9991-84a87645ab8a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.653164 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b960a8d7-4706-4657-9991-84a87645ab8a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b960a8d7-4706-4657-9991-84a87645ab8a" (UID: "b960a8d7-4706-4657-9991-84a87645ab8a"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.680882 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b960a8d7-4706-4657-9991-84a87645ab8a-config\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.680921 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b960a8d7-4706-4657-9991-84a87645ab8a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.680932 4825 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b960a8d7-4706-4657-9991-84a87645ab8a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.680949 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwbc6\" (UniqueName: \"kubernetes.io/projected/b960a8d7-4706-4657-9991-84a87645ab8a-kube-api-access-mwbc6\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.680962 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b960a8d7-4706-4657-9991-84a87645ab8a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.680971 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b960a8d7-4706-4657-9991-84a87645ab8a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:10 crc kubenswrapper[4825]: W0310 07:06:10.882826 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56c74365_656c_4362_8358_bbb17d0c8be0.slice/crio-f51a988b85767d51c95d15a0175c79733a28ab4631101dd4cb9b8e62151b9f0d WatchSource:0}: Error finding 
container f51a988b85767d51c95d15a0175c79733a28ab4631101dd4cb9b8e62151b9f0d: Status 404 returned error can't find the container with id f51a988b85767d51c95d15a0175c79733a28ab4631101dd4cb9b8e62151b9f0d Mar 10 07:06:10 crc kubenswrapper[4825]: I0310 07:06:10.895676 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-55b54bdfdb-8b9ns"] Mar 10 07:06:11 crc kubenswrapper[4825]: I0310 07:06:11.294945 4825 generic.go:334] "Generic (PLEG): container finished" podID="e0c5ff0f-ee41-4f20-b87d-4db0855c9161" containerID="8a4d30188885f55be9bc509363bf84961ab41e834420534de1614a2d5da7a44e" exitCode=0 Mar 10 07:06:11 crc kubenswrapper[4825]: I0310 07:06:11.295033 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e0c5ff0f-ee41-4f20-b87d-4db0855c9161","Type":"ContainerDied","Data":"8a4d30188885f55be9bc509363bf84961ab41e834420534de1614a2d5da7a44e"} Mar 10 07:06:11 crc kubenswrapper[4825]: I0310 07:06:11.299373 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8449d68f4f-zldfp" Mar 10 07:06:11 crc kubenswrapper[4825]: I0310 07:06:11.299374 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8449d68f4f-zldfp" event={"ID":"b960a8d7-4706-4657-9991-84a87645ab8a","Type":"ContainerDied","Data":"f92b204f9a371e5c6f00262aab6ca478581ae4b6a826a651ac62bba9beabbd93"} Mar 10 07:06:11 crc kubenswrapper[4825]: I0310 07:06:11.299448 4825 scope.go:117] "RemoveContainer" containerID="1a209b2e2e157803fd09a01c067e18e18efe719eae8e01be688bef95b8efab6f" Mar 10 07:06:11 crc kubenswrapper[4825]: I0310 07:06:11.301544 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55b54bdfdb-8b9ns" event={"ID":"56c74365-656c-4362-8358-bbb17d0c8be0","Type":"ContainerStarted","Data":"cf0bf49129662bee51dc0d9f341998b13bf6e4b5c8b81b94887a67926afba8ca"} Mar 10 07:06:11 crc kubenswrapper[4825]: I0310 07:06:11.301588 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55b54bdfdb-8b9ns" event={"ID":"56c74365-656c-4362-8358-bbb17d0c8be0","Type":"ContainerStarted","Data":"f51a988b85767d51c95d15a0175c79733a28ab4631101dd4cb9b8e62151b9f0d"} Mar 10 07:06:11 crc kubenswrapper[4825]: I0310 07:06:11.374979 4825 scope.go:117] "RemoveContainer" containerID="16414e715215838cd5c5f664a3215cd34f3e6389a5976ded0f744bb1b68662f9" Mar 10 07:06:11 crc kubenswrapper[4825]: I0310 07:06:11.383536 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8449d68f4f-zldfp"] Mar 10 07:06:11 crc kubenswrapper[4825]: I0310 07:06:11.392244 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8449d68f4f-zldfp"] Mar 10 07:06:12 crc kubenswrapper[4825]: I0310 07:06:12.323889 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55b54bdfdb-8b9ns" 
event={"ID":"56c74365-656c-4362-8358-bbb17d0c8be0","Type":"ContainerStarted","Data":"506bc72445e5baadada4d8eeada583023eaf1965303dabc6bfea713ddd7e5cda"} Mar 10 07:06:12 crc kubenswrapper[4825]: I0310 07:06:12.325080 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-55b54bdfdb-8b9ns" Mar 10 07:06:12 crc kubenswrapper[4825]: I0310 07:06:12.325179 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-55b54bdfdb-8b9ns" Mar 10 07:06:12 crc kubenswrapper[4825]: I0310 07:06:12.375452 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-55b54bdfdb-8b9ns" podStartSLOduration=3.3753796400000002 podStartE2EDuration="3.37537964s" podCreationTimestamp="2026-03-10 07:06:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:06:12.350627604 +0000 UTC m=+1325.380408269" watchObservedRunningTime="2026-03-10 07:06:12.37537964 +0000 UTC m=+1325.405160275" Mar 10 07:06:13 crc kubenswrapper[4825]: I0310 07:06:13.265408 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b960a8d7-4706-4657-9991-84a87645ab8a" path="/var/lib/kubelet/pods/b960a8d7-4706-4657-9991-84a87645ab8a/volumes" Mar 10 07:06:13 crc kubenswrapper[4825]: I0310 07:06:13.343868 4825 generic.go:334] "Generic (PLEG): container finished" podID="e0c5ff0f-ee41-4f20-b87d-4db0855c9161" containerID="634ce193db17752ea91a7a337afdbfe210ec554d1465d0cdbcd5504db7321043" exitCode=0 Mar 10 07:06:13 crc kubenswrapper[4825]: I0310 07:06:13.343928 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e0c5ff0f-ee41-4f20-b87d-4db0855c9161","Type":"ContainerDied","Data":"634ce193db17752ea91a7a337afdbfe210ec554d1465d0cdbcd5504db7321043"} Mar 10 07:06:13 crc kubenswrapper[4825]: I0310 07:06:13.629407 4825 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 07:06:13 crc kubenswrapper[4825]: I0310 07:06:13.750815 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0c5ff0f-ee41-4f20-b87d-4db0855c9161-config-data-custom\") pod \"e0c5ff0f-ee41-4f20-b87d-4db0855c9161\" (UID: \"e0c5ff0f-ee41-4f20-b87d-4db0855c9161\") " Mar 10 07:06:13 crc kubenswrapper[4825]: I0310 07:06:13.753378 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0c5ff0f-ee41-4f20-b87d-4db0855c9161-scripts\") pod \"e0c5ff0f-ee41-4f20-b87d-4db0855c9161\" (UID: \"e0c5ff0f-ee41-4f20-b87d-4db0855c9161\") " Mar 10 07:06:13 crc kubenswrapper[4825]: I0310 07:06:13.753669 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0c5ff0f-ee41-4f20-b87d-4db0855c9161-config-data\") pod \"e0c5ff0f-ee41-4f20-b87d-4db0855c9161\" (UID: \"e0c5ff0f-ee41-4f20-b87d-4db0855c9161\") " Mar 10 07:06:13 crc kubenswrapper[4825]: I0310 07:06:13.753780 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e0c5ff0f-ee41-4f20-b87d-4db0855c9161-etc-machine-id\") pod \"e0c5ff0f-ee41-4f20-b87d-4db0855c9161\" (UID: \"e0c5ff0f-ee41-4f20-b87d-4db0855c9161\") " Mar 10 07:06:13 crc kubenswrapper[4825]: I0310 07:06:13.754005 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mclx\" (UniqueName: \"kubernetes.io/projected/e0c5ff0f-ee41-4f20-b87d-4db0855c9161-kube-api-access-2mclx\") pod \"e0c5ff0f-ee41-4f20-b87d-4db0855c9161\" (UID: \"e0c5ff0f-ee41-4f20-b87d-4db0855c9161\") " Mar 10 07:06:13 crc kubenswrapper[4825]: I0310 07:06:13.754050 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/e0c5ff0f-ee41-4f20-b87d-4db0855c9161-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e0c5ff0f-ee41-4f20-b87d-4db0855c9161" (UID: "e0c5ff0f-ee41-4f20-b87d-4db0855c9161"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 07:06:13 crc kubenswrapper[4825]: I0310 07:06:13.754087 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0c5ff0f-ee41-4f20-b87d-4db0855c9161-combined-ca-bundle\") pod \"e0c5ff0f-ee41-4f20-b87d-4db0855c9161\" (UID: \"e0c5ff0f-ee41-4f20-b87d-4db0855c9161\") " Mar 10 07:06:13 crc kubenswrapper[4825]: I0310 07:06:13.755967 4825 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e0c5ff0f-ee41-4f20-b87d-4db0855c9161-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:13 crc kubenswrapper[4825]: I0310 07:06:13.758513 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0c5ff0f-ee41-4f20-b87d-4db0855c9161-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e0c5ff0f-ee41-4f20-b87d-4db0855c9161" (UID: "e0c5ff0f-ee41-4f20-b87d-4db0855c9161"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:13 crc kubenswrapper[4825]: I0310 07:06:13.773093 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0c5ff0f-ee41-4f20-b87d-4db0855c9161-kube-api-access-2mclx" (OuterVolumeSpecName: "kube-api-access-2mclx") pod "e0c5ff0f-ee41-4f20-b87d-4db0855c9161" (UID: "e0c5ff0f-ee41-4f20-b87d-4db0855c9161"). InnerVolumeSpecName "kube-api-access-2mclx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:06:13 crc kubenswrapper[4825]: I0310 07:06:13.776351 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0c5ff0f-ee41-4f20-b87d-4db0855c9161-scripts" (OuterVolumeSpecName: "scripts") pod "e0c5ff0f-ee41-4f20-b87d-4db0855c9161" (UID: "e0c5ff0f-ee41-4f20-b87d-4db0855c9161"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:13 crc kubenswrapper[4825]: I0310 07:06:13.814453 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0c5ff0f-ee41-4f20-b87d-4db0855c9161-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0c5ff0f-ee41-4f20-b87d-4db0855c9161" (UID: "e0c5ff0f-ee41-4f20-b87d-4db0855c9161"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:13 crc kubenswrapper[4825]: I0310 07:06:13.823493 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-599df5898d-bqcpr" Mar 10 07:06:13 crc kubenswrapper[4825]: I0310 07:06:13.847915 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0c5ff0f-ee41-4f20-b87d-4db0855c9161-config-data" (OuterVolumeSpecName: "config-data") pod "e0c5ff0f-ee41-4f20-b87d-4db0855c9161" (UID: "e0c5ff0f-ee41-4f20-b87d-4db0855c9161"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:13 crc kubenswrapper[4825]: I0310 07:06:13.858942 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0c5ff0f-ee41-4f20-b87d-4db0855c9161-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:13 crc kubenswrapper[4825]: I0310 07:06:13.858981 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mclx\" (UniqueName: \"kubernetes.io/projected/e0c5ff0f-ee41-4f20-b87d-4db0855c9161-kube-api-access-2mclx\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:13 crc kubenswrapper[4825]: I0310 07:06:13.858992 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0c5ff0f-ee41-4f20-b87d-4db0855c9161-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:13 crc kubenswrapper[4825]: I0310 07:06:13.859001 4825 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0c5ff0f-ee41-4f20-b87d-4db0855c9161-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:13 crc kubenswrapper[4825]: I0310 07:06:13.859010 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0c5ff0f-ee41-4f20-b87d-4db0855c9161-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:14 crc kubenswrapper[4825]: I0310 07:06:14.359916 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e0c5ff0f-ee41-4f20-b87d-4db0855c9161","Type":"ContainerDied","Data":"8b10207fc4fb452bbd657124f0f25d2fe603777f48ff2291ee370ef8f9be358e"} Mar 10 07:06:14 crc kubenswrapper[4825]: I0310 07:06:14.360036 4825 scope.go:117] "RemoveContainer" containerID="8a4d30188885f55be9bc509363bf84961ab41e834420534de1614a2d5da7a44e" Mar 10 07:06:14 crc kubenswrapper[4825]: I0310 07:06:14.360321 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 07:06:14 crc kubenswrapper[4825]: I0310 07:06:14.400265 4825 scope.go:117] "RemoveContainer" containerID="634ce193db17752ea91a7a337afdbfe210ec554d1465d0cdbcd5504db7321043" Mar 10 07:06:14 crc kubenswrapper[4825]: I0310 07:06:14.425267 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 07:06:14 crc kubenswrapper[4825]: I0310 07:06:14.434556 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 07:06:14 crc kubenswrapper[4825]: I0310 07:06:14.441458 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 07:06:14 crc kubenswrapper[4825]: E0310 07:06:14.442013 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b960a8d7-4706-4657-9991-84a87645ab8a" containerName="init" Mar 10 07:06:14 crc kubenswrapper[4825]: I0310 07:06:14.442040 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b960a8d7-4706-4657-9991-84a87645ab8a" containerName="init" Mar 10 07:06:14 crc kubenswrapper[4825]: E0310 07:06:14.442062 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0c5ff0f-ee41-4f20-b87d-4db0855c9161" containerName="cinder-scheduler" Mar 10 07:06:14 crc kubenswrapper[4825]: I0310 07:06:14.442072 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0c5ff0f-ee41-4f20-b87d-4db0855c9161" containerName="cinder-scheduler" Mar 10 07:06:14 crc kubenswrapper[4825]: E0310 07:06:14.442090 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0c5ff0f-ee41-4f20-b87d-4db0855c9161" containerName="probe" Mar 10 07:06:14 crc kubenswrapper[4825]: I0310 07:06:14.442098 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0c5ff0f-ee41-4f20-b87d-4db0855c9161" containerName="probe" Mar 10 07:06:14 crc kubenswrapper[4825]: E0310 07:06:14.442113 4825 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b960a8d7-4706-4657-9991-84a87645ab8a" containerName="dnsmasq-dns" Mar 10 07:06:14 crc kubenswrapper[4825]: I0310 07:06:14.442120 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b960a8d7-4706-4657-9991-84a87645ab8a" containerName="dnsmasq-dns" Mar 10 07:06:14 crc kubenswrapper[4825]: I0310 07:06:14.442379 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0c5ff0f-ee41-4f20-b87d-4db0855c9161" containerName="probe" Mar 10 07:06:14 crc kubenswrapper[4825]: I0310 07:06:14.442418 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0c5ff0f-ee41-4f20-b87d-4db0855c9161" containerName="cinder-scheduler" Mar 10 07:06:14 crc kubenswrapper[4825]: I0310 07:06:14.442429 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="b960a8d7-4706-4657-9991-84a87645ab8a" containerName="dnsmasq-dns" Mar 10 07:06:14 crc kubenswrapper[4825]: I0310 07:06:14.443656 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 07:06:14 crc kubenswrapper[4825]: I0310 07:06:14.447604 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 10 07:06:14 crc kubenswrapper[4825]: I0310 07:06:14.451070 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 07:06:14 crc kubenswrapper[4825]: I0310 07:06:14.575845 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca4ebafb-aa22-44ad-8037-487a9c3baca4-scripts\") pod \"cinder-scheduler-0\" (UID: \"ca4ebafb-aa22-44ad-8037-487a9c3baca4\") " pod="openstack/cinder-scheduler-0" Mar 10 07:06:14 crc kubenswrapper[4825]: I0310 07:06:14.575914 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca4ebafb-aa22-44ad-8037-487a9c3baca4-config-data\") 
pod \"cinder-scheduler-0\" (UID: \"ca4ebafb-aa22-44ad-8037-487a9c3baca4\") " pod="openstack/cinder-scheduler-0" Mar 10 07:06:14 crc kubenswrapper[4825]: I0310 07:06:14.575978 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ca4ebafb-aa22-44ad-8037-487a9c3baca4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ca4ebafb-aa22-44ad-8037-487a9c3baca4\") " pod="openstack/cinder-scheduler-0" Mar 10 07:06:14 crc kubenswrapper[4825]: I0310 07:06:14.575996 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca4ebafb-aa22-44ad-8037-487a9c3baca4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ca4ebafb-aa22-44ad-8037-487a9c3baca4\") " pod="openstack/cinder-scheduler-0" Mar 10 07:06:14 crc kubenswrapper[4825]: I0310 07:06:14.576289 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca4ebafb-aa22-44ad-8037-487a9c3baca4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ca4ebafb-aa22-44ad-8037-487a9c3baca4\") " pod="openstack/cinder-scheduler-0" Mar 10 07:06:14 crc kubenswrapper[4825]: I0310 07:06:14.576485 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj4g7\" (UniqueName: \"kubernetes.io/projected/ca4ebafb-aa22-44ad-8037-487a9c3baca4-kube-api-access-cj4g7\") pod \"cinder-scheduler-0\" (UID: \"ca4ebafb-aa22-44ad-8037-487a9c3baca4\") " pod="openstack/cinder-scheduler-0" Mar 10 07:06:14 crc kubenswrapper[4825]: I0310 07:06:14.678753 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ca4ebafb-aa22-44ad-8037-487a9c3baca4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: 
\"ca4ebafb-aa22-44ad-8037-487a9c3baca4\") " pod="openstack/cinder-scheduler-0" Mar 10 07:06:14 crc kubenswrapper[4825]: I0310 07:06:14.678820 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca4ebafb-aa22-44ad-8037-487a9c3baca4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ca4ebafb-aa22-44ad-8037-487a9c3baca4\") " pod="openstack/cinder-scheduler-0" Mar 10 07:06:14 crc kubenswrapper[4825]: I0310 07:06:14.678857 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca4ebafb-aa22-44ad-8037-487a9c3baca4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ca4ebafb-aa22-44ad-8037-487a9c3baca4\") " pod="openstack/cinder-scheduler-0" Mar 10 07:06:14 crc kubenswrapper[4825]: I0310 07:06:14.678903 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cj4g7\" (UniqueName: \"kubernetes.io/projected/ca4ebafb-aa22-44ad-8037-487a9c3baca4-kube-api-access-cj4g7\") pod \"cinder-scheduler-0\" (UID: \"ca4ebafb-aa22-44ad-8037-487a9c3baca4\") " pod="openstack/cinder-scheduler-0" Mar 10 07:06:14 crc kubenswrapper[4825]: I0310 07:06:14.678967 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca4ebafb-aa22-44ad-8037-487a9c3baca4-scripts\") pod \"cinder-scheduler-0\" (UID: \"ca4ebafb-aa22-44ad-8037-487a9c3baca4\") " pod="openstack/cinder-scheduler-0" Mar 10 07:06:14 crc kubenswrapper[4825]: I0310 07:06:14.678965 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ca4ebafb-aa22-44ad-8037-487a9c3baca4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ca4ebafb-aa22-44ad-8037-487a9c3baca4\") " pod="openstack/cinder-scheduler-0" Mar 10 07:06:14 crc kubenswrapper[4825]: I0310 07:06:14.679021 4825 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca4ebafb-aa22-44ad-8037-487a9c3baca4-config-data\") pod \"cinder-scheduler-0\" (UID: \"ca4ebafb-aa22-44ad-8037-487a9c3baca4\") " pod="openstack/cinder-scheduler-0" Mar 10 07:06:14 crc kubenswrapper[4825]: I0310 07:06:14.693037 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca4ebafb-aa22-44ad-8037-487a9c3baca4-config-data\") pod \"cinder-scheduler-0\" (UID: \"ca4ebafb-aa22-44ad-8037-487a9c3baca4\") " pod="openstack/cinder-scheduler-0" Mar 10 07:06:14 crc kubenswrapper[4825]: I0310 07:06:14.696773 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca4ebafb-aa22-44ad-8037-487a9c3baca4-scripts\") pod \"cinder-scheduler-0\" (UID: \"ca4ebafb-aa22-44ad-8037-487a9c3baca4\") " pod="openstack/cinder-scheduler-0" Mar 10 07:06:14 crc kubenswrapper[4825]: I0310 07:06:14.696801 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca4ebafb-aa22-44ad-8037-487a9c3baca4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ca4ebafb-aa22-44ad-8037-487a9c3baca4\") " pod="openstack/cinder-scheduler-0" Mar 10 07:06:14 crc kubenswrapper[4825]: I0310 07:06:14.696956 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca4ebafb-aa22-44ad-8037-487a9c3baca4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ca4ebafb-aa22-44ad-8037-487a9c3baca4\") " pod="openstack/cinder-scheduler-0" Mar 10 07:06:14 crc kubenswrapper[4825]: I0310 07:06:14.702761 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj4g7\" (UniqueName: \"kubernetes.io/projected/ca4ebafb-aa22-44ad-8037-487a9c3baca4-kube-api-access-cj4g7\") pod 
\"cinder-scheduler-0\" (UID: \"ca4ebafb-aa22-44ad-8037-487a9c3baca4\") " pod="openstack/cinder-scheduler-0" Mar 10 07:06:14 crc kubenswrapper[4825]: I0310 07:06:14.792690 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 07:06:15 crc kubenswrapper[4825]: I0310 07:06:15.249234 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0c5ff0f-ee41-4f20-b87d-4db0855c9161" path="/var/lib/kubelet/pods/e0c5ff0f-ee41-4f20-b87d-4db0855c9161/volumes" Mar 10 07:06:15 crc kubenswrapper[4825]: W0310 07:06:15.290262 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca4ebafb_aa22_44ad_8037_487a9c3baca4.slice/crio-a253bdb397bc3dc3551756150243b4f3fe2f467d36dcaa92266ddfafc085214c WatchSource:0}: Error finding container a253bdb397bc3dc3551756150243b4f3fe2f467d36dcaa92266ddfafc085214c: Status 404 returned error can't find the container with id a253bdb397bc3dc3551756150243b4f3fe2f467d36dcaa92266ddfafc085214c Mar 10 07:06:15 crc kubenswrapper[4825]: I0310 07:06:15.293730 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 07:06:15 crc kubenswrapper[4825]: I0310 07:06:15.335886 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 10 07:06:15 crc kubenswrapper[4825]: I0310 07:06:15.337115 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 10 07:06:15 crc kubenswrapper[4825]: I0310 07:06:15.343270 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 10 07:06:15 crc kubenswrapper[4825]: I0310 07:06:15.343322 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 10 07:06:15 crc kubenswrapper[4825]: I0310 07:06:15.343548 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-qqqq9" Mar 10 07:06:15 crc kubenswrapper[4825]: I0310 07:06:15.365193 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 10 07:06:15 crc kubenswrapper[4825]: I0310 07:06:15.393659 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjrvq\" (UniqueName: \"kubernetes.io/projected/59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11-kube-api-access-rjrvq\") pod \"openstackclient\" (UID: \"59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11\") " pod="openstack/openstackclient" Mar 10 07:06:15 crc kubenswrapper[4825]: I0310 07:06:15.393775 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11-openstack-config\") pod \"openstackclient\" (UID: \"59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11\") " pod="openstack/openstackclient" Mar 10 07:06:15 crc kubenswrapper[4825]: I0310 07:06:15.393880 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11-combined-ca-bundle\") pod \"openstackclient\" (UID: \"59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11\") " pod="openstack/openstackclient" Mar 10 07:06:15 crc kubenswrapper[4825]: I0310 07:06:15.393909 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11-openstack-config-secret\") pod \"openstackclient\" (UID: \"59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11\") " pod="openstack/openstackclient" Mar 10 07:06:15 crc kubenswrapper[4825]: I0310 07:06:15.396608 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ca4ebafb-aa22-44ad-8037-487a9c3baca4","Type":"ContainerStarted","Data":"a253bdb397bc3dc3551756150243b4f3fe2f467d36dcaa92266ddfafc085214c"} Mar 10 07:06:15 crc kubenswrapper[4825]: I0310 07:06:15.495573 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11-combined-ca-bundle\") pod \"openstackclient\" (UID: \"59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11\") " pod="openstack/openstackclient" Mar 10 07:06:15 crc kubenswrapper[4825]: I0310 07:06:15.495629 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11-openstack-config-secret\") pod \"openstackclient\" (UID: \"59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11\") " pod="openstack/openstackclient" Mar 10 07:06:15 crc kubenswrapper[4825]: I0310 07:06:15.495658 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjrvq\" (UniqueName: \"kubernetes.io/projected/59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11-kube-api-access-rjrvq\") pod \"openstackclient\" (UID: \"59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11\") " pod="openstack/openstackclient" Mar 10 07:06:15 crc kubenswrapper[4825]: I0310 07:06:15.495717 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11-openstack-config\") pod 
\"openstackclient\" (UID: \"59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11\") " pod="openstack/openstackclient" Mar 10 07:06:15 crc kubenswrapper[4825]: I0310 07:06:15.496764 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11-openstack-config\") pod \"openstackclient\" (UID: \"59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11\") " pod="openstack/openstackclient" Mar 10 07:06:15 crc kubenswrapper[4825]: I0310 07:06:15.500765 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11-combined-ca-bundle\") pod \"openstackclient\" (UID: \"59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11\") " pod="openstack/openstackclient" Mar 10 07:06:15 crc kubenswrapper[4825]: I0310 07:06:15.500864 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11-openstack-config-secret\") pod \"openstackclient\" (UID: \"59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11\") " pod="openstack/openstackclient" Mar 10 07:06:15 crc kubenswrapper[4825]: I0310 07:06:15.512699 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjrvq\" (UniqueName: \"kubernetes.io/projected/59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11-kube-api-access-rjrvq\") pod \"openstackclient\" (UID: \"59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11\") " pod="openstack/openstackclient" Mar 10 07:06:15 crc kubenswrapper[4825]: I0310 07:06:15.718387 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 10 07:06:16 crc kubenswrapper[4825]: I0310 07:06:16.248175 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 10 07:06:16 crc kubenswrapper[4825]: W0310 07:06:16.257029 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59e7664c_cb9b_4ef1_a7ab_8f7c1b130b11.slice/crio-f5f802a398cb06c328ff0c8ff5b4747be326487add1a88492e5ce026e300e805 WatchSource:0}: Error finding container f5f802a398cb06c328ff0c8ff5b4747be326487add1a88492e5ce026e300e805: Status 404 returned error can't find the container with id f5f802a398cb06c328ff0c8ff5b4747be326487add1a88492e5ce026e300e805 Mar 10 07:06:16 crc kubenswrapper[4825]: I0310 07:06:16.429925 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ca4ebafb-aa22-44ad-8037-487a9c3baca4","Type":"ContainerStarted","Data":"2afb8783418624fe53a581d5b384b0eac94f03f7b397fe6d2851a79acaf2faeb"} Mar 10 07:06:16 crc kubenswrapper[4825]: I0310 07:06:16.443276 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11","Type":"ContainerStarted","Data":"f5f802a398cb06c328ff0c8ff5b4747be326487add1a88492e5ce026e300e805"} Mar 10 07:06:16 crc kubenswrapper[4825]: I0310 07:06:16.698688 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 10 07:06:16 crc kubenswrapper[4825]: I0310 07:06:16.888398 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 07:06:16 crc kubenswrapper[4825]: I0310 07:06:16.888913 4825 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 07:06:17 crc kubenswrapper[4825]: I0310 07:06:17.456746 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ca4ebafb-aa22-44ad-8037-487a9c3baca4","Type":"ContainerStarted","Data":"69a4b33cb0df81f2796dea2e3671cf92e7c2fcaf46f6acf19185262e97be0696"} Mar 10 07:06:17 crc kubenswrapper[4825]: I0310 07:06:17.485945 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.485910507 podStartE2EDuration="3.485910507s" podCreationTimestamp="2026-03-10 07:06:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:06:17.479555771 +0000 UTC m=+1330.509336416" watchObservedRunningTime="2026-03-10 07:06:17.485910507 +0000 UTC m=+1330.515691122" Mar 10 07:06:19 crc kubenswrapper[4825]: I0310 07:06:19.442106 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-bb7dcfc7-n79vf"] Mar 10 07:06:19 crc kubenswrapper[4825]: I0310 07:06:19.452840 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-bb7dcfc7-n79vf"] Mar 10 07:06:19 crc kubenswrapper[4825]: I0310 07:06:19.452979 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-bb7dcfc7-n79vf" Mar 10 07:06:19 crc kubenswrapper[4825]: I0310 07:06:19.455981 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 10 07:06:19 crc kubenswrapper[4825]: I0310 07:06:19.456305 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 10 07:06:19 crc kubenswrapper[4825]: I0310 07:06:19.461154 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 10 07:06:19 crc kubenswrapper[4825]: I0310 07:06:19.585260 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af39779a-3a65-4379-9e92-d69ab1610fc6-log-httpd\") pod \"swift-proxy-bb7dcfc7-n79vf\" (UID: \"af39779a-3a65-4379-9e92-d69ab1610fc6\") " pod="openstack/swift-proxy-bb7dcfc7-n79vf" Mar 10 07:06:19 crc kubenswrapper[4825]: I0310 07:06:19.585316 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af39779a-3a65-4379-9e92-d69ab1610fc6-config-data\") pod \"swift-proxy-bb7dcfc7-n79vf\" (UID: \"af39779a-3a65-4379-9e92-d69ab1610fc6\") " pod="openstack/swift-proxy-bb7dcfc7-n79vf" Mar 10 07:06:19 crc kubenswrapper[4825]: I0310 07:06:19.585349 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af39779a-3a65-4379-9e92-d69ab1610fc6-combined-ca-bundle\") pod \"swift-proxy-bb7dcfc7-n79vf\" (UID: \"af39779a-3a65-4379-9e92-d69ab1610fc6\") " pod="openstack/swift-proxy-bb7dcfc7-n79vf" Mar 10 07:06:19 crc kubenswrapper[4825]: I0310 07:06:19.585388 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/af39779a-3a65-4379-9e92-d69ab1610fc6-public-tls-certs\") pod \"swift-proxy-bb7dcfc7-n79vf\" (UID: \"af39779a-3a65-4379-9e92-d69ab1610fc6\") " pod="openstack/swift-proxy-bb7dcfc7-n79vf" Mar 10 07:06:19 crc kubenswrapper[4825]: I0310 07:06:19.585415 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af39779a-3a65-4379-9e92-d69ab1610fc6-internal-tls-certs\") pod \"swift-proxy-bb7dcfc7-n79vf\" (UID: \"af39779a-3a65-4379-9e92-d69ab1610fc6\") " pod="openstack/swift-proxy-bb7dcfc7-n79vf" Mar 10 07:06:19 crc kubenswrapper[4825]: I0310 07:06:19.585449 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/af39779a-3a65-4379-9e92-d69ab1610fc6-etc-swift\") pod \"swift-proxy-bb7dcfc7-n79vf\" (UID: \"af39779a-3a65-4379-9e92-d69ab1610fc6\") " pod="openstack/swift-proxy-bb7dcfc7-n79vf" Mar 10 07:06:19 crc kubenswrapper[4825]: I0310 07:06:19.585743 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af39779a-3a65-4379-9e92-d69ab1610fc6-run-httpd\") pod \"swift-proxy-bb7dcfc7-n79vf\" (UID: \"af39779a-3a65-4379-9e92-d69ab1610fc6\") " pod="openstack/swift-proxy-bb7dcfc7-n79vf" Mar 10 07:06:19 crc kubenswrapper[4825]: I0310 07:06:19.586089 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcznn\" (UniqueName: \"kubernetes.io/projected/af39779a-3a65-4379-9e92-d69ab1610fc6-kube-api-access-lcznn\") pod \"swift-proxy-bb7dcfc7-n79vf\" (UID: \"af39779a-3a65-4379-9e92-d69ab1610fc6\") " pod="openstack/swift-proxy-bb7dcfc7-n79vf" Mar 10 07:06:19 crc kubenswrapper[4825]: I0310 07:06:19.689398 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcznn\" 
(UniqueName: \"kubernetes.io/projected/af39779a-3a65-4379-9e92-d69ab1610fc6-kube-api-access-lcznn\") pod \"swift-proxy-bb7dcfc7-n79vf\" (UID: \"af39779a-3a65-4379-9e92-d69ab1610fc6\") " pod="openstack/swift-proxy-bb7dcfc7-n79vf" Mar 10 07:06:19 crc kubenswrapper[4825]: I0310 07:06:19.689515 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af39779a-3a65-4379-9e92-d69ab1610fc6-log-httpd\") pod \"swift-proxy-bb7dcfc7-n79vf\" (UID: \"af39779a-3a65-4379-9e92-d69ab1610fc6\") " pod="openstack/swift-proxy-bb7dcfc7-n79vf" Mar 10 07:06:19 crc kubenswrapper[4825]: I0310 07:06:19.689845 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af39779a-3a65-4379-9e92-d69ab1610fc6-config-data\") pod \"swift-proxy-bb7dcfc7-n79vf\" (UID: \"af39779a-3a65-4379-9e92-d69ab1610fc6\") " pod="openstack/swift-proxy-bb7dcfc7-n79vf" Mar 10 07:06:19 crc kubenswrapper[4825]: I0310 07:06:19.689918 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af39779a-3a65-4379-9e92-d69ab1610fc6-combined-ca-bundle\") pod \"swift-proxy-bb7dcfc7-n79vf\" (UID: \"af39779a-3a65-4379-9e92-d69ab1610fc6\") " pod="openstack/swift-proxy-bb7dcfc7-n79vf" Mar 10 07:06:19 crc kubenswrapper[4825]: I0310 07:06:19.691928 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af39779a-3a65-4379-9e92-d69ab1610fc6-public-tls-certs\") pod \"swift-proxy-bb7dcfc7-n79vf\" (UID: \"af39779a-3a65-4379-9e92-d69ab1610fc6\") " pod="openstack/swift-proxy-bb7dcfc7-n79vf" Mar 10 07:06:19 crc kubenswrapper[4825]: I0310 07:06:19.690463 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af39779a-3a65-4379-9e92-d69ab1610fc6-log-httpd\") pod 
\"swift-proxy-bb7dcfc7-n79vf\" (UID: \"af39779a-3a65-4379-9e92-d69ab1610fc6\") " pod="openstack/swift-proxy-bb7dcfc7-n79vf" Mar 10 07:06:19 crc kubenswrapper[4825]: I0310 07:06:19.691998 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af39779a-3a65-4379-9e92-d69ab1610fc6-internal-tls-certs\") pod \"swift-proxy-bb7dcfc7-n79vf\" (UID: \"af39779a-3a65-4379-9e92-d69ab1610fc6\") " pod="openstack/swift-proxy-bb7dcfc7-n79vf" Mar 10 07:06:19 crc kubenswrapper[4825]: I0310 07:06:19.692091 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/af39779a-3a65-4379-9e92-d69ab1610fc6-etc-swift\") pod \"swift-proxy-bb7dcfc7-n79vf\" (UID: \"af39779a-3a65-4379-9e92-d69ab1610fc6\") " pod="openstack/swift-proxy-bb7dcfc7-n79vf" Mar 10 07:06:19 crc kubenswrapper[4825]: I0310 07:06:19.692249 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af39779a-3a65-4379-9e92-d69ab1610fc6-run-httpd\") pod \"swift-proxy-bb7dcfc7-n79vf\" (UID: \"af39779a-3a65-4379-9e92-d69ab1610fc6\") " pod="openstack/swift-proxy-bb7dcfc7-n79vf" Mar 10 07:06:19 crc kubenswrapper[4825]: I0310 07:06:19.692777 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af39779a-3a65-4379-9e92-d69ab1610fc6-run-httpd\") pod \"swift-proxy-bb7dcfc7-n79vf\" (UID: \"af39779a-3a65-4379-9e92-d69ab1610fc6\") " pod="openstack/swift-proxy-bb7dcfc7-n79vf" Mar 10 07:06:19 crc kubenswrapper[4825]: I0310 07:06:19.701305 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af39779a-3a65-4379-9e92-d69ab1610fc6-internal-tls-certs\") pod \"swift-proxy-bb7dcfc7-n79vf\" (UID: \"af39779a-3a65-4379-9e92-d69ab1610fc6\") " 
pod="openstack/swift-proxy-bb7dcfc7-n79vf" Mar 10 07:06:19 crc kubenswrapper[4825]: I0310 07:06:19.701307 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af39779a-3a65-4379-9e92-d69ab1610fc6-combined-ca-bundle\") pod \"swift-proxy-bb7dcfc7-n79vf\" (UID: \"af39779a-3a65-4379-9e92-d69ab1610fc6\") " pod="openstack/swift-proxy-bb7dcfc7-n79vf" Mar 10 07:06:19 crc kubenswrapper[4825]: I0310 07:06:19.704897 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af39779a-3a65-4379-9e92-d69ab1610fc6-config-data\") pod \"swift-proxy-bb7dcfc7-n79vf\" (UID: \"af39779a-3a65-4379-9e92-d69ab1610fc6\") " pod="openstack/swift-proxy-bb7dcfc7-n79vf" Mar 10 07:06:19 crc kubenswrapper[4825]: I0310 07:06:19.706034 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af39779a-3a65-4379-9e92-d69ab1610fc6-public-tls-certs\") pod \"swift-proxy-bb7dcfc7-n79vf\" (UID: \"af39779a-3a65-4379-9e92-d69ab1610fc6\") " pod="openstack/swift-proxy-bb7dcfc7-n79vf" Mar 10 07:06:19 crc kubenswrapper[4825]: I0310 07:06:19.706547 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/af39779a-3a65-4379-9e92-d69ab1610fc6-etc-swift\") pod \"swift-proxy-bb7dcfc7-n79vf\" (UID: \"af39779a-3a65-4379-9e92-d69ab1610fc6\") " pod="openstack/swift-proxy-bb7dcfc7-n79vf" Mar 10 07:06:19 crc kubenswrapper[4825]: I0310 07:06:19.708113 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcznn\" (UniqueName: \"kubernetes.io/projected/af39779a-3a65-4379-9e92-d69ab1610fc6-kube-api-access-lcznn\") pod \"swift-proxy-bb7dcfc7-n79vf\" (UID: \"af39779a-3a65-4379-9e92-d69ab1610fc6\") " pod="openstack/swift-proxy-bb7dcfc7-n79vf" Mar 10 07:06:19 crc kubenswrapper[4825]: I0310 07:06:19.784621 4825 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-bb7dcfc7-n79vf" Mar 10 07:06:19 crc kubenswrapper[4825]: I0310 07:06:19.793516 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 10 07:06:20 crc kubenswrapper[4825]: I0310 07:06:20.341992 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-bb7dcfc7-n79vf"] Mar 10 07:06:21 crc kubenswrapper[4825]: I0310 07:06:21.302559 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:06:21 crc kubenswrapper[4825]: I0310 07:06:21.302853 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b0a61399-4007-4b70-9067-7f6d04ef56af" containerName="ceilometer-central-agent" containerID="cri-o://e3f046df83dd1dc0fe9210b20a9e0d04085a4819187f404caf1145c464c250e1" gracePeriod=30 Mar 10 07:06:21 crc kubenswrapper[4825]: I0310 07:06:21.302941 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b0a61399-4007-4b70-9067-7f6d04ef56af" containerName="proxy-httpd" containerID="cri-o://2d8e1b42dfd789d8be822497314dd1e42aa7804644acb3d68f8345dcc655110f" gracePeriod=30 Mar 10 07:06:21 crc kubenswrapper[4825]: I0310 07:06:21.303027 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b0a61399-4007-4b70-9067-7f6d04ef56af" containerName="ceilometer-notification-agent" containerID="cri-o://6cf51432c3e3d819a5b3762341501279a4238698130e1747021831154fb129a3" gracePeriod=30 Mar 10 07:06:21 crc kubenswrapper[4825]: I0310 07:06:21.303192 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b0a61399-4007-4b70-9067-7f6d04ef56af" containerName="sg-core" containerID="cri-o://ff228c3d6867b3d2585c3193264c033f65a5967d49dba70fdc735472e9d6d59c" gracePeriod=30 
Mar 10 07:06:21 crc kubenswrapper[4825]: I0310 07:06:21.322318 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="b0a61399-4007-4b70-9067-7f6d04ef56af" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Mar 10 07:06:21 crc kubenswrapper[4825]: I0310 07:06:21.521636 4825 generic.go:334] "Generic (PLEG): container finished" podID="b0a61399-4007-4b70-9067-7f6d04ef56af" containerID="2d8e1b42dfd789d8be822497314dd1e42aa7804644acb3d68f8345dcc655110f" exitCode=0 Mar 10 07:06:21 crc kubenswrapper[4825]: I0310 07:06:21.521692 4825 generic.go:334] "Generic (PLEG): container finished" podID="b0a61399-4007-4b70-9067-7f6d04ef56af" containerID="ff228c3d6867b3d2585c3193264c033f65a5967d49dba70fdc735472e9d6d59c" exitCode=2 Mar 10 07:06:21 crc kubenswrapper[4825]: I0310 07:06:21.521719 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0a61399-4007-4b70-9067-7f6d04ef56af","Type":"ContainerDied","Data":"2d8e1b42dfd789d8be822497314dd1e42aa7804644acb3d68f8345dcc655110f"} Mar 10 07:06:21 crc kubenswrapper[4825]: I0310 07:06:21.521754 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0a61399-4007-4b70-9067-7f6d04ef56af","Type":"ContainerDied","Data":"ff228c3d6867b3d2585c3193264c033f65a5967d49dba70fdc735472e9d6d59c"} Mar 10 07:06:22 crc kubenswrapper[4825]: I0310 07:06:22.545487 4825 generic.go:334] "Generic (PLEG): container finished" podID="b0a61399-4007-4b70-9067-7f6d04ef56af" containerID="6cf51432c3e3d819a5b3762341501279a4238698130e1747021831154fb129a3" exitCode=0 Mar 10 07:06:22 crc kubenswrapper[4825]: I0310 07:06:22.545805 4825 generic.go:334] "Generic (PLEG): container finished" podID="b0a61399-4007-4b70-9067-7f6d04ef56af" containerID="e3f046df83dd1dc0fe9210b20a9e0d04085a4819187f404caf1145c464c250e1" exitCode=0 Mar 10 07:06:22 crc kubenswrapper[4825]: I0310 07:06:22.545552 4825 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0a61399-4007-4b70-9067-7f6d04ef56af","Type":"ContainerDied","Data":"6cf51432c3e3d819a5b3762341501279a4238698130e1747021831154fb129a3"} Mar 10 07:06:22 crc kubenswrapper[4825]: I0310 07:06:22.545858 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0a61399-4007-4b70-9067-7f6d04ef56af","Type":"ContainerDied","Data":"e3f046df83dd1dc0fe9210b20a9e0d04085a4819187f404caf1145c464c250e1"} Mar 10 07:06:25 crc kubenswrapper[4825]: I0310 07:06:25.004561 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 10 07:06:26 crc kubenswrapper[4825]: W0310 07:06:26.633205 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf39779a_3a65_4379_9e92_d69ab1610fc6.slice/crio-032212a00cfecef0137bdd8ffd7c86108ce60785d0d5c2e541cc86c0f1a910fe WatchSource:0}: Error finding container 032212a00cfecef0137bdd8ffd7c86108ce60785d0d5c2e541cc86c0f1a910fe: Status 404 returned error can't find the container with id 032212a00cfecef0137bdd8ffd7c86108ce60785d0d5c2e541cc86c0f1a910fe Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.018705 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.157894 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0a61399-4007-4b70-9067-7f6d04ef56af-run-httpd\") pod \"b0a61399-4007-4b70-9067-7f6d04ef56af\" (UID: \"b0a61399-4007-4b70-9067-7f6d04ef56af\") " Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.157955 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0a61399-4007-4b70-9067-7f6d04ef56af-log-httpd\") pod \"b0a61399-4007-4b70-9067-7f6d04ef56af\" (UID: \"b0a61399-4007-4b70-9067-7f6d04ef56af\") " Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.158054 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b0a61399-4007-4b70-9067-7f6d04ef56af-sg-core-conf-yaml\") pod \"b0a61399-4007-4b70-9067-7f6d04ef56af\" (UID: \"b0a61399-4007-4b70-9067-7f6d04ef56af\") " Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.158118 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0a61399-4007-4b70-9067-7f6d04ef56af-config-data\") pod \"b0a61399-4007-4b70-9067-7f6d04ef56af\" (UID: \"b0a61399-4007-4b70-9067-7f6d04ef56af\") " Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.158166 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0a61399-4007-4b70-9067-7f6d04ef56af-scripts\") pod \"b0a61399-4007-4b70-9067-7f6d04ef56af\" (UID: \"b0a61399-4007-4b70-9067-7f6d04ef56af\") " Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.158194 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6zpp\" (UniqueName: 
\"kubernetes.io/projected/b0a61399-4007-4b70-9067-7f6d04ef56af-kube-api-access-c6zpp\") pod \"b0a61399-4007-4b70-9067-7f6d04ef56af\" (UID: \"b0a61399-4007-4b70-9067-7f6d04ef56af\") " Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.158274 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0a61399-4007-4b70-9067-7f6d04ef56af-combined-ca-bundle\") pod \"b0a61399-4007-4b70-9067-7f6d04ef56af\" (UID: \"b0a61399-4007-4b70-9067-7f6d04ef56af\") " Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.158440 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0a61399-4007-4b70-9067-7f6d04ef56af-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b0a61399-4007-4b70-9067-7f6d04ef56af" (UID: "b0a61399-4007-4b70-9067-7f6d04ef56af"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.159072 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0a61399-4007-4b70-9067-7f6d04ef56af-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b0a61399-4007-4b70-9067-7f6d04ef56af" (UID: "b0a61399-4007-4b70-9067-7f6d04ef56af"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.159531 4825 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0a61399-4007-4b70-9067-7f6d04ef56af-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.159553 4825 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0a61399-4007-4b70-9067-7f6d04ef56af-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.163664 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0a61399-4007-4b70-9067-7f6d04ef56af-kube-api-access-c6zpp" (OuterVolumeSpecName: "kube-api-access-c6zpp") pod "b0a61399-4007-4b70-9067-7f6d04ef56af" (UID: "b0a61399-4007-4b70-9067-7f6d04ef56af"). InnerVolumeSpecName "kube-api-access-c6zpp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.168454 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0a61399-4007-4b70-9067-7f6d04ef56af-scripts" (OuterVolumeSpecName: "scripts") pod "b0a61399-4007-4b70-9067-7f6d04ef56af" (UID: "b0a61399-4007-4b70-9067-7f6d04ef56af"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.199189 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0a61399-4007-4b70-9067-7f6d04ef56af-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b0a61399-4007-4b70-9067-7f6d04ef56af" (UID: "b0a61399-4007-4b70-9067-7f6d04ef56af"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.253481 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0a61399-4007-4b70-9067-7f6d04ef56af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0a61399-4007-4b70-9067-7f6d04ef56af" (UID: "b0a61399-4007-4b70-9067-7f6d04ef56af"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.261894 4825 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b0a61399-4007-4b70-9067-7f6d04ef56af-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.262190 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0a61399-4007-4b70-9067-7f6d04ef56af-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.262280 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6zpp\" (UniqueName: \"kubernetes.io/projected/b0a61399-4007-4b70-9067-7f6d04ef56af-kube-api-access-c6zpp\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.262348 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0a61399-4007-4b70-9067-7f6d04ef56af-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.282230 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0a61399-4007-4b70-9067-7f6d04ef56af-config-data" (OuterVolumeSpecName: "config-data") pod "b0a61399-4007-4b70-9067-7f6d04ef56af" (UID: "b0a61399-4007-4b70-9067-7f6d04ef56af"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.364284 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0a61399-4007-4b70-9067-7f6d04ef56af-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.600933 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0a61399-4007-4b70-9067-7f6d04ef56af","Type":"ContainerDied","Data":"b63a87ac7ab11c44ec9bd287fcff71aa35533525cd5dab4ed72b524177815804"} Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.601029 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.601105 4825 scope.go:117] "RemoveContainer" containerID="2d8e1b42dfd789d8be822497314dd1e42aa7804644acb3d68f8345dcc655110f" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.605265 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-bb7dcfc7-n79vf" event={"ID":"af39779a-3a65-4379-9e92-d69ab1610fc6","Type":"ContainerStarted","Data":"8f8c2fc2f611cbc298369095f2d9421d72e169c12c731c66dc4b7e2b27276c19"} Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.605330 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-bb7dcfc7-n79vf" event={"ID":"af39779a-3a65-4379-9e92-d69ab1610fc6","Type":"ContainerStarted","Data":"2dd5d463672a820d3d3d4f47fbb2be409cc30d40e5202b4d480569146558f8cd"} Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.605346 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-bb7dcfc7-n79vf" event={"ID":"af39779a-3a65-4379-9e92-d69ab1610fc6","Type":"ContainerStarted","Data":"032212a00cfecef0137bdd8ffd7c86108ce60785d0d5c2e541cc86c0f1a910fe"} Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.605408 4825 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/swift-proxy-bb7dcfc7-n79vf" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.605499 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-bb7dcfc7-n79vf" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.607211 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11","Type":"ContainerStarted","Data":"eae523cad9b94daad0aa32c8a855fc6def30624e340d1c962919717b99480d5e"} Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.629886 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.158996805 podStartE2EDuration="12.629865223s" podCreationTimestamp="2026-03-10 07:06:15 +0000 UTC" firstStartedPulling="2026-03-10 07:06:16.260233201 +0000 UTC m=+1329.290013826" lastFinishedPulling="2026-03-10 07:06:26.731101639 +0000 UTC m=+1339.760882244" observedRunningTime="2026-03-10 07:06:27.625744056 +0000 UTC m=+1340.655524681" watchObservedRunningTime="2026-03-10 07:06:27.629865223 +0000 UTC m=+1340.659645838" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.642331 4825 scope.go:117] "RemoveContainer" containerID="ff228c3d6867b3d2585c3193264c033f65a5967d49dba70fdc735472e9d6d59c" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.668277 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-bb7dcfc7-n79vf" podStartSLOduration=8.668258994 podStartE2EDuration="8.668258994s" podCreationTimestamp="2026-03-10 07:06:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:06:27.653343685 +0000 UTC m=+1340.683124320" watchObservedRunningTime="2026-03-10 07:06:27.668258994 +0000 UTC m=+1340.698039599" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.712321 4825 scope.go:117] 
"RemoveContainer" containerID="6cf51432c3e3d819a5b3762341501279a4238698130e1747021831154fb129a3" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.717482 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.729017 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.733105 4825 scope.go:117] "RemoveContainer" containerID="e3f046df83dd1dc0fe9210b20a9e0d04085a4819187f404caf1145c464c250e1" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.757179 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:06:27 crc kubenswrapper[4825]: E0310 07:06:27.762236 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0a61399-4007-4b70-9067-7f6d04ef56af" containerName="ceilometer-notification-agent" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.762281 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0a61399-4007-4b70-9067-7f6d04ef56af" containerName="ceilometer-notification-agent" Mar 10 07:06:27 crc kubenswrapper[4825]: E0310 07:06:27.762318 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0a61399-4007-4b70-9067-7f6d04ef56af" containerName="proxy-httpd" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.762325 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0a61399-4007-4b70-9067-7f6d04ef56af" containerName="proxy-httpd" Mar 10 07:06:27 crc kubenswrapper[4825]: E0310 07:06:27.762339 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0a61399-4007-4b70-9067-7f6d04ef56af" containerName="ceilometer-central-agent" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.762345 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0a61399-4007-4b70-9067-7f6d04ef56af" containerName="ceilometer-central-agent" Mar 10 07:06:27 crc kubenswrapper[4825]: E0310 
07:06:27.762371 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0a61399-4007-4b70-9067-7f6d04ef56af" containerName="sg-core" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.762377 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0a61399-4007-4b70-9067-7f6d04ef56af" containerName="sg-core" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.763058 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0a61399-4007-4b70-9067-7f6d04ef56af" containerName="proxy-httpd" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.763080 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0a61399-4007-4b70-9067-7f6d04ef56af" containerName="ceilometer-central-agent" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.763095 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0a61399-4007-4b70-9067-7f6d04ef56af" containerName="sg-core" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.763111 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0a61399-4007-4b70-9067-7f6d04ef56af" containerName="ceilometer-notification-agent" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.772106 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.780504 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.781300 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.796428 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.878077 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfzm6\" (UniqueName: \"kubernetes.io/projected/a3c1d516-a0fa-48aa-85ff-eaf7b14499f1-kube-api-access-mfzm6\") pod \"ceilometer-0\" (UID: \"a3c1d516-a0fa-48aa-85ff-eaf7b14499f1\") " pod="openstack/ceilometer-0" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.878172 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3c1d516-a0fa-48aa-85ff-eaf7b14499f1-run-httpd\") pod \"ceilometer-0\" (UID: \"a3c1d516-a0fa-48aa-85ff-eaf7b14499f1\") " pod="openstack/ceilometer-0" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.878202 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3c1d516-a0fa-48aa-85ff-eaf7b14499f1-config-data\") pod \"ceilometer-0\" (UID: \"a3c1d516-a0fa-48aa-85ff-eaf7b14499f1\") " pod="openstack/ceilometer-0" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.878232 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a3c1d516-a0fa-48aa-85ff-eaf7b14499f1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"a3c1d516-a0fa-48aa-85ff-eaf7b14499f1\") " pod="openstack/ceilometer-0" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.878253 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3c1d516-a0fa-48aa-85ff-eaf7b14499f1-scripts\") pod \"ceilometer-0\" (UID: \"a3c1d516-a0fa-48aa-85ff-eaf7b14499f1\") " pod="openstack/ceilometer-0" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.878302 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3c1d516-a0fa-48aa-85ff-eaf7b14499f1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a3c1d516-a0fa-48aa-85ff-eaf7b14499f1\") " pod="openstack/ceilometer-0" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.878414 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3c1d516-a0fa-48aa-85ff-eaf7b14499f1-log-httpd\") pod \"ceilometer-0\" (UID: \"a3c1d516-a0fa-48aa-85ff-eaf7b14499f1\") " pod="openstack/ceilometer-0" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.980163 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a3c1d516-a0fa-48aa-85ff-eaf7b14499f1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a3c1d516-a0fa-48aa-85ff-eaf7b14499f1\") " pod="openstack/ceilometer-0" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.980219 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3c1d516-a0fa-48aa-85ff-eaf7b14499f1-scripts\") pod \"ceilometer-0\" (UID: \"a3c1d516-a0fa-48aa-85ff-eaf7b14499f1\") " pod="openstack/ceilometer-0" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.980276 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3c1d516-a0fa-48aa-85ff-eaf7b14499f1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a3c1d516-a0fa-48aa-85ff-eaf7b14499f1\") " pod="openstack/ceilometer-0" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.980369 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3c1d516-a0fa-48aa-85ff-eaf7b14499f1-log-httpd\") pod \"ceilometer-0\" (UID: \"a3c1d516-a0fa-48aa-85ff-eaf7b14499f1\") " pod="openstack/ceilometer-0" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.980461 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfzm6\" (UniqueName: \"kubernetes.io/projected/a3c1d516-a0fa-48aa-85ff-eaf7b14499f1-kube-api-access-mfzm6\") pod \"ceilometer-0\" (UID: \"a3c1d516-a0fa-48aa-85ff-eaf7b14499f1\") " pod="openstack/ceilometer-0" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.980495 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3c1d516-a0fa-48aa-85ff-eaf7b14499f1-run-httpd\") pod \"ceilometer-0\" (UID: \"a3c1d516-a0fa-48aa-85ff-eaf7b14499f1\") " pod="openstack/ceilometer-0" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.980518 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3c1d516-a0fa-48aa-85ff-eaf7b14499f1-config-data\") pod \"ceilometer-0\" (UID: \"a3c1d516-a0fa-48aa-85ff-eaf7b14499f1\") " pod="openstack/ceilometer-0" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.981104 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3c1d516-a0fa-48aa-85ff-eaf7b14499f1-log-httpd\") pod \"ceilometer-0\" (UID: \"a3c1d516-a0fa-48aa-85ff-eaf7b14499f1\") " 
pod="openstack/ceilometer-0" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.981155 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3c1d516-a0fa-48aa-85ff-eaf7b14499f1-run-httpd\") pod \"ceilometer-0\" (UID: \"a3c1d516-a0fa-48aa-85ff-eaf7b14499f1\") " pod="openstack/ceilometer-0" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.987072 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3c1d516-a0fa-48aa-85ff-eaf7b14499f1-config-data\") pod \"ceilometer-0\" (UID: \"a3c1d516-a0fa-48aa-85ff-eaf7b14499f1\") " pod="openstack/ceilometer-0" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.993745 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a3c1d516-a0fa-48aa-85ff-eaf7b14499f1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a3c1d516-a0fa-48aa-85ff-eaf7b14499f1\") " pod="openstack/ceilometer-0" Mar 10 07:06:27 crc kubenswrapper[4825]: I0310 07:06:27.996883 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3c1d516-a0fa-48aa-85ff-eaf7b14499f1-scripts\") pod \"ceilometer-0\" (UID: \"a3c1d516-a0fa-48aa-85ff-eaf7b14499f1\") " pod="openstack/ceilometer-0" Mar 10 07:06:28 crc kubenswrapper[4825]: I0310 07:06:28.003108 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3c1d516-a0fa-48aa-85ff-eaf7b14499f1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a3c1d516-a0fa-48aa-85ff-eaf7b14499f1\") " pod="openstack/ceilometer-0" Mar 10 07:06:28 crc kubenswrapper[4825]: I0310 07:06:28.005757 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 07:06:28 crc kubenswrapper[4825]: I0310 07:06:28.006173 4825 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="3011e2a4-8b30-4770-9008-ba269097abfd" containerName="glance-log" containerID="cri-o://8d2883cad525f02553d47fdbbfc805d6ed15f1d5770b9ca52425168368349330" gracePeriod=30 Mar 10 07:06:28 crc kubenswrapper[4825]: I0310 07:06:28.007269 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="3011e2a4-8b30-4770-9008-ba269097abfd" containerName="glance-httpd" containerID="cri-o://fb587c895e312cd855a14b0eceb839242e93bb126433f037ed201fac8c44bb35" gracePeriod=30 Mar 10 07:06:28 crc kubenswrapper[4825]: I0310 07:06:28.009123 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfzm6\" (UniqueName: \"kubernetes.io/projected/a3c1d516-a0fa-48aa-85ff-eaf7b14499f1-kube-api-access-mfzm6\") pod \"ceilometer-0\" (UID: \"a3c1d516-a0fa-48aa-85ff-eaf7b14499f1\") " pod="openstack/ceilometer-0" Mar 10 07:06:28 crc kubenswrapper[4825]: I0310 07:06:28.103151 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 07:06:28 crc kubenswrapper[4825]: I0310 07:06:28.616827 4825 generic.go:334] "Generic (PLEG): container finished" podID="3011e2a4-8b30-4770-9008-ba269097abfd" containerID="8d2883cad525f02553d47fdbbfc805d6ed15f1d5770b9ca52425168368349330" exitCode=143 Mar 10 07:06:28 crc kubenswrapper[4825]: I0310 07:06:28.616927 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3011e2a4-8b30-4770-9008-ba269097abfd","Type":"ContainerDied","Data":"8d2883cad525f02553d47fdbbfc805d6ed15f1d5770b9ca52425168368349330"} Mar 10 07:06:28 crc kubenswrapper[4825]: I0310 07:06:28.690931 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:06:28 crc kubenswrapper[4825]: W0310 07:06:28.703294 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3c1d516_a0fa_48aa_85ff_eaf7b14499f1.slice/crio-7b858dcd330bc0928600af08991657f9b3d192405704048521753178c3186489 WatchSource:0}: Error finding container 7b858dcd330bc0928600af08991657f9b3d192405704048521753178c3186489: Status 404 returned error can't find the container with id 7b858dcd330bc0928600af08991657f9b3d192405704048521753178c3186489 Mar 10 07:06:28 crc kubenswrapper[4825]: I0310 07:06:28.785519 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:06:28 crc kubenswrapper[4825]: I0310 07:06:28.919697 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 07:06:28 crc kubenswrapper[4825]: I0310 07:06:28.920003 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="d6f252a1-68f2-4f9a-ade7-0b979581b8c6" containerName="glance-log" containerID="cri-o://65812fc5fcac830b784d74cae3b58dafd700fcc4d4623a7b4f5a9fd5e827bda9" gracePeriod=30 Mar 10 
07:06:28 crc kubenswrapper[4825]: I0310 07:06:28.920164 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="d6f252a1-68f2-4f9a-ade7-0b979581b8c6" containerName="glance-httpd" containerID="cri-o://7ece94efb23a5c9ce51f2fc5f49bcfaf81a9ad61d4de7ab5ac8289cd5f6303ba" gracePeriod=30 Mar 10 07:06:29 crc kubenswrapper[4825]: I0310 07:06:29.251265 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0a61399-4007-4b70-9067-7f6d04ef56af" path="/var/lib/kubelet/pods/b0a61399-4007-4b70-9067-7f6d04ef56af/volumes" Mar 10 07:06:29 crc kubenswrapper[4825]: I0310 07:06:29.630634 4825 generic.go:334] "Generic (PLEG): container finished" podID="d6f252a1-68f2-4f9a-ade7-0b979581b8c6" containerID="65812fc5fcac830b784d74cae3b58dafd700fcc4d4623a7b4f5a9fd5e827bda9" exitCode=143 Mar 10 07:06:29 crc kubenswrapper[4825]: I0310 07:06:29.630712 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d6f252a1-68f2-4f9a-ade7-0b979581b8c6","Type":"ContainerDied","Data":"65812fc5fcac830b784d74cae3b58dafd700fcc4d4623a7b4f5a9fd5e827bda9"} Mar 10 07:06:29 crc kubenswrapper[4825]: I0310 07:06:29.632862 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3c1d516-a0fa-48aa-85ff-eaf7b14499f1","Type":"ContainerStarted","Data":"d8757f7dcc14f6f88cccb8185df4916056abc0f519fa5e71f15ef87ce6d9c00a"} Mar 10 07:06:29 crc kubenswrapper[4825]: I0310 07:06:29.632933 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3c1d516-a0fa-48aa-85ff-eaf7b14499f1","Type":"ContainerStarted","Data":"7b858dcd330bc0928600af08991657f9b3d192405704048521753178c3186489"} Mar 10 07:06:30 crc kubenswrapper[4825]: I0310 07:06:30.695198 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"a3c1d516-a0fa-48aa-85ff-eaf7b14499f1","Type":"ContainerStarted","Data":"f39248a717e48f968e9a197d197c0c88a3ba6d6eaa5ac83f4e16d94e093946e7"} Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.121736 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-vv657"] Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.122840 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-vv657" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.133379 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-vv657"] Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.220590 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-5vfc4"] Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.226450 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-5vfc4" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.254555 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-5vfc4"] Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.290475 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cba79823-f7ea-4ac9-80e0-ce6e216af1bc-operator-scripts\") pod \"nova-api-db-create-vv657\" (UID: \"cba79823-f7ea-4ac9-80e0-ce6e216af1bc\") " pod="openstack/nova-api-db-create-vv657" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.290548 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldm5b\" (UniqueName: \"kubernetes.io/projected/cba79823-f7ea-4ac9-80e0-ce6e216af1bc-kube-api-access-ldm5b\") pod \"nova-api-db-create-vv657\" (UID: \"cba79823-f7ea-4ac9-80e0-ce6e216af1bc\") " pod="openstack/nova-api-db-create-vv657" Mar 10 07:06:31 crc 
kubenswrapper[4825]: I0310 07:06:31.322215 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-2a9b-account-create-update-zq6mk"] Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.330330 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2a9b-account-create-update-zq6mk" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.338628 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.344600 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-t5vjd"] Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.358345 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-t5vjd" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.359931 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-2a9b-account-create-update-zq6mk"] Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.368176 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-t5vjd"] Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.400818 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cba79823-f7ea-4ac9-80e0-ce6e216af1bc-operator-scripts\") pod \"nova-api-db-create-vv657\" (UID: \"cba79823-f7ea-4ac9-80e0-ce6e216af1bc\") " pod="openstack/nova-api-db-create-vv657" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.400904 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/256445a5-41a0-47b0-a149-3c31cb2f0959-operator-scripts\") pod \"nova-cell0-db-create-5vfc4\" (UID: \"256445a5-41a0-47b0-a149-3c31cb2f0959\") " pod="openstack/nova-cell0-db-create-5vfc4" Mar 10 
07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.400958 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntg4d\" (UniqueName: \"kubernetes.io/projected/256445a5-41a0-47b0-a149-3c31cb2f0959-kube-api-access-ntg4d\") pod \"nova-cell0-db-create-5vfc4\" (UID: \"256445a5-41a0-47b0-a149-3c31cb2f0959\") " pod="openstack/nova-cell0-db-create-5vfc4" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.401017 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldm5b\" (UniqueName: \"kubernetes.io/projected/cba79823-f7ea-4ac9-80e0-ce6e216af1bc-kube-api-access-ldm5b\") pod \"nova-api-db-create-vv657\" (UID: \"cba79823-f7ea-4ac9-80e0-ce6e216af1bc\") " pod="openstack/nova-api-db-create-vv657" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.402095 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cba79823-f7ea-4ac9-80e0-ce6e216af1bc-operator-scripts\") pod \"nova-api-db-create-vv657\" (UID: \"cba79823-f7ea-4ac9-80e0-ce6e216af1bc\") " pod="openstack/nova-api-db-create-vv657" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.424730 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldm5b\" (UniqueName: \"kubernetes.io/projected/cba79823-f7ea-4ac9-80e0-ce6e216af1bc-kube-api-access-ldm5b\") pod \"nova-api-db-create-vv657\" (UID: \"cba79823-f7ea-4ac9-80e0-ce6e216af1bc\") " pod="openstack/nova-api-db-create-vv657" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.440407 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-vv657" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.445841 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="3011e2a4-8b30-4770-9008-ba269097abfd" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.153:9292/healthcheck\": dial tcp 10.217.0.153:9292: connect: connection refused" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.446141 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="3011e2a4-8b30-4770-9008-ba269097abfd" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.153:9292/healthcheck\": dial tcp 10.217.0.153:9292: connect: connection refused" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.503184 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/256445a5-41a0-47b0-a149-3c31cb2f0959-operator-scripts\") pod \"nova-cell0-db-create-5vfc4\" (UID: \"256445a5-41a0-47b0-a149-3c31cb2f0959\") " pod="openstack/nova-cell0-db-create-5vfc4" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.503248 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntg4d\" (UniqueName: \"kubernetes.io/projected/256445a5-41a0-47b0-a149-3c31cb2f0959-kube-api-access-ntg4d\") pod \"nova-cell0-db-create-5vfc4\" (UID: \"256445a5-41a0-47b0-a149-3c31cb2f0959\") " pod="openstack/nova-cell0-db-create-5vfc4" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.503335 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drg67\" (UniqueName: \"kubernetes.io/projected/7df341ed-690d-4799-b9a4-18691d7de8d7-kube-api-access-drg67\") pod \"nova-cell1-db-create-t5vjd\" (UID: \"7df341ed-690d-4799-b9a4-18691d7de8d7\") " 
pod="openstack/nova-cell1-db-create-t5vjd" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.503423 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktrcg\" (UniqueName: \"kubernetes.io/projected/1e84f490-c283-41ba-92af-bca6b65b95cf-kube-api-access-ktrcg\") pod \"nova-api-2a9b-account-create-update-zq6mk\" (UID: \"1e84f490-c283-41ba-92af-bca6b65b95cf\") " pod="openstack/nova-api-2a9b-account-create-update-zq6mk" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.503459 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7df341ed-690d-4799-b9a4-18691d7de8d7-operator-scripts\") pod \"nova-cell1-db-create-t5vjd\" (UID: \"7df341ed-690d-4799-b9a4-18691d7de8d7\") " pod="openstack/nova-cell1-db-create-t5vjd" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.503522 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e84f490-c283-41ba-92af-bca6b65b95cf-operator-scripts\") pod \"nova-api-2a9b-account-create-update-zq6mk\" (UID: \"1e84f490-c283-41ba-92af-bca6b65b95cf\") " pod="openstack/nova-api-2a9b-account-create-update-zq6mk" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.508548 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/256445a5-41a0-47b0-a149-3c31cb2f0959-operator-scripts\") pod \"nova-cell0-db-create-5vfc4\" (UID: \"256445a5-41a0-47b0-a149-3c31cb2f0959\") " pod="openstack/nova-cell0-db-create-5vfc4" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.543989 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntg4d\" (UniqueName: \"kubernetes.io/projected/256445a5-41a0-47b0-a149-3c31cb2f0959-kube-api-access-ntg4d\") pod 
\"nova-cell0-db-create-5vfc4\" (UID: \"256445a5-41a0-47b0-a149-3c31cb2f0959\") " pod="openstack/nova-cell0-db-create-5vfc4" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.545218 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-86ff-account-create-update-fh6cl"] Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.546761 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-86ff-account-create-update-fh6cl" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.548591 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.555000 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-86ff-account-create-update-fh6cl"] Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.605021 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktrcg\" (UniqueName: \"kubernetes.io/projected/1e84f490-c283-41ba-92af-bca6b65b95cf-kube-api-access-ktrcg\") pod \"nova-api-2a9b-account-create-update-zq6mk\" (UID: \"1e84f490-c283-41ba-92af-bca6b65b95cf\") " pod="openstack/nova-api-2a9b-account-create-update-zq6mk" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.605277 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7df341ed-690d-4799-b9a4-18691d7de8d7-operator-scripts\") pod \"nova-cell1-db-create-t5vjd\" (UID: \"7df341ed-690d-4799-b9a4-18691d7de8d7\") " pod="openstack/nova-cell1-db-create-t5vjd" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.605332 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e84f490-c283-41ba-92af-bca6b65b95cf-operator-scripts\") pod \"nova-api-2a9b-account-create-update-zq6mk\" (UID: 
\"1e84f490-c283-41ba-92af-bca6b65b95cf\") " pod="openstack/nova-api-2a9b-account-create-update-zq6mk" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.605419 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drg67\" (UniqueName: \"kubernetes.io/projected/7df341ed-690d-4799-b9a4-18691d7de8d7-kube-api-access-drg67\") pod \"nova-cell1-db-create-t5vjd\" (UID: \"7df341ed-690d-4799-b9a4-18691d7de8d7\") " pod="openstack/nova-cell1-db-create-t5vjd" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.607746 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7df341ed-690d-4799-b9a4-18691d7de8d7-operator-scripts\") pod \"nova-cell1-db-create-t5vjd\" (UID: \"7df341ed-690d-4799-b9a4-18691d7de8d7\") " pod="openstack/nova-cell1-db-create-t5vjd" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.608041 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e84f490-c283-41ba-92af-bca6b65b95cf-operator-scripts\") pod \"nova-api-2a9b-account-create-update-zq6mk\" (UID: \"1e84f490-c283-41ba-92af-bca6b65b95cf\") " pod="openstack/nova-api-2a9b-account-create-update-zq6mk" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.625737 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drg67\" (UniqueName: \"kubernetes.io/projected/7df341ed-690d-4799-b9a4-18691d7de8d7-kube-api-access-drg67\") pod \"nova-cell1-db-create-t5vjd\" (UID: \"7df341ed-690d-4799-b9a4-18691d7de8d7\") " pod="openstack/nova-cell1-db-create-t5vjd" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.626167 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-5vfc4" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.631910 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktrcg\" (UniqueName: \"kubernetes.io/projected/1e84f490-c283-41ba-92af-bca6b65b95cf-kube-api-access-ktrcg\") pod \"nova-api-2a9b-account-create-update-zq6mk\" (UID: \"1e84f490-c283-41ba-92af-bca6b65b95cf\") " pod="openstack/nova-api-2a9b-account-create-update-zq6mk" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.658254 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2a9b-account-create-update-zq6mk" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.699664 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-t5vjd" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.707470 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn9cn\" (UniqueName: \"kubernetes.io/projected/e6533be0-d453-4d29-980d-6be63a601fce-kube-api-access-hn9cn\") pod \"nova-cell0-86ff-account-create-update-fh6cl\" (UID: \"e6533be0-d453-4d29-980d-6be63a601fce\") " pod="openstack/nova-cell0-86ff-account-create-update-fh6cl" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.707548 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6533be0-d453-4d29-980d-6be63a601fce-operator-scripts\") pod \"nova-cell0-86ff-account-create-update-fh6cl\" (UID: \"e6533be0-d453-4d29-980d-6be63a601fce\") " pod="openstack/nova-cell0-86ff-account-create-update-fh6cl" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.719230 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.719647 4825 generic.go:334] "Generic (PLEG): container finished" podID="3011e2a4-8b30-4770-9008-ba269097abfd" containerID="fb587c895e312cd855a14b0eceb839242e93bb126433f037ed201fac8c44bb35" exitCode=0 Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.719703 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3011e2a4-8b30-4770-9008-ba269097abfd","Type":"ContainerDied","Data":"fb587c895e312cd855a14b0eceb839242e93bb126433f037ed201fac8c44bb35"} Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.719730 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3011e2a4-8b30-4770-9008-ba269097abfd","Type":"ContainerDied","Data":"f291696fd5354c722891e1cfc989cf8c9f2e5b135ebc85075447c078350582be"} Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.719749 4825 scope.go:117] "RemoveContainer" containerID="fb587c895e312cd855a14b0eceb839242e93bb126433f037ed201fac8c44bb35" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.742429 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-d132-account-create-update-gqrfl"] Mar 10 07:06:31 crc kubenswrapper[4825]: E0310 07:06:31.742953 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3011e2a4-8b30-4770-9008-ba269097abfd" containerName="glance-httpd" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.742978 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="3011e2a4-8b30-4770-9008-ba269097abfd" containerName="glance-httpd" Mar 10 07:06:31 crc kubenswrapper[4825]: E0310 07:06:31.743020 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3011e2a4-8b30-4770-9008-ba269097abfd" containerName="glance-log" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.743028 4825 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="3011e2a4-8b30-4770-9008-ba269097abfd" containerName="glance-log" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.743260 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="3011e2a4-8b30-4770-9008-ba269097abfd" containerName="glance-httpd" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.743283 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="3011e2a4-8b30-4770-9008-ba269097abfd" containerName="glance-log" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.743946 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-d132-account-create-update-gqrfl" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.750435 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-d132-account-create-update-gqrfl"] Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.757904 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.768544 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3c1d516-a0fa-48aa-85ff-eaf7b14499f1","Type":"ContainerStarted","Data":"8b670641e702b30f1f843231e4d3fffd5a269f476341f1a480f08ad6237fcd52"} Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.803921 4825 scope.go:117] "RemoveContainer" containerID="8d2883cad525f02553d47fdbbfc805d6ed15f1d5770b9ca52425168368349330" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.813197 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3011e2a4-8b30-4770-9008-ba269097abfd-config-data\") pod \"3011e2a4-8b30-4770-9008-ba269097abfd\" (UID: \"3011e2a4-8b30-4770-9008-ba269097abfd\") " Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.813251 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3011e2a4-8b30-4770-9008-ba269097abfd-logs\") pod \"3011e2a4-8b30-4770-9008-ba269097abfd\" (UID: \"3011e2a4-8b30-4770-9008-ba269097abfd\") " Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.813289 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3011e2a4-8b30-4770-9008-ba269097abfd-combined-ca-bundle\") pod \"3011e2a4-8b30-4770-9008-ba269097abfd\" (UID: \"3011e2a4-8b30-4770-9008-ba269097abfd\") " Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.813341 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3011e2a4-8b30-4770-9008-ba269097abfd-httpd-run\") pod \"3011e2a4-8b30-4770-9008-ba269097abfd\" (UID: \"3011e2a4-8b30-4770-9008-ba269097abfd\") " Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.813414 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6nbm\" (UniqueName: \"kubernetes.io/projected/3011e2a4-8b30-4770-9008-ba269097abfd-kube-api-access-q6nbm\") pod \"3011e2a4-8b30-4770-9008-ba269097abfd\" (UID: \"3011e2a4-8b30-4770-9008-ba269097abfd\") " Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.813461 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3011e2a4-8b30-4770-9008-ba269097abfd-scripts\") pod \"3011e2a4-8b30-4770-9008-ba269097abfd\" (UID: \"3011e2a4-8b30-4770-9008-ba269097abfd\") " Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.813604 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"3011e2a4-8b30-4770-9008-ba269097abfd\" (UID: \"3011e2a4-8b30-4770-9008-ba269097abfd\") " Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.813633 4825 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3011e2a4-8b30-4770-9008-ba269097abfd-public-tls-certs\") pod \"3011e2a4-8b30-4770-9008-ba269097abfd\" (UID: \"3011e2a4-8b30-4770-9008-ba269097abfd\") " Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.813866 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6533be0-d453-4d29-980d-6be63a601fce-operator-scripts\") pod \"nova-cell0-86ff-account-create-update-fh6cl\" (UID: \"e6533be0-d453-4d29-980d-6be63a601fce\") " pod="openstack/nova-cell0-86ff-account-create-update-fh6cl" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.813994 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn9cn\" (UniqueName: \"kubernetes.io/projected/e6533be0-d453-4d29-980d-6be63a601fce-kube-api-access-hn9cn\") pod \"nova-cell0-86ff-account-create-update-fh6cl\" (UID: \"e6533be0-d453-4d29-980d-6be63a601fce\") " pod="openstack/nova-cell0-86ff-account-create-update-fh6cl" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.814921 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3011e2a4-8b30-4770-9008-ba269097abfd-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3011e2a4-8b30-4770-9008-ba269097abfd" (UID: "3011e2a4-8b30-4770-9008-ba269097abfd"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.818718 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3011e2a4-8b30-4770-9008-ba269097abfd-kube-api-access-q6nbm" (OuterVolumeSpecName: "kube-api-access-q6nbm") pod "3011e2a4-8b30-4770-9008-ba269097abfd" (UID: "3011e2a4-8b30-4770-9008-ba269097abfd"). InnerVolumeSpecName "kube-api-access-q6nbm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.821526 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3011e2a4-8b30-4770-9008-ba269097abfd-logs" (OuterVolumeSpecName: "logs") pod "3011e2a4-8b30-4770-9008-ba269097abfd" (UID: "3011e2a4-8b30-4770-9008-ba269097abfd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.827118 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3011e2a4-8b30-4770-9008-ba269097abfd-scripts" (OuterVolumeSpecName: "scripts") pod "3011e2a4-8b30-4770-9008-ba269097abfd" (UID: "3011e2a4-8b30-4770-9008-ba269097abfd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.846724 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "3011e2a4-8b30-4770-9008-ba269097abfd" (UID: "3011e2a4-8b30-4770-9008-ba269097abfd"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.848214 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6533be0-d453-4d29-980d-6be63a601fce-operator-scripts\") pod \"nova-cell0-86ff-account-create-update-fh6cl\" (UID: \"e6533be0-d453-4d29-980d-6be63a601fce\") " pod="openstack/nova-cell0-86ff-account-create-update-fh6cl" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.848490 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn9cn\" (UniqueName: \"kubernetes.io/projected/e6533be0-d453-4d29-980d-6be63a601fce-kube-api-access-hn9cn\") pod \"nova-cell0-86ff-account-create-update-fh6cl\" (UID: \"e6533be0-d453-4d29-980d-6be63a601fce\") " pod="openstack/nova-cell0-86ff-account-create-update-fh6cl" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.868991 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3011e2a4-8b30-4770-9008-ba269097abfd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3011e2a4-8b30-4770-9008-ba269097abfd" (UID: "3011e2a4-8b30-4770-9008-ba269097abfd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.901002 4825 scope.go:117] "RemoveContainer" containerID="fb587c895e312cd855a14b0eceb839242e93bb126433f037ed201fac8c44bb35" Mar 10 07:06:31 crc kubenswrapper[4825]: E0310 07:06:31.901584 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb587c895e312cd855a14b0eceb839242e93bb126433f037ed201fac8c44bb35\": container with ID starting with fb587c895e312cd855a14b0eceb839242e93bb126433f037ed201fac8c44bb35 not found: ID does not exist" containerID="fb587c895e312cd855a14b0eceb839242e93bb126433f037ed201fac8c44bb35" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.901616 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb587c895e312cd855a14b0eceb839242e93bb126433f037ed201fac8c44bb35"} err="failed to get container status \"fb587c895e312cd855a14b0eceb839242e93bb126433f037ed201fac8c44bb35\": rpc error: code = NotFound desc = could not find container \"fb587c895e312cd855a14b0eceb839242e93bb126433f037ed201fac8c44bb35\": container with ID starting with fb587c895e312cd855a14b0eceb839242e93bb126433f037ed201fac8c44bb35 not found: ID does not exist" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.901643 4825 scope.go:117] "RemoveContainer" containerID="8d2883cad525f02553d47fdbbfc805d6ed15f1d5770b9ca52425168368349330" Mar 10 07:06:31 crc kubenswrapper[4825]: E0310 07:06:31.901962 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d2883cad525f02553d47fdbbfc805d6ed15f1d5770b9ca52425168368349330\": container with ID starting with 8d2883cad525f02553d47fdbbfc805d6ed15f1d5770b9ca52425168368349330 not found: ID does not exist" containerID="8d2883cad525f02553d47fdbbfc805d6ed15f1d5770b9ca52425168368349330" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.901978 
4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d2883cad525f02553d47fdbbfc805d6ed15f1d5770b9ca52425168368349330"} err="failed to get container status \"8d2883cad525f02553d47fdbbfc805d6ed15f1d5770b9ca52425168368349330\": rpc error: code = NotFound desc = could not find container \"8d2883cad525f02553d47fdbbfc805d6ed15f1d5770b9ca52425168368349330\": container with ID starting with 8d2883cad525f02553d47fdbbfc805d6ed15f1d5770b9ca52425168368349330 not found: ID does not exist" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.903859 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-86ff-account-create-update-fh6cl" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.914205 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3011e2a4-8b30-4770-9008-ba269097abfd-config-data" (OuterVolumeSpecName: "config-data") pod "3011e2a4-8b30-4770-9008-ba269097abfd" (UID: "3011e2a4-8b30-4770-9008-ba269097abfd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.915326 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf42b4fa-76fe-4d07-bdc3-c05c94cb4f6f-operator-scripts\") pod \"nova-cell1-d132-account-create-update-gqrfl\" (UID: \"cf42b4fa-76fe-4d07-bdc3-c05c94cb4f6f\") " pod="openstack/nova-cell1-d132-account-create-update-gqrfl" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.915487 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb9rd\" (UniqueName: \"kubernetes.io/projected/cf42b4fa-76fe-4d07-bdc3-c05c94cb4f6f-kube-api-access-gb9rd\") pod \"nova-cell1-d132-account-create-update-gqrfl\" (UID: \"cf42b4fa-76fe-4d07-bdc3-c05c94cb4f6f\") " pod="openstack/nova-cell1-d132-account-create-update-gqrfl" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.915581 4825 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.915597 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3011e2a4-8b30-4770-9008-ba269097abfd-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.915638 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3011e2a4-8b30-4770-9008-ba269097abfd-logs\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.915661 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3011e2a4-8b30-4770-9008-ba269097abfd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:31 crc 
kubenswrapper[4825]: I0310 07:06:31.915673 4825 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3011e2a4-8b30-4770-9008-ba269097abfd-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.915683 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6nbm\" (UniqueName: \"kubernetes.io/projected/3011e2a4-8b30-4770-9008-ba269097abfd-kube-api-access-q6nbm\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.915694 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3011e2a4-8b30-4770-9008-ba269097abfd-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.928232 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3011e2a4-8b30-4770-9008-ba269097abfd-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3011e2a4-8b30-4770-9008-ba269097abfd" (UID: "3011e2a4-8b30-4770-9008-ba269097abfd"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:31 crc kubenswrapper[4825]: I0310 07:06:31.953842 4825 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.017539 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gb9rd\" (UniqueName: \"kubernetes.io/projected/cf42b4fa-76fe-4d07-bdc3-c05c94cb4f6f-kube-api-access-gb9rd\") pod \"nova-cell1-d132-account-create-update-gqrfl\" (UID: \"cf42b4fa-76fe-4d07-bdc3-c05c94cb4f6f\") " pod="openstack/nova-cell1-d132-account-create-update-gqrfl" Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.017617 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf42b4fa-76fe-4d07-bdc3-c05c94cb4f6f-operator-scripts\") pod \"nova-cell1-d132-account-create-update-gqrfl\" (UID: \"cf42b4fa-76fe-4d07-bdc3-c05c94cb4f6f\") " pod="openstack/nova-cell1-d132-account-create-update-gqrfl" Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.017706 4825 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.017718 4825 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3011e2a4-8b30-4770-9008-ba269097abfd-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.018421 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf42b4fa-76fe-4d07-bdc3-c05c94cb4f6f-operator-scripts\") pod \"nova-cell1-d132-account-create-update-gqrfl\" (UID: 
\"cf42b4fa-76fe-4d07-bdc3-c05c94cb4f6f\") " pod="openstack/nova-cell1-d132-account-create-update-gqrfl" Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.034967 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb9rd\" (UniqueName: \"kubernetes.io/projected/cf42b4fa-76fe-4d07-bdc3-c05c94cb4f6f-kube-api-access-gb9rd\") pod \"nova-cell1-d132-account-create-update-gqrfl\" (UID: \"cf42b4fa-76fe-4d07-bdc3-c05c94cb4f6f\") " pod="openstack/nova-cell1-d132-account-create-update-gqrfl" Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.043989 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-vv657"] Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.074005 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="d6f252a1-68f2-4f9a-ade7-0b979581b8c6" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.154:9292/healthcheck\": read tcp 10.217.0.2:44996->10.217.0.154:9292: read: connection reset by peer" Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.074472 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="d6f252a1-68f2-4f9a-ade7-0b979581b8c6" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.154:9292/healthcheck\": read tcp 10.217.0.2:44994->10.217.0.154:9292: read: connection reset by peer" Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.099231 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-d132-account-create-update-gqrfl" Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.112424 4825 scope.go:117] "RemoveContainer" containerID="801f701059c4cc548a98f3a4ba85c8a061b2d540f0fc8220d4041016633e9a0a" Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.290038 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-5vfc4"] Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.432445 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-2a9b-account-create-update-zq6mk"] Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.496425 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-t5vjd"] Mar 10 07:06:32 crc kubenswrapper[4825]: W0310 07:06:32.546349 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7df341ed_690d_4799_b9a4_18691d7de8d7.slice/crio-aca8d82873eb6ebedc58a1d149608971e5b8bbb749078513c00a646a82e87eeb WatchSource:0}: Error finding container aca8d82873eb6ebedc58a1d149608971e5b8bbb749078513c00a646a82e87eeb: Status 404 returned error can't find the container with id aca8d82873eb6ebedc58a1d149608971e5b8bbb749078513c00a646a82e87eeb Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.600370 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-86ff-account-create-update-fh6cl"] Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.767361 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.790705 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-d132-account-create-update-gqrfl"] Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.796943 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-5vfc4" event={"ID":"256445a5-41a0-47b0-a149-3c31cb2f0959","Type":"ContainerStarted","Data":"c79985f0c20e51e94ef65276b3b0679ff00e78c8af4eb7e94defcd1ac905c499"} Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.796990 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-5vfc4" event={"ID":"256445a5-41a0-47b0-a149-3c31cb2f0959","Type":"ContainerStarted","Data":"638146b015f7d888cc0a8daddf06bbad976bfc702da658e6b798612ee659e4a3"} Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.810490 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-t5vjd" event={"ID":"7df341ed-690d-4799-b9a4-18691d7de8d7","Type":"ContainerStarted","Data":"aca8d82873eb6ebedc58a1d149608971e5b8bbb749078513c00a646a82e87eeb"} Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.820199 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.842674 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2a9b-account-create-update-zq6mk" event={"ID":"1e84f490-c283-41ba-92af-bca6b65b95cf","Type":"ContainerStarted","Data":"ff574575ba140cd21c83cc1ec9df541d4e5a9107a11a2d6cd35ab129d3763676"} Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.842728 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2a9b-account-create-update-zq6mk" event={"ID":"1e84f490-c283-41ba-92af-bca6b65b95cf","Type":"ContainerStarted","Data":"4739830110c0ceec552dcb90152ba05e1772ad885a029601bed6643884a93b32"} Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.843809 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d6f252a1-68f2-4f9a-ade7-0b979581b8c6-httpd-run\") pod \"d6f252a1-68f2-4f9a-ade7-0b979581b8c6\" (UID: \"d6f252a1-68f2-4f9a-ade7-0b979581b8c6\") " Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.843857 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6f252a1-68f2-4f9a-ade7-0b979581b8c6-internal-tls-certs\") pod \"d6f252a1-68f2-4f9a-ade7-0b979581b8c6\" (UID: \"d6f252a1-68f2-4f9a-ade7-0b979581b8c6\") " Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.843950 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6f252a1-68f2-4f9a-ade7-0b979581b8c6-config-data\") pod \"d6f252a1-68f2-4f9a-ade7-0b979581b8c6\" (UID: \"d6f252a1-68f2-4f9a-ade7-0b979581b8c6\") " Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.843982 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") 
pod \"d6f252a1-68f2-4f9a-ade7-0b979581b8c6\" (UID: \"d6f252a1-68f2-4f9a-ade7-0b979581b8c6\") " Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.844047 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6f252a1-68f2-4f9a-ade7-0b979581b8c6-scripts\") pod \"d6f252a1-68f2-4f9a-ade7-0b979581b8c6\" (UID: \"d6f252a1-68f2-4f9a-ade7-0b979581b8c6\") " Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.844080 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6f252a1-68f2-4f9a-ade7-0b979581b8c6-combined-ca-bundle\") pod \"d6f252a1-68f2-4f9a-ade7-0b979581b8c6\" (UID: \"d6f252a1-68f2-4f9a-ade7-0b979581b8c6\") " Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.844109 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6f252a1-68f2-4f9a-ade7-0b979581b8c6-logs\") pod \"d6f252a1-68f2-4f9a-ade7-0b979581b8c6\" (UID: \"d6f252a1-68f2-4f9a-ade7-0b979581b8c6\") " Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.844130 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-796sf\" (UniqueName: \"kubernetes.io/projected/d6f252a1-68f2-4f9a-ade7-0b979581b8c6-kube-api-access-796sf\") pod \"d6f252a1-68f2-4f9a-ade7-0b979581b8c6\" (UID: \"d6f252a1-68f2-4f9a-ade7-0b979581b8c6\") " Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.845309 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-5vfc4" podStartSLOduration=1.845285396 podStartE2EDuration="1.845285396s" podCreationTimestamp="2026-03-10 07:06:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:06:32.837917074 +0000 UTC m=+1345.867697689" 
watchObservedRunningTime="2026-03-10 07:06:32.845285396 +0000 UTC m=+1345.875066011" Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.847776 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6f252a1-68f2-4f9a-ade7-0b979581b8c6-kube-api-access-796sf" (OuterVolumeSpecName: "kube-api-access-796sf") pod "d6f252a1-68f2-4f9a-ade7-0b979581b8c6" (UID: "d6f252a1-68f2-4f9a-ade7-0b979581b8c6"). InnerVolumeSpecName "kube-api-access-796sf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.851253 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6f252a1-68f2-4f9a-ade7-0b979581b8c6-logs" (OuterVolumeSpecName: "logs") pod "d6f252a1-68f2-4f9a-ade7-0b979581b8c6" (UID: "d6f252a1-68f2-4f9a-ade7-0b979581b8c6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.852686 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6f252a1-68f2-4f9a-ade7-0b979581b8c6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d6f252a1-68f2-4f9a-ade7-0b979581b8c6" (UID: "d6f252a1-68f2-4f9a-ade7-0b979581b8c6"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.857251 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "d6f252a1-68f2-4f9a-ade7-0b979581b8c6" (UID: "d6f252a1-68f2-4f9a-ade7-0b979581b8c6"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.861608 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6f252a1-68f2-4f9a-ade7-0b979581b8c6-scripts" (OuterVolumeSpecName: "scripts") pod "d6f252a1-68f2-4f9a-ade7-0b979581b8c6" (UID: "d6f252a1-68f2-4f9a-ade7-0b979581b8c6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.868926 4825 generic.go:334] "Generic (PLEG): container finished" podID="cba79823-f7ea-4ac9-80e0-ce6e216af1bc" containerID="3d8dc26b029c96b4d27209c0234e0de2c19670232d57153317bfaac26bfd5e16" exitCode=0 Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.868996 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vv657" event={"ID":"cba79823-f7ea-4ac9-80e0-ce6e216af1bc","Type":"ContainerDied","Data":"3d8dc26b029c96b4d27209c0234e0de2c19670232d57153317bfaac26bfd5e16"} Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.869025 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vv657" event={"ID":"cba79823-f7ea-4ac9-80e0-ce6e216af1bc","Type":"ContainerStarted","Data":"c08d221efbc444c97f75a7b59fccdd5d5b5590ba9c97c26cae92fce91e144743"} Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.869553 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-2a9b-account-create-update-zq6mk" podStartSLOduration=1.869535288 podStartE2EDuration="1.869535288s" podCreationTimestamp="2026-03-10 07:06:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:06:32.867431683 +0000 UTC m=+1345.897212298" watchObservedRunningTime="2026-03-10 07:06:32.869535288 +0000 UTC m=+1345.899315903" Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.878286 4825 
generic.go:334] "Generic (PLEG): container finished" podID="d6f252a1-68f2-4f9a-ade7-0b979581b8c6" containerID="7ece94efb23a5c9ce51f2fc5f49bcfaf81a9ad61d4de7ab5ac8289cd5f6303ba" exitCode=0 Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.878361 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d6f252a1-68f2-4f9a-ade7-0b979581b8c6","Type":"ContainerDied","Data":"7ece94efb23a5c9ce51f2fc5f49bcfaf81a9ad61d4de7ab5ac8289cd5f6303ba"} Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.878400 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d6f252a1-68f2-4f9a-ade7-0b979581b8c6","Type":"ContainerDied","Data":"fb059dba5c7178f8d2bfa979ed022c99c306d84c82afdb0fba8800ac06115a93"} Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.878419 4825 scope.go:117] "RemoveContainer" containerID="7ece94efb23a5c9ce51f2fc5f49bcfaf81a9ad61d4de7ab5ac8289cd5f6303ba" Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.878521 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.905404 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-86ff-account-create-update-fh6cl" event={"ID":"e6533be0-d453-4d29-980d-6be63a601fce","Type":"ContainerStarted","Data":"db664e40bcc7ecd11c565e58fd10d9f24fea441e1bc20c6ec5a204040b1cbda5"} Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.928666 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.945282 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6f252a1-68f2-4f9a-ade7-0b979581b8c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d6f252a1-68f2-4f9a-ade7-0b979581b8c6" (UID: "d6f252a1-68f2-4f9a-ade7-0b979581b8c6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.946342 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6f252a1-68f2-4f9a-ade7-0b979581b8c6-combined-ca-bundle\") pod \"d6f252a1-68f2-4f9a-ade7-0b979581b8c6\" (UID: \"d6f252a1-68f2-4f9a-ade7-0b979581b8c6\") " Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.946854 4825 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d6f252a1-68f2-4f9a-ade7-0b979581b8c6-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.946884 4825 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.946893 4825 reconciler_common.go:293] "Volume detached for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6f252a1-68f2-4f9a-ade7-0b979581b8c6-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.946903 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6f252a1-68f2-4f9a-ade7-0b979581b8c6-logs\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.946911 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-796sf\" (UniqueName: \"kubernetes.io/projected/d6f252a1-68f2-4f9a-ade7-0b979581b8c6-kube-api-access-796sf\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:32 crc kubenswrapper[4825]: W0310 07:06:32.948038 4825 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/d6f252a1-68f2-4f9a-ade7-0b979581b8c6/volumes/kubernetes.io~secret/combined-ca-bundle Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.948055 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6f252a1-68f2-4f9a-ade7-0b979581b8c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d6f252a1-68f2-4f9a-ade7-0b979581b8c6" (UID: "d6f252a1-68f2-4f9a-ade7-0b979581b8c6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.956251 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.968596 4825 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.988401 4825 scope.go:117] "RemoveContainer" containerID="65812fc5fcac830b784d74cae3b58dafd700fcc4d4623a7b4f5a9fd5e827bda9" Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.994822 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 07:06:32 crc kubenswrapper[4825]: E0310 07:06:32.995281 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6f252a1-68f2-4f9a-ade7-0b979581b8c6" containerName="glance-log" Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.995300 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6f252a1-68f2-4f9a-ade7-0b979581b8c6" containerName="glance-log" Mar 10 07:06:32 crc kubenswrapper[4825]: E0310 07:06:32.995329 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6f252a1-68f2-4f9a-ade7-0b979581b8c6" containerName="glance-httpd" Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.995336 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6f252a1-68f2-4f9a-ade7-0b979581b8c6" containerName="glance-httpd" Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.995529 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6f252a1-68f2-4f9a-ade7-0b979581b8c6" containerName="glance-httpd" Mar 10 07:06:32 crc kubenswrapper[4825]: I0310 07:06:32.995557 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6f252a1-68f2-4f9a-ade7-0b979581b8c6" containerName="glance-log" Mar 10 07:06:33 crc 
kubenswrapper[4825]: I0310 07:06:33.018355 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.026561 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.040679 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.040878 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.051084 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6f252a1-68f2-4f9a-ade7-0b979581b8c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.051113 4825 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.167332 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"f33e10c8-8be2-46d4-8653-1960855a2a40\") " pod="openstack/glance-default-external-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.167700 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f33e10c8-8be2-46d4-8653-1960855a2a40-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f33e10c8-8be2-46d4-8653-1960855a2a40\") " pod="openstack/glance-default-external-api-0" 
Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.167750 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f33e10c8-8be2-46d4-8653-1960855a2a40-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f33e10c8-8be2-46d4-8653-1960855a2a40\") " pod="openstack/glance-default-external-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.167783 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f33e10c8-8be2-46d4-8653-1960855a2a40-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f33e10c8-8be2-46d4-8653-1960855a2a40\") " pod="openstack/glance-default-external-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.167814 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f33e10c8-8be2-46d4-8653-1960855a2a40-logs\") pod \"glance-default-external-api-0\" (UID: \"f33e10c8-8be2-46d4-8653-1960855a2a40\") " pod="openstack/glance-default-external-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.167900 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwppm\" (UniqueName: \"kubernetes.io/projected/f33e10c8-8be2-46d4-8653-1960855a2a40-kube-api-access-mwppm\") pod \"glance-default-external-api-0\" (UID: \"f33e10c8-8be2-46d4-8653-1960855a2a40\") " pod="openstack/glance-default-external-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.167943 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f33e10c8-8be2-46d4-8653-1960855a2a40-scripts\") pod \"glance-default-external-api-0\" (UID: \"f33e10c8-8be2-46d4-8653-1960855a2a40\") " 
pod="openstack/glance-default-external-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.167992 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f33e10c8-8be2-46d4-8653-1960855a2a40-config-data\") pod \"glance-default-external-api-0\" (UID: \"f33e10c8-8be2-46d4-8653-1960855a2a40\") " pod="openstack/glance-default-external-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.184515 4825 scope.go:117] "RemoveContainer" containerID="7ece94efb23a5c9ce51f2fc5f49bcfaf81a9ad61d4de7ab5ac8289cd5f6303ba" Mar 10 07:06:33 crc kubenswrapper[4825]: E0310 07:06:33.198365 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ece94efb23a5c9ce51f2fc5f49bcfaf81a9ad61d4de7ab5ac8289cd5f6303ba\": container with ID starting with 7ece94efb23a5c9ce51f2fc5f49bcfaf81a9ad61d4de7ab5ac8289cd5f6303ba not found: ID does not exist" containerID="7ece94efb23a5c9ce51f2fc5f49bcfaf81a9ad61d4de7ab5ac8289cd5f6303ba" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.198416 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ece94efb23a5c9ce51f2fc5f49bcfaf81a9ad61d4de7ab5ac8289cd5f6303ba"} err="failed to get container status \"7ece94efb23a5c9ce51f2fc5f49bcfaf81a9ad61d4de7ab5ac8289cd5f6303ba\": rpc error: code = NotFound desc = could not find container \"7ece94efb23a5c9ce51f2fc5f49bcfaf81a9ad61d4de7ab5ac8289cd5f6303ba\": container with ID starting with 7ece94efb23a5c9ce51f2fc5f49bcfaf81a9ad61d4de7ab5ac8289cd5f6303ba not found: ID does not exist" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.198442 4825 scope.go:117] "RemoveContainer" containerID="65812fc5fcac830b784d74cae3b58dafd700fcc4d4623a7b4f5a9fd5e827bda9" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.199656 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/d6f252a1-68f2-4f9a-ade7-0b979581b8c6-config-data" (OuterVolumeSpecName: "config-data") pod "d6f252a1-68f2-4f9a-ade7-0b979581b8c6" (UID: "d6f252a1-68f2-4f9a-ade7-0b979581b8c6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.205309 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6f252a1-68f2-4f9a-ade7-0b979581b8c6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d6f252a1-68f2-4f9a-ade7-0b979581b8c6" (UID: "d6f252a1-68f2-4f9a-ade7-0b979581b8c6"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:33 crc kubenswrapper[4825]: E0310 07:06:33.235366 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65812fc5fcac830b784d74cae3b58dafd700fcc4d4623a7b4f5a9fd5e827bda9\": container with ID starting with 65812fc5fcac830b784d74cae3b58dafd700fcc4d4623a7b4f5a9fd5e827bda9 not found: ID does not exist" containerID="65812fc5fcac830b784d74cae3b58dafd700fcc4d4623a7b4f5a9fd5e827bda9" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.235405 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65812fc5fcac830b784d74cae3b58dafd700fcc4d4623a7b4f5a9fd5e827bda9"} err="failed to get container status \"65812fc5fcac830b784d74cae3b58dafd700fcc4d4623a7b4f5a9fd5e827bda9\": rpc error: code = NotFound desc = could not find container \"65812fc5fcac830b784d74cae3b58dafd700fcc4d4623a7b4f5a9fd5e827bda9\": container with ID starting with 65812fc5fcac830b784d74cae3b58dafd700fcc4d4623a7b4f5a9fd5e827bda9 not found: ID does not exist" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.274231 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/f33e10c8-8be2-46d4-8653-1960855a2a40-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f33e10c8-8be2-46d4-8653-1960855a2a40\") " pod="openstack/glance-default-external-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.274289 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f33e10c8-8be2-46d4-8653-1960855a2a40-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f33e10c8-8be2-46d4-8653-1960855a2a40\") " pod="openstack/glance-default-external-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.274313 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f33e10c8-8be2-46d4-8653-1960855a2a40-logs\") pod \"glance-default-external-api-0\" (UID: \"f33e10c8-8be2-46d4-8653-1960855a2a40\") " pod="openstack/glance-default-external-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.274373 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwppm\" (UniqueName: \"kubernetes.io/projected/f33e10c8-8be2-46d4-8653-1960855a2a40-kube-api-access-mwppm\") pod \"glance-default-external-api-0\" (UID: \"f33e10c8-8be2-46d4-8653-1960855a2a40\") " pod="openstack/glance-default-external-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.274402 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f33e10c8-8be2-46d4-8653-1960855a2a40-scripts\") pod \"glance-default-external-api-0\" (UID: \"f33e10c8-8be2-46d4-8653-1960855a2a40\") " pod="openstack/glance-default-external-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.274436 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f33e10c8-8be2-46d4-8653-1960855a2a40-config-data\") pod \"glance-default-external-api-0\" (UID: \"f33e10c8-8be2-46d4-8653-1960855a2a40\") " pod="openstack/glance-default-external-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.274477 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"f33e10c8-8be2-46d4-8653-1960855a2a40\") " pod="openstack/glance-default-external-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.274520 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f33e10c8-8be2-46d4-8653-1960855a2a40-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f33e10c8-8be2-46d4-8653-1960855a2a40\") " pod="openstack/glance-default-external-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.274569 4825 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6f252a1-68f2-4f9a-ade7-0b979581b8c6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.274579 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6f252a1-68f2-4f9a-ade7-0b979581b8c6-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.276197 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f33e10c8-8be2-46d4-8653-1960855a2a40-logs\") pod \"glance-default-external-api-0\" (UID: \"f33e10c8-8be2-46d4-8653-1960855a2a40\") " pod="openstack/glance-default-external-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.276703 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for 
volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"f33e10c8-8be2-46d4-8653-1960855a2a40\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.282348 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f33e10c8-8be2-46d4-8653-1960855a2a40-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f33e10c8-8be2-46d4-8653-1960855a2a40\") " pod="openstack/glance-default-external-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.287102 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f33e10c8-8be2-46d4-8653-1960855a2a40-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f33e10c8-8be2-46d4-8653-1960855a2a40\") " pod="openstack/glance-default-external-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.289609 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f33e10c8-8be2-46d4-8653-1960855a2a40-scripts\") pod \"glance-default-external-api-0\" (UID: \"f33e10c8-8be2-46d4-8653-1960855a2a40\") " pod="openstack/glance-default-external-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.298420 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3011e2a4-8b30-4770-9008-ba269097abfd" path="/var/lib/kubelet/pods/3011e2a4-8b30-4770-9008-ba269097abfd/volumes" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.311361 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f33e10c8-8be2-46d4-8653-1960855a2a40-config-data\") pod \"glance-default-external-api-0\" (UID: \"f33e10c8-8be2-46d4-8653-1960855a2a40\") " 
pod="openstack/glance-default-external-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.319005 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwppm\" (UniqueName: \"kubernetes.io/projected/f33e10c8-8be2-46d4-8653-1960855a2a40-kube-api-access-mwppm\") pod \"glance-default-external-api-0\" (UID: \"f33e10c8-8be2-46d4-8653-1960855a2a40\") " pod="openstack/glance-default-external-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.324054 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f33e10c8-8be2-46d4-8653-1960855a2a40-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f33e10c8-8be2-46d4-8653-1960855a2a40\") " pod="openstack/glance-default-external-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.371624 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"f33e10c8-8be2-46d4-8653-1960855a2a40\") " pod="openstack/glance-default-external-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.554334 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.574151 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.586585 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.597542 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.603255 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.607995 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.608633 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.661658 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.689180 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b3d94ce-f8d8-4653-a6b0-2682b23d834e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8b3d94ce-f8d8-4653-a6b0-2682b23d834e\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.689267 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b3d94ce-f8d8-4653-a6b0-2682b23d834e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8b3d94ce-f8d8-4653-a6b0-2682b23d834e\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.689323 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b3d94ce-f8d8-4653-a6b0-2682b23d834e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8b3d94ce-f8d8-4653-a6b0-2682b23d834e\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.689380 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/8b3d94ce-f8d8-4653-a6b0-2682b23d834e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8b3d94ce-f8d8-4653-a6b0-2682b23d834e\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.689452 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8b3d94ce-f8d8-4653-a6b0-2682b23d834e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8b3d94ce-f8d8-4653-a6b0-2682b23d834e\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.689518 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxbws\" (UniqueName: \"kubernetes.io/projected/8b3d94ce-f8d8-4653-a6b0-2682b23d834e-kube-api-access-gxbws\") pod \"glance-default-internal-api-0\" (UID: \"8b3d94ce-f8d8-4653-a6b0-2682b23d834e\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.689587 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b3d94ce-f8d8-4653-a6b0-2682b23d834e-logs\") pod \"glance-default-internal-api-0\" (UID: \"8b3d94ce-f8d8-4653-a6b0-2682b23d834e\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.689623 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"8b3d94ce-f8d8-4653-a6b0-2682b23d834e\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.791084 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/8b3d94ce-f8d8-4653-a6b0-2682b23d834e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8b3d94ce-f8d8-4653-a6b0-2682b23d834e\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.791174 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxbws\" (UniqueName: \"kubernetes.io/projected/8b3d94ce-f8d8-4653-a6b0-2682b23d834e-kube-api-access-gxbws\") pod \"glance-default-internal-api-0\" (UID: \"8b3d94ce-f8d8-4653-a6b0-2682b23d834e\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.791217 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b3d94ce-f8d8-4653-a6b0-2682b23d834e-logs\") pod \"glance-default-internal-api-0\" (UID: \"8b3d94ce-f8d8-4653-a6b0-2682b23d834e\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.791244 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"8b3d94ce-f8d8-4653-a6b0-2682b23d834e\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.791282 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b3d94ce-f8d8-4653-a6b0-2682b23d834e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8b3d94ce-f8d8-4653-a6b0-2682b23d834e\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.791310 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b3d94ce-f8d8-4653-a6b0-2682b23d834e-config-data\") 
pod \"glance-default-internal-api-0\" (UID: \"8b3d94ce-f8d8-4653-a6b0-2682b23d834e\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.791330 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b3d94ce-f8d8-4653-a6b0-2682b23d834e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8b3d94ce-f8d8-4653-a6b0-2682b23d834e\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.791360 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b3d94ce-f8d8-4653-a6b0-2682b23d834e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8b3d94ce-f8d8-4653-a6b0-2682b23d834e\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.793092 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b3d94ce-f8d8-4653-a6b0-2682b23d834e-logs\") pod \"glance-default-internal-api-0\" (UID: \"8b3d94ce-f8d8-4653-a6b0-2682b23d834e\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.793231 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8b3d94ce-f8d8-4653-a6b0-2682b23d834e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8b3d94ce-f8d8-4653-a6b0-2682b23d834e\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.793593 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"8b3d94ce-f8d8-4653-a6b0-2682b23d834e\") device mount path \"/mnt/openstack/pv09\"" 
pod="openstack/glance-default-internal-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.798433 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b3d94ce-f8d8-4653-a6b0-2682b23d834e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8b3d94ce-f8d8-4653-a6b0-2682b23d834e\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.799848 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b3d94ce-f8d8-4653-a6b0-2682b23d834e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8b3d94ce-f8d8-4653-a6b0-2682b23d834e\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.803922 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b3d94ce-f8d8-4653-a6b0-2682b23d834e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8b3d94ce-f8d8-4653-a6b0-2682b23d834e\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.804468 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b3d94ce-f8d8-4653-a6b0-2682b23d834e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8b3d94ce-f8d8-4653-a6b0-2682b23d834e\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.812230 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxbws\" (UniqueName: \"kubernetes.io/projected/8b3d94ce-f8d8-4653-a6b0-2682b23d834e-kube-api-access-gxbws\") pod \"glance-default-internal-api-0\" (UID: \"8b3d94ce-f8d8-4653-a6b0-2682b23d834e\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: 
I0310 07:06:33.845662 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"8b3d94ce-f8d8-4653-a6b0-2682b23d834e\") " pod="openstack/glance-default-internal-api-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.920785 4825 generic.go:334] "Generic (PLEG): container finished" podID="1e84f490-c283-41ba-92af-bca6b65b95cf" containerID="ff574575ba140cd21c83cc1ec9df541d4e5a9107a11a2d6cd35ab129d3763676" exitCode=0 Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.920834 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2a9b-account-create-update-zq6mk" event={"ID":"1e84f490-c283-41ba-92af-bca6b65b95cf","Type":"ContainerDied","Data":"ff574575ba140cd21c83cc1ec9df541d4e5a9107a11a2d6cd35ab129d3763676"} Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.922187 4825 generic.go:334] "Generic (PLEG): container finished" podID="7df341ed-690d-4799-b9a4-18691d7de8d7" containerID="6b20962e48b9baf3eaa366f3b6fd06bf9bf5f98ff0a301cbbdd95be4fc14ca33" exitCode=0 Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.922275 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-t5vjd" event={"ID":"7df341ed-690d-4799-b9a4-18691d7de8d7","Type":"ContainerDied","Data":"6b20962e48b9baf3eaa366f3b6fd06bf9bf5f98ff0a301cbbdd95be4fc14ca33"} Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.931699 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a3c1d516-a0fa-48aa-85ff-eaf7b14499f1" containerName="ceilometer-central-agent" containerID="cri-o://d8757f7dcc14f6f88cccb8185df4916056abc0f519fa5e71f15ef87ce6d9c00a" gracePeriod=30 Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.931790 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="a3c1d516-a0fa-48aa-85ff-eaf7b14499f1" containerName="sg-core" containerID="cri-o://8b670641e702b30f1f843231e4d3fffd5a269f476341f1a480f08ad6237fcd52" gracePeriod=30 Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.931813 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a3c1d516-a0fa-48aa-85ff-eaf7b14499f1" containerName="ceilometer-notification-agent" containerID="cri-o://f39248a717e48f968e9a197d197c0c88a3ba6d6eaa5ac83f4e16d94e093946e7" gracePeriod=30 Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.931696 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3c1d516-a0fa-48aa-85ff-eaf7b14499f1","Type":"ContainerStarted","Data":"8f0b13ffc70058ad42fbb5a5265f7002eb86ee0cc93a02d718884305221044d1"} Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.931884 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.931995 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a3c1d516-a0fa-48aa-85ff-eaf7b14499f1" containerName="proxy-httpd" containerID="cri-o://8f0b13ffc70058ad42fbb5a5265f7002eb86ee0cc93a02d718884305221044d1" gracePeriod=30 Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.935303 4825 generic.go:334] "Generic (PLEG): container finished" podID="256445a5-41a0-47b0-a149-3c31cb2f0959" containerID="c79985f0c20e51e94ef65276b3b0679ff00e78c8af4eb7e94defcd1ac905c499" exitCode=0 Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.935358 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-5vfc4" event={"ID":"256445a5-41a0-47b0-a149-3c31cb2f0959","Type":"ContainerDied","Data":"c79985f0c20e51e94ef65276b3b0679ff00e78c8af4eb7e94defcd1ac905c499"} Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.937807 4825 generic.go:334] "Generic 
(PLEG): container finished" podID="e6533be0-d453-4d29-980d-6be63a601fce" containerID="b62f07a8427cfba96a047aaa4a6ebd9cd84408e32b9a49b383d2817f4848d858" exitCode=0 Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.937860 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-86ff-account-create-update-fh6cl" event={"ID":"e6533be0-d453-4d29-980d-6be63a601fce","Type":"ContainerDied","Data":"b62f07a8427cfba96a047aaa4a6ebd9cd84408e32b9a49b383d2817f4848d858"} Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.949817 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-d132-account-create-update-gqrfl" event={"ID":"cf42b4fa-76fe-4d07-bdc3-c05c94cb4f6f","Type":"ContainerStarted","Data":"c31f3d6f869e260c7cfef4fcd38079b721d8d8d8bf13e9740b4a1faedbf0b849"} Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.949863 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-d132-account-create-update-gqrfl" event={"ID":"cf42b4fa-76fe-4d07-bdc3-c05c94cb4f6f","Type":"ContainerStarted","Data":"f86054c4ac09a36674615b9274b43339a0c81c7726d07419693633b2731f529d"} Mar 10 07:06:33 crc kubenswrapper[4825]: I0310 07:06:33.977055 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 07:06:34 crc kubenswrapper[4825]: I0310 07:06:34.044265 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 07:06:34 crc kubenswrapper[4825]: I0310 07:06:34.046355 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.929071009 podStartE2EDuration="7.046339449s" podCreationTimestamp="2026-03-10 07:06:27 +0000 UTC" firstStartedPulling="2026-03-10 07:06:28.705653743 +0000 UTC m=+1341.735434358" lastFinishedPulling="2026-03-10 07:06:32.822922183 +0000 UTC m=+1345.852702798" observedRunningTime="2026-03-10 07:06:34.002423874 +0000 UTC m=+1347.032204499" watchObservedRunningTime="2026-03-10 07:06:34.046339449 +0000 UTC m=+1347.076120064" Mar 10 07:06:34 crc kubenswrapper[4825]: I0310 07:06:34.075486 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-d132-account-create-update-gqrfl" podStartSLOduration=3.075462878 podStartE2EDuration="3.075462878s" podCreationTimestamp="2026-03-10 07:06:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:06:34.033490974 +0000 UTC m=+1347.063271589" watchObservedRunningTime="2026-03-10 07:06:34.075462878 +0000 UTC m=+1347.105243493" Mar 10 07:06:34 crc kubenswrapper[4825]: I0310 07:06:34.252841 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-vv657" Mar 10 07:06:34 crc kubenswrapper[4825]: I0310 07:06:34.302300 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldm5b\" (UniqueName: \"kubernetes.io/projected/cba79823-f7ea-4ac9-80e0-ce6e216af1bc-kube-api-access-ldm5b\") pod \"cba79823-f7ea-4ac9-80e0-ce6e216af1bc\" (UID: \"cba79823-f7ea-4ac9-80e0-ce6e216af1bc\") " Mar 10 07:06:34 crc kubenswrapper[4825]: I0310 07:06:34.302600 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cba79823-f7ea-4ac9-80e0-ce6e216af1bc-operator-scripts\") pod \"cba79823-f7ea-4ac9-80e0-ce6e216af1bc\" (UID: \"cba79823-f7ea-4ac9-80e0-ce6e216af1bc\") " Mar 10 07:06:34 crc kubenswrapper[4825]: I0310 07:06:34.303608 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cba79823-f7ea-4ac9-80e0-ce6e216af1bc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cba79823-f7ea-4ac9-80e0-ce6e216af1bc" (UID: "cba79823-f7ea-4ac9-80e0-ce6e216af1bc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:06:34 crc kubenswrapper[4825]: I0310 07:06:34.315360 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cba79823-f7ea-4ac9-80e0-ce6e216af1bc-kube-api-access-ldm5b" (OuterVolumeSpecName: "kube-api-access-ldm5b") pod "cba79823-f7ea-4ac9-80e0-ce6e216af1bc" (UID: "cba79823-f7ea-4ac9-80e0-ce6e216af1bc"). InnerVolumeSpecName "kube-api-access-ldm5b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:06:34 crc kubenswrapper[4825]: I0310 07:06:34.407806 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cba79823-f7ea-4ac9-80e0-ce6e216af1bc-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:34 crc kubenswrapper[4825]: I0310 07:06:34.408098 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldm5b\" (UniqueName: \"kubernetes.io/projected/cba79823-f7ea-4ac9-80e0-ce6e216af1bc-kube-api-access-ldm5b\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:34 crc kubenswrapper[4825]: W0310 07:06:34.663528 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b3d94ce_f8d8_4653_a6b0_2682b23d834e.slice/crio-0eeecbff2b247ce21995a47c48d1ef0c0ebe8bfcbb15ff5e28bd7ff8f48a6d0c WatchSource:0}: Error finding container 0eeecbff2b247ce21995a47c48d1ef0c0ebe8bfcbb15ff5e28bd7ff8f48a6d0c: Status 404 returned error can't find the container with id 0eeecbff2b247ce21995a47c48d1ef0c0ebe8bfcbb15ff5e28bd7ff8f48a6d0c Mar 10 07:06:34 crc kubenswrapper[4825]: I0310 07:06:34.667147 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 07:06:34 crc kubenswrapper[4825]: I0310 07:06:34.796173 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-bb7dcfc7-n79vf" Mar 10 07:06:34 crc kubenswrapper[4825]: I0310 07:06:34.808837 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-bb7dcfc7-n79vf" Mar 10 07:06:34 crc kubenswrapper[4825]: I0310 07:06:34.982499 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f33e10c8-8be2-46d4-8653-1960855a2a40","Type":"ContainerStarted","Data":"0596a09cf87b1eba90ca5be28a144085a5ee0e4870ea3024df884f81b878aeaf"} Mar 10 
07:06:34 crc kubenswrapper[4825]: I0310 07:06:34.983335 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f33e10c8-8be2-46d4-8653-1960855a2a40","Type":"ContainerStarted","Data":"540452d0c5f041913e598421282abfb7820efbe982dcad500a2958d00def99fe"} Mar 10 07:06:34 crc kubenswrapper[4825]: I0310 07:06:34.989799 4825 generic.go:334] "Generic (PLEG): container finished" podID="a3c1d516-a0fa-48aa-85ff-eaf7b14499f1" containerID="8f0b13ffc70058ad42fbb5a5265f7002eb86ee0cc93a02d718884305221044d1" exitCode=0 Mar 10 07:06:34 crc kubenswrapper[4825]: I0310 07:06:34.989833 4825 generic.go:334] "Generic (PLEG): container finished" podID="a3c1d516-a0fa-48aa-85ff-eaf7b14499f1" containerID="8b670641e702b30f1f843231e4d3fffd5a269f476341f1a480f08ad6237fcd52" exitCode=2 Mar 10 07:06:34 crc kubenswrapper[4825]: I0310 07:06:34.989842 4825 generic.go:334] "Generic (PLEG): container finished" podID="a3c1d516-a0fa-48aa-85ff-eaf7b14499f1" containerID="f39248a717e48f968e9a197d197c0c88a3ba6d6eaa5ac83f4e16d94e093946e7" exitCode=0 Mar 10 07:06:34 crc kubenswrapper[4825]: I0310 07:06:34.989895 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3c1d516-a0fa-48aa-85ff-eaf7b14499f1","Type":"ContainerDied","Data":"8f0b13ffc70058ad42fbb5a5265f7002eb86ee0cc93a02d718884305221044d1"} Mar 10 07:06:34 crc kubenswrapper[4825]: I0310 07:06:34.989967 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3c1d516-a0fa-48aa-85ff-eaf7b14499f1","Type":"ContainerDied","Data":"8b670641e702b30f1f843231e4d3fffd5a269f476341f1a480f08ad6237fcd52"} Mar 10 07:06:34 crc kubenswrapper[4825]: I0310 07:06:34.989987 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3c1d516-a0fa-48aa-85ff-eaf7b14499f1","Type":"ContainerDied","Data":"f39248a717e48f968e9a197d197c0c88a3ba6d6eaa5ac83f4e16d94e093946e7"} Mar 10 07:06:34 crc 
kubenswrapper[4825]: I0310 07:06:34.992647 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8b3d94ce-f8d8-4653-a6b0-2682b23d834e","Type":"ContainerStarted","Data":"0eeecbff2b247ce21995a47c48d1ef0c0ebe8bfcbb15ff5e28bd7ff8f48a6d0c"} Mar 10 07:06:35 crc kubenswrapper[4825]: I0310 07:06:35.000442 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-vv657" Mar 10 07:06:35 crc kubenswrapper[4825]: I0310 07:06:35.001264 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vv657" event={"ID":"cba79823-f7ea-4ac9-80e0-ce6e216af1bc","Type":"ContainerDied","Data":"c08d221efbc444c97f75a7b59fccdd5d5b5590ba9c97c26cae92fce91e144743"} Mar 10 07:06:35 crc kubenswrapper[4825]: I0310 07:06:35.001325 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c08d221efbc444c97f75a7b59fccdd5d5b5590ba9c97c26cae92fce91e144743" Mar 10 07:06:35 crc kubenswrapper[4825]: I0310 07:06:35.003574 4825 generic.go:334] "Generic (PLEG): container finished" podID="cf42b4fa-76fe-4d07-bdc3-c05c94cb4f6f" containerID="c31f3d6f869e260c7cfef4fcd38079b721d8d8d8bf13e9740b4a1faedbf0b849" exitCode=0 Mar 10 07:06:35 crc kubenswrapper[4825]: I0310 07:06:35.003636 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-d132-account-create-update-gqrfl" event={"ID":"cf42b4fa-76fe-4d07-bdc3-c05c94cb4f6f","Type":"ContainerDied","Data":"c31f3d6f869e260c7cfef4fcd38079b721d8d8d8bf13e9740b4a1faedbf0b849"} Mar 10 07:06:35 crc kubenswrapper[4825]: I0310 07:06:35.249249 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6f252a1-68f2-4f9a-ade7-0b979581b8c6" path="/var/lib/kubelet/pods/d6f252a1-68f2-4f9a-ade7-0b979581b8c6/volumes" Mar 10 07:06:35 crc kubenswrapper[4825]: I0310 07:06:35.608121 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/neutron-67bc54d95c-r8n6n" Mar 10 07:06:35 crc kubenswrapper[4825]: I0310 07:06:35.608823 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2a9b-account-create-update-zq6mk" Mar 10 07:06:35 crc kubenswrapper[4825]: I0310 07:06:35.617778 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-86ff-account-create-update-fh6cl" Mar 10 07:06:35 crc kubenswrapper[4825]: I0310 07:06:35.661174 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-5vfc4" Mar 10 07:06:35 crc kubenswrapper[4825]: I0310 07:06:35.671539 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6533be0-d453-4d29-980d-6be63a601fce-operator-scripts\") pod \"e6533be0-d453-4d29-980d-6be63a601fce\" (UID: \"e6533be0-d453-4d29-980d-6be63a601fce\") " Mar 10 07:06:35 crc kubenswrapper[4825]: I0310 07:06:35.671621 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hn9cn\" (UniqueName: \"kubernetes.io/projected/e6533be0-d453-4d29-980d-6be63a601fce-kube-api-access-hn9cn\") pod \"e6533be0-d453-4d29-980d-6be63a601fce\" (UID: \"e6533be0-d453-4d29-980d-6be63a601fce\") " Mar 10 07:06:35 crc kubenswrapper[4825]: I0310 07:06:35.671762 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktrcg\" (UniqueName: \"kubernetes.io/projected/1e84f490-c283-41ba-92af-bca6b65b95cf-kube-api-access-ktrcg\") pod \"1e84f490-c283-41ba-92af-bca6b65b95cf\" (UID: \"1e84f490-c283-41ba-92af-bca6b65b95cf\") " Mar 10 07:06:35 crc kubenswrapper[4825]: I0310 07:06:35.671795 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e84f490-c283-41ba-92af-bca6b65b95cf-operator-scripts\") pod 
\"1e84f490-c283-41ba-92af-bca6b65b95cf\" (UID: \"1e84f490-c283-41ba-92af-bca6b65b95cf\") " Mar 10 07:06:35 crc kubenswrapper[4825]: I0310 07:06:35.678507 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6533be0-d453-4d29-980d-6be63a601fce-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e6533be0-d453-4d29-980d-6be63a601fce" (UID: "e6533be0-d453-4d29-980d-6be63a601fce"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:06:35 crc kubenswrapper[4825]: I0310 07:06:35.679376 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e84f490-c283-41ba-92af-bca6b65b95cf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1e84f490-c283-41ba-92af-bca6b65b95cf" (UID: "1e84f490-c283-41ba-92af-bca6b65b95cf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:06:35 crc kubenswrapper[4825]: I0310 07:06:35.680680 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6533be0-d453-4d29-980d-6be63a601fce-kube-api-access-hn9cn" (OuterVolumeSpecName: "kube-api-access-hn9cn") pod "e6533be0-d453-4d29-980d-6be63a601fce" (UID: "e6533be0-d453-4d29-980d-6be63a601fce"). InnerVolumeSpecName "kube-api-access-hn9cn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:06:35 crc kubenswrapper[4825]: I0310 07:06:35.688779 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e84f490-c283-41ba-92af-bca6b65b95cf-kube-api-access-ktrcg" (OuterVolumeSpecName: "kube-api-access-ktrcg") pod "1e84f490-c283-41ba-92af-bca6b65b95cf" (UID: "1e84f490-c283-41ba-92af-bca6b65b95cf"). InnerVolumeSpecName "kube-api-access-ktrcg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:06:35 crc kubenswrapper[4825]: I0310 07:06:35.695037 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-t5vjd" Mar 10 07:06:35 crc kubenswrapper[4825]: I0310 07:06:35.754398 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-75c968ccd4-ln762"] Mar 10 07:06:35 crc kubenswrapper[4825]: I0310 07:06:35.756637 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-75c968ccd4-ln762" podUID="48fa8e66-5488-44b6-aa54-910c521e0060" containerName="neutron-api" containerID="cri-o://856f4ec0ebd37a9f123f95905685e8420cf83491aa030fcabc9f1867c6e6162a" gracePeriod=30 Mar 10 07:06:35 crc kubenswrapper[4825]: I0310 07:06:35.757289 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-75c968ccd4-ln762" podUID="48fa8e66-5488-44b6-aa54-910c521e0060" containerName="neutron-httpd" containerID="cri-o://efe3b17926b62fe634f669ee81be39143fab352af89c32bf35e4700fe1395613" gracePeriod=30 Mar 10 07:06:35 crc kubenswrapper[4825]: I0310 07:06:35.774235 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/256445a5-41a0-47b0-a149-3c31cb2f0959-operator-scripts\") pod \"256445a5-41a0-47b0-a149-3c31cb2f0959\" (UID: \"256445a5-41a0-47b0-a149-3c31cb2f0959\") " Mar 10 07:06:35 crc kubenswrapper[4825]: I0310 07:06:35.774519 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntg4d\" (UniqueName: \"kubernetes.io/projected/256445a5-41a0-47b0-a149-3c31cb2f0959-kube-api-access-ntg4d\") pod \"256445a5-41a0-47b0-a149-3c31cb2f0959\" (UID: \"256445a5-41a0-47b0-a149-3c31cb2f0959\") " Mar 10 07:06:35 crc kubenswrapper[4825]: I0310 07:06:35.774558 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7df341ed-690d-4799-b9a4-18691d7de8d7-operator-scripts\") pod \"7df341ed-690d-4799-b9a4-18691d7de8d7\" (UID: \"7df341ed-690d-4799-b9a4-18691d7de8d7\") " Mar 10 07:06:35 crc kubenswrapper[4825]: I0310 07:06:35.774591 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drg67\" (UniqueName: \"kubernetes.io/projected/7df341ed-690d-4799-b9a4-18691d7de8d7-kube-api-access-drg67\") pod \"7df341ed-690d-4799-b9a4-18691d7de8d7\" (UID: \"7df341ed-690d-4799-b9a4-18691d7de8d7\") " Mar 10 07:06:35 crc kubenswrapper[4825]: I0310 07:06:35.775033 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktrcg\" (UniqueName: \"kubernetes.io/projected/1e84f490-c283-41ba-92af-bca6b65b95cf-kube-api-access-ktrcg\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:35 crc kubenswrapper[4825]: I0310 07:06:35.775053 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e84f490-c283-41ba-92af-bca6b65b95cf-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:35 crc kubenswrapper[4825]: I0310 07:06:35.775062 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6533be0-d453-4d29-980d-6be63a601fce-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:35 crc kubenswrapper[4825]: I0310 07:06:35.775071 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hn9cn\" (UniqueName: \"kubernetes.io/projected/e6533be0-d453-4d29-980d-6be63a601fce-kube-api-access-hn9cn\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:35 crc kubenswrapper[4825]: I0310 07:06:35.778796 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7df341ed-690d-4799-b9a4-18691d7de8d7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7df341ed-690d-4799-b9a4-18691d7de8d7" (UID: 
"7df341ed-690d-4799-b9a4-18691d7de8d7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:06:35 crc kubenswrapper[4825]: I0310 07:06:35.779209 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7df341ed-690d-4799-b9a4-18691d7de8d7-kube-api-access-drg67" (OuterVolumeSpecName: "kube-api-access-drg67") pod "7df341ed-690d-4799-b9a4-18691d7de8d7" (UID: "7df341ed-690d-4799-b9a4-18691d7de8d7"). InnerVolumeSpecName "kube-api-access-drg67". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:06:35 crc kubenswrapper[4825]: I0310 07:06:35.779503 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/256445a5-41a0-47b0-a149-3c31cb2f0959-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "256445a5-41a0-47b0-a149-3c31cb2f0959" (UID: "256445a5-41a0-47b0-a149-3c31cb2f0959"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:06:35 crc kubenswrapper[4825]: I0310 07:06:35.780326 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/256445a5-41a0-47b0-a149-3c31cb2f0959-kube-api-access-ntg4d" (OuterVolumeSpecName: "kube-api-access-ntg4d") pod "256445a5-41a0-47b0-a149-3c31cb2f0959" (UID: "256445a5-41a0-47b0-a149-3c31cb2f0959"). InnerVolumeSpecName "kube-api-access-ntg4d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:06:35 crc kubenswrapper[4825]: I0310 07:06:35.877251 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntg4d\" (UniqueName: \"kubernetes.io/projected/256445a5-41a0-47b0-a149-3c31cb2f0959-kube-api-access-ntg4d\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:35 crc kubenswrapper[4825]: I0310 07:06:35.877545 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7df341ed-690d-4799-b9a4-18691d7de8d7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:35 crc kubenswrapper[4825]: I0310 07:06:35.877555 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drg67\" (UniqueName: \"kubernetes.io/projected/7df341ed-690d-4799-b9a4-18691d7de8d7-kube-api-access-drg67\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:35 crc kubenswrapper[4825]: I0310 07:06:35.877564 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/256445a5-41a0-47b0-a149-3c31cb2f0959-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:36 crc kubenswrapper[4825]: I0310 07:06:36.012903 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f33e10c8-8be2-46d4-8653-1960855a2a40","Type":"ContainerStarted","Data":"eb096eba10587e7f334a63bf327913dd03205b29419eb194a05f626f07286905"} Mar 10 07:06:36 crc kubenswrapper[4825]: I0310 07:06:36.022635 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-86ff-account-create-update-fh6cl" event={"ID":"e6533be0-d453-4d29-980d-6be63a601fce","Type":"ContainerDied","Data":"db664e40bcc7ecd11c565e58fd10d9f24fea441e1bc20c6ec5a204040b1cbda5"} Mar 10 07:06:36 crc kubenswrapper[4825]: I0310 07:06:36.022675 4825 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="db664e40bcc7ecd11c565e58fd10d9f24fea441e1bc20c6ec5a204040b1cbda5" Mar 10 07:06:36 crc kubenswrapper[4825]: I0310 07:06:36.022755 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-86ff-account-create-update-fh6cl" Mar 10 07:06:36 crc kubenswrapper[4825]: I0310 07:06:36.037838 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2a9b-account-create-update-zq6mk" event={"ID":"1e84f490-c283-41ba-92af-bca6b65b95cf","Type":"ContainerDied","Data":"4739830110c0ceec552dcb90152ba05e1772ad885a029601bed6643884a93b32"} Mar 10 07:06:36 crc kubenswrapper[4825]: I0310 07:06:36.037878 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4739830110c0ceec552dcb90152ba05e1772ad885a029601bed6643884a93b32" Mar 10 07:06:36 crc kubenswrapper[4825]: I0310 07:06:36.037883 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2a9b-account-create-update-zq6mk" Mar 10 07:06:36 crc kubenswrapper[4825]: I0310 07:06:36.040963 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.040942915 podStartE2EDuration="4.040942915s" podCreationTimestamp="2026-03-10 07:06:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:06:36.037299261 +0000 UTC m=+1349.067079876" watchObservedRunningTime="2026-03-10 07:06:36.040942915 +0000 UTC m=+1349.070723530" Mar 10 07:06:36 crc kubenswrapper[4825]: I0310 07:06:36.050339 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8b3d94ce-f8d8-4653-a6b0-2682b23d834e","Type":"ContainerStarted","Data":"8c02a587ff89f403a3ba0a4a6efae15db19b3375bb2065fb2488307352268600"} Mar 10 07:06:36 crc kubenswrapper[4825]: I0310 07:06:36.054798 4825 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-5vfc4" event={"ID":"256445a5-41a0-47b0-a149-3c31cb2f0959","Type":"ContainerDied","Data":"638146b015f7d888cc0a8daddf06bbad976bfc702da658e6b798612ee659e4a3"} Mar 10 07:06:36 crc kubenswrapper[4825]: I0310 07:06:36.054839 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="638146b015f7d888cc0a8daddf06bbad976bfc702da658e6b798612ee659e4a3" Mar 10 07:06:36 crc kubenswrapper[4825]: I0310 07:06:36.054919 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-5vfc4" Mar 10 07:06:36 crc kubenswrapper[4825]: I0310 07:06:36.058190 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-t5vjd" event={"ID":"7df341ed-690d-4799-b9a4-18691d7de8d7","Type":"ContainerDied","Data":"aca8d82873eb6ebedc58a1d149608971e5b8bbb749078513c00a646a82e87eeb"} Mar 10 07:06:36 crc kubenswrapper[4825]: I0310 07:06:36.058270 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aca8d82873eb6ebedc58a1d149608971e5b8bbb749078513c00a646a82e87eeb" Mar 10 07:06:36 crc kubenswrapper[4825]: I0310 07:06:36.058197 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-t5vjd" Mar 10 07:06:36 crc kubenswrapper[4825]: I0310 07:06:36.060488 4825 generic.go:334] "Generic (PLEG): container finished" podID="48fa8e66-5488-44b6-aa54-910c521e0060" containerID="efe3b17926b62fe634f669ee81be39143fab352af89c32bf35e4700fe1395613" exitCode=0 Mar 10 07:06:36 crc kubenswrapper[4825]: I0310 07:06:36.060637 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75c968ccd4-ln762" event={"ID":"48fa8e66-5488-44b6-aa54-910c521e0060","Type":"ContainerDied","Data":"efe3b17926b62fe634f669ee81be39143fab352af89c32bf35e4700fe1395613"} Mar 10 07:06:36 crc kubenswrapper[4825]: I0310 07:06:36.427757 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-d132-account-create-update-gqrfl" Mar 10 07:06:36 crc kubenswrapper[4825]: I0310 07:06:36.493997 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gb9rd\" (UniqueName: \"kubernetes.io/projected/cf42b4fa-76fe-4d07-bdc3-c05c94cb4f6f-kube-api-access-gb9rd\") pod \"cf42b4fa-76fe-4d07-bdc3-c05c94cb4f6f\" (UID: \"cf42b4fa-76fe-4d07-bdc3-c05c94cb4f6f\") " Mar 10 07:06:36 crc kubenswrapper[4825]: I0310 07:06:36.494071 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf42b4fa-76fe-4d07-bdc3-c05c94cb4f6f-operator-scripts\") pod \"cf42b4fa-76fe-4d07-bdc3-c05c94cb4f6f\" (UID: \"cf42b4fa-76fe-4d07-bdc3-c05c94cb4f6f\") " Mar 10 07:06:36 crc kubenswrapper[4825]: I0310 07:06:36.495000 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf42b4fa-76fe-4d07-bdc3-c05c94cb4f6f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cf42b4fa-76fe-4d07-bdc3-c05c94cb4f6f" (UID: "cf42b4fa-76fe-4d07-bdc3-c05c94cb4f6f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:06:36 crc kubenswrapper[4825]: I0310 07:06:36.503282 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf42b4fa-76fe-4d07-bdc3-c05c94cb4f6f-kube-api-access-gb9rd" (OuterVolumeSpecName: "kube-api-access-gb9rd") pod "cf42b4fa-76fe-4d07-bdc3-c05c94cb4f6f" (UID: "cf42b4fa-76fe-4d07-bdc3-c05c94cb4f6f"). InnerVolumeSpecName "kube-api-access-gb9rd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:06:36 crc kubenswrapper[4825]: I0310 07:06:36.596039 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gb9rd\" (UniqueName: \"kubernetes.io/projected/cf42b4fa-76fe-4d07-bdc3-c05c94cb4f6f-kube-api-access-gb9rd\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:36 crc kubenswrapper[4825]: I0310 07:06:36.596371 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf42b4fa-76fe-4d07-bdc3-c05c94cb4f6f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:37 crc kubenswrapper[4825]: I0310 07:06:37.072356 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8b3d94ce-f8d8-4653-a6b0-2682b23d834e","Type":"ContainerStarted","Data":"b75f52a9c2b47ee09bd6f6c1c6a4ac378afdcf400070a6ab9a12012c65129fcc"} Mar 10 07:06:37 crc kubenswrapper[4825]: I0310 07:06:37.075195 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-d132-account-create-update-gqrfl" Mar 10 07:06:37 crc kubenswrapper[4825]: I0310 07:06:37.075428 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-d132-account-create-update-gqrfl" event={"ID":"cf42b4fa-76fe-4d07-bdc3-c05c94cb4f6f","Type":"ContainerDied","Data":"f86054c4ac09a36674615b9274b43339a0c81c7726d07419693633b2731f529d"} Mar 10 07:06:37 crc kubenswrapper[4825]: I0310 07:06:37.075493 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f86054c4ac09a36674615b9274b43339a0c81c7726d07419693633b2731f529d" Mar 10 07:06:37 crc kubenswrapper[4825]: I0310 07:06:37.125362 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.125337199 podStartE2EDuration="4.125337199s" podCreationTimestamp="2026-03-10 07:06:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:06:37.121624572 +0000 UTC m=+1350.151405187" watchObservedRunningTime="2026-03-10 07:06:37.125337199 +0000 UTC m=+1350.155117834" Mar 10 07:06:38 crc kubenswrapper[4825]: I0310 07:06:38.084989 4825 generic.go:334] "Generic (PLEG): container finished" podID="48fa8e66-5488-44b6-aa54-910c521e0060" containerID="856f4ec0ebd37a9f123f95905685e8420cf83491aa030fcabc9f1867c6e6162a" exitCode=0 Mar 10 07:06:38 crc kubenswrapper[4825]: I0310 07:06:38.086090 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75c968ccd4-ln762" event={"ID":"48fa8e66-5488-44b6-aa54-910c521e0060","Type":"ContainerDied","Data":"856f4ec0ebd37a9f123f95905685e8420cf83491aa030fcabc9f1867c6e6162a"} Mar 10 07:06:38 crc kubenswrapper[4825]: I0310 07:06:38.375169 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-75c968ccd4-ln762" Mar 10 07:06:38 crc kubenswrapper[4825]: I0310 07:06:38.540391 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/48fa8e66-5488-44b6-aa54-910c521e0060-ovndb-tls-certs\") pod \"48fa8e66-5488-44b6-aa54-910c521e0060\" (UID: \"48fa8e66-5488-44b6-aa54-910c521e0060\") " Mar 10 07:06:38 crc kubenswrapper[4825]: I0310 07:06:38.540524 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mm8wk\" (UniqueName: \"kubernetes.io/projected/48fa8e66-5488-44b6-aa54-910c521e0060-kube-api-access-mm8wk\") pod \"48fa8e66-5488-44b6-aa54-910c521e0060\" (UID: \"48fa8e66-5488-44b6-aa54-910c521e0060\") " Mar 10 07:06:38 crc kubenswrapper[4825]: I0310 07:06:38.540547 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48fa8e66-5488-44b6-aa54-910c521e0060-combined-ca-bundle\") pod \"48fa8e66-5488-44b6-aa54-910c521e0060\" (UID: \"48fa8e66-5488-44b6-aa54-910c521e0060\") " Mar 10 07:06:38 crc kubenswrapper[4825]: I0310 07:06:38.540573 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/48fa8e66-5488-44b6-aa54-910c521e0060-httpd-config\") pod \"48fa8e66-5488-44b6-aa54-910c521e0060\" (UID: \"48fa8e66-5488-44b6-aa54-910c521e0060\") " Mar 10 07:06:38 crc kubenswrapper[4825]: I0310 07:06:38.540624 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/48fa8e66-5488-44b6-aa54-910c521e0060-config\") pod \"48fa8e66-5488-44b6-aa54-910c521e0060\" (UID: \"48fa8e66-5488-44b6-aa54-910c521e0060\") " Mar 10 07:06:38 crc kubenswrapper[4825]: I0310 07:06:38.569107 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/48fa8e66-5488-44b6-aa54-910c521e0060-kube-api-access-mm8wk" (OuterVolumeSpecName: "kube-api-access-mm8wk") pod "48fa8e66-5488-44b6-aa54-910c521e0060" (UID: "48fa8e66-5488-44b6-aa54-910c521e0060"). InnerVolumeSpecName "kube-api-access-mm8wk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:06:38 crc kubenswrapper[4825]: I0310 07:06:38.569921 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48fa8e66-5488-44b6-aa54-910c521e0060-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "48fa8e66-5488-44b6-aa54-910c521e0060" (UID: "48fa8e66-5488-44b6-aa54-910c521e0060"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:38 crc kubenswrapper[4825]: I0310 07:06:38.613569 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48fa8e66-5488-44b6-aa54-910c521e0060-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48fa8e66-5488-44b6-aa54-910c521e0060" (UID: "48fa8e66-5488-44b6-aa54-910c521e0060"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:38 crc kubenswrapper[4825]: I0310 07:06:38.623492 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48fa8e66-5488-44b6-aa54-910c521e0060-config" (OuterVolumeSpecName: "config") pod "48fa8e66-5488-44b6-aa54-910c521e0060" (UID: "48fa8e66-5488-44b6-aa54-910c521e0060"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:38 crc kubenswrapper[4825]: I0310 07:06:38.642272 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48fa8e66-5488-44b6-aa54-910c521e0060-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "48fa8e66-5488-44b6-aa54-910c521e0060" (UID: "48fa8e66-5488-44b6-aa54-910c521e0060"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:38 crc kubenswrapper[4825]: I0310 07:06:38.642308 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48fa8e66-5488-44b6-aa54-910c521e0060-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:38 crc kubenswrapper[4825]: I0310 07:06:38.642335 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mm8wk\" (UniqueName: \"kubernetes.io/projected/48fa8e66-5488-44b6-aa54-910c521e0060-kube-api-access-mm8wk\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:38 crc kubenswrapper[4825]: I0310 07:06:38.642350 4825 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/48fa8e66-5488-44b6-aa54-910c521e0060-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:38 crc kubenswrapper[4825]: I0310 07:06:38.642361 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/48fa8e66-5488-44b6-aa54-910c521e0060-config\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:38 crc kubenswrapper[4825]: I0310 07:06:38.743923 4825 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/48fa8e66-5488-44b6-aa54-910c521e0060-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:39 crc kubenswrapper[4825]: I0310 07:06:39.108317 4825 generic.go:334] "Generic (PLEG): container finished" podID="a3c1d516-a0fa-48aa-85ff-eaf7b14499f1" containerID="d8757f7dcc14f6f88cccb8185df4916056abc0f519fa5e71f15ef87ce6d9c00a" exitCode=0 Mar 10 07:06:39 crc kubenswrapper[4825]: I0310 07:06:39.108414 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3c1d516-a0fa-48aa-85ff-eaf7b14499f1","Type":"ContainerDied","Data":"d8757f7dcc14f6f88cccb8185df4916056abc0f519fa5e71f15ef87ce6d9c00a"} Mar 10 07:06:39 crc kubenswrapper[4825]: I0310 07:06:39.111798 
4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75c968ccd4-ln762" event={"ID":"48fa8e66-5488-44b6-aa54-910c521e0060","Type":"ContainerDied","Data":"7381bfb4553887c1b9ec30f844dcffc1a5b290a510e400267251853793044422"} Mar 10 07:06:39 crc kubenswrapper[4825]: I0310 07:06:39.111850 4825 scope.go:117] "RemoveContainer" containerID="efe3b17926b62fe634f669ee81be39143fab352af89c32bf35e4700fe1395613" Mar 10 07:06:39 crc kubenswrapper[4825]: I0310 07:06:39.112012 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-75c968ccd4-ln762" Mar 10 07:06:39 crc kubenswrapper[4825]: I0310 07:06:39.141282 4825 scope.go:117] "RemoveContainer" containerID="856f4ec0ebd37a9f123f95905685e8420cf83491aa030fcabc9f1867c6e6162a" Mar 10 07:06:39 crc kubenswrapper[4825]: I0310 07:06:39.158676 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-75c968ccd4-ln762"] Mar 10 07:06:39 crc kubenswrapper[4825]: I0310 07:06:39.167277 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-75c968ccd4-ln762"] Mar 10 07:06:39 crc kubenswrapper[4825]: I0310 07:06:39.248484 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48fa8e66-5488-44b6-aa54-910c521e0060" path="/var/lib/kubelet/pods/48fa8e66-5488-44b6-aa54-910c521e0060/volumes" Mar 10 07:06:39 crc kubenswrapper[4825]: I0310 07:06:39.294273 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 07:06:39 crc kubenswrapper[4825]: I0310 07:06:39.456946 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a3c1d516-a0fa-48aa-85ff-eaf7b14499f1-sg-core-conf-yaml\") pod \"a3c1d516-a0fa-48aa-85ff-eaf7b14499f1\" (UID: \"a3c1d516-a0fa-48aa-85ff-eaf7b14499f1\") " Mar 10 07:06:39 crc kubenswrapper[4825]: I0310 07:06:39.457261 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3c1d516-a0fa-48aa-85ff-eaf7b14499f1-scripts\") pod \"a3c1d516-a0fa-48aa-85ff-eaf7b14499f1\" (UID: \"a3c1d516-a0fa-48aa-85ff-eaf7b14499f1\") " Mar 10 07:06:39 crc kubenswrapper[4825]: I0310 07:06:39.457412 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfzm6\" (UniqueName: \"kubernetes.io/projected/a3c1d516-a0fa-48aa-85ff-eaf7b14499f1-kube-api-access-mfzm6\") pod \"a3c1d516-a0fa-48aa-85ff-eaf7b14499f1\" (UID: \"a3c1d516-a0fa-48aa-85ff-eaf7b14499f1\") " Mar 10 07:06:39 crc kubenswrapper[4825]: I0310 07:06:39.457432 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3c1d516-a0fa-48aa-85ff-eaf7b14499f1-combined-ca-bundle\") pod \"a3c1d516-a0fa-48aa-85ff-eaf7b14499f1\" (UID: \"a3c1d516-a0fa-48aa-85ff-eaf7b14499f1\") " Mar 10 07:06:39 crc kubenswrapper[4825]: I0310 07:06:39.457926 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3c1d516-a0fa-48aa-85ff-eaf7b14499f1-config-data\") pod \"a3c1d516-a0fa-48aa-85ff-eaf7b14499f1\" (UID: \"a3c1d516-a0fa-48aa-85ff-eaf7b14499f1\") " Mar 10 07:06:39 crc kubenswrapper[4825]: I0310 07:06:39.457998 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/a3c1d516-a0fa-48aa-85ff-eaf7b14499f1-run-httpd\") pod \"a3c1d516-a0fa-48aa-85ff-eaf7b14499f1\" (UID: \"a3c1d516-a0fa-48aa-85ff-eaf7b14499f1\") " Mar 10 07:06:39 crc kubenswrapper[4825]: I0310 07:06:39.458034 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3c1d516-a0fa-48aa-85ff-eaf7b14499f1-log-httpd\") pod \"a3c1d516-a0fa-48aa-85ff-eaf7b14499f1\" (UID: \"a3c1d516-a0fa-48aa-85ff-eaf7b14499f1\") " Mar 10 07:06:39 crc kubenswrapper[4825]: I0310 07:06:39.458553 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3c1d516-a0fa-48aa-85ff-eaf7b14499f1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a3c1d516-a0fa-48aa-85ff-eaf7b14499f1" (UID: "a3c1d516-a0fa-48aa-85ff-eaf7b14499f1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:06:39 crc kubenswrapper[4825]: I0310 07:06:39.458906 4825 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3c1d516-a0fa-48aa-85ff-eaf7b14499f1-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:39 crc kubenswrapper[4825]: I0310 07:06:39.458988 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3c1d516-a0fa-48aa-85ff-eaf7b14499f1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a3c1d516-a0fa-48aa-85ff-eaf7b14499f1" (UID: "a3c1d516-a0fa-48aa-85ff-eaf7b14499f1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:06:39 crc kubenswrapper[4825]: I0310 07:06:39.464265 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3c1d516-a0fa-48aa-85ff-eaf7b14499f1-kube-api-access-mfzm6" (OuterVolumeSpecName: "kube-api-access-mfzm6") pod "a3c1d516-a0fa-48aa-85ff-eaf7b14499f1" (UID: "a3c1d516-a0fa-48aa-85ff-eaf7b14499f1"). 
InnerVolumeSpecName "kube-api-access-mfzm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:06:39 crc kubenswrapper[4825]: I0310 07:06:39.482343 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3c1d516-a0fa-48aa-85ff-eaf7b14499f1-scripts" (OuterVolumeSpecName: "scripts") pod "a3c1d516-a0fa-48aa-85ff-eaf7b14499f1" (UID: "a3c1d516-a0fa-48aa-85ff-eaf7b14499f1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:39 crc kubenswrapper[4825]: I0310 07:06:39.492006 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3c1d516-a0fa-48aa-85ff-eaf7b14499f1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a3c1d516-a0fa-48aa-85ff-eaf7b14499f1" (UID: "a3c1d516-a0fa-48aa-85ff-eaf7b14499f1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:39 crc kubenswrapper[4825]: I0310 07:06:39.539021 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3c1d516-a0fa-48aa-85ff-eaf7b14499f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a3c1d516-a0fa-48aa-85ff-eaf7b14499f1" (UID: "a3c1d516-a0fa-48aa-85ff-eaf7b14499f1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:39 crc kubenswrapper[4825]: I0310 07:06:39.560425 4825 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a3c1d516-a0fa-48aa-85ff-eaf7b14499f1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:39 crc kubenswrapper[4825]: I0310 07:06:39.560721 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3c1d516-a0fa-48aa-85ff-eaf7b14499f1-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:39 crc kubenswrapper[4825]: I0310 07:06:39.560812 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfzm6\" (UniqueName: \"kubernetes.io/projected/a3c1d516-a0fa-48aa-85ff-eaf7b14499f1-kube-api-access-mfzm6\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:39 crc kubenswrapper[4825]: I0310 07:06:39.560906 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3c1d516-a0fa-48aa-85ff-eaf7b14499f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:39 crc kubenswrapper[4825]: I0310 07:06:39.560981 4825 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3c1d516-a0fa-48aa-85ff-eaf7b14499f1-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:39 crc kubenswrapper[4825]: I0310 07:06:39.592639 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3c1d516-a0fa-48aa-85ff-eaf7b14499f1-config-data" (OuterVolumeSpecName: "config-data") pod "a3c1d516-a0fa-48aa-85ff-eaf7b14499f1" (UID: "a3c1d516-a0fa-48aa-85ff-eaf7b14499f1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:39 crc kubenswrapper[4825]: I0310 07:06:39.663639 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3c1d516-a0fa-48aa-85ff-eaf7b14499f1-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.137746 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3c1d516-a0fa-48aa-85ff-eaf7b14499f1","Type":"ContainerDied","Data":"7b858dcd330bc0928600af08991657f9b3d192405704048521753178c3186489"} Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.137808 4825 scope.go:117] "RemoveContainer" containerID="8f0b13ffc70058ad42fbb5a5265f7002eb86ee0cc93a02d718884305221044d1" Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.137891 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.175345 4825 scope.go:117] "RemoveContainer" containerID="8b670641e702b30f1f843231e4d3fffd5a269f476341f1a480f08ad6237fcd52" Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.201884 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.213726 4825 scope.go:117] "RemoveContainer" containerID="f39248a717e48f968e9a197d197c0c88a3ba6d6eaa5ac83f4e16d94e093946e7" Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.226034 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.248897 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:06:40 crc kubenswrapper[4825]: E0310 07:06:40.249457 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48fa8e66-5488-44b6-aa54-910c521e0060" containerName="neutron-api" Mar 10 07:06:40 crc 
kubenswrapper[4825]: I0310 07:06:40.249481 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="48fa8e66-5488-44b6-aa54-910c521e0060" containerName="neutron-api"
Mar 10 07:06:40 crc kubenswrapper[4825]: E0310 07:06:40.249500 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3c1d516-a0fa-48aa-85ff-eaf7b14499f1" containerName="ceilometer-central-agent"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.249510 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c1d516-a0fa-48aa-85ff-eaf7b14499f1" containerName="ceilometer-central-agent"
Mar 10 07:06:40 crc kubenswrapper[4825]: E0310 07:06:40.249529 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6533be0-d453-4d29-980d-6be63a601fce" containerName="mariadb-account-create-update"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.249536 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6533be0-d453-4d29-980d-6be63a601fce" containerName="mariadb-account-create-update"
Mar 10 07:06:40 crc kubenswrapper[4825]: E0310 07:06:40.249552 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48fa8e66-5488-44b6-aa54-910c521e0060" containerName="neutron-httpd"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.249559 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="48fa8e66-5488-44b6-aa54-910c521e0060" containerName="neutron-httpd"
Mar 10 07:06:40 crc kubenswrapper[4825]: E0310 07:06:40.249573 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3c1d516-a0fa-48aa-85ff-eaf7b14499f1" containerName="ceilometer-notification-agent"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.249581 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c1d516-a0fa-48aa-85ff-eaf7b14499f1" containerName="ceilometer-notification-agent"
Mar 10 07:06:40 crc kubenswrapper[4825]: E0310 07:06:40.249590 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="256445a5-41a0-47b0-a149-3c31cb2f0959" containerName="mariadb-database-create"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.249597 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="256445a5-41a0-47b0-a149-3c31cb2f0959" containerName="mariadb-database-create"
Mar 10 07:06:40 crc kubenswrapper[4825]: E0310 07:06:40.249612 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cba79823-f7ea-4ac9-80e0-ce6e216af1bc" containerName="mariadb-database-create"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.249620 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="cba79823-f7ea-4ac9-80e0-ce6e216af1bc" containerName="mariadb-database-create"
Mar 10 07:06:40 crc kubenswrapper[4825]: E0310 07:06:40.249640 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf42b4fa-76fe-4d07-bdc3-c05c94cb4f6f" containerName="mariadb-account-create-update"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.249648 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf42b4fa-76fe-4d07-bdc3-c05c94cb4f6f" containerName="mariadb-account-create-update"
Mar 10 07:06:40 crc kubenswrapper[4825]: E0310 07:06:40.249663 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3c1d516-a0fa-48aa-85ff-eaf7b14499f1" containerName="sg-core"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.249671 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c1d516-a0fa-48aa-85ff-eaf7b14499f1" containerName="sg-core"
Mar 10 07:06:40 crc kubenswrapper[4825]: E0310 07:06:40.249690 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7df341ed-690d-4799-b9a4-18691d7de8d7" containerName="mariadb-database-create"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.249698 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="7df341ed-690d-4799-b9a4-18691d7de8d7" containerName="mariadb-database-create"
Mar 10 07:06:40 crc kubenswrapper[4825]: E0310 07:06:40.249715 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3c1d516-a0fa-48aa-85ff-eaf7b14499f1" containerName="proxy-httpd"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.249724 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c1d516-a0fa-48aa-85ff-eaf7b14499f1" containerName="proxy-httpd"
Mar 10 07:06:40 crc kubenswrapper[4825]: E0310 07:06:40.249741 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e84f490-c283-41ba-92af-bca6b65b95cf" containerName="mariadb-account-create-update"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.249749 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e84f490-c283-41ba-92af-bca6b65b95cf" containerName="mariadb-account-create-update"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.249947 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3c1d516-a0fa-48aa-85ff-eaf7b14499f1" containerName="ceilometer-central-agent"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.249959 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf42b4fa-76fe-4d07-bdc3-c05c94cb4f6f" containerName="mariadb-account-create-update"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.249972 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e84f490-c283-41ba-92af-bca6b65b95cf" containerName="mariadb-account-create-update"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.249983 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="48fa8e66-5488-44b6-aa54-910c521e0060" containerName="neutron-api"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.249997 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3c1d516-a0fa-48aa-85ff-eaf7b14499f1" containerName="proxy-httpd"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.250011 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="256445a5-41a0-47b0-a149-3c31cb2f0959" containerName="mariadb-database-create"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.250022 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3c1d516-a0fa-48aa-85ff-eaf7b14499f1" containerName="sg-core"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.250034 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3c1d516-a0fa-48aa-85ff-eaf7b14499f1" containerName="ceilometer-notification-agent"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.250041 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6533be0-d453-4d29-980d-6be63a601fce" containerName="mariadb-account-create-update"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.250055 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="7df341ed-690d-4799-b9a4-18691d7de8d7" containerName="mariadb-database-create"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.250069 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="48fa8e66-5488-44b6-aa54-910c521e0060" containerName="neutron-httpd"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.250087 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="cba79823-f7ea-4ac9-80e0-ce6e216af1bc" containerName="mariadb-database-create"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.252691 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.253577 4825 scope.go:117] "RemoveContainer" containerID="d8757f7dcc14f6f88cccb8185df4916056abc0f519fa5e71f15ef87ce6d9c00a"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.255661 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.256370 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.273587 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.375468 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c647c98-8aca-46cc-bc00-02bf50500321-config-data\") pod \"ceilometer-0\" (UID: \"9c647c98-8aca-46cc-bc00-02bf50500321\") " pod="openstack/ceilometer-0"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.375584 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c647c98-8aca-46cc-bc00-02bf50500321-scripts\") pod \"ceilometer-0\" (UID: \"9c647c98-8aca-46cc-bc00-02bf50500321\") " pod="openstack/ceilometer-0"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.375724 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c647c98-8aca-46cc-bc00-02bf50500321-log-httpd\") pod \"ceilometer-0\" (UID: \"9c647c98-8aca-46cc-bc00-02bf50500321\") " pod="openstack/ceilometer-0"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.375765 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9c647c98-8aca-46cc-bc00-02bf50500321-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9c647c98-8aca-46cc-bc00-02bf50500321\") " pod="openstack/ceilometer-0"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.375828 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c647c98-8aca-46cc-bc00-02bf50500321-run-httpd\") pod \"ceilometer-0\" (UID: \"9c647c98-8aca-46cc-bc00-02bf50500321\") " pod="openstack/ceilometer-0"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.375868 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c647c98-8aca-46cc-bc00-02bf50500321-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9c647c98-8aca-46cc-bc00-02bf50500321\") " pod="openstack/ceilometer-0"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.375890 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqff5\" (UniqueName: \"kubernetes.io/projected/9c647c98-8aca-46cc-bc00-02bf50500321-kube-api-access-mqff5\") pod \"ceilometer-0\" (UID: \"9c647c98-8aca-46cc-bc00-02bf50500321\") " pod="openstack/ceilometer-0"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.477229 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c647c98-8aca-46cc-bc00-02bf50500321-log-httpd\") pod \"ceilometer-0\" (UID: \"9c647c98-8aca-46cc-bc00-02bf50500321\") " pod="openstack/ceilometer-0"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.477288 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9c647c98-8aca-46cc-bc00-02bf50500321-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9c647c98-8aca-46cc-bc00-02bf50500321\") " pod="openstack/ceilometer-0"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.477330 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c647c98-8aca-46cc-bc00-02bf50500321-run-httpd\") pod \"ceilometer-0\" (UID: \"9c647c98-8aca-46cc-bc00-02bf50500321\") " pod="openstack/ceilometer-0"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.477359 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c647c98-8aca-46cc-bc00-02bf50500321-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9c647c98-8aca-46cc-bc00-02bf50500321\") " pod="openstack/ceilometer-0"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.477382 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqff5\" (UniqueName: \"kubernetes.io/projected/9c647c98-8aca-46cc-bc00-02bf50500321-kube-api-access-mqff5\") pod \"ceilometer-0\" (UID: \"9c647c98-8aca-46cc-bc00-02bf50500321\") " pod="openstack/ceilometer-0"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.477414 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c647c98-8aca-46cc-bc00-02bf50500321-config-data\") pod \"ceilometer-0\" (UID: \"9c647c98-8aca-46cc-bc00-02bf50500321\") " pod="openstack/ceilometer-0"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.477469 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c647c98-8aca-46cc-bc00-02bf50500321-scripts\") pod \"ceilometer-0\" (UID: \"9c647c98-8aca-46cc-bc00-02bf50500321\") " pod="openstack/ceilometer-0"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.477868 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c647c98-8aca-46cc-bc00-02bf50500321-log-httpd\") pod \"ceilometer-0\" (UID: \"9c647c98-8aca-46cc-bc00-02bf50500321\") " pod="openstack/ceilometer-0"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.478695 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c647c98-8aca-46cc-bc00-02bf50500321-run-httpd\") pod \"ceilometer-0\" (UID: \"9c647c98-8aca-46cc-bc00-02bf50500321\") " pod="openstack/ceilometer-0"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.482015 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9c647c98-8aca-46cc-bc00-02bf50500321-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9c647c98-8aca-46cc-bc00-02bf50500321\") " pod="openstack/ceilometer-0"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.482286 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c647c98-8aca-46cc-bc00-02bf50500321-config-data\") pod \"ceilometer-0\" (UID: \"9c647c98-8aca-46cc-bc00-02bf50500321\") " pod="openstack/ceilometer-0"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.482779 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c647c98-8aca-46cc-bc00-02bf50500321-scripts\") pod \"ceilometer-0\" (UID: \"9c647c98-8aca-46cc-bc00-02bf50500321\") " pod="openstack/ceilometer-0"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.483005 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c647c98-8aca-46cc-bc00-02bf50500321-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9c647c98-8aca-46cc-bc00-02bf50500321\") " pod="openstack/ceilometer-0"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.497412 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqff5\" (UniqueName: \"kubernetes.io/projected/9c647c98-8aca-46cc-bc00-02bf50500321-kube-api-access-mqff5\") pod \"ceilometer-0\" (UID: \"9c647c98-8aca-46cc-bc00-02bf50500321\") " pod="openstack/ceilometer-0"
Mar 10 07:06:40 crc kubenswrapper[4825]: I0310 07:06:40.573215 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 07:06:41 crc kubenswrapper[4825]: I0310 07:06:41.032929 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 07:06:41 crc kubenswrapper[4825]: I0310 07:06:41.148266 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c647c98-8aca-46cc-bc00-02bf50500321","Type":"ContainerStarted","Data":"78ffa0ae775591b504d9349dc310c88e4ba34c43b69ad51d7e12948ea5227a72"}
Mar 10 07:06:41 crc kubenswrapper[4825]: I0310 07:06:41.247739 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3c1d516-a0fa-48aa-85ff-eaf7b14499f1" path="/var/lib/kubelet/pods/a3c1d516-a0fa-48aa-85ff-eaf7b14499f1/volumes"
Mar 10 07:06:41 crc kubenswrapper[4825]: I0310 07:06:41.355650 4825 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podb960a8d7-4706-4657-9991-84a87645ab8a"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podb960a8d7-4706-4657-9991-84a87645ab8a] : Timed out while waiting for systemd to remove kubepods-besteffort-podb960a8d7_4706_4657_9991_84a87645ab8a.slice"
Mar 10 07:06:41 crc kubenswrapper[4825]: I0310 07:06:41.435524 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-55b54bdfdb-8b9ns"
Mar 10 07:06:41 crc kubenswrapper[4825]: I0310 07:06:41.489991 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-55b54bdfdb-8b9ns"
Mar 10 07:06:41 crc kubenswrapper[4825]: I0310 07:06:41.573068 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6b684b6dd6-r5vgn"]
Mar 10 07:06:41 crc kubenswrapper[4825]: I0310 07:06:41.573326 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6b684b6dd6-r5vgn" podUID="2d23be6c-5ecc-495c-b45c-46b79ccbdcf5" containerName="placement-log" containerID="cri-o://98cef233b9fa81351c21de237de8599c342df210bb40854490688bca71f84674" gracePeriod=30
Mar 10 07:06:41 crc kubenswrapper[4825]: I0310 07:06:41.573737 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6b684b6dd6-r5vgn" podUID="2d23be6c-5ecc-495c-b45c-46b79ccbdcf5" containerName="placement-api" containerID="cri-o://1dd1b73af2989fabf8ad6057b797d997323d290591c75c2def399b0331c06c79" gracePeriod=30
Mar 10 07:06:41 crc kubenswrapper[4825]: I0310 07:06:41.939468 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-k7b7g"]
Mar 10 07:06:41 crc kubenswrapper[4825]: I0310 07:06:41.940892 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-k7b7g"
Mar 10 07:06:41 crc kubenswrapper[4825]: I0310 07:06:41.943796 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Mar 10 07:06:41 crc kubenswrapper[4825]: I0310 07:06:41.943944 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Mar 10 07:06:41 crc kubenswrapper[4825]: I0310 07:06:41.944223 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-x45l4"
Mar 10 07:06:41 crc kubenswrapper[4825]: I0310 07:06:41.960518 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-k7b7g"]
Mar 10 07:06:42 crc kubenswrapper[4825]: I0310 07:06:42.106422 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/663403bd-27d6-4584-9aaa-859445f539d6-scripts\") pod \"nova-cell0-conductor-db-sync-k7b7g\" (UID: \"663403bd-27d6-4584-9aaa-859445f539d6\") " pod="openstack/nova-cell0-conductor-db-sync-k7b7g"
Mar 10 07:06:42 crc kubenswrapper[4825]: I0310 07:06:42.106582 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/663403bd-27d6-4584-9aaa-859445f539d6-config-data\") pod \"nova-cell0-conductor-db-sync-k7b7g\" (UID: \"663403bd-27d6-4584-9aaa-859445f539d6\") " pod="openstack/nova-cell0-conductor-db-sync-k7b7g"
Mar 10 07:06:42 crc kubenswrapper[4825]: I0310 07:06:42.106607 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cksss\" (UniqueName: \"kubernetes.io/projected/663403bd-27d6-4584-9aaa-859445f539d6-kube-api-access-cksss\") pod \"nova-cell0-conductor-db-sync-k7b7g\" (UID: \"663403bd-27d6-4584-9aaa-859445f539d6\") " pod="openstack/nova-cell0-conductor-db-sync-k7b7g"
Mar 10 07:06:42 crc kubenswrapper[4825]: I0310 07:06:42.106631 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/663403bd-27d6-4584-9aaa-859445f539d6-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-k7b7g\" (UID: \"663403bd-27d6-4584-9aaa-859445f539d6\") " pod="openstack/nova-cell0-conductor-db-sync-k7b7g"
Mar 10 07:06:42 crc kubenswrapper[4825]: I0310 07:06:42.159504 4825 generic.go:334] "Generic (PLEG): container finished" podID="2d23be6c-5ecc-495c-b45c-46b79ccbdcf5" containerID="98cef233b9fa81351c21de237de8599c342df210bb40854490688bca71f84674" exitCode=143
Mar 10 07:06:42 crc kubenswrapper[4825]: I0310 07:06:42.159570 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b684b6dd6-r5vgn" event={"ID":"2d23be6c-5ecc-495c-b45c-46b79ccbdcf5","Type":"ContainerDied","Data":"98cef233b9fa81351c21de237de8599c342df210bb40854490688bca71f84674"}
Mar 10 07:06:42 crc kubenswrapper[4825]: I0310 07:06:42.161157 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c647c98-8aca-46cc-bc00-02bf50500321","Type":"ContainerStarted","Data":"88d73553fa50a5bca19b77f26ea09c721d2c802bb054b7c91d38d67f0e15583f"}
Mar 10 07:06:42 crc kubenswrapper[4825]: I0310 07:06:42.207964 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cksss\" (UniqueName: \"kubernetes.io/projected/663403bd-27d6-4584-9aaa-859445f539d6-kube-api-access-cksss\") pod \"nova-cell0-conductor-db-sync-k7b7g\" (UID: \"663403bd-27d6-4584-9aaa-859445f539d6\") " pod="openstack/nova-cell0-conductor-db-sync-k7b7g"
Mar 10 07:06:42 crc kubenswrapper[4825]: I0310 07:06:42.208021 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/663403bd-27d6-4584-9aaa-859445f539d6-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-k7b7g\" (UID: \"663403bd-27d6-4584-9aaa-859445f539d6\") " pod="openstack/nova-cell0-conductor-db-sync-k7b7g"
Mar 10 07:06:42 crc kubenswrapper[4825]: I0310 07:06:42.208102 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/663403bd-27d6-4584-9aaa-859445f539d6-scripts\") pod \"nova-cell0-conductor-db-sync-k7b7g\" (UID: \"663403bd-27d6-4584-9aaa-859445f539d6\") " pod="openstack/nova-cell0-conductor-db-sync-k7b7g"
Mar 10 07:06:42 crc kubenswrapper[4825]: I0310 07:06:42.208217 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/663403bd-27d6-4584-9aaa-859445f539d6-config-data\") pod \"nova-cell0-conductor-db-sync-k7b7g\" (UID: \"663403bd-27d6-4584-9aaa-859445f539d6\") " pod="openstack/nova-cell0-conductor-db-sync-k7b7g"
Mar 10 07:06:42 crc kubenswrapper[4825]: I0310 07:06:42.212462 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/663403bd-27d6-4584-9aaa-859445f539d6-scripts\") pod \"nova-cell0-conductor-db-sync-k7b7g\" (UID: \"663403bd-27d6-4584-9aaa-859445f539d6\") " pod="openstack/nova-cell0-conductor-db-sync-k7b7g"
Mar 10 07:06:42 crc kubenswrapper[4825]: I0310 07:06:42.213752 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/663403bd-27d6-4584-9aaa-859445f539d6-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-k7b7g\" (UID: \"663403bd-27d6-4584-9aaa-859445f539d6\") " pod="openstack/nova-cell0-conductor-db-sync-k7b7g"
Mar 10 07:06:42 crc kubenswrapper[4825]: I0310 07:06:42.215180 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/663403bd-27d6-4584-9aaa-859445f539d6-config-data\") pod \"nova-cell0-conductor-db-sync-k7b7g\" (UID: \"663403bd-27d6-4584-9aaa-859445f539d6\") " pod="openstack/nova-cell0-conductor-db-sync-k7b7g"
Mar 10 07:06:42 crc kubenswrapper[4825]: I0310 07:06:42.224183 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cksss\" (UniqueName: \"kubernetes.io/projected/663403bd-27d6-4584-9aaa-859445f539d6-kube-api-access-cksss\") pod \"nova-cell0-conductor-db-sync-k7b7g\" (UID: \"663403bd-27d6-4584-9aaa-859445f539d6\") " pod="openstack/nova-cell0-conductor-db-sync-k7b7g"
Mar 10 07:06:42 crc kubenswrapper[4825]: I0310 07:06:42.267791 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-k7b7g"
Mar 10 07:06:42 crc kubenswrapper[4825]: I0310 07:06:42.653096 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 07:06:42 crc kubenswrapper[4825]: W0310 07:06:42.741363 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod663403bd_27d6_4584_9aaa_859445f539d6.slice/crio-80b236e92ee0d2a145a3f78f6afddee7e24e109c384187d4c40d05e62a8dce61 WatchSource:0}: Error finding container 80b236e92ee0d2a145a3f78f6afddee7e24e109c384187d4c40d05e62a8dce61: Status 404 returned error can't find the container with id 80b236e92ee0d2a145a3f78f6afddee7e24e109c384187d4c40d05e62a8dce61
Mar 10 07:06:42 crc kubenswrapper[4825]: I0310 07:06:42.742893 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-k7b7g"]
Mar 10 07:06:43 crc kubenswrapper[4825]: I0310 07:06:43.170070 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c647c98-8aca-46cc-bc00-02bf50500321","Type":"ContainerStarted","Data":"d622562b5f42f435db6b40bb64c857ae773b508272fe26be239a6df4fd3fc779"}
Mar 10 07:06:43 crc kubenswrapper[4825]: I0310 07:06:43.170115 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c647c98-8aca-46cc-bc00-02bf50500321","Type":"ContainerStarted","Data":"bba109d5fe5e55a82c907945f0e3fe0772bd962a63f9a8d77ed335d3def73f12"}
Mar 10 07:06:43 crc kubenswrapper[4825]: I0310 07:06:43.181096 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-k7b7g" event={"ID":"663403bd-27d6-4584-9aaa-859445f539d6","Type":"ContainerStarted","Data":"80b236e92ee0d2a145a3f78f6afddee7e24e109c384187d4c40d05e62a8dce61"}
Mar 10 07:06:43 crc kubenswrapper[4825]: I0310 07:06:43.556030 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 10 07:06:43 crc kubenswrapper[4825]: I0310 07:06:43.556069 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 10 07:06:43 crc kubenswrapper[4825]: I0310 07:06:43.630588 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 10 07:06:43 crc kubenswrapper[4825]: I0310 07:06:43.697519 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 10 07:06:43 crc kubenswrapper[4825]: I0310 07:06:43.978626 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 10 07:06:43 crc kubenswrapper[4825]: I0310 07:06:43.978683 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 10 07:06:44 crc kubenswrapper[4825]: I0310 07:06:44.016314 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 10 07:06:44 crc kubenswrapper[4825]: I0310 07:06:44.023644 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 10 07:06:44 crc kubenswrapper[4825]: I0310 07:06:44.191854 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 10 07:06:44 crc kubenswrapper[4825]: I0310 07:06:44.191904 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 10 07:06:44 crc kubenswrapper[4825]: I0310 07:06:44.191918 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 10 07:06:44 crc kubenswrapper[4825]: I0310 07:06:44.191932 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 10 07:06:45 crc kubenswrapper[4825]: I0310 07:06:45.131059 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6b684b6dd6-r5vgn"
Mar 10 07:06:45 crc kubenswrapper[4825]: I0310 07:06:45.216061 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c647c98-8aca-46cc-bc00-02bf50500321","Type":"ContainerStarted","Data":"d12d342890f106ec66965c139b6179bf426cf28f4c0fcf70081ef10bb7ced185"}
Mar 10 07:06:45 crc kubenswrapper[4825]: I0310 07:06:45.216172 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 10 07:06:45 crc kubenswrapper[4825]: I0310 07:06:45.216207 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9c647c98-8aca-46cc-bc00-02bf50500321" containerName="sg-core" containerID="cri-o://d622562b5f42f435db6b40bb64c857ae773b508272fe26be239a6df4fd3fc779" gracePeriod=30
Mar 10 07:06:45 crc kubenswrapper[4825]: I0310 07:06:45.216205 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9c647c98-8aca-46cc-bc00-02bf50500321" containerName="ceilometer-central-agent" containerID="cri-o://88d73553fa50a5bca19b77f26ea09c721d2c802bb054b7c91d38d67f0e15583f" gracePeriod=30
Mar 10 07:06:45 crc kubenswrapper[4825]: I0310 07:06:45.216271 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9c647c98-8aca-46cc-bc00-02bf50500321" containerName="proxy-httpd" containerID="cri-o://d12d342890f106ec66965c139b6179bf426cf28f4c0fcf70081ef10bb7ced185" gracePeriod=30
Mar 10 07:06:45 crc kubenswrapper[4825]: I0310 07:06:45.216336 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9c647c98-8aca-46cc-bc00-02bf50500321" containerName="ceilometer-notification-agent" containerID="cri-o://bba109d5fe5e55a82c907945f0e3fe0772bd962a63f9a8d77ed335d3def73f12" gracePeriod=30
Mar 10 07:06:45 crc kubenswrapper[4825]: I0310 07:06:45.222258 4825 generic.go:334] "Generic (PLEG): container finished" podID="2d23be6c-5ecc-495c-b45c-46b79ccbdcf5" containerID="1dd1b73af2989fabf8ad6057b797d997323d290591c75c2def399b0331c06c79" exitCode=0
Mar 10 07:06:45 crc kubenswrapper[4825]: I0310 07:06:45.222343 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6b684b6dd6-r5vgn"
Mar 10 07:06:45 crc kubenswrapper[4825]: I0310 07:06:45.222391 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b684b6dd6-r5vgn" event={"ID":"2d23be6c-5ecc-495c-b45c-46b79ccbdcf5","Type":"ContainerDied","Data":"1dd1b73af2989fabf8ad6057b797d997323d290591c75c2def399b0331c06c79"}
Mar 10 07:06:45 crc kubenswrapper[4825]: I0310 07:06:45.222424 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b684b6dd6-r5vgn" event={"ID":"2d23be6c-5ecc-495c-b45c-46b79ccbdcf5","Type":"ContainerDied","Data":"82e626f0b660d292f5477324a181ee8e984ccdb1cafb8fefcd25bd72f2263bae"}
Mar 10 07:06:45 crc kubenswrapper[4825]: I0310 07:06:45.222444 4825 scope.go:117] "RemoveContainer" containerID="1dd1b73af2989fabf8ad6057b797d997323d290591c75c2def399b0331c06c79"
Mar 10 07:06:45 crc kubenswrapper[4825]: I0310 07:06:45.252280 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.7603231419999998 podStartE2EDuration="5.252261744s" podCreationTimestamp="2026-03-10 07:06:40 +0000 UTC" firstStartedPulling="2026-03-10 07:06:41.037776201 +0000 UTC m=+1354.067556816" lastFinishedPulling="2026-03-10 07:06:44.529714793 +0000 UTC m=+1357.559495418" observedRunningTime="2026-03-10 07:06:45.242384757 +0000 UTC m=+1358.272165372" watchObservedRunningTime="2026-03-10 07:06:45.252261744 +0000 UTC m=+1358.282042359"
Mar 10 07:06:45 crc kubenswrapper[4825]: I0310 07:06:45.265147 4825 scope.go:117] "RemoveContainer" containerID="98cef233b9fa81351c21de237de8599c342df210bb40854490688bca71f84674"
Mar 10 07:06:45 crc kubenswrapper[4825]: I0310 07:06:45.272982 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d23be6c-5ecc-495c-b45c-46b79ccbdcf5-scripts\") pod \"2d23be6c-5ecc-495c-b45c-46b79ccbdcf5\" (UID: \"2d23be6c-5ecc-495c-b45c-46b79ccbdcf5\") "
Mar 10 07:06:45 crc kubenswrapper[4825]: I0310 07:06:45.273063 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d23be6c-5ecc-495c-b45c-46b79ccbdcf5-internal-tls-certs\") pod \"2d23be6c-5ecc-495c-b45c-46b79ccbdcf5\" (UID: \"2d23be6c-5ecc-495c-b45c-46b79ccbdcf5\") "
Mar 10 07:06:45 crc kubenswrapper[4825]: I0310 07:06:45.273116 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d23be6c-5ecc-495c-b45c-46b79ccbdcf5-public-tls-certs\") pod \"2d23be6c-5ecc-495c-b45c-46b79ccbdcf5\" (UID: \"2d23be6c-5ecc-495c-b45c-46b79ccbdcf5\") "
Mar 10 07:06:45 crc kubenswrapper[4825]: I0310 07:06:45.273181 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbsgr\" (UniqueName: \"kubernetes.io/projected/2d23be6c-5ecc-495c-b45c-46b79ccbdcf5-kube-api-access-xbsgr\") pod \"2d23be6c-5ecc-495c-b45c-46b79ccbdcf5\" (UID: \"2d23be6c-5ecc-495c-b45c-46b79ccbdcf5\") "
Mar 10 07:06:45 crc kubenswrapper[4825]: I0310 07:06:45.273215 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d23be6c-5ecc-495c-b45c-46b79ccbdcf5-logs\") pod \"2d23be6c-5ecc-495c-b45c-46b79ccbdcf5\" (UID: \"2d23be6c-5ecc-495c-b45c-46b79ccbdcf5\") "
Mar 10 07:06:45 crc kubenswrapper[4825]: I0310 07:06:45.273367 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d23be6c-5ecc-495c-b45c-46b79ccbdcf5-config-data\") pod \"2d23be6c-5ecc-495c-b45c-46b79ccbdcf5\" (UID: \"2d23be6c-5ecc-495c-b45c-46b79ccbdcf5\") "
Mar 10 07:06:45 crc kubenswrapper[4825]: I0310 07:06:45.273550 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d23be6c-5ecc-495c-b45c-46b79ccbdcf5-combined-ca-bundle\") pod \"2d23be6c-5ecc-495c-b45c-46b79ccbdcf5\" (UID: \"2d23be6c-5ecc-495c-b45c-46b79ccbdcf5\") "
Mar 10 07:06:45 crc kubenswrapper[4825]: I0310 07:06:45.275254 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d23be6c-5ecc-495c-b45c-46b79ccbdcf5-logs" (OuterVolumeSpecName: "logs") pod "2d23be6c-5ecc-495c-b45c-46b79ccbdcf5" (UID: "2d23be6c-5ecc-495c-b45c-46b79ccbdcf5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 07:06:45 crc kubenswrapper[4825]: I0310 07:06:45.280111 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d23be6c-5ecc-495c-b45c-46b79ccbdcf5-scripts" (OuterVolumeSpecName: "scripts") pod "2d23be6c-5ecc-495c-b45c-46b79ccbdcf5" (UID: "2d23be6c-5ecc-495c-b45c-46b79ccbdcf5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 07:06:45 crc kubenswrapper[4825]: I0310 07:06:45.280227 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d23be6c-5ecc-495c-b45c-46b79ccbdcf5-kube-api-access-xbsgr" (OuterVolumeSpecName: "kube-api-access-xbsgr") pod "2d23be6c-5ecc-495c-b45c-46b79ccbdcf5" (UID: "2d23be6c-5ecc-495c-b45c-46b79ccbdcf5"). InnerVolumeSpecName "kube-api-access-xbsgr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 07:06:45 crc kubenswrapper[4825]: I0310 07:06:45.285021 4825 scope.go:117] "RemoveContainer" containerID="1dd1b73af2989fabf8ad6057b797d997323d290591c75c2def399b0331c06c79"
Mar 10 07:06:45 crc kubenswrapper[4825]: E0310 07:06:45.285471 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dd1b73af2989fabf8ad6057b797d997323d290591c75c2def399b0331c06c79\": container with ID starting with 1dd1b73af2989fabf8ad6057b797d997323d290591c75c2def399b0331c06c79 not found: ID does not exist" containerID="1dd1b73af2989fabf8ad6057b797d997323d290591c75c2def399b0331c06c79"
Mar 10 07:06:45 crc kubenswrapper[4825]: I0310 07:06:45.285505 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dd1b73af2989fabf8ad6057b797d997323d290591c75c2def399b0331c06c79"} err="failed to get container status \"1dd1b73af2989fabf8ad6057b797d997323d290591c75c2def399b0331c06c79\": rpc error: code = NotFound desc = could not find container \"1dd1b73af2989fabf8ad6057b797d997323d290591c75c2def399b0331c06c79\": container with ID starting with 1dd1b73af2989fabf8ad6057b797d997323d290591c75c2def399b0331c06c79 not found: ID does not exist"
Mar 10 07:06:45 crc kubenswrapper[4825]: I0310 07:06:45.285537 4825 scope.go:117] "RemoveContainer" containerID="98cef233b9fa81351c21de237de8599c342df210bb40854490688bca71f84674"
Mar 10 07:06:45 crc kubenswrapper[4825]: E0310 07:06:45.285727 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98cef233b9fa81351c21de237de8599c342df210bb40854490688bca71f84674\": container with ID starting with 98cef233b9fa81351c21de237de8599c342df210bb40854490688bca71f84674 not found: ID does not exist" containerID="98cef233b9fa81351c21de237de8599c342df210bb40854490688bca71f84674"
Mar 10 07:06:45 crc kubenswrapper[4825]: I0310 07:06:45.285753
4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98cef233b9fa81351c21de237de8599c342df210bb40854490688bca71f84674"} err="failed to get container status \"98cef233b9fa81351c21de237de8599c342df210bb40854490688bca71f84674\": rpc error: code = NotFound desc = could not find container \"98cef233b9fa81351c21de237de8599c342df210bb40854490688bca71f84674\": container with ID starting with 98cef233b9fa81351c21de237de8599c342df210bb40854490688bca71f84674 not found: ID does not exist" Mar 10 07:06:45 crc kubenswrapper[4825]: I0310 07:06:45.339107 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d23be6c-5ecc-495c-b45c-46b79ccbdcf5-config-data" (OuterVolumeSpecName: "config-data") pod "2d23be6c-5ecc-495c-b45c-46b79ccbdcf5" (UID: "2d23be6c-5ecc-495c-b45c-46b79ccbdcf5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:45 crc kubenswrapper[4825]: I0310 07:06:45.352282 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d23be6c-5ecc-495c-b45c-46b79ccbdcf5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d23be6c-5ecc-495c-b45c-46b79ccbdcf5" (UID: "2d23be6c-5ecc-495c-b45c-46b79ccbdcf5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:45 crc kubenswrapper[4825]: I0310 07:06:45.376115 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d23be6c-5ecc-495c-b45c-46b79ccbdcf5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:45 crc kubenswrapper[4825]: I0310 07:06:45.376160 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d23be6c-5ecc-495c-b45c-46b79ccbdcf5-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:45 crc kubenswrapper[4825]: I0310 07:06:45.376170 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbsgr\" (UniqueName: \"kubernetes.io/projected/2d23be6c-5ecc-495c-b45c-46b79ccbdcf5-kube-api-access-xbsgr\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:45 crc kubenswrapper[4825]: I0310 07:06:45.376180 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d23be6c-5ecc-495c-b45c-46b79ccbdcf5-logs\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:45 crc kubenswrapper[4825]: I0310 07:06:45.376190 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d23be6c-5ecc-495c-b45c-46b79ccbdcf5-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:45 crc kubenswrapper[4825]: I0310 07:06:45.399205 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d23be6c-5ecc-495c-b45c-46b79ccbdcf5-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2d23be6c-5ecc-495c-b45c-46b79ccbdcf5" (UID: "2d23be6c-5ecc-495c-b45c-46b79ccbdcf5"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:45 crc kubenswrapper[4825]: I0310 07:06:45.417355 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d23be6c-5ecc-495c-b45c-46b79ccbdcf5-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2d23be6c-5ecc-495c-b45c-46b79ccbdcf5" (UID: "2d23be6c-5ecc-495c-b45c-46b79ccbdcf5"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:45 crc kubenswrapper[4825]: I0310 07:06:45.477502 4825 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d23be6c-5ecc-495c-b45c-46b79ccbdcf5-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:45 crc kubenswrapper[4825]: I0310 07:06:45.477544 4825 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d23be6c-5ecc-495c-b45c-46b79ccbdcf5-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:45 crc kubenswrapper[4825]: I0310 07:06:45.635976 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6b684b6dd6-r5vgn"] Mar 10 07:06:45 crc kubenswrapper[4825]: I0310 07:06:45.647980 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-6b684b6dd6-r5vgn"] Mar 10 07:06:46 crc kubenswrapper[4825]: I0310 07:06:46.236223 4825 generic.go:334] "Generic (PLEG): container finished" podID="9c647c98-8aca-46cc-bc00-02bf50500321" containerID="d12d342890f106ec66965c139b6179bf426cf28f4c0fcf70081ef10bb7ced185" exitCode=0 Mar 10 07:06:46 crc kubenswrapper[4825]: I0310 07:06:46.236506 4825 generic.go:334] "Generic (PLEG): container finished" podID="9c647c98-8aca-46cc-bc00-02bf50500321" containerID="d622562b5f42f435db6b40bb64c857ae773b508272fe26be239a6df4fd3fc779" exitCode=2 Mar 10 07:06:46 crc kubenswrapper[4825]: I0310 07:06:46.236514 4825 generic.go:334] "Generic (PLEG): container finished" 
podID="9c647c98-8aca-46cc-bc00-02bf50500321" containerID="bba109d5fe5e55a82c907945f0e3fe0772bd962a63f9a8d77ed335d3def73f12" exitCode=0 Mar 10 07:06:46 crc kubenswrapper[4825]: I0310 07:06:46.236351 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c647c98-8aca-46cc-bc00-02bf50500321","Type":"ContainerDied","Data":"d12d342890f106ec66965c139b6179bf426cf28f4c0fcf70081ef10bb7ced185"} Mar 10 07:06:46 crc kubenswrapper[4825]: I0310 07:06:46.236579 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c647c98-8aca-46cc-bc00-02bf50500321","Type":"ContainerDied","Data":"d622562b5f42f435db6b40bb64c857ae773b508272fe26be239a6df4fd3fc779"} Mar 10 07:06:46 crc kubenswrapper[4825]: I0310 07:06:46.236594 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c647c98-8aca-46cc-bc00-02bf50500321","Type":"ContainerDied","Data":"bba109d5fe5e55a82c907945f0e3fe0772bd962a63f9a8d77ed335d3def73f12"} Mar 10 07:06:46 crc kubenswrapper[4825]: I0310 07:06:46.238276 4825 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 07:06:46 crc kubenswrapper[4825]: I0310 07:06:46.238302 4825 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 07:06:46 crc kubenswrapper[4825]: I0310 07:06:46.270208 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 10 07:06:46 crc kubenswrapper[4825]: I0310 07:06:46.270296 4825 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 07:06:46 crc kubenswrapper[4825]: I0310 07:06:46.277532 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 10 07:06:46 crc kubenswrapper[4825]: I0310 07:06:46.382710 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" 
Mar 10 07:06:46 crc kubenswrapper[4825]: I0310 07:06:46.470667 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 10 07:06:46 crc kubenswrapper[4825]: I0310 07:06:46.888777 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 07:06:46 crc kubenswrapper[4825]: I0310 07:06:46.888841 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 07:06:46 crc kubenswrapper[4825]: I0310 07:06:46.888894 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" Mar 10 07:06:46 crc kubenswrapper[4825]: I0310 07:06:46.889721 4825 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5fc08ef292fcd7ed8a89f718e5e3157a2a220984fb44d362f65a63f59128261b"} pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 07:06:46 crc kubenswrapper[4825]: I0310 07:06:46.889795 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" containerID="cri-o://5fc08ef292fcd7ed8a89f718e5e3157a2a220984fb44d362f65a63f59128261b" gracePeriod=600 Mar 10 07:06:47 crc 
kubenswrapper[4825]: I0310 07:06:47.259107 4825 generic.go:334] "Generic (PLEG): container finished" podID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerID="5fc08ef292fcd7ed8a89f718e5e3157a2a220984fb44d362f65a63f59128261b" exitCode=0 Mar 10 07:06:47 crc kubenswrapper[4825]: I0310 07:06:47.260795 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d23be6c-5ecc-495c-b45c-46b79ccbdcf5" path="/var/lib/kubelet/pods/2d23be6c-5ecc-495c-b45c-46b79ccbdcf5/volumes" Mar 10 07:06:47 crc kubenswrapper[4825]: I0310 07:06:47.262031 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerDied","Data":"5fc08ef292fcd7ed8a89f718e5e3157a2a220984fb44d362f65a63f59128261b"} Mar 10 07:06:47 crc kubenswrapper[4825]: I0310 07:06:47.262086 4825 scope.go:117] "RemoveContainer" containerID="864f20c49ba566b3322366215d4709b7053290c4a77d8bfd2767699cce8f044b" Mar 10 07:06:52 crc kubenswrapper[4825]: I0310 07:06:52.320607 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerStarted","Data":"c615601d462179e42a2a1b6a7ad7dd6409fbaf9c9befec571e34a43e2c4ba121"} Mar 10 07:06:52 crc kubenswrapper[4825]: I0310 07:06:52.325760 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-k7b7g" event={"ID":"663403bd-27d6-4584-9aaa-859445f539d6","Type":"ContainerStarted","Data":"3403e1d7a6a0a6a2cc35efd1e1dc9852c16e21f87682c0116ab023c66bfdfc6a"} Mar 10 07:06:52 crc kubenswrapper[4825]: I0310 07:06:52.329034 4825 generic.go:334] "Generic (PLEG): container finished" podID="9c647c98-8aca-46cc-bc00-02bf50500321" containerID="88d73553fa50a5bca19b77f26ea09c721d2c802bb054b7c91d38d67f0e15583f" exitCode=0 Mar 10 07:06:52 crc kubenswrapper[4825]: I0310 07:06:52.329081 4825 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c647c98-8aca-46cc-bc00-02bf50500321","Type":"ContainerDied","Data":"88d73553fa50a5bca19b77f26ea09c721d2c802bb054b7c91d38d67f0e15583f"} Mar 10 07:06:52 crc kubenswrapper[4825]: I0310 07:06:52.329107 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9c647c98-8aca-46cc-bc00-02bf50500321","Type":"ContainerDied","Data":"78ffa0ae775591b504d9349dc310c88e4ba34c43b69ad51d7e12948ea5227a72"} Mar 10 07:06:52 crc kubenswrapper[4825]: I0310 07:06:52.329121 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78ffa0ae775591b504d9349dc310c88e4ba34c43b69ad51d7e12948ea5227a72" Mar 10 07:06:52 crc kubenswrapper[4825]: I0310 07:06:52.357919 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 07:06:52 crc kubenswrapper[4825]: I0310 07:06:52.363541 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-k7b7g" podStartSLOduration=2.649163297 podStartE2EDuration="11.363517802s" podCreationTimestamp="2026-03-10 07:06:41 +0000 UTC" firstStartedPulling="2026-03-10 07:06:42.744058982 +0000 UTC m=+1355.773839587" lastFinishedPulling="2026-03-10 07:06:51.458413477 +0000 UTC m=+1364.488194092" observedRunningTime="2026-03-10 07:06:52.35771152 +0000 UTC m=+1365.387492145" watchObservedRunningTime="2026-03-10 07:06:52.363517802 +0000 UTC m=+1365.393298437" Mar 10 07:06:52 crc kubenswrapper[4825]: I0310 07:06:52.511884 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c647c98-8aca-46cc-bc00-02bf50500321-scripts\") pod \"9c647c98-8aca-46cc-bc00-02bf50500321\" (UID: \"9c647c98-8aca-46cc-bc00-02bf50500321\") " Mar 10 07:06:52 crc kubenswrapper[4825]: I0310 07:06:52.511944 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c647c98-8aca-46cc-bc00-02bf50500321-run-httpd\") pod \"9c647c98-8aca-46cc-bc00-02bf50500321\" (UID: \"9c647c98-8aca-46cc-bc00-02bf50500321\") " Mar 10 07:06:52 crc kubenswrapper[4825]: I0310 07:06:52.511987 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c647c98-8aca-46cc-bc00-02bf50500321-log-httpd\") pod \"9c647c98-8aca-46cc-bc00-02bf50500321\" (UID: \"9c647c98-8aca-46cc-bc00-02bf50500321\") " Mar 10 07:06:52 crc kubenswrapper[4825]: I0310 07:06:52.512009 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c647c98-8aca-46cc-bc00-02bf50500321-config-data\") pod \"9c647c98-8aca-46cc-bc00-02bf50500321\" (UID: \"9c647c98-8aca-46cc-bc00-02bf50500321\") " Mar 10 07:06:52 crc kubenswrapper[4825]: I0310 07:06:52.512137 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqff5\" (UniqueName: \"kubernetes.io/projected/9c647c98-8aca-46cc-bc00-02bf50500321-kube-api-access-mqff5\") pod \"9c647c98-8aca-46cc-bc00-02bf50500321\" (UID: \"9c647c98-8aca-46cc-bc00-02bf50500321\") " Mar 10 07:06:52 crc kubenswrapper[4825]: I0310 07:06:52.512267 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c647c98-8aca-46cc-bc00-02bf50500321-combined-ca-bundle\") pod \"9c647c98-8aca-46cc-bc00-02bf50500321\" (UID: \"9c647c98-8aca-46cc-bc00-02bf50500321\") " Mar 10 07:06:52 crc kubenswrapper[4825]: I0310 07:06:52.512310 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9c647c98-8aca-46cc-bc00-02bf50500321-sg-core-conf-yaml\") pod \"9c647c98-8aca-46cc-bc00-02bf50500321\" (UID: \"9c647c98-8aca-46cc-bc00-02bf50500321\") " Mar 10 
07:06:52 crc kubenswrapper[4825]: I0310 07:06:52.513147 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c647c98-8aca-46cc-bc00-02bf50500321-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9c647c98-8aca-46cc-bc00-02bf50500321" (UID: "9c647c98-8aca-46cc-bc00-02bf50500321"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:06:52 crc kubenswrapper[4825]: I0310 07:06:52.513588 4825 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c647c98-8aca-46cc-bc00-02bf50500321-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:52 crc kubenswrapper[4825]: I0310 07:06:52.514446 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c647c98-8aca-46cc-bc00-02bf50500321-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9c647c98-8aca-46cc-bc00-02bf50500321" (UID: "9c647c98-8aca-46cc-bc00-02bf50500321"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:06:52 crc kubenswrapper[4825]: I0310 07:06:52.518032 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c647c98-8aca-46cc-bc00-02bf50500321-kube-api-access-mqff5" (OuterVolumeSpecName: "kube-api-access-mqff5") pod "9c647c98-8aca-46cc-bc00-02bf50500321" (UID: "9c647c98-8aca-46cc-bc00-02bf50500321"). InnerVolumeSpecName "kube-api-access-mqff5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:06:52 crc kubenswrapper[4825]: I0310 07:06:52.526652 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c647c98-8aca-46cc-bc00-02bf50500321-scripts" (OuterVolumeSpecName: "scripts") pod "9c647c98-8aca-46cc-bc00-02bf50500321" (UID: "9c647c98-8aca-46cc-bc00-02bf50500321"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:52 crc kubenswrapper[4825]: I0310 07:06:52.545627 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c647c98-8aca-46cc-bc00-02bf50500321-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9c647c98-8aca-46cc-bc00-02bf50500321" (UID: "9c647c98-8aca-46cc-bc00-02bf50500321"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:52 crc kubenswrapper[4825]: I0310 07:06:52.595909 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c647c98-8aca-46cc-bc00-02bf50500321-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9c647c98-8aca-46cc-bc00-02bf50500321" (UID: "9c647c98-8aca-46cc-bc00-02bf50500321"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:52 crc kubenswrapper[4825]: I0310 07:06:52.614917 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c647c98-8aca-46cc-bc00-02bf50500321-config-data" (OuterVolumeSpecName: "config-data") pod "9c647c98-8aca-46cc-bc00-02bf50500321" (UID: "9c647c98-8aca-46cc-bc00-02bf50500321"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:06:52 crc kubenswrapper[4825]: I0310 07:06:52.615546 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqff5\" (UniqueName: \"kubernetes.io/projected/9c647c98-8aca-46cc-bc00-02bf50500321-kube-api-access-mqff5\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:52 crc kubenswrapper[4825]: I0310 07:06:52.615728 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c647c98-8aca-46cc-bc00-02bf50500321-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:52 crc kubenswrapper[4825]: I0310 07:06:52.615816 4825 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9c647c98-8aca-46cc-bc00-02bf50500321-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:52 crc kubenswrapper[4825]: I0310 07:06:52.615840 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c647c98-8aca-46cc-bc00-02bf50500321-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:52 crc kubenswrapper[4825]: I0310 07:06:52.615860 4825 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9c647c98-8aca-46cc-bc00-02bf50500321-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:52 crc kubenswrapper[4825]: I0310 07:06:52.725503 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c647c98-8aca-46cc-bc00-02bf50500321-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:06:53 crc kubenswrapper[4825]: I0310 07:06:53.338903 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 07:06:53 crc kubenswrapper[4825]: I0310 07:06:53.390358 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:06:53 crc kubenswrapper[4825]: I0310 07:06:53.408585 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:06:53 crc kubenswrapper[4825]: I0310 07:06:53.426389 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:06:53 crc kubenswrapper[4825]: E0310 07:06:53.426966 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c647c98-8aca-46cc-bc00-02bf50500321" containerName="sg-core" Mar 10 07:06:53 crc kubenswrapper[4825]: I0310 07:06:53.426985 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c647c98-8aca-46cc-bc00-02bf50500321" containerName="sg-core" Mar 10 07:06:53 crc kubenswrapper[4825]: E0310 07:06:53.427011 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c647c98-8aca-46cc-bc00-02bf50500321" containerName="proxy-httpd" Mar 10 07:06:53 crc kubenswrapper[4825]: I0310 07:06:53.427020 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c647c98-8aca-46cc-bc00-02bf50500321" containerName="proxy-httpd" Mar 10 07:06:53 crc kubenswrapper[4825]: E0310 07:06:53.427038 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c647c98-8aca-46cc-bc00-02bf50500321" containerName="ceilometer-central-agent" Mar 10 07:06:53 crc kubenswrapper[4825]: I0310 07:06:53.427049 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c647c98-8aca-46cc-bc00-02bf50500321" containerName="ceilometer-central-agent" Mar 10 07:06:53 crc kubenswrapper[4825]: E0310 07:06:53.427065 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d23be6c-5ecc-495c-b45c-46b79ccbdcf5" containerName="placement-api" Mar 10 07:06:53 crc kubenswrapper[4825]: I0310 07:06:53.427098 4825 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2d23be6c-5ecc-495c-b45c-46b79ccbdcf5" containerName="placement-api" Mar 10 07:06:53 crc kubenswrapper[4825]: E0310 07:06:53.427118 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d23be6c-5ecc-495c-b45c-46b79ccbdcf5" containerName="placement-log" Mar 10 07:06:53 crc kubenswrapper[4825]: I0310 07:06:53.427127 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d23be6c-5ecc-495c-b45c-46b79ccbdcf5" containerName="placement-log" Mar 10 07:06:53 crc kubenswrapper[4825]: E0310 07:06:53.427171 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c647c98-8aca-46cc-bc00-02bf50500321" containerName="ceilometer-notification-agent" Mar 10 07:06:53 crc kubenswrapper[4825]: I0310 07:06:53.427180 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c647c98-8aca-46cc-bc00-02bf50500321" containerName="ceilometer-notification-agent" Mar 10 07:06:53 crc kubenswrapper[4825]: I0310 07:06:53.427378 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c647c98-8aca-46cc-bc00-02bf50500321" containerName="sg-core" Mar 10 07:06:53 crc kubenswrapper[4825]: I0310 07:06:53.427395 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c647c98-8aca-46cc-bc00-02bf50500321" containerName="proxy-httpd" Mar 10 07:06:53 crc kubenswrapper[4825]: I0310 07:06:53.427415 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c647c98-8aca-46cc-bc00-02bf50500321" containerName="ceilometer-notification-agent" Mar 10 07:06:53 crc kubenswrapper[4825]: I0310 07:06:53.427427 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d23be6c-5ecc-495c-b45c-46b79ccbdcf5" containerName="placement-log" Mar 10 07:06:53 crc kubenswrapper[4825]: I0310 07:06:53.427443 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c647c98-8aca-46cc-bc00-02bf50500321" containerName="ceilometer-central-agent" Mar 10 07:06:53 crc kubenswrapper[4825]: I0310 07:06:53.427461 4825 
memory_manager.go:354] "RemoveStaleState removing state" podUID="2d23be6c-5ecc-495c-b45c-46b79ccbdcf5" containerName="placement-api" Mar 10 07:06:53 crc kubenswrapper[4825]: I0310 07:06:53.429446 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 07:06:53 crc kubenswrapper[4825]: I0310 07:06:53.432832 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 07:06:53 crc kubenswrapper[4825]: I0310 07:06:53.433811 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 07:06:53 crc kubenswrapper[4825]: I0310 07:06:53.470031 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:06:53 crc kubenswrapper[4825]: I0310 07:06:53.580523 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/059c1002-7126-473b-8ed0-e03131d4c07e-config-data\") pod \"ceilometer-0\" (UID: \"059c1002-7126-473b-8ed0-e03131d4c07e\") " pod="openstack/ceilometer-0" Mar 10 07:06:53 crc kubenswrapper[4825]: I0310 07:06:53.580597 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/059c1002-7126-473b-8ed0-e03131d4c07e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"059c1002-7126-473b-8ed0-e03131d4c07e\") " pod="openstack/ceilometer-0" Mar 10 07:06:53 crc kubenswrapper[4825]: I0310 07:06:53.580718 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/059c1002-7126-473b-8ed0-e03131d4c07e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"059c1002-7126-473b-8ed0-e03131d4c07e\") " pod="openstack/ceilometer-0" Mar 10 07:06:53 crc kubenswrapper[4825]: I0310 07:06:53.580786 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/059c1002-7126-473b-8ed0-e03131d4c07e-run-httpd\") pod \"ceilometer-0\" (UID: \"059c1002-7126-473b-8ed0-e03131d4c07e\") " pod="openstack/ceilometer-0" Mar 10 07:06:53 crc kubenswrapper[4825]: I0310 07:06:53.580803 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/059c1002-7126-473b-8ed0-e03131d4c07e-log-httpd\") pod \"ceilometer-0\" (UID: \"059c1002-7126-473b-8ed0-e03131d4c07e\") " pod="openstack/ceilometer-0" Mar 10 07:06:53 crc kubenswrapper[4825]: I0310 07:06:53.580860 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/059c1002-7126-473b-8ed0-e03131d4c07e-scripts\") pod \"ceilometer-0\" (UID: \"059c1002-7126-473b-8ed0-e03131d4c07e\") " pod="openstack/ceilometer-0" Mar 10 07:06:53 crc kubenswrapper[4825]: I0310 07:06:53.580939 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m6mp\" (UniqueName: \"kubernetes.io/projected/059c1002-7126-473b-8ed0-e03131d4c07e-kube-api-access-7m6mp\") pod \"ceilometer-0\" (UID: \"059c1002-7126-473b-8ed0-e03131d4c07e\") " pod="openstack/ceilometer-0" Mar 10 07:06:53 crc kubenswrapper[4825]: I0310 07:06:53.682692 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/059c1002-7126-473b-8ed0-e03131d4c07e-run-httpd\") pod \"ceilometer-0\" (UID: \"059c1002-7126-473b-8ed0-e03131d4c07e\") " pod="openstack/ceilometer-0" Mar 10 07:06:53 crc kubenswrapper[4825]: I0310 07:06:53.683314 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/059c1002-7126-473b-8ed0-e03131d4c07e-log-httpd\") pod 
\"ceilometer-0\" (UID: \"059c1002-7126-473b-8ed0-e03131d4c07e\") " pod="openstack/ceilometer-0" Mar 10 07:06:53 crc kubenswrapper[4825]: I0310 07:06:53.683250 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/059c1002-7126-473b-8ed0-e03131d4c07e-run-httpd\") pod \"ceilometer-0\" (UID: \"059c1002-7126-473b-8ed0-e03131d4c07e\") " pod="openstack/ceilometer-0" Mar 10 07:06:53 crc kubenswrapper[4825]: I0310 07:06:53.683405 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/059c1002-7126-473b-8ed0-e03131d4c07e-scripts\") pod \"ceilometer-0\" (UID: \"059c1002-7126-473b-8ed0-e03131d4c07e\") " pod="openstack/ceilometer-0" Mar 10 07:06:53 crc kubenswrapper[4825]: I0310 07:06:53.683903 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/059c1002-7126-473b-8ed0-e03131d4c07e-log-httpd\") pod \"ceilometer-0\" (UID: \"059c1002-7126-473b-8ed0-e03131d4c07e\") " pod="openstack/ceilometer-0" Mar 10 07:06:53 crc kubenswrapper[4825]: I0310 07:06:53.684426 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m6mp\" (UniqueName: \"kubernetes.io/projected/059c1002-7126-473b-8ed0-e03131d4c07e-kube-api-access-7m6mp\") pod \"ceilometer-0\" (UID: \"059c1002-7126-473b-8ed0-e03131d4c07e\") " pod="openstack/ceilometer-0" Mar 10 07:06:53 crc kubenswrapper[4825]: I0310 07:06:53.684527 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/059c1002-7126-473b-8ed0-e03131d4c07e-config-data\") pod \"ceilometer-0\" (UID: \"059c1002-7126-473b-8ed0-e03131d4c07e\") " pod="openstack/ceilometer-0" Mar 10 07:06:53 crc kubenswrapper[4825]: I0310 07:06:53.684558 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/059c1002-7126-473b-8ed0-e03131d4c07e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"059c1002-7126-473b-8ed0-e03131d4c07e\") " pod="openstack/ceilometer-0" Mar 10 07:06:53 crc kubenswrapper[4825]: I0310 07:06:53.684677 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/059c1002-7126-473b-8ed0-e03131d4c07e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"059c1002-7126-473b-8ed0-e03131d4c07e\") " pod="openstack/ceilometer-0" Mar 10 07:06:53 crc kubenswrapper[4825]: I0310 07:06:53.700733 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/059c1002-7126-473b-8ed0-e03131d4c07e-scripts\") pod \"ceilometer-0\" (UID: \"059c1002-7126-473b-8ed0-e03131d4c07e\") " pod="openstack/ceilometer-0" Mar 10 07:06:53 crc kubenswrapper[4825]: I0310 07:06:53.700805 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/059c1002-7126-473b-8ed0-e03131d4c07e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"059c1002-7126-473b-8ed0-e03131d4c07e\") " pod="openstack/ceilometer-0" Mar 10 07:06:53 crc kubenswrapper[4825]: I0310 07:06:53.703071 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/059c1002-7126-473b-8ed0-e03131d4c07e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"059c1002-7126-473b-8ed0-e03131d4c07e\") " pod="openstack/ceilometer-0" Mar 10 07:06:53 crc kubenswrapper[4825]: I0310 07:06:53.703431 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/059c1002-7126-473b-8ed0-e03131d4c07e-config-data\") pod \"ceilometer-0\" (UID: \"059c1002-7126-473b-8ed0-e03131d4c07e\") " pod="openstack/ceilometer-0" Mar 10 07:06:53 crc kubenswrapper[4825]: I0310 07:06:53.706298 
4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m6mp\" (UniqueName: \"kubernetes.io/projected/059c1002-7126-473b-8ed0-e03131d4c07e-kube-api-access-7m6mp\") pod \"ceilometer-0\" (UID: \"059c1002-7126-473b-8ed0-e03131d4c07e\") " pod="openstack/ceilometer-0" Mar 10 07:06:53 crc kubenswrapper[4825]: I0310 07:06:53.760606 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 07:06:54 crc kubenswrapper[4825]: I0310 07:06:54.337173 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:06:54 crc kubenswrapper[4825]: W0310 07:06:54.346332 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod059c1002_7126_473b_8ed0_e03131d4c07e.slice/crio-ef8a52ffcafb0042209eaac4f495828f11b0abd6bcc37e9c7d900bad926af20b WatchSource:0}: Error finding container ef8a52ffcafb0042209eaac4f495828f11b0abd6bcc37e9c7d900bad926af20b: Status 404 returned error can't find the container with id ef8a52ffcafb0042209eaac4f495828f11b0abd6bcc37e9c7d900bad926af20b Mar 10 07:06:54 crc kubenswrapper[4825]: I0310 07:06:54.357354 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"059c1002-7126-473b-8ed0-e03131d4c07e","Type":"ContainerStarted","Data":"ef8a52ffcafb0042209eaac4f495828f11b0abd6bcc37e9c7d900bad926af20b"} Mar 10 07:06:55 crc kubenswrapper[4825]: I0310 07:06:55.250789 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c647c98-8aca-46cc-bc00-02bf50500321" path="/var/lib/kubelet/pods/9c647c98-8aca-46cc-bc00-02bf50500321/volumes" Mar 10 07:06:55 crc kubenswrapper[4825]: I0310 07:06:55.366698 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"059c1002-7126-473b-8ed0-e03131d4c07e","Type":"ContainerStarted","Data":"204f88cbb5a2a17cb99619c1e6d48bcf0c849295e99dc1c64b1a72ab6634b29b"} Mar 10 07:06:55 crc kubenswrapper[4825]: I0310 07:06:55.505009 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:06:56 crc kubenswrapper[4825]: I0310 07:06:56.379346 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"059c1002-7126-473b-8ed0-e03131d4c07e","Type":"ContainerStarted","Data":"f00ae78a801af528b74393fec1e7cf5b8a5c88d28d9134dc4beba91790e5d83a"} Mar 10 07:06:58 crc kubenswrapper[4825]: I0310 07:06:58.399551 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"059c1002-7126-473b-8ed0-e03131d4c07e","Type":"ContainerStarted","Data":"725cced78280fe0f05e4f71cffd767296434ad22c227a6f2dee6afc1eef7afa3"} Mar 10 07:07:00 crc kubenswrapper[4825]: I0310 07:07:00.423838 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"059c1002-7126-473b-8ed0-e03131d4c07e","Type":"ContainerStarted","Data":"4195a5d29ac60d8eb99a40fe5c1de095c5dc39cfa0485d9b1f36cdb794e01453"} Mar 10 07:07:00 crc kubenswrapper[4825]: I0310 07:07:00.424897 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 07:07:00 crc kubenswrapper[4825]: I0310 07:07:00.424261 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="059c1002-7126-473b-8ed0-e03131d4c07e" containerName="ceilometer-central-agent" containerID="cri-o://204f88cbb5a2a17cb99619c1e6d48bcf0c849295e99dc1c64b1a72ab6634b29b" gracePeriod=30 Mar 10 07:07:00 crc kubenswrapper[4825]: I0310 07:07:00.424953 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="059c1002-7126-473b-8ed0-e03131d4c07e" containerName="proxy-httpd" 
containerID="cri-o://4195a5d29ac60d8eb99a40fe5c1de095c5dc39cfa0485d9b1f36cdb794e01453" gracePeriod=30 Mar 10 07:07:00 crc kubenswrapper[4825]: I0310 07:07:00.425049 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="059c1002-7126-473b-8ed0-e03131d4c07e" containerName="sg-core" containerID="cri-o://725cced78280fe0f05e4f71cffd767296434ad22c227a6f2dee6afc1eef7afa3" gracePeriod=30 Mar 10 07:07:00 crc kubenswrapper[4825]: I0310 07:07:00.425113 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="059c1002-7126-473b-8ed0-e03131d4c07e" containerName="ceilometer-notification-agent" containerID="cri-o://f00ae78a801af528b74393fec1e7cf5b8a5c88d28d9134dc4beba91790e5d83a" gracePeriod=30 Mar 10 07:07:00 crc kubenswrapper[4825]: I0310 07:07:00.455851 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.202760757 podStartE2EDuration="7.455832637s" podCreationTimestamp="2026-03-10 07:06:53 +0000 UTC" firstStartedPulling="2026-03-10 07:06:54.349859683 +0000 UTC m=+1367.379640298" lastFinishedPulling="2026-03-10 07:06:59.602931563 +0000 UTC m=+1372.632712178" observedRunningTime="2026-03-10 07:07:00.450887228 +0000 UTC m=+1373.480667843" watchObservedRunningTime="2026-03-10 07:07:00.455832637 +0000 UTC m=+1373.485613252" Mar 10 07:07:01 crc kubenswrapper[4825]: I0310 07:07:01.443172 4825 generic.go:334] "Generic (PLEG): container finished" podID="059c1002-7126-473b-8ed0-e03131d4c07e" containerID="4195a5d29ac60d8eb99a40fe5c1de095c5dc39cfa0485d9b1f36cdb794e01453" exitCode=0 Mar 10 07:07:01 crc kubenswrapper[4825]: I0310 07:07:01.443570 4825 generic.go:334] "Generic (PLEG): container finished" podID="059c1002-7126-473b-8ed0-e03131d4c07e" containerID="725cced78280fe0f05e4f71cffd767296434ad22c227a6f2dee6afc1eef7afa3" exitCode=2 Mar 10 07:07:01 crc kubenswrapper[4825]: I0310 07:07:01.443251 4825 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"059c1002-7126-473b-8ed0-e03131d4c07e","Type":"ContainerDied","Data":"4195a5d29ac60d8eb99a40fe5c1de095c5dc39cfa0485d9b1f36cdb794e01453"} Mar 10 07:07:01 crc kubenswrapper[4825]: I0310 07:07:01.443622 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"059c1002-7126-473b-8ed0-e03131d4c07e","Type":"ContainerDied","Data":"725cced78280fe0f05e4f71cffd767296434ad22c227a6f2dee6afc1eef7afa3"} Mar 10 07:07:01 crc kubenswrapper[4825]: I0310 07:07:01.443586 4825 generic.go:334] "Generic (PLEG): container finished" podID="059c1002-7126-473b-8ed0-e03131d4c07e" containerID="f00ae78a801af528b74393fec1e7cf5b8a5c88d28d9134dc4beba91790e5d83a" exitCode=0 Mar 10 07:07:01 crc kubenswrapper[4825]: I0310 07:07:01.443640 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"059c1002-7126-473b-8ed0-e03131d4c07e","Type":"ContainerDied","Data":"f00ae78a801af528b74393fec1e7cf5b8a5c88d28d9134dc4beba91790e5d83a"} Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.243335 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.365888 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/059c1002-7126-473b-8ed0-e03131d4c07e-scripts\") pod \"059c1002-7126-473b-8ed0-e03131d4c07e\" (UID: \"059c1002-7126-473b-8ed0-e03131d4c07e\") " Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.366003 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/059c1002-7126-473b-8ed0-e03131d4c07e-run-httpd\") pod \"059c1002-7126-473b-8ed0-e03131d4c07e\" (UID: \"059c1002-7126-473b-8ed0-e03131d4c07e\") " Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.366069 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/059c1002-7126-473b-8ed0-e03131d4c07e-combined-ca-bundle\") pod \"059c1002-7126-473b-8ed0-e03131d4c07e\" (UID: \"059c1002-7126-473b-8ed0-e03131d4c07e\") " Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.366123 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7m6mp\" (UniqueName: \"kubernetes.io/projected/059c1002-7126-473b-8ed0-e03131d4c07e-kube-api-access-7m6mp\") pod \"059c1002-7126-473b-8ed0-e03131d4c07e\" (UID: \"059c1002-7126-473b-8ed0-e03131d4c07e\") " Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.366300 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/059c1002-7126-473b-8ed0-e03131d4c07e-config-data\") pod \"059c1002-7126-473b-8ed0-e03131d4c07e\" (UID: \"059c1002-7126-473b-8ed0-e03131d4c07e\") " Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.366368 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/059c1002-7126-473b-8ed0-e03131d4c07e-sg-core-conf-yaml\") pod \"059c1002-7126-473b-8ed0-e03131d4c07e\" (UID: \"059c1002-7126-473b-8ed0-e03131d4c07e\") " Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.366422 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/059c1002-7126-473b-8ed0-e03131d4c07e-log-httpd\") pod \"059c1002-7126-473b-8ed0-e03131d4c07e\" (UID: \"059c1002-7126-473b-8ed0-e03131d4c07e\") " Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.366686 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/059c1002-7126-473b-8ed0-e03131d4c07e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "059c1002-7126-473b-8ed0-e03131d4c07e" (UID: "059c1002-7126-473b-8ed0-e03131d4c07e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.367537 4825 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/059c1002-7126-473b-8ed0-e03131d4c07e-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.367742 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/059c1002-7126-473b-8ed0-e03131d4c07e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "059c1002-7126-473b-8ed0-e03131d4c07e" (UID: "059c1002-7126-473b-8ed0-e03131d4c07e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.374210 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/059c1002-7126-473b-8ed0-e03131d4c07e-scripts" (OuterVolumeSpecName: "scripts") pod "059c1002-7126-473b-8ed0-e03131d4c07e" (UID: "059c1002-7126-473b-8ed0-e03131d4c07e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.374946 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/059c1002-7126-473b-8ed0-e03131d4c07e-kube-api-access-7m6mp" (OuterVolumeSpecName: "kube-api-access-7m6mp") pod "059c1002-7126-473b-8ed0-e03131d4c07e" (UID: "059c1002-7126-473b-8ed0-e03131d4c07e"). InnerVolumeSpecName "kube-api-access-7m6mp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.393784 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/059c1002-7126-473b-8ed0-e03131d4c07e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "059c1002-7126-473b-8ed0-e03131d4c07e" (UID: "059c1002-7126-473b-8ed0-e03131d4c07e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.439589 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/059c1002-7126-473b-8ed0-e03131d4c07e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "059c1002-7126-473b-8ed0-e03131d4c07e" (UID: "059c1002-7126-473b-8ed0-e03131d4c07e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.457883 4825 generic.go:334] "Generic (PLEG): container finished" podID="059c1002-7126-473b-8ed0-e03131d4c07e" containerID="204f88cbb5a2a17cb99619c1e6d48bcf0c849295e99dc1c64b1a72ab6634b29b" exitCode=0 Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.457984 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"059c1002-7126-473b-8ed0-e03131d4c07e","Type":"ContainerDied","Data":"204f88cbb5a2a17cb99619c1e6d48bcf0c849295e99dc1c64b1a72ab6634b29b"} Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.457985 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.458027 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"059c1002-7126-473b-8ed0-e03131d4c07e","Type":"ContainerDied","Data":"ef8a52ffcafb0042209eaac4f495828f11b0abd6bcc37e9c7d900bad926af20b"} Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.458078 4825 scope.go:117] "RemoveContainer" containerID="4195a5d29ac60d8eb99a40fe5c1de095c5dc39cfa0485d9b1f36cdb794e01453" Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.461462 4825 generic.go:334] "Generic (PLEG): container finished" podID="663403bd-27d6-4584-9aaa-859445f539d6" containerID="3403e1d7a6a0a6a2cc35efd1e1dc9852c16e21f87682c0116ab023c66bfdfc6a" exitCode=0 Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.461608 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-k7b7g" event={"ID":"663403bd-27d6-4584-9aaa-859445f539d6","Type":"ContainerDied","Data":"3403e1d7a6a0a6a2cc35efd1e1dc9852c16e21f87682c0116ab023c66bfdfc6a"} Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.469889 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/059c1002-7126-473b-8ed0-e03131d4c07e-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.469961 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/059c1002-7126-473b-8ed0-e03131d4c07e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.469975 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7m6mp\" (UniqueName: \"kubernetes.io/projected/059c1002-7126-473b-8ed0-e03131d4c07e-kube-api-access-7m6mp\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.469986 4825 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/059c1002-7126-473b-8ed0-e03131d4c07e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.469996 4825 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/059c1002-7126-473b-8ed0-e03131d4c07e-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.488984 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/059c1002-7126-473b-8ed0-e03131d4c07e-config-data" (OuterVolumeSpecName: "config-data") pod "059c1002-7126-473b-8ed0-e03131d4c07e" (UID: "059c1002-7126-473b-8ed0-e03131d4c07e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.489454 4825 scope.go:117] "RemoveContainer" containerID="725cced78280fe0f05e4f71cffd767296434ad22c227a6f2dee6afc1eef7afa3" Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.510673 4825 scope.go:117] "RemoveContainer" containerID="f00ae78a801af528b74393fec1e7cf5b8a5c88d28d9134dc4beba91790e5d83a" Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.544008 4825 scope.go:117] "RemoveContainer" containerID="204f88cbb5a2a17cb99619c1e6d48bcf0c849295e99dc1c64b1a72ab6634b29b" Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.571982 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/059c1002-7126-473b-8ed0-e03131d4c07e-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.576703 4825 scope.go:117] "RemoveContainer" containerID="4195a5d29ac60d8eb99a40fe5c1de095c5dc39cfa0485d9b1f36cdb794e01453" Mar 10 07:07:02 crc kubenswrapper[4825]: E0310 07:07:02.577367 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4195a5d29ac60d8eb99a40fe5c1de095c5dc39cfa0485d9b1f36cdb794e01453\": container with ID starting with 4195a5d29ac60d8eb99a40fe5c1de095c5dc39cfa0485d9b1f36cdb794e01453 not found: ID does not exist" containerID="4195a5d29ac60d8eb99a40fe5c1de095c5dc39cfa0485d9b1f36cdb794e01453" Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.577419 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4195a5d29ac60d8eb99a40fe5c1de095c5dc39cfa0485d9b1f36cdb794e01453"} err="failed to get container status \"4195a5d29ac60d8eb99a40fe5c1de095c5dc39cfa0485d9b1f36cdb794e01453\": rpc error: code = NotFound desc = could not find container \"4195a5d29ac60d8eb99a40fe5c1de095c5dc39cfa0485d9b1f36cdb794e01453\": container with ID starting with 
4195a5d29ac60d8eb99a40fe5c1de095c5dc39cfa0485d9b1f36cdb794e01453 not found: ID does not exist" Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.577489 4825 scope.go:117] "RemoveContainer" containerID="725cced78280fe0f05e4f71cffd767296434ad22c227a6f2dee6afc1eef7afa3" Mar 10 07:07:02 crc kubenswrapper[4825]: E0310 07:07:02.578010 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"725cced78280fe0f05e4f71cffd767296434ad22c227a6f2dee6afc1eef7afa3\": container with ID starting with 725cced78280fe0f05e4f71cffd767296434ad22c227a6f2dee6afc1eef7afa3 not found: ID does not exist" containerID="725cced78280fe0f05e4f71cffd767296434ad22c227a6f2dee6afc1eef7afa3" Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.578073 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"725cced78280fe0f05e4f71cffd767296434ad22c227a6f2dee6afc1eef7afa3"} err="failed to get container status \"725cced78280fe0f05e4f71cffd767296434ad22c227a6f2dee6afc1eef7afa3\": rpc error: code = NotFound desc = could not find container \"725cced78280fe0f05e4f71cffd767296434ad22c227a6f2dee6afc1eef7afa3\": container with ID starting with 725cced78280fe0f05e4f71cffd767296434ad22c227a6f2dee6afc1eef7afa3 not found: ID does not exist" Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.578117 4825 scope.go:117] "RemoveContainer" containerID="f00ae78a801af528b74393fec1e7cf5b8a5c88d28d9134dc4beba91790e5d83a" Mar 10 07:07:02 crc kubenswrapper[4825]: E0310 07:07:02.578730 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f00ae78a801af528b74393fec1e7cf5b8a5c88d28d9134dc4beba91790e5d83a\": container with ID starting with f00ae78a801af528b74393fec1e7cf5b8a5c88d28d9134dc4beba91790e5d83a not found: ID does not exist" containerID="f00ae78a801af528b74393fec1e7cf5b8a5c88d28d9134dc4beba91790e5d83a" Mar 10 07:07:02 crc 
kubenswrapper[4825]: I0310 07:07:02.578767 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f00ae78a801af528b74393fec1e7cf5b8a5c88d28d9134dc4beba91790e5d83a"} err="failed to get container status \"f00ae78a801af528b74393fec1e7cf5b8a5c88d28d9134dc4beba91790e5d83a\": rpc error: code = NotFound desc = could not find container \"f00ae78a801af528b74393fec1e7cf5b8a5c88d28d9134dc4beba91790e5d83a\": container with ID starting with f00ae78a801af528b74393fec1e7cf5b8a5c88d28d9134dc4beba91790e5d83a not found: ID does not exist" Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.578790 4825 scope.go:117] "RemoveContainer" containerID="204f88cbb5a2a17cb99619c1e6d48bcf0c849295e99dc1c64b1a72ab6634b29b" Mar 10 07:07:02 crc kubenswrapper[4825]: E0310 07:07:02.579247 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"204f88cbb5a2a17cb99619c1e6d48bcf0c849295e99dc1c64b1a72ab6634b29b\": container with ID starting with 204f88cbb5a2a17cb99619c1e6d48bcf0c849295e99dc1c64b1a72ab6634b29b not found: ID does not exist" containerID="204f88cbb5a2a17cb99619c1e6d48bcf0c849295e99dc1c64b1a72ab6634b29b" Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.579282 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"204f88cbb5a2a17cb99619c1e6d48bcf0c849295e99dc1c64b1a72ab6634b29b"} err="failed to get container status \"204f88cbb5a2a17cb99619c1e6d48bcf0c849295e99dc1c64b1a72ab6634b29b\": rpc error: code = NotFound desc = could not find container \"204f88cbb5a2a17cb99619c1e6d48bcf0c849295e99dc1c64b1a72ab6634b29b\": container with ID starting with 204f88cbb5a2a17cb99619c1e6d48bcf0c849295e99dc1c64b1a72ab6634b29b not found: ID does not exist" Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.804050 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:07:02 crc kubenswrapper[4825]: 
I0310 07:07:02.814203 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.842520 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:07:02 crc kubenswrapper[4825]: E0310 07:07:02.842966 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="059c1002-7126-473b-8ed0-e03131d4c07e" containerName="ceilometer-notification-agent" Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.842985 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="059c1002-7126-473b-8ed0-e03131d4c07e" containerName="ceilometer-notification-agent" Mar 10 07:07:02 crc kubenswrapper[4825]: E0310 07:07:02.843001 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="059c1002-7126-473b-8ed0-e03131d4c07e" containerName="sg-core" Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.843009 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="059c1002-7126-473b-8ed0-e03131d4c07e" containerName="sg-core" Mar 10 07:07:02 crc kubenswrapper[4825]: E0310 07:07:02.843033 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="059c1002-7126-473b-8ed0-e03131d4c07e" containerName="ceilometer-central-agent" Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.843041 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="059c1002-7126-473b-8ed0-e03131d4c07e" containerName="ceilometer-central-agent" Mar 10 07:07:02 crc kubenswrapper[4825]: E0310 07:07:02.843055 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="059c1002-7126-473b-8ed0-e03131d4c07e" containerName="proxy-httpd" Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.843062 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="059c1002-7126-473b-8ed0-e03131d4c07e" containerName="proxy-httpd" Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.843285 4825 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="059c1002-7126-473b-8ed0-e03131d4c07e" containerName="ceilometer-central-agent" Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.843296 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="059c1002-7126-473b-8ed0-e03131d4c07e" containerName="ceilometer-notification-agent" Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.843312 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="059c1002-7126-473b-8ed0-e03131d4c07e" containerName="sg-core" Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.843335 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="059c1002-7126-473b-8ed0-e03131d4c07e" containerName="proxy-httpd" Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.846709 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.850090 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.856006 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.859102 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.979946 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdprd\" (UniqueName: \"kubernetes.io/projected/b8d8c6c8-bda0-456d-9990-687257c80d94-kube-api-access-pdprd\") pod \"ceilometer-0\" (UID: \"b8d8c6c8-bda0-456d-9990-687257c80d94\") " pod="openstack/ceilometer-0" Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.980068 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8d8c6c8-bda0-456d-9990-687257c80d94-run-httpd\") 
pod \"ceilometer-0\" (UID: \"b8d8c6c8-bda0-456d-9990-687257c80d94\") " pod="openstack/ceilometer-0" Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.980093 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b8d8c6c8-bda0-456d-9990-687257c80d94-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b8d8c6c8-bda0-456d-9990-687257c80d94\") " pod="openstack/ceilometer-0" Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.980120 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8d8c6c8-bda0-456d-9990-687257c80d94-config-data\") pod \"ceilometer-0\" (UID: \"b8d8c6c8-bda0-456d-9990-687257c80d94\") " pod="openstack/ceilometer-0" Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.980261 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8d8c6c8-bda0-456d-9990-687257c80d94-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b8d8c6c8-bda0-456d-9990-687257c80d94\") " pod="openstack/ceilometer-0" Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.980289 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8d8c6c8-bda0-456d-9990-687257c80d94-scripts\") pod \"ceilometer-0\" (UID: \"b8d8c6c8-bda0-456d-9990-687257c80d94\") " pod="openstack/ceilometer-0" Mar 10 07:07:02 crc kubenswrapper[4825]: I0310 07:07:02.980351 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8d8c6c8-bda0-456d-9990-687257c80d94-log-httpd\") pod \"ceilometer-0\" (UID: \"b8d8c6c8-bda0-456d-9990-687257c80d94\") " pod="openstack/ceilometer-0" Mar 10 07:07:03 crc kubenswrapper[4825]: I0310 
07:07:03.081566 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8d8c6c8-bda0-456d-9990-687257c80d94-run-httpd\") pod \"ceilometer-0\" (UID: \"b8d8c6c8-bda0-456d-9990-687257c80d94\") " pod="openstack/ceilometer-0" Mar 10 07:07:03 crc kubenswrapper[4825]: I0310 07:07:03.081603 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b8d8c6c8-bda0-456d-9990-687257c80d94-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b8d8c6c8-bda0-456d-9990-687257c80d94\") " pod="openstack/ceilometer-0" Mar 10 07:07:03 crc kubenswrapper[4825]: I0310 07:07:03.081636 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8d8c6c8-bda0-456d-9990-687257c80d94-config-data\") pod \"ceilometer-0\" (UID: \"b8d8c6c8-bda0-456d-9990-687257c80d94\") " pod="openstack/ceilometer-0" Mar 10 07:07:03 crc kubenswrapper[4825]: I0310 07:07:03.081676 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8d8c6c8-bda0-456d-9990-687257c80d94-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b8d8c6c8-bda0-456d-9990-687257c80d94\") " pod="openstack/ceilometer-0" Mar 10 07:07:03 crc kubenswrapper[4825]: I0310 07:07:03.081704 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8d8c6c8-bda0-456d-9990-687257c80d94-scripts\") pod \"ceilometer-0\" (UID: \"b8d8c6c8-bda0-456d-9990-687257c80d94\") " pod="openstack/ceilometer-0" Mar 10 07:07:03 crc kubenswrapper[4825]: I0310 07:07:03.081757 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8d8c6c8-bda0-456d-9990-687257c80d94-log-httpd\") pod \"ceilometer-0\" (UID: 
\"b8d8c6c8-bda0-456d-9990-687257c80d94\") " pod="openstack/ceilometer-0" Mar 10 07:07:03 crc kubenswrapper[4825]: I0310 07:07:03.081841 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdprd\" (UniqueName: \"kubernetes.io/projected/b8d8c6c8-bda0-456d-9990-687257c80d94-kube-api-access-pdprd\") pod \"ceilometer-0\" (UID: \"b8d8c6c8-bda0-456d-9990-687257c80d94\") " pod="openstack/ceilometer-0" Mar 10 07:07:03 crc kubenswrapper[4825]: I0310 07:07:03.082106 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8d8c6c8-bda0-456d-9990-687257c80d94-run-httpd\") pod \"ceilometer-0\" (UID: \"b8d8c6c8-bda0-456d-9990-687257c80d94\") " pod="openstack/ceilometer-0" Mar 10 07:07:03 crc kubenswrapper[4825]: I0310 07:07:03.082430 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8d8c6c8-bda0-456d-9990-687257c80d94-log-httpd\") pod \"ceilometer-0\" (UID: \"b8d8c6c8-bda0-456d-9990-687257c80d94\") " pod="openstack/ceilometer-0" Mar 10 07:07:03 crc kubenswrapper[4825]: I0310 07:07:03.085931 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8d8c6c8-bda0-456d-9990-687257c80d94-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b8d8c6c8-bda0-456d-9990-687257c80d94\") " pod="openstack/ceilometer-0" Mar 10 07:07:03 crc kubenswrapper[4825]: I0310 07:07:03.089888 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8d8c6c8-bda0-456d-9990-687257c80d94-config-data\") pod \"ceilometer-0\" (UID: \"b8d8c6c8-bda0-456d-9990-687257c80d94\") " pod="openstack/ceilometer-0" Mar 10 07:07:03 crc kubenswrapper[4825]: I0310 07:07:03.092421 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/b8d8c6c8-bda0-456d-9990-687257c80d94-scripts\") pod \"ceilometer-0\" (UID: \"b8d8c6c8-bda0-456d-9990-687257c80d94\") " pod="openstack/ceilometer-0" Mar 10 07:07:03 crc kubenswrapper[4825]: I0310 07:07:03.098681 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b8d8c6c8-bda0-456d-9990-687257c80d94-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b8d8c6c8-bda0-456d-9990-687257c80d94\") " pod="openstack/ceilometer-0" Mar 10 07:07:03 crc kubenswrapper[4825]: I0310 07:07:03.100893 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdprd\" (UniqueName: \"kubernetes.io/projected/b8d8c6c8-bda0-456d-9990-687257c80d94-kube-api-access-pdprd\") pod \"ceilometer-0\" (UID: \"b8d8c6c8-bda0-456d-9990-687257c80d94\") " pod="openstack/ceilometer-0" Mar 10 07:07:03 crc kubenswrapper[4825]: I0310 07:07:03.181323 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 07:07:03 crc kubenswrapper[4825]: I0310 07:07:03.255075 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="059c1002-7126-473b-8ed0-e03131d4c07e" path="/var/lib/kubelet/pods/059c1002-7126-473b-8ed0-e03131d4c07e/volumes" Mar 10 07:07:03 crc kubenswrapper[4825]: I0310 07:07:03.518372 4825 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podd6f252a1-68f2-4f9a-ade7-0b979581b8c6"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podd6f252a1-68f2-4f9a-ade7-0b979581b8c6] : Timed out while waiting for systemd to remove kubepods-besteffort-podd6f252a1_68f2_4f9a_ade7_0b979581b8c6.slice" Mar 10 07:07:03 crc kubenswrapper[4825]: I0310 07:07:03.701785 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:07:03 crc kubenswrapper[4825]: I0310 07:07:03.753337 4825 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-k7b7g" Mar 10 07:07:03 crc kubenswrapper[4825]: I0310 07:07:03.901154 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cksss\" (UniqueName: \"kubernetes.io/projected/663403bd-27d6-4584-9aaa-859445f539d6-kube-api-access-cksss\") pod \"663403bd-27d6-4584-9aaa-859445f539d6\" (UID: \"663403bd-27d6-4584-9aaa-859445f539d6\") " Mar 10 07:07:03 crc kubenswrapper[4825]: I0310 07:07:03.901285 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/663403bd-27d6-4584-9aaa-859445f539d6-config-data\") pod \"663403bd-27d6-4584-9aaa-859445f539d6\" (UID: \"663403bd-27d6-4584-9aaa-859445f539d6\") " Mar 10 07:07:03 crc kubenswrapper[4825]: I0310 07:07:03.901351 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/663403bd-27d6-4584-9aaa-859445f539d6-scripts\") pod \"663403bd-27d6-4584-9aaa-859445f539d6\" (UID: \"663403bd-27d6-4584-9aaa-859445f539d6\") " Mar 10 07:07:03 crc kubenswrapper[4825]: I0310 07:07:03.901376 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/663403bd-27d6-4584-9aaa-859445f539d6-combined-ca-bundle\") pod \"663403bd-27d6-4584-9aaa-859445f539d6\" (UID: \"663403bd-27d6-4584-9aaa-859445f539d6\") " Mar 10 07:07:03 crc kubenswrapper[4825]: I0310 07:07:03.907077 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/663403bd-27d6-4584-9aaa-859445f539d6-scripts" (OuterVolumeSpecName: "scripts") pod "663403bd-27d6-4584-9aaa-859445f539d6" (UID: "663403bd-27d6-4584-9aaa-859445f539d6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:07:03 crc kubenswrapper[4825]: I0310 07:07:03.911183 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/663403bd-27d6-4584-9aaa-859445f539d6-kube-api-access-cksss" (OuterVolumeSpecName: "kube-api-access-cksss") pod "663403bd-27d6-4584-9aaa-859445f539d6" (UID: "663403bd-27d6-4584-9aaa-859445f539d6"). InnerVolumeSpecName "kube-api-access-cksss". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:07:03 crc kubenswrapper[4825]: I0310 07:07:03.929222 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/663403bd-27d6-4584-9aaa-859445f539d6-config-data" (OuterVolumeSpecName: "config-data") pod "663403bd-27d6-4584-9aaa-859445f539d6" (UID: "663403bd-27d6-4584-9aaa-859445f539d6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:07:03 crc kubenswrapper[4825]: I0310 07:07:03.942604 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/663403bd-27d6-4584-9aaa-859445f539d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "663403bd-27d6-4584-9aaa-859445f539d6" (UID: "663403bd-27d6-4584-9aaa-859445f539d6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:07:04 crc kubenswrapper[4825]: I0310 07:07:04.003876 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cksss\" (UniqueName: \"kubernetes.io/projected/663403bd-27d6-4584-9aaa-859445f539d6-kube-api-access-cksss\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:04 crc kubenswrapper[4825]: I0310 07:07:04.004154 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/663403bd-27d6-4584-9aaa-859445f539d6-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:04 crc kubenswrapper[4825]: I0310 07:07:04.004300 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/663403bd-27d6-4584-9aaa-859445f539d6-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:04 crc kubenswrapper[4825]: I0310 07:07:04.004427 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/663403bd-27d6-4584-9aaa-859445f539d6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:04 crc kubenswrapper[4825]: I0310 07:07:04.497041 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-k7b7g" event={"ID":"663403bd-27d6-4584-9aaa-859445f539d6","Type":"ContainerDied","Data":"80b236e92ee0d2a145a3f78f6afddee7e24e109c384187d4c40d05e62a8dce61"} Mar 10 07:07:04 crc kubenswrapper[4825]: I0310 07:07:04.497089 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80b236e92ee0d2a145a3f78f6afddee7e24e109c384187d4c40d05e62a8dce61" Mar 10 07:07:04 crc kubenswrapper[4825]: I0310 07:07:04.497302 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-k7b7g" Mar 10 07:07:04 crc kubenswrapper[4825]: I0310 07:07:04.499578 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8d8c6c8-bda0-456d-9990-687257c80d94","Type":"ContainerStarted","Data":"adf5d4e91a570e81498106b743d4883dc61f0def6e11d66e836f8a700e0b8237"} Mar 10 07:07:04 crc kubenswrapper[4825]: I0310 07:07:04.595768 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 07:07:04 crc kubenswrapper[4825]: E0310 07:07:04.596119 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="663403bd-27d6-4584-9aaa-859445f539d6" containerName="nova-cell0-conductor-db-sync" Mar 10 07:07:04 crc kubenswrapper[4825]: I0310 07:07:04.596137 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="663403bd-27d6-4584-9aaa-859445f539d6" containerName="nova-cell0-conductor-db-sync" Mar 10 07:07:04 crc kubenswrapper[4825]: I0310 07:07:04.596345 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="663403bd-27d6-4584-9aaa-859445f539d6" containerName="nova-cell0-conductor-db-sync" Mar 10 07:07:04 crc kubenswrapper[4825]: I0310 07:07:04.596918 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 10 07:07:04 crc kubenswrapper[4825]: I0310 07:07:04.602811 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-x45l4" Mar 10 07:07:04 crc kubenswrapper[4825]: I0310 07:07:04.603091 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 10 07:07:04 crc kubenswrapper[4825]: I0310 07:07:04.639867 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 07:07:04 crc kubenswrapper[4825]: I0310 07:07:04.723957 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47d73d8a-acf6-42e6-a30d-e093144ee0b9-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"47d73d8a-acf6-42e6-a30d-e093144ee0b9\") " pod="openstack/nova-cell0-conductor-0" Mar 10 07:07:04 crc kubenswrapper[4825]: I0310 07:07:04.724355 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sstsk\" (UniqueName: \"kubernetes.io/projected/47d73d8a-acf6-42e6-a30d-e093144ee0b9-kube-api-access-sstsk\") pod \"nova-cell0-conductor-0\" (UID: \"47d73d8a-acf6-42e6-a30d-e093144ee0b9\") " pod="openstack/nova-cell0-conductor-0" Mar 10 07:07:04 crc kubenswrapper[4825]: I0310 07:07:04.724442 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47d73d8a-acf6-42e6-a30d-e093144ee0b9-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"47d73d8a-acf6-42e6-a30d-e093144ee0b9\") " pod="openstack/nova-cell0-conductor-0" Mar 10 07:07:04 crc kubenswrapper[4825]: I0310 07:07:04.826301 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/47d73d8a-acf6-42e6-a30d-e093144ee0b9-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"47d73d8a-acf6-42e6-a30d-e093144ee0b9\") " pod="openstack/nova-cell0-conductor-0" Mar 10 07:07:04 crc kubenswrapper[4825]: I0310 07:07:04.826390 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47d73d8a-acf6-42e6-a30d-e093144ee0b9-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"47d73d8a-acf6-42e6-a30d-e093144ee0b9\") " pod="openstack/nova-cell0-conductor-0" Mar 10 07:07:04 crc kubenswrapper[4825]: I0310 07:07:04.826437 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sstsk\" (UniqueName: \"kubernetes.io/projected/47d73d8a-acf6-42e6-a30d-e093144ee0b9-kube-api-access-sstsk\") pod \"nova-cell0-conductor-0\" (UID: \"47d73d8a-acf6-42e6-a30d-e093144ee0b9\") " pod="openstack/nova-cell0-conductor-0" Mar 10 07:07:04 crc kubenswrapper[4825]: I0310 07:07:04.830619 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47d73d8a-acf6-42e6-a30d-e093144ee0b9-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"47d73d8a-acf6-42e6-a30d-e093144ee0b9\") " pod="openstack/nova-cell0-conductor-0" Mar 10 07:07:04 crc kubenswrapper[4825]: I0310 07:07:04.831216 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47d73d8a-acf6-42e6-a30d-e093144ee0b9-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"47d73d8a-acf6-42e6-a30d-e093144ee0b9\") " pod="openstack/nova-cell0-conductor-0" Mar 10 07:07:04 crc kubenswrapper[4825]: I0310 07:07:04.845297 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sstsk\" (UniqueName: \"kubernetes.io/projected/47d73d8a-acf6-42e6-a30d-e093144ee0b9-kube-api-access-sstsk\") pod \"nova-cell0-conductor-0\" (UID: 
\"47d73d8a-acf6-42e6-a30d-e093144ee0b9\") " pod="openstack/nova-cell0-conductor-0" Mar 10 07:07:05 crc kubenswrapper[4825]: I0310 07:07:05.005689 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 10 07:07:05 crc kubenswrapper[4825]: I0310 07:07:05.466936 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 07:07:05 crc kubenswrapper[4825]: W0310 07:07:05.470973 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47d73d8a_acf6_42e6_a30d_e093144ee0b9.slice/crio-01f1d5d9ae5b566ec85b11984ea694cea253fd859bb997b38c5b4718a9006b4a WatchSource:0}: Error finding container 01f1d5d9ae5b566ec85b11984ea694cea253fd859bb997b38c5b4718a9006b4a: Status 404 returned error can't find the container with id 01f1d5d9ae5b566ec85b11984ea694cea253fd859bb997b38c5b4718a9006b4a Mar 10 07:07:05 crc kubenswrapper[4825]: I0310 07:07:05.512671 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8d8c6c8-bda0-456d-9990-687257c80d94","Type":"ContainerStarted","Data":"14db831a97790172eaa4da4b994739695ed6532134e042b0b881af61d5be3571"} Mar 10 07:07:05 crc kubenswrapper[4825]: I0310 07:07:05.514084 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"47d73d8a-acf6-42e6-a30d-e093144ee0b9","Type":"ContainerStarted","Data":"01f1d5d9ae5b566ec85b11984ea694cea253fd859bb997b38c5b4718a9006b4a"} Mar 10 07:07:06 crc kubenswrapper[4825]: I0310 07:07:06.526669 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"47d73d8a-acf6-42e6-a30d-e093144ee0b9","Type":"ContainerStarted","Data":"a5a7ebb2da38af819e648d6e5ad4c63fc12ab758ecc1f1178b06bc2d53a663b1"} Mar 10 07:07:06 crc kubenswrapper[4825]: I0310 07:07:06.527083 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/nova-cell0-conductor-0" Mar 10 07:07:06 crc kubenswrapper[4825]: I0310 07:07:06.533602 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8d8c6c8-bda0-456d-9990-687257c80d94","Type":"ContainerStarted","Data":"30741bbef1130028001c57998a0fbcac6249f525d23117f98dbbac7ca47f746d"} Mar 10 07:07:06 crc kubenswrapper[4825]: I0310 07:07:06.533664 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8d8c6c8-bda0-456d-9990-687257c80d94","Type":"ContainerStarted","Data":"eb8eb918d07b85f1845a186b5d8a359823392ba86b985cd163ad0001b3649d0f"} Mar 10 07:07:06 crc kubenswrapper[4825]: I0310 07:07:06.555219 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.555190668 podStartE2EDuration="2.555190668s" podCreationTimestamp="2026-03-10 07:07:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:07:06.5460621 +0000 UTC m=+1379.575842795" watchObservedRunningTime="2026-03-10 07:07:06.555190668 +0000 UTC m=+1379.584971313" Mar 10 07:07:08 crc kubenswrapper[4825]: I0310 07:07:08.551867 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8d8c6c8-bda0-456d-9990-687257c80d94","Type":"ContainerStarted","Data":"1eb716d9b690168d881b625fe363382d3aef275e433436e5abbfe3274a8d8e0c"} Mar 10 07:07:08 crc kubenswrapper[4825]: I0310 07:07:08.552472 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 07:07:08 crc kubenswrapper[4825]: I0310 07:07:08.573338 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.316460707 podStartE2EDuration="6.573321528s" podCreationTimestamp="2026-03-10 07:07:02 +0000 UTC" firstStartedPulling="2026-03-10 
07:07:03.706900547 +0000 UTC m=+1376.736681162" lastFinishedPulling="2026-03-10 07:07:07.963761358 +0000 UTC m=+1380.993541983" observedRunningTime="2026-03-10 07:07:08.569857958 +0000 UTC m=+1381.599638573" watchObservedRunningTime="2026-03-10 07:07:08.573321528 +0000 UTC m=+1381.603102143" Mar 10 07:07:10 crc kubenswrapper[4825]: I0310 07:07:10.053612 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 10 07:07:10 crc kubenswrapper[4825]: I0310 07:07:10.643970 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-npbf2"] Mar 10 07:07:10 crc kubenswrapper[4825]: I0310 07:07:10.659326 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-npbf2"] Mar 10 07:07:10 crc kubenswrapper[4825]: I0310 07:07:10.659438 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-npbf2" Mar 10 07:07:10 crc kubenswrapper[4825]: I0310 07:07:10.663003 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3-config-data\") pod \"nova-cell0-cell-mapping-npbf2\" (UID: \"a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3\") " pod="openstack/nova-cell0-cell-mapping-npbf2" Mar 10 07:07:10 crc kubenswrapper[4825]: I0310 07:07:10.663060 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8dfh\" (UniqueName: \"kubernetes.io/projected/a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3-kube-api-access-z8dfh\") pod \"nova-cell0-cell-mapping-npbf2\" (UID: \"a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3\") " pod="openstack/nova-cell0-cell-mapping-npbf2" Mar 10 07:07:10 crc kubenswrapper[4825]: I0310 07:07:10.663519 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3-scripts\") pod \"nova-cell0-cell-mapping-npbf2\" (UID: \"a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3\") " pod="openstack/nova-cell0-cell-mapping-npbf2" Mar 10 07:07:10 crc kubenswrapper[4825]: I0310 07:07:10.663540 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 10 07:07:10 crc kubenswrapper[4825]: I0310 07:07:10.663808 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-npbf2\" (UID: \"a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3\") " pod="openstack/nova-cell0-cell-mapping-npbf2" Mar 10 07:07:10 crc kubenswrapper[4825]: I0310 07:07:10.664005 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 10 07:07:10 crc kubenswrapper[4825]: I0310 07:07:10.765633 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3-scripts\") pod \"nova-cell0-cell-mapping-npbf2\" (UID: \"a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3\") " pod="openstack/nova-cell0-cell-mapping-npbf2" Mar 10 07:07:10 crc kubenswrapper[4825]: I0310 07:07:10.765843 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-npbf2\" (UID: \"a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3\") " pod="openstack/nova-cell0-cell-mapping-npbf2" Mar 10 07:07:10 crc kubenswrapper[4825]: I0310 07:07:10.765943 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3-config-data\") pod 
\"nova-cell0-cell-mapping-npbf2\" (UID: \"a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3\") " pod="openstack/nova-cell0-cell-mapping-npbf2" Mar 10 07:07:10 crc kubenswrapper[4825]: I0310 07:07:10.765989 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8dfh\" (UniqueName: \"kubernetes.io/projected/a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3-kube-api-access-z8dfh\") pod \"nova-cell0-cell-mapping-npbf2\" (UID: \"a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3\") " pod="openstack/nova-cell0-cell-mapping-npbf2" Mar 10 07:07:10 crc kubenswrapper[4825]: I0310 07:07:10.774111 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-npbf2\" (UID: \"a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3\") " pod="openstack/nova-cell0-cell-mapping-npbf2" Mar 10 07:07:10 crc kubenswrapper[4825]: I0310 07:07:10.774279 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3-config-data\") pod \"nova-cell0-cell-mapping-npbf2\" (UID: \"a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3\") " pod="openstack/nova-cell0-cell-mapping-npbf2" Mar 10 07:07:10 crc kubenswrapper[4825]: I0310 07:07:10.784659 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3-scripts\") pod \"nova-cell0-cell-mapping-npbf2\" (UID: \"a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3\") " pod="openstack/nova-cell0-cell-mapping-npbf2" Mar 10 07:07:10 crc kubenswrapper[4825]: I0310 07:07:10.791923 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8dfh\" (UniqueName: \"kubernetes.io/projected/a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3-kube-api-access-z8dfh\") pod \"nova-cell0-cell-mapping-npbf2\" (UID: 
\"a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3\") " pod="openstack/nova-cell0-cell-mapping-npbf2" Mar 10 07:07:10 crc kubenswrapper[4825]: I0310 07:07:10.854609 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 07:07:10 crc kubenswrapper[4825]: I0310 07:07:10.856438 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 07:07:10 crc kubenswrapper[4825]: I0310 07:07:10.893155 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 07:07:10 crc kubenswrapper[4825]: I0310 07:07:10.923451 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 10 07:07:10 crc kubenswrapper[4825]: I0310 07:07:10.967440 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 10 07:07:10 crc kubenswrapper[4825]: I0310 07:07:10.972472 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9632a51-2c7a-4538-895d-c9c99e7f1e00-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f9632a51-2c7a-4538-895d-c9c99e7f1e00\") " pod="openstack/nova-scheduler-0" Mar 10 07:07:10 crc kubenswrapper[4825]: I0310 07:07:10.972576 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9632a51-2c7a-4538-895d-c9c99e7f1e00-config-data\") pod \"nova-scheduler-0\" (UID: \"f9632a51-2c7a-4538-895d-c9c99e7f1e00\") " pod="openstack/nova-scheduler-0" Mar 10 07:07:10 crc kubenswrapper[4825]: I0310 07:07:10.972637 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc6j9\" (UniqueName: \"kubernetes.io/projected/f9632a51-2c7a-4538-895d-c9c99e7f1e00-kube-api-access-nc6j9\") pod \"nova-scheduler-0\" (UID: 
\"f9632a51-2c7a-4538-895d-c9c99e7f1e00\") " pod="openstack/nova-scheduler-0" Mar 10 07:07:10 crc kubenswrapper[4825]: I0310 07:07:10.974915 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 07:07:10 crc kubenswrapper[4825]: I0310 07:07:10.978090 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 10 07:07:10 crc kubenswrapper[4825]: I0310 07:07:10.984584 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-npbf2" Mar 10 07:07:10 crc kubenswrapper[4825]: I0310 07:07:10.992463 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.088587 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6965f726-906a-463f-b776-5ddf8e672c29-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6965f726-906a-463f-b776-5ddf8e672c29\") " pod="openstack/nova-api-0" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.088640 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvl7d\" (UniqueName: \"kubernetes.io/projected/6965f726-906a-463f-b776-5ddf8e672c29-kube-api-access-gvl7d\") pod \"nova-api-0\" (UID: \"6965f726-906a-463f-b776-5ddf8e672c29\") " pod="openstack/nova-api-0" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.088676 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9632a51-2c7a-4538-895d-c9c99e7f1e00-config-data\") pod \"nova-scheduler-0\" (UID: \"f9632a51-2c7a-4538-895d-c9c99e7f1e00\") " pod="openstack/nova-scheduler-0" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.088730 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6965f726-906a-463f-b776-5ddf8e672c29-logs\") pod \"nova-api-0\" (UID: \"6965f726-906a-463f-b776-5ddf8e672c29\") " pod="openstack/nova-api-0" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.088752 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nc6j9\" (UniqueName: \"kubernetes.io/projected/f9632a51-2c7a-4538-895d-c9c99e7f1e00-kube-api-access-nc6j9\") pod \"nova-scheduler-0\" (UID: \"f9632a51-2c7a-4538-895d-c9c99e7f1e00\") " pod="openstack/nova-scheduler-0" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.088815 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6965f726-906a-463f-b776-5ddf8e672c29-config-data\") pod \"nova-api-0\" (UID: \"6965f726-906a-463f-b776-5ddf8e672c29\") " pod="openstack/nova-api-0" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.088842 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9632a51-2c7a-4538-895d-c9c99e7f1e00-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f9632a51-2c7a-4538-895d-c9c99e7f1e00\") " pod="openstack/nova-scheduler-0" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.093511 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.094993 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.098113 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9632a51-2c7a-4538-895d-c9c99e7f1e00-config-data\") pod \"nova-scheduler-0\" (UID: \"f9632a51-2c7a-4538-895d-c9c99e7f1e00\") " pod="openstack/nova-scheduler-0" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.098477 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.101777 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9632a51-2c7a-4538-895d-c9c99e7f1e00-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f9632a51-2c7a-4538-895d-c9c99e7f1e00\") " pod="openstack/nova-scheduler-0" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.135250 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc6j9\" (UniqueName: \"kubernetes.io/projected/f9632a51-2c7a-4538-895d-c9c99e7f1e00-kube-api-access-nc6j9\") pod \"nova-scheduler-0\" (UID: \"f9632a51-2c7a-4538-895d-c9c99e7f1e00\") " pod="openstack/nova-scheduler-0" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.165349 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.192197 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.193435 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.197016 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.198401 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnh9d\" (UniqueName: \"kubernetes.io/projected/3c57dafa-d38c-4d1b-9de5-570c25ec94ce-kube-api-access-bnh9d\") pod \"nova-metadata-0\" (UID: \"3c57dafa-d38c-4d1b-9de5-570c25ec94ce\") " pod="openstack/nova-metadata-0" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.198453 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6965f726-906a-463f-b776-5ddf8e672c29-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6965f726-906a-463f-b776-5ddf8e672c29\") " pod="openstack/nova-api-0" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.198470 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c57dafa-d38c-4d1b-9de5-570c25ec94ce-logs\") pod \"nova-metadata-0\" (UID: \"3c57dafa-d38c-4d1b-9de5-570c25ec94ce\") " pod="openstack/nova-metadata-0" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.198508 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvl7d\" (UniqueName: \"kubernetes.io/projected/6965f726-906a-463f-b776-5ddf8e672c29-kube-api-access-gvl7d\") pod \"nova-api-0\" (UID: \"6965f726-906a-463f-b776-5ddf8e672c29\") " pod="openstack/nova-api-0" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.198542 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c57dafa-d38c-4d1b-9de5-570c25ec94ce-config-data\") pod 
\"nova-metadata-0\" (UID: \"3c57dafa-d38c-4d1b-9de5-570c25ec94ce\") " pod="openstack/nova-metadata-0" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.198595 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6965f726-906a-463f-b776-5ddf8e672c29-logs\") pod \"nova-api-0\" (UID: \"6965f726-906a-463f-b776-5ddf8e672c29\") " pod="openstack/nova-api-0" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.198652 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6965f726-906a-463f-b776-5ddf8e672c29-config-data\") pod \"nova-api-0\" (UID: \"6965f726-906a-463f-b776-5ddf8e672c29\") " pod="openstack/nova-api-0" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.198673 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c57dafa-d38c-4d1b-9de5-570c25ec94ce-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3c57dafa-d38c-4d1b-9de5-570c25ec94ce\") " pod="openstack/nova-metadata-0" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.199485 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6965f726-906a-463f-b776-5ddf8e672c29-logs\") pod \"nova-api-0\" (UID: \"6965f726-906a-463f-b776-5ddf8e672c29\") " pod="openstack/nova-api-0" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.212235 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6965f726-906a-463f-b776-5ddf8e672c29-config-data\") pod \"nova-api-0\" (UID: \"6965f726-906a-463f-b776-5ddf8e672c29\") " pod="openstack/nova-api-0" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.229534 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 
07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.245884 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6965f726-906a-463f-b776-5ddf8e672c29-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6965f726-906a-463f-b776-5ddf8e672c29\") " pod="openstack/nova-api-0" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.247658 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvl7d\" (UniqueName: \"kubernetes.io/projected/6965f726-906a-463f-b776-5ddf8e672c29-kube-api-access-gvl7d\") pod \"nova-api-0\" (UID: \"6965f726-906a-463f-b776-5ddf8e672c29\") " pod="openstack/nova-api-0" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.266097 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.290227 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7bd5679c8c-q8gbc"] Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.299859 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bd5679c8c-q8gbc"] Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.300036 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bd5679c8c-q8gbc" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.300762 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/568490b2-38d9-45d8-ad81-cfffffc697f2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"568490b2-38d9-45d8-ad81-cfffffc697f2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.300840 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/568490b2-38d9-45d8-ad81-cfffffc697f2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"568490b2-38d9-45d8-ad81-cfffffc697f2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.300957 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c57dafa-d38c-4d1b-9de5-570c25ec94ce-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3c57dafa-d38c-4d1b-9de5-570c25ec94ce\") " pod="openstack/nova-metadata-0" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.301009 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8hcw\" (UniqueName: \"kubernetes.io/projected/568490b2-38d9-45d8-ad81-cfffffc697f2-kube-api-access-j8hcw\") pod \"nova-cell1-novncproxy-0\" (UID: \"568490b2-38d9-45d8-ad81-cfffffc697f2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.301035 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnh9d\" (UniqueName: \"kubernetes.io/projected/3c57dafa-d38c-4d1b-9de5-570c25ec94ce-kube-api-access-bnh9d\") pod \"nova-metadata-0\" (UID: \"3c57dafa-d38c-4d1b-9de5-570c25ec94ce\") " 
pod="openstack/nova-metadata-0" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.301281 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c57dafa-d38c-4d1b-9de5-570c25ec94ce-logs\") pod \"nova-metadata-0\" (UID: \"3c57dafa-d38c-4d1b-9de5-570c25ec94ce\") " pod="openstack/nova-metadata-0" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.301363 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c57dafa-d38c-4d1b-9de5-570c25ec94ce-config-data\") pod \"nova-metadata-0\" (UID: \"3c57dafa-d38c-4d1b-9de5-570c25ec94ce\") " pod="openstack/nova-metadata-0" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.304929 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c57dafa-d38c-4d1b-9de5-570c25ec94ce-config-data\") pod \"nova-metadata-0\" (UID: \"3c57dafa-d38c-4d1b-9de5-570c25ec94ce\") " pod="openstack/nova-metadata-0" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.305545 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c57dafa-d38c-4d1b-9de5-570c25ec94ce-logs\") pod \"nova-metadata-0\" (UID: \"3c57dafa-d38c-4d1b-9de5-570c25ec94ce\") " pod="openstack/nova-metadata-0" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.307789 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c57dafa-d38c-4d1b-9de5-570c25ec94ce-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3c57dafa-d38c-4d1b-9de5-570c25ec94ce\") " pod="openstack/nova-metadata-0" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.324662 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnh9d\" (UniqueName: 
\"kubernetes.io/projected/3c57dafa-d38c-4d1b-9de5-570c25ec94ce-kube-api-access-bnh9d\") pod \"nova-metadata-0\" (UID: \"3c57dafa-d38c-4d1b-9de5-570c25ec94ce\") " pod="openstack/nova-metadata-0" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.403398 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3ffd2f64-247a-4adf-bdc5-dce905192f17-dns-swift-storage-0\") pod \"dnsmasq-dns-7bd5679c8c-q8gbc\" (UID: \"3ffd2f64-247a-4adf-bdc5-dce905192f17\") " pod="openstack/dnsmasq-dns-7bd5679c8c-q8gbc" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.403657 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/568490b2-38d9-45d8-ad81-cfffffc697f2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"568490b2-38d9-45d8-ad81-cfffffc697f2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.403729 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/568490b2-38d9-45d8-ad81-cfffffc697f2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"568490b2-38d9-45d8-ad81-cfffffc697f2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.403791 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlrvx\" (UniqueName: \"kubernetes.io/projected/3ffd2f64-247a-4adf-bdc5-dce905192f17-kube-api-access-dlrvx\") pod \"dnsmasq-dns-7bd5679c8c-q8gbc\" (UID: \"3ffd2f64-247a-4adf-bdc5-dce905192f17\") " pod="openstack/dnsmasq-dns-7bd5679c8c-q8gbc" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.403828 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3ffd2f64-247a-4adf-bdc5-dce905192f17-config\") pod \"dnsmasq-dns-7bd5679c8c-q8gbc\" (UID: \"3ffd2f64-247a-4adf-bdc5-dce905192f17\") " pod="openstack/dnsmasq-dns-7bd5679c8c-q8gbc" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.404174 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3ffd2f64-247a-4adf-bdc5-dce905192f17-dns-svc\") pod \"dnsmasq-dns-7bd5679c8c-q8gbc\" (UID: \"3ffd2f64-247a-4adf-bdc5-dce905192f17\") " pod="openstack/dnsmasq-dns-7bd5679c8c-q8gbc" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.404248 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3ffd2f64-247a-4adf-bdc5-dce905192f17-ovsdbserver-nb\") pod \"dnsmasq-dns-7bd5679c8c-q8gbc\" (UID: \"3ffd2f64-247a-4adf-bdc5-dce905192f17\") " pod="openstack/dnsmasq-dns-7bd5679c8c-q8gbc" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.404345 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8hcw\" (UniqueName: \"kubernetes.io/projected/568490b2-38d9-45d8-ad81-cfffffc697f2-kube-api-access-j8hcw\") pod \"nova-cell1-novncproxy-0\" (UID: \"568490b2-38d9-45d8-ad81-cfffffc697f2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.404481 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3ffd2f64-247a-4adf-bdc5-dce905192f17-ovsdbserver-sb\") pod \"dnsmasq-dns-7bd5679c8c-q8gbc\" (UID: \"3ffd2f64-247a-4adf-bdc5-dce905192f17\") " pod="openstack/dnsmasq-dns-7bd5679c8c-q8gbc" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.408949 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/568490b2-38d9-45d8-ad81-cfffffc697f2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"568490b2-38d9-45d8-ad81-cfffffc697f2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.410875 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/568490b2-38d9-45d8-ad81-cfffffc697f2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"568490b2-38d9-45d8-ad81-cfffffc697f2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.412459 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.424962 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8hcw\" (UniqueName: \"kubernetes.io/projected/568490b2-38d9-45d8-ad81-cfffffc697f2-kube-api-access-j8hcw\") pod \"nova-cell1-novncproxy-0\" (UID: \"568490b2-38d9-45d8-ad81-cfffffc697f2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.506197 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlrvx\" (UniqueName: \"kubernetes.io/projected/3ffd2f64-247a-4adf-bdc5-dce905192f17-kube-api-access-dlrvx\") pod \"dnsmasq-dns-7bd5679c8c-q8gbc\" (UID: \"3ffd2f64-247a-4adf-bdc5-dce905192f17\") " pod="openstack/dnsmasq-dns-7bd5679c8c-q8gbc" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.506268 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ffd2f64-247a-4adf-bdc5-dce905192f17-config\") pod \"dnsmasq-dns-7bd5679c8c-q8gbc\" (UID: \"3ffd2f64-247a-4adf-bdc5-dce905192f17\") " pod="openstack/dnsmasq-dns-7bd5679c8c-q8gbc" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.506342 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3ffd2f64-247a-4adf-bdc5-dce905192f17-dns-svc\") pod \"dnsmasq-dns-7bd5679c8c-q8gbc\" (UID: \"3ffd2f64-247a-4adf-bdc5-dce905192f17\") " pod="openstack/dnsmasq-dns-7bd5679c8c-q8gbc" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.506368 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3ffd2f64-247a-4adf-bdc5-dce905192f17-ovsdbserver-nb\") pod \"dnsmasq-dns-7bd5679c8c-q8gbc\" (UID: \"3ffd2f64-247a-4adf-bdc5-dce905192f17\") " pod="openstack/dnsmasq-dns-7bd5679c8c-q8gbc" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.506415 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3ffd2f64-247a-4adf-bdc5-dce905192f17-ovsdbserver-sb\") pod \"dnsmasq-dns-7bd5679c8c-q8gbc\" (UID: \"3ffd2f64-247a-4adf-bdc5-dce905192f17\") " pod="openstack/dnsmasq-dns-7bd5679c8c-q8gbc" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.506465 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3ffd2f64-247a-4adf-bdc5-dce905192f17-dns-swift-storage-0\") pod \"dnsmasq-dns-7bd5679c8c-q8gbc\" (UID: \"3ffd2f64-247a-4adf-bdc5-dce905192f17\") " pod="openstack/dnsmasq-dns-7bd5679c8c-q8gbc" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.508659 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3ffd2f64-247a-4adf-bdc5-dce905192f17-dns-swift-storage-0\") pod \"dnsmasq-dns-7bd5679c8c-q8gbc\" (UID: \"3ffd2f64-247a-4adf-bdc5-dce905192f17\") " pod="openstack/dnsmasq-dns-7bd5679c8c-q8gbc" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.508883 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/3ffd2f64-247a-4adf-bdc5-dce905192f17-dns-svc\") pod \"dnsmasq-dns-7bd5679c8c-q8gbc\" (UID: \"3ffd2f64-247a-4adf-bdc5-dce905192f17\") " pod="openstack/dnsmasq-dns-7bd5679c8c-q8gbc" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.509587 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3ffd2f64-247a-4adf-bdc5-dce905192f17-ovsdbserver-nb\") pod \"dnsmasq-dns-7bd5679c8c-q8gbc\" (UID: \"3ffd2f64-247a-4adf-bdc5-dce905192f17\") " pod="openstack/dnsmasq-dns-7bd5679c8c-q8gbc" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.509821 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ffd2f64-247a-4adf-bdc5-dce905192f17-config\") pod \"dnsmasq-dns-7bd5679c8c-q8gbc\" (UID: \"3ffd2f64-247a-4adf-bdc5-dce905192f17\") " pod="openstack/dnsmasq-dns-7bd5679c8c-q8gbc" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.512641 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3ffd2f64-247a-4adf-bdc5-dce905192f17-ovsdbserver-sb\") pod \"dnsmasq-dns-7bd5679c8c-q8gbc\" (UID: \"3ffd2f64-247a-4adf-bdc5-dce905192f17\") " pod="openstack/dnsmasq-dns-7bd5679c8c-q8gbc" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.520824 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.530281 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlrvx\" (UniqueName: \"kubernetes.io/projected/3ffd2f64-247a-4adf-bdc5-dce905192f17-kube-api-access-dlrvx\") pod \"dnsmasq-dns-7bd5679c8c-q8gbc\" (UID: \"3ffd2f64-247a-4adf-bdc5-dce905192f17\") " pod="openstack/dnsmasq-dns-7bd5679c8c-q8gbc" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.559034 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.641442 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bd5679c8c-q8gbc" Mar 10 07:07:11 crc kubenswrapper[4825]: I0310 07:07:11.678548 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-npbf2"] Mar 10 07:07:12 crc kubenswrapper[4825]: I0310 07:07:11.799608 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5l64f"] Mar 10 07:07:12 crc kubenswrapper[4825]: I0310 07:07:11.800798 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-5l64f" Mar 10 07:07:12 crc kubenswrapper[4825]: I0310 07:07:11.802692 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 10 07:07:12 crc kubenswrapper[4825]: I0310 07:07:11.804535 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 10 07:07:12 crc kubenswrapper[4825]: I0310 07:07:11.819328 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5l64f"] Mar 10 07:07:12 crc kubenswrapper[4825]: I0310 07:07:11.820392 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tstxv\" (UniqueName: \"kubernetes.io/projected/3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2-kube-api-access-tstxv\") pod \"nova-cell1-conductor-db-sync-5l64f\" (UID: \"3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2\") " pod="openstack/nova-cell1-conductor-db-sync-5l64f" Mar 10 07:07:12 crc kubenswrapper[4825]: I0310 07:07:11.820499 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2-scripts\") pod \"nova-cell1-conductor-db-sync-5l64f\" (UID: \"3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2\") " pod="openstack/nova-cell1-conductor-db-sync-5l64f" Mar 10 07:07:12 crc kubenswrapper[4825]: I0310 07:07:11.820549 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-5l64f\" (UID: \"3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2\") " pod="openstack/nova-cell1-conductor-db-sync-5l64f" Mar 10 07:07:12 crc kubenswrapper[4825]: I0310 07:07:11.820615 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2-config-data\") pod \"nova-cell1-conductor-db-sync-5l64f\" (UID: \"3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2\") " pod="openstack/nova-cell1-conductor-db-sync-5l64f" Mar 10 07:07:12 crc kubenswrapper[4825]: I0310 07:07:11.875552 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 07:07:12 crc kubenswrapper[4825]: W0310 07:07:11.880555 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9632a51_2c7a_4538_895d_c9c99e7f1e00.slice/crio-dead8cb276b6510c07d7f0ca63188a075598debe1e7e7359b6259c263f798c08 WatchSource:0}: Error finding container dead8cb276b6510c07d7f0ca63188a075598debe1e7e7359b6259c263f798c08: Status 404 returned error can't find the container with id dead8cb276b6510c07d7f0ca63188a075598debe1e7e7359b6259c263f798c08 Mar 10 07:07:12 crc kubenswrapper[4825]: I0310 07:07:11.921990 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2-scripts\") pod \"nova-cell1-conductor-db-sync-5l64f\" (UID: \"3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2\") " pod="openstack/nova-cell1-conductor-db-sync-5l64f" Mar 10 07:07:12 crc kubenswrapper[4825]: I0310 07:07:11.923089 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-5l64f\" (UID: \"3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2\") " pod="openstack/nova-cell1-conductor-db-sync-5l64f" Mar 10 07:07:12 crc kubenswrapper[4825]: I0310 07:07:11.923180 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2-config-data\") pod \"nova-cell1-conductor-db-sync-5l64f\" (UID: \"3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2\") " pod="openstack/nova-cell1-conductor-db-sync-5l64f" Mar 10 07:07:12 crc kubenswrapper[4825]: I0310 07:07:11.923296 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tstxv\" (UniqueName: \"kubernetes.io/projected/3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2-kube-api-access-tstxv\") pod \"nova-cell1-conductor-db-sync-5l64f\" (UID: \"3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2\") " pod="openstack/nova-cell1-conductor-db-sync-5l64f" Mar 10 07:07:12 crc kubenswrapper[4825]: I0310 07:07:11.927438 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-5l64f\" (UID: \"3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2\") " pod="openstack/nova-cell1-conductor-db-sync-5l64f" Mar 10 07:07:12 crc kubenswrapper[4825]: I0310 07:07:11.928866 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2-scripts\") pod \"nova-cell1-conductor-db-sync-5l64f\" (UID: \"3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2\") " pod="openstack/nova-cell1-conductor-db-sync-5l64f" Mar 10 07:07:12 crc kubenswrapper[4825]: I0310 07:07:11.929371 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2-config-data\") pod \"nova-cell1-conductor-db-sync-5l64f\" (UID: \"3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2\") " pod="openstack/nova-cell1-conductor-db-sync-5l64f" Mar 10 07:07:12 crc kubenswrapper[4825]: I0310 07:07:11.944332 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tstxv\" (UniqueName: 
\"kubernetes.io/projected/3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2-kube-api-access-tstxv\") pod \"nova-cell1-conductor-db-sync-5l64f\" (UID: \"3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2\") " pod="openstack/nova-cell1-conductor-db-sync-5l64f" Mar 10 07:07:12 crc kubenswrapper[4825]: I0310 07:07:12.057217 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-5l64f" Mar 10 07:07:12 crc kubenswrapper[4825]: I0310 07:07:12.075958 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 07:07:12 crc kubenswrapper[4825]: I0310 07:07:12.093040 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 07:07:12 crc kubenswrapper[4825]: W0310 07:07:12.119739 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6965f726_906a_463f_b776_5ddf8e672c29.slice/crio-1bcc123db11e54e7a1032777a413cfd2454ffd2e1a195c44bc28bd53b74391f9 WatchSource:0}: Error finding container 1bcc123db11e54e7a1032777a413cfd2454ffd2e1a195c44bc28bd53b74391f9: Status 404 returned error can't find the container with id 1bcc123db11e54e7a1032777a413cfd2454ffd2e1a195c44bc28bd53b74391f9 Mar 10 07:07:12 crc kubenswrapper[4825]: I0310 07:07:12.311809 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 07:07:12 crc kubenswrapper[4825]: I0310 07:07:12.602189 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-npbf2" event={"ID":"a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3","Type":"ContainerStarted","Data":"e4fc36f4f8673da278d64fd9e1e19a5d1e27f954df79bea201495848ce4acc7b"} Mar 10 07:07:12 crc kubenswrapper[4825]: I0310 07:07:12.602230 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-npbf2" 
event={"ID":"a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3","Type":"ContainerStarted","Data":"4da21a373258dee194b79ceabb67632dbbc47d13d8ceeb34373a0b282ba0bf53"} Mar 10 07:07:12 crc kubenswrapper[4825]: I0310 07:07:12.604879 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"568490b2-38d9-45d8-ad81-cfffffc697f2","Type":"ContainerStarted","Data":"b8df5da5426b8daa53c19be52592664babf87026fce3ccbfcca15714e3d2f09a"} Mar 10 07:07:12 crc kubenswrapper[4825]: I0310 07:07:12.606690 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f9632a51-2c7a-4538-895d-c9c99e7f1e00","Type":"ContainerStarted","Data":"dead8cb276b6510c07d7f0ca63188a075598debe1e7e7359b6259c263f798c08"} Mar 10 07:07:12 crc kubenswrapper[4825]: I0310 07:07:12.610429 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6965f726-906a-463f-b776-5ddf8e672c29","Type":"ContainerStarted","Data":"1bcc123db11e54e7a1032777a413cfd2454ffd2e1a195c44bc28bd53b74391f9"} Mar 10 07:07:12 crc kubenswrapper[4825]: I0310 07:07:12.622177 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c57dafa-d38c-4d1b-9de5-570c25ec94ce","Type":"ContainerStarted","Data":"3360a1dfcdfb9ea8534befbe48ca27d20d7a1dd8bfc63d248b23129f3ca5a8be"} Mar 10 07:07:12 crc kubenswrapper[4825]: I0310 07:07:12.627961 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-npbf2" podStartSLOduration=2.627936787 podStartE2EDuration="2.627936787s" podCreationTimestamp="2026-03-10 07:07:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:07:12.618187782 +0000 UTC m=+1385.647968417" watchObservedRunningTime="2026-03-10 07:07:12.627936787 +0000 UTC m=+1385.657717402" Mar 10 07:07:12 crc kubenswrapper[4825]: I0310 07:07:12.752826 
4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bd5679c8c-q8gbc"] Mar 10 07:07:12 crc kubenswrapper[4825]: I0310 07:07:12.820866 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5l64f"] Mar 10 07:07:13 crc kubenswrapper[4825]: I0310 07:07:13.632230 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-5l64f" event={"ID":"3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2","Type":"ContainerStarted","Data":"559ef4bd666291eae26c4b939416644a6b7eb90e6ef3542f1756432febae846d"} Mar 10 07:07:13 crc kubenswrapper[4825]: I0310 07:07:13.634948 4825 generic.go:334] "Generic (PLEG): container finished" podID="3ffd2f64-247a-4adf-bdc5-dce905192f17" containerID="b2975973fabf7cfa881a9ec9b70d675113e43e495fbc5493dca9e9faa3a17b72" exitCode=0 Mar 10 07:07:13 crc kubenswrapper[4825]: I0310 07:07:13.636700 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd5679c8c-q8gbc" event={"ID":"3ffd2f64-247a-4adf-bdc5-dce905192f17","Type":"ContainerDied","Data":"b2975973fabf7cfa881a9ec9b70d675113e43e495fbc5493dca9e9faa3a17b72"} Mar 10 07:07:13 crc kubenswrapper[4825]: I0310 07:07:13.636722 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd5679c8c-q8gbc" event={"ID":"3ffd2f64-247a-4adf-bdc5-dce905192f17","Type":"ContainerStarted","Data":"521ed245bf2452731ceaec17dcb1bbbed78deeb48213416bef0ca079b5b1c0cd"} Mar 10 07:07:14 crc kubenswrapper[4825]: I0310 07:07:14.650231 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-5l64f" event={"ID":"3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2","Type":"ContainerStarted","Data":"5a6694411ba07209eab6dc1d0ce1dc4fc71125bfd4984ffce3d54b7f66b38d63"} Mar 10 07:07:14 crc kubenswrapper[4825]: I0310 07:07:14.673998 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-5l64f" 
podStartSLOduration=3.673978124 podStartE2EDuration="3.673978124s" podCreationTimestamp="2026-03-10 07:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:07:14.666143509 +0000 UTC m=+1387.695924144" watchObservedRunningTime="2026-03-10 07:07:14.673978124 +0000 UTC m=+1387.703758749" Mar 10 07:07:14 crc kubenswrapper[4825]: I0310 07:07:14.900929 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 07:07:14 crc kubenswrapper[4825]: I0310 07:07:14.909213 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 07:07:15 crc kubenswrapper[4825]: I0310 07:07:15.691282 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd5679c8c-q8gbc" event={"ID":"3ffd2f64-247a-4adf-bdc5-dce905192f17","Type":"ContainerStarted","Data":"ef26b86d0482989024104298ad30b8391e618d4bc4de4ccada15c03fbac43938"} Mar 10 07:07:15 crc kubenswrapper[4825]: I0310 07:07:15.693708 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7bd5679c8c-q8gbc" Mar 10 07:07:15 crc kubenswrapper[4825]: I0310 07:07:15.710808 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c57dafa-d38c-4d1b-9de5-570c25ec94ce","Type":"ContainerStarted","Data":"2a9818b5c33c795bc5e4625e6532ef07969e866625cd296b726cd56a8335cde3"} Mar 10 07:07:15 crc kubenswrapper[4825]: I0310 07:07:15.714315 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"568490b2-38d9-45d8-ad81-cfffffc697f2","Type":"ContainerStarted","Data":"17ed4ed2a06281c643252e4ec49d9bfea8bd858f1cae1cf04f57b290cecc1a71"} Mar 10 07:07:15 crc kubenswrapper[4825]: I0310 07:07:15.714471 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" 
podUID="568490b2-38d9-45d8-ad81-cfffffc697f2" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://17ed4ed2a06281c643252e4ec49d9bfea8bd858f1cae1cf04f57b290cecc1a71" gracePeriod=30 Mar 10 07:07:15 crc kubenswrapper[4825]: I0310 07:07:15.719699 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f9632a51-2c7a-4538-895d-c9c99e7f1e00","Type":"ContainerStarted","Data":"92bb61ab3a96d1bb04b8dc5bbd78d1594b7ade60769154636efd908f14fc8323"} Mar 10 07:07:15 crc kubenswrapper[4825]: I0310 07:07:15.719865 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7bd5679c8c-q8gbc" podStartSLOduration=4.719846968 podStartE2EDuration="4.719846968s" podCreationTimestamp="2026-03-10 07:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:07:15.708255946 +0000 UTC m=+1388.738036561" watchObservedRunningTime="2026-03-10 07:07:15.719846968 +0000 UTC m=+1388.749627583" Mar 10 07:07:15 crc kubenswrapper[4825]: I0310 07:07:15.724914 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6965f726-906a-463f-b776-5ddf8e672c29","Type":"ContainerStarted","Data":"0dba2b616476849648bdd5c442999b93f5a2d75b861d8c1098438ac42d5eacf3"} Mar 10 07:07:15 crc kubenswrapper[4825]: I0310 07:07:15.758481 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.020419207 podStartE2EDuration="4.758455694s" podCreationTimestamp="2026-03-10 07:07:11 +0000 UTC" firstStartedPulling="2026-03-10 07:07:12.317063062 +0000 UTC m=+1385.346843677" lastFinishedPulling="2026-03-10 07:07:15.055099549 +0000 UTC m=+1388.084880164" observedRunningTime="2026-03-10 07:07:15.736779209 +0000 UTC m=+1388.766559824" watchObservedRunningTime="2026-03-10 07:07:15.758455694 +0000 UTC m=+1388.788236309" Mar 10 
07:07:15 crc kubenswrapper[4825]: I0310 07:07:15.771062 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.223724799 podStartE2EDuration="5.771042913s" podCreationTimestamp="2026-03-10 07:07:10 +0000 UTC" firstStartedPulling="2026-03-10 07:07:11.882378941 +0000 UTC m=+1384.912159556" lastFinishedPulling="2026-03-10 07:07:14.429697045 +0000 UTC m=+1387.459477670" observedRunningTime="2026-03-10 07:07:15.749289255 +0000 UTC m=+1388.779069870" watchObservedRunningTime="2026-03-10 07:07:15.771042913 +0000 UTC m=+1388.800823528" Mar 10 07:07:16 crc kubenswrapper[4825]: I0310 07:07:16.267064 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 10 07:07:16 crc kubenswrapper[4825]: I0310 07:07:16.559968 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 10 07:07:16 crc kubenswrapper[4825]: I0310 07:07:16.735085 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c57dafa-d38c-4d1b-9de5-570c25ec94ce","Type":"ContainerStarted","Data":"bc474ee7a41cea179857eefd140aa50bd4b92baba905079b14409fbd7aed96a9"} Mar 10 07:07:16 crc kubenswrapper[4825]: I0310 07:07:16.735231 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3c57dafa-d38c-4d1b-9de5-570c25ec94ce" containerName="nova-metadata-log" containerID="cri-o://2a9818b5c33c795bc5e4625e6532ef07969e866625cd296b726cd56a8335cde3" gracePeriod=30 Mar 10 07:07:16 crc kubenswrapper[4825]: I0310 07:07:16.735693 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3c57dafa-d38c-4d1b-9de5-570c25ec94ce" containerName="nova-metadata-metadata" containerID="cri-o://bc474ee7a41cea179857eefd140aa50bd4b92baba905079b14409fbd7aed96a9" gracePeriod=30 Mar 10 07:07:16 crc kubenswrapper[4825]: 
I0310 07:07:16.740641 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6965f726-906a-463f-b776-5ddf8e672c29","Type":"ContainerStarted","Data":"11f9db886132d5a6eb8c4e724d2cdece1f01b59eff676d258ab784a238134bfa"} Mar 10 07:07:16 crc kubenswrapper[4825]: I0310 07:07:16.798906 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.897635666 podStartE2EDuration="6.798891817s" podCreationTimestamp="2026-03-10 07:07:10 +0000 UTC" firstStartedPulling="2026-03-10 07:07:12.130108479 +0000 UTC m=+1385.159889094" lastFinishedPulling="2026-03-10 07:07:15.03136463 +0000 UTC m=+1388.061145245" observedRunningTime="2026-03-10 07:07:16.794461772 +0000 UTC m=+1389.824242387" watchObservedRunningTime="2026-03-10 07:07:16.798891817 +0000 UTC m=+1389.828672432" Mar 10 07:07:16 crc kubenswrapper[4825]: I0310 07:07:16.803377 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.89126567 podStartE2EDuration="6.803366004s" podCreationTimestamp="2026-03-10 07:07:10 +0000 UTC" firstStartedPulling="2026-03-10 07:07:12.119222395 +0000 UTC m=+1385.149003010" lastFinishedPulling="2026-03-10 07:07:15.031322729 +0000 UTC m=+1388.061103344" observedRunningTime="2026-03-10 07:07:16.77597589 +0000 UTC m=+1389.805756505" watchObservedRunningTime="2026-03-10 07:07:16.803366004 +0000 UTC m=+1389.833146609" Mar 10 07:07:17 crc kubenswrapper[4825]: I0310 07:07:17.425823 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 07:07:17 crc kubenswrapper[4825]: I0310 07:07:17.456723 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c57dafa-d38c-4d1b-9de5-570c25ec94ce-combined-ca-bundle\") pod \"3c57dafa-d38c-4d1b-9de5-570c25ec94ce\" (UID: \"3c57dafa-d38c-4d1b-9de5-570c25ec94ce\") " Mar 10 07:07:17 crc kubenswrapper[4825]: I0310 07:07:17.456821 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnh9d\" (UniqueName: \"kubernetes.io/projected/3c57dafa-d38c-4d1b-9de5-570c25ec94ce-kube-api-access-bnh9d\") pod \"3c57dafa-d38c-4d1b-9de5-570c25ec94ce\" (UID: \"3c57dafa-d38c-4d1b-9de5-570c25ec94ce\") " Mar 10 07:07:17 crc kubenswrapper[4825]: I0310 07:07:17.456910 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c57dafa-d38c-4d1b-9de5-570c25ec94ce-logs\") pod \"3c57dafa-d38c-4d1b-9de5-570c25ec94ce\" (UID: \"3c57dafa-d38c-4d1b-9de5-570c25ec94ce\") " Mar 10 07:07:17 crc kubenswrapper[4825]: I0310 07:07:17.456938 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c57dafa-d38c-4d1b-9de5-570c25ec94ce-config-data\") pod \"3c57dafa-d38c-4d1b-9de5-570c25ec94ce\" (UID: \"3c57dafa-d38c-4d1b-9de5-570c25ec94ce\") " Mar 10 07:07:17 crc kubenswrapper[4825]: I0310 07:07:17.457253 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c57dafa-d38c-4d1b-9de5-570c25ec94ce-logs" (OuterVolumeSpecName: "logs") pod "3c57dafa-d38c-4d1b-9de5-570c25ec94ce" (UID: "3c57dafa-d38c-4d1b-9de5-570c25ec94ce"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:07:17 crc kubenswrapper[4825]: I0310 07:07:17.464830 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c57dafa-d38c-4d1b-9de5-570c25ec94ce-kube-api-access-bnh9d" (OuterVolumeSpecName: "kube-api-access-bnh9d") pod "3c57dafa-d38c-4d1b-9de5-570c25ec94ce" (UID: "3c57dafa-d38c-4d1b-9de5-570c25ec94ce"). InnerVolumeSpecName "kube-api-access-bnh9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:07:17 crc kubenswrapper[4825]: I0310 07:07:17.490144 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c57dafa-d38c-4d1b-9de5-570c25ec94ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c57dafa-d38c-4d1b-9de5-570c25ec94ce" (UID: "3c57dafa-d38c-4d1b-9de5-570c25ec94ce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:07:17 crc kubenswrapper[4825]: I0310 07:07:17.491408 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c57dafa-d38c-4d1b-9de5-570c25ec94ce-config-data" (OuterVolumeSpecName: "config-data") pod "3c57dafa-d38c-4d1b-9de5-570c25ec94ce" (UID: "3c57dafa-d38c-4d1b-9de5-570c25ec94ce"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:07:17 crc kubenswrapper[4825]: I0310 07:07:17.559786 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c57dafa-d38c-4d1b-9de5-570c25ec94ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:17 crc kubenswrapper[4825]: I0310 07:07:17.560226 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnh9d\" (UniqueName: \"kubernetes.io/projected/3c57dafa-d38c-4d1b-9de5-570c25ec94ce-kube-api-access-bnh9d\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:17 crc kubenswrapper[4825]: I0310 07:07:17.560241 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c57dafa-d38c-4d1b-9de5-570c25ec94ce-logs\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:17 crc kubenswrapper[4825]: I0310 07:07:17.560258 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c57dafa-d38c-4d1b-9de5-570c25ec94ce-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:17 crc kubenswrapper[4825]: I0310 07:07:17.755289 4825 generic.go:334] "Generic (PLEG): container finished" podID="3c57dafa-d38c-4d1b-9de5-570c25ec94ce" containerID="bc474ee7a41cea179857eefd140aa50bd4b92baba905079b14409fbd7aed96a9" exitCode=0 Mar 10 07:07:17 crc kubenswrapper[4825]: I0310 07:07:17.755330 4825 generic.go:334] "Generic (PLEG): container finished" podID="3c57dafa-d38c-4d1b-9de5-570c25ec94ce" containerID="2a9818b5c33c795bc5e4625e6532ef07969e866625cd296b726cd56a8335cde3" exitCode=143 Mar 10 07:07:17 crc kubenswrapper[4825]: I0310 07:07:17.755402 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 07:07:17 crc kubenswrapper[4825]: I0310 07:07:17.755484 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c57dafa-d38c-4d1b-9de5-570c25ec94ce","Type":"ContainerDied","Data":"bc474ee7a41cea179857eefd140aa50bd4b92baba905079b14409fbd7aed96a9"} Mar 10 07:07:17 crc kubenswrapper[4825]: I0310 07:07:17.755563 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c57dafa-d38c-4d1b-9de5-570c25ec94ce","Type":"ContainerDied","Data":"2a9818b5c33c795bc5e4625e6532ef07969e866625cd296b726cd56a8335cde3"} Mar 10 07:07:17 crc kubenswrapper[4825]: I0310 07:07:17.755582 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c57dafa-d38c-4d1b-9de5-570c25ec94ce","Type":"ContainerDied","Data":"3360a1dfcdfb9ea8534befbe48ca27d20d7a1dd8bfc63d248b23129f3ca5a8be"} Mar 10 07:07:17 crc kubenswrapper[4825]: I0310 07:07:17.755667 4825 scope.go:117] "RemoveContainer" containerID="bc474ee7a41cea179857eefd140aa50bd4b92baba905079b14409fbd7aed96a9" Mar 10 07:07:17 crc kubenswrapper[4825]: I0310 07:07:17.784874 4825 scope.go:117] "RemoveContainer" containerID="2a9818b5c33c795bc5e4625e6532ef07969e866625cd296b726cd56a8335cde3" Mar 10 07:07:17 crc kubenswrapper[4825]: I0310 07:07:17.834661 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 07:07:17 crc kubenswrapper[4825]: I0310 07:07:17.855667 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 07:07:17 crc kubenswrapper[4825]: I0310 07:07:17.869384 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 10 07:07:17 crc kubenswrapper[4825]: I0310 07:07:17.869750 4825 scope.go:117] "RemoveContainer" containerID="bc474ee7a41cea179857eefd140aa50bd4b92baba905079b14409fbd7aed96a9" Mar 10 07:07:17 crc kubenswrapper[4825]: E0310 
07:07:17.869756 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c57dafa-d38c-4d1b-9de5-570c25ec94ce" containerName="nova-metadata-metadata" Mar 10 07:07:17 crc kubenswrapper[4825]: I0310 07:07:17.869837 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c57dafa-d38c-4d1b-9de5-570c25ec94ce" containerName="nova-metadata-metadata" Mar 10 07:07:17 crc kubenswrapper[4825]: E0310 07:07:17.869903 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c57dafa-d38c-4d1b-9de5-570c25ec94ce" containerName="nova-metadata-log" Mar 10 07:07:17 crc kubenswrapper[4825]: I0310 07:07:17.869913 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c57dafa-d38c-4d1b-9de5-570c25ec94ce" containerName="nova-metadata-log" Mar 10 07:07:17 crc kubenswrapper[4825]: I0310 07:07:17.870217 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c57dafa-d38c-4d1b-9de5-570c25ec94ce" containerName="nova-metadata-metadata" Mar 10 07:07:17 crc kubenswrapper[4825]: I0310 07:07:17.870233 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c57dafa-d38c-4d1b-9de5-570c25ec94ce" containerName="nova-metadata-log" Mar 10 07:07:17 crc kubenswrapper[4825]: E0310 07:07:17.870309 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc474ee7a41cea179857eefd140aa50bd4b92baba905079b14409fbd7aed96a9\": container with ID starting with bc474ee7a41cea179857eefd140aa50bd4b92baba905079b14409fbd7aed96a9 not found: ID does not exist" containerID="bc474ee7a41cea179857eefd140aa50bd4b92baba905079b14409fbd7aed96a9" Mar 10 07:07:17 crc kubenswrapper[4825]: I0310 07:07:17.870364 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc474ee7a41cea179857eefd140aa50bd4b92baba905079b14409fbd7aed96a9"} err="failed to get container status \"bc474ee7a41cea179857eefd140aa50bd4b92baba905079b14409fbd7aed96a9\": rpc error: code = 
NotFound desc = could not find container \"bc474ee7a41cea179857eefd140aa50bd4b92baba905079b14409fbd7aed96a9\": container with ID starting with bc474ee7a41cea179857eefd140aa50bd4b92baba905079b14409fbd7aed96a9 not found: ID does not exist" Mar 10 07:07:17 crc kubenswrapper[4825]: I0310 07:07:17.870392 4825 scope.go:117] "RemoveContainer" containerID="2a9818b5c33c795bc5e4625e6532ef07969e866625cd296b726cd56a8335cde3" Mar 10 07:07:17 crc kubenswrapper[4825]: E0310 07:07:17.871030 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a9818b5c33c795bc5e4625e6532ef07969e866625cd296b726cd56a8335cde3\": container with ID starting with 2a9818b5c33c795bc5e4625e6532ef07969e866625cd296b726cd56a8335cde3 not found: ID does not exist" containerID="2a9818b5c33c795bc5e4625e6532ef07969e866625cd296b726cd56a8335cde3" Mar 10 07:07:17 crc kubenswrapper[4825]: I0310 07:07:17.871062 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a9818b5c33c795bc5e4625e6532ef07969e866625cd296b726cd56a8335cde3"} err="failed to get container status \"2a9818b5c33c795bc5e4625e6532ef07969e866625cd296b726cd56a8335cde3\": rpc error: code = NotFound desc = could not find container \"2a9818b5c33c795bc5e4625e6532ef07969e866625cd296b726cd56a8335cde3\": container with ID starting with 2a9818b5c33c795bc5e4625e6532ef07969e866625cd296b726cd56a8335cde3 not found: ID does not exist" Mar 10 07:07:17 crc kubenswrapper[4825]: I0310 07:07:17.871079 4825 scope.go:117] "RemoveContainer" containerID="bc474ee7a41cea179857eefd140aa50bd4b92baba905079b14409fbd7aed96a9" Mar 10 07:07:17 crc kubenswrapper[4825]: I0310 07:07:17.871192 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 07:07:17 crc kubenswrapper[4825]: I0310 07:07:17.871705 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc474ee7a41cea179857eefd140aa50bd4b92baba905079b14409fbd7aed96a9"} err="failed to get container status \"bc474ee7a41cea179857eefd140aa50bd4b92baba905079b14409fbd7aed96a9\": rpc error: code = NotFound desc = could not find container \"bc474ee7a41cea179857eefd140aa50bd4b92baba905079b14409fbd7aed96a9\": container with ID starting with bc474ee7a41cea179857eefd140aa50bd4b92baba905079b14409fbd7aed96a9 not found: ID does not exist" Mar 10 07:07:17 crc kubenswrapper[4825]: I0310 07:07:17.871751 4825 scope.go:117] "RemoveContainer" containerID="2a9818b5c33c795bc5e4625e6532ef07969e866625cd296b726cd56a8335cde3" Mar 10 07:07:17 crc kubenswrapper[4825]: I0310 07:07:17.875298 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a9818b5c33c795bc5e4625e6532ef07969e866625cd296b726cd56a8335cde3"} err="failed to get container status \"2a9818b5c33c795bc5e4625e6532ef07969e866625cd296b726cd56a8335cde3\": rpc error: code = NotFound desc = could not find container \"2a9818b5c33c795bc5e4625e6532ef07969e866625cd296b726cd56a8335cde3\": container with ID starting with 2a9818b5c33c795bc5e4625e6532ef07969e866625cd296b726cd56a8335cde3 not found: ID does not exist" Mar 10 07:07:17 crc kubenswrapper[4825]: I0310 07:07:17.875539 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 10 07:07:17 crc kubenswrapper[4825]: I0310 07:07:17.876261 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 10 07:07:17 crc kubenswrapper[4825]: I0310 07:07:17.891795 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 07:07:17 crc kubenswrapper[4825]: I0310 07:07:17.970195 4825 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20a9f35b-0428-42cc-b856-5ed0fb3ae7f8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"20a9f35b-0428-42cc-b856-5ed0fb3ae7f8\") " pod="openstack/nova-metadata-0" Mar 10 07:07:17 crc kubenswrapper[4825]: I0310 07:07:17.970292 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvsh2\" (UniqueName: \"kubernetes.io/projected/20a9f35b-0428-42cc-b856-5ed0fb3ae7f8-kube-api-access-hvsh2\") pod \"nova-metadata-0\" (UID: \"20a9f35b-0428-42cc-b856-5ed0fb3ae7f8\") " pod="openstack/nova-metadata-0" Mar 10 07:07:17 crc kubenswrapper[4825]: I0310 07:07:17.970368 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/20a9f35b-0428-42cc-b856-5ed0fb3ae7f8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"20a9f35b-0428-42cc-b856-5ed0fb3ae7f8\") " pod="openstack/nova-metadata-0" Mar 10 07:07:17 crc kubenswrapper[4825]: I0310 07:07:17.970554 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20a9f35b-0428-42cc-b856-5ed0fb3ae7f8-config-data\") pod \"nova-metadata-0\" (UID: \"20a9f35b-0428-42cc-b856-5ed0fb3ae7f8\") " pod="openstack/nova-metadata-0" Mar 10 07:07:17 crc kubenswrapper[4825]: I0310 07:07:17.970599 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20a9f35b-0428-42cc-b856-5ed0fb3ae7f8-logs\") pod \"nova-metadata-0\" (UID: \"20a9f35b-0428-42cc-b856-5ed0fb3ae7f8\") " pod="openstack/nova-metadata-0" Mar 10 07:07:18 crc kubenswrapper[4825]: I0310 07:07:18.071749 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20a9f35b-0428-42cc-b856-5ed0fb3ae7f8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"20a9f35b-0428-42cc-b856-5ed0fb3ae7f8\") " pod="openstack/nova-metadata-0" Mar 10 07:07:18 crc kubenswrapper[4825]: I0310 07:07:18.071875 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvsh2\" (UniqueName: \"kubernetes.io/projected/20a9f35b-0428-42cc-b856-5ed0fb3ae7f8-kube-api-access-hvsh2\") pod \"nova-metadata-0\" (UID: \"20a9f35b-0428-42cc-b856-5ed0fb3ae7f8\") " pod="openstack/nova-metadata-0" Mar 10 07:07:18 crc kubenswrapper[4825]: I0310 07:07:18.071977 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/20a9f35b-0428-42cc-b856-5ed0fb3ae7f8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"20a9f35b-0428-42cc-b856-5ed0fb3ae7f8\") " pod="openstack/nova-metadata-0" Mar 10 07:07:18 crc kubenswrapper[4825]: I0310 07:07:18.072127 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20a9f35b-0428-42cc-b856-5ed0fb3ae7f8-config-data\") pod \"nova-metadata-0\" (UID: \"20a9f35b-0428-42cc-b856-5ed0fb3ae7f8\") " pod="openstack/nova-metadata-0" Mar 10 07:07:18 crc kubenswrapper[4825]: I0310 07:07:18.072209 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20a9f35b-0428-42cc-b856-5ed0fb3ae7f8-logs\") pod \"nova-metadata-0\" (UID: \"20a9f35b-0428-42cc-b856-5ed0fb3ae7f8\") " pod="openstack/nova-metadata-0" Mar 10 07:07:18 crc kubenswrapper[4825]: I0310 07:07:18.072919 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20a9f35b-0428-42cc-b856-5ed0fb3ae7f8-logs\") pod \"nova-metadata-0\" (UID: \"20a9f35b-0428-42cc-b856-5ed0fb3ae7f8\") " 
pod="openstack/nova-metadata-0" Mar 10 07:07:18 crc kubenswrapper[4825]: I0310 07:07:18.082612 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20a9f35b-0428-42cc-b856-5ed0fb3ae7f8-config-data\") pod \"nova-metadata-0\" (UID: \"20a9f35b-0428-42cc-b856-5ed0fb3ae7f8\") " pod="openstack/nova-metadata-0" Mar 10 07:07:18 crc kubenswrapper[4825]: I0310 07:07:18.083085 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/20a9f35b-0428-42cc-b856-5ed0fb3ae7f8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"20a9f35b-0428-42cc-b856-5ed0fb3ae7f8\") " pod="openstack/nova-metadata-0" Mar 10 07:07:18 crc kubenswrapper[4825]: I0310 07:07:18.084104 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20a9f35b-0428-42cc-b856-5ed0fb3ae7f8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"20a9f35b-0428-42cc-b856-5ed0fb3ae7f8\") " pod="openstack/nova-metadata-0" Mar 10 07:07:18 crc kubenswrapper[4825]: I0310 07:07:18.096129 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvsh2\" (UniqueName: \"kubernetes.io/projected/20a9f35b-0428-42cc-b856-5ed0fb3ae7f8-kube-api-access-hvsh2\") pod \"nova-metadata-0\" (UID: \"20a9f35b-0428-42cc-b856-5ed0fb3ae7f8\") " pod="openstack/nova-metadata-0" Mar 10 07:07:18 crc kubenswrapper[4825]: I0310 07:07:18.214958 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 07:07:18 crc kubenswrapper[4825]: I0310 07:07:18.737665 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 07:07:18 crc kubenswrapper[4825]: I0310 07:07:18.769875 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"20a9f35b-0428-42cc-b856-5ed0fb3ae7f8","Type":"ContainerStarted","Data":"ef532ace951b102bcddb11af19d731d778f625c155f4082587b9e1dd822c9b43"} Mar 10 07:07:19 crc kubenswrapper[4825]: I0310 07:07:19.249375 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c57dafa-d38c-4d1b-9de5-570c25ec94ce" path="/var/lib/kubelet/pods/3c57dafa-d38c-4d1b-9de5-570c25ec94ce/volumes" Mar 10 07:07:19 crc kubenswrapper[4825]: I0310 07:07:19.785010 4825 generic.go:334] "Generic (PLEG): container finished" podID="a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3" containerID="e4fc36f4f8673da278d64fd9e1e19a5d1e27f954df79bea201495848ce4acc7b" exitCode=0 Mar 10 07:07:19 crc kubenswrapper[4825]: I0310 07:07:19.785141 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-npbf2" event={"ID":"a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3","Type":"ContainerDied","Data":"e4fc36f4f8673da278d64fd9e1e19a5d1e27f954df79bea201495848ce4acc7b"} Mar 10 07:07:19 crc kubenswrapper[4825]: I0310 07:07:19.788647 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"20a9f35b-0428-42cc-b856-5ed0fb3ae7f8","Type":"ContainerStarted","Data":"f89ec31fef0d4d2e15ad096c0a1700f50406174406b2345a0f72e2913f020f6a"} Mar 10 07:07:19 crc kubenswrapper[4825]: I0310 07:07:19.788694 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"20a9f35b-0428-42cc-b856-5ed0fb3ae7f8","Type":"ContainerStarted","Data":"26aa01fd9fbb6eefb82dd5becd8757efbcc08efc6d2ec8738abce656bce38f33"} Mar 10 07:07:19 crc kubenswrapper[4825]: I0310 07:07:19.832327 
4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.832306824 podStartE2EDuration="2.832306824s" podCreationTimestamp="2026-03-10 07:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:07:19.829030159 +0000 UTC m=+1392.858810774" watchObservedRunningTime="2026-03-10 07:07:19.832306824 +0000 UTC m=+1392.862087439" Mar 10 07:07:20 crc kubenswrapper[4825]: I0310 07:07:20.806710 4825 generic.go:334] "Generic (PLEG): container finished" podID="3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2" containerID="5a6694411ba07209eab6dc1d0ce1dc4fc71125bfd4984ffce3d54b7f66b38d63" exitCode=0 Mar 10 07:07:20 crc kubenswrapper[4825]: I0310 07:07:20.806801 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-5l64f" event={"ID":"3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2","Type":"ContainerDied","Data":"5a6694411ba07209eab6dc1d0ce1dc4fc71125bfd4984ffce3d54b7f66b38d63"} Mar 10 07:07:21 crc kubenswrapper[4825]: I0310 07:07:21.211395 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-npbf2" Mar 10 07:07:21 crc kubenswrapper[4825]: I0310 07:07:21.238735 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3-combined-ca-bundle\") pod \"a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3\" (UID: \"a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3\") " Mar 10 07:07:21 crc kubenswrapper[4825]: I0310 07:07:21.239176 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3-scripts\") pod \"a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3\" (UID: \"a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3\") " Mar 10 07:07:21 crc kubenswrapper[4825]: I0310 07:07:21.239244 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8dfh\" (UniqueName: \"kubernetes.io/projected/a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3-kube-api-access-z8dfh\") pod \"a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3\" (UID: \"a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3\") " Mar 10 07:07:21 crc kubenswrapper[4825]: I0310 07:07:21.239303 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3-config-data\") pod \"a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3\" (UID: \"a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3\") " Mar 10 07:07:21 crc kubenswrapper[4825]: I0310 07:07:21.288102 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3-kube-api-access-z8dfh" (OuterVolumeSpecName: "kube-api-access-z8dfh") pod "a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3" (UID: "a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3"). InnerVolumeSpecName "kube-api-access-z8dfh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:07:21 crc kubenswrapper[4825]: I0310 07:07:21.293119 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3" (UID: "a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:07:21 crc kubenswrapper[4825]: I0310 07:07:21.294227 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3-scripts" (OuterVolumeSpecName: "scripts") pod "a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3" (UID: "a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:07:21 crc kubenswrapper[4825]: I0310 07:07:21.300441 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3-config-data" (OuterVolumeSpecName: "config-data") pod "a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3" (UID: "a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:07:21 crc kubenswrapper[4825]: I0310 07:07:21.341478 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:21 crc kubenswrapper[4825]: I0310 07:07:21.341766 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:21 crc kubenswrapper[4825]: I0310 07:07:21.342014 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8dfh\" (UniqueName: \"kubernetes.io/projected/a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3-kube-api-access-z8dfh\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:21 crc kubenswrapper[4825]: I0310 07:07:21.342082 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:21 crc kubenswrapper[4825]: I0310 07:07:21.381520 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 10 07:07:21 crc kubenswrapper[4825]: I0310 07:07:21.381643 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 10 07:07:21 crc kubenswrapper[4825]: I0310 07:07:21.413208 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 10 07:07:21 crc kubenswrapper[4825]: I0310 07:07:21.413632 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 07:07:21 crc kubenswrapper[4825]: I0310 07:07:21.413718 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 07:07:21 crc 
kubenswrapper[4825]: I0310 07:07:21.643354 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7bd5679c8c-q8gbc" Mar 10 07:07:21 crc kubenswrapper[4825]: I0310 07:07:21.702874 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b8fcc65cc-n7rdd"] Mar 10 07:07:21 crc kubenswrapper[4825]: I0310 07:07:21.703115 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b8fcc65cc-n7rdd" podUID="b9c1ef21-4ae0-4693-baf2-06678f62411e" containerName="dnsmasq-dns" containerID="cri-o://9bf08af71a6a3a7937ba9529a3e4285427d5f544f773c8146847f811e42d30f5" gracePeriod=10 Mar 10 07:07:21 crc kubenswrapper[4825]: I0310 07:07:21.844145 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-npbf2" Mar 10 07:07:21 crc kubenswrapper[4825]: I0310 07:07:21.845110 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-npbf2" event={"ID":"a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3","Type":"ContainerDied","Data":"4da21a373258dee194b79ceabb67632dbbc47d13d8ceeb34373a0b282ba0bf53"} Mar 10 07:07:21 crc kubenswrapper[4825]: I0310 07:07:21.856566 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4da21a373258dee194b79ceabb67632dbbc47d13d8ceeb34373a0b282ba0bf53" Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.117859 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.118106 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="20a9f35b-0428-42cc-b856-5ed0fb3ae7f8" containerName="nova-metadata-log" containerID="cri-o://26aa01fd9fbb6eefb82dd5becd8757efbcc08efc6d2ec8738abce656bce38f33" gracePeriod=30 Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.118568 4825 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="20a9f35b-0428-42cc-b856-5ed0fb3ae7f8" containerName="nova-metadata-metadata" containerID="cri-o://f89ec31fef0d4d2e15ad096c0a1700f50406174406b2345a0f72e2913f020f6a" gracePeriod=30 Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.136949 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.163399 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.163637 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6965f726-906a-463f-b776-5ddf8e672c29" containerName="nova-api-log" containerID="cri-o://0dba2b616476849648bdd5c442999b93f5a2d75b861d8c1098438ac42d5eacf3" gracePeriod=30 Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.164056 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6965f726-906a-463f-b776-5ddf8e672c29" containerName="nova-api-api" containerID="cri-o://11f9db886132d5a6eb8c4e724d2cdece1f01b59eff676d258ab784a238134bfa" gracePeriod=30 Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.183004 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6965f726-906a-463f-b776-5ddf8e672c29" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.196:8774/\": EOF" Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.183199 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6965f726-906a-463f-b776-5ddf8e672c29" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.196:8774/\": EOF" Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.578912 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b8fcc65cc-n7rdd" Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.580008 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-5l64f" Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.699330 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2-config-data\") pod \"3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2\" (UID: \"3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2\") " Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.699485 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tstxv\" (UniqueName: \"kubernetes.io/projected/3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2-kube-api-access-tstxv\") pod \"3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2\" (UID: \"3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2\") " Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.699596 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2-scripts\") pod \"3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2\" (UID: \"3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2\") " Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.699648 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9c1ef21-4ae0-4693-baf2-06678f62411e-config\") pod \"b9c1ef21-4ae0-4693-baf2-06678f62411e\" (UID: \"b9c1ef21-4ae0-4693-baf2-06678f62411e\") " Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.699737 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9c1ef21-4ae0-4693-baf2-06678f62411e-dns-svc\") pod \"b9c1ef21-4ae0-4693-baf2-06678f62411e\" (UID: \"b9c1ef21-4ae0-4693-baf2-06678f62411e\") " Mar 
10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.699826 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b9c1ef21-4ae0-4693-baf2-06678f62411e-dns-swift-storage-0\") pod \"b9c1ef21-4ae0-4693-baf2-06678f62411e\" (UID: \"b9c1ef21-4ae0-4693-baf2-06678f62411e\") " Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.699929 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b9c1ef21-4ae0-4693-baf2-06678f62411e-ovsdbserver-nb\") pod \"b9c1ef21-4ae0-4693-baf2-06678f62411e\" (UID: \"b9c1ef21-4ae0-4693-baf2-06678f62411e\") " Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.699982 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmc2f\" (UniqueName: \"kubernetes.io/projected/b9c1ef21-4ae0-4693-baf2-06678f62411e-kube-api-access-pmc2f\") pod \"b9c1ef21-4ae0-4693-baf2-06678f62411e\" (UID: \"b9c1ef21-4ae0-4693-baf2-06678f62411e\") " Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.700013 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b9c1ef21-4ae0-4693-baf2-06678f62411e-ovsdbserver-sb\") pod \"b9c1ef21-4ae0-4693-baf2-06678f62411e\" (UID: \"b9c1ef21-4ae0-4693-baf2-06678f62411e\") " Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.700167 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2-combined-ca-bundle\") pod \"3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2\" (UID: \"3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2\") " Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.713984 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2-kube-api-access-tstxv" (OuterVolumeSpecName: "kube-api-access-tstxv") pod "3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2" (UID: "3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2"). InnerVolumeSpecName "kube-api-access-tstxv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.717403 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2-scripts" (OuterVolumeSpecName: "scripts") pod "3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2" (UID: "3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.731562 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9c1ef21-4ae0-4693-baf2-06678f62411e-kube-api-access-pmc2f" (OuterVolumeSpecName: "kube-api-access-pmc2f") pod "b9c1ef21-4ae0-4693-baf2-06678f62411e" (UID: "b9c1ef21-4ae0-4693-baf2-06678f62411e"). InnerVolumeSpecName "kube-api-access-pmc2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.740728 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2" (UID: "3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.761331 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2-config-data" (OuterVolumeSpecName: "config-data") pod "3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2" (UID: "3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.778525 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9c1ef21-4ae0-4693-baf2-06678f62411e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b9c1ef21-4ae0-4693-baf2-06678f62411e" (UID: "b9c1ef21-4ae0-4693-baf2-06678f62411e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.781519 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9c1ef21-4ae0-4693-baf2-06678f62411e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b9c1ef21-4ae0-4693-baf2-06678f62411e" (UID: "b9c1ef21-4ae0-4693-baf2-06678f62411e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.792258 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9c1ef21-4ae0-4693-baf2-06678f62411e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b9c1ef21-4ae0-4693-baf2-06678f62411e" (UID: "b9c1ef21-4ae0-4693-baf2-06678f62411e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.800788 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9c1ef21-4ae0-4693-baf2-06678f62411e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b9c1ef21-4ae0-4693-baf2-06678f62411e" (UID: "b9c1ef21-4ae0-4693-baf2-06678f62411e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.801977 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9c1ef21-4ae0-4693-baf2-06678f62411e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.801999 4825 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b9c1ef21-4ae0-4693-baf2-06678f62411e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.802010 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b9c1ef21-4ae0-4693-baf2-06678f62411e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.802020 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmc2f\" (UniqueName: \"kubernetes.io/projected/b9c1ef21-4ae0-4693-baf2-06678f62411e-kube-api-access-pmc2f\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.802028 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b9c1ef21-4ae0-4693-baf2-06678f62411e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.802039 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.802049 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 
07:07:22.802062 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tstxv\" (UniqueName: \"kubernetes.io/projected/3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2-kube-api-access-tstxv\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.802070 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.803665 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9c1ef21-4ae0-4693-baf2-06678f62411e-config" (OuterVolumeSpecName: "config") pod "b9c1ef21-4ae0-4693-baf2-06678f62411e" (UID: "b9c1ef21-4ae0-4693-baf2-06678f62411e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.882005 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-5l64f" event={"ID":"3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2","Type":"ContainerDied","Data":"559ef4bd666291eae26c4b939416644a6b7eb90e6ef3542f1756432febae846d"} Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.882047 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="559ef4bd666291eae26c4b939416644a6b7eb90e6ef3542f1756432febae846d" Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.882141 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-5l64f" Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.891406 4825 generic.go:334] "Generic (PLEG): container finished" podID="20a9f35b-0428-42cc-b856-5ed0fb3ae7f8" containerID="f89ec31fef0d4d2e15ad096c0a1700f50406174406b2345a0f72e2913f020f6a" exitCode=0 Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.891448 4825 generic.go:334] "Generic (PLEG): container finished" podID="20a9f35b-0428-42cc-b856-5ed0fb3ae7f8" containerID="26aa01fd9fbb6eefb82dd5becd8757efbcc08efc6d2ec8738abce656bce38f33" exitCode=143 Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.891511 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"20a9f35b-0428-42cc-b856-5ed0fb3ae7f8","Type":"ContainerDied","Data":"f89ec31fef0d4d2e15ad096c0a1700f50406174406b2345a0f72e2913f020f6a"} Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.891601 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"20a9f35b-0428-42cc-b856-5ed0fb3ae7f8","Type":"ContainerDied","Data":"26aa01fd9fbb6eefb82dd5becd8757efbcc08efc6d2ec8738abce656bce38f33"} Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.895074 4825 generic.go:334] "Generic (PLEG): container finished" podID="6965f726-906a-463f-b776-5ddf8e672c29" containerID="0dba2b616476849648bdd5c442999b93f5a2d75b861d8c1098438ac42d5eacf3" exitCode=143 Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.895144 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6965f726-906a-463f-b776-5ddf8e672c29","Type":"ContainerDied","Data":"0dba2b616476849648bdd5c442999b93f5a2d75b861d8c1098438ac42d5eacf3"} Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.903311 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9c1ef21-4ae0-4693-baf2-06678f62411e-config\") on node \"crc\" DevicePath \"\"" Mar 10 
07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.905729 4825 generic.go:334] "Generic (PLEG): container finished" podID="b9c1ef21-4ae0-4693-baf2-06678f62411e" containerID="9bf08af71a6a3a7937ba9529a3e4285427d5f544f773c8146847f811e42d30f5" exitCode=0 Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.906115 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="f9632a51-2c7a-4538-895d-c9c99e7f1e00" containerName="nova-scheduler-scheduler" containerID="cri-o://92bb61ab3a96d1bb04b8dc5bbd78d1594b7ade60769154636efd908f14fc8323" gracePeriod=30 Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.906531 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b8fcc65cc-n7rdd" Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.909202 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b8fcc65cc-n7rdd" event={"ID":"b9c1ef21-4ae0-4693-baf2-06678f62411e","Type":"ContainerDied","Data":"9bf08af71a6a3a7937ba9529a3e4285427d5f544f773c8146847f811e42d30f5"} Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.909239 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b8fcc65cc-n7rdd" event={"ID":"b9c1ef21-4ae0-4693-baf2-06678f62411e","Type":"ContainerDied","Data":"3f91dfef15d972a20c51f7ab8eb6d49c3cef5ec6a1c022bd311a84911847dee3"} Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.909258 4825 scope.go:117] "RemoveContainer" containerID="9bf08af71a6a3a7937ba9529a3e4285427d5f544f773c8146847f811e42d30f5" Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.921747 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 07:07:22 crc kubenswrapper[4825]: E0310 07:07:22.922191 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3" containerName="nova-manage" Mar 10 07:07:22 crc kubenswrapper[4825]: 
I0310 07:07:22.922211 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3" containerName="nova-manage" Mar 10 07:07:22 crc kubenswrapper[4825]: E0310 07:07:22.922221 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2" containerName="nova-cell1-conductor-db-sync" Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.922230 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2" containerName="nova-cell1-conductor-db-sync" Mar 10 07:07:22 crc kubenswrapper[4825]: E0310 07:07:22.922266 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9c1ef21-4ae0-4693-baf2-06678f62411e" containerName="init" Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.922273 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9c1ef21-4ae0-4693-baf2-06678f62411e" containerName="init" Mar 10 07:07:22 crc kubenswrapper[4825]: E0310 07:07:22.922283 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9c1ef21-4ae0-4693-baf2-06678f62411e" containerName="dnsmasq-dns" Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.922288 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9c1ef21-4ae0-4693-baf2-06678f62411e" containerName="dnsmasq-dns" Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.922460 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9c1ef21-4ae0-4693-baf2-06678f62411e" containerName="dnsmasq-dns" Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.922477 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2" containerName="nova-cell1-conductor-db-sync" Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.922490 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3" containerName="nova-manage" Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 
07:07:22.923070 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.931246 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.958367 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.971369 4825 scope.go:117] "RemoveContainer" containerID="3e7b56e402fc52c37eaffba0cead25e2e2f123e3f084b20fe95806e05bfb33dd" Mar 10 07:07:22 crc kubenswrapper[4825]: I0310 07:07:22.990733 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b8fcc65cc-n7rdd"] Mar 10 07:07:23 crc kubenswrapper[4825]: I0310 07:07:23.000535 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 07:07:23 crc kubenswrapper[4825]: I0310 07:07:23.000940 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b8fcc65cc-n7rdd"] Mar 10 07:07:23 crc kubenswrapper[4825]: I0310 07:07:23.005395 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9965b351-ed73-4c38-b393-ff72ba48cd66-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9965b351-ed73-4c38-b393-ff72ba48cd66\") " pod="openstack/nova-cell1-conductor-0" Mar 10 07:07:23 crc kubenswrapper[4825]: I0310 07:07:23.005452 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9965b351-ed73-4c38-b393-ff72ba48cd66-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9965b351-ed73-4c38-b393-ff72ba48cd66\") " pod="openstack/nova-cell1-conductor-0" Mar 10 07:07:23 crc kubenswrapper[4825]: I0310 
07:07:23.005842 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cdsb\" (UniqueName: \"kubernetes.io/projected/9965b351-ed73-4c38-b393-ff72ba48cd66-kube-api-access-2cdsb\") pod \"nova-cell1-conductor-0\" (UID: \"9965b351-ed73-4c38-b393-ff72ba48cd66\") " pod="openstack/nova-cell1-conductor-0" Mar 10 07:07:23 crc kubenswrapper[4825]: I0310 07:07:23.030448 4825 scope.go:117] "RemoveContainer" containerID="9bf08af71a6a3a7937ba9529a3e4285427d5f544f773c8146847f811e42d30f5" Mar 10 07:07:23 crc kubenswrapper[4825]: E0310 07:07:23.031363 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bf08af71a6a3a7937ba9529a3e4285427d5f544f773c8146847f811e42d30f5\": container with ID starting with 9bf08af71a6a3a7937ba9529a3e4285427d5f544f773c8146847f811e42d30f5 not found: ID does not exist" containerID="9bf08af71a6a3a7937ba9529a3e4285427d5f544f773c8146847f811e42d30f5" Mar 10 07:07:23 crc kubenswrapper[4825]: I0310 07:07:23.031408 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bf08af71a6a3a7937ba9529a3e4285427d5f544f773c8146847f811e42d30f5"} err="failed to get container status \"9bf08af71a6a3a7937ba9529a3e4285427d5f544f773c8146847f811e42d30f5\": rpc error: code = NotFound desc = could not find container \"9bf08af71a6a3a7937ba9529a3e4285427d5f544f773c8146847f811e42d30f5\": container with ID starting with 9bf08af71a6a3a7937ba9529a3e4285427d5f544f773c8146847f811e42d30f5 not found: ID does not exist" Mar 10 07:07:23 crc kubenswrapper[4825]: I0310 07:07:23.031443 4825 scope.go:117] "RemoveContainer" containerID="3e7b56e402fc52c37eaffba0cead25e2e2f123e3f084b20fe95806e05bfb33dd" Mar 10 07:07:23 crc kubenswrapper[4825]: E0310 07:07:23.035439 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"3e7b56e402fc52c37eaffba0cead25e2e2f123e3f084b20fe95806e05bfb33dd\": container with ID starting with 3e7b56e402fc52c37eaffba0cead25e2e2f123e3f084b20fe95806e05bfb33dd not found: ID does not exist" containerID="3e7b56e402fc52c37eaffba0cead25e2e2f123e3f084b20fe95806e05bfb33dd" Mar 10 07:07:23 crc kubenswrapper[4825]: I0310 07:07:23.035498 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e7b56e402fc52c37eaffba0cead25e2e2f123e3f084b20fe95806e05bfb33dd"} err="failed to get container status \"3e7b56e402fc52c37eaffba0cead25e2e2f123e3f084b20fe95806e05bfb33dd\": rpc error: code = NotFound desc = could not find container \"3e7b56e402fc52c37eaffba0cead25e2e2f123e3f084b20fe95806e05bfb33dd\": container with ID starting with 3e7b56e402fc52c37eaffba0cead25e2e2f123e3f084b20fe95806e05bfb33dd not found: ID does not exist" Mar 10 07:07:23 crc kubenswrapper[4825]: I0310 07:07:23.108758 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20a9f35b-0428-42cc-b856-5ed0fb3ae7f8-logs\") pod \"20a9f35b-0428-42cc-b856-5ed0fb3ae7f8\" (UID: \"20a9f35b-0428-42cc-b856-5ed0fb3ae7f8\") " Mar 10 07:07:23 crc kubenswrapper[4825]: I0310 07:07:23.108834 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20a9f35b-0428-42cc-b856-5ed0fb3ae7f8-combined-ca-bundle\") pod \"20a9f35b-0428-42cc-b856-5ed0fb3ae7f8\" (UID: \"20a9f35b-0428-42cc-b856-5ed0fb3ae7f8\") " Mar 10 07:07:23 crc kubenswrapper[4825]: I0310 07:07:23.108915 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/20a9f35b-0428-42cc-b856-5ed0fb3ae7f8-nova-metadata-tls-certs\") pod \"20a9f35b-0428-42cc-b856-5ed0fb3ae7f8\" (UID: \"20a9f35b-0428-42cc-b856-5ed0fb3ae7f8\") " Mar 10 07:07:23 crc kubenswrapper[4825]: I0310 07:07:23.108991 4825 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20a9f35b-0428-42cc-b856-5ed0fb3ae7f8-config-data\") pod \"20a9f35b-0428-42cc-b856-5ed0fb3ae7f8\" (UID: \"20a9f35b-0428-42cc-b856-5ed0fb3ae7f8\") " Mar 10 07:07:23 crc kubenswrapper[4825]: I0310 07:07:23.109030 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvsh2\" (UniqueName: \"kubernetes.io/projected/20a9f35b-0428-42cc-b856-5ed0fb3ae7f8-kube-api-access-hvsh2\") pod \"20a9f35b-0428-42cc-b856-5ed0fb3ae7f8\" (UID: \"20a9f35b-0428-42cc-b856-5ed0fb3ae7f8\") " Mar 10 07:07:23 crc kubenswrapper[4825]: I0310 07:07:23.109122 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20a9f35b-0428-42cc-b856-5ed0fb3ae7f8-logs" (OuterVolumeSpecName: "logs") pod "20a9f35b-0428-42cc-b856-5ed0fb3ae7f8" (UID: "20a9f35b-0428-42cc-b856-5ed0fb3ae7f8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:07:23 crc kubenswrapper[4825]: I0310 07:07:23.109393 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cdsb\" (UniqueName: \"kubernetes.io/projected/9965b351-ed73-4c38-b393-ff72ba48cd66-kube-api-access-2cdsb\") pod \"nova-cell1-conductor-0\" (UID: \"9965b351-ed73-4c38-b393-ff72ba48cd66\") " pod="openstack/nova-cell1-conductor-0" Mar 10 07:07:23 crc kubenswrapper[4825]: I0310 07:07:23.109477 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9965b351-ed73-4c38-b393-ff72ba48cd66-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9965b351-ed73-4c38-b393-ff72ba48cd66\") " pod="openstack/nova-cell1-conductor-0" Mar 10 07:07:23 crc kubenswrapper[4825]: I0310 07:07:23.109514 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9965b351-ed73-4c38-b393-ff72ba48cd66-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9965b351-ed73-4c38-b393-ff72ba48cd66\") " pod="openstack/nova-cell1-conductor-0" Mar 10 07:07:23 crc kubenswrapper[4825]: I0310 07:07:23.109583 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20a9f35b-0428-42cc-b856-5ed0fb3ae7f8-logs\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:23 crc kubenswrapper[4825]: I0310 07:07:23.126545 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20a9f35b-0428-42cc-b856-5ed0fb3ae7f8-kube-api-access-hvsh2" (OuterVolumeSpecName: "kube-api-access-hvsh2") pod "20a9f35b-0428-42cc-b856-5ed0fb3ae7f8" (UID: "20a9f35b-0428-42cc-b856-5ed0fb3ae7f8"). InnerVolumeSpecName "kube-api-access-hvsh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:07:23 crc kubenswrapper[4825]: I0310 07:07:23.126952 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9965b351-ed73-4c38-b393-ff72ba48cd66-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9965b351-ed73-4c38-b393-ff72ba48cd66\") " pod="openstack/nova-cell1-conductor-0" Mar 10 07:07:23 crc kubenswrapper[4825]: I0310 07:07:23.127071 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9965b351-ed73-4c38-b393-ff72ba48cd66-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9965b351-ed73-4c38-b393-ff72ba48cd66\") " pod="openstack/nova-cell1-conductor-0" Mar 10 07:07:23 crc kubenswrapper[4825]: I0310 07:07:23.132907 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cdsb\" (UniqueName: \"kubernetes.io/projected/9965b351-ed73-4c38-b393-ff72ba48cd66-kube-api-access-2cdsb\") pod \"nova-cell1-conductor-0\" (UID: \"9965b351-ed73-4c38-b393-ff72ba48cd66\") " 
pod="openstack/nova-cell1-conductor-0" Mar 10 07:07:23 crc kubenswrapper[4825]: I0310 07:07:23.137467 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20a9f35b-0428-42cc-b856-5ed0fb3ae7f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20a9f35b-0428-42cc-b856-5ed0fb3ae7f8" (UID: "20a9f35b-0428-42cc-b856-5ed0fb3ae7f8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:07:23 crc kubenswrapper[4825]: I0310 07:07:23.144861 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20a9f35b-0428-42cc-b856-5ed0fb3ae7f8-config-data" (OuterVolumeSpecName: "config-data") pod "20a9f35b-0428-42cc-b856-5ed0fb3ae7f8" (UID: "20a9f35b-0428-42cc-b856-5ed0fb3ae7f8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:07:23 crc kubenswrapper[4825]: I0310 07:07:23.183412 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20a9f35b-0428-42cc-b856-5ed0fb3ae7f8-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "20a9f35b-0428-42cc-b856-5ed0fb3ae7f8" (UID: "20a9f35b-0428-42cc-b856-5ed0fb3ae7f8"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:07:23 crc kubenswrapper[4825]: I0310 07:07:23.211858 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20a9f35b-0428-42cc-b856-5ed0fb3ae7f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:23 crc kubenswrapper[4825]: I0310 07:07:23.211902 4825 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/20a9f35b-0428-42cc-b856-5ed0fb3ae7f8-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:23 crc kubenswrapper[4825]: I0310 07:07:23.211913 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20a9f35b-0428-42cc-b856-5ed0fb3ae7f8-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:23 crc kubenswrapper[4825]: I0310 07:07:23.211922 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvsh2\" (UniqueName: \"kubernetes.io/projected/20a9f35b-0428-42cc-b856-5ed0fb3ae7f8-kube-api-access-hvsh2\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:23 crc kubenswrapper[4825]: I0310 07:07:23.252745 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 10 07:07:23 crc kubenswrapper[4825]: I0310 07:07:23.254695 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9c1ef21-4ae0-4693-baf2-06678f62411e" path="/var/lib/kubelet/pods/b9c1ef21-4ae0-4693-baf2-06678f62411e/volumes"
Mar 10 07:07:23 crc kubenswrapper[4825]: I0310 07:07:23.800579 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 10 07:07:23 crc kubenswrapper[4825]: I0310 07:07:23.916851 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9965b351-ed73-4c38-b393-ff72ba48cd66","Type":"ContainerStarted","Data":"c8700bdf895a83e0ca2d032e0caac8e842ae7826849b3aa97bbcfaeaf203601b"}
Mar 10 07:07:23 crc kubenswrapper[4825]: I0310 07:07:23.919300 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"20a9f35b-0428-42cc-b856-5ed0fb3ae7f8","Type":"ContainerDied","Data":"ef532ace951b102bcddb11af19d731d778f625c155f4082587b9e1dd822c9b43"}
Mar 10 07:07:23 crc kubenswrapper[4825]: I0310 07:07:23.919333 4825 scope.go:117] "RemoveContainer" containerID="f89ec31fef0d4d2e15ad096c0a1700f50406174406b2345a0f72e2913f020f6a"
Mar 10 07:07:23 crc kubenswrapper[4825]: I0310 07:07:23.919389 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 10 07:07:23 crc kubenswrapper[4825]: I0310 07:07:23.956416 4825 scope.go:117] "RemoveContainer" containerID="26aa01fd9fbb6eefb82dd5becd8757efbcc08efc6d2ec8738abce656bce38f33"
Mar 10 07:07:23 crc kubenswrapper[4825]: I0310 07:07:23.969807 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 07:07:24 crc kubenswrapper[4825]: I0310 07:07:24.003555 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 07:07:24 crc kubenswrapper[4825]: I0310 07:07:24.022346 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 07:07:24 crc kubenswrapper[4825]: E0310 07:07:24.022910 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20a9f35b-0428-42cc-b856-5ed0fb3ae7f8" containerName="nova-metadata-log"
Mar 10 07:07:24 crc kubenswrapper[4825]: I0310 07:07:24.022934 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="20a9f35b-0428-42cc-b856-5ed0fb3ae7f8" containerName="nova-metadata-log"
Mar 10 07:07:24 crc kubenswrapper[4825]: E0310 07:07:24.022973 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20a9f35b-0428-42cc-b856-5ed0fb3ae7f8" containerName="nova-metadata-metadata"
Mar 10 07:07:24 crc kubenswrapper[4825]: I0310 07:07:24.022981 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="20a9f35b-0428-42cc-b856-5ed0fb3ae7f8" containerName="nova-metadata-metadata"
Mar 10 07:07:24 crc kubenswrapper[4825]: I0310 07:07:24.023234 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="20a9f35b-0428-42cc-b856-5ed0fb3ae7f8" containerName="nova-metadata-log"
Mar 10 07:07:24 crc kubenswrapper[4825]: I0310 07:07:24.023257 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="20a9f35b-0428-42cc-b856-5ed0fb3ae7f8" containerName="nova-metadata-metadata"
Mar 10 07:07:24 crc kubenswrapper[4825]: I0310 07:07:24.024611 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 10 07:07:24 crc kubenswrapper[4825]: I0310 07:07:24.032075 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 10 07:07:24 crc kubenswrapper[4825]: I0310 07:07:24.032353 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 10 07:07:24 crc kubenswrapper[4825]: I0310 07:07:24.035358 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 07:07:24 crc kubenswrapper[4825]: I0310 07:07:24.130207 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba141fc1-b6e8-4d89-a27c-be685f51d6ad-logs\") pod \"nova-metadata-0\" (UID: \"ba141fc1-b6e8-4d89-a27c-be685f51d6ad\") " pod="openstack/nova-metadata-0"
Mar 10 07:07:24 crc kubenswrapper[4825]: I0310 07:07:24.130267 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba141fc1-b6e8-4d89-a27c-be685f51d6ad-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ba141fc1-b6e8-4d89-a27c-be685f51d6ad\") " pod="openstack/nova-metadata-0"
Mar 10 07:07:24 crc kubenswrapper[4825]: I0310 07:07:24.130304 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba141fc1-b6e8-4d89-a27c-be685f51d6ad-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ba141fc1-b6e8-4d89-a27c-be685f51d6ad\") " pod="openstack/nova-metadata-0"
Mar 10 07:07:24 crc kubenswrapper[4825]: I0310 07:07:24.130390 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mscss\" (UniqueName: \"kubernetes.io/projected/ba141fc1-b6e8-4d89-a27c-be685f51d6ad-kube-api-access-mscss\") pod \"nova-metadata-0\" (UID: \"ba141fc1-b6e8-4d89-a27c-be685f51d6ad\") " pod="openstack/nova-metadata-0"
Mar 10 07:07:24 crc kubenswrapper[4825]: I0310 07:07:24.130744 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba141fc1-b6e8-4d89-a27c-be685f51d6ad-config-data\") pod \"nova-metadata-0\" (UID: \"ba141fc1-b6e8-4d89-a27c-be685f51d6ad\") " pod="openstack/nova-metadata-0"
Mar 10 07:07:24 crc kubenswrapper[4825]: I0310 07:07:24.232080 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba141fc1-b6e8-4d89-a27c-be685f51d6ad-logs\") pod \"nova-metadata-0\" (UID: \"ba141fc1-b6e8-4d89-a27c-be685f51d6ad\") " pod="openstack/nova-metadata-0"
Mar 10 07:07:24 crc kubenswrapper[4825]: I0310 07:07:24.232169 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba141fc1-b6e8-4d89-a27c-be685f51d6ad-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ba141fc1-b6e8-4d89-a27c-be685f51d6ad\") " pod="openstack/nova-metadata-0"
Mar 10 07:07:24 crc kubenswrapper[4825]: I0310 07:07:24.232198 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba141fc1-b6e8-4d89-a27c-be685f51d6ad-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ba141fc1-b6e8-4d89-a27c-be685f51d6ad\") " pod="openstack/nova-metadata-0"
Mar 10 07:07:24 crc kubenswrapper[4825]: I0310 07:07:24.232253 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mscss\" (UniqueName: \"kubernetes.io/projected/ba141fc1-b6e8-4d89-a27c-be685f51d6ad-kube-api-access-mscss\") pod \"nova-metadata-0\" (UID: \"ba141fc1-b6e8-4d89-a27c-be685f51d6ad\") " pod="openstack/nova-metadata-0"
Mar 10 07:07:24 crc kubenswrapper[4825]: I0310 07:07:24.232306 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba141fc1-b6e8-4d89-a27c-be685f51d6ad-config-data\") pod \"nova-metadata-0\" (UID: \"ba141fc1-b6e8-4d89-a27c-be685f51d6ad\") " pod="openstack/nova-metadata-0"
Mar 10 07:07:24 crc kubenswrapper[4825]: I0310 07:07:24.232603 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba141fc1-b6e8-4d89-a27c-be685f51d6ad-logs\") pod \"nova-metadata-0\" (UID: \"ba141fc1-b6e8-4d89-a27c-be685f51d6ad\") " pod="openstack/nova-metadata-0"
Mar 10 07:07:24 crc kubenswrapper[4825]: I0310 07:07:24.236622 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba141fc1-b6e8-4d89-a27c-be685f51d6ad-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ba141fc1-b6e8-4d89-a27c-be685f51d6ad\") " pod="openstack/nova-metadata-0"
Mar 10 07:07:24 crc kubenswrapper[4825]: I0310 07:07:24.237514 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba141fc1-b6e8-4d89-a27c-be685f51d6ad-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ba141fc1-b6e8-4d89-a27c-be685f51d6ad\") " pod="openstack/nova-metadata-0"
Mar 10 07:07:24 crc kubenswrapper[4825]: I0310 07:07:24.238653 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba141fc1-b6e8-4d89-a27c-be685f51d6ad-config-data\") pod \"nova-metadata-0\" (UID: \"ba141fc1-b6e8-4d89-a27c-be685f51d6ad\") " pod="openstack/nova-metadata-0"
Mar 10 07:07:24 crc kubenswrapper[4825]: I0310 07:07:24.250013 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mscss\" (UniqueName: \"kubernetes.io/projected/ba141fc1-b6e8-4d89-a27c-be685f51d6ad-kube-api-access-mscss\") pod \"nova-metadata-0\" (UID: \"ba141fc1-b6e8-4d89-a27c-be685f51d6ad\") " pod="openstack/nova-metadata-0"
Mar 10 07:07:24 crc kubenswrapper[4825]: I0310 07:07:24.354277 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 10 07:07:24 crc kubenswrapper[4825]: I0310 07:07:24.829970 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 07:07:24 crc kubenswrapper[4825]: W0310 07:07:24.841357 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba141fc1_b6e8_4d89_a27c_be685f51d6ad.slice/crio-2659159c5880ea26e1514eedac232ae2c791d5640de8edabf85bc186366fb5bf WatchSource:0}: Error finding container 2659159c5880ea26e1514eedac232ae2c791d5640de8edabf85bc186366fb5bf: Status 404 returned error can't find the container with id 2659159c5880ea26e1514eedac232ae2c791d5640de8edabf85bc186366fb5bf
Mar 10 07:07:24 crc kubenswrapper[4825]: I0310 07:07:24.952104 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9965b351-ed73-4c38-b393-ff72ba48cd66","Type":"ContainerStarted","Data":"3d70d139a1deb1c6a27cdf349fc1cc7878472659a23957efa5657e5551a800ef"}
Mar 10 07:07:24 crc kubenswrapper[4825]: I0310 07:07:24.952197 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Mar 10 07:07:24 crc kubenswrapper[4825]: I0310 07:07:24.953975 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ba141fc1-b6e8-4d89-a27c-be685f51d6ad","Type":"ContainerStarted","Data":"2659159c5880ea26e1514eedac232ae2c791d5640de8edabf85bc186366fb5bf"}
Mar 10 07:07:24 crc kubenswrapper[4825]: I0310 07:07:24.987445 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.98741906 podStartE2EDuration="2.98741906s" podCreationTimestamp="2026-03-10 07:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:07:24.97363251 +0000 UTC m=+1398.003413135" watchObservedRunningTime="2026-03-10 07:07:24.98741906 +0000 UTC m=+1398.017199685"
Mar 10 07:07:25 crc kubenswrapper[4825]: I0310 07:07:25.251678 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20a9f35b-0428-42cc-b856-5ed0fb3ae7f8" path="/var/lib/kubelet/pods/20a9f35b-0428-42cc-b856-5ed0fb3ae7f8/volumes"
Mar 10 07:07:25 crc kubenswrapper[4825]: I0310 07:07:25.974107 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ba141fc1-b6e8-4d89-a27c-be685f51d6ad","Type":"ContainerStarted","Data":"96f554d756b7d59701d89a4889be2adb5eb03daad6038fc9e00da48c875499dd"}
Mar 10 07:07:25 crc kubenswrapper[4825]: I0310 07:07:25.974509 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ba141fc1-b6e8-4d89-a27c-be685f51d6ad","Type":"ContainerStarted","Data":"72dbff0f20a0e237d347b1529b4a95ef2ab111aa2d0f91ed82359a099402beb9"}
Mar 10 07:07:25 crc kubenswrapper[4825]: I0310 07:07:25.997908 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.996870585 podStartE2EDuration="2.996870585s" podCreationTimestamp="2026-03-10 07:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:07:25.99629156 +0000 UTC m=+1399.026072215" watchObservedRunningTime="2026-03-10 07:07:25.996870585 +0000 UTC m=+1399.026651200"
Mar 10 07:07:26 crc kubenswrapper[4825]: E0310 07:07:26.269561 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="92bb61ab3a96d1bb04b8dc5bbd78d1594b7ade60769154636efd908f14fc8323" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 10 07:07:26 crc kubenswrapper[4825]: E0310 07:07:26.272026 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="92bb61ab3a96d1bb04b8dc5bbd78d1594b7ade60769154636efd908f14fc8323" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 10 07:07:26 crc kubenswrapper[4825]: E0310 07:07:26.273780 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="92bb61ab3a96d1bb04b8dc5bbd78d1594b7ade60769154636efd908f14fc8323" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 10 07:07:26 crc kubenswrapper[4825]: E0310 07:07:26.273852 4825 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="f9632a51-2c7a-4538-895d-c9c99e7f1e00" containerName="nova-scheduler-scheduler"
Mar 10 07:07:27 crc kubenswrapper[4825]: I0310 07:07:27.480884 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 10 07:07:27 crc kubenswrapper[4825]: I0310 07:07:27.493195 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nc6j9\" (UniqueName: \"kubernetes.io/projected/f9632a51-2c7a-4538-895d-c9c99e7f1e00-kube-api-access-nc6j9\") pod \"f9632a51-2c7a-4538-895d-c9c99e7f1e00\" (UID: \"f9632a51-2c7a-4538-895d-c9c99e7f1e00\") "
Mar 10 07:07:27 crc kubenswrapper[4825]: I0310 07:07:27.493359 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9632a51-2c7a-4538-895d-c9c99e7f1e00-config-data\") pod \"f9632a51-2c7a-4538-895d-c9c99e7f1e00\" (UID: \"f9632a51-2c7a-4538-895d-c9c99e7f1e00\") "
Mar 10 07:07:27 crc kubenswrapper[4825]: I0310 07:07:27.493407 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9632a51-2c7a-4538-895d-c9c99e7f1e00-combined-ca-bundle\") pod \"f9632a51-2c7a-4538-895d-c9c99e7f1e00\" (UID: \"f9632a51-2c7a-4538-895d-c9c99e7f1e00\") "
Mar 10 07:07:27 crc kubenswrapper[4825]: I0310 07:07:27.500863 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9632a51-2c7a-4538-895d-c9c99e7f1e00-kube-api-access-nc6j9" (OuterVolumeSpecName: "kube-api-access-nc6j9") pod "f9632a51-2c7a-4538-895d-c9c99e7f1e00" (UID: "f9632a51-2c7a-4538-895d-c9c99e7f1e00"). InnerVolumeSpecName "kube-api-access-nc6j9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 07:07:27 crc kubenswrapper[4825]: I0310 07:07:27.553983 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9632a51-2c7a-4538-895d-c9c99e7f1e00-config-data" (OuterVolumeSpecName: "config-data") pod "f9632a51-2c7a-4538-895d-c9c99e7f1e00" (UID: "f9632a51-2c7a-4538-895d-c9c99e7f1e00"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 07:07:27 crc kubenswrapper[4825]: I0310 07:07:27.560076 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9632a51-2c7a-4538-895d-c9c99e7f1e00-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9632a51-2c7a-4538-895d-c9c99e7f1e00" (UID: "f9632a51-2c7a-4538-895d-c9c99e7f1e00"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 07:07:27 crc kubenswrapper[4825]: I0310 07:07:27.596543 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nc6j9\" (UniqueName: \"kubernetes.io/projected/f9632a51-2c7a-4538-895d-c9c99e7f1e00-kube-api-access-nc6j9\") on node \"crc\" DevicePath \"\""
Mar 10 07:07:27 crc kubenswrapper[4825]: I0310 07:07:27.596583 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9632a51-2c7a-4538-895d-c9c99e7f1e00-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 07:07:27 crc kubenswrapper[4825]: I0310 07:07:27.596595 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9632a51-2c7a-4538-895d-c9c99e7f1e00-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 07:07:28 crc kubenswrapper[4825]: I0310 07:07:28.002961 4825 generic.go:334] "Generic (PLEG): container finished" podID="f9632a51-2c7a-4538-895d-c9c99e7f1e00" containerID="92bb61ab3a96d1bb04b8dc5bbd78d1594b7ade60769154636efd908f14fc8323" exitCode=0
Mar 10 07:07:28 crc kubenswrapper[4825]: I0310 07:07:28.003036 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f9632a51-2c7a-4538-895d-c9c99e7f1e00","Type":"ContainerDied","Data":"92bb61ab3a96d1bb04b8dc5bbd78d1594b7ade60769154636efd908f14fc8323"}
Mar 10 07:07:28 crc kubenswrapper[4825]: I0310 07:07:28.003093 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f9632a51-2c7a-4538-895d-c9c99e7f1e00","Type":"ContainerDied","Data":"dead8cb276b6510c07d7f0ca63188a075598debe1e7e7359b6259c263f798c08"}
Mar 10 07:07:28 crc kubenswrapper[4825]: I0310 07:07:28.003177 4825 scope.go:117] "RemoveContainer" containerID="92bb61ab3a96d1bb04b8dc5bbd78d1594b7ade60769154636efd908f14fc8323"
Mar 10 07:07:28 crc kubenswrapper[4825]: I0310 07:07:28.003775 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 10 07:07:28 crc kubenswrapper[4825]: I0310 07:07:28.047943 4825 scope.go:117] "RemoveContainer" containerID="92bb61ab3a96d1bb04b8dc5bbd78d1594b7ade60769154636efd908f14fc8323"
Mar 10 07:07:28 crc kubenswrapper[4825]: E0310 07:07:28.048969 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92bb61ab3a96d1bb04b8dc5bbd78d1594b7ade60769154636efd908f14fc8323\": container with ID starting with 92bb61ab3a96d1bb04b8dc5bbd78d1594b7ade60769154636efd908f14fc8323 not found: ID does not exist" containerID="92bb61ab3a96d1bb04b8dc5bbd78d1594b7ade60769154636efd908f14fc8323"
Mar 10 07:07:28 crc kubenswrapper[4825]: I0310 07:07:28.049044 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92bb61ab3a96d1bb04b8dc5bbd78d1594b7ade60769154636efd908f14fc8323"} err="failed to get container status \"92bb61ab3a96d1bb04b8dc5bbd78d1594b7ade60769154636efd908f14fc8323\": rpc error: code = NotFound desc = could not find container \"92bb61ab3a96d1bb04b8dc5bbd78d1594b7ade60769154636efd908f14fc8323\": container with ID starting with 92bb61ab3a96d1bb04b8dc5bbd78d1594b7ade60769154636efd908f14fc8323 not found: ID does not exist"
Mar 10 07:07:28 crc kubenswrapper[4825]: I0310 07:07:28.080215 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 10 07:07:28 crc kubenswrapper[4825]: I0310 07:07:28.103459 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 10 07:07:28 crc kubenswrapper[4825]: I0310 07:07:28.116798 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 10 07:07:28 crc kubenswrapper[4825]: E0310 07:07:28.117322 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9632a51-2c7a-4538-895d-c9c99e7f1e00" containerName="nova-scheduler-scheduler"
Mar 10 07:07:28 crc kubenswrapper[4825]: I0310 07:07:28.117346 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9632a51-2c7a-4538-895d-c9c99e7f1e00" containerName="nova-scheduler-scheduler"
Mar 10 07:07:28 crc kubenswrapper[4825]: I0310 07:07:28.117604 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9632a51-2c7a-4538-895d-c9c99e7f1e00" containerName="nova-scheduler-scheduler"
Mar 10 07:07:28 crc kubenswrapper[4825]: I0310 07:07:28.118504 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 10 07:07:28 crc kubenswrapper[4825]: I0310 07:07:28.123577 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 10 07:07:28 crc kubenswrapper[4825]: I0310 07:07:28.123621 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 10 07:07:28 crc kubenswrapper[4825]: I0310 07:07:28.223259 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7vxs\" (UniqueName: \"kubernetes.io/projected/3858ca8e-74ce-44bd-a942-a64b4f59270f-kube-api-access-v7vxs\") pod \"nova-scheduler-0\" (UID: \"3858ca8e-74ce-44bd-a942-a64b4f59270f\") " pod="openstack/nova-scheduler-0"
Mar 10 07:07:28 crc kubenswrapper[4825]: I0310 07:07:28.223309 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3858ca8e-74ce-44bd-a942-a64b4f59270f-config-data\") pod \"nova-scheduler-0\" (UID: \"3858ca8e-74ce-44bd-a942-a64b4f59270f\") " pod="openstack/nova-scheduler-0"
Mar 10 07:07:28 crc kubenswrapper[4825]: I0310 07:07:28.223400 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3858ca8e-74ce-44bd-a942-a64b4f59270f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3858ca8e-74ce-44bd-a942-a64b4f59270f\") " pod="openstack/nova-scheduler-0"
Mar 10 07:07:28 crc kubenswrapper[4825]: I0310 07:07:28.325834 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7vxs\" (UniqueName: \"kubernetes.io/projected/3858ca8e-74ce-44bd-a942-a64b4f59270f-kube-api-access-v7vxs\") pod \"nova-scheduler-0\" (UID: \"3858ca8e-74ce-44bd-a942-a64b4f59270f\") " pod="openstack/nova-scheduler-0"
Mar 10 07:07:28 crc kubenswrapper[4825]: I0310 07:07:28.325935 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3858ca8e-74ce-44bd-a942-a64b4f59270f-config-data\") pod \"nova-scheduler-0\" (UID: \"3858ca8e-74ce-44bd-a942-a64b4f59270f\") " pod="openstack/nova-scheduler-0"
Mar 10 07:07:28 crc kubenswrapper[4825]: I0310 07:07:28.326170 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3858ca8e-74ce-44bd-a942-a64b4f59270f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3858ca8e-74ce-44bd-a942-a64b4f59270f\") " pod="openstack/nova-scheduler-0"
Mar 10 07:07:28 crc kubenswrapper[4825]: I0310 07:07:28.330769 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3858ca8e-74ce-44bd-a942-a64b4f59270f-config-data\") pod \"nova-scheduler-0\" (UID: \"3858ca8e-74ce-44bd-a942-a64b4f59270f\") " pod="openstack/nova-scheduler-0"
Mar 10 07:07:28 crc kubenswrapper[4825]: I0310 07:07:28.330845 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3858ca8e-74ce-44bd-a942-a64b4f59270f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3858ca8e-74ce-44bd-a942-a64b4f59270f\") " pod="openstack/nova-scheduler-0"
Mar 10 07:07:28 crc kubenswrapper[4825]: I0310 07:07:28.357069 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7vxs\" (UniqueName: \"kubernetes.io/projected/3858ca8e-74ce-44bd-a942-a64b4f59270f-kube-api-access-v7vxs\") pod \"nova-scheduler-0\" (UID: \"3858ca8e-74ce-44bd-a942-a64b4f59270f\") " pod="openstack/nova-scheduler-0"
Mar 10 07:07:28 crc kubenswrapper[4825]: I0310 07:07:28.454545 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 10 07:07:28 crc kubenswrapper[4825]: I0310 07:07:28.896930 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 10 07:07:28 crc kubenswrapper[4825]: I0310 07:07:28.981830 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.025841 4825 generic.go:334] "Generic (PLEG): container finished" podID="6965f726-906a-463f-b776-5ddf8e672c29" containerID="11f9db886132d5a6eb8c4e724d2cdece1f01b59eff676d258ab784a238134bfa" exitCode=0
Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.026190 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.026241 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6965f726-906a-463f-b776-5ddf8e672c29","Type":"ContainerDied","Data":"11f9db886132d5a6eb8c4e724d2cdece1f01b59eff676d258ab784a238134bfa"}
Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.026277 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6965f726-906a-463f-b776-5ddf8e672c29","Type":"ContainerDied","Data":"1bcc123db11e54e7a1032777a413cfd2454ffd2e1a195c44bc28bd53b74391f9"}
Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.026299 4825 scope.go:117] "RemoveContainer" containerID="11f9db886132d5a6eb8c4e724d2cdece1f01b59eff676d258ab784a238134bfa"
Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.027647 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3858ca8e-74ce-44bd-a942-a64b4f59270f","Type":"ContainerStarted","Data":"f2d950393409df5d1cf1663c33e92b955996a6d637c9234998a222040980c8fd"}
Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.040207 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvl7d\" (UniqueName: \"kubernetes.io/projected/6965f726-906a-463f-b776-5ddf8e672c29-kube-api-access-gvl7d\") pod \"6965f726-906a-463f-b776-5ddf8e672c29\" (UID: \"6965f726-906a-463f-b776-5ddf8e672c29\") "
Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.040359 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6965f726-906a-463f-b776-5ddf8e672c29-config-data\") pod \"6965f726-906a-463f-b776-5ddf8e672c29\" (UID: \"6965f726-906a-463f-b776-5ddf8e672c29\") "
Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.040408 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6965f726-906a-463f-b776-5ddf8e672c29-combined-ca-bundle\") pod \"6965f726-906a-463f-b776-5ddf8e672c29\" (UID: \"6965f726-906a-463f-b776-5ddf8e672c29\") "
Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.040440 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6965f726-906a-463f-b776-5ddf8e672c29-logs\") pod \"6965f726-906a-463f-b776-5ddf8e672c29\" (UID: \"6965f726-906a-463f-b776-5ddf8e672c29\") "
Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.041520 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6965f726-906a-463f-b776-5ddf8e672c29-logs" (OuterVolumeSpecName: "logs") pod "6965f726-906a-463f-b776-5ddf8e672c29" (UID: "6965f726-906a-463f-b776-5ddf8e672c29"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.046701 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6965f726-906a-463f-b776-5ddf8e672c29-kube-api-access-gvl7d" (OuterVolumeSpecName: "kube-api-access-gvl7d") pod "6965f726-906a-463f-b776-5ddf8e672c29" (UID: "6965f726-906a-463f-b776-5ddf8e672c29"). InnerVolumeSpecName "kube-api-access-gvl7d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.065962 4825 scope.go:117] "RemoveContainer" containerID="0dba2b616476849648bdd5c442999b93f5a2d75b861d8c1098438ac42d5eacf3"
Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.071625 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6965f726-906a-463f-b776-5ddf8e672c29-config-data" (OuterVolumeSpecName: "config-data") pod "6965f726-906a-463f-b776-5ddf8e672c29" (UID: "6965f726-906a-463f-b776-5ddf8e672c29"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.100601 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6965f726-906a-463f-b776-5ddf8e672c29-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6965f726-906a-463f-b776-5ddf8e672c29" (UID: "6965f726-906a-463f-b776-5ddf8e672c29"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.143893 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvl7d\" (UniqueName: \"kubernetes.io/projected/6965f726-906a-463f-b776-5ddf8e672c29-kube-api-access-gvl7d\") on node \"crc\" DevicePath \"\""
Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.143953 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6965f726-906a-463f-b776-5ddf8e672c29-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.143974 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6965f726-906a-463f-b776-5ddf8e672c29-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.143994 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6965f726-906a-463f-b776-5ddf8e672c29-logs\") on node \"crc\" DevicePath \"\""
Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.159253 4825 scope.go:117] "RemoveContainer" containerID="11f9db886132d5a6eb8c4e724d2cdece1f01b59eff676d258ab784a238134bfa"
Mar 10 07:07:29 crc kubenswrapper[4825]: E0310 07:07:29.159906 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11f9db886132d5a6eb8c4e724d2cdece1f01b59eff676d258ab784a238134bfa\": container with ID starting with 11f9db886132d5a6eb8c4e724d2cdece1f01b59eff676d258ab784a238134bfa not found: ID does not exist" containerID="11f9db886132d5a6eb8c4e724d2cdece1f01b59eff676d258ab784a238134bfa"
Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.159955 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11f9db886132d5a6eb8c4e724d2cdece1f01b59eff676d258ab784a238134bfa"} err="failed to get container status \"11f9db886132d5a6eb8c4e724d2cdece1f01b59eff676d258ab784a238134bfa\": rpc error: code = NotFound desc = could not find container \"11f9db886132d5a6eb8c4e724d2cdece1f01b59eff676d258ab784a238134bfa\": container with ID starting with 11f9db886132d5a6eb8c4e724d2cdece1f01b59eff676d258ab784a238134bfa not found: ID does not exist"
Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.159991 4825 scope.go:117] "RemoveContainer" containerID="0dba2b616476849648bdd5c442999b93f5a2d75b861d8c1098438ac42d5eacf3"
Mar 10 07:07:29 crc kubenswrapper[4825]: E0310 07:07:29.160581 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dba2b616476849648bdd5c442999b93f5a2d75b861d8c1098438ac42d5eacf3\": container with ID starting with 0dba2b616476849648bdd5c442999b93f5a2d75b861d8c1098438ac42d5eacf3 not found: ID does not exist" containerID="0dba2b616476849648bdd5c442999b93f5a2d75b861d8c1098438ac42d5eacf3"
Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.160637 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dba2b616476849648bdd5c442999b93f5a2d75b861d8c1098438ac42d5eacf3"} err="failed to get container status \"0dba2b616476849648bdd5c442999b93f5a2d75b861d8c1098438ac42d5eacf3\": rpc error: code = NotFound desc = could not find container \"0dba2b616476849648bdd5c442999b93f5a2d75b861d8c1098438ac42d5eacf3\": container with ID starting with 0dba2b616476849648bdd5c442999b93f5a2d75b861d8c1098438ac42d5eacf3 not found: ID does not exist"
Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.259551 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9632a51-2c7a-4538-895d-c9c99e7f1e00" path="/var/lib/kubelet/pods/f9632a51-2c7a-4538-895d-c9c99e7f1e00/volumes"
Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.352183 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.355425 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.355484 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.366189 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.375359 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 10 07:07:29 crc kubenswrapper[4825]: E0310 07:07:29.375759 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6965f726-906a-463f-b776-5ddf8e672c29" containerName="nova-api-api"
Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.375778 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="6965f726-906a-463f-b776-5ddf8e672c29" containerName="nova-api-api"
Mar 10 07:07:29 crc kubenswrapper[4825]: E0310 07:07:29.375805 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6965f726-906a-463f-b776-5ddf8e672c29" containerName="nova-api-log"
Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.375813 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="6965f726-906a-463f-b776-5ddf8e672c29" containerName="nova-api-log"
Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.375997 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="6965f726-906a-463f-b776-5ddf8e672c29" containerName="nova-api-log"
Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.376019 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="6965f726-906a-463f-b776-5ddf8e672c29" containerName="nova-api-api"
Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.377113 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.379235 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.391169 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.449986 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb8wv\" (UniqueName: \"kubernetes.io/projected/3f36c81f-f491-403b-8491-463f9f6d3050-kube-api-access-wb8wv\") pod \"nova-api-0\" (UID: \"3f36c81f-f491-403b-8491-463f9f6d3050\") " pod="openstack/nova-api-0"
Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.450199 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f36c81f-f491-403b-8491-463f9f6d3050-logs\") pod \"nova-api-0\" (UID: \"3f36c81f-f491-403b-8491-463f9f6d3050\") " pod="openstack/nova-api-0"
Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.450403 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f36c81f-f491-403b-8491-463f9f6d3050-config-data\") pod \"nova-api-0\" (UID: \"3f36c81f-f491-403b-8491-463f9f6d3050\") " pod="openstack/nova-api-0"
Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.450460 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f36c81f-f491-403b-8491-463f9f6d3050-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3f36c81f-f491-403b-8491-463f9f6d3050\") " pod="openstack/nova-api-0"
Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.551950 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wb8wv\" (UniqueName: \"kubernetes.io/projected/3f36c81f-f491-403b-8491-463f9f6d3050-kube-api-access-wb8wv\") pod \"nova-api-0\" (UID: \"3f36c81f-f491-403b-8491-463f9f6d3050\") " pod="openstack/nova-api-0"
Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.552178 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f36c81f-f491-403b-8491-463f9f6d3050-logs\") pod \"nova-api-0\" (UID: \"3f36c81f-f491-403b-8491-463f9f6d3050\") " pod="openstack/nova-api-0"
Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.552366 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f36c81f-f491-403b-8491-463f9f6d3050-config-data\") pod \"nova-api-0\" (UID: \"3f36c81f-f491-403b-8491-463f9f6d3050\") " pod="openstack/nova-api-0"
Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.552426 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f36c81f-f491-403b-8491-463f9f6d3050-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3f36c81f-f491-403b-8491-463f9f6d3050\") " pod="openstack/nova-api-0"
Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.552992 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f36c81f-f491-403b-8491-463f9f6d3050-logs\") pod \"nova-api-0\" (UID: \"3f36c81f-f491-403b-8491-463f9f6d3050\") " 
pod="openstack/nova-api-0" Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.559237 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f36c81f-f491-403b-8491-463f9f6d3050-config-data\") pod \"nova-api-0\" (UID: \"3f36c81f-f491-403b-8491-463f9f6d3050\") " pod="openstack/nova-api-0" Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.570449 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f36c81f-f491-403b-8491-463f9f6d3050-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3f36c81f-f491-403b-8491-463f9f6d3050\") " pod="openstack/nova-api-0" Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.581008 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb8wv\" (UniqueName: \"kubernetes.io/projected/3f36c81f-f491-403b-8491-463f9f6d3050-kube-api-access-wb8wv\") pod \"nova-api-0\" (UID: \"3f36c81f-f491-403b-8491-463f9f6d3050\") " pod="openstack/nova-api-0" Mar 10 07:07:29 crc kubenswrapper[4825]: I0310 07:07:29.698039 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 07:07:30 crc kubenswrapper[4825]: I0310 07:07:30.057495 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3858ca8e-74ce-44bd-a942-a64b4f59270f","Type":"ContainerStarted","Data":"71c198772602fab08b7c89ff60b473e6b8d4a34a3f456f7a2f4966de4e6b819f"} Mar 10 07:07:30 crc kubenswrapper[4825]: I0310 07:07:30.076299 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.076277218 podStartE2EDuration="2.076277218s" podCreationTimestamp="2026-03-10 07:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:07:30.075886638 +0000 UTC m=+1403.105667263" watchObservedRunningTime="2026-03-10 07:07:30.076277218 +0000 UTC m=+1403.106057853" Mar 10 07:07:30 crc kubenswrapper[4825]: W0310 07:07:30.268365 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f36c81f_f491_403b_8491_463f9f6d3050.slice/crio-328696e78f8cbe465ad319c2bf4a98a19067ed628958595e962936175e436c2e WatchSource:0}: Error finding container 328696e78f8cbe465ad319c2bf4a98a19067ed628958595e962936175e436c2e: Status 404 returned error can't find the container with id 328696e78f8cbe465ad319c2bf4a98a19067ed628958595e962936175e436c2e Mar 10 07:07:30 crc kubenswrapper[4825]: I0310 07:07:30.269830 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 07:07:31 crc kubenswrapper[4825]: I0310 07:07:31.073712 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3f36c81f-f491-403b-8491-463f9f6d3050","Type":"ContainerStarted","Data":"00427ed4e18de15b3b8814048fae3907568de244ffc37a815158b69378579272"} Mar 10 07:07:31 crc kubenswrapper[4825]: I0310 07:07:31.074054 4825 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-api-0" event={"ID":"3f36c81f-f491-403b-8491-463f9f6d3050","Type":"ContainerStarted","Data":"859d5bf9379371fcd955fa7dcaddd751f4194d56237c3bacbf4ca28267e8a1f4"} Mar 10 07:07:31 crc kubenswrapper[4825]: I0310 07:07:31.074066 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3f36c81f-f491-403b-8491-463f9f6d3050","Type":"ContainerStarted","Data":"328696e78f8cbe465ad319c2bf4a98a19067ed628958595e962936175e436c2e"} Mar 10 07:07:31 crc kubenswrapper[4825]: I0310 07:07:31.125483 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.125453819 podStartE2EDuration="2.125453819s" podCreationTimestamp="2026-03-10 07:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:07:31.106341851 +0000 UTC m=+1404.136122546" watchObservedRunningTime="2026-03-10 07:07:31.125453819 +0000 UTC m=+1404.155234464" Mar 10 07:07:31 crc kubenswrapper[4825]: I0310 07:07:31.249419 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6965f726-906a-463f-b776-5ddf8e672c29" path="/var/lib/kubelet/pods/6965f726-906a-463f-b776-5ddf8e672c29/volumes" Mar 10 07:07:33 crc kubenswrapper[4825]: I0310 07:07:33.187110 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 10 07:07:33 crc kubenswrapper[4825]: I0310 07:07:33.286472 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 10 07:07:33 crc kubenswrapper[4825]: I0310 07:07:33.408895 4825 scope.go:117] "RemoveContainer" containerID="619aadfcf1fa9cffc66af047cde7b089400ebe770c8064af3af91a21aa852993" Mar 10 07:07:33 crc kubenswrapper[4825]: I0310 07:07:33.442903 4825 scope.go:117] "RemoveContainer" containerID="662829f4979c777190519ae05e7fc99a265ecfc401485a93e987d4413c310339" Mar 10 
07:07:33 crc kubenswrapper[4825]: I0310 07:07:33.454668 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 10 07:07:33 crc kubenswrapper[4825]: I0310 07:07:33.473865 4825 scope.go:117] "RemoveContainer" containerID="18a4d1ae07d393aafb11b6cd9dda419d601347bd61cdaaff8e2a66fc53d5398e" Mar 10 07:07:34 crc kubenswrapper[4825]: I0310 07:07:34.355362 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 10 07:07:34 crc kubenswrapper[4825]: I0310 07:07:34.355423 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 10 07:07:35 crc kubenswrapper[4825]: I0310 07:07:35.373364 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ba141fc1-b6e8-4d89-a27c-be685f51d6ad" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 07:07:35 crc kubenswrapper[4825]: I0310 07:07:35.373373 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ba141fc1-b6e8-4d89-a27c-be685f51d6ad" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 07:07:36 crc kubenswrapper[4825]: I0310 07:07:36.991370 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 07:07:36 crc kubenswrapper[4825]: I0310 07:07:36.991853 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="29b6d990-12b7-446f-9cd8-53c1111b1512" containerName="kube-state-metrics" containerID="cri-o://ab4092c2592a4c7eb52ffe086ca172917d888418a22ed5bfde36459943bc1706" gracePeriod=30 Mar 10 07:07:37 crc kubenswrapper[4825]: I0310 
07:07:37.156942 4825 generic.go:334] "Generic (PLEG): container finished" podID="29b6d990-12b7-446f-9cd8-53c1111b1512" containerID="ab4092c2592a4c7eb52ffe086ca172917d888418a22ed5bfde36459943bc1706" exitCode=2 Mar 10 07:07:37 crc kubenswrapper[4825]: I0310 07:07:37.157007 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"29b6d990-12b7-446f-9cd8-53c1111b1512","Type":"ContainerDied","Data":"ab4092c2592a4c7eb52ffe086ca172917d888418a22ed5bfde36459943bc1706"} Mar 10 07:07:37 crc kubenswrapper[4825]: I0310 07:07:37.538215 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 07:07:37 crc kubenswrapper[4825]: I0310 07:07:37.565709 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdrkz\" (UniqueName: \"kubernetes.io/projected/29b6d990-12b7-446f-9cd8-53c1111b1512-kube-api-access-wdrkz\") pod \"29b6d990-12b7-446f-9cd8-53c1111b1512\" (UID: \"29b6d990-12b7-446f-9cd8-53c1111b1512\") " Mar 10 07:07:37 crc kubenswrapper[4825]: I0310 07:07:37.573314 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29b6d990-12b7-446f-9cd8-53c1111b1512-kube-api-access-wdrkz" (OuterVolumeSpecName: "kube-api-access-wdrkz") pod "29b6d990-12b7-446f-9cd8-53c1111b1512" (UID: "29b6d990-12b7-446f-9cd8-53c1111b1512"). InnerVolumeSpecName "kube-api-access-wdrkz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:07:37 crc kubenswrapper[4825]: I0310 07:07:37.669889 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdrkz\" (UniqueName: \"kubernetes.io/projected/29b6d990-12b7-446f-9cd8-53c1111b1512-kube-api-access-wdrkz\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:38 crc kubenswrapper[4825]: I0310 07:07:38.167470 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"29b6d990-12b7-446f-9cd8-53c1111b1512","Type":"ContainerDied","Data":"b0b1f70ca4f0682b0de278256fe393fd457125b5c684916d6fa60bb244771f7a"} Mar 10 07:07:38 crc kubenswrapper[4825]: I0310 07:07:38.167521 4825 scope.go:117] "RemoveContainer" containerID="ab4092c2592a4c7eb52ffe086ca172917d888418a22ed5bfde36459943bc1706" Mar 10 07:07:38 crc kubenswrapper[4825]: I0310 07:07:38.167558 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 07:07:38 crc kubenswrapper[4825]: I0310 07:07:38.220953 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 07:07:38 crc kubenswrapper[4825]: I0310 07:07:38.230293 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 07:07:38 crc kubenswrapper[4825]: I0310 07:07:38.250113 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 07:07:38 crc kubenswrapper[4825]: E0310 07:07:38.250555 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29b6d990-12b7-446f-9cd8-53c1111b1512" containerName="kube-state-metrics" Mar 10 07:07:38 crc kubenswrapper[4825]: I0310 07:07:38.250571 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="29b6d990-12b7-446f-9cd8-53c1111b1512" containerName="kube-state-metrics" Mar 10 07:07:38 crc kubenswrapper[4825]: I0310 07:07:38.250765 4825 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="29b6d990-12b7-446f-9cd8-53c1111b1512" containerName="kube-state-metrics" Mar 10 07:07:38 crc kubenswrapper[4825]: I0310 07:07:38.251676 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 07:07:38 crc kubenswrapper[4825]: I0310 07:07:38.260962 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 10 07:07:38 crc kubenswrapper[4825]: I0310 07:07:38.261264 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 10 07:07:38 crc kubenswrapper[4825]: I0310 07:07:38.270906 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 07:07:38 crc kubenswrapper[4825]: I0310 07:07:38.286333 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkzfv\" (UniqueName: \"kubernetes.io/projected/04a2aca4-f98f-4ae8-aca9-62ae6625e5e2-kube-api-access-bkzfv\") pod \"kube-state-metrics-0\" (UID: \"04a2aca4-f98f-4ae8-aca9-62ae6625e5e2\") " pod="openstack/kube-state-metrics-0" Mar 10 07:07:38 crc kubenswrapper[4825]: I0310 07:07:38.286392 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04a2aca4-f98f-4ae8-aca9-62ae6625e5e2-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"04a2aca4-f98f-4ae8-aca9-62ae6625e5e2\") " pod="openstack/kube-state-metrics-0" Mar 10 07:07:38 crc kubenswrapper[4825]: I0310 07:07:38.286499 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/04a2aca4-f98f-4ae8-aca9-62ae6625e5e2-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"04a2aca4-f98f-4ae8-aca9-62ae6625e5e2\") " pod="openstack/kube-state-metrics-0" Mar 10 07:07:38 crc 
kubenswrapper[4825]: I0310 07:07:38.286536 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/04a2aca4-f98f-4ae8-aca9-62ae6625e5e2-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"04a2aca4-f98f-4ae8-aca9-62ae6625e5e2\") " pod="openstack/kube-state-metrics-0" Mar 10 07:07:38 crc kubenswrapper[4825]: I0310 07:07:38.389655 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkzfv\" (UniqueName: \"kubernetes.io/projected/04a2aca4-f98f-4ae8-aca9-62ae6625e5e2-kube-api-access-bkzfv\") pod \"kube-state-metrics-0\" (UID: \"04a2aca4-f98f-4ae8-aca9-62ae6625e5e2\") " pod="openstack/kube-state-metrics-0" Mar 10 07:07:38 crc kubenswrapper[4825]: I0310 07:07:38.389752 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04a2aca4-f98f-4ae8-aca9-62ae6625e5e2-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"04a2aca4-f98f-4ae8-aca9-62ae6625e5e2\") " pod="openstack/kube-state-metrics-0" Mar 10 07:07:38 crc kubenswrapper[4825]: I0310 07:07:38.389904 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/04a2aca4-f98f-4ae8-aca9-62ae6625e5e2-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"04a2aca4-f98f-4ae8-aca9-62ae6625e5e2\") " pod="openstack/kube-state-metrics-0" Mar 10 07:07:38 crc kubenswrapper[4825]: I0310 07:07:38.389969 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/04a2aca4-f98f-4ae8-aca9-62ae6625e5e2-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"04a2aca4-f98f-4ae8-aca9-62ae6625e5e2\") " pod="openstack/kube-state-metrics-0" Mar 10 07:07:38 crc 
kubenswrapper[4825]: I0310 07:07:38.396416 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/04a2aca4-f98f-4ae8-aca9-62ae6625e5e2-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"04a2aca4-f98f-4ae8-aca9-62ae6625e5e2\") " pod="openstack/kube-state-metrics-0" Mar 10 07:07:38 crc kubenswrapper[4825]: I0310 07:07:38.397791 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/04a2aca4-f98f-4ae8-aca9-62ae6625e5e2-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"04a2aca4-f98f-4ae8-aca9-62ae6625e5e2\") " pod="openstack/kube-state-metrics-0" Mar 10 07:07:38 crc kubenswrapper[4825]: I0310 07:07:38.398370 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04a2aca4-f98f-4ae8-aca9-62ae6625e5e2-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"04a2aca4-f98f-4ae8-aca9-62ae6625e5e2\") " pod="openstack/kube-state-metrics-0" Mar 10 07:07:38 crc kubenswrapper[4825]: I0310 07:07:38.406263 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkzfv\" (UniqueName: \"kubernetes.io/projected/04a2aca4-f98f-4ae8-aca9-62ae6625e5e2-kube-api-access-bkzfv\") pod \"kube-state-metrics-0\" (UID: \"04a2aca4-f98f-4ae8-aca9-62ae6625e5e2\") " pod="openstack/kube-state-metrics-0" Mar 10 07:07:38 crc kubenswrapper[4825]: I0310 07:07:38.454930 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 10 07:07:38 crc kubenswrapper[4825]: I0310 07:07:38.489585 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 10 07:07:38 crc kubenswrapper[4825]: I0310 07:07:38.588284 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 07:07:38 crc kubenswrapper[4825]: I0310 07:07:38.855099 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:07:38 crc kubenswrapper[4825]: I0310 07:07:38.855916 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b8d8c6c8-bda0-456d-9990-687257c80d94" containerName="ceilometer-central-agent" containerID="cri-o://14db831a97790172eaa4da4b994739695ed6532134e042b0b881af61d5be3571" gracePeriod=30 Mar 10 07:07:38 crc kubenswrapper[4825]: I0310 07:07:38.856349 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b8d8c6c8-bda0-456d-9990-687257c80d94" containerName="proxy-httpd" containerID="cri-o://1eb716d9b690168d881b625fe363382d3aef275e433436e5abbfe3274a8d8e0c" gracePeriod=30 Mar 10 07:07:38 crc kubenswrapper[4825]: I0310 07:07:38.856395 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b8d8c6c8-bda0-456d-9990-687257c80d94" containerName="ceilometer-notification-agent" containerID="cri-o://eb8eb918d07b85f1845a186b5d8a359823392ba86b985cd163ad0001b3649d0f" gracePeriod=30 Mar 10 07:07:38 crc kubenswrapper[4825]: I0310 07:07:38.856557 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b8d8c6c8-bda0-456d-9990-687257c80d94" containerName="sg-core" containerID="cri-o://30741bbef1130028001c57998a0fbcac6249f525d23117f98dbbac7ca47f746d" gracePeriod=30 Mar 10 07:07:38 crc kubenswrapper[4825]: I0310 07:07:38.905682 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 07:07:38 crc kubenswrapper[4825]: W0310 07:07:38.910015 4825 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04a2aca4_f98f_4ae8_aca9_62ae6625e5e2.slice/crio-308db8163601cfc01ebf7e62c87675153377d65c18058084fe5a827d7e708b89 WatchSource:0}: Error finding container 308db8163601cfc01ebf7e62c87675153377d65c18058084fe5a827d7e708b89: Status 404 returned error can't find the container with id 308db8163601cfc01ebf7e62c87675153377d65c18058084fe5a827d7e708b89 Mar 10 07:07:39 crc kubenswrapper[4825]: I0310 07:07:39.180094 4825 generic.go:334] "Generic (PLEG): container finished" podID="b8d8c6c8-bda0-456d-9990-687257c80d94" containerID="1eb716d9b690168d881b625fe363382d3aef275e433436e5abbfe3274a8d8e0c" exitCode=0 Mar 10 07:07:39 crc kubenswrapper[4825]: I0310 07:07:39.180144 4825 generic.go:334] "Generic (PLEG): container finished" podID="b8d8c6c8-bda0-456d-9990-687257c80d94" containerID="30741bbef1130028001c57998a0fbcac6249f525d23117f98dbbac7ca47f746d" exitCode=2 Mar 10 07:07:39 crc kubenswrapper[4825]: I0310 07:07:39.180174 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8d8c6c8-bda0-456d-9990-687257c80d94","Type":"ContainerDied","Data":"1eb716d9b690168d881b625fe363382d3aef275e433436e5abbfe3274a8d8e0c"} Mar 10 07:07:39 crc kubenswrapper[4825]: I0310 07:07:39.180222 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8d8c6c8-bda0-456d-9990-687257c80d94","Type":"ContainerDied","Data":"30741bbef1130028001c57998a0fbcac6249f525d23117f98dbbac7ca47f746d"} Mar 10 07:07:39 crc kubenswrapper[4825]: I0310 07:07:39.181836 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"04a2aca4-f98f-4ae8-aca9-62ae6625e5e2","Type":"ContainerStarted","Data":"308db8163601cfc01ebf7e62c87675153377d65c18058084fe5a827d7e708b89"} Mar 10 07:07:39 crc kubenswrapper[4825]: I0310 07:07:39.218832 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 
10 07:07:39 crc kubenswrapper[4825]: I0310 07:07:39.255627 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29b6d990-12b7-446f-9cd8-53c1111b1512" path="/var/lib/kubelet/pods/29b6d990-12b7-446f-9cd8-53c1111b1512/volumes" Mar 10 07:07:39 crc kubenswrapper[4825]: I0310 07:07:39.699151 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 07:07:39 crc kubenswrapper[4825]: I0310 07:07:39.699436 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 07:07:40 crc kubenswrapper[4825]: I0310 07:07:40.198482 4825 generic.go:334] "Generic (PLEG): container finished" podID="b8d8c6c8-bda0-456d-9990-687257c80d94" containerID="14db831a97790172eaa4da4b994739695ed6532134e042b0b881af61d5be3571" exitCode=0 Mar 10 07:07:40 crc kubenswrapper[4825]: I0310 07:07:40.198542 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8d8c6c8-bda0-456d-9990-687257c80d94","Type":"ContainerDied","Data":"14db831a97790172eaa4da4b994739695ed6532134e042b0b881af61d5be3571"} Mar 10 07:07:40 crc kubenswrapper[4825]: I0310 07:07:40.204013 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"04a2aca4-f98f-4ae8-aca9-62ae6625e5e2","Type":"ContainerStarted","Data":"fe8d444c42b3290f6dd09fc28b48048d43ae16fceb5be5dbfbb283eec952359c"} Mar 10 07:07:40 crc kubenswrapper[4825]: I0310 07:07:40.204242 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 10 07:07:40 crc kubenswrapper[4825]: I0310 07:07:40.233808 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.8679333630000001 podStartE2EDuration="2.2337923s" podCreationTimestamp="2026-03-10 07:07:38 +0000 UTC" firstStartedPulling="2026-03-10 07:07:38.915525485 +0000 UTC m=+1411.945306100" 
lastFinishedPulling="2026-03-10 07:07:39.281384412 +0000 UTC m=+1412.311165037" observedRunningTime="2026-03-10 07:07:40.227492746 +0000 UTC m=+1413.257273381" watchObservedRunningTime="2026-03-10 07:07:40.2337923 +0000 UTC m=+1413.263572915" Mar 10 07:07:40 crc kubenswrapper[4825]: I0310 07:07:40.782436 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3f36c81f-f491-403b-8491-463f9f6d3050" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.205:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 07:07:40 crc kubenswrapper[4825]: I0310 07:07:40.782452 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3f36c81f-f491-403b-8491-463f9f6d3050" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.205:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 07:07:42 crc kubenswrapper[4825]: I0310 07:07:42.814924 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 07:07:42 crc kubenswrapper[4825]: I0310 07:07:42.983789 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8d8c6c8-bda0-456d-9990-687257c80d94-scripts\") pod \"b8d8c6c8-bda0-456d-9990-687257c80d94\" (UID: \"b8d8c6c8-bda0-456d-9990-687257c80d94\") " Mar 10 07:07:42 crc kubenswrapper[4825]: I0310 07:07:42.983881 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8d8c6c8-bda0-456d-9990-687257c80d94-config-data\") pod \"b8d8c6c8-bda0-456d-9990-687257c80d94\" (UID: \"b8d8c6c8-bda0-456d-9990-687257c80d94\") " Mar 10 07:07:42 crc kubenswrapper[4825]: I0310 07:07:42.984193 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8d8c6c8-bda0-456d-9990-687257c80d94-combined-ca-bundle\") pod \"b8d8c6c8-bda0-456d-9990-687257c80d94\" (UID: \"b8d8c6c8-bda0-456d-9990-687257c80d94\") " Mar 10 07:07:42 crc kubenswrapper[4825]: I0310 07:07:42.984260 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8d8c6c8-bda0-456d-9990-687257c80d94-run-httpd\") pod \"b8d8c6c8-bda0-456d-9990-687257c80d94\" (UID: \"b8d8c6c8-bda0-456d-9990-687257c80d94\") " Mar 10 07:07:42 crc kubenswrapper[4825]: I0310 07:07:42.984344 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8d8c6c8-bda0-456d-9990-687257c80d94-log-httpd\") pod \"b8d8c6c8-bda0-456d-9990-687257c80d94\" (UID: \"b8d8c6c8-bda0-456d-9990-687257c80d94\") " Mar 10 07:07:42 crc kubenswrapper[4825]: I0310 07:07:42.984515 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8d8c6c8-bda0-456d-9990-687257c80d94-run-httpd" 
(OuterVolumeSpecName: "run-httpd") pod "b8d8c6c8-bda0-456d-9990-687257c80d94" (UID: "b8d8c6c8-bda0-456d-9990-687257c80d94"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:07:42 crc kubenswrapper[4825]: I0310 07:07:42.984981 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8d8c6c8-bda0-456d-9990-687257c80d94-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b8d8c6c8-bda0-456d-9990-687257c80d94" (UID: "b8d8c6c8-bda0-456d-9990-687257c80d94"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:07:42 crc kubenswrapper[4825]: I0310 07:07:42.985099 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b8d8c6c8-bda0-456d-9990-687257c80d94-sg-core-conf-yaml\") pod \"b8d8c6c8-bda0-456d-9990-687257c80d94\" (UID: \"b8d8c6c8-bda0-456d-9990-687257c80d94\") " Mar 10 07:07:42 crc kubenswrapper[4825]: I0310 07:07:42.985654 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdprd\" (UniqueName: \"kubernetes.io/projected/b8d8c6c8-bda0-456d-9990-687257c80d94-kube-api-access-pdprd\") pod \"b8d8c6c8-bda0-456d-9990-687257c80d94\" (UID: \"b8d8c6c8-bda0-456d-9990-687257c80d94\") " Mar 10 07:07:42 crc kubenswrapper[4825]: I0310 07:07:42.986629 4825 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8d8c6c8-bda0-456d-9990-687257c80d94-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:42 crc kubenswrapper[4825]: I0310 07:07:42.986679 4825 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b8d8c6c8-bda0-456d-9990-687257c80d94-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:42 crc kubenswrapper[4825]: I0310 07:07:42.990330 4825 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8d8c6c8-bda0-456d-9990-687257c80d94-scripts" (OuterVolumeSpecName: "scripts") pod "b8d8c6c8-bda0-456d-9990-687257c80d94" (UID: "b8d8c6c8-bda0-456d-9990-687257c80d94"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:07:42 crc kubenswrapper[4825]: I0310 07:07:42.990388 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8d8c6c8-bda0-456d-9990-687257c80d94-kube-api-access-pdprd" (OuterVolumeSpecName: "kube-api-access-pdprd") pod "b8d8c6c8-bda0-456d-9990-687257c80d94" (UID: "b8d8c6c8-bda0-456d-9990-687257c80d94"). InnerVolumeSpecName "kube-api-access-pdprd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.025389 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8d8c6c8-bda0-456d-9990-687257c80d94-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b8d8c6c8-bda0-456d-9990-687257c80d94" (UID: "b8d8c6c8-bda0-456d-9990-687257c80d94"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.073883 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8d8c6c8-bda0-456d-9990-687257c80d94-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b8d8c6c8-bda0-456d-9990-687257c80d94" (UID: "b8d8c6c8-bda0-456d-9990-687257c80d94"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.088073 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8d8c6c8-bda0-456d-9990-687257c80d94-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.088106 4825 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b8d8c6c8-bda0-456d-9990-687257c80d94-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.088115 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdprd\" (UniqueName: \"kubernetes.io/projected/b8d8c6c8-bda0-456d-9990-687257c80d94-kube-api-access-pdprd\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.088171 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8d8c6c8-bda0-456d-9990-687257c80d94-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.125314 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8d8c6c8-bda0-456d-9990-687257c80d94-config-data" (OuterVolumeSpecName: "config-data") pod "b8d8c6c8-bda0-456d-9990-687257c80d94" (UID: "b8d8c6c8-bda0-456d-9990-687257c80d94"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.190182 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8d8c6c8-bda0-456d-9990-687257c80d94-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.242705 4825 generic.go:334] "Generic (PLEG): container finished" podID="b8d8c6c8-bda0-456d-9990-687257c80d94" containerID="eb8eb918d07b85f1845a186b5d8a359823392ba86b985cd163ad0001b3649d0f" exitCode=0 Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.242785 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.274427 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8d8c6c8-bda0-456d-9990-687257c80d94","Type":"ContainerDied","Data":"eb8eb918d07b85f1845a186b5d8a359823392ba86b985cd163ad0001b3649d0f"} Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.274485 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b8d8c6c8-bda0-456d-9990-687257c80d94","Type":"ContainerDied","Data":"adf5d4e91a570e81498106b743d4883dc61f0def6e11d66e836f8a700e0b8237"} Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.274514 4825 scope.go:117] "RemoveContainer" containerID="1eb716d9b690168d881b625fe363382d3aef275e433436e5abbfe3274a8d8e0c" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.318828 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.346009 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.357117 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:07:43 crc kubenswrapper[4825]: 
E0310 07:07:43.357894 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8d8c6c8-bda0-456d-9990-687257c80d94" containerName="sg-core" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.357947 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8d8c6c8-bda0-456d-9990-687257c80d94" containerName="sg-core" Mar 10 07:07:43 crc kubenswrapper[4825]: E0310 07:07:43.357967 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8d8c6c8-bda0-456d-9990-687257c80d94" containerName="proxy-httpd" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.357975 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8d8c6c8-bda0-456d-9990-687257c80d94" containerName="proxy-httpd" Mar 10 07:07:43 crc kubenswrapper[4825]: E0310 07:07:43.357993 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8d8c6c8-bda0-456d-9990-687257c80d94" containerName="ceilometer-central-agent" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.358034 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8d8c6c8-bda0-456d-9990-687257c80d94" containerName="ceilometer-central-agent" Mar 10 07:07:43 crc kubenswrapper[4825]: E0310 07:07:43.358064 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8d8c6c8-bda0-456d-9990-687257c80d94" containerName="ceilometer-notification-agent" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.358072 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8d8c6c8-bda0-456d-9990-687257c80d94" containerName="ceilometer-notification-agent" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.358575 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8d8c6c8-bda0-456d-9990-687257c80d94" containerName="sg-core" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.358632 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8d8c6c8-bda0-456d-9990-687257c80d94" containerName="ceilometer-notification-agent" Mar 10 07:07:43 
crc kubenswrapper[4825]: I0310 07:07:43.358650 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8d8c6c8-bda0-456d-9990-687257c80d94" containerName="ceilometer-central-agent" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.358671 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8d8c6c8-bda0-456d-9990-687257c80d94" containerName="proxy-httpd" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.361528 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.364453 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.364634 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.367347 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.368945 4825 scope.go:117] "RemoveContainer" containerID="30741bbef1130028001c57998a0fbcac6249f525d23117f98dbbac7ca47f746d" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.371499 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.393059 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8691271e-1ca4-43e0-86e1-f2c0aa90c948-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8691271e-1ca4-43e0-86e1-f2c0aa90c948\") " pod="openstack/ceilometer-0" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.393098 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89d2f\" (UniqueName: 
\"kubernetes.io/projected/8691271e-1ca4-43e0-86e1-f2c0aa90c948-kube-api-access-89d2f\") pod \"ceilometer-0\" (UID: \"8691271e-1ca4-43e0-86e1-f2c0aa90c948\") " pod="openstack/ceilometer-0" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.393164 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8691271e-1ca4-43e0-86e1-f2c0aa90c948-scripts\") pod \"ceilometer-0\" (UID: \"8691271e-1ca4-43e0-86e1-f2c0aa90c948\") " pod="openstack/ceilometer-0" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.393213 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8691271e-1ca4-43e0-86e1-f2c0aa90c948-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8691271e-1ca4-43e0-86e1-f2c0aa90c948\") " pod="openstack/ceilometer-0" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.393257 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8691271e-1ca4-43e0-86e1-f2c0aa90c948-config-data\") pod \"ceilometer-0\" (UID: \"8691271e-1ca4-43e0-86e1-f2c0aa90c948\") " pod="openstack/ceilometer-0" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.393319 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8691271e-1ca4-43e0-86e1-f2c0aa90c948-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8691271e-1ca4-43e0-86e1-f2c0aa90c948\") " pod="openstack/ceilometer-0" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.393337 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8691271e-1ca4-43e0-86e1-f2c0aa90c948-log-httpd\") pod \"ceilometer-0\" (UID: 
\"8691271e-1ca4-43e0-86e1-f2c0aa90c948\") " pod="openstack/ceilometer-0" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.393361 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8691271e-1ca4-43e0-86e1-f2c0aa90c948-run-httpd\") pod \"ceilometer-0\" (UID: \"8691271e-1ca4-43e0-86e1-f2c0aa90c948\") " pod="openstack/ceilometer-0" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.395470 4825 scope.go:117] "RemoveContainer" containerID="eb8eb918d07b85f1845a186b5d8a359823392ba86b985cd163ad0001b3649d0f" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.417073 4825 scope.go:117] "RemoveContainer" containerID="14db831a97790172eaa4da4b994739695ed6532134e042b0b881af61d5be3571" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.479329 4825 scope.go:117] "RemoveContainer" containerID="1eb716d9b690168d881b625fe363382d3aef275e433436e5abbfe3274a8d8e0c" Mar 10 07:07:43 crc kubenswrapper[4825]: E0310 07:07:43.479797 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1eb716d9b690168d881b625fe363382d3aef275e433436e5abbfe3274a8d8e0c\": container with ID starting with 1eb716d9b690168d881b625fe363382d3aef275e433436e5abbfe3274a8d8e0c not found: ID does not exist" containerID="1eb716d9b690168d881b625fe363382d3aef275e433436e5abbfe3274a8d8e0c" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.479847 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1eb716d9b690168d881b625fe363382d3aef275e433436e5abbfe3274a8d8e0c"} err="failed to get container status \"1eb716d9b690168d881b625fe363382d3aef275e433436e5abbfe3274a8d8e0c\": rpc error: code = NotFound desc = could not find container \"1eb716d9b690168d881b625fe363382d3aef275e433436e5abbfe3274a8d8e0c\": container with ID starting with 
1eb716d9b690168d881b625fe363382d3aef275e433436e5abbfe3274a8d8e0c not found: ID does not exist" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.479881 4825 scope.go:117] "RemoveContainer" containerID="30741bbef1130028001c57998a0fbcac6249f525d23117f98dbbac7ca47f746d" Mar 10 07:07:43 crc kubenswrapper[4825]: E0310 07:07:43.480237 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30741bbef1130028001c57998a0fbcac6249f525d23117f98dbbac7ca47f746d\": container with ID starting with 30741bbef1130028001c57998a0fbcac6249f525d23117f98dbbac7ca47f746d not found: ID does not exist" containerID="30741bbef1130028001c57998a0fbcac6249f525d23117f98dbbac7ca47f746d" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.480282 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30741bbef1130028001c57998a0fbcac6249f525d23117f98dbbac7ca47f746d"} err="failed to get container status \"30741bbef1130028001c57998a0fbcac6249f525d23117f98dbbac7ca47f746d\": rpc error: code = NotFound desc = could not find container \"30741bbef1130028001c57998a0fbcac6249f525d23117f98dbbac7ca47f746d\": container with ID starting with 30741bbef1130028001c57998a0fbcac6249f525d23117f98dbbac7ca47f746d not found: ID does not exist" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.480315 4825 scope.go:117] "RemoveContainer" containerID="eb8eb918d07b85f1845a186b5d8a359823392ba86b985cd163ad0001b3649d0f" Mar 10 07:07:43 crc kubenswrapper[4825]: E0310 07:07:43.480608 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb8eb918d07b85f1845a186b5d8a359823392ba86b985cd163ad0001b3649d0f\": container with ID starting with eb8eb918d07b85f1845a186b5d8a359823392ba86b985cd163ad0001b3649d0f not found: ID does not exist" containerID="eb8eb918d07b85f1845a186b5d8a359823392ba86b985cd163ad0001b3649d0f" Mar 10 07:07:43 crc 
kubenswrapper[4825]: I0310 07:07:43.480643 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb8eb918d07b85f1845a186b5d8a359823392ba86b985cd163ad0001b3649d0f"} err="failed to get container status \"eb8eb918d07b85f1845a186b5d8a359823392ba86b985cd163ad0001b3649d0f\": rpc error: code = NotFound desc = could not find container \"eb8eb918d07b85f1845a186b5d8a359823392ba86b985cd163ad0001b3649d0f\": container with ID starting with eb8eb918d07b85f1845a186b5d8a359823392ba86b985cd163ad0001b3649d0f not found: ID does not exist" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.480666 4825 scope.go:117] "RemoveContainer" containerID="14db831a97790172eaa4da4b994739695ed6532134e042b0b881af61d5be3571" Mar 10 07:07:43 crc kubenswrapper[4825]: E0310 07:07:43.480877 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14db831a97790172eaa4da4b994739695ed6532134e042b0b881af61d5be3571\": container with ID starting with 14db831a97790172eaa4da4b994739695ed6532134e042b0b881af61d5be3571 not found: ID does not exist" containerID="14db831a97790172eaa4da4b994739695ed6532134e042b0b881af61d5be3571" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.480901 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14db831a97790172eaa4da4b994739695ed6532134e042b0b881af61d5be3571"} err="failed to get container status \"14db831a97790172eaa4da4b994739695ed6532134e042b0b881af61d5be3571\": rpc error: code = NotFound desc = could not find container \"14db831a97790172eaa4da4b994739695ed6532134e042b0b881af61d5be3571\": container with ID starting with 14db831a97790172eaa4da4b994739695ed6532134e042b0b881af61d5be3571 not found: ID does not exist" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.495114 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8691271e-1ca4-43e0-86e1-f2c0aa90c948-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8691271e-1ca4-43e0-86e1-f2c0aa90c948\") " pod="openstack/ceilometer-0" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.495232 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8691271e-1ca4-43e0-86e1-f2c0aa90c948-log-httpd\") pod \"ceilometer-0\" (UID: \"8691271e-1ca4-43e0-86e1-f2c0aa90c948\") " pod="openstack/ceilometer-0" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.495270 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8691271e-1ca4-43e0-86e1-f2c0aa90c948-run-httpd\") pod \"ceilometer-0\" (UID: \"8691271e-1ca4-43e0-86e1-f2c0aa90c948\") " pod="openstack/ceilometer-0" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.496336 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8691271e-1ca4-43e0-86e1-f2c0aa90c948-log-httpd\") pod \"ceilometer-0\" (UID: \"8691271e-1ca4-43e0-86e1-f2c0aa90c948\") " pod="openstack/ceilometer-0" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.496424 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8691271e-1ca4-43e0-86e1-f2c0aa90c948-run-httpd\") pod \"ceilometer-0\" (UID: \"8691271e-1ca4-43e0-86e1-f2c0aa90c948\") " pod="openstack/ceilometer-0" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.496433 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8691271e-1ca4-43e0-86e1-f2c0aa90c948-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8691271e-1ca4-43e0-86e1-f2c0aa90c948\") " pod="openstack/ceilometer-0" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.496503 4825 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89d2f\" (UniqueName: \"kubernetes.io/projected/8691271e-1ca4-43e0-86e1-f2c0aa90c948-kube-api-access-89d2f\") pod \"ceilometer-0\" (UID: \"8691271e-1ca4-43e0-86e1-f2c0aa90c948\") " pod="openstack/ceilometer-0" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.496619 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8691271e-1ca4-43e0-86e1-f2c0aa90c948-scripts\") pod \"ceilometer-0\" (UID: \"8691271e-1ca4-43e0-86e1-f2c0aa90c948\") " pod="openstack/ceilometer-0" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.496653 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8691271e-1ca4-43e0-86e1-f2c0aa90c948-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8691271e-1ca4-43e0-86e1-f2c0aa90c948\") " pod="openstack/ceilometer-0" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.496763 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8691271e-1ca4-43e0-86e1-f2c0aa90c948-config-data\") pod \"ceilometer-0\" (UID: \"8691271e-1ca4-43e0-86e1-f2c0aa90c948\") " pod="openstack/ceilometer-0" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.500543 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8691271e-1ca4-43e0-86e1-f2c0aa90c948-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8691271e-1ca4-43e0-86e1-f2c0aa90c948\") " pod="openstack/ceilometer-0" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.500568 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8691271e-1ca4-43e0-86e1-f2c0aa90c948-scripts\") pod \"ceilometer-0\" (UID: 
\"8691271e-1ca4-43e0-86e1-f2c0aa90c948\") " pod="openstack/ceilometer-0" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.500787 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8691271e-1ca4-43e0-86e1-f2c0aa90c948-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8691271e-1ca4-43e0-86e1-f2c0aa90c948\") " pod="openstack/ceilometer-0" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.501071 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8691271e-1ca4-43e0-86e1-f2c0aa90c948-config-data\") pod \"ceilometer-0\" (UID: \"8691271e-1ca4-43e0-86e1-f2c0aa90c948\") " pod="openstack/ceilometer-0" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.501735 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8691271e-1ca4-43e0-86e1-f2c0aa90c948-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8691271e-1ca4-43e0-86e1-f2c0aa90c948\") " pod="openstack/ceilometer-0" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.516292 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89d2f\" (UniqueName: \"kubernetes.io/projected/8691271e-1ca4-43e0-86e1-f2c0aa90c948-kube-api-access-89d2f\") pod \"ceilometer-0\" (UID: \"8691271e-1ca4-43e0-86e1-f2c0aa90c948\") " pod="openstack/ceilometer-0" Mar 10 07:07:43 crc kubenswrapper[4825]: I0310 07:07:43.687445 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 07:07:44 crc kubenswrapper[4825]: W0310 07:07:44.166542 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8691271e_1ca4_43e0_86e1_f2c0aa90c948.slice/crio-9fd6457fed95c4b7ee21ca9fcdcc45a30f2fe3de43c8f1bfa2a85472da924410 WatchSource:0}: Error finding container 9fd6457fed95c4b7ee21ca9fcdcc45a30f2fe3de43c8f1bfa2a85472da924410: Status 404 returned error can't find the container with id 9fd6457fed95c4b7ee21ca9fcdcc45a30f2fe3de43c8f1bfa2a85472da924410 Mar 10 07:07:44 crc kubenswrapper[4825]: I0310 07:07:44.172636 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:07:44 crc kubenswrapper[4825]: I0310 07:07:44.258793 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8691271e-1ca4-43e0-86e1-f2c0aa90c948","Type":"ContainerStarted","Data":"9fd6457fed95c4b7ee21ca9fcdcc45a30f2fe3de43c8f1bfa2a85472da924410"} Mar 10 07:07:44 crc kubenswrapper[4825]: I0310 07:07:44.364848 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 10 07:07:44 crc kubenswrapper[4825]: I0310 07:07:44.369303 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 10 07:07:44 crc kubenswrapper[4825]: I0310 07:07:44.373237 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 10 07:07:45 crc kubenswrapper[4825]: I0310 07:07:45.255763 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8d8c6c8-bda0-456d-9990-687257c80d94" path="/var/lib/kubelet/pods/b8d8c6c8-bda0-456d-9990-687257c80d94/volumes" Mar 10 07:07:45 crc kubenswrapper[4825]: I0310 07:07:45.272178 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"8691271e-1ca4-43e0-86e1-f2c0aa90c948","Type":"ContainerStarted","Data":"c5c2db600706cdcc66a3813eebd52b1ce82e80b78169ce004e8f6b05f39dc3f3"} Mar 10 07:07:45 crc kubenswrapper[4825]: I0310 07:07:45.279209 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 10 07:07:46 crc kubenswrapper[4825]: I0310 07:07:46.064236 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 07:07:46 crc kubenswrapper[4825]: I0310 07:07:46.242734 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8hcw\" (UniqueName: \"kubernetes.io/projected/568490b2-38d9-45d8-ad81-cfffffc697f2-kube-api-access-j8hcw\") pod \"568490b2-38d9-45d8-ad81-cfffffc697f2\" (UID: \"568490b2-38d9-45d8-ad81-cfffffc697f2\") " Mar 10 07:07:46 crc kubenswrapper[4825]: I0310 07:07:46.243098 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/568490b2-38d9-45d8-ad81-cfffffc697f2-config-data\") pod \"568490b2-38d9-45d8-ad81-cfffffc697f2\" (UID: \"568490b2-38d9-45d8-ad81-cfffffc697f2\") " Mar 10 07:07:46 crc kubenswrapper[4825]: I0310 07:07:46.243391 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/568490b2-38d9-45d8-ad81-cfffffc697f2-combined-ca-bundle\") pod \"568490b2-38d9-45d8-ad81-cfffffc697f2\" (UID: \"568490b2-38d9-45d8-ad81-cfffffc697f2\") " Mar 10 07:07:46 crc kubenswrapper[4825]: I0310 07:07:46.249362 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/568490b2-38d9-45d8-ad81-cfffffc697f2-kube-api-access-j8hcw" (OuterVolumeSpecName: "kube-api-access-j8hcw") pod "568490b2-38d9-45d8-ad81-cfffffc697f2" (UID: "568490b2-38d9-45d8-ad81-cfffffc697f2"). InnerVolumeSpecName "kube-api-access-j8hcw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:07:46 crc kubenswrapper[4825]: I0310 07:07:46.272472 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/568490b2-38d9-45d8-ad81-cfffffc697f2-config-data" (OuterVolumeSpecName: "config-data") pod "568490b2-38d9-45d8-ad81-cfffffc697f2" (UID: "568490b2-38d9-45d8-ad81-cfffffc697f2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:07:46 crc kubenswrapper[4825]: I0310 07:07:46.287127 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/568490b2-38d9-45d8-ad81-cfffffc697f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "568490b2-38d9-45d8-ad81-cfffffc697f2" (UID: "568490b2-38d9-45d8-ad81-cfffffc697f2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:07:46 crc kubenswrapper[4825]: I0310 07:07:46.287627 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8691271e-1ca4-43e0-86e1-f2c0aa90c948","Type":"ContainerStarted","Data":"bd6f516eb4fef72025380179b0e0cae4a79cbd0e0d056ea4fd5825b3988695fa"} Mar 10 07:07:46 crc kubenswrapper[4825]: I0310 07:07:46.294756 4825 generic.go:334] "Generic (PLEG): container finished" podID="568490b2-38d9-45d8-ad81-cfffffc697f2" containerID="17ed4ed2a06281c643252e4ec49d9bfea8bd858f1cae1cf04f57b290cecc1a71" exitCode=137 Mar 10 07:07:46 crc kubenswrapper[4825]: I0310 07:07:46.296173 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"568490b2-38d9-45d8-ad81-cfffffc697f2","Type":"ContainerDied","Data":"17ed4ed2a06281c643252e4ec49d9bfea8bd858f1cae1cf04f57b290cecc1a71"} Mar 10 07:07:46 crc kubenswrapper[4825]: I0310 07:07:46.296204 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 07:07:46 crc kubenswrapper[4825]: I0310 07:07:46.296272 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"568490b2-38d9-45d8-ad81-cfffffc697f2","Type":"ContainerDied","Data":"b8df5da5426b8daa53c19be52592664babf87026fce3ccbfcca15714e3d2f09a"} Mar 10 07:07:46 crc kubenswrapper[4825]: I0310 07:07:46.296298 4825 scope.go:117] "RemoveContainer" containerID="17ed4ed2a06281c643252e4ec49d9bfea8bd858f1cae1cf04f57b290cecc1a71" Mar 10 07:07:46 crc kubenswrapper[4825]: I0310 07:07:46.328341 4825 scope.go:117] "RemoveContainer" containerID="17ed4ed2a06281c643252e4ec49d9bfea8bd858f1cae1cf04f57b290cecc1a71" Mar 10 07:07:46 crc kubenswrapper[4825]: E0310 07:07:46.328863 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17ed4ed2a06281c643252e4ec49d9bfea8bd858f1cae1cf04f57b290cecc1a71\": container with ID starting with 17ed4ed2a06281c643252e4ec49d9bfea8bd858f1cae1cf04f57b290cecc1a71 not found: ID does not exist" containerID="17ed4ed2a06281c643252e4ec49d9bfea8bd858f1cae1cf04f57b290cecc1a71" Mar 10 07:07:46 crc kubenswrapper[4825]: I0310 07:07:46.328908 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17ed4ed2a06281c643252e4ec49d9bfea8bd858f1cae1cf04f57b290cecc1a71"} err="failed to get container status \"17ed4ed2a06281c643252e4ec49d9bfea8bd858f1cae1cf04f57b290cecc1a71\": rpc error: code = NotFound desc = could not find container \"17ed4ed2a06281c643252e4ec49d9bfea8bd858f1cae1cf04f57b290cecc1a71\": container with ID starting with 17ed4ed2a06281c643252e4ec49d9bfea8bd858f1cae1cf04f57b290cecc1a71 not found: ID does not exist" Mar 10 07:07:46 crc kubenswrapper[4825]: I0310 07:07:46.345779 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8hcw\" (UniqueName: 
\"kubernetes.io/projected/568490b2-38d9-45d8-ad81-cfffffc697f2-kube-api-access-j8hcw\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:46 crc kubenswrapper[4825]: I0310 07:07:46.345820 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/568490b2-38d9-45d8-ad81-cfffffc697f2-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:46 crc kubenswrapper[4825]: I0310 07:07:46.345838 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/568490b2-38d9-45d8-ad81-cfffffc697f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:46 crc kubenswrapper[4825]: I0310 07:07:46.365686 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 07:07:46 crc kubenswrapper[4825]: I0310 07:07:46.381641 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 07:07:46 crc kubenswrapper[4825]: I0310 07:07:46.396323 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 07:07:46 crc kubenswrapper[4825]: E0310 07:07:46.416359 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="568490b2-38d9-45d8-ad81-cfffffc697f2" containerName="nova-cell1-novncproxy-novncproxy" Mar 10 07:07:46 crc kubenswrapper[4825]: I0310 07:07:46.416395 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="568490b2-38d9-45d8-ad81-cfffffc697f2" containerName="nova-cell1-novncproxy-novncproxy" Mar 10 07:07:46 crc kubenswrapper[4825]: I0310 07:07:46.417841 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="568490b2-38d9-45d8-ad81-cfffffc697f2" containerName="nova-cell1-novncproxy-novncproxy" Mar 10 07:07:46 crc kubenswrapper[4825]: I0310 07:07:46.443474 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 07:07:46 crc kubenswrapper[4825]: I0310 07:07:46.443953 
4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 07:07:46 crc kubenswrapper[4825]: I0310 07:07:46.447987 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 10 07:07:46 crc kubenswrapper[4825]: I0310 07:07:46.448001 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 10 07:07:46 crc kubenswrapper[4825]: I0310 07:07:46.448330 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 10 07:07:46 crc kubenswrapper[4825]: I0310 07:07:46.450163 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/66c3beca-bb2c-4b68-a4b3-3cfe936c25fd-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"66c3beca-bb2c-4b68-a4b3-3cfe936c25fd\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 07:07:46 crc kubenswrapper[4825]: I0310 07:07:46.450248 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66c3beca-bb2c-4b68-a4b3-3cfe936c25fd-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"66c3beca-bb2c-4b68-a4b3-3cfe936c25fd\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 07:07:46 crc kubenswrapper[4825]: I0310 07:07:46.450296 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66c3beca-bb2c-4b68-a4b3-3cfe936c25fd-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"66c3beca-bb2c-4b68-a4b3-3cfe936c25fd\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 07:07:46 crc kubenswrapper[4825]: I0310 07:07:46.450341 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmmfw\" (UniqueName: \"kubernetes.io/projected/66c3beca-bb2c-4b68-a4b3-3cfe936c25fd-kube-api-access-tmmfw\") pod \"nova-cell1-novncproxy-0\" (UID: \"66c3beca-bb2c-4b68-a4b3-3cfe936c25fd\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 07:07:46 crc kubenswrapper[4825]: I0310 07:07:46.450374 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/66c3beca-bb2c-4b68-a4b3-3cfe936c25fd-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"66c3beca-bb2c-4b68-a4b3-3cfe936c25fd\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 07:07:46 crc kubenswrapper[4825]: I0310 07:07:46.552507 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/66c3beca-bb2c-4b68-a4b3-3cfe936c25fd-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"66c3beca-bb2c-4b68-a4b3-3cfe936c25fd\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 07:07:46 crc kubenswrapper[4825]: I0310 07:07:46.552677 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/66c3beca-bb2c-4b68-a4b3-3cfe936c25fd-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"66c3beca-bb2c-4b68-a4b3-3cfe936c25fd\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 07:07:46 crc kubenswrapper[4825]: I0310 07:07:46.552753 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66c3beca-bb2c-4b68-a4b3-3cfe936c25fd-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"66c3beca-bb2c-4b68-a4b3-3cfe936c25fd\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 07:07:46 crc kubenswrapper[4825]: I0310 07:07:46.552799 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66c3beca-bb2c-4b68-a4b3-3cfe936c25fd-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"66c3beca-bb2c-4b68-a4b3-3cfe936c25fd\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 07:07:46 crc kubenswrapper[4825]: I0310 07:07:46.552843 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmmfw\" (UniqueName: \"kubernetes.io/projected/66c3beca-bb2c-4b68-a4b3-3cfe936c25fd-kube-api-access-tmmfw\") pod \"nova-cell1-novncproxy-0\" (UID: \"66c3beca-bb2c-4b68-a4b3-3cfe936c25fd\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 07:07:46 crc kubenswrapper[4825]: I0310 07:07:46.556202 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/66c3beca-bb2c-4b68-a4b3-3cfe936c25fd-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"66c3beca-bb2c-4b68-a4b3-3cfe936c25fd\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 07:07:46 crc kubenswrapper[4825]: I0310 07:07:46.556507 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/66c3beca-bb2c-4b68-a4b3-3cfe936c25fd-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"66c3beca-bb2c-4b68-a4b3-3cfe936c25fd\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 07:07:46 crc kubenswrapper[4825]: I0310 07:07:46.557763 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66c3beca-bb2c-4b68-a4b3-3cfe936c25fd-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"66c3beca-bb2c-4b68-a4b3-3cfe936c25fd\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 07:07:46 crc kubenswrapper[4825]: I0310 07:07:46.559072 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/66c3beca-bb2c-4b68-a4b3-3cfe936c25fd-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"66c3beca-bb2c-4b68-a4b3-3cfe936c25fd\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 07:07:46 crc kubenswrapper[4825]: I0310 07:07:46.568599 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmmfw\" (UniqueName: \"kubernetes.io/projected/66c3beca-bb2c-4b68-a4b3-3cfe936c25fd-kube-api-access-tmmfw\") pod \"nova-cell1-novncproxy-0\" (UID: \"66c3beca-bb2c-4b68-a4b3-3cfe936c25fd\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 07:07:46 crc kubenswrapper[4825]: I0310 07:07:46.809555 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 07:07:47 crc kubenswrapper[4825]: I0310 07:07:47.258360 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="568490b2-38d9-45d8-ad81-cfffffc697f2" path="/var/lib/kubelet/pods/568490b2-38d9-45d8-ad81-cfffffc697f2/volumes" Mar 10 07:07:47 crc kubenswrapper[4825]: I0310 07:07:47.286227 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 07:07:47 crc kubenswrapper[4825]: W0310 07:07:47.297367 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66c3beca_bb2c_4b68_a4b3_3cfe936c25fd.slice/crio-73ff8081b411e296c16476218fa147d30c5d3bdabcdba8be0470daa44e29eb83 WatchSource:0}: Error finding container 73ff8081b411e296c16476218fa147d30c5d3bdabcdba8be0470daa44e29eb83: Status 404 returned error can't find the container with id 73ff8081b411e296c16476218fa147d30c5d3bdabcdba8be0470daa44e29eb83 Mar 10 07:07:47 crc kubenswrapper[4825]: I0310 07:07:47.312772 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"8691271e-1ca4-43e0-86e1-f2c0aa90c948","Type":"ContainerStarted","Data":"3fe1d94f6bcbe8e8cac7d6e0aaea123234902339250d4be6e3d260b608ed6542"} Mar 10 07:07:48 crc kubenswrapper[4825]: I0310 07:07:48.332649 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"66c3beca-bb2c-4b68-a4b3-3cfe936c25fd","Type":"ContainerStarted","Data":"84161414c9dbeb64c9936bc248a59e4b32d0591a084774c91d9da931fd27e821"} Mar 10 07:07:48 crc kubenswrapper[4825]: I0310 07:07:48.332920 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"66c3beca-bb2c-4b68-a4b3-3cfe936c25fd","Type":"ContainerStarted","Data":"73ff8081b411e296c16476218fa147d30c5d3bdabcdba8be0470daa44e29eb83"} Mar 10 07:07:48 crc kubenswrapper[4825]: I0310 07:07:48.357704 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.357680709 podStartE2EDuration="2.357680709s" podCreationTimestamp="2026-03-10 07:07:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:07:48.35464479 +0000 UTC m=+1421.384425415" watchObservedRunningTime="2026-03-10 07:07:48.357680709 +0000 UTC m=+1421.387461324" Mar 10 07:07:48 crc kubenswrapper[4825]: I0310 07:07:48.604961 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 10 07:07:49 crc kubenswrapper[4825]: I0310 07:07:49.350524 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8691271e-1ca4-43e0-86e1-f2c0aa90c948","Type":"ContainerStarted","Data":"3bb4d944e2e1ba5c6daa524172b1d24c0ad434fe5d0521bb808e61afa02ef02d"} Mar 10 07:07:49 crc kubenswrapper[4825]: I0310 07:07:49.351513 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 07:07:49 crc 
kubenswrapper[4825]: I0310 07:07:49.382024 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.683062455 podStartE2EDuration="6.38199288s" podCreationTimestamp="2026-03-10 07:07:43 +0000 UTC" firstStartedPulling="2026-03-10 07:07:44.169708724 +0000 UTC m=+1417.199489349" lastFinishedPulling="2026-03-10 07:07:48.868639159 +0000 UTC m=+1421.898419774" observedRunningTime="2026-03-10 07:07:49.376494137 +0000 UTC m=+1422.406274752" watchObservedRunningTime="2026-03-10 07:07:49.38199288 +0000 UTC m=+1422.411773495" Mar 10 07:07:49 crc kubenswrapper[4825]: I0310 07:07:49.704098 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 10 07:07:49 crc kubenswrapper[4825]: I0310 07:07:49.704579 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 10 07:07:49 crc kubenswrapper[4825]: I0310 07:07:49.704632 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 10 07:07:49 crc kubenswrapper[4825]: I0310 07:07:49.733945 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 10 07:07:50 crc kubenswrapper[4825]: I0310 07:07:50.362679 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 10 07:07:50 crc kubenswrapper[4825]: I0310 07:07:50.366197 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 10 07:07:50 crc kubenswrapper[4825]: I0310 07:07:50.541196 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7749c44969-kww8g"] Mar 10 07:07:50 crc kubenswrapper[4825]: I0310 07:07:50.543907 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7749c44969-kww8g" Mar 10 07:07:50 crc kubenswrapper[4825]: I0310 07:07:50.547895 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d184a8b-881b-4f12-8c5b-cb355193fd98-ovsdbserver-nb\") pod \"dnsmasq-dns-7749c44969-kww8g\" (UID: \"7d184a8b-881b-4f12-8c5b-cb355193fd98\") " pod="openstack/dnsmasq-dns-7749c44969-kww8g" Mar 10 07:07:50 crc kubenswrapper[4825]: I0310 07:07:50.547957 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7d184a8b-881b-4f12-8c5b-cb355193fd98-dns-swift-storage-0\") pod \"dnsmasq-dns-7749c44969-kww8g\" (UID: \"7d184a8b-881b-4f12-8c5b-cb355193fd98\") " pod="openstack/dnsmasq-dns-7749c44969-kww8g" Mar 10 07:07:50 crc kubenswrapper[4825]: I0310 07:07:50.548030 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dsjb\" (UniqueName: \"kubernetes.io/projected/7d184a8b-881b-4f12-8c5b-cb355193fd98-kube-api-access-2dsjb\") pod \"dnsmasq-dns-7749c44969-kww8g\" (UID: \"7d184a8b-881b-4f12-8c5b-cb355193fd98\") " pod="openstack/dnsmasq-dns-7749c44969-kww8g" Mar 10 07:07:50 crc kubenswrapper[4825]: I0310 07:07:50.548074 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d184a8b-881b-4f12-8c5b-cb355193fd98-config\") pod \"dnsmasq-dns-7749c44969-kww8g\" (UID: \"7d184a8b-881b-4f12-8c5b-cb355193fd98\") " pod="openstack/dnsmasq-dns-7749c44969-kww8g" Mar 10 07:07:50 crc kubenswrapper[4825]: I0310 07:07:50.548094 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d184a8b-881b-4f12-8c5b-cb355193fd98-ovsdbserver-sb\") pod 
\"dnsmasq-dns-7749c44969-kww8g\" (UID: \"7d184a8b-881b-4f12-8c5b-cb355193fd98\") " pod="openstack/dnsmasq-dns-7749c44969-kww8g" Mar 10 07:07:50 crc kubenswrapper[4825]: I0310 07:07:50.548127 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d184a8b-881b-4f12-8c5b-cb355193fd98-dns-svc\") pod \"dnsmasq-dns-7749c44969-kww8g\" (UID: \"7d184a8b-881b-4f12-8c5b-cb355193fd98\") " pod="openstack/dnsmasq-dns-7749c44969-kww8g" Mar 10 07:07:50 crc kubenswrapper[4825]: I0310 07:07:50.560941 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7749c44969-kww8g"] Mar 10 07:07:50 crc kubenswrapper[4825]: I0310 07:07:50.649245 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dsjb\" (UniqueName: \"kubernetes.io/projected/7d184a8b-881b-4f12-8c5b-cb355193fd98-kube-api-access-2dsjb\") pod \"dnsmasq-dns-7749c44969-kww8g\" (UID: \"7d184a8b-881b-4f12-8c5b-cb355193fd98\") " pod="openstack/dnsmasq-dns-7749c44969-kww8g" Mar 10 07:07:50 crc kubenswrapper[4825]: I0310 07:07:50.649507 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d184a8b-881b-4f12-8c5b-cb355193fd98-config\") pod \"dnsmasq-dns-7749c44969-kww8g\" (UID: \"7d184a8b-881b-4f12-8c5b-cb355193fd98\") " pod="openstack/dnsmasq-dns-7749c44969-kww8g" Mar 10 07:07:50 crc kubenswrapper[4825]: I0310 07:07:50.649531 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d184a8b-881b-4f12-8c5b-cb355193fd98-ovsdbserver-sb\") pod \"dnsmasq-dns-7749c44969-kww8g\" (UID: \"7d184a8b-881b-4f12-8c5b-cb355193fd98\") " pod="openstack/dnsmasq-dns-7749c44969-kww8g" Mar 10 07:07:50 crc kubenswrapper[4825]: I0310 07:07:50.649562 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d184a8b-881b-4f12-8c5b-cb355193fd98-dns-svc\") pod \"dnsmasq-dns-7749c44969-kww8g\" (UID: \"7d184a8b-881b-4f12-8c5b-cb355193fd98\") " pod="openstack/dnsmasq-dns-7749c44969-kww8g" Mar 10 07:07:50 crc kubenswrapper[4825]: I0310 07:07:50.649602 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d184a8b-881b-4f12-8c5b-cb355193fd98-ovsdbserver-nb\") pod \"dnsmasq-dns-7749c44969-kww8g\" (UID: \"7d184a8b-881b-4f12-8c5b-cb355193fd98\") " pod="openstack/dnsmasq-dns-7749c44969-kww8g" Mar 10 07:07:50 crc kubenswrapper[4825]: I0310 07:07:50.649638 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7d184a8b-881b-4f12-8c5b-cb355193fd98-dns-swift-storage-0\") pod \"dnsmasq-dns-7749c44969-kww8g\" (UID: \"7d184a8b-881b-4f12-8c5b-cb355193fd98\") " pod="openstack/dnsmasq-dns-7749c44969-kww8g" Mar 10 07:07:50 crc kubenswrapper[4825]: I0310 07:07:50.650656 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7d184a8b-881b-4f12-8c5b-cb355193fd98-dns-swift-storage-0\") pod \"dnsmasq-dns-7749c44969-kww8g\" (UID: \"7d184a8b-881b-4f12-8c5b-cb355193fd98\") " pod="openstack/dnsmasq-dns-7749c44969-kww8g" Mar 10 07:07:50 crc kubenswrapper[4825]: I0310 07:07:50.651580 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d184a8b-881b-4f12-8c5b-cb355193fd98-dns-svc\") pod \"dnsmasq-dns-7749c44969-kww8g\" (UID: \"7d184a8b-881b-4f12-8c5b-cb355193fd98\") " pod="openstack/dnsmasq-dns-7749c44969-kww8g" Mar 10 07:07:50 crc kubenswrapper[4825]: I0310 07:07:50.651767 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/7d184a8b-881b-4f12-8c5b-cb355193fd98-ovsdbserver-nb\") pod \"dnsmasq-dns-7749c44969-kww8g\" (UID: \"7d184a8b-881b-4f12-8c5b-cb355193fd98\") " pod="openstack/dnsmasq-dns-7749c44969-kww8g" Mar 10 07:07:50 crc kubenswrapper[4825]: I0310 07:07:50.651801 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d184a8b-881b-4f12-8c5b-cb355193fd98-ovsdbserver-sb\") pod \"dnsmasq-dns-7749c44969-kww8g\" (UID: \"7d184a8b-881b-4f12-8c5b-cb355193fd98\") " pod="openstack/dnsmasq-dns-7749c44969-kww8g" Mar 10 07:07:50 crc kubenswrapper[4825]: I0310 07:07:50.652398 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d184a8b-881b-4f12-8c5b-cb355193fd98-config\") pod \"dnsmasq-dns-7749c44969-kww8g\" (UID: \"7d184a8b-881b-4f12-8c5b-cb355193fd98\") " pod="openstack/dnsmasq-dns-7749c44969-kww8g" Mar 10 07:07:50 crc kubenswrapper[4825]: I0310 07:07:50.674417 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dsjb\" (UniqueName: \"kubernetes.io/projected/7d184a8b-881b-4f12-8c5b-cb355193fd98-kube-api-access-2dsjb\") pod \"dnsmasq-dns-7749c44969-kww8g\" (UID: \"7d184a8b-881b-4f12-8c5b-cb355193fd98\") " pod="openstack/dnsmasq-dns-7749c44969-kww8g" Mar 10 07:07:50 crc kubenswrapper[4825]: I0310 07:07:50.864405 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7749c44969-kww8g" Mar 10 07:07:51 crc kubenswrapper[4825]: I0310 07:07:51.431524 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7749c44969-kww8g"] Mar 10 07:07:51 crc kubenswrapper[4825]: W0310 07:07:51.446724 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d184a8b_881b_4f12_8c5b_cb355193fd98.slice/crio-a93853eadccd3ebba1fd64ef732a9897ce888f33343a21ded8075d60e12b050e WatchSource:0}: Error finding container a93853eadccd3ebba1fd64ef732a9897ce888f33343a21ded8075d60e12b050e: Status 404 returned error can't find the container with id a93853eadccd3ebba1fd64ef732a9897ce888f33343a21ded8075d60e12b050e Mar 10 07:07:51 crc kubenswrapper[4825]: I0310 07:07:51.810208 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 10 07:07:52 crc kubenswrapper[4825]: I0310 07:07:52.380692 4825 generic.go:334] "Generic (PLEG): container finished" podID="7d184a8b-881b-4f12-8c5b-cb355193fd98" containerID="58239260abbd4dee2c279089bb91ba90875540f1742144cead2ca71ff93b34c3" exitCode=0 Mar 10 07:07:52 crc kubenswrapper[4825]: I0310 07:07:52.380874 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7749c44969-kww8g" event={"ID":"7d184a8b-881b-4f12-8c5b-cb355193fd98","Type":"ContainerDied","Data":"58239260abbd4dee2c279089bb91ba90875540f1742144cead2ca71ff93b34c3"} Mar 10 07:07:52 crc kubenswrapper[4825]: I0310 07:07:52.381011 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7749c44969-kww8g" event={"ID":"7d184a8b-881b-4f12-8c5b-cb355193fd98","Type":"ContainerStarted","Data":"a93853eadccd3ebba1fd64ef732a9897ce888f33343a21ded8075d60e12b050e"} Mar 10 07:07:53 crc kubenswrapper[4825]: I0310 07:07:53.395472 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7749c44969-kww8g" 
event={"ID":"7d184a8b-881b-4f12-8c5b-cb355193fd98","Type":"ContainerStarted","Data":"4f7fbc28bf34f7df122011e3058dcf6f02dc2f8ce3ea18d2fad614b2293dc01c"} Mar 10 07:07:53 crc kubenswrapper[4825]: I0310 07:07:53.395856 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7749c44969-kww8g" Mar 10 07:07:53 crc kubenswrapper[4825]: I0310 07:07:53.418421 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7749c44969-kww8g" podStartSLOduration=3.418404214 podStartE2EDuration="3.418404214s" podCreationTimestamp="2026-03-10 07:07:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:07:53.413981329 +0000 UTC m=+1426.443761944" watchObservedRunningTime="2026-03-10 07:07:53.418404214 +0000 UTC m=+1426.448184829" Mar 10 07:07:53 crc kubenswrapper[4825]: I0310 07:07:53.921329 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:07:53 crc kubenswrapper[4825]: I0310 07:07:53.921824 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8691271e-1ca4-43e0-86e1-f2c0aa90c948" containerName="ceilometer-central-agent" containerID="cri-o://c5c2db600706cdcc66a3813eebd52b1ce82e80b78169ce004e8f6b05f39dc3f3" gracePeriod=30 Mar 10 07:07:53 crc kubenswrapper[4825]: I0310 07:07:53.921893 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8691271e-1ca4-43e0-86e1-f2c0aa90c948" containerName="proxy-httpd" containerID="cri-o://3bb4d944e2e1ba5c6daa524172b1d24c0ad434fe5d0521bb808e61afa02ef02d" gracePeriod=30 Mar 10 07:07:53 crc kubenswrapper[4825]: I0310 07:07:53.921935 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8691271e-1ca4-43e0-86e1-f2c0aa90c948" 
containerName="ceilometer-notification-agent" containerID="cri-o://bd6f516eb4fef72025380179b0e0cae4a79cbd0e0d056ea4fd5825b3988695fa" gracePeriod=30 Mar 10 07:07:53 crc kubenswrapper[4825]: I0310 07:07:53.921923 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8691271e-1ca4-43e0-86e1-f2c0aa90c948" containerName="sg-core" containerID="cri-o://3fe1d94f6bcbe8e8cac7d6e0aaea123234902339250d4be6e3d260b608ed6542" gracePeriod=30 Mar 10 07:07:54 crc kubenswrapper[4825]: I0310 07:07:54.372710 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 07:07:54 crc kubenswrapper[4825]: I0310 07:07:54.372940 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3f36c81f-f491-403b-8491-463f9f6d3050" containerName="nova-api-log" containerID="cri-o://859d5bf9379371fcd955fa7dcaddd751f4194d56237c3bacbf4ca28267e8a1f4" gracePeriod=30 Mar 10 07:07:54 crc kubenswrapper[4825]: I0310 07:07:54.373067 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3f36c81f-f491-403b-8491-463f9f6d3050" containerName="nova-api-api" containerID="cri-o://00427ed4e18de15b3b8814048fae3907568de244ffc37a815158b69378579272" gracePeriod=30 Mar 10 07:07:54 crc kubenswrapper[4825]: I0310 07:07:54.407597 4825 generic.go:334] "Generic (PLEG): container finished" podID="8691271e-1ca4-43e0-86e1-f2c0aa90c948" containerID="3bb4d944e2e1ba5c6daa524172b1d24c0ad434fe5d0521bb808e61afa02ef02d" exitCode=0 Mar 10 07:07:54 crc kubenswrapper[4825]: I0310 07:07:54.407624 4825 generic.go:334] "Generic (PLEG): container finished" podID="8691271e-1ca4-43e0-86e1-f2c0aa90c948" containerID="3fe1d94f6bcbe8e8cac7d6e0aaea123234902339250d4be6e3d260b608ed6542" exitCode=2 Mar 10 07:07:54 crc kubenswrapper[4825]: I0310 07:07:54.407632 4825 generic.go:334] "Generic (PLEG): container finished" podID="8691271e-1ca4-43e0-86e1-f2c0aa90c948" 
containerID="c5c2db600706cdcc66a3813eebd52b1ce82e80b78169ce004e8f6b05f39dc3f3" exitCode=0 Mar 10 07:07:54 crc kubenswrapper[4825]: I0310 07:07:54.407777 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8691271e-1ca4-43e0-86e1-f2c0aa90c948","Type":"ContainerDied","Data":"3bb4d944e2e1ba5c6daa524172b1d24c0ad434fe5d0521bb808e61afa02ef02d"} Mar 10 07:07:54 crc kubenswrapper[4825]: I0310 07:07:54.407806 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8691271e-1ca4-43e0-86e1-f2c0aa90c948","Type":"ContainerDied","Data":"3fe1d94f6bcbe8e8cac7d6e0aaea123234902339250d4be6e3d260b608ed6542"} Mar 10 07:07:54 crc kubenswrapper[4825]: I0310 07:07:54.407826 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8691271e-1ca4-43e0-86e1-f2c0aa90c948","Type":"ContainerDied","Data":"c5c2db600706cdcc66a3813eebd52b1ce82e80b78169ce004e8f6b05f39dc3f3"} Mar 10 07:07:54 crc kubenswrapper[4825]: I0310 07:07:54.735304 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 07:07:54 crc kubenswrapper[4825]: I0310 07:07:54.933056 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8691271e-1ca4-43e0-86e1-f2c0aa90c948-log-httpd\") pod \"8691271e-1ca4-43e0-86e1-f2c0aa90c948\" (UID: \"8691271e-1ca4-43e0-86e1-f2c0aa90c948\") " Mar 10 07:07:54 crc kubenswrapper[4825]: I0310 07:07:54.933108 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8691271e-1ca4-43e0-86e1-f2c0aa90c948-ceilometer-tls-certs\") pod \"8691271e-1ca4-43e0-86e1-f2c0aa90c948\" (UID: \"8691271e-1ca4-43e0-86e1-f2c0aa90c948\") " Mar 10 07:07:54 crc kubenswrapper[4825]: I0310 07:07:54.933255 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8691271e-1ca4-43e0-86e1-f2c0aa90c948-run-httpd\") pod \"8691271e-1ca4-43e0-86e1-f2c0aa90c948\" (UID: \"8691271e-1ca4-43e0-86e1-f2c0aa90c948\") " Mar 10 07:07:54 crc kubenswrapper[4825]: I0310 07:07:54.933322 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8691271e-1ca4-43e0-86e1-f2c0aa90c948-sg-core-conf-yaml\") pod \"8691271e-1ca4-43e0-86e1-f2c0aa90c948\" (UID: \"8691271e-1ca4-43e0-86e1-f2c0aa90c948\") " Mar 10 07:07:54 crc kubenswrapper[4825]: I0310 07:07:54.933338 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8691271e-1ca4-43e0-86e1-f2c0aa90c948-scripts\") pod \"8691271e-1ca4-43e0-86e1-f2c0aa90c948\" (UID: \"8691271e-1ca4-43e0-86e1-f2c0aa90c948\") " Mar 10 07:07:54 crc kubenswrapper[4825]: I0310 07:07:54.933355 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8691271e-1ca4-43e0-86e1-f2c0aa90c948-config-data\") pod \"8691271e-1ca4-43e0-86e1-f2c0aa90c948\" (UID: \"8691271e-1ca4-43e0-86e1-f2c0aa90c948\") " Mar 10 07:07:54 crc kubenswrapper[4825]: I0310 07:07:54.933407 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8691271e-1ca4-43e0-86e1-f2c0aa90c948-combined-ca-bundle\") pod \"8691271e-1ca4-43e0-86e1-f2c0aa90c948\" (UID: \"8691271e-1ca4-43e0-86e1-f2c0aa90c948\") " Mar 10 07:07:54 crc kubenswrapper[4825]: I0310 07:07:54.933479 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89d2f\" (UniqueName: \"kubernetes.io/projected/8691271e-1ca4-43e0-86e1-f2c0aa90c948-kube-api-access-89d2f\") pod \"8691271e-1ca4-43e0-86e1-f2c0aa90c948\" (UID: \"8691271e-1ca4-43e0-86e1-f2c0aa90c948\") " Mar 10 07:07:54 crc kubenswrapper[4825]: I0310 07:07:54.933674 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8691271e-1ca4-43e0-86e1-f2c0aa90c948-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8691271e-1ca4-43e0-86e1-f2c0aa90c948" (UID: "8691271e-1ca4-43e0-86e1-f2c0aa90c948"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:07:54 crc kubenswrapper[4825]: I0310 07:07:54.933706 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8691271e-1ca4-43e0-86e1-f2c0aa90c948-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8691271e-1ca4-43e0-86e1-f2c0aa90c948" (UID: "8691271e-1ca4-43e0-86e1-f2c0aa90c948"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:07:54 crc kubenswrapper[4825]: I0310 07:07:54.934296 4825 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8691271e-1ca4-43e0-86e1-f2c0aa90c948-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:54 crc kubenswrapper[4825]: I0310 07:07:54.934316 4825 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8691271e-1ca4-43e0-86e1-f2c0aa90c948-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:54 crc kubenswrapper[4825]: I0310 07:07:54.938623 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8691271e-1ca4-43e0-86e1-f2c0aa90c948-scripts" (OuterVolumeSpecName: "scripts") pod "8691271e-1ca4-43e0-86e1-f2c0aa90c948" (UID: "8691271e-1ca4-43e0-86e1-f2c0aa90c948"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:07:54 crc kubenswrapper[4825]: I0310 07:07:54.938897 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8691271e-1ca4-43e0-86e1-f2c0aa90c948-kube-api-access-89d2f" (OuterVolumeSpecName: "kube-api-access-89d2f") pod "8691271e-1ca4-43e0-86e1-f2c0aa90c948" (UID: "8691271e-1ca4-43e0-86e1-f2c0aa90c948"). InnerVolumeSpecName "kube-api-access-89d2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:07:54 crc kubenswrapper[4825]: I0310 07:07:54.960075 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8691271e-1ca4-43e0-86e1-f2c0aa90c948-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8691271e-1ca4-43e0-86e1-f2c0aa90c948" (UID: "8691271e-1ca4-43e0-86e1-f2c0aa90c948"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.012794 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8691271e-1ca4-43e0-86e1-f2c0aa90c948-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8691271e-1ca4-43e0-86e1-f2c0aa90c948" (UID: "8691271e-1ca4-43e0-86e1-f2c0aa90c948"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.015118 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8691271e-1ca4-43e0-86e1-f2c0aa90c948-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "8691271e-1ca4-43e0-86e1-f2c0aa90c948" (UID: "8691271e-1ca4-43e0-86e1-f2c0aa90c948"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.045024 4825 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8691271e-1ca4-43e0-86e1-f2c0aa90c948-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.045059 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8691271e-1ca4-43e0-86e1-f2c0aa90c948-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.045072 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8691271e-1ca4-43e0-86e1-f2c0aa90c948-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.045085 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89d2f\" (UniqueName: \"kubernetes.io/projected/8691271e-1ca4-43e0-86e1-f2c0aa90c948-kube-api-access-89d2f\") 
on node \"crc\" DevicePath \"\"" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.045098 4825 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8691271e-1ca4-43e0-86e1-f2c0aa90c948-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.051352 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8691271e-1ca4-43e0-86e1-f2c0aa90c948-config-data" (OuterVolumeSpecName: "config-data") pod "8691271e-1ca4-43e0-86e1-f2c0aa90c948" (UID: "8691271e-1ca4-43e0-86e1-f2c0aa90c948"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.146990 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8691271e-1ca4-43e0-86e1-f2c0aa90c948-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.419727 4825 generic.go:334] "Generic (PLEG): container finished" podID="8691271e-1ca4-43e0-86e1-f2c0aa90c948" containerID="bd6f516eb4fef72025380179b0e0cae4a79cbd0e0d056ea4fd5825b3988695fa" exitCode=0 Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.419799 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8691271e-1ca4-43e0-86e1-f2c0aa90c948","Type":"ContainerDied","Data":"bd6f516eb4fef72025380179b0e0cae4a79cbd0e0d056ea4fd5825b3988695fa"} Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.419820 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.419928 4825 scope.go:117] "RemoveContainer" containerID="3bb4d944e2e1ba5c6daa524172b1d24c0ad434fe5d0521bb808e61afa02ef02d" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.419829 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8691271e-1ca4-43e0-86e1-f2c0aa90c948","Type":"ContainerDied","Data":"9fd6457fed95c4b7ee21ca9fcdcc45a30f2fe3de43c8f1bfa2a85472da924410"} Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.421921 4825 generic.go:334] "Generic (PLEG): container finished" podID="3f36c81f-f491-403b-8491-463f9f6d3050" containerID="859d5bf9379371fcd955fa7dcaddd751f4194d56237c3bacbf4ca28267e8a1f4" exitCode=143 Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.421963 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3f36c81f-f491-403b-8491-463f9f6d3050","Type":"ContainerDied","Data":"859d5bf9379371fcd955fa7dcaddd751f4194d56237c3bacbf4ca28267e8a1f4"} Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.449629 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.463982 4825 scope.go:117] "RemoveContainer" containerID="3fe1d94f6bcbe8e8cac7d6e0aaea123234902339250d4be6e3d260b608ed6542" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.492235 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.505300 4825 scope.go:117] "RemoveContainer" containerID="bd6f516eb4fef72025380179b0e0cae4a79cbd0e0d056ea4fd5825b3988695fa" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.507737 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:07:55 crc kubenswrapper[4825]: E0310 07:07:55.508178 4825 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="8691271e-1ca4-43e0-86e1-f2c0aa90c948" containerName="ceilometer-central-agent" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.508201 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="8691271e-1ca4-43e0-86e1-f2c0aa90c948" containerName="ceilometer-central-agent" Mar 10 07:07:55 crc kubenswrapper[4825]: E0310 07:07:55.508226 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8691271e-1ca4-43e0-86e1-f2c0aa90c948" containerName="ceilometer-notification-agent" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.508236 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="8691271e-1ca4-43e0-86e1-f2c0aa90c948" containerName="ceilometer-notification-agent" Mar 10 07:07:55 crc kubenswrapper[4825]: E0310 07:07:55.508255 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8691271e-1ca4-43e0-86e1-f2c0aa90c948" containerName="proxy-httpd" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.508262 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="8691271e-1ca4-43e0-86e1-f2c0aa90c948" containerName="proxy-httpd" Mar 10 07:07:55 crc kubenswrapper[4825]: E0310 07:07:55.508294 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8691271e-1ca4-43e0-86e1-f2c0aa90c948" containerName="sg-core" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.508302 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="8691271e-1ca4-43e0-86e1-f2c0aa90c948" containerName="sg-core" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.508526 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="8691271e-1ca4-43e0-86e1-f2c0aa90c948" containerName="ceilometer-central-agent" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.508549 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="8691271e-1ca4-43e0-86e1-f2c0aa90c948" containerName="proxy-httpd" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.508560 4825 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="8691271e-1ca4-43e0-86e1-f2c0aa90c948" containerName="sg-core" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.508574 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="8691271e-1ca4-43e0-86e1-f2c0aa90c948" containerName="ceilometer-notification-agent" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.510601 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.514775 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.514980 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.519188 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.524481 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.546212 4825 scope.go:117] "RemoveContainer" containerID="c5c2db600706cdcc66a3813eebd52b1ce82e80b78169ce004e8f6b05f39dc3f3" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.570046 4825 scope.go:117] "RemoveContainer" containerID="3bb4d944e2e1ba5c6daa524172b1d24c0ad434fe5d0521bb808e61afa02ef02d" Mar 10 07:07:55 crc kubenswrapper[4825]: E0310 07:07:55.572279 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bb4d944e2e1ba5c6daa524172b1d24c0ad434fe5d0521bb808e61afa02ef02d\": container with ID starting with 3bb4d944e2e1ba5c6daa524172b1d24c0ad434fe5d0521bb808e61afa02ef02d not found: ID does not exist" containerID="3bb4d944e2e1ba5c6daa524172b1d24c0ad434fe5d0521bb808e61afa02ef02d" Mar 10 07:07:55 crc 
kubenswrapper[4825]: I0310 07:07:55.572323 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bb4d944e2e1ba5c6daa524172b1d24c0ad434fe5d0521bb808e61afa02ef02d"} err="failed to get container status \"3bb4d944e2e1ba5c6daa524172b1d24c0ad434fe5d0521bb808e61afa02ef02d\": rpc error: code = NotFound desc = could not find container \"3bb4d944e2e1ba5c6daa524172b1d24c0ad434fe5d0521bb808e61afa02ef02d\": container with ID starting with 3bb4d944e2e1ba5c6daa524172b1d24c0ad434fe5d0521bb808e61afa02ef02d not found: ID does not exist" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.572360 4825 scope.go:117] "RemoveContainer" containerID="3fe1d94f6bcbe8e8cac7d6e0aaea123234902339250d4be6e3d260b608ed6542" Mar 10 07:07:55 crc kubenswrapper[4825]: E0310 07:07:55.572840 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fe1d94f6bcbe8e8cac7d6e0aaea123234902339250d4be6e3d260b608ed6542\": container with ID starting with 3fe1d94f6bcbe8e8cac7d6e0aaea123234902339250d4be6e3d260b608ed6542 not found: ID does not exist" containerID="3fe1d94f6bcbe8e8cac7d6e0aaea123234902339250d4be6e3d260b608ed6542" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.572872 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fe1d94f6bcbe8e8cac7d6e0aaea123234902339250d4be6e3d260b608ed6542"} err="failed to get container status \"3fe1d94f6bcbe8e8cac7d6e0aaea123234902339250d4be6e3d260b608ed6542\": rpc error: code = NotFound desc = could not find container \"3fe1d94f6bcbe8e8cac7d6e0aaea123234902339250d4be6e3d260b608ed6542\": container with ID starting with 3fe1d94f6bcbe8e8cac7d6e0aaea123234902339250d4be6e3d260b608ed6542 not found: ID does not exist" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.572886 4825 scope.go:117] "RemoveContainer" containerID="bd6f516eb4fef72025380179b0e0cae4a79cbd0e0d056ea4fd5825b3988695fa" Mar 10 
07:07:55 crc kubenswrapper[4825]: E0310 07:07:55.573182 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd6f516eb4fef72025380179b0e0cae4a79cbd0e0d056ea4fd5825b3988695fa\": container with ID starting with bd6f516eb4fef72025380179b0e0cae4a79cbd0e0d056ea4fd5825b3988695fa not found: ID does not exist" containerID="bd6f516eb4fef72025380179b0e0cae4a79cbd0e0d056ea4fd5825b3988695fa" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.573215 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd6f516eb4fef72025380179b0e0cae4a79cbd0e0d056ea4fd5825b3988695fa"} err="failed to get container status \"bd6f516eb4fef72025380179b0e0cae4a79cbd0e0d056ea4fd5825b3988695fa\": rpc error: code = NotFound desc = could not find container \"bd6f516eb4fef72025380179b0e0cae4a79cbd0e0d056ea4fd5825b3988695fa\": container with ID starting with bd6f516eb4fef72025380179b0e0cae4a79cbd0e0d056ea4fd5825b3988695fa not found: ID does not exist" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.573235 4825 scope.go:117] "RemoveContainer" containerID="c5c2db600706cdcc66a3813eebd52b1ce82e80b78169ce004e8f6b05f39dc3f3" Mar 10 07:07:55 crc kubenswrapper[4825]: E0310 07:07:55.573507 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5c2db600706cdcc66a3813eebd52b1ce82e80b78169ce004e8f6b05f39dc3f3\": container with ID starting with c5c2db600706cdcc66a3813eebd52b1ce82e80b78169ce004e8f6b05f39dc3f3 not found: ID does not exist" containerID="c5c2db600706cdcc66a3813eebd52b1ce82e80b78169ce004e8f6b05f39dc3f3" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.573537 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5c2db600706cdcc66a3813eebd52b1ce82e80b78169ce004e8f6b05f39dc3f3"} err="failed to get container status 
\"c5c2db600706cdcc66a3813eebd52b1ce82e80b78169ce004e8f6b05f39dc3f3\": rpc error: code = NotFound desc = could not find container \"c5c2db600706cdcc66a3813eebd52b1ce82e80b78169ce004e8f6b05f39dc3f3\": container with ID starting with c5c2db600706cdcc66a3813eebd52b1ce82e80b78169ce004e8f6b05f39dc3f3 not found: ID does not exist" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.665083 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c09e2c9b-b94a-40cb-8646-687e010f3002-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c09e2c9b-b94a-40cb-8646-687e010f3002\") " pod="openstack/ceilometer-0" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.665137 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c09e2c9b-b94a-40cb-8646-687e010f3002-run-httpd\") pod \"ceilometer-0\" (UID: \"c09e2c9b-b94a-40cb-8646-687e010f3002\") " pod="openstack/ceilometer-0" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.665191 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c09e2c9b-b94a-40cb-8646-687e010f3002-scripts\") pod \"ceilometer-0\" (UID: \"c09e2c9b-b94a-40cb-8646-687e010f3002\") " pod="openstack/ceilometer-0" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.666003 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c09e2c9b-b94a-40cb-8646-687e010f3002-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c09e2c9b-b94a-40cb-8646-687e010f3002\") " pod="openstack/ceilometer-0" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.666070 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-7g4kn\" (UniqueName: \"kubernetes.io/projected/c09e2c9b-b94a-40cb-8646-687e010f3002-kube-api-access-7g4kn\") pod \"ceilometer-0\" (UID: \"c09e2c9b-b94a-40cb-8646-687e010f3002\") " pod="openstack/ceilometer-0" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.666115 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c09e2c9b-b94a-40cb-8646-687e010f3002-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c09e2c9b-b94a-40cb-8646-687e010f3002\") " pod="openstack/ceilometer-0" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.666198 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c09e2c9b-b94a-40cb-8646-687e010f3002-log-httpd\") pod \"ceilometer-0\" (UID: \"c09e2c9b-b94a-40cb-8646-687e010f3002\") " pod="openstack/ceilometer-0" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.666214 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c09e2c9b-b94a-40cb-8646-687e010f3002-config-data\") pod \"ceilometer-0\" (UID: \"c09e2c9b-b94a-40cb-8646-687e010f3002\") " pod="openstack/ceilometer-0" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.767839 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c09e2c9b-b94a-40cb-8646-687e010f3002-log-httpd\") pod \"ceilometer-0\" (UID: \"c09e2c9b-b94a-40cb-8646-687e010f3002\") " pod="openstack/ceilometer-0" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.767887 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c09e2c9b-b94a-40cb-8646-687e010f3002-config-data\") pod \"ceilometer-0\" (UID: 
\"c09e2c9b-b94a-40cb-8646-687e010f3002\") " pod="openstack/ceilometer-0" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.767923 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c09e2c9b-b94a-40cb-8646-687e010f3002-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c09e2c9b-b94a-40cb-8646-687e010f3002\") " pod="openstack/ceilometer-0" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.767960 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c09e2c9b-b94a-40cb-8646-687e010f3002-run-httpd\") pod \"ceilometer-0\" (UID: \"c09e2c9b-b94a-40cb-8646-687e010f3002\") " pod="openstack/ceilometer-0" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.767992 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c09e2c9b-b94a-40cb-8646-687e010f3002-scripts\") pod \"ceilometer-0\" (UID: \"c09e2c9b-b94a-40cb-8646-687e010f3002\") " pod="openstack/ceilometer-0" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.768012 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c09e2c9b-b94a-40cb-8646-687e010f3002-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c09e2c9b-b94a-40cb-8646-687e010f3002\") " pod="openstack/ceilometer-0" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.768076 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7g4kn\" (UniqueName: \"kubernetes.io/projected/c09e2c9b-b94a-40cb-8646-687e010f3002-kube-api-access-7g4kn\") pod \"ceilometer-0\" (UID: \"c09e2c9b-b94a-40cb-8646-687e010f3002\") " pod="openstack/ceilometer-0" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.768199 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c09e2c9b-b94a-40cb-8646-687e010f3002-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c09e2c9b-b94a-40cb-8646-687e010f3002\") " pod="openstack/ceilometer-0" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.768361 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c09e2c9b-b94a-40cb-8646-687e010f3002-log-httpd\") pod \"ceilometer-0\" (UID: \"c09e2c9b-b94a-40cb-8646-687e010f3002\") " pod="openstack/ceilometer-0" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.768625 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c09e2c9b-b94a-40cb-8646-687e010f3002-run-httpd\") pod \"ceilometer-0\" (UID: \"c09e2c9b-b94a-40cb-8646-687e010f3002\") " pod="openstack/ceilometer-0" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.773519 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c09e2c9b-b94a-40cb-8646-687e010f3002-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c09e2c9b-b94a-40cb-8646-687e010f3002\") " pod="openstack/ceilometer-0" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.774270 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c09e2c9b-b94a-40cb-8646-687e010f3002-config-data\") pod \"ceilometer-0\" (UID: \"c09e2c9b-b94a-40cb-8646-687e010f3002\") " pod="openstack/ceilometer-0" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.774778 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c09e2c9b-b94a-40cb-8646-687e010f3002-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c09e2c9b-b94a-40cb-8646-687e010f3002\") " pod="openstack/ceilometer-0" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 
07:07:55.775852 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c09e2c9b-b94a-40cb-8646-687e010f3002-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c09e2c9b-b94a-40cb-8646-687e010f3002\") " pod="openstack/ceilometer-0" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.776419 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c09e2c9b-b94a-40cb-8646-687e010f3002-scripts\") pod \"ceilometer-0\" (UID: \"c09e2c9b-b94a-40cb-8646-687e010f3002\") " pod="openstack/ceilometer-0" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.791197 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g4kn\" (UniqueName: \"kubernetes.io/projected/c09e2c9b-b94a-40cb-8646-687e010f3002-kube-api-access-7g4kn\") pod \"ceilometer-0\" (UID: \"c09e2c9b-b94a-40cb-8646-687e010f3002\") " pod="openstack/ceilometer-0" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.837337 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 07:07:55 crc kubenswrapper[4825]: I0310 07:07:55.948443 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:07:56 crc kubenswrapper[4825]: I0310 07:07:56.324392 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:07:56 crc kubenswrapper[4825]: I0310 07:07:56.433593 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c09e2c9b-b94a-40cb-8646-687e010f3002","Type":"ContainerStarted","Data":"a5159655c37fce49ddef6374d763715ba1db382aa253a51504395a622d8c518b"} Mar 10 07:07:56 crc kubenswrapper[4825]: I0310 07:07:56.811537 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 10 07:07:56 crc kubenswrapper[4825]: I0310 07:07:56.847571 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 10 07:07:57 crc kubenswrapper[4825]: I0310 07:07:57.252014 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8691271e-1ca4-43e0-86e1-f2c0aa90c948" path="/var/lib/kubelet/pods/8691271e-1ca4-43e0-86e1-f2c0aa90c948/volumes" Mar 10 07:07:57 crc kubenswrapper[4825]: I0310 07:07:57.445772 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c09e2c9b-b94a-40cb-8646-687e010f3002","Type":"ContainerStarted","Data":"a7195e9075890ba063068bd9b092e8946d42acae2a64c8f5a383ba522b13c281"} Mar 10 07:07:57 crc kubenswrapper[4825]: I0310 07:07:57.463997 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 10 07:07:57 crc kubenswrapper[4825]: I0310 07:07:57.641126 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-rcx8n"] Mar 10 07:07:57 crc kubenswrapper[4825]: I0310 07:07:57.642754 4825 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-rcx8n" Mar 10 07:07:57 crc kubenswrapper[4825]: I0310 07:07:57.652792 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 10 07:07:57 crc kubenswrapper[4825]: I0310 07:07:57.653403 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 10 07:07:57 crc kubenswrapper[4825]: I0310 07:07:57.663431 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-rcx8n"] Mar 10 07:07:57 crc kubenswrapper[4825]: I0310 07:07:57.708361 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20652aad-ebbd-472d-8cd8-61c598352ec4-config-data\") pod \"nova-cell1-cell-mapping-rcx8n\" (UID: \"20652aad-ebbd-472d-8cd8-61c598352ec4\") " pod="openstack/nova-cell1-cell-mapping-rcx8n" Mar 10 07:07:57 crc kubenswrapper[4825]: I0310 07:07:57.708507 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmb6t\" (UniqueName: \"kubernetes.io/projected/20652aad-ebbd-472d-8cd8-61c598352ec4-kube-api-access-cmb6t\") pod \"nova-cell1-cell-mapping-rcx8n\" (UID: \"20652aad-ebbd-472d-8cd8-61c598352ec4\") " pod="openstack/nova-cell1-cell-mapping-rcx8n" Mar 10 07:07:57 crc kubenswrapper[4825]: I0310 07:07:57.708546 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20652aad-ebbd-472d-8cd8-61c598352ec4-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-rcx8n\" (UID: \"20652aad-ebbd-472d-8cd8-61c598352ec4\") " pod="openstack/nova-cell1-cell-mapping-rcx8n" Mar 10 07:07:57 crc kubenswrapper[4825]: I0310 07:07:57.708611 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/20652aad-ebbd-472d-8cd8-61c598352ec4-scripts\") pod \"nova-cell1-cell-mapping-rcx8n\" (UID: \"20652aad-ebbd-472d-8cd8-61c598352ec4\") " pod="openstack/nova-cell1-cell-mapping-rcx8n" Mar 10 07:07:57 crc kubenswrapper[4825]: I0310 07:07:57.816568 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmb6t\" (UniqueName: \"kubernetes.io/projected/20652aad-ebbd-472d-8cd8-61c598352ec4-kube-api-access-cmb6t\") pod \"nova-cell1-cell-mapping-rcx8n\" (UID: \"20652aad-ebbd-472d-8cd8-61c598352ec4\") " pod="openstack/nova-cell1-cell-mapping-rcx8n" Mar 10 07:07:57 crc kubenswrapper[4825]: I0310 07:07:57.816631 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20652aad-ebbd-472d-8cd8-61c598352ec4-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-rcx8n\" (UID: \"20652aad-ebbd-472d-8cd8-61c598352ec4\") " pod="openstack/nova-cell1-cell-mapping-rcx8n" Mar 10 07:07:57 crc kubenswrapper[4825]: I0310 07:07:57.816687 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20652aad-ebbd-472d-8cd8-61c598352ec4-scripts\") pod \"nova-cell1-cell-mapping-rcx8n\" (UID: \"20652aad-ebbd-472d-8cd8-61c598352ec4\") " pod="openstack/nova-cell1-cell-mapping-rcx8n" Mar 10 07:07:57 crc kubenswrapper[4825]: I0310 07:07:57.816733 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20652aad-ebbd-472d-8cd8-61c598352ec4-config-data\") pod \"nova-cell1-cell-mapping-rcx8n\" (UID: \"20652aad-ebbd-472d-8cd8-61c598352ec4\") " pod="openstack/nova-cell1-cell-mapping-rcx8n" Mar 10 07:07:57 crc kubenswrapper[4825]: I0310 07:07:57.827030 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/20652aad-ebbd-472d-8cd8-61c598352ec4-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-rcx8n\" (UID: \"20652aad-ebbd-472d-8cd8-61c598352ec4\") " pod="openstack/nova-cell1-cell-mapping-rcx8n" Mar 10 07:07:57 crc kubenswrapper[4825]: I0310 07:07:57.827489 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20652aad-ebbd-472d-8cd8-61c598352ec4-config-data\") pod \"nova-cell1-cell-mapping-rcx8n\" (UID: \"20652aad-ebbd-472d-8cd8-61c598352ec4\") " pod="openstack/nova-cell1-cell-mapping-rcx8n" Mar 10 07:07:57 crc kubenswrapper[4825]: I0310 07:07:57.833662 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20652aad-ebbd-472d-8cd8-61c598352ec4-scripts\") pod \"nova-cell1-cell-mapping-rcx8n\" (UID: \"20652aad-ebbd-472d-8cd8-61c598352ec4\") " pod="openstack/nova-cell1-cell-mapping-rcx8n" Mar 10 07:07:57 crc kubenswrapper[4825]: I0310 07:07:57.854217 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmb6t\" (UniqueName: \"kubernetes.io/projected/20652aad-ebbd-472d-8cd8-61c598352ec4-kube-api-access-cmb6t\") pod \"nova-cell1-cell-mapping-rcx8n\" (UID: \"20652aad-ebbd-472d-8cd8-61c598352ec4\") " pod="openstack/nova-cell1-cell-mapping-rcx8n" Mar 10 07:07:57 crc kubenswrapper[4825]: I0310 07:07:57.996973 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.019552 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wb8wv\" (UniqueName: \"kubernetes.io/projected/3f36c81f-f491-403b-8491-463f9f6d3050-kube-api-access-wb8wv\") pod \"3f36c81f-f491-403b-8491-463f9f6d3050\" (UID: \"3f36c81f-f491-403b-8491-463f9f6d3050\") " Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.019597 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f36c81f-f491-403b-8491-463f9f6d3050-combined-ca-bundle\") pod \"3f36c81f-f491-403b-8491-463f9f6d3050\" (UID: \"3f36c81f-f491-403b-8491-463f9f6d3050\") " Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.019740 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f36c81f-f491-403b-8491-463f9f6d3050-config-data\") pod \"3f36c81f-f491-403b-8491-463f9f6d3050\" (UID: \"3f36c81f-f491-403b-8491-463f9f6d3050\") " Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.019781 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f36c81f-f491-403b-8491-463f9f6d3050-logs\") pod \"3f36c81f-f491-403b-8491-463f9f6d3050\" (UID: \"3f36c81f-f491-403b-8491-463f9f6d3050\") " Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.020430 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f36c81f-f491-403b-8491-463f9f6d3050-logs" (OuterVolumeSpecName: "logs") pod "3f36c81f-f491-403b-8491-463f9f6d3050" (UID: "3f36c81f-f491-403b-8491-463f9f6d3050"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.024354 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f36c81f-f491-403b-8491-463f9f6d3050-kube-api-access-wb8wv" (OuterVolumeSpecName: "kube-api-access-wb8wv") pod "3f36c81f-f491-403b-8491-463f9f6d3050" (UID: "3f36c81f-f491-403b-8491-463f9f6d3050"). InnerVolumeSpecName "kube-api-access-wb8wv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.060549 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f36c81f-f491-403b-8491-463f9f6d3050-config-data" (OuterVolumeSpecName: "config-data") pod "3f36c81f-f491-403b-8491-463f9f6d3050" (UID: "3f36c81f-f491-403b-8491-463f9f6d3050"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.071676 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-rcx8n" Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.072888 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f36c81f-f491-403b-8491-463f9f6d3050-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f36c81f-f491-403b-8491-463f9f6d3050" (UID: "3f36c81f-f491-403b-8491-463f9f6d3050"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.121918 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wb8wv\" (UniqueName: \"kubernetes.io/projected/3f36c81f-f491-403b-8491-463f9f6d3050-kube-api-access-wb8wv\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.121954 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f36c81f-f491-403b-8491-463f9f6d3050-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.121965 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f36c81f-f491-403b-8491-463f9f6d3050-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.121976 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f36c81f-f491-403b-8491-463f9f6d3050-logs\") on node \"crc\" DevicePath \"\"" Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.457108 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c09e2c9b-b94a-40cb-8646-687e010f3002","Type":"ContainerStarted","Data":"0356f5ab6145f37ff4290da40a1602a3abe05333ce86f47dbb2d5041cfd72341"} Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.459925 4825 generic.go:334] "Generic (PLEG): container finished" podID="3f36c81f-f491-403b-8491-463f9f6d3050" containerID="00427ed4e18de15b3b8814048fae3907568de244ffc37a815158b69378579272" exitCode=0 Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.460993 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.461315 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3f36c81f-f491-403b-8491-463f9f6d3050","Type":"ContainerDied","Data":"00427ed4e18de15b3b8814048fae3907568de244ffc37a815158b69378579272"} Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.461365 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3f36c81f-f491-403b-8491-463f9f6d3050","Type":"ContainerDied","Data":"328696e78f8cbe465ad319c2bf4a98a19067ed628958595e962936175e436c2e"} Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.461407 4825 scope.go:117] "RemoveContainer" containerID="00427ed4e18de15b3b8814048fae3907568de244ffc37a815158b69378579272" Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.538497 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-rcx8n"] Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.692523 4825 scope.go:117] "RemoveContainer" containerID="859d5bf9379371fcd955fa7dcaddd751f4194d56237c3bacbf4ca28267e8a1f4" Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.731194 4825 scope.go:117] "RemoveContainer" containerID="00427ed4e18de15b3b8814048fae3907568de244ffc37a815158b69378579272" Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.731316 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 07:07:58 crc kubenswrapper[4825]: E0310 07:07:58.731539 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00427ed4e18de15b3b8814048fae3907568de244ffc37a815158b69378579272\": container with ID starting with 00427ed4e18de15b3b8814048fae3907568de244ffc37a815158b69378579272 not found: ID does not exist" containerID="00427ed4e18de15b3b8814048fae3907568de244ffc37a815158b69378579272" Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 
07:07:58.731570 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00427ed4e18de15b3b8814048fae3907568de244ffc37a815158b69378579272"} err="failed to get container status \"00427ed4e18de15b3b8814048fae3907568de244ffc37a815158b69378579272\": rpc error: code = NotFound desc = could not find container \"00427ed4e18de15b3b8814048fae3907568de244ffc37a815158b69378579272\": container with ID starting with 00427ed4e18de15b3b8814048fae3907568de244ffc37a815158b69378579272 not found: ID does not exist" Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.731592 4825 scope.go:117] "RemoveContainer" containerID="859d5bf9379371fcd955fa7dcaddd751f4194d56237c3bacbf4ca28267e8a1f4" Mar 10 07:07:58 crc kubenswrapper[4825]: E0310 07:07:58.731975 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"859d5bf9379371fcd955fa7dcaddd751f4194d56237c3bacbf4ca28267e8a1f4\": container with ID starting with 859d5bf9379371fcd955fa7dcaddd751f4194d56237c3bacbf4ca28267e8a1f4 not found: ID does not exist" containerID="859d5bf9379371fcd955fa7dcaddd751f4194d56237c3bacbf4ca28267e8a1f4" Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.732017 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"859d5bf9379371fcd955fa7dcaddd751f4194d56237c3bacbf4ca28267e8a1f4"} err="failed to get container status \"859d5bf9379371fcd955fa7dcaddd751f4194d56237c3bacbf4ca28267e8a1f4\": rpc error: code = NotFound desc = could not find container \"859d5bf9379371fcd955fa7dcaddd751f4194d56237c3bacbf4ca28267e8a1f4\": container with ID starting with 859d5bf9379371fcd955fa7dcaddd751f4194d56237c3bacbf4ca28267e8a1f4 not found: ID does not exist" Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.745858 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.762325 4825 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 10 07:07:58 crc kubenswrapper[4825]: E0310 07:07:58.762774 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f36c81f-f491-403b-8491-463f9f6d3050" containerName="nova-api-api" Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.762794 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f36c81f-f491-403b-8491-463f9f6d3050" containerName="nova-api-api" Mar 10 07:07:58 crc kubenswrapper[4825]: E0310 07:07:58.762821 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f36c81f-f491-403b-8491-463f9f6d3050" containerName="nova-api-log" Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.762829 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f36c81f-f491-403b-8491-463f9f6d3050" containerName="nova-api-log" Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.763878 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f36c81f-f491-403b-8491-463f9f6d3050" containerName="nova-api-api" Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.763902 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f36c81f-f491-403b-8491-463f9f6d3050" containerName="nova-api-log" Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.765343 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.766916 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.767733 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.767940 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.772236 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.840929 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kl24\" (UniqueName: \"kubernetes.io/projected/1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4-kube-api-access-8kl24\") pod \"nova-api-0\" (UID: \"1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4\") " pod="openstack/nova-api-0" Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.840972 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4-config-data\") pod \"nova-api-0\" (UID: \"1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4\") " pod="openstack/nova-api-0" Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.841012 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4\") " pod="openstack/nova-api-0" Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.841207 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4\") " pod="openstack/nova-api-0" Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.841346 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4-public-tls-certs\") pod \"nova-api-0\" (UID: \"1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4\") " pod="openstack/nova-api-0" Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.841400 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4-logs\") pod \"nova-api-0\" (UID: \"1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4\") " pod="openstack/nova-api-0" Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.943124 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kl24\" (UniqueName: \"kubernetes.io/projected/1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4-kube-api-access-8kl24\") pod \"nova-api-0\" (UID: \"1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4\") " pod="openstack/nova-api-0" Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.943198 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4-config-data\") pod \"nova-api-0\" (UID: \"1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4\") " pod="openstack/nova-api-0" Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.943243 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4\") " 
pod="openstack/nova-api-0" Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.943284 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4\") " pod="openstack/nova-api-0" Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.943332 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4-public-tls-certs\") pod \"nova-api-0\" (UID: \"1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4\") " pod="openstack/nova-api-0" Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.943358 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4-logs\") pod \"nova-api-0\" (UID: \"1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4\") " pod="openstack/nova-api-0" Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.945023 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4-logs\") pod \"nova-api-0\" (UID: \"1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4\") " pod="openstack/nova-api-0" Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.947304 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4\") " pod="openstack/nova-api-0" Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.947666 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4\") " pod="openstack/nova-api-0" Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.948650 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4-public-tls-certs\") pod \"nova-api-0\" (UID: \"1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4\") " pod="openstack/nova-api-0" Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.958709 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4-config-data\") pod \"nova-api-0\" (UID: \"1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4\") " pod="openstack/nova-api-0" Mar 10 07:07:58 crc kubenswrapper[4825]: I0310 07:07:58.960611 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kl24\" (UniqueName: \"kubernetes.io/projected/1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4-kube-api-access-8kl24\") pod \"nova-api-0\" (UID: \"1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4\") " pod="openstack/nova-api-0" Mar 10 07:07:59 crc kubenswrapper[4825]: I0310 07:07:59.081496 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 07:07:59 crc kubenswrapper[4825]: I0310 07:07:59.252649 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f36c81f-f491-403b-8491-463f9f6d3050" path="/var/lib/kubelet/pods/3f36c81f-f491-403b-8491-463f9f6d3050/volumes" Mar 10 07:07:59 crc kubenswrapper[4825]: I0310 07:07:59.474941 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c09e2c9b-b94a-40cb-8646-687e010f3002","Type":"ContainerStarted","Data":"b1d41b7d027d77c9513bde337e8c8167ef7c3e29806e7839c5c877933d476511"} Mar 10 07:07:59 crc kubenswrapper[4825]: I0310 07:07:59.477346 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-rcx8n" event={"ID":"20652aad-ebbd-472d-8cd8-61c598352ec4","Type":"ContainerStarted","Data":"470e2428d5cbe39e87f8dca68dad46978460d1f245bcfa54ac6931e573586b33"} Mar 10 07:07:59 crc kubenswrapper[4825]: I0310 07:07:59.477371 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-rcx8n" event={"ID":"20652aad-ebbd-472d-8cd8-61c598352ec4","Type":"ContainerStarted","Data":"8bbc215801b422f8ff0a1dcb17b6ef3dfca823ac1f23d4526df9822e650b7499"} Mar 10 07:07:59 crc kubenswrapper[4825]: I0310 07:07:59.499889 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 07:07:59 crc kubenswrapper[4825]: I0310 07:07:59.503523 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-rcx8n" podStartSLOduration=2.503506315 podStartE2EDuration="2.503506315s" podCreationTimestamp="2026-03-10 07:07:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:07:59.490261979 +0000 UTC m=+1432.520042594" watchObservedRunningTime="2026-03-10 07:07:59.503506315 +0000 UTC m=+1432.533286930" Mar 10 07:07:59 crc kubenswrapper[4825]: W0310 
07:07:59.505267 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fe14b8d_8183_4faf_bd66_2ebeb0cfcfe4.slice/crio-cfd731c8572062ce63da0e86f5c6c404d619981eaa9b31b3cb539e450e25765a WatchSource:0}: Error finding container cfd731c8572062ce63da0e86f5c6c404d619981eaa9b31b3cb539e450e25765a: Status 404 returned error can't find the container with id cfd731c8572062ce63da0e86f5c6c404d619981eaa9b31b3cb539e450e25765a Mar 10 07:08:00 crc kubenswrapper[4825]: I0310 07:08:00.129035 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552108-z7bwv"] Mar 10 07:08:00 crc kubenswrapper[4825]: I0310 07:08:00.130338 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552108-z7bwv" Mar 10 07:08:00 crc kubenswrapper[4825]: I0310 07:08:00.132077 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 07:08:00 crc kubenswrapper[4825]: I0310 07:08:00.133007 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 07:08:00 crc kubenswrapper[4825]: I0310 07:08:00.133193 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 07:08:00 crc kubenswrapper[4825]: I0310 07:08:00.145297 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552108-z7bwv"] Mar 10 07:08:00 crc kubenswrapper[4825]: I0310 07:08:00.172703 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2plpq\" (UniqueName: \"kubernetes.io/projected/fa4c8337-0ae4-4b85-9793-4adb29f920a7-kube-api-access-2plpq\") pod \"auto-csr-approver-29552108-z7bwv\" (UID: \"fa4c8337-0ae4-4b85-9793-4adb29f920a7\") " pod="openshift-infra/auto-csr-approver-29552108-z7bwv" 
Mar 10 07:08:00 crc kubenswrapper[4825]: I0310 07:08:00.276540 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2plpq\" (UniqueName: \"kubernetes.io/projected/fa4c8337-0ae4-4b85-9793-4adb29f920a7-kube-api-access-2plpq\") pod \"auto-csr-approver-29552108-z7bwv\" (UID: \"fa4c8337-0ae4-4b85-9793-4adb29f920a7\") " pod="openshift-infra/auto-csr-approver-29552108-z7bwv" Mar 10 07:08:00 crc kubenswrapper[4825]: I0310 07:08:00.299812 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2plpq\" (UniqueName: \"kubernetes.io/projected/fa4c8337-0ae4-4b85-9793-4adb29f920a7-kube-api-access-2plpq\") pod \"auto-csr-approver-29552108-z7bwv\" (UID: \"fa4c8337-0ae4-4b85-9793-4adb29f920a7\") " pod="openshift-infra/auto-csr-approver-29552108-z7bwv" Mar 10 07:08:00 crc kubenswrapper[4825]: I0310 07:08:00.451207 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552108-z7bwv" Mar 10 07:08:00 crc kubenswrapper[4825]: I0310 07:08:00.493536 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c09e2c9b-b94a-40cb-8646-687e010f3002","Type":"ContainerStarted","Data":"126696bf9938795cae75dc7fc16fc96bf0c323ea2004daf4fa07abd6a757a8a9"} Mar 10 07:08:00 crc kubenswrapper[4825]: I0310 07:08:00.493659 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c09e2c9b-b94a-40cb-8646-687e010f3002" containerName="ceilometer-central-agent" containerID="cri-o://a7195e9075890ba063068bd9b092e8946d42acae2a64c8f5a383ba522b13c281" gracePeriod=30 Mar 10 07:08:00 crc kubenswrapper[4825]: I0310 07:08:00.493713 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 07:08:00 crc kubenswrapper[4825]: I0310 07:08:00.493776 4825 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="c09e2c9b-b94a-40cb-8646-687e010f3002" containerName="proxy-httpd" containerID="cri-o://126696bf9938795cae75dc7fc16fc96bf0c323ea2004daf4fa07abd6a757a8a9" gracePeriod=30 Mar 10 07:08:00 crc kubenswrapper[4825]: I0310 07:08:00.493823 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c09e2c9b-b94a-40cb-8646-687e010f3002" containerName="sg-core" containerID="cri-o://b1d41b7d027d77c9513bde337e8c8167ef7c3e29806e7839c5c877933d476511" gracePeriod=30 Mar 10 07:08:00 crc kubenswrapper[4825]: I0310 07:08:00.494094 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c09e2c9b-b94a-40cb-8646-687e010f3002" containerName="ceilometer-notification-agent" containerID="cri-o://0356f5ab6145f37ff4290da40a1602a3abe05333ce86f47dbb2d5041cfd72341" gracePeriod=30 Mar 10 07:08:00 crc kubenswrapper[4825]: I0310 07:08:00.500651 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4","Type":"ContainerStarted","Data":"dc6685c6e0f12fa720d6767fc387619cfe31ffdbdf578ca9f5cfbbba0597c404"} Mar 10 07:08:00 crc kubenswrapper[4825]: I0310 07:08:00.500681 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4","Type":"ContainerStarted","Data":"737b2daf7646ea66e2e708f65ad5210026921fa5a0e2fe99c43bea726002539e"} Mar 10 07:08:00 crc kubenswrapper[4825]: I0310 07:08:00.500691 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4","Type":"ContainerStarted","Data":"cfd731c8572062ce63da0e86f5c6c404d619981eaa9b31b3cb539e450e25765a"} Mar 10 07:08:00 crc kubenswrapper[4825]: I0310 07:08:00.524628 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.538789758 
podStartE2EDuration="5.524610973s" podCreationTimestamp="2026-03-10 07:07:55 +0000 UTC" firstStartedPulling="2026-03-10 07:07:56.329732898 +0000 UTC m=+1429.359513513" lastFinishedPulling="2026-03-10 07:08:00.315554123 +0000 UTC m=+1433.345334728" observedRunningTime="2026-03-10 07:08:00.522457337 +0000 UTC m=+1433.552237982" watchObservedRunningTime="2026-03-10 07:08:00.524610973 +0000 UTC m=+1433.554391588" Mar 10 07:08:00 crc kubenswrapper[4825]: I0310 07:08:00.552989 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.552970822 podStartE2EDuration="2.552970822s" podCreationTimestamp="2026-03-10 07:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:08:00.543642179 +0000 UTC m=+1433.573422794" watchObservedRunningTime="2026-03-10 07:08:00.552970822 +0000 UTC m=+1433.582751437" Mar 10 07:08:00 crc kubenswrapper[4825]: I0310 07:08:00.866816 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7749c44969-kww8g" Mar 10 07:08:00 crc kubenswrapper[4825]: I0310 07:08:00.926769 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bd5679c8c-q8gbc"] Mar 10 07:08:00 crc kubenswrapper[4825]: I0310 07:08:00.927476 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7bd5679c8c-q8gbc" podUID="3ffd2f64-247a-4adf-bdc5-dce905192f17" containerName="dnsmasq-dns" containerID="cri-o://ef26b86d0482989024104298ad30b8391e618d4bc4de4ccada15c03fbac43938" gracePeriod=10 Mar 10 07:08:00 crc kubenswrapper[4825]: I0310 07:08:00.970900 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552108-z7bwv"] Mar 10 07:08:00 crc kubenswrapper[4825]: W0310 07:08:00.977834 4825 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa4c8337_0ae4_4b85_9793_4adb29f920a7.slice/crio-6125280b7178ce702d9080c69ee2b5922379bbdc054c9e7f3b22933bc6f81465 WatchSource:0}: Error finding container 6125280b7178ce702d9080c69ee2b5922379bbdc054c9e7f3b22933bc6f81465: Status 404 returned error can't find the container with id 6125280b7178ce702d9080c69ee2b5922379bbdc054c9e7f3b22933bc6f81465 Mar 10 07:08:01 crc kubenswrapper[4825]: I0310 07:08:01.347544 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bd5679c8c-q8gbc" Mar 10 07:08:01 crc kubenswrapper[4825]: I0310 07:08:01.407344 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3ffd2f64-247a-4adf-bdc5-dce905192f17-ovsdbserver-nb\") pod \"3ffd2f64-247a-4adf-bdc5-dce905192f17\" (UID: \"3ffd2f64-247a-4adf-bdc5-dce905192f17\") " Mar 10 07:08:01 crc kubenswrapper[4825]: I0310 07:08:01.407560 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3ffd2f64-247a-4adf-bdc5-dce905192f17-ovsdbserver-sb\") pod \"3ffd2f64-247a-4adf-bdc5-dce905192f17\" (UID: \"3ffd2f64-247a-4adf-bdc5-dce905192f17\") " Mar 10 07:08:01 crc kubenswrapper[4825]: I0310 07:08:01.407582 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3ffd2f64-247a-4adf-bdc5-dce905192f17-dns-svc\") pod \"3ffd2f64-247a-4adf-bdc5-dce905192f17\" (UID: \"3ffd2f64-247a-4adf-bdc5-dce905192f17\") " Mar 10 07:08:01 crc kubenswrapper[4825]: I0310 07:08:01.407614 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ffd2f64-247a-4adf-bdc5-dce905192f17-config\") pod \"3ffd2f64-247a-4adf-bdc5-dce905192f17\" (UID: \"3ffd2f64-247a-4adf-bdc5-dce905192f17\") " Mar 10 
07:08:01 crc kubenswrapper[4825]: I0310 07:08:01.407735 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3ffd2f64-247a-4adf-bdc5-dce905192f17-dns-swift-storage-0\") pod \"3ffd2f64-247a-4adf-bdc5-dce905192f17\" (UID: \"3ffd2f64-247a-4adf-bdc5-dce905192f17\") " Mar 10 07:08:01 crc kubenswrapper[4825]: I0310 07:08:01.407767 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlrvx\" (UniqueName: \"kubernetes.io/projected/3ffd2f64-247a-4adf-bdc5-dce905192f17-kube-api-access-dlrvx\") pod \"3ffd2f64-247a-4adf-bdc5-dce905192f17\" (UID: \"3ffd2f64-247a-4adf-bdc5-dce905192f17\") " Mar 10 07:08:01 crc kubenswrapper[4825]: I0310 07:08:01.413744 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ffd2f64-247a-4adf-bdc5-dce905192f17-kube-api-access-dlrvx" (OuterVolumeSpecName: "kube-api-access-dlrvx") pod "3ffd2f64-247a-4adf-bdc5-dce905192f17" (UID: "3ffd2f64-247a-4adf-bdc5-dce905192f17"). InnerVolumeSpecName "kube-api-access-dlrvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:08:01 crc kubenswrapper[4825]: I0310 07:08:01.470209 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ffd2f64-247a-4adf-bdc5-dce905192f17-config" (OuterVolumeSpecName: "config") pod "3ffd2f64-247a-4adf-bdc5-dce905192f17" (UID: "3ffd2f64-247a-4adf-bdc5-dce905192f17"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:08:01 crc kubenswrapper[4825]: I0310 07:08:01.480629 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ffd2f64-247a-4adf-bdc5-dce905192f17-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3ffd2f64-247a-4adf-bdc5-dce905192f17" (UID: "3ffd2f64-247a-4adf-bdc5-dce905192f17"). 
InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:08:01 crc kubenswrapper[4825]: I0310 07:08:01.482541 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ffd2f64-247a-4adf-bdc5-dce905192f17-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3ffd2f64-247a-4adf-bdc5-dce905192f17" (UID: "3ffd2f64-247a-4adf-bdc5-dce905192f17"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:08:01 crc kubenswrapper[4825]: I0310 07:08:01.484324 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ffd2f64-247a-4adf-bdc5-dce905192f17-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3ffd2f64-247a-4adf-bdc5-dce905192f17" (UID: "3ffd2f64-247a-4adf-bdc5-dce905192f17"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:08:01 crc kubenswrapper[4825]: I0310 07:08:01.500728 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ffd2f64-247a-4adf-bdc5-dce905192f17-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3ffd2f64-247a-4adf-bdc5-dce905192f17" (UID: "3ffd2f64-247a-4adf-bdc5-dce905192f17"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:08:01 crc kubenswrapper[4825]: I0310 07:08:01.513372 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3ffd2f64-247a-4adf-bdc5-dce905192f17-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 07:08:01 crc kubenswrapper[4825]: I0310 07:08:01.513395 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3ffd2f64-247a-4adf-bdc5-dce905192f17-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 07:08:01 crc kubenswrapper[4825]: I0310 07:08:01.513406 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3ffd2f64-247a-4adf-bdc5-dce905192f17-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 07:08:01 crc kubenswrapper[4825]: I0310 07:08:01.513416 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ffd2f64-247a-4adf-bdc5-dce905192f17-config\") on node \"crc\" DevicePath \"\"" Mar 10 07:08:01 crc kubenswrapper[4825]: I0310 07:08:01.513426 4825 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3ffd2f64-247a-4adf-bdc5-dce905192f17-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 07:08:01 crc kubenswrapper[4825]: I0310 07:08:01.513436 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlrvx\" (UniqueName: \"kubernetes.io/projected/3ffd2f64-247a-4adf-bdc5-dce905192f17-kube-api-access-dlrvx\") on node \"crc\" DevicePath \"\"" Mar 10 07:08:01 crc kubenswrapper[4825]: I0310 07:08:01.534599 4825 generic.go:334] "Generic (PLEG): container finished" podID="3ffd2f64-247a-4adf-bdc5-dce905192f17" containerID="ef26b86d0482989024104298ad30b8391e618d4bc4de4ccada15c03fbac43938" exitCode=0 Mar 10 07:08:01 crc kubenswrapper[4825]: I0310 07:08:01.534671 4825 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd5679c8c-q8gbc" event={"ID":"3ffd2f64-247a-4adf-bdc5-dce905192f17","Type":"ContainerDied","Data":"ef26b86d0482989024104298ad30b8391e618d4bc4de4ccada15c03fbac43938"} Mar 10 07:08:01 crc kubenswrapper[4825]: I0310 07:08:01.534705 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd5679c8c-q8gbc" event={"ID":"3ffd2f64-247a-4adf-bdc5-dce905192f17","Type":"ContainerDied","Data":"521ed245bf2452731ceaec17dcb1bbbed78deeb48213416bef0ca079b5b1c0cd"} Mar 10 07:08:01 crc kubenswrapper[4825]: I0310 07:08:01.534724 4825 scope.go:117] "RemoveContainer" containerID="ef26b86d0482989024104298ad30b8391e618d4bc4de4ccada15c03fbac43938" Mar 10 07:08:01 crc kubenswrapper[4825]: I0310 07:08:01.534857 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bd5679c8c-q8gbc" Mar 10 07:08:01 crc kubenswrapper[4825]: I0310 07:08:01.544907 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552108-z7bwv" event={"ID":"fa4c8337-0ae4-4b85-9793-4adb29f920a7","Type":"ContainerStarted","Data":"6125280b7178ce702d9080c69ee2b5922379bbdc054c9e7f3b22933bc6f81465"} Mar 10 07:08:01 crc kubenswrapper[4825]: I0310 07:08:01.550946 4825 generic.go:334] "Generic (PLEG): container finished" podID="c09e2c9b-b94a-40cb-8646-687e010f3002" containerID="b1d41b7d027d77c9513bde337e8c8167ef7c3e29806e7839c5c877933d476511" exitCode=2 Mar 10 07:08:01 crc kubenswrapper[4825]: I0310 07:08:01.550980 4825 generic.go:334] "Generic (PLEG): container finished" podID="c09e2c9b-b94a-40cb-8646-687e010f3002" containerID="0356f5ab6145f37ff4290da40a1602a3abe05333ce86f47dbb2d5041cfd72341" exitCode=0 Mar 10 07:08:01 crc kubenswrapper[4825]: I0310 07:08:01.552901 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"c09e2c9b-b94a-40cb-8646-687e010f3002","Type":"ContainerDied","Data":"b1d41b7d027d77c9513bde337e8c8167ef7c3e29806e7839c5c877933d476511"} Mar 10 07:08:01 crc kubenswrapper[4825]: I0310 07:08:01.552944 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c09e2c9b-b94a-40cb-8646-687e010f3002","Type":"ContainerDied","Data":"0356f5ab6145f37ff4290da40a1602a3abe05333ce86f47dbb2d5041cfd72341"} Mar 10 07:08:01 crc kubenswrapper[4825]: I0310 07:08:01.588021 4825 scope.go:117] "RemoveContainer" containerID="b2975973fabf7cfa881a9ec9b70d675113e43e495fbc5493dca9e9faa3a17b72" Mar 10 07:08:01 crc kubenswrapper[4825]: I0310 07:08:01.592527 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bd5679c8c-q8gbc"] Mar 10 07:08:01 crc kubenswrapper[4825]: I0310 07:08:01.602326 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7bd5679c8c-q8gbc"] Mar 10 07:08:01 crc kubenswrapper[4825]: I0310 07:08:01.617209 4825 scope.go:117] "RemoveContainer" containerID="ef26b86d0482989024104298ad30b8391e618d4bc4de4ccada15c03fbac43938" Mar 10 07:08:01 crc kubenswrapper[4825]: E0310 07:08:01.617769 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef26b86d0482989024104298ad30b8391e618d4bc4de4ccada15c03fbac43938\": container with ID starting with ef26b86d0482989024104298ad30b8391e618d4bc4de4ccada15c03fbac43938 not found: ID does not exist" containerID="ef26b86d0482989024104298ad30b8391e618d4bc4de4ccada15c03fbac43938" Mar 10 07:08:01 crc kubenswrapper[4825]: I0310 07:08:01.617804 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef26b86d0482989024104298ad30b8391e618d4bc4de4ccada15c03fbac43938"} err="failed to get container status \"ef26b86d0482989024104298ad30b8391e618d4bc4de4ccada15c03fbac43938\": rpc error: code = NotFound desc = could not find container 
\"ef26b86d0482989024104298ad30b8391e618d4bc4de4ccada15c03fbac43938\": container with ID starting with ef26b86d0482989024104298ad30b8391e618d4bc4de4ccada15c03fbac43938 not found: ID does not exist" Mar 10 07:08:01 crc kubenswrapper[4825]: I0310 07:08:01.617829 4825 scope.go:117] "RemoveContainer" containerID="b2975973fabf7cfa881a9ec9b70d675113e43e495fbc5493dca9e9faa3a17b72" Mar 10 07:08:01 crc kubenswrapper[4825]: E0310 07:08:01.618088 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2975973fabf7cfa881a9ec9b70d675113e43e495fbc5493dca9e9faa3a17b72\": container with ID starting with b2975973fabf7cfa881a9ec9b70d675113e43e495fbc5493dca9e9faa3a17b72 not found: ID does not exist" containerID="b2975973fabf7cfa881a9ec9b70d675113e43e495fbc5493dca9e9faa3a17b72" Mar 10 07:08:01 crc kubenswrapper[4825]: I0310 07:08:01.618116 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2975973fabf7cfa881a9ec9b70d675113e43e495fbc5493dca9e9faa3a17b72"} err="failed to get container status \"b2975973fabf7cfa881a9ec9b70d675113e43e495fbc5493dca9e9faa3a17b72\": rpc error: code = NotFound desc = could not find container \"b2975973fabf7cfa881a9ec9b70d675113e43e495fbc5493dca9e9faa3a17b72\": container with ID starting with b2975973fabf7cfa881a9ec9b70d675113e43e495fbc5493dca9e9faa3a17b72 not found: ID does not exist" Mar 10 07:08:02 crc kubenswrapper[4825]: I0310 07:08:02.564746 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552108-z7bwv" event={"ID":"fa4c8337-0ae4-4b85-9793-4adb29f920a7","Type":"ContainerStarted","Data":"9f37201bc3341b1523532415ac8473925029730a22506fbff5e8ff2982720c54"} Mar 10 07:08:02 crc kubenswrapper[4825]: I0310 07:08:02.583630 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552108-z7bwv" podStartSLOduration=1.545554307 
podStartE2EDuration="2.583608548s" podCreationTimestamp="2026-03-10 07:08:00 +0000 UTC" firstStartedPulling="2026-03-10 07:08:00.983412353 +0000 UTC m=+1434.013192968" lastFinishedPulling="2026-03-10 07:08:02.021466604 +0000 UTC m=+1435.051247209" observedRunningTime="2026-03-10 07:08:02.580322573 +0000 UTC m=+1435.610103188" watchObservedRunningTime="2026-03-10 07:08:02.583608548 +0000 UTC m=+1435.613389163" Mar 10 07:08:03 crc kubenswrapper[4825]: I0310 07:08:03.253480 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ffd2f64-247a-4adf-bdc5-dce905192f17" path="/var/lib/kubelet/pods/3ffd2f64-247a-4adf-bdc5-dce905192f17/volumes" Mar 10 07:08:03 crc kubenswrapper[4825]: I0310 07:08:03.597748 4825 generic.go:334] "Generic (PLEG): container finished" podID="fa4c8337-0ae4-4b85-9793-4adb29f920a7" containerID="9f37201bc3341b1523532415ac8473925029730a22506fbff5e8ff2982720c54" exitCode=0 Mar 10 07:08:03 crc kubenswrapper[4825]: I0310 07:08:03.598272 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552108-z7bwv" event={"ID":"fa4c8337-0ae4-4b85-9793-4adb29f920a7","Type":"ContainerDied","Data":"9f37201bc3341b1523532415ac8473925029730a22506fbff5e8ff2982720c54"} Mar 10 07:08:03 crc kubenswrapper[4825]: I0310 07:08:03.605569 4825 generic.go:334] "Generic (PLEG): container finished" podID="c09e2c9b-b94a-40cb-8646-687e010f3002" containerID="a7195e9075890ba063068bd9b092e8946d42acae2a64c8f5a383ba522b13c281" exitCode=0 Mar 10 07:08:03 crc kubenswrapper[4825]: I0310 07:08:03.605640 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c09e2c9b-b94a-40cb-8646-687e010f3002","Type":"ContainerDied","Data":"a7195e9075890ba063068bd9b092e8946d42acae2a64c8f5a383ba522b13c281"} Mar 10 07:08:04 crc kubenswrapper[4825]: I0310 07:08:04.621034 4825 generic.go:334] "Generic (PLEG): container finished" podID="20652aad-ebbd-472d-8cd8-61c598352ec4" 
containerID="470e2428d5cbe39e87f8dca68dad46978460d1f245bcfa54ac6931e573586b33" exitCode=0 Mar 10 07:08:04 crc kubenswrapper[4825]: I0310 07:08:04.621067 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-rcx8n" event={"ID":"20652aad-ebbd-472d-8cd8-61c598352ec4","Type":"ContainerDied","Data":"470e2428d5cbe39e87f8dca68dad46978460d1f245bcfa54ac6931e573586b33"} Mar 10 07:08:05 crc kubenswrapper[4825]: I0310 07:08:05.046418 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552108-z7bwv" Mar 10 07:08:05 crc kubenswrapper[4825]: I0310 07:08:05.084940 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2plpq\" (UniqueName: \"kubernetes.io/projected/fa4c8337-0ae4-4b85-9793-4adb29f920a7-kube-api-access-2plpq\") pod \"fa4c8337-0ae4-4b85-9793-4adb29f920a7\" (UID: \"fa4c8337-0ae4-4b85-9793-4adb29f920a7\") " Mar 10 07:08:05 crc kubenswrapper[4825]: I0310 07:08:05.091177 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa4c8337-0ae4-4b85-9793-4adb29f920a7-kube-api-access-2plpq" (OuterVolumeSpecName: "kube-api-access-2plpq") pod "fa4c8337-0ae4-4b85-9793-4adb29f920a7" (UID: "fa4c8337-0ae4-4b85-9793-4adb29f920a7"). InnerVolumeSpecName "kube-api-access-2plpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:08:05 crc kubenswrapper[4825]: I0310 07:08:05.186718 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2plpq\" (UniqueName: \"kubernetes.io/projected/fa4c8337-0ae4-4b85-9793-4adb29f920a7-kube-api-access-2plpq\") on node \"crc\" DevicePath \"\"" Mar 10 07:08:05 crc kubenswrapper[4825]: I0310 07:08:05.638839 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552108-z7bwv" Mar 10 07:08:05 crc kubenswrapper[4825]: I0310 07:08:05.639664 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552108-z7bwv" event={"ID":"fa4c8337-0ae4-4b85-9793-4adb29f920a7","Type":"ContainerDied","Data":"6125280b7178ce702d9080c69ee2b5922379bbdc054c9e7f3b22933bc6f81465"} Mar 10 07:08:05 crc kubenswrapper[4825]: I0310 07:08:05.639692 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6125280b7178ce702d9080c69ee2b5922379bbdc054c9e7f3b22933bc6f81465" Mar 10 07:08:05 crc kubenswrapper[4825]: I0310 07:08:05.698950 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552102-5gmst"] Mar 10 07:08:05 crc kubenswrapper[4825]: I0310 07:08:05.711833 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552102-5gmst"] Mar 10 07:08:06 crc kubenswrapper[4825]: I0310 07:08:06.043617 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-rcx8n" Mar 10 07:08:06 crc kubenswrapper[4825]: I0310 07:08:06.108872 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20652aad-ebbd-472d-8cd8-61c598352ec4-config-data\") pod \"20652aad-ebbd-472d-8cd8-61c598352ec4\" (UID: \"20652aad-ebbd-472d-8cd8-61c598352ec4\") " Mar 10 07:08:06 crc kubenswrapper[4825]: I0310 07:08:06.109037 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20652aad-ebbd-472d-8cd8-61c598352ec4-scripts\") pod \"20652aad-ebbd-472d-8cd8-61c598352ec4\" (UID: \"20652aad-ebbd-472d-8cd8-61c598352ec4\") " Mar 10 07:08:06 crc kubenswrapper[4825]: I0310 07:08:06.109161 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20652aad-ebbd-472d-8cd8-61c598352ec4-combined-ca-bundle\") pod \"20652aad-ebbd-472d-8cd8-61c598352ec4\" (UID: \"20652aad-ebbd-472d-8cd8-61c598352ec4\") " Mar 10 07:08:06 crc kubenswrapper[4825]: I0310 07:08:06.109290 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmb6t\" (UniqueName: \"kubernetes.io/projected/20652aad-ebbd-472d-8cd8-61c598352ec4-kube-api-access-cmb6t\") pod \"20652aad-ebbd-472d-8cd8-61c598352ec4\" (UID: \"20652aad-ebbd-472d-8cd8-61c598352ec4\") " Mar 10 07:08:06 crc kubenswrapper[4825]: I0310 07:08:06.114213 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20652aad-ebbd-472d-8cd8-61c598352ec4-kube-api-access-cmb6t" (OuterVolumeSpecName: "kube-api-access-cmb6t") pod "20652aad-ebbd-472d-8cd8-61c598352ec4" (UID: "20652aad-ebbd-472d-8cd8-61c598352ec4"). InnerVolumeSpecName "kube-api-access-cmb6t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:08:06 crc kubenswrapper[4825]: I0310 07:08:06.115543 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20652aad-ebbd-472d-8cd8-61c598352ec4-scripts" (OuterVolumeSpecName: "scripts") pod "20652aad-ebbd-472d-8cd8-61c598352ec4" (UID: "20652aad-ebbd-472d-8cd8-61c598352ec4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:08:06 crc kubenswrapper[4825]: I0310 07:08:06.137689 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20652aad-ebbd-472d-8cd8-61c598352ec4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20652aad-ebbd-472d-8cd8-61c598352ec4" (UID: "20652aad-ebbd-472d-8cd8-61c598352ec4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:08:06 crc kubenswrapper[4825]: I0310 07:08:06.148537 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20652aad-ebbd-472d-8cd8-61c598352ec4-config-data" (OuterVolumeSpecName: "config-data") pod "20652aad-ebbd-472d-8cd8-61c598352ec4" (UID: "20652aad-ebbd-472d-8cd8-61c598352ec4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:08:06 crc kubenswrapper[4825]: I0310 07:08:06.212083 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmb6t\" (UniqueName: \"kubernetes.io/projected/20652aad-ebbd-472d-8cd8-61c598352ec4-kube-api-access-cmb6t\") on node \"crc\" DevicePath \"\"" Mar 10 07:08:06 crc kubenswrapper[4825]: I0310 07:08:06.212120 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20652aad-ebbd-472d-8cd8-61c598352ec4-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:08:06 crc kubenswrapper[4825]: I0310 07:08:06.212149 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20652aad-ebbd-472d-8cd8-61c598352ec4-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:08:06 crc kubenswrapper[4825]: I0310 07:08:06.212160 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20652aad-ebbd-472d-8cd8-61c598352ec4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:08:06 crc kubenswrapper[4825]: I0310 07:08:06.656680 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-rcx8n" event={"ID":"20652aad-ebbd-472d-8cd8-61c598352ec4","Type":"ContainerDied","Data":"8bbc215801b422f8ff0a1dcb17b6ef3dfca823ac1f23d4526df9822e650b7499"} Mar 10 07:08:06 crc kubenswrapper[4825]: I0310 07:08:06.657661 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8bbc215801b422f8ff0a1dcb17b6ef3dfca823ac1f23d4526df9822e650b7499" Mar 10 07:08:06 crc kubenswrapper[4825]: I0310 07:08:06.656780 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-rcx8n" Mar 10 07:08:06 crc kubenswrapper[4825]: I0310 07:08:06.900119 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 07:08:06 crc kubenswrapper[4825]: I0310 07:08:06.903208 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="3858ca8e-74ce-44bd-a942-a64b4f59270f" containerName="nova-scheduler-scheduler" containerID="cri-o://71c198772602fab08b7c89ff60b473e6b8d4a34a3f456f7a2f4966de4e6b819f" gracePeriod=30 Mar 10 07:08:06 crc kubenswrapper[4825]: I0310 07:08:06.931360 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 07:08:06 crc kubenswrapper[4825]: I0310 07:08:06.931803 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ba141fc1-b6e8-4d89-a27c-be685f51d6ad" containerName="nova-metadata-log" containerID="cri-o://72dbff0f20a0e237d347b1529b4a95ef2ab111aa2d0f91ed82359a099402beb9" gracePeriod=30 Mar 10 07:08:06 crc kubenswrapper[4825]: I0310 07:08:06.931904 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ba141fc1-b6e8-4d89-a27c-be685f51d6ad" containerName="nova-metadata-metadata" containerID="cri-o://96f554d756b7d59701d89a4889be2adb5eb03daad6038fc9e00da48c875499dd" gracePeriod=30 Mar 10 07:08:06 crc kubenswrapper[4825]: I0310 07:08:06.953454 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 07:08:06 crc kubenswrapper[4825]: I0310 07:08:06.953798 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4" containerName="nova-api-log" containerID="cri-o://737b2daf7646ea66e2e708f65ad5210026921fa5a0e2fe99c43bea726002539e" gracePeriod=30 Mar 10 07:08:06 crc kubenswrapper[4825]: I0310 07:08:06.954023 4825 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4" containerName="nova-api-api" containerID="cri-o://dc6685c6e0f12fa720d6767fc387619cfe31ffdbdf578ca9f5cfbbba0597c404" gracePeriod=30 Mar 10 07:08:07 crc kubenswrapper[4825]: I0310 07:08:07.247429 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31aa0f29-4516-4ee9-b3a3-378723e6945e" path="/var/lib/kubelet/pods/31aa0f29-4516-4ee9-b3a3-378723e6945e/volumes" Mar 10 07:08:07 crc kubenswrapper[4825]: I0310 07:08:07.546992 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 07:08:07 crc kubenswrapper[4825]: I0310 07:08:07.646061 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4-logs\") pod \"1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4\" (UID: \"1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4\") " Mar 10 07:08:07 crc kubenswrapper[4825]: I0310 07:08:07.646257 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4-config-data\") pod \"1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4\" (UID: \"1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4\") " Mar 10 07:08:07 crc kubenswrapper[4825]: I0310 07:08:07.646309 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4-internal-tls-certs\") pod \"1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4\" (UID: \"1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4\") " Mar 10 07:08:07 crc kubenswrapper[4825]: I0310 07:08:07.646345 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4-combined-ca-bundle\") 
pod \"1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4\" (UID: \"1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4\") " Mar 10 07:08:07 crc kubenswrapper[4825]: I0310 07:08:07.646415 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kl24\" (UniqueName: \"kubernetes.io/projected/1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4-kube-api-access-8kl24\") pod \"1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4\" (UID: \"1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4\") " Mar 10 07:08:07 crc kubenswrapper[4825]: I0310 07:08:07.646500 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4-public-tls-certs\") pod \"1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4\" (UID: \"1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4\") " Mar 10 07:08:07 crc kubenswrapper[4825]: I0310 07:08:07.646559 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4-logs" (OuterVolumeSpecName: "logs") pod "1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4" (UID: "1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:08:07 crc kubenswrapper[4825]: I0310 07:08:07.647004 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4-logs\") on node \"crc\" DevicePath \"\"" Mar 10 07:08:07 crc kubenswrapper[4825]: I0310 07:08:07.655283 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4-kube-api-access-8kl24" (OuterVolumeSpecName: "kube-api-access-8kl24") pod "1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4" (UID: "1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4"). InnerVolumeSpecName "kube-api-access-8kl24". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:08:07 crc kubenswrapper[4825]: I0310 07:08:07.672544 4825 generic.go:334] "Generic (PLEG): container finished" podID="ba141fc1-b6e8-4d89-a27c-be685f51d6ad" containerID="72dbff0f20a0e237d347b1529b4a95ef2ab111aa2d0f91ed82359a099402beb9" exitCode=143 Mar 10 07:08:07 crc kubenswrapper[4825]: I0310 07:08:07.672649 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ba141fc1-b6e8-4d89-a27c-be685f51d6ad","Type":"ContainerDied","Data":"72dbff0f20a0e237d347b1529b4a95ef2ab111aa2d0f91ed82359a099402beb9"} Mar 10 07:08:07 crc kubenswrapper[4825]: I0310 07:08:07.678189 4825 generic.go:334] "Generic (PLEG): container finished" podID="1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4" containerID="dc6685c6e0f12fa720d6767fc387619cfe31ffdbdf578ca9f5cfbbba0597c404" exitCode=0 Mar 10 07:08:07 crc kubenswrapper[4825]: I0310 07:08:07.678213 4825 generic.go:334] "Generic (PLEG): container finished" podID="1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4" containerID="737b2daf7646ea66e2e708f65ad5210026921fa5a0e2fe99c43bea726002539e" exitCode=143 Mar 10 07:08:07 crc kubenswrapper[4825]: I0310 07:08:07.678244 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4","Type":"ContainerDied","Data":"dc6685c6e0f12fa720d6767fc387619cfe31ffdbdf578ca9f5cfbbba0597c404"} Mar 10 07:08:07 crc kubenswrapper[4825]: I0310 07:08:07.678271 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 07:08:07 crc kubenswrapper[4825]: I0310 07:08:07.678304 4825 scope.go:117] "RemoveContainer" containerID="dc6685c6e0f12fa720d6767fc387619cfe31ffdbdf578ca9f5cfbbba0597c404" Mar 10 07:08:07 crc kubenswrapper[4825]: I0310 07:08:07.678286 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4","Type":"ContainerDied","Data":"737b2daf7646ea66e2e708f65ad5210026921fa5a0e2fe99c43bea726002539e"} Mar 10 07:08:07 crc kubenswrapper[4825]: I0310 07:08:07.678503 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4","Type":"ContainerDied","Data":"cfd731c8572062ce63da0e86f5c6c404d619981eaa9b31b3cb539e450e25765a"} Mar 10 07:08:07 crc kubenswrapper[4825]: I0310 07:08:07.681250 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4-config-data" (OuterVolumeSpecName: "config-data") pod "1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4" (UID: "1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:08:07 crc kubenswrapper[4825]: I0310 07:08:07.696444 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4" (UID: "1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:08:07 crc kubenswrapper[4825]: I0310 07:08:07.712384 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4" (UID: "1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:08:07 crc kubenswrapper[4825]: I0310 07:08:07.712479 4825 scope.go:117] "RemoveContainer" containerID="737b2daf7646ea66e2e708f65ad5210026921fa5a0e2fe99c43bea726002539e" Mar 10 07:08:07 crc kubenswrapper[4825]: I0310 07:08:07.724745 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4" (UID: "1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:08:07 crc kubenswrapper[4825]: I0310 07:08:07.743492 4825 scope.go:117] "RemoveContainer" containerID="dc6685c6e0f12fa720d6767fc387619cfe31ffdbdf578ca9f5cfbbba0597c404" Mar 10 07:08:07 crc kubenswrapper[4825]: E0310 07:08:07.744322 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc6685c6e0f12fa720d6767fc387619cfe31ffdbdf578ca9f5cfbbba0597c404\": container with ID starting with dc6685c6e0f12fa720d6767fc387619cfe31ffdbdf578ca9f5cfbbba0597c404 not found: ID does not exist" containerID="dc6685c6e0f12fa720d6767fc387619cfe31ffdbdf578ca9f5cfbbba0597c404" Mar 10 07:08:07 crc kubenswrapper[4825]: I0310 07:08:07.744395 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc6685c6e0f12fa720d6767fc387619cfe31ffdbdf578ca9f5cfbbba0597c404"} err="failed to get container status \"dc6685c6e0f12fa720d6767fc387619cfe31ffdbdf578ca9f5cfbbba0597c404\": rpc error: code = NotFound desc = could not find container \"dc6685c6e0f12fa720d6767fc387619cfe31ffdbdf578ca9f5cfbbba0597c404\": container with ID starting with dc6685c6e0f12fa720d6767fc387619cfe31ffdbdf578ca9f5cfbbba0597c404 not found: ID does not exist" Mar 10 07:08:07 crc kubenswrapper[4825]: I0310 07:08:07.744434 4825 scope.go:117] "RemoveContainer" containerID="737b2daf7646ea66e2e708f65ad5210026921fa5a0e2fe99c43bea726002539e" Mar 10 07:08:07 crc kubenswrapper[4825]: E0310 07:08:07.745157 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"737b2daf7646ea66e2e708f65ad5210026921fa5a0e2fe99c43bea726002539e\": container with ID starting with 737b2daf7646ea66e2e708f65ad5210026921fa5a0e2fe99c43bea726002539e not found: ID does not exist" containerID="737b2daf7646ea66e2e708f65ad5210026921fa5a0e2fe99c43bea726002539e" Mar 10 07:08:07 crc kubenswrapper[4825]: I0310 07:08:07.745206 
4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"737b2daf7646ea66e2e708f65ad5210026921fa5a0e2fe99c43bea726002539e"} err="failed to get container status \"737b2daf7646ea66e2e708f65ad5210026921fa5a0e2fe99c43bea726002539e\": rpc error: code = NotFound desc = could not find container \"737b2daf7646ea66e2e708f65ad5210026921fa5a0e2fe99c43bea726002539e\": container with ID starting with 737b2daf7646ea66e2e708f65ad5210026921fa5a0e2fe99c43bea726002539e not found: ID does not exist" Mar 10 07:08:07 crc kubenswrapper[4825]: I0310 07:08:07.745239 4825 scope.go:117] "RemoveContainer" containerID="dc6685c6e0f12fa720d6767fc387619cfe31ffdbdf578ca9f5cfbbba0597c404" Mar 10 07:08:07 crc kubenswrapper[4825]: I0310 07:08:07.745774 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc6685c6e0f12fa720d6767fc387619cfe31ffdbdf578ca9f5cfbbba0597c404"} err="failed to get container status \"dc6685c6e0f12fa720d6767fc387619cfe31ffdbdf578ca9f5cfbbba0597c404\": rpc error: code = NotFound desc = could not find container \"dc6685c6e0f12fa720d6767fc387619cfe31ffdbdf578ca9f5cfbbba0597c404\": container with ID starting with dc6685c6e0f12fa720d6767fc387619cfe31ffdbdf578ca9f5cfbbba0597c404 not found: ID does not exist" Mar 10 07:08:07 crc kubenswrapper[4825]: I0310 07:08:07.745812 4825 scope.go:117] "RemoveContainer" containerID="737b2daf7646ea66e2e708f65ad5210026921fa5a0e2fe99c43bea726002539e" Mar 10 07:08:07 crc kubenswrapper[4825]: I0310 07:08:07.746264 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"737b2daf7646ea66e2e708f65ad5210026921fa5a0e2fe99c43bea726002539e"} err="failed to get container status \"737b2daf7646ea66e2e708f65ad5210026921fa5a0e2fe99c43bea726002539e\": rpc error: code = NotFound desc = could not find container \"737b2daf7646ea66e2e708f65ad5210026921fa5a0e2fe99c43bea726002539e\": container with ID starting with 
737b2daf7646ea66e2e708f65ad5210026921fa5a0e2fe99c43bea726002539e not found: ID does not exist" Mar 10 07:08:07 crc kubenswrapper[4825]: I0310 07:08:07.750092 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kl24\" (UniqueName: \"kubernetes.io/projected/1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4-kube-api-access-8kl24\") on node \"crc\" DevicePath \"\"" Mar 10 07:08:07 crc kubenswrapper[4825]: I0310 07:08:07.750147 4825 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 07:08:07 crc kubenswrapper[4825]: I0310 07:08:07.750161 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:08:07 crc kubenswrapper[4825]: I0310 07:08:07.750173 4825 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 07:08:07 crc kubenswrapper[4825]: I0310 07:08:07.750186 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:08:08 crc kubenswrapper[4825]: I0310 07:08:08.022031 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 07:08:08 crc kubenswrapper[4825]: I0310 07:08:08.046039 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 10 07:08:08 crc kubenswrapper[4825]: I0310 07:08:08.055874 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 10 07:08:08 crc kubenswrapper[4825]: E0310 07:08:08.056543 4825 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="3ffd2f64-247a-4adf-bdc5-dce905192f17" containerName="init" Mar 10 07:08:08 crc kubenswrapper[4825]: I0310 07:08:08.056578 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ffd2f64-247a-4adf-bdc5-dce905192f17" containerName="init" Mar 10 07:08:08 crc kubenswrapper[4825]: E0310 07:08:08.056612 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ffd2f64-247a-4adf-bdc5-dce905192f17" containerName="dnsmasq-dns" Mar 10 07:08:08 crc kubenswrapper[4825]: I0310 07:08:08.056624 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ffd2f64-247a-4adf-bdc5-dce905192f17" containerName="dnsmasq-dns" Mar 10 07:08:08 crc kubenswrapper[4825]: E0310 07:08:08.056648 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4" containerName="nova-api-log" Mar 10 07:08:08 crc kubenswrapper[4825]: I0310 07:08:08.056660 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4" containerName="nova-api-log" Mar 10 07:08:08 crc kubenswrapper[4825]: E0310 07:08:08.056689 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20652aad-ebbd-472d-8cd8-61c598352ec4" containerName="nova-manage" Mar 10 07:08:08 crc kubenswrapper[4825]: I0310 07:08:08.056700 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="20652aad-ebbd-472d-8cd8-61c598352ec4" containerName="nova-manage" Mar 10 07:08:08 crc kubenswrapper[4825]: E0310 07:08:08.056720 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa4c8337-0ae4-4b85-9793-4adb29f920a7" containerName="oc" Mar 10 07:08:08 crc kubenswrapper[4825]: I0310 07:08:08.056732 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa4c8337-0ae4-4b85-9793-4adb29f920a7" containerName="oc" Mar 10 07:08:08 crc kubenswrapper[4825]: E0310 07:08:08.056761 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4" 
containerName="nova-api-api" Mar 10 07:08:08 crc kubenswrapper[4825]: I0310 07:08:08.056772 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4" containerName="nova-api-api" Mar 10 07:08:08 crc kubenswrapper[4825]: I0310 07:08:08.057076 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="20652aad-ebbd-472d-8cd8-61c598352ec4" containerName="nova-manage" Mar 10 07:08:08 crc kubenswrapper[4825]: I0310 07:08:08.057095 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ffd2f64-247a-4adf-bdc5-dce905192f17" containerName="dnsmasq-dns" Mar 10 07:08:08 crc kubenswrapper[4825]: I0310 07:08:08.057121 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa4c8337-0ae4-4b85-9793-4adb29f920a7" containerName="oc" Mar 10 07:08:08 crc kubenswrapper[4825]: I0310 07:08:08.057174 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4" containerName="nova-api-api" Mar 10 07:08:08 crc kubenswrapper[4825]: I0310 07:08:08.057196 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4" containerName="nova-api-log" Mar 10 07:08:08 crc kubenswrapper[4825]: I0310 07:08:08.058937 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 07:08:08 crc kubenswrapper[4825]: I0310 07:08:08.061873 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 10 07:08:08 crc kubenswrapper[4825]: I0310 07:08:08.062517 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 10 07:08:08 crc kubenswrapper[4825]: I0310 07:08:08.062660 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 10 07:08:08 crc kubenswrapper[4825]: I0310 07:08:08.115157 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 07:08:08 crc kubenswrapper[4825]: I0310 07:08:08.159105 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fcf819fc-0560-45a8-be5e-04e6ef2bbf32-logs\") pod \"nova-api-0\" (UID: \"fcf819fc-0560-45a8-be5e-04e6ef2bbf32\") " pod="openstack/nova-api-0" Mar 10 07:08:08 crc kubenswrapper[4825]: I0310 07:08:08.159176 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcf819fc-0560-45a8-be5e-04e6ef2bbf32-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fcf819fc-0560-45a8-be5e-04e6ef2bbf32\") " pod="openstack/nova-api-0" Mar 10 07:08:08 crc kubenswrapper[4825]: I0310 07:08:08.159241 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcf819fc-0560-45a8-be5e-04e6ef2bbf32-config-data\") pod \"nova-api-0\" (UID: \"fcf819fc-0560-45a8-be5e-04e6ef2bbf32\") " pod="openstack/nova-api-0" Mar 10 07:08:08 crc kubenswrapper[4825]: I0310 07:08:08.159261 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/fcf819fc-0560-45a8-be5e-04e6ef2bbf32-internal-tls-certs\") pod \"nova-api-0\" (UID: \"fcf819fc-0560-45a8-be5e-04e6ef2bbf32\") " pod="openstack/nova-api-0" Mar 10 07:08:08 crc kubenswrapper[4825]: I0310 07:08:08.159297 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcf819fc-0560-45a8-be5e-04e6ef2bbf32-public-tls-certs\") pod \"nova-api-0\" (UID: \"fcf819fc-0560-45a8-be5e-04e6ef2bbf32\") " pod="openstack/nova-api-0" Mar 10 07:08:08 crc kubenswrapper[4825]: I0310 07:08:08.159525 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86xqr\" (UniqueName: \"kubernetes.io/projected/fcf819fc-0560-45a8-be5e-04e6ef2bbf32-kube-api-access-86xqr\") pod \"nova-api-0\" (UID: \"fcf819fc-0560-45a8-be5e-04e6ef2bbf32\") " pod="openstack/nova-api-0" Mar 10 07:08:08 crc kubenswrapper[4825]: I0310 07:08:08.260633 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcf819fc-0560-45a8-be5e-04e6ef2bbf32-public-tls-certs\") pod \"nova-api-0\" (UID: \"fcf819fc-0560-45a8-be5e-04e6ef2bbf32\") " pod="openstack/nova-api-0" Mar 10 07:08:08 crc kubenswrapper[4825]: I0310 07:08:08.260785 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86xqr\" (UniqueName: \"kubernetes.io/projected/fcf819fc-0560-45a8-be5e-04e6ef2bbf32-kube-api-access-86xqr\") pod \"nova-api-0\" (UID: \"fcf819fc-0560-45a8-be5e-04e6ef2bbf32\") " pod="openstack/nova-api-0" Mar 10 07:08:08 crc kubenswrapper[4825]: I0310 07:08:08.260808 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fcf819fc-0560-45a8-be5e-04e6ef2bbf32-logs\") pod \"nova-api-0\" (UID: \"fcf819fc-0560-45a8-be5e-04e6ef2bbf32\") " pod="openstack/nova-api-0" Mar 10 
07:08:08 crc kubenswrapper[4825]: I0310 07:08:08.260844 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcf819fc-0560-45a8-be5e-04e6ef2bbf32-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fcf819fc-0560-45a8-be5e-04e6ef2bbf32\") " pod="openstack/nova-api-0" Mar 10 07:08:08 crc kubenswrapper[4825]: I0310 07:08:08.260902 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcf819fc-0560-45a8-be5e-04e6ef2bbf32-config-data\") pod \"nova-api-0\" (UID: \"fcf819fc-0560-45a8-be5e-04e6ef2bbf32\") " pod="openstack/nova-api-0" Mar 10 07:08:08 crc kubenswrapper[4825]: I0310 07:08:08.260929 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcf819fc-0560-45a8-be5e-04e6ef2bbf32-internal-tls-certs\") pod \"nova-api-0\" (UID: \"fcf819fc-0560-45a8-be5e-04e6ef2bbf32\") " pod="openstack/nova-api-0" Mar 10 07:08:08 crc kubenswrapper[4825]: I0310 07:08:08.262330 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fcf819fc-0560-45a8-be5e-04e6ef2bbf32-logs\") pod \"nova-api-0\" (UID: \"fcf819fc-0560-45a8-be5e-04e6ef2bbf32\") " pod="openstack/nova-api-0" Mar 10 07:08:08 crc kubenswrapper[4825]: I0310 07:08:08.278181 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcf819fc-0560-45a8-be5e-04e6ef2bbf32-config-data\") pod \"nova-api-0\" (UID: \"fcf819fc-0560-45a8-be5e-04e6ef2bbf32\") " pod="openstack/nova-api-0" Mar 10 07:08:08 crc kubenswrapper[4825]: I0310 07:08:08.278640 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcf819fc-0560-45a8-be5e-04e6ef2bbf32-internal-tls-certs\") pod \"nova-api-0\" (UID: 
\"fcf819fc-0560-45a8-be5e-04e6ef2bbf32\") " pod="openstack/nova-api-0" Mar 10 07:08:08 crc kubenswrapper[4825]: I0310 07:08:08.289737 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcf819fc-0560-45a8-be5e-04e6ef2bbf32-public-tls-certs\") pod \"nova-api-0\" (UID: \"fcf819fc-0560-45a8-be5e-04e6ef2bbf32\") " pod="openstack/nova-api-0" Mar 10 07:08:08 crc kubenswrapper[4825]: I0310 07:08:08.300828 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86xqr\" (UniqueName: \"kubernetes.io/projected/fcf819fc-0560-45a8-be5e-04e6ef2bbf32-kube-api-access-86xqr\") pod \"nova-api-0\" (UID: \"fcf819fc-0560-45a8-be5e-04e6ef2bbf32\") " pod="openstack/nova-api-0" Mar 10 07:08:08 crc kubenswrapper[4825]: I0310 07:08:08.314890 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcf819fc-0560-45a8-be5e-04e6ef2bbf32-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fcf819fc-0560-45a8-be5e-04e6ef2bbf32\") " pod="openstack/nova-api-0" Mar 10 07:08:08 crc kubenswrapper[4825]: I0310 07:08:08.451225 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 07:08:08 crc kubenswrapper[4825]: E0310 07:08:08.457396 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="71c198772602fab08b7c89ff60b473e6b8d4a34a3f456f7a2f4966de4e6b819f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 07:08:08 crc kubenswrapper[4825]: E0310 07:08:08.459001 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="71c198772602fab08b7c89ff60b473e6b8d4a34a3f456f7a2f4966de4e6b819f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 07:08:08 crc kubenswrapper[4825]: E0310 07:08:08.460479 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="71c198772602fab08b7c89ff60b473e6b8d4a34a3f456f7a2f4966de4e6b819f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 07:08:08 crc kubenswrapper[4825]: E0310 07:08:08.460558 4825 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="3858ca8e-74ce-44bd-a942-a64b4f59270f" containerName="nova-scheduler-scheduler" Mar 10 07:08:08 crc kubenswrapper[4825]: I0310 07:08:08.920422 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 07:08:08 crc kubenswrapper[4825]: W0310 07:08:08.921978 4825 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfcf819fc_0560_45a8_be5e_04e6ef2bbf32.slice/crio-dd360048a2592b5601b12ac87d8106f90fc64522cda299c67f0fc7a86da614aa WatchSource:0}: Error finding container dd360048a2592b5601b12ac87d8106f90fc64522cda299c67f0fc7a86da614aa: Status 404 returned error can't find the container with id dd360048a2592b5601b12ac87d8106f90fc64522cda299c67f0fc7a86da614aa Mar 10 07:08:09 crc kubenswrapper[4825]: I0310 07:08:09.254090 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4" path="/var/lib/kubelet/pods/1fe14b8d-8183-4faf-bd66-2ebeb0cfcfe4/volumes" Mar 10 07:08:09 crc kubenswrapper[4825]: I0310 07:08:09.708463 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fcf819fc-0560-45a8-be5e-04e6ef2bbf32","Type":"ContainerStarted","Data":"b0692876912e4dddda340c37876d5dcd6e82ee0c786afdeaa20b480be4903f55"} Mar 10 07:08:09 crc kubenswrapper[4825]: I0310 07:08:09.708526 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fcf819fc-0560-45a8-be5e-04e6ef2bbf32","Type":"ContainerStarted","Data":"40b43e7e62b535d8a9ba96b7af5069797ffcd786e53c1075b83a21ba139ed0fa"} Mar 10 07:08:09 crc kubenswrapper[4825]: I0310 07:08:09.708551 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fcf819fc-0560-45a8-be5e-04e6ef2bbf32","Type":"ContainerStarted","Data":"dd360048a2592b5601b12ac87d8106f90fc64522cda299c67f0fc7a86da614aa"} Mar 10 07:08:09 crc kubenswrapper[4825]: I0310 07:08:09.736964 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.736945862 podStartE2EDuration="1.736945862s" podCreationTimestamp="2026-03-10 07:08:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:08:09.729204621 +0000 
UTC m=+1442.758985236" watchObservedRunningTime="2026-03-10 07:08:09.736945862 +0000 UTC m=+1442.766726477" Mar 10 07:08:10 crc kubenswrapper[4825]: I0310 07:08:10.097805 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="ba141fc1-b6e8-4d89-a27c-be685f51d6ad" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": read tcp 10.217.0.2:58490->10.217.0.203:8775: read: connection reset by peer" Mar 10 07:08:10 crc kubenswrapper[4825]: I0310 07:08:10.097864 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="ba141fc1-b6e8-4d89-a27c-be685f51d6ad" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": read tcp 10.217.0.2:58502->10.217.0.203:8775: read: connection reset by peer" Mar 10 07:08:10 crc kubenswrapper[4825]: I0310 07:08:10.626572 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 07:08:10 crc kubenswrapper[4825]: I0310 07:08:10.712797 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba141fc1-b6e8-4d89-a27c-be685f51d6ad-config-data\") pod \"ba141fc1-b6e8-4d89-a27c-be685f51d6ad\" (UID: \"ba141fc1-b6e8-4d89-a27c-be685f51d6ad\") " Mar 10 07:08:10 crc kubenswrapper[4825]: I0310 07:08:10.712986 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba141fc1-b6e8-4d89-a27c-be685f51d6ad-combined-ca-bundle\") pod \"ba141fc1-b6e8-4d89-a27c-be685f51d6ad\" (UID: \"ba141fc1-b6e8-4d89-a27c-be685f51d6ad\") " Mar 10 07:08:10 crc kubenswrapper[4825]: I0310 07:08:10.713042 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ba141fc1-b6e8-4d89-a27c-be685f51d6ad-nova-metadata-tls-certs\") pod \"ba141fc1-b6e8-4d89-a27c-be685f51d6ad\" (UID: \"ba141fc1-b6e8-4d89-a27c-be685f51d6ad\") " Mar 10 07:08:10 crc kubenswrapper[4825]: I0310 07:08:10.713099 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mscss\" (UniqueName: \"kubernetes.io/projected/ba141fc1-b6e8-4d89-a27c-be685f51d6ad-kube-api-access-mscss\") pod \"ba141fc1-b6e8-4d89-a27c-be685f51d6ad\" (UID: \"ba141fc1-b6e8-4d89-a27c-be685f51d6ad\") " Mar 10 07:08:10 crc kubenswrapper[4825]: I0310 07:08:10.713167 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba141fc1-b6e8-4d89-a27c-be685f51d6ad-logs\") pod \"ba141fc1-b6e8-4d89-a27c-be685f51d6ad\" (UID: \"ba141fc1-b6e8-4d89-a27c-be685f51d6ad\") " Mar 10 07:08:10 crc kubenswrapper[4825]: I0310 07:08:10.713708 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba141fc1-b6e8-4d89-a27c-be685f51d6ad-logs" (OuterVolumeSpecName: "logs") pod "ba141fc1-b6e8-4d89-a27c-be685f51d6ad" (UID: "ba141fc1-b6e8-4d89-a27c-be685f51d6ad"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:08:10 crc kubenswrapper[4825]: I0310 07:08:10.719796 4825 generic.go:334] "Generic (PLEG): container finished" podID="ba141fc1-b6e8-4d89-a27c-be685f51d6ad" containerID="96f554d756b7d59701d89a4889be2adb5eb03daad6038fc9e00da48c875499dd" exitCode=0 Mar 10 07:08:10 crc kubenswrapper[4825]: I0310 07:08:10.720277 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 07:08:10 crc kubenswrapper[4825]: I0310 07:08:10.720295 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ba141fc1-b6e8-4d89-a27c-be685f51d6ad","Type":"ContainerDied","Data":"96f554d756b7d59701d89a4889be2adb5eb03daad6038fc9e00da48c875499dd"} Mar 10 07:08:10 crc kubenswrapper[4825]: I0310 07:08:10.720349 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ba141fc1-b6e8-4d89-a27c-be685f51d6ad","Type":"ContainerDied","Data":"2659159c5880ea26e1514eedac232ae2c791d5640de8edabf85bc186366fb5bf"} Mar 10 07:08:10 crc kubenswrapper[4825]: I0310 07:08:10.720374 4825 scope.go:117] "RemoveContainer" containerID="96f554d756b7d59701d89a4889be2adb5eb03daad6038fc9e00da48c875499dd" Mar 10 07:08:10 crc kubenswrapper[4825]: I0310 07:08:10.730378 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba141fc1-b6e8-4d89-a27c-be685f51d6ad-kube-api-access-mscss" (OuterVolumeSpecName: "kube-api-access-mscss") pod "ba141fc1-b6e8-4d89-a27c-be685f51d6ad" (UID: "ba141fc1-b6e8-4d89-a27c-be685f51d6ad"). InnerVolumeSpecName "kube-api-access-mscss". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:08:10 crc kubenswrapper[4825]: I0310 07:08:10.749984 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba141fc1-b6e8-4d89-a27c-be685f51d6ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba141fc1-b6e8-4d89-a27c-be685f51d6ad" (UID: "ba141fc1-b6e8-4d89-a27c-be685f51d6ad"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:08:10 crc kubenswrapper[4825]: I0310 07:08:10.773276 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba141fc1-b6e8-4d89-a27c-be685f51d6ad-config-data" (OuterVolumeSpecName: "config-data") pod "ba141fc1-b6e8-4d89-a27c-be685f51d6ad" (UID: "ba141fc1-b6e8-4d89-a27c-be685f51d6ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:08:10 crc kubenswrapper[4825]: I0310 07:08:10.806155 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba141fc1-b6e8-4d89-a27c-be685f51d6ad-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "ba141fc1-b6e8-4d89-a27c-be685f51d6ad" (UID: "ba141fc1-b6e8-4d89-a27c-be685f51d6ad"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:08:10 crc kubenswrapper[4825]: I0310 07:08:10.814693 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba141fc1-b6e8-4d89-a27c-be685f51d6ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:08:10 crc kubenswrapper[4825]: I0310 07:08:10.814863 4825 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba141fc1-b6e8-4d89-a27c-be685f51d6ad-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 07:08:10 crc kubenswrapper[4825]: I0310 07:08:10.814943 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mscss\" (UniqueName: \"kubernetes.io/projected/ba141fc1-b6e8-4d89-a27c-be685f51d6ad-kube-api-access-mscss\") on node \"crc\" DevicePath \"\"" Mar 10 07:08:10 crc kubenswrapper[4825]: I0310 07:08:10.815015 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba141fc1-b6e8-4d89-a27c-be685f51d6ad-logs\") on 
node \"crc\" DevicePath \"\"" Mar 10 07:08:10 crc kubenswrapper[4825]: I0310 07:08:10.815076 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba141fc1-b6e8-4d89-a27c-be685f51d6ad-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:08:10 crc kubenswrapper[4825]: I0310 07:08:10.830213 4825 scope.go:117] "RemoveContainer" containerID="72dbff0f20a0e237d347b1529b4a95ef2ab111aa2d0f91ed82359a099402beb9" Mar 10 07:08:10 crc kubenswrapper[4825]: I0310 07:08:10.851991 4825 scope.go:117] "RemoveContainer" containerID="96f554d756b7d59701d89a4889be2adb5eb03daad6038fc9e00da48c875499dd" Mar 10 07:08:10 crc kubenswrapper[4825]: E0310 07:08:10.852951 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96f554d756b7d59701d89a4889be2adb5eb03daad6038fc9e00da48c875499dd\": container with ID starting with 96f554d756b7d59701d89a4889be2adb5eb03daad6038fc9e00da48c875499dd not found: ID does not exist" containerID="96f554d756b7d59701d89a4889be2adb5eb03daad6038fc9e00da48c875499dd" Mar 10 07:08:10 crc kubenswrapper[4825]: I0310 07:08:10.853159 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96f554d756b7d59701d89a4889be2adb5eb03daad6038fc9e00da48c875499dd"} err="failed to get container status \"96f554d756b7d59701d89a4889be2adb5eb03daad6038fc9e00da48c875499dd\": rpc error: code = NotFound desc = could not find container \"96f554d756b7d59701d89a4889be2adb5eb03daad6038fc9e00da48c875499dd\": container with ID starting with 96f554d756b7d59701d89a4889be2adb5eb03daad6038fc9e00da48c875499dd not found: ID does not exist" Mar 10 07:08:10 crc kubenswrapper[4825]: I0310 07:08:10.853288 4825 scope.go:117] "RemoveContainer" containerID="72dbff0f20a0e237d347b1529b4a95ef2ab111aa2d0f91ed82359a099402beb9" Mar 10 07:08:10 crc kubenswrapper[4825]: E0310 07:08:10.853624 4825 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"72dbff0f20a0e237d347b1529b4a95ef2ab111aa2d0f91ed82359a099402beb9\": container with ID starting with 72dbff0f20a0e237d347b1529b4a95ef2ab111aa2d0f91ed82359a099402beb9 not found: ID does not exist" containerID="72dbff0f20a0e237d347b1529b4a95ef2ab111aa2d0f91ed82359a099402beb9" Mar 10 07:08:10 crc kubenswrapper[4825]: I0310 07:08:10.853654 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72dbff0f20a0e237d347b1529b4a95ef2ab111aa2d0f91ed82359a099402beb9"} err="failed to get container status \"72dbff0f20a0e237d347b1529b4a95ef2ab111aa2d0f91ed82359a099402beb9\": rpc error: code = NotFound desc = could not find container \"72dbff0f20a0e237d347b1529b4a95ef2ab111aa2d0f91ed82359a099402beb9\": container with ID starting with 72dbff0f20a0e237d347b1529b4a95ef2ab111aa2d0f91ed82359a099402beb9 not found: ID does not exist" Mar 10 07:08:11 crc kubenswrapper[4825]: I0310 07:08:11.054120 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 07:08:11 crc kubenswrapper[4825]: I0310 07:08:11.063567 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 07:08:11 crc kubenswrapper[4825]: I0310 07:08:11.082365 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 10 07:08:11 crc kubenswrapper[4825]: E0310 07:08:11.082764 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba141fc1-b6e8-4d89-a27c-be685f51d6ad" containerName="nova-metadata-log" Mar 10 07:08:11 crc kubenswrapper[4825]: I0310 07:08:11.082783 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba141fc1-b6e8-4d89-a27c-be685f51d6ad" containerName="nova-metadata-log" Mar 10 07:08:11 crc kubenswrapper[4825]: E0310 07:08:11.082814 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba141fc1-b6e8-4d89-a27c-be685f51d6ad" 
containerName="nova-metadata-metadata" Mar 10 07:08:11 crc kubenswrapper[4825]: I0310 07:08:11.082836 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba141fc1-b6e8-4d89-a27c-be685f51d6ad" containerName="nova-metadata-metadata" Mar 10 07:08:11 crc kubenswrapper[4825]: I0310 07:08:11.083061 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba141fc1-b6e8-4d89-a27c-be685f51d6ad" containerName="nova-metadata-metadata" Mar 10 07:08:11 crc kubenswrapper[4825]: I0310 07:08:11.083100 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba141fc1-b6e8-4d89-a27c-be685f51d6ad" containerName="nova-metadata-log" Mar 10 07:08:11 crc kubenswrapper[4825]: I0310 07:08:11.084249 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 07:08:11 crc kubenswrapper[4825]: I0310 07:08:11.087496 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 10 07:08:11 crc kubenswrapper[4825]: I0310 07:08:11.088374 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 10 07:08:11 crc kubenswrapper[4825]: I0310 07:08:11.095654 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 07:08:11 crc kubenswrapper[4825]: I0310 07:08:11.123115 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1222a91-b344-4f54-bf9f-75d03d5f8549-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b1222a91-b344-4f54-bf9f-75d03d5f8549\") " pod="openstack/nova-metadata-0" Mar 10 07:08:11 crc kubenswrapper[4825]: I0310 07:08:11.123263 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1222a91-b344-4f54-bf9f-75d03d5f8549-nova-metadata-tls-certs\") 
pod \"nova-metadata-0\" (UID: \"b1222a91-b344-4f54-bf9f-75d03d5f8549\") " pod="openstack/nova-metadata-0" Mar 10 07:08:11 crc kubenswrapper[4825]: I0310 07:08:11.123323 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1222a91-b344-4f54-bf9f-75d03d5f8549-config-data\") pod \"nova-metadata-0\" (UID: \"b1222a91-b344-4f54-bf9f-75d03d5f8549\") " pod="openstack/nova-metadata-0" Mar 10 07:08:11 crc kubenswrapper[4825]: I0310 07:08:11.123420 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmn4h\" (UniqueName: \"kubernetes.io/projected/b1222a91-b344-4f54-bf9f-75d03d5f8549-kube-api-access-tmn4h\") pod \"nova-metadata-0\" (UID: \"b1222a91-b344-4f54-bf9f-75d03d5f8549\") " pod="openstack/nova-metadata-0" Mar 10 07:08:11 crc kubenswrapper[4825]: I0310 07:08:11.123508 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1222a91-b344-4f54-bf9f-75d03d5f8549-logs\") pod \"nova-metadata-0\" (UID: \"b1222a91-b344-4f54-bf9f-75d03d5f8549\") " pod="openstack/nova-metadata-0" Mar 10 07:08:11 crc kubenswrapper[4825]: I0310 07:08:11.225556 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1222a91-b344-4f54-bf9f-75d03d5f8549-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b1222a91-b344-4f54-bf9f-75d03d5f8549\") " pod="openstack/nova-metadata-0" Mar 10 07:08:11 crc kubenswrapper[4825]: I0310 07:08:11.225646 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1222a91-b344-4f54-bf9f-75d03d5f8549-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b1222a91-b344-4f54-bf9f-75d03d5f8549\") " pod="openstack/nova-metadata-0" Mar 10 07:08:11 
crc kubenswrapper[4825]: I0310 07:08:11.225684 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1222a91-b344-4f54-bf9f-75d03d5f8549-config-data\") pod \"nova-metadata-0\" (UID: \"b1222a91-b344-4f54-bf9f-75d03d5f8549\") " pod="openstack/nova-metadata-0" Mar 10 07:08:11 crc kubenswrapper[4825]: I0310 07:08:11.225717 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmn4h\" (UniqueName: \"kubernetes.io/projected/b1222a91-b344-4f54-bf9f-75d03d5f8549-kube-api-access-tmn4h\") pod \"nova-metadata-0\" (UID: \"b1222a91-b344-4f54-bf9f-75d03d5f8549\") " pod="openstack/nova-metadata-0" Mar 10 07:08:11 crc kubenswrapper[4825]: I0310 07:08:11.225749 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1222a91-b344-4f54-bf9f-75d03d5f8549-logs\") pod \"nova-metadata-0\" (UID: \"b1222a91-b344-4f54-bf9f-75d03d5f8549\") " pod="openstack/nova-metadata-0" Mar 10 07:08:11 crc kubenswrapper[4825]: I0310 07:08:11.226382 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1222a91-b344-4f54-bf9f-75d03d5f8549-logs\") pod \"nova-metadata-0\" (UID: \"b1222a91-b344-4f54-bf9f-75d03d5f8549\") " pod="openstack/nova-metadata-0" Mar 10 07:08:11 crc kubenswrapper[4825]: I0310 07:08:11.229822 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1222a91-b344-4f54-bf9f-75d03d5f8549-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b1222a91-b344-4f54-bf9f-75d03d5f8549\") " pod="openstack/nova-metadata-0" Mar 10 07:08:11 crc kubenswrapper[4825]: I0310 07:08:11.230350 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b1222a91-b344-4f54-bf9f-75d03d5f8549-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b1222a91-b344-4f54-bf9f-75d03d5f8549\") " pod="openstack/nova-metadata-0" Mar 10 07:08:11 crc kubenswrapper[4825]: I0310 07:08:11.231924 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1222a91-b344-4f54-bf9f-75d03d5f8549-config-data\") pod \"nova-metadata-0\" (UID: \"b1222a91-b344-4f54-bf9f-75d03d5f8549\") " pod="openstack/nova-metadata-0" Mar 10 07:08:11 crc kubenswrapper[4825]: I0310 07:08:11.243559 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmn4h\" (UniqueName: \"kubernetes.io/projected/b1222a91-b344-4f54-bf9f-75d03d5f8549-kube-api-access-tmn4h\") pod \"nova-metadata-0\" (UID: \"b1222a91-b344-4f54-bf9f-75d03d5f8549\") " pod="openstack/nova-metadata-0" Mar 10 07:08:11 crc kubenswrapper[4825]: I0310 07:08:11.250592 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba141fc1-b6e8-4d89-a27c-be685f51d6ad" path="/var/lib/kubelet/pods/ba141fc1-b6e8-4d89-a27c-be685f51d6ad/volumes" Mar 10 07:08:11 crc kubenswrapper[4825]: I0310 07:08:11.406079 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 07:08:11 crc kubenswrapper[4825]: I0310 07:08:11.908797 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 07:08:12 crc kubenswrapper[4825]: W0310 07:08:12.181380 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1222a91_b344_4f54_bf9f_75d03d5f8549.slice/crio-7bf1f87eaf0884a2262921933bd4c53aab23815c84d61c53a1b7c86ab76b98fb WatchSource:0}: Error finding container 7bf1f87eaf0884a2262921933bd4c53aab23815c84d61c53a1b7c86ab76b98fb: Status 404 returned error can't find the container with id 7bf1f87eaf0884a2262921933bd4c53aab23815c84d61c53a1b7c86ab76b98fb Mar 10 07:08:12 crc kubenswrapper[4825]: I0310 07:08:12.732518 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 07:08:12 crc kubenswrapper[4825]: I0310 07:08:12.746617 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b1222a91-b344-4f54-bf9f-75d03d5f8549","Type":"ContainerStarted","Data":"eeadc36b533bcb861ae1e4ac68b98711c46023cbb63985038bb6a6302eb5d0a3"} Mar 10 07:08:12 crc kubenswrapper[4825]: I0310 07:08:12.746883 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b1222a91-b344-4f54-bf9f-75d03d5f8549","Type":"ContainerStarted","Data":"7bf1f87eaf0884a2262921933bd4c53aab23815c84d61c53a1b7c86ab76b98fb"} Mar 10 07:08:12 crc kubenswrapper[4825]: I0310 07:08:12.752463 4825 generic.go:334] "Generic (PLEG): container finished" podID="3858ca8e-74ce-44bd-a942-a64b4f59270f" containerID="71c198772602fab08b7c89ff60b473e6b8d4a34a3f456f7a2f4966de4e6b819f" exitCode=0 Mar 10 07:08:12 crc kubenswrapper[4825]: I0310 07:08:12.752510 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"3858ca8e-74ce-44bd-a942-a64b4f59270f","Type":"ContainerDied","Data":"71c198772602fab08b7c89ff60b473e6b8d4a34a3f456f7a2f4966de4e6b819f"} Mar 10 07:08:12 crc kubenswrapper[4825]: I0310 07:08:12.752540 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3858ca8e-74ce-44bd-a942-a64b4f59270f","Type":"ContainerDied","Data":"f2d950393409df5d1cf1663c33e92b955996a6d637c9234998a222040980c8fd"} Mar 10 07:08:12 crc kubenswrapper[4825]: I0310 07:08:12.752558 4825 scope.go:117] "RemoveContainer" containerID="71c198772602fab08b7c89ff60b473e6b8d4a34a3f456f7a2f4966de4e6b819f" Mar 10 07:08:12 crc kubenswrapper[4825]: I0310 07:08:12.752669 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 07:08:12 crc kubenswrapper[4825]: I0310 07:08:12.830266 4825 scope.go:117] "RemoveContainer" containerID="71c198772602fab08b7c89ff60b473e6b8d4a34a3f456f7a2f4966de4e6b819f" Mar 10 07:08:12 crc kubenswrapper[4825]: E0310 07:08:12.830737 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71c198772602fab08b7c89ff60b473e6b8d4a34a3f456f7a2f4966de4e6b819f\": container with ID starting with 71c198772602fab08b7c89ff60b473e6b8d4a34a3f456f7a2f4966de4e6b819f not found: ID does not exist" containerID="71c198772602fab08b7c89ff60b473e6b8d4a34a3f456f7a2f4966de4e6b819f" Mar 10 07:08:12 crc kubenswrapper[4825]: I0310 07:08:12.830812 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71c198772602fab08b7c89ff60b473e6b8d4a34a3f456f7a2f4966de4e6b819f"} err="failed to get container status \"71c198772602fab08b7c89ff60b473e6b8d4a34a3f456f7a2f4966de4e6b819f\": rpc error: code = NotFound desc = could not find container \"71c198772602fab08b7c89ff60b473e6b8d4a34a3f456f7a2f4966de4e6b819f\": container with ID starting with 
71c198772602fab08b7c89ff60b473e6b8d4a34a3f456f7a2f4966de4e6b819f not found: ID does not exist" Mar 10 07:08:12 crc kubenswrapper[4825]: I0310 07:08:12.891070 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7vxs\" (UniqueName: \"kubernetes.io/projected/3858ca8e-74ce-44bd-a942-a64b4f59270f-kube-api-access-v7vxs\") pod \"3858ca8e-74ce-44bd-a942-a64b4f59270f\" (UID: \"3858ca8e-74ce-44bd-a942-a64b4f59270f\") " Mar 10 07:08:12 crc kubenswrapper[4825]: I0310 07:08:12.891198 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3858ca8e-74ce-44bd-a942-a64b4f59270f-combined-ca-bundle\") pod \"3858ca8e-74ce-44bd-a942-a64b4f59270f\" (UID: \"3858ca8e-74ce-44bd-a942-a64b4f59270f\") " Mar 10 07:08:12 crc kubenswrapper[4825]: I0310 07:08:12.891345 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3858ca8e-74ce-44bd-a942-a64b4f59270f-config-data\") pod \"3858ca8e-74ce-44bd-a942-a64b4f59270f\" (UID: \"3858ca8e-74ce-44bd-a942-a64b4f59270f\") " Mar 10 07:08:12 crc kubenswrapper[4825]: I0310 07:08:12.895652 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3858ca8e-74ce-44bd-a942-a64b4f59270f-kube-api-access-v7vxs" (OuterVolumeSpecName: "kube-api-access-v7vxs") pod "3858ca8e-74ce-44bd-a942-a64b4f59270f" (UID: "3858ca8e-74ce-44bd-a942-a64b4f59270f"). InnerVolumeSpecName "kube-api-access-v7vxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:08:12 crc kubenswrapper[4825]: I0310 07:08:12.921404 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3858ca8e-74ce-44bd-a942-a64b4f59270f-config-data" (OuterVolumeSpecName: "config-data") pod "3858ca8e-74ce-44bd-a942-a64b4f59270f" (UID: "3858ca8e-74ce-44bd-a942-a64b4f59270f"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:08:12 crc kubenswrapper[4825]: I0310 07:08:12.942901 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3858ca8e-74ce-44bd-a942-a64b4f59270f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3858ca8e-74ce-44bd-a942-a64b4f59270f" (UID: "3858ca8e-74ce-44bd-a942-a64b4f59270f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:08:12 crc kubenswrapper[4825]: I0310 07:08:12.993589 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3858ca8e-74ce-44bd-a942-a64b4f59270f-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:08:12 crc kubenswrapper[4825]: I0310 07:08:12.993630 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7vxs\" (UniqueName: \"kubernetes.io/projected/3858ca8e-74ce-44bd-a942-a64b4f59270f-kube-api-access-v7vxs\") on node \"crc\" DevicePath \"\"" Mar 10 07:08:12 crc kubenswrapper[4825]: I0310 07:08:12.993645 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3858ca8e-74ce-44bd-a942-a64b4f59270f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:08:13 crc kubenswrapper[4825]: I0310 07:08:13.084525 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 07:08:13 crc kubenswrapper[4825]: I0310 07:08:13.094464 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 07:08:13 crc kubenswrapper[4825]: I0310 07:08:13.103672 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 07:08:13 crc kubenswrapper[4825]: E0310 07:08:13.104111 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3858ca8e-74ce-44bd-a942-a64b4f59270f" 
containerName="nova-scheduler-scheduler" Mar 10 07:08:13 crc kubenswrapper[4825]: I0310 07:08:13.104155 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="3858ca8e-74ce-44bd-a942-a64b4f59270f" containerName="nova-scheduler-scheduler" Mar 10 07:08:13 crc kubenswrapper[4825]: I0310 07:08:13.104366 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="3858ca8e-74ce-44bd-a942-a64b4f59270f" containerName="nova-scheduler-scheduler" Mar 10 07:08:13 crc kubenswrapper[4825]: I0310 07:08:13.105159 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 07:08:13 crc kubenswrapper[4825]: I0310 07:08:13.107666 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 10 07:08:13 crc kubenswrapper[4825]: I0310 07:08:13.117436 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 07:08:13 crc kubenswrapper[4825]: I0310 07:08:13.257505 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3858ca8e-74ce-44bd-a942-a64b4f59270f" path="/var/lib/kubelet/pods/3858ca8e-74ce-44bd-a942-a64b4f59270f/volumes" Mar 10 07:08:13 crc kubenswrapper[4825]: I0310 07:08:13.299430 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85f5995b-4ad8-4840-82f1-6659152c3ed4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"85f5995b-4ad8-4840-82f1-6659152c3ed4\") " pod="openstack/nova-scheduler-0" Mar 10 07:08:13 crc kubenswrapper[4825]: I0310 07:08:13.299534 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvjqh\" (UniqueName: \"kubernetes.io/projected/85f5995b-4ad8-4840-82f1-6659152c3ed4-kube-api-access-tvjqh\") pod \"nova-scheduler-0\" (UID: \"85f5995b-4ad8-4840-82f1-6659152c3ed4\") " pod="openstack/nova-scheduler-0" Mar 10 
07:08:13 crc kubenswrapper[4825]: I0310 07:08:13.299622 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85f5995b-4ad8-4840-82f1-6659152c3ed4-config-data\") pod \"nova-scheduler-0\" (UID: \"85f5995b-4ad8-4840-82f1-6659152c3ed4\") " pod="openstack/nova-scheduler-0" Mar 10 07:08:13 crc kubenswrapper[4825]: I0310 07:08:13.401198 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvjqh\" (UniqueName: \"kubernetes.io/projected/85f5995b-4ad8-4840-82f1-6659152c3ed4-kube-api-access-tvjqh\") pod \"nova-scheduler-0\" (UID: \"85f5995b-4ad8-4840-82f1-6659152c3ed4\") " pod="openstack/nova-scheduler-0" Mar 10 07:08:13 crc kubenswrapper[4825]: I0310 07:08:13.401400 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85f5995b-4ad8-4840-82f1-6659152c3ed4-config-data\") pod \"nova-scheduler-0\" (UID: \"85f5995b-4ad8-4840-82f1-6659152c3ed4\") " pod="openstack/nova-scheduler-0" Mar 10 07:08:13 crc kubenswrapper[4825]: I0310 07:08:13.401552 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85f5995b-4ad8-4840-82f1-6659152c3ed4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"85f5995b-4ad8-4840-82f1-6659152c3ed4\") " pod="openstack/nova-scheduler-0" Mar 10 07:08:13 crc kubenswrapper[4825]: I0310 07:08:13.408084 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85f5995b-4ad8-4840-82f1-6659152c3ed4-config-data\") pod \"nova-scheduler-0\" (UID: \"85f5995b-4ad8-4840-82f1-6659152c3ed4\") " pod="openstack/nova-scheduler-0" Mar 10 07:08:13 crc kubenswrapper[4825]: I0310 07:08:13.408659 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/85f5995b-4ad8-4840-82f1-6659152c3ed4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"85f5995b-4ad8-4840-82f1-6659152c3ed4\") " pod="openstack/nova-scheduler-0" Mar 10 07:08:13 crc kubenswrapper[4825]: I0310 07:08:13.423069 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvjqh\" (UniqueName: \"kubernetes.io/projected/85f5995b-4ad8-4840-82f1-6659152c3ed4-kube-api-access-tvjqh\") pod \"nova-scheduler-0\" (UID: \"85f5995b-4ad8-4840-82f1-6659152c3ed4\") " pod="openstack/nova-scheduler-0" Mar 10 07:08:13 crc kubenswrapper[4825]: I0310 07:08:13.721122 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 07:08:13 crc kubenswrapper[4825]: I0310 07:08:13.771850 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b1222a91-b344-4f54-bf9f-75d03d5f8549","Type":"ContainerStarted","Data":"fe83016a56de378ec7204c67a3bc85b458b1da0d43277d26413e0aa8e2ea9dc4"} Mar 10 07:08:14 crc kubenswrapper[4825]: I0310 07:08:14.220828 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.22080674 podStartE2EDuration="3.22080674s" podCreationTimestamp="2026-03-10 07:08:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:08:13.799708363 +0000 UTC m=+1446.829488998" watchObservedRunningTime="2026-03-10 07:08:14.22080674 +0000 UTC m=+1447.250587355" Mar 10 07:08:14 crc kubenswrapper[4825]: I0310 07:08:14.226063 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 07:08:14 crc kubenswrapper[4825]: I0310 07:08:14.791548 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"85f5995b-4ad8-4840-82f1-6659152c3ed4","Type":"ContainerStarted","Data":"23d5b7cd25a646bc4179b5e41cb642aa99ae3019bdbfa362dc0578d525c4bfdf"} Mar 10 07:08:14 crc kubenswrapper[4825]: I0310 07:08:14.791917 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"85f5995b-4ad8-4840-82f1-6659152c3ed4","Type":"ContainerStarted","Data":"141b7ddad9bdd5daae4c0afcd89acbe29d3c3b9a413154d4f74d7e889beeb150"} Mar 10 07:08:14 crc kubenswrapper[4825]: I0310 07:08:14.821497 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.821479278 podStartE2EDuration="1.821479278s" podCreationTimestamp="2026-03-10 07:08:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 07:08:14.819122797 +0000 UTC m=+1447.848903492" watchObservedRunningTime="2026-03-10 07:08:14.821479278 +0000 UTC m=+1447.851259903" Mar 10 07:08:16 crc kubenswrapper[4825]: I0310 07:08:16.406682 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 10 07:08:16 crc kubenswrapper[4825]: I0310 07:08:16.406989 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 10 07:08:18 crc kubenswrapper[4825]: I0310 07:08:18.451753 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 07:08:18 crc kubenswrapper[4825]: I0310 07:08:18.451921 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 07:08:18 crc kubenswrapper[4825]: I0310 07:08:18.722887 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 10 07:08:19 crc kubenswrapper[4825]: I0310 07:08:19.470396 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="fcf819fc-0560-45a8-be5e-04e6ef2bbf32" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.214:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 07:08:19 crc kubenswrapper[4825]: I0310 07:08:19.470419 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="fcf819fc-0560-45a8-be5e-04e6ef2bbf32" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.214:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 07:08:21 crc kubenswrapper[4825]: I0310 07:08:21.407435 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 10 07:08:21 crc kubenswrapper[4825]: I0310 07:08:21.407484 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 10 07:08:22 crc kubenswrapper[4825]: I0310 07:08:22.483406 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b1222a91-b344-4f54-bf9f-75d03d5f8549" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.215:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 07:08:22 crc kubenswrapper[4825]: I0310 07:08:22.483416 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b1222a91-b344-4f54-bf9f-75d03d5f8549" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.215:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 07:08:23 crc kubenswrapper[4825]: I0310 07:08:23.722341 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 10 07:08:23 crc kubenswrapper[4825]: I0310 07:08:23.757532 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-scheduler-0" Mar 10 07:08:23 crc kubenswrapper[4825]: I0310 07:08:23.913809 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 10 07:08:25 crc kubenswrapper[4825]: I0310 07:08:25.854119 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="c09e2c9b-b94a-40cb-8646-687e010f3002" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 10 07:08:28 crc kubenswrapper[4825]: I0310 07:08:28.462063 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 10 07:08:28 crc kubenswrapper[4825]: I0310 07:08:28.462969 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 10 07:08:28 crc kubenswrapper[4825]: I0310 07:08:28.463097 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 10 07:08:28 crc kubenswrapper[4825]: I0310 07:08:28.479331 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 10 07:08:28 crc kubenswrapper[4825]: I0310 07:08:28.934670 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 10 07:08:28 crc kubenswrapper[4825]: I0310 07:08:28.941415 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 10 07:08:30 crc kubenswrapper[4825]: I0310 07:08:30.993402 4825 generic.go:334] "Generic (PLEG): container finished" podID="c09e2c9b-b94a-40cb-8646-687e010f3002" containerID="126696bf9938795cae75dc7fc16fc96bf0c323ea2004daf4fa07abd6a757a8a9" exitCode=137 Mar 10 07:08:30 crc kubenswrapper[4825]: I0310 07:08:30.993529 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"c09e2c9b-b94a-40cb-8646-687e010f3002","Type":"ContainerDied","Data":"126696bf9938795cae75dc7fc16fc96bf0c323ea2004daf4fa07abd6a757a8a9"} Mar 10 07:08:30 crc kubenswrapper[4825]: I0310 07:08:30.993608 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c09e2c9b-b94a-40cb-8646-687e010f3002","Type":"ContainerDied","Data":"a5159655c37fce49ddef6374d763715ba1db382aa253a51504395a622d8c518b"} Mar 10 07:08:30 crc kubenswrapper[4825]: I0310 07:08:30.993627 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5159655c37fce49ddef6374d763715ba1db382aa253a51504395a622d8c518b" Mar 10 07:08:31 crc kubenswrapper[4825]: I0310 07:08:31.033686 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 07:08:31 crc kubenswrapper[4825]: I0310 07:08:31.196849 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c09e2c9b-b94a-40cb-8646-687e010f3002-combined-ca-bundle\") pod \"c09e2c9b-b94a-40cb-8646-687e010f3002\" (UID: \"c09e2c9b-b94a-40cb-8646-687e010f3002\") " Mar 10 07:08:31 crc kubenswrapper[4825]: I0310 07:08:31.196931 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c09e2c9b-b94a-40cb-8646-687e010f3002-log-httpd\") pod \"c09e2c9b-b94a-40cb-8646-687e010f3002\" (UID: \"c09e2c9b-b94a-40cb-8646-687e010f3002\") " Mar 10 07:08:31 crc kubenswrapper[4825]: I0310 07:08:31.196985 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7g4kn\" (UniqueName: \"kubernetes.io/projected/c09e2c9b-b94a-40cb-8646-687e010f3002-kube-api-access-7g4kn\") pod \"c09e2c9b-b94a-40cb-8646-687e010f3002\" (UID: \"c09e2c9b-b94a-40cb-8646-687e010f3002\") " Mar 10 07:08:31 crc kubenswrapper[4825]: I0310 07:08:31.197067 4825 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c09e2c9b-b94a-40cb-8646-687e010f3002-ceilometer-tls-certs\") pod \"c09e2c9b-b94a-40cb-8646-687e010f3002\" (UID: \"c09e2c9b-b94a-40cb-8646-687e010f3002\") " Mar 10 07:08:31 crc kubenswrapper[4825]: I0310 07:08:31.197235 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c09e2c9b-b94a-40cb-8646-687e010f3002-scripts\") pod \"c09e2c9b-b94a-40cb-8646-687e010f3002\" (UID: \"c09e2c9b-b94a-40cb-8646-687e010f3002\") " Mar 10 07:08:31 crc kubenswrapper[4825]: I0310 07:08:31.197284 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c09e2c9b-b94a-40cb-8646-687e010f3002-run-httpd\") pod \"c09e2c9b-b94a-40cb-8646-687e010f3002\" (UID: \"c09e2c9b-b94a-40cb-8646-687e010f3002\") " Mar 10 07:08:31 crc kubenswrapper[4825]: I0310 07:08:31.197324 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c09e2c9b-b94a-40cb-8646-687e010f3002-config-data\") pod \"c09e2c9b-b94a-40cb-8646-687e010f3002\" (UID: \"c09e2c9b-b94a-40cb-8646-687e010f3002\") " Mar 10 07:08:31 crc kubenswrapper[4825]: I0310 07:08:31.197353 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c09e2c9b-b94a-40cb-8646-687e010f3002-sg-core-conf-yaml\") pod \"c09e2c9b-b94a-40cb-8646-687e010f3002\" (UID: \"c09e2c9b-b94a-40cb-8646-687e010f3002\") " Mar 10 07:08:31 crc kubenswrapper[4825]: I0310 07:08:31.197894 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c09e2c9b-b94a-40cb-8646-687e010f3002-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c09e2c9b-b94a-40cb-8646-687e010f3002" (UID: 
"c09e2c9b-b94a-40cb-8646-687e010f3002"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:08:31 crc kubenswrapper[4825]: I0310 07:08:31.198037 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c09e2c9b-b94a-40cb-8646-687e010f3002-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c09e2c9b-b94a-40cb-8646-687e010f3002" (UID: "c09e2c9b-b94a-40cb-8646-687e010f3002"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:08:31 crc kubenswrapper[4825]: I0310 07:08:31.214493 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c09e2c9b-b94a-40cb-8646-687e010f3002-scripts" (OuterVolumeSpecName: "scripts") pod "c09e2c9b-b94a-40cb-8646-687e010f3002" (UID: "c09e2c9b-b94a-40cb-8646-687e010f3002"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:08:31 crc kubenswrapper[4825]: I0310 07:08:31.214591 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c09e2c9b-b94a-40cb-8646-687e010f3002-kube-api-access-7g4kn" (OuterVolumeSpecName: "kube-api-access-7g4kn") pod "c09e2c9b-b94a-40cb-8646-687e010f3002" (UID: "c09e2c9b-b94a-40cb-8646-687e010f3002"). InnerVolumeSpecName "kube-api-access-7g4kn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:08:31 crc kubenswrapper[4825]: I0310 07:08:31.233953 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c09e2c9b-b94a-40cb-8646-687e010f3002-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c09e2c9b-b94a-40cb-8646-687e010f3002" (UID: "c09e2c9b-b94a-40cb-8646-687e010f3002"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:08:31 crc kubenswrapper[4825]: I0310 07:08:31.253331 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c09e2c9b-b94a-40cb-8646-687e010f3002-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "c09e2c9b-b94a-40cb-8646-687e010f3002" (UID: "c09e2c9b-b94a-40cb-8646-687e010f3002"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:08:31 crc kubenswrapper[4825]: I0310 07:08:31.288571 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c09e2c9b-b94a-40cb-8646-687e010f3002-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c09e2c9b-b94a-40cb-8646-687e010f3002" (UID: "c09e2c9b-b94a-40cb-8646-687e010f3002"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:08:31 crc kubenswrapper[4825]: I0310 07:08:31.300205 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c09e2c9b-b94a-40cb-8646-687e010f3002-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:08:31 crc kubenswrapper[4825]: I0310 07:08:31.300241 4825 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c09e2c9b-b94a-40cb-8646-687e010f3002-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 07:08:31 crc kubenswrapper[4825]: I0310 07:08:31.300255 4825 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c09e2c9b-b94a-40cb-8646-687e010f3002-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 07:08:31 crc kubenswrapper[4825]: I0310 07:08:31.300269 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c09e2c9b-b94a-40cb-8646-687e010f3002-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Mar 10 07:08:31 crc kubenswrapper[4825]: I0310 07:08:31.300283 4825 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c09e2c9b-b94a-40cb-8646-687e010f3002-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 07:08:31 crc kubenswrapper[4825]: I0310 07:08:31.300295 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7g4kn\" (UniqueName: \"kubernetes.io/projected/c09e2c9b-b94a-40cb-8646-687e010f3002-kube-api-access-7g4kn\") on node \"crc\" DevicePath \"\"" Mar 10 07:08:31 crc kubenswrapper[4825]: I0310 07:08:31.300308 4825 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c09e2c9b-b94a-40cb-8646-687e010f3002-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 07:08:31 crc kubenswrapper[4825]: I0310 07:08:31.316741 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c09e2c9b-b94a-40cb-8646-687e010f3002-config-data" (OuterVolumeSpecName: "config-data") pod "c09e2c9b-b94a-40cb-8646-687e010f3002" (UID: "c09e2c9b-b94a-40cb-8646-687e010f3002"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:08:31 crc kubenswrapper[4825]: I0310 07:08:31.402341 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c09e2c9b-b94a-40cb-8646-687e010f3002-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:08:31 crc kubenswrapper[4825]: I0310 07:08:31.413224 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 10 07:08:31 crc kubenswrapper[4825]: I0310 07:08:31.414430 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 10 07:08:31 crc kubenswrapper[4825]: I0310 07:08:31.418092 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 10 07:08:32 crc kubenswrapper[4825]: I0310 07:08:32.006204 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 07:08:32 crc kubenswrapper[4825]: I0310 07:08:32.017614 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 10 07:08:32 crc kubenswrapper[4825]: I0310 07:08:32.080410 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:08:32 crc kubenswrapper[4825]: I0310 07:08:32.105757 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:08:32 crc kubenswrapper[4825]: I0310 07:08:32.117211 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:08:32 crc kubenswrapper[4825]: E0310 07:08:32.117562 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c09e2c9b-b94a-40cb-8646-687e010f3002" containerName="ceilometer-central-agent" Mar 10 07:08:32 crc kubenswrapper[4825]: I0310 07:08:32.117575 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="c09e2c9b-b94a-40cb-8646-687e010f3002" 
containerName="ceilometer-central-agent" Mar 10 07:08:32 crc kubenswrapper[4825]: E0310 07:08:32.117588 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c09e2c9b-b94a-40cb-8646-687e010f3002" containerName="ceilometer-notification-agent" Mar 10 07:08:32 crc kubenswrapper[4825]: I0310 07:08:32.117595 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="c09e2c9b-b94a-40cb-8646-687e010f3002" containerName="ceilometer-notification-agent" Mar 10 07:08:32 crc kubenswrapper[4825]: E0310 07:08:32.117611 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c09e2c9b-b94a-40cb-8646-687e010f3002" containerName="proxy-httpd" Mar 10 07:08:32 crc kubenswrapper[4825]: I0310 07:08:32.117618 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="c09e2c9b-b94a-40cb-8646-687e010f3002" containerName="proxy-httpd" Mar 10 07:08:32 crc kubenswrapper[4825]: E0310 07:08:32.117647 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c09e2c9b-b94a-40cb-8646-687e010f3002" containerName="sg-core" Mar 10 07:08:32 crc kubenswrapper[4825]: I0310 07:08:32.117653 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="c09e2c9b-b94a-40cb-8646-687e010f3002" containerName="sg-core" Mar 10 07:08:32 crc kubenswrapper[4825]: I0310 07:08:32.117813 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="c09e2c9b-b94a-40cb-8646-687e010f3002" containerName="ceilometer-notification-agent" Mar 10 07:08:32 crc kubenswrapper[4825]: I0310 07:08:32.117826 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="c09e2c9b-b94a-40cb-8646-687e010f3002" containerName="proxy-httpd" Mar 10 07:08:32 crc kubenswrapper[4825]: I0310 07:08:32.117843 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="c09e2c9b-b94a-40cb-8646-687e010f3002" containerName="sg-core" Mar 10 07:08:32 crc kubenswrapper[4825]: I0310 07:08:32.117853 4825 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c09e2c9b-b94a-40cb-8646-687e010f3002" containerName="ceilometer-central-agent" Mar 10 07:08:32 crc kubenswrapper[4825]: I0310 07:08:32.119747 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 07:08:32 crc kubenswrapper[4825]: I0310 07:08:32.122556 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 07:08:32 crc kubenswrapper[4825]: I0310 07:08:32.122715 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 07:08:32 crc kubenswrapper[4825]: I0310 07:08:32.122898 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 10 07:08:32 crc kubenswrapper[4825]: I0310 07:08:32.143252 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:08:32 crc kubenswrapper[4825]: I0310 07:08:32.220054 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42rdp\" (UniqueName: \"kubernetes.io/projected/e516b5cf-d44c-4f03-9247-7727319f0a85-kube-api-access-42rdp\") pod \"ceilometer-0\" (UID: \"e516b5cf-d44c-4f03-9247-7727319f0a85\") " pod="openstack/ceilometer-0" Mar 10 07:08:32 crc kubenswrapper[4825]: I0310 07:08:32.220273 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e516b5cf-d44c-4f03-9247-7727319f0a85-log-httpd\") pod \"ceilometer-0\" (UID: \"e516b5cf-d44c-4f03-9247-7727319f0a85\") " pod="openstack/ceilometer-0" Mar 10 07:08:32 crc kubenswrapper[4825]: I0310 07:08:32.220372 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e516b5cf-d44c-4f03-9247-7727319f0a85-scripts\") pod \"ceilometer-0\" (UID: \"e516b5cf-d44c-4f03-9247-7727319f0a85\") " 
pod="openstack/ceilometer-0" Mar 10 07:08:32 crc kubenswrapper[4825]: I0310 07:08:32.220449 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e516b5cf-d44c-4f03-9247-7727319f0a85-run-httpd\") pod \"ceilometer-0\" (UID: \"e516b5cf-d44c-4f03-9247-7727319f0a85\") " pod="openstack/ceilometer-0" Mar 10 07:08:32 crc kubenswrapper[4825]: I0310 07:08:32.220477 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e516b5cf-d44c-4f03-9247-7727319f0a85-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e516b5cf-d44c-4f03-9247-7727319f0a85\") " pod="openstack/ceilometer-0" Mar 10 07:08:32 crc kubenswrapper[4825]: I0310 07:08:32.220510 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e516b5cf-d44c-4f03-9247-7727319f0a85-config-data\") pod \"ceilometer-0\" (UID: \"e516b5cf-d44c-4f03-9247-7727319f0a85\") " pod="openstack/ceilometer-0" Mar 10 07:08:32 crc kubenswrapper[4825]: I0310 07:08:32.220597 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e516b5cf-d44c-4f03-9247-7727319f0a85-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e516b5cf-d44c-4f03-9247-7727319f0a85\") " pod="openstack/ceilometer-0" Mar 10 07:08:32 crc kubenswrapper[4825]: I0310 07:08:32.220724 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e516b5cf-d44c-4f03-9247-7727319f0a85-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e516b5cf-d44c-4f03-9247-7727319f0a85\") " pod="openstack/ceilometer-0" Mar 10 07:08:32 crc kubenswrapper[4825]: I0310 07:08:32.322788 4825 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42rdp\" (UniqueName: \"kubernetes.io/projected/e516b5cf-d44c-4f03-9247-7727319f0a85-kube-api-access-42rdp\") pod \"ceilometer-0\" (UID: \"e516b5cf-d44c-4f03-9247-7727319f0a85\") " pod="openstack/ceilometer-0" Mar 10 07:08:32 crc kubenswrapper[4825]: I0310 07:08:32.322900 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e516b5cf-d44c-4f03-9247-7727319f0a85-log-httpd\") pod \"ceilometer-0\" (UID: \"e516b5cf-d44c-4f03-9247-7727319f0a85\") " pod="openstack/ceilometer-0" Mar 10 07:08:32 crc kubenswrapper[4825]: I0310 07:08:32.322955 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e516b5cf-d44c-4f03-9247-7727319f0a85-scripts\") pod \"ceilometer-0\" (UID: \"e516b5cf-d44c-4f03-9247-7727319f0a85\") " pod="openstack/ceilometer-0" Mar 10 07:08:32 crc kubenswrapper[4825]: I0310 07:08:32.323008 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e516b5cf-d44c-4f03-9247-7727319f0a85-run-httpd\") pod \"ceilometer-0\" (UID: \"e516b5cf-d44c-4f03-9247-7727319f0a85\") " pod="openstack/ceilometer-0" Mar 10 07:08:32 crc kubenswrapper[4825]: I0310 07:08:32.323042 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e516b5cf-d44c-4f03-9247-7727319f0a85-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e516b5cf-d44c-4f03-9247-7727319f0a85\") " pod="openstack/ceilometer-0" Mar 10 07:08:32 crc kubenswrapper[4825]: I0310 07:08:32.323099 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e516b5cf-d44c-4f03-9247-7727319f0a85-config-data\") pod \"ceilometer-0\" (UID: 
\"e516b5cf-d44c-4f03-9247-7727319f0a85\") " pod="openstack/ceilometer-0" Mar 10 07:08:32 crc kubenswrapper[4825]: I0310 07:08:32.323181 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e516b5cf-d44c-4f03-9247-7727319f0a85-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e516b5cf-d44c-4f03-9247-7727319f0a85\") " pod="openstack/ceilometer-0" Mar 10 07:08:32 crc kubenswrapper[4825]: I0310 07:08:32.323371 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e516b5cf-d44c-4f03-9247-7727319f0a85-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e516b5cf-d44c-4f03-9247-7727319f0a85\") " pod="openstack/ceilometer-0" Mar 10 07:08:32 crc kubenswrapper[4825]: I0310 07:08:32.323805 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e516b5cf-d44c-4f03-9247-7727319f0a85-log-httpd\") pod \"ceilometer-0\" (UID: \"e516b5cf-d44c-4f03-9247-7727319f0a85\") " pod="openstack/ceilometer-0" Mar 10 07:08:32 crc kubenswrapper[4825]: I0310 07:08:32.324443 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e516b5cf-d44c-4f03-9247-7727319f0a85-run-httpd\") pod \"ceilometer-0\" (UID: \"e516b5cf-d44c-4f03-9247-7727319f0a85\") " pod="openstack/ceilometer-0" Mar 10 07:08:32 crc kubenswrapper[4825]: I0310 07:08:32.327782 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e516b5cf-d44c-4f03-9247-7727319f0a85-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e516b5cf-d44c-4f03-9247-7727319f0a85\") " pod="openstack/ceilometer-0" Mar 10 07:08:32 crc kubenswrapper[4825]: I0310 07:08:32.329069 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e516b5cf-d44c-4f03-9247-7727319f0a85-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e516b5cf-d44c-4f03-9247-7727319f0a85\") " pod="openstack/ceilometer-0" Mar 10 07:08:32 crc kubenswrapper[4825]: I0310 07:08:32.330088 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e516b5cf-d44c-4f03-9247-7727319f0a85-config-data\") pod \"ceilometer-0\" (UID: \"e516b5cf-d44c-4f03-9247-7727319f0a85\") " pod="openstack/ceilometer-0" Mar 10 07:08:32 crc kubenswrapper[4825]: I0310 07:08:32.331186 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e516b5cf-d44c-4f03-9247-7727319f0a85-scripts\") pod \"ceilometer-0\" (UID: \"e516b5cf-d44c-4f03-9247-7727319f0a85\") " pod="openstack/ceilometer-0" Mar 10 07:08:32 crc kubenswrapper[4825]: I0310 07:08:32.334564 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e516b5cf-d44c-4f03-9247-7727319f0a85-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e516b5cf-d44c-4f03-9247-7727319f0a85\") " pod="openstack/ceilometer-0" Mar 10 07:08:32 crc kubenswrapper[4825]: I0310 07:08:32.342916 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42rdp\" (UniqueName: \"kubernetes.io/projected/e516b5cf-d44c-4f03-9247-7727319f0a85-kube-api-access-42rdp\") pod \"ceilometer-0\" (UID: \"e516b5cf-d44c-4f03-9247-7727319f0a85\") " pod="openstack/ceilometer-0" Mar 10 07:08:32 crc kubenswrapper[4825]: I0310 07:08:32.441745 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 07:08:33 crc kubenswrapper[4825]: I0310 07:08:33.030946 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:08:33 crc kubenswrapper[4825]: I0310 07:08:33.251648 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c09e2c9b-b94a-40cb-8646-687e010f3002" path="/var/lib/kubelet/pods/c09e2c9b-b94a-40cb-8646-687e010f3002/volumes" Mar 10 07:08:33 crc kubenswrapper[4825]: I0310 07:08:33.770957 4825 scope.go:117] "RemoveContainer" containerID="dca2ad7e4a1d246c7cb38e7f11895db1d7d4fb25450a32dd1c07efe0f4d74bf7" Mar 10 07:08:34 crc kubenswrapper[4825]: I0310 07:08:34.069827 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e516b5cf-d44c-4f03-9247-7727319f0a85","Type":"ContainerStarted","Data":"b5fc534b6fc5bdb62a0dd359fd61377bd0c4ca9f7943fd237ff73b6a79274f5c"} Mar 10 07:08:34 crc kubenswrapper[4825]: I0310 07:08:34.069863 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e516b5cf-d44c-4f03-9247-7727319f0a85","Type":"ContainerStarted","Data":"46acdbecf5b0d066e1d674a88c83a7a55aa838f9900872fc9a70c8c736c01984"} Mar 10 07:08:35 crc kubenswrapper[4825]: I0310 07:08:35.092892 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e516b5cf-d44c-4f03-9247-7727319f0a85","Type":"ContainerStarted","Data":"d021be5f73ead6231f65789871b1004af37bd859e93d3efa1540dedd07f320d9"} Mar 10 07:08:36 crc kubenswrapper[4825]: I0310 07:08:36.107238 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e516b5cf-d44c-4f03-9247-7727319f0a85","Type":"ContainerStarted","Data":"b95484ef86f19d2fa040d657d6e6e16f10d17dd575ffd8009352291bebecc277"} Mar 10 07:08:37 crc kubenswrapper[4825]: I0310 07:08:37.124419 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"e516b5cf-d44c-4f03-9247-7727319f0a85","Type":"ContainerStarted","Data":"062160b5e34d30d57397fa7d0b1b030e8fd845e13121bd5d0cdea1d6a20a2de8"} Mar 10 07:08:37 crc kubenswrapper[4825]: I0310 07:08:37.124909 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 07:08:37 crc kubenswrapper[4825]: I0310 07:08:37.147969 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.5234759329999998 podStartE2EDuration="5.147953238s" podCreationTimestamp="2026-03-10 07:08:32 +0000 UTC" firstStartedPulling="2026-03-10 07:08:33.047262399 +0000 UTC m=+1466.077043014" lastFinishedPulling="2026-03-10 07:08:36.671739664 +0000 UTC m=+1469.701520319" observedRunningTime="2026-03-10 07:08:37.145873994 +0000 UTC m=+1470.175654609" watchObservedRunningTime="2026-03-10 07:08:37.147953238 +0000 UTC m=+1470.177733853" Mar 10 07:08:55 crc kubenswrapper[4825]: I0310 07:08:55.002858 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nlhhs"] Mar 10 07:08:55 crc kubenswrapper[4825]: I0310 07:08:55.006677 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nlhhs" Mar 10 07:08:55 crc kubenswrapper[4825]: I0310 07:08:55.027245 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nlhhs"] Mar 10 07:08:55 crc kubenswrapper[4825]: I0310 07:08:55.194670 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djcmw\" (UniqueName: \"kubernetes.io/projected/65a7e280-1972-4414-a0dc-420a790f9be6-kube-api-access-djcmw\") pod \"redhat-operators-nlhhs\" (UID: \"65a7e280-1972-4414-a0dc-420a790f9be6\") " pod="openshift-marketplace/redhat-operators-nlhhs" Mar 10 07:08:55 crc kubenswrapper[4825]: I0310 07:08:55.195000 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65a7e280-1972-4414-a0dc-420a790f9be6-utilities\") pod \"redhat-operators-nlhhs\" (UID: \"65a7e280-1972-4414-a0dc-420a790f9be6\") " pod="openshift-marketplace/redhat-operators-nlhhs" Mar 10 07:08:55 crc kubenswrapper[4825]: I0310 07:08:55.195028 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65a7e280-1972-4414-a0dc-420a790f9be6-catalog-content\") pod \"redhat-operators-nlhhs\" (UID: \"65a7e280-1972-4414-a0dc-420a790f9be6\") " pod="openshift-marketplace/redhat-operators-nlhhs" Mar 10 07:08:55 crc kubenswrapper[4825]: I0310 07:08:55.296480 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djcmw\" (UniqueName: \"kubernetes.io/projected/65a7e280-1972-4414-a0dc-420a790f9be6-kube-api-access-djcmw\") pod \"redhat-operators-nlhhs\" (UID: \"65a7e280-1972-4414-a0dc-420a790f9be6\") " pod="openshift-marketplace/redhat-operators-nlhhs" Mar 10 07:08:55 crc kubenswrapper[4825]: I0310 07:08:55.296552 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65a7e280-1972-4414-a0dc-420a790f9be6-utilities\") pod \"redhat-operators-nlhhs\" (UID: \"65a7e280-1972-4414-a0dc-420a790f9be6\") " pod="openshift-marketplace/redhat-operators-nlhhs" Mar 10 07:08:55 crc kubenswrapper[4825]: I0310 07:08:55.296577 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65a7e280-1972-4414-a0dc-420a790f9be6-catalog-content\") pod \"redhat-operators-nlhhs\" (UID: \"65a7e280-1972-4414-a0dc-420a790f9be6\") " pod="openshift-marketplace/redhat-operators-nlhhs" Mar 10 07:08:55 crc kubenswrapper[4825]: I0310 07:08:55.297113 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65a7e280-1972-4414-a0dc-420a790f9be6-catalog-content\") pod \"redhat-operators-nlhhs\" (UID: \"65a7e280-1972-4414-a0dc-420a790f9be6\") " pod="openshift-marketplace/redhat-operators-nlhhs" Mar 10 07:08:55 crc kubenswrapper[4825]: I0310 07:08:55.297272 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65a7e280-1972-4414-a0dc-420a790f9be6-utilities\") pod \"redhat-operators-nlhhs\" (UID: \"65a7e280-1972-4414-a0dc-420a790f9be6\") " pod="openshift-marketplace/redhat-operators-nlhhs" Mar 10 07:08:55 crc kubenswrapper[4825]: I0310 07:08:55.333020 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djcmw\" (UniqueName: \"kubernetes.io/projected/65a7e280-1972-4414-a0dc-420a790f9be6-kube-api-access-djcmw\") pod \"redhat-operators-nlhhs\" (UID: \"65a7e280-1972-4414-a0dc-420a790f9be6\") " pod="openshift-marketplace/redhat-operators-nlhhs" Mar 10 07:08:55 crc kubenswrapper[4825]: I0310 07:08:55.338476 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nlhhs" Mar 10 07:08:55 crc kubenswrapper[4825]: I0310 07:08:55.790302 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nlhhs"] Mar 10 07:08:55 crc kubenswrapper[4825]: W0310 07:08:55.795387 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65a7e280_1972_4414_a0dc_420a790f9be6.slice/crio-cf3c52a1c4cd830a413f6d8474a71383c06b997bc4ea4ec4782384dc0a894c84 WatchSource:0}: Error finding container cf3c52a1c4cd830a413f6d8474a71383c06b997bc4ea4ec4782384dc0a894c84: Status 404 returned error can't find the container with id cf3c52a1c4cd830a413f6d8474a71383c06b997bc4ea4ec4782384dc0a894c84 Mar 10 07:08:56 crc kubenswrapper[4825]: I0310 07:08:56.389673 4825 generic.go:334] "Generic (PLEG): container finished" podID="65a7e280-1972-4414-a0dc-420a790f9be6" containerID="c1e93edc34dd8bc64f2eb94ecb3e1898f7f17b546f62e3b6735090797e997228" exitCode=0 Mar 10 07:08:56 crc kubenswrapper[4825]: I0310 07:08:56.389812 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nlhhs" event={"ID":"65a7e280-1972-4414-a0dc-420a790f9be6","Type":"ContainerDied","Data":"c1e93edc34dd8bc64f2eb94ecb3e1898f7f17b546f62e3b6735090797e997228"} Mar 10 07:08:56 crc kubenswrapper[4825]: I0310 07:08:56.389961 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nlhhs" event={"ID":"65a7e280-1972-4414-a0dc-420a790f9be6","Type":"ContainerStarted","Data":"cf3c52a1c4cd830a413f6d8474a71383c06b997bc4ea4ec4782384dc0a894c84"} Mar 10 07:08:58 crc kubenswrapper[4825]: I0310 07:08:58.410459 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nlhhs" 
event={"ID":"65a7e280-1972-4414-a0dc-420a790f9be6","Type":"ContainerStarted","Data":"c382bfff07f574aeb3f430c9d0c463c50ef65ebaa99b73d7da818e02ad96ea7b"} Mar 10 07:08:59 crc kubenswrapper[4825]: I0310 07:08:59.421054 4825 generic.go:334] "Generic (PLEG): container finished" podID="65a7e280-1972-4414-a0dc-420a790f9be6" containerID="c382bfff07f574aeb3f430c9d0c463c50ef65ebaa99b73d7da818e02ad96ea7b" exitCode=0 Mar 10 07:08:59 crc kubenswrapper[4825]: I0310 07:08:59.421104 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nlhhs" event={"ID":"65a7e280-1972-4414-a0dc-420a790f9be6","Type":"ContainerDied","Data":"c382bfff07f574aeb3f430c9d0c463c50ef65ebaa99b73d7da818e02ad96ea7b"} Mar 10 07:09:00 crc kubenswrapper[4825]: I0310 07:09:00.431323 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nlhhs" event={"ID":"65a7e280-1972-4414-a0dc-420a790f9be6","Type":"ContainerStarted","Data":"bcd1144a48f5b6bbff222c98f7d86167e5293815d9e4eb3bf54ea72daa3c2d87"} Mar 10 07:09:00 crc kubenswrapper[4825]: I0310 07:09:00.455820 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nlhhs" podStartSLOduration=2.924079601 podStartE2EDuration="6.455800088s" podCreationTimestamp="2026-03-10 07:08:54 +0000 UTC" firstStartedPulling="2026-03-10 07:08:56.391860387 +0000 UTC m=+1489.421641002" lastFinishedPulling="2026-03-10 07:08:59.923580874 +0000 UTC m=+1492.953361489" observedRunningTime="2026-03-10 07:09:00.45127136 +0000 UTC m=+1493.481052005" watchObservedRunningTime="2026-03-10 07:09:00.455800088 +0000 UTC m=+1493.485580713" Mar 10 07:09:02 crc kubenswrapper[4825]: I0310 07:09:02.459180 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 10 07:09:05 crc kubenswrapper[4825]: I0310 07:09:05.339934 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-nlhhs" Mar 10 07:09:05 crc kubenswrapper[4825]: I0310 07:09:05.340423 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nlhhs" Mar 10 07:09:06 crc kubenswrapper[4825]: I0310 07:09:06.420082 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nlhhs" podUID="65a7e280-1972-4414-a0dc-420a790f9be6" containerName="registry-server" probeResult="failure" output=< Mar 10 07:09:06 crc kubenswrapper[4825]: timeout: failed to connect service ":50051" within 1s Mar 10 07:09:06 crc kubenswrapper[4825]: > Mar 10 07:09:15 crc kubenswrapper[4825]: I0310 07:09:15.423361 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nlhhs" Mar 10 07:09:15 crc kubenswrapper[4825]: I0310 07:09:15.499265 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nlhhs" Mar 10 07:09:15 crc kubenswrapper[4825]: I0310 07:09:15.670533 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nlhhs"] Mar 10 07:09:16 crc kubenswrapper[4825]: I0310 07:09:16.612527 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nlhhs" podUID="65a7e280-1972-4414-a0dc-420a790f9be6" containerName="registry-server" containerID="cri-o://bcd1144a48f5b6bbff222c98f7d86167e5293815d9e4eb3bf54ea72daa3c2d87" gracePeriod=2 Mar 10 07:09:16 crc kubenswrapper[4825]: I0310 07:09:16.888359 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 07:09:16 crc kubenswrapper[4825]: I0310 07:09:16.888605 4825 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 07:09:17 crc kubenswrapper[4825]: I0310 07:09:17.170746 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nlhhs" Mar 10 07:09:17 crc kubenswrapper[4825]: I0310 07:09:17.338671 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65a7e280-1972-4414-a0dc-420a790f9be6-catalog-content\") pod \"65a7e280-1972-4414-a0dc-420a790f9be6\" (UID: \"65a7e280-1972-4414-a0dc-420a790f9be6\") " Mar 10 07:09:17 crc kubenswrapper[4825]: I0310 07:09:17.338765 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djcmw\" (UniqueName: \"kubernetes.io/projected/65a7e280-1972-4414-a0dc-420a790f9be6-kube-api-access-djcmw\") pod \"65a7e280-1972-4414-a0dc-420a790f9be6\" (UID: \"65a7e280-1972-4414-a0dc-420a790f9be6\") " Mar 10 07:09:17 crc kubenswrapper[4825]: I0310 07:09:17.338790 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65a7e280-1972-4414-a0dc-420a790f9be6-utilities\") pod \"65a7e280-1972-4414-a0dc-420a790f9be6\" (UID: \"65a7e280-1972-4414-a0dc-420a790f9be6\") " Mar 10 07:09:17 crc kubenswrapper[4825]: I0310 07:09:17.340701 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65a7e280-1972-4414-a0dc-420a790f9be6-utilities" (OuterVolumeSpecName: "utilities") pod "65a7e280-1972-4414-a0dc-420a790f9be6" (UID: "65a7e280-1972-4414-a0dc-420a790f9be6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:09:17 crc kubenswrapper[4825]: I0310 07:09:17.345059 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65a7e280-1972-4414-a0dc-420a790f9be6-kube-api-access-djcmw" (OuterVolumeSpecName: "kube-api-access-djcmw") pod "65a7e280-1972-4414-a0dc-420a790f9be6" (UID: "65a7e280-1972-4414-a0dc-420a790f9be6"). InnerVolumeSpecName "kube-api-access-djcmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:09:17 crc kubenswrapper[4825]: I0310 07:09:17.441407 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djcmw\" (UniqueName: \"kubernetes.io/projected/65a7e280-1972-4414-a0dc-420a790f9be6-kube-api-access-djcmw\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:17 crc kubenswrapper[4825]: I0310 07:09:17.441459 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65a7e280-1972-4414-a0dc-420a790f9be6-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:17 crc kubenswrapper[4825]: I0310 07:09:17.484324 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65a7e280-1972-4414-a0dc-420a790f9be6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "65a7e280-1972-4414-a0dc-420a790f9be6" (UID: "65a7e280-1972-4414-a0dc-420a790f9be6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:09:17 crc kubenswrapper[4825]: I0310 07:09:17.543040 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65a7e280-1972-4414-a0dc-420a790f9be6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:17 crc kubenswrapper[4825]: I0310 07:09:17.624344 4825 generic.go:334] "Generic (PLEG): container finished" podID="65a7e280-1972-4414-a0dc-420a790f9be6" containerID="bcd1144a48f5b6bbff222c98f7d86167e5293815d9e4eb3bf54ea72daa3c2d87" exitCode=0 Mar 10 07:09:17 crc kubenswrapper[4825]: I0310 07:09:17.624381 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nlhhs" event={"ID":"65a7e280-1972-4414-a0dc-420a790f9be6","Type":"ContainerDied","Data":"bcd1144a48f5b6bbff222c98f7d86167e5293815d9e4eb3bf54ea72daa3c2d87"} Mar 10 07:09:17 crc kubenswrapper[4825]: I0310 07:09:17.624409 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nlhhs" event={"ID":"65a7e280-1972-4414-a0dc-420a790f9be6","Type":"ContainerDied","Data":"cf3c52a1c4cd830a413f6d8474a71383c06b997bc4ea4ec4782384dc0a894c84"} Mar 10 07:09:17 crc kubenswrapper[4825]: I0310 07:09:17.624427 4825 scope.go:117] "RemoveContainer" containerID="bcd1144a48f5b6bbff222c98f7d86167e5293815d9e4eb3bf54ea72daa3c2d87" Mar 10 07:09:17 crc kubenswrapper[4825]: I0310 07:09:17.624450 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nlhhs" Mar 10 07:09:17 crc kubenswrapper[4825]: I0310 07:09:17.667098 4825 scope.go:117] "RemoveContainer" containerID="c382bfff07f574aeb3f430c9d0c463c50ef65ebaa99b73d7da818e02ad96ea7b" Mar 10 07:09:17 crc kubenswrapper[4825]: I0310 07:09:17.673182 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nlhhs"] Mar 10 07:09:17 crc kubenswrapper[4825]: I0310 07:09:17.690055 4825 scope.go:117] "RemoveContainer" containerID="c1e93edc34dd8bc64f2eb94ecb3e1898f7f17b546f62e3b6735090797e997228" Mar 10 07:09:17 crc kubenswrapper[4825]: I0310 07:09:17.712278 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nlhhs"] Mar 10 07:09:17 crc kubenswrapper[4825]: I0310 07:09:17.733251 4825 scope.go:117] "RemoveContainer" containerID="bcd1144a48f5b6bbff222c98f7d86167e5293815d9e4eb3bf54ea72daa3c2d87" Mar 10 07:09:17 crc kubenswrapper[4825]: E0310 07:09:17.733667 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcd1144a48f5b6bbff222c98f7d86167e5293815d9e4eb3bf54ea72daa3c2d87\": container with ID starting with bcd1144a48f5b6bbff222c98f7d86167e5293815d9e4eb3bf54ea72daa3c2d87 not found: ID does not exist" containerID="bcd1144a48f5b6bbff222c98f7d86167e5293815d9e4eb3bf54ea72daa3c2d87" Mar 10 07:09:17 crc kubenswrapper[4825]: I0310 07:09:17.733711 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcd1144a48f5b6bbff222c98f7d86167e5293815d9e4eb3bf54ea72daa3c2d87"} err="failed to get container status \"bcd1144a48f5b6bbff222c98f7d86167e5293815d9e4eb3bf54ea72daa3c2d87\": rpc error: code = NotFound desc = could not find container \"bcd1144a48f5b6bbff222c98f7d86167e5293815d9e4eb3bf54ea72daa3c2d87\": container with ID starting with bcd1144a48f5b6bbff222c98f7d86167e5293815d9e4eb3bf54ea72daa3c2d87 not found: ID does 
not exist" Mar 10 07:09:17 crc kubenswrapper[4825]: I0310 07:09:17.733746 4825 scope.go:117] "RemoveContainer" containerID="c382bfff07f574aeb3f430c9d0c463c50ef65ebaa99b73d7da818e02ad96ea7b" Mar 10 07:09:17 crc kubenswrapper[4825]: E0310 07:09:17.734277 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c382bfff07f574aeb3f430c9d0c463c50ef65ebaa99b73d7da818e02ad96ea7b\": container with ID starting with c382bfff07f574aeb3f430c9d0c463c50ef65ebaa99b73d7da818e02ad96ea7b not found: ID does not exist" containerID="c382bfff07f574aeb3f430c9d0c463c50ef65ebaa99b73d7da818e02ad96ea7b" Mar 10 07:09:17 crc kubenswrapper[4825]: I0310 07:09:17.734333 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c382bfff07f574aeb3f430c9d0c463c50ef65ebaa99b73d7da818e02ad96ea7b"} err="failed to get container status \"c382bfff07f574aeb3f430c9d0c463c50ef65ebaa99b73d7da818e02ad96ea7b\": rpc error: code = NotFound desc = could not find container \"c382bfff07f574aeb3f430c9d0c463c50ef65ebaa99b73d7da818e02ad96ea7b\": container with ID starting with c382bfff07f574aeb3f430c9d0c463c50ef65ebaa99b73d7da818e02ad96ea7b not found: ID does not exist" Mar 10 07:09:17 crc kubenswrapper[4825]: I0310 07:09:17.734364 4825 scope.go:117] "RemoveContainer" containerID="c1e93edc34dd8bc64f2eb94ecb3e1898f7f17b546f62e3b6735090797e997228" Mar 10 07:09:17 crc kubenswrapper[4825]: E0310 07:09:17.734761 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1e93edc34dd8bc64f2eb94ecb3e1898f7f17b546f62e3b6735090797e997228\": container with ID starting with c1e93edc34dd8bc64f2eb94ecb3e1898f7f17b546f62e3b6735090797e997228 not found: ID does not exist" containerID="c1e93edc34dd8bc64f2eb94ecb3e1898f7f17b546f62e3b6735090797e997228" Mar 10 07:09:17 crc kubenswrapper[4825]: I0310 07:09:17.734827 4825 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1e93edc34dd8bc64f2eb94ecb3e1898f7f17b546f62e3b6735090797e997228"} err="failed to get container status \"c1e93edc34dd8bc64f2eb94ecb3e1898f7f17b546f62e3b6735090797e997228\": rpc error: code = NotFound desc = could not find container \"c1e93edc34dd8bc64f2eb94ecb3e1898f7f17b546f62e3b6735090797e997228\": container with ID starting with c1e93edc34dd8bc64f2eb94ecb3e1898f7f17b546f62e3b6735090797e997228 not found: ID does not exist" Mar 10 07:09:19 crc kubenswrapper[4825]: I0310 07:09:19.256817 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65a7e280-1972-4414-a0dc-420a790f9be6" path="/var/lib/kubelet/pods/65a7e280-1972-4414-a0dc-420a790f9be6/volumes" Mar 10 07:09:22 crc kubenswrapper[4825]: I0310 07:09:22.361836 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 10 07:09:22 crc kubenswrapper[4825]: I0310 07:09:22.362689 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11" containerName="openstackclient" containerID="cri-o://eae523cad9b94daad0aa32c8a855fc6def30624e340d1c962919717b99480d5e" gracePeriod=2 Mar 10 07:09:22 crc kubenswrapper[4825]: I0310 07:09:22.430644 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 10 07:09:22 crc kubenswrapper[4825]: I0310 07:09:22.465466 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-1e7f-account-create-update-cxt8s"] Mar 10 07:09:22 crc kubenswrapper[4825]: I0310 07:09:22.483682 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-1e7f-account-create-update-cxt8s"] Mar 10 07:09:22 crc kubenswrapper[4825]: I0310 07:09:22.526588 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-1e7f-account-create-update-rfzld"] Mar 10 07:09:22 crc kubenswrapper[4825]: E0310 07:09:22.527584 4825 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65a7e280-1972-4414-a0dc-420a790f9be6" containerName="registry-server" Mar 10 07:09:22 crc kubenswrapper[4825]: I0310 07:09:22.527607 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="65a7e280-1972-4414-a0dc-420a790f9be6" containerName="registry-server" Mar 10 07:09:22 crc kubenswrapper[4825]: E0310 07:09:22.527636 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65a7e280-1972-4414-a0dc-420a790f9be6" containerName="extract-content" Mar 10 07:09:22 crc kubenswrapper[4825]: I0310 07:09:22.527644 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="65a7e280-1972-4414-a0dc-420a790f9be6" containerName="extract-content" Mar 10 07:09:22 crc kubenswrapper[4825]: E0310 07:09:22.527658 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11" containerName="openstackclient" Mar 10 07:09:22 crc kubenswrapper[4825]: I0310 07:09:22.527668 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11" containerName="openstackclient" Mar 10 07:09:22 crc kubenswrapper[4825]: E0310 07:09:22.527689 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65a7e280-1972-4414-a0dc-420a790f9be6" containerName="extract-utilities" Mar 10 07:09:22 crc kubenswrapper[4825]: I0310 07:09:22.527697 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="65a7e280-1972-4414-a0dc-420a790f9be6" containerName="extract-utilities" Mar 10 07:09:22 crc kubenswrapper[4825]: I0310 07:09:22.527903 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11" containerName="openstackclient" Mar 10 07:09:22 crc kubenswrapper[4825]: I0310 07:09:22.527936 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="65a7e280-1972-4414-a0dc-420a790f9be6" containerName="registry-server" Mar 10 07:09:22 crc kubenswrapper[4825]: I0310 07:09:22.528737 4825 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-1e7f-account-create-update-rfzld" Mar 10 07:09:22 crc kubenswrapper[4825]: I0310 07:09:22.547995 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-1e7f-account-create-update-rfzld"] Mar 10 07:09:22 crc kubenswrapper[4825]: I0310 07:09:22.568974 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 10 07:09:22 crc kubenswrapper[4825]: I0310 07:09:22.592074 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-pwt26"] Mar 10 07:09:22 crc kubenswrapper[4825]: I0310 07:09:22.593558 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-pwt26" Mar 10 07:09:22 crc kubenswrapper[4825]: I0310 07:09:22.602076 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 10 07:09:22 crc kubenswrapper[4825]: I0310 07:09:22.649491 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-pwt26"] Mar 10 07:09:22 crc kubenswrapper[4825]: I0310 07:09:22.676281 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e808443d-0b2f-45a3-baa9-61e4d9819e6c-operator-scripts\") pod \"placement-1e7f-account-create-update-rfzld\" (UID: \"e808443d-0b2f-45a3-baa9-61e4d9819e6c\") " pod="openstack/placement-1e7f-account-create-update-rfzld" Mar 10 07:09:22 crc kubenswrapper[4825]: I0310 07:09:22.676367 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2dhk\" (UniqueName: \"kubernetes.io/projected/e808443d-0b2f-45a3-baa9-61e4d9819e6c-kube-api-access-d2dhk\") pod \"placement-1e7f-account-create-update-rfzld\" (UID: \"e808443d-0b2f-45a3-baa9-61e4d9819e6c\") 
" pod="openstack/placement-1e7f-account-create-update-rfzld" Mar 10 07:09:22 crc kubenswrapper[4825]: I0310 07:09:22.676424 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c684f3e4-ced9-49c3-aa54-515bc2c4fb56-operator-scripts\") pod \"root-account-create-update-pwt26\" (UID: \"c684f3e4-ced9-49c3-aa54-515bc2c4fb56\") " pod="openstack/root-account-create-update-pwt26" Mar 10 07:09:22 crc kubenswrapper[4825]: I0310 07:09:22.676443 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k67d\" (UniqueName: \"kubernetes.io/projected/c684f3e4-ced9-49c3-aa54-515bc2c4fb56-kube-api-access-9k67d\") pod \"root-account-create-update-pwt26\" (UID: \"c684f3e4-ced9-49c3-aa54-515bc2c4fb56\") " pod="openstack/root-account-create-update-pwt26" Mar 10 07:09:22 crc kubenswrapper[4825]: I0310 07:09:22.706471 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-z4g4q"] Mar 10 07:09:22 crc kubenswrapper[4825]: I0310 07:09:22.738073 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-z4g4q"] Mar 10 07:09:22 crc kubenswrapper[4825]: I0310 07:09:22.781216 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2dhk\" (UniqueName: \"kubernetes.io/projected/e808443d-0b2f-45a3-baa9-61e4d9819e6c-kube-api-access-d2dhk\") pod \"placement-1e7f-account-create-update-rfzld\" (UID: \"e808443d-0b2f-45a3-baa9-61e4d9819e6c\") " pod="openstack/placement-1e7f-account-create-update-rfzld" Mar 10 07:09:22 crc kubenswrapper[4825]: I0310 07:09:22.781268 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c684f3e4-ced9-49c3-aa54-515bc2c4fb56-operator-scripts\") pod \"root-account-create-update-pwt26\" (UID: 
\"c684f3e4-ced9-49c3-aa54-515bc2c4fb56\") " pod="openstack/root-account-create-update-pwt26" Mar 10 07:09:22 crc kubenswrapper[4825]: I0310 07:09:22.781289 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9k67d\" (UniqueName: \"kubernetes.io/projected/c684f3e4-ced9-49c3-aa54-515bc2c4fb56-kube-api-access-9k67d\") pod \"root-account-create-update-pwt26\" (UID: \"c684f3e4-ced9-49c3-aa54-515bc2c4fb56\") " pod="openstack/root-account-create-update-pwt26" Mar 10 07:09:22 crc kubenswrapper[4825]: I0310 07:09:22.781403 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e808443d-0b2f-45a3-baa9-61e4d9819e6c-operator-scripts\") pod \"placement-1e7f-account-create-update-rfzld\" (UID: \"e808443d-0b2f-45a3-baa9-61e4d9819e6c\") " pod="openstack/placement-1e7f-account-create-update-rfzld" Mar 10 07:09:22 crc kubenswrapper[4825]: I0310 07:09:22.782062 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e808443d-0b2f-45a3-baa9-61e4d9819e6c-operator-scripts\") pod \"placement-1e7f-account-create-update-rfzld\" (UID: \"e808443d-0b2f-45a3-baa9-61e4d9819e6c\") " pod="openstack/placement-1e7f-account-create-update-rfzld" Mar 10 07:09:22 crc kubenswrapper[4825]: I0310 07:09:22.782740 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c684f3e4-ced9-49c3-aa54-515bc2c4fb56-operator-scripts\") pod \"root-account-create-update-pwt26\" (UID: \"c684f3e4-ced9-49c3-aa54-515bc2c4fb56\") " pod="openstack/root-account-create-update-pwt26" Mar 10 07:09:22 crc kubenswrapper[4825]: I0310 07:09:22.852867 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 07:09:22 crc kubenswrapper[4825]: I0310 07:09:22.879907 4825 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/barbican-b26b-account-create-update-8l4dm"] Mar 10 07:09:22 crc kubenswrapper[4825]: I0310 07:09:22.881835 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k67d\" (UniqueName: \"kubernetes.io/projected/c684f3e4-ced9-49c3-aa54-515bc2c4fb56-kube-api-access-9k67d\") pod \"root-account-create-update-pwt26\" (UID: \"c684f3e4-ced9-49c3-aa54-515bc2c4fb56\") " pod="openstack/root-account-create-update-pwt26" Mar 10 07:09:22 crc kubenswrapper[4825]: I0310 07:09:22.884907 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2dhk\" (UniqueName: \"kubernetes.io/projected/e808443d-0b2f-45a3-baa9-61e4d9819e6c-kube-api-access-d2dhk\") pod \"placement-1e7f-account-create-update-rfzld\" (UID: \"e808443d-0b2f-45a3-baa9-61e4d9819e6c\") " pod="openstack/placement-1e7f-account-create-update-rfzld" Mar 10 07:09:22 crc kubenswrapper[4825]: I0310 07:09:22.934941 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-pwt26" Mar 10 07:09:22 crc kubenswrapper[4825]: I0310 07:09:22.968879 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-b26b-account-create-update-8l4dm"] Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.067476 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-cnt4k"] Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.068362 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-cnt4k" podUID="d2e08f36-91c9-4bad-b1fb-88a0938c4d25" containerName="openstack-network-exporter" containerID="cri-o://a0f14677c076a4d6fea4cbdb391654ebcd047d9f5740b2422a0cfce1097e8da1" gracePeriod=30 Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.126198 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-8x4pm"] Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.185934 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-1e7f-account-create-update-rfzld" Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.201496 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-jctbb"] Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.221766 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.290381 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e9617ba-17da-4946-8c5b-b1060683237d" path="/var/lib/kubelet/pods/4e9617ba-17da-4946-8c5b-b1060683237d/volumes" Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.290936 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b31a6071-5d6d-4ea5-9639-8fc8ea220862" path="/var/lib/kubelet/pods/b31a6071-5d6d-4ea5-9639-8fc8ea220862/volumes" Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.291474 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3950c3f-9b1c-495c-a837-9d2e13500042" path="/var/lib/kubelet/pods/f3950c3f-9b1c-495c-a837-9d2e13500042/volumes" Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.292048 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-b6jlj"] Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.347200 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-b6jlj"] Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.394593 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.395053 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="d77fced6-dee7-49fc-9088-e173a5be3cee" containerName="ovn-northd" containerID="cri-o://6a4f962505809098311731c7df1e9e9acaa99f0d4d2a16ebefd3083955087398" gracePeriod=30 Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 
07:09:23.395196 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="d77fced6-dee7-49fc-9088-e173a5be3cee" containerName="openstack-network-exporter" containerID="cri-o://4ceda535e48bc00d5f96a25e503044a292df24630498d86c8a415946cacf0904" gracePeriod=30 Mar 10 07:09:23 crc kubenswrapper[4825]: E0310 07:09:23.404162 4825 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 10 07:09:23 crc kubenswrapper[4825]: E0310 07:09:23.409418 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/40efa241-98cc-4dec-9ae8-8a892b367ebc-config-data podName:40efa241-98cc-4dec-9ae8-8a892b367ebc nodeName:}" failed. No retries permitted until 2026-03-10 07:09:23.909400384 +0000 UTC m=+1516.939180999 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/40efa241-98cc-4dec-9ae8-8a892b367ebc-config-data") pod "rabbitmq-server-0" (UID: "40efa241-98cc-4dec-9ae8-8a892b367ebc") : configmap "rabbitmq-config-data" not found Mar 10 07:09:23 crc kubenswrapper[4825]: E0310 07:09:23.409949 4825 secret.go:188] Couldn't get secret openstack/glance-scripts: secret "glance-scripts" not found Mar 10 07:09:23 crc kubenswrapper[4825]: E0310 07:09:23.410000 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f33e10c8-8be2-46d4-8653-1960855a2a40-scripts podName:f33e10c8-8be2-46d4-8653-1960855a2a40 nodeName:}" failed. No retries permitted until 2026-03-10 07:09:23.909983799 +0000 UTC m=+1516.939764414 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/f33e10c8-8be2-46d4-8653-1960855a2a40-scripts") pod "glance-default-external-api-0" (UID: "f33e10c8-8be2-46d4-8653-1960855a2a40") : secret "glance-scripts" not found Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.425520 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-833c-account-create-update-stmjm"] Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.460074 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-j9wfx"] Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.526214 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-833c-account-create-update-stmjm"] Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.555720 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-j9wfx"] Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.576264 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-7zgw8"] Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.606210 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-7zgw8"] Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.647726 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-qd4jq"] Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.691232 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-qd4jq"] Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.724037 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-qlkjv"] Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.764549 4825 generic.go:334] "Generic (PLEG): container finished" podID="d77fced6-dee7-49fc-9088-e173a5be3cee" containerID="4ceda535e48bc00d5f96a25e503044a292df24630498d86c8a415946cacf0904" exitCode=2 Mar 10 07:09:23 crc kubenswrapper[4825]: 
I0310 07:09:23.764665 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d77fced6-dee7-49fc-9088-e173a5be3cee","Type":"ContainerDied","Data":"4ceda535e48bc00d5f96a25e503044a292df24630498d86c8a415946cacf0904"} Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.784871 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-cnt4k_d2e08f36-91c9-4bad-b1fb-88a0938c4d25/openstack-network-exporter/0.log" Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.785121 4825 generic.go:334] "Generic (PLEG): container finished" podID="d2e08f36-91c9-4bad-b1fb-88a0938c4d25" containerID="a0f14677c076a4d6fea4cbdb391654ebcd047d9f5740b2422a0cfce1097e8da1" exitCode=2 Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.785222 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-cnt4k" event={"ID":"d2e08f36-91c9-4bad-b1fb-88a0938c4d25","Type":"ContainerDied","Data":"a0f14677c076a4d6fea4cbdb391654ebcd047d9f5740b2422a0cfce1097e8da1"} Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.873960 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-qlkjv"] Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.904973 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-55b54bdfdb-8b9ns"] Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.906348 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-55b54bdfdb-8b9ns" podUID="56c74365-656c-4362-8358-bbb17d0c8be0" containerName="placement-log" containerID="cri-o://cf0bf49129662bee51dc0d9f341998b13bf6e4b5c8b81b94887a67926afba8ca" gracePeriod=30 Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.907120 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-55b54bdfdb-8b9ns" podUID="56c74365-656c-4362-8358-bbb17d0c8be0" containerName="placement-api" 
containerID="cri-o://506bc72445e5baadada4d8eeada583023eaf1965303dabc6bfea713ddd7e5cda" gracePeriod=30 Mar 10 07:09:23 crc kubenswrapper[4825]: E0310 07:09:23.936704 4825 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 10 07:09:23 crc kubenswrapper[4825]: E0310 07:09:23.936761 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/40efa241-98cc-4dec-9ae8-8a892b367ebc-config-data podName:40efa241-98cc-4dec-9ae8-8a892b367ebc nodeName:}" failed. No retries permitted until 2026-03-10 07:09:24.936747411 +0000 UTC m=+1517.966528016 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/40efa241-98cc-4dec-9ae8-8a892b367ebc-config-data") pod "rabbitmq-server-0" (UID: "40efa241-98cc-4dec-9ae8-8a892b367ebc") : configmap "rabbitmq-config-data" not found Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.936822 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.939973 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="4d0be589-d069-4183-b9f6-bde715ad716b" containerName="openstack-network-exporter" containerID="cri-o://445d3424a112177b35839673ae3ebc80192ee7a09e556dccf91f7607c8d6b87a" gracePeriod=300 Mar 10 07:09:23 crc kubenswrapper[4825]: E0310 07:09:23.942087 4825 secret.go:188] Couldn't get secret openstack/glance-scripts: secret "glance-scripts" not found Mar 10 07:09:23 crc kubenswrapper[4825]: E0310 07:09:23.942245 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f33e10c8-8be2-46d4-8653-1960855a2a40-scripts podName:f33e10c8-8be2-46d4-8653-1960855a2a40 nodeName:}" failed. No retries permitted until 2026-03-10 07:09:24.942123951 +0000 UTC m=+1517.971904566 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/f33e10c8-8be2-46d4-8653-1960855a2a40-scripts") pod "glance-default-external-api-0" (UID: "f33e10c8-8be2-46d4-8653-1960855a2a40") : secret "glance-scripts" not found Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.957521 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.958068 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="e6301c97-5fb5-4f12-9181-ae937aa01b33" containerName="openstack-network-exporter" containerID="cri-o://6d4ba65a7a74c4f484262169282d716735558f576441bf3306734527b8d19733" gracePeriod=300 Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.975706 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.985367 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="account-server" containerID="cri-o://b6da432e4ef6e11dcf09322fd99510f39d8b7a43cde3cc5579134c6692dd38d0" gracePeriod=30 Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.985691 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="swift-recon-cron" containerID="cri-o://0eb3526b2f9a3dc4c181eae70d6e3f55e5aedd2b23585785d07ab9b47c86be86" gracePeriod=30 Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.985727 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="rsync" containerID="cri-o://bbfb6db4ff1cf66b5cefbdf85aba01f8712fe08811899971bacb92c4cffa4c80" gracePeriod=30 Mar 10 07:09:23 crc kubenswrapper[4825]: 
I0310 07:09:23.985762 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="object-expirer" containerID="cri-o://24e7978cba137e69c6bcc8c4236d6cec9f0db399c5f65bdbb35f3ca5a0021d8b" gracePeriod=30 Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.985794 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="object-updater" containerID="cri-o://2696dee8a458e4895f3ba7693b75ddaad87852936a2c94221d87c43242a8d9c0" gracePeriod=30 Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.985821 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="object-auditor" containerID="cri-o://1b4b9f5c44b3bd7e6bdce3108794f5acc85e48ec01796fe34d6bd0e82000c702" gracePeriod=30 Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.985851 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="object-replicator" containerID="cri-o://6b6649c00a311af982a228be67d8b620ca48866ed93599303fea93062ada078a" gracePeriod=30 Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.985882 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="object-server" containerID="cri-o://6a94411539794d68dfeebd72122285b13eda3087dbbeb9b315116582797f9ae7" gracePeriod=30 Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.985933 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="container-updater" 
containerID="cri-o://ec1f01797e26b3ab6c17de9ed25bdc13caaa429fb99d4c291f3d27585657d67a" gracePeriod=30 Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.985964 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="container-auditor" containerID="cri-o://76563e2f197f8efff20de4d4a18477c3d06424bc7c12b84fc84e0bc0a8e03877" gracePeriod=30 Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.985994 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="container-replicator" containerID="cri-o://3c0a38e663ef3eeb91871a7cbf3ab45bc29c396569c4dd16e651b9a699527022" gracePeriod=30 Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.986047 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="container-server" containerID="cri-o://13e1e93f8ea417f0086cb72d1b5c3c472463fcb9b293144a2e6d095358753ed6" gracePeriod=30 Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.986082 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="account-reaper" containerID="cri-o://dbff9d40c47a9b57b2e18d794a415a9500d5967b3986aef885713d9846f81e1f" gracePeriod=30 Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.986113 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="account-auditor" containerID="cri-o://49e3de3dc2efd76ad3dd592ab843057ad3883dc58b1b23503b05369a2fb81259" gracePeriod=30 Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.986158 4825 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/swift-storage-0" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="account-replicator" containerID="cri-o://001ce68a7777bd6940dec1d7b9d36b3306b90993cc66dd9412186267450b2a7a" gracePeriod=30 Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.990263 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7749c44969-kww8g"] Mar 10 07:09:23 crc kubenswrapper[4825]: I0310 07:09:23.990509 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7749c44969-kww8g" podUID="7d184a8b-881b-4f12-8c5b-cb355193fd98" containerName="dnsmasq-dns" containerID="cri-o://4f7fbc28bf34f7df122011e3058dcf6f02dc2f8ce3ea18d2fad614b2293dc01c" gracePeriod=10 Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.070427 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-2sxbl"] Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.089599 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-npbf2"] Mar 10 07:09:24 crc kubenswrapper[4825]: E0310 07:09:24.108535 4825 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-jctbb" message=< Mar 10 07:09:24 crc kubenswrapper[4825]: Exiting ovn-controller (1) [ OK ] Mar 10 07:09:24 crc kubenswrapper[4825]: > Mar 10 07:09:24 crc kubenswrapper[4825]: E0310 07:09:24.108569 4825 kuberuntime_container.go:691] "PreStop hook failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " pod="openstack/ovn-controller-jctbb" podUID="2b7468b7-ceaa-44b4-8364-5d3601f43c1b" containerName="ovn-controller" containerID="cri-o://dbfa45eee1893e39936ca7ed89cd4006d13a24a21be39d324c10431ed74834ee" Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 
07:09:24.108606 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-jctbb" podUID="2b7468b7-ceaa-44b4-8364-5d3601f43c1b" containerName="ovn-controller" containerID="cri-o://dbfa45eee1893e39936ca7ed89cd4006d13a24a21be39d324c10431ed74834ee" gracePeriod=30 Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.135645 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-2sxbl"] Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.135888 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="4d0be589-d069-4183-b9f6-bde715ad716b" containerName="ovsdbserver-nb" containerID="cri-o://287e2ed2fe7cd936cd37546d987ad73fd2626cb93c8a46da8028302a3c2cd37d" gracePeriod=300 Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.141542 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="e6301c97-5fb5-4f12-9181-ae937aa01b33" containerName="ovsdbserver-sb" containerID="cri-o://890a4d9166366f896b7b573d660a3e9559cfb944d463e538a8cb9f89a3568bf9" gracePeriod=300 Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.150658 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-npbf2"] Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.170283 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.170635 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="ca4ebafb-aa22-44ad-8037-487a9c3baca4" containerName="cinder-scheduler" containerID="cri-o://2afb8783418624fe53a581d5b384b0eac94f03f7b397fe6d2851a79acaf2faeb" gracePeriod=30 Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.171027 4825 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/cinder-scheduler-0" podUID="ca4ebafb-aa22-44ad-8037-487a9c3baca4" containerName="probe" containerID="cri-o://69a4b33cb0df81f2796dea2e3671cf92e7c2fcaf46f6acf19185262e97be0696" gracePeriod=30 Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.191089 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-rcx8n"] Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.237848 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-rcx8n"] Mar 10 07:09:24 crc kubenswrapper[4825]: E0310 07:09:24.250448 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dbfa45eee1893e39936ca7ed89cd4006d13a24a21be39d324c10431ed74834ee is running failed: container process not found" containerID="dbfa45eee1893e39936ca7ed89cd4006d13a24a21be39d324c10431ed74834ee" cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"] Mar 10 07:09:24 crc kubenswrapper[4825]: E0310 07:09:24.257124 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dbfa45eee1893e39936ca7ed89cd4006d13a24a21be39d324c10431ed74834ee is running failed: container process not found" containerID="dbfa45eee1893e39936ca7ed89cd4006d13a24a21be39d324c10431ed74834ee" cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"] Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.261682 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.262021 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8b3d94ce-f8d8-4653-a6b0-2682b23d834e" containerName="glance-log" containerID="cri-o://8c02a587ff89f403a3ba0a4a6efae15db19b3375bb2065fb2488307352268600" 
gracePeriod=30 Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.262566 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8b3d94ce-f8d8-4653-a6b0-2682b23d834e" containerName="glance-httpd" containerID="cri-o://b75f52a9c2b47ee09bd6f6c1c6a4ac378afdcf400070a6ab9a12012c65129fcc" gracePeriod=30 Mar 10 07:09:24 crc kubenswrapper[4825]: E0310 07:09:24.264292 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dbfa45eee1893e39936ca7ed89cd4006d13a24a21be39d324c10431ed74834ee is running failed: container process not found" containerID="dbfa45eee1893e39936ca7ed89cd4006d13a24a21be39d324c10431ed74834ee" cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"] Mar 10 07:09:24 crc kubenswrapper[4825]: E0310 07:09:24.264378 4825 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dbfa45eee1893e39936ca7ed89cd4006d13a24a21be39d324c10431ed74834ee is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-jctbb" podUID="2b7468b7-ceaa-44b4-8364-5d3601f43c1b" containerName="ovn-controller" Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.299582 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.299820 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="209fbdac-3b3c-451a-ae30-5888e1cbb891" containerName="cinder-api-log" containerID="cri-o://41fab81e9dbca0e39235649b019d1b34334b13bd38163f9b2e2254c248abda96" gracePeriod=30 Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.300190 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" 
podUID="209fbdac-3b3c-451a-ae30-5888e1cbb891" containerName="cinder-api" containerID="cri-o://bb56c1153d981c89befdcab3c321e793fa3da2a54c78cde62ebc8522df37f74c" gracePeriod=30 Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.344980 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-pwt26"] Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.369339 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-67bc54d95c-r8n6n"] Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.369695 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-67bc54d95c-r8n6n" podUID="f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6" containerName="neutron-api" containerID="cri-o://2281ab9c75c402ac8b53f6007d3f795cf1ff09aad4c21d1c2d1f97b2f04c2605" gracePeriod=30 Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.370239 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-67bc54d95c-r8n6n" podUID="f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6" containerName="neutron-httpd" containerID="cri-o://3ad7566f82944df07d73d40d26dee49f887701e2a6b35d333784838aa475be9b" gracePeriod=30 Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.409694 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-e713-account-create-update-8c9hv"] Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.467396 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-e713-account-create-update-8c9hv"] Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.482698 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.482928 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f33e10c8-8be2-46d4-8653-1960855a2a40" containerName="glance-log" 
containerID="cri-o://0596a09cf87b1eba90ca5be28a144085a5ee0e4870ea3024df884f81b878aeaf" gracePeriod=30 Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.484063 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f33e10c8-8be2-46d4-8653-1960855a2a40" containerName="glance-httpd" containerID="cri-o://eb096eba10587e7f334a63bf327913dd03205b29419eb194a05f626f07286905" gracePeriod=30 Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.585584 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.601584 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-vd86v"] Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.613002 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-vd86v"] Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.632867 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-10c2-account-create-update-7fscp"] Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.635993 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-cnt4k_d2e08f36-91c9-4bad-b1fb-88a0938c4d25/openstack-network-exporter/0.log" Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.636050 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-cnt4k" Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.664033 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-6cw56"] Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.671483 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="9ba0e3ee-1309-4411-a927-866b35c2776b" containerName="rabbitmq" containerID="cri-o://33787b88c881e69ba026aff5962a5455c1e4b2f94327fcc119388ebe1cd30211" gracePeriod=604800 Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.707669 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-10c2-account-create-update-7fscp"] Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.736277 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-l5vf2"] Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.750181 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-6cw56"] Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.752356 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-l5vf2"] Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.766246 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-2a9b-account-create-update-zq6mk"] Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.775857 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-2a9b-account-create-update-zq6mk"] Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.789961 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.790191 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="85f5995b-4ad8-4840-82f1-6659152c3ed4" 
containerName="nova-scheduler-scheduler" containerID="cri-o://23d5b7cd25a646bc4179b5e41cb642aa99ae3019bdbfa362dc0578d525c4bfdf" gracePeriod=30 Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.796215 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2e08f36-91c9-4bad-b1fb-88a0938c4d25-metrics-certs-tls-certs\") pod \"d2e08f36-91c9-4bad-b1fb-88a0938c4d25\" (UID: \"d2e08f36-91c9-4bad-b1fb-88a0938c4d25\") " Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.796485 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2e08f36-91c9-4bad-b1fb-88a0938c4d25-config\") pod \"d2e08f36-91c9-4bad-b1fb-88a0938c4d25\" (UID: \"d2e08f36-91c9-4bad-b1fb-88a0938c4d25\") " Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.796562 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2e08f36-91c9-4bad-b1fb-88a0938c4d25-combined-ca-bundle\") pod \"d2e08f36-91c9-4bad-b1fb-88a0938c4d25\" (UID: \"d2e08f36-91c9-4bad-b1fb-88a0938c4d25\") " Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.796733 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d2e08f36-91c9-4bad-b1fb-88a0938c4d25-ovn-rundir\") pod \"d2e08f36-91c9-4bad-b1fb-88a0938c4d25\" (UID: \"d2e08f36-91c9-4bad-b1fb-88a0938c4d25\") " Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.796806 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d2e08f36-91c9-4bad-b1fb-88a0938c4d25-ovs-rundir\") pod \"d2e08f36-91c9-4bad-b1fb-88a0938c4d25\" (UID: \"d2e08f36-91c9-4bad-b1fb-88a0938c4d25\") " Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.796912 4825 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-dcb8g\" (UniqueName: \"kubernetes.io/projected/d2e08f36-91c9-4bad-b1fb-88a0938c4d25-kube-api-access-dcb8g\") pod \"d2e08f36-91c9-4bad-b1fb-88a0938c4d25\" (UID: \"d2e08f36-91c9-4bad-b1fb-88a0938c4d25\") " Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.802264 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d2e08f36-91c9-4bad-b1fb-88a0938c4d25-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "d2e08f36-91c9-4bad-b1fb-88a0938c4d25" (UID: "d2e08f36-91c9-4bad-b1fb-88a0938c4d25"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.802332 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d2e08f36-91c9-4bad-b1fb-88a0938c4d25-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "d2e08f36-91c9-4bad-b1fb-88a0938c4d25" (UID: "d2e08f36-91c9-4bad-b1fb-88a0938c4d25"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.803390 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2e08f36-91c9-4bad-b1fb-88a0938c4d25-kube-api-access-dcb8g" (OuterVolumeSpecName: "kube-api-access-dcb8g") pod "d2e08f36-91c9-4bad-b1fb-88a0938c4d25" (UID: "d2e08f36-91c9-4bad-b1fb-88a0938c4d25"). InnerVolumeSpecName "kube-api-access-dcb8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.805856 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2e08f36-91c9-4bad-b1fb-88a0938c4d25-config" (OuterVolumeSpecName: "config") pod "d2e08f36-91c9-4bad-b1fb-88a0938c4d25" (UID: "d2e08f36-91c9-4bad-b1fb-88a0938c4d25"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.809203 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.835190 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.845150 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-86ff-account-create-update-fh6cl"] Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.852390 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-86ff-account-create-update-fh6cl"] Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.864750 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-75884b96b6-fqkns"] Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.865007 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-75884b96b6-fqkns" podUID="d907281e-010a-403c-ba2a-b8d178dacbb0" containerName="barbican-keystone-listener-log" containerID="cri-o://f0fabece877d02360c9f8c4a495f79b17208aceb1c8e9d8dac1c99cae589b5d0" gracePeriod=30 Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.865114 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-75884b96b6-fqkns" podUID="d907281e-010a-403c-ba2a-b8d178dacbb0" containerName="barbican-keystone-listener" containerID="cri-o://c5e15cd589465e5cfd321144406581f242067dc2f205033f4254ac94393e1ff6" gracePeriod=30 Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.865317 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-54d455c957-2jn47"] Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.866092 4825 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/barbican-worker-54d455c957-2jn47" podUID="79caf655-43c2-401d-9967-97d9a35d9741" containerName="barbican-worker-log" containerID="cri-o://b0dc5fbc17a28febd97f83275e735288d32e837f8d7e53a674a6790d1cb06167" gracePeriod=30 Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.866233 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-54d455c957-2jn47" podUID="79caf655-43c2-401d-9967-97d9a35d9741" containerName="barbican-worker" containerID="cri-o://e0c5268cbaf4fcdd48619feb929fd81e110e9f60040fedec074af3e0608a9a8d" gracePeriod=30 Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.899586 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2e08f36-91c9-4bad-b1fb-88a0938c4d25-config\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.899887 4825 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d2e08f36-91c9-4bad-b1fb-88a0938c4d25-ovn-rundir\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.899898 4825 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d2e08f36-91c9-4bad-b1fb-88a0938c4d25-ovs-rundir\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.899909 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcb8g\" (UniqueName: \"kubernetes.io/projected/d2e08f36-91c9-4bad-b1fb-88a0938c4d25-kube-api-access-dcb8g\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.901589 4825 generic.go:334] "Generic (PLEG): container finished" podID="8b3d94ce-f8d8-4653-a6b0-2682b23d834e" containerID="8c02a587ff89f403a3ba0a4a6efae15db19b3375bb2065fb2488307352268600" exitCode=143 Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.901623 4825 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8b3d94ce-f8d8-4653-a6b0-2682b23d834e","Type":"ContainerDied","Data":"8c02a587ff89f403a3ba0a4a6efae15db19b3375bb2065fb2488307352268600"} Mar 10 07:09:24 crc kubenswrapper[4825]: I0310 07:09:24.952993 4825 generic.go:334] "Generic (PLEG): container finished" podID="59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11" containerID="eae523cad9b94daad0aa32c8a855fc6def30624e340d1c962919717b99480d5e" exitCode=137 Mar 10 07:09:25 crc kubenswrapper[4825]: E0310 07:09:25.023968 4825 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 10 07:09:25 crc kubenswrapper[4825]: E0310 07:09:25.024043 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/40efa241-98cc-4dec-9ae8-8a892b367ebc-config-data podName:40efa241-98cc-4dec-9ae8-8a892b367ebc nodeName:}" failed. No retries permitted until 2026-03-10 07:09:27.024024915 +0000 UTC m=+1520.053805530 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/40efa241-98cc-4dec-9ae8-8a892b367ebc-config-data") pod "rabbitmq-server-0" (UID: "40efa241-98cc-4dec-9ae8-8a892b367ebc") : configmap "rabbitmq-config-data" not found Mar 10 07:09:25 crc kubenswrapper[4825]: E0310 07:09:25.024424 4825 secret.go:188] Couldn't get secret openstack/glance-scripts: secret "glance-scripts" not found Mar 10 07:09:25 crc kubenswrapper[4825]: E0310 07:09:25.024454 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f33e10c8-8be2-46d4-8653-1960855a2a40-scripts podName:f33e10c8-8be2-46d4-8653-1960855a2a40 nodeName:}" failed. No retries permitted until 2026-03-10 07:09:27.024446806 +0000 UTC m=+1520.054227421 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/f33e10c8-8be2-46d4-8653-1960855a2a40-scripts") pod "glance-default-external-api-0" (UID: "f33e10c8-8be2-46d4-8653-1960855a2a40") : secret "glance-scripts" not found Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.033225 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.033536 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b1222a91-b344-4f54-bf9f-75d03d5f8549" containerName="nova-metadata-log" containerID="cri-o://eeadc36b533bcb861ae1e4ac68b98711c46023cbb63985038bb6a6302eb5d0a3" gracePeriod=30 Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.034100 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b1222a91-b344-4f54-bf9f-75d03d5f8549" containerName="nova-metadata-metadata" containerID="cri-o://fe83016a56de378ec7204c67a3bc85b458b1da0d43277d26413e0aa8e2ea9dc4" gracePeriod=30 Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.059478 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-cnt4k_d2e08f36-91c9-4bad-b1fb-88a0938c4d25/openstack-network-exporter/0.log" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.059592 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-cnt4k" event={"ID":"d2e08f36-91c9-4bad-b1fb-88a0938c4d25","Type":"ContainerDied","Data":"14d95d2f7730bc87f1858043cd3e90fb5b3130d6474c421938b5770e215eb821"} Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.059640 4825 scope.go:117] "RemoveContainer" containerID="a0f14677c076a4d6fea4cbdb391654ebcd047d9f5740b2422a0cfce1097e8da1" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.059780 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-cnt4k" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.063121 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2e08f36-91c9-4bad-b1fb-88a0938c4d25-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d2e08f36-91c9-4bad-b1fb-88a0938c4d25" (UID: "d2e08f36-91c9-4bad-b1fb-88a0938c4d25"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.112887 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_4d0be589-d069-4183-b9f6-bde715ad716b/ovsdbserver-nb/0.log" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.112949 4825 generic.go:334] "Generic (PLEG): container finished" podID="4d0be589-d069-4183-b9f6-bde715ad716b" containerID="445d3424a112177b35839673ae3ebc80192ee7a09e556dccf91f7607c8d6b87a" exitCode=2 Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.112975 4825 generic.go:334] "Generic (PLEG): container finished" podID="4d0be589-d069-4183-b9f6-bde715ad716b" containerID="287e2ed2fe7cd936cd37546d987ad73fd2626cb93c8a46da8028302a3c2cd37d" exitCode=143 Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.113077 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"4d0be589-d069-4183-b9f6-bde715ad716b","Type":"ContainerDied","Data":"445d3424a112177b35839673ae3ebc80192ee7a09e556dccf91f7607c8d6b87a"} Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.113125 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"4d0be589-d069-4183-b9f6-bde715ad716b","Type":"ContainerDied","Data":"287e2ed2fe7cd936cd37546d987ad73fd2626cb93c8a46da8028302a3c2cd37d"} Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.115238 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell1-db-create-t5vjd"] Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.119232 4825 generic.go:334] "Generic (PLEG): container finished" podID="7d184a8b-881b-4f12-8c5b-cb355193fd98" containerID="4f7fbc28bf34f7df122011e3058dcf6f02dc2f8ce3ea18d2fad614b2293dc01c" exitCode=0 Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.119310 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7749c44969-kww8g" event={"ID":"7d184a8b-881b-4f12-8c5b-cb355193fd98","Type":"ContainerDied","Data":"4f7fbc28bf34f7df122011e3058dcf6f02dc2f8ce3ea18d2fad614b2293dc01c"} Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.126975 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2e08f36-91c9-4bad-b1fb-88a0938c4d25-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.137943 4825 generic.go:334] "Generic (PLEG): container finished" podID="2b7468b7-ceaa-44b4-8364-5d3601f43c1b" containerID="dbfa45eee1893e39936ca7ed89cd4006d13a24a21be39d324c10431ed74834ee" exitCode=0 Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.138051 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jctbb" event={"ID":"2b7468b7-ceaa-44b4-8364-5d3601f43c1b","Type":"ContainerDied","Data":"dbfa45eee1893e39936ca7ed89cd4006d13a24a21be39d324c10431ed74834ee"} Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.171360 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="40efa241-98cc-4dec-9ae8-8a892b367ebc" containerName="rabbitmq" containerID="cri-o://a98a5e36d3a3124ffe2ffa323d51e71ad524e44ce610050ac778bcba5e77fd17" gracePeriod=604800 Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.200203 4825 generic.go:334] "Generic (PLEG): container finished" podID="f33e10c8-8be2-46d4-8653-1960855a2a40" 
containerID="0596a09cf87b1eba90ca5be28a144085a5ee0e4870ea3024df884f81b878aeaf" exitCode=143 Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.200284 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f33e10c8-8be2-46d4-8653-1960855a2a40","Type":"ContainerDied","Data":"0596a09cf87b1eba90ca5be28a144085a5ee0e4870ea3024df884f81b878aeaf"} Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.201104 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-t5vjd"] Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.203095 4825 generic.go:334] "Generic (PLEG): container finished" podID="209fbdac-3b3c-451a-ae30-5888e1cbb891" containerID="41fab81e9dbca0e39235649b019d1b34334b13bd38163f9b2e2254c248abda96" exitCode=143 Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.203205 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"209fbdac-3b3c-451a-ae30-5888e1cbb891","Type":"ContainerDied","Data":"41fab81e9dbca0e39235649b019d1b34334b13bd38163f9b2e2254c248abda96"} Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.209325 4825 generic.go:334] "Generic (PLEG): container finished" podID="56c74365-656c-4362-8358-bbb17d0c8be0" containerID="cf0bf49129662bee51dc0d9f341998b13bf6e4b5c8b81b94887a67926afba8ca" exitCode=143 Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.209389 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55b54bdfdb-8b9ns" event={"ID":"56c74365-656c-4362-8358-bbb17d0c8be0","Type":"ContainerDied","Data":"cf0bf49129662bee51dc0d9f341998b13bf6e4b5c8b81b94887a67926afba8ca"} Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.226100 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e6301c97-5fb5-4f12-9181-ae937aa01b33/ovsdbserver-sb/0.log" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.226174 4825 generic.go:334] "Generic 
(PLEG): container finished" podID="e6301c97-5fb5-4f12-9181-ae937aa01b33" containerID="6d4ba65a7a74c4f484262169282d716735558f576441bf3306734527b8d19733" exitCode=2 Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.226194 4825 generic.go:334] "Generic (PLEG): container finished" podID="e6301c97-5fb5-4f12-9181-ae937aa01b33" containerID="890a4d9166366f896b7b573d660a3e9559cfb944d463e538a8cb9f89a3568bf9" exitCode=143 Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.226256 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e6301c97-5fb5-4f12-9181-ae937aa01b33","Type":"ContainerDied","Data":"6d4ba65a7a74c4f484262169282d716735558f576441bf3306734527b8d19733"} Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.226281 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e6301c97-5fb5-4f12-9181-ae937aa01b33","Type":"ContainerDied","Data":"890a4d9166366f896b7b573d660a3e9559cfb944d463e538a8cb9f89a3568bf9"} Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.232598 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-pwt26" event={"ID":"c684f3e4-ced9-49c3-aa54-515bc2c4fb56","Type":"ContainerStarted","Data":"25f403db5ebe5475717c8f47289b1479fce9b0d0eac9e70adf5907cdfb44e580"} Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.233745 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2e08f36-91c9-4bad-b1fb-88a0938c4d25-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "d2e08f36-91c9-4bad-b1fb-88a0938c4d25" (UID: "d2e08f36-91c9-4bad-b1fb-88a0938c4d25"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.234554 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.234915 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="fcf819fc-0560-45a8-be5e-04e6ef2bbf32" containerName="nova-api-log" containerID="cri-o://40b43e7e62b535d8a9ba96b7af5069797ffcd786e53c1075b83a21ba139ed0fa" gracePeriod=30 Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.235368 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="fcf819fc-0560-45a8-be5e-04e6ef2bbf32" containerName="nova-api-api" containerID="cri-o://b0692876912e4dddda340c37876d5dcd6e82ee0c786afdeaa20b480be4903f55" gracePeriod=30 Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.236949 4825 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2e08f36-91c9-4bad-b1fb-88a0938c4d25-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.273214 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10fac422-52dd-409a-adb6-156fe67b68d3" path="/var/lib/kubelet/pods/10fac422-52dd-409a-adb6-156fe67b68d3/volumes" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.273602 4825 generic.go:334] "Generic (PLEG): container finished" podID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerID="bbfb6db4ff1cf66b5cefbdf85aba01f8712fe08811899971bacb92c4cffa4c80" exitCode=0 Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.273618 4825 generic.go:334] "Generic (PLEG): container finished" podID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerID="24e7978cba137e69c6bcc8c4236d6cec9f0db399c5f65bdbb35f3ca5a0021d8b" exitCode=0 Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.273625 4825 
generic.go:334] "Generic (PLEG): container finished" podID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerID="2696dee8a458e4895f3ba7693b75ddaad87852936a2c94221d87c43242a8d9c0" exitCode=0 Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.273632 4825 generic.go:334] "Generic (PLEG): container finished" podID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerID="1b4b9f5c44b3bd7e6bdce3108794f5acc85e48ec01796fe34d6bd0e82000c702" exitCode=0 Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.273640 4825 generic.go:334] "Generic (PLEG): container finished" podID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerID="6b6649c00a311af982a228be67d8b620ca48866ed93599303fea93062ada078a" exitCode=0 Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.273646 4825 generic.go:334] "Generic (PLEG): container finished" podID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerID="6a94411539794d68dfeebd72122285b13eda3087dbbeb9b315116582797f9ae7" exitCode=0 Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.273652 4825 generic.go:334] "Generic (PLEG): container finished" podID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerID="ec1f01797e26b3ab6c17de9ed25bdc13caaa429fb99d4c291f3d27585657d67a" exitCode=0 Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.273657 4825 generic.go:334] "Generic (PLEG): container finished" podID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerID="76563e2f197f8efff20de4d4a18477c3d06424bc7c12b84fc84e0bc0a8e03877" exitCode=0 Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.273663 4825 generic.go:334] "Generic (PLEG): container finished" podID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerID="3c0a38e663ef3eeb91871a7cbf3ab45bc29c396569c4dd16e651b9a699527022" exitCode=0 Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.273669 4825 generic.go:334] "Generic (PLEG): container finished" podID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerID="dbff9d40c47a9b57b2e18d794a415a9500d5967b3986aef885713d9846f81e1f" exitCode=0 Mar 10 07:09:25 crc 
kubenswrapper[4825]: I0310 07:09:25.273675 4825 generic.go:334] "Generic (PLEG): container finished" podID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerID="49e3de3dc2efd76ad3dd592ab843057ad3883dc58b1b23503b05369a2fb81259" exitCode=0 Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.273681 4825 generic.go:334] "Generic (PLEG): container finished" podID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerID="001ce68a7777bd6940dec1d7b9d36b3306b90993cc66dd9412186267450b2a7a" exitCode=0 Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.273688 4825 generic.go:334] "Generic (PLEG): container finished" podID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerID="b6da432e4ef6e11dcf09322fd99510f39d8b7a43cde3cc5579134c6692dd38d0" exitCode=0 Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.276603 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="178e89cb-daf7-488f-8188-de4e98bc1047" path="/var/lib/kubelet/pods/178e89cb-daf7-488f-8188-de4e98bc1047/volumes" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.277406 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e84f490-c283-41ba-92af-bca6b65b95cf" path="/var/lib/kubelet/pods/1e84f490-c283-41ba-92af-bca6b65b95cf/volumes" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.278930 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20652aad-ebbd-472d-8cd8-61c598352ec4" path="/var/lib/kubelet/pods/20652aad-ebbd-472d-8cd8-61c598352ec4/volumes" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.280630 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff" path="/var/lib/kubelet/pods/31d721ea-21f5-4f7b-a5c2-cee3aa78f6ff/volumes" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.281486 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37023b62-fd36-44d5-9c0f-45cf1b640da6" path="/var/lib/kubelet/pods/37023b62-fd36-44d5-9c0f-45cf1b640da6/volumes" 
Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.282878 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b166cbd-0add-4896-92ab-caed991a0a06" path="/var/lib/kubelet/pods/3b166cbd-0add-4896-92ab-caed991a0a06/volumes" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.283857 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="489d7ca6-7e05-482e-b880-69bea3b57c62" path="/var/lib/kubelet/pods/489d7ca6-7e05-482e-b880-69bea3b57c62/volumes" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.285091 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f59ca51-fd91-41a3-ad9c-6b04ccd93288" path="/var/lib/kubelet/pods/5f59ca51-fd91-41a3-ad9c-6b04ccd93288/volumes" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.285913 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="758e4473-67ca-4d59-bf8c-3584b92a8663" path="/var/lib/kubelet/pods/758e4473-67ca-4d59-bf8c-3584b92a8663/volumes" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.287002 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7df341ed-690d-4799-b9a4-18691d7de8d7" path="/var/lib/kubelet/pods/7df341ed-690d-4799-b9a4-18691d7de8d7/volumes" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.288209 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3" path="/var/lib/kubelet/pods/a16da4fc-3f0f-45f0-b5ee-d8c5254e4ed3/volumes" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.289686 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a762be8c-156d-453d-bcd6-ae8571d9133f" path="/var/lib/kubelet/pods/a762be8c-156d-453d-bcd6-ae8571d9133f/volumes" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.291022 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3f58042-5c26-4b0b-9f85-d35d9305115e" path="/var/lib/kubelet/pods/c3f58042-5c26-4b0b-9f85-d35d9305115e/volumes" 
Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.294346 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce0e86d1-5b0d-491f-bbd9-59fdd8024236" path="/var/lib/kubelet/pods/ce0e86d1-5b0d-491f-bbd9-59fdd8024236/volumes" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.295153 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e22d1830-21e9-40f0-ab2b-83335a568a18" path="/var/lib/kubelet/pods/e22d1830-21e9-40f0-ab2b-83335a568a18/volumes" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.296394 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6533be0-d453-4d29-980d-6be63a601fce" path="/var/lib/kubelet/pods/e6533be0-d453-4d29-980d-6be63a601fce/volumes" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.297710 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"632168fa-0532-4c7e-b688-fb361ee89ec8","Type":"ContainerDied","Data":"bbfb6db4ff1cf66b5cefbdf85aba01f8712fe08811899971bacb92c4cffa4c80"} Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.297738 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-vv657"] Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.297753 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"632168fa-0532-4c7e-b688-fb361ee89ec8","Type":"ContainerDied","Data":"24e7978cba137e69c6bcc8c4236d6cec9f0db399c5f65bdbb35f3ca5a0021d8b"} Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.297768 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-vv657"] Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.297783 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"632168fa-0532-4c7e-b688-fb361ee89ec8","Type":"ContainerDied","Data":"2696dee8a458e4895f3ba7693b75ddaad87852936a2c94221d87c43242a8d9c0"} Mar 10 07:09:25 crc 
kubenswrapper[4825]: I0310 07:09:25.297793 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-5vfc4"] Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.297803 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"632168fa-0532-4c7e-b688-fb361ee89ec8","Type":"ContainerDied","Data":"1b4b9f5c44b3bd7e6bdce3108794f5acc85e48ec01796fe34d6bd0e82000c702"} Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.297811 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"632168fa-0532-4c7e-b688-fb361ee89ec8","Type":"ContainerDied","Data":"6b6649c00a311af982a228be67d8b620ca48866ed93599303fea93062ada078a"} Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.297820 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"632168fa-0532-4c7e-b688-fb361ee89ec8","Type":"ContainerDied","Data":"6a94411539794d68dfeebd72122285b13eda3087dbbeb9b315116582797f9ae7"} Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.297830 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"632168fa-0532-4c7e-b688-fb361ee89ec8","Type":"ContainerDied","Data":"ec1f01797e26b3ab6c17de9ed25bdc13caaa429fb99d4c291f3d27585657d67a"} Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.297838 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"632168fa-0532-4c7e-b688-fb361ee89ec8","Type":"ContainerDied","Data":"76563e2f197f8efff20de4d4a18477c3d06424bc7c12b84fc84e0bc0a8e03877"} Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.297848 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"632168fa-0532-4c7e-b688-fb361ee89ec8","Type":"ContainerDied","Data":"3c0a38e663ef3eeb91871a7cbf3ab45bc29c396569c4dd16e651b9a699527022"} Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.297857 
4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"632168fa-0532-4c7e-b688-fb361ee89ec8","Type":"ContainerDied","Data":"dbff9d40c47a9b57b2e18d794a415a9500d5967b3986aef885713d9846f81e1f"} Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.297871 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"632168fa-0532-4c7e-b688-fb361ee89ec8","Type":"ContainerDied","Data":"49e3de3dc2efd76ad3dd592ab843057ad3883dc58b1b23503b05369a2fb81259"} Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.297880 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"632168fa-0532-4c7e-b688-fb361ee89ec8","Type":"ContainerDied","Data":"001ce68a7777bd6940dec1d7b9d36b3306b90993cc66dd9412186267450b2a7a"} Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.298075 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"632168fa-0532-4c7e-b688-fb361ee89ec8","Type":"ContainerDied","Data":"b6da432e4ef6e11dcf09322fd99510f39d8b7a43cde3cc5579134c6692dd38d0"} Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.308144 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-2jsr2"] Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.359663 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-9c98f44d4-hm6qz"] Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.361100 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-9c98f44d4-hm6qz" podUID="42d8b8af-9916-4aba-b4e7-f825ed30f182" containerName="barbican-api-log" containerID="cri-o://6a1691cba1fef49f1aa2f7c44e95ac06b21fe92d69b0eadbad46361719368876" gracePeriod=30 Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.361677 4825 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/barbican-api-9c98f44d4-hm6qz" podUID="42d8b8af-9916-4aba-b4e7-f825ed30f182" containerName="barbican-api" containerID="cri-o://45339b16b80b77a1ec497dc0cfdf7cad94e760f0b42f91faed72b2b7acaeea0f" gracePeriod=30 Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.414680 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-d132-account-create-update-gqrfl"] Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.420170 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-j22lk"] Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.429402 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-1e7f-account-create-update-rfzld"] Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.437201 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-5vfc4"] Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.442766 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-2jsr2"] Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.452159 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-d132-account-create-update-gqrfl"] Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.456153 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-j22lk"] Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.466392 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="757635f4-9aaf-48bb-bd50-60398db738d4" containerName="galera" containerID="cri-o://64283b83ccd27989eef21b478fef4edaf5b901a9f55b243380a9b8e95c69504f" gracePeriod=30 Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.503588 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5l64f"] Mar 10 07:09:25 crc kubenswrapper[4825]: E0310 07:09:25.511817 4825 
handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Mar 10 07:09:25 crc kubenswrapper[4825]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Mar 10 07:09:25 crc kubenswrapper[4825]: + source /usr/local/bin/container-scripts/functions Mar 10 07:09:25 crc kubenswrapper[4825]: ++ OVNBridge=br-int Mar 10 07:09:25 crc kubenswrapper[4825]: ++ OVNRemote=tcp:localhost:6642 Mar 10 07:09:25 crc kubenswrapper[4825]: ++ OVNEncapType=geneve Mar 10 07:09:25 crc kubenswrapper[4825]: ++ OVNAvailabilityZones= Mar 10 07:09:25 crc kubenswrapper[4825]: ++ EnableChassisAsGateway=true Mar 10 07:09:25 crc kubenswrapper[4825]: ++ PhysicalNetworks= Mar 10 07:09:25 crc kubenswrapper[4825]: ++ OVNHostName= Mar 10 07:09:25 crc kubenswrapper[4825]: ++ DB_FILE=/etc/openvswitch/conf.db Mar 10 07:09:25 crc kubenswrapper[4825]: ++ ovs_dir=/var/lib/openvswitch Mar 10 07:09:25 crc kubenswrapper[4825]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Mar 10 07:09:25 crc kubenswrapper[4825]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Mar 10 07:09:25 crc kubenswrapper[4825]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 10 07:09:25 crc kubenswrapper[4825]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 10 07:09:25 crc kubenswrapper[4825]: + sleep 0.5 Mar 10 07:09:25 crc kubenswrapper[4825]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 10 07:09:25 crc kubenswrapper[4825]: + sleep 0.5 Mar 10 07:09:25 crc kubenswrapper[4825]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 10 07:09:25 crc kubenswrapper[4825]: + sleep 0.5 Mar 10 07:09:25 crc kubenswrapper[4825]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 10 07:09:25 crc kubenswrapper[4825]: + sleep 0.5 Mar 10 07:09:25 crc kubenswrapper[4825]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 10 07:09:25 crc kubenswrapper[4825]: + cleanup_ovsdb_server_semaphore Mar 10 07:09:25 crc kubenswrapper[4825]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 10 07:09:25 crc kubenswrapper[4825]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Mar 10 07:09:25 crc kubenswrapper[4825]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-8x4pm" message=< Mar 10 07:09:25 crc kubenswrapper[4825]: Exiting ovsdb-server (5) [ OK ] Mar 10 07:09:25 crc kubenswrapper[4825]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Mar 10 07:09:25 crc kubenswrapper[4825]: + source /usr/local/bin/container-scripts/functions Mar 10 07:09:25 crc kubenswrapper[4825]: ++ OVNBridge=br-int Mar 10 07:09:25 crc kubenswrapper[4825]: ++ OVNRemote=tcp:localhost:6642 Mar 10 07:09:25 crc kubenswrapper[4825]: ++ OVNEncapType=geneve Mar 10 07:09:25 crc kubenswrapper[4825]: ++ OVNAvailabilityZones= Mar 10 07:09:25 crc kubenswrapper[4825]: ++ EnableChassisAsGateway=true Mar 10 07:09:25 crc kubenswrapper[4825]: ++ PhysicalNetworks= Mar 10 07:09:25 crc kubenswrapper[4825]: ++ OVNHostName= Mar 10 07:09:25 crc kubenswrapper[4825]: ++ DB_FILE=/etc/openvswitch/conf.db Mar 10 07:09:25 crc kubenswrapper[4825]: ++ ovs_dir=/var/lib/openvswitch Mar 10 07:09:25 crc kubenswrapper[4825]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Mar 10 07:09:25 crc kubenswrapper[4825]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Mar 10 07:09:25 crc kubenswrapper[4825]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 10 07:09:25 crc kubenswrapper[4825]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 10 07:09:25 crc kubenswrapper[4825]: + sleep 0.5 Mar 10 07:09:25 crc kubenswrapper[4825]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 10 07:09:25 crc kubenswrapper[4825]: + sleep 0.5 Mar 10 07:09:25 crc kubenswrapper[4825]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 10 07:09:25 crc kubenswrapper[4825]: + sleep 0.5 Mar 10 07:09:25 crc kubenswrapper[4825]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 10 07:09:25 crc kubenswrapper[4825]: + sleep 0.5 Mar 10 07:09:25 crc kubenswrapper[4825]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 10 07:09:25 crc kubenswrapper[4825]: + cleanup_ovsdb_server_semaphore Mar 10 07:09:25 crc kubenswrapper[4825]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 10 07:09:25 crc kubenswrapper[4825]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Mar 10 07:09:25 crc kubenswrapper[4825]: > Mar 10 07:09:25 crc kubenswrapper[4825]: E0310 07:09:25.513297 4825 kuberuntime_container.go:691] "PreStop hook failed" err=< Mar 10 07:09:25 crc kubenswrapper[4825]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Mar 10 07:09:25 crc kubenswrapper[4825]: + source /usr/local/bin/container-scripts/functions Mar 10 07:09:25 crc kubenswrapper[4825]: ++ OVNBridge=br-int Mar 10 07:09:25 crc kubenswrapper[4825]: ++ OVNRemote=tcp:localhost:6642 Mar 10 07:09:25 crc kubenswrapper[4825]: ++ OVNEncapType=geneve Mar 10 07:09:25 crc kubenswrapper[4825]: ++ OVNAvailabilityZones= Mar 10 07:09:25 crc kubenswrapper[4825]: ++ EnableChassisAsGateway=true Mar 10 07:09:25 crc kubenswrapper[4825]: ++ PhysicalNetworks= Mar 10 07:09:25 crc kubenswrapper[4825]: ++ OVNHostName= Mar 10 07:09:25 crc kubenswrapper[4825]: ++ DB_FILE=/etc/openvswitch/conf.db Mar 10 07:09:25 crc kubenswrapper[4825]: ++ ovs_dir=/var/lib/openvswitch Mar 10 07:09:25 crc kubenswrapper[4825]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Mar 10 07:09:25 crc 
kubenswrapper[4825]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Mar 10 07:09:25 crc kubenswrapper[4825]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 10 07:09:25 crc kubenswrapper[4825]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 10 07:09:25 crc kubenswrapper[4825]: + sleep 0.5 Mar 10 07:09:25 crc kubenswrapper[4825]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 10 07:09:25 crc kubenswrapper[4825]: + sleep 0.5 Mar 10 07:09:25 crc kubenswrapper[4825]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 10 07:09:25 crc kubenswrapper[4825]: + sleep 0.5 Mar 10 07:09:25 crc kubenswrapper[4825]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 10 07:09:25 crc kubenswrapper[4825]: + sleep 0.5 Mar 10 07:09:25 crc kubenswrapper[4825]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 10 07:09:25 crc kubenswrapper[4825]: + cleanup_ovsdb_server_semaphore Mar 10 07:09:25 crc kubenswrapper[4825]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 10 07:09:25 crc kubenswrapper[4825]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Mar 10 07:09:25 crc kubenswrapper[4825]: > pod="openstack/ovn-controller-ovs-8x4pm" podUID="f04e11a7-e387-4e51-b878-8633ca528b1a" containerName="ovsdb-server" containerID="cri-o://dbdb4c597ea9262b0a3b7810260c5ec8336d5f6419c63f838f27d493f6678134" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.530936 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.534435 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="9965b351-ed73-4c38-b393-ff72ba48cd66" containerName="nova-cell1-conductor-conductor" containerID="cri-o://3d70d139a1deb1c6a27cdf349fc1cc7878472659a23957efa5657e5551a800ef" 
gracePeriod=30 Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.535802 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-8x4pm" podUID="f04e11a7-e387-4e51-b878-8633ca528b1a" containerName="ovsdb-server" containerID="cri-o://dbdb4c597ea9262b0a3b7810260c5ec8336d5f6419c63f838f27d493f6678134" gracePeriod=28 Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.553451 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-8x4pm" podUID="f04e11a7-e387-4e51-b878-8633ca528b1a" containerName="ovs-vswitchd" containerID="cri-o://1f073cb0d55b119007da7e16a09131032c9832ee0f4faf6e839adba2f4de4803" gracePeriod=28 Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.559050 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-k7b7g"] Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.591515 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-jctbb" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.592834 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5l64f"] Mar 10 07:09:25 crc kubenswrapper[4825]: W0310 07:09:25.597233 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode808443d_0b2f_45a3_baa9_61e4d9819e6c.slice/crio-7eee7a7c47b0665f1acabe7ae84c3da45d155ec4e0dfb6c96eb0fd896337ed1b WatchSource:0}: Error finding container 7eee7a7c47b0665f1acabe7ae84c3da45d155ec4e0dfb6c96eb0fd896337ed1b: Status 404 returned error can't find the container with id 7eee7a7c47b0665f1acabe7ae84c3da45d155ec4e0dfb6c96eb0fd896337ed1b Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.607338 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7749c44969-kww8g" Mar 10 07:09:25 crc kubenswrapper[4825]: E0310 07:09:25.609569 4825 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 07:09:25 crc kubenswrapper[4825]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash Mar 10 07:09:25 crc kubenswrapper[4825]: Mar 10 07:09:25 crc kubenswrapper[4825]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 10 07:09:25 crc kubenswrapper[4825]: Mar 10 07:09:25 crc kubenswrapper[4825]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 10 07:09:25 crc kubenswrapper[4825]: Mar 10 07:09:25 crc kubenswrapper[4825]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 10 07:09:25 crc kubenswrapper[4825]: Mar 10 07:09:25 crc kubenswrapper[4825]: if [ -n "placement" ]; then Mar 10 07:09:25 crc kubenswrapper[4825]: GRANT_DATABASE="placement" Mar 10 07:09:25 crc kubenswrapper[4825]: else Mar 10 07:09:25 crc kubenswrapper[4825]: GRANT_DATABASE="*" Mar 10 07:09:25 crc kubenswrapper[4825]: fi Mar 10 07:09:25 crc kubenswrapper[4825]: Mar 10 07:09:25 crc kubenswrapper[4825]: # going for maximum compatibility here: Mar 10 07:09:25 crc kubenswrapper[4825]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 10 07:09:25 crc kubenswrapper[4825]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 10 07:09:25 crc kubenswrapper[4825]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 10 07:09:25 crc kubenswrapper[4825]: # support updates Mar 10 07:09:25 crc kubenswrapper[4825]: Mar 10 07:09:25 crc kubenswrapper[4825]: $MYSQL_CMD < logger="UnhandledError" Mar 10 07:09:25 crc kubenswrapper[4825]: E0310 07:09:25.614545 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"placement-db-secret\\\" not found\"" pod="openstack/placement-1e7f-account-create-update-rfzld" podUID="e808443d-0b2f-45a3-baa9-61e4d9819e6c" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.616845 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.617023 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="47d73d8a-acf6-42e6-a30d-e093144ee0b9" containerName="nova-cell0-conductor-conductor" containerID="cri-o://a5a7ebb2da38af819e648d6e5ad4c63fc12ab758ecc1f1178b06bc2d53a663b1" gracePeriod=30 Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.617584 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e6301c97-5fb5-4f12-9181-ae937aa01b33/ovsdbserver-sb/0.log" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.617654 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.620296 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.641197 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.641401 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="66c3beca-bb2c-4b68-a4b3-3cfe936c25fd" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://84161414c9dbeb64c9936bc248a59e4b32d0591a084774c91d9da931fd27e821" gracePeriod=30 Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.650918 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2b7468b7-ceaa-44b4-8364-5d3601f43c1b-var-log-ovn\") pod \"2b7468b7-ceaa-44b4-8364-5d3601f43c1b\" (UID: \"2b7468b7-ceaa-44b4-8364-5d3601f43c1b\") " Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.650969 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d184a8b-881b-4f12-8c5b-cb355193fd98-ovsdbserver-sb\") pod \"7d184a8b-881b-4f12-8c5b-cb355193fd98\" (UID: \"7d184a8b-881b-4f12-8c5b-cb355193fd98\") " Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.650989 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dsjb\" (UniqueName: \"kubernetes.io/projected/7d184a8b-881b-4f12-8c5b-cb355193fd98-kube-api-access-2dsjb\") pod \"7d184a8b-881b-4f12-8c5b-cb355193fd98\" (UID: \"7d184a8b-881b-4f12-8c5b-cb355193fd98\") " Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.651010 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6301c97-5fb5-4f12-9181-ae937aa01b33-config\") pod \"e6301c97-5fb5-4f12-9181-ae937aa01b33\" (UID: \"e6301c97-5fb5-4f12-9181-ae937aa01b33\") " 
Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.651030 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b7468b7-ceaa-44b4-8364-5d3601f43c1b-ovn-controller-tls-certs\") pod \"2b7468b7-ceaa-44b4-8364-5d3601f43c1b\" (UID: \"2b7468b7-ceaa-44b4-8364-5d3601f43c1b\") " Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.651046 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11-openstack-config\") pod \"59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11\" (UID: \"59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11\") " Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.651062 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b7468b7-ceaa-44b4-8364-5d3601f43c1b-scripts\") pod \"2b7468b7-ceaa-44b4-8364-5d3601f43c1b\" (UID: \"2b7468b7-ceaa-44b4-8364-5d3601f43c1b\") " Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.651083 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d184a8b-881b-4f12-8c5b-cb355193fd98-config\") pod \"7d184a8b-881b-4f12-8c5b-cb355193fd98\" (UID: \"7d184a8b-881b-4f12-8c5b-cb355193fd98\") " Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.651101 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6301c97-5fb5-4f12-9181-ae937aa01b33-metrics-certs-tls-certs\") pod \"e6301c97-5fb5-4f12-9181-ae937aa01b33\" (UID: \"e6301c97-5fb5-4f12-9181-ae937aa01b33\") " Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.651119 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcwbj\" (UniqueName: 
\"kubernetes.io/projected/2b7468b7-ceaa-44b4-8364-5d3601f43c1b-kube-api-access-qcwbj\") pod \"2b7468b7-ceaa-44b4-8364-5d3601f43c1b\" (UID: \"2b7468b7-ceaa-44b4-8364-5d3601f43c1b\") " Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.651149 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11-combined-ca-bundle\") pod \"59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11\" (UID: \"59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11\") " Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.651186 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6301c97-5fb5-4f12-9181-ae937aa01b33-combined-ca-bundle\") pod \"e6301c97-5fb5-4f12-9181-ae937aa01b33\" (UID: \"e6301c97-5fb5-4f12-9181-ae937aa01b33\") " Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.651202 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjrvq\" (UniqueName: \"kubernetes.io/projected/59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11-kube-api-access-rjrvq\") pod \"59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11\" (UID: \"59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11\") " Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.651216 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7d184a8b-881b-4f12-8c5b-cb355193fd98-dns-swift-storage-0\") pod \"7d184a8b-881b-4f12-8c5b-cb355193fd98\" (UID: \"7d184a8b-881b-4f12-8c5b-cb355193fd98\") " Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.651237 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e6301c97-5fb5-4f12-9181-ae937aa01b33-scripts\") pod \"e6301c97-5fb5-4f12-9181-ae937aa01b33\" (UID: \"e6301c97-5fb5-4f12-9181-ae937aa01b33\") " Mar 10 07:09:25 crc 
kubenswrapper[4825]: I0310 07:09:25.651265 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6301c97-5fb5-4f12-9181-ae937aa01b33-ovsdbserver-sb-tls-certs\") pod \"e6301c97-5fb5-4f12-9181-ae937aa01b33\" (UID: \"e6301c97-5fb5-4f12-9181-ae937aa01b33\") " Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.651283 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11-openstack-config-secret\") pod \"59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11\" (UID: \"59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11\") " Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.651309 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2b7468b7-ceaa-44b4-8364-5d3601f43c1b-var-run\") pod \"2b7468b7-ceaa-44b4-8364-5d3601f43c1b\" (UID: \"2b7468b7-ceaa-44b4-8364-5d3601f43c1b\") " Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.651348 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2b7468b7-ceaa-44b4-8364-5d3601f43c1b-var-run-ovn\") pod \"2b7468b7-ceaa-44b4-8364-5d3601f43c1b\" (UID: \"2b7468b7-ceaa-44b4-8364-5d3601f43c1b\") " Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.651363 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whptl\" (UniqueName: \"kubernetes.io/projected/e6301c97-5fb5-4f12-9181-ae937aa01b33-kube-api-access-whptl\") pod \"e6301c97-5fb5-4f12-9181-ae937aa01b33\" (UID: \"e6301c97-5fb5-4f12-9181-ae937aa01b33\") " Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.651383 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"e6301c97-5fb5-4f12-9181-ae937aa01b33\" (UID: \"e6301c97-5fb5-4f12-9181-ae937aa01b33\") " Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.651398 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d184a8b-881b-4f12-8c5b-cb355193fd98-dns-svc\") pod \"7d184a8b-881b-4f12-8c5b-cb355193fd98\" (UID: \"7d184a8b-881b-4f12-8c5b-cb355193fd98\") " Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.651417 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b7468b7-ceaa-44b4-8364-5d3601f43c1b-combined-ca-bundle\") pod \"2b7468b7-ceaa-44b4-8364-5d3601f43c1b\" (UID: \"2b7468b7-ceaa-44b4-8364-5d3601f43c1b\") " Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.651447 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d184a8b-881b-4f12-8c5b-cb355193fd98-ovsdbserver-nb\") pod \"7d184a8b-881b-4f12-8c5b-cb355193fd98\" (UID: \"7d184a8b-881b-4f12-8c5b-cb355193fd98\") " Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.651463 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e6301c97-5fb5-4f12-9181-ae937aa01b33-ovsdb-rundir\") pod \"e6301c97-5fb5-4f12-9181-ae937aa01b33\" (UID: \"e6301c97-5fb5-4f12-9181-ae937aa01b33\") " Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.652188 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6301c97-5fb5-4f12-9181-ae937aa01b33-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "e6301c97-5fb5-4f12-9181-ae937aa01b33" (UID: "e6301c97-5fb5-4f12-9181-ae937aa01b33"). InnerVolumeSpecName "ovsdb-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.652468 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2b7468b7-ceaa-44b4-8364-5d3601f43c1b-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "2b7468b7-ceaa-44b4-8364-5d3601f43c1b" (UID: "2b7468b7-ceaa-44b4-8364-5d3601f43c1b"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.658957 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2b7468b7-ceaa-44b4-8364-5d3601f43c1b-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "2b7468b7-ceaa-44b4-8364-5d3601f43c1b" (UID: "2b7468b7-ceaa-44b4-8364-5d3601f43c1b"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.659218 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-k7b7g"] Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.659598 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2b7468b7-ceaa-44b4-8364-5d3601f43c1b-var-run" (OuterVolumeSpecName: "var-run") pod "2b7468b7-ceaa-44b4-8364-5d3601f43c1b" (UID: "2b7468b7-ceaa-44b4-8364-5d3601f43c1b"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.660203 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6301c97-5fb5-4f12-9181-ae937aa01b33-config" (OuterVolumeSpecName: "config") pod "e6301c97-5fb5-4f12-9181-ae937aa01b33" (UID: "e6301c97-5fb5-4f12-9181-ae937aa01b33"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.660550 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b7468b7-ceaa-44b4-8364-5d3601f43c1b-scripts" (OuterVolumeSpecName: "scripts") pod "2b7468b7-ceaa-44b4-8364-5d3601f43c1b" (UID: "2b7468b7-ceaa-44b4-8364-5d3601f43c1b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.671996 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6301c97-5fb5-4f12-9181-ae937aa01b33-scripts" (OuterVolumeSpecName: "scripts") pod "e6301c97-5fb5-4f12-9181-ae937aa01b33" (UID: "e6301c97-5fb5-4f12-9181-ae937aa01b33"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.683704 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-1e7f-account-create-update-rfzld"] Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.689195 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11-kube-api-access-rjrvq" (OuterVolumeSpecName: "kube-api-access-rjrvq") pod "59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11" (UID: "59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11"). InnerVolumeSpecName "kube-api-access-rjrvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.690701 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b7468b7-ceaa-44b4-8364-5d3601f43c1b-kube-api-access-qcwbj" (OuterVolumeSpecName: "kube-api-access-qcwbj") pod "2b7468b7-ceaa-44b4-8364-5d3601f43c1b" (UID: "2b7468b7-ceaa-44b4-8364-5d3601f43c1b"). InnerVolumeSpecName "kube-api-access-qcwbj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.693842 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_4d0be589-d069-4183-b9f6-bde715ad716b/ovsdbserver-nb/0.log" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.693984 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.694416 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="9ba0e3ee-1309-4411-a927-866b35c2776b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.105:5671: connect: connection refused" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.696536 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-cnt4k"] Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.705364 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-cnt4k"] Mar 10 07:09:25 crc kubenswrapper[4825]: E0310 07:09:25.721949 4825 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42d8b8af_9916_4aba_b4e7_f825ed30f182.slice/crio-6a1691cba1fef49f1aa2f7c44e95ac06b21fe92d69b0eadbad46361719368876.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2e08f36_91c9_4bad_b1fb_88a0938c4d25.slice/crio-14d95d2f7730bc87f1858043cd3e90fb5b3130d6474c421938b5770e215eb821\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79caf655_43c2_401d_9967_97d9a35d9741.slice/crio-conmon-b0dc5fbc17a28febd97f83275e735288d32e837f8d7e53a674a6790d1cb06167.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfcf819fc_0560_45a8_be5e_04e6ef2bbf32.slice/crio-40b43e7e62b535d8a9ba96b7af5069797ffcd786e53c1075b83a21ba139ed0fa.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfcf819fc_0560_45a8_be5e_04e6ef2bbf32.slice/crio-conmon-40b43e7e62b535d8a9ba96b7af5069797ffcd786e53c1075b83a21ba139ed0fa.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd907281e_010a_403c_ba2a_b8d178dacbb0.slice/crio-c5e15cd589465e5cfd321144406581f242067dc2f205033f4254ac94393e1ff6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd907281e_010a_403c_ba2a_b8d178dacbb0.slice/crio-f0fabece877d02360c9f8c4a495f79b17208aceb1c8e9d8dac1c99cae589b5d0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1222a91_b344_4f54_bf9f_75d03d5f8549.slice/crio-conmon-eeadc36b533bcb861ae1e4ac68b98711c46023cbb63985038bb6a6302eb5d0a3.scope\": RecentStats: unable to find data in memory cache]" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.730062 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6301c97-5fb5-4f12-9181-ae937aa01b33-kube-api-access-whptl" (OuterVolumeSpecName: "kube-api-access-whptl") pod "e6301c97-5fb5-4f12-9181-ae937aa01b33" (UID: "e6301c97-5fb5-4f12-9181-ae937aa01b33"). InnerVolumeSpecName "kube-api-access-whptl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.730150 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d184a8b-881b-4f12-8c5b-cb355193fd98-kube-api-access-2dsjb" (OuterVolumeSpecName: "kube-api-access-2dsjb") pod "7d184a8b-881b-4f12-8c5b-cb355193fd98" (UID: "7d184a8b-881b-4f12-8c5b-cb355193fd98"). InnerVolumeSpecName "kube-api-access-2dsjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.736865 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "e6301c97-5fb5-4f12-9181-ae937aa01b33" (UID: "e6301c97-5fb5-4f12-9181-ae937aa01b33"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.752660 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d0be589-d069-4183-b9f6-bde715ad716b-metrics-certs-tls-certs\") pod \"4d0be589-d069-4183-b9f6-bde715ad716b\" (UID: \"4d0be589-d069-4183-b9f6-bde715ad716b\") " Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.752706 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d0be589-d069-4183-b9f6-bde715ad716b-config\") pod \"4d0be589-d069-4183-b9f6-bde715ad716b\" (UID: \"4d0be589-d069-4183-b9f6-bde715ad716b\") " Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.752821 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4d0be589-d069-4183-b9f6-bde715ad716b-ovsdb-rundir\") pod \"4d0be589-d069-4183-b9f6-bde715ad716b\" (UID: 
\"4d0be589-d069-4183-b9f6-bde715ad716b\") " Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.753095 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtbvq\" (UniqueName: \"kubernetes.io/projected/4d0be589-d069-4183-b9f6-bde715ad716b-kube-api-access-dtbvq\") pod \"4d0be589-d069-4183-b9f6-bde715ad716b\" (UID: \"4d0be589-d069-4183-b9f6-bde715ad716b\") " Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.753116 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d0be589-d069-4183-b9f6-bde715ad716b-ovsdbserver-nb-tls-certs\") pod \"4d0be589-d069-4183-b9f6-bde715ad716b\" (UID: \"4d0be589-d069-4183-b9f6-bde715ad716b\") " Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.753178 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4d0be589-d069-4183-b9f6-bde715ad716b-scripts\") pod \"4d0be589-d069-4183-b9f6-bde715ad716b\" (UID: \"4d0be589-d069-4183-b9f6-bde715ad716b\") " Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.753212 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d0be589-d069-4183-b9f6-bde715ad716b-combined-ca-bundle\") pod \"4d0be589-d069-4183-b9f6-bde715ad716b\" (UID: \"4d0be589-d069-4183-b9f6-bde715ad716b\") " Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.753342 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"4d0be589-d069-4183-b9f6-bde715ad716b\" (UID: \"4d0be589-d069-4183-b9f6-bde715ad716b\") " Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.753757 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjrvq\" (UniqueName: 
\"kubernetes.io/projected/59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11-kube-api-access-rjrvq\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.753774 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e6301c97-5fb5-4f12-9181-ae937aa01b33-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.753784 4825 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2b7468b7-ceaa-44b4-8364-5d3601f43c1b-var-run\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.753793 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whptl\" (UniqueName: \"kubernetes.io/projected/e6301c97-5fb5-4f12-9181-ae937aa01b33-kube-api-access-whptl\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.753816 4825 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.753825 4825 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2b7468b7-ceaa-44b4-8364-5d3601f43c1b-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.753834 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e6301c97-5fb5-4f12-9181-ae937aa01b33-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.753842 4825 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2b7468b7-ceaa-44b4-8364-5d3601f43c1b-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 
07:09:25.753851 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6301c97-5fb5-4f12-9181-ae937aa01b33-config\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.753859 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dsjb\" (UniqueName: \"kubernetes.io/projected/7d184a8b-881b-4f12-8c5b-cb355193fd98-kube-api-access-2dsjb\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.753868 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b7468b7-ceaa-44b4-8364-5d3601f43c1b-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.753855 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d0be589-d069-4183-b9f6-bde715ad716b-scripts" (OuterVolumeSpecName: "scripts") pod "4d0be589-d069-4183-b9f6-bde715ad716b" (UID: "4d0be589-d069-4183-b9f6-bde715ad716b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.753875 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcwbj\" (UniqueName: \"kubernetes.io/projected/2b7468b7-ceaa-44b4-8364-5d3601f43c1b-kube-api-access-qcwbj\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.753766 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d0be589-d069-4183-b9f6-bde715ad716b-config" (OuterVolumeSpecName: "config") pod "4d0be589-d069-4183-b9f6-bde715ad716b" (UID: "4d0be589-d069-4183-b9f6-bde715ad716b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.754581 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d0be589-d069-4183-b9f6-bde715ad716b-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "4d0be589-d069-4183-b9f6-bde715ad716b" (UID: "4d0be589-d069-4183-b9f6-bde715ad716b"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.796591 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "4d0be589-d069-4183-b9f6-bde715ad716b" (UID: "4d0be589-d069-4183-b9f6-bde715ad716b"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.806973 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d0be589-d069-4183-b9f6-bde715ad716b-kube-api-access-dtbvq" (OuterVolumeSpecName: "kube-api-access-dtbvq") pod "4d0be589-d069-4183-b9f6-bde715ad716b" (UID: "4d0be589-d069-4183-b9f6-bde715ad716b"). InnerVolumeSpecName "kube-api-access-dtbvq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.855695 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4d0be589-d069-4183-b9f6-bde715ad716b-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.855727 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtbvq\" (UniqueName: \"kubernetes.io/projected/4d0be589-d069-4183-b9f6-bde715ad716b-kube-api-access-dtbvq\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.855741 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4d0be589-d069-4183-b9f6-bde715ad716b-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.855769 4825 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.855781 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d0be589-d069-4183-b9f6-bde715ad716b-config\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.918733 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d184a8b-881b-4f12-8c5b-cb355193fd98-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7d184a8b-881b-4f12-8c5b-cb355193fd98" (UID: "7d184a8b-881b-4f12-8c5b-cb355193fd98"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.925327 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6301c97-5fb5-4f12-9181-ae937aa01b33-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6301c97-5fb5-4f12-9181-ae937aa01b33" (UID: "e6301c97-5fb5-4f12-9181-ae937aa01b33"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.959284 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6301c97-5fb5-4f12-9181-ae937aa01b33-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:25 crc kubenswrapper[4825]: I0310 07:09:25.959347 4825 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7d184a8b-881b-4f12-8c5b-cb355193fd98-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:25.995640 4825 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.042287 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d184a8b-881b-4f12-8c5b-cb355193fd98-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7d184a8b-881b-4f12-8c5b-cb355193fd98" (UID: "7d184a8b-881b-4f12-8c5b-cb355193fd98"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.077272 4825 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.077314 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d184a8b-881b-4f12-8c5b-cb355193fd98-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.100076 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11" (UID: "59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.113203 4825 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.173274 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11" (UID: "59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.195794 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b7468b7-ceaa-44b4-8364-5d3601f43c1b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b7468b7-ceaa-44b4-8364-5d3601f43c1b" (UID: "2b7468b7-ceaa-44b4-8364-5d3601f43c1b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.197343 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11" (UID: "59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.201329 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b7468b7-ceaa-44b4-8364-5d3601f43c1b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.201356 4825 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.201366 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.201375 4825 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.201385 4825 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.230386 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d0be589-d069-4183-b9f6-bde715ad716b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4d0be589-d069-4183-b9f6-bde715ad716b" (UID: "4d0be589-d069-4183-b9f6-bde715ad716b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.270619 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d184a8b-881b-4f12-8c5b-cb355193fd98-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7d184a8b-881b-4f12-8c5b-cb355193fd98" (UID: "7d184a8b-881b-4f12-8c5b-cb355193fd98"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.277202 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6301c97-5fb5-4f12-9181-ae937aa01b33-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "e6301c97-5fb5-4f12-9181-ae937aa01b33" (UID: "e6301c97-5fb5-4f12-9181-ae937aa01b33"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.299052 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d184a8b-881b-4f12-8c5b-cb355193fd98-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7d184a8b-881b-4f12-8c5b-cb355193fd98" (UID: "7d184a8b-881b-4f12-8c5b-cb355193fd98"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.305764 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d184a8b-881b-4f12-8c5b-cb355193fd98-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.305808 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d0be589-d069-4183-b9f6-bde715ad716b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.305826 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6301c97-5fb5-4f12-9181-ae937aa01b33-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.305845 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d184a8b-881b-4f12-8c5b-cb355193fd98-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.321516 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d184a8b-881b-4f12-8c5b-cb355193fd98-config" (OuterVolumeSpecName: "config") pod "7d184a8b-881b-4f12-8c5b-cb355193fd98" (UID: "7d184a8b-881b-4f12-8c5b-cb355193fd98"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.323929 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7749c44969-kww8g" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.323939 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7749c44969-kww8g" event={"ID":"7d184a8b-881b-4f12-8c5b-cb355193fd98","Type":"ContainerDied","Data":"a93853eadccd3ebba1fd64ef732a9897ce888f33343a21ded8075d60e12b050e"} Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.325834 4825 scope.go:117] "RemoveContainer" containerID="4f7fbc28bf34f7df122011e3058dcf6f02dc2f8ce3ea18d2fad614b2293dc01c" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.336275 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6301c97-5fb5-4f12-9181-ae937aa01b33-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "e6301c97-5fb5-4f12-9181-ae937aa01b33" (UID: "e6301c97-5fb5-4f12-9181-ae937aa01b33"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.344213 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jctbb" event={"ID":"2b7468b7-ceaa-44b4-8364-5d3601f43c1b","Type":"ContainerDied","Data":"374e92a40ee4ebc87a6630a572595e29f1d285aa432e2d541165266f1ae0d414"} Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.344337 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-bb7dcfc7-n79vf"] Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.344365 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-jctbb" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.344874 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-bb7dcfc7-n79vf" podUID="af39779a-3a65-4379-9e92-d69ab1610fc6" containerName="proxy-httpd" containerID="cri-o://2dd5d463672a820d3d3d4f47fbb2be409cc30d40e5202b4d480569146558f8cd" gracePeriod=30 Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.344964 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-bb7dcfc7-n79vf" podUID="af39779a-3a65-4379-9e92-d69ab1610fc6" containerName="proxy-server" containerID="cri-o://8f8c2fc2f611cbc298369095f2d9421d72e169c12c731c66dc4b7e2b27276c19" gracePeriod=30 Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.357081 4825 generic.go:334] "Generic (PLEG): container finished" podID="b1222a91-b344-4f54-bf9f-75d03d5f8549" containerID="eeadc36b533bcb861ae1e4ac68b98711c46023cbb63985038bb6a6302eb5d0a3" exitCode=143 Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.357641 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b1222a91-b344-4f54-bf9f-75d03d5f8549","Type":"ContainerDied","Data":"eeadc36b533bcb861ae1e4ac68b98711c46023cbb63985038bb6a6302eb5d0a3"} Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.364840 4825 generic.go:334] "Generic (PLEG): container finished" podID="ca4ebafb-aa22-44ad-8037-487a9c3baca4" containerID="69a4b33cb0df81f2796dea2e3671cf92e7c2fcaf46f6acf19185262e97be0696" exitCode=0 Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.364901 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ca4ebafb-aa22-44ad-8037-487a9c3baca4","Type":"ContainerDied","Data":"69a4b33cb0df81f2796dea2e3671cf92e7c2fcaf46f6acf19185262e97be0696"} Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.368005 4825 generic.go:334] "Generic (PLEG): 
container finished" podID="c684f3e4-ced9-49c3-aa54-515bc2c4fb56" containerID="a281eeb65879a51d71eaf01bb8092df711b2e91f86c5156b3dc7040201fbe49a" exitCode=1 Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.368060 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-pwt26" event={"ID":"c684f3e4-ced9-49c3-aa54-515bc2c4fb56","Type":"ContainerDied","Data":"a281eeb65879a51d71eaf01bb8092df711b2e91f86c5156b3dc7040201fbe49a"} Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.368607 4825 scope.go:117] "RemoveContainer" containerID="a281eeb65879a51d71eaf01bb8092df711b2e91f86c5156b3dc7040201fbe49a" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.381209 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.402807 4825 generic.go:334] "Generic (PLEG): container finished" podID="79caf655-43c2-401d-9967-97d9a35d9741" containerID="e0c5268cbaf4fcdd48619feb929fd81e110e9f60040fedec074af3e0608a9a8d" exitCode=0 Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.402832 4825 generic.go:334] "Generic (PLEG): container finished" podID="79caf655-43c2-401d-9967-97d9a35d9741" containerID="b0dc5fbc17a28febd97f83275e735288d32e837f8d7e53a674a6790d1cb06167" exitCode=143 Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.402871 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-54d455c957-2jn47" event={"ID":"79caf655-43c2-401d-9967-97d9a35d9741","Type":"ContainerDied","Data":"e0c5268cbaf4fcdd48619feb929fd81e110e9f60040fedec074af3e0608a9a8d"} Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.402897 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-54d455c957-2jn47" event={"ID":"79caf655-43c2-401d-9967-97d9a35d9741","Type":"ContainerDied","Data":"b0dc5fbc17a28febd97f83275e735288d32e837f8d7e53a674a6790d1cb06167"} Mar 10 07:09:26 crc 
kubenswrapper[4825]: I0310 07:09:26.410459 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d184a8b-881b-4f12-8c5b-cb355193fd98-config\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.410494 4825 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6301c97-5fb5-4f12-9181-ae937aa01b33-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.412232 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d0be589-d069-4183-b9f6-bde715ad716b-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "4d0be589-d069-4183-b9f6-bde715ad716b" (UID: "4d0be589-d069-4183-b9f6-bde715ad716b"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.436847 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7749c44969-kww8g"] Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.437341 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d0be589-d069-4183-b9f6-bde715ad716b-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "4d0be589-d069-4183-b9f6-bde715ad716b" (UID: "4d0be589-d069-4183-b9f6-bde715ad716b"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.439088 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b7468b7-ceaa-44b4-8364-5d3601f43c1b-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "2b7468b7-ceaa-44b4-8364-5d3601f43c1b" (UID: "2b7468b7-ceaa-44b4-8364-5d3601f43c1b"). 
InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.447029 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-75884b96b6-fqkns" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.451960 4825 scope.go:117] "RemoveContainer" containerID="58239260abbd4dee2c279089bb91ba90875540f1742144cead2ca71ff93b34c3" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.455004 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7749c44969-kww8g"] Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.462493 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-54d455c957-2jn47" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.467124 4825 generic.go:334] "Generic (PLEG): container finished" podID="f04e11a7-e387-4e51-b878-8633ca528b1a" containerID="dbdb4c597ea9262b0a3b7810260c5ec8336d5f6419c63f838f27d493f6678134" exitCode=0 Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.467206 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8x4pm" event={"ID":"f04e11a7-e387-4e51-b878-8633ca528b1a","Type":"ContainerDied","Data":"dbdb4c597ea9262b0a3b7810260c5ec8336d5f6419c63f838f27d493f6678134"} Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.472099 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_4d0be589-d069-4183-b9f6-bde715ad716b/ovsdbserver-nb/0.log" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.472273 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.472374 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"4d0be589-d069-4183-b9f6-bde715ad716b","Type":"ContainerDied","Data":"b8310da36e0e36ef0978b2d77b8b278dccb06b90cb2d218f30ee6f9afc8ccd7b"} Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.474550 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1e7f-account-create-update-rfzld" event={"ID":"e808443d-0b2f-45a3-baa9-61e4d9819e6c","Type":"ContainerStarted","Data":"7eee7a7c47b0665f1acabe7ae84c3da45d155ec4e0dfb6c96eb0fd896337ed1b"} Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.498554 4825 generic.go:334] "Generic (PLEG): container finished" podID="f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6" containerID="3ad7566f82944df07d73d40d26dee49f887701e2a6b35d333784838aa475be9b" exitCode=0 Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.498604 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67bc54d95c-r8n6n" event={"ID":"f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6","Type":"ContainerDied","Data":"3ad7566f82944df07d73d40d26dee49f887701e2a6b35d333784838aa475be9b"} Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.516224 4825 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b7468b7-ceaa-44b4-8364-5d3601f43c1b-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.516249 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d0be589-d069-4183-b9f6-bde715ad716b-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.516260 4825 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4d0be589-d069-4183-b9f6-bde715ad716b-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.517612 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e6301c97-5fb5-4f12-9181-ae937aa01b33/ovsdbserver-sb/0.log" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.517795 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.517841 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e6301c97-5fb5-4f12-9181-ae937aa01b33","Type":"ContainerDied","Data":"6db8497e28aa6376de67f3e7e22a09e5d46729a54a85d651a2e5a34cfc2e8f11"} Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.551886 4825 scope.go:117] "RemoveContainer" containerID="dbfa45eee1893e39936ca7ed89cd4006d13a24a21be39d324c10431ed74834ee" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.563689 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.568846 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.571546 4825 generic.go:334] "Generic (PLEG): container finished" podID="42d8b8af-9916-4aba-b4e7-f825ed30f182" containerID="6a1691cba1fef49f1aa2f7c44e95ac06b21fe92d69b0eadbad46361719368876" exitCode=143 Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.571603 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9c98f44d4-hm6qz" event={"ID":"42d8b8af-9916-4aba-b4e7-f825ed30f182","Type":"ContainerDied","Data":"6a1691cba1fef49f1aa2f7c44e95ac06b21fe92d69b0eadbad46361719368876"} Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.583113 4825 generic.go:334] "Generic (PLEG): container finished" 
podID="fcf819fc-0560-45a8-be5e-04e6ef2bbf32" containerID="40b43e7e62b535d8a9ba96b7af5069797ffcd786e53c1075b83a21ba139ed0fa" exitCode=143 Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.583222 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fcf819fc-0560-45a8-be5e-04e6ef2bbf32","Type":"ContainerDied","Data":"40b43e7e62b535d8a9ba96b7af5069797ffcd786e53c1075b83a21ba139ed0fa"} Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.585108 4825 generic.go:334] "Generic (PLEG): container finished" podID="66c3beca-bb2c-4b68-a4b3-3cfe936c25fd" containerID="84161414c9dbeb64c9936bc248a59e4b32d0591a084774c91d9da931fd27e821" exitCode=0 Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.585171 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"66c3beca-bb2c-4b68-a4b3-3cfe936c25fd","Type":"ContainerDied","Data":"84161414c9dbeb64c9936bc248a59e4b32d0591a084774c91d9da931fd27e821"} Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.591827 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.606417 4825 generic.go:334] "Generic (PLEG): container finished" podID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerID="13e1e93f8ea417f0086cb72d1b5c3c472463fcb9b293144a2e6d095358753ed6" exitCode=0 Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.606475 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"632168fa-0532-4c7e-b688-fb361ee89ec8","Type":"ContainerDied","Data":"13e1e93f8ea417f0086cb72d1b5c3c472463fcb9b293144a2e6d095358753ed6"} Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.609094 4825 generic.go:334] "Generic (PLEG): container finished" podID="d907281e-010a-403c-ba2a-b8d178dacbb0" containerID="c5e15cd589465e5cfd321144406581f242067dc2f205033f4254ac94393e1ff6" exitCode=0 Mar 10 07:09:26 crc kubenswrapper[4825]: 
I0310 07:09:26.609114 4825 generic.go:334] "Generic (PLEG): container finished" podID="d907281e-010a-403c-ba2a-b8d178dacbb0" containerID="f0fabece877d02360c9f8c4a495f79b17208aceb1c8e9d8dac1c99cae589b5d0" exitCode=143 Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.609219 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-75884b96b6-fqkns" event={"ID":"d907281e-010a-403c-ba2a-b8d178dacbb0","Type":"ContainerDied","Data":"c5e15cd589465e5cfd321144406581f242067dc2f205033f4254ac94393e1ff6"} Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.609238 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-75884b96b6-fqkns" event={"ID":"d907281e-010a-403c-ba2a-b8d178dacbb0","Type":"ContainerDied","Data":"f0fabece877d02360c9f8c4a495f79b17208aceb1c8e9d8dac1c99cae589b5d0"} Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.609247 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-75884b96b6-fqkns" event={"ID":"d907281e-010a-403c-ba2a-b8d178dacbb0","Type":"ContainerDied","Data":"b06ac46e842de2fe235b4bf3a5e34fbf009cceee6b2267a3ea7854b38f4849fe"} Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.609323 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-75884b96b6-fqkns" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.616838 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7g6d7\" (UniqueName: \"kubernetes.io/projected/d907281e-010a-403c-ba2a-b8d178dacbb0-kube-api-access-7g6d7\") pod \"d907281e-010a-403c-ba2a-b8d178dacbb0\" (UID: \"d907281e-010a-403c-ba2a-b8d178dacbb0\") " Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.616875 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77l52\" (UniqueName: \"kubernetes.io/projected/79caf655-43c2-401d-9967-97d9a35d9741-kube-api-access-77l52\") pod \"79caf655-43c2-401d-9967-97d9a35d9741\" (UID: \"79caf655-43c2-401d-9967-97d9a35d9741\") " Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.616942 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79caf655-43c2-401d-9967-97d9a35d9741-logs\") pod \"79caf655-43c2-401d-9967-97d9a35d9741\" (UID: \"79caf655-43c2-401d-9967-97d9a35d9741\") " Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.616995 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d907281e-010a-403c-ba2a-b8d178dacbb0-logs\") pod \"d907281e-010a-403c-ba2a-b8d178dacbb0\" (UID: \"d907281e-010a-403c-ba2a-b8d178dacbb0\") " Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.617060 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79caf655-43c2-401d-9967-97d9a35d9741-combined-ca-bundle\") pod \"79caf655-43c2-401d-9967-97d9a35d9741\" (UID: \"79caf655-43c2-401d-9967-97d9a35d9741\") " Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.617078 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79caf655-43c2-401d-9967-97d9a35d9741-config-data-custom\") pod \"79caf655-43c2-401d-9967-97d9a35d9741\" (UID: \"79caf655-43c2-401d-9967-97d9a35d9741\") " Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.617184 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d907281e-010a-403c-ba2a-b8d178dacbb0-combined-ca-bundle\") pod \"d907281e-010a-403c-ba2a-b8d178dacbb0\" (UID: \"d907281e-010a-403c-ba2a-b8d178dacbb0\") " Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.617204 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d907281e-010a-403c-ba2a-b8d178dacbb0-config-data\") pod \"d907281e-010a-403c-ba2a-b8d178dacbb0\" (UID: \"d907281e-010a-403c-ba2a-b8d178dacbb0\") " Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.617247 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d907281e-010a-403c-ba2a-b8d178dacbb0-config-data-custom\") pod \"d907281e-010a-403c-ba2a-b8d178dacbb0\" (UID: \"d907281e-010a-403c-ba2a-b8d178dacbb0\") " Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.617271 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79caf655-43c2-401d-9967-97d9a35d9741-config-data\") pod \"79caf655-43c2-401d-9967-97d9a35d9741\" (UID: \"79caf655-43c2-401d-9967-97d9a35d9741\") " Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.619649 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.626650 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79caf655-43c2-401d-9967-97d9a35d9741-logs" 
(OuterVolumeSpecName: "logs") pod "79caf655-43c2-401d-9967-97d9a35d9741" (UID: "79caf655-43c2-401d-9967-97d9a35d9741"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.627107 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d907281e-010a-403c-ba2a-b8d178dacbb0-logs" (OuterVolumeSpecName: "logs") pod "d907281e-010a-403c-ba2a-b8d178dacbb0" (UID: "d907281e-010a-403c-ba2a-b8d178dacbb0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.634238 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d907281e-010a-403c-ba2a-b8d178dacbb0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d907281e-010a-403c-ba2a-b8d178dacbb0" (UID: "d907281e-010a-403c-ba2a-b8d178dacbb0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.634614 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79caf655-43c2-401d-9967-97d9a35d9741-kube-api-access-77l52" (OuterVolumeSpecName: "kube-api-access-77l52") pod "79caf655-43c2-401d-9967-97d9a35d9741" (UID: "79caf655-43c2-401d-9967-97d9a35d9741"). InnerVolumeSpecName "kube-api-access-77l52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.654688 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79caf655-43c2-401d-9967-97d9a35d9741-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "79caf655-43c2-401d-9967-97d9a35d9741" (UID: "79caf655-43c2-401d-9967-97d9a35d9741"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.654722 4825 scope.go:117] "RemoveContainer" containerID="eae523cad9b94daad0aa32c8a855fc6def30624e340d1c962919717b99480d5e" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.654943 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d907281e-010a-403c-ba2a-b8d178dacbb0-kube-api-access-7g6d7" (OuterVolumeSpecName: "kube-api-access-7g6d7") pod "d907281e-010a-403c-ba2a-b8d178dacbb0" (UID: "d907281e-010a-403c-ba2a-b8d178dacbb0"). InnerVolumeSpecName "kube-api-access-7g6d7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.715465 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-jctbb"] Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.719809 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-jctbb"] Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.720543 4825 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d907281e-010a-403c-ba2a-b8d178dacbb0-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.720630 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7g6d7\" (UniqueName: \"kubernetes.io/projected/d907281e-010a-403c-ba2a-b8d178dacbb0-kube-api-access-7g6d7\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.720760 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77l52\" (UniqueName: \"kubernetes.io/projected/79caf655-43c2-401d-9967-97d9a35d9741-kube-api-access-77l52\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.720885 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/79caf655-43c2-401d-9967-97d9a35d9741-logs\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.720940 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d907281e-010a-403c-ba2a-b8d178dacbb0-logs\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.720993 4825 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79caf655-43c2-401d-9967-97d9a35d9741-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.730352 4825 scope.go:117] "RemoveContainer" containerID="445d3424a112177b35839673ae3ebc80192ee7a09e556dccf91f7607c8d6b87a" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.754405 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79caf655-43c2-401d-9967-97d9a35d9741-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "79caf655-43c2-401d-9967-97d9a35d9741" (UID: "79caf655-43c2-401d-9967-97d9a35d9741"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.766410 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.796252 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d907281e-010a-403c-ba2a-b8d178dacbb0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d907281e-010a-403c-ba2a-b8d178dacbb0" (UID: "d907281e-010a-403c-ba2a-b8d178dacbb0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.816380 4825 scope.go:117] "RemoveContainer" containerID="287e2ed2fe7cd936cd37546d987ad73fd2626cb93c8a46da8028302a3c2cd37d" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.822964 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79caf655-43c2-401d-9967-97d9a35d9741-config-data" (OuterVolumeSpecName: "config-data") pod "79caf655-43c2-401d-9967-97d9a35d9741" (UID: "79caf655-43c2-401d-9967-97d9a35d9741"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.827353 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79caf655-43c2-401d-9967-97d9a35d9741-config-data\") pod \"79caf655-43c2-401d-9967-97d9a35d9741\" (UID: \"79caf655-43c2-401d-9967-97d9a35d9741\") " Mar 10 07:09:26 crc kubenswrapper[4825]: W0310 07:09:26.827563 4825 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/79caf655-43c2-401d-9967-97d9a35d9741/volumes/kubernetes.io~secret/config-data Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.827582 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79caf655-43c2-401d-9967-97d9a35d9741-config-data" (OuterVolumeSpecName: "config-data") pod "79caf655-43c2-401d-9967-97d9a35d9741" (UID: "79caf655-43c2-401d-9967-97d9a35d9741"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.830765 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79caf655-43c2-401d-9967-97d9a35d9741-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.830784 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d907281e-010a-403c-ba2a-b8d178dacbb0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.830794 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79caf655-43c2-401d-9967-97d9a35d9741-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.836034 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d907281e-010a-403c-ba2a-b8d178dacbb0-config-data" (OuterVolumeSpecName: "config-data") pod "d907281e-010a-403c-ba2a-b8d178dacbb0" (UID: "d907281e-010a-403c-ba2a-b8d178dacbb0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.845316 4825 scope.go:117] "RemoveContainer" containerID="6d4ba65a7a74c4f484262169282d716735558f576441bf3306734527b8d19733" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.931706 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/66c3beca-bb2c-4b68-a4b3-3cfe936c25fd-nova-novncproxy-tls-certs\") pod \"66c3beca-bb2c-4b68-a4b3-3cfe936c25fd\" (UID: \"66c3beca-bb2c-4b68-a4b3-3cfe936c25fd\") " Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.931901 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66c3beca-bb2c-4b68-a4b3-3cfe936c25fd-config-data\") pod \"66c3beca-bb2c-4b68-a4b3-3cfe936c25fd\" (UID: \"66c3beca-bb2c-4b68-a4b3-3cfe936c25fd\") " Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.931942 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/66c3beca-bb2c-4b68-a4b3-3cfe936c25fd-vencrypt-tls-certs\") pod \"66c3beca-bb2c-4b68-a4b3-3cfe936c25fd\" (UID: \"66c3beca-bb2c-4b68-a4b3-3cfe936c25fd\") " Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.931966 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66c3beca-bb2c-4b68-a4b3-3cfe936c25fd-combined-ca-bundle\") pod \"66c3beca-bb2c-4b68-a4b3-3cfe936c25fd\" (UID: \"66c3beca-bb2c-4b68-a4b3-3cfe936c25fd\") " Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.931986 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmmfw\" (UniqueName: \"kubernetes.io/projected/66c3beca-bb2c-4b68-a4b3-3cfe936c25fd-kube-api-access-tmmfw\") pod \"66c3beca-bb2c-4b68-a4b3-3cfe936c25fd\" (UID: 
\"66c3beca-bb2c-4b68-a4b3-3cfe936c25fd\") " Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.932413 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d907281e-010a-403c-ba2a-b8d178dacbb0-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:26 crc kubenswrapper[4825]: I0310 07:09:26.965808 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66c3beca-bb2c-4b68-a4b3-3cfe936c25fd-kube-api-access-tmmfw" (OuterVolumeSpecName: "kube-api-access-tmmfw") pod "66c3beca-bb2c-4b68-a4b3-3cfe936c25fd" (UID: "66c3beca-bb2c-4b68-a4b3-3cfe936c25fd"). InnerVolumeSpecName "kube-api-access-tmmfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.014628 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66c3beca-bb2c-4b68-a4b3-3cfe936c25fd-config-data" (OuterVolumeSpecName: "config-data") pod "66c3beca-bb2c-4b68-a4b3-3cfe936c25fd" (UID: "66c3beca-bb2c-4b68-a4b3-3cfe936c25fd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.026795 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66c3beca-bb2c-4b68-a4b3-3cfe936c25fd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "66c3beca-bb2c-4b68-a4b3-3cfe936c25fd" (UID: "66c3beca-bb2c-4b68-a4b3-3cfe936c25fd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.034798 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66c3beca-bb2c-4b68-a4b3-3cfe936c25fd-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.034827 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66c3beca-bb2c-4b68-a4b3-3cfe936c25fd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.034839 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmmfw\" (UniqueName: \"kubernetes.io/projected/66c3beca-bb2c-4b68-a4b3-3cfe936c25fd-kube-api-access-tmmfw\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:27 crc kubenswrapper[4825]: E0310 07:09:27.034919 4825 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 10 07:09:27 crc kubenswrapper[4825]: E0310 07:09:27.034963 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/40efa241-98cc-4dec-9ae8-8a892b367ebc-config-data podName:40efa241-98cc-4dec-9ae8-8a892b367ebc nodeName:}" failed. No retries permitted until 2026-03-10 07:09:31.034950147 +0000 UTC m=+1524.064730762 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/40efa241-98cc-4dec-9ae8-8a892b367ebc-config-data") pod "rabbitmq-server-0" (UID: "40efa241-98cc-4dec-9ae8-8a892b367ebc") : configmap "rabbitmq-config-data" not found Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.035210 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66c3beca-bb2c-4b68-a4b3-3cfe936c25fd-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "66c3beca-bb2c-4b68-a4b3-3cfe936c25fd" (UID: "66c3beca-bb2c-4b68-a4b3-3cfe936c25fd"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:27 crc kubenswrapper[4825]: E0310 07:09:27.035257 4825 secret.go:188] Couldn't get secret openstack/glance-scripts: secret "glance-scripts" not found Mar 10 07:09:27 crc kubenswrapper[4825]: E0310 07:09:27.035312 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f33e10c8-8be2-46d4-8653-1960855a2a40-scripts podName:f33e10c8-8be2-46d4-8653-1960855a2a40 nodeName:}" failed. No retries permitted until 2026-03-10 07:09:31.035294986 +0000 UTC m=+1524.065075591 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/f33e10c8-8be2-46d4-8653-1960855a2a40-scripts") pod "glance-default-external-api-0" (UID: "f33e10c8-8be2-46d4-8653-1960855a2a40") : secret "glance-scripts" not found Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.085605 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66c3beca-bb2c-4b68-a4b3-3cfe936c25fd-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "66c3beca-bb2c-4b68-a4b3-3cfe936c25fd" (UID: "66c3beca-bb2c-4b68-a4b3-3cfe936c25fd"). InnerVolumeSpecName "vencrypt-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.136603 4825 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/66c3beca-bb2c-4b68-a4b3-3cfe936c25fd-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.136636 4825 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/66c3beca-bb2c-4b68-a4b3-3cfe936c25fd-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.209905 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-1e7f-account-create-update-rfzld" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.221314 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.226790 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-75884b96b6-fqkns"] Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.232418 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-75884b96b6-fqkns"] Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.248521 4825 scope.go:117] "RemoveContainer" containerID="890a4d9166366f896b7b573d660a3e9559cfb944d463e538a8cb9f89a3568bf9" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.251845 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="256445a5-41a0-47b0-a149-3c31cb2f0959" path="/var/lib/kubelet/pods/256445a5-41a0-47b0-a149-3c31cb2f0959/volumes" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.252671 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b7468b7-ceaa-44b4-8364-5d3601f43c1b" 
path="/var/lib/kubelet/pods/2b7468b7-ceaa-44b4-8364-5d3601f43c1b/volumes" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.264246 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2" path="/var/lib/kubelet/pods/3e8ff55f-6ceb-43fc-bba7-18cc77c7d7d2/volumes" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.264970 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d0be589-d069-4183-b9f6-bde715ad716b" path="/var/lib/kubelet/pods/4d0be589-d069-4183-b9f6-bde715ad716b/volumes" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.265570 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11" path="/var/lib/kubelet/pods/59e7664c-cb9b-4ef1-a7ab-8f7c1b130b11/volumes" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.272966 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="663403bd-27d6-4584-9aaa-859445f539d6" path="/var/lib/kubelet/pods/663403bd-27d6-4584-9aaa-859445f539d6/volumes" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.273763 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d184a8b-881b-4f12-8c5b-cb355193fd98" path="/var/lib/kubelet/pods/7d184a8b-881b-4f12-8c5b-cb355193fd98/volumes" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.282751 4825 scope.go:117] "RemoveContainer" containerID="c5e15cd589465e5cfd321144406581f242067dc2f205033f4254ac94393e1ff6" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.286478 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d35d77e-fb8b-4aff-82f2-be95d6eae583" path="/var/lib/kubelet/pods/7d35d77e-fb8b-4aff-82f2-be95d6eae583/volumes" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.287243 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b061d3ea-7025-4381-a6b3-232d5998b15f" 
path="/var/lib/kubelet/pods/b061d3ea-7025-4381-a6b3-232d5998b15f/volumes" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.287729 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cba79823-f7ea-4ac9-80e0-ce6e216af1bc" path="/var/lib/kubelet/pods/cba79823-f7ea-4ac9-80e0-ce6e216af1bc/volumes" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.288240 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf42b4fa-76fe-4d07-bdc3-c05c94cb4f6f" path="/var/lib/kubelet/pods/cf42b4fa-76fe-4d07-bdc3-c05c94cb4f6f/volumes" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.289627 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2e08f36-91c9-4bad-b1fb-88a0938c4d25" path="/var/lib/kubelet/pods/d2e08f36-91c9-4bad-b1fb-88a0938c4d25/volumes" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.290564 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d907281e-010a-403c-ba2a-b8d178dacbb0" path="/var/lib/kubelet/pods/d907281e-010a-403c-ba2a-b8d178dacbb0/volumes" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.291992 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6301c97-5fb5-4f12-9181-ae937aa01b33" path="/var/lib/kubelet/pods/e6301c97-5fb5-4f12-9181-ae937aa01b33/volumes" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.308797 4825 scope.go:117] "RemoveContainer" containerID="f0fabece877d02360c9f8c4a495f79b17208aceb1c8e9d8dac1c99cae589b5d0" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.340881 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zj8r5\" (UniqueName: \"kubernetes.io/projected/757635f4-9aaf-48bb-bd50-60398db738d4-kube-api-access-zj8r5\") pod \"757635f4-9aaf-48bb-bd50-60398db738d4\" (UID: \"757635f4-9aaf-48bb-bd50-60398db738d4\") " Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.341001 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"757635f4-9aaf-48bb-bd50-60398db738d4\" (UID: \"757635f4-9aaf-48bb-bd50-60398db738d4\") " Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.341035 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/757635f4-9aaf-48bb-bd50-60398db738d4-kolla-config\") pod \"757635f4-9aaf-48bb-bd50-60398db738d4\" (UID: \"757635f4-9aaf-48bb-bd50-60398db738d4\") " Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.341054 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/757635f4-9aaf-48bb-bd50-60398db738d4-config-data-default\") pod \"757635f4-9aaf-48bb-bd50-60398db738d4\" (UID: \"757635f4-9aaf-48bb-bd50-60398db738d4\") " Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.341119 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/757635f4-9aaf-48bb-bd50-60398db738d4-combined-ca-bundle\") pod \"757635f4-9aaf-48bb-bd50-60398db738d4\" (UID: \"757635f4-9aaf-48bb-bd50-60398db738d4\") " Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.341163 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/757635f4-9aaf-48bb-bd50-60398db738d4-operator-scripts\") pod \"757635f4-9aaf-48bb-bd50-60398db738d4\" (UID: \"757635f4-9aaf-48bb-bd50-60398db738d4\") " Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.341199 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e808443d-0b2f-45a3-baa9-61e4d9819e6c-operator-scripts\") pod \"e808443d-0b2f-45a3-baa9-61e4d9819e6c\" (UID: \"e808443d-0b2f-45a3-baa9-61e4d9819e6c\") " Mar 10 07:09:27 
crc kubenswrapper[4825]: I0310 07:09:27.341249 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/757635f4-9aaf-48bb-bd50-60398db738d4-config-data-generated\") pod \"757635f4-9aaf-48bb-bd50-60398db738d4\" (UID: \"757635f4-9aaf-48bb-bd50-60398db738d4\") " Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.341293 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2dhk\" (UniqueName: \"kubernetes.io/projected/e808443d-0b2f-45a3-baa9-61e4d9819e6c-kube-api-access-d2dhk\") pod \"e808443d-0b2f-45a3-baa9-61e4d9819e6c\" (UID: \"e808443d-0b2f-45a3-baa9-61e4d9819e6c\") " Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.341323 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/757635f4-9aaf-48bb-bd50-60398db738d4-galera-tls-certs\") pod \"757635f4-9aaf-48bb-bd50-60398db738d4\" (UID: \"757635f4-9aaf-48bb-bd50-60398db738d4\") " Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.342660 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/757635f4-9aaf-48bb-bd50-60398db738d4-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "757635f4-9aaf-48bb-bd50-60398db738d4" (UID: "757635f4-9aaf-48bb-bd50-60398db738d4"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.342984 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/757635f4-9aaf-48bb-bd50-60398db738d4-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "757635f4-9aaf-48bb-bd50-60398db738d4" (UID: "757635f4-9aaf-48bb-bd50-60398db738d4"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.342999 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e808443d-0b2f-45a3-baa9-61e4d9819e6c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e808443d-0b2f-45a3-baa9-61e4d9819e6c" (UID: "e808443d-0b2f-45a3-baa9-61e4d9819e6c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.343617 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/757635f4-9aaf-48bb-bd50-60398db738d4-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "757635f4-9aaf-48bb-bd50-60398db738d4" (UID: "757635f4-9aaf-48bb-bd50-60398db738d4"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.343796 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/757635f4-9aaf-48bb-bd50-60398db738d4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "757635f4-9aaf-48bb-bd50-60398db738d4" (UID: "757635f4-9aaf-48bb-bd50-60398db738d4"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.343956 4825 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/757635f4-9aaf-48bb-bd50-60398db738d4-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.343984 4825 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/757635f4-9aaf-48bb-bd50-60398db738d4-config-data-default\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.343998 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e808443d-0b2f-45a3-baa9-61e4d9819e6c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.344012 4825 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/757635f4-9aaf-48bb-bd50-60398db738d4-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.347217 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e808443d-0b2f-45a3-baa9-61e4d9819e6c-kube-api-access-d2dhk" (OuterVolumeSpecName: "kube-api-access-d2dhk") pod "e808443d-0b2f-45a3-baa9-61e4d9819e6c" (UID: "e808443d-0b2f-45a3-baa9-61e4d9819e6c"). InnerVolumeSpecName "kube-api-access-d2dhk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.347449 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/757635f4-9aaf-48bb-bd50-60398db738d4-kube-api-access-zj8r5" (OuterVolumeSpecName: "kube-api-access-zj8r5") pod "757635f4-9aaf-48bb-bd50-60398db738d4" (UID: "757635f4-9aaf-48bb-bd50-60398db738d4"). 
InnerVolumeSpecName "kube-api-access-zj8r5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.359456 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "mysql-db") pod "757635f4-9aaf-48bb-bd50-60398db738d4" (UID: "757635f4-9aaf-48bb-bd50-60398db738d4"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.373668 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/757635f4-9aaf-48bb-bd50-60398db738d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "757635f4-9aaf-48bb-bd50-60398db738d4" (UID: "757635f4-9aaf-48bb-bd50-60398db738d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.392772 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/757635f4-9aaf-48bb-bd50-60398db738d4-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "757635f4-9aaf-48bb-bd50-60398db738d4" (UID: "757635f4-9aaf-48bb-bd50-60398db738d4"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.437201 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-bb7dcfc7-n79vf" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.446172 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zj8r5\" (UniqueName: \"kubernetes.io/projected/757635f4-9aaf-48bb-bd50-60398db738d4-kube-api-access-zj8r5\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.446222 4825 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.446236 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/757635f4-9aaf-48bb-bd50-60398db738d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.446248 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/757635f4-9aaf-48bb-bd50-60398db738d4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.446261 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2dhk\" (UniqueName: \"kubernetes.io/projected/e808443d-0b2f-45a3-baa9-61e4d9819e6c-kube-api-access-d2dhk\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.446271 4825 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/757635f4-9aaf-48bb-bd50-60398db738d4-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.462692 4825 scope.go:117] "RemoveContainer" containerID="c5e15cd589465e5cfd321144406581f242067dc2f205033f4254ac94393e1ff6" Mar 10 07:09:27 crc kubenswrapper[4825]: E0310 07:09:27.463502 4825 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c5e15cd589465e5cfd321144406581f242067dc2f205033f4254ac94393e1ff6\": container with ID starting with c5e15cd589465e5cfd321144406581f242067dc2f205033f4254ac94393e1ff6 not found: ID does not exist" containerID="c5e15cd589465e5cfd321144406581f242067dc2f205033f4254ac94393e1ff6" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.463536 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5e15cd589465e5cfd321144406581f242067dc2f205033f4254ac94393e1ff6"} err="failed to get container status \"c5e15cd589465e5cfd321144406581f242067dc2f205033f4254ac94393e1ff6\": rpc error: code = NotFound desc = could not find container \"c5e15cd589465e5cfd321144406581f242067dc2f205033f4254ac94393e1ff6\": container with ID starting with c5e15cd589465e5cfd321144406581f242067dc2f205033f4254ac94393e1ff6 not found: ID does not exist" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.463556 4825 scope.go:117] "RemoveContainer" containerID="f0fabece877d02360c9f8c4a495f79b17208aceb1c8e9d8dac1c99cae589b5d0" Mar 10 07:09:27 crc kubenswrapper[4825]: E0310 07:09:27.463797 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0fabece877d02360c9f8c4a495f79b17208aceb1c8e9d8dac1c99cae589b5d0\": container with ID starting with f0fabece877d02360c9f8c4a495f79b17208aceb1c8e9d8dac1c99cae589b5d0 not found: ID does not exist" containerID="f0fabece877d02360c9f8c4a495f79b17208aceb1c8e9d8dac1c99cae589b5d0" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.463820 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0fabece877d02360c9f8c4a495f79b17208aceb1c8e9d8dac1c99cae589b5d0"} err="failed to get container status \"f0fabece877d02360c9f8c4a495f79b17208aceb1c8e9d8dac1c99cae589b5d0\": rpc error: code = NotFound desc = could not find container 
\"f0fabece877d02360c9f8c4a495f79b17208aceb1c8e9d8dac1c99cae589b5d0\": container with ID starting with f0fabece877d02360c9f8c4a495f79b17208aceb1c8e9d8dac1c99cae589b5d0 not found: ID does not exist" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.463834 4825 scope.go:117] "RemoveContainer" containerID="c5e15cd589465e5cfd321144406581f242067dc2f205033f4254ac94393e1ff6" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.464002 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5e15cd589465e5cfd321144406581f242067dc2f205033f4254ac94393e1ff6"} err="failed to get container status \"c5e15cd589465e5cfd321144406581f242067dc2f205033f4254ac94393e1ff6\": rpc error: code = NotFound desc = could not find container \"c5e15cd589465e5cfd321144406581f242067dc2f205033f4254ac94393e1ff6\": container with ID starting with c5e15cd589465e5cfd321144406581f242067dc2f205033f4254ac94393e1ff6 not found: ID does not exist" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.464018 4825 scope.go:117] "RemoveContainer" containerID="f0fabece877d02360c9f8c4a495f79b17208aceb1c8e9d8dac1c99cae589b5d0" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.464202 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0fabece877d02360c9f8c4a495f79b17208aceb1c8e9d8dac1c99cae589b5d0"} err="failed to get container status \"f0fabece877d02360c9f8c4a495f79b17208aceb1c8e9d8dac1c99cae589b5d0\": rpc error: code = NotFound desc = could not find container \"f0fabece877d02360c9f8c4a495f79b17208aceb1c8e9d8dac1c99cae589b5d0\": container with ID starting with f0fabece877d02360c9f8c4a495f79b17208aceb1c8e9d8dac1c99cae589b5d0 not found: ID does not exist" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.477473 4825 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 10 07:09:27 crc 
kubenswrapper[4825]: I0310 07:09:27.547641 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af39779a-3a65-4379-9e92-d69ab1610fc6-combined-ca-bundle\") pod \"af39779a-3a65-4379-9e92-d69ab1610fc6\" (UID: \"af39779a-3a65-4379-9e92-d69ab1610fc6\") " Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.547795 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af39779a-3a65-4379-9e92-d69ab1610fc6-log-httpd\") pod \"af39779a-3a65-4379-9e92-d69ab1610fc6\" (UID: \"af39779a-3a65-4379-9e92-d69ab1610fc6\") " Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.547818 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af39779a-3a65-4379-9e92-d69ab1610fc6-run-httpd\") pod \"af39779a-3a65-4379-9e92-d69ab1610fc6\" (UID: \"af39779a-3a65-4379-9e92-d69ab1610fc6\") " Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.547840 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcznn\" (UniqueName: \"kubernetes.io/projected/af39779a-3a65-4379-9e92-d69ab1610fc6-kube-api-access-lcznn\") pod \"af39779a-3a65-4379-9e92-d69ab1610fc6\" (UID: \"af39779a-3a65-4379-9e92-d69ab1610fc6\") " Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.547891 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af39779a-3a65-4379-9e92-d69ab1610fc6-internal-tls-certs\") pod \"af39779a-3a65-4379-9e92-d69ab1610fc6\" (UID: \"af39779a-3a65-4379-9e92-d69ab1610fc6\") " Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.547914 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/af39779a-3a65-4379-9e92-d69ab1610fc6-etc-swift\") pod \"af39779a-3a65-4379-9e92-d69ab1610fc6\" (UID: \"af39779a-3a65-4379-9e92-d69ab1610fc6\") " Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.547950 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af39779a-3a65-4379-9e92-d69ab1610fc6-public-tls-certs\") pod \"af39779a-3a65-4379-9e92-d69ab1610fc6\" (UID: \"af39779a-3a65-4379-9e92-d69ab1610fc6\") " Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.548003 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af39779a-3a65-4379-9e92-d69ab1610fc6-config-data\") pod \"af39779a-3a65-4379-9e92-d69ab1610fc6\" (UID: \"af39779a-3a65-4379-9e92-d69ab1610fc6\") " Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.548451 4825 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.550351 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af39779a-3a65-4379-9e92-d69ab1610fc6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "af39779a-3a65-4379-9e92-d69ab1610fc6" (UID: "af39779a-3a65-4379-9e92-d69ab1610fc6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.550960 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af39779a-3a65-4379-9e92-d69ab1610fc6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "af39779a-3a65-4379-9e92-d69ab1610fc6" (UID: "af39779a-3a65-4379-9e92-d69ab1610fc6"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.556191 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af39779a-3a65-4379-9e92-d69ab1610fc6-kube-api-access-lcznn" (OuterVolumeSpecName: "kube-api-access-lcznn") pod "af39779a-3a65-4379-9e92-d69ab1610fc6" (UID: "af39779a-3a65-4379-9e92-d69ab1610fc6"). InnerVolumeSpecName "kube-api-access-lcznn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.559117 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af39779a-3a65-4379-9e92-d69ab1610fc6-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "af39779a-3a65-4379-9e92-d69ab1610fc6" (UID: "af39779a-3a65-4379-9e92-d69ab1610fc6"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.598293 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af39779a-3a65-4379-9e92-d69ab1610fc6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af39779a-3a65-4379-9e92-d69ab1610fc6" (UID: "af39779a-3a65-4379-9e92-d69ab1610fc6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.600786 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af39779a-3a65-4379-9e92-d69ab1610fc6-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "af39779a-3a65-4379-9e92-d69ab1610fc6" (UID: "af39779a-3a65-4379-9e92-d69ab1610fc6"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.603382 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af39779a-3a65-4379-9e92-d69ab1610fc6-config-data" (OuterVolumeSpecName: "config-data") pod "af39779a-3a65-4379-9e92-d69ab1610fc6" (UID: "af39779a-3a65-4379-9e92-d69ab1610fc6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.626907 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af39779a-3a65-4379-9e92-d69ab1610fc6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "af39779a-3a65-4379-9e92-d69ab1610fc6" (UID: "af39779a-3a65-4379-9e92-d69ab1610fc6"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.636671 4825 generic.go:334] "Generic (PLEG): container finished" podID="c684f3e4-ced9-49c3-aa54-515bc2c4fb56" containerID="a197f76cf3bde62283e41aa4013c9afdb9aa3afe6880137cee79408f1cdb53ae" exitCode=1 Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.636726 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-pwt26" event={"ID":"c684f3e4-ced9-49c3-aa54-515bc2c4fb56","Type":"ContainerDied","Data":"a197f76cf3bde62283e41aa4013c9afdb9aa3afe6880137cee79408f1cdb53ae"} Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.636759 4825 scope.go:117] "RemoveContainer" containerID="a281eeb65879a51d71eaf01bb8092df711b2e91f86c5156b3dc7040201fbe49a" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.637287 4825 scope.go:117] "RemoveContainer" containerID="a197f76cf3bde62283e41aa4013c9afdb9aa3afe6880137cee79408f1cdb53ae" Mar 10 07:09:27 crc kubenswrapper[4825]: E0310 07:09:27.637492 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"mariadb-account-create-update\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-pwt26_openstack(c684f3e4-ced9-49c3-aa54-515bc2c4fb56)\"" pod="openstack/root-account-create-update-pwt26" podUID="c684f3e4-ced9-49c3-aa54-515bc2c4fb56" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.644383 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-54d455c957-2jn47" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.644385 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-54d455c957-2jn47" event={"ID":"79caf655-43c2-401d-9967-97d9a35d9741","Type":"ContainerDied","Data":"53553d1e2e85651fb2a187ec88eff96e47b92e66639e428ccf1bf578323e6b0d"} Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.650613 4825 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af39779a-3a65-4379-9e92-d69ab1610fc6-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.650744 4825 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af39779a-3a65-4379-9e92-d69ab1610fc6-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.650833 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcznn\" (UniqueName: \"kubernetes.io/projected/af39779a-3a65-4379-9e92-d69ab1610fc6-kube-api-access-lcznn\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.650919 4825 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af39779a-3a65-4379-9e92-d69ab1610fc6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.651000 4825 
reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/af39779a-3a65-4379-9e92-d69ab1610fc6-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.651077 4825 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af39779a-3a65-4379-9e92-d69ab1610fc6-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.651160 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af39779a-3a65-4379-9e92-d69ab1610fc6-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.651243 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af39779a-3a65-4379-9e92-d69ab1610fc6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.653231 4825 generic.go:334] "Generic (PLEG): container finished" podID="757635f4-9aaf-48bb-bd50-60398db738d4" containerID="64283b83ccd27989eef21b478fef4edaf5b901a9f55b243380a9b8e95c69504f" exitCode=0 Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.653315 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.653389 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"757635f4-9aaf-48bb-bd50-60398db738d4","Type":"ContainerDied","Data":"64283b83ccd27989eef21b478fef4edaf5b901a9f55b243380a9b8e95c69504f"} Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.653431 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"757635f4-9aaf-48bb-bd50-60398db738d4","Type":"ContainerDied","Data":"8edd9f19672976cb56f4c38fafcec17938c94445105f5ca564ed9e0d46b66553"} Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.699497 4825 scope.go:117] "RemoveContainer" containerID="e0c5268cbaf4fcdd48619feb929fd81e110e9f60040fedec074af3e0608a9a8d" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.699716 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.699746 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"66c3beca-bb2c-4b68-a4b3-3cfe936c25fd","Type":"ContainerDied","Data":"73ff8081b411e296c16476218fa147d30c5d3bdabcdba8be0470daa44e29eb83"} Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.708525 4825 generic.go:334] "Generic (PLEG): container finished" podID="af39779a-3a65-4379-9e92-d69ab1610fc6" containerID="8f8c2fc2f611cbc298369095f2d9421d72e169c12c731c66dc4b7e2b27276c19" exitCode=0 Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.708565 4825 generic.go:334] "Generic (PLEG): container finished" podID="af39779a-3a65-4379-9e92-d69ab1610fc6" containerID="2dd5d463672a820d3d3d4f47fbb2be409cc30d40e5202b4d480569146558f8cd" exitCode=0 Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.708604 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-proxy-bb7dcfc7-n79vf" event={"ID":"af39779a-3a65-4379-9e92-d69ab1610fc6","Type":"ContainerDied","Data":"8f8c2fc2f611cbc298369095f2d9421d72e169c12c731c66dc4b7e2b27276c19"} Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.708639 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-bb7dcfc7-n79vf" event={"ID":"af39779a-3a65-4379-9e92-d69ab1610fc6","Type":"ContainerDied","Data":"2dd5d463672a820d3d3d4f47fbb2be409cc30d40e5202b4d480569146558f8cd"} Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.708651 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-bb7dcfc7-n79vf" event={"ID":"af39779a-3a65-4379-9e92-d69ab1610fc6","Type":"ContainerDied","Data":"032212a00cfecef0137bdd8ffd7c86108ce60785d0d5c2e541cc86c0f1a910fe"} Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.708852 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-bb7dcfc7-n79vf" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.729765 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1e7f-account-create-update-rfzld" event={"ID":"e808443d-0b2f-45a3-baa9-61e4d9819e6c","Type":"ContainerDied","Data":"7eee7a7c47b0665f1acabe7ae84c3da45d155ec4e0dfb6c96eb0fd896337ed1b"} Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.731324 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-1e7f-account-create-update-rfzld" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.745449 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-54d455c957-2jn47"] Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.749778 4825 scope.go:117] "RemoveContainer" containerID="b0dc5fbc17a28febd97f83275e735288d32e837f8d7e53a674a6790d1cb06167" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.750256 4825 generic.go:334] "Generic (PLEG): container finished" podID="56c74365-656c-4362-8358-bbb17d0c8be0" containerID="506bc72445e5baadada4d8eeada583023eaf1965303dabc6bfea713ddd7e5cda" exitCode=0 Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.750316 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55b54bdfdb-8b9ns" event={"ID":"56c74365-656c-4362-8358-bbb17d0c8be0","Type":"ContainerDied","Data":"506bc72445e5baadada4d8eeada583023eaf1965303dabc6bfea713ddd7e5cda"} Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.758621 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-54d455c957-2jn47"] Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.777580 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.788847 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.790023 4825 scope.go:117] "RemoveContainer" containerID="64283b83ccd27989eef21b478fef4edaf5b901a9f55b243380a9b8e95c69504f" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.820198 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.825093 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 07:09:27 
crc kubenswrapper[4825]: I0310 07:09:27.830373 4825 scope.go:117] "RemoveContainer" containerID="6289da05c1ca1af51b3d7bc91bf14b2681c592f4f4c2d675793e7468ec8ede52" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.846632 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-bb7dcfc7-n79vf"] Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.853203 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-bb7dcfc7-n79vf"] Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.861419 4825 scope.go:117] "RemoveContainer" containerID="64283b83ccd27989eef21b478fef4edaf5b901a9f55b243380a9b8e95c69504f" Mar 10 07:09:27 crc kubenswrapper[4825]: E0310 07:09:27.865519 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64283b83ccd27989eef21b478fef4edaf5b901a9f55b243380a9b8e95c69504f\": container with ID starting with 64283b83ccd27989eef21b478fef4edaf5b901a9f55b243380a9b8e95c69504f not found: ID does not exist" containerID="64283b83ccd27989eef21b478fef4edaf5b901a9f55b243380a9b8e95c69504f" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.865560 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64283b83ccd27989eef21b478fef4edaf5b901a9f55b243380a9b8e95c69504f"} err="failed to get container status \"64283b83ccd27989eef21b478fef4edaf5b901a9f55b243380a9b8e95c69504f\": rpc error: code = NotFound desc = could not find container \"64283b83ccd27989eef21b478fef4edaf5b901a9f55b243380a9b8e95c69504f\": container with ID starting with 64283b83ccd27989eef21b478fef4edaf5b901a9f55b243380a9b8e95c69504f not found: ID does not exist" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.865585 4825 scope.go:117] "RemoveContainer" containerID="6289da05c1ca1af51b3d7bc91bf14b2681c592f4f4c2d675793e7468ec8ede52" Mar 10 07:09:27 crc kubenswrapper[4825]: E0310 07:09:27.865961 4825 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6289da05c1ca1af51b3d7bc91bf14b2681c592f4f4c2d675793e7468ec8ede52\": container with ID starting with 6289da05c1ca1af51b3d7bc91bf14b2681c592f4f4c2d675793e7468ec8ede52 not found: ID does not exist" containerID="6289da05c1ca1af51b3d7bc91bf14b2681c592f4f4c2d675793e7468ec8ede52" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.865982 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6289da05c1ca1af51b3d7bc91bf14b2681c592f4f4c2d675793e7468ec8ede52"} err="failed to get container status \"6289da05c1ca1af51b3d7bc91bf14b2681c592f4f4c2d675793e7468ec8ede52\": rpc error: code = NotFound desc = could not find container \"6289da05c1ca1af51b3d7bc91bf14b2681c592f4f4c2d675793e7468ec8ede52\": container with ID starting with 6289da05c1ca1af51b3d7bc91bf14b2681c592f4f4c2d675793e7468ec8ede52 not found: ID does not exist" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.865995 4825 scope.go:117] "RemoveContainer" containerID="84161414c9dbeb64c9936bc248a59e4b32d0591a084774c91d9da931fd27e821" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.866060 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-1e7f-account-create-update-rfzld"] Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.873175 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-1e7f-account-create-update-rfzld"] Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.934355 4825 scope.go:117] "RemoveContainer" containerID="8f8c2fc2f611cbc298369095f2d9421d72e169c12c731c66dc4b7e2b27276c19" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.935543 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-55b54bdfdb-8b9ns" Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.954764 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.955250 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e516b5cf-d44c-4f03-9247-7727319f0a85" containerName="ceilometer-central-agent" containerID="cri-o://b5fc534b6fc5bdb62a0dd359fd61377bd0c4ca9f7943fd237ff73b6a79274f5c" gracePeriod=30 Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.955294 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e516b5cf-d44c-4f03-9247-7727319f0a85" containerName="proxy-httpd" containerID="cri-o://062160b5e34d30d57397fa7d0b1b030e8fd845e13121bd5d0cdea1d6a20a2de8" gracePeriod=30 Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.955364 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e516b5cf-d44c-4f03-9247-7727319f0a85" containerName="sg-core" containerID="cri-o://b95484ef86f19d2fa040d657d6e6e16f10d17dd575ffd8009352291bebecc277" gracePeriod=30 Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.955398 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e516b5cf-d44c-4f03-9247-7727319f0a85" containerName="ceilometer-notification-agent" containerID="cri-o://d021be5f73ead6231f65789871b1004af37bd859e93d3efa1540dedd07f320d9" gracePeriod=30 Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.987944 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 07:09:27 crc kubenswrapper[4825]: I0310 07:09:27.988204 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="04a2aca4-f98f-4ae8-aca9-62ae6625e5e2" 
containerName="kube-state-metrics" containerID="cri-o://fe8d444c42b3290f6dd09fc28b48048d43ae16fceb5be5dbfbb283eec952359c" gracePeriod=30 Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.058010 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56c74365-656c-4362-8358-bbb17d0c8be0-scripts\") pod \"56c74365-656c-4362-8358-bbb17d0c8be0\" (UID: \"56c74365-656c-4362-8358-bbb17d0c8be0\") " Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.058105 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56c74365-656c-4362-8358-bbb17d0c8be0-config-data\") pod \"56c74365-656c-4362-8358-bbb17d0c8be0\" (UID: \"56c74365-656c-4362-8358-bbb17d0c8be0\") " Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.058230 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/56c74365-656c-4362-8358-bbb17d0c8be0-public-tls-certs\") pod \"56c74365-656c-4362-8358-bbb17d0c8be0\" (UID: \"56c74365-656c-4362-8358-bbb17d0c8be0\") " Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.058323 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/56c74365-656c-4362-8358-bbb17d0c8be0-internal-tls-certs\") pod \"56c74365-656c-4362-8358-bbb17d0c8be0\" (UID: \"56c74365-656c-4362-8358-bbb17d0c8be0\") " Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.058383 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56c74365-656c-4362-8358-bbb17d0c8be0-logs\") pod \"56c74365-656c-4362-8358-bbb17d0c8be0\" (UID: \"56c74365-656c-4362-8358-bbb17d0c8be0\") " Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.058403 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56c74365-656c-4362-8358-bbb17d0c8be0-combined-ca-bundle\") pod \"56c74365-656c-4362-8358-bbb17d0c8be0\" (UID: \"56c74365-656c-4362-8358-bbb17d0c8be0\") " Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.058430 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2kp2\" (UniqueName: \"kubernetes.io/projected/56c74365-656c-4362-8358-bbb17d0c8be0-kube-api-access-x2kp2\") pod \"56c74365-656c-4362-8358-bbb17d0c8be0\" (UID: \"56c74365-656c-4362-8358-bbb17d0c8be0\") " Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.061587 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56c74365-656c-4362-8358-bbb17d0c8be0-logs" (OuterVolumeSpecName: "logs") pod "56c74365-656c-4362-8358-bbb17d0c8be0" (UID: "56c74365-656c-4362-8358-bbb17d0c8be0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.064025 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56c74365-656c-4362-8358-bbb17d0c8be0-kube-api-access-x2kp2" (OuterVolumeSpecName: "kube-api-access-x2kp2") pod "56c74365-656c-4362-8358-bbb17d0c8be0" (UID: "56c74365-656c-4362-8358-bbb17d0c8be0"). InnerVolumeSpecName "kube-api-access-x2kp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.067963 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56c74365-656c-4362-8358-bbb17d0c8be0-scripts" (OuterVolumeSpecName: "scripts") pod "56c74365-656c-4362-8358-bbb17d0c8be0" (UID: "56c74365-656c-4362-8358-bbb17d0c8be0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.161476 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56c74365-656c-4362-8358-bbb17d0c8be0-logs\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.161513 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2kp2\" (UniqueName: \"kubernetes.io/projected/56c74365-656c-4362-8358-bbb17d0c8be0-kube-api-access-x2kp2\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.161526 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56c74365-656c-4362-8358-bbb17d0c8be0-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.173198 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.173479 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="5d4252ca-f279-4c95-8f10-205339d028a5" containerName="memcached" containerID="cri-o://4c5195f2b444f93836dcb7df7bcc67ecb17ab72fc077c910233e60c95d78edb3" gracePeriod=30 Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.266105 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5fb2-account-create-update-vpbd9"] Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.326201 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-5fb2-account-create-update-vpbd9"] Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.338247 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5fb2-account-create-update-z6tqt"] Mar 10 07:09:28 crc kubenswrapper[4825]: E0310 07:09:28.338740 4825 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7d184a8b-881b-4f12-8c5b-cb355193fd98" containerName="init" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.338759 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d184a8b-881b-4f12-8c5b-cb355193fd98" containerName="init" Mar 10 07:09:28 crc kubenswrapper[4825]: E0310 07:09:28.338783 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="757635f4-9aaf-48bb-bd50-60398db738d4" containerName="galera" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.338792 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="757635f4-9aaf-48bb-bd50-60398db738d4" containerName="galera" Mar 10 07:09:28 crc kubenswrapper[4825]: E0310 07:09:28.338810 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d0be589-d069-4183-b9f6-bde715ad716b" containerName="ovsdbserver-nb" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.338819 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d0be589-d069-4183-b9f6-bde715ad716b" containerName="ovsdbserver-nb" Mar 10 07:09:28 crc kubenswrapper[4825]: E0310 07:09:28.338834 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="757635f4-9aaf-48bb-bd50-60398db738d4" containerName="mysql-bootstrap" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.338843 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="757635f4-9aaf-48bb-bd50-60398db738d4" containerName="mysql-bootstrap" Mar 10 07:09:28 crc kubenswrapper[4825]: E0310 07:09:28.338857 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79caf655-43c2-401d-9967-97d9a35d9741" containerName="barbican-worker-log" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.338865 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="79caf655-43c2-401d-9967-97d9a35d9741" containerName="barbican-worker-log" Mar 10 07:09:28 crc kubenswrapper[4825]: E0310 07:09:28.338881 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66c3beca-bb2c-4b68-a4b3-3cfe936c25fd" 
containerName="nova-cell1-novncproxy-novncproxy" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.338889 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="66c3beca-bb2c-4b68-a4b3-3cfe936c25fd" containerName="nova-cell1-novncproxy-novncproxy" Mar 10 07:09:28 crc kubenswrapper[4825]: E0310 07:09:28.338902 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6301c97-5fb5-4f12-9181-ae937aa01b33" containerName="ovsdbserver-sb" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.338910 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6301c97-5fb5-4f12-9181-ae937aa01b33" containerName="ovsdbserver-sb" Mar 10 07:09:28 crc kubenswrapper[4825]: E0310 07:09:28.338924 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d184a8b-881b-4f12-8c5b-cb355193fd98" containerName="dnsmasq-dns" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.338931 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d184a8b-881b-4f12-8c5b-cb355193fd98" containerName="dnsmasq-dns" Mar 10 07:09:28 crc kubenswrapper[4825]: E0310 07:09:28.338944 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79caf655-43c2-401d-9967-97d9a35d9741" containerName="barbican-worker" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.338952 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="79caf655-43c2-401d-9967-97d9a35d9741" containerName="barbican-worker" Mar 10 07:09:28 crc kubenswrapper[4825]: E0310 07:09:28.338966 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56c74365-656c-4362-8358-bbb17d0c8be0" containerName="placement-log" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.338974 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="56c74365-656c-4362-8358-bbb17d0c8be0" containerName="placement-log" Mar 10 07:09:28 crc kubenswrapper[4825]: E0310 07:09:28.338986 4825 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="56c74365-656c-4362-8358-bbb17d0c8be0" containerName="placement-api" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.338994 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="56c74365-656c-4362-8358-bbb17d0c8be0" containerName="placement-api" Mar 10 07:09:28 crc kubenswrapper[4825]: E0310 07:09:28.339008 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6301c97-5fb5-4f12-9181-ae937aa01b33" containerName="openstack-network-exporter" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.339016 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6301c97-5fb5-4f12-9181-ae937aa01b33" containerName="openstack-network-exporter" Mar 10 07:09:28 crc kubenswrapper[4825]: E0310 07:09:28.339031 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d907281e-010a-403c-ba2a-b8d178dacbb0" containerName="barbican-keystone-listener-log" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.339038 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="d907281e-010a-403c-ba2a-b8d178dacbb0" containerName="barbican-keystone-listener-log" Mar 10 07:09:28 crc kubenswrapper[4825]: E0310 07:09:28.339050 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b7468b7-ceaa-44b4-8364-5d3601f43c1b" containerName="ovn-controller" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.339060 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b7468b7-ceaa-44b4-8364-5d3601f43c1b" containerName="ovn-controller" Mar 10 07:09:28 crc kubenswrapper[4825]: E0310 07:09:28.339074 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d907281e-010a-403c-ba2a-b8d178dacbb0" containerName="barbican-keystone-listener" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.339084 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="d907281e-010a-403c-ba2a-b8d178dacbb0" containerName="barbican-keystone-listener" Mar 10 07:09:28 crc kubenswrapper[4825]: E0310 07:09:28.339098 4825 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2e08f36-91c9-4bad-b1fb-88a0938c4d25" containerName="openstack-network-exporter" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.339105 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2e08f36-91c9-4bad-b1fb-88a0938c4d25" containerName="openstack-network-exporter" Mar 10 07:09:28 crc kubenswrapper[4825]: E0310 07:09:28.339120 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d0be589-d069-4183-b9f6-bde715ad716b" containerName="openstack-network-exporter" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.339150 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d0be589-d069-4183-b9f6-bde715ad716b" containerName="openstack-network-exporter" Mar 10 07:09:28 crc kubenswrapper[4825]: E0310 07:09:28.339168 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af39779a-3a65-4379-9e92-d69ab1610fc6" containerName="proxy-server" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.339177 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="af39779a-3a65-4379-9e92-d69ab1610fc6" containerName="proxy-server" Mar 10 07:09:28 crc kubenswrapper[4825]: E0310 07:09:28.339185 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af39779a-3a65-4379-9e92-d69ab1610fc6" containerName="proxy-httpd" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.339193 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="af39779a-3a65-4379-9e92-d69ab1610fc6" containerName="proxy-httpd" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.339414 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="af39779a-3a65-4379-9e92-d69ab1610fc6" containerName="proxy-httpd" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.339436 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d184a8b-881b-4f12-8c5b-cb355193fd98" containerName="dnsmasq-dns" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 
07:09:28.339450 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="d907281e-010a-403c-ba2a-b8d178dacbb0" containerName="barbican-keystone-listener-log" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.339460 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="d907281e-010a-403c-ba2a-b8d178dacbb0" containerName="barbican-keystone-listener" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.339468 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="757635f4-9aaf-48bb-bd50-60398db738d4" containerName="galera" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.339480 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="56c74365-656c-4362-8358-bbb17d0c8be0" containerName="placement-api" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.339496 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="af39779a-3a65-4379-9e92-d69ab1610fc6" containerName="proxy-server" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.339506 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d0be589-d069-4183-b9f6-bde715ad716b" containerName="ovsdbserver-nb" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.339521 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6301c97-5fb5-4f12-9181-ae937aa01b33" containerName="ovsdbserver-sb" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.339531 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="56c74365-656c-4362-8358-bbb17d0c8be0" containerName="placement-log" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.339541 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="79caf655-43c2-401d-9967-97d9a35d9741" containerName="barbican-worker-log" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.339557 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b7468b7-ceaa-44b4-8364-5d3601f43c1b" containerName="ovn-controller" Mar 
10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.339571 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="66c3beca-bb2c-4b68-a4b3-3cfe936c25fd" containerName="nova-cell1-novncproxy-novncproxy" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.339586 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2e08f36-91c9-4bad-b1fb-88a0938c4d25" containerName="openstack-network-exporter" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.339597 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d0be589-d069-4183-b9f6-bde715ad716b" containerName="openstack-network-exporter" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.339610 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="79caf655-43c2-401d-9967-97d9a35d9741" containerName="barbican-worker" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.339623 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6301c97-5fb5-4f12-9181-ae937aa01b33" containerName="openstack-network-exporter" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.340406 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5fb2-account-create-update-z6tqt" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.348185 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5fb2-account-create-update-z6tqt"] Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.348374 4825 scope.go:117] "RemoveContainer" containerID="2dd5d463672a820d3d3d4f47fbb2be409cc30d40e5202b4d480569146558f8cd" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.348549 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="209fbdac-3b3c-451a-ae30-5888e1cbb891" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.174:8776/healthcheck\": read tcp 10.217.0.2:55572->10.217.0.174:8776: read: connection reset by peer" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.348950 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.368322 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-t7ktm"] Mar 10 07:09:28 crc kubenswrapper[4825]: E0310 07:09:28.413642 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3d70d139a1deb1c6a27cdf349fc1cc7878472659a23957efa5657e5551a800ef" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 10 07:09:28 crc kubenswrapper[4825]: E0310 07:09:28.415749 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3d70d139a1deb1c6a27cdf349fc1cc7878472659a23957efa5657e5551a800ef" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 10 07:09:28 crc kubenswrapper[4825]: E0310 07:09:28.436851 4825 
log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3d70d139a1deb1c6a27cdf349fc1cc7878472659a23957efa5657e5551a800ef" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 10 07:09:28 crc kubenswrapper[4825]: E0310 07:09:28.436912 4825 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="9965b351-ed73-4c38-b393-ff72ba48cd66" containerName="nova-cell1-conductor-conductor" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.437532 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-mcpm6"] Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.448908 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56c74365-656c-4362-8358-bbb17d0c8be0-config-data" (OuterVolumeSpecName: "config-data") pod "56c74365-656c-4362-8358-bbb17d0c8be0" (UID: "56c74365-656c-4362-8358-bbb17d0c8be0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.452446 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56c74365-656c-4362-8358-bbb17d0c8be0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56c74365-656c-4362-8358-bbb17d0c8be0" (UID: "56c74365-656c-4362-8358-bbb17d0c8be0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.483263 4825 scope.go:117] "RemoveContainer" containerID="8f8c2fc2f611cbc298369095f2d9421d72e169c12c731c66dc4b7e2b27276c19" Mar 10 07:09:28 crc kubenswrapper[4825]: E0310 07:09:28.495321 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f8c2fc2f611cbc298369095f2d9421d72e169c12c731c66dc4b7e2b27276c19\": container with ID starting with 8f8c2fc2f611cbc298369095f2d9421d72e169c12c731c66dc4b7e2b27276c19 not found: ID does not exist" containerID="8f8c2fc2f611cbc298369095f2d9421d72e169c12c731c66dc4b7e2b27276c19" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.495637 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f8c2fc2f611cbc298369095f2d9421d72e169c12c731c66dc4b7e2b27276c19"} err="failed to get container status \"8f8c2fc2f611cbc298369095f2d9421d72e169c12c731c66dc4b7e2b27276c19\": rpc error: code = NotFound desc = could not find container \"8f8c2fc2f611cbc298369095f2d9421d72e169c12c731c66dc4b7e2b27276c19\": container with ID starting with 8f8c2fc2f611cbc298369095f2d9421d72e169c12c731c66dc4b7e2b27276c19 not found: ID does not exist" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.495665 4825 scope.go:117] "RemoveContainer" containerID="2dd5d463672a820d3d3d4f47fbb2be409cc30d40e5202b4d480569146558f8cd" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.496848 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a668a190-c176-4d20-b26f-4ac28fa83476-operator-scripts\") pod \"keystone-5fb2-account-create-update-z6tqt\" (UID: \"a668a190-c176-4d20-b26f-4ac28fa83476\") " pod="openstack/keystone-5fb2-account-create-update-z6tqt" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.496973 4825 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22k77\" (UniqueName: \"kubernetes.io/projected/a668a190-c176-4d20-b26f-4ac28fa83476-kube-api-access-22k77\") pod \"keystone-5fb2-account-create-update-z6tqt\" (UID: \"a668a190-c176-4d20-b26f-4ac28fa83476\") " pod="openstack/keystone-5fb2-account-create-update-z6tqt" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.497055 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56c74365-656c-4362-8358-bbb17d0c8be0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.497080 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56c74365-656c-4362-8358-bbb17d0c8be0-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.497116 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-t7ktm"] Mar 10 07:09:28 crc kubenswrapper[4825]: E0310 07:09:28.501470 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dd5d463672a820d3d3d4f47fbb2be409cc30d40e5202b4d480569146558f8cd\": container with ID starting with 2dd5d463672a820d3d3d4f47fbb2be409cc30d40e5202b4d480569146558f8cd not found: ID does not exist" containerID="2dd5d463672a820d3d3d4f47fbb2be409cc30d40e5202b4d480569146558f8cd" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.501518 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dd5d463672a820d3d3d4f47fbb2be409cc30d40e5202b4d480569146558f8cd"} err="failed to get container status \"2dd5d463672a820d3d3d4f47fbb2be409cc30d40e5202b4d480569146558f8cd\": rpc error: code = NotFound desc = could not find container 
\"2dd5d463672a820d3d3d4f47fbb2be409cc30d40e5202b4d480569146558f8cd\": container with ID starting with 2dd5d463672a820d3d3d4f47fbb2be409cc30d40e5202b4d480569146558f8cd not found: ID does not exist" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.501536 4825 scope.go:117] "RemoveContainer" containerID="8f8c2fc2f611cbc298369095f2d9421d72e169c12c731c66dc4b7e2b27276c19" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.508381 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f8c2fc2f611cbc298369095f2d9421d72e169c12c731c66dc4b7e2b27276c19"} err="failed to get container status \"8f8c2fc2f611cbc298369095f2d9421d72e169c12c731c66dc4b7e2b27276c19\": rpc error: code = NotFound desc = could not find container \"8f8c2fc2f611cbc298369095f2d9421d72e169c12c731c66dc4b7e2b27276c19\": container with ID starting with 8f8c2fc2f611cbc298369095f2d9421d72e169c12c731c66dc4b7e2b27276c19 not found: ID does not exist" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.508411 4825 scope.go:117] "RemoveContainer" containerID="2dd5d463672a820d3d3d4f47fbb2be409cc30d40e5202b4d480569146558f8cd" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.517227 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dd5d463672a820d3d3d4f47fbb2be409cc30d40e5202b4d480569146558f8cd"} err="failed to get container status \"2dd5d463672a820d3d3d4f47fbb2be409cc30d40e5202b4d480569146558f8cd\": rpc error: code = NotFound desc = could not find container \"2dd5d463672a820d3d3d4f47fbb2be409cc30d40e5202b4d480569146558f8cd\": container with ID starting with 2dd5d463672a820d3d3d4f47fbb2be409cc30d40e5202b4d480569146558f8cd not found: ID does not exist" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.543193 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-mcpm6"] Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.568223 4825 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/keystone-599df5898d-bqcpr"] Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.568522 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-599df5898d-bqcpr" podUID="86dc5ac4-dac8-4fdf-bba9-06d13efacd53" containerName="keystone-api" containerID="cri-o://878047a39225dab631a6decdc7de8a0629c77fdf4b999f47c2c1d0353e8c9e86" gracePeriod=30 Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.598072 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22k77\" (UniqueName: \"kubernetes.io/projected/a668a190-c176-4d20-b26f-4ac28fa83476-kube-api-access-22k77\") pod \"keystone-5fb2-account-create-update-z6tqt\" (UID: \"a668a190-c176-4d20-b26f-4ac28fa83476\") " pod="openstack/keystone-5fb2-account-create-update-z6tqt" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.598186 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a668a190-c176-4d20-b26f-4ac28fa83476-operator-scripts\") pod \"keystone-5fb2-account-create-update-z6tqt\" (UID: \"a668a190-c176-4d20-b26f-4ac28fa83476\") " pod="openstack/keystone-5fb2-account-create-update-z6tqt" Mar 10 07:09:28 crc kubenswrapper[4825]: E0310 07:09:28.598333 4825 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 10 07:09:28 crc kubenswrapper[4825]: E0310 07:09:28.598381 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a668a190-c176-4d20-b26f-4ac28fa83476-operator-scripts podName:a668a190-c176-4d20-b26f-4ac28fa83476 nodeName:}" failed. No retries permitted until 2026-03-10 07:09:29.098368473 +0000 UTC m=+1522.128149088 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/a668a190-c176-4d20-b26f-4ac28fa83476-operator-scripts") pod "keystone-5fb2-account-create-update-z6tqt" (UID: "a668a190-c176-4d20-b26f-4ac28fa83476") : configmap "openstack-scripts" not found Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.605284 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Mar 10 07:09:28 crc kubenswrapper[4825]: E0310 07:09:28.611510 4825 projected.go:194] Error preparing data for projected volume kube-api-access-22k77 for pod openstack/keystone-5fb2-account-create-update-z6tqt: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 10 07:09:28 crc kubenswrapper[4825]: E0310 07:09:28.611577 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a668a190-c176-4d20-b26f-4ac28fa83476-kube-api-access-22k77 podName:a668a190-c176-4d20-b26f-4ac28fa83476 nodeName:}" failed. No retries permitted until 2026-03-10 07:09:29.111561507 +0000 UTC m=+1522.141342122 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-22k77" (UniqueName: "kubernetes.io/projected/a668a190-c176-4d20-b26f-4ac28fa83476-kube-api-access-22k77") pod "keystone-5fb2-account-create-update-z6tqt" (UID: "a668a190-c176-4d20-b26f-4ac28fa83476") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.660523 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-gpq78"] Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.663538 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-gpq78"] Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.673421 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56c74365-656c-4362-8358-bbb17d0c8be0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "56c74365-656c-4362-8358-bbb17d0c8be0" (UID: "56c74365-656c-4362-8358-bbb17d0c8be0"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.684435 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5fb2-account-create-update-z6tqt"] Mar 10 07:09:28 crc kubenswrapper[4825]: E0310 07:09:28.685182 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-22k77 operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystone-5fb2-account-create-update-z6tqt" podUID="a668a190-c176-4d20-b26f-4ac28fa83476" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.687359 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="b1222a91-b344-4f54-bf9f-75d03d5f8549" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.215:8775/\": read tcp 10.217.0.2:34014->10.217.0.215:8775: read: connection reset by peer" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.687379 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="b1222a91-b344-4f54-bf9f-75d03d5f8549" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.215:8775/\": read tcp 10.217.0.2:34024->10.217.0.215:8775: read: connection reset by peer" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.699579 4825 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/56c74365-656c-4362-8358-bbb17d0c8be0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.724972 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-pwt26"] Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.725333 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/56c74365-656c-4362-8358-bbb17d0c8be0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "56c74365-656c-4362-8358-bbb17d0c8be0" (UID: "56c74365-656c-4362-8358-bbb17d0c8be0"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:28 crc kubenswrapper[4825]: E0310 07:09:28.725630 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="23d5b7cd25a646bc4179b5e41cb642aa99ae3019bdbfa362dc0578d525c4bfdf" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 07:09:28 crc kubenswrapper[4825]: E0310 07:09:28.738738 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="23d5b7cd25a646bc4179b5e41cb642aa99ae3019bdbfa362dc0578d525c4bfdf" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 07:09:28 crc kubenswrapper[4825]: E0310 07:09:28.746204 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="23d5b7cd25a646bc4179b5e41cb642aa99ae3019bdbfa362dc0578d525c4bfdf" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 07:09:28 crc kubenswrapper[4825]: E0310 07:09:28.746264 4825 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="85f5995b-4ad8-4840-82f1-6659152c3ed4" containerName="nova-scheduler-scheduler" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.777436 4825 generic.go:334] "Generic (PLEG): container finished" 
podID="04a2aca4-f98f-4ae8-aca9-62ae6625e5e2" containerID="fe8d444c42b3290f6dd09fc28b48048d43ae16fceb5be5dbfbb283eec952359c" exitCode=2 Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.777493 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"04a2aca4-f98f-4ae8-aca9-62ae6625e5e2","Type":"ContainerDied","Data":"fe8d444c42b3290f6dd09fc28b48048d43ae16fceb5be5dbfbb283eec952359c"} Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.793332 4825 generic.go:334] "Generic (PLEG): container finished" podID="f33e10c8-8be2-46d4-8653-1960855a2a40" containerID="eb096eba10587e7f334a63bf327913dd03205b29419eb194a05f626f07286905" exitCode=0 Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.793418 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f33e10c8-8be2-46d4-8653-1960855a2a40","Type":"ContainerDied","Data":"eb096eba10587e7f334a63bf327913dd03205b29419eb194a05f626f07286905"} Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.808731 4825 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/56c74365-656c-4362-8358-bbb17d0c8be0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.815751 4825 generic.go:334] "Generic (PLEG): container finished" podID="8b3d94ce-f8d8-4653-a6b0-2682b23d834e" containerID="b75f52a9c2b47ee09bd6f6c1c6a4ac378afdcf400070a6ab9a12012c65129fcc" exitCode=0 Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.815810 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8b3d94ce-f8d8-4653-a6b0-2682b23d834e","Type":"ContainerDied","Data":"b75f52a9c2b47ee09bd6f6c1c6a4ac378afdcf400070a6ab9a12012c65129fcc"} Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.845377 4825 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/barbican-api-9c98f44d4-hm6qz" podUID="42d8b8af-9916-4aba-b4e7-f825ed30f182" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.167:9311/healthcheck\": read tcp 10.217.0.2:41628->10.217.0.167:9311: read: connection reset by peer" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.845436 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-9c98f44d4-hm6qz" podUID="42d8b8af-9916-4aba-b4e7-f825ed30f182" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.167:9311/healthcheck\": read tcp 10.217.0.2:41638->10.217.0.167:9311: read: connection reset by peer" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.845629 4825 generic.go:334] "Generic (PLEG): container finished" podID="e516b5cf-d44c-4f03-9247-7727319f0a85" containerID="062160b5e34d30d57397fa7d0b1b030e8fd845e13121bd5d0cdea1d6a20a2de8" exitCode=0 Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.845656 4825 generic.go:334] "Generic (PLEG): container finished" podID="e516b5cf-d44c-4f03-9247-7727319f0a85" containerID="b95484ef86f19d2fa040d657d6e6e16f10d17dd575ffd8009352291bebecc277" exitCode=2 Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.845729 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e516b5cf-d44c-4f03-9247-7727319f0a85","Type":"ContainerDied","Data":"062160b5e34d30d57397fa7d0b1b030e8fd845e13121bd5d0cdea1d6a20a2de8"} Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.845762 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e516b5cf-d44c-4f03-9247-7727319f0a85","Type":"ContainerDied","Data":"b95484ef86f19d2fa040d657d6e6e16f10d17dd575ffd8009352291bebecc277"} Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.877220 4825 generic.go:334] "Generic (PLEG): container finished" podID="fcf819fc-0560-45a8-be5e-04e6ef2bbf32" 
containerID="b0692876912e4dddda340c37876d5dcd6e82ee0c786afdeaa20b480be4903f55" exitCode=0 Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.877297 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fcf819fc-0560-45a8-be5e-04e6ef2bbf32","Type":"ContainerDied","Data":"b0692876912e4dddda340c37876d5dcd6e82ee0c786afdeaa20b480be4903f55"} Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.881903 4825 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/root-account-create-update-pwt26" secret="" err="secret \"galera-openstack-dockercfg-w8jxw\" not found" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.881942 4825 scope.go:117] "RemoveContainer" containerID="a197f76cf3bde62283e41aa4013c9afdb9aa3afe6880137cee79408f1cdb53ae" Mar 10 07:09:28 crc kubenswrapper[4825]: E0310 07:09:28.882112 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-pwt26_openstack(c684f3e4-ced9-49c3-aa54-515bc2c4fb56)\"" pod="openstack/root-account-create-update-pwt26" podUID="c684f3e4-ced9-49c3-aa54-515bc2c4fb56" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.902664 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="1b5b179f-f4fb-479c-9720-25587566c518" containerName="galera" containerID="cri-o://c5251bd347e4e2a0285d072a7e65894076feb5953eaeecf9032c40e2a8952418" gracePeriod=30 Mar 10 07:09:28 crc kubenswrapper[4825]: E0310 07:09:28.912759 4825 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 10 07:09:28 crc kubenswrapper[4825]: E0310 07:09:28.912807 4825 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/c684f3e4-ced9-49c3-aa54-515bc2c4fb56-operator-scripts podName:c684f3e4-ced9-49c3-aa54-515bc2c4fb56 nodeName:}" failed. No retries permitted until 2026-03-10 07:09:29.41279423 +0000 UTC m=+1522.442574845 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/c684f3e4-ced9-49c3-aa54-515bc2c4fb56-operator-scripts") pod "root-account-create-update-pwt26" (UID: "c684f3e4-ced9-49c3-aa54-515bc2c4fb56") : configmap "openstack-scripts" not found Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.951761 4825 generic.go:334] "Generic (PLEG): container finished" podID="209fbdac-3b3c-451a-ae30-5888e1cbb891" containerID="bb56c1153d981c89befdcab3c321e793fa3da2a54c78cde62ebc8522df37f74c" exitCode=0 Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.951843 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"209fbdac-3b3c-451a-ae30-5888e1cbb891","Type":"ContainerDied","Data":"bb56c1153d981c89befdcab3c321e793fa3da2a54c78cde62ebc8522df37f74c"} Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.955015 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5fb2-account-create-update-z6tqt" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.955466 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-55b54bdfdb-8b9ns" Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.959364 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55b54bdfdb-8b9ns" event={"ID":"56c74365-656c-4362-8358-bbb17d0c8be0","Type":"ContainerDied","Data":"f51a988b85767d51c95d15a0175c79733a28ab4631101dd4cb9b8e62151b9f0d"} Mar 10 07:09:28 crc kubenswrapper[4825]: I0310 07:09:28.959509 4825 scope.go:117] "RemoveContainer" containerID="506bc72445e5baadada4d8eeada583023eaf1965303dabc6bfea713ddd7e5cda" Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.118649 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22k77\" (UniqueName: \"kubernetes.io/projected/a668a190-c176-4d20-b26f-4ac28fa83476-kube-api-access-22k77\") pod \"keystone-5fb2-account-create-update-z6tqt\" (UID: \"a668a190-c176-4d20-b26f-4ac28fa83476\") " pod="openstack/keystone-5fb2-account-create-update-z6tqt" Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.118850 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a668a190-c176-4d20-b26f-4ac28fa83476-operator-scripts\") pod \"keystone-5fb2-account-create-update-z6tqt\" (UID: \"a668a190-c176-4d20-b26f-4ac28fa83476\") " pod="openstack/keystone-5fb2-account-create-update-z6tqt" Mar 10 07:09:29 crc kubenswrapper[4825]: E0310 07:09:29.118985 4825 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 10 07:09:29 crc kubenswrapper[4825]: E0310 07:09:29.119040 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a668a190-c176-4d20-b26f-4ac28fa83476-operator-scripts podName:a668a190-c176-4d20-b26f-4ac28fa83476 nodeName:}" failed. No retries permitted until 2026-03-10 07:09:30.119025066 +0000 UTC m=+1523.148805681 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/a668a190-c176-4d20-b26f-4ac28fa83476-operator-scripts") pod "keystone-5fb2-account-create-update-z6tqt" (UID: "a668a190-c176-4d20-b26f-4ac28fa83476") : configmap "openstack-scripts" not found Mar 10 07:09:29 crc kubenswrapper[4825]: E0310 07:09:29.124713 4825 projected.go:194] Error preparing data for projected volume kube-api-access-22k77 for pod openstack/keystone-5fb2-account-create-update-z6tqt: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 10 07:09:29 crc kubenswrapper[4825]: E0310 07:09:29.124776 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a668a190-c176-4d20-b26f-4ac28fa83476-kube-api-access-22k77 podName:a668a190-c176-4d20-b26f-4ac28fa83476 nodeName:}" failed. No retries permitted until 2026-03-10 07:09:30.124757856 +0000 UTC m=+1523.154538471 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-22k77" (UniqueName: "kubernetes.io/projected/a668a190-c176-4d20-b26f-4ac28fa83476-kube-api-access-22k77") pod "keystone-5fb2-account-create-update-z6tqt" (UID: "a668a190-c176-4d20-b26f-4ac28fa83476") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.301530 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ff86ab8-a122-4256-8529-12265e6177e4" path="/var/lib/kubelet/pods/0ff86ab8-a122-4256-8529-12265e6177e4/volumes" Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.302411 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66c3beca-bb2c-4b68-a4b3-3cfe936c25fd" path="/var/lib/kubelet/pods/66c3beca-bb2c-4b68-a4b3-3cfe936c25fd/volumes" Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.303785 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="757635f4-9aaf-48bb-bd50-60398db738d4" 
path="/var/lib/kubelet/pods/757635f4-9aaf-48bb-bd50-60398db738d4/volumes" Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.304531 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79caf655-43c2-401d-9967-97d9a35d9741" path="/var/lib/kubelet/pods/79caf655-43c2-401d-9967-97d9a35d9741/volumes" Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.308570 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8530cfd9-9a8b-4d03-93cd-52edad33a965" path="/var/lib/kubelet/pods/8530cfd9-9a8b-4d03-93cd-52edad33a965/volumes" Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.309190 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af39779a-3a65-4379-9e92-d69ab1610fc6" path="/var/lib/kubelet/pods/af39779a-3a65-4379-9e92-d69ab1610fc6/volumes" Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.309760 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2f77e29-2f56-4415-b778-376f67322f69" path="/var/lib/kubelet/pods/e2f77e29-2f56-4415-b778-376f67322f69/volumes" Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.335815 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e808443d-0b2f-45a3-baa9-61e4d9819e6c" path="/var/lib/kubelet/pods/e808443d-0b2f-45a3-baa9-61e4d9819e6c/volumes" Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.336549 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fafc3bbc-dcca-4650-8c76-e1bbda061f75" path="/var/lib/kubelet/pods/fafc3bbc-dcca-4650-8c76-e1bbda061f75/volumes" Mar 10 07:09:29 crc kubenswrapper[4825]: E0310 07:09:29.344266 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dbdb4c597ea9262b0a3b7810260c5ec8336d5f6419c63f838f27d493f6678134 is running failed: container process not found" containerID="dbdb4c597ea9262b0a3b7810260c5ec8336d5f6419c63f838f27d493f6678134" 
cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 07:09:29 crc kubenswrapper[4825]: E0310 07:09:29.345723 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dbdb4c597ea9262b0a3b7810260c5ec8336d5f6419c63f838f27d493f6678134 is running failed: container process not found" containerID="dbdb4c597ea9262b0a3b7810260c5ec8336d5f6419c63f838f27d493f6678134" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 07:09:29 crc kubenswrapper[4825]: E0310 07:09:29.348371 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dbdb4c597ea9262b0a3b7810260c5ec8336d5f6419c63f838f27d493f6678134 is running failed: container process not found" containerID="dbdb4c597ea9262b0a3b7810260c5ec8336d5f6419c63f838f27d493f6678134" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 07:09:29 crc kubenswrapper[4825]: E0310 07:09:29.348411 4825 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dbdb4c597ea9262b0a3b7810260c5ec8336d5f6419c63f838f27d493f6678134 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-8x4pm" podUID="f04e11a7-e387-4e51-b878-8633ca528b1a" containerName="ovsdb-server" Mar 10 07:09:29 crc kubenswrapper[4825]: E0310 07:09:29.356486 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1f073cb0d55b119007da7e16a09131032c9832ee0f4faf6e839adba2f4de4803" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 07:09:29 crc kubenswrapper[4825]: E0310 07:09:29.362378 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc 
error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1f073cb0d55b119007da7e16a09131032c9832ee0f4faf6e839adba2f4de4803" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 07:09:29 crc kubenswrapper[4825]: E0310 07:09:29.366281 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1f073cb0d55b119007da7e16a09131032c9832ee0f4faf6e839adba2f4de4803" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 07:09:29 crc kubenswrapper[4825]: E0310 07:09:29.366367 4825 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-8x4pm" podUID="f04e11a7-e387-4e51-b878-8633ca528b1a" containerName="ovs-vswitchd" Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.428489 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.428578 4825 scope.go:117] "RemoveContainer" containerID="cf0bf49129662bee51dc0d9f341998b13bf6e4b5c8b81b94887a67926afba8ca" Mar 10 07:09:29 crc kubenswrapper[4825]: E0310 07:09:29.432594 4825 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 10 07:09:29 crc kubenswrapper[4825]: E0310 07:09:29.432839 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c684f3e4-ced9-49c3-aa54-515bc2c4fb56-operator-scripts podName:c684f3e4-ced9-49c3-aa54-515bc2c4fb56 nodeName:}" failed. No retries permitted until 2026-03-10 07:09:30.432822965 +0000 UTC m=+1523.462603580 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/c684f3e4-ced9-49c3-aa54-515bc2c4fb56-operator-scripts") pod "root-account-create-update-pwt26" (UID: "c684f3e4-ced9-49c3-aa54-515bc2c4fb56") : configmap "openstack-scripts" not found Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.436031 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.436161 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5fb2-account-create-update-z6tqt" Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.445690 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.457434 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.533481 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b3d94ce-f8d8-4653-a6b0-2682b23d834e-internal-tls-certs\") pod \"8b3d94ce-f8d8-4653-a6b0-2682b23d834e\" (UID: \"8b3d94ce-f8d8-4653-a6b0-2682b23d834e\") " Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.533753 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/209fbdac-3b3c-451a-ae30-5888e1cbb891-logs\") pod \"209fbdac-3b3c-451a-ae30-5888e1cbb891\" (UID: \"209fbdac-3b3c-451a-ae30-5888e1cbb891\") " Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.533853 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b3d94ce-f8d8-4653-a6b0-2682b23d834e-scripts\") pod 
\"8b3d94ce-f8d8-4653-a6b0-2682b23d834e\" (UID: \"8b3d94ce-f8d8-4653-a6b0-2682b23d834e\") " Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.533941 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"8b3d94ce-f8d8-4653-a6b0-2682b23d834e\" (UID: \"8b3d94ce-f8d8-4653-a6b0-2682b23d834e\") " Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.534036 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b3d94ce-f8d8-4653-a6b0-2682b23d834e-config-data\") pod \"8b3d94ce-f8d8-4653-a6b0-2682b23d834e\" (UID: \"8b3d94ce-f8d8-4653-a6b0-2682b23d834e\") " Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.534106 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/209fbdac-3b3c-451a-ae30-5888e1cbb891-config-data\") pod \"209fbdac-3b3c-451a-ae30-5888e1cbb891\" (UID: \"209fbdac-3b3c-451a-ae30-5888e1cbb891\") " Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.534226 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f33e10c8-8be2-46d4-8653-1960855a2a40-combined-ca-bundle\") pod \"f33e10c8-8be2-46d4-8653-1960855a2a40\" (UID: \"f33e10c8-8be2-46d4-8653-1960855a2a40\") " Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.534315 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f33e10c8-8be2-46d4-8653-1960855a2a40-public-tls-certs\") pod \"f33e10c8-8be2-46d4-8653-1960855a2a40\" (UID: \"f33e10c8-8be2-46d4-8653-1960855a2a40\") " Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.534409 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/209fbdac-3b3c-451a-ae30-5888e1cbb891-combined-ca-bundle\") pod \"209fbdac-3b3c-451a-ae30-5888e1cbb891\" (UID: \"209fbdac-3b3c-451a-ae30-5888e1cbb891\") " Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.534495 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/209fbdac-3b3c-451a-ae30-5888e1cbb891-scripts\") pod \"209fbdac-3b3c-451a-ae30-5888e1cbb891\" (UID: \"209fbdac-3b3c-451a-ae30-5888e1cbb891\") " Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.534559 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f33e10c8-8be2-46d4-8653-1960855a2a40-httpd-run\") pod \"f33e10c8-8be2-46d4-8653-1960855a2a40\" (UID: \"f33e10c8-8be2-46d4-8653-1960855a2a40\") " Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.534668 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/04a2aca4-f98f-4ae8-aca9-62ae6625e5e2-kube-state-metrics-tls-certs\") pod \"04a2aca4-f98f-4ae8-aca9-62ae6625e5e2\" (UID: \"04a2aca4-f98f-4ae8-aca9-62ae6625e5e2\") " Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.534741 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkzfv\" (UniqueName: \"kubernetes.io/projected/04a2aca4-f98f-4ae8-aca9-62ae6625e5e2-kube-api-access-bkzfv\") pod \"04a2aca4-f98f-4ae8-aca9-62ae6625e5e2\" (UID: \"04a2aca4-f98f-4ae8-aca9-62ae6625e5e2\") " Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.534821 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwppm\" (UniqueName: \"kubernetes.io/projected/f33e10c8-8be2-46d4-8653-1960855a2a40-kube-api-access-mwppm\") pod \"f33e10c8-8be2-46d4-8653-1960855a2a40\" (UID: \"f33e10c8-8be2-46d4-8653-1960855a2a40\") " Mar 10 07:09:29 crc 
kubenswrapper[4825]: I0310 07:09:29.534897 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f33e10c8-8be2-46d4-8653-1960855a2a40-config-data\") pod \"f33e10c8-8be2-46d4-8653-1960855a2a40\" (UID: \"f33e10c8-8be2-46d4-8653-1960855a2a40\") " Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.534959 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04a2aca4-f98f-4ae8-aca9-62ae6625e5e2-combined-ca-bundle\") pod \"04a2aca4-f98f-4ae8-aca9-62ae6625e5e2\" (UID: \"04a2aca4-f98f-4ae8-aca9-62ae6625e5e2\") " Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.535041 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-727td\" (UniqueName: \"kubernetes.io/projected/209fbdac-3b3c-451a-ae30-5888e1cbb891-kube-api-access-727td\") pod \"209fbdac-3b3c-451a-ae30-5888e1cbb891\" (UID: \"209fbdac-3b3c-451a-ae30-5888e1cbb891\") " Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.535154 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/209fbdac-3b3c-451a-ae30-5888e1cbb891-public-tls-certs\") pod \"209fbdac-3b3c-451a-ae30-5888e1cbb891\" (UID: \"209fbdac-3b3c-451a-ae30-5888e1cbb891\") " Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.536165 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f33e10c8-8be2-46d4-8653-1960855a2a40-logs\") pod \"f33e10c8-8be2-46d4-8653-1960855a2a40\" (UID: \"f33e10c8-8be2-46d4-8653-1960855a2a40\") " Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.536287 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f33e10c8-8be2-46d4-8653-1960855a2a40-scripts\") pod 
\"f33e10c8-8be2-46d4-8653-1960855a2a40\" (UID: \"f33e10c8-8be2-46d4-8653-1960855a2a40\") " Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.536397 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b3d94ce-f8d8-4653-a6b0-2682b23d834e-combined-ca-bundle\") pod \"8b3d94ce-f8d8-4653-a6b0-2682b23d834e\" (UID: \"8b3d94ce-f8d8-4653-a6b0-2682b23d834e\") " Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.536514 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/209fbdac-3b3c-451a-ae30-5888e1cbb891-internal-tls-certs\") pod \"209fbdac-3b3c-451a-ae30-5888e1cbb891\" (UID: \"209fbdac-3b3c-451a-ae30-5888e1cbb891\") " Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.536598 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8b3d94ce-f8d8-4653-a6b0-2682b23d834e-httpd-run\") pod \"8b3d94ce-f8d8-4653-a6b0-2682b23d834e\" (UID: \"8b3d94ce-f8d8-4653-a6b0-2682b23d834e\") " Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.538312 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxbws\" (UniqueName: \"kubernetes.io/projected/8b3d94ce-f8d8-4653-a6b0-2682b23d834e-kube-api-access-gxbws\") pod \"8b3d94ce-f8d8-4653-a6b0-2682b23d834e\" (UID: \"8b3d94ce-f8d8-4653-a6b0-2682b23d834e\") " Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.538366 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/209fbdac-3b3c-451a-ae30-5888e1cbb891-etc-machine-id\") pod \"209fbdac-3b3c-451a-ae30-5888e1cbb891\" (UID: \"209fbdac-3b3c-451a-ae30-5888e1cbb891\") " Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.538408 4825 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"f33e10c8-8be2-46d4-8653-1960855a2a40\" (UID: \"f33e10c8-8be2-46d4-8653-1960855a2a40\") " Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.538472 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/209fbdac-3b3c-451a-ae30-5888e1cbb891-config-data-custom\") pod \"209fbdac-3b3c-451a-ae30-5888e1cbb891\" (UID: \"209fbdac-3b3c-451a-ae30-5888e1cbb891\") " Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.538515 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/04a2aca4-f98f-4ae8-aca9-62ae6625e5e2-kube-state-metrics-tls-config\") pod \"04a2aca4-f98f-4ae8-aca9-62ae6625e5e2\" (UID: \"04a2aca4-f98f-4ae8-aca9-62ae6625e5e2\") " Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.538567 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b3d94ce-f8d8-4653-a6b0-2682b23d834e-logs\") pod \"8b3d94ce-f8d8-4653-a6b0-2682b23d834e\" (UID: \"8b3d94ce-f8d8-4653-a6b0-2682b23d834e\") " Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.539196 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04a2aca4-f98f-4ae8-aca9-62ae6625e5e2-kube-api-access-bkzfv" (OuterVolumeSpecName: "kube-api-access-bkzfv") pod "04a2aca4-f98f-4ae8-aca9-62ae6625e5e2" (UID: "04a2aca4-f98f-4ae8-aca9-62ae6625e5e2"). InnerVolumeSpecName "kube-api-access-bkzfv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.539430 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkzfv\" (UniqueName: \"kubernetes.io/projected/04a2aca4-f98f-4ae8-aca9-62ae6625e5e2-kube-api-access-bkzfv\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.539750 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f33e10c8-8be2-46d4-8653-1960855a2a40-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f33e10c8-8be2-46d4-8653-1960855a2a40" (UID: "f33e10c8-8be2-46d4-8653-1960855a2a40"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.540108 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b3d94ce-f8d8-4653-a6b0-2682b23d834e-logs" (OuterVolumeSpecName: "logs") pod "8b3d94ce-f8d8-4653-a6b0-2682b23d834e" (UID: "8b3d94ce-f8d8-4653-a6b0-2682b23d834e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.541345 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/209fbdac-3b3c-451a-ae30-5888e1cbb891-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "209fbdac-3b3c-451a-ae30-5888e1cbb891" (UID: "209fbdac-3b3c-451a-ae30-5888e1cbb891"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.543896 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f33e10c8-8be2-46d4-8653-1960855a2a40-kube-api-access-mwppm" (OuterVolumeSpecName: "kube-api-access-mwppm") pod "f33e10c8-8be2-46d4-8653-1960855a2a40" (UID: "f33e10c8-8be2-46d4-8653-1960855a2a40"). 
InnerVolumeSpecName "kube-api-access-mwppm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.547186 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b3d94ce-f8d8-4653-a6b0-2682b23d834e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8b3d94ce-f8d8-4653-a6b0-2682b23d834e" (UID: "8b3d94ce-f8d8-4653-a6b0-2682b23d834e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.549221 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "f33e10c8-8be2-46d4-8653-1960855a2a40" (UID: "f33e10c8-8be2-46d4-8653-1960855a2a40"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.550955 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/209fbdac-3b3c-451a-ae30-5888e1cbb891-scripts" (OuterVolumeSpecName: "scripts") pod "209fbdac-3b3c-451a-ae30-5888e1cbb891" (UID: "209fbdac-3b3c-451a-ae30-5888e1cbb891"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.551982 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f33e10c8-8be2-46d4-8653-1960855a2a40-logs" (OuterVolumeSpecName: "logs") pod "f33e10c8-8be2-46d4-8653-1960855a2a40" (UID: "f33e10c8-8be2-46d4-8653-1960855a2a40"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.574526 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/209fbdac-3b3c-451a-ae30-5888e1cbb891-logs" (OuterVolumeSpecName: "logs") pod "209fbdac-3b3c-451a-ae30-5888e1cbb891" (UID: "209fbdac-3b3c-451a-ae30-5888e1cbb891"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.578280 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f33e10c8-8be2-46d4-8653-1960855a2a40-scripts" (OuterVolumeSpecName: "scripts") pod "f33e10c8-8be2-46d4-8653-1960855a2a40" (UID: "f33e10c8-8be2-46d4-8653-1960855a2a40"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.583932 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/209fbdac-3b3c-451a-ae30-5888e1cbb891-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "209fbdac-3b3c-451a-ae30-5888e1cbb891" (UID: "209fbdac-3b3c-451a-ae30-5888e1cbb891"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.584040 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "8b3d94ce-f8d8-4653-a6b0-2682b23d834e" (UID: "8b3d94ce-f8d8-4653-a6b0-2682b23d834e"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.625793 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/209fbdac-3b3c-451a-ae30-5888e1cbb891-kube-api-access-727td" (OuterVolumeSpecName: "kube-api-access-727td") pod "209fbdac-3b3c-451a-ae30-5888e1cbb891" (UID: "209fbdac-3b3c-451a-ae30-5888e1cbb891"). InnerVolumeSpecName "kube-api-access-727td". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.635266 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b3d94ce-f8d8-4653-a6b0-2682b23d834e-scripts" (OuterVolumeSpecName: "scripts") pod "8b3d94ce-f8d8-4653-a6b0-2682b23d834e" (UID: "8b3d94ce-f8d8-4653-a6b0-2682b23d834e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.635349 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b3d94ce-f8d8-4653-a6b0-2682b23d834e-kube-api-access-gxbws" (OuterVolumeSpecName: "kube-api-access-gxbws") pod "8b3d94ce-f8d8-4653-a6b0-2682b23d834e" (UID: "8b3d94ce-f8d8-4653-a6b0-2682b23d834e"). InnerVolumeSpecName "kube-api-access-gxbws". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.640996 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/209fbdac-3b3c-451a-ae30-5888e1cbb891-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.641021 4825 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f33e10c8-8be2-46d4-8653-1960855a2a40-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.641034 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwppm\" (UniqueName: \"kubernetes.io/projected/f33e10c8-8be2-46d4-8653-1960855a2a40-kube-api-access-mwppm\") on node \"crc\" DevicePath \"\""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.641043 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-727td\" (UniqueName: \"kubernetes.io/projected/209fbdac-3b3c-451a-ae30-5888e1cbb891-kube-api-access-727td\") on node \"crc\" DevicePath \"\""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.641052 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f33e10c8-8be2-46d4-8653-1960855a2a40-logs\") on node \"crc\" DevicePath \"\""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.641060 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f33e10c8-8be2-46d4-8653-1960855a2a40-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.641069 4825 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8b3d94ce-f8d8-4653-a6b0-2682b23d834e-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.641079 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxbws\" (UniqueName: \"kubernetes.io/projected/8b3d94ce-f8d8-4653-a6b0-2682b23d834e-kube-api-access-gxbws\") on node \"crc\" DevicePath \"\""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.641109 4825 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/209fbdac-3b3c-451a-ae30-5888e1cbb891-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.641152 4825 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" "
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.641164 4825 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/209fbdac-3b3c-451a-ae30-5888e1cbb891-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.641173 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b3d94ce-f8d8-4653-a6b0-2682b23d834e-logs\") on node \"crc\" DevicePath \"\""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.641181 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/209fbdac-3b3c-451a-ae30-5888e1cbb891-logs\") on node \"crc\" DevicePath \"\""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.641189 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b3d94ce-f8d8-4653-a6b0-2682b23d834e-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.641203 4825 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" "
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.697541 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/209fbdac-3b3c-451a-ae30-5888e1cbb891-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "209fbdac-3b3c-451a-ae30-5888e1cbb891" (UID: "209fbdac-3b3c-451a-ae30-5888e1cbb891"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.731080 4825 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc"
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.742474 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/209fbdac-3b3c-451a-ae30-5888e1cbb891-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.742498 4825 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.750905 4825 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc"
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.754659 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.804357 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-9c98f44d4-hm6qz"
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.808766 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b3d94ce-f8d8-4653-a6b0-2682b23d834e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b3d94ce-f8d8-4653-a6b0-2682b23d834e" (UID: "8b3d94ce-f8d8-4653-a6b0-2682b23d834e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.813742 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.832784 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.835397 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.838044 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.844051 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42d8b8af-9916-4aba-b4e7-f825ed30f182-config-data\") pod \"42d8b8af-9916-4aba-b4e7-f825ed30f182\" (UID: \"42d8b8af-9916-4aba-b4e7-f825ed30f182\") "
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.844103 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1222a91-b344-4f54-bf9f-75d03d5f8549-nova-metadata-tls-certs\") pod \"b1222a91-b344-4f54-bf9f-75d03d5f8549\" (UID: \"b1222a91-b344-4f54-bf9f-75d03d5f8549\") "
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.844195 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42d8b8af-9916-4aba-b4e7-f825ed30f182-config-data-custom\") pod \"42d8b8af-9916-4aba-b4e7-f825ed30f182\" (UID: \"42d8b8af-9916-4aba-b4e7-f825ed30f182\") "
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.844225 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e516b5cf-d44c-4f03-9247-7727319f0a85-combined-ca-bundle\") pod \"e516b5cf-d44c-4f03-9247-7727319f0a85\" (UID: \"e516b5cf-d44c-4f03-9247-7727319f0a85\") "
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.844252 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m55qr\" (UniqueName: \"kubernetes.io/projected/42d8b8af-9916-4aba-b4e7-f825ed30f182-kube-api-access-m55qr\") pod \"42d8b8af-9916-4aba-b4e7-f825ed30f182\" (UID: \"42d8b8af-9916-4aba-b4e7-f825ed30f182\") "
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.844276 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmn4h\" (UniqueName: \"kubernetes.io/projected/b1222a91-b344-4f54-bf9f-75d03d5f8549-kube-api-access-tmn4h\") pod \"b1222a91-b344-4f54-bf9f-75d03d5f8549\" (UID: \"b1222a91-b344-4f54-bf9f-75d03d5f8549\") "
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.844317 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42d8b8af-9916-4aba-b4e7-f825ed30f182-public-tls-certs\") pod \"42d8b8af-9916-4aba-b4e7-f825ed30f182\" (UID: \"42d8b8af-9916-4aba-b4e7-f825ed30f182\") "
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.844369 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42rdp\" (UniqueName: \"kubernetes.io/projected/e516b5cf-d44c-4f03-9247-7727319f0a85-kube-api-access-42rdp\") pod \"e516b5cf-d44c-4f03-9247-7727319f0a85\" (UID: \"e516b5cf-d44c-4f03-9247-7727319f0a85\") "
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.844485 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42d8b8af-9916-4aba-b4e7-f825ed30f182-internal-tls-certs\") pod \"42d8b8af-9916-4aba-b4e7-f825ed30f182\" (UID: \"42d8b8af-9916-4aba-b4e7-f825ed30f182\") "
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.844501 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e516b5cf-d44c-4f03-9247-7727319f0a85-config-data\") pod \"e516b5cf-d44c-4f03-9247-7727319f0a85\" (UID: \"e516b5cf-d44c-4f03-9247-7727319f0a85\") "
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.844520 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42d8b8af-9916-4aba-b4e7-f825ed30f182-combined-ca-bundle\") pod \"42d8b8af-9916-4aba-b4e7-f825ed30f182\" (UID: \"42d8b8af-9916-4aba-b4e7-f825ed30f182\") "
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.844611 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e516b5cf-d44c-4f03-9247-7727319f0a85-ceilometer-tls-certs\") pod \"e516b5cf-d44c-4f03-9247-7727319f0a85\" (UID: \"e516b5cf-d44c-4f03-9247-7727319f0a85\") "
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.844632 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e516b5cf-d44c-4f03-9247-7727319f0a85-log-httpd\") pod \"e516b5cf-d44c-4f03-9247-7727319f0a85\" (UID: \"e516b5cf-d44c-4f03-9247-7727319f0a85\") "
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.844663 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e516b5cf-d44c-4f03-9247-7727319f0a85-run-httpd\") pod \"e516b5cf-d44c-4f03-9247-7727319f0a85\" (UID: \"e516b5cf-d44c-4f03-9247-7727319f0a85\") "
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.844681 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e516b5cf-d44c-4f03-9247-7727319f0a85-scripts\") pod \"e516b5cf-d44c-4f03-9247-7727319f0a85\" (UID: \"e516b5cf-d44c-4f03-9247-7727319f0a85\") "
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.844705 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1222a91-b344-4f54-bf9f-75d03d5f8549-logs\") pod \"b1222a91-b344-4f54-bf9f-75d03d5f8549\" (UID: \"b1222a91-b344-4f54-bf9f-75d03d5f8549\") "
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.844725 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42d8b8af-9916-4aba-b4e7-f825ed30f182-logs\") pod \"42d8b8af-9916-4aba-b4e7-f825ed30f182\" (UID: \"42d8b8af-9916-4aba-b4e7-f825ed30f182\") "
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.844742 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1222a91-b344-4f54-bf9f-75d03d5f8549-combined-ca-bundle\") pod \"b1222a91-b344-4f54-bf9f-75d03d5f8549\" (UID: \"b1222a91-b344-4f54-bf9f-75d03d5f8549\") "
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.844780 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e516b5cf-d44c-4f03-9247-7727319f0a85-sg-core-conf-yaml\") pod \"e516b5cf-d44c-4f03-9247-7727319f0a85\" (UID: \"e516b5cf-d44c-4f03-9247-7727319f0a85\") "
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.844813 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1222a91-b344-4f54-bf9f-75d03d5f8549-config-data\") pod \"b1222a91-b344-4f54-bf9f-75d03d5f8549\" (UID: \"b1222a91-b344-4f54-bf9f-75d03d5f8549\") "
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.846667 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b3d94ce-f8d8-4653-a6b0-2682b23d834e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.846684 4825 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.853188 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42d8b8af-9916-4aba-b4e7-f825ed30f182-logs" (OuterVolumeSpecName: "logs") pod "42d8b8af-9916-4aba-b4e7-f825ed30f182" (UID: "42d8b8af-9916-4aba-b4e7-f825ed30f182"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.853716 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1222a91-b344-4f54-bf9f-75d03d5f8549-logs" (OuterVolumeSpecName: "logs") pod "b1222a91-b344-4f54-bf9f-75d03d5f8549" (UID: "b1222a91-b344-4f54-bf9f-75d03d5f8549"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.854624 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e516b5cf-d44c-4f03-9247-7727319f0a85-scripts" (OuterVolumeSpecName: "scripts") pod "e516b5cf-d44c-4f03-9247-7727319f0a85" (UID: "e516b5cf-d44c-4f03-9247-7727319f0a85"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.864771 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e516b5cf-d44c-4f03-9247-7727319f0a85-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e516b5cf-d44c-4f03-9247-7727319f0a85" (UID: "e516b5cf-d44c-4f03-9247-7727319f0a85"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.868376 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04a2aca4-f98f-4ae8-aca9-62ae6625e5e2-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "04a2aca4-f98f-4ae8-aca9-62ae6625e5e2" (UID: "04a2aca4-f98f-4ae8-aca9-62ae6625e5e2"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.868627 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e516b5cf-d44c-4f03-9247-7727319f0a85-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e516b5cf-d44c-4f03-9247-7727319f0a85" (UID: "e516b5cf-d44c-4f03-9247-7727319f0a85"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.869752 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42d8b8af-9916-4aba-b4e7-f825ed30f182-kube-api-access-m55qr" (OuterVolumeSpecName: "kube-api-access-m55qr") pod "42d8b8af-9916-4aba-b4e7-f825ed30f182" (UID: "42d8b8af-9916-4aba-b4e7-f825ed30f182"). InnerVolumeSpecName "kube-api-access-m55qr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.873187 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42d8b8af-9916-4aba-b4e7-f825ed30f182-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "42d8b8af-9916-4aba-b4e7-f825ed30f182" (UID: "42d8b8af-9916-4aba-b4e7-f825ed30f182"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.873297 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e516b5cf-d44c-4f03-9247-7727319f0a85-kube-api-access-42rdp" (OuterVolumeSpecName: "kube-api-access-42rdp") pod "e516b5cf-d44c-4f03-9247-7727319f0a85" (UID: "e516b5cf-d44c-4f03-9247-7727319f0a85"). InnerVolumeSpecName "kube-api-access-42rdp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.888349 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f33e10c8-8be2-46d4-8653-1960855a2a40-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f33e10c8-8be2-46d4-8653-1960855a2a40" (UID: "f33e10c8-8be2-46d4-8653-1960855a2a40"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.888588 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1222a91-b344-4f54-bf9f-75d03d5f8549-kube-api-access-tmn4h" (OuterVolumeSpecName: "kube-api-access-tmn4h") pod "b1222a91-b344-4f54-bf9f-75d03d5f8549" (UID: "b1222a91-b344-4f54-bf9f-75d03d5f8549"). InnerVolumeSpecName "kube-api-access-tmn4h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.942226 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04a2aca4-f98f-4ae8-aca9-62ae6625e5e2-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "04a2aca4-f98f-4ae8-aca9-62ae6625e5e2" (UID: "04a2aca4-f98f-4ae8-aca9-62ae6625e5e2"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.944257 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/209fbdac-3b3c-451a-ae30-5888e1cbb891-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "209fbdac-3b3c-451a-ae30-5888e1cbb891" (UID: "209fbdac-3b3c-451a-ae30-5888e1cbb891"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.944467 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/209fbdac-3b3c-451a-ae30-5888e1cbb891-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "209fbdac-3b3c-451a-ae30-5888e1cbb891" (UID: "209fbdac-3b3c-451a-ae30-5888e1cbb891"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.947651 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcf819fc-0560-45a8-be5e-04e6ef2bbf32-config-data\") pod \"fcf819fc-0560-45a8-be5e-04e6ef2bbf32\" (UID: \"fcf819fc-0560-45a8-be5e-04e6ef2bbf32\") "
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.947811 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca4ebafb-aa22-44ad-8037-487a9c3baca4-config-data\") pod \"ca4ebafb-aa22-44ad-8037-487a9c3baca4\" (UID: \"ca4ebafb-aa22-44ad-8037-487a9c3baca4\") "
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.947850 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fcf819fc-0560-45a8-be5e-04e6ef2bbf32-logs\") pod \"fcf819fc-0560-45a8-be5e-04e6ef2bbf32\" (UID: \"fcf819fc-0560-45a8-be5e-04e6ef2bbf32\") "
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.947947 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca4ebafb-aa22-44ad-8037-487a9c3baca4-combined-ca-bundle\") pod \"ca4ebafb-aa22-44ad-8037-487a9c3baca4\" (UID: \"ca4ebafb-aa22-44ad-8037-487a9c3baca4\") "
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.947975 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9965b351-ed73-4c38-b393-ff72ba48cd66-config-data\") pod \"9965b351-ed73-4c38-b393-ff72ba48cd66\" (UID: \"9965b351-ed73-4c38-b393-ff72ba48cd66\") "
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.947997 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca4ebafb-aa22-44ad-8037-487a9c3baca4-scripts\") pod \"ca4ebafb-aa22-44ad-8037-487a9c3baca4\" (UID: \"ca4ebafb-aa22-44ad-8037-487a9c3baca4\") "
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.948086 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcf819fc-0560-45a8-be5e-04e6ef2bbf32-combined-ca-bundle\") pod \"fcf819fc-0560-45a8-be5e-04e6ef2bbf32\" (UID: \"fcf819fc-0560-45a8-be5e-04e6ef2bbf32\") "
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.948160 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcf819fc-0560-45a8-be5e-04e6ef2bbf32-internal-tls-certs\") pod \"fcf819fc-0560-45a8-be5e-04e6ef2bbf32\" (UID: \"fcf819fc-0560-45a8-be5e-04e6ef2bbf32\") "
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.948179 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca4ebafb-aa22-44ad-8037-487a9c3baca4-config-data-custom\") pod \"ca4ebafb-aa22-44ad-8037-487a9c3baca4\" (UID: \"ca4ebafb-aa22-44ad-8037-487a9c3baca4\") "
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.948198 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9965b351-ed73-4c38-b393-ff72ba48cd66-combined-ca-bundle\") pod \"9965b351-ed73-4c38-b393-ff72ba48cd66\" (UID: \"9965b351-ed73-4c38-b393-ff72ba48cd66\") "
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.948234 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86xqr\" (UniqueName: \"kubernetes.io/projected/fcf819fc-0560-45a8-be5e-04e6ef2bbf32-kube-api-access-86xqr\") pod \"fcf819fc-0560-45a8-be5e-04e6ef2bbf32\" (UID: \"fcf819fc-0560-45a8-be5e-04e6ef2bbf32\") "
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.948252 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cdsb\" (UniqueName: \"kubernetes.io/projected/9965b351-ed73-4c38-b393-ff72ba48cd66-kube-api-access-2cdsb\") pod \"9965b351-ed73-4c38-b393-ff72ba48cd66\" (UID: \"9965b351-ed73-4c38-b393-ff72ba48cd66\") "
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.948289 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cj4g7\" (UniqueName: \"kubernetes.io/projected/ca4ebafb-aa22-44ad-8037-487a9c3baca4-kube-api-access-cj4g7\") pod \"ca4ebafb-aa22-44ad-8037-487a9c3baca4\" (UID: \"ca4ebafb-aa22-44ad-8037-487a9c3baca4\") "
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.948303 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcf819fc-0560-45a8-be5e-04e6ef2bbf32-public-tls-certs\") pod \"fcf819fc-0560-45a8-be5e-04e6ef2bbf32\" (UID: \"fcf819fc-0560-45a8-be5e-04e6ef2bbf32\") "
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.948345 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ca4ebafb-aa22-44ad-8037-487a9c3baca4-etc-machine-id\") pod \"ca4ebafb-aa22-44ad-8037-487a9c3baca4\" (UID: \"ca4ebafb-aa22-44ad-8037-487a9c3baca4\") "
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.948671 4825 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42d8b8af-9916-4aba-b4e7-f825ed30f182-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.948682 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m55qr\" (UniqueName: \"kubernetes.io/projected/42d8b8af-9916-4aba-b4e7-f825ed30f182-kube-api-access-m55qr\") on node \"crc\" DevicePath \"\""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.948692 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmn4h\" (UniqueName: \"kubernetes.io/projected/b1222a91-b344-4f54-bf9f-75d03d5f8549-kube-api-access-tmn4h\") on node \"crc\" DevicePath \"\""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.948701 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42rdp\" (UniqueName: \"kubernetes.io/projected/e516b5cf-d44c-4f03-9247-7727319f0a85-kube-api-access-42rdp\") on node \"crc\" DevicePath \"\""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.948710 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f33e10c8-8be2-46d4-8653-1960855a2a40-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.948719 4825 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/04a2aca4-f98f-4ae8-aca9-62ae6625e5e2-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.948727 4825 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/209fbdac-3b3c-451a-ae30-5888e1cbb891-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.948735 4825 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e516b5cf-d44c-4f03-9247-7727319f0a85-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.948743 4825 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/209fbdac-3b3c-451a-ae30-5888e1cbb891-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.948751 4825 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e516b5cf-d44c-4f03-9247-7727319f0a85-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.948758 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e516b5cf-d44c-4f03-9247-7727319f0a85-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.948767 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1222a91-b344-4f54-bf9f-75d03d5f8549-logs\") on node \"crc\" DevicePath \"\""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.948775 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42d8b8af-9916-4aba-b4e7-f825ed30f182-logs\") on node \"crc\" DevicePath \"\""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.948783 4825 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/04a2aca4-f98f-4ae8-aca9-62ae6625e5e2-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.948816 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ca4ebafb-aa22-44ad-8037-487a9c3baca4-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ca4ebafb-aa22-44ad-8037-487a9c3baca4" (UID: "ca4ebafb-aa22-44ad-8037-487a9c3baca4"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.952063 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b3d94ce-f8d8-4653-a6b0-2682b23d834e-config-data" (OuterVolumeSpecName: "config-data") pod "8b3d94ce-f8d8-4653-a6b0-2682b23d834e" (UID: "8b3d94ce-f8d8-4653-a6b0-2682b23d834e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.962472 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcf819fc-0560-45a8-be5e-04e6ef2bbf32-logs" (OuterVolumeSpecName: "logs") pod "fcf819fc-0560-45a8-be5e-04e6ef2bbf32" (UID: "fcf819fc-0560-45a8-be5e-04e6ef2bbf32"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.963236 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b3d94ce-f8d8-4653-a6b0-2682b23d834e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8b3d94ce-f8d8-4653-a6b0-2682b23d834e" (UID: "8b3d94ce-f8d8-4653-a6b0-2682b23d834e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.965444 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca4ebafb-aa22-44ad-8037-487a9c3baca4-scripts" (OuterVolumeSpecName: "scripts") pod "ca4ebafb-aa22-44ad-8037-487a9c3baca4" (UID: "ca4ebafb-aa22-44ad-8037-487a9c3baca4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.967282 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca4ebafb-aa22-44ad-8037-487a9c3baca4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ca4ebafb-aa22-44ad-8037-487a9c3baca4" (UID: "ca4ebafb-aa22-44ad-8037-487a9c3baca4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.970558 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9965b351-ed73-4c38-b393-ff72ba48cd66-kube-api-access-2cdsb" (OuterVolumeSpecName: "kube-api-access-2cdsb") pod "9965b351-ed73-4c38-b393-ff72ba48cd66" (UID: "9965b351-ed73-4c38-b393-ff72ba48cd66"). InnerVolumeSpecName "kube-api-access-2cdsb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.974122 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fcf819fc-0560-45a8-be5e-04e6ef2bbf32","Type":"ContainerDied","Data":"dd360048a2592b5601b12ac87d8106f90fc64522cda299c67f0fc7a86da614aa"}
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.974204 4825 scope.go:117] "RemoveContainer" containerID="b0692876912e4dddda340c37876d5dcd6e82ee0c786afdeaa20b480be4903f55"
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.974300 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.977723 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"04a2aca4-f98f-4ae8-aca9-62ae6625e5e2","Type":"ContainerDied","Data":"308db8163601cfc01ebf7e62c87675153377d65c18058084fe5a827d7e708b89"}
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.977779 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.980526 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.980528 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8b3d94ce-f8d8-4653-a6b0-2682b23d834e","Type":"ContainerDied","Data":"0eeecbff2b247ce21995a47c48d1ef0c0ebe8bfcbb15ff5e28bd7ff8f48a6d0c"}
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.985063 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"209fbdac-3b3c-451a-ae30-5888e1cbb891","Type":"ContainerDied","Data":"198f9a8326763796da8dab9aa851fe9756310d926e22948824d3289fcfb0b98f"}
Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.985091 4825 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/cinder-api-0" Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.987097 4825 generic.go:334] "Generic (PLEG): container finished" podID="b1222a91-b344-4f54-bf9f-75d03d5f8549" containerID="fe83016a56de378ec7204c67a3bc85b458b1da0d43277d26413e0aa8e2ea9dc4" exitCode=0 Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.987190 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b1222a91-b344-4f54-bf9f-75d03d5f8549","Type":"ContainerDied","Data":"fe83016a56de378ec7204c67a3bc85b458b1da0d43277d26413e0aa8e2ea9dc4"} Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.987245 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b1222a91-b344-4f54-bf9f-75d03d5f8549","Type":"ContainerDied","Data":"7bf1f87eaf0884a2262921933bd4c53aab23815c84d61c53a1b7c86ab76b98fb"} Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.987285 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.987855 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcf819fc-0560-45a8-be5e-04e6ef2bbf32-kube-api-access-86xqr" (OuterVolumeSpecName: "kube-api-access-86xqr") pod "fcf819fc-0560-45a8-be5e-04e6ef2bbf32" (UID: "fcf819fc-0560-45a8-be5e-04e6ef2bbf32"). InnerVolumeSpecName "kube-api-access-86xqr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:09:29 crc kubenswrapper[4825]: I0310 07:09:29.999023 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f33e10c8-8be2-46d4-8653-1960855a2a40-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f33e10c8-8be2-46d4-8653-1960855a2a40" (UID: "f33e10c8-8be2-46d4-8653-1960855a2a40"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.000634 4825 generic.go:334] "Generic (PLEG): container finished" podID="42d8b8af-9916-4aba-b4e7-f825ed30f182" containerID="45339b16b80b77a1ec497dc0cfdf7cad94e760f0b42f91faed72b2b7acaeea0f" exitCode=0 Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.000685 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9c98f44d4-hm6qz" event={"ID":"42d8b8af-9916-4aba-b4e7-f825ed30f182","Type":"ContainerDied","Data":"45339b16b80b77a1ec497dc0cfdf7cad94e760f0b42f91faed72b2b7acaeea0f"} Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.000705 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9c98f44d4-hm6qz" event={"ID":"42d8b8af-9916-4aba-b4e7-f825ed30f182","Type":"ContainerDied","Data":"113dd6b9c04855397b2f1ec5c7cdc93dc6e65cfadd3838790802d833cf902ff4"} Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.000755 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-9c98f44d4-hm6qz" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.003546 4825 generic.go:334] "Generic (PLEG): container finished" podID="5d4252ca-f279-4c95-8f10-205339d028a5" containerID="4c5195f2b444f93836dcb7df7bcc67ecb17ab72fc077c910233e60c95d78edb3" exitCode=0 Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.003587 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"5d4252ca-f279-4c95-8f10-205339d028a5","Type":"ContainerDied","Data":"4c5195f2b444f93836dcb7df7bcc67ecb17ab72fc077c910233e60c95d78edb3"} Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.006044 4825 generic.go:334] "Generic (PLEG): container finished" podID="e516b5cf-d44c-4f03-9247-7727319f0a85" containerID="d021be5f73ead6231f65789871b1004af37bd859e93d3efa1540dedd07f320d9" exitCode=0 Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.006057 4825 generic.go:334] "Generic (PLEG): container finished" podID="e516b5cf-d44c-4f03-9247-7727319f0a85" containerID="b5fc534b6fc5bdb62a0dd359fd61377bd0c4ca9f7943fd237ff73b6a79274f5c" exitCode=0 Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.006087 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e516b5cf-d44c-4f03-9247-7727319f0a85","Type":"ContainerDied","Data":"d021be5f73ead6231f65789871b1004af37bd859e93d3efa1540dedd07f320d9"} Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.006102 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e516b5cf-d44c-4f03-9247-7727319f0a85","Type":"ContainerDied","Data":"b5fc534b6fc5bdb62a0dd359fd61377bd0c4ca9f7943fd237ff73b6a79274f5c"} Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.006113 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"e516b5cf-d44c-4f03-9247-7727319f0a85","Type":"ContainerDied","Data":"46acdbecf5b0d066e1d674a88c83a7a55aa838f9900872fc9a70c8c736c01984"} Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.006181 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 07:09:30 crc kubenswrapper[4825]: E0310 07:09:30.006238 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a5a7ebb2da38af819e648d6e5ad4c63fc12ab758ecc1f1178b06bc2d53a663b1 is running failed: container process not found" containerID="a5a7ebb2da38af819e648d6e5ad4c63fc12ab758ecc1f1178b06bc2d53a663b1" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 10 07:09:30 crc kubenswrapper[4825]: E0310 07:09:30.006498 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a5a7ebb2da38af819e648d6e5ad4c63fc12ab758ecc1f1178b06bc2d53a663b1 is running failed: container process not found" containerID="a5a7ebb2da38af819e648d6e5ad4c63fc12ab758ecc1f1178b06bc2d53a663b1" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 10 07:09:30 crc kubenswrapper[4825]: E0310 07:09:30.006822 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a5a7ebb2da38af819e648d6e5ad4c63fc12ab758ecc1f1178b06bc2d53a663b1 is running failed: container process not found" containerID="a5a7ebb2da38af819e648d6e5ad4c63fc12ab758ecc1f1178b06bc2d53a663b1" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 10 07:09:30 crc kubenswrapper[4825]: E0310 07:09:30.006841 4825 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a5a7ebb2da38af819e648d6e5ad4c63fc12ab758ecc1f1178b06bc2d53a663b1 is running failed: container 
process not found" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="47d73d8a-acf6-42e6-a30d-e093144ee0b9" containerName="nova-cell0-conductor-conductor" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.007794 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca4ebafb-aa22-44ad-8037-487a9c3baca4-kube-api-access-cj4g7" (OuterVolumeSpecName: "kube-api-access-cj4g7") pod "ca4ebafb-aa22-44ad-8037-487a9c3baca4" (UID: "ca4ebafb-aa22-44ad-8037-487a9c3baca4"). InnerVolumeSpecName "kube-api-access-cj4g7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.033544 4825 generic.go:334] "Generic (PLEG): container finished" podID="85f5995b-4ad8-4840-82f1-6659152c3ed4" containerID="23d5b7cd25a646bc4179b5e41cb642aa99ae3019bdbfa362dc0578d525c4bfdf" exitCode=0 Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.033664 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"85f5995b-4ad8-4840-82f1-6659152c3ed4","Type":"ContainerDied","Data":"23d5b7cd25a646bc4179b5e41cb642aa99ae3019bdbfa362dc0578d525c4bfdf"} Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.037742 4825 generic.go:334] "Generic (PLEG): container finished" podID="9965b351-ed73-4c38-b393-ff72ba48cd66" containerID="3d70d139a1deb1c6a27cdf349fc1cc7878472659a23957efa5657e5551a800ef" exitCode=0 Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.038411 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9965b351-ed73-4c38-b393-ff72ba48cd66","Type":"ContainerDied","Data":"3d70d139a1deb1c6a27cdf349fc1cc7878472659a23957efa5657e5551a800ef"} Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.040480 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"9965b351-ed73-4c38-b393-ff72ba48cd66","Type":"ContainerDied","Data":"c8700bdf895a83e0ca2d032e0caac8e842ae7826849b3aa97bbcfaeaf203601b"} Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.038461 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.044669 4825 generic.go:334] "Generic (PLEG): container finished" podID="47d73d8a-acf6-42e6-a30d-e093144ee0b9" containerID="a5a7ebb2da38af819e648d6e5ad4c63fc12ab758ecc1f1178b06bc2d53a663b1" exitCode=0 Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.044925 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"47d73d8a-acf6-42e6-a30d-e093144ee0b9","Type":"ContainerDied","Data":"a5a7ebb2da38af819e648d6e5ad4c63fc12ab758ecc1f1178b06bc2d53a663b1"} Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.052477 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca4ebafb-aa22-44ad-8037-487a9c3baca4-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.052515 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b3d94ce-f8d8-4653-a6b0-2682b23d834e-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.052527 4825 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f33e10c8-8be2-46d4-8653-1960855a2a40-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.052540 4825 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca4ebafb-aa22-44ad-8037-487a9c3baca4-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 
07:09:30.052552 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86xqr\" (UniqueName: \"kubernetes.io/projected/fcf819fc-0560-45a8-be5e-04e6ef2bbf32-kube-api-access-86xqr\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.052563 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cdsb\" (UniqueName: \"kubernetes.io/projected/9965b351-ed73-4c38-b393-ff72ba48cd66-kube-api-access-2cdsb\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.052575 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cj4g7\" (UniqueName: \"kubernetes.io/projected/ca4ebafb-aa22-44ad-8037-487a9c3baca4-kube-api-access-cj4g7\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.052588 4825 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ca4ebafb-aa22-44ad-8037-487a9c3baca4-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.052599 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fcf819fc-0560-45a8-be5e-04e6ef2bbf32-logs\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.052611 4825 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b3d94ce-f8d8-4653-a6b0-2682b23d834e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.076042 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f33e10c8-8be2-46d4-8653-1960855a2a40","Type":"ContainerDied","Data":"540452d0c5f041913e598421282abfb7820efbe982dcad500a2958d00def99fe"} Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.076703 4825 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.085471 4825 generic.go:334] "Generic (PLEG): container finished" podID="ca4ebafb-aa22-44ad-8037-487a9c3baca4" containerID="2afb8783418624fe53a581d5b384b0eac94f03f7b397fe6d2851a79acaf2faeb" exitCode=0 Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.085910 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5fb2-account-create-update-z6tqt" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.086256 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.086411 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ca4ebafb-aa22-44ad-8037-487a9c3baca4","Type":"ContainerDied","Data":"2afb8783418624fe53a581d5b384b0eac94f03f7b397fe6d2851a79acaf2faeb"} Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.086769 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ca4ebafb-aa22-44ad-8037-487a9c3baca4","Type":"ContainerDied","Data":"a253bdb397bc3dc3551756150243b4f3fe2f467d36dcaa92266ddfafc085214c"} Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.154468 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22k77\" (UniqueName: \"kubernetes.io/projected/a668a190-c176-4d20-b26f-4ac28fa83476-kube-api-access-22k77\") pod \"keystone-5fb2-account-create-update-z6tqt\" (UID: \"a668a190-c176-4d20-b26f-4ac28fa83476\") " pod="openstack/keystone-5fb2-account-create-update-z6tqt" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.154560 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/a668a190-c176-4d20-b26f-4ac28fa83476-operator-scripts\") pod \"keystone-5fb2-account-create-update-z6tqt\" (UID: \"a668a190-c176-4d20-b26f-4ac28fa83476\") " pod="openstack/keystone-5fb2-account-create-update-z6tqt" Mar 10 07:09:30 crc kubenswrapper[4825]: E0310 07:09:30.154722 4825 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 10 07:09:30 crc kubenswrapper[4825]: E0310 07:09:30.154769 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a668a190-c176-4d20-b26f-4ac28fa83476-operator-scripts podName:a668a190-c176-4d20-b26f-4ac28fa83476 nodeName:}" failed. No retries permitted until 2026-03-10 07:09:32.154756715 +0000 UTC m=+1525.184537330 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/a668a190-c176-4d20-b26f-4ac28fa83476-operator-scripts") pod "keystone-5fb2-account-create-update-z6tqt" (UID: "a668a190-c176-4d20-b26f-4ac28fa83476") : configmap "openstack-scripts" not found Mar 10 07:09:30 crc kubenswrapper[4825]: E0310 07:09:30.160447 4825 projected.go:194] Error preparing data for projected volume kube-api-access-22k77 for pod openstack/keystone-5fb2-account-create-update-z6tqt: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 10 07:09:30 crc kubenswrapper[4825]: E0310 07:09:30.160533 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a668a190-c176-4d20-b26f-4ac28fa83476-kube-api-access-22k77 podName:a668a190-c176-4d20-b26f-4ac28fa83476 nodeName:}" failed. No retries permitted until 2026-03-10 07:09:32.160509335 +0000 UTC m=+1525.190290020 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-22k77" (UniqueName: "kubernetes.io/projected/a668a190-c176-4d20-b26f-4ac28fa83476-kube-api-access-22k77") pod "keystone-5fb2-account-create-update-z6tqt" (UID: "a668a190-c176-4d20-b26f-4ac28fa83476") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.162445 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/209fbdac-3b3c-451a-ae30-5888e1cbb891-config-data" (OuterVolumeSpecName: "config-data") pod "209fbdac-3b3c-451a-ae30-5888e1cbb891" (UID: "209fbdac-3b3c-451a-ae30-5888e1cbb891"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:30 crc kubenswrapper[4825]: E0310 07:09:30.230603 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6a4f962505809098311731c7df1e9e9acaa99f0d4d2a16ebefd3083955087398" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Mar 10 07:09:30 crc kubenswrapper[4825]: E0310 07:09:30.231761 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6a4f962505809098311731c7df1e9e9acaa99f0d4d2a16ebefd3083955087398" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Mar 10 07:09:30 crc kubenswrapper[4825]: E0310 07:09:30.233584 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6a4f962505809098311731c7df1e9e9acaa99f0d4d2a16ebefd3083955087398" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Mar 10 07:09:30 crc kubenswrapper[4825]: E0310 
07:09:30.233696 4825 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="d77fced6-dee7-49fc-9088-e173a5be3cee" containerName="ovn-northd" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.237455 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9965b351-ed73-4c38-b393-ff72ba48cd66-config-data" (OuterVolumeSpecName: "config-data") pod "9965b351-ed73-4c38-b393-ff72ba48cd66" (UID: "9965b351-ed73-4c38-b393-ff72ba48cd66"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.239822 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42d8b8af-9916-4aba-b4e7-f825ed30f182-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "42d8b8af-9916-4aba-b4e7-f825ed30f182" (UID: "42d8b8af-9916-4aba-b4e7-f825ed30f182"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.258923 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9965b351-ed73-4c38-b393-ff72ba48cd66-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.258952 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/209fbdac-3b3c-451a-ae30-5888e1cbb891-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.258963 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42d8b8af-9916-4aba-b4e7-f825ed30f182-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.271005 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e516b5cf-d44c-4f03-9247-7727319f0a85-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e516b5cf-d44c-4f03-9247-7727319f0a85" (UID: "e516b5cf-d44c-4f03-9247-7727319f0a85"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.289493 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04a2aca4-f98f-4ae8-aca9-62ae6625e5e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04a2aca4-f98f-4ae8-aca9-62ae6625e5e2" (UID: "04a2aca4-f98f-4ae8-aca9-62ae6625e5e2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.307224 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1222a91-b344-4f54-bf9f-75d03d5f8549-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b1222a91-b344-4f54-bf9f-75d03d5f8549" (UID: "b1222a91-b344-4f54-bf9f-75d03d5f8549"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.342842 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f33e10c8-8be2-46d4-8653-1960855a2a40-config-data" (OuterVolumeSpecName: "config-data") pod "f33e10c8-8be2-46d4-8653-1960855a2a40" (UID: "f33e10c8-8be2-46d4-8653-1960855a2a40"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.346786 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcf819fc-0560-45a8-be5e-04e6ef2bbf32-config-data" (OuterVolumeSpecName: "config-data") pod "fcf819fc-0560-45a8-be5e-04e6ef2bbf32" (UID: "fcf819fc-0560-45a8-be5e-04e6ef2bbf32"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.347997 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e516b5cf-d44c-4f03-9247-7727319f0a85-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "e516b5cf-d44c-4f03-9247-7727319f0a85" (UID: "e516b5cf-d44c-4f03-9247-7727319f0a85"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.350237 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcf819fc-0560-45a8-be5e-04e6ef2bbf32-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fcf819fc-0560-45a8-be5e-04e6ef2bbf32" (UID: "fcf819fc-0560-45a8-be5e-04e6ef2bbf32"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.351525 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42d8b8af-9916-4aba-b4e7-f825ed30f182-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "42d8b8af-9916-4aba-b4e7-f825ed30f182" (UID: "42d8b8af-9916-4aba-b4e7-f825ed30f182"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.354025 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca4ebafb-aa22-44ad-8037-487a9c3baca4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca4ebafb-aa22-44ad-8037-487a9c3baca4" (UID: "ca4ebafb-aa22-44ad-8037-487a9c3baca4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.360285 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1222a91-b344-4f54-bf9f-75d03d5f8549-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.360315 4825 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e516b5cf-d44c-4f03-9247-7727319f0a85-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.360328 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcf819fc-0560-45a8-be5e-04e6ef2bbf32-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.360341 4825 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42d8b8af-9916-4aba-b4e7-f825ed30f182-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.360356 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca4ebafb-aa22-44ad-8037-487a9c3baca4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.360367 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f33e10c8-8be2-46d4-8653-1960855a2a40-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.360377 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04a2aca4-f98f-4ae8-aca9-62ae6625e5e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.360387 4825 
reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcf819fc-0560-45a8-be5e-04e6ef2bbf32-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.360395 4825 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e516b5cf-d44c-4f03-9247-7727319f0a85-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.365807 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1222a91-b344-4f54-bf9f-75d03d5f8549-config-data" (OuterVolumeSpecName: "config-data") pod "b1222a91-b344-4f54-bf9f-75d03d5f8549" (UID: "b1222a91-b344-4f54-bf9f-75d03d5f8549"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.377272 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e516b5cf-d44c-4f03-9247-7727319f0a85-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e516b5cf-d44c-4f03-9247-7727319f0a85" (UID: "e516b5cf-d44c-4f03-9247-7727319f0a85"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.380570 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.381320 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcf819fc-0560-45a8-be5e-04e6ef2bbf32-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fcf819fc-0560-45a8-be5e-04e6ef2bbf32" (UID: "fcf819fc-0560-45a8-be5e-04e6ef2bbf32"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.382988 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.383038 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42d8b8af-9916-4aba-b4e7-f825ed30f182-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "42d8b8af-9916-4aba-b4e7-f825ed30f182" (UID: "42d8b8af-9916-4aba-b4e7-f825ed30f182"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.383616 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42d8b8af-9916-4aba-b4e7-f825ed30f182-config-data" (OuterVolumeSpecName: "config-data") pod "42d8b8af-9916-4aba-b4e7-f825ed30f182" (UID: "42d8b8af-9916-4aba-b4e7-f825ed30f182"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.388804 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.392813 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcf819fc-0560-45a8-be5e-04e6ef2bbf32-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fcf819fc-0560-45a8-be5e-04e6ef2bbf32" (UID: "fcf819fc-0560-45a8-be5e-04e6ef2bbf32"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.397482 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9965b351-ed73-4c38-b393-ff72ba48cd66-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9965b351-ed73-4c38-b393-ff72ba48cd66" (UID: "9965b351-ed73-4c38-b393-ff72ba48cd66"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.398389 4825 scope.go:117] "RemoveContainer" containerID="40b43e7e62b535d8a9ba96b7af5069797ffcd786e53c1075b83a21ba139ed0fa" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.422216 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5fb2-account-create-update-z6tqt"] Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.432753 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1222a91-b344-4f54-bf9f-75d03d5f8549-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "b1222a91-b344-4f54-bf9f-75d03d5f8549" (UID: "b1222a91-b344-4f54-bf9f-75d03d5f8549"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.434826 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-5fb2-account-create-update-z6tqt"] Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.437470 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca4ebafb-aa22-44ad-8037-487a9c3baca4-config-data" (OuterVolumeSpecName: "config-data") pod "ca4ebafb-aa22-44ad-8037-487a9c3baca4" (UID: "ca4ebafb-aa22-44ad-8037-487a9c3baca4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.451687 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e516b5cf-d44c-4f03-9247-7727319f0a85-config-data" (OuterVolumeSpecName: "config-data") pod "e516b5cf-d44c-4f03-9247-7727319f0a85" (UID: "e516b5cf-d44c-4f03-9247-7727319f0a85"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.455490 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.461328 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d4252ca-f279-4c95-8f10-205339d028a5-combined-ca-bundle\") pod \"5d4252ca-f279-4c95-8f10-205339d028a5\" (UID: \"5d4252ca-f279-4c95-8f10-205339d028a5\") " Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.461362 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85f5995b-4ad8-4840-82f1-6659152c3ed4-config-data\") pod \"85f5995b-4ad8-4840-82f1-6659152c3ed4\" (UID: \"85f5995b-4ad8-4840-82f1-6659152c3ed4\") " Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.461386 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47d73d8a-acf6-42e6-a30d-e093144ee0b9-combined-ca-bundle\") pod \"47d73d8a-acf6-42e6-a30d-e093144ee0b9\" (UID: \"47d73d8a-acf6-42e6-a30d-e093144ee0b9\") " Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.461409 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5d4252ca-f279-4c95-8f10-205339d028a5-kolla-config\") pod 
\"5d4252ca-f279-4c95-8f10-205339d028a5\" (UID: \"5d4252ca-f279-4c95-8f10-205339d028a5\") " Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.461428 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d4252ca-f279-4c95-8f10-205339d028a5-config-data\") pod \"5d4252ca-f279-4c95-8f10-205339d028a5\" (UID: \"5d4252ca-f279-4c95-8f10-205339d028a5\") " Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.461457 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sstsk\" (UniqueName: \"kubernetes.io/projected/47d73d8a-acf6-42e6-a30d-e093144ee0b9-kube-api-access-sstsk\") pod \"47d73d8a-acf6-42e6-a30d-e093144ee0b9\" (UID: \"47d73d8a-acf6-42e6-a30d-e093144ee0b9\") " Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.461474 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85f5995b-4ad8-4840-82f1-6659152c3ed4-combined-ca-bundle\") pod \"85f5995b-4ad8-4840-82f1-6659152c3ed4\" (UID: \"85f5995b-4ad8-4840-82f1-6659152c3ed4\") " Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.461494 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvjqh\" (UniqueName: \"kubernetes.io/projected/85f5995b-4ad8-4840-82f1-6659152c3ed4-kube-api-access-tvjqh\") pod \"85f5995b-4ad8-4840-82f1-6659152c3ed4\" (UID: \"85f5995b-4ad8-4840-82f1-6659152c3ed4\") " Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.461537 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47d73d8a-acf6-42e6-a30d-e093144ee0b9-config-data\") pod \"47d73d8a-acf6-42e6-a30d-e093144ee0b9\" (UID: \"47d73d8a-acf6-42e6-a30d-e093144ee0b9\") " Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.461604 4825 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-2cqq4\" (UniqueName: \"kubernetes.io/projected/5d4252ca-f279-4c95-8f10-205339d028a5-kube-api-access-2cqq4\") pod \"5d4252ca-f279-4c95-8f10-205339d028a5\" (UID: \"5d4252ca-f279-4c95-8f10-205339d028a5\") " Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.461638 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d4252ca-f279-4c95-8f10-205339d028a5-memcached-tls-certs\") pod \"5d4252ca-f279-4c95-8f10-205339d028a5\" (UID: \"5d4252ca-f279-4c95-8f10-205339d028a5\") " Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.461983 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e516b5cf-d44c-4f03-9247-7727319f0a85-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.461994 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9965b351-ed73-4c38-b393-ff72ba48cd66-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.462003 4825 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcf819fc-0560-45a8-be5e-04e6ef2bbf32-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.462011 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1222a91-b344-4f54-bf9f-75d03d5f8549-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.462024 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca4ebafb-aa22-44ad-8037-487a9c3baca4-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: 
I0310 07:09:30.462043 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42d8b8af-9916-4aba-b4e7-f825ed30f182-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.462051 4825 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1222a91-b344-4f54-bf9f-75d03d5f8549-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.462062 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e516b5cf-d44c-4f03-9247-7727319f0a85-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.462086 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a668a190-c176-4d20-b26f-4ac28fa83476-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.462097 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22k77\" (UniqueName: \"kubernetes.io/projected/a668a190-c176-4d20-b26f-4ac28fa83476-kube-api-access-22k77\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.462108 4825 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42d8b8af-9916-4aba-b4e7-f825ed30f182-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.462117 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcf819fc-0560-45a8-be5e-04e6ef2bbf32-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.462253 4825 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/5d4252ca-f279-4c95-8f10-205339d028a5-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "5d4252ca-f279-4c95-8f10-205339d028a5" (UID: "5d4252ca-f279-4c95-8f10-205339d028a5"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:09:30 crc kubenswrapper[4825]: E0310 07:09:30.462850 4825 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 10 07:09:30 crc kubenswrapper[4825]: E0310 07:09:30.462930 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c684f3e4-ced9-49c3-aa54-515bc2c4fb56-operator-scripts podName:c684f3e4-ced9-49c3-aa54-515bc2c4fb56 nodeName:}" failed. No retries permitted until 2026-03-10 07:09:32.462885348 +0000 UTC m=+1525.492665963 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/c684f3e4-ced9-49c3-aa54-515bc2c4fb56-operator-scripts") pod "root-account-create-update-pwt26" (UID: "c684f3e4-ced9-49c3-aa54-515bc2c4fb56") : configmap "openstack-scripts" not found Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.472434 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47d73d8a-acf6-42e6-a30d-e093144ee0b9-kube-api-access-sstsk" (OuterVolumeSpecName: "kube-api-access-sstsk") pod "47d73d8a-acf6-42e6-a30d-e093144ee0b9" (UID: "47d73d8a-acf6-42e6-a30d-e093144ee0b9"). InnerVolumeSpecName "kube-api-access-sstsk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.473412 4825 scope.go:117] "RemoveContainer" containerID="fe8d444c42b3290f6dd09fc28b48048d43ae16fceb5be5dbfbb283eec952359c" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.473697 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.474053 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-pwt26" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.476348 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d4252ca-f279-4c95-8f10-205339d028a5-config-data" (OuterVolumeSpecName: "config-data") pod "5d4252ca-f279-4c95-8f10-205339d028a5" (UID: "5d4252ca-f279-4c95-8f10-205339d028a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.484724 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85f5995b-4ad8-4840-82f1-6659152c3ed4-kube-api-access-tvjqh" (OuterVolumeSpecName: "kube-api-access-tvjqh") pod "85f5995b-4ad8-4840-82f1-6659152c3ed4" (UID: "85f5995b-4ad8-4840-82f1-6659152c3ed4"). InnerVolumeSpecName "kube-api-access-tvjqh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.491344 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d4252ca-f279-4c95-8f10-205339d028a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d4252ca-f279-4c95-8f10-205339d028a5" (UID: "5d4252ca-f279-4c95-8f10-205339d028a5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.493305 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85f5995b-4ad8-4840-82f1-6659152c3ed4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "85f5995b-4ad8-4840-82f1-6659152c3ed4" (UID: "85f5995b-4ad8-4840-82f1-6659152c3ed4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.495513 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85f5995b-4ad8-4840-82f1-6659152c3ed4-config-data" (OuterVolumeSpecName: "config-data") pod "85f5995b-4ad8-4840-82f1-6659152c3ed4" (UID: "85f5995b-4ad8-4840-82f1-6659152c3ed4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.495688 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d4252ca-f279-4c95-8f10-205339d028a5-kube-api-access-2cqq4" (OuterVolumeSpecName: "kube-api-access-2cqq4") pod "5d4252ca-f279-4c95-8f10-205339d028a5" (UID: "5d4252ca-f279-4c95-8f10-205339d028a5"). InnerVolumeSpecName "kube-api-access-2cqq4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.512874 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47d73d8a-acf6-42e6-a30d-e093144ee0b9-config-data" (OuterVolumeSpecName: "config-data") pod "47d73d8a-acf6-42e6-a30d-e093144ee0b9" (UID: "47d73d8a-acf6-42e6-a30d-e093144ee0b9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.514616 4825 scope.go:117] "RemoveContainer" containerID="b75f52a9c2b47ee09bd6f6c1c6a4ac378afdcf400070a6ab9a12012c65129fcc" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.532815 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.534494 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47d73d8a-acf6-42e6-a30d-e093144ee0b9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47d73d8a-acf6-42e6-a30d-e093144ee0b9" (UID: "47d73d8a-acf6-42e6-a30d-e093144ee0b9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.545508 4825 scope.go:117] "RemoveContainer" containerID="8c02a587ff89f403a3ba0a4a6efae15db19b3375bb2065fb2488307352268600" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.549195 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.568587 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c684f3e4-ced9-49c3-aa54-515bc2c4fb56-operator-scripts\") pod \"c684f3e4-ced9-49c3-aa54-515bc2c4fb56\" (UID: \"c684f3e4-ced9-49c3-aa54-515bc2c4fb56\") " Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.568836 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9k67d\" (UniqueName: \"kubernetes.io/projected/c684f3e4-ced9-49c3-aa54-515bc2c4fb56-kube-api-access-9k67d\") pod \"c684f3e4-ced9-49c3-aa54-515bc2c4fb56\" (UID: \"c684f3e4-ced9-49c3-aa54-515bc2c4fb56\") " Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.569555 4825 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/c684f3e4-ced9-49c3-aa54-515bc2c4fb56-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c684f3e4-ced9-49c3-aa54-515bc2c4fb56" (UID: "c684f3e4-ced9-49c3-aa54-515bc2c4fb56"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.577168 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.580455 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47d73d8a-acf6-42e6-a30d-e093144ee0b9-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.580492 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c684f3e4-ced9-49c3-aa54-515bc2c4fb56-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.580509 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cqq4\" (UniqueName: \"kubernetes.io/projected/5d4252ca-f279-4c95-8f10-205339d028a5-kube-api-access-2cqq4\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.580521 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d4252ca-f279-4c95-8f10-205339d028a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.580535 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85f5995b-4ad8-4840-82f1-6659152c3ed4-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.580544 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/47d73d8a-acf6-42e6-a30d-e093144ee0b9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.580552 4825 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5d4252ca-f279-4c95-8f10-205339d028a5-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.580563 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d4252ca-f279-4c95-8f10-205339d028a5-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.580577 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sstsk\" (UniqueName: \"kubernetes.io/projected/47d73d8a-acf6-42e6-a30d-e093144ee0b9-kube-api-access-sstsk\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.580585 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85f5995b-4ad8-4840-82f1-6659152c3ed4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.580594 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvjqh\" (UniqueName: \"kubernetes.io/projected/85f5995b-4ad8-4840-82f1-6659152c3ed4-kube-api-access-tvjqh\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.580927 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d4252ca-f279-4c95-8f10-205339d028a5-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "5d4252ca-f279-4c95-8f10-205339d028a5" (UID: "5d4252ca-f279-4c95-8f10-205339d028a5"). InnerVolumeSpecName "memcached-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.583504 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.587089 4825 scope.go:117] "RemoveContainer" containerID="bb56c1153d981c89befdcab3c321e793fa3da2a54c78cde62ebc8522df37f74c" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.587538 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c684f3e4-ced9-49c3-aa54-515bc2c4fb56-kube-api-access-9k67d" (OuterVolumeSpecName: "kube-api-access-9k67d") pod "c684f3e4-ced9-49c3-aa54-515bc2c4fb56" (UID: "c684f3e4-ced9-49c3-aa54-515bc2c4fb56"). InnerVolumeSpecName "kube-api-access-9k67d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.633453 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.642160 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.649889 4825 scope.go:117] "RemoveContainer" containerID="41fab81e9dbca0e39235649b019d1b34334b13bd38163f9b2e2254c248abda96" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.657790 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.669955 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.678582 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.681934 4825 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5d4252ca-f279-4c95-8f10-205339d028a5-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.681963 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9k67d\" (UniqueName: \"kubernetes.io/projected/c684f3e4-ced9-49c3-aa54-515bc2c4fb56-kube-api-access-9k67d\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.683960 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.684001 4825 scope.go:117] "RemoveContainer" containerID="fe83016a56de378ec7204c67a3bc85b458b1da0d43277d26413e0aa8e2ea9dc4" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.698467 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-9c98f44d4-hm6qz"] Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.705447 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-9c98f44d4-hm6qz"] Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.711702 4825 scope.go:117] "RemoveContainer" containerID="eeadc36b533bcb861ae1e4ac68b98711c46023cbb63985038bb6a6302eb5d0a3" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.732140 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.732807 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.744447 4825 scope.go:117] "RemoveContainer" containerID="fe83016a56de378ec7204c67a3bc85b458b1da0d43277d26413e0aa8e2ea9dc4" Mar 10 07:09:30 crc kubenswrapper[4825]: E0310 07:09:30.745042 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe83016a56de378ec7204c67a3bc85b458b1da0d43277d26413e0aa8e2ea9dc4\": container with ID starting 
with fe83016a56de378ec7204c67a3bc85b458b1da0d43277d26413e0aa8e2ea9dc4 not found: ID does not exist" containerID="fe83016a56de378ec7204c67a3bc85b458b1da0d43277d26413e0aa8e2ea9dc4" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.745080 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe83016a56de378ec7204c67a3bc85b458b1da0d43277d26413e0aa8e2ea9dc4"} err="failed to get container status \"fe83016a56de378ec7204c67a3bc85b458b1da0d43277d26413e0aa8e2ea9dc4\": rpc error: code = NotFound desc = could not find container \"fe83016a56de378ec7204c67a3bc85b458b1da0d43277d26413e0aa8e2ea9dc4\": container with ID starting with fe83016a56de378ec7204c67a3bc85b458b1da0d43277d26413e0aa8e2ea9dc4 not found: ID does not exist" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.745101 4825 scope.go:117] "RemoveContainer" containerID="eeadc36b533bcb861ae1e4ac68b98711c46023cbb63985038bb6a6302eb5d0a3" Mar 10 07:09:30 crc kubenswrapper[4825]: E0310 07:09:30.745483 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eeadc36b533bcb861ae1e4ac68b98711c46023cbb63985038bb6a6302eb5d0a3\": container with ID starting with eeadc36b533bcb861ae1e4ac68b98711c46023cbb63985038bb6a6302eb5d0a3 not found: ID does not exist" containerID="eeadc36b533bcb861ae1e4ac68b98711c46023cbb63985038bb6a6302eb5d0a3" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.745510 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eeadc36b533bcb861ae1e4ac68b98711c46023cbb63985038bb6a6302eb5d0a3"} err="failed to get container status \"eeadc36b533bcb861ae1e4ac68b98711c46023cbb63985038bb6a6302eb5d0a3\": rpc error: code = NotFound desc = could not find container \"eeadc36b533bcb861ae1e4ac68b98711c46023cbb63985038bb6a6302eb5d0a3\": container with ID starting with eeadc36b533bcb861ae1e4ac68b98711c46023cbb63985038bb6a6302eb5d0a3 not found: ID does 
not exist" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.745525 4825 scope.go:117] "RemoveContainer" containerID="45339b16b80b77a1ec497dc0cfdf7cad94e760f0b42f91faed72b2b7acaeea0f" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.750077 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.752356 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.761736 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.770149 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.771268 4825 scope.go:117] "RemoveContainer" containerID="6a1691cba1fef49f1aa2f7c44e95ac06b21fe92d69b0eadbad46361719368876" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.791196 4825 scope.go:117] "RemoveContainer" containerID="45339b16b80b77a1ec497dc0cfdf7cad94e760f0b42f91faed72b2b7acaeea0f" Mar 10 07:09:30 crc kubenswrapper[4825]: E0310 07:09:30.792192 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45339b16b80b77a1ec497dc0cfdf7cad94e760f0b42f91faed72b2b7acaeea0f\": container with ID starting with 45339b16b80b77a1ec497dc0cfdf7cad94e760f0b42f91faed72b2b7acaeea0f not found: ID does not exist" containerID="45339b16b80b77a1ec497dc0cfdf7cad94e760f0b42f91faed72b2b7acaeea0f" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.792227 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45339b16b80b77a1ec497dc0cfdf7cad94e760f0b42f91faed72b2b7acaeea0f"} err="failed to get container status \"45339b16b80b77a1ec497dc0cfdf7cad94e760f0b42f91faed72b2b7acaeea0f\": rpc error: code = 
NotFound desc = could not find container \"45339b16b80b77a1ec497dc0cfdf7cad94e760f0b42f91faed72b2b7acaeea0f\": container with ID starting with 45339b16b80b77a1ec497dc0cfdf7cad94e760f0b42f91faed72b2b7acaeea0f not found: ID does not exist" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.792252 4825 scope.go:117] "RemoveContainer" containerID="6a1691cba1fef49f1aa2f7c44e95ac06b21fe92d69b0eadbad46361719368876" Mar 10 07:09:30 crc kubenswrapper[4825]: E0310 07:09:30.792652 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a1691cba1fef49f1aa2f7c44e95ac06b21fe92d69b0eadbad46361719368876\": container with ID starting with 6a1691cba1fef49f1aa2f7c44e95ac06b21fe92d69b0eadbad46361719368876 not found: ID does not exist" containerID="6a1691cba1fef49f1aa2f7c44e95ac06b21fe92d69b0eadbad46361719368876" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.792670 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a1691cba1fef49f1aa2f7c44e95ac06b21fe92d69b0eadbad46361719368876"} err="failed to get container status \"6a1691cba1fef49f1aa2f7c44e95ac06b21fe92d69b0eadbad46361719368876\": rpc error: code = NotFound desc = could not find container \"6a1691cba1fef49f1aa2f7c44e95ac06b21fe92d69b0eadbad46361719368876\": container with ID starting with 6a1691cba1fef49f1aa2f7c44e95ac06b21fe92d69b0eadbad46361719368876 not found: ID does not exist" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.792683 4825 scope.go:117] "RemoveContainer" containerID="062160b5e34d30d57397fa7d0b1b030e8fd845e13121bd5d0cdea1d6a20a2de8" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.812829 4825 scope.go:117] "RemoveContainer" containerID="b95484ef86f19d2fa040d657d6e6e16f10d17dd575ffd8009352291bebecc277" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.855040 4825 scope.go:117] "RemoveContainer" 
containerID="d021be5f73ead6231f65789871b1004af37bd859e93d3efa1540dedd07f320d9" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.868642 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_d77fced6-dee7-49fc-9088-e173a5be3cee/ovn-northd/0.log" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.868730 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.884380 4825 scope.go:117] "RemoveContainer" containerID="b5fc534b6fc5bdb62a0dd359fd61377bd0c4ca9f7943fd237ff73b6a79274f5c" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.883900 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d77fced6-dee7-49fc-9088-e173a5be3cee-metrics-certs-tls-certs\") pod \"d77fced6-dee7-49fc-9088-e173a5be3cee\" (UID: \"d77fced6-dee7-49fc-9088-e173a5be3cee\") " Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.889016 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d77fced6-dee7-49fc-9088-e173a5be3cee-combined-ca-bundle\") pod \"d77fced6-dee7-49fc-9088-e173a5be3cee\" (UID: \"d77fced6-dee7-49fc-9088-e173a5be3cee\") " Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.889053 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d77fced6-dee7-49fc-9088-e173a5be3cee-ovn-northd-tls-certs\") pod \"d77fced6-dee7-49fc-9088-e173a5be3cee\" (UID: \"d77fced6-dee7-49fc-9088-e173a5be3cee\") " Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.889329 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d77fced6-dee7-49fc-9088-e173a5be3cee-scripts\") pod 
\"d77fced6-dee7-49fc-9088-e173a5be3cee\" (UID: \"d77fced6-dee7-49fc-9088-e173a5be3cee\") " Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.889448 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fv4pm\" (UniqueName: \"kubernetes.io/projected/d77fced6-dee7-49fc-9088-e173a5be3cee-kube-api-access-fv4pm\") pod \"d77fced6-dee7-49fc-9088-e173a5be3cee\" (UID: \"d77fced6-dee7-49fc-9088-e173a5be3cee\") " Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.889499 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d77fced6-dee7-49fc-9088-e173a5be3cee-ovn-rundir\") pod \"d77fced6-dee7-49fc-9088-e173a5be3cee\" (UID: \"d77fced6-dee7-49fc-9088-e173a5be3cee\") " Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.889549 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d77fced6-dee7-49fc-9088-e173a5be3cee-config\") pod \"d77fced6-dee7-49fc-9088-e173a5be3cee\" (UID: \"d77fced6-dee7-49fc-9088-e173a5be3cee\") " Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.890343 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d77fced6-dee7-49fc-9088-e173a5be3cee-scripts" (OuterVolumeSpecName: "scripts") pod "d77fced6-dee7-49fc-9088-e173a5be3cee" (UID: "d77fced6-dee7-49fc-9088-e173a5be3cee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.890827 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d77fced6-dee7-49fc-9088-e173a5be3cee-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "d77fced6-dee7-49fc-9088-e173a5be3cee" (UID: "d77fced6-dee7-49fc-9088-e173a5be3cee"). InnerVolumeSpecName "ovn-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.890988 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d77fced6-dee7-49fc-9088-e173a5be3cee-config" (OuterVolumeSpecName: "config") pod "d77fced6-dee7-49fc-9088-e173a5be3cee" (UID: "d77fced6-dee7-49fc-9088-e173a5be3cee"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.895449 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d77fced6-dee7-49fc-9088-e173a5be3cee-kube-api-access-fv4pm" (OuterVolumeSpecName: "kube-api-access-fv4pm") pod "d77fced6-dee7-49fc-9088-e173a5be3cee" (UID: "d77fced6-dee7-49fc-9088-e173a5be3cee"). InnerVolumeSpecName "kube-api-access-fv4pm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.963171 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d77fced6-dee7-49fc-9088-e173a5be3cee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d77fced6-dee7-49fc-9088-e173a5be3cee" (UID: "d77fced6-dee7-49fc-9088-e173a5be3cee"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.992153 4825 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d77fced6-dee7-49fc-9088-e173a5be3cee-ovn-rundir\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.992194 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d77fced6-dee7-49fc-9088-e173a5be3cee-config\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.992205 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d77fced6-dee7-49fc-9088-e173a5be3cee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.992214 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d77fced6-dee7-49fc-9088-e173a5be3cee-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:30 crc kubenswrapper[4825]: I0310 07:09:30.992224 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fv4pm\" (UniqueName: \"kubernetes.io/projected/d77fced6-dee7-49fc-9088-e173a5be3cee-kube-api-access-fv4pm\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.007637 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d77fced6-dee7-49fc-9088-e173a5be3cee-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "d77fced6-dee7-49fc-9088-e173a5be3cee" (UID: "d77fced6-dee7-49fc-9088-e173a5be3cee"). InnerVolumeSpecName "ovn-northd-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.007730 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d77fced6-dee7-49fc-9088-e173a5be3cee-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "d77fced6-dee7-49fc-9088-e173a5be3cee" (UID: "d77fced6-dee7-49fc-9088-e173a5be3cee"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.093034 4825 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d77fced6-dee7-49fc-9088-e173a5be3cee-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.093055 4825 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d77fced6-dee7-49fc-9088-e173a5be3cee-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:31 crc kubenswrapper[4825]: E0310 07:09:31.093110 4825 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 10 07:09:31 crc kubenswrapper[4825]: E0310 07:09:31.093182 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/40efa241-98cc-4dec-9ae8-8a892b367ebc-config-data podName:40efa241-98cc-4dec-9ae8-8a892b367ebc nodeName:}" failed. No retries permitted until 2026-03-10 07:09:39.093167279 +0000 UTC m=+1532.122947894 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/40efa241-98cc-4dec-9ae8-8a892b367ebc-config-data") pod "rabbitmq-server-0" (UID: "40efa241-98cc-4dec-9ae8-8a892b367ebc") : configmap "rabbitmq-config-data" not found Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.100887 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_d77fced6-dee7-49fc-9088-e173a5be3cee/ovn-northd/0.log" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.100925 4825 generic.go:334] "Generic (PLEG): container finished" podID="d77fced6-dee7-49fc-9088-e173a5be3cee" containerID="6a4f962505809098311731c7df1e9e9acaa99f0d4d2a16ebefd3083955087398" exitCode=139 Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.101017 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.102283 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d77fced6-dee7-49fc-9088-e173a5be3cee","Type":"ContainerDied","Data":"6a4f962505809098311731c7df1e9e9acaa99f0d4d2a16ebefd3083955087398"} Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.102347 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d77fced6-dee7-49fc-9088-e173a5be3cee","Type":"ContainerDied","Data":"e80f70e559e88ab7e4a0b13d4c1292a7c68c490c36f06ac9365c83bd1c1a3bd8"} Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.108501 4825 generic.go:334] "Generic (PLEG): container finished" podID="1b5b179f-f4fb-479c-9720-25587566c518" containerID="c5251bd347e4e2a0285d072a7e65894076feb5953eaeecf9032c40e2a8952418" exitCode=0 Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.108613 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"1b5b179f-f4fb-479c-9720-25587566c518","Type":"ContainerDied","Data":"c5251bd347e4e2a0285d072a7e65894076feb5953eaeecf9032c40e2a8952418"} Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.113385 4825 generic.go:334] "Generic (PLEG): container finished" podID="9ba0e3ee-1309-4411-a927-866b35c2776b" containerID="33787b88c881e69ba026aff5962a5455c1e4b2f94327fcc119388ebe1cd30211" exitCode=0 Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.113434 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9ba0e3ee-1309-4411-a927-866b35c2776b","Type":"ContainerDied","Data":"33787b88c881e69ba026aff5962a5455c1e4b2f94327fcc119388ebe1cd30211"} Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.116383 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"47d73d8a-acf6-42e6-a30d-e093144ee0b9","Type":"ContainerDied","Data":"01f1d5d9ae5b566ec85b11984ea694cea253fd859bb997b38c5b4718a9006b4a"} Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.116466 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.135358 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"85f5995b-4ad8-4840-82f1-6659152c3ed4","Type":"ContainerDied","Data":"141b7ddad9bdd5daae4c0afcd89acbe29d3c3b9a413154d4f74d7e889beeb150"} Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.135501 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.137846 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"5d4252ca-f279-4c95-8f10-205339d028a5","Type":"ContainerDied","Data":"07bcbd9d0c0735b9472b5e79d9c83d84abc98a52a384930a35a5190919aa256c"} Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.137946 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.158213 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-pwt26" event={"ID":"c684f3e4-ced9-49c3-aa54-515bc2c4fb56","Type":"ContainerDied","Data":"25f403db5ebe5475717c8f47289b1479fce9b0d0eac9e70adf5907cdfb44e580"} Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.158314 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-pwt26" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.224780 4825 scope.go:117] "RemoveContainer" containerID="062160b5e34d30d57397fa7d0b1b030e8fd845e13121bd5d0cdea1d6a20a2de8" Mar 10 07:09:31 crc kubenswrapper[4825]: E0310 07:09:31.225303 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"062160b5e34d30d57397fa7d0b1b030e8fd845e13121bd5d0cdea1d6a20a2de8\": container with ID starting with 062160b5e34d30d57397fa7d0b1b030e8fd845e13121bd5d0cdea1d6a20a2de8 not found: ID does not exist" containerID="062160b5e34d30d57397fa7d0b1b030e8fd845e13121bd5d0cdea1d6a20a2de8" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.225353 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"062160b5e34d30d57397fa7d0b1b030e8fd845e13121bd5d0cdea1d6a20a2de8"} err="failed to get container status 
\"062160b5e34d30d57397fa7d0b1b030e8fd845e13121bd5d0cdea1d6a20a2de8\": rpc error: code = NotFound desc = could not find container \"062160b5e34d30d57397fa7d0b1b030e8fd845e13121bd5d0cdea1d6a20a2de8\": container with ID starting with 062160b5e34d30d57397fa7d0b1b030e8fd845e13121bd5d0cdea1d6a20a2de8 not found: ID does not exist" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.225384 4825 scope.go:117] "RemoveContainer" containerID="b95484ef86f19d2fa040d657d6e6e16f10d17dd575ffd8009352291bebecc277" Mar 10 07:09:31 crc kubenswrapper[4825]: E0310 07:09:31.225814 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b95484ef86f19d2fa040d657d6e6e16f10d17dd575ffd8009352291bebecc277\": container with ID starting with b95484ef86f19d2fa040d657d6e6e16f10d17dd575ffd8009352291bebecc277 not found: ID does not exist" containerID="b95484ef86f19d2fa040d657d6e6e16f10d17dd575ffd8009352291bebecc277" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.225856 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b95484ef86f19d2fa040d657d6e6e16f10d17dd575ffd8009352291bebecc277"} err="failed to get container status \"b95484ef86f19d2fa040d657d6e6e16f10d17dd575ffd8009352291bebecc277\": rpc error: code = NotFound desc = could not find container \"b95484ef86f19d2fa040d657d6e6e16f10d17dd575ffd8009352291bebecc277\": container with ID starting with b95484ef86f19d2fa040d657d6e6e16f10d17dd575ffd8009352291bebecc277 not found: ID does not exist" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.225886 4825 scope.go:117] "RemoveContainer" containerID="d021be5f73ead6231f65789871b1004af37bd859e93d3efa1540dedd07f320d9" Mar 10 07:09:31 crc kubenswrapper[4825]: E0310 07:09:31.226204 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d021be5f73ead6231f65789871b1004af37bd859e93d3efa1540dedd07f320d9\": container with ID starting with d021be5f73ead6231f65789871b1004af37bd859e93d3efa1540dedd07f320d9 not found: ID does not exist" containerID="d021be5f73ead6231f65789871b1004af37bd859e93d3efa1540dedd07f320d9" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.226235 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d021be5f73ead6231f65789871b1004af37bd859e93d3efa1540dedd07f320d9"} err="failed to get container status \"d021be5f73ead6231f65789871b1004af37bd859e93d3efa1540dedd07f320d9\": rpc error: code = NotFound desc = could not find container \"d021be5f73ead6231f65789871b1004af37bd859e93d3efa1540dedd07f320d9\": container with ID starting with d021be5f73ead6231f65789871b1004af37bd859e93d3efa1540dedd07f320d9 not found: ID does not exist" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.226254 4825 scope.go:117] "RemoveContainer" containerID="b5fc534b6fc5bdb62a0dd359fd61377bd0c4ca9f7943fd237ff73b6a79274f5c" Mar 10 07:09:31 crc kubenswrapper[4825]: E0310 07:09:31.226522 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5fc534b6fc5bdb62a0dd359fd61377bd0c4ca9f7943fd237ff73b6a79274f5c\": container with ID starting with b5fc534b6fc5bdb62a0dd359fd61377bd0c4ca9f7943fd237ff73b6a79274f5c not found: ID does not exist" containerID="b5fc534b6fc5bdb62a0dd359fd61377bd0c4ca9f7943fd237ff73b6a79274f5c" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.226548 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5fc534b6fc5bdb62a0dd359fd61377bd0c4ca9f7943fd237ff73b6a79274f5c"} err="failed to get container status \"b5fc534b6fc5bdb62a0dd359fd61377bd0c4ca9f7943fd237ff73b6a79274f5c\": rpc error: code = NotFound desc = could not find container \"b5fc534b6fc5bdb62a0dd359fd61377bd0c4ca9f7943fd237ff73b6a79274f5c\": container with ID 
starting with b5fc534b6fc5bdb62a0dd359fd61377bd0c4ca9f7943fd237ff73b6a79274f5c not found: ID does not exist" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.226565 4825 scope.go:117] "RemoveContainer" containerID="062160b5e34d30d57397fa7d0b1b030e8fd845e13121bd5d0cdea1d6a20a2de8" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.227002 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"062160b5e34d30d57397fa7d0b1b030e8fd845e13121bd5d0cdea1d6a20a2de8"} err="failed to get container status \"062160b5e34d30d57397fa7d0b1b030e8fd845e13121bd5d0cdea1d6a20a2de8\": rpc error: code = NotFound desc = could not find container \"062160b5e34d30d57397fa7d0b1b030e8fd845e13121bd5d0cdea1d6a20a2de8\": container with ID starting with 062160b5e34d30d57397fa7d0b1b030e8fd845e13121bd5d0cdea1d6a20a2de8 not found: ID does not exist" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.227024 4825 scope.go:117] "RemoveContainer" containerID="b95484ef86f19d2fa040d657d6e6e16f10d17dd575ffd8009352291bebecc277" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.227296 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b95484ef86f19d2fa040d657d6e6e16f10d17dd575ffd8009352291bebecc277"} err="failed to get container status \"b95484ef86f19d2fa040d657d6e6e16f10d17dd575ffd8009352291bebecc277\": rpc error: code = NotFound desc = could not find container \"b95484ef86f19d2fa040d657d6e6e16f10d17dd575ffd8009352291bebecc277\": container with ID starting with b95484ef86f19d2fa040d657d6e6e16f10d17dd575ffd8009352291bebecc277 not found: ID does not exist" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.227326 4825 scope.go:117] "RemoveContainer" containerID="d021be5f73ead6231f65789871b1004af37bd859e93d3efa1540dedd07f320d9" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.227608 4825 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d021be5f73ead6231f65789871b1004af37bd859e93d3efa1540dedd07f320d9"} err="failed to get container status \"d021be5f73ead6231f65789871b1004af37bd859e93d3efa1540dedd07f320d9\": rpc error: code = NotFound desc = could not find container \"d021be5f73ead6231f65789871b1004af37bd859e93d3efa1540dedd07f320d9\": container with ID starting with d021be5f73ead6231f65789871b1004af37bd859e93d3efa1540dedd07f320d9 not found: ID does not exist" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.227625 4825 scope.go:117] "RemoveContainer" containerID="b5fc534b6fc5bdb62a0dd359fd61377bd0c4ca9f7943fd237ff73b6a79274f5c" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.227900 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5fc534b6fc5bdb62a0dd359fd61377bd0c4ca9f7943fd237ff73b6a79274f5c"} err="failed to get container status \"b5fc534b6fc5bdb62a0dd359fd61377bd0c4ca9f7943fd237ff73b6a79274f5c\": rpc error: code = NotFound desc = could not find container \"b5fc534b6fc5bdb62a0dd359fd61377bd0c4ca9f7943fd237ff73b6a79274f5c\": container with ID starting with b5fc534b6fc5bdb62a0dd359fd61377bd0c4ca9f7943fd237ff73b6a79274f5c not found: ID does not exist" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.227916 4825 scope.go:117] "RemoveContainer" containerID="3d70d139a1deb1c6a27cdf349fc1cc7878472659a23957efa5657e5551a800ef" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.264471 4825 scope.go:117] "RemoveContainer" containerID="3d70d139a1deb1c6a27cdf349fc1cc7878472659a23957efa5657e5551a800ef" Mar 10 07:09:31 crc kubenswrapper[4825]: E0310 07:09:31.268019 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d70d139a1deb1c6a27cdf349fc1cc7878472659a23957efa5657e5551a800ef\": container with ID starting with 3d70d139a1deb1c6a27cdf349fc1cc7878472659a23957efa5657e5551a800ef not found: ID does not exist" 
containerID="3d70d139a1deb1c6a27cdf349fc1cc7878472659a23957efa5657e5551a800ef" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.268054 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d70d139a1deb1c6a27cdf349fc1cc7878472659a23957efa5657e5551a800ef"} err="failed to get container status \"3d70d139a1deb1c6a27cdf349fc1cc7878472659a23957efa5657e5551a800ef\": rpc error: code = NotFound desc = could not find container \"3d70d139a1deb1c6a27cdf349fc1cc7878472659a23957efa5657e5551a800ef\": container with ID starting with 3d70d139a1deb1c6a27cdf349fc1cc7878472659a23957efa5657e5551a800ef not found: ID does not exist" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.268080 4825 scope.go:117] "RemoveContainer" containerID="eb096eba10587e7f334a63bf327913dd03205b29419eb194a05f626f07286905" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.273940 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04a2aca4-f98f-4ae8-aca9-62ae6625e5e2" path="/var/lib/kubelet/pods/04a2aca4-f98f-4ae8-aca9-62ae6625e5e2/volumes" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.274881 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="209fbdac-3b3c-451a-ae30-5888e1cbb891" path="/var/lib/kubelet/pods/209fbdac-3b3c-451a-ae30-5888e1cbb891/volumes" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.275709 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42d8b8af-9916-4aba-b4e7-f825ed30f182" path="/var/lib/kubelet/pods/42d8b8af-9916-4aba-b4e7-f825ed30f182/volumes" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.277041 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b3d94ce-f8d8-4653-a6b0-2682b23d834e" path="/var/lib/kubelet/pods/8b3d94ce-f8d8-4653-a6b0-2682b23d834e/volumes" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.278001 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="9965b351-ed73-4c38-b393-ff72ba48cd66" path="/var/lib/kubelet/pods/9965b351-ed73-4c38-b393-ff72ba48cd66/volumes" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.278485 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a668a190-c176-4d20-b26f-4ac28fa83476" path="/var/lib/kubelet/pods/a668a190-c176-4d20-b26f-4ac28fa83476/volumes" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.278827 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1222a91-b344-4f54-bf9f-75d03d5f8549" path="/var/lib/kubelet/pods/b1222a91-b344-4f54-bf9f-75d03d5f8549/volumes" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.279937 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca4ebafb-aa22-44ad-8037-487a9c3baca4" path="/var/lib/kubelet/pods/ca4ebafb-aa22-44ad-8037-487a9c3baca4/volumes" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.280606 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e516b5cf-d44c-4f03-9247-7727319f0a85" path="/var/lib/kubelet/pods/e516b5cf-d44c-4f03-9247-7727319f0a85/volumes" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.281896 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f33e10c8-8be2-46d4-8653-1960855a2a40" path="/var/lib/kubelet/pods/f33e10c8-8be2-46d4-8653-1960855a2a40/volumes" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.282503 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcf819fc-0560-45a8-be5e-04e6ef2bbf32" path="/var/lib/kubelet/pods/fcf819fc-0560-45a8-be5e-04e6ef2bbf32/volumes" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.283376 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.283408 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.288187 4825 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.305243 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.311271 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.313872 4825 scope.go:117] "RemoveContainer" containerID="0596a09cf87b1eba90ca5be28a144085a5ee0e4870ea3024df884f81b878aeaf" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.316179 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.331593 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-pwt26"] Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.337230 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-pwt26"] Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.345080 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.350569 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.355710 4825 scope.go:117] "RemoveContainer" containerID="69a4b33cb0df81f2796dea2e3671cf92e7c2fcaf46f6acf19185262e97be0696" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.408211 4825 scope.go:117] "RemoveContainer" containerID="2afb8783418624fe53a581d5b384b0eac94f03f7b397fe6d2851a79acaf2faeb" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.468396 4825 scope.go:117] "RemoveContainer" containerID="69a4b33cb0df81f2796dea2e3671cf92e7c2fcaf46f6acf19185262e97be0696" Mar 10 07:09:31 crc kubenswrapper[4825]: E0310 07:09:31.468840 4825 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"69a4b33cb0df81f2796dea2e3671cf92e7c2fcaf46f6acf19185262e97be0696\": container with ID starting with 69a4b33cb0df81f2796dea2e3671cf92e7c2fcaf46f6acf19185262e97be0696 not found: ID does not exist" containerID="69a4b33cb0df81f2796dea2e3671cf92e7c2fcaf46f6acf19185262e97be0696" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.468881 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69a4b33cb0df81f2796dea2e3671cf92e7c2fcaf46f6acf19185262e97be0696"} err="failed to get container status \"69a4b33cb0df81f2796dea2e3671cf92e7c2fcaf46f6acf19185262e97be0696\": rpc error: code = NotFound desc = could not find container \"69a4b33cb0df81f2796dea2e3671cf92e7c2fcaf46f6acf19185262e97be0696\": container with ID starting with 69a4b33cb0df81f2796dea2e3671cf92e7c2fcaf46f6acf19185262e97be0696 not found: ID does not exist" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.468913 4825 scope.go:117] "RemoveContainer" containerID="2afb8783418624fe53a581d5b384b0eac94f03f7b397fe6d2851a79acaf2faeb" Mar 10 07:09:31 crc kubenswrapper[4825]: E0310 07:09:31.469324 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2afb8783418624fe53a581d5b384b0eac94f03f7b397fe6d2851a79acaf2faeb\": container with ID starting with 2afb8783418624fe53a581d5b384b0eac94f03f7b397fe6d2851a79acaf2faeb not found: ID does not exist" containerID="2afb8783418624fe53a581d5b384b0eac94f03f7b397fe6d2851a79acaf2faeb" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.469655 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2afb8783418624fe53a581d5b384b0eac94f03f7b397fe6d2851a79acaf2faeb"} err="failed to get container status \"2afb8783418624fe53a581d5b384b0eac94f03f7b397fe6d2851a79acaf2faeb\": rpc error: code = NotFound desc = could not find container 
\"2afb8783418624fe53a581d5b384b0eac94f03f7b397fe6d2851a79acaf2faeb\": container with ID starting with 2afb8783418624fe53a581d5b384b0eac94f03f7b397fe6d2851a79acaf2faeb not found: ID does not exist" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.469670 4825 scope.go:117] "RemoveContainer" containerID="4ceda535e48bc00d5f96a25e503044a292df24630498d86c8a415946cacf0904" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.480904 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.496741 4825 scope.go:117] "RemoveContainer" containerID="6a4f962505809098311731c7df1e9e9acaa99f0d4d2a16ebefd3083955087398" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.497911 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1b5b179f-f4fb-479c-9720-25587566c518-config-data-default\") pod \"1b5b179f-f4fb-479c-9720-25587566c518\" (UID: \"1b5b179f-f4fb-479c-9720-25587566c518\") " Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.497971 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b5b179f-f4fb-479c-9720-25587566c518-combined-ca-bundle\") pod \"1b5b179f-f4fb-479c-9720-25587566c518\" (UID: \"1b5b179f-f4fb-479c-9720-25587566c518\") " Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.498013 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1b5b179f-f4fb-479c-9720-25587566c518-config-data-generated\") pod \"1b5b179f-f4fb-479c-9720-25587566c518\" (UID: \"1b5b179f-f4fb-479c-9720-25587566c518\") " Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.498044 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" 
(UniqueName: \"kubernetes.io/configmap/1b5b179f-f4fb-479c-9720-25587566c518-kolla-config\") pod \"1b5b179f-f4fb-479c-9720-25587566c518\" (UID: \"1b5b179f-f4fb-479c-9720-25587566c518\") " Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.498118 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b5b179f-f4fb-479c-9720-25587566c518-operator-scripts\") pod \"1b5b179f-f4fb-479c-9720-25587566c518\" (UID: \"1b5b179f-f4fb-479c-9720-25587566c518\") " Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.498238 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b5b179f-f4fb-479c-9720-25587566c518-galera-tls-certs\") pod \"1b5b179f-f4fb-479c-9720-25587566c518\" (UID: \"1b5b179f-f4fb-479c-9720-25587566c518\") " Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.498269 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"1b5b179f-f4fb-479c-9720-25587566c518\" (UID: \"1b5b179f-f4fb-479c-9720-25587566c518\") " Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.498315 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqzfd\" (UniqueName: \"kubernetes.io/projected/1b5b179f-f4fb-479c-9720-25587566c518-kube-api-access-pqzfd\") pod \"1b5b179f-f4fb-479c-9720-25587566c518\" (UID: \"1b5b179f-f4fb-479c-9720-25587566c518\") " Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.501452 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b5b179f-f4fb-479c-9720-25587566c518-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1b5b179f-f4fb-479c-9720-25587566c518" (UID: "1b5b179f-f4fb-479c-9720-25587566c518"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.501824 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b5b179f-f4fb-479c-9720-25587566c518-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "1b5b179f-f4fb-479c-9720-25587566c518" (UID: "1b5b179f-f4fb-479c-9720-25587566c518"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.502821 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b5b179f-f4fb-479c-9720-25587566c518-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "1b5b179f-f4fb-479c-9720-25587566c518" (UID: "1b5b179f-f4fb-479c-9720-25587566c518"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.502925 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b5b179f-f4fb-479c-9720-25587566c518-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "1b5b179f-f4fb-479c-9720-25587566c518" (UID: "1b5b179f-f4fb-479c-9720-25587566c518"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.504103 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b5b179f-f4fb-479c-9720-25587566c518-kube-api-access-pqzfd" (OuterVolumeSpecName: "kube-api-access-pqzfd") pod "1b5b179f-f4fb-479c-9720-25587566c518" (UID: "1b5b179f-f4fb-479c-9720-25587566c518"). InnerVolumeSpecName "kube-api-access-pqzfd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.510041 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "mysql-db") pod "1b5b179f-f4fb-479c-9720-25587566c518" (UID: "1b5b179f-f4fb-479c-9720-25587566c518"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.538795 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.539933 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b5b179f-f4fb-479c-9720-25587566c518-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b5b179f-f4fb-479c-9720-25587566c518" (UID: "1b5b179f-f4fb-479c-9720-25587566c518"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.541835 4825 scope.go:117] "RemoveContainer" containerID="4ceda535e48bc00d5f96a25e503044a292df24630498d86c8a415946cacf0904" Mar 10 07:09:31 crc kubenswrapper[4825]: E0310 07:09:31.543484 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ceda535e48bc00d5f96a25e503044a292df24630498d86c8a415946cacf0904\": container with ID starting with 4ceda535e48bc00d5f96a25e503044a292df24630498d86c8a415946cacf0904 not found: ID does not exist" containerID="4ceda535e48bc00d5f96a25e503044a292df24630498d86c8a415946cacf0904" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.543538 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ceda535e48bc00d5f96a25e503044a292df24630498d86c8a415946cacf0904"} err="failed to get container status \"4ceda535e48bc00d5f96a25e503044a292df24630498d86c8a415946cacf0904\": rpc error: code = NotFound desc = could not find container \"4ceda535e48bc00d5f96a25e503044a292df24630498d86c8a415946cacf0904\": container with ID starting with 4ceda535e48bc00d5f96a25e503044a292df24630498d86c8a415946cacf0904 not found: ID does not exist" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.543559 4825 scope.go:117] "RemoveContainer" containerID="6a4f962505809098311731c7df1e9e9acaa99f0d4d2a16ebefd3083955087398" Mar 10 07:09:31 crc kubenswrapper[4825]: E0310 07:09:31.543868 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a4f962505809098311731c7df1e9e9acaa99f0d4d2a16ebefd3083955087398\": container with ID starting with 6a4f962505809098311731c7df1e9e9acaa99f0d4d2a16ebefd3083955087398 not found: ID does not exist" containerID="6a4f962505809098311731c7df1e9e9acaa99f0d4d2a16ebefd3083955087398" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.543893 
4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a4f962505809098311731c7df1e9e9acaa99f0d4d2a16ebefd3083955087398"} err="failed to get container status \"6a4f962505809098311731c7df1e9e9acaa99f0d4d2a16ebefd3083955087398\": rpc error: code = NotFound desc = could not find container \"6a4f962505809098311731c7df1e9e9acaa99f0d4d2a16ebefd3083955087398\": container with ID starting with 6a4f962505809098311731c7df1e9e9acaa99f0d4d2a16ebefd3083955087398 not found: ID does not exist" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.543929 4825 scope.go:117] "RemoveContainer" containerID="a5a7ebb2da38af819e648d6e5ad4c63fc12ab758ecc1f1178b06bc2d53a663b1" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.557469 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b5b179f-f4fb-479c-9720-25587566c518-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "1b5b179f-f4fb-479c-9720-25587566c518" (UID: "1b5b179f-f4fb-479c-9720-25587566c518"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.578216 4825 scope.go:117] "RemoveContainer" containerID="23d5b7cd25a646bc4179b5e41cb642aa99ae3019bdbfa362dc0578d525c4bfdf" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.594333 4825 scope.go:117] "RemoveContainer" containerID="4c5195f2b444f93836dcb7df7bcc67ecb17ab72fc077c910233e60c95d78edb3" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.599686 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9ba0e3ee-1309-4411-a927-866b35c2776b-rabbitmq-tls\") pod \"9ba0e3ee-1309-4411-a927-866b35c2776b\" (UID: \"9ba0e3ee-1309-4411-a927-866b35c2776b\") " Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.599720 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9ba0e3ee-1309-4411-a927-866b35c2776b-erlang-cookie-secret\") pod \"9ba0e3ee-1309-4411-a927-866b35c2776b\" (UID: \"9ba0e3ee-1309-4411-a927-866b35c2776b\") " Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.599768 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9ba0e3ee-1309-4411-a927-866b35c2776b-config-data\") pod \"9ba0e3ee-1309-4411-a927-866b35c2776b\" (UID: \"9ba0e3ee-1309-4411-a927-866b35c2776b\") " Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.599795 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9ba0e3ee-1309-4411-a927-866b35c2776b-rabbitmq-plugins\") pod \"9ba0e3ee-1309-4411-a927-866b35c2776b\" (UID: \"9ba0e3ee-1309-4411-a927-866b35c2776b\") " Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.599850 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"9ba0e3ee-1309-4411-a927-866b35c2776b\" (UID: \"9ba0e3ee-1309-4411-a927-866b35c2776b\") " Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.599938 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9ba0e3ee-1309-4411-a927-866b35c2776b-pod-info\") pod \"9ba0e3ee-1309-4411-a927-866b35c2776b\" (UID: \"9ba0e3ee-1309-4411-a927-866b35c2776b\") " Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.599962 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9ba0e3ee-1309-4411-a927-866b35c2776b-server-conf\") pod \"9ba0e3ee-1309-4411-a927-866b35c2776b\" (UID: \"9ba0e3ee-1309-4411-a927-866b35c2776b\") " Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.600501 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ba0e3ee-1309-4411-a927-866b35c2776b-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "9ba0e3ee-1309-4411-a927-866b35c2776b" (UID: "9ba0e3ee-1309-4411-a927-866b35c2776b"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.600831 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9ba0e3ee-1309-4411-a927-866b35c2776b-rabbitmq-confd\") pod \"9ba0e3ee-1309-4411-a927-866b35c2776b\" (UID: \"9ba0e3ee-1309-4411-a927-866b35c2776b\") " Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.600872 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9ba0e3ee-1309-4411-a927-866b35c2776b-plugins-conf\") pod \"9ba0e3ee-1309-4411-a927-866b35c2776b\" (UID: \"9ba0e3ee-1309-4411-a927-866b35c2776b\") " Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.600902 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7t4t\" (UniqueName: \"kubernetes.io/projected/9ba0e3ee-1309-4411-a927-866b35c2776b-kube-api-access-d7t4t\") pod \"9ba0e3ee-1309-4411-a927-866b35c2776b\" (UID: \"9ba0e3ee-1309-4411-a927-866b35c2776b\") " Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.600926 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9ba0e3ee-1309-4411-a927-866b35c2776b-rabbitmq-erlang-cookie\") pod \"9ba0e3ee-1309-4411-a927-866b35c2776b\" (UID: \"9ba0e3ee-1309-4411-a927-866b35c2776b\") " Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.601249 4825 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1b5b179f-f4fb-479c-9720-25587566c518-config-data-default\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.601260 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1b5b179f-f4fb-479c-9720-25587566c518-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.601269 4825 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1b5b179f-f4fb-479c-9720-25587566c518-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.601278 4825 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1b5b179f-f4fb-479c-9720-25587566c518-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.601286 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b5b179f-f4fb-479c-9720-25587566c518-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.601294 4825 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b5b179f-f4fb-479c-9720-25587566c518-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.601312 4825 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.601321 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqzfd\" (UniqueName: \"kubernetes.io/projected/1b5b179f-f4fb-479c-9720-25587566c518-kube-api-access-pqzfd\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.603259 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ba0e3ee-1309-4411-a927-866b35c2776b-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod 
"9ba0e3ee-1309-4411-a927-866b35c2776b" (UID: "9ba0e3ee-1309-4411-a927-866b35c2776b"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.603299 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ba0e3ee-1309-4411-a927-866b35c2776b-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "9ba0e3ee-1309-4411-a927-866b35c2776b" (UID: "9ba0e3ee-1309-4411-a927-866b35c2776b"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.606452 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ba0e3ee-1309-4411-a927-866b35c2776b-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "9ba0e3ee-1309-4411-a927-866b35c2776b" (UID: "9ba0e3ee-1309-4411-a927-866b35c2776b"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.606556 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ba0e3ee-1309-4411-a927-866b35c2776b-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "9ba0e3ee-1309-4411-a927-866b35c2776b" (UID: "9ba0e3ee-1309-4411-a927-866b35c2776b"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.608361 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "9ba0e3ee-1309-4411-a927-866b35c2776b" (UID: "9ba0e3ee-1309-4411-a927-866b35c2776b"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.608715 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/9ba0e3ee-1309-4411-a927-866b35c2776b-pod-info" (OuterVolumeSpecName: "pod-info") pod "9ba0e3ee-1309-4411-a927-866b35c2776b" (UID: "9ba0e3ee-1309-4411-a927-866b35c2776b"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.626783 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ba0e3ee-1309-4411-a927-866b35c2776b-config-data" (OuterVolumeSpecName: "config-data") pod "9ba0e3ee-1309-4411-a927-866b35c2776b" (UID: "9ba0e3ee-1309-4411-a927-866b35c2776b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.630203 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ba0e3ee-1309-4411-a927-866b35c2776b-kube-api-access-d7t4t" (OuterVolumeSpecName: "kube-api-access-d7t4t") pod "9ba0e3ee-1309-4411-a927-866b35c2776b" (UID: "9ba0e3ee-1309-4411-a927-866b35c2776b"). InnerVolumeSpecName "kube-api-access-d7t4t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.641477 4825 scope.go:117] "RemoveContainer" containerID="a197f76cf3bde62283e41aa4013c9afdb9aa3afe6880137cee79408f1cdb53ae" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.642552 4825 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.661397 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ba0e3ee-1309-4411-a927-866b35c2776b-server-conf" (OuterVolumeSpecName: "server-conf") pod "9ba0e3ee-1309-4411-a927-866b35c2776b" (UID: "9ba0e3ee-1309-4411-a927-866b35c2776b"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.689906 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ba0e3ee-1309-4411-a927-866b35c2776b-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "9ba0e3ee-1309-4411-a927-866b35c2776b" (UID: "9ba0e3ee-1309-4411-a927-866b35c2776b"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.706883 4825 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9ba0e3ee-1309-4411-a927-866b35c2776b-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.706933 4825 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9ba0e3ee-1309-4411-a927-866b35c2776b-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.706948 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9ba0e3ee-1309-4411-a927-866b35c2776b-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.706962 4825 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9ba0e3ee-1309-4411-a927-866b35c2776b-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.707000 4825 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.707013 4825 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9ba0e3ee-1309-4411-a927-866b35c2776b-pod-info\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.707028 4825 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9ba0e3ee-1309-4411-a927-866b35c2776b-server-conf\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.707041 4825 reconciler_common.go:293] 
"Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9ba0e3ee-1309-4411-a927-866b35c2776b-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.707057 4825 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9ba0e3ee-1309-4411-a927-866b35c2776b-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.707072 4825 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.707090 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7t4t\" (UniqueName: \"kubernetes.io/projected/9ba0e3ee-1309-4411-a927-866b35c2776b-kube-api-access-d7t4t\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.707103 4825 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9ba0e3ee-1309-4411-a927-866b35c2776b-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.738016 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.738756 4825 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.807793 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"40efa241-98cc-4dec-9ae8-8a892b367ebc\" (UID: \"40efa241-98cc-4dec-9ae8-8a892b367ebc\") " Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.807907 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2nzn\" (UniqueName: \"kubernetes.io/projected/40efa241-98cc-4dec-9ae8-8a892b367ebc-kube-api-access-p2nzn\") pod \"40efa241-98cc-4dec-9ae8-8a892b367ebc\" (UID: \"40efa241-98cc-4dec-9ae8-8a892b367ebc\") " Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.807945 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/40efa241-98cc-4dec-9ae8-8a892b367ebc-config-data\") pod \"40efa241-98cc-4dec-9ae8-8a892b367ebc\" (UID: \"40efa241-98cc-4dec-9ae8-8a892b367ebc\") " Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.807969 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/40efa241-98cc-4dec-9ae8-8a892b367ebc-rabbitmq-plugins\") pod \"40efa241-98cc-4dec-9ae8-8a892b367ebc\" (UID: \"40efa241-98cc-4dec-9ae8-8a892b367ebc\") " Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.808013 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/40efa241-98cc-4dec-9ae8-8a892b367ebc-server-conf\") pod \"40efa241-98cc-4dec-9ae8-8a892b367ebc\" (UID: 
\"40efa241-98cc-4dec-9ae8-8a892b367ebc\") " Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.808067 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/40efa241-98cc-4dec-9ae8-8a892b367ebc-erlang-cookie-secret\") pod \"40efa241-98cc-4dec-9ae8-8a892b367ebc\" (UID: \"40efa241-98cc-4dec-9ae8-8a892b367ebc\") " Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.808096 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/40efa241-98cc-4dec-9ae8-8a892b367ebc-rabbitmq-erlang-cookie\") pod \"40efa241-98cc-4dec-9ae8-8a892b367ebc\" (UID: \"40efa241-98cc-4dec-9ae8-8a892b367ebc\") " Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.808126 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/40efa241-98cc-4dec-9ae8-8a892b367ebc-rabbitmq-tls\") pod \"40efa241-98cc-4dec-9ae8-8a892b367ebc\" (UID: \"40efa241-98cc-4dec-9ae8-8a892b367ebc\") " Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.808181 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/40efa241-98cc-4dec-9ae8-8a892b367ebc-rabbitmq-confd\") pod \"40efa241-98cc-4dec-9ae8-8a892b367ebc\" (UID: \"40efa241-98cc-4dec-9ae8-8a892b367ebc\") " Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.808207 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/40efa241-98cc-4dec-9ae8-8a892b367ebc-pod-info\") pod \"40efa241-98cc-4dec-9ae8-8a892b367ebc\" (UID: \"40efa241-98cc-4dec-9ae8-8a892b367ebc\") " Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.808231 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" 
(UniqueName: \"kubernetes.io/configmap/40efa241-98cc-4dec-9ae8-8a892b367ebc-plugins-conf\") pod \"40efa241-98cc-4dec-9ae8-8a892b367ebc\" (UID: \"40efa241-98cc-4dec-9ae8-8a892b367ebc\") " Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.808447 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40efa241-98cc-4dec-9ae8-8a892b367ebc-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "40efa241-98cc-4dec-9ae8-8a892b367ebc" (UID: "40efa241-98cc-4dec-9ae8-8a892b367ebc"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.808805 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40efa241-98cc-4dec-9ae8-8a892b367ebc-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "40efa241-98cc-4dec-9ae8-8a892b367ebc" (UID: "40efa241-98cc-4dec-9ae8-8a892b367ebc"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.809282 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40efa241-98cc-4dec-9ae8-8a892b367ebc-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "40efa241-98cc-4dec-9ae8-8a892b367ebc" (UID: "40efa241-98cc-4dec-9ae8-8a892b367ebc"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.810643 4825 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.810668 4825 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/40efa241-98cc-4dec-9ae8-8a892b367ebc-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.810678 4825 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/40efa241-98cc-4dec-9ae8-8a892b367ebc-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.810687 4825 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/40efa241-98cc-4dec-9ae8-8a892b367ebc-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.811665 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40efa241-98cc-4dec-9ae8-8a892b367ebc-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "40efa241-98cc-4dec-9ae8-8a892b367ebc" (UID: "40efa241-98cc-4dec-9ae8-8a892b367ebc"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.817712 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40efa241-98cc-4dec-9ae8-8a892b367ebc-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "40efa241-98cc-4dec-9ae8-8a892b367ebc" (UID: "40efa241-98cc-4dec-9ae8-8a892b367ebc"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.820615 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40efa241-98cc-4dec-9ae8-8a892b367ebc-kube-api-access-p2nzn" (OuterVolumeSpecName: "kube-api-access-p2nzn") pod "40efa241-98cc-4dec-9ae8-8a892b367ebc" (UID: "40efa241-98cc-4dec-9ae8-8a892b367ebc"). InnerVolumeSpecName "kube-api-access-p2nzn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.820734 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/40efa241-98cc-4dec-9ae8-8a892b367ebc-pod-info" (OuterVolumeSpecName: "pod-info") pod "40efa241-98cc-4dec-9ae8-8a892b367ebc" (UID: "40efa241-98cc-4dec-9ae8-8a892b367ebc"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.820784 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "40efa241-98cc-4dec-9ae8-8a892b367ebc" (UID: "40efa241-98cc-4dec-9ae8-8a892b367ebc"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.825632 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40efa241-98cc-4dec-9ae8-8a892b367ebc-config-data" (OuterVolumeSpecName: "config-data") pod "40efa241-98cc-4dec-9ae8-8a892b367ebc" (UID: "40efa241-98cc-4dec-9ae8-8a892b367ebc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.853090 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40efa241-98cc-4dec-9ae8-8a892b367ebc-server-conf" (OuterVolumeSpecName: "server-conf") pod "40efa241-98cc-4dec-9ae8-8a892b367ebc" (UID: "40efa241-98cc-4dec-9ae8-8a892b367ebc"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.881173 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40efa241-98cc-4dec-9ae8-8a892b367ebc-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "40efa241-98cc-4dec-9ae8-8a892b367ebc" (UID: "40efa241-98cc-4dec-9ae8-8a892b367ebc"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.911836 4825 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/40efa241-98cc-4dec-9ae8-8a892b367ebc-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.912181 4825 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/40efa241-98cc-4dec-9ae8-8a892b367ebc-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.912195 4825 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/40efa241-98cc-4dec-9ae8-8a892b367ebc-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.912206 4825 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/40efa241-98cc-4dec-9ae8-8a892b367ebc-pod-info\") on node \"crc\" DevicePath \"\"" 
Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.912241 4825 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.912252 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2nzn\" (UniqueName: \"kubernetes.io/projected/40efa241-98cc-4dec-9ae8-8a892b367ebc-kube-api-access-p2nzn\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.912264 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/40efa241-98cc-4dec-9ae8-8a892b367ebc-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.912274 4825 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/40efa241-98cc-4dec-9ae8-8a892b367ebc-server-conf\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:31 crc kubenswrapper[4825]: I0310 07:09:31.926107 4825 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.012812 4825 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.205021 4825 generic.go:334] "Generic (PLEG): container finished" podID="40efa241-98cc-4dec-9ae8-8a892b367ebc" containerID="a98a5e36d3a3124ffe2ffa323d51e71ad524e44ce610050ac778bcba5e77fd17" exitCode=0 Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.205085 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"40efa241-98cc-4dec-9ae8-8a892b367ebc","Type":"ContainerDied","Data":"a98a5e36d3a3124ffe2ffa323d51e71ad524e44ce610050ac778bcba5e77fd17"} Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.205110 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"40efa241-98cc-4dec-9ae8-8a892b367ebc","Type":"ContainerDied","Data":"f182dd5eece1a0a99b1b33073d409d5bdc7dbaa561efb7a7b4099bcd91d756b7"} Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.205150 4825 scope.go:117] "RemoveContainer" containerID="a98a5e36d3a3124ffe2ffa323d51e71ad524e44ce610050ac778bcba5e77fd17" Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.205296 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.222330 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9ba0e3ee-1309-4411-a927-866b35c2776b","Type":"ContainerDied","Data":"21bde652f73dcd32c4af6e16297b56cef02a99aea3f4f774ca6588596babae0f"} Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.222509 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.284833 4825 scope.go:117] "RemoveContainer" containerID="afabc77fd2eb1f467ac3dc9c5bc7ab28337af9e10ce2b6cf66724e5f31f41951" Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.292664 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"1b5b179f-f4fb-479c-9720-25587566c518","Type":"ContainerDied","Data":"c951929b4cab3e3155d3ea8f2442bc5d1e335eb49649bc66959b618a6bb7f4fb"} Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.292767 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.313404 4825 generic.go:334] "Generic (PLEG): container finished" podID="86dc5ac4-dac8-4fdf-bba9-06d13efacd53" containerID="878047a39225dab631a6decdc7de8a0629c77fdf4b999f47c2c1d0353e8c9e86" exitCode=0 Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.313499 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-599df5898d-bqcpr" event={"ID":"86dc5ac4-dac8-4fdf-bba9-06d13efacd53","Type":"ContainerDied","Data":"878047a39225dab631a6decdc7de8a0629c77fdf4b999f47c2c1d0353e8c9e86"} Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.371611 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.432717 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.439900 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.440867 4825 scope.go:117] "RemoveContainer" containerID="a98a5e36d3a3124ffe2ffa323d51e71ad524e44ce610050ac778bcba5e77fd17" Mar 10 07:09:32 crc kubenswrapper[4825]: E0310 07:09:32.441404 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a98a5e36d3a3124ffe2ffa323d51e71ad524e44ce610050ac778bcba5e77fd17\": container with ID starting with a98a5e36d3a3124ffe2ffa323d51e71ad524e44ce610050ac778bcba5e77fd17 not found: ID does not exist" containerID="a98a5e36d3a3124ffe2ffa323d51e71ad524e44ce610050ac778bcba5e77fd17" Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.441464 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a98a5e36d3a3124ffe2ffa323d51e71ad524e44ce610050ac778bcba5e77fd17"} err="failed to get container 
status \"a98a5e36d3a3124ffe2ffa323d51e71ad524e44ce610050ac778bcba5e77fd17\": rpc error: code = NotFound desc = could not find container \"a98a5e36d3a3124ffe2ffa323d51e71ad524e44ce610050ac778bcba5e77fd17\": container with ID starting with a98a5e36d3a3124ffe2ffa323d51e71ad524e44ce610050ac778bcba5e77fd17 not found: ID does not exist" Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.441487 4825 scope.go:117] "RemoveContainer" containerID="afabc77fd2eb1f467ac3dc9c5bc7ab28337af9e10ce2b6cf66724e5f31f41951" Mar 10 07:09:32 crc kubenswrapper[4825]: E0310 07:09:32.446126 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afabc77fd2eb1f467ac3dc9c5bc7ab28337af9e10ce2b6cf66724e5f31f41951\": container with ID starting with afabc77fd2eb1f467ac3dc9c5bc7ab28337af9e10ce2b6cf66724e5f31f41951 not found: ID does not exist" containerID="afabc77fd2eb1f467ac3dc9c5bc7ab28337af9e10ce2b6cf66724e5f31f41951" Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.446179 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afabc77fd2eb1f467ac3dc9c5bc7ab28337af9e10ce2b6cf66724e5f31f41951"} err="failed to get container status \"afabc77fd2eb1f467ac3dc9c5bc7ab28337af9e10ce2b6cf66724e5f31f41951\": rpc error: code = NotFound desc = could not find container \"afabc77fd2eb1f467ac3dc9c5bc7ab28337af9e10ce2b6cf66724e5f31f41951\": container with ID starting with afabc77fd2eb1f467ac3dc9c5bc7ab28337af9e10ce2b6cf66724e5f31f41951 not found: ID does not exist" Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.446194 4825 scope.go:117] "RemoveContainer" containerID="33787b88c881e69ba026aff5962a5455c1e4b2f94327fcc119388ebe1cd30211" Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.462307 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.496424 4825 scope.go:117] 
"RemoveContainer" containerID="85a2b6ae62f4562edc4361ee48380de641fa71091bc57a5172633641c40144a7" Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.497612 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.503491 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.525567 4825 scope.go:117] "RemoveContainer" containerID="c5251bd347e4e2a0285d072a7e65894076feb5953eaeecf9032c40e2a8952418" Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.553056 4825 scope.go:117] "RemoveContainer" containerID="8f908b02b36a68508bfbba63d180143eca8f4c5bb9968039511ca87e022f56c8" Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.571284 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-599df5898d-bqcpr" Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.639291 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t87rc\" (UniqueName: \"kubernetes.io/projected/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-kube-api-access-t87rc\") pod \"86dc5ac4-dac8-4fdf-bba9-06d13efacd53\" (UID: \"86dc5ac4-dac8-4fdf-bba9-06d13efacd53\") " Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.639352 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-public-tls-certs\") pod \"86dc5ac4-dac8-4fdf-bba9-06d13efacd53\" (UID: \"86dc5ac4-dac8-4fdf-bba9-06d13efacd53\") " Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.639382 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-credential-keys\") pod \"86dc5ac4-dac8-4fdf-bba9-06d13efacd53\" (UID: 
\"86dc5ac4-dac8-4fdf-bba9-06d13efacd53\") " Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.639413 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-config-data\") pod \"86dc5ac4-dac8-4fdf-bba9-06d13efacd53\" (UID: \"86dc5ac4-dac8-4fdf-bba9-06d13efacd53\") " Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.639448 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-internal-tls-certs\") pod \"86dc5ac4-dac8-4fdf-bba9-06d13efacd53\" (UID: \"86dc5ac4-dac8-4fdf-bba9-06d13efacd53\") " Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.639521 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-scripts\") pod \"86dc5ac4-dac8-4fdf-bba9-06d13efacd53\" (UID: \"86dc5ac4-dac8-4fdf-bba9-06d13efacd53\") " Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.639567 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-fernet-keys\") pod \"86dc5ac4-dac8-4fdf-bba9-06d13efacd53\" (UID: \"86dc5ac4-dac8-4fdf-bba9-06d13efacd53\") " Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.639602 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-combined-ca-bundle\") pod \"86dc5ac4-dac8-4fdf-bba9-06d13efacd53\" (UID: \"86dc5ac4-dac8-4fdf-bba9-06d13efacd53\") " Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.643312 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "86dc5ac4-dac8-4fdf-bba9-06d13efacd53" (UID: "86dc5ac4-dac8-4fdf-bba9-06d13efacd53"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.644032 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-kube-api-access-t87rc" (OuterVolumeSpecName: "kube-api-access-t87rc") pod "86dc5ac4-dac8-4fdf-bba9-06d13efacd53" (UID: "86dc5ac4-dac8-4fdf-bba9-06d13efacd53"). InnerVolumeSpecName "kube-api-access-t87rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.644883 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-scripts" (OuterVolumeSpecName: "scripts") pod "86dc5ac4-dac8-4fdf-bba9-06d13efacd53" (UID: "86dc5ac4-dac8-4fdf-bba9-06d13efacd53"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.645574 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "86dc5ac4-dac8-4fdf-bba9-06d13efacd53" (UID: "86dc5ac4-dac8-4fdf-bba9-06d13efacd53"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.660412 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "86dc5ac4-dac8-4fdf-bba9-06d13efacd53" (UID: "86dc5ac4-dac8-4fdf-bba9-06d13efacd53"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.667437 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-config-data" (OuterVolumeSpecName: "config-data") pod "86dc5ac4-dac8-4fdf-bba9-06d13efacd53" (UID: "86dc5ac4-dac8-4fdf-bba9-06d13efacd53"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.686633 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "86dc5ac4-dac8-4fdf-bba9-06d13efacd53" (UID: "86dc5ac4-dac8-4fdf-bba9-06d13efacd53"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.692273 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "86dc5ac4-dac8-4fdf-bba9-06d13efacd53" (UID: "86dc5ac4-dac8-4fdf-bba9-06d13efacd53"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.740739 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t87rc\" (UniqueName: \"kubernetes.io/projected/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-kube-api-access-t87rc\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.740787 4825 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.740807 4825 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.740823 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.740841 4825 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.740857 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.740871 4825 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:32 crc kubenswrapper[4825]: I0310 07:09:32.740886 4825 
reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86dc5ac4-dac8-4fdf-bba9-06d13efacd53-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:33 crc kubenswrapper[4825]: I0310 07:09:33.256306 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b5b179f-f4fb-479c-9720-25587566c518" path="/var/lib/kubelet/pods/1b5b179f-f4fb-479c-9720-25587566c518/volumes" Mar 10 07:09:33 crc kubenswrapper[4825]: I0310 07:09:33.258261 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40efa241-98cc-4dec-9ae8-8a892b367ebc" path="/var/lib/kubelet/pods/40efa241-98cc-4dec-9ae8-8a892b367ebc/volumes" Mar 10 07:09:33 crc kubenswrapper[4825]: I0310 07:09:33.260588 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47d73d8a-acf6-42e6-a30d-e093144ee0b9" path="/var/lib/kubelet/pods/47d73d8a-acf6-42e6-a30d-e093144ee0b9/volumes" Mar 10 07:09:33 crc kubenswrapper[4825]: I0310 07:09:33.262116 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d4252ca-f279-4c95-8f10-205339d028a5" path="/var/lib/kubelet/pods/5d4252ca-f279-4c95-8f10-205339d028a5/volumes" Mar 10 07:09:33 crc kubenswrapper[4825]: I0310 07:09:33.263619 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85f5995b-4ad8-4840-82f1-6659152c3ed4" path="/var/lib/kubelet/pods/85f5995b-4ad8-4840-82f1-6659152c3ed4/volumes" Mar 10 07:09:33 crc kubenswrapper[4825]: I0310 07:09:33.266424 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ba0e3ee-1309-4411-a927-866b35c2776b" path="/var/lib/kubelet/pods/9ba0e3ee-1309-4411-a927-866b35c2776b/volumes" Mar 10 07:09:33 crc kubenswrapper[4825]: I0310 07:09:33.267864 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c684f3e4-ced9-49c3-aa54-515bc2c4fb56" path="/var/lib/kubelet/pods/c684f3e4-ced9-49c3-aa54-515bc2c4fb56/volumes" Mar 10 07:09:33 crc kubenswrapper[4825]: 
I0310 07:09:33.269270 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d77fced6-dee7-49fc-9088-e173a5be3cee" path="/var/lib/kubelet/pods/d77fced6-dee7-49fc-9088-e173a5be3cee/volumes" Mar 10 07:09:33 crc kubenswrapper[4825]: I0310 07:09:33.461257 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-599df5898d-bqcpr" event={"ID":"86dc5ac4-dac8-4fdf-bba9-06d13efacd53","Type":"ContainerDied","Data":"45d34bb2ca07c04238a805768d304ffe30e1488ed0f99c18fc8607e89c89570d"} Mar 10 07:09:33 crc kubenswrapper[4825]: I0310 07:09:33.461628 4825 scope.go:117] "RemoveContainer" containerID="878047a39225dab631a6decdc7de8a0629c77fdf4b999f47c2c1d0353e8c9e86" Mar 10 07:09:33 crc kubenswrapper[4825]: I0310 07:09:33.461345 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-599df5898d-bqcpr" Mar 10 07:09:33 crc kubenswrapper[4825]: I0310 07:09:33.492197 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-599df5898d-bqcpr"] Mar 10 07:09:33 crc kubenswrapper[4825]: I0310 07:09:33.495481 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-599df5898d-bqcpr"] Mar 10 07:09:33 crc kubenswrapper[4825]: I0310 07:09:33.589324 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="04a2aca4-f98f-4ae8-aca9-62ae6625e5e2" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.0.206:8081/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 07:09:34 crc kubenswrapper[4825]: E0310 07:09:34.339474 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dbdb4c597ea9262b0a3b7810260c5ec8336d5f6419c63f838f27d493f6678134 is running failed: container process not found" 
containerID="dbdb4c597ea9262b0a3b7810260c5ec8336d5f6419c63f838f27d493f6678134" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 07:09:34 crc kubenswrapper[4825]: E0310 07:09:34.340001 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dbdb4c597ea9262b0a3b7810260c5ec8336d5f6419c63f838f27d493f6678134 is running failed: container process not found" containerID="dbdb4c597ea9262b0a3b7810260c5ec8336d5f6419c63f838f27d493f6678134" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 07:09:34 crc kubenswrapper[4825]: E0310 07:09:34.340541 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dbdb4c597ea9262b0a3b7810260c5ec8336d5f6419c63f838f27d493f6678134 is running failed: container process not found" containerID="dbdb4c597ea9262b0a3b7810260c5ec8336d5f6419c63f838f27d493f6678134" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 07:09:34 crc kubenswrapper[4825]: E0310 07:09:34.340614 4825 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dbdb4c597ea9262b0a3b7810260c5ec8336d5f6419c63f838f27d493f6678134 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-8x4pm" podUID="f04e11a7-e387-4e51-b878-8633ca528b1a" containerName="ovsdb-server" Mar 10 07:09:34 crc kubenswrapper[4825]: E0310 07:09:34.342612 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1f073cb0d55b119007da7e16a09131032c9832ee0f4faf6e839adba2f4de4803" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 07:09:34 crc kubenswrapper[4825]: E0310 
07:09:34.349278 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1f073cb0d55b119007da7e16a09131032c9832ee0f4faf6e839adba2f4de4803" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 07:09:34 crc kubenswrapper[4825]: E0310 07:09:34.351924 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1f073cb0d55b119007da7e16a09131032c9832ee0f4faf6e839adba2f4de4803" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 07:09:34 crc kubenswrapper[4825]: E0310 07:09:34.351984 4825 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-8x4pm" podUID="f04e11a7-e387-4e51-b878-8633ca528b1a" containerName="ovs-vswitchd" Mar 10 07:09:35 crc kubenswrapper[4825]: I0310 07:09:35.249381 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86dc5ac4-dac8-4fdf-bba9-06d13efacd53" path="/var/lib/kubelet/pods/86dc5ac4-dac8-4fdf-bba9-06d13efacd53/volumes" Mar 10 07:09:35 crc kubenswrapper[4825]: I0310 07:09:35.587817 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-67bc54d95c-r8n6n" podUID="f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.175:9696/\": dial tcp 10.217.0.175:9696: connect: connection refused" Mar 10 07:09:39 crc kubenswrapper[4825]: E0310 07:09:39.340984 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
dbdb4c597ea9262b0a3b7810260c5ec8336d5f6419c63f838f27d493f6678134 is running failed: container process not found" containerID="dbdb4c597ea9262b0a3b7810260c5ec8336d5f6419c63f838f27d493f6678134" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 07:09:39 crc kubenswrapper[4825]: E0310 07:09:39.343635 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1f073cb0d55b119007da7e16a09131032c9832ee0f4faf6e839adba2f4de4803" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 07:09:39 crc kubenswrapper[4825]: E0310 07:09:39.343704 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dbdb4c597ea9262b0a3b7810260c5ec8336d5f6419c63f838f27d493f6678134 is running failed: container process not found" containerID="dbdb4c597ea9262b0a3b7810260c5ec8336d5f6419c63f838f27d493f6678134" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 07:09:39 crc kubenswrapper[4825]: E0310 07:09:39.344589 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dbdb4c597ea9262b0a3b7810260c5ec8336d5f6419c63f838f27d493f6678134 is running failed: container process not found" containerID="dbdb4c597ea9262b0a3b7810260c5ec8336d5f6419c63f838f27d493f6678134" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 07:09:39 crc kubenswrapper[4825]: E0310 07:09:39.344712 4825 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dbdb4c597ea9262b0a3b7810260c5ec8336d5f6419c63f838f27d493f6678134 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-8x4pm" 
podUID="f04e11a7-e387-4e51-b878-8633ca528b1a" containerName="ovsdb-server" Mar 10 07:09:39 crc kubenswrapper[4825]: E0310 07:09:39.346463 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1f073cb0d55b119007da7e16a09131032c9832ee0f4faf6e839adba2f4de4803" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 07:09:39 crc kubenswrapper[4825]: E0310 07:09:39.369987 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1f073cb0d55b119007da7e16a09131032c9832ee0f4faf6e839adba2f4de4803" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 07:09:39 crc kubenswrapper[4825]: E0310 07:09:39.370057 4825 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-8x4pm" podUID="f04e11a7-e387-4e51-b878-8633ca528b1a" containerName="ovs-vswitchd" Mar 10 07:09:43 crc kubenswrapper[4825]: I0310 07:09:43.082786 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-67bc54d95c-r8n6n" Mar 10 07:09:43 crc kubenswrapper[4825]: I0310 07:09:43.132222 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6-public-tls-certs\") pod \"f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6\" (UID: \"f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6\") " Mar 10 07:09:43 crc kubenswrapper[4825]: I0310 07:09:43.132305 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6-internal-tls-certs\") pod \"f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6\" (UID: \"f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6\") " Mar 10 07:09:43 crc kubenswrapper[4825]: I0310 07:09:43.132346 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6-ovndb-tls-certs\") pod \"f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6\" (UID: \"f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6\") " Mar 10 07:09:43 crc kubenswrapper[4825]: I0310 07:09:43.132404 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6-config\") pod \"f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6\" (UID: \"f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6\") " Mar 10 07:09:43 crc kubenswrapper[4825]: I0310 07:09:43.132434 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6-httpd-config\") pod \"f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6\" (UID: \"f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6\") " Mar 10 07:09:43 crc kubenswrapper[4825]: I0310 07:09:43.132473 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlznw\" (UniqueName: 
\"kubernetes.io/projected/f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6-kube-api-access-mlznw\") pod \"f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6\" (UID: \"f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6\") " Mar 10 07:09:43 crc kubenswrapper[4825]: I0310 07:09:43.132507 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6-combined-ca-bundle\") pod \"f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6\" (UID: \"f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6\") " Mar 10 07:09:43 crc kubenswrapper[4825]: I0310 07:09:43.140501 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6" (UID: "f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:43 crc kubenswrapper[4825]: I0310 07:09:43.143261 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6-kube-api-access-mlznw" (OuterVolumeSpecName: "kube-api-access-mlznw") pod "f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6" (UID: "f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6"). InnerVolumeSpecName "kube-api-access-mlznw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:09:43 crc kubenswrapper[4825]: I0310 07:09:43.188733 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6" (UID: "f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:43 crc kubenswrapper[4825]: I0310 07:09:43.189201 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6" (UID: "f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:43 crc kubenswrapper[4825]: I0310 07:09:43.191358 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6" (UID: "f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:43 crc kubenswrapper[4825]: I0310 07:09:43.191787 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6-config" (OuterVolumeSpecName: "config") pod "f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6" (UID: "f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:43 crc kubenswrapper[4825]: I0310 07:09:43.210406 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6" (UID: "f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:43 crc kubenswrapper[4825]: I0310 07:09:43.235042 4825 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:43 crc kubenswrapper[4825]: I0310 07:09:43.235067 4825 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:43 crc kubenswrapper[4825]: I0310 07:09:43.235076 4825 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:43 crc kubenswrapper[4825]: I0310 07:09:43.235084 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6-config\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:43 crc kubenswrapper[4825]: I0310 07:09:43.235095 4825 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:43 crc kubenswrapper[4825]: I0310 07:09:43.235104 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlznw\" (UniqueName: \"kubernetes.io/projected/f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6-kube-api-access-mlznw\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:43 crc kubenswrapper[4825]: I0310 07:09:43.235115 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:43 crc kubenswrapper[4825]: I0310 07:09:43.568560 4825 
generic.go:334] "Generic (PLEG): container finished" podID="f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6" containerID="2281ab9c75c402ac8b53f6007d3f795cf1ff09aad4c21d1c2d1f97b2f04c2605" exitCode=0 Mar 10 07:09:43 crc kubenswrapper[4825]: I0310 07:09:43.568628 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67bc54d95c-r8n6n" event={"ID":"f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6","Type":"ContainerDied","Data":"2281ab9c75c402ac8b53f6007d3f795cf1ff09aad4c21d1c2d1f97b2f04c2605"} Mar 10 07:09:43 crc kubenswrapper[4825]: I0310 07:09:43.568683 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67bc54d95c-r8n6n" event={"ID":"f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6","Type":"ContainerDied","Data":"e38d9882bdf6e2fd0a27032188b5caf3fbe81d2335e92cda553f3e1084342906"} Mar 10 07:09:43 crc kubenswrapper[4825]: I0310 07:09:43.568712 4825 scope.go:117] "RemoveContainer" containerID="3ad7566f82944df07d73d40d26dee49f887701e2a6b35d333784838aa475be9b" Mar 10 07:09:43 crc kubenswrapper[4825]: I0310 07:09:43.569307 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-67bc54d95c-r8n6n" Mar 10 07:09:43 crc kubenswrapper[4825]: I0310 07:09:43.662511 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-67bc54d95c-r8n6n"] Mar 10 07:09:43 crc kubenswrapper[4825]: I0310 07:09:43.674057 4825 scope.go:117] "RemoveContainer" containerID="2281ab9c75c402ac8b53f6007d3f795cf1ff09aad4c21d1c2d1f97b2f04c2605" Mar 10 07:09:43 crc kubenswrapper[4825]: I0310 07:09:43.675551 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-67bc54d95c-r8n6n"] Mar 10 07:09:43 crc kubenswrapper[4825]: I0310 07:09:43.697869 4825 scope.go:117] "RemoveContainer" containerID="3ad7566f82944df07d73d40d26dee49f887701e2a6b35d333784838aa475be9b" Mar 10 07:09:43 crc kubenswrapper[4825]: E0310 07:09:43.698480 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ad7566f82944df07d73d40d26dee49f887701e2a6b35d333784838aa475be9b\": container with ID starting with 3ad7566f82944df07d73d40d26dee49f887701e2a6b35d333784838aa475be9b not found: ID does not exist" containerID="3ad7566f82944df07d73d40d26dee49f887701e2a6b35d333784838aa475be9b" Mar 10 07:09:43 crc kubenswrapper[4825]: I0310 07:09:43.698531 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ad7566f82944df07d73d40d26dee49f887701e2a6b35d333784838aa475be9b"} err="failed to get container status \"3ad7566f82944df07d73d40d26dee49f887701e2a6b35d333784838aa475be9b\": rpc error: code = NotFound desc = could not find container \"3ad7566f82944df07d73d40d26dee49f887701e2a6b35d333784838aa475be9b\": container with ID starting with 3ad7566f82944df07d73d40d26dee49f887701e2a6b35d333784838aa475be9b not found: ID does not exist" Mar 10 07:09:43 crc kubenswrapper[4825]: I0310 07:09:43.698570 4825 scope.go:117] "RemoveContainer" containerID="2281ab9c75c402ac8b53f6007d3f795cf1ff09aad4c21d1c2d1f97b2f04c2605" Mar 10 07:09:43 
crc kubenswrapper[4825]: E0310 07:09:43.699118 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2281ab9c75c402ac8b53f6007d3f795cf1ff09aad4c21d1c2d1f97b2f04c2605\": container with ID starting with 2281ab9c75c402ac8b53f6007d3f795cf1ff09aad4c21d1c2d1f97b2f04c2605 not found: ID does not exist" containerID="2281ab9c75c402ac8b53f6007d3f795cf1ff09aad4c21d1c2d1f97b2f04c2605" Mar 10 07:09:43 crc kubenswrapper[4825]: I0310 07:09:43.699176 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2281ab9c75c402ac8b53f6007d3f795cf1ff09aad4c21d1c2d1f97b2f04c2605"} err="failed to get container status \"2281ab9c75c402ac8b53f6007d3f795cf1ff09aad4c21d1c2d1f97b2f04c2605\": rpc error: code = NotFound desc = could not find container \"2281ab9c75c402ac8b53f6007d3f795cf1ff09aad4c21d1c2d1f97b2f04c2605\": container with ID starting with 2281ab9c75c402ac8b53f6007d3f795cf1ff09aad4c21d1c2d1f97b2f04c2605 not found: ID does not exist" Mar 10 07:09:44 crc kubenswrapper[4825]: E0310 07:09:44.339717 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dbdb4c597ea9262b0a3b7810260c5ec8336d5f6419c63f838f27d493f6678134 is running failed: container process not found" containerID="dbdb4c597ea9262b0a3b7810260c5ec8336d5f6419c63f838f27d493f6678134" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 07:09:44 crc kubenswrapper[4825]: E0310 07:09:44.340357 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dbdb4c597ea9262b0a3b7810260c5ec8336d5f6419c63f838f27d493f6678134 is running failed: container process not found" containerID="dbdb4c597ea9262b0a3b7810260c5ec8336d5f6419c63f838f27d493f6678134" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 
10 07:09:44 crc kubenswrapper[4825]: E0310 07:09:44.340952 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dbdb4c597ea9262b0a3b7810260c5ec8336d5f6419c63f838f27d493f6678134 is running failed: container process not found" containerID="dbdb4c597ea9262b0a3b7810260c5ec8336d5f6419c63f838f27d493f6678134" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 07:09:44 crc kubenswrapper[4825]: E0310 07:09:44.340983 4825 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dbdb4c597ea9262b0a3b7810260c5ec8336d5f6419c63f838f27d493f6678134 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-8x4pm" podUID="f04e11a7-e387-4e51-b878-8633ca528b1a" containerName="ovsdb-server" Mar 10 07:09:44 crc kubenswrapper[4825]: E0310 07:09:44.342849 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1f073cb0d55b119007da7e16a09131032c9832ee0f4faf6e839adba2f4de4803" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 07:09:44 crc kubenswrapper[4825]: E0310 07:09:44.344987 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1f073cb0d55b119007da7e16a09131032c9832ee0f4faf6e839adba2f4de4803" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 07:09:44 crc kubenswrapper[4825]: E0310 07:09:44.347222 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="1f073cb0d55b119007da7e16a09131032c9832ee0f4faf6e839adba2f4de4803" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 07:09:44 crc kubenswrapper[4825]: E0310 07:09:44.347259 4825 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-8x4pm" podUID="f04e11a7-e387-4e51-b878-8633ca528b1a" containerName="ovs-vswitchd" Mar 10 07:09:45 crc kubenswrapper[4825]: I0310 07:09:45.253675 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6" path="/var/lib/kubelet/pods/f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6/volumes" Mar 10 07:09:46 crc kubenswrapper[4825]: I0310 07:09:46.888763 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 07:09:46 crc kubenswrapper[4825]: I0310 07:09:46.889176 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 07:09:49 crc kubenswrapper[4825]: E0310 07:09:49.339912 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dbdb4c597ea9262b0a3b7810260c5ec8336d5f6419c63f838f27d493f6678134 is running failed: container process not found" containerID="dbdb4c597ea9262b0a3b7810260c5ec8336d5f6419c63f838f27d493f6678134" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 
07:09:49 crc kubenswrapper[4825]: E0310 07:09:49.340516 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dbdb4c597ea9262b0a3b7810260c5ec8336d5f6419c63f838f27d493f6678134 is running failed: container process not found" containerID="dbdb4c597ea9262b0a3b7810260c5ec8336d5f6419c63f838f27d493f6678134" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 07:09:49 crc kubenswrapper[4825]: E0310 07:09:49.341156 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dbdb4c597ea9262b0a3b7810260c5ec8336d5f6419c63f838f27d493f6678134 is running failed: container process not found" containerID="dbdb4c597ea9262b0a3b7810260c5ec8336d5f6419c63f838f27d493f6678134" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 10 07:09:49 crc kubenswrapper[4825]: E0310 07:09:49.341196 4825 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dbdb4c597ea9262b0a3b7810260c5ec8336d5f6419c63f838f27d493f6678134 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-8x4pm" podUID="f04e11a7-e387-4e51-b878-8633ca528b1a" containerName="ovsdb-server" Mar 10 07:09:49 crc kubenswrapper[4825]: E0310 07:09:49.341904 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1f073cb0d55b119007da7e16a09131032c9832ee0f4faf6e839adba2f4de4803" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 07:09:49 crc kubenswrapper[4825]: E0310 07:09:49.344557 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: 
container is stopping, stdout: , stderr: , exit code -1" containerID="1f073cb0d55b119007da7e16a09131032c9832ee0f4faf6e839adba2f4de4803" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 07:09:49 crc kubenswrapper[4825]: E0310 07:09:49.347644 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1f073cb0d55b119007da7e16a09131032c9832ee0f4faf6e839adba2f4de4803" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 10 07:09:49 crc kubenswrapper[4825]: E0310 07:09:49.347747 4825 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-8x4pm" podUID="f04e11a7-e387-4e51-b878-8633ca528b1a" containerName="ovs-vswitchd" Mar 10 07:09:53 crc kubenswrapper[4825]: I0310 07:09:53.680546 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8x4pm_f04e11a7-e387-4e51-b878-8633ca528b1a/ovs-vswitchd/0.log" Mar 10 07:09:53 crc kubenswrapper[4825]: I0310 07:09:53.681618 4825 generic.go:334] "Generic (PLEG): container finished" podID="f04e11a7-e387-4e51-b878-8633ca528b1a" containerID="1f073cb0d55b119007da7e16a09131032c9832ee0f4faf6e839adba2f4de4803" exitCode=137 Mar 10 07:09:53 crc kubenswrapper[4825]: I0310 07:09:53.681654 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8x4pm" event={"ID":"f04e11a7-e387-4e51-b878-8633ca528b1a","Type":"ContainerDied","Data":"1f073cb0d55b119007da7e16a09131032c9832ee0f4faf6e839adba2f4de4803"} Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.223309 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8x4pm_f04e11a7-e387-4e51-b878-8633ca528b1a/ovs-vswitchd/0.log" Mar 10 07:09:54 
crc kubenswrapper[4825]: I0310 07:09:54.224849 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-8x4pm" Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.410408 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f04e11a7-e387-4e51-b878-8633ca528b1a-var-log\") pod \"f04e11a7-e387-4e51-b878-8633ca528b1a\" (UID: \"f04e11a7-e387-4e51-b878-8633ca528b1a\") " Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.410484 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f04e11a7-e387-4e51-b878-8633ca528b1a-etc-ovs\") pod \"f04e11a7-e387-4e51-b878-8633ca528b1a\" (UID: \"f04e11a7-e387-4e51-b878-8633ca528b1a\") " Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.410528 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f04e11a7-e387-4e51-b878-8633ca528b1a-var-run\") pod \"f04e11a7-e387-4e51-b878-8633ca528b1a\" (UID: \"f04e11a7-e387-4e51-b878-8633ca528b1a\") " Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.410604 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmnpc\" (UniqueName: \"kubernetes.io/projected/f04e11a7-e387-4e51-b878-8633ca528b1a-kube-api-access-bmnpc\") pod \"f04e11a7-e387-4e51-b878-8633ca528b1a\" (UID: \"f04e11a7-e387-4e51-b878-8633ca528b1a\") " Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.410682 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f04e11a7-e387-4e51-b878-8633ca528b1a-var-lib\") pod \"f04e11a7-e387-4e51-b878-8633ca528b1a\" (UID: \"f04e11a7-e387-4e51-b878-8633ca528b1a\") " Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.410713 4825 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f04e11a7-e387-4e51-b878-8633ca528b1a-scripts\") pod \"f04e11a7-e387-4e51-b878-8633ca528b1a\" (UID: \"f04e11a7-e387-4e51-b878-8633ca528b1a\") " Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.411049 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f04e11a7-e387-4e51-b878-8633ca528b1a-var-run" (OuterVolumeSpecName: "var-run") pod "f04e11a7-e387-4e51-b878-8633ca528b1a" (UID: "f04e11a7-e387-4e51-b878-8633ca528b1a"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.411064 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f04e11a7-e387-4e51-b878-8633ca528b1a-var-log" (OuterVolumeSpecName: "var-log") pod "f04e11a7-e387-4e51-b878-8633ca528b1a" (UID: "f04e11a7-e387-4e51-b878-8633ca528b1a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.411094 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f04e11a7-e387-4e51-b878-8633ca528b1a-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "f04e11a7-e387-4e51-b878-8633ca528b1a" (UID: "f04e11a7-e387-4e51-b878-8633ca528b1a"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.411110 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f04e11a7-e387-4e51-b878-8633ca528b1a-var-lib" (OuterVolumeSpecName: "var-lib") pod "f04e11a7-e387-4e51-b878-8633ca528b1a" (UID: "f04e11a7-e387-4e51-b878-8633ca528b1a"). InnerVolumeSpecName "var-lib". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.411418 4825 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f04e11a7-e387-4e51-b878-8633ca528b1a-etc-ovs\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.411435 4825 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f04e11a7-e387-4e51-b878-8633ca528b1a-var-run\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.411446 4825 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f04e11a7-e387-4e51-b878-8633ca528b1a-var-lib\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.411457 4825 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f04e11a7-e387-4e51-b878-8633ca528b1a-var-log\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.412686 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f04e11a7-e387-4e51-b878-8633ca528b1a-scripts" (OuterVolumeSpecName: "scripts") pod "f04e11a7-e387-4e51-b878-8633ca528b1a" (UID: "f04e11a7-e387-4e51-b878-8633ca528b1a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.434175 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f04e11a7-e387-4e51-b878-8633ca528b1a-kube-api-access-bmnpc" (OuterVolumeSpecName: "kube-api-access-bmnpc") pod "f04e11a7-e387-4e51-b878-8633ca528b1a" (UID: "f04e11a7-e387-4e51-b878-8633ca528b1a"). InnerVolumeSpecName "kube-api-access-bmnpc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.512258 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f04e11a7-e387-4e51-b878-8633ca528b1a-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.512287 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmnpc\" (UniqueName: \"kubernetes.io/projected/f04e11a7-e387-4e51-b878-8633ca528b1a-kube-api-access-bmnpc\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.603897 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.702418 4825 generic.go:334] "Generic (PLEG): container finished" podID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerID="0eb3526b2f9a3dc4c181eae70d6e3f55e5aedd2b23585785d07ab9b47c86be86" exitCode=137 Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.702514 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"632168fa-0532-4c7e-b688-fb361ee89ec8","Type":"ContainerDied","Data":"0eb3526b2f9a3dc4c181eae70d6e3f55e5aedd2b23585785d07ab9b47c86be86"} Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.702556 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.702599 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"632168fa-0532-4c7e-b688-fb361ee89ec8","Type":"ContainerDied","Data":"5d31298fc11e6c4ffc4b3bf96856e7a83277dceb2c2cca7b069e69b3b80cf543"} Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.702622 4825 scope.go:117] "RemoveContainer" containerID="0eb3526b2f9a3dc4c181eae70d6e3f55e5aedd2b23585785d07ab9b47c86be86" Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.708718 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8x4pm_f04e11a7-e387-4e51-b878-8633ca528b1a/ovs-vswitchd/0.log" Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.712339 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8x4pm" event={"ID":"f04e11a7-e387-4e51-b878-8633ca528b1a","Type":"ContainerDied","Data":"754ad60a5b8873bafdcc44f51b7f245d114da8d4e8574e9082261004cb8fcd7e"} Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.712394 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-8x4pm" Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.713746 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/632168fa-0532-4c7e-b688-fb361ee89ec8-cache\") pod \"632168fa-0532-4c7e-b688-fb361ee89ec8\" (UID: \"632168fa-0532-4c7e-b688-fb361ee89ec8\") " Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.713784 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/632168fa-0532-4c7e-b688-fb361ee89ec8-combined-ca-bundle\") pod \"632168fa-0532-4c7e-b688-fb361ee89ec8\" (UID: \"632168fa-0532-4c7e-b688-fb361ee89ec8\") " Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.713845 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7gvh\" (UniqueName: \"kubernetes.io/projected/632168fa-0532-4c7e-b688-fb361ee89ec8-kube-api-access-v7gvh\") pod \"632168fa-0532-4c7e-b688-fb361ee89ec8\" (UID: \"632168fa-0532-4c7e-b688-fb361ee89ec8\") " Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.713877 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"632168fa-0532-4c7e-b688-fb361ee89ec8\" (UID: \"632168fa-0532-4c7e-b688-fb361ee89ec8\") " Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.713918 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/632168fa-0532-4c7e-b688-fb361ee89ec8-lock\") pod \"632168fa-0532-4c7e-b688-fb361ee89ec8\" (UID: \"632168fa-0532-4c7e-b688-fb361ee89ec8\") " Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.713989 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/632168fa-0532-4c7e-b688-fb361ee89ec8-etc-swift\") pod \"632168fa-0532-4c7e-b688-fb361ee89ec8\" (UID: \"632168fa-0532-4c7e-b688-fb361ee89ec8\") " Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.714531 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/632168fa-0532-4c7e-b688-fb361ee89ec8-lock" (OuterVolumeSpecName: "lock") pod "632168fa-0532-4c7e-b688-fb361ee89ec8" (UID: "632168fa-0532-4c7e-b688-fb361ee89ec8"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.714640 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/632168fa-0532-4c7e-b688-fb361ee89ec8-cache" (OuterVolumeSpecName: "cache") pod "632168fa-0532-4c7e-b688-fb361ee89ec8" (UID: "632168fa-0532-4c7e-b688-fb361ee89ec8"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.719217 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "swift") pod "632168fa-0532-4c7e-b688-fb361ee89ec8" (UID: "632168fa-0532-4c7e-b688-fb361ee89ec8"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.719544 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/632168fa-0532-4c7e-b688-fb361ee89ec8-kube-api-access-v7gvh" (OuterVolumeSpecName: "kube-api-access-v7gvh") pod "632168fa-0532-4c7e-b688-fb361ee89ec8" (UID: "632168fa-0532-4c7e-b688-fb361ee89ec8"). InnerVolumeSpecName "kube-api-access-v7gvh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.719627 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/632168fa-0532-4c7e-b688-fb361ee89ec8-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "632168fa-0532-4c7e-b688-fb361ee89ec8" (UID: "632168fa-0532-4c7e-b688-fb361ee89ec8"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.815278 4825 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/632168fa-0532-4c7e-b688-fb361ee89ec8-cache\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.815318 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7gvh\" (UniqueName: \"kubernetes.io/projected/632168fa-0532-4c7e-b688-fb361ee89ec8-kube-api-access-v7gvh\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.815356 4825 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.815371 4825 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/632168fa-0532-4c7e-b688-fb361ee89ec8-lock\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.815384 4825 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/632168fa-0532-4c7e-b688-fb361ee89ec8-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.837596 4825 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node 
"crc" Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.854801 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-8x4pm"] Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.862830 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-8x4pm"] Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.882935 4825 scope.go:117] "RemoveContainer" containerID="bbfb6db4ff1cf66b5cefbdf85aba01f8712fe08811899971bacb92c4cffa4c80" Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.914769 4825 scope.go:117] "RemoveContainer" containerID="24e7978cba137e69c6bcc8c4236d6cec9f0db399c5f65bdbb35f3ca5a0021d8b" Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.916401 4825 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.944527 4825 scope.go:117] "RemoveContainer" containerID="2696dee8a458e4895f3ba7693b75ddaad87852936a2c94221d87c43242a8d9c0" Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.966435 4825 scope.go:117] "RemoveContainer" containerID="1b4b9f5c44b3bd7e6bdce3108794f5acc85e48ec01796fe34d6bd0e82000c702" Mar 10 07:09:54 crc kubenswrapper[4825]: I0310 07:09:54.990616 4825 scope.go:117] "RemoveContainer" containerID="6b6649c00a311af982a228be67d8b620ca48866ed93599303fea93062ada078a" Mar 10 07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.012198 4825 scope.go:117] "RemoveContainer" containerID="6a94411539794d68dfeebd72122285b13eda3087dbbeb9b315116582797f9ae7" Mar 10 07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.018418 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/632168fa-0532-4c7e-b688-fb361ee89ec8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "632168fa-0532-4c7e-b688-fb361ee89ec8" (UID: "632168fa-0532-4c7e-b688-fb361ee89ec8"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.032627 4825 scope.go:117] "RemoveContainer" containerID="ec1f01797e26b3ab6c17de9ed25bdc13caaa429fb99d4c291f3d27585657d67a" Mar 10 07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.053374 4825 scope.go:117] "RemoveContainer" containerID="76563e2f197f8efff20de4d4a18477c3d06424bc7c12b84fc84e0bc0a8e03877" Mar 10 07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.072212 4825 scope.go:117] "RemoveContainer" containerID="3c0a38e663ef3eeb91871a7cbf3ab45bc29c396569c4dd16e651b9a699527022" Mar 10 07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.097343 4825 scope.go:117] "RemoveContainer" containerID="13e1e93f8ea417f0086cb72d1b5c3c472463fcb9b293144a2e6d095358753ed6" Mar 10 07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.117258 4825 scope.go:117] "RemoveContainer" containerID="dbff9d40c47a9b57b2e18d794a415a9500d5967b3986aef885713d9846f81e1f" Mar 10 07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.118925 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/632168fa-0532-4c7e-b688-fb361ee89ec8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.135792 4825 scope.go:117] "RemoveContainer" containerID="49e3de3dc2efd76ad3dd592ab843057ad3883dc58b1b23503b05369a2fb81259" Mar 10 07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.167100 4825 scope.go:117] "RemoveContainer" containerID="001ce68a7777bd6940dec1d7b9d36b3306b90993cc66dd9412186267450b2a7a" Mar 10 07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.207570 4825 scope.go:117] "RemoveContainer" containerID="b6da432e4ef6e11dcf09322fd99510f39d8b7a43cde3cc5579134c6692dd38d0" Mar 10 07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.229839 4825 scope.go:117] "RemoveContainer" containerID="0eb3526b2f9a3dc4c181eae70d6e3f55e5aedd2b23585785d07ab9b47c86be86" Mar 10 07:09:55 crc 
kubenswrapper[4825]: E0310 07:09:55.230318 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0eb3526b2f9a3dc4c181eae70d6e3f55e5aedd2b23585785d07ab9b47c86be86\": container with ID starting with 0eb3526b2f9a3dc4c181eae70d6e3f55e5aedd2b23585785d07ab9b47c86be86 not found: ID does not exist" containerID="0eb3526b2f9a3dc4c181eae70d6e3f55e5aedd2b23585785d07ab9b47c86be86" Mar 10 07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.230366 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0eb3526b2f9a3dc4c181eae70d6e3f55e5aedd2b23585785d07ab9b47c86be86"} err="failed to get container status \"0eb3526b2f9a3dc4c181eae70d6e3f55e5aedd2b23585785d07ab9b47c86be86\": rpc error: code = NotFound desc = could not find container \"0eb3526b2f9a3dc4c181eae70d6e3f55e5aedd2b23585785d07ab9b47c86be86\": container with ID starting with 0eb3526b2f9a3dc4c181eae70d6e3f55e5aedd2b23585785d07ab9b47c86be86 not found: ID does not exist" Mar 10 07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.230399 4825 scope.go:117] "RemoveContainer" containerID="bbfb6db4ff1cf66b5cefbdf85aba01f8712fe08811899971bacb92c4cffa4c80" Mar 10 07:09:55 crc kubenswrapper[4825]: E0310 07:09:55.230898 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbfb6db4ff1cf66b5cefbdf85aba01f8712fe08811899971bacb92c4cffa4c80\": container with ID starting with bbfb6db4ff1cf66b5cefbdf85aba01f8712fe08811899971bacb92c4cffa4c80 not found: ID does not exist" containerID="bbfb6db4ff1cf66b5cefbdf85aba01f8712fe08811899971bacb92c4cffa4c80" Mar 10 07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.230938 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbfb6db4ff1cf66b5cefbdf85aba01f8712fe08811899971bacb92c4cffa4c80"} err="failed to get container status 
\"bbfb6db4ff1cf66b5cefbdf85aba01f8712fe08811899971bacb92c4cffa4c80\": rpc error: code = NotFound desc = could not find container \"bbfb6db4ff1cf66b5cefbdf85aba01f8712fe08811899971bacb92c4cffa4c80\": container with ID starting with bbfb6db4ff1cf66b5cefbdf85aba01f8712fe08811899971bacb92c4cffa4c80 not found: ID does not exist" Mar 10 07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.230965 4825 scope.go:117] "RemoveContainer" containerID="24e7978cba137e69c6bcc8c4236d6cec9f0db399c5f65bdbb35f3ca5a0021d8b" Mar 10 07:09:55 crc kubenswrapper[4825]: E0310 07:09:55.231268 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24e7978cba137e69c6bcc8c4236d6cec9f0db399c5f65bdbb35f3ca5a0021d8b\": container with ID starting with 24e7978cba137e69c6bcc8c4236d6cec9f0db399c5f65bdbb35f3ca5a0021d8b not found: ID does not exist" containerID="24e7978cba137e69c6bcc8c4236d6cec9f0db399c5f65bdbb35f3ca5a0021d8b" Mar 10 07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.231288 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24e7978cba137e69c6bcc8c4236d6cec9f0db399c5f65bdbb35f3ca5a0021d8b"} err="failed to get container status \"24e7978cba137e69c6bcc8c4236d6cec9f0db399c5f65bdbb35f3ca5a0021d8b\": rpc error: code = NotFound desc = could not find container \"24e7978cba137e69c6bcc8c4236d6cec9f0db399c5f65bdbb35f3ca5a0021d8b\": container with ID starting with 24e7978cba137e69c6bcc8c4236d6cec9f0db399c5f65bdbb35f3ca5a0021d8b not found: ID does not exist" Mar 10 07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.231305 4825 scope.go:117] "RemoveContainer" containerID="2696dee8a458e4895f3ba7693b75ddaad87852936a2c94221d87c43242a8d9c0" Mar 10 07:09:55 crc kubenswrapper[4825]: E0310 07:09:55.231584 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2696dee8a458e4895f3ba7693b75ddaad87852936a2c94221d87c43242a8d9c0\": container with ID starting with 2696dee8a458e4895f3ba7693b75ddaad87852936a2c94221d87c43242a8d9c0 not found: ID does not exist" containerID="2696dee8a458e4895f3ba7693b75ddaad87852936a2c94221d87c43242a8d9c0" Mar 10 07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.231610 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2696dee8a458e4895f3ba7693b75ddaad87852936a2c94221d87c43242a8d9c0"} err="failed to get container status \"2696dee8a458e4895f3ba7693b75ddaad87852936a2c94221d87c43242a8d9c0\": rpc error: code = NotFound desc = could not find container \"2696dee8a458e4895f3ba7693b75ddaad87852936a2c94221d87c43242a8d9c0\": container with ID starting with 2696dee8a458e4895f3ba7693b75ddaad87852936a2c94221d87c43242a8d9c0 not found: ID does not exist" Mar 10 07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.231626 4825 scope.go:117] "RemoveContainer" containerID="1b4b9f5c44b3bd7e6bdce3108794f5acc85e48ec01796fe34d6bd0e82000c702" Mar 10 07:09:55 crc kubenswrapper[4825]: E0310 07:09:55.231881 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b4b9f5c44b3bd7e6bdce3108794f5acc85e48ec01796fe34d6bd0e82000c702\": container with ID starting with 1b4b9f5c44b3bd7e6bdce3108794f5acc85e48ec01796fe34d6bd0e82000c702 not found: ID does not exist" containerID="1b4b9f5c44b3bd7e6bdce3108794f5acc85e48ec01796fe34d6bd0e82000c702" Mar 10 07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.231905 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b4b9f5c44b3bd7e6bdce3108794f5acc85e48ec01796fe34d6bd0e82000c702"} err="failed to get container status \"1b4b9f5c44b3bd7e6bdce3108794f5acc85e48ec01796fe34d6bd0e82000c702\": rpc error: code = NotFound desc = could not find container \"1b4b9f5c44b3bd7e6bdce3108794f5acc85e48ec01796fe34d6bd0e82000c702\": container with ID 
starting with 1b4b9f5c44b3bd7e6bdce3108794f5acc85e48ec01796fe34d6bd0e82000c702 not found: ID does not exist" Mar 10 07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.231920 4825 scope.go:117] "RemoveContainer" containerID="6b6649c00a311af982a228be67d8b620ca48866ed93599303fea93062ada078a" Mar 10 07:09:55 crc kubenswrapper[4825]: E0310 07:09:55.232275 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b6649c00a311af982a228be67d8b620ca48866ed93599303fea93062ada078a\": container with ID starting with 6b6649c00a311af982a228be67d8b620ca48866ed93599303fea93062ada078a not found: ID does not exist" containerID="6b6649c00a311af982a228be67d8b620ca48866ed93599303fea93062ada078a" Mar 10 07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.232300 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b6649c00a311af982a228be67d8b620ca48866ed93599303fea93062ada078a"} err="failed to get container status \"6b6649c00a311af982a228be67d8b620ca48866ed93599303fea93062ada078a\": rpc error: code = NotFound desc = could not find container \"6b6649c00a311af982a228be67d8b620ca48866ed93599303fea93062ada078a\": container with ID starting with 6b6649c00a311af982a228be67d8b620ca48866ed93599303fea93062ada078a not found: ID does not exist" Mar 10 07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.232315 4825 scope.go:117] "RemoveContainer" containerID="6a94411539794d68dfeebd72122285b13eda3087dbbeb9b315116582797f9ae7" Mar 10 07:09:55 crc kubenswrapper[4825]: E0310 07:09:55.232774 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a94411539794d68dfeebd72122285b13eda3087dbbeb9b315116582797f9ae7\": container with ID starting with 6a94411539794d68dfeebd72122285b13eda3087dbbeb9b315116582797f9ae7 not found: ID does not exist" containerID="6a94411539794d68dfeebd72122285b13eda3087dbbeb9b315116582797f9ae7" Mar 10 
07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.232806 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a94411539794d68dfeebd72122285b13eda3087dbbeb9b315116582797f9ae7"} err="failed to get container status \"6a94411539794d68dfeebd72122285b13eda3087dbbeb9b315116582797f9ae7\": rpc error: code = NotFound desc = could not find container \"6a94411539794d68dfeebd72122285b13eda3087dbbeb9b315116582797f9ae7\": container with ID starting with 6a94411539794d68dfeebd72122285b13eda3087dbbeb9b315116582797f9ae7 not found: ID does not exist" Mar 10 07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.232826 4825 scope.go:117] "RemoveContainer" containerID="ec1f01797e26b3ab6c17de9ed25bdc13caaa429fb99d4c291f3d27585657d67a" Mar 10 07:09:55 crc kubenswrapper[4825]: E0310 07:09:55.233098 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec1f01797e26b3ab6c17de9ed25bdc13caaa429fb99d4c291f3d27585657d67a\": container with ID starting with ec1f01797e26b3ab6c17de9ed25bdc13caaa429fb99d4c291f3d27585657d67a not found: ID does not exist" containerID="ec1f01797e26b3ab6c17de9ed25bdc13caaa429fb99d4c291f3d27585657d67a" Mar 10 07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.233127 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec1f01797e26b3ab6c17de9ed25bdc13caaa429fb99d4c291f3d27585657d67a"} err="failed to get container status \"ec1f01797e26b3ab6c17de9ed25bdc13caaa429fb99d4c291f3d27585657d67a\": rpc error: code = NotFound desc = could not find container \"ec1f01797e26b3ab6c17de9ed25bdc13caaa429fb99d4c291f3d27585657d67a\": container with ID starting with ec1f01797e26b3ab6c17de9ed25bdc13caaa429fb99d4c291f3d27585657d67a not found: ID does not exist" Mar 10 07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.233158 4825 scope.go:117] "RemoveContainer" 
containerID="76563e2f197f8efff20de4d4a18477c3d06424bc7c12b84fc84e0bc0a8e03877" Mar 10 07:09:55 crc kubenswrapper[4825]: E0310 07:09:55.233460 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76563e2f197f8efff20de4d4a18477c3d06424bc7c12b84fc84e0bc0a8e03877\": container with ID starting with 76563e2f197f8efff20de4d4a18477c3d06424bc7c12b84fc84e0bc0a8e03877 not found: ID does not exist" containerID="76563e2f197f8efff20de4d4a18477c3d06424bc7c12b84fc84e0bc0a8e03877" Mar 10 07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.233485 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76563e2f197f8efff20de4d4a18477c3d06424bc7c12b84fc84e0bc0a8e03877"} err="failed to get container status \"76563e2f197f8efff20de4d4a18477c3d06424bc7c12b84fc84e0bc0a8e03877\": rpc error: code = NotFound desc = could not find container \"76563e2f197f8efff20de4d4a18477c3d06424bc7c12b84fc84e0bc0a8e03877\": container with ID starting with 76563e2f197f8efff20de4d4a18477c3d06424bc7c12b84fc84e0bc0a8e03877 not found: ID does not exist" Mar 10 07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.233501 4825 scope.go:117] "RemoveContainer" containerID="3c0a38e663ef3eeb91871a7cbf3ab45bc29c396569c4dd16e651b9a699527022" Mar 10 07:09:55 crc kubenswrapper[4825]: E0310 07:09:55.233902 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c0a38e663ef3eeb91871a7cbf3ab45bc29c396569c4dd16e651b9a699527022\": container with ID starting with 3c0a38e663ef3eeb91871a7cbf3ab45bc29c396569c4dd16e651b9a699527022 not found: ID does not exist" containerID="3c0a38e663ef3eeb91871a7cbf3ab45bc29c396569c4dd16e651b9a699527022" Mar 10 07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.233927 4825 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3c0a38e663ef3eeb91871a7cbf3ab45bc29c396569c4dd16e651b9a699527022"} err="failed to get container status \"3c0a38e663ef3eeb91871a7cbf3ab45bc29c396569c4dd16e651b9a699527022\": rpc error: code = NotFound desc = could not find container \"3c0a38e663ef3eeb91871a7cbf3ab45bc29c396569c4dd16e651b9a699527022\": container with ID starting with 3c0a38e663ef3eeb91871a7cbf3ab45bc29c396569c4dd16e651b9a699527022 not found: ID does not exist" Mar 10 07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.233941 4825 scope.go:117] "RemoveContainer" containerID="13e1e93f8ea417f0086cb72d1b5c3c472463fcb9b293144a2e6d095358753ed6" Mar 10 07:09:55 crc kubenswrapper[4825]: E0310 07:09:55.234233 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13e1e93f8ea417f0086cb72d1b5c3c472463fcb9b293144a2e6d095358753ed6\": container with ID starting with 13e1e93f8ea417f0086cb72d1b5c3c472463fcb9b293144a2e6d095358753ed6 not found: ID does not exist" containerID="13e1e93f8ea417f0086cb72d1b5c3c472463fcb9b293144a2e6d095358753ed6" Mar 10 07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.234262 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13e1e93f8ea417f0086cb72d1b5c3c472463fcb9b293144a2e6d095358753ed6"} err="failed to get container status \"13e1e93f8ea417f0086cb72d1b5c3c472463fcb9b293144a2e6d095358753ed6\": rpc error: code = NotFound desc = could not find container \"13e1e93f8ea417f0086cb72d1b5c3c472463fcb9b293144a2e6d095358753ed6\": container with ID starting with 13e1e93f8ea417f0086cb72d1b5c3c472463fcb9b293144a2e6d095358753ed6 not found: ID does not exist" Mar 10 07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.234281 4825 scope.go:117] "RemoveContainer" containerID="dbff9d40c47a9b57b2e18d794a415a9500d5967b3986aef885713d9846f81e1f" Mar 10 07:09:55 crc kubenswrapper[4825]: E0310 07:09:55.234665 4825 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"dbff9d40c47a9b57b2e18d794a415a9500d5967b3986aef885713d9846f81e1f\": container with ID starting with dbff9d40c47a9b57b2e18d794a415a9500d5967b3986aef885713d9846f81e1f not found: ID does not exist" containerID="dbff9d40c47a9b57b2e18d794a415a9500d5967b3986aef885713d9846f81e1f" Mar 10 07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.234694 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbff9d40c47a9b57b2e18d794a415a9500d5967b3986aef885713d9846f81e1f"} err="failed to get container status \"dbff9d40c47a9b57b2e18d794a415a9500d5967b3986aef885713d9846f81e1f\": rpc error: code = NotFound desc = could not find container \"dbff9d40c47a9b57b2e18d794a415a9500d5967b3986aef885713d9846f81e1f\": container with ID starting with dbff9d40c47a9b57b2e18d794a415a9500d5967b3986aef885713d9846f81e1f not found: ID does not exist" Mar 10 07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.234713 4825 scope.go:117] "RemoveContainer" containerID="49e3de3dc2efd76ad3dd592ab843057ad3883dc58b1b23503b05369a2fb81259" Mar 10 07:09:55 crc kubenswrapper[4825]: E0310 07:09:55.235016 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49e3de3dc2efd76ad3dd592ab843057ad3883dc58b1b23503b05369a2fb81259\": container with ID starting with 49e3de3dc2efd76ad3dd592ab843057ad3883dc58b1b23503b05369a2fb81259 not found: ID does not exist" containerID="49e3de3dc2efd76ad3dd592ab843057ad3883dc58b1b23503b05369a2fb81259" Mar 10 07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.235037 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49e3de3dc2efd76ad3dd592ab843057ad3883dc58b1b23503b05369a2fb81259"} err="failed to get container status \"49e3de3dc2efd76ad3dd592ab843057ad3883dc58b1b23503b05369a2fb81259\": rpc error: code = NotFound desc = could not find container 
\"49e3de3dc2efd76ad3dd592ab843057ad3883dc58b1b23503b05369a2fb81259\": container with ID starting with 49e3de3dc2efd76ad3dd592ab843057ad3883dc58b1b23503b05369a2fb81259 not found: ID does not exist" Mar 10 07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.235057 4825 scope.go:117] "RemoveContainer" containerID="001ce68a7777bd6940dec1d7b9d36b3306b90993cc66dd9412186267450b2a7a" Mar 10 07:09:55 crc kubenswrapper[4825]: E0310 07:09:55.235558 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"001ce68a7777bd6940dec1d7b9d36b3306b90993cc66dd9412186267450b2a7a\": container with ID starting with 001ce68a7777bd6940dec1d7b9d36b3306b90993cc66dd9412186267450b2a7a not found: ID does not exist" containerID="001ce68a7777bd6940dec1d7b9d36b3306b90993cc66dd9412186267450b2a7a" Mar 10 07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.235586 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"001ce68a7777bd6940dec1d7b9d36b3306b90993cc66dd9412186267450b2a7a"} err="failed to get container status \"001ce68a7777bd6940dec1d7b9d36b3306b90993cc66dd9412186267450b2a7a\": rpc error: code = NotFound desc = could not find container \"001ce68a7777bd6940dec1d7b9d36b3306b90993cc66dd9412186267450b2a7a\": container with ID starting with 001ce68a7777bd6940dec1d7b9d36b3306b90993cc66dd9412186267450b2a7a not found: ID does not exist" Mar 10 07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.235603 4825 scope.go:117] "RemoveContainer" containerID="b6da432e4ef6e11dcf09322fd99510f39d8b7a43cde3cc5579134c6692dd38d0" Mar 10 07:09:55 crc kubenswrapper[4825]: E0310 07:09:55.236084 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6da432e4ef6e11dcf09322fd99510f39d8b7a43cde3cc5579134c6692dd38d0\": container with ID starting with b6da432e4ef6e11dcf09322fd99510f39d8b7a43cde3cc5579134c6692dd38d0 not found: ID does not exist" 
containerID="b6da432e4ef6e11dcf09322fd99510f39d8b7a43cde3cc5579134c6692dd38d0" Mar 10 07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.236111 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6da432e4ef6e11dcf09322fd99510f39d8b7a43cde3cc5579134c6692dd38d0"} err="failed to get container status \"b6da432e4ef6e11dcf09322fd99510f39d8b7a43cde3cc5579134c6692dd38d0\": rpc error: code = NotFound desc = could not find container \"b6da432e4ef6e11dcf09322fd99510f39d8b7a43cde3cc5579134c6692dd38d0\": container with ID starting with b6da432e4ef6e11dcf09322fd99510f39d8b7a43cde3cc5579134c6692dd38d0 not found: ID does not exist" Mar 10 07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.236161 4825 scope.go:117] "RemoveContainer" containerID="1f073cb0d55b119007da7e16a09131032c9832ee0f4faf6e839adba2f4de4803" Mar 10 07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.249563 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f04e11a7-e387-4e51-b878-8633ca528b1a" path="/var/lib/kubelet/pods/f04e11a7-e387-4e51-b878-8633ca528b1a/volumes" Mar 10 07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.272554 4825 scope.go:117] "RemoveContainer" containerID="dbdb4c597ea9262b0a3b7810260c5ec8336d5f6419c63f838f27d493f6678134" Mar 10 07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.294746 4825 scope.go:117] "RemoveContainer" containerID="0629a4d8ca064194d19a4f47c25dd149490196967be2ed943f6370d705b5cad2" Mar 10 07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.330049 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Mar 10 07:09:55 crc kubenswrapper[4825]: I0310 07:09:55.334699 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Mar 10 07:09:57 crc kubenswrapper[4825]: I0310 07:09:57.257459 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" 
path="/var/lib/kubelet/pods/632168fa-0532-4c7e-b688-fb361ee89ec8/volumes" Mar 10 07:09:59 crc kubenswrapper[4825]: I0310 07:09:59.396125 4825 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod56c74365-656c-4362-8358-bbb17d0c8be0"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod56c74365-656c-4362-8358-bbb17d0c8be0] : Timed out while waiting for systemd to remove kubepods-besteffort-pod56c74365_656c_4362_8358_bbb17d0c8be0.slice" Mar 10 07:09:59 crc kubenswrapper[4825]: E0310 07:09:59.396501 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod56c74365-656c-4362-8358-bbb17d0c8be0] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod56c74365-656c-4362-8358-bbb17d0c8be0] : Timed out while waiting for systemd to remove kubepods-besteffort-pod56c74365_656c_4362_8358_bbb17d0c8be0.slice" pod="openstack/placement-55b54bdfdb-8b9ns" podUID="56c74365-656c-4362-8358-bbb17d0c8be0" Mar 10 07:09:59 crc kubenswrapper[4825]: I0310 07:09:59.766562 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-55b54bdfdb-8b9ns" Mar 10 07:09:59 crc kubenswrapper[4825]: I0310 07:09:59.835278 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-55b54bdfdb-8b9ns"] Mar 10 07:09:59 crc kubenswrapper[4825]: I0310 07:09:59.847368 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-55b54bdfdb-8b9ns"] Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.160321 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552110-n6wf2"] Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.160729 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ba0e3ee-1309-4411-a927-866b35c2776b" containerName="setup-container" Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.160750 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ba0e3ee-1309-4411-a927-866b35c2776b" containerName="setup-container" Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.160764 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="209fbdac-3b3c-451a-ae30-5888e1cbb891" containerName="cinder-api-log" Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.160773 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="209fbdac-3b3c-451a-ae30-5888e1cbb891" containerName="cinder-api-log" Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.160783 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85f5995b-4ad8-4840-82f1-6659152c3ed4" containerName="nova-scheduler-scheduler" Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.160791 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="85f5995b-4ad8-4840-82f1-6659152c3ed4" containerName="nova-scheduler-scheduler" Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.160803 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="account-auditor" Mar 10 07:10:00 crc 
kubenswrapper[4825]: I0310 07:10:00.160810 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="account-auditor" Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.160824 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcf819fc-0560-45a8-be5e-04e6ef2bbf32" containerName="nova-api-api" Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.160834 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcf819fc-0560-45a8-be5e-04e6ef2bbf32" containerName="nova-api-api" Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.160848 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b5b179f-f4fb-479c-9720-25587566c518" containerName="galera" Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.160856 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b5b179f-f4fb-479c-9720-25587566c518" containerName="galera" Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.160876 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="account-reaper" Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.160884 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="account-reaper" Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.160903 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f04e11a7-e387-4e51-b878-8633ca528b1a" containerName="ovsdb-server-init" Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.160911 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f04e11a7-e387-4e51-b878-8633ca528b1a" containerName="ovsdb-server-init" Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.160921 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="account-server" Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 
07:10:00.160928 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="account-server" Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.160940 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="container-auditor" Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.160948 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="container-auditor" Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.160960 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="object-updater" Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.160967 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="object-updater" Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.160978 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47d73d8a-acf6-42e6-a30d-e093144ee0b9" containerName="nova-cell0-conductor-conductor" Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.160985 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="47d73d8a-acf6-42e6-a30d-e093144ee0b9" containerName="nova-cell0-conductor-conductor" Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.160997 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40efa241-98cc-4dec-9ae8-8a892b367ebc" containerName="setup-container" Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.161004 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="40efa241-98cc-4dec-9ae8-8a892b367ebc" containerName="setup-container" Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.161017 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="object-auditor" Mar 10 07:10:00 crc 
kubenswrapper[4825]: I0310 07:10:00.161024 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="object-auditor" Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.161033 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9965b351-ed73-4c38-b393-ff72ba48cd66" containerName="nova-cell1-conductor-conductor" Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.161041 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="9965b351-ed73-4c38-b393-ff72ba48cd66" containerName="nova-cell1-conductor-conductor" Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.161054 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="container-updater" Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.161061 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="container-updater" Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.161074 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6" containerName="neutron-api" Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.161081 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6" containerName="neutron-api" Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.161092 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="container-server" Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.161100 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="container-server" Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.161165 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e516b5cf-d44c-4f03-9247-7727319f0a85" 
containerName="ceilometer-notification-agent" Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.161173 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="e516b5cf-d44c-4f03-9247-7727319f0a85" containerName="ceilometer-notification-agent" Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.161183 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e516b5cf-d44c-4f03-9247-7727319f0a85" containerName="proxy-httpd" Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.161190 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="e516b5cf-d44c-4f03-9247-7727319f0a85" containerName="proxy-httpd" Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.161199 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f04e11a7-e387-4e51-b878-8633ca528b1a" containerName="ovs-vswitchd" Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.161206 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f04e11a7-e387-4e51-b878-8633ca528b1a" containerName="ovs-vswitchd" Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.161216 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c684f3e4-ced9-49c3-aa54-515bc2c4fb56" containerName="mariadb-account-create-update" Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.161223 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="c684f3e4-ced9-49c3-aa54-515bc2c4fb56" containerName="mariadb-account-create-update" Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.161238 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b3d94ce-f8d8-4653-a6b0-2682b23d834e" containerName="glance-log" Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.161245 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b3d94ce-f8d8-4653-a6b0-2682b23d834e" containerName="glance-log" Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.161261 4825 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f33e10c8-8be2-46d4-8653-1960855a2a40" containerName="glance-httpd"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.161268 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f33e10c8-8be2-46d4-8653-1960855a2a40" containerName="glance-httpd"
Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.161281 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca4ebafb-aa22-44ad-8037-487a9c3baca4" containerName="cinder-scheduler"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.161289 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca4ebafb-aa22-44ad-8037-487a9c3baca4" containerName="cinder-scheduler"
Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.161299 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40efa241-98cc-4dec-9ae8-8a892b367ebc" containerName="rabbitmq"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.161306 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="40efa241-98cc-4dec-9ae8-8a892b367ebc" containerName="rabbitmq"
Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.161314 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d4252ca-f279-4c95-8f10-205339d028a5" containerName="memcached"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.161321 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d4252ca-f279-4c95-8f10-205339d028a5" containerName="memcached"
Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.161337 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b5b179f-f4fb-479c-9720-25587566c518" containerName="mysql-bootstrap"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.161344 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b5b179f-f4fb-479c-9720-25587566c518" containerName="mysql-bootstrap"
Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.161355 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca4ebafb-aa22-44ad-8037-487a9c3baca4" containerName="probe"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.161361 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca4ebafb-aa22-44ad-8037-487a9c3baca4" containerName="probe"
Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.161371 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f33e10c8-8be2-46d4-8653-1960855a2a40" containerName="glance-log"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.161380 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f33e10c8-8be2-46d4-8653-1960855a2a40" containerName="glance-log"
Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.161392 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1222a91-b344-4f54-bf9f-75d03d5f8549" containerName="nova-metadata-log"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.161399 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1222a91-b344-4f54-bf9f-75d03d5f8549" containerName="nova-metadata-log"
Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.161411 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86dc5ac4-dac8-4fdf-bba9-06d13efacd53" containerName="keystone-api"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.161419 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="86dc5ac4-dac8-4fdf-bba9-06d13efacd53" containerName="keystone-api"
Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.161427 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f04e11a7-e387-4e51-b878-8633ca528b1a" containerName="ovsdb-server"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.161435 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f04e11a7-e387-4e51-b878-8633ca528b1a" containerName="ovsdb-server"
Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.161448 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="account-replicator"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.161457 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="account-replicator"
Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.161467 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="container-replicator"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.161476 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="container-replicator"
Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.161489 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e516b5cf-d44c-4f03-9247-7727319f0a85" containerName="ceilometer-central-agent"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.161499 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="e516b5cf-d44c-4f03-9247-7727319f0a85" containerName="ceilometer-central-agent"
Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.161512 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="object-replicator"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.161521 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="object-replicator"
Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.161533 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e516b5cf-d44c-4f03-9247-7727319f0a85" containerName="sg-core"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.161541 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="e516b5cf-d44c-4f03-9247-7727319f0a85" containerName="sg-core"
Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.161550 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04a2aca4-f98f-4ae8-aca9-62ae6625e5e2" containerName="kube-state-metrics"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.161558 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="04a2aca4-f98f-4ae8-aca9-62ae6625e5e2" containerName="kube-state-metrics"
Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.161568 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="rsync"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.161576 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="rsync"
Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.161590 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1222a91-b344-4f54-bf9f-75d03d5f8549" containerName="nova-metadata-metadata"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.161600 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1222a91-b344-4f54-bf9f-75d03d5f8549" containerName="nova-metadata-metadata"
Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.161611 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="object-expirer"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.161619 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="object-expirer"
Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.161631 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d77fced6-dee7-49fc-9088-e173a5be3cee" containerName="ovn-northd"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.161639 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="d77fced6-dee7-49fc-9088-e173a5be3cee" containerName="ovn-northd"
Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.161650 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="209fbdac-3b3c-451a-ae30-5888e1cbb891" containerName="cinder-api"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.161657 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="209fbdac-3b3c-451a-ae30-5888e1cbb891" containerName="cinder-api"
Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.161667 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcf819fc-0560-45a8-be5e-04e6ef2bbf32" containerName="nova-api-log"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.161674 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcf819fc-0560-45a8-be5e-04e6ef2bbf32" containerName="nova-api-log"
Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.161689 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="swift-recon-cron"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.161696 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="swift-recon-cron"
Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.161707 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ba0e3ee-1309-4411-a927-866b35c2776b" containerName="rabbitmq"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.161715 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ba0e3ee-1309-4411-a927-866b35c2776b" containerName="rabbitmq"
Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.161724 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b3d94ce-f8d8-4653-a6b0-2682b23d834e" containerName="glance-httpd"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.161732 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b3d94ce-f8d8-4653-a6b0-2682b23d834e" containerName="glance-httpd"
Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.161741 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42d8b8af-9916-4aba-b4e7-f825ed30f182" containerName="barbican-api-log"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.161748 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="42d8b8af-9916-4aba-b4e7-f825ed30f182" containerName="barbican-api-log"
Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.161756 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d77fced6-dee7-49fc-9088-e173a5be3cee" containerName="openstack-network-exporter"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.161764 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="d77fced6-dee7-49fc-9088-e173a5be3cee" containerName="openstack-network-exporter"
Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.161777 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="object-server"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.161785 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="object-server"
Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.161795 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42d8b8af-9916-4aba-b4e7-f825ed30f182" containerName="barbican-api"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.161803 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="42d8b8af-9916-4aba-b4e7-f825ed30f182" containerName="barbican-api"
Mar 10 07:10:00 crc kubenswrapper[4825]: E0310 07:10:00.161814 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6" containerName="neutron-httpd"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.161821 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6" containerName="neutron-httpd"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.161968 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="40efa241-98cc-4dec-9ae8-8a892b367ebc" containerName="rabbitmq"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.161979 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="container-auditor"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.161986 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1222a91-b344-4f54-bf9f-75d03d5f8549" containerName="nova-metadata-log"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162001 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="47d73d8a-acf6-42e6-a30d-e093144ee0b9" containerName="nova-cell0-conductor-conductor"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162012 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="account-auditor"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162038 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="c684f3e4-ced9-49c3-aa54-515bc2c4fb56" containerName="mariadb-account-create-update"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162047 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b3d94ce-f8d8-4653-a6b0-2682b23d834e" containerName="glance-log"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162059 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="f04e11a7-e387-4e51-b878-8633ca528b1a" containerName="ovsdb-server"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162068 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b5b179f-f4fb-479c-9720-25587566c518" containerName="galera"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162079 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="container-updater"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162089 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="d77fced6-dee7-49fc-9088-e173a5be3cee" containerName="openstack-network-exporter"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162098 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="object-replicator"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162110 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="85f5995b-4ad8-4840-82f1-6659152c3ed4" containerName="nova-scheduler-scheduler"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162120 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="account-server"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162207 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6" containerName="neutron-httpd"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162220 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="f33e10c8-8be2-46d4-8653-1960855a2a40" containerName="glance-httpd"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162231 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="e516b5cf-d44c-4f03-9247-7727319f0a85" containerName="ceilometer-central-agent"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162241 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="d77fced6-dee7-49fc-9088-e173a5be3cee" containerName="ovn-northd"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162251 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="e516b5cf-d44c-4f03-9247-7727319f0a85" containerName="proxy-httpd"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162259 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="c684f3e4-ced9-49c3-aa54-515bc2c4fb56" containerName="mariadb-account-create-update"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162268 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="42d8b8af-9916-4aba-b4e7-f825ed30f182" containerName="barbican-api-log"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162277 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="rsync"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162285 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="swift-recon-cron"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162295 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="f04e11a7-e387-4e51-b878-8633ca528b1a" containerName="ovs-vswitchd"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162304 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="object-expirer"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162311 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="04a2aca4-f98f-4ae8-aca9-62ae6625e5e2" containerName="kube-state-metrics"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162323 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcf819fc-0560-45a8-be5e-04e6ef2bbf32" containerName="nova-api-api"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162335 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1222a91-b344-4f54-bf9f-75d03d5f8549" containerName="nova-metadata-metadata"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162359 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="account-replicator"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162369 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="container-server"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162378 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="209fbdac-3b3c-451a-ae30-5888e1cbb891" containerName="cinder-api"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162388 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="42d8b8af-9916-4aba-b4e7-f825ed30f182" containerName="barbican-api"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162399 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="object-updater"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162409 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="f33e10c8-8be2-46d4-8653-1960855a2a40" containerName="glance-log"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162421 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="9965b351-ed73-4c38-b393-ff72ba48cd66" containerName="nova-cell1-conductor-conductor"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162428 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca4ebafb-aa22-44ad-8037-487a9c3baca4" containerName="cinder-scheduler"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162438 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="object-auditor"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162447 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="object-server"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162457 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="e516b5cf-d44c-4f03-9247-7727319f0a85" containerName="sg-core"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162466 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="86dc5ac4-dac8-4fdf-bba9-06d13efacd53" containerName="keystone-api"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162475 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="account-reaper"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162485 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="209fbdac-3b3c-451a-ae30-5888e1cbb891" containerName="cinder-api-log"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162497 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcf819fc-0560-45a8-be5e-04e6ef2bbf32" containerName="nova-api-log"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162507 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b3d94ce-f8d8-4653-a6b0-2682b23d834e" containerName="glance-httpd"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162514 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="e516b5cf-d44c-4f03-9247-7727319f0a85" containerName="ceilometer-notification-agent"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162525 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d4252ca-f279-4c95-8f10-205339d028a5" containerName="memcached"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162534 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8e1778b-fc9b-472f-b7ae-ccbad8fa90d6" containerName="neutron-api"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162546 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="632168fa-0532-4c7e-b688-fb361ee89ec8" containerName="container-replicator"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162557 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ba0e3ee-1309-4411-a927-866b35c2776b" containerName="rabbitmq"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.162565 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca4ebafb-aa22-44ad-8037-487a9c3baca4" containerName="probe"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.163169 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552110-n6wf2"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.168188 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.168191 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.169213 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.169943 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552110-n6wf2"]
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.348627 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t88dc\" (UniqueName: \"kubernetes.io/projected/971b7933-e72a-4c44-831f-2d63ab760bc9-kube-api-access-t88dc\") pod \"auto-csr-approver-29552110-n6wf2\" (UID: \"971b7933-e72a-4c44-831f-2d63ab760bc9\") " pod="openshift-infra/auto-csr-approver-29552110-n6wf2"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.450727 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t88dc\" (UniqueName: \"kubernetes.io/projected/971b7933-e72a-4c44-831f-2d63ab760bc9-kube-api-access-t88dc\") pod \"auto-csr-approver-29552110-n6wf2\" (UID: \"971b7933-e72a-4c44-831f-2d63ab760bc9\") " pod="openshift-infra/auto-csr-approver-29552110-n6wf2"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.486301 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t88dc\" (UniqueName: \"kubernetes.io/projected/971b7933-e72a-4c44-831f-2d63ab760bc9-kube-api-access-t88dc\") pod \"auto-csr-approver-29552110-n6wf2\" (UID: \"971b7933-e72a-4c44-831f-2d63ab760bc9\") " pod="openshift-infra/auto-csr-approver-29552110-n6wf2"
Mar 10 07:10:00 crc kubenswrapper[4825]: I0310 07:10:00.499910 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552110-n6wf2"
Mar 10 07:10:01 crc kubenswrapper[4825]: I0310 07:10:01.056756 4825 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 10 07:10:01 crc kubenswrapper[4825]: I0310 07:10:01.057805 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552110-n6wf2"]
Mar 10 07:10:01 crc kubenswrapper[4825]: I0310 07:10:01.248717 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56c74365-656c-4362-8358-bbb17d0c8be0" path="/var/lib/kubelet/pods/56c74365-656c-4362-8358-bbb17d0c8be0/volumes"
Mar 10 07:10:01 crc kubenswrapper[4825]: I0310 07:10:01.794822 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552110-n6wf2" event={"ID":"971b7933-e72a-4c44-831f-2d63ab760bc9","Type":"ContainerStarted","Data":"791027b77bb2676e6aced86af7d371500c166f68c0d3c88b3649e2e24f228b8b"}
Mar 10 07:10:02 crc kubenswrapper[4825]: I0310 07:10:02.812305 4825 generic.go:334] "Generic (PLEG): container finished" podID="971b7933-e72a-4c44-831f-2d63ab760bc9" containerID="256d5ad716b49df18c9479773dade007a478fa0a069736a036755dc714809c73" exitCode=0
Mar 10 07:10:02 crc kubenswrapper[4825]: I0310 07:10:02.812369 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552110-n6wf2" event={"ID":"971b7933-e72a-4c44-831f-2d63ab760bc9","Type":"ContainerDied","Data":"256d5ad716b49df18c9479773dade007a478fa0a069736a036755dc714809c73"}
Mar 10 07:10:04 crc kubenswrapper[4825]: I0310 07:10:04.152091 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552110-n6wf2"
Mar 10 07:10:04 crc kubenswrapper[4825]: I0310 07:10:04.323064 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t88dc\" (UniqueName: \"kubernetes.io/projected/971b7933-e72a-4c44-831f-2d63ab760bc9-kube-api-access-t88dc\") pod \"971b7933-e72a-4c44-831f-2d63ab760bc9\" (UID: \"971b7933-e72a-4c44-831f-2d63ab760bc9\") "
Mar 10 07:10:04 crc kubenswrapper[4825]: I0310 07:10:04.328808 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/971b7933-e72a-4c44-831f-2d63ab760bc9-kube-api-access-t88dc" (OuterVolumeSpecName: "kube-api-access-t88dc") pod "971b7933-e72a-4c44-831f-2d63ab760bc9" (UID: "971b7933-e72a-4c44-831f-2d63ab760bc9"). InnerVolumeSpecName "kube-api-access-t88dc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 07:10:04 crc kubenswrapper[4825]: I0310 07:10:04.425436 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t88dc\" (UniqueName: \"kubernetes.io/projected/971b7933-e72a-4c44-831f-2d63ab760bc9-kube-api-access-t88dc\") on node \"crc\" DevicePath \"\""
Mar 10 07:10:04 crc kubenswrapper[4825]: I0310 07:10:04.831670 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552110-n6wf2" event={"ID":"971b7933-e72a-4c44-831f-2d63ab760bc9","Type":"ContainerDied","Data":"791027b77bb2676e6aced86af7d371500c166f68c0d3c88b3649e2e24f228b8b"}
Mar 10 07:10:04 crc kubenswrapper[4825]: I0310 07:10:04.831800 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="791027b77bb2676e6aced86af7d371500c166f68c0d3c88b3649e2e24f228b8b"
Mar 10 07:10:04 crc kubenswrapper[4825]: I0310 07:10:04.831691 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552110-n6wf2"
Mar 10 07:10:05 crc kubenswrapper[4825]: I0310 07:10:05.230276 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552104-z7jkb"]
Mar 10 07:10:05 crc kubenswrapper[4825]: I0310 07:10:05.248910 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552104-z7jkb"]
Mar 10 07:10:07 crc kubenswrapper[4825]: I0310 07:10:07.251768 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f5ca3f7-1d76-4749-a5c9-1063d3d44886" path="/var/lib/kubelet/pods/2f5ca3f7-1d76-4749-a5c9-1063d3d44886/volumes"
Mar 10 07:10:16 crc kubenswrapper[4825]: I0310 07:10:16.888078 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 07:10:16 crc kubenswrapper[4825]: I0310 07:10:16.888775 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 07:10:16 crc kubenswrapper[4825]: I0310 07:10:16.888825 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j"
Mar 10 07:10:16 crc kubenswrapper[4825]: I0310 07:10:16.889563 4825 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c615601d462179e42a2a1b6a7ad7dd6409fbaf9c9befec571e34a43e2c4ba121"} pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 10 07:10:16 crc kubenswrapper[4825]: I0310 07:10:16.889623 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" containerID="cri-o://c615601d462179e42a2a1b6a7ad7dd6409fbaf9c9befec571e34a43e2c4ba121" gracePeriod=600
Mar 10 07:10:17 crc kubenswrapper[4825]: I0310 07:10:17.965600 4825 generic.go:334] "Generic (PLEG): container finished" podID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerID="c615601d462179e42a2a1b6a7ad7dd6409fbaf9c9befec571e34a43e2c4ba121" exitCode=0
Mar 10 07:10:17 crc kubenswrapper[4825]: I0310 07:10:17.965641 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerDied","Data":"c615601d462179e42a2a1b6a7ad7dd6409fbaf9c9befec571e34a43e2c4ba121"}
Mar 10 07:10:17 crc kubenswrapper[4825]: I0310 07:10:17.967468 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerStarted","Data":"4b318206344caaa244c87f6f76d1f2dd80b8f698eb9b562ec892c059af8f81e5"}
Mar 10 07:10:17 crc kubenswrapper[4825]: I0310 07:10:17.967531 4825 scope.go:117] "RemoveContainer" containerID="5fc08ef292fcd7ed8a89f718e5e3157a2a220984fb44d362f65a63f59128261b"
Mar 10 07:10:34 crc kubenswrapper[4825]: I0310 07:10:34.793986 4825 scope.go:117] "RemoveContainer" containerID="19eeac237e74a5220a577c3d082d7b8b22aaedda3a7f589b4665325a76f1f281"
Mar 10 07:10:34 crc kubenswrapper[4825]: I0310 07:10:34.833916 4825 scope.go:117] "RemoveContainer" containerID="7f5c45f549267da3853bd7ddfebfbea9374ef2c39b83d88180442cd2939efb42"
Mar 10 07:10:34 crc kubenswrapper[4825]: I0310 07:10:34.878321 4825 scope.go:117] "RemoveContainer" containerID="d1e097358167e39ddccce6b130e413b76f204a644ef1a982d1f7345519f8fd77"
Mar 10 07:10:34 crc kubenswrapper[4825]: I0310 07:10:34.910503 4825 scope.go:117] "RemoveContainer" containerID="a6005c6bc4da885ce83884c2c94dd7496cf366fd8be1169072b634c4a4fd77de"
Mar 10 07:10:34 crc kubenswrapper[4825]: I0310 07:10:34.935775 4825 scope.go:117] "RemoveContainer" containerID="3c1a26aa489fef14c83ee2f8dab937b7ecb3974bfcc67e5ac84499f654860d6b"
Mar 10 07:10:34 crc kubenswrapper[4825]: I0310 07:10:34.981858 4825 scope.go:117] "RemoveContainer" containerID="4021e97ed12d312ed4f3a61752f965d4945892a16e4d8c80f26b392bf7588282"
Mar 10 07:10:35 crc kubenswrapper[4825]: I0310 07:10:35.012291 4825 scope.go:117] "RemoveContainer" containerID="940fb8b9f6cdbf71b704bd64a4630d8473c614b5066613b8ce93f6601e6420ac"
Mar 10 07:10:35 crc kubenswrapper[4825]: I0310 07:10:35.035633 4825 scope.go:117] "RemoveContainer" containerID="e38be62763739c1a7ecde94467ed37f7af960422776051d7c021bd000a095079"
Mar 10 07:10:35 crc kubenswrapper[4825]: I0310 07:10:35.059151 4825 scope.go:117] "RemoveContainer" containerID="d511a3187b10669bd4c998f3a57faa64870755649eb5a3cb0b79fc7f46b76cf2"
Mar 10 07:10:35 crc kubenswrapper[4825]: I0310 07:10:35.075163 4825 scope.go:117] "RemoveContainer" containerID="edf716e37dfc32124d097f2a38586d83b24327f8a4221726babeb8c34418ddad"
Mar 10 07:11:35 crc kubenswrapper[4825]: I0310 07:11:35.352807 4825 scope.go:117] "RemoveContainer" containerID="59e7d3f59f28bf9a85655e65cea5774e18262e1f33b27b8421e9a03b0b4731ad"
Mar 10 07:11:35 crc kubenswrapper[4825]: I0310 07:11:35.390245 4825 scope.go:117] "RemoveContainer" containerID="d09ee82d5aa4a7a17bf2b371f7100ef94c15ec19c7501d5875a9d27e816de091"
Mar 10 07:11:35 crc kubenswrapper[4825]: I0310 07:11:35.432833 4825 scope.go:117] "RemoveContainer" containerID="3bd00aaa4f5f332f9aa4d37bf582366cd2e28121d59d7ff53887a4d10a61b3e2"
Mar 10 07:11:35 crc kubenswrapper[4825]: I0310 07:11:35.476990 4825 scope.go:117] "RemoveContainer" containerID="8522014e2b89450213c7c7bf32a784e2ab3f811e78e3dd033eb199f51fcc0e91"
Mar 10 07:11:35 crc kubenswrapper[4825]: I0310 07:11:35.514690 4825 scope.go:117] "RemoveContainer" containerID="703a298507850b13b032c3baeba76b06d9175a69c35e6fb7102fe7069f2b5cb9"
Mar 10 07:11:35 crc kubenswrapper[4825]: I0310 07:11:35.536659 4825 scope.go:117] "RemoveContainer" containerID="c05058771a30272a89ebdf20f49d8eaf62dce4f91bce9a630a67ad47518d7ccc"
Mar 10 07:11:35 crc kubenswrapper[4825]: I0310 07:11:35.593746 4825 scope.go:117] "RemoveContainer" containerID="d9fbc166daf6efaadd2da5517d14469cb803ed05f0994d79c6b487e8b8563e3e"
Mar 10 07:11:35 crc kubenswrapper[4825]: I0310 07:11:35.631522 4825 scope.go:117] "RemoveContainer" containerID="9e2afec1031575eab3dc1ec0e1af73607aeb7fdff1f66af902b177103558df55"
Mar 10 07:11:35 crc kubenswrapper[4825]: I0310 07:11:35.676565 4825 scope.go:117] "RemoveContainer" containerID="33eb4c1cd9c309922b3b82212e200be97e525a98f1adfa545e20e9f5febfc35a"
Mar 10 07:11:35 crc kubenswrapper[4825]: I0310 07:11:35.706164 4825 scope.go:117] "RemoveContainer" containerID="5e77a76f7d16b2669bd7d5e44e7c46216e39d10ff92e441fc9810b89690b1df6"
Mar 10 07:11:35 crc kubenswrapper[4825]: I0310 07:11:35.733596 4825 scope.go:117] "RemoveContainer" containerID="c7bdda645430db45f7b8e88e993abb18f845999fa4f97858f9f4c8547b9bd21f"
Mar 10 07:11:35 crc kubenswrapper[4825]: I0310 07:11:35.769728 4825 scope.go:117] "RemoveContainer" containerID="ee6a2dd99ef1498477db567018498314266ac3980ae3e3dff6c88841a37b0704"
Mar 10 07:11:35 crc kubenswrapper[4825]: I0310 07:11:35.793665 4825 scope.go:117] "RemoveContainer" containerID="f7ac3927b544e90a8dc6d089575a7df963d2e88feeb45b58caabdeb8c821d487"
Mar 10 07:11:35 crc kubenswrapper[4825]: I0310 07:11:35.817726 4825 scope.go:117] "RemoveContainer" containerID="9c0002c4d7294538321f66a795d740ab3d1a89b545a70e9228503bb72c6cd2b1"
Mar 10 07:11:35 crc kubenswrapper[4825]: I0310 07:11:35.860019 4825 scope.go:117] "RemoveContainer" containerID="314c7056d3c2823b0ecdeb62329c9c412680ceadc703f4aa197bcd466ad43054"
Mar 10 07:11:35 crc kubenswrapper[4825]: I0310 07:11:35.918460 4825 scope.go:117] "RemoveContainer" containerID="38a51644caf8c9a32ccb226b34fbd7261ce3347b38b99707dc8437f9e1f32f8a"
Mar 10 07:11:35 crc kubenswrapper[4825]: I0310 07:11:35.973055 4825 scope.go:117] "RemoveContainer" containerID="6ea644254c4be2d8dccc8488e346df3272810cf37b2c4aec97ada14fd9383c97"
Mar 10 07:11:47 crc kubenswrapper[4825]: I0310 07:11:47.041566 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dwlwv"]
Mar 10 07:11:47 crc kubenswrapper[4825]: E0310 07:11:47.042426 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="971b7933-e72a-4c44-831f-2d63ab760bc9" containerName="oc"
Mar 10 07:11:47 crc kubenswrapper[4825]: I0310 07:11:47.042449 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="971b7933-e72a-4c44-831f-2d63ab760bc9" containerName="oc"
Mar 10 07:11:47 crc kubenswrapper[4825]: E0310 07:11:47.042486 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c684f3e4-ced9-49c3-aa54-515bc2c4fb56" containerName="mariadb-account-create-update"
Mar 10 07:11:47 crc kubenswrapper[4825]: I0310 07:11:47.042499 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="c684f3e4-ced9-49c3-aa54-515bc2c4fb56" containerName="mariadb-account-create-update"
Mar 10 07:11:47 crc kubenswrapper[4825]: I0310 07:11:47.042773 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="971b7933-e72a-4c44-831f-2d63ab760bc9" containerName="oc"
Mar 10 07:11:47 crc kubenswrapper[4825]: I0310 07:11:47.044543 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dwlwv"
Mar 10 07:11:47 crc kubenswrapper[4825]: I0310 07:11:47.057729 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dwlwv"]
Mar 10 07:11:47 crc kubenswrapper[4825]: I0310 07:11:47.171205 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1949f5b1-56f4-499c-83dd-20f96f7f8c7f-catalog-content\") pod \"redhat-marketplace-dwlwv\" (UID: \"1949f5b1-56f4-499c-83dd-20f96f7f8c7f\") " pod="openshift-marketplace/redhat-marketplace-dwlwv"
Mar 10 07:11:47 crc kubenswrapper[4825]: I0310 07:11:47.171277 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1949f5b1-56f4-499c-83dd-20f96f7f8c7f-utilities\") pod \"redhat-marketplace-dwlwv\" (UID: \"1949f5b1-56f4-499c-83dd-20f96f7f8c7f\") " pod="openshift-marketplace/redhat-marketplace-dwlwv"
Mar 10 07:11:47 crc kubenswrapper[4825]: I0310 07:11:47.171369 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4l5h\" (UniqueName: \"kubernetes.io/projected/1949f5b1-56f4-499c-83dd-20f96f7f8c7f-kube-api-access-l4l5h\") pod \"redhat-marketplace-dwlwv\" (UID: \"1949f5b1-56f4-499c-83dd-20f96f7f8c7f\") " pod="openshift-marketplace/redhat-marketplace-dwlwv"
Mar 10 07:11:47 crc kubenswrapper[4825]: I0310 07:11:47.272692 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1949f5b1-56f4-499c-83dd-20f96f7f8c7f-catalog-content\") pod \"redhat-marketplace-dwlwv\" (UID: \"1949f5b1-56f4-499c-83dd-20f96f7f8c7f\") " pod="openshift-marketplace/redhat-marketplace-dwlwv"
Mar 10 07:11:47 crc kubenswrapper[4825]: I0310 07:11:47.272805 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1949f5b1-56f4-499c-83dd-20f96f7f8c7f-utilities\") pod \"redhat-marketplace-dwlwv\" (UID: \"1949f5b1-56f4-499c-83dd-20f96f7f8c7f\") " pod="openshift-marketplace/redhat-marketplace-dwlwv"
Mar 10 07:11:47 crc kubenswrapper[4825]: I0310 07:11:47.272949 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4l5h\" (UniqueName: \"kubernetes.io/projected/1949f5b1-56f4-499c-83dd-20f96f7f8c7f-kube-api-access-l4l5h\") pod \"redhat-marketplace-dwlwv\" (UID: \"1949f5b1-56f4-499c-83dd-20f96f7f8c7f\") " pod="openshift-marketplace/redhat-marketplace-dwlwv"
Mar 10 07:11:47 crc kubenswrapper[4825]: I0310 07:11:47.273371 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1949f5b1-56f4-499c-83dd-20f96f7f8c7f-utilities\") pod \"redhat-marketplace-dwlwv\" (UID: \"1949f5b1-56f4-499c-83dd-20f96f7f8c7f\") " pod="openshift-marketplace/redhat-marketplace-dwlwv"
Mar 10 07:11:47 crc kubenswrapper[4825]: I0310 07:11:47.273582 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1949f5b1-56f4-499c-83dd-20f96f7f8c7f-catalog-content\") pod \"redhat-marketplace-dwlwv\" (UID: \"1949f5b1-56f4-499c-83dd-20f96f7f8c7f\") " pod="openshift-marketplace/redhat-marketplace-dwlwv"
Mar 10 07:11:47 crc kubenswrapper[4825]: I0310 07:11:47.304251 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4l5h\" (UniqueName: \"kubernetes.io/projected/1949f5b1-56f4-499c-83dd-20f96f7f8c7f-kube-api-access-l4l5h\") pod \"redhat-marketplace-dwlwv\" (UID: \"1949f5b1-56f4-499c-83dd-20f96f7f8c7f\") " pod="openshift-marketplace/redhat-marketplace-dwlwv"
Mar 10 07:11:47 crc kubenswrapper[4825]: I0310 07:11:47.374085 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dwlwv" Mar 10 07:11:47 crc kubenswrapper[4825]: I0310 07:11:47.922386 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dwlwv"] Mar 10 07:11:47 crc kubenswrapper[4825]: I0310 07:11:47.976463 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dwlwv" event={"ID":"1949f5b1-56f4-499c-83dd-20f96f7f8c7f","Type":"ContainerStarted","Data":"831d14e297f2c094660ab551aace70f43a0618f4c928065c3cca6d55ad6f75b6"} Mar 10 07:11:48 crc kubenswrapper[4825]: I0310 07:11:48.987280 4825 generic.go:334] "Generic (PLEG): container finished" podID="1949f5b1-56f4-499c-83dd-20f96f7f8c7f" containerID="948e56529b0f7ac2ecf6ca07f0444af011f7d95a3c8ee9e0280959ea64dce766" exitCode=0 Mar 10 07:11:48 crc kubenswrapper[4825]: I0310 07:11:48.987372 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dwlwv" event={"ID":"1949f5b1-56f4-499c-83dd-20f96f7f8c7f","Type":"ContainerDied","Data":"948e56529b0f7ac2ecf6ca07f0444af011f7d95a3c8ee9e0280959ea64dce766"} Mar 10 07:11:49 crc kubenswrapper[4825]: I0310 07:11:49.803608 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-46l6r"] Mar 10 07:11:49 crc kubenswrapper[4825]: I0310 07:11:49.805278 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-46l6r" Mar 10 07:11:49 crc kubenswrapper[4825]: I0310 07:11:49.836117 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-46l6r"] Mar 10 07:11:49 crc kubenswrapper[4825]: I0310 07:11:49.920683 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/146e48ac-f636-4728-a5e0-4cfc7c2fea58-utilities\") pod \"certified-operators-46l6r\" (UID: \"146e48ac-f636-4728-a5e0-4cfc7c2fea58\") " pod="openshift-marketplace/certified-operators-46l6r" Mar 10 07:11:49 crc kubenswrapper[4825]: I0310 07:11:49.920865 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5f9b\" (UniqueName: \"kubernetes.io/projected/146e48ac-f636-4728-a5e0-4cfc7c2fea58-kube-api-access-n5f9b\") pod \"certified-operators-46l6r\" (UID: \"146e48ac-f636-4728-a5e0-4cfc7c2fea58\") " pod="openshift-marketplace/certified-operators-46l6r" Mar 10 07:11:49 crc kubenswrapper[4825]: I0310 07:11:49.920996 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/146e48ac-f636-4728-a5e0-4cfc7c2fea58-catalog-content\") pod \"certified-operators-46l6r\" (UID: \"146e48ac-f636-4728-a5e0-4cfc7c2fea58\") " pod="openshift-marketplace/certified-operators-46l6r" Mar 10 07:11:49 crc kubenswrapper[4825]: I0310 07:11:49.997374 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dwlwv" event={"ID":"1949f5b1-56f4-499c-83dd-20f96f7f8c7f","Type":"ContainerStarted","Data":"2118bd0693b673a9cf58063c075913bc39f8dc24a9f998527d2c7b0701214ec9"} Mar 10 07:11:50 crc kubenswrapper[4825]: I0310 07:11:50.021872 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/146e48ac-f636-4728-a5e0-4cfc7c2fea58-utilities\") pod \"certified-operators-46l6r\" (UID: \"146e48ac-f636-4728-a5e0-4cfc7c2fea58\") " pod="openshift-marketplace/certified-operators-46l6r" Mar 10 07:11:50 crc kubenswrapper[4825]: I0310 07:11:50.021953 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5f9b\" (UniqueName: \"kubernetes.io/projected/146e48ac-f636-4728-a5e0-4cfc7c2fea58-kube-api-access-n5f9b\") pod \"certified-operators-46l6r\" (UID: \"146e48ac-f636-4728-a5e0-4cfc7c2fea58\") " pod="openshift-marketplace/certified-operators-46l6r" Mar 10 07:11:50 crc kubenswrapper[4825]: I0310 07:11:50.022006 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/146e48ac-f636-4728-a5e0-4cfc7c2fea58-catalog-content\") pod \"certified-operators-46l6r\" (UID: \"146e48ac-f636-4728-a5e0-4cfc7c2fea58\") " pod="openshift-marketplace/certified-operators-46l6r" Mar 10 07:11:50 crc kubenswrapper[4825]: I0310 07:11:50.022681 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/146e48ac-f636-4728-a5e0-4cfc7c2fea58-utilities\") pod \"certified-operators-46l6r\" (UID: \"146e48ac-f636-4728-a5e0-4cfc7c2fea58\") " pod="openshift-marketplace/certified-operators-46l6r" Mar 10 07:11:50 crc kubenswrapper[4825]: I0310 07:11:50.022690 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/146e48ac-f636-4728-a5e0-4cfc7c2fea58-catalog-content\") pod \"certified-operators-46l6r\" (UID: \"146e48ac-f636-4728-a5e0-4cfc7c2fea58\") " pod="openshift-marketplace/certified-operators-46l6r" Mar 10 07:11:50 crc kubenswrapper[4825]: I0310 07:11:50.055385 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5f9b\" (UniqueName: 
\"kubernetes.io/projected/146e48ac-f636-4728-a5e0-4cfc7c2fea58-kube-api-access-n5f9b\") pod \"certified-operators-46l6r\" (UID: \"146e48ac-f636-4728-a5e0-4cfc7c2fea58\") " pod="openshift-marketplace/certified-operators-46l6r" Mar 10 07:11:50 crc kubenswrapper[4825]: I0310 07:11:50.134657 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-46l6r" Mar 10 07:11:50 crc kubenswrapper[4825]: I0310 07:11:50.679120 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-46l6r"] Mar 10 07:11:51 crc kubenswrapper[4825]: I0310 07:11:51.009956 4825 generic.go:334] "Generic (PLEG): container finished" podID="1949f5b1-56f4-499c-83dd-20f96f7f8c7f" containerID="2118bd0693b673a9cf58063c075913bc39f8dc24a9f998527d2c7b0701214ec9" exitCode=0 Mar 10 07:11:51 crc kubenswrapper[4825]: I0310 07:11:51.010037 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dwlwv" event={"ID":"1949f5b1-56f4-499c-83dd-20f96f7f8c7f","Type":"ContainerDied","Data":"2118bd0693b673a9cf58063c075913bc39f8dc24a9f998527d2c7b0701214ec9"} Mar 10 07:11:51 crc kubenswrapper[4825]: I0310 07:11:51.012295 4825 generic.go:334] "Generic (PLEG): container finished" podID="146e48ac-f636-4728-a5e0-4cfc7c2fea58" containerID="0046f4b6b09af076f92d994e5a0808158b0b94de08f2de3b9d290d32f99dfd95" exitCode=0 Mar 10 07:11:51 crc kubenswrapper[4825]: I0310 07:11:51.012344 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-46l6r" event={"ID":"146e48ac-f636-4728-a5e0-4cfc7c2fea58","Type":"ContainerDied","Data":"0046f4b6b09af076f92d994e5a0808158b0b94de08f2de3b9d290d32f99dfd95"} Mar 10 07:11:51 crc kubenswrapper[4825]: I0310 07:11:51.012375 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-46l6r" 
event={"ID":"146e48ac-f636-4728-a5e0-4cfc7c2fea58","Type":"ContainerStarted","Data":"dcf0f4cf12d32f81fa29c4b390c82507b685fa1eeca313c42894c6ffc41a459c"} Mar 10 07:11:52 crc kubenswrapper[4825]: I0310 07:11:52.023509 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dwlwv" event={"ID":"1949f5b1-56f4-499c-83dd-20f96f7f8c7f","Type":"ContainerStarted","Data":"33405ab4e5fc5f2637066f84bc93dcffc580fa4e20034e5820b63e0ae509cff3"} Mar 10 07:11:52 crc kubenswrapper[4825]: I0310 07:11:52.026695 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-46l6r" event={"ID":"146e48ac-f636-4728-a5e0-4cfc7c2fea58","Type":"ContainerStarted","Data":"201612e0d130d679809b6b2fa151d5f40daf86ee4e9f362aa958750bdb930fa6"} Mar 10 07:11:52 crc kubenswrapper[4825]: I0310 07:11:52.044438 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dwlwv" podStartSLOduration=2.636935847 podStartE2EDuration="5.044418866s" podCreationTimestamp="2026-03-10 07:11:47 +0000 UTC" firstStartedPulling="2026-03-10 07:11:48.991700288 +0000 UTC m=+1662.021480943" lastFinishedPulling="2026-03-10 07:11:51.399183347 +0000 UTC m=+1664.428963962" observedRunningTime="2026-03-10 07:11:52.043258785 +0000 UTC m=+1665.073039440" watchObservedRunningTime="2026-03-10 07:11:52.044418866 +0000 UTC m=+1665.074199491" Mar 10 07:11:53 crc kubenswrapper[4825]: I0310 07:11:53.039151 4825 generic.go:334] "Generic (PLEG): container finished" podID="146e48ac-f636-4728-a5e0-4cfc7c2fea58" containerID="201612e0d130d679809b6b2fa151d5f40daf86ee4e9f362aa958750bdb930fa6" exitCode=0 Mar 10 07:11:53 crc kubenswrapper[4825]: I0310 07:11:53.039281 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-46l6r" 
event={"ID":"146e48ac-f636-4728-a5e0-4cfc7c2fea58","Type":"ContainerDied","Data":"201612e0d130d679809b6b2fa151d5f40daf86ee4e9f362aa958750bdb930fa6"} Mar 10 07:11:54 crc kubenswrapper[4825]: I0310 07:11:54.049534 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-46l6r" event={"ID":"146e48ac-f636-4728-a5e0-4cfc7c2fea58","Type":"ContainerStarted","Data":"0bbab3d6fb346865b949ec24bedfbd9984c5539d7a600ef1a0f02ba00c9d80b9"} Mar 10 07:11:54 crc kubenswrapper[4825]: I0310 07:11:54.069337 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-46l6r" podStartSLOduration=2.6177386499999997 podStartE2EDuration="5.069319499s" podCreationTimestamp="2026-03-10 07:11:49 +0000 UTC" firstStartedPulling="2026-03-10 07:11:51.014493757 +0000 UTC m=+1664.044274372" lastFinishedPulling="2026-03-10 07:11:53.466074606 +0000 UTC m=+1666.495855221" observedRunningTime="2026-03-10 07:11:54.062962444 +0000 UTC m=+1667.092743069" watchObservedRunningTime="2026-03-10 07:11:54.069319499 +0000 UTC m=+1667.099100114" Mar 10 07:11:57 crc kubenswrapper[4825]: I0310 07:11:57.374768 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dwlwv" Mar 10 07:11:57 crc kubenswrapper[4825]: I0310 07:11:57.374853 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dwlwv" Mar 10 07:11:57 crc kubenswrapper[4825]: I0310 07:11:57.438689 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dwlwv" Mar 10 07:11:58 crc kubenswrapper[4825]: I0310 07:11:58.156553 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dwlwv" Mar 10 07:11:58 crc kubenswrapper[4825]: I0310 07:11:58.603618 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-dwlwv"] Mar 10 07:12:00 crc kubenswrapper[4825]: I0310 07:12:00.110468 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dwlwv" podUID="1949f5b1-56f4-499c-83dd-20f96f7f8c7f" containerName="registry-server" containerID="cri-o://33405ab4e5fc5f2637066f84bc93dcffc580fa4e20034e5820b63e0ae509cff3" gracePeriod=2 Mar 10 07:12:00 crc kubenswrapper[4825]: I0310 07:12:00.135017 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-46l6r" Mar 10 07:12:00 crc kubenswrapper[4825]: I0310 07:12:00.135108 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-46l6r" Mar 10 07:12:00 crc kubenswrapper[4825]: I0310 07:12:00.173669 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552112-dwgg9"] Mar 10 07:12:00 crc kubenswrapper[4825]: I0310 07:12:00.175200 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552112-dwgg9" Mar 10 07:12:00 crc kubenswrapper[4825]: I0310 07:12:00.178970 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 07:12:00 crc kubenswrapper[4825]: I0310 07:12:00.180986 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 07:12:00 crc kubenswrapper[4825]: I0310 07:12:00.181493 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 07:12:00 crc kubenswrapper[4825]: I0310 07:12:00.199869 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552112-dwgg9"] Mar 10 07:12:00 crc kubenswrapper[4825]: I0310 07:12:00.232283 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-46l6r" Mar 10 07:12:00 crc kubenswrapper[4825]: I0310 07:12:00.293310 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmhpt\" (UniqueName: \"kubernetes.io/projected/c450af65-3ce2-472b-b05b-b22c285c61b5-kube-api-access-nmhpt\") pod \"auto-csr-approver-29552112-dwgg9\" (UID: \"c450af65-3ce2-472b-b05b-b22c285c61b5\") " pod="openshift-infra/auto-csr-approver-29552112-dwgg9" Mar 10 07:12:00 crc kubenswrapper[4825]: I0310 07:12:00.400987 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmhpt\" (UniqueName: \"kubernetes.io/projected/c450af65-3ce2-472b-b05b-b22c285c61b5-kube-api-access-nmhpt\") pod \"auto-csr-approver-29552112-dwgg9\" (UID: \"c450af65-3ce2-472b-b05b-b22c285c61b5\") " pod="openshift-infra/auto-csr-approver-29552112-dwgg9" Mar 10 07:12:00 crc kubenswrapper[4825]: I0310 07:12:00.425485 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmhpt\" 
(UniqueName: \"kubernetes.io/projected/c450af65-3ce2-472b-b05b-b22c285c61b5-kube-api-access-nmhpt\") pod \"auto-csr-approver-29552112-dwgg9\" (UID: \"c450af65-3ce2-472b-b05b-b22c285c61b5\") " pod="openshift-infra/auto-csr-approver-29552112-dwgg9" Mar 10 07:12:00 crc kubenswrapper[4825]: I0310 07:12:00.558399 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552112-dwgg9" Mar 10 07:12:00 crc kubenswrapper[4825]: I0310 07:12:00.571460 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dwlwv" Mar 10 07:12:00 crc kubenswrapper[4825]: I0310 07:12:00.704702 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4l5h\" (UniqueName: \"kubernetes.io/projected/1949f5b1-56f4-499c-83dd-20f96f7f8c7f-kube-api-access-l4l5h\") pod \"1949f5b1-56f4-499c-83dd-20f96f7f8c7f\" (UID: \"1949f5b1-56f4-499c-83dd-20f96f7f8c7f\") " Mar 10 07:12:00 crc kubenswrapper[4825]: I0310 07:12:00.704768 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1949f5b1-56f4-499c-83dd-20f96f7f8c7f-catalog-content\") pod \"1949f5b1-56f4-499c-83dd-20f96f7f8c7f\" (UID: \"1949f5b1-56f4-499c-83dd-20f96f7f8c7f\") " Mar 10 07:12:00 crc kubenswrapper[4825]: I0310 07:12:00.704858 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1949f5b1-56f4-499c-83dd-20f96f7f8c7f-utilities\") pod \"1949f5b1-56f4-499c-83dd-20f96f7f8c7f\" (UID: \"1949f5b1-56f4-499c-83dd-20f96f7f8c7f\") " Mar 10 07:12:00 crc kubenswrapper[4825]: I0310 07:12:00.706413 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1949f5b1-56f4-499c-83dd-20f96f7f8c7f-utilities" (OuterVolumeSpecName: "utilities") pod "1949f5b1-56f4-499c-83dd-20f96f7f8c7f" 
(UID: "1949f5b1-56f4-499c-83dd-20f96f7f8c7f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:12:00 crc kubenswrapper[4825]: I0310 07:12:00.710934 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1949f5b1-56f4-499c-83dd-20f96f7f8c7f-kube-api-access-l4l5h" (OuterVolumeSpecName: "kube-api-access-l4l5h") pod "1949f5b1-56f4-499c-83dd-20f96f7f8c7f" (UID: "1949f5b1-56f4-499c-83dd-20f96f7f8c7f"). InnerVolumeSpecName "kube-api-access-l4l5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:12:00 crc kubenswrapper[4825]: I0310 07:12:00.736450 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1949f5b1-56f4-499c-83dd-20f96f7f8c7f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1949f5b1-56f4-499c-83dd-20f96f7f8c7f" (UID: "1949f5b1-56f4-499c-83dd-20f96f7f8c7f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:12:00 crc kubenswrapper[4825]: I0310 07:12:00.794578 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552112-dwgg9"] Mar 10 07:12:00 crc kubenswrapper[4825]: I0310 07:12:00.806906 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1949f5b1-56f4-499c-83dd-20f96f7f8c7f-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 07:12:00 crc kubenswrapper[4825]: I0310 07:12:00.806935 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4l5h\" (UniqueName: \"kubernetes.io/projected/1949f5b1-56f4-499c-83dd-20f96f7f8c7f-kube-api-access-l4l5h\") on node \"crc\" DevicePath \"\"" Mar 10 07:12:00 crc kubenswrapper[4825]: I0310 07:12:00.806949 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1949f5b1-56f4-499c-83dd-20f96f7f8c7f-catalog-content\") on 
node \"crc\" DevicePath \"\"" Mar 10 07:12:01 crc kubenswrapper[4825]: I0310 07:12:01.120645 4825 generic.go:334] "Generic (PLEG): container finished" podID="1949f5b1-56f4-499c-83dd-20f96f7f8c7f" containerID="33405ab4e5fc5f2637066f84bc93dcffc580fa4e20034e5820b63e0ae509cff3" exitCode=0 Mar 10 07:12:01 crc kubenswrapper[4825]: I0310 07:12:01.120719 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dwlwv" event={"ID":"1949f5b1-56f4-499c-83dd-20f96f7f8c7f","Type":"ContainerDied","Data":"33405ab4e5fc5f2637066f84bc93dcffc580fa4e20034e5820b63e0ae509cff3"} Mar 10 07:12:01 crc kubenswrapper[4825]: I0310 07:12:01.120751 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dwlwv" event={"ID":"1949f5b1-56f4-499c-83dd-20f96f7f8c7f","Type":"ContainerDied","Data":"831d14e297f2c094660ab551aace70f43a0618f4c928065c3cca6d55ad6f75b6"} Mar 10 07:12:01 crc kubenswrapper[4825]: I0310 07:12:01.120757 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dwlwv" Mar 10 07:12:01 crc kubenswrapper[4825]: I0310 07:12:01.120770 4825 scope.go:117] "RemoveContainer" containerID="33405ab4e5fc5f2637066f84bc93dcffc580fa4e20034e5820b63e0ae509cff3" Mar 10 07:12:01 crc kubenswrapper[4825]: I0310 07:12:01.124285 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552112-dwgg9" event={"ID":"c450af65-3ce2-472b-b05b-b22c285c61b5","Type":"ContainerStarted","Data":"c2dd18f55eb14c9b2ef1fa04e38f54bbe87e624d032c1aa28085e0157f9945de"} Mar 10 07:12:01 crc kubenswrapper[4825]: I0310 07:12:01.149439 4825 scope.go:117] "RemoveContainer" containerID="2118bd0693b673a9cf58063c075913bc39f8dc24a9f998527d2c7b0701214ec9" Mar 10 07:12:01 crc kubenswrapper[4825]: I0310 07:12:01.187188 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-46l6r" Mar 10 07:12:01 crc kubenswrapper[4825]: I0310 07:12:01.187729 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dwlwv"] Mar 10 07:12:01 crc kubenswrapper[4825]: I0310 07:12:01.194043 4825 scope.go:117] "RemoveContainer" containerID="948e56529b0f7ac2ecf6ca07f0444af011f7d95a3c8ee9e0280959ea64dce766" Mar 10 07:12:01 crc kubenswrapper[4825]: I0310 07:12:01.203533 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dwlwv"] Mar 10 07:12:01 crc kubenswrapper[4825]: I0310 07:12:01.217320 4825 scope.go:117] "RemoveContainer" containerID="33405ab4e5fc5f2637066f84bc93dcffc580fa4e20034e5820b63e0ae509cff3" Mar 10 07:12:01 crc kubenswrapper[4825]: E0310 07:12:01.217849 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33405ab4e5fc5f2637066f84bc93dcffc580fa4e20034e5820b63e0ae509cff3\": container with ID starting with 
33405ab4e5fc5f2637066f84bc93dcffc580fa4e20034e5820b63e0ae509cff3 not found: ID does not exist" containerID="33405ab4e5fc5f2637066f84bc93dcffc580fa4e20034e5820b63e0ae509cff3" Mar 10 07:12:01 crc kubenswrapper[4825]: I0310 07:12:01.217875 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33405ab4e5fc5f2637066f84bc93dcffc580fa4e20034e5820b63e0ae509cff3"} err="failed to get container status \"33405ab4e5fc5f2637066f84bc93dcffc580fa4e20034e5820b63e0ae509cff3\": rpc error: code = NotFound desc = could not find container \"33405ab4e5fc5f2637066f84bc93dcffc580fa4e20034e5820b63e0ae509cff3\": container with ID starting with 33405ab4e5fc5f2637066f84bc93dcffc580fa4e20034e5820b63e0ae509cff3 not found: ID does not exist" Mar 10 07:12:01 crc kubenswrapper[4825]: I0310 07:12:01.217897 4825 scope.go:117] "RemoveContainer" containerID="2118bd0693b673a9cf58063c075913bc39f8dc24a9f998527d2c7b0701214ec9" Mar 10 07:12:01 crc kubenswrapper[4825]: E0310 07:12:01.218484 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2118bd0693b673a9cf58063c075913bc39f8dc24a9f998527d2c7b0701214ec9\": container with ID starting with 2118bd0693b673a9cf58063c075913bc39f8dc24a9f998527d2c7b0701214ec9 not found: ID does not exist" containerID="2118bd0693b673a9cf58063c075913bc39f8dc24a9f998527d2c7b0701214ec9" Mar 10 07:12:01 crc kubenswrapper[4825]: I0310 07:12:01.218552 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2118bd0693b673a9cf58063c075913bc39f8dc24a9f998527d2c7b0701214ec9"} err="failed to get container status \"2118bd0693b673a9cf58063c075913bc39f8dc24a9f998527d2c7b0701214ec9\": rpc error: code = NotFound desc = could not find container \"2118bd0693b673a9cf58063c075913bc39f8dc24a9f998527d2c7b0701214ec9\": container with ID starting with 2118bd0693b673a9cf58063c075913bc39f8dc24a9f998527d2c7b0701214ec9 not found: ID does not 
exist" Mar 10 07:12:01 crc kubenswrapper[4825]: I0310 07:12:01.218612 4825 scope.go:117] "RemoveContainer" containerID="948e56529b0f7ac2ecf6ca07f0444af011f7d95a3c8ee9e0280959ea64dce766" Mar 10 07:12:01 crc kubenswrapper[4825]: E0310 07:12:01.218991 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"948e56529b0f7ac2ecf6ca07f0444af011f7d95a3c8ee9e0280959ea64dce766\": container with ID starting with 948e56529b0f7ac2ecf6ca07f0444af011f7d95a3c8ee9e0280959ea64dce766 not found: ID does not exist" containerID="948e56529b0f7ac2ecf6ca07f0444af011f7d95a3c8ee9e0280959ea64dce766" Mar 10 07:12:01 crc kubenswrapper[4825]: I0310 07:12:01.219018 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"948e56529b0f7ac2ecf6ca07f0444af011f7d95a3c8ee9e0280959ea64dce766"} err="failed to get container status \"948e56529b0f7ac2ecf6ca07f0444af011f7d95a3c8ee9e0280959ea64dce766\": rpc error: code = NotFound desc = could not find container \"948e56529b0f7ac2ecf6ca07f0444af011f7d95a3c8ee9e0280959ea64dce766\": container with ID starting with 948e56529b0f7ac2ecf6ca07f0444af011f7d95a3c8ee9e0280959ea64dce766 not found: ID does not exist" Mar 10 07:12:01 crc kubenswrapper[4825]: I0310 07:12:01.249243 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1949f5b1-56f4-499c-83dd-20f96f7f8c7f" path="/var/lib/kubelet/pods/1949f5b1-56f4-499c-83dd-20f96f7f8c7f/volumes" Mar 10 07:12:02 crc kubenswrapper[4825]: I0310 07:12:02.139793 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552112-dwgg9" event={"ID":"c450af65-3ce2-472b-b05b-b22c285c61b5","Type":"ContainerStarted","Data":"834470ca53cf68f911b932bdf9b193654a066d9dbe28b6b5f1e5b648d0739ae9"} Mar 10 07:12:02 crc kubenswrapper[4825]: I0310 07:12:02.172015 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-infra/auto-csr-approver-29552112-dwgg9" podStartSLOduration=1.315989057 podStartE2EDuration="2.171996276s" podCreationTimestamp="2026-03-10 07:12:00 +0000 UTC" firstStartedPulling="2026-03-10 07:12:00.802239969 +0000 UTC m=+1673.832020584" lastFinishedPulling="2026-03-10 07:12:01.658247158 +0000 UTC m=+1674.688027803" observedRunningTime="2026-03-10 07:12:02.161537713 +0000 UTC m=+1675.191318348" watchObservedRunningTime="2026-03-10 07:12:02.171996276 +0000 UTC m=+1675.201776901" Mar 10 07:12:03 crc kubenswrapper[4825]: I0310 07:12:03.152010 4825 generic.go:334] "Generic (PLEG): container finished" podID="c450af65-3ce2-472b-b05b-b22c285c61b5" containerID="834470ca53cf68f911b932bdf9b193654a066d9dbe28b6b5f1e5b648d0739ae9" exitCode=0 Mar 10 07:12:03 crc kubenswrapper[4825]: I0310 07:12:03.152092 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552112-dwgg9" event={"ID":"c450af65-3ce2-472b-b05b-b22c285c61b5","Type":"ContainerDied","Data":"834470ca53cf68f911b932bdf9b193654a066d9dbe28b6b5f1e5b648d0739ae9"} Mar 10 07:12:03 crc kubenswrapper[4825]: I0310 07:12:03.597070 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-46l6r"] Mar 10 07:12:03 crc kubenswrapper[4825]: I0310 07:12:03.597322 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-46l6r" podUID="146e48ac-f636-4728-a5e0-4cfc7c2fea58" containerName="registry-server" containerID="cri-o://0bbab3d6fb346865b949ec24bedfbd9984c5539d7a600ef1a0f02ba00c9d80b9" gracePeriod=2 Mar 10 07:12:04 crc kubenswrapper[4825]: I0310 07:12:04.053182 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-46l6r" Mar 10 07:12:04 crc kubenswrapper[4825]: I0310 07:12:04.161631 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/146e48ac-f636-4728-a5e0-4cfc7c2fea58-catalog-content\") pod \"146e48ac-f636-4728-a5e0-4cfc7c2fea58\" (UID: \"146e48ac-f636-4728-a5e0-4cfc7c2fea58\") " Mar 10 07:12:04 crc kubenswrapper[4825]: I0310 07:12:04.161752 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/146e48ac-f636-4728-a5e0-4cfc7c2fea58-utilities\") pod \"146e48ac-f636-4728-a5e0-4cfc7c2fea58\" (UID: \"146e48ac-f636-4728-a5e0-4cfc7c2fea58\") " Mar 10 07:12:04 crc kubenswrapper[4825]: I0310 07:12:04.161811 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5f9b\" (UniqueName: \"kubernetes.io/projected/146e48ac-f636-4728-a5e0-4cfc7c2fea58-kube-api-access-n5f9b\") pod \"146e48ac-f636-4728-a5e0-4cfc7c2fea58\" (UID: \"146e48ac-f636-4728-a5e0-4cfc7c2fea58\") " Mar 10 07:12:04 crc kubenswrapper[4825]: I0310 07:12:04.164298 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/146e48ac-f636-4728-a5e0-4cfc7c2fea58-utilities" (OuterVolumeSpecName: "utilities") pod "146e48ac-f636-4728-a5e0-4cfc7c2fea58" (UID: "146e48ac-f636-4728-a5e0-4cfc7c2fea58"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:12:04 crc kubenswrapper[4825]: I0310 07:12:04.168447 4825 generic.go:334] "Generic (PLEG): container finished" podID="146e48ac-f636-4728-a5e0-4cfc7c2fea58" containerID="0bbab3d6fb346865b949ec24bedfbd9984c5539d7a600ef1a0f02ba00c9d80b9" exitCode=0 Mar 10 07:12:04 crc kubenswrapper[4825]: I0310 07:12:04.168497 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-46l6r" Mar 10 07:12:04 crc kubenswrapper[4825]: I0310 07:12:04.168537 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-46l6r" event={"ID":"146e48ac-f636-4728-a5e0-4cfc7c2fea58","Type":"ContainerDied","Data":"0bbab3d6fb346865b949ec24bedfbd9984c5539d7a600ef1a0f02ba00c9d80b9"} Mar 10 07:12:04 crc kubenswrapper[4825]: I0310 07:12:04.168588 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-46l6r" event={"ID":"146e48ac-f636-4728-a5e0-4cfc7c2fea58","Type":"ContainerDied","Data":"dcf0f4cf12d32f81fa29c4b390c82507b685fa1eeca313c42894c6ffc41a459c"} Mar 10 07:12:04 crc kubenswrapper[4825]: I0310 07:12:04.168618 4825 scope.go:117] "RemoveContainer" containerID="0bbab3d6fb346865b949ec24bedfbd9984c5539d7a600ef1a0f02ba00c9d80b9" Mar 10 07:12:04 crc kubenswrapper[4825]: I0310 07:12:04.171793 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/146e48ac-f636-4728-a5e0-4cfc7c2fea58-kube-api-access-n5f9b" (OuterVolumeSpecName: "kube-api-access-n5f9b") pod "146e48ac-f636-4728-a5e0-4cfc7c2fea58" (UID: "146e48ac-f636-4728-a5e0-4cfc7c2fea58"). InnerVolumeSpecName "kube-api-access-n5f9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:12:04 crc kubenswrapper[4825]: I0310 07:12:04.232328 4825 scope.go:117] "RemoveContainer" containerID="201612e0d130d679809b6b2fa151d5f40daf86ee4e9f362aa958750bdb930fa6" Mar 10 07:12:04 crc kubenswrapper[4825]: I0310 07:12:04.234953 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/146e48ac-f636-4728-a5e0-4cfc7c2fea58-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "146e48ac-f636-4728-a5e0-4cfc7c2fea58" (UID: "146e48ac-f636-4728-a5e0-4cfc7c2fea58"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:12:04 crc kubenswrapper[4825]: I0310 07:12:04.264398 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/146e48ac-f636-4728-a5e0-4cfc7c2fea58-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 07:12:04 crc kubenswrapper[4825]: I0310 07:12:04.264488 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/146e48ac-f636-4728-a5e0-4cfc7c2fea58-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 07:12:04 crc kubenswrapper[4825]: I0310 07:12:04.264511 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5f9b\" (UniqueName: \"kubernetes.io/projected/146e48ac-f636-4728-a5e0-4cfc7c2fea58-kube-api-access-n5f9b\") on node \"crc\" DevicePath \"\"" Mar 10 07:12:04 crc kubenswrapper[4825]: I0310 07:12:04.267564 4825 scope.go:117] "RemoveContainer" containerID="0046f4b6b09af076f92d994e5a0808158b0b94de08f2de3b9d290d32f99dfd95" Mar 10 07:12:04 crc kubenswrapper[4825]: I0310 07:12:04.310463 4825 scope.go:117] "RemoveContainer" containerID="0bbab3d6fb346865b949ec24bedfbd9984c5539d7a600ef1a0f02ba00c9d80b9" Mar 10 07:12:04 crc kubenswrapper[4825]: E0310 07:12:04.311482 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bbab3d6fb346865b949ec24bedfbd9984c5539d7a600ef1a0f02ba00c9d80b9\": container with ID starting with 0bbab3d6fb346865b949ec24bedfbd9984c5539d7a600ef1a0f02ba00c9d80b9 not found: ID does not exist" containerID="0bbab3d6fb346865b949ec24bedfbd9984c5539d7a600ef1a0f02ba00c9d80b9" Mar 10 07:12:04 crc kubenswrapper[4825]: I0310 07:12:04.311561 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bbab3d6fb346865b949ec24bedfbd9984c5539d7a600ef1a0f02ba00c9d80b9"} err="failed to get container status 
\"0bbab3d6fb346865b949ec24bedfbd9984c5539d7a600ef1a0f02ba00c9d80b9\": rpc error: code = NotFound desc = could not find container \"0bbab3d6fb346865b949ec24bedfbd9984c5539d7a600ef1a0f02ba00c9d80b9\": container with ID starting with 0bbab3d6fb346865b949ec24bedfbd9984c5539d7a600ef1a0f02ba00c9d80b9 not found: ID does not exist" Mar 10 07:12:04 crc kubenswrapper[4825]: I0310 07:12:04.311610 4825 scope.go:117] "RemoveContainer" containerID="201612e0d130d679809b6b2fa151d5f40daf86ee4e9f362aa958750bdb930fa6" Mar 10 07:12:04 crc kubenswrapper[4825]: E0310 07:12:04.313534 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"201612e0d130d679809b6b2fa151d5f40daf86ee4e9f362aa958750bdb930fa6\": container with ID starting with 201612e0d130d679809b6b2fa151d5f40daf86ee4e9f362aa958750bdb930fa6 not found: ID does not exist" containerID="201612e0d130d679809b6b2fa151d5f40daf86ee4e9f362aa958750bdb930fa6" Mar 10 07:12:04 crc kubenswrapper[4825]: I0310 07:12:04.313603 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"201612e0d130d679809b6b2fa151d5f40daf86ee4e9f362aa958750bdb930fa6"} err="failed to get container status \"201612e0d130d679809b6b2fa151d5f40daf86ee4e9f362aa958750bdb930fa6\": rpc error: code = NotFound desc = could not find container \"201612e0d130d679809b6b2fa151d5f40daf86ee4e9f362aa958750bdb930fa6\": container with ID starting with 201612e0d130d679809b6b2fa151d5f40daf86ee4e9f362aa958750bdb930fa6 not found: ID does not exist" Mar 10 07:12:04 crc kubenswrapper[4825]: I0310 07:12:04.313648 4825 scope.go:117] "RemoveContainer" containerID="0046f4b6b09af076f92d994e5a0808158b0b94de08f2de3b9d290d32f99dfd95" Mar 10 07:12:04 crc kubenswrapper[4825]: E0310 07:12:04.314052 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0046f4b6b09af076f92d994e5a0808158b0b94de08f2de3b9d290d32f99dfd95\": container with ID starting with 0046f4b6b09af076f92d994e5a0808158b0b94de08f2de3b9d290d32f99dfd95 not found: ID does not exist" containerID="0046f4b6b09af076f92d994e5a0808158b0b94de08f2de3b9d290d32f99dfd95" Mar 10 07:12:04 crc kubenswrapper[4825]: I0310 07:12:04.314117 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0046f4b6b09af076f92d994e5a0808158b0b94de08f2de3b9d290d32f99dfd95"} err="failed to get container status \"0046f4b6b09af076f92d994e5a0808158b0b94de08f2de3b9d290d32f99dfd95\": rpc error: code = NotFound desc = could not find container \"0046f4b6b09af076f92d994e5a0808158b0b94de08f2de3b9d290d32f99dfd95\": container with ID starting with 0046f4b6b09af076f92d994e5a0808158b0b94de08f2de3b9d290d32f99dfd95 not found: ID does not exist" Mar 10 07:12:04 crc kubenswrapper[4825]: I0310 07:12:04.445440 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552112-dwgg9" Mar 10 07:12:04 crc kubenswrapper[4825]: I0310 07:12:04.508792 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-46l6r"] Mar 10 07:12:04 crc kubenswrapper[4825]: I0310 07:12:04.513584 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-46l6r"] Mar 10 07:12:04 crc kubenswrapper[4825]: I0310 07:12:04.569415 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmhpt\" (UniqueName: \"kubernetes.io/projected/c450af65-3ce2-472b-b05b-b22c285c61b5-kube-api-access-nmhpt\") pod \"c450af65-3ce2-472b-b05b-b22c285c61b5\" (UID: \"c450af65-3ce2-472b-b05b-b22c285c61b5\") " Mar 10 07:12:04 crc kubenswrapper[4825]: I0310 07:12:04.574511 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/c450af65-3ce2-472b-b05b-b22c285c61b5-kube-api-access-nmhpt" (OuterVolumeSpecName: "kube-api-access-nmhpt") pod "c450af65-3ce2-472b-b05b-b22c285c61b5" (UID: "c450af65-3ce2-472b-b05b-b22c285c61b5"). InnerVolumeSpecName "kube-api-access-nmhpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:12:04 crc kubenswrapper[4825]: I0310 07:12:04.671614 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmhpt\" (UniqueName: \"kubernetes.io/projected/c450af65-3ce2-472b-b05b-b22c285c61b5-kube-api-access-nmhpt\") on node \"crc\" DevicePath \"\"" Mar 10 07:12:05 crc kubenswrapper[4825]: I0310 07:12:05.180752 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552112-dwgg9" event={"ID":"c450af65-3ce2-472b-b05b-b22c285c61b5","Type":"ContainerDied","Data":"c2dd18f55eb14c9b2ef1fa04e38f54bbe87e624d032c1aa28085e0157f9945de"} Mar 10 07:12:05 crc kubenswrapper[4825]: I0310 07:12:05.180798 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552112-dwgg9" Mar 10 07:12:05 crc kubenswrapper[4825]: I0310 07:12:05.180827 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2dd18f55eb14c9b2ef1fa04e38f54bbe87e624d032c1aa28085e0157f9945de" Mar 10 07:12:05 crc kubenswrapper[4825]: I0310 07:12:05.252580 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="146e48ac-f636-4728-a5e0-4cfc7c2fea58" path="/var/lib/kubelet/pods/146e48ac-f636-4728-a5e0-4cfc7c2fea58/volumes" Mar 10 07:12:05 crc kubenswrapper[4825]: I0310 07:12:05.269849 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552106-kjbwd"] Mar 10 07:12:05 crc kubenswrapper[4825]: I0310 07:12:05.281361 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552106-kjbwd"] Mar 10 07:12:07 crc kubenswrapper[4825]: I0310 07:12:07.251191 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ead0475-146d-4055-9e56-ab3ba945041c" path="/var/lib/kubelet/pods/7ead0475-146d-4055-9e56-ab3ba945041c/volumes" Mar 10 07:12:24 crc kubenswrapper[4825]: I0310 07:12:24.406232 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nzcbh"] Mar 10 07:12:24 crc kubenswrapper[4825]: E0310 07:12:24.407475 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="146e48ac-f636-4728-a5e0-4cfc7c2fea58" containerName="registry-server" Mar 10 07:12:24 crc kubenswrapper[4825]: I0310 07:12:24.407497 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="146e48ac-f636-4728-a5e0-4cfc7c2fea58" containerName="registry-server" Mar 10 07:12:24 crc kubenswrapper[4825]: E0310 07:12:24.407519 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="146e48ac-f636-4728-a5e0-4cfc7c2fea58" containerName="extract-content" Mar 10 07:12:24 crc kubenswrapper[4825]: I0310 07:12:24.407532 4825 
state_mem.go:107] "Deleted CPUSet assignment" podUID="146e48ac-f636-4728-a5e0-4cfc7c2fea58" containerName="extract-content" Mar 10 07:12:24 crc kubenswrapper[4825]: E0310 07:12:24.407557 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1949f5b1-56f4-499c-83dd-20f96f7f8c7f" containerName="extract-utilities" Mar 10 07:12:24 crc kubenswrapper[4825]: I0310 07:12:24.407569 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="1949f5b1-56f4-499c-83dd-20f96f7f8c7f" containerName="extract-utilities" Mar 10 07:12:24 crc kubenswrapper[4825]: E0310 07:12:24.407584 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1949f5b1-56f4-499c-83dd-20f96f7f8c7f" containerName="extract-content" Mar 10 07:12:24 crc kubenswrapper[4825]: I0310 07:12:24.407598 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="1949f5b1-56f4-499c-83dd-20f96f7f8c7f" containerName="extract-content" Mar 10 07:12:24 crc kubenswrapper[4825]: E0310 07:12:24.407619 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1949f5b1-56f4-499c-83dd-20f96f7f8c7f" containerName="registry-server" Mar 10 07:12:24 crc kubenswrapper[4825]: I0310 07:12:24.407631 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="1949f5b1-56f4-499c-83dd-20f96f7f8c7f" containerName="registry-server" Mar 10 07:12:24 crc kubenswrapper[4825]: E0310 07:12:24.407659 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c450af65-3ce2-472b-b05b-b22c285c61b5" containerName="oc" Mar 10 07:12:24 crc kubenswrapper[4825]: I0310 07:12:24.407672 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="c450af65-3ce2-472b-b05b-b22c285c61b5" containerName="oc" Mar 10 07:12:24 crc kubenswrapper[4825]: E0310 07:12:24.407691 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="146e48ac-f636-4728-a5e0-4cfc7c2fea58" containerName="extract-utilities" Mar 10 07:12:24 crc kubenswrapper[4825]: I0310 07:12:24.407703 4825 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="146e48ac-f636-4728-a5e0-4cfc7c2fea58" containerName="extract-utilities" Mar 10 07:12:24 crc kubenswrapper[4825]: I0310 07:12:24.407944 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="146e48ac-f636-4728-a5e0-4cfc7c2fea58" containerName="registry-server" Mar 10 07:12:24 crc kubenswrapper[4825]: I0310 07:12:24.407966 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="1949f5b1-56f4-499c-83dd-20f96f7f8c7f" containerName="registry-server" Mar 10 07:12:24 crc kubenswrapper[4825]: I0310 07:12:24.407983 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="c450af65-3ce2-472b-b05b-b22c285c61b5" containerName="oc" Mar 10 07:12:24 crc kubenswrapper[4825]: I0310 07:12:24.412548 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nzcbh" Mar 10 07:12:24 crc kubenswrapper[4825]: I0310 07:12:24.430167 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nzcbh"] Mar 10 07:12:24 crc kubenswrapper[4825]: I0310 07:12:24.518258 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0cc3c9f-d4a5-4942-8182-7cc13f657ec8-catalog-content\") pod \"community-operators-nzcbh\" (UID: \"d0cc3c9f-d4a5-4942-8182-7cc13f657ec8\") " pod="openshift-marketplace/community-operators-nzcbh" Mar 10 07:12:24 crc kubenswrapper[4825]: I0310 07:12:24.518361 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0cc3c9f-d4a5-4942-8182-7cc13f657ec8-utilities\") pod \"community-operators-nzcbh\" (UID: \"d0cc3c9f-d4a5-4942-8182-7cc13f657ec8\") " pod="openshift-marketplace/community-operators-nzcbh" Mar 10 07:12:24 crc kubenswrapper[4825]: I0310 07:12:24.518404 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff4jp\" (UniqueName: \"kubernetes.io/projected/d0cc3c9f-d4a5-4942-8182-7cc13f657ec8-kube-api-access-ff4jp\") pod \"community-operators-nzcbh\" (UID: \"d0cc3c9f-d4a5-4942-8182-7cc13f657ec8\") " pod="openshift-marketplace/community-operators-nzcbh" Mar 10 07:12:24 crc kubenswrapper[4825]: I0310 07:12:24.619899 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0cc3c9f-d4a5-4942-8182-7cc13f657ec8-utilities\") pod \"community-operators-nzcbh\" (UID: \"d0cc3c9f-d4a5-4942-8182-7cc13f657ec8\") " pod="openshift-marketplace/community-operators-nzcbh" Mar 10 07:12:24 crc kubenswrapper[4825]: I0310 07:12:24.620282 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff4jp\" (UniqueName: \"kubernetes.io/projected/d0cc3c9f-d4a5-4942-8182-7cc13f657ec8-kube-api-access-ff4jp\") pod \"community-operators-nzcbh\" (UID: \"d0cc3c9f-d4a5-4942-8182-7cc13f657ec8\") " pod="openshift-marketplace/community-operators-nzcbh" Mar 10 07:12:24 crc kubenswrapper[4825]: I0310 07:12:24.620496 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0cc3c9f-d4a5-4942-8182-7cc13f657ec8-catalog-content\") pod \"community-operators-nzcbh\" (UID: \"d0cc3c9f-d4a5-4942-8182-7cc13f657ec8\") " pod="openshift-marketplace/community-operators-nzcbh" Mar 10 07:12:24 crc kubenswrapper[4825]: I0310 07:12:24.620508 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0cc3c9f-d4a5-4942-8182-7cc13f657ec8-utilities\") pod \"community-operators-nzcbh\" (UID: \"d0cc3c9f-d4a5-4942-8182-7cc13f657ec8\") " pod="openshift-marketplace/community-operators-nzcbh" Mar 10 07:12:24 crc kubenswrapper[4825]: I0310 07:12:24.620955 4825 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0cc3c9f-d4a5-4942-8182-7cc13f657ec8-catalog-content\") pod \"community-operators-nzcbh\" (UID: \"d0cc3c9f-d4a5-4942-8182-7cc13f657ec8\") " pod="openshift-marketplace/community-operators-nzcbh" Mar 10 07:12:24 crc kubenswrapper[4825]: I0310 07:12:24.637673 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff4jp\" (UniqueName: \"kubernetes.io/projected/d0cc3c9f-d4a5-4942-8182-7cc13f657ec8-kube-api-access-ff4jp\") pod \"community-operators-nzcbh\" (UID: \"d0cc3c9f-d4a5-4942-8182-7cc13f657ec8\") " pod="openshift-marketplace/community-operators-nzcbh" Mar 10 07:12:24 crc kubenswrapper[4825]: I0310 07:12:24.737780 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nzcbh" Mar 10 07:12:25 crc kubenswrapper[4825]: I0310 07:12:25.085196 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nzcbh"] Mar 10 07:12:25 crc kubenswrapper[4825]: I0310 07:12:25.416723 4825 generic.go:334] "Generic (PLEG): container finished" podID="d0cc3c9f-d4a5-4942-8182-7cc13f657ec8" containerID="aafd62cc48d5da6ebf84043771c00cf3fec7ae1051613bc627c972d766912060" exitCode=0 Mar 10 07:12:25 crc kubenswrapper[4825]: I0310 07:12:25.416805 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nzcbh" event={"ID":"d0cc3c9f-d4a5-4942-8182-7cc13f657ec8","Type":"ContainerDied","Data":"aafd62cc48d5da6ebf84043771c00cf3fec7ae1051613bc627c972d766912060"} Mar 10 07:12:25 crc kubenswrapper[4825]: I0310 07:12:25.417080 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nzcbh" event={"ID":"d0cc3c9f-d4a5-4942-8182-7cc13f657ec8","Type":"ContainerStarted","Data":"52c4e79d14c48a7b3cdd8f5659820ff0099c2c79efb32663f793b4cf8ba578fe"} Mar 10 07:12:26 crc 
kubenswrapper[4825]: I0310 07:12:26.429706 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nzcbh" event={"ID":"d0cc3c9f-d4a5-4942-8182-7cc13f657ec8","Type":"ContainerStarted","Data":"bc27b58443c53b78ba04498d5de74a9b37fb02c8b7731cc9452091e26132ab49"} Mar 10 07:12:27 crc kubenswrapper[4825]: I0310 07:12:27.442423 4825 generic.go:334] "Generic (PLEG): container finished" podID="d0cc3c9f-d4a5-4942-8182-7cc13f657ec8" containerID="bc27b58443c53b78ba04498d5de74a9b37fb02c8b7731cc9452091e26132ab49" exitCode=0 Mar 10 07:12:27 crc kubenswrapper[4825]: I0310 07:12:27.442507 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nzcbh" event={"ID":"d0cc3c9f-d4a5-4942-8182-7cc13f657ec8","Type":"ContainerDied","Data":"bc27b58443c53b78ba04498d5de74a9b37fb02c8b7731cc9452091e26132ab49"} Mar 10 07:12:28 crc kubenswrapper[4825]: I0310 07:12:28.455707 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nzcbh" event={"ID":"d0cc3c9f-d4a5-4942-8182-7cc13f657ec8","Type":"ContainerStarted","Data":"685f1cbe19fecac12b24a2710648387db0e36b4c07244c32489bfbf37140cdfe"} Mar 10 07:12:28 crc kubenswrapper[4825]: I0310 07:12:28.488663 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nzcbh" podStartSLOduration=1.960832844 podStartE2EDuration="4.488626483s" podCreationTimestamp="2026-03-10 07:12:24 +0000 UTC" firstStartedPulling="2026-03-10 07:12:25.418519712 +0000 UTC m=+1698.448300327" lastFinishedPulling="2026-03-10 07:12:27.946313311 +0000 UTC m=+1700.976093966" observedRunningTime="2026-03-10 07:12:28.482461582 +0000 UTC m=+1701.512242227" watchObservedRunningTime="2026-03-10 07:12:28.488626483 +0000 UTC m=+1701.518407138" Mar 10 07:12:34 crc kubenswrapper[4825]: I0310 07:12:34.737875 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-nzcbh" Mar 10 07:12:34 crc kubenswrapper[4825]: I0310 07:12:34.738465 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nzcbh" Mar 10 07:12:34 crc kubenswrapper[4825]: I0310 07:12:34.820096 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nzcbh" Mar 10 07:12:35 crc kubenswrapper[4825]: I0310 07:12:35.605419 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nzcbh" Mar 10 07:12:35 crc kubenswrapper[4825]: I0310 07:12:35.681720 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nzcbh"] Mar 10 07:12:36 crc kubenswrapper[4825]: I0310 07:12:36.353063 4825 scope.go:117] "RemoveContainer" containerID="c79985f0c20e51e94ef65276b3b0679ff00e78c8af4eb7e94defcd1ac905c499" Mar 10 07:12:36 crc kubenswrapper[4825]: I0310 07:12:36.385307 4825 scope.go:117] "RemoveContainer" containerID="c31f3d6f869e260c7cfef4fcd38079b721d8d8d8bf13e9740b4a1faedbf0b849" Mar 10 07:12:36 crc kubenswrapper[4825]: I0310 07:12:36.429401 4825 scope.go:117] "RemoveContainer" containerID="7c7257e4058fc753891db09721efcb315ea6caeeab1b10d2c3b36746db13b108" Mar 10 07:12:36 crc kubenswrapper[4825]: I0310 07:12:36.481273 4825 scope.go:117] "RemoveContainer" containerID="3d8dc26b029c96b4d27209c0234e0de2c19670232d57153317bfaac26bfd5e16" Mar 10 07:12:36 crc kubenswrapper[4825]: I0310 07:12:36.505598 4825 scope.go:117] "RemoveContainer" containerID="ff574575ba140cd21c83cc1ec9df541d4e5a9107a11a2d6cd35ab129d3763676" Mar 10 07:12:36 crc kubenswrapper[4825]: I0310 07:12:36.555974 4825 scope.go:117] "RemoveContainer" containerID="6b20962e48b9baf3eaa366f3b6fd06bf9bf5f98ff0a301cbbdd95be4fc14ca33" Mar 10 07:12:36 crc kubenswrapper[4825]: I0310 07:12:36.593787 4825 scope.go:117] "RemoveContainer" 
containerID="d070d8dc5655990ee6a75c959e20121a68217564d45799abf7aa61ef7c8cb442" Mar 10 07:12:36 crc kubenswrapper[4825]: I0310 07:12:36.676517 4825 scope.go:117] "RemoveContainer" containerID="b62f07a8427cfba96a047aaa4a6ebd9cd84408e32b9a49b383d2817f4848d858" Mar 10 07:12:37 crc kubenswrapper[4825]: I0310 07:12:37.551094 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nzcbh" podUID="d0cc3c9f-d4a5-4942-8182-7cc13f657ec8" containerName="registry-server" containerID="cri-o://685f1cbe19fecac12b24a2710648387db0e36b4c07244c32489bfbf37140cdfe" gracePeriod=2 Mar 10 07:12:38 crc kubenswrapper[4825]: I0310 07:12:38.059909 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nzcbh" Mar 10 07:12:38 crc kubenswrapper[4825]: I0310 07:12:38.240901 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ff4jp\" (UniqueName: \"kubernetes.io/projected/d0cc3c9f-d4a5-4942-8182-7cc13f657ec8-kube-api-access-ff4jp\") pod \"d0cc3c9f-d4a5-4942-8182-7cc13f657ec8\" (UID: \"d0cc3c9f-d4a5-4942-8182-7cc13f657ec8\") " Mar 10 07:12:38 crc kubenswrapper[4825]: I0310 07:12:38.243218 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0cc3c9f-d4a5-4942-8182-7cc13f657ec8-catalog-content\") pod \"d0cc3c9f-d4a5-4942-8182-7cc13f657ec8\" (UID: \"d0cc3c9f-d4a5-4942-8182-7cc13f657ec8\") " Mar 10 07:12:38 crc kubenswrapper[4825]: I0310 07:12:38.243342 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0cc3c9f-d4a5-4942-8182-7cc13f657ec8-utilities\") pod \"d0cc3c9f-d4a5-4942-8182-7cc13f657ec8\" (UID: \"d0cc3c9f-d4a5-4942-8182-7cc13f657ec8\") " Mar 10 07:12:38 crc kubenswrapper[4825]: I0310 07:12:38.247206 4825 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0cc3c9f-d4a5-4942-8182-7cc13f657ec8-utilities" (OuterVolumeSpecName: "utilities") pod "d0cc3c9f-d4a5-4942-8182-7cc13f657ec8" (UID: "d0cc3c9f-d4a5-4942-8182-7cc13f657ec8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:12:38 crc kubenswrapper[4825]: I0310 07:12:38.252344 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0cc3c9f-d4a5-4942-8182-7cc13f657ec8-kube-api-access-ff4jp" (OuterVolumeSpecName: "kube-api-access-ff4jp") pod "d0cc3c9f-d4a5-4942-8182-7cc13f657ec8" (UID: "d0cc3c9f-d4a5-4942-8182-7cc13f657ec8"). InnerVolumeSpecName "kube-api-access-ff4jp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:12:38 crc kubenswrapper[4825]: I0310 07:12:38.334965 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0cc3c9f-d4a5-4942-8182-7cc13f657ec8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d0cc3c9f-d4a5-4942-8182-7cc13f657ec8" (UID: "d0cc3c9f-d4a5-4942-8182-7cc13f657ec8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:12:38 crc kubenswrapper[4825]: I0310 07:12:38.345568 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ff4jp\" (UniqueName: \"kubernetes.io/projected/d0cc3c9f-d4a5-4942-8182-7cc13f657ec8-kube-api-access-ff4jp\") on node \"crc\" DevicePath \"\"" Mar 10 07:12:38 crc kubenswrapper[4825]: I0310 07:12:38.345616 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0cc3c9f-d4a5-4942-8182-7cc13f657ec8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 07:12:38 crc kubenswrapper[4825]: I0310 07:12:38.345638 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0cc3c9f-d4a5-4942-8182-7cc13f657ec8-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 07:12:38 crc kubenswrapper[4825]: I0310 07:12:38.575328 4825 generic.go:334] "Generic (PLEG): container finished" podID="d0cc3c9f-d4a5-4942-8182-7cc13f657ec8" containerID="685f1cbe19fecac12b24a2710648387db0e36b4c07244c32489bfbf37140cdfe" exitCode=0 Mar 10 07:12:38 crc kubenswrapper[4825]: I0310 07:12:38.575393 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nzcbh" event={"ID":"d0cc3c9f-d4a5-4942-8182-7cc13f657ec8","Type":"ContainerDied","Data":"685f1cbe19fecac12b24a2710648387db0e36b4c07244c32489bfbf37140cdfe"} Mar 10 07:12:38 crc kubenswrapper[4825]: I0310 07:12:38.575403 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nzcbh" Mar 10 07:12:38 crc kubenswrapper[4825]: I0310 07:12:38.575435 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nzcbh" event={"ID":"d0cc3c9f-d4a5-4942-8182-7cc13f657ec8","Type":"ContainerDied","Data":"52c4e79d14c48a7b3cdd8f5659820ff0099c2c79efb32663f793b4cf8ba578fe"} Mar 10 07:12:38 crc kubenswrapper[4825]: I0310 07:12:38.575461 4825 scope.go:117] "RemoveContainer" containerID="685f1cbe19fecac12b24a2710648387db0e36b4c07244c32489bfbf37140cdfe" Mar 10 07:12:38 crc kubenswrapper[4825]: I0310 07:12:38.598764 4825 scope.go:117] "RemoveContainer" containerID="bc27b58443c53b78ba04498d5de74a9b37fb02c8b7731cc9452091e26132ab49" Mar 10 07:12:38 crc kubenswrapper[4825]: I0310 07:12:38.628064 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nzcbh"] Mar 10 07:12:38 crc kubenswrapper[4825]: I0310 07:12:38.631396 4825 scope.go:117] "RemoveContainer" containerID="aafd62cc48d5da6ebf84043771c00cf3fec7ae1051613bc627c972d766912060" Mar 10 07:12:38 crc kubenswrapper[4825]: I0310 07:12:38.635532 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nzcbh"] Mar 10 07:12:38 crc kubenswrapper[4825]: I0310 07:12:38.667496 4825 scope.go:117] "RemoveContainer" containerID="685f1cbe19fecac12b24a2710648387db0e36b4c07244c32489bfbf37140cdfe" Mar 10 07:12:38 crc kubenswrapper[4825]: E0310 07:12:38.668068 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"685f1cbe19fecac12b24a2710648387db0e36b4c07244c32489bfbf37140cdfe\": container with ID starting with 685f1cbe19fecac12b24a2710648387db0e36b4c07244c32489bfbf37140cdfe not found: ID does not exist" containerID="685f1cbe19fecac12b24a2710648387db0e36b4c07244c32489bfbf37140cdfe" Mar 10 07:12:38 crc kubenswrapper[4825]: I0310 07:12:38.668110 4825 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"685f1cbe19fecac12b24a2710648387db0e36b4c07244c32489bfbf37140cdfe"} err="failed to get container status \"685f1cbe19fecac12b24a2710648387db0e36b4c07244c32489bfbf37140cdfe\": rpc error: code = NotFound desc = could not find container \"685f1cbe19fecac12b24a2710648387db0e36b4c07244c32489bfbf37140cdfe\": container with ID starting with 685f1cbe19fecac12b24a2710648387db0e36b4c07244c32489bfbf37140cdfe not found: ID does not exist" Mar 10 07:12:38 crc kubenswrapper[4825]: I0310 07:12:38.668154 4825 scope.go:117] "RemoveContainer" containerID="bc27b58443c53b78ba04498d5de74a9b37fb02c8b7731cc9452091e26132ab49" Mar 10 07:12:38 crc kubenswrapper[4825]: E0310 07:12:38.668585 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc27b58443c53b78ba04498d5de74a9b37fb02c8b7731cc9452091e26132ab49\": container with ID starting with bc27b58443c53b78ba04498d5de74a9b37fb02c8b7731cc9452091e26132ab49 not found: ID does not exist" containerID="bc27b58443c53b78ba04498d5de74a9b37fb02c8b7731cc9452091e26132ab49" Mar 10 07:12:38 crc kubenswrapper[4825]: I0310 07:12:38.668646 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc27b58443c53b78ba04498d5de74a9b37fb02c8b7731cc9452091e26132ab49"} err="failed to get container status \"bc27b58443c53b78ba04498d5de74a9b37fb02c8b7731cc9452091e26132ab49\": rpc error: code = NotFound desc = could not find container \"bc27b58443c53b78ba04498d5de74a9b37fb02c8b7731cc9452091e26132ab49\": container with ID starting with bc27b58443c53b78ba04498d5de74a9b37fb02c8b7731cc9452091e26132ab49 not found: ID does not exist" Mar 10 07:12:38 crc kubenswrapper[4825]: I0310 07:12:38.668682 4825 scope.go:117] "RemoveContainer" containerID="aafd62cc48d5da6ebf84043771c00cf3fec7ae1051613bc627c972d766912060" Mar 10 07:12:38 crc kubenswrapper[4825]: E0310 
07:12:38.669049 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aafd62cc48d5da6ebf84043771c00cf3fec7ae1051613bc627c972d766912060\": container with ID starting with aafd62cc48d5da6ebf84043771c00cf3fec7ae1051613bc627c972d766912060 not found: ID does not exist" containerID="aafd62cc48d5da6ebf84043771c00cf3fec7ae1051613bc627c972d766912060" Mar 10 07:12:38 crc kubenswrapper[4825]: I0310 07:12:38.669100 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aafd62cc48d5da6ebf84043771c00cf3fec7ae1051613bc627c972d766912060"} err="failed to get container status \"aafd62cc48d5da6ebf84043771c00cf3fec7ae1051613bc627c972d766912060\": rpc error: code = NotFound desc = could not find container \"aafd62cc48d5da6ebf84043771c00cf3fec7ae1051613bc627c972d766912060\": container with ID starting with aafd62cc48d5da6ebf84043771c00cf3fec7ae1051613bc627c972d766912060 not found: ID does not exist" Mar 10 07:12:39 crc kubenswrapper[4825]: I0310 07:12:39.253899 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0cc3c9f-d4a5-4942-8182-7cc13f657ec8" path="/var/lib/kubelet/pods/d0cc3c9f-d4a5-4942-8182-7cc13f657ec8/volumes" Mar 10 07:12:46 crc kubenswrapper[4825]: I0310 07:12:46.888581 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 07:12:46 crc kubenswrapper[4825]: I0310 07:12:46.888923 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 10 07:13:16 crc kubenswrapper[4825]: I0310 07:13:16.888285 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 07:13:16 crc kubenswrapper[4825]: I0310 07:13:16.888827 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 07:13:36 crc kubenswrapper[4825]: I0310 07:13:36.840086 4825 scope.go:117] "RemoveContainer" containerID="3403e1d7a6a0a6a2cc35efd1e1dc9852c16e21f87682c0116ab023c66bfdfc6a" Mar 10 07:13:36 crc kubenswrapper[4825]: I0310 07:13:36.925111 4825 scope.go:117] "RemoveContainer" containerID="d12d342890f106ec66965c139b6179bf426cf28f4c0fcf70081ef10bb7ced185" Mar 10 07:13:36 crc kubenswrapper[4825]: I0310 07:13:36.953259 4825 scope.go:117] "RemoveContainer" containerID="88d73553fa50a5bca19b77f26ea09c721d2c802bb054b7c91d38d67f0e15583f" Mar 10 07:13:36 crc kubenswrapper[4825]: I0310 07:13:36.968867 4825 scope.go:117] "RemoveContainer" containerID="e4fc36f4f8673da278d64fd9e1e19a5d1e27f954df79bea201495848ce4acc7b" Mar 10 07:13:36 crc kubenswrapper[4825]: I0310 07:13:36.995311 4825 scope.go:117] "RemoveContainer" containerID="5a6694411ba07209eab6dc1d0ce1dc4fc71125bfd4984ffce3d54b7f66b38d63" Mar 10 07:13:37 crc kubenswrapper[4825]: I0310 07:13:37.027153 4825 scope.go:117] "RemoveContainer" containerID="d622562b5f42f435db6b40bb64c857ae773b508272fe26be239a6df4fd3fc779" Mar 10 07:13:37 crc kubenswrapper[4825]: I0310 07:13:37.043481 4825 scope.go:117] "RemoveContainer" 
containerID="bba109d5fe5e55a82c907945f0e3fe0772bd962a63f9a8d77ed335d3def73f12" Mar 10 07:13:46 crc kubenswrapper[4825]: I0310 07:13:46.888637 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 07:13:46 crc kubenswrapper[4825]: I0310 07:13:46.889047 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 07:13:46 crc kubenswrapper[4825]: I0310 07:13:46.889130 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" Mar 10 07:13:46 crc kubenswrapper[4825]: I0310 07:13:46.889996 4825 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4b318206344caaa244c87f6f76d1f2dd80b8f698eb9b562ec892c059af8f81e5"} pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 07:13:46 crc kubenswrapper[4825]: I0310 07:13:46.890124 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" containerID="cri-o://4b318206344caaa244c87f6f76d1f2dd80b8f698eb9b562ec892c059af8f81e5" gracePeriod=600 Mar 10 07:13:47 crc kubenswrapper[4825]: E0310 07:13:47.025941 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:13:47 crc kubenswrapper[4825]: I0310 07:13:47.429618 4825 generic.go:334] "Generic (PLEG): container finished" podID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerID="4b318206344caaa244c87f6f76d1f2dd80b8f698eb9b562ec892c059af8f81e5" exitCode=0 Mar 10 07:13:47 crc kubenswrapper[4825]: I0310 07:13:47.429869 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerDied","Data":"4b318206344caaa244c87f6f76d1f2dd80b8f698eb9b562ec892c059af8f81e5"} Mar 10 07:13:47 crc kubenswrapper[4825]: I0310 07:13:47.430028 4825 scope.go:117] "RemoveContainer" containerID="c615601d462179e42a2a1b6a7ad7dd6409fbaf9c9befec571e34a43e2c4ba121" Mar 10 07:13:47 crc kubenswrapper[4825]: I0310 07:13:47.430868 4825 scope.go:117] "RemoveContainer" containerID="4b318206344caaa244c87f6f76d1f2dd80b8f698eb9b562ec892c059af8f81e5" Mar 10 07:13:47 crc kubenswrapper[4825]: E0310 07:13:47.431285 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:14:00 crc kubenswrapper[4825]: I0310 07:14:00.154847 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552114-ql6j9"] Mar 10 07:14:00 crc 
kubenswrapper[4825]: E0310 07:14:00.155710 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0cc3c9f-d4a5-4942-8182-7cc13f657ec8" containerName="extract-content" Mar 10 07:14:00 crc kubenswrapper[4825]: I0310 07:14:00.155723 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0cc3c9f-d4a5-4942-8182-7cc13f657ec8" containerName="extract-content" Mar 10 07:14:00 crc kubenswrapper[4825]: E0310 07:14:00.155747 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0cc3c9f-d4a5-4942-8182-7cc13f657ec8" containerName="extract-utilities" Mar 10 07:14:00 crc kubenswrapper[4825]: I0310 07:14:00.155756 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0cc3c9f-d4a5-4942-8182-7cc13f657ec8" containerName="extract-utilities" Mar 10 07:14:00 crc kubenswrapper[4825]: E0310 07:14:00.155775 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0cc3c9f-d4a5-4942-8182-7cc13f657ec8" containerName="registry-server" Mar 10 07:14:00 crc kubenswrapper[4825]: I0310 07:14:00.155783 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0cc3c9f-d4a5-4942-8182-7cc13f657ec8" containerName="registry-server" Mar 10 07:14:00 crc kubenswrapper[4825]: I0310 07:14:00.155941 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0cc3c9f-d4a5-4942-8182-7cc13f657ec8" containerName="registry-server" Mar 10 07:14:00 crc kubenswrapper[4825]: I0310 07:14:00.156558 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552114-ql6j9" Mar 10 07:14:00 crc kubenswrapper[4825]: I0310 07:14:00.158810 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 07:14:00 crc kubenswrapper[4825]: I0310 07:14:00.159737 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 07:14:00 crc kubenswrapper[4825]: I0310 07:14:00.170082 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 07:14:00 crc kubenswrapper[4825]: I0310 07:14:00.171494 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552114-ql6j9"] Mar 10 07:14:00 crc kubenswrapper[4825]: I0310 07:14:00.188701 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk6tf\" (UniqueName: \"kubernetes.io/projected/c659e062-f5dd-48b0-bcbe-4b448da7bb7b-kube-api-access-lk6tf\") pod \"auto-csr-approver-29552114-ql6j9\" (UID: \"c659e062-f5dd-48b0-bcbe-4b448da7bb7b\") " pod="openshift-infra/auto-csr-approver-29552114-ql6j9" Mar 10 07:14:00 crc kubenswrapper[4825]: I0310 07:14:00.289591 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk6tf\" (UniqueName: \"kubernetes.io/projected/c659e062-f5dd-48b0-bcbe-4b448da7bb7b-kube-api-access-lk6tf\") pod \"auto-csr-approver-29552114-ql6j9\" (UID: \"c659e062-f5dd-48b0-bcbe-4b448da7bb7b\") " pod="openshift-infra/auto-csr-approver-29552114-ql6j9" Mar 10 07:14:00 crc kubenswrapper[4825]: I0310 07:14:00.314157 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk6tf\" (UniqueName: \"kubernetes.io/projected/c659e062-f5dd-48b0-bcbe-4b448da7bb7b-kube-api-access-lk6tf\") pod \"auto-csr-approver-29552114-ql6j9\" (UID: \"c659e062-f5dd-48b0-bcbe-4b448da7bb7b\") " 
pod="openshift-infra/auto-csr-approver-29552114-ql6j9" Mar 10 07:14:00 crc kubenswrapper[4825]: I0310 07:14:00.469578 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552114-ql6j9" Mar 10 07:14:00 crc kubenswrapper[4825]: I0310 07:14:00.717180 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552114-ql6j9"] Mar 10 07:14:01 crc kubenswrapper[4825]: I0310 07:14:01.237091 4825 scope.go:117] "RemoveContainer" containerID="4b318206344caaa244c87f6f76d1f2dd80b8f698eb9b562ec892c059af8f81e5" Mar 10 07:14:01 crc kubenswrapper[4825]: E0310 07:14:01.237633 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:14:01 crc kubenswrapper[4825]: I0310 07:14:01.555500 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552114-ql6j9" event={"ID":"c659e062-f5dd-48b0-bcbe-4b448da7bb7b","Type":"ContainerStarted","Data":"0e04cc096bcaa174eb94e0cb65d420ffb9f734ea3890a1054833b92dfa102ab9"} Mar 10 07:14:02 crc kubenswrapper[4825]: I0310 07:14:02.567250 4825 generic.go:334] "Generic (PLEG): container finished" podID="c659e062-f5dd-48b0-bcbe-4b448da7bb7b" containerID="9dbaf27159878fa662337cfb370440e49dfeff645bb54f3dd5183e305fbae167" exitCode=0 Mar 10 07:14:02 crc kubenswrapper[4825]: I0310 07:14:02.567345 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552114-ql6j9" event={"ID":"c659e062-f5dd-48b0-bcbe-4b448da7bb7b","Type":"ContainerDied","Data":"9dbaf27159878fa662337cfb370440e49dfeff645bb54f3dd5183e305fbae167"} 
Mar 10 07:14:03 crc kubenswrapper[4825]: I0310 07:14:03.878295 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552114-ql6j9" Mar 10 07:14:03 crc kubenswrapper[4825]: I0310 07:14:03.946559 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lk6tf\" (UniqueName: \"kubernetes.io/projected/c659e062-f5dd-48b0-bcbe-4b448da7bb7b-kube-api-access-lk6tf\") pod \"c659e062-f5dd-48b0-bcbe-4b448da7bb7b\" (UID: \"c659e062-f5dd-48b0-bcbe-4b448da7bb7b\") " Mar 10 07:14:03 crc kubenswrapper[4825]: I0310 07:14:03.951482 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c659e062-f5dd-48b0-bcbe-4b448da7bb7b-kube-api-access-lk6tf" (OuterVolumeSpecName: "kube-api-access-lk6tf") pod "c659e062-f5dd-48b0-bcbe-4b448da7bb7b" (UID: "c659e062-f5dd-48b0-bcbe-4b448da7bb7b"). InnerVolumeSpecName "kube-api-access-lk6tf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:14:04 crc kubenswrapper[4825]: I0310 07:14:04.048330 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lk6tf\" (UniqueName: \"kubernetes.io/projected/c659e062-f5dd-48b0-bcbe-4b448da7bb7b-kube-api-access-lk6tf\") on node \"crc\" DevicePath \"\"" Mar 10 07:14:04 crc kubenswrapper[4825]: I0310 07:14:04.587578 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552114-ql6j9" event={"ID":"c659e062-f5dd-48b0-bcbe-4b448da7bb7b","Type":"ContainerDied","Data":"0e04cc096bcaa174eb94e0cb65d420ffb9f734ea3890a1054833b92dfa102ab9"} Mar 10 07:14:04 crc kubenswrapper[4825]: I0310 07:14:04.587620 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e04cc096bcaa174eb94e0cb65d420ffb9f734ea3890a1054833b92dfa102ab9" Mar 10 07:14:04 crc kubenswrapper[4825]: I0310 07:14:04.587648 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552114-ql6j9" Mar 10 07:14:04 crc kubenswrapper[4825]: I0310 07:14:04.957802 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552108-z7bwv"] Mar 10 07:14:04 crc kubenswrapper[4825]: I0310 07:14:04.968096 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552108-z7bwv"] Mar 10 07:14:05 crc kubenswrapper[4825]: I0310 07:14:05.250798 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa4c8337-0ae4-4b85-9793-4adb29f920a7" path="/var/lib/kubelet/pods/fa4c8337-0ae4-4b85-9793-4adb29f920a7/volumes" Mar 10 07:14:12 crc kubenswrapper[4825]: I0310 07:14:12.236658 4825 scope.go:117] "RemoveContainer" containerID="4b318206344caaa244c87f6f76d1f2dd80b8f698eb9b562ec892c059af8f81e5" Mar 10 07:14:12 crc kubenswrapper[4825]: E0310 07:14:12.237385 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:14:27 crc kubenswrapper[4825]: I0310 07:14:27.237853 4825 scope.go:117] "RemoveContainer" containerID="4b318206344caaa244c87f6f76d1f2dd80b8f698eb9b562ec892c059af8f81e5" Mar 10 07:14:27 crc kubenswrapper[4825]: E0310 07:14:27.239009 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" 
podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:14:37 crc kubenswrapper[4825]: I0310 07:14:37.130428 4825 scope.go:117] "RemoveContainer" containerID="9f37201bc3341b1523532415ac8473925029730a22506fbff5e8ff2982720c54" Mar 10 07:14:37 crc kubenswrapper[4825]: I0310 07:14:37.191398 4825 scope.go:117] "RemoveContainer" containerID="126696bf9938795cae75dc7fc16fc96bf0c323ea2004daf4fa07abd6a757a8a9" Mar 10 07:14:37 crc kubenswrapper[4825]: I0310 07:14:37.210223 4825 scope.go:117] "RemoveContainer" containerID="b1d41b7d027d77c9513bde337e8c8167ef7c3e29806e7839c5c877933d476511" Mar 10 07:14:37 crc kubenswrapper[4825]: I0310 07:14:37.231652 4825 scope.go:117] "RemoveContainer" containerID="a7195e9075890ba063068bd9b092e8946d42acae2a64c8f5a383ba522b13c281" Mar 10 07:14:37 crc kubenswrapper[4825]: I0310 07:14:37.253338 4825 scope.go:117] "RemoveContainer" containerID="470e2428d5cbe39e87f8dca68dad46978460d1f245bcfa54ac6931e573586b33" Mar 10 07:14:37 crc kubenswrapper[4825]: I0310 07:14:37.293195 4825 scope.go:117] "RemoveContainer" containerID="0356f5ab6145f37ff4290da40a1602a3abe05333ce86f47dbb2d5041cfd72341" Mar 10 07:14:39 crc kubenswrapper[4825]: I0310 07:14:39.237085 4825 scope.go:117] "RemoveContainer" containerID="4b318206344caaa244c87f6f76d1f2dd80b8f698eb9b562ec892c059af8f81e5" Mar 10 07:14:39 crc kubenswrapper[4825]: E0310 07:14:39.238305 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:14:50 crc kubenswrapper[4825]: I0310 07:14:50.238346 4825 scope.go:117] "RemoveContainer" containerID="4b318206344caaa244c87f6f76d1f2dd80b8f698eb9b562ec892c059af8f81e5" Mar 10 07:14:50 crc 
kubenswrapper[4825]: E0310 07:14:50.239070 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:15:00 crc kubenswrapper[4825]: I0310 07:15:00.186305 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552115-k2j8m"] Mar 10 07:15:00 crc kubenswrapper[4825]: E0310 07:15:00.187733 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c659e062-f5dd-48b0-bcbe-4b448da7bb7b" containerName="oc" Mar 10 07:15:00 crc kubenswrapper[4825]: I0310 07:15:00.187768 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="c659e062-f5dd-48b0-bcbe-4b448da7bb7b" containerName="oc" Mar 10 07:15:00 crc kubenswrapper[4825]: I0310 07:15:00.188125 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="c659e062-f5dd-48b0-bcbe-4b448da7bb7b" containerName="oc" Mar 10 07:15:00 crc kubenswrapper[4825]: I0310 07:15:00.189359 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552115-k2j8m" Mar 10 07:15:00 crc kubenswrapper[4825]: I0310 07:15:00.193519 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552115-k2j8m"] Mar 10 07:15:00 crc kubenswrapper[4825]: I0310 07:15:00.194111 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 07:15:00 crc kubenswrapper[4825]: I0310 07:15:00.194599 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 07:15:00 crc kubenswrapper[4825]: I0310 07:15:00.329504 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/16589365-ed0f-41b3-af61-118fe9d14e14-config-volume\") pod \"collect-profiles-29552115-k2j8m\" (UID: \"16589365-ed0f-41b3-af61-118fe9d14e14\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552115-k2j8m" Mar 10 07:15:00 crc kubenswrapper[4825]: I0310 07:15:00.329658 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p6ts\" (UniqueName: \"kubernetes.io/projected/16589365-ed0f-41b3-af61-118fe9d14e14-kube-api-access-6p6ts\") pod \"collect-profiles-29552115-k2j8m\" (UID: \"16589365-ed0f-41b3-af61-118fe9d14e14\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552115-k2j8m" Mar 10 07:15:00 crc kubenswrapper[4825]: I0310 07:15:00.329732 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/16589365-ed0f-41b3-af61-118fe9d14e14-secret-volume\") pod \"collect-profiles-29552115-k2j8m\" (UID: \"16589365-ed0f-41b3-af61-118fe9d14e14\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29552115-k2j8m" Mar 10 07:15:00 crc kubenswrapper[4825]: I0310 07:15:00.430535 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p6ts\" (UniqueName: \"kubernetes.io/projected/16589365-ed0f-41b3-af61-118fe9d14e14-kube-api-access-6p6ts\") pod \"collect-profiles-29552115-k2j8m\" (UID: \"16589365-ed0f-41b3-af61-118fe9d14e14\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552115-k2j8m" Mar 10 07:15:00 crc kubenswrapper[4825]: I0310 07:15:00.430582 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/16589365-ed0f-41b3-af61-118fe9d14e14-secret-volume\") pod \"collect-profiles-29552115-k2j8m\" (UID: \"16589365-ed0f-41b3-af61-118fe9d14e14\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552115-k2j8m" Mar 10 07:15:00 crc kubenswrapper[4825]: I0310 07:15:00.430633 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/16589365-ed0f-41b3-af61-118fe9d14e14-config-volume\") pod \"collect-profiles-29552115-k2j8m\" (UID: \"16589365-ed0f-41b3-af61-118fe9d14e14\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552115-k2j8m" Mar 10 07:15:00 crc kubenswrapper[4825]: I0310 07:15:00.433000 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/16589365-ed0f-41b3-af61-118fe9d14e14-config-volume\") pod \"collect-profiles-29552115-k2j8m\" (UID: \"16589365-ed0f-41b3-af61-118fe9d14e14\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552115-k2j8m" Mar 10 07:15:00 crc kubenswrapper[4825]: I0310 07:15:00.442942 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/16589365-ed0f-41b3-af61-118fe9d14e14-secret-volume\") pod \"collect-profiles-29552115-k2j8m\" (UID: \"16589365-ed0f-41b3-af61-118fe9d14e14\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552115-k2j8m" Mar 10 07:15:00 crc kubenswrapper[4825]: I0310 07:15:00.447638 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p6ts\" (UniqueName: \"kubernetes.io/projected/16589365-ed0f-41b3-af61-118fe9d14e14-kube-api-access-6p6ts\") pod \"collect-profiles-29552115-k2j8m\" (UID: \"16589365-ed0f-41b3-af61-118fe9d14e14\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552115-k2j8m" Mar 10 07:15:00 crc kubenswrapper[4825]: I0310 07:15:00.519879 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552115-k2j8m" Mar 10 07:15:00 crc kubenswrapper[4825]: I0310 07:15:00.729822 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552115-k2j8m"] Mar 10 07:15:01 crc kubenswrapper[4825]: I0310 07:15:01.139193 4825 generic.go:334] "Generic (PLEG): container finished" podID="16589365-ed0f-41b3-af61-118fe9d14e14" containerID="9ddc9b1808547bac708633152aac261201399154d44c29d69246c28751850144" exitCode=0 Mar 10 07:15:01 crc kubenswrapper[4825]: I0310 07:15:01.139259 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552115-k2j8m" event={"ID":"16589365-ed0f-41b3-af61-118fe9d14e14","Type":"ContainerDied","Data":"9ddc9b1808547bac708633152aac261201399154d44c29d69246c28751850144"} Mar 10 07:15:01 crc kubenswrapper[4825]: I0310 07:15:01.139330 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552115-k2j8m" 
event={"ID":"16589365-ed0f-41b3-af61-118fe9d14e14","Type":"ContainerStarted","Data":"d98d5073407e8a5b2a43decb8a6a32210d4e20600306037a762a6dcca40ae0cf"} Mar 10 07:15:02 crc kubenswrapper[4825]: I0310 07:15:02.464037 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552115-k2j8m" Mar 10 07:15:02 crc kubenswrapper[4825]: I0310 07:15:02.567382 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6p6ts\" (UniqueName: \"kubernetes.io/projected/16589365-ed0f-41b3-af61-118fe9d14e14-kube-api-access-6p6ts\") pod \"16589365-ed0f-41b3-af61-118fe9d14e14\" (UID: \"16589365-ed0f-41b3-af61-118fe9d14e14\") " Mar 10 07:15:02 crc kubenswrapper[4825]: I0310 07:15:02.567453 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/16589365-ed0f-41b3-af61-118fe9d14e14-config-volume\") pod \"16589365-ed0f-41b3-af61-118fe9d14e14\" (UID: \"16589365-ed0f-41b3-af61-118fe9d14e14\") " Mar 10 07:15:02 crc kubenswrapper[4825]: I0310 07:15:02.567533 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/16589365-ed0f-41b3-af61-118fe9d14e14-secret-volume\") pod \"16589365-ed0f-41b3-af61-118fe9d14e14\" (UID: \"16589365-ed0f-41b3-af61-118fe9d14e14\") " Mar 10 07:15:02 crc kubenswrapper[4825]: I0310 07:15:02.568296 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16589365-ed0f-41b3-af61-118fe9d14e14-config-volume" (OuterVolumeSpecName: "config-volume") pod "16589365-ed0f-41b3-af61-118fe9d14e14" (UID: "16589365-ed0f-41b3-af61-118fe9d14e14"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:15:02 crc kubenswrapper[4825]: I0310 07:15:02.573418 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16589365-ed0f-41b3-af61-118fe9d14e14-kube-api-access-6p6ts" (OuterVolumeSpecName: "kube-api-access-6p6ts") pod "16589365-ed0f-41b3-af61-118fe9d14e14" (UID: "16589365-ed0f-41b3-af61-118fe9d14e14"). InnerVolumeSpecName "kube-api-access-6p6ts". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:15:02 crc kubenswrapper[4825]: I0310 07:15:02.573704 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16589365-ed0f-41b3-af61-118fe9d14e14-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "16589365-ed0f-41b3-af61-118fe9d14e14" (UID: "16589365-ed0f-41b3-af61-118fe9d14e14"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:15:02 crc kubenswrapper[4825]: I0310 07:15:02.669346 4825 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/16589365-ed0f-41b3-af61-118fe9d14e14-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 07:15:02 crc kubenswrapper[4825]: I0310 07:15:02.669403 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6p6ts\" (UniqueName: \"kubernetes.io/projected/16589365-ed0f-41b3-af61-118fe9d14e14-kube-api-access-6p6ts\") on node \"crc\" DevicePath \"\"" Mar 10 07:15:02 crc kubenswrapper[4825]: I0310 07:15:02.669421 4825 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/16589365-ed0f-41b3-af61-118fe9d14e14-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 07:15:03 crc kubenswrapper[4825]: I0310 07:15:03.160164 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552115-k2j8m" 
event={"ID":"16589365-ed0f-41b3-af61-118fe9d14e14","Type":"ContainerDied","Data":"d98d5073407e8a5b2a43decb8a6a32210d4e20600306037a762a6dcca40ae0cf"} Mar 10 07:15:03 crc kubenswrapper[4825]: I0310 07:15:03.160480 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d98d5073407e8a5b2a43decb8a6a32210d4e20600306037a762a6dcca40ae0cf" Mar 10 07:15:03 crc kubenswrapper[4825]: I0310 07:15:03.160279 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552115-k2j8m" Mar 10 07:15:05 crc kubenswrapper[4825]: I0310 07:15:05.237358 4825 scope.go:117] "RemoveContainer" containerID="4b318206344caaa244c87f6f76d1f2dd80b8f698eb9b562ec892c059af8f81e5" Mar 10 07:15:05 crc kubenswrapper[4825]: E0310 07:15:05.237955 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:15:17 crc kubenswrapper[4825]: I0310 07:15:17.236560 4825 scope.go:117] "RemoveContainer" containerID="4b318206344caaa244c87f6f76d1f2dd80b8f698eb9b562ec892c059af8f81e5" Mar 10 07:15:17 crc kubenswrapper[4825]: E0310 07:15:17.239055 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:15:31 crc kubenswrapper[4825]: I0310 07:15:31.237743 4825 
scope.go:117] "RemoveContainer" containerID="4b318206344caaa244c87f6f76d1f2dd80b8f698eb9b562ec892c059af8f81e5" Mar 10 07:15:31 crc kubenswrapper[4825]: E0310 07:15:31.238747 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:15:45 crc kubenswrapper[4825]: I0310 07:15:45.236960 4825 scope.go:117] "RemoveContainer" containerID="4b318206344caaa244c87f6f76d1f2dd80b8f698eb9b562ec892c059af8f81e5" Mar 10 07:15:45 crc kubenswrapper[4825]: E0310 07:15:45.238585 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:15:59 crc kubenswrapper[4825]: I0310 07:15:59.243688 4825 scope.go:117] "RemoveContainer" containerID="4b318206344caaa244c87f6f76d1f2dd80b8f698eb9b562ec892c059af8f81e5" Mar 10 07:15:59 crc kubenswrapper[4825]: E0310 07:15:59.245723 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:16:00 crc kubenswrapper[4825]: I0310 
07:16:00.165851 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552116-ncwdw"] Mar 10 07:16:00 crc kubenswrapper[4825]: E0310 07:16:00.166582 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16589365-ed0f-41b3-af61-118fe9d14e14" containerName="collect-profiles" Mar 10 07:16:00 crc kubenswrapper[4825]: I0310 07:16:00.166609 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="16589365-ed0f-41b3-af61-118fe9d14e14" containerName="collect-profiles" Mar 10 07:16:00 crc kubenswrapper[4825]: I0310 07:16:00.166875 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="16589365-ed0f-41b3-af61-118fe9d14e14" containerName="collect-profiles" Mar 10 07:16:00 crc kubenswrapper[4825]: I0310 07:16:00.167640 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552116-ncwdw" Mar 10 07:16:00 crc kubenswrapper[4825]: I0310 07:16:00.171394 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 07:16:00 crc kubenswrapper[4825]: I0310 07:16:00.172045 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 07:16:00 crc kubenswrapper[4825]: I0310 07:16:00.176329 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 07:16:00 crc kubenswrapper[4825]: I0310 07:16:00.184256 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552116-ncwdw"] Mar 10 07:16:00 crc kubenswrapper[4825]: I0310 07:16:00.211046 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxfx5\" (UniqueName: \"kubernetes.io/projected/906f198a-b4c0-4958-b736-1cf9febbdc59-kube-api-access-gxfx5\") pod \"auto-csr-approver-29552116-ncwdw\" (UID: 
\"906f198a-b4c0-4958-b736-1cf9febbdc59\") " pod="openshift-infra/auto-csr-approver-29552116-ncwdw" Mar 10 07:16:00 crc kubenswrapper[4825]: I0310 07:16:00.311854 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxfx5\" (UniqueName: \"kubernetes.io/projected/906f198a-b4c0-4958-b736-1cf9febbdc59-kube-api-access-gxfx5\") pod \"auto-csr-approver-29552116-ncwdw\" (UID: \"906f198a-b4c0-4958-b736-1cf9febbdc59\") " pod="openshift-infra/auto-csr-approver-29552116-ncwdw" Mar 10 07:16:00 crc kubenswrapper[4825]: I0310 07:16:00.331173 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxfx5\" (UniqueName: \"kubernetes.io/projected/906f198a-b4c0-4958-b736-1cf9febbdc59-kube-api-access-gxfx5\") pod \"auto-csr-approver-29552116-ncwdw\" (UID: \"906f198a-b4c0-4958-b736-1cf9febbdc59\") " pod="openshift-infra/auto-csr-approver-29552116-ncwdw" Mar 10 07:16:00 crc kubenswrapper[4825]: I0310 07:16:00.499397 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552116-ncwdw" Mar 10 07:16:01 crc kubenswrapper[4825]: I0310 07:16:01.040443 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552116-ncwdw"] Mar 10 07:16:01 crc kubenswrapper[4825]: W0310 07:16:01.047199 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod906f198a_b4c0_4958_b736_1cf9febbdc59.slice/crio-1d09e2004299f65ba6f5cb2daae1b371073ffba9205ef208951d8678962f829d WatchSource:0}: Error finding container 1d09e2004299f65ba6f5cb2daae1b371073ffba9205ef208951d8678962f829d: Status 404 returned error can't find the container with id 1d09e2004299f65ba6f5cb2daae1b371073ffba9205ef208951d8678962f829d Mar 10 07:16:01 crc kubenswrapper[4825]: I0310 07:16:01.050447 4825 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 07:16:01 crc kubenswrapper[4825]: I0310 07:16:01.755503 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552116-ncwdw" event={"ID":"906f198a-b4c0-4958-b736-1cf9febbdc59","Type":"ContainerStarted","Data":"1d09e2004299f65ba6f5cb2daae1b371073ffba9205ef208951d8678962f829d"} Mar 10 07:16:02 crc kubenswrapper[4825]: I0310 07:16:02.767077 4825 generic.go:334] "Generic (PLEG): container finished" podID="906f198a-b4c0-4958-b736-1cf9febbdc59" containerID="9e28c7381c0a537ec0798b11356883dada8501972ab1bf4e10dff52e0ca20c53" exitCode=0 Mar 10 07:16:02 crc kubenswrapper[4825]: I0310 07:16:02.767182 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552116-ncwdw" event={"ID":"906f198a-b4c0-4958-b736-1cf9febbdc59","Type":"ContainerDied","Data":"9e28c7381c0a537ec0798b11356883dada8501972ab1bf4e10dff52e0ca20c53"} Mar 10 07:16:04 crc kubenswrapper[4825]: I0310 07:16:04.083747 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552116-ncwdw" Mar 10 07:16:04 crc kubenswrapper[4825]: I0310 07:16:04.088030 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxfx5\" (UniqueName: \"kubernetes.io/projected/906f198a-b4c0-4958-b736-1cf9febbdc59-kube-api-access-gxfx5\") pod \"906f198a-b4c0-4958-b736-1cf9febbdc59\" (UID: \"906f198a-b4c0-4958-b736-1cf9febbdc59\") " Mar 10 07:16:04 crc kubenswrapper[4825]: I0310 07:16:04.093075 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/906f198a-b4c0-4958-b736-1cf9febbdc59-kube-api-access-gxfx5" (OuterVolumeSpecName: "kube-api-access-gxfx5") pod "906f198a-b4c0-4958-b736-1cf9febbdc59" (UID: "906f198a-b4c0-4958-b736-1cf9febbdc59"). InnerVolumeSpecName "kube-api-access-gxfx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:16:04 crc kubenswrapper[4825]: I0310 07:16:04.189454 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxfx5\" (UniqueName: \"kubernetes.io/projected/906f198a-b4c0-4958-b736-1cf9febbdc59-kube-api-access-gxfx5\") on node \"crc\" DevicePath \"\"" Mar 10 07:16:04 crc kubenswrapper[4825]: I0310 07:16:04.787821 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552116-ncwdw" event={"ID":"906f198a-b4c0-4958-b736-1cf9febbdc59","Type":"ContainerDied","Data":"1d09e2004299f65ba6f5cb2daae1b371073ffba9205ef208951d8678962f829d"} Mar 10 07:16:04 crc kubenswrapper[4825]: I0310 07:16:04.787870 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d09e2004299f65ba6f5cb2daae1b371073ffba9205ef208951d8678962f829d" Mar 10 07:16:04 crc kubenswrapper[4825]: I0310 07:16:04.787919 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552116-ncwdw" Mar 10 07:16:05 crc kubenswrapper[4825]: I0310 07:16:05.183284 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552110-n6wf2"] Mar 10 07:16:05 crc kubenswrapper[4825]: I0310 07:16:05.194817 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552110-n6wf2"] Mar 10 07:16:05 crc kubenswrapper[4825]: I0310 07:16:05.247069 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="971b7933-e72a-4c44-831f-2d63ab760bc9" path="/var/lib/kubelet/pods/971b7933-e72a-4c44-831f-2d63ab760bc9/volumes" Mar 10 07:16:14 crc kubenswrapper[4825]: I0310 07:16:14.236412 4825 scope.go:117] "RemoveContainer" containerID="4b318206344caaa244c87f6f76d1f2dd80b8f698eb9b562ec892c059af8f81e5" Mar 10 07:16:14 crc kubenswrapper[4825]: E0310 07:16:14.237295 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:16:29 crc kubenswrapper[4825]: I0310 07:16:29.245549 4825 scope.go:117] "RemoveContainer" containerID="4b318206344caaa244c87f6f76d1f2dd80b8f698eb9b562ec892c059af8f81e5" Mar 10 07:16:29 crc kubenswrapper[4825]: E0310 07:16:29.246488 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" 
podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:16:37 crc kubenswrapper[4825]: I0310 07:16:37.382196 4825 scope.go:117] "RemoveContainer" containerID="256d5ad716b49df18c9479773dade007a478fa0a069736a036755dc714809c73" Mar 10 07:16:43 crc kubenswrapper[4825]: I0310 07:16:43.237443 4825 scope.go:117] "RemoveContainer" containerID="4b318206344caaa244c87f6f76d1f2dd80b8f698eb9b562ec892c059af8f81e5" Mar 10 07:16:43 crc kubenswrapper[4825]: E0310 07:16:43.238371 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:16:55 crc kubenswrapper[4825]: I0310 07:16:55.236423 4825 scope.go:117] "RemoveContainer" containerID="4b318206344caaa244c87f6f76d1f2dd80b8f698eb9b562ec892c059af8f81e5" Mar 10 07:16:55 crc kubenswrapper[4825]: E0310 07:16:55.237144 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:17:10 crc kubenswrapper[4825]: I0310 07:17:10.236252 4825 scope.go:117] "RemoveContainer" containerID="4b318206344caaa244c87f6f76d1f2dd80b8f698eb9b562ec892c059af8f81e5" Mar 10 07:17:10 crc kubenswrapper[4825]: E0310 07:17:10.237431 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:17:21 crc kubenswrapper[4825]: I0310 07:17:21.237921 4825 scope.go:117] "RemoveContainer" containerID="4b318206344caaa244c87f6f76d1f2dd80b8f698eb9b562ec892c059af8f81e5" Mar 10 07:17:21 crc kubenswrapper[4825]: E0310 07:17:21.238600 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:17:33 crc kubenswrapper[4825]: I0310 07:17:33.238864 4825 scope.go:117] "RemoveContainer" containerID="4b318206344caaa244c87f6f76d1f2dd80b8f698eb9b562ec892c059af8f81e5" Mar 10 07:17:33 crc kubenswrapper[4825]: E0310 07:17:33.239869 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:17:45 crc kubenswrapper[4825]: I0310 07:17:45.236368 4825 scope.go:117] "RemoveContainer" containerID="4b318206344caaa244c87f6f76d1f2dd80b8f698eb9b562ec892c059af8f81e5" Mar 10 07:17:45 crc kubenswrapper[4825]: E0310 07:17:45.239224 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:17:58 crc kubenswrapper[4825]: I0310 07:17:58.236544 4825 scope.go:117] "RemoveContainer" containerID="4b318206344caaa244c87f6f76d1f2dd80b8f698eb9b562ec892c059af8f81e5" Mar 10 07:17:58 crc kubenswrapper[4825]: E0310 07:17:58.237392 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:18:00 crc kubenswrapper[4825]: I0310 07:18:00.157831 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552118-lvgdb"] Mar 10 07:18:00 crc kubenswrapper[4825]: E0310 07:18:00.158506 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="906f198a-b4c0-4958-b736-1cf9febbdc59" containerName="oc" Mar 10 07:18:00 crc kubenswrapper[4825]: I0310 07:18:00.158525 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="906f198a-b4c0-4958-b736-1cf9febbdc59" containerName="oc" Mar 10 07:18:00 crc kubenswrapper[4825]: I0310 07:18:00.158676 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="906f198a-b4c0-4958-b736-1cf9febbdc59" containerName="oc" Mar 10 07:18:00 crc kubenswrapper[4825]: I0310 07:18:00.159109 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552118-lvgdb" Mar 10 07:18:00 crc kubenswrapper[4825]: I0310 07:18:00.162119 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 07:18:00 crc kubenswrapper[4825]: I0310 07:18:00.162413 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 07:18:00 crc kubenswrapper[4825]: I0310 07:18:00.166356 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552118-lvgdb"] Mar 10 07:18:00 crc kubenswrapper[4825]: I0310 07:18:00.167295 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 07:18:00 crc kubenswrapper[4825]: I0310 07:18:00.308406 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxbth\" (UniqueName: \"kubernetes.io/projected/48b65f42-3de8-4e55-a302-7475deec8981-kube-api-access-zxbth\") pod \"auto-csr-approver-29552118-lvgdb\" (UID: \"48b65f42-3de8-4e55-a302-7475deec8981\") " pod="openshift-infra/auto-csr-approver-29552118-lvgdb" Mar 10 07:18:00 crc kubenswrapper[4825]: I0310 07:18:00.411281 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxbth\" (UniqueName: \"kubernetes.io/projected/48b65f42-3de8-4e55-a302-7475deec8981-kube-api-access-zxbth\") pod \"auto-csr-approver-29552118-lvgdb\" (UID: \"48b65f42-3de8-4e55-a302-7475deec8981\") " pod="openshift-infra/auto-csr-approver-29552118-lvgdb" Mar 10 07:18:00 crc kubenswrapper[4825]: I0310 07:18:00.428763 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxbth\" (UniqueName: \"kubernetes.io/projected/48b65f42-3de8-4e55-a302-7475deec8981-kube-api-access-zxbth\") pod \"auto-csr-approver-29552118-lvgdb\" (UID: \"48b65f42-3de8-4e55-a302-7475deec8981\") " 
pod="openshift-infra/auto-csr-approver-29552118-lvgdb" Mar 10 07:18:00 crc kubenswrapper[4825]: I0310 07:18:00.518609 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552118-lvgdb" Mar 10 07:18:00 crc kubenswrapper[4825]: I0310 07:18:00.929370 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552118-lvgdb"] Mar 10 07:18:01 crc kubenswrapper[4825]: I0310 07:18:01.859890 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552118-lvgdb" event={"ID":"48b65f42-3de8-4e55-a302-7475deec8981","Type":"ContainerStarted","Data":"bd319f6e4c6a119acd9abaab61b844f3687dce09cd5323d46b548613eda0f2ec"} Mar 10 07:18:02 crc kubenswrapper[4825]: I0310 07:18:02.871685 4825 generic.go:334] "Generic (PLEG): container finished" podID="48b65f42-3de8-4e55-a302-7475deec8981" containerID="a137bbd6c70d8d4ece1dd6e26b2a8e147fdde2d0f1873d42630561b75223678a" exitCode=0 Mar 10 07:18:02 crc kubenswrapper[4825]: I0310 07:18:02.871758 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552118-lvgdb" event={"ID":"48b65f42-3de8-4e55-a302-7475deec8981","Type":"ContainerDied","Data":"a137bbd6c70d8d4ece1dd6e26b2a8e147fdde2d0f1873d42630561b75223678a"} Mar 10 07:18:04 crc kubenswrapper[4825]: I0310 07:18:04.230753 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552118-lvgdb" Mar 10 07:18:04 crc kubenswrapper[4825]: I0310 07:18:04.374270 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxbth\" (UniqueName: \"kubernetes.io/projected/48b65f42-3de8-4e55-a302-7475deec8981-kube-api-access-zxbth\") pod \"48b65f42-3de8-4e55-a302-7475deec8981\" (UID: \"48b65f42-3de8-4e55-a302-7475deec8981\") " Mar 10 07:18:04 crc kubenswrapper[4825]: I0310 07:18:04.379746 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48b65f42-3de8-4e55-a302-7475deec8981-kube-api-access-zxbth" (OuterVolumeSpecName: "kube-api-access-zxbth") pod "48b65f42-3de8-4e55-a302-7475deec8981" (UID: "48b65f42-3de8-4e55-a302-7475deec8981"). InnerVolumeSpecName "kube-api-access-zxbth". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:18:04 crc kubenswrapper[4825]: I0310 07:18:04.475753 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxbth\" (UniqueName: \"kubernetes.io/projected/48b65f42-3de8-4e55-a302-7475deec8981-kube-api-access-zxbth\") on node \"crc\" DevicePath \"\"" Mar 10 07:18:04 crc kubenswrapper[4825]: I0310 07:18:04.891341 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552118-lvgdb" event={"ID":"48b65f42-3de8-4e55-a302-7475deec8981","Type":"ContainerDied","Data":"bd319f6e4c6a119acd9abaab61b844f3687dce09cd5323d46b548613eda0f2ec"} Mar 10 07:18:04 crc kubenswrapper[4825]: I0310 07:18:04.891398 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd319f6e4c6a119acd9abaab61b844f3687dce09cd5323d46b548613eda0f2ec" Mar 10 07:18:04 crc kubenswrapper[4825]: I0310 07:18:04.891423 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552118-lvgdb" Mar 10 07:18:05 crc kubenswrapper[4825]: I0310 07:18:05.310600 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552112-dwgg9"] Mar 10 07:18:05 crc kubenswrapper[4825]: I0310 07:18:05.321378 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552112-dwgg9"] Mar 10 07:18:07 crc kubenswrapper[4825]: I0310 07:18:07.254898 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c450af65-3ce2-472b-b05b-b22c285c61b5" path="/var/lib/kubelet/pods/c450af65-3ce2-472b-b05b-b22c285c61b5/volumes" Mar 10 07:18:13 crc kubenswrapper[4825]: I0310 07:18:13.237170 4825 scope.go:117] "RemoveContainer" containerID="4b318206344caaa244c87f6f76d1f2dd80b8f698eb9b562ec892c059af8f81e5" Mar 10 07:18:13 crc kubenswrapper[4825]: E0310 07:18:13.238024 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:18:28 crc kubenswrapper[4825]: I0310 07:18:28.237382 4825 scope.go:117] "RemoveContainer" containerID="4b318206344caaa244c87f6f76d1f2dd80b8f698eb9b562ec892c059af8f81e5" Mar 10 07:18:28 crc kubenswrapper[4825]: E0310 07:18:28.240128 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" 
podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:18:37 crc kubenswrapper[4825]: I0310 07:18:37.523898 4825 scope.go:117] "RemoveContainer" containerID="834470ca53cf68f911b932bdf9b193654a066d9dbe28b6b5f1e5b648d0739ae9" Mar 10 07:18:43 crc kubenswrapper[4825]: I0310 07:18:43.236998 4825 scope.go:117] "RemoveContainer" containerID="4b318206344caaa244c87f6f76d1f2dd80b8f698eb9b562ec892c059af8f81e5" Mar 10 07:18:43 crc kubenswrapper[4825]: E0310 07:18:43.237544 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:18:54 crc kubenswrapper[4825]: I0310 07:18:54.237022 4825 scope.go:117] "RemoveContainer" containerID="4b318206344caaa244c87f6f76d1f2dd80b8f698eb9b562ec892c059af8f81e5" Mar 10 07:18:55 crc kubenswrapper[4825]: I0310 07:18:55.374718 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerStarted","Data":"dddfba3f7a2f8d3823d9a072175efb197634fd1df552eb76662bdf50e8519ec5"} Mar 10 07:19:22 crc kubenswrapper[4825]: I0310 07:19:22.167350 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gk27t"] Mar 10 07:19:22 crc kubenswrapper[4825]: E0310 07:19:22.168486 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48b65f42-3de8-4e55-a302-7475deec8981" containerName="oc" Mar 10 07:19:22 crc kubenswrapper[4825]: I0310 07:19:22.168510 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="48b65f42-3de8-4e55-a302-7475deec8981" containerName="oc" Mar 10 07:19:22 crc kubenswrapper[4825]: I0310 
07:19:22.168775 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="48b65f42-3de8-4e55-a302-7475deec8981" containerName="oc" Mar 10 07:19:22 crc kubenswrapper[4825]: I0310 07:19:22.170581 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gk27t" Mar 10 07:19:22 crc kubenswrapper[4825]: I0310 07:19:22.200234 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gk27t"] Mar 10 07:19:22 crc kubenswrapper[4825]: I0310 07:19:22.250285 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b362c1f8-afb5-4eda-bc93-117a7f8ea281-catalog-content\") pod \"redhat-operators-gk27t\" (UID: \"b362c1f8-afb5-4eda-bc93-117a7f8ea281\") " pod="openshift-marketplace/redhat-operators-gk27t" Mar 10 07:19:22 crc kubenswrapper[4825]: I0310 07:19:22.250360 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbnbq\" (UniqueName: \"kubernetes.io/projected/b362c1f8-afb5-4eda-bc93-117a7f8ea281-kube-api-access-fbnbq\") pod \"redhat-operators-gk27t\" (UID: \"b362c1f8-afb5-4eda-bc93-117a7f8ea281\") " pod="openshift-marketplace/redhat-operators-gk27t" Mar 10 07:19:22 crc kubenswrapper[4825]: I0310 07:19:22.250468 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b362c1f8-afb5-4eda-bc93-117a7f8ea281-utilities\") pod \"redhat-operators-gk27t\" (UID: \"b362c1f8-afb5-4eda-bc93-117a7f8ea281\") " pod="openshift-marketplace/redhat-operators-gk27t" Mar 10 07:19:22 crc kubenswrapper[4825]: I0310 07:19:22.352006 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b362c1f8-afb5-4eda-bc93-117a7f8ea281-utilities\") pod 
\"redhat-operators-gk27t\" (UID: \"b362c1f8-afb5-4eda-bc93-117a7f8ea281\") " pod="openshift-marketplace/redhat-operators-gk27t" Mar 10 07:19:22 crc kubenswrapper[4825]: I0310 07:19:22.352187 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b362c1f8-afb5-4eda-bc93-117a7f8ea281-catalog-content\") pod \"redhat-operators-gk27t\" (UID: \"b362c1f8-afb5-4eda-bc93-117a7f8ea281\") " pod="openshift-marketplace/redhat-operators-gk27t" Mar 10 07:19:22 crc kubenswrapper[4825]: I0310 07:19:22.352266 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbnbq\" (UniqueName: \"kubernetes.io/projected/b362c1f8-afb5-4eda-bc93-117a7f8ea281-kube-api-access-fbnbq\") pod \"redhat-operators-gk27t\" (UID: \"b362c1f8-afb5-4eda-bc93-117a7f8ea281\") " pod="openshift-marketplace/redhat-operators-gk27t" Mar 10 07:19:22 crc kubenswrapper[4825]: I0310 07:19:22.352649 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b362c1f8-afb5-4eda-bc93-117a7f8ea281-utilities\") pod \"redhat-operators-gk27t\" (UID: \"b362c1f8-afb5-4eda-bc93-117a7f8ea281\") " pod="openshift-marketplace/redhat-operators-gk27t" Mar 10 07:19:22 crc kubenswrapper[4825]: I0310 07:19:22.352963 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b362c1f8-afb5-4eda-bc93-117a7f8ea281-catalog-content\") pod \"redhat-operators-gk27t\" (UID: \"b362c1f8-afb5-4eda-bc93-117a7f8ea281\") " pod="openshift-marketplace/redhat-operators-gk27t" Mar 10 07:19:22 crc kubenswrapper[4825]: I0310 07:19:22.383934 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbnbq\" (UniqueName: \"kubernetes.io/projected/b362c1f8-afb5-4eda-bc93-117a7f8ea281-kube-api-access-fbnbq\") pod \"redhat-operators-gk27t\" (UID: 
\"b362c1f8-afb5-4eda-bc93-117a7f8ea281\") " pod="openshift-marketplace/redhat-operators-gk27t" Mar 10 07:19:22 crc kubenswrapper[4825]: I0310 07:19:22.489055 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gk27t" Mar 10 07:19:22 crc kubenswrapper[4825]: I0310 07:19:22.766475 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gk27t"] Mar 10 07:19:22 crc kubenswrapper[4825]: I0310 07:19:22.832825 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gk27t" event={"ID":"b362c1f8-afb5-4eda-bc93-117a7f8ea281","Type":"ContainerStarted","Data":"d174de2a930e05d2f58ed6a1aedc19dfdb538ec90ad4a075b378a0a9c392ac36"} Mar 10 07:19:23 crc kubenswrapper[4825]: I0310 07:19:23.840697 4825 generic.go:334] "Generic (PLEG): container finished" podID="b362c1f8-afb5-4eda-bc93-117a7f8ea281" containerID="8b386e049fada32aaccec3276081e7577673d285c9b376ef1e93c2438f347507" exitCode=0 Mar 10 07:19:23 crc kubenswrapper[4825]: I0310 07:19:23.840755 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gk27t" event={"ID":"b362c1f8-afb5-4eda-bc93-117a7f8ea281","Type":"ContainerDied","Data":"8b386e049fada32aaccec3276081e7577673d285c9b376ef1e93c2438f347507"} Mar 10 07:19:24 crc kubenswrapper[4825]: I0310 07:19:24.852378 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gk27t" event={"ID":"b362c1f8-afb5-4eda-bc93-117a7f8ea281","Type":"ContainerStarted","Data":"35802b163595b3987194b6387d087b85f0b406cb60e0ddc122cff8ab0395fe00"} Mar 10 07:19:25 crc kubenswrapper[4825]: I0310 07:19:25.868310 4825 generic.go:334] "Generic (PLEG): container finished" podID="b362c1f8-afb5-4eda-bc93-117a7f8ea281" containerID="35802b163595b3987194b6387d087b85f0b406cb60e0ddc122cff8ab0395fe00" exitCode=0 Mar 10 07:19:25 crc kubenswrapper[4825]: I0310 07:19:25.868388 4825 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gk27t" event={"ID":"b362c1f8-afb5-4eda-bc93-117a7f8ea281","Type":"ContainerDied","Data":"35802b163595b3987194b6387d087b85f0b406cb60e0ddc122cff8ab0395fe00"} Mar 10 07:19:26 crc kubenswrapper[4825]: I0310 07:19:26.876890 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gk27t" event={"ID":"b362c1f8-afb5-4eda-bc93-117a7f8ea281","Type":"ContainerStarted","Data":"c026e6c5b2a79459fb122830e5d6e738b96d953b6867024835f1da65c50b3e8d"} Mar 10 07:19:26 crc kubenswrapper[4825]: I0310 07:19:26.905930 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gk27t" podStartSLOduration=2.476928093 podStartE2EDuration="4.905907898s" podCreationTimestamp="2026-03-10 07:19:22 +0000 UTC" firstStartedPulling="2026-03-10 07:19:23.842835287 +0000 UTC m=+2116.872615902" lastFinishedPulling="2026-03-10 07:19:26.271815052 +0000 UTC m=+2119.301595707" observedRunningTime="2026-03-10 07:19:26.903186576 +0000 UTC m=+2119.932967241" watchObservedRunningTime="2026-03-10 07:19:26.905907898 +0000 UTC m=+2119.935688523" Mar 10 07:19:32 crc kubenswrapper[4825]: I0310 07:19:32.489588 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gk27t" Mar 10 07:19:32 crc kubenswrapper[4825]: I0310 07:19:32.490109 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gk27t" Mar 10 07:19:33 crc kubenswrapper[4825]: I0310 07:19:33.571343 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gk27t" podUID="b362c1f8-afb5-4eda-bc93-117a7f8ea281" containerName="registry-server" probeResult="failure" output=< Mar 10 07:19:33 crc kubenswrapper[4825]: timeout: failed to connect service ":50051" within 1s Mar 10 07:19:33 crc kubenswrapper[4825]: > Mar 10 
07:19:42 crc kubenswrapper[4825]: I0310 07:19:42.567876 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gk27t" Mar 10 07:19:42 crc kubenswrapper[4825]: I0310 07:19:42.653940 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gk27t" Mar 10 07:19:42 crc kubenswrapper[4825]: I0310 07:19:42.820842 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gk27t"] Mar 10 07:19:44 crc kubenswrapper[4825]: I0310 07:19:44.026737 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gk27t" podUID="b362c1f8-afb5-4eda-bc93-117a7f8ea281" containerName="registry-server" containerID="cri-o://c026e6c5b2a79459fb122830e5d6e738b96d953b6867024835f1da65c50b3e8d" gracePeriod=2 Mar 10 07:19:44 crc kubenswrapper[4825]: I0310 07:19:44.469862 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gk27t" Mar 10 07:19:44 crc kubenswrapper[4825]: I0310 07:19:44.511987 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b362c1f8-afb5-4eda-bc93-117a7f8ea281-catalog-content\") pod \"b362c1f8-afb5-4eda-bc93-117a7f8ea281\" (UID: \"b362c1f8-afb5-4eda-bc93-117a7f8ea281\") " Mar 10 07:19:44 crc kubenswrapper[4825]: I0310 07:19:44.512066 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b362c1f8-afb5-4eda-bc93-117a7f8ea281-utilities\") pod \"b362c1f8-afb5-4eda-bc93-117a7f8ea281\" (UID: \"b362c1f8-afb5-4eda-bc93-117a7f8ea281\") " Mar 10 07:19:44 crc kubenswrapper[4825]: I0310 07:19:44.512101 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbnbq\" (UniqueName: \"kubernetes.io/projected/b362c1f8-afb5-4eda-bc93-117a7f8ea281-kube-api-access-fbnbq\") pod \"b362c1f8-afb5-4eda-bc93-117a7f8ea281\" (UID: \"b362c1f8-afb5-4eda-bc93-117a7f8ea281\") " Mar 10 07:19:44 crc kubenswrapper[4825]: I0310 07:19:44.513214 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b362c1f8-afb5-4eda-bc93-117a7f8ea281-utilities" (OuterVolumeSpecName: "utilities") pod "b362c1f8-afb5-4eda-bc93-117a7f8ea281" (UID: "b362c1f8-afb5-4eda-bc93-117a7f8ea281"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:19:44 crc kubenswrapper[4825]: I0310 07:19:44.517790 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b362c1f8-afb5-4eda-bc93-117a7f8ea281-kube-api-access-fbnbq" (OuterVolumeSpecName: "kube-api-access-fbnbq") pod "b362c1f8-afb5-4eda-bc93-117a7f8ea281" (UID: "b362c1f8-afb5-4eda-bc93-117a7f8ea281"). InnerVolumeSpecName "kube-api-access-fbnbq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:19:44 crc kubenswrapper[4825]: I0310 07:19:44.613873 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b362c1f8-afb5-4eda-bc93-117a7f8ea281-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 07:19:44 crc kubenswrapper[4825]: I0310 07:19:44.613931 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbnbq\" (UniqueName: \"kubernetes.io/projected/b362c1f8-afb5-4eda-bc93-117a7f8ea281-kube-api-access-fbnbq\") on node \"crc\" DevicePath \"\"" Mar 10 07:19:44 crc kubenswrapper[4825]: I0310 07:19:44.682823 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b362c1f8-afb5-4eda-bc93-117a7f8ea281-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b362c1f8-afb5-4eda-bc93-117a7f8ea281" (UID: "b362c1f8-afb5-4eda-bc93-117a7f8ea281"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:19:44 crc kubenswrapper[4825]: I0310 07:19:44.715292 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b362c1f8-afb5-4eda-bc93-117a7f8ea281-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 07:19:45 crc kubenswrapper[4825]: I0310 07:19:45.042539 4825 generic.go:334] "Generic (PLEG): container finished" podID="b362c1f8-afb5-4eda-bc93-117a7f8ea281" containerID="c026e6c5b2a79459fb122830e5d6e738b96d953b6867024835f1da65c50b3e8d" exitCode=0 Mar 10 07:19:45 crc kubenswrapper[4825]: I0310 07:19:45.042612 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gk27t" event={"ID":"b362c1f8-afb5-4eda-bc93-117a7f8ea281","Type":"ContainerDied","Data":"c026e6c5b2a79459fb122830e5d6e738b96d953b6867024835f1da65c50b3e8d"} Mar 10 07:19:45 crc kubenswrapper[4825]: I0310 07:19:45.042680 4825 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-gk27t" event={"ID":"b362c1f8-afb5-4eda-bc93-117a7f8ea281","Type":"ContainerDied","Data":"d174de2a930e05d2f58ed6a1aedc19dfdb538ec90ad4a075b378a0a9c392ac36"} Mar 10 07:19:45 crc kubenswrapper[4825]: I0310 07:19:45.042719 4825 scope.go:117] "RemoveContainer" containerID="c026e6c5b2a79459fb122830e5d6e738b96d953b6867024835f1da65c50b3e8d" Mar 10 07:19:45 crc kubenswrapper[4825]: I0310 07:19:45.042715 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gk27t" Mar 10 07:19:45 crc kubenswrapper[4825]: I0310 07:19:45.075075 4825 scope.go:117] "RemoveContainer" containerID="35802b163595b3987194b6387d087b85f0b406cb60e0ddc122cff8ab0395fe00" Mar 10 07:19:45 crc kubenswrapper[4825]: I0310 07:19:45.103914 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gk27t"] Mar 10 07:19:45 crc kubenswrapper[4825]: I0310 07:19:45.105448 4825 scope.go:117] "RemoveContainer" containerID="8b386e049fada32aaccec3276081e7577673d285c9b376ef1e93c2438f347507" Mar 10 07:19:45 crc kubenswrapper[4825]: I0310 07:19:45.110657 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gk27t"] Mar 10 07:19:45 crc kubenswrapper[4825]: I0310 07:19:45.141423 4825 scope.go:117] "RemoveContainer" containerID="c026e6c5b2a79459fb122830e5d6e738b96d953b6867024835f1da65c50b3e8d" Mar 10 07:19:45 crc kubenswrapper[4825]: E0310 07:19:45.142610 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c026e6c5b2a79459fb122830e5d6e738b96d953b6867024835f1da65c50b3e8d\": container with ID starting with c026e6c5b2a79459fb122830e5d6e738b96d953b6867024835f1da65c50b3e8d not found: ID does not exist" containerID="c026e6c5b2a79459fb122830e5d6e738b96d953b6867024835f1da65c50b3e8d" Mar 10 07:19:45 crc kubenswrapper[4825]: I0310 07:19:45.142653 4825 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c026e6c5b2a79459fb122830e5d6e738b96d953b6867024835f1da65c50b3e8d"} err="failed to get container status \"c026e6c5b2a79459fb122830e5d6e738b96d953b6867024835f1da65c50b3e8d\": rpc error: code = NotFound desc = could not find container \"c026e6c5b2a79459fb122830e5d6e738b96d953b6867024835f1da65c50b3e8d\": container with ID starting with c026e6c5b2a79459fb122830e5d6e738b96d953b6867024835f1da65c50b3e8d not found: ID does not exist" Mar 10 07:19:45 crc kubenswrapper[4825]: I0310 07:19:45.142700 4825 scope.go:117] "RemoveContainer" containerID="35802b163595b3987194b6387d087b85f0b406cb60e0ddc122cff8ab0395fe00" Mar 10 07:19:45 crc kubenswrapper[4825]: E0310 07:19:45.143730 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35802b163595b3987194b6387d087b85f0b406cb60e0ddc122cff8ab0395fe00\": container with ID starting with 35802b163595b3987194b6387d087b85f0b406cb60e0ddc122cff8ab0395fe00 not found: ID does not exist" containerID="35802b163595b3987194b6387d087b85f0b406cb60e0ddc122cff8ab0395fe00" Mar 10 07:19:45 crc kubenswrapper[4825]: I0310 07:19:45.143790 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35802b163595b3987194b6387d087b85f0b406cb60e0ddc122cff8ab0395fe00"} err="failed to get container status \"35802b163595b3987194b6387d087b85f0b406cb60e0ddc122cff8ab0395fe00\": rpc error: code = NotFound desc = could not find container \"35802b163595b3987194b6387d087b85f0b406cb60e0ddc122cff8ab0395fe00\": container with ID starting with 35802b163595b3987194b6387d087b85f0b406cb60e0ddc122cff8ab0395fe00 not found: ID does not exist" Mar 10 07:19:45 crc kubenswrapper[4825]: I0310 07:19:45.143829 4825 scope.go:117] "RemoveContainer" containerID="8b386e049fada32aaccec3276081e7577673d285c9b376ef1e93c2438f347507" Mar 10 07:19:45 crc kubenswrapper[4825]: E0310 
07:19:45.145360 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b386e049fada32aaccec3276081e7577673d285c9b376ef1e93c2438f347507\": container with ID starting with 8b386e049fada32aaccec3276081e7577673d285c9b376ef1e93c2438f347507 not found: ID does not exist" containerID="8b386e049fada32aaccec3276081e7577673d285c9b376ef1e93c2438f347507" Mar 10 07:19:45 crc kubenswrapper[4825]: I0310 07:19:45.145444 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b386e049fada32aaccec3276081e7577673d285c9b376ef1e93c2438f347507"} err="failed to get container status \"8b386e049fada32aaccec3276081e7577673d285c9b376ef1e93c2438f347507\": rpc error: code = NotFound desc = could not find container \"8b386e049fada32aaccec3276081e7577673d285c9b376ef1e93c2438f347507\": container with ID starting with 8b386e049fada32aaccec3276081e7577673d285c9b376ef1e93c2438f347507 not found: ID does not exist" Mar 10 07:19:45 crc kubenswrapper[4825]: I0310 07:19:45.245034 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b362c1f8-afb5-4eda-bc93-117a7f8ea281" path="/var/lib/kubelet/pods/b362c1f8-afb5-4eda-bc93-117a7f8ea281/volumes" Mar 10 07:20:00 crc kubenswrapper[4825]: I0310 07:20:00.154664 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552120-lzjsg"] Mar 10 07:20:00 crc kubenswrapper[4825]: E0310 07:20:00.156607 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b362c1f8-afb5-4eda-bc93-117a7f8ea281" containerName="extract-content" Mar 10 07:20:00 crc kubenswrapper[4825]: I0310 07:20:00.156710 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b362c1f8-afb5-4eda-bc93-117a7f8ea281" containerName="extract-content" Mar 10 07:20:00 crc kubenswrapper[4825]: E0310 07:20:00.156802 4825 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b362c1f8-afb5-4eda-bc93-117a7f8ea281" containerName="registry-server" Mar 10 07:20:00 crc kubenswrapper[4825]: I0310 07:20:00.156880 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b362c1f8-afb5-4eda-bc93-117a7f8ea281" containerName="registry-server" Mar 10 07:20:00 crc kubenswrapper[4825]: E0310 07:20:00.156983 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b362c1f8-afb5-4eda-bc93-117a7f8ea281" containerName="extract-utilities" Mar 10 07:20:00 crc kubenswrapper[4825]: I0310 07:20:00.157060 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b362c1f8-afb5-4eda-bc93-117a7f8ea281" containerName="extract-utilities" Mar 10 07:20:00 crc kubenswrapper[4825]: I0310 07:20:00.157284 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="b362c1f8-afb5-4eda-bc93-117a7f8ea281" containerName="registry-server" Mar 10 07:20:00 crc kubenswrapper[4825]: I0310 07:20:00.157846 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552120-lzjsg" Mar 10 07:20:00 crc kubenswrapper[4825]: I0310 07:20:00.162476 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 07:20:00 crc kubenswrapper[4825]: I0310 07:20:00.162636 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 07:20:00 crc kubenswrapper[4825]: I0310 07:20:00.164558 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 07:20:00 crc kubenswrapper[4825]: I0310 07:20:00.187371 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552120-lzjsg"] Mar 10 07:20:00 crc kubenswrapper[4825]: I0310 07:20:00.261061 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqfqx\" (UniqueName: 
\"kubernetes.io/projected/8ee48ed3-b478-4fe2-ba16-39060711f446-kube-api-access-kqfqx\") pod \"auto-csr-approver-29552120-lzjsg\" (UID: \"8ee48ed3-b478-4fe2-ba16-39060711f446\") " pod="openshift-infra/auto-csr-approver-29552120-lzjsg" Mar 10 07:20:00 crc kubenswrapper[4825]: I0310 07:20:00.362371 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqfqx\" (UniqueName: \"kubernetes.io/projected/8ee48ed3-b478-4fe2-ba16-39060711f446-kube-api-access-kqfqx\") pod \"auto-csr-approver-29552120-lzjsg\" (UID: \"8ee48ed3-b478-4fe2-ba16-39060711f446\") " pod="openshift-infra/auto-csr-approver-29552120-lzjsg" Mar 10 07:20:00 crc kubenswrapper[4825]: I0310 07:20:00.381595 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqfqx\" (UniqueName: \"kubernetes.io/projected/8ee48ed3-b478-4fe2-ba16-39060711f446-kube-api-access-kqfqx\") pod \"auto-csr-approver-29552120-lzjsg\" (UID: \"8ee48ed3-b478-4fe2-ba16-39060711f446\") " pod="openshift-infra/auto-csr-approver-29552120-lzjsg" Mar 10 07:20:00 crc kubenswrapper[4825]: I0310 07:20:00.478423 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552120-lzjsg" Mar 10 07:20:00 crc kubenswrapper[4825]: I0310 07:20:00.745832 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552120-lzjsg"] Mar 10 07:20:01 crc kubenswrapper[4825]: I0310 07:20:01.202687 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552120-lzjsg" event={"ID":"8ee48ed3-b478-4fe2-ba16-39060711f446","Type":"ContainerStarted","Data":"9a4c099acd4aeca0dacb5574979d12cc5566a61e0e113736059cca5c9a3e247f"} Mar 10 07:20:02 crc kubenswrapper[4825]: I0310 07:20:02.214684 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552120-lzjsg" event={"ID":"8ee48ed3-b478-4fe2-ba16-39060711f446","Type":"ContainerStarted","Data":"c05715d8163cbcbef2a3774a17e0efef9fdf1d7aef0433f053f9f328e4345879"} Mar 10 07:20:02 crc kubenswrapper[4825]: I0310 07:20:02.242751 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552120-lzjsg" podStartSLOduration=1.2192299659999999 podStartE2EDuration="2.242731497s" podCreationTimestamp="2026-03-10 07:20:00 +0000 UTC" firstStartedPulling="2026-03-10 07:20:00.760879776 +0000 UTC m=+2153.790660411" lastFinishedPulling="2026-03-10 07:20:01.784381317 +0000 UTC m=+2154.814161942" observedRunningTime="2026-03-10 07:20:02.23486926 +0000 UTC m=+2155.264649905" watchObservedRunningTime="2026-03-10 07:20:02.242731497 +0000 UTC m=+2155.272512122" Mar 10 07:20:03 crc kubenswrapper[4825]: I0310 07:20:03.224020 4825 generic.go:334] "Generic (PLEG): container finished" podID="8ee48ed3-b478-4fe2-ba16-39060711f446" containerID="c05715d8163cbcbef2a3774a17e0efef9fdf1d7aef0433f053f9f328e4345879" exitCode=0 Mar 10 07:20:03 crc kubenswrapper[4825]: I0310 07:20:03.224085 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552120-lzjsg" 
event={"ID":"8ee48ed3-b478-4fe2-ba16-39060711f446","Type":"ContainerDied","Data":"c05715d8163cbcbef2a3774a17e0efef9fdf1d7aef0433f053f9f328e4345879"} Mar 10 07:20:04 crc kubenswrapper[4825]: I0310 07:20:04.573853 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552120-lzjsg" Mar 10 07:20:04 crc kubenswrapper[4825]: I0310 07:20:04.761196 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqfqx\" (UniqueName: \"kubernetes.io/projected/8ee48ed3-b478-4fe2-ba16-39060711f446-kube-api-access-kqfqx\") pod \"8ee48ed3-b478-4fe2-ba16-39060711f446\" (UID: \"8ee48ed3-b478-4fe2-ba16-39060711f446\") " Mar 10 07:20:04 crc kubenswrapper[4825]: I0310 07:20:04.769926 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ee48ed3-b478-4fe2-ba16-39060711f446-kube-api-access-kqfqx" (OuterVolumeSpecName: "kube-api-access-kqfqx") pod "8ee48ed3-b478-4fe2-ba16-39060711f446" (UID: "8ee48ed3-b478-4fe2-ba16-39060711f446"). InnerVolumeSpecName "kube-api-access-kqfqx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:20:04 crc kubenswrapper[4825]: I0310 07:20:04.863167 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqfqx\" (UniqueName: \"kubernetes.io/projected/8ee48ed3-b478-4fe2-ba16-39060711f446-kube-api-access-kqfqx\") on node \"crc\" DevicePath \"\"" Mar 10 07:20:05 crc kubenswrapper[4825]: I0310 07:20:05.244044 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552120-lzjsg" Mar 10 07:20:05 crc kubenswrapper[4825]: I0310 07:20:05.252127 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552120-lzjsg" event={"ID":"8ee48ed3-b478-4fe2-ba16-39060711f446","Type":"ContainerDied","Data":"9a4c099acd4aeca0dacb5574979d12cc5566a61e0e113736059cca5c9a3e247f"} Mar 10 07:20:05 crc kubenswrapper[4825]: I0310 07:20:05.252223 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a4c099acd4aeca0dacb5574979d12cc5566a61e0e113736059cca5c9a3e247f" Mar 10 07:20:05 crc kubenswrapper[4825]: I0310 07:20:05.332269 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552114-ql6j9"] Mar 10 07:20:05 crc kubenswrapper[4825]: I0310 07:20:05.339712 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552114-ql6j9"] Mar 10 07:20:07 crc kubenswrapper[4825]: I0310 07:20:07.251598 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c659e062-f5dd-48b0-bcbe-4b448da7bb7b" path="/var/lib/kubelet/pods/c659e062-f5dd-48b0-bcbe-4b448da7bb7b/volumes" Mar 10 07:20:37 crc kubenswrapper[4825]: I0310 07:20:37.631768 4825 scope.go:117] "RemoveContainer" containerID="9dbaf27159878fa662337cfb370440e49dfeff645bb54f3dd5183e305fbae167" Mar 10 07:21:16 crc kubenswrapper[4825]: I0310 07:21:16.889080 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 07:21:16 crc kubenswrapper[4825]: I0310 07:21:16.890059 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 07:21:46 crc kubenswrapper[4825]: I0310 07:21:46.888364 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 07:21:46 crc kubenswrapper[4825]: I0310 07:21:46.890775 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 07:22:00 crc kubenswrapper[4825]: I0310 07:22:00.173244 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552122-zwxnr"] Mar 10 07:22:00 crc kubenswrapper[4825]: E0310 07:22:00.174661 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ee48ed3-b478-4fe2-ba16-39060711f446" containerName="oc" Mar 10 07:22:00 crc kubenswrapper[4825]: I0310 07:22:00.174698 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ee48ed3-b478-4fe2-ba16-39060711f446" containerName="oc" Mar 10 07:22:00 crc kubenswrapper[4825]: I0310 07:22:00.175052 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ee48ed3-b478-4fe2-ba16-39060711f446" containerName="oc" Mar 10 07:22:00 crc kubenswrapper[4825]: I0310 07:22:00.176036 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552122-zwxnr" Mar 10 07:22:00 crc kubenswrapper[4825]: I0310 07:22:00.181873 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 07:22:00 crc kubenswrapper[4825]: I0310 07:22:00.182444 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 07:22:00 crc kubenswrapper[4825]: I0310 07:22:00.182963 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 07:22:00 crc kubenswrapper[4825]: I0310 07:22:00.185706 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552122-zwxnr"] Mar 10 07:22:00 crc kubenswrapper[4825]: I0310 07:22:00.253204 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dvt8\" (UniqueName: \"kubernetes.io/projected/18954899-280e-4222-bd8f-f23dde790714-kube-api-access-6dvt8\") pod \"auto-csr-approver-29552122-zwxnr\" (UID: \"18954899-280e-4222-bd8f-f23dde790714\") " pod="openshift-infra/auto-csr-approver-29552122-zwxnr" Mar 10 07:22:00 crc kubenswrapper[4825]: I0310 07:22:00.354256 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dvt8\" (UniqueName: \"kubernetes.io/projected/18954899-280e-4222-bd8f-f23dde790714-kube-api-access-6dvt8\") pod \"auto-csr-approver-29552122-zwxnr\" (UID: \"18954899-280e-4222-bd8f-f23dde790714\") " pod="openshift-infra/auto-csr-approver-29552122-zwxnr" Mar 10 07:22:00 crc kubenswrapper[4825]: I0310 07:22:00.386487 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dvt8\" (UniqueName: \"kubernetes.io/projected/18954899-280e-4222-bd8f-f23dde790714-kube-api-access-6dvt8\") pod \"auto-csr-approver-29552122-zwxnr\" (UID: \"18954899-280e-4222-bd8f-f23dde790714\") " 
pod="openshift-infra/auto-csr-approver-29552122-zwxnr" Mar 10 07:22:00 crc kubenswrapper[4825]: I0310 07:22:00.506857 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552122-zwxnr" Mar 10 07:22:01 crc kubenswrapper[4825]: I0310 07:22:01.049739 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552122-zwxnr"] Mar 10 07:22:01 crc kubenswrapper[4825]: I0310 07:22:01.051861 4825 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 07:22:01 crc kubenswrapper[4825]: I0310 07:22:01.311530 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552122-zwxnr" event={"ID":"18954899-280e-4222-bd8f-f23dde790714","Type":"ContainerStarted","Data":"e715b7d0f3498f1168b559d08405a5ac5df4e139eb053f6c2306e3c838b876a6"} Mar 10 07:22:02 crc kubenswrapper[4825]: I0310 07:22:02.323169 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552122-zwxnr" event={"ID":"18954899-280e-4222-bd8f-f23dde790714","Type":"ContainerStarted","Data":"6e624e4ecf0c2bd0cb9f6b0efd1293653cc2b35372878c46b873163f5fa2b138"} Mar 10 07:22:02 crc kubenswrapper[4825]: I0310 07:22:02.352557 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552122-zwxnr" podStartSLOduration=1.455721827 podStartE2EDuration="2.352537562s" podCreationTimestamp="2026-03-10 07:22:00 +0000 UTC" firstStartedPulling="2026-03-10 07:22:01.050472832 +0000 UTC m=+2274.080253487" lastFinishedPulling="2026-03-10 07:22:01.947288567 +0000 UTC m=+2274.977069222" observedRunningTime="2026-03-10 07:22:02.346650667 +0000 UTC m=+2275.376431322" watchObservedRunningTime="2026-03-10 07:22:02.352537562 +0000 UTC m=+2275.382318187" Mar 10 07:22:03 crc kubenswrapper[4825]: I0310 07:22:03.334616 4825 generic.go:334] "Generic (PLEG): container finished" 
podID="18954899-280e-4222-bd8f-f23dde790714" containerID="6e624e4ecf0c2bd0cb9f6b0efd1293653cc2b35372878c46b873163f5fa2b138" exitCode=0 Mar 10 07:22:03 crc kubenswrapper[4825]: I0310 07:22:03.334684 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552122-zwxnr" event={"ID":"18954899-280e-4222-bd8f-f23dde790714","Type":"ContainerDied","Data":"6e624e4ecf0c2bd0cb9f6b0efd1293653cc2b35372878c46b873163f5fa2b138"} Mar 10 07:22:04 crc kubenswrapper[4825]: I0310 07:22:04.691433 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552122-zwxnr" Mar 10 07:22:04 crc kubenswrapper[4825]: I0310 07:22:04.860733 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dvt8\" (UniqueName: \"kubernetes.io/projected/18954899-280e-4222-bd8f-f23dde790714-kube-api-access-6dvt8\") pod \"18954899-280e-4222-bd8f-f23dde790714\" (UID: \"18954899-280e-4222-bd8f-f23dde790714\") " Mar 10 07:22:04 crc kubenswrapper[4825]: I0310 07:22:04.865633 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18954899-280e-4222-bd8f-f23dde790714-kube-api-access-6dvt8" (OuterVolumeSpecName: "kube-api-access-6dvt8") pod "18954899-280e-4222-bd8f-f23dde790714" (UID: "18954899-280e-4222-bd8f-f23dde790714"). InnerVolumeSpecName "kube-api-access-6dvt8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:22:04 crc kubenswrapper[4825]: I0310 07:22:04.962257 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dvt8\" (UniqueName: \"kubernetes.io/projected/18954899-280e-4222-bd8f-f23dde790714-kube-api-access-6dvt8\") on node \"crc\" DevicePath \"\"" Mar 10 07:22:05 crc kubenswrapper[4825]: I0310 07:22:05.353388 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552122-zwxnr" event={"ID":"18954899-280e-4222-bd8f-f23dde790714","Type":"ContainerDied","Data":"e715b7d0f3498f1168b559d08405a5ac5df4e139eb053f6c2306e3c838b876a6"} Mar 10 07:22:05 crc kubenswrapper[4825]: I0310 07:22:05.353523 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e715b7d0f3498f1168b559d08405a5ac5df4e139eb053f6c2306e3c838b876a6" Mar 10 07:22:05 crc kubenswrapper[4825]: I0310 07:22:05.353610 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552122-zwxnr" Mar 10 07:22:05 crc kubenswrapper[4825]: I0310 07:22:05.428154 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552116-ncwdw"] Mar 10 07:22:05 crc kubenswrapper[4825]: I0310 07:22:05.435160 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552116-ncwdw"] Mar 10 07:22:07 crc kubenswrapper[4825]: I0310 07:22:07.255580 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="906f198a-b4c0-4958-b736-1cf9febbdc59" path="/var/lib/kubelet/pods/906f198a-b4c0-4958-b736-1cf9febbdc59/volumes" Mar 10 07:22:16 crc kubenswrapper[4825]: I0310 07:22:16.888819 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 10 07:22:16 crc kubenswrapper[4825]: I0310 07:22:16.889360 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 07:22:16 crc kubenswrapper[4825]: I0310 07:22:16.889409 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" Mar 10 07:22:16 crc kubenswrapper[4825]: I0310 07:22:16.889975 4825 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dddfba3f7a2f8d3823d9a072175efb197634fd1df552eb76662bdf50e8519ec5"} pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 07:22:16 crc kubenswrapper[4825]: I0310 07:22:16.890047 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" containerID="cri-o://dddfba3f7a2f8d3823d9a072175efb197634fd1df552eb76662bdf50e8519ec5" gracePeriod=600 Mar 10 07:22:17 crc kubenswrapper[4825]: I0310 07:22:17.468078 4825 generic.go:334] "Generic (PLEG): container finished" podID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerID="dddfba3f7a2f8d3823d9a072175efb197634fd1df552eb76662bdf50e8519ec5" exitCode=0 Mar 10 07:22:17 crc kubenswrapper[4825]: I0310 07:22:17.468315 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" 
event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerDied","Data":"dddfba3f7a2f8d3823d9a072175efb197634fd1df552eb76662bdf50e8519ec5"} Mar 10 07:22:17 crc kubenswrapper[4825]: I0310 07:22:17.468529 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerStarted","Data":"d1fa67187f8ea0c80a9c36034a9ddfdf1c0431ab2c7d3c4059f27045f96a6b45"} Mar 10 07:22:17 crc kubenswrapper[4825]: I0310 07:22:17.468557 4825 scope.go:117] "RemoveContainer" containerID="4b318206344caaa244c87f6f76d1f2dd80b8f698eb9b562ec892c059af8f81e5" Mar 10 07:22:25 crc kubenswrapper[4825]: I0310 07:22:25.180059 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hj9ws"] Mar 10 07:22:25 crc kubenswrapper[4825]: E0310 07:22:25.181093 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18954899-280e-4222-bd8f-f23dde790714" containerName="oc" Mar 10 07:22:25 crc kubenswrapper[4825]: I0310 07:22:25.181114 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="18954899-280e-4222-bd8f-f23dde790714" containerName="oc" Mar 10 07:22:25 crc kubenswrapper[4825]: I0310 07:22:25.181397 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="18954899-280e-4222-bd8f-f23dde790714" containerName="oc" Mar 10 07:22:25 crc kubenswrapper[4825]: I0310 07:22:25.183086 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hj9ws" Mar 10 07:22:25 crc kubenswrapper[4825]: I0310 07:22:25.209763 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hj9ws"] Mar 10 07:22:25 crc kubenswrapper[4825]: I0310 07:22:25.315853 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z5x5\" (UniqueName: \"kubernetes.io/projected/459d5515-68ea-44dd-b54a-af57cd3fe2b7-kube-api-access-7z5x5\") pod \"redhat-marketplace-hj9ws\" (UID: \"459d5515-68ea-44dd-b54a-af57cd3fe2b7\") " pod="openshift-marketplace/redhat-marketplace-hj9ws" Mar 10 07:22:25 crc kubenswrapper[4825]: I0310 07:22:25.316189 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/459d5515-68ea-44dd-b54a-af57cd3fe2b7-utilities\") pod \"redhat-marketplace-hj9ws\" (UID: \"459d5515-68ea-44dd-b54a-af57cd3fe2b7\") " pod="openshift-marketplace/redhat-marketplace-hj9ws" Mar 10 07:22:25 crc kubenswrapper[4825]: I0310 07:22:25.316292 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/459d5515-68ea-44dd-b54a-af57cd3fe2b7-catalog-content\") pod \"redhat-marketplace-hj9ws\" (UID: \"459d5515-68ea-44dd-b54a-af57cd3fe2b7\") " pod="openshift-marketplace/redhat-marketplace-hj9ws" Mar 10 07:22:25 crc kubenswrapper[4825]: I0310 07:22:25.417158 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/459d5515-68ea-44dd-b54a-af57cd3fe2b7-catalog-content\") pod \"redhat-marketplace-hj9ws\" (UID: \"459d5515-68ea-44dd-b54a-af57cd3fe2b7\") " pod="openshift-marketplace/redhat-marketplace-hj9ws" Mar 10 07:22:25 crc kubenswrapper[4825]: I0310 07:22:25.417203 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7z5x5\" (UniqueName: \"kubernetes.io/projected/459d5515-68ea-44dd-b54a-af57cd3fe2b7-kube-api-access-7z5x5\") pod \"redhat-marketplace-hj9ws\" (UID: \"459d5515-68ea-44dd-b54a-af57cd3fe2b7\") " pod="openshift-marketplace/redhat-marketplace-hj9ws" Mar 10 07:22:25 crc kubenswrapper[4825]: I0310 07:22:25.417258 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/459d5515-68ea-44dd-b54a-af57cd3fe2b7-utilities\") pod \"redhat-marketplace-hj9ws\" (UID: \"459d5515-68ea-44dd-b54a-af57cd3fe2b7\") " pod="openshift-marketplace/redhat-marketplace-hj9ws" Mar 10 07:22:25 crc kubenswrapper[4825]: I0310 07:22:25.418060 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/459d5515-68ea-44dd-b54a-af57cd3fe2b7-utilities\") pod \"redhat-marketplace-hj9ws\" (UID: \"459d5515-68ea-44dd-b54a-af57cd3fe2b7\") " pod="openshift-marketplace/redhat-marketplace-hj9ws" Mar 10 07:22:25 crc kubenswrapper[4825]: I0310 07:22:25.418538 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/459d5515-68ea-44dd-b54a-af57cd3fe2b7-catalog-content\") pod \"redhat-marketplace-hj9ws\" (UID: \"459d5515-68ea-44dd-b54a-af57cd3fe2b7\") " pod="openshift-marketplace/redhat-marketplace-hj9ws" Mar 10 07:22:25 crc kubenswrapper[4825]: I0310 07:22:25.443186 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z5x5\" (UniqueName: \"kubernetes.io/projected/459d5515-68ea-44dd-b54a-af57cd3fe2b7-kube-api-access-7z5x5\") pod \"redhat-marketplace-hj9ws\" (UID: \"459d5515-68ea-44dd-b54a-af57cd3fe2b7\") " pod="openshift-marketplace/redhat-marketplace-hj9ws" Mar 10 07:22:25 crc kubenswrapper[4825]: I0310 07:22:25.516492 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hj9ws" Mar 10 07:22:25 crc kubenswrapper[4825]: I0310 07:22:25.775459 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hj9ws"] Mar 10 07:22:26 crc kubenswrapper[4825]: I0310 07:22:26.549223 4825 generic.go:334] "Generic (PLEG): container finished" podID="459d5515-68ea-44dd-b54a-af57cd3fe2b7" containerID="ee7ba71e4c8449495a51ef38211341c52c0920325dd2b06ddd56c983af2c29d1" exitCode=0 Mar 10 07:22:26 crc kubenswrapper[4825]: I0310 07:22:26.549321 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hj9ws" event={"ID":"459d5515-68ea-44dd-b54a-af57cd3fe2b7","Type":"ContainerDied","Data":"ee7ba71e4c8449495a51ef38211341c52c0920325dd2b06ddd56c983af2c29d1"} Mar 10 07:22:26 crc kubenswrapper[4825]: I0310 07:22:26.549838 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hj9ws" event={"ID":"459d5515-68ea-44dd-b54a-af57cd3fe2b7","Type":"ContainerStarted","Data":"53fe35e876c0866031915b108233ef604110d4339a41e453607c39dd824647ba"} Mar 10 07:22:27 crc kubenswrapper[4825]: I0310 07:22:27.563333 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hj9ws" event={"ID":"459d5515-68ea-44dd-b54a-af57cd3fe2b7","Type":"ContainerStarted","Data":"f4fc73f98cdc3593f02411f5f1ff649040dfd0302a41f5ddf37b4756dc19cb9d"} Mar 10 07:22:28 crc kubenswrapper[4825]: I0310 07:22:28.571782 4825 generic.go:334] "Generic (PLEG): container finished" podID="459d5515-68ea-44dd-b54a-af57cd3fe2b7" containerID="f4fc73f98cdc3593f02411f5f1ff649040dfd0302a41f5ddf37b4756dc19cb9d" exitCode=0 Mar 10 07:22:28 crc kubenswrapper[4825]: I0310 07:22:28.571840 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hj9ws" 
event={"ID":"459d5515-68ea-44dd-b54a-af57cd3fe2b7","Type":"ContainerDied","Data":"f4fc73f98cdc3593f02411f5f1ff649040dfd0302a41f5ddf37b4756dc19cb9d"} Mar 10 07:22:29 crc kubenswrapper[4825]: I0310 07:22:29.580965 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hj9ws" event={"ID":"459d5515-68ea-44dd-b54a-af57cd3fe2b7","Type":"ContainerStarted","Data":"0cc920bfc278cfb3013374d5461cdde11ca3424c536569537b2643237e18eed4"} Mar 10 07:22:29 crc kubenswrapper[4825]: I0310 07:22:29.604535 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hj9ws" podStartSLOduration=2.194060102 podStartE2EDuration="4.60451545s" podCreationTimestamp="2026-03-10 07:22:25 +0000 UTC" firstStartedPulling="2026-03-10 07:22:26.551622648 +0000 UTC m=+2299.581403303" lastFinishedPulling="2026-03-10 07:22:28.962078036 +0000 UTC m=+2301.991858651" observedRunningTime="2026-03-10 07:22:29.600471443 +0000 UTC m=+2302.630252078" watchObservedRunningTime="2026-03-10 07:22:29.60451545 +0000 UTC m=+2302.634296055" Mar 10 07:22:35 crc kubenswrapper[4825]: I0310 07:22:35.517403 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hj9ws" Mar 10 07:22:35 crc kubenswrapper[4825]: I0310 07:22:35.517993 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hj9ws" Mar 10 07:22:35 crc kubenswrapper[4825]: I0310 07:22:35.600051 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hj9ws" Mar 10 07:22:35 crc kubenswrapper[4825]: I0310 07:22:35.694418 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hj9ws" Mar 10 07:22:35 crc kubenswrapper[4825]: I0310 07:22:35.844323 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-hj9ws"] Mar 10 07:22:37 crc kubenswrapper[4825]: I0310 07:22:37.658350 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hj9ws" podUID="459d5515-68ea-44dd-b54a-af57cd3fe2b7" containerName="registry-server" containerID="cri-o://0cc920bfc278cfb3013374d5461cdde11ca3424c536569537b2643237e18eed4" gracePeriod=2 Mar 10 07:22:37 crc kubenswrapper[4825]: I0310 07:22:37.747338 4825 scope.go:117] "RemoveContainer" containerID="9e28c7381c0a537ec0798b11356883dada8501972ab1bf4e10dff52e0ca20c53" Mar 10 07:22:38 crc kubenswrapper[4825]: I0310 07:22:38.657748 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hj9ws" Mar 10 07:22:38 crc kubenswrapper[4825]: I0310 07:22:38.668109 4825 generic.go:334] "Generic (PLEG): container finished" podID="459d5515-68ea-44dd-b54a-af57cd3fe2b7" containerID="0cc920bfc278cfb3013374d5461cdde11ca3424c536569537b2643237e18eed4" exitCode=0 Mar 10 07:22:38 crc kubenswrapper[4825]: I0310 07:22:38.668173 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hj9ws" event={"ID":"459d5515-68ea-44dd-b54a-af57cd3fe2b7","Type":"ContainerDied","Data":"0cc920bfc278cfb3013374d5461cdde11ca3424c536569537b2643237e18eed4"} Mar 10 07:22:38 crc kubenswrapper[4825]: I0310 07:22:38.668224 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hj9ws" Mar 10 07:22:38 crc kubenswrapper[4825]: I0310 07:22:38.668243 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hj9ws" event={"ID":"459d5515-68ea-44dd-b54a-af57cd3fe2b7","Type":"ContainerDied","Data":"53fe35e876c0866031915b108233ef604110d4339a41e453607c39dd824647ba"} Mar 10 07:22:38 crc kubenswrapper[4825]: I0310 07:22:38.668290 4825 scope.go:117] "RemoveContainer" containerID="0cc920bfc278cfb3013374d5461cdde11ca3424c536569537b2643237e18eed4" Mar 10 07:22:38 crc kubenswrapper[4825]: I0310 07:22:38.700882 4825 scope.go:117] "RemoveContainer" containerID="f4fc73f98cdc3593f02411f5f1ff649040dfd0302a41f5ddf37b4756dc19cb9d" Mar 10 07:22:38 crc kubenswrapper[4825]: I0310 07:22:38.719898 4825 scope.go:117] "RemoveContainer" containerID="ee7ba71e4c8449495a51ef38211341c52c0920325dd2b06ddd56c983af2c29d1" Mar 10 07:22:38 crc kubenswrapper[4825]: I0310 07:22:38.739831 4825 scope.go:117] "RemoveContainer" containerID="0cc920bfc278cfb3013374d5461cdde11ca3424c536569537b2643237e18eed4" Mar 10 07:22:38 crc kubenswrapper[4825]: E0310 07:22:38.740319 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cc920bfc278cfb3013374d5461cdde11ca3424c536569537b2643237e18eed4\": container with ID starting with 0cc920bfc278cfb3013374d5461cdde11ca3424c536569537b2643237e18eed4 not found: ID does not exist" containerID="0cc920bfc278cfb3013374d5461cdde11ca3424c536569537b2643237e18eed4" Mar 10 07:22:38 crc kubenswrapper[4825]: I0310 07:22:38.740362 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cc920bfc278cfb3013374d5461cdde11ca3424c536569537b2643237e18eed4"} err="failed to get container status \"0cc920bfc278cfb3013374d5461cdde11ca3424c536569537b2643237e18eed4\": rpc error: code = NotFound desc = could not find container 
\"0cc920bfc278cfb3013374d5461cdde11ca3424c536569537b2643237e18eed4\": container with ID starting with 0cc920bfc278cfb3013374d5461cdde11ca3424c536569537b2643237e18eed4 not found: ID does not exist" Mar 10 07:22:38 crc kubenswrapper[4825]: I0310 07:22:38.740395 4825 scope.go:117] "RemoveContainer" containerID="f4fc73f98cdc3593f02411f5f1ff649040dfd0302a41f5ddf37b4756dc19cb9d" Mar 10 07:22:38 crc kubenswrapper[4825]: E0310 07:22:38.741617 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4fc73f98cdc3593f02411f5f1ff649040dfd0302a41f5ddf37b4756dc19cb9d\": container with ID starting with f4fc73f98cdc3593f02411f5f1ff649040dfd0302a41f5ddf37b4756dc19cb9d not found: ID does not exist" containerID="f4fc73f98cdc3593f02411f5f1ff649040dfd0302a41f5ddf37b4756dc19cb9d" Mar 10 07:22:38 crc kubenswrapper[4825]: I0310 07:22:38.741658 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4fc73f98cdc3593f02411f5f1ff649040dfd0302a41f5ddf37b4756dc19cb9d"} err="failed to get container status \"f4fc73f98cdc3593f02411f5f1ff649040dfd0302a41f5ddf37b4756dc19cb9d\": rpc error: code = NotFound desc = could not find container \"f4fc73f98cdc3593f02411f5f1ff649040dfd0302a41f5ddf37b4756dc19cb9d\": container with ID starting with f4fc73f98cdc3593f02411f5f1ff649040dfd0302a41f5ddf37b4756dc19cb9d not found: ID does not exist" Mar 10 07:22:38 crc kubenswrapper[4825]: I0310 07:22:38.741691 4825 scope.go:117] "RemoveContainer" containerID="ee7ba71e4c8449495a51ef38211341c52c0920325dd2b06ddd56c983af2c29d1" Mar 10 07:22:38 crc kubenswrapper[4825]: E0310 07:22:38.742183 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee7ba71e4c8449495a51ef38211341c52c0920325dd2b06ddd56c983af2c29d1\": container with ID starting with ee7ba71e4c8449495a51ef38211341c52c0920325dd2b06ddd56c983af2c29d1 not found: ID does not exist" 
containerID="ee7ba71e4c8449495a51ef38211341c52c0920325dd2b06ddd56c983af2c29d1" Mar 10 07:22:38 crc kubenswrapper[4825]: I0310 07:22:38.742211 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee7ba71e4c8449495a51ef38211341c52c0920325dd2b06ddd56c983af2c29d1"} err="failed to get container status \"ee7ba71e4c8449495a51ef38211341c52c0920325dd2b06ddd56c983af2c29d1\": rpc error: code = NotFound desc = could not find container \"ee7ba71e4c8449495a51ef38211341c52c0920325dd2b06ddd56c983af2c29d1\": container with ID starting with ee7ba71e4c8449495a51ef38211341c52c0920325dd2b06ddd56c983af2c29d1 not found: ID does not exist" Mar 10 07:22:38 crc kubenswrapper[4825]: I0310 07:22:38.810025 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7z5x5\" (UniqueName: \"kubernetes.io/projected/459d5515-68ea-44dd-b54a-af57cd3fe2b7-kube-api-access-7z5x5\") pod \"459d5515-68ea-44dd-b54a-af57cd3fe2b7\" (UID: \"459d5515-68ea-44dd-b54a-af57cd3fe2b7\") " Mar 10 07:22:38 crc kubenswrapper[4825]: I0310 07:22:38.810152 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/459d5515-68ea-44dd-b54a-af57cd3fe2b7-catalog-content\") pod \"459d5515-68ea-44dd-b54a-af57cd3fe2b7\" (UID: \"459d5515-68ea-44dd-b54a-af57cd3fe2b7\") " Mar 10 07:22:38 crc kubenswrapper[4825]: I0310 07:22:38.810190 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/459d5515-68ea-44dd-b54a-af57cd3fe2b7-utilities\") pod \"459d5515-68ea-44dd-b54a-af57cd3fe2b7\" (UID: \"459d5515-68ea-44dd-b54a-af57cd3fe2b7\") " Mar 10 07:22:38 crc kubenswrapper[4825]: I0310 07:22:38.810961 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/459d5515-68ea-44dd-b54a-af57cd3fe2b7-utilities" (OuterVolumeSpecName: "utilities") pod 
"459d5515-68ea-44dd-b54a-af57cd3fe2b7" (UID: "459d5515-68ea-44dd-b54a-af57cd3fe2b7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:22:38 crc kubenswrapper[4825]: I0310 07:22:38.818103 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/459d5515-68ea-44dd-b54a-af57cd3fe2b7-kube-api-access-7z5x5" (OuterVolumeSpecName: "kube-api-access-7z5x5") pod "459d5515-68ea-44dd-b54a-af57cd3fe2b7" (UID: "459d5515-68ea-44dd-b54a-af57cd3fe2b7"). InnerVolumeSpecName "kube-api-access-7z5x5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:22:38 crc kubenswrapper[4825]: I0310 07:22:38.842251 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/459d5515-68ea-44dd-b54a-af57cd3fe2b7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "459d5515-68ea-44dd-b54a-af57cd3fe2b7" (UID: "459d5515-68ea-44dd-b54a-af57cd3fe2b7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:22:38 crc kubenswrapper[4825]: I0310 07:22:38.911979 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7z5x5\" (UniqueName: \"kubernetes.io/projected/459d5515-68ea-44dd-b54a-af57cd3fe2b7-kube-api-access-7z5x5\") on node \"crc\" DevicePath \"\"" Mar 10 07:22:38 crc kubenswrapper[4825]: I0310 07:22:38.912022 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/459d5515-68ea-44dd-b54a-af57cd3fe2b7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 07:22:38 crc kubenswrapper[4825]: I0310 07:22:38.912036 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/459d5515-68ea-44dd-b54a-af57cd3fe2b7-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 07:22:39 crc kubenswrapper[4825]: I0310 07:22:39.014101 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hj9ws"] Mar 10 07:22:39 crc kubenswrapper[4825]: I0310 07:22:39.020369 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hj9ws"] Mar 10 07:22:39 crc kubenswrapper[4825]: I0310 07:22:39.253912 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="459d5515-68ea-44dd-b54a-af57cd3fe2b7" path="/var/lib/kubelet/pods/459d5515-68ea-44dd-b54a-af57cd3fe2b7/volumes" Mar 10 07:22:39 crc kubenswrapper[4825]: I0310 07:22:39.462784 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qt6bq"] Mar 10 07:22:39 crc kubenswrapper[4825]: E0310 07:22:39.463951 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="459d5515-68ea-44dd-b54a-af57cd3fe2b7" containerName="registry-server" Mar 10 07:22:39 crc kubenswrapper[4825]: I0310 07:22:39.463990 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="459d5515-68ea-44dd-b54a-af57cd3fe2b7" 
containerName="registry-server" Mar 10 07:22:39 crc kubenswrapper[4825]: E0310 07:22:39.464022 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="459d5515-68ea-44dd-b54a-af57cd3fe2b7" containerName="extract-content" Mar 10 07:22:39 crc kubenswrapper[4825]: I0310 07:22:39.464037 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="459d5515-68ea-44dd-b54a-af57cd3fe2b7" containerName="extract-content" Mar 10 07:22:39 crc kubenswrapper[4825]: E0310 07:22:39.464064 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="459d5515-68ea-44dd-b54a-af57cd3fe2b7" containerName="extract-utilities" Mar 10 07:22:39 crc kubenswrapper[4825]: I0310 07:22:39.464077 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="459d5515-68ea-44dd-b54a-af57cd3fe2b7" containerName="extract-utilities" Mar 10 07:22:39 crc kubenswrapper[4825]: I0310 07:22:39.464421 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="459d5515-68ea-44dd-b54a-af57cd3fe2b7" containerName="registry-server" Mar 10 07:22:39 crc kubenswrapper[4825]: I0310 07:22:39.466923 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qt6bq" Mar 10 07:22:39 crc kubenswrapper[4825]: I0310 07:22:39.492943 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qt6bq"] Mar 10 07:22:39 crc kubenswrapper[4825]: I0310 07:22:39.623224 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd1ae15a-290a-43a8-8f7b-3b9a26b1e239-utilities\") pod \"community-operators-qt6bq\" (UID: \"dd1ae15a-290a-43a8-8f7b-3b9a26b1e239\") " pod="openshift-marketplace/community-operators-qt6bq" Mar 10 07:22:39 crc kubenswrapper[4825]: I0310 07:22:39.623358 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd1ae15a-290a-43a8-8f7b-3b9a26b1e239-catalog-content\") pod \"community-operators-qt6bq\" (UID: \"dd1ae15a-290a-43a8-8f7b-3b9a26b1e239\") " pod="openshift-marketplace/community-operators-qt6bq" Mar 10 07:22:39 crc kubenswrapper[4825]: I0310 07:22:39.623404 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj47q\" (UniqueName: \"kubernetes.io/projected/dd1ae15a-290a-43a8-8f7b-3b9a26b1e239-kube-api-access-hj47q\") pod \"community-operators-qt6bq\" (UID: \"dd1ae15a-290a-43a8-8f7b-3b9a26b1e239\") " pod="openshift-marketplace/community-operators-qt6bq" Mar 10 07:22:39 crc kubenswrapper[4825]: I0310 07:22:39.725108 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd1ae15a-290a-43a8-8f7b-3b9a26b1e239-utilities\") pod \"community-operators-qt6bq\" (UID: \"dd1ae15a-290a-43a8-8f7b-3b9a26b1e239\") " pod="openshift-marketplace/community-operators-qt6bq" Mar 10 07:22:39 crc kubenswrapper[4825]: I0310 07:22:39.725320 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd1ae15a-290a-43a8-8f7b-3b9a26b1e239-catalog-content\") pod \"community-operators-qt6bq\" (UID: \"dd1ae15a-290a-43a8-8f7b-3b9a26b1e239\") " pod="openshift-marketplace/community-operators-qt6bq" Mar 10 07:22:39 crc kubenswrapper[4825]: I0310 07:22:39.725377 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hj47q\" (UniqueName: \"kubernetes.io/projected/dd1ae15a-290a-43a8-8f7b-3b9a26b1e239-kube-api-access-hj47q\") pod \"community-operators-qt6bq\" (UID: \"dd1ae15a-290a-43a8-8f7b-3b9a26b1e239\") " pod="openshift-marketplace/community-operators-qt6bq" Mar 10 07:22:39 crc kubenswrapper[4825]: I0310 07:22:39.725745 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd1ae15a-290a-43a8-8f7b-3b9a26b1e239-utilities\") pod \"community-operators-qt6bq\" (UID: \"dd1ae15a-290a-43a8-8f7b-3b9a26b1e239\") " pod="openshift-marketplace/community-operators-qt6bq" Mar 10 07:22:39 crc kubenswrapper[4825]: I0310 07:22:39.725923 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd1ae15a-290a-43a8-8f7b-3b9a26b1e239-catalog-content\") pod \"community-operators-qt6bq\" (UID: \"dd1ae15a-290a-43a8-8f7b-3b9a26b1e239\") " pod="openshift-marketplace/community-operators-qt6bq" Mar 10 07:22:39 crc kubenswrapper[4825]: I0310 07:22:39.752406 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj47q\" (UniqueName: \"kubernetes.io/projected/dd1ae15a-290a-43a8-8f7b-3b9a26b1e239-kube-api-access-hj47q\") pod \"community-operators-qt6bq\" (UID: \"dd1ae15a-290a-43a8-8f7b-3b9a26b1e239\") " pod="openshift-marketplace/community-operators-qt6bq" Mar 10 07:22:39 crc kubenswrapper[4825]: I0310 07:22:39.823568 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qt6bq" Mar 10 07:22:40 crc kubenswrapper[4825]: I0310 07:22:40.109425 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qt6bq"] Mar 10 07:22:40 crc kubenswrapper[4825]: I0310 07:22:40.687510 4825 generic.go:334] "Generic (PLEG): container finished" podID="dd1ae15a-290a-43a8-8f7b-3b9a26b1e239" containerID="fc4399b0a374142f5c7d60538b56fdebbc1b36cf4dc92abae8a12649ad051b89" exitCode=0 Mar 10 07:22:40 crc kubenswrapper[4825]: I0310 07:22:40.687585 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qt6bq" event={"ID":"dd1ae15a-290a-43a8-8f7b-3b9a26b1e239","Type":"ContainerDied","Data":"fc4399b0a374142f5c7d60538b56fdebbc1b36cf4dc92abae8a12649ad051b89"} Mar 10 07:22:40 crc kubenswrapper[4825]: I0310 07:22:40.687633 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qt6bq" event={"ID":"dd1ae15a-290a-43a8-8f7b-3b9a26b1e239","Type":"ContainerStarted","Data":"c91724464baf93c3f857168e83eba3627331e52edf659a70eef32e898cea9081"} Mar 10 07:22:41 crc kubenswrapper[4825]: I0310 07:22:41.699596 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qt6bq" event={"ID":"dd1ae15a-290a-43a8-8f7b-3b9a26b1e239","Type":"ContainerStarted","Data":"b977a8bd0a0b2cb5041b856edc24f8362488454a8ea9ff766cceedc21f6bc11e"} Mar 10 07:22:42 crc kubenswrapper[4825]: I0310 07:22:42.714829 4825 generic.go:334] "Generic (PLEG): container finished" podID="dd1ae15a-290a-43a8-8f7b-3b9a26b1e239" containerID="b977a8bd0a0b2cb5041b856edc24f8362488454a8ea9ff766cceedc21f6bc11e" exitCode=0 Mar 10 07:22:42 crc kubenswrapper[4825]: I0310 07:22:42.714891 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qt6bq" 
event={"ID":"dd1ae15a-290a-43a8-8f7b-3b9a26b1e239","Type":"ContainerDied","Data":"b977a8bd0a0b2cb5041b856edc24f8362488454a8ea9ff766cceedc21f6bc11e"} Mar 10 07:22:43 crc kubenswrapper[4825]: I0310 07:22:43.724690 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qt6bq" event={"ID":"dd1ae15a-290a-43a8-8f7b-3b9a26b1e239","Type":"ContainerStarted","Data":"fd79cb6f2a93849ca1b0c5cff3e94b75aa7ce58cde4d85b1ebb62764582f78f4"} Mar 10 07:22:43 crc kubenswrapper[4825]: I0310 07:22:43.745355 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qt6bq" podStartSLOduration=2.305874473 podStartE2EDuration="4.745331875s" podCreationTimestamp="2026-03-10 07:22:39 +0000 UTC" firstStartedPulling="2026-03-10 07:22:40.692553116 +0000 UTC m=+2313.722333751" lastFinishedPulling="2026-03-10 07:22:43.132010538 +0000 UTC m=+2316.161791153" observedRunningTime="2026-03-10 07:22:43.743732723 +0000 UTC m=+2316.773513368" watchObservedRunningTime="2026-03-10 07:22:43.745331875 +0000 UTC m=+2316.775112490" Mar 10 07:22:46 crc kubenswrapper[4825]: I0310 07:22:46.456904 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dknf6"] Mar 10 07:22:46 crc kubenswrapper[4825]: I0310 07:22:46.459819 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dknf6" Mar 10 07:22:46 crc kubenswrapper[4825]: I0310 07:22:46.485007 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dknf6"] Mar 10 07:22:46 crc kubenswrapper[4825]: I0310 07:22:46.629808 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/364a0b38-6669-4f38-b0ee-744bbe14212e-catalog-content\") pod \"certified-operators-dknf6\" (UID: \"364a0b38-6669-4f38-b0ee-744bbe14212e\") " pod="openshift-marketplace/certified-operators-dknf6" Mar 10 07:22:46 crc kubenswrapper[4825]: I0310 07:22:46.630378 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/364a0b38-6669-4f38-b0ee-744bbe14212e-utilities\") pod \"certified-operators-dknf6\" (UID: \"364a0b38-6669-4f38-b0ee-744bbe14212e\") " pod="openshift-marketplace/certified-operators-dknf6" Mar 10 07:22:46 crc kubenswrapper[4825]: I0310 07:22:46.630452 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz99m\" (UniqueName: \"kubernetes.io/projected/364a0b38-6669-4f38-b0ee-744bbe14212e-kube-api-access-dz99m\") pod \"certified-operators-dknf6\" (UID: \"364a0b38-6669-4f38-b0ee-744bbe14212e\") " pod="openshift-marketplace/certified-operators-dknf6" Mar 10 07:22:46 crc kubenswrapper[4825]: I0310 07:22:46.731842 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/364a0b38-6669-4f38-b0ee-744bbe14212e-utilities\") pod \"certified-operators-dknf6\" (UID: \"364a0b38-6669-4f38-b0ee-744bbe14212e\") " pod="openshift-marketplace/certified-operators-dknf6" Mar 10 07:22:46 crc kubenswrapper[4825]: I0310 07:22:46.731903 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dz99m\" (UniqueName: \"kubernetes.io/projected/364a0b38-6669-4f38-b0ee-744bbe14212e-kube-api-access-dz99m\") pod \"certified-operators-dknf6\" (UID: \"364a0b38-6669-4f38-b0ee-744bbe14212e\") " pod="openshift-marketplace/certified-operators-dknf6" Mar 10 07:22:46 crc kubenswrapper[4825]: I0310 07:22:46.731966 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/364a0b38-6669-4f38-b0ee-744bbe14212e-catalog-content\") pod \"certified-operators-dknf6\" (UID: \"364a0b38-6669-4f38-b0ee-744bbe14212e\") " pod="openshift-marketplace/certified-operators-dknf6" Mar 10 07:22:46 crc kubenswrapper[4825]: I0310 07:22:46.732466 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/364a0b38-6669-4f38-b0ee-744bbe14212e-utilities\") pod \"certified-operators-dknf6\" (UID: \"364a0b38-6669-4f38-b0ee-744bbe14212e\") " pod="openshift-marketplace/certified-operators-dknf6" Mar 10 07:22:46 crc kubenswrapper[4825]: I0310 07:22:46.732495 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/364a0b38-6669-4f38-b0ee-744bbe14212e-catalog-content\") pod \"certified-operators-dknf6\" (UID: \"364a0b38-6669-4f38-b0ee-744bbe14212e\") " pod="openshift-marketplace/certified-operators-dknf6" Mar 10 07:22:46 crc kubenswrapper[4825]: I0310 07:22:46.763737 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz99m\" (UniqueName: \"kubernetes.io/projected/364a0b38-6669-4f38-b0ee-744bbe14212e-kube-api-access-dz99m\") pod \"certified-operators-dknf6\" (UID: \"364a0b38-6669-4f38-b0ee-744bbe14212e\") " pod="openshift-marketplace/certified-operators-dknf6" Mar 10 07:22:46 crc kubenswrapper[4825]: I0310 07:22:46.785356 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dknf6" Mar 10 07:22:47 crc kubenswrapper[4825]: I0310 07:22:47.231612 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dknf6"] Mar 10 07:22:47 crc kubenswrapper[4825]: W0310 07:22:47.251307 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod364a0b38_6669_4f38_b0ee_744bbe14212e.slice/crio-cabb49bb3739a1da9a2ccdfeeb990678707fbefbe5d79e01b2c21d0f93025cee WatchSource:0}: Error finding container cabb49bb3739a1da9a2ccdfeeb990678707fbefbe5d79e01b2c21d0f93025cee: Status 404 returned error can't find the container with id cabb49bb3739a1da9a2ccdfeeb990678707fbefbe5d79e01b2c21d0f93025cee Mar 10 07:22:47 crc kubenswrapper[4825]: I0310 07:22:47.757209 4825 generic.go:334] "Generic (PLEG): container finished" podID="364a0b38-6669-4f38-b0ee-744bbe14212e" containerID="957be6d6081112d98ee7afe62c6bc58d388848ce77dc338a1fb440402bfa031e" exitCode=0 Mar 10 07:22:47 crc kubenswrapper[4825]: I0310 07:22:47.757295 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dknf6" event={"ID":"364a0b38-6669-4f38-b0ee-744bbe14212e","Type":"ContainerDied","Data":"957be6d6081112d98ee7afe62c6bc58d388848ce77dc338a1fb440402bfa031e"} Mar 10 07:22:47 crc kubenswrapper[4825]: I0310 07:22:47.757360 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dknf6" event={"ID":"364a0b38-6669-4f38-b0ee-744bbe14212e","Type":"ContainerStarted","Data":"cabb49bb3739a1da9a2ccdfeeb990678707fbefbe5d79e01b2c21d0f93025cee"} Mar 10 07:22:48 crc kubenswrapper[4825]: I0310 07:22:48.768929 4825 generic.go:334] "Generic (PLEG): container finished" podID="364a0b38-6669-4f38-b0ee-744bbe14212e" containerID="e3cb4cf242715001453faba9512236747e7aeedd4a7852db2c59cdfcf41fd0ea" exitCode=0 Mar 10 07:22:48 crc kubenswrapper[4825]: I0310 
07:22:48.769045 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dknf6" event={"ID":"364a0b38-6669-4f38-b0ee-744bbe14212e","Type":"ContainerDied","Data":"e3cb4cf242715001453faba9512236747e7aeedd4a7852db2c59cdfcf41fd0ea"} Mar 10 07:22:49 crc kubenswrapper[4825]: I0310 07:22:49.780893 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dknf6" event={"ID":"364a0b38-6669-4f38-b0ee-744bbe14212e","Type":"ContainerStarted","Data":"b7bee0059e263afd47d112a53b9e9db7b86b6e17e3e485c260f4e1335cca4227"} Mar 10 07:22:49 crc kubenswrapper[4825]: I0310 07:22:49.803994 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dknf6" podStartSLOduration=2.294771858 podStartE2EDuration="3.803973835s" podCreationTimestamp="2026-03-10 07:22:46 +0000 UTC" firstStartedPulling="2026-03-10 07:22:47.760063182 +0000 UTC m=+2320.789843807" lastFinishedPulling="2026-03-10 07:22:49.269265159 +0000 UTC m=+2322.299045784" observedRunningTime="2026-03-10 07:22:49.798358107 +0000 UTC m=+2322.828138742" watchObservedRunningTime="2026-03-10 07:22:49.803973835 +0000 UTC m=+2322.833754440" Mar 10 07:22:49 crc kubenswrapper[4825]: I0310 07:22:49.824005 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qt6bq" Mar 10 07:22:49 crc kubenswrapper[4825]: I0310 07:22:49.824049 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qt6bq" Mar 10 07:22:49 crc kubenswrapper[4825]: I0310 07:22:49.891086 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qt6bq" Mar 10 07:22:50 crc kubenswrapper[4825]: I0310 07:22:50.852160 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qt6bq" Mar 
10 07:22:52 crc kubenswrapper[4825]: I0310 07:22:52.251832 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qt6bq"] Mar 10 07:22:52 crc kubenswrapper[4825]: I0310 07:22:52.805033 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qt6bq" podUID="dd1ae15a-290a-43a8-8f7b-3b9a26b1e239" containerName="registry-server" containerID="cri-o://fd79cb6f2a93849ca1b0c5cff3e94b75aa7ce58cde4d85b1ebb62764582f78f4" gracePeriod=2 Mar 10 07:22:53 crc kubenswrapper[4825]: I0310 07:22:53.327394 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qt6bq" Mar 10 07:22:53 crc kubenswrapper[4825]: I0310 07:22:53.440260 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd1ae15a-290a-43a8-8f7b-3b9a26b1e239-catalog-content\") pod \"dd1ae15a-290a-43a8-8f7b-3b9a26b1e239\" (UID: \"dd1ae15a-290a-43a8-8f7b-3b9a26b1e239\") " Mar 10 07:22:53 crc kubenswrapper[4825]: I0310 07:22:53.440326 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hj47q\" (UniqueName: \"kubernetes.io/projected/dd1ae15a-290a-43a8-8f7b-3b9a26b1e239-kube-api-access-hj47q\") pod \"dd1ae15a-290a-43a8-8f7b-3b9a26b1e239\" (UID: \"dd1ae15a-290a-43a8-8f7b-3b9a26b1e239\") " Mar 10 07:22:53 crc kubenswrapper[4825]: I0310 07:22:53.440494 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd1ae15a-290a-43a8-8f7b-3b9a26b1e239-utilities\") pod \"dd1ae15a-290a-43a8-8f7b-3b9a26b1e239\" (UID: \"dd1ae15a-290a-43a8-8f7b-3b9a26b1e239\") " Mar 10 07:22:53 crc kubenswrapper[4825]: I0310 07:22:53.441601 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/dd1ae15a-290a-43a8-8f7b-3b9a26b1e239-utilities" (OuterVolumeSpecName: "utilities") pod "dd1ae15a-290a-43a8-8f7b-3b9a26b1e239" (UID: "dd1ae15a-290a-43a8-8f7b-3b9a26b1e239"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:22:53 crc kubenswrapper[4825]: I0310 07:22:53.446354 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd1ae15a-290a-43a8-8f7b-3b9a26b1e239-kube-api-access-hj47q" (OuterVolumeSpecName: "kube-api-access-hj47q") pod "dd1ae15a-290a-43a8-8f7b-3b9a26b1e239" (UID: "dd1ae15a-290a-43a8-8f7b-3b9a26b1e239"). InnerVolumeSpecName "kube-api-access-hj47q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:22:53 crc kubenswrapper[4825]: I0310 07:22:53.506955 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd1ae15a-290a-43a8-8f7b-3b9a26b1e239-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dd1ae15a-290a-43a8-8f7b-3b9a26b1e239" (UID: "dd1ae15a-290a-43a8-8f7b-3b9a26b1e239"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:22:53 crc kubenswrapper[4825]: I0310 07:22:53.542595 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd1ae15a-290a-43a8-8f7b-3b9a26b1e239-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 07:22:53 crc kubenswrapper[4825]: I0310 07:22:53.542632 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd1ae15a-290a-43a8-8f7b-3b9a26b1e239-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 07:22:53 crc kubenswrapper[4825]: I0310 07:22:53.542643 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hj47q\" (UniqueName: \"kubernetes.io/projected/dd1ae15a-290a-43a8-8f7b-3b9a26b1e239-kube-api-access-hj47q\") on node \"crc\" DevicePath \"\"" Mar 10 07:22:53 crc kubenswrapper[4825]: I0310 07:22:53.819549 4825 generic.go:334] "Generic (PLEG): container finished" podID="dd1ae15a-290a-43a8-8f7b-3b9a26b1e239" containerID="fd79cb6f2a93849ca1b0c5cff3e94b75aa7ce58cde4d85b1ebb62764582f78f4" exitCode=0 Mar 10 07:22:53 crc kubenswrapper[4825]: I0310 07:22:53.819617 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qt6bq" event={"ID":"dd1ae15a-290a-43a8-8f7b-3b9a26b1e239","Type":"ContainerDied","Data":"fd79cb6f2a93849ca1b0c5cff3e94b75aa7ce58cde4d85b1ebb62764582f78f4"} Mar 10 07:22:53 crc kubenswrapper[4825]: I0310 07:22:53.819656 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qt6bq" Mar 10 07:22:53 crc kubenswrapper[4825]: I0310 07:22:53.819697 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qt6bq" event={"ID":"dd1ae15a-290a-43a8-8f7b-3b9a26b1e239","Type":"ContainerDied","Data":"c91724464baf93c3f857168e83eba3627331e52edf659a70eef32e898cea9081"} Mar 10 07:22:53 crc kubenswrapper[4825]: I0310 07:22:53.819721 4825 scope.go:117] "RemoveContainer" containerID="fd79cb6f2a93849ca1b0c5cff3e94b75aa7ce58cde4d85b1ebb62764582f78f4" Mar 10 07:22:53 crc kubenswrapper[4825]: I0310 07:22:53.847818 4825 scope.go:117] "RemoveContainer" containerID="b977a8bd0a0b2cb5041b856edc24f8362488454a8ea9ff766cceedc21f6bc11e" Mar 10 07:22:53 crc kubenswrapper[4825]: I0310 07:22:53.877121 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qt6bq"] Mar 10 07:22:53 crc kubenswrapper[4825]: I0310 07:22:53.892470 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qt6bq"] Mar 10 07:22:53 crc kubenswrapper[4825]: I0310 07:22:53.894327 4825 scope.go:117] "RemoveContainer" containerID="fc4399b0a374142f5c7d60538b56fdebbc1b36cf4dc92abae8a12649ad051b89" Mar 10 07:22:53 crc kubenswrapper[4825]: I0310 07:22:53.930256 4825 scope.go:117] "RemoveContainer" containerID="fd79cb6f2a93849ca1b0c5cff3e94b75aa7ce58cde4d85b1ebb62764582f78f4" Mar 10 07:22:53 crc kubenswrapper[4825]: E0310 07:22:53.930936 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd79cb6f2a93849ca1b0c5cff3e94b75aa7ce58cde4d85b1ebb62764582f78f4\": container with ID starting with fd79cb6f2a93849ca1b0c5cff3e94b75aa7ce58cde4d85b1ebb62764582f78f4 not found: ID does not exist" containerID="fd79cb6f2a93849ca1b0c5cff3e94b75aa7ce58cde4d85b1ebb62764582f78f4" Mar 10 07:22:53 crc kubenswrapper[4825]: I0310 07:22:53.930983 4825 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd79cb6f2a93849ca1b0c5cff3e94b75aa7ce58cde4d85b1ebb62764582f78f4"} err="failed to get container status \"fd79cb6f2a93849ca1b0c5cff3e94b75aa7ce58cde4d85b1ebb62764582f78f4\": rpc error: code = NotFound desc = could not find container \"fd79cb6f2a93849ca1b0c5cff3e94b75aa7ce58cde4d85b1ebb62764582f78f4\": container with ID starting with fd79cb6f2a93849ca1b0c5cff3e94b75aa7ce58cde4d85b1ebb62764582f78f4 not found: ID does not exist" Mar 10 07:22:53 crc kubenswrapper[4825]: I0310 07:22:53.931011 4825 scope.go:117] "RemoveContainer" containerID="b977a8bd0a0b2cb5041b856edc24f8362488454a8ea9ff766cceedc21f6bc11e" Mar 10 07:22:53 crc kubenswrapper[4825]: E0310 07:22:53.931484 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b977a8bd0a0b2cb5041b856edc24f8362488454a8ea9ff766cceedc21f6bc11e\": container with ID starting with b977a8bd0a0b2cb5041b856edc24f8362488454a8ea9ff766cceedc21f6bc11e not found: ID does not exist" containerID="b977a8bd0a0b2cb5041b856edc24f8362488454a8ea9ff766cceedc21f6bc11e" Mar 10 07:22:53 crc kubenswrapper[4825]: I0310 07:22:53.931530 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b977a8bd0a0b2cb5041b856edc24f8362488454a8ea9ff766cceedc21f6bc11e"} err="failed to get container status \"b977a8bd0a0b2cb5041b856edc24f8362488454a8ea9ff766cceedc21f6bc11e\": rpc error: code = NotFound desc = could not find container \"b977a8bd0a0b2cb5041b856edc24f8362488454a8ea9ff766cceedc21f6bc11e\": container with ID starting with b977a8bd0a0b2cb5041b856edc24f8362488454a8ea9ff766cceedc21f6bc11e not found: ID does not exist" Mar 10 07:22:53 crc kubenswrapper[4825]: I0310 07:22:53.931551 4825 scope.go:117] "RemoveContainer" containerID="fc4399b0a374142f5c7d60538b56fdebbc1b36cf4dc92abae8a12649ad051b89" Mar 10 07:22:53 crc kubenswrapper[4825]: E0310 
07:22:53.932004 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc4399b0a374142f5c7d60538b56fdebbc1b36cf4dc92abae8a12649ad051b89\": container with ID starting with fc4399b0a374142f5c7d60538b56fdebbc1b36cf4dc92abae8a12649ad051b89 not found: ID does not exist" containerID="fc4399b0a374142f5c7d60538b56fdebbc1b36cf4dc92abae8a12649ad051b89" Mar 10 07:22:53 crc kubenswrapper[4825]: I0310 07:22:53.932043 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc4399b0a374142f5c7d60538b56fdebbc1b36cf4dc92abae8a12649ad051b89"} err="failed to get container status \"fc4399b0a374142f5c7d60538b56fdebbc1b36cf4dc92abae8a12649ad051b89\": rpc error: code = NotFound desc = could not find container \"fc4399b0a374142f5c7d60538b56fdebbc1b36cf4dc92abae8a12649ad051b89\": container with ID starting with fc4399b0a374142f5c7d60538b56fdebbc1b36cf4dc92abae8a12649ad051b89 not found: ID does not exist" Mar 10 07:22:55 crc kubenswrapper[4825]: I0310 07:22:55.250767 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd1ae15a-290a-43a8-8f7b-3b9a26b1e239" path="/var/lib/kubelet/pods/dd1ae15a-290a-43a8-8f7b-3b9a26b1e239/volumes" Mar 10 07:22:56 crc kubenswrapper[4825]: I0310 07:22:56.786418 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dknf6" Mar 10 07:22:56 crc kubenswrapper[4825]: I0310 07:22:56.786696 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dknf6" Mar 10 07:22:56 crc kubenswrapper[4825]: I0310 07:22:56.855887 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dknf6" Mar 10 07:22:56 crc kubenswrapper[4825]: I0310 07:22:56.951320 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-dknf6" Mar 10 07:22:57 crc kubenswrapper[4825]: I0310 07:22:57.850928 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dknf6"] Mar 10 07:22:58 crc kubenswrapper[4825]: I0310 07:22:58.885681 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dknf6" podUID="364a0b38-6669-4f38-b0ee-744bbe14212e" containerName="registry-server" containerID="cri-o://b7bee0059e263afd47d112a53b9e9db7b86b6e17e3e485c260f4e1335cca4227" gracePeriod=2 Mar 10 07:22:59 crc kubenswrapper[4825]: I0310 07:22:59.796299 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dknf6" Mar 10 07:22:59 crc kubenswrapper[4825]: I0310 07:22:59.896748 4825 generic.go:334] "Generic (PLEG): container finished" podID="364a0b38-6669-4f38-b0ee-744bbe14212e" containerID="b7bee0059e263afd47d112a53b9e9db7b86b6e17e3e485c260f4e1335cca4227" exitCode=0 Mar 10 07:22:59 crc kubenswrapper[4825]: I0310 07:22:59.896795 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dknf6" event={"ID":"364a0b38-6669-4f38-b0ee-744bbe14212e","Type":"ContainerDied","Data":"b7bee0059e263afd47d112a53b9e9db7b86b6e17e3e485c260f4e1335cca4227"} Mar 10 07:22:59 crc kubenswrapper[4825]: I0310 07:22:59.896822 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dknf6" Mar 10 07:22:59 crc kubenswrapper[4825]: I0310 07:22:59.896851 4825 scope.go:117] "RemoveContainer" containerID="b7bee0059e263afd47d112a53b9e9db7b86b6e17e3e485c260f4e1335cca4227" Mar 10 07:22:59 crc kubenswrapper[4825]: I0310 07:22:59.896836 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dknf6" event={"ID":"364a0b38-6669-4f38-b0ee-744bbe14212e","Type":"ContainerDied","Data":"cabb49bb3739a1da9a2ccdfeeb990678707fbefbe5d79e01b2c21d0f93025cee"} Mar 10 07:22:59 crc kubenswrapper[4825]: I0310 07:22:59.917325 4825 scope.go:117] "RemoveContainer" containerID="e3cb4cf242715001453faba9512236747e7aeedd4a7852db2c59cdfcf41fd0ea" Mar 10 07:22:59 crc kubenswrapper[4825]: I0310 07:22:59.937990 4825 scope.go:117] "RemoveContainer" containerID="957be6d6081112d98ee7afe62c6bc58d388848ce77dc338a1fb440402bfa031e" Mar 10 07:22:59 crc kubenswrapper[4825]: I0310 07:22:59.947168 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/364a0b38-6669-4f38-b0ee-744bbe14212e-utilities\") pod \"364a0b38-6669-4f38-b0ee-744bbe14212e\" (UID: \"364a0b38-6669-4f38-b0ee-744bbe14212e\") " Mar 10 07:22:59 crc kubenswrapper[4825]: I0310 07:22:59.947357 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dz99m\" (UniqueName: \"kubernetes.io/projected/364a0b38-6669-4f38-b0ee-744bbe14212e-kube-api-access-dz99m\") pod \"364a0b38-6669-4f38-b0ee-744bbe14212e\" (UID: \"364a0b38-6669-4f38-b0ee-744bbe14212e\") " Mar 10 07:22:59 crc kubenswrapper[4825]: I0310 07:22:59.948225 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/364a0b38-6669-4f38-b0ee-744bbe14212e-utilities" (OuterVolumeSpecName: "utilities") pod "364a0b38-6669-4f38-b0ee-744bbe14212e" (UID: "364a0b38-6669-4f38-b0ee-744bbe14212e"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:22:59 crc kubenswrapper[4825]: I0310 07:22:59.948849 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/364a0b38-6669-4f38-b0ee-744bbe14212e-catalog-content\") pod \"364a0b38-6669-4f38-b0ee-744bbe14212e\" (UID: \"364a0b38-6669-4f38-b0ee-744bbe14212e\") " Mar 10 07:22:59 crc kubenswrapper[4825]: I0310 07:22:59.949528 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/364a0b38-6669-4f38-b0ee-744bbe14212e-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 07:22:59 crc kubenswrapper[4825]: I0310 07:22:59.953254 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/364a0b38-6669-4f38-b0ee-744bbe14212e-kube-api-access-dz99m" (OuterVolumeSpecName: "kube-api-access-dz99m") pod "364a0b38-6669-4f38-b0ee-744bbe14212e" (UID: "364a0b38-6669-4f38-b0ee-744bbe14212e"). InnerVolumeSpecName "kube-api-access-dz99m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:22:59 crc kubenswrapper[4825]: I0310 07:22:59.963701 4825 scope.go:117] "RemoveContainer" containerID="b7bee0059e263afd47d112a53b9e9db7b86b6e17e3e485c260f4e1335cca4227" Mar 10 07:22:59 crc kubenswrapper[4825]: E0310 07:22:59.964086 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7bee0059e263afd47d112a53b9e9db7b86b6e17e3e485c260f4e1335cca4227\": container with ID starting with b7bee0059e263afd47d112a53b9e9db7b86b6e17e3e485c260f4e1335cca4227 not found: ID does not exist" containerID="b7bee0059e263afd47d112a53b9e9db7b86b6e17e3e485c260f4e1335cca4227" Mar 10 07:22:59 crc kubenswrapper[4825]: I0310 07:22:59.964156 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7bee0059e263afd47d112a53b9e9db7b86b6e17e3e485c260f4e1335cca4227"} err="failed to get container status \"b7bee0059e263afd47d112a53b9e9db7b86b6e17e3e485c260f4e1335cca4227\": rpc error: code = NotFound desc = could not find container \"b7bee0059e263afd47d112a53b9e9db7b86b6e17e3e485c260f4e1335cca4227\": container with ID starting with b7bee0059e263afd47d112a53b9e9db7b86b6e17e3e485c260f4e1335cca4227 not found: ID does not exist" Mar 10 07:22:59 crc kubenswrapper[4825]: I0310 07:22:59.964189 4825 scope.go:117] "RemoveContainer" containerID="e3cb4cf242715001453faba9512236747e7aeedd4a7852db2c59cdfcf41fd0ea" Mar 10 07:22:59 crc kubenswrapper[4825]: E0310 07:22:59.964450 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3cb4cf242715001453faba9512236747e7aeedd4a7852db2c59cdfcf41fd0ea\": container with ID starting with e3cb4cf242715001453faba9512236747e7aeedd4a7852db2c59cdfcf41fd0ea not found: ID does not exist" containerID="e3cb4cf242715001453faba9512236747e7aeedd4a7852db2c59cdfcf41fd0ea" Mar 10 07:22:59 crc kubenswrapper[4825]: I0310 07:22:59.964475 
4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3cb4cf242715001453faba9512236747e7aeedd4a7852db2c59cdfcf41fd0ea"} err="failed to get container status \"e3cb4cf242715001453faba9512236747e7aeedd4a7852db2c59cdfcf41fd0ea\": rpc error: code = NotFound desc = could not find container \"e3cb4cf242715001453faba9512236747e7aeedd4a7852db2c59cdfcf41fd0ea\": container with ID starting with e3cb4cf242715001453faba9512236747e7aeedd4a7852db2c59cdfcf41fd0ea not found: ID does not exist" Mar 10 07:22:59 crc kubenswrapper[4825]: I0310 07:22:59.964487 4825 scope.go:117] "RemoveContainer" containerID="957be6d6081112d98ee7afe62c6bc58d388848ce77dc338a1fb440402bfa031e" Mar 10 07:22:59 crc kubenswrapper[4825]: E0310 07:22:59.964699 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"957be6d6081112d98ee7afe62c6bc58d388848ce77dc338a1fb440402bfa031e\": container with ID starting with 957be6d6081112d98ee7afe62c6bc58d388848ce77dc338a1fb440402bfa031e not found: ID does not exist" containerID="957be6d6081112d98ee7afe62c6bc58d388848ce77dc338a1fb440402bfa031e" Mar 10 07:22:59 crc kubenswrapper[4825]: I0310 07:22:59.964713 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"957be6d6081112d98ee7afe62c6bc58d388848ce77dc338a1fb440402bfa031e"} err="failed to get container status \"957be6d6081112d98ee7afe62c6bc58d388848ce77dc338a1fb440402bfa031e\": rpc error: code = NotFound desc = could not find container \"957be6d6081112d98ee7afe62c6bc58d388848ce77dc338a1fb440402bfa031e\": container with ID starting with 957be6d6081112d98ee7afe62c6bc58d388848ce77dc338a1fb440402bfa031e not found: ID does not exist" Mar 10 07:23:00 crc kubenswrapper[4825]: I0310 07:23:00.031173 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/364a0b38-6669-4f38-b0ee-744bbe14212e-catalog-content" 
(OuterVolumeSpecName: "catalog-content") pod "364a0b38-6669-4f38-b0ee-744bbe14212e" (UID: "364a0b38-6669-4f38-b0ee-744bbe14212e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:23:00 crc kubenswrapper[4825]: I0310 07:23:00.051595 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dz99m\" (UniqueName: \"kubernetes.io/projected/364a0b38-6669-4f38-b0ee-744bbe14212e-kube-api-access-dz99m\") on node \"crc\" DevicePath \"\"" Mar 10 07:23:00 crc kubenswrapper[4825]: I0310 07:23:00.051649 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/364a0b38-6669-4f38-b0ee-744bbe14212e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 07:23:00 crc kubenswrapper[4825]: I0310 07:23:00.245575 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dknf6"] Mar 10 07:23:00 crc kubenswrapper[4825]: I0310 07:23:00.256351 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dknf6"] Mar 10 07:23:01 crc kubenswrapper[4825]: I0310 07:23:01.253782 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="364a0b38-6669-4f38-b0ee-744bbe14212e" path="/var/lib/kubelet/pods/364a0b38-6669-4f38-b0ee-744bbe14212e/volumes" Mar 10 07:24:00 crc kubenswrapper[4825]: I0310 07:24:00.166259 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552124-tffr7"] Mar 10 07:24:00 crc kubenswrapper[4825]: E0310 07:24:00.167415 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd1ae15a-290a-43a8-8f7b-3b9a26b1e239" containerName="extract-content" Mar 10 07:24:00 crc kubenswrapper[4825]: I0310 07:24:00.167439 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd1ae15a-290a-43a8-8f7b-3b9a26b1e239" containerName="extract-content" Mar 10 07:24:00 crc kubenswrapper[4825]: E0310 
07:24:00.167459 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd1ae15a-290a-43a8-8f7b-3b9a26b1e239" containerName="extract-utilities" Mar 10 07:24:00 crc kubenswrapper[4825]: I0310 07:24:00.167471 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd1ae15a-290a-43a8-8f7b-3b9a26b1e239" containerName="extract-utilities" Mar 10 07:24:00 crc kubenswrapper[4825]: E0310 07:24:00.167498 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="364a0b38-6669-4f38-b0ee-744bbe14212e" containerName="extract-utilities" Mar 10 07:24:00 crc kubenswrapper[4825]: I0310 07:24:00.167510 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="364a0b38-6669-4f38-b0ee-744bbe14212e" containerName="extract-utilities" Mar 10 07:24:00 crc kubenswrapper[4825]: E0310 07:24:00.167534 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="364a0b38-6669-4f38-b0ee-744bbe14212e" containerName="extract-content" Mar 10 07:24:00 crc kubenswrapper[4825]: I0310 07:24:00.167547 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="364a0b38-6669-4f38-b0ee-744bbe14212e" containerName="extract-content" Mar 10 07:24:00 crc kubenswrapper[4825]: E0310 07:24:00.167566 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd1ae15a-290a-43a8-8f7b-3b9a26b1e239" containerName="registry-server" Mar 10 07:24:00 crc kubenswrapper[4825]: I0310 07:24:00.167578 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd1ae15a-290a-43a8-8f7b-3b9a26b1e239" containerName="registry-server" Mar 10 07:24:00 crc kubenswrapper[4825]: E0310 07:24:00.167614 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="364a0b38-6669-4f38-b0ee-744bbe14212e" containerName="registry-server" Mar 10 07:24:00 crc kubenswrapper[4825]: I0310 07:24:00.167626 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="364a0b38-6669-4f38-b0ee-744bbe14212e" containerName="registry-server" Mar 10 07:24:00 crc kubenswrapper[4825]: I0310 
07:24:00.167851 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="364a0b38-6669-4f38-b0ee-744bbe14212e" containerName="registry-server" Mar 10 07:24:00 crc kubenswrapper[4825]: I0310 07:24:00.167888 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd1ae15a-290a-43a8-8f7b-3b9a26b1e239" containerName="registry-server" Mar 10 07:24:00 crc kubenswrapper[4825]: I0310 07:24:00.168602 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552124-tffr7" Mar 10 07:24:00 crc kubenswrapper[4825]: I0310 07:24:00.172876 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 07:24:00 crc kubenswrapper[4825]: I0310 07:24:00.172877 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 07:24:00 crc kubenswrapper[4825]: I0310 07:24:00.173057 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 07:24:00 crc kubenswrapper[4825]: I0310 07:24:00.175212 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdsr9\" (UniqueName: \"kubernetes.io/projected/ade7b324-7179-4534-94bd-03819ae4ade3-kube-api-access-jdsr9\") pod \"auto-csr-approver-29552124-tffr7\" (UID: \"ade7b324-7179-4534-94bd-03819ae4ade3\") " pod="openshift-infra/auto-csr-approver-29552124-tffr7" Mar 10 07:24:00 crc kubenswrapper[4825]: I0310 07:24:00.184696 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552124-tffr7"] Mar 10 07:24:00 crc kubenswrapper[4825]: I0310 07:24:00.276125 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdsr9\" (UniqueName: \"kubernetes.io/projected/ade7b324-7179-4534-94bd-03819ae4ade3-kube-api-access-jdsr9\") pod 
\"auto-csr-approver-29552124-tffr7\" (UID: \"ade7b324-7179-4534-94bd-03819ae4ade3\") " pod="openshift-infra/auto-csr-approver-29552124-tffr7" Mar 10 07:24:00 crc kubenswrapper[4825]: I0310 07:24:00.296057 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdsr9\" (UniqueName: \"kubernetes.io/projected/ade7b324-7179-4534-94bd-03819ae4ade3-kube-api-access-jdsr9\") pod \"auto-csr-approver-29552124-tffr7\" (UID: \"ade7b324-7179-4534-94bd-03819ae4ade3\") " pod="openshift-infra/auto-csr-approver-29552124-tffr7" Mar 10 07:24:00 crc kubenswrapper[4825]: I0310 07:24:00.505638 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552124-tffr7" Mar 10 07:24:00 crc kubenswrapper[4825]: I0310 07:24:00.751532 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552124-tffr7"] Mar 10 07:24:01 crc kubenswrapper[4825]: I0310 07:24:01.468423 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552124-tffr7" event={"ID":"ade7b324-7179-4534-94bd-03819ae4ade3","Type":"ContainerStarted","Data":"0053b727238e2816a8214afc45843d5980438e499915e841d7aad7ecc9174c3c"} Mar 10 07:24:02 crc kubenswrapper[4825]: I0310 07:24:02.479456 4825 generic.go:334] "Generic (PLEG): container finished" podID="ade7b324-7179-4534-94bd-03819ae4ade3" containerID="67ce4aae6072bca2381fc2bb1155837742dc2400d17b065d8bf7b6b17de6403a" exitCode=0 Mar 10 07:24:02 crc kubenswrapper[4825]: I0310 07:24:02.479502 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552124-tffr7" event={"ID":"ade7b324-7179-4534-94bd-03819ae4ade3","Type":"ContainerDied","Data":"67ce4aae6072bca2381fc2bb1155837742dc2400d17b065d8bf7b6b17de6403a"} Mar 10 07:24:03 crc kubenswrapper[4825]: I0310 07:24:03.752793 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552124-tffr7" Mar 10 07:24:03 crc kubenswrapper[4825]: I0310 07:24:03.932263 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdsr9\" (UniqueName: \"kubernetes.io/projected/ade7b324-7179-4534-94bd-03819ae4ade3-kube-api-access-jdsr9\") pod \"ade7b324-7179-4534-94bd-03819ae4ade3\" (UID: \"ade7b324-7179-4534-94bd-03819ae4ade3\") " Mar 10 07:24:03 crc kubenswrapper[4825]: I0310 07:24:03.937489 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ade7b324-7179-4534-94bd-03819ae4ade3-kube-api-access-jdsr9" (OuterVolumeSpecName: "kube-api-access-jdsr9") pod "ade7b324-7179-4534-94bd-03819ae4ade3" (UID: "ade7b324-7179-4534-94bd-03819ae4ade3"). InnerVolumeSpecName "kube-api-access-jdsr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:24:04 crc kubenswrapper[4825]: I0310 07:24:04.033676 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdsr9\" (UniqueName: \"kubernetes.io/projected/ade7b324-7179-4534-94bd-03819ae4ade3-kube-api-access-jdsr9\") on node \"crc\" DevicePath \"\"" Mar 10 07:24:04 crc kubenswrapper[4825]: I0310 07:24:04.501619 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552124-tffr7" event={"ID":"ade7b324-7179-4534-94bd-03819ae4ade3","Type":"ContainerDied","Data":"0053b727238e2816a8214afc45843d5980438e499915e841d7aad7ecc9174c3c"} Mar 10 07:24:04 crc kubenswrapper[4825]: I0310 07:24:04.501720 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0053b727238e2816a8214afc45843d5980438e499915e841d7aad7ecc9174c3c" Mar 10 07:24:04 crc kubenswrapper[4825]: I0310 07:24:04.501716 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552124-tffr7" Mar 10 07:24:04 crc kubenswrapper[4825]: I0310 07:24:04.860334 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552118-lvgdb"] Mar 10 07:24:04 crc kubenswrapper[4825]: I0310 07:24:04.867637 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552118-lvgdb"] Mar 10 07:24:05 crc kubenswrapper[4825]: I0310 07:24:05.253966 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48b65f42-3de8-4e55-a302-7475deec8981" path="/var/lib/kubelet/pods/48b65f42-3de8-4e55-a302-7475deec8981/volumes" Mar 10 07:24:37 crc kubenswrapper[4825]: I0310 07:24:37.900529 4825 scope.go:117] "RemoveContainer" containerID="a137bbd6c70d8d4ece1dd6e26b2a8e147fdde2d0f1873d42630561b75223678a" Mar 10 07:24:46 crc kubenswrapper[4825]: I0310 07:24:46.889659 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 07:24:46 crc kubenswrapper[4825]: I0310 07:24:46.890277 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 07:25:16 crc kubenswrapper[4825]: I0310 07:25:16.888251 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 07:25:16 crc kubenswrapper[4825]: 
I0310 07:25:16.888814 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 07:25:46 crc kubenswrapper[4825]: I0310 07:25:46.888513 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 07:25:46 crc kubenswrapper[4825]: I0310 07:25:46.889341 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 07:25:46 crc kubenswrapper[4825]: I0310 07:25:46.889418 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" Mar 10 07:25:46 crc kubenswrapper[4825]: I0310 07:25:46.890518 4825 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d1fa67187f8ea0c80a9c36034a9ddfdf1c0431ab2c7d3c4059f27045f96a6b45"} pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 07:25:46 crc kubenswrapper[4825]: I0310 07:25:46.890644 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" 
containerName="machine-config-daemon" containerID="cri-o://d1fa67187f8ea0c80a9c36034a9ddfdf1c0431ab2c7d3c4059f27045f96a6b45" gracePeriod=600 Mar 10 07:25:47 crc kubenswrapper[4825]: E0310 07:25:47.037807 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:25:47 crc kubenswrapper[4825]: I0310 07:25:47.532486 4825 generic.go:334] "Generic (PLEG): container finished" podID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerID="d1fa67187f8ea0c80a9c36034a9ddfdf1c0431ab2c7d3c4059f27045f96a6b45" exitCode=0 Mar 10 07:25:47 crc kubenswrapper[4825]: I0310 07:25:47.532584 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerDied","Data":"d1fa67187f8ea0c80a9c36034a9ddfdf1c0431ab2c7d3c4059f27045f96a6b45"} Mar 10 07:25:47 crc kubenswrapper[4825]: I0310 07:25:47.533072 4825 scope.go:117] "RemoveContainer" containerID="dddfba3f7a2f8d3823d9a072175efb197634fd1df552eb76662bdf50e8519ec5" Mar 10 07:25:47 crc kubenswrapper[4825]: I0310 07:25:47.533812 4825 scope.go:117] "RemoveContainer" containerID="d1fa67187f8ea0c80a9c36034a9ddfdf1c0431ab2c7d3c4059f27045f96a6b45" Mar 10 07:25:47 crc kubenswrapper[4825]: E0310 07:25:47.534265 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:26:00 crc kubenswrapper[4825]: I0310 07:26:00.141347 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552126-prr59"] Mar 10 07:26:00 crc kubenswrapper[4825]: E0310 07:26:00.143545 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ade7b324-7179-4534-94bd-03819ae4ade3" containerName="oc" Mar 10 07:26:00 crc kubenswrapper[4825]: I0310 07:26:00.143566 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="ade7b324-7179-4534-94bd-03819ae4ade3" containerName="oc" Mar 10 07:26:00 crc kubenswrapper[4825]: I0310 07:26:00.143771 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="ade7b324-7179-4534-94bd-03819ae4ade3" containerName="oc" Mar 10 07:26:00 crc kubenswrapper[4825]: I0310 07:26:00.144345 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552126-prr59" Mar 10 07:26:00 crc kubenswrapper[4825]: I0310 07:26:00.149600 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 07:26:00 crc kubenswrapper[4825]: I0310 07:26:00.149812 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 07:26:00 crc kubenswrapper[4825]: I0310 07:26:00.149839 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 07:26:00 crc kubenswrapper[4825]: I0310 07:26:00.150161 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552126-prr59"] Mar 10 07:26:00 crc kubenswrapper[4825]: I0310 07:26:00.276041 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dclt4\" (UniqueName: 
\"kubernetes.io/projected/ddaba890-7e66-4edb-a231-bf7afb2cb8fd-kube-api-access-dclt4\") pod \"auto-csr-approver-29552126-prr59\" (UID: \"ddaba890-7e66-4edb-a231-bf7afb2cb8fd\") " pod="openshift-infra/auto-csr-approver-29552126-prr59" Mar 10 07:26:00 crc kubenswrapper[4825]: I0310 07:26:00.377186 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dclt4\" (UniqueName: \"kubernetes.io/projected/ddaba890-7e66-4edb-a231-bf7afb2cb8fd-kube-api-access-dclt4\") pod \"auto-csr-approver-29552126-prr59\" (UID: \"ddaba890-7e66-4edb-a231-bf7afb2cb8fd\") " pod="openshift-infra/auto-csr-approver-29552126-prr59" Mar 10 07:26:00 crc kubenswrapper[4825]: I0310 07:26:00.403165 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dclt4\" (UniqueName: \"kubernetes.io/projected/ddaba890-7e66-4edb-a231-bf7afb2cb8fd-kube-api-access-dclt4\") pod \"auto-csr-approver-29552126-prr59\" (UID: \"ddaba890-7e66-4edb-a231-bf7afb2cb8fd\") " pod="openshift-infra/auto-csr-approver-29552126-prr59" Mar 10 07:26:00 crc kubenswrapper[4825]: I0310 07:26:00.463360 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552126-prr59" Mar 10 07:26:00 crc kubenswrapper[4825]: I0310 07:26:00.942837 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552126-prr59"] Mar 10 07:26:01 crc kubenswrapper[4825]: I0310 07:26:01.237248 4825 scope.go:117] "RemoveContainer" containerID="d1fa67187f8ea0c80a9c36034a9ddfdf1c0431ab2c7d3c4059f27045f96a6b45" Mar 10 07:26:01 crc kubenswrapper[4825]: E0310 07:26:01.237711 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:26:01 crc kubenswrapper[4825]: I0310 07:26:01.659772 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552126-prr59" event={"ID":"ddaba890-7e66-4edb-a231-bf7afb2cb8fd","Type":"ContainerStarted","Data":"288cd0d6b52d4550b08e9bc3c4884811c7be04d13ba15b09ef2ec82b2e9aabc9"} Mar 10 07:26:02 crc kubenswrapper[4825]: I0310 07:26:02.674113 4825 generic.go:334] "Generic (PLEG): container finished" podID="ddaba890-7e66-4edb-a231-bf7afb2cb8fd" containerID="701e87aa8ae4514e278b86d9059d1c82f25c29fa31a3dfa1df15852205d4f0e3" exitCode=0 Mar 10 07:26:02 crc kubenswrapper[4825]: I0310 07:26:02.674249 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552126-prr59" event={"ID":"ddaba890-7e66-4edb-a231-bf7afb2cb8fd","Type":"ContainerDied","Data":"701e87aa8ae4514e278b86d9059d1c82f25c29fa31a3dfa1df15852205d4f0e3"} Mar 10 07:26:04 crc kubenswrapper[4825]: I0310 07:26:04.003842 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552126-prr59" Mar 10 07:26:04 crc kubenswrapper[4825]: I0310 07:26:04.132259 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dclt4\" (UniqueName: \"kubernetes.io/projected/ddaba890-7e66-4edb-a231-bf7afb2cb8fd-kube-api-access-dclt4\") pod \"ddaba890-7e66-4edb-a231-bf7afb2cb8fd\" (UID: \"ddaba890-7e66-4edb-a231-bf7afb2cb8fd\") " Mar 10 07:26:04 crc kubenswrapper[4825]: I0310 07:26:04.140402 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddaba890-7e66-4edb-a231-bf7afb2cb8fd-kube-api-access-dclt4" (OuterVolumeSpecName: "kube-api-access-dclt4") pod "ddaba890-7e66-4edb-a231-bf7afb2cb8fd" (UID: "ddaba890-7e66-4edb-a231-bf7afb2cb8fd"). InnerVolumeSpecName "kube-api-access-dclt4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:26:04 crc kubenswrapper[4825]: I0310 07:26:04.234295 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dclt4\" (UniqueName: \"kubernetes.io/projected/ddaba890-7e66-4edb-a231-bf7afb2cb8fd-kube-api-access-dclt4\") on node \"crc\" DevicePath \"\"" Mar 10 07:26:04 crc kubenswrapper[4825]: I0310 07:26:04.692174 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552126-prr59" event={"ID":"ddaba890-7e66-4edb-a231-bf7afb2cb8fd","Type":"ContainerDied","Data":"288cd0d6b52d4550b08e9bc3c4884811c7be04d13ba15b09ef2ec82b2e9aabc9"} Mar 10 07:26:04 crc kubenswrapper[4825]: I0310 07:26:04.692625 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="288cd0d6b52d4550b08e9bc3c4884811c7be04d13ba15b09ef2ec82b2e9aabc9" Mar 10 07:26:04 crc kubenswrapper[4825]: I0310 07:26:04.692264 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552126-prr59" Mar 10 07:26:05 crc kubenswrapper[4825]: I0310 07:26:05.101566 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552120-lzjsg"] Mar 10 07:26:05 crc kubenswrapper[4825]: I0310 07:26:05.115612 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552120-lzjsg"] Mar 10 07:26:05 crc kubenswrapper[4825]: I0310 07:26:05.253616 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ee48ed3-b478-4fe2-ba16-39060711f446" path="/var/lib/kubelet/pods/8ee48ed3-b478-4fe2-ba16-39060711f446/volumes" Mar 10 07:26:12 crc kubenswrapper[4825]: I0310 07:26:12.237642 4825 scope.go:117] "RemoveContainer" containerID="d1fa67187f8ea0c80a9c36034a9ddfdf1c0431ab2c7d3c4059f27045f96a6b45" Mar 10 07:26:12 crc kubenswrapper[4825]: E0310 07:26:12.238628 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:26:23 crc kubenswrapper[4825]: I0310 07:26:23.237069 4825 scope.go:117] "RemoveContainer" containerID="d1fa67187f8ea0c80a9c36034a9ddfdf1c0431ab2c7d3c4059f27045f96a6b45" Mar 10 07:26:23 crc kubenswrapper[4825]: E0310 07:26:23.238058 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" 
podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:26:37 crc kubenswrapper[4825]: I0310 07:26:37.237800 4825 scope.go:117] "RemoveContainer" containerID="d1fa67187f8ea0c80a9c36034a9ddfdf1c0431ab2c7d3c4059f27045f96a6b45" Mar 10 07:26:37 crc kubenswrapper[4825]: E0310 07:26:37.238892 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:26:38 crc kubenswrapper[4825]: I0310 07:26:38.039681 4825 scope.go:117] "RemoveContainer" containerID="c05715d8163cbcbef2a3774a17e0efef9fdf1d7aef0433f053f9f328e4345879" Mar 10 07:26:51 crc kubenswrapper[4825]: I0310 07:26:51.236706 4825 scope.go:117] "RemoveContainer" containerID="d1fa67187f8ea0c80a9c36034a9ddfdf1c0431ab2c7d3c4059f27045f96a6b45" Mar 10 07:26:51 crc kubenswrapper[4825]: E0310 07:26:51.237856 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:27:06 crc kubenswrapper[4825]: I0310 07:27:06.237040 4825 scope.go:117] "RemoveContainer" containerID="d1fa67187f8ea0c80a9c36034a9ddfdf1c0431ab2c7d3c4059f27045f96a6b45" Mar 10 07:27:06 crc kubenswrapper[4825]: E0310 07:27:06.238191 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:27:19 crc kubenswrapper[4825]: I0310 07:27:19.247349 4825 scope.go:117] "RemoveContainer" containerID="d1fa67187f8ea0c80a9c36034a9ddfdf1c0431ab2c7d3c4059f27045f96a6b45" Mar 10 07:27:19 crc kubenswrapper[4825]: E0310 07:27:19.250921 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:27:34 crc kubenswrapper[4825]: I0310 07:27:34.236792 4825 scope.go:117] "RemoveContainer" containerID="d1fa67187f8ea0c80a9c36034a9ddfdf1c0431ab2c7d3c4059f27045f96a6b45" Mar 10 07:27:34 crc kubenswrapper[4825]: E0310 07:27:34.237737 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:27:47 crc kubenswrapper[4825]: I0310 07:27:47.236954 4825 scope.go:117] "RemoveContainer" containerID="d1fa67187f8ea0c80a9c36034a9ddfdf1c0431ab2c7d3c4059f27045f96a6b45" Mar 10 07:27:47 crc kubenswrapper[4825]: E0310 07:27:47.238029 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:27:59 crc kubenswrapper[4825]: I0310 07:27:59.243643 4825 scope.go:117] "RemoveContainer" containerID="d1fa67187f8ea0c80a9c36034a9ddfdf1c0431ab2c7d3c4059f27045f96a6b45" Mar 10 07:27:59 crc kubenswrapper[4825]: E0310 07:27:59.245198 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:28:00 crc kubenswrapper[4825]: I0310 07:28:00.152178 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552128-995sq"] Mar 10 07:28:00 crc kubenswrapper[4825]: E0310 07:28:00.152876 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddaba890-7e66-4edb-a231-bf7afb2cb8fd" containerName="oc" Mar 10 07:28:00 crc kubenswrapper[4825]: I0310 07:28:00.152896 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddaba890-7e66-4edb-a231-bf7afb2cb8fd" containerName="oc" Mar 10 07:28:00 crc kubenswrapper[4825]: I0310 07:28:00.153163 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddaba890-7e66-4edb-a231-bf7afb2cb8fd" containerName="oc" Mar 10 07:28:00 crc kubenswrapper[4825]: I0310 07:28:00.153793 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552128-995sq" Mar 10 07:28:00 crc kubenswrapper[4825]: I0310 07:28:00.156850 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 07:28:00 crc kubenswrapper[4825]: I0310 07:28:00.156873 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 07:28:00 crc kubenswrapper[4825]: I0310 07:28:00.158610 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 07:28:00 crc kubenswrapper[4825]: I0310 07:28:00.175520 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552128-995sq"] Mar 10 07:28:00 crc kubenswrapper[4825]: I0310 07:28:00.232155 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcxw6\" (UniqueName: \"kubernetes.io/projected/fc171a02-974b-4e0b-9ef7-57b5f717d951-kube-api-access-mcxw6\") pod \"auto-csr-approver-29552128-995sq\" (UID: \"fc171a02-974b-4e0b-9ef7-57b5f717d951\") " pod="openshift-infra/auto-csr-approver-29552128-995sq" Mar 10 07:28:00 crc kubenswrapper[4825]: I0310 07:28:00.334308 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcxw6\" (UniqueName: \"kubernetes.io/projected/fc171a02-974b-4e0b-9ef7-57b5f717d951-kube-api-access-mcxw6\") pod \"auto-csr-approver-29552128-995sq\" (UID: \"fc171a02-974b-4e0b-9ef7-57b5f717d951\") " pod="openshift-infra/auto-csr-approver-29552128-995sq" Mar 10 07:28:00 crc kubenswrapper[4825]: I0310 07:28:00.357651 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcxw6\" (UniqueName: \"kubernetes.io/projected/fc171a02-974b-4e0b-9ef7-57b5f717d951-kube-api-access-mcxw6\") pod \"auto-csr-approver-29552128-995sq\" (UID: \"fc171a02-974b-4e0b-9ef7-57b5f717d951\") " 
pod="openshift-infra/auto-csr-approver-29552128-995sq" Mar 10 07:28:00 crc kubenswrapper[4825]: I0310 07:28:00.477165 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552128-995sq" Mar 10 07:28:01 crc kubenswrapper[4825]: I0310 07:28:01.032712 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552128-995sq"] Mar 10 07:28:01 crc kubenswrapper[4825]: I0310 07:28:01.046261 4825 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 07:28:01 crc kubenswrapper[4825]: I0310 07:28:01.727252 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552128-995sq" event={"ID":"fc171a02-974b-4e0b-9ef7-57b5f717d951","Type":"ContainerStarted","Data":"50b5e98e2cd146fb742e1aed4a3be5982b4cf0b7da20bbbeec79540c94270a6b"} Mar 10 07:28:02 crc kubenswrapper[4825]: I0310 07:28:02.737077 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552128-995sq" event={"ID":"fc171a02-974b-4e0b-9ef7-57b5f717d951","Type":"ContainerStarted","Data":"61aa0cd1976baefe8750dc02484d1b9b3790629b9d474bb388426d8a460c7b4f"} Mar 10 07:28:03 crc kubenswrapper[4825]: I0310 07:28:03.748296 4825 generic.go:334] "Generic (PLEG): container finished" podID="fc171a02-974b-4e0b-9ef7-57b5f717d951" containerID="61aa0cd1976baefe8750dc02484d1b9b3790629b9d474bb388426d8a460c7b4f" exitCode=0 Mar 10 07:28:03 crc kubenswrapper[4825]: I0310 07:28:03.748419 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552128-995sq" event={"ID":"fc171a02-974b-4e0b-9ef7-57b5f717d951","Type":"ContainerDied","Data":"61aa0cd1976baefe8750dc02484d1b9b3790629b9d474bb388426d8a460c7b4f"} Mar 10 07:28:05 crc kubenswrapper[4825]: I0310 07:28:05.113297 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552128-995sq" Mar 10 07:28:05 crc kubenswrapper[4825]: I0310 07:28:05.309209 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcxw6\" (UniqueName: \"kubernetes.io/projected/fc171a02-974b-4e0b-9ef7-57b5f717d951-kube-api-access-mcxw6\") pod \"fc171a02-974b-4e0b-9ef7-57b5f717d951\" (UID: \"fc171a02-974b-4e0b-9ef7-57b5f717d951\") " Mar 10 07:28:05 crc kubenswrapper[4825]: I0310 07:28:05.314658 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc171a02-974b-4e0b-9ef7-57b5f717d951-kube-api-access-mcxw6" (OuterVolumeSpecName: "kube-api-access-mcxw6") pod "fc171a02-974b-4e0b-9ef7-57b5f717d951" (UID: "fc171a02-974b-4e0b-9ef7-57b5f717d951"). InnerVolumeSpecName "kube-api-access-mcxw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:28:05 crc kubenswrapper[4825]: I0310 07:28:05.411663 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcxw6\" (UniqueName: \"kubernetes.io/projected/fc171a02-974b-4e0b-9ef7-57b5f717d951-kube-api-access-mcxw6\") on node \"crc\" DevicePath \"\"" Mar 10 07:28:05 crc kubenswrapper[4825]: I0310 07:28:05.771469 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552128-995sq" event={"ID":"fc171a02-974b-4e0b-9ef7-57b5f717d951","Type":"ContainerDied","Data":"50b5e98e2cd146fb742e1aed4a3be5982b4cf0b7da20bbbeec79540c94270a6b"} Mar 10 07:28:05 crc kubenswrapper[4825]: I0310 07:28:05.771517 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552128-995sq" Mar 10 07:28:05 crc kubenswrapper[4825]: I0310 07:28:05.771526 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50b5e98e2cd146fb742e1aed4a3be5982b4cf0b7da20bbbeec79540c94270a6b" Mar 10 07:28:05 crc kubenswrapper[4825]: I0310 07:28:05.817581 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552122-zwxnr"] Mar 10 07:28:05 crc kubenswrapper[4825]: I0310 07:28:05.826515 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552122-zwxnr"] Mar 10 07:28:07 crc kubenswrapper[4825]: I0310 07:28:07.250887 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18954899-280e-4222-bd8f-f23dde790714" path="/var/lib/kubelet/pods/18954899-280e-4222-bd8f-f23dde790714/volumes" Mar 10 07:28:10 crc kubenswrapper[4825]: I0310 07:28:10.236952 4825 scope.go:117] "RemoveContainer" containerID="d1fa67187f8ea0c80a9c36034a9ddfdf1c0431ab2c7d3c4059f27045f96a6b45" Mar 10 07:28:10 crc kubenswrapper[4825]: E0310 07:28:10.237797 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:28:24 crc kubenswrapper[4825]: I0310 07:28:24.236821 4825 scope.go:117] "RemoveContainer" containerID="d1fa67187f8ea0c80a9c36034a9ddfdf1c0431ab2c7d3c4059f27045f96a6b45" Mar 10 07:28:24 crc kubenswrapper[4825]: E0310 07:28:24.237586 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:28:37 crc kubenswrapper[4825]: I0310 07:28:37.236783 4825 scope.go:117] "RemoveContainer" containerID="d1fa67187f8ea0c80a9c36034a9ddfdf1c0431ab2c7d3c4059f27045f96a6b45" Mar 10 07:28:37 crc kubenswrapper[4825]: E0310 07:28:37.237680 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:28:38 crc kubenswrapper[4825]: I0310 07:28:38.156870 4825 scope.go:117] "RemoveContainer" containerID="6e624e4ecf0c2bd0cb9f6b0efd1293653cc2b35372878c46b873163f5fa2b138" Mar 10 07:28:49 crc kubenswrapper[4825]: I0310 07:28:49.244410 4825 scope.go:117] "RemoveContainer" containerID="d1fa67187f8ea0c80a9c36034a9ddfdf1c0431ab2c7d3c4059f27045f96a6b45" Mar 10 07:28:49 crc kubenswrapper[4825]: E0310 07:28:49.245547 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:29:02 crc kubenswrapper[4825]: I0310 07:29:02.236894 4825 scope.go:117] "RemoveContainer" containerID="d1fa67187f8ea0c80a9c36034a9ddfdf1c0431ab2c7d3c4059f27045f96a6b45" Mar 10 07:29:02 crc kubenswrapper[4825]: 
E0310 07:29:02.237568 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:29:14 crc kubenswrapper[4825]: I0310 07:29:14.389731 4825 scope.go:117] "RemoveContainer" containerID="d1fa67187f8ea0c80a9c36034a9ddfdf1c0431ab2c7d3c4059f27045f96a6b45" Mar 10 07:29:14 crc kubenswrapper[4825]: E0310 07:29:14.390841 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:29:27 crc kubenswrapper[4825]: I0310 07:29:27.237007 4825 scope.go:117] "RemoveContainer" containerID="d1fa67187f8ea0c80a9c36034a9ddfdf1c0431ab2c7d3c4059f27045f96a6b45" Mar 10 07:29:27 crc kubenswrapper[4825]: E0310 07:29:27.237559 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:29:39 crc kubenswrapper[4825]: I0310 07:29:39.740058 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jnnl2"] Mar 10 07:29:39 crc 
kubenswrapper[4825]: E0310 07:29:39.741090 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc171a02-974b-4e0b-9ef7-57b5f717d951" containerName="oc" Mar 10 07:29:39 crc kubenswrapper[4825]: I0310 07:29:39.741110 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc171a02-974b-4e0b-9ef7-57b5f717d951" containerName="oc" Mar 10 07:29:39 crc kubenswrapper[4825]: I0310 07:29:39.741391 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc171a02-974b-4e0b-9ef7-57b5f717d951" containerName="oc" Mar 10 07:29:39 crc kubenswrapper[4825]: I0310 07:29:39.743048 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jnnl2" Mar 10 07:29:39 crc kubenswrapper[4825]: I0310 07:29:39.764954 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jnnl2"] Mar 10 07:29:39 crc kubenswrapper[4825]: I0310 07:29:39.928805 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwj96\" (UniqueName: \"kubernetes.io/projected/72e0dde1-2305-4348-b207-6809040bc665-kube-api-access-jwj96\") pod \"redhat-operators-jnnl2\" (UID: \"72e0dde1-2305-4348-b207-6809040bc665\") " pod="openshift-marketplace/redhat-operators-jnnl2" Mar 10 07:29:39 crc kubenswrapper[4825]: I0310 07:29:39.928871 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72e0dde1-2305-4348-b207-6809040bc665-catalog-content\") pod \"redhat-operators-jnnl2\" (UID: \"72e0dde1-2305-4348-b207-6809040bc665\") " pod="openshift-marketplace/redhat-operators-jnnl2" Mar 10 07:29:39 crc kubenswrapper[4825]: I0310 07:29:39.928924 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72e0dde1-2305-4348-b207-6809040bc665-utilities\") 
pod \"redhat-operators-jnnl2\" (UID: \"72e0dde1-2305-4348-b207-6809040bc665\") " pod="openshift-marketplace/redhat-operators-jnnl2" Mar 10 07:29:40 crc kubenswrapper[4825]: I0310 07:29:40.030193 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72e0dde1-2305-4348-b207-6809040bc665-catalog-content\") pod \"redhat-operators-jnnl2\" (UID: \"72e0dde1-2305-4348-b207-6809040bc665\") " pod="openshift-marketplace/redhat-operators-jnnl2" Mar 10 07:29:40 crc kubenswrapper[4825]: I0310 07:29:40.030345 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72e0dde1-2305-4348-b207-6809040bc665-utilities\") pod \"redhat-operators-jnnl2\" (UID: \"72e0dde1-2305-4348-b207-6809040bc665\") " pod="openshift-marketplace/redhat-operators-jnnl2" Mar 10 07:29:40 crc kubenswrapper[4825]: I0310 07:29:40.030450 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwj96\" (UniqueName: \"kubernetes.io/projected/72e0dde1-2305-4348-b207-6809040bc665-kube-api-access-jwj96\") pod \"redhat-operators-jnnl2\" (UID: \"72e0dde1-2305-4348-b207-6809040bc665\") " pod="openshift-marketplace/redhat-operators-jnnl2" Mar 10 07:29:40 crc kubenswrapper[4825]: I0310 07:29:40.030814 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72e0dde1-2305-4348-b207-6809040bc665-utilities\") pod \"redhat-operators-jnnl2\" (UID: \"72e0dde1-2305-4348-b207-6809040bc665\") " pod="openshift-marketplace/redhat-operators-jnnl2" Mar 10 07:29:40 crc kubenswrapper[4825]: I0310 07:29:40.031064 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72e0dde1-2305-4348-b207-6809040bc665-catalog-content\") pod \"redhat-operators-jnnl2\" (UID: 
\"72e0dde1-2305-4348-b207-6809040bc665\") " pod="openshift-marketplace/redhat-operators-jnnl2" Mar 10 07:29:40 crc kubenswrapper[4825]: I0310 07:29:40.054994 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwj96\" (UniqueName: \"kubernetes.io/projected/72e0dde1-2305-4348-b207-6809040bc665-kube-api-access-jwj96\") pod \"redhat-operators-jnnl2\" (UID: \"72e0dde1-2305-4348-b207-6809040bc665\") " pod="openshift-marketplace/redhat-operators-jnnl2" Mar 10 07:29:40 crc kubenswrapper[4825]: I0310 07:29:40.121739 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jnnl2" Mar 10 07:29:40 crc kubenswrapper[4825]: I0310 07:29:40.394795 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jnnl2"] Mar 10 07:29:40 crc kubenswrapper[4825]: I0310 07:29:40.855953 4825 generic.go:334] "Generic (PLEG): container finished" podID="72e0dde1-2305-4348-b207-6809040bc665" containerID="0068d48252e003ed0ba5fb192806fe6560dcc802880f316620b2586ee5deb82f" exitCode=0 Mar 10 07:29:40 crc kubenswrapper[4825]: I0310 07:29:40.855998 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jnnl2" event={"ID":"72e0dde1-2305-4348-b207-6809040bc665","Type":"ContainerDied","Data":"0068d48252e003ed0ba5fb192806fe6560dcc802880f316620b2586ee5deb82f"} Mar 10 07:29:40 crc kubenswrapper[4825]: I0310 07:29:40.856320 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jnnl2" event={"ID":"72e0dde1-2305-4348-b207-6809040bc665","Type":"ContainerStarted","Data":"5c8aee1fb49d6b83c4b96525fc4e2f29dd73665d118e169743c2b8ec08b8fef9"} Mar 10 07:29:41 crc kubenswrapper[4825]: I0310 07:29:41.236032 4825 scope.go:117] "RemoveContainer" containerID="d1fa67187f8ea0c80a9c36034a9ddfdf1c0431ab2c7d3c4059f27045f96a6b45" Mar 10 07:29:41 crc kubenswrapper[4825]: E0310 07:29:41.236322 4825 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:29:47 crc kubenswrapper[4825]: I0310 07:29:47.911556 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jnnl2" event={"ID":"72e0dde1-2305-4348-b207-6809040bc665","Type":"ContainerStarted","Data":"02c96ff401a54f1b7b02681cc13982ebfae177e2fa836ea62bbecc6f53dbc141"} Mar 10 07:29:48 crc kubenswrapper[4825]: I0310 07:29:48.924035 4825 generic.go:334] "Generic (PLEG): container finished" podID="72e0dde1-2305-4348-b207-6809040bc665" containerID="02c96ff401a54f1b7b02681cc13982ebfae177e2fa836ea62bbecc6f53dbc141" exitCode=0 Mar 10 07:29:48 crc kubenswrapper[4825]: I0310 07:29:48.924112 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jnnl2" event={"ID":"72e0dde1-2305-4348-b207-6809040bc665","Type":"ContainerDied","Data":"02c96ff401a54f1b7b02681cc13982ebfae177e2fa836ea62bbecc6f53dbc141"} Mar 10 07:29:49 crc kubenswrapper[4825]: I0310 07:29:49.936421 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jnnl2" event={"ID":"72e0dde1-2305-4348-b207-6809040bc665","Type":"ContainerStarted","Data":"aae9f488a5d69f02be97ad902f2be63ed30bea6c5879801773358d2f306fffde"} Mar 10 07:29:49 crc kubenswrapper[4825]: I0310 07:29:49.963388 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jnnl2" podStartSLOduration=2.287694874 podStartE2EDuration="10.963362003s" podCreationTimestamp="2026-03-10 07:29:39 +0000 UTC" firstStartedPulling="2026-03-10 07:29:40.857080952 
+0000 UTC m=+2733.886861567" lastFinishedPulling="2026-03-10 07:29:49.532748071 +0000 UTC m=+2742.562528696" observedRunningTime="2026-03-10 07:29:49.958877005 +0000 UTC m=+2742.988657650" watchObservedRunningTime="2026-03-10 07:29:49.963362003 +0000 UTC m=+2742.993142658" Mar 10 07:29:50 crc kubenswrapper[4825]: I0310 07:29:50.122449 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jnnl2" Mar 10 07:29:50 crc kubenswrapper[4825]: I0310 07:29:50.122514 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jnnl2" Mar 10 07:29:52 crc kubenswrapper[4825]: I0310 07:29:51.192724 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jnnl2" podUID="72e0dde1-2305-4348-b207-6809040bc665" containerName="registry-server" probeResult="failure" output=< Mar 10 07:29:52 crc kubenswrapper[4825]: timeout: failed to connect service ":50051" within 1s Mar 10 07:29:52 crc kubenswrapper[4825]: > Mar 10 07:29:55 crc kubenswrapper[4825]: I0310 07:29:55.237461 4825 scope.go:117] "RemoveContainer" containerID="d1fa67187f8ea0c80a9c36034a9ddfdf1c0431ab2c7d3c4059f27045f96a6b45" Mar 10 07:29:55 crc kubenswrapper[4825]: E0310 07:29:55.238422 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:30:00 crc kubenswrapper[4825]: I0310 07:30:00.178289 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552130-sbkmx"] Mar 10 07:30:00 crc kubenswrapper[4825]: I0310 07:30:00.179792 4825 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552130-sbkmx" Mar 10 07:30:00 crc kubenswrapper[4825]: I0310 07:30:00.183198 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r647m\" (UniqueName: \"kubernetes.io/projected/a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f-kube-api-access-r647m\") pod \"collect-profiles-29552130-sbkmx\" (UID: \"a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552130-sbkmx" Mar 10 07:30:00 crc kubenswrapper[4825]: I0310 07:30:00.183320 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f-secret-volume\") pod \"collect-profiles-29552130-sbkmx\" (UID: \"a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552130-sbkmx" Mar 10 07:30:00 crc kubenswrapper[4825]: I0310 07:30:00.183477 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f-config-volume\") pod \"collect-profiles-29552130-sbkmx\" (UID: \"a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552130-sbkmx" Mar 10 07:30:00 crc kubenswrapper[4825]: I0310 07:30:00.184600 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 07:30:00 crc kubenswrapper[4825]: I0310 07:30:00.185495 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 07:30:00 crc kubenswrapper[4825]: I0310 07:30:00.188618 4825 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29552130-vn76r"] Mar 10 07:30:00 crc kubenswrapper[4825]: I0310 07:30:00.190714 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552130-vn76r" Mar 10 07:30:00 crc kubenswrapper[4825]: I0310 07:30:00.193217 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 07:30:00 crc kubenswrapper[4825]: I0310 07:30:00.199451 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 07:30:00 crc kubenswrapper[4825]: I0310 07:30:00.200393 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 07:30:00 crc kubenswrapper[4825]: I0310 07:30:00.223257 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552130-vn76r"] Mar 10 07:30:00 crc kubenswrapper[4825]: I0310 07:30:00.228425 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jnnl2" Mar 10 07:30:00 crc kubenswrapper[4825]: I0310 07:30:00.232605 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552130-sbkmx"] Mar 10 07:30:00 crc kubenswrapper[4825]: I0310 07:30:00.278947 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jnnl2" Mar 10 07:30:00 crc kubenswrapper[4825]: I0310 07:30:00.284688 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r647m\" (UniqueName: \"kubernetes.io/projected/a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f-kube-api-access-r647m\") pod \"collect-profiles-29552130-sbkmx\" (UID: \"a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552130-sbkmx" Mar 10 07:30:00 crc 
kubenswrapper[4825]: I0310 07:30:00.284741 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mt9q\" (UniqueName: \"kubernetes.io/projected/4c439fd5-bf80-4e42-b8b4-e61f6b3f7971-kube-api-access-5mt9q\") pod \"auto-csr-approver-29552130-vn76r\" (UID: \"4c439fd5-bf80-4e42-b8b4-e61f6b3f7971\") " pod="openshift-infra/auto-csr-approver-29552130-vn76r" Mar 10 07:30:00 crc kubenswrapper[4825]: I0310 07:30:00.284765 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f-secret-volume\") pod \"collect-profiles-29552130-sbkmx\" (UID: \"a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552130-sbkmx" Mar 10 07:30:00 crc kubenswrapper[4825]: I0310 07:30:00.284830 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f-config-volume\") pod \"collect-profiles-29552130-sbkmx\" (UID: \"a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552130-sbkmx" Mar 10 07:30:00 crc kubenswrapper[4825]: I0310 07:30:00.285644 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f-config-volume\") pod \"collect-profiles-29552130-sbkmx\" (UID: \"a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552130-sbkmx" Mar 10 07:30:00 crc kubenswrapper[4825]: I0310 07:30:00.293810 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f-secret-volume\") pod \"collect-profiles-29552130-sbkmx\" (UID: \"a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29552130-sbkmx" Mar 10 07:30:00 crc kubenswrapper[4825]: I0310 07:30:00.306584 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r647m\" (UniqueName: \"kubernetes.io/projected/a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f-kube-api-access-r647m\") pod \"collect-profiles-29552130-sbkmx\" (UID: \"a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552130-sbkmx" Mar 10 07:30:00 crc kubenswrapper[4825]: I0310 07:30:00.348285 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jnnl2"] Mar 10 07:30:00 crc kubenswrapper[4825]: I0310 07:30:00.386016 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mt9q\" (UniqueName: \"kubernetes.io/projected/4c439fd5-bf80-4e42-b8b4-e61f6b3f7971-kube-api-access-5mt9q\") pod \"auto-csr-approver-29552130-vn76r\" (UID: \"4c439fd5-bf80-4e42-b8b4-e61f6b3f7971\") " pod="openshift-infra/auto-csr-approver-29552130-vn76r" Mar 10 07:30:00 crc kubenswrapper[4825]: I0310 07:30:00.401495 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mt9q\" (UniqueName: \"kubernetes.io/projected/4c439fd5-bf80-4e42-b8b4-e61f6b3f7971-kube-api-access-5mt9q\") pod \"auto-csr-approver-29552130-vn76r\" (UID: \"4c439fd5-bf80-4e42-b8b4-e61f6b3f7971\") " pod="openshift-infra/auto-csr-approver-29552130-vn76r" Mar 10 07:30:00 crc kubenswrapper[4825]: I0310 07:30:00.461832 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8kmh2"] Mar 10 07:30:00 crc kubenswrapper[4825]: I0310 07:30:00.462761 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8kmh2" podUID="b80fe579-c776-4eee-8b58-04e7ab7ac4cb" containerName="registry-server" 
containerID="cri-o://c463ebf6fab4960d816bf3fa83f077c07e682a07ebfaae3b5f0ee44f6adcc361" gracePeriod=2 Mar 10 07:30:00 crc kubenswrapper[4825]: I0310 07:30:00.515942 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552130-sbkmx" Mar 10 07:30:00 crc kubenswrapper[4825]: I0310 07:30:00.528553 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552130-vn76r" Mar 10 07:30:00 crc kubenswrapper[4825]: I0310 07:30:00.987796 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552130-sbkmx"] Mar 10 07:30:01 crc kubenswrapper[4825]: I0310 07:30:01.013456 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8kmh2" Mar 10 07:30:01 crc kubenswrapper[4825]: I0310 07:30:01.092536 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552130-vn76r"] Mar 10 07:30:01 crc kubenswrapper[4825]: I0310 07:30:01.095828 4825 generic.go:334] "Generic (PLEG): container finished" podID="b80fe579-c776-4eee-8b58-04e7ab7ac4cb" containerID="c463ebf6fab4960d816bf3fa83f077c07e682a07ebfaae3b5f0ee44f6adcc361" exitCode=0 Mar 10 07:30:01 crc kubenswrapper[4825]: I0310 07:30:01.095885 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8kmh2" event={"ID":"b80fe579-c776-4eee-8b58-04e7ab7ac4cb","Type":"ContainerDied","Data":"c463ebf6fab4960d816bf3fa83f077c07e682a07ebfaae3b5f0ee44f6adcc361"} Mar 10 07:30:01 crc kubenswrapper[4825]: I0310 07:30:01.095911 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8kmh2" event={"ID":"b80fe579-c776-4eee-8b58-04e7ab7ac4cb","Type":"ContainerDied","Data":"7b40f751f60e384b7434d22c573c31e71d501f8c8903f8a7149f3586e1681193"} Mar 10 07:30:01 crc 
kubenswrapper[4825]: I0310 07:30:01.095928 4825 scope.go:117] "RemoveContainer" containerID="c463ebf6fab4960d816bf3fa83f077c07e682a07ebfaae3b5f0ee44f6adcc361" Mar 10 07:30:01 crc kubenswrapper[4825]: I0310 07:30:01.096031 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8kmh2" Mar 10 07:30:01 crc kubenswrapper[4825]: I0310 07:30:01.098490 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b80fe579-c776-4eee-8b58-04e7ab7ac4cb-utilities\") pod \"b80fe579-c776-4eee-8b58-04e7ab7ac4cb\" (UID: \"b80fe579-c776-4eee-8b58-04e7ab7ac4cb\") " Mar 10 07:30:01 crc kubenswrapper[4825]: I0310 07:30:01.098563 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b80fe579-c776-4eee-8b58-04e7ab7ac4cb-catalog-content\") pod \"b80fe579-c776-4eee-8b58-04e7ab7ac4cb\" (UID: \"b80fe579-c776-4eee-8b58-04e7ab7ac4cb\") " Mar 10 07:30:01 crc kubenswrapper[4825]: I0310 07:30:01.099356 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b80fe579-c776-4eee-8b58-04e7ab7ac4cb-utilities" (OuterVolumeSpecName: "utilities") pod "b80fe579-c776-4eee-8b58-04e7ab7ac4cb" (UID: "b80fe579-c776-4eee-8b58-04e7ab7ac4cb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:30:01 crc kubenswrapper[4825]: I0310 07:30:01.114925 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552130-sbkmx" event={"ID":"a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f","Type":"ContainerStarted","Data":"8599607bef7e3b762255cc3ee0371fc0e604d73b798d8716b8a40d66323f7a34"} Mar 10 07:30:01 crc kubenswrapper[4825]: I0310 07:30:01.148495 4825 scope.go:117] "RemoveContainer" containerID="01d2b48df14b5e6917c015a795d5839af6b19862724d7241088f342a0e33370b" Mar 10 07:30:01 crc kubenswrapper[4825]: I0310 07:30:01.180499 4825 scope.go:117] "RemoveContainer" containerID="976c6e1a134c094dd08b36f074c5ad7ba15169aba725cf91209633d3650301b4" Mar 10 07:30:01 crc kubenswrapper[4825]: I0310 07:30:01.200866 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrkdq\" (UniqueName: \"kubernetes.io/projected/b80fe579-c776-4eee-8b58-04e7ab7ac4cb-kube-api-access-mrkdq\") pod \"b80fe579-c776-4eee-8b58-04e7ab7ac4cb\" (UID: \"b80fe579-c776-4eee-8b58-04e7ab7ac4cb\") " Mar 10 07:30:01 crc kubenswrapper[4825]: I0310 07:30:01.201096 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b80fe579-c776-4eee-8b58-04e7ab7ac4cb-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 07:30:01 crc kubenswrapper[4825]: I0310 07:30:01.207416 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b80fe579-c776-4eee-8b58-04e7ab7ac4cb-kube-api-access-mrkdq" (OuterVolumeSpecName: "kube-api-access-mrkdq") pod "b80fe579-c776-4eee-8b58-04e7ab7ac4cb" (UID: "b80fe579-c776-4eee-8b58-04e7ab7ac4cb"). InnerVolumeSpecName "kube-api-access-mrkdq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:30:01 crc kubenswrapper[4825]: I0310 07:30:01.218987 4825 scope.go:117] "RemoveContainer" containerID="c463ebf6fab4960d816bf3fa83f077c07e682a07ebfaae3b5f0ee44f6adcc361" Mar 10 07:30:01 crc kubenswrapper[4825]: E0310 07:30:01.220664 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c463ebf6fab4960d816bf3fa83f077c07e682a07ebfaae3b5f0ee44f6adcc361\": container with ID starting with c463ebf6fab4960d816bf3fa83f077c07e682a07ebfaae3b5f0ee44f6adcc361 not found: ID does not exist" containerID="c463ebf6fab4960d816bf3fa83f077c07e682a07ebfaae3b5f0ee44f6adcc361" Mar 10 07:30:01 crc kubenswrapper[4825]: I0310 07:30:01.220709 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c463ebf6fab4960d816bf3fa83f077c07e682a07ebfaae3b5f0ee44f6adcc361"} err="failed to get container status \"c463ebf6fab4960d816bf3fa83f077c07e682a07ebfaae3b5f0ee44f6adcc361\": rpc error: code = NotFound desc = could not find container \"c463ebf6fab4960d816bf3fa83f077c07e682a07ebfaae3b5f0ee44f6adcc361\": container with ID starting with c463ebf6fab4960d816bf3fa83f077c07e682a07ebfaae3b5f0ee44f6adcc361 not found: ID does not exist" Mar 10 07:30:01 crc kubenswrapper[4825]: I0310 07:30:01.220734 4825 scope.go:117] "RemoveContainer" containerID="01d2b48df14b5e6917c015a795d5839af6b19862724d7241088f342a0e33370b" Mar 10 07:30:01 crc kubenswrapper[4825]: E0310 07:30:01.221121 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01d2b48df14b5e6917c015a795d5839af6b19862724d7241088f342a0e33370b\": container with ID starting with 01d2b48df14b5e6917c015a795d5839af6b19862724d7241088f342a0e33370b not found: ID does not exist" containerID="01d2b48df14b5e6917c015a795d5839af6b19862724d7241088f342a0e33370b" Mar 10 07:30:01 crc kubenswrapper[4825]: I0310 07:30:01.221196 
4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01d2b48df14b5e6917c015a795d5839af6b19862724d7241088f342a0e33370b"} err="failed to get container status \"01d2b48df14b5e6917c015a795d5839af6b19862724d7241088f342a0e33370b\": rpc error: code = NotFound desc = could not find container \"01d2b48df14b5e6917c015a795d5839af6b19862724d7241088f342a0e33370b\": container with ID starting with 01d2b48df14b5e6917c015a795d5839af6b19862724d7241088f342a0e33370b not found: ID does not exist" Mar 10 07:30:01 crc kubenswrapper[4825]: I0310 07:30:01.221209 4825 scope.go:117] "RemoveContainer" containerID="976c6e1a134c094dd08b36f074c5ad7ba15169aba725cf91209633d3650301b4" Mar 10 07:30:01 crc kubenswrapper[4825]: E0310 07:30:01.221797 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"976c6e1a134c094dd08b36f074c5ad7ba15169aba725cf91209633d3650301b4\": container with ID starting with 976c6e1a134c094dd08b36f074c5ad7ba15169aba725cf91209633d3650301b4 not found: ID does not exist" containerID="976c6e1a134c094dd08b36f074c5ad7ba15169aba725cf91209633d3650301b4" Mar 10 07:30:01 crc kubenswrapper[4825]: I0310 07:30:01.221822 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"976c6e1a134c094dd08b36f074c5ad7ba15169aba725cf91209633d3650301b4"} err="failed to get container status \"976c6e1a134c094dd08b36f074c5ad7ba15169aba725cf91209633d3650301b4\": rpc error: code = NotFound desc = could not find container \"976c6e1a134c094dd08b36f074c5ad7ba15169aba725cf91209633d3650301b4\": container with ID starting with 976c6e1a134c094dd08b36f074c5ad7ba15169aba725cf91209633d3650301b4 not found: ID does not exist" Mar 10 07:30:01 crc kubenswrapper[4825]: I0310 07:30:01.223616 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b80fe579-c776-4eee-8b58-04e7ab7ac4cb-catalog-content" 
(OuterVolumeSpecName: "catalog-content") pod "b80fe579-c776-4eee-8b58-04e7ab7ac4cb" (UID: "b80fe579-c776-4eee-8b58-04e7ab7ac4cb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:30:01 crc kubenswrapper[4825]: I0310 07:30:01.312095 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b80fe579-c776-4eee-8b58-04e7ab7ac4cb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 07:30:01 crc kubenswrapper[4825]: I0310 07:30:01.312154 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrkdq\" (UniqueName: \"kubernetes.io/projected/b80fe579-c776-4eee-8b58-04e7ab7ac4cb-kube-api-access-mrkdq\") on node \"crc\" DevicePath \"\"" Mar 10 07:30:01 crc kubenswrapper[4825]: E0310 07:30:01.376282 4825 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb80fe579_c776_4eee_8b58_04e7ab7ac4cb.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5833cf1_e0d8_4f34_8a4e_d33d9dc86c8f.slice/crio-conmon-f3bcc1214ca1681a4785ba6f7a392fab4cefc0685994622d7ba0d42d12949c17.scope\": RecentStats: unable to find data in memory cache]" Mar 10 07:30:01 crc kubenswrapper[4825]: I0310 07:30:01.424086 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8kmh2"] Mar 10 07:30:01 crc kubenswrapper[4825]: I0310 07:30:01.431380 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8kmh2"] Mar 10 07:30:02 crc kubenswrapper[4825]: I0310 07:30:02.123685 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552130-vn76r" 
event={"ID":"4c439fd5-bf80-4e42-b8b4-e61f6b3f7971","Type":"ContainerStarted","Data":"70ab941b7725fefe9349a4f09bb672ae4d99d62d11216e103fd6cbc9568f80a1"} Mar 10 07:30:02 crc kubenswrapper[4825]: I0310 07:30:02.125839 4825 generic.go:334] "Generic (PLEG): container finished" podID="a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f" containerID="f3bcc1214ca1681a4785ba6f7a392fab4cefc0685994622d7ba0d42d12949c17" exitCode=0 Mar 10 07:30:02 crc kubenswrapper[4825]: I0310 07:30:02.125922 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552130-sbkmx" event={"ID":"a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f","Type":"ContainerDied","Data":"f3bcc1214ca1681a4785ba6f7a392fab4cefc0685994622d7ba0d42d12949c17"} Mar 10 07:30:03 crc kubenswrapper[4825]: I0310 07:30:03.247123 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b80fe579-c776-4eee-8b58-04e7ab7ac4cb" path="/var/lib/kubelet/pods/b80fe579-c776-4eee-8b58-04e7ab7ac4cb/volumes" Mar 10 07:30:03 crc kubenswrapper[4825]: I0310 07:30:03.411111 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552130-sbkmx" Mar 10 07:30:03 crc kubenswrapper[4825]: I0310 07:30:03.439310 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f-secret-volume\") pod \"a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f\" (UID: \"a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f\") " Mar 10 07:30:03 crc kubenswrapper[4825]: I0310 07:30:03.439386 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f-config-volume\") pod \"a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f\" (UID: \"a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f\") " Mar 10 07:30:03 crc kubenswrapper[4825]: I0310 07:30:03.439472 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r647m\" (UniqueName: \"kubernetes.io/projected/a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f-kube-api-access-r647m\") pod \"a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f\" (UID: \"a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f\") " Mar 10 07:30:03 crc kubenswrapper[4825]: I0310 07:30:03.440708 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f-config-volume" (OuterVolumeSpecName: "config-volume") pod "a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f" (UID: "a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:30:03 crc kubenswrapper[4825]: I0310 07:30:03.444388 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f" (UID: "a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:30:03 crc kubenswrapper[4825]: I0310 07:30:03.444980 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f-kube-api-access-r647m" (OuterVolumeSpecName: "kube-api-access-r647m") pod "a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f" (UID: "a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f"). InnerVolumeSpecName "kube-api-access-r647m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:30:03 crc kubenswrapper[4825]: I0310 07:30:03.540891 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r647m\" (UniqueName: \"kubernetes.io/projected/a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f-kube-api-access-r647m\") on node \"crc\" DevicePath \"\"" Mar 10 07:30:03 crc kubenswrapper[4825]: I0310 07:30:03.541075 4825 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 07:30:03 crc kubenswrapper[4825]: I0310 07:30:03.541179 4825 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 07:30:04 crc kubenswrapper[4825]: I0310 07:30:04.148122 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552130-sbkmx" event={"ID":"a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f","Type":"ContainerDied","Data":"8599607bef7e3b762255cc3ee0371fc0e604d73b798d8716b8a40d66323f7a34"} Mar 10 07:30:04 crc kubenswrapper[4825]: I0310 07:30:04.148193 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552130-sbkmx" Mar 10 07:30:04 crc kubenswrapper[4825]: I0310 07:30:04.148215 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8599607bef7e3b762255cc3ee0371fc0e604d73b798d8716b8a40d66323f7a34" Mar 10 07:30:04 crc kubenswrapper[4825]: I0310 07:30:04.151230 4825 generic.go:334] "Generic (PLEG): container finished" podID="4c439fd5-bf80-4e42-b8b4-e61f6b3f7971" containerID="896b05720a80224f2f9e1c613e5479bc6419911a72ca26f72507bd440f8456a6" exitCode=0 Mar 10 07:30:04 crc kubenswrapper[4825]: I0310 07:30:04.151282 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552130-vn76r" event={"ID":"4c439fd5-bf80-4e42-b8b4-e61f6b3f7971","Type":"ContainerDied","Data":"896b05720a80224f2f9e1c613e5479bc6419911a72ca26f72507bd440f8456a6"} Mar 10 07:30:04 crc kubenswrapper[4825]: I0310 07:30:04.516792 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552085-npl6x"] Mar 10 07:30:04 crc kubenswrapper[4825]: I0310 07:30:04.527021 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552085-npl6x"] Mar 10 07:30:05 crc kubenswrapper[4825]: I0310 07:30:05.290184 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4889476-c0d0-4e4b-986a-f4dcdacce72b" path="/var/lib/kubelet/pods/f4889476-c0d0-4e4b-986a-f4dcdacce72b/volumes" Mar 10 07:30:05 crc kubenswrapper[4825]: I0310 07:30:05.543784 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552130-vn76r" Mar 10 07:30:05 crc kubenswrapper[4825]: I0310 07:30:05.694018 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mt9q\" (UniqueName: \"kubernetes.io/projected/4c439fd5-bf80-4e42-b8b4-e61f6b3f7971-kube-api-access-5mt9q\") pod \"4c439fd5-bf80-4e42-b8b4-e61f6b3f7971\" (UID: \"4c439fd5-bf80-4e42-b8b4-e61f6b3f7971\") " Mar 10 07:30:05 crc kubenswrapper[4825]: I0310 07:30:05.702923 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c439fd5-bf80-4e42-b8b4-e61f6b3f7971-kube-api-access-5mt9q" (OuterVolumeSpecName: "kube-api-access-5mt9q") pod "4c439fd5-bf80-4e42-b8b4-e61f6b3f7971" (UID: "4c439fd5-bf80-4e42-b8b4-e61f6b3f7971"). InnerVolumeSpecName "kube-api-access-5mt9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:30:05 crc kubenswrapper[4825]: I0310 07:30:05.796342 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mt9q\" (UniqueName: \"kubernetes.io/projected/4c439fd5-bf80-4e42-b8b4-e61f6b3f7971-kube-api-access-5mt9q\") on node \"crc\" DevicePath \"\"" Mar 10 07:30:06 crc kubenswrapper[4825]: I0310 07:30:06.173217 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552130-vn76r" event={"ID":"4c439fd5-bf80-4e42-b8b4-e61f6b3f7971","Type":"ContainerDied","Data":"70ab941b7725fefe9349a4f09bb672ae4d99d62d11216e103fd6cbc9568f80a1"} Mar 10 07:30:06 crc kubenswrapper[4825]: I0310 07:30:06.173614 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70ab941b7725fefe9349a4f09bb672ae4d99d62d11216e103fd6cbc9568f80a1" Mar 10 07:30:06 crc kubenswrapper[4825]: I0310 07:30:06.173329 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552130-vn76r" Mar 10 07:30:06 crc kubenswrapper[4825]: I0310 07:30:06.630839 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552124-tffr7"] Mar 10 07:30:06 crc kubenswrapper[4825]: I0310 07:30:06.640277 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552124-tffr7"] Mar 10 07:30:07 crc kubenswrapper[4825]: I0310 07:30:07.236128 4825 scope.go:117] "RemoveContainer" containerID="d1fa67187f8ea0c80a9c36034a9ddfdf1c0431ab2c7d3c4059f27045f96a6b45" Mar 10 07:30:07 crc kubenswrapper[4825]: E0310 07:30:07.236610 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:30:07 crc kubenswrapper[4825]: I0310 07:30:07.252986 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ade7b324-7179-4534-94bd-03819ae4ade3" path="/var/lib/kubelet/pods/ade7b324-7179-4534-94bd-03819ae4ade3/volumes" Mar 10 07:30:21 crc kubenswrapper[4825]: I0310 07:30:21.236652 4825 scope.go:117] "RemoveContainer" containerID="d1fa67187f8ea0c80a9c36034a9ddfdf1c0431ab2c7d3c4059f27045f96a6b45" Mar 10 07:30:21 crc kubenswrapper[4825]: E0310 07:30:21.237751 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" 
podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:30:33 crc kubenswrapper[4825]: I0310 07:30:33.237530 4825 scope.go:117] "RemoveContainer" containerID="d1fa67187f8ea0c80a9c36034a9ddfdf1c0431ab2c7d3c4059f27045f96a6b45" Mar 10 07:30:33 crc kubenswrapper[4825]: E0310 07:30:33.238511 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:30:38 crc kubenswrapper[4825]: I0310 07:30:38.264300 4825 scope.go:117] "RemoveContainer" containerID="48ead55ecdcc5b367d13e319264acfded320c333d5f630757bdcd4e5589ea7fb" Mar 10 07:30:38 crc kubenswrapper[4825]: I0310 07:30:38.286413 4825 scope.go:117] "RemoveContainer" containerID="67ce4aae6072bca2381fc2bb1155837742dc2400d17b065d8bf7b6b17de6403a" Mar 10 07:30:48 crc kubenswrapper[4825]: I0310 07:30:48.236690 4825 scope.go:117] "RemoveContainer" containerID="d1fa67187f8ea0c80a9c36034a9ddfdf1c0431ab2c7d3c4059f27045f96a6b45" Mar 10 07:30:48 crc kubenswrapper[4825]: I0310 07:30:48.577562 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerStarted","Data":"f14d76b81593b08c433fbe6d8eff0a69ee39bc5bc0cae738058b3fb2fd863827"} Mar 10 07:32:00 crc kubenswrapper[4825]: I0310 07:32:00.136501 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552132-4xg6n"] Mar 10 07:32:00 crc kubenswrapper[4825]: E0310 07:32:00.138947 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c439fd5-bf80-4e42-b8b4-e61f6b3f7971" containerName="oc" Mar 10 07:32:00 crc kubenswrapper[4825]: 
I0310 07:32:00.138974 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c439fd5-bf80-4e42-b8b4-e61f6b3f7971" containerName="oc" Mar 10 07:32:00 crc kubenswrapper[4825]: E0310 07:32:00.138992 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b80fe579-c776-4eee-8b58-04e7ab7ac4cb" containerName="registry-server" Mar 10 07:32:00 crc kubenswrapper[4825]: I0310 07:32:00.138999 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b80fe579-c776-4eee-8b58-04e7ab7ac4cb" containerName="registry-server" Mar 10 07:32:00 crc kubenswrapper[4825]: E0310 07:32:00.139012 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b80fe579-c776-4eee-8b58-04e7ab7ac4cb" containerName="extract-utilities" Mar 10 07:32:00 crc kubenswrapper[4825]: I0310 07:32:00.139019 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b80fe579-c776-4eee-8b58-04e7ab7ac4cb" containerName="extract-utilities" Mar 10 07:32:00 crc kubenswrapper[4825]: E0310 07:32:00.139033 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b80fe579-c776-4eee-8b58-04e7ab7ac4cb" containerName="extract-content" Mar 10 07:32:00 crc kubenswrapper[4825]: I0310 07:32:00.139039 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b80fe579-c776-4eee-8b58-04e7ab7ac4cb" containerName="extract-content" Mar 10 07:32:00 crc kubenswrapper[4825]: E0310 07:32:00.139048 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f" containerName="collect-profiles" Mar 10 07:32:00 crc kubenswrapper[4825]: I0310 07:32:00.139054 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f" containerName="collect-profiles" Mar 10 07:32:00 crc kubenswrapper[4825]: I0310 07:32:00.139221 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="b80fe579-c776-4eee-8b58-04e7ab7ac4cb" containerName="registry-server" Mar 10 07:32:00 crc kubenswrapper[4825]: I0310 07:32:00.139236 
4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c439fd5-bf80-4e42-b8b4-e61f6b3f7971" containerName="oc" Mar 10 07:32:00 crc kubenswrapper[4825]: I0310 07:32:00.139245 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f" containerName="collect-profiles" Mar 10 07:32:00 crc kubenswrapper[4825]: I0310 07:32:00.139862 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552132-4xg6n" Mar 10 07:32:00 crc kubenswrapper[4825]: I0310 07:32:00.143716 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 07:32:00 crc kubenswrapper[4825]: I0310 07:32:00.143924 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 07:32:00 crc kubenswrapper[4825]: I0310 07:32:00.145007 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 07:32:00 crc kubenswrapper[4825]: I0310 07:32:00.149031 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552132-4xg6n"] Mar 10 07:32:00 crc kubenswrapper[4825]: I0310 07:32:00.276654 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2wxs\" (UniqueName: \"kubernetes.io/projected/78877c1c-a7f0-4f7c-a9f9-9712470779f1-kube-api-access-t2wxs\") pod \"auto-csr-approver-29552132-4xg6n\" (UID: \"78877c1c-a7f0-4f7c-a9f9-9712470779f1\") " pod="openshift-infra/auto-csr-approver-29552132-4xg6n" Mar 10 07:32:00 crc kubenswrapper[4825]: I0310 07:32:00.378811 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2wxs\" (UniqueName: \"kubernetes.io/projected/78877c1c-a7f0-4f7c-a9f9-9712470779f1-kube-api-access-t2wxs\") pod \"auto-csr-approver-29552132-4xg6n\" (UID: 
\"78877c1c-a7f0-4f7c-a9f9-9712470779f1\") " pod="openshift-infra/auto-csr-approver-29552132-4xg6n" Mar 10 07:32:00 crc kubenswrapper[4825]: I0310 07:32:00.416958 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2wxs\" (UniqueName: \"kubernetes.io/projected/78877c1c-a7f0-4f7c-a9f9-9712470779f1-kube-api-access-t2wxs\") pod \"auto-csr-approver-29552132-4xg6n\" (UID: \"78877c1c-a7f0-4f7c-a9f9-9712470779f1\") " pod="openshift-infra/auto-csr-approver-29552132-4xg6n" Mar 10 07:32:00 crc kubenswrapper[4825]: I0310 07:32:00.457245 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552132-4xg6n" Mar 10 07:32:00 crc kubenswrapper[4825]: I0310 07:32:00.991098 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552132-4xg6n"] Mar 10 07:32:01 crc kubenswrapper[4825]: I0310 07:32:01.164194 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552132-4xg6n" event={"ID":"78877c1c-a7f0-4f7c-a9f9-9712470779f1","Type":"ContainerStarted","Data":"f613a586bbe08d9fc26fdecc528f29300f0aee1b08c42ea2c883183e986563dc"} Mar 10 07:32:02 crc kubenswrapper[4825]: I0310 07:32:02.174237 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552132-4xg6n" event={"ID":"78877c1c-a7f0-4f7c-a9f9-9712470779f1","Type":"ContainerStarted","Data":"310bf31af4edb2c7dea7b543f7779bbaf81af94b2ad90b8cd733d0b2e42ffa67"} Mar 10 07:32:02 crc kubenswrapper[4825]: I0310 07:32:02.188511 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552132-4xg6n" podStartSLOduration=1.281981815 podStartE2EDuration="2.188489056s" podCreationTimestamp="2026-03-10 07:32:00 +0000 UTC" firstStartedPulling="2026-03-10 07:32:01.005224435 +0000 UTC m=+2874.035005090" lastFinishedPulling="2026-03-10 07:32:01.911731716 +0000 UTC m=+2874.941512331" 
observedRunningTime="2026-03-10 07:32:02.188282021 +0000 UTC m=+2875.218062676" watchObservedRunningTime="2026-03-10 07:32:02.188489056 +0000 UTC m=+2875.218269711" Mar 10 07:32:03 crc kubenswrapper[4825]: I0310 07:32:03.187463 4825 generic.go:334] "Generic (PLEG): container finished" podID="78877c1c-a7f0-4f7c-a9f9-9712470779f1" containerID="310bf31af4edb2c7dea7b543f7779bbaf81af94b2ad90b8cd733d0b2e42ffa67" exitCode=0 Mar 10 07:32:03 crc kubenswrapper[4825]: I0310 07:32:03.187535 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552132-4xg6n" event={"ID":"78877c1c-a7f0-4f7c-a9f9-9712470779f1","Type":"ContainerDied","Data":"310bf31af4edb2c7dea7b543f7779bbaf81af94b2ad90b8cd733d0b2e42ffa67"} Mar 10 07:32:04 crc kubenswrapper[4825]: I0310 07:32:04.659283 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552132-4xg6n" Mar 10 07:32:04 crc kubenswrapper[4825]: I0310 07:32:04.749875 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2wxs\" (UniqueName: \"kubernetes.io/projected/78877c1c-a7f0-4f7c-a9f9-9712470779f1-kube-api-access-t2wxs\") pod \"78877c1c-a7f0-4f7c-a9f9-9712470779f1\" (UID: \"78877c1c-a7f0-4f7c-a9f9-9712470779f1\") " Mar 10 07:32:04 crc kubenswrapper[4825]: I0310 07:32:04.755346 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78877c1c-a7f0-4f7c-a9f9-9712470779f1-kube-api-access-t2wxs" (OuterVolumeSpecName: "kube-api-access-t2wxs") pod "78877c1c-a7f0-4f7c-a9f9-9712470779f1" (UID: "78877c1c-a7f0-4f7c-a9f9-9712470779f1"). InnerVolumeSpecName "kube-api-access-t2wxs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:32:04 crc kubenswrapper[4825]: I0310 07:32:04.852057 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2wxs\" (UniqueName: \"kubernetes.io/projected/78877c1c-a7f0-4f7c-a9f9-9712470779f1-kube-api-access-t2wxs\") on node \"crc\" DevicePath \"\"" Mar 10 07:32:05 crc kubenswrapper[4825]: I0310 07:32:05.208499 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552132-4xg6n" event={"ID":"78877c1c-a7f0-4f7c-a9f9-9712470779f1","Type":"ContainerDied","Data":"f613a586bbe08d9fc26fdecc528f29300f0aee1b08c42ea2c883183e986563dc"} Mar 10 07:32:05 crc kubenswrapper[4825]: I0310 07:32:05.208564 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f613a586bbe08d9fc26fdecc528f29300f0aee1b08c42ea2c883183e986563dc" Mar 10 07:32:05 crc kubenswrapper[4825]: I0310 07:32:05.208667 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552132-4xg6n" Mar 10 07:32:05 crc kubenswrapper[4825]: I0310 07:32:05.297526 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552126-prr59"] Mar 10 07:32:05 crc kubenswrapper[4825]: I0310 07:32:05.309348 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552126-prr59"] Mar 10 07:32:07 crc kubenswrapper[4825]: I0310 07:32:07.247553 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddaba890-7e66-4edb-a231-bf7afb2cb8fd" path="/var/lib/kubelet/pods/ddaba890-7e66-4edb-a231-bf7afb2cb8fd/volumes" Mar 10 07:32:31 crc kubenswrapper[4825]: I0310 07:32:31.495285 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nljlr"] Mar 10 07:32:31 crc kubenswrapper[4825]: E0310 07:32:31.496455 4825 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="78877c1c-a7f0-4f7c-a9f9-9712470779f1" containerName="oc" Mar 10 07:32:31 crc kubenswrapper[4825]: I0310 07:32:31.496489 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="78877c1c-a7f0-4f7c-a9f9-9712470779f1" containerName="oc" Mar 10 07:32:31 crc kubenswrapper[4825]: I0310 07:32:31.496843 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="78877c1c-a7f0-4f7c-a9f9-9712470779f1" containerName="oc" Mar 10 07:32:31 crc kubenswrapper[4825]: I0310 07:32:31.499730 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nljlr" Mar 10 07:32:31 crc kubenswrapper[4825]: I0310 07:32:31.527166 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nljlr"] Mar 10 07:32:31 crc kubenswrapper[4825]: I0310 07:32:31.572307 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f344ea8-2817-419d-85a9-7c119f8663e2-catalog-content\") pod \"redhat-marketplace-nljlr\" (UID: \"6f344ea8-2817-419d-85a9-7c119f8663e2\") " pod="openshift-marketplace/redhat-marketplace-nljlr" Mar 10 07:32:31 crc kubenswrapper[4825]: I0310 07:32:31.572644 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwcch\" (UniqueName: \"kubernetes.io/projected/6f344ea8-2817-419d-85a9-7c119f8663e2-kube-api-access-nwcch\") pod \"redhat-marketplace-nljlr\" (UID: \"6f344ea8-2817-419d-85a9-7c119f8663e2\") " pod="openshift-marketplace/redhat-marketplace-nljlr" Mar 10 07:32:31 crc kubenswrapper[4825]: I0310 07:32:31.572687 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f344ea8-2817-419d-85a9-7c119f8663e2-utilities\") pod \"redhat-marketplace-nljlr\" (UID: \"6f344ea8-2817-419d-85a9-7c119f8663e2\") " 
pod="openshift-marketplace/redhat-marketplace-nljlr" Mar 10 07:32:31 crc kubenswrapper[4825]: I0310 07:32:31.673702 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwcch\" (UniqueName: \"kubernetes.io/projected/6f344ea8-2817-419d-85a9-7c119f8663e2-kube-api-access-nwcch\") pod \"redhat-marketplace-nljlr\" (UID: \"6f344ea8-2817-419d-85a9-7c119f8663e2\") " pod="openshift-marketplace/redhat-marketplace-nljlr" Mar 10 07:32:31 crc kubenswrapper[4825]: I0310 07:32:31.673752 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f344ea8-2817-419d-85a9-7c119f8663e2-utilities\") pod \"redhat-marketplace-nljlr\" (UID: \"6f344ea8-2817-419d-85a9-7c119f8663e2\") " pod="openshift-marketplace/redhat-marketplace-nljlr" Mar 10 07:32:31 crc kubenswrapper[4825]: I0310 07:32:31.673804 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f344ea8-2817-419d-85a9-7c119f8663e2-catalog-content\") pod \"redhat-marketplace-nljlr\" (UID: \"6f344ea8-2817-419d-85a9-7c119f8663e2\") " pod="openshift-marketplace/redhat-marketplace-nljlr" Mar 10 07:32:31 crc kubenswrapper[4825]: I0310 07:32:31.674404 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f344ea8-2817-419d-85a9-7c119f8663e2-catalog-content\") pod \"redhat-marketplace-nljlr\" (UID: \"6f344ea8-2817-419d-85a9-7c119f8663e2\") " pod="openshift-marketplace/redhat-marketplace-nljlr" Mar 10 07:32:31 crc kubenswrapper[4825]: I0310 07:32:31.674841 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f344ea8-2817-419d-85a9-7c119f8663e2-utilities\") pod \"redhat-marketplace-nljlr\" (UID: \"6f344ea8-2817-419d-85a9-7c119f8663e2\") " pod="openshift-marketplace/redhat-marketplace-nljlr" 
Mar 10 07:32:31 crc kubenswrapper[4825]: I0310 07:32:31.698405 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwcch\" (UniqueName: \"kubernetes.io/projected/6f344ea8-2817-419d-85a9-7c119f8663e2-kube-api-access-nwcch\") pod \"redhat-marketplace-nljlr\" (UID: \"6f344ea8-2817-419d-85a9-7c119f8663e2\") " pod="openshift-marketplace/redhat-marketplace-nljlr" Mar 10 07:32:31 crc kubenswrapper[4825]: I0310 07:32:31.842113 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nljlr" Mar 10 07:32:32 crc kubenswrapper[4825]: I0310 07:32:32.312478 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nljlr"] Mar 10 07:32:32 crc kubenswrapper[4825]: I0310 07:32:32.464972 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nljlr" event={"ID":"6f344ea8-2817-419d-85a9-7c119f8663e2","Type":"ContainerStarted","Data":"a019b7c15b7173b7b6c39df4a38d6030992e8c0b1c9aeb5df9402e9ea93e2c3f"} Mar 10 07:32:33 crc kubenswrapper[4825]: I0310 07:32:33.476386 4825 generic.go:334] "Generic (PLEG): container finished" podID="6f344ea8-2817-419d-85a9-7c119f8663e2" containerID="a91747082f917caf08d7ba673f3288ee902dcda4b849fead49dbe8829c1f90c4" exitCode=0 Mar 10 07:32:33 crc kubenswrapper[4825]: I0310 07:32:33.476503 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nljlr" event={"ID":"6f344ea8-2817-419d-85a9-7c119f8663e2","Type":"ContainerDied","Data":"a91747082f917caf08d7ba673f3288ee902dcda4b849fead49dbe8829c1f90c4"} Mar 10 07:32:35 crc kubenswrapper[4825]: I0310 07:32:35.514265 4825 generic.go:334] "Generic (PLEG): container finished" podID="6f344ea8-2817-419d-85a9-7c119f8663e2" containerID="8f7b478877ac94484babe1af7e31c3524dc7d48e28e0c460aaca8e11b6b4c3b3" exitCode=0 Mar 10 07:32:35 crc kubenswrapper[4825]: I0310 07:32:35.514416 4825 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nljlr" event={"ID":"6f344ea8-2817-419d-85a9-7c119f8663e2","Type":"ContainerDied","Data":"8f7b478877ac94484babe1af7e31c3524dc7d48e28e0c460aaca8e11b6b4c3b3"} Mar 10 07:32:36 crc kubenswrapper[4825]: I0310 07:32:36.529598 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nljlr" event={"ID":"6f344ea8-2817-419d-85a9-7c119f8663e2","Type":"ContainerStarted","Data":"f97f3b420c8ef0dc28ad777de48dd5045db15b5ce3db2ae1722a2b6ec2124ef0"} Mar 10 07:32:36 crc kubenswrapper[4825]: I0310 07:32:36.562764 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nljlr" podStartSLOduration=3.087028729 podStartE2EDuration="5.562737374s" podCreationTimestamp="2026-03-10 07:32:31 +0000 UTC" firstStartedPulling="2026-03-10 07:32:33.478774442 +0000 UTC m=+2906.508555087" lastFinishedPulling="2026-03-10 07:32:35.954483077 +0000 UTC m=+2908.984263732" observedRunningTime="2026-03-10 07:32:36.554907289 +0000 UTC m=+2909.584687984" watchObservedRunningTime="2026-03-10 07:32:36.562737374 +0000 UTC m=+2909.592518029" Mar 10 07:32:38 crc kubenswrapper[4825]: I0310 07:32:38.416625 4825 scope.go:117] "RemoveContainer" containerID="701e87aa8ae4514e278b86d9059d1c82f25c29fa31a3dfa1df15852205d4f0e3" Mar 10 07:32:39 crc kubenswrapper[4825]: I0310 07:32:39.896083 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kw7kx"] Mar 10 07:32:39 crc kubenswrapper[4825]: I0310 07:32:39.899550 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kw7kx" Mar 10 07:32:39 crc kubenswrapper[4825]: I0310 07:32:39.913191 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kw7kx"] Mar 10 07:32:40 crc kubenswrapper[4825]: I0310 07:32:40.006922 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47c81833-1608-41ca-8be0-44b3170d3e4f-utilities\") pod \"community-operators-kw7kx\" (UID: \"47c81833-1608-41ca-8be0-44b3170d3e4f\") " pod="openshift-marketplace/community-operators-kw7kx" Mar 10 07:32:40 crc kubenswrapper[4825]: I0310 07:32:40.007006 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47c81833-1608-41ca-8be0-44b3170d3e4f-catalog-content\") pod \"community-operators-kw7kx\" (UID: \"47c81833-1608-41ca-8be0-44b3170d3e4f\") " pod="openshift-marketplace/community-operators-kw7kx" Mar 10 07:32:40 crc kubenswrapper[4825]: I0310 07:32:40.007085 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9mp4\" (UniqueName: \"kubernetes.io/projected/47c81833-1608-41ca-8be0-44b3170d3e4f-kube-api-access-q9mp4\") pod \"community-operators-kw7kx\" (UID: \"47c81833-1608-41ca-8be0-44b3170d3e4f\") " pod="openshift-marketplace/community-operators-kw7kx" Mar 10 07:32:40 crc kubenswrapper[4825]: I0310 07:32:40.108106 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47c81833-1608-41ca-8be0-44b3170d3e4f-utilities\") pod \"community-operators-kw7kx\" (UID: \"47c81833-1608-41ca-8be0-44b3170d3e4f\") " pod="openshift-marketplace/community-operators-kw7kx" Mar 10 07:32:40 crc kubenswrapper[4825]: I0310 07:32:40.108221 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47c81833-1608-41ca-8be0-44b3170d3e4f-catalog-content\") pod \"community-operators-kw7kx\" (UID: \"47c81833-1608-41ca-8be0-44b3170d3e4f\") " pod="openshift-marketplace/community-operators-kw7kx" Mar 10 07:32:40 crc kubenswrapper[4825]: I0310 07:32:40.108276 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9mp4\" (UniqueName: \"kubernetes.io/projected/47c81833-1608-41ca-8be0-44b3170d3e4f-kube-api-access-q9mp4\") pod \"community-operators-kw7kx\" (UID: \"47c81833-1608-41ca-8be0-44b3170d3e4f\") " pod="openshift-marketplace/community-operators-kw7kx" Mar 10 07:32:40 crc kubenswrapper[4825]: I0310 07:32:40.108875 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47c81833-1608-41ca-8be0-44b3170d3e4f-utilities\") pod \"community-operators-kw7kx\" (UID: \"47c81833-1608-41ca-8be0-44b3170d3e4f\") " pod="openshift-marketplace/community-operators-kw7kx" Mar 10 07:32:40 crc kubenswrapper[4825]: I0310 07:32:40.109010 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47c81833-1608-41ca-8be0-44b3170d3e4f-catalog-content\") pod \"community-operators-kw7kx\" (UID: \"47c81833-1608-41ca-8be0-44b3170d3e4f\") " pod="openshift-marketplace/community-operators-kw7kx" Mar 10 07:32:40 crc kubenswrapper[4825]: I0310 07:32:40.134515 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9mp4\" (UniqueName: \"kubernetes.io/projected/47c81833-1608-41ca-8be0-44b3170d3e4f-kube-api-access-q9mp4\") pod \"community-operators-kw7kx\" (UID: \"47c81833-1608-41ca-8be0-44b3170d3e4f\") " pod="openshift-marketplace/community-operators-kw7kx" Mar 10 07:32:40 crc kubenswrapper[4825]: I0310 07:32:40.220804 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kw7kx" Mar 10 07:32:40 crc kubenswrapper[4825]: W0310 07:32:40.731727 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47c81833_1608_41ca_8be0_44b3170d3e4f.slice/crio-92bab67e835ca957bbcab2f0ff42f73fa7f30c70cf57f85a98e4061635051bd4 WatchSource:0}: Error finding container 92bab67e835ca957bbcab2f0ff42f73fa7f30c70cf57f85a98e4061635051bd4: Status 404 returned error can't find the container with id 92bab67e835ca957bbcab2f0ff42f73fa7f30c70cf57f85a98e4061635051bd4 Mar 10 07:32:40 crc kubenswrapper[4825]: I0310 07:32:40.740729 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kw7kx"] Mar 10 07:32:41 crc kubenswrapper[4825]: I0310 07:32:41.578994 4825 generic.go:334] "Generic (PLEG): container finished" podID="47c81833-1608-41ca-8be0-44b3170d3e4f" containerID="e7271e4a5e0ca00045d24d30b1bdb0db3e2680cab6a4d142f1014e9e57620407" exitCode=0 Mar 10 07:32:41 crc kubenswrapper[4825]: I0310 07:32:41.579065 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kw7kx" event={"ID":"47c81833-1608-41ca-8be0-44b3170d3e4f","Type":"ContainerDied","Data":"e7271e4a5e0ca00045d24d30b1bdb0db3e2680cab6a4d142f1014e9e57620407"} Mar 10 07:32:41 crc kubenswrapper[4825]: I0310 07:32:41.579110 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kw7kx" event={"ID":"47c81833-1608-41ca-8be0-44b3170d3e4f","Type":"ContainerStarted","Data":"92bab67e835ca957bbcab2f0ff42f73fa7f30c70cf57f85a98e4061635051bd4"} Mar 10 07:32:41 crc kubenswrapper[4825]: I0310 07:32:41.842780 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nljlr" Mar 10 07:32:41 crc kubenswrapper[4825]: I0310 07:32:41.842878 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nljlr" Mar 10 07:32:41 crc kubenswrapper[4825]: I0310 07:32:41.931864 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nljlr" Mar 10 07:32:42 crc kubenswrapper[4825]: I0310 07:32:42.662856 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nljlr" Mar 10 07:32:43 crc kubenswrapper[4825]: I0310 07:32:43.604967 4825 generic.go:334] "Generic (PLEG): container finished" podID="47c81833-1608-41ca-8be0-44b3170d3e4f" containerID="78286af40489c7bccb817ec6453133617f8bba1937121e3ca8468b192f75299f" exitCode=0 Mar 10 07:32:43 crc kubenswrapper[4825]: I0310 07:32:43.605185 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kw7kx" event={"ID":"47c81833-1608-41ca-8be0-44b3170d3e4f","Type":"ContainerDied","Data":"78286af40489c7bccb817ec6453133617f8bba1937121e3ca8468b192f75299f"} Mar 10 07:32:44 crc kubenswrapper[4825]: I0310 07:32:44.275205 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nljlr"] Mar 10 07:32:44 crc kubenswrapper[4825]: I0310 07:32:44.630511 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kw7kx" event={"ID":"47c81833-1608-41ca-8be0-44b3170d3e4f","Type":"ContainerStarted","Data":"e2d33ec80348fe0caec94b41de18ba53b95cb30246e70d2223c7bfad6063754a"} Mar 10 07:32:44 crc kubenswrapper[4825]: I0310 07:32:44.630706 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nljlr" podUID="6f344ea8-2817-419d-85a9-7c119f8663e2" containerName="registry-server" containerID="cri-o://f97f3b420c8ef0dc28ad777de48dd5045db15b5ce3db2ae1722a2b6ec2124ef0" gracePeriod=2 Mar 10 07:32:44 crc kubenswrapper[4825]: I0310 07:32:44.660238 4825 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-marketplace/community-operators-kw7kx" podStartSLOduration=3.135114331 podStartE2EDuration="5.660208428s" podCreationTimestamp="2026-03-10 07:32:39 +0000 UTC" firstStartedPulling="2026-03-10 07:32:41.581675281 +0000 UTC m=+2914.611455936" lastFinishedPulling="2026-03-10 07:32:44.106769388 +0000 UTC m=+2917.136550033" observedRunningTime="2026-03-10 07:32:44.657956929 +0000 UTC m=+2917.687737584" watchObservedRunningTime="2026-03-10 07:32:44.660208428 +0000 UTC m=+2917.689989053" Mar 10 07:32:45 crc kubenswrapper[4825]: I0310 07:32:45.041038 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nljlr" Mar 10 07:32:45 crc kubenswrapper[4825]: I0310 07:32:45.089067 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwcch\" (UniqueName: \"kubernetes.io/projected/6f344ea8-2817-419d-85a9-7c119f8663e2-kube-api-access-nwcch\") pod \"6f344ea8-2817-419d-85a9-7c119f8663e2\" (UID: \"6f344ea8-2817-419d-85a9-7c119f8663e2\") " Mar 10 07:32:45 crc kubenswrapper[4825]: I0310 07:32:45.089185 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f344ea8-2817-419d-85a9-7c119f8663e2-utilities\") pod \"6f344ea8-2817-419d-85a9-7c119f8663e2\" (UID: \"6f344ea8-2817-419d-85a9-7c119f8663e2\") " Mar 10 07:32:45 crc kubenswrapper[4825]: I0310 07:32:45.089213 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f344ea8-2817-419d-85a9-7c119f8663e2-catalog-content\") pod \"6f344ea8-2817-419d-85a9-7c119f8663e2\" (UID: \"6f344ea8-2817-419d-85a9-7c119f8663e2\") " Mar 10 07:32:45 crc kubenswrapper[4825]: I0310 07:32:45.089984 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/6f344ea8-2817-419d-85a9-7c119f8663e2-utilities" (OuterVolumeSpecName: "utilities") pod "6f344ea8-2817-419d-85a9-7c119f8663e2" (UID: "6f344ea8-2817-419d-85a9-7c119f8663e2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:32:45 crc kubenswrapper[4825]: I0310 07:32:45.099905 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f344ea8-2817-419d-85a9-7c119f8663e2-kube-api-access-nwcch" (OuterVolumeSpecName: "kube-api-access-nwcch") pod "6f344ea8-2817-419d-85a9-7c119f8663e2" (UID: "6f344ea8-2817-419d-85a9-7c119f8663e2"). InnerVolumeSpecName "kube-api-access-nwcch". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:32:45 crc kubenswrapper[4825]: I0310 07:32:45.166686 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f344ea8-2817-419d-85a9-7c119f8663e2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6f344ea8-2817-419d-85a9-7c119f8663e2" (UID: "6f344ea8-2817-419d-85a9-7c119f8663e2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:32:45 crc kubenswrapper[4825]: I0310 07:32:45.191054 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwcch\" (UniqueName: \"kubernetes.io/projected/6f344ea8-2817-419d-85a9-7c119f8663e2-kube-api-access-nwcch\") on node \"crc\" DevicePath \"\"" Mar 10 07:32:45 crc kubenswrapper[4825]: I0310 07:32:45.191089 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f344ea8-2817-419d-85a9-7c119f8663e2-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 07:32:45 crc kubenswrapper[4825]: I0310 07:32:45.191100 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f344ea8-2817-419d-85a9-7c119f8663e2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 07:32:45 crc kubenswrapper[4825]: I0310 07:32:45.643934 4825 generic.go:334] "Generic (PLEG): container finished" podID="6f344ea8-2817-419d-85a9-7c119f8663e2" containerID="f97f3b420c8ef0dc28ad777de48dd5045db15b5ce3db2ae1722a2b6ec2124ef0" exitCode=0 Mar 10 07:32:45 crc kubenswrapper[4825]: I0310 07:32:45.644068 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nljlr" Mar 10 07:32:45 crc kubenswrapper[4825]: I0310 07:32:45.644197 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nljlr" event={"ID":"6f344ea8-2817-419d-85a9-7c119f8663e2","Type":"ContainerDied","Data":"f97f3b420c8ef0dc28ad777de48dd5045db15b5ce3db2ae1722a2b6ec2124ef0"} Mar 10 07:32:45 crc kubenswrapper[4825]: I0310 07:32:45.644257 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nljlr" event={"ID":"6f344ea8-2817-419d-85a9-7c119f8663e2","Type":"ContainerDied","Data":"a019b7c15b7173b7b6c39df4a38d6030992e8c0b1c9aeb5df9402e9ea93e2c3f"} Mar 10 07:32:45 crc kubenswrapper[4825]: I0310 07:32:45.644295 4825 scope.go:117] "RemoveContainer" containerID="f97f3b420c8ef0dc28ad777de48dd5045db15b5ce3db2ae1722a2b6ec2124ef0" Mar 10 07:32:45 crc kubenswrapper[4825]: I0310 07:32:45.669471 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nljlr"] Mar 10 07:32:45 crc kubenswrapper[4825]: I0310 07:32:45.675257 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nljlr"] Mar 10 07:32:45 crc kubenswrapper[4825]: I0310 07:32:45.678981 4825 scope.go:117] "RemoveContainer" containerID="8f7b478877ac94484babe1af7e31c3524dc7d48e28e0c460aaca8e11b6b4c3b3" Mar 10 07:32:45 crc kubenswrapper[4825]: I0310 07:32:45.699004 4825 scope.go:117] "RemoveContainer" containerID="a91747082f917caf08d7ba673f3288ee902dcda4b849fead49dbe8829c1f90c4" Mar 10 07:32:45 crc kubenswrapper[4825]: I0310 07:32:45.736583 4825 scope.go:117] "RemoveContainer" containerID="f97f3b420c8ef0dc28ad777de48dd5045db15b5ce3db2ae1722a2b6ec2124ef0" Mar 10 07:32:45 crc kubenswrapper[4825]: E0310 07:32:45.737219 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f97f3b420c8ef0dc28ad777de48dd5045db15b5ce3db2ae1722a2b6ec2124ef0\": container with ID starting with f97f3b420c8ef0dc28ad777de48dd5045db15b5ce3db2ae1722a2b6ec2124ef0 not found: ID does not exist" containerID="f97f3b420c8ef0dc28ad777de48dd5045db15b5ce3db2ae1722a2b6ec2124ef0" Mar 10 07:32:45 crc kubenswrapper[4825]: I0310 07:32:45.737262 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f97f3b420c8ef0dc28ad777de48dd5045db15b5ce3db2ae1722a2b6ec2124ef0"} err="failed to get container status \"f97f3b420c8ef0dc28ad777de48dd5045db15b5ce3db2ae1722a2b6ec2124ef0\": rpc error: code = NotFound desc = could not find container \"f97f3b420c8ef0dc28ad777de48dd5045db15b5ce3db2ae1722a2b6ec2124ef0\": container with ID starting with f97f3b420c8ef0dc28ad777de48dd5045db15b5ce3db2ae1722a2b6ec2124ef0 not found: ID does not exist" Mar 10 07:32:45 crc kubenswrapper[4825]: I0310 07:32:45.737287 4825 scope.go:117] "RemoveContainer" containerID="8f7b478877ac94484babe1af7e31c3524dc7d48e28e0c460aaca8e11b6b4c3b3" Mar 10 07:32:45 crc kubenswrapper[4825]: E0310 07:32:45.738100 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f7b478877ac94484babe1af7e31c3524dc7d48e28e0c460aaca8e11b6b4c3b3\": container with ID starting with 8f7b478877ac94484babe1af7e31c3524dc7d48e28e0c460aaca8e11b6b4c3b3 not found: ID does not exist" containerID="8f7b478877ac94484babe1af7e31c3524dc7d48e28e0c460aaca8e11b6b4c3b3" Mar 10 07:32:45 crc kubenswrapper[4825]: I0310 07:32:45.738129 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f7b478877ac94484babe1af7e31c3524dc7d48e28e0c460aaca8e11b6b4c3b3"} err="failed to get container status \"8f7b478877ac94484babe1af7e31c3524dc7d48e28e0c460aaca8e11b6b4c3b3\": rpc error: code = NotFound desc = could not find container \"8f7b478877ac94484babe1af7e31c3524dc7d48e28e0c460aaca8e11b6b4c3b3\": container with ID 
starting with 8f7b478877ac94484babe1af7e31c3524dc7d48e28e0c460aaca8e11b6b4c3b3 not found: ID does not exist" Mar 10 07:32:45 crc kubenswrapper[4825]: I0310 07:32:45.738164 4825 scope.go:117] "RemoveContainer" containerID="a91747082f917caf08d7ba673f3288ee902dcda4b849fead49dbe8829c1f90c4" Mar 10 07:32:45 crc kubenswrapper[4825]: E0310 07:32:45.738574 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a91747082f917caf08d7ba673f3288ee902dcda4b849fead49dbe8829c1f90c4\": container with ID starting with a91747082f917caf08d7ba673f3288ee902dcda4b849fead49dbe8829c1f90c4 not found: ID does not exist" containerID="a91747082f917caf08d7ba673f3288ee902dcda4b849fead49dbe8829c1f90c4" Mar 10 07:32:45 crc kubenswrapper[4825]: I0310 07:32:45.738598 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a91747082f917caf08d7ba673f3288ee902dcda4b849fead49dbe8829c1f90c4"} err="failed to get container status \"a91747082f917caf08d7ba673f3288ee902dcda4b849fead49dbe8829c1f90c4\": rpc error: code = NotFound desc = could not find container \"a91747082f917caf08d7ba673f3288ee902dcda4b849fead49dbe8829c1f90c4\": container with ID starting with a91747082f917caf08d7ba673f3288ee902dcda4b849fead49dbe8829c1f90c4 not found: ID does not exist" Mar 10 07:32:47 crc kubenswrapper[4825]: I0310 07:32:47.253652 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f344ea8-2817-419d-85a9-7c119f8663e2" path="/var/lib/kubelet/pods/6f344ea8-2817-419d-85a9-7c119f8663e2/volumes" Mar 10 07:32:50 crc kubenswrapper[4825]: I0310 07:32:50.221913 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kw7kx" Mar 10 07:32:50 crc kubenswrapper[4825]: I0310 07:32:50.222618 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kw7kx" Mar 10 07:32:50 crc 
kubenswrapper[4825]: I0310 07:32:50.302623 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kw7kx" Mar 10 07:32:50 crc kubenswrapper[4825]: I0310 07:32:50.792940 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kw7kx" Mar 10 07:32:50 crc kubenswrapper[4825]: I0310 07:32:50.836754 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kw7kx"] Mar 10 07:32:52 crc kubenswrapper[4825]: I0310 07:32:52.704232 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kw7kx" podUID="47c81833-1608-41ca-8be0-44b3170d3e4f" containerName="registry-server" containerID="cri-o://e2d33ec80348fe0caec94b41de18ba53b95cb30246e70d2223c7bfad6063754a" gracePeriod=2 Mar 10 07:32:53 crc kubenswrapper[4825]: I0310 07:32:53.219658 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kw7kx" Mar 10 07:32:53 crc kubenswrapper[4825]: I0310 07:32:53.327194 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47c81833-1608-41ca-8be0-44b3170d3e4f-catalog-content\") pod \"47c81833-1608-41ca-8be0-44b3170d3e4f\" (UID: \"47c81833-1608-41ca-8be0-44b3170d3e4f\") " Mar 10 07:32:53 crc kubenswrapper[4825]: I0310 07:32:53.327423 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9mp4\" (UniqueName: \"kubernetes.io/projected/47c81833-1608-41ca-8be0-44b3170d3e4f-kube-api-access-q9mp4\") pod \"47c81833-1608-41ca-8be0-44b3170d3e4f\" (UID: \"47c81833-1608-41ca-8be0-44b3170d3e4f\") " Mar 10 07:32:53 crc kubenswrapper[4825]: I0310 07:32:53.327449 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47c81833-1608-41ca-8be0-44b3170d3e4f-utilities\") pod \"47c81833-1608-41ca-8be0-44b3170d3e4f\" (UID: \"47c81833-1608-41ca-8be0-44b3170d3e4f\") " Mar 10 07:32:53 crc kubenswrapper[4825]: I0310 07:32:53.328576 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47c81833-1608-41ca-8be0-44b3170d3e4f-utilities" (OuterVolumeSpecName: "utilities") pod "47c81833-1608-41ca-8be0-44b3170d3e4f" (UID: "47c81833-1608-41ca-8be0-44b3170d3e4f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:32:53 crc kubenswrapper[4825]: I0310 07:32:53.334278 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47c81833-1608-41ca-8be0-44b3170d3e4f-kube-api-access-q9mp4" (OuterVolumeSpecName: "kube-api-access-q9mp4") pod "47c81833-1608-41ca-8be0-44b3170d3e4f" (UID: "47c81833-1608-41ca-8be0-44b3170d3e4f"). InnerVolumeSpecName "kube-api-access-q9mp4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:32:53 crc kubenswrapper[4825]: I0310 07:32:53.421744 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47c81833-1608-41ca-8be0-44b3170d3e4f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "47c81833-1608-41ca-8be0-44b3170d3e4f" (UID: "47c81833-1608-41ca-8be0-44b3170d3e4f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:32:53 crc kubenswrapper[4825]: I0310 07:32:53.429281 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9mp4\" (UniqueName: \"kubernetes.io/projected/47c81833-1608-41ca-8be0-44b3170d3e4f-kube-api-access-q9mp4\") on node \"crc\" DevicePath \"\"" Mar 10 07:32:53 crc kubenswrapper[4825]: I0310 07:32:53.429332 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47c81833-1608-41ca-8be0-44b3170d3e4f-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 07:32:53 crc kubenswrapper[4825]: I0310 07:32:53.429349 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47c81833-1608-41ca-8be0-44b3170d3e4f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 07:32:53 crc kubenswrapper[4825]: I0310 07:32:53.717108 4825 generic.go:334] "Generic (PLEG): container finished" podID="47c81833-1608-41ca-8be0-44b3170d3e4f" containerID="e2d33ec80348fe0caec94b41de18ba53b95cb30246e70d2223c7bfad6063754a" exitCode=0 Mar 10 07:32:53 crc kubenswrapper[4825]: I0310 07:32:53.717227 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kw7kx" Mar 10 07:32:53 crc kubenswrapper[4825]: I0310 07:32:53.717215 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kw7kx" event={"ID":"47c81833-1608-41ca-8be0-44b3170d3e4f","Type":"ContainerDied","Data":"e2d33ec80348fe0caec94b41de18ba53b95cb30246e70d2223c7bfad6063754a"} Mar 10 07:32:53 crc kubenswrapper[4825]: I0310 07:32:53.718264 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kw7kx" event={"ID":"47c81833-1608-41ca-8be0-44b3170d3e4f","Type":"ContainerDied","Data":"92bab67e835ca957bbcab2f0ff42f73fa7f30c70cf57f85a98e4061635051bd4"} Mar 10 07:32:53 crc kubenswrapper[4825]: I0310 07:32:53.718294 4825 scope.go:117] "RemoveContainer" containerID="e2d33ec80348fe0caec94b41de18ba53b95cb30246e70d2223c7bfad6063754a" Mar 10 07:32:53 crc kubenswrapper[4825]: I0310 07:32:53.752216 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kw7kx"] Mar 10 07:32:53 crc kubenswrapper[4825]: I0310 07:32:53.756692 4825 scope.go:117] "RemoveContainer" containerID="78286af40489c7bccb817ec6453133617f8bba1937121e3ca8468b192f75299f" Mar 10 07:32:53 crc kubenswrapper[4825]: I0310 07:32:53.760035 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kw7kx"] Mar 10 07:32:53 crc kubenswrapper[4825]: I0310 07:32:53.783617 4825 scope.go:117] "RemoveContainer" containerID="e7271e4a5e0ca00045d24d30b1bdb0db3e2680cab6a4d142f1014e9e57620407" Mar 10 07:32:53 crc kubenswrapper[4825]: I0310 07:32:53.814095 4825 scope.go:117] "RemoveContainer" containerID="e2d33ec80348fe0caec94b41de18ba53b95cb30246e70d2223c7bfad6063754a" Mar 10 07:32:53 crc kubenswrapper[4825]: E0310 07:32:53.815115 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e2d33ec80348fe0caec94b41de18ba53b95cb30246e70d2223c7bfad6063754a\": container with ID starting with e2d33ec80348fe0caec94b41de18ba53b95cb30246e70d2223c7bfad6063754a not found: ID does not exist" containerID="e2d33ec80348fe0caec94b41de18ba53b95cb30246e70d2223c7bfad6063754a" Mar 10 07:32:53 crc kubenswrapper[4825]: I0310 07:32:53.815277 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2d33ec80348fe0caec94b41de18ba53b95cb30246e70d2223c7bfad6063754a"} err="failed to get container status \"e2d33ec80348fe0caec94b41de18ba53b95cb30246e70d2223c7bfad6063754a\": rpc error: code = NotFound desc = could not find container \"e2d33ec80348fe0caec94b41de18ba53b95cb30246e70d2223c7bfad6063754a\": container with ID starting with e2d33ec80348fe0caec94b41de18ba53b95cb30246e70d2223c7bfad6063754a not found: ID does not exist" Mar 10 07:32:53 crc kubenswrapper[4825]: I0310 07:32:53.815315 4825 scope.go:117] "RemoveContainer" containerID="78286af40489c7bccb817ec6453133617f8bba1937121e3ca8468b192f75299f" Mar 10 07:32:53 crc kubenswrapper[4825]: E0310 07:32:53.816178 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78286af40489c7bccb817ec6453133617f8bba1937121e3ca8468b192f75299f\": container with ID starting with 78286af40489c7bccb817ec6453133617f8bba1937121e3ca8468b192f75299f not found: ID does not exist" containerID="78286af40489c7bccb817ec6453133617f8bba1937121e3ca8468b192f75299f" Mar 10 07:32:53 crc kubenswrapper[4825]: I0310 07:32:53.816214 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78286af40489c7bccb817ec6453133617f8bba1937121e3ca8468b192f75299f"} err="failed to get container status \"78286af40489c7bccb817ec6453133617f8bba1937121e3ca8468b192f75299f\": rpc error: code = NotFound desc = could not find container \"78286af40489c7bccb817ec6453133617f8bba1937121e3ca8468b192f75299f\": container with ID 
starting with 78286af40489c7bccb817ec6453133617f8bba1937121e3ca8468b192f75299f not found: ID does not exist" Mar 10 07:32:53 crc kubenswrapper[4825]: I0310 07:32:53.816235 4825 scope.go:117] "RemoveContainer" containerID="e7271e4a5e0ca00045d24d30b1bdb0db3e2680cab6a4d142f1014e9e57620407" Mar 10 07:32:53 crc kubenswrapper[4825]: E0310 07:32:53.816803 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7271e4a5e0ca00045d24d30b1bdb0db3e2680cab6a4d142f1014e9e57620407\": container with ID starting with e7271e4a5e0ca00045d24d30b1bdb0db3e2680cab6a4d142f1014e9e57620407 not found: ID does not exist" containerID="e7271e4a5e0ca00045d24d30b1bdb0db3e2680cab6a4d142f1014e9e57620407" Mar 10 07:32:53 crc kubenswrapper[4825]: I0310 07:32:53.816853 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7271e4a5e0ca00045d24d30b1bdb0db3e2680cab6a4d142f1014e9e57620407"} err="failed to get container status \"e7271e4a5e0ca00045d24d30b1bdb0db3e2680cab6a4d142f1014e9e57620407\": rpc error: code = NotFound desc = could not find container \"e7271e4a5e0ca00045d24d30b1bdb0db3e2680cab6a4d142f1014e9e57620407\": container with ID starting with e7271e4a5e0ca00045d24d30b1bdb0db3e2680cab6a4d142f1014e9e57620407 not found: ID does not exist" Mar 10 07:32:55 crc kubenswrapper[4825]: I0310 07:32:55.256359 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47c81833-1608-41ca-8be0-44b3170d3e4f" path="/var/lib/kubelet/pods/47c81833-1608-41ca-8be0-44b3170d3e4f/volumes" Mar 10 07:33:16 crc kubenswrapper[4825]: I0310 07:33:16.887966 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 07:33:16 crc kubenswrapper[4825]: I0310 
07:33:16.888917 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 07:33:46 crc kubenswrapper[4825]: I0310 07:33:46.888686 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 07:33:46 crc kubenswrapper[4825]: I0310 07:33:46.889558 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 07:34:00 crc kubenswrapper[4825]: I0310 07:34:00.188102 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552134-fbgnq"] Mar 10 07:34:00 crc kubenswrapper[4825]: E0310 07:34:00.190664 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f344ea8-2817-419d-85a9-7c119f8663e2" containerName="registry-server" Mar 10 07:34:00 crc kubenswrapper[4825]: I0310 07:34:00.190697 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f344ea8-2817-419d-85a9-7c119f8663e2" containerName="registry-server" Mar 10 07:34:00 crc kubenswrapper[4825]: E0310 07:34:00.190760 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47c81833-1608-41ca-8be0-44b3170d3e4f" containerName="registry-server" Mar 10 07:34:00 crc kubenswrapper[4825]: I0310 07:34:00.190778 4825 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="47c81833-1608-41ca-8be0-44b3170d3e4f" containerName="registry-server" Mar 10 07:34:00 crc kubenswrapper[4825]: E0310 07:34:00.190799 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f344ea8-2817-419d-85a9-7c119f8663e2" containerName="extract-content" Mar 10 07:34:00 crc kubenswrapper[4825]: I0310 07:34:00.190812 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f344ea8-2817-419d-85a9-7c119f8663e2" containerName="extract-content" Mar 10 07:34:00 crc kubenswrapper[4825]: E0310 07:34:00.191797 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f344ea8-2817-419d-85a9-7c119f8663e2" containerName="extract-utilities" Mar 10 07:34:00 crc kubenswrapper[4825]: I0310 07:34:00.191820 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f344ea8-2817-419d-85a9-7c119f8663e2" containerName="extract-utilities" Mar 10 07:34:00 crc kubenswrapper[4825]: E0310 07:34:00.191900 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47c81833-1608-41ca-8be0-44b3170d3e4f" containerName="extract-utilities" Mar 10 07:34:00 crc kubenswrapper[4825]: I0310 07:34:00.191915 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="47c81833-1608-41ca-8be0-44b3170d3e4f" containerName="extract-utilities" Mar 10 07:34:00 crc kubenswrapper[4825]: E0310 07:34:00.191941 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47c81833-1608-41ca-8be0-44b3170d3e4f" containerName="extract-content" Mar 10 07:34:00 crc kubenswrapper[4825]: I0310 07:34:00.191997 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="47c81833-1608-41ca-8be0-44b3170d3e4f" containerName="extract-content" Mar 10 07:34:00 crc kubenswrapper[4825]: I0310 07:34:00.192849 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="47c81833-1608-41ca-8be0-44b3170d3e4f" containerName="registry-server" Mar 10 07:34:00 crc kubenswrapper[4825]: I0310 07:34:00.192930 4825 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="6f344ea8-2817-419d-85a9-7c119f8663e2" containerName="registry-server" Mar 10 07:34:00 crc kubenswrapper[4825]: I0310 07:34:00.195123 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552134-fbgnq" Mar 10 07:34:00 crc kubenswrapper[4825]: I0310 07:34:00.197717 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 07:34:00 crc kubenswrapper[4825]: I0310 07:34:00.198118 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 07:34:00 crc kubenswrapper[4825]: I0310 07:34:00.198610 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 07:34:00 crc kubenswrapper[4825]: I0310 07:34:00.203734 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552134-fbgnq"] Mar 10 07:34:00 crc kubenswrapper[4825]: I0310 07:34:00.233102 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7crz\" (UniqueName: \"kubernetes.io/projected/68d1588c-8009-4e52-9c04-7bdcb2cf4f3d-kube-api-access-m7crz\") pod \"auto-csr-approver-29552134-fbgnq\" (UID: \"68d1588c-8009-4e52-9c04-7bdcb2cf4f3d\") " pod="openshift-infra/auto-csr-approver-29552134-fbgnq" Mar 10 07:34:00 crc kubenswrapper[4825]: I0310 07:34:00.334580 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7crz\" (UniqueName: \"kubernetes.io/projected/68d1588c-8009-4e52-9c04-7bdcb2cf4f3d-kube-api-access-m7crz\") pod \"auto-csr-approver-29552134-fbgnq\" (UID: \"68d1588c-8009-4e52-9c04-7bdcb2cf4f3d\") " pod="openshift-infra/auto-csr-approver-29552134-fbgnq" Mar 10 07:34:00 crc kubenswrapper[4825]: I0310 07:34:00.370265 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7crz\" 
(UniqueName: \"kubernetes.io/projected/68d1588c-8009-4e52-9c04-7bdcb2cf4f3d-kube-api-access-m7crz\") pod \"auto-csr-approver-29552134-fbgnq\" (UID: \"68d1588c-8009-4e52-9c04-7bdcb2cf4f3d\") " pod="openshift-infra/auto-csr-approver-29552134-fbgnq" Mar 10 07:34:00 crc kubenswrapper[4825]: I0310 07:34:00.532939 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552134-fbgnq" Mar 10 07:34:00 crc kubenswrapper[4825]: I0310 07:34:00.816787 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552134-fbgnq"] Mar 10 07:34:00 crc kubenswrapper[4825]: I0310 07:34:00.825064 4825 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 07:34:01 crc kubenswrapper[4825]: I0310 07:34:01.367876 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552134-fbgnq" event={"ID":"68d1588c-8009-4e52-9c04-7bdcb2cf4f3d","Type":"ContainerStarted","Data":"a1931f9e7bd9190169018c5b21425b1d2f17057f5300cc0193f2c883e0c486ac"} Mar 10 07:34:02 crc kubenswrapper[4825]: I0310 07:34:02.380031 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552134-fbgnq" event={"ID":"68d1588c-8009-4e52-9c04-7bdcb2cf4f3d","Type":"ContainerStarted","Data":"fa982086fed55a11b52c5d80857ea32169bde7ed075e0313c5c22c386d754b2b"} Mar 10 07:34:02 crc kubenswrapper[4825]: I0310 07:34:02.402835 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552134-fbgnq" podStartSLOduration=1.424778489 podStartE2EDuration="2.402804627s" podCreationTimestamp="2026-03-10 07:34:00 +0000 UTC" firstStartedPulling="2026-03-10 07:34:00.824809103 +0000 UTC m=+2993.854589719" lastFinishedPulling="2026-03-10 07:34:01.802835232 +0000 UTC m=+2994.832615857" observedRunningTime="2026-03-10 07:34:02.400072195 +0000 UTC m=+2995.429852870" 
watchObservedRunningTime="2026-03-10 07:34:02.402804627 +0000 UTC m=+2995.432585292" Mar 10 07:34:03 crc kubenswrapper[4825]: I0310 07:34:03.400017 4825 generic.go:334] "Generic (PLEG): container finished" podID="68d1588c-8009-4e52-9c04-7bdcb2cf4f3d" containerID="fa982086fed55a11b52c5d80857ea32169bde7ed075e0313c5c22c386d754b2b" exitCode=0 Mar 10 07:34:03 crc kubenswrapper[4825]: I0310 07:34:03.400061 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552134-fbgnq" event={"ID":"68d1588c-8009-4e52-9c04-7bdcb2cf4f3d","Type":"ContainerDied","Data":"fa982086fed55a11b52c5d80857ea32169bde7ed075e0313c5c22c386d754b2b"} Mar 10 07:34:04 crc kubenswrapper[4825]: I0310 07:34:04.819601 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552134-fbgnq" Mar 10 07:34:04 crc kubenswrapper[4825]: I0310 07:34:04.912418 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7crz\" (UniqueName: \"kubernetes.io/projected/68d1588c-8009-4e52-9c04-7bdcb2cf4f3d-kube-api-access-m7crz\") pod \"68d1588c-8009-4e52-9c04-7bdcb2cf4f3d\" (UID: \"68d1588c-8009-4e52-9c04-7bdcb2cf4f3d\") " Mar 10 07:34:04 crc kubenswrapper[4825]: I0310 07:34:04.920072 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68d1588c-8009-4e52-9c04-7bdcb2cf4f3d-kube-api-access-m7crz" (OuterVolumeSpecName: "kube-api-access-m7crz") pod "68d1588c-8009-4e52-9c04-7bdcb2cf4f3d" (UID: "68d1588c-8009-4e52-9c04-7bdcb2cf4f3d"). InnerVolumeSpecName "kube-api-access-m7crz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:34:05 crc kubenswrapper[4825]: I0310 07:34:05.014729 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7crz\" (UniqueName: \"kubernetes.io/projected/68d1588c-8009-4e52-9c04-7bdcb2cf4f3d-kube-api-access-m7crz\") on node \"crc\" DevicePath \"\"" Mar 10 07:34:05 crc kubenswrapper[4825]: I0310 07:34:05.424198 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552134-fbgnq" event={"ID":"68d1588c-8009-4e52-9c04-7bdcb2cf4f3d","Type":"ContainerDied","Data":"a1931f9e7bd9190169018c5b21425b1d2f17057f5300cc0193f2c883e0c486ac"} Mar 10 07:34:05 crc kubenswrapper[4825]: I0310 07:34:05.424561 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1931f9e7bd9190169018c5b21425b1d2f17057f5300cc0193f2c883e0c486ac" Mar 10 07:34:05 crc kubenswrapper[4825]: I0310 07:34:05.424339 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552134-fbgnq" Mar 10 07:34:05 crc kubenswrapper[4825]: I0310 07:34:05.480348 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552128-995sq"] Mar 10 07:34:05 crc kubenswrapper[4825]: I0310 07:34:05.488333 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552128-995sq"] Mar 10 07:34:07 crc kubenswrapper[4825]: I0310 07:34:07.254823 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc171a02-974b-4e0b-9ef7-57b5f717d951" path="/var/lib/kubelet/pods/fc171a02-974b-4e0b-9ef7-57b5f717d951/volumes" Mar 10 07:34:16 crc kubenswrapper[4825]: I0310 07:34:16.888417 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 10 07:34:16 crc kubenswrapper[4825]: I0310 07:34:16.888894 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 07:34:16 crc kubenswrapper[4825]: I0310 07:34:16.888938 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" Mar 10 07:34:16 crc kubenswrapper[4825]: I0310 07:34:16.889641 4825 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f14d76b81593b08c433fbe6d8eff0a69ee39bc5bc0cae738058b3fb2fd863827"} pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 07:34:16 crc kubenswrapper[4825]: I0310 07:34:16.889702 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" containerID="cri-o://f14d76b81593b08c433fbe6d8eff0a69ee39bc5bc0cae738058b3fb2fd863827" gracePeriod=600 Mar 10 07:34:17 crc kubenswrapper[4825]: I0310 07:34:17.547999 4825 generic.go:334] "Generic (PLEG): container finished" podID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerID="f14d76b81593b08c433fbe6d8eff0a69ee39bc5bc0cae738058b3fb2fd863827" exitCode=0 Mar 10 07:34:17 crc kubenswrapper[4825]: I0310 07:34:17.548088 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" 
event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerDied","Data":"f14d76b81593b08c433fbe6d8eff0a69ee39bc5bc0cae738058b3fb2fd863827"} Mar 10 07:34:17 crc kubenswrapper[4825]: I0310 07:34:17.548608 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerStarted","Data":"ba48065b56abc01a0a946c6743e29fc19acf3581c643245f501b00640a86d953"} Mar 10 07:34:17 crc kubenswrapper[4825]: I0310 07:34:17.548655 4825 scope.go:117] "RemoveContainer" containerID="d1fa67187f8ea0c80a9c36034a9ddfdf1c0431ab2c7d3c4059f27045f96a6b45" Mar 10 07:34:38 crc kubenswrapper[4825]: I0310 07:34:38.582510 4825 scope.go:117] "RemoveContainer" containerID="61aa0cd1976baefe8750dc02484d1b9b3790629b9d474bb388426d8a460c7b4f" Mar 10 07:36:00 crc kubenswrapper[4825]: I0310 07:36:00.160404 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552136-c4scj"] Mar 10 07:36:00 crc kubenswrapper[4825]: E0310 07:36:00.169261 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68d1588c-8009-4e52-9c04-7bdcb2cf4f3d" containerName="oc" Mar 10 07:36:00 crc kubenswrapper[4825]: I0310 07:36:00.169306 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="68d1588c-8009-4e52-9c04-7bdcb2cf4f3d" containerName="oc" Mar 10 07:36:00 crc kubenswrapper[4825]: I0310 07:36:00.171040 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="68d1588c-8009-4e52-9c04-7bdcb2cf4f3d" containerName="oc" Mar 10 07:36:00 crc kubenswrapper[4825]: I0310 07:36:00.172928 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552136-c4scj" Mar 10 07:36:00 crc kubenswrapper[4825]: I0310 07:36:00.174586 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552136-c4scj"] Mar 10 07:36:00 crc kubenswrapper[4825]: I0310 07:36:00.177171 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 07:36:00 crc kubenswrapper[4825]: I0310 07:36:00.178211 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 07:36:00 crc kubenswrapper[4825]: I0310 07:36:00.179490 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 07:36:00 crc kubenswrapper[4825]: I0310 07:36:00.317383 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvjs8\" (UniqueName: \"kubernetes.io/projected/23621e1d-1931-416a-9d42-89391322cd1b-kube-api-access-qvjs8\") pod \"auto-csr-approver-29552136-c4scj\" (UID: \"23621e1d-1931-416a-9d42-89391322cd1b\") " pod="openshift-infra/auto-csr-approver-29552136-c4scj" Mar 10 07:36:00 crc kubenswrapper[4825]: I0310 07:36:00.419114 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvjs8\" (UniqueName: \"kubernetes.io/projected/23621e1d-1931-416a-9d42-89391322cd1b-kube-api-access-qvjs8\") pod \"auto-csr-approver-29552136-c4scj\" (UID: \"23621e1d-1931-416a-9d42-89391322cd1b\") " pod="openshift-infra/auto-csr-approver-29552136-c4scj" Mar 10 07:36:00 crc kubenswrapper[4825]: I0310 07:36:00.437914 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvjs8\" (UniqueName: \"kubernetes.io/projected/23621e1d-1931-416a-9d42-89391322cd1b-kube-api-access-qvjs8\") pod \"auto-csr-approver-29552136-c4scj\" (UID: \"23621e1d-1931-416a-9d42-89391322cd1b\") " 
pod="openshift-infra/auto-csr-approver-29552136-c4scj" Mar 10 07:36:00 crc kubenswrapper[4825]: I0310 07:36:00.514195 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552136-c4scj" Mar 10 07:36:00 crc kubenswrapper[4825]: I0310 07:36:00.761025 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552136-c4scj"] Mar 10 07:36:01 crc kubenswrapper[4825]: I0310 07:36:01.562183 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552136-c4scj" event={"ID":"23621e1d-1931-416a-9d42-89391322cd1b","Type":"ContainerStarted","Data":"2804ea731dd43b1fd296e231fb573adf90c5644514ccd6c3d71c92b72ffdacda"} Mar 10 07:36:02 crc kubenswrapper[4825]: I0310 07:36:02.573372 4825 generic.go:334] "Generic (PLEG): container finished" podID="23621e1d-1931-416a-9d42-89391322cd1b" containerID="cd36a0e95b222af021c1052458f6d919260a546438ec548c7c56071465524683" exitCode=0 Mar 10 07:36:02 crc kubenswrapper[4825]: I0310 07:36:02.573438 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552136-c4scj" event={"ID":"23621e1d-1931-416a-9d42-89391322cd1b","Type":"ContainerDied","Data":"cd36a0e95b222af021c1052458f6d919260a546438ec548c7c56071465524683"} Mar 10 07:36:03 crc kubenswrapper[4825]: I0310 07:36:03.936496 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552136-c4scj" Mar 10 07:36:03 crc kubenswrapper[4825]: I0310 07:36:03.977036 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvjs8\" (UniqueName: \"kubernetes.io/projected/23621e1d-1931-416a-9d42-89391322cd1b-kube-api-access-qvjs8\") pod \"23621e1d-1931-416a-9d42-89391322cd1b\" (UID: \"23621e1d-1931-416a-9d42-89391322cd1b\") " Mar 10 07:36:03 crc kubenswrapper[4825]: I0310 07:36:03.991302 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23621e1d-1931-416a-9d42-89391322cd1b-kube-api-access-qvjs8" (OuterVolumeSpecName: "kube-api-access-qvjs8") pod "23621e1d-1931-416a-9d42-89391322cd1b" (UID: "23621e1d-1931-416a-9d42-89391322cd1b"). InnerVolumeSpecName "kube-api-access-qvjs8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:36:04 crc kubenswrapper[4825]: I0310 07:36:04.079576 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvjs8\" (UniqueName: \"kubernetes.io/projected/23621e1d-1931-416a-9d42-89391322cd1b-kube-api-access-qvjs8\") on node \"crc\" DevicePath \"\"" Mar 10 07:36:04 crc kubenswrapper[4825]: I0310 07:36:04.594004 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552136-c4scj" event={"ID":"23621e1d-1931-416a-9d42-89391322cd1b","Type":"ContainerDied","Data":"2804ea731dd43b1fd296e231fb573adf90c5644514ccd6c3d71c92b72ffdacda"} Mar 10 07:36:04 crc kubenswrapper[4825]: I0310 07:36:04.594052 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2804ea731dd43b1fd296e231fb573adf90c5644514ccd6c3d71c92b72ffdacda" Mar 10 07:36:04 crc kubenswrapper[4825]: I0310 07:36:04.594080 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552136-c4scj" Mar 10 07:36:05 crc kubenswrapper[4825]: I0310 07:36:05.032969 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552130-vn76r"] Mar 10 07:36:05 crc kubenswrapper[4825]: I0310 07:36:05.044202 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552130-vn76r"] Mar 10 07:36:05 crc kubenswrapper[4825]: I0310 07:36:05.258995 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c439fd5-bf80-4e42-b8b4-e61f6b3f7971" path="/var/lib/kubelet/pods/4c439fd5-bf80-4e42-b8b4-e61f6b3f7971/volumes" Mar 10 07:36:38 crc kubenswrapper[4825]: I0310 07:36:38.694912 4825 scope.go:117] "RemoveContainer" containerID="896b05720a80224f2f9e1c613e5479bc6419911a72ca26f72507bd440f8456a6" Mar 10 07:36:46 crc kubenswrapper[4825]: I0310 07:36:46.887915 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 07:36:46 crc kubenswrapper[4825]: I0310 07:36:46.888734 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 07:37:16 crc kubenswrapper[4825]: I0310 07:37:16.887880 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 07:37:16 crc kubenswrapper[4825]: 
I0310 07:37:16.888443 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 07:37:46 crc kubenswrapper[4825]: I0310 07:37:46.888428 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 07:37:46 crc kubenswrapper[4825]: I0310 07:37:46.889423 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 07:37:46 crc kubenswrapper[4825]: I0310 07:37:46.889493 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" Mar 10 07:37:46 crc kubenswrapper[4825]: I0310 07:37:46.890560 4825 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ba48065b56abc01a0a946c6743e29fc19acf3581c643245f501b00640a86d953"} pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 07:37:46 crc kubenswrapper[4825]: I0310 07:37:46.890717 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" 
containerName="machine-config-daemon" containerID="cri-o://ba48065b56abc01a0a946c6743e29fc19acf3581c643245f501b00640a86d953" gracePeriod=600 Mar 10 07:37:47 crc kubenswrapper[4825]: E0310 07:37:47.020508 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:37:47 crc kubenswrapper[4825]: I0310 07:37:47.530347 4825 generic.go:334] "Generic (PLEG): container finished" podID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerID="ba48065b56abc01a0a946c6743e29fc19acf3581c643245f501b00640a86d953" exitCode=0 Mar 10 07:37:47 crc kubenswrapper[4825]: I0310 07:37:47.530407 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerDied","Data":"ba48065b56abc01a0a946c6743e29fc19acf3581c643245f501b00640a86d953"} Mar 10 07:37:47 crc kubenswrapper[4825]: I0310 07:37:47.530458 4825 scope.go:117] "RemoveContainer" containerID="f14d76b81593b08c433fbe6d8eff0a69ee39bc5bc0cae738058b3fb2fd863827" Mar 10 07:37:47 crc kubenswrapper[4825]: I0310 07:37:47.531358 4825 scope.go:117] "RemoveContainer" containerID="ba48065b56abc01a0a946c6743e29fc19acf3581c643245f501b00640a86d953" Mar 10 07:37:47 crc kubenswrapper[4825]: E0310 07:37:47.531997 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:38:00 crc kubenswrapper[4825]: I0310 07:38:00.212485 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552138-gm4zd"] Mar 10 07:38:00 crc kubenswrapper[4825]: E0310 07:38:00.213417 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23621e1d-1931-416a-9d42-89391322cd1b" containerName="oc" Mar 10 07:38:00 crc kubenswrapper[4825]: I0310 07:38:00.213436 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="23621e1d-1931-416a-9d42-89391322cd1b" containerName="oc" Mar 10 07:38:00 crc kubenswrapper[4825]: I0310 07:38:00.213601 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="23621e1d-1931-416a-9d42-89391322cd1b" containerName="oc" Mar 10 07:38:00 crc kubenswrapper[4825]: I0310 07:38:00.214215 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552138-gm4zd" Mar 10 07:38:00 crc kubenswrapper[4825]: I0310 07:38:00.221411 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 07:38:00 crc kubenswrapper[4825]: I0310 07:38:00.221594 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 07:38:00 crc kubenswrapper[4825]: I0310 07:38:00.221536 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 07:38:00 crc kubenswrapper[4825]: I0310 07:38:00.224423 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552138-gm4zd"] Mar 10 07:38:00 crc kubenswrapper[4825]: I0310 07:38:00.263235 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n75fn\" (UniqueName: 
\"kubernetes.io/projected/8bdd63ee-cc99-4324-bfce-d871b230eaa6-kube-api-access-n75fn\") pod \"auto-csr-approver-29552138-gm4zd\" (UID: \"8bdd63ee-cc99-4324-bfce-d871b230eaa6\") " pod="openshift-infra/auto-csr-approver-29552138-gm4zd" Mar 10 07:38:00 crc kubenswrapper[4825]: I0310 07:38:00.364950 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n75fn\" (UniqueName: \"kubernetes.io/projected/8bdd63ee-cc99-4324-bfce-d871b230eaa6-kube-api-access-n75fn\") pod \"auto-csr-approver-29552138-gm4zd\" (UID: \"8bdd63ee-cc99-4324-bfce-d871b230eaa6\") " pod="openshift-infra/auto-csr-approver-29552138-gm4zd" Mar 10 07:38:00 crc kubenswrapper[4825]: I0310 07:38:00.388070 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n75fn\" (UniqueName: \"kubernetes.io/projected/8bdd63ee-cc99-4324-bfce-d871b230eaa6-kube-api-access-n75fn\") pod \"auto-csr-approver-29552138-gm4zd\" (UID: \"8bdd63ee-cc99-4324-bfce-d871b230eaa6\") " pod="openshift-infra/auto-csr-approver-29552138-gm4zd" Mar 10 07:38:00 crc kubenswrapper[4825]: I0310 07:38:00.532979 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552138-gm4zd" Mar 10 07:38:01 crc kubenswrapper[4825]: I0310 07:38:01.096434 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552138-gm4zd"] Mar 10 07:38:01 crc kubenswrapper[4825]: W0310 07:38:01.097834 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8bdd63ee_cc99_4324_bfce_d871b230eaa6.slice/crio-ca7c23a91a5131d18632635e1061bc92c5d8b138e2f81814b051d22b372a9141 WatchSource:0}: Error finding container ca7c23a91a5131d18632635e1061bc92c5d8b138e2f81814b051d22b372a9141: Status 404 returned error can't find the container with id ca7c23a91a5131d18632635e1061bc92c5d8b138e2f81814b051d22b372a9141 Mar 10 07:38:01 crc kubenswrapper[4825]: I0310 07:38:01.653411 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552138-gm4zd" event={"ID":"8bdd63ee-cc99-4324-bfce-d871b230eaa6","Type":"ContainerStarted","Data":"ca7c23a91a5131d18632635e1061bc92c5d8b138e2f81814b051d22b372a9141"} Mar 10 07:38:02 crc kubenswrapper[4825]: I0310 07:38:02.666305 4825 generic.go:334] "Generic (PLEG): container finished" podID="8bdd63ee-cc99-4324-bfce-d871b230eaa6" containerID="bd76e78d7a6c0c3caa0ec1661fb89634c7c66ffae269f3adf8a8ccbd0c037e40" exitCode=0 Mar 10 07:38:02 crc kubenswrapper[4825]: I0310 07:38:02.666502 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552138-gm4zd" event={"ID":"8bdd63ee-cc99-4324-bfce-d871b230eaa6","Type":"ContainerDied","Data":"bd76e78d7a6c0c3caa0ec1661fb89634c7c66ffae269f3adf8a8ccbd0c037e40"} Mar 10 07:38:03 crc kubenswrapper[4825]: I0310 07:38:03.237423 4825 scope.go:117] "RemoveContainer" containerID="ba48065b56abc01a0a946c6743e29fc19acf3581c643245f501b00640a86d953" Mar 10 07:38:03 crc kubenswrapper[4825]: E0310 07:38:03.237916 4825 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:38:03 crc kubenswrapper[4825]: I0310 07:38:03.984014 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552138-gm4zd" Mar 10 07:38:04 crc kubenswrapper[4825]: I0310 07:38:04.015983 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n75fn\" (UniqueName: \"kubernetes.io/projected/8bdd63ee-cc99-4324-bfce-d871b230eaa6-kube-api-access-n75fn\") pod \"8bdd63ee-cc99-4324-bfce-d871b230eaa6\" (UID: \"8bdd63ee-cc99-4324-bfce-d871b230eaa6\") " Mar 10 07:38:04 crc kubenswrapper[4825]: I0310 07:38:04.022588 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bdd63ee-cc99-4324-bfce-d871b230eaa6-kube-api-access-n75fn" (OuterVolumeSpecName: "kube-api-access-n75fn") pod "8bdd63ee-cc99-4324-bfce-d871b230eaa6" (UID: "8bdd63ee-cc99-4324-bfce-d871b230eaa6"). InnerVolumeSpecName "kube-api-access-n75fn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:38:04 crc kubenswrapper[4825]: I0310 07:38:04.117726 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n75fn\" (UniqueName: \"kubernetes.io/projected/8bdd63ee-cc99-4324-bfce-d871b230eaa6-kube-api-access-n75fn\") on node \"crc\" DevicePath \"\"" Mar 10 07:38:04 crc kubenswrapper[4825]: I0310 07:38:04.711255 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552138-gm4zd" event={"ID":"8bdd63ee-cc99-4324-bfce-d871b230eaa6","Type":"ContainerDied","Data":"ca7c23a91a5131d18632635e1061bc92c5d8b138e2f81814b051d22b372a9141"} Mar 10 07:38:04 crc kubenswrapper[4825]: I0310 07:38:04.711325 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca7c23a91a5131d18632635e1061bc92c5d8b138e2f81814b051d22b372a9141" Mar 10 07:38:04 crc kubenswrapper[4825]: I0310 07:38:04.711430 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552138-gm4zd" Mar 10 07:38:05 crc kubenswrapper[4825]: I0310 07:38:05.090343 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552132-4xg6n"] Mar 10 07:38:05 crc kubenswrapper[4825]: I0310 07:38:05.100787 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552132-4xg6n"] Mar 10 07:38:05 crc kubenswrapper[4825]: I0310 07:38:05.244490 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78877c1c-a7f0-4f7c-a9f9-9712470779f1" path="/var/lib/kubelet/pods/78877c1c-a7f0-4f7c-a9f9-9712470779f1/volumes" Mar 10 07:38:16 crc kubenswrapper[4825]: I0310 07:38:16.236678 4825 scope.go:117] "RemoveContainer" containerID="ba48065b56abc01a0a946c6743e29fc19acf3581c643245f501b00640a86d953" Mar 10 07:38:16 crc kubenswrapper[4825]: E0310 07:38:16.237670 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:38:30 crc kubenswrapper[4825]: I0310 07:38:30.237417 4825 scope.go:117] "RemoveContainer" containerID="ba48065b56abc01a0a946c6743e29fc19acf3581c643245f501b00640a86d953" Mar 10 07:38:30 crc kubenswrapper[4825]: E0310 07:38:30.238360 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:38:38 crc kubenswrapper[4825]: I0310 07:38:38.795197 4825 scope.go:117] "RemoveContainer" containerID="310bf31af4edb2c7dea7b543f7779bbaf81af94b2ad90b8cd733d0b2e42ffa67" Mar 10 07:38:45 crc kubenswrapper[4825]: I0310 07:38:45.236774 4825 scope.go:117] "RemoveContainer" containerID="ba48065b56abc01a0a946c6743e29fc19acf3581c643245f501b00640a86d953" Mar 10 07:38:45 crc kubenswrapper[4825]: E0310 07:38:45.237658 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:39:00 crc kubenswrapper[4825]: I0310 07:39:00.236383 4825 scope.go:117] "RemoveContainer" 
containerID="ba48065b56abc01a0a946c6743e29fc19acf3581c643245f501b00640a86d953" Mar 10 07:39:00 crc kubenswrapper[4825]: E0310 07:39:00.237758 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:39:14 crc kubenswrapper[4825]: I0310 07:39:14.285101 4825 scope.go:117] "RemoveContainer" containerID="ba48065b56abc01a0a946c6743e29fc19acf3581c643245f501b00640a86d953" Mar 10 07:39:14 crc kubenswrapper[4825]: E0310 07:39:14.286428 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:39:28 crc kubenswrapper[4825]: I0310 07:39:28.237434 4825 scope.go:117] "RemoveContainer" containerID="ba48065b56abc01a0a946c6743e29fc19acf3581c643245f501b00640a86d953" Mar 10 07:39:28 crc kubenswrapper[4825]: E0310 07:39:28.238525 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:39:42 crc kubenswrapper[4825]: I0310 07:39:42.236854 4825 scope.go:117] 
"RemoveContainer" containerID="ba48065b56abc01a0a946c6743e29fc19acf3581c643245f501b00640a86d953" Mar 10 07:39:42 crc kubenswrapper[4825]: E0310 07:39:42.237838 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:39:54 crc kubenswrapper[4825]: I0310 07:39:54.235632 4825 scope.go:117] "RemoveContainer" containerID="ba48065b56abc01a0a946c6743e29fc19acf3581c643245f501b00640a86d953" Mar 10 07:39:54 crc kubenswrapper[4825]: E0310 07:39:54.236453 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:40:00 crc kubenswrapper[4825]: I0310 07:40:00.169427 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552140-x57xg"] Mar 10 07:40:00 crc kubenswrapper[4825]: E0310 07:40:00.170391 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bdd63ee-cc99-4324-bfce-d871b230eaa6" containerName="oc" Mar 10 07:40:00 crc kubenswrapper[4825]: I0310 07:40:00.170412 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bdd63ee-cc99-4324-bfce-d871b230eaa6" containerName="oc" Mar 10 07:40:00 crc kubenswrapper[4825]: I0310 07:40:00.170672 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bdd63ee-cc99-4324-bfce-d871b230eaa6" containerName="oc" Mar 
10 07:40:00 crc kubenswrapper[4825]: I0310 07:40:00.171346 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552140-x57xg" Mar 10 07:40:00 crc kubenswrapper[4825]: I0310 07:40:00.173845 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 07:40:00 crc kubenswrapper[4825]: I0310 07:40:00.174492 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 07:40:00 crc kubenswrapper[4825]: I0310 07:40:00.174798 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 07:40:00 crc kubenswrapper[4825]: I0310 07:40:00.186879 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552140-x57xg"] Mar 10 07:40:00 crc kubenswrapper[4825]: I0310 07:40:00.329505 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67xjr\" (UniqueName: \"kubernetes.io/projected/eb168114-ca6b-4a6e-857d-17b7e09c5d6b-kube-api-access-67xjr\") pod \"auto-csr-approver-29552140-x57xg\" (UID: \"eb168114-ca6b-4a6e-857d-17b7e09c5d6b\") " pod="openshift-infra/auto-csr-approver-29552140-x57xg" Mar 10 07:40:00 crc kubenswrapper[4825]: I0310 07:40:00.431558 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67xjr\" (UniqueName: \"kubernetes.io/projected/eb168114-ca6b-4a6e-857d-17b7e09c5d6b-kube-api-access-67xjr\") pod \"auto-csr-approver-29552140-x57xg\" (UID: \"eb168114-ca6b-4a6e-857d-17b7e09c5d6b\") " pod="openshift-infra/auto-csr-approver-29552140-x57xg" Mar 10 07:40:00 crc kubenswrapper[4825]: I0310 07:40:00.464520 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67xjr\" (UniqueName: 
\"kubernetes.io/projected/eb168114-ca6b-4a6e-857d-17b7e09c5d6b-kube-api-access-67xjr\") pod \"auto-csr-approver-29552140-x57xg\" (UID: \"eb168114-ca6b-4a6e-857d-17b7e09c5d6b\") " pod="openshift-infra/auto-csr-approver-29552140-x57xg" Mar 10 07:40:00 crc kubenswrapper[4825]: I0310 07:40:00.501855 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552140-x57xg" Mar 10 07:40:00 crc kubenswrapper[4825]: I0310 07:40:00.937236 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552140-x57xg"] Mar 10 07:40:00 crc kubenswrapper[4825]: I0310 07:40:00.951765 4825 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 07:40:01 crc kubenswrapper[4825]: I0310 07:40:01.867310 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552140-x57xg" event={"ID":"eb168114-ca6b-4a6e-857d-17b7e09c5d6b","Type":"ContainerStarted","Data":"74e0bde0548f30867db0a70404227d696728801a257fe323f8e6519f36188f7e"} Mar 10 07:40:02 crc kubenswrapper[4825]: I0310 07:40:02.880180 4825 generic.go:334] "Generic (PLEG): container finished" podID="eb168114-ca6b-4a6e-857d-17b7e09c5d6b" containerID="d140737207cd765431f303752c3e8bb3c18cfed42f29428677741395f913c8c1" exitCode=0 Mar 10 07:40:02 crc kubenswrapper[4825]: I0310 07:40:02.880242 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552140-x57xg" event={"ID":"eb168114-ca6b-4a6e-857d-17b7e09c5d6b","Type":"ContainerDied","Data":"d140737207cd765431f303752c3e8bb3c18cfed42f29428677741395f913c8c1"} Mar 10 07:40:04 crc kubenswrapper[4825]: I0310 07:40:04.216232 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552140-x57xg" Mar 10 07:40:04 crc kubenswrapper[4825]: I0310 07:40:04.390206 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67xjr\" (UniqueName: \"kubernetes.io/projected/eb168114-ca6b-4a6e-857d-17b7e09c5d6b-kube-api-access-67xjr\") pod \"eb168114-ca6b-4a6e-857d-17b7e09c5d6b\" (UID: \"eb168114-ca6b-4a6e-857d-17b7e09c5d6b\") " Mar 10 07:40:04 crc kubenswrapper[4825]: I0310 07:40:04.408084 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb168114-ca6b-4a6e-857d-17b7e09c5d6b-kube-api-access-67xjr" (OuterVolumeSpecName: "kube-api-access-67xjr") pod "eb168114-ca6b-4a6e-857d-17b7e09c5d6b" (UID: "eb168114-ca6b-4a6e-857d-17b7e09c5d6b"). InnerVolumeSpecName "kube-api-access-67xjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:40:04 crc kubenswrapper[4825]: I0310 07:40:04.492205 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67xjr\" (UniqueName: \"kubernetes.io/projected/eb168114-ca6b-4a6e-857d-17b7e09c5d6b-kube-api-access-67xjr\") on node \"crc\" DevicePath \"\"" Mar 10 07:40:04 crc kubenswrapper[4825]: I0310 07:40:04.903504 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552140-x57xg" event={"ID":"eb168114-ca6b-4a6e-857d-17b7e09c5d6b","Type":"ContainerDied","Data":"74e0bde0548f30867db0a70404227d696728801a257fe323f8e6519f36188f7e"} Mar 10 07:40:04 crc kubenswrapper[4825]: I0310 07:40:04.903566 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74e0bde0548f30867db0a70404227d696728801a257fe323f8e6519f36188f7e" Mar 10 07:40:04 crc kubenswrapper[4825]: I0310 07:40:04.903578 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552140-x57xg" Mar 10 07:40:05 crc kubenswrapper[4825]: I0310 07:40:05.333161 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552134-fbgnq"] Mar 10 07:40:05 crc kubenswrapper[4825]: I0310 07:40:05.345948 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552134-fbgnq"] Mar 10 07:40:07 crc kubenswrapper[4825]: I0310 07:40:07.236702 4825 scope.go:117] "RemoveContainer" containerID="ba48065b56abc01a0a946c6743e29fc19acf3581c643245f501b00640a86d953" Mar 10 07:40:07 crc kubenswrapper[4825]: E0310 07:40:07.237483 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:40:07 crc kubenswrapper[4825]: I0310 07:40:07.253674 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68d1588c-8009-4e52-9c04-7bdcb2cf4f3d" path="/var/lib/kubelet/pods/68d1588c-8009-4e52-9c04-7bdcb2cf4f3d/volumes" Mar 10 07:40:21 crc kubenswrapper[4825]: I0310 07:40:21.236436 4825 scope.go:117] "RemoveContainer" containerID="ba48065b56abc01a0a946c6743e29fc19acf3581c643245f501b00640a86d953" Mar 10 07:40:21 crc kubenswrapper[4825]: E0310 07:40:21.237112 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" 
podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:40:32 crc kubenswrapper[4825]: I0310 07:40:32.217609 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-g5lhn"] Mar 10 07:40:32 crc kubenswrapper[4825]: E0310 07:40:32.218917 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb168114-ca6b-4a6e-857d-17b7e09c5d6b" containerName="oc" Mar 10 07:40:32 crc kubenswrapper[4825]: I0310 07:40:32.218947 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb168114-ca6b-4a6e-857d-17b7e09c5d6b" containerName="oc" Mar 10 07:40:32 crc kubenswrapper[4825]: I0310 07:40:32.219333 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb168114-ca6b-4a6e-857d-17b7e09c5d6b" containerName="oc" Mar 10 07:40:32 crc kubenswrapper[4825]: I0310 07:40:32.221670 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g5lhn" Mar 10 07:40:32 crc kubenswrapper[4825]: I0310 07:40:32.234215 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g5lhn"] Mar 10 07:40:32 crc kubenswrapper[4825]: I0310 07:40:32.251098 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9028cc8e-79e8-4c44-8bc0-2db0c871a8e4-catalog-content\") pod \"certified-operators-g5lhn\" (UID: \"9028cc8e-79e8-4c44-8bc0-2db0c871a8e4\") " pod="openshift-marketplace/certified-operators-g5lhn" Mar 10 07:40:32 crc kubenswrapper[4825]: I0310 07:40:32.251221 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9028cc8e-79e8-4c44-8bc0-2db0c871a8e4-utilities\") pod \"certified-operators-g5lhn\" (UID: \"9028cc8e-79e8-4c44-8bc0-2db0c871a8e4\") " pod="openshift-marketplace/certified-operators-g5lhn" Mar 10 07:40:32 crc 
kubenswrapper[4825]: I0310 07:40:32.251287 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccg7t\" (UniqueName: \"kubernetes.io/projected/9028cc8e-79e8-4c44-8bc0-2db0c871a8e4-kube-api-access-ccg7t\") pod \"certified-operators-g5lhn\" (UID: \"9028cc8e-79e8-4c44-8bc0-2db0c871a8e4\") " pod="openshift-marketplace/certified-operators-g5lhn" Mar 10 07:40:32 crc kubenswrapper[4825]: I0310 07:40:32.352552 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9028cc8e-79e8-4c44-8bc0-2db0c871a8e4-catalog-content\") pod \"certified-operators-g5lhn\" (UID: \"9028cc8e-79e8-4c44-8bc0-2db0c871a8e4\") " pod="openshift-marketplace/certified-operators-g5lhn" Mar 10 07:40:32 crc kubenswrapper[4825]: I0310 07:40:32.352600 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9028cc8e-79e8-4c44-8bc0-2db0c871a8e4-utilities\") pod \"certified-operators-g5lhn\" (UID: \"9028cc8e-79e8-4c44-8bc0-2db0c871a8e4\") " pod="openshift-marketplace/certified-operators-g5lhn" Mar 10 07:40:32 crc kubenswrapper[4825]: I0310 07:40:32.352644 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccg7t\" (UniqueName: \"kubernetes.io/projected/9028cc8e-79e8-4c44-8bc0-2db0c871a8e4-kube-api-access-ccg7t\") pod \"certified-operators-g5lhn\" (UID: \"9028cc8e-79e8-4c44-8bc0-2db0c871a8e4\") " pod="openshift-marketplace/certified-operators-g5lhn" Mar 10 07:40:32 crc kubenswrapper[4825]: I0310 07:40:32.353716 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9028cc8e-79e8-4c44-8bc0-2db0c871a8e4-catalog-content\") pod \"certified-operators-g5lhn\" (UID: \"9028cc8e-79e8-4c44-8bc0-2db0c871a8e4\") " pod="openshift-marketplace/certified-operators-g5lhn" Mar 10 
07:40:32 crc kubenswrapper[4825]: I0310 07:40:32.353722 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9028cc8e-79e8-4c44-8bc0-2db0c871a8e4-utilities\") pod \"certified-operators-g5lhn\" (UID: \"9028cc8e-79e8-4c44-8bc0-2db0c871a8e4\") " pod="openshift-marketplace/certified-operators-g5lhn" Mar 10 07:40:32 crc kubenswrapper[4825]: I0310 07:40:32.372680 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccg7t\" (UniqueName: \"kubernetes.io/projected/9028cc8e-79e8-4c44-8bc0-2db0c871a8e4-kube-api-access-ccg7t\") pod \"certified-operators-g5lhn\" (UID: \"9028cc8e-79e8-4c44-8bc0-2db0c871a8e4\") " pod="openshift-marketplace/certified-operators-g5lhn" Mar 10 07:40:32 crc kubenswrapper[4825]: I0310 07:40:32.562552 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g5lhn" Mar 10 07:40:33 crc kubenswrapper[4825]: I0310 07:40:33.074019 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g5lhn"] Mar 10 07:40:33 crc kubenswrapper[4825]: I0310 07:40:33.535081 4825 generic.go:334] "Generic (PLEG): container finished" podID="9028cc8e-79e8-4c44-8bc0-2db0c871a8e4" containerID="c13bf69283809ea2579bfd2d73ac7af84c846188c0d0bf67dc3861c8896d81ce" exitCode=0 Mar 10 07:40:33 crc kubenswrapper[4825]: I0310 07:40:33.535254 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g5lhn" event={"ID":"9028cc8e-79e8-4c44-8bc0-2db0c871a8e4","Type":"ContainerDied","Data":"c13bf69283809ea2579bfd2d73ac7af84c846188c0d0bf67dc3861c8896d81ce"} Mar 10 07:40:33 crc kubenswrapper[4825]: I0310 07:40:33.535594 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g5lhn" 
event={"ID":"9028cc8e-79e8-4c44-8bc0-2db0c871a8e4","Type":"ContainerStarted","Data":"46906ba338a5b47e92083d1289e53d503c09930ef0a94d808562a6504889cebb"} Mar 10 07:40:34 crc kubenswrapper[4825]: I0310 07:40:34.237078 4825 scope.go:117] "RemoveContainer" containerID="ba48065b56abc01a0a946c6743e29fc19acf3581c643245f501b00640a86d953" Mar 10 07:40:34 crc kubenswrapper[4825]: E0310 07:40:34.237565 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:40:34 crc kubenswrapper[4825]: I0310 07:40:34.804517 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jrdtm"] Mar 10 07:40:34 crc kubenswrapper[4825]: I0310 07:40:34.807537 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jrdtm" Mar 10 07:40:34 crc kubenswrapper[4825]: I0310 07:40:34.823408 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jrdtm"] Mar 10 07:40:34 crc kubenswrapper[4825]: I0310 07:40:34.891315 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d61a255-b794-40ae-aedd-9072e30bc6e5-catalog-content\") pod \"redhat-operators-jrdtm\" (UID: \"6d61a255-b794-40ae-aedd-9072e30bc6e5\") " pod="openshift-marketplace/redhat-operators-jrdtm" Mar 10 07:40:34 crc kubenswrapper[4825]: I0310 07:40:34.891373 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d61a255-b794-40ae-aedd-9072e30bc6e5-utilities\") pod \"redhat-operators-jrdtm\" (UID: \"6d61a255-b794-40ae-aedd-9072e30bc6e5\") " pod="openshift-marketplace/redhat-operators-jrdtm" Mar 10 07:40:34 crc kubenswrapper[4825]: I0310 07:40:34.891472 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fmx8\" (UniqueName: \"kubernetes.io/projected/6d61a255-b794-40ae-aedd-9072e30bc6e5-kube-api-access-2fmx8\") pod \"redhat-operators-jrdtm\" (UID: \"6d61a255-b794-40ae-aedd-9072e30bc6e5\") " pod="openshift-marketplace/redhat-operators-jrdtm" Mar 10 07:40:34 crc kubenswrapper[4825]: I0310 07:40:34.993015 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d61a255-b794-40ae-aedd-9072e30bc6e5-catalog-content\") pod \"redhat-operators-jrdtm\" (UID: \"6d61a255-b794-40ae-aedd-9072e30bc6e5\") " pod="openshift-marketplace/redhat-operators-jrdtm" Mar 10 07:40:34 crc kubenswrapper[4825]: I0310 07:40:34.993064 4825 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d61a255-b794-40ae-aedd-9072e30bc6e5-utilities\") pod \"redhat-operators-jrdtm\" (UID: \"6d61a255-b794-40ae-aedd-9072e30bc6e5\") " pod="openshift-marketplace/redhat-operators-jrdtm" Mar 10 07:40:34 crc kubenswrapper[4825]: I0310 07:40:34.993123 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fmx8\" (UniqueName: \"kubernetes.io/projected/6d61a255-b794-40ae-aedd-9072e30bc6e5-kube-api-access-2fmx8\") pod \"redhat-operators-jrdtm\" (UID: \"6d61a255-b794-40ae-aedd-9072e30bc6e5\") " pod="openshift-marketplace/redhat-operators-jrdtm" Mar 10 07:40:34 crc kubenswrapper[4825]: I0310 07:40:34.993536 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d61a255-b794-40ae-aedd-9072e30bc6e5-catalog-content\") pod \"redhat-operators-jrdtm\" (UID: \"6d61a255-b794-40ae-aedd-9072e30bc6e5\") " pod="openshift-marketplace/redhat-operators-jrdtm" Mar 10 07:40:34 crc kubenswrapper[4825]: I0310 07:40:34.993813 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d61a255-b794-40ae-aedd-9072e30bc6e5-utilities\") pod \"redhat-operators-jrdtm\" (UID: \"6d61a255-b794-40ae-aedd-9072e30bc6e5\") " pod="openshift-marketplace/redhat-operators-jrdtm" Mar 10 07:40:35 crc kubenswrapper[4825]: I0310 07:40:35.010848 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fmx8\" (UniqueName: \"kubernetes.io/projected/6d61a255-b794-40ae-aedd-9072e30bc6e5-kube-api-access-2fmx8\") pod \"redhat-operators-jrdtm\" (UID: \"6d61a255-b794-40ae-aedd-9072e30bc6e5\") " pod="openshift-marketplace/redhat-operators-jrdtm" Mar 10 07:40:35 crc kubenswrapper[4825]: I0310 07:40:35.133646 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jrdtm" Mar 10 07:40:35 crc kubenswrapper[4825]: I0310 07:40:35.567906 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jrdtm"] Mar 10 07:40:35 crc kubenswrapper[4825]: W0310 07:40:35.578158 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d61a255_b794_40ae_aedd_9072e30bc6e5.slice/crio-31c5da8de77cfd3e14836b1a00dca142c826d70a732b7059adaaa198a8e95924 WatchSource:0}: Error finding container 31c5da8de77cfd3e14836b1a00dca142c826d70a732b7059adaaa198a8e95924: Status 404 returned error can't find the container with id 31c5da8de77cfd3e14836b1a00dca142c826d70a732b7059adaaa198a8e95924 Mar 10 07:40:36 crc kubenswrapper[4825]: I0310 07:40:36.557172 4825 generic.go:334] "Generic (PLEG): container finished" podID="6d61a255-b794-40ae-aedd-9072e30bc6e5" containerID="39a2cc403b1aa1ffc5cf7d7ee2e7798c04034f25da404d7957ef69e5f72166d5" exitCode=0 Mar 10 07:40:36 crc kubenswrapper[4825]: I0310 07:40:36.557280 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jrdtm" event={"ID":"6d61a255-b794-40ae-aedd-9072e30bc6e5","Type":"ContainerDied","Data":"39a2cc403b1aa1ffc5cf7d7ee2e7798c04034f25da404d7957ef69e5f72166d5"} Mar 10 07:40:36 crc kubenswrapper[4825]: I0310 07:40:36.557464 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jrdtm" event={"ID":"6d61a255-b794-40ae-aedd-9072e30bc6e5","Type":"ContainerStarted","Data":"31c5da8de77cfd3e14836b1a00dca142c826d70a732b7059adaaa198a8e95924"} Mar 10 07:40:38 crc kubenswrapper[4825]: I0310 07:40:38.896801 4825 scope.go:117] "RemoveContainer" containerID="fa982086fed55a11b52c5d80857ea32169bde7ed075e0313c5c22c386d754b2b" Mar 10 07:40:39 crc kubenswrapper[4825]: I0310 07:40:39.584126 4825 generic.go:334] "Generic (PLEG): container finished" 
podID="9028cc8e-79e8-4c44-8bc0-2db0c871a8e4" containerID="3e3dccf0abb4f30b116b2f7bf0f1423efe835bd656b2853754ffb78b8410a058" exitCode=0 Mar 10 07:40:39 crc kubenswrapper[4825]: I0310 07:40:39.584233 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g5lhn" event={"ID":"9028cc8e-79e8-4c44-8bc0-2db0c871a8e4","Type":"ContainerDied","Data":"3e3dccf0abb4f30b116b2f7bf0f1423efe835bd656b2853754ffb78b8410a058"} Mar 10 07:40:39 crc kubenswrapper[4825]: I0310 07:40:39.587460 4825 generic.go:334] "Generic (PLEG): container finished" podID="6d61a255-b794-40ae-aedd-9072e30bc6e5" containerID="81664451cd84bacb2d9ff2c6a2907fdeca8833e832f8de16fe3656ef3e6de5f6" exitCode=0 Mar 10 07:40:39 crc kubenswrapper[4825]: I0310 07:40:39.587542 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jrdtm" event={"ID":"6d61a255-b794-40ae-aedd-9072e30bc6e5","Type":"ContainerDied","Data":"81664451cd84bacb2d9ff2c6a2907fdeca8833e832f8de16fe3656ef3e6de5f6"} Mar 10 07:40:40 crc kubenswrapper[4825]: I0310 07:40:40.600250 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g5lhn" event={"ID":"9028cc8e-79e8-4c44-8bc0-2db0c871a8e4","Type":"ContainerStarted","Data":"a949e386ff4036b9c7074fe9027dea705e3d8ccae6540fb6549f2b8d0099ca2c"} Mar 10 07:40:40 crc kubenswrapper[4825]: I0310 07:40:40.604209 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jrdtm" event={"ID":"6d61a255-b794-40ae-aedd-9072e30bc6e5","Type":"ContainerStarted","Data":"e3f70bbff303daa7fa8ec0b0771845afe69c06cf6cd515bbe6722862fb8b9147"} Mar 10 07:40:40 crc kubenswrapper[4825]: I0310 07:40:40.625324 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-g5lhn" podStartSLOduration=2.016041088 podStartE2EDuration="8.625302716s" podCreationTimestamp="2026-03-10 07:40:32 +0000 UTC" 
firstStartedPulling="2026-03-10 07:40:33.53751195 +0000 UTC m=+3386.567292605" lastFinishedPulling="2026-03-10 07:40:40.146773608 +0000 UTC m=+3393.176554233" observedRunningTime="2026-03-10 07:40:40.621565678 +0000 UTC m=+3393.651346293" watchObservedRunningTime="2026-03-10 07:40:40.625302716 +0000 UTC m=+3393.655083331" Mar 10 07:40:40 crc kubenswrapper[4825]: I0310 07:40:40.660584 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jrdtm" podStartSLOduration=3.207744184 podStartE2EDuration="6.660560485s" podCreationTimestamp="2026-03-10 07:40:34 +0000 UTC" firstStartedPulling="2026-03-10 07:40:36.560960456 +0000 UTC m=+3389.590741071" lastFinishedPulling="2026-03-10 07:40:40.013776747 +0000 UTC m=+3393.043557372" observedRunningTime="2026-03-10 07:40:40.645386005 +0000 UTC m=+3393.675166670" watchObservedRunningTime="2026-03-10 07:40:40.660560485 +0000 UTC m=+3393.690341110" Mar 10 07:40:42 crc kubenswrapper[4825]: I0310 07:40:42.563372 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-g5lhn" Mar 10 07:40:42 crc kubenswrapper[4825]: I0310 07:40:42.564458 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-g5lhn" Mar 10 07:40:43 crc kubenswrapper[4825]: I0310 07:40:43.620260 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-g5lhn" podUID="9028cc8e-79e8-4c44-8bc0-2db0c871a8e4" containerName="registry-server" probeResult="failure" output=< Mar 10 07:40:43 crc kubenswrapper[4825]: timeout: failed to connect service ":50051" within 1s Mar 10 07:40:43 crc kubenswrapper[4825]: > Mar 10 07:40:45 crc kubenswrapper[4825]: I0310 07:40:45.134279 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jrdtm" Mar 10 07:40:45 crc kubenswrapper[4825]: I0310 07:40:45.134572 
4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jrdtm" Mar 10 07:40:46 crc kubenswrapper[4825]: I0310 07:40:46.231090 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jrdtm" podUID="6d61a255-b794-40ae-aedd-9072e30bc6e5" containerName="registry-server" probeResult="failure" output=< Mar 10 07:40:46 crc kubenswrapper[4825]: timeout: failed to connect service ":50051" within 1s Mar 10 07:40:46 crc kubenswrapper[4825]: > Mar 10 07:40:46 crc kubenswrapper[4825]: I0310 07:40:46.236791 4825 scope.go:117] "RemoveContainer" containerID="ba48065b56abc01a0a946c6743e29fc19acf3581c643245f501b00640a86d953" Mar 10 07:40:46 crc kubenswrapper[4825]: E0310 07:40:46.237305 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:40:52 crc kubenswrapper[4825]: I0310 07:40:52.612324 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-g5lhn" Mar 10 07:40:52 crc kubenswrapper[4825]: I0310 07:40:52.658724 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-g5lhn" Mar 10 07:40:55 crc kubenswrapper[4825]: I0310 07:40:55.197379 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jrdtm" Mar 10 07:40:55 crc kubenswrapper[4825]: I0310 07:40:55.255224 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jrdtm" Mar 10 07:40:55 crc 
kubenswrapper[4825]: I0310 07:40:55.642193 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g5lhn"] Mar 10 07:40:56 crc kubenswrapper[4825]: I0310 07:40:56.406311 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lhnmv"] Mar 10 07:40:56 crc kubenswrapper[4825]: I0310 07:40:56.407391 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lhnmv" podUID="3d75eddb-b190-4e4e-8f99-339d7923a4c0" containerName="registry-server" containerID="cri-o://416f5fac03ee6b5691657111577415cf63b215c1781ae7967c68b687a08f9a8c" gracePeriod=2 Mar 10 07:40:56 crc kubenswrapper[4825]: I0310 07:40:56.782010 4825 generic.go:334] "Generic (PLEG): container finished" podID="3d75eddb-b190-4e4e-8f99-339d7923a4c0" containerID="416f5fac03ee6b5691657111577415cf63b215c1781ae7967c68b687a08f9a8c" exitCode=0 Mar 10 07:40:56 crc kubenswrapper[4825]: I0310 07:40:56.782096 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lhnmv" event={"ID":"3d75eddb-b190-4e4e-8f99-339d7923a4c0","Type":"ContainerDied","Data":"416f5fac03ee6b5691657111577415cf63b215c1781ae7967c68b687a08f9a8c"} Mar 10 07:40:56 crc kubenswrapper[4825]: I0310 07:40:56.865758 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lhnmv" Mar 10 07:40:56 crc kubenswrapper[4825]: I0310 07:40:56.974555 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d75eddb-b190-4e4e-8f99-339d7923a4c0-utilities\") pod \"3d75eddb-b190-4e4e-8f99-339d7923a4c0\" (UID: \"3d75eddb-b190-4e4e-8f99-339d7923a4c0\") " Mar 10 07:40:56 crc kubenswrapper[4825]: I0310 07:40:56.974642 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d75eddb-b190-4e4e-8f99-339d7923a4c0-catalog-content\") pod \"3d75eddb-b190-4e4e-8f99-339d7923a4c0\" (UID: \"3d75eddb-b190-4e4e-8f99-339d7923a4c0\") " Mar 10 07:40:56 crc kubenswrapper[4825]: I0310 07:40:56.974784 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pw8kd\" (UniqueName: \"kubernetes.io/projected/3d75eddb-b190-4e4e-8f99-339d7923a4c0-kube-api-access-pw8kd\") pod \"3d75eddb-b190-4e4e-8f99-339d7923a4c0\" (UID: \"3d75eddb-b190-4e4e-8f99-339d7923a4c0\") " Mar 10 07:40:56 crc kubenswrapper[4825]: I0310 07:40:56.975209 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d75eddb-b190-4e4e-8f99-339d7923a4c0-utilities" (OuterVolumeSpecName: "utilities") pod "3d75eddb-b190-4e4e-8f99-339d7923a4c0" (UID: "3d75eddb-b190-4e4e-8f99-339d7923a4c0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:40:56 crc kubenswrapper[4825]: I0310 07:40:56.979615 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d75eddb-b190-4e4e-8f99-339d7923a4c0-kube-api-access-pw8kd" (OuterVolumeSpecName: "kube-api-access-pw8kd") pod "3d75eddb-b190-4e4e-8f99-339d7923a4c0" (UID: "3d75eddb-b190-4e4e-8f99-339d7923a4c0"). InnerVolumeSpecName "kube-api-access-pw8kd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:40:57 crc kubenswrapper[4825]: I0310 07:40:57.021845 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d75eddb-b190-4e4e-8f99-339d7923a4c0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3d75eddb-b190-4e4e-8f99-339d7923a4c0" (UID: "3d75eddb-b190-4e4e-8f99-339d7923a4c0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:40:57 crc kubenswrapper[4825]: I0310 07:40:57.076505 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pw8kd\" (UniqueName: \"kubernetes.io/projected/3d75eddb-b190-4e4e-8f99-339d7923a4c0-kube-api-access-pw8kd\") on node \"crc\" DevicePath \"\"" Mar 10 07:40:57 crc kubenswrapper[4825]: I0310 07:40:57.076538 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d75eddb-b190-4e4e-8f99-339d7923a4c0-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 07:40:57 crc kubenswrapper[4825]: I0310 07:40:57.076548 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d75eddb-b190-4e4e-8f99-339d7923a4c0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 07:40:57 crc kubenswrapper[4825]: I0310 07:40:57.805843 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lhnmv" event={"ID":"3d75eddb-b190-4e4e-8f99-339d7923a4c0","Type":"ContainerDied","Data":"91763806981d23b8e4ffb287254620952fd6de77cf4ffc53122598503e7fc424"} Mar 10 07:40:57 crc kubenswrapper[4825]: I0310 07:40:57.806154 4825 scope.go:117] "RemoveContainer" containerID="416f5fac03ee6b5691657111577415cf63b215c1781ae7967c68b687a08f9a8c" Mar 10 07:40:57 crc kubenswrapper[4825]: I0310 07:40:57.806304 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lhnmv" Mar 10 07:40:57 crc kubenswrapper[4825]: I0310 07:40:57.830370 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lhnmv"] Mar 10 07:40:57 crc kubenswrapper[4825]: I0310 07:40:57.836563 4825 scope.go:117] "RemoveContainer" containerID="2d9b0adb42e5d887e62b0e7beff93f940930ce9dad529d9c836e434605c67c6a" Mar 10 07:40:57 crc kubenswrapper[4825]: I0310 07:40:57.838040 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lhnmv"] Mar 10 07:40:57 crc kubenswrapper[4825]: I0310 07:40:57.857703 4825 scope.go:117] "RemoveContainer" containerID="62a545b5838b74dab930ca1824c5ef81119474d1ef29a7b4e534d6e89abc830f" Mar 10 07:40:59 crc kubenswrapper[4825]: I0310 07:40:59.245319 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d75eddb-b190-4e4e-8f99-339d7923a4c0" path="/var/lib/kubelet/pods/3d75eddb-b190-4e4e-8f99-339d7923a4c0/volumes" Mar 10 07:40:59 crc kubenswrapper[4825]: I0310 07:40:59.999619 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jrdtm"] Mar 10 07:40:59 crc kubenswrapper[4825]: I0310 07:40:59.999932 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jrdtm" podUID="6d61a255-b794-40ae-aedd-9072e30bc6e5" containerName="registry-server" containerID="cri-o://e3f70bbff303daa7fa8ec0b0771845afe69c06cf6cd515bbe6722862fb8b9147" gracePeriod=2 Mar 10 07:41:00 crc kubenswrapper[4825]: I0310 07:41:00.342508 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jrdtm" Mar 10 07:41:00 crc kubenswrapper[4825]: I0310 07:41:00.522924 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d61a255-b794-40ae-aedd-9072e30bc6e5-catalog-content\") pod \"6d61a255-b794-40ae-aedd-9072e30bc6e5\" (UID: \"6d61a255-b794-40ae-aedd-9072e30bc6e5\") " Mar 10 07:41:00 crc kubenswrapper[4825]: I0310 07:41:00.522988 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fmx8\" (UniqueName: \"kubernetes.io/projected/6d61a255-b794-40ae-aedd-9072e30bc6e5-kube-api-access-2fmx8\") pod \"6d61a255-b794-40ae-aedd-9072e30bc6e5\" (UID: \"6d61a255-b794-40ae-aedd-9072e30bc6e5\") " Mar 10 07:41:00 crc kubenswrapper[4825]: I0310 07:41:00.523080 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d61a255-b794-40ae-aedd-9072e30bc6e5-utilities\") pod \"6d61a255-b794-40ae-aedd-9072e30bc6e5\" (UID: \"6d61a255-b794-40ae-aedd-9072e30bc6e5\") " Mar 10 07:41:00 crc kubenswrapper[4825]: I0310 07:41:00.524230 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d61a255-b794-40ae-aedd-9072e30bc6e5-utilities" (OuterVolumeSpecName: "utilities") pod "6d61a255-b794-40ae-aedd-9072e30bc6e5" (UID: "6d61a255-b794-40ae-aedd-9072e30bc6e5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:41:00 crc kubenswrapper[4825]: I0310 07:41:00.528250 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d61a255-b794-40ae-aedd-9072e30bc6e5-kube-api-access-2fmx8" (OuterVolumeSpecName: "kube-api-access-2fmx8") pod "6d61a255-b794-40ae-aedd-9072e30bc6e5" (UID: "6d61a255-b794-40ae-aedd-9072e30bc6e5"). InnerVolumeSpecName "kube-api-access-2fmx8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:41:00 crc kubenswrapper[4825]: I0310 07:41:00.624734 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fmx8\" (UniqueName: \"kubernetes.io/projected/6d61a255-b794-40ae-aedd-9072e30bc6e5-kube-api-access-2fmx8\") on node \"crc\" DevicePath \"\"" Mar 10 07:41:00 crc kubenswrapper[4825]: I0310 07:41:00.625021 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d61a255-b794-40ae-aedd-9072e30bc6e5-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 07:41:00 crc kubenswrapper[4825]: I0310 07:41:00.699943 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d61a255-b794-40ae-aedd-9072e30bc6e5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6d61a255-b794-40ae-aedd-9072e30bc6e5" (UID: "6d61a255-b794-40ae-aedd-9072e30bc6e5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:41:00 crc kubenswrapper[4825]: I0310 07:41:00.725968 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d61a255-b794-40ae-aedd-9072e30bc6e5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 07:41:00 crc kubenswrapper[4825]: I0310 07:41:00.826529 4825 generic.go:334] "Generic (PLEG): container finished" podID="6d61a255-b794-40ae-aedd-9072e30bc6e5" containerID="e3f70bbff303daa7fa8ec0b0771845afe69c06cf6cd515bbe6722862fb8b9147" exitCode=0 Mar 10 07:41:00 crc kubenswrapper[4825]: I0310 07:41:00.826577 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jrdtm" event={"ID":"6d61a255-b794-40ae-aedd-9072e30bc6e5","Type":"ContainerDied","Data":"e3f70bbff303daa7fa8ec0b0771845afe69c06cf6cd515bbe6722862fb8b9147"} Mar 10 07:41:00 crc kubenswrapper[4825]: I0310 07:41:00.826638 4825 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-jrdtm" event={"ID":"6d61a255-b794-40ae-aedd-9072e30bc6e5","Type":"ContainerDied","Data":"31c5da8de77cfd3e14836b1a00dca142c826d70a732b7059adaaa198a8e95924"} Mar 10 07:41:00 crc kubenswrapper[4825]: I0310 07:41:00.826646 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jrdtm" Mar 10 07:41:00 crc kubenswrapper[4825]: I0310 07:41:00.826662 4825 scope.go:117] "RemoveContainer" containerID="e3f70bbff303daa7fa8ec0b0771845afe69c06cf6cd515bbe6722862fb8b9147" Mar 10 07:41:00 crc kubenswrapper[4825]: I0310 07:41:00.843965 4825 scope.go:117] "RemoveContainer" containerID="81664451cd84bacb2d9ff2c6a2907fdeca8833e832f8de16fe3656ef3e6de5f6" Mar 10 07:41:00 crc kubenswrapper[4825]: I0310 07:41:00.868835 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jrdtm"] Mar 10 07:41:00 crc kubenswrapper[4825]: I0310 07:41:00.873912 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jrdtm"] Mar 10 07:41:00 crc kubenswrapper[4825]: I0310 07:41:00.874958 4825 scope.go:117] "RemoveContainer" containerID="39a2cc403b1aa1ffc5cf7d7ee2e7798c04034f25da404d7957ef69e5f72166d5" Mar 10 07:41:00 crc kubenswrapper[4825]: I0310 07:41:00.890827 4825 scope.go:117] "RemoveContainer" containerID="e3f70bbff303daa7fa8ec0b0771845afe69c06cf6cd515bbe6722862fb8b9147" Mar 10 07:41:00 crc kubenswrapper[4825]: E0310 07:41:00.891301 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3f70bbff303daa7fa8ec0b0771845afe69c06cf6cd515bbe6722862fb8b9147\": container with ID starting with e3f70bbff303daa7fa8ec0b0771845afe69c06cf6cd515bbe6722862fb8b9147 not found: ID does not exist" containerID="e3f70bbff303daa7fa8ec0b0771845afe69c06cf6cd515bbe6722862fb8b9147" Mar 10 07:41:00 crc kubenswrapper[4825]: I0310 07:41:00.891328 4825 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3f70bbff303daa7fa8ec0b0771845afe69c06cf6cd515bbe6722862fb8b9147"} err="failed to get container status \"e3f70bbff303daa7fa8ec0b0771845afe69c06cf6cd515bbe6722862fb8b9147\": rpc error: code = NotFound desc = could not find container \"e3f70bbff303daa7fa8ec0b0771845afe69c06cf6cd515bbe6722862fb8b9147\": container with ID starting with e3f70bbff303daa7fa8ec0b0771845afe69c06cf6cd515bbe6722862fb8b9147 not found: ID does not exist" Mar 10 07:41:00 crc kubenswrapper[4825]: I0310 07:41:00.891346 4825 scope.go:117] "RemoveContainer" containerID="81664451cd84bacb2d9ff2c6a2907fdeca8833e832f8de16fe3656ef3e6de5f6" Mar 10 07:41:00 crc kubenswrapper[4825]: E0310 07:41:00.891670 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81664451cd84bacb2d9ff2c6a2907fdeca8833e832f8de16fe3656ef3e6de5f6\": container with ID starting with 81664451cd84bacb2d9ff2c6a2907fdeca8833e832f8de16fe3656ef3e6de5f6 not found: ID does not exist" containerID="81664451cd84bacb2d9ff2c6a2907fdeca8833e832f8de16fe3656ef3e6de5f6" Mar 10 07:41:00 crc kubenswrapper[4825]: I0310 07:41:00.891690 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81664451cd84bacb2d9ff2c6a2907fdeca8833e832f8de16fe3656ef3e6de5f6"} err="failed to get container status \"81664451cd84bacb2d9ff2c6a2907fdeca8833e832f8de16fe3656ef3e6de5f6\": rpc error: code = NotFound desc = could not find container \"81664451cd84bacb2d9ff2c6a2907fdeca8833e832f8de16fe3656ef3e6de5f6\": container with ID starting with 81664451cd84bacb2d9ff2c6a2907fdeca8833e832f8de16fe3656ef3e6de5f6 not found: ID does not exist" Mar 10 07:41:00 crc kubenswrapper[4825]: I0310 07:41:00.891703 4825 scope.go:117] "RemoveContainer" containerID="39a2cc403b1aa1ffc5cf7d7ee2e7798c04034f25da404d7957ef69e5f72166d5" Mar 10 07:41:00 crc kubenswrapper[4825]: E0310 
07:41:00.891918 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39a2cc403b1aa1ffc5cf7d7ee2e7798c04034f25da404d7957ef69e5f72166d5\": container with ID starting with 39a2cc403b1aa1ffc5cf7d7ee2e7798c04034f25da404d7957ef69e5f72166d5 not found: ID does not exist" containerID="39a2cc403b1aa1ffc5cf7d7ee2e7798c04034f25da404d7957ef69e5f72166d5" Mar 10 07:41:00 crc kubenswrapper[4825]: I0310 07:41:00.891937 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39a2cc403b1aa1ffc5cf7d7ee2e7798c04034f25da404d7957ef69e5f72166d5"} err="failed to get container status \"39a2cc403b1aa1ffc5cf7d7ee2e7798c04034f25da404d7957ef69e5f72166d5\": rpc error: code = NotFound desc = could not find container \"39a2cc403b1aa1ffc5cf7d7ee2e7798c04034f25da404d7957ef69e5f72166d5\": container with ID starting with 39a2cc403b1aa1ffc5cf7d7ee2e7798c04034f25da404d7957ef69e5f72166d5 not found: ID does not exist" Mar 10 07:41:01 crc kubenswrapper[4825]: I0310 07:41:01.237173 4825 scope.go:117] "RemoveContainer" containerID="ba48065b56abc01a0a946c6743e29fc19acf3581c643245f501b00640a86d953" Mar 10 07:41:01 crc kubenswrapper[4825]: E0310 07:41:01.237399 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:41:01 crc kubenswrapper[4825]: I0310 07:41:01.247365 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d61a255-b794-40ae-aedd-9072e30bc6e5" path="/var/lib/kubelet/pods/6d61a255-b794-40ae-aedd-9072e30bc6e5/volumes" Mar 10 07:41:14 crc kubenswrapper[4825]: I0310 07:41:14.237076 
4825 scope.go:117] "RemoveContainer" containerID="ba48065b56abc01a0a946c6743e29fc19acf3581c643245f501b00640a86d953" Mar 10 07:41:14 crc kubenswrapper[4825]: E0310 07:41:14.238338 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:41:27 crc kubenswrapper[4825]: I0310 07:41:27.237881 4825 scope.go:117] "RemoveContainer" containerID="ba48065b56abc01a0a946c6743e29fc19acf3581c643245f501b00640a86d953" Mar 10 07:41:27 crc kubenswrapper[4825]: E0310 07:41:27.239837 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:41:41 crc kubenswrapper[4825]: I0310 07:41:41.237314 4825 scope.go:117] "RemoveContainer" containerID="ba48065b56abc01a0a946c6743e29fc19acf3581c643245f501b00640a86d953" Mar 10 07:41:41 crc kubenswrapper[4825]: E0310 07:41:41.237976 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:41:53 crc kubenswrapper[4825]: I0310 
07:41:53.237063 4825 scope.go:117] "RemoveContainer" containerID="ba48065b56abc01a0a946c6743e29fc19acf3581c643245f501b00640a86d953" Mar 10 07:41:53 crc kubenswrapper[4825]: E0310 07:41:53.237792 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:42:00 crc kubenswrapper[4825]: I0310 07:42:00.164106 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552142-zj85r"] Mar 10 07:42:00 crc kubenswrapper[4825]: E0310 07:42:00.165306 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d75eddb-b190-4e4e-8f99-339d7923a4c0" containerName="registry-server" Mar 10 07:42:00 crc kubenswrapper[4825]: I0310 07:42:00.165333 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d75eddb-b190-4e4e-8f99-339d7923a4c0" containerName="registry-server" Mar 10 07:42:00 crc kubenswrapper[4825]: E0310 07:42:00.165354 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d61a255-b794-40ae-aedd-9072e30bc6e5" containerName="extract-utilities" Mar 10 07:42:00 crc kubenswrapper[4825]: I0310 07:42:00.165369 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d61a255-b794-40ae-aedd-9072e30bc6e5" containerName="extract-utilities" Mar 10 07:42:00 crc kubenswrapper[4825]: E0310 07:42:00.165384 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d61a255-b794-40ae-aedd-9072e30bc6e5" containerName="extract-content" Mar 10 07:42:00 crc kubenswrapper[4825]: I0310 07:42:00.165398 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d61a255-b794-40ae-aedd-9072e30bc6e5" containerName="extract-content" Mar 10 
07:42:00 crc kubenswrapper[4825]: E0310 07:42:00.165431 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d61a255-b794-40ae-aedd-9072e30bc6e5" containerName="registry-server" Mar 10 07:42:00 crc kubenswrapper[4825]: I0310 07:42:00.165446 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d61a255-b794-40ae-aedd-9072e30bc6e5" containerName="registry-server" Mar 10 07:42:00 crc kubenswrapper[4825]: E0310 07:42:00.165496 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d75eddb-b190-4e4e-8f99-339d7923a4c0" containerName="extract-content" Mar 10 07:42:00 crc kubenswrapper[4825]: I0310 07:42:00.165510 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d75eddb-b190-4e4e-8f99-339d7923a4c0" containerName="extract-content" Mar 10 07:42:00 crc kubenswrapper[4825]: E0310 07:42:00.165527 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d75eddb-b190-4e4e-8f99-339d7923a4c0" containerName="extract-utilities" Mar 10 07:42:00 crc kubenswrapper[4825]: I0310 07:42:00.165541 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d75eddb-b190-4e4e-8f99-339d7923a4c0" containerName="extract-utilities" Mar 10 07:42:00 crc kubenswrapper[4825]: I0310 07:42:00.165789 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d61a255-b794-40ae-aedd-9072e30bc6e5" containerName="registry-server" Mar 10 07:42:00 crc kubenswrapper[4825]: I0310 07:42:00.165811 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d75eddb-b190-4e4e-8f99-339d7923a4c0" containerName="registry-server" Mar 10 07:42:00 crc kubenswrapper[4825]: I0310 07:42:00.166885 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552142-zj85r" Mar 10 07:42:00 crc kubenswrapper[4825]: I0310 07:42:00.170295 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 07:42:00 crc kubenswrapper[4825]: I0310 07:42:00.170660 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 07:42:00 crc kubenswrapper[4825]: I0310 07:42:00.170746 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 07:42:00 crc kubenswrapper[4825]: I0310 07:42:00.197528 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552142-zj85r"] Mar 10 07:42:00 crc kubenswrapper[4825]: I0310 07:42:00.330696 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cphqp\" (UniqueName: \"kubernetes.io/projected/d6cefa1b-ee7d-47ba-a2d8-17e2f3bc9f8f-kube-api-access-cphqp\") pod \"auto-csr-approver-29552142-zj85r\" (UID: \"d6cefa1b-ee7d-47ba-a2d8-17e2f3bc9f8f\") " pod="openshift-infra/auto-csr-approver-29552142-zj85r" Mar 10 07:42:00 crc kubenswrapper[4825]: I0310 07:42:00.432940 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cphqp\" (UniqueName: \"kubernetes.io/projected/d6cefa1b-ee7d-47ba-a2d8-17e2f3bc9f8f-kube-api-access-cphqp\") pod \"auto-csr-approver-29552142-zj85r\" (UID: \"d6cefa1b-ee7d-47ba-a2d8-17e2f3bc9f8f\") " pod="openshift-infra/auto-csr-approver-29552142-zj85r" Mar 10 07:42:00 crc kubenswrapper[4825]: I0310 07:42:00.467461 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cphqp\" (UniqueName: \"kubernetes.io/projected/d6cefa1b-ee7d-47ba-a2d8-17e2f3bc9f8f-kube-api-access-cphqp\") pod \"auto-csr-approver-29552142-zj85r\" (UID: \"d6cefa1b-ee7d-47ba-a2d8-17e2f3bc9f8f\") " 
pod="openshift-infra/auto-csr-approver-29552142-zj85r" Mar 10 07:42:00 crc kubenswrapper[4825]: I0310 07:42:00.514024 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552142-zj85r" Mar 10 07:42:01 crc kubenswrapper[4825]: I0310 07:42:01.007144 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552142-zj85r"] Mar 10 07:42:01 crc kubenswrapper[4825]: I0310 07:42:01.343948 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552142-zj85r" event={"ID":"d6cefa1b-ee7d-47ba-a2d8-17e2f3bc9f8f","Type":"ContainerStarted","Data":"16502703c39b7419f8589ff5b1a989f8b4d3925415ee0aa6435313a326344f75"} Mar 10 07:42:02 crc kubenswrapper[4825]: I0310 07:42:02.353456 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552142-zj85r" event={"ID":"d6cefa1b-ee7d-47ba-a2d8-17e2f3bc9f8f","Type":"ContainerStarted","Data":"db53e4a822125eccee2e837c6049adafb85a57e0a64db04cd79eda84bf43c108"} Mar 10 07:42:02 crc kubenswrapper[4825]: I0310 07:42:02.371654 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552142-zj85r" podStartSLOduration=1.550347891 podStartE2EDuration="2.371627747s" podCreationTimestamp="2026-03-10 07:42:00 +0000 UTC" firstStartedPulling="2026-03-10 07:42:01.011193636 +0000 UTC m=+3474.040974271" lastFinishedPulling="2026-03-10 07:42:01.832473522 +0000 UTC m=+3474.862254127" observedRunningTime="2026-03-10 07:42:02.370656872 +0000 UTC m=+3475.400437487" watchObservedRunningTime="2026-03-10 07:42:02.371627747 +0000 UTC m=+3475.401408402" Mar 10 07:42:03 crc kubenswrapper[4825]: I0310 07:42:03.363981 4825 generic.go:334] "Generic (PLEG): container finished" podID="d6cefa1b-ee7d-47ba-a2d8-17e2f3bc9f8f" containerID="db53e4a822125eccee2e837c6049adafb85a57e0a64db04cd79eda84bf43c108" exitCode=0 Mar 10 07:42:03 crc 
kubenswrapper[4825]: I0310 07:42:03.364034 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552142-zj85r" event={"ID":"d6cefa1b-ee7d-47ba-a2d8-17e2f3bc9f8f","Type":"ContainerDied","Data":"db53e4a822125eccee2e837c6049adafb85a57e0a64db04cd79eda84bf43c108"} Mar 10 07:42:04 crc kubenswrapper[4825]: I0310 07:42:04.672649 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552142-zj85r" Mar 10 07:42:04 crc kubenswrapper[4825]: I0310 07:42:04.798692 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cphqp\" (UniqueName: \"kubernetes.io/projected/d6cefa1b-ee7d-47ba-a2d8-17e2f3bc9f8f-kube-api-access-cphqp\") pod \"d6cefa1b-ee7d-47ba-a2d8-17e2f3bc9f8f\" (UID: \"d6cefa1b-ee7d-47ba-a2d8-17e2f3bc9f8f\") " Mar 10 07:42:04 crc kubenswrapper[4825]: I0310 07:42:04.803672 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6cefa1b-ee7d-47ba-a2d8-17e2f3bc9f8f-kube-api-access-cphqp" (OuterVolumeSpecName: "kube-api-access-cphqp") pod "d6cefa1b-ee7d-47ba-a2d8-17e2f3bc9f8f" (UID: "d6cefa1b-ee7d-47ba-a2d8-17e2f3bc9f8f"). InnerVolumeSpecName "kube-api-access-cphqp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:42:04 crc kubenswrapper[4825]: I0310 07:42:04.899998 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cphqp\" (UniqueName: \"kubernetes.io/projected/d6cefa1b-ee7d-47ba-a2d8-17e2f3bc9f8f-kube-api-access-cphqp\") on node \"crc\" DevicePath \"\"" Mar 10 07:42:05 crc kubenswrapper[4825]: I0310 07:42:05.237936 4825 scope.go:117] "RemoveContainer" containerID="ba48065b56abc01a0a946c6743e29fc19acf3581c643245f501b00640a86d953" Mar 10 07:42:05 crc kubenswrapper[4825]: E0310 07:42:05.238968 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:42:05 crc kubenswrapper[4825]: I0310 07:42:05.383320 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552142-zj85r" event={"ID":"d6cefa1b-ee7d-47ba-a2d8-17e2f3bc9f8f","Type":"ContainerDied","Data":"16502703c39b7419f8589ff5b1a989f8b4d3925415ee0aa6435313a326344f75"} Mar 10 07:42:05 crc kubenswrapper[4825]: I0310 07:42:05.383392 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16502703c39b7419f8589ff5b1a989f8b4d3925415ee0aa6435313a326344f75" Mar 10 07:42:05 crc kubenswrapper[4825]: I0310 07:42:05.383420 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552142-zj85r" Mar 10 07:42:05 crc kubenswrapper[4825]: I0310 07:42:05.471170 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552136-c4scj"] Mar 10 07:42:05 crc kubenswrapper[4825]: I0310 07:42:05.478373 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552136-c4scj"] Mar 10 07:42:07 crc kubenswrapper[4825]: I0310 07:42:07.260478 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23621e1d-1931-416a-9d42-89391322cd1b" path="/var/lib/kubelet/pods/23621e1d-1931-416a-9d42-89391322cd1b/volumes" Mar 10 07:42:18 crc kubenswrapper[4825]: I0310 07:42:18.237454 4825 scope.go:117] "RemoveContainer" containerID="ba48065b56abc01a0a946c6743e29fc19acf3581c643245f501b00640a86d953" Mar 10 07:42:18 crc kubenswrapper[4825]: E0310 07:42:18.238522 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:42:30 crc kubenswrapper[4825]: I0310 07:42:30.236590 4825 scope.go:117] "RemoveContainer" containerID="ba48065b56abc01a0a946c6743e29fc19acf3581c643245f501b00640a86d953" Mar 10 07:42:30 crc kubenswrapper[4825]: E0310 07:42:30.237742 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" 
podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:42:39 crc kubenswrapper[4825]: I0310 07:42:39.033868 4825 scope.go:117] "RemoveContainer" containerID="cd36a0e95b222af021c1052458f6d919260a546438ec548c7c56071465524683" Mar 10 07:42:45 crc kubenswrapper[4825]: I0310 07:42:45.237518 4825 scope.go:117] "RemoveContainer" containerID="ba48065b56abc01a0a946c6743e29fc19acf3581c643245f501b00640a86d953" Mar 10 07:42:45 crc kubenswrapper[4825]: E0310 07:42:45.238474 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:42:46 crc kubenswrapper[4825]: I0310 07:42:46.195575 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p9w5f"] Mar 10 07:42:46 crc kubenswrapper[4825]: E0310 07:42:46.196745 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6cefa1b-ee7d-47ba-a2d8-17e2f3bc9f8f" containerName="oc" Mar 10 07:42:46 crc kubenswrapper[4825]: I0310 07:42:46.196778 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6cefa1b-ee7d-47ba-a2d8-17e2f3bc9f8f" containerName="oc" Mar 10 07:42:46 crc kubenswrapper[4825]: I0310 07:42:46.197065 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6cefa1b-ee7d-47ba-a2d8-17e2f3bc9f8f" containerName="oc" Mar 10 07:42:46 crc kubenswrapper[4825]: I0310 07:42:46.199084 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p9w5f" Mar 10 07:42:46 crc kubenswrapper[4825]: I0310 07:42:46.208942 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p9w5f"] Mar 10 07:42:46 crc kubenswrapper[4825]: I0310 07:42:46.303827 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czm9t\" (UniqueName: \"kubernetes.io/projected/f0b31d1b-a64c-4ac6-9054-85bfb18fc022-kube-api-access-czm9t\") pod \"community-operators-p9w5f\" (UID: \"f0b31d1b-a64c-4ac6-9054-85bfb18fc022\") " pod="openshift-marketplace/community-operators-p9w5f" Mar 10 07:42:46 crc kubenswrapper[4825]: I0310 07:42:46.303916 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0b31d1b-a64c-4ac6-9054-85bfb18fc022-catalog-content\") pod \"community-operators-p9w5f\" (UID: \"f0b31d1b-a64c-4ac6-9054-85bfb18fc022\") " pod="openshift-marketplace/community-operators-p9w5f" Mar 10 07:42:46 crc kubenswrapper[4825]: I0310 07:42:46.303954 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0b31d1b-a64c-4ac6-9054-85bfb18fc022-utilities\") pod \"community-operators-p9w5f\" (UID: \"f0b31d1b-a64c-4ac6-9054-85bfb18fc022\") " pod="openshift-marketplace/community-operators-p9w5f" Mar 10 07:42:46 crc kubenswrapper[4825]: I0310 07:42:46.404679 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czm9t\" (UniqueName: \"kubernetes.io/projected/f0b31d1b-a64c-4ac6-9054-85bfb18fc022-kube-api-access-czm9t\") pod \"community-operators-p9w5f\" (UID: \"f0b31d1b-a64c-4ac6-9054-85bfb18fc022\") " pod="openshift-marketplace/community-operators-p9w5f" Mar 10 07:42:46 crc kubenswrapper[4825]: I0310 07:42:46.404752 4825 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0b31d1b-a64c-4ac6-9054-85bfb18fc022-catalog-content\") pod \"community-operators-p9w5f\" (UID: \"f0b31d1b-a64c-4ac6-9054-85bfb18fc022\") " pod="openshift-marketplace/community-operators-p9w5f" Mar 10 07:42:46 crc kubenswrapper[4825]: I0310 07:42:46.404787 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0b31d1b-a64c-4ac6-9054-85bfb18fc022-utilities\") pod \"community-operators-p9w5f\" (UID: \"f0b31d1b-a64c-4ac6-9054-85bfb18fc022\") " pod="openshift-marketplace/community-operators-p9w5f" Mar 10 07:42:46 crc kubenswrapper[4825]: I0310 07:42:46.405466 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0b31d1b-a64c-4ac6-9054-85bfb18fc022-utilities\") pod \"community-operators-p9w5f\" (UID: \"f0b31d1b-a64c-4ac6-9054-85bfb18fc022\") " pod="openshift-marketplace/community-operators-p9w5f" Mar 10 07:42:46 crc kubenswrapper[4825]: I0310 07:42:46.405850 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0b31d1b-a64c-4ac6-9054-85bfb18fc022-catalog-content\") pod \"community-operators-p9w5f\" (UID: \"f0b31d1b-a64c-4ac6-9054-85bfb18fc022\") " pod="openshift-marketplace/community-operators-p9w5f" Mar 10 07:42:46 crc kubenswrapper[4825]: I0310 07:42:46.426712 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czm9t\" (UniqueName: \"kubernetes.io/projected/f0b31d1b-a64c-4ac6-9054-85bfb18fc022-kube-api-access-czm9t\") pod \"community-operators-p9w5f\" (UID: \"f0b31d1b-a64c-4ac6-9054-85bfb18fc022\") " pod="openshift-marketplace/community-operators-p9w5f" Mar 10 07:42:46 crc kubenswrapper[4825]: I0310 07:42:46.543409 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p9w5f" Mar 10 07:42:46 crc kubenswrapper[4825]: I0310 07:42:46.851180 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p9w5f"] Mar 10 07:42:47 crc kubenswrapper[4825]: I0310 07:42:47.807930 4825 generic.go:334] "Generic (PLEG): container finished" podID="f0b31d1b-a64c-4ac6-9054-85bfb18fc022" containerID="cd75b0421e543ca6a133f62f36b0f8fece91330ce9edd012088da8ca778d4567" exitCode=0 Mar 10 07:42:47 crc kubenswrapper[4825]: I0310 07:42:47.808031 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p9w5f" event={"ID":"f0b31d1b-a64c-4ac6-9054-85bfb18fc022","Type":"ContainerDied","Data":"cd75b0421e543ca6a133f62f36b0f8fece91330ce9edd012088da8ca778d4567"} Mar 10 07:42:47 crc kubenswrapper[4825]: I0310 07:42:47.808281 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p9w5f" event={"ID":"f0b31d1b-a64c-4ac6-9054-85bfb18fc022","Type":"ContainerStarted","Data":"1d9c54796e86e0c1f33c10f561490764ce10093bf4f3d44609eb2e56d6e7416d"} Mar 10 07:42:49 crc kubenswrapper[4825]: I0310 07:42:49.826316 4825 generic.go:334] "Generic (PLEG): container finished" podID="f0b31d1b-a64c-4ac6-9054-85bfb18fc022" containerID="4a9be6f02040ea2826eafa73b634eeda6abd79d5c755063719dd845c4fcf6b33" exitCode=0 Mar 10 07:42:49 crc kubenswrapper[4825]: I0310 07:42:49.826390 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p9w5f" event={"ID":"f0b31d1b-a64c-4ac6-9054-85bfb18fc022","Type":"ContainerDied","Data":"4a9be6f02040ea2826eafa73b634eeda6abd79d5c755063719dd845c4fcf6b33"} Mar 10 07:42:50 crc kubenswrapper[4825]: I0310 07:42:50.837232 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p9w5f" 
event={"ID":"f0b31d1b-a64c-4ac6-9054-85bfb18fc022","Type":"ContainerStarted","Data":"096bc9d31fc087ba234d51b06d1f2678323503c28c08fe559f66404b9799ba49"} Mar 10 07:42:50 crc kubenswrapper[4825]: I0310 07:42:50.863502 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-p9w5f" podStartSLOduration=2.4038503589999998 podStartE2EDuration="4.863478327s" podCreationTimestamp="2026-03-10 07:42:46 +0000 UTC" firstStartedPulling="2026-03-10 07:42:47.81073717 +0000 UTC m=+3520.840517785" lastFinishedPulling="2026-03-10 07:42:50.270365108 +0000 UTC m=+3523.300145753" observedRunningTime="2026-03-10 07:42:50.856222767 +0000 UTC m=+3523.886003392" watchObservedRunningTime="2026-03-10 07:42:50.863478327 +0000 UTC m=+3523.893258982" Mar 10 07:42:56 crc kubenswrapper[4825]: I0310 07:42:56.543858 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-p9w5f" Mar 10 07:42:56 crc kubenswrapper[4825]: I0310 07:42:56.544226 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-p9w5f" Mar 10 07:42:56 crc kubenswrapper[4825]: I0310 07:42:56.593737 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-p9w5f" Mar 10 07:42:56 crc kubenswrapper[4825]: I0310 07:42:56.966919 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-p9w5f" Mar 10 07:42:57 crc kubenswrapper[4825]: I0310 07:42:57.031695 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p9w5f"] Mar 10 07:42:57 crc kubenswrapper[4825]: I0310 07:42:57.237311 4825 scope.go:117] "RemoveContainer" containerID="ba48065b56abc01a0a946c6743e29fc19acf3581c643245f501b00640a86d953" Mar 10 07:42:57 crc kubenswrapper[4825]: I0310 07:42:57.907785 4825 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerStarted","Data":"95a5e83fff4d9b64dcc73b9320d1e9fe7e149f0c72cfd7655a687da74d8eabda"} Mar 10 07:42:58 crc kubenswrapper[4825]: I0310 07:42:58.916696 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-p9w5f" podUID="f0b31d1b-a64c-4ac6-9054-85bfb18fc022" containerName="registry-server" containerID="cri-o://096bc9d31fc087ba234d51b06d1f2678323503c28c08fe559f66404b9799ba49" gracePeriod=2 Mar 10 07:42:59 crc kubenswrapper[4825]: I0310 07:42:59.372943 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p9w5f" Mar 10 07:42:59 crc kubenswrapper[4825]: I0310 07:42:59.436016 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0b31d1b-a64c-4ac6-9054-85bfb18fc022-catalog-content\") pod \"f0b31d1b-a64c-4ac6-9054-85bfb18fc022\" (UID: \"f0b31d1b-a64c-4ac6-9054-85bfb18fc022\") " Mar 10 07:42:59 crc kubenswrapper[4825]: I0310 07:42:59.436115 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czm9t\" (UniqueName: \"kubernetes.io/projected/f0b31d1b-a64c-4ac6-9054-85bfb18fc022-kube-api-access-czm9t\") pod \"f0b31d1b-a64c-4ac6-9054-85bfb18fc022\" (UID: \"f0b31d1b-a64c-4ac6-9054-85bfb18fc022\") " Mar 10 07:42:59 crc kubenswrapper[4825]: I0310 07:42:59.436171 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0b31d1b-a64c-4ac6-9054-85bfb18fc022-utilities\") pod \"f0b31d1b-a64c-4ac6-9054-85bfb18fc022\" (UID: \"f0b31d1b-a64c-4ac6-9054-85bfb18fc022\") " Mar 10 07:42:59 crc kubenswrapper[4825]: I0310 07:42:59.437189 4825 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0b31d1b-a64c-4ac6-9054-85bfb18fc022-utilities" (OuterVolumeSpecName: "utilities") pod "f0b31d1b-a64c-4ac6-9054-85bfb18fc022" (UID: "f0b31d1b-a64c-4ac6-9054-85bfb18fc022"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:42:59 crc kubenswrapper[4825]: I0310 07:42:59.442680 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0b31d1b-a64c-4ac6-9054-85bfb18fc022-kube-api-access-czm9t" (OuterVolumeSpecName: "kube-api-access-czm9t") pod "f0b31d1b-a64c-4ac6-9054-85bfb18fc022" (UID: "f0b31d1b-a64c-4ac6-9054-85bfb18fc022"). InnerVolumeSpecName "kube-api-access-czm9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:42:59 crc kubenswrapper[4825]: I0310 07:42:59.536963 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czm9t\" (UniqueName: \"kubernetes.io/projected/f0b31d1b-a64c-4ac6-9054-85bfb18fc022-kube-api-access-czm9t\") on node \"crc\" DevicePath \"\"" Mar 10 07:42:59 crc kubenswrapper[4825]: I0310 07:42:59.537011 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0b31d1b-a64c-4ac6-9054-85bfb18fc022-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 07:42:59 crc kubenswrapper[4825]: I0310 07:42:59.625727 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0b31d1b-a64c-4ac6-9054-85bfb18fc022-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f0b31d1b-a64c-4ac6-9054-85bfb18fc022" (UID: "f0b31d1b-a64c-4ac6-9054-85bfb18fc022"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:42:59 crc kubenswrapper[4825]: I0310 07:42:59.638031 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0b31d1b-a64c-4ac6-9054-85bfb18fc022-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 07:42:59 crc kubenswrapper[4825]: I0310 07:42:59.927694 4825 generic.go:334] "Generic (PLEG): container finished" podID="f0b31d1b-a64c-4ac6-9054-85bfb18fc022" containerID="096bc9d31fc087ba234d51b06d1f2678323503c28c08fe559f66404b9799ba49" exitCode=0 Mar 10 07:42:59 crc kubenswrapper[4825]: I0310 07:42:59.927757 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p9w5f" event={"ID":"f0b31d1b-a64c-4ac6-9054-85bfb18fc022","Type":"ContainerDied","Data":"096bc9d31fc087ba234d51b06d1f2678323503c28c08fe559f66404b9799ba49"} Mar 10 07:42:59 crc kubenswrapper[4825]: I0310 07:42:59.927797 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p9w5f" event={"ID":"f0b31d1b-a64c-4ac6-9054-85bfb18fc022","Type":"ContainerDied","Data":"1d9c54796e86e0c1f33c10f561490764ce10093bf4f3d44609eb2e56d6e7416d"} Mar 10 07:42:59 crc kubenswrapper[4825]: I0310 07:42:59.927826 4825 scope.go:117] "RemoveContainer" containerID="096bc9d31fc087ba234d51b06d1f2678323503c28c08fe559f66404b9799ba49" Mar 10 07:42:59 crc kubenswrapper[4825]: I0310 07:42:59.928041 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p9w5f" Mar 10 07:42:59 crc kubenswrapper[4825]: I0310 07:42:59.962728 4825 scope.go:117] "RemoveContainer" containerID="4a9be6f02040ea2826eafa73b634eeda6abd79d5c755063719dd845c4fcf6b33" Mar 10 07:42:59 crc kubenswrapper[4825]: I0310 07:42:59.975012 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p9w5f"] Mar 10 07:42:59 crc kubenswrapper[4825]: I0310 07:42:59.984983 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-p9w5f"] Mar 10 07:42:59 crc kubenswrapper[4825]: I0310 07:42:59.994619 4825 scope.go:117] "RemoveContainer" containerID="cd75b0421e543ca6a133f62f36b0f8fece91330ce9edd012088da8ca778d4567" Mar 10 07:43:00 crc kubenswrapper[4825]: I0310 07:43:00.021577 4825 scope.go:117] "RemoveContainer" containerID="096bc9d31fc087ba234d51b06d1f2678323503c28c08fe559f66404b9799ba49" Mar 10 07:43:00 crc kubenswrapper[4825]: E0310 07:43:00.022090 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"096bc9d31fc087ba234d51b06d1f2678323503c28c08fe559f66404b9799ba49\": container with ID starting with 096bc9d31fc087ba234d51b06d1f2678323503c28c08fe559f66404b9799ba49 not found: ID does not exist" containerID="096bc9d31fc087ba234d51b06d1f2678323503c28c08fe559f66404b9799ba49" Mar 10 07:43:00 crc kubenswrapper[4825]: I0310 07:43:00.022151 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"096bc9d31fc087ba234d51b06d1f2678323503c28c08fe559f66404b9799ba49"} err="failed to get container status \"096bc9d31fc087ba234d51b06d1f2678323503c28c08fe559f66404b9799ba49\": rpc error: code = NotFound desc = could not find container \"096bc9d31fc087ba234d51b06d1f2678323503c28c08fe559f66404b9799ba49\": container with ID starting with 096bc9d31fc087ba234d51b06d1f2678323503c28c08fe559f66404b9799ba49 not 
found: ID does not exist" Mar 10 07:43:00 crc kubenswrapper[4825]: I0310 07:43:00.022179 4825 scope.go:117] "RemoveContainer" containerID="4a9be6f02040ea2826eafa73b634eeda6abd79d5c755063719dd845c4fcf6b33" Mar 10 07:43:00 crc kubenswrapper[4825]: E0310 07:43:00.022494 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a9be6f02040ea2826eafa73b634eeda6abd79d5c755063719dd845c4fcf6b33\": container with ID starting with 4a9be6f02040ea2826eafa73b634eeda6abd79d5c755063719dd845c4fcf6b33 not found: ID does not exist" containerID="4a9be6f02040ea2826eafa73b634eeda6abd79d5c755063719dd845c4fcf6b33" Mar 10 07:43:00 crc kubenswrapper[4825]: I0310 07:43:00.022521 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a9be6f02040ea2826eafa73b634eeda6abd79d5c755063719dd845c4fcf6b33"} err="failed to get container status \"4a9be6f02040ea2826eafa73b634eeda6abd79d5c755063719dd845c4fcf6b33\": rpc error: code = NotFound desc = could not find container \"4a9be6f02040ea2826eafa73b634eeda6abd79d5c755063719dd845c4fcf6b33\": container with ID starting with 4a9be6f02040ea2826eafa73b634eeda6abd79d5c755063719dd845c4fcf6b33 not found: ID does not exist" Mar 10 07:43:00 crc kubenswrapper[4825]: I0310 07:43:00.022535 4825 scope.go:117] "RemoveContainer" containerID="cd75b0421e543ca6a133f62f36b0f8fece91330ce9edd012088da8ca778d4567" Mar 10 07:43:00 crc kubenswrapper[4825]: E0310 07:43:00.022768 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd75b0421e543ca6a133f62f36b0f8fece91330ce9edd012088da8ca778d4567\": container with ID starting with cd75b0421e543ca6a133f62f36b0f8fece91330ce9edd012088da8ca778d4567 not found: ID does not exist" containerID="cd75b0421e543ca6a133f62f36b0f8fece91330ce9edd012088da8ca778d4567" Mar 10 07:43:00 crc kubenswrapper[4825]: I0310 07:43:00.022794 4825 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd75b0421e543ca6a133f62f36b0f8fece91330ce9edd012088da8ca778d4567"} err="failed to get container status \"cd75b0421e543ca6a133f62f36b0f8fece91330ce9edd012088da8ca778d4567\": rpc error: code = NotFound desc = could not find container \"cd75b0421e543ca6a133f62f36b0f8fece91330ce9edd012088da8ca778d4567\": container with ID starting with cd75b0421e543ca6a133f62f36b0f8fece91330ce9edd012088da8ca778d4567 not found: ID does not exist" Mar 10 07:43:01 crc kubenswrapper[4825]: I0310 07:43:01.249792 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0b31d1b-a64c-4ac6-9054-85bfb18fc022" path="/var/lib/kubelet/pods/f0b31d1b-a64c-4ac6-9054-85bfb18fc022/volumes" Mar 10 07:44:00 crc kubenswrapper[4825]: I0310 07:44:00.151677 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552144-fz7k9"] Mar 10 07:44:00 crc kubenswrapper[4825]: E0310 07:44:00.152676 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0b31d1b-a64c-4ac6-9054-85bfb18fc022" containerName="registry-server" Mar 10 07:44:00 crc kubenswrapper[4825]: I0310 07:44:00.152702 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0b31d1b-a64c-4ac6-9054-85bfb18fc022" containerName="registry-server" Mar 10 07:44:00 crc kubenswrapper[4825]: E0310 07:44:00.152725 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0b31d1b-a64c-4ac6-9054-85bfb18fc022" containerName="extract-content" Mar 10 07:44:00 crc kubenswrapper[4825]: I0310 07:44:00.152735 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0b31d1b-a64c-4ac6-9054-85bfb18fc022" containerName="extract-content" Mar 10 07:44:00 crc kubenswrapper[4825]: E0310 07:44:00.152777 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0b31d1b-a64c-4ac6-9054-85bfb18fc022" containerName="extract-utilities" Mar 10 07:44:00 crc kubenswrapper[4825]: I0310 
07:44:00.152790 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0b31d1b-a64c-4ac6-9054-85bfb18fc022" containerName="extract-utilities" Mar 10 07:44:00 crc kubenswrapper[4825]: I0310 07:44:00.153047 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0b31d1b-a64c-4ac6-9054-85bfb18fc022" containerName="registry-server" Mar 10 07:44:00 crc kubenswrapper[4825]: I0310 07:44:00.153753 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552144-fz7k9" Mar 10 07:44:00 crc kubenswrapper[4825]: I0310 07:44:00.158640 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 07:44:00 crc kubenswrapper[4825]: I0310 07:44:00.160170 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 07:44:00 crc kubenswrapper[4825]: I0310 07:44:00.161114 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 07:44:00 crc kubenswrapper[4825]: I0310 07:44:00.167200 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552144-fz7k9"] Mar 10 07:44:00 crc kubenswrapper[4825]: I0310 07:44:00.284939 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwkjm\" (UniqueName: \"kubernetes.io/projected/fe5a61dc-26e9-406d-a533-896a826d1f53-kube-api-access-kwkjm\") pod \"auto-csr-approver-29552144-fz7k9\" (UID: \"fe5a61dc-26e9-406d-a533-896a826d1f53\") " pod="openshift-infra/auto-csr-approver-29552144-fz7k9" Mar 10 07:44:00 crc kubenswrapper[4825]: I0310 07:44:00.386356 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwkjm\" (UniqueName: \"kubernetes.io/projected/fe5a61dc-26e9-406d-a533-896a826d1f53-kube-api-access-kwkjm\") pod \"auto-csr-approver-29552144-fz7k9\" 
(UID: \"fe5a61dc-26e9-406d-a533-896a826d1f53\") " pod="openshift-infra/auto-csr-approver-29552144-fz7k9" Mar 10 07:44:00 crc kubenswrapper[4825]: I0310 07:44:00.426544 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwkjm\" (UniqueName: \"kubernetes.io/projected/fe5a61dc-26e9-406d-a533-896a826d1f53-kube-api-access-kwkjm\") pod \"auto-csr-approver-29552144-fz7k9\" (UID: \"fe5a61dc-26e9-406d-a533-896a826d1f53\") " pod="openshift-infra/auto-csr-approver-29552144-fz7k9" Mar 10 07:44:00 crc kubenswrapper[4825]: I0310 07:44:00.483603 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552144-fz7k9" Mar 10 07:44:00 crc kubenswrapper[4825]: I0310 07:44:00.938394 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552144-fz7k9"] Mar 10 07:44:01 crc kubenswrapper[4825]: I0310 07:44:01.487623 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552144-fz7k9" event={"ID":"fe5a61dc-26e9-406d-a533-896a826d1f53","Type":"ContainerStarted","Data":"8d34b709b6cde468f04dcd38e59b9ea343c3bc91b06da2178f8b8d5b21f262d9"} Mar 10 07:44:02 crc kubenswrapper[4825]: I0310 07:44:02.502938 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552144-fz7k9" event={"ID":"fe5a61dc-26e9-406d-a533-896a826d1f53","Type":"ContainerStarted","Data":"6e4ded3869d3a46487744d8766d1789eadaa514a0e7c4a412590b6ab0a5f41ab"} Mar 10 07:44:02 crc kubenswrapper[4825]: I0310 07:44:02.517555 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552144-fz7k9" podStartSLOduration=1.290165616 podStartE2EDuration="2.517536451s" podCreationTimestamp="2026-03-10 07:44:00 +0000 UTC" firstStartedPulling="2026-03-10 07:44:00.947108988 +0000 UTC m=+3593.976889623" lastFinishedPulling="2026-03-10 07:44:02.174479843 +0000 UTC 
m=+3595.204260458" observedRunningTime="2026-03-10 07:44:02.516767341 +0000 UTC m=+3595.546547956" watchObservedRunningTime="2026-03-10 07:44:02.517536451 +0000 UTC m=+3595.547317066" Mar 10 07:44:03 crc kubenswrapper[4825]: I0310 07:44:03.510882 4825 generic.go:334] "Generic (PLEG): container finished" podID="fe5a61dc-26e9-406d-a533-896a826d1f53" containerID="6e4ded3869d3a46487744d8766d1789eadaa514a0e7c4a412590b6ab0a5f41ab" exitCode=0 Mar 10 07:44:03 crc kubenswrapper[4825]: I0310 07:44:03.511072 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552144-fz7k9" event={"ID":"fe5a61dc-26e9-406d-a533-896a826d1f53","Type":"ContainerDied","Data":"6e4ded3869d3a46487744d8766d1789eadaa514a0e7c4a412590b6ab0a5f41ab"} Mar 10 07:44:04 crc kubenswrapper[4825]: I0310 07:44:04.884240 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552144-fz7k9" Mar 10 07:44:04 crc kubenswrapper[4825]: I0310 07:44:04.949751 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwkjm\" (UniqueName: \"kubernetes.io/projected/fe5a61dc-26e9-406d-a533-896a826d1f53-kube-api-access-kwkjm\") pod \"fe5a61dc-26e9-406d-a533-896a826d1f53\" (UID: \"fe5a61dc-26e9-406d-a533-896a826d1f53\") " Mar 10 07:44:04 crc kubenswrapper[4825]: I0310 07:44:04.955712 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe5a61dc-26e9-406d-a533-896a826d1f53-kube-api-access-kwkjm" (OuterVolumeSpecName: "kube-api-access-kwkjm") pod "fe5a61dc-26e9-406d-a533-896a826d1f53" (UID: "fe5a61dc-26e9-406d-a533-896a826d1f53"). InnerVolumeSpecName "kube-api-access-kwkjm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:44:05 crc kubenswrapper[4825]: I0310 07:44:05.051318 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwkjm\" (UniqueName: \"kubernetes.io/projected/fe5a61dc-26e9-406d-a533-896a826d1f53-kube-api-access-kwkjm\") on node \"crc\" DevicePath \"\"" Mar 10 07:44:05 crc kubenswrapper[4825]: I0310 07:44:05.531041 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552144-fz7k9" event={"ID":"fe5a61dc-26e9-406d-a533-896a826d1f53","Type":"ContainerDied","Data":"8d34b709b6cde468f04dcd38e59b9ea343c3bc91b06da2178f8b8d5b21f262d9"} Mar 10 07:44:05 crc kubenswrapper[4825]: I0310 07:44:05.531083 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d34b709b6cde468f04dcd38e59b9ea343c3bc91b06da2178f8b8d5b21f262d9" Mar 10 07:44:05 crc kubenswrapper[4825]: I0310 07:44:05.531100 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552144-fz7k9" Mar 10 07:44:05 crc kubenswrapper[4825]: I0310 07:44:05.584760 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552138-gm4zd"] Mar 10 07:44:05 crc kubenswrapper[4825]: I0310 07:44:05.590096 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552138-gm4zd"] Mar 10 07:44:07 crc kubenswrapper[4825]: I0310 07:44:07.246573 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bdd63ee-cc99-4324-bfce-d871b230eaa6" path="/var/lib/kubelet/pods/8bdd63ee-cc99-4324-bfce-d871b230eaa6/volumes" Mar 10 07:44:39 crc kubenswrapper[4825]: I0310 07:44:39.154783 4825 scope.go:117] "RemoveContainer" containerID="bd76e78d7a6c0c3caa0ec1661fb89634c7c66ffae269f3adf8a8ccbd0c037e40" Mar 10 07:45:00 crc kubenswrapper[4825]: I0310 07:45:00.157585 4825 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29552145-pm7sj"] Mar 10 07:45:00 crc kubenswrapper[4825]: E0310 07:45:00.158515 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe5a61dc-26e9-406d-a533-896a826d1f53" containerName="oc" Mar 10 07:45:00 crc kubenswrapper[4825]: I0310 07:45:00.158532 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe5a61dc-26e9-406d-a533-896a826d1f53" containerName="oc" Mar 10 07:45:00 crc kubenswrapper[4825]: I0310 07:45:00.158709 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe5a61dc-26e9-406d-a533-896a826d1f53" containerName="oc" Mar 10 07:45:00 crc kubenswrapper[4825]: I0310 07:45:00.159281 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552145-pm7sj" Mar 10 07:45:00 crc kubenswrapper[4825]: I0310 07:45:00.163114 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 07:45:00 crc kubenswrapper[4825]: I0310 07:45:00.163600 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 07:45:00 crc kubenswrapper[4825]: I0310 07:45:00.175699 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552145-pm7sj"] Mar 10 07:45:00 crc kubenswrapper[4825]: I0310 07:45:00.219407 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e10f0b6f-4e14-4510-aac9-936b916abda5-config-volume\") pod \"collect-profiles-29552145-pm7sj\" (UID: \"e10f0b6f-4e14-4510-aac9-936b916abda5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552145-pm7sj" Mar 10 07:45:00 crc kubenswrapper[4825]: I0310 07:45:00.219537 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hggjp\" (UniqueName: \"kubernetes.io/projected/e10f0b6f-4e14-4510-aac9-936b916abda5-kube-api-access-hggjp\") pod \"collect-profiles-29552145-pm7sj\" (UID: \"e10f0b6f-4e14-4510-aac9-936b916abda5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552145-pm7sj" Mar 10 07:45:00 crc kubenswrapper[4825]: I0310 07:45:00.219600 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e10f0b6f-4e14-4510-aac9-936b916abda5-secret-volume\") pod \"collect-profiles-29552145-pm7sj\" (UID: \"e10f0b6f-4e14-4510-aac9-936b916abda5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552145-pm7sj" Mar 10 07:45:00 crc kubenswrapper[4825]: I0310 07:45:00.320983 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hggjp\" (UniqueName: \"kubernetes.io/projected/e10f0b6f-4e14-4510-aac9-936b916abda5-kube-api-access-hggjp\") pod \"collect-profiles-29552145-pm7sj\" (UID: \"e10f0b6f-4e14-4510-aac9-936b916abda5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552145-pm7sj" Mar 10 07:45:00 crc kubenswrapper[4825]: I0310 07:45:00.321419 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e10f0b6f-4e14-4510-aac9-936b916abda5-secret-volume\") pod \"collect-profiles-29552145-pm7sj\" (UID: \"e10f0b6f-4e14-4510-aac9-936b916abda5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552145-pm7sj" Mar 10 07:45:00 crc kubenswrapper[4825]: I0310 07:45:00.321756 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e10f0b6f-4e14-4510-aac9-936b916abda5-config-volume\") pod \"collect-profiles-29552145-pm7sj\" (UID: \"e10f0b6f-4e14-4510-aac9-936b916abda5\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29552145-pm7sj" Mar 10 07:45:00 crc kubenswrapper[4825]: I0310 07:45:00.323401 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e10f0b6f-4e14-4510-aac9-936b916abda5-config-volume\") pod \"collect-profiles-29552145-pm7sj\" (UID: \"e10f0b6f-4e14-4510-aac9-936b916abda5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552145-pm7sj" Mar 10 07:45:00 crc kubenswrapper[4825]: I0310 07:45:00.330242 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e10f0b6f-4e14-4510-aac9-936b916abda5-secret-volume\") pod \"collect-profiles-29552145-pm7sj\" (UID: \"e10f0b6f-4e14-4510-aac9-936b916abda5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552145-pm7sj" Mar 10 07:45:00 crc kubenswrapper[4825]: I0310 07:45:00.351624 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hggjp\" (UniqueName: \"kubernetes.io/projected/e10f0b6f-4e14-4510-aac9-936b916abda5-kube-api-access-hggjp\") pod \"collect-profiles-29552145-pm7sj\" (UID: \"e10f0b6f-4e14-4510-aac9-936b916abda5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552145-pm7sj" Mar 10 07:45:00 crc kubenswrapper[4825]: I0310 07:45:00.488659 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552145-pm7sj" Mar 10 07:45:00 crc kubenswrapper[4825]: I0310 07:45:00.948162 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552145-pm7sj"] Mar 10 07:45:00 crc kubenswrapper[4825]: I0310 07:45:00.961578 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552145-pm7sj" event={"ID":"e10f0b6f-4e14-4510-aac9-936b916abda5","Type":"ContainerStarted","Data":"dce80917f33065777020084c8c7f17563ea76649471bd4b21bf13392a8072db4"} Mar 10 07:45:01 crc kubenswrapper[4825]: I0310 07:45:01.971652 4825 generic.go:334] "Generic (PLEG): container finished" podID="e10f0b6f-4e14-4510-aac9-936b916abda5" containerID="282fd61201a2bb2509a065ff2805533500df406771d4f343ac95615c7c66cf18" exitCode=0 Mar 10 07:45:01 crc kubenswrapper[4825]: I0310 07:45:01.971746 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552145-pm7sj" event={"ID":"e10f0b6f-4e14-4510-aac9-936b916abda5","Type":"ContainerDied","Data":"282fd61201a2bb2509a065ff2805533500df406771d4f343ac95615c7c66cf18"} Mar 10 07:45:03 crc kubenswrapper[4825]: I0310 07:45:03.301865 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552145-pm7sj" Mar 10 07:45:03 crc kubenswrapper[4825]: I0310 07:45:03.364737 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e10f0b6f-4e14-4510-aac9-936b916abda5-config-volume\") pod \"e10f0b6f-4e14-4510-aac9-936b916abda5\" (UID: \"e10f0b6f-4e14-4510-aac9-936b916abda5\") " Mar 10 07:45:03 crc kubenswrapper[4825]: I0310 07:45:03.364793 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hggjp\" (UniqueName: \"kubernetes.io/projected/e10f0b6f-4e14-4510-aac9-936b916abda5-kube-api-access-hggjp\") pod \"e10f0b6f-4e14-4510-aac9-936b916abda5\" (UID: \"e10f0b6f-4e14-4510-aac9-936b916abda5\") " Mar 10 07:45:03 crc kubenswrapper[4825]: I0310 07:45:03.364858 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e10f0b6f-4e14-4510-aac9-936b916abda5-secret-volume\") pod \"e10f0b6f-4e14-4510-aac9-936b916abda5\" (UID: \"e10f0b6f-4e14-4510-aac9-936b916abda5\") " Mar 10 07:45:03 crc kubenswrapper[4825]: I0310 07:45:03.365347 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e10f0b6f-4e14-4510-aac9-936b916abda5-config-volume" (OuterVolumeSpecName: "config-volume") pod "e10f0b6f-4e14-4510-aac9-936b916abda5" (UID: "e10f0b6f-4e14-4510-aac9-936b916abda5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 07:45:03 crc kubenswrapper[4825]: I0310 07:45:03.370081 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e10f0b6f-4e14-4510-aac9-936b916abda5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e10f0b6f-4e14-4510-aac9-936b916abda5" (UID: "e10f0b6f-4e14-4510-aac9-936b916abda5"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 07:45:03 crc kubenswrapper[4825]: I0310 07:45:03.370476 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e10f0b6f-4e14-4510-aac9-936b916abda5-kube-api-access-hggjp" (OuterVolumeSpecName: "kube-api-access-hggjp") pod "e10f0b6f-4e14-4510-aac9-936b916abda5" (UID: "e10f0b6f-4e14-4510-aac9-936b916abda5"). InnerVolumeSpecName "kube-api-access-hggjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:45:03 crc kubenswrapper[4825]: I0310 07:45:03.466770 4825 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e10f0b6f-4e14-4510-aac9-936b916abda5-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 07:45:03 crc kubenswrapper[4825]: I0310 07:45:03.466805 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hggjp\" (UniqueName: \"kubernetes.io/projected/e10f0b6f-4e14-4510-aac9-936b916abda5-kube-api-access-hggjp\") on node \"crc\" DevicePath \"\"" Mar 10 07:45:03 crc kubenswrapper[4825]: I0310 07:45:03.466818 4825 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e10f0b6f-4e14-4510-aac9-936b916abda5-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 07:45:04 crc kubenswrapper[4825]: I0310 07:45:04.000381 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552145-pm7sj" event={"ID":"e10f0b6f-4e14-4510-aac9-936b916abda5","Type":"ContainerDied","Data":"dce80917f33065777020084c8c7f17563ea76649471bd4b21bf13392a8072db4"} Mar 10 07:45:04 crc kubenswrapper[4825]: I0310 07:45:04.000454 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dce80917f33065777020084c8c7f17563ea76649471bd4b21bf13392a8072db4" Mar 10 07:45:04 crc kubenswrapper[4825]: I0310 07:45:04.000536 4825 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552145-pm7sj" Mar 10 07:45:04 crc kubenswrapper[4825]: I0310 07:45:04.386714 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552100-5wl6j"] Mar 10 07:45:04 crc kubenswrapper[4825]: I0310 07:45:04.395376 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552100-5wl6j"] Mar 10 07:45:05 crc kubenswrapper[4825]: I0310 07:45:05.251801 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0f2bc77-d38f-42b6-a789-38843e17bcbb" path="/var/lib/kubelet/pods/b0f2bc77-d38f-42b6-a789-38843e17bcbb/volumes" Mar 10 07:45:16 crc kubenswrapper[4825]: I0310 07:45:16.888485 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 07:45:16 crc kubenswrapper[4825]: I0310 07:45:16.889254 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 07:45:39 crc kubenswrapper[4825]: I0310 07:45:39.229177 4825 scope.go:117] "RemoveContainer" containerID="abffb41e22121e7640bd4bbc82fb8e4c99dee0b7b8cf28bc6b3407c6f194645f" Mar 10 07:45:46 crc kubenswrapper[4825]: I0310 07:45:46.887999 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 10 07:45:46 crc kubenswrapper[4825]: I0310 07:45:46.888561 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 07:46:00 crc kubenswrapper[4825]: I0310 07:46:00.150389 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552146-rppwt"] Mar 10 07:46:00 crc kubenswrapper[4825]: E0310 07:46:00.152724 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e10f0b6f-4e14-4510-aac9-936b916abda5" containerName="collect-profiles" Mar 10 07:46:00 crc kubenswrapper[4825]: I0310 07:46:00.152852 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="e10f0b6f-4e14-4510-aac9-936b916abda5" containerName="collect-profiles" Mar 10 07:46:00 crc kubenswrapper[4825]: I0310 07:46:00.153122 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="e10f0b6f-4e14-4510-aac9-936b916abda5" containerName="collect-profiles" Mar 10 07:46:00 crc kubenswrapper[4825]: I0310 07:46:00.153905 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552146-rppwt" Mar 10 07:46:00 crc kubenswrapper[4825]: I0310 07:46:00.159390 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 07:46:00 crc kubenswrapper[4825]: I0310 07:46:00.159791 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 07:46:00 crc kubenswrapper[4825]: I0310 07:46:00.159910 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 07:46:00 crc kubenswrapper[4825]: I0310 07:46:00.162970 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552146-rppwt"] Mar 10 07:46:00 crc kubenswrapper[4825]: I0310 07:46:00.240293 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csbzm\" (UniqueName: \"kubernetes.io/projected/9a2b44ed-4a43-4034-b298-779f0e562e1c-kube-api-access-csbzm\") pod \"auto-csr-approver-29552146-rppwt\" (UID: \"9a2b44ed-4a43-4034-b298-779f0e562e1c\") " pod="openshift-infra/auto-csr-approver-29552146-rppwt" Mar 10 07:46:00 crc kubenswrapper[4825]: I0310 07:46:00.342517 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csbzm\" (UniqueName: \"kubernetes.io/projected/9a2b44ed-4a43-4034-b298-779f0e562e1c-kube-api-access-csbzm\") pod \"auto-csr-approver-29552146-rppwt\" (UID: \"9a2b44ed-4a43-4034-b298-779f0e562e1c\") " pod="openshift-infra/auto-csr-approver-29552146-rppwt" Mar 10 07:46:00 crc kubenswrapper[4825]: I0310 07:46:00.378171 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csbzm\" (UniqueName: \"kubernetes.io/projected/9a2b44ed-4a43-4034-b298-779f0e562e1c-kube-api-access-csbzm\") pod \"auto-csr-approver-29552146-rppwt\" (UID: \"9a2b44ed-4a43-4034-b298-779f0e562e1c\") " 
pod="openshift-infra/auto-csr-approver-29552146-rppwt" Mar 10 07:46:00 crc kubenswrapper[4825]: I0310 07:46:00.482922 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552146-rppwt" Mar 10 07:46:00 crc kubenswrapper[4825]: I0310 07:46:00.998788 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552146-rppwt"] Mar 10 07:46:01 crc kubenswrapper[4825]: I0310 07:46:01.011564 4825 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 07:46:01 crc kubenswrapper[4825]: I0310 07:46:01.540343 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552146-rppwt" event={"ID":"9a2b44ed-4a43-4034-b298-779f0e562e1c","Type":"ContainerStarted","Data":"f4f4b03ab1bf98dbe91b5af58e743d625e02e5b0a2d6c1465c40ba27ddd34e68"} Mar 10 07:46:02 crc kubenswrapper[4825]: I0310 07:46:02.550873 4825 generic.go:334] "Generic (PLEG): container finished" podID="9a2b44ed-4a43-4034-b298-779f0e562e1c" containerID="e932489589c98e57f911429181513fbe004a30a09427a4726b7c604da54e6076" exitCode=0 Mar 10 07:46:02 crc kubenswrapper[4825]: I0310 07:46:02.550969 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552146-rppwt" event={"ID":"9a2b44ed-4a43-4034-b298-779f0e562e1c","Type":"ContainerDied","Data":"e932489589c98e57f911429181513fbe004a30a09427a4726b7c604da54e6076"} Mar 10 07:46:04 crc kubenswrapper[4825]: I0310 07:46:04.009053 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552146-rppwt" Mar 10 07:46:04 crc kubenswrapper[4825]: I0310 07:46:04.097487 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csbzm\" (UniqueName: \"kubernetes.io/projected/9a2b44ed-4a43-4034-b298-779f0e562e1c-kube-api-access-csbzm\") pod \"9a2b44ed-4a43-4034-b298-779f0e562e1c\" (UID: \"9a2b44ed-4a43-4034-b298-779f0e562e1c\") " Mar 10 07:46:04 crc kubenswrapper[4825]: I0310 07:46:04.104230 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a2b44ed-4a43-4034-b298-779f0e562e1c-kube-api-access-csbzm" (OuterVolumeSpecName: "kube-api-access-csbzm") pod "9a2b44ed-4a43-4034-b298-779f0e562e1c" (UID: "9a2b44ed-4a43-4034-b298-779f0e562e1c"). InnerVolumeSpecName "kube-api-access-csbzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:46:04 crc kubenswrapper[4825]: I0310 07:46:04.199874 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csbzm\" (UniqueName: \"kubernetes.io/projected/9a2b44ed-4a43-4034-b298-779f0e562e1c-kube-api-access-csbzm\") on node \"crc\" DevicePath \"\"" Mar 10 07:46:04 crc kubenswrapper[4825]: I0310 07:46:04.574171 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552146-rppwt" event={"ID":"9a2b44ed-4a43-4034-b298-779f0e562e1c","Type":"ContainerDied","Data":"f4f4b03ab1bf98dbe91b5af58e743d625e02e5b0a2d6c1465c40ba27ddd34e68"} Mar 10 07:46:04 crc kubenswrapper[4825]: I0310 07:46:04.574236 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552146-rppwt" Mar 10 07:46:04 crc kubenswrapper[4825]: I0310 07:46:04.574239 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4f4b03ab1bf98dbe91b5af58e743d625e02e5b0a2d6c1465c40ba27ddd34e68" Mar 10 07:46:05 crc kubenswrapper[4825]: I0310 07:46:05.132235 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552140-x57xg"] Mar 10 07:46:05 crc kubenswrapper[4825]: I0310 07:46:05.142769 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552140-x57xg"] Mar 10 07:46:05 crc kubenswrapper[4825]: I0310 07:46:05.247349 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb168114-ca6b-4a6e-857d-17b7e09c5d6b" path="/var/lib/kubelet/pods/eb168114-ca6b-4a6e-857d-17b7e09c5d6b/volumes" Mar 10 07:46:16 crc kubenswrapper[4825]: I0310 07:46:16.888512 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 07:46:16 crc kubenswrapper[4825]: I0310 07:46:16.889242 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 07:46:16 crc kubenswrapper[4825]: I0310 07:46:16.889319 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" Mar 10 07:46:16 crc kubenswrapper[4825]: I0310 07:46:16.890360 4825 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"95a5e83fff4d9b64dcc73b9320d1e9fe7e149f0c72cfd7655a687da74d8eabda"} pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 07:46:16 crc kubenswrapper[4825]: I0310 07:46:16.890485 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" containerID="cri-o://95a5e83fff4d9b64dcc73b9320d1e9fe7e149f0c72cfd7655a687da74d8eabda" gracePeriod=600 Mar 10 07:46:17 crc kubenswrapper[4825]: I0310 07:46:17.721352 4825 generic.go:334] "Generic (PLEG): container finished" podID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerID="95a5e83fff4d9b64dcc73b9320d1e9fe7e149f0c72cfd7655a687da74d8eabda" exitCode=0 Mar 10 07:46:17 crc kubenswrapper[4825]: I0310 07:46:17.721387 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerDied","Data":"95a5e83fff4d9b64dcc73b9320d1e9fe7e149f0c72cfd7655a687da74d8eabda"} Mar 10 07:46:17 crc kubenswrapper[4825]: I0310 07:46:17.721734 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerStarted","Data":"4441b6748db08977859f541e3c9aeb39a944a681030d88e33129a9cd77990e0e"} Mar 10 07:46:17 crc kubenswrapper[4825]: I0310 07:46:17.721759 4825 scope.go:117] "RemoveContainer" containerID="ba48065b56abc01a0a946c6743e29fc19acf3581c643245f501b00640a86d953" Mar 10 07:46:31 crc kubenswrapper[4825]: I0310 07:46:31.725998 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-657pl"] Mar 10 07:46:31 crc 
kubenswrapper[4825]: E0310 07:46:31.726827 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a2b44ed-4a43-4034-b298-779f0e562e1c" containerName="oc" Mar 10 07:46:31 crc kubenswrapper[4825]: I0310 07:46:31.726841 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a2b44ed-4a43-4034-b298-779f0e562e1c" containerName="oc" Mar 10 07:46:31 crc kubenswrapper[4825]: I0310 07:46:31.727024 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a2b44ed-4a43-4034-b298-779f0e562e1c" containerName="oc" Mar 10 07:46:31 crc kubenswrapper[4825]: I0310 07:46:31.728170 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-657pl" Mar 10 07:46:31 crc kubenswrapper[4825]: I0310 07:46:31.744898 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-657pl"] Mar 10 07:46:31 crc kubenswrapper[4825]: I0310 07:46:31.852842 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/063494e6-c102-4dcd-a4f1-573f83150304-utilities\") pod \"redhat-marketplace-657pl\" (UID: \"063494e6-c102-4dcd-a4f1-573f83150304\") " pod="openshift-marketplace/redhat-marketplace-657pl" Mar 10 07:46:31 crc kubenswrapper[4825]: I0310 07:46:31.852951 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/063494e6-c102-4dcd-a4f1-573f83150304-catalog-content\") pod \"redhat-marketplace-657pl\" (UID: \"063494e6-c102-4dcd-a4f1-573f83150304\") " pod="openshift-marketplace/redhat-marketplace-657pl" Mar 10 07:46:31 crc kubenswrapper[4825]: I0310 07:46:31.853015 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bx28\" (UniqueName: 
\"kubernetes.io/projected/063494e6-c102-4dcd-a4f1-573f83150304-kube-api-access-8bx28\") pod \"redhat-marketplace-657pl\" (UID: \"063494e6-c102-4dcd-a4f1-573f83150304\") " pod="openshift-marketplace/redhat-marketplace-657pl" Mar 10 07:46:31 crc kubenswrapper[4825]: I0310 07:46:31.954410 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/063494e6-c102-4dcd-a4f1-573f83150304-utilities\") pod \"redhat-marketplace-657pl\" (UID: \"063494e6-c102-4dcd-a4f1-573f83150304\") " pod="openshift-marketplace/redhat-marketplace-657pl" Mar 10 07:46:31 crc kubenswrapper[4825]: I0310 07:46:31.954491 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/063494e6-c102-4dcd-a4f1-573f83150304-catalog-content\") pod \"redhat-marketplace-657pl\" (UID: \"063494e6-c102-4dcd-a4f1-573f83150304\") " pod="openshift-marketplace/redhat-marketplace-657pl" Mar 10 07:46:31 crc kubenswrapper[4825]: I0310 07:46:31.954958 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/063494e6-c102-4dcd-a4f1-573f83150304-utilities\") pod \"redhat-marketplace-657pl\" (UID: \"063494e6-c102-4dcd-a4f1-573f83150304\") " pod="openshift-marketplace/redhat-marketplace-657pl" Mar 10 07:46:31 crc kubenswrapper[4825]: I0310 07:46:31.955052 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/063494e6-c102-4dcd-a4f1-573f83150304-catalog-content\") pod \"redhat-marketplace-657pl\" (UID: \"063494e6-c102-4dcd-a4f1-573f83150304\") " pod="openshift-marketplace/redhat-marketplace-657pl" Mar 10 07:46:31 crc kubenswrapper[4825]: I0310 07:46:31.955211 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bx28\" (UniqueName: 
\"kubernetes.io/projected/063494e6-c102-4dcd-a4f1-573f83150304-kube-api-access-8bx28\") pod \"redhat-marketplace-657pl\" (UID: \"063494e6-c102-4dcd-a4f1-573f83150304\") " pod="openshift-marketplace/redhat-marketplace-657pl" Mar 10 07:46:31 crc kubenswrapper[4825]: I0310 07:46:31.988637 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bx28\" (UniqueName: \"kubernetes.io/projected/063494e6-c102-4dcd-a4f1-573f83150304-kube-api-access-8bx28\") pod \"redhat-marketplace-657pl\" (UID: \"063494e6-c102-4dcd-a4f1-573f83150304\") " pod="openshift-marketplace/redhat-marketplace-657pl" Mar 10 07:46:32 crc kubenswrapper[4825]: I0310 07:46:32.067072 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-657pl" Mar 10 07:46:32 crc kubenswrapper[4825]: I0310 07:46:32.339735 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-657pl"] Mar 10 07:46:32 crc kubenswrapper[4825]: W0310 07:46:32.340804 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod063494e6_c102_4dcd_a4f1_573f83150304.slice/crio-0c90e766be62f39789e824eb7baa9e28b7f4d2af79e57d032dbdff9ea227fd0e WatchSource:0}: Error finding container 0c90e766be62f39789e824eb7baa9e28b7f4d2af79e57d032dbdff9ea227fd0e: Status 404 returned error can't find the container with id 0c90e766be62f39789e824eb7baa9e28b7f4d2af79e57d032dbdff9ea227fd0e Mar 10 07:46:32 crc kubenswrapper[4825]: I0310 07:46:32.863946 4825 generic.go:334] "Generic (PLEG): container finished" podID="063494e6-c102-4dcd-a4f1-573f83150304" containerID="06b5993eedbf7371f676ee53b34ed08e64fdc1d8936032ce073b2370bf575878" exitCode=0 Mar 10 07:46:32 crc kubenswrapper[4825]: I0310 07:46:32.864048 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-657pl" 
event={"ID":"063494e6-c102-4dcd-a4f1-573f83150304","Type":"ContainerDied","Data":"06b5993eedbf7371f676ee53b34ed08e64fdc1d8936032ce073b2370bf575878"} Mar 10 07:46:32 crc kubenswrapper[4825]: I0310 07:46:32.864347 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-657pl" event={"ID":"063494e6-c102-4dcd-a4f1-573f83150304","Type":"ContainerStarted","Data":"0c90e766be62f39789e824eb7baa9e28b7f4d2af79e57d032dbdff9ea227fd0e"} Mar 10 07:46:34 crc kubenswrapper[4825]: I0310 07:46:34.905584 4825 generic.go:334] "Generic (PLEG): container finished" podID="063494e6-c102-4dcd-a4f1-573f83150304" containerID="afd49da66c5c696fbe0287b1cc340ad510d26db9ed11ab912d77fe094a612733" exitCode=0 Mar 10 07:46:34 crc kubenswrapper[4825]: I0310 07:46:34.905685 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-657pl" event={"ID":"063494e6-c102-4dcd-a4f1-573f83150304","Type":"ContainerDied","Data":"afd49da66c5c696fbe0287b1cc340ad510d26db9ed11ab912d77fe094a612733"} Mar 10 07:46:35 crc kubenswrapper[4825]: I0310 07:46:35.915771 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-657pl" event={"ID":"063494e6-c102-4dcd-a4f1-573f83150304","Type":"ContainerStarted","Data":"b6b79b274e6571f64e476754a9ef2ae4c6dcdb0ab831c5f3a9d8c61595739c22"} Mar 10 07:46:35 crc kubenswrapper[4825]: I0310 07:46:35.948614 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-657pl" podStartSLOduration=2.509040786 podStartE2EDuration="4.94859446s" podCreationTimestamp="2026-03-10 07:46:31 +0000 UTC" firstStartedPulling="2026-03-10 07:46:32.867206201 +0000 UTC m=+3745.896986856" lastFinishedPulling="2026-03-10 07:46:35.306759875 +0000 UTC m=+3748.336540530" observedRunningTime="2026-03-10 07:46:35.947631415 +0000 UTC m=+3748.977412080" watchObservedRunningTime="2026-03-10 07:46:35.94859446 +0000 UTC 
m=+3748.978375085" Mar 10 07:46:39 crc kubenswrapper[4825]: I0310 07:46:39.315740 4825 scope.go:117] "RemoveContainer" containerID="d140737207cd765431f303752c3e8bb3c18cfed42f29428677741395f913c8c1" Mar 10 07:46:42 crc kubenswrapper[4825]: I0310 07:46:42.068102 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-657pl" Mar 10 07:46:42 crc kubenswrapper[4825]: I0310 07:46:42.068215 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-657pl" Mar 10 07:46:42 crc kubenswrapper[4825]: I0310 07:46:42.141545 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-657pl" Mar 10 07:46:43 crc kubenswrapper[4825]: I0310 07:46:43.061205 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-657pl" Mar 10 07:46:43 crc kubenswrapper[4825]: I0310 07:46:43.130479 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-657pl"] Mar 10 07:46:44 crc kubenswrapper[4825]: I0310 07:46:44.992830 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-657pl" podUID="063494e6-c102-4dcd-a4f1-573f83150304" containerName="registry-server" containerID="cri-o://b6b79b274e6571f64e476754a9ef2ae4c6dcdb0ab831c5f3a9d8c61595739c22" gracePeriod=2 Mar 10 07:46:45 crc kubenswrapper[4825]: I0310 07:46:45.402165 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-657pl" Mar 10 07:46:45 crc kubenswrapper[4825]: I0310 07:46:45.508341 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bx28\" (UniqueName: \"kubernetes.io/projected/063494e6-c102-4dcd-a4f1-573f83150304-kube-api-access-8bx28\") pod \"063494e6-c102-4dcd-a4f1-573f83150304\" (UID: \"063494e6-c102-4dcd-a4f1-573f83150304\") " Mar 10 07:46:45 crc kubenswrapper[4825]: I0310 07:46:45.508498 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/063494e6-c102-4dcd-a4f1-573f83150304-utilities\") pod \"063494e6-c102-4dcd-a4f1-573f83150304\" (UID: \"063494e6-c102-4dcd-a4f1-573f83150304\") " Mar 10 07:46:45 crc kubenswrapper[4825]: I0310 07:46:45.508527 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/063494e6-c102-4dcd-a4f1-573f83150304-catalog-content\") pod \"063494e6-c102-4dcd-a4f1-573f83150304\" (UID: \"063494e6-c102-4dcd-a4f1-573f83150304\") " Mar 10 07:46:45 crc kubenswrapper[4825]: I0310 07:46:45.510010 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/063494e6-c102-4dcd-a4f1-573f83150304-utilities" (OuterVolumeSpecName: "utilities") pod "063494e6-c102-4dcd-a4f1-573f83150304" (UID: "063494e6-c102-4dcd-a4f1-573f83150304"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:46:45 crc kubenswrapper[4825]: I0310 07:46:45.512005 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/063494e6-c102-4dcd-a4f1-573f83150304-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 07:46:45 crc kubenswrapper[4825]: I0310 07:46:45.513994 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/063494e6-c102-4dcd-a4f1-573f83150304-kube-api-access-8bx28" (OuterVolumeSpecName: "kube-api-access-8bx28") pod "063494e6-c102-4dcd-a4f1-573f83150304" (UID: "063494e6-c102-4dcd-a4f1-573f83150304"). InnerVolumeSpecName "kube-api-access-8bx28". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:46:45 crc kubenswrapper[4825]: I0310 07:46:45.546296 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/063494e6-c102-4dcd-a4f1-573f83150304-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "063494e6-c102-4dcd-a4f1-573f83150304" (UID: "063494e6-c102-4dcd-a4f1-573f83150304"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:46:45 crc kubenswrapper[4825]: I0310 07:46:45.613065 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/063494e6-c102-4dcd-a4f1-573f83150304-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 07:46:45 crc kubenswrapper[4825]: I0310 07:46:45.613099 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bx28\" (UniqueName: \"kubernetes.io/projected/063494e6-c102-4dcd-a4f1-573f83150304-kube-api-access-8bx28\") on node \"crc\" DevicePath \"\"" Mar 10 07:46:46 crc kubenswrapper[4825]: I0310 07:46:46.000461 4825 generic.go:334] "Generic (PLEG): container finished" podID="063494e6-c102-4dcd-a4f1-573f83150304" containerID="b6b79b274e6571f64e476754a9ef2ae4c6dcdb0ab831c5f3a9d8c61595739c22" exitCode=0 Mar 10 07:46:46 crc kubenswrapper[4825]: I0310 07:46:46.000527 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-657pl" event={"ID":"063494e6-c102-4dcd-a4f1-573f83150304","Type":"ContainerDied","Data":"b6b79b274e6571f64e476754a9ef2ae4c6dcdb0ab831c5f3a9d8c61595739c22"} Mar 10 07:46:46 crc kubenswrapper[4825]: I0310 07:46:46.000841 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-657pl" event={"ID":"063494e6-c102-4dcd-a4f1-573f83150304","Type":"ContainerDied","Data":"0c90e766be62f39789e824eb7baa9e28b7f4d2af79e57d032dbdff9ea227fd0e"} Mar 10 07:46:46 crc kubenswrapper[4825]: I0310 07:46:46.000625 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-657pl" Mar 10 07:46:46 crc kubenswrapper[4825]: I0310 07:46:46.000867 4825 scope.go:117] "RemoveContainer" containerID="b6b79b274e6571f64e476754a9ef2ae4c6dcdb0ab831c5f3a9d8c61595739c22" Mar 10 07:46:46 crc kubenswrapper[4825]: I0310 07:46:46.026128 4825 scope.go:117] "RemoveContainer" containerID="afd49da66c5c696fbe0287b1cc340ad510d26db9ed11ab912d77fe094a612733" Mar 10 07:46:46 crc kubenswrapper[4825]: I0310 07:46:46.038834 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-657pl"] Mar 10 07:46:46 crc kubenswrapper[4825]: I0310 07:46:46.048054 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-657pl"] Mar 10 07:46:46 crc kubenswrapper[4825]: I0310 07:46:46.052604 4825 scope.go:117] "RemoveContainer" containerID="06b5993eedbf7371f676ee53b34ed08e64fdc1d8936032ce073b2370bf575878" Mar 10 07:46:46 crc kubenswrapper[4825]: I0310 07:46:46.069880 4825 scope.go:117] "RemoveContainer" containerID="b6b79b274e6571f64e476754a9ef2ae4c6dcdb0ab831c5f3a9d8c61595739c22" Mar 10 07:46:46 crc kubenswrapper[4825]: E0310 07:46:46.070406 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6b79b274e6571f64e476754a9ef2ae4c6dcdb0ab831c5f3a9d8c61595739c22\": container with ID starting with b6b79b274e6571f64e476754a9ef2ae4c6dcdb0ab831c5f3a9d8c61595739c22 not found: ID does not exist" containerID="b6b79b274e6571f64e476754a9ef2ae4c6dcdb0ab831c5f3a9d8c61595739c22" Mar 10 07:46:46 crc kubenswrapper[4825]: I0310 07:46:46.070453 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6b79b274e6571f64e476754a9ef2ae4c6dcdb0ab831c5f3a9d8c61595739c22"} err="failed to get container status \"b6b79b274e6571f64e476754a9ef2ae4c6dcdb0ab831c5f3a9d8c61595739c22\": rpc error: code = NotFound desc = could not find container 
\"b6b79b274e6571f64e476754a9ef2ae4c6dcdb0ab831c5f3a9d8c61595739c22\": container with ID starting with b6b79b274e6571f64e476754a9ef2ae4c6dcdb0ab831c5f3a9d8c61595739c22 not found: ID does not exist" Mar 10 07:46:46 crc kubenswrapper[4825]: I0310 07:46:46.070479 4825 scope.go:117] "RemoveContainer" containerID="afd49da66c5c696fbe0287b1cc340ad510d26db9ed11ab912d77fe094a612733" Mar 10 07:46:46 crc kubenswrapper[4825]: E0310 07:46:46.070925 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afd49da66c5c696fbe0287b1cc340ad510d26db9ed11ab912d77fe094a612733\": container with ID starting with afd49da66c5c696fbe0287b1cc340ad510d26db9ed11ab912d77fe094a612733 not found: ID does not exist" containerID="afd49da66c5c696fbe0287b1cc340ad510d26db9ed11ab912d77fe094a612733" Mar 10 07:46:46 crc kubenswrapper[4825]: I0310 07:46:46.070963 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afd49da66c5c696fbe0287b1cc340ad510d26db9ed11ab912d77fe094a612733"} err="failed to get container status \"afd49da66c5c696fbe0287b1cc340ad510d26db9ed11ab912d77fe094a612733\": rpc error: code = NotFound desc = could not find container \"afd49da66c5c696fbe0287b1cc340ad510d26db9ed11ab912d77fe094a612733\": container with ID starting with afd49da66c5c696fbe0287b1cc340ad510d26db9ed11ab912d77fe094a612733 not found: ID does not exist" Mar 10 07:46:46 crc kubenswrapper[4825]: I0310 07:46:46.070987 4825 scope.go:117] "RemoveContainer" containerID="06b5993eedbf7371f676ee53b34ed08e64fdc1d8936032ce073b2370bf575878" Mar 10 07:46:46 crc kubenswrapper[4825]: E0310 07:46:46.071314 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06b5993eedbf7371f676ee53b34ed08e64fdc1d8936032ce073b2370bf575878\": container with ID starting with 06b5993eedbf7371f676ee53b34ed08e64fdc1d8936032ce073b2370bf575878 not found: ID does not exist" 
containerID="06b5993eedbf7371f676ee53b34ed08e64fdc1d8936032ce073b2370bf575878" Mar 10 07:46:46 crc kubenswrapper[4825]: I0310 07:46:46.071342 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06b5993eedbf7371f676ee53b34ed08e64fdc1d8936032ce073b2370bf575878"} err="failed to get container status \"06b5993eedbf7371f676ee53b34ed08e64fdc1d8936032ce073b2370bf575878\": rpc error: code = NotFound desc = could not find container \"06b5993eedbf7371f676ee53b34ed08e64fdc1d8936032ce073b2370bf575878\": container with ID starting with 06b5993eedbf7371f676ee53b34ed08e64fdc1d8936032ce073b2370bf575878 not found: ID does not exist" Mar 10 07:46:47 crc kubenswrapper[4825]: I0310 07:46:47.248917 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="063494e6-c102-4dcd-a4f1-573f83150304" path="/var/lib/kubelet/pods/063494e6-c102-4dcd-a4f1-573f83150304/volumes" Mar 10 07:48:00 crc kubenswrapper[4825]: I0310 07:48:00.155221 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552148-72fl9"] Mar 10 07:48:00 crc kubenswrapper[4825]: E0310 07:48:00.156670 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="063494e6-c102-4dcd-a4f1-573f83150304" containerName="extract-utilities" Mar 10 07:48:00 crc kubenswrapper[4825]: I0310 07:48:00.156695 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="063494e6-c102-4dcd-a4f1-573f83150304" containerName="extract-utilities" Mar 10 07:48:00 crc kubenswrapper[4825]: E0310 07:48:00.156723 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="063494e6-c102-4dcd-a4f1-573f83150304" containerName="extract-content" Mar 10 07:48:00 crc kubenswrapper[4825]: I0310 07:48:00.156770 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="063494e6-c102-4dcd-a4f1-573f83150304" containerName="extract-content" Mar 10 07:48:00 crc kubenswrapper[4825]: E0310 07:48:00.156806 4825 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="063494e6-c102-4dcd-a4f1-573f83150304" containerName="registry-server" Mar 10 07:48:00 crc kubenswrapper[4825]: I0310 07:48:00.156852 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="063494e6-c102-4dcd-a4f1-573f83150304" containerName="registry-server" Mar 10 07:48:00 crc kubenswrapper[4825]: I0310 07:48:00.157270 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="063494e6-c102-4dcd-a4f1-573f83150304" containerName="registry-server" Mar 10 07:48:00 crc kubenswrapper[4825]: I0310 07:48:00.158614 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552148-72fl9" Mar 10 07:48:00 crc kubenswrapper[4825]: I0310 07:48:00.161072 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552148-72fl9"] Mar 10 07:48:00 crc kubenswrapper[4825]: I0310 07:48:00.163633 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 07:48:00 crc kubenswrapper[4825]: I0310 07:48:00.165112 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 07:48:00 crc kubenswrapper[4825]: I0310 07:48:00.165359 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 07:48:00 crc kubenswrapper[4825]: I0310 07:48:00.282650 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kzj5\" (UniqueName: \"kubernetes.io/projected/22968ce1-c537-4b94-91d4-46ed534b995b-kube-api-access-6kzj5\") pod \"auto-csr-approver-29552148-72fl9\" (UID: \"22968ce1-c537-4b94-91d4-46ed534b995b\") " pod="openshift-infra/auto-csr-approver-29552148-72fl9" Mar 10 07:48:00 crc kubenswrapper[4825]: I0310 07:48:00.384863 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kzj5\" 
(UniqueName: \"kubernetes.io/projected/22968ce1-c537-4b94-91d4-46ed534b995b-kube-api-access-6kzj5\") pod \"auto-csr-approver-29552148-72fl9\" (UID: \"22968ce1-c537-4b94-91d4-46ed534b995b\") " pod="openshift-infra/auto-csr-approver-29552148-72fl9" Mar 10 07:48:00 crc kubenswrapper[4825]: I0310 07:48:00.418921 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kzj5\" (UniqueName: \"kubernetes.io/projected/22968ce1-c537-4b94-91d4-46ed534b995b-kube-api-access-6kzj5\") pod \"auto-csr-approver-29552148-72fl9\" (UID: \"22968ce1-c537-4b94-91d4-46ed534b995b\") " pod="openshift-infra/auto-csr-approver-29552148-72fl9" Mar 10 07:48:00 crc kubenswrapper[4825]: I0310 07:48:00.496220 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552148-72fl9" Mar 10 07:48:00 crc kubenswrapper[4825]: I0310 07:48:00.724401 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552148-72fl9"] Mar 10 07:48:01 crc kubenswrapper[4825]: I0310 07:48:01.660788 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552148-72fl9" event={"ID":"22968ce1-c537-4b94-91d4-46ed534b995b","Type":"ContainerStarted","Data":"9615d2c113e6c3eb09b39c02134fa516db38f4d66f48e6e05c51fd91a3746249"} Mar 10 07:48:02 crc kubenswrapper[4825]: I0310 07:48:02.671087 4825 generic.go:334] "Generic (PLEG): container finished" podID="22968ce1-c537-4b94-91d4-46ed534b995b" containerID="0d6e8de39c599ab6b37d117e7686ece52471a310d0060421d3799f7466d16803" exitCode=0 Mar 10 07:48:02 crc kubenswrapper[4825]: I0310 07:48:02.671202 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552148-72fl9" event={"ID":"22968ce1-c537-4b94-91d4-46ed534b995b","Type":"ContainerDied","Data":"0d6e8de39c599ab6b37d117e7686ece52471a310d0060421d3799f7466d16803"} Mar 10 07:48:03 crc kubenswrapper[4825]: I0310 07:48:03.947408 4825 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552148-72fl9" Mar 10 07:48:04 crc kubenswrapper[4825]: I0310 07:48:04.039108 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kzj5\" (UniqueName: \"kubernetes.io/projected/22968ce1-c537-4b94-91d4-46ed534b995b-kube-api-access-6kzj5\") pod \"22968ce1-c537-4b94-91d4-46ed534b995b\" (UID: \"22968ce1-c537-4b94-91d4-46ed534b995b\") " Mar 10 07:48:04 crc kubenswrapper[4825]: I0310 07:48:04.048393 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22968ce1-c537-4b94-91d4-46ed534b995b-kube-api-access-6kzj5" (OuterVolumeSpecName: "kube-api-access-6kzj5") pod "22968ce1-c537-4b94-91d4-46ed534b995b" (UID: "22968ce1-c537-4b94-91d4-46ed534b995b"). InnerVolumeSpecName "kube-api-access-6kzj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:48:04 crc kubenswrapper[4825]: I0310 07:48:04.141182 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kzj5\" (UniqueName: \"kubernetes.io/projected/22968ce1-c537-4b94-91d4-46ed534b995b-kube-api-access-6kzj5\") on node \"crc\" DevicePath \"\"" Mar 10 07:48:04 crc kubenswrapper[4825]: I0310 07:48:04.691012 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552148-72fl9" event={"ID":"22968ce1-c537-4b94-91d4-46ed534b995b","Type":"ContainerDied","Data":"9615d2c113e6c3eb09b39c02134fa516db38f4d66f48e6e05c51fd91a3746249"} Mar 10 07:48:04 crc kubenswrapper[4825]: I0310 07:48:04.691053 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9615d2c113e6c3eb09b39c02134fa516db38f4d66f48e6e05c51fd91a3746249" Mar 10 07:48:04 crc kubenswrapper[4825]: I0310 07:48:04.691116 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552148-72fl9" Mar 10 07:48:05 crc kubenswrapper[4825]: I0310 07:48:05.030436 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552142-zj85r"] Mar 10 07:48:05 crc kubenswrapper[4825]: I0310 07:48:05.043100 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552142-zj85r"] Mar 10 07:48:05 crc kubenswrapper[4825]: I0310 07:48:05.251635 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6cefa1b-ee7d-47ba-a2d8-17e2f3bc9f8f" path="/var/lib/kubelet/pods/d6cefa1b-ee7d-47ba-a2d8-17e2f3bc9f8f/volumes" Mar 10 07:48:39 crc kubenswrapper[4825]: I0310 07:48:39.452358 4825 scope.go:117] "RemoveContainer" containerID="db53e4a822125eccee2e837c6049adafb85a57e0a64db04cd79eda84bf43c108" Mar 10 07:48:46 crc kubenswrapper[4825]: I0310 07:48:46.888646 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 07:48:46 crc kubenswrapper[4825]: I0310 07:48:46.889219 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 07:49:16 crc kubenswrapper[4825]: I0310 07:49:16.887866 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 07:49:16 crc kubenswrapper[4825]: 
I0310 07:49:16.888371 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 07:49:46 crc kubenswrapper[4825]: I0310 07:49:46.888869 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 07:49:46 crc kubenswrapper[4825]: I0310 07:49:46.889506 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 07:49:46 crc kubenswrapper[4825]: I0310 07:49:46.889571 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" Mar 10 07:49:46 crc kubenswrapper[4825]: I0310 07:49:46.890318 4825 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4441b6748db08977859f541e3c9aeb39a944a681030d88e33129a9cd77990e0e"} pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 07:49:46 crc kubenswrapper[4825]: I0310 07:49:46.890375 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" 
containerName="machine-config-daemon" containerID="cri-o://4441b6748db08977859f541e3c9aeb39a944a681030d88e33129a9cd77990e0e" gracePeriod=600 Mar 10 07:49:47 crc kubenswrapper[4825]: I0310 07:49:47.543601 4825 generic.go:334] "Generic (PLEG): container finished" podID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerID="4441b6748db08977859f541e3c9aeb39a944a681030d88e33129a9cd77990e0e" exitCode=0 Mar 10 07:49:47 crc kubenswrapper[4825]: I0310 07:49:47.543674 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerDied","Data":"4441b6748db08977859f541e3c9aeb39a944a681030d88e33129a9cd77990e0e"} Mar 10 07:49:47 crc kubenswrapper[4825]: I0310 07:49:47.544009 4825 scope.go:117] "RemoveContainer" containerID="95a5e83fff4d9b64dcc73b9320d1e9fe7e149f0c72cfd7655a687da74d8eabda" Mar 10 07:49:47 crc kubenswrapper[4825]: E0310 07:49:47.681432 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:49:48 crc kubenswrapper[4825]: I0310 07:49:48.553226 4825 scope.go:117] "RemoveContainer" containerID="4441b6748db08977859f541e3c9aeb39a944a681030d88e33129a9cd77990e0e" Mar 10 07:49:48 crc kubenswrapper[4825]: E0310 07:49:48.554543 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:50:00 crc kubenswrapper[4825]: I0310 07:50:00.171377 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552150-cmlkq"] Mar 10 07:50:00 crc kubenswrapper[4825]: E0310 07:50:00.172298 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22968ce1-c537-4b94-91d4-46ed534b995b" containerName="oc" Mar 10 07:50:00 crc kubenswrapper[4825]: I0310 07:50:00.172316 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="22968ce1-c537-4b94-91d4-46ed534b995b" containerName="oc" Mar 10 07:50:00 crc kubenswrapper[4825]: I0310 07:50:00.172500 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="22968ce1-c537-4b94-91d4-46ed534b995b" containerName="oc" Mar 10 07:50:00 crc kubenswrapper[4825]: I0310 07:50:00.173017 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552150-cmlkq" Mar 10 07:50:00 crc kubenswrapper[4825]: I0310 07:50:00.176147 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 07:50:00 crc kubenswrapper[4825]: I0310 07:50:00.176159 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 07:50:00 crc kubenswrapper[4825]: I0310 07:50:00.176355 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 07:50:00 crc kubenswrapper[4825]: I0310 07:50:00.187954 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552150-cmlkq"] Mar 10 07:50:00 crc kubenswrapper[4825]: I0310 07:50:00.268907 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt8ml\" (UniqueName: 
\"kubernetes.io/projected/21d7cffa-33ec-4b8c-9704-cfcfc67faad1-kube-api-access-kt8ml\") pod \"auto-csr-approver-29552150-cmlkq\" (UID: \"21d7cffa-33ec-4b8c-9704-cfcfc67faad1\") " pod="openshift-infra/auto-csr-approver-29552150-cmlkq" Mar 10 07:50:00 crc kubenswrapper[4825]: I0310 07:50:00.370866 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt8ml\" (UniqueName: \"kubernetes.io/projected/21d7cffa-33ec-4b8c-9704-cfcfc67faad1-kube-api-access-kt8ml\") pod \"auto-csr-approver-29552150-cmlkq\" (UID: \"21d7cffa-33ec-4b8c-9704-cfcfc67faad1\") " pod="openshift-infra/auto-csr-approver-29552150-cmlkq" Mar 10 07:50:00 crc kubenswrapper[4825]: I0310 07:50:00.390492 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt8ml\" (UniqueName: \"kubernetes.io/projected/21d7cffa-33ec-4b8c-9704-cfcfc67faad1-kube-api-access-kt8ml\") pod \"auto-csr-approver-29552150-cmlkq\" (UID: \"21d7cffa-33ec-4b8c-9704-cfcfc67faad1\") " pod="openshift-infra/auto-csr-approver-29552150-cmlkq" Mar 10 07:50:00 crc kubenswrapper[4825]: I0310 07:50:00.501057 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552150-cmlkq" Mar 10 07:50:00 crc kubenswrapper[4825]: I0310 07:50:00.924407 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552150-cmlkq"] Mar 10 07:50:01 crc kubenswrapper[4825]: I0310 07:50:01.644377 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552150-cmlkq" event={"ID":"21d7cffa-33ec-4b8c-9704-cfcfc67faad1","Type":"ContainerStarted","Data":"5fcce59e9143c0ede9748a7ee7ee8913e3ae238a54abd815ef351233809ed885"} Mar 10 07:50:02 crc kubenswrapper[4825]: I0310 07:50:02.236866 4825 scope.go:117] "RemoveContainer" containerID="4441b6748db08977859f541e3c9aeb39a944a681030d88e33129a9cd77990e0e" Mar 10 07:50:02 crc kubenswrapper[4825]: E0310 07:50:02.237083 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:50:02 crc kubenswrapper[4825]: I0310 07:50:02.653878 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552150-cmlkq" event={"ID":"21d7cffa-33ec-4b8c-9704-cfcfc67faad1","Type":"ContainerStarted","Data":"3e8dcd335b135cc458b631c18fca7d607496017c1de585d27fcacce776e817b3"} Mar 10 07:50:02 crc kubenswrapper[4825]: I0310 07:50:02.671650 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552150-cmlkq" podStartSLOduration=1.420107706 podStartE2EDuration="2.671631484s" podCreationTimestamp="2026-03-10 07:50:00 +0000 UTC" firstStartedPulling="2026-03-10 07:50:00.931344181 +0000 UTC m=+3953.961124796" lastFinishedPulling="2026-03-10 
07:50:02.182867959 +0000 UTC m=+3955.212648574" observedRunningTime="2026-03-10 07:50:02.671604233 +0000 UTC m=+3955.701384868" watchObservedRunningTime="2026-03-10 07:50:02.671631484 +0000 UTC m=+3955.701412099" Mar 10 07:50:03 crc kubenswrapper[4825]: I0310 07:50:03.662468 4825 generic.go:334] "Generic (PLEG): container finished" podID="21d7cffa-33ec-4b8c-9704-cfcfc67faad1" containerID="3e8dcd335b135cc458b631c18fca7d607496017c1de585d27fcacce776e817b3" exitCode=0 Mar 10 07:50:03 crc kubenswrapper[4825]: I0310 07:50:03.662553 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552150-cmlkq" event={"ID":"21d7cffa-33ec-4b8c-9704-cfcfc67faad1","Type":"ContainerDied","Data":"3e8dcd335b135cc458b631c18fca7d607496017c1de585d27fcacce776e817b3"} Mar 10 07:50:05 crc kubenswrapper[4825]: I0310 07:50:05.005935 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552150-cmlkq" Mar 10 07:50:05 crc kubenswrapper[4825]: I0310 07:50:05.140656 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kt8ml\" (UniqueName: \"kubernetes.io/projected/21d7cffa-33ec-4b8c-9704-cfcfc67faad1-kube-api-access-kt8ml\") pod \"21d7cffa-33ec-4b8c-9704-cfcfc67faad1\" (UID: \"21d7cffa-33ec-4b8c-9704-cfcfc67faad1\") " Mar 10 07:50:05 crc kubenswrapper[4825]: I0310 07:50:05.146239 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21d7cffa-33ec-4b8c-9704-cfcfc67faad1-kube-api-access-kt8ml" (OuterVolumeSpecName: "kube-api-access-kt8ml") pod "21d7cffa-33ec-4b8c-9704-cfcfc67faad1" (UID: "21d7cffa-33ec-4b8c-9704-cfcfc67faad1"). InnerVolumeSpecName "kube-api-access-kt8ml". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:50:05 crc kubenswrapper[4825]: I0310 07:50:05.245453 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kt8ml\" (UniqueName: \"kubernetes.io/projected/21d7cffa-33ec-4b8c-9704-cfcfc67faad1-kube-api-access-kt8ml\") on node \"crc\" DevicePath \"\"" Mar 10 07:50:05 crc kubenswrapper[4825]: I0310 07:50:05.679967 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552150-cmlkq" event={"ID":"21d7cffa-33ec-4b8c-9704-cfcfc67faad1","Type":"ContainerDied","Data":"5fcce59e9143c0ede9748a7ee7ee8913e3ae238a54abd815ef351233809ed885"} Mar 10 07:50:05 crc kubenswrapper[4825]: I0310 07:50:05.680013 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5fcce59e9143c0ede9748a7ee7ee8913e3ae238a54abd815ef351233809ed885" Mar 10 07:50:05 crc kubenswrapper[4825]: I0310 07:50:05.680036 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552150-cmlkq" Mar 10 07:50:05 crc kubenswrapper[4825]: I0310 07:50:05.746990 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552144-fz7k9"] Mar 10 07:50:05 crc kubenswrapper[4825]: I0310 07:50:05.753394 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552144-fz7k9"] Mar 10 07:50:07 crc kubenswrapper[4825]: I0310 07:50:07.253407 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe5a61dc-26e9-406d-a533-896a826d1f53" path="/var/lib/kubelet/pods/fe5a61dc-26e9-406d-a533-896a826d1f53/volumes" Mar 10 07:50:15 crc kubenswrapper[4825]: I0310 07:50:15.236393 4825 scope.go:117] "RemoveContainer" containerID="4441b6748db08977859f541e3c9aeb39a944a681030d88e33129a9cd77990e0e" Mar 10 07:50:15 crc kubenswrapper[4825]: E0310 07:50:15.237601 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:50:29 crc kubenswrapper[4825]: I0310 07:50:29.244176 4825 scope.go:117] "RemoveContainer" containerID="4441b6748db08977859f541e3c9aeb39a944a681030d88e33129a9cd77990e0e" Mar 10 07:50:29 crc kubenswrapper[4825]: E0310 07:50:29.245215 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:50:39 crc kubenswrapper[4825]: I0310 07:50:39.530985 4825 scope.go:117] "RemoveContainer" containerID="6e4ded3869d3a46487744d8766d1789eadaa514a0e7c4a412590b6ab0a5f41ab" Mar 10 07:50:44 crc kubenswrapper[4825]: I0310 07:50:44.237252 4825 scope.go:117] "RemoveContainer" containerID="4441b6748db08977859f541e3c9aeb39a944a681030d88e33129a9cd77990e0e" Mar 10 07:50:44 crc kubenswrapper[4825]: E0310 07:50:44.237905 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:50:53 crc kubenswrapper[4825]: I0310 07:50:53.661669 4825 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-n9ps5"] Mar 10 07:50:53 crc kubenswrapper[4825]: E0310 07:50:53.662609 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21d7cffa-33ec-4b8c-9704-cfcfc67faad1" containerName="oc" Mar 10 07:50:53 crc kubenswrapper[4825]: I0310 07:50:53.662625 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="21d7cffa-33ec-4b8c-9704-cfcfc67faad1" containerName="oc" Mar 10 07:50:53 crc kubenswrapper[4825]: I0310 07:50:53.662783 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="21d7cffa-33ec-4b8c-9704-cfcfc67faad1" containerName="oc" Mar 10 07:50:53 crc kubenswrapper[4825]: I0310 07:50:53.663705 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n9ps5" Mar 10 07:50:53 crc kubenswrapper[4825]: I0310 07:50:53.669447 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n9ps5"] Mar 10 07:50:53 crc kubenswrapper[4825]: I0310 07:50:53.800745 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48nqw\" (UniqueName: \"kubernetes.io/projected/545a5366-8111-4beb-9e32-02ac0b7d2cc2-kube-api-access-48nqw\") pod \"certified-operators-n9ps5\" (UID: \"545a5366-8111-4beb-9e32-02ac0b7d2cc2\") " pod="openshift-marketplace/certified-operators-n9ps5" Mar 10 07:50:53 crc kubenswrapper[4825]: I0310 07:50:53.800871 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/545a5366-8111-4beb-9e32-02ac0b7d2cc2-utilities\") pod \"certified-operators-n9ps5\" (UID: \"545a5366-8111-4beb-9e32-02ac0b7d2cc2\") " pod="openshift-marketplace/certified-operators-n9ps5" Mar 10 07:50:53 crc kubenswrapper[4825]: I0310 07:50:53.800984 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/545a5366-8111-4beb-9e32-02ac0b7d2cc2-catalog-content\") pod \"certified-operators-n9ps5\" (UID: \"545a5366-8111-4beb-9e32-02ac0b7d2cc2\") " pod="openshift-marketplace/certified-operators-n9ps5" Mar 10 07:50:53 crc kubenswrapper[4825]: I0310 07:50:53.902451 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/545a5366-8111-4beb-9e32-02ac0b7d2cc2-utilities\") pod \"certified-operators-n9ps5\" (UID: \"545a5366-8111-4beb-9e32-02ac0b7d2cc2\") " pod="openshift-marketplace/certified-operators-n9ps5" Mar 10 07:50:53 crc kubenswrapper[4825]: I0310 07:50:53.902524 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/545a5366-8111-4beb-9e32-02ac0b7d2cc2-catalog-content\") pod \"certified-operators-n9ps5\" (UID: \"545a5366-8111-4beb-9e32-02ac0b7d2cc2\") " pod="openshift-marketplace/certified-operators-n9ps5" Mar 10 07:50:53 crc kubenswrapper[4825]: I0310 07:50:53.902556 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48nqw\" (UniqueName: \"kubernetes.io/projected/545a5366-8111-4beb-9e32-02ac0b7d2cc2-kube-api-access-48nqw\") pod \"certified-operators-n9ps5\" (UID: \"545a5366-8111-4beb-9e32-02ac0b7d2cc2\") " pod="openshift-marketplace/certified-operators-n9ps5" Mar 10 07:50:53 crc kubenswrapper[4825]: I0310 07:50:53.903067 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/545a5366-8111-4beb-9e32-02ac0b7d2cc2-catalog-content\") pod \"certified-operators-n9ps5\" (UID: \"545a5366-8111-4beb-9e32-02ac0b7d2cc2\") " pod="openshift-marketplace/certified-operators-n9ps5" Mar 10 07:50:53 crc kubenswrapper[4825]: I0310 07:50:53.903413 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/545a5366-8111-4beb-9e32-02ac0b7d2cc2-utilities\") pod \"certified-operators-n9ps5\" (UID: \"545a5366-8111-4beb-9e32-02ac0b7d2cc2\") " pod="openshift-marketplace/certified-operators-n9ps5" Mar 10 07:50:53 crc kubenswrapper[4825]: I0310 07:50:53.922061 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48nqw\" (UniqueName: \"kubernetes.io/projected/545a5366-8111-4beb-9e32-02ac0b7d2cc2-kube-api-access-48nqw\") pod \"certified-operators-n9ps5\" (UID: \"545a5366-8111-4beb-9e32-02ac0b7d2cc2\") " pod="openshift-marketplace/certified-operators-n9ps5" Mar 10 07:50:54 crc kubenswrapper[4825]: I0310 07:50:54.001329 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n9ps5" Mar 10 07:50:54 crc kubenswrapper[4825]: I0310 07:50:54.506085 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n9ps5"] Mar 10 07:50:55 crc kubenswrapper[4825]: I0310 07:50:55.076599 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n9ps5" event={"ID":"545a5366-8111-4beb-9e32-02ac0b7d2cc2","Type":"ContainerStarted","Data":"2e8578e00f3cd62107f9d1e46bc5bd294a6a7deb08e82e184ed3df8a81c53dd5"} Mar 10 07:50:55 crc kubenswrapper[4825]: I0310 07:50:55.076922 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n9ps5" event={"ID":"545a5366-8111-4beb-9e32-02ac0b7d2cc2","Type":"ContainerStarted","Data":"a4f368b1eebc03e12c1f95957d8289608080f76cfc932794698e06f32b05a1b3"} Mar 10 07:50:56 crc kubenswrapper[4825]: I0310 07:50:56.086065 4825 generic.go:334] "Generic (PLEG): container finished" podID="545a5366-8111-4beb-9e32-02ac0b7d2cc2" containerID="2e8578e00f3cd62107f9d1e46bc5bd294a6a7deb08e82e184ed3df8a81c53dd5" exitCode=0 Mar 10 07:50:56 crc kubenswrapper[4825]: I0310 07:50:56.086109 4825 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/certified-operators-n9ps5" event={"ID":"545a5366-8111-4beb-9e32-02ac0b7d2cc2","Type":"ContainerDied","Data":"2e8578e00f3cd62107f9d1e46bc5bd294a6a7deb08e82e184ed3df8a81c53dd5"} Mar 10 07:50:57 crc kubenswrapper[4825]: I0310 07:50:57.239206 4825 scope.go:117] "RemoveContainer" containerID="4441b6748db08977859f541e3c9aeb39a944a681030d88e33129a9cd77990e0e" Mar 10 07:50:57 crc kubenswrapper[4825]: E0310 07:50:57.239916 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:50:59 crc kubenswrapper[4825]: I0310 07:50:59.110677 4825 generic.go:334] "Generic (PLEG): container finished" podID="545a5366-8111-4beb-9e32-02ac0b7d2cc2" containerID="384343c99c8bb82ad3d7620e4be8cfcd32d3acfd6632d60788da711391e3c580" exitCode=0 Mar 10 07:50:59 crc kubenswrapper[4825]: I0310 07:50:59.110763 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n9ps5" event={"ID":"545a5366-8111-4beb-9e32-02ac0b7d2cc2","Type":"ContainerDied","Data":"384343c99c8bb82ad3d7620e4be8cfcd32d3acfd6632d60788da711391e3c580"} Mar 10 07:51:00 crc kubenswrapper[4825]: I0310 07:51:00.121029 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n9ps5" event={"ID":"545a5366-8111-4beb-9e32-02ac0b7d2cc2","Type":"ContainerStarted","Data":"4a0835fa4a0ec389c1b7c5b86c89b60dcb93bee84466392372d2532147e1ba96"} Mar 10 07:51:00 crc kubenswrapper[4825]: I0310 07:51:00.159382 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-n9ps5" 
podStartSLOduration=3.6995634109999997 podStartE2EDuration="7.159358724s" podCreationTimestamp="2026-03-10 07:50:53 +0000 UTC" firstStartedPulling="2026-03-10 07:50:56.088779548 +0000 UTC m=+4009.118560173" lastFinishedPulling="2026-03-10 07:50:59.548574841 +0000 UTC m=+4012.578355486" observedRunningTime="2026-03-10 07:51:00.146292581 +0000 UTC m=+4013.176073236" watchObservedRunningTime="2026-03-10 07:51:00.159358724 +0000 UTC m=+4013.189139349" Mar 10 07:51:04 crc kubenswrapper[4825]: I0310 07:51:04.001493 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-n9ps5" Mar 10 07:51:04 crc kubenswrapper[4825]: I0310 07:51:04.001918 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-n9ps5" Mar 10 07:51:04 crc kubenswrapper[4825]: I0310 07:51:04.073453 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-n9ps5" Mar 10 07:51:04 crc kubenswrapper[4825]: I0310 07:51:04.211200 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-n9ps5" Mar 10 07:51:04 crc kubenswrapper[4825]: I0310 07:51:04.320041 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n9ps5"] Mar 10 07:51:06 crc kubenswrapper[4825]: I0310 07:51:06.173445 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-n9ps5" podUID="545a5366-8111-4beb-9e32-02ac0b7d2cc2" containerName="registry-server" containerID="cri-o://4a0835fa4a0ec389c1b7c5b86c89b60dcb93bee84466392372d2532147e1ba96" gracePeriod=2 Mar 10 07:51:06 crc kubenswrapper[4825]: I0310 07:51:06.596356 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n9ps5" Mar 10 07:51:06 crc kubenswrapper[4825]: I0310 07:51:06.694715 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48nqw\" (UniqueName: \"kubernetes.io/projected/545a5366-8111-4beb-9e32-02ac0b7d2cc2-kube-api-access-48nqw\") pod \"545a5366-8111-4beb-9e32-02ac0b7d2cc2\" (UID: \"545a5366-8111-4beb-9e32-02ac0b7d2cc2\") " Mar 10 07:51:06 crc kubenswrapper[4825]: I0310 07:51:06.694926 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/545a5366-8111-4beb-9e32-02ac0b7d2cc2-catalog-content\") pod \"545a5366-8111-4beb-9e32-02ac0b7d2cc2\" (UID: \"545a5366-8111-4beb-9e32-02ac0b7d2cc2\") " Mar 10 07:51:06 crc kubenswrapper[4825]: I0310 07:51:06.694984 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/545a5366-8111-4beb-9e32-02ac0b7d2cc2-utilities\") pod \"545a5366-8111-4beb-9e32-02ac0b7d2cc2\" (UID: \"545a5366-8111-4beb-9e32-02ac0b7d2cc2\") " Mar 10 07:51:06 crc kubenswrapper[4825]: I0310 07:51:06.695722 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/545a5366-8111-4beb-9e32-02ac0b7d2cc2-utilities" (OuterVolumeSpecName: "utilities") pod "545a5366-8111-4beb-9e32-02ac0b7d2cc2" (UID: "545a5366-8111-4beb-9e32-02ac0b7d2cc2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:51:06 crc kubenswrapper[4825]: I0310 07:51:06.750278 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/545a5366-8111-4beb-9e32-02ac0b7d2cc2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "545a5366-8111-4beb-9e32-02ac0b7d2cc2" (UID: "545a5366-8111-4beb-9e32-02ac0b7d2cc2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:51:06 crc kubenswrapper[4825]: I0310 07:51:06.758431 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/545a5366-8111-4beb-9e32-02ac0b7d2cc2-kube-api-access-48nqw" (OuterVolumeSpecName: "kube-api-access-48nqw") pod "545a5366-8111-4beb-9e32-02ac0b7d2cc2" (UID: "545a5366-8111-4beb-9e32-02ac0b7d2cc2"). InnerVolumeSpecName "kube-api-access-48nqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:51:06 crc kubenswrapper[4825]: I0310 07:51:06.796705 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/545a5366-8111-4beb-9e32-02ac0b7d2cc2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 07:51:06 crc kubenswrapper[4825]: I0310 07:51:06.796736 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/545a5366-8111-4beb-9e32-02ac0b7d2cc2-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 07:51:06 crc kubenswrapper[4825]: I0310 07:51:06.796746 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48nqw\" (UniqueName: \"kubernetes.io/projected/545a5366-8111-4beb-9e32-02ac0b7d2cc2-kube-api-access-48nqw\") on node \"crc\" DevicePath \"\"" Mar 10 07:51:07 crc kubenswrapper[4825]: I0310 07:51:07.187030 4825 generic.go:334] "Generic (PLEG): container finished" podID="545a5366-8111-4beb-9e32-02ac0b7d2cc2" containerID="4a0835fa4a0ec389c1b7c5b86c89b60dcb93bee84466392372d2532147e1ba96" exitCode=0 Mar 10 07:51:07 crc kubenswrapper[4825]: I0310 07:51:07.187086 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n9ps5" Mar 10 07:51:07 crc kubenswrapper[4825]: I0310 07:51:07.187237 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n9ps5" event={"ID":"545a5366-8111-4beb-9e32-02ac0b7d2cc2","Type":"ContainerDied","Data":"4a0835fa4a0ec389c1b7c5b86c89b60dcb93bee84466392372d2532147e1ba96"} Mar 10 07:51:07 crc kubenswrapper[4825]: I0310 07:51:07.187309 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n9ps5" event={"ID":"545a5366-8111-4beb-9e32-02ac0b7d2cc2","Type":"ContainerDied","Data":"a4f368b1eebc03e12c1f95957d8289608080f76cfc932794698e06f32b05a1b3"} Mar 10 07:51:07 crc kubenswrapper[4825]: I0310 07:51:07.187347 4825 scope.go:117] "RemoveContainer" containerID="4a0835fa4a0ec389c1b7c5b86c89b60dcb93bee84466392372d2532147e1ba96" Mar 10 07:51:07 crc kubenswrapper[4825]: I0310 07:51:07.229028 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n9ps5"] Mar 10 07:51:07 crc kubenswrapper[4825]: I0310 07:51:07.230400 4825 scope.go:117] "RemoveContainer" containerID="384343c99c8bb82ad3d7620e4be8cfcd32d3acfd6632d60788da711391e3c580" Mar 10 07:51:07 crc kubenswrapper[4825]: I0310 07:51:07.258367 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-n9ps5"] Mar 10 07:51:07 crc kubenswrapper[4825]: I0310 07:51:07.269896 4825 scope.go:117] "RemoveContainer" containerID="2e8578e00f3cd62107f9d1e46bc5bd294a6a7deb08e82e184ed3df8a81c53dd5" Mar 10 07:51:07 crc kubenswrapper[4825]: I0310 07:51:07.297693 4825 scope.go:117] "RemoveContainer" containerID="4a0835fa4a0ec389c1b7c5b86c89b60dcb93bee84466392372d2532147e1ba96" Mar 10 07:51:07 crc kubenswrapper[4825]: E0310 07:51:07.298656 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4a0835fa4a0ec389c1b7c5b86c89b60dcb93bee84466392372d2532147e1ba96\": container with ID starting with 4a0835fa4a0ec389c1b7c5b86c89b60dcb93bee84466392372d2532147e1ba96 not found: ID does not exist" containerID="4a0835fa4a0ec389c1b7c5b86c89b60dcb93bee84466392372d2532147e1ba96" Mar 10 07:51:07 crc kubenswrapper[4825]: I0310 07:51:07.298727 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a0835fa4a0ec389c1b7c5b86c89b60dcb93bee84466392372d2532147e1ba96"} err="failed to get container status \"4a0835fa4a0ec389c1b7c5b86c89b60dcb93bee84466392372d2532147e1ba96\": rpc error: code = NotFound desc = could not find container \"4a0835fa4a0ec389c1b7c5b86c89b60dcb93bee84466392372d2532147e1ba96\": container with ID starting with 4a0835fa4a0ec389c1b7c5b86c89b60dcb93bee84466392372d2532147e1ba96 not found: ID does not exist" Mar 10 07:51:07 crc kubenswrapper[4825]: I0310 07:51:07.298758 4825 scope.go:117] "RemoveContainer" containerID="384343c99c8bb82ad3d7620e4be8cfcd32d3acfd6632d60788da711391e3c580" Mar 10 07:51:07 crc kubenswrapper[4825]: E0310 07:51:07.300050 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"384343c99c8bb82ad3d7620e4be8cfcd32d3acfd6632d60788da711391e3c580\": container with ID starting with 384343c99c8bb82ad3d7620e4be8cfcd32d3acfd6632d60788da711391e3c580 not found: ID does not exist" containerID="384343c99c8bb82ad3d7620e4be8cfcd32d3acfd6632d60788da711391e3c580" Mar 10 07:51:07 crc kubenswrapper[4825]: I0310 07:51:07.300107 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"384343c99c8bb82ad3d7620e4be8cfcd32d3acfd6632d60788da711391e3c580"} err="failed to get container status \"384343c99c8bb82ad3d7620e4be8cfcd32d3acfd6632d60788da711391e3c580\": rpc error: code = NotFound desc = could not find container \"384343c99c8bb82ad3d7620e4be8cfcd32d3acfd6632d60788da711391e3c580\": container with ID 
starting with 384343c99c8bb82ad3d7620e4be8cfcd32d3acfd6632d60788da711391e3c580 not found: ID does not exist" Mar 10 07:51:07 crc kubenswrapper[4825]: I0310 07:51:07.300157 4825 scope.go:117] "RemoveContainer" containerID="2e8578e00f3cd62107f9d1e46bc5bd294a6a7deb08e82e184ed3df8a81c53dd5" Mar 10 07:51:07 crc kubenswrapper[4825]: E0310 07:51:07.300419 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e8578e00f3cd62107f9d1e46bc5bd294a6a7deb08e82e184ed3df8a81c53dd5\": container with ID starting with 2e8578e00f3cd62107f9d1e46bc5bd294a6a7deb08e82e184ed3df8a81c53dd5 not found: ID does not exist" containerID="2e8578e00f3cd62107f9d1e46bc5bd294a6a7deb08e82e184ed3df8a81c53dd5" Mar 10 07:51:07 crc kubenswrapper[4825]: I0310 07:51:07.300463 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e8578e00f3cd62107f9d1e46bc5bd294a6a7deb08e82e184ed3df8a81c53dd5"} err="failed to get container status \"2e8578e00f3cd62107f9d1e46bc5bd294a6a7deb08e82e184ed3df8a81c53dd5\": rpc error: code = NotFound desc = could not find container \"2e8578e00f3cd62107f9d1e46bc5bd294a6a7deb08e82e184ed3df8a81c53dd5\": container with ID starting with 2e8578e00f3cd62107f9d1e46bc5bd294a6a7deb08e82e184ed3df8a81c53dd5 not found: ID does not exist" Mar 10 07:51:07 crc kubenswrapper[4825]: E0310 07:51:07.359815 4825 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod545a5366_8111_4beb_9e32_02ac0b7d2cc2.slice/crio-a4f368b1eebc03e12c1f95957d8289608080f76cfc932794698e06f32b05a1b3\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod545a5366_8111_4beb_9e32_02ac0b7d2cc2.slice\": RecentStats: unable to find data in memory cache]" Mar 10 07:51:09 crc kubenswrapper[4825]: I0310 07:51:09.244778 4825 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="545a5366-8111-4beb-9e32-02ac0b7d2cc2" path="/var/lib/kubelet/pods/545a5366-8111-4beb-9e32-02ac0b7d2cc2/volumes" Mar 10 07:51:10 crc kubenswrapper[4825]: I0310 07:51:10.236509 4825 scope.go:117] "RemoveContainer" containerID="4441b6748db08977859f541e3c9aeb39a944a681030d88e33129a9cd77990e0e" Mar 10 07:51:10 crc kubenswrapper[4825]: E0310 07:51:10.236962 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:51:23 crc kubenswrapper[4825]: I0310 07:51:23.236852 4825 scope.go:117] "RemoveContainer" containerID="4441b6748db08977859f541e3c9aeb39a944a681030d88e33129a9cd77990e0e" Mar 10 07:51:23 crc kubenswrapper[4825]: E0310 07:51:23.237931 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:51:34 crc kubenswrapper[4825]: I0310 07:51:34.237274 4825 scope.go:117] "RemoveContainer" containerID="4441b6748db08977859f541e3c9aeb39a944a681030d88e33129a9cd77990e0e" Mar 10 07:51:34 crc kubenswrapper[4825]: E0310 07:51:34.238477 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:51:49 crc kubenswrapper[4825]: I0310 07:51:49.243351 4825 scope.go:117] "RemoveContainer" containerID="4441b6748db08977859f541e3c9aeb39a944a681030d88e33129a9cd77990e0e" Mar 10 07:51:49 crc kubenswrapper[4825]: E0310 07:51:49.244270 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:52:00 crc kubenswrapper[4825]: I0310 07:52:00.166869 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552152-zfqml"] Mar 10 07:52:00 crc kubenswrapper[4825]: E0310 07:52:00.167925 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="545a5366-8111-4beb-9e32-02ac0b7d2cc2" containerName="registry-server" Mar 10 07:52:00 crc kubenswrapper[4825]: I0310 07:52:00.167948 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="545a5366-8111-4beb-9e32-02ac0b7d2cc2" containerName="registry-server" Mar 10 07:52:00 crc kubenswrapper[4825]: E0310 07:52:00.167997 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="545a5366-8111-4beb-9e32-02ac0b7d2cc2" containerName="extract-utilities" Mar 10 07:52:00 crc kubenswrapper[4825]: I0310 07:52:00.168008 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="545a5366-8111-4beb-9e32-02ac0b7d2cc2" containerName="extract-utilities" Mar 10 07:52:00 crc kubenswrapper[4825]: E0310 07:52:00.168032 4825 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="545a5366-8111-4beb-9e32-02ac0b7d2cc2" containerName="extract-content" Mar 10 07:52:00 crc kubenswrapper[4825]: I0310 07:52:00.168043 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="545a5366-8111-4beb-9e32-02ac0b7d2cc2" containerName="extract-content" Mar 10 07:52:00 crc kubenswrapper[4825]: I0310 07:52:00.168558 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="545a5366-8111-4beb-9e32-02ac0b7d2cc2" containerName="registry-server" Mar 10 07:52:00 crc kubenswrapper[4825]: I0310 07:52:00.169546 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552152-zfqml" Mar 10 07:52:00 crc kubenswrapper[4825]: I0310 07:52:00.171633 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 07:52:00 crc kubenswrapper[4825]: I0310 07:52:00.172121 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 07:52:00 crc kubenswrapper[4825]: I0310 07:52:00.172607 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 07:52:00 crc kubenswrapper[4825]: I0310 07:52:00.177405 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552152-zfqml"] Mar 10 07:52:00 crc kubenswrapper[4825]: I0310 07:52:00.227154 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr4qq\" (UniqueName: \"kubernetes.io/projected/c1ae1165-a01f-4785-af2c-ae400f9d8617-kube-api-access-vr4qq\") pod \"auto-csr-approver-29552152-zfqml\" (UID: \"c1ae1165-a01f-4785-af2c-ae400f9d8617\") " pod="openshift-infra/auto-csr-approver-29552152-zfqml" Mar 10 07:52:00 crc kubenswrapper[4825]: I0310 07:52:00.328372 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr4qq\" (UniqueName: 
\"kubernetes.io/projected/c1ae1165-a01f-4785-af2c-ae400f9d8617-kube-api-access-vr4qq\") pod \"auto-csr-approver-29552152-zfqml\" (UID: \"c1ae1165-a01f-4785-af2c-ae400f9d8617\") " pod="openshift-infra/auto-csr-approver-29552152-zfqml" Mar 10 07:52:00 crc kubenswrapper[4825]: I0310 07:52:00.356692 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr4qq\" (UniqueName: \"kubernetes.io/projected/c1ae1165-a01f-4785-af2c-ae400f9d8617-kube-api-access-vr4qq\") pod \"auto-csr-approver-29552152-zfqml\" (UID: \"c1ae1165-a01f-4785-af2c-ae400f9d8617\") " pod="openshift-infra/auto-csr-approver-29552152-zfqml" Mar 10 07:52:00 crc kubenswrapper[4825]: I0310 07:52:00.501682 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552152-zfqml" Mar 10 07:52:00 crc kubenswrapper[4825]: I0310 07:52:00.784161 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552152-zfqml"] Mar 10 07:52:00 crc kubenswrapper[4825]: I0310 07:52:00.789534 4825 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 07:52:01 crc kubenswrapper[4825]: I0310 07:52:01.677879 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552152-zfqml" event={"ID":"c1ae1165-a01f-4785-af2c-ae400f9d8617","Type":"ContainerStarted","Data":"42aaeae7a239a9b7086bfda25e2560f478d3b42a699adfafd24dcccf5169d74f"} Mar 10 07:52:02 crc kubenswrapper[4825]: I0310 07:52:02.687644 4825 generic.go:334] "Generic (PLEG): container finished" podID="c1ae1165-a01f-4785-af2c-ae400f9d8617" containerID="6c749693e97ce99b7a0caaf4550921eee1d7b411ab4554d13a1968b7b75bd857" exitCode=0 Mar 10 07:52:02 crc kubenswrapper[4825]: I0310 07:52:02.687726 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552152-zfqml" 
event={"ID":"c1ae1165-a01f-4785-af2c-ae400f9d8617","Type":"ContainerDied","Data":"6c749693e97ce99b7a0caaf4550921eee1d7b411ab4554d13a1968b7b75bd857"} Mar 10 07:52:03 crc kubenswrapper[4825]: I0310 07:52:03.241264 4825 scope.go:117] "RemoveContainer" containerID="4441b6748db08977859f541e3c9aeb39a944a681030d88e33129a9cd77990e0e" Mar 10 07:52:03 crc kubenswrapper[4825]: E0310 07:52:03.241675 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:52:03 crc kubenswrapper[4825]: I0310 07:52:03.960348 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552152-zfqml" Mar 10 07:52:03 crc kubenswrapper[4825]: I0310 07:52:03.978550 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vr4qq\" (UniqueName: \"kubernetes.io/projected/c1ae1165-a01f-4785-af2c-ae400f9d8617-kube-api-access-vr4qq\") pod \"c1ae1165-a01f-4785-af2c-ae400f9d8617\" (UID: \"c1ae1165-a01f-4785-af2c-ae400f9d8617\") " Mar 10 07:52:03 crc kubenswrapper[4825]: I0310 07:52:03.987448 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1ae1165-a01f-4785-af2c-ae400f9d8617-kube-api-access-vr4qq" (OuterVolumeSpecName: "kube-api-access-vr4qq") pod "c1ae1165-a01f-4785-af2c-ae400f9d8617" (UID: "c1ae1165-a01f-4785-af2c-ae400f9d8617"). InnerVolumeSpecName "kube-api-access-vr4qq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:52:04 crc kubenswrapper[4825]: I0310 07:52:04.079957 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vr4qq\" (UniqueName: \"kubernetes.io/projected/c1ae1165-a01f-4785-af2c-ae400f9d8617-kube-api-access-vr4qq\") on node \"crc\" DevicePath \"\"" Mar 10 07:52:04 crc kubenswrapper[4825]: I0310 07:52:04.709752 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552152-zfqml" event={"ID":"c1ae1165-a01f-4785-af2c-ae400f9d8617","Type":"ContainerDied","Data":"42aaeae7a239a9b7086bfda25e2560f478d3b42a699adfafd24dcccf5169d74f"} Mar 10 07:52:04 crc kubenswrapper[4825]: I0310 07:52:04.709814 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42aaeae7a239a9b7086bfda25e2560f478d3b42a699adfafd24dcccf5169d74f" Mar 10 07:52:04 crc kubenswrapper[4825]: I0310 07:52:04.709824 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552152-zfqml" Mar 10 07:52:05 crc kubenswrapper[4825]: I0310 07:52:05.031380 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552146-rppwt"] Mar 10 07:52:05 crc kubenswrapper[4825]: I0310 07:52:05.040987 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552146-rppwt"] Mar 10 07:52:05 crc kubenswrapper[4825]: I0310 07:52:05.246436 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a2b44ed-4a43-4034-b298-779f0e562e1c" path="/var/lib/kubelet/pods/9a2b44ed-4a43-4034-b298-779f0e562e1c/volumes" Mar 10 07:52:16 crc kubenswrapper[4825]: I0310 07:52:16.237697 4825 scope.go:117] "RemoveContainer" containerID="4441b6748db08977859f541e3c9aeb39a944a681030d88e33129a9cd77990e0e" Mar 10 07:52:16 crc kubenswrapper[4825]: E0310 07:52:16.238936 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:52:30 crc kubenswrapper[4825]: I0310 07:52:30.236030 4825 scope.go:117] "RemoveContainer" containerID="4441b6748db08977859f541e3c9aeb39a944a681030d88e33129a9cd77990e0e" Mar 10 07:52:30 crc kubenswrapper[4825]: E0310 07:52:30.237230 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:52:39 crc kubenswrapper[4825]: I0310 07:52:39.635086 4825 scope.go:117] "RemoveContainer" containerID="e932489589c98e57f911429181513fbe004a30a09427a4726b7c604da54e6076" Mar 10 07:52:45 crc kubenswrapper[4825]: I0310 07:52:45.236583 4825 scope.go:117] "RemoveContainer" containerID="4441b6748db08977859f541e3c9aeb39a944a681030d88e33129a9cd77990e0e" Mar 10 07:52:45 crc kubenswrapper[4825]: E0310 07:52:45.237573 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:52:53 crc kubenswrapper[4825]: I0310 07:52:53.096174 4825 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-9mvbz"] Mar 10 07:52:53 crc kubenswrapper[4825]: E0310 07:52:53.097104 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1ae1165-a01f-4785-af2c-ae400f9d8617" containerName="oc" Mar 10 07:52:53 crc kubenswrapper[4825]: I0310 07:52:53.097130 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1ae1165-a01f-4785-af2c-ae400f9d8617" containerName="oc" Mar 10 07:52:53 crc kubenswrapper[4825]: I0310 07:52:53.097504 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1ae1165-a01f-4785-af2c-ae400f9d8617" containerName="oc" Mar 10 07:52:53 crc kubenswrapper[4825]: I0310 07:52:53.099545 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9mvbz" Mar 10 07:52:53 crc kubenswrapper[4825]: I0310 07:52:53.127546 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9mvbz"] Mar 10 07:52:53 crc kubenswrapper[4825]: I0310 07:52:53.260517 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bde01a92-3945-4f12-8aa4-f9cc7c5235bb-utilities\") pod \"redhat-operators-9mvbz\" (UID: \"bde01a92-3945-4f12-8aa4-f9cc7c5235bb\") " pod="openshift-marketplace/redhat-operators-9mvbz" Mar 10 07:52:53 crc kubenswrapper[4825]: I0310 07:52:53.260592 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bde01a92-3945-4f12-8aa4-f9cc7c5235bb-catalog-content\") pod \"redhat-operators-9mvbz\" (UID: \"bde01a92-3945-4f12-8aa4-f9cc7c5235bb\") " pod="openshift-marketplace/redhat-operators-9mvbz" Mar 10 07:52:53 crc kubenswrapper[4825]: I0310 07:52:53.260628 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf5lv\" (UniqueName: 
\"kubernetes.io/projected/bde01a92-3945-4f12-8aa4-f9cc7c5235bb-kube-api-access-sf5lv\") pod \"redhat-operators-9mvbz\" (UID: \"bde01a92-3945-4f12-8aa4-f9cc7c5235bb\") " pod="openshift-marketplace/redhat-operators-9mvbz" Mar 10 07:52:53 crc kubenswrapper[4825]: I0310 07:52:53.362022 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bde01a92-3945-4f12-8aa4-f9cc7c5235bb-utilities\") pod \"redhat-operators-9mvbz\" (UID: \"bde01a92-3945-4f12-8aa4-f9cc7c5235bb\") " pod="openshift-marketplace/redhat-operators-9mvbz" Mar 10 07:52:53 crc kubenswrapper[4825]: I0310 07:52:53.362198 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bde01a92-3945-4f12-8aa4-f9cc7c5235bb-catalog-content\") pod \"redhat-operators-9mvbz\" (UID: \"bde01a92-3945-4f12-8aa4-f9cc7c5235bb\") " pod="openshift-marketplace/redhat-operators-9mvbz" Mar 10 07:52:53 crc kubenswrapper[4825]: I0310 07:52:53.362629 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bde01a92-3945-4f12-8aa4-f9cc7c5235bb-utilities\") pod \"redhat-operators-9mvbz\" (UID: \"bde01a92-3945-4f12-8aa4-f9cc7c5235bb\") " pod="openshift-marketplace/redhat-operators-9mvbz" Mar 10 07:52:53 crc kubenswrapper[4825]: I0310 07:52:53.362781 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bde01a92-3945-4f12-8aa4-f9cc7c5235bb-catalog-content\") pod \"redhat-operators-9mvbz\" (UID: \"bde01a92-3945-4f12-8aa4-f9cc7c5235bb\") " pod="openshift-marketplace/redhat-operators-9mvbz" Mar 10 07:52:53 crc kubenswrapper[4825]: I0310 07:52:53.362966 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf5lv\" (UniqueName: 
\"kubernetes.io/projected/bde01a92-3945-4f12-8aa4-f9cc7c5235bb-kube-api-access-sf5lv\") pod \"redhat-operators-9mvbz\" (UID: \"bde01a92-3945-4f12-8aa4-f9cc7c5235bb\") " pod="openshift-marketplace/redhat-operators-9mvbz" Mar 10 07:52:53 crc kubenswrapper[4825]: I0310 07:52:53.388376 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf5lv\" (UniqueName: \"kubernetes.io/projected/bde01a92-3945-4f12-8aa4-f9cc7c5235bb-kube-api-access-sf5lv\") pod \"redhat-operators-9mvbz\" (UID: \"bde01a92-3945-4f12-8aa4-f9cc7c5235bb\") " pod="openshift-marketplace/redhat-operators-9mvbz" Mar 10 07:52:53 crc kubenswrapper[4825]: I0310 07:52:53.437047 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9mvbz" Mar 10 07:52:53 crc kubenswrapper[4825]: I0310 07:52:53.658542 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9mvbz"] Mar 10 07:52:53 crc kubenswrapper[4825]: W0310 07:52:53.662197 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbde01a92_3945_4f12_8aa4_f9cc7c5235bb.slice/crio-646e6cf36bfbbe4c05fe56cfac87f4a4db667b825a3e16db00fa134f499b8f3f WatchSource:0}: Error finding container 646e6cf36bfbbe4c05fe56cfac87f4a4db667b825a3e16db00fa134f499b8f3f: Status 404 returned error can't find the container with id 646e6cf36bfbbe4c05fe56cfac87f4a4db667b825a3e16db00fa134f499b8f3f Mar 10 07:52:54 crc kubenswrapper[4825]: I0310 07:52:54.134958 4825 generic.go:334] "Generic (PLEG): container finished" podID="bde01a92-3945-4f12-8aa4-f9cc7c5235bb" containerID="59dc274a79503a06b51dc88705e976fa90172b3e3bd0bd98b28750b9ec43734d" exitCode=0 Mar 10 07:52:54 crc kubenswrapper[4825]: I0310 07:52:54.135001 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9mvbz" 
event={"ID":"bde01a92-3945-4f12-8aa4-f9cc7c5235bb","Type":"ContainerDied","Data":"59dc274a79503a06b51dc88705e976fa90172b3e3bd0bd98b28750b9ec43734d"} Mar 10 07:52:54 crc kubenswrapper[4825]: I0310 07:52:54.135025 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9mvbz" event={"ID":"bde01a92-3945-4f12-8aa4-f9cc7c5235bb","Type":"ContainerStarted","Data":"646e6cf36bfbbe4c05fe56cfac87f4a4db667b825a3e16db00fa134f499b8f3f"} Mar 10 07:52:54 crc kubenswrapper[4825]: I0310 07:52:54.882337 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pzvkq"] Mar 10 07:52:54 crc kubenswrapper[4825]: I0310 07:52:54.885227 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pzvkq" Mar 10 07:52:54 crc kubenswrapper[4825]: I0310 07:52:54.901715 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pzvkq"] Mar 10 07:52:54 crc kubenswrapper[4825]: I0310 07:52:54.984632 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99cnh\" (UniqueName: \"kubernetes.io/projected/a702b464-36e5-4458-ab89-9830df30a344-kube-api-access-99cnh\") pod \"community-operators-pzvkq\" (UID: \"a702b464-36e5-4458-ab89-9830df30a344\") " pod="openshift-marketplace/community-operators-pzvkq" Mar 10 07:52:54 crc kubenswrapper[4825]: I0310 07:52:54.984871 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a702b464-36e5-4458-ab89-9830df30a344-catalog-content\") pod \"community-operators-pzvkq\" (UID: \"a702b464-36e5-4458-ab89-9830df30a344\") " pod="openshift-marketplace/community-operators-pzvkq" Mar 10 07:52:54 crc kubenswrapper[4825]: I0310 07:52:54.985020 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a702b464-36e5-4458-ab89-9830df30a344-utilities\") pod \"community-operators-pzvkq\" (UID: \"a702b464-36e5-4458-ab89-9830df30a344\") " pod="openshift-marketplace/community-operators-pzvkq" Mar 10 07:52:55 crc kubenswrapper[4825]: I0310 07:52:55.086108 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a702b464-36e5-4458-ab89-9830df30a344-utilities\") pod \"community-operators-pzvkq\" (UID: \"a702b464-36e5-4458-ab89-9830df30a344\") " pod="openshift-marketplace/community-operators-pzvkq" Mar 10 07:52:55 crc kubenswrapper[4825]: I0310 07:52:55.086279 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99cnh\" (UniqueName: \"kubernetes.io/projected/a702b464-36e5-4458-ab89-9830df30a344-kube-api-access-99cnh\") pod \"community-operators-pzvkq\" (UID: \"a702b464-36e5-4458-ab89-9830df30a344\") " pod="openshift-marketplace/community-operators-pzvkq" Mar 10 07:52:55 crc kubenswrapper[4825]: I0310 07:52:55.086394 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a702b464-36e5-4458-ab89-9830df30a344-catalog-content\") pod \"community-operators-pzvkq\" (UID: \"a702b464-36e5-4458-ab89-9830df30a344\") " pod="openshift-marketplace/community-operators-pzvkq" Mar 10 07:52:55 crc kubenswrapper[4825]: I0310 07:52:55.086599 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a702b464-36e5-4458-ab89-9830df30a344-utilities\") pod \"community-operators-pzvkq\" (UID: \"a702b464-36e5-4458-ab89-9830df30a344\") " pod="openshift-marketplace/community-operators-pzvkq" Mar 10 07:52:55 crc kubenswrapper[4825]: I0310 07:52:55.086947 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a702b464-36e5-4458-ab89-9830df30a344-catalog-content\") pod \"community-operators-pzvkq\" (UID: \"a702b464-36e5-4458-ab89-9830df30a344\") " pod="openshift-marketplace/community-operators-pzvkq" Mar 10 07:52:55 crc kubenswrapper[4825]: I0310 07:52:55.110216 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99cnh\" (UniqueName: \"kubernetes.io/projected/a702b464-36e5-4458-ab89-9830df30a344-kube-api-access-99cnh\") pod \"community-operators-pzvkq\" (UID: \"a702b464-36e5-4458-ab89-9830df30a344\") " pod="openshift-marketplace/community-operators-pzvkq" Mar 10 07:52:55 crc kubenswrapper[4825]: I0310 07:52:55.228578 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pzvkq" Mar 10 07:52:55 crc kubenswrapper[4825]: I0310 07:52:55.719148 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pzvkq"] Mar 10 07:52:56 crc kubenswrapper[4825]: I0310 07:52:56.157445 4825 generic.go:334] "Generic (PLEG): container finished" podID="a702b464-36e5-4458-ab89-9830df30a344" containerID="b023d89b72f1fa48b38cbee38f42c56fa0e20a2d589de340d826e9c3cca1fb22" exitCode=0 Mar 10 07:52:56 crc kubenswrapper[4825]: I0310 07:52:56.157682 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzvkq" event={"ID":"a702b464-36e5-4458-ab89-9830df30a344","Type":"ContainerDied","Data":"b023d89b72f1fa48b38cbee38f42c56fa0e20a2d589de340d826e9c3cca1fb22"} Mar 10 07:52:56 crc kubenswrapper[4825]: I0310 07:52:56.157746 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzvkq" event={"ID":"a702b464-36e5-4458-ab89-9830df30a344","Type":"ContainerStarted","Data":"9c9dbd365e09a8623814416e398c32af98f504a692b31f50d4d7508ff14efccb"} Mar 10 07:52:56 crc kubenswrapper[4825]: I0310 07:52:56.160038 4825 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-operators-9mvbz" event={"ID":"bde01a92-3945-4f12-8aa4-f9cc7c5235bb","Type":"ContainerStarted","Data":"4353b82e6c648bb7d7657ea520ab1001f324fdf184fe5fc1b1639e2b7cbff702"} Mar 10 07:52:56 crc kubenswrapper[4825]: I0310 07:52:56.236489 4825 scope.go:117] "RemoveContainer" containerID="4441b6748db08977859f541e3c9aeb39a944a681030d88e33129a9cd77990e0e" Mar 10 07:52:56 crc kubenswrapper[4825]: E0310 07:52:56.236726 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:52:57 crc kubenswrapper[4825]: I0310 07:52:57.167981 4825 generic.go:334] "Generic (PLEG): container finished" podID="bde01a92-3945-4f12-8aa4-f9cc7c5235bb" containerID="4353b82e6c648bb7d7657ea520ab1001f324fdf184fe5fc1b1639e2b7cbff702" exitCode=0 Mar 10 07:52:57 crc kubenswrapper[4825]: I0310 07:52:57.168029 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9mvbz" event={"ID":"bde01a92-3945-4f12-8aa4-f9cc7c5235bb","Type":"ContainerDied","Data":"4353b82e6c648bb7d7657ea520ab1001f324fdf184fe5fc1b1639e2b7cbff702"} Mar 10 07:52:57 crc kubenswrapper[4825]: I0310 07:52:57.168379 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9mvbz" event={"ID":"bde01a92-3945-4f12-8aa4-f9cc7c5235bb","Type":"ContainerStarted","Data":"bef600d07650e623397e70693b05d72f6a1d424e2833d3af63ce8ca9b3fc5051"} Mar 10 07:52:57 crc kubenswrapper[4825]: I0310 07:52:57.172084 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzvkq" 
event={"ID":"a702b464-36e5-4458-ab89-9830df30a344","Type":"ContainerStarted","Data":"2fd51479c43494f0f85f371ec3f24a6121a517567870276bde5f171066f8930a"} Mar 10 07:52:57 crc kubenswrapper[4825]: I0310 07:52:57.194153 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9mvbz" podStartSLOduration=1.7202341799999998 podStartE2EDuration="4.194119277s" podCreationTimestamp="2026-03-10 07:52:53 +0000 UTC" firstStartedPulling="2026-03-10 07:52:54.136637447 +0000 UTC m=+4127.166418062" lastFinishedPulling="2026-03-10 07:52:56.610522544 +0000 UTC m=+4129.640303159" observedRunningTime="2026-03-10 07:52:57.192251328 +0000 UTC m=+4130.222031963" watchObservedRunningTime="2026-03-10 07:52:57.194119277 +0000 UTC m=+4130.223899902" Mar 10 07:52:58 crc kubenswrapper[4825]: I0310 07:52:58.182884 4825 generic.go:334] "Generic (PLEG): container finished" podID="a702b464-36e5-4458-ab89-9830df30a344" containerID="2fd51479c43494f0f85f371ec3f24a6121a517567870276bde5f171066f8930a" exitCode=0 Mar 10 07:52:58 crc kubenswrapper[4825]: I0310 07:52:58.182966 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzvkq" event={"ID":"a702b464-36e5-4458-ab89-9830df30a344","Type":"ContainerDied","Data":"2fd51479c43494f0f85f371ec3f24a6121a517567870276bde5f171066f8930a"} Mar 10 07:53:00 crc kubenswrapper[4825]: I0310 07:53:00.207451 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzvkq" event={"ID":"a702b464-36e5-4458-ab89-9830df30a344","Type":"ContainerStarted","Data":"ff7522da04e993f53cfff70750d5ab3529b5a497cfe1a1d5c0226e25b647ce6a"} Mar 10 07:53:00 crc kubenswrapper[4825]: I0310 07:53:00.235728 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pzvkq" podStartSLOduration=3.25492625 podStartE2EDuration="6.235695591s" podCreationTimestamp="2026-03-10 07:52:54 +0000 
UTC" firstStartedPulling="2026-03-10 07:52:56.158828054 +0000 UTC m=+4129.188608679" lastFinishedPulling="2026-03-10 07:52:59.139597385 +0000 UTC m=+4132.169378020" observedRunningTime="2026-03-10 07:53:00.23565188 +0000 UTC m=+4133.265432495" watchObservedRunningTime="2026-03-10 07:53:00.235695591 +0000 UTC m=+4133.265476256" Mar 10 07:53:03 crc kubenswrapper[4825]: I0310 07:53:03.437961 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9mvbz" Mar 10 07:53:03 crc kubenswrapper[4825]: I0310 07:53:03.438472 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9mvbz" Mar 10 07:53:03 crc kubenswrapper[4825]: I0310 07:53:03.504623 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9mvbz" Mar 10 07:53:04 crc kubenswrapper[4825]: I0310 07:53:04.323016 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9mvbz" Mar 10 07:53:05 crc kubenswrapper[4825]: I0310 07:53:05.070908 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9mvbz"] Mar 10 07:53:05 crc kubenswrapper[4825]: I0310 07:53:05.228946 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pzvkq" Mar 10 07:53:05 crc kubenswrapper[4825]: I0310 07:53:05.229262 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pzvkq" Mar 10 07:53:05 crc kubenswrapper[4825]: I0310 07:53:05.273538 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pzvkq" Mar 10 07:53:06 crc kubenswrapper[4825]: I0310 07:53:06.251399 4825 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-9mvbz" podUID="bde01a92-3945-4f12-8aa4-f9cc7c5235bb" containerName="registry-server" containerID="cri-o://bef600d07650e623397e70693b05d72f6a1d424e2833d3af63ce8ca9b3fc5051" gracePeriod=2 Mar 10 07:53:06 crc kubenswrapper[4825]: I0310 07:53:06.598889 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pzvkq" Mar 10 07:53:07 crc kubenswrapper[4825]: I0310 07:53:07.672874 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pzvkq"] Mar 10 07:53:08 crc kubenswrapper[4825]: I0310 07:53:08.268759 4825 generic.go:334] "Generic (PLEG): container finished" podID="bde01a92-3945-4f12-8aa4-f9cc7c5235bb" containerID="bef600d07650e623397e70693b05d72f6a1d424e2833d3af63ce8ca9b3fc5051" exitCode=0 Mar 10 07:53:08 crc kubenswrapper[4825]: I0310 07:53:08.268837 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9mvbz" event={"ID":"bde01a92-3945-4f12-8aa4-f9cc7c5235bb","Type":"ContainerDied","Data":"bef600d07650e623397e70693b05d72f6a1d424e2833d3af63ce8ca9b3fc5051"} Mar 10 07:53:08 crc kubenswrapper[4825]: I0310 07:53:08.268996 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pzvkq" podUID="a702b464-36e5-4458-ab89-9830df30a344" containerName="registry-server" containerID="cri-o://ff7522da04e993f53cfff70750d5ab3529b5a497cfe1a1d5c0226e25b647ce6a" gracePeriod=2 Mar 10 07:53:08 crc kubenswrapper[4825]: I0310 07:53:08.786249 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9mvbz" Mar 10 07:53:08 crc kubenswrapper[4825]: I0310 07:53:08.838023 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pzvkq" Mar 10 07:53:08 crc kubenswrapper[4825]: I0310 07:53:08.908937 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bde01a92-3945-4f12-8aa4-f9cc7c5235bb-catalog-content\") pod \"bde01a92-3945-4f12-8aa4-f9cc7c5235bb\" (UID: \"bde01a92-3945-4f12-8aa4-f9cc7c5235bb\") " Mar 10 07:53:08 crc kubenswrapper[4825]: I0310 07:53:08.909006 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bde01a92-3945-4f12-8aa4-f9cc7c5235bb-utilities\") pod \"bde01a92-3945-4f12-8aa4-f9cc7c5235bb\" (UID: \"bde01a92-3945-4f12-8aa4-f9cc7c5235bb\") " Mar 10 07:53:08 crc kubenswrapper[4825]: I0310 07:53:08.909171 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sf5lv\" (UniqueName: \"kubernetes.io/projected/bde01a92-3945-4f12-8aa4-f9cc7c5235bb-kube-api-access-sf5lv\") pod \"bde01a92-3945-4f12-8aa4-f9cc7c5235bb\" (UID: \"bde01a92-3945-4f12-8aa4-f9cc7c5235bb\") " Mar 10 07:53:08 crc kubenswrapper[4825]: I0310 07:53:08.911208 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bde01a92-3945-4f12-8aa4-f9cc7c5235bb-utilities" (OuterVolumeSpecName: "utilities") pod "bde01a92-3945-4f12-8aa4-f9cc7c5235bb" (UID: "bde01a92-3945-4f12-8aa4-f9cc7c5235bb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:53:08 crc kubenswrapper[4825]: I0310 07:53:08.914575 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bde01a92-3945-4f12-8aa4-f9cc7c5235bb-kube-api-access-sf5lv" (OuterVolumeSpecName: "kube-api-access-sf5lv") pod "bde01a92-3945-4f12-8aa4-f9cc7c5235bb" (UID: "bde01a92-3945-4f12-8aa4-f9cc7c5235bb"). InnerVolumeSpecName "kube-api-access-sf5lv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:53:09 crc kubenswrapper[4825]: I0310 07:53:09.010742 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99cnh\" (UniqueName: \"kubernetes.io/projected/a702b464-36e5-4458-ab89-9830df30a344-kube-api-access-99cnh\") pod \"a702b464-36e5-4458-ab89-9830df30a344\" (UID: \"a702b464-36e5-4458-ab89-9830df30a344\") " Mar 10 07:53:09 crc kubenswrapper[4825]: I0310 07:53:09.010873 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a702b464-36e5-4458-ab89-9830df30a344-utilities\") pod \"a702b464-36e5-4458-ab89-9830df30a344\" (UID: \"a702b464-36e5-4458-ab89-9830df30a344\") " Mar 10 07:53:09 crc kubenswrapper[4825]: I0310 07:53:09.010976 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a702b464-36e5-4458-ab89-9830df30a344-catalog-content\") pod \"a702b464-36e5-4458-ab89-9830df30a344\" (UID: \"a702b464-36e5-4458-ab89-9830df30a344\") " Mar 10 07:53:09 crc kubenswrapper[4825]: I0310 07:53:09.011297 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sf5lv\" (UniqueName: \"kubernetes.io/projected/bde01a92-3945-4f12-8aa4-f9cc7c5235bb-kube-api-access-sf5lv\") on node \"crc\" DevicePath \"\"" Mar 10 07:53:09 crc kubenswrapper[4825]: I0310 07:53:09.011316 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bde01a92-3945-4f12-8aa4-f9cc7c5235bb-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 07:53:09 crc kubenswrapper[4825]: I0310 07:53:09.011924 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a702b464-36e5-4458-ab89-9830df30a344-utilities" (OuterVolumeSpecName: "utilities") pod "a702b464-36e5-4458-ab89-9830df30a344" (UID: 
"a702b464-36e5-4458-ab89-9830df30a344"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:53:09 crc kubenswrapper[4825]: I0310 07:53:09.017241 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a702b464-36e5-4458-ab89-9830df30a344-kube-api-access-99cnh" (OuterVolumeSpecName: "kube-api-access-99cnh") pod "a702b464-36e5-4458-ab89-9830df30a344" (UID: "a702b464-36e5-4458-ab89-9830df30a344"). InnerVolumeSpecName "kube-api-access-99cnh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:53:09 crc kubenswrapper[4825]: I0310 07:53:09.049880 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bde01a92-3945-4f12-8aa4-f9cc7c5235bb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bde01a92-3945-4f12-8aa4-f9cc7c5235bb" (UID: "bde01a92-3945-4f12-8aa4-f9cc7c5235bb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:53:09 crc kubenswrapper[4825]: I0310 07:53:09.094741 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a702b464-36e5-4458-ab89-9830df30a344-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a702b464-36e5-4458-ab89-9830df30a344" (UID: "a702b464-36e5-4458-ab89-9830df30a344"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:53:09 crc kubenswrapper[4825]: I0310 07:53:09.112552 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a702b464-36e5-4458-ab89-9830df30a344-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 07:53:09 crc kubenswrapper[4825]: I0310 07:53:09.112586 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bde01a92-3945-4f12-8aa4-f9cc7c5235bb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 07:53:09 crc kubenswrapper[4825]: I0310 07:53:09.112599 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a702b464-36e5-4458-ab89-9830df30a344-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 07:53:09 crc kubenswrapper[4825]: I0310 07:53:09.112615 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99cnh\" (UniqueName: \"kubernetes.io/projected/a702b464-36e5-4458-ab89-9830df30a344-kube-api-access-99cnh\") on node \"crc\" DevicePath \"\"" Mar 10 07:53:09 crc kubenswrapper[4825]: I0310 07:53:09.276780 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9mvbz" event={"ID":"bde01a92-3945-4f12-8aa4-f9cc7c5235bb","Type":"ContainerDied","Data":"646e6cf36bfbbe4c05fe56cfac87f4a4db667b825a3e16db00fa134f499b8f3f"} Mar 10 07:53:09 crc kubenswrapper[4825]: I0310 07:53:09.276837 4825 scope.go:117] "RemoveContainer" containerID="bef600d07650e623397e70693b05d72f6a1d424e2833d3af63ce8ca9b3fc5051" Mar 10 07:53:09 crc kubenswrapper[4825]: I0310 07:53:09.277316 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9mvbz" Mar 10 07:53:09 crc kubenswrapper[4825]: I0310 07:53:09.282742 4825 generic.go:334] "Generic (PLEG): container finished" podID="a702b464-36e5-4458-ab89-9830df30a344" containerID="ff7522da04e993f53cfff70750d5ab3529b5a497cfe1a1d5c0226e25b647ce6a" exitCode=0 Mar 10 07:53:09 crc kubenswrapper[4825]: I0310 07:53:09.283071 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzvkq" event={"ID":"a702b464-36e5-4458-ab89-9830df30a344","Type":"ContainerDied","Data":"ff7522da04e993f53cfff70750d5ab3529b5a497cfe1a1d5c0226e25b647ce6a"} Mar 10 07:53:09 crc kubenswrapper[4825]: I0310 07:53:09.283318 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzvkq" event={"ID":"a702b464-36e5-4458-ab89-9830df30a344","Type":"ContainerDied","Data":"9c9dbd365e09a8623814416e398c32af98f504a692b31f50d4d7508ff14efccb"} Mar 10 07:53:09 crc kubenswrapper[4825]: I0310 07:53:09.283182 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pzvkq" Mar 10 07:53:09 crc kubenswrapper[4825]: I0310 07:53:09.307045 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9mvbz"] Mar 10 07:53:09 crc kubenswrapper[4825]: I0310 07:53:09.316958 4825 scope.go:117] "RemoveContainer" containerID="4353b82e6c648bb7d7657ea520ab1001f324fdf184fe5fc1b1639e2b7cbff702" Mar 10 07:53:09 crc kubenswrapper[4825]: I0310 07:53:09.317552 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9mvbz"] Mar 10 07:53:09 crc kubenswrapper[4825]: I0310 07:53:09.326345 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pzvkq"] Mar 10 07:53:09 crc kubenswrapper[4825]: I0310 07:53:09.333091 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pzvkq"] Mar 10 07:53:09 crc kubenswrapper[4825]: I0310 07:53:09.342671 4825 scope.go:117] "RemoveContainer" containerID="59dc274a79503a06b51dc88705e976fa90172b3e3bd0bd98b28750b9ec43734d" Mar 10 07:53:09 crc kubenswrapper[4825]: I0310 07:53:09.371628 4825 scope.go:117] "RemoveContainer" containerID="ff7522da04e993f53cfff70750d5ab3529b5a497cfe1a1d5c0226e25b647ce6a" Mar 10 07:53:09 crc kubenswrapper[4825]: I0310 07:53:09.390465 4825 scope.go:117] "RemoveContainer" containerID="2fd51479c43494f0f85f371ec3f24a6121a517567870276bde5f171066f8930a" Mar 10 07:53:09 crc kubenswrapper[4825]: I0310 07:53:09.418017 4825 scope.go:117] "RemoveContainer" containerID="b023d89b72f1fa48b38cbee38f42c56fa0e20a2d589de340d826e9c3cca1fb22" Mar 10 07:53:09 crc kubenswrapper[4825]: I0310 07:53:09.505223 4825 scope.go:117] "RemoveContainer" containerID="ff7522da04e993f53cfff70750d5ab3529b5a497cfe1a1d5c0226e25b647ce6a" Mar 10 07:53:09 crc kubenswrapper[4825]: E0310 07:53:09.509524 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"ff7522da04e993f53cfff70750d5ab3529b5a497cfe1a1d5c0226e25b647ce6a\": container with ID starting with ff7522da04e993f53cfff70750d5ab3529b5a497cfe1a1d5c0226e25b647ce6a not found: ID does not exist" containerID="ff7522da04e993f53cfff70750d5ab3529b5a497cfe1a1d5c0226e25b647ce6a" Mar 10 07:53:09 crc kubenswrapper[4825]: I0310 07:53:09.509607 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff7522da04e993f53cfff70750d5ab3529b5a497cfe1a1d5c0226e25b647ce6a"} err="failed to get container status \"ff7522da04e993f53cfff70750d5ab3529b5a497cfe1a1d5c0226e25b647ce6a\": rpc error: code = NotFound desc = could not find container \"ff7522da04e993f53cfff70750d5ab3529b5a497cfe1a1d5c0226e25b647ce6a\": container with ID starting with ff7522da04e993f53cfff70750d5ab3529b5a497cfe1a1d5c0226e25b647ce6a not found: ID does not exist" Mar 10 07:53:09 crc kubenswrapper[4825]: I0310 07:53:09.509651 4825 scope.go:117] "RemoveContainer" containerID="2fd51479c43494f0f85f371ec3f24a6121a517567870276bde5f171066f8930a" Mar 10 07:53:09 crc kubenswrapper[4825]: E0310 07:53:09.510079 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fd51479c43494f0f85f371ec3f24a6121a517567870276bde5f171066f8930a\": container with ID starting with 2fd51479c43494f0f85f371ec3f24a6121a517567870276bde5f171066f8930a not found: ID does not exist" containerID="2fd51479c43494f0f85f371ec3f24a6121a517567870276bde5f171066f8930a" Mar 10 07:53:09 crc kubenswrapper[4825]: I0310 07:53:09.510120 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fd51479c43494f0f85f371ec3f24a6121a517567870276bde5f171066f8930a"} err="failed to get container status \"2fd51479c43494f0f85f371ec3f24a6121a517567870276bde5f171066f8930a\": rpc error: code = NotFound desc = could not find container 
\"2fd51479c43494f0f85f371ec3f24a6121a517567870276bde5f171066f8930a\": container with ID starting with 2fd51479c43494f0f85f371ec3f24a6121a517567870276bde5f171066f8930a not found: ID does not exist" Mar 10 07:53:09 crc kubenswrapper[4825]: I0310 07:53:09.510158 4825 scope.go:117] "RemoveContainer" containerID="b023d89b72f1fa48b38cbee38f42c56fa0e20a2d589de340d826e9c3cca1fb22" Mar 10 07:53:09 crc kubenswrapper[4825]: E0310 07:53:09.510452 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b023d89b72f1fa48b38cbee38f42c56fa0e20a2d589de340d826e9c3cca1fb22\": container with ID starting with b023d89b72f1fa48b38cbee38f42c56fa0e20a2d589de340d826e9c3cca1fb22 not found: ID does not exist" containerID="b023d89b72f1fa48b38cbee38f42c56fa0e20a2d589de340d826e9c3cca1fb22" Mar 10 07:53:09 crc kubenswrapper[4825]: I0310 07:53:09.510473 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b023d89b72f1fa48b38cbee38f42c56fa0e20a2d589de340d826e9c3cca1fb22"} err="failed to get container status \"b023d89b72f1fa48b38cbee38f42c56fa0e20a2d589de340d826e9c3cca1fb22\": rpc error: code = NotFound desc = could not find container \"b023d89b72f1fa48b38cbee38f42c56fa0e20a2d589de340d826e9c3cca1fb22\": container with ID starting with b023d89b72f1fa48b38cbee38f42c56fa0e20a2d589de340d826e9c3cca1fb22 not found: ID does not exist" Mar 10 07:53:10 crc kubenswrapper[4825]: I0310 07:53:10.236483 4825 scope.go:117] "RemoveContainer" containerID="4441b6748db08977859f541e3c9aeb39a944a681030d88e33129a9cd77990e0e" Mar 10 07:53:10 crc kubenswrapper[4825]: E0310 07:53:10.236850 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:53:11 crc kubenswrapper[4825]: I0310 07:53:11.246253 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a702b464-36e5-4458-ab89-9830df30a344" path="/var/lib/kubelet/pods/a702b464-36e5-4458-ab89-9830df30a344/volumes" Mar 10 07:53:11 crc kubenswrapper[4825]: I0310 07:53:11.248012 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bde01a92-3945-4f12-8aa4-f9cc7c5235bb" path="/var/lib/kubelet/pods/bde01a92-3945-4f12-8aa4-f9cc7c5235bb/volumes" Mar 10 07:53:25 crc kubenswrapper[4825]: I0310 07:53:25.237088 4825 scope.go:117] "RemoveContainer" containerID="4441b6748db08977859f541e3c9aeb39a944a681030d88e33129a9cd77990e0e" Mar 10 07:53:25 crc kubenswrapper[4825]: E0310 07:53:25.237905 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:53:38 crc kubenswrapper[4825]: I0310 07:53:38.236752 4825 scope.go:117] "RemoveContainer" containerID="4441b6748db08977859f541e3c9aeb39a944a681030d88e33129a9cd77990e0e" Mar 10 07:53:38 crc kubenswrapper[4825]: E0310 07:53:38.237706 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:53:52 crc kubenswrapper[4825]: 
I0310 07:53:52.237373 4825 scope.go:117] "RemoveContainer" containerID="4441b6748db08977859f541e3c9aeb39a944a681030d88e33129a9cd77990e0e" Mar 10 07:53:52 crc kubenswrapper[4825]: E0310 07:53:52.238536 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:54:00 crc kubenswrapper[4825]: I0310 07:54:00.146176 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552154-h29cn"] Mar 10 07:54:00 crc kubenswrapper[4825]: E0310 07:54:00.147094 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a702b464-36e5-4458-ab89-9830df30a344" containerName="extract-utilities" Mar 10 07:54:00 crc kubenswrapper[4825]: I0310 07:54:00.147112 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a702b464-36e5-4458-ab89-9830df30a344" containerName="extract-utilities" Mar 10 07:54:00 crc kubenswrapper[4825]: E0310 07:54:00.147149 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bde01a92-3945-4f12-8aa4-f9cc7c5235bb" containerName="registry-server" Mar 10 07:54:00 crc kubenswrapper[4825]: I0310 07:54:00.147158 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde01a92-3945-4f12-8aa4-f9cc7c5235bb" containerName="registry-server" Mar 10 07:54:00 crc kubenswrapper[4825]: E0310 07:54:00.147170 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bde01a92-3945-4f12-8aa4-f9cc7c5235bb" containerName="extract-utilities" Mar 10 07:54:00 crc kubenswrapper[4825]: I0310 07:54:00.147178 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde01a92-3945-4f12-8aa4-f9cc7c5235bb" 
containerName="extract-utilities" Mar 10 07:54:00 crc kubenswrapper[4825]: E0310 07:54:00.147200 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bde01a92-3945-4f12-8aa4-f9cc7c5235bb" containerName="extract-content" Mar 10 07:54:00 crc kubenswrapper[4825]: I0310 07:54:00.147209 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde01a92-3945-4f12-8aa4-f9cc7c5235bb" containerName="extract-content" Mar 10 07:54:00 crc kubenswrapper[4825]: E0310 07:54:00.147224 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a702b464-36e5-4458-ab89-9830df30a344" containerName="extract-content" Mar 10 07:54:00 crc kubenswrapper[4825]: I0310 07:54:00.147231 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a702b464-36e5-4458-ab89-9830df30a344" containerName="extract-content" Mar 10 07:54:00 crc kubenswrapper[4825]: E0310 07:54:00.147247 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a702b464-36e5-4458-ab89-9830df30a344" containerName="registry-server" Mar 10 07:54:00 crc kubenswrapper[4825]: I0310 07:54:00.147253 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a702b464-36e5-4458-ab89-9830df30a344" containerName="registry-server" Mar 10 07:54:00 crc kubenswrapper[4825]: I0310 07:54:00.147411 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="a702b464-36e5-4458-ab89-9830df30a344" containerName="registry-server" Mar 10 07:54:00 crc kubenswrapper[4825]: I0310 07:54:00.147433 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="bde01a92-3945-4f12-8aa4-f9cc7c5235bb" containerName="registry-server" Mar 10 07:54:00 crc kubenswrapper[4825]: I0310 07:54:00.147963 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552154-h29cn" Mar 10 07:54:00 crc kubenswrapper[4825]: I0310 07:54:00.150955 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 07:54:00 crc kubenswrapper[4825]: I0310 07:54:00.151305 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 07:54:00 crc kubenswrapper[4825]: I0310 07:54:00.151652 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 07:54:00 crc kubenswrapper[4825]: I0310 07:54:00.176800 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552154-h29cn"] Mar 10 07:54:00 crc kubenswrapper[4825]: I0310 07:54:00.313128 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzrvb\" (UniqueName: \"kubernetes.io/projected/4324c2f1-09fe-481a-a5bb-141229d53f70-kube-api-access-hzrvb\") pod \"auto-csr-approver-29552154-h29cn\" (UID: \"4324c2f1-09fe-481a-a5bb-141229d53f70\") " pod="openshift-infra/auto-csr-approver-29552154-h29cn" Mar 10 07:54:00 crc kubenswrapper[4825]: I0310 07:54:00.414368 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzrvb\" (UniqueName: \"kubernetes.io/projected/4324c2f1-09fe-481a-a5bb-141229d53f70-kube-api-access-hzrvb\") pod \"auto-csr-approver-29552154-h29cn\" (UID: \"4324c2f1-09fe-481a-a5bb-141229d53f70\") " pod="openshift-infra/auto-csr-approver-29552154-h29cn" Mar 10 07:54:00 crc kubenswrapper[4825]: I0310 07:54:00.432528 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzrvb\" (UniqueName: \"kubernetes.io/projected/4324c2f1-09fe-481a-a5bb-141229d53f70-kube-api-access-hzrvb\") pod \"auto-csr-approver-29552154-h29cn\" (UID: \"4324c2f1-09fe-481a-a5bb-141229d53f70\") " 
pod="openshift-infra/auto-csr-approver-29552154-h29cn" Mar 10 07:54:00 crc kubenswrapper[4825]: I0310 07:54:00.486554 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552154-h29cn" Mar 10 07:54:00 crc kubenswrapper[4825]: I0310 07:54:00.904043 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552154-h29cn"] Mar 10 07:54:01 crc kubenswrapper[4825]: I0310 07:54:01.710796 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552154-h29cn" event={"ID":"4324c2f1-09fe-481a-a5bb-141229d53f70","Type":"ContainerStarted","Data":"7bd961efced5bb62f65eff4e49402d735b2f28d7b5a049700644e7e5f7096336"} Mar 10 07:54:02 crc kubenswrapper[4825]: I0310 07:54:02.721859 4825 generic.go:334] "Generic (PLEG): container finished" podID="4324c2f1-09fe-481a-a5bb-141229d53f70" containerID="5d25e5b92acd45205ccf731b1c11831d2292cc933aafc99dd2610f1bd2e87593" exitCode=0 Mar 10 07:54:02 crc kubenswrapper[4825]: I0310 07:54:02.721932 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552154-h29cn" event={"ID":"4324c2f1-09fe-481a-a5bb-141229d53f70","Type":"ContainerDied","Data":"5d25e5b92acd45205ccf731b1c11831d2292cc933aafc99dd2610f1bd2e87593"} Mar 10 07:54:04 crc kubenswrapper[4825]: I0310 07:54:04.055828 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552154-h29cn" Mar 10 07:54:04 crc kubenswrapper[4825]: I0310 07:54:04.169858 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzrvb\" (UniqueName: \"kubernetes.io/projected/4324c2f1-09fe-481a-a5bb-141229d53f70-kube-api-access-hzrvb\") pod \"4324c2f1-09fe-481a-a5bb-141229d53f70\" (UID: \"4324c2f1-09fe-481a-a5bb-141229d53f70\") " Mar 10 07:54:04 crc kubenswrapper[4825]: I0310 07:54:04.177350 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4324c2f1-09fe-481a-a5bb-141229d53f70-kube-api-access-hzrvb" (OuterVolumeSpecName: "kube-api-access-hzrvb") pod "4324c2f1-09fe-481a-a5bb-141229d53f70" (UID: "4324c2f1-09fe-481a-a5bb-141229d53f70"). InnerVolumeSpecName "kube-api-access-hzrvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:54:04 crc kubenswrapper[4825]: I0310 07:54:04.271598 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzrvb\" (UniqueName: \"kubernetes.io/projected/4324c2f1-09fe-481a-a5bb-141229d53f70-kube-api-access-hzrvb\") on node \"crc\" DevicePath \"\"" Mar 10 07:54:04 crc kubenswrapper[4825]: I0310 07:54:04.741107 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552154-h29cn" event={"ID":"4324c2f1-09fe-481a-a5bb-141229d53f70","Type":"ContainerDied","Data":"7bd961efced5bb62f65eff4e49402d735b2f28d7b5a049700644e7e5f7096336"} Mar 10 07:54:04 crc kubenswrapper[4825]: I0310 07:54:04.741222 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bd961efced5bb62f65eff4e49402d735b2f28d7b5a049700644e7e5f7096336" Mar 10 07:54:04 crc kubenswrapper[4825]: I0310 07:54:04.741287 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552154-h29cn" Mar 10 07:54:05 crc kubenswrapper[4825]: I0310 07:54:05.153459 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552148-72fl9"] Mar 10 07:54:05 crc kubenswrapper[4825]: I0310 07:54:05.161957 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552148-72fl9"] Mar 10 07:54:05 crc kubenswrapper[4825]: I0310 07:54:05.253592 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22968ce1-c537-4b94-91d4-46ed534b995b" path="/var/lib/kubelet/pods/22968ce1-c537-4b94-91d4-46ed534b995b/volumes" Mar 10 07:54:07 crc kubenswrapper[4825]: I0310 07:54:07.236374 4825 scope.go:117] "RemoveContainer" containerID="4441b6748db08977859f541e3c9aeb39a944a681030d88e33129a9cd77990e0e" Mar 10 07:54:07 crc kubenswrapper[4825]: E0310 07:54:07.237193 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:54:20 crc kubenswrapper[4825]: I0310 07:54:20.236079 4825 scope.go:117] "RemoveContainer" containerID="4441b6748db08977859f541e3c9aeb39a944a681030d88e33129a9cd77990e0e" Mar 10 07:54:20 crc kubenswrapper[4825]: E0310 07:54:20.236932 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" 
podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:54:33 crc kubenswrapper[4825]: I0310 07:54:33.237265 4825 scope.go:117] "RemoveContainer" containerID="4441b6748db08977859f541e3c9aeb39a944a681030d88e33129a9cd77990e0e" Mar 10 07:54:33 crc kubenswrapper[4825]: E0310 07:54:33.238280 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:54:39 crc kubenswrapper[4825]: I0310 07:54:39.794993 4825 scope.go:117] "RemoveContainer" containerID="0d6e8de39c599ab6b37d117e7686ece52471a310d0060421d3799f7466d16803" Mar 10 07:54:45 crc kubenswrapper[4825]: I0310 07:54:45.236233 4825 scope.go:117] "RemoveContainer" containerID="4441b6748db08977859f541e3c9aeb39a944a681030d88e33129a9cd77990e0e" Mar 10 07:54:45 crc kubenswrapper[4825]: E0310 07:54:45.237031 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 07:55:00 crc kubenswrapper[4825]: I0310 07:55:00.236937 4825 scope.go:117] "RemoveContainer" containerID="4441b6748db08977859f541e3c9aeb39a944a681030d88e33129a9cd77990e0e" Mar 10 07:55:01 crc kubenswrapper[4825]: I0310 07:55:01.201982 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" 
event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerStarted","Data":"bf3d1cd0d377f9662a83f61c7f1b91e30e6257a568216e36432b13ae96d3370b"} Mar 10 07:56:00 crc kubenswrapper[4825]: I0310 07:56:00.179647 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552156-p624h"] Mar 10 07:56:00 crc kubenswrapper[4825]: E0310 07:56:00.180687 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4324c2f1-09fe-481a-a5bb-141229d53f70" containerName="oc" Mar 10 07:56:00 crc kubenswrapper[4825]: I0310 07:56:00.180708 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="4324c2f1-09fe-481a-a5bb-141229d53f70" containerName="oc" Mar 10 07:56:00 crc kubenswrapper[4825]: I0310 07:56:00.180894 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="4324c2f1-09fe-481a-a5bb-141229d53f70" containerName="oc" Mar 10 07:56:00 crc kubenswrapper[4825]: I0310 07:56:00.181516 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552156-p624h" Mar 10 07:56:00 crc kubenswrapper[4825]: I0310 07:56:00.184874 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 07:56:00 crc kubenswrapper[4825]: I0310 07:56:00.195728 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 07:56:00 crc kubenswrapper[4825]: I0310 07:56:00.196012 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 07:56:00 crc kubenswrapper[4825]: I0310 07:56:00.200614 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552156-p624h"] Mar 10 07:56:00 crc kubenswrapper[4825]: I0310 07:56:00.284991 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dktb6\" (UniqueName: 
\"kubernetes.io/projected/125ee403-a265-4083-a6ab-c02005988548-kube-api-access-dktb6\") pod \"auto-csr-approver-29552156-p624h\" (UID: \"125ee403-a265-4083-a6ab-c02005988548\") " pod="openshift-infra/auto-csr-approver-29552156-p624h" Mar 10 07:56:00 crc kubenswrapper[4825]: I0310 07:56:00.385801 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dktb6\" (UniqueName: \"kubernetes.io/projected/125ee403-a265-4083-a6ab-c02005988548-kube-api-access-dktb6\") pod \"auto-csr-approver-29552156-p624h\" (UID: \"125ee403-a265-4083-a6ab-c02005988548\") " pod="openshift-infra/auto-csr-approver-29552156-p624h" Mar 10 07:56:00 crc kubenswrapper[4825]: I0310 07:56:00.408616 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dktb6\" (UniqueName: \"kubernetes.io/projected/125ee403-a265-4083-a6ab-c02005988548-kube-api-access-dktb6\") pod \"auto-csr-approver-29552156-p624h\" (UID: \"125ee403-a265-4083-a6ab-c02005988548\") " pod="openshift-infra/auto-csr-approver-29552156-p624h" Mar 10 07:56:00 crc kubenswrapper[4825]: I0310 07:56:00.517296 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552156-p624h" Mar 10 07:56:01 crc kubenswrapper[4825]: I0310 07:56:01.025455 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552156-p624h"] Mar 10 07:56:01 crc kubenswrapper[4825]: I0310 07:56:01.668034 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552156-p624h" event={"ID":"125ee403-a265-4083-a6ab-c02005988548","Type":"ContainerStarted","Data":"346fc129e5f700c809513dc79b2048baf98397821985632fc69ab4cd42b0e5a8"} Mar 10 07:56:02 crc kubenswrapper[4825]: I0310 07:56:02.677084 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552156-p624h" event={"ID":"125ee403-a265-4083-a6ab-c02005988548","Type":"ContainerStarted","Data":"8d93e475acb39cfb291a147d608abb51f38af70c976fe05d21e86ef686302452"} Mar 10 07:56:02 crc kubenswrapper[4825]: I0310 07:56:02.692982 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552156-p624h" podStartSLOduration=1.463507785 podStartE2EDuration="2.692952702s" podCreationTimestamp="2026-03-10 07:56:00 +0000 UTC" firstStartedPulling="2026-03-10 07:56:01.039496911 +0000 UTC m=+4314.069277556" lastFinishedPulling="2026-03-10 07:56:02.268941818 +0000 UTC m=+4315.298722473" observedRunningTime="2026-03-10 07:56:02.6905848 +0000 UTC m=+4315.720365425" watchObservedRunningTime="2026-03-10 07:56:02.692952702 +0000 UTC m=+4315.722733377" Mar 10 07:56:03 crc kubenswrapper[4825]: I0310 07:56:03.688854 4825 generic.go:334] "Generic (PLEG): container finished" podID="125ee403-a265-4083-a6ab-c02005988548" containerID="8d93e475acb39cfb291a147d608abb51f38af70c976fe05d21e86ef686302452" exitCode=0 Mar 10 07:56:03 crc kubenswrapper[4825]: I0310 07:56:03.688916 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552156-p624h" 
event={"ID":"125ee403-a265-4083-a6ab-c02005988548","Type":"ContainerDied","Data":"8d93e475acb39cfb291a147d608abb51f38af70c976fe05d21e86ef686302452"} Mar 10 07:56:05 crc kubenswrapper[4825]: I0310 07:56:05.096369 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552156-p624h" Mar 10 07:56:05 crc kubenswrapper[4825]: I0310 07:56:05.177538 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dktb6\" (UniqueName: \"kubernetes.io/projected/125ee403-a265-4083-a6ab-c02005988548-kube-api-access-dktb6\") pod \"125ee403-a265-4083-a6ab-c02005988548\" (UID: \"125ee403-a265-4083-a6ab-c02005988548\") " Mar 10 07:56:05 crc kubenswrapper[4825]: I0310 07:56:05.184689 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/125ee403-a265-4083-a6ab-c02005988548-kube-api-access-dktb6" (OuterVolumeSpecName: "kube-api-access-dktb6") pod "125ee403-a265-4083-a6ab-c02005988548" (UID: "125ee403-a265-4083-a6ab-c02005988548"). InnerVolumeSpecName "kube-api-access-dktb6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:56:05 crc kubenswrapper[4825]: I0310 07:56:05.279464 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dktb6\" (UniqueName: \"kubernetes.io/projected/125ee403-a265-4083-a6ab-c02005988548-kube-api-access-dktb6\") on node \"crc\" DevicePath \"\"" Mar 10 07:56:05 crc kubenswrapper[4825]: I0310 07:56:05.708719 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552156-p624h" event={"ID":"125ee403-a265-4083-a6ab-c02005988548","Type":"ContainerDied","Data":"346fc129e5f700c809513dc79b2048baf98397821985632fc69ab4cd42b0e5a8"} Mar 10 07:56:05 crc kubenswrapper[4825]: I0310 07:56:05.708778 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="346fc129e5f700c809513dc79b2048baf98397821985632fc69ab4cd42b0e5a8" Mar 10 07:56:05 crc kubenswrapper[4825]: I0310 07:56:05.708859 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552156-p624h" Mar 10 07:56:05 crc kubenswrapper[4825]: I0310 07:56:05.784792 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552150-cmlkq"] Mar 10 07:56:05 crc kubenswrapper[4825]: I0310 07:56:05.789931 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552150-cmlkq"] Mar 10 07:56:07 crc kubenswrapper[4825]: I0310 07:56:07.260567 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21d7cffa-33ec-4b8c-9704-cfcfc67faad1" path="/var/lib/kubelet/pods/21d7cffa-33ec-4b8c-9704-cfcfc67faad1/volumes" Mar 10 07:56:39 crc kubenswrapper[4825]: I0310 07:56:39.892541 4825 scope.go:117] "RemoveContainer" containerID="3e8dcd335b135cc458b631c18fca7d607496017c1de585d27fcacce776e817b3" Mar 10 07:57:16 crc kubenswrapper[4825]: I0310 07:57:16.888838 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 07:57:16 crc kubenswrapper[4825]: I0310 07:57:16.889411 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 07:57:46 crc kubenswrapper[4825]: I0310 07:57:46.888390 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 07:57:46 crc kubenswrapper[4825]: I0310 07:57:46.889010 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 07:58:00 crc kubenswrapper[4825]: I0310 07:58:00.143313 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552158-lkgxj"] Mar 10 07:58:00 crc kubenswrapper[4825]: E0310 07:58:00.144217 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="125ee403-a265-4083-a6ab-c02005988548" containerName="oc" Mar 10 07:58:00 crc kubenswrapper[4825]: I0310 07:58:00.144232 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="125ee403-a265-4083-a6ab-c02005988548" containerName="oc" Mar 10 07:58:00 crc kubenswrapper[4825]: I0310 07:58:00.144502 4825 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="125ee403-a265-4083-a6ab-c02005988548" containerName="oc" Mar 10 07:58:00 crc kubenswrapper[4825]: I0310 07:58:00.146053 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552158-lkgxj" Mar 10 07:58:00 crc kubenswrapper[4825]: I0310 07:58:00.147743 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 07:58:00 crc kubenswrapper[4825]: I0310 07:58:00.149526 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 07:58:00 crc kubenswrapper[4825]: I0310 07:58:00.149737 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 07:58:00 crc kubenswrapper[4825]: I0310 07:58:00.154324 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552158-lkgxj"] Mar 10 07:58:00 crc kubenswrapper[4825]: I0310 07:58:00.318902 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbttz\" (UniqueName: \"kubernetes.io/projected/4fa50eb5-9f97-4136-9803-59496aa2698f-kube-api-access-vbttz\") pod \"auto-csr-approver-29552158-lkgxj\" (UID: \"4fa50eb5-9f97-4136-9803-59496aa2698f\") " pod="openshift-infra/auto-csr-approver-29552158-lkgxj" Mar 10 07:58:00 crc kubenswrapper[4825]: I0310 07:58:00.420729 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbttz\" (UniqueName: \"kubernetes.io/projected/4fa50eb5-9f97-4136-9803-59496aa2698f-kube-api-access-vbttz\") pod \"auto-csr-approver-29552158-lkgxj\" (UID: \"4fa50eb5-9f97-4136-9803-59496aa2698f\") " pod="openshift-infra/auto-csr-approver-29552158-lkgxj" Mar 10 07:58:00 crc kubenswrapper[4825]: I0310 07:58:00.447354 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vbttz\" (UniqueName: \"kubernetes.io/projected/4fa50eb5-9f97-4136-9803-59496aa2698f-kube-api-access-vbttz\") pod \"auto-csr-approver-29552158-lkgxj\" (UID: \"4fa50eb5-9f97-4136-9803-59496aa2698f\") " pod="openshift-infra/auto-csr-approver-29552158-lkgxj" Mar 10 07:58:00 crc kubenswrapper[4825]: I0310 07:58:00.485542 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552158-lkgxj" Mar 10 07:58:00 crc kubenswrapper[4825]: I0310 07:58:00.956332 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552158-lkgxj"] Mar 10 07:58:00 crc kubenswrapper[4825]: I0310 07:58:00.961228 4825 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 07:58:01 crc kubenswrapper[4825]: I0310 07:58:01.413013 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rmb4n"] Mar 10 07:58:01 crc kubenswrapper[4825]: I0310 07:58:01.416017 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rmb4n" Mar 10 07:58:01 crc kubenswrapper[4825]: I0310 07:58:01.422839 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rmb4n"] Mar 10 07:58:01 crc kubenswrapper[4825]: I0310 07:58:01.456526 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a8dd78-756c-4fa2-814f-6508e848e092-utilities\") pod \"redhat-marketplace-rmb4n\" (UID: \"57a8dd78-756c-4fa2-814f-6508e848e092\") " pod="openshift-marketplace/redhat-marketplace-rmb4n" Mar 10 07:58:01 crc kubenswrapper[4825]: I0310 07:58:01.456605 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhmrd\" (UniqueName: \"kubernetes.io/projected/57a8dd78-756c-4fa2-814f-6508e848e092-kube-api-access-bhmrd\") pod \"redhat-marketplace-rmb4n\" (UID: \"57a8dd78-756c-4fa2-814f-6508e848e092\") " pod="openshift-marketplace/redhat-marketplace-rmb4n" Mar 10 07:58:01 crc kubenswrapper[4825]: I0310 07:58:01.456639 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a8dd78-756c-4fa2-814f-6508e848e092-catalog-content\") pod \"redhat-marketplace-rmb4n\" (UID: \"57a8dd78-756c-4fa2-814f-6508e848e092\") " pod="openshift-marketplace/redhat-marketplace-rmb4n" Mar 10 07:58:01 crc kubenswrapper[4825]: I0310 07:58:01.557453 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a8dd78-756c-4fa2-814f-6508e848e092-catalog-content\") pod \"redhat-marketplace-rmb4n\" (UID: \"57a8dd78-756c-4fa2-814f-6508e848e092\") " pod="openshift-marketplace/redhat-marketplace-rmb4n" Mar 10 07:58:01 crc kubenswrapper[4825]: I0310 07:58:01.557581 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a8dd78-756c-4fa2-814f-6508e848e092-utilities\") pod \"redhat-marketplace-rmb4n\" (UID: \"57a8dd78-756c-4fa2-814f-6508e848e092\") " pod="openshift-marketplace/redhat-marketplace-rmb4n" Mar 10 07:58:01 crc kubenswrapper[4825]: I0310 07:58:01.557636 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhmrd\" (UniqueName: \"kubernetes.io/projected/57a8dd78-756c-4fa2-814f-6508e848e092-kube-api-access-bhmrd\") pod \"redhat-marketplace-rmb4n\" (UID: \"57a8dd78-756c-4fa2-814f-6508e848e092\") " pod="openshift-marketplace/redhat-marketplace-rmb4n" Mar 10 07:58:01 crc kubenswrapper[4825]: I0310 07:58:01.557917 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a8dd78-756c-4fa2-814f-6508e848e092-catalog-content\") pod \"redhat-marketplace-rmb4n\" (UID: \"57a8dd78-756c-4fa2-814f-6508e848e092\") " pod="openshift-marketplace/redhat-marketplace-rmb4n" Mar 10 07:58:01 crc kubenswrapper[4825]: I0310 07:58:01.558005 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a8dd78-756c-4fa2-814f-6508e848e092-utilities\") pod \"redhat-marketplace-rmb4n\" (UID: \"57a8dd78-756c-4fa2-814f-6508e848e092\") " pod="openshift-marketplace/redhat-marketplace-rmb4n" Mar 10 07:58:01 crc kubenswrapper[4825]: I0310 07:58:01.576510 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhmrd\" (UniqueName: \"kubernetes.io/projected/57a8dd78-756c-4fa2-814f-6508e848e092-kube-api-access-bhmrd\") pod \"redhat-marketplace-rmb4n\" (UID: \"57a8dd78-756c-4fa2-814f-6508e848e092\") " pod="openshift-marketplace/redhat-marketplace-rmb4n" Mar 10 07:58:01 crc kubenswrapper[4825]: I0310 07:58:01.735439 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rmb4n" Mar 10 07:58:01 crc kubenswrapper[4825]: I0310 07:58:01.745611 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552158-lkgxj" event={"ID":"4fa50eb5-9f97-4136-9803-59496aa2698f","Type":"ContainerStarted","Data":"048acd229d8e0fe19df79ef8be590782029215830cd50c4bd904a9e25cd107ea"} Mar 10 07:58:02 crc kubenswrapper[4825]: I0310 07:58:02.162964 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rmb4n"] Mar 10 07:58:02 crc kubenswrapper[4825]: W0310 07:58:02.165830 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57a8dd78_756c_4fa2_814f_6508e848e092.slice/crio-7c973049c2c743a330b9ca38ec703d936bda9167d9d8105a9d131b689f059782 WatchSource:0}: Error finding container 7c973049c2c743a330b9ca38ec703d936bda9167d9d8105a9d131b689f059782: Status 404 returned error can't find the container with id 7c973049c2c743a330b9ca38ec703d936bda9167d9d8105a9d131b689f059782 Mar 10 07:58:02 crc kubenswrapper[4825]: I0310 07:58:02.758908 4825 generic.go:334] "Generic (PLEG): container finished" podID="4fa50eb5-9f97-4136-9803-59496aa2698f" containerID="d71a886e8be783f1a2c88c9aa8813de1e5e30017d829cfc788f8fd2ab4f3a798" exitCode=0 Mar 10 07:58:02 crc kubenswrapper[4825]: I0310 07:58:02.758969 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552158-lkgxj" event={"ID":"4fa50eb5-9f97-4136-9803-59496aa2698f","Type":"ContainerDied","Data":"d71a886e8be783f1a2c88c9aa8813de1e5e30017d829cfc788f8fd2ab4f3a798"} Mar 10 07:58:02 crc kubenswrapper[4825]: I0310 07:58:02.761301 4825 generic.go:334] "Generic (PLEG): container finished" podID="57a8dd78-756c-4fa2-814f-6508e848e092" containerID="21aa91e8acd39594b3af584699372380936c95967948e07f649449d62906420f" exitCode=0 Mar 10 07:58:02 crc kubenswrapper[4825]: I0310 
07:58:02.761364 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rmb4n" event={"ID":"57a8dd78-756c-4fa2-814f-6508e848e092","Type":"ContainerDied","Data":"21aa91e8acd39594b3af584699372380936c95967948e07f649449d62906420f"} Mar 10 07:58:02 crc kubenswrapper[4825]: I0310 07:58:02.761446 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rmb4n" event={"ID":"57a8dd78-756c-4fa2-814f-6508e848e092","Type":"ContainerStarted","Data":"7c973049c2c743a330b9ca38ec703d936bda9167d9d8105a9d131b689f059782"} Mar 10 07:58:03 crc kubenswrapper[4825]: I0310 07:58:03.772605 4825 generic.go:334] "Generic (PLEG): container finished" podID="57a8dd78-756c-4fa2-814f-6508e848e092" containerID="dee621b5947e1da6391ca7caa68f268a04a36c01dc4d10230e7fdb8be96345a4" exitCode=0 Mar 10 07:58:03 crc kubenswrapper[4825]: I0310 07:58:03.772704 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rmb4n" event={"ID":"57a8dd78-756c-4fa2-814f-6508e848e092","Type":"ContainerDied","Data":"dee621b5947e1da6391ca7caa68f268a04a36c01dc4d10230e7fdb8be96345a4"} Mar 10 07:58:04 crc kubenswrapper[4825]: I0310 07:58:04.059330 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552158-lkgxj" Mar 10 07:58:04 crc kubenswrapper[4825]: I0310 07:58:04.091553 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbttz\" (UniqueName: \"kubernetes.io/projected/4fa50eb5-9f97-4136-9803-59496aa2698f-kube-api-access-vbttz\") pod \"4fa50eb5-9f97-4136-9803-59496aa2698f\" (UID: \"4fa50eb5-9f97-4136-9803-59496aa2698f\") " Mar 10 07:58:04 crc kubenswrapper[4825]: I0310 07:58:04.100981 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fa50eb5-9f97-4136-9803-59496aa2698f-kube-api-access-vbttz" (OuterVolumeSpecName: "kube-api-access-vbttz") pod "4fa50eb5-9f97-4136-9803-59496aa2698f" (UID: "4fa50eb5-9f97-4136-9803-59496aa2698f"). InnerVolumeSpecName "kube-api-access-vbttz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:58:04 crc kubenswrapper[4825]: I0310 07:58:04.192908 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbttz\" (UniqueName: \"kubernetes.io/projected/4fa50eb5-9f97-4136-9803-59496aa2698f-kube-api-access-vbttz\") on node \"crc\" DevicePath \"\"" Mar 10 07:58:04 crc kubenswrapper[4825]: I0310 07:58:04.784526 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552158-lkgxj" event={"ID":"4fa50eb5-9f97-4136-9803-59496aa2698f","Type":"ContainerDied","Data":"048acd229d8e0fe19df79ef8be590782029215830cd50c4bd904a9e25cd107ea"} Mar 10 07:58:04 crc kubenswrapper[4825]: I0310 07:58:04.784911 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="048acd229d8e0fe19df79ef8be590782029215830cd50c4bd904a9e25cd107ea" Mar 10 07:58:04 crc kubenswrapper[4825]: I0310 07:58:04.784553 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552158-lkgxj" Mar 10 07:58:04 crc kubenswrapper[4825]: I0310 07:58:04.787351 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rmb4n" event={"ID":"57a8dd78-756c-4fa2-814f-6508e848e092","Type":"ContainerStarted","Data":"def6ab7a93fe234afde5d6ee2da41c915371f2aa8acdcd8e7242f515a52c463d"} Mar 10 07:58:04 crc kubenswrapper[4825]: I0310 07:58:04.828075 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rmb4n" podStartSLOduration=2.074222211 podStartE2EDuration="3.828046821s" podCreationTimestamp="2026-03-10 07:58:01 +0000 UTC" firstStartedPulling="2026-03-10 07:58:02.762921439 +0000 UTC m=+4435.792702064" lastFinishedPulling="2026-03-10 07:58:04.516746049 +0000 UTC m=+4437.546526674" observedRunningTime="2026-03-10 07:58:04.806895697 +0000 UTC m=+4437.836676322" watchObservedRunningTime="2026-03-10 07:58:04.828046821 +0000 UTC m=+4437.857827446" Mar 10 07:58:05 crc kubenswrapper[4825]: I0310 07:58:05.131305 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552152-zfqml"] Mar 10 07:58:05 crc kubenswrapper[4825]: I0310 07:58:05.139317 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552152-zfqml"] Mar 10 07:58:05 crc kubenswrapper[4825]: I0310 07:58:05.244854 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1ae1165-a01f-4785-af2c-ae400f9d8617" path="/var/lib/kubelet/pods/c1ae1165-a01f-4785-af2c-ae400f9d8617/volumes" Mar 10 07:58:11 crc kubenswrapper[4825]: I0310 07:58:11.736114 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rmb4n" Mar 10 07:58:11 crc kubenswrapper[4825]: I0310 07:58:11.736611 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-rmb4n" Mar 10 07:58:11 crc kubenswrapper[4825]: I0310 07:58:11.793057 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rmb4n" Mar 10 07:58:11 crc kubenswrapper[4825]: I0310 07:58:11.893566 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rmb4n" Mar 10 07:58:12 crc kubenswrapper[4825]: I0310 07:58:12.028213 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rmb4n"] Mar 10 07:58:13 crc kubenswrapper[4825]: I0310 07:58:13.867361 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rmb4n" podUID="57a8dd78-756c-4fa2-814f-6508e848e092" containerName="registry-server" containerID="cri-o://def6ab7a93fe234afde5d6ee2da41c915371f2aa8acdcd8e7242f515a52c463d" gracePeriod=2 Mar 10 07:58:14 crc kubenswrapper[4825]: I0310 07:58:14.802984 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rmb4n" Mar 10 07:58:14 crc kubenswrapper[4825]: I0310 07:58:14.877435 4825 generic.go:334] "Generic (PLEG): container finished" podID="57a8dd78-756c-4fa2-814f-6508e848e092" containerID="def6ab7a93fe234afde5d6ee2da41c915371f2aa8acdcd8e7242f515a52c463d" exitCode=0 Mar 10 07:58:14 crc kubenswrapper[4825]: I0310 07:58:14.877486 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rmb4n" event={"ID":"57a8dd78-756c-4fa2-814f-6508e848e092","Type":"ContainerDied","Data":"def6ab7a93fe234afde5d6ee2da41c915371f2aa8acdcd8e7242f515a52c463d"} Mar 10 07:58:14 crc kubenswrapper[4825]: I0310 07:58:14.877512 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rmb4n" Mar 10 07:58:14 crc kubenswrapper[4825]: I0310 07:58:14.877546 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rmb4n" event={"ID":"57a8dd78-756c-4fa2-814f-6508e848e092","Type":"ContainerDied","Data":"7c973049c2c743a330b9ca38ec703d936bda9167d9d8105a9d131b689f059782"} Mar 10 07:58:14 crc kubenswrapper[4825]: I0310 07:58:14.877579 4825 scope.go:117] "RemoveContainer" containerID="def6ab7a93fe234afde5d6ee2da41c915371f2aa8acdcd8e7242f515a52c463d" Mar 10 07:58:14 crc kubenswrapper[4825]: I0310 07:58:14.902336 4825 scope.go:117] "RemoveContainer" containerID="dee621b5947e1da6391ca7caa68f268a04a36c01dc4d10230e7fdb8be96345a4" Mar 10 07:58:14 crc kubenswrapper[4825]: I0310 07:58:14.926915 4825 scope.go:117] "RemoveContainer" containerID="21aa91e8acd39594b3af584699372380936c95967948e07f649449d62906420f" Mar 10 07:58:14 crc kubenswrapper[4825]: I0310 07:58:14.961275 4825 scope.go:117] "RemoveContainer" containerID="def6ab7a93fe234afde5d6ee2da41c915371f2aa8acdcd8e7242f515a52c463d" Mar 10 07:58:14 crc kubenswrapper[4825]: E0310 07:58:14.961825 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"def6ab7a93fe234afde5d6ee2da41c915371f2aa8acdcd8e7242f515a52c463d\": container with ID starting with def6ab7a93fe234afde5d6ee2da41c915371f2aa8acdcd8e7242f515a52c463d not found: ID does not exist" containerID="def6ab7a93fe234afde5d6ee2da41c915371f2aa8acdcd8e7242f515a52c463d" Mar 10 07:58:14 crc kubenswrapper[4825]: I0310 07:58:14.961953 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"def6ab7a93fe234afde5d6ee2da41c915371f2aa8acdcd8e7242f515a52c463d"} err="failed to get container status \"def6ab7a93fe234afde5d6ee2da41c915371f2aa8acdcd8e7242f515a52c463d\": rpc error: code = NotFound desc = could not find container 
\"def6ab7a93fe234afde5d6ee2da41c915371f2aa8acdcd8e7242f515a52c463d\": container with ID starting with def6ab7a93fe234afde5d6ee2da41c915371f2aa8acdcd8e7242f515a52c463d not found: ID does not exist" Mar 10 07:58:14 crc kubenswrapper[4825]: I0310 07:58:14.962080 4825 scope.go:117] "RemoveContainer" containerID="dee621b5947e1da6391ca7caa68f268a04a36c01dc4d10230e7fdb8be96345a4" Mar 10 07:58:14 crc kubenswrapper[4825]: E0310 07:58:14.962549 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dee621b5947e1da6391ca7caa68f268a04a36c01dc4d10230e7fdb8be96345a4\": container with ID starting with dee621b5947e1da6391ca7caa68f268a04a36c01dc4d10230e7fdb8be96345a4 not found: ID does not exist" containerID="dee621b5947e1da6391ca7caa68f268a04a36c01dc4d10230e7fdb8be96345a4" Mar 10 07:58:14 crc kubenswrapper[4825]: I0310 07:58:14.962580 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dee621b5947e1da6391ca7caa68f268a04a36c01dc4d10230e7fdb8be96345a4"} err="failed to get container status \"dee621b5947e1da6391ca7caa68f268a04a36c01dc4d10230e7fdb8be96345a4\": rpc error: code = NotFound desc = could not find container \"dee621b5947e1da6391ca7caa68f268a04a36c01dc4d10230e7fdb8be96345a4\": container with ID starting with dee621b5947e1da6391ca7caa68f268a04a36c01dc4d10230e7fdb8be96345a4 not found: ID does not exist" Mar 10 07:58:14 crc kubenswrapper[4825]: I0310 07:58:14.962602 4825 scope.go:117] "RemoveContainer" containerID="21aa91e8acd39594b3af584699372380936c95967948e07f649449d62906420f" Mar 10 07:58:14 crc kubenswrapper[4825]: E0310 07:58:14.962994 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21aa91e8acd39594b3af584699372380936c95967948e07f649449d62906420f\": container with ID starting with 21aa91e8acd39594b3af584699372380936c95967948e07f649449d62906420f not found: ID does not exist" 
containerID="21aa91e8acd39594b3af584699372380936c95967948e07f649449d62906420f" Mar 10 07:58:14 crc kubenswrapper[4825]: I0310 07:58:14.963029 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21aa91e8acd39594b3af584699372380936c95967948e07f649449d62906420f"} err="failed to get container status \"21aa91e8acd39594b3af584699372380936c95967948e07f649449d62906420f\": rpc error: code = NotFound desc = could not find container \"21aa91e8acd39594b3af584699372380936c95967948e07f649449d62906420f\": container with ID starting with 21aa91e8acd39594b3af584699372380936c95967948e07f649449d62906420f not found: ID does not exist" Mar 10 07:58:14 crc kubenswrapper[4825]: I0310 07:58:14.976565 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a8dd78-756c-4fa2-814f-6508e848e092-utilities\") pod \"57a8dd78-756c-4fa2-814f-6508e848e092\" (UID: \"57a8dd78-756c-4fa2-814f-6508e848e092\") " Mar 10 07:58:14 crc kubenswrapper[4825]: I0310 07:58:14.976950 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a8dd78-756c-4fa2-814f-6508e848e092-catalog-content\") pod \"57a8dd78-756c-4fa2-814f-6508e848e092\" (UID: \"57a8dd78-756c-4fa2-814f-6508e848e092\") " Mar 10 07:58:14 crc kubenswrapper[4825]: I0310 07:58:14.977292 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhmrd\" (UniqueName: \"kubernetes.io/projected/57a8dd78-756c-4fa2-814f-6508e848e092-kube-api-access-bhmrd\") pod \"57a8dd78-756c-4fa2-814f-6508e848e092\" (UID: \"57a8dd78-756c-4fa2-814f-6508e848e092\") " Mar 10 07:58:14 crc kubenswrapper[4825]: I0310 07:58:14.977835 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a8dd78-756c-4fa2-814f-6508e848e092-utilities" (OuterVolumeSpecName: "utilities") pod 
"57a8dd78-756c-4fa2-814f-6508e848e092" (UID: "57a8dd78-756c-4fa2-814f-6508e848e092"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:58:14 crc kubenswrapper[4825]: I0310 07:58:14.982771 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a8dd78-756c-4fa2-814f-6508e848e092-kube-api-access-bhmrd" (OuterVolumeSpecName: "kube-api-access-bhmrd") pod "57a8dd78-756c-4fa2-814f-6508e848e092" (UID: "57a8dd78-756c-4fa2-814f-6508e848e092"). InnerVolumeSpecName "kube-api-access-bhmrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 07:58:15 crc kubenswrapper[4825]: I0310 07:58:15.006285 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a8dd78-756c-4fa2-814f-6508e848e092-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a8dd78-756c-4fa2-814f-6508e848e092" (UID: "57a8dd78-756c-4fa2-814f-6508e848e092"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 07:58:15 crc kubenswrapper[4825]: I0310 07:58:15.078685 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a8dd78-756c-4fa2-814f-6508e848e092-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 07:58:15 crc kubenswrapper[4825]: I0310 07:58:15.078737 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhmrd\" (UniqueName: \"kubernetes.io/projected/57a8dd78-756c-4fa2-814f-6508e848e092-kube-api-access-bhmrd\") on node \"crc\" DevicePath \"\"" Mar 10 07:58:15 crc kubenswrapper[4825]: I0310 07:58:15.078750 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a8dd78-756c-4fa2-814f-6508e848e092-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 07:58:15 crc kubenswrapper[4825]: I0310 07:58:15.250180 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rmb4n"] Mar 10 07:58:15 crc kubenswrapper[4825]: I0310 07:58:15.252819 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rmb4n"] Mar 10 07:58:16 crc kubenswrapper[4825]: I0310 07:58:16.887991 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 07:58:16 crc kubenswrapper[4825]: I0310 07:58:16.888420 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 07:58:16 crc kubenswrapper[4825]: 
I0310 07:58:16.888486 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" Mar 10 07:58:16 crc kubenswrapper[4825]: I0310 07:58:16.889403 4825 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bf3d1cd0d377f9662a83f61c7f1b91e30e6257a568216e36432b13ae96d3370b"} pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 07:58:16 crc kubenswrapper[4825]: I0310 07:58:16.889491 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" containerID="cri-o://bf3d1cd0d377f9662a83f61c7f1b91e30e6257a568216e36432b13ae96d3370b" gracePeriod=600 Mar 10 07:58:17 crc kubenswrapper[4825]: I0310 07:58:17.246886 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a8dd78-756c-4fa2-814f-6508e848e092" path="/var/lib/kubelet/pods/57a8dd78-756c-4fa2-814f-6508e848e092/volumes" Mar 10 07:58:17 crc kubenswrapper[4825]: I0310 07:58:17.909281 4825 generic.go:334] "Generic (PLEG): container finished" podID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerID="bf3d1cd0d377f9662a83f61c7f1b91e30e6257a568216e36432b13ae96d3370b" exitCode=0 Mar 10 07:58:17 crc kubenswrapper[4825]: I0310 07:58:17.909319 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerDied","Data":"bf3d1cd0d377f9662a83f61c7f1b91e30e6257a568216e36432b13ae96d3370b"} Mar 10 07:58:17 crc kubenswrapper[4825]: I0310 07:58:17.909734 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerStarted","Data":"856a46518228770fd6354414263218ba580cad9f6bf09966c52a899f2397881c"} Mar 10 07:58:17 crc kubenswrapper[4825]: I0310 07:58:17.909772 4825 scope.go:117] "RemoveContainer" containerID="4441b6748db08977859f541e3c9aeb39a944a681030d88e33129a9cd77990e0e" Mar 10 07:58:39 crc kubenswrapper[4825]: I0310 07:58:39.989029 4825 scope.go:117] "RemoveContainer" containerID="6c749693e97ce99b7a0caaf4550921eee1d7b411ab4554d13a1968b7b75bd857" Mar 10 08:00:00 crc kubenswrapper[4825]: I0310 08:00:00.156070 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552160-8892c"] Mar 10 08:00:00 crc kubenswrapper[4825]: E0310 08:00:00.156941 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57a8dd78-756c-4fa2-814f-6508e848e092" containerName="extract-content" Mar 10 08:00:00 crc kubenswrapper[4825]: I0310 08:00:00.156959 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="57a8dd78-756c-4fa2-814f-6508e848e092" containerName="extract-content" Mar 10 08:00:00 crc kubenswrapper[4825]: E0310 08:00:00.156971 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fa50eb5-9f97-4136-9803-59496aa2698f" containerName="oc" Mar 10 08:00:00 crc kubenswrapper[4825]: I0310 08:00:00.156978 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fa50eb5-9f97-4136-9803-59496aa2698f" containerName="oc" Mar 10 08:00:00 crc kubenswrapper[4825]: E0310 08:00:00.156993 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57a8dd78-756c-4fa2-814f-6508e848e092" containerName="extract-utilities" Mar 10 08:00:00 crc kubenswrapper[4825]: I0310 08:00:00.157001 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="57a8dd78-756c-4fa2-814f-6508e848e092" containerName="extract-utilities" Mar 10 08:00:00 crc kubenswrapper[4825]: E0310 08:00:00.157009 4825 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57a8dd78-756c-4fa2-814f-6508e848e092" containerName="registry-server" Mar 10 08:00:00 crc kubenswrapper[4825]: I0310 08:00:00.157015 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="57a8dd78-756c-4fa2-814f-6508e848e092" containerName="registry-server" Mar 10 08:00:00 crc kubenswrapper[4825]: I0310 08:00:00.157228 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="57a8dd78-756c-4fa2-814f-6508e848e092" containerName="registry-server" Mar 10 08:00:00 crc kubenswrapper[4825]: I0310 08:00:00.157246 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fa50eb5-9f97-4136-9803-59496aa2698f" containerName="oc" Mar 10 08:00:00 crc kubenswrapper[4825]: I0310 08:00:00.157732 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552160-8892c" Mar 10 08:00:00 crc kubenswrapper[4825]: I0310 08:00:00.162820 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 08:00:00 crc kubenswrapper[4825]: I0310 08:00:00.163313 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 08:00:00 crc kubenswrapper[4825]: I0310 08:00:00.163345 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 08:00:00 crc kubenswrapper[4825]: I0310 08:00:00.166507 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552160-8892c"] Mar 10 08:00:00 crc kubenswrapper[4825]: I0310 08:00:00.173287 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552160-tc9rj"] Mar 10 08:00:00 crc kubenswrapper[4825]: I0310 08:00:00.174787 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552160-tc9rj" Mar 10 08:00:00 crc kubenswrapper[4825]: I0310 08:00:00.179151 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552160-tc9rj"] Mar 10 08:00:00 crc kubenswrapper[4825]: I0310 08:00:00.181585 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 08:00:00 crc kubenswrapper[4825]: I0310 08:00:00.181632 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 08:00:00 crc kubenswrapper[4825]: I0310 08:00:00.199259 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9xp8\" (UniqueName: \"kubernetes.io/projected/fa07e9b6-e969-4222-803d-69e1b3560b27-kube-api-access-n9xp8\") pod \"auto-csr-approver-29552160-8892c\" (UID: \"fa07e9b6-e969-4222-803d-69e1b3560b27\") " pod="openshift-infra/auto-csr-approver-29552160-8892c" Mar 10 08:00:00 crc kubenswrapper[4825]: I0310 08:00:00.199351 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/535e3e1e-6cd1-4558-8eb1-26d354d84aab-secret-volume\") pod \"collect-profiles-29552160-tc9rj\" (UID: \"535e3e1e-6cd1-4558-8eb1-26d354d84aab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552160-tc9rj" Mar 10 08:00:00 crc kubenswrapper[4825]: I0310 08:00:00.199412 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/535e3e1e-6cd1-4558-8eb1-26d354d84aab-config-volume\") pod \"collect-profiles-29552160-tc9rj\" (UID: \"535e3e1e-6cd1-4558-8eb1-26d354d84aab\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29552160-tc9rj" Mar 10 08:00:00 crc kubenswrapper[4825]: I0310 08:00:00.199438 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d8g9\" (UniqueName: \"kubernetes.io/projected/535e3e1e-6cd1-4558-8eb1-26d354d84aab-kube-api-access-6d8g9\") pod \"collect-profiles-29552160-tc9rj\" (UID: \"535e3e1e-6cd1-4558-8eb1-26d354d84aab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552160-tc9rj" Mar 10 08:00:00 crc kubenswrapper[4825]: I0310 08:00:00.300879 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/535e3e1e-6cd1-4558-8eb1-26d354d84aab-secret-volume\") pod \"collect-profiles-29552160-tc9rj\" (UID: \"535e3e1e-6cd1-4558-8eb1-26d354d84aab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552160-tc9rj" Mar 10 08:00:00 crc kubenswrapper[4825]: I0310 08:00:00.300959 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/535e3e1e-6cd1-4558-8eb1-26d354d84aab-config-volume\") pod \"collect-profiles-29552160-tc9rj\" (UID: \"535e3e1e-6cd1-4558-8eb1-26d354d84aab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552160-tc9rj" Mar 10 08:00:00 crc kubenswrapper[4825]: I0310 08:00:00.300989 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6d8g9\" (UniqueName: \"kubernetes.io/projected/535e3e1e-6cd1-4558-8eb1-26d354d84aab-kube-api-access-6d8g9\") pod \"collect-profiles-29552160-tc9rj\" (UID: \"535e3e1e-6cd1-4558-8eb1-26d354d84aab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552160-tc9rj" Mar 10 08:00:00 crc kubenswrapper[4825]: I0310 08:00:00.301035 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9xp8\" (UniqueName: 
\"kubernetes.io/projected/fa07e9b6-e969-4222-803d-69e1b3560b27-kube-api-access-n9xp8\") pod \"auto-csr-approver-29552160-8892c\" (UID: \"fa07e9b6-e969-4222-803d-69e1b3560b27\") " pod="openshift-infra/auto-csr-approver-29552160-8892c" Mar 10 08:00:00 crc kubenswrapper[4825]: I0310 08:00:00.302538 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/535e3e1e-6cd1-4558-8eb1-26d354d84aab-config-volume\") pod \"collect-profiles-29552160-tc9rj\" (UID: \"535e3e1e-6cd1-4558-8eb1-26d354d84aab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552160-tc9rj" Mar 10 08:00:00 crc kubenswrapper[4825]: I0310 08:00:00.307030 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/535e3e1e-6cd1-4558-8eb1-26d354d84aab-secret-volume\") pod \"collect-profiles-29552160-tc9rj\" (UID: \"535e3e1e-6cd1-4558-8eb1-26d354d84aab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552160-tc9rj" Mar 10 08:00:00 crc kubenswrapper[4825]: I0310 08:00:00.317204 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9xp8\" (UniqueName: \"kubernetes.io/projected/fa07e9b6-e969-4222-803d-69e1b3560b27-kube-api-access-n9xp8\") pod \"auto-csr-approver-29552160-8892c\" (UID: \"fa07e9b6-e969-4222-803d-69e1b3560b27\") " pod="openshift-infra/auto-csr-approver-29552160-8892c" Mar 10 08:00:00 crc kubenswrapper[4825]: I0310 08:00:00.319782 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d8g9\" (UniqueName: \"kubernetes.io/projected/535e3e1e-6cd1-4558-8eb1-26d354d84aab-kube-api-access-6d8g9\") pod \"collect-profiles-29552160-tc9rj\" (UID: \"535e3e1e-6cd1-4558-8eb1-26d354d84aab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552160-tc9rj" Mar 10 08:00:00 crc kubenswrapper[4825]: I0310 08:00:00.494978 4825 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552160-8892c" Mar 10 08:00:00 crc kubenswrapper[4825]: I0310 08:00:00.504646 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552160-tc9rj" Mar 10 08:00:00 crc kubenswrapper[4825]: I0310 08:00:00.752643 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552160-8892c"] Mar 10 08:00:00 crc kubenswrapper[4825]: I0310 08:00:00.795820 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552160-tc9rj"] Mar 10 08:00:00 crc kubenswrapper[4825]: W0310 08:00:00.800122 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod535e3e1e_6cd1_4558_8eb1_26d354d84aab.slice/crio-adf3caccc271c60e20ec06ad84e59cb34d73507e58f81d99ec703043fad1e773 WatchSource:0}: Error finding container adf3caccc271c60e20ec06ad84e59cb34d73507e58f81d99ec703043fad1e773: Status 404 returned error can't find the container with id adf3caccc271c60e20ec06ad84e59cb34d73507e58f81d99ec703043fad1e773 Mar 10 08:00:00 crc kubenswrapper[4825]: I0310 08:00:00.857385 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552160-tc9rj" event={"ID":"535e3e1e-6cd1-4558-8eb1-26d354d84aab","Type":"ContainerStarted","Data":"adf3caccc271c60e20ec06ad84e59cb34d73507e58f81d99ec703043fad1e773"} Mar 10 08:00:00 crc kubenswrapper[4825]: I0310 08:00:00.863435 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552160-8892c" event={"ID":"fa07e9b6-e969-4222-803d-69e1b3560b27","Type":"ContainerStarted","Data":"ee339d616fdbc660c35f357428ce1aeceb239cb91a74c79812bd0b3f13ff1e70"} Mar 10 08:00:01 crc kubenswrapper[4825]: I0310 08:00:01.871703 4825 generic.go:334] "Generic (PLEG): container finished" 
podID="535e3e1e-6cd1-4558-8eb1-26d354d84aab" containerID="f69710812475557cff5b28a89e252f8422664974d758dfa8bd70815e8bee731c" exitCode=0 Mar 10 08:00:01 crc kubenswrapper[4825]: I0310 08:00:01.871780 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552160-tc9rj" event={"ID":"535e3e1e-6cd1-4558-8eb1-26d354d84aab","Type":"ContainerDied","Data":"f69710812475557cff5b28a89e252f8422664974d758dfa8bd70815e8bee731c"} Mar 10 08:00:03 crc kubenswrapper[4825]: I0310 08:00:03.202031 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552160-tc9rj" Mar 10 08:00:03 crc kubenswrapper[4825]: I0310 08:00:03.242240 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6d8g9\" (UniqueName: \"kubernetes.io/projected/535e3e1e-6cd1-4558-8eb1-26d354d84aab-kube-api-access-6d8g9\") pod \"535e3e1e-6cd1-4558-8eb1-26d354d84aab\" (UID: \"535e3e1e-6cd1-4558-8eb1-26d354d84aab\") " Mar 10 08:00:03 crc kubenswrapper[4825]: I0310 08:00:03.242339 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/535e3e1e-6cd1-4558-8eb1-26d354d84aab-config-volume\") pod \"535e3e1e-6cd1-4558-8eb1-26d354d84aab\" (UID: \"535e3e1e-6cd1-4558-8eb1-26d354d84aab\") " Mar 10 08:00:03 crc kubenswrapper[4825]: I0310 08:00:03.242380 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/535e3e1e-6cd1-4558-8eb1-26d354d84aab-secret-volume\") pod \"535e3e1e-6cd1-4558-8eb1-26d354d84aab\" (UID: \"535e3e1e-6cd1-4558-8eb1-26d354d84aab\") " Mar 10 08:00:03 crc kubenswrapper[4825]: I0310 08:00:03.243947 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/535e3e1e-6cd1-4558-8eb1-26d354d84aab-config-volume" 
(OuterVolumeSpecName: "config-volume") pod "535e3e1e-6cd1-4558-8eb1-26d354d84aab" (UID: "535e3e1e-6cd1-4558-8eb1-26d354d84aab"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:00:03 crc kubenswrapper[4825]: I0310 08:00:03.247577 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/535e3e1e-6cd1-4558-8eb1-26d354d84aab-kube-api-access-6d8g9" (OuterVolumeSpecName: "kube-api-access-6d8g9") pod "535e3e1e-6cd1-4558-8eb1-26d354d84aab" (UID: "535e3e1e-6cd1-4558-8eb1-26d354d84aab"). InnerVolumeSpecName "kube-api-access-6d8g9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:00:03 crc kubenswrapper[4825]: I0310 08:00:03.248781 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/535e3e1e-6cd1-4558-8eb1-26d354d84aab-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "535e3e1e-6cd1-4558-8eb1-26d354d84aab" (UID: "535e3e1e-6cd1-4558-8eb1-26d354d84aab"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:00:03 crc kubenswrapper[4825]: I0310 08:00:03.344256 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6d8g9\" (UniqueName: \"kubernetes.io/projected/535e3e1e-6cd1-4558-8eb1-26d354d84aab-kube-api-access-6d8g9\") on node \"crc\" DevicePath \"\"" Mar 10 08:00:03 crc kubenswrapper[4825]: I0310 08:00:03.344289 4825 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/535e3e1e-6cd1-4558-8eb1-26d354d84aab-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 08:00:03 crc kubenswrapper[4825]: I0310 08:00:03.344297 4825 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/535e3e1e-6cd1-4558-8eb1-26d354d84aab-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 08:00:03 crc kubenswrapper[4825]: I0310 08:00:03.908491 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552160-tc9rj" event={"ID":"535e3e1e-6cd1-4558-8eb1-26d354d84aab","Type":"ContainerDied","Data":"adf3caccc271c60e20ec06ad84e59cb34d73507e58f81d99ec703043fad1e773"} Mar 10 08:00:03 crc kubenswrapper[4825]: I0310 08:00:03.908604 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552160-tc9rj" Mar 10 08:00:03 crc kubenswrapper[4825]: I0310 08:00:03.908625 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adf3caccc271c60e20ec06ad84e59cb34d73507e58f81d99ec703043fad1e773" Mar 10 08:00:04 crc kubenswrapper[4825]: I0310 08:00:04.287031 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552115-k2j8m"] Mar 10 08:00:04 crc kubenswrapper[4825]: I0310 08:00:04.296622 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552115-k2j8m"] Mar 10 08:00:04 crc kubenswrapper[4825]: I0310 08:00:04.917468 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552160-8892c" event={"ID":"fa07e9b6-e969-4222-803d-69e1b3560b27","Type":"ContainerStarted","Data":"03d2536a097bc83bfae019f5b730c18e02fa290fb58244fde82127ec8dc1f1be"} Mar 10 08:00:04 crc kubenswrapper[4825]: I0310 08:00:04.932199 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552160-8892c" podStartSLOduration=1.187666385 podStartE2EDuration="4.932172167s" podCreationTimestamp="2026-03-10 08:00:00 +0000 UTC" firstStartedPulling="2026-03-10 08:00:00.759923974 +0000 UTC m=+4553.789704589" lastFinishedPulling="2026-03-10 08:00:04.504429746 +0000 UTC m=+4557.534210371" observedRunningTime="2026-03-10 08:00:04.927498565 +0000 UTC m=+4557.957279240" watchObservedRunningTime="2026-03-10 08:00:04.932172167 +0000 UTC m=+4557.961952802" Mar 10 08:00:05 crc kubenswrapper[4825]: I0310 08:00:05.252240 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16589365-ed0f-41b3-af61-118fe9d14e14" path="/var/lib/kubelet/pods/16589365-ed0f-41b3-af61-118fe9d14e14/volumes" Mar 10 08:00:05 crc kubenswrapper[4825]: I0310 08:00:05.930042 4825 
generic.go:334] "Generic (PLEG): container finished" podID="fa07e9b6-e969-4222-803d-69e1b3560b27" containerID="03d2536a097bc83bfae019f5b730c18e02fa290fb58244fde82127ec8dc1f1be" exitCode=0 Mar 10 08:00:05 crc kubenswrapper[4825]: I0310 08:00:05.930086 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552160-8892c" event={"ID":"fa07e9b6-e969-4222-803d-69e1b3560b27","Type":"ContainerDied","Data":"03d2536a097bc83bfae019f5b730c18e02fa290fb58244fde82127ec8dc1f1be"} Mar 10 08:00:07 crc kubenswrapper[4825]: I0310 08:00:07.258110 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552160-8892c" Mar 10 08:00:07 crc kubenswrapper[4825]: I0310 08:00:07.305371 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9xp8\" (UniqueName: \"kubernetes.io/projected/fa07e9b6-e969-4222-803d-69e1b3560b27-kube-api-access-n9xp8\") pod \"fa07e9b6-e969-4222-803d-69e1b3560b27\" (UID: \"fa07e9b6-e969-4222-803d-69e1b3560b27\") " Mar 10 08:00:07 crc kubenswrapper[4825]: I0310 08:00:07.658262 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa07e9b6-e969-4222-803d-69e1b3560b27-kube-api-access-n9xp8" (OuterVolumeSpecName: "kube-api-access-n9xp8") pod "fa07e9b6-e969-4222-803d-69e1b3560b27" (UID: "fa07e9b6-e969-4222-803d-69e1b3560b27"). InnerVolumeSpecName "kube-api-access-n9xp8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:00:07 crc kubenswrapper[4825]: I0310 08:00:07.712115 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9xp8\" (UniqueName: \"kubernetes.io/projected/fa07e9b6-e969-4222-803d-69e1b3560b27-kube-api-access-n9xp8\") on node \"crc\" DevicePath \"\"" Mar 10 08:00:07 crc kubenswrapper[4825]: I0310 08:00:07.948222 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552160-8892c" event={"ID":"fa07e9b6-e969-4222-803d-69e1b3560b27","Type":"ContainerDied","Data":"ee339d616fdbc660c35f357428ce1aeceb239cb91a74c79812bd0b3f13ff1e70"} Mar 10 08:00:07 crc kubenswrapper[4825]: I0310 08:00:07.948327 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552160-8892c" Mar 10 08:00:07 crc kubenswrapper[4825]: I0310 08:00:07.948324 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee339d616fdbc660c35f357428ce1aeceb239cb91a74c79812bd0b3f13ff1e70" Mar 10 08:00:07 crc kubenswrapper[4825]: I0310 08:00:07.994536 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552154-h29cn"] Mar 10 08:00:08 crc kubenswrapper[4825]: I0310 08:00:08.004213 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552154-h29cn"] Mar 10 08:00:08 crc kubenswrapper[4825]: E0310 08:00:08.141785 4825 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa07e9b6_e969_4222_803d_69e1b3560b27.slice/crio-ee339d616fdbc660c35f357428ce1aeceb239cb91a74c79812bd0b3f13ff1e70\": RecentStats: unable to find data in memory cache]" Mar 10 08:00:09 crc kubenswrapper[4825]: I0310 08:00:09.245637 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="4324c2f1-09fe-481a-a5bb-141229d53f70" path="/var/lib/kubelet/pods/4324c2f1-09fe-481a-a5bb-141229d53f70/volumes" Mar 10 08:00:22 crc kubenswrapper[4825]: I0310 08:00:22.129226 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-nr28s"] Mar 10 08:00:22 crc kubenswrapper[4825]: I0310 08:00:22.142042 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-nr28s"] Mar 10 08:00:22 crc kubenswrapper[4825]: I0310 08:00:22.304833 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-mlldc"] Mar 10 08:00:22 crc kubenswrapper[4825]: E0310 08:00:22.305217 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="535e3e1e-6cd1-4558-8eb1-26d354d84aab" containerName="collect-profiles" Mar 10 08:00:22 crc kubenswrapper[4825]: I0310 08:00:22.305240 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="535e3e1e-6cd1-4558-8eb1-26d354d84aab" containerName="collect-profiles" Mar 10 08:00:22 crc kubenswrapper[4825]: E0310 08:00:22.305253 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa07e9b6-e969-4222-803d-69e1b3560b27" containerName="oc" Mar 10 08:00:22 crc kubenswrapper[4825]: I0310 08:00:22.305260 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa07e9b6-e969-4222-803d-69e1b3560b27" containerName="oc" Mar 10 08:00:22 crc kubenswrapper[4825]: I0310 08:00:22.305428 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="535e3e1e-6cd1-4558-8eb1-26d354d84aab" containerName="collect-profiles" Mar 10 08:00:22 crc kubenswrapper[4825]: I0310 08:00:22.305470 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa07e9b6-e969-4222-803d-69e1b3560b27" containerName="oc" Mar 10 08:00:22 crc kubenswrapper[4825]: I0310 08:00:22.305971 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-mlldc" Mar 10 08:00:22 crc kubenswrapper[4825]: I0310 08:00:22.310638 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 10 08:00:22 crc kubenswrapper[4825]: I0310 08:00:22.311056 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 10 08:00:22 crc kubenswrapper[4825]: I0310 08:00:22.311189 4825 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-8w9nk" Mar 10 08:00:22 crc kubenswrapper[4825]: I0310 08:00:22.311308 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 10 08:00:22 crc kubenswrapper[4825]: I0310 08:00:22.322147 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-mlldc"] Mar 10 08:00:22 crc kubenswrapper[4825]: I0310 08:00:22.468633 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f92ec4dd-be42-4629-add3-7e50ca6cb897-crc-storage\") pod \"crc-storage-crc-mlldc\" (UID: \"f92ec4dd-be42-4629-add3-7e50ca6cb897\") " pod="crc-storage/crc-storage-crc-mlldc" Mar 10 08:00:22 crc kubenswrapper[4825]: I0310 08:00:22.468693 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f92ec4dd-be42-4629-add3-7e50ca6cb897-node-mnt\") pod \"crc-storage-crc-mlldc\" (UID: \"f92ec4dd-be42-4629-add3-7e50ca6cb897\") " pod="crc-storage/crc-storage-crc-mlldc" Mar 10 08:00:22 crc kubenswrapper[4825]: I0310 08:00:22.468759 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cr8h\" (UniqueName: \"kubernetes.io/projected/f92ec4dd-be42-4629-add3-7e50ca6cb897-kube-api-access-5cr8h\") pod \"crc-storage-crc-mlldc\" (UID: 
\"f92ec4dd-be42-4629-add3-7e50ca6cb897\") " pod="crc-storage/crc-storage-crc-mlldc" Mar 10 08:00:22 crc kubenswrapper[4825]: I0310 08:00:22.571876 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f92ec4dd-be42-4629-add3-7e50ca6cb897-node-mnt\") pod \"crc-storage-crc-mlldc\" (UID: \"f92ec4dd-be42-4629-add3-7e50ca6cb897\") " pod="crc-storage/crc-storage-crc-mlldc" Mar 10 08:00:22 crc kubenswrapper[4825]: I0310 08:00:22.571955 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cr8h\" (UniqueName: \"kubernetes.io/projected/f92ec4dd-be42-4629-add3-7e50ca6cb897-kube-api-access-5cr8h\") pod \"crc-storage-crc-mlldc\" (UID: \"f92ec4dd-be42-4629-add3-7e50ca6cb897\") " pod="crc-storage/crc-storage-crc-mlldc" Mar 10 08:00:22 crc kubenswrapper[4825]: I0310 08:00:22.572041 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f92ec4dd-be42-4629-add3-7e50ca6cb897-crc-storage\") pod \"crc-storage-crc-mlldc\" (UID: \"f92ec4dd-be42-4629-add3-7e50ca6cb897\") " pod="crc-storage/crc-storage-crc-mlldc" Mar 10 08:00:22 crc kubenswrapper[4825]: I0310 08:00:22.572562 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f92ec4dd-be42-4629-add3-7e50ca6cb897-node-mnt\") pod \"crc-storage-crc-mlldc\" (UID: \"f92ec4dd-be42-4629-add3-7e50ca6cb897\") " pod="crc-storage/crc-storage-crc-mlldc" Mar 10 08:00:22 crc kubenswrapper[4825]: I0310 08:00:22.572719 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f92ec4dd-be42-4629-add3-7e50ca6cb897-crc-storage\") pod \"crc-storage-crc-mlldc\" (UID: \"f92ec4dd-be42-4629-add3-7e50ca6cb897\") " pod="crc-storage/crc-storage-crc-mlldc" Mar 10 08:00:22 crc kubenswrapper[4825]: I0310 08:00:22.603896 4825 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cr8h\" (UniqueName: \"kubernetes.io/projected/f92ec4dd-be42-4629-add3-7e50ca6cb897-kube-api-access-5cr8h\") pod \"crc-storage-crc-mlldc\" (UID: \"f92ec4dd-be42-4629-add3-7e50ca6cb897\") " pod="crc-storage/crc-storage-crc-mlldc" Mar 10 08:00:22 crc kubenswrapper[4825]: I0310 08:00:22.676397 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-mlldc" Mar 10 08:00:23 crc kubenswrapper[4825]: I0310 08:00:23.070672 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-mlldc"] Mar 10 08:00:23 crc kubenswrapper[4825]: I0310 08:00:23.254058 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ca2b7f9-6ee2-4a55-8029-f956f85c0466" path="/var/lib/kubelet/pods/0ca2b7f9-6ee2-4a55-8029-f956f85c0466/volumes" Mar 10 08:00:24 crc kubenswrapper[4825]: I0310 08:00:24.098822 4825 generic.go:334] "Generic (PLEG): container finished" podID="f92ec4dd-be42-4629-add3-7e50ca6cb897" containerID="c778d1b6fd25bd0d2c1fc75beee55baf9d66f6cf5319bd6ebc6298e0be7d36d2" exitCode=0 Mar 10 08:00:24 crc kubenswrapper[4825]: I0310 08:00:24.098890 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-mlldc" event={"ID":"f92ec4dd-be42-4629-add3-7e50ca6cb897","Type":"ContainerDied","Data":"c778d1b6fd25bd0d2c1fc75beee55baf9d66f6cf5319bd6ebc6298e0be7d36d2"} Mar 10 08:00:24 crc kubenswrapper[4825]: I0310 08:00:24.098928 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-mlldc" event={"ID":"f92ec4dd-be42-4629-add3-7e50ca6cb897","Type":"ContainerStarted","Data":"b6e97cea0679023eac5ebb9aeb43bcb2603e3d4a6bcf5ce641b8f4b28edc9d28"} Mar 10 08:00:25 crc kubenswrapper[4825]: I0310 08:00:25.420433 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-mlldc" Mar 10 08:00:25 crc kubenswrapper[4825]: I0310 08:00:25.514519 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f92ec4dd-be42-4629-add3-7e50ca6cb897-node-mnt\") pod \"f92ec4dd-be42-4629-add3-7e50ca6cb897\" (UID: \"f92ec4dd-be42-4629-add3-7e50ca6cb897\") " Mar 10 08:00:25 crc kubenswrapper[4825]: I0310 08:00:25.514665 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f92ec4dd-be42-4629-add3-7e50ca6cb897-crc-storage\") pod \"f92ec4dd-be42-4629-add3-7e50ca6cb897\" (UID: \"f92ec4dd-be42-4629-add3-7e50ca6cb897\") " Mar 10 08:00:25 crc kubenswrapper[4825]: I0310 08:00:25.514639 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f92ec4dd-be42-4629-add3-7e50ca6cb897-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "f92ec4dd-be42-4629-add3-7e50ca6cb897" (UID: "f92ec4dd-be42-4629-add3-7e50ca6cb897"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 08:00:25 crc kubenswrapper[4825]: I0310 08:00:25.514862 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cr8h\" (UniqueName: \"kubernetes.io/projected/f92ec4dd-be42-4629-add3-7e50ca6cb897-kube-api-access-5cr8h\") pod \"f92ec4dd-be42-4629-add3-7e50ca6cb897\" (UID: \"f92ec4dd-be42-4629-add3-7e50ca6cb897\") " Mar 10 08:00:25 crc kubenswrapper[4825]: I0310 08:00:25.515293 4825 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f92ec4dd-be42-4629-add3-7e50ca6cb897-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 10 08:00:25 crc kubenswrapper[4825]: I0310 08:00:25.520254 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f92ec4dd-be42-4629-add3-7e50ca6cb897-kube-api-access-5cr8h" (OuterVolumeSpecName: "kube-api-access-5cr8h") pod "f92ec4dd-be42-4629-add3-7e50ca6cb897" (UID: "f92ec4dd-be42-4629-add3-7e50ca6cb897"). InnerVolumeSpecName "kube-api-access-5cr8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:00:25 crc kubenswrapper[4825]: I0310 08:00:25.532180 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f92ec4dd-be42-4629-add3-7e50ca6cb897-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "f92ec4dd-be42-4629-add3-7e50ca6cb897" (UID: "f92ec4dd-be42-4629-add3-7e50ca6cb897"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:00:25 crc kubenswrapper[4825]: I0310 08:00:25.617813 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cr8h\" (UniqueName: \"kubernetes.io/projected/f92ec4dd-be42-4629-add3-7e50ca6cb897-kube-api-access-5cr8h\") on node \"crc\" DevicePath \"\"" Mar 10 08:00:25 crc kubenswrapper[4825]: I0310 08:00:25.618379 4825 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f92ec4dd-be42-4629-add3-7e50ca6cb897-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 10 08:00:26 crc kubenswrapper[4825]: I0310 08:00:26.132640 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-mlldc" event={"ID":"f92ec4dd-be42-4629-add3-7e50ca6cb897","Type":"ContainerDied","Data":"b6e97cea0679023eac5ebb9aeb43bcb2603e3d4a6bcf5ce641b8f4b28edc9d28"} Mar 10 08:00:26 crc kubenswrapper[4825]: I0310 08:00:26.132943 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6e97cea0679023eac5ebb9aeb43bcb2603e3d4a6bcf5ce641b8f4b28edc9d28" Mar 10 08:00:26 crc kubenswrapper[4825]: I0310 08:00:26.132763 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-mlldc" Mar 10 08:00:27 crc kubenswrapper[4825]: I0310 08:00:27.809865 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-mlldc"] Mar 10 08:00:27 crc kubenswrapper[4825]: I0310 08:00:27.820571 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-mlldc"] Mar 10 08:00:27 crc kubenswrapper[4825]: I0310 08:00:27.933697 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-tk8cq"] Mar 10 08:00:27 crc kubenswrapper[4825]: E0310 08:00:27.934077 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f92ec4dd-be42-4629-add3-7e50ca6cb897" containerName="storage" Mar 10 08:00:27 crc kubenswrapper[4825]: I0310 08:00:27.934096 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f92ec4dd-be42-4629-add3-7e50ca6cb897" containerName="storage" Mar 10 08:00:27 crc kubenswrapper[4825]: I0310 08:00:27.934240 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="f92ec4dd-be42-4629-add3-7e50ca6cb897" containerName="storage" Mar 10 08:00:27 crc kubenswrapper[4825]: I0310 08:00:27.934961 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-tk8cq" Mar 10 08:00:27 crc kubenswrapper[4825]: I0310 08:00:27.938552 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 10 08:00:27 crc kubenswrapper[4825]: I0310 08:00:27.938806 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 10 08:00:27 crc kubenswrapper[4825]: I0310 08:00:27.939315 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 10 08:00:27 crc kubenswrapper[4825]: I0310 08:00:27.939723 4825 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-8w9nk" Mar 10 08:00:27 crc kubenswrapper[4825]: I0310 08:00:27.940441 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-tk8cq"] Mar 10 08:00:28 crc kubenswrapper[4825]: I0310 08:00:28.054643 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/590d2904-0fbc-4719-bb66-5ef354e0186f-node-mnt\") pod \"crc-storage-crc-tk8cq\" (UID: \"590d2904-0fbc-4719-bb66-5ef354e0186f\") " pod="crc-storage/crc-storage-crc-tk8cq" Mar 10 08:00:28 crc kubenswrapper[4825]: I0310 08:00:28.054704 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6ql5\" (UniqueName: \"kubernetes.io/projected/590d2904-0fbc-4719-bb66-5ef354e0186f-kube-api-access-f6ql5\") pod \"crc-storage-crc-tk8cq\" (UID: \"590d2904-0fbc-4719-bb66-5ef354e0186f\") " pod="crc-storage/crc-storage-crc-tk8cq" Mar 10 08:00:28 crc kubenswrapper[4825]: I0310 08:00:28.054815 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/590d2904-0fbc-4719-bb66-5ef354e0186f-crc-storage\") pod \"crc-storage-crc-tk8cq\" (UID: 
\"590d2904-0fbc-4719-bb66-5ef354e0186f\") " pod="crc-storage/crc-storage-crc-tk8cq" Mar 10 08:00:28 crc kubenswrapper[4825]: I0310 08:00:28.156244 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/590d2904-0fbc-4719-bb66-5ef354e0186f-node-mnt\") pod \"crc-storage-crc-tk8cq\" (UID: \"590d2904-0fbc-4719-bb66-5ef354e0186f\") " pod="crc-storage/crc-storage-crc-tk8cq" Mar 10 08:00:28 crc kubenswrapper[4825]: I0310 08:00:28.156299 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6ql5\" (UniqueName: \"kubernetes.io/projected/590d2904-0fbc-4719-bb66-5ef354e0186f-kube-api-access-f6ql5\") pod \"crc-storage-crc-tk8cq\" (UID: \"590d2904-0fbc-4719-bb66-5ef354e0186f\") " pod="crc-storage/crc-storage-crc-tk8cq" Mar 10 08:00:28 crc kubenswrapper[4825]: I0310 08:00:28.156423 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/590d2904-0fbc-4719-bb66-5ef354e0186f-crc-storage\") pod \"crc-storage-crc-tk8cq\" (UID: \"590d2904-0fbc-4719-bb66-5ef354e0186f\") " pod="crc-storage/crc-storage-crc-tk8cq" Mar 10 08:00:28 crc kubenswrapper[4825]: I0310 08:00:28.156519 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/590d2904-0fbc-4719-bb66-5ef354e0186f-node-mnt\") pod \"crc-storage-crc-tk8cq\" (UID: \"590d2904-0fbc-4719-bb66-5ef354e0186f\") " pod="crc-storage/crc-storage-crc-tk8cq" Mar 10 08:00:28 crc kubenswrapper[4825]: I0310 08:00:28.157379 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/590d2904-0fbc-4719-bb66-5ef354e0186f-crc-storage\") pod \"crc-storage-crc-tk8cq\" (UID: \"590d2904-0fbc-4719-bb66-5ef354e0186f\") " pod="crc-storage/crc-storage-crc-tk8cq" Mar 10 08:00:28 crc kubenswrapper[4825]: I0310 08:00:28.178740 4825 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6ql5\" (UniqueName: \"kubernetes.io/projected/590d2904-0fbc-4719-bb66-5ef354e0186f-kube-api-access-f6ql5\") pod \"crc-storage-crc-tk8cq\" (UID: \"590d2904-0fbc-4719-bb66-5ef354e0186f\") " pod="crc-storage/crc-storage-crc-tk8cq" Mar 10 08:00:28 crc kubenswrapper[4825]: I0310 08:00:28.251208 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-tk8cq" Mar 10 08:00:28 crc kubenswrapper[4825]: I0310 08:00:28.687620 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-tk8cq"] Mar 10 08:00:28 crc kubenswrapper[4825]: W0310 08:00:28.690626 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod590d2904_0fbc_4719_bb66_5ef354e0186f.slice/crio-f4dfe4a76df0ef68472368ba6cf26727e8d293086fa7a72b4db9de4d94fe9659 WatchSource:0}: Error finding container f4dfe4a76df0ef68472368ba6cf26727e8d293086fa7a72b4db9de4d94fe9659: Status 404 returned error can't find the container with id f4dfe4a76df0ef68472368ba6cf26727e8d293086fa7a72b4db9de4d94fe9659 Mar 10 08:00:29 crc kubenswrapper[4825]: I0310 08:00:29.160722 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-tk8cq" event={"ID":"590d2904-0fbc-4719-bb66-5ef354e0186f","Type":"ContainerStarted","Data":"f4dfe4a76df0ef68472368ba6cf26727e8d293086fa7a72b4db9de4d94fe9659"} Mar 10 08:00:29 crc kubenswrapper[4825]: I0310 08:00:29.253591 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f92ec4dd-be42-4629-add3-7e50ca6cb897" path="/var/lib/kubelet/pods/f92ec4dd-be42-4629-add3-7e50ca6cb897/volumes" Mar 10 08:00:30 crc kubenswrapper[4825]: I0310 08:00:30.172255 4825 generic.go:334] "Generic (PLEG): container finished" podID="590d2904-0fbc-4719-bb66-5ef354e0186f" containerID="19a1c6a1197900fcb99d1c11eea42be1021b1cbf07ab327d1959025284325412" 
exitCode=0 Mar 10 08:00:30 crc kubenswrapper[4825]: I0310 08:00:30.172313 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-tk8cq" event={"ID":"590d2904-0fbc-4719-bb66-5ef354e0186f","Type":"ContainerDied","Data":"19a1c6a1197900fcb99d1c11eea42be1021b1cbf07ab327d1959025284325412"} Mar 10 08:00:31 crc kubenswrapper[4825]: I0310 08:00:31.466150 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-tk8cq" Mar 10 08:00:31 crc kubenswrapper[4825]: I0310 08:00:31.612475 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/590d2904-0fbc-4719-bb66-5ef354e0186f-node-mnt\") pod \"590d2904-0fbc-4719-bb66-5ef354e0186f\" (UID: \"590d2904-0fbc-4719-bb66-5ef354e0186f\") " Mar 10 08:00:31 crc kubenswrapper[4825]: I0310 08:00:31.612553 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/590d2904-0fbc-4719-bb66-5ef354e0186f-crc-storage\") pod \"590d2904-0fbc-4719-bb66-5ef354e0186f\" (UID: \"590d2904-0fbc-4719-bb66-5ef354e0186f\") " Mar 10 08:00:31 crc kubenswrapper[4825]: I0310 08:00:31.612573 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/590d2904-0fbc-4719-bb66-5ef354e0186f-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "590d2904-0fbc-4719-bb66-5ef354e0186f" (UID: "590d2904-0fbc-4719-bb66-5ef354e0186f"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 08:00:31 crc kubenswrapper[4825]: I0310 08:00:31.612686 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6ql5\" (UniqueName: \"kubernetes.io/projected/590d2904-0fbc-4719-bb66-5ef354e0186f-kube-api-access-f6ql5\") pod \"590d2904-0fbc-4719-bb66-5ef354e0186f\" (UID: \"590d2904-0fbc-4719-bb66-5ef354e0186f\") " Mar 10 08:00:31 crc kubenswrapper[4825]: I0310 08:00:31.613128 4825 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/590d2904-0fbc-4719-bb66-5ef354e0186f-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 10 08:00:31 crc kubenswrapper[4825]: I0310 08:00:31.621679 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/590d2904-0fbc-4719-bb66-5ef354e0186f-kube-api-access-f6ql5" (OuterVolumeSpecName: "kube-api-access-f6ql5") pod "590d2904-0fbc-4719-bb66-5ef354e0186f" (UID: "590d2904-0fbc-4719-bb66-5ef354e0186f"). InnerVolumeSpecName "kube-api-access-f6ql5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:00:31 crc kubenswrapper[4825]: I0310 08:00:31.654211 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/590d2904-0fbc-4719-bb66-5ef354e0186f-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "590d2904-0fbc-4719-bb66-5ef354e0186f" (UID: "590d2904-0fbc-4719-bb66-5ef354e0186f"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:00:31 crc kubenswrapper[4825]: I0310 08:00:31.713888 4825 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/590d2904-0fbc-4719-bb66-5ef354e0186f-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 10 08:00:31 crc kubenswrapper[4825]: I0310 08:00:31.713923 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6ql5\" (UniqueName: \"kubernetes.io/projected/590d2904-0fbc-4719-bb66-5ef354e0186f-kube-api-access-f6ql5\") on node \"crc\" DevicePath \"\"" Mar 10 08:00:32 crc kubenswrapper[4825]: I0310 08:00:32.190319 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-tk8cq" event={"ID":"590d2904-0fbc-4719-bb66-5ef354e0186f","Type":"ContainerDied","Data":"f4dfe4a76df0ef68472368ba6cf26727e8d293086fa7a72b4db9de4d94fe9659"} Mar 10 08:00:32 crc kubenswrapper[4825]: I0310 08:00:32.190361 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4dfe4a76df0ef68472368ba6cf26727e8d293086fa7a72b4db9de4d94fe9659" Mar 10 08:00:32 crc kubenswrapper[4825]: I0310 08:00:32.190389 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-tk8cq" Mar 10 08:00:40 crc kubenswrapper[4825]: I0310 08:00:40.100069 4825 scope.go:117] "RemoveContainer" containerID="5d25e5b92acd45205ccf731b1c11831d2292cc933aafc99dd2610f1bd2e87593" Mar 10 08:00:40 crc kubenswrapper[4825]: I0310 08:00:40.148096 4825 scope.go:117] "RemoveContainer" containerID="9ddc9b1808547bac708633152aac261201399154d44c29d69246c28751850144" Mar 10 08:00:40 crc kubenswrapper[4825]: I0310 08:00:40.170985 4825 scope.go:117] "RemoveContainer" containerID="988e5e73af76d370ca7ebba422484bb17fca7ca10664c878f367817b00f4355b" Mar 10 08:00:46 crc kubenswrapper[4825]: I0310 08:00:46.888749 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 08:00:46 crc kubenswrapper[4825]: I0310 08:00:46.889434 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 08:01:08 crc kubenswrapper[4825]: I0310 08:01:08.957003 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gs959"] Mar 10 08:01:08 crc kubenswrapper[4825]: E0310 08:01:08.957744 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="590d2904-0fbc-4719-bb66-5ef354e0186f" containerName="storage" Mar 10 08:01:08 crc kubenswrapper[4825]: I0310 08:01:08.957756 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="590d2904-0fbc-4719-bb66-5ef354e0186f" containerName="storage" Mar 10 08:01:08 crc kubenswrapper[4825]: I0310 08:01:08.957874 4825 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="590d2904-0fbc-4719-bb66-5ef354e0186f" containerName="storage" Mar 10 08:01:08 crc kubenswrapper[4825]: I0310 08:01:08.958819 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gs959" Mar 10 08:01:08 crc kubenswrapper[4825]: I0310 08:01:08.977909 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gs959"] Mar 10 08:01:09 crc kubenswrapper[4825]: I0310 08:01:09.116325 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3705c8e-0865-4b64-90e2-7a54edc2690f-catalog-content\") pod \"certified-operators-gs959\" (UID: \"f3705c8e-0865-4b64-90e2-7a54edc2690f\") " pod="openshift-marketplace/certified-operators-gs959" Mar 10 08:01:09 crc kubenswrapper[4825]: I0310 08:01:09.116390 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3705c8e-0865-4b64-90e2-7a54edc2690f-utilities\") pod \"certified-operators-gs959\" (UID: \"f3705c8e-0865-4b64-90e2-7a54edc2690f\") " pod="openshift-marketplace/certified-operators-gs959" Mar 10 08:01:09 crc kubenswrapper[4825]: I0310 08:01:09.116427 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgd2r\" (UniqueName: \"kubernetes.io/projected/f3705c8e-0865-4b64-90e2-7a54edc2690f-kube-api-access-kgd2r\") pod \"certified-operators-gs959\" (UID: \"f3705c8e-0865-4b64-90e2-7a54edc2690f\") " pod="openshift-marketplace/certified-operators-gs959" Mar 10 08:01:09 crc kubenswrapper[4825]: I0310 08:01:09.217219 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3705c8e-0865-4b64-90e2-7a54edc2690f-catalog-content\") pod 
\"certified-operators-gs959\" (UID: \"f3705c8e-0865-4b64-90e2-7a54edc2690f\") " pod="openshift-marketplace/certified-operators-gs959" Mar 10 08:01:09 crc kubenswrapper[4825]: I0310 08:01:09.217477 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3705c8e-0865-4b64-90e2-7a54edc2690f-utilities\") pod \"certified-operators-gs959\" (UID: \"f3705c8e-0865-4b64-90e2-7a54edc2690f\") " pod="openshift-marketplace/certified-operators-gs959" Mar 10 08:01:09 crc kubenswrapper[4825]: I0310 08:01:09.217587 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgd2r\" (UniqueName: \"kubernetes.io/projected/f3705c8e-0865-4b64-90e2-7a54edc2690f-kube-api-access-kgd2r\") pod \"certified-operators-gs959\" (UID: \"f3705c8e-0865-4b64-90e2-7a54edc2690f\") " pod="openshift-marketplace/certified-operators-gs959" Mar 10 08:01:09 crc kubenswrapper[4825]: I0310 08:01:09.217997 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3705c8e-0865-4b64-90e2-7a54edc2690f-utilities\") pod \"certified-operators-gs959\" (UID: \"f3705c8e-0865-4b64-90e2-7a54edc2690f\") " pod="openshift-marketplace/certified-operators-gs959" Mar 10 08:01:09 crc kubenswrapper[4825]: I0310 08:01:09.218149 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3705c8e-0865-4b64-90e2-7a54edc2690f-catalog-content\") pod \"certified-operators-gs959\" (UID: \"f3705c8e-0865-4b64-90e2-7a54edc2690f\") " pod="openshift-marketplace/certified-operators-gs959" Mar 10 08:01:09 crc kubenswrapper[4825]: I0310 08:01:09.244433 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgd2r\" (UniqueName: \"kubernetes.io/projected/f3705c8e-0865-4b64-90e2-7a54edc2690f-kube-api-access-kgd2r\") pod \"certified-operators-gs959\" (UID: 
\"f3705c8e-0865-4b64-90e2-7a54edc2690f\") " pod="openshift-marketplace/certified-operators-gs959" Mar 10 08:01:09 crc kubenswrapper[4825]: I0310 08:01:09.277887 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gs959" Mar 10 08:01:09 crc kubenswrapper[4825]: I0310 08:01:09.727793 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gs959"] Mar 10 08:01:10 crc kubenswrapper[4825]: I0310 08:01:10.488360 4825 generic.go:334] "Generic (PLEG): container finished" podID="f3705c8e-0865-4b64-90e2-7a54edc2690f" containerID="38d579176e63b0ff059a6e7eb6b8a488c6eecd6a522ba91007efd3eb39820fa4" exitCode=0 Mar 10 08:01:10 crc kubenswrapper[4825]: I0310 08:01:10.488483 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gs959" event={"ID":"f3705c8e-0865-4b64-90e2-7a54edc2690f","Type":"ContainerDied","Data":"38d579176e63b0ff059a6e7eb6b8a488c6eecd6a522ba91007efd3eb39820fa4"} Mar 10 08:01:10 crc kubenswrapper[4825]: I0310 08:01:10.488776 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gs959" event={"ID":"f3705c8e-0865-4b64-90e2-7a54edc2690f","Type":"ContainerStarted","Data":"81cd705c67f94da1ae423456f8071d672670ad38c032fb356fe94bb30ea651ac"} Mar 10 08:01:11 crc kubenswrapper[4825]: I0310 08:01:11.497359 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gs959" event={"ID":"f3705c8e-0865-4b64-90e2-7a54edc2690f","Type":"ContainerStarted","Data":"2e18a58d7fe51c9a2c86c4910cfddebafaa268c724a836136e0a5d746afe5548"} Mar 10 08:01:12 crc kubenswrapper[4825]: I0310 08:01:12.507391 4825 generic.go:334] "Generic (PLEG): container finished" podID="f3705c8e-0865-4b64-90e2-7a54edc2690f" containerID="2e18a58d7fe51c9a2c86c4910cfddebafaa268c724a836136e0a5d746afe5548" exitCode=0 Mar 10 08:01:12 crc kubenswrapper[4825]: I0310 
08:01:12.507434 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gs959" event={"ID":"f3705c8e-0865-4b64-90e2-7a54edc2690f","Type":"ContainerDied","Data":"2e18a58d7fe51c9a2c86c4910cfddebafaa268c724a836136e0a5d746afe5548"} Mar 10 08:01:13 crc kubenswrapper[4825]: I0310 08:01:13.516461 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gs959" event={"ID":"f3705c8e-0865-4b64-90e2-7a54edc2690f","Type":"ContainerStarted","Data":"c515df3928d52b817fb435bd5a6471e7dc0286447a5d4ca0b8ee5b56eb954005"} Mar 10 08:01:13 crc kubenswrapper[4825]: I0310 08:01:13.536802 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gs959" podStartSLOduration=3.142869355 podStartE2EDuration="5.536786129s" podCreationTimestamp="2026-03-10 08:01:08 +0000 UTC" firstStartedPulling="2026-03-10 08:01:10.490004931 +0000 UTC m=+4623.519785566" lastFinishedPulling="2026-03-10 08:01:12.883921725 +0000 UTC m=+4625.913702340" observedRunningTime="2026-03-10 08:01:13.532940418 +0000 UTC m=+4626.562721043" watchObservedRunningTime="2026-03-10 08:01:13.536786129 +0000 UTC m=+4626.566566744" Mar 10 08:01:16 crc kubenswrapper[4825]: I0310 08:01:16.888196 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 08:01:16 crc kubenswrapper[4825]: I0310 08:01:16.888549 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 08:01:19 crc 
kubenswrapper[4825]: I0310 08:01:19.278335 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gs959" Mar 10 08:01:19 crc kubenswrapper[4825]: I0310 08:01:19.278609 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gs959" Mar 10 08:01:19 crc kubenswrapper[4825]: I0310 08:01:19.605396 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gs959" Mar 10 08:01:19 crc kubenswrapper[4825]: I0310 08:01:19.642886 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gs959" Mar 10 08:01:19 crc kubenswrapper[4825]: I0310 08:01:19.845856 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gs959"] Mar 10 08:01:21 crc kubenswrapper[4825]: I0310 08:01:21.573809 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gs959" podUID="f3705c8e-0865-4b64-90e2-7a54edc2690f" containerName="registry-server" containerID="cri-o://c515df3928d52b817fb435bd5a6471e7dc0286447a5d4ca0b8ee5b56eb954005" gracePeriod=2 Mar 10 08:01:22 crc kubenswrapper[4825]: I0310 08:01:22.031608 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gs959" Mar 10 08:01:22 crc kubenswrapper[4825]: I0310 08:01:22.105119 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3705c8e-0865-4b64-90e2-7a54edc2690f-utilities\") pod \"f3705c8e-0865-4b64-90e2-7a54edc2690f\" (UID: \"f3705c8e-0865-4b64-90e2-7a54edc2690f\") " Mar 10 08:01:22 crc kubenswrapper[4825]: I0310 08:01:22.105286 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgd2r\" (UniqueName: \"kubernetes.io/projected/f3705c8e-0865-4b64-90e2-7a54edc2690f-kube-api-access-kgd2r\") pod \"f3705c8e-0865-4b64-90e2-7a54edc2690f\" (UID: \"f3705c8e-0865-4b64-90e2-7a54edc2690f\") " Mar 10 08:01:22 crc kubenswrapper[4825]: I0310 08:01:22.105369 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3705c8e-0865-4b64-90e2-7a54edc2690f-catalog-content\") pod \"f3705c8e-0865-4b64-90e2-7a54edc2690f\" (UID: \"f3705c8e-0865-4b64-90e2-7a54edc2690f\") " Mar 10 08:01:22 crc kubenswrapper[4825]: I0310 08:01:22.105952 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3705c8e-0865-4b64-90e2-7a54edc2690f-utilities" (OuterVolumeSpecName: "utilities") pod "f3705c8e-0865-4b64-90e2-7a54edc2690f" (UID: "f3705c8e-0865-4b64-90e2-7a54edc2690f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:01:22 crc kubenswrapper[4825]: I0310 08:01:22.110321 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3705c8e-0865-4b64-90e2-7a54edc2690f-kube-api-access-kgd2r" (OuterVolumeSpecName: "kube-api-access-kgd2r") pod "f3705c8e-0865-4b64-90e2-7a54edc2690f" (UID: "f3705c8e-0865-4b64-90e2-7a54edc2690f"). InnerVolumeSpecName "kube-api-access-kgd2r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:01:22 crc kubenswrapper[4825]: I0310 08:01:22.166330 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3705c8e-0865-4b64-90e2-7a54edc2690f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f3705c8e-0865-4b64-90e2-7a54edc2690f" (UID: "f3705c8e-0865-4b64-90e2-7a54edc2690f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:01:22 crc kubenswrapper[4825]: I0310 08:01:22.207199 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgd2r\" (UniqueName: \"kubernetes.io/projected/f3705c8e-0865-4b64-90e2-7a54edc2690f-kube-api-access-kgd2r\") on node \"crc\" DevicePath \"\"" Mar 10 08:01:22 crc kubenswrapper[4825]: I0310 08:01:22.207232 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3705c8e-0865-4b64-90e2-7a54edc2690f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 08:01:22 crc kubenswrapper[4825]: I0310 08:01:22.207240 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3705c8e-0865-4b64-90e2-7a54edc2690f-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 08:01:22 crc kubenswrapper[4825]: I0310 08:01:22.591504 4825 generic.go:334] "Generic (PLEG): container finished" podID="f3705c8e-0865-4b64-90e2-7a54edc2690f" containerID="c515df3928d52b817fb435bd5a6471e7dc0286447a5d4ca0b8ee5b56eb954005" exitCode=0 Mar 10 08:01:22 crc kubenswrapper[4825]: I0310 08:01:22.591579 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gs959" event={"ID":"f3705c8e-0865-4b64-90e2-7a54edc2690f","Type":"ContainerDied","Data":"c515df3928d52b817fb435bd5a6471e7dc0286447a5d4ca0b8ee5b56eb954005"} Mar 10 08:01:22 crc kubenswrapper[4825]: I0310 08:01:22.591617 4825 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gs959" Mar 10 08:01:22 crc kubenswrapper[4825]: I0310 08:01:22.591629 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gs959" event={"ID":"f3705c8e-0865-4b64-90e2-7a54edc2690f","Type":"ContainerDied","Data":"81cd705c67f94da1ae423456f8071d672670ad38c032fb356fe94bb30ea651ac"} Mar 10 08:01:22 crc kubenswrapper[4825]: I0310 08:01:22.591667 4825 scope.go:117] "RemoveContainer" containerID="c515df3928d52b817fb435bd5a6471e7dc0286447a5d4ca0b8ee5b56eb954005" Mar 10 08:01:22 crc kubenswrapper[4825]: I0310 08:01:22.628864 4825 scope.go:117] "RemoveContainer" containerID="2e18a58d7fe51c9a2c86c4910cfddebafaa268c724a836136e0a5d746afe5548" Mar 10 08:01:22 crc kubenswrapper[4825]: I0310 08:01:22.649857 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gs959"] Mar 10 08:01:22 crc kubenswrapper[4825]: I0310 08:01:22.666912 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gs959"] Mar 10 08:01:22 crc kubenswrapper[4825]: I0310 08:01:22.676560 4825 scope.go:117] "RemoveContainer" containerID="38d579176e63b0ff059a6e7eb6b8a488c6eecd6a522ba91007efd3eb39820fa4" Mar 10 08:01:22 crc kubenswrapper[4825]: I0310 08:01:22.701439 4825 scope.go:117] "RemoveContainer" containerID="c515df3928d52b817fb435bd5a6471e7dc0286447a5d4ca0b8ee5b56eb954005" Mar 10 08:01:22 crc kubenswrapper[4825]: E0310 08:01:22.701874 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c515df3928d52b817fb435bd5a6471e7dc0286447a5d4ca0b8ee5b56eb954005\": container with ID starting with c515df3928d52b817fb435bd5a6471e7dc0286447a5d4ca0b8ee5b56eb954005 not found: ID does not exist" containerID="c515df3928d52b817fb435bd5a6471e7dc0286447a5d4ca0b8ee5b56eb954005" Mar 10 08:01:22 crc kubenswrapper[4825]: I0310 08:01:22.702018 
4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c515df3928d52b817fb435bd5a6471e7dc0286447a5d4ca0b8ee5b56eb954005"} err="failed to get container status \"c515df3928d52b817fb435bd5a6471e7dc0286447a5d4ca0b8ee5b56eb954005\": rpc error: code = NotFound desc = could not find container \"c515df3928d52b817fb435bd5a6471e7dc0286447a5d4ca0b8ee5b56eb954005\": container with ID starting with c515df3928d52b817fb435bd5a6471e7dc0286447a5d4ca0b8ee5b56eb954005 not found: ID does not exist" Mar 10 08:01:22 crc kubenswrapper[4825]: I0310 08:01:22.702178 4825 scope.go:117] "RemoveContainer" containerID="2e18a58d7fe51c9a2c86c4910cfddebafaa268c724a836136e0a5d746afe5548" Mar 10 08:01:22 crc kubenswrapper[4825]: E0310 08:01:22.702690 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e18a58d7fe51c9a2c86c4910cfddebafaa268c724a836136e0a5d746afe5548\": container with ID starting with 2e18a58d7fe51c9a2c86c4910cfddebafaa268c724a836136e0a5d746afe5548 not found: ID does not exist" containerID="2e18a58d7fe51c9a2c86c4910cfddebafaa268c724a836136e0a5d746afe5548" Mar 10 08:01:22 crc kubenswrapper[4825]: I0310 08:01:22.702842 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e18a58d7fe51c9a2c86c4910cfddebafaa268c724a836136e0a5d746afe5548"} err="failed to get container status \"2e18a58d7fe51c9a2c86c4910cfddebafaa268c724a836136e0a5d746afe5548\": rpc error: code = NotFound desc = could not find container \"2e18a58d7fe51c9a2c86c4910cfddebafaa268c724a836136e0a5d746afe5548\": container with ID starting with 2e18a58d7fe51c9a2c86c4910cfddebafaa268c724a836136e0a5d746afe5548 not found: ID does not exist" Mar 10 08:01:22 crc kubenswrapper[4825]: I0310 08:01:22.702964 4825 scope.go:117] "RemoveContainer" containerID="38d579176e63b0ff059a6e7eb6b8a488c6eecd6a522ba91007efd3eb39820fa4" Mar 10 08:01:22 crc kubenswrapper[4825]: E0310 
08:01:22.703909 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38d579176e63b0ff059a6e7eb6b8a488c6eecd6a522ba91007efd3eb39820fa4\": container with ID starting with 38d579176e63b0ff059a6e7eb6b8a488c6eecd6a522ba91007efd3eb39820fa4 not found: ID does not exist" containerID="38d579176e63b0ff059a6e7eb6b8a488c6eecd6a522ba91007efd3eb39820fa4" Mar 10 08:01:22 crc kubenswrapper[4825]: I0310 08:01:22.704068 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38d579176e63b0ff059a6e7eb6b8a488c6eecd6a522ba91007efd3eb39820fa4"} err="failed to get container status \"38d579176e63b0ff059a6e7eb6b8a488c6eecd6a522ba91007efd3eb39820fa4\": rpc error: code = NotFound desc = could not find container \"38d579176e63b0ff059a6e7eb6b8a488c6eecd6a522ba91007efd3eb39820fa4\": container with ID starting with 38d579176e63b0ff059a6e7eb6b8a488c6eecd6a522ba91007efd3eb39820fa4 not found: ID does not exist" Mar 10 08:01:23 crc kubenswrapper[4825]: I0310 08:01:23.255530 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3705c8e-0865-4b64-90e2-7a54edc2690f" path="/var/lib/kubelet/pods/f3705c8e-0865-4b64-90e2-7a54edc2690f/volumes" Mar 10 08:01:46 crc kubenswrapper[4825]: I0310 08:01:46.888718 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 08:01:46 crc kubenswrapper[4825]: I0310 08:01:46.889314 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 10 08:01:46 crc kubenswrapper[4825]: I0310 08:01:46.889368 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" Mar 10 08:01:46 crc kubenswrapper[4825]: I0310 08:01:46.890081 4825 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"856a46518228770fd6354414263218ba580cad9f6bf09966c52a899f2397881c"} pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 08:01:46 crc kubenswrapper[4825]: I0310 08:01:46.890183 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" containerID="cri-o://856a46518228770fd6354414263218ba580cad9f6bf09966c52a899f2397881c" gracePeriod=600 Mar 10 08:01:47 crc kubenswrapper[4825]: E0310 08:01:47.015947 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:01:47 crc kubenswrapper[4825]: I0310 08:01:47.854755 4825 generic.go:334] "Generic (PLEG): container finished" podID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerID="856a46518228770fd6354414263218ba580cad9f6bf09966c52a899f2397881c" exitCode=0 Mar 10 08:01:47 crc kubenswrapper[4825]: I0310 08:01:47.854838 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" 
event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerDied","Data":"856a46518228770fd6354414263218ba580cad9f6bf09966c52a899f2397881c"} Mar 10 08:01:47 crc kubenswrapper[4825]: I0310 08:01:47.855093 4825 scope.go:117] "RemoveContainer" containerID="bf3d1cd0d377f9662a83f61c7f1b91e30e6257a568216e36432b13ae96d3370b" Mar 10 08:01:47 crc kubenswrapper[4825]: I0310 08:01:47.856775 4825 scope.go:117] "RemoveContainer" containerID="856a46518228770fd6354414263218ba580cad9f6bf09966c52a899f2397881c" Mar 10 08:01:47 crc kubenswrapper[4825]: E0310 08:01:47.857437 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:02:00 crc kubenswrapper[4825]: I0310 08:02:00.139578 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552162-bl6n6"] Mar 10 08:02:00 crc kubenswrapper[4825]: E0310 08:02:00.140106 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3705c8e-0865-4b64-90e2-7a54edc2690f" containerName="extract-utilities" Mar 10 08:02:00 crc kubenswrapper[4825]: I0310 08:02:00.140118 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3705c8e-0865-4b64-90e2-7a54edc2690f" containerName="extract-utilities" Mar 10 08:02:00 crc kubenswrapper[4825]: E0310 08:02:00.140149 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3705c8e-0865-4b64-90e2-7a54edc2690f" containerName="registry-server" Mar 10 08:02:00 crc kubenswrapper[4825]: I0310 08:02:00.140157 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3705c8e-0865-4b64-90e2-7a54edc2690f" containerName="registry-server" Mar 10 08:02:00 crc 
kubenswrapper[4825]: E0310 08:02:00.140173 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3705c8e-0865-4b64-90e2-7a54edc2690f" containerName="extract-content" Mar 10 08:02:00 crc kubenswrapper[4825]: I0310 08:02:00.140179 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3705c8e-0865-4b64-90e2-7a54edc2690f" containerName="extract-content" Mar 10 08:02:00 crc kubenswrapper[4825]: I0310 08:02:00.140381 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3705c8e-0865-4b64-90e2-7a54edc2690f" containerName="registry-server" Mar 10 08:02:00 crc kubenswrapper[4825]: I0310 08:02:00.140983 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552162-bl6n6" Mar 10 08:02:00 crc kubenswrapper[4825]: I0310 08:02:00.143031 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 08:02:00 crc kubenswrapper[4825]: I0310 08:02:00.143688 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 08:02:00 crc kubenswrapper[4825]: I0310 08:02:00.144923 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 08:02:00 crc kubenswrapper[4825]: I0310 08:02:00.149785 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552162-bl6n6"] Mar 10 08:02:00 crc kubenswrapper[4825]: I0310 08:02:00.236235 4825 scope.go:117] "RemoveContainer" containerID="856a46518228770fd6354414263218ba580cad9f6bf09966c52a899f2397881c" Mar 10 08:02:00 crc kubenswrapper[4825]: E0310 08:02:00.236457 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:02:00 crc kubenswrapper[4825]: I0310 08:02:00.300472 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fsbb\" (UniqueName: \"kubernetes.io/projected/24859db8-ff70-476f-9aab-62e44102d93f-kube-api-access-2fsbb\") pod \"auto-csr-approver-29552162-bl6n6\" (UID: \"24859db8-ff70-476f-9aab-62e44102d93f\") " pod="openshift-infra/auto-csr-approver-29552162-bl6n6" Mar 10 08:02:00 crc kubenswrapper[4825]: I0310 08:02:00.402504 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fsbb\" (UniqueName: \"kubernetes.io/projected/24859db8-ff70-476f-9aab-62e44102d93f-kube-api-access-2fsbb\") pod \"auto-csr-approver-29552162-bl6n6\" (UID: \"24859db8-ff70-476f-9aab-62e44102d93f\") " pod="openshift-infra/auto-csr-approver-29552162-bl6n6" Mar 10 08:02:00 crc kubenswrapper[4825]: I0310 08:02:00.426008 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fsbb\" (UniqueName: \"kubernetes.io/projected/24859db8-ff70-476f-9aab-62e44102d93f-kube-api-access-2fsbb\") pod \"auto-csr-approver-29552162-bl6n6\" (UID: \"24859db8-ff70-476f-9aab-62e44102d93f\") " pod="openshift-infra/auto-csr-approver-29552162-bl6n6" Mar 10 08:02:00 crc kubenswrapper[4825]: I0310 08:02:00.464216 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552162-bl6n6" Mar 10 08:02:00 crc kubenswrapper[4825]: I0310 08:02:00.902602 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552162-bl6n6"] Mar 10 08:02:00 crc kubenswrapper[4825]: I0310 08:02:00.967215 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552162-bl6n6" event={"ID":"24859db8-ff70-476f-9aab-62e44102d93f","Type":"ContainerStarted","Data":"fefbdfe6215510f66a9905dec985b2e835eb839e6dbb5ac022c69af2b9f6c841"} Mar 10 08:02:02 crc kubenswrapper[4825]: I0310 08:02:02.988575 4825 generic.go:334] "Generic (PLEG): container finished" podID="24859db8-ff70-476f-9aab-62e44102d93f" containerID="b12453ef51ed63f6d32b67d2e38327a80486d410ffa855f5f5abaaaa0fe051c4" exitCode=0 Mar 10 08:02:02 crc kubenswrapper[4825]: I0310 08:02:02.988865 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552162-bl6n6" event={"ID":"24859db8-ff70-476f-9aab-62e44102d93f","Type":"ContainerDied","Data":"b12453ef51ed63f6d32b67d2e38327a80486d410ffa855f5f5abaaaa0fe051c4"} Mar 10 08:02:04 crc kubenswrapper[4825]: I0310 08:02:04.309835 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552162-bl6n6" Mar 10 08:02:04 crc kubenswrapper[4825]: I0310 08:02:04.471899 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fsbb\" (UniqueName: \"kubernetes.io/projected/24859db8-ff70-476f-9aab-62e44102d93f-kube-api-access-2fsbb\") pod \"24859db8-ff70-476f-9aab-62e44102d93f\" (UID: \"24859db8-ff70-476f-9aab-62e44102d93f\") " Mar 10 08:02:04 crc kubenswrapper[4825]: I0310 08:02:04.476703 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24859db8-ff70-476f-9aab-62e44102d93f-kube-api-access-2fsbb" (OuterVolumeSpecName: "kube-api-access-2fsbb") pod "24859db8-ff70-476f-9aab-62e44102d93f" (UID: "24859db8-ff70-476f-9aab-62e44102d93f"). InnerVolumeSpecName "kube-api-access-2fsbb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:02:04 crc kubenswrapper[4825]: I0310 08:02:04.573971 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fsbb\" (UniqueName: \"kubernetes.io/projected/24859db8-ff70-476f-9aab-62e44102d93f-kube-api-access-2fsbb\") on node \"crc\" DevicePath \"\"" Mar 10 08:02:05 crc kubenswrapper[4825]: I0310 08:02:05.009014 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552162-bl6n6" event={"ID":"24859db8-ff70-476f-9aab-62e44102d93f","Type":"ContainerDied","Data":"fefbdfe6215510f66a9905dec985b2e835eb839e6dbb5ac022c69af2b9f6c841"} Mar 10 08:02:05 crc kubenswrapper[4825]: I0310 08:02:05.009064 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552162-bl6n6" Mar 10 08:02:05 crc kubenswrapper[4825]: I0310 08:02:05.009070 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fefbdfe6215510f66a9905dec985b2e835eb839e6dbb5ac022c69af2b9f6c841" Mar 10 08:02:05 crc kubenswrapper[4825]: I0310 08:02:05.376840 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552156-p624h"] Mar 10 08:02:05 crc kubenswrapper[4825]: I0310 08:02:05.384502 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552156-p624h"] Mar 10 08:02:07 crc kubenswrapper[4825]: I0310 08:02:07.250108 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="125ee403-a265-4083-a6ab-c02005988548" path="/var/lib/kubelet/pods/125ee403-a265-4083-a6ab-c02005988548/volumes" Mar 10 08:02:14 crc kubenswrapper[4825]: I0310 08:02:14.236961 4825 scope.go:117] "RemoveContainer" containerID="856a46518228770fd6354414263218ba580cad9f6bf09966c52a899f2397881c" Mar 10 08:02:14 crc kubenswrapper[4825]: E0310 08:02:14.237913 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:02:28 crc kubenswrapper[4825]: I0310 08:02:28.236373 4825 scope.go:117] "RemoveContainer" containerID="856a46518228770fd6354414263218ba580cad9f6bf09966c52a899f2397881c" Mar 10 08:02:28 crc kubenswrapper[4825]: E0310 08:02:28.237052 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:02:38 crc kubenswrapper[4825]: I0310 08:02:38.132612 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6cbd84bb99-mjd6c"] Mar 10 08:02:38 crc kubenswrapper[4825]: E0310 08:02:38.133405 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24859db8-ff70-476f-9aab-62e44102d93f" containerName="oc" Mar 10 08:02:38 crc kubenswrapper[4825]: I0310 08:02:38.133417 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="24859db8-ff70-476f-9aab-62e44102d93f" containerName="oc" Mar 10 08:02:38 crc kubenswrapper[4825]: I0310 08:02:38.133545 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="24859db8-ff70-476f-9aab-62e44102d93f" containerName="oc" Mar 10 08:02:38 crc kubenswrapper[4825]: I0310 08:02:38.134278 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cbd84bb99-mjd6c" Mar 10 08:02:38 crc kubenswrapper[4825]: I0310 08:02:38.138598 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 10 08:02:38 crc kubenswrapper[4825]: I0310 08:02:38.138787 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 10 08:02:38 crc kubenswrapper[4825]: I0310 08:02:38.138872 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-kvsts" Mar 10 08:02:38 crc kubenswrapper[4825]: I0310 08:02:38.139196 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 10 08:02:38 crc kubenswrapper[4825]: I0310 08:02:38.140281 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-c68856ffc-mksv8"] Mar 10 08:02:38 crc kubenswrapper[4825]: I0310 08:02:38.141812 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c68856ffc-mksv8" Mar 10 08:02:38 crc kubenswrapper[4825]: I0310 08:02:38.142017 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 10 08:02:38 crc kubenswrapper[4825]: I0310 08:02:38.150975 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cbd84bb99-mjd6c"] Mar 10 08:02:38 crc kubenswrapper[4825]: I0310 08:02:38.159910 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c68856ffc-mksv8"] Mar 10 08:02:38 crc kubenswrapper[4825]: I0310 08:02:38.215482 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33f2eee4-99a8-4399-9806-e8052cc13783-dns-svc\") pod \"dnsmasq-dns-6cbd84bb99-mjd6c\" (UID: \"33f2eee4-99a8-4399-9806-e8052cc13783\") " pod="openstack/dnsmasq-dns-6cbd84bb99-mjd6c" Mar 10 08:02:38 crc kubenswrapper[4825]: I0310 
08:02:38.215558 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfnm6\" (UniqueName: \"kubernetes.io/projected/33f2eee4-99a8-4399-9806-e8052cc13783-kube-api-access-nfnm6\") pod \"dnsmasq-dns-6cbd84bb99-mjd6c\" (UID: \"33f2eee4-99a8-4399-9806-e8052cc13783\") " pod="openstack/dnsmasq-dns-6cbd84bb99-mjd6c" Mar 10 08:02:38 crc kubenswrapper[4825]: I0310 08:02:38.215604 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59lqj\" (UniqueName: \"kubernetes.io/projected/0907fe91-e1a7-477e-849b-073dd367c69a-kube-api-access-59lqj\") pod \"dnsmasq-dns-c68856ffc-mksv8\" (UID: \"0907fe91-e1a7-477e-849b-073dd367c69a\") " pod="openstack/dnsmasq-dns-c68856ffc-mksv8" Mar 10 08:02:38 crc kubenswrapper[4825]: I0310 08:02:38.215644 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33f2eee4-99a8-4399-9806-e8052cc13783-config\") pod \"dnsmasq-dns-6cbd84bb99-mjd6c\" (UID: \"33f2eee4-99a8-4399-9806-e8052cc13783\") " pod="openstack/dnsmasq-dns-6cbd84bb99-mjd6c" Mar 10 08:02:38 crc kubenswrapper[4825]: I0310 08:02:38.215867 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0907fe91-e1a7-477e-849b-073dd367c69a-config\") pod \"dnsmasq-dns-c68856ffc-mksv8\" (UID: \"0907fe91-e1a7-477e-849b-073dd367c69a\") " pod="openstack/dnsmasq-dns-c68856ffc-mksv8" Mar 10 08:02:38 crc kubenswrapper[4825]: I0310 08:02:38.317075 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59lqj\" (UniqueName: \"kubernetes.io/projected/0907fe91-e1a7-477e-849b-073dd367c69a-kube-api-access-59lqj\") pod \"dnsmasq-dns-c68856ffc-mksv8\" (UID: \"0907fe91-e1a7-477e-849b-073dd367c69a\") " pod="openstack/dnsmasq-dns-c68856ffc-mksv8" Mar 10 08:02:38 crc 
kubenswrapper[4825]: I0310 08:02:38.317178 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33f2eee4-99a8-4399-9806-e8052cc13783-config\") pod \"dnsmasq-dns-6cbd84bb99-mjd6c\" (UID: \"33f2eee4-99a8-4399-9806-e8052cc13783\") " pod="openstack/dnsmasq-dns-6cbd84bb99-mjd6c" Mar 10 08:02:38 crc kubenswrapper[4825]: I0310 08:02:38.317204 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0907fe91-e1a7-477e-849b-073dd367c69a-config\") pod \"dnsmasq-dns-c68856ffc-mksv8\" (UID: \"0907fe91-e1a7-477e-849b-073dd367c69a\") " pod="openstack/dnsmasq-dns-c68856ffc-mksv8" Mar 10 08:02:38 crc kubenswrapper[4825]: I0310 08:02:38.317281 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33f2eee4-99a8-4399-9806-e8052cc13783-dns-svc\") pod \"dnsmasq-dns-6cbd84bb99-mjd6c\" (UID: \"33f2eee4-99a8-4399-9806-e8052cc13783\") " pod="openstack/dnsmasq-dns-6cbd84bb99-mjd6c" Mar 10 08:02:38 crc kubenswrapper[4825]: I0310 08:02:38.318418 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0907fe91-e1a7-477e-849b-073dd367c69a-config\") pod \"dnsmasq-dns-c68856ffc-mksv8\" (UID: \"0907fe91-e1a7-477e-849b-073dd367c69a\") " pod="openstack/dnsmasq-dns-c68856ffc-mksv8" Mar 10 08:02:38 crc kubenswrapper[4825]: I0310 08:02:38.318509 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfnm6\" (UniqueName: \"kubernetes.io/projected/33f2eee4-99a8-4399-9806-e8052cc13783-kube-api-access-nfnm6\") pod \"dnsmasq-dns-6cbd84bb99-mjd6c\" (UID: \"33f2eee4-99a8-4399-9806-e8052cc13783\") " pod="openstack/dnsmasq-dns-6cbd84bb99-mjd6c" Mar 10 08:02:38 crc kubenswrapper[4825]: I0310 08:02:38.318561 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33f2eee4-99a8-4399-9806-e8052cc13783-dns-svc\") pod \"dnsmasq-dns-6cbd84bb99-mjd6c\" (UID: \"33f2eee4-99a8-4399-9806-e8052cc13783\") " pod="openstack/dnsmasq-dns-6cbd84bb99-mjd6c" Mar 10 08:02:38 crc kubenswrapper[4825]: I0310 08:02:38.318795 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33f2eee4-99a8-4399-9806-e8052cc13783-config\") pod \"dnsmasq-dns-6cbd84bb99-mjd6c\" (UID: \"33f2eee4-99a8-4399-9806-e8052cc13783\") " pod="openstack/dnsmasq-dns-6cbd84bb99-mjd6c" Mar 10 08:02:38 crc kubenswrapper[4825]: I0310 08:02:38.356245 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59lqj\" (UniqueName: \"kubernetes.io/projected/0907fe91-e1a7-477e-849b-073dd367c69a-kube-api-access-59lqj\") pod \"dnsmasq-dns-c68856ffc-mksv8\" (UID: \"0907fe91-e1a7-477e-849b-073dd367c69a\") " pod="openstack/dnsmasq-dns-c68856ffc-mksv8" Mar 10 08:02:38 crc kubenswrapper[4825]: I0310 08:02:38.357640 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfnm6\" (UniqueName: \"kubernetes.io/projected/33f2eee4-99a8-4399-9806-e8052cc13783-kube-api-access-nfnm6\") pod \"dnsmasq-dns-6cbd84bb99-mjd6c\" (UID: \"33f2eee4-99a8-4399-9806-e8052cc13783\") " pod="openstack/dnsmasq-dns-6cbd84bb99-mjd6c" Mar 10 08:02:38 crc kubenswrapper[4825]: I0310 08:02:38.370573 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cbd84bb99-mjd6c"] Mar 10 08:02:38 crc kubenswrapper[4825]: I0310 08:02:38.374120 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cbd84bb99-mjd6c" Mar 10 08:02:38 crc kubenswrapper[4825]: I0310 08:02:38.388391 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f79796479-jv592"] Mar 10 08:02:38 crc kubenswrapper[4825]: I0310 08:02:38.389441 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f79796479-jv592" Mar 10 08:02:38 crc kubenswrapper[4825]: I0310 08:02:38.399408 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f79796479-jv592"] Mar 10 08:02:38 crc kubenswrapper[4825]: I0310 08:02:38.476996 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c68856ffc-mksv8" Mar 10 08:02:38 crc kubenswrapper[4825]: I0310 08:02:38.523341 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9mrw\" (UniqueName: \"kubernetes.io/projected/c518751a-c456-49d7-998d-cb0b3b062057-kube-api-access-d9mrw\") pod \"dnsmasq-dns-6f79796479-jv592\" (UID: \"c518751a-c456-49d7-998d-cb0b3b062057\") " pod="openstack/dnsmasq-dns-6f79796479-jv592" Mar 10 08:02:38 crc kubenswrapper[4825]: I0310 08:02:38.523434 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c518751a-c456-49d7-998d-cb0b3b062057-dns-svc\") pod \"dnsmasq-dns-6f79796479-jv592\" (UID: \"c518751a-c456-49d7-998d-cb0b3b062057\") " pod="openstack/dnsmasq-dns-6f79796479-jv592" Mar 10 08:02:38 crc kubenswrapper[4825]: I0310 08:02:38.523480 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c518751a-c456-49d7-998d-cb0b3b062057-config\") pod \"dnsmasq-dns-6f79796479-jv592\" (UID: \"c518751a-c456-49d7-998d-cb0b3b062057\") " pod="openstack/dnsmasq-dns-6f79796479-jv592" Mar 10 08:02:38 crc 
kubenswrapper[4825]: I0310 08:02:38.625045 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c518751a-c456-49d7-998d-cb0b3b062057-dns-svc\") pod \"dnsmasq-dns-6f79796479-jv592\" (UID: \"c518751a-c456-49d7-998d-cb0b3b062057\") " pod="openstack/dnsmasq-dns-6f79796479-jv592" Mar 10 08:02:38 crc kubenswrapper[4825]: I0310 08:02:38.625399 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c518751a-c456-49d7-998d-cb0b3b062057-config\") pod \"dnsmasq-dns-6f79796479-jv592\" (UID: \"c518751a-c456-49d7-998d-cb0b3b062057\") " pod="openstack/dnsmasq-dns-6f79796479-jv592" Mar 10 08:02:38 crc kubenswrapper[4825]: I0310 08:02:38.625427 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9mrw\" (UniqueName: \"kubernetes.io/projected/c518751a-c456-49d7-998d-cb0b3b062057-kube-api-access-d9mrw\") pod \"dnsmasq-dns-6f79796479-jv592\" (UID: \"c518751a-c456-49d7-998d-cb0b3b062057\") " pod="openstack/dnsmasq-dns-6f79796479-jv592" Mar 10 08:02:38 crc kubenswrapper[4825]: I0310 08:02:38.626226 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c518751a-c456-49d7-998d-cb0b3b062057-dns-svc\") pod \"dnsmasq-dns-6f79796479-jv592\" (UID: \"c518751a-c456-49d7-998d-cb0b3b062057\") " pod="openstack/dnsmasq-dns-6f79796479-jv592" Mar 10 08:02:38 crc kubenswrapper[4825]: I0310 08:02:38.626428 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c518751a-c456-49d7-998d-cb0b3b062057-config\") pod \"dnsmasq-dns-6f79796479-jv592\" (UID: \"c518751a-c456-49d7-998d-cb0b3b062057\") " pod="openstack/dnsmasq-dns-6f79796479-jv592" Mar 10 08:02:38 crc kubenswrapper[4825]: I0310 08:02:38.647757 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-d9mrw\" (UniqueName: \"kubernetes.io/projected/c518751a-c456-49d7-998d-cb0b3b062057-kube-api-access-d9mrw\") pod \"dnsmasq-dns-6f79796479-jv592\" (UID: \"c518751a-c456-49d7-998d-cb0b3b062057\") " pod="openstack/dnsmasq-dns-6f79796479-jv592" Mar 10 08:02:38 crc kubenswrapper[4825]: I0310 08:02:38.763087 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f79796479-jv592" Mar 10 08:02:38 crc kubenswrapper[4825]: I0310 08:02:38.914795 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f79796479-jv592"] Mar 10 08:02:38 crc kubenswrapper[4825]: I0310 08:02:38.929705 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cbd84bb99-mjd6c"] Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.005383 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-c87cfb955-vzt5j"] Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.007400 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c87cfb955-vzt5j" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.017337 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c87cfb955-vzt5j"] Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.030473 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c68856ffc-mksv8"] Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.131244 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55c11e3c-2b81-4e63-87f2-71e18805c676-dns-svc\") pod \"dnsmasq-dns-c87cfb955-vzt5j\" (UID: \"55c11e3c-2b81-4e63-87f2-71e18805c676\") " pod="openstack/dnsmasq-dns-c87cfb955-vzt5j" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.131337 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqqpr\" (UniqueName: \"kubernetes.io/projected/55c11e3c-2b81-4e63-87f2-71e18805c676-kube-api-access-gqqpr\") pod \"dnsmasq-dns-c87cfb955-vzt5j\" (UID: \"55c11e3c-2b81-4e63-87f2-71e18805c676\") " pod="openstack/dnsmasq-dns-c87cfb955-vzt5j" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.131358 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55c11e3c-2b81-4e63-87f2-71e18805c676-config\") pod \"dnsmasq-dns-c87cfb955-vzt5j\" (UID: \"55c11e3c-2b81-4e63-87f2-71e18805c676\") " pod="openstack/dnsmasq-dns-c87cfb955-vzt5j" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.197202 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f79796479-jv592"] Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.234096 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqqpr\" (UniqueName: 
\"kubernetes.io/projected/55c11e3c-2b81-4e63-87f2-71e18805c676-kube-api-access-gqqpr\") pod \"dnsmasq-dns-c87cfb955-vzt5j\" (UID: \"55c11e3c-2b81-4e63-87f2-71e18805c676\") " pod="openstack/dnsmasq-dns-c87cfb955-vzt5j" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.234149 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55c11e3c-2b81-4e63-87f2-71e18805c676-config\") pod \"dnsmasq-dns-c87cfb955-vzt5j\" (UID: \"55c11e3c-2b81-4e63-87f2-71e18805c676\") " pod="openstack/dnsmasq-dns-c87cfb955-vzt5j" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.234214 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55c11e3c-2b81-4e63-87f2-71e18805c676-dns-svc\") pod \"dnsmasq-dns-c87cfb955-vzt5j\" (UID: \"55c11e3c-2b81-4e63-87f2-71e18805c676\") " pod="openstack/dnsmasq-dns-c87cfb955-vzt5j" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.235065 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55c11e3c-2b81-4e63-87f2-71e18805c676-dns-svc\") pod \"dnsmasq-dns-c87cfb955-vzt5j\" (UID: \"55c11e3c-2b81-4e63-87f2-71e18805c676\") " pod="openstack/dnsmasq-dns-c87cfb955-vzt5j" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.235324 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55c11e3c-2b81-4e63-87f2-71e18805c676-config\") pod \"dnsmasq-dns-c87cfb955-vzt5j\" (UID: \"55c11e3c-2b81-4e63-87f2-71e18805c676\") " pod="openstack/dnsmasq-dns-c87cfb955-vzt5j" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.306699 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cbd84bb99-mjd6c" 
event={"ID":"33f2eee4-99a8-4399-9806-e8052cc13783","Type":"ContainerStarted","Data":"5e7963311f5d7b026908c6a7511dcfc795176a7d9b05ff0b67ec50147d05a83f"} Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.315516 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c68856ffc-mksv8" event={"ID":"0907fe91-e1a7-477e-849b-073dd367c69a","Type":"ContainerStarted","Data":"619623e308b219e11483661dae6025209b32e5d80cbe640dcb591a89489cd898"} Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.324461 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f79796479-jv592" event={"ID":"c518751a-c456-49d7-998d-cb0b3b062057","Type":"ContainerStarted","Data":"b8ef583d5bad8b5ad4d4870752de4bffe13f1d04184822a4d2ee4a4f38f25f68"} Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.573320 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqqpr\" (UniqueName: \"kubernetes.io/projected/55c11e3c-2b81-4e63-87f2-71e18805c676-kube-api-access-gqqpr\") pod \"dnsmasq-dns-c87cfb955-vzt5j\" (UID: \"55c11e3c-2b81-4e63-87f2-71e18805c676\") " pod="openstack/dnsmasq-dns-c87cfb955-vzt5j" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.575272 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.576827 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.579378 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.579691 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.579843 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.579932 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-fbtpt" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.580043 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.580114 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.580247 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.582122 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.638787 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a37c9d85-7482-45de-b47d-378f308feb54-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a37c9d85-7482-45de-b47d-378f308feb54\") " pod="openstack/rabbitmq-server-0" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.638830 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/a37c9d85-7482-45de-b47d-378f308feb54-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a37c9d85-7482-45de-b47d-378f308feb54\") " pod="openstack/rabbitmq-server-0" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.638854 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a37c9d85-7482-45de-b47d-378f308feb54-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a37c9d85-7482-45de-b47d-378f308feb54\") " pod="openstack/rabbitmq-server-0" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.638869 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a37c9d85-7482-45de-b47d-378f308feb54-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a37c9d85-7482-45de-b47d-378f308feb54\") " pod="openstack/rabbitmq-server-0" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.638885 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a37c9d85-7482-45de-b47d-378f308feb54-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a37c9d85-7482-45de-b47d-378f308feb54\") " pod="openstack/rabbitmq-server-0" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.638913 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-10ce9807-6e64-488b-8542-1028122298f4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10ce9807-6e64-488b-8542-1028122298f4\") pod \"rabbitmq-server-0\" (UID: \"a37c9d85-7482-45de-b47d-378f308feb54\") " pod="openstack/rabbitmq-server-0" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.638940 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/a37c9d85-7482-45de-b47d-378f308feb54-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a37c9d85-7482-45de-b47d-378f308feb54\") " pod="openstack/rabbitmq-server-0" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.638957 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f277q\" (UniqueName: \"kubernetes.io/projected/a37c9d85-7482-45de-b47d-378f308feb54-kube-api-access-f277q\") pod \"rabbitmq-server-0\" (UID: \"a37c9d85-7482-45de-b47d-378f308feb54\") " pod="openstack/rabbitmq-server-0" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.638972 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a37c9d85-7482-45de-b47d-378f308feb54-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a37c9d85-7482-45de-b47d-378f308feb54\") " pod="openstack/rabbitmq-server-0" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.639006 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a37c9d85-7482-45de-b47d-378f308feb54-config-data\") pod \"rabbitmq-server-0\" (UID: \"a37c9d85-7482-45de-b47d-378f308feb54\") " pod="openstack/rabbitmq-server-0" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.639023 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a37c9d85-7482-45de-b47d-378f308feb54-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a37c9d85-7482-45de-b47d-378f308feb54\") " pod="openstack/rabbitmq-server-0" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.650065 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c87cfb955-vzt5j" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.740081 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a37c9d85-7482-45de-b47d-378f308feb54-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a37c9d85-7482-45de-b47d-378f308feb54\") " pod="openstack/rabbitmq-server-0" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.740125 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a37c9d85-7482-45de-b47d-378f308feb54-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a37c9d85-7482-45de-b47d-378f308feb54\") " pod="openstack/rabbitmq-server-0" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.740162 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a37c9d85-7482-45de-b47d-378f308feb54-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a37c9d85-7482-45de-b47d-378f308feb54\") " pod="openstack/rabbitmq-server-0" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.740183 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a37c9d85-7482-45de-b47d-378f308feb54-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a37c9d85-7482-45de-b47d-378f308feb54\") " pod="openstack/rabbitmq-server-0" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.740210 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-10ce9807-6e64-488b-8542-1028122298f4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10ce9807-6e64-488b-8542-1028122298f4\") pod \"rabbitmq-server-0\" (UID: \"a37c9d85-7482-45de-b47d-378f308feb54\") " pod="openstack/rabbitmq-server-0" Mar 10 08:02:39 crc kubenswrapper[4825]: 
I0310 08:02:39.740239 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a37c9d85-7482-45de-b47d-378f308feb54-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a37c9d85-7482-45de-b47d-378f308feb54\") " pod="openstack/rabbitmq-server-0" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.740258 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f277q\" (UniqueName: \"kubernetes.io/projected/a37c9d85-7482-45de-b47d-378f308feb54-kube-api-access-f277q\") pod \"rabbitmq-server-0\" (UID: \"a37c9d85-7482-45de-b47d-378f308feb54\") " pod="openstack/rabbitmq-server-0" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.740301 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a37c9d85-7482-45de-b47d-378f308feb54-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a37c9d85-7482-45de-b47d-378f308feb54\") " pod="openstack/rabbitmq-server-0" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.740357 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a37c9d85-7482-45de-b47d-378f308feb54-config-data\") pod \"rabbitmq-server-0\" (UID: \"a37c9d85-7482-45de-b47d-378f308feb54\") " pod="openstack/rabbitmq-server-0" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.740393 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a37c9d85-7482-45de-b47d-378f308feb54-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a37c9d85-7482-45de-b47d-378f308feb54\") " pod="openstack/rabbitmq-server-0" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.740522 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/a37c9d85-7482-45de-b47d-378f308feb54-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a37c9d85-7482-45de-b47d-378f308feb54\") " pod="openstack/rabbitmq-server-0" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.740785 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a37c9d85-7482-45de-b47d-378f308feb54-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a37c9d85-7482-45de-b47d-378f308feb54\") " pod="openstack/rabbitmq-server-0" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.743025 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a37c9d85-7482-45de-b47d-378f308feb54-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a37c9d85-7482-45de-b47d-378f308feb54\") " pod="openstack/rabbitmq-server-0" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.743342 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a37c9d85-7482-45de-b47d-378f308feb54-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a37c9d85-7482-45de-b47d-378f308feb54\") " pod="openstack/rabbitmq-server-0" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.746298 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a37c9d85-7482-45de-b47d-378f308feb54-config-data\") pod \"rabbitmq-server-0\" (UID: \"a37c9d85-7482-45de-b47d-378f308feb54\") " pod="openstack/rabbitmq-server-0" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.750822 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a37c9d85-7482-45de-b47d-378f308feb54-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a37c9d85-7482-45de-b47d-378f308feb54\") " pod="openstack/rabbitmq-server-0" Mar 10 
08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.753152 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a37c9d85-7482-45de-b47d-378f308feb54-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a37c9d85-7482-45de-b47d-378f308feb54\") " pod="openstack/rabbitmq-server-0" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.753435 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a37c9d85-7482-45de-b47d-378f308feb54-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a37c9d85-7482-45de-b47d-378f308feb54\") " pod="openstack/rabbitmq-server-0" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.753447 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a37c9d85-7482-45de-b47d-378f308feb54-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a37c9d85-7482-45de-b47d-378f308feb54\") " pod="openstack/rabbitmq-server-0" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.767564 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a37c9d85-7482-45de-b47d-378f308feb54-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a37c9d85-7482-45de-b47d-378f308feb54\") " pod="openstack/rabbitmq-server-0" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.767625 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f277q\" (UniqueName: \"kubernetes.io/projected/a37c9d85-7482-45de-b47d-378f308feb54-kube-api-access-f277q\") pod \"rabbitmq-server-0\" (UID: \"a37c9d85-7482-45de-b47d-378f308feb54\") " pod="openstack/rabbitmq-server-0" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.788016 4825 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.788056 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-10ce9807-6e64-488b-8542-1028122298f4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10ce9807-6e64-488b-8542-1028122298f4\") pod \"rabbitmq-server-0\" (UID: \"a37c9d85-7482-45de-b47d-378f308feb54\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/39ad561c7ffae3360bcae64ac6a958177480a2045780dd959a8d51d1e4ce878f/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.822005 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-10ce9807-6e64-488b-8542-1028122298f4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10ce9807-6e64-488b-8542-1028122298f4\") pod \"rabbitmq-server-0\" (UID: \"a37c9d85-7482-45de-b47d-378f308feb54\") " pod="openstack/rabbitmq-server-0" Mar 10 08:02:39 crc kubenswrapper[4825]: I0310 08:02:39.931240 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.129262 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.144966 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.153689 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.153777 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.153864 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.153572 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-vvcrg" Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.154290 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.154537 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.156657 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.160600 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.167585 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c87cfb955-vzt5j"] Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.251747 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3d908280-2284-4dec-b8e3-67aae4d22314-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d908280-2284-4dec-b8e3-67aae4d22314\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.252075 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xtm8\" (UniqueName: \"kubernetes.io/projected/3d908280-2284-4dec-b8e3-67aae4d22314-kube-api-access-8xtm8\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d908280-2284-4dec-b8e3-67aae4d22314\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.252123 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3d908280-2284-4dec-b8e3-67aae4d22314-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d908280-2284-4dec-b8e3-67aae4d22314\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.252187 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3d908280-2284-4dec-b8e3-67aae4d22314-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d908280-2284-4dec-b8e3-67aae4d22314\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.252240 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3d908280-2284-4dec-b8e3-67aae4d22314-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d908280-2284-4dec-b8e3-67aae4d22314\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.252264 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3d908280-2284-4dec-b8e3-67aae4d22314-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d908280-2284-4dec-b8e3-67aae4d22314\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.252292 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-fe14f45b-65e4-4e2b-8b48-bc8134ebe3b6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fe14f45b-65e4-4e2b-8b48-bc8134ebe3b6\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d908280-2284-4dec-b8e3-67aae4d22314\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.253619 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3d908280-2284-4dec-b8e3-67aae4d22314-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d908280-2284-4dec-b8e3-67aae4d22314\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.253681 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3d908280-2284-4dec-b8e3-67aae4d22314-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d908280-2284-4dec-b8e3-67aae4d22314\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.253717 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3d908280-2284-4dec-b8e3-67aae4d22314-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d908280-2284-4dec-b8e3-67aae4d22314\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.253795 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3d908280-2284-4dec-b8e3-67aae4d22314-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"3d908280-2284-4dec-b8e3-67aae4d22314\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.272535 4825 scope.go:117] "RemoveContainer" containerID="8d93e475acb39cfb291a147d608abb51f38af70c976fe05d21e86ef686302452" Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.334935 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c87cfb955-vzt5j" event={"ID":"55c11e3c-2b81-4e63-87f2-71e18805c676","Type":"ContainerStarted","Data":"3779e46ac757cdf5873a290b8bee1a3e82bd215d2982525e40e98ba4574d4705"} Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.376984 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3d908280-2284-4dec-b8e3-67aae4d22314-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d908280-2284-4dec-b8e3-67aae4d22314\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.377097 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3d908280-2284-4dec-b8e3-67aae4d22314-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d908280-2284-4dec-b8e3-67aae4d22314\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.377192 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3d908280-2284-4dec-b8e3-67aae4d22314-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d908280-2284-4dec-b8e3-67aae4d22314\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.377232 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xtm8\" (UniqueName: 
\"kubernetes.io/projected/3d908280-2284-4dec-b8e3-67aae4d22314-kube-api-access-8xtm8\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d908280-2284-4dec-b8e3-67aae4d22314\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.377283 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3d908280-2284-4dec-b8e3-67aae4d22314-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d908280-2284-4dec-b8e3-67aae4d22314\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.377338 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3d908280-2284-4dec-b8e3-67aae4d22314-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d908280-2284-4dec-b8e3-67aae4d22314\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.377457 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3d908280-2284-4dec-b8e3-67aae4d22314-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d908280-2284-4dec-b8e3-67aae4d22314\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.377550 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3d908280-2284-4dec-b8e3-67aae4d22314-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d908280-2284-4dec-b8e3-67aae4d22314\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.377596 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-fe14f45b-65e4-4e2b-8b48-bc8134ebe3b6\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fe14f45b-65e4-4e2b-8b48-bc8134ebe3b6\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d908280-2284-4dec-b8e3-67aae4d22314\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.377673 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3d908280-2284-4dec-b8e3-67aae4d22314-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d908280-2284-4dec-b8e3-67aae4d22314\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.377726 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3d908280-2284-4dec-b8e3-67aae4d22314-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d908280-2284-4dec-b8e3-67aae4d22314\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.380047 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3d908280-2284-4dec-b8e3-67aae4d22314-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d908280-2284-4dec-b8e3-67aae4d22314\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.380257 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3d908280-2284-4dec-b8e3-67aae4d22314-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d908280-2284-4dec-b8e3-67aae4d22314\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.380415 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3d908280-2284-4dec-b8e3-67aae4d22314-config-data\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"3d908280-2284-4dec-b8e3-67aae4d22314\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.380714 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3d908280-2284-4dec-b8e3-67aae4d22314-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d908280-2284-4dec-b8e3-67aae4d22314\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.381093 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3d908280-2284-4dec-b8e3-67aae4d22314-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d908280-2284-4dec-b8e3-67aae4d22314\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.383986 4825 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.384026 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-fe14f45b-65e4-4e2b-8b48-bc8134ebe3b6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fe14f45b-65e4-4e2b-8b48-bc8134ebe3b6\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d908280-2284-4dec-b8e3-67aae4d22314\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1635cd019ed9056df86878228b36b513f8c17626caeab9d1ed75370112b35fe1/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.384828 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3d908280-2284-4dec-b8e3-67aae4d22314-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d908280-2284-4dec-b8e3-67aae4d22314\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.386519 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3d908280-2284-4dec-b8e3-67aae4d22314-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d908280-2284-4dec-b8e3-67aae4d22314\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.387175 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3d908280-2284-4dec-b8e3-67aae4d22314-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d908280-2284-4dec-b8e3-67aae4d22314\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.396295 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xtm8\" (UniqueName: \"kubernetes.io/projected/3d908280-2284-4dec-b8e3-67aae4d22314-kube-api-access-8xtm8\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"3d908280-2284-4dec-b8e3-67aae4d22314\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.404089 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3d908280-2284-4dec-b8e3-67aae4d22314-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d908280-2284-4dec-b8e3-67aae4d22314\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.426042 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-fe14f45b-65e4-4e2b-8b48-bc8134ebe3b6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fe14f45b-65e4-4e2b-8b48-bc8134ebe3b6\") pod \"rabbitmq-cell1-server-0\" (UID: \"3d908280-2284-4dec-b8e3-67aae4d22314\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.500257 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.518630 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:02:40 crc kubenswrapper[4825]: I0310 08:02:40.977229 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 08:02:41 crc kubenswrapper[4825]: W0310 08:02:41.062856 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d908280_2284_4dec_b8e3_67aae4d22314.slice/crio-ba2c417b869e48f28f1944fab4517af944f2d60eaee6d394ddc70b5bd83d6d0c WatchSource:0}: Error finding container ba2c417b869e48f28f1944fab4517af944f2d60eaee6d394ddc70b5bd83d6d0c: Status 404 returned error can't find the container with id ba2c417b869e48f28f1944fab4517af944f2d60eaee6d394ddc70b5bd83d6d0c Mar 10 08:02:41 crc kubenswrapper[4825]: I0310 08:02:41.259484 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 10 08:02:41 crc kubenswrapper[4825]: I0310 08:02:41.261954 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 10 08:02:41 crc kubenswrapper[4825]: I0310 08:02:41.266661 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-dnq62" Mar 10 08:02:41 crc kubenswrapper[4825]: I0310 08:02:41.266945 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 10 08:02:41 crc kubenswrapper[4825]: I0310 08:02:41.273551 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 10 08:02:41 crc kubenswrapper[4825]: I0310 08:02:41.273955 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 10 08:02:41 crc kubenswrapper[4825]: I0310 08:02:41.274554 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 10 08:02:41 crc kubenswrapper[4825]: I0310 08:02:41.275147 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 10 08:02:41 crc kubenswrapper[4825]: I0310 08:02:41.391048 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a37c9d85-7482-45de-b47d-378f308feb54","Type":"ContainerStarted","Data":"1efec9b8740bc4d4a33475ae730b7711821dc3539764baeee428b70b300f2b2a"} Mar 10 08:02:41 crc kubenswrapper[4825]: I0310 08:02:41.392749 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khd67\" (UniqueName: \"kubernetes.io/projected/47fdbdcc-cae4-4261-9695-645201acdc61-kube-api-access-khd67\") pod \"openstack-galera-0\" (UID: \"47fdbdcc-cae4-4261-9695-645201acdc61\") " pod="openstack/openstack-galera-0" Mar 10 08:02:41 crc kubenswrapper[4825]: I0310 08:02:41.392804 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/47fdbdcc-cae4-4261-9695-645201acdc61-config-data-default\") pod \"openstack-galera-0\" (UID: \"47fdbdcc-cae4-4261-9695-645201acdc61\") " pod="openstack/openstack-galera-0" Mar 10 08:02:41 crc kubenswrapper[4825]: I0310 08:02:41.392837 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47fdbdcc-cae4-4261-9695-645201acdc61-operator-scripts\") pod \"openstack-galera-0\" (UID: \"47fdbdcc-cae4-4261-9695-645201acdc61\") " pod="openstack/openstack-galera-0" Mar 10 08:02:41 crc kubenswrapper[4825]: I0310 08:02:41.392868 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/47fdbdcc-cae4-4261-9695-645201acdc61-kolla-config\") pod \"openstack-galera-0\" (UID: \"47fdbdcc-cae4-4261-9695-645201acdc61\") " pod="openstack/openstack-galera-0" Mar 10 08:02:41 crc kubenswrapper[4825]: I0310 08:02:41.392910 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47fdbdcc-cae4-4261-9695-645201acdc61-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"47fdbdcc-cae4-4261-9695-645201acdc61\") " pod="openstack/openstack-galera-0" Mar 10 08:02:41 crc kubenswrapper[4825]: I0310 08:02:41.392935 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/47fdbdcc-cae4-4261-9695-645201acdc61-config-data-generated\") pod \"openstack-galera-0\" (UID: \"47fdbdcc-cae4-4261-9695-645201acdc61\") " pod="openstack/openstack-galera-0" Mar 10 08:02:41 crc kubenswrapper[4825]: I0310 08:02:41.393009 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3bd11da8-4026-4e12-a711-8eb9d2c59e3c\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3bd11da8-4026-4e12-a711-8eb9d2c59e3c\") pod \"openstack-galera-0\" (UID: \"47fdbdcc-cae4-4261-9695-645201acdc61\") " pod="openstack/openstack-galera-0" Mar 10 08:02:41 crc kubenswrapper[4825]: I0310 08:02:41.393041 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/47fdbdcc-cae4-4261-9695-645201acdc61-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"47fdbdcc-cae4-4261-9695-645201acdc61\") " pod="openstack/openstack-galera-0" Mar 10 08:02:41 crc kubenswrapper[4825]: I0310 08:02:41.397873 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3d908280-2284-4dec-b8e3-67aae4d22314","Type":"ContainerStarted","Data":"ba2c417b869e48f28f1944fab4517af944f2d60eaee6d394ddc70b5bd83d6d0c"} Mar 10 08:02:41 crc kubenswrapper[4825]: I0310 08:02:41.495254 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47fdbdcc-cae4-4261-9695-645201acdc61-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"47fdbdcc-cae4-4261-9695-645201acdc61\") " pod="openstack/openstack-galera-0" Mar 10 08:02:41 crc kubenswrapper[4825]: I0310 08:02:41.495315 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/47fdbdcc-cae4-4261-9695-645201acdc61-config-data-generated\") pod \"openstack-galera-0\" (UID: \"47fdbdcc-cae4-4261-9695-645201acdc61\") " pod="openstack/openstack-galera-0" Mar 10 08:02:41 crc kubenswrapper[4825]: I0310 08:02:41.495365 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3bd11da8-4026-4e12-a711-8eb9d2c59e3c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3bd11da8-4026-4e12-a711-8eb9d2c59e3c\") pod \"openstack-galera-0\" (UID: 
\"47fdbdcc-cae4-4261-9695-645201acdc61\") " pod="openstack/openstack-galera-0" Mar 10 08:02:41 crc kubenswrapper[4825]: I0310 08:02:41.495403 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/47fdbdcc-cae4-4261-9695-645201acdc61-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"47fdbdcc-cae4-4261-9695-645201acdc61\") " pod="openstack/openstack-galera-0" Mar 10 08:02:41 crc kubenswrapper[4825]: I0310 08:02:41.495471 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khd67\" (UniqueName: \"kubernetes.io/projected/47fdbdcc-cae4-4261-9695-645201acdc61-kube-api-access-khd67\") pod \"openstack-galera-0\" (UID: \"47fdbdcc-cae4-4261-9695-645201acdc61\") " pod="openstack/openstack-galera-0" Mar 10 08:02:41 crc kubenswrapper[4825]: I0310 08:02:41.495515 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/47fdbdcc-cae4-4261-9695-645201acdc61-config-data-default\") pod \"openstack-galera-0\" (UID: \"47fdbdcc-cae4-4261-9695-645201acdc61\") " pod="openstack/openstack-galera-0" Mar 10 08:02:41 crc kubenswrapper[4825]: I0310 08:02:41.495546 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47fdbdcc-cae4-4261-9695-645201acdc61-operator-scripts\") pod \"openstack-galera-0\" (UID: \"47fdbdcc-cae4-4261-9695-645201acdc61\") " pod="openstack/openstack-galera-0" Mar 10 08:02:41 crc kubenswrapper[4825]: I0310 08:02:41.495572 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/47fdbdcc-cae4-4261-9695-645201acdc61-kolla-config\") pod \"openstack-galera-0\" (UID: \"47fdbdcc-cae4-4261-9695-645201acdc61\") " pod="openstack/openstack-galera-0" Mar 10 08:02:41 crc kubenswrapper[4825]: I0310 
08:02:41.500055 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/47fdbdcc-cae4-4261-9695-645201acdc61-kolla-config\") pod \"openstack-galera-0\" (UID: \"47fdbdcc-cae4-4261-9695-645201acdc61\") " pod="openstack/openstack-galera-0" Mar 10 08:02:41 crc kubenswrapper[4825]: I0310 08:02:41.503512 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/47fdbdcc-cae4-4261-9695-645201acdc61-config-data-default\") pod \"openstack-galera-0\" (UID: \"47fdbdcc-cae4-4261-9695-645201acdc61\") " pod="openstack/openstack-galera-0" Mar 10 08:02:41 crc kubenswrapper[4825]: I0310 08:02:41.505852 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47fdbdcc-cae4-4261-9695-645201acdc61-operator-scripts\") pod \"openstack-galera-0\" (UID: \"47fdbdcc-cae4-4261-9695-645201acdc61\") " pod="openstack/openstack-galera-0" Mar 10 08:02:41 crc kubenswrapper[4825]: I0310 08:02:41.506484 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/47fdbdcc-cae4-4261-9695-645201acdc61-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"47fdbdcc-cae4-4261-9695-645201acdc61\") " pod="openstack/openstack-galera-0" Mar 10 08:02:41 crc kubenswrapper[4825]: I0310 08:02:41.507047 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/47fdbdcc-cae4-4261-9695-645201acdc61-config-data-generated\") pod \"openstack-galera-0\" (UID: \"47fdbdcc-cae4-4261-9695-645201acdc61\") " pod="openstack/openstack-galera-0" Mar 10 08:02:41 crc kubenswrapper[4825]: I0310 08:02:41.510907 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/47fdbdcc-cae4-4261-9695-645201acdc61-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"47fdbdcc-cae4-4261-9695-645201acdc61\") " pod="openstack/openstack-galera-0" Mar 10 08:02:41 crc kubenswrapper[4825]: I0310 08:02:41.512042 4825 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 10 08:02:41 crc kubenswrapper[4825]: I0310 08:02:41.512104 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3bd11da8-4026-4e12-a711-8eb9d2c59e3c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3bd11da8-4026-4e12-a711-8eb9d2c59e3c\") pod \"openstack-galera-0\" (UID: \"47fdbdcc-cae4-4261-9695-645201acdc61\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2b03c75cd98c87b31566293c16cd625029497f5952f3dc6b9450e6cb462bb639/globalmount\"" pod="openstack/openstack-galera-0" Mar 10 08:02:41 crc kubenswrapper[4825]: I0310 08:02:41.522877 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khd67\" (UniqueName: \"kubernetes.io/projected/47fdbdcc-cae4-4261-9695-645201acdc61-kube-api-access-khd67\") pod \"openstack-galera-0\" (UID: \"47fdbdcc-cae4-4261-9695-645201acdc61\") " pod="openstack/openstack-galera-0" Mar 10 08:02:41 crc kubenswrapper[4825]: I0310 08:02:41.546106 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3bd11da8-4026-4e12-a711-8eb9d2c59e3c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3bd11da8-4026-4e12-a711-8eb9d2c59e3c\") pod \"openstack-galera-0\" (UID: \"47fdbdcc-cae4-4261-9695-645201acdc61\") " pod="openstack/openstack-galera-0" Mar 10 08:02:41 crc kubenswrapper[4825]: I0310 08:02:41.638158 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 10 08:02:42 crc kubenswrapper[4825]: I0310 08:02:42.142504 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 10 08:02:42 crc kubenswrapper[4825]: W0310 08:02:42.155296 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47fdbdcc_cae4_4261_9695_645201acdc61.slice/crio-ac6c6434d5737d0b08c41b99615c9569bb303ad09970bdc74a7d5c9ce23e10a8 WatchSource:0}: Error finding container ac6c6434d5737d0b08c41b99615c9569bb303ad09970bdc74a7d5c9ce23e10a8: Status 404 returned error can't find the container with id ac6c6434d5737d0b08c41b99615c9569bb303ad09970bdc74a7d5c9ce23e10a8 Mar 10 08:02:42 crc kubenswrapper[4825]: I0310 08:02:42.236567 4825 scope.go:117] "RemoveContainer" containerID="856a46518228770fd6354414263218ba580cad9f6bf09966c52a899f2397881c" Mar 10 08:02:42 crc kubenswrapper[4825]: E0310 08:02:42.236801 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:02:42 crc kubenswrapper[4825]: I0310 08:02:42.408991 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"47fdbdcc-cae4-4261-9695-645201acdc61","Type":"ContainerStarted","Data":"ac6c6434d5737d0b08c41b99615c9569bb303ad09970bdc74a7d5c9ce23e10a8"} Mar 10 08:02:42 crc kubenswrapper[4825]: I0310 08:02:42.506402 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 08:02:42 crc kubenswrapper[4825]: I0310 08:02:42.507648 4825 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 10 08:02:42 crc kubenswrapper[4825]: I0310 08:02:42.509777 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 10 08:02:42 crc kubenswrapper[4825]: I0310 08:02:42.510172 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-mgztx" Mar 10 08:02:42 crc kubenswrapper[4825]: I0310 08:02:42.510921 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 10 08:02:42 crc kubenswrapper[4825]: I0310 08:02:42.512284 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 10 08:02:42 crc kubenswrapper[4825]: I0310 08:02:42.518241 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.094285 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b069d964-330f-4562-9916-81cae7d0e72f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"b069d964-330f-4562-9916-81cae7d0e72f\") " pod="openstack/openstack-cell1-galera-0" Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.094319 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfpf7\" (UniqueName: \"kubernetes.io/projected/b069d964-330f-4562-9916-81cae7d0e72f-kube-api-access-nfpf7\") pod \"openstack-cell1-galera-0\" (UID: \"b069d964-330f-4562-9916-81cae7d0e72f\") " pod="openstack/openstack-cell1-galera-0" Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.094363 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/b069d964-330f-4562-9916-81cae7d0e72f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"b069d964-330f-4562-9916-81cae7d0e72f\") " pod="openstack/openstack-cell1-galera-0" Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.094390 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b069d964-330f-4562-9916-81cae7d0e72f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"b069d964-330f-4562-9916-81cae7d0e72f\") " pod="openstack/openstack-cell1-galera-0" Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.094411 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b069d964-330f-4562-9916-81cae7d0e72f-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"b069d964-330f-4562-9916-81cae7d0e72f\") " pod="openstack/openstack-cell1-galera-0" Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.094458 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b069d964-330f-4562-9916-81cae7d0e72f-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"b069d964-330f-4562-9916-81cae7d0e72f\") " pod="openstack/openstack-cell1-galera-0" Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.094490 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b069d964-330f-4562-9916-81cae7d0e72f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"b069d964-330f-4562-9916-81cae7d0e72f\") " pod="openstack/openstack-cell1-galera-0" Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.094529 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-a43afae2-195b-4b3c-ab32-4bf9c813b97b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a43afae2-195b-4b3c-ab32-4bf9c813b97b\") pod \"openstack-cell1-galera-0\" (UID: \"b069d964-330f-4562-9916-81cae7d0e72f\") " pod="openstack/openstack-cell1-galera-0" Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.201532 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b069d964-330f-4562-9916-81cae7d0e72f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"b069d964-330f-4562-9916-81cae7d0e72f\") " pod="openstack/openstack-cell1-galera-0" Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.205266 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfpf7\" (UniqueName: \"kubernetes.io/projected/b069d964-330f-4562-9916-81cae7d0e72f-kube-api-access-nfpf7\") pod \"openstack-cell1-galera-0\" (UID: \"b069d964-330f-4562-9916-81cae7d0e72f\") " pod="openstack/openstack-cell1-galera-0" Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.205409 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b069d964-330f-4562-9916-81cae7d0e72f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"b069d964-330f-4562-9916-81cae7d0e72f\") " pod="openstack/openstack-cell1-galera-0" Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.205474 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b069d964-330f-4562-9916-81cae7d0e72f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"b069d964-330f-4562-9916-81cae7d0e72f\") " pod="openstack/openstack-cell1-galera-0" Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.205517 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/b069d964-330f-4562-9916-81cae7d0e72f-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"b069d964-330f-4562-9916-81cae7d0e72f\") " pod="openstack/openstack-cell1-galera-0" Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.205836 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b069d964-330f-4562-9916-81cae7d0e72f-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"b069d964-330f-4562-9916-81cae7d0e72f\") " pod="openstack/openstack-cell1-galera-0" Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.205894 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b069d964-330f-4562-9916-81cae7d0e72f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"b069d964-330f-4562-9916-81cae7d0e72f\") " pod="openstack/openstack-cell1-galera-0" Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.205979 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a43afae2-195b-4b3c-ab32-4bf9c813b97b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a43afae2-195b-4b3c-ab32-4bf9c813b97b\") pod \"openstack-cell1-galera-0\" (UID: \"b069d964-330f-4562-9916-81cae7d0e72f\") " pod="openstack/openstack-cell1-galera-0" Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.207423 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b069d964-330f-4562-9916-81cae7d0e72f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"b069d964-330f-4562-9916-81cae7d0e72f\") " pod="openstack/openstack-cell1-galera-0" Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.208037 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/b069d964-330f-4562-9916-81cae7d0e72f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"b069d964-330f-4562-9916-81cae7d0e72f\") " pod="openstack/openstack-cell1-galera-0" Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.209404 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b069d964-330f-4562-9916-81cae7d0e72f-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"b069d964-330f-4562-9916-81cae7d0e72f\") " pod="openstack/openstack-cell1-galera-0" Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.213958 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b069d964-330f-4562-9916-81cae7d0e72f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"b069d964-330f-4562-9916-81cae7d0e72f\") " pod="openstack/openstack-cell1-galera-0" Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.214920 4825 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.214970 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a43afae2-195b-4b3c-ab32-4bf9c813b97b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a43afae2-195b-4b3c-ab32-4bf9c813b97b\") pod \"openstack-cell1-galera-0\" (UID: \"b069d964-330f-4562-9916-81cae7d0e72f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ccb39a84e53a0ded73368b52bb27b99aaf84452b98c5953effed86c096824e07/globalmount\"" pod="openstack/openstack-cell1-galera-0" Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.217896 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b069d964-330f-4562-9916-81cae7d0e72f-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"b069d964-330f-4562-9916-81cae7d0e72f\") " pod="openstack/openstack-cell1-galera-0" Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.229178 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b069d964-330f-4562-9916-81cae7d0e72f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"b069d964-330f-4562-9916-81cae7d0e72f\") " pod="openstack/openstack-cell1-galera-0" Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.239363 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfpf7\" (UniqueName: \"kubernetes.io/projected/b069d964-330f-4562-9916-81cae7d0e72f-kube-api-access-nfpf7\") pod \"openstack-cell1-galera-0\" (UID: \"b069d964-330f-4562-9916-81cae7d0e72f\") " pod="openstack/openstack-cell1-galera-0" Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.252607 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a43afae2-195b-4b3c-ab32-4bf9c813b97b\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a43afae2-195b-4b3c-ab32-4bf9c813b97b\") pod \"openstack-cell1-galera-0\" (UID: \"b069d964-330f-4562-9916-81cae7d0e72f\") " pod="openstack/openstack-cell1-galera-0" Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.371722 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.372918 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.375369 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-x7v92" Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.376262 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.379265 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.392560 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.451520 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.510115 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ce6631-6a7f-447f-9ea1-036ab13eec97-combined-ca-bundle\") pod \"memcached-0\" (UID: \"a6ce6631-6a7f-447f-9ea1-036ab13eec97\") " pod="openstack/memcached-0" Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.510241 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a6ce6631-6a7f-447f-9ea1-036ab13eec97-kolla-config\") pod \"memcached-0\" (UID: \"a6ce6631-6a7f-447f-9ea1-036ab13eec97\") " pod="openstack/memcached-0" Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.510286 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9wmw\" (UniqueName: \"kubernetes.io/projected/a6ce6631-6a7f-447f-9ea1-036ab13eec97-kube-api-access-s9wmw\") pod \"memcached-0\" (UID: \"a6ce6631-6a7f-447f-9ea1-036ab13eec97\") " pod="openstack/memcached-0" Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.510322 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a6ce6631-6a7f-447f-9ea1-036ab13eec97-config-data\") pod \"memcached-0\" (UID: \"a6ce6631-6a7f-447f-9ea1-036ab13eec97\") " pod="openstack/memcached-0" Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.510344 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ce6631-6a7f-447f-9ea1-036ab13eec97-memcached-tls-certs\") pod \"memcached-0\" (UID: \"a6ce6631-6a7f-447f-9ea1-036ab13eec97\") " pod="openstack/memcached-0" Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 
08:02:43.611644 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9wmw\" (UniqueName: \"kubernetes.io/projected/a6ce6631-6a7f-447f-9ea1-036ab13eec97-kube-api-access-s9wmw\") pod \"memcached-0\" (UID: \"a6ce6631-6a7f-447f-9ea1-036ab13eec97\") " pod="openstack/memcached-0" Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.611707 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a6ce6631-6a7f-447f-9ea1-036ab13eec97-config-data\") pod \"memcached-0\" (UID: \"a6ce6631-6a7f-447f-9ea1-036ab13eec97\") " pod="openstack/memcached-0" Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.611733 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ce6631-6a7f-447f-9ea1-036ab13eec97-memcached-tls-certs\") pod \"memcached-0\" (UID: \"a6ce6631-6a7f-447f-9ea1-036ab13eec97\") " pod="openstack/memcached-0" Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.611784 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ce6631-6a7f-447f-9ea1-036ab13eec97-combined-ca-bundle\") pod \"memcached-0\" (UID: \"a6ce6631-6a7f-447f-9ea1-036ab13eec97\") " pod="openstack/memcached-0" Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.611820 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a6ce6631-6a7f-447f-9ea1-036ab13eec97-kolla-config\") pod \"memcached-0\" (UID: \"a6ce6631-6a7f-447f-9ea1-036ab13eec97\") " pod="openstack/memcached-0" Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.612646 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a6ce6631-6a7f-447f-9ea1-036ab13eec97-kolla-config\") pod 
\"memcached-0\" (UID: \"a6ce6631-6a7f-447f-9ea1-036ab13eec97\") " pod="openstack/memcached-0" Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.613390 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a6ce6631-6a7f-447f-9ea1-036ab13eec97-config-data\") pod \"memcached-0\" (UID: \"a6ce6631-6a7f-447f-9ea1-036ab13eec97\") " pod="openstack/memcached-0" Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.623665 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ce6631-6a7f-447f-9ea1-036ab13eec97-memcached-tls-certs\") pod \"memcached-0\" (UID: \"a6ce6631-6a7f-447f-9ea1-036ab13eec97\") " pod="openstack/memcached-0" Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.627475 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ce6631-6a7f-447f-9ea1-036ab13eec97-combined-ca-bundle\") pod \"memcached-0\" (UID: \"a6ce6631-6a7f-447f-9ea1-036ab13eec97\") " pod="openstack/memcached-0" Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.643174 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9wmw\" (UniqueName: \"kubernetes.io/projected/a6ce6631-6a7f-447f-9ea1-036ab13eec97-kube-api-access-s9wmw\") pod \"memcached-0\" (UID: \"a6ce6631-6a7f-447f-9ea1-036ab13eec97\") " pod="openstack/memcached-0" Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.713647 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 10 08:02:43 crc kubenswrapper[4825]: I0310 08:02:43.913183 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 08:02:43 crc kubenswrapper[4825]: W0310 08:02:43.993390 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb069d964_330f_4562_9916_81cae7d0e72f.slice/crio-bc1ad0734a1040d85d5bf43a3c607324ea466ecc69bbaa5343c8633012967728 WatchSource:0}: Error finding container bc1ad0734a1040d85d5bf43a3c607324ea466ecc69bbaa5343c8633012967728: Status 404 returned error can't find the container with id bc1ad0734a1040d85d5bf43a3c607324ea466ecc69bbaa5343c8633012967728 Mar 10 08:02:44 crc kubenswrapper[4825]: I0310 08:02:44.420290 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 10 08:02:44 crc kubenswrapper[4825]: W0310 08:02:44.421558 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6ce6631_6a7f_447f_9ea1_036ab13eec97.slice/crio-d812d750b887cc1c5d58b845aa47f081d132d56ab46a5eacb0664ea511d7f3b4 WatchSource:0}: Error finding container d812d750b887cc1c5d58b845aa47f081d132d56ab46a5eacb0664ea511d7f3b4: Status 404 returned error can't find the container with id d812d750b887cc1c5d58b845aa47f081d132d56ab46a5eacb0664ea511d7f3b4 Mar 10 08:02:44 crc kubenswrapper[4825]: I0310 08:02:44.453410 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"a6ce6631-6a7f-447f-9ea1-036ab13eec97","Type":"ContainerStarted","Data":"d812d750b887cc1c5d58b845aa47f081d132d56ab46a5eacb0664ea511d7f3b4"} Mar 10 08:02:44 crc kubenswrapper[4825]: I0310 08:02:44.458656 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"b069d964-330f-4562-9916-81cae7d0e72f","Type":"ContainerStarted","Data":"bc1ad0734a1040d85d5bf43a3c607324ea466ecc69bbaa5343c8633012967728"} Mar 10 08:02:55 crc kubenswrapper[4825]: I0310 08:02:55.236223 4825 scope.go:117] "RemoveContainer" containerID="856a46518228770fd6354414263218ba580cad9f6bf09966c52a899f2397881c" Mar 10 08:02:55 crc kubenswrapper[4825]: E0310 08:02:55.236884 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:03:08 crc kubenswrapper[4825]: E0310 08:03:08.652614 4825 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-rabbitmq:e43235cb19da04699a53f42b6a75afe9" Mar 10 08:03:08 crc kubenswrapper[4825]: E0310 08:03:08.653090 4825 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-rabbitmq:e43235cb19da04699a53f42b6a75afe9" Mar 10 08:03:08 crc kubenswrapper[4825]: E0310 08:03:08.653260 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-rabbitmq:e43235cb19da04699a53f42b6a75afe9,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 
's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8xtm8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,Stdin
Once:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(3d908280-2284-4dec-b8e3-67aae4d22314): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 08:03:08 crc kubenswrapper[4825]: E0310 08:03:08.654665 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="3d908280-2284-4dec-b8e3-67aae4d22314" Mar 10 08:03:09 crc kubenswrapper[4825]: I0310 08:03:09.236460 4825 scope.go:117] "RemoveContainer" containerID="856a46518228770fd6354414263218ba580cad9f6bf09966c52a899f2397881c" Mar 10 08:03:09 crc kubenswrapper[4825]: E0310 08:03:09.236912 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:03:09 crc kubenswrapper[4825]: I0310 08:03:09.635945 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"a6ce6631-6a7f-447f-9ea1-036ab13eec97","Type":"ContainerStarted","Data":"fc66216d08dbb478e3b84f78a9815ff1227cac185f918d74d7d11bae3306d7a3"} Mar 10 08:03:09 crc kubenswrapper[4825]: I0310 08:03:09.636063 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 10 08:03:09 crc kubenswrapper[4825]: I0310 08:03:09.638930 4825 generic.go:334] "Generic (PLEG): container finished" 
podID="33f2eee4-99a8-4399-9806-e8052cc13783" containerID="d52b34eef5d672615de94135a3a2065c29258eba1285564aa91446ed16bca575" exitCode=0 Mar 10 08:03:09 crc kubenswrapper[4825]: I0310 08:03:09.639013 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cbd84bb99-mjd6c" event={"ID":"33f2eee4-99a8-4399-9806-e8052cc13783","Type":"ContainerDied","Data":"d52b34eef5d672615de94135a3a2065c29258eba1285564aa91446ed16bca575"} Mar 10 08:03:09 crc kubenswrapper[4825]: I0310 08:03:09.641919 4825 generic.go:334] "Generic (PLEG): container finished" podID="0907fe91-e1a7-477e-849b-073dd367c69a" containerID="1bcbcfc70673a48e90a76270ba808224808f2c832392abee98e6d61e0294226a" exitCode=0 Mar 10 08:03:09 crc kubenswrapper[4825]: I0310 08:03:09.642023 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c68856ffc-mksv8" event={"ID":"0907fe91-e1a7-477e-849b-073dd367c69a","Type":"ContainerDied","Data":"1bcbcfc70673a48e90a76270ba808224808f2c832392abee98e6d61e0294226a"} Mar 10 08:03:09 crc kubenswrapper[4825]: I0310 08:03:09.645230 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b069d964-330f-4562-9916-81cae7d0e72f","Type":"ContainerStarted","Data":"d0dc1ca86ca45f7daffbaa3f2e1d9e9e63a20d9b6c565746ea0a085025beac2c"} Mar 10 08:03:09 crc kubenswrapper[4825]: I0310 08:03:09.647606 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"47fdbdcc-cae4-4261-9695-645201acdc61","Type":"ContainerStarted","Data":"69e79007bfa9fd75e767deb3f9c61c03085cedf70fbe350ce7928a2dcf6980e2"} Mar 10 08:03:09 crc kubenswrapper[4825]: I0310 08:03:09.650364 4825 generic.go:334] "Generic (PLEG): container finished" podID="c518751a-c456-49d7-998d-cb0b3b062057" containerID="e7d6873e8e3831a41ac7f6708d9adad4f406f74cde7701d36f864c4a6bc1f975" exitCode=0 Mar 10 08:03:09 crc kubenswrapper[4825]: I0310 08:03:09.650447 4825 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/dnsmasq-dns-6f79796479-jv592" event={"ID":"c518751a-c456-49d7-998d-cb0b3b062057","Type":"ContainerDied","Data":"e7d6873e8e3831a41ac7f6708d9adad4f406f74cde7701d36f864c4a6bc1f975"} Mar 10 08:03:09 crc kubenswrapper[4825]: I0310 08:03:09.653770 4825 generic.go:334] "Generic (PLEG): container finished" podID="55c11e3c-2b81-4e63-87f2-71e18805c676" containerID="c831e847b10fc73b9d3536510b87a575cea97c11dc1e9acaba09022333f201dc" exitCode=0 Mar 10 08:03:09 crc kubenswrapper[4825]: I0310 08:03:09.653893 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c87cfb955-vzt5j" event={"ID":"55c11e3c-2b81-4e63-87f2-71e18805c676","Type":"ContainerDied","Data":"c831e847b10fc73b9d3536510b87a575cea97c11dc1e9acaba09022333f201dc"} Mar 10 08:03:09 crc kubenswrapper[4825]: I0310 08:03:09.679118 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.419578336 podStartE2EDuration="26.67908974s" podCreationTimestamp="2026-03-10 08:02:43 +0000 UTC" firstStartedPulling="2026-03-10 08:02:44.424986673 +0000 UTC m=+4717.454767278" lastFinishedPulling="2026-03-10 08:03:08.684498067 +0000 UTC m=+4741.714278682" observedRunningTime="2026-03-10 08:03:09.662851704 +0000 UTC m=+4742.692632339" watchObservedRunningTime="2026-03-10 08:03:09.67908974 +0000 UTC m=+4742.708870395" Mar 10 08:03:10 crc kubenswrapper[4825]: I0310 08:03:10.115665 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cbd84bb99-mjd6c" Mar 10 08:03:10 crc kubenswrapper[4825]: I0310 08:03:10.120306 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f79796479-jv592" Mar 10 08:03:10 crc kubenswrapper[4825]: I0310 08:03:10.245163 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c518751a-c456-49d7-998d-cb0b3b062057-dns-svc\") pod \"c518751a-c456-49d7-998d-cb0b3b062057\" (UID: \"c518751a-c456-49d7-998d-cb0b3b062057\") " Mar 10 08:03:10 crc kubenswrapper[4825]: I0310 08:03:10.245301 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33f2eee4-99a8-4399-9806-e8052cc13783-config\") pod \"33f2eee4-99a8-4399-9806-e8052cc13783\" (UID: \"33f2eee4-99a8-4399-9806-e8052cc13783\") " Mar 10 08:03:10 crc kubenswrapper[4825]: I0310 08:03:10.245323 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9mrw\" (UniqueName: \"kubernetes.io/projected/c518751a-c456-49d7-998d-cb0b3b062057-kube-api-access-d9mrw\") pod \"c518751a-c456-49d7-998d-cb0b3b062057\" (UID: \"c518751a-c456-49d7-998d-cb0b3b062057\") " Mar 10 08:03:10 crc kubenswrapper[4825]: I0310 08:03:10.245341 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c518751a-c456-49d7-998d-cb0b3b062057-config\") pod \"c518751a-c456-49d7-998d-cb0b3b062057\" (UID: \"c518751a-c456-49d7-998d-cb0b3b062057\") " Mar 10 08:03:10 crc kubenswrapper[4825]: I0310 08:03:10.245372 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33f2eee4-99a8-4399-9806-e8052cc13783-dns-svc\") pod \"33f2eee4-99a8-4399-9806-e8052cc13783\" (UID: \"33f2eee4-99a8-4399-9806-e8052cc13783\") " Mar 10 08:03:10 crc kubenswrapper[4825]: I0310 08:03:10.245408 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfnm6\" (UniqueName: 
\"kubernetes.io/projected/33f2eee4-99a8-4399-9806-e8052cc13783-kube-api-access-nfnm6\") pod \"33f2eee4-99a8-4399-9806-e8052cc13783\" (UID: \"33f2eee4-99a8-4399-9806-e8052cc13783\") " Mar 10 08:03:10 crc kubenswrapper[4825]: I0310 08:03:10.250363 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33f2eee4-99a8-4399-9806-e8052cc13783-kube-api-access-nfnm6" (OuterVolumeSpecName: "kube-api-access-nfnm6") pod "33f2eee4-99a8-4399-9806-e8052cc13783" (UID: "33f2eee4-99a8-4399-9806-e8052cc13783"). InnerVolumeSpecName "kube-api-access-nfnm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:03:10 crc kubenswrapper[4825]: I0310 08:03:10.250534 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c518751a-c456-49d7-998d-cb0b3b062057-kube-api-access-d9mrw" (OuterVolumeSpecName: "kube-api-access-d9mrw") pod "c518751a-c456-49d7-998d-cb0b3b062057" (UID: "c518751a-c456-49d7-998d-cb0b3b062057"). InnerVolumeSpecName "kube-api-access-d9mrw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:03:10 crc kubenswrapper[4825]: I0310 08:03:10.263874 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33f2eee4-99a8-4399-9806-e8052cc13783-config" (OuterVolumeSpecName: "config") pod "33f2eee4-99a8-4399-9806-e8052cc13783" (UID: "33f2eee4-99a8-4399-9806-e8052cc13783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:03:10 crc kubenswrapper[4825]: I0310 08:03:10.347398 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33f2eee4-99a8-4399-9806-e8052cc13783-config\") on node \"crc\" DevicePath \"\"" Mar 10 08:03:10 crc kubenswrapper[4825]: I0310 08:03:10.347426 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9mrw\" (UniqueName: \"kubernetes.io/projected/c518751a-c456-49d7-998d-cb0b3b062057-kube-api-access-d9mrw\") on node \"crc\" DevicePath \"\"" Mar 10 08:03:10 crc kubenswrapper[4825]: I0310 08:03:10.347441 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfnm6\" (UniqueName: \"kubernetes.io/projected/33f2eee4-99a8-4399-9806-e8052cc13783-kube-api-access-nfnm6\") on node \"crc\" DevicePath \"\"" Mar 10 08:03:10 crc kubenswrapper[4825]: I0310 08:03:10.557230 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c518751a-c456-49d7-998d-cb0b3b062057-config" (OuterVolumeSpecName: "config") pod "c518751a-c456-49d7-998d-cb0b3b062057" (UID: "c518751a-c456-49d7-998d-cb0b3b062057"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:03:10 crc kubenswrapper[4825]: I0310 08:03:10.559605 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33f2eee4-99a8-4399-9806-e8052cc13783-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "33f2eee4-99a8-4399-9806-e8052cc13783" (UID: "33f2eee4-99a8-4399-9806-e8052cc13783"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:03:10 crc kubenswrapper[4825]: I0310 08:03:10.559898 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c518751a-c456-49d7-998d-cb0b3b062057-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c518751a-c456-49d7-998d-cb0b3b062057" (UID: "c518751a-c456-49d7-998d-cb0b3b062057"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:03:10 crc kubenswrapper[4825]: I0310 08:03:10.653235 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33f2eee4-99a8-4399-9806-e8052cc13783-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 08:03:10 crc kubenswrapper[4825]: I0310 08:03:10.653268 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c518751a-c456-49d7-998d-cb0b3b062057-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 08:03:10 crc kubenswrapper[4825]: I0310 08:03:10.653277 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c518751a-c456-49d7-998d-cb0b3b062057-config\") on node \"crc\" DevicePath \"\"" Mar 10 08:03:10 crc kubenswrapper[4825]: I0310 08:03:10.665322 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f79796479-jv592" event={"ID":"c518751a-c456-49d7-998d-cb0b3b062057","Type":"ContainerDied","Data":"b8ef583d5bad8b5ad4d4870752de4bffe13f1d04184822a4d2ee4a4f38f25f68"} Mar 10 08:03:10 crc kubenswrapper[4825]: I0310 08:03:10.665383 4825 scope.go:117] "RemoveContainer" containerID="e7d6873e8e3831a41ac7f6708d9adad4f406f74cde7701d36f864c4a6bc1f975" Mar 10 08:03:10 crc kubenswrapper[4825]: I0310 08:03:10.665532 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f79796479-jv592" Mar 10 08:03:10 crc kubenswrapper[4825]: I0310 08:03:10.672802 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a37c9d85-7482-45de-b47d-378f308feb54","Type":"ContainerStarted","Data":"ff3cda9d055df2d990e3bd4fac577a92655f2704e525cd7fe0d91738cac0ebbd"} Mar 10 08:03:10 crc kubenswrapper[4825]: I0310 08:03:10.674534 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c87cfb955-vzt5j" event={"ID":"55c11e3c-2b81-4e63-87f2-71e18805c676","Type":"ContainerStarted","Data":"ab829cd23f04929298245185a0f6837bde3c4ae1cb3ce366e60c7a9627bacf64"} Mar 10 08:03:10 crc kubenswrapper[4825]: I0310 08:03:10.674784 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-c87cfb955-vzt5j" Mar 10 08:03:10 crc kubenswrapper[4825]: I0310 08:03:10.676221 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cbd84bb99-mjd6c" event={"ID":"33f2eee4-99a8-4399-9806-e8052cc13783","Type":"ContainerDied","Data":"5e7963311f5d7b026908c6a7511dcfc795176a7d9b05ff0b67ec50147d05a83f"} Mar 10 08:03:10 crc kubenswrapper[4825]: I0310 08:03:10.676289 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cbd84bb99-mjd6c" Mar 10 08:03:10 crc kubenswrapper[4825]: I0310 08:03:10.677498 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c68856ffc-mksv8" event={"ID":"0907fe91-e1a7-477e-849b-073dd367c69a","Type":"ContainerStarted","Data":"d4beda8a540854d21c64c6c9793d7e6f6a684c28367c3513cbbbc2057eff9d64"} Mar 10 08:03:10 crc kubenswrapper[4825]: I0310 08:03:10.752836 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-c68856ffc-mksv8" podStartSLOduration=2.973043614 podStartE2EDuration="32.752818776s" podCreationTimestamp="2026-03-10 08:02:38 +0000 UTC" firstStartedPulling="2026-03-10 08:02:39.006618577 +0000 UTC m=+4712.036399192" lastFinishedPulling="2026-03-10 08:03:08.786393739 +0000 UTC m=+4741.816174354" observedRunningTime="2026-03-10 08:03:10.752679512 +0000 UTC m=+4743.782460127" watchObservedRunningTime="2026-03-10 08:03:10.752818776 +0000 UTC m=+4743.782599401" Mar 10 08:03:10 crc kubenswrapper[4825]: I0310 08:03:10.754853 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-c87cfb955-vzt5j" podStartSLOduration=4.210259507 podStartE2EDuration="32.754843649s" podCreationTimestamp="2026-03-10 08:02:38 +0000 UTC" firstStartedPulling="2026-03-10 08:02:40.187331958 +0000 UTC m=+4713.217112573" lastFinishedPulling="2026-03-10 08:03:08.7319161 +0000 UTC m=+4741.761696715" observedRunningTime="2026-03-10 08:03:10.734068364 +0000 UTC m=+4743.763848979" watchObservedRunningTime="2026-03-10 08:03:10.754843649 +0000 UTC m=+4743.784624274" Mar 10 08:03:10 crc kubenswrapper[4825]: I0310 08:03:10.772461 4825 scope.go:117] "RemoveContainer" containerID="d52b34eef5d672615de94135a3a2065c29258eba1285564aa91446ed16bca575" Mar 10 08:03:10 crc kubenswrapper[4825]: I0310 08:03:10.790371 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f79796479-jv592"] Mar 10 08:03:10 crc 
kubenswrapper[4825]: I0310 08:03:10.796667 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f79796479-jv592"] Mar 10 08:03:10 crc kubenswrapper[4825]: I0310 08:03:10.974912 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cbd84bb99-mjd6c"] Mar 10 08:03:10 crc kubenswrapper[4825]: I0310 08:03:10.980550 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6cbd84bb99-mjd6c"] Mar 10 08:03:11 crc kubenswrapper[4825]: I0310 08:03:11.249249 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33f2eee4-99a8-4399-9806-e8052cc13783" path="/var/lib/kubelet/pods/33f2eee4-99a8-4399-9806-e8052cc13783/volumes" Mar 10 08:03:11 crc kubenswrapper[4825]: I0310 08:03:11.250445 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c518751a-c456-49d7-998d-cb0b3b062057" path="/var/lib/kubelet/pods/c518751a-c456-49d7-998d-cb0b3b062057/volumes" Mar 10 08:03:11 crc kubenswrapper[4825]: I0310 08:03:11.689380 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3d908280-2284-4dec-b8e3-67aae4d22314","Type":"ContainerStarted","Data":"62f2f3f8a0cd389ca5004001736b8dc9ad79906fe8521e2892e77012e16f5b20"} Mar 10 08:03:11 crc kubenswrapper[4825]: I0310 08:03:11.692333 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-c68856ffc-mksv8" Mar 10 08:03:12 crc kubenswrapper[4825]: I0310 08:03:12.703186 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"47fdbdcc-cae4-4261-9695-645201acdc61","Type":"ContainerDied","Data":"69e79007bfa9fd75e767deb3f9c61c03085cedf70fbe350ce7928a2dcf6980e2"} Mar 10 08:03:12 crc kubenswrapper[4825]: I0310 08:03:12.703101 4825 generic.go:334] "Generic (PLEG): container finished" podID="47fdbdcc-cae4-4261-9695-645201acdc61" 
containerID="69e79007bfa9fd75e767deb3f9c61c03085cedf70fbe350ce7928a2dcf6980e2" exitCode=0 Mar 10 08:03:13 crc kubenswrapper[4825]: I0310 08:03:13.715221 4825 generic.go:334] "Generic (PLEG): container finished" podID="b069d964-330f-4562-9916-81cae7d0e72f" containerID="d0dc1ca86ca45f7daffbaa3f2e1d9e9e63a20d9b6c565746ea0a085025beac2c" exitCode=0 Mar 10 08:03:13 crc kubenswrapper[4825]: I0310 08:03:13.715283 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b069d964-330f-4562-9916-81cae7d0e72f","Type":"ContainerDied","Data":"d0dc1ca86ca45f7daffbaa3f2e1d9e9e63a20d9b6c565746ea0a085025beac2c"} Mar 10 08:03:13 crc kubenswrapper[4825]: I0310 08:03:13.718374 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 10 08:03:14 crc kubenswrapper[4825]: I0310 08:03:14.652292 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-c87cfb955-vzt5j" Mar 10 08:03:14 crc kubenswrapper[4825]: I0310 08:03:14.707322 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c68856ffc-mksv8"] Mar 10 08:03:14 crc kubenswrapper[4825]: I0310 08:03:14.707805 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-c68856ffc-mksv8" podUID="0907fe91-e1a7-477e-849b-073dd367c69a" containerName="dnsmasq-dns" containerID="cri-o://d4beda8a540854d21c64c6c9793d7e6f6a684c28367c3513cbbbc2057eff9d64" gracePeriod=10 Mar 10 08:03:14 crc kubenswrapper[4825]: I0310 08:03:14.713411 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-c68856ffc-mksv8" Mar 10 08:03:15 crc kubenswrapper[4825]: I0310 08:03:15.737250 4825 generic.go:334] "Generic (PLEG): container finished" podID="0907fe91-e1a7-477e-849b-073dd367c69a" containerID="d4beda8a540854d21c64c6c9793d7e6f6a684c28367c3513cbbbc2057eff9d64" exitCode=0 Mar 10 08:03:15 crc kubenswrapper[4825]: 
I0310 08:03:15.737339 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c68856ffc-mksv8" event={"ID":"0907fe91-e1a7-477e-849b-073dd367c69a","Type":"ContainerDied","Data":"d4beda8a540854d21c64c6c9793d7e6f6a684c28367c3513cbbbc2057eff9d64"} Mar 10 08:03:16 crc kubenswrapper[4825]: I0310 08:03:16.518868 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c68856ffc-mksv8" Mar 10 08:03:16 crc kubenswrapper[4825]: I0310 08:03:16.585837 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59lqj\" (UniqueName: \"kubernetes.io/projected/0907fe91-e1a7-477e-849b-073dd367c69a-kube-api-access-59lqj\") pod \"0907fe91-e1a7-477e-849b-073dd367c69a\" (UID: \"0907fe91-e1a7-477e-849b-073dd367c69a\") " Mar 10 08:03:16 crc kubenswrapper[4825]: I0310 08:03:16.585929 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0907fe91-e1a7-477e-849b-073dd367c69a-config\") pod \"0907fe91-e1a7-477e-849b-073dd367c69a\" (UID: \"0907fe91-e1a7-477e-849b-073dd367c69a\") " Mar 10 08:03:16 crc kubenswrapper[4825]: I0310 08:03:16.595430 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0907fe91-e1a7-477e-849b-073dd367c69a-kube-api-access-59lqj" (OuterVolumeSpecName: "kube-api-access-59lqj") pod "0907fe91-e1a7-477e-849b-073dd367c69a" (UID: "0907fe91-e1a7-477e-849b-073dd367c69a"). InnerVolumeSpecName "kube-api-access-59lqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:03:16 crc kubenswrapper[4825]: I0310 08:03:16.621283 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0907fe91-e1a7-477e-849b-073dd367c69a-config" (OuterVolumeSpecName: "config") pod "0907fe91-e1a7-477e-849b-073dd367c69a" (UID: "0907fe91-e1a7-477e-849b-073dd367c69a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:03:16 crc kubenswrapper[4825]: I0310 08:03:16.687397 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0907fe91-e1a7-477e-849b-073dd367c69a-config\") on node \"crc\" DevicePath \"\"" Mar 10 08:03:16 crc kubenswrapper[4825]: I0310 08:03:16.687428 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59lqj\" (UniqueName: \"kubernetes.io/projected/0907fe91-e1a7-477e-849b-073dd367c69a-kube-api-access-59lqj\") on node \"crc\" DevicePath \"\"" Mar 10 08:03:16 crc kubenswrapper[4825]: I0310 08:03:16.748358 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"47fdbdcc-cae4-4261-9695-645201acdc61","Type":"ContainerStarted","Data":"39e61343bcb6a82d6a88d2fbb4f429f8ab461319e7534fb1648d90e323595c1b"} Mar 10 08:03:16 crc kubenswrapper[4825]: I0310 08:03:16.751243 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c68856ffc-mksv8" event={"ID":"0907fe91-e1a7-477e-849b-073dd367c69a","Type":"ContainerDied","Data":"619623e308b219e11483661dae6025209b32e5d80cbe640dcb591a89489cd898"} Mar 10 08:03:16 crc kubenswrapper[4825]: I0310 08:03:16.751297 4825 scope.go:117] "RemoveContainer" containerID="d4beda8a540854d21c64c6c9793d7e6f6a684c28367c3513cbbbc2057eff9d64" Mar 10 08:03:16 crc kubenswrapper[4825]: I0310 08:03:16.751402 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c68856ffc-mksv8" Mar 10 08:03:16 crc kubenswrapper[4825]: I0310 08:03:16.759428 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b069d964-330f-4562-9916-81cae7d0e72f","Type":"ContainerStarted","Data":"56ab39b62172b26158c44919d77fda44d86572bdb827047ad538517b801adba2"} Mar 10 08:03:16 crc kubenswrapper[4825]: I0310 08:03:16.787796 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=10.207044202 podStartE2EDuration="36.787769175s" podCreationTimestamp="2026-03-10 08:02:40 +0000 UTC" firstStartedPulling="2026-03-10 08:02:42.1573806 +0000 UTC m=+4715.187161215" lastFinishedPulling="2026-03-10 08:03:08.738105583 +0000 UTC m=+4741.767886188" observedRunningTime="2026-03-10 08:03:16.772985498 +0000 UTC m=+4749.802766113" watchObservedRunningTime="2026-03-10 08:03:16.787769175 +0000 UTC m=+4749.817549810" Mar 10 08:03:16 crc kubenswrapper[4825]: I0310 08:03:16.803385 4825 scope.go:117] "RemoveContainer" containerID="1bcbcfc70673a48e90a76270ba808224808f2c832392abee98e6d61e0294226a" Mar 10 08:03:16 crc kubenswrapper[4825]: I0310 08:03:16.806512 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=10.975411369 podStartE2EDuration="35.806494416s" podCreationTimestamp="2026-03-10 08:02:41 +0000 UTC" firstStartedPulling="2026-03-10 08:02:44.000216259 +0000 UTC m=+4717.029996874" lastFinishedPulling="2026-03-10 08:03:08.831299286 +0000 UTC m=+4741.861079921" observedRunningTime="2026-03-10 08:03:16.801984088 +0000 UTC m=+4749.831764693" watchObservedRunningTime="2026-03-10 08:03:16.806494416 +0000 UTC m=+4749.836275031" Mar 10 08:03:16 crc kubenswrapper[4825]: I0310 08:03:16.834702 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c68856ffc-mksv8"] Mar 10 08:03:16 crc 
kubenswrapper[4825]: I0310 08:03:16.839898 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-c68856ffc-mksv8"] Mar 10 08:03:17 crc kubenswrapper[4825]: I0310 08:03:17.246415 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0907fe91-e1a7-477e-849b-073dd367c69a" path="/var/lib/kubelet/pods/0907fe91-e1a7-477e-849b-073dd367c69a/volumes" Mar 10 08:03:19 crc kubenswrapper[4825]: E0310 08:03:19.277279 4825 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.222:48958->38.102.83.222:42289: write tcp 38.102.83.222:48958->38.102.83.222:42289: write: broken pipe Mar 10 08:03:20 crc kubenswrapper[4825]: I0310 08:03:20.237516 4825 scope.go:117] "RemoveContainer" containerID="856a46518228770fd6354414263218ba580cad9f6bf09966c52a899f2397881c" Mar 10 08:03:20 crc kubenswrapper[4825]: E0310 08:03:20.237837 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:03:21 crc kubenswrapper[4825]: I0310 08:03:21.639468 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 10 08:03:21 crc kubenswrapper[4825]: I0310 08:03:21.639844 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 10 08:03:23 crc kubenswrapper[4825]: I0310 08:03:23.452294 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 10 08:03:23 crc kubenswrapper[4825]: I0310 08:03:23.452339 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/openstack-cell1-galera-0" Mar 10 08:03:23 crc kubenswrapper[4825]: I0310 08:03:23.525593 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 10 08:03:23 crc kubenswrapper[4825]: I0310 08:03:23.916413 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 10 08:03:23 crc kubenswrapper[4825]: I0310 08:03:23.933416 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 10 08:03:24 crc kubenswrapper[4825]: I0310 08:03:24.052505 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 10 08:03:30 crc kubenswrapper[4825]: I0310 08:03:30.227031 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-6w89m"] Mar 10 08:03:30 crc kubenswrapper[4825]: E0310 08:03:30.227515 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33f2eee4-99a8-4399-9806-e8052cc13783" containerName="init" Mar 10 08:03:30 crc kubenswrapper[4825]: I0310 08:03:30.227540 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="33f2eee4-99a8-4399-9806-e8052cc13783" containerName="init" Mar 10 08:03:30 crc kubenswrapper[4825]: E0310 08:03:30.227566 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0907fe91-e1a7-477e-849b-073dd367c69a" containerName="dnsmasq-dns" Mar 10 08:03:30 crc kubenswrapper[4825]: I0310 08:03:30.227582 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="0907fe91-e1a7-477e-849b-073dd367c69a" containerName="dnsmasq-dns" Mar 10 08:03:30 crc kubenswrapper[4825]: E0310 08:03:30.227594 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0907fe91-e1a7-477e-849b-073dd367c69a" containerName="init" Mar 10 08:03:30 crc kubenswrapper[4825]: I0310 08:03:30.227608 4825 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0907fe91-e1a7-477e-849b-073dd367c69a" containerName="init" Mar 10 08:03:30 crc kubenswrapper[4825]: E0310 08:03:30.227656 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c518751a-c456-49d7-998d-cb0b3b062057" containerName="init" Mar 10 08:03:30 crc kubenswrapper[4825]: I0310 08:03:30.227668 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="c518751a-c456-49d7-998d-cb0b3b062057" containerName="init" Mar 10 08:03:30 crc kubenswrapper[4825]: I0310 08:03:30.227930 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="33f2eee4-99a8-4399-9806-e8052cc13783" containerName="init" Mar 10 08:03:30 crc kubenswrapper[4825]: I0310 08:03:30.227957 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="c518751a-c456-49d7-998d-cb0b3b062057" containerName="init" Mar 10 08:03:30 crc kubenswrapper[4825]: I0310 08:03:30.227986 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="0907fe91-e1a7-477e-849b-073dd367c69a" containerName="dnsmasq-dns" Mar 10 08:03:30 crc kubenswrapper[4825]: I0310 08:03:30.228715 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-6w89m" Mar 10 08:03:30 crc kubenswrapper[4825]: I0310 08:03:30.231454 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 10 08:03:30 crc kubenswrapper[4825]: I0310 08:03:30.239172 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-6w89m"] Mar 10 08:03:30 crc kubenswrapper[4825]: I0310 08:03:30.328418 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17b84d4e-632d-4057-aebb-d7dafe30f2e1-operator-scripts\") pod \"root-account-create-update-6w89m\" (UID: \"17b84d4e-632d-4057-aebb-d7dafe30f2e1\") " pod="openstack/root-account-create-update-6w89m" Mar 10 08:03:30 crc kubenswrapper[4825]: I0310 08:03:30.328495 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcr4w\" (UniqueName: \"kubernetes.io/projected/17b84d4e-632d-4057-aebb-d7dafe30f2e1-kube-api-access-gcr4w\") pod \"root-account-create-update-6w89m\" (UID: \"17b84d4e-632d-4057-aebb-d7dafe30f2e1\") " pod="openstack/root-account-create-update-6w89m" Mar 10 08:03:30 crc kubenswrapper[4825]: I0310 08:03:30.429928 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17b84d4e-632d-4057-aebb-d7dafe30f2e1-operator-scripts\") pod \"root-account-create-update-6w89m\" (UID: \"17b84d4e-632d-4057-aebb-d7dafe30f2e1\") " pod="openstack/root-account-create-update-6w89m" Mar 10 08:03:30 crc kubenswrapper[4825]: I0310 08:03:30.430326 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcr4w\" (UniqueName: \"kubernetes.io/projected/17b84d4e-632d-4057-aebb-d7dafe30f2e1-kube-api-access-gcr4w\") pod \"root-account-create-update-6w89m\" (UID: 
\"17b84d4e-632d-4057-aebb-d7dafe30f2e1\") " pod="openstack/root-account-create-update-6w89m" Mar 10 08:03:30 crc kubenswrapper[4825]: I0310 08:03:30.431429 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17b84d4e-632d-4057-aebb-d7dafe30f2e1-operator-scripts\") pod \"root-account-create-update-6w89m\" (UID: \"17b84d4e-632d-4057-aebb-d7dafe30f2e1\") " pod="openstack/root-account-create-update-6w89m" Mar 10 08:03:30 crc kubenswrapper[4825]: I0310 08:03:30.449802 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcr4w\" (UniqueName: \"kubernetes.io/projected/17b84d4e-632d-4057-aebb-d7dafe30f2e1-kube-api-access-gcr4w\") pod \"root-account-create-update-6w89m\" (UID: \"17b84d4e-632d-4057-aebb-d7dafe30f2e1\") " pod="openstack/root-account-create-update-6w89m" Mar 10 08:03:30 crc kubenswrapper[4825]: I0310 08:03:30.550031 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-6w89m" Mar 10 08:03:30 crc kubenswrapper[4825]: I0310 08:03:30.990151 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-6w89m"] Mar 10 08:03:30 crc kubenswrapper[4825]: W0310 08:03:30.995937 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17b84d4e_632d_4057_aebb_d7dafe30f2e1.slice/crio-c3e91178f2722701c40e3dea2d64830b4083b98bfbbaf2c21afbc5e5705ef461 WatchSource:0}: Error finding container c3e91178f2722701c40e3dea2d64830b4083b98bfbbaf2c21afbc5e5705ef461: Status 404 returned error can't find the container with id c3e91178f2722701c40e3dea2d64830b4083b98bfbbaf2c21afbc5e5705ef461 Mar 10 08:03:31 crc kubenswrapper[4825]: I0310 08:03:31.917406 4825 generic.go:334] "Generic (PLEG): container finished" podID="17b84d4e-632d-4057-aebb-d7dafe30f2e1" containerID="22f9a4619cc3afadea5907a8b0d49e13d5f5e340f38d071e4d45e57275fe5daf" exitCode=0 Mar 10 08:03:31 crc kubenswrapper[4825]: I0310 08:03:31.917551 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-6w89m" event={"ID":"17b84d4e-632d-4057-aebb-d7dafe30f2e1","Type":"ContainerDied","Data":"22f9a4619cc3afadea5907a8b0d49e13d5f5e340f38d071e4d45e57275fe5daf"} Mar 10 08:03:31 crc kubenswrapper[4825]: I0310 08:03:31.917672 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-6w89m" event={"ID":"17b84d4e-632d-4057-aebb-d7dafe30f2e1","Type":"ContainerStarted","Data":"c3e91178f2722701c40e3dea2d64830b4083b98bfbbaf2c21afbc5e5705ef461"} Mar 10 08:03:33 crc kubenswrapper[4825]: I0310 08:03:33.346782 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-6w89m" Mar 10 08:03:33 crc kubenswrapper[4825]: I0310 08:03:33.476378 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17b84d4e-632d-4057-aebb-d7dafe30f2e1-operator-scripts\") pod \"17b84d4e-632d-4057-aebb-d7dafe30f2e1\" (UID: \"17b84d4e-632d-4057-aebb-d7dafe30f2e1\") " Mar 10 08:03:33 crc kubenswrapper[4825]: I0310 08:03:33.476472 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcr4w\" (UniqueName: \"kubernetes.io/projected/17b84d4e-632d-4057-aebb-d7dafe30f2e1-kube-api-access-gcr4w\") pod \"17b84d4e-632d-4057-aebb-d7dafe30f2e1\" (UID: \"17b84d4e-632d-4057-aebb-d7dafe30f2e1\") " Mar 10 08:03:33 crc kubenswrapper[4825]: I0310 08:03:33.477700 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17b84d4e-632d-4057-aebb-d7dafe30f2e1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "17b84d4e-632d-4057-aebb-d7dafe30f2e1" (UID: "17b84d4e-632d-4057-aebb-d7dafe30f2e1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:03:33 crc kubenswrapper[4825]: I0310 08:03:33.485543 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17b84d4e-632d-4057-aebb-d7dafe30f2e1-kube-api-access-gcr4w" (OuterVolumeSpecName: "kube-api-access-gcr4w") pod "17b84d4e-632d-4057-aebb-d7dafe30f2e1" (UID: "17b84d4e-632d-4057-aebb-d7dafe30f2e1"). InnerVolumeSpecName "kube-api-access-gcr4w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:03:33 crc kubenswrapper[4825]: I0310 08:03:33.578548 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17b84d4e-632d-4057-aebb-d7dafe30f2e1-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 08:03:33 crc kubenswrapper[4825]: I0310 08:03:33.578669 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcr4w\" (UniqueName: \"kubernetes.io/projected/17b84d4e-632d-4057-aebb-d7dafe30f2e1-kube-api-access-gcr4w\") on node \"crc\" DevicePath \"\"" Mar 10 08:03:33 crc kubenswrapper[4825]: I0310 08:03:33.939773 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-6w89m" event={"ID":"17b84d4e-632d-4057-aebb-d7dafe30f2e1","Type":"ContainerDied","Data":"c3e91178f2722701c40e3dea2d64830b4083b98bfbbaf2c21afbc5e5705ef461"} Mar 10 08:03:33 crc kubenswrapper[4825]: I0310 08:03:33.939820 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3e91178f2722701c40e3dea2d64830b4083b98bfbbaf2c21afbc5e5705ef461" Mar 10 08:03:33 crc kubenswrapper[4825]: I0310 08:03:33.939894 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-6w89m" Mar 10 08:03:34 crc kubenswrapper[4825]: I0310 08:03:34.236894 4825 scope.go:117] "RemoveContainer" containerID="856a46518228770fd6354414263218ba580cad9f6bf09966c52a899f2397881c" Mar 10 08:03:34 crc kubenswrapper[4825]: E0310 08:03:34.237593 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:03:36 crc kubenswrapper[4825]: I0310 08:03:36.483783 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-6w89m"] Mar 10 08:03:36 crc kubenswrapper[4825]: I0310 08:03:36.491763 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-6w89m"] Mar 10 08:03:37 crc kubenswrapper[4825]: I0310 08:03:37.261819 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17b84d4e-632d-4057-aebb-d7dafe30f2e1" path="/var/lib/kubelet/pods/17b84d4e-632d-4057-aebb-d7dafe30f2e1/volumes" Mar 10 08:03:41 crc kubenswrapper[4825]: I0310 08:03:41.496538 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-pbrmr"] Mar 10 08:03:41 crc kubenswrapper[4825]: E0310 08:03:41.498311 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17b84d4e-632d-4057-aebb-d7dafe30f2e1" containerName="mariadb-account-create-update" Mar 10 08:03:41 crc kubenswrapper[4825]: I0310 08:03:41.498445 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="17b84d4e-632d-4057-aebb-d7dafe30f2e1" containerName="mariadb-account-create-update" Mar 10 08:03:41 crc kubenswrapper[4825]: I0310 08:03:41.498747 4825 
memory_manager.go:354] "RemoveStaleState removing state" podUID="17b84d4e-632d-4057-aebb-d7dafe30f2e1" containerName="mariadb-account-create-update" Mar 10 08:03:41 crc kubenswrapper[4825]: I0310 08:03:41.500266 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-pbrmr" Mar 10 08:03:41 crc kubenswrapper[4825]: I0310 08:03:41.502895 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 10 08:03:41 crc kubenswrapper[4825]: I0310 08:03:41.509677 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-pbrmr"] Mar 10 08:03:41 crc kubenswrapper[4825]: I0310 08:03:41.615903 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c42f77e-dffc-402b-adce-0a83e6a92739-operator-scripts\") pod \"root-account-create-update-pbrmr\" (UID: \"9c42f77e-dffc-402b-adce-0a83e6a92739\") " pod="openstack/root-account-create-update-pbrmr" Mar 10 08:03:41 crc kubenswrapper[4825]: I0310 08:03:41.616253 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cgvc\" (UniqueName: \"kubernetes.io/projected/9c42f77e-dffc-402b-adce-0a83e6a92739-kube-api-access-6cgvc\") pod \"root-account-create-update-pbrmr\" (UID: \"9c42f77e-dffc-402b-adce-0a83e6a92739\") " pod="openstack/root-account-create-update-pbrmr" Mar 10 08:03:41 crc kubenswrapper[4825]: I0310 08:03:41.718332 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c42f77e-dffc-402b-adce-0a83e6a92739-operator-scripts\") pod \"root-account-create-update-pbrmr\" (UID: \"9c42f77e-dffc-402b-adce-0a83e6a92739\") " pod="openstack/root-account-create-update-pbrmr" Mar 10 08:03:41 crc kubenswrapper[4825]: I0310 
08:03:41.718863 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cgvc\" (UniqueName: \"kubernetes.io/projected/9c42f77e-dffc-402b-adce-0a83e6a92739-kube-api-access-6cgvc\") pod \"root-account-create-update-pbrmr\" (UID: \"9c42f77e-dffc-402b-adce-0a83e6a92739\") " pod="openstack/root-account-create-update-pbrmr" Mar 10 08:03:41 crc kubenswrapper[4825]: I0310 08:03:41.719986 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c42f77e-dffc-402b-adce-0a83e6a92739-operator-scripts\") pod \"root-account-create-update-pbrmr\" (UID: \"9c42f77e-dffc-402b-adce-0a83e6a92739\") " pod="openstack/root-account-create-update-pbrmr" Mar 10 08:03:41 crc kubenswrapper[4825]: I0310 08:03:41.759717 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cgvc\" (UniqueName: \"kubernetes.io/projected/9c42f77e-dffc-402b-adce-0a83e6a92739-kube-api-access-6cgvc\") pod \"root-account-create-update-pbrmr\" (UID: \"9c42f77e-dffc-402b-adce-0a83e6a92739\") " pod="openstack/root-account-create-update-pbrmr" Mar 10 08:03:41 crc kubenswrapper[4825]: I0310 08:03:41.822543 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-pbrmr" Mar 10 08:03:42 crc kubenswrapper[4825]: I0310 08:03:42.015876 4825 generic.go:334] "Generic (PLEG): container finished" podID="a37c9d85-7482-45de-b47d-378f308feb54" containerID="ff3cda9d055df2d990e3bd4fac577a92655f2704e525cd7fe0d91738cac0ebbd" exitCode=0 Mar 10 08:03:42 crc kubenswrapper[4825]: I0310 08:03:42.015962 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a37c9d85-7482-45de-b47d-378f308feb54","Type":"ContainerDied","Data":"ff3cda9d055df2d990e3bd4fac577a92655f2704e525cd7fe0d91738cac0ebbd"} Mar 10 08:03:42 crc kubenswrapper[4825]: E0310 08:03:42.075846 4825 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda37c9d85_7482_45de_b47d_378f308feb54.slice/crio-ff3cda9d055df2d990e3bd4fac577a92655f2704e525cd7fe0d91738cac0ebbd.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda37c9d85_7482_45de_b47d_378f308feb54.slice/crio-conmon-ff3cda9d055df2d990e3bd4fac577a92655f2704e525cd7fe0d91738cac0ebbd.scope\": RecentStats: unable to find data in memory cache]" Mar 10 08:03:42 crc kubenswrapper[4825]: I0310 08:03:42.267419 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-pbrmr"] Mar 10 08:03:42 crc kubenswrapper[4825]: W0310 08:03:42.272369 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c42f77e_dffc_402b_adce_0a83e6a92739.slice/crio-0437a969e85c997b18c8dd4724b2307ca6f6d380b1b39b9fea05c96ab3ea23e4 WatchSource:0}: Error finding container 0437a969e85c997b18c8dd4724b2307ca6f6d380b1b39b9fea05c96ab3ea23e4: Status 404 returned error can't find the container with id 0437a969e85c997b18c8dd4724b2307ca6f6d380b1b39b9fea05c96ab3ea23e4 
Mar 10 08:03:43 crc kubenswrapper[4825]: I0310 08:03:43.027386 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a37c9d85-7482-45de-b47d-378f308feb54","Type":"ContainerStarted","Data":"bb840d5d1875e6c200d4ddc418614e59596370b2603d0997ee7dbb06712b1774"} Mar 10 08:03:43 crc kubenswrapper[4825]: I0310 08:03:43.029214 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 10 08:03:43 crc kubenswrapper[4825]: I0310 08:03:43.029913 4825 generic.go:334] "Generic (PLEG): container finished" podID="9c42f77e-dffc-402b-adce-0a83e6a92739" containerID="3a08e1311949cfde24b33789a2fdb7b4c86c8ab8c13f514d2c8de1edcbafb48a" exitCode=0 Mar 10 08:03:43 crc kubenswrapper[4825]: I0310 08:03:43.029966 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-pbrmr" event={"ID":"9c42f77e-dffc-402b-adce-0a83e6a92739","Type":"ContainerDied","Data":"3a08e1311949cfde24b33789a2fdb7b4c86c8ab8c13f514d2c8de1edcbafb48a"} Mar 10 08:03:43 crc kubenswrapper[4825]: I0310 08:03:43.029995 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-pbrmr" event={"ID":"9c42f77e-dffc-402b-adce-0a83e6a92739","Type":"ContainerStarted","Data":"0437a969e85c997b18c8dd4724b2307ca6f6d380b1b39b9fea05c96ab3ea23e4"} Mar 10 08:03:43 crc kubenswrapper[4825]: I0310 08:03:43.058761 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.786841849 podStartE2EDuration="1m5.058737623s" podCreationTimestamp="2026-03-10 08:02:38 +0000 UTC" firstStartedPulling="2026-03-10 08:02:40.516967039 +0000 UTC m=+4713.546747654" lastFinishedPulling="2026-03-10 08:03:08.788862803 +0000 UTC m=+4741.818643428" observedRunningTime="2026-03-10 08:03:43.053465245 +0000 UTC m=+4776.083245910" watchObservedRunningTime="2026-03-10 08:03:43.058737623 +0000 UTC m=+4776.088518238" Mar 10 08:03:44 crc 
kubenswrapper[4825]: I0310 08:03:44.039753 4825 generic.go:334] "Generic (PLEG): container finished" podID="3d908280-2284-4dec-b8e3-67aae4d22314" containerID="62f2f3f8a0cd389ca5004001736b8dc9ad79906fe8521e2892e77012e16f5b20" exitCode=0 Mar 10 08:03:44 crc kubenswrapper[4825]: I0310 08:03:44.040007 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3d908280-2284-4dec-b8e3-67aae4d22314","Type":"ContainerDied","Data":"62f2f3f8a0cd389ca5004001736b8dc9ad79906fe8521e2892e77012e16f5b20"} Mar 10 08:03:44 crc kubenswrapper[4825]: I0310 08:03:44.412593 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-pbrmr" Mar 10 08:03:44 crc kubenswrapper[4825]: I0310 08:03:44.593088 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cgvc\" (UniqueName: \"kubernetes.io/projected/9c42f77e-dffc-402b-adce-0a83e6a92739-kube-api-access-6cgvc\") pod \"9c42f77e-dffc-402b-adce-0a83e6a92739\" (UID: \"9c42f77e-dffc-402b-adce-0a83e6a92739\") " Mar 10 08:03:44 crc kubenswrapper[4825]: I0310 08:03:44.593244 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c42f77e-dffc-402b-adce-0a83e6a92739-operator-scripts\") pod \"9c42f77e-dffc-402b-adce-0a83e6a92739\" (UID: \"9c42f77e-dffc-402b-adce-0a83e6a92739\") " Mar 10 08:03:44 crc kubenswrapper[4825]: I0310 08:03:44.593788 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c42f77e-dffc-402b-adce-0a83e6a92739-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9c42f77e-dffc-402b-adce-0a83e6a92739" (UID: "9c42f77e-dffc-402b-adce-0a83e6a92739"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:03:44 crc kubenswrapper[4825]: I0310 08:03:44.599561 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c42f77e-dffc-402b-adce-0a83e6a92739-kube-api-access-6cgvc" (OuterVolumeSpecName: "kube-api-access-6cgvc") pod "9c42f77e-dffc-402b-adce-0a83e6a92739" (UID: "9c42f77e-dffc-402b-adce-0a83e6a92739"). InnerVolumeSpecName "kube-api-access-6cgvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:03:44 crc kubenswrapper[4825]: I0310 08:03:44.695589 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c42f77e-dffc-402b-adce-0a83e6a92739-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 08:03:44 crc kubenswrapper[4825]: I0310 08:03:44.695632 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cgvc\" (UniqueName: \"kubernetes.io/projected/9c42f77e-dffc-402b-adce-0a83e6a92739-kube-api-access-6cgvc\") on node \"crc\" DevicePath \"\"" Mar 10 08:03:45 crc kubenswrapper[4825]: I0310 08:03:45.049502 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-pbrmr" Mar 10 08:03:45 crc kubenswrapper[4825]: I0310 08:03:45.049488 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-pbrmr" event={"ID":"9c42f77e-dffc-402b-adce-0a83e6a92739","Type":"ContainerDied","Data":"0437a969e85c997b18c8dd4724b2307ca6f6d380b1b39b9fea05c96ab3ea23e4"} Mar 10 08:03:45 crc kubenswrapper[4825]: I0310 08:03:45.049615 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0437a969e85c997b18c8dd4724b2307ca6f6d380b1b39b9fea05c96ab3ea23e4" Mar 10 08:03:45 crc kubenswrapper[4825]: I0310 08:03:45.052129 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3d908280-2284-4dec-b8e3-67aae4d22314","Type":"ContainerStarted","Data":"b4d61c6344a19f965ce9a8fefe891013ab0e02c9e7d0a5bcef488c45b2732e0d"} Mar 10 08:03:45 crc kubenswrapper[4825]: I0310 08:03:45.052389 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:03:45 crc kubenswrapper[4825]: I0310 08:03:45.081864 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=-9223371970.772934 podStartE2EDuration="1m6.081842307s" podCreationTimestamp="2026-03-10 08:02:39 +0000 UTC" firstStartedPulling="2026-03-10 08:02:41.06593477 +0000 UTC m=+4714.095715385" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:03:45.077287228 +0000 UTC m=+4778.107067853" watchObservedRunningTime="2026-03-10 08:03:45.081842307 +0000 UTC m=+4778.111622932" Mar 10 08:03:46 crc kubenswrapper[4825]: I0310 08:03:46.236052 4825 scope.go:117] "RemoveContainer" containerID="856a46518228770fd6354414263218ba580cad9f6bf09966c52a899f2397881c" Mar 10 08:03:46 crc kubenswrapper[4825]: E0310 08:03:46.236566 4825 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:03:57 crc kubenswrapper[4825]: I0310 08:03:57.237059 4825 scope.go:117] "RemoveContainer" containerID="856a46518228770fd6354414263218ba580cad9f6bf09966c52a899f2397881c" Mar 10 08:03:57 crc kubenswrapper[4825]: E0310 08:03:57.238077 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:03:59 crc kubenswrapper[4825]: I0310 08:03:59.935425 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 10 08:04:00 crc kubenswrapper[4825]: I0310 08:04:00.157490 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552164-vv99d"] Mar 10 08:04:00 crc kubenswrapper[4825]: E0310 08:04:00.158033 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c42f77e-dffc-402b-adce-0a83e6a92739" containerName="mariadb-account-create-update" Mar 10 08:04:00 crc kubenswrapper[4825]: I0310 08:04:00.158066 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c42f77e-dffc-402b-adce-0a83e6a92739" containerName="mariadb-account-create-update" Mar 10 08:04:00 crc kubenswrapper[4825]: I0310 08:04:00.158317 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c42f77e-dffc-402b-adce-0a83e6a92739" 
containerName="mariadb-account-create-update" Mar 10 08:04:00 crc kubenswrapper[4825]: I0310 08:04:00.159238 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552164-vv99d" Mar 10 08:04:00 crc kubenswrapper[4825]: I0310 08:04:00.166776 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 08:04:00 crc kubenswrapper[4825]: I0310 08:04:00.166827 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 08:04:00 crc kubenswrapper[4825]: I0310 08:04:00.166914 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 08:04:00 crc kubenswrapper[4825]: I0310 08:04:00.177629 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552164-vv99d"] Mar 10 08:04:00 crc kubenswrapper[4825]: I0310 08:04:00.349627 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc5dj\" (UniqueName: \"kubernetes.io/projected/13750d54-cc8a-459d-b9b0-ff2a1c678421-kube-api-access-nc5dj\") pod \"auto-csr-approver-29552164-vv99d\" (UID: \"13750d54-cc8a-459d-b9b0-ff2a1c678421\") " pod="openshift-infra/auto-csr-approver-29552164-vv99d" Mar 10 08:04:00 crc kubenswrapper[4825]: I0310 08:04:00.451651 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nc5dj\" (UniqueName: \"kubernetes.io/projected/13750d54-cc8a-459d-b9b0-ff2a1c678421-kube-api-access-nc5dj\") pod \"auto-csr-approver-29552164-vv99d\" (UID: \"13750d54-cc8a-459d-b9b0-ff2a1c678421\") " pod="openshift-infra/auto-csr-approver-29552164-vv99d" Mar 10 08:04:00 crc kubenswrapper[4825]: I0310 08:04:00.490323 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc5dj\" (UniqueName: 
\"kubernetes.io/projected/13750d54-cc8a-459d-b9b0-ff2a1c678421-kube-api-access-nc5dj\") pod \"auto-csr-approver-29552164-vv99d\" (UID: \"13750d54-cc8a-459d-b9b0-ff2a1c678421\") " pod="openshift-infra/auto-csr-approver-29552164-vv99d" Mar 10 08:04:00 crc kubenswrapper[4825]: I0310 08:04:00.522407 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:04:00 crc kubenswrapper[4825]: I0310 08:04:00.786193 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552164-vv99d" Mar 10 08:04:01 crc kubenswrapper[4825]: I0310 08:04:01.213412 4825 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 08:04:01 crc kubenswrapper[4825]: I0310 08:04:01.224832 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552164-vv99d"] Mar 10 08:04:02 crc kubenswrapper[4825]: I0310 08:04:02.150875 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-llk5l"] Mar 10 08:04:02 crc kubenswrapper[4825]: I0310 08:04:02.152666 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-llk5l" Mar 10 08:04:02 crc kubenswrapper[4825]: I0310 08:04:02.173794 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-llk5l"] Mar 10 08:04:02 crc kubenswrapper[4825]: I0310 08:04:02.201175 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552164-vv99d" event={"ID":"13750d54-cc8a-459d-b9b0-ff2a1c678421","Type":"ContainerStarted","Data":"282509ba1b2798039b1b8c27f71192649a785e56c30195793124e67bd29d2c12"} Mar 10 08:04:02 crc kubenswrapper[4825]: I0310 08:04:02.278922 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13b2fa50-d9a8-4569-99da-4eac1f129875-utilities\") pod \"community-operators-llk5l\" (UID: \"13b2fa50-d9a8-4569-99da-4eac1f129875\") " pod="openshift-marketplace/community-operators-llk5l" Mar 10 08:04:02 crc kubenswrapper[4825]: I0310 08:04:02.278986 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13b2fa50-d9a8-4569-99da-4eac1f129875-catalog-content\") pod \"community-operators-llk5l\" (UID: \"13b2fa50-d9a8-4569-99da-4eac1f129875\") " pod="openshift-marketplace/community-operators-llk5l" Mar 10 08:04:02 crc kubenswrapper[4825]: I0310 08:04:02.279011 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx4hp\" (UniqueName: \"kubernetes.io/projected/13b2fa50-d9a8-4569-99da-4eac1f129875-kube-api-access-gx4hp\") pod \"community-operators-llk5l\" (UID: \"13b2fa50-d9a8-4569-99da-4eac1f129875\") " pod="openshift-marketplace/community-operators-llk5l" Mar 10 08:04:02 crc kubenswrapper[4825]: I0310 08:04:02.380367 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/13b2fa50-d9a8-4569-99da-4eac1f129875-utilities\") pod \"community-operators-llk5l\" (UID: \"13b2fa50-d9a8-4569-99da-4eac1f129875\") " pod="openshift-marketplace/community-operators-llk5l" Mar 10 08:04:02 crc kubenswrapper[4825]: I0310 08:04:02.380432 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13b2fa50-d9a8-4569-99da-4eac1f129875-catalog-content\") pod \"community-operators-llk5l\" (UID: \"13b2fa50-d9a8-4569-99da-4eac1f129875\") " pod="openshift-marketplace/community-operators-llk5l" Mar 10 08:04:02 crc kubenswrapper[4825]: I0310 08:04:02.380470 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gx4hp\" (UniqueName: \"kubernetes.io/projected/13b2fa50-d9a8-4569-99da-4eac1f129875-kube-api-access-gx4hp\") pod \"community-operators-llk5l\" (UID: \"13b2fa50-d9a8-4569-99da-4eac1f129875\") " pod="openshift-marketplace/community-operators-llk5l" Mar 10 08:04:02 crc kubenswrapper[4825]: I0310 08:04:02.380896 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13b2fa50-d9a8-4569-99da-4eac1f129875-utilities\") pod \"community-operators-llk5l\" (UID: \"13b2fa50-d9a8-4569-99da-4eac1f129875\") " pod="openshift-marketplace/community-operators-llk5l" Mar 10 08:04:02 crc kubenswrapper[4825]: I0310 08:04:02.380917 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13b2fa50-d9a8-4569-99da-4eac1f129875-catalog-content\") pod \"community-operators-llk5l\" (UID: \"13b2fa50-d9a8-4569-99da-4eac1f129875\") " pod="openshift-marketplace/community-operators-llk5l" Mar 10 08:04:02 crc kubenswrapper[4825]: I0310 08:04:02.402627 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx4hp\" (UniqueName: 
\"kubernetes.io/projected/13b2fa50-d9a8-4569-99da-4eac1f129875-kube-api-access-gx4hp\") pod \"community-operators-llk5l\" (UID: \"13b2fa50-d9a8-4569-99da-4eac1f129875\") " pod="openshift-marketplace/community-operators-llk5l" Mar 10 08:04:02 crc kubenswrapper[4825]: I0310 08:04:02.475378 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-llk5l" Mar 10 08:04:02 crc kubenswrapper[4825]: I0310 08:04:02.742558 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-llk5l"] Mar 10 08:04:02 crc kubenswrapper[4825]: W0310 08:04:02.750616 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13b2fa50_d9a8_4569_99da_4eac1f129875.slice/crio-69c1ff9d39de84fd184724c0585b7ce78f9c55849a1b2bcc394a23f1b59096a0 WatchSource:0}: Error finding container 69c1ff9d39de84fd184724c0585b7ce78f9c55849a1b2bcc394a23f1b59096a0: Status 404 returned error can't find the container with id 69c1ff9d39de84fd184724c0585b7ce78f9c55849a1b2bcc394a23f1b59096a0 Mar 10 08:04:03 crc kubenswrapper[4825]: I0310 08:04:03.210819 4825 generic.go:334] "Generic (PLEG): container finished" podID="13b2fa50-d9a8-4569-99da-4eac1f129875" containerID="ac4794aaf7631f82d8f397e86a77f5c7372523e4b0302c9be0bb27d369aef29f" exitCode=0 Mar 10 08:04:03 crc kubenswrapper[4825]: I0310 08:04:03.210873 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llk5l" event={"ID":"13b2fa50-d9a8-4569-99da-4eac1f129875","Type":"ContainerDied","Data":"ac4794aaf7631f82d8f397e86a77f5c7372523e4b0302c9be0bb27d369aef29f"} Mar 10 08:04:03 crc kubenswrapper[4825]: I0310 08:04:03.211637 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llk5l" 
event={"ID":"13b2fa50-d9a8-4569-99da-4eac1f129875","Type":"ContainerStarted","Data":"69c1ff9d39de84fd184724c0585b7ce78f9c55849a1b2bcc394a23f1b59096a0"} Mar 10 08:04:03 crc kubenswrapper[4825]: I0310 08:04:03.213914 4825 generic.go:334] "Generic (PLEG): container finished" podID="13750d54-cc8a-459d-b9b0-ff2a1c678421" containerID="2a1b30f0ff9c8feeb518a52f0391c45ccadded9b6a8f5dd536dcfe1450db78b7" exitCode=0 Mar 10 08:04:03 crc kubenswrapper[4825]: I0310 08:04:03.213984 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552164-vv99d" event={"ID":"13750d54-cc8a-459d-b9b0-ff2a1c678421","Type":"ContainerDied","Data":"2a1b30f0ff9c8feeb518a52f0391c45ccadded9b6a8f5dd536dcfe1450db78b7"} Mar 10 08:04:04 crc kubenswrapper[4825]: I0310 08:04:04.244161 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llk5l" event={"ID":"13b2fa50-d9a8-4569-99da-4eac1f129875","Type":"ContainerStarted","Data":"6acc051bbfaca4a37c3bcec6a5b398c7ef48b245682239ef40972adc232cb8fc"} Mar 10 08:04:04 crc kubenswrapper[4825]: I0310 08:04:04.595775 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552164-vv99d" Mar 10 08:04:04 crc kubenswrapper[4825]: I0310 08:04:04.719989 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nc5dj\" (UniqueName: \"kubernetes.io/projected/13750d54-cc8a-459d-b9b0-ff2a1c678421-kube-api-access-nc5dj\") pod \"13750d54-cc8a-459d-b9b0-ff2a1c678421\" (UID: \"13750d54-cc8a-459d-b9b0-ff2a1c678421\") " Mar 10 08:04:04 crc kubenswrapper[4825]: I0310 08:04:04.726076 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13750d54-cc8a-459d-b9b0-ff2a1c678421-kube-api-access-nc5dj" (OuterVolumeSpecName: "kube-api-access-nc5dj") pod "13750d54-cc8a-459d-b9b0-ff2a1c678421" (UID: "13750d54-cc8a-459d-b9b0-ff2a1c678421"). 
InnerVolumeSpecName "kube-api-access-nc5dj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:04:04 crc kubenswrapper[4825]: I0310 08:04:04.821747 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nc5dj\" (UniqueName: \"kubernetes.io/projected/13750d54-cc8a-459d-b9b0-ff2a1c678421-kube-api-access-nc5dj\") on node \"crc\" DevicePath \"\"" Mar 10 08:04:05 crc kubenswrapper[4825]: I0310 08:04:05.256328 4825 generic.go:334] "Generic (PLEG): container finished" podID="13b2fa50-d9a8-4569-99da-4eac1f129875" containerID="6acc051bbfaca4a37c3bcec6a5b398c7ef48b245682239ef40972adc232cb8fc" exitCode=0 Mar 10 08:04:05 crc kubenswrapper[4825]: I0310 08:04:05.256446 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llk5l" event={"ID":"13b2fa50-d9a8-4569-99da-4eac1f129875","Type":"ContainerDied","Data":"6acc051bbfaca4a37c3bcec6a5b398c7ef48b245682239ef40972adc232cb8fc"} Mar 10 08:04:05 crc kubenswrapper[4825]: I0310 08:04:05.262824 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552164-vv99d" event={"ID":"13750d54-cc8a-459d-b9b0-ff2a1c678421","Type":"ContainerDied","Data":"282509ba1b2798039b1b8c27f71192649a785e56c30195793124e67bd29d2c12"} Mar 10 08:04:05 crc kubenswrapper[4825]: I0310 08:04:05.262878 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="282509ba1b2798039b1b8c27f71192649a785e56c30195793124e67bd29d2c12" Mar 10 08:04:05 crc kubenswrapper[4825]: I0310 08:04:05.262971 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552164-vv99d" Mar 10 08:04:05 crc kubenswrapper[4825]: I0310 08:04:05.668038 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552158-lkgxj"] Mar 10 08:04:05 crc kubenswrapper[4825]: I0310 08:04:05.673859 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552158-lkgxj"] Mar 10 08:04:06 crc kubenswrapper[4825]: I0310 08:04:06.273690 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llk5l" event={"ID":"13b2fa50-d9a8-4569-99da-4eac1f129875","Type":"ContainerStarted","Data":"69c35b38e190dcd61755c574a31a0eeb308808606b0e1359c07a00e1acbc4e62"} Mar 10 08:04:06 crc kubenswrapper[4825]: I0310 08:04:06.295713 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-llk5l" podStartSLOduration=1.641957908 podStartE2EDuration="4.295678941s" podCreationTimestamp="2026-03-10 08:04:02 +0000 UTC" firstStartedPulling="2026-03-10 08:04:03.212460129 +0000 UTC m=+4796.242240744" lastFinishedPulling="2026-03-10 08:04:05.866181162 +0000 UTC m=+4798.895961777" observedRunningTime="2026-03-10 08:04:06.288422461 +0000 UTC m=+4799.318203096" watchObservedRunningTime="2026-03-10 08:04:06.295678941 +0000 UTC m=+4799.325459596" Mar 10 08:04:07 crc kubenswrapper[4825]: I0310 08:04:07.248725 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fa50eb5-9f97-4136-9803-59496aa2698f" path="/var/lib/kubelet/pods/4fa50eb5-9f97-4136-9803-59496aa2698f/volumes" Mar 10 08:04:09 crc kubenswrapper[4825]: I0310 08:04:09.261211 4825 scope.go:117] "RemoveContainer" containerID="856a46518228770fd6354414263218ba580cad9f6bf09966c52a899f2397881c" Mar 10 08:04:09 crc kubenswrapper[4825]: E0310 08:04:09.262039 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:04:09 crc kubenswrapper[4825]: I0310 08:04:09.557396 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76d4984c65-sjjvn"] Mar 10 08:04:09 crc kubenswrapper[4825]: E0310 08:04:09.557748 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13750d54-cc8a-459d-b9b0-ff2a1c678421" containerName="oc" Mar 10 08:04:09 crc kubenswrapper[4825]: I0310 08:04:09.557769 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="13750d54-cc8a-459d-b9b0-ff2a1c678421" containerName="oc" Mar 10 08:04:09 crc kubenswrapper[4825]: I0310 08:04:09.557963 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="13750d54-cc8a-459d-b9b0-ff2a1c678421" containerName="oc" Mar 10 08:04:09 crc kubenswrapper[4825]: I0310 08:04:09.558836 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76d4984c65-sjjvn" Mar 10 08:04:09 crc kubenswrapper[4825]: I0310 08:04:09.578475 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76d4984c65-sjjvn"] Mar 10 08:04:09 crc kubenswrapper[4825]: I0310 08:04:09.704212 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrlb6\" (UniqueName: \"kubernetes.io/projected/278f6673-3940-4a04-839d-28ce27000afa-kube-api-access-wrlb6\") pod \"dnsmasq-dns-76d4984c65-sjjvn\" (UID: \"278f6673-3940-4a04-839d-28ce27000afa\") " pod="openstack/dnsmasq-dns-76d4984c65-sjjvn" Mar 10 08:04:09 crc kubenswrapper[4825]: I0310 08:04:09.704286 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/278f6673-3940-4a04-839d-28ce27000afa-config\") pod \"dnsmasq-dns-76d4984c65-sjjvn\" (UID: \"278f6673-3940-4a04-839d-28ce27000afa\") " pod="openstack/dnsmasq-dns-76d4984c65-sjjvn" Mar 10 08:04:09 crc kubenswrapper[4825]: I0310 08:04:09.704438 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/278f6673-3940-4a04-839d-28ce27000afa-dns-svc\") pod \"dnsmasq-dns-76d4984c65-sjjvn\" (UID: \"278f6673-3940-4a04-839d-28ce27000afa\") " pod="openstack/dnsmasq-dns-76d4984c65-sjjvn" Mar 10 08:04:09 crc kubenswrapper[4825]: I0310 08:04:09.805494 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/278f6673-3940-4a04-839d-28ce27000afa-config\") pod \"dnsmasq-dns-76d4984c65-sjjvn\" (UID: \"278f6673-3940-4a04-839d-28ce27000afa\") " pod="openstack/dnsmasq-dns-76d4984c65-sjjvn" Mar 10 08:04:09 crc kubenswrapper[4825]: I0310 08:04:09.806423 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/278f6673-3940-4a04-839d-28ce27000afa-dns-svc\") pod \"dnsmasq-dns-76d4984c65-sjjvn\" (UID: \"278f6673-3940-4a04-839d-28ce27000afa\") " pod="openstack/dnsmasq-dns-76d4984c65-sjjvn" Mar 10 08:04:09 crc kubenswrapper[4825]: I0310 08:04:09.806489 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/278f6673-3940-4a04-839d-28ce27000afa-config\") pod \"dnsmasq-dns-76d4984c65-sjjvn\" (UID: \"278f6673-3940-4a04-839d-28ce27000afa\") " pod="openstack/dnsmasq-dns-76d4984c65-sjjvn" Mar 10 08:04:09 crc kubenswrapper[4825]: I0310 08:04:09.806681 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrlb6\" (UniqueName: \"kubernetes.io/projected/278f6673-3940-4a04-839d-28ce27000afa-kube-api-access-wrlb6\") pod \"dnsmasq-dns-76d4984c65-sjjvn\" (UID: \"278f6673-3940-4a04-839d-28ce27000afa\") " pod="openstack/dnsmasq-dns-76d4984c65-sjjvn" Mar 10 08:04:09 crc kubenswrapper[4825]: I0310 08:04:09.807257 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/278f6673-3940-4a04-839d-28ce27000afa-dns-svc\") pod \"dnsmasq-dns-76d4984c65-sjjvn\" (UID: \"278f6673-3940-4a04-839d-28ce27000afa\") " pod="openstack/dnsmasq-dns-76d4984c65-sjjvn" Mar 10 08:04:09 crc kubenswrapper[4825]: I0310 08:04:09.825126 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrlb6\" (UniqueName: \"kubernetes.io/projected/278f6673-3940-4a04-839d-28ce27000afa-kube-api-access-wrlb6\") pod \"dnsmasq-dns-76d4984c65-sjjvn\" (UID: \"278f6673-3940-4a04-839d-28ce27000afa\") " pod="openstack/dnsmasq-dns-76d4984c65-sjjvn" Mar 10 08:04:09 crc kubenswrapper[4825]: I0310 08:04:09.878769 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76d4984c65-sjjvn" Mar 10 08:04:10 crc kubenswrapper[4825]: I0310 08:04:10.409089 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76d4984c65-sjjvn"] Mar 10 08:04:10 crc kubenswrapper[4825]: W0310 08:04:10.417179 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod278f6673_3940_4a04_839d_28ce27000afa.slice/crio-d88e633dce5e2c88d7540ddfe3a3c3cb18ed7fea0a376bbf0cfaf794f3b204bc WatchSource:0}: Error finding container d88e633dce5e2c88d7540ddfe3a3c3cb18ed7fea0a376bbf0cfaf794f3b204bc: Status 404 returned error can't find the container with id d88e633dce5e2c88d7540ddfe3a3c3cb18ed7fea0a376bbf0cfaf794f3b204bc Mar 10 08:04:10 crc kubenswrapper[4825]: I0310 08:04:10.991884 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 08:04:11 crc kubenswrapper[4825]: I0310 08:04:11.316221 4825 generic.go:334] "Generic (PLEG): container finished" podID="278f6673-3940-4a04-839d-28ce27000afa" containerID="406add66cf15836d786c223e5b2f37d54d9e420827ca22e46ba2f5fd21230e17" exitCode=0 Mar 10 08:04:11 crc kubenswrapper[4825]: I0310 08:04:11.316267 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76d4984c65-sjjvn" event={"ID":"278f6673-3940-4a04-839d-28ce27000afa","Type":"ContainerDied","Data":"406add66cf15836d786c223e5b2f37d54d9e420827ca22e46ba2f5fd21230e17"} Mar 10 08:04:11 crc kubenswrapper[4825]: I0310 08:04:11.316297 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76d4984c65-sjjvn" event={"ID":"278f6673-3940-4a04-839d-28ce27000afa","Type":"ContainerStarted","Data":"d88e633dce5e2c88d7540ddfe3a3c3cb18ed7fea0a376bbf0cfaf794f3b204bc"} Mar 10 08:04:11 crc kubenswrapper[4825]: I0310 08:04:11.859521 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 08:04:12 crc 
kubenswrapper[4825]: I0310 08:04:12.324953 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76d4984c65-sjjvn" event={"ID":"278f6673-3940-4a04-839d-28ce27000afa","Type":"ContainerStarted","Data":"6a0b4071218c2ab0cad4195f8bd8cac9a5d9f05eb6135642d68df8bb72bc278f"} Mar 10 08:04:12 crc kubenswrapper[4825]: I0310 08:04:12.325110 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76d4984c65-sjjvn" Mar 10 08:04:12 crc kubenswrapper[4825]: I0310 08:04:12.348059 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76d4984c65-sjjvn" podStartSLOduration=3.348038936 podStartE2EDuration="3.348038936s" podCreationTimestamp="2026-03-10 08:04:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:04:12.342728397 +0000 UTC m=+4805.372509032" watchObservedRunningTime="2026-03-10 08:04:12.348038936 +0000 UTC m=+4805.377819561" Mar 10 08:04:12 crc kubenswrapper[4825]: I0310 08:04:12.475531 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-llk5l" Mar 10 08:04:12 crc kubenswrapper[4825]: I0310 08:04:12.476099 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-llk5l" Mar 10 08:04:12 crc kubenswrapper[4825]: I0310 08:04:12.516084 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-llk5l" Mar 10 08:04:13 crc kubenswrapper[4825]: I0310 08:04:13.382941 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-llk5l" Mar 10 08:04:13 crc kubenswrapper[4825]: I0310 08:04:13.431995 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-llk5l"] Mar 10 08:04:14 crc 
kubenswrapper[4825]: I0310 08:04:14.918027 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="a37c9d85-7482-45de-b47d-378f308feb54" containerName="rabbitmq" containerID="cri-o://bb840d5d1875e6c200d4ddc418614e59596370b2603d0997ee7dbb06712b1774" gracePeriod=604797 Mar 10 08:04:15 crc kubenswrapper[4825]: I0310 08:04:15.344539 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-llk5l" podUID="13b2fa50-d9a8-4569-99da-4eac1f129875" containerName="registry-server" containerID="cri-o://69c35b38e190dcd61755c574a31a0eeb308808606b0e1359c07a00e1acbc4e62" gracePeriod=2 Mar 10 08:04:15 crc kubenswrapper[4825]: I0310 08:04:15.745430 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-llk5l" Mar 10 08:04:15 crc kubenswrapper[4825]: I0310 08:04:15.905065 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13b2fa50-d9a8-4569-99da-4eac1f129875-catalog-content\") pod \"13b2fa50-d9a8-4569-99da-4eac1f129875\" (UID: \"13b2fa50-d9a8-4569-99da-4eac1f129875\") " Mar 10 08:04:15 crc kubenswrapper[4825]: I0310 08:04:15.905243 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13b2fa50-d9a8-4569-99da-4eac1f129875-utilities\") pod \"13b2fa50-d9a8-4569-99da-4eac1f129875\" (UID: \"13b2fa50-d9a8-4569-99da-4eac1f129875\") " Mar 10 08:04:15 crc kubenswrapper[4825]: I0310 08:04:15.905331 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gx4hp\" (UniqueName: \"kubernetes.io/projected/13b2fa50-d9a8-4569-99da-4eac1f129875-kube-api-access-gx4hp\") pod \"13b2fa50-d9a8-4569-99da-4eac1f129875\" (UID: \"13b2fa50-d9a8-4569-99da-4eac1f129875\") " Mar 10 08:04:15 crc 
kubenswrapper[4825]: I0310 08:04:15.906717 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13b2fa50-d9a8-4569-99da-4eac1f129875-utilities" (OuterVolumeSpecName: "utilities") pod "13b2fa50-d9a8-4569-99da-4eac1f129875" (UID: "13b2fa50-d9a8-4569-99da-4eac1f129875"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:04:15 crc kubenswrapper[4825]: I0310 08:04:15.916957 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13b2fa50-d9a8-4569-99da-4eac1f129875-kube-api-access-gx4hp" (OuterVolumeSpecName: "kube-api-access-gx4hp") pod "13b2fa50-d9a8-4569-99da-4eac1f129875" (UID: "13b2fa50-d9a8-4569-99da-4eac1f129875"). InnerVolumeSpecName "kube-api-access-gx4hp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:04:15 crc kubenswrapper[4825]: I0310 08:04:15.971800 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="3d908280-2284-4dec-b8e3-67aae4d22314" containerName="rabbitmq" containerID="cri-o://b4d61c6344a19f965ce9a8fefe891013ab0e02c9e7d0a5bcef488c45b2732e0d" gracePeriod=604796 Mar 10 08:04:15 crc kubenswrapper[4825]: I0310 08:04:15.981652 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13b2fa50-d9a8-4569-99da-4eac1f129875-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "13b2fa50-d9a8-4569-99da-4eac1f129875" (UID: "13b2fa50-d9a8-4569-99da-4eac1f129875"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:04:16 crc kubenswrapper[4825]: I0310 08:04:16.007487 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gx4hp\" (UniqueName: \"kubernetes.io/projected/13b2fa50-d9a8-4569-99da-4eac1f129875-kube-api-access-gx4hp\") on node \"crc\" DevicePath \"\"" Mar 10 08:04:16 crc kubenswrapper[4825]: I0310 08:04:16.007534 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13b2fa50-d9a8-4569-99da-4eac1f129875-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 08:04:16 crc kubenswrapper[4825]: I0310 08:04:16.007543 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13b2fa50-d9a8-4569-99da-4eac1f129875-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 08:04:16 crc kubenswrapper[4825]: I0310 08:04:16.354467 4825 generic.go:334] "Generic (PLEG): container finished" podID="13b2fa50-d9a8-4569-99da-4eac1f129875" containerID="69c35b38e190dcd61755c574a31a0eeb308808606b0e1359c07a00e1acbc4e62" exitCode=0 Mar 10 08:04:16 crc kubenswrapper[4825]: I0310 08:04:16.354529 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llk5l" event={"ID":"13b2fa50-d9a8-4569-99da-4eac1f129875","Type":"ContainerDied","Data":"69c35b38e190dcd61755c574a31a0eeb308808606b0e1359c07a00e1acbc4e62"} Mar 10 08:04:16 crc kubenswrapper[4825]: I0310 08:04:16.354612 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llk5l" event={"ID":"13b2fa50-d9a8-4569-99da-4eac1f129875","Type":"ContainerDied","Data":"69c1ff9d39de84fd184724c0585b7ce78f9c55849a1b2bcc394a23f1b59096a0"} Mar 10 08:04:16 crc kubenswrapper[4825]: I0310 08:04:16.354647 4825 scope.go:117] "RemoveContainer" containerID="69c35b38e190dcd61755c574a31a0eeb308808606b0e1359c07a00e1acbc4e62" Mar 10 08:04:16 crc kubenswrapper[4825]: I0310 
08:04:16.354549 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-llk5l" Mar 10 08:04:16 crc kubenswrapper[4825]: I0310 08:04:16.376612 4825 scope.go:117] "RemoveContainer" containerID="6acc051bbfaca4a37c3bcec6a5b398c7ef48b245682239ef40972adc232cb8fc" Mar 10 08:04:16 crc kubenswrapper[4825]: I0310 08:04:16.389100 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-llk5l"] Mar 10 08:04:16 crc kubenswrapper[4825]: I0310 08:04:16.407215 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-llk5l"] Mar 10 08:04:16 crc kubenswrapper[4825]: I0310 08:04:16.409353 4825 scope.go:117] "RemoveContainer" containerID="ac4794aaf7631f82d8f397e86a77f5c7372523e4b0302c9be0bb27d369aef29f" Mar 10 08:04:16 crc kubenswrapper[4825]: I0310 08:04:16.441595 4825 scope.go:117] "RemoveContainer" containerID="69c35b38e190dcd61755c574a31a0eeb308808606b0e1359c07a00e1acbc4e62" Mar 10 08:04:16 crc kubenswrapper[4825]: E0310 08:04:16.442190 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69c35b38e190dcd61755c574a31a0eeb308808606b0e1359c07a00e1acbc4e62\": container with ID starting with 69c35b38e190dcd61755c574a31a0eeb308808606b0e1359c07a00e1acbc4e62 not found: ID does not exist" containerID="69c35b38e190dcd61755c574a31a0eeb308808606b0e1359c07a00e1acbc4e62" Mar 10 08:04:16 crc kubenswrapper[4825]: I0310 08:04:16.442233 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69c35b38e190dcd61755c574a31a0eeb308808606b0e1359c07a00e1acbc4e62"} err="failed to get container status \"69c35b38e190dcd61755c574a31a0eeb308808606b0e1359c07a00e1acbc4e62\": rpc error: code = NotFound desc = could not find container \"69c35b38e190dcd61755c574a31a0eeb308808606b0e1359c07a00e1acbc4e62\": container with ID starting with 
69c35b38e190dcd61755c574a31a0eeb308808606b0e1359c07a00e1acbc4e62 not found: ID does not exist" Mar 10 08:04:16 crc kubenswrapper[4825]: I0310 08:04:16.442265 4825 scope.go:117] "RemoveContainer" containerID="6acc051bbfaca4a37c3bcec6a5b398c7ef48b245682239ef40972adc232cb8fc" Mar 10 08:04:16 crc kubenswrapper[4825]: E0310 08:04:16.442886 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6acc051bbfaca4a37c3bcec6a5b398c7ef48b245682239ef40972adc232cb8fc\": container with ID starting with 6acc051bbfaca4a37c3bcec6a5b398c7ef48b245682239ef40972adc232cb8fc not found: ID does not exist" containerID="6acc051bbfaca4a37c3bcec6a5b398c7ef48b245682239ef40972adc232cb8fc" Mar 10 08:04:16 crc kubenswrapper[4825]: I0310 08:04:16.442931 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6acc051bbfaca4a37c3bcec6a5b398c7ef48b245682239ef40972adc232cb8fc"} err="failed to get container status \"6acc051bbfaca4a37c3bcec6a5b398c7ef48b245682239ef40972adc232cb8fc\": rpc error: code = NotFound desc = could not find container \"6acc051bbfaca4a37c3bcec6a5b398c7ef48b245682239ef40972adc232cb8fc\": container with ID starting with 6acc051bbfaca4a37c3bcec6a5b398c7ef48b245682239ef40972adc232cb8fc not found: ID does not exist" Mar 10 08:04:16 crc kubenswrapper[4825]: I0310 08:04:16.442967 4825 scope.go:117] "RemoveContainer" containerID="ac4794aaf7631f82d8f397e86a77f5c7372523e4b0302c9be0bb27d369aef29f" Mar 10 08:04:16 crc kubenswrapper[4825]: E0310 08:04:16.443360 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac4794aaf7631f82d8f397e86a77f5c7372523e4b0302c9be0bb27d369aef29f\": container with ID starting with ac4794aaf7631f82d8f397e86a77f5c7372523e4b0302c9be0bb27d369aef29f not found: ID does not exist" containerID="ac4794aaf7631f82d8f397e86a77f5c7372523e4b0302c9be0bb27d369aef29f" Mar 10 08:04:16 crc 
kubenswrapper[4825]: I0310 08:04:16.443394 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac4794aaf7631f82d8f397e86a77f5c7372523e4b0302c9be0bb27d369aef29f"} err="failed to get container status \"ac4794aaf7631f82d8f397e86a77f5c7372523e4b0302c9be0bb27d369aef29f\": rpc error: code = NotFound desc = could not find container \"ac4794aaf7631f82d8f397e86a77f5c7372523e4b0302c9be0bb27d369aef29f\": container with ID starting with ac4794aaf7631f82d8f397e86a77f5c7372523e4b0302c9be0bb27d369aef29f not found: ID does not exist" Mar 10 08:04:17 crc kubenswrapper[4825]: I0310 08:04:17.246514 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13b2fa50-d9a8-4569-99da-4eac1f129875" path="/var/lib/kubelet/pods/13b2fa50-d9a8-4569-99da-4eac1f129875/volumes" Mar 10 08:04:19 crc kubenswrapper[4825]: I0310 08:04:19.880827 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76d4984c65-sjjvn" Mar 10 08:04:19 crc kubenswrapper[4825]: I0310 08:04:19.932263 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="a37c9d85-7482-45de-b47d-378f308feb54" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.22:5671: connect: connection refused" Mar 10 08:04:19 crc kubenswrapper[4825]: I0310 08:04:19.935980 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c87cfb955-vzt5j"] Mar 10 08:04:19 crc kubenswrapper[4825]: I0310 08:04:19.936228 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-c87cfb955-vzt5j" podUID="55c11e3c-2b81-4e63-87f2-71e18805c676" containerName="dnsmasq-dns" containerID="cri-o://ab829cd23f04929298245185a0f6837bde3c4ae1cb3ce366e60c7a9627bacf64" gracePeriod=10 Mar 10 08:04:20 crc kubenswrapper[4825]: I0310 08:04:20.373045 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c87cfb955-vzt5j" Mar 10 08:04:20 crc kubenswrapper[4825]: I0310 08:04:20.391985 4825 generic.go:334] "Generic (PLEG): container finished" podID="55c11e3c-2b81-4e63-87f2-71e18805c676" containerID="ab829cd23f04929298245185a0f6837bde3c4ae1cb3ce366e60c7a9627bacf64" exitCode=0 Mar 10 08:04:20 crc kubenswrapper[4825]: I0310 08:04:20.392024 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c87cfb955-vzt5j" event={"ID":"55c11e3c-2b81-4e63-87f2-71e18805c676","Type":"ContainerDied","Data":"ab829cd23f04929298245185a0f6837bde3c4ae1cb3ce366e60c7a9627bacf64"} Mar 10 08:04:20 crc kubenswrapper[4825]: I0310 08:04:20.392049 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c87cfb955-vzt5j" event={"ID":"55c11e3c-2b81-4e63-87f2-71e18805c676","Type":"ContainerDied","Data":"3779e46ac757cdf5873a290b8bee1a3e82bd215d2982525e40e98ba4574d4705"} Mar 10 08:04:20 crc kubenswrapper[4825]: I0310 08:04:20.392064 4825 scope.go:117] "RemoveContainer" containerID="ab829cd23f04929298245185a0f6837bde3c4ae1cb3ce366e60c7a9627bacf64" Mar 10 08:04:20 crc kubenswrapper[4825]: I0310 08:04:20.392642 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c87cfb955-vzt5j" Mar 10 08:04:20 crc kubenswrapper[4825]: I0310 08:04:20.409428 4825 scope.go:117] "RemoveContainer" containerID="c831e847b10fc73b9d3536510b87a575cea97c11dc1e9acaba09022333f201dc" Mar 10 08:04:20 crc kubenswrapper[4825]: I0310 08:04:20.456473 4825 scope.go:117] "RemoveContainer" containerID="ab829cd23f04929298245185a0f6837bde3c4ae1cb3ce366e60c7a9627bacf64" Mar 10 08:04:20 crc kubenswrapper[4825]: E0310 08:04:20.460429 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab829cd23f04929298245185a0f6837bde3c4ae1cb3ce366e60c7a9627bacf64\": container with ID starting with ab829cd23f04929298245185a0f6837bde3c4ae1cb3ce366e60c7a9627bacf64 not found: ID does not exist" containerID="ab829cd23f04929298245185a0f6837bde3c4ae1cb3ce366e60c7a9627bacf64" Mar 10 08:04:20 crc kubenswrapper[4825]: I0310 08:04:20.460469 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab829cd23f04929298245185a0f6837bde3c4ae1cb3ce366e60c7a9627bacf64"} err="failed to get container status \"ab829cd23f04929298245185a0f6837bde3c4ae1cb3ce366e60c7a9627bacf64\": rpc error: code = NotFound desc = could not find container \"ab829cd23f04929298245185a0f6837bde3c4ae1cb3ce366e60c7a9627bacf64\": container with ID starting with ab829cd23f04929298245185a0f6837bde3c4ae1cb3ce366e60c7a9627bacf64 not found: ID does not exist" Mar 10 08:04:20 crc kubenswrapper[4825]: I0310 08:04:20.460494 4825 scope.go:117] "RemoveContainer" containerID="c831e847b10fc73b9d3536510b87a575cea97c11dc1e9acaba09022333f201dc" Mar 10 08:04:20 crc kubenswrapper[4825]: E0310 08:04:20.460785 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c831e847b10fc73b9d3536510b87a575cea97c11dc1e9acaba09022333f201dc\": container with ID starting with 
c831e847b10fc73b9d3536510b87a575cea97c11dc1e9acaba09022333f201dc not found: ID does not exist" containerID="c831e847b10fc73b9d3536510b87a575cea97c11dc1e9acaba09022333f201dc" Mar 10 08:04:20 crc kubenswrapper[4825]: I0310 08:04:20.460831 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c831e847b10fc73b9d3536510b87a575cea97c11dc1e9acaba09022333f201dc"} err="failed to get container status \"c831e847b10fc73b9d3536510b87a575cea97c11dc1e9acaba09022333f201dc\": rpc error: code = NotFound desc = could not find container \"c831e847b10fc73b9d3536510b87a575cea97c11dc1e9acaba09022333f201dc\": container with ID starting with c831e847b10fc73b9d3536510b87a575cea97c11dc1e9acaba09022333f201dc not found: ID does not exist" Mar 10 08:04:20 crc kubenswrapper[4825]: I0310 08:04:20.477549 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqqpr\" (UniqueName: \"kubernetes.io/projected/55c11e3c-2b81-4e63-87f2-71e18805c676-kube-api-access-gqqpr\") pod \"55c11e3c-2b81-4e63-87f2-71e18805c676\" (UID: \"55c11e3c-2b81-4e63-87f2-71e18805c676\") " Mar 10 08:04:20 crc kubenswrapper[4825]: I0310 08:04:20.478052 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55c11e3c-2b81-4e63-87f2-71e18805c676-dns-svc\") pod \"55c11e3c-2b81-4e63-87f2-71e18805c676\" (UID: \"55c11e3c-2b81-4e63-87f2-71e18805c676\") " Mar 10 08:04:20 crc kubenswrapper[4825]: I0310 08:04:20.478191 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55c11e3c-2b81-4e63-87f2-71e18805c676-config\") pod \"55c11e3c-2b81-4e63-87f2-71e18805c676\" (UID: \"55c11e3c-2b81-4e63-87f2-71e18805c676\") " Mar 10 08:04:20 crc kubenswrapper[4825]: I0310 08:04:20.486569 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/55c11e3c-2b81-4e63-87f2-71e18805c676-kube-api-access-gqqpr" (OuterVolumeSpecName: "kube-api-access-gqqpr") pod "55c11e3c-2b81-4e63-87f2-71e18805c676" (UID: "55c11e3c-2b81-4e63-87f2-71e18805c676"). InnerVolumeSpecName "kube-api-access-gqqpr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:04:20 crc kubenswrapper[4825]: I0310 08:04:20.528257 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="3d908280-2284-4dec-b8e3-67aae4d22314" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.23:5671: connect: connection refused" Mar 10 08:04:20 crc kubenswrapper[4825]: E0310 08:04:20.544548 4825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/55c11e3c-2b81-4e63-87f2-71e18805c676-config podName:55c11e3c-2b81-4e63-87f2-71e18805c676 nodeName:}" failed. No retries permitted until 2026-03-10 08:04:21.044520947 +0000 UTC m=+4814.074301562 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config" (UniqueName: "kubernetes.io/configmap/55c11e3c-2b81-4e63-87f2-71e18805c676-config") pod "55c11e3c-2b81-4e63-87f2-71e18805c676" (UID: "55c11e3c-2b81-4e63-87f2-71e18805c676") : error deleting /var/lib/kubelet/pods/55c11e3c-2b81-4e63-87f2-71e18805c676/volume-subpaths: remove /var/lib/kubelet/pods/55c11e3c-2b81-4e63-87f2-71e18805c676/volume-subpaths: no such file or directory Mar 10 08:04:20 crc kubenswrapper[4825]: I0310 08:04:20.544799 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55c11e3c-2b81-4e63-87f2-71e18805c676-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "55c11e3c-2b81-4e63-87f2-71e18805c676" (UID: "55c11e3c-2b81-4e63-87f2-71e18805c676"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:04:20 crc kubenswrapper[4825]: I0310 08:04:20.579846 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55c11e3c-2b81-4e63-87f2-71e18805c676-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 08:04:20 crc kubenswrapper[4825]: I0310 08:04:20.579876 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqqpr\" (UniqueName: \"kubernetes.io/projected/55c11e3c-2b81-4e63-87f2-71e18805c676-kube-api-access-gqqpr\") on node \"crc\" DevicePath \"\"" Mar 10 08:04:21 crc kubenswrapper[4825]: I0310 08:04:21.086558 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55c11e3c-2b81-4e63-87f2-71e18805c676-config\") pod \"55c11e3c-2b81-4e63-87f2-71e18805c676\" (UID: \"55c11e3c-2b81-4e63-87f2-71e18805c676\") " Mar 10 08:04:21 crc kubenswrapper[4825]: I0310 08:04:21.086947 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55c11e3c-2b81-4e63-87f2-71e18805c676-config" (OuterVolumeSpecName: "config") pod "55c11e3c-2b81-4e63-87f2-71e18805c676" (UID: "55c11e3c-2b81-4e63-87f2-71e18805c676"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:04:21 crc kubenswrapper[4825]: I0310 08:04:21.088389 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55c11e3c-2b81-4e63-87f2-71e18805c676-config\") on node \"crc\" DevicePath \"\"" Mar 10 08:04:21 crc kubenswrapper[4825]: I0310 08:04:21.235679 4825 scope.go:117] "RemoveContainer" containerID="856a46518228770fd6354414263218ba580cad9f6bf09966c52a899f2397881c" Mar 10 08:04:21 crc kubenswrapper[4825]: E0310 08:04:21.235914 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:04:21 crc kubenswrapper[4825]: I0310 08:04:21.310976 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c87cfb955-vzt5j"] Mar 10 08:04:21 crc kubenswrapper[4825]: I0310 08:04:21.316310 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-c87cfb955-vzt5j"] Mar 10 08:04:21 crc kubenswrapper[4825]: I0310 08:04:21.400375 4825 generic.go:334] "Generic (PLEG): container finished" podID="a37c9d85-7482-45de-b47d-378f308feb54" containerID="bb840d5d1875e6c200d4ddc418614e59596370b2603d0997ee7dbb06712b1774" exitCode=0 Mar 10 08:04:21 crc kubenswrapper[4825]: I0310 08:04:21.400442 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a37c9d85-7482-45de-b47d-378f308feb54","Type":"ContainerDied","Data":"bb840d5d1875e6c200d4ddc418614e59596370b2603d0997ee7dbb06712b1774"} Mar 10 08:04:21 crc kubenswrapper[4825]: I0310 08:04:21.610536 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 08:04:21 crc kubenswrapper[4825]: I0310 08:04:21.697862 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a37c9d85-7482-45de-b47d-378f308feb54-pod-info\") pod \"a37c9d85-7482-45de-b47d-378f308feb54\" (UID: \"a37c9d85-7482-45de-b47d-378f308feb54\") " Mar 10 08:04:21 crc kubenswrapper[4825]: I0310 08:04:21.697908 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a37c9d85-7482-45de-b47d-378f308feb54-rabbitmq-erlang-cookie\") pod \"a37c9d85-7482-45de-b47d-378f308feb54\" (UID: \"a37c9d85-7482-45de-b47d-378f308feb54\") " Mar 10 08:04:21 crc kubenswrapper[4825]: I0310 08:04:21.697931 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a37c9d85-7482-45de-b47d-378f308feb54-rabbitmq-plugins\") pod \"a37c9d85-7482-45de-b47d-378f308feb54\" (UID: \"a37c9d85-7482-45de-b47d-378f308feb54\") " Mar 10 08:04:21 crc kubenswrapper[4825]: I0310 08:04:21.697956 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a37c9d85-7482-45de-b47d-378f308feb54-plugins-conf\") pod \"a37c9d85-7482-45de-b47d-378f308feb54\" (UID: \"a37c9d85-7482-45de-b47d-378f308feb54\") " Mar 10 08:04:21 crc kubenswrapper[4825]: I0310 08:04:21.697977 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f277q\" (UniqueName: \"kubernetes.io/projected/a37c9d85-7482-45de-b47d-378f308feb54-kube-api-access-f277q\") pod \"a37c9d85-7482-45de-b47d-378f308feb54\" (UID: \"a37c9d85-7482-45de-b47d-378f308feb54\") " Mar 10 08:04:21 crc kubenswrapper[4825]: I0310 08:04:21.698012 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a37c9d85-7482-45de-b47d-378f308feb54-erlang-cookie-secret\") pod \"a37c9d85-7482-45de-b47d-378f308feb54\" (UID: \"a37c9d85-7482-45de-b47d-378f308feb54\") " Mar 10 08:04:21 crc kubenswrapper[4825]: I0310 08:04:21.698072 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a37c9d85-7482-45de-b47d-378f308feb54-rabbitmq-tls\") pod \"a37c9d85-7482-45de-b47d-378f308feb54\" (UID: \"a37c9d85-7482-45de-b47d-378f308feb54\") " Mar 10 08:04:21 crc kubenswrapper[4825]: I0310 08:04:21.698086 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a37c9d85-7482-45de-b47d-378f308feb54-server-conf\") pod \"a37c9d85-7482-45de-b47d-378f308feb54\" (UID: \"a37c9d85-7482-45de-b47d-378f308feb54\") " Mar 10 08:04:21 crc kubenswrapper[4825]: I0310 08:04:21.698216 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10ce9807-6e64-488b-8542-1028122298f4\") pod \"a37c9d85-7482-45de-b47d-378f308feb54\" (UID: \"a37c9d85-7482-45de-b47d-378f308feb54\") " Mar 10 08:04:21 crc kubenswrapper[4825]: I0310 08:04:21.698253 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a37c9d85-7482-45de-b47d-378f308feb54-config-data\") pod \"a37c9d85-7482-45de-b47d-378f308feb54\" (UID: \"a37c9d85-7482-45de-b47d-378f308feb54\") " Mar 10 08:04:21 crc kubenswrapper[4825]: I0310 08:04:21.698276 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a37c9d85-7482-45de-b47d-378f308feb54-rabbitmq-confd\") pod \"a37c9d85-7482-45de-b47d-378f308feb54\" (UID: \"a37c9d85-7482-45de-b47d-378f308feb54\") " Mar 10 08:04:21 
crc kubenswrapper[4825]: I0310 08:04:21.698462 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a37c9d85-7482-45de-b47d-378f308feb54-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "a37c9d85-7482-45de-b47d-378f308feb54" (UID: "a37c9d85-7482-45de-b47d-378f308feb54"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:04:21 crc kubenswrapper[4825]: I0310 08:04:21.698658 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a37c9d85-7482-45de-b47d-378f308feb54-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "a37c9d85-7482-45de-b47d-378f308feb54" (UID: "a37c9d85-7482-45de-b47d-378f308feb54"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:04:21 crc kubenswrapper[4825]: I0310 08:04:21.698666 4825 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a37c9d85-7482-45de-b47d-378f308feb54-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 10 08:04:21 crc kubenswrapper[4825]: I0310 08:04:21.699094 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a37c9d85-7482-45de-b47d-378f308feb54-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "a37c9d85-7482-45de-b47d-378f308feb54" (UID: "a37c9d85-7482-45de-b47d-378f308feb54"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:04:21 crc kubenswrapper[4825]: I0310 08:04:21.706334 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a37c9d85-7482-45de-b47d-378f308feb54-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "a37c9d85-7482-45de-b47d-378f308feb54" (UID: "a37c9d85-7482-45de-b47d-378f308feb54"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:04:21 crc kubenswrapper[4825]: I0310 08:04:21.706699 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a37c9d85-7482-45de-b47d-378f308feb54-kube-api-access-f277q" (OuterVolumeSpecName: "kube-api-access-f277q") pod "a37c9d85-7482-45de-b47d-378f308feb54" (UID: "a37c9d85-7482-45de-b47d-378f308feb54"). InnerVolumeSpecName "kube-api-access-f277q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:04:21 crc kubenswrapper[4825]: I0310 08:04:21.709965 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/a37c9d85-7482-45de-b47d-378f308feb54-pod-info" (OuterVolumeSpecName: "pod-info") pod "a37c9d85-7482-45de-b47d-378f308feb54" (UID: "a37c9d85-7482-45de-b47d-378f308feb54"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 10 08:04:21 crc kubenswrapper[4825]: I0310 08:04:21.710384 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10ce9807-6e64-488b-8542-1028122298f4" (OuterVolumeSpecName: "persistence") pod "a37c9d85-7482-45de-b47d-378f308feb54" (UID: "a37c9d85-7482-45de-b47d-378f308feb54"). InnerVolumeSpecName "pvc-10ce9807-6e64-488b-8542-1028122298f4". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 10 08:04:21 crc kubenswrapper[4825]: I0310 08:04:21.721418 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a37c9d85-7482-45de-b47d-378f308feb54-config-data" (OuterVolumeSpecName: "config-data") pod "a37c9d85-7482-45de-b47d-378f308feb54" (UID: "a37c9d85-7482-45de-b47d-378f308feb54"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:04:21 crc kubenswrapper[4825]: I0310 08:04:21.723409 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a37c9d85-7482-45de-b47d-378f308feb54-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "a37c9d85-7482-45de-b47d-378f308feb54" (UID: "a37c9d85-7482-45de-b47d-378f308feb54"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:04:21 crc kubenswrapper[4825]: I0310 08:04:21.742043 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a37c9d85-7482-45de-b47d-378f308feb54-server-conf" (OuterVolumeSpecName: "server-conf") pod "a37c9d85-7482-45de-b47d-378f308feb54" (UID: "a37c9d85-7482-45de-b47d-378f308feb54"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:04:21 crc kubenswrapper[4825]: I0310 08:04:21.776161 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a37c9d85-7482-45de-b47d-378f308feb54-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "a37c9d85-7482-45de-b47d-378f308feb54" (UID: "a37c9d85-7482-45de-b47d-378f308feb54"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:04:21 crc kubenswrapper[4825]: I0310 08:04:21.799727 4825 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a37c9d85-7482-45de-b47d-378f308feb54-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 10 08:04:21 crc kubenswrapper[4825]: I0310 08:04:21.799769 4825 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a37c9d85-7482-45de-b47d-378f308feb54-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 10 08:04:21 crc kubenswrapper[4825]: I0310 08:04:21.799782 4825 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a37c9d85-7482-45de-b47d-378f308feb54-server-conf\") on node \"crc\" DevicePath \"\"" Mar 10 08:04:21 crc kubenswrapper[4825]: I0310 08:04:21.799823 4825 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-10ce9807-6e64-488b-8542-1028122298f4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10ce9807-6e64-488b-8542-1028122298f4\") on node \"crc\" " Mar 10 08:04:21 crc kubenswrapper[4825]: I0310 08:04:21.799837 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a37c9d85-7482-45de-b47d-378f308feb54-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 08:04:21 crc kubenswrapper[4825]: I0310 08:04:21.799848 4825 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a37c9d85-7482-45de-b47d-378f308feb54-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 10 08:04:21 crc kubenswrapper[4825]: I0310 08:04:21.799862 4825 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a37c9d85-7482-45de-b47d-378f308feb54-pod-info\") on node \"crc\" DevicePath \"\"" Mar 10 08:04:21 crc 
kubenswrapper[4825]: I0310 08:04:21.799875 4825 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a37c9d85-7482-45de-b47d-378f308feb54-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 10 08:04:21 crc kubenswrapper[4825]: I0310 08:04:21.799886 4825 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a37c9d85-7482-45de-b47d-378f308feb54-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 10 08:04:21 crc kubenswrapper[4825]: I0310 08:04:21.799896 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f277q\" (UniqueName: \"kubernetes.io/projected/a37c9d85-7482-45de-b47d-378f308feb54-kube-api-access-f277q\") on node \"crc\" DevicePath \"\"" Mar 10 08:04:21 crc kubenswrapper[4825]: I0310 08:04:21.836909 4825 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 10 08:04:21 crc kubenswrapper[4825]: I0310 08:04:21.837605 4825 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-10ce9807-6e64-488b-8542-1028122298f4" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10ce9807-6e64-488b-8542-1028122298f4") on node "crc" Mar 10 08:04:21 crc kubenswrapper[4825]: I0310 08:04:21.901250 4825 reconciler_common.go:293] "Volume detached for volume \"pvc-10ce9807-6e64-488b-8542-1028122298f4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10ce9807-6e64-488b-8542-1028122298f4\") on node \"crc\" DevicePath \"\"" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.412971 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a37c9d85-7482-45de-b47d-378f308feb54","Type":"ContainerDied","Data":"1efec9b8740bc4d4a33475ae730b7711821dc3539764baeee428b70b300f2b2a"} Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.413064 4825 scope.go:117] 
"RemoveContainer" containerID="bb840d5d1875e6c200d4ddc418614e59596370b2603d0997ee7dbb06712b1774" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.413299 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.484350 4825 scope.go:117] "RemoveContainer" containerID="ff3cda9d055df2d990e3bd4fac577a92655f2704e525cd7fe0d91738cac0ebbd" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.490415 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.498591 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.524145 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 08:04:22 crc kubenswrapper[4825]: E0310 08:04:22.528093 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a37c9d85-7482-45de-b47d-378f308feb54" containerName="setup-container" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.528145 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a37c9d85-7482-45de-b47d-378f308feb54" containerName="setup-container" Mar 10 08:04:22 crc kubenswrapper[4825]: E0310 08:04:22.528171 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a37c9d85-7482-45de-b47d-378f308feb54" containerName="rabbitmq" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.528181 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a37c9d85-7482-45de-b47d-378f308feb54" containerName="rabbitmq" Mar 10 08:04:22 crc kubenswrapper[4825]: E0310 08:04:22.528194 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55c11e3c-2b81-4e63-87f2-71e18805c676" containerName="dnsmasq-dns" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.528203 4825 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="55c11e3c-2b81-4e63-87f2-71e18805c676" containerName="dnsmasq-dns" Mar 10 08:04:22 crc kubenswrapper[4825]: E0310 08:04:22.528313 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13b2fa50-d9a8-4569-99da-4eac1f129875" containerName="extract-content" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.528323 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="13b2fa50-d9a8-4569-99da-4eac1f129875" containerName="extract-content" Mar 10 08:04:22 crc kubenswrapper[4825]: E0310 08:04:22.528334 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55c11e3c-2b81-4e63-87f2-71e18805c676" containerName="init" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.528342 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="55c11e3c-2b81-4e63-87f2-71e18805c676" containerName="init" Mar 10 08:04:22 crc kubenswrapper[4825]: E0310 08:04:22.528365 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13b2fa50-d9a8-4569-99da-4eac1f129875" containerName="registry-server" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.528373 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="13b2fa50-d9a8-4569-99da-4eac1f129875" containerName="registry-server" Mar 10 08:04:22 crc kubenswrapper[4825]: E0310 08:04:22.528388 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13b2fa50-d9a8-4569-99da-4eac1f129875" containerName="extract-utilities" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.528396 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="13b2fa50-d9a8-4569-99da-4eac1f129875" containerName="extract-utilities" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.528587 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="a37c9d85-7482-45de-b47d-378f308feb54" containerName="rabbitmq" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.528612 4825 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="55c11e3c-2b81-4e63-87f2-71e18805c676" containerName="dnsmasq-dns" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.528625 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="13b2fa50-d9a8-4569-99da-4eac1f129875" containerName="registry-server" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.529701 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.535970 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.536637 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.536663 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.536866 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.536925 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.537602 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-fbtpt" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.537946 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.538016 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.614807 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/67f57387-e031-49bf-9895-efa6796a98cd-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"67f57387-e031-49bf-9895-efa6796a98cd\") " pod="openstack/rabbitmq-server-0" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.614850 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/67f57387-e031-49bf-9895-efa6796a98cd-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"67f57387-e031-49bf-9895-efa6796a98cd\") " pod="openstack/rabbitmq-server-0" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.614882 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttnnh\" (UniqueName: \"kubernetes.io/projected/67f57387-e031-49bf-9895-efa6796a98cd-kube-api-access-ttnnh\") pod \"rabbitmq-server-0\" (UID: \"67f57387-e031-49bf-9895-efa6796a98cd\") " pod="openstack/rabbitmq-server-0" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.614906 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/67f57387-e031-49bf-9895-efa6796a98cd-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"67f57387-e031-49bf-9895-efa6796a98cd\") " pod="openstack/rabbitmq-server-0" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.615075 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/67f57387-e031-49bf-9895-efa6796a98cd-config-data\") pod \"rabbitmq-server-0\" (UID: \"67f57387-e031-49bf-9895-efa6796a98cd\") " pod="openstack/rabbitmq-server-0" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.615222 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/67f57387-e031-49bf-9895-efa6796a98cd-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"67f57387-e031-49bf-9895-efa6796a98cd\") " pod="openstack/rabbitmq-server-0" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.615294 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/67f57387-e031-49bf-9895-efa6796a98cd-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"67f57387-e031-49bf-9895-efa6796a98cd\") " pod="openstack/rabbitmq-server-0" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.615378 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/67f57387-e031-49bf-9895-efa6796a98cd-pod-info\") pod \"rabbitmq-server-0\" (UID: \"67f57387-e031-49bf-9895-efa6796a98cd\") " pod="openstack/rabbitmq-server-0" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.615509 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/67f57387-e031-49bf-9895-efa6796a98cd-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"67f57387-e031-49bf-9895-efa6796a98cd\") " pod="openstack/rabbitmq-server-0" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.615675 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-10ce9807-6e64-488b-8542-1028122298f4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10ce9807-6e64-488b-8542-1028122298f4\") pod \"rabbitmq-server-0\" (UID: \"67f57387-e031-49bf-9895-efa6796a98cd\") " pod="openstack/rabbitmq-server-0" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.615730 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/67f57387-e031-49bf-9895-efa6796a98cd-server-conf\") pod \"rabbitmq-server-0\" (UID: \"67f57387-e031-49bf-9895-efa6796a98cd\") " pod="openstack/rabbitmq-server-0" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.717750 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/67f57387-e031-49bf-9895-efa6796a98cd-config-data\") pod \"rabbitmq-server-0\" (UID: \"67f57387-e031-49bf-9895-efa6796a98cd\") " pod="openstack/rabbitmq-server-0" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.717802 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/67f57387-e031-49bf-9895-efa6796a98cd-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"67f57387-e031-49bf-9895-efa6796a98cd\") " pod="openstack/rabbitmq-server-0" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.717831 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/67f57387-e031-49bf-9895-efa6796a98cd-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"67f57387-e031-49bf-9895-efa6796a98cd\") " pod="openstack/rabbitmq-server-0" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.717865 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/67f57387-e031-49bf-9895-efa6796a98cd-pod-info\") pod \"rabbitmq-server-0\" (UID: \"67f57387-e031-49bf-9895-efa6796a98cd\") " pod="openstack/rabbitmq-server-0" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.717901 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/67f57387-e031-49bf-9895-efa6796a98cd-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"67f57387-e031-49bf-9895-efa6796a98cd\") " 
pod="openstack/rabbitmq-server-0" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.717954 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-10ce9807-6e64-488b-8542-1028122298f4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10ce9807-6e64-488b-8542-1028122298f4\") pod \"rabbitmq-server-0\" (UID: \"67f57387-e031-49bf-9895-efa6796a98cd\") " pod="openstack/rabbitmq-server-0" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.717982 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/67f57387-e031-49bf-9895-efa6796a98cd-server-conf\") pod \"rabbitmq-server-0\" (UID: \"67f57387-e031-49bf-9895-efa6796a98cd\") " pod="openstack/rabbitmq-server-0" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.718014 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/67f57387-e031-49bf-9895-efa6796a98cd-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"67f57387-e031-49bf-9895-efa6796a98cd\") " pod="openstack/rabbitmq-server-0" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.718039 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/67f57387-e031-49bf-9895-efa6796a98cd-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"67f57387-e031-49bf-9895-efa6796a98cd\") " pod="openstack/rabbitmq-server-0" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.718070 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttnnh\" (UniqueName: \"kubernetes.io/projected/67f57387-e031-49bf-9895-efa6796a98cd-kube-api-access-ttnnh\") pod \"rabbitmq-server-0\" (UID: \"67f57387-e031-49bf-9895-efa6796a98cd\") " pod="openstack/rabbitmq-server-0" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.718096 4825 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/67f57387-e031-49bf-9895-efa6796a98cd-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"67f57387-e031-49bf-9895-efa6796a98cd\") " pod="openstack/rabbitmq-server-0" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.722459 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/67f57387-e031-49bf-9895-efa6796a98cd-config-data\") pod \"rabbitmq-server-0\" (UID: \"67f57387-e031-49bf-9895-efa6796a98cd\") " pod="openstack/rabbitmq-server-0" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.722760 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/67f57387-e031-49bf-9895-efa6796a98cd-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"67f57387-e031-49bf-9895-efa6796a98cd\") " pod="openstack/rabbitmq-server-0" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.723049 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/67f57387-e031-49bf-9895-efa6796a98cd-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"67f57387-e031-49bf-9895-efa6796a98cd\") " pod="openstack/rabbitmq-server-0" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.723458 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/67f57387-e031-49bf-9895-efa6796a98cd-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"67f57387-e031-49bf-9895-efa6796a98cd\") " pod="openstack/rabbitmq-server-0" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.724541 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/67f57387-e031-49bf-9895-efa6796a98cd-server-conf\") pod 
\"rabbitmq-server-0\" (UID: \"67f57387-e031-49bf-9895-efa6796a98cd\") " pod="openstack/rabbitmq-server-0" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.725802 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/67f57387-e031-49bf-9895-efa6796a98cd-pod-info\") pod \"rabbitmq-server-0\" (UID: \"67f57387-e031-49bf-9895-efa6796a98cd\") " pod="openstack/rabbitmq-server-0" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.728721 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/67f57387-e031-49bf-9895-efa6796a98cd-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"67f57387-e031-49bf-9895-efa6796a98cd\") " pod="openstack/rabbitmq-server-0" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.728821 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/67f57387-e031-49bf-9895-efa6796a98cd-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"67f57387-e031-49bf-9895-efa6796a98cd\") " pod="openstack/rabbitmq-server-0" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.728832 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/67f57387-e031-49bf-9895-efa6796a98cd-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"67f57387-e031-49bf-9895-efa6796a98cd\") " pod="openstack/rabbitmq-server-0" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.735837 4825 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.735887 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-10ce9807-6e64-488b-8542-1028122298f4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10ce9807-6e64-488b-8542-1028122298f4\") pod \"rabbitmq-server-0\" (UID: \"67f57387-e031-49bf-9895-efa6796a98cd\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/39ad561c7ffae3360bcae64ac6a958177480a2045780dd959a8d51d1e4ce878f/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.747381 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttnnh\" (UniqueName: \"kubernetes.io/projected/67f57387-e031-49bf-9895-efa6796a98cd-kube-api-access-ttnnh\") pod \"rabbitmq-server-0\" (UID: \"67f57387-e031-49bf-9895-efa6796a98cd\") " pod="openstack/rabbitmq-server-0" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.775968 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-10ce9807-6e64-488b-8542-1028122298f4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10ce9807-6e64-488b-8542-1028122298f4\") pod \"rabbitmq-server-0\" (UID: \"67f57387-e031-49bf-9895-efa6796a98cd\") " pod="openstack/rabbitmq-server-0" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.796455 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.847432 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.921621 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3d908280-2284-4dec-b8e3-67aae4d22314-plugins-conf\") pod \"3d908280-2284-4dec-b8e3-67aae4d22314\" (UID: \"3d908280-2284-4dec-b8e3-67aae4d22314\") " Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.921686 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xtm8\" (UniqueName: \"kubernetes.io/projected/3d908280-2284-4dec-b8e3-67aae4d22314-kube-api-access-8xtm8\") pod \"3d908280-2284-4dec-b8e3-67aae4d22314\" (UID: \"3d908280-2284-4dec-b8e3-67aae4d22314\") " Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.921718 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3d908280-2284-4dec-b8e3-67aae4d22314-server-conf\") pod \"3d908280-2284-4dec-b8e3-67aae4d22314\" (UID: \"3d908280-2284-4dec-b8e3-67aae4d22314\") " Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.921752 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3d908280-2284-4dec-b8e3-67aae4d22314-pod-info\") pod \"3d908280-2284-4dec-b8e3-67aae4d22314\" (UID: \"3d908280-2284-4dec-b8e3-67aae4d22314\") " Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.921783 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3d908280-2284-4dec-b8e3-67aae4d22314-rabbitmq-erlang-cookie\") pod \"3d908280-2284-4dec-b8e3-67aae4d22314\" (UID: \"3d908280-2284-4dec-b8e3-67aae4d22314\") " Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.921850 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3d908280-2284-4dec-b8e3-67aae4d22314-rabbitmq-plugins\") pod \"3d908280-2284-4dec-b8e3-67aae4d22314\" (UID: \"3d908280-2284-4dec-b8e3-67aae4d22314\") " Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.921910 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3d908280-2284-4dec-b8e3-67aae4d22314-config-data\") pod \"3d908280-2284-4dec-b8e3-67aae4d22314\" (UID: \"3d908280-2284-4dec-b8e3-67aae4d22314\") " Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.922020 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fe14f45b-65e4-4e2b-8b48-bc8134ebe3b6\") pod \"3d908280-2284-4dec-b8e3-67aae4d22314\" (UID: \"3d908280-2284-4dec-b8e3-67aae4d22314\") " Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.922052 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3d908280-2284-4dec-b8e3-67aae4d22314-erlang-cookie-secret\") pod \"3d908280-2284-4dec-b8e3-67aae4d22314\" (UID: \"3d908280-2284-4dec-b8e3-67aae4d22314\") " Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.922070 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3d908280-2284-4dec-b8e3-67aae4d22314-rabbitmq-confd\") pod \"3d908280-2284-4dec-b8e3-67aae4d22314\" (UID: \"3d908280-2284-4dec-b8e3-67aae4d22314\") " Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.922100 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3d908280-2284-4dec-b8e3-67aae4d22314-rabbitmq-tls\") pod \"3d908280-2284-4dec-b8e3-67aae4d22314\" (UID: \"3d908280-2284-4dec-b8e3-67aae4d22314\") " Mar 
10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.923347 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d908280-2284-4dec-b8e3-67aae4d22314-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "3d908280-2284-4dec-b8e3-67aae4d22314" (UID: "3d908280-2284-4dec-b8e3-67aae4d22314"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.923582 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d908280-2284-4dec-b8e3-67aae4d22314-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "3d908280-2284-4dec-b8e3-67aae4d22314" (UID: "3d908280-2284-4dec-b8e3-67aae4d22314"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.924273 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d908280-2284-4dec-b8e3-67aae4d22314-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "3d908280-2284-4dec-b8e3-67aae4d22314" (UID: "3d908280-2284-4dec-b8e3-67aae4d22314"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.926354 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d908280-2284-4dec-b8e3-67aae4d22314-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "3d908280-2284-4dec-b8e3-67aae4d22314" (UID: "3d908280-2284-4dec-b8e3-67aae4d22314"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.926705 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d908280-2284-4dec-b8e3-67aae4d22314-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "3d908280-2284-4dec-b8e3-67aae4d22314" (UID: "3d908280-2284-4dec-b8e3-67aae4d22314"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.927239 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d908280-2284-4dec-b8e3-67aae4d22314-kube-api-access-8xtm8" (OuterVolumeSpecName: "kube-api-access-8xtm8") pod "3d908280-2284-4dec-b8e3-67aae4d22314" (UID: "3d908280-2284-4dec-b8e3-67aae4d22314"). InnerVolumeSpecName "kube-api-access-8xtm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.927678 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/3d908280-2284-4dec-b8e3-67aae4d22314-pod-info" (OuterVolumeSpecName: "pod-info") pod "3d908280-2284-4dec-b8e3-67aae4d22314" (UID: "3d908280-2284-4dec-b8e3-67aae4d22314"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.939843 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fe14f45b-65e4-4e2b-8b48-bc8134ebe3b6" (OuterVolumeSpecName: "persistence") pod "3d908280-2284-4dec-b8e3-67aae4d22314" (UID: "3d908280-2284-4dec-b8e3-67aae4d22314"). InnerVolumeSpecName "pvc-fe14f45b-65e4-4e2b-8b48-bc8134ebe3b6". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 10 08:04:22 crc kubenswrapper[4825]: I0310 08:04:22.940449 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d908280-2284-4dec-b8e3-67aae4d22314-config-data" (OuterVolumeSpecName: "config-data") pod "3d908280-2284-4dec-b8e3-67aae4d22314" (UID: "3d908280-2284-4dec-b8e3-67aae4d22314"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.023171 4825 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3d908280-2284-4dec-b8e3-67aae4d22314-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.023198 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xtm8\" (UniqueName: \"kubernetes.io/projected/3d908280-2284-4dec-b8e3-67aae4d22314-kube-api-access-8xtm8\") on node \"crc\" DevicePath \"\"" Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.023207 4825 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3d908280-2284-4dec-b8e3-67aae4d22314-pod-info\") on node \"crc\" DevicePath \"\"" Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.023215 4825 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3d908280-2284-4dec-b8e3-67aae4d22314-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.023224 4825 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3d908280-2284-4dec-b8e3-67aae4d22314-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.023233 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/3d908280-2284-4dec-b8e3-67aae4d22314-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.023262 4825 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-fe14f45b-65e4-4e2b-8b48-bc8134ebe3b6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fe14f45b-65e4-4e2b-8b48-bc8134ebe3b6\") on node \"crc\" " Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.023272 4825 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3d908280-2284-4dec-b8e3-67aae4d22314-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.023287 4825 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3d908280-2284-4dec-b8e3-67aae4d22314-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.253338 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55c11e3c-2b81-4e63-87f2-71e18805c676" path="/var/lib/kubelet/pods/55c11e3c-2b81-4e63-87f2-71e18805c676/volumes" Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.256555 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a37c9d85-7482-45de-b47d-378f308feb54" path="/var/lib/kubelet/pods/a37c9d85-7482-45de-b47d-378f308feb54/volumes" Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.295684 4825 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.295870 4825 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-fe14f45b-65e4-4e2b-8b48-bc8134ebe3b6" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fe14f45b-65e4-4e2b-8b48-bc8134ebe3b6") on node "crc" Mar 10 08:04:23 crc kubenswrapper[4825]: W0310 08:04:23.302492 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67f57387_e031_49bf_9895_efa6796a98cd.slice/crio-04289208fd7eb02dd47b49c96def29442f559e3dfc2794b690451a8f9ec6891f WatchSource:0}: Error finding container 04289208fd7eb02dd47b49c96def29442f559e3dfc2794b690451a8f9ec6891f: Status 404 returned error can't find the container with id 04289208fd7eb02dd47b49c96def29442f559e3dfc2794b690451a8f9ec6891f Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.328375 4825 reconciler_common.go:293] "Volume detached for volume \"pvc-fe14f45b-65e4-4e2b-8b48-bc8134ebe3b6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fe14f45b-65e4-4e2b-8b48-bc8134ebe3b6\") on node \"crc\" DevicePath \"\"" Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.431254 4825 generic.go:334] "Generic (PLEG): container finished" podID="3d908280-2284-4dec-b8e3-67aae4d22314" containerID="b4d61c6344a19f965ce9a8fefe891013ab0e02c9e7d0a5bcef488c45b2732e0d" exitCode=0 Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.431354 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.470950 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d908280-2284-4dec-b8e3-67aae4d22314-server-conf" (OuterVolumeSpecName: "server-conf") pod "3d908280-2284-4dec-b8e3-67aae4d22314" (UID: "3d908280-2284-4dec-b8e3-67aae4d22314"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.507048 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d908280-2284-4dec-b8e3-67aae4d22314-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "3d908280-2284-4dec-b8e3-67aae4d22314" (UID: "3d908280-2284-4dec-b8e3-67aae4d22314"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.531269 4825 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3d908280-2284-4dec-b8e3-67aae4d22314-server-conf\") on node \"crc\" DevicePath \"\"" Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.531304 4825 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3d908280-2284-4dec-b8e3-67aae4d22314-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.540957 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3d908280-2284-4dec-b8e3-67aae4d22314","Type":"ContainerDied","Data":"b4d61c6344a19f965ce9a8fefe891013ab0e02c9e7d0a5bcef488c45b2732e0d"} Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.541008 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3d908280-2284-4dec-b8e3-67aae4d22314","Type":"ContainerDied","Data":"ba2c417b869e48f28f1944fab4517af944f2d60eaee6d394ddc70b5bd83d6d0c"} Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.541022 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"67f57387-e031-49bf-9895-efa6796a98cd","Type":"ContainerStarted","Data":"04289208fd7eb02dd47b49c96def29442f559e3dfc2794b690451a8f9ec6891f"} Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 
08:04:23.541040 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.541064 4825 scope.go:117] "RemoveContainer" containerID="b4d61c6344a19f965ce9a8fefe891013ab0e02c9e7d0a5bcef488c45b2732e0d" Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.568894 4825 scope.go:117] "RemoveContainer" containerID="62f2f3f8a0cd389ca5004001736b8dc9ad79906fe8521e2892e77012e16f5b20" Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.595067 4825 scope.go:117] "RemoveContainer" containerID="b4d61c6344a19f965ce9a8fefe891013ab0e02c9e7d0a5bcef488c45b2732e0d" Mar 10 08:04:23 crc kubenswrapper[4825]: E0310 08:04:23.595527 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4d61c6344a19f965ce9a8fefe891013ab0e02c9e7d0a5bcef488c45b2732e0d\": container with ID starting with b4d61c6344a19f965ce9a8fefe891013ab0e02c9e7d0a5bcef488c45b2732e0d not found: ID does not exist" containerID="b4d61c6344a19f965ce9a8fefe891013ab0e02c9e7d0a5bcef488c45b2732e0d" Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.595558 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4d61c6344a19f965ce9a8fefe891013ab0e02c9e7d0a5bcef488c45b2732e0d"} err="failed to get container status \"b4d61c6344a19f965ce9a8fefe891013ab0e02c9e7d0a5bcef488c45b2732e0d\": rpc error: code = NotFound desc = could not find container \"b4d61c6344a19f965ce9a8fefe891013ab0e02c9e7d0a5bcef488c45b2732e0d\": container with ID starting with b4d61c6344a19f965ce9a8fefe891013ab0e02c9e7d0a5bcef488c45b2732e0d not found: ID does not exist" Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.595576 4825 scope.go:117] "RemoveContainer" containerID="62f2f3f8a0cd389ca5004001736b8dc9ad79906fe8521e2892e77012e16f5b20" Mar 10 08:04:23 crc kubenswrapper[4825]: E0310 08:04:23.596082 4825 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"62f2f3f8a0cd389ca5004001736b8dc9ad79906fe8521e2892e77012e16f5b20\": container with ID starting with 62f2f3f8a0cd389ca5004001736b8dc9ad79906fe8521e2892e77012e16f5b20 not found: ID does not exist" containerID="62f2f3f8a0cd389ca5004001736b8dc9ad79906fe8521e2892e77012e16f5b20" Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.596158 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62f2f3f8a0cd389ca5004001736b8dc9ad79906fe8521e2892e77012e16f5b20"} err="failed to get container status \"62f2f3f8a0cd389ca5004001736b8dc9ad79906fe8521e2892e77012e16f5b20\": rpc error: code = NotFound desc = could not find container \"62f2f3f8a0cd389ca5004001736b8dc9ad79906fe8521e2892e77012e16f5b20\": container with ID starting with 62f2f3f8a0cd389ca5004001736b8dc9ad79906fe8521e2892e77012e16f5b20 not found: ID does not exist" Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.775376 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.790846 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.799469 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 08:04:23 crc kubenswrapper[4825]: E0310 08:04:23.799823 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d908280-2284-4dec-b8e3-67aae4d22314" containerName="rabbitmq" Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.799843 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d908280-2284-4dec-b8e3-67aae4d22314" containerName="rabbitmq" Mar 10 08:04:23 crc kubenswrapper[4825]: E0310 08:04:23.799863 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d908280-2284-4dec-b8e3-67aae4d22314" 
containerName="setup-container" Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.799872 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d908280-2284-4dec-b8e3-67aae4d22314" containerName="setup-container" Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.800037 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d908280-2284-4dec-b8e3-67aae4d22314" containerName="rabbitmq" Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.800982 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.803803 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.803952 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.803925 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.804059 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-vvcrg" Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.804093 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.803924 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.804599 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.831204 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 08:04:23 crc 
kubenswrapper[4825]: I0310 08:04:23.937289 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.937335 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.937357 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gx8q\" (UniqueName: \"kubernetes.io/projected/b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a-kube-api-access-9gx8q\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.937383 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.937407 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.937453 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.937468 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.937610 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-fe14f45b-65e4-4e2b-8b48-bc8134ebe3b6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fe14f45b-65e4-4e2b-8b48-bc8134ebe3b6\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.937689 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.937771 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:04:23 crc kubenswrapper[4825]: I0310 08:04:23.937823 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:04:24 crc kubenswrapper[4825]: I0310 08:04:24.038693 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:04:24 crc kubenswrapper[4825]: I0310 08:04:24.038757 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:04:24 crc kubenswrapper[4825]: I0310 08:04:24.038776 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:04:24 crc kubenswrapper[4825]: I0310 08:04:24.038826 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 
08:04:24 crc kubenswrapper[4825]: I0310 08:04:24.038843 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:04:24 crc kubenswrapper[4825]: I0310 08:04:24.038860 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gx8q\" (UniqueName: \"kubernetes.io/projected/b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a-kube-api-access-9gx8q\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:04:24 crc kubenswrapper[4825]: I0310 08:04:24.038880 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:04:24 crc kubenswrapper[4825]: I0310 08:04:24.038901 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:04:24 crc kubenswrapper[4825]: I0310 08:04:24.038931 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:04:24 crc kubenswrapper[4825]: I0310 08:04:24.038945 4825 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:04:24 crc kubenswrapper[4825]: I0310 08:04:24.038978 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-fe14f45b-65e4-4e2b-8b48-bc8134ebe3b6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fe14f45b-65e4-4e2b-8b48-bc8134ebe3b6\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:04:24 crc kubenswrapper[4825]: I0310 08:04:24.040383 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:04:24 crc kubenswrapper[4825]: I0310 08:04:24.040623 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:04:24 crc kubenswrapper[4825]: I0310 08:04:24.043241 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:04:24 crc kubenswrapper[4825]: I0310 08:04:24.043670 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" 
(UniqueName: \"kubernetes.io/empty-dir/b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:04:24 crc kubenswrapper[4825]: I0310 08:04:24.043747 4825 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 10 08:04:24 crc kubenswrapper[4825]: I0310 08:04:24.044166 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-fe14f45b-65e4-4e2b-8b48-bc8134ebe3b6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fe14f45b-65e4-4e2b-8b48-bc8134ebe3b6\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1635cd019ed9056df86878228b36b513f8c17626caeab9d1ed75370112b35fe1/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:04:24 crc kubenswrapper[4825]: I0310 08:04:24.044314 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:04:24 crc kubenswrapper[4825]: I0310 08:04:24.044467 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:04:24 crc kubenswrapper[4825]: I0310 08:04:24.044756 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a-rabbitmq-confd\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:04:24 crc kubenswrapper[4825]: I0310 08:04:24.044961 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:04:24 crc kubenswrapper[4825]: I0310 08:04:24.046556 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:04:24 crc kubenswrapper[4825]: I0310 08:04:24.057570 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gx8q\" (UniqueName: \"kubernetes.io/projected/b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a-kube-api-access-9gx8q\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:04:24 crc kubenswrapper[4825]: I0310 08:04:24.090844 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-fe14f45b-65e4-4e2b-8b48-bc8134ebe3b6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fe14f45b-65e4-4e2b-8b48-bc8134ebe3b6\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:04:24 crc kubenswrapper[4825]: I0310 08:04:24.130013 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:04:24 crc kubenswrapper[4825]: W0310 08:04:24.558593 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9e3f0a6_4a2e_4363_ab86_a4ac4ad4b65a.slice/crio-c29468b12b5bbdb4c93828cb0071660d26edc8bede5e739a4baef848b93c5ab4 WatchSource:0}: Error finding container c29468b12b5bbdb4c93828cb0071660d26edc8bede5e739a4baef848b93c5ab4: Status 404 returned error can't find the container with id c29468b12b5bbdb4c93828cb0071660d26edc8bede5e739a4baef848b93c5ab4 Mar 10 08:04:24 crc kubenswrapper[4825]: I0310 08:04:24.559478 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 08:04:25 crc kubenswrapper[4825]: I0310 08:04:25.248605 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d908280-2284-4dec-b8e3-67aae4d22314" path="/var/lib/kubelet/pods/3d908280-2284-4dec-b8e3-67aae4d22314/volumes" Mar 10 08:04:25 crc kubenswrapper[4825]: I0310 08:04:25.461913 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"67f57387-e031-49bf-9895-efa6796a98cd","Type":"ContainerStarted","Data":"8efe5b038995e87e3294c07e98dd3a9a22fb1be2733d5b7e3cc4b25454d03ff0"} Mar 10 08:04:25 crc kubenswrapper[4825]: I0310 08:04:25.464072 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a","Type":"ContainerStarted","Data":"c29468b12b5bbdb4c93828cb0071660d26edc8bede5e739a4baef848b93c5ab4"} Mar 10 08:04:26 crc kubenswrapper[4825]: I0310 08:04:26.474657 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a","Type":"ContainerStarted","Data":"19128595f791b8c6010558196abe6f9054b5be683d8e93a9d56d1bd95ec6c34e"} Mar 10 08:04:34 crc kubenswrapper[4825]: I0310 08:04:34.236243 
4825 scope.go:117] "RemoveContainer" containerID="856a46518228770fd6354414263218ba580cad9f6bf09966c52a899f2397881c" Mar 10 08:04:34 crc kubenswrapper[4825]: E0310 08:04:34.236921 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:04:40 crc kubenswrapper[4825]: I0310 08:04:40.497309 4825 scope.go:117] "RemoveContainer" containerID="d71a886e8be783f1a2c88c9aa8813de1e5e30017d829cfc788f8fd2ab4f3a798" Mar 10 08:04:45 crc kubenswrapper[4825]: I0310 08:04:45.237477 4825 scope.go:117] "RemoveContainer" containerID="856a46518228770fd6354414263218ba580cad9f6bf09966c52a899f2397881c" Mar 10 08:04:45 crc kubenswrapper[4825]: E0310 08:04:45.238364 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:04:56 crc kubenswrapper[4825]: I0310 08:04:56.235983 4825 scope.go:117] "RemoveContainer" containerID="856a46518228770fd6354414263218ba580cad9f6bf09966c52a899f2397881c" Mar 10 08:04:56 crc kubenswrapper[4825]: E0310 08:04:56.236871 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:04:57 crc kubenswrapper[4825]: I0310 08:04:57.733466 4825 generic.go:334] "Generic (PLEG): container finished" podID="67f57387-e031-49bf-9895-efa6796a98cd" containerID="8efe5b038995e87e3294c07e98dd3a9a22fb1be2733d5b7e3cc4b25454d03ff0" exitCode=0 Mar 10 08:04:57 crc kubenswrapper[4825]: I0310 08:04:57.733594 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"67f57387-e031-49bf-9895-efa6796a98cd","Type":"ContainerDied","Data":"8efe5b038995e87e3294c07e98dd3a9a22fb1be2733d5b7e3cc4b25454d03ff0"} Mar 10 08:04:58 crc kubenswrapper[4825]: I0310 08:04:58.743604 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"67f57387-e031-49bf-9895-efa6796a98cd","Type":"ContainerStarted","Data":"02f62b97be9783d48ec7612a4d0a2ec5dd297e4751ea330d9e186c6820079d4c"} Mar 10 08:04:58 crc kubenswrapper[4825]: I0310 08:04:58.744255 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 10 08:04:58 crc kubenswrapper[4825]: I0310 08:04:58.767042 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.767013232 podStartE2EDuration="36.767013232s" podCreationTimestamp="2026-03-10 08:04:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:04:58.766072497 +0000 UTC m=+4851.795853132" watchObservedRunningTime="2026-03-10 08:04:58.767013232 +0000 UTC m=+4851.796793867" Mar 10 08:04:59 crc kubenswrapper[4825]: I0310 08:04:59.752739 4825 generic.go:334] "Generic (PLEG): container finished" podID="b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a" 
containerID="19128595f791b8c6010558196abe6f9054b5be683d8e93a9d56d1bd95ec6c34e" exitCode=0 Mar 10 08:04:59 crc kubenswrapper[4825]: I0310 08:04:59.752823 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a","Type":"ContainerDied","Data":"19128595f791b8c6010558196abe6f9054b5be683d8e93a9d56d1bd95ec6c34e"} Mar 10 08:05:00 crc kubenswrapper[4825]: I0310 08:05:00.761721 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a","Type":"ContainerStarted","Data":"10125015b27d5f331462c28bac19a993a299cf09e279bf2852c13c3f26cdc978"} Mar 10 08:05:00 crc kubenswrapper[4825]: I0310 08:05:00.762488 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:05:00 crc kubenswrapper[4825]: I0310 08:05:00.784501 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.784479246 podStartE2EDuration="37.784479246s" podCreationTimestamp="2026-03-10 08:04:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:05:00.780554963 +0000 UTC m=+4853.810335598" watchObservedRunningTime="2026-03-10 08:05:00.784479246 +0000 UTC m=+4853.814259861" Mar 10 08:05:09 crc kubenswrapper[4825]: I0310 08:05:09.248976 4825 scope.go:117] "RemoveContainer" containerID="856a46518228770fd6354414263218ba580cad9f6bf09966c52a899f2397881c" Mar 10 08:05:09 crc kubenswrapper[4825]: E0310 08:05:09.250006 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:05:12 crc kubenswrapper[4825]: I0310 08:05:12.850437 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 10 08:05:13 crc kubenswrapper[4825]: I0310 08:05:13.763833 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-phcz7"] Mar 10 08:05:13 crc kubenswrapper[4825]: I0310 08:05:13.766736 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-phcz7" Mar 10 08:05:13 crc kubenswrapper[4825]: I0310 08:05:13.795085 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-phcz7"] Mar 10 08:05:13 crc kubenswrapper[4825]: I0310 08:05:13.940935 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pslk9\" (UniqueName: \"kubernetes.io/projected/a58949ad-caaf-4489-9a32-26526aa44d53-kube-api-access-pslk9\") pod \"redhat-operators-phcz7\" (UID: \"a58949ad-caaf-4489-9a32-26526aa44d53\") " pod="openshift-marketplace/redhat-operators-phcz7" Mar 10 08:05:13 crc kubenswrapper[4825]: I0310 08:05:13.941304 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a58949ad-caaf-4489-9a32-26526aa44d53-catalog-content\") pod \"redhat-operators-phcz7\" (UID: \"a58949ad-caaf-4489-9a32-26526aa44d53\") " pod="openshift-marketplace/redhat-operators-phcz7" Mar 10 08:05:13 crc kubenswrapper[4825]: I0310 08:05:13.941352 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/a58949ad-caaf-4489-9a32-26526aa44d53-utilities\") pod \"redhat-operators-phcz7\" (UID: \"a58949ad-caaf-4489-9a32-26526aa44d53\") " pod="openshift-marketplace/redhat-operators-phcz7" Mar 10 08:05:14 crc kubenswrapper[4825]: I0310 08:05:14.042760 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a58949ad-caaf-4489-9a32-26526aa44d53-catalog-content\") pod \"redhat-operators-phcz7\" (UID: \"a58949ad-caaf-4489-9a32-26526aa44d53\") " pod="openshift-marketplace/redhat-operators-phcz7" Mar 10 08:05:14 crc kubenswrapper[4825]: I0310 08:05:14.042839 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a58949ad-caaf-4489-9a32-26526aa44d53-utilities\") pod \"redhat-operators-phcz7\" (UID: \"a58949ad-caaf-4489-9a32-26526aa44d53\") " pod="openshift-marketplace/redhat-operators-phcz7" Mar 10 08:05:14 crc kubenswrapper[4825]: I0310 08:05:14.042872 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pslk9\" (UniqueName: \"kubernetes.io/projected/a58949ad-caaf-4489-9a32-26526aa44d53-kube-api-access-pslk9\") pod \"redhat-operators-phcz7\" (UID: \"a58949ad-caaf-4489-9a32-26526aa44d53\") " pod="openshift-marketplace/redhat-operators-phcz7" Mar 10 08:05:14 crc kubenswrapper[4825]: I0310 08:05:14.043406 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a58949ad-caaf-4489-9a32-26526aa44d53-catalog-content\") pod \"redhat-operators-phcz7\" (UID: \"a58949ad-caaf-4489-9a32-26526aa44d53\") " pod="openshift-marketplace/redhat-operators-phcz7" Mar 10 08:05:14 crc kubenswrapper[4825]: I0310 08:05:14.043426 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/a58949ad-caaf-4489-9a32-26526aa44d53-utilities\") pod \"redhat-operators-phcz7\" (UID: \"a58949ad-caaf-4489-9a32-26526aa44d53\") " pod="openshift-marketplace/redhat-operators-phcz7" Mar 10 08:05:14 crc kubenswrapper[4825]: I0310 08:05:14.064765 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pslk9\" (UniqueName: \"kubernetes.io/projected/a58949ad-caaf-4489-9a32-26526aa44d53-kube-api-access-pslk9\") pod \"redhat-operators-phcz7\" (UID: \"a58949ad-caaf-4489-9a32-26526aa44d53\") " pod="openshift-marketplace/redhat-operators-phcz7" Mar 10 08:05:14 crc kubenswrapper[4825]: I0310 08:05:14.133324 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 10 08:05:14 crc kubenswrapper[4825]: I0310 08:05:14.143525 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-phcz7" Mar 10 08:05:14 crc kubenswrapper[4825]: I0310 08:05:14.638494 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-phcz7"] Mar 10 08:05:14 crc kubenswrapper[4825]: I0310 08:05:14.889457 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phcz7" event={"ID":"a58949ad-caaf-4489-9a32-26526aa44d53","Type":"ContainerStarted","Data":"90ee46310b8d797d2482bb50f2637140edcf3ac58b5c23a024401929746e33ce"} Mar 10 08:05:15 crc kubenswrapper[4825]: I0310 08:05:15.899894 4825 generic.go:334] "Generic (PLEG): container finished" podID="a58949ad-caaf-4489-9a32-26526aa44d53" containerID="7c40fb928fa8202de02fa35e46044c9f71fac379f2951cde434b12dfc41a83d0" exitCode=0 Mar 10 08:05:15 crc kubenswrapper[4825]: I0310 08:05:15.899985 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phcz7" 
event={"ID":"a58949ad-caaf-4489-9a32-26526aa44d53","Type":"ContainerDied","Data":"7c40fb928fa8202de02fa35e46044c9f71fac379f2951cde434b12dfc41a83d0"} Mar 10 08:05:16 crc kubenswrapper[4825]: I0310 08:05:16.912648 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phcz7" event={"ID":"a58949ad-caaf-4489-9a32-26526aa44d53","Type":"ContainerStarted","Data":"a86fbd13604e25ee67e5b74d25d6e1be9dc61eaaf854ff8017d1c4d64c75a56c"} Mar 10 08:05:17 crc kubenswrapper[4825]: I0310 08:05:17.052394 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Mar 10 08:05:17 crc kubenswrapper[4825]: I0310 08:05:17.053925 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 10 08:05:17 crc kubenswrapper[4825]: I0310 08:05:17.056729 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-grwd4" Mar 10 08:05:17 crc kubenswrapper[4825]: I0310 08:05:17.065769 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 10 08:05:17 crc kubenswrapper[4825]: I0310 08:05:17.200991 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svft5\" (UniqueName: \"kubernetes.io/projected/2ea48178-1db2-4c38-9713-bab5bbe1aac5-kube-api-access-svft5\") pod \"mariadb-client\" (UID: \"2ea48178-1db2-4c38-9713-bab5bbe1aac5\") " pod="openstack/mariadb-client" Mar 10 08:05:17 crc kubenswrapper[4825]: I0310 08:05:17.302875 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svft5\" (UniqueName: \"kubernetes.io/projected/2ea48178-1db2-4c38-9713-bab5bbe1aac5-kube-api-access-svft5\") pod \"mariadb-client\" (UID: \"2ea48178-1db2-4c38-9713-bab5bbe1aac5\") " pod="openstack/mariadb-client" Mar 10 08:05:17 crc kubenswrapper[4825]: I0310 08:05:17.334464 4825 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-svft5\" (UniqueName: \"kubernetes.io/projected/2ea48178-1db2-4c38-9713-bab5bbe1aac5-kube-api-access-svft5\") pod \"mariadb-client\" (UID: \"2ea48178-1db2-4c38-9713-bab5bbe1aac5\") " pod="openstack/mariadb-client" Mar 10 08:05:17 crc kubenswrapper[4825]: I0310 08:05:17.396078 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 10 08:05:17 crc kubenswrapper[4825]: I0310 08:05:17.693974 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 10 08:05:17 crc kubenswrapper[4825]: I0310 08:05:17.921913 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"2ea48178-1db2-4c38-9713-bab5bbe1aac5","Type":"ContainerStarted","Data":"067ebeda8b65d68cb1eff5b6b308d790be7890562978cc0757932fb45c26b7e3"} Mar 10 08:05:17 crc kubenswrapper[4825]: I0310 08:05:17.923908 4825 generic.go:334] "Generic (PLEG): container finished" podID="a58949ad-caaf-4489-9a32-26526aa44d53" containerID="a86fbd13604e25ee67e5b74d25d6e1be9dc61eaaf854ff8017d1c4d64c75a56c" exitCode=0 Mar 10 08:05:17 crc kubenswrapper[4825]: I0310 08:05:17.923949 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phcz7" event={"ID":"a58949ad-caaf-4489-9a32-26526aa44d53","Type":"ContainerDied","Data":"a86fbd13604e25ee67e5b74d25d6e1be9dc61eaaf854ff8017d1c4d64c75a56c"} Mar 10 08:05:18 crc kubenswrapper[4825]: I0310 08:05:18.937531 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"2ea48178-1db2-4c38-9713-bab5bbe1aac5","Type":"ContainerStarted","Data":"356be41132a342e47c97839b7387ca8cc994efe9cfd1c097ef28ac7dcc5aa511"} Mar 10 08:05:18 crc kubenswrapper[4825]: I0310 08:05:18.940726 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phcz7" 
event={"ID":"a58949ad-caaf-4489-9a32-26526aa44d53","Type":"ContainerStarted","Data":"b2aeaadf0c56d86df7caa1cf03566bc8ccbdcd001bbf9e06743a9f9dc09e1f0c"} Mar 10 08:05:18 crc kubenswrapper[4825]: I0310 08:05:18.964420 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client" podStartSLOduration=1.476112471 podStartE2EDuration="1.96439496s" podCreationTimestamp="2026-03-10 08:05:17 +0000 UTC" firstStartedPulling="2026-03-10 08:05:17.697550458 +0000 UTC m=+4870.727331073" lastFinishedPulling="2026-03-10 08:05:18.185832947 +0000 UTC m=+4871.215613562" observedRunningTime="2026-03-10 08:05:18.953544996 +0000 UTC m=+4871.983325641" watchObservedRunningTime="2026-03-10 08:05:18.96439496 +0000 UTC m=+4871.994175615" Mar 10 08:05:18 crc kubenswrapper[4825]: I0310 08:05:18.980186 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-phcz7" podStartSLOduration=3.477097934 podStartE2EDuration="5.980163814s" podCreationTimestamp="2026-03-10 08:05:13 +0000 UTC" firstStartedPulling="2026-03-10 08:05:15.908149609 +0000 UTC m=+4868.937930214" lastFinishedPulling="2026-03-10 08:05:18.411215449 +0000 UTC m=+4871.440996094" observedRunningTime="2026-03-10 08:05:18.97733467 +0000 UTC m=+4872.007115305" watchObservedRunningTime="2026-03-10 08:05:18.980163814 +0000 UTC m=+4872.009944469" Mar 10 08:05:20 crc kubenswrapper[4825]: I0310 08:05:20.236240 4825 scope.go:117] "RemoveContainer" containerID="856a46518228770fd6354414263218ba580cad9f6bf09966c52a899f2397881c" Mar 10 08:05:20 crc kubenswrapper[4825]: E0310 08:05:20.236661 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:05:23 crc kubenswrapper[4825]: E0310 08:05:23.272711 4825 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.222:33264->38.102.83.222:42289: read tcp 38.102.83.222:33264->38.102.83.222:42289: read: connection reset by peer Mar 10 08:05:24 crc kubenswrapper[4825]: I0310 08:05:24.144157 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-phcz7" Mar 10 08:05:24 crc kubenswrapper[4825]: I0310 08:05:24.144599 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-phcz7" Mar 10 08:05:25 crc kubenswrapper[4825]: I0310 08:05:25.194412 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-phcz7" podUID="a58949ad-caaf-4489-9a32-26526aa44d53" containerName="registry-server" probeResult="failure" output=< Mar 10 08:05:25 crc kubenswrapper[4825]: timeout: failed to connect service ":50051" within 1s Mar 10 08:05:25 crc kubenswrapper[4825]: > Mar 10 08:05:31 crc kubenswrapper[4825]: I0310 08:05:31.523960 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 10 08:05:31 crc kubenswrapper[4825]: I0310 08:05:31.524804 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-client" podUID="2ea48178-1db2-4c38-9713-bab5bbe1aac5" containerName="mariadb-client" containerID="cri-o://356be41132a342e47c97839b7387ca8cc994efe9cfd1c097ef28ac7dcc5aa511" gracePeriod=30 Mar 10 08:05:32 crc kubenswrapper[4825]: I0310 08:05:32.084465 4825 generic.go:334] "Generic (PLEG): container finished" podID="2ea48178-1db2-4c38-9713-bab5bbe1aac5" containerID="356be41132a342e47c97839b7387ca8cc994efe9cfd1c097ef28ac7dcc5aa511" exitCode=143 Mar 10 08:05:32 crc kubenswrapper[4825]: I0310 08:05:32.084514 4825 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"2ea48178-1db2-4c38-9713-bab5bbe1aac5","Type":"ContainerDied","Data":"356be41132a342e47c97839b7387ca8cc994efe9cfd1c097ef28ac7dcc5aa511"}
Mar 10 08:05:32 crc kubenswrapper[4825]: I0310 08:05:32.282116 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Mar 10 08:05:32 crc kubenswrapper[4825]: I0310 08:05:32.348417 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svft5\" (UniqueName: \"kubernetes.io/projected/2ea48178-1db2-4c38-9713-bab5bbe1aac5-kube-api-access-svft5\") pod \"2ea48178-1db2-4c38-9713-bab5bbe1aac5\" (UID: \"2ea48178-1db2-4c38-9713-bab5bbe1aac5\") "
Mar 10 08:05:32 crc kubenswrapper[4825]: I0310 08:05:32.357545 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ea48178-1db2-4c38-9713-bab5bbe1aac5-kube-api-access-svft5" (OuterVolumeSpecName: "kube-api-access-svft5") pod "2ea48178-1db2-4c38-9713-bab5bbe1aac5" (UID: "2ea48178-1db2-4c38-9713-bab5bbe1aac5"). InnerVolumeSpecName "kube-api-access-svft5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 08:05:32 crc kubenswrapper[4825]: I0310 08:05:32.450653 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svft5\" (UniqueName: \"kubernetes.io/projected/2ea48178-1db2-4c38-9713-bab5bbe1aac5-kube-api-access-svft5\") on node \"crc\" DevicePath \"\""
Mar 10 08:05:33 crc kubenswrapper[4825]: I0310 08:05:33.098227 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"2ea48178-1db2-4c38-9713-bab5bbe1aac5","Type":"ContainerDied","Data":"067ebeda8b65d68cb1eff5b6b308d790be7890562978cc0757932fb45c26b7e3"}
Mar 10 08:05:33 crc kubenswrapper[4825]: I0310 08:05:33.098337 4825 scope.go:117] "RemoveContainer" containerID="356be41132a342e47c97839b7387ca8cc994efe9cfd1c097ef28ac7dcc5aa511"
Mar 10 08:05:33 crc kubenswrapper[4825]: I0310 08:05:33.098353 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Mar 10 08:05:33 crc kubenswrapper[4825]: I0310 08:05:33.165353 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"]
Mar 10 08:05:33 crc kubenswrapper[4825]: I0310 08:05:33.177837 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"]
Mar 10 08:05:33 crc kubenswrapper[4825]: I0310 08:05:33.266521 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ea48178-1db2-4c38-9713-bab5bbe1aac5" path="/var/lib/kubelet/pods/2ea48178-1db2-4c38-9713-bab5bbe1aac5/volumes"
Mar 10 08:05:34 crc kubenswrapper[4825]: I0310 08:05:34.205056 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-phcz7"
Mar 10 08:05:34 crc kubenswrapper[4825]: I0310 08:05:34.258181 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-phcz7"
Mar 10 08:05:34 crc kubenswrapper[4825]: I0310 08:05:34.442206 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-phcz7"]
Mar 10 08:05:35 crc kubenswrapper[4825]: I0310 08:05:35.237817 4825 scope.go:117] "RemoveContainer" containerID="856a46518228770fd6354414263218ba580cad9f6bf09966c52a899f2397881c"
Mar 10 08:05:35 crc kubenswrapper[4825]: E0310 08:05:35.238191 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3"
Mar 10 08:05:36 crc kubenswrapper[4825]: I0310 08:05:36.127466 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-phcz7" podUID="a58949ad-caaf-4489-9a32-26526aa44d53" containerName="registry-server" containerID="cri-o://b2aeaadf0c56d86df7caa1cf03566bc8ccbdcd001bbf9e06743a9f9dc09e1f0c" gracePeriod=2
Mar 10 08:05:36 crc kubenswrapper[4825]: I0310 08:05:36.555076 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-phcz7"
Mar 10 08:05:36 crc kubenswrapper[4825]: I0310 08:05:36.620750 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a58949ad-caaf-4489-9a32-26526aa44d53-utilities\") pod \"a58949ad-caaf-4489-9a32-26526aa44d53\" (UID: \"a58949ad-caaf-4489-9a32-26526aa44d53\") "
Mar 10 08:05:36 crc kubenswrapper[4825]: I0310 08:05:36.620792 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pslk9\" (UniqueName: \"kubernetes.io/projected/a58949ad-caaf-4489-9a32-26526aa44d53-kube-api-access-pslk9\") pod \"a58949ad-caaf-4489-9a32-26526aa44d53\" (UID: \"a58949ad-caaf-4489-9a32-26526aa44d53\") "
Mar 10 08:05:36 crc kubenswrapper[4825]: I0310 08:05:36.620856 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a58949ad-caaf-4489-9a32-26526aa44d53-catalog-content\") pod \"a58949ad-caaf-4489-9a32-26526aa44d53\" (UID: \"a58949ad-caaf-4489-9a32-26526aa44d53\") "
Mar 10 08:05:36 crc kubenswrapper[4825]: I0310 08:05:36.622044 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a58949ad-caaf-4489-9a32-26526aa44d53-utilities" (OuterVolumeSpecName: "utilities") pod "a58949ad-caaf-4489-9a32-26526aa44d53" (UID: "a58949ad-caaf-4489-9a32-26526aa44d53"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 08:05:36 crc kubenswrapper[4825]: I0310 08:05:36.647669 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a58949ad-caaf-4489-9a32-26526aa44d53-kube-api-access-pslk9" (OuterVolumeSpecName: "kube-api-access-pslk9") pod "a58949ad-caaf-4489-9a32-26526aa44d53" (UID: "a58949ad-caaf-4489-9a32-26526aa44d53"). InnerVolumeSpecName "kube-api-access-pslk9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 08:05:36 crc kubenswrapper[4825]: I0310 08:05:36.723165 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a58949ad-caaf-4489-9a32-26526aa44d53-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 08:05:36 crc kubenswrapper[4825]: I0310 08:05:36.723207 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pslk9\" (UniqueName: \"kubernetes.io/projected/a58949ad-caaf-4489-9a32-26526aa44d53-kube-api-access-pslk9\") on node \"crc\" DevicePath \"\""
Mar 10 08:05:36 crc kubenswrapper[4825]: I0310 08:05:36.767590 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a58949ad-caaf-4489-9a32-26526aa44d53-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a58949ad-caaf-4489-9a32-26526aa44d53" (UID: "a58949ad-caaf-4489-9a32-26526aa44d53"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 08:05:36 crc kubenswrapper[4825]: I0310 08:05:36.824380 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a58949ad-caaf-4489-9a32-26526aa44d53-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 08:05:37 crc kubenswrapper[4825]: I0310 08:05:37.138622 4825 generic.go:334] "Generic (PLEG): container finished" podID="a58949ad-caaf-4489-9a32-26526aa44d53" containerID="b2aeaadf0c56d86df7caa1cf03566bc8ccbdcd001bbf9e06743a9f9dc09e1f0c" exitCode=0
Mar 10 08:05:37 crc kubenswrapper[4825]: I0310 08:05:37.138688 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phcz7" event={"ID":"a58949ad-caaf-4489-9a32-26526aa44d53","Type":"ContainerDied","Data":"b2aeaadf0c56d86df7caa1cf03566bc8ccbdcd001bbf9e06743a9f9dc09e1f0c"}
Mar 10 08:05:37 crc kubenswrapper[4825]: I0310 08:05:37.138723 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phcz7" event={"ID":"a58949ad-caaf-4489-9a32-26526aa44d53","Type":"ContainerDied","Data":"90ee46310b8d797d2482bb50f2637140edcf3ac58b5c23a024401929746e33ce"}
Mar 10 08:05:37 crc kubenswrapper[4825]: I0310 08:05:37.138749 4825 scope.go:117] "RemoveContainer" containerID="b2aeaadf0c56d86df7caa1cf03566bc8ccbdcd001bbf9e06743a9f9dc09e1f0c"
Mar 10 08:05:37 crc kubenswrapper[4825]: I0310 08:05:37.138886 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-phcz7"
Mar 10 08:05:37 crc kubenswrapper[4825]: I0310 08:05:37.165226 4825 scope.go:117] "RemoveContainer" containerID="a86fbd13604e25ee67e5b74d25d6e1be9dc61eaaf854ff8017d1c4d64c75a56c"
Mar 10 08:05:37 crc kubenswrapper[4825]: I0310 08:05:37.187293 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-phcz7"]
Mar 10 08:05:37 crc kubenswrapper[4825]: I0310 08:05:37.190958 4825 scope.go:117] "RemoveContainer" containerID="7c40fb928fa8202de02fa35e46044c9f71fac379f2951cde434b12dfc41a83d0"
Mar 10 08:05:37 crc kubenswrapper[4825]: I0310 08:05:37.191932 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-phcz7"]
Mar 10 08:05:37 crc kubenswrapper[4825]: I0310 08:05:37.229936 4825 scope.go:117] "RemoveContainer" containerID="b2aeaadf0c56d86df7caa1cf03566bc8ccbdcd001bbf9e06743a9f9dc09e1f0c"
Mar 10 08:05:37 crc kubenswrapper[4825]: E0310 08:05:37.230375 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2aeaadf0c56d86df7caa1cf03566bc8ccbdcd001bbf9e06743a9f9dc09e1f0c\": container with ID starting with b2aeaadf0c56d86df7caa1cf03566bc8ccbdcd001bbf9e06743a9f9dc09e1f0c not found: ID does not exist" containerID="b2aeaadf0c56d86df7caa1cf03566bc8ccbdcd001bbf9e06743a9f9dc09e1f0c"
Mar 10 08:05:37 crc kubenswrapper[4825]: I0310 08:05:37.230407 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2aeaadf0c56d86df7caa1cf03566bc8ccbdcd001bbf9e06743a9f9dc09e1f0c"} err="failed to get container status \"b2aeaadf0c56d86df7caa1cf03566bc8ccbdcd001bbf9e06743a9f9dc09e1f0c\": rpc error: code = NotFound desc = could not find container \"b2aeaadf0c56d86df7caa1cf03566bc8ccbdcd001bbf9e06743a9f9dc09e1f0c\": container with ID starting with b2aeaadf0c56d86df7caa1cf03566bc8ccbdcd001bbf9e06743a9f9dc09e1f0c not found: ID does not exist"
Mar 10 08:05:37 crc kubenswrapper[4825]: I0310 08:05:37.230457 4825 scope.go:117] "RemoveContainer" containerID="a86fbd13604e25ee67e5b74d25d6e1be9dc61eaaf854ff8017d1c4d64c75a56c"
Mar 10 08:05:37 crc kubenswrapper[4825]: E0310 08:05:37.230826 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a86fbd13604e25ee67e5b74d25d6e1be9dc61eaaf854ff8017d1c4d64c75a56c\": container with ID starting with a86fbd13604e25ee67e5b74d25d6e1be9dc61eaaf854ff8017d1c4d64c75a56c not found: ID does not exist" containerID="a86fbd13604e25ee67e5b74d25d6e1be9dc61eaaf854ff8017d1c4d64c75a56c"
Mar 10 08:05:37 crc kubenswrapper[4825]: I0310 08:05:37.230876 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a86fbd13604e25ee67e5b74d25d6e1be9dc61eaaf854ff8017d1c4d64c75a56c"} err="failed to get container status \"a86fbd13604e25ee67e5b74d25d6e1be9dc61eaaf854ff8017d1c4d64c75a56c\": rpc error: code = NotFound desc = could not find container \"a86fbd13604e25ee67e5b74d25d6e1be9dc61eaaf854ff8017d1c4d64c75a56c\": container with ID starting with a86fbd13604e25ee67e5b74d25d6e1be9dc61eaaf854ff8017d1c4d64c75a56c not found: ID does not exist"
Mar 10 08:05:37 crc kubenswrapper[4825]: I0310 08:05:37.230965 4825 scope.go:117] "RemoveContainer" containerID="7c40fb928fa8202de02fa35e46044c9f71fac379f2951cde434b12dfc41a83d0"
Mar 10 08:05:37 crc kubenswrapper[4825]: E0310 08:05:37.231375 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c40fb928fa8202de02fa35e46044c9f71fac379f2951cde434b12dfc41a83d0\": container with ID starting with 7c40fb928fa8202de02fa35e46044c9f71fac379f2951cde434b12dfc41a83d0 not found: ID does not exist" containerID="7c40fb928fa8202de02fa35e46044c9f71fac379f2951cde434b12dfc41a83d0"
Mar 10 08:05:37 crc kubenswrapper[4825]: I0310 08:05:37.231427 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c40fb928fa8202de02fa35e46044c9f71fac379f2951cde434b12dfc41a83d0"} err="failed to get container status \"7c40fb928fa8202de02fa35e46044c9f71fac379f2951cde434b12dfc41a83d0\": rpc error: code = NotFound desc = could not find container \"7c40fb928fa8202de02fa35e46044c9f71fac379f2951cde434b12dfc41a83d0\": container with ID starting with 7c40fb928fa8202de02fa35e46044c9f71fac379f2951cde434b12dfc41a83d0 not found: ID does not exist"
Mar 10 08:05:37 crc kubenswrapper[4825]: I0310 08:05:37.245324 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a58949ad-caaf-4489-9a32-26526aa44d53" path="/var/lib/kubelet/pods/a58949ad-caaf-4489-9a32-26526aa44d53/volumes"
Mar 10 08:05:48 crc kubenswrapper[4825]: I0310 08:05:48.237343 4825 scope.go:117] "RemoveContainer" containerID="856a46518228770fd6354414263218ba580cad9f6bf09966c52a899f2397881c"
Mar 10 08:05:48 crc kubenswrapper[4825]: E0310 08:05:48.238766 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3"
Mar 10 08:06:00 crc kubenswrapper[4825]: I0310 08:06:00.171315 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552166-cjdxc"]
Mar 10 08:06:00 crc kubenswrapper[4825]: E0310 08:06:00.173350 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a58949ad-caaf-4489-9a32-26526aa44d53" containerName="registry-server"
Mar 10 08:06:00 crc kubenswrapper[4825]: I0310 08:06:00.173415 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a58949ad-caaf-4489-9a32-26526aa44d53" containerName="registry-server"
Mar 10 08:06:00 crc kubenswrapper[4825]: E0310 08:06:00.173459 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ea48178-1db2-4c38-9713-bab5bbe1aac5" containerName="mariadb-client"
Mar 10 08:06:00 crc kubenswrapper[4825]: I0310 08:06:00.173480 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ea48178-1db2-4c38-9713-bab5bbe1aac5" containerName="mariadb-client"
Mar 10 08:06:00 crc kubenswrapper[4825]: E0310 08:06:00.173544 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a58949ad-caaf-4489-9a32-26526aa44d53" containerName="extract-utilities"
Mar 10 08:06:00 crc kubenswrapper[4825]: I0310 08:06:00.173565 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a58949ad-caaf-4489-9a32-26526aa44d53" containerName="extract-utilities"
Mar 10 08:06:00 crc kubenswrapper[4825]: E0310 08:06:00.173601 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a58949ad-caaf-4489-9a32-26526aa44d53" containerName="extract-content"
Mar 10 08:06:00 crc kubenswrapper[4825]: I0310 08:06:00.173618 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a58949ad-caaf-4489-9a32-26526aa44d53" containerName="extract-content"
Mar 10 08:06:00 crc kubenswrapper[4825]: I0310 08:06:00.173978 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="a58949ad-caaf-4489-9a32-26526aa44d53" containerName="registry-server"
Mar 10 08:06:00 crc kubenswrapper[4825]: I0310 08:06:00.174033 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ea48178-1db2-4c38-9713-bab5bbe1aac5" containerName="mariadb-client"
Mar 10 08:06:00 crc kubenswrapper[4825]: I0310 08:06:00.175055 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552166-cjdxc"
Mar 10 08:06:00 crc kubenswrapper[4825]: I0310 08:06:00.177948 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552166-cjdxc"]
Mar 10 08:06:00 crc kubenswrapper[4825]: I0310 08:06:00.181099 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 08:06:00 crc kubenswrapper[4825]: I0310 08:06:00.181627 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 08:06:00 crc kubenswrapper[4825]: I0310 08:06:00.181884 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn"
Mar 10 08:06:00 crc kubenswrapper[4825]: I0310 08:06:00.231513 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr7ll\" (UniqueName: \"kubernetes.io/projected/658c3420-a9be-4dd3-be55-d59475be36dc-kube-api-access-nr7ll\") pod \"auto-csr-approver-29552166-cjdxc\" (UID: \"658c3420-a9be-4dd3-be55-d59475be36dc\") " pod="openshift-infra/auto-csr-approver-29552166-cjdxc"
Mar 10 08:06:00 crc kubenswrapper[4825]: I0310 08:06:00.333211 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nr7ll\" (UniqueName: \"kubernetes.io/projected/658c3420-a9be-4dd3-be55-d59475be36dc-kube-api-access-nr7ll\") pod \"auto-csr-approver-29552166-cjdxc\" (UID: \"658c3420-a9be-4dd3-be55-d59475be36dc\") " pod="openshift-infra/auto-csr-approver-29552166-cjdxc"
Mar 10 08:06:00 crc kubenswrapper[4825]: I0310 08:06:00.360596 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr7ll\" (UniqueName: \"kubernetes.io/projected/658c3420-a9be-4dd3-be55-d59475be36dc-kube-api-access-nr7ll\") pod \"auto-csr-approver-29552166-cjdxc\" (UID: \"658c3420-a9be-4dd3-be55-d59475be36dc\") " pod="openshift-infra/auto-csr-approver-29552166-cjdxc"
Mar 10 08:06:00 crc kubenswrapper[4825]: I0310 08:06:00.508156 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552166-cjdxc"
Mar 10 08:06:00 crc kubenswrapper[4825]: I0310 08:06:00.737782 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552166-cjdxc"]
Mar 10 08:06:01 crc kubenswrapper[4825]: I0310 08:06:01.370323 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552166-cjdxc" event={"ID":"658c3420-a9be-4dd3-be55-d59475be36dc","Type":"ContainerStarted","Data":"630dc83a2bb566824618426244de9e6162bcf57239698e1b0d2ff6594df77dcd"}
Mar 10 08:06:02 crc kubenswrapper[4825]: I0310 08:06:02.380229 4825 generic.go:334] "Generic (PLEG): container finished" podID="658c3420-a9be-4dd3-be55-d59475be36dc" containerID="7bce7cbde66a23cf5d2619d1a5be3ef4415be3ce23f68212a5f04f2444b54ef9" exitCode=0
Mar 10 08:06:02 crc kubenswrapper[4825]: I0310 08:06:02.380287 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552166-cjdxc" event={"ID":"658c3420-a9be-4dd3-be55-d59475be36dc","Type":"ContainerDied","Data":"7bce7cbde66a23cf5d2619d1a5be3ef4415be3ce23f68212a5f04f2444b54ef9"}
Mar 10 08:06:03 crc kubenswrapper[4825]: I0310 08:06:03.236780 4825 scope.go:117] "RemoveContainer" containerID="856a46518228770fd6354414263218ba580cad9f6bf09966c52a899f2397881c"
Mar 10 08:06:03 crc kubenswrapper[4825]: E0310 08:06:03.237465 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3"
Mar 10 08:06:03 crc kubenswrapper[4825]: I0310 08:06:03.675471 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552166-cjdxc"
Mar 10 08:06:03 crc kubenswrapper[4825]: I0310 08:06:03.780782 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nr7ll\" (UniqueName: \"kubernetes.io/projected/658c3420-a9be-4dd3-be55-d59475be36dc-kube-api-access-nr7ll\") pod \"658c3420-a9be-4dd3-be55-d59475be36dc\" (UID: \"658c3420-a9be-4dd3-be55-d59475be36dc\") "
Mar 10 08:06:04 crc kubenswrapper[4825]: I0310 08:06:04.358352 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/658c3420-a9be-4dd3-be55-d59475be36dc-kube-api-access-nr7ll" (OuterVolumeSpecName: "kube-api-access-nr7ll") pod "658c3420-a9be-4dd3-be55-d59475be36dc" (UID: "658c3420-a9be-4dd3-be55-d59475be36dc"). InnerVolumeSpecName "kube-api-access-nr7ll". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 08:06:04 crc kubenswrapper[4825]: I0310 08:06:04.391481 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nr7ll\" (UniqueName: \"kubernetes.io/projected/658c3420-a9be-4dd3-be55-d59475be36dc-kube-api-access-nr7ll\") on node \"crc\" DevicePath \"\""
Mar 10 08:06:04 crc kubenswrapper[4825]: I0310 08:06:04.402045 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552166-cjdxc" event={"ID":"658c3420-a9be-4dd3-be55-d59475be36dc","Type":"ContainerDied","Data":"630dc83a2bb566824618426244de9e6162bcf57239698e1b0d2ff6594df77dcd"}
Mar 10 08:06:04 crc kubenswrapper[4825]: I0310 08:06:04.402114 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="630dc83a2bb566824618426244de9e6162bcf57239698e1b0d2ff6594df77dcd"
Mar 10 08:06:04 crc kubenswrapper[4825]: I0310 08:06:04.402192 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552166-cjdxc"
Mar 10 08:06:04 crc kubenswrapper[4825]: I0310 08:06:04.770985 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552160-8892c"]
Mar 10 08:06:04 crc kubenswrapper[4825]: I0310 08:06:04.778605 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552160-8892c"]
Mar 10 08:06:05 crc kubenswrapper[4825]: I0310 08:06:05.253163 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa07e9b6-e969-4222-803d-69e1b3560b27" path="/var/lib/kubelet/pods/fa07e9b6-e969-4222-803d-69e1b3560b27/volumes"
Mar 10 08:06:17 crc kubenswrapper[4825]: I0310 08:06:17.237619 4825 scope.go:117] "RemoveContainer" containerID="856a46518228770fd6354414263218ba580cad9f6bf09966c52a899f2397881c"
Mar 10 08:06:17 crc kubenswrapper[4825]: E0310 08:06:17.238493 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3"
Mar 10 08:06:32 crc kubenswrapper[4825]: I0310 08:06:32.236919 4825 scope.go:117] "RemoveContainer" containerID="856a46518228770fd6354414263218ba580cad9f6bf09966c52a899f2397881c"
Mar 10 08:06:32 crc kubenswrapper[4825]: E0310 08:06:32.237830 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3"
Mar 10 08:06:40 crc kubenswrapper[4825]: I0310 08:06:40.687215 4825 scope.go:117] "RemoveContainer" containerID="03d2536a097bc83bfae019f5b730c18e02fa290fb58244fde82127ec8dc1f1be"
Mar 10 08:06:40 crc kubenswrapper[4825]: I0310 08:06:40.766576 4825 scope.go:117] "RemoveContainer" containerID="c778d1b6fd25bd0d2c1fc75beee55baf9d66f6cf5319bd6ebc6298e0be7d36d2"
Mar 10 08:06:44 crc kubenswrapper[4825]: I0310 08:06:44.237431 4825 scope.go:117] "RemoveContainer" containerID="856a46518228770fd6354414263218ba580cad9f6bf09966c52a899f2397881c"
Mar 10 08:06:44 crc kubenswrapper[4825]: E0310 08:06:44.238436 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3"
Mar 10 08:06:58 crc kubenswrapper[4825]: I0310 08:06:58.237734 4825 scope.go:117] "RemoveContainer" containerID="856a46518228770fd6354414263218ba580cad9f6bf09966c52a899f2397881c"
Mar 10 08:06:58 crc kubenswrapper[4825]: I0310 08:06:58.854498 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerStarted","Data":"8802fd3b03472e4fc7e7e01ba0c1a6885938d2d647b76d216f70475068123de1"}
Mar 10 08:08:00 crc kubenswrapper[4825]: I0310 08:08:00.155212 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552168-fcmhd"]
Mar 10 08:08:00 crc kubenswrapper[4825]: E0310 08:08:00.156109 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="658c3420-a9be-4dd3-be55-d59475be36dc" containerName="oc"
Mar 10 08:08:00 crc kubenswrapper[4825]: I0310 08:08:00.156121 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="658c3420-a9be-4dd3-be55-d59475be36dc" containerName="oc"
Mar 10 08:08:00 crc kubenswrapper[4825]: I0310 08:08:00.156263 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="658c3420-a9be-4dd3-be55-d59475be36dc" containerName="oc"
Mar 10 08:08:00 crc kubenswrapper[4825]: I0310 08:08:00.156960 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552168-fcmhd"
Mar 10 08:08:00 crc kubenswrapper[4825]: I0310 08:08:00.159266 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn"
Mar 10 08:08:00 crc kubenswrapper[4825]: I0310 08:08:00.159553 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 08:08:00 crc kubenswrapper[4825]: I0310 08:08:00.170098 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 08:08:00 crc kubenswrapper[4825]: I0310 08:08:00.184124 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552168-fcmhd"]
Mar 10 08:08:00 crc kubenswrapper[4825]: I0310 08:08:00.316399 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfqdq\" (UniqueName: \"kubernetes.io/projected/501e7d92-d832-43a7-8f14-38e1747824c9-kube-api-access-cfqdq\") pod \"auto-csr-approver-29552168-fcmhd\" (UID: \"501e7d92-d832-43a7-8f14-38e1747824c9\") " pod="openshift-infra/auto-csr-approver-29552168-fcmhd"
Mar 10 08:08:00 crc kubenswrapper[4825]: I0310 08:08:00.418819 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfqdq\" (UniqueName: \"kubernetes.io/projected/501e7d92-d832-43a7-8f14-38e1747824c9-kube-api-access-cfqdq\") pod \"auto-csr-approver-29552168-fcmhd\" (UID: \"501e7d92-d832-43a7-8f14-38e1747824c9\") " pod="openshift-infra/auto-csr-approver-29552168-fcmhd"
Mar 10 08:08:00 crc kubenswrapper[4825]: I0310 08:08:00.443082 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfqdq\" (UniqueName: \"kubernetes.io/projected/501e7d92-d832-43a7-8f14-38e1747824c9-kube-api-access-cfqdq\") pod \"auto-csr-approver-29552168-fcmhd\" (UID: \"501e7d92-d832-43a7-8f14-38e1747824c9\") " pod="openshift-infra/auto-csr-approver-29552168-fcmhd"
Mar 10 08:08:00 crc kubenswrapper[4825]: I0310 08:08:00.492690 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552168-fcmhd"
Mar 10 08:08:01 crc kubenswrapper[4825]: I0310 08:08:00.952943 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552168-fcmhd"]
Mar 10 08:08:01 crc kubenswrapper[4825]: I0310 08:08:01.437278 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552168-fcmhd" event={"ID":"501e7d92-d832-43a7-8f14-38e1747824c9","Type":"ContainerStarted","Data":"c5a29432703f22cc5eaffc061d51d660805167cab9117c488ffa49a73591f1cb"}
Mar 10 08:08:02 crc kubenswrapper[4825]: I0310 08:08:02.446920 4825 generic.go:334] "Generic (PLEG): container finished" podID="501e7d92-d832-43a7-8f14-38e1747824c9" containerID="0e3f03f9e285a51c36cb1f99c0b6950b36203c1b284e40b961dde1d7f3dc31f8" exitCode=0
Mar 10 08:08:02 crc kubenswrapper[4825]: I0310 08:08:02.446980 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552168-fcmhd" event={"ID":"501e7d92-d832-43a7-8f14-38e1747824c9","Type":"ContainerDied","Data":"0e3f03f9e285a51c36cb1f99c0b6950b36203c1b284e40b961dde1d7f3dc31f8"}
Mar 10 08:08:03 crc kubenswrapper[4825]: I0310 08:08:03.818249 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552168-fcmhd"
Mar 10 08:08:03 crc kubenswrapper[4825]: I0310 08:08:03.879826 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfqdq\" (UniqueName: \"kubernetes.io/projected/501e7d92-d832-43a7-8f14-38e1747824c9-kube-api-access-cfqdq\") pod \"501e7d92-d832-43a7-8f14-38e1747824c9\" (UID: \"501e7d92-d832-43a7-8f14-38e1747824c9\") "
Mar 10 08:08:03 crc kubenswrapper[4825]: I0310 08:08:03.886326 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/501e7d92-d832-43a7-8f14-38e1747824c9-kube-api-access-cfqdq" (OuterVolumeSpecName: "kube-api-access-cfqdq") pod "501e7d92-d832-43a7-8f14-38e1747824c9" (UID: "501e7d92-d832-43a7-8f14-38e1747824c9"). InnerVolumeSpecName "kube-api-access-cfqdq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 08:08:03 crc kubenswrapper[4825]: I0310 08:08:03.982155 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfqdq\" (UniqueName: \"kubernetes.io/projected/501e7d92-d832-43a7-8f14-38e1747824c9-kube-api-access-cfqdq\") on node \"crc\" DevicePath \"\""
Mar 10 08:08:04 crc kubenswrapper[4825]: I0310 08:08:04.467909 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552168-fcmhd" event={"ID":"501e7d92-d832-43a7-8f14-38e1747824c9","Type":"ContainerDied","Data":"c5a29432703f22cc5eaffc061d51d660805167cab9117c488ffa49a73591f1cb"}
Mar 10 08:08:04 crc kubenswrapper[4825]: I0310 08:08:04.468188 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5a29432703f22cc5eaffc061d51d660805167cab9117c488ffa49a73591f1cb"
Mar 10 08:08:04 crc kubenswrapper[4825]: I0310 08:08:04.467987 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552168-fcmhd"
Mar 10 08:08:04 crc kubenswrapper[4825]: I0310 08:08:04.894437 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552162-bl6n6"]
Mar 10 08:08:04 crc kubenswrapper[4825]: I0310 08:08:04.902152 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552162-bl6n6"]
Mar 10 08:08:05 crc kubenswrapper[4825]: I0310 08:08:05.250418 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24859db8-ff70-476f-9aab-62e44102d93f" path="/var/lib/kubelet/pods/24859db8-ff70-476f-9aab-62e44102d93f/volumes"
Mar 10 08:08:29 crc kubenswrapper[4825]: I0310 08:08:29.081011 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"]
Mar 10 08:08:29 crc kubenswrapper[4825]: E0310 08:08:29.082452 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="501e7d92-d832-43a7-8f14-38e1747824c9" containerName="oc"
Mar 10 08:08:29 crc kubenswrapper[4825]: I0310 08:08:29.082487 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="501e7d92-d832-43a7-8f14-38e1747824c9" containerName="oc"
Mar 10 08:08:29 crc kubenswrapper[4825]: I0310 08:08:29.082729 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="501e7d92-d832-43a7-8f14-38e1747824c9" containerName="oc"
Mar 10 08:08:29 crc kubenswrapper[4825]: I0310 08:08:29.083646 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data"
Mar 10 08:08:29 crc kubenswrapper[4825]: I0310 08:08:29.085978 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-grwd4"
Mar 10 08:08:29 crc kubenswrapper[4825]: I0310 08:08:29.093759 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"]
Mar 10 08:08:29 crc kubenswrapper[4825]: I0310 08:08:29.234621 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q79n6\" (UniqueName: \"kubernetes.io/projected/808cc76b-6582-4007-8504-5c11f9d43ba4-kube-api-access-q79n6\") pod \"mariadb-copy-data\" (UID: \"808cc76b-6582-4007-8504-5c11f9d43ba4\") " pod="openstack/mariadb-copy-data"
Mar 10 08:08:29 crc kubenswrapper[4825]: I0310 08:08:29.234975 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-acd5ef12-e6ba-4571-9bdc-07cbc77dbb5b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-acd5ef12-e6ba-4571-9bdc-07cbc77dbb5b\") pod \"mariadb-copy-data\" (UID: \"808cc76b-6582-4007-8504-5c11f9d43ba4\") " pod="openstack/mariadb-copy-data"
Mar 10 08:08:29 crc kubenswrapper[4825]: I0310 08:08:29.336103 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q79n6\" (UniqueName: \"kubernetes.io/projected/808cc76b-6582-4007-8504-5c11f9d43ba4-kube-api-access-q79n6\") pod \"mariadb-copy-data\" (UID: \"808cc76b-6582-4007-8504-5c11f9d43ba4\") " pod="openstack/mariadb-copy-data"
Mar 10 08:08:29 crc kubenswrapper[4825]: I0310 08:08:29.336295 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-acd5ef12-e6ba-4571-9bdc-07cbc77dbb5b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-acd5ef12-e6ba-4571-9bdc-07cbc77dbb5b\") pod \"mariadb-copy-data\" (UID: \"808cc76b-6582-4007-8504-5c11f9d43ba4\") " pod="openstack/mariadb-copy-data"
Mar 10 08:08:29 crc kubenswrapper[4825]: I0310 08:08:29.340916 4825 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 10 08:08:29 crc kubenswrapper[4825]: I0310 08:08:29.340969 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-acd5ef12-e6ba-4571-9bdc-07cbc77dbb5b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-acd5ef12-e6ba-4571-9bdc-07cbc77dbb5b\") pod \"mariadb-copy-data\" (UID: \"808cc76b-6582-4007-8504-5c11f9d43ba4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ac720c1497af11cbb8ac52cd8f75525ff1143214f126e52e479e65e3f3b5710f/globalmount\"" pod="openstack/mariadb-copy-data"
Mar 10 08:08:29 crc kubenswrapper[4825]: I0310 08:08:29.363918 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q79n6\" (UniqueName: \"kubernetes.io/projected/808cc76b-6582-4007-8504-5c11f9d43ba4-kube-api-access-q79n6\") pod \"mariadb-copy-data\" (UID: \"808cc76b-6582-4007-8504-5c11f9d43ba4\") " pod="openstack/mariadb-copy-data"
Mar 10 08:08:29 crc kubenswrapper[4825]: I0310 08:08:29.386667 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-acd5ef12-e6ba-4571-9bdc-07cbc77dbb5b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-acd5ef12-e6ba-4571-9bdc-07cbc77dbb5b\") pod \"mariadb-copy-data\" (UID: \"808cc76b-6582-4007-8504-5c11f9d43ba4\") " pod="openstack/mariadb-copy-data"
Mar 10 08:08:29 crc kubenswrapper[4825]: I0310 08:08:29.417183 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data"
Mar 10 08:08:29 crc kubenswrapper[4825]: I0310 08:08:29.785001 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"]
Mar 10 08:08:30 crc kubenswrapper[4825]: I0310 08:08:30.698061 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"808cc76b-6582-4007-8504-5c11f9d43ba4","Type":"ContainerStarted","Data":"8b01dc2d22588b22bd3853e41c894a4cb9414e35aa1a1773d48381c1a43c3440"}
Mar 10 08:08:30 crc kubenswrapper[4825]: I0310 08:08:30.698390 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"808cc76b-6582-4007-8504-5c11f9d43ba4","Type":"ContainerStarted","Data":"11a7b84ddee9aad62b933d0dcc7a5a47e936aab10adef93d82aa6fc3b9183508"}
Mar 10 08:08:30 crc kubenswrapper[4825]: I0310 08:08:30.718061 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=2.718040136 podStartE2EDuration="2.718040136s" podCreationTimestamp="2026-03-10 08:08:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:08:30.717804129 +0000 UTC m=+5063.747584794" watchObservedRunningTime="2026-03-10 08:08:30.718040136 +0000 UTC m=+5063.747820761"
Mar 10 08:08:33 crc kubenswrapper[4825]: I0310 08:08:33.681929 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"]
Mar 10 08:08:33 crc kubenswrapper[4825]: I0310 08:08:33.683370 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 10 08:08:33 crc kubenswrapper[4825]: I0310 08:08:33.688368 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 10 08:08:33 crc kubenswrapper[4825]: I0310 08:08:33.809973 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7ldj\" (UniqueName: \"kubernetes.io/projected/98f1540d-2a2f-4560-b23e-bc749ffb169d-kube-api-access-g7ldj\") pod \"mariadb-client\" (UID: \"98f1540d-2a2f-4560-b23e-bc749ffb169d\") " pod="openstack/mariadb-client" Mar 10 08:08:33 crc kubenswrapper[4825]: I0310 08:08:33.911124 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7ldj\" (UniqueName: \"kubernetes.io/projected/98f1540d-2a2f-4560-b23e-bc749ffb169d-kube-api-access-g7ldj\") pod \"mariadb-client\" (UID: \"98f1540d-2a2f-4560-b23e-bc749ffb169d\") " pod="openstack/mariadb-client" Mar 10 08:08:33 crc kubenswrapper[4825]: I0310 08:08:33.936375 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7ldj\" (UniqueName: \"kubernetes.io/projected/98f1540d-2a2f-4560-b23e-bc749ffb169d-kube-api-access-g7ldj\") pod \"mariadb-client\" (UID: \"98f1540d-2a2f-4560-b23e-bc749ffb169d\") " pod="openstack/mariadb-client" Mar 10 08:08:34 crc kubenswrapper[4825]: I0310 08:08:34.031123 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 10 08:08:34 crc kubenswrapper[4825]: I0310 08:08:34.485324 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 10 08:08:34 crc kubenswrapper[4825]: I0310 08:08:34.736029 4825 generic.go:334] "Generic (PLEG): container finished" podID="98f1540d-2a2f-4560-b23e-bc749ffb169d" containerID="d47f17f2d6711d9df57a1a123518e67c9c43ff6f9cec5e1afd62736eb7829d33" exitCode=0 Mar 10 08:08:34 crc kubenswrapper[4825]: I0310 08:08:34.736088 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"98f1540d-2a2f-4560-b23e-bc749ffb169d","Type":"ContainerDied","Data":"d47f17f2d6711d9df57a1a123518e67c9c43ff6f9cec5e1afd62736eb7829d33"} Mar 10 08:08:34 crc kubenswrapper[4825]: I0310 08:08:34.736120 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"98f1540d-2a2f-4560-b23e-bc749ffb169d","Type":"ContainerStarted","Data":"5a2c168899299d03fbde6fd314093d09766a492c1745891419c660718ca335e6"} Mar 10 08:08:36 crc kubenswrapper[4825]: I0310 08:08:36.013030 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 10 08:08:36 crc kubenswrapper[4825]: I0310 08:08:36.045165 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_98f1540d-2a2f-4560-b23e-bc749ffb169d/mariadb-client/0.log" Mar 10 08:08:36 crc kubenswrapper[4825]: I0310 08:08:36.073943 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 10 08:08:36 crc kubenswrapper[4825]: I0310 08:08:36.080818 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Mar 10 08:08:36 crc kubenswrapper[4825]: I0310 08:08:36.151769 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7ldj\" (UniqueName: \"kubernetes.io/projected/98f1540d-2a2f-4560-b23e-bc749ffb169d-kube-api-access-g7ldj\") pod \"98f1540d-2a2f-4560-b23e-bc749ffb169d\" (UID: \"98f1540d-2a2f-4560-b23e-bc749ffb169d\") " Mar 10 08:08:36 crc kubenswrapper[4825]: I0310 08:08:36.159677 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98f1540d-2a2f-4560-b23e-bc749ffb169d-kube-api-access-g7ldj" (OuterVolumeSpecName: "kube-api-access-g7ldj") pod "98f1540d-2a2f-4560-b23e-bc749ffb169d" (UID: "98f1540d-2a2f-4560-b23e-bc749ffb169d"). InnerVolumeSpecName "kube-api-access-g7ldj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:08:36 crc kubenswrapper[4825]: I0310 08:08:36.255755 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7ldj\" (UniqueName: \"kubernetes.io/projected/98f1540d-2a2f-4560-b23e-bc749ffb169d-kube-api-access-g7ldj\") on node \"crc\" DevicePath \"\"" Mar 10 08:08:36 crc kubenswrapper[4825]: I0310 08:08:36.262889 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Mar 10 08:08:36 crc kubenswrapper[4825]: E0310 08:08:36.263469 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98f1540d-2a2f-4560-b23e-bc749ffb169d" containerName="mariadb-client" Mar 10 08:08:36 crc kubenswrapper[4825]: I0310 08:08:36.263506 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="98f1540d-2a2f-4560-b23e-bc749ffb169d" containerName="mariadb-client" Mar 10 08:08:36 crc kubenswrapper[4825]: I0310 08:08:36.263833 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="98f1540d-2a2f-4560-b23e-bc749ffb169d" containerName="mariadb-client" Mar 10 08:08:36 crc kubenswrapper[4825]: I0310 08:08:36.264790 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 10 08:08:36 crc kubenswrapper[4825]: I0310 08:08:36.288480 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 10 08:08:36 crc kubenswrapper[4825]: I0310 08:08:36.357117 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrzgc\" (UniqueName: \"kubernetes.io/projected/47190584-46af-44ab-bcdd-f8519ff93582-kube-api-access-wrzgc\") pod \"mariadb-client\" (UID: \"47190584-46af-44ab-bcdd-f8519ff93582\") " pod="openstack/mariadb-client" Mar 10 08:08:36 crc kubenswrapper[4825]: I0310 08:08:36.459050 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrzgc\" (UniqueName: \"kubernetes.io/projected/47190584-46af-44ab-bcdd-f8519ff93582-kube-api-access-wrzgc\") pod \"mariadb-client\" (UID: \"47190584-46af-44ab-bcdd-f8519ff93582\") " pod="openstack/mariadb-client" Mar 10 08:08:36 crc kubenswrapper[4825]: I0310 08:08:36.478818 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrzgc\" (UniqueName: \"kubernetes.io/projected/47190584-46af-44ab-bcdd-f8519ff93582-kube-api-access-wrzgc\") pod \"mariadb-client\" (UID: \"47190584-46af-44ab-bcdd-f8519ff93582\") " pod="openstack/mariadb-client" Mar 10 08:08:36 crc kubenswrapper[4825]: I0310 08:08:36.594573 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 10 08:08:36 crc kubenswrapper[4825]: I0310 08:08:36.754349 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a2c168899299d03fbde6fd314093d09766a492c1745891419c660718ca335e6" Mar 10 08:08:36 crc kubenswrapper[4825]: I0310 08:08:36.754614 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 10 08:08:36 crc kubenswrapper[4825]: I0310 08:08:36.780795 4825 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="98f1540d-2a2f-4560-b23e-bc749ffb169d" podUID="47190584-46af-44ab-bcdd-f8519ff93582" Mar 10 08:08:36 crc kubenswrapper[4825]: I0310 08:08:36.897504 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 10 08:08:36 crc kubenswrapper[4825]: W0310 08:08:36.898891 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47190584_46af_44ab_bcdd_f8519ff93582.slice/crio-0b492e75877121e74ed82490705f32d82c94c0f546273b0607218af4d863c755 WatchSource:0}: Error finding container 0b492e75877121e74ed82490705f32d82c94c0f546273b0607218af4d863c755: Status 404 returned error can't find the container with id 0b492e75877121e74ed82490705f32d82c94c0f546273b0607218af4d863c755 Mar 10 08:08:37 crc kubenswrapper[4825]: I0310 08:08:37.255254 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98f1540d-2a2f-4560-b23e-bc749ffb169d" path="/var/lib/kubelet/pods/98f1540d-2a2f-4560-b23e-bc749ffb169d/volumes" Mar 10 08:08:37 crc kubenswrapper[4825]: I0310 08:08:37.767629 4825 generic.go:334] "Generic (PLEG): container finished" podID="47190584-46af-44ab-bcdd-f8519ff93582" containerID="6d7f07720fe71b1c6df297a30e12e619fd890a1ee2329fd6611172f15a47c839" exitCode=0 Mar 10 08:08:37 crc kubenswrapper[4825]: I0310 08:08:37.767760 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"47190584-46af-44ab-bcdd-f8519ff93582","Type":"ContainerDied","Data":"6d7f07720fe71b1c6df297a30e12e619fd890a1ee2329fd6611172f15a47c839"} Mar 10 08:08:37 crc kubenswrapper[4825]: I0310 08:08:37.768283 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" 
event={"ID":"47190584-46af-44ab-bcdd-f8519ff93582","Type":"ContainerStarted","Data":"0b492e75877121e74ed82490705f32d82c94c0f546273b0607218af4d863c755"} Mar 10 08:08:39 crc kubenswrapper[4825]: I0310 08:08:39.116237 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 10 08:08:39 crc kubenswrapper[4825]: I0310 08:08:39.133286 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_47190584-46af-44ab-bcdd-f8519ff93582/mariadb-client/0.log" Mar 10 08:08:39 crc kubenswrapper[4825]: I0310 08:08:39.158911 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 10 08:08:39 crc kubenswrapper[4825]: I0310 08:08:39.165781 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Mar 10 08:08:39 crc kubenswrapper[4825]: I0310 08:08:39.207403 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrzgc\" (UniqueName: \"kubernetes.io/projected/47190584-46af-44ab-bcdd-f8519ff93582-kube-api-access-wrzgc\") pod \"47190584-46af-44ab-bcdd-f8519ff93582\" (UID: \"47190584-46af-44ab-bcdd-f8519ff93582\") " Mar 10 08:08:39 crc kubenswrapper[4825]: I0310 08:08:39.215696 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47190584-46af-44ab-bcdd-f8519ff93582-kube-api-access-wrzgc" (OuterVolumeSpecName: "kube-api-access-wrzgc") pod "47190584-46af-44ab-bcdd-f8519ff93582" (UID: "47190584-46af-44ab-bcdd-f8519ff93582"). InnerVolumeSpecName "kube-api-access-wrzgc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:08:39 crc kubenswrapper[4825]: I0310 08:08:39.249485 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47190584-46af-44ab-bcdd-f8519ff93582" path="/var/lib/kubelet/pods/47190584-46af-44ab-bcdd-f8519ff93582/volumes" Mar 10 08:08:39 crc kubenswrapper[4825]: I0310 08:08:39.308761 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrzgc\" (UniqueName: \"kubernetes.io/projected/47190584-46af-44ab-bcdd-f8519ff93582-kube-api-access-wrzgc\") on node \"crc\" DevicePath \"\"" Mar 10 08:08:39 crc kubenswrapper[4825]: I0310 08:08:39.787586 4825 scope.go:117] "RemoveContainer" containerID="6d7f07720fe71b1c6df297a30e12e619fd890a1ee2329fd6611172f15a47c839" Mar 10 08:08:39 crc kubenswrapper[4825]: I0310 08:08:39.787683 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 10 08:08:40 crc kubenswrapper[4825]: I0310 08:08:40.876856 4825 scope.go:117] "RemoveContainer" containerID="b12453ef51ed63f6d32b67d2e38327a80486d410ffa855f5f5abaaaa0fe051c4" Mar 10 08:09:06 crc kubenswrapper[4825]: I0310 08:09:06.206679 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dscg5"] Mar 10 08:09:06 crc kubenswrapper[4825]: E0310 08:09:06.207359 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47190584-46af-44ab-bcdd-f8519ff93582" containerName="mariadb-client" Mar 10 08:09:06 crc kubenswrapper[4825]: I0310 08:09:06.207371 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="47190584-46af-44ab-bcdd-f8519ff93582" containerName="mariadb-client" Mar 10 08:09:06 crc kubenswrapper[4825]: I0310 08:09:06.207509 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="47190584-46af-44ab-bcdd-f8519ff93582" containerName="mariadb-client" Mar 10 08:09:06 crc kubenswrapper[4825]: I0310 08:09:06.208479 4825 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dscg5" Mar 10 08:09:06 crc kubenswrapper[4825]: I0310 08:09:06.235833 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dscg5"] Mar 10 08:09:06 crc kubenswrapper[4825]: I0310 08:09:06.390769 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/125d80d4-f496-4c1f-8a88-3feb02ffcda7-catalog-content\") pod \"redhat-marketplace-dscg5\" (UID: \"125d80d4-f496-4c1f-8a88-3feb02ffcda7\") " pod="openshift-marketplace/redhat-marketplace-dscg5" Mar 10 08:09:06 crc kubenswrapper[4825]: I0310 08:09:06.390826 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/125d80d4-f496-4c1f-8a88-3feb02ffcda7-utilities\") pod \"redhat-marketplace-dscg5\" (UID: \"125d80d4-f496-4c1f-8a88-3feb02ffcda7\") " pod="openshift-marketplace/redhat-marketplace-dscg5" Mar 10 08:09:06 crc kubenswrapper[4825]: I0310 08:09:06.390844 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7hfq\" (UniqueName: \"kubernetes.io/projected/125d80d4-f496-4c1f-8a88-3feb02ffcda7-kube-api-access-f7hfq\") pod \"redhat-marketplace-dscg5\" (UID: \"125d80d4-f496-4c1f-8a88-3feb02ffcda7\") " pod="openshift-marketplace/redhat-marketplace-dscg5" Mar 10 08:09:06 crc kubenswrapper[4825]: I0310 08:09:06.492477 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/125d80d4-f496-4c1f-8a88-3feb02ffcda7-catalog-content\") pod \"redhat-marketplace-dscg5\" (UID: \"125d80d4-f496-4c1f-8a88-3feb02ffcda7\") " pod="openshift-marketplace/redhat-marketplace-dscg5" Mar 10 08:09:06 crc kubenswrapper[4825]: I0310 08:09:06.492552 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/125d80d4-f496-4c1f-8a88-3feb02ffcda7-utilities\") pod \"redhat-marketplace-dscg5\" (UID: \"125d80d4-f496-4c1f-8a88-3feb02ffcda7\") " pod="openshift-marketplace/redhat-marketplace-dscg5" Mar 10 08:09:06 crc kubenswrapper[4825]: I0310 08:09:06.492580 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7hfq\" (UniqueName: \"kubernetes.io/projected/125d80d4-f496-4c1f-8a88-3feb02ffcda7-kube-api-access-f7hfq\") pod \"redhat-marketplace-dscg5\" (UID: \"125d80d4-f496-4c1f-8a88-3feb02ffcda7\") " pod="openshift-marketplace/redhat-marketplace-dscg5" Mar 10 08:09:06 crc kubenswrapper[4825]: I0310 08:09:06.493040 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/125d80d4-f496-4c1f-8a88-3feb02ffcda7-catalog-content\") pod \"redhat-marketplace-dscg5\" (UID: \"125d80d4-f496-4c1f-8a88-3feb02ffcda7\") " pod="openshift-marketplace/redhat-marketplace-dscg5" Mar 10 08:09:06 crc kubenswrapper[4825]: I0310 08:09:06.493154 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/125d80d4-f496-4c1f-8a88-3feb02ffcda7-utilities\") pod \"redhat-marketplace-dscg5\" (UID: \"125d80d4-f496-4c1f-8a88-3feb02ffcda7\") " pod="openshift-marketplace/redhat-marketplace-dscg5" Mar 10 08:09:06 crc kubenswrapper[4825]: I0310 08:09:06.517268 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7hfq\" (UniqueName: \"kubernetes.io/projected/125d80d4-f496-4c1f-8a88-3feb02ffcda7-kube-api-access-f7hfq\") pod \"redhat-marketplace-dscg5\" (UID: \"125d80d4-f496-4c1f-8a88-3feb02ffcda7\") " pod="openshift-marketplace/redhat-marketplace-dscg5" Mar 10 08:09:06 crc kubenswrapper[4825]: I0310 08:09:06.550928 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dscg5" Mar 10 08:09:06 crc kubenswrapper[4825]: I0310 08:09:06.808055 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dscg5"] Mar 10 08:09:07 crc kubenswrapper[4825]: I0310 08:09:07.015841 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dscg5" event={"ID":"125d80d4-f496-4c1f-8a88-3feb02ffcda7","Type":"ContainerStarted","Data":"97254cd811f4a78490ef722168028c3edc42698d756488ac163ff9c0fbbfcddb"} Mar 10 08:09:07 crc kubenswrapper[4825]: I0310 08:09:07.016156 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dscg5" event={"ID":"125d80d4-f496-4c1f-8a88-3feb02ffcda7","Type":"ContainerStarted","Data":"5e6bdbb723e149e083a556d8bc14887896a96cce6373d3be26c67564f615c442"} Mar 10 08:09:07 crc kubenswrapper[4825]: I0310 08:09:07.018472 4825 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 08:09:08 crc kubenswrapper[4825]: I0310 08:09:08.066785 4825 generic.go:334] "Generic (PLEG): container finished" podID="125d80d4-f496-4c1f-8a88-3feb02ffcda7" containerID="97254cd811f4a78490ef722168028c3edc42698d756488ac163ff9c0fbbfcddb" exitCode=0 Mar 10 08:09:08 crc kubenswrapper[4825]: I0310 08:09:08.067078 4825 generic.go:334] "Generic (PLEG): container finished" podID="125d80d4-f496-4c1f-8a88-3feb02ffcda7" containerID="b0cbc7d00c2bf75e0d278fb1fe8f51e196c3ff914ee7fff1317c80643183c931" exitCode=0 Mar 10 08:09:08 crc kubenswrapper[4825]: I0310 08:09:08.066868 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dscg5" event={"ID":"125d80d4-f496-4c1f-8a88-3feb02ffcda7","Type":"ContainerDied","Data":"97254cd811f4a78490ef722168028c3edc42698d756488ac163ff9c0fbbfcddb"} Mar 10 08:09:08 crc kubenswrapper[4825]: I0310 08:09:08.067117 4825 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-dscg5" event={"ID":"125d80d4-f496-4c1f-8a88-3feb02ffcda7","Type":"ContainerDied","Data":"b0cbc7d00c2bf75e0d278fb1fe8f51e196c3ff914ee7fff1317c80643183c931"} Mar 10 08:09:09 crc kubenswrapper[4825]: I0310 08:09:09.077374 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dscg5" event={"ID":"125d80d4-f496-4c1f-8a88-3feb02ffcda7","Type":"ContainerStarted","Data":"9c1ef6ae934bd0c641cd124b2c6018c881897050aa57d781fdf5ae674e0be3d2"} Mar 10 08:09:09 crc kubenswrapper[4825]: I0310 08:09:09.095310 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dscg5" podStartSLOduration=1.659925591 podStartE2EDuration="3.095294224s" podCreationTimestamp="2026-03-10 08:09:06 +0000 UTC" firstStartedPulling="2026-03-10 08:09:07.018208639 +0000 UTC m=+5100.047989254" lastFinishedPulling="2026-03-10 08:09:08.453577272 +0000 UTC m=+5101.483357887" observedRunningTime="2026-03-10 08:09:09.091768742 +0000 UTC m=+5102.121549367" watchObservedRunningTime="2026-03-10 08:09:09.095294224 +0000 UTC m=+5102.125074829" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.478452 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.480406 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.484157 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.484424 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.484541 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.484584 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.484699 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.484867 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-h9xq6" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.491510 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.492885 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.527380 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.529070 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.533345 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.569349 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.597228 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qctx\" (UniqueName: \"kubernetes.io/projected/eee9953c-6698-4bdf-bda4-d8c49476fe3c-kube-api-access-5qctx\") pod \"ovsdbserver-nb-0\" (UID: \"eee9953c-6698-4bdf-bda4-d8c49476fe3c\") " pod="openstack/ovsdbserver-nb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.597288 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nf42\" (UniqueName: \"kubernetes.io/projected/bf24e25c-e247-40c5-ab10-ecc60c1b83db-kube-api-access-7nf42\") pod \"ovsdbserver-nb-2\" (UID: \"bf24e25c-e247-40c5-ab10-ecc60c1b83db\") " pod="openstack/ovsdbserver-nb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.597330 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/624ef9e0-5a07-4d11-b58e-b4b2be2a25cc-config\") pod \"ovsdbserver-nb-1\" (UID: \"624ef9e0-5a07-4d11-b58e-b4b2be2a25cc\") " pod="openstack/ovsdbserver-nb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.597354 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/eee9953c-6698-4bdf-bda4-d8c49476fe3c-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"eee9953c-6698-4bdf-bda4-d8c49476fe3c\") " pod="openstack/ovsdbserver-nb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.597377 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eee9953c-6698-4bdf-bda4-d8c49476fe3c-config\") pod \"ovsdbserver-nb-0\" (UID: \"eee9953c-6698-4bdf-bda4-d8c49476fe3c\") " pod="openstack/ovsdbserver-nb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.597446 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eee9953c-6698-4bdf-bda4-d8c49476fe3c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"eee9953c-6698-4bdf-bda4-d8c49476fe3c\") " pod="openstack/ovsdbserver-nb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.597492 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf24e25c-e247-40c5-ab10-ecc60c1b83db-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"bf24e25c-e247-40c5-ab10-ecc60c1b83db\") " pod="openstack/ovsdbserver-nb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.597529 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2ec69207-e25b-430f-b794-16cf0be691ec\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2ec69207-e25b-430f-b794-16cf0be691ec\") pod \"ovsdbserver-nb-2\" (UID: \"bf24e25c-e247-40c5-ab10-ecc60c1b83db\") " pod="openstack/ovsdbserver-nb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.597573 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2ecadd60-4dc1-489e-ae36-9bd3cb412ac6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2ecadd60-4dc1-489e-ae36-9bd3cb412ac6\") pod \"ovsdbserver-nb-0\" (UID: \"eee9953c-6698-4bdf-bda4-d8c49476fe3c\") " pod="openstack/ovsdbserver-nb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.597615 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m97dk\" (UniqueName: \"kubernetes.io/projected/624ef9e0-5a07-4d11-b58e-b4b2be2a25cc-kube-api-access-m97dk\") pod \"ovsdbserver-nb-1\" (UID: \"624ef9e0-5a07-4d11-b58e-b4b2be2a25cc\") " pod="openstack/ovsdbserver-nb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.597646 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf24e25c-e247-40c5-ab10-ecc60c1b83db-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"bf24e25c-e247-40c5-ab10-ecc60c1b83db\") " pod="openstack/ovsdbserver-nb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.597673 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eee9953c-6698-4bdf-bda4-d8c49476fe3c-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"eee9953c-6698-4bdf-bda4-d8c49476fe3c\") " pod="openstack/ovsdbserver-nb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.597709 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bf24e25c-e247-40c5-ab10-ecc60c1b83db-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"bf24e25c-e247-40c5-ab10-ecc60c1b83db\") " pod="openstack/ovsdbserver-nb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.597748 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf24e25c-e247-40c5-ab10-ecc60c1b83db-config\") pod \"ovsdbserver-nb-2\" (UID: \"bf24e25c-e247-40c5-ab10-ecc60c1b83db\") " pod="openstack/ovsdbserver-nb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.597822 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/624ef9e0-5a07-4d11-b58e-b4b2be2a25cc-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"624ef9e0-5a07-4d11-b58e-b4b2be2a25cc\") " pod="openstack/ovsdbserver-nb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.597864 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/624ef9e0-5a07-4d11-b58e-b4b2be2a25cc-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"624ef9e0-5a07-4d11-b58e-b4b2be2a25cc\") " pod="openstack/ovsdbserver-nb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.597889 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/624ef9e0-5a07-4d11-b58e-b4b2be2a25cc-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"624ef9e0-5a07-4d11-b58e-b4b2be2a25cc\") " pod="openstack/ovsdbserver-nb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.597912 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eee9953c-6698-4bdf-bda4-d8c49476fe3c-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"eee9953c-6698-4bdf-bda4-d8c49476fe3c\") " pod="openstack/ovsdbserver-nb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.597937 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/624ef9e0-5a07-4d11-b58e-b4b2be2a25cc-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"624ef9e0-5a07-4d11-b58e-b4b2be2a25cc\") " pod="openstack/ovsdbserver-nb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.597959 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/624ef9e0-5a07-4d11-b58e-b4b2be2a25cc-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"624ef9e0-5a07-4d11-b58e-b4b2be2a25cc\") " pod="openstack/ovsdbserver-nb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.597984 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/eee9953c-6698-4bdf-bda4-d8c49476fe3c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"eee9953c-6698-4bdf-bda4-d8c49476fe3c\") " pod="openstack/ovsdbserver-nb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.598006 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf24e25c-e247-40c5-ab10-ecc60c1b83db-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"bf24e25c-e247-40c5-ab10-ecc60c1b83db\") " pod="openstack/ovsdbserver-nb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.598038 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf24e25c-e247-40c5-ab10-ecc60c1b83db-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"bf24e25c-e247-40c5-ab10-ecc60c1b83db\") " pod="openstack/ovsdbserver-nb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.598067 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-422b4e31-7230-4122-b49d-50405a6bbf29\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-422b4e31-7230-4122-b49d-50405a6bbf29\") pod \"ovsdbserver-nb-1\" (UID: \"624ef9e0-5a07-4d11-b58e-b4b2be2a25cc\") " pod="openstack/ovsdbserver-nb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.664024 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 
08:09:13.665145 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.666946 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.667482 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-h82lz" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.669034 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.672523 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.677722 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.695336 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.699640 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.700314 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf24e25c-e247-40c5-ab10-ecc60c1b83db-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"bf24e25c-e247-40c5-ab10-ecc60c1b83db\") " pod="openstack/ovsdbserver-nb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.700349 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eee9953c-6698-4bdf-bda4-d8c49476fe3c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"eee9953c-6698-4bdf-bda4-d8c49476fe3c\") " pod="openstack/ovsdbserver-nb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.700374 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2ec69207-e25b-430f-b794-16cf0be691ec\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2ec69207-e25b-430f-b794-16cf0be691ec\") pod \"ovsdbserver-nb-2\" (UID: \"bf24e25c-e247-40c5-ab10-ecc60c1b83db\") " pod="openstack/ovsdbserver-nb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.700408 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2ecadd60-4dc1-489e-ae36-9bd3cb412ac6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2ecadd60-4dc1-489e-ae36-9bd3cb412ac6\") pod \"ovsdbserver-nb-0\" (UID: \"eee9953c-6698-4bdf-bda4-d8c49476fe3c\") " pod="openstack/ovsdbserver-nb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.700441 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m97dk\" (UniqueName: \"kubernetes.io/projected/624ef9e0-5a07-4d11-b58e-b4b2be2a25cc-kube-api-access-m97dk\") pod \"ovsdbserver-nb-1\" (UID: \"624ef9e0-5a07-4d11-b58e-b4b2be2a25cc\") " pod="openstack/ovsdbserver-nb-1" Mar 10 08:09:13 
crc kubenswrapper[4825]: I0310 08:09:13.700463 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf24e25c-e247-40c5-ab10-ecc60c1b83db-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"bf24e25c-e247-40c5-ab10-ecc60c1b83db\") " pod="openstack/ovsdbserver-nb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.700483 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eee9953c-6698-4bdf-bda4-d8c49476fe3c-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"eee9953c-6698-4bdf-bda4-d8c49476fe3c\") " pod="openstack/ovsdbserver-nb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.700506 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bf24e25c-e247-40c5-ab10-ecc60c1b83db-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"bf24e25c-e247-40c5-ab10-ecc60c1b83db\") " pod="openstack/ovsdbserver-nb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.700531 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83412342-2b66-41e4-a4c6-c9715ab28427-config\") pod \"ovsdbserver-sb-0\" (UID: \"83412342-2b66-41e4-a4c6-c9715ab28427\") " pod="openstack/ovsdbserver-sb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.700554 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf24e25c-e247-40c5-ab10-ecc60c1b83db-config\") pod \"ovsdbserver-nb-2\" (UID: \"bf24e25c-e247-40c5-ab10-ecc60c1b83db\") " pod="openstack/ovsdbserver-nb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.700582 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/624ef9e0-5a07-4d11-b58e-b4b2be2a25cc-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"624ef9e0-5a07-4d11-b58e-b4b2be2a25cc\") " pod="openstack/ovsdbserver-nb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.700619 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/624ef9e0-5a07-4d11-b58e-b4b2be2a25cc-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"624ef9e0-5a07-4d11-b58e-b4b2be2a25cc\") " pod="openstack/ovsdbserver-nb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.700644 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83412342-2b66-41e4-a4c6-c9715ab28427-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"83412342-2b66-41e4-a4c6-c9715ab28427\") " pod="openstack/ovsdbserver-sb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.700673 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/624ef9e0-5a07-4d11-b58e-b4b2be2a25cc-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"624ef9e0-5a07-4d11-b58e-b4b2be2a25cc\") " pod="openstack/ovsdbserver-nb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.700694 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/83412342-2b66-41e4-a4c6-c9715ab28427-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"83412342-2b66-41e4-a4c6-c9715ab28427\") " pod="openstack/ovsdbserver-sb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.700716 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eee9953c-6698-4bdf-bda4-d8c49476fe3c-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"eee9953c-6698-4bdf-bda4-d8c49476fe3c\") " 
pod="openstack/ovsdbserver-nb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.700743 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/624ef9e0-5a07-4d11-b58e-b4b2be2a25cc-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"624ef9e0-5a07-4d11-b58e-b4b2be2a25cc\") " pod="openstack/ovsdbserver-nb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.700769 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624ef9e0-5a07-4d11-b58e-b4b2be2a25cc-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"624ef9e0-5a07-4d11-b58e-b4b2be2a25cc\") " pod="openstack/ovsdbserver-nb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.700795 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6298\" (UniqueName: \"kubernetes.io/projected/83412342-2b66-41e4-a4c6-c9715ab28427-kube-api-access-s6298\") pod \"ovsdbserver-sb-0\" (UID: \"83412342-2b66-41e4-a4c6-c9715ab28427\") " pod="openstack/ovsdbserver-sb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.700818 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/eee9953c-6698-4bdf-bda4-d8c49476fe3c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"eee9953c-6698-4bdf-bda4-d8c49476fe3c\") " pod="openstack/ovsdbserver-nb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.700837 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/83412342-2b66-41e4-a4c6-c9715ab28427-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"83412342-2b66-41e4-a4c6-c9715ab28427\") " pod="openstack/ovsdbserver-sb-0" Mar 10 08:09:13 crc 
kubenswrapper[4825]: I0310 08:09:13.700862 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf24e25c-e247-40c5-ab10-ecc60c1b83db-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"bf24e25c-e247-40c5-ab10-ecc60c1b83db\") " pod="openstack/ovsdbserver-nb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.700892 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf24e25c-e247-40c5-ab10-ecc60c1b83db-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"bf24e25c-e247-40c5-ab10-ecc60c1b83db\") " pod="openstack/ovsdbserver-nb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.700921 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-422b4e31-7230-4122-b49d-50405a6bbf29\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-422b4e31-7230-4122-b49d-50405a6bbf29\") pod \"ovsdbserver-nb-1\" (UID: \"624ef9e0-5a07-4d11-b58e-b4b2be2a25cc\") " pod="openstack/ovsdbserver-nb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.700953 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qctx\" (UniqueName: \"kubernetes.io/projected/eee9953c-6698-4bdf-bda4-d8c49476fe3c-kube-api-access-5qctx\") pod \"ovsdbserver-nb-0\" (UID: \"eee9953c-6698-4bdf-bda4-d8c49476fe3c\") " pod="openstack/ovsdbserver-nb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.700984 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/83412342-2b66-41e4-a4c6-c9715ab28427-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"83412342-2b66-41e4-a4c6-c9715ab28427\") " pod="openstack/ovsdbserver-sb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.701008 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/83412342-2b66-41e4-a4c6-c9715ab28427-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"83412342-2b66-41e4-a4c6-c9715ab28427\") " pod="openstack/ovsdbserver-sb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.701033 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nf42\" (UniqueName: \"kubernetes.io/projected/bf24e25c-e247-40c5-ab10-ecc60c1b83db-kube-api-access-7nf42\") pod \"ovsdbserver-nb-2\" (UID: \"bf24e25c-e247-40c5-ab10-ecc60c1b83db\") " pod="openstack/ovsdbserver-nb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.701068 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/624ef9e0-5a07-4d11-b58e-b4b2be2a25cc-config\") pod \"ovsdbserver-nb-1\" (UID: \"624ef9e0-5a07-4d11-b58e-b4b2be2a25cc\") " pod="openstack/ovsdbserver-nb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.701091 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c4f6d861-9c37-4141-8eb3-9041e7df4da9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c4f6d861-9c37-4141-8eb3-9041e7df4da9\") pod \"ovsdbserver-sb-0\" (UID: \"83412342-2b66-41e4-a4c6-c9715ab28427\") " pod="openstack/ovsdbserver-sb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.701121 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/eee9953c-6698-4bdf-bda4-d8c49476fe3c-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"eee9953c-6698-4bdf-bda4-d8c49476fe3c\") " pod="openstack/ovsdbserver-nb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.701171 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/eee9953c-6698-4bdf-bda4-d8c49476fe3c-config\") pod \"ovsdbserver-nb-0\" (UID: \"eee9953c-6698-4bdf-bda4-d8c49476fe3c\") " pod="openstack/ovsdbserver-nb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.701963 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bf24e25c-e247-40c5-ab10-ecc60c1b83db-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"bf24e25c-e247-40c5-ab10-ecc60c1b83db\") " pod="openstack/ovsdbserver-nb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.701988 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf24e25c-e247-40c5-ab10-ecc60c1b83db-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"bf24e25c-e247-40c5-ab10-ecc60c1b83db\") " pod="openstack/ovsdbserver-nb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.702002 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eee9953c-6698-4bdf-bda4-d8c49476fe3c-config\") pod \"ovsdbserver-nb-0\" (UID: \"eee9953c-6698-4bdf-bda4-d8c49476fe3c\") " pod="openstack/ovsdbserver-nb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.702470 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eee9953c-6698-4bdf-bda4-d8c49476fe3c-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"eee9953c-6698-4bdf-bda4-d8c49476fe3c\") " pod="openstack/ovsdbserver-nb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.704143 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/624ef9e0-5a07-4d11-b58e-b4b2be2a25cc-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"624ef9e0-5a07-4d11-b58e-b4b2be2a25cc\") " pod="openstack/ovsdbserver-nb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.705513 4825 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/624ef9e0-5a07-4d11-b58e-b4b2be2a25cc-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"624ef9e0-5a07-4d11-b58e-b4b2be2a25cc\") " pod="openstack/ovsdbserver-nb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.706348 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/624ef9e0-5a07-4d11-b58e-b4b2be2a25cc-config\") pod \"ovsdbserver-nb-1\" (UID: \"624ef9e0-5a07-4d11-b58e-b4b2be2a25cc\") " pod="openstack/ovsdbserver-nb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.706762 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/eee9953c-6698-4bdf-bda4-d8c49476fe3c-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"eee9953c-6698-4bdf-bda4-d8c49476fe3c\") " pod="openstack/ovsdbserver-nb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.706834 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf24e25c-e247-40c5-ab10-ecc60c1b83db-config\") pod \"ovsdbserver-nb-2\" (UID: \"bf24e25c-e247-40c5-ab10-ecc60c1b83db\") " pod="openstack/ovsdbserver-nb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.710870 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf24e25c-e247-40c5-ab10-ecc60c1b83db-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"bf24e25c-e247-40c5-ab10-ecc60c1b83db\") " pod="openstack/ovsdbserver-nb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.711475 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/624ef9e0-5a07-4d11-b58e-b4b2be2a25cc-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: 
\"624ef9e0-5a07-4d11-b58e-b4b2be2a25cc\") " pod="openstack/ovsdbserver-nb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.711691 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624ef9e0-5a07-4d11-b58e-b4b2be2a25cc-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"624ef9e0-5a07-4d11-b58e-b4b2be2a25cc\") " pod="openstack/ovsdbserver-nb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.711931 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf24e25c-e247-40c5-ab10-ecc60c1b83db-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"bf24e25c-e247-40c5-ab10-ecc60c1b83db\") " pod="openstack/ovsdbserver-nb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.712616 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf24e25c-e247-40c5-ab10-ecc60c1b83db-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"bf24e25c-e247-40c5-ab10-ecc60c1b83db\") " pod="openstack/ovsdbserver-nb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.712929 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eee9953c-6698-4bdf-bda4-d8c49476fe3c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"eee9953c-6698-4bdf-bda4-d8c49476fe3c\") " pod="openstack/ovsdbserver-nb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.714331 4825 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.714345 4825 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.714362 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2ec69207-e25b-430f-b794-16cf0be691ec\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2ec69207-e25b-430f-b794-16cf0be691ec\") pod \"ovsdbserver-nb-2\" (UID: \"bf24e25c-e247-40c5-ab10-ecc60c1b83db\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a8fdb2395b83eddc0b5599c9958fc958b8f09f529e28cc26055405c79db21dac/globalmount\"" pod="openstack/ovsdbserver-nb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.714369 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/624ef9e0-5a07-4d11-b58e-b4b2be2a25cc-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"624ef9e0-5a07-4d11-b58e-b4b2be2a25cc\") " pod="openstack/ovsdbserver-nb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.714375 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-422b4e31-7230-4122-b49d-50405a6bbf29\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-422b4e31-7230-4122-b49d-50405a6bbf29\") pod \"ovsdbserver-nb-1\" (UID: \"624ef9e0-5a07-4d11-b58e-b4b2be2a25cc\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/22acf1c4db53349565573e53f16b6865b844eae20e7821d8aba65ecfa601f82e/globalmount\"" pod="openstack/ovsdbserver-nb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.714583 4825 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.714672 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2ecadd60-4dc1-489e-ae36-9bd3cb412ac6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2ecadd60-4dc1-489e-ae36-9bd3cb412ac6\") pod \"ovsdbserver-nb-0\" (UID: \"eee9953c-6698-4bdf-bda4-d8c49476fe3c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5408fd4665ea78857ea95a07576528a784a45c589a501480546682a00dcda2dd/globalmount\"" pod="openstack/ovsdbserver-nb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.716066 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/eee9953c-6698-4bdf-bda4-d8c49476fe3c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"eee9953c-6698-4bdf-bda4-d8c49476fe3c\") " pod="openstack/ovsdbserver-nb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.725585 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qctx\" (UniqueName: \"kubernetes.io/projected/eee9953c-6698-4bdf-bda4-d8c49476fe3c-kube-api-access-5qctx\") pod \"ovsdbserver-nb-0\" (UID: \"eee9953c-6698-4bdf-bda4-d8c49476fe3c\") " pod="openstack/ovsdbserver-nb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.725967 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.727238 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.732620 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.736549 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m97dk\" (UniqueName: \"kubernetes.io/projected/624ef9e0-5a07-4d11-b58e-b4b2be2a25cc-kube-api-access-m97dk\") pod \"ovsdbserver-nb-1\" (UID: \"624ef9e0-5a07-4d11-b58e-b4b2be2a25cc\") " pod="openstack/ovsdbserver-nb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.739411 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.746393 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eee9953c-6698-4bdf-bda4-d8c49476fe3c-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"eee9953c-6698-4bdf-bda4-d8c49476fe3c\") " pod="openstack/ovsdbserver-nb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.753457 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-422b4e31-7230-4122-b49d-50405a6bbf29\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-422b4e31-7230-4122-b49d-50405a6bbf29\") pod \"ovsdbserver-nb-1\" (UID: \"624ef9e0-5a07-4d11-b58e-b4b2be2a25cc\") " pod="openstack/ovsdbserver-nb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.757300 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2ec69207-e25b-430f-b794-16cf0be691ec\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2ec69207-e25b-430f-b794-16cf0be691ec\") pod \"ovsdbserver-nb-2\" (UID: \"bf24e25c-e247-40c5-ab10-ecc60c1b83db\") " pod="openstack/ovsdbserver-nb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.760215 4825 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-7nf42\" (UniqueName: \"kubernetes.io/projected/bf24e25c-e247-40c5-ab10-ecc60c1b83db-kube-api-access-7nf42\") pod \"ovsdbserver-nb-2\" (UID: \"bf24e25c-e247-40c5-ab10-ecc60c1b83db\") " pod="openstack/ovsdbserver-nb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.765991 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2ecadd60-4dc1-489e-ae36-9bd3cb412ac6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2ecadd60-4dc1-489e-ae36-9bd3cb412ac6\") pod \"ovsdbserver-nb-0\" (UID: \"eee9953c-6698-4bdf-bda4-d8c49476fe3c\") " pod="openstack/ovsdbserver-nb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.801879 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cbaf4a7a-c60d-42a1-8751-849bab562b68-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"cbaf4a7a-c60d-42a1-8751-849bab562b68\") " pod="openstack/ovsdbserver-sb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.801928 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c4f6d861-9c37-4141-8eb3-9041e7df4da9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c4f6d861-9c37-4141-8eb3-9041e7df4da9\") pod \"ovsdbserver-sb-0\" (UID: \"83412342-2b66-41e4-a4c6-c9715ab28427\") " pod="openstack/ovsdbserver-sb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.801955 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ef0e8e14-a34c-4e12-814d-9c50d99cd5fb-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"ef0e8e14-a34c-4e12-814d-9c50d99cd5fb\") " pod="openstack/ovsdbserver-sb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.801977 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef0e8e14-a34c-4e12-814d-9c50d99cd5fb-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"ef0e8e14-a34c-4e12-814d-9c50d99cd5fb\") " pod="openstack/ovsdbserver-sb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.801996 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbaf4a7a-c60d-42a1-8751-849bab562b68-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"cbaf4a7a-c60d-42a1-8751-849bab562b68\") " pod="openstack/ovsdbserver-sb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.802010 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbaf4a7a-c60d-42a1-8751-849bab562b68-config\") pod \"ovsdbserver-sb-2\" (UID: \"cbaf4a7a-c60d-42a1-8751-849bab562b68\") " pod="openstack/ovsdbserver-sb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.802026 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef0e8e14-a34c-4e12-814d-9c50d99cd5fb-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"ef0e8e14-a34c-4e12-814d-9c50d99cd5fb\") " pod="openstack/ovsdbserver-sb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.802043 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef0e8e14-a34c-4e12-814d-9c50d99cd5fb-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"ef0e8e14-a34c-4e12-814d-9c50d99cd5fb\") " pod="openstack/ovsdbserver-sb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.802060 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/83412342-2b66-41e4-a4c6-c9715ab28427-config\") pod \"ovsdbserver-sb-0\" (UID: \"83412342-2b66-41e4-a4c6-c9715ab28427\") " pod="openstack/ovsdbserver-sb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.802083 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xchp\" (UniqueName: \"kubernetes.io/projected/cbaf4a7a-c60d-42a1-8751-849bab562b68-kube-api-access-9xchp\") pod \"ovsdbserver-sb-2\" (UID: \"cbaf4a7a-c60d-42a1-8751-849bab562b68\") " pod="openstack/ovsdbserver-sb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.802101 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cbaf4a7a-c60d-42a1-8751-849bab562b68-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"cbaf4a7a-c60d-42a1-8751-849bab562b68\") " pod="openstack/ovsdbserver-sb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.802241 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbaf4a7a-c60d-42a1-8751-849bab562b68-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"cbaf4a7a-c60d-42a1-8751-849bab562b68\") " pod="openstack/ovsdbserver-sb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.802288 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83412342-2b66-41e4-a4c6-c9715ab28427-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"83412342-2b66-41e4-a4c6-c9715ab28427\") " pod="openstack/ovsdbserver-sb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.802306 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5b293702-65b9-4b2c-a50f-31f4db99bc6f\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5b293702-65b9-4b2c-a50f-31f4db99bc6f\") pod \"ovsdbserver-sb-2\" (UID: \"cbaf4a7a-c60d-42a1-8751-849bab562b68\") " pod="openstack/ovsdbserver-sb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.802323 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/83412342-2b66-41e4-a4c6-c9715ab28427-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"83412342-2b66-41e4-a4c6-c9715ab28427\") " pod="openstack/ovsdbserver-sb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.802346 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-455v5\" (UniqueName: \"kubernetes.io/projected/ef0e8e14-a34c-4e12-814d-9c50d99cd5fb-kube-api-access-455v5\") pod \"ovsdbserver-sb-1\" (UID: \"ef0e8e14-a34c-4e12-814d-9c50d99cd5fb\") " pod="openstack/ovsdbserver-sb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.802361 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbaf4a7a-c60d-42a1-8751-849bab562b68-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"cbaf4a7a-c60d-42a1-8751-849bab562b68\") " pod="openstack/ovsdbserver-sb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.802376 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef0e8e14-a34c-4e12-814d-9c50d99cd5fb-config\") pod \"ovsdbserver-sb-1\" (UID: \"ef0e8e14-a34c-4e12-814d-9c50d99cd5fb\") " pod="openstack/ovsdbserver-sb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.802393 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6298\" (UniqueName: \"kubernetes.io/projected/83412342-2b66-41e4-a4c6-c9715ab28427-kube-api-access-s6298\") pod 
\"ovsdbserver-sb-0\" (UID: \"83412342-2b66-41e4-a4c6-c9715ab28427\") " pod="openstack/ovsdbserver-sb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.802412 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/83412342-2b66-41e4-a4c6-c9715ab28427-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"83412342-2b66-41e4-a4c6-c9715ab28427\") " pod="openstack/ovsdbserver-sb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.802444 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef0e8e14-a34c-4e12-814d-9c50d99cd5fb-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"ef0e8e14-a34c-4e12-814d-9c50d99cd5fb\") " pod="openstack/ovsdbserver-sb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.802462 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-91c09cf1-2b69-4281-b559-c24dabe655e5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-91c09cf1-2b69-4281-b559-c24dabe655e5\") pod \"ovsdbserver-sb-1\" (UID: \"ef0e8e14-a34c-4e12-814d-9c50d99cd5fb\") " pod="openstack/ovsdbserver-sb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.802485 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/83412342-2b66-41e4-a4c6-c9715ab28427-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"83412342-2b66-41e4-a4c6-c9715ab28427\") " pod="openstack/ovsdbserver-sb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.802501 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/83412342-2b66-41e4-a4c6-c9715ab28427-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"83412342-2b66-41e4-a4c6-c9715ab28427\") " 
pod="openstack/ovsdbserver-sb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.803387 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83412342-2b66-41e4-a4c6-c9715ab28427-config\") pod \"ovsdbserver-sb-0\" (UID: \"83412342-2b66-41e4-a4c6-c9715ab28427\") " pod="openstack/ovsdbserver-sb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.805629 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/83412342-2b66-41e4-a4c6-c9715ab28427-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"83412342-2b66-41e4-a4c6-c9715ab28427\") " pod="openstack/ovsdbserver-sb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.805685 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/83412342-2b66-41e4-a4c6-c9715ab28427-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"83412342-2b66-41e4-a4c6-c9715ab28427\") " pod="openstack/ovsdbserver-sb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.806043 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/83412342-2b66-41e4-a4c6-c9715ab28427-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"83412342-2b66-41e4-a4c6-c9715ab28427\") " pod="openstack/ovsdbserver-sb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.806380 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/83412342-2b66-41e4-a4c6-c9715ab28427-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"83412342-2b66-41e4-a4c6-c9715ab28427\") " pod="openstack/ovsdbserver-sb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.806575 4825 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. 
Skipping MountDevice... Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.806667 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c4f6d861-9c37-4141-8eb3-9041e7df4da9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c4f6d861-9c37-4141-8eb3-9041e7df4da9\") pod \"ovsdbserver-sb-0\" (UID: \"83412342-2b66-41e4-a4c6-c9715ab28427\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/387c55ca002bd689e14f0a2557d04da8ed812ececd1dffc695591d56d95abebb/globalmount\"" pod="openstack/ovsdbserver-sb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.810060 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83412342-2b66-41e4-a4c6-c9715ab28427-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"83412342-2b66-41e4-a4c6-c9715ab28427\") " pod="openstack/ovsdbserver-sb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.821029 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6298\" (UniqueName: \"kubernetes.io/projected/83412342-2b66-41e4-a4c6-c9715ab28427-kube-api-access-s6298\") pod \"ovsdbserver-sb-0\" (UID: \"83412342-2b66-41e4-a4c6-c9715ab28427\") " pod="openstack/ovsdbserver-sb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.833710 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c4f6d861-9c37-4141-8eb3-9041e7df4da9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c4f6d861-9c37-4141-8eb3-9041e7df4da9\") pod \"ovsdbserver-sb-0\" (UID: \"83412342-2b66-41e4-a4c6-c9715ab28427\") " pod="openstack/ovsdbserver-sb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.851600 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.859873 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.878826 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.904263 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ef0e8e14-a34c-4e12-814d-9c50d99cd5fb-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"ef0e8e14-a34c-4e12-814d-9c50d99cd5fb\") " pod="openstack/ovsdbserver-sb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.904312 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef0e8e14-a34c-4e12-814d-9c50d99cd5fb-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"ef0e8e14-a34c-4e12-814d-9c50d99cd5fb\") " pod="openstack/ovsdbserver-sb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.904334 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbaf4a7a-c60d-42a1-8751-849bab562b68-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"cbaf4a7a-c60d-42a1-8751-849bab562b68\") " pod="openstack/ovsdbserver-sb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.904369 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbaf4a7a-c60d-42a1-8751-849bab562b68-config\") pod \"ovsdbserver-sb-2\" (UID: \"cbaf4a7a-c60d-42a1-8751-849bab562b68\") " pod="openstack/ovsdbserver-sb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.904392 4825 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef0e8e14-a34c-4e12-814d-9c50d99cd5fb-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"ef0e8e14-a34c-4e12-814d-9c50d99cd5fb\") " pod="openstack/ovsdbserver-sb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.904597 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef0e8e14-a34c-4e12-814d-9c50d99cd5fb-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"ef0e8e14-a34c-4e12-814d-9c50d99cd5fb\") " pod="openstack/ovsdbserver-sb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.904668 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xchp\" (UniqueName: \"kubernetes.io/projected/cbaf4a7a-c60d-42a1-8751-849bab562b68-kube-api-access-9xchp\") pod \"ovsdbserver-sb-2\" (UID: \"cbaf4a7a-c60d-42a1-8751-849bab562b68\") " pod="openstack/ovsdbserver-sb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.905104 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cbaf4a7a-c60d-42a1-8751-849bab562b68-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"cbaf4a7a-c60d-42a1-8751-849bab562b68\") " pod="openstack/ovsdbserver-sb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.905197 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbaf4a7a-c60d-42a1-8751-849bab562b68-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"cbaf4a7a-c60d-42a1-8751-849bab562b68\") " pod="openstack/ovsdbserver-sb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.905239 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5b293702-65b9-4b2c-a50f-31f4db99bc6f\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5b293702-65b9-4b2c-a50f-31f4db99bc6f\") pod \"ovsdbserver-sb-2\" (UID: \"cbaf4a7a-c60d-42a1-8751-849bab562b68\") " pod="openstack/ovsdbserver-sb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.905298 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-455v5\" (UniqueName: \"kubernetes.io/projected/ef0e8e14-a34c-4e12-814d-9c50d99cd5fb-kube-api-access-455v5\") pod \"ovsdbserver-sb-1\" (UID: \"ef0e8e14-a34c-4e12-814d-9c50d99cd5fb\") " pod="openstack/ovsdbserver-sb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.905326 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbaf4a7a-c60d-42a1-8751-849bab562b68-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"cbaf4a7a-c60d-42a1-8751-849bab562b68\") " pod="openstack/ovsdbserver-sb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.905353 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef0e8e14-a34c-4e12-814d-9c50d99cd5fb-config\") pod \"ovsdbserver-sb-1\" (UID: \"ef0e8e14-a34c-4e12-814d-9c50d99cd5fb\") " pod="openstack/ovsdbserver-sb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.905496 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cbaf4a7a-c60d-42a1-8751-849bab562b68-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"cbaf4a7a-c60d-42a1-8751-849bab562b68\") " pod="openstack/ovsdbserver-sb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.905495 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbaf4a7a-c60d-42a1-8751-849bab562b68-config\") pod \"ovsdbserver-sb-2\" (UID: \"cbaf4a7a-c60d-42a1-8751-849bab562b68\") " pod="openstack/ovsdbserver-sb-2" 
Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.905926 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ef0e8e14-a34c-4e12-814d-9c50d99cd5fb-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"ef0e8e14-a34c-4e12-814d-9c50d99cd5fb\") " pod="openstack/ovsdbserver-sb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.905984 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef0e8e14-a34c-4e12-814d-9c50d99cd5fb-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"ef0e8e14-a34c-4e12-814d-9c50d99cd5fb\") " pod="openstack/ovsdbserver-sb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.906024 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-91c09cf1-2b69-4281-b559-c24dabe655e5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-91c09cf1-2b69-4281-b559-c24dabe655e5\") pod \"ovsdbserver-sb-1\" (UID: \"ef0e8e14-a34c-4e12-814d-9c50d99cd5fb\") " pod="openstack/ovsdbserver-sb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.906089 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cbaf4a7a-c60d-42a1-8751-849bab562b68-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"cbaf4a7a-c60d-42a1-8751-849bab562b68\") " pod="openstack/ovsdbserver-sb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.906911 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef0e8e14-a34c-4e12-814d-9c50d99cd5fb-config\") pod \"ovsdbserver-sb-1\" (UID: \"ef0e8e14-a34c-4e12-814d-9c50d99cd5fb\") " pod="openstack/ovsdbserver-sb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.907185 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/cbaf4a7a-c60d-42a1-8751-849bab562b68-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"cbaf4a7a-c60d-42a1-8751-849bab562b68\") " pod="openstack/ovsdbserver-sb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.907245 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef0e8e14-a34c-4e12-814d-9c50d99cd5fb-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"ef0e8e14-a34c-4e12-814d-9c50d99cd5fb\") " pod="openstack/ovsdbserver-sb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.908652 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef0e8e14-a34c-4e12-814d-9c50d99cd5fb-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"ef0e8e14-a34c-4e12-814d-9c50d99cd5fb\") " pod="openstack/ovsdbserver-sb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.908928 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbaf4a7a-c60d-42a1-8751-849bab562b68-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"cbaf4a7a-c60d-42a1-8751-849bab562b68\") " pod="openstack/ovsdbserver-sb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.909414 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef0e8e14-a34c-4e12-814d-9c50d99cd5fb-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"ef0e8e14-a34c-4e12-814d-9c50d99cd5fb\") " pod="openstack/ovsdbserver-sb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.909523 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef0e8e14-a34c-4e12-814d-9c50d99cd5fb-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"ef0e8e14-a34c-4e12-814d-9c50d99cd5fb\") " 
pod="openstack/ovsdbserver-sb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.909935 4825 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.909968 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-91c09cf1-2b69-4281-b559-c24dabe655e5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-91c09cf1-2b69-4281-b559-c24dabe655e5\") pod \"ovsdbserver-sb-1\" (UID: \"ef0e8e14-a34c-4e12-814d-9c50d99cd5fb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c311c4262ce9f82c791c6a1ab41c627e31fcfa50b377904ca2b9c6741f224cef/globalmount\"" pod="openstack/ovsdbserver-sb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.911114 4825 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.911169 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5b293702-65b9-4b2c-a50f-31f4db99bc6f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5b293702-65b9-4b2c-a50f-31f4db99bc6f\") pod \"ovsdbserver-sb-2\" (UID: \"cbaf4a7a-c60d-42a1-8751-849bab562b68\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ded91f5ba65c2fc4fb79e3a1602067fe74c4bf7de201c251df0ed1a1ebd03dd5/globalmount\"" pod="openstack/ovsdbserver-sb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.912627 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbaf4a7a-c60d-42a1-8751-849bab562b68-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"cbaf4a7a-c60d-42a1-8751-849bab562b68\") " pod="openstack/ovsdbserver-sb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 
08:09:13.919072 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbaf4a7a-c60d-42a1-8751-849bab562b68-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"cbaf4a7a-c60d-42a1-8751-849bab562b68\") " pod="openstack/ovsdbserver-sb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.925669 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-455v5\" (UniqueName: \"kubernetes.io/projected/ef0e8e14-a34c-4e12-814d-9c50d99cd5fb-kube-api-access-455v5\") pod \"ovsdbserver-sb-1\" (UID: \"ef0e8e14-a34c-4e12-814d-9c50d99cd5fb\") " pod="openstack/ovsdbserver-sb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.928916 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xchp\" (UniqueName: \"kubernetes.io/projected/cbaf4a7a-c60d-42a1-8751-849bab562b68-kube-api-access-9xchp\") pod \"ovsdbserver-sb-2\" (UID: \"cbaf4a7a-c60d-42a1-8751-849bab562b68\") " pod="openstack/ovsdbserver-sb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.939773 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5b293702-65b9-4b2c-a50f-31f4db99bc6f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5b293702-65b9-4b2c-a50f-31f4db99bc6f\") pod \"ovsdbserver-sb-2\" (UID: \"cbaf4a7a-c60d-42a1-8751-849bab562b68\") " pod="openstack/ovsdbserver-sb-2" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.945260 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-91c09cf1-2b69-4281-b559-c24dabe655e5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-91c09cf1-2b69-4281-b559-c24dabe655e5\") pod \"ovsdbserver-sb-1\" (UID: \"ef0e8e14-a34c-4e12-814d-9c50d99cd5fb\") " pod="openstack/ovsdbserver-sb-1" Mar 10 08:09:13 crc kubenswrapper[4825]: I0310 08:09:13.993772 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 10 08:09:14 crc kubenswrapper[4825]: I0310 08:09:14.105882 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Mar 10 08:09:14 crc kubenswrapper[4825]: I0310 08:09:14.166299 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Mar 10 08:09:14 crc kubenswrapper[4825]: I0310 08:09:14.442133 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 10 08:09:14 crc kubenswrapper[4825]: I0310 08:09:14.517924 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Mar 10 08:09:14 crc kubenswrapper[4825]: W0310 08:09:14.517980 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf24e25c_e247_40c5_ab10_ecc60c1b83db.slice/crio-26822d18ee13d10a150bb3bdcd962df5882553d1a9ccf1bfb5d93606e25ed87b WatchSource:0}: Error finding container 26822d18ee13d10a150bb3bdcd962df5882553d1a9ccf1bfb5d93606e25ed87b: Status 404 returned error can't find the container with id 26822d18ee13d10a150bb3bdcd962df5882553d1a9ccf1bfb5d93606e25ed87b Mar 10 08:09:14 crc kubenswrapper[4825]: I0310 08:09:14.621705 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Mar 10 08:09:14 crc kubenswrapper[4825]: W0310 08:09:14.670514 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef0e8e14_a34c_4e12_814d_9c50d99cd5fb.slice/crio-36d42318af979986d6cbe18814bf3738eec14a58110f7e9d3ea62b1d4caf18b5 WatchSource:0}: Error finding container 36d42318af979986d6cbe18814bf3738eec14a58110f7e9d3ea62b1d4caf18b5: Status 404 returned error can't find the container with id 36d42318af979986d6cbe18814bf3738eec14a58110f7e9d3ea62b1d4caf18b5 Mar 10 08:09:14 crc kubenswrapper[4825]: I0310 
08:09:14.782179 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Mar 10 08:09:14 crc kubenswrapper[4825]: W0310 08:09:14.791677 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbaf4a7a_c60d_42a1_8751_849bab562b68.slice/crio-0905ce71a52c242e2c245bb464e06bb9e65ea7c18e00f69db8c8f737f76b7b94 WatchSource:0}: Error finding container 0905ce71a52c242e2c245bb464e06bb9e65ea7c18e00f69db8c8f737f76b7b94: Status 404 returned error can't find the container with id 0905ce71a52c242e2c245bb464e06bb9e65ea7c18e00f69db8c8f737f76b7b94 Mar 10 08:09:15 crc kubenswrapper[4825]: I0310 08:09:15.131407 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"cbaf4a7a-c60d-42a1-8751-849bab562b68","Type":"ContainerStarted","Data":"0905ce71a52c242e2c245bb464e06bb9e65ea7c18e00f69db8c8f737f76b7b94"} Mar 10 08:09:15 crc kubenswrapper[4825]: I0310 08:09:15.133173 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"ef0e8e14-a34c-4e12-814d-9c50d99cd5fb","Type":"ContainerStarted","Data":"36d42318af979986d6cbe18814bf3738eec14a58110f7e9d3ea62b1d4caf18b5"} Mar 10 08:09:15 crc kubenswrapper[4825]: I0310 08:09:15.135347 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"bf24e25c-e247-40c5-ab10-ecc60c1b83db","Type":"ContainerStarted","Data":"26822d18ee13d10a150bb3bdcd962df5882553d1a9ccf1bfb5d93606e25ed87b"} Mar 10 08:09:15 crc kubenswrapper[4825]: I0310 08:09:15.137636 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"eee9953c-6698-4bdf-bda4-d8c49476fe3c","Type":"ContainerStarted","Data":"2a06a3687e89645536dbf8f67a1e37a3a52c1e7873bca7ac33579674e51492f5"} Mar 10 08:09:15 crc kubenswrapper[4825]: I0310 08:09:15.285748 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovsdbserver-sb-0"] Mar 10 08:09:15 crc kubenswrapper[4825]: W0310 08:09:15.291324 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83412342_2b66_41e4_a4c6_c9715ab28427.slice/crio-b8d6a75bd0f36ef157f525c90914101271a7cacafa89dcc9e0f5b55e01a7f969 WatchSource:0}: Error finding container b8d6a75bd0f36ef157f525c90914101271a7cacafa89dcc9e0f5b55e01a7f969: Status 404 returned error can't find the container with id b8d6a75bd0f36ef157f525c90914101271a7cacafa89dcc9e0f5b55e01a7f969 Mar 10 08:09:15 crc kubenswrapper[4825]: I0310 08:09:15.385436 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Mar 10 08:09:16 crc kubenswrapper[4825]: I0310 08:09:16.146926 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"624ef9e0-5a07-4d11-b58e-b4b2be2a25cc","Type":"ContainerStarted","Data":"a0f317d0d6e2f7ec3a5a543cfb04f5fb03596172f2f1671e1c63589c9326fed0"} Mar 10 08:09:16 crc kubenswrapper[4825]: I0310 08:09:16.149356 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"83412342-2b66-41e4-a4c6-c9715ab28427","Type":"ContainerStarted","Data":"b8d6a75bd0f36ef157f525c90914101271a7cacafa89dcc9e0f5b55e01a7f969"} Mar 10 08:09:16 crc kubenswrapper[4825]: I0310 08:09:16.551300 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dscg5" Mar 10 08:09:16 crc kubenswrapper[4825]: I0310 08:09:16.551364 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dscg5" Mar 10 08:09:16 crc kubenswrapper[4825]: I0310 08:09:16.600550 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dscg5" Mar 10 08:09:16 crc kubenswrapper[4825]: I0310 08:09:16.888272 4825 patch_prober.go:28] 
interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 08:09:16 crc kubenswrapper[4825]: I0310 08:09:16.888864 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 08:09:17 crc kubenswrapper[4825]: I0310 08:09:17.205247 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dscg5" Mar 10 08:09:17 crc kubenswrapper[4825]: I0310 08:09:17.254125 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dscg5"] Mar 10 08:09:19 crc kubenswrapper[4825]: I0310 08:09:19.178269 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"bf24e25c-e247-40c5-ab10-ecc60c1b83db","Type":"ContainerStarted","Data":"2f151e1db1a864f15a5087e8ee4a3eb420aa5175781d86b431f18f627efa1897"} Mar 10 08:09:19 crc kubenswrapper[4825]: I0310 08:09:19.179866 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"eee9953c-6698-4bdf-bda4-d8c49476fe3c","Type":"ContainerStarted","Data":"0649628907a550609f756b95e440b4252a5838b47bcecd0fc40c572dc59b74cf"} Mar 10 08:09:19 crc kubenswrapper[4825]: I0310 08:09:19.182244 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"624ef9e0-5a07-4d11-b58e-b4b2be2a25cc","Type":"ContainerStarted","Data":"ac10bc32c8679651564a8e9963c2640c53de8670117687c06c82b2eff5666bc0"} Mar 10 08:09:19 crc kubenswrapper[4825]: I0310 08:09:19.184518 4825 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"cbaf4a7a-c60d-42a1-8751-849bab562b68","Type":"ContainerStarted","Data":"424dda6cd8dc714bed480f9228c23b19a844d924871dd07f407ca67134082b02"} Mar 10 08:09:19 crc kubenswrapper[4825]: I0310 08:09:19.186628 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"ef0e8e14-a34c-4e12-814d-9c50d99cd5fb","Type":"ContainerStarted","Data":"c804c07b992a358b8de072f411adf4661c56943c0a5a1f9c1e8d301e5431b1b5"} Mar 10 08:09:19 crc kubenswrapper[4825]: I0310 08:09:19.186741 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dscg5" podUID="125d80d4-f496-4c1f-8a88-3feb02ffcda7" containerName="registry-server" containerID="cri-o://9c1ef6ae934bd0c641cd124b2c6018c881897050aa57d781fdf5ae674e0be3d2" gracePeriod=2 Mar 10 08:09:19 crc kubenswrapper[4825]: I0310 08:09:19.587636 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dscg5" Mar 10 08:09:19 crc kubenswrapper[4825]: I0310 08:09:19.711385 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/125d80d4-f496-4c1f-8a88-3feb02ffcda7-catalog-content\") pod \"125d80d4-f496-4c1f-8a88-3feb02ffcda7\" (UID: \"125d80d4-f496-4c1f-8a88-3feb02ffcda7\") " Mar 10 08:09:19 crc kubenswrapper[4825]: I0310 08:09:19.711614 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/125d80d4-f496-4c1f-8a88-3feb02ffcda7-utilities\") pod \"125d80d4-f496-4c1f-8a88-3feb02ffcda7\" (UID: \"125d80d4-f496-4c1f-8a88-3feb02ffcda7\") " Mar 10 08:09:19 crc kubenswrapper[4825]: I0310 08:09:19.711643 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7hfq\" (UniqueName: \"kubernetes.io/projected/125d80d4-f496-4c1f-8a88-3feb02ffcda7-kube-api-access-f7hfq\") pod \"125d80d4-f496-4c1f-8a88-3feb02ffcda7\" (UID: \"125d80d4-f496-4c1f-8a88-3feb02ffcda7\") " Mar 10 08:09:19 crc kubenswrapper[4825]: I0310 08:09:19.713026 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/125d80d4-f496-4c1f-8a88-3feb02ffcda7-utilities" (OuterVolumeSpecName: "utilities") pod "125d80d4-f496-4c1f-8a88-3feb02ffcda7" (UID: "125d80d4-f496-4c1f-8a88-3feb02ffcda7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:09:19 crc kubenswrapper[4825]: I0310 08:09:19.715371 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/125d80d4-f496-4c1f-8a88-3feb02ffcda7-kube-api-access-f7hfq" (OuterVolumeSpecName: "kube-api-access-f7hfq") pod "125d80d4-f496-4c1f-8a88-3feb02ffcda7" (UID: "125d80d4-f496-4c1f-8a88-3feb02ffcda7"). InnerVolumeSpecName "kube-api-access-f7hfq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:09:19 crc kubenswrapper[4825]: I0310 08:09:19.760109 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/125d80d4-f496-4c1f-8a88-3feb02ffcda7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "125d80d4-f496-4c1f-8a88-3feb02ffcda7" (UID: "125d80d4-f496-4c1f-8a88-3feb02ffcda7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:09:19 crc kubenswrapper[4825]: I0310 08:09:19.814577 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/125d80d4-f496-4c1f-8a88-3feb02ffcda7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 08:09:19 crc kubenswrapper[4825]: I0310 08:09:19.814643 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/125d80d4-f496-4c1f-8a88-3feb02ffcda7-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 08:09:19 crc kubenswrapper[4825]: I0310 08:09:19.814673 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7hfq\" (UniqueName: \"kubernetes.io/projected/125d80d4-f496-4c1f-8a88-3feb02ffcda7-kube-api-access-f7hfq\") on node \"crc\" DevicePath \"\"" Mar 10 08:09:20 crc kubenswrapper[4825]: I0310 08:09:20.201619 4825 generic.go:334] "Generic (PLEG): container finished" podID="125d80d4-f496-4c1f-8a88-3feb02ffcda7" containerID="9c1ef6ae934bd0c641cd124b2c6018c881897050aa57d781fdf5ae674e0be3d2" exitCode=0 Mar 10 08:09:20 crc kubenswrapper[4825]: I0310 08:09:20.201672 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dscg5" Mar 10 08:09:20 crc kubenswrapper[4825]: I0310 08:09:20.201676 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dscg5" event={"ID":"125d80d4-f496-4c1f-8a88-3feb02ffcda7","Type":"ContainerDied","Data":"9c1ef6ae934bd0c641cd124b2c6018c881897050aa57d781fdf5ae674e0be3d2"} Mar 10 08:09:20 crc kubenswrapper[4825]: I0310 08:09:20.201806 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dscg5" event={"ID":"125d80d4-f496-4c1f-8a88-3feb02ffcda7","Type":"ContainerDied","Data":"5e6bdbb723e149e083a556d8bc14887896a96cce6373d3be26c67564f615c442"} Mar 10 08:09:20 crc kubenswrapper[4825]: I0310 08:09:20.201853 4825 scope.go:117] "RemoveContainer" containerID="9c1ef6ae934bd0c641cd124b2c6018c881897050aa57d781fdf5ae674e0be3d2" Mar 10 08:09:20 crc kubenswrapper[4825]: I0310 08:09:20.208437 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"bf24e25c-e247-40c5-ab10-ecc60c1b83db","Type":"ContainerStarted","Data":"1c36068b9b4621df6cc71ce6cfa516fae7db329609de679e813b5856cdb76f5b"} Mar 10 08:09:20 crc kubenswrapper[4825]: I0310 08:09:20.214808 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"eee9953c-6698-4bdf-bda4-d8c49476fe3c","Type":"ContainerStarted","Data":"46b0a7703e6c4d56ce64ce0b4e43e9dbc1808df1ddb8afe3f834b679027e4461"} Mar 10 08:09:20 crc kubenswrapper[4825]: I0310 08:09:20.218769 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"83412342-2b66-41e4-a4c6-c9715ab28427","Type":"ContainerStarted","Data":"8a1fffb1035076470ff749a9c7ffa62aacdfb92766acb6f8f140f0b51f60c8d1"} Mar 10 08:09:20 crc kubenswrapper[4825]: I0310 08:09:20.218838 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"83412342-2b66-41e4-a4c6-c9715ab28427","Type":"ContainerStarted","Data":"1e174562c6f86dc26a44d0a8bee00fa41c6bfd67d2ccbb4a65177343d032564b"} Mar 10 08:09:20 crc kubenswrapper[4825]: I0310 08:09:20.223928 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"624ef9e0-5a07-4d11-b58e-b4b2be2a25cc","Type":"ContainerStarted","Data":"125b5a02fadd64742847ab93cbfe317bc5037dfdaef77aafaf1e68cd06bdaeee"} Mar 10 08:09:20 crc kubenswrapper[4825]: I0310 08:09:20.227011 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"cbaf4a7a-c60d-42a1-8751-849bab562b68","Type":"ContainerStarted","Data":"dd8fb15403b28f82a234143752073900d3578438cea1e88279bb4cf1ca17ec95"} Mar 10 08:09:20 crc kubenswrapper[4825]: I0310 08:09:20.230223 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"ef0e8e14-a34c-4e12-814d-9c50d99cd5fb","Type":"ContainerStarted","Data":"defe7f6fec84382ddc1d1ffa8912c851f1c7ef3e9457020145220f708cada822"} Mar 10 08:09:20 crc kubenswrapper[4825]: I0310 08:09:20.236221 4825 scope.go:117] "RemoveContainer" containerID="b0cbc7d00c2bf75e0d278fb1fe8f51e196c3ff914ee7fff1317c80643183c931" Mar 10 08:09:20 crc kubenswrapper[4825]: I0310 08:09:20.260557 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=4.099833365 podStartE2EDuration="8.26052297s" podCreationTimestamp="2026-03-10 08:09:12 +0000 UTC" firstStartedPulling="2026-03-10 08:09:14.520279613 +0000 UTC m=+5107.550060218" lastFinishedPulling="2026-03-10 08:09:18.680969198 +0000 UTC m=+5111.710749823" observedRunningTime="2026-03-10 08:09:20.246936543 +0000 UTC m=+5113.276717298" watchObservedRunningTime="2026-03-10 08:09:20.26052297 +0000 UTC m=+5113.290303625" Mar 10 08:09:20 crc kubenswrapper[4825]: I0310 08:09:20.269067 4825 scope.go:117] "RemoveContainer" 
containerID="97254cd811f4a78490ef722168028c3edc42698d756488ac163ff9c0fbbfcddb" Mar 10 08:09:20 crc kubenswrapper[4825]: I0310 08:09:20.276613 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=4.978534185 podStartE2EDuration="8.276583982s" podCreationTimestamp="2026-03-10 08:09:12 +0000 UTC" firstStartedPulling="2026-03-10 08:09:15.396612351 +0000 UTC m=+5108.426392956" lastFinishedPulling="2026-03-10 08:09:18.694662128 +0000 UTC m=+5111.724442753" observedRunningTime="2026-03-10 08:09:20.274423035 +0000 UTC m=+5113.304203670" watchObservedRunningTime="2026-03-10 08:09:20.276583982 +0000 UTC m=+5113.306364637" Mar 10 08:09:20 crc kubenswrapper[4825]: I0310 08:09:20.308724 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=4.417312395 podStartE2EDuration="8.308705046s" podCreationTimestamp="2026-03-10 08:09:12 +0000 UTC" firstStartedPulling="2026-03-10 08:09:14.794194148 +0000 UTC m=+5107.823974763" lastFinishedPulling="2026-03-10 08:09:18.685586799 +0000 UTC m=+5111.715367414" observedRunningTime="2026-03-10 08:09:20.30542906 +0000 UTC m=+5113.335209675" watchObservedRunningTime="2026-03-10 08:09:20.308705046 +0000 UTC m=+5113.338485661" Mar 10 08:09:20 crc kubenswrapper[4825]: I0310 08:09:20.323723 4825 scope.go:117] "RemoveContainer" containerID="9c1ef6ae934bd0c641cd124b2c6018c881897050aa57d781fdf5ae674e0be3d2" Mar 10 08:09:20 crc kubenswrapper[4825]: E0310 08:09:20.327679 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c1ef6ae934bd0c641cd124b2c6018c881897050aa57d781fdf5ae674e0be3d2\": container with ID starting with 9c1ef6ae934bd0c641cd124b2c6018c881897050aa57d781fdf5ae674e0be3d2 not found: ID does not exist" containerID="9c1ef6ae934bd0c641cd124b2c6018c881897050aa57d781fdf5ae674e0be3d2" Mar 10 08:09:20 crc kubenswrapper[4825]: I0310 
08:09:20.327748 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c1ef6ae934bd0c641cd124b2c6018c881897050aa57d781fdf5ae674e0be3d2"} err="failed to get container status \"9c1ef6ae934bd0c641cd124b2c6018c881897050aa57d781fdf5ae674e0be3d2\": rpc error: code = NotFound desc = could not find container \"9c1ef6ae934bd0c641cd124b2c6018c881897050aa57d781fdf5ae674e0be3d2\": container with ID starting with 9c1ef6ae934bd0c641cd124b2c6018c881897050aa57d781fdf5ae674e0be3d2 not found: ID does not exist" Mar 10 08:09:20 crc kubenswrapper[4825]: I0310 08:09:20.327780 4825 scope.go:117] "RemoveContainer" containerID="b0cbc7d00c2bf75e0d278fb1fe8f51e196c3ff914ee7fff1317c80643183c931" Mar 10 08:09:20 crc kubenswrapper[4825]: E0310 08:09:20.328800 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0cbc7d00c2bf75e0d278fb1fe8f51e196c3ff914ee7fff1317c80643183c931\": container with ID starting with b0cbc7d00c2bf75e0d278fb1fe8f51e196c3ff914ee7fff1317c80643183c931 not found: ID does not exist" containerID="b0cbc7d00c2bf75e0d278fb1fe8f51e196c3ff914ee7fff1317c80643183c931" Mar 10 08:09:20 crc kubenswrapper[4825]: I0310 08:09:20.328840 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0cbc7d00c2bf75e0d278fb1fe8f51e196c3ff914ee7fff1317c80643183c931"} err="failed to get container status \"b0cbc7d00c2bf75e0d278fb1fe8f51e196c3ff914ee7fff1317c80643183c931\": rpc error: code = NotFound desc = could not find container \"b0cbc7d00c2bf75e0d278fb1fe8f51e196c3ff914ee7fff1317c80643183c931\": container with ID starting with b0cbc7d00c2bf75e0d278fb1fe8f51e196c3ff914ee7fff1317c80643183c931 not found: ID does not exist" Mar 10 08:09:20 crc kubenswrapper[4825]: I0310 08:09:20.328865 4825 scope.go:117] "RemoveContainer" containerID="97254cd811f4a78490ef722168028c3edc42698d756488ac163ff9c0fbbfcddb" Mar 10 08:09:20 crc 
kubenswrapper[4825]: E0310 08:09:20.329331 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97254cd811f4a78490ef722168028c3edc42698d756488ac163ff9c0fbbfcddb\": container with ID starting with 97254cd811f4a78490ef722168028c3edc42698d756488ac163ff9c0fbbfcddb not found: ID does not exist" containerID="97254cd811f4a78490ef722168028c3edc42698d756488ac163ff9c0fbbfcddb" Mar 10 08:09:20 crc kubenswrapper[4825]: I0310 08:09:20.329380 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97254cd811f4a78490ef722168028c3edc42698d756488ac163ff9c0fbbfcddb"} err="failed to get container status \"97254cd811f4a78490ef722168028c3edc42698d756488ac163ff9c0fbbfcddb\": rpc error: code = NotFound desc = could not find container \"97254cd811f4a78490ef722168028c3edc42698d756488ac163ff9c0fbbfcddb\": container with ID starting with 97254cd811f4a78490ef722168028c3edc42698d756488ac163ff9c0fbbfcddb not found: ID does not exist" Mar 10 08:09:20 crc kubenswrapper[4825]: I0310 08:09:20.329108 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.597280823 podStartE2EDuration="8.329094501s" podCreationTimestamp="2026-03-10 08:09:12 +0000 UTC" firstStartedPulling="2026-03-10 08:09:15.294383019 +0000 UTC m=+5108.324163634" lastFinishedPulling="2026-03-10 08:09:19.026196697 +0000 UTC m=+5112.055977312" observedRunningTime="2026-03-10 08:09:20.32408969 +0000 UTC m=+5113.353870315" watchObservedRunningTime="2026-03-10 08:09:20.329094501 +0000 UTC m=+5113.358875116" Mar 10 08:09:20 crc kubenswrapper[4825]: I0310 08:09:20.349689 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=4.121328933 podStartE2EDuration="8.349660432s" podCreationTimestamp="2026-03-10 08:09:12 +0000 UTC" firstStartedPulling="2026-03-10 08:09:14.43479059 +0000 UTC 
m=+5107.464571215" lastFinishedPulling="2026-03-10 08:09:18.663122099 +0000 UTC m=+5111.692902714" observedRunningTime="2026-03-10 08:09:20.347352011 +0000 UTC m=+5113.377132646" watchObservedRunningTime="2026-03-10 08:09:20.349660432 +0000 UTC m=+5113.379441057" Mar 10 08:09:20 crc kubenswrapper[4825]: I0310 08:09:20.374077 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=4.384352152 podStartE2EDuration="8.374055642s" podCreationTimestamp="2026-03-10 08:09:12 +0000 UTC" firstStartedPulling="2026-03-10 08:09:14.672523896 +0000 UTC m=+5107.702304511" lastFinishedPulling="2026-03-10 08:09:18.662227386 +0000 UTC m=+5111.692008001" observedRunningTime="2026-03-10 08:09:20.369561224 +0000 UTC m=+5113.399341859" watchObservedRunningTime="2026-03-10 08:09:20.374055642 +0000 UTC m=+5113.403836247" Mar 10 08:09:20 crc kubenswrapper[4825]: I0310 08:09:20.386814 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dscg5"] Mar 10 08:09:20 crc kubenswrapper[4825]: I0310 08:09:20.393313 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dscg5"] Mar 10 08:09:21 crc kubenswrapper[4825]: I0310 08:09:21.263095 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="125d80d4-f496-4c1f-8a88-3feb02ffcda7" path="/var/lib/kubelet/pods/125d80d4-f496-4c1f-8a88-3feb02ffcda7/volumes" Mar 10 08:09:22 crc kubenswrapper[4825]: I0310 08:09:22.852440 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 10 08:09:22 crc kubenswrapper[4825]: I0310 08:09:22.861014 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Mar 10 08:09:22 crc kubenswrapper[4825]: I0310 08:09:22.880327 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Mar 10 
08:09:22 crc kubenswrapper[4825]: I0310 08:09:22.917644 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 10 08:09:22 crc kubenswrapper[4825]: I0310 08:09:22.930897 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Mar 10 08:09:22 crc kubenswrapper[4825]: I0310 08:09:22.947664 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Mar 10 08:09:22 crc kubenswrapper[4825]: I0310 08:09:22.995370 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.052982 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.107116 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.167377 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.171040 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.208712 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.270652 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.271072 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.271697 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ovsdbserver-sb-2" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.271786 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.271810 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.271828 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.307346 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.311131 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.329298 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.491890 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-558cfc7d79-nfbnt"] Mar 10 08:09:23 crc kubenswrapper[4825]: E0310 08:09:23.492563 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="125d80d4-f496-4c1f-8a88-3feb02ffcda7" containerName="registry-server" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.492589 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="125d80d4-f496-4c1f-8a88-3feb02ffcda7" containerName="registry-server" Mar 10 08:09:23 crc kubenswrapper[4825]: E0310 08:09:23.492606 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="125d80d4-f496-4c1f-8a88-3feb02ffcda7" containerName="extract-content" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.492615 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="125d80d4-f496-4c1f-8a88-3feb02ffcda7" containerName="extract-content" Mar 10 
08:09:23 crc kubenswrapper[4825]: E0310 08:09:23.492631 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="125d80d4-f496-4c1f-8a88-3feb02ffcda7" containerName="extract-utilities" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.492640 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="125d80d4-f496-4c1f-8a88-3feb02ffcda7" containerName="extract-utilities" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.492828 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="125d80d4-f496-4c1f-8a88-3feb02ffcda7" containerName="registry-server" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.493854 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-558cfc7d79-nfbnt" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.500464 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.504580 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-558cfc7d79-nfbnt"] Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.608346 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ebd29a7-114c-4f66-a459-615abc54c313-config\") pod \"dnsmasq-dns-558cfc7d79-nfbnt\" (UID: \"4ebd29a7-114c-4f66-a459-615abc54c313\") " pod="openstack/dnsmasq-dns-558cfc7d79-nfbnt" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.608393 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ebd29a7-114c-4f66-a459-615abc54c313-dns-svc\") pod \"dnsmasq-dns-558cfc7d79-nfbnt\" (UID: \"4ebd29a7-114c-4f66-a459-615abc54c313\") " pod="openstack/dnsmasq-dns-558cfc7d79-nfbnt" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.608445 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2j8t\" (UniqueName: \"kubernetes.io/projected/4ebd29a7-114c-4f66-a459-615abc54c313-kube-api-access-k2j8t\") pod \"dnsmasq-dns-558cfc7d79-nfbnt\" (UID: \"4ebd29a7-114c-4f66-a459-615abc54c313\") " pod="openstack/dnsmasq-dns-558cfc7d79-nfbnt" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.608529 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ebd29a7-114c-4f66-a459-615abc54c313-ovsdbserver-sb\") pod \"dnsmasq-dns-558cfc7d79-nfbnt\" (UID: \"4ebd29a7-114c-4f66-a459-615abc54c313\") " pod="openstack/dnsmasq-dns-558cfc7d79-nfbnt" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.644414 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-558cfc7d79-nfbnt"] Mar 10 08:09:23 crc kubenswrapper[4825]: E0310 08:09:23.648891 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-k2j8t ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-558cfc7d79-nfbnt" podUID="4ebd29a7-114c-4f66-a459-615abc54c313" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.679946 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79c5dff769-v95dp"] Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.681830 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79c5dff769-v95dp" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.684035 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.688316 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79c5dff769-v95dp"] Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.710240 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ebd29a7-114c-4f66-a459-615abc54c313-dns-svc\") pod \"dnsmasq-dns-558cfc7d79-nfbnt\" (UID: \"4ebd29a7-114c-4f66-a459-615abc54c313\") " pod="openstack/dnsmasq-dns-558cfc7d79-nfbnt" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.710326 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2j8t\" (UniqueName: \"kubernetes.io/projected/4ebd29a7-114c-4f66-a459-615abc54c313-kube-api-access-k2j8t\") pod \"dnsmasq-dns-558cfc7d79-nfbnt\" (UID: \"4ebd29a7-114c-4f66-a459-615abc54c313\") " pod="openstack/dnsmasq-dns-558cfc7d79-nfbnt" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.710378 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ebd29a7-114c-4f66-a459-615abc54c313-ovsdbserver-sb\") pod \"dnsmasq-dns-558cfc7d79-nfbnt\" (UID: \"4ebd29a7-114c-4f66-a459-615abc54c313\") " pod="openstack/dnsmasq-dns-558cfc7d79-nfbnt" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.710437 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ebd29a7-114c-4f66-a459-615abc54c313-config\") pod \"dnsmasq-dns-558cfc7d79-nfbnt\" (UID: \"4ebd29a7-114c-4f66-a459-615abc54c313\") " pod="openstack/dnsmasq-dns-558cfc7d79-nfbnt" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.711238 
4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ebd29a7-114c-4f66-a459-615abc54c313-dns-svc\") pod \"dnsmasq-dns-558cfc7d79-nfbnt\" (UID: \"4ebd29a7-114c-4f66-a459-615abc54c313\") " pod="openstack/dnsmasq-dns-558cfc7d79-nfbnt" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.711270 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ebd29a7-114c-4f66-a459-615abc54c313-config\") pod \"dnsmasq-dns-558cfc7d79-nfbnt\" (UID: \"4ebd29a7-114c-4f66-a459-615abc54c313\") " pod="openstack/dnsmasq-dns-558cfc7d79-nfbnt" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.711758 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ebd29a7-114c-4f66-a459-615abc54c313-ovsdbserver-sb\") pod \"dnsmasq-dns-558cfc7d79-nfbnt\" (UID: \"4ebd29a7-114c-4f66-a459-615abc54c313\") " pod="openstack/dnsmasq-dns-558cfc7d79-nfbnt" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.728282 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2j8t\" (UniqueName: \"kubernetes.io/projected/4ebd29a7-114c-4f66-a459-615abc54c313-kube-api-access-k2j8t\") pod \"dnsmasq-dns-558cfc7d79-nfbnt\" (UID: \"4ebd29a7-114c-4f66-a459-615abc54c313\") " pod="openstack/dnsmasq-dns-558cfc7d79-nfbnt" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.814927 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzn6c\" (UniqueName: \"kubernetes.io/projected/2ef475cc-7b9d-475f-bac7-c01ac3d568cb-kube-api-access-nzn6c\") pod \"dnsmasq-dns-79c5dff769-v95dp\" (UID: \"2ef475cc-7b9d-475f-bac7-c01ac3d568cb\") " pod="openstack/dnsmasq-dns-79c5dff769-v95dp" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.815255 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ef475cc-7b9d-475f-bac7-c01ac3d568cb-ovsdbserver-sb\") pod \"dnsmasq-dns-79c5dff769-v95dp\" (UID: \"2ef475cc-7b9d-475f-bac7-c01ac3d568cb\") " pod="openstack/dnsmasq-dns-79c5dff769-v95dp" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.815468 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ef475cc-7b9d-475f-bac7-c01ac3d568cb-ovsdbserver-nb\") pod \"dnsmasq-dns-79c5dff769-v95dp\" (UID: \"2ef475cc-7b9d-475f-bac7-c01ac3d568cb\") " pod="openstack/dnsmasq-dns-79c5dff769-v95dp" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.815526 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ef475cc-7b9d-475f-bac7-c01ac3d568cb-config\") pod \"dnsmasq-dns-79c5dff769-v95dp\" (UID: \"2ef475cc-7b9d-475f-bac7-c01ac3d568cb\") " pod="openstack/dnsmasq-dns-79c5dff769-v95dp" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.815573 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ef475cc-7b9d-475f-bac7-c01ac3d568cb-dns-svc\") pod \"dnsmasq-dns-79c5dff769-v95dp\" (UID: \"2ef475cc-7b9d-475f-bac7-c01ac3d568cb\") " pod="openstack/dnsmasq-dns-79c5dff769-v95dp" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.916919 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzn6c\" (UniqueName: \"kubernetes.io/projected/2ef475cc-7b9d-475f-bac7-c01ac3d568cb-kube-api-access-nzn6c\") pod \"dnsmasq-dns-79c5dff769-v95dp\" (UID: \"2ef475cc-7b9d-475f-bac7-c01ac3d568cb\") " pod="openstack/dnsmasq-dns-79c5dff769-v95dp" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.917022 4825 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ef475cc-7b9d-475f-bac7-c01ac3d568cb-ovsdbserver-sb\") pod \"dnsmasq-dns-79c5dff769-v95dp\" (UID: \"2ef475cc-7b9d-475f-bac7-c01ac3d568cb\") " pod="openstack/dnsmasq-dns-79c5dff769-v95dp" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.917070 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ef475cc-7b9d-475f-bac7-c01ac3d568cb-ovsdbserver-nb\") pod \"dnsmasq-dns-79c5dff769-v95dp\" (UID: \"2ef475cc-7b9d-475f-bac7-c01ac3d568cb\") " pod="openstack/dnsmasq-dns-79c5dff769-v95dp" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.917097 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ef475cc-7b9d-475f-bac7-c01ac3d568cb-config\") pod \"dnsmasq-dns-79c5dff769-v95dp\" (UID: \"2ef475cc-7b9d-475f-bac7-c01ac3d568cb\") " pod="openstack/dnsmasq-dns-79c5dff769-v95dp" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.917126 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ef475cc-7b9d-475f-bac7-c01ac3d568cb-dns-svc\") pod \"dnsmasq-dns-79c5dff769-v95dp\" (UID: \"2ef475cc-7b9d-475f-bac7-c01ac3d568cb\") " pod="openstack/dnsmasq-dns-79c5dff769-v95dp" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.917972 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ef475cc-7b9d-475f-bac7-c01ac3d568cb-ovsdbserver-sb\") pod \"dnsmasq-dns-79c5dff769-v95dp\" (UID: \"2ef475cc-7b9d-475f-bac7-c01ac3d568cb\") " pod="openstack/dnsmasq-dns-79c5dff769-v95dp" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.918161 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/2ef475cc-7b9d-475f-bac7-c01ac3d568cb-ovsdbserver-nb\") pod \"dnsmasq-dns-79c5dff769-v95dp\" (UID: \"2ef475cc-7b9d-475f-bac7-c01ac3d568cb\") " pod="openstack/dnsmasq-dns-79c5dff769-v95dp" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.918229 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ef475cc-7b9d-475f-bac7-c01ac3d568cb-dns-svc\") pod \"dnsmasq-dns-79c5dff769-v95dp\" (UID: \"2ef475cc-7b9d-475f-bac7-c01ac3d568cb\") " pod="openstack/dnsmasq-dns-79c5dff769-v95dp" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.918538 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ef475cc-7b9d-475f-bac7-c01ac3d568cb-config\") pod \"dnsmasq-dns-79c5dff769-v95dp\" (UID: \"2ef475cc-7b9d-475f-bac7-c01ac3d568cb\") " pod="openstack/dnsmasq-dns-79c5dff769-v95dp" Mar 10 08:09:23 crc kubenswrapper[4825]: I0310 08:09:23.933901 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzn6c\" (UniqueName: \"kubernetes.io/projected/2ef475cc-7b9d-475f-bac7-c01ac3d568cb-kube-api-access-nzn6c\") pod \"dnsmasq-dns-79c5dff769-v95dp\" (UID: \"2ef475cc-7b9d-475f-bac7-c01ac3d568cb\") " pod="openstack/dnsmasq-dns-79c5dff769-v95dp" Mar 10 08:09:24 crc kubenswrapper[4825]: I0310 08:09:24.006791 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79c5dff769-v95dp" Mar 10 08:09:24 crc kubenswrapper[4825]: I0310 08:09:24.277555 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-558cfc7d79-nfbnt" Mar 10 08:09:24 crc kubenswrapper[4825]: I0310 08:09:24.290225 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-558cfc7d79-nfbnt" Mar 10 08:09:24 crc kubenswrapper[4825]: I0310 08:09:24.317778 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 10 08:09:24 crc kubenswrapper[4825]: I0310 08:09:24.323372 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Mar 10 08:09:24 crc kubenswrapper[4825]: I0310 08:09:24.325716 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Mar 10 08:09:24 crc kubenswrapper[4825]: I0310 08:09:24.429777 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ebd29a7-114c-4f66-a459-615abc54c313-config\") pod \"4ebd29a7-114c-4f66-a459-615abc54c313\" (UID: \"4ebd29a7-114c-4f66-a459-615abc54c313\") " Mar 10 08:09:24 crc kubenswrapper[4825]: I0310 08:09:24.429823 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ebd29a7-114c-4f66-a459-615abc54c313-ovsdbserver-sb\") pod \"4ebd29a7-114c-4f66-a459-615abc54c313\" (UID: \"4ebd29a7-114c-4f66-a459-615abc54c313\") " Mar 10 08:09:24 crc kubenswrapper[4825]: I0310 08:09:24.429849 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ebd29a7-114c-4f66-a459-615abc54c313-dns-svc\") pod \"4ebd29a7-114c-4f66-a459-615abc54c313\" (UID: \"4ebd29a7-114c-4f66-a459-615abc54c313\") " Mar 10 08:09:24 crc kubenswrapper[4825]: I0310 08:09:24.430010 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2j8t\" (UniqueName: \"kubernetes.io/projected/4ebd29a7-114c-4f66-a459-615abc54c313-kube-api-access-k2j8t\") pod \"4ebd29a7-114c-4f66-a459-615abc54c313\" (UID: \"4ebd29a7-114c-4f66-a459-615abc54c313\") " Mar 10 08:09:24 crc 
kubenswrapper[4825]: I0310 08:09:24.430407 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ebd29a7-114c-4f66-a459-615abc54c313-config" (OuterVolumeSpecName: "config") pod "4ebd29a7-114c-4f66-a459-615abc54c313" (UID: "4ebd29a7-114c-4f66-a459-615abc54c313"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:09:24 crc kubenswrapper[4825]: I0310 08:09:24.430683 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ebd29a7-114c-4f66-a459-615abc54c313-config\") on node \"crc\" DevicePath \"\"" Mar 10 08:09:24 crc kubenswrapper[4825]: I0310 08:09:24.431591 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ebd29a7-114c-4f66-a459-615abc54c313-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4ebd29a7-114c-4f66-a459-615abc54c313" (UID: "4ebd29a7-114c-4f66-a459-615abc54c313"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:09:24 crc kubenswrapper[4825]: I0310 08:09:24.432098 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ebd29a7-114c-4f66-a459-615abc54c313-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4ebd29a7-114c-4f66-a459-615abc54c313" (UID: "4ebd29a7-114c-4f66-a459-615abc54c313"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:09:24 crc kubenswrapper[4825]: I0310 08:09:24.440957 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ebd29a7-114c-4f66-a459-615abc54c313-kube-api-access-k2j8t" (OuterVolumeSpecName: "kube-api-access-k2j8t") pod "4ebd29a7-114c-4f66-a459-615abc54c313" (UID: "4ebd29a7-114c-4f66-a459-615abc54c313"). InnerVolumeSpecName "kube-api-access-k2j8t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:09:24 crc kubenswrapper[4825]: I0310 08:09:24.493660 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79c5dff769-v95dp"] Mar 10 08:09:24 crc kubenswrapper[4825]: I0310 08:09:24.534284 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2j8t\" (UniqueName: \"kubernetes.io/projected/4ebd29a7-114c-4f66-a459-615abc54c313-kube-api-access-k2j8t\") on node \"crc\" DevicePath \"\"" Mar 10 08:09:24 crc kubenswrapper[4825]: I0310 08:09:24.534316 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ebd29a7-114c-4f66-a459-615abc54c313-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 08:09:24 crc kubenswrapper[4825]: I0310 08:09:24.534326 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ebd29a7-114c-4f66-a459-615abc54c313-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 08:09:25 crc kubenswrapper[4825]: I0310 08:09:25.290654 4825 generic.go:334] "Generic (PLEG): container finished" podID="2ef475cc-7b9d-475f-bac7-c01ac3d568cb" containerID="4748b7071eebf2c2a015322fddcb926947e5b9e97f5486275c5a7087810ad0d0" exitCode=0 Mar 10 08:09:25 crc kubenswrapper[4825]: I0310 08:09:25.290723 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79c5dff769-v95dp" event={"ID":"2ef475cc-7b9d-475f-bac7-c01ac3d568cb","Type":"ContainerDied","Data":"4748b7071eebf2c2a015322fddcb926947e5b9e97f5486275c5a7087810ad0d0"} Mar 10 08:09:25 crc kubenswrapper[4825]: I0310 08:09:25.291171 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79c5dff769-v95dp" event={"ID":"2ef475cc-7b9d-475f-bac7-c01ac3d568cb","Type":"ContainerStarted","Data":"ee5a9cb1b7ddbd1126d2b1a615214e02a485c34bec93f7ad6e43fbeb3f6ebe4c"} Mar 10 08:09:25 crc kubenswrapper[4825]: I0310 08:09:25.291364 4825 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-558cfc7d79-nfbnt" Mar 10 08:09:25 crc kubenswrapper[4825]: I0310 08:09:25.390939 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-558cfc7d79-nfbnt"] Mar 10 08:09:25 crc kubenswrapper[4825]: I0310 08:09:25.424044 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-558cfc7d79-nfbnt"] Mar 10 08:09:26 crc kubenswrapper[4825]: I0310 08:09:26.302204 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79c5dff769-v95dp" event={"ID":"2ef475cc-7b9d-475f-bac7-c01ac3d568cb","Type":"ContainerStarted","Data":"cc19aabad6ff8c1e49bcc4a7b3dd2ae4ae4af696c3cd59f2027c2c0813dd3b57"} Mar 10 08:09:26 crc kubenswrapper[4825]: I0310 08:09:26.302418 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79c5dff769-v95dp" Mar 10 08:09:26 crc kubenswrapper[4825]: I0310 08:09:26.323210 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79c5dff769-v95dp" podStartSLOduration=3.323116725 podStartE2EDuration="3.323116725s" podCreationTimestamp="2026-03-10 08:09:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:09:26.317786125 +0000 UTC m=+5119.347566750" watchObservedRunningTime="2026-03-10 08:09:26.323116725 +0000 UTC m=+5119.352897360" Mar 10 08:09:27 crc kubenswrapper[4825]: I0310 08:09:27.249993 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ebd29a7-114c-4f66-a459-615abc54c313" path="/var/lib/kubelet/pods/4ebd29a7-114c-4f66-a459-615abc54c313/volumes" Mar 10 08:09:27 crc kubenswrapper[4825]: I0310 08:09:27.517723 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Mar 10 08:09:27 crc kubenswrapper[4825]: I0310 08:09:27.519018 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Mar 10 08:09:27 crc kubenswrapper[4825]: I0310 08:09:27.523465 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Mar 10 08:09:27 crc kubenswrapper[4825]: I0310 08:09:27.540815 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Mar 10 08:09:27 crc kubenswrapper[4825]: I0310 08:09:27.612476 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6d1180a2-690c-4b45-b239-887bef574278\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6d1180a2-690c-4b45-b239-887bef574278\") pod \"ovn-copy-data\" (UID: \"45279de6-a8ae-49b0-8182-9b576a1f5e11\") " pod="openstack/ovn-copy-data" Mar 10 08:09:27 crc kubenswrapper[4825]: I0310 08:09:27.612558 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/45279de6-a8ae-49b0-8182-9b576a1f5e11-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"45279de6-a8ae-49b0-8182-9b576a1f5e11\") " pod="openstack/ovn-copy-data" Mar 10 08:09:27 crc kubenswrapper[4825]: I0310 08:09:27.612626 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wm8h\" (UniqueName: \"kubernetes.io/projected/45279de6-a8ae-49b0-8182-9b576a1f5e11-kube-api-access-7wm8h\") pod \"ovn-copy-data\" (UID: \"45279de6-a8ae-49b0-8182-9b576a1f5e11\") " pod="openstack/ovn-copy-data" Mar 10 08:09:27 crc kubenswrapper[4825]: I0310 08:09:27.715486 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6d1180a2-690c-4b45-b239-887bef574278\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6d1180a2-690c-4b45-b239-887bef574278\") pod \"ovn-copy-data\" (UID: \"45279de6-a8ae-49b0-8182-9b576a1f5e11\") " pod="openstack/ovn-copy-data" Mar 10 08:09:27 crc kubenswrapper[4825]: I0310 
08:09:27.715742 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/45279de6-a8ae-49b0-8182-9b576a1f5e11-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"45279de6-a8ae-49b0-8182-9b576a1f5e11\") " pod="openstack/ovn-copy-data" Mar 10 08:09:27 crc kubenswrapper[4825]: I0310 08:09:27.715853 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wm8h\" (UniqueName: \"kubernetes.io/projected/45279de6-a8ae-49b0-8182-9b576a1f5e11-kube-api-access-7wm8h\") pod \"ovn-copy-data\" (UID: \"45279de6-a8ae-49b0-8182-9b576a1f5e11\") " pod="openstack/ovn-copy-data" Mar 10 08:09:27 crc kubenswrapper[4825]: I0310 08:09:27.720600 4825 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 10 08:09:27 crc kubenswrapper[4825]: I0310 08:09:27.720940 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6d1180a2-690c-4b45-b239-887bef574278\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6d1180a2-690c-4b45-b239-887bef574278\") pod \"ovn-copy-data\" (UID: \"45279de6-a8ae-49b0-8182-9b576a1f5e11\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/095659874b44d7e30d6252a2b24401a8c63bcaca35515e3f6a5ab4b8b0038610/globalmount\"" pod="openstack/ovn-copy-data" Mar 10 08:09:27 crc kubenswrapper[4825]: I0310 08:09:27.725625 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/45279de6-a8ae-49b0-8182-9b576a1f5e11-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"45279de6-a8ae-49b0-8182-9b576a1f5e11\") " pod="openstack/ovn-copy-data" Mar 10 08:09:27 crc kubenswrapper[4825]: I0310 08:09:27.742905 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wm8h\" (UniqueName: 
\"kubernetes.io/projected/45279de6-a8ae-49b0-8182-9b576a1f5e11-kube-api-access-7wm8h\") pod \"ovn-copy-data\" (UID: \"45279de6-a8ae-49b0-8182-9b576a1f5e11\") " pod="openstack/ovn-copy-data" Mar 10 08:09:27 crc kubenswrapper[4825]: I0310 08:09:27.777251 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6d1180a2-690c-4b45-b239-887bef574278\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6d1180a2-690c-4b45-b239-887bef574278\") pod \"ovn-copy-data\" (UID: \"45279de6-a8ae-49b0-8182-9b576a1f5e11\") " pod="openstack/ovn-copy-data" Mar 10 08:09:27 crc kubenswrapper[4825]: I0310 08:09:27.850422 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Mar 10 08:09:28 crc kubenswrapper[4825]: I0310 08:09:28.382199 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Mar 10 08:09:28 crc kubenswrapper[4825]: W0310 08:09:28.390542 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45279de6_a8ae_49b0_8182_9b576a1f5e11.slice/crio-81577592660eb68339e1ce86026b02b416d949e0244f3c76122f6912028c4f57 WatchSource:0}: Error finding container 81577592660eb68339e1ce86026b02b416d949e0244f3c76122f6912028c4f57: Status 404 returned error can't find the container with id 81577592660eb68339e1ce86026b02b416d949e0244f3c76122f6912028c4f57 Mar 10 08:09:29 crc kubenswrapper[4825]: I0310 08:09:29.338529 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"45279de6-a8ae-49b0-8182-9b576a1f5e11","Type":"ContainerStarted","Data":"a06eea67b494d75e4c30481fc42af7aea77ef71a8057552e179619bfc049e25c"} Mar 10 08:09:29 crc kubenswrapper[4825]: I0310 08:09:29.338965 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" 
event={"ID":"45279de6-a8ae-49b0-8182-9b576a1f5e11","Type":"ContainerStarted","Data":"81577592660eb68339e1ce86026b02b416d949e0244f3c76122f6912028c4f57"} Mar 10 08:09:29 crc kubenswrapper[4825]: I0310 08:09:29.377559 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=3.176543029 podStartE2EDuration="3.377532779s" podCreationTimestamp="2026-03-10 08:09:26 +0000 UTC" firstStartedPulling="2026-03-10 08:09:28.393362506 +0000 UTC m=+5121.423143121" lastFinishedPulling="2026-03-10 08:09:28.594352236 +0000 UTC m=+5121.624132871" observedRunningTime="2026-03-10 08:09:29.36423449 +0000 UTC m=+5122.394015145" watchObservedRunningTime="2026-03-10 08:09:29.377532779 +0000 UTC m=+5122.407313434" Mar 10 08:09:34 crc kubenswrapper[4825]: I0310 08:09:34.008483 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79c5dff769-v95dp" Mar 10 08:09:34 crc kubenswrapper[4825]: I0310 08:09:34.088271 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76d4984c65-sjjvn"] Mar 10 08:09:34 crc kubenswrapper[4825]: I0310 08:09:34.088519 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76d4984c65-sjjvn" podUID="278f6673-3940-4a04-839d-28ce27000afa" containerName="dnsmasq-dns" containerID="cri-o://6a0b4071218c2ab0cad4195f8bd8cac9a5d9f05eb6135642d68df8bb72bc278f" gracePeriod=10 Mar 10 08:09:34 crc kubenswrapper[4825]: I0310 08:09:34.382952 4825 generic.go:334] "Generic (PLEG): container finished" podID="278f6673-3940-4a04-839d-28ce27000afa" containerID="6a0b4071218c2ab0cad4195f8bd8cac9a5d9f05eb6135642d68df8bb72bc278f" exitCode=0 Mar 10 08:09:34 crc kubenswrapper[4825]: I0310 08:09:34.382998 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76d4984c65-sjjvn" 
event={"ID":"278f6673-3940-4a04-839d-28ce27000afa","Type":"ContainerDied","Data":"6a0b4071218c2ab0cad4195f8bd8cac9a5d9f05eb6135642d68df8bb72bc278f"} Mar 10 08:09:34 crc kubenswrapper[4825]: I0310 08:09:34.561207 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76d4984c65-sjjvn" Mar 10 08:09:34 crc kubenswrapper[4825]: I0310 08:09:34.632012 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/278f6673-3940-4a04-839d-28ce27000afa-config\") pod \"278f6673-3940-4a04-839d-28ce27000afa\" (UID: \"278f6673-3940-4a04-839d-28ce27000afa\") " Mar 10 08:09:34 crc kubenswrapper[4825]: I0310 08:09:34.632087 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/278f6673-3940-4a04-839d-28ce27000afa-dns-svc\") pod \"278f6673-3940-4a04-839d-28ce27000afa\" (UID: \"278f6673-3940-4a04-839d-28ce27000afa\") " Mar 10 08:09:34 crc kubenswrapper[4825]: I0310 08:09:34.632905 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrlb6\" (UniqueName: \"kubernetes.io/projected/278f6673-3940-4a04-839d-28ce27000afa-kube-api-access-wrlb6\") pod \"278f6673-3940-4a04-839d-28ce27000afa\" (UID: \"278f6673-3940-4a04-839d-28ce27000afa\") " Mar 10 08:09:34 crc kubenswrapper[4825]: I0310 08:09:34.638359 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/278f6673-3940-4a04-839d-28ce27000afa-kube-api-access-wrlb6" (OuterVolumeSpecName: "kube-api-access-wrlb6") pod "278f6673-3940-4a04-839d-28ce27000afa" (UID: "278f6673-3940-4a04-839d-28ce27000afa"). InnerVolumeSpecName "kube-api-access-wrlb6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:09:34 crc kubenswrapper[4825]: I0310 08:09:34.668658 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/278f6673-3940-4a04-839d-28ce27000afa-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "278f6673-3940-4a04-839d-28ce27000afa" (UID: "278f6673-3940-4a04-839d-28ce27000afa"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:09:34 crc kubenswrapper[4825]: I0310 08:09:34.672937 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/278f6673-3940-4a04-839d-28ce27000afa-config" (OuterVolumeSpecName: "config") pod "278f6673-3940-4a04-839d-28ce27000afa" (UID: "278f6673-3940-4a04-839d-28ce27000afa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:09:34 crc kubenswrapper[4825]: I0310 08:09:34.735220 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrlb6\" (UniqueName: \"kubernetes.io/projected/278f6673-3940-4a04-839d-28ce27000afa-kube-api-access-wrlb6\") on node \"crc\" DevicePath \"\"" Mar 10 08:09:34 crc kubenswrapper[4825]: I0310 08:09:34.735262 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/278f6673-3940-4a04-839d-28ce27000afa-config\") on node \"crc\" DevicePath \"\"" Mar 10 08:09:34 crc kubenswrapper[4825]: I0310 08:09:34.735305 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/278f6673-3940-4a04-839d-28ce27000afa-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 08:09:35 crc kubenswrapper[4825]: I0310 08:09:35.395109 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76d4984c65-sjjvn" event={"ID":"278f6673-3940-4a04-839d-28ce27000afa","Type":"ContainerDied","Data":"d88e633dce5e2c88d7540ddfe3a3c3cb18ed7fea0a376bbf0cfaf794f3b204bc"} Mar 
10 08:09:35 crc kubenswrapper[4825]: I0310 08:09:35.395937 4825 scope.go:117] "RemoveContainer" containerID="6a0b4071218c2ab0cad4195f8bd8cac9a5d9f05eb6135642d68df8bb72bc278f" Mar 10 08:09:35 crc kubenswrapper[4825]: I0310 08:09:35.396236 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76d4984c65-sjjvn" Mar 10 08:09:35 crc kubenswrapper[4825]: I0310 08:09:35.426501 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76d4984c65-sjjvn"] Mar 10 08:09:35 crc kubenswrapper[4825]: I0310 08:09:35.430690 4825 scope.go:117] "RemoveContainer" containerID="406add66cf15836d786c223e5b2f37d54d9e420827ca22e46ba2f5fd21230e17" Mar 10 08:09:35 crc kubenswrapper[4825]: I0310 08:09:35.436280 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76d4984c65-sjjvn"] Mar 10 08:09:35 crc kubenswrapper[4825]: I0310 08:09:35.536554 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 10 08:09:35 crc kubenswrapper[4825]: E0310 08:09:35.536869 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="278f6673-3940-4a04-839d-28ce27000afa" containerName="dnsmasq-dns" Mar 10 08:09:35 crc kubenswrapper[4825]: I0310 08:09:35.536882 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="278f6673-3940-4a04-839d-28ce27000afa" containerName="dnsmasq-dns" Mar 10 08:09:35 crc kubenswrapper[4825]: E0310 08:09:35.536895 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="278f6673-3940-4a04-839d-28ce27000afa" containerName="init" Mar 10 08:09:35 crc kubenswrapper[4825]: I0310 08:09:35.536900 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="278f6673-3940-4a04-839d-28ce27000afa" containerName="init" Mar 10 08:09:35 crc kubenswrapper[4825]: I0310 08:09:35.537067 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="278f6673-3940-4a04-839d-28ce27000afa" containerName="dnsmasq-dns" Mar 10 08:09:35 crc 
kubenswrapper[4825]: I0310 08:09:35.537888 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 10 08:09:35 crc kubenswrapper[4825]: I0310 08:09:35.541606 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 10 08:09:35 crc kubenswrapper[4825]: I0310 08:09:35.544418 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-qsq2m" Mar 10 08:09:35 crc kubenswrapper[4825]: I0310 08:09:35.545366 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 10 08:09:35 crc kubenswrapper[4825]: I0310 08:09:35.546074 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 10 08:09:35 crc kubenswrapper[4825]: I0310 08:09:35.550797 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 10 08:09:35 crc kubenswrapper[4825]: I0310 08:09:35.650005 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dddf17b-b47a-40c1-b6e9-99de860dd2bf-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6dddf17b-b47a-40c1-b6e9-99de860dd2bf\") " pod="openstack/ovn-northd-0" Mar 10 08:09:35 crc kubenswrapper[4825]: I0310 08:09:35.650346 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dddf17b-b47a-40c1-b6e9-99de860dd2bf-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6dddf17b-b47a-40c1-b6e9-99de860dd2bf\") " pod="openstack/ovn-northd-0" Mar 10 08:09:35 crc kubenswrapper[4825]: I0310 08:09:35.650390 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6dddf17b-b47a-40c1-b6e9-99de860dd2bf-config\") pod \"ovn-northd-0\" (UID: \"6dddf17b-b47a-40c1-b6e9-99de860dd2bf\") " pod="openstack/ovn-northd-0" Mar 10 08:09:35 crc kubenswrapper[4825]: I0310 08:09:35.650455 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spz7z\" (UniqueName: \"kubernetes.io/projected/6dddf17b-b47a-40c1-b6e9-99de860dd2bf-kube-api-access-spz7z\") pod \"ovn-northd-0\" (UID: \"6dddf17b-b47a-40c1-b6e9-99de860dd2bf\") " pod="openstack/ovn-northd-0" Mar 10 08:09:35 crc kubenswrapper[4825]: I0310 08:09:35.650677 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6dddf17b-b47a-40c1-b6e9-99de860dd2bf-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6dddf17b-b47a-40c1-b6e9-99de860dd2bf\") " pod="openstack/ovn-northd-0" Mar 10 08:09:35 crc kubenswrapper[4825]: I0310 08:09:35.650792 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6dddf17b-b47a-40c1-b6e9-99de860dd2bf-scripts\") pod \"ovn-northd-0\" (UID: \"6dddf17b-b47a-40c1-b6e9-99de860dd2bf\") " pod="openstack/ovn-northd-0" Mar 10 08:09:35 crc kubenswrapper[4825]: I0310 08:09:35.650857 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dddf17b-b47a-40c1-b6e9-99de860dd2bf-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6dddf17b-b47a-40c1-b6e9-99de860dd2bf\") " pod="openstack/ovn-northd-0" Mar 10 08:09:35 crc kubenswrapper[4825]: I0310 08:09:35.751997 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spz7z\" (UniqueName: \"kubernetes.io/projected/6dddf17b-b47a-40c1-b6e9-99de860dd2bf-kube-api-access-spz7z\") pod \"ovn-northd-0\" (UID: 
\"6dddf17b-b47a-40c1-b6e9-99de860dd2bf\") " pod="openstack/ovn-northd-0" Mar 10 08:09:35 crc kubenswrapper[4825]: I0310 08:09:35.752095 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6dddf17b-b47a-40c1-b6e9-99de860dd2bf-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6dddf17b-b47a-40c1-b6e9-99de860dd2bf\") " pod="openstack/ovn-northd-0" Mar 10 08:09:35 crc kubenswrapper[4825]: I0310 08:09:35.752166 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6dddf17b-b47a-40c1-b6e9-99de860dd2bf-scripts\") pod \"ovn-northd-0\" (UID: \"6dddf17b-b47a-40c1-b6e9-99de860dd2bf\") " pod="openstack/ovn-northd-0" Mar 10 08:09:35 crc kubenswrapper[4825]: I0310 08:09:35.752203 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dddf17b-b47a-40c1-b6e9-99de860dd2bf-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6dddf17b-b47a-40c1-b6e9-99de860dd2bf\") " pod="openstack/ovn-northd-0" Mar 10 08:09:35 crc kubenswrapper[4825]: I0310 08:09:35.752234 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dddf17b-b47a-40c1-b6e9-99de860dd2bf-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6dddf17b-b47a-40c1-b6e9-99de860dd2bf\") " pod="openstack/ovn-northd-0" Mar 10 08:09:35 crc kubenswrapper[4825]: I0310 08:09:35.752255 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dddf17b-b47a-40c1-b6e9-99de860dd2bf-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6dddf17b-b47a-40c1-b6e9-99de860dd2bf\") " pod="openstack/ovn-northd-0" Mar 10 08:09:35 crc kubenswrapper[4825]: I0310 08:09:35.752290 4825 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dddf17b-b47a-40c1-b6e9-99de860dd2bf-config\") pod \"ovn-northd-0\" (UID: \"6dddf17b-b47a-40c1-b6e9-99de860dd2bf\") " pod="openstack/ovn-northd-0" Mar 10 08:09:35 crc kubenswrapper[4825]: I0310 08:09:35.753173 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dddf17b-b47a-40c1-b6e9-99de860dd2bf-config\") pod \"ovn-northd-0\" (UID: \"6dddf17b-b47a-40c1-b6e9-99de860dd2bf\") " pod="openstack/ovn-northd-0" Mar 10 08:09:35 crc kubenswrapper[4825]: I0310 08:09:35.753677 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6dddf17b-b47a-40c1-b6e9-99de860dd2bf-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6dddf17b-b47a-40c1-b6e9-99de860dd2bf\") " pod="openstack/ovn-northd-0" Mar 10 08:09:35 crc kubenswrapper[4825]: I0310 08:09:35.754240 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6dddf17b-b47a-40c1-b6e9-99de860dd2bf-scripts\") pod \"ovn-northd-0\" (UID: \"6dddf17b-b47a-40c1-b6e9-99de860dd2bf\") " pod="openstack/ovn-northd-0" Mar 10 08:09:35 crc kubenswrapper[4825]: I0310 08:09:35.760378 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dddf17b-b47a-40c1-b6e9-99de860dd2bf-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6dddf17b-b47a-40c1-b6e9-99de860dd2bf\") " pod="openstack/ovn-northd-0" Mar 10 08:09:35 crc kubenswrapper[4825]: I0310 08:09:35.760973 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dddf17b-b47a-40c1-b6e9-99de860dd2bf-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6dddf17b-b47a-40c1-b6e9-99de860dd2bf\") " pod="openstack/ovn-northd-0" Mar 10 08:09:35 crc kubenswrapper[4825]: I0310 08:09:35.769874 4825 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dddf17b-b47a-40c1-b6e9-99de860dd2bf-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6dddf17b-b47a-40c1-b6e9-99de860dd2bf\") " pod="openstack/ovn-northd-0" Mar 10 08:09:35 crc kubenswrapper[4825]: I0310 08:09:35.773680 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spz7z\" (UniqueName: \"kubernetes.io/projected/6dddf17b-b47a-40c1-b6e9-99de860dd2bf-kube-api-access-spz7z\") pod \"ovn-northd-0\" (UID: \"6dddf17b-b47a-40c1-b6e9-99de860dd2bf\") " pod="openstack/ovn-northd-0" Mar 10 08:09:35 crc kubenswrapper[4825]: I0310 08:09:35.853363 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 10 08:09:36 crc kubenswrapper[4825]: W0310 08:09:36.278078 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6dddf17b_b47a_40c1_b6e9_99de860dd2bf.slice/crio-36de12bb30e9700694cc2dfa849174d278ef9e15046bd792be8fd23a1245d32a WatchSource:0}: Error finding container 36de12bb30e9700694cc2dfa849174d278ef9e15046bd792be8fd23a1245d32a: Status 404 returned error can't find the container with id 36de12bb30e9700694cc2dfa849174d278ef9e15046bd792be8fd23a1245d32a Mar 10 08:09:36 crc kubenswrapper[4825]: I0310 08:09:36.278398 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 10 08:09:36 crc kubenswrapper[4825]: I0310 08:09:36.406181 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6dddf17b-b47a-40c1-b6e9-99de860dd2bf","Type":"ContainerStarted","Data":"36de12bb30e9700694cc2dfa849174d278ef9e15046bd792be8fd23a1245d32a"} Mar 10 08:09:37 crc kubenswrapper[4825]: I0310 08:09:37.246656 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="278f6673-3940-4a04-839d-28ce27000afa" 
path="/var/lib/kubelet/pods/278f6673-3940-4a04-839d-28ce27000afa/volumes" Mar 10 08:09:37 crc kubenswrapper[4825]: I0310 08:09:37.420968 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6dddf17b-b47a-40c1-b6e9-99de860dd2bf","Type":"ContainerStarted","Data":"1b0b0d073692fd59c58f38ae9d62455095a43f44076ba308a2d77b5207f005b2"} Mar 10 08:09:37 crc kubenswrapper[4825]: I0310 08:09:37.421038 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6dddf17b-b47a-40c1-b6e9-99de860dd2bf","Type":"ContainerStarted","Data":"98ea81aebeb891d921ab4c869e8b85eb765f98a36817cf3ec95ebe9e22d492bf"} Mar 10 08:09:37 crc kubenswrapper[4825]: I0310 08:09:37.421093 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 10 08:09:37 crc kubenswrapper[4825]: I0310 08:09:37.442202 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.7880174389999999 podStartE2EDuration="2.442180973s" podCreationTimestamp="2026-03-10 08:09:35 +0000 UTC" firstStartedPulling="2026-03-10 08:09:36.279948233 +0000 UTC m=+5129.309728838" lastFinishedPulling="2026-03-10 08:09:36.934111757 +0000 UTC m=+5129.963892372" observedRunningTime="2026-03-10 08:09:37.438487996 +0000 UTC m=+5130.468268641" watchObservedRunningTime="2026-03-10 08:09:37.442180973 +0000 UTC m=+5130.471961608" Mar 10 08:09:40 crc kubenswrapper[4825]: I0310 08:09:40.174929 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-rn9vj"] Mar 10 08:09:40 crc kubenswrapper[4825]: I0310 08:09:40.179793 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-rn9vj" Mar 10 08:09:40 crc kubenswrapper[4825]: I0310 08:09:40.185280 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-rn9vj"] Mar 10 08:09:40 crc kubenswrapper[4825]: I0310 08:09:40.272803 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-ccee-account-create-update-dnx7s"] Mar 10 08:09:40 crc kubenswrapper[4825]: I0310 08:09:40.274156 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ccee-account-create-update-dnx7s" Mar 10 08:09:40 crc kubenswrapper[4825]: I0310 08:09:40.280638 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-ccee-account-create-update-dnx7s"] Mar 10 08:09:40 crc kubenswrapper[4825]: I0310 08:09:40.281481 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jfwk\" (UniqueName: \"kubernetes.io/projected/95bf6167-d4ef-425d-a6ea-7678a3fb8956-kube-api-access-9jfwk\") pod \"keystone-db-create-rn9vj\" (UID: \"95bf6167-d4ef-425d-a6ea-7678a3fb8956\") " pod="openstack/keystone-db-create-rn9vj" Mar 10 08:09:40 crc kubenswrapper[4825]: I0310 08:09:40.281583 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95bf6167-d4ef-425d-a6ea-7678a3fb8956-operator-scripts\") pod \"keystone-db-create-rn9vj\" (UID: \"95bf6167-d4ef-425d-a6ea-7678a3fb8956\") " pod="openstack/keystone-db-create-rn9vj" Mar 10 08:09:40 crc kubenswrapper[4825]: I0310 08:09:40.311864 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 10 08:09:40 crc kubenswrapper[4825]: I0310 08:09:40.383560 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh7tr\" (UniqueName: 
\"kubernetes.io/projected/71b648ef-f8a8-475f-a831-8ebb97d1d57e-kube-api-access-hh7tr\") pod \"keystone-ccee-account-create-update-dnx7s\" (UID: \"71b648ef-f8a8-475f-a831-8ebb97d1d57e\") " pod="openstack/keystone-ccee-account-create-update-dnx7s" Mar 10 08:09:40 crc kubenswrapper[4825]: I0310 08:09:40.383705 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95bf6167-d4ef-425d-a6ea-7678a3fb8956-operator-scripts\") pod \"keystone-db-create-rn9vj\" (UID: \"95bf6167-d4ef-425d-a6ea-7678a3fb8956\") " pod="openstack/keystone-db-create-rn9vj" Mar 10 08:09:40 crc kubenswrapper[4825]: I0310 08:09:40.383785 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jfwk\" (UniqueName: \"kubernetes.io/projected/95bf6167-d4ef-425d-a6ea-7678a3fb8956-kube-api-access-9jfwk\") pod \"keystone-db-create-rn9vj\" (UID: \"95bf6167-d4ef-425d-a6ea-7678a3fb8956\") " pod="openstack/keystone-db-create-rn9vj" Mar 10 08:09:40 crc kubenswrapper[4825]: I0310 08:09:40.383835 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71b648ef-f8a8-475f-a831-8ebb97d1d57e-operator-scripts\") pod \"keystone-ccee-account-create-update-dnx7s\" (UID: \"71b648ef-f8a8-475f-a831-8ebb97d1d57e\") " pod="openstack/keystone-ccee-account-create-update-dnx7s" Mar 10 08:09:40 crc kubenswrapper[4825]: I0310 08:09:40.385546 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95bf6167-d4ef-425d-a6ea-7678a3fb8956-operator-scripts\") pod \"keystone-db-create-rn9vj\" (UID: \"95bf6167-d4ef-425d-a6ea-7678a3fb8956\") " pod="openstack/keystone-db-create-rn9vj" Mar 10 08:09:40 crc kubenswrapper[4825]: I0310 08:09:40.405918 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9jfwk\" (UniqueName: \"kubernetes.io/projected/95bf6167-d4ef-425d-a6ea-7678a3fb8956-kube-api-access-9jfwk\") pod \"keystone-db-create-rn9vj\" (UID: \"95bf6167-d4ef-425d-a6ea-7678a3fb8956\") " pod="openstack/keystone-db-create-rn9vj" Mar 10 08:09:40 crc kubenswrapper[4825]: I0310 08:09:40.485583 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71b648ef-f8a8-475f-a831-8ebb97d1d57e-operator-scripts\") pod \"keystone-ccee-account-create-update-dnx7s\" (UID: \"71b648ef-f8a8-475f-a831-8ebb97d1d57e\") " pod="openstack/keystone-ccee-account-create-update-dnx7s" Mar 10 08:09:40 crc kubenswrapper[4825]: I0310 08:09:40.485694 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hh7tr\" (UniqueName: \"kubernetes.io/projected/71b648ef-f8a8-475f-a831-8ebb97d1d57e-kube-api-access-hh7tr\") pod \"keystone-ccee-account-create-update-dnx7s\" (UID: \"71b648ef-f8a8-475f-a831-8ebb97d1d57e\") " pod="openstack/keystone-ccee-account-create-update-dnx7s" Mar 10 08:09:40 crc kubenswrapper[4825]: I0310 08:09:40.486373 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71b648ef-f8a8-475f-a831-8ebb97d1d57e-operator-scripts\") pod \"keystone-ccee-account-create-update-dnx7s\" (UID: \"71b648ef-f8a8-475f-a831-8ebb97d1d57e\") " pod="openstack/keystone-ccee-account-create-update-dnx7s" Mar 10 08:09:40 crc kubenswrapper[4825]: I0310 08:09:40.502609 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh7tr\" (UniqueName: \"kubernetes.io/projected/71b648ef-f8a8-475f-a831-8ebb97d1d57e-kube-api-access-hh7tr\") pod \"keystone-ccee-account-create-update-dnx7s\" (UID: \"71b648ef-f8a8-475f-a831-8ebb97d1d57e\") " pod="openstack/keystone-ccee-account-create-update-dnx7s" Mar 10 08:09:40 crc kubenswrapper[4825]: I0310 08:09:40.511926 4825 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-rn9vj" Mar 10 08:09:40 crc kubenswrapper[4825]: I0310 08:09:40.626455 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ccee-account-create-update-dnx7s" Mar 10 08:09:40 crc kubenswrapper[4825]: I0310 08:09:40.905465 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-rn9vj"] Mar 10 08:09:40 crc kubenswrapper[4825]: W0310 08:09:40.910736 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95bf6167_d4ef_425d_a6ea_7678a3fb8956.slice/crio-ebae54d019c535ea5993714537d4b28910490d5f783b15ef83f3449ebad18e57 WatchSource:0}: Error finding container ebae54d019c535ea5993714537d4b28910490d5f783b15ef83f3449ebad18e57: Status 404 returned error can't find the container with id ebae54d019c535ea5993714537d4b28910490d5f783b15ef83f3449ebad18e57 Mar 10 08:09:40 crc kubenswrapper[4825]: I0310 08:09:40.983204 4825 scope.go:117] "RemoveContainer" containerID="22f9a4619cc3afadea5907a8b0d49e13d5f5e340f38d071e4d45e57275fe5daf" Mar 10 08:09:41 crc kubenswrapper[4825]: I0310 08:09:41.021348 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-ccee-account-create-update-dnx7s"] Mar 10 08:09:41 crc kubenswrapper[4825]: W0310 08:09:41.021685 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71b648ef_f8a8_475f_a831_8ebb97d1d57e.slice/crio-f6985f627f432d370bf32d6236debd8fe6f79a0f3303fc9bf0ec6c4a542e6b86 WatchSource:0}: Error finding container f6985f627f432d370bf32d6236debd8fe6f79a0f3303fc9bf0ec6c4a542e6b86: Status 404 returned error can't find the container with id f6985f627f432d370bf32d6236debd8fe6f79a0f3303fc9bf0ec6c4a542e6b86 Mar 10 08:09:41 crc kubenswrapper[4825]: I0310 08:09:41.453539 4825 generic.go:334] "Generic 
(PLEG): container finished" podID="95bf6167-d4ef-425d-a6ea-7678a3fb8956" containerID="1748fcb25c55190745bc6107d4e046b63ffd4821e2b74fdf4c5ec5c198c3087f" exitCode=0 Mar 10 08:09:41 crc kubenswrapper[4825]: I0310 08:09:41.453671 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-rn9vj" event={"ID":"95bf6167-d4ef-425d-a6ea-7678a3fb8956","Type":"ContainerDied","Data":"1748fcb25c55190745bc6107d4e046b63ffd4821e2b74fdf4c5ec5c198c3087f"} Mar 10 08:09:41 crc kubenswrapper[4825]: I0310 08:09:41.453739 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-rn9vj" event={"ID":"95bf6167-d4ef-425d-a6ea-7678a3fb8956","Type":"ContainerStarted","Data":"ebae54d019c535ea5993714537d4b28910490d5f783b15ef83f3449ebad18e57"} Mar 10 08:09:41 crc kubenswrapper[4825]: I0310 08:09:41.456485 4825 generic.go:334] "Generic (PLEG): container finished" podID="71b648ef-f8a8-475f-a831-8ebb97d1d57e" containerID="5fc0fd428da0df49f8a2e5b863ca67ba564a9dcca824b1d01642d3c5a44ba489" exitCode=0 Mar 10 08:09:41 crc kubenswrapper[4825]: I0310 08:09:41.456538 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ccee-account-create-update-dnx7s" event={"ID":"71b648ef-f8a8-475f-a831-8ebb97d1d57e","Type":"ContainerDied","Data":"5fc0fd428da0df49f8a2e5b863ca67ba564a9dcca824b1d01642d3c5a44ba489"} Mar 10 08:09:41 crc kubenswrapper[4825]: I0310 08:09:41.456598 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ccee-account-create-update-dnx7s" event={"ID":"71b648ef-f8a8-475f-a831-8ebb97d1d57e","Type":"ContainerStarted","Data":"f6985f627f432d370bf32d6236debd8fe6f79a0f3303fc9bf0ec6c4a542e6b86"} Mar 10 08:09:42 crc kubenswrapper[4825]: I0310 08:09:42.907121 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-ccee-account-create-update-dnx7s" Mar 10 08:09:42 crc kubenswrapper[4825]: I0310 08:09:42.913814 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-rn9vj" Mar 10 08:09:42 crc kubenswrapper[4825]: I0310 08:09:42.928569 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71b648ef-f8a8-475f-a831-8ebb97d1d57e-operator-scripts\") pod \"71b648ef-f8a8-475f-a831-8ebb97d1d57e\" (UID: \"71b648ef-f8a8-475f-a831-8ebb97d1d57e\") " Mar 10 08:09:42 crc kubenswrapper[4825]: I0310 08:09:42.928878 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hh7tr\" (UniqueName: \"kubernetes.io/projected/71b648ef-f8a8-475f-a831-8ebb97d1d57e-kube-api-access-hh7tr\") pod \"71b648ef-f8a8-475f-a831-8ebb97d1d57e\" (UID: \"71b648ef-f8a8-475f-a831-8ebb97d1d57e\") " Mar 10 08:09:42 crc kubenswrapper[4825]: I0310 08:09:42.930580 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71b648ef-f8a8-475f-a831-8ebb97d1d57e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "71b648ef-f8a8-475f-a831-8ebb97d1d57e" (UID: "71b648ef-f8a8-475f-a831-8ebb97d1d57e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:09:42 crc kubenswrapper[4825]: I0310 08:09:42.936536 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71b648ef-f8a8-475f-a831-8ebb97d1d57e-kube-api-access-hh7tr" (OuterVolumeSpecName: "kube-api-access-hh7tr") pod "71b648ef-f8a8-475f-a831-8ebb97d1d57e" (UID: "71b648ef-f8a8-475f-a831-8ebb97d1d57e"). InnerVolumeSpecName "kube-api-access-hh7tr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:09:43 crc kubenswrapper[4825]: I0310 08:09:43.030412 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jfwk\" (UniqueName: \"kubernetes.io/projected/95bf6167-d4ef-425d-a6ea-7678a3fb8956-kube-api-access-9jfwk\") pod \"95bf6167-d4ef-425d-a6ea-7678a3fb8956\" (UID: \"95bf6167-d4ef-425d-a6ea-7678a3fb8956\") " Mar 10 08:09:43 crc kubenswrapper[4825]: I0310 08:09:43.030977 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95bf6167-d4ef-425d-a6ea-7678a3fb8956-operator-scripts\") pod \"95bf6167-d4ef-425d-a6ea-7678a3fb8956\" (UID: \"95bf6167-d4ef-425d-a6ea-7678a3fb8956\") " Mar 10 08:09:43 crc kubenswrapper[4825]: I0310 08:09:43.031596 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hh7tr\" (UniqueName: \"kubernetes.io/projected/71b648ef-f8a8-475f-a831-8ebb97d1d57e-kube-api-access-hh7tr\") on node \"crc\" DevicePath \"\"" Mar 10 08:09:43 crc kubenswrapper[4825]: I0310 08:09:43.031633 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71b648ef-f8a8-475f-a831-8ebb97d1d57e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 08:09:43 crc kubenswrapper[4825]: I0310 08:09:43.031619 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95bf6167-d4ef-425d-a6ea-7678a3fb8956-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "95bf6167-d4ef-425d-a6ea-7678a3fb8956" (UID: "95bf6167-d4ef-425d-a6ea-7678a3fb8956"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:09:43 crc kubenswrapper[4825]: I0310 08:09:43.034901 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95bf6167-d4ef-425d-a6ea-7678a3fb8956-kube-api-access-9jfwk" (OuterVolumeSpecName: "kube-api-access-9jfwk") pod "95bf6167-d4ef-425d-a6ea-7678a3fb8956" (UID: "95bf6167-d4ef-425d-a6ea-7678a3fb8956"). InnerVolumeSpecName "kube-api-access-9jfwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:09:43 crc kubenswrapper[4825]: I0310 08:09:43.133932 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95bf6167-d4ef-425d-a6ea-7678a3fb8956-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 08:09:43 crc kubenswrapper[4825]: I0310 08:09:43.133963 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jfwk\" (UniqueName: \"kubernetes.io/projected/95bf6167-d4ef-425d-a6ea-7678a3fb8956-kube-api-access-9jfwk\") on node \"crc\" DevicePath \"\"" Mar 10 08:09:43 crc kubenswrapper[4825]: I0310 08:09:43.471627 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ccee-account-create-update-dnx7s" event={"ID":"71b648ef-f8a8-475f-a831-8ebb97d1d57e","Type":"ContainerDied","Data":"f6985f627f432d370bf32d6236debd8fe6f79a0f3303fc9bf0ec6c4a542e6b86"} Mar 10 08:09:43 crc kubenswrapper[4825]: I0310 08:09:43.471674 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6985f627f432d370bf32d6236debd8fe6f79a0f3303fc9bf0ec6c4a542e6b86" Mar 10 08:09:43 crc kubenswrapper[4825]: I0310 08:09:43.471704 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-ccee-account-create-update-dnx7s" Mar 10 08:09:43 crc kubenswrapper[4825]: I0310 08:09:43.473842 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-rn9vj" event={"ID":"95bf6167-d4ef-425d-a6ea-7678a3fb8956","Type":"ContainerDied","Data":"ebae54d019c535ea5993714537d4b28910490d5f783b15ef83f3449ebad18e57"} Mar 10 08:09:43 crc kubenswrapper[4825]: I0310 08:09:43.473882 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebae54d019c535ea5993714537d4b28910490d5f783b15ef83f3449ebad18e57" Mar 10 08:09:43 crc kubenswrapper[4825]: I0310 08:09:43.473915 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-rn9vj" Mar 10 08:09:45 crc kubenswrapper[4825]: I0310 08:09:45.698557 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-ksp8z"] Mar 10 08:09:45 crc kubenswrapper[4825]: E0310 08:09:45.699224 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71b648ef-f8a8-475f-a831-8ebb97d1d57e" containerName="mariadb-account-create-update" Mar 10 08:09:45 crc kubenswrapper[4825]: I0310 08:09:45.699242 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="71b648ef-f8a8-475f-a831-8ebb97d1d57e" containerName="mariadb-account-create-update" Mar 10 08:09:45 crc kubenswrapper[4825]: E0310 08:09:45.699283 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95bf6167-d4ef-425d-a6ea-7678a3fb8956" containerName="mariadb-database-create" Mar 10 08:09:45 crc kubenswrapper[4825]: I0310 08:09:45.699291 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="95bf6167-d4ef-425d-a6ea-7678a3fb8956" containerName="mariadb-database-create" Mar 10 08:09:45 crc kubenswrapper[4825]: I0310 08:09:45.699499 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="95bf6167-d4ef-425d-a6ea-7678a3fb8956" containerName="mariadb-database-create" 
Mar 10 08:09:45 crc kubenswrapper[4825]: I0310 08:09:45.699520 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="71b648ef-f8a8-475f-a831-8ebb97d1d57e" containerName="mariadb-account-create-update" Mar 10 08:09:45 crc kubenswrapper[4825]: I0310 08:09:45.700237 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-ksp8z" Mar 10 08:09:45 crc kubenswrapper[4825]: I0310 08:09:45.706267 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-ksp8z"] Mar 10 08:09:45 crc kubenswrapper[4825]: I0310 08:09:45.710504 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 10 08:09:45 crc kubenswrapper[4825]: I0310 08:09:45.710685 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 10 08:09:45 crc kubenswrapper[4825]: I0310 08:09:45.710870 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rghcj" Mar 10 08:09:45 crc kubenswrapper[4825]: I0310 08:09:45.711073 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 10 08:09:45 crc kubenswrapper[4825]: I0310 08:09:45.778968 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8565e58-689e-417e-a072-cc3a26fed9a3-combined-ca-bundle\") pod \"keystone-db-sync-ksp8z\" (UID: \"d8565e58-689e-417e-a072-cc3a26fed9a3\") " pod="openstack/keystone-db-sync-ksp8z" Mar 10 08:09:45 crc kubenswrapper[4825]: I0310 08:09:45.779029 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnjlf\" (UniqueName: \"kubernetes.io/projected/d8565e58-689e-417e-a072-cc3a26fed9a3-kube-api-access-qnjlf\") pod \"keystone-db-sync-ksp8z\" (UID: \"d8565e58-689e-417e-a072-cc3a26fed9a3\") " 
pod="openstack/keystone-db-sync-ksp8z" Mar 10 08:09:45 crc kubenswrapper[4825]: I0310 08:09:45.779322 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8565e58-689e-417e-a072-cc3a26fed9a3-config-data\") pod \"keystone-db-sync-ksp8z\" (UID: \"d8565e58-689e-417e-a072-cc3a26fed9a3\") " pod="openstack/keystone-db-sync-ksp8z" Mar 10 08:09:45 crc kubenswrapper[4825]: I0310 08:09:45.881036 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8565e58-689e-417e-a072-cc3a26fed9a3-config-data\") pod \"keystone-db-sync-ksp8z\" (UID: \"d8565e58-689e-417e-a072-cc3a26fed9a3\") " pod="openstack/keystone-db-sync-ksp8z" Mar 10 08:09:45 crc kubenswrapper[4825]: I0310 08:09:45.881359 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8565e58-689e-417e-a072-cc3a26fed9a3-combined-ca-bundle\") pod \"keystone-db-sync-ksp8z\" (UID: \"d8565e58-689e-417e-a072-cc3a26fed9a3\") " pod="openstack/keystone-db-sync-ksp8z" Mar 10 08:09:45 crc kubenswrapper[4825]: I0310 08:09:45.881407 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnjlf\" (UniqueName: \"kubernetes.io/projected/d8565e58-689e-417e-a072-cc3a26fed9a3-kube-api-access-qnjlf\") pod \"keystone-db-sync-ksp8z\" (UID: \"d8565e58-689e-417e-a072-cc3a26fed9a3\") " pod="openstack/keystone-db-sync-ksp8z" Mar 10 08:09:45 crc kubenswrapper[4825]: I0310 08:09:45.899388 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8565e58-689e-417e-a072-cc3a26fed9a3-combined-ca-bundle\") pod \"keystone-db-sync-ksp8z\" (UID: \"d8565e58-689e-417e-a072-cc3a26fed9a3\") " pod="openstack/keystone-db-sync-ksp8z" Mar 10 08:09:45 crc kubenswrapper[4825]: I0310 
08:09:45.900166 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8565e58-689e-417e-a072-cc3a26fed9a3-config-data\") pod \"keystone-db-sync-ksp8z\" (UID: \"d8565e58-689e-417e-a072-cc3a26fed9a3\") " pod="openstack/keystone-db-sync-ksp8z" Mar 10 08:09:45 crc kubenswrapper[4825]: I0310 08:09:45.907762 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnjlf\" (UniqueName: \"kubernetes.io/projected/d8565e58-689e-417e-a072-cc3a26fed9a3-kube-api-access-qnjlf\") pod \"keystone-db-sync-ksp8z\" (UID: \"d8565e58-689e-417e-a072-cc3a26fed9a3\") " pod="openstack/keystone-db-sync-ksp8z" Mar 10 08:09:46 crc kubenswrapper[4825]: I0310 08:09:46.022163 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-ksp8z" Mar 10 08:09:46 crc kubenswrapper[4825]: I0310 08:09:46.528103 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-ksp8z"] Mar 10 08:09:46 crc kubenswrapper[4825]: I0310 08:09:46.888696 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 08:09:46 crc kubenswrapper[4825]: I0310 08:09:46.888759 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 08:09:47 crc kubenswrapper[4825]: I0310 08:09:47.504528 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ksp8z" 
event={"ID":"d8565e58-689e-417e-a072-cc3a26fed9a3","Type":"ContainerStarted","Data":"a79885db7b9858b342c0d1a93afd6ff19008139f0b3a8fc907fb36e32d50a4d7"} Mar 10 08:09:52 crc kubenswrapper[4825]: I0310 08:09:52.545187 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ksp8z" event={"ID":"d8565e58-689e-417e-a072-cc3a26fed9a3","Type":"ContainerStarted","Data":"2eb2b8f487a346d97915c6289d53b6f34478daa1b1a82707defd22b2970a4eab"} Mar 10 08:09:52 crc kubenswrapper[4825]: I0310 08:09:52.570715 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-ksp8z" podStartSLOduration=2.76253324 podStartE2EDuration="7.570703713s" podCreationTimestamp="2026-03-10 08:09:45 +0000 UTC" firstStartedPulling="2026-03-10 08:09:46.524079368 +0000 UTC m=+5139.553859983" lastFinishedPulling="2026-03-10 08:09:51.332249831 +0000 UTC m=+5144.362030456" observedRunningTime="2026-03-10 08:09:52.568186867 +0000 UTC m=+5145.597967542" watchObservedRunningTime="2026-03-10 08:09:52.570703713 +0000 UTC m=+5145.600484328" Mar 10 08:09:53 crc kubenswrapper[4825]: I0310 08:09:53.556095 4825 generic.go:334] "Generic (PLEG): container finished" podID="d8565e58-689e-417e-a072-cc3a26fed9a3" containerID="2eb2b8f487a346d97915c6289d53b6f34478daa1b1a82707defd22b2970a4eab" exitCode=0 Mar 10 08:09:53 crc kubenswrapper[4825]: I0310 08:09:53.556162 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ksp8z" event={"ID":"d8565e58-689e-417e-a072-cc3a26fed9a3","Type":"ContainerDied","Data":"2eb2b8f487a346d97915c6289d53b6f34478daa1b1a82707defd22b2970a4eab"} Mar 10 08:09:54 crc kubenswrapper[4825]: I0310 08:09:54.951482 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-ksp8z" Mar 10 08:09:55 crc kubenswrapper[4825]: I0310 08:09:55.057278 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8565e58-689e-417e-a072-cc3a26fed9a3-config-data\") pod \"d8565e58-689e-417e-a072-cc3a26fed9a3\" (UID: \"d8565e58-689e-417e-a072-cc3a26fed9a3\") " Mar 10 08:09:55 crc kubenswrapper[4825]: I0310 08:09:55.057833 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnjlf\" (UniqueName: \"kubernetes.io/projected/d8565e58-689e-417e-a072-cc3a26fed9a3-kube-api-access-qnjlf\") pod \"d8565e58-689e-417e-a072-cc3a26fed9a3\" (UID: \"d8565e58-689e-417e-a072-cc3a26fed9a3\") " Mar 10 08:09:55 crc kubenswrapper[4825]: I0310 08:09:55.057924 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8565e58-689e-417e-a072-cc3a26fed9a3-combined-ca-bundle\") pod \"d8565e58-689e-417e-a072-cc3a26fed9a3\" (UID: \"d8565e58-689e-417e-a072-cc3a26fed9a3\") " Mar 10 08:09:55 crc kubenswrapper[4825]: I0310 08:09:55.062522 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8565e58-689e-417e-a072-cc3a26fed9a3-kube-api-access-qnjlf" (OuterVolumeSpecName: "kube-api-access-qnjlf") pod "d8565e58-689e-417e-a072-cc3a26fed9a3" (UID: "d8565e58-689e-417e-a072-cc3a26fed9a3"). InnerVolumeSpecName "kube-api-access-qnjlf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:09:55 crc kubenswrapper[4825]: I0310 08:09:55.080610 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8565e58-689e-417e-a072-cc3a26fed9a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d8565e58-689e-417e-a072-cc3a26fed9a3" (UID: "d8565e58-689e-417e-a072-cc3a26fed9a3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:09:55 crc kubenswrapper[4825]: I0310 08:09:55.098445 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8565e58-689e-417e-a072-cc3a26fed9a3-config-data" (OuterVolumeSpecName: "config-data") pod "d8565e58-689e-417e-a072-cc3a26fed9a3" (UID: "d8565e58-689e-417e-a072-cc3a26fed9a3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:09:55 crc kubenswrapper[4825]: I0310 08:09:55.160481 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnjlf\" (UniqueName: \"kubernetes.io/projected/d8565e58-689e-417e-a072-cc3a26fed9a3-kube-api-access-qnjlf\") on node \"crc\" DevicePath \"\"" Mar 10 08:09:55 crc kubenswrapper[4825]: I0310 08:09:55.160517 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8565e58-689e-417e-a072-cc3a26fed9a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:09:55 crc kubenswrapper[4825]: I0310 08:09:55.160526 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8565e58-689e-417e-a072-cc3a26fed9a3-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 08:09:55 crc kubenswrapper[4825]: I0310 08:09:55.577435 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ksp8z" event={"ID":"d8565e58-689e-417e-a072-cc3a26fed9a3","Type":"ContainerDied","Data":"a79885db7b9858b342c0d1a93afd6ff19008139f0b3a8fc907fb36e32d50a4d7"} Mar 10 08:09:55 crc kubenswrapper[4825]: I0310 08:09:55.577496 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a79885db7b9858b342c0d1a93afd6ff19008139f0b3a8fc907fb36e32d50a4d7" Mar 10 08:09:55 crc kubenswrapper[4825]: I0310 08:09:55.577558 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-ksp8z" Mar 10 08:09:55 crc kubenswrapper[4825]: I0310 08:09:55.811614 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5dfd5d77f-gdfgt"] Mar 10 08:09:55 crc kubenswrapper[4825]: E0310 08:09:55.811978 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8565e58-689e-417e-a072-cc3a26fed9a3" containerName="keystone-db-sync" Mar 10 08:09:55 crc kubenswrapper[4825]: I0310 08:09:55.811993 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8565e58-689e-417e-a072-cc3a26fed9a3" containerName="keystone-db-sync" Mar 10 08:09:55 crc kubenswrapper[4825]: I0310 08:09:55.812173 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8565e58-689e-417e-a072-cc3a26fed9a3" containerName="keystone-db-sync" Mar 10 08:09:55 crc kubenswrapper[4825]: I0310 08:09:55.813623 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dfd5d77f-gdfgt" Mar 10 08:09:55 crc kubenswrapper[4825]: I0310 08:09:55.822215 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dfd5d77f-gdfgt"] Mar 10 08:09:55 crc kubenswrapper[4825]: I0310 08:09:55.873198 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jl9ft\" (UniqueName: \"kubernetes.io/projected/e57cbdf7-2b03-4898-b7bd-2ffddbef90c9-kube-api-access-jl9ft\") pod \"dnsmasq-dns-5dfd5d77f-gdfgt\" (UID: \"e57cbdf7-2b03-4898-b7bd-2ffddbef90c9\") " pod="openstack/dnsmasq-dns-5dfd5d77f-gdfgt" Mar 10 08:09:55 crc kubenswrapper[4825]: I0310 08:09:55.873297 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e57cbdf7-2b03-4898-b7bd-2ffddbef90c9-dns-svc\") pod \"dnsmasq-dns-5dfd5d77f-gdfgt\" (UID: \"e57cbdf7-2b03-4898-b7bd-2ffddbef90c9\") " pod="openstack/dnsmasq-dns-5dfd5d77f-gdfgt" Mar 10 
08:09:55 crc kubenswrapper[4825]: I0310 08:09:55.873364 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e57cbdf7-2b03-4898-b7bd-2ffddbef90c9-config\") pod \"dnsmasq-dns-5dfd5d77f-gdfgt\" (UID: \"e57cbdf7-2b03-4898-b7bd-2ffddbef90c9\") " pod="openstack/dnsmasq-dns-5dfd5d77f-gdfgt" Mar 10 08:09:55 crc kubenswrapper[4825]: I0310 08:09:55.873441 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e57cbdf7-2b03-4898-b7bd-2ffddbef90c9-ovsdbserver-sb\") pod \"dnsmasq-dns-5dfd5d77f-gdfgt\" (UID: \"e57cbdf7-2b03-4898-b7bd-2ffddbef90c9\") " pod="openstack/dnsmasq-dns-5dfd5d77f-gdfgt" Mar 10 08:09:55 crc kubenswrapper[4825]: I0310 08:09:55.873500 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e57cbdf7-2b03-4898-b7bd-2ffddbef90c9-ovsdbserver-nb\") pod \"dnsmasq-dns-5dfd5d77f-gdfgt\" (UID: \"e57cbdf7-2b03-4898-b7bd-2ffddbef90c9\") " pod="openstack/dnsmasq-dns-5dfd5d77f-gdfgt" Mar 10 08:09:55 crc kubenswrapper[4825]: I0310 08:09:55.881524 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-ffch6"] Mar 10 08:09:55 crc kubenswrapper[4825]: I0310 08:09:55.882756 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-ffch6" Mar 10 08:09:55 crc kubenswrapper[4825]: I0310 08:09:55.890377 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 10 08:09:55 crc kubenswrapper[4825]: I0310 08:09:55.890415 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rghcj" Mar 10 08:09:55 crc kubenswrapper[4825]: I0310 08:09:55.890583 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 10 08:09:55 crc kubenswrapper[4825]: I0310 08:09:55.890603 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 10 08:09:55 crc kubenswrapper[4825]: I0310 08:09:55.890377 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 10 08:09:55 crc kubenswrapper[4825]: I0310 08:09:55.891234 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ffch6"] Mar 10 08:09:55 crc kubenswrapper[4825]: I0310 08:09:55.949556 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 10 08:09:55 crc kubenswrapper[4825]: I0310 08:09:55.974966 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba211946-95a9-4331-93a4-e3e3f9ac2e2a-combined-ca-bundle\") pod \"keystone-bootstrap-ffch6\" (UID: \"ba211946-95a9-4331-93a4-e3e3f9ac2e2a\") " pod="openstack/keystone-bootstrap-ffch6" Mar 10 08:09:55 crc kubenswrapper[4825]: I0310 08:09:55.975042 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e57cbdf7-2b03-4898-b7bd-2ffddbef90c9-dns-svc\") pod \"dnsmasq-dns-5dfd5d77f-gdfgt\" (UID: \"e57cbdf7-2b03-4898-b7bd-2ffddbef90c9\") " pod="openstack/dnsmasq-dns-5dfd5d77f-gdfgt" Mar 10 08:09:55 
crc kubenswrapper[4825]: I0310 08:09:55.975116 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp7pm\" (UniqueName: \"kubernetes.io/projected/ba211946-95a9-4331-93a4-e3e3f9ac2e2a-kube-api-access-sp7pm\") pod \"keystone-bootstrap-ffch6\" (UID: \"ba211946-95a9-4331-93a4-e3e3f9ac2e2a\") " pod="openstack/keystone-bootstrap-ffch6" Mar 10 08:09:55 crc kubenswrapper[4825]: I0310 08:09:55.975165 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e57cbdf7-2b03-4898-b7bd-2ffddbef90c9-config\") pod \"dnsmasq-dns-5dfd5d77f-gdfgt\" (UID: \"e57cbdf7-2b03-4898-b7bd-2ffddbef90c9\") " pod="openstack/dnsmasq-dns-5dfd5d77f-gdfgt" Mar 10 08:09:55 crc kubenswrapper[4825]: I0310 08:09:55.975231 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ba211946-95a9-4331-93a4-e3e3f9ac2e2a-fernet-keys\") pod \"keystone-bootstrap-ffch6\" (UID: \"ba211946-95a9-4331-93a4-e3e3f9ac2e2a\") " pod="openstack/keystone-bootstrap-ffch6" Mar 10 08:09:55 crc kubenswrapper[4825]: I0310 08:09:55.975276 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba211946-95a9-4331-93a4-e3e3f9ac2e2a-scripts\") pod \"keystone-bootstrap-ffch6\" (UID: \"ba211946-95a9-4331-93a4-e3e3f9ac2e2a\") " pod="openstack/keystone-bootstrap-ffch6" Mar 10 08:09:55 crc kubenswrapper[4825]: I0310 08:09:55.975312 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e57cbdf7-2b03-4898-b7bd-2ffddbef90c9-ovsdbserver-sb\") pod \"dnsmasq-dns-5dfd5d77f-gdfgt\" (UID: \"e57cbdf7-2b03-4898-b7bd-2ffddbef90c9\") " pod="openstack/dnsmasq-dns-5dfd5d77f-gdfgt" Mar 10 08:09:55 crc kubenswrapper[4825]: I0310 08:09:55.975357 4825 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ba211946-95a9-4331-93a4-e3e3f9ac2e2a-credential-keys\") pod \"keystone-bootstrap-ffch6\" (UID: \"ba211946-95a9-4331-93a4-e3e3f9ac2e2a\") " pod="openstack/keystone-bootstrap-ffch6" Mar 10 08:09:55 crc kubenswrapper[4825]: I0310 08:09:55.975397 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e57cbdf7-2b03-4898-b7bd-2ffddbef90c9-ovsdbserver-nb\") pod \"dnsmasq-dns-5dfd5d77f-gdfgt\" (UID: \"e57cbdf7-2b03-4898-b7bd-2ffddbef90c9\") " pod="openstack/dnsmasq-dns-5dfd5d77f-gdfgt" Mar 10 08:09:55 crc kubenswrapper[4825]: I0310 08:09:55.975428 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba211946-95a9-4331-93a4-e3e3f9ac2e2a-config-data\") pod \"keystone-bootstrap-ffch6\" (UID: \"ba211946-95a9-4331-93a4-e3e3f9ac2e2a\") " pod="openstack/keystone-bootstrap-ffch6" Mar 10 08:09:55 crc kubenswrapper[4825]: I0310 08:09:55.975466 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jl9ft\" (UniqueName: \"kubernetes.io/projected/e57cbdf7-2b03-4898-b7bd-2ffddbef90c9-kube-api-access-jl9ft\") pod \"dnsmasq-dns-5dfd5d77f-gdfgt\" (UID: \"e57cbdf7-2b03-4898-b7bd-2ffddbef90c9\") " pod="openstack/dnsmasq-dns-5dfd5d77f-gdfgt" Mar 10 08:09:55 crc kubenswrapper[4825]: I0310 08:09:55.976528 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e57cbdf7-2b03-4898-b7bd-2ffddbef90c9-dns-svc\") pod \"dnsmasq-dns-5dfd5d77f-gdfgt\" (UID: \"e57cbdf7-2b03-4898-b7bd-2ffddbef90c9\") " pod="openstack/dnsmasq-dns-5dfd5d77f-gdfgt" Mar 10 08:09:55 crc kubenswrapper[4825]: I0310 08:09:55.976724 4825 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e57cbdf7-2b03-4898-b7bd-2ffddbef90c9-ovsdbserver-sb\") pod \"dnsmasq-dns-5dfd5d77f-gdfgt\" (UID: \"e57cbdf7-2b03-4898-b7bd-2ffddbef90c9\") " pod="openstack/dnsmasq-dns-5dfd5d77f-gdfgt" Mar 10 08:09:55 crc kubenswrapper[4825]: I0310 08:09:55.977530 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e57cbdf7-2b03-4898-b7bd-2ffddbef90c9-ovsdbserver-nb\") pod \"dnsmasq-dns-5dfd5d77f-gdfgt\" (UID: \"e57cbdf7-2b03-4898-b7bd-2ffddbef90c9\") " pod="openstack/dnsmasq-dns-5dfd5d77f-gdfgt" Mar 10 08:09:55 crc kubenswrapper[4825]: I0310 08:09:55.979410 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e57cbdf7-2b03-4898-b7bd-2ffddbef90c9-config\") pod \"dnsmasq-dns-5dfd5d77f-gdfgt\" (UID: \"e57cbdf7-2b03-4898-b7bd-2ffddbef90c9\") " pod="openstack/dnsmasq-dns-5dfd5d77f-gdfgt" Mar 10 08:09:56 crc kubenswrapper[4825]: I0310 08:09:56.004341 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jl9ft\" (UniqueName: \"kubernetes.io/projected/e57cbdf7-2b03-4898-b7bd-2ffddbef90c9-kube-api-access-jl9ft\") pod \"dnsmasq-dns-5dfd5d77f-gdfgt\" (UID: \"e57cbdf7-2b03-4898-b7bd-2ffddbef90c9\") " pod="openstack/dnsmasq-dns-5dfd5d77f-gdfgt" Mar 10 08:09:56 crc kubenswrapper[4825]: I0310 08:09:56.077495 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ba211946-95a9-4331-93a4-e3e3f9ac2e2a-credential-keys\") pod \"keystone-bootstrap-ffch6\" (UID: \"ba211946-95a9-4331-93a4-e3e3f9ac2e2a\") " pod="openstack/keystone-bootstrap-ffch6" Mar 10 08:09:56 crc kubenswrapper[4825]: I0310 08:09:56.077558 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ba211946-95a9-4331-93a4-e3e3f9ac2e2a-config-data\") pod \"keystone-bootstrap-ffch6\" (UID: \"ba211946-95a9-4331-93a4-e3e3f9ac2e2a\") " pod="openstack/keystone-bootstrap-ffch6" Mar 10 08:09:56 crc kubenswrapper[4825]: I0310 08:09:56.077594 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba211946-95a9-4331-93a4-e3e3f9ac2e2a-combined-ca-bundle\") pod \"keystone-bootstrap-ffch6\" (UID: \"ba211946-95a9-4331-93a4-e3e3f9ac2e2a\") " pod="openstack/keystone-bootstrap-ffch6" Mar 10 08:09:56 crc kubenswrapper[4825]: I0310 08:09:56.077643 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp7pm\" (UniqueName: \"kubernetes.io/projected/ba211946-95a9-4331-93a4-e3e3f9ac2e2a-kube-api-access-sp7pm\") pod \"keystone-bootstrap-ffch6\" (UID: \"ba211946-95a9-4331-93a4-e3e3f9ac2e2a\") " pod="openstack/keystone-bootstrap-ffch6" Mar 10 08:09:56 crc kubenswrapper[4825]: I0310 08:09:56.077683 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ba211946-95a9-4331-93a4-e3e3f9ac2e2a-fernet-keys\") pod \"keystone-bootstrap-ffch6\" (UID: \"ba211946-95a9-4331-93a4-e3e3f9ac2e2a\") " pod="openstack/keystone-bootstrap-ffch6" Mar 10 08:09:56 crc kubenswrapper[4825]: I0310 08:09:56.077704 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba211946-95a9-4331-93a4-e3e3f9ac2e2a-scripts\") pod \"keystone-bootstrap-ffch6\" (UID: \"ba211946-95a9-4331-93a4-e3e3f9ac2e2a\") " pod="openstack/keystone-bootstrap-ffch6" Mar 10 08:09:56 crc kubenswrapper[4825]: I0310 08:09:56.082247 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba211946-95a9-4331-93a4-e3e3f9ac2e2a-config-data\") pod \"keystone-bootstrap-ffch6\" (UID: 
\"ba211946-95a9-4331-93a4-e3e3f9ac2e2a\") " pod="openstack/keystone-bootstrap-ffch6" Mar 10 08:09:56 crc kubenswrapper[4825]: I0310 08:09:56.082724 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba211946-95a9-4331-93a4-e3e3f9ac2e2a-scripts\") pod \"keystone-bootstrap-ffch6\" (UID: \"ba211946-95a9-4331-93a4-e3e3f9ac2e2a\") " pod="openstack/keystone-bootstrap-ffch6" Mar 10 08:09:56 crc kubenswrapper[4825]: I0310 08:09:56.082824 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ba211946-95a9-4331-93a4-e3e3f9ac2e2a-fernet-keys\") pod \"keystone-bootstrap-ffch6\" (UID: \"ba211946-95a9-4331-93a4-e3e3f9ac2e2a\") " pod="openstack/keystone-bootstrap-ffch6" Mar 10 08:09:56 crc kubenswrapper[4825]: I0310 08:09:56.083653 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ba211946-95a9-4331-93a4-e3e3f9ac2e2a-credential-keys\") pod \"keystone-bootstrap-ffch6\" (UID: \"ba211946-95a9-4331-93a4-e3e3f9ac2e2a\") " pod="openstack/keystone-bootstrap-ffch6" Mar 10 08:09:56 crc kubenswrapper[4825]: I0310 08:09:56.089273 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba211946-95a9-4331-93a4-e3e3f9ac2e2a-combined-ca-bundle\") pod \"keystone-bootstrap-ffch6\" (UID: \"ba211946-95a9-4331-93a4-e3e3f9ac2e2a\") " pod="openstack/keystone-bootstrap-ffch6" Mar 10 08:09:56 crc kubenswrapper[4825]: I0310 08:09:56.094309 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp7pm\" (UniqueName: \"kubernetes.io/projected/ba211946-95a9-4331-93a4-e3e3f9ac2e2a-kube-api-access-sp7pm\") pod \"keystone-bootstrap-ffch6\" (UID: \"ba211946-95a9-4331-93a4-e3e3f9ac2e2a\") " pod="openstack/keystone-bootstrap-ffch6" Mar 10 08:09:56 crc kubenswrapper[4825]: I0310 
08:09:56.141779 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dfd5d77f-gdfgt" Mar 10 08:09:56 crc kubenswrapper[4825]: I0310 08:09:56.223744 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ffch6" Mar 10 08:09:56 crc kubenswrapper[4825]: W0310 08:09:56.546604 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba211946_95a9_4331_93a4_e3e3f9ac2e2a.slice/crio-a84f644c5a9cd999d0130d97c76ca4145894a9c07dfb1a80449efc7afd952177 WatchSource:0}: Error finding container a84f644c5a9cd999d0130d97c76ca4145894a9c07dfb1a80449efc7afd952177: Status 404 returned error can't find the container with id a84f644c5a9cd999d0130d97c76ca4145894a9c07dfb1a80449efc7afd952177 Mar 10 08:09:56 crc kubenswrapper[4825]: I0310 08:09:56.547999 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ffch6"] Mar 10 08:09:56 crc kubenswrapper[4825]: I0310 08:09:56.593683 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ffch6" event={"ID":"ba211946-95a9-4331-93a4-e3e3f9ac2e2a","Type":"ContainerStarted","Data":"a84f644c5a9cd999d0130d97c76ca4145894a9c07dfb1a80449efc7afd952177"} Mar 10 08:09:56 crc kubenswrapper[4825]: I0310 08:09:56.603784 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dfd5d77f-gdfgt"] Mar 10 08:09:56 crc kubenswrapper[4825]: W0310 08:09:56.603877 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode57cbdf7_2b03_4898_b7bd_2ffddbef90c9.slice/crio-0bddb9aefff3b48461f265a5c41b3a693f6f377498294d6003627246b6c479b1 WatchSource:0}: Error finding container 0bddb9aefff3b48461f265a5c41b3a693f6f377498294d6003627246b6c479b1: Status 404 returned error can't find the container with id 
0bddb9aefff3b48461f265a5c41b3a693f6f377498294d6003627246b6c479b1 Mar 10 08:09:57 crc kubenswrapper[4825]: I0310 08:09:57.604652 4825 generic.go:334] "Generic (PLEG): container finished" podID="e57cbdf7-2b03-4898-b7bd-2ffddbef90c9" containerID="3943a89c108fe7c1692b6c3449a15c3b6a3360d7f26dd8f9e42cf60127331675" exitCode=0 Mar 10 08:09:57 crc kubenswrapper[4825]: I0310 08:09:57.604753 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dfd5d77f-gdfgt" event={"ID":"e57cbdf7-2b03-4898-b7bd-2ffddbef90c9","Type":"ContainerDied","Data":"3943a89c108fe7c1692b6c3449a15c3b6a3360d7f26dd8f9e42cf60127331675"} Mar 10 08:09:57 crc kubenswrapper[4825]: I0310 08:09:57.605110 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dfd5d77f-gdfgt" event={"ID":"e57cbdf7-2b03-4898-b7bd-2ffddbef90c9","Type":"ContainerStarted","Data":"0bddb9aefff3b48461f265a5c41b3a693f6f377498294d6003627246b6c479b1"} Mar 10 08:09:57 crc kubenswrapper[4825]: I0310 08:09:57.610168 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ffch6" event={"ID":"ba211946-95a9-4331-93a4-e3e3f9ac2e2a","Type":"ContainerStarted","Data":"1e270fb12bcae28fa39d843b0de9753a845039c08cd041f524cf77f266f8b086"} Mar 10 08:09:57 crc kubenswrapper[4825]: I0310 08:09:57.671883 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-ffch6" podStartSLOduration=2.6718620619999998 podStartE2EDuration="2.671862062s" podCreationTimestamp="2026-03-10 08:09:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:09:57.663952524 +0000 UTC m=+5150.693733139" watchObservedRunningTime="2026-03-10 08:09:57.671862062 +0000 UTC m=+5150.701642697" Mar 10 08:09:58 crc kubenswrapper[4825]: I0310 08:09:58.622655 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dfd5d77f-gdfgt" 
event={"ID":"e57cbdf7-2b03-4898-b7bd-2ffddbef90c9","Type":"ContainerStarted","Data":"ccd934705cd89f8776e1f8903c14de8d5ca47047552ef100f24e6bbbdce4f4c2"} Mar 10 08:09:58 crc kubenswrapper[4825]: I0310 08:09:58.622726 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5dfd5d77f-gdfgt" Mar 10 08:09:58 crc kubenswrapper[4825]: I0310 08:09:58.648595 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5dfd5d77f-gdfgt" podStartSLOduration=3.6485758390000003 podStartE2EDuration="3.648575839s" podCreationTimestamp="2026-03-10 08:09:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:09:58.644565123 +0000 UTC m=+5151.674345758" watchObservedRunningTime="2026-03-10 08:09:58.648575839 +0000 UTC m=+5151.678356464" Mar 10 08:10:00 crc kubenswrapper[4825]: I0310 08:10:00.140288 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552170-d7mn2"] Mar 10 08:10:00 crc kubenswrapper[4825]: I0310 08:10:00.149112 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552170-d7mn2" Mar 10 08:10:00 crc kubenswrapper[4825]: I0310 08:10:00.152787 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 08:10:00 crc kubenswrapper[4825]: I0310 08:10:00.156935 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 08:10:00 crc kubenswrapper[4825]: I0310 08:10:00.157199 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 08:10:00 crc kubenswrapper[4825]: I0310 08:10:00.162316 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552170-d7mn2"] Mar 10 08:10:00 crc kubenswrapper[4825]: I0310 08:10:00.257517 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nkl5\" (UniqueName: \"kubernetes.io/projected/fb5138fc-0a3c-4581-8319-0154b854ccce-kube-api-access-5nkl5\") pod \"auto-csr-approver-29552170-d7mn2\" (UID: \"fb5138fc-0a3c-4581-8319-0154b854ccce\") " pod="openshift-infra/auto-csr-approver-29552170-d7mn2" Mar 10 08:10:00 crc kubenswrapper[4825]: I0310 08:10:00.360413 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nkl5\" (UniqueName: \"kubernetes.io/projected/fb5138fc-0a3c-4581-8319-0154b854ccce-kube-api-access-5nkl5\") pod \"auto-csr-approver-29552170-d7mn2\" (UID: \"fb5138fc-0a3c-4581-8319-0154b854ccce\") " pod="openshift-infra/auto-csr-approver-29552170-d7mn2" Mar 10 08:10:00 crc kubenswrapper[4825]: I0310 08:10:00.378337 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nkl5\" (UniqueName: \"kubernetes.io/projected/fb5138fc-0a3c-4581-8319-0154b854ccce-kube-api-access-5nkl5\") pod \"auto-csr-approver-29552170-d7mn2\" (UID: \"fb5138fc-0a3c-4581-8319-0154b854ccce\") " 
pod="openshift-infra/auto-csr-approver-29552170-d7mn2" Mar 10 08:10:00 crc kubenswrapper[4825]: I0310 08:10:00.478650 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552170-d7mn2" Mar 10 08:10:00 crc kubenswrapper[4825]: I0310 08:10:00.640608 4825 generic.go:334] "Generic (PLEG): container finished" podID="ba211946-95a9-4331-93a4-e3e3f9ac2e2a" containerID="1e270fb12bcae28fa39d843b0de9753a845039c08cd041f524cf77f266f8b086" exitCode=0 Mar 10 08:10:00 crc kubenswrapper[4825]: I0310 08:10:00.642554 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ffch6" event={"ID":"ba211946-95a9-4331-93a4-e3e3f9ac2e2a","Type":"ContainerDied","Data":"1e270fb12bcae28fa39d843b0de9753a845039c08cd041f524cf77f266f8b086"} Mar 10 08:10:00 crc kubenswrapper[4825]: I0310 08:10:00.921535 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552170-d7mn2"] Mar 10 08:10:00 crc kubenswrapper[4825]: W0310 08:10:00.929292 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb5138fc_0a3c_4581_8319_0154b854ccce.slice/crio-6fa24c5e23716bb681812dbe49889c4fc1b3574327f480524e26dbb580d7402f WatchSource:0}: Error finding container 6fa24c5e23716bb681812dbe49889c4fc1b3574327f480524e26dbb580d7402f: Status 404 returned error can't find the container with id 6fa24c5e23716bb681812dbe49889c4fc1b3574327f480524e26dbb580d7402f Mar 10 08:10:01 crc kubenswrapper[4825]: I0310 08:10:01.652937 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552170-d7mn2" event={"ID":"fb5138fc-0a3c-4581-8319-0154b854ccce","Type":"ContainerStarted","Data":"6fa24c5e23716bb681812dbe49889c4fc1b3574327f480524e26dbb580d7402f"} Mar 10 08:10:01 crc kubenswrapper[4825]: I0310 08:10:01.972289 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-ffch6" Mar 10 08:10:02 crc kubenswrapper[4825]: I0310 08:10:02.088495 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba211946-95a9-4331-93a4-e3e3f9ac2e2a-config-data\") pod \"ba211946-95a9-4331-93a4-e3e3f9ac2e2a\" (UID: \"ba211946-95a9-4331-93a4-e3e3f9ac2e2a\") " Mar 10 08:10:02 crc kubenswrapper[4825]: I0310 08:10:02.088611 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ba211946-95a9-4331-93a4-e3e3f9ac2e2a-fernet-keys\") pod \"ba211946-95a9-4331-93a4-e3e3f9ac2e2a\" (UID: \"ba211946-95a9-4331-93a4-e3e3f9ac2e2a\") " Mar 10 08:10:02 crc kubenswrapper[4825]: I0310 08:10:02.088643 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba211946-95a9-4331-93a4-e3e3f9ac2e2a-scripts\") pod \"ba211946-95a9-4331-93a4-e3e3f9ac2e2a\" (UID: \"ba211946-95a9-4331-93a4-e3e3f9ac2e2a\") " Mar 10 08:10:02 crc kubenswrapper[4825]: I0310 08:10:02.088663 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ba211946-95a9-4331-93a4-e3e3f9ac2e2a-credential-keys\") pod \"ba211946-95a9-4331-93a4-e3e3f9ac2e2a\" (UID: \"ba211946-95a9-4331-93a4-e3e3f9ac2e2a\") " Mar 10 08:10:02 crc kubenswrapper[4825]: I0310 08:10:02.088698 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sp7pm\" (UniqueName: \"kubernetes.io/projected/ba211946-95a9-4331-93a4-e3e3f9ac2e2a-kube-api-access-sp7pm\") pod \"ba211946-95a9-4331-93a4-e3e3f9ac2e2a\" (UID: \"ba211946-95a9-4331-93a4-e3e3f9ac2e2a\") " Mar 10 08:10:02 crc kubenswrapper[4825]: I0310 08:10:02.088715 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ba211946-95a9-4331-93a4-e3e3f9ac2e2a-combined-ca-bundle\") pod \"ba211946-95a9-4331-93a4-e3e3f9ac2e2a\" (UID: \"ba211946-95a9-4331-93a4-e3e3f9ac2e2a\") " Mar 10 08:10:02 crc kubenswrapper[4825]: I0310 08:10:02.096113 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba211946-95a9-4331-93a4-e3e3f9ac2e2a-scripts" (OuterVolumeSpecName: "scripts") pod "ba211946-95a9-4331-93a4-e3e3f9ac2e2a" (UID: "ba211946-95a9-4331-93a4-e3e3f9ac2e2a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:10:02 crc kubenswrapper[4825]: I0310 08:10:02.098579 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba211946-95a9-4331-93a4-e3e3f9ac2e2a-kube-api-access-sp7pm" (OuterVolumeSpecName: "kube-api-access-sp7pm") pod "ba211946-95a9-4331-93a4-e3e3f9ac2e2a" (UID: "ba211946-95a9-4331-93a4-e3e3f9ac2e2a"). InnerVolumeSpecName "kube-api-access-sp7pm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:10:02 crc kubenswrapper[4825]: I0310 08:10:02.100590 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba211946-95a9-4331-93a4-e3e3f9ac2e2a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "ba211946-95a9-4331-93a4-e3e3f9ac2e2a" (UID: "ba211946-95a9-4331-93a4-e3e3f9ac2e2a"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:10:02 crc kubenswrapper[4825]: I0310 08:10:02.100951 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba211946-95a9-4331-93a4-e3e3f9ac2e2a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ba211946-95a9-4331-93a4-e3e3f9ac2e2a" (UID: "ba211946-95a9-4331-93a4-e3e3f9ac2e2a"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:10:02 crc kubenswrapper[4825]: I0310 08:10:02.124331 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba211946-95a9-4331-93a4-e3e3f9ac2e2a-config-data" (OuterVolumeSpecName: "config-data") pod "ba211946-95a9-4331-93a4-e3e3f9ac2e2a" (UID: "ba211946-95a9-4331-93a4-e3e3f9ac2e2a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:10:02 crc kubenswrapper[4825]: I0310 08:10:02.124843 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba211946-95a9-4331-93a4-e3e3f9ac2e2a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba211946-95a9-4331-93a4-e3e3f9ac2e2a" (UID: "ba211946-95a9-4331-93a4-e3e3f9ac2e2a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:10:02 crc kubenswrapper[4825]: I0310 08:10:02.190288 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba211946-95a9-4331-93a4-e3e3f9ac2e2a-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 08:10:02 crc kubenswrapper[4825]: I0310 08:10:02.190316 4825 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ba211946-95a9-4331-93a4-e3e3f9ac2e2a-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 10 08:10:02 crc kubenswrapper[4825]: I0310 08:10:02.190325 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba211946-95a9-4331-93a4-e3e3f9ac2e2a-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 08:10:02 crc kubenswrapper[4825]: I0310 08:10:02.190333 4825 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ba211946-95a9-4331-93a4-e3e3f9ac2e2a-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 10 08:10:02 crc 
kubenswrapper[4825]: I0310 08:10:02.190345 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sp7pm\" (UniqueName: \"kubernetes.io/projected/ba211946-95a9-4331-93a4-e3e3f9ac2e2a-kube-api-access-sp7pm\") on node \"crc\" DevicePath \"\"" Mar 10 08:10:02 crc kubenswrapper[4825]: I0310 08:10:02.190353 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba211946-95a9-4331-93a4-e3e3f9ac2e2a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:10:02 crc kubenswrapper[4825]: I0310 08:10:02.663411 4825 generic.go:334] "Generic (PLEG): container finished" podID="fb5138fc-0a3c-4581-8319-0154b854ccce" containerID="8488bd91af709b889b7fc2c8e6562a78138b630e47186398a9d97e4c0e154db4" exitCode=0 Mar 10 08:10:02 crc kubenswrapper[4825]: I0310 08:10:02.663534 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552170-d7mn2" event={"ID":"fb5138fc-0a3c-4581-8319-0154b854ccce","Type":"ContainerDied","Data":"8488bd91af709b889b7fc2c8e6562a78138b630e47186398a9d97e4c0e154db4"} Mar 10 08:10:02 crc kubenswrapper[4825]: I0310 08:10:02.665679 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ffch6" event={"ID":"ba211946-95a9-4331-93a4-e3e3f9ac2e2a","Type":"ContainerDied","Data":"a84f644c5a9cd999d0130d97c76ca4145894a9c07dfb1a80449efc7afd952177"} Mar 10 08:10:02 crc kubenswrapper[4825]: I0310 08:10:02.665715 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a84f644c5a9cd999d0130d97c76ca4145894a9c07dfb1a80449efc7afd952177" Mar 10 08:10:02 crc kubenswrapper[4825]: I0310 08:10:02.665760 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-ffch6" Mar 10 08:10:02 crc kubenswrapper[4825]: I0310 08:10:02.729902 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-ffch6"] Mar 10 08:10:02 crc kubenswrapper[4825]: I0310 08:10:02.735876 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-ffch6"] Mar 10 08:10:02 crc kubenswrapper[4825]: I0310 08:10:02.816257 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-d4mm8"] Mar 10 08:10:02 crc kubenswrapper[4825]: E0310 08:10:02.816648 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba211946-95a9-4331-93a4-e3e3f9ac2e2a" containerName="keystone-bootstrap" Mar 10 08:10:02 crc kubenswrapper[4825]: I0310 08:10:02.816670 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba211946-95a9-4331-93a4-e3e3f9ac2e2a" containerName="keystone-bootstrap" Mar 10 08:10:02 crc kubenswrapper[4825]: I0310 08:10:02.816877 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba211946-95a9-4331-93a4-e3e3f9ac2e2a" containerName="keystone-bootstrap" Mar 10 08:10:02 crc kubenswrapper[4825]: I0310 08:10:02.817538 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-d4mm8" Mar 10 08:10:02 crc kubenswrapper[4825]: I0310 08:10:02.820489 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 10 08:10:02 crc kubenswrapper[4825]: I0310 08:10:02.820742 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 10 08:10:02 crc kubenswrapper[4825]: I0310 08:10:02.820797 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 10 08:10:02 crc kubenswrapper[4825]: I0310 08:10:02.821435 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rghcj" Mar 10 08:10:02 crc kubenswrapper[4825]: I0310 08:10:02.822117 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 10 08:10:02 crc kubenswrapper[4825]: I0310 08:10:02.833654 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-d4mm8"] Mar 10 08:10:03 crc kubenswrapper[4825]: I0310 08:10:03.004210 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c94d5e99-c6af-43b2-9000-bc78fb053d79-fernet-keys\") pod \"keystone-bootstrap-d4mm8\" (UID: \"c94d5e99-c6af-43b2-9000-bc78fb053d79\") " pod="openstack/keystone-bootstrap-d4mm8" Mar 10 08:10:03 crc kubenswrapper[4825]: I0310 08:10:03.004331 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wln9\" (UniqueName: \"kubernetes.io/projected/c94d5e99-c6af-43b2-9000-bc78fb053d79-kube-api-access-2wln9\") pod \"keystone-bootstrap-d4mm8\" (UID: \"c94d5e99-c6af-43b2-9000-bc78fb053d79\") " pod="openstack/keystone-bootstrap-d4mm8" Mar 10 08:10:03 crc kubenswrapper[4825]: I0310 08:10:03.004380 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c94d5e99-c6af-43b2-9000-bc78fb053d79-combined-ca-bundle\") pod \"keystone-bootstrap-d4mm8\" (UID: \"c94d5e99-c6af-43b2-9000-bc78fb053d79\") " pod="openstack/keystone-bootstrap-d4mm8" Mar 10 08:10:03 crc kubenswrapper[4825]: I0310 08:10:03.004541 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c94d5e99-c6af-43b2-9000-bc78fb053d79-scripts\") pod \"keystone-bootstrap-d4mm8\" (UID: \"c94d5e99-c6af-43b2-9000-bc78fb053d79\") " pod="openstack/keystone-bootstrap-d4mm8" Mar 10 08:10:03 crc kubenswrapper[4825]: I0310 08:10:03.004695 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c94d5e99-c6af-43b2-9000-bc78fb053d79-config-data\") pod \"keystone-bootstrap-d4mm8\" (UID: \"c94d5e99-c6af-43b2-9000-bc78fb053d79\") " pod="openstack/keystone-bootstrap-d4mm8" Mar 10 08:10:03 crc kubenswrapper[4825]: I0310 08:10:03.004811 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c94d5e99-c6af-43b2-9000-bc78fb053d79-credential-keys\") pod \"keystone-bootstrap-d4mm8\" (UID: \"c94d5e99-c6af-43b2-9000-bc78fb053d79\") " pod="openstack/keystone-bootstrap-d4mm8" Mar 10 08:10:03 crc kubenswrapper[4825]: I0310 08:10:03.105578 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c94d5e99-c6af-43b2-9000-bc78fb053d79-fernet-keys\") pod \"keystone-bootstrap-d4mm8\" (UID: \"c94d5e99-c6af-43b2-9000-bc78fb053d79\") " pod="openstack/keystone-bootstrap-d4mm8" Mar 10 08:10:03 crc kubenswrapper[4825]: I0310 08:10:03.105651 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wln9\" (UniqueName: 
\"kubernetes.io/projected/c94d5e99-c6af-43b2-9000-bc78fb053d79-kube-api-access-2wln9\") pod \"keystone-bootstrap-d4mm8\" (UID: \"c94d5e99-c6af-43b2-9000-bc78fb053d79\") " pod="openstack/keystone-bootstrap-d4mm8" Mar 10 08:10:03 crc kubenswrapper[4825]: I0310 08:10:03.105682 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c94d5e99-c6af-43b2-9000-bc78fb053d79-combined-ca-bundle\") pod \"keystone-bootstrap-d4mm8\" (UID: \"c94d5e99-c6af-43b2-9000-bc78fb053d79\") " pod="openstack/keystone-bootstrap-d4mm8" Mar 10 08:10:03 crc kubenswrapper[4825]: I0310 08:10:03.105712 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c94d5e99-c6af-43b2-9000-bc78fb053d79-scripts\") pod \"keystone-bootstrap-d4mm8\" (UID: \"c94d5e99-c6af-43b2-9000-bc78fb053d79\") " pod="openstack/keystone-bootstrap-d4mm8" Mar 10 08:10:03 crc kubenswrapper[4825]: I0310 08:10:03.105761 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c94d5e99-c6af-43b2-9000-bc78fb053d79-config-data\") pod \"keystone-bootstrap-d4mm8\" (UID: \"c94d5e99-c6af-43b2-9000-bc78fb053d79\") " pod="openstack/keystone-bootstrap-d4mm8" Mar 10 08:10:03 crc kubenswrapper[4825]: I0310 08:10:03.105814 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c94d5e99-c6af-43b2-9000-bc78fb053d79-credential-keys\") pod \"keystone-bootstrap-d4mm8\" (UID: \"c94d5e99-c6af-43b2-9000-bc78fb053d79\") " pod="openstack/keystone-bootstrap-d4mm8" Mar 10 08:10:03 crc kubenswrapper[4825]: I0310 08:10:03.111178 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c94d5e99-c6af-43b2-9000-bc78fb053d79-credential-keys\") pod \"keystone-bootstrap-d4mm8\" 
(UID: \"c94d5e99-c6af-43b2-9000-bc78fb053d79\") " pod="openstack/keystone-bootstrap-d4mm8" Mar 10 08:10:03 crc kubenswrapper[4825]: I0310 08:10:03.111258 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c94d5e99-c6af-43b2-9000-bc78fb053d79-fernet-keys\") pod \"keystone-bootstrap-d4mm8\" (UID: \"c94d5e99-c6af-43b2-9000-bc78fb053d79\") " pod="openstack/keystone-bootstrap-d4mm8" Mar 10 08:10:03 crc kubenswrapper[4825]: I0310 08:10:03.111366 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c94d5e99-c6af-43b2-9000-bc78fb053d79-config-data\") pod \"keystone-bootstrap-d4mm8\" (UID: \"c94d5e99-c6af-43b2-9000-bc78fb053d79\") " pod="openstack/keystone-bootstrap-d4mm8" Mar 10 08:10:03 crc kubenswrapper[4825]: I0310 08:10:03.112484 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c94d5e99-c6af-43b2-9000-bc78fb053d79-scripts\") pod \"keystone-bootstrap-d4mm8\" (UID: \"c94d5e99-c6af-43b2-9000-bc78fb053d79\") " pod="openstack/keystone-bootstrap-d4mm8" Mar 10 08:10:03 crc kubenswrapper[4825]: I0310 08:10:03.114251 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c94d5e99-c6af-43b2-9000-bc78fb053d79-combined-ca-bundle\") pod \"keystone-bootstrap-d4mm8\" (UID: \"c94d5e99-c6af-43b2-9000-bc78fb053d79\") " pod="openstack/keystone-bootstrap-d4mm8" Mar 10 08:10:03 crc kubenswrapper[4825]: I0310 08:10:03.121909 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wln9\" (UniqueName: \"kubernetes.io/projected/c94d5e99-c6af-43b2-9000-bc78fb053d79-kube-api-access-2wln9\") pod \"keystone-bootstrap-d4mm8\" (UID: \"c94d5e99-c6af-43b2-9000-bc78fb053d79\") " pod="openstack/keystone-bootstrap-d4mm8" Mar 10 08:10:03 crc kubenswrapper[4825]: I0310 
08:10:03.135898 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-d4mm8" Mar 10 08:10:03 crc kubenswrapper[4825]: I0310 08:10:03.250365 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba211946-95a9-4331-93a4-e3e3f9ac2e2a" path="/var/lib/kubelet/pods/ba211946-95a9-4331-93a4-e3e3f9ac2e2a/volumes" Mar 10 08:10:03 crc kubenswrapper[4825]: I0310 08:10:03.611669 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-d4mm8"] Mar 10 08:10:03 crc kubenswrapper[4825]: I0310 08:10:03.676602 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-d4mm8" event={"ID":"c94d5e99-c6af-43b2-9000-bc78fb053d79","Type":"ContainerStarted","Data":"ecd6ade617b5acb45e09eda34eaea38cef3365aeef73ed959695e4645cd07ecc"} Mar 10 08:10:03 crc kubenswrapper[4825]: I0310 08:10:03.924312 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552170-d7mn2" Mar 10 08:10:04 crc kubenswrapper[4825]: I0310 08:10:04.123324 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nkl5\" (UniqueName: \"kubernetes.io/projected/fb5138fc-0a3c-4581-8319-0154b854ccce-kube-api-access-5nkl5\") pod \"fb5138fc-0a3c-4581-8319-0154b854ccce\" (UID: \"fb5138fc-0a3c-4581-8319-0154b854ccce\") " Mar 10 08:10:04 crc kubenswrapper[4825]: I0310 08:10:04.133146 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb5138fc-0a3c-4581-8319-0154b854ccce-kube-api-access-5nkl5" (OuterVolumeSpecName: "kube-api-access-5nkl5") pod "fb5138fc-0a3c-4581-8319-0154b854ccce" (UID: "fb5138fc-0a3c-4581-8319-0154b854ccce"). InnerVolumeSpecName "kube-api-access-5nkl5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:10:04 crc kubenswrapper[4825]: I0310 08:10:04.224971 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nkl5\" (UniqueName: \"kubernetes.io/projected/fb5138fc-0a3c-4581-8319-0154b854ccce-kube-api-access-5nkl5\") on node \"crc\" DevicePath \"\"" Mar 10 08:10:04 crc kubenswrapper[4825]: I0310 08:10:04.688679 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-d4mm8" event={"ID":"c94d5e99-c6af-43b2-9000-bc78fb053d79","Type":"ContainerStarted","Data":"9bc38ee5df0de7e7cb535b07fa19f7882db88fdfd7fd3c142ef84476287cf80d"} Mar 10 08:10:04 crc kubenswrapper[4825]: I0310 08:10:04.691651 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552170-d7mn2" event={"ID":"fb5138fc-0a3c-4581-8319-0154b854ccce","Type":"ContainerDied","Data":"6fa24c5e23716bb681812dbe49889c4fc1b3574327f480524e26dbb580d7402f"} Mar 10 08:10:04 crc kubenswrapper[4825]: I0310 08:10:04.691712 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fa24c5e23716bb681812dbe49889c4fc1b3574327f480524e26dbb580d7402f" Mar 10 08:10:04 crc kubenswrapper[4825]: I0310 08:10:04.691789 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552170-d7mn2" Mar 10 08:10:04 crc kubenswrapper[4825]: I0310 08:10:04.717440 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-d4mm8" podStartSLOduration=2.717412136 podStartE2EDuration="2.717412136s" podCreationTimestamp="2026-03-10 08:10:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:10:04.715171927 +0000 UTC m=+5157.744952562" watchObservedRunningTime="2026-03-10 08:10:04.717412136 +0000 UTC m=+5157.747192791" Mar 10 08:10:04 crc kubenswrapper[4825]: I0310 08:10:04.987445 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552164-vv99d"] Mar 10 08:10:04 crc kubenswrapper[4825]: I0310 08:10:04.993749 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552164-vv99d"] Mar 10 08:10:05 crc kubenswrapper[4825]: I0310 08:10:05.246017 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13750d54-cc8a-459d-b9b0-ff2a1c678421" path="/var/lib/kubelet/pods/13750d54-cc8a-459d-b9b0-ff2a1c678421/volumes" Mar 10 08:10:06 crc kubenswrapper[4825]: I0310 08:10:06.144274 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5dfd5d77f-gdfgt" Mar 10 08:10:06 crc kubenswrapper[4825]: I0310 08:10:06.215408 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79c5dff769-v95dp"] Mar 10 08:10:06 crc kubenswrapper[4825]: I0310 08:10:06.215694 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79c5dff769-v95dp" podUID="2ef475cc-7b9d-475f-bac7-c01ac3d568cb" containerName="dnsmasq-dns" containerID="cri-o://cc19aabad6ff8c1e49bcc4a7b3dd2ae4ae4af696c3cd59f2027c2c0813dd3b57" gracePeriod=10 Mar 10 08:10:06 crc kubenswrapper[4825]: I0310 
08:10:06.695840 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79c5dff769-v95dp" Mar 10 08:10:06 crc kubenswrapper[4825]: I0310 08:10:06.709957 4825 generic.go:334] "Generic (PLEG): container finished" podID="c94d5e99-c6af-43b2-9000-bc78fb053d79" containerID="9bc38ee5df0de7e7cb535b07fa19f7882db88fdfd7fd3c142ef84476287cf80d" exitCode=0 Mar 10 08:10:06 crc kubenswrapper[4825]: I0310 08:10:06.710054 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-d4mm8" event={"ID":"c94d5e99-c6af-43b2-9000-bc78fb053d79","Type":"ContainerDied","Data":"9bc38ee5df0de7e7cb535b07fa19f7882db88fdfd7fd3c142ef84476287cf80d"} Mar 10 08:10:06 crc kubenswrapper[4825]: I0310 08:10:06.712090 4825 generic.go:334] "Generic (PLEG): container finished" podID="2ef475cc-7b9d-475f-bac7-c01ac3d568cb" containerID="cc19aabad6ff8c1e49bcc4a7b3dd2ae4ae4af696c3cd59f2027c2c0813dd3b57" exitCode=0 Mar 10 08:10:06 crc kubenswrapper[4825]: I0310 08:10:06.712124 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79c5dff769-v95dp" event={"ID":"2ef475cc-7b9d-475f-bac7-c01ac3d568cb","Type":"ContainerDied","Data":"cc19aabad6ff8c1e49bcc4a7b3dd2ae4ae4af696c3cd59f2027c2c0813dd3b57"} Mar 10 08:10:06 crc kubenswrapper[4825]: I0310 08:10:06.712160 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79c5dff769-v95dp" event={"ID":"2ef475cc-7b9d-475f-bac7-c01ac3d568cb","Type":"ContainerDied","Data":"ee5a9cb1b7ddbd1126d2b1a615214e02a485c34bec93f7ad6e43fbeb3f6ebe4c"} Mar 10 08:10:06 crc kubenswrapper[4825]: I0310 08:10:06.712167 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79c5dff769-v95dp" Mar 10 08:10:06 crc kubenswrapper[4825]: I0310 08:10:06.712182 4825 scope.go:117] "RemoveContainer" containerID="cc19aabad6ff8c1e49bcc4a7b3dd2ae4ae4af696c3cd59f2027c2c0813dd3b57" Mar 10 08:10:06 crc kubenswrapper[4825]: I0310 08:10:06.733698 4825 scope.go:117] "RemoveContainer" containerID="4748b7071eebf2c2a015322fddcb926947e5b9e97f5486275c5a7087810ad0d0" Mar 10 08:10:06 crc kubenswrapper[4825]: I0310 08:10:06.763230 4825 scope.go:117] "RemoveContainer" containerID="cc19aabad6ff8c1e49bcc4a7b3dd2ae4ae4af696c3cd59f2027c2c0813dd3b57" Mar 10 08:10:06 crc kubenswrapper[4825]: E0310 08:10:06.763680 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc19aabad6ff8c1e49bcc4a7b3dd2ae4ae4af696c3cd59f2027c2c0813dd3b57\": container with ID starting with cc19aabad6ff8c1e49bcc4a7b3dd2ae4ae4af696c3cd59f2027c2c0813dd3b57 not found: ID does not exist" containerID="cc19aabad6ff8c1e49bcc4a7b3dd2ae4ae4af696c3cd59f2027c2c0813dd3b57" Mar 10 08:10:06 crc kubenswrapper[4825]: I0310 08:10:06.763730 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc19aabad6ff8c1e49bcc4a7b3dd2ae4ae4af696c3cd59f2027c2c0813dd3b57"} err="failed to get container status \"cc19aabad6ff8c1e49bcc4a7b3dd2ae4ae4af696c3cd59f2027c2c0813dd3b57\": rpc error: code = NotFound desc = could not find container \"cc19aabad6ff8c1e49bcc4a7b3dd2ae4ae4af696c3cd59f2027c2c0813dd3b57\": container with ID starting with cc19aabad6ff8c1e49bcc4a7b3dd2ae4ae4af696c3cd59f2027c2c0813dd3b57 not found: ID does not exist" Mar 10 08:10:06 crc kubenswrapper[4825]: I0310 08:10:06.763762 4825 scope.go:117] "RemoveContainer" containerID="4748b7071eebf2c2a015322fddcb926947e5b9e97f5486275c5a7087810ad0d0" Mar 10 08:10:06 crc kubenswrapper[4825]: E0310 08:10:06.764116 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"4748b7071eebf2c2a015322fddcb926947e5b9e97f5486275c5a7087810ad0d0\": container with ID starting with 4748b7071eebf2c2a015322fddcb926947e5b9e97f5486275c5a7087810ad0d0 not found: ID does not exist" containerID="4748b7071eebf2c2a015322fddcb926947e5b9e97f5486275c5a7087810ad0d0" Mar 10 08:10:06 crc kubenswrapper[4825]: I0310 08:10:06.764191 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4748b7071eebf2c2a015322fddcb926947e5b9e97f5486275c5a7087810ad0d0"} err="failed to get container status \"4748b7071eebf2c2a015322fddcb926947e5b9e97f5486275c5a7087810ad0d0\": rpc error: code = NotFound desc = could not find container \"4748b7071eebf2c2a015322fddcb926947e5b9e97f5486275c5a7087810ad0d0\": container with ID starting with 4748b7071eebf2c2a015322fddcb926947e5b9e97f5486275c5a7087810ad0d0 not found: ID does not exist" Mar 10 08:10:06 crc kubenswrapper[4825]: I0310 08:10:06.873237 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ef475cc-7b9d-475f-bac7-c01ac3d568cb-dns-svc\") pod \"2ef475cc-7b9d-475f-bac7-c01ac3d568cb\" (UID: \"2ef475cc-7b9d-475f-bac7-c01ac3d568cb\") " Mar 10 08:10:06 crc kubenswrapper[4825]: I0310 08:10:06.873459 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ef475cc-7b9d-475f-bac7-c01ac3d568cb-ovsdbserver-sb\") pod \"2ef475cc-7b9d-475f-bac7-c01ac3d568cb\" (UID: \"2ef475cc-7b9d-475f-bac7-c01ac3d568cb\") " Mar 10 08:10:06 crc kubenswrapper[4825]: I0310 08:10:06.873504 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ef475cc-7b9d-475f-bac7-c01ac3d568cb-config\") pod \"2ef475cc-7b9d-475f-bac7-c01ac3d568cb\" (UID: \"2ef475cc-7b9d-475f-bac7-c01ac3d568cb\") " Mar 10 08:10:06 crc kubenswrapper[4825]: I0310 
08:10:06.873535 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzn6c\" (UniqueName: \"kubernetes.io/projected/2ef475cc-7b9d-475f-bac7-c01ac3d568cb-kube-api-access-nzn6c\") pod \"2ef475cc-7b9d-475f-bac7-c01ac3d568cb\" (UID: \"2ef475cc-7b9d-475f-bac7-c01ac3d568cb\") " Mar 10 08:10:06 crc kubenswrapper[4825]: I0310 08:10:06.873629 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ef475cc-7b9d-475f-bac7-c01ac3d568cb-ovsdbserver-nb\") pod \"2ef475cc-7b9d-475f-bac7-c01ac3d568cb\" (UID: \"2ef475cc-7b9d-475f-bac7-c01ac3d568cb\") " Mar 10 08:10:06 crc kubenswrapper[4825]: I0310 08:10:06.880261 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ef475cc-7b9d-475f-bac7-c01ac3d568cb-kube-api-access-nzn6c" (OuterVolumeSpecName: "kube-api-access-nzn6c") pod "2ef475cc-7b9d-475f-bac7-c01ac3d568cb" (UID: "2ef475cc-7b9d-475f-bac7-c01ac3d568cb"). InnerVolumeSpecName "kube-api-access-nzn6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:10:06 crc kubenswrapper[4825]: I0310 08:10:06.923722 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ef475cc-7b9d-475f-bac7-c01ac3d568cb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2ef475cc-7b9d-475f-bac7-c01ac3d568cb" (UID: "2ef475cc-7b9d-475f-bac7-c01ac3d568cb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:10:06 crc kubenswrapper[4825]: I0310 08:10:06.928397 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ef475cc-7b9d-475f-bac7-c01ac3d568cb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2ef475cc-7b9d-475f-bac7-c01ac3d568cb" (UID: "2ef475cc-7b9d-475f-bac7-c01ac3d568cb"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:10:06 crc kubenswrapper[4825]: I0310 08:10:06.932452 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ef475cc-7b9d-475f-bac7-c01ac3d568cb-config" (OuterVolumeSpecName: "config") pod "2ef475cc-7b9d-475f-bac7-c01ac3d568cb" (UID: "2ef475cc-7b9d-475f-bac7-c01ac3d568cb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:10:06 crc kubenswrapper[4825]: I0310 08:10:06.933656 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ef475cc-7b9d-475f-bac7-c01ac3d568cb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2ef475cc-7b9d-475f-bac7-c01ac3d568cb" (UID: "2ef475cc-7b9d-475f-bac7-c01ac3d568cb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:10:06 crc kubenswrapper[4825]: I0310 08:10:06.976245 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ef475cc-7b9d-475f-bac7-c01ac3d568cb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 08:10:06 crc kubenswrapper[4825]: I0310 08:10:06.976287 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ef475cc-7b9d-475f-bac7-c01ac3d568cb-config\") on node \"crc\" DevicePath \"\"" Mar 10 08:10:06 crc kubenswrapper[4825]: I0310 08:10:06.976302 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzn6c\" (UniqueName: \"kubernetes.io/projected/2ef475cc-7b9d-475f-bac7-c01ac3d568cb-kube-api-access-nzn6c\") on node \"crc\" DevicePath \"\"" Mar 10 08:10:06 crc kubenswrapper[4825]: I0310 08:10:06.976313 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ef475cc-7b9d-475f-bac7-c01ac3d568cb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 
08:10:06 crc kubenswrapper[4825]: I0310 08:10:06.976324 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ef475cc-7b9d-475f-bac7-c01ac3d568cb-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 08:10:07 crc kubenswrapper[4825]: I0310 08:10:07.041076 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79c5dff769-v95dp"] Mar 10 08:10:07 crc kubenswrapper[4825]: I0310 08:10:07.047056 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79c5dff769-v95dp"] Mar 10 08:10:07 crc kubenswrapper[4825]: I0310 08:10:07.255525 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ef475cc-7b9d-475f-bac7-c01ac3d568cb" path="/var/lib/kubelet/pods/2ef475cc-7b9d-475f-bac7-c01ac3d568cb/volumes" Mar 10 08:10:08 crc kubenswrapper[4825]: I0310 08:10:08.004235 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-d4mm8" Mar 10 08:10:08 crc kubenswrapper[4825]: I0310 08:10:08.090956 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wln9\" (UniqueName: \"kubernetes.io/projected/c94d5e99-c6af-43b2-9000-bc78fb053d79-kube-api-access-2wln9\") pod \"c94d5e99-c6af-43b2-9000-bc78fb053d79\" (UID: \"c94d5e99-c6af-43b2-9000-bc78fb053d79\") " Mar 10 08:10:08 crc kubenswrapper[4825]: I0310 08:10:08.091023 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c94d5e99-c6af-43b2-9000-bc78fb053d79-credential-keys\") pod \"c94d5e99-c6af-43b2-9000-bc78fb053d79\" (UID: \"c94d5e99-c6af-43b2-9000-bc78fb053d79\") " Mar 10 08:10:08 crc kubenswrapper[4825]: I0310 08:10:08.091067 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c94d5e99-c6af-43b2-9000-bc78fb053d79-combined-ca-bundle\") pod 
\"c94d5e99-c6af-43b2-9000-bc78fb053d79\" (UID: \"c94d5e99-c6af-43b2-9000-bc78fb053d79\") " Mar 10 08:10:08 crc kubenswrapper[4825]: I0310 08:10:08.091104 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c94d5e99-c6af-43b2-9000-bc78fb053d79-config-data\") pod \"c94d5e99-c6af-43b2-9000-bc78fb053d79\" (UID: \"c94d5e99-c6af-43b2-9000-bc78fb053d79\") " Mar 10 08:10:08 crc kubenswrapper[4825]: I0310 08:10:08.091124 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c94d5e99-c6af-43b2-9000-bc78fb053d79-scripts\") pod \"c94d5e99-c6af-43b2-9000-bc78fb053d79\" (UID: \"c94d5e99-c6af-43b2-9000-bc78fb053d79\") " Mar 10 08:10:08 crc kubenswrapper[4825]: I0310 08:10:08.091247 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c94d5e99-c6af-43b2-9000-bc78fb053d79-fernet-keys\") pod \"c94d5e99-c6af-43b2-9000-bc78fb053d79\" (UID: \"c94d5e99-c6af-43b2-9000-bc78fb053d79\") " Mar 10 08:10:08 crc kubenswrapper[4825]: I0310 08:10:08.097286 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c94d5e99-c6af-43b2-9000-bc78fb053d79-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "c94d5e99-c6af-43b2-9000-bc78fb053d79" (UID: "c94d5e99-c6af-43b2-9000-bc78fb053d79"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:10:08 crc kubenswrapper[4825]: I0310 08:10:08.097363 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c94d5e99-c6af-43b2-9000-bc78fb053d79-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c94d5e99-c6af-43b2-9000-bc78fb053d79" (UID: "c94d5e99-c6af-43b2-9000-bc78fb053d79"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:10:08 crc kubenswrapper[4825]: I0310 08:10:08.097565 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c94d5e99-c6af-43b2-9000-bc78fb053d79-kube-api-access-2wln9" (OuterVolumeSpecName: "kube-api-access-2wln9") pod "c94d5e99-c6af-43b2-9000-bc78fb053d79" (UID: "c94d5e99-c6af-43b2-9000-bc78fb053d79"). InnerVolumeSpecName "kube-api-access-2wln9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:10:08 crc kubenswrapper[4825]: I0310 08:10:08.098577 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c94d5e99-c6af-43b2-9000-bc78fb053d79-scripts" (OuterVolumeSpecName: "scripts") pod "c94d5e99-c6af-43b2-9000-bc78fb053d79" (UID: "c94d5e99-c6af-43b2-9000-bc78fb053d79"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:10:08 crc kubenswrapper[4825]: I0310 08:10:08.115149 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c94d5e99-c6af-43b2-9000-bc78fb053d79-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c94d5e99-c6af-43b2-9000-bc78fb053d79" (UID: "c94d5e99-c6af-43b2-9000-bc78fb053d79"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:10:08 crc kubenswrapper[4825]: I0310 08:10:08.116693 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c94d5e99-c6af-43b2-9000-bc78fb053d79-config-data" (OuterVolumeSpecName: "config-data") pod "c94d5e99-c6af-43b2-9000-bc78fb053d79" (UID: "c94d5e99-c6af-43b2-9000-bc78fb053d79"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:10:08 crc kubenswrapper[4825]: I0310 08:10:08.193446 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wln9\" (UniqueName: \"kubernetes.io/projected/c94d5e99-c6af-43b2-9000-bc78fb053d79-kube-api-access-2wln9\") on node \"crc\" DevicePath \"\"" Mar 10 08:10:08 crc kubenswrapper[4825]: I0310 08:10:08.193632 4825 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c94d5e99-c6af-43b2-9000-bc78fb053d79-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 10 08:10:08 crc kubenswrapper[4825]: I0310 08:10:08.193691 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c94d5e99-c6af-43b2-9000-bc78fb053d79-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:10:08 crc kubenswrapper[4825]: I0310 08:10:08.193771 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c94d5e99-c6af-43b2-9000-bc78fb053d79-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 08:10:08 crc kubenswrapper[4825]: I0310 08:10:08.193828 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c94d5e99-c6af-43b2-9000-bc78fb053d79-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 08:10:08 crc kubenswrapper[4825]: I0310 08:10:08.193880 4825 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c94d5e99-c6af-43b2-9000-bc78fb053d79-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 10 08:10:08 crc kubenswrapper[4825]: I0310 08:10:08.727317 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-d4mm8" event={"ID":"c94d5e99-c6af-43b2-9000-bc78fb053d79","Type":"ContainerDied","Data":"ecd6ade617b5acb45e09eda34eaea38cef3365aeef73ed959695e4645cd07ecc"} Mar 10 08:10:08 crc kubenswrapper[4825]: I0310 
08:10:08.727354 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ecd6ade617b5acb45e09eda34eaea38cef3365aeef73ed959695e4645cd07ecc" Mar 10 08:10:08 crc kubenswrapper[4825]: I0310 08:10:08.727396 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-d4mm8" Mar 10 08:10:08 crc kubenswrapper[4825]: I0310 08:10:08.836211 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-586b5b5d4c-4gh87"] Mar 10 08:10:08 crc kubenswrapper[4825]: E0310 08:10:08.836561 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ef475cc-7b9d-475f-bac7-c01ac3d568cb" containerName="dnsmasq-dns" Mar 10 08:10:08 crc kubenswrapper[4825]: I0310 08:10:08.836581 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ef475cc-7b9d-475f-bac7-c01ac3d568cb" containerName="dnsmasq-dns" Mar 10 08:10:08 crc kubenswrapper[4825]: E0310 08:10:08.836618 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ef475cc-7b9d-475f-bac7-c01ac3d568cb" containerName="init" Mar 10 08:10:08 crc kubenswrapper[4825]: I0310 08:10:08.836630 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ef475cc-7b9d-475f-bac7-c01ac3d568cb" containerName="init" Mar 10 08:10:08 crc kubenswrapper[4825]: E0310 08:10:08.836654 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c94d5e99-c6af-43b2-9000-bc78fb053d79" containerName="keystone-bootstrap" Mar 10 08:10:08 crc kubenswrapper[4825]: I0310 08:10:08.836663 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="c94d5e99-c6af-43b2-9000-bc78fb053d79" containerName="keystone-bootstrap" Mar 10 08:10:08 crc kubenswrapper[4825]: E0310 08:10:08.836680 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb5138fc-0a3c-4581-8319-0154b854ccce" containerName="oc" Mar 10 08:10:08 crc kubenswrapper[4825]: I0310 08:10:08.836688 4825 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fb5138fc-0a3c-4581-8319-0154b854ccce" containerName="oc" Mar 10 08:10:08 crc kubenswrapper[4825]: I0310 08:10:08.836865 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ef475cc-7b9d-475f-bac7-c01ac3d568cb" containerName="dnsmasq-dns" Mar 10 08:10:08 crc kubenswrapper[4825]: I0310 08:10:08.836891 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb5138fc-0a3c-4581-8319-0154b854ccce" containerName="oc" Mar 10 08:10:08 crc kubenswrapper[4825]: I0310 08:10:08.836907 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="c94d5e99-c6af-43b2-9000-bc78fb053d79" containerName="keystone-bootstrap" Mar 10 08:10:08 crc kubenswrapper[4825]: I0310 08:10:08.837591 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-586b5b5d4c-4gh87" Mar 10 08:10:08 crc kubenswrapper[4825]: I0310 08:10:08.839373 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 10 08:10:08 crc kubenswrapper[4825]: I0310 08:10:08.839735 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 10 08:10:08 crc kubenswrapper[4825]: I0310 08:10:08.840022 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 10 08:10:08 crc kubenswrapper[4825]: I0310 08:10:08.841006 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 10 08:10:08 crc kubenswrapper[4825]: I0310 08:10:08.841189 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 10 08:10:08 crc kubenswrapper[4825]: I0310 08:10:08.841418 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rghcj" Mar 10 08:10:08 crc kubenswrapper[4825]: I0310 08:10:08.845429 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-586b5b5d4c-4gh87"] 
Mar 10 08:10:09 crc kubenswrapper[4825]: I0310 08:10:09.006708 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f663222e-3075-4730-a6d8-74a90a27c152-combined-ca-bundle\") pod \"keystone-586b5b5d4c-4gh87\" (UID: \"f663222e-3075-4730-a6d8-74a90a27c152\") " pod="openstack/keystone-586b5b5d4c-4gh87" Mar 10 08:10:09 crc kubenswrapper[4825]: I0310 08:10:09.006761 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6g66\" (UniqueName: \"kubernetes.io/projected/f663222e-3075-4730-a6d8-74a90a27c152-kube-api-access-g6g66\") pod \"keystone-586b5b5d4c-4gh87\" (UID: \"f663222e-3075-4730-a6d8-74a90a27c152\") " pod="openstack/keystone-586b5b5d4c-4gh87" Mar 10 08:10:09 crc kubenswrapper[4825]: I0310 08:10:09.006835 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f663222e-3075-4730-a6d8-74a90a27c152-fernet-keys\") pod \"keystone-586b5b5d4c-4gh87\" (UID: \"f663222e-3075-4730-a6d8-74a90a27c152\") " pod="openstack/keystone-586b5b5d4c-4gh87" Mar 10 08:10:09 crc kubenswrapper[4825]: I0310 08:10:09.006859 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f663222e-3075-4730-a6d8-74a90a27c152-internal-tls-certs\") pod \"keystone-586b5b5d4c-4gh87\" (UID: \"f663222e-3075-4730-a6d8-74a90a27c152\") " pod="openstack/keystone-586b5b5d4c-4gh87" Mar 10 08:10:09 crc kubenswrapper[4825]: I0310 08:10:09.006994 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f663222e-3075-4730-a6d8-74a90a27c152-config-data\") pod \"keystone-586b5b5d4c-4gh87\" (UID: \"f663222e-3075-4730-a6d8-74a90a27c152\") " 
pod="openstack/keystone-586b5b5d4c-4gh87" Mar 10 08:10:09 crc kubenswrapper[4825]: I0310 08:10:09.007041 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f663222e-3075-4730-a6d8-74a90a27c152-public-tls-certs\") pod \"keystone-586b5b5d4c-4gh87\" (UID: \"f663222e-3075-4730-a6d8-74a90a27c152\") " pod="openstack/keystone-586b5b5d4c-4gh87" Mar 10 08:10:09 crc kubenswrapper[4825]: I0310 08:10:09.007067 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f663222e-3075-4730-a6d8-74a90a27c152-scripts\") pod \"keystone-586b5b5d4c-4gh87\" (UID: \"f663222e-3075-4730-a6d8-74a90a27c152\") " pod="openstack/keystone-586b5b5d4c-4gh87" Mar 10 08:10:09 crc kubenswrapper[4825]: I0310 08:10:09.007108 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f663222e-3075-4730-a6d8-74a90a27c152-credential-keys\") pod \"keystone-586b5b5d4c-4gh87\" (UID: \"f663222e-3075-4730-a6d8-74a90a27c152\") " pod="openstack/keystone-586b5b5d4c-4gh87" Mar 10 08:10:09 crc kubenswrapper[4825]: I0310 08:10:09.108480 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6g66\" (UniqueName: \"kubernetes.io/projected/f663222e-3075-4730-a6d8-74a90a27c152-kube-api-access-g6g66\") pod \"keystone-586b5b5d4c-4gh87\" (UID: \"f663222e-3075-4730-a6d8-74a90a27c152\") " pod="openstack/keystone-586b5b5d4c-4gh87" Mar 10 08:10:09 crc kubenswrapper[4825]: I0310 08:10:09.108553 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f663222e-3075-4730-a6d8-74a90a27c152-fernet-keys\") pod \"keystone-586b5b5d4c-4gh87\" (UID: \"f663222e-3075-4730-a6d8-74a90a27c152\") " 
pod="openstack/keystone-586b5b5d4c-4gh87" Mar 10 08:10:09 crc kubenswrapper[4825]: I0310 08:10:09.108574 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f663222e-3075-4730-a6d8-74a90a27c152-internal-tls-certs\") pod \"keystone-586b5b5d4c-4gh87\" (UID: \"f663222e-3075-4730-a6d8-74a90a27c152\") " pod="openstack/keystone-586b5b5d4c-4gh87" Mar 10 08:10:09 crc kubenswrapper[4825]: I0310 08:10:09.108668 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f663222e-3075-4730-a6d8-74a90a27c152-config-data\") pod \"keystone-586b5b5d4c-4gh87\" (UID: \"f663222e-3075-4730-a6d8-74a90a27c152\") " pod="openstack/keystone-586b5b5d4c-4gh87" Mar 10 08:10:09 crc kubenswrapper[4825]: I0310 08:10:09.108716 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f663222e-3075-4730-a6d8-74a90a27c152-public-tls-certs\") pod \"keystone-586b5b5d4c-4gh87\" (UID: \"f663222e-3075-4730-a6d8-74a90a27c152\") " pod="openstack/keystone-586b5b5d4c-4gh87" Mar 10 08:10:09 crc kubenswrapper[4825]: I0310 08:10:09.108743 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f663222e-3075-4730-a6d8-74a90a27c152-scripts\") pod \"keystone-586b5b5d4c-4gh87\" (UID: \"f663222e-3075-4730-a6d8-74a90a27c152\") " pod="openstack/keystone-586b5b5d4c-4gh87" Mar 10 08:10:09 crc kubenswrapper[4825]: I0310 08:10:09.108768 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f663222e-3075-4730-a6d8-74a90a27c152-credential-keys\") pod \"keystone-586b5b5d4c-4gh87\" (UID: \"f663222e-3075-4730-a6d8-74a90a27c152\") " pod="openstack/keystone-586b5b5d4c-4gh87" Mar 10 08:10:09 crc kubenswrapper[4825]: I0310 08:10:09.108797 
4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f663222e-3075-4730-a6d8-74a90a27c152-combined-ca-bundle\") pod \"keystone-586b5b5d4c-4gh87\" (UID: \"f663222e-3075-4730-a6d8-74a90a27c152\") " pod="openstack/keystone-586b5b5d4c-4gh87" Mar 10 08:10:09 crc kubenswrapper[4825]: I0310 08:10:09.112350 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f663222e-3075-4730-a6d8-74a90a27c152-combined-ca-bundle\") pod \"keystone-586b5b5d4c-4gh87\" (UID: \"f663222e-3075-4730-a6d8-74a90a27c152\") " pod="openstack/keystone-586b5b5d4c-4gh87" Mar 10 08:10:09 crc kubenswrapper[4825]: I0310 08:10:09.112503 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f663222e-3075-4730-a6d8-74a90a27c152-config-data\") pod \"keystone-586b5b5d4c-4gh87\" (UID: \"f663222e-3075-4730-a6d8-74a90a27c152\") " pod="openstack/keystone-586b5b5d4c-4gh87" Mar 10 08:10:09 crc kubenswrapper[4825]: I0310 08:10:09.113443 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f663222e-3075-4730-a6d8-74a90a27c152-fernet-keys\") pod \"keystone-586b5b5d4c-4gh87\" (UID: \"f663222e-3075-4730-a6d8-74a90a27c152\") " pod="openstack/keystone-586b5b5d4c-4gh87" Mar 10 08:10:09 crc kubenswrapper[4825]: I0310 08:10:09.114506 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f663222e-3075-4730-a6d8-74a90a27c152-public-tls-certs\") pod \"keystone-586b5b5d4c-4gh87\" (UID: \"f663222e-3075-4730-a6d8-74a90a27c152\") " pod="openstack/keystone-586b5b5d4c-4gh87" Mar 10 08:10:09 crc kubenswrapper[4825]: I0310 08:10:09.114832 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f663222e-3075-4730-a6d8-74a90a27c152-internal-tls-certs\") pod \"keystone-586b5b5d4c-4gh87\" (UID: \"f663222e-3075-4730-a6d8-74a90a27c152\") " pod="openstack/keystone-586b5b5d4c-4gh87" Mar 10 08:10:09 crc kubenswrapper[4825]: I0310 08:10:09.117357 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f663222e-3075-4730-a6d8-74a90a27c152-scripts\") pod \"keystone-586b5b5d4c-4gh87\" (UID: \"f663222e-3075-4730-a6d8-74a90a27c152\") " pod="openstack/keystone-586b5b5d4c-4gh87" Mar 10 08:10:09 crc kubenswrapper[4825]: I0310 08:10:09.117912 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f663222e-3075-4730-a6d8-74a90a27c152-credential-keys\") pod \"keystone-586b5b5d4c-4gh87\" (UID: \"f663222e-3075-4730-a6d8-74a90a27c152\") " pod="openstack/keystone-586b5b5d4c-4gh87" Mar 10 08:10:09 crc kubenswrapper[4825]: I0310 08:10:09.129599 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6g66\" (UniqueName: \"kubernetes.io/projected/f663222e-3075-4730-a6d8-74a90a27c152-kube-api-access-g6g66\") pod \"keystone-586b5b5d4c-4gh87\" (UID: \"f663222e-3075-4730-a6d8-74a90a27c152\") " pod="openstack/keystone-586b5b5d4c-4gh87" Mar 10 08:10:09 crc kubenswrapper[4825]: I0310 08:10:09.162916 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rghcj" Mar 10 08:10:09 crc kubenswrapper[4825]: I0310 08:10:09.170655 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-586b5b5d4c-4gh87" Mar 10 08:10:09 crc kubenswrapper[4825]: I0310 08:10:09.621400 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-586b5b5d4c-4gh87"] Mar 10 08:10:09 crc kubenswrapper[4825]: W0310 08:10:09.629291 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf663222e_3075_4730_a6d8_74a90a27c152.slice/crio-fb7ae5839bed9a71ac24beaf34221dd406443d585cb01ac173f79adcfd32083f WatchSource:0}: Error finding container fb7ae5839bed9a71ac24beaf34221dd406443d585cb01ac173f79adcfd32083f: Status 404 returned error can't find the container with id fb7ae5839bed9a71ac24beaf34221dd406443d585cb01ac173f79adcfd32083f Mar 10 08:10:09 crc kubenswrapper[4825]: I0310 08:10:09.737967 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-586b5b5d4c-4gh87" event={"ID":"f663222e-3075-4730-a6d8-74a90a27c152","Type":"ContainerStarted","Data":"fb7ae5839bed9a71ac24beaf34221dd406443d585cb01ac173f79adcfd32083f"} Mar 10 08:10:10 crc kubenswrapper[4825]: I0310 08:10:10.748564 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-586b5b5d4c-4gh87" event={"ID":"f663222e-3075-4730-a6d8-74a90a27c152","Type":"ContainerStarted","Data":"a570a6209b5428f5ad5a961824413da849f12fb23bf568a22e78e7c92c4270bd"} Mar 10 08:10:10 crc kubenswrapper[4825]: I0310 08:10:10.748898 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-586b5b5d4c-4gh87" Mar 10 08:10:10 crc kubenswrapper[4825]: I0310 08:10:10.778603 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-586b5b5d4c-4gh87" podStartSLOduration=2.778583042 podStartE2EDuration="2.778583042s" podCreationTimestamp="2026-03-10 08:10:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-10 08:10:10.773058027 +0000 UTC m=+5163.802838652" watchObservedRunningTime="2026-03-10 08:10:10.778583042 +0000 UTC m=+5163.808363657" Mar 10 08:10:16 crc kubenswrapper[4825]: I0310 08:10:16.888480 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 08:10:16 crc kubenswrapper[4825]: I0310 08:10:16.889181 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 08:10:16 crc kubenswrapper[4825]: I0310 08:10:16.889276 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" Mar 10 08:10:16 crc kubenswrapper[4825]: I0310 08:10:16.890723 4825 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8802fd3b03472e4fc7e7e01ba0c1a6885938d2d647b76d216f70475068123de1"} pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 08:10:16 crc kubenswrapper[4825]: I0310 08:10:16.890885 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" containerID="cri-o://8802fd3b03472e4fc7e7e01ba0c1a6885938d2d647b76d216f70475068123de1" gracePeriod=600 Mar 10 08:10:17 crc kubenswrapper[4825]: I0310 
08:10:17.808081 4825 generic.go:334] "Generic (PLEG): container finished" podID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerID="8802fd3b03472e4fc7e7e01ba0c1a6885938d2d647b76d216f70475068123de1" exitCode=0 Mar 10 08:10:17 crc kubenswrapper[4825]: I0310 08:10:17.808169 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerDied","Data":"8802fd3b03472e4fc7e7e01ba0c1a6885938d2d647b76d216f70475068123de1"} Mar 10 08:10:17 crc kubenswrapper[4825]: I0310 08:10:17.808683 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerStarted","Data":"cc40bbe7e5127cd8ca03c26dd788c9bfc628fd248484a1f5f73946709b5b4414"} Mar 10 08:10:17 crc kubenswrapper[4825]: I0310 08:10:17.808719 4825 scope.go:117] "RemoveContainer" containerID="856a46518228770fd6354414263218ba580cad9f6bf09966c52a899f2397881c" Mar 10 08:10:40 crc kubenswrapper[4825]: I0310 08:10:40.663510 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-586b5b5d4c-4gh87" Mar 10 08:10:41 crc kubenswrapper[4825]: I0310 08:10:41.095948 4825 scope.go:117] "RemoveContainer" containerID="2a1b30f0ff9c8feeb518a52f0391c45ccadded9b6a8f5dd536dcfe1450db78b7" Mar 10 08:10:45 crc kubenswrapper[4825]: I0310 08:10:45.207986 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 10 08:10:45 crc kubenswrapper[4825]: I0310 08:10:45.210568 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 10 08:10:45 crc kubenswrapper[4825]: I0310 08:10:45.212624 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 10 08:10:45 crc kubenswrapper[4825]: I0310 08:10:45.213296 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-7qq88" Mar 10 08:10:45 crc kubenswrapper[4825]: I0310 08:10:45.213539 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 10 08:10:45 crc kubenswrapper[4825]: I0310 08:10:45.229976 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 10 08:10:45 crc kubenswrapper[4825]: I0310 08:10:45.361452 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de-openstack-config\") pod \"openstackclient\" (UID: \"a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de\") " pod="openstack/openstackclient" Mar 10 08:10:45 crc kubenswrapper[4825]: I0310 08:10:45.361529 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de-openstack-config-secret\") pod \"openstackclient\" (UID: \"a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de\") " pod="openstack/openstackclient" Mar 10 08:10:45 crc kubenswrapper[4825]: I0310 08:10:45.361581 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s8q2\" (UniqueName: \"kubernetes.io/projected/a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de-kube-api-access-8s8q2\") pod \"openstackclient\" (UID: \"a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de\") " pod="openstack/openstackclient" Mar 10 08:10:45 crc kubenswrapper[4825]: I0310 08:10:45.361629 4825 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de\") " pod="openstack/openstackclient" Mar 10 08:10:45 crc kubenswrapper[4825]: I0310 08:10:45.463465 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de-openstack-config\") pod \"openstackclient\" (UID: \"a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de\") " pod="openstack/openstackclient" Mar 10 08:10:45 crc kubenswrapper[4825]: I0310 08:10:45.463507 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de-openstack-config-secret\") pod \"openstackclient\" (UID: \"a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de\") " pod="openstack/openstackclient" Mar 10 08:10:45 crc kubenswrapper[4825]: I0310 08:10:45.463542 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8s8q2\" (UniqueName: \"kubernetes.io/projected/a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de-kube-api-access-8s8q2\") pod \"openstackclient\" (UID: \"a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de\") " pod="openstack/openstackclient" Mar 10 08:10:45 crc kubenswrapper[4825]: I0310 08:10:45.463581 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de\") " pod="openstack/openstackclient" Mar 10 08:10:45 crc kubenswrapper[4825]: I0310 08:10:45.464641 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de-openstack-config\") pod \"openstackclient\" (UID: \"a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de\") " pod="openstack/openstackclient" Mar 10 08:10:45 crc kubenswrapper[4825]: I0310 08:10:45.470858 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de\") " pod="openstack/openstackclient" Mar 10 08:10:45 crc kubenswrapper[4825]: I0310 08:10:45.473044 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de-openstack-config-secret\") pod \"openstackclient\" (UID: \"a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de\") " pod="openstack/openstackclient" Mar 10 08:10:45 crc kubenswrapper[4825]: I0310 08:10:45.481079 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s8q2\" (UniqueName: \"kubernetes.io/projected/a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de-kube-api-access-8s8q2\") pod \"openstackclient\" (UID: \"a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de\") " pod="openstack/openstackclient" Mar 10 08:10:45 crc kubenswrapper[4825]: I0310 08:10:45.547534 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 10 08:10:46 crc kubenswrapper[4825]: I0310 08:10:46.048775 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 10 08:10:46 crc kubenswrapper[4825]: I0310 08:10:46.083725 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de","Type":"ContainerStarted","Data":"239b3cfe195a0c3fc417f559438fb80c810446f7b57cd71937f1613308c57bf6"} Mar 10 08:10:59 crc kubenswrapper[4825]: I0310 08:10:59.213194 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de","Type":"ContainerStarted","Data":"e7ff1f930e87f5d9e3cabeec960ef7b6904a7905154d6c779781b767eab69638"} Mar 10 08:10:59 crc kubenswrapper[4825]: I0310 08:10:59.244852 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.941616171 podStartE2EDuration="14.244829605s" podCreationTimestamp="2026-03-10 08:10:45 +0000 UTC" firstStartedPulling="2026-03-10 08:10:46.055305119 +0000 UTC m=+5199.085085764" lastFinishedPulling="2026-03-10 08:10:58.358518563 +0000 UTC m=+5211.388299198" observedRunningTime="2026-03-10 08:10:59.23053564 +0000 UTC m=+5212.260316255" watchObservedRunningTime="2026-03-10 08:10:59.244829605 +0000 UTC m=+5212.274610230" Mar 10 08:12:00 crc kubenswrapper[4825]: I0310 08:12:00.169484 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552172-hdgvs"] Mar 10 08:12:00 crc kubenswrapper[4825]: I0310 08:12:00.171855 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552172-hdgvs" Mar 10 08:12:00 crc kubenswrapper[4825]: I0310 08:12:00.177233 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 08:12:00 crc kubenswrapper[4825]: I0310 08:12:00.177435 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 08:12:00 crc kubenswrapper[4825]: I0310 08:12:00.177727 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 08:12:00 crc kubenswrapper[4825]: I0310 08:12:00.194636 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552172-hdgvs"] Mar 10 08:12:00 crc kubenswrapper[4825]: I0310 08:12:00.262698 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dx6mj"] Mar 10 08:12:00 crc kubenswrapper[4825]: I0310 08:12:00.266048 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dx6mj" Mar 10 08:12:00 crc kubenswrapper[4825]: I0310 08:12:00.284192 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dx6mj"] Mar 10 08:12:00 crc kubenswrapper[4825]: I0310 08:12:00.284784 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bff8\" (UniqueName: \"kubernetes.io/projected/9032c572-c243-4bb6-b65c-83017848cc2c-kube-api-access-7bff8\") pod \"auto-csr-approver-29552172-hdgvs\" (UID: \"9032c572-c243-4bb6-b65c-83017848cc2c\") " pod="openshift-infra/auto-csr-approver-29552172-hdgvs" Mar 10 08:12:00 crc kubenswrapper[4825]: I0310 08:12:00.387740 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hnkd\" (UniqueName: \"kubernetes.io/projected/3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b-kube-api-access-7hnkd\") pod \"certified-operators-dx6mj\" (UID: \"3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b\") " pod="openshift-marketplace/certified-operators-dx6mj" Mar 10 08:12:00 crc kubenswrapper[4825]: I0310 08:12:00.387823 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b-utilities\") pod \"certified-operators-dx6mj\" (UID: \"3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b\") " pod="openshift-marketplace/certified-operators-dx6mj" Mar 10 08:12:00 crc kubenswrapper[4825]: I0310 08:12:00.387955 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bff8\" (UniqueName: \"kubernetes.io/projected/9032c572-c243-4bb6-b65c-83017848cc2c-kube-api-access-7bff8\") pod \"auto-csr-approver-29552172-hdgvs\" (UID: \"9032c572-c243-4bb6-b65c-83017848cc2c\") " pod="openshift-infra/auto-csr-approver-29552172-hdgvs" Mar 10 08:12:00 crc kubenswrapper[4825]: I0310 08:12:00.388097 
4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b-catalog-content\") pod \"certified-operators-dx6mj\" (UID: \"3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b\") " pod="openshift-marketplace/certified-operators-dx6mj" Mar 10 08:12:00 crc kubenswrapper[4825]: I0310 08:12:00.411858 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bff8\" (UniqueName: \"kubernetes.io/projected/9032c572-c243-4bb6-b65c-83017848cc2c-kube-api-access-7bff8\") pod \"auto-csr-approver-29552172-hdgvs\" (UID: \"9032c572-c243-4bb6-b65c-83017848cc2c\") " pod="openshift-infra/auto-csr-approver-29552172-hdgvs" Mar 10 08:12:00 crc kubenswrapper[4825]: I0310 08:12:00.490148 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b-catalog-content\") pod \"certified-operators-dx6mj\" (UID: \"3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b\") " pod="openshift-marketplace/certified-operators-dx6mj" Mar 10 08:12:00 crc kubenswrapper[4825]: I0310 08:12:00.490238 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hnkd\" (UniqueName: \"kubernetes.io/projected/3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b-kube-api-access-7hnkd\") pod \"certified-operators-dx6mj\" (UID: \"3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b\") " pod="openshift-marketplace/certified-operators-dx6mj" Mar 10 08:12:00 crc kubenswrapper[4825]: I0310 08:12:00.490272 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b-utilities\") pod \"certified-operators-dx6mj\" (UID: \"3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b\") " pod="openshift-marketplace/certified-operators-dx6mj" Mar 10 08:12:00 crc kubenswrapper[4825]: I0310 
08:12:00.490837 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b-utilities\") pod \"certified-operators-dx6mj\" (UID: \"3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b\") " pod="openshift-marketplace/certified-operators-dx6mj" Mar 10 08:12:00 crc kubenswrapper[4825]: I0310 08:12:00.491231 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b-catalog-content\") pod \"certified-operators-dx6mj\" (UID: \"3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b\") " pod="openshift-marketplace/certified-operators-dx6mj" Mar 10 08:12:00 crc kubenswrapper[4825]: I0310 08:12:00.498454 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552172-hdgvs" Mar 10 08:12:00 crc kubenswrapper[4825]: I0310 08:12:00.511850 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hnkd\" (UniqueName: \"kubernetes.io/projected/3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b-kube-api-access-7hnkd\") pod \"certified-operators-dx6mj\" (UID: \"3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b\") " pod="openshift-marketplace/certified-operators-dx6mj" Mar 10 08:12:00 crc kubenswrapper[4825]: I0310 08:12:00.604056 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dx6mj" Mar 10 08:12:01 crc kubenswrapper[4825]: I0310 08:12:01.059734 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552172-hdgvs"] Mar 10 08:12:01 crc kubenswrapper[4825]: I0310 08:12:01.163904 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dx6mj"] Mar 10 08:12:01 crc kubenswrapper[4825]: W0310 08:12:01.166068 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3033dc36_d740_4fcb_b1bc_b1c4ecfcf95b.slice/crio-31c3fe3f651b99161d9973779bb8e87ac06f6803ca2c5cf4d6f897e7ec22c168 WatchSource:0}: Error finding container 31c3fe3f651b99161d9973779bb8e87ac06f6803ca2c5cf4d6f897e7ec22c168: Status 404 returned error can't find the container with id 31c3fe3f651b99161d9973779bb8e87ac06f6803ca2c5cf4d6f897e7ec22c168 Mar 10 08:12:01 crc kubenswrapper[4825]: I0310 08:12:01.798925 4825 generic.go:334] "Generic (PLEG): container finished" podID="3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b" containerID="5a722f772ff2b870e6cae73d92a12fea94c40323ce1a49af0d4ba243e98f5ab6" exitCode=0 Mar 10 08:12:01 crc kubenswrapper[4825]: I0310 08:12:01.799005 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dx6mj" event={"ID":"3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b","Type":"ContainerDied","Data":"5a722f772ff2b870e6cae73d92a12fea94c40323ce1a49af0d4ba243e98f5ab6"} Mar 10 08:12:01 crc kubenswrapper[4825]: I0310 08:12:01.799541 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dx6mj" event={"ID":"3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b","Type":"ContainerStarted","Data":"31c3fe3f651b99161d9973779bb8e87ac06f6803ca2c5cf4d6f897e7ec22c168"} Mar 10 08:12:01 crc kubenswrapper[4825]: I0310 08:12:01.801614 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29552172-hdgvs" event={"ID":"9032c572-c243-4bb6-b65c-83017848cc2c","Type":"ContainerStarted","Data":"d03ba4d9b7eec2205c69cedfe977dc516f3f2e1e54cd3afe22fb4c72c80e8398"} Mar 10 08:12:02 crc kubenswrapper[4825]: I0310 08:12:02.811232 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dx6mj" event={"ID":"3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b","Type":"ContainerStarted","Data":"a10ab00f209ad684fc0a649c9b62f40f0fe242ffbba0522fe37eac099cd333b0"} Mar 10 08:12:02 crc kubenswrapper[4825]: I0310 08:12:02.813380 4825 generic.go:334] "Generic (PLEG): container finished" podID="9032c572-c243-4bb6-b65c-83017848cc2c" containerID="189a96d07f651b5b7db5b6d68d9943764df8afa614cf63037b9e4dc2d490d28b" exitCode=0 Mar 10 08:12:02 crc kubenswrapper[4825]: I0310 08:12:02.813431 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552172-hdgvs" event={"ID":"9032c572-c243-4bb6-b65c-83017848cc2c","Type":"ContainerDied","Data":"189a96d07f651b5b7db5b6d68d9943764df8afa614cf63037b9e4dc2d490d28b"} Mar 10 08:12:03 crc kubenswrapper[4825]: I0310 08:12:03.824535 4825 generic.go:334] "Generic (PLEG): container finished" podID="3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b" containerID="a10ab00f209ad684fc0a649c9b62f40f0fe242ffbba0522fe37eac099cd333b0" exitCode=0 Mar 10 08:12:03 crc kubenswrapper[4825]: I0310 08:12:03.824610 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dx6mj" event={"ID":"3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b","Type":"ContainerDied","Data":"a10ab00f209ad684fc0a649c9b62f40f0fe242ffbba0522fe37eac099cd333b0"} Mar 10 08:12:04 crc kubenswrapper[4825]: I0310 08:12:04.143609 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552172-hdgvs" Mar 10 08:12:04 crc kubenswrapper[4825]: I0310 08:12:04.262903 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bff8\" (UniqueName: \"kubernetes.io/projected/9032c572-c243-4bb6-b65c-83017848cc2c-kube-api-access-7bff8\") pod \"9032c572-c243-4bb6-b65c-83017848cc2c\" (UID: \"9032c572-c243-4bb6-b65c-83017848cc2c\") " Mar 10 08:12:04 crc kubenswrapper[4825]: I0310 08:12:04.269734 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9032c572-c243-4bb6-b65c-83017848cc2c-kube-api-access-7bff8" (OuterVolumeSpecName: "kube-api-access-7bff8") pod "9032c572-c243-4bb6-b65c-83017848cc2c" (UID: "9032c572-c243-4bb6-b65c-83017848cc2c"). InnerVolumeSpecName "kube-api-access-7bff8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:12:04 crc kubenswrapper[4825]: I0310 08:12:04.364865 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bff8\" (UniqueName: \"kubernetes.io/projected/9032c572-c243-4bb6-b65c-83017848cc2c-kube-api-access-7bff8\") on node \"crc\" DevicePath \"\"" Mar 10 08:12:04 crc kubenswrapper[4825]: I0310 08:12:04.837306 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552172-hdgvs" event={"ID":"9032c572-c243-4bb6-b65c-83017848cc2c","Type":"ContainerDied","Data":"d03ba4d9b7eec2205c69cedfe977dc516f3f2e1e54cd3afe22fb4c72c80e8398"} Mar 10 08:12:04 crc kubenswrapper[4825]: I0310 08:12:04.837365 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d03ba4d9b7eec2205c69cedfe977dc516f3f2e1e54cd3afe22fb4c72c80e8398" Mar 10 08:12:04 crc kubenswrapper[4825]: I0310 08:12:04.837333 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552172-hdgvs" Mar 10 08:12:04 crc kubenswrapper[4825]: I0310 08:12:04.840081 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dx6mj" event={"ID":"3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b","Type":"ContainerStarted","Data":"3fd57e6dc614ddd898114e97eb1ba7470e72ccd708d11ca4adc47bfb9df12dd8"} Mar 10 08:12:04 crc kubenswrapper[4825]: I0310 08:12:04.881512 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dx6mj" podStartSLOduration=2.278315923 podStartE2EDuration="4.881491144s" podCreationTimestamp="2026-03-10 08:12:00 +0000 UTC" firstStartedPulling="2026-03-10 08:12:01.801892108 +0000 UTC m=+5274.831672753" lastFinishedPulling="2026-03-10 08:12:04.405067359 +0000 UTC m=+5277.434847974" observedRunningTime="2026-03-10 08:12:04.873322099 +0000 UTC m=+5277.903102734" watchObservedRunningTime="2026-03-10 08:12:04.881491144 +0000 UTC m=+5277.911271769" Mar 10 08:12:05 crc kubenswrapper[4825]: I0310 08:12:05.251515 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552166-cjdxc"] Mar 10 08:12:05 crc kubenswrapper[4825]: I0310 08:12:05.252124 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552166-cjdxc"] Mar 10 08:12:07 crc kubenswrapper[4825]: I0310 08:12:07.248647 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="658c3420-a9be-4dd3-be55-d59475be36dc" path="/var/lib/kubelet/pods/658c3420-a9be-4dd3-be55-d59475be36dc/volumes" Mar 10 08:12:10 crc kubenswrapper[4825]: I0310 08:12:10.604541 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dx6mj" Mar 10 08:12:10 crc kubenswrapper[4825]: I0310 08:12:10.605207 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-dx6mj" Mar 10 08:12:10 crc kubenswrapper[4825]: I0310 08:12:10.667936 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dx6mj" Mar 10 08:12:10 crc kubenswrapper[4825]: I0310 08:12:10.951743 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dx6mj" Mar 10 08:12:11 crc kubenswrapper[4825]: I0310 08:12:10.994540 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dx6mj"] Mar 10 08:12:12 crc kubenswrapper[4825]: I0310 08:12:12.913650 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dx6mj" podUID="3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b" containerName="registry-server" containerID="cri-o://3fd57e6dc614ddd898114e97eb1ba7470e72ccd708d11ca4adc47bfb9df12dd8" gracePeriod=2 Mar 10 08:12:13 crc kubenswrapper[4825]: I0310 08:12:13.425766 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dx6mj" Mar 10 08:12:13 crc kubenswrapper[4825]: I0310 08:12:13.528591 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b-catalog-content\") pod \"3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b\" (UID: \"3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b\") " Mar 10 08:12:13 crc kubenswrapper[4825]: I0310 08:12:13.528699 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hnkd\" (UniqueName: \"kubernetes.io/projected/3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b-kube-api-access-7hnkd\") pod \"3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b\" (UID: \"3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b\") " Mar 10 08:12:13 crc kubenswrapper[4825]: I0310 08:12:13.528774 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b-utilities\") pod \"3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b\" (UID: \"3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b\") " Mar 10 08:12:13 crc kubenswrapper[4825]: I0310 08:12:13.539293 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b-kube-api-access-7hnkd" (OuterVolumeSpecName: "kube-api-access-7hnkd") pod "3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b" (UID: "3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b"). InnerVolumeSpecName "kube-api-access-7hnkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:12:13 crc kubenswrapper[4825]: I0310 08:12:13.562343 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b-utilities" (OuterVolumeSpecName: "utilities") pod "3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b" (UID: "3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:12:13 crc kubenswrapper[4825]: I0310 08:12:13.631177 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hnkd\" (UniqueName: \"kubernetes.io/projected/3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b-kube-api-access-7hnkd\") on node \"crc\" DevicePath \"\"" Mar 10 08:12:13 crc kubenswrapper[4825]: I0310 08:12:13.631206 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 08:12:13 crc kubenswrapper[4825]: I0310 08:12:13.924737 4825 generic.go:334] "Generic (PLEG): container finished" podID="3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b" containerID="3fd57e6dc614ddd898114e97eb1ba7470e72ccd708d11ca4adc47bfb9df12dd8" exitCode=0 Mar 10 08:12:13 crc kubenswrapper[4825]: I0310 08:12:13.924800 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dx6mj" Mar 10 08:12:13 crc kubenswrapper[4825]: I0310 08:12:13.924808 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dx6mj" event={"ID":"3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b","Type":"ContainerDied","Data":"3fd57e6dc614ddd898114e97eb1ba7470e72ccd708d11ca4adc47bfb9df12dd8"} Mar 10 08:12:13 crc kubenswrapper[4825]: I0310 08:12:13.926201 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dx6mj" event={"ID":"3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b","Type":"ContainerDied","Data":"31c3fe3f651b99161d9973779bb8e87ac06f6803ca2c5cf4d6f897e7ec22c168"} Mar 10 08:12:13 crc kubenswrapper[4825]: I0310 08:12:13.926235 4825 scope.go:117] "RemoveContainer" containerID="3fd57e6dc614ddd898114e97eb1ba7470e72ccd708d11ca4adc47bfb9df12dd8" Mar 10 08:12:13 crc kubenswrapper[4825]: I0310 08:12:13.946786 4825 scope.go:117] "RemoveContainer" 
containerID="a10ab00f209ad684fc0a649c9b62f40f0fe242ffbba0522fe37eac099cd333b0" Mar 10 08:12:13 crc kubenswrapper[4825]: I0310 08:12:13.968976 4825 scope.go:117] "RemoveContainer" containerID="5a722f772ff2b870e6cae73d92a12fea94c40323ce1a49af0d4ba243e98f5ab6" Mar 10 08:12:14 crc kubenswrapper[4825]: I0310 08:12:14.014373 4825 scope.go:117] "RemoveContainer" containerID="3fd57e6dc614ddd898114e97eb1ba7470e72ccd708d11ca4adc47bfb9df12dd8" Mar 10 08:12:14 crc kubenswrapper[4825]: E0310 08:12:14.015392 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fd57e6dc614ddd898114e97eb1ba7470e72ccd708d11ca4adc47bfb9df12dd8\": container with ID starting with 3fd57e6dc614ddd898114e97eb1ba7470e72ccd708d11ca4adc47bfb9df12dd8 not found: ID does not exist" containerID="3fd57e6dc614ddd898114e97eb1ba7470e72ccd708d11ca4adc47bfb9df12dd8" Mar 10 08:12:14 crc kubenswrapper[4825]: I0310 08:12:14.015436 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fd57e6dc614ddd898114e97eb1ba7470e72ccd708d11ca4adc47bfb9df12dd8"} err="failed to get container status \"3fd57e6dc614ddd898114e97eb1ba7470e72ccd708d11ca4adc47bfb9df12dd8\": rpc error: code = NotFound desc = could not find container \"3fd57e6dc614ddd898114e97eb1ba7470e72ccd708d11ca4adc47bfb9df12dd8\": container with ID starting with 3fd57e6dc614ddd898114e97eb1ba7470e72ccd708d11ca4adc47bfb9df12dd8 not found: ID does not exist" Mar 10 08:12:14 crc kubenswrapper[4825]: I0310 08:12:14.015466 4825 scope.go:117] "RemoveContainer" containerID="a10ab00f209ad684fc0a649c9b62f40f0fe242ffbba0522fe37eac099cd333b0" Mar 10 08:12:14 crc kubenswrapper[4825]: E0310 08:12:14.015877 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a10ab00f209ad684fc0a649c9b62f40f0fe242ffbba0522fe37eac099cd333b0\": container with ID starting with 
a10ab00f209ad684fc0a649c9b62f40f0fe242ffbba0522fe37eac099cd333b0 not found: ID does not exist" containerID="a10ab00f209ad684fc0a649c9b62f40f0fe242ffbba0522fe37eac099cd333b0" Mar 10 08:12:14 crc kubenswrapper[4825]: I0310 08:12:14.015907 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a10ab00f209ad684fc0a649c9b62f40f0fe242ffbba0522fe37eac099cd333b0"} err="failed to get container status \"a10ab00f209ad684fc0a649c9b62f40f0fe242ffbba0522fe37eac099cd333b0\": rpc error: code = NotFound desc = could not find container \"a10ab00f209ad684fc0a649c9b62f40f0fe242ffbba0522fe37eac099cd333b0\": container with ID starting with a10ab00f209ad684fc0a649c9b62f40f0fe242ffbba0522fe37eac099cd333b0 not found: ID does not exist" Mar 10 08:12:14 crc kubenswrapper[4825]: I0310 08:12:14.015925 4825 scope.go:117] "RemoveContainer" containerID="5a722f772ff2b870e6cae73d92a12fea94c40323ce1a49af0d4ba243e98f5ab6" Mar 10 08:12:14 crc kubenswrapper[4825]: E0310 08:12:14.016171 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a722f772ff2b870e6cae73d92a12fea94c40323ce1a49af0d4ba243e98f5ab6\": container with ID starting with 5a722f772ff2b870e6cae73d92a12fea94c40323ce1a49af0d4ba243e98f5ab6 not found: ID does not exist" containerID="5a722f772ff2b870e6cae73d92a12fea94c40323ce1a49af0d4ba243e98f5ab6" Mar 10 08:12:14 crc kubenswrapper[4825]: I0310 08:12:14.016204 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a722f772ff2b870e6cae73d92a12fea94c40323ce1a49af0d4ba243e98f5ab6"} err="failed to get container status \"5a722f772ff2b870e6cae73d92a12fea94c40323ce1a49af0d4ba243e98f5ab6\": rpc error: code = NotFound desc = could not find container \"5a722f772ff2b870e6cae73d92a12fea94c40323ce1a49af0d4ba243e98f5ab6\": container with ID starting with 5a722f772ff2b870e6cae73d92a12fea94c40323ce1a49af0d4ba243e98f5ab6 not found: ID does not 
exist" Mar 10 08:12:14 crc kubenswrapper[4825]: I0310 08:12:14.100406 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b" (UID: "3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:12:14 crc kubenswrapper[4825]: I0310 08:12:14.143024 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 08:12:14 crc kubenswrapper[4825]: I0310 08:12:14.256406 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dx6mj"] Mar 10 08:12:14 crc kubenswrapper[4825]: I0310 08:12:14.262977 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dx6mj"] Mar 10 08:12:15 crc kubenswrapper[4825]: I0310 08:12:15.255524 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b" path="/var/lib/kubelet/pods/3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b/volumes" Mar 10 08:12:22 crc kubenswrapper[4825]: I0310 08:12:22.371628 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-krslz"] Mar 10 08:12:22 crc kubenswrapper[4825]: E0310 08:12:22.372206 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b" containerName="registry-server" Mar 10 08:12:22 crc kubenswrapper[4825]: I0310 08:12:22.372219 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b" containerName="registry-server" Mar 10 08:12:22 crc kubenswrapper[4825]: E0310 08:12:22.372229 4825 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b" containerName="extract-utilities" Mar 10 08:12:22 crc kubenswrapper[4825]: I0310 08:12:22.372235 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b" containerName="extract-utilities" Mar 10 08:12:22 crc kubenswrapper[4825]: E0310 08:12:22.372255 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b" containerName="extract-content" Mar 10 08:12:22 crc kubenswrapper[4825]: I0310 08:12:22.372261 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b" containerName="extract-content" Mar 10 08:12:22 crc kubenswrapper[4825]: E0310 08:12:22.372278 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9032c572-c243-4bb6-b65c-83017848cc2c" containerName="oc" Mar 10 08:12:22 crc kubenswrapper[4825]: I0310 08:12:22.372285 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="9032c572-c243-4bb6-b65c-83017848cc2c" containerName="oc" Mar 10 08:12:22 crc kubenswrapper[4825]: I0310 08:12:22.372429 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="3033dc36-d740-4fcb-b1bc-b1c4ecfcf95b" containerName="registry-server" Mar 10 08:12:22 crc kubenswrapper[4825]: I0310 08:12:22.372446 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="9032c572-c243-4bb6-b65c-83017848cc2c" containerName="oc" Mar 10 08:12:22 crc kubenswrapper[4825]: I0310 08:12:22.373014 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-krslz" Mar 10 08:12:22 crc kubenswrapper[4825]: I0310 08:12:22.383350 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-krslz"] Mar 10 08:12:22 crc kubenswrapper[4825]: I0310 08:12:22.505310 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7e5180d-ef78-40ef-9813-da06bc220bc9-operator-scripts\") pod \"barbican-db-create-krslz\" (UID: \"a7e5180d-ef78-40ef-9813-da06bc220bc9\") " pod="openstack/barbican-db-create-krslz" Mar 10 08:12:22 crc kubenswrapper[4825]: I0310 08:12:22.505370 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rqs5\" (UniqueName: \"kubernetes.io/projected/a7e5180d-ef78-40ef-9813-da06bc220bc9-kube-api-access-7rqs5\") pod \"barbican-db-create-krslz\" (UID: \"a7e5180d-ef78-40ef-9813-da06bc220bc9\") " pod="openstack/barbican-db-create-krslz" Mar 10 08:12:22 crc kubenswrapper[4825]: I0310 08:12:22.507570 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-25ac-account-create-update-hn6w9"] Mar 10 08:12:22 crc kubenswrapper[4825]: I0310 08:12:22.508850 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-25ac-account-create-update-hn6w9" Mar 10 08:12:22 crc kubenswrapper[4825]: I0310 08:12:22.510887 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 10 08:12:22 crc kubenswrapper[4825]: I0310 08:12:22.520786 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-25ac-account-create-update-hn6w9"] Mar 10 08:12:22 crc kubenswrapper[4825]: I0310 08:12:22.607074 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46c67\" (UniqueName: \"kubernetes.io/projected/d8e45726-486b-47db-8c50-1f7bf24c83dc-kube-api-access-46c67\") pod \"barbican-25ac-account-create-update-hn6w9\" (UID: \"d8e45726-486b-47db-8c50-1f7bf24c83dc\") " pod="openstack/barbican-25ac-account-create-update-hn6w9" Mar 10 08:12:22 crc kubenswrapper[4825]: I0310 08:12:22.607247 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7e5180d-ef78-40ef-9813-da06bc220bc9-operator-scripts\") pod \"barbican-db-create-krslz\" (UID: \"a7e5180d-ef78-40ef-9813-da06bc220bc9\") " pod="openstack/barbican-db-create-krslz" Mar 10 08:12:22 crc kubenswrapper[4825]: I0310 08:12:22.607282 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rqs5\" (UniqueName: \"kubernetes.io/projected/a7e5180d-ef78-40ef-9813-da06bc220bc9-kube-api-access-7rqs5\") pod \"barbican-db-create-krslz\" (UID: \"a7e5180d-ef78-40ef-9813-da06bc220bc9\") " pod="openstack/barbican-db-create-krslz" Mar 10 08:12:22 crc kubenswrapper[4825]: I0310 08:12:22.607324 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8e45726-486b-47db-8c50-1f7bf24c83dc-operator-scripts\") pod \"barbican-25ac-account-create-update-hn6w9\" (UID: 
\"d8e45726-486b-47db-8c50-1f7bf24c83dc\") " pod="openstack/barbican-25ac-account-create-update-hn6w9" Mar 10 08:12:22 crc kubenswrapper[4825]: I0310 08:12:22.608226 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7e5180d-ef78-40ef-9813-da06bc220bc9-operator-scripts\") pod \"barbican-db-create-krslz\" (UID: \"a7e5180d-ef78-40ef-9813-da06bc220bc9\") " pod="openstack/barbican-db-create-krslz" Mar 10 08:12:22 crc kubenswrapper[4825]: I0310 08:12:22.638899 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rqs5\" (UniqueName: \"kubernetes.io/projected/a7e5180d-ef78-40ef-9813-da06bc220bc9-kube-api-access-7rqs5\") pod \"barbican-db-create-krslz\" (UID: \"a7e5180d-ef78-40ef-9813-da06bc220bc9\") " pod="openstack/barbican-db-create-krslz" Mar 10 08:12:22 crc kubenswrapper[4825]: I0310 08:12:22.691808 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-krslz" Mar 10 08:12:22 crc kubenswrapper[4825]: I0310 08:12:22.709124 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46c67\" (UniqueName: \"kubernetes.io/projected/d8e45726-486b-47db-8c50-1f7bf24c83dc-kube-api-access-46c67\") pod \"barbican-25ac-account-create-update-hn6w9\" (UID: \"d8e45726-486b-47db-8c50-1f7bf24c83dc\") " pod="openstack/barbican-25ac-account-create-update-hn6w9" Mar 10 08:12:22 crc kubenswrapper[4825]: I0310 08:12:22.709634 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8e45726-486b-47db-8c50-1f7bf24c83dc-operator-scripts\") pod \"barbican-25ac-account-create-update-hn6w9\" (UID: \"d8e45726-486b-47db-8c50-1f7bf24c83dc\") " pod="openstack/barbican-25ac-account-create-update-hn6w9" Mar 10 08:12:22 crc kubenswrapper[4825]: I0310 08:12:22.710378 4825 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8e45726-486b-47db-8c50-1f7bf24c83dc-operator-scripts\") pod \"barbican-25ac-account-create-update-hn6w9\" (UID: \"d8e45726-486b-47db-8c50-1f7bf24c83dc\") " pod="openstack/barbican-25ac-account-create-update-hn6w9" Mar 10 08:12:22 crc kubenswrapper[4825]: I0310 08:12:22.727654 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46c67\" (UniqueName: \"kubernetes.io/projected/d8e45726-486b-47db-8c50-1f7bf24c83dc-kube-api-access-46c67\") pod \"barbican-25ac-account-create-update-hn6w9\" (UID: \"d8e45726-486b-47db-8c50-1f7bf24c83dc\") " pod="openstack/barbican-25ac-account-create-update-hn6w9" Mar 10 08:12:22 crc kubenswrapper[4825]: I0310 08:12:22.829247 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-25ac-account-create-update-hn6w9" Mar 10 08:12:23 crc kubenswrapper[4825]: I0310 08:12:23.119189 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-krslz"] Mar 10 08:12:23 crc kubenswrapper[4825]: I0310 08:12:23.262475 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-25ac-account-create-update-hn6w9"] Mar 10 08:12:23 crc kubenswrapper[4825]: W0310 08:12:23.267622 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8e45726_486b_47db_8c50_1f7bf24c83dc.slice/crio-b2dfac38641534fbee29dbeada3edc356b659bcf7599b187b37f1cdadf88c37d WatchSource:0}: Error finding container b2dfac38641534fbee29dbeada3edc356b659bcf7599b187b37f1cdadf88c37d: Status 404 returned error can't find the container with id b2dfac38641534fbee29dbeada3edc356b659bcf7599b187b37f1cdadf88c37d Mar 10 08:12:24 crc kubenswrapper[4825]: I0310 08:12:24.018809 4825 generic.go:334] "Generic (PLEG): container finished" podID="a7e5180d-ef78-40ef-9813-da06bc220bc9" 
containerID="458b51ca9dd944447c170a4ec933cff17927330de8315141fc288440a8bda21c" exitCode=0 Mar 10 08:12:24 crc kubenswrapper[4825]: I0310 08:12:24.019095 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-krslz" event={"ID":"a7e5180d-ef78-40ef-9813-da06bc220bc9","Type":"ContainerDied","Data":"458b51ca9dd944447c170a4ec933cff17927330de8315141fc288440a8bda21c"} Mar 10 08:12:24 crc kubenswrapper[4825]: I0310 08:12:24.019151 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-krslz" event={"ID":"a7e5180d-ef78-40ef-9813-da06bc220bc9","Type":"ContainerStarted","Data":"730b196232955da0d17278543bebd801f4162bcec7d86f12b11353ec1f2e0295"} Mar 10 08:12:24 crc kubenswrapper[4825]: I0310 08:12:24.020758 4825 generic.go:334] "Generic (PLEG): container finished" podID="d8e45726-486b-47db-8c50-1f7bf24c83dc" containerID="750740d74dc7ce86ef078cec1e5b2c8959a59bb22ae559a45faba7e7d535b367" exitCode=0 Mar 10 08:12:24 crc kubenswrapper[4825]: I0310 08:12:24.020790 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-25ac-account-create-update-hn6w9" event={"ID":"d8e45726-486b-47db-8c50-1f7bf24c83dc","Type":"ContainerDied","Data":"750740d74dc7ce86ef078cec1e5b2c8959a59bb22ae559a45faba7e7d535b367"} Mar 10 08:12:24 crc kubenswrapper[4825]: I0310 08:12:24.020808 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-25ac-account-create-update-hn6w9" event={"ID":"d8e45726-486b-47db-8c50-1f7bf24c83dc","Type":"ContainerStarted","Data":"b2dfac38641534fbee29dbeada3edc356b659bcf7599b187b37f1cdadf88c37d"} Mar 10 08:12:25 crc kubenswrapper[4825]: I0310 08:12:25.420478 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-25ac-account-create-update-hn6w9" Mar 10 08:12:25 crc kubenswrapper[4825]: I0310 08:12:25.427859 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-krslz" Mar 10 08:12:25 crc kubenswrapper[4825]: I0310 08:12:25.571098 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rqs5\" (UniqueName: \"kubernetes.io/projected/a7e5180d-ef78-40ef-9813-da06bc220bc9-kube-api-access-7rqs5\") pod \"a7e5180d-ef78-40ef-9813-da06bc220bc9\" (UID: \"a7e5180d-ef78-40ef-9813-da06bc220bc9\") " Mar 10 08:12:25 crc kubenswrapper[4825]: I0310 08:12:25.571216 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8e45726-486b-47db-8c50-1f7bf24c83dc-operator-scripts\") pod \"d8e45726-486b-47db-8c50-1f7bf24c83dc\" (UID: \"d8e45726-486b-47db-8c50-1f7bf24c83dc\") " Mar 10 08:12:25 crc kubenswrapper[4825]: I0310 08:12:25.571275 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46c67\" (UniqueName: \"kubernetes.io/projected/d8e45726-486b-47db-8c50-1f7bf24c83dc-kube-api-access-46c67\") pod \"d8e45726-486b-47db-8c50-1f7bf24c83dc\" (UID: \"d8e45726-486b-47db-8c50-1f7bf24c83dc\") " Mar 10 08:12:25 crc kubenswrapper[4825]: I0310 08:12:25.571362 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7e5180d-ef78-40ef-9813-da06bc220bc9-operator-scripts\") pod \"a7e5180d-ef78-40ef-9813-da06bc220bc9\" (UID: \"a7e5180d-ef78-40ef-9813-da06bc220bc9\") " Mar 10 08:12:25 crc kubenswrapper[4825]: I0310 08:12:25.571800 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8e45726-486b-47db-8c50-1f7bf24c83dc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d8e45726-486b-47db-8c50-1f7bf24c83dc" (UID: "d8e45726-486b-47db-8c50-1f7bf24c83dc"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:12:25 crc kubenswrapper[4825]: I0310 08:12:25.572244 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7e5180d-ef78-40ef-9813-da06bc220bc9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a7e5180d-ef78-40ef-9813-da06bc220bc9" (UID: "a7e5180d-ef78-40ef-9813-da06bc220bc9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:12:25 crc kubenswrapper[4825]: I0310 08:12:25.573287 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7e5180d-ef78-40ef-9813-da06bc220bc9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 08:12:25 crc kubenswrapper[4825]: I0310 08:12:25.573333 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8e45726-486b-47db-8c50-1f7bf24c83dc-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 08:12:25 crc kubenswrapper[4825]: I0310 08:12:25.576839 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8e45726-486b-47db-8c50-1f7bf24c83dc-kube-api-access-46c67" (OuterVolumeSpecName: "kube-api-access-46c67") pod "d8e45726-486b-47db-8c50-1f7bf24c83dc" (UID: "d8e45726-486b-47db-8c50-1f7bf24c83dc"). InnerVolumeSpecName "kube-api-access-46c67". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:12:25 crc kubenswrapper[4825]: I0310 08:12:25.579261 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7e5180d-ef78-40ef-9813-da06bc220bc9-kube-api-access-7rqs5" (OuterVolumeSpecName: "kube-api-access-7rqs5") pod "a7e5180d-ef78-40ef-9813-da06bc220bc9" (UID: "a7e5180d-ef78-40ef-9813-da06bc220bc9"). InnerVolumeSpecName "kube-api-access-7rqs5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:12:25 crc kubenswrapper[4825]: I0310 08:12:25.675212 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rqs5\" (UniqueName: \"kubernetes.io/projected/a7e5180d-ef78-40ef-9813-da06bc220bc9-kube-api-access-7rqs5\") on node \"crc\" DevicePath \"\"" Mar 10 08:12:25 crc kubenswrapper[4825]: I0310 08:12:25.675293 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46c67\" (UniqueName: \"kubernetes.io/projected/d8e45726-486b-47db-8c50-1f7bf24c83dc-kube-api-access-46c67\") on node \"crc\" DevicePath \"\"" Mar 10 08:12:26 crc kubenswrapper[4825]: I0310 08:12:26.039391 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-25ac-account-create-update-hn6w9" Mar 10 08:12:26 crc kubenswrapper[4825]: I0310 08:12:26.039374 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-25ac-account-create-update-hn6w9" event={"ID":"d8e45726-486b-47db-8c50-1f7bf24c83dc","Type":"ContainerDied","Data":"b2dfac38641534fbee29dbeada3edc356b659bcf7599b187b37f1cdadf88c37d"} Mar 10 08:12:26 crc kubenswrapper[4825]: I0310 08:12:26.039784 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2dfac38641534fbee29dbeada3edc356b659bcf7599b187b37f1cdadf88c37d" Mar 10 08:12:26 crc kubenswrapper[4825]: I0310 08:12:26.041511 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-krslz" event={"ID":"a7e5180d-ef78-40ef-9813-da06bc220bc9","Type":"ContainerDied","Data":"730b196232955da0d17278543bebd801f4162bcec7d86f12b11353ec1f2e0295"} Mar 10 08:12:26 crc kubenswrapper[4825]: I0310 08:12:26.041549 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="730b196232955da0d17278543bebd801f4162bcec7d86f12b11353ec1f2e0295" Mar 10 08:12:26 crc kubenswrapper[4825]: I0310 08:12:26.041564 4825 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/barbican-db-create-krslz" Mar 10 08:12:27 crc kubenswrapper[4825]: I0310 08:12:27.749765 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-m6vdt"] Mar 10 08:12:27 crc kubenswrapper[4825]: E0310 08:12:27.750159 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7e5180d-ef78-40ef-9813-da06bc220bc9" containerName="mariadb-database-create" Mar 10 08:12:27 crc kubenswrapper[4825]: I0310 08:12:27.750175 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7e5180d-ef78-40ef-9813-da06bc220bc9" containerName="mariadb-database-create" Mar 10 08:12:27 crc kubenswrapper[4825]: E0310 08:12:27.750189 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8e45726-486b-47db-8c50-1f7bf24c83dc" containerName="mariadb-account-create-update" Mar 10 08:12:27 crc kubenswrapper[4825]: I0310 08:12:27.750197 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8e45726-486b-47db-8c50-1f7bf24c83dc" containerName="mariadb-account-create-update" Mar 10 08:12:27 crc kubenswrapper[4825]: I0310 08:12:27.750428 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8e45726-486b-47db-8c50-1f7bf24c83dc" containerName="mariadb-account-create-update" Mar 10 08:12:27 crc kubenswrapper[4825]: I0310 08:12:27.750450 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7e5180d-ef78-40ef-9813-da06bc220bc9" containerName="mariadb-database-create" Mar 10 08:12:27 crc kubenswrapper[4825]: I0310 08:12:27.751053 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-m6vdt" Mar 10 08:12:27 crc kubenswrapper[4825]: I0310 08:12:27.755419 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-9s54j" Mar 10 08:12:27 crc kubenswrapper[4825]: I0310 08:12:27.755638 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 10 08:12:27 crc kubenswrapper[4825]: I0310 08:12:27.763547 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-m6vdt"] Mar 10 08:12:27 crc kubenswrapper[4825]: I0310 08:12:27.912027 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf14f62c-18f3-4696-abd2-ea72a47e6773-combined-ca-bundle\") pod \"barbican-db-sync-m6vdt\" (UID: \"bf14f62c-18f3-4696-abd2-ea72a47e6773\") " pod="openstack/barbican-db-sync-m6vdt" Mar 10 08:12:27 crc kubenswrapper[4825]: I0310 08:12:27.912168 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58pvq\" (UniqueName: \"kubernetes.io/projected/bf14f62c-18f3-4696-abd2-ea72a47e6773-kube-api-access-58pvq\") pod \"barbican-db-sync-m6vdt\" (UID: \"bf14f62c-18f3-4696-abd2-ea72a47e6773\") " pod="openstack/barbican-db-sync-m6vdt" Mar 10 08:12:27 crc kubenswrapper[4825]: I0310 08:12:27.912196 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bf14f62c-18f3-4696-abd2-ea72a47e6773-db-sync-config-data\") pod \"barbican-db-sync-m6vdt\" (UID: \"bf14f62c-18f3-4696-abd2-ea72a47e6773\") " pod="openstack/barbican-db-sync-m6vdt" Mar 10 08:12:28 crc kubenswrapper[4825]: I0310 08:12:28.014752 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58pvq\" (UniqueName: 
\"kubernetes.io/projected/bf14f62c-18f3-4696-abd2-ea72a47e6773-kube-api-access-58pvq\") pod \"barbican-db-sync-m6vdt\" (UID: \"bf14f62c-18f3-4696-abd2-ea72a47e6773\") " pod="openstack/barbican-db-sync-m6vdt" Mar 10 08:12:28 crc kubenswrapper[4825]: I0310 08:12:28.014844 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bf14f62c-18f3-4696-abd2-ea72a47e6773-db-sync-config-data\") pod \"barbican-db-sync-m6vdt\" (UID: \"bf14f62c-18f3-4696-abd2-ea72a47e6773\") " pod="openstack/barbican-db-sync-m6vdt" Mar 10 08:12:28 crc kubenswrapper[4825]: I0310 08:12:28.014962 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf14f62c-18f3-4696-abd2-ea72a47e6773-combined-ca-bundle\") pod \"barbican-db-sync-m6vdt\" (UID: \"bf14f62c-18f3-4696-abd2-ea72a47e6773\") " pod="openstack/barbican-db-sync-m6vdt" Mar 10 08:12:28 crc kubenswrapper[4825]: I0310 08:12:28.022045 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf14f62c-18f3-4696-abd2-ea72a47e6773-combined-ca-bundle\") pod \"barbican-db-sync-m6vdt\" (UID: \"bf14f62c-18f3-4696-abd2-ea72a47e6773\") " pod="openstack/barbican-db-sync-m6vdt" Mar 10 08:12:28 crc kubenswrapper[4825]: I0310 08:12:28.035266 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bf14f62c-18f3-4696-abd2-ea72a47e6773-db-sync-config-data\") pod \"barbican-db-sync-m6vdt\" (UID: \"bf14f62c-18f3-4696-abd2-ea72a47e6773\") " pod="openstack/barbican-db-sync-m6vdt" Mar 10 08:12:28 crc kubenswrapper[4825]: I0310 08:12:28.036351 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58pvq\" (UniqueName: \"kubernetes.io/projected/bf14f62c-18f3-4696-abd2-ea72a47e6773-kube-api-access-58pvq\") pod 
\"barbican-db-sync-m6vdt\" (UID: \"bf14f62c-18f3-4696-abd2-ea72a47e6773\") " pod="openstack/barbican-db-sync-m6vdt" Mar 10 08:12:28 crc kubenswrapper[4825]: I0310 08:12:28.070977 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-m6vdt" Mar 10 08:12:28 crc kubenswrapper[4825]: I0310 08:12:28.561774 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-m6vdt"] Mar 10 08:12:29 crc kubenswrapper[4825]: I0310 08:12:29.067811 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-m6vdt" event={"ID":"bf14f62c-18f3-4696-abd2-ea72a47e6773","Type":"ContainerStarted","Data":"dfe6d32b8b0d8ef1fc33e8b139619694d4a07fcbe8f74b0174e72afaf2cba944"} Mar 10 08:12:34 crc kubenswrapper[4825]: I0310 08:12:34.104122 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-m6vdt" event={"ID":"bf14f62c-18f3-4696-abd2-ea72a47e6773","Type":"ContainerStarted","Data":"0aac67a9c34adc135dd6601faadaecb8b5564bc2c31f1a368b7f6aa72a870b3a"} Mar 10 08:12:34 crc kubenswrapper[4825]: I0310 08:12:34.139042 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-m6vdt" podStartSLOduration=2.456730716 podStartE2EDuration="7.139020271s" podCreationTimestamp="2026-03-10 08:12:27 +0000 UTC" firstStartedPulling="2026-03-10 08:12:28.574249194 +0000 UTC m=+5301.604029809" lastFinishedPulling="2026-03-10 08:12:33.256538749 +0000 UTC m=+5306.286319364" observedRunningTime="2026-03-10 08:12:34.132096219 +0000 UTC m=+5307.161876844" watchObservedRunningTime="2026-03-10 08:12:34.139020271 +0000 UTC m=+5307.168800886" Mar 10 08:12:35 crc kubenswrapper[4825]: I0310 08:12:35.122443 4825 generic.go:334] "Generic (PLEG): container finished" podID="bf14f62c-18f3-4696-abd2-ea72a47e6773" containerID="0aac67a9c34adc135dd6601faadaecb8b5564bc2c31f1a368b7f6aa72a870b3a" exitCode=0 Mar 10 08:12:35 crc kubenswrapper[4825]: I0310 
08:12:35.122526 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-m6vdt" event={"ID":"bf14f62c-18f3-4696-abd2-ea72a47e6773","Type":"ContainerDied","Data":"0aac67a9c34adc135dd6601faadaecb8b5564bc2c31f1a368b7f6aa72a870b3a"} Mar 10 08:12:36 crc kubenswrapper[4825]: I0310 08:12:36.522192 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-m6vdt" Mar 10 08:12:36 crc kubenswrapper[4825]: I0310 08:12:36.698841 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bf14f62c-18f3-4696-abd2-ea72a47e6773-db-sync-config-data\") pod \"bf14f62c-18f3-4696-abd2-ea72a47e6773\" (UID: \"bf14f62c-18f3-4696-abd2-ea72a47e6773\") " Mar 10 08:12:36 crc kubenswrapper[4825]: I0310 08:12:36.698991 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf14f62c-18f3-4696-abd2-ea72a47e6773-combined-ca-bundle\") pod \"bf14f62c-18f3-4696-abd2-ea72a47e6773\" (UID: \"bf14f62c-18f3-4696-abd2-ea72a47e6773\") " Mar 10 08:12:36 crc kubenswrapper[4825]: I0310 08:12:36.699211 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58pvq\" (UniqueName: \"kubernetes.io/projected/bf14f62c-18f3-4696-abd2-ea72a47e6773-kube-api-access-58pvq\") pod \"bf14f62c-18f3-4696-abd2-ea72a47e6773\" (UID: \"bf14f62c-18f3-4696-abd2-ea72a47e6773\") " Mar 10 08:12:36 crc kubenswrapper[4825]: I0310 08:12:36.705836 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf14f62c-18f3-4696-abd2-ea72a47e6773-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "bf14f62c-18f3-4696-abd2-ea72a47e6773" (UID: "bf14f62c-18f3-4696-abd2-ea72a47e6773"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:12:36 crc kubenswrapper[4825]: I0310 08:12:36.707120 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf14f62c-18f3-4696-abd2-ea72a47e6773-kube-api-access-58pvq" (OuterVolumeSpecName: "kube-api-access-58pvq") pod "bf14f62c-18f3-4696-abd2-ea72a47e6773" (UID: "bf14f62c-18f3-4696-abd2-ea72a47e6773"). InnerVolumeSpecName "kube-api-access-58pvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:12:36 crc kubenswrapper[4825]: I0310 08:12:36.731470 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf14f62c-18f3-4696-abd2-ea72a47e6773-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf14f62c-18f3-4696-abd2-ea72a47e6773" (UID: "bf14f62c-18f3-4696-abd2-ea72a47e6773"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:12:36 crc kubenswrapper[4825]: I0310 08:12:36.801526 4825 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bf14f62c-18f3-4696-abd2-ea72a47e6773-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 08:12:36 crc kubenswrapper[4825]: I0310 08:12:36.801580 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf14f62c-18f3-4696-abd2-ea72a47e6773-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:12:36 crc kubenswrapper[4825]: I0310 08:12:36.801601 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58pvq\" (UniqueName: \"kubernetes.io/projected/bf14f62c-18f3-4696-abd2-ea72a47e6773-kube-api-access-58pvq\") on node \"crc\" DevicePath \"\"" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.144698 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-m6vdt" 
event={"ID":"bf14f62c-18f3-4696-abd2-ea72a47e6773","Type":"ContainerDied","Data":"dfe6d32b8b0d8ef1fc33e8b139619694d4a07fcbe8f74b0174e72afaf2cba944"} Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.144742 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfe6d32b8b0d8ef1fc33e8b139619694d4a07fcbe8f74b0174e72afaf2cba944" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.144751 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-m6vdt" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.414276 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-644cfb5dc8-ldc2h"] Mar 10 08:12:37 crc kubenswrapper[4825]: E0310 08:12:37.414673 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf14f62c-18f3-4696-abd2-ea72a47e6773" containerName="barbican-db-sync" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.414695 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf14f62c-18f3-4696-abd2-ea72a47e6773" containerName="barbican-db-sync" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.414864 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf14f62c-18f3-4696-abd2-ea72a47e6773" containerName="barbican-db-sync" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.415692 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-644cfb5dc8-ldc2h" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.424944 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.425834 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-9s54j" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.427089 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-644cfb5dc8-ldc2h"] Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.433426 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.464113 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7c98ff7-wbdq9"] Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.465450 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7c98ff7-wbdq9" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.469171 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.490521 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7c98ff7-wbdq9"] Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.513476 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ac2019c-71a9-4006-b1e3-a3d5104834c7-combined-ca-bundle\") pod \"barbican-keystone-listener-644cfb5dc8-ldc2h\" (UID: \"0ac2019c-71a9-4006-b1e3-a3d5104834c7\") " pod="openstack/barbican-keystone-listener-644cfb5dc8-ldc2h" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.513592 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ac2019c-71a9-4006-b1e3-a3d5104834c7-config-data\") pod \"barbican-keystone-listener-644cfb5dc8-ldc2h\" (UID: \"0ac2019c-71a9-4006-b1e3-a3d5104834c7\") " pod="openstack/barbican-keystone-listener-644cfb5dc8-ldc2h" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.513625 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ac2019c-71a9-4006-b1e3-a3d5104834c7-logs\") pod \"barbican-keystone-listener-644cfb5dc8-ldc2h\" (UID: \"0ac2019c-71a9-4006-b1e3-a3d5104834c7\") " pod="openstack/barbican-keystone-listener-644cfb5dc8-ldc2h" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.513647 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt4c4\" (UniqueName: \"kubernetes.io/projected/0ac2019c-71a9-4006-b1e3-a3d5104834c7-kube-api-access-tt4c4\") pod 
\"barbican-keystone-listener-644cfb5dc8-ldc2h\" (UID: \"0ac2019c-71a9-4006-b1e3-a3d5104834c7\") " pod="openstack/barbican-keystone-listener-644cfb5dc8-ldc2h" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.513702 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ac2019c-71a9-4006-b1e3-a3d5104834c7-config-data-custom\") pod \"barbican-keystone-listener-644cfb5dc8-ldc2h\" (UID: \"0ac2019c-71a9-4006-b1e3-a3d5104834c7\") " pod="openstack/barbican-keystone-listener-644cfb5dc8-ldc2h" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.578703 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5645ccfdc5-zldk4"] Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.579981 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5645ccfdc5-zldk4" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.602155 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5645ccfdc5-zldk4"] Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.614810 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/057a22f7-8752-4d5b-b896-5e1c5b46ce91-logs\") pod \"barbican-worker-7c98ff7-wbdq9\" (UID: \"057a22f7-8752-4d5b-b896-5e1c5b46ce91\") " pod="openstack/barbican-worker-7c98ff7-wbdq9" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.614890 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ac2019c-71a9-4006-b1e3-a3d5104834c7-config-data-custom\") pod \"barbican-keystone-listener-644cfb5dc8-ldc2h\" (UID: \"0ac2019c-71a9-4006-b1e3-a3d5104834c7\") " pod="openstack/barbican-keystone-listener-644cfb5dc8-ldc2h" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.614916 4825 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/057a22f7-8752-4d5b-b896-5e1c5b46ce91-config-data\") pod \"barbican-worker-7c98ff7-wbdq9\" (UID: \"057a22f7-8752-4d5b-b896-5e1c5b46ce91\") " pod="openstack/barbican-worker-7c98ff7-wbdq9" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.614942 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ac2019c-71a9-4006-b1e3-a3d5104834c7-combined-ca-bundle\") pod \"barbican-keystone-listener-644cfb5dc8-ldc2h\" (UID: \"0ac2019c-71a9-4006-b1e3-a3d5104834c7\") " pod="openstack/barbican-keystone-listener-644cfb5dc8-ldc2h" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.614972 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/057a22f7-8752-4d5b-b896-5e1c5b46ce91-config-data-custom\") pod \"barbican-worker-7c98ff7-wbdq9\" (UID: \"057a22f7-8752-4d5b-b896-5e1c5b46ce91\") " pod="openstack/barbican-worker-7c98ff7-wbdq9" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.615034 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ac2019c-71a9-4006-b1e3-a3d5104834c7-config-data\") pod \"barbican-keystone-listener-644cfb5dc8-ldc2h\" (UID: \"0ac2019c-71a9-4006-b1e3-a3d5104834c7\") " pod="openstack/barbican-keystone-listener-644cfb5dc8-ldc2h" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.615052 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/057a22f7-8752-4d5b-b896-5e1c5b46ce91-combined-ca-bundle\") pod \"barbican-worker-7c98ff7-wbdq9\" (UID: \"057a22f7-8752-4d5b-b896-5e1c5b46ce91\") " pod="openstack/barbican-worker-7c98ff7-wbdq9" 
Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.615072 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ac2019c-71a9-4006-b1e3-a3d5104834c7-logs\") pod \"barbican-keystone-listener-644cfb5dc8-ldc2h\" (UID: \"0ac2019c-71a9-4006-b1e3-a3d5104834c7\") " pod="openstack/barbican-keystone-listener-644cfb5dc8-ldc2h" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.615101 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt4c4\" (UniqueName: \"kubernetes.io/projected/0ac2019c-71a9-4006-b1e3-a3d5104834c7-kube-api-access-tt4c4\") pod \"barbican-keystone-listener-644cfb5dc8-ldc2h\" (UID: \"0ac2019c-71a9-4006-b1e3-a3d5104834c7\") " pod="openstack/barbican-keystone-listener-644cfb5dc8-ldc2h" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.615121 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnhk7\" (UniqueName: \"kubernetes.io/projected/057a22f7-8752-4d5b-b896-5e1c5b46ce91-kube-api-access-hnhk7\") pod \"barbican-worker-7c98ff7-wbdq9\" (UID: \"057a22f7-8752-4d5b-b896-5e1c5b46ce91\") " pod="openstack/barbican-worker-7c98ff7-wbdq9" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.616113 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ac2019c-71a9-4006-b1e3-a3d5104834c7-logs\") pod \"barbican-keystone-listener-644cfb5dc8-ldc2h\" (UID: \"0ac2019c-71a9-4006-b1e3-a3d5104834c7\") " pod="openstack/barbican-keystone-listener-644cfb5dc8-ldc2h" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.621075 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ac2019c-71a9-4006-b1e3-a3d5104834c7-combined-ca-bundle\") pod \"barbican-keystone-listener-644cfb5dc8-ldc2h\" (UID: 
\"0ac2019c-71a9-4006-b1e3-a3d5104834c7\") " pod="openstack/barbican-keystone-listener-644cfb5dc8-ldc2h" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.623491 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ac2019c-71a9-4006-b1e3-a3d5104834c7-config-data\") pod \"barbican-keystone-listener-644cfb5dc8-ldc2h\" (UID: \"0ac2019c-71a9-4006-b1e3-a3d5104834c7\") " pod="openstack/barbican-keystone-listener-644cfb5dc8-ldc2h" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.625900 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ac2019c-71a9-4006-b1e3-a3d5104834c7-config-data-custom\") pod \"barbican-keystone-listener-644cfb5dc8-ldc2h\" (UID: \"0ac2019c-71a9-4006-b1e3-a3d5104834c7\") " pod="openstack/barbican-keystone-listener-644cfb5dc8-ldc2h" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.651784 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt4c4\" (UniqueName: \"kubernetes.io/projected/0ac2019c-71a9-4006-b1e3-a3d5104834c7-kube-api-access-tt4c4\") pod \"barbican-keystone-listener-644cfb5dc8-ldc2h\" (UID: \"0ac2019c-71a9-4006-b1e3-a3d5104834c7\") " pod="openstack/barbican-keystone-listener-644cfb5dc8-ldc2h" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.715002 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6db664db48-4nf5z"] Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.716284 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/057a22f7-8752-4d5b-b896-5e1c5b46ce91-config-data-custom\") pod \"barbican-worker-7c98ff7-wbdq9\" (UID: \"057a22f7-8752-4d5b-b896-5e1c5b46ce91\") " pod="openstack/barbican-worker-7c98ff7-wbdq9" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.716345 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec60a834-532a-45ec-ac78-cc5954ab9d85-dns-svc\") pod \"dnsmasq-dns-5645ccfdc5-zldk4\" (UID: \"ec60a834-532a-45ec-ac78-cc5954ab9d85\") " pod="openstack/dnsmasq-dns-5645ccfdc5-zldk4" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.716382 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec60a834-532a-45ec-ac78-cc5954ab9d85-config\") pod \"dnsmasq-dns-5645ccfdc5-zldk4\" (UID: \"ec60a834-532a-45ec-ac78-cc5954ab9d85\") " pod="openstack/dnsmasq-dns-5645ccfdc5-zldk4" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.716410 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/057a22f7-8752-4d5b-b896-5e1c5b46ce91-combined-ca-bundle\") pod \"barbican-worker-7c98ff7-wbdq9\" (UID: \"057a22f7-8752-4d5b-b896-5e1c5b46ce91\") " pod="openstack/barbican-worker-7c98ff7-wbdq9" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.716434 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec60a834-532a-45ec-ac78-cc5954ab9d85-ovsdbserver-sb\") pod \"dnsmasq-dns-5645ccfdc5-zldk4\" (UID: \"ec60a834-532a-45ec-ac78-cc5954ab9d85\") " pod="openstack/dnsmasq-dns-5645ccfdc5-zldk4" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.716465 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnhk7\" (UniqueName: \"kubernetes.io/projected/057a22f7-8752-4d5b-b896-5e1c5b46ce91-kube-api-access-hnhk7\") pod \"barbican-worker-7c98ff7-wbdq9\" (UID: \"057a22f7-8752-4d5b-b896-5e1c5b46ce91\") " pod="openstack/barbican-worker-7c98ff7-wbdq9" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.716512 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/057a22f7-8752-4d5b-b896-5e1c5b46ce91-logs\") pod \"barbican-worker-7c98ff7-wbdq9\" (UID: \"057a22f7-8752-4d5b-b896-5e1c5b46ce91\") " pod="openstack/barbican-worker-7c98ff7-wbdq9" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.716536 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr5d7\" (UniqueName: \"kubernetes.io/projected/ec60a834-532a-45ec-ac78-cc5954ab9d85-kube-api-access-nr5d7\") pod \"dnsmasq-dns-5645ccfdc5-zldk4\" (UID: \"ec60a834-532a-45ec-ac78-cc5954ab9d85\") " pod="openstack/dnsmasq-dns-5645ccfdc5-zldk4" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.716560 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec60a834-532a-45ec-ac78-cc5954ab9d85-ovsdbserver-nb\") pod \"dnsmasq-dns-5645ccfdc5-zldk4\" (UID: \"ec60a834-532a-45ec-ac78-cc5954ab9d85\") " pod="openstack/dnsmasq-dns-5645ccfdc5-zldk4" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.716611 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/057a22f7-8752-4d5b-b896-5e1c5b46ce91-config-data\") pod \"barbican-worker-7c98ff7-wbdq9\" (UID: \"057a22f7-8752-4d5b-b896-5e1c5b46ce91\") " pod="openstack/barbican-worker-7c98ff7-wbdq9" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.716346 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6db664db48-4nf5z" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.717147 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/057a22f7-8752-4d5b-b896-5e1c5b46ce91-logs\") pod \"barbican-worker-7c98ff7-wbdq9\" (UID: \"057a22f7-8752-4d5b-b896-5e1c5b46ce91\") " pod="openstack/barbican-worker-7c98ff7-wbdq9" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.719512 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/057a22f7-8752-4d5b-b896-5e1c5b46ce91-combined-ca-bundle\") pod \"barbican-worker-7c98ff7-wbdq9\" (UID: \"057a22f7-8752-4d5b-b896-5e1c5b46ce91\") " pod="openstack/barbican-worker-7c98ff7-wbdq9" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.721297 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.722821 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/057a22f7-8752-4d5b-b896-5e1c5b46ce91-config-data-custom\") pod \"barbican-worker-7c98ff7-wbdq9\" (UID: \"057a22f7-8752-4d5b-b896-5e1c5b46ce91\") " pod="openstack/barbican-worker-7c98ff7-wbdq9" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.724210 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/057a22f7-8752-4d5b-b896-5e1c5b46ce91-config-data\") pod \"barbican-worker-7c98ff7-wbdq9\" (UID: \"057a22f7-8752-4d5b-b896-5e1c5b46ce91\") " pod="openstack/barbican-worker-7c98ff7-wbdq9" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.738329 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnhk7\" (UniqueName: 
\"kubernetes.io/projected/057a22f7-8752-4d5b-b896-5e1c5b46ce91-kube-api-access-hnhk7\") pod \"barbican-worker-7c98ff7-wbdq9\" (UID: \"057a22f7-8752-4d5b-b896-5e1c5b46ce91\") " pod="openstack/barbican-worker-7c98ff7-wbdq9" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.740010 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-644cfb5dc8-ldc2h" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.742194 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6db664db48-4nf5z"] Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.789554 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7c98ff7-wbdq9" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.819603 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec60a834-532a-45ec-ac78-cc5954ab9d85-dns-svc\") pod \"dnsmasq-dns-5645ccfdc5-zldk4\" (UID: \"ec60a834-532a-45ec-ac78-cc5954ab9d85\") " pod="openstack/dnsmasq-dns-5645ccfdc5-zldk4" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.819679 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpfrl\" (UniqueName: \"kubernetes.io/projected/2502b2fe-3275-49f1-ac6e-b0c73c16527b-kube-api-access-wpfrl\") pod \"barbican-api-6db664db48-4nf5z\" (UID: \"2502b2fe-3275-49f1-ac6e-b0c73c16527b\") " pod="openstack/barbican-api-6db664db48-4nf5z" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.819718 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec60a834-532a-45ec-ac78-cc5954ab9d85-config\") pod \"dnsmasq-dns-5645ccfdc5-zldk4\" (UID: \"ec60a834-532a-45ec-ac78-cc5954ab9d85\") " pod="openstack/dnsmasq-dns-5645ccfdc5-zldk4" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 
08:12:37.819761 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2502b2fe-3275-49f1-ac6e-b0c73c16527b-config-data-custom\") pod \"barbican-api-6db664db48-4nf5z\" (UID: \"2502b2fe-3275-49f1-ac6e-b0c73c16527b\") " pod="openstack/barbican-api-6db664db48-4nf5z" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.819790 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2502b2fe-3275-49f1-ac6e-b0c73c16527b-config-data\") pod \"barbican-api-6db664db48-4nf5z\" (UID: \"2502b2fe-3275-49f1-ac6e-b0c73c16527b\") " pod="openstack/barbican-api-6db664db48-4nf5z" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.819822 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec60a834-532a-45ec-ac78-cc5954ab9d85-ovsdbserver-sb\") pod \"dnsmasq-dns-5645ccfdc5-zldk4\" (UID: \"ec60a834-532a-45ec-ac78-cc5954ab9d85\") " pod="openstack/dnsmasq-dns-5645ccfdc5-zldk4" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.819862 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2502b2fe-3275-49f1-ac6e-b0c73c16527b-combined-ca-bundle\") pod \"barbican-api-6db664db48-4nf5z\" (UID: \"2502b2fe-3275-49f1-ac6e-b0c73c16527b\") " pod="openstack/barbican-api-6db664db48-4nf5z" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.819906 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nr5d7\" (UniqueName: \"kubernetes.io/projected/ec60a834-532a-45ec-ac78-cc5954ab9d85-kube-api-access-nr5d7\") pod \"dnsmasq-dns-5645ccfdc5-zldk4\" (UID: \"ec60a834-532a-45ec-ac78-cc5954ab9d85\") " pod="openstack/dnsmasq-dns-5645ccfdc5-zldk4" Mar 10 08:12:37 crc 
kubenswrapper[4825]: I0310 08:12:37.819930 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec60a834-532a-45ec-ac78-cc5954ab9d85-ovsdbserver-nb\") pod \"dnsmasq-dns-5645ccfdc5-zldk4\" (UID: \"ec60a834-532a-45ec-ac78-cc5954ab9d85\") " pod="openstack/dnsmasq-dns-5645ccfdc5-zldk4" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.820001 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2502b2fe-3275-49f1-ac6e-b0c73c16527b-logs\") pod \"barbican-api-6db664db48-4nf5z\" (UID: \"2502b2fe-3275-49f1-ac6e-b0c73c16527b\") " pod="openstack/barbican-api-6db664db48-4nf5z" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.822311 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec60a834-532a-45ec-ac78-cc5954ab9d85-config\") pod \"dnsmasq-dns-5645ccfdc5-zldk4\" (UID: \"ec60a834-532a-45ec-ac78-cc5954ab9d85\") " pod="openstack/dnsmasq-dns-5645ccfdc5-zldk4" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.824206 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec60a834-532a-45ec-ac78-cc5954ab9d85-dns-svc\") pod \"dnsmasq-dns-5645ccfdc5-zldk4\" (UID: \"ec60a834-532a-45ec-ac78-cc5954ab9d85\") " pod="openstack/dnsmasq-dns-5645ccfdc5-zldk4" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.825022 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec60a834-532a-45ec-ac78-cc5954ab9d85-ovsdbserver-sb\") pod \"dnsmasq-dns-5645ccfdc5-zldk4\" (UID: \"ec60a834-532a-45ec-ac78-cc5954ab9d85\") " pod="openstack/dnsmasq-dns-5645ccfdc5-zldk4" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.826453 4825 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec60a834-532a-45ec-ac78-cc5954ab9d85-ovsdbserver-nb\") pod \"dnsmasq-dns-5645ccfdc5-zldk4\" (UID: \"ec60a834-532a-45ec-ac78-cc5954ab9d85\") " pod="openstack/dnsmasq-dns-5645ccfdc5-zldk4" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.845189 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr5d7\" (UniqueName: \"kubernetes.io/projected/ec60a834-532a-45ec-ac78-cc5954ab9d85-kube-api-access-nr5d7\") pod \"dnsmasq-dns-5645ccfdc5-zldk4\" (UID: \"ec60a834-532a-45ec-ac78-cc5954ab9d85\") " pod="openstack/dnsmasq-dns-5645ccfdc5-zldk4" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.906349 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5645ccfdc5-zldk4" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.936673 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpfrl\" (UniqueName: \"kubernetes.io/projected/2502b2fe-3275-49f1-ac6e-b0c73c16527b-kube-api-access-wpfrl\") pod \"barbican-api-6db664db48-4nf5z\" (UID: \"2502b2fe-3275-49f1-ac6e-b0c73c16527b\") " pod="openstack/barbican-api-6db664db48-4nf5z" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.936824 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2502b2fe-3275-49f1-ac6e-b0c73c16527b-config-data-custom\") pod \"barbican-api-6db664db48-4nf5z\" (UID: \"2502b2fe-3275-49f1-ac6e-b0c73c16527b\") " pod="openstack/barbican-api-6db664db48-4nf5z" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.936885 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2502b2fe-3275-49f1-ac6e-b0c73c16527b-config-data\") pod \"barbican-api-6db664db48-4nf5z\" (UID: \"2502b2fe-3275-49f1-ac6e-b0c73c16527b\") " 
pod="openstack/barbican-api-6db664db48-4nf5z" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.936964 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2502b2fe-3275-49f1-ac6e-b0c73c16527b-combined-ca-bundle\") pod \"barbican-api-6db664db48-4nf5z\" (UID: \"2502b2fe-3275-49f1-ac6e-b0c73c16527b\") " pod="openstack/barbican-api-6db664db48-4nf5z" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.937170 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2502b2fe-3275-49f1-ac6e-b0c73c16527b-logs\") pod \"barbican-api-6db664db48-4nf5z\" (UID: \"2502b2fe-3275-49f1-ac6e-b0c73c16527b\") " pod="openstack/barbican-api-6db664db48-4nf5z" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.938186 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2502b2fe-3275-49f1-ac6e-b0c73c16527b-logs\") pod \"barbican-api-6db664db48-4nf5z\" (UID: \"2502b2fe-3275-49f1-ac6e-b0c73c16527b\") " pod="openstack/barbican-api-6db664db48-4nf5z" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.946356 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2502b2fe-3275-49f1-ac6e-b0c73c16527b-combined-ca-bundle\") pod \"barbican-api-6db664db48-4nf5z\" (UID: \"2502b2fe-3275-49f1-ac6e-b0c73c16527b\") " pod="openstack/barbican-api-6db664db48-4nf5z" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.956729 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2502b2fe-3275-49f1-ac6e-b0c73c16527b-config-data\") pod \"barbican-api-6db664db48-4nf5z\" (UID: \"2502b2fe-3275-49f1-ac6e-b0c73c16527b\") " pod="openstack/barbican-api-6db664db48-4nf5z" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 
08:12:37.958994 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpfrl\" (UniqueName: \"kubernetes.io/projected/2502b2fe-3275-49f1-ac6e-b0c73c16527b-kube-api-access-wpfrl\") pod \"barbican-api-6db664db48-4nf5z\" (UID: \"2502b2fe-3275-49f1-ac6e-b0c73c16527b\") " pod="openstack/barbican-api-6db664db48-4nf5z" Mar 10 08:12:37 crc kubenswrapper[4825]: I0310 08:12:37.970261 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2502b2fe-3275-49f1-ac6e-b0c73c16527b-config-data-custom\") pod \"barbican-api-6db664db48-4nf5z\" (UID: \"2502b2fe-3275-49f1-ac6e-b0c73c16527b\") " pod="openstack/barbican-api-6db664db48-4nf5z" Mar 10 08:12:38 crc kubenswrapper[4825]: I0310 08:12:38.152572 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6db664db48-4nf5z" Mar 10 08:12:38 crc kubenswrapper[4825]: I0310 08:12:38.307634 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-644cfb5dc8-ldc2h"] Mar 10 08:12:38 crc kubenswrapper[4825]: I0310 08:12:38.419851 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7c98ff7-wbdq9"] Mar 10 08:12:38 crc kubenswrapper[4825]: W0310 08:12:38.424684 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod057a22f7_8752_4d5b_b896_5e1c5b46ce91.slice/crio-d54cc2a148451436e95dc44aeabaaf03bd2c1026ea90e96c46c24ed0ac0d868a WatchSource:0}: Error finding container d54cc2a148451436e95dc44aeabaaf03bd2c1026ea90e96c46c24ed0ac0d868a: Status 404 returned error can't find the container with id d54cc2a148451436e95dc44aeabaaf03bd2c1026ea90e96c46c24ed0ac0d868a Mar 10 08:12:38 crc kubenswrapper[4825]: I0310 08:12:38.534903 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5645ccfdc5-zldk4"] Mar 10 08:12:38 crc 
kubenswrapper[4825]: W0310 08:12:38.548861 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec60a834_532a_45ec_ac78_cc5954ab9d85.slice/crio-03e84782fb95fa0b4284c76df0f0b6b92fa763801f20ac45d1137cd69cc6fccd WatchSource:0}: Error finding container 03e84782fb95fa0b4284c76df0f0b6b92fa763801f20ac45d1137cd69cc6fccd: Status 404 returned error can't find the container with id 03e84782fb95fa0b4284c76df0f0b6b92fa763801f20ac45d1137cd69cc6fccd Mar 10 08:12:38 crc kubenswrapper[4825]: W0310 08:12:38.665931 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2502b2fe_3275_49f1_ac6e_b0c73c16527b.slice/crio-7dc1d47991806d00e433b972440fe2509810e8d2d91bedc310b8ae9390ed02d8 WatchSource:0}: Error finding container 7dc1d47991806d00e433b972440fe2509810e8d2d91bedc310b8ae9390ed02d8: Status 404 returned error can't find the container with id 7dc1d47991806d00e433b972440fe2509810e8d2d91bedc310b8ae9390ed02d8 Mar 10 08:12:38 crc kubenswrapper[4825]: I0310 08:12:38.671522 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6db664db48-4nf5z"] Mar 10 08:12:39 crc kubenswrapper[4825]: I0310 08:12:39.165317 4825 generic.go:334] "Generic (PLEG): container finished" podID="ec60a834-532a-45ec-ac78-cc5954ab9d85" containerID="d3c45a92f19c624046e763329cbd5050855a56ea250689eeda7c7e8c21529770" exitCode=0 Mar 10 08:12:39 crc kubenswrapper[4825]: I0310 08:12:39.165511 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5645ccfdc5-zldk4" event={"ID":"ec60a834-532a-45ec-ac78-cc5954ab9d85","Type":"ContainerDied","Data":"d3c45a92f19c624046e763329cbd5050855a56ea250689eeda7c7e8c21529770"} Mar 10 08:12:39 crc kubenswrapper[4825]: I0310 08:12:39.165643 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5645ccfdc5-zldk4" 
event={"ID":"ec60a834-532a-45ec-ac78-cc5954ab9d85","Type":"ContainerStarted","Data":"03e84782fb95fa0b4284c76df0f0b6b92fa763801f20ac45d1137cd69cc6fccd"} Mar 10 08:12:39 crc kubenswrapper[4825]: I0310 08:12:39.169281 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-644cfb5dc8-ldc2h" event={"ID":"0ac2019c-71a9-4006-b1e3-a3d5104834c7","Type":"ContainerStarted","Data":"23f55bf669b2a1d41fc5d45c857d8ab6649fd0b12022fb3eaab38820b3bae8ae"} Mar 10 08:12:39 crc kubenswrapper[4825]: I0310 08:12:39.171751 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6db664db48-4nf5z" event={"ID":"2502b2fe-3275-49f1-ac6e-b0c73c16527b","Type":"ContainerStarted","Data":"2befe22d78227385d4ced2cea60fea7daf7f6d482311839da6bd0ed2c479f3f2"} Mar 10 08:12:39 crc kubenswrapper[4825]: I0310 08:12:39.171779 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6db664db48-4nf5z" event={"ID":"2502b2fe-3275-49f1-ac6e-b0c73c16527b","Type":"ContainerStarted","Data":"a3444f4a0618ed86dacf9d0ccb42a1d1e892b5122680e604b3f9a40553ee5948"} Mar 10 08:12:39 crc kubenswrapper[4825]: I0310 08:12:39.171788 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6db664db48-4nf5z" event={"ID":"2502b2fe-3275-49f1-ac6e-b0c73c16527b","Type":"ContainerStarted","Data":"7dc1d47991806d00e433b972440fe2509810e8d2d91bedc310b8ae9390ed02d8"} Mar 10 08:12:39 crc kubenswrapper[4825]: I0310 08:12:39.172416 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6db664db48-4nf5z" Mar 10 08:12:39 crc kubenswrapper[4825]: I0310 08:12:39.172435 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6db664db48-4nf5z" Mar 10 08:12:39 crc kubenswrapper[4825]: I0310 08:12:39.176506 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7c98ff7-wbdq9" 
event={"ID":"057a22f7-8752-4d5b-b896-5e1c5b46ce91","Type":"ContainerStarted","Data":"d54cc2a148451436e95dc44aeabaaf03bd2c1026ea90e96c46c24ed0ac0d868a"} Mar 10 08:12:39 crc kubenswrapper[4825]: I0310 08:12:39.207588 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6db664db48-4nf5z" podStartSLOduration=2.207558893 podStartE2EDuration="2.207558893s" podCreationTimestamp="2026-03-10 08:12:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:12:39.206583387 +0000 UTC m=+5312.236364002" watchObservedRunningTime="2026-03-10 08:12:39.207558893 +0000 UTC m=+5312.237339518" Mar 10 08:12:40 crc kubenswrapper[4825]: I0310 08:12:40.184932 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5645ccfdc5-zldk4" event={"ID":"ec60a834-532a-45ec-ac78-cc5954ab9d85","Type":"ContainerStarted","Data":"1c18cf7a8d81a88fb7f7c509bf70fa564fdef115c481b793ed4d7113f8cc6f48"} Mar 10 08:12:40 crc kubenswrapper[4825]: I0310 08:12:40.185330 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5645ccfdc5-zldk4" Mar 10 08:12:40 crc kubenswrapper[4825]: I0310 08:12:40.208310 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5645ccfdc5-zldk4" podStartSLOduration=3.20829305 podStartE2EDuration="3.20829305s" podCreationTimestamp="2026-03-10 08:12:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:12:40.200800903 +0000 UTC m=+5313.230581518" watchObservedRunningTime="2026-03-10 08:12:40.20829305 +0000 UTC m=+5313.238073665" Mar 10 08:12:40 crc kubenswrapper[4825]: I0310 08:12:40.725874 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-59f9c97db-j6zhw"] Mar 10 08:12:40 crc kubenswrapper[4825]: I0310 
08:12:40.727379 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-59f9c97db-j6zhw" Mar 10 08:12:40 crc kubenswrapper[4825]: I0310 08:12:40.730974 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 10 08:12:40 crc kubenswrapper[4825]: I0310 08:12:40.731090 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 10 08:12:40 crc kubenswrapper[4825]: I0310 08:12:40.745915 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-59f9c97db-j6zhw"] Mar 10 08:12:40 crc kubenswrapper[4825]: I0310 08:12:40.891221 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bacaf889-e480-4e5d-b5c8-d4496d119261-combined-ca-bundle\") pod \"barbican-api-59f9c97db-j6zhw\" (UID: \"bacaf889-e480-4e5d-b5c8-d4496d119261\") " pod="openstack/barbican-api-59f9c97db-j6zhw" Mar 10 08:12:40 crc kubenswrapper[4825]: I0310 08:12:40.891572 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bacaf889-e480-4e5d-b5c8-d4496d119261-logs\") pod \"barbican-api-59f9c97db-j6zhw\" (UID: \"bacaf889-e480-4e5d-b5c8-d4496d119261\") " pod="openstack/barbican-api-59f9c97db-j6zhw" Mar 10 08:12:40 crc kubenswrapper[4825]: I0310 08:12:40.891628 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bacaf889-e480-4e5d-b5c8-d4496d119261-config-data\") pod \"barbican-api-59f9c97db-j6zhw\" (UID: \"bacaf889-e480-4e5d-b5c8-d4496d119261\") " pod="openstack/barbican-api-59f9c97db-j6zhw" Mar 10 08:12:40 crc kubenswrapper[4825]: I0310 08:12:40.891651 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bacaf889-e480-4e5d-b5c8-d4496d119261-internal-tls-certs\") pod \"barbican-api-59f9c97db-j6zhw\" (UID: \"bacaf889-e480-4e5d-b5c8-d4496d119261\") " pod="openstack/barbican-api-59f9c97db-j6zhw" Mar 10 08:12:40 crc kubenswrapper[4825]: I0310 08:12:40.891790 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bacaf889-e480-4e5d-b5c8-d4496d119261-public-tls-certs\") pod \"barbican-api-59f9c97db-j6zhw\" (UID: \"bacaf889-e480-4e5d-b5c8-d4496d119261\") " pod="openstack/barbican-api-59f9c97db-j6zhw" Mar 10 08:12:40 crc kubenswrapper[4825]: I0310 08:12:40.891965 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bacaf889-e480-4e5d-b5c8-d4496d119261-config-data-custom\") pod \"barbican-api-59f9c97db-j6zhw\" (UID: \"bacaf889-e480-4e5d-b5c8-d4496d119261\") " pod="openstack/barbican-api-59f9c97db-j6zhw" Mar 10 08:12:40 crc kubenswrapper[4825]: I0310 08:12:40.892221 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs6gx\" (UniqueName: \"kubernetes.io/projected/bacaf889-e480-4e5d-b5c8-d4496d119261-kube-api-access-rs6gx\") pod \"barbican-api-59f9c97db-j6zhw\" (UID: \"bacaf889-e480-4e5d-b5c8-d4496d119261\") " pod="openstack/barbican-api-59f9c97db-j6zhw" Mar 10 08:12:40 crc kubenswrapper[4825]: I0310 08:12:40.993621 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs6gx\" (UniqueName: \"kubernetes.io/projected/bacaf889-e480-4e5d-b5c8-d4496d119261-kube-api-access-rs6gx\") pod \"barbican-api-59f9c97db-j6zhw\" (UID: \"bacaf889-e480-4e5d-b5c8-d4496d119261\") " pod="openstack/barbican-api-59f9c97db-j6zhw" Mar 10 08:12:40 crc kubenswrapper[4825]: I0310 08:12:40.993678 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bacaf889-e480-4e5d-b5c8-d4496d119261-combined-ca-bundle\") pod \"barbican-api-59f9c97db-j6zhw\" (UID: \"bacaf889-e480-4e5d-b5c8-d4496d119261\") " pod="openstack/barbican-api-59f9c97db-j6zhw" Mar 10 08:12:40 crc kubenswrapper[4825]: I0310 08:12:40.993721 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bacaf889-e480-4e5d-b5c8-d4496d119261-logs\") pod \"barbican-api-59f9c97db-j6zhw\" (UID: \"bacaf889-e480-4e5d-b5c8-d4496d119261\") " pod="openstack/barbican-api-59f9c97db-j6zhw" Mar 10 08:12:40 crc kubenswrapper[4825]: I0310 08:12:40.993758 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bacaf889-e480-4e5d-b5c8-d4496d119261-config-data\") pod \"barbican-api-59f9c97db-j6zhw\" (UID: \"bacaf889-e480-4e5d-b5c8-d4496d119261\") " pod="openstack/barbican-api-59f9c97db-j6zhw" Mar 10 08:12:40 crc kubenswrapper[4825]: I0310 08:12:40.993772 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bacaf889-e480-4e5d-b5c8-d4496d119261-internal-tls-certs\") pod \"barbican-api-59f9c97db-j6zhw\" (UID: \"bacaf889-e480-4e5d-b5c8-d4496d119261\") " pod="openstack/barbican-api-59f9c97db-j6zhw" Mar 10 08:12:40 crc kubenswrapper[4825]: I0310 08:12:40.993796 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bacaf889-e480-4e5d-b5c8-d4496d119261-public-tls-certs\") pod \"barbican-api-59f9c97db-j6zhw\" (UID: \"bacaf889-e480-4e5d-b5c8-d4496d119261\") " pod="openstack/barbican-api-59f9c97db-j6zhw" Mar 10 08:12:40 crc kubenswrapper[4825]: I0310 08:12:40.993839 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/bacaf889-e480-4e5d-b5c8-d4496d119261-config-data-custom\") pod \"barbican-api-59f9c97db-j6zhw\" (UID: \"bacaf889-e480-4e5d-b5c8-d4496d119261\") " pod="openstack/barbican-api-59f9c97db-j6zhw" Mar 10 08:12:40 crc kubenswrapper[4825]: I0310 08:12:40.994390 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bacaf889-e480-4e5d-b5c8-d4496d119261-logs\") pod \"barbican-api-59f9c97db-j6zhw\" (UID: \"bacaf889-e480-4e5d-b5c8-d4496d119261\") " pod="openstack/barbican-api-59f9c97db-j6zhw" Mar 10 08:12:41 crc kubenswrapper[4825]: I0310 08:12:40.999512 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bacaf889-e480-4e5d-b5c8-d4496d119261-combined-ca-bundle\") pod \"barbican-api-59f9c97db-j6zhw\" (UID: \"bacaf889-e480-4e5d-b5c8-d4496d119261\") " pod="openstack/barbican-api-59f9c97db-j6zhw" Mar 10 08:12:41 crc kubenswrapper[4825]: I0310 08:12:40.999594 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bacaf889-e480-4e5d-b5c8-d4496d119261-config-data-custom\") pod \"barbican-api-59f9c97db-j6zhw\" (UID: \"bacaf889-e480-4e5d-b5c8-d4496d119261\") " pod="openstack/barbican-api-59f9c97db-j6zhw" Mar 10 08:12:41 crc kubenswrapper[4825]: I0310 08:12:40.999750 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bacaf889-e480-4e5d-b5c8-d4496d119261-internal-tls-certs\") pod \"barbican-api-59f9c97db-j6zhw\" (UID: \"bacaf889-e480-4e5d-b5c8-d4496d119261\") " pod="openstack/barbican-api-59f9c97db-j6zhw" Mar 10 08:12:41 crc kubenswrapper[4825]: I0310 08:12:41.006321 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bacaf889-e480-4e5d-b5c8-d4496d119261-config-data\") pod 
\"barbican-api-59f9c97db-j6zhw\" (UID: \"bacaf889-e480-4e5d-b5c8-d4496d119261\") " pod="openstack/barbican-api-59f9c97db-j6zhw" Mar 10 08:12:41 crc kubenswrapper[4825]: I0310 08:12:41.014771 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bacaf889-e480-4e5d-b5c8-d4496d119261-public-tls-certs\") pod \"barbican-api-59f9c97db-j6zhw\" (UID: \"bacaf889-e480-4e5d-b5c8-d4496d119261\") " pod="openstack/barbican-api-59f9c97db-j6zhw" Mar 10 08:12:41 crc kubenswrapper[4825]: I0310 08:12:41.015964 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs6gx\" (UniqueName: \"kubernetes.io/projected/bacaf889-e480-4e5d-b5c8-d4496d119261-kube-api-access-rs6gx\") pod \"barbican-api-59f9c97db-j6zhw\" (UID: \"bacaf889-e480-4e5d-b5c8-d4496d119261\") " pod="openstack/barbican-api-59f9c97db-j6zhw" Mar 10 08:12:41 crc kubenswrapper[4825]: I0310 08:12:41.049903 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-59f9c97db-j6zhw" Mar 10 08:12:41 crc kubenswrapper[4825]: I0310 08:12:41.214184 4825 scope.go:117] "RemoveContainer" containerID="7bce7cbde66a23cf5d2619d1a5be3ef4415be3ce23f68212a5f04f2444b54ef9" Mar 10 08:12:41 crc kubenswrapper[4825]: W0310 08:12:41.490773 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbacaf889_e480_4e5d_b5c8_d4496d119261.slice/crio-fdb01abfa0a07fcc798de24973cad2c4bbe6f4fced27d5f9c5bedb5936577683 WatchSource:0}: Error finding container fdb01abfa0a07fcc798de24973cad2c4bbe6f4fced27d5f9c5bedb5936577683: Status 404 returned error can't find the container with id fdb01abfa0a07fcc798de24973cad2c4bbe6f4fced27d5f9c5bedb5936577683 Mar 10 08:12:41 crc kubenswrapper[4825]: I0310 08:12:41.492401 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-59f9c97db-j6zhw"] Mar 10 08:12:42 crc kubenswrapper[4825]: I0310 08:12:42.204874 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59f9c97db-j6zhw" event={"ID":"bacaf889-e480-4e5d-b5c8-d4496d119261","Type":"ContainerStarted","Data":"df38f0cd1a39d39be79149199106f8f6c6a755c734ab337856475eda7baac1de"} Mar 10 08:12:42 crc kubenswrapper[4825]: I0310 08:12:42.205233 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-59f9c97db-j6zhw" Mar 10 08:12:42 crc kubenswrapper[4825]: I0310 08:12:42.205247 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-59f9c97db-j6zhw" Mar 10 08:12:42 crc kubenswrapper[4825]: I0310 08:12:42.205256 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59f9c97db-j6zhw" event={"ID":"bacaf889-e480-4e5d-b5c8-d4496d119261","Type":"ContainerStarted","Data":"87151d43c59e85343020a073b571afab0fc67fe32a95de7fb8cf37d2ff45f5fb"} Mar 10 08:12:42 crc kubenswrapper[4825]: I0310 
08:12:42.205268 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59f9c97db-j6zhw" event={"ID":"bacaf889-e480-4e5d-b5c8-d4496d119261","Type":"ContainerStarted","Data":"fdb01abfa0a07fcc798de24973cad2c4bbe6f4fced27d5f9c5bedb5936577683"} Mar 10 08:12:42 crc kubenswrapper[4825]: I0310 08:12:42.231976 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-59f9c97db-j6zhw" podStartSLOduration=2.231949688 podStartE2EDuration="2.231949688s" podCreationTimestamp="2026-03-10 08:12:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:12:42.229464083 +0000 UTC m=+5315.259244708" watchObservedRunningTime="2026-03-10 08:12:42.231949688 +0000 UTC m=+5315.261730333" Mar 10 08:12:44 crc kubenswrapper[4825]: I0310 08:12:44.664600 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6db664db48-4nf5z" Mar 10 08:12:46 crc kubenswrapper[4825]: I0310 08:12:46.053506 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6db664db48-4nf5z" Mar 10 08:12:46 crc kubenswrapper[4825]: I0310 08:12:46.236472 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-644cfb5dc8-ldc2h" event={"ID":"0ac2019c-71a9-4006-b1e3-a3d5104834c7","Type":"ContainerStarted","Data":"c2005460f2d17d71b6fdc883cffabcf8c1323a845cb8b476b39eed002d5045fb"} Mar 10 08:12:46 crc kubenswrapper[4825]: I0310 08:12:46.236508 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-644cfb5dc8-ldc2h" event={"ID":"0ac2019c-71a9-4006-b1e3-a3d5104834c7","Type":"ContainerStarted","Data":"712b40fcab4727a60a6f0eaa762e3ec0f8106f910cc2bbb9743d2b6cfe8c96d2"} Mar 10 08:12:46 crc kubenswrapper[4825]: I0310 08:12:46.241289 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-worker-7c98ff7-wbdq9" event={"ID":"057a22f7-8752-4d5b-b896-5e1c5b46ce91","Type":"ContainerStarted","Data":"86d3ae9216daaf00996754caa2cc579ba40b514929039ea198b79bcb07cda3a3"} Mar 10 08:12:46 crc kubenswrapper[4825]: I0310 08:12:46.241327 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7c98ff7-wbdq9" event={"ID":"057a22f7-8752-4d5b-b896-5e1c5b46ce91","Type":"ContainerStarted","Data":"d26674ae682b46c203d0ae493b2966367f4c697f293ac65738bf47c506446202"} Mar 10 08:12:46 crc kubenswrapper[4825]: I0310 08:12:46.255214 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-644cfb5dc8-ldc2h" podStartSLOduration=2.497962571 podStartE2EDuration="9.255198782s" podCreationTimestamp="2026-03-10 08:12:37 +0000 UTC" firstStartedPulling="2026-03-10 08:12:38.317287287 +0000 UTC m=+5311.347067902" lastFinishedPulling="2026-03-10 08:12:45.074523498 +0000 UTC m=+5318.104304113" observedRunningTime="2026-03-10 08:12:46.252973824 +0000 UTC m=+5319.282754449" watchObservedRunningTime="2026-03-10 08:12:46.255198782 +0000 UTC m=+5319.284979397" Mar 10 08:12:46 crc kubenswrapper[4825]: I0310 08:12:46.278965 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7c98ff7-wbdq9" podStartSLOduration=2.637205099 podStartE2EDuration="9.278945266s" podCreationTimestamp="2026-03-10 08:12:37 +0000 UTC" firstStartedPulling="2026-03-10 08:12:38.427864201 +0000 UTC m=+5311.457644816" lastFinishedPulling="2026-03-10 08:12:45.069604368 +0000 UTC m=+5318.099384983" observedRunningTime="2026-03-10 08:12:46.274292044 +0000 UTC m=+5319.304072679" watchObservedRunningTime="2026-03-10 08:12:46.278945266 +0000 UTC m=+5319.308725881" Mar 10 08:12:46 crc kubenswrapper[4825]: I0310 08:12:46.887910 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 08:12:46 crc kubenswrapper[4825]: I0310 08:12:46.888242 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 08:12:47 crc kubenswrapper[4825]: I0310 08:12:47.669749 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-59f9c97db-j6zhw" Mar 10 08:12:47 crc kubenswrapper[4825]: I0310 08:12:47.715026 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-59f9c97db-j6zhw" Mar 10 08:12:47 crc kubenswrapper[4825]: I0310 08:12:47.782043 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6db664db48-4nf5z"] Mar 10 08:12:47 crc kubenswrapper[4825]: I0310 08:12:47.782335 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6db664db48-4nf5z" podUID="2502b2fe-3275-49f1-ac6e-b0c73c16527b" containerName="barbican-api-log" containerID="cri-o://a3444f4a0618ed86dacf9d0ccb42a1d1e892b5122680e604b3f9a40553ee5948" gracePeriod=30 Mar 10 08:12:47 crc kubenswrapper[4825]: I0310 08:12:47.782388 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6db664db48-4nf5z" podUID="2502b2fe-3275-49f1-ac6e-b0c73c16527b" containerName="barbican-api" containerID="cri-o://2befe22d78227385d4ced2cea60fea7daf7f6d482311839da6bd0ed2c479f3f2" gracePeriod=30 Mar 10 08:12:47 crc kubenswrapper[4825]: I0310 08:12:47.792612 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-6db664db48-4nf5z" podUID="2502b2fe-3275-49f1-ac6e-b0c73c16527b" 
containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.1.69:9311/healthcheck\": EOF" Mar 10 08:12:47 crc kubenswrapper[4825]: I0310 08:12:47.792612 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-6db664db48-4nf5z" podUID="2502b2fe-3275-49f1-ac6e-b0c73c16527b" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.1.69:9311/healthcheck\": EOF" Mar 10 08:12:47 crc kubenswrapper[4825]: I0310 08:12:47.909266 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5645ccfdc5-zldk4" Mar 10 08:12:47 crc kubenswrapper[4825]: I0310 08:12:47.971330 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dfd5d77f-gdfgt"] Mar 10 08:12:47 crc kubenswrapper[4825]: I0310 08:12:47.971721 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5dfd5d77f-gdfgt" podUID="e57cbdf7-2b03-4898-b7bd-2ffddbef90c9" containerName="dnsmasq-dns" containerID="cri-o://ccd934705cd89f8776e1f8903c14de8d5ca47047552ef100f24e6bbbdce4f4c2" gracePeriod=10 Mar 10 08:12:48 crc kubenswrapper[4825]: I0310 08:12:48.270120 4825 generic.go:334] "Generic (PLEG): container finished" podID="2502b2fe-3275-49f1-ac6e-b0c73c16527b" containerID="a3444f4a0618ed86dacf9d0ccb42a1d1e892b5122680e604b3f9a40553ee5948" exitCode=143 Mar 10 08:12:48 crc kubenswrapper[4825]: I0310 08:12:48.270184 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6db664db48-4nf5z" event={"ID":"2502b2fe-3275-49f1-ac6e-b0c73c16527b","Type":"ContainerDied","Data":"a3444f4a0618ed86dacf9d0ccb42a1d1e892b5122680e604b3f9a40553ee5948"} Mar 10 08:12:48 crc kubenswrapper[4825]: I0310 08:12:48.271917 4825 generic.go:334] "Generic (PLEG): container finished" podID="e57cbdf7-2b03-4898-b7bd-2ffddbef90c9" containerID="ccd934705cd89f8776e1f8903c14de8d5ca47047552ef100f24e6bbbdce4f4c2" exitCode=0 Mar 10 08:12:48 crc 
kubenswrapper[4825]: I0310 08:12:48.272726 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dfd5d77f-gdfgt" event={"ID":"e57cbdf7-2b03-4898-b7bd-2ffddbef90c9","Type":"ContainerDied","Data":"ccd934705cd89f8776e1f8903c14de8d5ca47047552ef100f24e6bbbdce4f4c2"} Mar 10 08:12:48 crc kubenswrapper[4825]: I0310 08:12:48.543678 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dfd5d77f-gdfgt" Mar 10 08:12:48 crc kubenswrapper[4825]: I0310 08:12:48.649610 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e57cbdf7-2b03-4898-b7bd-2ffddbef90c9-ovsdbserver-nb\") pod \"e57cbdf7-2b03-4898-b7bd-2ffddbef90c9\" (UID: \"e57cbdf7-2b03-4898-b7bd-2ffddbef90c9\") " Mar 10 08:12:48 crc kubenswrapper[4825]: I0310 08:12:48.649687 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e57cbdf7-2b03-4898-b7bd-2ffddbef90c9-ovsdbserver-sb\") pod \"e57cbdf7-2b03-4898-b7bd-2ffddbef90c9\" (UID: \"e57cbdf7-2b03-4898-b7bd-2ffddbef90c9\") " Mar 10 08:12:48 crc kubenswrapper[4825]: I0310 08:12:48.649740 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e57cbdf7-2b03-4898-b7bd-2ffddbef90c9-config\") pod \"e57cbdf7-2b03-4898-b7bd-2ffddbef90c9\" (UID: \"e57cbdf7-2b03-4898-b7bd-2ffddbef90c9\") " Mar 10 08:12:48 crc kubenswrapper[4825]: I0310 08:12:48.649827 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jl9ft\" (UniqueName: \"kubernetes.io/projected/e57cbdf7-2b03-4898-b7bd-2ffddbef90c9-kube-api-access-jl9ft\") pod \"e57cbdf7-2b03-4898-b7bd-2ffddbef90c9\" (UID: \"e57cbdf7-2b03-4898-b7bd-2ffddbef90c9\") " Mar 10 08:12:48 crc kubenswrapper[4825]: I0310 08:12:48.649852 4825 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e57cbdf7-2b03-4898-b7bd-2ffddbef90c9-dns-svc\") pod \"e57cbdf7-2b03-4898-b7bd-2ffddbef90c9\" (UID: \"e57cbdf7-2b03-4898-b7bd-2ffddbef90c9\") " Mar 10 08:12:48 crc kubenswrapper[4825]: I0310 08:12:48.669716 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e57cbdf7-2b03-4898-b7bd-2ffddbef90c9-kube-api-access-jl9ft" (OuterVolumeSpecName: "kube-api-access-jl9ft") pod "e57cbdf7-2b03-4898-b7bd-2ffddbef90c9" (UID: "e57cbdf7-2b03-4898-b7bd-2ffddbef90c9"). InnerVolumeSpecName "kube-api-access-jl9ft". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:12:48 crc kubenswrapper[4825]: I0310 08:12:48.695110 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e57cbdf7-2b03-4898-b7bd-2ffddbef90c9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e57cbdf7-2b03-4898-b7bd-2ffddbef90c9" (UID: "e57cbdf7-2b03-4898-b7bd-2ffddbef90c9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:12:48 crc kubenswrapper[4825]: I0310 08:12:48.704758 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e57cbdf7-2b03-4898-b7bd-2ffddbef90c9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e57cbdf7-2b03-4898-b7bd-2ffddbef90c9" (UID: "e57cbdf7-2b03-4898-b7bd-2ffddbef90c9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:12:48 crc kubenswrapper[4825]: I0310 08:12:48.718589 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e57cbdf7-2b03-4898-b7bd-2ffddbef90c9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e57cbdf7-2b03-4898-b7bd-2ffddbef90c9" (UID: "e57cbdf7-2b03-4898-b7bd-2ffddbef90c9"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:12:48 crc kubenswrapper[4825]: I0310 08:12:48.720862 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e57cbdf7-2b03-4898-b7bd-2ffddbef90c9-config" (OuterVolumeSpecName: "config") pod "e57cbdf7-2b03-4898-b7bd-2ffddbef90c9" (UID: "e57cbdf7-2b03-4898-b7bd-2ffddbef90c9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:12:48 crc kubenswrapper[4825]: I0310 08:12:48.752151 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jl9ft\" (UniqueName: \"kubernetes.io/projected/e57cbdf7-2b03-4898-b7bd-2ffddbef90c9-kube-api-access-jl9ft\") on node \"crc\" DevicePath \"\"" Mar 10 08:12:48 crc kubenswrapper[4825]: I0310 08:12:48.752194 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e57cbdf7-2b03-4898-b7bd-2ffddbef90c9-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 08:12:48 crc kubenswrapper[4825]: I0310 08:12:48.752204 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e57cbdf7-2b03-4898-b7bd-2ffddbef90c9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 08:12:48 crc kubenswrapper[4825]: I0310 08:12:48.752216 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e57cbdf7-2b03-4898-b7bd-2ffddbef90c9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 08:12:48 crc kubenswrapper[4825]: I0310 08:12:48.752225 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e57cbdf7-2b03-4898-b7bd-2ffddbef90c9-config\") on node \"crc\" DevicePath \"\"" Mar 10 08:12:49 crc kubenswrapper[4825]: I0310 08:12:49.281551 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dfd5d77f-gdfgt" 
event={"ID":"e57cbdf7-2b03-4898-b7bd-2ffddbef90c9","Type":"ContainerDied","Data":"0bddb9aefff3b48461f265a5c41b3a693f6f377498294d6003627246b6c479b1"} Mar 10 08:12:49 crc kubenswrapper[4825]: I0310 08:12:49.281609 4825 scope.go:117] "RemoveContainer" containerID="ccd934705cd89f8776e1f8903c14de8d5ca47047552ef100f24e6bbbdce4f4c2" Mar 10 08:12:49 crc kubenswrapper[4825]: I0310 08:12:49.281619 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dfd5d77f-gdfgt" Mar 10 08:12:49 crc kubenswrapper[4825]: I0310 08:12:49.309183 4825 scope.go:117] "RemoveContainer" containerID="3943a89c108fe7c1692b6c3449a15c3b6a3360d7f26dd8f9e42cf60127331675" Mar 10 08:12:49 crc kubenswrapper[4825]: I0310 08:12:49.310342 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dfd5d77f-gdfgt"] Mar 10 08:12:49 crc kubenswrapper[4825]: I0310 08:12:49.319537 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5dfd5d77f-gdfgt"] Mar 10 08:12:51 crc kubenswrapper[4825]: I0310 08:12:51.248353 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e57cbdf7-2b03-4898-b7bd-2ffddbef90c9" path="/var/lib/kubelet/pods/e57cbdf7-2b03-4898-b7bd-2ffddbef90c9/volumes" Mar 10 08:12:52 crc kubenswrapper[4825]: I0310 08:12:52.207041 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6db664db48-4nf5z" podUID="2502b2fe-3275-49f1-ac6e-b0c73c16527b" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.1.69:9311/healthcheck\": read tcp 10.217.0.2:36304->10.217.1.69:9311: read: connection reset by peer" Mar 10 08:12:52 crc kubenswrapper[4825]: I0310 08:12:52.207223 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6db664db48-4nf5z" podUID="2502b2fe-3275-49f1-ac6e-b0c73c16527b" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.1.69:9311/healthcheck\": read tcp 
10.217.0.2:36296->10.217.1.69:9311: read: connection reset by peer" Mar 10 08:12:52 crc kubenswrapper[4825]: I0310 08:12:52.309093 4825 generic.go:334] "Generic (PLEG): container finished" podID="2502b2fe-3275-49f1-ac6e-b0c73c16527b" containerID="2befe22d78227385d4ced2cea60fea7daf7f6d482311839da6bd0ed2c479f3f2" exitCode=0 Mar 10 08:12:52 crc kubenswrapper[4825]: I0310 08:12:52.309201 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6db664db48-4nf5z" event={"ID":"2502b2fe-3275-49f1-ac6e-b0c73c16527b","Type":"ContainerDied","Data":"2befe22d78227385d4ced2cea60fea7daf7f6d482311839da6bd0ed2c479f3f2"} Mar 10 08:12:52 crc kubenswrapper[4825]: I0310 08:12:52.652179 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6db664db48-4nf5z" Mar 10 08:12:52 crc kubenswrapper[4825]: I0310 08:12:52.720347 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2502b2fe-3275-49f1-ac6e-b0c73c16527b-config-data-custom\") pod \"2502b2fe-3275-49f1-ac6e-b0c73c16527b\" (UID: \"2502b2fe-3275-49f1-ac6e-b0c73c16527b\") " Mar 10 08:12:52 crc kubenswrapper[4825]: I0310 08:12:52.720450 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpfrl\" (UniqueName: \"kubernetes.io/projected/2502b2fe-3275-49f1-ac6e-b0c73c16527b-kube-api-access-wpfrl\") pod \"2502b2fe-3275-49f1-ac6e-b0c73c16527b\" (UID: \"2502b2fe-3275-49f1-ac6e-b0c73c16527b\") " Mar 10 08:12:52 crc kubenswrapper[4825]: I0310 08:12:52.720545 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2502b2fe-3275-49f1-ac6e-b0c73c16527b-config-data\") pod \"2502b2fe-3275-49f1-ac6e-b0c73c16527b\" (UID: \"2502b2fe-3275-49f1-ac6e-b0c73c16527b\") " Mar 10 08:12:52 crc kubenswrapper[4825]: I0310 08:12:52.720615 4825 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2502b2fe-3275-49f1-ac6e-b0c73c16527b-logs\") pod \"2502b2fe-3275-49f1-ac6e-b0c73c16527b\" (UID: \"2502b2fe-3275-49f1-ac6e-b0c73c16527b\") " Mar 10 08:12:52 crc kubenswrapper[4825]: I0310 08:12:52.720719 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2502b2fe-3275-49f1-ac6e-b0c73c16527b-combined-ca-bundle\") pod \"2502b2fe-3275-49f1-ac6e-b0c73c16527b\" (UID: \"2502b2fe-3275-49f1-ac6e-b0c73c16527b\") " Mar 10 08:12:52 crc kubenswrapper[4825]: I0310 08:12:52.721227 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2502b2fe-3275-49f1-ac6e-b0c73c16527b-logs" (OuterVolumeSpecName: "logs") pod "2502b2fe-3275-49f1-ac6e-b0c73c16527b" (UID: "2502b2fe-3275-49f1-ac6e-b0c73c16527b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:12:52 crc kubenswrapper[4825]: I0310 08:12:52.721378 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2502b2fe-3275-49f1-ac6e-b0c73c16527b-logs\") on node \"crc\" DevicePath \"\"" Mar 10 08:12:52 crc kubenswrapper[4825]: I0310 08:12:52.726589 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2502b2fe-3275-49f1-ac6e-b0c73c16527b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2502b2fe-3275-49f1-ac6e-b0c73c16527b" (UID: "2502b2fe-3275-49f1-ac6e-b0c73c16527b"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:12:52 crc kubenswrapper[4825]: I0310 08:12:52.727113 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2502b2fe-3275-49f1-ac6e-b0c73c16527b-kube-api-access-wpfrl" (OuterVolumeSpecName: "kube-api-access-wpfrl") pod "2502b2fe-3275-49f1-ac6e-b0c73c16527b" (UID: "2502b2fe-3275-49f1-ac6e-b0c73c16527b"). InnerVolumeSpecName "kube-api-access-wpfrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:12:52 crc kubenswrapper[4825]: I0310 08:12:52.748698 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2502b2fe-3275-49f1-ac6e-b0c73c16527b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2502b2fe-3275-49f1-ac6e-b0c73c16527b" (UID: "2502b2fe-3275-49f1-ac6e-b0c73c16527b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:12:52 crc kubenswrapper[4825]: I0310 08:12:52.764331 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2502b2fe-3275-49f1-ac6e-b0c73c16527b-config-data" (OuterVolumeSpecName: "config-data") pod "2502b2fe-3275-49f1-ac6e-b0c73c16527b" (UID: "2502b2fe-3275-49f1-ac6e-b0c73c16527b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:12:52 crc kubenswrapper[4825]: I0310 08:12:52.822851 4825 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2502b2fe-3275-49f1-ac6e-b0c73c16527b-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 08:12:52 crc kubenswrapper[4825]: I0310 08:12:52.822927 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpfrl\" (UniqueName: \"kubernetes.io/projected/2502b2fe-3275-49f1-ac6e-b0c73c16527b-kube-api-access-wpfrl\") on node \"crc\" DevicePath \"\"" Mar 10 08:12:52 crc kubenswrapper[4825]: I0310 08:12:52.822942 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2502b2fe-3275-49f1-ac6e-b0c73c16527b-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 08:12:52 crc kubenswrapper[4825]: I0310 08:12:52.822955 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2502b2fe-3275-49f1-ac6e-b0c73c16527b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:12:53 crc kubenswrapper[4825]: I0310 08:12:53.319694 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6db664db48-4nf5z" event={"ID":"2502b2fe-3275-49f1-ac6e-b0c73c16527b","Type":"ContainerDied","Data":"7dc1d47991806d00e433b972440fe2509810e8d2d91bedc310b8ae9390ed02d8"} Mar 10 08:12:53 crc kubenswrapper[4825]: I0310 08:12:53.319774 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6db664db48-4nf5z" Mar 10 08:12:53 crc kubenswrapper[4825]: I0310 08:12:53.320541 4825 scope.go:117] "RemoveContainer" containerID="2befe22d78227385d4ced2cea60fea7daf7f6d482311839da6bd0ed2c479f3f2" Mar 10 08:12:53 crc kubenswrapper[4825]: I0310 08:12:53.354397 4825 scope.go:117] "RemoveContainer" containerID="a3444f4a0618ed86dacf9d0ccb42a1d1e892b5122680e604b3f9a40553ee5948" Mar 10 08:12:53 crc kubenswrapper[4825]: I0310 08:12:53.363971 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6db664db48-4nf5z"] Mar 10 08:12:53 crc kubenswrapper[4825]: I0310 08:12:53.385811 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6db664db48-4nf5z"] Mar 10 08:12:55 crc kubenswrapper[4825]: I0310 08:12:55.250095 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2502b2fe-3275-49f1-ac6e-b0c73c16527b" path="/var/lib/kubelet/pods/2502b2fe-3275-49f1-ac6e-b0c73c16527b/volumes" Mar 10 08:13:00 crc kubenswrapper[4825]: I0310 08:13:00.330731 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-g6wcl"] Mar 10 08:13:00 crc kubenswrapper[4825]: E0310 08:13:00.331564 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e57cbdf7-2b03-4898-b7bd-2ffddbef90c9" containerName="init" Mar 10 08:13:00 crc kubenswrapper[4825]: I0310 08:13:00.331577 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="e57cbdf7-2b03-4898-b7bd-2ffddbef90c9" containerName="init" Mar 10 08:13:00 crc kubenswrapper[4825]: E0310 08:13:00.331591 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e57cbdf7-2b03-4898-b7bd-2ffddbef90c9" containerName="dnsmasq-dns" Mar 10 08:13:00 crc kubenswrapper[4825]: I0310 08:13:00.331596 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="e57cbdf7-2b03-4898-b7bd-2ffddbef90c9" containerName="dnsmasq-dns" Mar 10 08:13:00 crc kubenswrapper[4825]: E0310 08:13:00.331620 4825 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2502b2fe-3275-49f1-ac6e-b0c73c16527b" containerName="barbican-api" Mar 10 08:13:00 crc kubenswrapper[4825]: I0310 08:13:00.331626 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="2502b2fe-3275-49f1-ac6e-b0c73c16527b" containerName="barbican-api" Mar 10 08:13:00 crc kubenswrapper[4825]: E0310 08:13:00.331633 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2502b2fe-3275-49f1-ac6e-b0c73c16527b" containerName="barbican-api-log" Mar 10 08:13:00 crc kubenswrapper[4825]: I0310 08:13:00.331639 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="2502b2fe-3275-49f1-ac6e-b0c73c16527b" containerName="barbican-api-log" Mar 10 08:13:00 crc kubenswrapper[4825]: I0310 08:13:00.331813 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="e57cbdf7-2b03-4898-b7bd-2ffddbef90c9" containerName="dnsmasq-dns" Mar 10 08:13:00 crc kubenswrapper[4825]: I0310 08:13:00.331828 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="2502b2fe-3275-49f1-ac6e-b0c73c16527b" containerName="barbican-api-log" Mar 10 08:13:00 crc kubenswrapper[4825]: I0310 08:13:00.331837 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="2502b2fe-3275-49f1-ac6e-b0c73c16527b" containerName="barbican-api" Mar 10 08:13:00 crc kubenswrapper[4825]: I0310 08:13:00.332404 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-g6wcl" Mar 10 08:13:00 crc kubenswrapper[4825]: I0310 08:13:00.349700 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-g6wcl"] Mar 10 08:13:00 crc kubenswrapper[4825]: I0310 08:13:00.438226 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-acb9-account-create-update-s4m8d"] Mar 10 08:13:00 crc kubenswrapper[4825]: I0310 08:13:00.439767 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-acb9-account-create-update-s4m8d" Mar 10 08:13:00 crc kubenswrapper[4825]: I0310 08:13:00.441627 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 10 08:13:00 crc kubenswrapper[4825]: I0310 08:13:00.446251 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-acb9-account-create-update-s4m8d"] Mar 10 08:13:00 crc kubenswrapper[4825]: I0310 08:13:00.453723 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29d7bb79-c212-470e-927d-9415a2e9f206-operator-scripts\") pod \"neutron-db-create-g6wcl\" (UID: \"29d7bb79-c212-470e-927d-9415a2e9f206\") " pod="openstack/neutron-db-create-g6wcl" Mar 10 08:13:00 crc kubenswrapper[4825]: I0310 08:13:00.453801 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l5l7\" (UniqueName: \"kubernetes.io/projected/29d7bb79-c212-470e-927d-9415a2e9f206-kube-api-access-6l5l7\") pod \"neutron-db-create-g6wcl\" (UID: \"29d7bb79-c212-470e-927d-9415a2e9f206\") " pod="openstack/neutron-db-create-g6wcl" Mar 10 08:13:00 crc kubenswrapper[4825]: I0310 08:13:00.555513 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6p9c\" (UniqueName: \"kubernetes.io/projected/d38f516f-6216-4ed2-9efe-95492dfc61e8-kube-api-access-r6p9c\") pod \"neutron-acb9-account-create-update-s4m8d\" (UID: \"d38f516f-6216-4ed2-9efe-95492dfc61e8\") " pod="openstack/neutron-acb9-account-create-update-s4m8d" Mar 10 08:13:00 crc kubenswrapper[4825]: I0310 08:13:00.555558 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d38f516f-6216-4ed2-9efe-95492dfc61e8-operator-scripts\") pod 
\"neutron-acb9-account-create-update-s4m8d\" (UID: \"d38f516f-6216-4ed2-9efe-95492dfc61e8\") " pod="openstack/neutron-acb9-account-create-update-s4m8d" Mar 10 08:13:00 crc kubenswrapper[4825]: I0310 08:13:00.555589 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29d7bb79-c212-470e-927d-9415a2e9f206-operator-scripts\") pod \"neutron-db-create-g6wcl\" (UID: \"29d7bb79-c212-470e-927d-9415a2e9f206\") " pod="openstack/neutron-db-create-g6wcl" Mar 10 08:13:00 crc kubenswrapper[4825]: I0310 08:13:00.555721 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6l5l7\" (UniqueName: \"kubernetes.io/projected/29d7bb79-c212-470e-927d-9415a2e9f206-kube-api-access-6l5l7\") pod \"neutron-db-create-g6wcl\" (UID: \"29d7bb79-c212-470e-927d-9415a2e9f206\") " pod="openstack/neutron-db-create-g6wcl" Mar 10 08:13:00 crc kubenswrapper[4825]: I0310 08:13:00.556394 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29d7bb79-c212-470e-927d-9415a2e9f206-operator-scripts\") pod \"neutron-db-create-g6wcl\" (UID: \"29d7bb79-c212-470e-927d-9415a2e9f206\") " pod="openstack/neutron-db-create-g6wcl" Mar 10 08:13:00 crc kubenswrapper[4825]: I0310 08:13:00.574646 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l5l7\" (UniqueName: \"kubernetes.io/projected/29d7bb79-c212-470e-927d-9415a2e9f206-kube-api-access-6l5l7\") pod \"neutron-db-create-g6wcl\" (UID: \"29d7bb79-c212-470e-927d-9415a2e9f206\") " pod="openstack/neutron-db-create-g6wcl" Mar 10 08:13:00 crc kubenswrapper[4825]: I0310 08:13:00.652299 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-g6wcl" Mar 10 08:13:00 crc kubenswrapper[4825]: I0310 08:13:00.657104 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d38f516f-6216-4ed2-9efe-95492dfc61e8-operator-scripts\") pod \"neutron-acb9-account-create-update-s4m8d\" (UID: \"d38f516f-6216-4ed2-9efe-95492dfc61e8\") " pod="openstack/neutron-acb9-account-create-update-s4m8d" Mar 10 08:13:00 crc kubenswrapper[4825]: I0310 08:13:00.657154 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6p9c\" (UniqueName: \"kubernetes.io/projected/d38f516f-6216-4ed2-9efe-95492dfc61e8-kube-api-access-r6p9c\") pod \"neutron-acb9-account-create-update-s4m8d\" (UID: \"d38f516f-6216-4ed2-9efe-95492dfc61e8\") " pod="openstack/neutron-acb9-account-create-update-s4m8d" Mar 10 08:13:00 crc kubenswrapper[4825]: I0310 08:13:00.658117 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d38f516f-6216-4ed2-9efe-95492dfc61e8-operator-scripts\") pod \"neutron-acb9-account-create-update-s4m8d\" (UID: \"d38f516f-6216-4ed2-9efe-95492dfc61e8\") " pod="openstack/neutron-acb9-account-create-update-s4m8d" Mar 10 08:13:00 crc kubenswrapper[4825]: I0310 08:13:00.675916 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6p9c\" (UniqueName: \"kubernetes.io/projected/d38f516f-6216-4ed2-9efe-95492dfc61e8-kube-api-access-r6p9c\") pod \"neutron-acb9-account-create-update-s4m8d\" (UID: \"d38f516f-6216-4ed2-9efe-95492dfc61e8\") " pod="openstack/neutron-acb9-account-create-update-s4m8d" Mar 10 08:13:00 crc kubenswrapper[4825]: I0310 08:13:00.766780 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-acb9-account-create-update-s4m8d" Mar 10 08:13:01 crc kubenswrapper[4825]: I0310 08:13:01.102604 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-g6wcl"] Mar 10 08:13:01 crc kubenswrapper[4825]: W0310 08:13:01.109876 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29d7bb79_c212_470e_927d_9415a2e9f206.slice/crio-dc5093b7052ad35e2ff42c833e2879184c95654dce5f4ccbfec806d7b8f856d9 WatchSource:0}: Error finding container dc5093b7052ad35e2ff42c833e2879184c95654dce5f4ccbfec806d7b8f856d9: Status 404 returned error can't find the container with id dc5093b7052ad35e2ff42c833e2879184c95654dce5f4ccbfec806d7b8f856d9 Mar 10 08:13:01 crc kubenswrapper[4825]: W0310 08:13:01.212921 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd38f516f_6216_4ed2_9efe_95492dfc61e8.slice/crio-6f8a27806c2ccdb0af2444d3eed8bc26029786cfeded1f015721bc264192602f WatchSource:0}: Error finding container 6f8a27806c2ccdb0af2444d3eed8bc26029786cfeded1f015721bc264192602f: Status 404 returned error can't find the container with id 6f8a27806c2ccdb0af2444d3eed8bc26029786cfeded1f015721bc264192602f Mar 10 08:13:01 crc kubenswrapper[4825]: I0310 08:13:01.213594 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-acb9-account-create-update-s4m8d"] Mar 10 08:13:01 crc kubenswrapper[4825]: I0310 08:13:01.403002 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-g6wcl" event={"ID":"29d7bb79-c212-470e-927d-9415a2e9f206","Type":"ContainerStarted","Data":"40edc66acaea18a4331bbb40750d16617c40c50f75e69141af2f7dbbdc2b1274"} Mar 10 08:13:01 crc kubenswrapper[4825]: I0310 08:13:01.403376 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-g6wcl" 
event={"ID":"29d7bb79-c212-470e-927d-9415a2e9f206","Type":"ContainerStarted","Data":"dc5093b7052ad35e2ff42c833e2879184c95654dce5f4ccbfec806d7b8f856d9"} Mar 10 08:13:01 crc kubenswrapper[4825]: I0310 08:13:01.406086 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-acb9-account-create-update-s4m8d" event={"ID":"d38f516f-6216-4ed2-9efe-95492dfc61e8","Type":"ContainerStarted","Data":"2e9fa9c64694d31a7607ba8d638cb7990e44d0ec8780f6c393152feaa46ef579"} Mar 10 08:13:01 crc kubenswrapper[4825]: I0310 08:13:01.406117 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-acb9-account-create-update-s4m8d" event={"ID":"d38f516f-6216-4ed2-9efe-95492dfc61e8","Type":"ContainerStarted","Data":"6f8a27806c2ccdb0af2444d3eed8bc26029786cfeded1f015721bc264192602f"} Mar 10 08:13:01 crc kubenswrapper[4825]: I0310 08:13:01.419356 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-g6wcl" podStartSLOduration=1.4193316280000001 podStartE2EDuration="1.419331628s" podCreationTimestamp="2026-03-10 08:13:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:13:01.415117987 +0000 UTC m=+5334.444898622" watchObservedRunningTime="2026-03-10 08:13:01.419331628 +0000 UTC m=+5334.449112253" Mar 10 08:13:01 crc kubenswrapper[4825]: I0310 08:13:01.434109 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-acb9-account-create-update-s4m8d" podStartSLOduration=1.434087736 podStartE2EDuration="1.434087736s" podCreationTimestamp="2026-03-10 08:13:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:13:01.433594803 +0000 UTC m=+5334.463375418" watchObservedRunningTime="2026-03-10 08:13:01.434087736 +0000 UTC m=+5334.463868351" Mar 10 08:13:02 crc 
kubenswrapper[4825]: I0310 08:13:02.421049 4825 generic.go:334] "Generic (PLEG): container finished" podID="29d7bb79-c212-470e-927d-9415a2e9f206" containerID="40edc66acaea18a4331bbb40750d16617c40c50f75e69141af2f7dbbdc2b1274" exitCode=0 Mar 10 08:13:02 crc kubenswrapper[4825]: I0310 08:13:02.421598 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-g6wcl" event={"ID":"29d7bb79-c212-470e-927d-9415a2e9f206","Type":"ContainerDied","Data":"40edc66acaea18a4331bbb40750d16617c40c50f75e69141af2f7dbbdc2b1274"} Mar 10 08:13:02 crc kubenswrapper[4825]: I0310 08:13:02.424662 4825 generic.go:334] "Generic (PLEG): container finished" podID="d38f516f-6216-4ed2-9efe-95492dfc61e8" containerID="2e9fa9c64694d31a7607ba8d638cb7990e44d0ec8780f6c393152feaa46ef579" exitCode=0 Mar 10 08:13:02 crc kubenswrapper[4825]: I0310 08:13:02.424737 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-acb9-account-create-update-s4m8d" event={"ID":"d38f516f-6216-4ed2-9efe-95492dfc61e8","Type":"ContainerDied","Data":"2e9fa9c64694d31a7607ba8d638cb7990e44d0ec8780f6c393152feaa46ef579"} Mar 10 08:13:03 crc kubenswrapper[4825]: I0310 08:13:03.832052 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-g6wcl" Mar 10 08:13:03 crc kubenswrapper[4825]: I0310 08:13:03.838863 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-acb9-account-create-update-s4m8d" Mar 10 08:13:03 crc kubenswrapper[4825]: I0310 08:13:03.912856 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6p9c\" (UniqueName: \"kubernetes.io/projected/d38f516f-6216-4ed2-9efe-95492dfc61e8-kube-api-access-r6p9c\") pod \"d38f516f-6216-4ed2-9efe-95492dfc61e8\" (UID: \"d38f516f-6216-4ed2-9efe-95492dfc61e8\") " Mar 10 08:13:03 crc kubenswrapper[4825]: I0310 08:13:03.913358 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6l5l7\" (UniqueName: \"kubernetes.io/projected/29d7bb79-c212-470e-927d-9415a2e9f206-kube-api-access-6l5l7\") pod \"29d7bb79-c212-470e-927d-9415a2e9f206\" (UID: \"29d7bb79-c212-470e-927d-9415a2e9f206\") " Mar 10 08:13:03 crc kubenswrapper[4825]: I0310 08:13:03.913432 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29d7bb79-c212-470e-927d-9415a2e9f206-operator-scripts\") pod \"29d7bb79-c212-470e-927d-9415a2e9f206\" (UID: \"29d7bb79-c212-470e-927d-9415a2e9f206\") " Mar 10 08:13:03 crc kubenswrapper[4825]: I0310 08:13:03.913503 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d38f516f-6216-4ed2-9efe-95492dfc61e8-operator-scripts\") pod \"d38f516f-6216-4ed2-9efe-95492dfc61e8\" (UID: \"d38f516f-6216-4ed2-9efe-95492dfc61e8\") " Mar 10 08:13:03 crc kubenswrapper[4825]: I0310 08:13:03.913885 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29d7bb79-c212-470e-927d-9415a2e9f206-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "29d7bb79-c212-470e-927d-9415a2e9f206" (UID: "29d7bb79-c212-470e-927d-9415a2e9f206"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:13:03 crc kubenswrapper[4825]: I0310 08:13:03.913924 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d38f516f-6216-4ed2-9efe-95492dfc61e8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d38f516f-6216-4ed2-9efe-95492dfc61e8" (UID: "d38f516f-6216-4ed2-9efe-95492dfc61e8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:13:03 crc kubenswrapper[4825]: I0310 08:13:03.919154 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29d7bb79-c212-470e-927d-9415a2e9f206-kube-api-access-6l5l7" (OuterVolumeSpecName: "kube-api-access-6l5l7") pod "29d7bb79-c212-470e-927d-9415a2e9f206" (UID: "29d7bb79-c212-470e-927d-9415a2e9f206"). InnerVolumeSpecName "kube-api-access-6l5l7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:13:03 crc kubenswrapper[4825]: I0310 08:13:03.919203 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d38f516f-6216-4ed2-9efe-95492dfc61e8-kube-api-access-r6p9c" (OuterVolumeSpecName: "kube-api-access-r6p9c") pod "d38f516f-6216-4ed2-9efe-95492dfc61e8" (UID: "d38f516f-6216-4ed2-9efe-95492dfc61e8"). InnerVolumeSpecName "kube-api-access-r6p9c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:13:04 crc kubenswrapper[4825]: I0310 08:13:04.015972 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6l5l7\" (UniqueName: \"kubernetes.io/projected/29d7bb79-c212-470e-927d-9415a2e9f206-kube-api-access-6l5l7\") on node \"crc\" DevicePath \"\"" Mar 10 08:13:04 crc kubenswrapper[4825]: I0310 08:13:04.016017 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29d7bb79-c212-470e-927d-9415a2e9f206-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 08:13:04 crc kubenswrapper[4825]: I0310 08:13:04.016032 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d38f516f-6216-4ed2-9efe-95492dfc61e8-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 08:13:04 crc kubenswrapper[4825]: I0310 08:13:04.016043 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6p9c\" (UniqueName: \"kubernetes.io/projected/d38f516f-6216-4ed2-9efe-95492dfc61e8-kube-api-access-r6p9c\") on node \"crc\" DevicePath \"\"" Mar 10 08:13:04 crc kubenswrapper[4825]: I0310 08:13:04.443274 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-acb9-account-create-update-s4m8d" event={"ID":"d38f516f-6216-4ed2-9efe-95492dfc61e8","Type":"ContainerDied","Data":"6f8a27806c2ccdb0af2444d3eed8bc26029786cfeded1f015721bc264192602f"} Mar 10 08:13:04 crc kubenswrapper[4825]: I0310 08:13:04.443323 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f8a27806c2ccdb0af2444d3eed8bc26029786cfeded1f015721bc264192602f" Mar 10 08:13:04 crc kubenswrapper[4825]: I0310 08:13:04.443384 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-acb9-account-create-update-s4m8d" Mar 10 08:13:04 crc kubenswrapper[4825]: I0310 08:13:04.447923 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-g6wcl" event={"ID":"29d7bb79-c212-470e-927d-9415a2e9f206","Type":"ContainerDied","Data":"dc5093b7052ad35e2ff42c833e2879184c95654dce5f4ccbfec806d7b8f856d9"} Mar 10 08:13:04 crc kubenswrapper[4825]: I0310 08:13:04.447958 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc5093b7052ad35e2ff42c833e2879184c95654dce5f4ccbfec806d7b8f856d9" Mar 10 08:13:04 crc kubenswrapper[4825]: I0310 08:13:04.448206 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-g6wcl" Mar 10 08:13:05 crc kubenswrapper[4825]: I0310 08:13:05.649954 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-s9c9n"] Mar 10 08:13:05 crc kubenswrapper[4825]: E0310 08:13:05.656205 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29d7bb79-c212-470e-927d-9415a2e9f206" containerName="mariadb-database-create" Mar 10 08:13:05 crc kubenswrapper[4825]: I0310 08:13:05.656319 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="29d7bb79-c212-470e-927d-9415a2e9f206" containerName="mariadb-database-create" Mar 10 08:13:05 crc kubenswrapper[4825]: E0310 08:13:05.656343 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d38f516f-6216-4ed2-9efe-95492dfc61e8" containerName="mariadb-account-create-update" Mar 10 08:13:05 crc kubenswrapper[4825]: I0310 08:13:05.656376 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="d38f516f-6216-4ed2-9efe-95492dfc61e8" containerName="mariadb-account-create-update" Mar 10 08:13:05 crc kubenswrapper[4825]: I0310 08:13:05.656710 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="29d7bb79-c212-470e-927d-9415a2e9f206" containerName="mariadb-database-create" Mar 
10 08:13:05 crc kubenswrapper[4825]: I0310 08:13:05.656760 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="d38f516f-6216-4ed2-9efe-95492dfc61e8" containerName="mariadb-account-create-update" Mar 10 08:13:05 crc kubenswrapper[4825]: I0310 08:13:05.657686 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-s9c9n" Mar 10 08:13:05 crc kubenswrapper[4825]: I0310 08:13:05.660648 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 10 08:13:05 crc kubenswrapper[4825]: I0310 08:13:05.660754 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 10 08:13:05 crc kubenswrapper[4825]: I0310 08:13:05.663024 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-vhbnf" Mar 10 08:13:05 crc kubenswrapper[4825]: I0310 08:13:05.670638 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-s9c9n"] Mar 10 08:13:05 crc kubenswrapper[4825]: I0310 08:13:05.748716 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8phn\" (UniqueName: \"kubernetes.io/projected/311dddb8-bb79-4100-8497-ee38e452b266-kube-api-access-w8phn\") pod \"neutron-db-sync-s9c9n\" (UID: \"311dddb8-bb79-4100-8497-ee38e452b266\") " pod="openstack/neutron-db-sync-s9c9n" Mar 10 08:13:05 crc kubenswrapper[4825]: I0310 08:13:05.748983 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/311dddb8-bb79-4100-8497-ee38e452b266-combined-ca-bundle\") pod \"neutron-db-sync-s9c9n\" (UID: \"311dddb8-bb79-4100-8497-ee38e452b266\") " pod="openstack/neutron-db-sync-s9c9n" Mar 10 08:13:05 crc kubenswrapper[4825]: I0310 08:13:05.749011 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/secret/311dddb8-bb79-4100-8497-ee38e452b266-config\") pod \"neutron-db-sync-s9c9n\" (UID: \"311dddb8-bb79-4100-8497-ee38e452b266\") " pod="openstack/neutron-db-sync-s9c9n" Mar 10 08:13:05 crc kubenswrapper[4825]: I0310 08:13:05.850306 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8phn\" (UniqueName: \"kubernetes.io/projected/311dddb8-bb79-4100-8497-ee38e452b266-kube-api-access-w8phn\") pod \"neutron-db-sync-s9c9n\" (UID: \"311dddb8-bb79-4100-8497-ee38e452b266\") " pod="openstack/neutron-db-sync-s9c9n" Mar 10 08:13:05 crc kubenswrapper[4825]: I0310 08:13:05.850405 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/311dddb8-bb79-4100-8497-ee38e452b266-combined-ca-bundle\") pod \"neutron-db-sync-s9c9n\" (UID: \"311dddb8-bb79-4100-8497-ee38e452b266\") " pod="openstack/neutron-db-sync-s9c9n" Mar 10 08:13:05 crc kubenswrapper[4825]: I0310 08:13:05.850439 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/311dddb8-bb79-4100-8497-ee38e452b266-config\") pod \"neutron-db-sync-s9c9n\" (UID: \"311dddb8-bb79-4100-8497-ee38e452b266\") " pod="openstack/neutron-db-sync-s9c9n" Mar 10 08:13:05 crc kubenswrapper[4825]: I0310 08:13:05.858746 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/311dddb8-bb79-4100-8497-ee38e452b266-combined-ca-bundle\") pod \"neutron-db-sync-s9c9n\" (UID: \"311dddb8-bb79-4100-8497-ee38e452b266\") " pod="openstack/neutron-db-sync-s9c9n" Mar 10 08:13:05 crc kubenswrapper[4825]: I0310 08:13:05.858984 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/311dddb8-bb79-4100-8497-ee38e452b266-config\") pod \"neutron-db-sync-s9c9n\" (UID: 
\"311dddb8-bb79-4100-8497-ee38e452b266\") " pod="openstack/neutron-db-sync-s9c9n" Mar 10 08:13:05 crc kubenswrapper[4825]: I0310 08:13:05.874474 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8phn\" (UniqueName: \"kubernetes.io/projected/311dddb8-bb79-4100-8497-ee38e452b266-kube-api-access-w8phn\") pod \"neutron-db-sync-s9c9n\" (UID: \"311dddb8-bb79-4100-8497-ee38e452b266\") " pod="openstack/neutron-db-sync-s9c9n" Mar 10 08:13:05 crc kubenswrapper[4825]: I0310 08:13:05.977676 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-s9c9n" Mar 10 08:13:06 crc kubenswrapper[4825]: I0310 08:13:06.230500 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-s9c9n"] Mar 10 08:13:06 crc kubenswrapper[4825]: I0310 08:13:06.479383 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-s9c9n" event={"ID":"311dddb8-bb79-4100-8497-ee38e452b266","Type":"ContainerStarted","Data":"c57527c0af5115273139dc4418c1d2b2e55ae48862c11d14b4fe0cce50780c5a"} Mar 10 08:13:06 crc kubenswrapper[4825]: I0310 08:13:06.479618 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-s9c9n" event={"ID":"311dddb8-bb79-4100-8497-ee38e452b266","Type":"ContainerStarted","Data":"2a362b02a029435be1f7381d276b6df1d6ce8596ac0dadf026430e83061520d3"} Mar 10 08:13:10 crc kubenswrapper[4825]: I0310 08:13:10.523644 4825 generic.go:334] "Generic (PLEG): container finished" podID="311dddb8-bb79-4100-8497-ee38e452b266" containerID="c57527c0af5115273139dc4418c1d2b2e55ae48862c11d14b4fe0cce50780c5a" exitCode=0 Mar 10 08:13:10 crc kubenswrapper[4825]: I0310 08:13:10.523780 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-s9c9n" event={"ID":"311dddb8-bb79-4100-8497-ee38e452b266","Type":"ContainerDied","Data":"c57527c0af5115273139dc4418c1d2b2e55ae48862c11d14b4fe0cce50780c5a"} Mar 10 08:13:11 crc 
kubenswrapper[4825]: I0310 08:13:11.890492 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-s9c9n" Mar 10 08:13:11 crc kubenswrapper[4825]: I0310 08:13:11.984980 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/311dddb8-bb79-4100-8497-ee38e452b266-config\") pod \"311dddb8-bb79-4100-8497-ee38e452b266\" (UID: \"311dddb8-bb79-4100-8497-ee38e452b266\") " Mar 10 08:13:11 crc kubenswrapper[4825]: I0310 08:13:11.985040 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8phn\" (UniqueName: \"kubernetes.io/projected/311dddb8-bb79-4100-8497-ee38e452b266-kube-api-access-w8phn\") pod \"311dddb8-bb79-4100-8497-ee38e452b266\" (UID: \"311dddb8-bb79-4100-8497-ee38e452b266\") " Mar 10 08:13:11 crc kubenswrapper[4825]: I0310 08:13:11.985073 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/311dddb8-bb79-4100-8497-ee38e452b266-combined-ca-bundle\") pod \"311dddb8-bb79-4100-8497-ee38e452b266\" (UID: \"311dddb8-bb79-4100-8497-ee38e452b266\") " Mar 10 08:13:11 crc kubenswrapper[4825]: I0310 08:13:11.991040 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/311dddb8-bb79-4100-8497-ee38e452b266-kube-api-access-w8phn" (OuterVolumeSpecName: "kube-api-access-w8phn") pod "311dddb8-bb79-4100-8497-ee38e452b266" (UID: "311dddb8-bb79-4100-8497-ee38e452b266"). InnerVolumeSpecName "kube-api-access-w8phn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:13:12 crc kubenswrapper[4825]: I0310 08:13:12.010735 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/311dddb8-bb79-4100-8497-ee38e452b266-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "311dddb8-bb79-4100-8497-ee38e452b266" (UID: "311dddb8-bb79-4100-8497-ee38e452b266"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:13:12 crc kubenswrapper[4825]: I0310 08:13:12.015323 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/311dddb8-bb79-4100-8497-ee38e452b266-config" (OuterVolumeSpecName: "config") pod "311dddb8-bb79-4100-8497-ee38e452b266" (UID: "311dddb8-bb79-4100-8497-ee38e452b266"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:13:12 crc kubenswrapper[4825]: I0310 08:13:12.086968 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/311dddb8-bb79-4100-8497-ee38e452b266-config\") on node \"crc\" DevicePath \"\"" Mar 10 08:13:12 crc kubenswrapper[4825]: I0310 08:13:12.087328 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8phn\" (UniqueName: \"kubernetes.io/projected/311dddb8-bb79-4100-8497-ee38e452b266-kube-api-access-w8phn\") on node \"crc\" DevicePath \"\"" Mar 10 08:13:12 crc kubenswrapper[4825]: I0310 08:13:12.087345 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/311dddb8-bb79-4100-8497-ee38e452b266-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:13:12 crc kubenswrapper[4825]: I0310 08:13:12.566475 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-s9c9n" 
event={"ID":"311dddb8-bb79-4100-8497-ee38e452b266","Type":"ContainerDied","Data":"2a362b02a029435be1f7381d276b6df1d6ce8596ac0dadf026430e83061520d3"} Mar 10 08:13:12 crc kubenswrapper[4825]: I0310 08:13:12.566520 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a362b02a029435be1f7381d276b6df1d6ce8596ac0dadf026430e83061520d3" Mar 10 08:13:12 crc kubenswrapper[4825]: I0310 08:13:12.566607 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-s9c9n" Mar 10 08:13:12 crc kubenswrapper[4825]: I0310 08:13:12.777484 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f6cf695d7-glngc"] Mar 10 08:13:12 crc kubenswrapper[4825]: E0310 08:13:12.777839 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="311dddb8-bb79-4100-8497-ee38e452b266" containerName="neutron-db-sync" Mar 10 08:13:12 crc kubenswrapper[4825]: I0310 08:13:12.777855 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="311dddb8-bb79-4100-8497-ee38e452b266" containerName="neutron-db-sync" Mar 10 08:13:12 crc kubenswrapper[4825]: I0310 08:13:12.778004 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="311dddb8-bb79-4100-8497-ee38e452b266" containerName="neutron-db-sync" Mar 10 08:13:12 crc kubenswrapper[4825]: I0310 08:13:12.778847 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f6cf695d7-glngc" Mar 10 08:13:12 crc kubenswrapper[4825]: I0310 08:13:12.801845 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f6cf695d7-glngc"] Mar 10 08:13:12 crc kubenswrapper[4825]: I0310 08:13:12.904940 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27777bdb-4fb4-4704-9ccb-df7a83fde963-dns-svc\") pod \"dnsmasq-dns-f6cf695d7-glngc\" (UID: \"27777bdb-4fb4-4704-9ccb-df7a83fde963\") " pod="openstack/dnsmasq-dns-f6cf695d7-glngc" Mar 10 08:13:12 crc kubenswrapper[4825]: I0310 08:13:12.906012 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th4cm\" (UniqueName: \"kubernetes.io/projected/27777bdb-4fb4-4704-9ccb-df7a83fde963-kube-api-access-th4cm\") pod \"dnsmasq-dns-f6cf695d7-glngc\" (UID: \"27777bdb-4fb4-4704-9ccb-df7a83fde963\") " pod="openstack/dnsmasq-dns-f6cf695d7-glngc" Mar 10 08:13:12 crc kubenswrapper[4825]: I0310 08:13:12.906195 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/27777bdb-4fb4-4704-9ccb-df7a83fde963-ovsdbserver-sb\") pod \"dnsmasq-dns-f6cf695d7-glngc\" (UID: \"27777bdb-4fb4-4704-9ccb-df7a83fde963\") " pod="openstack/dnsmasq-dns-f6cf695d7-glngc" Mar 10 08:13:12 crc kubenswrapper[4825]: I0310 08:13:12.906353 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27777bdb-4fb4-4704-9ccb-df7a83fde963-config\") pod \"dnsmasq-dns-f6cf695d7-glngc\" (UID: \"27777bdb-4fb4-4704-9ccb-df7a83fde963\") " pod="openstack/dnsmasq-dns-f6cf695d7-glngc" Mar 10 08:13:12 crc kubenswrapper[4825]: I0310 08:13:12.906519 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27777bdb-4fb4-4704-9ccb-df7a83fde963-ovsdbserver-nb\") pod \"dnsmasq-dns-f6cf695d7-glngc\" (UID: \"27777bdb-4fb4-4704-9ccb-df7a83fde963\") " pod="openstack/dnsmasq-dns-f6cf695d7-glngc" Mar 10 08:13:12 crc kubenswrapper[4825]: I0310 08:13:12.927656 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-78b7969f6-2pchj"] Mar 10 08:13:12 crc kubenswrapper[4825]: I0310 08:13:12.932079 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-78b7969f6-2pchj" Mar 10 08:13:12 crc kubenswrapper[4825]: I0310 08:13:12.938672 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 10 08:13:12 crc kubenswrapper[4825]: I0310 08:13:12.938872 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 10 08:13:12 crc kubenswrapper[4825]: I0310 08:13:12.938922 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 10 08:13:12 crc kubenswrapper[4825]: I0310 08:13:12.941387 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-vhbnf" Mar 10 08:13:12 crc kubenswrapper[4825]: I0310 08:13:12.943106 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-78b7969f6-2pchj"] Mar 10 08:13:13 crc kubenswrapper[4825]: I0310 08:13:13.008595 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-th4cm\" (UniqueName: \"kubernetes.io/projected/27777bdb-4fb4-4704-9ccb-df7a83fde963-kube-api-access-th4cm\") pod \"dnsmasq-dns-f6cf695d7-glngc\" (UID: \"27777bdb-4fb4-4704-9ccb-df7a83fde963\") " pod="openstack/dnsmasq-dns-f6cf695d7-glngc" Mar 10 08:13:13 crc kubenswrapper[4825]: I0310 08:13:13.008683 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/27777bdb-4fb4-4704-9ccb-df7a83fde963-ovsdbserver-sb\") pod \"dnsmasq-dns-f6cf695d7-glngc\" (UID: \"27777bdb-4fb4-4704-9ccb-df7a83fde963\") " pod="openstack/dnsmasq-dns-f6cf695d7-glngc" Mar 10 08:13:13 crc kubenswrapper[4825]: I0310 08:13:13.008720 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3584c742-f634-450e-a0e0-f7b58137efd5-httpd-config\") pod \"neutron-78b7969f6-2pchj\" (UID: \"3584c742-f634-450e-a0e0-f7b58137efd5\") " pod="openstack/neutron-78b7969f6-2pchj" Mar 10 08:13:13 crc kubenswrapper[4825]: I0310 08:13:13.008750 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27777bdb-4fb4-4704-9ccb-df7a83fde963-config\") pod \"dnsmasq-dns-f6cf695d7-glngc\" (UID: \"27777bdb-4fb4-4704-9ccb-df7a83fde963\") " pod="openstack/dnsmasq-dns-f6cf695d7-glngc" Mar 10 08:13:13 crc kubenswrapper[4825]: I0310 08:13:13.008784 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3584c742-f634-450e-a0e0-f7b58137efd5-ovndb-tls-certs\") pod \"neutron-78b7969f6-2pchj\" (UID: \"3584c742-f634-450e-a0e0-f7b58137efd5\") " pod="openstack/neutron-78b7969f6-2pchj" Mar 10 08:13:13 crc kubenswrapper[4825]: I0310 08:13:13.008827 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27777bdb-4fb4-4704-9ccb-df7a83fde963-ovsdbserver-nb\") pod \"dnsmasq-dns-f6cf695d7-glngc\" (UID: \"27777bdb-4fb4-4704-9ccb-df7a83fde963\") " pod="openstack/dnsmasq-dns-f6cf695d7-glngc" Mar 10 08:13:13 crc kubenswrapper[4825]: I0310 08:13:13.008881 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnt4s\" (UniqueName: 
\"kubernetes.io/projected/3584c742-f634-450e-a0e0-f7b58137efd5-kube-api-access-hnt4s\") pod \"neutron-78b7969f6-2pchj\" (UID: \"3584c742-f634-450e-a0e0-f7b58137efd5\") " pod="openstack/neutron-78b7969f6-2pchj" Mar 10 08:13:13 crc kubenswrapper[4825]: I0310 08:13:13.008934 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3584c742-f634-450e-a0e0-f7b58137efd5-config\") pod \"neutron-78b7969f6-2pchj\" (UID: \"3584c742-f634-450e-a0e0-f7b58137efd5\") " pod="openstack/neutron-78b7969f6-2pchj" Mar 10 08:13:13 crc kubenswrapper[4825]: I0310 08:13:13.008999 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27777bdb-4fb4-4704-9ccb-df7a83fde963-dns-svc\") pod \"dnsmasq-dns-f6cf695d7-glngc\" (UID: \"27777bdb-4fb4-4704-9ccb-df7a83fde963\") " pod="openstack/dnsmasq-dns-f6cf695d7-glngc" Mar 10 08:13:13 crc kubenswrapper[4825]: I0310 08:13:13.009061 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3584c742-f634-450e-a0e0-f7b58137efd5-combined-ca-bundle\") pod \"neutron-78b7969f6-2pchj\" (UID: \"3584c742-f634-450e-a0e0-f7b58137efd5\") " pod="openstack/neutron-78b7969f6-2pchj" Mar 10 08:13:13 crc kubenswrapper[4825]: I0310 08:13:13.009603 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/27777bdb-4fb4-4704-9ccb-df7a83fde963-ovsdbserver-sb\") pod \"dnsmasq-dns-f6cf695d7-glngc\" (UID: \"27777bdb-4fb4-4704-9ccb-df7a83fde963\") " pod="openstack/dnsmasq-dns-f6cf695d7-glngc" Mar 10 08:13:13 crc kubenswrapper[4825]: I0310 08:13:13.009889 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27777bdb-4fb4-4704-9ccb-df7a83fde963-ovsdbserver-nb\") pod 
\"dnsmasq-dns-f6cf695d7-glngc\" (UID: \"27777bdb-4fb4-4704-9ccb-df7a83fde963\") " pod="openstack/dnsmasq-dns-f6cf695d7-glngc" Mar 10 08:13:13 crc kubenswrapper[4825]: I0310 08:13:13.009958 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27777bdb-4fb4-4704-9ccb-df7a83fde963-config\") pod \"dnsmasq-dns-f6cf695d7-glngc\" (UID: \"27777bdb-4fb4-4704-9ccb-df7a83fde963\") " pod="openstack/dnsmasq-dns-f6cf695d7-glngc" Mar 10 08:13:13 crc kubenswrapper[4825]: I0310 08:13:13.010251 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27777bdb-4fb4-4704-9ccb-df7a83fde963-dns-svc\") pod \"dnsmasq-dns-f6cf695d7-glngc\" (UID: \"27777bdb-4fb4-4704-9ccb-df7a83fde963\") " pod="openstack/dnsmasq-dns-f6cf695d7-glngc" Mar 10 08:13:13 crc kubenswrapper[4825]: I0310 08:13:13.028991 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-th4cm\" (UniqueName: \"kubernetes.io/projected/27777bdb-4fb4-4704-9ccb-df7a83fde963-kube-api-access-th4cm\") pod \"dnsmasq-dns-f6cf695d7-glngc\" (UID: \"27777bdb-4fb4-4704-9ccb-df7a83fde963\") " pod="openstack/dnsmasq-dns-f6cf695d7-glngc" Mar 10 08:13:13 crc kubenswrapper[4825]: I0310 08:13:13.099964 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f6cf695d7-glngc" Mar 10 08:13:13 crc kubenswrapper[4825]: I0310 08:13:13.111301 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3584c742-f634-450e-a0e0-f7b58137efd5-config\") pod \"neutron-78b7969f6-2pchj\" (UID: \"3584c742-f634-450e-a0e0-f7b58137efd5\") " pod="openstack/neutron-78b7969f6-2pchj" Mar 10 08:13:13 crc kubenswrapper[4825]: I0310 08:13:13.111372 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3584c742-f634-450e-a0e0-f7b58137efd5-combined-ca-bundle\") pod \"neutron-78b7969f6-2pchj\" (UID: \"3584c742-f634-450e-a0e0-f7b58137efd5\") " pod="openstack/neutron-78b7969f6-2pchj" Mar 10 08:13:13 crc kubenswrapper[4825]: I0310 08:13:13.111456 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3584c742-f634-450e-a0e0-f7b58137efd5-httpd-config\") pod \"neutron-78b7969f6-2pchj\" (UID: \"3584c742-f634-450e-a0e0-f7b58137efd5\") " pod="openstack/neutron-78b7969f6-2pchj" Mar 10 08:13:13 crc kubenswrapper[4825]: I0310 08:13:13.111489 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3584c742-f634-450e-a0e0-f7b58137efd5-ovndb-tls-certs\") pod \"neutron-78b7969f6-2pchj\" (UID: \"3584c742-f634-450e-a0e0-f7b58137efd5\") " pod="openstack/neutron-78b7969f6-2pchj" Mar 10 08:13:13 crc kubenswrapper[4825]: I0310 08:13:13.111568 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnt4s\" (UniqueName: \"kubernetes.io/projected/3584c742-f634-450e-a0e0-f7b58137efd5-kube-api-access-hnt4s\") pod \"neutron-78b7969f6-2pchj\" (UID: \"3584c742-f634-450e-a0e0-f7b58137efd5\") " pod="openstack/neutron-78b7969f6-2pchj" Mar 10 08:13:13 crc kubenswrapper[4825]: 
I0310 08:13:13.116005 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3584c742-f634-450e-a0e0-f7b58137efd5-httpd-config\") pod \"neutron-78b7969f6-2pchj\" (UID: \"3584c742-f634-450e-a0e0-f7b58137efd5\") " pod="openstack/neutron-78b7969f6-2pchj" Mar 10 08:13:13 crc kubenswrapper[4825]: I0310 08:13:13.116660 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3584c742-f634-450e-a0e0-f7b58137efd5-config\") pod \"neutron-78b7969f6-2pchj\" (UID: \"3584c742-f634-450e-a0e0-f7b58137efd5\") " pod="openstack/neutron-78b7969f6-2pchj" Mar 10 08:13:13 crc kubenswrapper[4825]: I0310 08:13:13.119684 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3584c742-f634-450e-a0e0-f7b58137efd5-ovndb-tls-certs\") pod \"neutron-78b7969f6-2pchj\" (UID: \"3584c742-f634-450e-a0e0-f7b58137efd5\") " pod="openstack/neutron-78b7969f6-2pchj" Mar 10 08:13:13 crc kubenswrapper[4825]: I0310 08:13:13.119844 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3584c742-f634-450e-a0e0-f7b58137efd5-combined-ca-bundle\") pod \"neutron-78b7969f6-2pchj\" (UID: \"3584c742-f634-450e-a0e0-f7b58137efd5\") " pod="openstack/neutron-78b7969f6-2pchj" Mar 10 08:13:13 crc kubenswrapper[4825]: I0310 08:13:13.138469 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnt4s\" (UniqueName: \"kubernetes.io/projected/3584c742-f634-450e-a0e0-f7b58137efd5-kube-api-access-hnt4s\") pod \"neutron-78b7969f6-2pchj\" (UID: \"3584c742-f634-450e-a0e0-f7b58137efd5\") " pod="openstack/neutron-78b7969f6-2pchj" Mar 10 08:13:13 crc kubenswrapper[4825]: I0310 08:13:13.251578 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-78b7969f6-2pchj" Mar 10 08:13:13 crc kubenswrapper[4825]: I0310 08:13:13.377025 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f6cf695d7-glngc"] Mar 10 08:13:13 crc kubenswrapper[4825]: W0310 08:13:13.390073 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27777bdb_4fb4_4704_9ccb_df7a83fde963.slice/crio-ef2489eb579fa52c7c2401f5f49b11ed2c4ff01f7501ceb449be010783431bb7 WatchSource:0}: Error finding container ef2489eb579fa52c7c2401f5f49b11ed2c4ff01f7501ceb449be010783431bb7: Status 404 returned error can't find the container with id ef2489eb579fa52c7c2401f5f49b11ed2c4ff01f7501ceb449be010783431bb7 Mar 10 08:13:13 crc kubenswrapper[4825]: I0310 08:13:13.582756 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f6cf695d7-glngc" event={"ID":"27777bdb-4fb4-4704-9ccb-df7a83fde963","Type":"ContainerStarted","Data":"ef2489eb579fa52c7c2401f5f49b11ed2c4ff01f7501ceb449be010783431bb7"} Mar 10 08:13:13 crc kubenswrapper[4825]: I0310 08:13:13.900951 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-78b7969f6-2pchj"] Mar 10 08:13:13 crc kubenswrapper[4825]: W0310 08:13:13.914360 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3584c742_f634_450e_a0e0_f7b58137efd5.slice/crio-a2248e7acda873a76dce4bef5ab3344f99028feaa7bce8a68928b59472931f82 WatchSource:0}: Error finding container a2248e7acda873a76dce4bef5ab3344f99028feaa7bce8a68928b59472931f82: Status 404 returned error can't find the container with id a2248e7acda873a76dce4bef5ab3344f99028feaa7bce8a68928b59472931f82 Mar 10 08:13:14 crc kubenswrapper[4825]: I0310 08:13:14.593194 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78b7969f6-2pchj" 
event={"ID":"3584c742-f634-450e-a0e0-f7b58137efd5","Type":"ContainerStarted","Data":"8505c43db0640117759a114a90eb749089ae2677bed12c6b3d1a24bb8efdd414"} Mar 10 08:13:14 crc kubenswrapper[4825]: I0310 08:13:14.593530 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78b7969f6-2pchj" event={"ID":"3584c742-f634-450e-a0e0-f7b58137efd5","Type":"ContainerStarted","Data":"1fbab6f62014d9031910313fc8f525aa40d47072232e9fec4695a9e7ebf91965"} Mar 10 08:13:14 crc kubenswrapper[4825]: I0310 08:13:14.593558 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-78b7969f6-2pchj" Mar 10 08:13:14 crc kubenswrapper[4825]: I0310 08:13:14.593574 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78b7969f6-2pchj" event={"ID":"3584c742-f634-450e-a0e0-f7b58137efd5","Type":"ContainerStarted","Data":"a2248e7acda873a76dce4bef5ab3344f99028feaa7bce8a68928b59472931f82"} Mar 10 08:13:14 crc kubenswrapper[4825]: I0310 08:13:14.595281 4825 generic.go:334] "Generic (PLEG): container finished" podID="27777bdb-4fb4-4704-9ccb-df7a83fde963" containerID="008822619d45bc01dc85394373c8c5ccf93140832050a2d8a4548874405727ed" exitCode=0 Mar 10 08:13:14 crc kubenswrapper[4825]: I0310 08:13:14.595318 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f6cf695d7-glngc" event={"ID":"27777bdb-4fb4-4704-9ccb-df7a83fde963","Type":"ContainerDied","Data":"008822619d45bc01dc85394373c8c5ccf93140832050a2d8a4548874405727ed"} Mar 10 08:13:14 crc kubenswrapper[4825]: I0310 08:13:14.620554 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-78b7969f6-2pchj" podStartSLOduration=2.620532283 podStartE2EDuration="2.620532283s" podCreationTimestamp="2026-03-10 08:13:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:13:14.61814932 +0000 UTC m=+5347.647929935" 
watchObservedRunningTime="2026-03-10 08:13:14.620532283 +0000 UTC m=+5347.650312898" Mar 10 08:13:15 crc kubenswrapper[4825]: I0310 08:13:15.247279 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-cd8d6ccb7-tj99q"] Mar 10 08:13:15 crc kubenswrapper[4825]: I0310 08:13:15.251671 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-cd8d6ccb7-tj99q" Mar 10 08:13:15 crc kubenswrapper[4825]: I0310 08:13:15.256837 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 10 08:13:15 crc kubenswrapper[4825]: I0310 08:13:15.257057 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 10 08:13:15 crc kubenswrapper[4825]: I0310 08:13:15.264479 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-cd8d6ccb7-tj99q"] Mar 10 08:13:15 crc kubenswrapper[4825]: I0310 08:13:15.368770 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b23b1e12-ba16-4288-857c-116ccab84267-ovndb-tls-certs\") pod \"neutron-cd8d6ccb7-tj99q\" (UID: \"b23b1e12-ba16-4288-857c-116ccab84267\") " pod="openstack/neutron-cd8d6ccb7-tj99q" Mar 10 08:13:15 crc kubenswrapper[4825]: I0310 08:13:15.368983 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b23b1e12-ba16-4288-857c-116ccab84267-internal-tls-certs\") pod \"neutron-cd8d6ccb7-tj99q\" (UID: \"b23b1e12-ba16-4288-857c-116ccab84267\") " pod="openstack/neutron-cd8d6ccb7-tj99q" Mar 10 08:13:15 crc kubenswrapper[4825]: I0310 08:13:15.369199 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46662\" (UniqueName: 
\"kubernetes.io/projected/b23b1e12-ba16-4288-857c-116ccab84267-kube-api-access-46662\") pod \"neutron-cd8d6ccb7-tj99q\" (UID: \"b23b1e12-ba16-4288-857c-116ccab84267\") " pod="openstack/neutron-cd8d6ccb7-tj99q" Mar 10 08:13:15 crc kubenswrapper[4825]: I0310 08:13:15.369317 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b23b1e12-ba16-4288-857c-116ccab84267-public-tls-certs\") pod \"neutron-cd8d6ccb7-tj99q\" (UID: \"b23b1e12-ba16-4288-857c-116ccab84267\") " pod="openstack/neutron-cd8d6ccb7-tj99q" Mar 10 08:13:15 crc kubenswrapper[4825]: I0310 08:13:15.369393 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b23b1e12-ba16-4288-857c-116ccab84267-config\") pod \"neutron-cd8d6ccb7-tj99q\" (UID: \"b23b1e12-ba16-4288-857c-116ccab84267\") " pod="openstack/neutron-cd8d6ccb7-tj99q" Mar 10 08:13:15 crc kubenswrapper[4825]: I0310 08:13:15.369547 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b23b1e12-ba16-4288-857c-116ccab84267-combined-ca-bundle\") pod \"neutron-cd8d6ccb7-tj99q\" (UID: \"b23b1e12-ba16-4288-857c-116ccab84267\") " pod="openstack/neutron-cd8d6ccb7-tj99q" Mar 10 08:13:15 crc kubenswrapper[4825]: I0310 08:13:15.369666 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b23b1e12-ba16-4288-857c-116ccab84267-httpd-config\") pod \"neutron-cd8d6ccb7-tj99q\" (UID: \"b23b1e12-ba16-4288-857c-116ccab84267\") " pod="openstack/neutron-cd8d6ccb7-tj99q" Mar 10 08:13:15 crc kubenswrapper[4825]: I0310 08:13:15.471544 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b23b1e12-ba16-4288-857c-116ccab84267-internal-tls-certs\") pod \"neutron-cd8d6ccb7-tj99q\" (UID: \"b23b1e12-ba16-4288-857c-116ccab84267\") " pod="openstack/neutron-cd8d6ccb7-tj99q" Mar 10 08:13:15 crc kubenswrapper[4825]: I0310 08:13:15.471608 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46662\" (UniqueName: \"kubernetes.io/projected/b23b1e12-ba16-4288-857c-116ccab84267-kube-api-access-46662\") pod \"neutron-cd8d6ccb7-tj99q\" (UID: \"b23b1e12-ba16-4288-857c-116ccab84267\") " pod="openstack/neutron-cd8d6ccb7-tj99q" Mar 10 08:13:15 crc kubenswrapper[4825]: I0310 08:13:15.471645 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b23b1e12-ba16-4288-857c-116ccab84267-public-tls-certs\") pod \"neutron-cd8d6ccb7-tj99q\" (UID: \"b23b1e12-ba16-4288-857c-116ccab84267\") " pod="openstack/neutron-cd8d6ccb7-tj99q" Mar 10 08:13:15 crc kubenswrapper[4825]: I0310 08:13:15.471669 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b23b1e12-ba16-4288-857c-116ccab84267-config\") pod \"neutron-cd8d6ccb7-tj99q\" (UID: \"b23b1e12-ba16-4288-857c-116ccab84267\") " pod="openstack/neutron-cd8d6ccb7-tj99q" Mar 10 08:13:15 crc kubenswrapper[4825]: I0310 08:13:15.471709 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b23b1e12-ba16-4288-857c-116ccab84267-combined-ca-bundle\") pod \"neutron-cd8d6ccb7-tj99q\" (UID: \"b23b1e12-ba16-4288-857c-116ccab84267\") " pod="openstack/neutron-cd8d6ccb7-tj99q" Mar 10 08:13:15 crc kubenswrapper[4825]: I0310 08:13:15.471744 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b23b1e12-ba16-4288-857c-116ccab84267-httpd-config\") pod 
\"neutron-cd8d6ccb7-tj99q\" (UID: \"b23b1e12-ba16-4288-857c-116ccab84267\") " pod="openstack/neutron-cd8d6ccb7-tj99q" Mar 10 08:13:15 crc kubenswrapper[4825]: I0310 08:13:15.471766 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b23b1e12-ba16-4288-857c-116ccab84267-ovndb-tls-certs\") pod \"neutron-cd8d6ccb7-tj99q\" (UID: \"b23b1e12-ba16-4288-857c-116ccab84267\") " pod="openstack/neutron-cd8d6ccb7-tj99q" Mar 10 08:13:15 crc kubenswrapper[4825]: I0310 08:13:15.477043 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b23b1e12-ba16-4288-857c-116ccab84267-internal-tls-certs\") pod \"neutron-cd8d6ccb7-tj99q\" (UID: \"b23b1e12-ba16-4288-857c-116ccab84267\") " pod="openstack/neutron-cd8d6ccb7-tj99q" Mar 10 08:13:15 crc kubenswrapper[4825]: I0310 08:13:15.477208 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b23b1e12-ba16-4288-857c-116ccab84267-config\") pod \"neutron-cd8d6ccb7-tj99q\" (UID: \"b23b1e12-ba16-4288-857c-116ccab84267\") " pod="openstack/neutron-cd8d6ccb7-tj99q" Mar 10 08:13:15 crc kubenswrapper[4825]: I0310 08:13:15.478542 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b23b1e12-ba16-4288-857c-116ccab84267-ovndb-tls-certs\") pod \"neutron-cd8d6ccb7-tj99q\" (UID: \"b23b1e12-ba16-4288-857c-116ccab84267\") " pod="openstack/neutron-cd8d6ccb7-tj99q" Mar 10 08:13:15 crc kubenswrapper[4825]: I0310 08:13:15.483401 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b23b1e12-ba16-4288-857c-116ccab84267-public-tls-certs\") pod \"neutron-cd8d6ccb7-tj99q\" (UID: \"b23b1e12-ba16-4288-857c-116ccab84267\") " pod="openstack/neutron-cd8d6ccb7-tj99q" Mar 10 08:13:15 crc 
kubenswrapper[4825]: I0310 08:13:15.483605 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b23b1e12-ba16-4288-857c-116ccab84267-combined-ca-bundle\") pod \"neutron-cd8d6ccb7-tj99q\" (UID: \"b23b1e12-ba16-4288-857c-116ccab84267\") " pod="openstack/neutron-cd8d6ccb7-tj99q" Mar 10 08:13:15 crc kubenswrapper[4825]: I0310 08:13:15.483736 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b23b1e12-ba16-4288-857c-116ccab84267-httpd-config\") pod \"neutron-cd8d6ccb7-tj99q\" (UID: \"b23b1e12-ba16-4288-857c-116ccab84267\") " pod="openstack/neutron-cd8d6ccb7-tj99q" Mar 10 08:13:15 crc kubenswrapper[4825]: I0310 08:13:15.503822 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46662\" (UniqueName: \"kubernetes.io/projected/b23b1e12-ba16-4288-857c-116ccab84267-kube-api-access-46662\") pod \"neutron-cd8d6ccb7-tj99q\" (UID: \"b23b1e12-ba16-4288-857c-116ccab84267\") " pod="openstack/neutron-cd8d6ccb7-tj99q" Mar 10 08:13:15 crc kubenswrapper[4825]: I0310 08:13:15.573728 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-cd8d6ccb7-tj99q" Mar 10 08:13:15 crc kubenswrapper[4825]: I0310 08:13:15.610061 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f6cf695d7-glngc" event={"ID":"27777bdb-4fb4-4704-9ccb-df7a83fde963","Type":"ContainerStarted","Data":"d8d111e7061e5eb38b8fc0fc2bda73f06b2b419ff2ca5ed768b8c753a206df58"} Mar 10 08:13:15 crc kubenswrapper[4825]: I0310 08:13:15.610370 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-f6cf695d7-glngc" Mar 10 08:13:15 crc kubenswrapper[4825]: I0310 08:13:15.629675 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f6cf695d7-glngc" podStartSLOduration=3.6296526399999998 podStartE2EDuration="3.62965264s" podCreationTimestamp="2026-03-10 08:13:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:13:15.62889482 +0000 UTC m=+5348.658675455" watchObservedRunningTime="2026-03-10 08:13:15.62965264 +0000 UTC m=+5348.659433255" Mar 10 08:13:16 crc kubenswrapper[4825]: I0310 08:13:16.769433 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-cd8d6ccb7-tj99q"] Mar 10 08:13:16 crc kubenswrapper[4825]: W0310 08:13:16.771307 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb23b1e12_ba16_4288_857c_116ccab84267.slice/crio-31c1ab614e0923a43819e9e5a6738027b669749c052ad501496d3e2279d3f257 WatchSource:0}: Error finding container 31c1ab614e0923a43819e9e5a6738027b669749c052ad501496d3e2279d3f257: Status 404 returned error can't find the container with id 31c1ab614e0923a43819e9e5a6738027b669749c052ad501496d3e2279d3f257 Mar 10 08:13:16 crc kubenswrapper[4825]: I0310 08:13:16.888648 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 08:13:16 crc kubenswrapper[4825]: I0310 08:13:16.888936 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 08:13:17 crc kubenswrapper[4825]: I0310 08:13:17.634482 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cd8d6ccb7-tj99q" event={"ID":"b23b1e12-ba16-4288-857c-116ccab84267","Type":"ContainerStarted","Data":"dc66f314976daa6ec31f372764c24373afc7c1dcdd035f1de5b90fa6c50cb756"} Mar 10 08:13:17 crc kubenswrapper[4825]: I0310 08:13:17.634977 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-cd8d6ccb7-tj99q" Mar 10 08:13:17 crc kubenswrapper[4825]: I0310 08:13:17.634997 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cd8d6ccb7-tj99q" event={"ID":"b23b1e12-ba16-4288-857c-116ccab84267","Type":"ContainerStarted","Data":"39b2929673d595cb50ff3804b281334e6436b789861f3360d7adcd0c0c674c95"} Mar 10 08:13:17 crc kubenswrapper[4825]: I0310 08:13:17.635012 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cd8d6ccb7-tj99q" event={"ID":"b23b1e12-ba16-4288-857c-116ccab84267","Type":"ContainerStarted","Data":"31c1ab614e0923a43819e9e5a6738027b669749c052ad501496d3e2279d3f257"} Mar 10 08:13:17 crc kubenswrapper[4825]: I0310 08:13:17.663590 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-cd8d6ccb7-tj99q" podStartSLOduration=2.663569448 podStartE2EDuration="2.663569448s" podCreationTimestamp="2026-03-10 08:13:15 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:13:17.654422178 +0000 UTC m=+5350.684202793" watchObservedRunningTime="2026-03-10 08:13:17.663569448 +0000 UTC m=+5350.693350063" Mar 10 08:13:23 crc kubenswrapper[4825]: I0310 08:13:23.102097 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-f6cf695d7-glngc" Mar 10 08:13:23 crc kubenswrapper[4825]: I0310 08:13:23.175032 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5645ccfdc5-zldk4"] Mar 10 08:13:23 crc kubenswrapper[4825]: I0310 08:13:23.175323 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5645ccfdc5-zldk4" podUID="ec60a834-532a-45ec-ac78-cc5954ab9d85" containerName="dnsmasq-dns" containerID="cri-o://1c18cf7a8d81a88fb7f7c509bf70fa564fdef115c481b793ed4d7113f8cc6f48" gracePeriod=10 Mar 10 08:13:23 crc kubenswrapper[4825]: I0310 08:13:23.685597 4825 generic.go:334] "Generic (PLEG): container finished" podID="ec60a834-532a-45ec-ac78-cc5954ab9d85" containerID="1c18cf7a8d81a88fb7f7c509bf70fa564fdef115c481b793ed4d7113f8cc6f48" exitCode=0 Mar 10 08:13:23 crc kubenswrapper[4825]: I0310 08:13:23.685667 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5645ccfdc5-zldk4" event={"ID":"ec60a834-532a-45ec-ac78-cc5954ab9d85","Type":"ContainerDied","Data":"1c18cf7a8d81a88fb7f7c509bf70fa564fdef115c481b793ed4d7113f8cc6f48"} Mar 10 08:13:23 crc kubenswrapper[4825]: I0310 08:13:23.685978 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5645ccfdc5-zldk4" event={"ID":"ec60a834-532a-45ec-ac78-cc5954ab9d85","Type":"ContainerDied","Data":"03e84782fb95fa0b4284c76df0f0b6b92fa763801f20ac45d1137cd69cc6fccd"} Mar 10 08:13:23 crc kubenswrapper[4825]: I0310 08:13:23.686000 4825 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="03e84782fb95fa0b4284c76df0f0b6b92fa763801f20ac45d1137cd69cc6fccd" Mar 10 08:13:23 crc kubenswrapper[4825]: I0310 08:13:23.733288 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5645ccfdc5-zldk4" Mar 10 08:13:23 crc kubenswrapper[4825]: I0310 08:13:23.829122 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec60a834-532a-45ec-ac78-cc5954ab9d85-dns-svc\") pod \"ec60a834-532a-45ec-ac78-cc5954ab9d85\" (UID: \"ec60a834-532a-45ec-ac78-cc5954ab9d85\") " Mar 10 08:13:23 crc kubenswrapper[4825]: I0310 08:13:23.829310 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec60a834-532a-45ec-ac78-cc5954ab9d85-ovsdbserver-nb\") pod \"ec60a834-532a-45ec-ac78-cc5954ab9d85\" (UID: \"ec60a834-532a-45ec-ac78-cc5954ab9d85\") " Mar 10 08:13:23 crc kubenswrapper[4825]: I0310 08:13:23.829355 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec60a834-532a-45ec-ac78-cc5954ab9d85-config\") pod \"ec60a834-532a-45ec-ac78-cc5954ab9d85\" (UID: \"ec60a834-532a-45ec-ac78-cc5954ab9d85\") " Mar 10 08:13:23 crc kubenswrapper[4825]: I0310 08:13:23.829427 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec60a834-532a-45ec-ac78-cc5954ab9d85-ovsdbserver-sb\") pod \"ec60a834-532a-45ec-ac78-cc5954ab9d85\" (UID: \"ec60a834-532a-45ec-ac78-cc5954ab9d85\") " Mar 10 08:13:23 crc kubenswrapper[4825]: I0310 08:13:23.829470 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nr5d7\" (UniqueName: \"kubernetes.io/projected/ec60a834-532a-45ec-ac78-cc5954ab9d85-kube-api-access-nr5d7\") pod \"ec60a834-532a-45ec-ac78-cc5954ab9d85\" (UID: 
\"ec60a834-532a-45ec-ac78-cc5954ab9d85\") " Mar 10 08:13:23 crc kubenswrapper[4825]: I0310 08:13:23.835079 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec60a834-532a-45ec-ac78-cc5954ab9d85-kube-api-access-nr5d7" (OuterVolumeSpecName: "kube-api-access-nr5d7") pod "ec60a834-532a-45ec-ac78-cc5954ab9d85" (UID: "ec60a834-532a-45ec-ac78-cc5954ab9d85"). InnerVolumeSpecName "kube-api-access-nr5d7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:13:23 crc kubenswrapper[4825]: I0310 08:13:23.882118 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec60a834-532a-45ec-ac78-cc5954ab9d85-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ec60a834-532a-45ec-ac78-cc5954ab9d85" (UID: "ec60a834-532a-45ec-ac78-cc5954ab9d85"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:13:23 crc kubenswrapper[4825]: I0310 08:13:23.887403 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec60a834-532a-45ec-ac78-cc5954ab9d85-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ec60a834-532a-45ec-ac78-cc5954ab9d85" (UID: "ec60a834-532a-45ec-ac78-cc5954ab9d85"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:13:23 crc kubenswrapper[4825]: I0310 08:13:23.890798 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec60a834-532a-45ec-ac78-cc5954ab9d85-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ec60a834-532a-45ec-ac78-cc5954ab9d85" (UID: "ec60a834-532a-45ec-ac78-cc5954ab9d85"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:13:23 crc kubenswrapper[4825]: I0310 08:13:23.896330 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec60a834-532a-45ec-ac78-cc5954ab9d85-config" (OuterVolumeSpecName: "config") pod "ec60a834-532a-45ec-ac78-cc5954ab9d85" (UID: "ec60a834-532a-45ec-ac78-cc5954ab9d85"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:13:23 crc kubenswrapper[4825]: I0310 08:13:23.931256 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec60a834-532a-45ec-ac78-cc5954ab9d85-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 08:13:23 crc kubenswrapper[4825]: I0310 08:13:23.931504 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec60a834-532a-45ec-ac78-cc5954ab9d85-config\") on node \"crc\" DevicePath \"\"" Mar 10 08:13:23 crc kubenswrapper[4825]: I0310 08:13:23.931583 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec60a834-532a-45ec-ac78-cc5954ab9d85-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 08:13:23 crc kubenswrapper[4825]: I0310 08:13:23.931653 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nr5d7\" (UniqueName: \"kubernetes.io/projected/ec60a834-532a-45ec-ac78-cc5954ab9d85-kube-api-access-nr5d7\") on node \"crc\" DevicePath \"\"" Mar 10 08:13:23 crc kubenswrapper[4825]: I0310 08:13:23.931735 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec60a834-532a-45ec-ac78-cc5954ab9d85-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 08:13:24 crc kubenswrapper[4825]: I0310 08:13:24.694217 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5645ccfdc5-zldk4" Mar 10 08:13:24 crc kubenswrapper[4825]: I0310 08:13:24.728353 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5645ccfdc5-zldk4"] Mar 10 08:13:24 crc kubenswrapper[4825]: I0310 08:13:24.735265 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5645ccfdc5-zldk4"] Mar 10 08:13:25 crc kubenswrapper[4825]: I0310 08:13:25.244972 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec60a834-532a-45ec-ac78-cc5954ab9d85" path="/var/lib/kubelet/pods/ec60a834-532a-45ec-ac78-cc5954ab9d85/volumes" Mar 10 08:13:43 crc kubenswrapper[4825]: I0310 08:13:43.265236 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-78b7969f6-2pchj" Mar 10 08:13:45 crc kubenswrapper[4825]: I0310 08:13:45.059460 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-pbrmr"] Mar 10 08:13:45 crc kubenswrapper[4825]: I0310 08:13:45.067588 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-pbrmr"] Mar 10 08:13:45 crc kubenswrapper[4825]: I0310 08:13:45.249231 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c42f77e-dffc-402b-adce-0a83e6a92739" path="/var/lib/kubelet/pods/9c42f77e-dffc-402b-adce-0a83e6a92739/volumes" Mar 10 08:13:45 crc kubenswrapper[4825]: I0310 08:13:45.596168 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-cd8d6ccb7-tj99q" Mar 10 08:13:45 crc kubenswrapper[4825]: I0310 08:13:45.678402 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-78b7969f6-2pchj"] Mar 10 08:13:45 crc kubenswrapper[4825]: I0310 08:13:45.678605 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-78b7969f6-2pchj" podUID="3584c742-f634-450e-a0e0-f7b58137efd5" containerName="neutron-api" 
containerID="cri-o://1fbab6f62014d9031910313fc8f525aa40d47072232e9fec4695a9e7ebf91965" gracePeriod=30 Mar 10 08:13:45 crc kubenswrapper[4825]: I0310 08:13:45.679850 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-78b7969f6-2pchj" podUID="3584c742-f634-450e-a0e0-f7b58137efd5" containerName="neutron-httpd" containerID="cri-o://8505c43db0640117759a114a90eb749089ae2677bed12c6b3d1a24bb8efdd414" gracePeriod=30 Mar 10 08:13:45 crc kubenswrapper[4825]: I0310 08:13:45.950363 4825 generic.go:334] "Generic (PLEG): container finished" podID="3584c742-f634-450e-a0e0-f7b58137efd5" containerID="8505c43db0640117759a114a90eb749089ae2677bed12c6b3d1a24bb8efdd414" exitCode=0 Mar 10 08:13:45 crc kubenswrapper[4825]: I0310 08:13:45.950976 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78b7969f6-2pchj" event={"ID":"3584c742-f634-450e-a0e0-f7b58137efd5","Type":"ContainerDied","Data":"8505c43db0640117759a114a90eb749089ae2677bed12c6b3d1a24bb8efdd414"} Mar 10 08:13:46 crc kubenswrapper[4825]: I0310 08:13:46.888054 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 08:13:46 crc kubenswrapper[4825]: I0310 08:13:46.888645 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 08:13:46 crc kubenswrapper[4825]: I0310 08:13:46.890304 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" Mar 10 08:13:46 crc 
kubenswrapper[4825]: I0310 08:13:46.892284 4825 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cc40bbe7e5127cd8ca03c26dd788c9bfc628fd248484a1f5f73946709b5b4414"} pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 08:13:46 crc kubenswrapper[4825]: I0310 08:13:46.892404 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" containerID="cri-o://cc40bbe7e5127cd8ca03c26dd788c9bfc628fd248484a1f5f73946709b5b4414" gracePeriod=600 Mar 10 08:13:47 crc kubenswrapper[4825]: E0310 08:13:47.043321 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:13:47 crc kubenswrapper[4825]: I0310 08:13:47.969051 4825 generic.go:334] "Generic (PLEG): container finished" podID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerID="cc40bbe7e5127cd8ca03c26dd788c9bfc628fd248484a1f5f73946709b5b4414" exitCode=0 Mar 10 08:13:47 crc kubenswrapper[4825]: I0310 08:13:47.969094 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerDied","Data":"cc40bbe7e5127cd8ca03c26dd788c9bfc628fd248484a1f5f73946709b5b4414"} Mar 10 08:13:47 crc kubenswrapper[4825]: I0310 08:13:47.969125 4825 scope.go:117] "RemoveContainer" 
containerID="8802fd3b03472e4fc7e7e01ba0c1a6885938d2d647b76d216f70475068123de1" Mar 10 08:13:47 crc kubenswrapper[4825]: I0310 08:13:47.969716 4825 scope.go:117] "RemoveContainer" containerID="cc40bbe7e5127cd8ca03c26dd788c9bfc628fd248484a1f5f73946709b5b4414" Mar 10 08:13:47 crc kubenswrapper[4825]: E0310 08:13:47.970039 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:13:48 crc kubenswrapper[4825]: I0310 08:13:48.993933 4825 generic.go:334] "Generic (PLEG): container finished" podID="3584c742-f634-450e-a0e0-f7b58137efd5" containerID="1fbab6f62014d9031910313fc8f525aa40d47072232e9fec4695a9e7ebf91965" exitCode=0 Mar 10 08:13:48 crc kubenswrapper[4825]: I0310 08:13:48.994436 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78b7969f6-2pchj" event={"ID":"3584c742-f634-450e-a0e0-f7b58137efd5","Type":"ContainerDied","Data":"1fbab6f62014d9031910313fc8f525aa40d47072232e9fec4695a9e7ebf91965"} Mar 10 08:13:49 crc kubenswrapper[4825]: I0310 08:13:49.331384 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-78b7969f6-2pchj" Mar 10 08:13:49 crc kubenswrapper[4825]: I0310 08:13:49.385600 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3584c742-f634-450e-a0e0-f7b58137efd5-config\") pod \"3584c742-f634-450e-a0e0-f7b58137efd5\" (UID: \"3584c742-f634-450e-a0e0-f7b58137efd5\") " Mar 10 08:13:49 crc kubenswrapper[4825]: I0310 08:13:49.385646 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3584c742-f634-450e-a0e0-f7b58137efd5-ovndb-tls-certs\") pod \"3584c742-f634-450e-a0e0-f7b58137efd5\" (UID: \"3584c742-f634-450e-a0e0-f7b58137efd5\") " Mar 10 08:13:49 crc kubenswrapper[4825]: I0310 08:13:49.385670 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3584c742-f634-450e-a0e0-f7b58137efd5-combined-ca-bundle\") pod \"3584c742-f634-450e-a0e0-f7b58137efd5\" (UID: \"3584c742-f634-450e-a0e0-f7b58137efd5\") " Mar 10 08:13:49 crc kubenswrapper[4825]: I0310 08:13:49.385968 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnt4s\" (UniqueName: \"kubernetes.io/projected/3584c742-f634-450e-a0e0-f7b58137efd5-kube-api-access-hnt4s\") pod \"3584c742-f634-450e-a0e0-f7b58137efd5\" (UID: \"3584c742-f634-450e-a0e0-f7b58137efd5\") " Mar 10 08:13:49 crc kubenswrapper[4825]: I0310 08:13:49.385997 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3584c742-f634-450e-a0e0-f7b58137efd5-httpd-config\") pod \"3584c742-f634-450e-a0e0-f7b58137efd5\" (UID: \"3584c742-f634-450e-a0e0-f7b58137efd5\") " Mar 10 08:13:49 crc kubenswrapper[4825]: I0310 08:13:49.406851 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/3584c742-f634-450e-a0e0-f7b58137efd5-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "3584c742-f634-450e-a0e0-f7b58137efd5" (UID: "3584c742-f634-450e-a0e0-f7b58137efd5"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:13:49 crc kubenswrapper[4825]: I0310 08:13:49.407342 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3584c742-f634-450e-a0e0-f7b58137efd5-kube-api-access-hnt4s" (OuterVolumeSpecName: "kube-api-access-hnt4s") pod "3584c742-f634-450e-a0e0-f7b58137efd5" (UID: "3584c742-f634-450e-a0e0-f7b58137efd5"). InnerVolumeSpecName "kube-api-access-hnt4s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:13:49 crc kubenswrapper[4825]: I0310 08:13:49.434517 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3584c742-f634-450e-a0e0-f7b58137efd5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3584c742-f634-450e-a0e0-f7b58137efd5" (UID: "3584c742-f634-450e-a0e0-f7b58137efd5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:13:49 crc kubenswrapper[4825]: I0310 08:13:49.436703 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3584c742-f634-450e-a0e0-f7b58137efd5-config" (OuterVolumeSpecName: "config") pod "3584c742-f634-450e-a0e0-f7b58137efd5" (UID: "3584c742-f634-450e-a0e0-f7b58137efd5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:13:49 crc kubenswrapper[4825]: I0310 08:13:49.469793 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3584c742-f634-450e-a0e0-f7b58137efd5-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "3584c742-f634-450e-a0e0-f7b58137efd5" (UID: "3584c742-f634-450e-a0e0-f7b58137efd5"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:13:49 crc kubenswrapper[4825]: I0310 08:13:49.488814 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnt4s\" (UniqueName: \"kubernetes.io/projected/3584c742-f634-450e-a0e0-f7b58137efd5-kube-api-access-hnt4s\") on node \"crc\" DevicePath \"\"" Mar 10 08:13:49 crc kubenswrapper[4825]: I0310 08:13:49.488909 4825 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3584c742-f634-450e-a0e0-f7b58137efd5-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 10 08:13:49 crc kubenswrapper[4825]: I0310 08:13:49.488927 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/3584c742-f634-450e-a0e0-f7b58137efd5-config\") on node \"crc\" DevicePath \"\"" Mar 10 08:13:49 crc kubenswrapper[4825]: I0310 08:13:49.488941 4825 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3584c742-f634-450e-a0e0-f7b58137efd5-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 08:13:49 crc kubenswrapper[4825]: I0310 08:13:49.488957 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3584c742-f634-450e-a0e0-f7b58137efd5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:13:50 crc kubenswrapper[4825]: I0310 08:13:50.010189 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78b7969f6-2pchj" event={"ID":"3584c742-f634-450e-a0e0-f7b58137efd5","Type":"ContainerDied","Data":"a2248e7acda873a76dce4bef5ab3344f99028feaa7bce8a68928b59472931f82"} Mar 10 08:13:50 crc kubenswrapper[4825]: I0310 08:13:50.010637 4825 scope.go:117] "RemoveContainer" containerID="8505c43db0640117759a114a90eb749089ae2677bed12c6b3d1a24bb8efdd414" Mar 10 08:13:50 crc kubenswrapper[4825]: I0310 08:13:50.010268 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-78b7969f6-2pchj" Mar 10 08:13:50 crc kubenswrapper[4825]: I0310 08:13:50.046649 4825 scope.go:117] "RemoveContainer" containerID="1fbab6f62014d9031910313fc8f525aa40d47072232e9fec4695a9e7ebf91965" Mar 10 08:13:50 crc kubenswrapper[4825]: I0310 08:13:50.060211 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-78b7969f6-2pchj"] Mar 10 08:13:50 crc kubenswrapper[4825]: I0310 08:13:50.066327 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-78b7969f6-2pchj"] Mar 10 08:13:51 crc kubenswrapper[4825]: I0310 08:13:51.248645 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3584c742-f634-450e-a0e0-f7b58137efd5" path="/var/lib/kubelet/pods/3584c742-f634-450e-a0e0-f7b58137efd5/volumes" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.120315 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-th79c"] Mar 10 08:13:56 crc kubenswrapper[4825]: E0310 08:13:56.121200 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec60a834-532a-45ec-ac78-cc5954ab9d85" containerName="init" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.121216 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec60a834-532a-45ec-ac78-cc5954ab9d85" containerName="init" Mar 10 08:13:56 crc kubenswrapper[4825]: E0310 08:13:56.121227 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec60a834-532a-45ec-ac78-cc5954ab9d85" containerName="dnsmasq-dns" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.121235 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec60a834-532a-45ec-ac78-cc5954ab9d85" containerName="dnsmasq-dns" Mar 10 08:13:56 crc kubenswrapper[4825]: E0310 08:13:56.121263 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3584c742-f634-450e-a0e0-f7b58137efd5" containerName="neutron-api" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.121272 
4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="3584c742-f634-450e-a0e0-f7b58137efd5" containerName="neutron-api" Mar 10 08:13:56 crc kubenswrapper[4825]: E0310 08:13:56.121291 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3584c742-f634-450e-a0e0-f7b58137efd5" containerName="neutron-httpd" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.121298 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="3584c742-f634-450e-a0e0-f7b58137efd5" containerName="neutron-httpd" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.121480 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="3584c742-f634-450e-a0e0-f7b58137efd5" containerName="neutron-httpd" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.121495 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec60a834-532a-45ec-ac78-cc5954ab9d85" containerName="dnsmasq-dns" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.121517 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="3584c742-f634-450e-a0e0-f7b58137efd5" containerName="neutron-api" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.122194 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-th79c" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.126438 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-q7smr" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.126661 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.126798 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.127016 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.127212 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.135968 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-th79c"] Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.206980 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvhwm\" (UniqueName: \"kubernetes.io/projected/72b5083e-156e-42a5-abbd-1d71521f8ed4-kube-api-access-rvhwm\") pod \"swift-ring-rebalance-th79c\" (UID: \"72b5083e-156e-42a5-abbd-1d71521f8ed4\") " pod="openstack/swift-ring-rebalance-th79c" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.207224 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/72b5083e-156e-42a5-abbd-1d71521f8ed4-swiftconf\") pod \"swift-ring-rebalance-th79c\" (UID: \"72b5083e-156e-42a5-abbd-1d71521f8ed4\") " pod="openstack/swift-ring-rebalance-th79c" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.207277 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/72b5083e-156e-42a5-abbd-1d71521f8ed4-ring-data-devices\") pod \"swift-ring-rebalance-th79c\" (UID: \"72b5083e-156e-42a5-abbd-1d71521f8ed4\") " pod="openstack/swift-ring-rebalance-th79c" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.207322 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72b5083e-156e-42a5-abbd-1d71521f8ed4-scripts\") pod \"swift-ring-rebalance-th79c\" (UID: \"72b5083e-156e-42a5-abbd-1d71521f8ed4\") " pod="openstack/swift-ring-rebalance-th79c" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.207364 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/72b5083e-156e-42a5-abbd-1d71521f8ed4-etc-swift\") pod \"swift-ring-rebalance-th79c\" (UID: \"72b5083e-156e-42a5-abbd-1d71521f8ed4\") " pod="openstack/swift-ring-rebalance-th79c" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.207432 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/72b5083e-156e-42a5-abbd-1d71521f8ed4-dispersionconf\") pod \"swift-ring-rebalance-th79c\" (UID: \"72b5083e-156e-42a5-abbd-1d71521f8ed4\") " pod="openstack/swift-ring-rebalance-th79c" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.207476 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b5083e-156e-42a5-abbd-1d71521f8ed4-combined-ca-bundle\") pod \"swift-ring-rebalance-th79c\" (UID: \"72b5083e-156e-42a5-abbd-1d71521f8ed4\") " pod="openstack/swift-ring-rebalance-th79c" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.243667 4825 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c7b7657fc-vwr72"] Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.245340 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c7b7657fc-vwr72" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.255546 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c7b7657fc-vwr72"] Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.309012 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72b5083e-156e-42a5-abbd-1d71521f8ed4-scripts\") pod \"swift-ring-rebalance-th79c\" (UID: \"72b5083e-156e-42a5-abbd-1d71521f8ed4\") " pod="openstack/swift-ring-rebalance-th79c" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.309068 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/72b5083e-156e-42a5-abbd-1d71521f8ed4-etc-swift\") pod \"swift-ring-rebalance-th79c\" (UID: \"72b5083e-156e-42a5-abbd-1d71521f8ed4\") " pod="openstack/swift-ring-rebalance-th79c" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.309109 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/72b5083e-156e-42a5-abbd-1d71521f8ed4-dispersionconf\") pod \"swift-ring-rebalance-th79c\" (UID: \"72b5083e-156e-42a5-abbd-1d71521f8ed4\") " pod="openstack/swift-ring-rebalance-th79c" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.309159 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b5083e-156e-42a5-abbd-1d71521f8ed4-combined-ca-bundle\") pod \"swift-ring-rebalance-th79c\" (UID: \"72b5083e-156e-42a5-abbd-1d71521f8ed4\") " pod="openstack/swift-ring-rebalance-th79c" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 
08:13:56.309210 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg72c\" (UniqueName: \"kubernetes.io/projected/a8e6f407-4c43-4105-8b90-6d1eefa79a20-kube-api-access-sg72c\") pod \"dnsmasq-dns-6c7b7657fc-vwr72\" (UID: \"a8e6f407-4c43-4105-8b90-6d1eefa79a20\") " pod="openstack/dnsmasq-dns-6c7b7657fc-vwr72" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.309261 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8e6f407-4c43-4105-8b90-6d1eefa79a20-dns-svc\") pod \"dnsmasq-dns-6c7b7657fc-vwr72\" (UID: \"a8e6f407-4c43-4105-8b90-6d1eefa79a20\") " pod="openstack/dnsmasq-dns-6c7b7657fc-vwr72" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.309285 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8e6f407-4c43-4105-8b90-6d1eefa79a20-config\") pod \"dnsmasq-dns-6c7b7657fc-vwr72\" (UID: \"a8e6f407-4c43-4105-8b90-6d1eefa79a20\") " pod="openstack/dnsmasq-dns-6c7b7657fc-vwr72" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.309323 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a8e6f407-4c43-4105-8b90-6d1eefa79a20-ovsdbserver-sb\") pod \"dnsmasq-dns-6c7b7657fc-vwr72\" (UID: \"a8e6f407-4c43-4105-8b90-6d1eefa79a20\") " pod="openstack/dnsmasq-dns-6c7b7657fc-vwr72" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.309366 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvhwm\" (UniqueName: \"kubernetes.io/projected/72b5083e-156e-42a5-abbd-1d71521f8ed4-kube-api-access-rvhwm\") pod \"swift-ring-rebalance-th79c\" (UID: \"72b5083e-156e-42a5-abbd-1d71521f8ed4\") " pod="openstack/swift-ring-rebalance-th79c" Mar 10 08:13:56 crc 
kubenswrapper[4825]: I0310 08:13:56.309479 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/72b5083e-156e-42a5-abbd-1d71521f8ed4-swiftconf\") pod \"swift-ring-rebalance-th79c\" (UID: \"72b5083e-156e-42a5-abbd-1d71521f8ed4\") " pod="openstack/swift-ring-rebalance-th79c" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.309519 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a8e6f407-4c43-4105-8b90-6d1eefa79a20-ovsdbserver-nb\") pod \"dnsmasq-dns-6c7b7657fc-vwr72\" (UID: \"a8e6f407-4c43-4105-8b90-6d1eefa79a20\") " pod="openstack/dnsmasq-dns-6c7b7657fc-vwr72" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.309550 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/72b5083e-156e-42a5-abbd-1d71521f8ed4-ring-data-devices\") pod \"swift-ring-rebalance-th79c\" (UID: \"72b5083e-156e-42a5-abbd-1d71521f8ed4\") " pod="openstack/swift-ring-rebalance-th79c" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.309592 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/72b5083e-156e-42a5-abbd-1d71521f8ed4-etc-swift\") pod \"swift-ring-rebalance-th79c\" (UID: \"72b5083e-156e-42a5-abbd-1d71521f8ed4\") " pod="openstack/swift-ring-rebalance-th79c" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.310004 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72b5083e-156e-42a5-abbd-1d71521f8ed4-scripts\") pod \"swift-ring-rebalance-th79c\" (UID: \"72b5083e-156e-42a5-abbd-1d71521f8ed4\") " pod="openstack/swift-ring-rebalance-th79c" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.310211 4825 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/72b5083e-156e-42a5-abbd-1d71521f8ed4-ring-data-devices\") pod \"swift-ring-rebalance-th79c\" (UID: \"72b5083e-156e-42a5-abbd-1d71521f8ed4\") " pod="openstack/swift-ring-rebalance-th79c" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.318780 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/72b5083e-156e-42a5-abbd-1d71521f8ed4-swiftconf\") pod \"swift-ring-rebalance-th79c\" (UID: \"72b5083e-156e-42a5-abbd-1d71521f8ed4\") " pod="openstack/swift-ring-rebalance-th79c" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.318981 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b5083e-156e-42a5-abbd-1d71521f8ed4-combined-ca-bundle\") pod \"swift-ring-rebalance-th79c\" (UID: \"72b5083e-156e-42a5-abbd-1d71521f8ed4\") " pod="openstack/swift-ring-rebalance-th79c" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.319245 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/72b5083e-156e-42a5-abbd-1d71521f8ed4-dispersionconf\") pod \"swift-ring-rebalance-th79c\" (UID: \"72b5083e-156e-42a5-abbd-1d71521f8ed4\") " pod="openstack/swift-ring-rebalance-th79c" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.327245 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvhwm\" (UniqueName: \"kubernetes.io/projected/72b5083e-156e-42a5-abbd-1d71521f8ed4-kube-api-access-rvhwm\") pod \"swift-ring-rebalance-th79c\" (UID: \"72b5083e-156e-42a5-abbd-1d71521f8ed4\") " pod="openstack/swift-ring-rebalance-th79c" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.411331 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/a8e6f407-4c43-4105-8b90-6d1eefa79a20-ovsdbserver-nb\") pod \"dnsmasq-dns-6c7b7657fc-vwr72\" (UID: \"a8e6f407-4c43-4105-8b90-6d1eefa79a20\") " pod="openstack/dnsmasq-dns-6c7b7657fc-vwr72" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.411417 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sg72c\" (UniqueName: \"kubernetes.io/projected/a8e6f407-4c43-4105-8b90-6d1eefa79a20-kube-api-access-sg72c\") pod \"dnsmasq-dns-6c7b7657fc-vwr72\" (UID: \"a8e6f407-4c43-4105-8b90-6d1eefa79a20\") " pod="openstack/dnsmasq-dns-6c7b7657fc-vwr72" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.411449 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8e6f407-4c43-4105-8b90-6d1eefa79a20-dns-svc\") pod \"dnsmasq-dns-6c7b7657fc-vwr72\" (UID: \"a8e6f407-4c43-4105-8b90-6d1eefa79a20\") " pod="openstack/dnsmasq-dns-6c7b7657fc-vwr72" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.411468 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8e6f407-4c43-4105-8b90-6d1eefa79a20-config\") pod \"dnsmasq-dns-6c7b7657fc-vwr72\" (UID: \"a8e6f407-4c43-4105-8b90-6d1eefa79a20\") " pod="openstack/dnsmasq-dns-6c7b7657fc-vwr72" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.411486 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a8e6f407-4c43-4105-8b90-6d1eefa79a20-ovsdbserver-sb\") pod \"dnsmasq-dns-6c7b7657fc-vwr72\" (UID: \"a8e6f407-4c43-4105-8b90-6d1eefa79a20\") " pod="openstack/dnsmasq-dns-6c7b7657fc-vwr72" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.412358 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/a8e6f407-4c43-4105-8b90-6d1eefa79a20-ovsdbserver-sb\") pod \"dnsmasq-dns-6c7b7657fc-vwr72\" (UID: \"a8e6f407-4c43-4105-8b90-6d1eefa79a20\") " pod="openstack/dnsmasq-dns-6c7b7657fc-vwr72" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.413046 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8e6f407-4c43-4105-8b90-6d1eefa79a20-dns-svc\") pod \"dnsmasq-dns-6c7b7657fc-vwr72\" (UID: \"a8e6f407-4c43-4105-8b90-6d1eefa79a20\") " pod="openstack/dnsmasq-dns-6c7b7657fc-vwr72" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.413174 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8e6f407-4c43-4105-8b90-6d1eefa79a20-config\") pod \"dnsmasq-dns-6c7b7657fc-vwr72\" (UID: \"a8e6f407-4c43-4105-8b90-6d1eefa79a20\") " pod="openstack/dnsmasq-dns-6c7b7657fc-vwr72" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.413413 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a8e6f407-4c43-4105-8b90-6d1eefa79a20-ovsdbserver-nb\") pod \"dnsmasq-dns-6c7b7657fc-vwr72\" (UID: \"a8e6f407-4c43-4105-8b90-6d1eefa79a20\") " pod="openstack/dnsmasq-dns-6c7b7657fc-vwr72" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.427780 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg72c\" (UniqueName: \"kubernetes.io/projected/a8e6f407-4c43-4105-8b90-6d1eefa79a20-kube-api-access-sg72c\") pod \"dnsmasq-dns-6c7b7657fc-vwr72\" (UID: \"a8e6f407-4c43-4105-8b90-6d1eefa79a20\") " pod="openstack/dnsmasq-dns-6c7b7657fc-vwr72" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.450819 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-th79c" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.576415 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c7b7657fc-vwr72" Mar 10 08:13:56 crc kubenswrapper[4825]: I0310 08:13:56.963980 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-th79c"] Mar 10 08:13:57 crc kubenswrapper[4825]: I0310 08:13:57.065943 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-th79c" event={"ID":"72b5083e-156e-42a5-abbd-1d71521f8ed4","Type":"ContainerStarted","Data":"779b193065f53f9949df8a5187657a408c5221a6cfb41a75aa7859cee0999279"} Mar 10 08:13:57 crc kubenswrapper[4825]: W0310 08:13:57.068036 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8e6f407_4c43_4105_8b90_6d1eefa79a20.slice/crio-51760ef14e493e55c03eb00bbaaf85877694d9c414da82f8185f488e892e077b WatchSource:0}: Error finding container 51760ef14e493e55c03eb00bbaaf85877694d9c414da82f8185f488e892e077b: Status 404 returned error can't find the container with id 51760ef14e493e55c03eb00bbaaf85877694d9c414da82f8185f488e892e077b Mar 10 08:13:57 crc kubenswrapper[4825]: I0310 08:13:57.069747 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c7b7657fc-vwr72"] Mar 10 08:13:57 crc kubenswrapper[4825]: I0310 08:13:57.883315 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-7bc9479664-wlv62"] Mar 10 08:13:57 crc kubenswrapper[4825]: I0310 08:13:57.890567 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7bc9479664-wlv62" Mar 10 08:13:57 crc kubenswrapper[4825]: I0310 08:13:57.893245 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 10 08:13:57 crc kubenswrapper[4825]: I0310 08:13:57.920460 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7bc9479664-wlv62"] Mar 10 08:13:58 crc kubenswrapper[4825]: I0310 08:13:58.058543 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8197ce55-6e66-4f96-9964-b4f7bfea74a7-combined-ca-bundle\") pod \"swift-proxy-7bc9479664-wlv62\" (UID: \"8197ce55-6e66-4f96-9964-b4f7bfea74a7\") " pod="openstack/swift-proxy-7bc9479664-wlv62" Mar 10 08:13:58 crc kubenswrapper[4825]: I0310 08:13:58.058616 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8197ce55-6e66-4f96-9964-b4f7bfea74a7-run-httpd\") pod \"swift-proxy-7bc9479664-wlv62\" (UID: \"8197ce55-6e66-4f96-9964-b4f7bfea74a7\") " pod="openstack/swift-proxy-7bc9479664-wlv62" Mar 10 08:13:58 crc kubenswrapper[4825]: I0310 08:13:58.058686 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8197ce55-6e66-4f96-9964-b4f7bfea74a7-log-httpd\") pod \"swift-proxy-7bc9479664-wlv62\" (UID: \"8197ce55-6e66-4f96-9964-b4f7bfea74a7\") " pod="openstack/swift-proxy-7bc9479664-wlv62" Mar 10 08:13:58 crc kubenswrapper[4825]: I0310 08:13:58.058780 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtj5t\" (UniqueName: \"kubernetes.io/projected/8197ce55-6e66-4f96-9964-b4f7bfea74a7-kube-api-access-xtj5t\") pod \"swift-proxy-7bc9479664-wlv62\" (UID: \"8197ce55-6e66-4f96-9964-b4f7bfea74a7\") " 
pod="openstack/swift-proxy-7bc9479664-wlv62" Mar 10 08:13:58 crc kubenswrapper[4825]: I0310 08:13:58.058885 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8197ce55-6e66-4f96-9964-b4f7bfea74a7-etc-swift\") pod \"swift-proxy-7bc9479664-wlv62\" (UID: \"8197ce55-6e66-4f96-9964-b4f7bfea74a7\") " pod="openstack/swift-proxy-7bc9479664-wlv62" Mar 10 08:13:58 crc kubenswrapper[4825]: I0310 08:13:58.058932 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8197ce55-6e66-4f96-9964-b4f7bfea74a7-config-data\") pod \"swift-proxy-7bc9479664-wlv62\" (UID: \"8197ce55-6e66-4f96-9964-b4f7bfea74a7\") " pod="openstack/swift-proxy-7bc9479664-wlv62" Mar 10 08:13:58 crc kubenswrapper[4825]: I0310 08:13:58.079244 4825 generic.go:334] "Generic (PLEG): container finished" podID="a8e6f407-4c43-4105-8b90-6d1eefa79a20" containerID="7e47c76ed5bdceee305106f6805f553116458fd19154c222ac8de20aaadc5130" exitCode=0 Mar 10 08:13:58 crc kubenswrapper[4825]: I0310 08:13:58.079289 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c7b7657fc-vwr72" event={"ID":"a8e6f407-4c43-4105-8b90-6d1eefa79a20","Type":"ContainerDied","Data":"7e47c76ed5bdceee305106f6805f553116458fd19154c222ac8de20aaadc5130"} Mar 10 08:13:58 crc kubenswrapper[4825]: I0310 08:13:58.079316 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c7b7657fc-vwr72" event={"ID":"a8e6f407-4c43-4105-8b90-6d1eefa79a20","Type":"ContainerStarted","Data":"51760ef14e493e55c03eb00bbaaf85877694d9c414da82f8185f488e892e077b"} Mar 10 08:13:58 crc kubenswrapper[4825]: I0310 08:13:58.161067 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8197ce55-6e66-4f96-9964-b4f7bfea74a7-run-httpd\") pod 
\"swift-proxy-7bc9479664-wlv62\" (UID: \"8197ce55-6e66-4f96-9964-b4f7bfea74a7\") " pod="openstack/swift-proxy-7bc9479664-wlv62" Mar 10 08:13:58 crc kubenswrapper[4825]: I0310 08:13:58.161168 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8197ce55-6e66-4f96-9964-b4f7bfea74a7-log-httpd\") pod \"swift-proxy-7bc9479664-wlv62\" (UID: \"8197ce55-6e66-4f96-9964-b4f7bfea74a7\") " pod="openstack/swift-proxy-7bc9479664-wlv62" Mar 10 08:13:58 crc kubenswrapper[4825]: I0310 08:13:58.161218 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtj5t\" (UniqueName: \"kubernetes.io/projected/8197ce55-6e66-4f96-9964-b4f7bfea74a7-kube-api-access-xtj5t\") pod \"swift-proxy-7bc9479664-wlv62\" (UID: \"8197ce55-6e66-4f96-9964-b4f7bfea74a7\") " pod="openstack/swift-proxy-7bc9479664-wlv62" Mar 10 08:13:58 crc kubenswrapper[4825]: I0310 08:13:58.161585 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8197ce55-6e66-4f96-9964-b4f7bfea74a7-run-httpd\") pod \"swift-proxy-7bc9479664-wlv62\" (UID: \"8197ce55-6e66-4f96-9964-b4f7bfea74a7\") " pod="openstack/swift-proxy-7bc9479664-wlv62" Mar 10 08:13:58 crc kubenswrapper[4825]: I0310 08:13:58.161691 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8197ce55-6e66-4f96-9964-b4f7bfea74a7-log-httpd\") pod \"swift-proxy-7bc9479664-wlv62\" (UID: \"8197ce55-6e66-4f96-9964-b4f7bfea74a7\") " pod="openstack/swift-proxy-7bc9479664-wlv62" Mar 10 08:13:58 crc kubenswrapper[4825]: I0310 08:13:58.161902 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8197ce55-6e66-4f96-9964-b4f7bfea74a7-etc-swift\") pod \"swift-proxy-7bc9479664-wlv62\" (UID: \"8197ce55-6e66-4f96-9964-b4f7bfea74a7\") " 
pod="openstack/swift-proxy-7bc9479664-wlv62" Mar 10 08:13:58 crc kubenswrapper[4825]: I0310 08:13:58.161970 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8197ce55-6e66-4f96-9964-b4f7bfea74a7-config-data\") pod \"swift-proxy-7bc9479664-wlv62\" (UID: \"8197ce55-6e66-4f96-9964-b4f7bfea74a7\") " pod="openstack/swift-proxy-7bc9479664-wlv62" Mar 10 08:13:58 crc kubenswrapper[4825]: I0310 08:13:58.162092 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8197ce55-6e66-4f96-9964-b4f7bfea74a7-combined-ca-bundle\") pod \"swift-proxy-7bc9479664-wlv62\" (UID: \"8197ce55-6e66-4f96-9964-b4f7bfea74a7\") " pod="openstack/swift-proxy-7bc9479664-wlv62" Mar 10 08:13:58 crc kubenswrapper[4825]: I0310 08:13:58.169685 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8197ce55-6e66-4f96-9964-b4f7bfea74a7-config-data\") pod \"swift-proxy-7bc9479664-wlv62\" (UID: \"8197ce55-6e66-4f96-9964-b4f7bfea74a7\") " pod="openstack/swift-proxy-7bc9479664-wlv62" Mar 10 08:13:58 crc kubenswrapper[4825]: I0310 08:13:58.170033 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8197ce55-6e66-4f96-9964-b4f7bfea74a7-etc-swift\") pod \"swift-proxy-7bc9479664-wlv62\" (UID: \"8197ce55-6e66-4f96-9964-b4f7bfea74a7\") " pod="openstack/swift-proxy-7bc9479664-wlv62" Mar 10 08:13:58 crc kubenswrapper[4825]: I0310 08:13:58.170311 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8197ce55-6e66-4f96-9964-b4f7bfea74a7-combined-ca-bundle\") pod \"swift-proxy-7bc9479664-wlv62\" (UID: \"8197ce55-6e66-4f96-9964-b4f7bfea74a7\") " pod="openstack/swift-proxy-7bc9479664-wlv62" Mar 10 08:13:58 crc kubenswrapper[4825]: I0310 
08:13:58.191559 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtj5t\" (UniqueName: \"kubernetes.io/projected/8197ce55-6e66-4f96-9964-b4f7bfea74a7-kube-api-access-xtj5t\") pod \"swift-proxy-7bc9479664-wlv62\" (UID: \"8197ce55-6e66-4f96-9964-b4f7bfea74a7\") " pod="openstack/swift-proxy-7bc9479664-wlv62" Mar 10 08:13:58 crc kubenswrapper[4825]: I0310 08:13:58.212633 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7bc9479664-wlv62" Mar 10 08:14:00 crc kubenswrapper[4825]: I0310 08:14:00.043747 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-6d6b57675b-q5h4k"] Mar 10 08:14:00 crc kubenswrapper[4825]: I0310 08:14:00.047820 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6d6b57675b-q5h4k" Mar 10 08:14:00 crc kubenswrapper[4825]: I0310 08:14:00.050594 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 10 08:14:00 crc kubenswrapper[4825]: I0310 08:14:00.050912 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 10 08:14:00 crc kubenswrapper[4825]: I0310 08:14:00.064484 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6d6b57675b-q5h4k"] Mar 10 08:14:00 crc kubenswrapper[4825]: I0310 08:14:00.136933 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552174-xcsb5"] Mar 10 08:14:00 crc kubenswrapper[4825]: I0310 08:14:00.138197 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552174-xcsb5" Mar 10 08:14:00 crc kubenswrapper[4825]: I0310 08:14:00.139974 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 08:14:00 crc kubenswrapper[4825]: I0310 08:14:00.141010 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 08:14:00 crc kubenswrapper[4825]: I0310 08:14:00.141802 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 08:14:00 crc kubenswrapper[4825]: I0310 08:14:00.146335 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552174-xcsb5"] Mar 10 08:14:00 crc kubenswrapper[4825]: I0310 08:14:00.197719 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8b67c6af-7ab6-4359-b2c7-c3e8b57fc722-etc-swift\") pod \"swift-proxy-6d6b57675b-q5h4k\" (UID: \"8b67c6af-7ab6-4359-b2c7-c3e8b57fc722\") " pod="openstack/swift-proxy-6d6b57675b-q5h4k" Mar 10 08:14:00 crc kubenswrapper[4825]: I0310 08:14:00.197865 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b67c6af-7ab6-4359-b2c7-c3e8b57fc722-log-httpd\") pod \"swift-proxy-6d6b57675b-q5h4k\" (UID: \"8b67c6af-7ab6-4359-b2c7-c3e8b57fc722\") " pod="openstack/swift-proxy-6d6b57675b-q5h4k" Mar 10 08:14:00 crc kubenswrapper[4825]: I0310 08:14:00.197888 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b67c6af-7ab6-4359-b2c7-c3e8b57fc722-internal-tls-certs\") pod \"swift-proxy-6d6b57675b-q5h4k\" (UID: \"8b67c6af-7ab6-4359-b2c7-c3e8b57fc722\") " pod="openstack/swift-proxy-6d6b57675b-q5h4k" Mar 10 
08:14:00 crc kubenswrapper[4825]: I0310 08:14:00.197979 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b67c6af-7ab6-4359-b2c7-c3e8b57fc722-combined-ca-bundle\") pod \"swift-proxy-6d6b57675b-q5h4k\" (UID: \"8b67c6af-7ab6-4359-b2c7-c3e8b57fc722\") " pod="openstack/swift-proxy-6d6b57675b-q5h4k" Mar 10 08:14:00 crc kubenswrapper[4825]: I0310 08:14:00.198003 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sr2v\" (UniqueName: \"kubernetes.io/projected/8b67c6af-7ab6-4359-b2c7-c3e8b57fc722-kube-api-access-5sr2v\") pod \"swift-proxy-6d6b57675b-q5h4k\" (UID: \"8b67c6af-7ab6-4359-b2c7-c3e8b57fc722\") " pod="openstack/swift-proxy-6d6b57675b-q5h4k" Mar 10 08:14:00 crc kubenswrapper[4825]: I0310 08:14:00.198042 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b67c6af-7ab6-4359-b2c7-c3e8b57fc722-public-tls-certs\") pod \"swift-proxy-6d6b57675b-q5h4k\" (UID: \"8b67c6af-7ab6-4359-b2c7-c3e8b57fc722\") " pod="openstack/swift-proxy-6d6b57675b-q5h4k" Mar 10 08:14:00 crc kubenswrapper[4825]: I0310 08:14:00.198118 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b67c6af-7ab6-4359-b2c7-c3e8b57fc722-run-httpd\") pod \"swift-proxy-6d6b57675b-q5h4k\" (UID: \"8b67c6af-7ab6-4359-b2c7-c3e8b57fc722\") " pod="openstack/swift-proxy-6d6b57675b-q5h4k" Mar 10 08:14:00 crc kubenswrapper[4825]: I0310 08:14:00.198202 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b67c6af-7ab6-4359-b2c7-c3e8b57fc722-config-data\") pod \"swift-proxy-6d6b57675b-q5h4k\" (UID: \"8b67c6af-7ab6-4359-b2c7-c3e8b57fc722\") " 
pod="openstack/swift-proxy-6d6b57675b-q5h4k" Mar 10 08:14:00 crc kubenswrapper[4825]: I0310 08:14:00.300814 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b67c6af-7ab6-4359-b2c7-c3e8b57fc722-log-httpd\") pod \"swift-proxy-6d6b57675b-q5h4k\" (UID: \"8b67c6af-7ab6-4359-b2c7-c3e8b57fc722\") " pod="openstack/swift-proxy-6d6b57675b-q5h4k" Mar 10 08:14:00 crc kubenswrapper[4825]: I0310 08:14:00.301310 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b67c6af-7ab6-4359-b2c7-c3e8b57fc722-internal-tls-certs\") pod \"swift-proxy-6d6b57675b-q5h4k\" (UID: \"8b67c6af-7ab6-4359-b2c7-c3e8b57fc722\") " pod="openstack/swift-proxy-6d6b57675b-q5h4k" Mar 10 08:14:00 crc kubenswrapper[4825]: I0310 08:14:00.301349 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b67c6af-7ab6-4359-b2c7-c3e8b57fc722-combined-ca-bundle\") pod \"swift-proxy-6d6b57675b-q5h4k\" (UID: \"8b67c6af-7ab6-4359-b2c7-c3e8b57fc722\") " pod="openstack/swift-proxy-6d6b57675b-q5h4k" Mar 10 08:14:00 crc kubenswrapper[4825]: I0310 08:14:00.301373 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sr2v\" (UniqueName: \"kubernetes.io/projected/8b67c6af-7ab6-4359-b2c7-c3e8b57fc722-kube-api-access-5sr2v\") pod \"swift-proxy-6d6b57675b-q5h4k\" (UID: \"8b67c6af-7ab6-4359-b2c7-c3e8b57fc722\") " pod="openstack/swift-proxy-6d6b57675b-q5h4k" Mar 10 08:14:00 crc kubenswrapper[4825]: I0310 08:14:00.301393 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b67c6af-7ab6-4359-b2c7-c3e8b57fc722-public-tls-certs\") pod \"swift-proxy-6d6b57675b-q5h4k\" (UID: \"8b67c6af-7ab6-4359-b2c7-c3e8b57fc722\") " 
pod="openstack/swift-proxy-6d6b57675b-q5h4k" Mar 10 08:14:00 crc kubenswrapper[4825]: I0310 08:14:00.301442 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b67c6af-7ab6-4359-b2c7-c3e8b57fc722-run-httpd\") pod \"swift-proxy-6d6b57675b-q5h4k\" (UID: \"8b67c6af-7ab6-4359-b2c7-c3e8b57fc722\") " pod="openstack/swift-proxy-6d6b57675b-q5h4k" Mar 10 08:14:00 crc kubenswrapper[4825]: I0310 08:14:00.301464 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsz5m\" (UniqueName: \"kubernetes.io/projected/fa9e162c-76d6-4e53-9cf3-72544f8a1399-kube-api-access-wsz5m\") pod \"auto-csr-approver-29552174-xcsb5\" (UID: \"fa9e162c-76d6-4e53-9cf3-72544f8a1399\") " pod="openshift-infra/auto-csr-approver-29552174-xcsb5" Mar 10 08:14:00 crc kubenswrapper[4825]: I0310 08:14:00.301489 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b67c6af-7ab6-4359-b2c7-c3e8b57fc722-config-data\") pod \"swift-proxy-6d6b57675b-q5h4k\" (UID: \"8b67c6af-7ab6-4359-b2c7-c3e8b57fc722\") " pod="openstack/swift-proxy-6d6b57675b-q5h4k" Mar 10 08:14:00 crc kubenswrapper[4825]: I0310 08:14:00.301552 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8b67c6af-7ab6-4359-b2c7-c3e8b57fc722-etc-swift\") pod \"swift-proxy-6d6b57675b-q5h4k\" (UID: \"8b67c6af-7ab6-4359-b2c7-c3e8b57fc722\") " pod="openstack/swift-proxy-6d6b57675b-q5h4k" Mar 10 08:14:00 crc kubenswrapper[4825]: I0310 08:14:00.302142 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b67c6af-7ab6-4359-b2c7-c3e8b57fc722-run-httpd\") pod \"swift-proxy-6d6b57675b-q5h4k\" (UID: \"8b67c6af-7ab6-4359-b2c7-c3e8b57fc722\") " pod="openstack/swift-proxy-6d6b57675b-q5h4k" Mar 10 
08:14:00 crc kubenswrapper[4825]: I0310 08:14:00.302787 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b67c6af-7ab6-4359-b2c7-c3e8b57fc722-log-httpd\") pod \"swift-proxy-6d6b57675b-q5h4k\" (UID: \"8b67c6af-7ab6-4359-b2c7-c3e8b57fc722\") " pod="openstack/swift-proxy-6d6b57675b-q5h4k" Mar 10 08:14:00 crc kubenswrapper[4825]: I0310 08:14:00.307164 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b67c6af-7ab6-4359-b2c7-c3e8b57fc722-config-data\") pod \"swift-proxy-6d6b57675b-q5h4k\" (UID: \"8b67c6af-7ab6-4359-b2c7-c3e8b57fc722\") " pod="openstack/swift-proxy-6d6b57675b-q5h4k" Mar 10 08:14:00 crc kubenswrapper[4825]: I0310 08:14:00.307871 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b67c6af-7ab6-4359-b2c7-c3e8b57fc722-internal-tls-certs\") pod \"swift-proxy-6d6b57675b-q5h4k\" (UID: \"8b67c6af-7ab6-4359-b2c7-c3e8b57fc722\") " pod="openstack/swift-proxy-6d6b57675b-q5h4k" Mar 10 08:14:00 crc kubenswrapper[4825]: I0310 08:14:00.308801 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8b67c6af-7ab6-4359-b2c7-c3e8b57fc722-etc-swift\") pod \"swift-proxy-6d6b57675b-q5h4k\" (UID: \"8b67c6af-7ab6-4359-b2c7-c3e8b57fc722\") " pod="openstack/swift-proxy-6d6b57675b-q5h4k" Mar 10 08:14:00 crc kubenswrapper[4825]: I0310 08:14:00.308894 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b67c6af-7ab6-4359-b2c7-c3e8b57fc722-combined-ca-bundle\") pod \"swift-proxy-6d6b57675b-q5h4k\" (UID: \"8b67c6af-7ab6-4359-b2c7-c3e8b57fc722\") " pod="openstack/swift-proxy-6d6b57675b-q5h4k" Mar 10 08:14:00 crc kubenswrapper[4825]: I0310 08:14:00.309773 4825 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b67c6af-7ab6-4359-b2c7-c3e8b57fc722-public-tls-certs\") pod \"swift-proxy-6d6b57675b-q5h4k\" (UID: \"8b67c6af-7ab6-4359-b2c7-c3e8b57fc722\") " pod="openstack/swift-proxy-6d6b57675b-q5h4k" Mar 10 08:14:00 crc kubenswrapper[4825]: I0310 08:14:00.321985 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sr2v\" (UniqueName: \"kubernetes.io/projected/8b67c6af-7ab6-4359-b2c7-c3e8b57fc722-kube-api-access-5sr2v\") pod \"swift-proxy-6d6b57675b-q5h4k\" (UID: \"8b67c6af-7ab6-4359-b2c7-c3e8b57fc722\") " pod="openstack/swift-proxy-6d6b57675b-q5h4k" Mar 10 08:14:00 crc kubenswrapper[4825]: I0310 08:14:00.373525 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6d6b57675b-q5h4k" Mar 10 08:14:00 crc kubenswrapper[4825]: I0310 08:14:00.403258 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsz5m\" (UniqueName: \"kubernetes.io/projected/fa9e162c-76d6-4e53-9cf3-72544f8a1399-kube-api-access-wsz5m\") pod \"auto-csr-approver-29552174-xcsb5\" (UID: \"fa9e162c-76d6-4e53-9cf3-72544f8a1399\") " pod="openshift-infra/auto-csr-approver-29552174-xcsb5" Mar 10 08:14:00 crc kubenswrapper[4825]: I0310 08:14:00.557896 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsz5m\" (UniqueName: \"kubernetes.io/projected/fa9e162c-76d6-4e53-9cf3-72544f8a1399-kube-api-access-wsz5m\") pod \"auto-csr-approver-29552174-xcsb5\" (UID: \"fa9e162c-76d6-4e53-9cf3-72544f8a1399\") " pod="openshift-infra/auto-csr-approver-29552174-xcsb5" Mar 10 08:14:00 crc kubenswrapper[4825]: I0310 08:14:00.663005 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552174-xcsb5" Mar 10 08:14:00 crc kubenswrapper[4825]: I0310 08:14:00.839874 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7bc9479664-wlv62"] Mar 10 08:14:00 crc kubenswrapper[4825]: W0310 08:14:00.841119 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8197ce55_6e66_4f96_9964_b4f7bfea74a7.slice/crio-aa6be3e879ecfe1c3d9acc3ae9ac22e109c9095ff450ccdcdc37235eeb300799 WatchSource:0}: Error finding container aa6be3e879ecfe1c3d9acc3ae9ac22e109c9095ff450ccdcdc37235eeb300799: Status 404 returned error can't find the container with id aa6be3e879ecfe1c3d9acc3ae9ac22e109c9095ff450ccdcdc37235eeb300799 Mar 10 08:14:01 crc kubenswrapper[4825]: I0310 08:14:01.125255 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7bc9479664-wlv62" event={"ID":"8197ce55-6e66-4f96-9964-b4f7bfea74a7","Type":"ContainerStarted","Data":"63fa391deabef5224165acf41e5f511070e87347af64375ff1f1bdb5a91b2773"} Mar 10 08:14:01 crc kubenswrapper[4825]: I0310 08:14:01.125579 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7bc9479664-wlv62" event={"ID":"8197ce55-6e66-4f96-9964-b4f7bfea74a7","Type":"ContainerStarted","Data":"aa6be3e879ecfe1c3d9acc3ae9ac22e109c9095ff450ccdcdc37235eeb300799"} Mar 10 08:14:01 crc kubenswrapper[4825]: I0310 08:14:01.132116 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-th79c" event={"ID":"72b5083e-156e-42a5-abbd-1d71521f8ed4","Type":"ContainerStarted","Data":"705bbe7acda5d75f53629b08636b8badb5ad771e565f5bda42324338d62402f2"} Mar 10 08:14:01 crc kubenswrapper[4825]: I0310 08:14:01.133040 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552174-xcsb5"] Mar 10 08:14:01 crc kubenswrapper[4825]: I0310 08:14:01.138336 4825 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-6c7b7657fc-vwr72" event={"ID":"a8e6f407-4c43-4105-8b90-6d1eefa79a20","Type":"ContainerStarted","Data":"8ff48b0e845c119b201f42252ec7936b2bc34895932f700db12f398de5e90390"} Mar 10 08:14:01 crc kubenswrapper[4825]: I0310 08:14:01.139005 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c7b7657fc-vwr72" Mar 10 08:14:01 crc kubenswrapper[4825]: I0310 08:14:01.164330 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-th79c" podStartSLOduration=1.923686266 podStartE2EDuration="5.16427624s" podCreationTimestamp="2026-03-10 08:13:56 +0000 UTC" firstStartedPulling="2026-03-10 08:13:56.966769659 +0000 UTC m=+5389.996550274" lastFinishedPulling="2026-03-10 08:14:00.207359633 +0000 UTC m=+5393.237140248" observedRunningTime="2026-03-10 08:14:01.154970396 +0000 UTC m=+5394.184751011" watchObservedRunningTime="2026-03-10 08:14:01.16427624 +0000 UTC m=+5394.194056855" Mar 10 08:14:01 crc kubenswrapper[4825]: I0310 08:14:01.186192 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c7b7657fc-vwr72" podStartSLOduration=5.186172185 podStartE2EDuration="5.186172185s" podCreationTimestamp="2026-03-10 08:13:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:14:01.183564057 +0000 UTC m=+5394.213344692" watchObservedRunningTime="2026-03-10 08:14:01.186172185 +0000 UTC m=+5394.215952800" Mar 10 08:14:01 crc kubenswrapper[4825]: I0310 08:14:01.253421 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6d6b57675b-q5h4k"] Mar 10 08:14:02 crc kubenswrapper[4825]: I0310 08:14:02.171258 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552174-xcsb5" 
event={"ID":"fa9e162c-76d6-4e53-9cf3-72544f8a1399","Type":"ContainerStarted","Data":"41b3a0749b1964ddcf4d073111e0630bc276bfb2650c1e1de0d2ba912d220fc0"} Mar 10 08:14:02 crc kubenswrapper[4825]: I0310 08:14:02.174200 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6d6b57675b-q5h4k" event={"ID":"8b67c6af-7ab6-4359-b2c7-c3e8b57fc722","Type":"ContainerStarted","Data":"2540a75f2a9fa8e4b2f072b3c1468b50b10bf0a72aabeb3e54260afaf61221bd"} Mar 10 08:14:02 crc kubenswrapper[4825]: I0310 08:14:02.174241 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6d6b57675b-q5h4k" event={"ID":"8b67c6af-7ab6-4359-b2c7-c3e8b57fc722","Type":"ContainerStarted","Data":"b4b1397d9008b9aac58ffb6be701b60bff2fc0143389c58e938c0233695ee46b"} Mar 10 08:14:02 crc kubenswrapper[4825]: I0310 08:14:02.174254 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6d6b57675b-q5h4k" event={"ID":"8b67c6af-7ab6-4359-b2c7-c3e8b57fc722","Type":"ContainerStarted","Data":"ac6fd950018d424db196bf1ea5e97bfcecfb2baadcdd2a56de4368d5fddb2936"} Mar 10 08:14:02 crc kubenswrapper[4825]: I0310 08:14:02.175240 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6d6b57675b-q5h4k" Mar 10 08:14:02 crc kubenswrapper[4825]: I0310 08:14:02.175506 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6d6b57675b-q5h4k" Mar 10 08:14:02 crc kubenswrapper[4825]: I0310 08:14:02.193227 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7bc9479664-wlv62" event={"ID":"8197ce55-6e66-4f96-9964-b4f7bfea74a7","Type":"ContainerStarted","Data":"0445a81d70caf56e23376649a443336447a0a30059364373d8bd1773e80c6108"} Mar 10 08:14:02 crc kubenswrapper[4825]: I0310 08:14:02.203478 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-6d6b57675b-q5h4k" podStartSLOduration=2.203462757 
podStartE2EDuration="2.203462757s" podCreationTimestamp="2026-03-10 08:14:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:14:02.202744678 +0000 UTC m=+5395.232525303" watchObservedRunningTime="2026-03-10 08:14:02.203462757 +0000 UTC m=+5395.233243372" Mar 10 08:14:02 crc kubenswrapper[4825]: I0310 08:14:02.232064 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-7bc9479664-wlv62" podStartSLOduration=5.232043558 podStartE2EDuration="5.232043558s" podCreationTimestamp="2026-03-10 08:13:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:14:02.227635512 +0000 UTC m=+5395.257416127" watchObservedRunningTime="2026-03-10 08:14:02.232043558 +0000 UTC m=+5395.261824193" Mar 10 08:14:03 crc kubenswrapper[4825]: I0310 08:14:03.205263 4825 generic.go:334] "Generic (PLEG): container finished" podID="fa9e162c-76d6-4e53-9cf3-72544f8a1399" containerID="a0ef714ed80b9877b5e762ee73cb01ae9e769071f74dc2a2689e7de7628babf6" exitCode=0 Mar 10 08:14:03 crc kubenswrapper[4825]: I0310 08:14:03.205415 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552174-xcsb5" event={"ID":"fa9e162c-76d6-4e53-9cf3-72544f8a1399","Type":"ContainerDied","Data":"a0ef714ed80b9877b5e762ee73cb01ae9e769071f74dc2a2689e7de7628babf6"} Mar 10 08:14:03 crc kubenswrapper[4825]: I0310 08:14:03.206220 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7bc9479664-wlv62" Mar 10 08:14:03 crc kubenswrapper[4825]: I0310 08:14:03.206260 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7bc9479664-wlv62" Mar 10 08:14:03 crc kubenswrapper[4825]: I0310 08:14:03.237586 4825 scope.go:117] "RemoveContainer" 
containerID="cc40bbe7e5127cd8ca03c26dd788c9bfc628fd248484a1f5f73946709b5b4414" Mar 10 08:14:03 crc kubenswrapper[4825]: E0310 08:14:03.238080 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:14:04 crc kubenswrapper[4825]: I0310 08:14:04.579755 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552174-xcsb5" Mar 10 08:14:04 crc kubenswrapper[4825]: I0310 08:14:04.705802 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsz5m\" (UniqueName: \"kubernetes.io/projected/fa9e162c-76d6-4e53-9cf3-72544f8a1399-kube-api-access-wsz5m\") pod \"fa9e162c-76d6-4e53-9cf3-72544f8a1399\" (UID: \"fa9e162c-76d6-4e53-9cf3-72544f8a1399\") " Mar 10 08:14:04 crc kubenswrapper[4825]: I0310 08:14:04.711312 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa9e162c-76d6-4e53-9cf3-72544f8a1399-kube-api-access-wsz5m" (OuterVolumeSpecName: "kube-api-access-wsz5m") pod "fa9e162c-76d6-4e53-9cf3-72544f8a1399" (UID: "fa9e162c-76d6-4e53-9cf3-72544f8a1399"). InnerVolumeSpecName "kube-api-access-wsz5m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:14:04 crc kubenswrapper[4825]: I0310 08:14:04.808721 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsz5m\" (UniqueName: \"kubernetes.io/projected/fa9e162c-76d6-4e53-9cf3-72544f8a1399-kube-api-access-wsz5m\") on node \"crc\" DevicePath \"\"" Mar 10 08:14:05 crc kubenswrapper[4825]: E0310 08:14:05.095701 4825 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72b5083e_156e_42a5_abbd_1d71521f8ed4.slice/crio-705bbe7acda5d75f53629b08636b8badb5ad771e565f5bda42324338d62402f2.scope\": RecentStats: unable to find data in memory cache]" Mar 10 08:14:05 crc kubenswrapper[4825]: I0310 08:14:05.224057 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552174-xcsb5" Mar 10 08:14:05 crc kubenswrapper[4825]: I0310 08:14:05.224051 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552174-xcsb5" event={"ID":"fa9e162c-76d6-4e53-9cf3-72544f8a1399","Type":"ContainerDied","Data":"41b3a0749b1964ddcf4d073111e0630bc276bfb2650c1e1de0d2ba912d220fc0"} Mar 10 08:14:05 crc kubenswrapper[4825]: I0310 08:14:05.224215 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41b3a0749b1964ddcf4d073111e0630bc276bfb2650c1e1de0d2ba912d220fc0" Mar 10 08:14:05 crc kubenswrapper[4825]: I0310 08:14:05.225929 4825 generic.go:334] "Generic (PLEG): container finished" podID="72b5083e-156e-42a5-abbd-1d71521f8ed4" containerID="705bbe7acda5d75f53629b08636b8badb5ad771e565f5bda42324338d62402f2" exitCode=0 Mar 10 08:14:05 crc kubenswrapper[4825]: I0310 08:14:05.225971 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-th79c" 
event={"ID":"72b5083e-156e-42a5-abbd-1d71521f8ed4","Type":"ContainerDied","Data":"705bbe7acda5d75f53629b08636b8badb5ad771e565f5bda42324338d62402f2"} Mar 10 08:14:05 crc kubenswrapper[4825]: I0310 08:14:05.670529 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552168-fcmhd"] Mar 10 08:14:05 crc kubenswrapper[4825]: I0310 08:14:05.679831 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552168-fcmhd"] Mar 10 08:14:06 crc kubenswrapper[4825]: I0310 08:14:06.578552 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c7b7657fc-vwr72" Mar 10 08:14:06 crc kubenswrapper[4825]: I0310 08:14:06.661373 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-th79c" Mar 10 08:14:06 crc kubenswrapper[4825]: I0310 08:14:06.679866 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f6cf695d7-glngc"] Mar 10 08:14:06 crc kubenswrapper[4825]: I0310 08:14:06.680269 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-f6cf695d7-glngc" podUID="27777bdb-4fb4-4704-9ccb-df7a83fde963" containerName="dnsmasq-dns" containerID="cri-o://d8d111e7061e5eb38b8fc0fc2bda73f06b2b419ff2ca5ed768b8c753a206df58" gracePeriod=10 Mar 10 08:14:06 crc kubenswrapper[4825]: I0310 08:14:06.860984 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b5083e-156e-42a5-abbd-1d71521f8ed4-combined-ca-bundle\") pod \"72b5083e-156e-42a5-abbd-1d71521f8ed4\" (UID: \"72b5083e-156e-42a5-abbd-1d71521f8ed4\") " Mar 10 08:14:06 crc kubenswrapper[4825]: I0310 08:14:06.861063 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72b5083e-156e-42a5-abbd-1d71521f8ed4-scripts\") pod 
\"72b5083e-156e-42a5-abbd-1d71521f8ed4\" (UID: \"72b5083e-156e-42a5-abbd-1d71521f8ed4\") " Mar 10 08:14:06 crc kubenswrapper[4825]: I0310 08:14:06.861147 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/72b5083e-156e-42a5-abbd-1d71521f8ed4-etc-swift\") pod \"72b5083e-156e-42a5-abbd-1d71521f8ed4\" (UID: \"72b5083e-156e-42a5-abbd-1d71521f8ed4\") " Mar 10 08:14:06 crc kubenswrapper[4825]: I0310 08:14:06.861166 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/72b5083e-156e-42a5-abbd-1d71521f8ed4-dispersionconf\") pod \"72b5083e-156e-42a5-abbd-1d71521f8ed4\" (UID: \"72b5083e-156e-42a5-abbd-1d71521f8ed4\") " Mar 10 08:14:06 crc kubenswrapper[4825]: I0310 08:14:06.861282 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/72b5083e-156e-42a5-abbd-1d71521f8ed4-ring-data-devices\") pod \"72b5083e-156e-42a5-abbd-1d71521f8ed4\" (UID: \"72b5083e-156e-42a5-abbd-1d71521f8ed4\") " Mar 10 08:14:06 crc kubenswrapper[4825]: I0310 08:14:06.861306 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/72b5083e-156e-42a5-abbd-1d71521f8ed4-swiftconf\") pod \"72b5083e-156e-42a5-abbd-1d71521f8ed4\" (UID: \"72b5083e-156e-42a5-abbd-1d71521f8ed4\") " Mar 10 08:14:06 crc kubenswrapper[4825]: I0310 08:14:06.861339 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvhwm\" (UniqueName: \"kubernetes.io/projected/72b5083e-156e-42a5-abbd-1d71521f8ed4-kube-api-access-rvhwm\") pod \"72b5083e-156e-42a5-abbd-1d71521f8ed4\" (UID: \"72b5083e-156e-42a5-abbd-1d71521f8ed4\") " Mar 10 08:14:06 crc kubenswrapper[4825]: I0310 08:14:06.865598 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/72b5083e-156e-42a5-abbd-1d71521f8ed4-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "72b5083e-156e-42a5-abbd-1d71521f8ed4" (UID: "72b5083e-156e-42a5-abbd-1d71521f8ed4"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:14:06 crc kubenswrapper[4825]: I0310 08:14:06.866016 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72b5083e-156e-42a5-abbd-1d71521f8ed4-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "72b5083e-156e-42a5-abbd-1d71521f8ed4" (UID: "72b5083e-156e-42a5-abbd-1d71521f8ed4"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:14:06 crc kubenswrapper[4825]: I0310 08:14:06.869336 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72b5083e-156e-42a5-abbd-1d71521f8ed4-kube-api-access-rvhwm" (OuterVolumeSpecName: "kube-api-access-rvhwm") pod "72b5083e-156e-42a5-abbd-1d71521f8ed4" (UID: "72b5083e-156e-42a5-abbd-1d71521f8ed4"). InnerVolumeSpecName "kube-api-access-rvhwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:14:06 crc kubenswrapper[4825]: I0310 08:14:06.883102 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72b5083e-156e-42a5-abbd-1d71521f8ed4-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "72b5083e-156e-42a5-abbd-1d71521f8ed4" (UID: "72b5083e-156e-42a5-abbd-1d71521f8ed4"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:14:06 crc kubenswrapper[4825]: I0310 08:14:06.887654 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72b5083e-156e-42a5-abbd-1d71521f8ed4-scripts" (OuterVolumeSpecName: "scripts") pod "72b5083e-156e-42a5-abbd-1d71521f8ed4" (UID: "72b5083e-156e-42a5-abbd-1d71521f8ed4"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:14:06 crc kubenswrapper[4825]: I0310 08:14:06.902800 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72b5083e-156e-42a5-abbd-1d71521f8ed4-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "72b5083e-156e-42a5-abbd-1d71521f8ed4" (UID: "72b5083e-156e-42a5-abbd-1d71521f8ed4"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:14:06 crc kubenswrapper[4825]: I0310 08:14:06.916482 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72b5083e-156e-42a5-abbd-1d71521f8ed4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72b5083e-156e-42a5-abbd-1d71521f8ed4" (UID: "72b5083e-156e-42a5-abbd-1d71521f8ed4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:14:06 crc kubenswrapper[4825]: I0310 08:14:06.962863 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72b5083e-156e-42a5-abbd-1d71521f8ed4-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 08:14:06 crc kubenswrapper[4825]: I0310 08:14:06.962895 4825 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/72b5083e-156e-42a5-abbd-1d71521f8ed4-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 10 08:14:06 crc kubenswrapper[4825]: I0310 08:14:06.962905 4825 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/72b5083e-156e-42a5-abbd-1d71521f8ed4-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 10 08:14:06 crc kubenswrapper[4825]: I0310 08:14:06.962914 4825 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/72b5083e-156e-42a5-abbd-1d71521f8ed4-ring-data-devices\") on node \"crc\" 
DevicePath \"\"" Mar 10 08:14:06 crc kubenswrapper[4825]: I0310 08:14:06.962926 4825 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/72b5083e-156e-42a5-abbd-1d71521f8ed4-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 10 08:14:06 crc kubenswrapper[4825]: I0310 08:14:06.962936 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvhwm\" (UniqueName: \"kubernetes.io/projected/72b5083e-156e-42a5-abbd-1d71521f8ed4-kube-api-access-rvhwm\") on node \"crc\" DevicePath \"\"" Mar 10 08:14:06 crc kubenswrapper[4825]: I0310 08:14:06.962945 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b5083e-156e-42a5-abbd-1d71521f8ed4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:14:07 crc kubenswrapper[4825]: I0310 08:14:07.116824 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f6cf695d7-glngc" Mar 10 08:14:07 crc kubenswrapper[4825]: I0310 08:14:07.246767 4825 generic.go:334] "Generic (PLEG): container finished" podID="27777bdb-4fb4-4704-9ccb-df7a83fde963" containerID="d8d111e7061e5eb38b8fc0fc2bda73f06b2b419ff2ca5ed768b8c753a206df58" exitCode=0 Mar 10 08:14:07 crc kubenswrapper[4825]: I0310 08:14:07.246931 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f6cf695d7-glngc" Mar 10 08:14:07 crc kubenswrapper[4825]: I0310 08:14:07.249339 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-th79c" Mar 10 08:14:07 crc kubenswrapper[4825]: I0310 08:14:07.254087 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="501e7d92-d832-43a7-8f14-38e1747824c9" path="/var/lib/kubelet/pods/501e7d92-d832-43a7-8f14-38e1747824c9/volumes" Mar 10 08:14:07 crc kubenswrapper[4825]: I0310 08:14:07.257085 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f6cf695d7-glngc" event={"ID":"27777bdb-4fb4-4704-9ccb-df7a83fde963","Type":"ContainerDied","Data":"d8d111e7061e5eb38b8fc0fc2bda73f06b2b419ff2ca5ed768b8c753a206df58"} Mar 10 08:14:07 crc kubenswrapper[4825]: I0310 08:14:07.257130 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f6cf695d7-glngc" event={"ID":"27777bdb-4fb4-4704-9ccb-df7a83fde963","Type":"ContainerDied","Data":"ef2489eb579fa52c7c2401f5f49b11ed2c4ff01f7501ceb449be010783431bb7"} Mar 10 08:14:07 crc kubenswrapper[4825]: I0310 08:14:07.257163 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-th79c" event={"ID":"72b5083e-156e-42a5-abbd-1d71521f8ed4","Type":"ContainerDied","Data":"779b193065f53f9949df8a5187657a408c5221a6cfb41a75aa7859cee0999279"} Mar 10 08:14:07 crc kubenswrapper[4825]: I0310 08:14:07.257177 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="779b193065f53f9949df8a5187657a408c5221a6cfb41a75aa7859cee0999279" Mar 10 08:14:07 crc kubenswrapper[4825]: I0310 08:14:07.257196 4825 scope.go:117] "RemoveContainer" containerID="d8d111e7061e5eb38b8fc0fc2bda73f06b2b419ff2ca5ed768b8c753a206df58" Mar 10 08:14:07 crc kubenswrapper[4825]: I0310 08:14:07.269959 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27777bdb-4fb4-4704-9ccb-df7a83fde963-dns-svc\") pod \"27777bdb-4fb4-4704-9ccb-df7a83fde963\" (UID: \"27777bdb-4fb4-4704-9ccb-df7a83fde963\") " Mar 10 
08:14:07 crc kubenswrapper[4825]: I0310 08:14:07.270005 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27777bdb-4fb4-4704-9ccb-df7a83fde963-ovsdbserver-nb\") pod \"27777bdb-4fb4-4704-9ccb-df7a83fde963\" (UID: \"27777bdb-4fb4-4704-9ccb-df7a83fde963\") " Mar 10 08:14:07 crc kubenswrapper[4825]: I0310 08:14:07.270060 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-th4cm\" (UniqueName: \"kubernetes.io/projected/27777bdb-4fb4-4704-9ccb-df7a83fde963-kube-api-access-th4cm\") pod \"27777bdb-4fb4-4704-9ccb-df7a83fde963\" (UID: \"27777bdb-4fb4-4704-9ccb-df7a83fde963\") " Mar 10 08:14:07 crc kubenswrapper[4825]: I0310 08:14:07.270085 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27777bdb-4fb4-4704-9ccb-df7a83fde963-config\") pod \"27777bdb-4fb4-4704-9ccb-df7a83fde963\" (UID: \"27777bdb-4fb4-4704-9ccb-df7a83fde963\") " Mar 10 08:14:07 crc kubenswrapper[4825]: I0310 08:14:07.270215 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/27777bdb-4fb4-4704-9ccb-df7a83fde963-ovsdbserver-sb\") pod \"27777bdb-4fb4-4704-9ccb-df7a83fde963\" (UID: \"27777bdb-4fb4-4704-9ccb-df7a83fde963\") " Mar 10 08:14:07 crc kubenswrapper[4825]: I0310 08:14:07.282386 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27777bdb-4fb4-4704-9ccb-df7a83fde963-kube-api-access-th4cm" (OuterVolumeSpecName: "kube-api-access-th4cm") pod "27777bdb-4fb4-4704-9ccb-df7a83fde963" (UID: "27777bdb-4fb4-4704-9ccb-df7a83fde963"). InnerVolumeSpecName "kube-api-access-th4cm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:14:07 crc kubenswrapper[4825]: I0310 08:14:07.291006 4825 scope.go:117] "RemoveContainer" containerID="008822619d45bc01dc85394373c8c5ccf93140832050a2d8a4548874405727ed" Mar 10 08:14:07 crc kubenswrapper[4825]: I0310 08:14:07.311239 4825 scope.go:117] "RemoveContainer" containerID="d8d111e7061e5eb38b8fc0fc2bda73f06b2b419ff2ca5ed768b8c753a206df58" Mar 10 08:14:07 crc kubenswrapper[4825]: E0310 08:14:07.312384 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8d111e7061e5eb38b8fc0fc2bda73f06b2b419ff2ca5ed768b8c753a206df58\": container with ID starting with d8d111e7061e5eb38b8fc0fc2bda73f06b2b419ff2ca5ed768b8c753a206df58 not found: ID does not exist" containerID="d8d111e7061e5eb38b8fc0fc2bda73f06b2b419ff2ca5ed768b8c753a206df58" Mar 10 08:14:07 crc kubenswrapper[4825]: I0310 08:14:07.312437 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8d111e7061e5eb38b8fc0fc2bda73f06b2b419ff2ca5ed768b8c753a206df58"} err="failed to get container status \"d8d111e7061e5eb38b8fc0fc2bda73f06b2b419ff2ca5ed768b8c753a206df58\": rpc error: code = NotFound desc = could not find container \"d8d111e7061e5eb38b8fc0fc2bda73f06b2b419ff2ca5ed768b8c753a206df58\": container with ID starting with d8d111e7061e5eb38b8fc0fc2bda73f06b2b419ff2ca5ed768b8c753a206df58 not found: ID does not exist" Mar 10 08:14:07 crc kubenswrapper[4825]: I0310 08:14:07.312472 4825 scope.go:117] "RemoveContainer" containerID="008822619d45bc01dc85394373c8c5ccf93140832050a2d8a4548874405727ed" Mar 10 08:14:07 crc kubenswrapper[4825]: E0310 08:14:07.314744 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"008822619d45bc01dc85394373c8c5ccf93140832050a2d8a4548874405727ed\": container with ID starting with 
008822619d45bc01dc85394373c8c5ccf93140832050a2d8a4548874405727ed not found: ID does not exist" containerID="008822619d45bc01dc85394373c8c5ccf93140832050a2d8a4548874405727ed" Mar 10 08:14:07 crc kubenswrapper[4825]: I0310 08:14:07.314790 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"008822619d45bc01dc85394373c8c5ccf93140832050a2d8a4548874405727ed"} err="failed to get container status \"008822619d45bc01dc85394373c8c5ccf93140832050a2d8a4548874405727ed\": rpc error: code = NotFound desc = could not find container \"008822619d45bc01dc85394373c8c5ccf93140832050a2d8a4548874405727ed\": container with ID starting with 008822619d45bc01dc85394373c8c5ccf93140832050a2d8a4548874405727ed not found: ID does not exist" Mar 10 08:14:07 crc kubenswrapper[4825]: I0310 08:14:07.321897 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27777bdb-4fb4-4704-9ccb-df7a83fde963-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "27777bdb-4fb4-4704-9ccb-df7a83fde963" (UID: "27777bdb-4fb4-4704-9ccb-df7a83fde963"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:14:07 crc kubenswrapper[4825]: I0310 08:14:07.325450 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27777bdb-4fb4-4704-9ccb-df7a83fde963-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "27777bdb-4fb4-4704-9ccb-df7a83fde963" (UID: "27777bdb-4fb4-4704-9ccb-df7a83fde963"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:14:07 crc kubenswrapper[4825]: I0310 08:14:07.326906 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27777bdb-4fb4-4704-9ccb-df7a83fde963-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "27777bdb-4fb4-4704-9ccb-df7a83fde963" (UID: "27777bdb-4fb4-4704-9ccb-df7a83fde963"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:14:07 crc kubenswrapper[4825]: I0310 08:14:07.327378 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27777bdb-4fb4-4704-9ccb-df7a83fde963-config" (OuterVolumeSpecName: "config") pod "27777bdb-4fb4-4704-9ccb-df7a83fde963" (UID: "27777bdb-4fb4-4704-9ccb-df7a83fde963"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:14:07 crc kubenswrapper[4825]: I0310 08:14:07.373569 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27777bdb-4fb4-4704-9ccb-df7a83fde963-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 08:14:07 crc kubenswrapper[4825]: I0310 08:14:07.373858 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27777bdb-4fb4-4704-9ccb-df7a83fde963-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 08:14:07 crc kubenswrapper[4825]: I0310 08:14:07.373949 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-th4cm\" (UniqueName: \"kubernetes.io/projected/27777bdb-4fb4-4704-9ccb-df7a83fde963-kube-api-access-th4cm\") on node \"crc\" DevicePath \"\"" Mar 10 08:14:07 crc kubenswrapper[4825]: I0310 08:14:07.374329 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27777bdb-4fb4-4704-9ccb-df7a83fde963-config\") on node \"crc\" DevicePath \"\"" Mar 10 08:14:07 crc kubenswrapper[4825]: I0310 08:14:07.374432 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/27777bdb-4fb4-4704-9ccb-df7a83fde963-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 08:14:07 crc kubenswrapper[4825]: I0310 08:14:07.621818 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f6cf695d7-glngc"] 
Mar 10 08:14:07 crc kubenswrapper[4825]: I0310 08:14:07.628580 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f6cf695d7-glngc"] Mar 10 08:14:08 crc kubenswrapper[4825]: I0310 08:14:08.216626 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7bc9479664-wlv62" Mar 10 08:14:08 crc kubenswrapper[4825]: I0310 08:14:08.217004 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7bc9479664-wlv62" Mar 10 08:14:09 crc kubenswrapper[4825]: I0310 08:14:09.248409 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27777bdb-4fb4-4704-9ccb-df7a83fde963" path="/var/lib/kubelet/pods/27777bdb-4fb4-4704-9ccb-df7a83fde963/volumes" Mar 10 08:14:10 crc kubenswrapper[4825]: I0310 08:14:10.380901 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6d6b57675b-q5h4k" Mar 10 08:14:10 crc kubenswrapper[4825]: I0310 08:14:10.382713 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6d6b57675b-q5h4k" Mar 10 08:14:10 crc kubenswrapper[4825]: I0310 08:14:10.516255 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-7bc9479664-wlv62"] Mar 10 08:14:10 crc kubenswrapper[4825]: I0310 08:14:10.516514 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-7bc9479664-wlv62" podUID="8197ce55-6e66-4f96-9964-b4f7bfea74a7" containerName="proxy-httpd" containerID="cri-o://63fa391deabef5224165acf41e5f511070e87347af64375ff1f1bdb5a91b2773" gracePeriod=30 Mar 10 08:14:10 crc kubenswrapper[4825]: I0310 08:14:10.516900 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-7bc9479664-wlv62" podUID="8197ce55-6e66-4f96-9964-b4f7bfea74a7" containerName="proxy-server" 
containerID="cri-o://0445a81d70caf56e23376649a443336447a0a30059364373d8bd1773e80c6108" gracePeriod=30 Mar 10 08:14:11 crc kubenswrapper[4825]: I0310 08:14:11.294675 4825 generic.go:334] "Generic (PLEG): container finished" podID="8197ce55-6e66-4f96-9964-b4f7bfea74a7" containerID="0445a81d70caf56e23376649a443336447a0a30059364373d8bd1773e80c6108" exitCode=0 Mar 10 08:14:11 crc kubenswrapper[4825]: I0310 08:14:11.294989 4825 generic.go:334] "Generic (PLEG): container finished" podID="8197ce55-6e66-4f96-9964-b4f7bfea74a7" containerID="63fa391deabef5224165acf41e5f511070e87347af64375ff1f1bdb5a91b2773" exitCode=0 Mar 10 08:14:11 crc kubenswrapper[4825]: I0310 08:14:11.296120 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7bc9479664-wlv62" event={"ID":"8197ce55-6e66-4f96-9964-b4f7bfea74a7","Type":"ContainerDied","Data":"0445a81d70caf56e23376649a443336447a0a30059364373d8bd1773e80c6108"} Mar 10 08:14:11 crc kubenswrapper[4825]: I0310 08:14:11.296174 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7bc9479664-wlv62" event={"ID":"8197ce55-6e66-4f96-9964-b4f7bfea74a7","Type":"ContainerDied","Data":"63fa391deabef5224165acf41e5f511070e87347af64375ff1f1bdb5a91b2773"} Mar 10 08:14:11 crc kubenswrapper[4825]: I0310 08:14:11.296189 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7bc9479664-wlv62" event={"ID":"8197ce55-6e66-4f96-9964-b4f7bfea74a7","Type":"ContainerDied","Data":"aa6be3e879ecfe1c3d9acc3ae9ac22e109c9095ff450ccdcdc37235eeb300799"} Mar 10 08:14:11 crc kubenswrapper[4825]: I0310 08:14:11.296201 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa6be3e879ecfe1c3d9acc3ae9ac22e109c9095ff450ccdcdc37235eeb300799" Mar 10 08:14:11 crc kubenswrapper[4825]: I0310 08:14:11.340066 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7bc9479664-wlv62" Mar 10 08:14:11 crc kubenswrapper[4825]: I0310 08:14:11.455378 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8197ce55-6e66-4f96-9964-b4f7bfea74a7-combined-ca-bundle\") pod \"8197ce55-6e66-4f96-9964-b4f7bfea74a7\" (UID: \"8197ce55-6e66-4f96-9964-b4f7bfea74a7\") " Mar 10 08:14:11 crc kubenswrapper[4825]: I0310 08:14:11.455446 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8197ce55-6e66-4f96-9964-b4f7bfea74a7-log-httpd\") pod \"8197ce55-6e66-4f96-9964-b4f7bfea74a7\" (UID: \"8197ce55-6e66-4f96-9964-b4f7bfea74a7\") " Mar 10 08:14:11 crc kubenswrapper[4825]: I0310 08:14:11.455522 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8197ce55-6e66-4f96-9964-b4f7bfea74a7-run-httpd\") pod \"8197ce55-6e66-4f96-9964-b4f7bfea74a7\" (UID: \"8197ce55-6e66-4f96-9964-b4f7bfea74a7\") " Mar 10 08:14:11 crc kubenswrapper[4825]: I0310 08:14:11.455553 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8197ce55-6e66-4f96-9964-b4f7bfea74a7-etc-swift\") pod \"8197ce55-6e66-4f96-9964-b4f7bfea74a7\" (UID: \"8197ce55-6e66-4f96-9964-b4f7bfea74a7\") " Mar 10 08:14:11 crc kubenswrapper[4825]: I0310 08:14:11.455615 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8197ce55-6e66-4f96-9964-b4f7bfea74a7-config-data\") pod \"8197ce55-6e66-4f96-9964-b4f7bfea74a7\" (UID: \"8197ce55-6e66-4f96-9964-b4f7bfea74a7\") " Mar 10 08:14:11 crc kubenswrapper[4825]: I0310 08:14:11.455641 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtj5t\" (UniqueName: 
\"kubernetes.io/projected/8197ce55-6e66-4f96-9964-b4f7bfea74a7-kube-api-access-xtj5t\") pod \"8197ce55-6e66-4f96-9964-b4f7bfea74a7\" (UID: \"8197ce55-6e66-4f96-9964-b4f7bfea74a7\") " Mar 10 08:14:11 crc kubenswrapper[4825]: I0310 08:14:11.456030 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8197ce55-6e66-4f96-9964-b4f7bfea74a7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8197ce55-6e66-4f96-9964-b4f7bfea74a7" (UID: "8197ce55-6e66-4f96-9964-b4f7bfea74a7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:14:11 crc kubenswrapper[4825]: I0310 08:14:11.456296 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8197ce55-6e66-4f96-9964-b4f7bfea74a7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8197ce55-6e66-4f96-9964-b4f7bfea74a7" (UID: "8197ce55-6e66-4f96-9964-b4f7bfea74a7"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:14:11 crc kubenswrapper[4825]: I0310 08:14:11.457045 4825 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8197ce55-6e66-4f96-9964-b4f7bfea74a7-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 08:14:11 crc kubenswrapper[4825]: I0310 08:14:11.457072 4825 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8197ce55-6e66-4f96-9964-b4f7bfea74a7-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 08:14:11 crc kubenswrapper[4825]: I0310 08:14:11.461854 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8197ce55-6e66-4f96-9964-b4f7bfea74a7-kube-api-access-xtj5t" (OuterVolumeSpecName: "kube-api-access-xtj5t") pod "8197ce55-6e66-4f96-9964-b4f7bfea74a7" (UID: "8197ce55-6e66-4f96-9964-b4f7bfea74a7"). InnerVolumeSpecName "kube-api-access-xtj5t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:14:11 crc kubenswrapper[4825]: I0310 08:14:11.461901 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8197ce55-6e66-4f96-9964-b4f7bfea74a7-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "8197ce55-6e66-4f96-9964-b4f7bfea74a7" (UID: "8197ce55-6e66-4f96-9964-b4f7bfea74a7"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:14:11 crc kubenswrapper[4825]: I0310 08:14:11.507411 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8197ce55-6e66-4f96-9964-b4f7bfea74a7-config-data" (OuterVolumeSpecName: "config-data") pod "8197ce55-6e66-4f96-9964-b4f7bfea74a7" (UID: "8197ce55-6e66-4f96-9964-b4f7bfea74a7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:14:11 crc kubenswrapper[4825]: I0310 08:14:11.510741 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8197ce55-6e66-4f96-9964-b4f7bfea74a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8197ce55-6e66-4f96-9964-b4f7bfea74a7" (UID: "8197ce55-6e66-4f96-9964-b4f7bfea74a7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:14:11 crc kubenswrapper[4825]: I0310 08:14:11.558412 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8197ce55-6e66-4f96-9964-b4f7bfea74a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:14:11 crc kubenswrapper[4825]: I0310 08:14:11.558645 4825 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8197ce55-6e66-4f96-9964-b4f7bfea74a7-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 10 08:14:11 crc kubenswrapper[4825]: I0310 08:14:11.558703 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8197ce55-6e66-4f96-9964-b4f7bfea74a7-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 08:14:11 crc kubenswrapper[4825]: I0310 08:14:11.558766 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtj5t\" (UniqueName: \"kubernetes.io/projected/8197ce55-6e66-4f96-9964-b4f7bfea74a7-kube-api-access-xtj5t\") on node \"crc\" DevicePath \"\"" Mar 10 08:14:12 crc kubenswrapper[4825]: I0310 08:14:12.306258 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7bc9479664-wlv62" Mar 10 08:14:12 crc kubenswrapper[4825]: I0310 08:14:12.347058 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-7bc9479664-wlv62"] Mar 10 08:14:12 crc kubenswrapper[4825]: I0310 08:14:12.354623 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-7bc9479664-wlv62"] Mar 10 08:14:13 crc kubenswrapper[4825]: I0310 08:14:13.246580 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8197ce55-6e66-4f96-9964-b4f7bfea74a7" path="/var/lib/kubelet/pods/8197ce55-6e66-4f96-9964-b4f7bfea74a7/volumes" Mar 10 08:14:16 crc kubenswrapper[4825]: I0310 08:14:16.327695 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-dw5mh"] Mar 10 08:14:16 crc kubenswrapper[4825]: E0310 08:14:16.328425 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27777bdb-4fb4-4704-9ccb-df7a83fde963" containerName="dnsmasq-dns" Mar 10 08:14:16 crc kubenswrapper[4825]: I0310 08:14:16.328440 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="27777bdb-4fb4-4704-9ccb-df7a83fde963" containerName="dnsmasq-dns" Mar 10 08:14:16 crc kubenswrapper[4825]: E0310 08:14:16.328464 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72b5083e-156e-42a5-abbd-1d71521f8ed4" containerName="swift-ring-rebalance" Mar 10 08:14:16 crc kubenswrapper[4825]: I0310 08:14:16.328473 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="72b5083e-156e-42a5-abbd-1d71521f8ed4" containerName="swift-ring-rebalance" Mar 10 08:14:16 crc kubenswrapper[4825]: E0310 08:14:16.328507 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8197ce55-6e66-4f96-9964-b4f7bfea74a7" containerName="proxy-httpd" Mar 10 08:14:16 crc kubenswrapper[4825]: I0310 08:14:16.328515 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="8197ce55-6e66-4f96-9964-b4f7bfea74a7" containerName="proxy-httpd" Mar 10 
08:14:16 crc kubenswrapper[4825]: E0310 08:14:16.328533 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8197ce55-6e66-4f96-9964-b4f7bfea74a7" containerName="proxy-server" Mar 10 08:14:16 crc kubenswrapper[4825]: I0310 08:14:16.328541 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="8197ce55-6e66-4f96-9964-b4f7bfea74a7" containerName="proxy-server" Mar 10 08:14:16 crc kubenswrapper[4825]: E0310 08:14:16.328551 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa9e162c-76d6-4e53-9cf3-72544f8a1399" containerName="oc" Mar 10 08:14:16 crc kubenswrapper[4825]: I0310 08:14:16.328558 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa9e162c-76d6-4e53-9cf3-72544f8a1399" containerName="oc" Mar 10 08:14:16 crc kubenswrapper[4825]: E0310 08:14:16.328574 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27777bdb-4fb4-4704-9ccb-df7a83fde963" containerName="init" Mar 10 08:14:16 crc kubenswrapper[4825]: I0310 08:14:16.328581 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="27777bdb-4fb4-4704-9ccb-df7a83fde963" containerName="init" Mar 10 08:14:16 crc kubenswrapper[4825]: I0310 08:14:16.328764 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa9e162c-76d6-4e53-9cf3-72544f8a1399" containerName="oc" Mar 10 08:14:16 crc kubenswrapper[4825]: I0310 08:14:16.328784 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="27777bdb-4fb4-4704-9ccb-df7a83fde963" containerName="dnsmasq-dns" Mar 10 08:14:16 crc kubenswrapper[4825]: I0310 08:14:16.328795 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="72b5083e-156e-42a5-abbd-1d71521f8ed4" containerName="swift-ring-rebalance" Mar 10 08:14:16 crc kubenswrapper[4825]: I0310 08:14:16.328810 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="8197ce55-6e66-4f96-9964-b4f7bfea74a7" containerName="proxy-httpd" Mar 10 08:14:16 crc kubenswrapper[4825]: I0310 08:14:16.328825 4825 
memory_manager.go:354] "RemoveStaleState removing state" podUID="8197ce55-6e66-4f96-9964-b4f7bfea74a7" containerName="proxy-server" Mar 10 08:14:16 crc kubenswrapper[4825]: I0310 08:14:16.329492 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-dw5mh" Mar 10 08:14:16 crc kubenswrapper[4825]: I0310 08:14:16.336710 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-dw5mh"] Mar 10 08:14:16 crc kubenswrapper[4825]: I0310 08:14:16.360881 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/986cc5c1-8756-4ea7-ae05-97ce9a4182b1-operator-scripts\") pod \"cinder-db-create-dw5mh\" (UID: \"986cc5c1-8756-4ea7-ae05-97ce9a4182b1\") " pod="openstack/cinder-db-create-dw5mh" Mar 10 08:14:16 crc kubenswrapper[4825]: I0310 08:14:16.360970 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9vwj\" (UniqueName: \"kubernetes.io/projected/986cc5c1-8756-4ea7-ae05-97ce9a4182b1-kube-api-access-t9vwj\") pod \"cinder-db-create-dw5mh\" (UID: \"986cc5c1-8756-4ea7-ae05-97ce9a4182b1\") " pod="openstack/cinder-db-create-dw5mh" Mar 10 08:14:16 crc kubenswrapper[4825]: I0310 08:14:16.426408 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-14b7-account-create-update-lmmrz"] Mar 10 08:14:16 crc kubenswrapper[4825]: I0310 08:14:16.427660 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-14b7-account-create-update-lmmrz" Mar 10 08:14:16 crc kubenswrapper[4825]: I0310 08:14:16.429203 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 10 08:14:16 crc kubenswrapper[4825]: I0310 08:14:16.462390 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg4r6\" (UniqueName: \"kubernetes.io/projected/0d16dea2-c672-4fc5-932e-734973e299dd-kube-api-access-fg4r6\") pod \"cinder-14b7-account-create-update-lmmrz\" (UID: \"0d16dea2-c672-4fc5-932e-734973e299dd\") " pod="openstack/cinder-14b7-account-create-update-lmmrz" Mar 10 08:14:16 crc kubenswrapper[4825]: I0310 08:14:16.462454 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/986cc5c1-8756-4ea7-ae05-97ce9a4182b1-operator-scripts\") pod \"cinder-db-create-dw5mh\" (UID: \"986cc5c1-8756-4ea7-ae05-97ce9a4182b1\") " pod="openstack/cinder-db-create-dw5mh" Mar 10 08:14:16 crc kubenswrapper[4825]: I0310 08:14:16.462546 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9vwj\" (UniqueName: \"kubernetes.io/projected/986cc5c1-8756-4ea7-ae05-97ce9a4182b1-kube-api-access-t9vwj\") pod \"cinder-db-create-dw5mh\" (UID: \"986cc5c1-8756-4ea7-ae05-97ce9a4182b1\") " pod="openstack/cinder-db-create-dw5mh" Mar 10 08:14:16 crc kubenswrapper[4825]: I0310 08:14:16.462611 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d16dea2-c672-4fc5-932e-734973e299dd-operator-scripts\") pod \"cinder-14b7-account-create-update-lmmrz\" (UID: \"0d16dea2-c672-4fc5-932e-734973e299dd\") " pod="openstack/cinder-14b7-account-create-update-lmmrz" Mar 10 08:14:16 crc kubenswrapper[4825]: I0310 08:14:16.463721 4825 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/986cc5c1-8756-4ea7-ae05-97ce9a4182b1-operator-scripts\") pod \"cinder-db-create-dw5mh\" (UID: \"986cc5c1-8756-4ea7-ae05-97ce9a4182b1\") " pod="openstack/cinder-db-create-dw5mh" Mar 10 08:14:16 crc kubenswrapper[4825]: I0310 08:14:16.480010 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9vwj\" (UniqueName: \"kubernetes.io/projected/986cc5c1-8756-4ea7-ae05-97ce9a4182b1-kube-api-access-t9vwj\") pod \"cinder-db-create-dw5mh\" (UID: \"986cc5c1-8756-4ea7-ae05-97ce9a4182b1\") " pod="openstack/cinder-db-create-dw5mh" Mar 10 08:14:16 crc kubenswrapper[4825]: I0310 08:14:16.563812 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d16dea2-c672-4fc5-932e-734973e299dd-operator-scripts\") pod \"cinder-14b7-account-create-update-lmmrz\" (UID: \"0d16dea2-c672-4fc5-932e-734973e299dd\") " pod="openstack/cinder-14b7-account-create-update-lmmrz" Mar 10 08:14:16 crc kubenswrapper[4825]: I0310 08:14:16.563923 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fg4r6\" (UniqueName: \"kubernetes.io/projected/0d16dea2-c672-4fc5-932e-734973e299dd-kube-api-access-fg4r6\") pod \"cinder-14b7-account-create-update-lmmrz\" (UID: \"0d16dea2-c672-4fc5-932e-734973e299dd\") " pod="openstack/cinder-14b7-account-create-update-lmmrz" Mar 10 08:14:16 crc kubenswrapper[4825]: I0310 08:14:16.564601 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d16dea2-c672-4fc5-932e-734973e299dd-operator-scripts\") pod \"cinder-14b7-account-create-update-lmmrz\" (UID: \"0d16dea2-c672-4fc5-932e-734973e299dd\") " pod="openstack/cinder-14b7-account-create-update-lmmrz" Mar 10 08:14:16 crc kubenswrapper[4825]: I0310 08:14:16.580807 4825 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg4r6\" (UniqueName: \"kubernetes.io/projected/0d16dea2-c672-4fc5-932e-734973e299dd-kube-api-access-fg4r6\") pod \"cinder-14b7-account-create-update-lmmrz\" (UID: \"0d16dea2-c672-4fc5-932e-734973e299dd\") " pod="openstack/cinder-14b7-account-create-update-lmmrz" Mar 10 08:14:16 crc kubenswrapper[4825]: I0310 08:14:16.637718 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-14b7-account-create-update-lmmrz"] Mar 10 08:14:16 crc kubenswrapper[4825]: I0310 08:14:16.650743 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-dw5mh" Mar 10 08:14:16 crc kubenswrapper[4825]: I0310 08:14:16.743557 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-14b7-account-create-update-lmmrz" Mar 10 08:14:17 crc kubenswrapper[4825]: I0310 08:14:17.110233 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-dw5mh"] Mar 10 08:14:17 crc kubenswrapper[4825]: I0310 08:14:17.213422 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-14b7-account-create-update-lmmrz"] Mar 10 08:14:17 crc kubenswrapper[4825]: W0310 08:14:17.223967 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d16dea2_c672_4fc5_932e_734973e299dd.slice/crio-93628cd0522e786999e5094d3cd41d5e628e898f11302a51e04458f332434768 WatchSource:0}: Error finding container 93628cd0522e786999e5094d3cd41d5e628e898f11302a51e04458f332434768: Status 404 returned error can't find the container with id 93628cd0522e786999e5094d3cd41d5e628e898f11302a51e04458f332434768 Mar 10 08:14:17 crc kubenswrapper[4825]: I0310 08:14:17.239893 4825 scope.go:117] "RemoveContainer" containerID="cc40bbe7e5127cd8ca03c26dd788c9bfc628fd248484a1f5f73946709b5b4414" Mar 10 08:14:17 crc kubenswrapper[4825]: E0310 
08:14:17.240209 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:14:17 crc kubenswrapper[4825]: I0310 08:14:17.356830 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-14b7-account-create-update-lmmrz" event={"ID":"0d16dea2-c672-4fc5-932e-734973e299dd","Type":"ContainerStarted","Data":"93628cd0522e786999e5094d3cd41d5e628e898f11302a51e04458f332434768"} Mar 10 08:14:17 crc kubenswrapper[4825]: I0310 08:14:17.359438 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-dw5mh" event={"ID":"986cc5c1-8756-4ea7-ae05-97ce9a4182b1","Type":"ContainerStarted","Data":"7de10519b9e1b6f98c1a1256dab9a133f0e65a322025cc77b71087bb11943577"} Mar 10 08:14:17 crc kubenswrapper[4825]: I0310 08:14:17.359513 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-dw5mh" event={"ID":"986cc5c1-8756-4ea7-ae05-97ce9a4182b1","Type":"ContainerStarted","Data":"5c37eb8b620178ed8e39840d782b45091da94d56a457e93baae899cd7d5d7f21"} Mar 10 08:14:17 crc kubenswrapper[4825]: I0310 08:14:17.381642 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-dw5mh" podStartSLOduration=1.381621252 podStartE2EDuration="1.381621252s" podCreationTimestamp="2026-03-10 08:14:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:14:17.375304046 +0000 UTC m=+5410.405084661" watchObservedRunningTime="2026-03-10 08:14:17.381621252 +0000 UTC m=+5410.411401867" Mar 10 08:14:18 crc kubenswrapper[4825]: 
I0310 08:14:18.369461 4825 generic.go:334] "Generic (PLEG): container finished" podID="986cc5c1-8756-4ea7-ae05-97ce9a4182b1" containerID="7de10519b9e1b6f98c1a1256dab9a133f0e65a322025cc77b71087bb11943577" exitCode=0 Mar 10 08:14:18 crc kubenswrapper[4825]: I0310 08:14:18.369886 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-dw5mh" event={"ID":"986cc5c1-8756-4ea7-ae05-97ce9a4182b1","Type":"ContainerDied","Data":"7de10519b9e1b6f98c1a1256dab9a133f0e65a322025cc77b71087bb11943577"} Mar 10 08:14:18 crc kubenswrapper[4825]: I0310 08:14:18.372049 4825 generic.go:334] "Generic (PLEG): container finished" podID="0d16dea2-c672-4fc5-932e-734973e299dd" containerID="0f554e63e5c766240e80d6b35cb69ff0af9a63cce450ff6a93312478f01708a4" exitCode=0 Mar 10 08:14:18 crc kubenswrapper[4825]: I0310 08:14:18.372093 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-14b7-account-create-update-lmmrz" event={"ID":"0d16dea2-c672-4fc5-932e-734973e299dd","Type":"ContainerDied","Data":"0f554e63e5c766240e80d6b35cb69ff0af9a63cce450ff6a93312478f01708a4"} Mar 10 08:14:19 crc kubenswrapper[4825]: I0310 08:14:19.775608 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-dw5mh" Mar 10 08:14:19 crc kubenswrapper[4825]: I0310 08:14:19.780733 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-14b7-account-create-update-lmmrz" Mar 10 08:14:19 crc kubenswrapper[4825]: I0310 08:14:19.934480 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9vwj\" (UniqueName: \"kubernetes.io/projected/986cc5c1-8756-4ea7-ae05-97ce9a4182b1-kube-api-access-t9vwj\") pod \"986cc5c1-8756-4ea7-ae05-97ce9a4182b1\" (UID: \"986cc5c1-8756-4ea7-ae05-97ce9a4182b1\") " Mar 10 08:14:19 crc kubenswrapper[4825]: I0310 08:14:19.934597 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d16dea2-c672-4fc5-932e-734973e299dd-operator-scripts\") pod \"0d16dea2-c672-4fc5-932e-734973e299dd\" (UID: \"0d16dea2-c672-4fc5-932e-734973e299dd\") " Mar 10 08:14:19 crc kubenswrapper[4825]: I0310 08:14:19.934633 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/986cc5c1-8756-4ea7-ae05-97ce9a4182b1-operator-scripts\") pod \"986cc5c1-8756-4ea7-ae05-97ce9a4182b1\" (UID: \"986cc5c1-8756-4ea7-ae05-97ce9a4182b1\") " Mar 10 08:14:19 crc kubenswrapper[4825]: I0310 08:14:19.934661 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fg4r6\" (UniqueName: \"kubernetes.io/projected/0d16dea2-c672-4fc5-932e-734973e299dd-kube-api-access-fg4r6\") pod \"0d16dea2-c672-4fc5-932e-734973e299dd\" (UID: \"0d16dea2-c672-4fc5-932e-734973e299dd\") " Mar 10 08:14:19 crc kubenswrapper[4825]: I0310 08:14:19.935245 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/986cc5c1-8756-4ea7-ae05-97ce9a4182b1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "986cc5c1-8756-4ea7-ae05-97ce9a4182b1" (UID: "986cc5c1-8756-4ea7-ae05-97ce9a4182b1"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:14:19 crc kubenswrapper[4825]: I0310 08:14:19.935258 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d16dea2-c672-4fc5-932e-734973e299dd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0d16dea2-c672-4fc5-932e-734973e299dd" (UID: "0d16dea2-c672-4fc5-932e-734973e299dd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:14:19 crc kubenswrapper[4825]: I0310 08:14:19.940226 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/986cc5c1-8756-4ea7-ae05-97ce9a4182b1-kube-api-access-t9vwj" (OuterVolumeSpecName: "kube-api-access-t9vwj") pod "986cc5c1-8756-4ea7-ae05-97ce9a4182b1" (UID: "986cc5c1-8756-4ea7-ae05-97ce9a4182b1"). InnerVolumeSpecName "kube-api-access-t9vwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:14:19 crc kubenswrapper[4825]: I0310 08:14:19.940504 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d16dea2-c672-4fc5-932e-734973e299dd-kube-api-access-fg4r6" (OuterVolumeSpecName: "kube-api-access-fg4r6") pod "0d16dea2-c672-4fc5-932e-734973e299dd" (UID: "0d16dea2-c672-4fc5-932e-734973e299dd"). InnerVolumeSpecName "kube-api-access-fg4r6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:14:20 crc kubenswrapper[4825]: I0310 08:14:20.036531 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9vwj\" (UniqueName: \"kubernetes.io/projected/986cc5c1-8756-4ea7-ae05-97ce9a4182b1-kube-api-access-t9vwj\") on node \"crc\" DevicePath \"\"" Mar 10 08:14:20 crc kubenswrapper[4825]: I0310 08:14:20.036575 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d16dea2-c672-4fc5-932e-734973e299dd-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 08:14:20 crc kubenswrapper[4825]: I0310 08:14:20.036588 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/986cc5c1-8756-4ea7-ae05-97ce9a4182b1-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 08:14:20 crc kubenswrapper[4825]: I0310 08:14:20.036600 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fg4r6\" (UniqueName: \"kubernetes.io/projected/0d16dea2-c672-4fc5-932e-734973e299dd-kube-api-access-fg4r6\") on node \"crc\" DevicePath \"\"" Mar 10 08:14:20 crc kubenswrapper[4825]: I0310 08:14:20.391687 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-14b7-account-create-update-lmmrz" event={"ID":"0d16dea2-c672-4fc5-932e-734973e299dd","Type":"ContainerDied","Data":"93628cd0522e786999e5094d3cd41d5e628e898f11302a51e04458f332434768"} Mar 10 08:14:20 crc kubenswrapper[4825]: I0310 08:14:20.391759 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93628cd0522e786999e5094d3cd41d5e628e898f11302a51e04458f332434768" Mar 10 08:14:20 crc kubenswrapper[4825]: I0310 08:14:20.391758 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-14b7-account-create-update-lmmrz" Mar 10 08:14:20 crc kubenswrapper[4825]: I0310 08:14:20.394373 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-dw5mh" event={"ID":"986cc5c1-8756-4ea7-ae05-97ce9a4182b1","Type":"ContainerDied","Data":"5c37eb8b620178ed8e39840d782b45091da94d56a457e93baae899cd7d5d7f21"} Mar 10 08:14:20 crc kubenswrapper[4825]: I0310 08:14:20.394402 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c37eb8b620178ed8e39840d782b45091da94d56a457e93baae899cd7d5d7f21" Mar 10 08:14:20 crc kubenswrapper[4825]: I0310 08:14:20.394422 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-dw5mh" Mar 10 08:14:21 crc kubenswrapper[4825]: I0310 08:14:21.816416 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-xf7vk"] Mar 10 08:14:21 crc kubenswrapper[4825]: E0310 08:14:21.816749 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d16dea2-c672-4fc5-932e-734973e299dd" containerName="mariadb-account-create-update" Mar 10 08:14:21 crc kubenswrapper[4825]: I0310 08:14:21.816764 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d16dea2-c672-4fc5-932e-734973e299dd" containerName="mariadb-account-create-update" Mar 10 08:14:21 crc kubenswrapper[4825]: E0310 08:14:21.816792 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="986cc5c1-8756-4ea7-ae05-97ce9a4182b1" containerName="mariadb-database-create" Mar 10 08:14:21 crc kubenswrapper[4825]: I0310 08:14:21.816801 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="986cc5c1-8756-4ea7-ae05-97ce9a4182b1" containerName="mariadb-database-create" Mar 10 08:14:21 crc kubenswrapper[4825]: I0310 08:14:21.816990 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d16dea2-c672-4fc5-932e-734973e299dd" containerName="mariadb-account-create-update" 
Mar 10 08:14:21 crc kubenswrapper[4825]: I0310 08:14:21.817016 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="986cc5c1-8756-4ea7-ae05-97ce9a4182b1" containerName="mariadb-database-create" Mar 10 08:14:21 crc kubenswrapper[4825]: I0310 08:14:21.817601 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-xf7vk" Mar 10 08:14:21 crc kubenswrapper[4825]: I0310 08:14:21.819845 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-hgw5n" Mar 10 08:14:21 crc kubenswrapper[4825]: I0310 08:14:21.820559 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 10 08:14:21 crc kubenswrapper[4825]: I0310 08:14:21.820862 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 10 08:14:21 crc kubenswrapper[4825]: I0310 08:14:21.829919 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-xf7vk"] Mar 10 08:14:21 crc kubenswrapper[4825]: I0310 08:14:21.870649 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50272a2f-4ff6-4c66-b7de-80bf6ce236c3-config-data\") pod \"cinder-db-sync-xf7vk\" (UID: \"50272a2f-4ff6-4c66-b7de-80bf6ce236c3\") " pod="openstack/cinder-db-sync-xf7vk" Mar 10 08:14:21 crc kubenswrapper[4825]: I0310 08:14:21.870689 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/50272a2f-4ff6-4c66-b7de-80bf6ce236c3-etc-machine-id\") pod \"cinder-db-sync-xf7vk\" (UID: \"50272a2f-4ff6-4c66-b7de-80bf6ce236c3\") " pod="openstack/cinder-db-sync-xf7vk" Mar 10 08:14:21 crc kubenswrapper[4825]: I0310 08:14:21.870707 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/50272a2f-4ff6-4c66-b7de-80bf6ce236c3-combined-ca-bundle\") pod \"cinder-db-sync-xf7vk\" (UID: \"50272a2f-4ff6-4c66-b7de-80bf6ce236c3\") " pod="openstack/cinder-db-sync-xf7vk" Mar 10 08:14:21 crc kubenswrapper[4825]: I0310 08:14:21.870863 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50272a2f-4ff6-4c66-b7de-80bf6ce236c3-scripts\") pod \"cinder-db-sync-xf7vk\" (UID: \"50272a2f-4ff6-4c66-b7de-80bf6ce236c3\") " pod="openstack/cinder-db-sync-xf7vk" Mar 10 08:14:21 crc kubenswrapper[4825]: I0310 08:14:21.870879 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/50272a2f-4ff6-4c66-b7de-80bf6ce236c3-db-sync-config-data\") pod \"cinder-db-sync-xf7vk\" (UID: \"50272a2f-4ff6-4c66-b7de-80bf6ce236c3\") " pod="openstack/cinder-db-sync-xf7vk" Mar 10 08:14:21 crc kubenswrapper[4825]: I0310 08:14:21.870902 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22k5q\" (UniqueName: \"kubernetes.io/projected/50272a2f-4ff6-4c66-b7de-80bf6ce236c3-kube-api-access-22k5q\") pod \"cinder-db-sync-xf7vk\" (UID: \"50272a2f-4ff6-4c66-b7de-80bf6ce236c3\") " pod="openstack/cinder-db-sync-xf7vk" Mar 10 08:14:21 crc kubenswrapper[4825]: I0310 08:14:21.972367 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50272a2f-4ff6-4c66-b7de-80bf6ce236c3-scripts\") pod \"cinder-db-sync-xf7vk\" (UID: \"50272a2f-4ff6-4c66-b7de-80bf6ce236c3\") " pod="openstack/cinder-db-sync-xf7vk" Mar 10 08:14:21 crc kubenswrapper[4825]: I0310 08:14:21.972423 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/50272a2f-4ff6-4c66-b7de-80bf6ce236c3-db-sync-config-data\") pod \"cinder-db-sync-xf7vk\" (UID: \"50272a2f-4ff6-4c66-b7de-80bf6ce236c3\") " pod="openstack/cinder-db-sync-xf7vk" Mar 10 08:14:21 crc kubenswrapper[4825]: I0310 08:14:21.972453 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22k5q\" (UniqueName: \"kubernetes.io/projected/50272a2f-4ff6-4c66-b7de-80bf6ce236c3-kube-api-access-22k5q\") pod \"cinder-db-sync-xf7vk\" (UID: \"50272a2f-4ff6-4c66-b7de-80bf6ce236c3\") " pod="openstack/cinder-db-sync-xf7vk" Mar 10 08:14:21 crc kubenswrapper[4825]: I0310 08:14:21.972505 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50272a2f-4ff6-4c66-b7de-80bf6ce236c3-config-data\") pod \"cinder-db-sync-xf7vk\" (UID: \"50272a2f-4ff6-4c66-b7de-80bf6ce236c3\") " pod="openstack/cinder-db-sync-xf7vk" Mar 10 08:14:21 crc kubenswrapper[4825]: I0310 08:14:21.972524 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/50272a2f-4ff6-4c66-b7de-80bf6ce236c3-etc-machine-id\") pod \"cinder-db-sync-xf7vk\" (UID: \"50272a2f-4ff6-4c66-b7de-80bf6ce236c3\") " pod="openstack/cinder-db-sync-xf7vk" Mar 10 08:14:21 crc kubenswrapper[4825]: I0310 08:14:21.972539 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50272a2f-4ff6-4c66-b7de-80bf6ce236c3-combined-ca-bundle\") pod \"cinder-db-sync-xf7vk\" (UID: \"50272a2f-4ff6-4c66-b7de-80bf6ce236c3\") " pod="openstack/cinder-db-sync-xf7vk" Mar 10 08:14:21 crc kubenswrapper[4825]: I0310 08:14:21.972910 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/50272a2f-4ff6-4c66-b7de-80bf6ce236c3-etc-machine-id\") pod \"cinder-db-sync-xf7vk\" (UID: 
\"50272a2f-4ff6-4c66-b7de-80bf6ce236c3\") " pod="openstack/cinder-db-sync-xf7vk" Mar 10 08:14:21 crc kubenswrapper[4825]: I0310 08:14:21.979040 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50272a2f-4ff6-4c66-b7de-80bf6ce236c3-scripts\") pod \"cinder-db-sync-xf7vk\" (UID: \"50272a2f-4ff6-4c66-b7de-80bf6ce236c3\") " pod="openstack/cinder-db-sync-xf7vk" Mar 10 08:14:21 crc kubenswrapper[4825]: I0310 08:14:21.979528 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50272a2f-4ff6-4c66-b7de-80bf6ce236c3-config-data\") pod \"cinder-db-sync-xf7vk\" (UID: \"50272a2f-4ff6-4c66-b7de-80bf6ce236c3\") " pod="openstack/cinder-db-sync-xf7vk" Mar 10 08:14:21 crc kubenswrapper[4825]: I0310 08:14:21.979757 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/50272a2f-4ff6-4c66-b7de-80bf6ce236c3-db-sync-config-data\") pod \"cinder-db-sync-xf7vk\" (UID: \"50272a2f-4ff6-4c66-b7de-80bf6ce236c3\") " pod="openstack/cinder-db-sync-xf7vk" Mar 10 08:14:21 crc kubenswrapper[4825]: I0310 08:14:21.980197 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50272a2f-4ff6-4c66-b7de-80bf6ce236c3-combined-ca-bundle\") pod \"cinder-db-sync-xf7vk\" (UID: \"50272a2f-4ff6-4c66-b7de-80bf6ce236c3\") " pod="openstack/cinder-db-sync-xf7vk" Mar 10 08:14:21 crc kubenswrapper[4825]: I0310 08:14:21.989450 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22k5q\" (UniqueName: \"kubernetes.io/projected/50272a2f-4ff6-4c66-b7de-80bf6ce236c3-kube-api-access-22k5q\") pod \"cinder-db-sync-xf7vk\" (UID: \"50272a2f-4ff6-4c66-b7de-80bf6ce236c3\") " pod="openstack/cinder-db-sync-xf7vk" Mar 10 08:14:22 crc kubenswrapper[4825]: I0310 08:14:22.143735 4825 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-xf7vk" Mar 10 08:14:22 crc kubenswrapper[4825]: I0310 08:14:22.591199 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-xf7vk"] Mar 10 08:14:22 crc kubenswrapper[4825]: W0310 08:14:22.592043 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50272a2f_4ff6_4c66_b7de_80bf6ce236c3.slice/crio-d11f62618636f0837efa7aa7166f35c750c15ad1cbc9aca8c3038faf9d1ea948 WatchSource:0}: Error finding container d11f62618636f0837efa7aa7166f35c750c15ad1cbc9aca8c3038faf9d1ea948: Status 404 returned error can't find the container with id d11f62618636f0837efa7aa7166f35c750c15ad1cbc9aca8c3038faf9d1ea948 Mar 10 08:14:22 crc kubenswrapper[4825]: I0310 08:14:22.594575 4825 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 08:14:23 crc kubenswrapper[4825]: I0310 08:14:23.422738 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xf7vk" event={"ID":"50272a2f-4ff6-4c66-b7de-80bf6ce236c3","Type":"ContainerStarted","Data":"d11f62618636f0837efa7aa7166f35c750c15ad1cbc9aca8c3038faf9d1ea948"} Mar 10 08:14:31 crc kubenswrapper[4825]: I0310 08:14:31.236781 4825 scope.go:117] "RemoveContainer" containerID="cc40bbe7e5127cd8ca03c26dd788c9bfc628fd248484a1f5f73946709b5b4414" Mar 10 08:14:31 crc kubenswrapper[4825]: E0310 08:14:31.237331 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:14:31 crc kubenswrapper[4825]: I0310 08:14:31.306106 4825 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lrrtv"] Mar 10 08:14:31 crc kubenswrapper[4825]: I0310 08:14:31.308258 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lrrtv" Mar 10 08:14:31 crc kubenswrapper[4825]: I0310 08:14:31.318341 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lrrtv"] Mar 10 08:14:31 crc kubenswrapper[4825]: I0310 08:14:31.452539 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d703efe1-32d9-438b-af28-bea13efa058e-utilities\") pod \"community-operators-lrrtv\" (UID: \"d703efe1-32d9-438b-af28-bea13efa058e\") " pod="openshift-marketplace/community-operators-lrrtv" Mar 10 08:14:31 crc kubenswrapper[4825]: I0310 08:14:31.452604 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d703efe1-32d9-438b-af28-bea13efa058e-catalog-content\") pod \"community-operators-lrrtv\" (UID: \"d703efe1-32d9-438b-af28-bea13efa058e\") " pod="openshift-marketplace/community-operators-lrrtv" Mar 10 08:14:31 crc kubenswrapper[4825]: I0310 08:14:31.452926 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmg8s\" (UniqueName: \"kubernetes.io/projected/d703efe1-32d9-438b-af28-bea13efa058e-kube-api-access-rmg8s\") pod \"community-operators-lrrtv\" (UID: \"d703efe1-32d9-438b-af28-bea13efa058e\") " pod="openshift-marketplace/community-operators-lrrtv" Mar 10 08:14:31 crc kubenswrapper[4825]: I0310 08:14:31.555121 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d703efe1-32d9-438b-af28-bea13efa058e-utilities\") pod \"community-operators-lrrtv\" 
(UID: \"d703efe1-32d9-438b-af28-bea13efa058e\") " pod="openshift-marketplace/community-operators-lrrtv" Mar 10 08:14:31 crc kubenswrapper[4825]: I0310 08:14:31.555239 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d703efe1-32d9-438b-af28-bea13efa058e-catalog-content\") pod \"community-operators-lrrtv\" (UID: \"d703efe1-32d9-438b-af28-bea13efa058e\") " pod="openshift-marketplace/community-operators-lrrtv" Mar 10 08:14:31 crc kubenswrapper[4825]: I0310 08:14:31.555313 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmg8s\" (UniqueName: \"kubernetes.io/projected/d703efe1-32d9-438b-af28-bea13efa058e-kube-api-access-rmg8s\") pod \"community-operators-lrrtv\" (UID: \"d703efe1-32d9-438b-af28-bea13efa058e\") " pod="openshift-marketplace/community-operators-lrrtv" Mar 10 08:14:31 crc kubenswrapper[4825]: I0310 08:14:31.555837 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d703efe1-32d9-438b-af28-bea13efa058e-catalog-content\") pod \"community-operators-lrrtv\" (UID: \"d703efe1-32d9-438b-af28-bea13efa058e\") " pod="openshift-marketplace/community-operators-lrrtv" Mar 10 08:14:31 crc kubenswrapper[4825]: I0310 08:14:31.555921 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d703efe1-32d9-438b-af28-bea13efa058e-utilities\") pod \"community-operators-lrrtv\" (UID: \"d703efe1-32d9-438b-af28-bea13efa058e\") " pod="openshift-marketplace/community-operators-lrrtv" Mar 10 08:14:31 crc kubenswrapper[4825]: I0310 08:14:31.609375 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmg8s\" (UniqueName: \"kubernetes.io/projected/d703efe1-32d9-438b-af28-bea13efa058e-kube-api-access-rmg8s\") pod \"community-operators-lrrtv\" (UID: 
\"d703efe1-32d9-438b-af28-bea13efa058e\") " pod="openshift-marketplace/community-operators-lrrtv" Mar 10 08:14:31 crc kubenswrapper[4825]: I0310 08:14:31.667784 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lrrtv" Mar 10 08:14:31 crc kubenswrapper[4825]: I0310 08:14:31.975239 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lrrtv"] Mar 10 08:14:32 crc kubenswrapper[4825]: I0310 08:14:32.505795 4825 generic.go:334] "Generic (PLEG): container finished" podID="d703efe1-32d9-438b-af28-bea13efa058e" containerID="ae0ad4e7287176ff8c9dee5816efe21a982641ab4427009b4a29de1f93b1f121" exitCode=0 Mar 10 08:14:32 crc kubenswrapper[4825]: I0310 08:14:32.505864 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrrtv" event={"ID":"d703efe1-32d9-438b-af28-bea13efa058e","Type":"ContainerDied","Data":"ae0ad4e7287176ff8c9dee5816efe21a982641ab4427009b4a29de1f93b1f121"} Mar 10 08:14:32 crc kubenswrapper[4825]: I0310 08:14:32.506294 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrrtv" event={"ID":"d703efe1-32d9-438b-af28-bea13efa058e","Type":"ContainerStarted","Data":"11c3609d5896bf05ec3d6ec086d6089615a13c6286ff573f853991710b4fc57e"} Mar 10 08:14:34 crc kubenswrapper[4825]: I0310 08:14:34.525193 4825 generic.go:334] "Generic (PLEG): container finished" podID="d703efe1-32d9-438b-af28-bea13efa058e" containerID="25a132751b38c1985b2ec9b002615656a007077bf3f5706079ac0a3fd6ab4147" exitCode=0 Mar 10 08:14:34 crc kubenswrapper[4825]: I0310 08:14:34.525286 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrrtv" event={"ID":"d703efe1-32d9-438b-af28-bea13efa058e","Type":"ContainerDied","Data":"25a132751b38c1985b2ec9b002615656a007077bf3f5706079ac0a3fd6ab4147"} Mar 10 08:14:41 crc kubenswrapper[4825]: I0310 
08:14:41.407674 4825 scope.go:117] "RemoveContainer" containerID="3a08e1311949cfde24b33789a2fdb7b4c86c8ab8c13f514d2c8de1edcbafb48a" Mar 10 08:14:42 crc kubenswrapper[4825]: I0310 08:14:42.642735 4825 scope.go:117] "RemoveContainer" containerID="d47f17f2d6711d9df57a1a123518e67c9c43ff6f9cec5e1afd62736eb7829d33" Mar 10 08:14:42 crc kubenswrapper[4825]: I0310 08:14:42.700815 4825 scope.go:117] "RemoveContainer" containerID="0e3f03f9e285a51c36cb1f99c0b6950b36203c1b284e40b961dde1d7f3dc31f8" Mar 10 08:14:43 crc kubenswrapper[4825]: I0310 08:14:43.606719 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xf7vk" event={"ID":"50272a2f-4ff6-4c66-b7de-80bf6ce236c3","Type":"ContainerStarted","Data":"010a911e5e9ec3ed61ba006b2584f13f6e8b789e247d40f22a396878a027d480"} Mar 10 08:14:43 crc kubenswrapper[4825]: I0310 08:14:43.608803 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrrtv" event={"ID":"d703efe1-32d9-438b-af28-bea13efa058e","Type":"ContainerStarted","Data":"7a683ed3a74c5e88dec46d48b98535d6d0eace44be3a60e9462d1afba279aade"} Mar 10 08:14:43 crc kubenswrapper[4825]: I0310 08:14:43.626427 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-xf7vk" podStartSLOduration=2.47172788 podStartE2EDuration="22.626403458s" podCreationTimestamp="2026-03-10 08:14:21 +0000 UTC" firstStartedPulling="2026-03-10 08:14:22.594284411 +0000 UTC m=+5415.624065026" lastFinishedPulling="2026-03-10 08:14:42.748959989 +0000 UTC m=+5435.778740604" observedRunningTime="2026-03-10 08:14:43.619864097 +0000 UTC m=+5436.649644762" watchObservedRunningTime="2026-03-10 08:14:43.626403458 +0000 UTC m=+5436.656184083" Mar 10 08:14:43 crc kubenswrapper[4825]: I0310 08:14:43.647621 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lrrtv" podStartSLOduration=2.439848875 podStartE2EDuration="12.647603485s" 
podCreationTimestamp="2026-03-10 08:14:31 +0000 UTC" firstStartedPulling="2026-03-10 08:14:32.508592453 +0000 UTC m=+5425.538373088" lastFinishedPulling="2026-03-10 08:14:42.716347083 +0000 UTC m=+5435.746127698" observedRunningTime="2026-03-10 08:14:43.642559813 +0000 UTC m=+5436.672340438" watchObservedRunningTime="2026-03-10 08:14:43.647603485 +0000 UTC m=+5436.677384100" Mar 10 08:14:45 crc kubenswrapper[4825]: I0310 08:14:45.237022 4825 scope.go:117] "RemoveContainer" containerID="cc40bbe7e5127cd8ca03c26dd788c9bfc628fd248484a1f5f73946709b5b4414" Mar 10 08:14:45 crc kubenswrapper[4825]: E0310 08:14:45.237866 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:14:46 crc kubenswrapper[4825]: I0310 08:14:46.639195 4825 generic.go:334] "Generic (PLEG): container finished" podID="50272a2f-4ff6-4c66-b7de-80bf6ce236c3" containerID="010a911e5e9ec3ed61ba006b2584f13f6e8b789e247d40f22a396878a027d480" exitCode=0 Mar 10 08:14:46 crc kubenswrapper[4825]: I0310 08:14:46.639495 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xf7vk" event={"ID":"50272a2f-4ff6-4c66-b7de-80bf6ce236c3","Type":"ContainerDied","Data":"010a911e5e9ec3ed61ba006b2584f13f6e8b789e247d40f22a396878a027d480"} Mar 10 08:14:48 crc kubenswrapper[4825]: I0310 08:14:48.018498 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-xf7vk" Mar 10 08:14:48 crc kubenswrapper[4825]: I0310 08:14:48.170014 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50272a2f-4ff6-4c66-b7de-80bf6ce236c3-scripts\") pod \"50272a2f-4ff6-4c66-b7de-80bf6ce236c3\" (UID: \"50272a2f-4ff6-4c66-b7de-80bf6ce236c3\") " Mar 10 08:14:48 crc kubenswrapper[4825]: I0310 08:14:48.170073 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50272a2f-4ff6-4c66-b7de-80bf6ce236c3-config-data\") pod \"50272a2f-4ff6-4c66-b7de-80bf6ce236c3\" (UID: \"50272a2f-4ff6-4c66-b7de-80bf6ce236c3\") " Mar 10 08:14:48 crc kubenswrapper[4825]: I0310 08:14:48.170123 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/50272a2f-4ff6-4c66-b7de-80bf6ce236c3-db-sync-config-data\") pod \"50272a2f-4ff6-4c66-b7de-80bf6ce236c3\" (UID: \"50272a2f-4ff6-4c66-b7de-80bf6ce236c3\") " Mar 10 08:14:48 crc kubenswrapper[4825]: I0310 08:14:48.170907 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50272a2f-4ff6-4c66-b7de-80bf6ce236c3-combined-ca-bundle\") pod \"50272a2f-4ff6-4c66-b7de-80bf6ce236c3\" (UID: \"50272a2f-4ff6-4c66-b7de-80bf6ce236c3\") " Mar 10 08:14:48 crc kubenswrapper[4825]: I0310 08:14:48.170969 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/50272a2f-4ff6-4c66-b7de-80bf6ce236c3-etc-machine-id\") pod \"50272a2f-4ff6-4c66-b7de-80bf6ce236c3\" (UID: \"50272a2f-4ff6-4c66-b7de-80bf6ce236c3\") " Mar 10 08:14:48 crc kubenswrapper[4825]: I0310 08:14:48.171072 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22k5q\" 
(UniqueName: \"kubernetes.io/projected/50272a2f-4ff6-4c66-b7de-80bf6ce236c3-kube-api-access-22k5q\") pod \"50272a2f-4ff6-4c66-b7de-80bf6ce236c3\" (UID: \"50272a2f-4ff6-4c66-b7de-80bf6ce236c3\") " Mar 10 08:14:48 crc kubenswrapper[4825]: I0310 08:14:48.171158 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/50272a2f-4ff6-4c66-b7de-80bf6ce236c3-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "50272a2f-4ff6-4c66-b7de-80bf6ce236c3" (UID: "50272a2f-4ff6-4c66-b7de-80bf6ce236c3"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 08:14:48 crc kubenswrapper[4825]: I0310 08:14:48.172121 4825 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/50272a2f-4ff6-4c66-b7de-80bf6ce236c3-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 10 08:14:48 crc kubenswrapper[4825]: I0310 08:14:48.175663 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50272a2f-4ff6-4c66-b7de-80bf6ce236c3-kube-api-access-22k5q" (OuterVolumeSpecName: "kube-api-access-22k5q") pod "50272a2f-4ff6-4c66-b7de-80bf6ce236c3" (UID: "50272a2f-4ff6-4c66-b7de-80bf6ce236c3"). InnerVolumeSpecName "kube-api-access-22k5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:14:48 crc kubenswrapper[4825]: I0310 08:14:48.175922 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50272a2f-4ff6-4c66-b7de-80bf6ce236c3-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "50272a2f-4ff6-4c66-b7de-80bf6ce236c3" (UID: "50272a2f-4ff6-4c66-b7de-80bf6ce236c3"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:14:48 crc kubenswrapper[4825]: I0310 08:14:48.177301 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50272a2f-4ff6-4c66-b7de-80bf6ce236c3-scripts" (OuterVolumeSpecName: "scripts") pod "50272a2f-4ff6-4c66-b7de-80bf6ce236c3" (UID: "50272a2f-4ff6-4c66-b7de-80bf6ce236c3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:14:48 crc kubenswrapper[4825]: I0310 08:14:48.217296 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50272a2f-4ff6-4c66-b7de-80bf6ce236c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "50272a2f-4ff6-4c66-b7de-80bf6ce236c3" (UID: "50272a2f-4ff6-4c66-b7de-80bf6ce236c3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:14:48 crc kubenswrapper[4825]: I0310 08:14:48.224269 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50272a2f-4ff6-4c66-b7de-80bf6ce236c3-config-data" (OuterVolumeSpecName: "config-data") pod "50272a2f-4ff6-4c66-b7de-80bf6ce236c3" (UID: "50272a2f-4ff6-4c66-b7de-80bf6ce236c3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:14:48 crc kubenswrapper[4825]: I0310 08:14:48.274815 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50272a2f-4ff6-4c66-b7de-80bf6ce236c3-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 08:14:48 crc kubenswrapper[4825]: I0310 08:14:48.274861 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50272a2f-4ff6-4c66-b7de-80bf6ce236c3-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 08:14:48 crc kubenswrapper[4825]: I0310 08:14:48.274882 4825 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/50272a2f-4ff6-4c66-b7de-80bf6ce236c3-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 08:14:48 crc kubenswrapper[4825]: I0310 08:14:48.274899 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50272a2f-4ff6-4c66-b7de-80bf6ce236c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:14:48 crc kubenswrapper[4825]: I0310 08:14:48.274917 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22k5q\" (UniqueName: \"kubernetes.io/projected/50272a2f-4ff6-4c66-b7de-80bf6ce236c3-kube-api-access-22k5q\") on node \"crc\" DevicePath \"\"" Mar 10 08:14:48 crc kubenswrapper[4825]: I0310 08:14:48.666227 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xf7vk" event={"ID":"50272a2f-4ff6-4c66-b7de-80bf6ce236c3","Type":"ContainerDied","Data":"d11f62618636f0837efa7aa7166f35c750c15ad1cbc9aca8c3038faf9d1ea948"} Mar 10 08:14:48 crc kubenswrapper[4825]: I0310 08:14:48.666759 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d11f62618636f0837efa7aa7166f35c750c15ad1cbc9aca8c3038faf9d1ea948" Mar 10 08:14:48 crc kubenswrapper[4825]: I0310 08:14:48.666347 4825 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-xf7vk" Mar 10 08:14:48 crc kubenswrapper[4825]: I0310 08:14:48.987710 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74d56d465f-d2t8h"] Mar 10 08:14:48 crc kubenswrapper[4825]: E0310 08:14:48.988055 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50272a2f-4ff6-4c66-b7de-80bf6ce236c3" containerName="cinder-db-sync" Mar 10 08:14:48 crc kubenswrapper[4825]: I0310 08:14:48.988066 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="50272a2f-4ff6-4c66-b7de-80bf6ce236c3" containerName="cinder-db-sync" Mar 10 08:14:48 crc kubenswrapper[4825]: I0310 08:14:48.988234 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="50272a2f-4ff6-4c66-b7de-80bf6ce236c3" containerName="cinder-db-sync" Mar 10 08:14:48 crc kubenswrapper[4825]: I0310 08:14:48.989075 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74d56d465f-d2t8h" Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.005810 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74d56d465f-d2t8h"] Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.092232 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56ad7438-3747-4fff-a88f-34f87df435d6-dns-svc\") pod \"dnsmasq-dns-74d56d465f-d2t8h\" (UID: \"56ad7438-3747-4fff-a88f-34f87df435d6\") " pod="openstack/dnsmasq-dns-74d56d465f-d2t8h" Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.092275 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56ad7438-3747-4fff-a88f-34f87df435d6-ovsdbserver-sb\") pod \"dnsmasq-dns-74d56d465f-d2t8h\" (UID: \"56ad7438-3747-4fff-a88f-34f87df435d6\") " 
pod="openstack/dnsmasq-dns-74d56d465f-d2t8h" Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.092302 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56ad7438-3747-4fff-a88f-34f87df435d6-ovsdbserver-nb\") pod \"dnsmasq-dns-74d56d465f-d2t8h\" (UID: \"56ad7438-3747-4fff-a88f-34f87df435d6\") " pod="openstack/dnsmasq-dns-74d56d465f-d2t8h" Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.092326 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56ad7438-3747-4fff-a88f-34f87df435d6-config\") pod \"dnsmasq-dns-74d56d465f-d2t8h\" (UID: \"56ad7438-3747-4fff-a88f-34f87df435d6\") " pod="openstack/dnsmasq-dns-74d56d465f-d2t8h" Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.092346 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdvrd\" (UniqueName: \"kubernetes.io/projected/56ad7438-3747-4fff-a88f-34f87df435d6-kube-api-access-vdvrd\") pod \"dnsmasq-dns-74d56d465f-d2t8h\" (UID: \"56ad7438-3747-4fff-a88f-34f87df435d6\") " pod="openstack/dnsmasq-dns-74d56d465f-d2t8h" Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.194105 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56ad7438-3747-4fff-a88f-34f87df435d6-dns-svc\") pod \"dnsmasq-dns-74d56d465f-d2t8h\" (UID: \"56ad7438-3747-4fff-a88f-34f87df435d6\") " pod="openstack/dnsmasq-dns-74d56d465f-d2t8h" Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.194169 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56ad7438-3747-4fff-a88f-34f87df435d6-ovsdbserver-sb\") pod \"dnsmasq-dns-74d56d465f-d2t8h\" (UID: \"56ad7438-3747-4fff-a88f-34f87df435d6\") " 
pod="openstack/dnsmasq-dns-74d56d465f-d2t8h" Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.194193 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56ad7438-3747-4fff-a88f-34f87df435d6-ovsdbserver-nb\") pod \"dnsmasq-dns-74d56d465f-d2t8h\" (UID: \"56ad7438-3747-4fff-a88f-34f87df435d6\") " pod="openstack/dnsmasq-dns-74d56d465f-d2t8h" Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.194219 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56ad7438-3747-4fff-a88f-34f87df435d6-config\") pod \"dnsmasq-dns-74d56d465f-d2t8h\" (UID: \"56ad7438-3747-4fff-a88f-34f87df435d6\") " pod="openstack/dnsmasq-dns-74d56d465f-d2t8h" Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.194236 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdvrd\" (UniqueName: \"kubernetes.io/projected/56ad7438-3747-4fff-a88f-34f87df435d6-kube-api-access-vdvrd\") pod \"dnsmasq-dns-74d56d465f-d2t8h\" (UID: \"56ad7438-3747-4fff-a88f-34f87df435d6\") " pod="openstack/dnsmasq-dns-74d56d465f-d2t8h" Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.195250 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56ad7438-3747-4fff-a88f-34f87df435d6-ovsdbserver-sb\") pod \"dnsmasq-dns-74d56d465f-d2t8h\" (UID: \"56ad7438-3747-4fff-a88f-34f87df435d6\") " pod="openstack/dnsmasq-dns-74d56d465f-d2t8h" Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.195304 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56ad7438-3747-4fff-a88f-34f87df435d6-config\") pod \"dnsmasq-dns-74d56d465f-d2t8h\" (UID: \"56ad7438-3747-4fff-a88f-34f87df435d6\") " pod="openstack/dnsmasq-dns-74d56d465f-d2t8h" Mar 10 08:14:49 crc kubenswrapper[4825]: 
I0310 08:14:49.195326 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56ad7438-3747-4fff-a88f-34f87df435d6-dns-svc\") pod \"dnsmasq-dns-74d56d465f-d2t8h\" (UID: \"56ad7438-3747-4fff-a88f-34f87df435d6\") " pod="openstack/dnsmasq-dns-74d56d465f-d2t8h" Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.195861 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56ad7438-3747-4fff-a88f-34f87df435d6-ovsdbserver-nb\") pod \"dnsmasq-dns-74d56d465f-d2t8h\" (UID: \"56ad7438-3747-4fff-a88f-34f87df435d6\") " pod="openstack/dnsmasq-dns-74d56d465f-d2t8h" Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.212280 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.213922 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.215675 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.216209 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.217306 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.221742 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-hgw5n" Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.236200 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdvrd\" (UniqueName: \"kubernetes.io/projected/56ad7438-3747-4fff-a88f-34f87df435d6-kube-api-access-vdvrd\") pod \"dnsmasq-dns-74d56d465f-d2t8h\" (UID: 
\"56ad7438-3747-4fff-a88f-34f87df435d6\") " pod="openstack/dnsmasq-dns-74d56d465f-d2t8h" Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.274722 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.296104 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brrgq\" (UniqueName: \"kubernetes.io/projected/cf868273-8f1d-462f-9101-fdc96477fd2c-kube-api-access-brrgq\") pod \"cinder-api-0\" (UID: \"cf868273-8f1d-462f-9101-fdc96477fd2c\") " pod="openstack/cinder-api-0" Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.296186 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf868273-8f1d-462f-9101-fdc96477fd2c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cf868273-8f1d-462f-9101-fdc96477fd2c\") " pod="openstack/cinder-api-0" Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.296230 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf868273-8f1d-462f-9101-fdc96477fd2c-config-data-custom\") pod \"cinder-api-0\" (UID: \"cf868273-8f1d-462f-9101-fdc96477fd2c\") " pod="openstack/cinder-api-0" Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.296265 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf868273-8f1d-462f-9101-fdc96477fd2c-scripts\") pod \"cinder-api-0\" (UID: \"cf868273-8f1d-462f-9101-fdc96477fd2c\") " pod="openstack/cinder-api-0" Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.296289 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/cf868273-8f1d-462f-9101-fdc96477fd2c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"cf868273-8f1d-462f-9101-fdc96477fd2c\") " pod="openstack/cinder-api-0" Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.296412 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf868273-8f1d-462f-9101-fdc96477fd2c-logs\") pod \"cinder-api-0\" (UID: \"cf868273-8f1d-462f-9101-fdc96477fd2c\") " pod="openstack/cinder-api-0" Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.296443 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf868273-8f1d-462f-9101-fdc96477fd2c-config-data\") pod \"cinder-api-0\" (UID: \"cf868273-8f1d-462f-9101-fdc96477fd2c\") " pod="openstack/cinder-api-0" Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.315823 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74d56d465f-d2t8h" Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.401840 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf868273-8f1d-462f-9101-fdc96477fd2c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cf868273-8f1d-462f-9101-fdc96477fd2c\") " pod="openstack/cinder-api-0" Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.406667 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf868273-8f1d-462f-9101-fdc96477fd2c-config-data-custom\") pod \"cinder-api-0\" (UID: \"cf868273-8f1d-462f-9101-fdc96477fd2c\") " pod="openstack/cinder-api-0" Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.406789 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf868273-8f1d-462f-9101-fdc96477fd2c-scripts\") pod \"cinder-api-0\" (UID: \"cf868273-8f1d-462f-9101-fdc96477fd2c\") " pod="openstack/cinder-api-0" Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.406845 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cf868273-8f1d-462f-9101-fdc96477fd2c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"cf868273-8f1d-462f-9101-fdc96477fd2c\") " pod="openstack/cinder-api-0" Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.407014 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf868273-8f1d-462f-9101-fdc96477fd2c-logs\") pod \"cinder-api-0\" (UID: \"cf868273-8f1d-462f-9101-fdc96477fd2c\") " pod="openstack/cinder-api-0" Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.407056 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/cf868273-8f1d-462f-9101-fdc96477fd2c-config-data\") pod \"cinder-api-0\" (UID: \"cf868273-8f1d-462f-9101-fdc96477fd2c\") " pod="openstack/cinder-api-0" Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.407149 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brrgq\" (UniqueName: \"kubernetes.io/projected/cf868273-8f1d-462f-9101-fdc96477fd2c-kube-api-access-brrgq\") pod \"cinder-api-0\" (UID: \"cf868273-8f1d-462f-9101-fdc96477fd2c\") " pod="openstack/cinder-api-0" Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.407358 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cf868273-8f1d-462f-9101-fdc96477fd2c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"cf868273-8f1d-462f-9101-fdc96477fd2c\") " pod="openstack/cinder-api-0" Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.407623 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf868273-8f1d-462f-9101-fdc96477fd2c-logs\") pod \"cinder-api-0\" (UID: \"cf868273-8f1d-462f-9101-fdc96477fd2c\") " pod="openstack/cinder-api-0" Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.410703 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf868273-8f1d-462f-9101-fdc96477fd2c-scripts\") pod \"cinder-api-0\" (UID: \"cf868273-8f1d-462f-9101-fdc96477fd2c\") " pod="openstack/cinder-api-0" Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.411205 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf868273-8f1d-462f-9101-fdc96477fd2c-config-data-custom\") pod \"cinder-api-0\" (UID: \"cf868273-8f1d-462f-9101-fdc96477fd2c\") " pod="openstack/cinder-api-0" Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.411276 4825 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf868273-8f1d-462f-9101-fdc96477fd2c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cf868273-8f1d-462f-9101-fdc96477fd2c\") " pod="openstack/cinder-api-0" Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.413556 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf868273-8f1d-462f-9101-fdc96477fd2c-config-data\") pod \"cinder-api-0\" (UID: \"cf868273-8f1d-462f-9101-fdc96477fd2c\") " pod="openstack/cinder-api-0" Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.437666 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brrgq\" (UniqueName: \"kubernetes.io/projected/cf868273-8f1d-462f-9101-fdc96477fd2c-kube-api-access-brrgq\") pod \"cinder-api-0\" (UID: \"cf868273-8f1d-462f-9101-fdc96477fd2c\") " pod="openstack/cinder-api-0" Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.575021 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.776077 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74d56d465f-d2t8h"] Mar 10 08:14:49 crc kubenswrapper[4825]: W0310 08:14:49.778569 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56ad7438_3747_4fff_a88f_34f87df435d6.slice/crio-75a3ad6a73c1ba45bf481e8969e616536cfae9c6896214de60a762c4508984a6 WatchSource:0}: Error finding container 75a3ad6a73c1ba45bf481e8969e616536cfae9c6896214de60a762c4508984a6: Status 404 returned error can't find the container with id 75a3ad6a73c1ba45bf481e8969e616536cfae9c6896214de60a762c4508984a6 Mar 10 08:14:49 crc kubenswrapper[4825]: I0310 08:14:49.834175 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 10 08:14:49 crc kubenswrapper[4825]: W0310 08:14:49.850550 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf868273_8f1d_462f_9101_fdc96477fd2c.slice/crio-a36feb42ad0d1815aaf94cf8f61f0c4ef5cc1d7796014d91b2b4e237734f37c0 WatchSource:0}: Error finding container a36feb42ad0d1815aaf94cf8f61f0c4ef5cc1d7796014d91b2b4e237734f37c0: Status 404 returned error can't find the container with id a36feb42ad0d1815aaf94cf8f61f0c4ef5cc1d7796014d91b2b4e237734f37c0 Mar 10 08:14:50 crc kubenswrapper[4825]: I0310 08:14:50.685103 4825 generic.go:334] "Generic (PLEG): container finished" podID="56ad7438-3747-4fff-a88f-34f87df435d6" containerID="31afbfa85d1b8c8b7ab676c6ab3b0efb6487c66e1418dd92d1207a9c8c547043" exitCode=0 Mar 10 08:14:50 crc kubenswrapper[4825]: I0310 08:14:50.685175 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74d56d465f-d2t8h" 
event={"ID":"56ad7438-3747-4fff-a88f-34f87df435d6","Type":"ContainerDied","Data":"31afbfa85d1b8c8b7ab676c6ab3b0efb6487c66e1418dd92d1207a9c8c547043"} Mar 10 08:14:50 crc kubenswrapper[4825]: I0310 08:14:50.685624 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74d56d465f-d2t8h" event={"ID":"56ad7438-3747-4fff-a88f-34f87df435d6","Type":"ContainerStarted","Data":"75a3ad6a73c1ba45bf481e8969e616536cfae9c6896214de60a762c4508984a6"} Mar 10 08:14:50 crc kubenswrapper[4825]: I0310 08:14:50.717370 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cf868273-8f1d-462f-9101-fdc96477fd2c","Type":"ContainerStarted","Data":"1c124094a26fd6ec16e9ce524e78440d91f07c595858d8c22a30e98b6c160f4f"} Mar 10 08:14:50 crc kubenswrapper[4825]: I0310 08:14:50.717421 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cf868273-8f1d-462f-9101-fdc96477fd2c","Type":"ContainerStarted","Data":"a36feb42ad0d1815aaf94cf8f61f0c4ef5cc1d7796014d91b2b4e237734f37c0"} Mar 10 08:14:51 crc kubenswrapper[4825]: I0310 08:14:51.669313 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lrrtv" Mar 10 08:14:51 crc kubenswrapper[4825]: I0310 08:14:51.669719 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lrrtv" Mar 10 08:14:51 crc kubenswrapper[4825]: I0310 08:14:51.712045 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 10 08:14:51 crc kubenswrapper[4825]: I0310 08:14:51.727924 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lrrtv" Mar 10 08:14:51 crc kubenswrapper[4825]: I0310 08:14:51.728040 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"cf868273-8f1d-462f-9101-fdc96477fd2c","Type":"ContainerStarted","Data":"eb8dab71535a06a7bf672273e3c2f0d42daefa70dfa52f2ca896131191d24425"} Mar 10 08:14:51 crc kubenswrapper[4825]: I0310 08:14:51.728320 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 10 08:14:51 crc kubenswrapper[4825]: I0310 08:14:51.730108 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74d56d465f-d2t8h" event={"ID":"56ad7438-3747-4fff-a88f-34f87df435d6","Type":"ContainerStarted","Data":"93e50380a27201dc777ba257a73ff60ccbd7ddb727bf3ff5f02acec9cac182e3"} Mar 10 08:14:51 crc kubenswrapper[4825]: I0310 08:14:51.771004 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.770965705 podStartE2EDuration="2.770965705s" podCreationTimestamp="2026-03-10 08:14:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:14:51.763387375 +0000 UTC m=+5444.793167990" watchObservedRunningTime="2026-03-10 08:14:51.770965705 +0000 UTC m=+5444.800746330" Mar 10 08:14:51 crc kubenswrapper[4825]: I0310 08:14:51.786936 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lrrtv" Mar 10 08:14:51 crc kubenswrapper[4825]: I0310 08:14:51.794558 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74d56d465f-d2t8h" podStartSLOduration=3.794540994 podStartE2EDuration="3.794540994s" podCreationTimestamp="2026-03-10 08:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:14:51.786224635 +0000 UTC m=+5444.816005250" watchObservedRunningTime="2026-03-10 08:14:51.794540994 +0000 UTC m=+5444.824321609" Mar 10 08:14:51 crc kubenswrapper[4825]: I0310 
08:14:51.967717 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lrrtv"] Mar 10 08:14:52 crc kubenswrapper[4825]: I0310 08:14:52.737973 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="cf868273-8f1d-462f-9101-fdc96477fd2c" containerName="cinder-api-log" containerID="cri-o://1c124094a26fd6ec16e9ce524e78440d91f07c595858d8c22a30e98b6c160f4f" gracePeriod=30 Mar 10 08:14:52 crc kubenswrapper[4825]: I0310 08:14:52.738274 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="cf868273-8f1d-462f-9101-fdc96477fd2c" containerName="cinder-api" containerID="cri-o://eb8dab71535a06a7bf672273e3c2f0d42daefa70dfa52f2ca896131191d24425" gracePeriod=30 Mar 10 08:14:52 crc kubenswrapper[4825]: I0310 08:14:52.738801 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74d56d465f-d2t8h" Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.319504 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.479853 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf868273-8f1d-462f-9101-fdc96477fd2c-combined-ca-bundle\") pod \"cf868273-8f1d-462f-9101-fdc96477fd2c\" (UID: \"cf868273-8f1d-462f-9101-fdc96477fd2c\") " Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.480089 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf868273-8f1d-462f-9101-fdc96477fd2c-config-data\") pod \"cf868273-8f1d-462f-9101-fdc96477fd2c\" (UID: \"cf868273-8f1d-462f-9101-fdc96477fd2c\") " Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.480242 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cf868273-8f1d-462f-9101-fdc96477fd2c-etc-machine-id\") pod \"cf868273-8f1d-462f-9101-fdc96477fd2c\" (UID: \"cf868273-8f1d-462f-9101-fdc96477fd2c\") " Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.480466 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brrgq\" (UniqueName: \"kubernetes.io/projected/cf868273-8f1d-462f-9101-fdc96477fd2c-kube-api-access-brrgq\") pod \"cf868273-8f1d-462f-9101-fdc96477fd2c\" (UID: \"cf868273-8f1d-462f-9101-fdc96477fd2c\") " Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.480551 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf868273-8f1d-462f-9101-fdc96477fd2c-logs\") pod \"cf868273-8f1d-462f-9101-fdc96477fd2c\" (UID: \"cf868273-8f1d-462f-9101-fdc96477fd2c\") " Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.480711 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/cf868273-8f1d-462f-9101-fdc96477fd2c-scripts\") pod \"cf868273-8f1d-462f-9101-fdc96477fd2c\" (UID: \"cf868273-8f1d-462f-9101-fdc96477fd2c\") " Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.480786 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf868273-8f1d-462f-9101-fdc96477fd2c-config-data-custom\") pod \"cf868273-8f1d-462f-9101-fdc96477fd2c\" (UID: \"cf868273-8f1d-462f-9101-fdc96477fd2c\") " Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.480695 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cf868273-8f1d-462f-9101-fdc96477fd2c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "cf868273-8f1d-462f-9101-fdc96477fd2c" (UID: "cf868273-8f1d-462f-9101-fdc96477fd2c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.481089 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf868273-8f1d-462f-9101-fdc96477fd2c-logs" (OuterVolumeSpecName: "logs") pod "cf868273-8f1d-462f-9101-fdc96477fd2c" (UID: "cf868273-8f1d-462f-9101-fdc96477fd2c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.481425 4825 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cf868273-8f1d-462f-9101-fdc96477fd2c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.481516 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf868273-8f1d-462f-9101-fdc96477fd2c-logs\") on node \"crc\" DevicePath \"\"" Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.486290 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf868273-8f1d-462f-9101-fdc96477fd2c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "cf868273-8f1d-462f-9101-fdc96477fd2c" (UID: "cf868273-8f1d-462f-9101-fdc96477fd2c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.486550 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf868273-8f1d-462f-9101-fdc96477fd2c-kube-api-access-brrgq" (OuterVolumeSpecName: "kube-api-access-brrgq") pod "cf868273-8f1d-462f-9101-fdc96477fd2c" (UID: "cf868273-8f1d-462f-9101-fdc96477fd2c"). InnerVolumeSpecName "kube-api-access-brrgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.488253 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf868273-8f1d-462f-9101-fdc96477fd2c-scripts" (OuterVolumeSpecName: "scripts") pod "cf868273-8f1d-462f-9101-fdc96477fd2c" (UID: "cf868273-8f1d-462f-9101-fdc96477fd2c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.510376 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf868273-8f1d-462f-9101-fdc96477fd2c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf868273-8f1d-462f-9101-fdc96477fd2c" (UID: "cf868273-8f1d-462f-9101-fdc96477fd2c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.542843 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf868273-8f1d-462f-9101-fdc96477fd2c-config-data" (OuterVolumeSpecName: "config-data") pod "cf868273-8f1d-462f-9101-fdc96477fd2c" (UID: "cf868273-8f1d-462f-9101-fdc96477fd2c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.582741 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brrgq\" (UniqueName: \"kubernetes.io/projected/cf868273-8f1d-462f-9101-fdc96477fd2c-kube-api-access-brrgq\") on node \"crc\" DevicePath \"\"" Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.582774 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf868273-8f1d-462f-9101-fdc96477fd2c-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.582784 4825 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf868273-8f1d-462f-9101-fdc96477fd2c-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.582794 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf868273-8f1d-462f-9101-fdc96477fd2c-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.582803 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf868273-8f1d-462f-9101-fdc96477fd2c-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.746566 4825 generic.go:334] "Generic (PLEG): container finished" podID="cf868273-8f1d-462f-9101-fdc96477fd2c" containerID="eb8dab71535a06a7bf672273e3c2f0d42daefa70dfa52f2ca896131191d24425" exitCode=0 Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.746606 4825 generic.go:334] "Generic (PLEG): container finished" podID="cf868273-8f1d-462f-9101-fdc96477fd2c" containerID="1c124094a26fd6ec16e9ce524e78440d91f07c595858d8c22a30e98b6c160f4f" exitCode=143 Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.746661 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cf868273-8f1d-462f-9101-fdc96477fd2c","Type":"ContainerDied","Data":"eb8dab71535a06a7bf672273e3c2f0d42daefa70dfa52f2ca896131191d24425"} Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.746734 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cf868273-8f1d-462f-9101-fdc96477fd2c","Type":"ContainerDied","Data":"1c124094a26fd6ec16e9ce524e78440d91f07c595858d8c22a30e98b6c160f4f"} Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.746746 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cf868273-8f1d-462f-9101-fdc96477fd2c","Type":"ContainerDied","Data":"a36feb42ad0d1815aaf94cf8f61f0c4ef5cc1d7796014d91b2b4e237734f37c0"} Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.746766 4825 scope.go:117] "RemoveContainer" containerID="eb8dab71535a06a7bf672273e3c2f0d42daefa70dfa52f2ca896131191d24425" Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.746915 4825 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/community-operators-lrrtv" podUID="d703efe1-32d9-438b-af28-bea13efa058e" containerName="registry-server" containerID="cri-o://7a683ed3a74c5e88dec46d48b98535d6d0eace44be3a60e9462d1afba279aade" gracePeriod=2 Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.748277 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.770978 4825 scope.go:117] "RemoveContainer" containerID="1c124094a26fd6ec16e9ce524e78440d91f07c595858d8c22a30e98b6c160f4f" Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.792390 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.803640 4825 scope.go:117] "RemoveContainer" containerID="eb8dab71535a06a7bf672273e3c2f0d42daefa70dfa52f2ca896131191d24425" Mar 10 08:14:53 crc kubenswrapper[4825]: E0310 08:14:53.804508 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb8dab71535a06a7bf672273e3c2f0d42daefa70dfa52f2ca896131191d24425\": container with ID starting with eb8dab71535a06a7bf672273e3c2f0d42daefa70dfa52f2ca896131191d24425 not found: ID does not exist" containerID="eb8dab71535a06a7bf672273e3c2f0d42daefa70dfa52f2ca896131191d24425" Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.804597 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb8dab71535a06a7bf672273e3c2f0d42daefa70dfa52f2ca896131191d24425"} err="failed to get container status \"eb8dab71535a06a7bf672273e3c2f0d42daefa70dfa52f2ca896131191d24425\": rpc error: code = NotFound desc = could not find container \"eb8dab71535a06a7bf672273e3c2f0d42daefa70dfa52f2ca896131191d24425\": container with ID starting with eb8dab71535a06a7bf672273e3c2f0d42daefa70dfa52f2ca896131191d24425 not found: ID does not exist" Mar 10 08:14:53 crc 
kubenswrapper[4825]: I0310 08:14:53.804657 4825 scope.go:117] "RemoveContainer" containerID="1c124094a26fd6ec16e9ce524e78440d91f07c595858d8c22a30e98b6c160f4f"
Mar 10 08:14:53 crc kubenswrapper[4825]: E0310 08:14:53.805611 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c124094a26fd6ec16e9ce524e78440d91f07c595858d8c22a30e98b6c160f4f\": container with ID starting with 1c124094a26fd6ec16e9ce524e78440d91f07c595858d8c22a30e98b6c160f4f not found: ID does not exist" containerID="1c124094a26fd6ec16e9ce524e78440d91f07c595858d8c22a30e98b6c160f4f"
Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.805679 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c124094a26fd6ec16e9ce524e78440d91f07c595858d8c22a30e98b6c160f4f"} err="failed to get container status \"1c124094a26fd6ec16e9ce524e78440d91f07c595858d8c22a30e98b6c160f4f\": rpc error: code = NotFound desc = could not find container \"1c124094a26fd6ec16e9ce524e78440d91f07c595858d8c22a30e98b6c160f4f\": container with ID starting with 1c124094a26fd6ec16e9ce524e78440d91f07c595858d8c22a30e98b6c160f4f not found: ID does not exist"
Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.805723 4825 scope.go:117] "RemoveContainer" containerID="eb8dab71535a06a7bf672273e3c2f0d42daefa70dfa52f2ca896131191d24425"
Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.806228 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb8dab71535a06a7bf672273e3c2f0d42daefa70dfa52f2ca896131191d24425"} err="failed to get container status \"eb8dab71535a06a7bf672273e3c2f0d42daefa70dfa52f2ca896131191d24425\": rpc error: code = NotFound desc = could not find container \"eb8dab71535a06a7bf672273e3c2f0d42daefa70dfa52f2ca896131191d24425\": container with ID starting with eb8dab71535a06a7bf672273e3c2f0d42daefa70dfa52f2ca896131191d24425 not found: ID does not exist"
Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.806276 4825 scope.go:117] "RemoveContainer" containerID="1c124094a26fd6ec16e9ce524e78440d91f07c595858d8c22a30e98b6c160f4f"
Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.806623 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c124094a26fd6ec16e9ce524e78440d91f07c595858d8c22a30e98b6c160f4f"} err="failed to get container status \"1c124094a26fd6ec16e9ce524e78440d91f07c595858d8c22a30e98b6c160f4f\": rpc error: code = NotFound desc = could not find container \"1c124094a26fd6ec16e9ce524e78440d91f07c595858d8c22a30e98b6c160f4f\": container with ID starting with 1c124094a26fd6ec16e9ce524e78440d91f07c595858d8c22a30e98b6c160f4f not found: ID does not exist"
Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.806790 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.830428 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Mar 10 08:14:53 crc kubenswrapper[4825]: E0310 08:14:53.830840 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf868273-8f1d-462f-9101-fdc96477fd2c" containerName="cinder-api-log"
Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.830868 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf868273-8f1d-462f-9101-fdc96477fd2c" containerName="cinder-api-log"
Mar 10 08:14:53 crc kubenswrapper[4825]: E0310 08:14:53.830894 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf868273-8f1d-462f-9101-fdc96477fd2c" containerName="cinder-api"
Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.830903 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf868273-8f1d-462f-9101-fdc96477fd2c" containerName="cinder-api"
Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.831122 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf868273-8f1d-462f-9101-fdc96477fd2c" containerName="cinder-api"
Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.831162 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf868273-8f1d-462f-9101-fdc96477fd2c" containerName="cinder-api-log"
Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.832206 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.832304 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.868125 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.868472 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.868644 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.869993 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.870438 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.870596 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-hgw5n"
Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.990300 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-scripts\") pod \"cinder-api-0\" (UID: \"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36\") " pod="openstack/cinder-api-0"
Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.990357 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-config-data-custom\") pod \"cinder-api-0\" (UID: \"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36\") " pod="openstack/cinder-api-0"
Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.990376 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-etc-machine-id\") pod \"cinder-api-0\" (UID: \"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36\") " pod="openstack/cinder-api-0"
Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.990399 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-config-data\") pod \"cinder-api-0\" (UID: \"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36\") " pod="openstack/cinder-api-0"
Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.990420 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36\") " pod="openstack/cinder-api-0"
Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.990451 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36\") " pod="openstack/cinder-api-0"
Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.990480 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-public-tls-certs\") pod \"cinder-api-0\" (UID: \"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36\") " pod="openstack/cinder-api-0"
Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.990499 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-logs\") pod \"cinder-api-0\" (UID: \"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36\") " pod="openstack/cinder-api-0"
Mar 10 08:14:53 crc kubenswrapper[4825]: I0310 08:14:53.990549 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jjs2\" (UniqueName: \"kubernetes.io/projected/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-kube-api-access-9jjs2\") pod \"cinder-api-0\" (UID: \"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36\") " pod="openstack/cinder-api-0"
Mar 10 08:14:54 crc kubenswrapper[4825]: I0310 08:14:54.092591 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-public-tls-certs\") pod \"cinder-api-0\" (UID: \"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36\") " pod="openstack/cinder-api-0"
Mar 10 08:14:54 crc kubenswrapper[4825]: I0310 08:14:54.092654 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-logs\") pod \"cinder-api-0\" (UID: \"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36\") " pod="openstack/cinder-api-0"
Mar 10 08:14:54 crc kubenswrapper[4825]: I0310 08:14:54.092731 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jjs2\" (UniqueName: \"kubernetes.io/projected/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-kube-api-access-9jjs2\") pod \"cinder-api-0\" (UID: \"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36\") " pod="openstack/cinder-api-0"
Mar 10 08:14:54 crc kubenswrapper[4825]: I0310 08:14:54.092802 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-scripts\") pod \"cinder-api-0\" (UID: \"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36\") " pod="openstack/cinder-api-0"
Mar 10 08:14:54 crc kubenswrapper[4825]: I0310 08:14:54.092827 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-config-data-custom\") pod \"cinder-api-0\" (UID: \"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36\") " pod="openstack/cinder-api-0"
Mar 10 08:14:54 crc kubenswrapper[4825]: I0310 08:14:54.092849 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-etc-machine-id\") pod \"cinder-api-0\" (UID: \"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36\") " pod="openstack/cinder-api-0"
Mar 10 08:14:54 crc kubenswrapper[4825]: I0310 08:14:54.092878 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-config-data\") pod \"cinder-api-0\" (UID: \"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36\") " pod="openstack/cinder-api-0"
Mar 10 08:14:54 crc kubenswrapper[4825]: I0310 08:14:54.092906 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36\") " pod="openstack/cinder-api-0"
Mar 10 08:14:54 crc kubenswrapper[4825]: I0310 08:14:54.092946 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36\") " pod="openstack/cinder-api-0"
Mar 10 08:14:54 crc kubenswrapper[4825]: I0310 08:14:54.093353 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-etc-machine-id\") pod \"cinder-api-0\" (UID: \"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36\") " pod="openstack/cinder-api-0"
Mar 10 08:14:54 crc kubenswrapper[4825]: I0310 08:14:54.093405 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-logs\") pod \"cinder-api-0\" (UID: \"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36\") " pod="openstack/cinder-api-0"
Mar 10 08:14:54 crc kubenswrapper[4825]: I0310 08:14:54.098336 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36\") " pod="openstack/cinder-api-0"
Mar 10 08:14:54 crc kubenswrapper[4825]: I0310 08:14:54.100747 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-config-data\") pod \"cinder-api-0\" (UID: \"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36\") " pod="openstack/cinder-api-0"
Mar 10 08:14:54 crc kubenswrapper[4825]: I0310 08:14:54.101038 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36\") " pod="openstack/cinder-api-0"
Mar 10 08:14:54 crc kubenswrapper[4825]: I0310 08:14:54.103337 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-scripts\") pod \"cinder-api-0\" (UID: \"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36\") " pod="openstack/cinder-api-0"
Mar 10 08:14:54 crc kubenswrapper[4825]: I0310 08:14:54.103755 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-config-data-custom\") pod \"cinder-api-0\" (UID: \"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36\") " pod="openstack/cinder-api-0"
Mar 10 08:14:54 crc kubenswrapper[4825]: I0310 08:14:54.105645 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-public-tls-certs\") pod \"cinder-api-0\" (UID: \"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36\") " pod="openstack/cinder-api-0"
Mar 10 08:14:54 crc kubenswrapper[4825]: I0310 08:14:54.109305 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jjs2\" (UniqueName: \"kubernetes.io/projected/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-kube-api-access-9jjs2\") pod \"cinder-api-0\" (UID: \"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36\") " pod="openstack/cinder-api-0"
Mar 10 08:14:54 crc kubenswrapper[4825]: I0310 08:14:54.207736 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lrrtv"
Mar 10 08:14:54 crc kubenswrapper[4825]: I0310 08:14:54.234498 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 10 08:14:54 crc kubenswrapper[4825]: I0310 08:14:54.296462 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d703efe1-32d9-438b-af28-bea13efa058e-utilities\") pod \"d703efe1-32d9-438b-af28-bea13efa058e\" (UID: \"d703efe1-32d9-438b-af28-bea13efa058e\") "
Mar 10 08:14:54 crc kubenswrapper[4825]: I0310 08:14:54.296561 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmg8s\" (UniqueName: \"kubernetes.io/projected/d703efe1-32d9-438b-af28-bea13efa058e-kube-api-access-rmg8s\") pod \"d703efe1-32d9-438b-af28-bea13efa058e\" (UID: \"d703efe1-32d9-438b-af28-bea13efa058e\") "
Mar 10 08:14:54 crc kubenswrapper[4825]: I0310 08:14:54.296599 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d703efe1-32d9-438b-af28-bea13efa058e-catalog-content\") pod \"d703efe1-32d9-438b-af28-bea13efa058e\" (UID: \"d703efe1-32d9-438b-af28-bea13efa058e\") "
Mar 10 08:14:54 crc kubenswrapper[4825]: I0310 08:14:54.298743 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d703efe1-32d9-438b-af28-bea13efa058e-utilities" (OuterVolumeSpecName: "utilities") pod "d703efe1-32d9-438b-af28-bea13efa058e" (UID: "d703efe1-32d9-438b-af28-bea13efa058e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 08:14:54 crc kubenswrapper[4825]: I0310 08:14:54.300971 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d703efe1-32d9-438b-af28-bea13efa058e-kube-api-access-rmg8s" (OuterVolumeSpecName: "kube-api-access-rmg8s") pod "d703efe1-32d9-438b-af28-bea13efa058e" (UID: "d703efe1-32d9-438b-af28-bea13efa058e"). InnerVolumeSpecName "kube-api-access-rmg8s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 08:14:54 crc kubenswrapper[4825]: I0310 08:14:54.353212 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d703efe1-32d9-438b-af28-bea13efa058e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d703efe1-32d9-438b-af28-bea13efa058e" (UID: "d703efe1-32d9-438b-af28-bea13efa058e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 08:14:54 crc kubenswrapper[4825]: I0310 08:14:54.399400 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d703efe1-32d9-438b-af28-bea13efa058e-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 08:14:54 crc kubenswrapper[4825]: I0310 08:14:54.399441 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmg8s\" (UniqueName: \"kubernetes.io/projected/d703efe1-32d9-438b-af28-bea13efa058e-kube-api-access-rmg8s\") on node \"crc\" DevicePath \"\""
Mar 10 08:14:54 crc kubenswrapper[4825]: I0310 08:14:54.399457 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d703efe1-32d9-438b-af28-bea13efa058e-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 08:14:54 crc kubenswrapper[4825]: I0310 08:14:54.668953 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 10 08:14:54 crc kubenswrapper[4825]: I0310 08:14:54.770469 4825 generic.go:334] "Generic (PLEG): container finished" podID="d703efe1-32d9-438b-af28-bea13efa058e" containerID="7a683ed3a74c5e88dec46d48b98535d6d0eace44be3a60e9462d1afba279aade" exitCode=0
Mar 10 08:14:54 crc kubenswrapper[4825]: I0310 08:14:54.770548 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrrtv" event={"ID":"d703efe1-32d9-438b-af28-bea13efa058e","Type":"ContainerDied","Data":"7a683ed3a74c5e88dec46d48b98535d6d0eace44be3a60e9462d1afba279aade"}
Mar 10 08:14:54 crc kubenswrapper[4825]: I0310 08:14:54.770573 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lrrtv"
Mar 10 08:14:54 crc kubenswrapper[4825]: I0310 08:14:54.770604 4825 scope.go:117] "RemoveContainer" containerID="7a683ed3a74c5e88dec46d48b98535d6d0eace44be3a60e9462d1afba279aade"
Mar 10 08:14:54 crc kubenswrapper[4825]: I0310 08:14:54.770591 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrrtv" event={"ID":"d703efe1-32d9-438b-af28-bea13efa058e","Type":"ContainerDied","Data":"11c3609d5896bf05ec3d6ec086d6089615a13c6286ff573f853991710b4fc57e"}
Mar 10 08:14:54 crc kubenswrapper[4825]: I0310 08:14:54.772341 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36","Type":"ContainerStarted","Data":"da87b1fa2845c81c241b53418ff82bab78b4dc3e7fd12452e75d5e7fe7fd9920"}
Mar 10 08:14:54 crc kubenswrapper[4825]: I0310 08:14:54.807243 4825 scope.go:117] "RemoveContainer" containerID="25a132751b38c1985b2ec9b002615656a007077bf3f5706079ac0a3fd6ab4147"
Mar 10 08:14:54 crc kubenswrapper[4825]: I0310 08:14:54.815793 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lrrtv"]
Mar 10 08:14:54 crc kubenswrapper[4825]: I0310 08:14:54.825994 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lrrtv"]
Mar 10 08:14:54 crc kubenswrapper[4825]: I0310 08:14:54.837226 4825 scope.go:117] "RemoveContainer" containerID="ae0ad4e7287176ff8c9dee5816efe21a982641ab4427009b4a29de1f93b1f121"
Mar 10 08:14:54 crc kubenswrapper[4825]: I0310 08:14:54.855719 4825 scope.go:117] "RemoveContainer" containerID="7a683ed3a74c5e88dec46d48b98535d6d0eace44be3a60e9462d1afba279aade"
Mar 10 08:14:54 crc kubenswrapper[4825]: E0310 08:14:54.862639 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a683ed3a74c5e88dec46d48b98535d6d0eace44be3a60e9462d1afba279aade\": container with ID starting with 7a683ed3a74c5e88dec46d48b98535d6d0eace44be3a60e9462d1afba279aade not found: ID does not exist" containerID="7a683ed3a74c5e88dec46d48b98535d6d0eace44be3a60e9462d1afba279aade"
Mar 10 08:14:54 crc kubenswrapper[4825]: I0310 08:14:54.862668 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a683ed3a74c5e88dec46d48b98535d6d0eace44be3a60e9462d1afba279aade"} err="failed to get container status \"7a683ed3a74c5e88dec46d48b98535d6d0eace44be3a60e9462d1afba279aade\": rpc error: code = NotFound desc = could not find container \"7a683ed3a74c5e88dec46d48b98535d6d0eace44be3a60e9462d1afba279aade\": container with ID starting with 7a683ed3a74c5e88dec46d48b98535d6d0eace44be3a60e9462d1afba279aade not found: ID does not exist"
Mar 10 08:14:54 crc kubenswrapper[4825]: I0310 08:14:54.862689 4825 scope.go:117] "RemoveContainer" containerID="25a132751b38c1985b2ec9b002615656a007077bf3f5706079ac0a3fd6ab4147"
Mar 10 08:14:54 crc kubenswrapper[4825]: E0310 08:14:54.862997 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25a132751b38c1985b2ec9b002615656a007077bf3f5706079ac0a3fd6ab4147\": container with ID starting with 25a132751b38c1985b2ec9b002615656a007077bf3f5706079ac0a3fd6ab4147 not found: ID does not exist" containerID="25a132751b38c1985b2ec9b002615656a007077bf3f5706079ac0a3fd6ab4147"
Mar 10 08:14:54 crc kubenswrapper[4825]: I0310 08:14:54.863030 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25a132751b38c1985b2ec9b002615656a007077bf3f5706079ac0a3fd6ab4147"} err="failed to get container status \"25a132751b38c1985b2ec9b002615656a007077bf3f5706079ac0a3fd6ab4147\": rpc error: code = NotFound desc = could not find container \"25a132751b38c1985b2ec9b002615656a007077bf3f5706079ac0a3fd6ab4147\": container with ID starting with 25a132751b38c1985b2ec9b002615656a007077bf3f5706079ac0a3fd6ab4147 not found: ID does not exist"
Mar 10 08:14:54 crc kubenswrapper[4825]: I0310 08:14:54.863050 4825 scope.go:117] "RemoveContainer" containerID="ae0ad4e7287176ff8c9dee5816efe21a982641ab4427009b4a29de1f93b1f121"
Mar 10 08:14:54 crc kubenswrapper[4825]: E0310 08:14:54.866549 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae0ad4e7287176ff8c9dee5816efe21a982641ab4427009b4a29de1f93b1f121\": container with ID starting with ae0ad4e7287176ff8c9dee5816efe21a982641ab4427009b4a29de1f93b1f121 not found: ID does not exist" containerID="ae0ad4e7287176ff8c9dee5816efe21a982641ab4427009b4a29de1f93b1f121"
Mar 10 08:14:54 crc kubenswrapper[4825]: I0310 08:14:54.866635 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae0ad4e7287176ff8c9dee5816efe21a982641ab4427009b4a29de1f93b1f121"} err="failed to get container status \"ae0ad4e7287176ff8c9dee5816efe21a982641ab4427009b4a29de1f93b1f121\": rpc error: code = NotFound desc = could not find container \"ae0ad4e7287176ff8c9dee5816efe21a982641ab4427009b4a29de1f93b1f121\": container with ID starting with ae0ad4e7287176ff8c9dee5816efe21a982641ab4427009b4a29de1f93b1f121 not found: ID does not exist"
Mar 10 08:14:55 crc kubenswrapper[4825]: I0310 08:14:55.245381 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf868273-8f1d-462f-9101-fdc96477fd2c" path="/var/lib/kubelet/pods/cf868273-8f1d-462f-9101-fdc96477fd2c/volumes"
Mar 10 08:14:55 crc kubenswrapper[4825]: I0310 08:14:55.246118 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d703efe1-32d9-438b-af28-bea13efa058e" path="/var/lib/kubelet/pods/d703efe1-32d9-438b-af28-bea13efa058e/volumes"
Mar 10 08:14:55 crc kubenswrapper[4825]: I0310 08:14:55.792675 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36","Type":"ContainerStarted","Data":"5ecd9977232bc5148d24eb393a5e62db939f45a2db927f5ebd5f554b99d75cc5"}
Mar 10 08:14:56 crc kubenswrapper[4825]: I0310 08:14:56.236473 4825 scope.go:117] "RemoveContainer" containerID="cc40bbe7e5127cd8ca03c26dd788c9bfc628fd248484a1f5f73946709b5b4414"
Mar 10 08:14:56 crc kubenswrapper[4825]: E0310 08:14:56.237041 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3"
Mar 10 08:14:56 crc kubenswrapper[4825]: I0310 08:14:56.805963 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36","Type":"ContainerStarted","Data":"f4dcebc193b4ec012c3b6f0d8afa587e6d461a9bca3448c05ec5d1539b09ac73"}
Mar 10 08:14:56 crc kubenswrapper[4825]: I0310 08:14:56.806124 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Mar 10 08:14:56 crc kubenswrapper[4825]: I0310 08:14:56.849889 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.849859978 podStartE2EDuration="3.849859978s" podCreationTimestamp="2026-03-10 08:14:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:14:56.834394502 +0000 UTC m=+5449.864175117" watchObservedRunningTime="2026-03-10 08:14:56.849859978 +0000 UTC m=+5449.879640633"
Mar 10 08:14:59 crc kubenswrapper[4825]: I0310 08:14:59.318270 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74d56d465f-d2t8h"
Mar 10 08:14:59 crc kubenswrapper[4825]: I0310 08:14:59.416919 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c7b7657fc-vwr72"]
Mar 10 08:14:59 crc kubenswrapper[4825]: I0310 08:14:59.417409 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c7b7657fc-vwr72" podUID="a8e6f407-4c43-4105-8b90-6d1eefa79a20" containerName="dnsmasq-dns" containerID="cri-o://8ff48b0e845c119b201f42252ec7936b2bc34895932f700db12f398de5e90390" gracePeriod=10
Mar 10 08:14:59 crc kubenswrapper[4825]: I0310 08:14:59.836890 4825 generic.go:334] "Generic (PLEG): container finished" podID="a8e6f407-4c43-4105-8b90-6d1eefa79a20" containerID="8ff48b0e845c119b201f42252ec7936b2bc34895932f700db12f398de5e90390" exitCode=0
Mar 10 08:14:59 crc kubenswrapper[4825]: I0310 08:14:59.836979 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c7b7657fc-vwr72" event={"ID":"a8e6f407-4c43-4105-8b90-6d1eefa79a20","Type":"ContainerDied","Data":"8ff48b0e845c119b201f42252ec7936b2bc34895932f700db12f398de5e90390"}
Mar 10 08:14:59 crc kubenswrapper[4825]: I0310 08:14:59.837273 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c7b7657fc-vwr72" event={"ID":"a8e6f407-4c43-4105-8b90-6d1eefa79a20","Type":"ContainerDied","Data":"51760ef14e493e55c03eb00bbaaf85877694d9c414da82f8185f488e892e077b"}
Mar 10 08:14:59 crc kubenswrapper[4825]: I0310 08:14:59.837296 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51760ef14e493e55c03eb00bbaaf85877694d9c414da82f8185f488e892e077b"
Mar 10 08:14:59 crc kubenswrapper[4825]: I0310 08:14:59.837845 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c7b7657fc-vwr72"
Mar 10 08:15:00 crc kubenswrapper[4825]: I0310 08:15:00.013348 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8e6f407-4c43-4105-8b90-6d1eefa79a20-dns-svc\") pod \"a8e6f407-4c43-4105-8b90-6d1eefa79a20\" (UID: \"a8e6f407-4c43-4105-8b90-6d1eefa79a20\") "
Mar 10 08:15:00 crc kubenswrapper[4825]: I0310 08:15:00.013412 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sg72c\" (UniqueName: \"kubernetes.io/projected/a8e6f407-4c43-4105-8b90-6d1eefa79a20-kube-api-access-sg72c\") pod \"a8e6f407-4c43-4105-8b90-6d1eefa79a20\" (UID: \"a8e6f407-4c43-4105-8b90-6d1eefa79a20\") "
Mar 10 08:15:00 crc kubenswrapper[4825]: I0310 08:15:00.013499 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8e6f407-4c43-4105-8b90-6d1eefa79a20-config\") pod \"a8e6f407-4c43-4105-8b90-6d1eefa79a20\" (UID: \"a8e6f407-4c43-4105-8b90-6d1eefa79a20\") "
Mar 10 08:15:00 crc kubenswrapper[4825]: I0310 08:15:00.013533 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a8e6f407-4c43-4105-8b90-6d1eefa79a20-ovsdbserver-nb\") pod \"a8e6f407-4c43-4105-8b90-6d1eefa79a20\" (UID: \"a8e6f407-4c43-4105-8b90-6d1eefa79a20\") "
Mar 10 08:15:00 crc kubenswrapper[4825]: I0310 08:15:00.013555 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a8e6f407-4c43-4105-8b90-6d1eefa79a20-ovsdbserver-sb\") pod \"a8e6f407-4c43-4105-8b90-6d1eefa79a20\" (UID: \"a8e6f407-4c43-4105-8b90-6d1eefa79a20\") "
Mar 10 08:15:00 crc kubenswrapper[4825]: I0310 08:15:00.018536 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8e6f407-4c43-4105-8b90-6d1eefa79a20-kube-api-access-sg72c" (OuterVolumeSpecName: "kube-api-access-sg72c") pod "a8e6f407-4c43-4105-8b90-6d1eefa79a20" (UID: "a8e6f407-4c43-4105-8b90-6d1eefa79a20"). InnerVolumeSpecName "kube-api-access-sg72c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 08:15:00 crc kubenswrapper[4825]: I0310 08:15:00.059864 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8e6f407-4c43-4105-8b90-6d1eefa79a20-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a8e6f407-4c43-4105-8b90-6d1eefa79a20" (UID: "a8e6f407-4c43-4105-8b90-6d1eefa79a20"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 08:15:00 crc kubenswrapper[4825]: I0310 08:15:00.061453 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8e6f407-4c43-4105-8b90-6d1eefa79a20-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a8e6f407-4c43-4105-8b90-6d1eefa79a20" (UID: "a8e6f407-4c43-4105-8b90-6d1eefa79a20"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 08:15:00 crc kubenswrapper[4825]: I0310 08:15:00.067278 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8e6f407-4c43-4105-8b90-6d1eefa79a20-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a8e6f407-4c43-4105-8b90-6d1eefa79a20" (UID: "a8e6f407-4c43-4105-8b90-6d1eefa79a20"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 08:15:00 crc kubenswrapper[4825]: I0310 08:15:00.071748 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8e6f407-4c43-4105-8b90-6d1eefa79a20-config" (OuterVolumeSpecName: "config") pod "a8e6f407-4c43-4105-8b90-6d1eefa79a20" (UID: "a8e6f407-4c43-4105-8b90-6d1eefa79a20"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 08:15:00 crc kubenswrapper[4825]: I0310 08:15:00.115391 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a8e6f407-4c43-4105-8b90-6d1eefa79a20-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 10 08:15:00 crc kubenswrapper[4825]: I0310 08:15:00.115648 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a8e6f407-4c43-4105-8b90-6d1eefa79a20-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 10 08:15:00 crc kubenswrapper[4825]: I0310 08:15:00.115775 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8e6f407-4c43-4105-8b90-6d1eefa79a20-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 10 08:15:00 crc kubenswrapper[4825]: I0310 08:15:00.115866 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sg72c\" (UniqueName: \"kubernetes.io/projected/a8e6f407-4c43-4105-8b90-6d1eefa79a20-kube-api-access-sg72c\") on node \"crc\" DevicePath \"\""
Mar 10 08:15:00 crc kubenswrapper[4825]: I0310 08:15:00.115957 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8e6f407-4c43-4105-8b90-6d1eefa79a20-config\") on node \"crc\" DevicePath \"\""
Mar 10 08:15:00 crc kubenswrapper[4825]: I0310 08:15:00.135608 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552175-b5k8l"]
Mar 10 08:15:00 crc kubenswrapper[4825]: E0310 08:15:00.136902 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d703efe1-32d9-438b-af28-bea13efa058e" containerName="registry-server"
Mar 10 08:15:00 crc kubenswrapper[4825]: I0310 08:15:00.136995 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="d703efe1-32d9-438b-af28-bea13efa058e" containerName="registry-server"
Mar 10 08:15:00 crc kubenswrapper[4825]: E0310 08:15:00.137054 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d703efe1-32d9-438b-af28-bea13efa058e" containerName="extract-content"
Mar 10 08:15:00 crc kubenswrapper[4825]: I0310 08:15:00.137129 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="d703efe1-32d9-438b-af28-bea13efa058e" containerName="extract-content"
Mar 10 08:15:00 crc kubenswrapper[4825]: E0310 08:15:00.137236 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d703efe1-32d9-438b-af28-bea13efa058e" containerName="extract-utilities"
Mar 10 08:15:00 crc kubenswrapper[4825]: I0310 08:15:00.137303 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="d703efe1-32d9-438b-af28-bea13efa058e" containerName="extract-utilities"
Mar 10 08:15:00 crc kubenswrapper[4825]: E0310 08:15:00.137380 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8e6f407-4c43-4105-8b90-6d1eefa79a20" containerName="dnsmasq-dns"
Mar 10 08:15:00 crc kubenswrapper[4825]: I0310 08:15:00.137437 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8e6f407-4c43-4105-8b90-6d1eefa79a20" containerName="dnsmasq-dns"
Mar 10 08:15:00 crc kubenswrapper[4825]: E0310 08:15:00.137489 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8e6f407-4c43-4105-8b90-6d1eefa79a20" containerName="init"
Mar 10 08:15:00 crc kubenswrapper[4825]: I0310 08:15:00.137541 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8e6f407-4c43-4105-8b90-6d1eefa79a20" containerName="init"
Mar 10 08:15:00 crc kubenswrapper[4825]: I0310 08:15:00.137770 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8e6f407-4c43-4105-8b90-6d1eefa79a20" containerName="dnsmasq-dns"
Mar 10 08:15:00 crc kubenswrapper[4825]: I0310 08:15:00.137846 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="d703efe1-32d9-438b-af28-bea13efa058e" containerName="registry-server"
Mar 10 08:15:00 crc kubenswrapper[4825]: I0310 08:15:00.140713 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552175-b5k8l"
Mar 10 08:15:00 crc kubenswrapper[4825]: I0310 08:15:00.143296 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 10 08:15:00 crc kubenswrapper[4825]: I0310 08:15:00.143645 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 10 08:15:00 crc kubenswrapper[4825]: I0310 08:15:00.156048 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552175-b5k8l"]
Mar 10 08:15:00 crc kubenswrapper[4825]: I0310 08:15:00.320393 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/634b3cd8-d25a-4786-8206-d0ab216d359d-config-volume\") pod \"collect-profiles-29552175-b5k8l\" (UID: \"634b3cd8-d25a-4786-8206-d0ab216d359d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552175-b5k8l"
Mar 10 08:15:00 crc kubenswrapper[4825]: I0310 08:15:00.321347 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w987\" (UniqueName: \"kubernetes.io/projected/634b3cd8-d25a-4786-8206-d0ab216d359d-kube-api-access-4w987\") pod \"collect-profiles-29552175-b5k8l\" (UID: \"634b3cd8-d25a-4786-8206-d0ab216d359d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552175-b5k8l"
Mar 10 08:15:00 crc kubenswrapper[4825]: I0310 08:15:00.321920 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/634b3cd8-d25a-4786-8206-d0ab216d359d-secret-volume\") pod \"collect-profiles-29552175-b5k8l\" (UID: \"634b3cd8-d25a-4786-8206-d0ab216d359d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552175-b5k8l"
Mar 10 08:15:00 crc kubenswrapper[4825]: I0310 08:15:00.423967 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/634b3cd8-d25a-4786-8206-d0ab216d359d-config-volume\") pod \"collect-profiles-29552175-b5k8l\" (UID: \"634b3cd8-d25a-4786-8206-d0ab216d359d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552175-b5k8l"
Mar 10 08:15:00 crc kubenswrapper[4825]: I0310 08:15:00.424286 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w987\" (UniqueName: \"kubernetes.io/projected/634b3cd8-d25a-4786-8206-d0ab216d359d-kube-api-access-4w987\") pod \"collect-profiles-29552175-b5k8l\" (UID: \"634b3cd8-d25a-4786-8206-d0ab216d359d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552175-b5k8l"
Mar 10 08:15:00 crc kubenswrapper[4825]: I0310 08:15:00.424402 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/634b3cd8-d25a-4786-8206-d0ab216d359d-secret-volume\") pod \"collect-profiles-29552175-b5k8l\" (UID: \"634b3cd8-d25a-4786-8206-d0ab216d359d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552175-b5k8l"
Mar 10 08:15:00 crc kubenswrapper[4825]: I0310 08:15:00.424936 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/634b3cd8-d25a-4786-8206-d0ab216d359d-config-volume\") pod \"collect-profiles-29552175-b5k8l\" (UID: \"634b3cd8-d25a-4786-8206-d0ab216d359d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552175-b5k8l" Mar 10 08:15:00 crc kubenswrapper[4825]: I0310 08:15:00.428732 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/634b3cd8-d25a-4786-8206-d0ab216d359d-secret-volume\") pod \"collect-profiles-29552175-b5k8l\" (UID: \"634b3cd8-d25a-4786-8206-d0ab216d359d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552175-b5k8l" Mar 10 08:15:00 crc kubenswrapper[4825]: I0310 08:15:00.440117 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w987\" (UniqueName: \"kubernetes.io/projected/634b3cd8-d25a-4786-8206-d0ab216d359d-kube-api-access-4w987\") pod \"collect-profiles-29552175-b5k8l\" (UID: \"634b3cd8-d25a-4786-8206-d0ab216d359d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552175-b5k8l" Mar 10 08:15:00 crc kubenswrapper[4825]: I0310 08:15:00.470180 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552175-b5k8l" Mar 10 08:15:00 crc kubenswrapper[4825]: I0310 08:15:00.846281 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c7b7657fc-vwr72" Mar 10 08:15:00 crc kubenswrapper[4825]: I0310 08:15:00.891108 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c7b7657fc-vwr72"] Mar 10 08:15:00 crc kubenswrapper[4825]: I0310 08:15:00.899373 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c7b7657fc-vwr72"] Mar 10 08:15:00 crc kubenswrapper[4825]: I0310 08:15:00.932731 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552175-b5k8l"] Mar 10 08:15:01 crc kubenswrapper[4825]: I0310 08:15:01.246411 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8e6f407-4c43-4105-8b90-6d1eefa79a20" path="/var/lib/kubelet/pods/a8e6f407-4c43-4105-8b90-6d1eefa79a20/volumes" Mar 10 08:15:01 crc kubenswrapper[4825]: I0310 08:15:01.857045 4825 generic.go:334] "Generic (PLEG): container finished" podID="634b3cd8-d25a-4786-8206-d0ab216d359d" containerID="69d280806fb41c5c27e18f2c606cad1ef2f2991f0cf454bc8d1d590fb4946a5e" exitCode=0 Mar 10 08:15:01 crc kubenswrapper[4825]: I0310 08:15:01.857095 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552175-b5k8l" event={"ID":"634b3cd8-d25a-4786-8206-d0ab216d359d","Type":"ContainerDied","Data":"69d280806fb41c5c27e18f2c606cad1ef2f2991f0cf454bc8d1d590fb4946a5e"} Mar 10 08:15:01 crc kubenswrapper[4825]: I0310 08:15:01.857430 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552175-b5k8l" event={"ID":"634b3cd8-d25a-4786-8206-d0ab216d359d","Type":"ContainerStarted","Data":"617f1d9c3a54fc53946591c7c4fd2e371ed1c11fc1fc6930e816b0f4f95c2017"} Mar 10 08:15:03 crc kubenswrapper[4825]: I0310 08:15:03.227526 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552175-b5k8l" Mar 10 08:15:03 crc kubenswrapper[4825]: I0310 08:15:03.390899 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/634b3cd8-d25a-4786-8206-d0ab216d359d-secret-volume\") pod \"634b3cd8-d25a-4786-8206-d0ab216d359d\" (UID: \"634b3cd8-d25a-4786-8206-d0ab216d359d\") " Mar 10 08:15:03 crc kubenswrapper[4825]: I0310 08:15:03.391169 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/634b3cd8-d25a-4786-8206-d0ab216d359d-config-volume\") pod \"634b3cd8-d25a-4786-8206-d0ab216d359d\" (UID: \"634b3cd8-d25a-4786-8206-d0ab216d359d\") " Mar 10 08:15:03 crc kubenswrapper[4825]: I0310 08:15:03.391194 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4w987\" (UniqueName: \"kubernetes.io/projected/634b3cd8-d25a-4786-8206-d0ab216d359d-kube-api-access-4w987\") pod \"634b3cd8-d25a-4786-8206-d0ab216d359d\" (UID: \"634b3cd8-d25a-4786-8206-d0ab216d359d\") " Mar 10 08:15:03 crc kubenswrapper[4825]: I0310 08:15:03.392035 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/634b3cd8-d25a-4786-8206-d0ab216d359d-config-volume" (OuterVolumeSpecName: "config-volume") pod "634b3cd8-d25a-4786-8206-d0ab216d359d" (UID: "634b3cd8-d25a-4786-8206-d0ab216d359d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:15:03 crc kubenswrapper[4825]: I0310 08:15:03.398516 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/634b3cd8-d25a-4786-8206-d0ab216d359d-kube-api-access-4w987" (OuterVolumeSpecName: "kube-api-access-4w987") pod "634b3cd8-d25a-4786-8206-d0ab216d359d" (UID: "634b3cd8-d25a-4786-8206-d0ab216d359d"). 
InnerVolumeSpecName "kube-api-access-4w987". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:15:03 crc kubenswrapper[4825]: I0310 08:15:03.399800 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/634b3cd8-d25a-4786-8206-d0ab216d359d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "634b3cd8-d25a-4786-8206-d0ab216d359d" (UID: "634b3cd8-d25a-4786-8206-d0ab216d359d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:15:03 crc kubenswrapper[4825]: I0310 08:15:03.493967 4825 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/634b3cd8-d25a-4786-8206-d0ab216d359d-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 08:15:03 crc kubenswrapper[4825]: I0310 08:15:03.494038 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4w987\" (UniqueName: \"kubernetes.io/projected/634b3cd8-d25a-4786-8206-d0ab216d359d-kube-api-access-4w987\") on node \"crc\" DevicePath \"\"" Mar 10 08:15:03 crc kubenswrapper[4825]: I0310 08:15:03.494062 4825 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/634b3cd8-d25a-4786-8206-d0ab216d359d-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 08:15:03 crc kubenswrapper[4825]: I0310 08:15:03.877455 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552175-b5k8l" event={"ID":"634b3cd8-d25a-4786-8206-d0ab216d359d","Type":"ContainerDied","Data":"617f1d9c3a54fc53946591c7c4fd2e371ed1c11fc1fc6930e816b0f4f95c2017"} Mar 10 08:15:03 crc kubenswrapper[4825]: I0310 08:15:03.877489 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="617f1d9c3a54fc53946591c7c4fd2e371ed1c11fc1fc6930e816b0f4f95c2017" Mar 10 08:15:03 crc kubenswrapper[4825]: I0310 08:15:03.877586 4825 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552175-b5k8l" Mar 10 08:15:04 crc kubenswrapper[4825]: I0310 08:15:04.312850 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552130-sbkmx"] Mar 10 08:15:04 crc kubenswrapper[4825]: I0310 08:15:04.322085 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552130-sbkmx"] Mar 10 08:15:05 crc kubenswrapper[4825]: I0310 08:15:05.250082 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f" path="/var/lib/kubelet/pods/a5833cf1-e0d8-4f34-8a4e-d33d9dc86c8f/volumes" Mar 10 08:15:06 crc kubenswrapper[4825]: I0310 08:15:06.094983 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 10 08:15:09 crc kubenswrapper[4825]: I0310 08:15:09.244609 4825 scope.go:117] "RemoveContainer" containerID="cc40bbe7e5127cd8ca03c26dd788c9bfc628fd248484a1f5f73946709b5b4414" Mar 10 08:15:09 crc kubenswrapper[4825]: E0310 08:15:09.247173 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:15:20 crc kubenswrapper[4825]: I0310 08:15:20.236464 4825 scope.go:117] "RemoveContainer" containerID="cc40bbe7e5127cd8ca03c26dd788c9bfc628fd248484a1f5f73946709b5b4414" Mar 10 08:15:20 crc kubenswrapper[4825]: E0310 08:15:20.237246 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:15:22 crc kubenswrapper[4825]: I0310 08:15:22.531518 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dzb8c"] Mar 10 08:15:22 crc kubenswrapper[4825]: E0310 08:15:22.533305 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="634b3cd8-d25a-4786-8206-d0ab216d359d" containerName="collect-profiles" Mar 10 08:15:22 crc kubenswrapper[4825]: I0310 08:15:22.533445 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="634b3cd8-d25a-4786-8206-d0ab216d359d" containerName="collect-profiles" Mar 10 08:15:22 crc kubenswrapper[4825]: I0310 08:15:22.534050 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="634b3cd8-d25a-4786-8206-d0ab216d359d" containerName="collect-profiles" Mar 10 08:15:22 crc kubenswrapper[4825]: I0310 08:15:22.537702 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dzb8c" Mar 10 08:15:22 crc kubenswrapper[4825]: I0310 08:15:22.553348 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dzb8c"] Mar 10 08:15:22 crc kubenswrapper[4825]: I0310 08:15:22.658610 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9-catalog-content\") pod \"redhat-operators-dzb8c\" (UID: \"2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9\") " pod="openshift-marketplace/redhat-operators-dzb8c" Mar 10 08:15:22 crc kubenswrapper[4825]: I0310 08:15:22.658696 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b222p\" (UniqueName: \"kubernetes.io/projected/2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9-kube-api-access-b222p\") pod \"redhat-operators-dzb8c\" (UID: \"2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9\") " pod="openshift-marketplace/redhat-operators-dzb8c" Mar 10 08:15:22 crc kubenswrapper[4825]: I0310 08:15:22.658760 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9-utilities\") pod \"redhat-operators-dzb8c\" (UID: \"2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9\") " pod="openshift-marketplace/redhat-operators-dzb8c" Mar 10 08:15:22 crc kubenswrapper[4825]: I0310 08:15:22.761054 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9-catalog-content\") pod \"redhat-operators-dzb8c\" (UID: \"2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9\") " pod="openshift-marketplace/redhat-operators-dzb8c" Mar 10 08:15:22 crc kubenswrapper[4825]: I0310 08:15:22.761117 4825 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-b222p\" (UniqueName: \"kubernetes.io/projected/2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9-kube-api-access-b222p\") pod \"redhat-operators-dzb8c\" (UID: \"2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9\") " pod="openshift-marketplace/redhat-operators-dzb8c" Mar 10 08:15:22 crc kubenswrapper[4825]: I0310 08:15:22.761189 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9-utilities\") pod \"redhat-operators-dzb8c\" (UID: \"2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9\") " pod="openshift-marketplace/redhat-operators-dzb8c" Mar 10 08:15:22 crc kubenswrapper[4825]: I0310 08:15:22.761780 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9-catalog-content\") pod \"redhat-operators-dzb8c\" (UID: \"2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9\") " pod="openshift-marketplace/redhat-operators-dzb8c" Mar 10 08:15:22 crc kubenswrapper[4825]: I0310 08:15:22.761895 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9-utilities\") pod \"redhat-operators-dzb8c\" (UID: \"2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9\") " pod="openshift-marketplace/redhat-operators-dzb8c" Mar 10 08:15:22 crc kubenswrapper[4825]: I0310 08:15:22.782557 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b222p\" (UniqueName: \"kubernetes.io/projected/2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9-kube-api-access-b222p\") pod \"redhat-operators-dzb8c\" (UID: \"2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9\") " pod="openshift-marketplace/redhat-operators-dzb8c" Mar 10 08:15:22 crc kubenswrapper[4825]: I0310 08:15:22.871987 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dzb8c" Mar 10 08:15:23 crc kubenswrapper[4825]: I0310 08:15:23.347034 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dzb8c"] Mar 10 08:15:23 crc kubenswrapper[4825]: W0310 08:15:23.351237 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c0a4ca4_aa1e_488e_b8c6_3ff0909807e9.slice/crio-70e73746230991a2eaca80fdd05017b09afdbabdc260885aab804910ff23a59d WatchSource:0}: Error finding container 70e73746230991a2eaca80fdd05017b09afdbabdc260885aab804910ff23a59d: Status 404 returned error can't find the container with id 70e73746230991a2eaca80fdd05017b09afdbabdc260885aab804910ff23a59d Mar 10 08:15:23 crc kubenswrapper[4825]: I0310 08:15:23.600838 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 08:15:23 crc kubenswrapper[4825]: I0310 08:15:23.602437 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 08:15:23 crc kubenswrapper[4825]: I0310 08:15:23.605827 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 10 08:15:23 crc kubenswrapper[4825]: I0310 08:15:23.612419 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 08:15:23 crc kubenswrapper[4825]: I0310 08:15:23.786306 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79dfca78-daec-4e1b-93cb-505e50873031-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"79dfca78-daec-4e1b-93cb-505e50873031\") " pod="openstack/cinder-scheduler-0" Mar 10 08:15:23 crc kubenswrapper[4825]: I0310 08:15:23.786417 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79dfca78-daec-4e1b-93cb-505e50873031-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"79dfca78-daec-4e1b-93cb-505e50873031\") " pod="openstack/cinder-scheduler-0" Mar 10 08:15:23 crc kubenswrapper[4825]: I0310 08:15:23.786627 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/79dfca78-daec-4e1b-93cb-505e50873031-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"79dfca78-daec-4e1b-93cb-505e50873031\") " pod="openstack/cinder-scheduler-0" Mar 10 08:15:23 crc kubenswrapper[4825]: I0310 08:15:23.786812 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79dfca78-daec-4e1b-93cb-505e50873031-scripts\") pod \"cinder-scheduler-0\" (UID: \"79dfca78-daec-4e1b-93cb-505e50873031\") " pod="openstack/cinder-scheduler-0" Mar 10 08:15:23 crc kubenswrapper[4825]: I0310 08:15:23.786930 4825 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79dfca78-daec-4e1b-93cb-505e50873031-config-data\") pod \"cinder-scheduler-0\" (UID: \"79dfca78-daec-4e1b-93cb-505e50873031\") " pod="openstack/cinder-scheduler-0" Mar 10 08:15:23 crc kubenswrapper[4825]: I0310 08:15:23.786962 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvq99\" (UniqueName: \"kubernetes.io/projected/79dfca78-daec-4e1b-93cb-505e50873031-kube-api-access-bvq99\") pod \"cinder-scheduler-0\" (UID: \"79dfca78-daec-4e1b-93cb-505e50873031\") " pod="openstack/cinder-scheduler-0" Mar 10 08:15:23 crc kubenswrapper[4825]: I0310 08:15:23.888988 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79dfca78-daec-4e1b-93cb-505e50873031-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"79dfca78-daec-4e1b-93cb-505e50873031\") " pod="openstack/cinder-scheduler-0" Mar 10 08:15:23 crc kubenswrapper[4825]: I0310 08:15:23.889077 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/79dfca78-daec-4e1b-93cb-505e50873031-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"79dfca78-daec-4e1b-93cb-505e50873031\") " pod="openstack/cinder-scheduler-0" Mar 10 08:15:23 crc kubenswrapper[4825]: I0310 08:15:23.889120 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79dfca78-daec-4e1b-93cb-505e50873031-scripts\") pod \"cinder-scheduler-0\" (UID: \"79dfca78-daec-4e1b-93cb-505e50873031\") " pod="openstack/cinder-scheduler-0" Mar 10 08:15:23 crc kubenswrapper[4825]: I0310 08:15:23.889197 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/79dfca78-daec-4e1b-93cb-505e50873031-config-data\") pod \"cinder-scheduler-0\" (UID: \"79dfca78-daec-4e1b-93cb-505e50873031\") " pod="openstack/cinder-scheduler-0" Mar 10 08:15:23 crc kubenswrapper[4825]: I0310 08:15:23.889196 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/79dfca78-daec-4e1b-93cb-505e50873031-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"79dfca78-daec-4e1b-93cb-505e50873031\") " pod="openstack/cinder-scheduler-0" Mar 10 08:15:23 crc kubenswrapper[4825]: I0310 08:15:23.889984 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvq99\" (UniqueName: \"kubernetes.io/projected/79dfca78-daec-4e1b-93cb-505e50873031-kube-api-access-bvq99\") pod \"cinder-scheduler-0\" (UID: \"79dfca78-daec-4e1b-93cb-505e50873031\") " pod="openstack/cinder-scheduler-0" Mar 10 08:15:23 crc kubenswrapper[4825]: I0310 08:15:23.890097 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79dfca78-daec-4e1b-93cb-505e50873031-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"79dfca78-daec-4e1b-93cb-505e50873031\") " pod="openstack/cinder-scheduler-0" Mar 10 08:15:23 crc kubenswrapper[4825]: I0310 08:15:23.896247 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79dfca78-daec-4e1b-93cb-505e50873031-config-data\") pod \"cinder-scheduler-0\" (UID: \"79dfca78-daec-4e1b-93cb-505e50873031\") " pod="openstack/cinder-scheduler-0" Mar 10 08:15:23 crc kubenswrapper[4825]: I0310 08:15:23.897687 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79dfca78-daec-4e1b-93cb-505e50873031-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"79dfca78-daec-4e1b-93cb-505e50873031\") " 
pod="openstack/cinder-scheduler-0" Mar 10 08:15:23 crc kubenswrapper[4825]: I0310 08:15:23.898121 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79dfca78-daec-4e1b-93cb-505e50873031-scripts\") pod \"cinder-scheduler-0\" (UID: \"79dfca78-daec-4e1b-93cb-505e50873031\") " pod="openstack/cinder-scheduler-0" Mar 10 08:15:23 crc kubenswrapper[4825]: I0310 08:15:23.898475 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79dfca78-daec-4e1b-93cb-505e50873031-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"79dfca78-daec-4e1b-93cb-505e50873031\") " pod="openstack/cinder-scheduler-0" Mar 10 08:15:23 crc kubenswrapper[4825]: I0310 08:15:23.916575 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvq99\" (UniqueName: \"kubernetes.io/projected/79dfca78-daec-4e1b-93cb-505e50873031-kube-api-access-bvq99\") pod \"cinder-scheduler-0\" (UID: \"79dfca78-daec-4e1b-93cb-505e50873031\") " pod="openstack/cinder-scheduler-0" Mar 10 08:15:23 crc kubenswrapper[4825]: I0310 08:15:23.918818 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 08:15:24 crc kubenswrapper[4825]: I0310 08:15:24.063569 4825 generic.go:334] "Generic (PLEG): container finished" podID="2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9" containerID="bc694e0f3164dea92fce9c751813baa56c873de3782ca3aba3ed3acaeb00484d" exitCode=0 Mar 10 08:15:24 crc kubenswrapper[4825]: I0310 08:15:24.063781 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzb8c" event={"ID":"2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9","Type":"ContainerDied","Data":"bc694e0f3164dea92fce9c751813baa56c873de3782ca3aba3ed3acaeb00484d"} Mar 10 08:15:24 crc kubenswrapper[4825]: I0310 08:15:24.063860 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzb8c" event={"ID":"2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9","Type":"ContainerStarted","Data":"70e73746230991a2eaca80fdd05017b09afdbabdc260885aab804910ff23a59d"} Mar 10 08:15:24 crc kubenswrapper[4825]: I0310 08:15:24.388504 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 08:15:24 crc kubenswrapper[4825]: I0310 08:15:24.731549 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 10 08:15:24 crc kubenswrapper[4825]: I0310 08:15:24.731837 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="02cf0c5c-1b04-4f5f-b7e1-b86669b0be36" containerName="cinder-api-log" containerID="cri-o://5ecd9977232bc5148d24eb393a5e62db939f45a2db927f5ebd5f554b99d75cc5" gracePeriod=30 Mar 10 08:15:24 crc kubenswrapper[4825]: I0310 08:15:24.731897 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="02cf0c5c-1b04-4f5f-b7e1-b86669b0be36" containerName="cinder-api" containerID="cri-o://f4dcebc193b4ec012c3b6f0d8afa587e6d461a9bca3448c05ec5d1539b09ac73" gracePeriod=30 Mar 10 08:15:25 crc kubenswrapper[4825]: I0310 
08:15:25.089027 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"79dfca78-daec-4e1b-93cb-505e50873031","Type":"ContainerStarted","Data":"184491406c981901bde534458991cb3af8cbd64559c4cd7cd471000525c7c580"} Mar 10 08:15:25 crc kubenswrapper[4825]: I0310 08:15:25.091998 4825 generic.go:334] "Generic (PLEG): container finished" podID="02cf0c5c-1b04-4f5f-b7e1-b86669b0be36" containerID="5ecd9977232bc5148d24eb393a5e62db939f45a2db927f5ebd5f554b99d75cc5" exitCode=143 Mar 10 08:15:25 crc kubenswrapper[4825]: I0310 08:15:25.092038 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36","Type":"ContainerDied","Data":"5ecd9977232bc5148d24eb393a5e62db939f45a2db927f5ebd5f554b99d75cc5"} Mar 10 08:15:26 crc kubenswrapper[4825]: I0310 08:15:26.132583 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzb8c" event={"ID":"2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9","Type":"ContainerStarted","Data":"e9aa548e1dc3fdf6c7b5eafdcf50956e9c87e40eacb16fd3b0cf1a6a13010b2a"} Mar 10 08:15:26 crc kubenswrapper[4825]: I0310 08:15:26.148836 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"79dfca78-daec-4e1b-93cb-505e50873031","Type":"ContainerStarted","Data":"1e733cf69039acc31766e6f545c5948a7a271d01448ad95183df6fa8a4bc9ab4"} Mar 10 08:15:26 crc kubenswrapper[4825]: I0310 08:15:26.148874 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"79dfca78-daec-4e1b-93cb-505e50873031","Type":"ContainerStarted","Data":"91594b35941ba8d6fa70ed3653a7cf7be36cb337a408535fffe4a8d90860a75f"} Mar 10 08:15:26 crc kubenswrapper[4825]: I0310 08:15:26.169797 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.965918398 podStartE2EDuration="3.169779213s" 
podCreationTimestamp="2026-03-10 08:15:23 +0000 UTC" firstStartedPulling="2026-03-10 08:15:24.399323726 +0000 UTC m=+5477.429104341" lastFinishedPulling="2026-03-10 08:15:24.603184541 +0000 UTC m=+5477.632965156" observedRunningTime="2026-03-10 08:15:26.169028143 +0000 UTC m=+5479.198808768" watchObservedRunningTime="2026-03-10 08:15:26.169779213 +0000 UTC m=+5479.199559828" Mar 10 08:15:28 crc kubenswrapper[4825]: I0310 08:15:28.167222 4825 generic.go:334] "Generic (PLEG): container finished" podID="2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9" containerID="e9aa548e1dc3fdf6c7b5eafdcf50956e9c87e40eacb16fd3b0cf1a6a13010b2a" exitCode=0 Mar 10 08:15:28 crc kubenswrapper[4825]: I0310 08:15:28.167272 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzb8c" event={"ID":"2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9","Type":"ContainerDied","Data":"e9aa548e1dc3fdf6c7b5eafdcf50956e9c87e40eacb16fd3b0cf1a6a13010b2a"} Mar 10 08:15:28 crc kubenswrapper[4825]: I0310 08:15:28.171512 4825 generic.go:334] "Generic (PLEG): container finished" podID="02cf0c5c-1b04-4f5f-b7e1-b86669b0be36" containerID="f4dcebc193b4ec012c3b6f0d8afa587e6d461a9bca3448c05ec5d1539b09ac73" exitCode=0 Mar 10 08:15:28 crc kubenswrapper[4825]: I0310 08:15:28.171564 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36","Type":"ContainerDied","Data":"f4dcebc193b4ec012c3b6f0d8afa587e6d461a9bca3448c05ec5d1539b09ac73"} Mar 10 08:15:28 crc kubenswrapper[4825]: I0310 08:15:28.761489 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 10 08:15:28 crc kubenswrapper[4825]: I0310 08:15:28.920250 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 10 08:15:28 crc kubenswrapper[4825]: I0310 08:15:28.923107 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-logs\") pod \"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36\" (UID: \"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36\") " Mar 10 08:15:28 crc kubenswrapper[4825]: I0310 08:15:28.923192 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-config-data\") pod \"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36\" (UID: \"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36\") " Mar 10 08:15:28 crc kubenswrapper[4825]: I0310 08:15:28.923273 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jjs2\" (UniqueName: \"kubernetes.io/projected/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-kube-api-access-9jjs2\") pod \"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36\" (UID: \"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36\") " Mar 10 08:15:28 crc kubenswrapper[4825]: I0310 08:15:28.923307 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-config-data-custom\") pod \"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36\" (UID: \"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36\") " Mar 10 08:15:28 crc kubenswrapper[4825]: I0310 08:15:28.923372 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-scripts\") pod \"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36\" (UID: \"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36\") " Mar 10 08:15:28 crc 
kubenswrapper[4825]: I0310 08:15:28.923426 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-etc-machine-id\") pod \"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36\" (UID: \"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36\") " Mar 10 08:15:28 crc kubenswrapper[4825]: I0310 08:15:28.923470 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-internal-tls-certs\") pod \"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36\" (UID: \"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36\") " Mar 10 08:15:28 crc kubenswrapper[4825]: I0310 08:15:28.923566 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "02cf0c5c-1b04-4f5f-b7e1-b86669b0be36" (UID: "02cf0c5c-1b04-4f5f-b7e1-b86669b0be36"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 08:15:28 crc kubenswrapper[4825]: I0310 08:15:28.923512 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-combined-ca-bundle\") pod \"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36\" (UID: \"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36\") " Mar 10 08:15:28 crc kubenswrapper[4825]: I0310 08:15:28.924245 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-public-tls-certs\") pod \"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36\" (UID: \"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36\") " Mar 10 08:15:28 crc kubenswrapper[4825]: I0310 08:15:28.924663 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-logs" (OuterVolumeSpecName: "logs") pod "02cf0c5c-1b04-4f5f-b7e1-b86669b0be36" (UID: "02cf0c5c-1b04-4f5f-b7e1-b86669b0be36"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:15:28 crc kubenswrapper[4825]: I0310 08:15:28.924812 4825 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 10 08:15:28 crc kubenswrapper[4825]: I0310 08:15:28.930238 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-kube-api-access-9jjs2" (OuterVolumeSpecName: "kube-api-access-9jjs2") pod "02cf0c5c-1b04-4f5f-b7e1-b86669b0be36" (UID: "02cf0c5c-1b04-4f5f-b7e1-b86669b0be36"). InnerVolumeSpecName "kube-api-access-9jjs2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:15:28 crc kubenswrapper[4825]: I0310 08:15:28.933198 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-scripts" (OuterVolumeSpecName: "scripts") pod "02cf0c5c-1b04-4f5f-b7e1-b86669b0be36" (UID: "02cf0c5c-1b04-4f5f-b7e1-b86669b0be36"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:15:28 crc kubenswrapper[4825]: I0310 08:15:28.939518 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "02cf0c5c-1b04-4f5f-b7e1-b86669b0be36" (UID: "02cf0c5c-1b04-4f5f-b7e1-b86669b0be36"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:15:28 crc kubenswrapper[4825]: I0310 08:15:28.967069 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "02cf0c5c-1b04-4f5f-b7e1-b86669b0be36" (UID: "02cf0c5c-1b04-4f5f-b7e1-b86669b0be36"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:15:28 crc kubenswrapper[4825]: I0310 08:15:28.976753 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-config-data" (OuterVolumeSpecName: "config-data") pod "02cf0c5c-1b04-4f5f-b7e1-b86669b0be36" (UID: "02cf0c5c-1b04-4f5f-b7e1-b86669b0be36"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:15:28 crc kubenswrapper[4825]: I0310 08:15:28.982319 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "02cf0c5c-1b04-4f5f-b7e1-b86669b0be36" (UID: "02cf0c5c-1b04-4f5f-b7e1-b86669b0be36"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.001449 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "02cf0c5c-1b04-4f5f-b7e1-b86669b0be36" (UID: "02cf0c5c-1b04-4f5f-b7e1-b86669b0be36"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.026648 4825 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.026687 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.026697 4825 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.026706 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-logs\") on node \"crc\" 
DevicePath \"\"" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.026716 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.026724 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jjs2\" (UniqueName: \"kubernetes.io/projected/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-kube-api-access-9jjs2\") on node \"crc\" DevicePath \"\"" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.026733 4825 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.026740 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.182806 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzb8c" event={"ID":"2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9","Type":"ContainerStarted","Data":"94a8ce5d6c3fcc2a8c2ae7b60f50866672ad044e167589386154b1b2557d4011"} Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.186993 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"02cf0c5c-1b04-4f5f-b7e1-b86669b0be36","Type":"ContainerDied","Data":"da87b1fa2845c81c241b53418ff82bab78b4dc3e7fd12452e75d5e7fe7fd9920"} Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.187046 4825 scope.go:117] "RemoveContainer" containerID="f4dcebc193b4ec012c3b6f0d8afa587e6d461a9bca3448c05ec5d1539b09ac73" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.187229 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.225721 4825 scope.go:117] "RemoveContainer" containerID="5ecd9977232bc5148d24eb393a5e62db939f45a2db927f5ebd5f554b99d75cc5" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.231738 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dzb8c" podStartSLOduration=2.715183413 podStartE2EDuration="7.231711784s" podCreationTimestamp="2026-03-10 08:15:22 +0000 UTC" firstStartedPulling="2026-03-10 08:15:24.066561925 +0000 UTC m=+5477.096342540" lastFinishedPulling="2026-03-10 08:15:28.583090286 +0000 UTC m=+5481.612870911" observedRunningTime="2026-03-10 08:15:29.209534112 +0000 UTC m=+5482.239314727" watchObservedRunningTime="2026-03-10 08:15:29.231711784 +0000 UTC m=+5482.261492399" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.248319 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.250381 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.259444 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 10 08:15:29 crc kubenswrapper[4825]: E0310 08:15:29.259800 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02cf0c5c-1b04-4f5f-b7e1-b86669b0be36" containerName="cinder-api" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.259818 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="02cf0c5c-1b04-4f5f-b7e1-b86669b0be36" containerName="cinder-api" Mar 10 08:15:29 crc kubenswrapper[4825]: E0310 08:15:29.259836 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02cf0c5c-1b04-4f5f-b7e1-b86669b0be36" containerName="cinder-api-log" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.259843 4825 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="02cf0c5c-1b04-4f5f-b7e1-b86669b0be36" containerName="cinder-api-log" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.260013 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="02cf0c5c-1b04-4f5f-b7e1-b86669b0be36" containerName="cinder-api-log" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.260038 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="02cf0c5c-1b04-4f5f-b7e1-b86669b0be36" containerName="cinder-api" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.260883 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.264981 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.264969 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.265366 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.312125 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.436172 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6ed5b03-6418-446b-8bb0-88ed213149d1-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f6ed5b03-6418-446b-8bb0-88ed213149d1\") " pod="openstack/cinder-api-0" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.436252 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjkdv\" (UniqueName: \"kubernetes.io/projected/f6ed5b03-6418-446b-8bb0-88ed213149d1-kube-api-access-jjkdv\") pod \"cinder-api-0\" 
(UID: \"f6ed5b03-6418-446b-8bb0-88ed213149d1\") " pod="openstack/cinder-api-0" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.436293 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6ed5b03-6418-446b-8bb0-88ed213149d1-logs\") pod \"cinder-api-0\" (UID: \"f6ed5b03-6418-446b-8bb0-88ed213149d1\") " pod="openstack/cinder-api-0" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.436329 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6ed5b03-6418-446b-8bb0-88ed213149d1-config-data\") pod \"cinder-api-0\" (UID: \"f6ed5b03-6418-446b-8bb0-88ed213149d1\") " pod="openstack/cinder-api-0" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.436380 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6ed5b03-6418-446b-8bb0-88ed213149d1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f6ed5b03-6418-446b-8bb0-88ed213149d1\") " pod="openstack/cinder-api-0" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.436579 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f6ed5b03-6418-446b-8bb0-88ed213149d1-config-data-custom\") pod \"cinder-api-0\" (UID: \"f6ed5b03-6418-446b-8bb0-88ed213149d1\") " pod="openstack/cinder-api-0" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.436724 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f6ed5b03-6418-446b-8bb0-88ed213149d1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f6ed5b03-6418-446b-8bb0-88ed213149d1\") " pod="openstack/cinder-api-0" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 
08:15:29.436760 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6ed5b03-6418-446b-8bb0-88ed213149d1-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f6ed5b03-6418-446b-8bb0-88ed213149d1\") " pod="openstack/cinder-api-0" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.436820 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6ed5b03-6418-446b-8bb0-88ed213149d1-scripts\") pod \"cinder-api-0\" (UID: \"f6ed5b03-6418-446b-8bb0-88ed213149d1\") " pod="openstack/cinder-api-0" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.538317 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f6ed5b03-6418-446b-8bb0-88ed213149d1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f6ed5b03-6418-446b-8bb0-88ed213149d1\") " pod="openstack/cinder-api-0" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.538367 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6ed5b03-6418-446b-8bb0-88ed213149d1-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f6ed5b03-6418-446b-8bb0-88ed213149d1\") " pod="openstack/cinder-api-0" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.538402 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6ed5b03-6418-446b-8bb0-88ed213149d1-scripts\") pod \"cinder-api-0\" (UID: \"f6ed5b03-6418-446b-8bb0-88ed213149d1\") " pod="openstack/cinder-api-0" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.538435 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjkdv\" (UniqueName: 
\"kubernetes.io/projected/f6ed5b03-6418-446b-8bb0-88ed213149d1-kube-api-access-jjkdv\") pod \"cinder-api-0\" (UID: \"f6ed5b03-6418-446b-8bb0-88ed213149d1\") " pod="openstack/cinder-api-0" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.538436 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f6ed5b03-6418-446b-8bb0-88ed213149d1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f6ed5b03-6418-446b-8bb0-88ed213149d1\") " pod="openstack/cinder-api-0" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.538455 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6ed5b03-6418-446b-8bb0-88ed213149d1-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f6ed5b03-6418-446b-8bb0-88ed213149d1\") " pod="openstack/cinder-api-0" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.538500 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6ed5b03-6418-446b-8bb0-88ed213149d1-logs\") pod \"cinder-api-0\" (UID: \"f6ed5b03-6418-446b-8bb0-88ed213149d1\") " pod="openstack/cinder-api-0" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.538675 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6ed5b03-6418-446b-8bb0-88ed213149d1-config-data\") pod \"cinder-api-0\" (UID: \"f6ed5b03-6418-446b-8bb0-88ed213149d1\") " pod="openstack/cinder-api-0" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.539178 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6ed5b03-6418-446b-8bb0-88ed213149d1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f6ed5b03-6418-446b-8bb0-88ed213149d1\") " pod="openstack/cinder-api-0" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 
08:15:29.539245 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6ed5b03-6418-446b-8bb0-88ed213149d1-logs\") pod \"cinder-api-0\" (UID: \"f6ed5b03-6418-446b-8bb0-88ed213149d1\") " pod="openstack/cinder-api-0" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.539261 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f6ed5b03-6418-446b-8bb0-88ed213149d1-config-data-custom\") pod \"cinder-api-0\" (UID: \"f6ed5b03-6418-446b-8bb0-88ed213149d1\") " pod="openstack/cinder-api-0" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.542730 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6ed5b03-6418-446b-8bb0-88ed213149d1-scripts\") pod \"cinder-api-0\" (UID: \"f6ed5b03-6418-446b-8bb0-88ed213149d1\") " pod="openstack/cinder-api-0" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.543245 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6ed5b03-6418-446b-8bb0-88ed213149d1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f6ed5b03-6418-446b-8bb0-88ed213149d1\") " pod="openstack/cinder-api-0" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.543755 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f6ed5b03-6418-446b-8bb0-88ed213149d1-config-data-custom\") pod \"cinder-api-0\" (UID: \"f6ed5b03-6418-446b-8bb0-88ed213149d1\") " pod="openstack/cinder-api-0" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.544035 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6ed5b03-6418-446b-8bb0-88ed213149d1-public-tls-certs\") pod \"cinder-api-0\" (UID: 
\"f6ed5b03-6418-446b-8bb0-88ed213149d1\") " pod="openstack/cinder-api-0" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.544495 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6ed5b03-6418-446b-8bb0-88ed213149d1-config-data\") pod \"cinder-api-0\" (UID: \"f6ed5b03-6418-446b-8bb0-88ed213149d1\") " pod="openstack/cinder-api-0" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.547836 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6ed5b03-6418-446b-8bb0-88ed213149d1-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f6ed5b03-6418-446b-8bb0-88ed213149d1\") " pod="openstack/cinder-api-0" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.561236 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjkdv\" (UniqueName: \"kubernetes.io/projected/f6ed5b03-6418-446b-8bb0-88ed213149d1-kube-api-access-jjkdv\") pod \"cinder-api-0\" (UID: \"f6ed5b03-6418-446b-8bb0-88ed213149d1\") " pod="openstack/cinder-api-0" Mar 10 08:15:29 crc kubenswrapper[4825]: I0310 08:15:29.627933 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 10 08:15:30 crc kubenswrapper[4825]: I0310 08:15:30.086021 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 10 08:15:30 crc kubenswrapper[4825]: I0310 08:15:30.200007 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f6ed5b03-6418-446b-8bb0-88ed213149d1","Type":"ContainerStarted","Data":"b3b9f9b30a48935a7ecc30afea52fd63ed82c0754df0ed94960980137e9eaa5f"} Mar 10 08:15:31 crc kubenswrapper[4825]: I0310 08:15:31.214069 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f6ed5b03-6418-446b-8bb0-88ed213149d1","Type":"ContainerStarted","Data":"e1a052d0f68c140c022c103edb518bbdb3f4d2bb0ede898d10e6099a659b71e9"} Mar 10 08:15:31 crc kubenswrapper[4825]: I0310 08:15:31.214648 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 10 08:15:31 crc kubenswrapper[4825]: I0310 08:15:31.214695 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f6ed5b03-6418-446b-8bb0-88ed213149d1","Type":"ContainerStarted","Data":"8fdca136bd4037da703bdabfdfbf3019a3f2db8359b38d8ded2c573f0abbfdaa"} Mar 10 08:15:31 crc kubenswrapper[4825]: I0310 08:15:31.249904 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02cf0c5c-1b04-4f5f-b7e1-b86669b0be36" path="/var/lib/kubelet/pods/02cf0c5c-1b04-4f5f-b7e1-b86669b0be36/volumes" Mar 10 08:15:31 crc kubenswrapper[4825]: I0310 08:15:31.268466 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.268441056 podStartE2EDuration="2.268441056s" podCreationTimestamp="2026-03-10 08:15:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:15:31.264554534 +0000 UTC m=+5484.294335169" 
watchObservedRunningTime="2026-03-10 08:15:31.268441056 +0000 UTC m=+5484.298221681" Mar 10 08:15:32 crc kubenswrapper[4825]: I0310 08:15:32.873157 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dzb8c" Mar 10 08:15:32 crc kubenswrapper[4825]: I0310 08:15:32.873237 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dzb8c" Mar 10 08:15:33 crc kubenswrapper[4825]: I0310 08:15:33.237523 4825 scope.go:117] "RemoveContainer" containerID="cc40bbe7e5127cd8ca03c26dd788c9bfc628fd248484a1f5f73946709b5b4414" Mar 10 08:15:33 crc kubenswrapper[4825]: E0310 08:15:33.238186 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:15:33 crc kubenswrapper[4825]: I0310 08:15:33.953268 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dzb8c" podUID="2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9" containerName="registry-server" probeResult="failure" output=< Mar 10 08:15:33 crc kubenswrapper[4825]: timeout: failed to connect service ":50051" within 1s Mar 10 08:15:33 crc kubenswrapper[4825]: > Mar 10 08:15:34 crc kubenswrapper[4825]: I0310 08:15:34.143301 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 10 08:15:34 crc kubenswrapper[4825]: I0310 08:15:34.229173 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 08:15:34 crc kubenswrapper[4825]: I0310 08:15:34.241011 4825 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/cinder-scheduler-0" podUID="79dfca78-daec-4e1b-93cb-505e50873031" containerName="cinder-scheduler" containerID="cri-o://91594b35941ba8d6fa70ed3653a7cf7be36cb337a408535fffe4a8d90860a75f" gracePeriod=30 Mar 10 08:15:34 crc kubenswrapper[4825]: I0310 08:15:34.241369 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="79dfca78-daec-4e1b-93cb-505e50873031" containerName="probe" containerID="cri-o://1e733cf69039acc31766e6f545c5948a7a271d01448ad95183df6fa8a4bc9ab4" gracePeriod=30 Mar 10 08:15:35 crc kubenswrapper[4825]: I0310 08:15:35.250305 4825 generic.go:334] "Generic (PLEG): container finished" podID="79dfca78-daec-4e1b-93cb-505e50873031" containerID="1e733cf69039acc31766e6f545c5948a7a271d01448ad95183df6fa8a4bc9ab4" exitCode=0 Mar 10 08:15:35 crc kubenswrapper[4825]: I0310 08:15:35.250351 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"79dfca78-daec-4e1b-93cb-505e50873031","Type":"ContainerDied","Data":"1e733cf69039acc31766e6f545c5948a7a271d01448ad95183df6fa8a4bc9ab4"} Mar 10 08:15:37 crc kubenswrapper[4825]: E0310 08:15:37.189309 4825 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79dfca78_daec_4e1b_93cb_505e50873031.slice/crio-conmon-91594b35941ba8d6fa70ed3653a7cf7be36cb337a408535fffe4a8d90860a75f.scope\": RecentStats: unable to find data in memory cache]" Mar 10 08:15:37 crc kubenswrapper[4825]: I0310 08:15:37.274844 4825 generic.go:334] "Generic (PLEG): container finished" podID="79dfca78-daec-4e1b-93cb-505e50873031" containerID="91594b35941ba8d6fa70ed3653a7cf7be36cb337a408535fffe4a8d90860a75f" exitCode=0 Mar 10 08:15:37 crc kubenswrapper[4825]: I0310 08:15:37.274888 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"79dfca78-daec-4e1b-93cb-505e50873031","Type":"ContainerDied","Data":"91594b35941ba8d6fa70ed3653a7cf7be36cb337a408535fffe4a8d90860a75f"} Mar 10 08:15:37 crc kubenswrapper[4825]: I0310 08:15:37.400481 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 08:15:37 crc kubenswrapper[4825]: I0310 08:15:37.504918 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvq99\" (UniqueName: \"kubernetes.io/projected/79dfca78-daec-4e1b-93cb-505e50873031-kube-api-access-bvq99\") pod \"79dfca78-daec-4e1b-93cb-505e50873031\" (UID: \"79dfca78-daec-4e1b-93cb-505e50873031\") " Mar 10 08:15:37 crc kubenswrapper[4825]: I0310 08:15:37.504968 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79dfca78-daec-4e1b-93cb-505e50873031-scripts\") pod \"79dfca78-daec-4e1b-93cb-505e50873031\" (UID: \"79dfca78-daec-4e1b-93cb-505e50873031\") " Mar 10 08:15:37 crc kubenswrapper[4825]: I0310 08:15:37.505025 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79dfca78-daec-4e1b-93cb-505e50873031-combined-ca-bundle\") pod \"79dfca78-daec-4e1b-93cb-505e50873031\" (UID: \"79dfca78-daec-4e1b-93cb-505e50873031\") " Mar 10 08:15:37 crc kubenswrapper[4825]: I0310 08:15:37.505101 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79dfca78-daec-4e1b-93cb-505e50873031-config-data-custom\") pod \"79dfca78-daec-4e1b-93cb-505e50873031\" (UID: \"79dfca78-daec-4e1b-93cb-505e50873031\") " Mar 10 08:15:37 crc kubenswrapper[4825]: I0310 08:15:37.505124 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/79dfca78-daec-4e1b-93cb-505e50873031-config-data\") pod \"79dfca78-daec-4e1b-93cb-505e50873031\" (UID: \"79dfca78-daec-4e1b-93cb-505e50873031\") " Mar 10 08:15:37 crc kubenswrapper[4825]: I0310 08:15:37.505162 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/79dfca78-daec-4e1b-93cb-505e50873031-etc-machine-id\") pod \"79dfca78-daec-4e1b-93cb-505e50873031\" (UID: \"79dfca78-daec-4e1b-93cb-505e50873031\") " Mar 10 08:15:37 crc kubenswrapper[4825]: I0310 08:15:37.505384 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79dfca78-daec-4e1b-93cb-505e50873031-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "79dfca78-daec-4e1b-93cb-505e50873031" (UID: "79dfca78-daec-4e1b-93cb-505e50873031"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 08:15:37 crc kubenswrapper[4825]: I0310 08:15:37.505985 4825 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/79dfca78-daec-4e1b-93cb-505e50873031-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 10 08:15:37 crc kubenswrapper[4825]: I0310 08:15:37.510249 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79dfca78-daec-4e1b-93cb-505e50873031-scripts" (OuterVolumeSpecName: "scripts") pod "79dfca78-daec-4e1b-93cb-505e50873031" (UID: "79dfca78-daec-4e1b-93cb-505e50873031"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:15:37 crc kubenswrapper[4825]: I0310 08:15:37.510688 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79dfca78-daec-4e1b-93cb-505e50873031-kube-api-access-bvq99" (OuterVolumeSpecName: "kube-api-access-bvq99") pod "79dfca78-daec-4e1b-93cb-505e50873031" (UID: "79dfca78-daec-4e1b-93cb-505e50873031"). InnerVolumeSpecName "kube-api-access-bvq99". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:15:37 crc kubenswrapper[4825]: I0310 08:15:37.520317 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79dfca78-daec-4e1b-93cb-505e50873031-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "79dfca78-daec-4e1b-93cb-505e50873031" (UID: "79dfca78-daec-4e1b-93cb-505e50873031"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:15:37 crc kubenswrapper[4825]: I0310 08:15:37.555287 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79dfca78-daec-4e1b-93cb-505e50873031-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "79dfca78-daec-4e1b-93cb-505e50873031" (UID: "79dfca78-daec-4e1b-93cb-505e50873031"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:15:37 crc kubenswrapper[4825]: I0310 08:15:37.590340 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79dfca78-daec-4e1b-93cb-505e50873031-config-data" (OuterVolumeSpecName: "config-data") pod "79dfca78-daec-4e1b-93cb-505e50873031" (UID: "79dfca78-daec-4e1b-93cb-505e50873031"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:15:37 crc kubenswrapper[4825]: I0310 08:15:37.608247 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79dfca78-daec-4e1b-93cb-505e50873031-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:15:37 crc kubenswrapper[4825]: I0310 08:15:37.608288 4825 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79dfca78-daec-4e1b-93cb-505e50873031-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 08:15:37 crc kubenswrapper[4825]: I0310 08:15:37.608299 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79dfca78-daec-4e1b-93cb-505e50873031-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 08:15:37 crc kubenswrapper[4825]: I0310 08:15:37.608313 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvq99\" (UniqueName: \"kubernetes.io/projected/79dfca78-daec-4e1b-93cb-505e50873031-kube-api-access-bvq99\") on node \"crc\" DevicePath \"\"" Mar 10 08:15:37 crc kubenswrapper[4825]: I0310 08:15:37.608325 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79dfca78-daec-4e1b-93cb-505e50873031-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 08:15:38 crc kubenswrapper[4825]: I0310 08:15:38.286351 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"79dfca78-daec-4e1b-93cb-505e50873031","Type":"ContainerDied","Data":"184491406c981901bde534458991cb3af8cbd64559c4cd7cd471000525c7c580"} Mar 10 08:15:38 crc kubenswrapper[4825]: I0310 08:15:38.286383 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 08:15:38 crc kubenswrapper[4825]: I0310 08:15:38.286441 4825 scope.go:117] "RemoveContainer" containerID="1e733cf69039acc31766e6f545c5948a7a271d01448ad95183df6fa8a4bc9ab4" Mar 10 08:15:38 crc kubenswrapper[4825]: I0310 08:15:38.317999 4825 scope.go:117] "RemoveContainer" containerID="91594b35941ba8d6fa70ed3653a7cf7be36cb337a408535fffe4a8d90860a75f" Mar 10 08:15:38 crc kubenswrapper[4825]: I0310 08:15:38.337872 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 08:15:38 crc kubenswrapper[4825]: I0310 08:15:38.384341 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 08:15:38 crc kubenswrapper[4825]: I0310 08:15:38.397275 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 08:15:38 crc kubenswrapper[4825]: E0310 08:15:38.397666 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79dfca78-daec-4e1b-93cb-505e50873031" containerName="cinder-scheduler" Mar 10 08:15:38 crc kubenswrapper[4825]: I0310 08:15:38.397685 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="79dfca78-daec-4e1b-93cb-505e50873031" containerName="cinder-scheduler" Mar 10 08:15:38 crc kubenswrapper[4825]: E0310 08:15:38.397714 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79dfca78-daec-4e1b-93cb-505e50873031" containerName="probe" Mar 10 08:15:38 crc kubenswrapper[4825]: I0310 08:15:38.397721 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="79dfca78-daec-4e1b-93cb-505e50873031" containerName="probe" Mar 10 08:15:38 crc kubenswrapper[4825]: I0310 08:15:38.397903 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="79dfca78-daec-4e1b-93cb-505e50873031" containerName="cinder-scheduler" Mar 10 08:15:38 crc kubenswrapper[4825]: I0310 08:15:38.397922 4825 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="79dfca78-daec-4e1b-93cb-505e50873031" containerName="probe" Mar 10 08:15:38 crc kubenswrapper[4825]: I0310 08:15:38.398848 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 08:15:38 crc kubenswrapper[4825]: I0310 08:15:38.401738 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 10 08:15:38 crc kubenswrapper[4825]: I0310 08:15:38.407662 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 08:15:38 crc kubenswrapper[4825]: I0310 08:15:38.527467 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3edaef1f-39d6-4e94-a143-486f77278314-config-data\") pod \"cinder-scheduler-0\" (UID: \"3edaef1f-39d6-4e94-a143-486f77278314\") " pod="openstack/cinder-scheduler-0" Mar 10 08:15:38 crc kubenswrapper[4825]: I0310 08:15:38.527548 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3edaef1f-39d6-4e94-a143-486f77278314-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3edaef1f-39d6-4e94-a143-486f77278314\") " pod="openstack/cinder-scheduler-0" Mar 10 08:15:38 crc kubenswrapper[4825]: I0310 08:15:38.527585 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3edaef1f-39d6-4e94-a143-486f77278314-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3edaef1f-39d6-4e94-a143-486f77278314\") " pod="openstack/cinder-scheduler-0" Mar 10 08:15:38 crc kubenswrapper[4825]: I0310 08:15:38.527638 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3edaef1f-39d6-4e94-a143-486f77278314-scripts\") pod 
\"cinder-scheduler-0\" (UID: \"3edaef1f-39d6-4e94-a143-486f77278314\") " pod="openstack/cinder-scheduler-0" Mar 10 08:15:38 crc kubenswrapper[4825]: I0310 08:15:38.527660 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrh54\" (UniqueName: \"kubernetes.io/projected/3edaef1f-39d6-4e94-a143-486f77278314-kube-api-access-nrh54\") pod \"cinder-scheduler-0\" (UID: \"3edaef1f-39d6-4e94-a143-486f77278314\") " pod="openstack/cinder-scheduler-0" Mar 10 08:15:38 crc kubenswrapper[4825]: I0310 08:15:38.527689 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3edaef1f-39d6-4e94-a143-486f77278314-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3edaef1f-39d6-4e94-a143-486f77278314\") " pod="openstack/cinder-scheduler-0" Mar 10 08:15:38 crc kubenswrapper[4825]: I0310 08:15:38.629214 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3edaef1f-39d6-4e94-a143-486f77278314-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3edaef1f-39d6-4e94-a143-486f77278314\") " pod="openstack/cinder-scheduler-0" Mar 10 08:15:38 crc kubenswrapper[4825]: I0310 08:15:38.629282 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3edaef1f-39d6-4e94-a143-486f77278314-scripts\") pod \"cinder-scheduler-0\" (UID: \"3edaef1f-39d6-4e94-a143-486f77278314\") " pod="openstack/cinder-scheduler-0" Mar 10 08:15:38 crc kubenswrapper[4825]: I0310 08:15:38.629301 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrh54\" (UniqueName: \"kubernetes.io/projected/3edaef1f-39d6-4e94-a143-486f77278314-kube-api-access-nrh54\") pod \"cinder-scheduler-0\" (UID: \"3edaef1f-39d6-4e94-a143-486f77278314\") " 
pod="openstack/cinder-scheduler-0" Mar 10 08:15:38 crc kubenswrapper[4825]: I0310 08:15:38.629340 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3edaef1f-39d6-4e94-a143-486f77278314-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3edaef1f-39d6-4e94-a143-486f77278314\") " pod="openstack/cinder-scheduler-0" Mar 10 08:15:38 crc kubenswrapper[4825]: I0310 08:15:38.629423 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3edaef1f-39d6-4e94-a143-486f77278314-config-data\") pod \"cinder-scheduler-0\" (UID: \"3edaef1f-39d6-4e94-a143-486f77278314\") " pod="openstack/cinder-scheduler-0" Mar 10 08:15:38 crc kubenswrapper[4825]: I0310 08:15:38.629470 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3edaef1f-39d6-4e94-a143-486f77278314-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3edaef1f-39d6-4e94-a143-486f77278314\") " pod="openstack/cinder-scheduler-0" Mar 10 08:15:38 crc kubenswrapper[4825]: I0310 08:15:38.630656 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3edaef1f-39d6-4e94-a143-486f77278314-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3edaef1f-39d6-4e94-a143-486f77278314\") " pod="openstack/cinder-scheduler-0" Mar 10 08:15:38 crc kubenswrapper[4825]: I0310 08:15:38.634000 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3edaef1f-39d6-4e94-a143-486f77278314-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3edaef1f-39d6-4e94-a143-486f77278314\") " pod="openstack/cinder-scheduler-0" Mar 10 08:15:38 crc kubenswrapper[4825]: I0310 08:15:38.635254 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3edaef1f-39d6-4e94-a143-486f77278314-scripts\") pod \"cinder-scheduler-0\" (UID: \"3edaef1f-39d6-4e94-a143-486f77278314\") " pod="openstack/cinder-scheduler-0" Mar 10 08:15:38 crc kubenswrapper[4825]: I0310 08:15:38.637656 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3edaef1f-39d6-4e94-a143-486f77278314-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3edaef1f-39d6-4e94-a143-486f77278314\") " pod="openstack/cinder-scheduler-0" Mar 10 08:15:38 crc kubenswrapper[4825]: I0310 08:15:38.638013 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3edaef1f-39d6-4e94-a143-486f77278314-config-data\") pod \"cinder-scheduler-0\" (UID: \"3edaef1f-39d6-4e94-a143-486f77278314\") " pod="openstack/cinder-scheduler-0" Mar 10 08:15:38 crc kubenswrapper[4825]: I0310 08:15:38.650699 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrh54\" (UniqueName: \"kubernetes.io/projected/3edaef1f-39d6-4e94-a143-486f77278314-kube-api-access-nrh54\") pod \"cinder-scheduler-0\" (UID: \"3edaef1f-39d6-4e94-a143-486f77278314\") " pod="openstack/cinder-scheduler-0" Mar 10 08:15:38 crc kubenswrapper[4825]: I0310 08:15:38.713934 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 08:15:39 crc kubenswrapper[4825]: I0310 08:15:39.178818 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 08:15:39 crc kubenswrapper[4825]: W0310 08:15:39.181207 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3edaef1f_39d6_4e94_a143_486f77278314.slice/crio-66013380f5b6546ae07d4a3ffc6fa39b794b1c4c556f5efb86c8a6d66f8dc627 WatchSource:0}: Error finding container 66013380f5b6546ae07d4a3ffc6fa39b794b1c4c556f5efb86c8a6d66f8dc627: Status 404 returned error can't find the container with id 66013380f5b6546ae07d4a3ffc6fa39b794b1c4c556f5efb86c8a6d66f8dc627 Mar 10 08:15:39 crc kubenswrapper[4825]: I0310 08:15:39.254948 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79dfca78-daec-4e1b-93cb-505e50873031" path="/var/lib/kubelet/pods/79dfca78-daec-4e1b-93cb-505e50873031/volumes" Mar 10 08:15:39 crc kubenswrapper[4825]: I0310 08:15:39.298060 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3edaef1f-39d6-4e94-a143-486f77278314","Type":"ContainerStarted","Data":"66013380f5b6546ae07d4a3ffc6fa39b794b1c4c556f5efb86c8a6d66f8dc627"} Mar 10 08:15:40 crc kubenswrapper[4825]: I0310 08:15:40.323392 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3edaef1f-39d6-4e94-a143-486f77278314","Type":"ContainerStarted","Data":"a400b0cd57fa511100b43268421f330986b8d5f58db04de123ab08ff962b2e9a"} Mar 10 08:15:41 crc kubenswrapper[4825]: I0310 08:15:41.336718 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3edaef1f-39d6-4e94-a143-486f77278314","Type":"ContainerStarted","Data":"022947942a6eebe8cd1f1ec126ebc07833f727ecb7d56fcbfbb34582a4e6a3dd"} Mar 10 08:15:41 crc kubenswrapper[4825]: I0310 08:15:41.357775 4825 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.357752865 podStartE2EDuration="3.357752865s" podCreationTimestamp="2026-03-10 08:15:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:15:41.352268991 +0000 UTC m=+5494.382049646" watchObservedRunningTime="2026-03-10 08:15:41.357752865 +0000 UTC m=+5494.387533510" Mar 10 08:15:41 crc kubenswrapper[4825]: I0310 08:15:41.568093 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 10 08:15:42 crc kubenswrapper[4825]: I0310 08:15:42.865987 4825 scope.go:117] "RemoveContainer" containerID="f3bcc1214ca1681a4785ba6f7a392fab4cefc0685994622d7ba0d42d12949c17" Mar 10 08:15:42 crc kubenswrapper[4825]: I0310 08:15:42.923411 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dzb8c" Mar 10 08:15:42 crc kubenswrapper[4825]: I0310 08:15:42.979909 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dzb8c" Mar 10 08:15:43 crc kubenswrapper[4825]: I0310 08:15:43.159030 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dzb8c"] Mar 10 08:15:43 crc kubenswrapper[4825]: I0310 08:15:43.714582 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 10 08:15:44 crc kubenswrapper[4825]: I0310 08:15:44.363801 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dzb8c" podUID="2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9" containerName="registry-server" containerID="cri-o://94a8ce5d6c3fcc2a8c2ae7b60f50866672ad044e167589386154b1b2557d4011" gracePeriod=2 Mar 10 08:15:44 crc kubenswrapper[4825]: I0310 08:15:44.754568 4825 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dzb8c" Mar 10 08:15:44 crc kubenswrapper[4825]: I0310 08:15:44.897028 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9-utilities\") pod \"2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9\" (UID: \"2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9\") " Mar 10 08:15:44 crc kubenswrapper[4825]: I0310 08:15:44.897226 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9-catalog-content\") pod \"2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9\" (UID: \"2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9\") " Mar 10 08:15:44 crc kubenswrapper[4825]: I0310 08:15:44.897268 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b222p\" (UniqueName: \"kubernetes.io/projected/2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9-kube-api-access-b222p\") pod \"2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9\" (UID: \"2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9\") " Mar 10 08:15:44 crc kubenswrapper[4825]: I0310 08:15:44.898096 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9-utilities" (OuterVolumeSpecName: "utilities") pod "2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9" (UID: "2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:15:44 crc kubenswrapper[4825]: I0310 08:15:44.903524 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9-kube-api-access-b222p" (OuterVolumeSpecName: "kube-api-access-b222p") pod "2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9" (UID: "2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9"). 
InnerVolumeSpecName "kube-api-access-b222p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:15:44 crc kubenswrapper[4825]: I0310 08:15:44.999049 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b222p\" (UniqueName: \"kubernetes.io/projected/2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9-kube-api-access-b222p\") on node \"crc\" DevicePath \"\"" Mar 10 08:15:44 crc kubenswrapper[4825]: I0310 08:15:44.999083 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 08:15:45 crc kubenswrapper[4825]: I0310 08:15:45.029442 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9" (UID: "2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:15:45 crc kubenswrapper[4825]: I0310 08:15:45.100924 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 08:15:45 crc kubenswrapper[4825]: I0310 08:15:45.376227 4825 generic.go:334] "Generic (PLEG): container finished" podID="2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9" containerID="94a8ce5d6c3fcc2a8c2ae7b60f50866672ad044e167589386154b1b2557d4011" exitCode=0 Mar 10 08:15:45 crc kubenswrapper[4825]: I0310 08:15:45.376309 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzb8c" event={"ID":"2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9","Type":"ContainerDied","Data":"94a8ce5d6c3fcc2a8c2ae7b60f50866672ad044e167589386154b1b2557d4011"} Mar 10 08:15:45 crc kubenswrapper[4825]: I0310 08:15:45.376362 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzb8c" event={"ID":"2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9","Type":"ContainerDied","Data":"70e73746230991a2eaca80fdd05017b09afdbabdc260885aab804910ff23a59d"} Mar 10 08:15:45 crc kubenswrapper[4825]: I0310 08:15:45.376401 4825 scope.go:117] "RemoveContainer" containerID="94a8ce5d6c3fcc2a8c2ae7b60f50866672ad044e167589386154b1b2557d4011" Mar 10 08:15:45 crc kubenswrapper[4825]: I0310 08:15:45.376663 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dzb8c" Mar 10 08:15:45 crc kubenswrapper[4825]: I0310 08:15:45.407294 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dzb8c"] Mar 10 08:15:45 crc kubenswrapper[4825]: I0310 08:15:45.409372 4825 scope.go:117] "RemoveContainer" containerID="e9aa548e1dc3fdf6c7b5eafdcf50956e9c87e40eacb16fd3b0cf1a6a13010b2a" Mar 10 08:15:45 crc kubenswrapper[4825]: I0310 08:15:45.415048 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dzb8c"] Mar 10 08:15:45 crc kubenswrapper[4825]: I0310 08:15:45.447181 4825 scope.go:117] "RemoveContainer" containerID="bc694e0f3164dea92fce9c751813baa56c873de3782ca3aba3ed3acaeb00484d" Mar 10 08:15:45 crc kubenswrapper[4825]: I0310 08:15:45.475273 4825 scope.go:117] "RemoveContainer" containerID="94a8ce5d6c3fcc2a8c2ae7b60f50866672ad044e167589386154b1b2557d4011" Mar 10 08:15:45 crc kubenswrapper[4825]: E0310 08:15:45.475648 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94a8ce5d6c3fcc2a8c2ae7b60f50866672ad044e167589386154b1b2557d4011\": container with ID starting with 94a8ce5d6c3fcc2a8c2ae7b60f50866672ad044e167589386154b1b2557d4011 not found: ID does not exist" containerID="94a8ce5d6c3fcc2a8c2ae7b60f50866672ad044e167589386154b1b2557d4011" Mar 10 08:15:45 crc kubenswrapper[4825]: I0310 08:15:45.475692 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94a8ce5d6c3fcc2a8c2ae7b60f50866672ad044e167589386154b1b2557d4011"} err="failed to get container status \"94a8ce5d6c3fcc2a8c2ae7b60f50866672ad044e167589386154b1b2557d4011\": rpc error: code = NotFound desc = could not find container \"94a8ce5d6c3fcc2a8c2ae7b60f50866672ad044e167589386154b1b2557d4011\": container with ID starting with 94a8ce5d6c3fcc2a8c2ae7b60f50866672ad044e167589386154b1b2557d4011 not found: ID does 
not exist" Mar 10 08:15:45 crc kubenswrapper[4825]: I0310 08:15:45.475718 4825 scope.go:117] "RemoveContainer" containerID="e9aa548e1dc3fdf6c7b5eafdcf50956e9c87e40eacb16fd3b0cf1a6a13010b2a" Mar 10 08:15:45 crc kubenswrapper[4825]: E0310 08:15:45.476080 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9aa548e1dc3fdf6c7b5eafdcf50956e9c87e40eacb16fd3b0cf1a6a13010b2a\": container with ID starting with e9aa548e1dc3fdf6c7b5eafdcf50956e9c87e40eacb16fd3b0cf1a6a13010b2a not found: ID does not exist" containerID="e9aa548e1dc3fdf6c7b5eafdcf50956e9c87e40eacb16fd3b0cf1a6a13010b2a" Mar 10 08:15:45 crc kubenswrapper[4825]: I0310 08:15:45.476181 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9aa548e1dc3fdf6c7b5eafdcf50956e9c87e40eacb16fd3b0cf1a6a13010b2a"} err="failed to get container status \"e9aa548e1dc3fdf6c7b5eafdcf50956e9c87e40eacb16fd3b0cf1a6a13010b2a\": rpc error: code = NotFound desc = could not find container \"e9aa548e1dc3fdf6c7b5eafdcf50956e9c87e40eacb16fd3b0cf1a6a13010b2a\": container with ID starting with e9aa548e1dc3fdf6c7b5eafdcf50956e9c87e40eacb16fd3b0cf1a6a13010b2a not found: ID does not exist" Mar 10 08:15:45 crc kubenswrapper[4825]: I0310 08:15:45.476234 4825 scope.go:117] "RemoveContainer" containerID="bc694e0f3164dea92fce9c751813baa56c873de3782ca3aba3ed3acaeb00484d" Mar 10 08:15:45 crc kubenswrapper[4825]: E0310 08:15:45.476593 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc694e0f3164dea92fce9c751813baa56c873de3782ca3aba3ed3acaeb00484d\": container with ID starting with bc694e0f3164dea92fce9c751813baa56c873de3782ca3aba3ed3acaeb00484d not found: ID does not exist" containerID="bc694e0f3164dea92fce9c751813baa56c873de3782ca3aba3ed3acaeb00484d" Mar 10 08:15:45 crc kubenswrapper[4825]: I0310 08:15:45.476622 4825 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc694e0f3164dea92fce9c751813baa56c873de3782ca3aba3ed3acaeb00484d"} err="failed to get container status \"bc694e0f3164dea92fce9c751813baa56c873de3782ca3aba3ed3acaeb00484d\": rpc error: code = NotFound desc = could not find container \"bc694e0f3164dea92fce9c751813baa56c873de3782ca3aba3ed3acaeb00484d\": container with ID starting with bc694e0f3164dea92fce9c751813baa56c873de3782ca3aba3ed3acaeb00484d not found: ID does not exist" Mar 10 08:15:47 crc kubenswrapper[4825]: I0310 08:15:47.246899 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9" path="/var/lib/kubelet/pods/2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9/volumes" Mar 10 08:15:48 crc kubenswrapper[4825]: I0310 08:15:48.236905 4825 scope.go:117] "RemoveContainer" containerID="cc40bbe7e5127cd8ca03c26dd788c9bfc628fd248484a1f5f73946709b5b4414" Mar 10 08:15:48 crc kubenswrapper[4825]: E0310 08:15:48.237173 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:15:48 crc kubenswrapper[4825]: I0310 08:15:48.917896 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 10 08:15:50 crc kubenswrapper[4825]: I0310 08:15:50.504353 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-nlxld"] Mar 10 08:15:50 crc kubenswrapper[4825]: E0310 08:15:50.504940 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9" containerName="registry-server" Mar 10 08:15:50 crc kubenswrapper[4825]: I0310 
08:15:50.504955 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9" containerName="registry-server" Mar 10 08:15:50 crc kubenswrapper[4825]: E0310 08:15:50.504965 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9" containerName="extract-utilities" Mar 10 08:15:50 crc kubenswrapper[4825]: I0310 08:15:50.504973 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9" containerName="extract-utilities" Mar 10 08:15:50 crc kubenswrapper[4825]: E0310 08:15:50.504995 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9" containerName="extract-content" Mar 10 08:15:50 crc kubenswrapper[4825]: I0310 08:15:50.505002 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9" containerName="extract-content" Mar 10 08:15:50 crc kubenswrapper[4825]: I0310 08:15:50.505207 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c0a4ca4-aa1e-488e-b8c6-3ff0909807e9" containerName="registry-server" Mar 10 08:15:50 crc kubenswrapper[4825]: I0310 08:15:50.505795 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-nlxld" Mar 10 08:15:50 crc kubenswrapper[4825]: I0310 08:15:50.520108 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-nlxld"] Mar 10 08:15:50 crc kubenswrapper[4825]: I0310 08:15:50.601858 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w95jv\" (UniqueName: \"kubernetes.io/projected/16d6fbed-6e1d-42c0-b023-2b2f0bef5889-kube-api-access-w95jv\") pod \"glance-db-create-nlxld\" (UID: \"16d6fbed-6e1d-42c0-b023-2b2f0bef5889\") " pod="openstack/glance-db-create-nlxld" Mar 10 08:15:50 crc kubenswrapper[4825]: I0310 08:15:50.602028 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16d6fbed-6e1d-42c0-b023-2b2f0bef5889-operator-scripts\") pod \"glance-db-create-nlxld\" (UID: \"16d6fbed-6e1d-42c0-b023-2b2f0bef5889\") " pod="openstack/glance-db-create-nlxld" Mar 10 08:15:50 crc kubenswrapper[4825]: I0310 08:15:50.607862 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-4078-account-create-update-tbhx6"] Mar 10 08:15:50 crc kubenswrapper[4825]: I0310 08:15:50.608871 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-4078-account-create-update-tbhx6" Mar 10 08:15:50 crc kubenswrapper[4825]: I0310 08:15:50.610654 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 10 08:15:50 crc kubenswrapper[4825]: I0310 08:15:50.621578 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-4078-account-create-update-tbhx6"] Mar 10 08:15:50 crc kubenswrapper[4825]: I0310 08:15:50.703611 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w95jv\" (UniqueName: \"kubernetes.io/projected/16d6fbed-6e1d-42c0-b023-2b2f0bef5889-kube-api-access-w95jv\") pod \"glance-db-create-nlxld\" (UID: \"16d6fbed-6e1d-42c0-b023-2b2f0bef5889\") " pod="openstack/glance-db-create-nlxld" Mar 10 08:15:50 crc kubenswrapper[4825]: I0310 08:15:50.703705 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hw9w\" (UniqueName: \"kubernetes.io/projected/9667b429-21a2-47d3-8874-5cc660a9e1d6-kube-api-access-4hw9w\") pod \"glance-4078-account-create-update-tbhx6\" (UID: \"9667b429-21a2-47d3-8874-5cc660a9e1d6\") " pod="openstack/glance-4078-account-create-update-tbhx6" Mar 10 08:15:50 crc kubenswrapper[4825]: I0310 08:15:50.703772 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16d6fbed-6e1d-42c0-b023-2b2f0bef5889-operator-scripts\") pod \"glance-db-create-nlxld\" (UID: \"16d6fbed-6e1d-42c0-b023-2b2f0bef5889\") " pod="openstack/glance-db-create-nlxld" Mar 10 08:15:50 crc kubenswrapper[4825]: I0310 08:15:50.703846 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9667b429-21a2-47d3-8874-5cc660a9e1d6-operator-scripts\") pod \"glance-4078-account-create-update-tbhx6\" (UID: 
\"9667b429-21a2-47d3-8874-5cc660a9e1d6\") " pod="openstack/glance-4078-account-create-update-tbhx6" Mar 10 08:15:50 crc kubenswrapper[4825]: I0310 08:15:50.704832 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16d6fbed-6e1d-42c0-b023-2b2f0bef5889-operator-scripts\") pod \"glance-db-create-nlxld\" (UID: \"16d6fbed-6e1d-42c0-b023-2b2f0bef5889\") " pod="openstack/glance-db-create-nlxld" Mar 10 08:15:50 crc kubenswrapper[4825]: I0310 08:15:50.724629 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w95jv\" (UniqueName: \"kubernetes.io/projected/16d6fbed-6e1d-42c0-b023-2b2f0bef5889-kube-api-access-w95jv\") pod \"glance-db-create-nlxld\" (UID: \"16d6fbed-6e1d-42c0-b023-2b2f0bef5889\") " pod="openstack/glance-db-create-nlxld" Mar 10 08:15:50 crc kubenswrapper[4825]: I0310 08:15:50.805342 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9667b429-21a2-47d3-8874-5cc660a9e1d6-operator-scripts\") pod \"glance-4078-account-create-update-tbhx6\" (UID: \"9667b429-21a2-47d3-8874-5cc660a9e1d6\") " pod="openstack/glance-4078-account-create-update-tbhx6" Mar 10 08:15:50 crc kubenswrapper[4825]: I0310 08:15:50.805716 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hw9w\" (UniqueName: \"kubernetes.io/projected/9667b429-21a2-47d3-8874-5cc660a9e1d6-kube-api-access-4hw9w\") pod \"glance-4078-account-create-update-tbhx6\" (UID: \"9667b429-21a2-47d3-8874-5cc660a9e1d6\") " pod="openstack/glance-4078-account-create-update-tbhx6" Mar 10 08:15:50 crc kubenswrapper[4825]: I0310 08:15:50.806155 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9667b429-21a2-47d3-8874-5cc660a9e1d6-operator-scripts\") pod \"glance-4078-account-create-update-tbhx6\" 
(UID: \"9667b429-21a2-47d3-8874-5cc660a9e1d6\") " pod="openstack/glance-4078-account-create-update-tbhx6" Mar 10 08:15:50 crc kubenswrapper[4825]: I0310 08:15:50.824980 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hw9w\" (UniqueName: \"kubernetes.io/projected/9667b429-21a2-47d3-8874-5cc660a9e1d6-kube-api-access-4hw9w\") pod \"glance-4078-account-create-update-tbhx6\" (UID: \"9667b429-21a2-47d3-8874-5cc660a9e1d6\") " pod="openstack/glance-4078-account-create-update-tbhx6" Mar 10 08:15:50 crc kubenswrapper[4825]: I0310 08:15:50.827916 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-nlxld" Mar 10 08:15:50 crc kubenswrapper[4825]: I0310 08:15:50.932185 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-4078-account-create-update-tbhx6" Mar 10 08:15:51 crc kubenswrapper[4825]: I0310 08:15:51.278591 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-nlxld"] Mar 10 08:15:51 crc kubenswrapper[4825]: W0310 08:15:51.279810 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16d6fbed_6e1d_42c0_b023_2b2f0bef5889.slice/crio-0565ff6d03a4dfb9d17e05f702b296992279a08e41ae0028387bcf6bed0d5807 WatchSource:0}: Error finding container 0565ff6d03a4dfb9d17e05f702b296992279a08e41ae0028387bcf6bed0d5807: Status 404 returned error can't find the container with id 0565ff6d03a4dfb9d17e05f702b296992279a08e41ae0028387bcf6bed0d5807 Mar 10 08:15:51 crc kubenswrapper[4825]: I0310 08:15:51.436937 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-nlxld" event={"ID":"16d6fbed-6e1d-42c0-b023-2b2f0bef5889","Type":"ContainerStarted","Data":"b10ac4feb1521b1355959eafff1beaf457f3ea9f88c00c84f17ec1a7addda1d9"} Mar 10 08:15:51 crc kubenswrapper[4825]: I0310 08:15:51.436980 4825 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/glance-db-create-nlxld" event={"ID":"16d6fbed-6e1d-42c0-b023-2b2f0bef5889","Type":"ContainerStarted","Data":"0565ff6d03a4dfb9d17e05f702b296992279a08e41ae0028387bcf6bed0d5807"} Mar 10 08:15:51 crc kubenswrapper[4825]: W0310 08:15:51.443171 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9667b429_21a2_47d3_8874_5cc660a9e1d6.slice/crio-e9091b0b8ad7837ac478a923eaac99de5b6364b75ea11f8802878969e0ac88bb WatchSource:0}: Error finding container e9091b0b8ad7837ac478a923eaac99de5b6364b75ea11f8802878969e0ac88bb: Status 404 returned error can't find the container with id e9091b0b8ad7837ac478a923eaac99de5b6364b75ea11f8802878969e0ac88bb Mar 10 08:15:51 crc kubenswrapper[4825]: I0310 08:15:51.450878 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-4078-account-create-update-tbhx6"] Mar 10 08:15:51 crc kubenswrapper[4825]: I0310 08:15:51.453682 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-nlxld" podStartSLOduration=1.453660347 podStartE2EDuration="1.453660347s" podCreationTimestamp="2026-03-10 08:15:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:15:51.449261422 +0000 UTC m=+5504.479042037" watchObservedRunningTime="2026-03-10 08:15:51.453660347 +0000 UTC m=+5504.483440962" Mar 10 08:15:52 crc kubenswrapper[4825]: I0310 08:15:52.461961 4825 generic.go:334] "Generic (PLEG): container finished" podID="16d6fbed-6e1d-42c0-b023-2b2f0bef5889" containerID="b10ac4feb1521b1355959eafff1beaf457f3ea9f88c00c84f17ec1a7addda1d9" exitCode=0 Mar 10 08:15:52 crc kubenswrapper[4825]: I0310 08:15:52.462435 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-nlxld" 
event={"ID":"16d6fbed-6e1d-42c0-b023-2b2f0bef5889","Type":"ContainerDied","Data":"b10ac4feb1521b1355959eafff1beaf457f3ea9f88c00c84f17ec1a7addda1d9"} Mar 10 08:15:52 crc kubenswrapper[4825]: I0310 08:15:52.465120 4825 generic.go:334] "Generic (PLEG): container finished" podID="9667b429-21a2-47d3-8874-5cc660a9e1d6" containerID="8b6d793b02a03d2aa4759b0c90664b47e6964de5a55678386bfc749d3b063e35" exitCode=0 Mar 10 08:15:52 crc kubenswrapper[4825]: I0310 08:15:52.465173 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4078-account-create-update-tbhx6" event={"ID":"9667b429-21a2-47d3-8874-5cc660a9e1d6","Type":"ContainerDied","Data":"8b6d793b02a03d2aa4759b0c90664b47e6964de5a55678386bfc749d3b063e35"} Mar 10 08:15:52 crc kubenswrapper[4825]: I0310 08:15:52.465194 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4078-account-create-update-tbhx6" event={"ID":"9667b429-21a2-47d3-8874-5cc660a9e1d6","Type":"ContainerStarted","Data":"e9091b0b8ad7837ac478a923eaac99de5b6364b75ea11f8802878969e0ac88bb"} Mar 10 08:15:53 crc kubenswrapper[4825]: I0310 08:15:53.866830 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-4078-account-create-update-tbhx6" Mar 10 08:15:53 crc kubenswrapper[4825]: I0310 08:15:53.874677 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-nlxld" Mar 10 08:15:53 crc kubenswrapper[4825]: I0310 08:15:53.987959 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16d6fbed-6e1d-42c0-b023-2b2f0bef5889-operator-scripts\") pod \"16d6fbed-6e1d-42c0-b023-2b2f0bef5889\" (UID: \"16d6fbed-6e1d-42c0-b023-2b2f0bef5889\") " Mar 10 08:15:53 crc kubenswrapper[4825]: I0310 08:15:53.988013 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9667b429-21a2-47d3-8874-5cc660a9e1d6-operator-scripts\") pod \"9667b429-21a2-47d3-8874-5cc660a9e1d6\" (UID: \"9667b429-21a2-47d3-8874-5cc660a9e1d6\") " Mar 10 08:15:53 crc kubenswrapper[4825]: I0310 08:15:53.988068 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w95jv\" (UniqueName: \"kubernetes.io/projected/16d6fbed-6e1d-42c0-b023-2b2f0bef5889-kube-api-access-w95jv\") pod \"16d6fbed-6e1d-42c0-b023-2b2f0bef5889\" (UID: \"16d6fbed-6e1d-42c0-b023-2b2f0bef5889\") " Mar 10 08:15:53 crc kubenswrapper[4825]: I0310 08:15:53.988253 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hw9w\" (UniqueName: \"kubernetes.io/projected/9667b429-21a2-47d3-8874-5cc660a9e1d6-kube-api-access-4hw9w\") pod \"9667b429-21a2-47d3-8874-5cc660a9e1d6\" (UID: \"9667b429-21a2-47d3-8874-5cc660a9e1d6\") " Mar 10 08:15:53 crc kubenswrapper[4825]: I0310 08:15:53.989595 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16d6fbed-6e1d-42c0-b023-2b2f0bef5889-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "16d6fbed-6e1d-42c0-b023-2b2f0bef5889" (UID: "16d6fbed-6e1d-42c0-b023-2b2f0bef5889"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:15:53 crc kubenswrapper[4825]: I0310 08:15:53.990890 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16d6fbed-6e1d-42c0-b023-2b2f0bef5889-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 08:15:53 crc kubenswrapper[4825]: I0310 08:15:53.992111 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9667b429-21a2-47d3-8874-5cc660a9e1d6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9667b429-21a2-47d3-8874-5cc660a9e1d6" (UID: "9667b429-21a2-47d3-8874-5cc660a9e1d6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:15:53 crc kubenswrapper[4825]: I0310 08:15:53.998508 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16d6fbed-6e1d-42c0-b023-2b2f0bef5889-kube-api-access-w95jv" (OuterVolumeSpecName: "kube-api-access-w95jv") pod "16d6fbed-6e1d-42c0-b023-2b2f0bef5889" (UID: "16d6fbed-6e1d-42c0-b023-2b2f0bef5889"). InnerVolumeSpecName "kube-api-access-w95jv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:15:54 crc kubenswrapper[4825]: I0310 08:15:54.004423 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9667b429-21a2-47d3-8874-5cc660a9e1d6-kube-api-access-4hw9w" (OuterVolumeSpecName: "kube-api-access-4hw9w") pod "9667b429-21a2-47d3-8874-5cc660a9e1d6" (UID: "9667b429-21a2-47d3-8874-5cc660a9e1d6"). InnerVolumeSpecName "kube-api-access-4hw9w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:15:54 crc kubenswrapper[4825]: I0310 08:15:54.092605 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9667b429-21a2-47d3-8874-5cc660a9e1d6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 08:15:54 crc kubenswrapper[4825]: I0310 08:15:54.092637 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w95jv\" (UniqueName: \"kubernetes.io/projected/16d6fbed-6e1d-42c0-b023-2b2f0bef5889-kube-api-access-w95jv\") on node \"crc\" DevicePath \"\"" Mar 10 08:15:54 crc kubenswrapper[4825]: I0310 08:15:54.092653 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hw9w\" (UniqueName: \"kubernetes.io/projected/9667b429-21a2-47d3-8874-5cc660a9e1d6-kube-api-access-4hw9w\") on node \"crc\" DevicePath \"\"" Mar 10 08:15:54 crc kubenswrapper[4825]: I0310 08:15:54.486892 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-nlxld" Mar 10 08:15:54 crc kubenswrapper[4825]: I0310 08:15:54.486941 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-nlxld" event={"ID":"16d6fbed-6e1d-42c0-b023-2b2f0bef5889","Type":"ContainerDied","Data":"0565ff6d03a4dfb9d17e05f702b296992279a08e41ae0028387bcf6bed0d5807"} Mar 10 08:15:54 crc kubenswrapper[4825]: I0310 08:15:54.486972 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0565ff6d03a4dfb9d17e05f702b296992279a08e41ae0028387bcf6bed0d5807" Mar 10 08:15:54 crc kubenswrapper[4825]: I0310 08:15:54.489241 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4078-account-create-update-tbhx6" event={"ID":"9667b429-21a2-47d3-8874-5cc660a9e1d6","Type":"ContainerDied","Data":"e9091b0b8ad7837ac478a923eaac99de5b6364b75ea11f8802878969e0ac88bb"} Mar 10 08:15:54 crc kubenswrapper[4825]: I0310 08:15:54.489263 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9091b0b8ad7837ac478a923eaac99de5b6364b75ea11f8802878969e0ac88bb" Mar 10 08:15:54 crc kubenswrapper[4825]: I0310 08:15:54.489360 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-4078-account-create-update-tbhx6" Mar 10 08:15:55 crc kubenswrapper[4825]: I0310 08:15:55.784698 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-zhwf4"] Mar 10 08:15:55 crc kubenswrapper[4825]: E0310 08:15:55.785482 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9667b429-21a2-47d3-8874-5cc660a9e1d6" containerName="mariadb-account-create-update" Mar 10 08:15:55 crc kubenswrapper[4825]: I0310 08:15:55.785501 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="9667b429-21a2-47d3-8874-5cc660a9e1d6" containerName="mariadb-account-create-update" Mar 10 08:15:55 crc kubenswrapper[4825]: E0310 08:15:55.785520 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16d6fbed-6e1d-42c0-b023-2b2f0bef5889" containerName="mariadb-database-create" Mar 10 08:15:55 crc kubenswrapper[4825]: I0310 08:15:55.785527 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="16d6fbed-6e1d-42c0-b023-2b2f0bef5889" containerName="mariadb-database-create" Mar 10 08:15:55 crc kubenswrapper[4825]: I0310 08:15:55.785752 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="9667b429-21a2-47d3-8874-5cc660a9e1d6" containerName="mariadb-account-create-update" Mar 10 08:15:55 crc kubenswrapper[4825]: I0310 08:15:55.785788 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="16d6fbed-6e1d-42c0-b023-2b2f0bef5889" containerName="mariadb-database-create" Mar 10 08:15:55 crc kubenswrapper[4825]: I0310 08:15:55.787413 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-zhwf4" Mar 10 08:15:55 crc kubenswrapper[4825]: I0310 08:15:55.789909 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 10 08:15:55 crc kubenswrapper[4825]: I0310 08:15:55.789909 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-vs878" Mar 10 08:15:55 crc kubenswrapper[4825]: I0310 08:15:55.793391 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-zhwf4"] Mar 10 08:15:55 crc kubenswrapper[4825]: I0310 08:15:55.823407 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed01571-e0e3-438d-ab0e-3e0a85675f29-combined-ca-bundle\") pod \"glance-db-sync-zhwf4\" (UID: \"7ed01571-e0e3-438d-ab0e-3e0a85675f29\") " pod="openstack/glance-db-sync-zhwf4" Mar 10 08:15:55 crc kubenswrapper[4825]: I0310 08:15:55.823453 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64pt6\" (UniqueName: \"kubernetes.io/projected/7ed01571-e0e3-438d-ab0e-3e0a85675f29-kube-api-access-64pt6\") pod \"glance-db-sync-zhwf4\" (UID: \"7ed01571-e0e3-438d-ab0e-3e0a85675f29\") " pod="openstack/glance-db-sync-zhwf4" Mar 10 08:15:55 crc kubenswrapper[4825]: I0310 08:15:55.823518 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ed01571-e0e3-438d-ab0e-3e0a85675f29-config-data\") pod \"glance-db-sync-zhwf4\" (UID: \"7ed01571-e0e3-438d-ab0e-3e0a85675f29\") " pod="openstack/glance-db-sync-zhwf4" Mar 10 08:15:55 crc kubenswrapper[4825]: I0310 08:15:55.823538 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/7ed01571-e0e3-438d-ab0e-3e0a85675f29-db-sync-config-data\") pod \"glance-db-sync-zhwf4\" (UID: \"7ed01571-e0e3-438d-ab0e-3e0a85675f29\") " pod="openstack/glance-db-sync-zhwf4" Mar 10 08:15:55 crc kubenswrapper[4825]: I0310 08:15:55.925050 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed01571-e0e3-438d-ab0e-3e0a85675f29-combined-ca-bundle\") pod \"glance-db-sync-zhwf4\" (UID: \"7ed01571-e0e3-438d-ab0e-3e0a85675f29\") " pod="openstack/glance-db-sync-zhwf4" Mar 10 08:15:55 crc kubenswrapper[4825]: I0310 08:15:55.925107 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64pt6\" (UniqueName: \"kubernetes.io/projected/7ed01571-e0e3-438d-ab0e-3e0a85675f29-kube-api-access-64pt6\") pod \"glance-db-sync-zhwf4\" (UID: \"7ed01571-e0e3-438d-ab0e-3e0a85675f29\") " pod="openstack/glance-db-sync-zhwf4" Mar 10 08:15:55 crc kubenswrapper[4825]: I0310 08:15:55.925238 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ed01571-e0e3-438d-ab0e-3e0a85675f29-config-data\") pod \"glance-db-sync-zhwf4\" (UID: \"7ed01571-e0e3-438d-ab0e-3e0a85675f29\") " pod="openstack/glance-db-sync-zhwf4" Mar 10 08:15:55 crc kubenswrapper[4825]: I0310 08:15:55.925269 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7ed01571-e0e3-438d-ab0e-3e0a85675f29-db-sync-config-data\") pod \"glance-db-sync-zhwf4\" (UID: \"7ed01571-e0e3-438d-ab0e-3e0a85675f29\") " pod="openstack/glance-db-sync-zhwf4" Mar 10 08:15:55 crc kubenswrapper[4825]: I0310 08:15:55.929995 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ed01571-e0e3-438d-ab0e-3e0a85675f29-config-data\") pod \"glance-db-sync-zhwf4\" (UID: 
\"7ed01571-e0e3-438d-ab0e-3e0a85675f29\") " pod="openstack/glance-db-sync-zhwf4" Mar 10 08:15:55 crc kubenswrapper[4825]: I0310 08:15:55.930283 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed01571-e0e3-438d-ab0e-3e0a85675f29-combined-ca-bundle\") pod \"glance-db-sync-zhwf4\" (UID: \"7ed01571-e0e3-438d-ab0e-3e0a85675f29\") " pod="openstack/glance-db-sync-zhwf4" Mar 10 08:15:55 crc kubenswrapper[4825]: I0310 08:15:55.930700 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7ed01571-e0e3-438d-ab0e-3e0a85675f29-db-sync-config-data\") pod \"glance-db-sync-zhwf4\" (UID: \"7ed01571-e0e3-438d-ab0e-3e0a85675f29\") " pod="openstack/glance-db-sync-zhwf4" Mar 10 08:15:55 crc kubenswrapper[4825]: I0310 08:15:55.939855 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64pt6\" (UniqueName: \"kubernetes.io/projected/7ed01571-e0e3-438d-ab0e-3e0a85675f29-kube-api-access-64pt6\") pod \"glance-db-sync-zhwf4\" (UID: \"7ed01571-e0e3-438d-ab0e-3e0a85675f29\") " pod="openstack/glance-db-sync-zhwf4" Mar 10 08:15:56 crc kubenswrapper[4825]: I0310 08:15:56.112183 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-zhwf4" Mar 10 08:15:56 crc kubenswrapper[4825]: I0310 08:15:56.789458 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-zhwf4"] Mar 10 08:15:57 crc kubenswrapper[4825]: I0310 08:15:57.514174 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zhwf4" event={"ID":"7ed01571-e0e3-438d-ab0e-3e0a85675f29","Type":"ContainerStarted","Data":"a9c5e91251a6e089460adf9f0c2d8d10509d596a16b83b86c7c30fd7177e37dc"} Mar 10 08:16:00 crc kubenswrapper[4825]: I0310 08:16:00.156228 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552176-gkhkl"] Mar 10 08:16:00 crc kubenswrapper[4825]: I0310 08:16:00.159285 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552176-gkhkl" Mar 10 08:16:00 crc kubenswrapper[4825]: I0310 08:16:00.162336 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 08:16:00 crc kubenswrapper[4825]: I0310 08:16:00.162623 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 08:16:00 crc kubenswrapper[4825]: I0310 08:16:00.163244 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 08:16:00 crc kubenswrapper[4825]: I0310 08:16:00.165218 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552176-gkhkl"] Mar 10 08:16:00 crc kubenswrapper[4825]: I0310 08:16:00.201121 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z97cc\" (UniqueName: \"kubernetes.io/projected/2b6e77c2-7c47-47cb-b6ab-d44cbba3ade6-kube-api-access-z97cc\") pod \"auto-csr-approver-29552176-gkhkl\" (UID: \"2b6e77c2-7c47-47cb-b6ab-d44cbba3ade6\") " 
pod="openshift-infra/auto-csr-approver-29552176-gkhkl" Mar 10 08:16:00 crc kubenswrapper[4825]: I0310 08:16:00.302371 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z97cc\" (UniqueName: \"kubernetes.io/projected/2b6e77c2-7c47-47cb-b6ab-d44cbba3ade6-kube-api-access-z97cc\") pod \"auto-csr-approver-29552176-gkhkl\" (UID: \"2b6e77c2-7c47-47cb-b6ab-d44cbba3ade6\") " pod="openshift-infra/auto-csr-approver-29552176-gkhkl" Mar 10 08:16:00 crc kubenswrapper[4825]: I0310 08:16:00.320863 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z97cc\" (UniqueName: \"kubernetes.io/projected/2b6e77c2-7c47-47cb-b6ab-d44cbba3ade6-kube-api-access-z97cc\") pod \"auto-csr-approver-29552176-gkhkl\" (UID: \"2b6e77c2-7c47-47cb-b6ab-d44cbba3ade6\") " pod="openshift-infra/auto-csr-approver-29552176-gkhkl" Mar 10 08:16:00 crc kubenswrapper[4825]: I0310 08:16:00.489899 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552176-gkhkl" Mar 10 08:16:00 crc kubenswrapper[4825]: I0310 08:16:00.962830 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552176-gkhkl"] Mar 10 08:16:01 crc kubenswrapper[4825]: I0310 08:16:01.552899 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552176-gkhkl" event={"ID":"2b6e77c2-7c47-47cb-b6ab-d44cbba3ade6","Type":"ContainerStarted","Data":"47eda1d66ff8689873e5d03872ba5b79d7d3e5874661badab5211c0a5204aa36"} Mar 10 08:16:02 crc kubenswrapper[4825]: I0310 08:16:02.237669 4825 scope.go:117] "RemoveContainer" containerID="cc40bbe7e5127cd8ca03c26dd788c9bfc628fd248484a1f5f73946709b5b4414" Mar 10 08:16:02 crc kubenswrapper[4825]: E0310 08:16:02.238357 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:16:03 crc kubenswrapper[4825]: I0310 08:16:03.568897 4825 generic.go:334] "Generic (PLEG): container finished" podID="2b6e77c2-7c47-47cb-b6ab-d44cbba3ade6" containerID="57c2b272221ddc23f212408754974013bc3a6e357f2917374ae738af261e9f88" exitCode=0 Mar 10 08:16:03 crc kubenswrapper[4825]: I0310 08:16:03.568938 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552176-gkhkl" event={"ID":"2b6e77c2-7c47-47cb-b6ab-d44cbba3ade6","Type":"ContainerDied","Data":"57c2b272221ddc23f212408754974013bc3a6e357f2917374ae738af261e9f88"} Mar 10 08:16:04 crc kubenswrapper[4825]: I0310 08:16:04.952683 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552176-gkhkl" Mar 10 08:16:04 crc kubenswrapper[4825]: I0310 08:16:04.989311 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z97cc\" (UniqueName: \"kubernetes.io/projected/2b6e77c2-7c47-47cb-b6ab-d44cbba3ade6-kube-api-access-z97cc\") pod \"2b6e77c2-7c47-47cb-b6ab-d44cbba3ade6\" (UID: \"2b6e77c2-7c47-47cb-b6ab-d44cbba3ade6\") " Mar 10 08:16:04 crc kubenswrapper[4825]: I0310 08:16:04.996431 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b6e77c2-7c47-47cb-b6ab-d44cbba3ade6-kube-api-access-z97cc" (OuterVolumeSpecName: "kube-api-access-z97cc") pod "2b6e77c2-7c47-47cb-b6ab-d44cbba3ade6" (UID: "2b6e77c2-7c47-47cb-b6ab-d44cbba3ade6"). InnerVolumeSpecName "kube-api-access-z97cc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:16:05 crc kubenswrapper[4825]: I0310 08:16:05.091725 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z97cc\" (UniqueName: \"kubernetes.io/projected/2b6e77c2-7c47-47cb-b6ab-d44cbba3ade6-kube-api-access-z97cc\") on node \"crc\" DevicePath \"\"" Mar 10 08:16:05 crc kubenswrapper[4825]: I0310 08:16:05.587217 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552176-gkhkl" event={"ID":"2b6e77c2-7c47-47cb-b6ab-d44cbba3ade6","Type":"ContainerDied","Data":"47eda1d66ff8689873e5d03872ba5b79d7d3e5874661badab5211c0a5204aa36"} Mar 10 08:16:05 crc kubenswrapper[4825]: I0310 08:16:05.587503 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47eda1d66ff8689873e5d03872ba5b79d7d3e5874661badab5211c0a5204aa36" Mar 10 08:16:05 crc kubenswrapper[4825]: I0310 08:16:05.587268 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552176-gkhkl" Mar 10 08:16:06 crc kubenswrapper[4825]: I0310 08:16:06.019856 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552170-d7mn2"] Mar 10 08:16:06 crc kubenswrapper[4825]: I0310 08:16:06.033641 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552170-d7mn2"] Mar 10 08:16:07 crc kubenswrapper[4825]: I0310 08:16:07.247604 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb5138fc-0a3c-4581-8319-0154b854ccce" path="/var/lib/kubelet/pods/fb5138fc-0a3c-4581-8319-0154b854ccce/volumes" Mar 10 08:16:14 crc kubenswrapper[4825]: I0310 08:16:14.668749 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zhwf4" event={"ID":"7ed01571-e0e3-438d-ab0e-3e0a85675f29","Type":"ContainerStarted","Data":"89a0d60e3882cad80ee3fc44a51053fcea4ea3f6161e8c265fd82229de0d004c"} Mar 10 08:16:14 
crc kubenswrapper[4825]: I0310 08:16:14.690954 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-zhwf4" podStartSLOduration=2.960571722 podStartE2EDuration="19.690932293s" podCreationTimestamp="2026-03-10 08:15:55 +0000 UTC" firstStartedPulling="2026-03-10 08:15:56.786021879 +0000 UTC m=+5509.815802494" lastFinishedPulling="2026-03-10 08:16:13.51638245 +0000 UTC m=+5526.546163065" observedRunningTime="2026-03-10 08:16:14.684407111 +0000 UTC m=+5527.714187726" watchObservedRunningTime="2026-03-10 08:16:14.690932293 +0000 UTC m=+5527.720712908" Mar 10 08:16:15 crc kubenswrapper[4825]: I0310 08:16:15.236864 4825 scope.go:117] "RemoveContainer" containerID="cc40bbe7e5127cd8ca03c26dd788c9bfc628fd248484a1f5f73946709b5b4414" Mar 10 08:16:15 crc kubenswrapper[4825]: E0310 08:16:15.237162 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:16:17 crc kubenswrapper[4825]: I0310 08:16:17.707680 4825 generic.go:334] "Generic (PLEG): container finished" podID="7ed01571-e0e3-438d-ab0e-3e0a85675f29" containerID="89a0d60e3882cad80ee3fc44a51053fcea4ea3f6161e8c265fd82229de0d004c" exitCode=0 Mar 10 08:16:17 crc kubenswrapper[4825]: I0310 08:16:17.707857 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zhwf4" event={"ID":"7ed01571-e0e3-438d-ab0e-3e0a85675f29","Type":"ContainerDied","Data":"89a0d60e3882cad80ee3fc44a51053fcea4ea3f6161e8c265fd82229de0d004c"} Mar 10 08:16:19 crc kubenswrapper[4825]: I0310 08:16:19.066899 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-zhwf4"
Mar 10 08:16:19 crc kubenswrapper[4825]: I0310 08:16:19.170438 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64pt6\" (UniqueName: \"kubernetes.io/projected/7ed01571-e0e3-438d-ab0e-3e0a85675f29-kube-api-access-64pt6\") pod \"7ed01571-e0e3-438d-ab0e-3e0a85675f29\" (UID: \"7ed01571-e0e3-438d-ab0e-3e0a85675f29\") "
Mar 10 08:16:19 crc kubenswrapper[4825]: I0310 08:16:19.170523 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed01571-e0e3-438d-ab0e-3e0a85675f29-combined-ca-bundle\") pod \"7ed01571-e0e3-438d-ab0e-3e0a85675f29\" (UID: \"7ed01571-e0e3-438d-ab0e-3e0a85675f29\") "
Mar 10 08:16:19 crc kubenswrapper[4825]: I0310 08:16:19.170586 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ed01571-e0e3-438d-ab0e-3e0a85675f29-config-data\") pod \"7ed01571-e0e3-438d-ab0e-3e0a85675f29\" (UID: \"7ed01571-e0e3-438d-ab0e-3e0a85675f29\") "
Mar 10 08:16:19 crc kubenswrapper[4825]: I0310 08:16:19.170730 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7ed01571-e0e3-438d-ab0e-3e0a85675f29-db-sync-config-data\") pod \"7ed01571-e0e3-438d-ab0e-3e0a85675f29\" (UID: \"7ed01571-e0e3-438d-ab0e-3e0a85675f29\") "
Mar 10 08:16:19 crc kubenswrapper[4825]: I0310 08:16:19.177650 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ed01571-e0e3-438d-ab0e-3e0a85675f29-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "7ed01571-e0e3-438d-ab0e-3e0a85675f29" (UID: "7ed01571-e0e3-438d-ab0e-3e0a85675f29"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 08:16:19 crc kubenswrapper[4825]: I0310 08:16:19.180007 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ed01571-e0e3-438d-ab0e-3e0a85675f29-kube-api-access-64pt6" (OuterVolumeSpecName: "kube-api-access-64pt6") pod "7ed01571-e0e3-438d-ab0e-3e0a85675f29" (UID: "7ed01571-e0e3-438d-ab0e-3e0a85675f29"). InnerVolumeSpecName "kube-api-access-64pt6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 08:16:19 crc kubenswrapper[4825]: I0310 08:16:19.208080 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ed01571-e0e3-438d-ab0e-3e0a85675f29-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ed01571-e0e3-438d-ab0e-3e0a85675f29" (UID: "7ed01571-e0e3-438d-ab0e-3e0a85675f29"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 08:16:19 crc kubenswrapper[4825]: I0310 08:16:19.266780 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ed01571-e0e3-438d-ab0e-3e0a85675f29-config-data" (OuterVolumeSpecName: "config-data") pod "7ed01571-e0e3-438d-ab0e-3e0a85675f29" (UID: "7ed01571-e0e3-438d-ab0e-3e0a85675f29"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 08:16:19 crc kubenswrapper[4825]: I0310 08:16:19.273064 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64pt6\" (UniqueName: \"kubernetes.io/projected/7ed01571-e0e3-438d-ab0e-3e0a85675f29-kube-api-access-64pt6\") on node \"crc\" DevicePath \"\""
Mar 10 08:16:19 crc kubenswrapper[4825]: I0310 08:16:19.273109 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed01571-e0e3-438d-ab0e-3e0a85675f29-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 08:16:19 crc kubenswrapper[4825]: I0310 08:16:19.273141 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ed01571-e0e3-438d-ab0e-3e0a85675f29-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 08:16:19 crc kubenswrapper[4825]: I0310 08:16:19.273155 4825 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7ed01571-e0e3-438d-ab0e-3e0a85675f29-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 08:16:19 crc kubenswrapper[4825]: I0310 08:16:19.732479 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zhwf4" event={"ID":"7ed01571-e0e3-438d-ab0e-3e0a85675f29","Type":"ContainerDied","Data":"a9c5e91251a6e089460adf9f0c2d8d10509d596a16b83b86c7c30fd7177e37dc"}
Mar 10 08:16:19 crc kubenswrapper[4825]: I0310 08:16:19.732565 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9c5e91251a6e089460adf9f0c2d8d10509d596a16b83b86c7c30fd7177e37dc"
Mar 10 08:16:19 crc kubenswrapper[4825]: I0310 08:16:19.732604 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-zhwf4"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.007266 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 10 08:16:20 crc kubenswrapper[4825]: E0310 08:16:20.007980 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ed01571-e0e3-438d-ab0e-3e0a85675f29" containerName="glance-db-sync"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.007999 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ed01571-e0e3-438d-ab0e-3e0a85675f29" containerName="glance-db-sync"
Mar 10 08:16:20 crc kubenswrapper[4825]: E0310 08:16:20.008012 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b6e77c2-7c47-47cb-b6ab-d44cbba3ade6" containerName="oc"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.008019 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b6e77c2-7c47-47cb-b6ab-d44cbba3ade6" containerName="oc"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.008211 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b6e77c2-7c47-47cb-b6ab-d44cbba3ade6" containerName="oc"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.008226 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ed01571-e0e3-438d-ab0e-3e0a85675f29" containerName="glance-db-sync"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.009176 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.013122 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.017179 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-vs878"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.017433 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.041817 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.164988 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d95445669-fw682"]
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.166445 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d95445669-fw682"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.188510 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d95445669-fw682"]
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.189244 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afe716cf-a746-4f7b-b5f1-beb9282140a5-config-data\") pod \"glance-default-external-api-0\" (UID: \"afe716cf-a746-4f7b-b5f1-beb9282140a5\") " pod="openstack/glance-default-external-api-0"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.189485 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afe716cf-a746-4f7b-b5f1-beb9282140a5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"afe716cf-a746-4f7b-b5f1-beb9282140a5\") " pod="openstack/glance-default-external-api-0"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.189548 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7696\" (UniqueName: \"kubernetes.io/projected/afe716cf-a746-4f7b-b5f1-beb9282140a5-kube-api-access-z7696\") pod \"glance-default-external-api-0\" (UID: \"afe716cf-a746-4f7b-b5f1-beb9282140a5\") " pod="openstack/glance-default-external-api-0"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.189642 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afe716cf-a746-4f7b-b5f1-beb9282140a5-scripts\") pod \"glance-default-external-api-0\" (UID: \"afe716cf-a746-4f7b-b5f1-beb9282140a5\") " pod="openstack/glance-default-external-api-0"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.189665 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/afe716cf-a746-4f7b-b5f1-beb9282140a5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"afe716cf-a746-4f7b-b5f1-beb9282140a5\") " pod="openstack/glance-default-external-api-0"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.189709 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afe716cf-a746-4f7b-b5f1-beb9282140a5-logs\") pod \"glance-default-external-api-0\" (UID: \"afe716cf-a746-4f7b-b5f1-beb9282140a5\") " pod="openstack/glance-default-external-api-0"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.278212 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.282712 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.291737 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d04c43e-2981-4964-9545-3c8b6e6fc729-config\") pod \"dnsmasq-dns-d95445669-fw682\" (UID: \"5d04c43e-2981-4964-9545-3c8b6e6fc729\") " pod="openstack/dnsmasq-dns-d95445669-fw682"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.291779 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d04c43e-2981-4964-9545-3c8b6e6fc729-dns-svc\") pod \"dnsmasq-dns-d95445669-fw682\" (UID: \"5d04c43e-2981-4964-9545-3c8b6e6fc729\") " pod="openstack/dnsmasq-dns-d95445669-fw682"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.291781 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.291812 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afe716cf-a746-4f7b-b5f1-beb9282140a5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"afe716cf-a746-4f7b-b5f1-beb9282140a5\") " pod="openstack/glance-default-external-api-0"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.291838 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7696\" (UniqueName: \"kubernetes.io/projected/afe716cf-a746-4f7b-b5f1-beb9282140a5-kube-api-access-z7696\") pod \"glance-default-external-api-0\" (UID: \"afe716cf-a746-4f7b-b5f1-beb9282140a5\") " pod="openstack/glance-default-external-api-0"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.291876 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5d04c43e-2981-4964-9545-3c8b6e6fc729-ovsdbserver-nb\") pod \"dnsmasq-dns-d95445669-fw682\" (UID: \"5d04c43e-2981-4964-9545-3c8b6e6fc729\") " pod="openstack/dnsmasq-dns-d95445669-fw682"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.291903 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afe716cf-a746-4f7b-b5f1-beb9282140a5-scripts\") pod \"glance-default-external-api-0\" (UID: \"afe716cf-a746-4f7b-b5f1-beb9282140a5\") " pod="openstack/glance-default-external-api-0"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.291925 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/afe716cf-a746-4f7b-b5f1-beb9282140a5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"afe716cf-a746-4f7b-b5f1-beb9282140a5\") " pod="openstack/glance-default-external-api-0"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.291953 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afe716cf-a746-4f7b-b5f1-beb9282140a5-logs\") pod \"glance-default-external-api-0\" (UID: \"afe716cf-a746-4f7b-b5f1-beb9282140a5\") " pod="openstack/glance-default-external-api-0"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.291978 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnw9b\" (UniqueName: \"kubernetes.io/projected/5d04c43e-2981-4964-9545-3c8b6e6fc729-kube-api-access-bnw9b\") pod \"dnsmasq-dns-d95445669-fw682\" (UID: \"5d04c43e-2981-4964-9545-3c8b6e6fc729\") " pod="openstack/dnsmasq-dns-d95445669-fw682"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.292052 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5d04c43e-2981-4964-9545-3c8b6e6fc729-ovsdbserver-sb\") pod \"dnsmasq-dns-d95445669-fw682\" (UID: \"5d04c43e-2981-4964-9545-3c8b6e6fc729\") " pod="openstack/dnsmasq-dns-d95445669-fw682"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.292112 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afe716cf-a746-4f7b-b5f1-beb9282140a5-config-data\") pod \"glance-default-external-api-0\" (UID: \"afe716cf-a746-4f7b-b5f1-beb9282140a5\") " pod="openstack/glance-default-external-api-0"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.298761 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afe716cf-a746-4f7b-b5f1-beb9282140a5-logs\") pod \"glance-default-external-api-0\" (UID: \"afe716cf-a746-4f7b-b5f1-beb9282140a5\") " pod="openstack/glance-default-external-api-0"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.301633 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/afe716cf-a746-4f7b-b5f1-beb9282140a5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"afe716cf-a746-4f7b-b5f1-beb9282140a5\") " pod="openstack/glance-default-external-api-0"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.305767 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afe716cf-a746-4f7b-b5f1-beb9282140a5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"afe716cf-a746-4f7b-b5f1-beb9282140a5\") " pod="openstack/glance-default-external-api-0"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.309412 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afe716cf-a746-4f7b-b5f1-beb9282140a5-scripts\") pod \"glance-default-external-api-0\" (UID: \"afe716cf-a746-4f7b-b5f1-beb9282140a5\") " pod="openstack/glance-default-external-api-0"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.321198 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afe716cf-a746-4f7b-b5f1-beb9282140a5-config-data\") pod \"glance-default-external-api-0\" (UID: \"afe716cf-a746-4f7b-b5f1-beb9282140a5\") " pod="openstack/glance-default-external-api-0"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.321294 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.347831 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7696\" (UniqueName: \"kubernetes.io/projected/afe716cf-a746-4f7b-b5f1-beb9282140a5-kube-api-access-z7696\") pod \"glance-default-external-api-0\" (UID: \"afe716cf-a746-4f7b-b5f1-beb9282140a5\") " pod="openstack/glance-default-external-api-0"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.395248 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b9ddf47-a404-4d81-8826-106a75c1c3af-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5b9ddf47-a404-4d81-8826-106a75c1c3af\") " pod="openstack/glance-default-internal-api-0"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.396805 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d04c43e-2981-4964-9545-3c8b6e6fc729-config\") pod \"dnsmasq-dns-d95445669-fw682\" (UID: \"5d04c43e-2981-4964-9545-3c8b6e6fc729\") " pod="openstack/dnsmasq-dns-d95445669-fw682"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.398099 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d04c43e-2981-4964-9545-3c8b6e6fc729-config\") pod \"dnsmasq-dns-d95445669-fw682\" (UID: \"5d04c43e-2981-4964-9545-3c8b6e6fc729\") " pod="openstack/dnsmasq-dns-d95445669-fw682"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.398194 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d04c43e-2981-4964-9545-3c8b6e6fc729-dns-svc\") pod \"dnsmasq-dns-d95445669-fw682\" (UID: \"5d04c43e-2981-4964-9545-3c8b6e6fc729\") " pod="openstack/dnsmasq-dns-d95445669-fw682"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.398234 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b9ddf47-a404-4d81-8826-106a75c1c3af-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5b9ddf47-a404-4d81-8826-106a75c1c3af\") " pod="openstack/glance-default-internal-api-0"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.398370 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5d04c43e-2981-4964-9545-3c8b6e6fc729-ovsdbserver-nb\") pod \"dnsmasq-dns-d95445669-fw682\" (UID: \"5d04c43e-2981-4964-9545-3c8b6e6fc729\") " pod="openstack/dnsmasq-dns-d95445669-fw682"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.398468 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b9ddf47-a404-4d81-8826-106a75c1c3af-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5b9ddf47-a404-4d81-8826-106a75c1c3af\") " pod="openstack/glance-default-internal-api-0"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.398516 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b9ddf47-a404-4d81-8826-106a75c1c3af-logs\") pod \"glance-default-internal-api-0\" (UID: \"5b9ddf47-a404-4d81-8826-106a75c1c3af\") " pod="openstack/glance-default-internal-api-0"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.398543 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnw9b\" (UniqueName: \"kubernetes.io/projected/5d04c43e-2981-4964-9545-3c8b6e6fc729-kube-api-access-bnw9b\") pod \"dnsmasq-dns-d95445669-fw682\" (UID: \"5d04c43e-2981-4964-9545-3c8b6e6fc729\") " pod="openstack/dnsmasq-dns-d95445669-fw682"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.398599 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5d04c43e-2981-4964-9545-3c8b6e6fc729-ovsdbserver-sb\") pod \"dnsmasq-dns-d95445669-fw682\" (UID: \"5d04c43e-2981-4964-9545-3c8b6e6fc729\") " pod="openstack/dnsmasq-dns-d95445669-fw682"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.399153 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q982c\" (UniqueName: \"kubernetes.io/projected/5b9ddf47-a404-4d81-8826-106a75c1c3af-kube-api-access-q982c\") pod \"glance-default-internal-api-0\" (UID: \"5b9ddf47-a404-4d81-8826-106a75c1c3af\") " pod="openstack/glance-default-internal-api-0"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.399584 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5b9ddf47-a404-4d81-8826-106a75c1c3af-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5b9ddf47-a404-4d81-8826-106a75c1c3af\") " pod="openstack/glance-default-internal-api-0"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.400470 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d04c43e-2981-4964-9545-3c8b6e6fc729-dns-svc\") pod \"dnsmasq-dns-d95445669-fw682\" (UID: \"5d04c43e-2981-4964-9545-3c8b6e6fc729\") " pod="openstack/dnsmasq-dns-d95445669-fw682"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.400980 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5d04c43e-2981-4964-9545-3c8b6e6fc729-ovsdbserver-nb\") pod \"dnsmasq-dns-d95445669-fw682\" (UID: \"5d04c43e-2981-4964-9545-3c8b6e6fc729\") " pod="openstack/dnsmasq-dns-d95445669-fw682"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.401101 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5d04c43e-2981-4964-9545-3c8b6e6fc729-ovsdbserver-sb\") pod \"dnsmasq-dns-d95445669-fw682\" (UID: \"5d04c43e-2981-4964-9545-3c8b6e6fc729\") " pod="openstack/dnsmasq-dns-d95445669-fw682"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.419915 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnw9b\" (UniqueName: \"kubernetes.io/projected/5d04c43e-2981-4964-9545-3c8b6e6fc729-kube-api-access-bnw9b\") pod \"dnsmasq-dns-d95445669-fw682\" (UID: \"5d04c43e-2981-4964-9545-3c8b6e6fc729\") " pod="openstack/dnsmasq-dns-d95445669-fw682"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.500634 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d95445669-fw682"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.514488 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b9ddf47-a404-4d81-8826-106a75c1c3af-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5b9ddf47-a404-4d81-8826-106a75c1c3af\") " pod="openstack/glance-default-internal-api-0"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.514690 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b9ddf47-a404-4d81-8826-106a75c1c3af-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5b9ddf47-a404-4d81-8826-106a75c1c3af\") " pod="openstack/glance-default-internal-api-0"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.514727 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b9ddf47-a404-4d81-8826-106a75c1c3af-logs\") pod \"glance-default-internal-api-0\" (UID: \"5b9ddf47-a404-4d81-8826-106a75c1c3af\") " pod="openstack/glance-default-internal-api-0"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.514926 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q982c\" (UniqueName: \"kubernetes.io/projected/5b9ddf47-a404-4d81-8826-106a75c1c3af-kube-api-access-q982c\") pod \"glance-default-internal-api-0\" (UID: \"5b9ddf47-a404-4d81-8826-106a75c1c3af\") " pod="openstack/glance-default-internal-api-0"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.514982 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5b9ddf47-a404-4d81-8826-106a75c1c3af-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5b9ddf47-a404-4d81-8826-106a75c1c3af\") " pod="openstack/glance-default-internal-api-0"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.515060 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b9ddf47-a404-4d81-8826-106a75c1c3af-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5b9ddf47-a404-4d81-8826-106a75c1c3af\") " pod="openstack/glance-default-internal-api-0"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.515443 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b9ddf47-a404-4d81-8826-106a75c1c3af-logs\") pod \"glance-default-internal-api-0\" (UID: \"5b9ddf47-a404-4d81-8826-106a75c1c3af\") " pod="openstack/glance-default-internal-api-0"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.515552 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5b9ddf47-a404-4d81-8826-106a75c1c3af-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5b9ddf47-a404-4d81-8826-106a75c1c3af\") " pod="openstack/glance-default-internal-api-0"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.521545 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b9ddf47-a404-4d81-8826-106a75c1c3af-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5b9ddf47-a404-4d81-8826-106a75c1c3af\") " pod="openstack/glance-default-internal-api-0"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.521847 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b9ddf47-a404-4d81-8826-106a75c1c3af-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5b9ddf47-a404-4d81-8826-106a75c1c3af\") " pod="openstack/glance-default-internal-api-0"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.521868 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b9ddf47-a404-4d81-8826-106a75c1c3af-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5b9ddf47-a404-4d81-8826-106a75c1c3af\") " pod="openstack/glance-default-internal-api-0"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.531477 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q982c\" (UniqueName: \"kubernetes.io/projected/5b9ddf47-a404-4d81-8826-106a75c1c3af-kube-api-access-q982c\") pod \"glance-default-internal-api-0\" (UID: \"5b9ddf47-a404-4d81-8826-106a75c1c3af\") " pod="openstack/glance-default-internal-api-0"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.642100 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.683209 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 10 08:16:20 crc kubenswrapper[4825]: I0310 08:16:20.959255 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d95445669-fw682"]
Mar 10 08:16:22 crc kubenswrapper[4825]: I0310 08:16:21.126519 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 10 08:16:22 crc kubenswrapper[4825]: W0310 08:16:21.134256 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b9ddf47_a404_4d81_8826_106a75c1c3af.slice/crio-6a0558a070b9c0569d9871542fe6f132743b72df4ce33bbdd0f77754c4c66ca9 WatchSource:0}: Error finding container 6a0558a070b9c0569d9871542fe6f132743b72df4ce33bbdd0f77754c4c66ca9: Status 404 returned error can't find the container with id 6a0558a070b9c0569d9871542fe6f132743b72df4ce33bbdd0f77754c4c66ca9
Mar 10 08:16:22 crc kubenswrapper[4825]: I0310 08:16:21.223103 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 10 08:16:22 crc kubenswrapper[4825]: W0310 08:16:21.230986 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafe716cf_a746_4f7b_b5f1_beb9282140a5.slice/crio-aa6ad75f69d7aa4c93e11aa87ab756f397a71820a168b14110ea9ee33445a13b WatchSource:0}: Error finding container aa6ad75f69d7aa4c93e11aa87ab756f397a71820a168b14110ea9ee33445a13b: Status 404 returned error can't find the container with id aa6ad75f69d7aa4c93e11aa87ab756f397a71820a168b14110ea9ee33445a13b
Mar 10 08:16:22 crc kubenswrapper[4825]: I0310 08:16:21.437065 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 10 08:16:22 crc kubenswrapper[4825]: I0310 08:16:21.770485 4825 generic.go:334] "Generic (PLEG): container finished" podID="5d04c43e-2981-4964-9545-3c8b6e6fc729" containerID="d1333c08c6dfe951c98f1b355b7accf634ac2610ca31ec9e74c16efe2999b96b" exitCode=0
Mar 10 08:16:22 crc kubenswrapper[4825]: I0310 08:16:21.770586 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d95445669-fw682" event={"ID":"5d04c43e-2981-4964-9545-3c8b6e6fc729","Type":"ContainerDied","Data":"d1333c08c6dfe951c98f1b355b7accf634ac2610ca31ec9e74c16efe2999b96b"}
Mar 10 08:16:22 crc kubenswrapper[4825]: I0310 08:16:21.770621 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d95445669-fw682" event={"ID":"5d04c43e-2981-4964-9545-3c8b6e6fc729","Type":"ContainerStarted","Data":"07904c34be26099a86b01693aeb56bb01ae0f28efa0de06157321006dade7211"}
Mar 10 08:16:22 crc kubenswrapper[4825]: I0310 08:16:21.780020 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"afe716cf-a746-4f7b-b5f1-beb9282140a5","Type":"ContainerStarted","Data":"aa6ad75f69d7aa4c93e11aa87ab756f397a71820a168b14110ea9ee33445a13b"}
Mar 10 08:16:22 crc kubenswrapper[4825]: I0310 08:16:21.781905 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5b9ddf47-a404-4d81-8826-106a75c1c3af","Type":"ContainerStarted","Data":"6a0558a070b9c0569d9871542fe6f132743b72df4ce33bbdd0f77754c4c66ca9"}
Mar 10 08:16:22 crc kubenswrapper[4825]: I0310 08:16:22.535771 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 10 08:16:22 crc kubenswrapper[4825]: I0310 08:16:22.804963 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d95445669-fw682" event={"ID":"5d04c43e-2981-4964-9545-3c8b6e6fc729","Type":"ContainerStarted","Data":"036c536c50bcd54816ef85eb5048b562a0550d4afc523cd440acef00376c965c"}
Mar 10 08:16:22 crc kubenswrapper[4825]: I0310 08:16:22.806255 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d95445669-fw682"
Mar 10 08:16:22 crc kubenswrapper[4825]: I0310 08:16:22.808893 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"afe716cf-a746-4f7b-b5f1-beb9282140a5","Type":"ContainerStarted","Data":"ccb72c82096c7e245188dcca8e97f71e8fbf23660808cf882175aba30900ecf8"}
Mar 10 08:16:22 crc kubenswrapper[4825]: I0310 08:16:22.811302 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5b9ddf47-a404-4d81-8826-106a75c1c3af","Type":"ContainerStarted","Data":"20a826d40adbdb889c57e20f1fc7430344d58b430428d2d3b4d8daa031ae748b"}
Mar 10 08:16:22 crc kubenswrapper[4825]: I0310 08:16:22.833072 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d95445669-fw682" podStartSLOduration=2.833049693 podStartE2EDuration="2.833049693s" podCreationTimestamp="2026-03-10 08:16:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:16:22.826264584 +0000 UTC m=+5535.856045219" watchObservedRunningTime="2026-03-10 08:16:22.833049693 +0000 UTC m=+5535.862830308"
Mar 10 08:16:23 crc kubenswrapper[4825]: I0310 08:16:23.820238 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"afe716cf-a746-4f7b-b5f1-beb9282140a5","Type":"ContainerStarted","Data":"8b2d2d4f6b0f114db42858f530067eb7ac7c3527c975b8512749ef6848c3c65b"}
Mar 10 08:16:23 crc kubenswrapper[4825]: I0310 08:16:23.820348 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="afe716cf-a746-4f7b-b5f1-beb9282140a5" containerName="glance-log" containerID="cri-o://ccb72c82096c7e245188dcca8e97f71e8fbf23660808cf882175aba30900ecf8" gracePeriod=30
Mar 10 08:16:23 crc kubenswrapper[4825]: I0310 08:16:23.820406 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="afe716cf-a746-4f7b-b5f1-beb9282140a5" containerName="glance-httpd" containerID="cri-o://8b2d2d4f6b0f114db42858f530067eb7ac7c3527c975b8512749ef6848c3c65b" gracePeriod=30
Mar 10 08:16:23 crc kubenswrapper[4825]: I0310 08:16:23.823650 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5b9ddf47-a404-4d81-8826-106a75c1c3af","Type":"ContainerStarted","Data":"1458877aa109199100822f71b9a41ba40cf27bcaa699868f10964609a50027f1"}
Mar 10 08:16:23 crc kubenswrapper[4825]: I0310 08:16:23.823876 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="5b9ddf47-a404-4d81-8826-106a75c1c3af" containerName="glance-log" containerID="cri-o://20a826d40adbdb889c57e20f1fc7430344d58b430428d2d3b4d8daa031ae748b" gracePeriod=30
Mar 10 08:16:23 crc kubenswrapper[4825]: I0310 08:16:23.823936 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="5b9ddf47-a404-4d81-8826-106a75c1c3af" containerName="glance-httpd" containerID="cri-o://1458877aa109199100822f71b9a41ba40cf27bcaa699868f10964609a50027f1" gracePeriod=30
Mar 10 08:16:23 crc kubenswrapper[4825]: I0310 08:16:23.843437 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.843420484 podStartE2EDuration="4.843420484s" podCreationTimestamp="2026-03-10 08:16:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:16:23.839337006 +0000 UTC m=+5536.869117631" watchObservedRunningTime="2026-03-10 08:16:23.843420484 +0000 UTC m=+5536.873201099"
Mar 10 08:16:23 crc kubenswrapper[4825]: I0310 08:16:23.861873 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.861847478 podStartE2EDuration="3.861847478s" podCreationTimestamp="2026-03-10 08:16:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:16:23.860106922 +0000 UTC m=+5536.889887547" watchObservedRunningTime="2026-03-10 08:16:23.861847478 +0000 UTC m=+5536.891628113"
Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.657785 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.701836 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/afe716cf-a746-4f7b-b5f1-beb9282140a5-httpd-run\") pod \"afe716cf-a746-4f7b-b5f1-beb9282140a5\" (UID: \"afe716cf-a746-4f7b-b5f1-beb9282140a5\") "
Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.701928 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7696\" (UniqueName: \"kubernetes.io/projected/afe716cf-a746-4f7b-b5f1-beb9282140a5-kube-api-access-z7696\") pod \"afe716cf-a746-4f7b-b5f1-beb9282140a5\" (UID: \"afe716cf-a746-4f7b-b5f1-beb9282140a5\") "
Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.701955 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afe716cf-a746-4f7b-b5f1-beb9282140a5-logs\") pod \"afe716cf-a746-4f7b-b5f1-beb9282140a5\" (UID: \"afe716cf-a746-4f7b-b5f1-beb9282140a5\") "
Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.701987 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afe716cf-a746-4f7b-b5f1-beb9282140a5-config-data\") pod \"afe716cf-a746-4f7b-b5f1-beb9282140a5\" (UID: \"afe716cf-a746-4f7b-b5f1-beb9282140a5\") "
Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.702034 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afe716cf-a746-4f7b-b5f1-beb9282140a5-combined-ca-bundle\") pod \"afe716cf-a746-4f7b-b5f1-beb9282140a5\" (UID: \"afe716cf-a746-4f7b-b5f1-beb9282140a5\") "
Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.702154 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afe716cf-a746-4f7b-b5f1-beb9282140a5-scripts\") pod \"afe716cf-a746-4f7b-b5f1-beb9282140a5\" (UID: \"afe716cf-a746-4f7b-b5f1-beb9282140a5\") "
Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.703756 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afe716cf-a746-4f7b-b5f1-beb9282140a5-logs" (OuterVolumeSpecName: "logs") pod "afe716cf-a746-4f7b-b5f1-beb9282140a5" (UID: "afe716cf-a746-4f7b-b5f1-beb9282140a5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.703860 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afe716cf-a746-4f7b-b5f1-beb9282140a5-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "afe716cf-a746-4f7b-b5f1-beb9282140a5" (UID: "afe716cf-a746-4f7b-b5f1-beb9282140a5"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.708162 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afe716cf-a746-4f7b-b5f1-beb9282140a5-kube-api-access-z7696" (OuterVolumeSpecName: "kube-api-access-z7696") pod "afe716cf-a746-4f7b-b5f1-beb9282140a5" (UID: "afe716cf-a746-4f7b-b5f1-beb9282140a5").
InnerVolumeSpecName "kube-api-access-z7696". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.708471 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afe716cf-a746-4f7b-b5f1-beb9282140a5-scripts" (OuterVolumeSpecName: "scripts") pod "afe716cf-a746-4f7b-b5f1-beb9282140a5" (UID: "afe716cf-a746-4f7b-b5f1-beb9282140a5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.740783 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afe716cf-a746-4f7b-b5f1-beb9282140a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "afe716cf-a746-4f7b-b5f1-beb9282140a5" (UID: "afe716cf-a746-4f7b-b5f1-beb9282140a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.755324 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afe716cf-a746-4f7b-b5f1-beb9282140a5-config-data" (OuterVolumeSpecName: "config-data") pod "afe716cf-a746-4f7b-b5f1-beb9282140a5" (UID: "afe716cf-a746-4f7b-b5f1-beb9282140a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.799206 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.803553 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7696\" (UniqueName: \"kubernetes.io/projected/afe716cf-a746-4f7b-b5f1-beb9282140a5-kube-api-access-z7696\") on node \"crc\" DevicePath \"\"" Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.803647 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afe716cf-a746-4f7b-b5f1-beb9282140a5-logs\") on node \"crc\" DevicePath \"\"" Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.803667 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afe716cf-a746-4f7b-b5f1-beb9282140a5-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.803689 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afe716cf-a746-4f7b-b5f1-beb9282140a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.803711 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afe716cf-a746-4f7b-b5f1-beb9282140a5-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.803760 4825 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/afe716cf-a746-4f7b-b5f1-beb9282140a5-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.837533 4825 generic.go:334] "Generic (PLEG): container finished" podID="afe716cf-a746-4f7b-b5f1-beb9282140a5" containerID="8b2d2d4f6b0f114db42858f530067eb7ac7c3527c975b8512749ef6848c3c65b" exitCode=0 Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.837568 4825 generic.go:334] "Generic (PLEG): 
container finished" podID="afe716cf-a746-4f7b-b5f1-beb9282140a5" containerID="ccb72c82096c7e245188dcca8e97f71e8fbf23660808cf882175aba30900ecf8" exitCode=143 Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.837600 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.837629 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"afe716cf-a746-4f7b-b5f1-beb9282140a5","Type":"ContainerDied","Data":"8b2d2d4f6b0f114db42858f530067eb7ac7c3527c975b8512749ef6848c3c65b"} Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.837697 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"afe716cf-a746-4f7b-b5f1-beb9282140a5","Type":"ContainerDied","Data":"ccb72c82096c7e245188dcca8e97f71e8fbf23660808cf882175aba30900ecf8"} Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.837708 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"afe716cf-a746-4f7b-b5f1-beb9282140a5","Type":"ContainerDied","Data":"aa6ad75f69d7aa4c93e11aa87ab756f397a71820a168b14110ea9ee33445a13b"} Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.837724 4825 scope.go:117] "RemoveContainer" containerID="8b2d2d4f6b0f114db42858f530067eb7ac7c3527c975b8512749ef6848c3c65b" Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.840434 4825 generic.go:334] "Generic (PLEG): container finished" podID="5b9ddf47-a404-4d81-8826-106a75c1c3af" containerID="1458877aa109199100822f71b9a41ba40cf27bcaa699868f10964609a50027f1" exitCode=0 Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.840453 4825 generic.go:334] "Generic (PLEG): container finished" podID="5b9ddf47-a404-4d81-8826-106a75c1c3af" containerID="20a826d40adbdb889c57e20f1fc7430344d58b430428d2d3b4d8daa031ae748b" exitCode=143 Mar 10 08:16:24 crc 
kubenswrapper[4825]: I0310 08:16:24.840687 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.841209 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5b9ddf47-a404-4d81-8826-106a75c1c3af","Type":"ContainerDied","Data":"1458877aa109199100822f71b9a41ba40cf27bcaa699868f10964609a50027f1"} Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.841249 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5b9ddf47-a404-4d81-8826-106a75c1c3af","Type":"ContainerDied","Data":"20a826d40adbdb889c57e20f1fc7430344d58b430428d2d3b4d8daa031ae748b"} Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.841264 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5b9ddf47-a404-4d81-8826-106a75c1c3af","Type":"ContainerDied","Data":"6a0558a070b9c0569d9871542fe6f132743b72df4ce33bbdd0f77754c4c66ca9"} Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.867844 4825 scope.go:117] "RemoveContainer" containerID="ccb72c82096c7e245188dcca8e97f71e8fbf23660808cf882175aba30900ecf8" Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.878948 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.897699 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.898971 4825 scope.go:117] "RemoveContainer" containerID="8b2d2d4f6b0f114db42858f530067eb7ac7c3527c975b8512749ef6848c3c65b" Mar 10 08:16:24 crc kubenswrapper[4825]: E0310 08:16:24.899355 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8b2d2d4f6b0f114db42858f530067eb7ac7c3527c975b8512749ef6848c3c65b\": container with ID starting with 8b2d2d4f6b0f114db42858f530067eb7ac7c3527c975b8512749ef6848c3c65b not found: ID does not exist" containerID="8b2d2d4f6b0f114db42858f530067eb7ac7c3527c975b8512749ef6848c3c65b" Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.899389 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b2d2d4f6b0f114db42858f530067eb7ac7c3527c975b8512749ef6848c3c65b"} err="failed to get container status \"8b2d2d4f6b0f114db42858f530067eb7ac7c3527c975b8512749ef6848c3c65b\": rpc error: code = NotFound desc = could not find container \"8b2d2d4f6b0f114db42858f530067eb7ac7c3527c975b8512749ef6848c3c65b\": container with ID starting with 8b2d2d4f6b0f114db42858f530067eb7ac7c3527c975b8512749ef6848c3c65b not found: ID does not exist" Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.899453 4825 scope.go:117] "RemoveContainer" containerID="ccb72c82096c7e245188dcca8e97f71e8fbf23660808cf882175aba30900ecf8" Mar 10 08:16:24 crc kubenswrapper[4825]: E0310 08:16:24.899715 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccb72c82096c7e245188dcca8e97f71e8fbf23660808cf882175aba30900ecf8\": container with ID starting with ccb72c82096c7e245188dcca8e97f71e8fbf23660808cf882175aba30900ecf8 not found: ID does not exist" containerID="ccb72c82096c7e245188dcca8e97f71e8fbf23660808cf882175aba30900ecf8" Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.899748 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccb72c82096c7e245188dcca8e97f71e8fbf23660808cf882175aba30900ecf8"} err="failed to get container status \"ccb72c82096c7e245188dcca8e97f71e8fbf23660808cf882175aba30900ecf8\": rpc error: code = NotFound desc = could not find container \"ccb72c82096c7e245188dcca8e97f71e8fbf23660808cf882175aba30900ecf8\": container with ID 
starting with ccb72c82096c7e245188dcca8e97f71e8fbf23660808cf882175aba30900ecf8 not found: ID does not exist" Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.899774 4825 scope.go:117] "RemoveContainer" containerID="8b2d2d4f6b0f114db42858f530067eb7ac7c3527c975b8512749ef6848c3c65b" Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.900107 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b2d2d4f6b0f114db42858f530067eb7ac7c3527c975b8512749ef6848c3c65b"} err="failed to get container status \"8b2d2d4f6b0f114db42858f530067eb7ac7c3527c975b8512749ef6848c3c65b\": rpc error: code = NotFound desc = could not find container \"8b2d2d4f6b0f114db42858f530067eb7ac7c3527c975b8512749ef6848c3c65b\": container with ID starting with 8b2d2d4f6b0f114db42858f530067eb7ac7c3527c975b8512749ef6848c3c65b not found: ID does not exist" Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.900168 4825 scope.go:117] "RemoveContainer" containerID="ccb72c82096c7e245188dcca8e97f71e8fbf23660808cf882175aba30900ecf8" Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.900416 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccb72c82096c7e245188dcca8e97f71e8fbf23660808cf882175aba30900ecf8"} err="failed to get container status \"ccb72c82096c7e245188dcca8e97f71e8fbf23660808cf882175aba30900ecf8\": rpc error: code = NotFound desc = could not find container \"ccb72c82096c7e245188dcca8e97f71e8fbf23660808cf882175aba30900ecf8\": container with ID starting with ccb72c82096c7e245188dcca8e97f71e8fbf23660808cf882175aba30900ecf8 not found: ID does not exist" Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.900436 4825 scope.go:117] "RemoveContainer" containerID="1458877aa109199100822f71b9a41ba40cf27bcaa699868f10964609a50027f1" Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.904867 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/5b9ddf47-a404-4d81-8826-106a75c1c3af-config-data\") pod \"5b9ddf47-a404-4d81-8826-106a75c1c3af\" (UID: \"5b9ddf47-a404-4d81-8826-106a75c1c3af\") " Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.904921 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5b9ddf47-a404-4d81-8826-106a75c1c3af-httpd-run\") pod \"5b9ddf47-a404-4d81-8826-106a75c1c3af\" (UID: \"5b9ddf47-a404-4d81-8826-106a75c1c3af\") " Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.905001 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b9ddf47-a404-4d81-8826-106a75c1c3af-logs\") pod \"5b9ddf47-a404-4d81-8826-106a75c1c3af\" (UID: \"5b9ddf47-a404-4d81-8826-106a75c1c3af\") " Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.905025 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b9ddf47-a404-4d81-8826-106a75c1c3af-combined-ca-bundle\") pod \"5b9ddf47-a404-4d81-8826-106a75c1c3af\" (UID: \"5b9ddf47-a404-4d81-8826-106a75c1c3af\") " Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.905053 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q982c\" (UniqueName: \"kubernetes.io/projected/5b9ddf47-a404-4d81-8826-106a75c1c3af-kube-api-access-q982c\") pod \"5b9ddf47-a404-4d81-8826-106a75c1c3af\" (UID: \"5b9ddf47-a404-4d81-8826-106a75c1c3af\") " Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.905623 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b9ddf47-a404-4d81-8826-106a75c1c3af-scripts\") pod \"5b9ddf47-a404-4d81-8826-106a75c1c3af\" (UID: \"5b9ddf47-a404-4d81-8826-106a75c1c3af\") " Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.905913 4825 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b9ddf47-a404-4d81-8826-106a75c1c3af-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5b9ddf47-a404-4d81-8826-106a75c1c3af" (UID: "5b9ddf47-a404-4d81-8826-106a75c1c3af"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.905992 4825 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5b9ddf47-a404-4d81-8826-106a75c1c3af-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.907653 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b9ddf47-a404-4d81-8826-106a75c1c3af-logs" (OuterVolumeSpecName: "logs") pod "5b9ddf47-a404-4d81-8826-106a75c1c3af" (UID: "5b9ddf47-a404-4d81-8826-106a75c1c3af"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.909695 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b9ddf47-a404-4d81-8826-106a75c1c3af-scripts" (OuterVolumeSpecName: "scripts") pod "5b9ddf47-a404-4d81-8826-106a75c1c3af" (UID: "5b9ddf47-a404-4d81-8826-106a75c1c3af"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.909881 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b9ddf47-a404-4d81-8826-106a75c1c3af-kube-api-access-q982c" (OuterVolumeSpecName: "kube-api-access-q982c") pod "5b9ddf47-a404-4d81-8826-106a75c1c3af" (UID: "5b9ddf47-a404-4d81-8826-106a75c1c3af"). InnerVolumeSpecName "kube-api-access-q982c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.911441 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 08:16:24 crc kubenswrapper[4825]: E0310 08:16:24.911818 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afe716cf-a746-4f7b-b5f1-beb9282140a5" containerName="glance-httpd" Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.911840 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="afe716cf-a746-4f7b-b5f1-beb9282140a5" containerName="glance-httpd" Mar 10 08:16:24 crc kubenswrapper[4825]: E0310 08:16:24.911861 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afe716cf-a746-4f7b-b5f1-beb9282140a5" containerName="glance-log" Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.911867 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="afe716cf-a746-4f7b-b5f1-beb9282140a5" containerName="glance-log" Mar 10 08:16:24 crc kubenswrapper[4825]: E0310 08:16:24.911878 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b9ddf47-a404-4d81-8826-106a75c1c3af" containerName="glance-httpd" Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.911884 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b9ddf47-a404-4d81-8826-106a75c1c3af" containerName="glance-httpd" Mar 10 08:16:24 crc kubenswrapper[4825]: E0310 08:16:24.911901 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b9ddf47-a404-4d81-8826-106a75c1c3af" containerName="glance-log" Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.911907 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b9ddf47-a404-4d81-8826-106a75c1c3af" containerName="glance-log" Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.912072 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="afe716cf-a746-4f7b-b5f1-beb9282140a5" containerName="glance-log" Mar 10 08:16:24 crc 
kubenswrapper[4825]: I0310 08:16:24.912095 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="afe716cf-a746-4f7b-b5f1-beb9282140a5" containerName="glance-httpd" Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.912110 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b9ddf47-a404-4d81-8826-106a75c1c3af" containerName="glance-httpd" Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.912121 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b9ddf47-a404-4d81-8826-106a75c1c3af" containerName="glance-log" Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.913101 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.922751 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.923006 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.924347 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.930078 4825 scope.go:117] "RemoveContainer" containerID="20a826d40adbdb889c57e20f1fc7430344d58b430428d2d3b4d8daa031ae748b" Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.936497 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b9ddf47-a404-4d81-8826-106a75c1c3af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b9ddf47-a404-4d81-8826-106a75c1c3af" (UID: "5b9ddf47-a404-4d81-8826-106a75c1c3af"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:16:24 crc kubenswrapper[4825]: I0310 08:16:24.966407 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b9ddf47-a404-4d81-8826-106a75c1c3af-config-data" (OuterVolumeSpecName: "config-data") pod "5b9ddf47-a404-4d81-8826-106a75c1c3af" (UID: "5b9ddf47-a404-4d81-8826-106a75c1c3af"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.008248 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b9ddf47-a404-4d81-8826-106a75c1c3af-logs\") on node \"crc\" DevicePath \"\"" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.008526 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b9ddf47-a404-4d81-8826-106a75c1c3af-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.008612 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q982c\" (UniqueName: \"kubernetes.io/projected/5b9ddf47-a404-4d81-8826-106a75c1c3af-kube-api-access-q982c\") on node \"crc\" DevicePath \"\"" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.008767 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b9ddf47-a404-4d81-8826-106a75c1c3af-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.009026 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b9ddf47-a404-4d81-8826-106a75c1c3af-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.058346 4825 scope.go:117] "RemoveContainer" containerID="1458877aa109199100822f71b9a41ba40cf27bcaa699868f10964609a50027f1" Mar 10 08:16:25 crc 
kubenswrapper[4825]: E0310 08:16:25.059200 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1458877aa109199100822f71b9a41ba40cf27bcaa699868f10964609a50027f1\": container with ID starting with 1458877aa109199100822f71b9a41ba40cf27bcaa699868f10964609a50027f1 not found: ID does not exist" containerID="1458877aa109199100822f71b9a41ba40cf27bcaa699868f10964609a50027f1" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.059228 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1458877aa109199100822f71b9a41ba40cf27bcaa699868f10964609a50027f1"} err="failed to get container status \"1458877aa109199100822f71b9a41ba40cf27bcaa699868f10964609a50027f1\": rpc error: code = NotFound desc = could not find container \"1458877aa109199100822f71b9a41ba40cf27bcaa699868f10964609a50027f1\": container with ID starting with 1458877aa109199100822f71b9a41ba40cf27bcaa699868f10964609a50027f1 not found: ID does not exist" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.059247 4825 scope.go:117] "RemoveContainer" containerID="20a826d40adbdb889c57e20f1fc7430344d58b430428d2d3b4d8daa031ae748b" Mar 10 08:16:25 crc kubenswrapper[4825]: E0310 08:16:25.059668 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20a826d40adbdb889c57e20f1fc7430344d58b430428d2d3b4d8daa031ae748b\": container with ID starting with 20a826d40adbdb889c57e20f1fc7430344d58b430428d2d3b4d8daa031ae748b not found: ID does not exist" containerID="20a826d40adbdb889c57e20f1fc7430344d58b430428d2d3b4d8daa031ae748b" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.059792 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20a826d40adbdb889c57e20f1fc7430344d58b430428d2d3b4d8daa031ae748b"} err="failed to get container status 
\"20a826d40adbdb889c57e20f1fc7430344d58b430428d2d3b4d8daa031ae748b\": rpc error: code = NotFound desc = could not find container \"20a826d40adbdb889c57e20f1fc7430344d58b430428d2d3b4d8daa031ae748b\": container with ID starting with 20a826d40adbdb889c57e20f1fc7430344d58b430428d2d3b4d8daa031ae748b not found: ID does not exist" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.059863 4825 scope.go:117] "RemoveContainer" containerID="1458877aa109199100822f71b9a41ba40cf27bcaa699868f10964609a50027f1" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.060274 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1458877aa109199100822f71b9a41ba40cf27bcaa699868f10964609a50027f1"} err="failed to get container status \"1458877aa109199100822f71b9a41ba40cf27bcaa699868f10964609a50027f1\": rpc error: code = NotFound desc = could not find container \"1458877aa109199100822f71b9a41ba40cf27bcaa699868f10964609a50027f1\": container with ID starting with 1458877aa109199100822f71b9a41ba40cf27bcaa699868f10964609a50027f1 not found: ID does not exist" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.060321 4825 scope.go:117] "RemoveContainer" containerID="20a826d40adbdb889c57e20f1fc7430344d58b430428d2d3b4d8daa031ae748b" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.060638 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20a826d40adbdb889c57e20f1fc7430344d58b430428d2d3b4d8daa031ae748b"} err="failed to get container status \"20a826d40adbdb889c57e20f1fc7430344d58b430428d2d3b4d8daa031ae748b\": rpc error: code = NotFound desc = could not find container \"20a826d40adbdb889c57e20f1fc7430344d58b430428d2d3b4d8daa031ae748b\": container with ID starting with 20a826d40adbdb889c57e20f1fc7430344d58b430428d2d3b4d8daa031ae748b not found: ID does not exist" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.110337 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec1c3fd1-11dc-49a3-9869-95438b93ab08-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ec1c3fd1-11dc-49a3-9869-95438b93ab08\") " pod="openstack/glance-default-external-api-0" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.110387 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kqgb\" (UniqueName: \"kubernetes.io/projected/ec1c3fd1-11dc-49a3-9869-95438b93ab08-kube-api-access-8kqgb\") pod \"glance-default-external-api-0\" (UID: \"ec1c3fd1-11dc-49a3-9869-95438b93ab08\") " pod="openstack/glance-default-external-api-0" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.110449 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ec1c3fd1-11dc-49a3-9869-95438b93ab08-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ec1c3fd1-11dc-49a3-9869-95438b93ab08\") " pod="openstack/glance-default-external-api-0" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.110478 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec1c3fd1-11dc-49a3-9869-95438b93ab08-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ec1c3fd1-11dc-49a3-9869-95438b93ab08\") " pod="openstack/glance-default-external-api-0" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.110524 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec1c3fd1-11dc-49a3-9869-95438b93ab08-logs\") pod \"glance-default-external-api-0\" (UID: \"ec1c3fd1-11dc-49a3-9869-95438b93ab08\") " pod="openstack/glance-default-external-api-0" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 
08:16:25.110652 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec1c3fd1-11dc-49a3-9869-95438b93ab08-scripts\") pod \"glance-default-external-api-0\" (UID: \"ec1c3fd1-11dc-49a3-9869-95438b93ab08\") " pod="openstack/glance-default-external-api-0" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.110723 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec1c3fd1-11dc-49a3-9869-95438b93ab08-config-data\") pod \"glance-default-external-api-0\" (UID: \"ec1c3fd1-11dc-49a3-9869-95438b93ab08\") " pod="openstack/glance-default-external-api-0" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.174452 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.192020 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.206479 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.208003 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.210343 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.210709 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.211933 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec1c3fd1-11dc-49a3-9869-95438b93ab08-logs\") pod \"glance-default-external-api-0\" (UID: \"ec1c3fd1-11dc-49a3-9869-95438b93ab08\") " pod="openstack/glance-default-external-api-0" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.212001 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec1c3fd1-11dc-49a3-9869-95438b93ab08-scripts\") pod \"glance-default-external-api-0\" (UID: \"ec1c3fd1-11dc-49a3-9869-95438b93ab08\") " pod="openstack/glance-default-external-api-0" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.212393 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec1c3fd1-11dc-49a3-9869-95438b93ab08-logs\") pod \"glance-default-external-api-0\" (UID: \"ec1c3fd1-11dc-49a3-9869-95438b93ab08\") " pod="openstack/glance-default-external-api-0" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.212724 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec1c3fd1-11dc-49a3-9869-95438b93ab08-config-data\") pod \"glance-default-external-api-0\" (UID: \"ec1c3fd1-11dc-49a3-9869-95438b93ab08\") " pod="openstack/glance-default-external-api-0" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.212777 4825 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec1c3fd1-11dc-49a3-9869-95438b93ab08-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ec1c3fd1-11dc-49a3-9869-95438b93ab08\") " pod="openstack/glance-default-external-api-0" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.212802 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kqgb\" (UniqueName: \"kubernetes.io/projected/ec1c3fd1-11dc-49a3-9869-95438b93ab08-kube-api-access-8kqgb\") pod \"glance-default-external-api-0\" (UID: \"ec1c3fd1-11dc-49a3-9869-95438b93ab08\") " pod="openstack/glance-default-external-api-0" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.212840 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ec1c3fd1-11dc-49a3-9869-95438b93ab08-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ec1c3fd1-11dc-49a3-9869-95438b93ab08\") " pod="openstack/glance-default-external-api-0" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.212864 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec1c3fd1-11dc-49a3-9869-95438b93ab08-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ec1c3fd1-11dc-49a3-9869-95438b93ab08\") " pod="openstack/glance-default-external-api-0" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.214376 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ec1c3fd1-11dc-49a3-9869-95438b93ab08-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ec1c3fd1-11dc-49a3-9869-95438b93ab08\") " pod="openstack/glance-default-external-api-0" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.215823 4825 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec1c3fd1-11dc-49a3-9869-95438b93ab08-scripts\") pod \"glance-default-external-api-0\" (UID: \"ec1c3fd1-11dc-49a3-9869-95438b93ab08\") " pod="openstack/glance-default-external-api-0" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.216424 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec1c3fd1-11dc-49a3-9869-95438b93ab08-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ec1c3fd1-11dc-49a3-9869-95438b93ab08\") " pod="openstack/glance-default-external-api-0" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.219115 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec1c3fd1-11dc-49a3-9869-95438b93ab08-config-data\") pod \"glance-default-external-api-0\" (UID: \"ec1c3fd1-11dc-49a3-9869-95438b93ab08\") " pod="openstack/glance-default-external-api-0" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.225441 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.242355 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kqgb\" (UniqueName: \"kubernetes.io/projected/ec1c3fd1-11dc-49a3-9869-95438b93ab08-kube-api-access-8kqgb\") pod \"glance-default-external-api-0\" (UID: \"ec1c3fd1-11dc-49a3-9869-95438b93ab08\") " pod="openstack/glance-default-external-api-0" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.261418 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b9ddf47-a404-4d81-8826-106a75c1c3af" path="/var/lib/kubelet/pods/5b9ddf47-a404-4d81-8826-106a75c1c3af/volumes" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.262082 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="afe716cf-a746-4f7b-b5f1-beb9282140a5" path="/var/lib/kubelet/pods/afe716cf-a746-4f7b-b5f1-beb9282140a5/volumes" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.273986 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec1c3fd1-11dc-49a3-9869-95438b93ab08-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ec1c3fd1-11dc-49a3-9869-95438b93ab08\") " pod="openstack/glance-default-external-api-0" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.314005 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee73a326-c85e-4756-b836-3fc323ef11ac-logs\") pod \"glance-default-internal-api-0\" (UID: \"ee73a326-c85e-4756-b836-3fc323ef11ac\") " pod="openstack/glance-default-internal-api-0" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.314056 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee73a326-c85e-4756-b836-3fc323ef11ac-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ee73a326-c85e-4756-b836-3fc323ef11ac\") " pod="openstack/glance-default-internal-api-0" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.314171 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7qb5\" (UniqueName: \"kubernetes.io/projected/ee73a326-c85e-4756-b836-3fc323ef11ac-kube-api-access-t7qb5\") pod \"glance-default-internal-api-0\" (UID: \"ee73a326-c85e-4756-b836-3fc323ef11ac\") " pod="openstack/glance-default-internal-api-0" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.314279 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee73a326-c85e-4756-b836-3fc323ef11ac-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"ee73a326-c85e-4756-b836-3fc323ef11ac\") " pod="openstack/glance-default-internal-api-0" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.314317 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee73a326-c85e-4756-b836-3fc323ef11ac-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ee73a326-c85e-4756-b836-3fc323ef11ac\") " pod="openstack/glance-default-internal-api-0" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.314364 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee73a326-c85e-4756-b836-3fc323ef11ac-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ee73a326-c85e-4756-b836-3fc323ef11ac\") " pod="openstack/glance-default-internal-api-0" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.314388 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ee73a326-c85e-4756-b836-3fc323ef11ac-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ee73a326-c85e-4756-b836-3fc323ef11ac\") " pod="openstack/glance-default-internal-api-0" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.358002 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.415009 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee73a326-c85e-4756-b836-3fc323ef11ac-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ee73a326-c85e-4756-b836-3fc323ef11ac\") " pod="openstack/glance-default-internal-api-0" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.415066 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee73a326-c85e-4756-b836-3fc323ef11ac-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ee73a326-c85e-4756-b836-3fc323ef11ac\") " pod="openstack/glance-default-internal-api-0" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.415088 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ee73a326-c85e-4756-b836-3fc323ef11ac-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ee73a326-c85e-4756-b836-3fc323ef11ac\") " pod="openstack/glance-default-internal-api-0" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.415116 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee73a326-c85e-4756-b836-3fc323ef11ac-logs\") pod \"glance-default-internal-api-0\" (UID: \"ee73a326-c85e-4756-b836-3fc323ef11ac\") " pod="openstack/glance-default-internal-api-0" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.415149 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee73a326-c85e-4756-b836-3fc323ef11ac-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ee73a326-c85e-4756-b836-3fc323ef11ac\") " pod="openstack/glance-default-internal-api-0" Mar 10 
08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.415194 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7qb5\" (UniqueName: \"kubernetes.io/projected/ee73a326-c85e-4756-b836-3fc323ef11ac-kube-api-access-t7qb5\") pod \"glance-default-internal-api-0\" (UID: \"ee73a326-c85e-4756-b836-3fc323ef11ac\") " pod="openstack/glance-default-internal-api-0" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.415243 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee73a326-c85e-4756-b836-3fc323ef11ac-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ee73a326-c85e-4756-b836-3fc323ef11ac\") " pod="openstack/glance-default-internal-api-0" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.416042 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ee73a326-c85e-4756-b836-3fc323ef11ac-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ee73a326-c85e-4756-b836-3fc323ef11ac\") " pod="openstack/glance-default-internal-api-0" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.416109 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee73a326-c85e-4756-b836-3fc323ef11ac-logs\") pod \"glance-default-internal-api-0\" (UID: \"ee73a326-c85e-4756-b836-3fc323ef11ac\") " pod="openstack/glance-default-internal-api-0" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.419643 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee73a326-c85e-4756-b836-3fc323ef11ac-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ee73a326-c85e-4756-b836-3fc323ef11ac\") " pod="openstack/glance-default-internal-api-0" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.419880 4825 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee73a326-c85e-4756-b836-3fc323ef11ac-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ee73a326-c85e-4756-b836-3fc323ef11ac\") " pod="openstack/glance-default-internal-api-0" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.420700 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee73a326-c85e-4756-b836-3fc323ef11ac-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ee73a326-c85e-4756-b836-3fc323ef11ac\") " pod="openstack/glance-default-internal-api-0" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.421753 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee73a326-c85e-4756-b836-3fc323ef11ac-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ee73a326-c85e-4756-b836-3fc323ef11ac\") " pod="openstack/glance-default-internal-api-0" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.431861 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7qb5\" (UniqueName: \"kubernetes.io/projected/ee73a326-c85e-4756-b836-3fc323ef11ac-kube-api-access-t7qb5\") pod \"glance-default-internal-api-0\" (UID: \"ee73a326-c85e-4756-b836-3fc323ef11ac\") " pod="openstack/glance-default-internal-api-0" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.604663 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 08:16:25 crc kubenswrapper[4825]: I0310 08:16:25.898389 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 08:16:25 crc kubenswrapper[4825]: W0310 08:16:25.907623 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec1c3fd1_11dc_49a3_9869_95438b93ab08.slice/crio-24de60a2db0176c4af9a50dfd92a80a4cd3dca3a99c757d808115cef634ca6e9 WatchSource:0}: Error finding container 24de60a2db0176c4af9a50dfd92a80a4cd3dca3a99c757d808115cef634ca6e9: Status 404 returned error can't find the container with id 24de60a2db0176c4af9a50dfd92a80a4cd3dca3a99c757d808115cef634ca6e9 Mar 10 08:16:26 crc kubenswrapper[4825]: I0310 08:16:26.758891 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 08:16:26 crc kubenswrapper[4825]: I0310 08:16:26.894112 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ec1c3fd1-11dc-49a3-9869-95438b93ab08","Type":"ContainerStarted","Data":"1d3e4e3e439cd0f6b04d665a1435005ce50c78d9a937614f5ef8a5ade7a959a5"} Mar 10 08:16:26 crc kubenswrapper[4825]: I0310 08:16:26.894172 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ec1c3fd1-11dc-49a3-9869-95438b93ab08","Type":"ContainerStarted","Data":"24de60a2db0176c4af9a50dfd92a80a4cd3dca3a99c757d808115cef634ca6e9"} Mar 10 08:16:26 crc kubenswrapper[4825]: I0310 08:16:26.898469 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ee73a326-c85e-4756-b836-3fc323ef11ac","Type":"ContainerStarted","Data":"960d8532115364cc8bf7f43ff151e2645c7b102a81e55beb189bcac54787383d"} Mar 10 08:16:27 crc kubenswrapper[4825]: I0310 08:16:27.910868 4825 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ee73a326-c85e-4756-b836-3fc323ef11ac","Type":"ContainerStarted","Data":"2f3556bb6af608a5cfdaa3a62737229b2129fac11fe4cdf0a59d6331965d4f05"} Mar 10 08:16:27 crc kubenswrapper[4825]: I0310 08:16:27.914101 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ec1c3fd1-11dc-49a3-9869-95438b93ab08","Type":"ContainerStarted","Data":"e8b549f3aca00d92c110d0fabf012350fc907bc04b117fdc2dc18e6faf605abe"} Mar 10 08:16:27 crc kubenswrapper[4825]: I0310 08:16:27.942813 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.942794467 podStartE2EDuration="3.942794467s" podCreationTimestamp="2026-03-10 08:16:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:16:27.935576788 +0000 UTC m=+5540.965357403" watchObservedRunningTime="2026-03-10 08:16:27.942794467 +0000 UTC m=+5540.972575082" Mar 10 08:16:28 crc kubenswrapper[4825]: I0310 08:16:28.929696 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ee73a326-c85e-4756-b836-3fc323ef11ac","Type":"ContainerStarted","Data":"6cf54bd96acb81570c21132944a0bdf95aaa4372936cf091dfc290a59d872175"} Mar 10 08:16:28 crc kubenswrapper[4825]: I0310 08:16:28.956370 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.956336651 podStartE2EDuration="3.956336651s" podCreationTimestamp="2026-03-10 08:16:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:16:28.951165705 +0000 UTC m=+5541.980946340" watchObservedRunningTime="2026-03-10 08:16:28.956336651 +0000 UTC m=+5541.986117266" Mar 10 08:16:29 
crc kubenswrapper[4825]: I0310 08:16:29.241917 4825 scope.go:117] "RemoveContainer" containerID="cc40bbe7e5127cd8ca03c26dd788c9bfc628fd248484a1f5f73946709b5b4414" Mar 10 08:16:29 crc kubenswrapper[4825]: E0310 08:16:29.242545 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:16:30 crc kubenswrapper[4825]: I0310 08:16:30.502381 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d95445669-fw682" Mar 10 08:16:30 crc kubenswrapper[4825]: I0310 08:16:30.559594 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74d56d465f-d2t8h"] Mar 10 08:16:30 crc kubenswrapper[4825]: I0310 08:16:30.559915 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74d56d465f-d2t8h" podUID="56ad7438-3747-4fff-a88f-34f87df435d6" containerName="dnsmasq-dns" containerID="cri-o://93e50380a27201dc777ba257a73ff60ccbd7ddb727bf3ff5f02acec9cac182e3" gracePeriod=10 Mar 10 08:16:30 crc kubenswrapper[4825]: I0310 08:16:30.952279 4825 generic.go:334] "Generic (PLEG): container finished" podID="56ad7438-3747-4fff-a88f-34f87df435d6" containerID="93e50380a27201dc777ba257a73ff60ccbd7ddb727bf3ff5f02acec9cac182e3" exitCode=0 Mar 10 08:16:30 crc kubenswrapper[4825]: I0310 08:16:30.952361 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74d56d465f-d2t8h" event={"ID":"56ad7438-3747-4fff-a88f-34f87df435d6","Type":"ContainerDied","Data":"93e50380a27201dc777ba257a73ff60ccbd7ddb727bf3ff5f02acec9cac182e3"} Mar 10 08:16:30 crc kubenswrapper[4825]: I0310 08:16:30.952572 4825 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74d56d465f-d2t8h" event={"ID":"56ad7438-3747-4fff-a88f-34f87df435d6","Type":"ContainerDied","Data":"75a3ad6a73c1ba45bf481e8969e616536cfae9c6896214de60a762c4508984a6"} Mar 10 08:16:30 crc kubenswrapper[4825]: I0310 08:16:30.952589 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75a3ad6a73c1ba45bf481e8969e616536cfae9c6896214de60a762c4508984a6" Mar 10 08:16:31 crc kubenswrapper[4825]: I0310 08:16:31.016706 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74d56d465f-d2t8h" Mar 10 08:16:31 crc kubenswrapper[4825]: I0310 08:16:31.128883 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56ad7438-3747-4fff-a88f-34f87df435d6-ovsdbserver-sb\") pod \"56ad7438-3747-4fff-a88f-34f87df435d6\" (UID: \"56ad7438-3747-4fff-a88f-34f87df435d6\") " Mar 10 08:16:31 crc kubenswrapper[4825]: I0310 08:16:31.128950 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56ad7438-3747-4fff-a88f-34f87df435d6-dns-svc\") pod \"56ad7438-3747-4fff-a88f-34f87df435d6\" (UID: \"56ad7438-3747-4fff-a88f-34f87df435d6\") " Mar 10 08:16:31 crc kubenswrapper[4825]: I0310 08:16:31.129066 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56ad7438-3747-4fff-a88f-34f87df435d6-ovsdbserver-nb\") pod \"56ad7438-3747-4fff-a88f-34f87df435d6\" (UID: \"56ad7438-3747-4fff-a88f-34f87df435d6\") " Mar 10 08:16:31 crc kubenswrapper[4825]: I0310 08:16:31.129146 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56ad7438-3747-4fff-a88f-34f87df435d6-config\") pod \"56ad7438-3747-4fff-a88f-34f87df435d6\" (UID: 
\"56ad7438-3747-4fff-a88f-34f87df435d6\") " Mar 10 08:16:31 crc kubenswrapper[4825]: I0310 08:16:31.129252 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdvrd\" (UniqueName: \"kubernetes.io/projected/56ad7438-3747-4fff-a88f-34f87df435d6-kube-api-access-vdvrd\") pod \"56ad7438-3747-4fff-a88f-34f87df435d6\" (UID: \"56ad7438-3747-4fff-a88f-34f87df435d6\") " Mar 10 08:16:31 crc kubenswrapper[4825]: I0310 08:16:31.134562 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56ad7438-3747-4fff-a88f-34f87df435d6-kube-api-access-vdvrd" (OuterVolumeSpecName: "kube-api-access-vdvrd") pod "56ad7438-3747-4fff-a88f-34f87df435d6" (UID: "56ad7438-3747-4fff-a88f-34f87df435d6"). InnerVolumeSpecName "kube-api-access-vdvrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:16:31 crc kubenswrapper[4825]: I0310 08:16:31.175327 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56ad7438-3747-4fff-a88f-34f87df435d6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "56ad7438-3747-4fff-a88f-34f87df435d6" (UID: "56ad7438-3747-4fff-a88f-34f87df435d6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:16:31 crc kubenswrapper[4825]: I0310 08:16:31.178336 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56ad7438-3747-4fff-a88f-34f87df435d6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "56ad7438-3747-4fff-a88f-34f87df435d6" (UID: "56ad7438-3747-4fff-a88f-34f87df435d6"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:16:31 crc kubenswrapper[4825]: I0310 08:16:31.183596 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56ad7438-3747-4fff-a88f-34f87df435d6-config" (OuterVolumeSpecName: "config") pod "56ad7438-3747-4fff-a88f-34f87df435d6" (UID: "56ad7438-3747-4fff-a88f-34f87df435d6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:16:31 crc kubenswrapper[4825]: I0310 08:16:31.190820 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56ad7438-3747-4fff-a88f-34f87df435d6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "56ad7438-3747-4fff-a88f-34f87df435d6" (UID: "56ad7438-3747-4fff-a88f-34f87df435d6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:16:31 crc kubenswrapper[4825]: I0310 08:16:31.231409 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56ad7438-3747-4fff-a88f-34f87df435d6-config\") on node \"crc\" DevicePath \"\"" Mar 10 08:16:31 crc kubenswrapper[4825]: I0310 08:16:31.231611 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdvrd\" (UniqueName: \"kubernetes.io/projected/56ad7438-3747-4fff-a88f-34f87df435d6-kube-api-access-vdvrd\") on node \"crc\" DevicePath \"\"" Mar 10 08:16:31 crc kubenswrapper[4825]: I0310 08:16:31.231667 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56ad7438-3747-4fff-a88f-34f87df435d6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 08:16:31 crc kubenswrapper[4825]: I0310 08:16:31.231717 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56ad7438-3747-4fff-a88f-34f87df435d6-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 08:16:31 crc 
kubenswrapper[4825]: I0310 08:16:31.231763 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56ad7438-3747-4fff-a88f-34f87df435d6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 08:16:31 crc kubenswrapper[4825]: I0310 08:16:31.962337 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74d56d465f-d2t8h" Mar 10 08:16:31 crc kubenswrapper[4825]: I0310 08:16:31.987305 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74d56d465f-d2t8h"] Mar 10 08:16:31 crc kubenswrapper[4825]: I0310 08:16:31.995210 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74d56d465f-d2t8h"] Mar 10 08:16:33 crc kubenswrapper[4825]: I0310 08:16:33.251215 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56ad7438-3747-4fff-a88f-34f87df435d6" path="/var/lib/kubelet/pods/56ad7438-3747-4fff-a88f-34f87df435d6/volumes" Mar 10 08:16:35 crc kubenswrapper[4825]: I0310 08:16:35.358292 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 10 08:16:35 crc kubenswrapper[4825]: I0310 08:16:35.359317 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 10 08:16:35 crc kubenswrapper[4825]: I0310 08:16:35.392193 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 10 08:16:35 crc kubenswrapper[4825]: I0310 08:16:35.405956 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 10 08:16:35 crc kubenswrapper[4825]: I0310 08:16:35.604982 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 10 08:16:35 crc kubenswrapper[4825]: I0310 08:16:35.605055 
4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 10 08:16:35 crc kubenswrapper[4825]: I0310 08:16:35.633474 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 10 08:16:35 crc kubenswrapper[4825]: I0310 08:16:35.648312 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 10 08:16:36 crc kubenswrapper[4825]: I0310 08:16:36.002293 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 10 08:16:36 crc kubenswrapper[4825]: I0310 08:16:36.002591 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 10 08:16:36 crc kubenswrapper[4825]: I0310 08:16:36.002607 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 10 08:16:36 crc kubenswrapper[4825]: I0310 08:16:36.002617 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 10 08:16:38 crc kubenswrapper[4825]: I0310 08:16:38.004872 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 10 08:16:38 crc kubenswrapper[4825]: I0310 08:16:38.018357 4825 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 08:16:38 crc kubenswrapper[4825]: I0310 08:16:38.022495 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 10 08:16:38 crc kubenswrapper[4825]: I0310 08:16:38.037263 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 10 08:16:38 crc kubenswrapper[4825]: I0310 08:16:38.037587 4825 prober_manager.go:312] "Failed to 
trigger a manual run" probe="Readiness" Mar 10 08:16:38 crc kubenswrapper[4825]: I0310 08:16:38.136526 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 10 08:16:43 crc kubenswrapper[4825]: I0310 08:16:43.025956 4825 scope.go:117] "RemoveContainer" containerID="8488bd91af709b889b7fc2c8e6562a78138b630e47186398a9d97e4c0e154db4" Mar 10 08:16:43 crc kubenswrapper[4825]: I0310 08:16:43.068291 4825 scope.go:117] "RemoveContainer" containerID="1e270fb12bcae28fa39d843b0de9753a845039c08cd041f524cf77f266f8b086" Mar 10 08:16:44 crc kubenswrapper[4825]: I0310 08:16:44.234321 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-hbbvh"] Mar 10 08:16:44 crc kubenswrapper[4825]: E0310 08:16:44.234747 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56ad7438-3747-4fff-a88f-34f87df435d6" containerName="init" Mar 10 08:16:44 crc kubenswrapper[4825]: I0310 08:16:44.234761 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="56ad7438-3747-4fff-a88f-34f87df435d6" containerName="init" Mar 10 08:16:44 crc kubenswrapper[4825]: E0310 08:16:44.234774 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56ad7438-3747-4fff-a88f-34f87df435d6" containerName="dnsmasq-dns" Mar 10 08:16:44 crc kubenswrapper[4825]: I0310 08:16:44.234780 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="56ad7438-3747-4fff-a88f-34f87df435d6" containerName="dnsmasq-dns" Mar 10 08:16:44 crc kubenswrapper[4825]: I0310 08:16:44.234976 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="56ad7438-3747-4fff-a88f-34f87df435d6" containerName="dnsmasq-dns" Mar 10 08:16:44 crc kubenswrapper[4825]: I0310 08:16:44.235601 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-hbbvh" Mar 10 08:16:44 crc kubenswrapper[4825]: I0310 08:16:44.236695 4825 scope.go:117] "RemoveContainer" containerID="cc40bbe7e5127cd8ca03c26dd788c9bfc628fd248484a1f5f73946709b5b4414" Mar 10 08:16:44 crc kubenswrapper[4825]: E0310 08:16:44.237096 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:16:44 crc kubenswrapper[4825]: I0310 08:16:44.242953 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-71b1-account-create-update-w4phr"] Mar 10 08:16:44 crc kubenswrapper[4825]: I0310 08:16:44.244566 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-71b1-account-create-update-w4phr" Mar 10 08:16:44 crc kubenswrapper[4825]: I0310 08:16:44.246378 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 10 08:16:44 crc kubenswrapper[4825]: I0310 08:16:44.267395 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-hbbvh"] Mar 10 08:16:44 crc kubenswrapper[4825]: I0310 08:16:44.267465 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-71b1-account-create-update-w4phr"] Mar 10 08:16:44 crc kubenswrapper[4825]: I0310 08:16:44.400212 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f9c69e7-8f45-4b9e-abe4-54f811f4227d-operator-scripts\") pod \"placement-db-create-hbbvh\" (UID: \"6f9c69e7-8f45-4b9e-abe4-54f811f4227d\") " pod="openstack/placement-db-create-hbbvh" Mar 10 08:16:44 crc kubenswrapper[4825]: I0310 08:16:44.400256 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d74zb\" (UniqueName: \"kubernetes.io/projected/6f9c69e7-8f45-4b9e-abe4-54f811f4227d-kube-api-access-d74zb\") pod \"placement-db-create-hbbvh\" (UID: \"6f9c69e7-8f45-4b9e-abe4-54f811f4227d\") " pod="openstack/placement-db-create-hbbvh" Mar 10 08:16:44 crc kubenswrapper[4825]: I0310 08:16:44.400372 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7c1a03f-33d6-4b82-92c2-05a42b793e86-operator-scripts\") pod \"placement-71b1-account-create-update-w4phr\" (UID: \"c7c1a03f-33d6-4b82-92c2-05a42b793e86\") " pod="openstack/placement-71b1-account-create-update-w4phr" Mar 10 08:16:44 crc kubenswrapper[4825]: I0310 08:16:44.400507 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-trsds\" (UniqueName: \"kubernetes.io/projected/c7c1a03f-33d6-4b82-92c2-05a42b793e86-kube-api-access-trsds\") pod \"placement-71b1-account-create-update-w4phr\" (UID: \"c7c1a03f-33d6-4b82-92c2-05a42b793e86\") " pod="openstack/placement-71b1-account-create-update-w4phr" Mar 10 08:16:44 crc kubenswrapper[4825]: I0310 08:16:44.502390 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7c1a03f-33d6-4b82-92c2-05a42b793e86-operator-scripts\") pod \"placement-71b1-account-create-update-w4phr\" (UID: \"c7c1a03f-33d6-4b82-92c2-05a42b793e86\") " pod="openstack/placement-71b1-account-create-update-w4phr" Mar 10 08:16:44 crc kubenswrapper[4825]: I0310 08:16:44.502503 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trsds\" (UniqueName: \"kubernetes.io/projected/c7c1a03f-33d6-4b82-92c2-05a42b793e86-kube-api-access-trsds\") pod \"placement-71b1-account-create-update-w4phr\" (UID: \"c7c1a03f-33d6-4b82-92c2-05a42b793e86\") " pod="openstack/placement-71b1-account-create-update-w4phr" Mar 10 08:16:44 crc kubenswrapper[4825]: I0310 08:16:44.502543 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f9c69e7-8f45-4b9e-abe4-54f811f4227d-operator-scripts\") pod \"placement-db-create-hbbvh\" (UID: \"6f9c69e7-8f45-4b9e-abe4-54f811f4227d\") " pod="openstack/placement-db-create-hbbvh" Mar 10 08:16:44 crc kubenswrapper[4825]: I0310 08:16:44.502563 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d74zb\" (UniqueName: \"kubernetes.io/projected/6f9c69e7-8f45-4b9e-abe4-54f811f4227d-kube-api-access-d74zb\") pod \"placement-db-create-hbbvh\" (UID: \"6f9c69e7-8f45-4b9e-abe4-54f811f4227d\") " pod="openstack/placement-db-create-hbbvh" Mar 10 08:16:44 crc kubenswrapper[4825]: I0310 
08:16:44.503071 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7c1a03f-33d6-4b82-92c2-05a42b793e86-operator-scripts\") pod \"placement-71b1-account-create-update-w4phr\" (UID: \"c7c1a03f-33d6-4b82-92c2-05a42b793e86\") " pod="openstack/placement-71b1-account-create-update-w4phr" Mar 10 08:16:44 crc kubenswrapper[4825]: I0310 08:16:44.503418 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f9c69e7-8f45-4b9e-abe4-54f811f4227d-operator-scripts\") pod \"placement-db-create-hbbvh\" (UID: \"6f9c69e7-8f45-4b9e-abe4-54f811f4227d\") " pod="openstack/placement-db-create-hbbvh" Mar 10 08:16:44 crc kubenswrapper[4825]: I0310 08:16:44.532340 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trsds\" (UniqueName: \"kubernetes.io/projected/c7c1a03f-33d6-4b82-92c2-05a42b793e86-kube-api-access-trsds\") pod \"placement-71b1-account-create-update-w4phr\" (UID: \"c7c1a03f-33d6-4b82-92c2-05a42b793e86\") " pod="openstack/placement-71b1-account-create-update-w4phr" Mar 10 08:16:44 crc kubenswrapper[4825]: I0310 08:16:44.533033 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d74zb\" (UniqueName: \"kubernetes.io/projected/6f9c69e7-8f45-4b9e-abe4-54f811f4227d-kube-api-access-d74zb\") pod \"placement-db-create-hbbvh\" (UID: \"6f9c69e7-8f45-4b9e-abe4-54f811f4227d\") " pod="openstack/placement-db-create-hbbvh" Mar 10 08:16:44 crc kubenswrapper[4825]: I0310 08:16:44.597608 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-hbbvh" Mar 10 08:16:44 crc kubenswrapper[4825]: I0310 08:16:44.607062 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-71b1-account-create-update-w4phr" Mar 10 08:16:45 crc kubenswrapper[4825]: I0310 08:16:45.073446 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-hbbvh"] Mar 10 08:16:45 crc kubenswrapper[4825]: W0310 08:16:45.083316 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f9c69e7_8f45_4b9e_abe4_54f811f4227d.slice/crio-c1b84a73a9feeb1a41f6157898fcb27d664bb0082b0c0a19e0f9719146747056 WatchSource:0}: Error finding container c1b84a73a9feeb1a41f6157898fcb27d664bb0082b0c0a19e0f9719146747056: Status 404 returned error can't find the container with id c1b84a73a9feeb1a41f6157898fcb27d664bb0082b0c0a19e0f9719146747056 Mar 10 08:16:45 crc kubenswrapper[4825]: I0310 08:16:45.147490 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-71b1-account-create-update-w4phr"] Mar 10 08:16:45 crc kubenswrapper[4825]: W0310 08:16:45.147784 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7c1a03f_33d6_4b82_92c2_05a42b793e86.slice/crio-6806a79c3535e30fff9281c7780848955a24c93f65d51296bd069a0f04d3d6d1 WatchSource:0}: Error finding container 6806a79c3535e30fff9281c7780848955a24c93f65d51296bd069a0f04d3d6d1: Status 404 returned error can't find the container with id 6806a79c3535e30fff9281c7780848955a24c93f65d51296bd069a0f04d3d6d1 Mar 10 08:16:46 crc kubenswrapper[4825]: I0310 08:16:46.091818 4825 generic.go:334] "Generic (PLEG): container finished" podID="6f9c69e7-8f45-4b9e-abe4-54f811f4227d" containerID="5b9928f32ac768c9046299d5bb4a58704c39fc658abb72c248435f228cbe8e56" exitCode=0 Mar 10 08:16:46 crc kubenswrapper[4825]: I0310 08:16:46.091917 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-hbbvh" 
event={"ID":"6f9c69e7-8f45-4b9e-abe4-54f811f4227d","Type":"ContainerDied","Data":"5b9928f32ac768c9046299d5bb4a58704c39fc658abb72c248435f228cbe8e56"} Mar 10 08:16:46 crc kubenswrapper[4825]: I0310 08:16:46.092307 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-hbbvh" event={"ID":"6f9c69e7-8f45-4b9e-abe4-54f811f4227d","Type":"ContainerStarted","Data":"c1b84a73a9feeb1a41f6157898fcb27d664bb0082b0c0a19e0f9719146747056"} Mar 10 08:16:46 crc kubenswrapper[4825]: I0310 08:16:46.095011 4825 generic.go:334] "Generic (PLEG): container finished" podID="c7c1a03f-33d6-4b82-92c2-05a42b793e86" containerID="86985ca1303771554b9fa682859ad2aff5f3f8c84e7991b815c56606593d3700" exitCode=0 Mar 10 08:16:46 crc kubenswrapper[4825]: I0310 08:16:46.095069 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-71b1-account-create-update-w4phr" event={"ID":"c7c1a03f-33d6-4b82-92c2-05a42b793e86","Type":"ContainerDied","Data":"86985ca1303771554b9fa682859ad2aff5f3f8c84e7991b815c56606593d3700"} Mar 10 08:16:46 crc kubenswrapper[4825]: I0310 08:16:46.095120 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-71b1-account-create-update-w4phr" event={"ID":"c7c1a03f-33d6-4b82-92c2-05a42b793e86","Type":"ContainerStarted","Data":"6806a79c3535e30fff9281c7780848955a24c93f65d51296bd069a0f04d3d6d1"} Mar 10 08:16:47 crc kubenswrapper[4825]: I0310 08:16:47.582797 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-71b1-account-create-update-w4phr" Mar 10 08:16:47 crc kubenswrapper[4825]: I0310 08:16:47.589358 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-hbbvh" Mar 10 08:16:47 crc kubenswrapper[4825]: I0310 08:16:47.669613 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d74zb\" (UniqueName: \"kubernetes.io/projected/6f9c69e7-8f45-4b9e-abe4-54f811f4227d-kube-api-access-d74zb\") pod \"6f9c69e7-8f45-4b9e-abe4-54f811f4227d\" (UID: \"6f9c69e7-8f45-4b9e-abe4-54f811f4227d\") " Mar 10 08:16:47 crc kubenswrapper[4825]: I0310 08:16:47.669725 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trsds\" (UniqueName: \"kubernetes.io/projected/c7c1a03f-33d6-4b82-92c2-05a42b793e86-kube-api-access-trsds\") pod \"c7c1a03f-33d6-4b82-92c2-05a42b793e86\" (UID: \"c7c1a03f-33d6-4b82-92c2-05a42b793e86\") " Mar 10 08:16:47 crc kubenswrapper[4825]: I0310 08:16:47.669968 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f9c69e7-8f45-4b9e-abe4-54f811f4227d-operator-scripts\") pod \"6f9c69e7-8f45-4b9e-abe4-54f811f4227d\" (UID: \"6f9c69e7-8f45-4b9e-abe4-54f811f4227d\") " Mar 10 08:16:47 crc kubenswrapper[4825]: I0310 08:16:47.670097 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7c1a03f-33d6-4b82-92c2-05a42b793e86-operator-scripts\") pod \"c7c1a03f-33d6-4b82-92c2-05a42b793e86\" (UID: \"c7c1a03f-33d6-4b82-92c2-05a42b793e86\") " Mar 10 08:16:47 crc kubenswrapper[4825]: I0310 08:16:47.670337 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f9c69e7-8f45-4b9e-abe4-54f811f4227d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6f9c69e7-8f45-4b9e-abe4-54f811f4227d" (UID: "6f9c69e7-8f45-4b9e-abe4-54f811f4227d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:16:47 crc kubenswrapper[4825]: I0310 08:16:47.670539 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7c1a03f-33d6-4b82-92c2-05a42b793e86-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c7c1a03f-33d6-4b82-92c2-05a42b793e86" (UID: "c7c1a03f-33d6-4b82-92c2-05a42b793e86"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:16:47 crc kubenswrapper[4825]: I0310 08:16:47.670874 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f9c69e7-8f45-4b9e-abe4-54f811f4227d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 08:16:47 crc kubenswrapper[4825]: I0310 08:16:47.670898 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7c1a03f-33d6-4b82-92c2-05a42b793e86-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 08:16:47 crc kubenswrapper[4825]: I0310 08:16:47.677167 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7c1a03f-33d6-4b82-92c2-05a42b793e86-kube-api-access-trsds" (OuterVolumeSpecName: "kube-api-access-trsds") pod "c7c1a03f-33d6-4b82-92c2-05a42b793e86" (UID: "c7c1a03f-33d6-4b82-92c2-05a42b793e86"). InnerVolumeSpecName "kube-api-access-trsds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:16:47 crc kubenswrapper[4825]: I0310 08:16:47.678432 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f9c69e7-8f45-4b9e-abe4-54f811f4227d-kube-api-access-d74zb" (OuterVolumeSpecName: "kube-api-access-d74zb") pod "6f9c69e7-8f45-4b9e-abe4-54f811f4227d" (UID: "6f9c69e7-8f45-4b9e-abe4-54f811f4227d"). InnerVolumeSpecName "kube-api-access-d74zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:16:47 crc kubenswrapper[4825]: I0310 08:16:47.773740 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d74zb\" (UniqueName: \"kubernetes.io/projected/6f9c69e7-8f45-4b9e-abe4-54f811f4227d-kube-api-access-d74zb\") on node \"crc\" DevicePath \"\"" Mar 10 08:16:47 crc kubenswrapper[4825]: I0310 08:16:47.773807 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trsds\" (UniqueName: \"kubernetes.io/projected/c7c1a03f-33d6-4b82-92c2-05a42b793e86-kube-api-access-trsds\") on node \"crc\" DevicePath \"\"" Mar 10 08:16:48 crc kubenswrapper[4825]: I0310 08:16:48.115259 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-71b1-account-create-update-w4phr" event={"ID":"c7c1a03f-33d6-4b82-92c2-05a42b793e86","Type":"ContainerDied","Data":"6806a79c3535e30fff9281c7780848955a24c93f65d51296bd069a0f04d3d6d1"} Mar 10 08:16:48 crc kubenswrapper[4825]: I0310 08:16:48.115305 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6806a79c3535e30fff9281c7780848955a24c93f65d51296bd069a0f04d3d6d1" Mar 10 08:16:48 crc kubenswrapper[4825]: I0310 08:16:48.115352 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-71b1-account-create-update-w4phr" Mar 10 08:16:48 crc kubenswrapper[4825]: I0310 08:16:48.119698 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-hbbvh" event={"ID":"6f9c69e7-8f45-4b9e-abe4-54f811f4227d","Type":"ContainerDied","Data":"c1b84a73a9feeb1a41f6157898fcb27d664bb0082b0c0a19e0f9719146747056"} Mar 10 08:16:48 crc kubenswrapper[4825]: I0310 08:16:48.119737 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1b84a73a9feeb1a41f6157898fcb27d664bb0082b0c0a19e0f9719146747056" Mar 10 08:16:48 crc kubenswrapper[4825]: I0310 08:16:48.119703 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-hbbvh" Mar 10 08:16:49 crc kubenswrapper[4825]: I0310 08:16:49.429130 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78d6b6f8b7-qd4kd"] Mar 10 08:16:49 crc kubenswrapper[4825]: E0310 08:16:49.429889 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9c69e7-8f45-4b9e-abe4-54f811f4227d" containerName="mariadb-database-create" Mar 10 08:16:49 crc kubenswrapper[4825]: I0310 08:16:49.429905 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9c69e7-8f45-4b9e-abe4-54f811f4227d" containerName="mariadb-database-create" Mar 10 08:16:49 crc kubenswrapper[4825]: E0310 08:16:49.429930 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7c1a03f-33d6-4b82-92c2-05a42b793e86" containerName="mariadb-account-create-update" Mar 10 08:16:49 crc kubenswrapper[4825]: I0310 08:16:49.429938 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7c1a03f-33d6-4b82-92c2-05a42b793e86" containerName="mariadb-account-create-update" Mar 10 08:16:49 crc kubenswrapper[4825]: I0310 08:16:49.430159 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9c69e7-8f45-4b9e-abe4-54f811f4227d" 
containerName="mariadb-database-create" Mar 10 08:16:49 crc kubenswrapper[4825]: I0310 08:16:49.430177 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7c1a03f-33d6-4b82-92c2-05a42b793e86" containerName="mariadb-account-create-update" Mar 10 08:16:49 crc kubenswrapper[4825]: I0310 08:16:49.431216 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78d6b6f8b7-qd4kd" Mar 10 08:16:49 crc kubenswrapper[4825]: I0310 08:16:49.447197 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78d6b6f8b7-qd4kd"] Mar 10 08:16:49 crc kubenswrapper[4825]: I0310 08:16:49.489063 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-2sjk7"] Mar 10 08:16:49 crc kubenswrapper[4825]: I0310 08:16:49.490387 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2sjk7" Mar 10 08:16:49 crc kubenswrapper[4825]: I0310 08:16:49.493363 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 10 08:16:49 crc kubenswrapper[4825]: I0310 08:16:49.493606 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-jv5dv" Mar 10 08:16:49 crc kubenswrapper[4825]: I0310 08:16:49.494652 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 10 08:16:49 crc kubenswrapper[4825]: I0310 08:16:49.540842 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16b77a07-02ee-4ec1-84ed-4d3e6203e93f-dns-svc\") pod \"dnsmasq-dns-78d6b6f8b7-qd4kd\" (UID: \"16b77a07-02ee-4ec1-84ed-4d3e6203e93f\") " pod="openstack/dnsmasq-dns-78d6b6f8b7-qd4kd" Mar 10 08:16:49 crc kubenswrapper[4825]: I0310 08:16:49.541613 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-wf6dh\" (UniqueName: \"kubernetes.io/projected/16b77a07-02ee-4ec1-84ed-4d3e6203e93f-kube-api-access-wf6dh\") pod \"dnsmasq-dns-78d6b6f8b7-qd4kd\" (UID: \"16b77a07-02ee-4ec1-84ed-4d3e6203e93f\") " pod="openstack/dnsmasq-dns-78d6b6f8b7-qd4kd" Mar 10 08:16:49 crc kubenswrapper[4825]: I0310 08:16:49.541855 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16b77a07-02ee-4ec1-84ed-4d3e6203e93f-config\") pod \"dnsmasq-dns-78d6b6f8b7-qd4kd\" (UID: \"16b77a07-02ee-4ec1-84ed-4d3e6203e93f\") " pod="openstack/dnsmasq-dns-78d6b6f8b7-qd4kd" Mar 10 08:16:49 crc kubenswrapper[4825]: I0310 08:16:49.541908 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/16b77a07-02ee-4ec1-84ed-4d3e6203e93f-ovsdbserver-nb\") pod \"dnsmasq-dns-78d6b6f8b7-qd4kd\" (UID: \"16b77a07-02ee-4ec1-84ed-4d3e6203e93f\") " pod="openstack/dnsmasq-dns-78d6b6f8b7-qd4kd" Mar 10 08:16:49 crc kubenswrapper[4825]: I0310 08:16:49.541928 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/16b77a07-02ee-4ec1-84ed-4d3e6203e93f-ovsdbserver-sb\") pod \"dnsmasq-dns-78d6b6f8b7-qd4kd\" (UID: \"16b77a07-02ee-4ec1-84ed-4d3e6203e93f\") " pod="openstack/dnsmasq-dns-78d6b6f8b7-qd4kd" Mar 10 08:16:49 crc kubenswrapper[4825]: I0310 08:16:49.557278 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-2sjk7"] Mar 10 08:16:49 crc kubenswrapper[4825]: I0310 08:16:49.644011 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm5qm\" (UniqueName: \"kubernetes.io/projected/3bd32bd9-5073-4d95-994e-a51836a51f67-kube-api-access-gm5qm\") pod \"placement-db-sync-2sjk7\" (UID: 
\"3bd32bd9-5073-4d95-994e-a51836a51f67\") " pod="openstack/placement-db-sync-2sjk7" Mar 10 08:16:49 crc kubenswrapper[4825]: I0310 08:16:49.644080 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16b77a07-02ee-4ec1-84ed-4d3e6203e93f-config\") pod \"dnsmasq-dns-78d6b6f8b7-qd4kd\" (UID: \"16b77a07-02ee-4ec1-84ed-4d3e6203e93f\") " pod="openstack/dnsmasq-dns-78d6b6f8b7-qd4kd" Mar 10 08:16:49 crc kubenswrapper[4825]: I0310 08:16:49.644194 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/16b77a07-02ee-4ec1-84ed-4d3e6203e93f-ovsdbserver-nb\") pod \"dnsmasq-dns-78d6b6f8b7-qd4kd\" (UID: \"16b77a07-02ee-4ec1-84ed-4d3e6203e93f\") " pod="openstack/dnsmasq-dns-78d6b6f8b7-qd4kd" Mar 10 08:16:49 crc kubenswrapper[4825]: I0310 08:16:49.644240 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/16b77a07-02ee-4ec1-84ed-4d3e6203e93f-ovsdbserver-sb\") pod \"dnsmasq-dns-78d6b6f8b7-qd4kd\" (UID: \"16b77a07-02ee-4ec1-84ed-4d3e6203e93f\") " pod="openstack/dnsmasq-dns-78d6b6f8b7-qd4kd" Mar 10 08:16:49 crc kubenswrapper[4825]: I0310 08:16:49.644484 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bd32bd9-5073-4d95-994e-a51836a51f67-logs\") pod \"placement-db-sync-2sjk7\" (UID: \"3bd32bd9-5073-4d95-994e-a51836a51f67\") " pod="openstack/placement-db-sync-2sjk7" Mar 10 08:16:49 crc kubenswrapper[4825]: I0310 08:16:49.644511 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bd32bd9-5073-4d95-994e-a51836a51f67-combined-ca-bundle\") pod \"placement-db-sync-2sjk7\" (UID: \"3bd32bd9-5073-4d95-994e-a51836a51f67\") " 
pod="openstack/placement-db-sync-2sjk7" Mar 10 08:16:49 crc kubenswrapper[4825]: I0310 08:16:49.644629 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16b77a07-02ee-4ec1-84ed-4d3e6203e93f-dns-svc\") pod \"dnsmasq-dns-78d6b6f8b7-qd4kd\" (UID: \"16b77a07-02ee-4ec1-84ed-4d3e6203e93f\") " pod="openstack/dnsmasq-dns-78d6b6f8b7-qd4kd" Mar 10 08:16:49 crc kubenswrapper[4825]: I0310 08:16:49.644654 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bd32bd9-5073-4d95-994e-a51836a51f67-scripts\") pod \"placement-db-sync-2sjk7\" (UID: \"3bd32bd9-5073-4d95-994e-a51836a51f67\") " pod="openstack/placement-db-sync-2sjk7" Mar 10 08:16:49 crc kubenswrapper[4825]: I0310 08:16:49.644680 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wf6dh\" (UniqueName: \"kubernetes.io/projected/16b77a07-02ee-4ec1-84ed-4d3e6203e93f-kube-api-access-wf6dh\") pod \"dnsmasq-dns-78d6b6f8b7-qd4kd\" (UID: \"16b77a07-02ee-4ec1-84ed-4d3e6203e93f\") " pod="openstack/dnsmasq-dns-78d6b6f8b7-qd4kd" Mar 10 08:16:49 crc kubenswrapper[4825]: I0310 08:16:49.644696 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bd32bd9-5073-4d95-994e-a51836a51f67-config-data\") pod \"placement-db-sync-2sjk7\" (UID: \"3bd32bd9-5073-4d95-994e-a51836a51f67\") " pod="openstack/placement-db-sync-2sjk7" Mar 10 08:16:49 crc kubenswrapper[4825]: I0310 08:16:49.644999 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/16b77a07-02ee-4ec1-84ed-4d3e6203e93f-ovsdbserver-nb\") pod \"dnsmasq-dns-78d6b6f8b7-qd4kd\" (UID: \"16b77a07-02ee-4ec1-84ed-4d3e6203e93f\") " pod="openstack/dnsmasq-dns-78d6b6f8b7-qd4kd" Mar 10 08:16:49 crc 
kubenswrapper[4825]: I0310 08:16:49.645065 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16b77a07-02ee-4ec1-84ed-4d3e6203e93f-config\") pod \"dnsmasq-dns-78d6b6f8b7-qd4kd\" (UID: \"16b77a07-02ee-4ec1-84ed-4d3e6203e93f\") " pod="openstack/dnsmasq-dns-78d6b6f8b7-qd4kd" Mar 10 08:16:49 crc kubenswrapper[4825]: I0310 08:16:49.645155 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/16b77a07-02ee-4ec1-84ed-4d3e6203e93f-ovsdbserver-sb\") pod \"dnsmasq-dns-78d6b6f8b7-qd4kd\" (UID: \"16b77a07-02ee-4ec1-84ed-4d3e6203e93f\") " pod="openstack/dnsmasq-dns-78d6b6f8b7-qd4kd" Mar 10 08:16:49 crc kubenswrapper[4825]: I0310 08:16:49.645379 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16b77a07-02ee-4ec1-84ed-4d3e6203e93f-dns-svc\") pod \"dnsmasq-dns-78d6b6f8b7-qd4kd\" (UID: \"16b77a07-02ee-4ec1-84ed-4d3e6203e93f\") " pod="openstack/dnsmasq-dns-78d6b6f8b7-qd4kd" Mar 10 08:16:49 crc kubenswrapper[4825]: I0310 08:16:49.668022 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf6dh\" (UniqueName: \"kubernetes.io/projected/16b77a07-02ee-4ec1-84ed-4d3e6203e93f-kube-api-access-wf6dh\") pod \"dnsmasq-dns-78d6b6f8b7-qd4kd\" (UID: \"16b77a07-02ee-4ec1-84ed-4d3e6203e93f\") " pod="openstack/dnsmasq-dns-78d6b6f8b7-qd4kd" Mar 10 08:16:49 crc kubenswrapper[4825]: I0310 08:16:49.746343 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gm5qm\" (UniqueName: \"kubernetes.io/projected/3bd32bd9-5073-4d95-994e-a51836a51f67-kube-api-access-gm5qm\") pod \"placement-db-sync-2sjk7\" (UID: \"3bd32bd9-5073-4d95-994e-a51836a51f67\") " pod="openstack/placement-db-sync-2sjk7" Mar 10 08:16:49 crc kubenswrapper[4825]: I0310 08:16:49.746499 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bd32bd9-5073-4d95-994e-a51836a51f67-logs\") pod \"placement-db-sync-2sjk7\" (UID: \"3bd32bd9-5073-4d95-994e-a51836a51f67\") " pod="openstack/placement-db-sync-2sjk7" Mar 10 08:16:49 crc kubenswrapper[4825]: I0310 08:16:49.746527 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bd32bd9-5073-4d95-994e-a51836a51f67-combined-ca-bundle\") pod \"placement-db-sync-2sjk7\" (UID: \"3bd32bd9-5073-4d95-994e-a51836a51f67\") " pod="openstack/placement-db-sync-2sjk7" Mar 10 08:16:49 crc kubenswrapper[4825]: I0310 08:16:49.746584 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bd32bd9-5073-4d95-994e-a51836a51f67-scripts\") pod \"placement-db-sync-2sjk7\" (UID: \"3bd32bd9-5073-4d95-994e-a51836a51f67\") " pod="openstack/placement-db-sync-2sjk7" Mar 10 08:16:49 crc kubenswrapper[4825]: I0310 08:16:49.746613 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bd32bd9-5073-4d95-994e-a51836a51f67-config-data\") pod \"placement-db-sync-2sjk7\" (UID: \"3bd32bd9-5073-4d95-994e-a51836a51f67\") " pod="openstack/placement-db-sync-2sjk7" Mar 10 08:16:49 crc kubenswrapper[4825]: I0310 08:16:49.747264 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bd32bd9-5073-4d95-994e-a51836a51f67-logs\") pod \"placement-db-sync-2sjk7\" (UID: \"3bd32bd9-5073-4d95-994e-a51836a51f67\") " pod="openstack/placement-db-sync-2sjk7" Mar 10 08:16:49 crc kubenswrapper[4825]: I0310 08:16:49.751621 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78d6b6f8b7-qd4kd" Mar 10 08:16:49 crc kubenswrapper[4825]: I0310 08:16:49.752069 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bd32bd9-5073-4d95-994e-a51836a51f67-config-data\") pod \"placement-db-sync-2sjk7\" (UID: \"3bd32bd9-5073-4d95-994e-a51836a51f67\") " pod="openstack/placement-db-sync-2sjk7" Mar 10 08:16:49 crc kubenswrapper[4825]: I0310 08:16:49.753597 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bd32bd9-5073-4d95-994e-a51836a51f67-combined-ca-bundle\") pod \"placement-db-sync-2sjk7\" (UID: \"3bd32bd9-5073-4d95-994e-a51836a51f67\") " pod="openstack/placement-db-sync-2sjk7" Mar 10 08:16:49 crc kubenswrapper[4825]: I0310 08:16:49.755483 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bd32bd9-5073-4d95-994e-a51836a51f67-scripts\") pod \"placement-db-sync-2sjk7\" (UID: \"3bd32bd9-5073-4d95-994e-a51836a51f67\") " pod="openstack/placement-db-sync-2sjk7" Mar 10 08:16:49 crc kubenswrapper[4825]: I0310 08:16:49.784963 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm5qm\" (UniqueName: \"kubernetes.io/projected/3bd32bd9-5073-4d95-994e-a51836a51f67-kube-api-access-gm5qm\") pod \"placement-db-sync-2sjk7\" (UID: \"3bd32bd9-5073-4d95-994e-a51836a51f67\") " pod="openstack/placement-db-sync-2sjk7" Mar 10 08:16:49 crc kubenswrapper[4825]: I0310 08:16:49.825405 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-2sjk7" Mar 10 08:16:50 crc kubenswrapper[4825]: I0310 08:16:50.308496 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-2sjk7"] Mar 10 08:16:50 crc kubenswrapper[4825]: I0310 08:16:50.386993 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78d6b6f8b7-qd4kd"] Mar 10 08:16:50 crc kubenswrapper[4825]: W0310 08:16:50.387386 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16b77a07_02ee_4ec1_84ed_4d3e6203e93f.slice/crio-aabf080d2c09eb7d1df52148dba26f40fdfb08a8a9213fe8cf07da622f7a83d4 WatchSource:0}: Error finding container aabf080d2c09eb7d1df52148dba26f40fdfb08a8a9213fe8cf07da622f7a83d4: Status 404 returned error can't find the container with id aabf080d2c09eb7d1df52148dba26f40fdfb08a8a9213fe8cf07da622f7a83d4 Mar 10 08:16:51 crc kubenswrapper[4825]: I0310 08:16:51.160622 4825 generic.go:334] "Generic (PLEG): container finished" podID="16b77a07-02ee-4ec1-84ed-4d3e6203e93f" containerID="15c04c00e06290a19ea8e58bd77f516c9c11dfeb666aa465e3177f13d7abdbbd" exitCode=0 Mar 10 08:16:51 crc kubenswrapper[4825]: I0310 08:16:51.160730 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78d6b6f8b7-qd4kd" event={"ID":"16b77a07-02ee-4ec1-84ed-4d3e6203e93f","Type":"ContainerDied","Data":"15c04c00e06290a19ea8e58bd77f516c9c11dfeb666aa465e3177f13d7abdbbd"} Mar 10 08:16:51 crc kubenswrapper[4825]: I0310 08:16:51.160981 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78d6b6f8b7-qd4kd" event={"ID":"16b77a07-02ee-4ec1-84ed-4d3e6203e93f","Type":"ContainerStarted","Data":"aabf080d2c09eb7d1df52148dba26f40fdfb08a8a9213fe8cf07da622f7a83d4"} Mar 10 08:16:51 crc kubenswrapper[4825]: I0310 08:16:51.162518 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2sjk7" 
event={"ID":"3bd32bd9-5073-4d95-994e-a51836a51f67","Type":"ContainerStarted","Data":"320b55a647a201f1957d05b6f95758bd1075b3d5b6566e5814f4d4caf5f8c7b8"} Mar 10 08:16:52 crc kubenswrapper[4825]: I0310 08:16:52.173341 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78d6b6f8b7-qd4kd" event={"ID":"16b77a07-02ee-4ec1-84ed-4d3e6203e93f","Type":"ContainerStarted","Data":"66688f1a6aa8b692f481651827bd05dff89f575dfe204812d61b8ef0ad3febfd"} Mar 10 08:16:52 crc kubenswrapper[4825]: I0310 08:16:52.173780 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78d6b6f8b7-qd4kd" Mar 10 08:16:54 crc kubenswrapper[4825]: I0310 08:16:54.190887 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2sjk7" event={"ID":"3bd32bd9-5073-4d95-994e-a51836a51f67","Type":"ContainerStarted","Data":"398bf2634549284d1287df5167b546153037e8d96bd52c1f6cbf3a716089f156"} Mar 10 08:16:54 crc kubenswrapper[4825]: I0310 08:16:54.217063 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78d6b6f8b7-qd4kd" podStartSLOduration=5.217045269 podStartE2EDuration="5.217045269s" podCreationTimestamp="2026-03-10 08:16:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:16:52.194666175 +0000 UTC m=+5565.224446800" watchObservedRunningTime="2026-03-10 08:16:54.217045269 +0000 UTC m=+5567.246825884" Mar 10 08:16:54 crc kubenswrapper[4825]: I0310 08:16:54.220805 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-2sjk7" podStartSLOduration=1.8168277819999998 podStartE2EDuration="5.220795578s" podCreationTimestamp="2026-03-10 08:16:49 +0000 UTC" firstStartedPulling="2026-03-10 08:16:50.312259338 +0000 UTC m=+5563.342039953" lastFinishedPulling="2026-03-10 08:16:53.716227114 +0000 UTC m=+5566.746007749" 
observedRunningTime="2026-03-10 08:16:54.211114073 +0000 UTC m=+5567.240894688" watchObservedRunningTime="2026-03-10 08:16:54.220795578 +0000 UTC m=+5567.250576193" Mar 10 08:16:55 crc kubenswrapper[4825]: I0310 08:16:55.199571 4825 generic.go:334] "Generic (PLEG): container finished" podID="3bd32bd9-5073-4d95-994e-a51836a51f67" containerID="398bf2634549284d1287df5167b546153037e8d96bd52c1f6cbf3a716089f156" exitCode=0 Mar 10 08:16:55 crc kubenswrapper[4825]: I0310 08:16:55.199621 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2sjk7" event={"ID":"3bd32bd9-5073-4d95-994e-a51836a51f67","Type":"ContainerDied","Data":"398bf2634549284d1287df5167b546153037e8d96bd52c1f6cbf3a716089f156"} Mar 10 08:16:56 crc kubenswrapper[4825]: I0310 08:16:56.581603 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2sjk7" Mar 10 08:16:56 crc kubenswrapper[4825]: I0310 08:16:56.683689 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gm5qm\" (UniqueName: \"kubernetes.io/projected/3bd32bd9-5073-4d95-994e-a51836a51f67-kube-api-access-gm5qm\") pod \"3bd32bd9-5073-4d95-994e-a51836a51f67\" (UID: \"3bd32bd9-5073-4d95-994e-a51836a51f67\") " Mar 10 08:16:56 crc kubenswrapper[4825]: I0310 08:16:56.684001 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bd32bd9-5073-4d95-994e-a51836a51f67-config-data\") pod \"3bd32bd9-5073-4d95-994e-a51836a51f67\" (UID: \"3bd32bd9-5073-4d95-994e-a51836a51f67\") " Mar 10 08:16:56 crc kubenswrapper[4825]: I0310 08:16:56.684259 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bd32bd9-5073-4d95-994e-a51836a51f67-scripts\") pod \"3bd32bd9-5073-4d95-994e-a51836a51f67\" (UID: \"3bd32bd9-5073-4d95-994e-a51836a51f67\") " Mar 10 08:16:56 crc 
kubenswrapper[4825]: I0310 08:16:56.684603 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bd32bd9-5073-4d95-994e-a51836a51f67-combined-ca-bundle\") pod \"3bd32bd9-5073-4d95-994e-a51836a51f67\" (UID: \"3bd32bd9-5073-4d95-994e-a51836a51f67\") " Mar 10 08:16:56 crc kubenswrapper[4825]: I0310 08:16:56.684717 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bd32bd9-5073-4d95-994e-a51836a51f67-logs\") pod \"3bd32bd9-5073-4d95-994e-a51836a51f67\" (UID: \"3bd32bd9-5073-4d95-994e-a51836a51f67\") " Mar 10 08:16:56 crc kubenswrapper[4825]: I0310 08:16:56.684923 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bd32bd9-5073-4d95-994e-a51836a51f67-logs" (OuterVolumeSpecName: "logs") pod "3bd32bd9-5073-4d95-994e-a51836a51f67" (UID: "3bd32bd9-5073-4d95-994e-a51836a51f67"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:16:56 crc kubenswrapper[4825]: I0310 08:16:56.685311 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bd32bd9-5073-4d95-994e-a51836a51f67-logs\") on node \"crc\" DevicePath \"\"" Mar 10 08:16:56 crc kubenswrapper[4825]: I0310 08:16:56.689088 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bd32bd9-5073-4d95-994e-a51836a51f67-scripts" (OuterVolumeSpecName: "scripts") pod "3bd32bd9-5073-4d95-994e-a51836a51f67" (UID: "3bd32bd9-5073-4d95-994e-a51836a51f67"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:16:56 crc kubenswrapper[4825]: I0310 08:16:56.690679 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bd32bd9-5073-4d95-994e-a51836a51f67-kube-api-access-gm5qm" (OuterVolumeSpecName: "kube-api-access-gm5qm") pod "3bd32bd9-5073-4d95-994e-a51836a51f67" (UID: "3bd32bd9-5073-4d95-994e-a51836a51f67"). InnerVolumeSpecName "kube-api-access-gm5qm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:16:56 crc kubenswrapper[4825]: I0310 08:16:56.726597 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bd32bd9-5073-4d95-994e-a51836a51f67-config-data" (OuterVolumeSpecName: "config-data") pod "3bd32bd9-5073-4d95-994e-a51836a51f67" (UID: "3bd32bd9-5073-4d95-994e-a51836a51f67"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:16:56 crc kubenswrapper[4825]: I0310 08:16:56.727091 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bd32bd9-5073-4d95-994e-a51836a51f67-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3bd32bd9-5073-4d95-994e-a51836a51f67" (UID: "3bd32bd9-5073-4d95-994e-a51836a51f67"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:16:56 crc kubenswrapper[4825]: I0310 08:16:56.786734 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bd32bd9-5073-4d95-994e-a51836a51f67-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:16:56 crc kubenswrapper[4825]: I0310 08:16:56.786767 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gm5qm\" (UniqueName: \"kubernetes.io/projected/3bd32bd9-5073-4d95-994e-a51836a51f67-kube-api-access-gm5qm\") on node \"crc\" DevicePath \"\"" Mar 10 08:16:56 crc kubenswrapper[4825]: I0310 08:16:56.786778 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bd32bd9-5073-4d95-994e-a51836a51f67-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 08:16:56 crc kubenswrapper[4825]: I0310 08:16:56.786787 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bd32bd9-5073-4d95-994e-a51836a51f67-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 08:16:57 crc kubenswrapper[4825]: I0310 08:16:57.219626 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2sjk7" event={"ID":"3bd32bd9-5073-4d95-994e-a51836a51f67","Type":"ContainerDied","Data":"320b55a647a201f1957d05b6f95758bd1075b3d5b6566e5814f4d4caf5f8c7b8"} Mar 10 08:16:57 crc kubenswrapper[4825]: I0310 08:16:57.219663 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="320b55a647a201f1957d05b6f95758bd1075b3d5b6566e5814f4d4caf5f8c7b8" Mar 10 08:16:57 crc kubenswrapper[4825]: I0310 08:16:57.219718 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-2sjk7" Mar 10 08:16:57 crc kubenswrapper[4825]: I0310 08:16:57.241485 4825 scope.go:117] "RemoveContainer" containerID="cc40bbe7e5127cd8ca03c26dd788c9bfc628fd248484a1f5f73946709b5b4414" Mar 10 08:16:57 crc kubenswrapper[4825]: E0310 08:16:57.241699 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:16:57 crc kubenswrapper[4825]: I0310 08:16:57.296184 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7bd6645db8-zxmsk"] Mar 10 08:16:57 crc kubenswrapper[4825]: E0310 08:16:57.296590 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bd32bd9-5073-4d95-994e-a51836a51f67" containerName="placement-db-sync" Mar 10 08:16:57 crc kubenswrapper[4825]: I0310 08:16:57.296609 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bd32bd9-5073-4d95-994e-a51836a51f67" containerName="placement-db-sync" Mar 10 08:16:57 crc kubenswrapper[4825]: I0310 08:16:57.296800 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bd32bd9-5073-4d95-994e-a51836a51f67" containerName="placement-db-sync" Mar 10 08:16:57 crc kubenswrapper[4825]: I0310 08:16:57.297871 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7bd6645db8-zxmsk" Mar 10 08:16:57 crc kubenswrapper[4825]: I0310 08:16:57.300068 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 10 08:16:57 crc kubenswrapper[4825]: I0310 08:16:57.301507 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 10 08:16:57 crc kubenswrapper[4825]: I0310 08:16:57.301963 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 10 08:16:57 crc kubenswrapper[4825]: I0310 08:16:57.301997 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-jv5dv" Mar 10 08:16:57 crc kubenswrapper[4825]: I0310 08:16:57.301968 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 10 08:16:57 crc kubenswrapper[4825]: I0310 08:16:57.316743 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7bd6645db8-zxmsk"] Mar 10 08:16:57 crc kubenswrapper[4825]: I0310 08:16:57.498419 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lwsw\" (UniqueName: \"kubernetes.io/projected/61eaa20d-9a8d-4edb-97c7-3d9ce430bab0-kube-api-access-4lwsw\") pod \"placement-7bd6645db8-zxmsk\" (UID: \"61eaa20d-9a8d-4edb-97c7-3d9ce430bab0\") " pod="openstack/placement-7bd6645db8-zxmsk" Mar 10 08:16:57 crc kubenswrapper[4825]: I0310 08:16:57.499475 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61eaa20d-9a8d-4edb-97c7-3d9ce430bab0-config-data\") pod \"placement-7bd6645db8-zxmsk\" (UID: \"61eaa20d-9a8d-4edb-97c7-3d9ce430bab0\") " pod="openstack/placement-7bd6645db8-zxmsk" Mar 10 08:16:57 crc kubenswrapper[4825]: I0310 08:16:57.499629 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/61eaa20d-9a8d-4edb-97c7-3d9ce430bab0-internal-tls-certs\") pod \"placement-7bd6645db8-zxmsk\" (UID: \"61eaa20d-9a8d-4edb-97c7-3d9ce430bab0\") " pod="openstack/placement-7bd6645db8-zxmsk" Mar 10 08:16:57 crc kubenswrapper[4825]: I0310 08:16:57.499782 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61eaa20d-9a8d-4edb-97c7-3d9ce430bab0-scripts\") pod \"placement-7bd6645db8-zxmsk\" (UID: \"61eaa20d-9a8d-4edb-97c7-3d9ce430bab0\") " pod="openstack/placement-7bd6645db8-zxmsk" Mar 10 08:16:57 crc kubenswrapper[4825]: I0310 08:16:57.499874 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61eaa20d-9a8d-4edb-97c7-3d9ce430bab0-logs\") pod \"placement-7bd6645db8-zxmsk\" (UID: \"61eaa20d-9a8d-4edb-97c7-3d9ce430bab0\") " pod="openstack/placement-7bd6645db8-zxmsk" Mar 10 08:16:57 crc kubenswrapper[4825]: I0310 08:16:57.499960 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61eaa20d-9a8d-4edb-97c7-3d9ce430bab0-combined-ca-bundle\") pod \"placement-7bd6645db8-zxmsk\" (UID: \"61eaa20d-9a8d-4edb-97c7-3d9ce430bab0\") " pod="openstack/placement-7bd6645db8-zxmsk" Mar 10 08:16:57 crc kubenswrapper[4825]: I0310 08:16:57.500076 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/61eaa20d-9a8d-4edb-97c7-3d9ce430bab0-public-tls-certs\") pod \"placement-7bd6645db8-zxmsk\" (UID: \"61eaa20d-9a8d-4edb-97c7-3d9ce430bab0\") " pod="openstack/placement-7bd6645db8-zxmsk" Mar 10 08:16:57 crc kubenswrapper[4825]: I0310 08:16:57.602034 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61eaa20d-9a8d-4edb-97c7-3d9ce430bab0-config-data\") pod \"placement-7bd6645db8-zxmsk\" (UID: \"61eaa20d-9a8d-4edb-97c7-3d9ce430bab0\") " pod="openstack/placement-7bd6645db8-zxmsk" Mar 10 08:16:57 crc kubenswrapper[4825]: I0310 08:16:57.602093 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/61eaa20d-9a8d-4edb-97c7-3d9ce430bab0-internal-tls-certs\") pod \"placement-7bd6645db8-zxmsk\" (UID: \"61eaa20d-9a8d-4edb-97c7-3d9ce430bab0\") " pod="openstack/placement-7bd6645db8-zxmsk" Mar 10 08:16:57 crc kubenswrapper[4825]: I0310 08:16:57.602149 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61eaa20d-9a8d-4edb-97c7-3d9ce430bab0-scripts\") pod \"placement-7bd6645db8-zxmsk\" (UID: \"61eaa20d-9a8d-4edb-97c7-3d9ce430bab0\") " pod="openstack/placement-7bd6645db8-zxmsk" Mar 10 08:16:57 crc kubenswrapper[4825]: I0310 08:16:57.602172 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61eaa20d-9a8d-4edb-97c7-3d9ce430bab0-logs\") pod \"placement-7bd6645db8-zxmsk\" (UID: \"61eaa20d-9a8d-4edb-97c7-3d9ce430bab0\") " pod="openstack/placement-7bd6645db8-zxmsk" Mar 10 08:16:57 crc kubenswrapper[4825]: I0310 08:16:57.602211 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61eaa20d-9a8d-4edb-97c7-3d9ce430bab0-combined-ca-bundle\") pod \"placement-7bd6645db8-zxmsk\" (UID: \"61eaa20d-9a8d-4edb-97c7-3d9ce430bab0\") " pod="openstack/placement-7bd6645db8-zxmsk" Mar 10 08:16:57 crc kubenswrapper[4825]: I0310 08:16:57.602251 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/61eaa20d-9a8d-4edb-97c7-3d9ce430bab0-public-tls-certs\") pod \"placement-7bd6645db8-zxmsk\" (UID: \"61eaa20d-9a8d-4edb-97c7-3d9ce430bab0\") " pod="openstack/placement-7bd6645db8-zxmsk" Mar 10 08:16:57 crc kubenswrapper[4825]: I0310 08:16:57.602316 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lwsw\" (UniqueName: \"kubernetes.io/projected/61eaa20d-9a8d-4edb-97c7-3d9ce430bab0-kube-api-access-4lwsw\") pod \"placement-7bd6645db8-zxmsk\" (UID: \"61eaa20d-9a8d-4edb-97c7-3d9ce430bab0\") " pod="openstack/placement-7bd6645db8-zxmsk" Mar 10 08:16:57 crc kubenswrapper[4825]: I0310 08:16:57.603437 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61eaa20d-9a8d-4edb-97c7-3d9ce430bab0-logs\") pod \"placement-7bd6645db8-zxmsk\" (UID: \"61eaa20d-9a8d-4edb-97c7-3d9ce430bab0\") " pod="openstack/placement-7bd6645db8-zxmsk" Mar 10 08:16:57 crc kubenswrapper[4825]: I0310 08:16:57.607083 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/61eaa20d-9a8d-4edb-97c7-3d9ce430bab0-public-tls-certs\") pod \"placement-7bd6645db8-zxmsk\" (UID: \"61eaa20d-9a8d-4edb-97c7-3d9ce430bab0\") " pod="openstack/placement-7bd6645db8-zxmsk" Mar 10 08:16:57 crc kubenswrapper[4825]: I0310 08:16:57.607237 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61eaa20d-9a8d-4edb-97c7-3d9ce430bab0-combined-ca-bundle\") pod \"placement-7bd6645db8-zxmsk\" (UID: \"61eaa20d-9a8d-4edb-97c7-3d9ce430bab0\") " pod="openstack/placement-7bd6645db8-zxmsk" Mar 10 08:16:57 crc kubenswrapper[4825]: I0310 08:16:57.607633 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/61eaa20d-9a8d-4edb-97c7-3d9ce430bab0-internal-tls-certs\") pod 
\"placement-7bd6645db8-zxmsk\" (UID: \"61eaa20d-9a8d-4edb-97c7-3d9ce430bab0\") " pod="openstack/placement-7bd6645db8-zxmsk" Mar 10 08:16:57 crc kubenswrapper[4825]: I0310 08:16:57.608905 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61eaa20d-9a8d-4edb-97c7-3d9ce430bab0-scripts\") pod \"placement-7bd6645db8-zxmsk\" (UID: \"61eaa20d-9a8d-4edb-97c7-3d9ce430bab0\") " pod="openstack/placement-7bd6645db8-zxmsk" Mar 10 08:16:57 crc kubenswrapper[4825]: I0310 08:16:57.613086 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61eaa20d-9a8d-4edb-97c7-3d9ce430bab0-config-data\") pod \"placement-7bd6645db8-zxmsk\" (UID: \"61eaa20d-9a8d-4edb-97c7-3d9ce430bab0\") " pod="openstack/placement-7bd6645db8-zxmsk" Mar 10 08:16:57 crc kubenswrapper[4825]: I0310 08:16:57.621343 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lwsw\" (UniqueName: \"kubernetes.io/projected/61eaa20d-9a8d-4edb-97c7-3d9ce430bab0-kube-api-access-4lwsw\") pod \"placement-7bd6645db8-zxmsk\" (UID: \"61eaa20d-9a8d-4edb-97c7-3d9ce430bab0\") " pod="openstack/placement-7bd6645db8-zxmsk" Mar 10 08:16:57 crc kubenswrapper[4825]: I0310 08:16:57.917009 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7bd6645db8-zxmsk" Mar 10 08:16:58 crc kubenswrapper[4825]: I0310 08:16:58.364556 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7bd6645db8-zxmsk"] Mar 10 08:16:58 crc kubenswrapper[4825]: W0310 08:16:58.373484 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61eaa20d_9a8d_4edb_97c7_3d9ce430bab0.slice/crio-74a7e426bd30d4aa4736f4e7bdaf6d5fa7af07176d46daae165341c0f80bd33a WatchSource:0}: Error finding container 74a7e426bd30d4aa4736f4e7bdaf6d5fa7af07176d46daae165341c0f80bd33a: Status 404 returned error can't find the container with id 74a7e426bd30d4aa4736f4e7bdaf6d5fa7af07176d46daae165341c0f80bd33a Mar 10 08:16:59 crc kubenswrapper[4825]: I0310 08:16:59.246657 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7bd6645db8-zxmsk" event={"ID":"61eaa20d-9a8d-4edb-97c7-3d9ce430bab0","Type":"ContainerStarted","Data":"cf1c2b802a0ea35c85ffcc095621bf4bb0a7a20f37605840849d68a722dd59d5"} Mar 10 08:16:59 crc kubenswrapper[4825]: I0310 08:16:59.246957 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7bd6645db8-zxmsk" event={"ID":"61eaa20d-9a8d-4edb-97c7-3d9ce430bab0","Type":"ContainerStarted","Data":"e74d6fb557c33b9ca6abb801cf4aa8fda1430a87277ab735fc278627d3ad00cd"} Mar 10 08:16:59 crc kubenswrapper[4825]: I0310 08:16:59.247968 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7bd6645db8-zxmsk" event={"ID":"61eaa20d-9a8d-4edb-97c7-3d9ce430bab0","Type":"ContainerStarted","Data":"74a7e426bd30d4aa4736f4e7bdaf6d5fa7af07176d46daae165341c0f80bd33a"} Mar 10 08:16:59 crc kubenswrapper[4825]: I0310 08:16:59.295674 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7bd6645db8-zxmsk" podStartSLOduration=2.295653606 podStartE2EDuration="2.295653606s" podCreationTimestamp="2026-03-10 08:16:57 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:16:59.289874044 +0000 UTC m=+5572.319654679" watchObservedRunningTime="2026-03-10 08:16:59.295653606 +0000 UTC m=+5572.325434231" Mar 10 08:16:59 crc kubenswrapper[4825]: I0310 08:16:59.753280 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78d6b6f8b7-qd4kd" Mar 10 08:16:59 crc kubenswrapper[4825]: I0310 08:16:59.806656 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d95445669-fw682"] Mar 10 08:16:59 crc kubenswrapper[4825]: I0310 08:16:59.806929 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d95445669-fw682" podUID="5d04c43e-2981-4964-9545-3c8b6e6fc729" containerName="dnsmasq-dns" containerID="cri-o://036c536c50bcd54816ef85eb5048b562a0550d4afc523cd440acef00376c965c" gracePeriod=10 Mar 10 08:17:00 crc kubenswrapper[4825]: I0310 08:17:00.252287 4825 generic.go:334] "Generic (PLEG): container finished" podID="5d04c43e-2981-4964-9545-3c8b6e6fc729" containerID="036c536c50bcd54816ef85eb5048b562a0550d4afc523cd440acef00376c965c" exitCode=0 Mar 10 08:17:00 crc kubenswrapper[4825]: I0310 08:17:00.252516 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d95445669-fw682" event={"ID":"5d04c43e-2981-4964-9545-3c8b6e6fc729","Type":"ContainerDied","Data":"036c536c50bcd54816ef85eb5048b562a0550d4afc523cd440acef00376c965c"} Mar 10 08:17:00 crc kubenswrapper[4825]: I0310 08:17:00.252656 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d95445669-fw682" event={"ID":"5d04c43e-2981-4964-9545-3c8b6e6fc729","Type":"ContainerDied","Data":"07904c34be26099a86b01693aeb56bb01ae0f28efa0de06157321006dade7211"} Mar 10 08:17:00 crc kubenswrapper[4825]: I0310 08:17:00.252672 4825 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="07904c34be26099a86b01693aeb56bb01ae0f28efa0de06157321006dade7211" Mar 10 08:17:00 crc kubenswrapper[4825]: I0310 08:17:00.252849 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7bd6645db8-zxmsk" Mar 10 08:17:00 crc kubenswrapper[4825]: I0310 08:17:00.252870 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7bd6645db8-zxmsk" Mar 10 08:17:00 crc kubenswrapper[4825]: I0310 08:17:00.319493 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d95445669-fw682" Mar 10 08:17:00 crc kubenswrapper[4825]: I0310 08:17:00.465724 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnw9b\" (UniqueName: \"kubernetes.io/projected/5d04c43e-2981-4964-9545-3c8b6e6fc729-kube-api-access-bnw9b\") pod \"5d04c43e-2981-4964-9545-3c8b6e6fc729\" (UID: \"5d04c43e-2981-4964-9545-3c8b6e6fc729\") " Mar 10 08:17:00 crc kubenswrapper[4825]: I0310 08:17:00.465995 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d04c43e-2981-4964-9545-3c8b6e6fc729-dns-svc\") pod \"5d04c43e-2981-4964-9545-3c8b6e6fc729\" (UID: \"5d04c43e-2981-4964-9545-3c8b6e6fc729\") " Mar 10 08:17:00 crc kubenswrapper[4825]: I0310 08:17:00.466311 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5d04c43e-2981-4964-9545-3c8b6e6fc729-ovsdbserver-nb\") pod \"5d04c43e-2981-4964-9545-3c8b6e6fc729\" (UID: \"5d04c43e-2981-4964-9545-3c8b6e6fc729\") " Mar 10 08:17:00 crc kubenswrapper[4825]: I0310 08:17:00.466421 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d04c43e-2981-4964-9545-3c8b6e6fc729-config\") pod \"5d04c43e-2981-4964-9545-3c8b6e6fc729\" (UID: 
\"5d04c43e-2981-4964-9545-3c8b6e6fc729\") " Mar 10 08:17:00 crc kubenswrapper[4825]: I0310 08:17:00.466518 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5d04c43e-2981-4964-9545-3c8b6e6fc729-ovsdbserver-sb\") pod \"5d04c43e-2981-4964-9545-3c8b6e6fc729\" (UID: \"5d04c43e-2981-4964-9545-3c8b6e6fc729\") " Mar 10 08:17:00 crc kubenswrapper[4825]: I0310 08:17:00.474564 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d04c43e-2981-4964-9545-3c8b6e6fc729-kube-api-access-bnw9b" (OuterVolumeSpecName: "kube-api-access-bnw9b") pod "5d04c43e-2981-4964-9545-3c8b6e6fc729" (UID: "5d04c43e-2981-4964-9545-3c8b6e6fc729"). InnerVolumeSpecName "kube-api-access-bnw9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:17:00 crc kubenswrapper[4825]: I0310 08:17:00.508939 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d04c43e-2981-4964-9545-3c8b6e6fc729-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5d04c43e-2981-4964-9545-3c8b6e6fc729" (UID: "5d04c43e-2981-4964-9545-3c8b6e6fc729"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:17:00 crc kubenswrapper[4825]: I0310 08:17:00.508952 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d04c43e-2981-4964-9545-3c8b6e6fc729-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5d04c43e-2981-4964-9545-3c8b6e6fc729" (UID: "5d04c43e-2981-4964-9545-3c8b6e6fc729"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:17:00 crc kubenswrapper[4825]: I0310 08:17:00.518356 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d04c43e-2981-4964-9545-3c8b6e6fc729-config" (OuterVolumeSpecName: "config") pod "5d04c43e-2981-4964-9545-3c8b6e6fc729" (UID: "5d04c43e-2981-4964-9545-3c8b6e6fc729"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:17:00 crc kubenswrapper[4825]: I0310 08:17:00.524488 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d04c43e-2981-4964-9545-3c8b6e6fc729-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5d04c43e-2981-4964-9545-3c8b6e6fc729" (UID: "5d04c43e-2981-4964-9545-3c8b6e6fc729"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:17:00 crc kubenswrapper[4825]: I0310 08:17:00.568859 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d04c43e-2981-4964-9545-3c8b6e6fc729-config\") on node \"crc\" DevicePath \"\"" Mar 10 08:17:00 crc kubenswrapper[4825]: I0310 08:17:00.569077 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5d04c43e-2981-4964-9545-3c8b6e6fc729-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 08:17:00 crc kubenswrapper[4825]: I0310 08:17:00.569154 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnw9b\" (UniqueName: \"kubernetes.io/projected/5d04c43e-2981-4964-9545-3c8b6e6fc729-kube-api-access-bnw9b\") on node \"crc\" DevicePath \"\"" Mar 10 08:17:00 crc kubenswrapper[4825]: I0310 08:17:00.569236 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d04c43e-2981-4964-9545-3c8b6e6fc729-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 08:17:00 crc kubenswrapper[4825]: I0310 
08:17:00.569311 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5d04c43e-2981-4964-9545-3c8b6e6fc729-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 08:17:01 crc kubenswrapper[4825]: I0310 08:17:01.258445 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d95445669-fw682" Mar 10 08:17:01 crc kubenswrapper[4825]: I0310 08:17:01.283642 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d95445669-fw682"] Mar 10 08:17:01 crc kubenswrapper[4825]: I0310 08:17:01.291753 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d95445669-fw682"] Mar 10 08:17:03 crc kubenswrapper[4825]: I0310 08:17:03.250539 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d04c43e-2981-4964-9545-3c8b6e6fc729" path="/var/lib/kubelet/pods/5d04c43e-2981-4964-9545-3c8b6e6fc729/volumes" Mar 10 08:17:09 crc kubenswrapper[4825]: I0310 08:17:09.241744 4825 scope.go:117] "RemoveContainer" containerID="cc40bbe7e5127cd8ca03c26dd788c9bfc628fd248484a1f5f73946709b5b4414" Mar 10 08:17:09 crc kubenswrapper[4825]: E0310 08:17:09.242489 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:17:20 crc kubenswrapper[4825]: I0310 08:17:20.236707 4825 scope.go:117] "RemoveContainer" containerID="cc40bbe7e5127cd8ca03c26dd788c9bfc628fd248484a1f5f73946709b5b4414" Mar 10 08:17:20 crc kubenswrapper[4825]: E0310 08:17:20.237523 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:17:28 crc kubenswrapper[4825]: I0310 08:17:28.869544 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7bd6645db8-zxmsk" Mar 10 08:17:28 crc kubenswrapper[4825]: I0310 08:17:28.882762 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7bd6645db8-zxmsk" Mar 10 08:17:31 crc kubenswrapper[4825]: I0310 08:17:31.237241 4825 scope.go:117] "RemoveContainer" containerID="cc40bbe7e5127cd8ca03c26dd788c9bfc628fd248484a1f5f73946709b5b4414" Mar 10 08:17:31 crc kubenswrapper[4825]: E0310 08:17:31.238917 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:17:44 crc kubenswrapper[4825]: I0310 08:17:44.236394 4825 scope.go:117] "RemoveContainer" containerID="cc40bbe7e5127cd8ca03c26dd788c9bfc628fd248484a1f5f73946709b5b4414" Mar 10 08:17:44 crc kubenswrapper[4825]: E0310 08:17:44.237476 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" 
podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.112182 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-mng4f"] Mar 10 08:17:53 crc kubenswrapper[4825]: E0310 08:17:53.112962 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d04c43e-2981-4964-9545-3c8b6e6fc729" containerName="init" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.112974 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d04c43e-2981-4964-9545-3c8b6e6fc729" containerName="init" Mar 10 08:17:53 crc kubenswrapper[4825]: E0310 08:17:53.112999 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d04c43e-2981-4964-9545-3c8b6e6fc729" containerName="dnsmasq-dns" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.113005 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d04c43e-2981-4964-9545-3c8b6e6fc729" containerName="dnsmasq-dns" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.113190 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d04c43e-2981-4964-9545-3c8b6e6fc729" containerName="dnsmasq-dns" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.114009 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-mng4f" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.131848 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-mng4f"] Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.205464 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-dtd22"] Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.206739 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-dtd22" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.214158 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-9ac4-account-create-update-fms75"] Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.215185 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-9ac4-account-create-update-fms75" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.217368 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.224830 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-dtd22"] Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.235093 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-9ac4-account-create-update-fms75"] Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.296863 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bba32ca3-b10c-41ec-9154-65c3dfd60c48-operator-scripts\") pod \"nova-api-db-create-mng4f\" (UID: \"bba32ca3-b10c-41ec-9154-65c3dfd60c48\") " pod="openstack/nova-api-db-create-mng4f" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.296960 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rqbd\" (UniqueName: \"kubernetes.io/projected/bba32ca3-b10c-41ec-9154-65c3dfd60c48-kube-api-access-8rqbd\") pod \"nova-api-db-create-mng4f\" (UID: \"bba32ca3-b10c-41ec-9154-65c3dfd60c48\") " pod="openstack/nova-api-db-create-mng4f" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.398396 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/eab59002-2587-4875-acb8-52330470fc45-operator-scripts\") pod \"nova-api-9ac4-account-create-update-fms75\" (UID: \"eab59002-2587-4875-acb8-52330470fc45\") " pod="openstack/nova-api-9ac4-account-create-update-fms75" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.398455 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bba32ca3-b10c-41ec-9154-65c3dfd60c48-operator-scripts\") pod \"nova-api-db-create-mng4f\" (UID: \"bba32ca3-b10c-41ec-9154-65c3dfd60c48\") " pod="openstack/nova-api-db-create-mng4f" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.398517 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr46j\" (UniqueName: \"kubernetes.io/projected/f1172cce-7e11-4cd0-8c82-f4baa5a15bcc-kube-api-access-fr46j\") pod \"nova-cell0-db-create-dtd22\" (UID: \"f1172cce-7e11-4cd0-8c82-f4baa5a15bcc\") " pod="openstack/nova-cell0-db-create-dtd22" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.398576 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1172cce-7e11-4cd0-8c82-f4baa5a15bcc-operator-scripts\") pod \"nova-cell0-db-create-dtd22\" (UID: \"f1172cce-7e11-4cd0-8c82-f4baa5a15bcc\") " pod="openstack/nova-cell0-db-create-dtd22" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.398594 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt7hk\" (UniqueName: \"kubernetes.io/projected/eab59002-2587-4875-acb8-52330470fc45-kube-api-access-pt7hk\") pod \"nova-api-9ac4-account-create-update-fms75\" (UID: \"eab59002-2587-4875-acb8-52330470fc45\") " pod="openstack/nova-api-9ac4-account-create-update-fms75" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.398619 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8rqbd\" (UniqueName: \"kubernetes.io/projected/bba32ca3-b10c-41ec-9154-65c3dfd60c48-kube-api-access-8rqbd\") pod \"nova-api-db-create-mng4f\" (UID: \"bba32ca3-b10c-41ec-9154-65c3dfd60c48\") " pod="openstack/nova-api-db-create-mng4f" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.399484 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bba32ca3-b10c-41ec-9154-65c3dfd60c48-operator-scripts\") pod \"nova-api-db-create-mng4f\" (UID: \"bba32ca3-b10c-41ec-9154-65c3dfd60c48\") " pod="openstack/nova-api-db-create-mng4f" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.409706 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-k7f4d"] Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.410932 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-k7f4d" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.420911 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rqbd\" (UniqueName: \"kubernetes.io/projected/bba32ca3-b10c-41ec-9154-65c3dfd60c48-kube-api-access-8rqbd\") pod \"nova-api-db-create-mng4f\" (UID: \"bba32ca3-b10c-41ec-9154-65c3dfd60c48\") " pod="openstack/nova-api-db-create-mng4f" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.422128 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-k7f4d"] Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.437070 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-mng4f" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.438506 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-655f-account-create-update-wpvkq"] Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.439597 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-655f-account-create-update-wpvkq" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.442609 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.446269 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-655f-account-create-update-wpvkq"] Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.500140 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eab59002-2587-4875-acb8-52330470fc45-operator-scripts\") pod \"nova-api-9ac4-account-create-update-fms75\" (UID: \"eab59002-2587-4875-acb8-52330470fc45\") " pod="openstack/nova-api-9ac4-account-create-update-fms75" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.500232 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fr46j\" (UniqueName: \"kubernetes.io/projected/f1172cce-7e11-4cd0-8c82-f4baa5a15bcc-kube-api-access-fr46j\") pod \"nova-cell0-db-create-dtd22\" (UID: \"f1172cce-7e11-4cd0-8c82-f4baa5a15bcc\") " pod="openstack/nova-cell0-db-create-dtd22" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.500257 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1172cce-7e11-4cd0-8c82-f4baa5a15bcc-operator-scripts\") pod \"nova-cell0-db-create-dtd22\" (UID: \"f1172cce-7e11-4cd0-8c82-f4baa5a15bcc\") " pod="openstack/nova-cell0-db-create-dtd22" Mar 10 
08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.500273 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt7hk\" (UniqueName: \"kubernetes.io/projected/eab59002-2587-4875-acb8-52330470fc45-kube-api-access-pt7hk\") pod \"nova-api-9ac4-account-create-update-fms75\" (UID: \"eab59002-2587-4875-acb8-52330470fc45\") " pod="openstack/nova-api-9ac4-account-create-update-fms75" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.501218 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eab59002-2587-4875-acb8-52330470fc45-operator-scripts\") pod \"nova-api-9ac4-account-create-update-fms75\" (UID: \"eab59002-2587-4875-acb8-52330470fc45\") " pod="openstack/nova-api-9ac4-account-create-update-fms75" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.501270 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1172cce-7e11-4cd0-8c82-f4baa5a15bcc-operator-scripts\") pod \"nova-cell0-db-create-dtd22\" (UID: \"f1172cce-7e11-4cd0-8c82-f4baa5a15bcc\") " pod="openstack/nova-cell0-db-create-dtd22" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.523356 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt7hk\" (UniqueName: \"kubernetes.io/projected/eab59002-2587-4875-acb8-52330470fc45-kube-api-access-pt7hk\") pod \"nova-api-9ac4-account-create-update-fms75\" (UID: \"eab59002-2587-4875-acb8-52330470fc45\") " pod="openstack/nova-api-9ac4-account-create-update-fms75" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.524016 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fr46j\" (UniqueName: \"kubernetes.io/projected/f1172cce-7e11-4cd0-8c82-f4baa5a15bcc-kube-api-access-fr46j\") pod \"nova-cell0-db-create-dtd22\" (UID: \"f1172cce-7e11-4cd0-8c82-f4baa5a15bcc\") " 
pod="openstack/nova-cell0-db-create-dtd22" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.527614 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-dtd22" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.546907 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-9ac4-account-create-update-fms75" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.602376 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vn69\" (UniqueName: \"kubernetes.io/projected/5f00f896-c7ba-4380-9b97-da67e92c1741-kube-api-access-2vn69\") pod \"nova-cell1-db-create-k7f4d\" (UID: \"5f00f896-c7ba-4380-9b97-da67e92c1741\") " pod="openstack/nova-cell1-db-create-k7f4d" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.602430 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kldc\" (UniqueName: \"kubernetes.io/projected/29f630f8-cffd-4cd9-8ad6-42b5647e57a6-kube-api-access-9kldc\") pod \"nova-cell0-655f-account-create-update-wpvkq\" (UID: \"29f630f8-cffd-4cd9-8ad6-42b5647e57a6\") " pod="openstack/nova-cell0-655f-account-create-update-wpvkq" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.602462 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29f630f8-cffd-4cd9-8ad6-42b5647e57a6-operator-scripts\") pod \"nova-cell0-655f-account-create-update-wpvkq\" (UID: \"29f630f8-cffd-4cd9-8ad6-42b5647e57a6\") " pod="openstack/nova-cell0-655f-account-create-update-wpvkq" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.602527 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/5f00f896-c7ba-4380-9b97-da67e92c1741-operator-scripts\") pod \"nova-cell1-db-create-k7f4d\" (UID: \"5f00f896-c7ba-4380-9b97-da67e92c1741\") " pod="openstack/nova-cell1-db-create-k7f4d" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.634967 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-30de-account-create-update-wgghb"] Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.636102 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-30de-account-create-update-wgghb" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.641003 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.666759 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-30de-account-create-update-wgghb"] Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.704555 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vn69\" (UniqueName: \"kubernetes.io/projected/5f00f896-c7ba-4380-9b97-da67e92c1741-kube-api-access-2vn69\") pod \"nova-cell1-db-create-k7f4d\" (UID: \"5f00f896-c7ba-4380-9b97-da67e92c1741\") " pod="openstack/nova-cell1-db-create-k7f4d" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.704667 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kldc\" (UniqueName: \"kubernetes.io/projected/29f630f8-cffd-4cd9-8ad6-42b5647e57a6-kube-api-access-9kldc\") pod \"nova-cell0-655f-account-create-update-wpvkq\" (UID: \"29f630f8-cffd-4cd9-8ad6-42b5647e57a6\") " pod="openstack/nova-cell0-655f-account-create-update-wpvkq" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.704742 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/29f630f8-cffd-4cd9-8ad6-42b5647e57a6-operator-scripts\") pod \"nova-cell0-655f-account-create-update-wpvkq\" (UID: \"29f630f8-cffd-4cd9-8ad6-42b5647e57a6\") " pod="openstack/nova-cell0-655f-account-create-update-wpvkq" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.704811 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f00f896-c7ba-4380-9b97-da67e92c1741-operator-scripts\") pod \"nova-cell1-db-create-k7f4d\" (UID: \"5f00f896-c7ba-4380-9b97-da67e92c1741\") " pod="openstack/nova-cell1-db-create-k7f4d" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.705813 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29f630f8-cffd-4cd9-8ad6-42b5647e57a6-operator-scripts\") pod \"nova-cell0-655f-account-create-update-wpvkq\" (UID: \"29f630f8-cffd-4cd9-8ad6-42b5647e57a6\") " pod="openstack/nova-cell0-655f-account-create-update-wpvkq" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.705876 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f00f896-c7ba-4380-9b97-da67e92c1741-operator-scripts\") pod \"nova-cell1-db-create-k7f4d\" (UID: \"5f00f896-c7ba-4380-9b97-da67e92c1741\") " pod="openstack/nova-cell1-db-create-k7f4d" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.723566 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vn69\" (UniqueName: \"kubernetes.io/projected/5f00f896-c7ba-4380-9b97-da67e92c1741-kube-api-access-2vn69\") pod \"nova-cell1-db-create-k7f4d\" (UID: \"5f00f896-c7ba-4380-9b97-da67e92c1741\") " pod="openstack/nova-cell1-db-create-k7f4d" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.723756 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kldc\" (UniqueName: 
\"kubernetes.io/projected/29f630f8-cffd-4cd9-8ad6-42b5647e57a6-kube-api-access-9kldc\") pod \"nova-cell0-655f-account-create-update-wpvkq\" (UID: \"29f630f8-cffd-4cd9-8ad6-42b5647e57a6\") " pod="openstack/nova-cell0-655f-account-create-update-wpvkq" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.806618 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rmqd\" (UniqueName: \"kubernetes.io/projected/5aed51f5-8b74-479f-a86c-1dd1e51d7bb5-kube-api-access-8rmqd\") pod \"nova-cell1-30de-account-create-update-wgghb\" (UID: \"5aed51f5-8b74-479f-a86c-1dd1e51d7bb5\") " pod="openstack/nova-cell1-30de-account-create-update-wgghb" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.806814 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5aed51f5-8b74-479f-a86c-1dd1e51d7bb5-operator-scripts\") pod \"nova-cell1-30de-account-create-update-wgghb\" (UID: \"5aed51f5-8b74-479f-a86c-1dd1e51d7bb5\") " pod="openstack/nova-cell1-30de-account-create-update-wgghb" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.908184 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5aed51f5-8b74-479f-a86c-1dd1e51d7bb5-operator-scripts\") pod \"nova-cell1-30de-account-create-update-wgghb\" (UID: \"5aed51f5-8b74-479f-a86c-1dd1e51d7bb5\") " pod="openstack/nova-cell1-30de-account-create-update-wgghb" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.908311 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rmqd\" (UniqueName: \"kubernetes.io/projected/5aed51f5-8b74-479f-a86c-1dd1e51d7bb5-kube-api-access-8rmqd\") pod \"nova-cell1-30de-account-create-update-wgghb\" (UID: \"5aed51f5-8b74-479f-a86c-1dd1e51d7bb5\") " pod="openstack/nova-cell1-30de-account-create-update-wgghb" 
Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.909289 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5aed51f5-8b74-479f-a86c-1dd1e51d7bb5-operator-scripts\") pod \"nova-cell1-30de-account-create-update-wgghb\" (UID: \"5aed51f5-8b74-479f-a86c-1dd1e51d7bb5\") " pod="openstack/nova-cell1-30de-account-create-update-wgghb" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.928973 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-k7f4d" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.934027 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rmqd\" (UniqueName: \"kubernetes.io/projected/5aed51f5-8b74-479f-a86c-1dd1e51d7bb5-kube-api-access-8rmqd\") pod \"nova-cell1-30de-account-create-update-wgghb\" (UID: \"5aed51f5-8b74-479f-a86c-1dd1e51d7bb5\") " pod="openstack/nova-cell1-30de-account-create-update-wgghb" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.937920 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-655f-account-create-update-wpvkq" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.969590 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-30de-account-create-update-wgghb" Mar 10 08:17:53 crc kubenswrapper[4825]: I0310 08:17:53.985719 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-mng4f"] Mar 10 08:17:54 crc kubenswrapper[4825]: W0310 08:17:54.008537 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbba32ca3_b10c_41ec_9154_65c3dfd60c48.slice/crio-a3f6790bd247781d2509841f86b94729a370b4b04730ea2f4eac80b8b918efe4 WatchSource:0}: Error finding container a3f6790bd247781d2509841f86b94729a370b4b04730ea2f4eac80b8b918efe4: Status 404 returned error can't find the container with id a3f6790bd247781d2509841f86b94729a370b4b04730ea2f4eac80b8b918efe4 Mar 10 08:17:54 crc kubenswrapper[4825]: I0310 08:17:54.147776 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-dtd22"] Mar 10 08:17:54 crc kubenswrapper[4825]: I0310 08:17:54.180607 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-9ac4-account-create-update-fms75"] Mar 10 08:17:54 crc kubenswrapper[4825]: I0310 08:17:54.497608 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-k7f4d"] Mar 10 08:17:54 crc kubenswrapper[4825]: I0310 08:17:54.509423 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-655f-account-create-update-wpvkq"] Mar 10 08:17:54 crc kubenswrapper[4825]: I0310 08:17:54.713246 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-30de-account-create-update-wgghb"] Mar 10 08:17:54 crc kubenswrapper[4825]: W0310 08:17:54.728152 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5aed51f5_8b74_479f_a86c_1dd1e51d7bb5.slice/crio-c78b8f72da1b2c6846c1ee46c01788e45acfaa597b167125e7d7b8cd0b5f031d WatchSource:0}: Error finding container 
c78b8f72da1b2c6846c1ee46c01788e45acfaa597b167125e7d7b8cd0b5f031d: Status 404 returned error can't find the container with id c78b8f72da1b2c6846c1ee46c01788e45acfaa597b167125e7d7b8cd0b5f031d Mar 10 08:17:54 crc kubenswrapper[4825]: I0310 08:17:54.881005 4825 generic.go:334] "Generic (PLEG): container finished" podID="eab59002-2587-4875-acb8-52330470fc45" containerID="7cc2331bec61d3810da58bcf1342b6a173d0fe899374b5aa53cff13a063ead98" exitCode=0 Mar 10 08:17:54 crc kubenswrapper[4825]: I0310 08:17:54.881082 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-9ac4-account-create-update-fms75" event={"ID":"eab59002-2587-4875-acb8-52330470fc45","Type":"ContainerDied","Data":"7cc2331bec61d3810da58bcf1342b6a173d0fe899374b5aa53cff13a063ead98"} Mar 10 08:17:54 crc kubenswrapper[4825]: I0310 08:17:54.881111 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-9ac4-account-create-update-fms75" event={"ID":"eab59002-2587-4875-acb8-52330470fc45","Type":"ContainerStarted","Data":"efb4dccaf6b0b2e13392f8fdab88ed29755198b5ab39a28456cc6ce842d703b4"} Mar 10 08:17:54 crc kubenswrapper[4825]: I0310 08:17:54.882819 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-655f-account-create-update-wpvkq" event={"ID":"29f630f8-cffd-4cd9-8ad6-42b5647e57a6","Type":"ContainerStarted","Data":"c36bf97e37330c8ea2e693f00a1394735878683e84201cc0b7d62618b17ef049"} Mar 10 08:17:54 crc kubenswrapper[4825]: I0310 08:17:54.884862 4825 generic.go:334] "Generic (PLEG): container finished" podID="f1172cce-7e11-4cd0-8c82-f4baa5a15bcc" containerID="664e6a99164eb7653d0d2fed78814ac77a88f3ae4a846b851f9cd0c3392231b3" exitCode=0 Mar 10 08:17:54 crc kubenswrapper[4825]: I0310 08:17:54.884934 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-dtd22" event={"ID":"f1172cce-7e11-4cd0-8c82-f4baa5a15bcc","Type":"ContainerDied","Data":"664e6a99164eb7653d0d2fed78814ac77a88f3ae4a846b851f9cd0c3392231b3"} Mar 
10 08:17:54 crc kubenswrapper[4825]: I0310 08:17:54.884965 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-dtd22" event={"ID":"f1172cce-7e11-4cd0-8c82-f4baa5a15bcc","Type":"ContainerStarted","Data":"3b6a1ac146e6fb7cd9da0b79570d8d70fa8793e2d26f79f5aa3aa1d1b25d4597"} Mar 10 08:17:54 crc kubenswrapper[4825]: I0310 08:17:54.887423 4825 generic.go:334] "Generic (PLEG): container finished" podID="bba32ca3-b10c-41ec-9154-65c3dfd60c48" containerID="668ea68a80ec250c4f30de40c68994b40e473c435dde0111f63ac540cea48e9d" exitCode=0 Mar 10 08:17:54 crc kubenswrapper[4825]: I0310 08:17:54.887490 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-mng4f" event={"ID":"bba32ca3-b10c-41ec-9154-65c3dfd60c48","Type":"ContainerDied","Data":"668ea68a80ec250c4f30de40c68994b40e473c435dde0111f63ac540cea48e9d"} Mar 10 08:17:54 crc kubenswrapper[4825]: I0310 08:17:54.887512 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-mng4f" event={"ID":"bba32ca3-b10c-41ec-9154-65c3dfd60c48","Type":"ContainerStarted","Data":"a3f6790bd247781d2509841f86b94729a370b4b04730ea2f4eac80b8b918efe4"} Mar 10 08:17:54 crc kubenswrapper[4825]: I0310 08:17:54.888630 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-30de-account-create-update-wgghb" event={"ID":"5aed51f5-8b74-479f-a86c-1dd1e51d7bb5","Type":"ContainerStarted","Data":"c78b8f72da1b2c6846c1ee46c01788e45acfaa597b167125e7d7b8cd0b5f031d"} Mar 10 08:17:54 crc kubenswrapper[4825]: I0310 08:17:54.890399 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-k7f4d" event={"ID":"5f00f896-c7ba-4380-9b97-da67e92c1741","Type":"ContainerStarted","Data":"e984947c78ca8a6e15fe4ce581c4a2a71bb30441db5bbb91baf8f51c7704bcf5"} Mar 10 08:17:55 crc kubenswrapper[4825]: I0310 08:17:55.901564 4825 generic.go:334] "Generic (PLEG): container finished" podID="29f630f8-cffd-4cd9-8ad6-42b5647e57a6" 
containerID="482a3fc01815cfcbe356cbc95fff4e50562a60743144e06ac6aafcad018c33c7" exitCode=0 Mar 10 08:17:55 crc kubenswrapper[4825]: I0310 08:17:55.901797 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-655f-account-create-update-wpvkq" event={"ID":"29f630f8-cffd-4cd9-8ad6-42b5647e57a6","Type":"ContainerDied","Data":"482a3fc01815cfcbe356cbc95fff4e50562a60743144e06ac6aafcad018c33c7"} Mar 10 08:17:55 crc kubenswrapper[4825]: I0310 08:17:55.910070 4825 generic.go:334] "Generic (PLEG): container finished" podID="5aed51f5-8b74-479f-a86c-1dd1e51d7bb5" containerID="eb723b605616917b814449b23fa7a3c157a29e1774bf0a241b81b8cc3d0e5019" exitCode=0 Mar 10 08:17:55 crc kubenswrapper[4825]: I0310 08:17:55.910180 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-30de-account-create-update-wgghb" event={"ID":"5aed51f5-8b74-479f-a86c-1dd1e51d7bb5","Type":"ContainerDied","Data":"eb723b605616917b814449b23fa7a3c157a29e1774bf0a241b81b8cc3d0e5019"} Mar 10 08:17:55 crc kubenswrapper[4825]: I0310 08:17:55.912429 4825 generic.go:334] "Generic (PLEG): container finished" podID="5f00f896-c7ba-4380-9b97-da67e92c1741" containerID="19d325794cf2ba66bd034f1af37d6240febd0a7a28b7089c1d5d329c476cfae6" exitCode=0 Mar 10 08:17:55 crc kubenswrapper[4825]: I0310 08:17:55.912668 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-k7f4d" event={"ID":"5f00f896-c7ba-4380-9b97-da67e92c1741","Type":"ContainerDied","Data":"19d325794cf2ba66bd034f1af37d6240febd0a7a28b7089c1d5d329c476cfae6"} Mar 10 08:17:56 crc kubenswrapper[4825]: I0310 08:17:56.351216 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-mng4f" Mar 10 08:17:56 crc kubenswrapper[4825]: I0310 08:17:56.358962 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-dtd22" Mar 10 08:17:56 crc kubenswrapper[4825]: I0310 08:17:56.371695 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-9ac4-account-create-update-fms75" Mar 10 08:17:56 crc kubenswrapper[4825]: I0310 08:17:56.457697 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eab59002-2587-4875-acb8-52330470fc45-operator-scripts\") pod \"eab59002-2587-4875-acb8-52330470fc45\" (UID: \"eab59002-2587-4875-acb8-52330470fc45\") " Mar 10 08:17:56 crc kubenswrapper[4825]: I0310 08:17:56.457748 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rqbd\" (UniqueName: \"kubernetes.io/projected/bba32ca3-b10c-41ec-9154-65c3dfd60c48-kube-api-access-8rqbd\") pod \"bba32ca3-b10c-41ec-9154-65c3dfd60c48\" (UID: \"bba32ca3-b10c-41ec-9154-65c3dfd60c48\") " Mar 10 08:17:56 crc kubenswrapper[4825]: I0310 08:17:56.457792 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bba32ca3-b10c-41ec-9154-65c3dfd60c48-operator-scripts\") pod \"bba32ca3-b10c-41ec-9154-65c3dfd60c48\" (UID: \"bba32ca3-b10c-41ec-9154-65c3dfd60c48\") " Mar 10 08:17:56 crc kubenswrapper[4825]: I0310 08:17:56.457892 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1172cce-7e11-4cd0-8c82-f4baa5a15bcc-operator-scripts\") pod \"f1172cce-7e11-4cd0-8c82-f4baa5a15bcc\" (UID: \"f1172cce-7e11-4cd0-8c82-f4baa5a15bcc\") " Mar 10 08:17:56 crc kubenswrapper[4825]: I0310 08:17:56.457938 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pt7hk\" (UniqueName: \"kubernetes.io/projected/eab59002-2587-4875-acb8-52330470fc45-kube-api-access-pt7hk\") pod 
\"eab59002-2587-4875-acb8-52330470fc45\" (UID: \"eab59002-2587-4875-acb8-52330470fc45\") " Mar 10 08:17:56 crc kubenswrapper[4825]: I0310 08:17:56.457969 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fr46j\" (UniqueName: \"kubernetes.io/projected/f1172cce-7e11-4cd0-8c82-f4baa5a15bcc-kube-api-access-fr46j\") pod \"f1172cce-7e11-4cd0-8c82-f4baa5a15bcc\" (UID: \"f1172cce-7e11-4cd0-8c82-f4baa5a15bcc\") " Mar 10 08:17:56 crc kubenswrapper[4825]: I0310 08:17:56.458106 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eab59002-2587-4875-acb8-52330470fc45-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eab59002-2587-4875-acb8-52330470fc45" (UID: "eab59002-2587-4875-acb8-52330470fc45"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:17:56 crc kubenswrapper[4825]: I0310 08:17:56.458470 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eab59002-2587-4875-acb8-52330470fc45-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 08:17:56 crc kubenswrapper[4825]: I0310 08:17:56.458584 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1172cce-7e11-4cd0-8c82-f4baa5a15bcc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f1172cce-7e11-4cd0-8c82-f4baa5a15bcc" (UID: "f1172cce-7e11-4cd0-8c82-f4baa5a15bcc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:17:56 crc kubenswrapper[4825]: I0310 08:17:56.459030 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bba32ca3-b10c-41ec-9154-65c3dfd60c48-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bba32ca3-b10c-41ec-9154-65c3dfd60c48" (UID: "bba32ca3-b10c-41ec-9154-65c3dfd60c48"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:17:56 crc kubenswrapper[4825]: I0310 08:17:56.463299 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bba32ca3-b10c-41ec-9154-65c3dfd60c48-kube-api-access-8rqbd" (OuterVolumeSpecName: "kube-api-access-8rqbd") pod "bba32ca3-b10c-41ec-9154-65c3dfd60c48" (UID: "bba32ca3-b10c-41ec-9154-65c3dfd60c48"). InnerVolumeSpecName "kube-api-access-8rqbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:17:56 crc kubenswrapper[4825]: I0310 08:17:56.463595 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eab59002-2587-4875-acb8-52330470fc45-kube-api-access-pt7hk" (OuterVolumeSpecName: "kube-api-access-pt7hk") pod "eab59002-2587-4875-acb8-52330470fc45" (UID: "eab59002-2587-4875-acb8-52330470fc45"). InnerVolumeSpecName "kube-api-access-pt7hk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:17:56 crc kubenswrapper[4825]: I0310 08:17:56.481931 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1172cce-7e11-4cd0-8c82-f4baa5a15bcc-kube-api-access-fr46j" (OuterVolumeSpecName: "kube-api-access-fr46j") pod "f1172cce-7e11-4cd0-8c82-f4baa5a15bcc" (UID: "f1172cce-7e11-4cd0-8c82-f4baa5a15bcc"). InnerVolumeSpecName "kube-api-access-fr46j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:17:56 crc kubenswrapper[4825]: I0310 08:17:56.560569 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pt7hk\" (UniqueName: \"kubernetes.io/projected/eab59002-2587-4875-acb8-52330470fc45-kube-api-access-pt7hk\") on node \"crc\" DevicePath \"\"" Mar 10 08:17:56 crc kubenswrapper[4825]: I0310 08:17:56.560600 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fr46j\" (UniqueName: \"kubernetes.io/projected/f1172cce-7e11-4cd0-8c82-f4baa5a15bcc-kube-api-access-fr46j\") on node \"crc\" DevicePath \"\"" Mar 10 08:17:56 crc kubenswrapper[4825]: I0310 08:17:56.560612 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rqbd\" (UniqueName: \"kubernetes.io/projected/bba32ca3-b10c-41ec-9154-65c3dfd60c48-kube-api-access-8rqbd\") on node \"crc\" DevicePath \"\"" Mar 10 08:17:56 crc kubenswrapper[4825]: I0310 08:17:56.560624 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bba32ca3-b10c-41ec-9154-65c3dfd60c48-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 08:17:56 crc kubenswrapper[4825]: I0310 08:17:56.560639 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1172cce-7e11-4cd0-8c82-f4baa5a15bcc-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 08:17:56 crc kubenswrapper[4825]: I0310 08:17:56.934045 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-dtd22" event={"ID":"f1172cce-7e11-4cd0-8c82-f4baa5a15bcc","Type":"ContainerDied","Data":"3b6a1ac146e6fb7cd9da0b79570d8d70fa8793e2d26f79f5aa3aa1d1b25d4597"} Mar 10 08:17:56 crc kubenswrapper[4825]: I0310 08:17:56.934099 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b6a1ac146e6fb7cd9da0b79570d8d70fa8793e2d26f79f5aa3aa1d1b25d4597" Mar 10 08:17:56 crc 
kubenswrapper[4825]: I0310 08:17:56.934225 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-dtd22" Mar 10 08:17:56 crc kubenswrapper[4825]: I0310 08:17:56.947682 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-mng4f" event={"ID":"bba32ca3-b10c-41ec-9154-65c3dfd60c48","Type":"ContainerDied","Data":"a3f6790bd247781d2509841f86b94729a370b4b04730ea2f4eac80b8b918efe4"} Mar 10 08:17:56 crc kubenswrapper[4825]: I0310 08:17:56.947760 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3f6790bd247781d2509841f86b94729a370b4b04730ea2f4eac80b8b918efe4" Mar 10 08:17:56 crc kubenswrapper[4825]: I0310 08:17:56.947697 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-mng4f" Mar 10 08:17:56 crc kubenswrapper[4825]: I0310 08:17:56.954759 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-9ac4-account-create-update-fms75" event={"ID":"eab59002-2587-4875-acb8-52330470fc45","Type":"ContainerDied","Data":"efb4dccaf6b0b2e13392f8fdab88ed29755198b5ab39a28456cc6ce842d703b4"} Mar 10 08:17:56 crc kubenswrapper[4825]: I0310 08:17:56.954808 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efb4dccaf6b0b2e13392f8fdab88ed29755198b5ab39a28456cc6ce842d703b4" Mar 10 08:17:56 crc kubenswrapper[4825]: I0310 08:17:56.954936 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-9ac4-account-create-update-fms75" Mar 10 08:17:57 crc kubenswrapper[4825]: I0310 08:17:57.340569 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-30de-account-create-update-wgghb" Mar 10 08:17:57 crc kubenswrapper[4825]: I0310 08:17:57.357715 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-655f-account-create-update-wpvkq" Mar 10 08:17:57 crc kubenswrapper[4825]: I0310 08:17:57.360335 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-k7f4d" Mar 10 08:17:57 crc kubenswrapper[4825]: I0310 08:17:57.480034 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f00f896-c7ba-4380-9b97-da67e92c1741-operator-scripts\") pod \"5f00f896-c7ba-4380-9b97-da67e92c1741\" (UID: \"5f00f896-c7ba-4380-9b97-da67e92c1741\") " Mar 10 08:17:57 crc kubenswrapper[4825]: I0310 08:17:57.480110 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rmqd\" (UniqueName: \"kubernetes.io/projected/5aed51f5-8b74-479f-a86c-1dd1e51d7bb5-kube-api-access-8rmqd\") pod \"5aed51f5-8b74-479f-a86c-1dd1e51d7bb5\" (UID: \"5aed51f5-8b74-479f-a86c-1dd1e51d7bb5\") " Mar 10 08:17:57 crc kubenswrapper[4825]: I0310 08:17:57.480161 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kldc\" (UniqueName: \"kubernetes.io/projected/29f630f8-cffd-4cd9-8ad6-42b5647e57a6-kube-api-access-9kldc\") pod \"29f630f8-cffd-4cd9-8ad6-42b5647e57a6\" (UID: \"29f630f8-cffd-4cd9-8ad6-42b5647e57a6\") " Mar 10 08:17:57 crc kubenswrapper[4825]: I0310 08:17:57.480289 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29f630f8-cffd-4cd9-8ad6-42b5647e57a6-operator-scripts\") pod \"29f630f8-cffd-4cd9-8ad6-42b5647e57a6\" (UID: \"29f630f8-cffd-4cd9-8ad6-42b5647e57a6\") " Mar 10 08:17:57 crc kubenswrapper[4825]: I0310 08:17:57.480323 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vn69\" (UniqueName: \"kubernetes.io/projected/5f00f896-c7ba-4380-9b97-da67e92c1741-kube-api-access-2vn69\") 
pod \"5f00f896-c7ba-4380-9b97-da67e92c1741\" (UID: \"5f00f896-c7ba-4380-9b97-da67e92c1741\") " Mar 10 08:17:57 crc kubenswrapper[4825]: I0310 08:17:57.480359 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5aed51f5-8b74-479f-a86c-1dd1e51d7bb5-operator-scripts\") pod \"5aed51f5-8b74-479f-a86c-1dd1e51d7bb5\" (UID: \"5aed51f5-8b74-479f-a86c-1dd1e51d7bb5\") " Mar 10 08:17:57 crc kubenswrapper[4825]: I0310 08:17:57.480783 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f00f896-c7ba-4380-9b97-da67e92c1741-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5f00f896-c7ba-4380-9b97-da67e92c1741" (UID: "5f00f896-c7ba-4380-9b97-da67e92c1741"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:17:57 crc kubenswrapper[4825]: I0310 08:17:57.481147 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5aed51f5-8b74-479f-a86c-1dd1e51d7bb5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5aed51f5-8b74-479f-a86c-1dd1e51d7bb5" (UID: "5aed51f5-8b74-479f-a86c-1dd1e51d7bb5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:17:57 crc kubenswrapper[4825]: I0310 08:17:57.482793 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29f630f8-cffd-4cd9-8ad6-42b5647e57a6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "29f630f8-cffd-4cd9-8ad6-42b5647e57a6" (UID: "29f630f8-cffd-4cd9-8ad6-42b5647e57a6"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:17:57 crc kubenswrapper[4825]: I0310 08:17:57.485006 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f00f896-c7ba-4380-9b97-da67e92c1741-kube-api-access-2vn69" (OuterVolumeSpecName: "kube-api-access-2vn69") pod "5f00f896-c7ba-4380-9b97-da67e92c1741" (UID: "5f00f896-c7ba-4380-9b97-da67e92c1741"). InnerVolumeSpecName "kube-api-access-2vn69". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:17:57 crc kubenswrapper[4825]: I0310 08:17:57.485157 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5aed51f5-8b74-479f-a86c-1dd1e51d7bb5-kube-api-access-8rmqd" (OuterVolumeSpecName: "kube-api-access-8rmqd") pod "5aed51f5-8b74-479f-a86c-1dd1e51d7bb5" (UID: "5aed51f5-8b74-479f-a86c-1dd1e51d7bb5"). InnerVolumeSpecName "kube-api-access-8rmqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:17:57 crc kubenswrapper[4825]: I0310 08:17:57.488800 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29f630f8-cffd-4cd9-8ad6-42b5647e57a6-kube-api-access-9kldc" (OuterVolumeSpecName: "kube-api-access-9kldc") pod "29f630f8-cffd-4cd9-8ad6-42b5647e57a6" (UID: "29f630f8-cffd-4cd9-8ad6-42b5647e57a6"). InnerVolumeSpecName "kube-api-access-9kldc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:17:57 crc kubenswrapper[4825]: I0310 08:17:57.582231 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f00f896-c7ba-4380-9b97-da67e92c1741-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 08:17:57 crc kubenswrapper[4825]: I0310 08:17:57.582274 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rmqd\" (UniqueName: \"kubernetes.io/projected/5aed51f5-8b74-479f-a86c-1dd1e51d7bb5-kube-api-access-8rmqd\") on node \"crc\" DevicePath \"\"" Mar 10 08:17:57 crc kubenswrapper[4825]: I0310 08:17:57.582290 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kldc\" (UniqueName: \"kubernetes.io/projected/29f630f8-cffd-4cd9-8ad6-42b5647e57a6-kube-api-access-9kldc\") on node \"crc\" DevicePath \"\"" Mar 10 08:17:57 crc kubenswrapper[4825]: I0310 08:17:57.582299 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29f630f8-cffd-4cd9-8ad6-42b5647e57a6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 08:17:57 crc kubenswrapper[4825]: I0310 08:17:57.582307 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vn69\" (UniqueName: \"kubernetes.io/projected/5f00f896-c7ba-4380-9b97-da67e92c1741-kube-api-access-2vn69\") on node \"crc\" DevicePath \"\"" Mar 10 08:17:57 crc kubenswrapper[4825]: I0310 08:17:57.582316 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5aed51f5-8b74-479f-a86c-1dd1e51d7bb5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 08:17:57 crc kubenswrapper[4825]: I0310 08:17:57.963507 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-655f-account-create-update-wpvkq" 
event={"ID":"29f630f8-cffd-4cd9-8ad6-42b5647e57a6","Type":"ContainerDied","Data":"c36bf97e37330c8ea2e693f00a1394735878683e84201cc0b7d62618b17ef049"} Mar 10 08:17:57 crc kubenswrapper[4825]: I0310 08:17:57.963816 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c36bf97e37330c8ea2e693f00a1394735878683e84201cc0b7d62618b17ef049" Mar 10 08:17:57 crc kubenswrapper[4825]: I0310 08:17:57.963554 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-655f-account-create-update-wpvkq" Mar 10 08:17:57 crc kubenswrapper[4825]: I0310 08:17:57.964793 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-30de-account-create-update-wgghb" Mar 10 08:17:57 crc kubenswrapper[4825]: I0310 08:17:57.964835 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-30de-account-create-update-wgghb" event={"ID":"5aed51f5-8b74-479f-a86c-1dd1e51d7bb5","Type":"ContainerDied","Data":"c78b8f72da1b2c6846c1ee46c01788e45acfaa597b167125e7d7b8cd0b5f031d"} Mar 10 08:17:57 crc kubenswrapper[4825]: I0310 08:17:57.964884 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c78b8f72da1b2c6846c1ee46c01788e45acfaa597b167125e7d7b8cd0b5f031d" Mar 10 08:17:57 crc kubenswrapper[4825]: I0310 08:17:57.966103 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-k7f4d" event={"ID":"5f00f896-c7ba-4380-9b97-da67e92c1741","Type":"ContainerDied","Data":"e984947c78ca8a6e15fe4ce581c4a2a71bb30441db5bbb91baf8f51c7704bcf5"} Mar 10 08:17:57 crc kubenswrapper[4825]: I0310 08:17:57.966147 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e984947c78ca8a6e15fe4ce581c4a2a71bb30441db5bbb91baf8f51c7704bcf5" Mar 10 08:17:57 crc kubenswrapper[4825]: I0310 08:17:57.966191 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-k7f4d" Mar 10 08:17:58 crc kubenswrapper[4825]: I0310 08:17:58.710949 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-66fg7"] Mar 10 08:17:58 crc kubenswrapper[4825]: E0310 08:17:58.711424 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29f630f8-cffd-4cd9-8ad6-42b5647e57a6" containerName="mariadb-account-create-update" Mar 10 08:17:58 crc kubenswrapper[4825]: I0310 08:17:58.711451 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="29f630f8-cffd-4cd9-8ad6-42b5647e57a6" containerName="mariadb-account-create-update" Mar 10 08:17:58 crc kubenswrapper[4825]: E0310 08:17:58.711466 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bba32ca3-b10c-41ec-9154-65c3dfd60c48" containerName="mariadb-database-create" Mar 10 08:17:58 crc kubenswrapper[4825]: I0310 08:17:58.711474 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="bba32ca3-b10c-41ec-9154-65c3dfd60c48" containerName="mariadb-database-create" Mar 10 08:17:58 crc kubenswrapper[4825]: E0310 08:17:58.711489 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f00f896-c7ba-4380-9b97-da67e92c1741" containerName="mariadb-database-create" Mar 10 08:17:58 crc kubenswrapper[4825]: I0310 08:17:58.711498 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f00f896-c7ba-4380-9b97-da67e92c1741" containerName="mariadb-database-create" Mar 10 08:17:58 crc kubenswrapper[4825]: E0310 08:17:58.711522 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eab59002-2587-4875-acb8-52330470fc45" containerName="mariadb-account-create-update" Mar 10 08:17:58 crc kubenswrapper[4825]: I0310 08:17:58.711529 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="eab59002-2587-4875-acb8-52330470fc45" containerName="mariadb-account-create-update" Mar 10 08:17:58 crc kubenswrapper[4825]: E0310 08:17:58.711544 4825 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="5aed51f5-8b74-479f-a86c-1dd1e51d7bb5" containerName="mariadb-account-create-update" Mar 10 08:17:58 crc kubenswrapper[4825]: I0310 08:17:58.711552 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aed51f5-8b74-479f-a86c-1dd1e51d7bb5" containerName="mariadb-account-create-update" Mar 10 08:17:58 crc kubenswrapper[4825]: E0310 08:17:58.711568 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1172cce-7e11-4cd0-8c82-f4baa5a15bcc" containerName="mariadb-database-create" Mar 10 08:17:58 crc kubenswrapper[4825]: I0310 08:17:58.711576 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1172cce-7e11-4cd0-8c82-f4baa5a15bcc" containerName="mariadb-database-create" Mar 10 08:17:58 crc kubenswrapper[4825]: I0310 08:17:58.711788 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1172cce-7e11-4cd0-8c82-f4baa5a15bcc" containerName="mariadb-database-create" Mar 10 08:17:58 crc kubenswrapper[4825]: I0310 08:17:58.711805 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="eab59002-2587-4875-acb8-52330470fc45" containerName="mariadb-account-create-update" Mar 10 08:17:58 crc kubenswrapper[4825]: I0310 08:17:58.711813 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="5aed51f5-8b74-479f-a86c-1dd1e51d7bb5" containerName="mariadb-account-create-update" Mar 10 08:17:58 crc kubenswrapper[4825]: I0310 08:17:58.711829 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="29f630f8-cffd-4cd9-8ad6-42b5647e57a6" containerName="mariadb-account-create-update" Mar 10 08:17:58 crc kubenswrapper[4825]: I0310 08:17:58.711848 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="bba32ca3-b10c-41ec-9154-65c3dfd60c48" containerName="mariadb-database-create" Mar 10 08:17:58 crc kubenswrapper[4825]: I0310 08:17:58.711866 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f00f896-c7ba-4380-9b97-da67e92c1741" 
containerName="mariadb-database-create" Mar 10 08:17:58 crc kubenswrapper[4825]: I0310 08:17:58.712574 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-66fg7" Mar 10 08:17:58 crc kubenswrapper[4825]: I0310 08:17:58.717561 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 10 08:17:58 crc kubenswrapper[4825]: I0310 08:17:58.718012 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-j7mb8" Mar 10 08:17:58 crc kubenswrapper[4825]: I0310 08:17:58.718223 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 10 08:17:58 crc kubenswrapper[4825]: I0310 08:17:58.720861 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-66fg7"] Mar 10 08:17:58 crc kubenswrapper[4825]: I0310 08:17:58.807557 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77ec0a62-9440-49c2-87c1-2d9d9d30b7ff-config-data\") pod \"nova-cell0-conductor-db-sync-66fg7\" (UID: \"77ec0a62-9440-49c2-87c1-2d9d9d30b7ff\") " pod="openstack/nova-cell0-conductor-db-sync-66fg7" Mar 10 08:17:58 crc kubenswrapper[4825]: I0310 08:17:58.807608 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpvtz\" (UniqueName: \"kubernetes.io/projected/77ec0a62-9440-49c2-87c1-2d9d9d30b7ff-kube-api-access-jpvtz\") pod \"nova-cell0-conductor-db-sync-66fg7\" (UID: \"77ec0a62-9440-49c2-87c1-2d9d9d30b7ff\") " pod="openstack/nova-cell0-conductor-db-sync-66fg7" Mar 10 08:17:58 crc kubenswrapper[4825]: I0310 08:17:58.807767 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/77ec0a62-9440-49c2-87c1-2d9d9d30b7ff-scripts\") pod \"nova-cell0-conductor-db-sync-66fg7\" (UID: \"77ec0a62-9440-49c2-87c1-2d9d9d30b7ff\") " pod="openstack/nova-cell0-conductor-db-sync-66fg7" Mar 10 08:17:58 crc kubenswrapper[4825]: I0310 08:17:58.808270 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77ec0a62-9440-49c2-87c1-2d9d9d30b7ff-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-66fg7\" (UID: \"77ec0a62-9440-49c2-87c1-2d9d9d30b7ff\") " pod="openstack/nova-cell0-conductor-db-sync-66fg7" Mar 10 08:17:58 crc kubenswrapper[4825]: I0310 08:17:58.909807 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77ec0a62-9440-49c2-87c1-2d9d9d30b7ff-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-66fg7\" (UID: \"77ec0a62-9440-49c2-87c1-2d9d9d30b7ff\") " pod="openstack/nova-cell0-conductor-db-sync-66fg7" Mar 10 08:17:58 crc kubenswrapper[4825]: I0310 08:17:58.909919 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77ec0a62-9440-49c2-87c1-2d9d9d30b7ff-config-data\") pod \"nova-cell0-conductor-db-sync-66fg7\" (UID: \"77ec0a62-9440-49c2-87c1-2d9d9d30b7ff\") " pod="openstack/nova-cell0-conductor-db-sync-66fg7" Mar 10 08:17:58 crc kubenswrapper[4825]: I0310 08:17:58.909958 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpvtz\" (UniqueName: \"kubernetes.io/projected/77ec0a62-9440-49c2-87c1-2d9d9d30b7ff-kube-api-access-jpvtz\") pod \"nova-cell0-conductor-db-sync-66fg7\" (UID: \"77ec0a62-9440-49c2-87c1-2d9d9d30b7ff\") " pod="openstack/nova-cell0-conductor-db-sync-66fg7" Mar 10 08:17:58 crc kubenswrapper[4825]: I0310 08:17:58.910012 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/77ec0a62-9440-49c2-87c1-2d9d9d30b7ff-scripts\") pod \"nova-cell0-conductor-db-sync-66fg7\" (UID: \"77ec0a62-9440-49c2-87c1-2d9d9d30b7ff\") " pod="openstack/nova-cell0-conductor-db-sync-66fg7" Mar 10 08:17:58 crc kubenswrapper[4825]: I0310 08:17:58.916412 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77ec0a62-9440-49c2-87c1-2d9d9d30b7ff-scripts\") pod \"nova-cell0-conductor-db-sync-66fg7\" (UID: \"77ec0a62-9440-49c2-87c1-2d9d9d30b7ff\") " pod="openstack/nova-cell0-conductor-db-sync-66fg7" Mar 10 08:17:58 crc kubenswrapper[4825]: I0310 08:17:58.917039 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77ec0a62-9440-49c2-87c1-2d9d9d30b7ff-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-66fg7\" (UID: \"77ec0a62-9440-49c2-87c1-2d9d9d30b7ff\") " pod="openstack/nova-cell0-conductor-db-sync-66fg7" Mar 10 08:17:58 crc kubenswrapper[4825]: I0310 08:17:58.934731 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpvtz\" (UniqueName: \"kubernetes.io/projected/77ec0a62-9440-49c2-87c1-2d9d9d30b7ff-kube-api-access-jpvtz\") pod \"nova-cell0-conductor-db-sync-66fg7\" (UID: \"77ec0a62-9440-49c2-87c1-2d9d9d30b7ff\") " pod="openstack/nova-cell0-conductor-db-sync-66fg7" Mar 10 08:17:58 crc kubenswrapper[4825]: I0310 08:17:58.935588 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77ec0a62-9440-49c2-87c1-2d9d9d30b7ff-config-data\") pod \"nova-cell0-conductor-db-sync-66fg7\" (UID: \"77ec0a62-9440-49c2-87c1-2d9d9d30b7ff\") " pod="openstack/nova-cell0-conductor-db-sync-66fg7" Mar 10 08:17:59 crc kubenswrapper[4825]: I0310 08:17:59.031567 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-66fg7" Mar 10 08:17:59 crc kubenswrapper[4825]: I0310 08:17:59.242908 4825 scope.go:117] "RemoveContainer" containerID="cc40bbe7e5127cd8ca03c26dd788c9bfc628fd248484a1f5f73946709b5b4414" Mar 10 08:17:59 crc kubenswrapper[4825]: E0310 08:17:59.243225 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:17:59 crc kubenswrapper[4825]: I0310 08:17:59.679768 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-66fg7"] Mar 10 08:17:59 crc kubenswrapper[4825]: W0310 08:17:59.688384 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77ec0a62_9440_49c2_87c1_2d9d9d30b7ff.slice/crio-0b08ab3cd84b391e40e7c50ca10bcd1982fcfd18d0288f93827e3d630c00061e WatchSource:0}: Error finding container 0b08ab3cd84b391e40e7c50ca10bcd1982fcfd18d0288f93827e3d630c00061e: Status 404 returned error can't find the container with id 0b08ab3cd84b391e40e7c50ca10bcd1982fcfd18d0288f93827e3d630c00061e Mar 10 08:17:59 crc kubenswrapper[4825]: I0310 08:17:59.984805 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-66fg7" event={"ID":"77ec0a62-9440-49c2-87c1-2d9d9d30b7ff","Type":"ContainerStarted","Data":"0b08ab3cd84b391e40e7c50ca10bcd1982fcfd18d0288f93827e3d630c00061e"} Mar 10 08:18:00 crc kubenswrapper[4825]: I0310 08:18:00.140193 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552178-zgtl6"] Mar 10 08:18:00 crc kubenswrapper[4825]: I0310 
08:18:00.141480 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552178-zgtl6" Mar 10 08:18:00 crc kubenswrapper[4825]: I0310 08:18:00.143836 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 08:18:00 crc kubenswrapper[4825]: I0310 08:18:00.144155 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 08:18:00 crc kubenswrapper[4825]: I0310 08:18:00.144313 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 08:18:00 crc kubenswrapper[4825]: I0310 08:18:00.152397 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552178-zgtl6"] Mar 10 08:18:00 crc kubenswrapper[4825]: I0310 08:18:00.237665 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8br5n\" (UniqueName: \"kubernetes.io/projected/5b94e20e-d08c-4e38-989b-715a5b1d4365-kube-api-access-8br5n\") pod \"auto-csr-approver-29552178-zgtl6\" (UID: \"5b94e20e-d08c-4e38-989b-715a5b1d4365\") " pod="openshift-infra/auto-csr-approver-29552178-zgtl6" Mar 10 08:18:00 crc kubenswrapper[4825]: I0310 08:18:00.342316 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8br5n\" (UniqueName: \"kubernetes.io/projected/5b94e20e-d08c-4e38-989b-715a5b1d4365-kube-api-access-8br5n\") pod \"auto-csr-approver-29552178-zgtl6\" (UID: \"5b94e20e-d08c-4e38-989b-715a5b1d4365\") " pod="openshift-infra/auto-csr-approver-29552178-zgtl6" Mar 10 08:18:00 crc kubenswrapper[4825]: I0310 08:18:00.361708 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8br5n\" (UniqueName: \"kubernetes.io/projected/5b94e20e-d08c-4e38-989b-715a5b1d4365-kube-api-access-8br5n\") pod 
\"auto-csr-approver-29552178-zgtl6\" (UID: \"5b94e20e-d08c-4e38-989b-715a5b1d4365\") " pod="openshift-infra/auto-csr-approver-29552178-zgtl6" Mar 10 08:18:00 crc kubenswrapper[4825]: I0310 08:18:00.464996 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552178-zgtl6" Mar 10 08:18:00 crc kubenswrapper[4825]: I0310 08:18:00.918951 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552178-zgtl6"] Mar 10 08:18:00 crc kubenswrapper[4825]: W0310 08:18:00.923990 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b94e20e_d08c_4e38_989b_715a5b1d4365.slice/crio-9ddb07f59d33c2e395c3b3a4b7f84f98ed038dfd7a3e80ea3bb9367b2166355c WatchSource:0}: Error finding container 9ddb07f59d33c2e395c3b3a4b7f84f98ed038dfd7a3e80ea3bb9367b2166355c: Status 404 returned error can't find the container with id 9ddb07f59d33c2e395c3b3a4b7f84f98ed038dfd7a3e80ea3bb9367b2166355c Mar 10 08:18:00 crc kubenswrapper[4825]: I0310 08:18:00.994603 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552178-zgtl6" event={"ID":"5b94e20e-d08c-4e38-989b-715a5b1d4365","Type":"ContainerStarted","Data":"9ddb07f59d33c2e395c3b3a4b7f84f98ed038dfd7a3e80ea3bb9367b2166355c"} Mar 10 08:18:03 crc kubenswrapper[4825]: I0310 08:18:03.022977 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552178-zgtl6" event={"ID":"5b94e20e-d08c-4e38-989b-715a5b1d4365","Type":"ContainerStarted","Data":"38bd83551c0dac70d6198bf0e2a000e2d03b88269c897a0e0b841de5a64a9f02"} Mar 10 08:18:04 crc kubenswrapper[4825]: I0310 08:18:04.032681 4825 generic.go:334] "Generic (PLEG): container finished" podID="5b94e20e-d08c-4e38-989b-715a5b1d4365" containerID="38bd83551c0dac70d6198bf0e2a000e2d03b88269c897a0e0b841de5a64a9f02" exitCode=0 Mar 10 08:18:04 crc kubenswrapper[4825]: 
I0310 08:18:04.032741 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552178-zgtl6" event={"ID":"5b94e20e-d08c-4e38-989b-715a5b1d4365","Type":"ContainerDied","Data":"38bd83551c0dac70d6198bf0e2a000e2d03b88269c897a0e0b841de5a64a9f02"} Mar 10 08:18:08 crc kubenswrapper[4825]: I0310 08:18:08.470872 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552178-zgtl6" Mar 10 08:18:08 crc kubenswrapper[4825]: I0310 08:18:08.620636 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8br5n\" (UniqueName: \"kubernetes.io/projected/5b94e20e-d08c-4e38-989b-715a5b1d4365-kube-api-access-8br5n\") pod \"5b94e20e-d08c-4e38-989b-715a5b1d4365\" (UID: \"5b94e20e-d08c-4e38-989b-715a5b1d4365\") " Mar 10 08:18:08 crc kubenswrapper[4825]: I0310 08:18:08.626702 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b94e20e-d08c-4e38-989b-715a5b1d4365-kube-api-access-8br5n" (OuterVolumeSpecName: "kube-api-access-8br5n") pod "5b94e20e-d08c-4e38-989b-715a5b1d4365" (UID: "5b94e20e-d08c-4e38-989b-715a5b1d4365"). InnerVolumeSpecName "kube-api-access-8br5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:18:08 crc kubenswrapper[4825]: I0310 08:18:08.723065 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8br5n\" (UniqueName: \"kubernetes.io/projected/5b94e20e-d08c-4e38-989b-715a5b1d4365-kube-api-access-8br5n\") on node \"crc\" DevicePath \"\"" Mar 10 08:18:09 crc kubenswrapper[4825]: I0310 08:18:09.094761 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552178-zgtl6" Mar 10 08:18:09 crc kubenswrapper[4825]: I0310 08:18:09.097511 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552178-zgtl6" event={"ID":"5b94e20e-d08c-4e38-989b-715a5b1d4365","Type":"ContainerDied","Data":"9ddb07f59d33c2e395c3b3a4b7f84f98ed038dfd7a3e80ea3bb9367b2166355c"} Mar 10 08:18:09 crc kubenswrapper[4825]: I0310 08:18:09.097545 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ddb07f59d33c2e395c3b3a4b7f84f98ed038dfd7a3e80ea3bb9367b2166355c" Mar 10 08:18:09 crc kubenswrapper[4825]: I0310 08:18:09.099439 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-66fg7" event={"ID":"77ec0a62-9440-49c2-87c1-2d9d9d30b7ff","Type":"ContainerStarted","Data":"6e150d3c7d3d12a275b1e406a13d7afbda25a1272de07c8ffcce956747b50b02"} Mar 10 08:18:09 crc kubenswrapper[4825]: I0310 08:18:09.118689 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-66fg7" podStartSLOduration=2.045377399 podStartE2EDuration="11.118672907s" podCreationTimestamp="2026-03-10 08:17:58 +0000 UTC" firstStartedPulling="2026-03-10 08:17:59.689988039 +0000 UTC m=+5632.719768654" lastFinishedPulling="2026-03-10 08:18:08.763283547 +0000 UTC m=+5641.793064162" observedRunningTime="2026-03-10 08:18:09.113883731 +0000 UTC m=+5642.143664346" watchObservedRunningTime="2026-03-10 08:18:09.118672907 +0000 UTC m=+5642.148453522" Mar 10 08:18:09 crc kubenswrapper[4825]: I0310 08:18:09.552047 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552172-hdgvs"] Mar 10 08:18:09 crc kubenswrapper[4825]: I0310 08:18:09.563586 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552172-hdgvs"] Mar 10 08:18:11 crc kubenswrapper[4825]: I0310 08:18:11.249969 4825 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9032c572-c243-4bb6-b65c-83017848cc2c" path="/var/lib/kubelet/pods/9032c572-c243-4bb6-b65c-83017848cc2c/volumes" Mar 10 08:18:12 crc kubenswrapper[4825]: I0310 08:18:12.236465 4825 scope.go:117] "RemoveContainer" containerID="cc40bbe7e5127cd8ca03c26dd788c9bfc628fd248484a1f5f73946709b5b4414" Mar 10 08:18:12 crc kubenswrapper[4825]: E0310 08:18:12.237267 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:18:14 crc kubenswrapper[4825]: I0310 08:18:14.156753 4825 generic.go:334] "Generic (PLEG): container finished" podID="77ec0a62-9440-49c2-87c1-2d9d9d30b7ff" containerID="6e150d3c7d3d12a275b1e406a13d7afbda25a1272de07c8ffcce956747b50b02" exitCode=0 Mar 10 08:18:14 crc kubenswrapper[4825]: I0310 08:18:14.156788 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-66fg7" event={"ID":"77ec0a62-9440-49c2-87c1-2d9d9d30b7ff","Type":"ContainerDied","Data":"6e150d3c7d3d12a275b1e406a13d7afbda25a1272de07c8ffcce956747b50b02"} Mar 10 08:18:15 crc kubenswrapper[4825]: I0310 08:18:15.510822 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-66fg7" Mar 10 08:18:15 crc kubenswrapper[4825]: I0310 08:18:15.672421 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77ec0a62-9440-49c2-87c1-2d9d9d30b7ff-scripts\") pod \"77ec0a62-9440-49c2-87c1-2d9d9d30b7ff\" (UID: \"77ec0a62-9440-49c2-87c1-2d9d9d30b7ff\") " Mar 10 08:18:15 crc kubenswrapper[4825]: I0310 08:18:15.672705 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77ec0a62-9440-49c2-87c1-2d9d9d30b7ff-combined-ca-bundle\") pod \"77ec0a62-9440-49c2-87c1-2d9d9d30b7ff\" (UID: \"77ec0a62-9440-49c2-87c1-2d9d9d30b7ff\") " Mar 10 08:18:15 crc kubenswrapper[4825]: I0310 08:18:15.672814 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77ec0a62-9440-49c2-87c1-2d9d9d30b7ff-config-data\") pod \"77ec0a62-9440-49c2-87c1-2d9d9d30b7ff\" (UID: \"77ec0a62-9440-49c2-87c1-2d9d9d30b7ff\") " Mar 10 08:18:15 crc kubenswrapper[4825]: I0310 08:18:15.673000 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpvtz\" (UniqueName: \"kubernetes.io/projected/77ec0a62-9440-49c2-87c1-2d9d9d30b7ff-kube-api-access-jpvtz\") pod \"77ec0a62-9440-49c2-87c1-2d9d9d30b7ff\" (UID: \"77ec0a62-9440-49c2-87c1-2d9d9d30b7ff\") " Mar 10 08:18:15 crc kubenswrapper[4825]: I0310 08:18:15.679028 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77ec0a62-9440-49c2-87c1-2d9d9d30b7ff-kube-api-access-jpvtz" (OuterVolumeSpecName: "kube-api-access-jpvtz") pod "77ec0a62-9440-49c2-87c1-2d9d9d30b7ff" (UID: "77ec0a62-9440-49c2-87c1-2d9d9d30b7ff"). InnerVolumeSpecName "kube-api-access-jpvtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:18:15 crc kubenswrapper[4825]: I0310 08:18:15.684326 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77ec0a62-9440-49c2-87c1-2d9d9d30b7ff-scripts" (OuterVolumeSpecName: "scripts") pod "77ec0a62-9440-49c2-87c1-2d9d9d30b7ff" (UID: "77ec0a62-9440-49c2-87c1-2d9d9d30b7ff"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:18:15 crc kubenswrapper[4825]: I0310 08:18:15.700224 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77ec0a62-9440-49c2-87c1-2d9d9d30b7ff-config-data" (OuterVolumeSpecName: "config-data") pod "77ec0a62-9440-49c2-87c1-2d9d9d30b7ff" (UID: "77ec0a62-9440-49c2-87c1-2d9d9d30b7ff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:18:15 crc kubenswrapper[4825]: I0310 08:18:15.704826 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77ec0a62-9440-49c2-87c1-2d9d9d30b7ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "77ec0a62-9440-49c2-87c1-2d9d9d30b7ff" (UID: "77ec0a62-9440-49c2-87c1-2d9d9d30b7ff"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:18:15 crc kubenswrapper[4825]: I0310 08:18:15.775665 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77ec0a62-9440-49c2-87c1-2d9d9d30b7ff-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 08:18:15 crc kubenswrapper[4825]: I0310 08:18:15.775710 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77ec0a62-9440-49c2-87c1-2d9d9d30b7ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:18:15 crc kubenswrapper[4825]: I0310 08:18:15.775724 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77ec0a62-9440-49c2-87c1-2d9d9d30b7ff-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 08:18:15 crc kubenswrapper[4825]: I0310 08:18:15.775736 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpvtz\" (UniqueName: \"kubernetes.io/projected/77ec0a62-9440-49c2-87c1-2d9d9d30b7ff-kube-api-access-jpvtz\") on node \"crc\" DevicePath \"\"" Mar 10 08:18:16 crc kubenswrapper[4825]: I0310 08:18:16.180342 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-66fg7" event={"ID":"77ec0a62-9440-49c2-87c1-2d9d9d30b7ff","Type":"ContainerDied","Data":"0b08ab3cd84b391e40e7c50ca10bcd1982fcfd18d0288f93827e3d630c00061e"} Mar 10 08:18:16 crc kubenswrapper[4825]: I0310 08:18:16.180975 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b08ab3cd84b391e40e7c50ca10bcd1982fcfd18d0288f93827e3d630c00061e" Mar 10 08:18:16 crc kubenswrapper[4825]: I0310 08:18:16.180398 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-66fg7" Mar 10 08:18:16 crc kubenswrapper[4825]: I0310 08:18:16.287341 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 08:18:16 crc kubenswrapper[4825]: E0310 08:18:16.287713 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77ec0a62-9440-49c2-87c1-2d9d9d30b7ff" containerName="nova-cell0-conductor-db-sync" Mar 10 08:18:16 crc kubenswrapper[4825]: I0310 08:18:16.287729 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="77ec0a62-9440-49c2-87c1-2d9d9d30b7ff" containerName="nova-cell0-conductor-db-sync" Mar 10 08:18:16 crc kubenswrapper[4825]: E0310 08:18:16.287745 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b94e20e-d08c-4e38-989b-715a5b1d4365" containerName="oc" Mar 10 08:18:16 crc kubenswrapper[4825]: I0310 08:18:16.287752 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b94e20e-d08c-4e38-989b-715a5b1d4365" containerName="oc" Mar 10 08:18:16 crc kubenswrapper[4825]: I0310 08:18:16.287921 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b94e20e-d08c-4e38-989b-715a5b1d4365" containerName="oc" Mar 10 08:18:16 crc kubenswrapper[4825]: I0310 08:18:16.287937 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="77ec0a62-9440-49c2-87c1-2d9d9d30b7ff" containerName="nova-cell0-conductor-db-sync" Mar 10 08:18:16 crc kubenswrapper[4825]: I0310 08:18:16.288525 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 10 08:18:16 crc kubenswrapper[4825]: I0310 08:18:16.292446 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 10 08:18:16 crc kubenswrapper[4825]: I0310 08:18:16.293604 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-j7mb8" Mar 10 08:18:16 crc kubenswrapper[4825]: I0310 08:18:16.300339 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 08:18:16 crc kubenswrapper[4825]: I0310 08:18:16.386941 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z5gv\" (UniqueName: \"kubernetes.io/projected/81060137-51ef-4ed2-a740-aeb94217f912-kube-api-access-7z5gv\") pod \"nova-cell0-conductor-0\" (UID: \"81060137-51ef-4ed2-a740-aeb94217f912\") " pod="openstack/nova-cell0-conductor-0" Mar 10 08:18:16 crc kubenswrapper[4825]: I0310 08:18:16.387076 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81060137-51ef-4ed2-a740-aeb94217f912-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"81060137-51ef-4ed2-a740-aeb94217f912\") " pod="openstack/nova-cell0-conductor-0" Mar 10 08:18:16 crc kubenswrapper[4825]: I0310 08:18:16.387185 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81060137-51ef-4ed2-a740-aeb94217f912-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"81060137-51ef-4ed2-a740-aeb94217f912\") " pod="openstack/nova-cell0-conductor-0" Mar 10 08:18:16 crc kubenswrapper[4825]: I0310 08:18:16.489013 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/81060137-51ef-4ed2-a740-aeb94217f912-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"81060137-51ef-4ed2-a740-aeb94217f912\") " pod="openstack/nova-cell0-conductor-0" Mar 10 08:18:16 crc kubenswrapper[4825]: I0310 08:18:16.489098 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81060137-51ef-4ed2-a740-aeb94217f912-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"81060137-51ef-4ed2-a740-aeb94217f912\") " pod="openstack/nova-cell0-conductor-0" Mar 10 08:18:16 crc kubenswrapper[4825]: I0310 08:18:16.489206 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z5gv\" (UniqueName: \"kubernetes.io/projected/81060137-51ef-4ed2-a740-aeb94217f912-kube-api-access-7z5gv\") pod \"nova-cell0-conductor-0\" (UID: \"81060137-51ef-4ed2-a740-aeb94217f912\") " pod="openstack/nova-cell0-conductor-0" Mar 10 08:18:16 crc kubenswrapper[4825]: I0310 08:18:16.493725 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81060137-51ef-4ed2-a740-aeb94217f912-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"81060137-51ef-4ed2-a740-aeb94217f912\") " pod="openstack/nova-cell0-conductor-0" Mar 10 08:18:16 crc kubenswrapper[4825]: I0310 08:18:16.494199 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81060137-51ef-4ed2-a740-aeb94217f912-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"81060137-51ef-4ed2-a740-aeb94217f912\") " pod="openstack/nova-cell0-conductor-0" Mar 10 08:18:16 crc kubenswrapper[4825]: I0310 08:18:16.505414 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z5gv\" (UniqueName: \"kubernetes.io/projected/81060137-51ef-4ed2-a740-aeb94217f912-kube-api-access-7z5gv\") pod \"nova-cell0-conductor-0\" (UID: 
\"81060137-51ef-4ed2-a740-aeb94217f912\") " pod="openstack/nova-cell0-conductor-0" Mar 10 08:18:16 crc kubenswrapper[4825]: I0310 08:18:16.647633 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 10 08:18:17 crc kubenswrapper[4825]: I0310 08:18:17.092466 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 08:18:17 crc kubenswrapper[4825]: I0310 08:18:17.190191 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"81060137-51ef-4ed2-a740-aeb94217f912","Type":"ContainerStarted","Data":"88b99cd0a4864516413d8b98dc28ab7468aa1e5bb28924fc9a81cb4cc0243992"} Mar 10 08:18:18 crc kubenswrapper[4825]: I0310 08:18:18.202249 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"81060137-51ef-4ed2-a740-aeb94217f912","Type":"ContainerStarted","Data":"78f7952655702f7d5aee9e67af3b70c3c97c4a73047658e3a8e794fcc744b3ae"} Mar 10 08:18:18 crc kubenswrapper[4825]: I0310 08:18:18.202445 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 10 08:18:18 crc kubenswrapper[4825]: I0310 08:18:18.225682 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.22564338 podStartE2EDuration="2.22564338s" podCreationTimestamp="2026-03-10 08:18:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:18:18.216825238 +0000 UTC m=+5651.246605853" watchObservedRunningTime="2026-03-10 08:18:18.22564338 +0000 UTC m=+5651.255424015" Mar 10 08:18:26 crc kubenswrapper[4825]: I0310 08:18:26.676842 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 
08:18:27.113641 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-twtvd"] Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.115245 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-twtvd" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.116850 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.118156 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.125036 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-twtvd"] Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.208829 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/564e7588-ac0e-48e5-80c5-3e4592e06601-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-twtvd\" (UID: \"564e7588-ac0e-48e5-80c5-3e4592e06601\") " pod="openstack/nova-cell0-cell-mapping-twtvd" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.208905 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2rmh\" (UniqueName: \"kubernetes.io/projected/564e7588-ac0e-48e5-80c5-3e4592e06601-kube-api-access-k2rmh\") pod \"nova-cell0-cell-mapping-twtvd\" (UID: \"564e7588-ac0e-48e5-80c5-3e4592e06601\") " pod="openstack/nova-cell0-cell-mapping-twtvd" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.209031 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/564e7588-ac0e-48e5-80c5-3e4592e06601-config-data\") pod \"nova-cell0-cell-mapping-twtvd\" (UID: 
\"564e7588-ac0e-48e5-80c5-3e4592e06601\") " pod="openstack/nova-cell0-cell-mapping-twtvd" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.209194 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/564e7588-ac0e-48e5-80c5-3e4592e06601-scripts\") pod \"nova-cell0-cell-mapping-twtvd\" (UID: \"564e7588-ac0e-48e5-80c5-3e4592e06601\") " pod="openstack/nova-cell0-cell-mapping-twtvd" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.236767 4825 scope.go:117] "RemoveContainer" containerID="cc40bbe7e5127cd8ca03c26dd788c9bfc628fd248484a1f5f73946709b5b4414" Mar 10 08:18:27 crc kubenswrapper[4825]: E0310 08:18:27.237005 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.286673 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.289767 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.312499 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2rmh\" (UniqueName: \"kubernetes.io/projected/564e7588-ac0e-48e5-80c5-3e4592e06601-kube-api-access-k2rmh\") pod \"nova-cell0-cell-mapping-twtvd\" (UID: \"564e7588-ac0e-48e5-80c5-3e4592e06601\") " pod="openstack/nova-cell0-cell-mapping-twtvd" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.312793 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/564e7588-ac0e-48e5-80c5-3e4592e06601-config-data\") pod \"nova-cell0-cell-mapping-twtvd\" (UID: \"564e7588-ac0e-48e5-80c5-3e4592e06601\") " pod="openstack/nova-cell0-cell-mapping-twtvd" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.312923 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/564e7588-ac0e-48e5-80c5-3e4592e06601-scripts\") pod \"nova-cell0-cell-mapping-twtvd\" (UID: \"564e7588-ac0e-48e5-80c5-3e4592e06601\") " pod="openstack/nova-cell0-cell-mapping-twtvd" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.313263 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/564e7588-ac0e-48e5-80c5-3e4592e06601-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-twtvd\" (UID: \"564e7588-ac0e-48e5-80c5-3e4592e06601\") " pod="openstack/nova-cell0-cell-mapping-twtvd" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.316279 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.327628 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/564e7588-ac0e-48e5-80c5-3e4592e06601-config-data\") pod \"nova-cell0-cell-mapping-twtvd\" (UID: \"564e7588-ac0e-48e5-80c5-3e4592e06601\") " pod="openstack/nova-cell0-cell-mapping-twtvd" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.349788 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/564e7588-ac0e-48e5-80c5-3e4592e06601-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-twtvd\" (UID: \"564e7588-ac0e-48e5-80c5-3e4592e06601\") " pod="openstack/nova-cell0-cell-mapping-twtvd" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.355063 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2rmh\" (UniqueName: \"kubernetes.io/projected/564e7588-ac0e-48e5-80c5-3e4592e06601-kube-api-access-k2rmh\") pod \"nova-cell0-cell-mapping-twtvd\" (UID: \"564e7588-ac0e-48e5-80c5-3e4592e06601\") " pod="openstack/nova-cell0-cell-mapping-twtvd" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.356646 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/564e7588-ac0e-48e5-80c5-3e4592e06601-scripts\") pod \"nova-cell0-cell-mapping-twtvd\" (UID: \"564e7588-ac0e-48e5-80c5-3e4592e06601\") " pod="openstack/nova-cell0-cell-mapping-twtvd" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.368898 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.412205 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.414243 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.416498 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7694b530-6dab-4abd-b16c-355dfbb3cde7-config-data\") pod \"nova-api-0\" (UID: \"7694b530-6dab-4abd-b16c-355dfbb3cde7\") " pod="openstack/nova-api-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.416587 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7694b530-6dab-4abd-b16c-355dfbb3cde7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7694b530-6dab-4abd-b16c-355dfbb3cde7\") " pod="openstack/nova-api-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.416645 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfbmg\" (UniqueName: \"kubernetes.io/projected/7694b530-6dab-4abd-b16c-355dfbb3cde7-kube-api-access-nfbmg\") pod \"nova-api-0\" (UID: \"7694b530-6dab-4abd-b16c-355dfbb3cde7\") " pod="openstack/nova-api-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.416726 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7694b530-6dab-4abd-b16c-355dfbb3cde7-logs\") pod \"nova-api-0\" (UID: \"7694b530-6dab-4abd-b16c-355dfbb3cde7\") " pod="openstack/nova-api-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.430860 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.447583 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-twtvd" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.481193 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.521320 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7a1ce21-885b-475b-a111-4ca0ce117d4d-config-data\") pod \"nova-metadata-0\" (UID: \"d7a1ce21-885b-475b-a111-4ca0ce117d4d\") " pod="openstack/nova-metadata-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.521410 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7a1ce21-885b-475b-a111-4ca0ce117d4d-logs\") pod \"nova-metadata-0\" (UID: \"d7a1ce21-885b-475b-a111-4ca0ce117d4d\") " pod="openstack/nova-metadata-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.521448 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7694b530-6dab-4abd-b16c-355dfbb3cde7-config-data\") pod \"nova-api-0\" (UID: \"7694b530-6dab-4abd-b16c-355dfbb3cde7\") " pod="openstack/nova-api-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.521483 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g49t4\" (UniqueName: \"kubernetes.io/projected/d7a1ce21-885b-475b-a111-4ca0ce117d4d-kube-api-access-g49t4\") pod \"nova-metadata-0\" (UID: \"d7a1ce21-885b-475b-a111-4ca0ce117d4d\") " pod="openstack/nova-metadata-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.521516 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7a1ce21-885b-475b-a111-4ca0ce117d4d-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"d7a1ce21-885b-475b-a111-4ca0ce117d4d\") " pod="openstack/nova-metadata-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.521562 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7694b530-6dab-4abd-b16c-355dfbb3cde7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7694b530-6dab-4abd-b16c-355dfbb3cde7\") " pod="openstack/nova-api-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.521611 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfbmg\" (UniqueName: \"kubernetes.io/projected/7694b530-6dab-4abd-b16c-355dfbb3cde7-kube-api-access-nfbmg\") pod \"nova-api-0\" (UID: \"7694b530-6dab-4abd-b16c-355dfbb3cde7\") " pod="openstack/nova-api-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.521698 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7694b530-6dab-4abd-b16c-355dfbb3cde7-logs\") pod \"nova-api-0\" (UID: \"7694b530-6dab-4abd-b16c-355dfbb3cde7\") " pod="openstack/nova-api-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.522202 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7694b530-6dab-4abd-b16c-355dfbb3cde7-logs\") pod \"nova-api-0\" (UID: \"7694b530-6dab-4abd-b16c-355dfbb3cde7\") " pod="openstack/nova-api-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.535257 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.536420 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.549933 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7694b530-6dab-4abd-b16c-355dfbb3cde7-config-data\") pod \"nova-api-0\" (UID: \"7694b530-6dab-4abd-b16c-355dfbb3cde7\") " pod="openstack/nova-api-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.550216 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.551003 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7694b530-6dab-4abd-b16c-355dfbb3cde7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7694b530-6dab-4abd-b16c-355dfbb3cde7\") " pod="openstack/nova-api-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.588944 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfbmg\" (UniqueName: \"kubernetes.io/projected/7694b530-6dab-4abd-b16c-355dfbb3cde7-kube-api-access-nfbmg\") pod \"nova-api-0\" (UID: \"7694b530-6dab-4abd-b16c-355dfbb3cde7\") " pod="openstack/nova-api-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.589027 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.623149 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b68a6e58-9a3c-434e-9f2f-f81e4e508f79-config-data\") pod \"nova-scheduler-0\" (UID: \"b68a6e58-9a3c-434e-9f2f-f81e4e508f79\") " pod="openstack/nova-scheduler-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.623294 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d7a1ce21-885b-475b-a111-4ca0ce117d4d-config-data\") pod \"nova-metadata-0\" (UID: \"d7a1ce21-885b-475b-a111-4ca0ce117d4d\") " pod="openstack/nova-metadata-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.623350 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7a1ce21-885b-475b-a111-4ca0ce117d4d-logs\") pod \"nova-metadata-0\" (UID: \"d7a1ce21-885b-475b-a111-4ca0ce117d4d\") " pod="openstack/nova-metadata-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.623379 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b68a6e58-9a3c-434e-9f2f-f81e4e508f79-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b68a6e58-9a3c-434e-9f2f-f81e4e508f79\") " pod="openstack/nova-scheduler-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.623413 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g49t4\" (UniqueName: \"kubernetes.io/projected/d7a1ce21-885b-475b-a111-4ca0ce117d4d-kube-api-access-g49t4\") pod \"nova-metadata-0\" (UID: \"d7a1ce21-885b-475b-a111-4ca0ce117d4d\") " pod="openstack/nova-metadata-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.623446 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7a1ce21-885b-475b-a111-4ca0ce117d4d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d7a1ce21-885b-475b-a111-4ca0ce117d4d\") " pod="openstack/nova-metadata-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.623485 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rvtt\" (UniqueName: \"kubernetes.io/projected/b68a6e58-9a3c-434e-9f2f-f81e4e508f79-kube-api-access-5rvtt\") pod \"nova-scheduler-0\" (UID: 
\"b68a6e58-9a3c-434e-9f2f-f81e4e508f79\") " pod="openstack/nova-scheduler-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.625629 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7a1ce21-885b-475b-a111-4ca0ce117d4d-logs\") pod \"nova-metadata-0\" (UID: \"d7a1ce21-885b-475b-a111-4ca0ce117d4d\") " pod="openstack/nova-metadata-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.635301 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7a1ce21-885b-475b-a111-4ca0ce117d4d-config-data\") pod \"nova-metadata-0\" (UID: \"d7a1ce21-885b-475b-a111-4ca0ce117d4d\") " pod="openstack/nova-metadata-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.635812 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7a1ce21-885b-475b-a111-4ca0ce117d4d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d7a1ce21-885b-475b-a111-4ca0ce117d4d\") " pod="openstack/nova-metadata-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.650362 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g49t4\" (UniqueName: \"kubernetes.io/projected/d7a1ce21-885b-475b-a111-4ca0ce117d4d-kube-api-access-g49t4\") pod \"nova-metadata-0\" (UID: \"d7a1ce21-885b-475b-a111-4ca0ce117d4d\") " pod="openstack/nova-metadata-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.658766 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.660298 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.665999 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.669519 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.681361 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b65f5864c-2nkbg"] Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.689801 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b65f5864c-2nkbg"] Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.689931 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b65f5864c-2nkbg" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.726527 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f843f940-bb91-4557-bd1b-16f0d0c2135c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f843f940-bb91-4557-bd1b-16f0d0c2135c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.726669 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v84nl\" (UniqueName: \"kubernetes.io/projected/f843f940-bb91-4557-bd1b-16f0d0c2135c-kube-api-access-v84nl\") pod \"nova-cell1-novncproxy-0\" (UID: \"f843f940-bb91-4557-bd1b-16f0d0c2135c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.726837 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b68a6e58-9a3c-434e-9f2f-f81e4e508f79-combined-ca-bundle\") pod 
\"nova-scheduler-0\" (UID: \"b68a6e58-9a3c-434e-9f2f-f81e4e508f79\") " pod="openstack/nova-scheduler-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.726908 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f843f940-bb91-4557-bd1b-16f0d0c2135c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f843f940-bb91-4557-bd1b-16f0d0c2135c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.726987 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rvtt\" (UniqueName: \"kubernetes.io/projected/b68a6e58-9a3c-434e-9f2f-f81e4e508f79-kube-api-access-5rvtt\") pod \"nova-scheduler-0\" (UID: \"b68a6e58-9a3c-434e-9f2f-f81e4e508f79\") " pod="openstack/nova-scheduler-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.727076 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b68a6e58-9a3c-434e-9f2f-f81e4e508f79-config-data\") pod \"nova-scheduler-0\" (UID: \"b68a6e58-9a3c-434e-9f2f-f81e4e508f79\") " pod="openstack/nova-scheduler-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.730514 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.734146 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b68a6e58-9a3c-434e-9f2f-f81e4e508f79-config-data\") pod \"nova-scheduler-0\" (UID: \"b68a6e58-9a3c-434e-9f2f-f81e4e508f79\") " pod="openstack/nova-scheduler-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.737350 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b68a6e58-9a3c-434e-9f2f-f81e4e508f79-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b68a6e58-9a3c-434e-9f2f-f81e4e508f79\") " pod="openstack/nova-scheduler-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.747280 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rvtt\" (UniqueName: \"kubernetes.io/projected/b68a6e58-9a3c-434e-9f2f-f81e4e508f79-kube-api-access-5rvtt\") pod \"nova-scheduler-0\" (UID: \"b68a6e58-9a3c-434e-9f2f-f81e4e508f79\") " pod="openstack/nova-scheduler-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.778549 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.829148 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e09d2ea-af50-4fff-ba76-270737793800-ovsdbserver-nb\") pod \"dnsmasq-dns-7b65f5864c-2nkbg\" (UID: \"4e09d2ea-af50-4fff-ba76-270737793800\") " pod="openstack/dnsmasq-dns-7b65f5864c-2nkbg" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.829212 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v84nl\" (UniqueName: \"kubernetes.io/projected/f843f940-bb91-4557-bd1b-16f0d0c2135c-kube-api-access-v84nl\") pod \"nova-cell1-novncproxy-0\" (UID: \"f843f940-bb91-4557-bd1b-16f0d0c2135c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.829435 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcqbl\" (UniqueName: \"kubernetes.io/projected/4e09d2ea-af50-4fff-ba76-270737793800-kube-api-access-zcqbl\") pod \"dnsmasq-dns-7b65f5864c-2nkbg\" (UID: \"4e09d2ea-af50-4fff-ba76-270737793800\") " pod="openstack/dnsmasq-dns-7b65f5864c-2nkbg" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.829521 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e09d2ea-af50-4fff-ba76-270737793800-ovsdbserver-sb\") pod \"dnsmasq-dns-7b65f5864c-2nkbg\" (UID: \"4e09d2ea-af50-4fff-ba76-270737793800\") " pod="openstack/dnsmasq-dns-7b65f5864c-2nkbg" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.829546 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f843f940-bb91-4557-bd1b-16f0d0c2135c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"f843f940-bb91-4557-bd1b-16f0d0c2135c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.829583 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e09d2ea-af50-4fff-ba76-270737793800-config\") pod \"dnsmasq-dns-7b65f5864c-2nkbg\" (UID: \"4e09d2ea-af50-4fff-ba76-270737793800\") " pod="openstack/dnsmasq-dns-7b65f5864c-2nkbg" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.829647 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e09d2ea-af50-4fff-ba76-270737793800-dns-svc\") pod \"dnsmasq-dns-7b65f5864c-2nkbg\" (UID: \"4e09d2ea-af50-4fff-ba76-270737793800\") " pod="openstack/dnsmasq-dns-7b65f5864c-2nkbg" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.829679 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f843f940-bb91-4557-bd1b-16f0d0c2135c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f843f940-bb91-4557-bd1b-16f0d0c2135c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.833651 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f843f940-bb91-4557-bd1b-16f0d0c2135c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f843f940-bb91-4557-bd1b-16f0d0c2135c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.833735 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f843f940-bb91-4557-bd1b-16f0d0c2135c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f843f940-bb91-4557-bd1b-16f0d0c2135c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 
08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.867758 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v84nl\" (UniqueName: \"kubernetes.io/projected/f843f940-bb91-4557-bd1b-16f0d0c2135c-kube-api-access-v84nl\") pod \"nova-cell1-novncproxy-0\" (UID: \"f843f940-bb91-4557-bd1b-16f0d0c2135c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.930764 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e09d2ea-af50-4fff-ba76-270737793800-ovsdbserver-sb\") pod \"dnsmasq-dns-7b65f5864c-2nkbg\" (UID: \"4e09d2ea-af50-4fff-ba76-270737793800\") " pod="openstack/dnsmasq-dns-7b65f5864c-2nkbg" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.930820 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e09d2ea-af50-4fff-ba76-270737793800-config\") pod \"dnsmasq-dns-7b65f5864c-2nkbg\" (UID: \"4e09d2ea-af50-4fff-ba76-270737793800\") " pod="openstack/dnsmasq-dns-7b65f5864c-2nkbg" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.930888 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e09d2ea-af50-4fff-ba76-270737793800-dns-svc\") pod \"dnsmasq-dns-7b65f5864c-2nkbg\" (UID: \"4e09d2ea-af50-4fff-ba76-270737793800\") " pod="openstack/dnsmasq-dns-7b65f5864c-2nkbg" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.930943 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e09d2ea-af50-4fff-ba76-270737793800-ovsdbserver-nb\") pod \"dnsmasq-dns-7b65f5864c-2nkbg\" (UID: \"4e09d2ea-af50-4fff-ba76-270737793800\") " pod="openstack/dnsmasq-dns-7b65f5864c-2nkbg" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.931004 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zcqbl\" (UniqueName: \"kubernetes.io/projected/4e09d2ea-af50-4fff-ba76-270737793800-kube-api-access-zcqbl\") pod \"dnsmasq-dns-7b65f5864c-2nkbg\" (UID: \"4e09d2ea-af50-4fff-ba76-270737793800\") " pod="openstack/dnsmasq-dns-7b65f5864c-2nkbg" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.932022 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e09d2ea-af50-4fff-ba76-270737793800-config\") pod \"dnsmasq-dns-7b65f5864c-2nkbg\" (UID: \"4e09d2ea-af50-4fff-ba76-270737793800\") " pod="openstack/dnsmasq-dns-7b65f5864c-2nkbg" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.932062 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e09d2ea-af50-4fff-ba76-270737793800-dns-svc\") pod \"dnsmasq-dns-7b65f5864c-2nkbg\" (UID: \"4e09d2ea-af50-4fff-ba76-270737793800\") " pod="openstack/dnsmasq-dns-7b65f5864c-2nkbg" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.932216 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e09d2ea-af50-4fff-ba76-270737793800-ovsdbserver-nb\") pod \"dnsmasq-dns-7b65f5864c-2nkbg\" (UID: \"4e09d2ea-af50-4fff-ba76-270737793800\") " pod="openstack/dnsmasq-dns-7b65f5864c-2nkbg" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.933844 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e09d2ea-af50-4fff-ba76-270737793800-ovsdbserver-sb\") pod \"dnsmasq-dns-7b65f5864c-2nkbg\" (UID: \"4e09d2ea-af50-4fff-ba76-270737793800\") " pod="openstack/dnsmasq-dns-7b65f5864c-2nkbg" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.940827 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.951679 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcqbl\" (UniqueName: \"kubernetes.io/projected/4e09d2ea-af50-4fff-ba76-270737793800-kube-api-access-zcqbl\") pod \"dnsmasq-dns-7b65f5864c-2nkbg\" (UID: \"4e09d2ea-af50-4fff-ba76-270737793800\") " pod="openstack/dnsmasq-dns-7b65f5864c-2nkbg" Mar 10 08:18:27 crc kubenswrapper[4825]: I0310 08:18:27.993061 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 08:18:28 crc kubenswrapper[4825]: I0310 08:18:28.009570 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b65f5864c-2nkbg" Mar 10 08:18:28 crc kubenswrapper[4825]: I0310 08:18:28.022679 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-twtvd"] Mar 10 08:18:28 crc kubenswrapper[4825]: I0310 08:18:28.291548 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-lvqlw"] Mar 10 08:18:28 crc kubenswrapper[4825]: I0310 08:18:28.293642 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-lvqlw" Mar 10 08:18:28 crc kubenswrapper[4825]: I0310 08:18:28.296519 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 10 08:18:28 crc kubenswrapper[4825]: I0310 08:18:28.296573 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 10 08:18:28 crc kubenswrapper[4825]: I0310 08:18:28.303749 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 08:18:28 crc kubenswrapper[4825]: I0310 08:18:28.306650 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-twtvd" event={"ID":"564e7588-ac0e-48e5-80c5-3e4592e06601","Type":"ContainerStarted","Data":"8986ab6e16d2366472f7d644734591d7e9ee495d766effd1ccdc55397060c039"} Mar 10 08:18:28 crc kubenswrapper[4825]: I0310 08:18:28.306701 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-twtvd" event={"ID":"564e7588-ac0e-48e5-80c5-3e4592e06601","Type":"ContainerStarted","Data":"339273710b20f907a79da29578ceec40052984fd535293477813208996153a80"} Mar 10 08:18:28 crc kubenswrapper[4825]: I0310 08:18:28.314341 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-lvqlw"] Mar 10 08:18:28 crc kubenswrapper[4825]: I0310 08:18:28.335265 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-twtvd" podStartSLOduration=1.335244705 podStartE2EDuration="1.335244705s" podCreationTimestamp="2026-03-10 08:18:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:18:28.326654779 +0000 UTC m=+5661.356435394" watchObservedRunningTime="2026-03-10 08:18:28.335244705 +0000 UTC m=+5661.365025320" Mar 10 08:18:28 crc kubenswrapper[4825]: I0310 
08:18:28.346109 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/429f49f0-f549-4a10-a6d4-9dff065e5258-scripts\") pod \"nova-cell1-conductor-db-sync-lvqlw\" (UID: \"429f49f0-f549-4a10-a6d4-9dff065e5258\") " pod="openstack/nova-cell1-conductor-db-sync-lvqlw" Mar 10 08:18:28 crc kubenswrapper[4825]: I0310 08:18:28.346307 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/429f49f0-f549-4a10-a6d4-9dff065e5258-config-data\") pod \"nova-cell1-conductor-db-sync-lvqlw\" (UID: \"429f49f0-f549-4a10-a6d4-9dff065e5258\") " pod="openstack/nova-cell1-conductor-db-sync-lvqlw" Mar 10 08:18:28 crc kubenswrapper[4825]: I0310 08:18:28.346409 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc2qb\" (UniqueName: \"kubernetes.io/projected/429f49f0-f549-4a10-a6d4-9dff065e5258-kube-api-access-hc2qb\") pod \"nova-cell1-conductor-db-sync-lvqlw\" (UID: \"429f49f0-f549-4a10-a6d4-9dff065e5258\") " pod="openstack/nova-cell1-conductor-db-sync-lvqlw" Mar 10 08:18:28 crc kubenswrapper[4825]: I0310 08:18:28.346570 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429f49f0-f549-4a10-a6d4-9dff065e5258-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-lvqlw\" (UID: \"429f49f0-f549-4a10-a6d4-9dff065e5258\") " pod="openstack/nova-cell1-conductor-db-sync-lvqlw" Mar 10 08:18:28 crc kubenswrapper[4825]: I0310 08:18:28.397607 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 08:18:28 crc kubenswrapper[4825]: I0310 08:18:28.448433 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/429f49f0-f549-4a10-a6d4-9dff065e5258-scripts\") pod \"nova-cell1-conductor-db-sync-lvqlw\" (UID: \"429f49f0-f549-4a10-a6d4-9dff065e5258\") " pod="openstack/nova-cell1-conductor-db-sync-lvqlw" Mar 10 08:18:28 crc kubenswrapper[4825]: I0310 08:18:28.448751 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/429f49f0-f549-4a10-a6d4-9dff065e5258-config-data\") pod \"nova-cell1-conductor-db-sync-lvqlw\" (UID: \"429f49f0-f549-4a10-a6d4-9dff065e5258\") " pod="openstack/nova-cell1-conductor-db-sync-lvqlw" Mar 10 08:18:28 crc kubenswrapper[4825]: I0310 08:18:28.448798 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc2qb\" (UniqueName: \"kubernetes.io/projected/429f49f0-f549-4a10-a6d4-9dff065e5258-kube-api-access-hc2qb\") pod \"nova-cell1-conductor-db-sync-lvqlw\" (UID: \"429f49f0-f549-4a10-a6d4-9dff065e5258\") " pod="openstack/nova-cell1-conductor-db-sync-lvqlw" Mar 10 08:18:28 crc kubenswrapper[4825]: I0310 08:18:28.448893 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429f49f0-f549-4a10-a6d4-9dff065e5258-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-lvqlw\" (UID: \"429f49f0-f549-4a10-a6d4-9dff065e5258\") " pod="openstack/nova-cell1-conductor-db-sync-lvqlw" Mar 10 08:18:28 crc kubenswrapper[4825]: I0310 08:18:28.454821 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/429f49f0-f549-4a10-a6d4-9dff065e5258-config-data\") pod \"nova-cell1-conductor-db-sync-lvqlw\" (UID: \"429f49f0-f549-4a10-a6d4-9dff065e5258\") " pod="openstack/nova-cell1-conductor-db-sync-lvqlw" Mar 10 08:18:28 crc kubenswrapper[4825]: I0310 08:18:28.463881 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/429f49f0-f549-4a10-a6d4-9dff065e5258-scripts\") pod \"nova-cell1-conductor-db-sync-lvqlw\" (UID: \"429f49f0-f549-4a10-a6d4-9dff065e5258\") " pod="openstack/nova-cell1-conductor-db-sync-lvqlw" Mar 10 08:18:28 crc kubenswrapper[4825]: I0310 08:18:28.470819 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429f49f0-f549-4a10-a6d4-9dff065e5258-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-lvqlw\" (UID: \"429f49f0-f549-4a10-a6d4-9dff065e5258\") " pod="openstack/nova-cell1-conductor-db-sync-lvqlw" Mar 10 08:18:28 crc kubenswrapper[4825]: I0310 08:18:28.473608 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc2qb\" (UniqueName: \"kubernetes.io/projected/429f49f0-f549-4a10-a6d4-9dff065e5258-kube-api-access-hc2qb\") pod \"nova-cell1-conductor-db-sync-lvqlw\" (UID: \"429f49f0-f549-4a10-a6d4-9dff065e5258\") " pod="openstack/nova-cell1-conductor-db-sync-lvqlw" Mar 10 08:18:28 crc kubenswrapper[4825]: I0310 08:18:28.516231 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 08:18:28 crc kubenswrapper[4825]: I0310 08:18:28.601256 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 08:18:28 crc kubenswrapper[4825]: I0310 08:18:28.626009 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-lvqlw" Mar 10 08:18:28 crc kubenswrapper[4825]: I0310 08:18:28.671731 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b65f5864c-2nkbg"] Mar 10 08:18:28 crc kubenswrapper[4825]: W0310 08:18:28.683889 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e09d2ea_af50_4fff_ba76_270737793800.slice/crio-6fa927544056305e29afa011575339da3d3364c56e9bc879549ebd8e277b5cbc WatchSource:0}: Error finding container 6fa927544056305e29afa011575339da3d3364c56e9bc879549ebd8e277b5cbc: Status 404 returned error can't find the container with id 6fa927544056305e29afa011575339da3d3364c56e9bc879549ebd8e277b5cbc Mar 10 08:18:29 crc kubenswrapper[4825]: I0310 08:18:29.173392 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-lvqlw"] Mar 10 08:18:29 crc kubenswrapper[4825]: W0310 08:18:29.175916 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod429f49f0_f549_4a10_a6d4_9dff065e5258.slice/crio-7a69e0c84e5f241aefdd4ed5a0900c58126eb407548d01779ddfdb2299defbcf WatchSource:0}: Error finding container 7a69e0c84e5f241aefdd4ed5a0900c58126eb407548d01779ddfdb2299defbcf: Status 404 returned error can't find the container with id 7a69e0c84e5f241aefdd4ed5a0900c58126eb407548d01779ddfdb2299defbcf Mar 10 08:18:29 crc kubenswrapper[4825]: I0310 08:18:29.324912 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f843f940-bb91-4557-bd1b-16f0d0c2135c","Type":"ContainerStarted","Data":"45166ed59667a68349c96a59dc8a61ce9fef7b667f0f4803320a2bbaefe1d544"} Mar 10 08:18:29 crc kubenswrapper[4825]: I0310 08:18:29.333495 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"b68a6e58-9a3c-434e-9f2f-f81e4e508f79","Type":"ContainerStarted","Data":"a470d665993a035e02ecb21ae135f447f537fb62b6d00cb3a03194132d8117a9"} Mar 10 08:18:29 crc kubenswrapper[4825]: I0310 08:18:29.336778 4825 generic.go:334] "Generic (PLEG): container finished" podID="4e09d2ea-af50-4fff-ba76-270737793800" containerID="1aa04356f697b3fcdd3359ee9e10b7ab3948b624433309f62f5ac8f761d1e1a7" exitCode=0 Mar 10 08:18:29 crc kubenswrapper[4825]: I0310 08:18:29.336838 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b65f5864c-2nkbg" event={"ID":"4e09d2ea-af50-4fff-ba76-270737793800","Type":"ContainerDied","Data":"1aa04356f697b3fcdd3359ee9e10b7ab3948b624433309f62f5ac8f761d1e1a7"} Mar 10 08:18:29 crc kubenswrapper[4825]: I0310 08:18:29.336866 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b65f5864c-2nkbg" event={"ID":"4e09d2ea-af50-4fff-ba76-270737793800","Type":"ContainerStarted","Data":"6fa927544056305e29afa011575339da3d3364c56e9bc879549ebd8e277b5cbc"} Mar 10 08:18:29 crc kubenswrapper[4825]: I0310 08:18:29.341781 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d7a1ce21-885b-475b-a111-4ca0ce117d4d","Type":"ContainerStarted","Data":"65ac1896c808fff2cae5dcc7ba34953f928b75c491191b132c865c464ad2f2e2"} Mar 10 08:18:29 crc kubenswrapper[4825]: I0310 08:18:29.350886 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7694b530-6dab-4abd-b16c-355dfbb3cde7","Type":"ContainerStarted","Data":"ddcccb1a9971e0b0a8130cf6dbc87c9f174d42916eb7474adeee823a0fbedcbc"} Mar 10 08:18:29 crc kubenswrapper[4825]: I0310 08:18:29.359402 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-lvqlw" event={"ID":"429f49f0-f549-4a10-a6d4-9dff065e5258","Type":"ContainerStarted","Data":"7a69e0c84e5f241aefdd4ed5a0900c58126eb407548d01779ddfdb2299defbcf"} Mar 10 08:18:30 crc kubenswrapper[4825]: 
I0310 08:18:30.369079 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b65f5864c-2nkbg" event={"ID":"4e09d2ea-af50-4fff-ba76-270737793800","Type":"ContainerStarted","Data":"493ca5094ecb817da7adae4f0b754addd1f62b6347c54a114fccb0f800dae09a"} Mar 10 08:18:30 crc kubenswrapper[4825]: I0310 08:18:30.369272 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b65f5864c-2nkbg" Mar 10 08:18:30 crc kubenswrapper[4825]: I0310 08:18:30.372748 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-lvqlw" event={"ID":"429f49f0-f549-4a10-a6d4-9dff065e5258","Type":"ContainerStarted","Data":"1a24f86eb00e34063c60a38ed8ee69df6405c36aec606e1ca5aa9ff3f787d50e"} Mar 10 08:18:30 crc kubenswrapper[4825]: I0310 08:18:30.394471 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b65f5864c-2nkbg" podStartSLOduration=3.394450287 podStartE2EDuration="3.394450287s" podCreationTimestamp="2026-03-10 08:18:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:18:30.389464587 +0000 UTC m=+5663.419245202" watchObservedRunningTime="2026-03-10 08:18:30.394450287 +0000 UTC m=+5663.424230902" Mar 10 08:18:30 crc kubenswrapper[4825]: I0310 08:18:30.413634 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-lvqlw" podStartSLOduration=2.41360823 podStartE2EDuration="2.41360823s" podCreationTimestamp="2026-03-10 08:18:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:18:30.407656994 +0000 UTC m=+5663.437437609" watchObservedRunningTime="2026-03-10 08:18:30.41360823 +0000 UTC m=+5663.443388845" Mar 10 08:18:31 crc kubenswrapper[4825]: I0310 08:18:31.797336 4825 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 08:18:31 crc kubenswrapper[4825]: I0310 08:18:31.817476 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 08:18:32 crc kubenswrapper[4825]: I0310 08:18:32.400563 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b68a6e58-9a3c-434e-9f2f-f81e4e508f79","Type":"ContainerStarted","Data":"af451d6c609d8d92167a2839ec34e6668fc58d0ccb31df53930bac041f9ac6c1"} Mar 10 08:18:32 crc kubenswrapper[4825]: I0310 08:18:32.404166 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d7a1ce21-885b-475b-a111-4ca0ce117d4d","Type":"ContainerStarted","Data":"8d7d76a45372a15352a8afb09c32de4067530f3ab959800b6e7fb842eb96b3fb"} Mar 10 08:18:32 crc kubenswrapper[4825]: I0310 08:18:32.404221 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d7a1ce21-885b-475b-a111-4ca0ce117d4d","Type":"ContainerStarted","Data":"6946d0f532d9fa3bbb3520e9f5621790224b28a66004f4f97228d7b6a2249494"} Mar 10 08:18:32 crc kubenswrapper[4825]: I0310 08:18:32.404380 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d7a1ce21-885b-475b-a111-4ca0ce117d4d" containerName="nova-metadata-log" containerID="cri-o://6946d0f532d9fa3bbb3520e9f5621790224b28a66004f4f97228d7b6a2249494" gracePeriod=30 Mar 10 08:18:32 crc kubenswrapper[4825]: I0310 08:18:32.404683 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d7a1ce21-885b-475b-a111-4ca0ce117d4d" containerName="nova-metadata-metadata" containerID="cri-o://8d7d76a45372a15352a8afb09c32de4067530f3ab959800b6e7fb842eb96b3fb" gracePeriod=30 Mar 10 08:18:32 crc kubenswrapper[4825]: I0310 08:18:32.409775 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"7694b530-6dab-4abd-b16c-355dfbb3cde7","Type":"ContainerStarted","Data":"838350bd39928cc47234a0b89400e3d0c444bd682bc3d520183d337b8220a467"} Mar 10 08:18:32 crc kubenswrapper[4825]: I0310 08:18:32.411803 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f843f940-bb91-4557-bd1b-16f0d0c2135c","Type":"ContainerStarted","Data":"9e332074f3e89ed80e91ef8f24d5df1560b12f375ccab19be12dae5e7e06b6de"} Mar 10 08:18:32 crc kubenswrapper[4825]: I0310 08:18:32.411946 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="f843f940-bb91-4557-bd1b-16f0d0c2135c" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://9e332074f3e89ed80e91ef8f24d5df1560b12f375ccab19be12dae5e7e06b6de" gracePeriod=30 Mar 10 08:18:32 crc kubenswrapper[4825]: I0310 08:18:32.419933 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.18483376 podStartE2EDuration="5.419907803s" podCreationTimestamp="2026-03-10 08:18:27 +0000 UTC" firstStartedPulling="2026-03-10 08:18:28.537880035 +0000 UTC m=+5661.567660650" lastFinishedPulling="2026-03-10 08:18:31.772954078 +0000 UTC m=+5664.802734693" observedRunningTime="2026-03-10 08:18:32.419908173 +0000 UTC m=+5665.449688798" watchObservedRunningTime="2026-03-10 08:18:32.419907803 +0000 UTC m=+5665.449688418" Mar 10 08:18:32 crc kubenswrapper[4825]: I0310 08:18:32.473861 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.308780774 podStartE2EDuration="5.473815888s" podCreationTimestamp="2026-03-10 08:18:27 +0000 UTC" firstStartedPulling="2026-03-10 08:18:28.607921184 +0000 UTC m=+5661.637701799" lastFinishedPulling="2026-03-10 08:18:31.772956298 +0000 UTC m=+5664.802736913" observedRunningTime="2026-03-10 08:18:32.445950837 +0000 UTC m=+5665.475731462" 
watchObservedRunningTime="2026-03-10 08:18:32.473815888 +0000 UTC m=+5665.503596533" Mar 10 08:18:32 crc kubenswrapper[4825]: I0310 08:18:32.779287 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 10 08:18:32 crc kubenswrapper[4825]: I0310 08:18:32.779612 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 10 08:18:32 crc kubenswrapper[4825]: I0310 08:18:32.941828 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 10 08:18:32 crc kubenswrapper[4825]: I0310 08:18:32.994191 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 10 08:18:33 crc kubenswrapper[4825]: I0310 08:18:33.422410 4825 generic.go:334] "Generic (PLEG): container finished" podID="d7a1ce21-885b-475b-a111-4ca0ce117d4d" containerID="6946d0f532d9fa3bbb3520e9f5621790224b28a66004f4f97228d7b6a2249494" exitCode=143 Mar 10 08:18:33 crc kubenswrapper[4825]: I0310 08:18:33.422474 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d7a1ce21-885b-475b-a111-4ca0ce117d4d","Type":"ContainerDied","Data":"6946d0f532d9fa3bbb3520e9f5621790224b28a66004f4f97228d7b6a2249494"} Mar 10 08:18:33 crc kubenswrapper[4825]: I0310 08:18:33.424004 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7694b530-6dab-4abd-b16c-355dfbb3cde7","Type":"ContainerStarted","Data":"f15781b854fa9b8e9572f7ebe8fe0b280494822d98978ddce956b0d8c05f8f59"} Mar 10 08:18:33 crc kubenswrapper[4825]: I0310 08:18:33.426925 4825 generic.go:334] "Generic (PLEG): container finished" podID="429f49f0-f549-4a10-a6d4-9dff065e5258" containerID="1a24f86eb00e34063c60a38ed8ee69df6405c36aec606e1ca5aa9ff3f787d50e" exitCode=0 Mar 10 08:18:33 crc kubenswrapper[4825]: I0310 08:18:33.427445 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-conductor-db-sync-lvqlw" event={"ID":"429f49f0-f549-4a10-a6d4-9dff065e5258","Type":"ContainerDied","Data":"1a24f86eb00e34063c60a38ed8ee69df6405c36aec606e1ca5aa9ff3f787d50e"} Mar 10 08:18:33 crc kubenswrapper[4825]: I0310 08:18:33.444390 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.968669299 podStartE2EDuration="6.444370579s" podCreationTimestamp="2026-03-10 08:18:27 +0000 UTC" firstStartedPulling="2026-03-10 08:18:28.307707002 +0000 UTC m=+5661.337487617" lastFinishedPulling="2026-03-10 08:18:31.783408282 +0000 UTC m=+5664.813188897" observedRunningTime="2026-03-10 08:18:33.440330553 +0000 UTC m=+5666.470111198" watchObservedRunningTime="2026-03-10 08:18:33.444370579 +0000 UTC m=+5666.474151194" Mar 10 08:18:33 crc kubenswrapper[4825]: I0310 08:18:33.450883 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.081498692 podStartE2EDuration="6.45086449s" podCreationTimestamp="2026-03-10 08:18:27 +0000 UTC" firstStartedPulling="2026-03-10 08:18:28.403534138 +0000 UTC m=+5661.433314753" lastFinishedPulling="2026-03-10 08:18:31.772899936 +0000 UTC m=+5664.802680551" observedRunningTime="2026-03-10 08:18:32.46892318 +0000 UTC m=+5665.498703815" watchObservedRunningTime="2026-03-10 08:18:33.45086449 +0000 UTC m=+5666.480645105" Mar 10 08:18:34 crc kubenswrapper[4825]: I0310 08:18:34.437116 4825 generic.go:334] "Generic (PLEG): container finished" podID="564e7588-ac0e-48e5-80c5-3e4592e06601" containerID="8986ab6e16d2366472f7d644734591d7e9ee495d766effd1ccdc55397060c039" exitCode=0 Mar 10 08:18:34 crc kubenswrapper[4825]: I0310 08:18:34.437182 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-twtvd" event={"ID":"564e7588-ac0e-48e5-80c5-3e4592e06601","Type":"ContainerDied","Data":"8986ab6e16d2366472f7d644734591d7e9ee495d766effd1ccdc55397060c039"} Mar 10 
08:18:34 crc kubenswrapper[4825]: I0310 08:18:34.824840 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-lvqlw" Mar 10 08:18:34 crc kubenswrapper[4825]: I0310 08:18:34.922941 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hc2qb\" (UniqueName: \"kubernetes.io/projected/429f49f0-f549-4a10-a6d4-9dff065e5258-kube-api-access-hc2qb\") pod \"429f49f0-f549-4a10-a6d4-9dff065e5258\" (UID: \"429f49f0-f549-4a10-a6d4-9dff065e5258\") " Mar 10 08:18:34 crc kubenswrapper[4825]: I0310 08:18:34.923049 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/429f49f0-f549-4a10-a6d4-9dff065e5258-config-data\") pod \"429f49f0-f549-4a10-a6d4-9dff065e5258\" (UID: \"429f49f0-f549-4a10-a6d4-9dff065e5258\") " Mar 10 08:18:34 crc kubenswrapper[4825]: I0310 08:18:34.923201 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429f49f0-f549-4a10-a6d4-9dff065e5258-combined-ca-bundle\") pod \"429f49f0-f549-4a10-a6d4-9dff065e5258\" (UID: \"429f49f0-f549-4a10-a6d4-9dff065e5258\") " Mar 10 08:18:34 crc kubenswrapper[4825]: I0310 08:18:34.923235 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/429f49f0-f549-4a10-a6d4-9dff065e5258-scripts\") pod \"429f49f0-f549-4a10-a6d4-9dff065e5258\" (UID: \"429f49f0-f549-4a10-a6d4-9dff065e5258\") " Mar 10 08:18:34 crc kubenswrapper[4825]: I0310 08:18:34.927340 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/429f49f0-f549-4a10-a6d4-9dff065e5258-scripts" (OuterVolumeSpecName: "scripts") pod "429f49f0-f549-4a10-a6d4-9dff065e5258" (UID: "429f49f0-f549-4a10-a6d4-9dff065e5258"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:18:34 crc kubenswrapper[4825]: I0310 08:18:34.927616 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/429f49f0-f549-4a10-a6d4-9dff065e5258-kube-api-access-hc2qb" (OuterVolumeSpecName: "kube-api-access-hc2qb") pod "429f49f0-f549-4a10-a6d4-9dff065e5258" (UID: "429f49f0-f549-4a10-a6d4-9dff065e5258"). InnerVolumeSpecName "kube-api-access-hc2qb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:18:34 crc kubenswrapper[4825]: I0310 08:18:34.946372 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/429f49f0-f549-4a10-a6d4-9dff065e5258-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "429f49f0-f549-4a10-a6d4-9dff065e5258" (UID: "429f49f0-f549-4a10-a6d4-9dff065e5258"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:18:34 crc kubenswrapper[4825]: I0310 08:18:34.947669 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/429f49f0-f549-4a10-a6d4-9dff065e5258-config-data" (OuterVolumeSpecName: "config-data") pod "429f49f0-f549-4a10-a6d4-9dff065e5258" (UID: "429f49f0-f549-4a10-a6d4-9dff065e5258"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:18:35 crc kubenswrapper[4825]: I0310 08:18:35.025628 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hc2qb\" (UniqueName: \"kubernetes.io/projected/429f49f0-f549-4a10-a6d4-9dff065e5258-kube-api-access-hc2qb\") on node \"crc\" DevicePath \"\"" Mar 10 08:18:35 crc kubenswrapper[4825]: I0310 08:18:35.025671 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/429f49f0-f549-4a10-a6d4-9dff065e5258-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 08:18:35 crc kubenswrapper[4825]: I0310 08:18:35.025685 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429f49f0-f549-4a10-a6d4-9dff065e5258-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:18:35 crc kubenswrapper[4825]: I0310 08:18:35.025696 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/429f49f0-f549-4a10-a6d4-9dff065e5258-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 08:18:35 crc kubenswrapper[4825]: I0310 08:18:35.447687 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-lvqlw" Mar 10 08:18:35 crc kubenswrapper[4825]: I0310 08:18:35.447669 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-lvqlw" event={"ID":"429f49f0-f549-4a10-a6d4-9dff065e5258","Type":"ContainerDied","Data":"7a69e0c84e5f241aefdd4ed5a0900c58126eb407548d01779ddfdb2299defbcf"} Mar 10 08:18:35 crc kubenswrapper[4825]: I0310 08:18:35.447747 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a69e0c84e5f241aefdd4ed5a0900c58126eb407548d01779ddfdb2299defbcf" Mar 10 08:18:35 crc kubenswrapper[4825]: I0310 08:18:35.546932 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 08:18:35 crc kubenswrapper[4825]: E0310 08:18:35.547301 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="429f49f0-f549-4a10-a6d4-9dff065e5258" containerName="nova-cell1-conductor-db-sync" Mar 10 08:18:35 crc kubenswrapper[4825]: I0310 08:18:35.547317 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="429f49f0-f549-4a10-a6d4-9dff065e5258" containerName="nova-cell1-conductor-db-sync" Mar 10 08:18:35 crc kubenswrapper[4825]: I0310 08:18:35.547515 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="429f49f0-f549-4a10-a6d4-9dff065e5258" containerName="nova-cell1-conductor-db-sync" Mar 10 08:18:35 crc kubenswrapper[4825]: I0310 08:18:35.548089 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 10 08:18:35 crc kubenswrapper[4825]: I0310 08:18:35.550333 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 10 08:18:35 crc kubenswrapper[4825]: I0310 08:18:35.562913 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 08:18:35 crc kubenswrapper[4825]: I0310 08:18:35.636264 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84p4l\" (UniqueName: \"kubernetes.io/projected/e5972ff3-b652-4ed2-8414-404f2d7b24e0-kube-api-access-84p4l\") pod \"nova-cell1-conductor-0\" (UID: \"e5972ff3-b652-4ed2-8414-404f2d7b24e0\") " pod="openstack/nova-cell1-conductor-0" Mar 10 08:18:35 crc kubenswrapper[4825]: I0310 08:18:35.636317 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5972ff3-b652-4ed2-8414-404f2d7b24e0-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e5972ff3-b652-4ed2-8414-404f2d7b24e0\") " pod="openstack/nova-cell1-conductor-0" Mar 10 08:18:35 crc kubenswrapper[4825]: I0310 08:18:35.636382 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5972ff3-b652-4ed2-8414-404f2d7b24e0-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e5972ff3-b652-4ed2-8414-404f2d7b24e0\") " pod="openstack/nova-cell1-conductor-0" Mar 10 08:18:35 crc kubenswrapper[4825]: I0310 08:18:35.737991 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84p4l\" (UniqueName: \"kubernetes.io/projected/e5972ff3-b652-4ed2-8414-404f2d7b24e0-kube-api-access-84p4l\") pod \"nova-cell1-conductor-0\" (UID: \"e5972ff3-b652-4ed2-8414-404f2d7b24e0\") " pod="openstack/nova-cell1-conductor-0" Mar 10 
08:18:35 crc kubenswrapper[4825]: I0310 08:18:35.738042 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5972ff3-b652-4ed2-8414-404f2d7b24e0-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e5972ff3-b652-4ed2-8414-404f2d7b24e0\") " pod="openstack/nova-cell1-conductor-0" Mar 10 08:18:35 crc kubenswrapper[4825]: I0310 08:18:35.738098 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5972ff3-b652-4ed2-8414-404f2d7b24e0-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e5972ff3-b652-4ed2-8414-404f2d7b24e0\") " pod="openstack/nova-cell1-conductor-0" Mar 10 08:18:35 crc kubenswrapper[4825]: I0310 08:18:35.744762 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5972ff3-b652-4ed2-8414-404f2d7b24e0-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e5972ff3-b652-4ed2-8414-404f2d7b24e0\") " pod="openstack/nova-cell1-conductor-0" Mar 10 08:18:35 crc kubenswrapper[4825]: I0310 08:18:35.754782 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5972ff3-b652-4ed2-8414-404f2d7b24e0-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e5972ff3-b652-4ed2-8414-404f2d7b24e0\") " pod="openstack/nova-cell1-conductor-0" Mar 10 08:18:35 crc kubenswrapper[4825]: I0310 08:18:35.757582 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84p4l\" (UniqueName: \"kubernetes.io/projected/e5972ff3-b652-4ed2-8414-404f2d7b24e0-kube-api-access-84p4l\") pod \"nova-cell1-conductor-0\" (UID: \"e5972ff3-b652-4ed2-8414-404f2d7b24e0\") " pod="openstack/nova-cell1-conductor-0" Mar 10 08:18:35 crc kubenswrapper[4825]: I0310 08:18:35.849316 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-twtvd" Mar 10 08:18:35 crc kubenswrapper[4825]: I0310 08:18:35.901733 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 10 08:18:35 crc kubenswrapper[4825]: I0310 08:18:35.940372 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2rmh\" (UniqueName: \"kubernetes.io/projected/564e7588-ac0e-48e5-80c5-3e4592e06601-kube-api-access-k2rmh\") pod \"564e7588-ac0e-48e5-80c5-3e4592e06601\" (UID: \"564e7588-ac0e-48e5-80c5-3e4592e06601\") " Mar 10 08:18:35 crc kubenswrapper[4825]: I0310 08:18:35.940493 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/564e7588-ac0e-48e5-80c5-3e4592e06601-combined-ca-bundle\") pod \"564e7588-ac0e-48e5-80c5-3e4592e06601\" (UID: \"564e7588-ac0e-48e5-80c5-3e4592e06601\") " Mar 10 08:18:35 crc kubenswrapper[4825]: I0310 08:18:35.940652 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/564e7588-ac0e-48e5-80c5-3e4592e06601-scripts\") pod \"564e7588-ac0e-48e5-80c5-3e4592e06601\" (UID: \"564e7588-ac0e-48e5-80c5-3e4592e06601\") " Mar 10 08:18:35 crc kubenswrapper[4825]: I0310 08:18:35.940689 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/564e7588-ac0e-48e5-80c5-3e4592e06601-config-data\") pod \"564e7588-ac0e-48e5-80c5-3e4592e06601\" (UID: \"564e7588-ac0e-48e5-80c5-3e4592e06601\") " Mar 10 08:18:35 crc kubenswrapper[4825]: I0310 08:18:35.943932 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/564e7588-ac0e-48e5-80c5-3e4592e06601-kube-api-access-k2rmh" (OuterVolumeSpecName: "kube-api-access-k2rmh") pod "564e7588-ac0e-48e5-80c5-3e4592e06601" (UID: 
"564e7588-ac0e-48e5-80c5-3e4592e06601"). InnerVolumeSpecName "kube-api-access-k2rmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:18:35 crc kubenswrapper[4825]: I0310 08:18:35.950242 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/564e7588-ac0e-48e5-80c5-3e4592e06601-scripts" (OuterVolumeSpecName: "scripts") pod "564e7588-ac0e-48e5-80c5-3e4592e06601" (UID: "564e7588-ac0e-48e5-80c5-3e4592e06601"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:18:36 crc kubenswrapper[4825]: I0310 08:18:36.050544 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/564e7588-ac0e-48e5-80c5-3e4592e06601-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 08:18:36 crc kubenswrapper[4825]: I0310 08:18:36.050583 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2rmh\" (UniqueName: \"kubernetes.io/projected/564e7588-ac0e-48e5-80c5-3e4592e06601-kube-api-access-k2rmh\") on node \"crc\" DevicePath \"\"" Mar 10 08:18:36 crc kubenswrapper[4825]: I0310 08:18:36.072509 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/564e7588-ac0e-48e5-80c5-3e4592e06601-config-data" (OuterVolumeSpecName: "config-data") pod "564e7588-ac0e-48e5-80c5-3e4592e06601" (UID: "564e7588-ac0e-48e5-80c5-3e4592e06601"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:18:36 crc kubenswrapper[4825]: I0310 08:18:36.092298 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/564e7588-ac0e-48e5-80c5-3e4592e06601-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "564e7588-ac0e-48e5-80c5-3e4592e06601" (UID: "564e7588-ac0e-48e5-80c5-3e4592e06601"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:18:36 crc kubenswrapper[4825]: I0310 08:18:36.152817 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/564e7588-ac0e-48e5-80c5-3e4592e06601-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 08:18:36 crc kubenswrapper[4825]: I0310 08:18:36.153142 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/564e7588-ac0e-48e5-80c5-3e4592e06601-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:18:36 crc kubenswrapper[4825]: I0310 08:18:36.459312 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-twtvd" event={"ID":"564e7588-ac0e-48e5-80c5-3e4592e06601","Type":"ContainerDied","Data":"339273710b20f907a79da29578ceec40052984fd535293477813208996153a80"} Mar 10 08:18:36 crc kubenswrapper[4825]: I0310 08:18:36.459354 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="339273710b20f907a79da29578ceec40052984fd535293477813208996153a80" Mar 10 08:18:36 crc kubenswrapper[4825]: I0310 08:18:36.459366 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-twtvd" Mar 10 08:18:36 crc kubenswrapper[4825]: W0310 08:18:36.516804 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5972ff3_b652_4ed2_8414_404f2d7b24e0.slice/crio-6b6670fac27d4102b421612ff592d554e6ff035e7168df4573e2483c3179f626 WatchSource:0}: Error finding container 6b6670fac27d4102b421612ff592d554e6ff035e7168df4573e2483c3179f626: Status 404 returned error can't find the container with id 6b6670fac27d4102b421612ff592d554e6ff035e7168df4573e2483c3179f626 Mar 10 08:18:36 crc kubenswrapper[4825]: I0310 08:18:36.518821 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 08:18:36 crc kubenswrapper[4825]: I0310 08:18:36.634898 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 08:18:36 crc kubenswrapper[4825]: I0310 08:18:36.635414 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7694b530-6dab-4abd-b16c-355dfbb3cde7" containerName="nova-api-log" containerID="cri-o://838350bd39928cc47234a0b89400e3d0c444bd682bc3d520183d337b8220a467" gracePeriod=30 Mar 10 08:18:36 crc kubenswrapper[4825]: I0310 08:18:36.635899 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7694b530-6dab-4abd-b16c-355dfbb3cde7" containerName="nova-api-api" containerID="cri-o://f15781b854fa9b8e9572f7ebe8fe0b280494822d98978ddce956b0d8c05f8f59" gracePeriod=30 Mar 10 08:18:36 crc kubenswrapper[4825]: I0310 08:18:36.652407 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 08:18:36 crc kubenswrapper[4825]: I0310 08:18:36.653060 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="b68a6e58-9a3c-434e-9f2f-f81e4e508f79" 
containerName="nova-scheduler-scheduler" containerID="cri-o://af451d6c609d8d92167a2839ec34e6668fc58d0ccb31df53930bac041f9ac6c1" gracePeriod=30 Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.208869 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.284117 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7694b530-6dab-4abd-b16c-355dfbb3cde7-config-data\") pod \"7694b530-6dab-4abd-b16c-355dfbb3cde7\" (UID: \"7694b530-6dab-4abd-b16c-355dfbb3cde7\") " Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.284249 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7694b530-6dab-4abd-b16c-355dfbb3cde7-logs\") pod \"7694b530-6dab-4abd-b16c-355dfbb3cde7\" (UID: \"7694b530-6dab-4abd-b16c-355dfbb3cde7\") " Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.284389 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfbmg\" (UniqueName: \"kubernetes.io/projected/7694b530-6dab-4abd-b16c-355dfbb3cde7-kube-api-access-nfbmg\") pod \"7694b530-6dab-4abd-b16c-355dfbb3cde7\" (UID: \"7694b530-6dab-4abd-b16c-355dfbb3cde7\") " Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.284487 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7694b530-6dab-4abd-b16c-355dfbb3cde7-combined-ca-bundle\") pod \"7694b530-6dab-4abd-b16c-355dfbb3cde7\" (UID: \"7694b530-6dab-4abd-b16c-355dfbb3cde7\") " Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.284662 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7694b530-6dab-4abd-b16c-355dfbb3cde7-logs" (OuterVolumeSpecName: "logs") pod 
"7694b530-6dab-4abd-b16c-355dfbb3cde7" (UID: "7694b530-6dab-4abd-b16c-355dfbb3cde7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.285026 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7694b530-6dab-4abd-b16c-355dfbb3cde7-logs\") on node \"crc\" DevicePath \"\"" Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.289778 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7694b530-6dab-4abd-b16c-355dfbb3cde7-kube-api-access-nfbmg" (OuterVolumeSpecName: "kube-api-access-nfbmg") pod "7694b530-6dab-4abd-b16c-355dfbb3cde7" (UID: "7694b530-6dab-4abd-b16c-355dfbb3cde7"). InnerVolumeSpecName "kube-api-access-nfbmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.316753 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7694b530-6dab-4abd-b16c-355dfbb3cde7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7694b530-6dab-4abd-b16c-355dfbb3cde7" (UID: "7694b530-6dab-4abd-b16c-355dfbb3cde7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.321252 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7694b530-6dab-4abd-b16c-355dfbb3cde7-config-data" (OuterVolumeSpecName: "config-data") pod "7694b530-6dab-4abd-b16c-355dfbb3cde7" (UID: "7694b530-6dab-4abd-b16c-355dfbb3cde7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.386336 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfbmg\" (UniqueName: \"kubernetes.io/projected/7694b530-6dab-4abd-b16c-355dfbb3cde7-kube-api-access-nfbmg\") on node \"crc\" DevicePath \"\"" Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.386379 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7694b530-6dab-4abd-b16c-355dfbb3cde7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.386392 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7694b530-6dab-4abd-b16c-355dfbb3cde7-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.469257 4825 generic.go:334] "Generic (PLEG): container finished" podID="7694b530-6dab-4abd-b16c-355dfbb3cde7" containerID="f15781b854fa9b8e9572f7ebe8fe0b280494822d98978ddce956b0d8c05f8f59" exitCode=0 Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.469296 4825 generic.go:334] "Generic (PLEG): container finished" podID="7694b530-6dab-4abd-b16c-355dfbb3cde7" containerID="838350bd39928cc47234a0b89400e3d0c444bd682bc3d520183d337b8220a467" exitCode=143 Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.469413 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7694b530-6dab-4abd-b16c-355dfbb3cde7","Type":"ContainerDied","Data":"f15781b854fa9b8e9572f7ebe8fe0b280494822d98978ddce956b0d8c05f8f59"} Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.469451 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7694b530-6dab-4abd-b16c-355dfbb3cde7","Type":"ContainerDied","Data":"838350bd39928cc47234a0b89400e3d0c444bd682bc3d520183d337b8220a467"} Mar 10 08:18:37 crc 
kubenswrapper[4825]: I0310 08:18:37.469465 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7694b530-6dab-4abd-b16c-355dfbb3cde7","Type":"ContainerDied","Data":"ddcccb1a9971e0b0a8130cf6dbc87c9f174d42916eb7474adeee823a0fbedcbc"} Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.469481 4825 scope.go:117] "RemoveContainer" containerID="f15781b854fa9b8e9572f7ebe8fe0b280494822d98978ddce956b0d8c05f8f59" Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.469788 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.482957 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e5972ff3-b652-4ed2-8414-404f2d7b24e0","Type":"ContainerStarted","Data":"5dd6b79a640eb1c780a7157a5aa0793fc089f2ff62c548e00d338a55ea63a00c"} Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.483039 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e5972ff3-b652-4ed2-8414-404f2d7b24e0","Type":"ContainerStarted","Data":"6b6670fac27d4102b421612ff592d554e6ff035e7168df4573e2483c3179f626"} Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.483858 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.515071 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.515051622 podStartE2EDuration="2.515051622s" podCreationTimestamp="2026-03-10 08:18:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:18:37.501529677 +0000 UTC m=+5670.531310292" watchObservedRunningTime="2026-03-10 08:18:37.515051622 +0000 UTC m=+5670.544832237" Mar 10 08:18:37 crc 
kubenswrapper[4825]: I0310 08:18:37.522507 4825 scope.go:117] "RemoveContainer" containerID="838350bd39928cc47234a0b89400e3d0c444bd682bc3d520183d337b8220a467" Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.540685 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.559748 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.567380 4825 scope.go:117] "RemoveContainer" containerID="f15781b854fa9b8e9572f7ebe8fe0b280494822d98978ddce956b0d8c05f8f59" Mar 10 08:18:37 crc kubenswrapper[4825]: E0310 08:18:37.568786 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f15781b854fa9b8e9572f7ebe8fe0b280494822d98978ddce956b0d8c05f8f59\": container with ID starting with f15781b854fa9b8e9572f7ebe8fe0b280494822d98978ddce956b0d8c05f8f59 not found: ID does not exist" containerID="f15781b854fa9b8e9572f7ebe8fe0b280494822d98978ddce956b0d8c05f8f59" Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.568814 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f15781b854fa9b8e9572f7ebe8fe0b280494822d98978ddce956b0d8c05f8f59"} err="failed to get container status \"f15781b854fa9b8e9572f7ebe8fe0b280494822d98978ddce956b0d8c05f8f59\": rpc error: code = NotFound desc = could not find container \"f15781b854fa9b8e9572f7ebe8fe0b280494822d98978ddce956b0d8c05f8f59\": container with ID starting with f15781b854fa9b8e9572f7ebe8fe0b280494822d98978ddce956b0d8c05f8f59 not found: ID does not exist" Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.568835 4825 scope.go:117] "RemoveContainer" containerID="838350bd39928cc47234a0b89400e3d0c444bd682bc3d520183d337b8220a467" Mar 10 08:18:37 crc kubenswrapper[4825]: E0310 08:18:37.569168 4825 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"838350bd39928cc47234a0b89400e3d0c444bd682bc3d520183d337b8220a467\": container with ID starting with 838350bd39928cc47234a0b89400e3d0c444bd682bc3d520183d337b8220a467 not found: ID does not exist" containerID="838350bd39928cc47234a0b89400e3d0c444bd682bc3d520183d337b8220a467" Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.569192 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"838350bd39928cc47234a0b89400e3d0c444bd682bc3d520183d337b8220a467"} err="failed to get container status \"838350bd39928cc47234a0b89400e3d0c444bd682bc3d520183d337b8220a467\": rpc error: code = NotFound desc = could not find container \"838350bd39928cc47234a0b89400e3d0c444bd682bc3d520183d337b8220a467\": container with ID starting with 838350bd39928cc47234a0b89400e3d0c444bd682bc3d520183d337b8220a467 not found: ID does not exist" Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.569207 4825 scope.go:117] "RemoveContainer" containerID="f15781b854fa9b8e9572f7ebe8fe0b280494822d98978ddce956b0d8c05f8f59" Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.569399 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f15781b854fa9b8e9572f7ebe8fe0b280494822d98978ddce956b0d8c05f8f59"} err="failed to get container status \"f15781b854fa9b8e9572f7ebe8fe0b280494822d98978ddce956b0d8c05f8f59\": rpc error: code = NotFound desc = could not find container \"f15781b854fa9b8e9572f7ebe8fe0b280494822d98978ddce956b0d8c05f8f59\": container with ID starting with f15781b854fa9b8e9572f7ebe8fe0b280494822d98978ddce956b0d8c05f8f59 not found: ID does not exist" Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.569416 4825 scope.go:117] "RemoveContainer" containerID="838350bd39928cc47234a0b89400e3d0c444bd682bc3d520183d337b8220a467" Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.569624 4825 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"838350bd39928cc47234a0b89400e3d0c444bd682bc3d520183d337b8220a467"} err="failed to get container status \"838350bd39928cc47234a0b89400e3d0c444bd682bc3d520183d337b8220a467\": rpc error: code = NotFound desc = could not find container \"838350bd39928cc47234a0b89400e3d0c444bd682bc3d520183d337b8220a467\": container with ID starting with 838350bd39928cc47234a0b89400e3d0c444bd682bc3d520183d337b8220a467 not found: ID does not exist" Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.581359 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 10 08:18:37 crc kubenswrapper[4825]: E0310 08:18:37.581882 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="564e7588-ac0e-48e5-80c5-3e4592e06601" containerName="nova-manage" Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.581906 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="564e7588-ac0e-48e5-80c5-3e4592e06601" containerName="nova-manage" Mar 10 08:18:37 crc kubenswrapper[4825]: E0310 08:18:37.581923 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7694b530-6dab-4abd-b16c-355dfbb3cde7" containerName="nova-api-api" Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.581932 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="7694b530-6dab-4abd-b16c-355dfbb3cde7" containerName="nova-api-api" Mar 10 08:18:37 crc kubenswrapper[4825]: E0310 08:18:37.581972 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7694b530-6dab-4abd-b16c-355dfbb3cde7" containerName="nova-api-log" Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.581982 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="7694b530-6dab-4abd-b16c-355dfbb3cde7" containerName="nova-api-log" Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.582284 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="564e7588-ac0e-48e5-80c5-3e4592e06601" containerName="nova-manage" Mar 10 
08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.582316 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="7694b530-6dab-4abd-b16c-355dfbb3cde7" containerName="nova-api-api" Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.582337 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="7694b530-6dab-4abd-b16c-355dfbb3cde7" containerName="nova-api-log" Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.584380 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.587029 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.589890 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.695663 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8eea587f-9e48-49e9-ad29-4f71b911aa63-logs\") pod \"nova-api-0\" (UID: \"8eea587f-9e48-49e9-ad29-4f71b911aa63\") " pod="openstack/nova-api-0" Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.695749 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eea587f-9e48-49e9-ad29-4f71b911aa63-config-data\") pod \"nova-api-0\" (UID: \"8eea587f-9e48-49e9-ad29-4f71b911aa63\") " pod="openstack/nova-api-0" Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.695885 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9smj\" (UniqueName: \"kubernetes.io/projected/8eea587f-9e48-49e9-ad29-4f71b911aa63-kube-api-access-f9smj\") pod \"nova-api-0\" (UID: \"8eea587f-9e48-49e9-ad29-4f71b911aa63\") " pod="openstack/nova-api-0" Mar 10 08:18:37 crc 
kubenswrapper[4825]: I0310 08:18:37.695968 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eea587f-9e48-49e9-ad29-4f71b911aa63-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8eea587f-9e48-49e9-ad29-4f71b911aa63\") " pod="openstack/nova-api-0" Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.798117 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8eea587f-9e48-49e9-ad29-4f71b911aa63-logs\") pod \"nova-api-0\" (UID: \"8eea587f-9e48-49e9-ad29-4f71b911aa63\") " pod="openstack/nova-api-0" Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.798208 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eea587f-9e48-49e9-ad29-4f71b911aa63-config-data\") pod \"nova-api-0\" (UID: \"8eea587f-9e48-49e9-ad29-4f71b911aa63\") " pod="openstack/nova-api-0" Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.798248 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9smj\" (UniqueName: \"kubernetes.io/projected/8eea587f-9e48-49e9-ad29-4f71b911aa63-kube-api-access-f9smj\") pod \"nova-api-0\" (UID: \"8eea587f-9e48-49e9-ad29-4f71b911aa63\") " pod="openstack/nova-api-0" Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.798282 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eea587f-9e48-49e9-ad29-4f71b911aa63-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8eea587f-9e48-49e9-ad29-4f71b911aa63\") " pod="openstack/nova-api-0" Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.798582 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8eea587f-9e48-49e9-ad29-4f71b911aa63-logs\") pod 
\"nova-api-0\" (UID: \"8eea587f-9e48-49e9-ad29-4f71b911aa63\") " pod="openstack/nova-api-0" Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.802002 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eea587f-9e48-49e9-ad29-4f71b911aa63-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8eea587f-9e48-49e9-ad29-4f71b911aa63\") " pod="openstack/nova-api-0" Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.802927 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eea587f-9e48-49e9-ad29-4f71b911aa63-config-data\") pod \"nova-api-0\" (UID: \"8eea587f-9e48-49e9-ad29-4f71b911aa63\") " pod="openstack/nova-api-0" Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.815581 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9smj\" (UniqueName: \"kubernetes.io/projected/8eea587f-9e48-49e9-ad29-4f71b911aa63-kube-api-access-f9smj\") pod \"nova-api-0\" (UID: \"8eea587f-9e48-49e9-ad29-4f71b911aa63\") " pod="openstack/nova-api-0" Mar 10 08:18:37 crc kubenswrapper[4825]: I0310 08:18:37.904054 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 08:18:38 crc kubenswrapper[4825]: I0310 08:18:38.011930 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b65f5864c-2nkbg" Mar 10 08:18:38 crc kubenswrapper[4825]: I0310 08:18:38.089467 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78d6b6f8b7-qd4kd"] Mar 10 08:18:38 crc kubenswrapper[4825]: I0310 08:18:38.089671 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-78d6b6f8b7-qd4kd" podUID="16b77a07-02ee-4ec1-84ed-4d3e6203e93f" containerName="dnsmasq-dns" containerID="cri-o://66688f1a6aa8b692f481651827bd05dff89f575dfe204812d61b8ef0ad3febfd" gracePeriod=10 Mar 10 08:18:38 crc kubenswrapper[4825]: I0310 08:18:38.427631 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 08:18:38 crc kubenswrapper[4825]: I0310 08:18:38.499995 4825 generic.go:334] "Generic (PLEG): container finished" podID="16b77a07-02ee-4ec1-84ed-4d3e6203e93f" containerID="66688f1a6aa8b692f481651827bd05dff89f575dfe204812d61b8ef0ad3febfd" exitCode=0 Mar 10 08:18:38 crc kubenswrapper[4825]: I0310 08:18:38.500063 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78d6b6f8b7-qd4kd" event={"ID":"16b77a07-02ee-4ec1-84ed-4d3e6203e93f","Type":"ContainerDied","Data":"66688f1a6aa8b692f481651827bd05dff89f575dfe204812d61b8ef0ad3febfd"} Mar 10 08:18:38 crc kubenswrapper[4825]: I0310 08:18:38.501353 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8eea587f-9e48-49e9-ad29-4f71b911aa63","Type":"ContainerStarted","Data":"970f1273b5fcfe04ee4c291c98de1323ad40a590775e4d714b5d7f8b8ab6ab53"} Mar 10 08:18:38 crc kubenswrapper[4825]: I0310 08:18:38.548361 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78d6b6f8b7-qd4kd" Mar 10 08:18:38 crc kubenswrapper[4825]: I0310 08:18:38.612104 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16b77a07-02ee-4ec1-84ed-4d3e6203e93f-dns-svc\") pod \"16b77a07-02ee-4ec1-84ed-4d3e6203e93f\" (UID: \"16b77a07-02ee-4ec1-84ed-4d3e6203e93f\") " Mar 10 08:18:38 crc kubenswrapper[4825]: I0310 08:18:38.613246 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/16b77a07-02ee-4ec1-84ed-4d3e6203e93f-ovsdbserver-nb\") pod \"16b77a07-02ee-4ec1-84ed-4d3e6203e93f\" (UID: \"16b77a07-02ee-4ec1-84ed-4d3e6203e93f\") " Mar 10 08:18:38 crc kubenswrapper[4825]: I0310 08:18:38.613360 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/16b77a07-02ee-4ec1-84ed-4d3e6203e93f-ovsdbserver-sb\") pod \"16b77a07-02ee-4ec1-84ed-4d3e6203e93f\" (UID: \"16b77a07-02ee-4ec1-84ed-4d3e6203e93f\") " Mar 10 08:18:38 crc kubenswrapper[4825]: I0310 08:18:38.613464 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wf6dh\" (UniqueName: \"kubernetes.io/projected/16b77a07-02ee-4ec1-84ed-4d3e6203e93f-kube-api-access-wf6dh\") pod \"16b77a07-02ee-4ec1-84ed-4d3e6203e93f\" (UID: \"16b77a07-02ee-4ec1-84ed-4d3e6203e93f\") " Mar 10 08:18:38 crc kubenswrapper[4825]: I0310 08:18:38.613586 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16b77a07-02ee-4ec1-84ed-4d3e6203e93f-config\") pod \"16b77a07-02ee-4ec1-84ed-4d3e6203e93f\" (UID: \"16b77a07-02ee-4ec1-84ed-4d3e6203e93f\") " Mar 10 08:18:38 crc kubenswrapper[4825]: I0310 08:18:38.624789 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/16b77a07-02ee-4ec1-84ed-4d3e6203e93f-kube-api-access-wf6dh" (OuterVolumeSpecName: "kube-api-access-wf6dh") pod "16b77a07-02ee-4ec1-84ed-4d3e6203e93f" (UID: "16b77a07-02ee-4ec1-84ed-4d3e6203e93f"). InnerVolumeSpecName "kube-api-access-wf6dh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:18:38 crc kubenswrapper[4825]: I0310 08:18:38.674628 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16b77a07-02ee-4ec1-84ed-4d3e6203e93f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "16b77a07-02ee-4ec1-84ed-4d3e6203e93f" (UID: "16b77a07-02ee-4ec1-84ed-4d3e6203e93f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:18:38 crc kubenswrapper[4825]: I0310 08:18:38.677069 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16b77a07-02ee-4ec1-84ed-4d3e6203e93f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "16b77a07-02ee-4ec1-84ed-4d3e6203e93f" (UID: "16b77a07-02ee-4ec1-84ed-4d3e6203e93f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:18:38 crc kubenswrapper[4825]: I0310 08:18:38.687109 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16b77a07-02ee-4ec1-84ed-4d3e6203e93f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "16b77a07-02ee-4ec1-84ed-4d3e6203e93f" (UID: "16b77a07-02ee-4ec1-84ed-4d3e6203e93f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:18:38 crc kubenswrapper[4825]: I0310 08:18:38.699183 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16b77a07-02ee-4ec1-84ed-4d3e6203e93f-config" (OuterVolumeSpecName: "config") pod "16b77a07-02ee-4ec1-84ed-4d3e6203e93f" (UID: "16b77a07-02ee-4ec1-84ed-4d3e6203e93f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:18:38 crc kubenswrapper[4825]: I0310 08:18:38.715963 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16b77a07-02ee-4ec1-84ed-4d3e6203e93f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 08:18:38 crc kubenswrapper[4825]: I0310 08:18:38.715999 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/16b77a07-02ee-4ec1-84ed-4d3e6203e93f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 08:18:38 crc kubenswrapper[4825]: I0310 08:18:38.716012 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/16b77a07-02ee-4ec1-84ed-4d3e6203e93f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 08:18:38 crc kubenswrapper[4825]: I0310 08:18:38.716024 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wf6dh\" (UniqueName: \"kubernetes.io/projected/16b77a07-02ee-4ec1-84ed-4d3e6203e93f-kube-api-access-wf6dh\") on node \"crc\" DevicePath \"\"" Mar 10 08:18:38 crc kubenswrapper[4825]: I0310 08:18:38.716035 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16b77a07-02ee-4ec1-84ed-4d3e6203e93f-config\") on node \"crc\" DevicePath \"\"" Mar 10 08:18:39 crc kubenswrapper[4825]: I0310 08:18:39.249787 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7694b530-6dab-4abd-b16c-355dfbb3cde7" path="/var/lib/kubelet/pods/7694b530-6dab-4abd-b16c-355dfbb3cde7/volumes" Mar 10 08:18:39 crc kubenswrapper[4825]: I0310 08:18:39.516586 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78d6b6f8b7-qd4kd" event={"ID":"16b77a07-02ee-4ec1-84ed-4d3e6203e93f","Type":"ContainerDied","Data":"aabf080d2c09eb7d1df52148dba26f40fdfb08a8a9213fe8cf07da622f7a83d4"} Mar 10 08:18:39 crc kubenswrapper[4825]: I0310 
08:18:39.516673 4825 scope.go:117] "RemoveContainer" containerID="66688f1a6aa8b692f481651827bd05dff89f575dfe204812d61b8ef0ad3febfd" Mar 10 08:18:39 crc kubenswrapper[4825]: I0310 08:18:39.516609 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78d6b6f8b7-qd4kd" Mar 10 08:18:39 crc kubenswrapper[4825]: I0310 08:18:39.518917 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8eea587f-9e48-49e9-ad29-4f71b911aa63","Type":"ContainerStarted","Data":"858f7c177d82012dc13403454b29f81fdd129a40b95e41f69c5a4203358ff6ac"} Mar 10 08:18:39 crc kubenswrapper[4825]: I0310 08:18:39.518992 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8eea587f-9e48-49e9-ad29-4f71b911aa63","Type":"ContainerStarted","Data":"c16e9d3476199bb5aa6b9e33b5eec6f3e512e172a85f8fc62e5f64552fd8239b"} Mar 10 08:18:39 crc kubenswrapper[4825]: I0310 08:18:39.543807 4825 scope.go:117] "RemoveContainer" containerID="15c04c00e06290a19ea8e58bd77f516c9c11dfeb666aa465e3177f13d7abdbbd" Mar 10 08:18:39 crc kubenswrapper[4825]: I0310 08:18:39.545943 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78d6b6f8b7-qd4kd"] Mar 10 08:18:39 crc kubenswrapper[4825]: I0310 08:18:39.559422 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78d6b6f8b7-qd4kd"] Mar 10 08:18:39 crc kubenswrapper[4825]: I0310 08:18:39.581581 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.581555285 podStartE2EDuration="2.581555285s" podCreationTimestamp="2026-03-10 08:18:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:18:39.567366643 +0000 UTC m=+5672.597147268" watchObservedRunningTime="2026-03-10 08:18:39.581555285 +0000 UTC m=+5672.611335900" Mar 10 08:18:41 crc 
kubenswrapper[4825]: I0310 08:18:41.236422 4825 scope.go:117] "RemoveContainer" containerID="cc40bbe7e5127cd8ca03c26dd788c9bfc628fd248484a1f5f73946709b5b4414" Mar 10 08:18:41 crc kubenswrapper[4825]: E0310 08:18:41.237044 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:18:41 crc kubenswrapper[4825]: I0310 08:18:41.248000 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16b77a07-02ee-4ec1-84ed-4d3e6203e93f" path="/var/lib/kubelet/pods/16b77a07-02ee-4ec1-84ed-4d3e6203e93f/volumes" Mar 10 08:18:43 crc kubenswrapper[4825]: I0310 08:18:43.224670 4825 scope.go:117] "RemoveContainer" containerID="d3c45a92f19c624046e763329cbd5050855a56ea250689eeda7c7e8c21529770" Mar 10 08:18:43 crc kubenswrapper[4825]: I0310 08:18:43.259155 4825 scope.go:117] "RemoveContainer" containerID="189a96d07f651b5b7db5b6d68d9943764df8afa614cf63037b9e4dc2d490d28b" Mar 10 08:18:43 crc kubenswrapper[4825]: I0310 08:18:43.327316 4825 scope.go:117] "RemoveContainer" containerID="1c18cf7a8d81a88fb7f7c509bf70fa564fdef115c481b793ed4d7113f8cc6f48" Mar 10 08:18:45 crc kubenswrapper[4825]: I0310 08:18:45.928304 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 10 08:18:47 crc kubenswrapper[4825]: I0310 08:18:47.904661 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 08:18:47 crc kubenswrapper[4825]: I0310 08:18:47.905661 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 08:18:48 crc kubenswrapper[4825]: I0310 
08:18:48.987457 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8eea587f-9e48-49e9-ad29-4f71b911aa63" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.125:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 08:18:48 crc kubenswrapper[4825]: I0310 08:18:48.987637 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8eea587f-9e48-49e9-ad29-4f71b911aa63" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.125:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 08:18:55 crc kubenswrapper[4825]: I0310 08:18:55.236566 4825 scope.go:117] "RemoveContainer" containerID="cc40bbe7e5127cd8ca03c26dd788c9bfc628fd248484a1f5f73946709b5b4414" Mar 10 08:18:55 crc kubenswrapper[4825]: I0310 08:18:55.682778 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerStarted","Data":"527769f1bd8bc09217383522eb96db2b542ebee6be919f1442be652e8e45cf86"} Mar 10 08:18:58 crc kubenswrapper[4825]: I0310 08:18:58.987294 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8eea587f-9e48-49e9-ad29-4f71b911aa63" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.125:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 08:18:58 crc kubenswrapper[4825]: I0310 08:18:58.987798 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8eea587f-9e48-49e9-ad29-4f71b911aa63" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.125:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 08:19:02 crc kubenswrapper[4825]: I0310 08:19:02.772929 4825 generic.go:334] 
"Generic (PLEG): container finished" podID="d7a1ce21-885b-475b-a111-4ca0ce117d4d" containerID="8d7d76a45372a15352a8afb09c32de4067530f3ab959800b6e7fb842eb96b3fb" exitCode=137 Mar 10 08:19:02 crc kubenswrapper[4825]: I0310 08:19:02.773004 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d7a1ce21-885b-475b-a111-4ca0ce117d4d","Type":"ContainerDied","Data":"8d7d76a45372a15352a8afb09c32de4067530f3ab959800b6e7fb842eb96b3fb"} Mar 10 08:19:02 crc kubenswrapper[4825]: I0310 08:19:02.775911 4825 generic.go:334] "Generic (PLEG): container finished" podID="f843f940-bb91-4557-bd1b-16f0d0c2135c" containerID="9e332074f3e89ed80e91ef8f24d5df1560b12f375ccab19be12dae5e7e06b6de" exitCode=137 Mar 10 08:19:02 crc kubenswrapper[4825]: I0310 08:19:02.775951 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f843f940-bb91-4557-bd1b-16f0d0c2135c","Type":"ContainerDied","Data":"9e332074f3e89ed80e91ef8f24d5df1560b12f375ccab19be12dae5e7e06b6de"} Mar 10 08:19:02 crc kubenswrapper[4825]: I0310 08:19:02.929286 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 08:19:02 crc kubenswrapper[4825]: I0310 08:19:02.935588 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.011034 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7a1ce21-885b-475b-a111-4ca0ce117d4d-logs\") pod \"d7a1ce21-885b-475b-a111-4ca0ce117d4d\" (UID: \"d7a1ce21-885b-475b-a111-4ca0ce117d4d\") " Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.011093 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7a1ce21-885b-475b-a111-4ca0ce117d4d-config-data\") pod \"d7a1ce21-885b-475b-a111-4ca0ce117d4d\" (UID: \"d7a1ce21-885b-475b-a111-4ca0ce117d4d\") " Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.011119 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7a1ce21-885b-475b-a111-4ca0ce117d4d-combined-ca-bundle\") pod \"d7a1ce21-885b-475b-a111-4ca0ce117d4d\" (UID: \"d7a1ce21-885b-475b-a111-4ca0ce117d4d\") " Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.011251 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f843f940-bb91-4557-bd1b-16f0d0c2135c-combined-ca-bundle\") pod \"f843f940-bb91-4557-bd1b-16f0d0c2135c\" (UID: \"f843f940-bb91-4557-bd1b-16f0d0c2135c\") " Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.011323 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v84nl\" (UniqueName: \"kubernetes.io/projected/f843f940-bb91-4557-bd1b-16f0d0c2135c-kube-api-access-v84nl\") pod \"f843f940-bb91-4557-bd1b-16f0d0c2135c\" (UID: \"f843f940-bb91-4557-bd1b-16f0d0c2135c\") " Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.011364 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/f843f940-bb91-4557-bd1b-16f0d0c2135c-config-data\") pod \"f843f940-bb91-4557-bd1b-16f0d0c2135c\" (UID: \"f843f940-bb91-4557-bd1b-16f0d0c2135c\") " Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.011392 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g49t4\" (UniqueName: \"kubernetes.io/projected/d7a1ce21-885b-475b-a111-4ca0ce117d4d-kube-api-access-g49t4\") pod \"d7a1ce21-885b-475b-a111-4ca0ce117d4d\" (UID: \"d7a1ce21-885b-475b-a111-4ca0ce117d4d\") " Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.011764 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7a1ce21-885b-475b-a111-4ca0ce117d4d-logs" (OuterVolumeSpecName: "logs") pod "d7a1ce21-885b-475b-a111-4ca0ce117d4d" (UID: "d7a1ce21-885b-475b-a111-4ca0ce117d4d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.017551 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f843f940-bb91-4557-bd1b-16f0d0c2135c-kube-api-access-v84nl" (OuterVolumeSpecName: "kube-api-access-v84nl") pod "f843f940-bb91-4557-bd1b-16f0d0c2135c" (UID: "f843f940-bb91-4557-bd1b-16f0d0c2135c"). InnerVolumeSpecName "kube-api-access-v84nl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.018294 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7a1ce21-885b-475b-a111-4ca0ce117d4d-kube-api-access-g49t4" (OuterVolumeSpecName: "kube-api-access-g49t4") pod "d7a1ce21-885b-475b-a111-4ca0ce117d4d" (UID: "d7a1ce21-885b-475b-a111-4ca0ce117d4d"). InnerVolumeSpecName "kube-api-access-g49t4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.037603 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f843f940-bb91-4557-bd1b-16f0d0c2135c-config-data" (OuterVolumeSpecName: "config-data") pod "f843f940-bb91-4557-bd1b-16f0d0c2135c" (UID: "f843f940-bb91-4557-bd1b-16f0d0c2135c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.037902 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f843f940-bb91-4557-bd1b-16f0d0c2135c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f843f940-bb91-4557-bd1b-16f0d0c2135c" (UID: "f843f940-bb91-4557-bd1b-16f0d0c2135c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.039941 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7a1ce21-885b-475b-a111-4ca0ce117d4d-config-data" (OuterVolumeSpecName: "config-data") pod "d7a1ce21-885b-475b-a111-4ca0ce117d4d" (UID: "d7a1ce21-885b-475b-a111-4ca0ce117d4d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.041457 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7a1ce21-885b-475b-a111-4ca0ce117d4d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7a1ce21-885b-475b-a111-4ca0ce117d4d" (UID: "d7a1ce21-885b-475b-a111-4ca0ce117d4d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.113462 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7a1ce21-885b-475b-a111-4ca0ce117d4d-logs\") on node \"crc\" DevicePath \"\"" Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.113498 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7a1ce21-885b-475b-a111-4ca0ce117d4d-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.113512 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7a1ce21-885b-475b-a111-4ca0ce117d4d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.113526 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f843f940-bb91-4557-bd1b-16f0d0c2135c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.113536 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v84nl\" (UniqueName: \"kubernetes.io/projected/f843f940-bb91-4557-bd1b-16f0d0c2135c-kube-api-access-v84nl\") on node \"crc\" DevicePath \"\"" Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.113547 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f843f940-bb91-4557-bd1b-16f0d0c2135c-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.113556 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g49t4\" (UniqueName: \"kubernetes.io/projected/d7a1ce21-885b-475b-a111-4ca0ce117d4d-kube-api-access-g49t4\") on node \"crc\" DevicePath \"\"" Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.786078 4825 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d7a1ce21-885b-475b-a111-4ca0ce117d4d","Type":"ContainerDied","Data":"65ac1896c808fff2cae5dcc7ba34953f928b75c491191b132c865c464ad2f2e2"} Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.786092 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.786359 4825 scope.go:117] "RemoveContainer" containerID="8d7d76a45372a15352a8afb09c32de4067530f3ab959800b6e7fb842eb96b3fb" Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.788749 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f843f940-bb91-4557-bd1b-16f0d0c2135c","Type":"ContainerDied","Data":"45166ed59667a68349c96a59dc8a61ce9fef7b667f0f4803320a2bbaefe1d544"} Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.788831 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.811005 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.822229 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.828112 4825 scope.go:117] "RemoveContainer" containerID="6946d0f532d9fa3bbb3520e9f5621790224b28a66004f4f97228d7b6a2249494" Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.831986 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.844279 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.854050 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 10 08:19:03 crc kubenswrapper[4825]: E0310 08:19:03.854615 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7a1ce21-885b-475b-a111-4ca0ce117d4d" containerName="nova-metadata-log" Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.854660 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7a1ce21-885b-475b-a111-4ca0ce117d4d" containerName="nova-metadata-log" Mar 10 08:19:03 crc kubenswrapper[4825]: E0310 08:19:03.854676 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f843f940-bb91-4557-bd1b-16f0d0c2135c" containerName="nova-cell1-novncproxy-novncproxy" Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.854688 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f843f940-bb91-4557-bd1b-16f0d0c2135c" containerName="nova-cell1-novncproxy-novncproxy" Mar 10 08:19:03 crc kubenswrapper[4825]: E0310 08:19:03.854710 4825 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d7a1ce21-885b-475b-a111-4ca0ce117d4d" containerName="nova-metadata-metadata" Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.854744 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7a1ce21-885b-475b-a111-4ca0ce117d4d" containerName="nova-metadata-metadata" Mar 10 08:19:03 crc kubenswrapper[4825]: E0310 08:19:03.854769 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16b77a07-02ee-4ec1-84ed-4d3e6203e93f" containerName="init" Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.854780 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="16b77a07-02ee-4ec1-84ed-4d3e6203e93f" containerName="init" Mar 10 08:19:03 crc kubenswrapper[4825]: E0310 08:19:03.854807 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16b77a07-02ee-4ec1-84ed-4d3e6203e93f" containerName="dnsmasq-dns" Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.854818 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="16b77a07-02ee-4ec1-84ed-4d3e6203e93f" containerName="dnsmasq-dns" Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.855109 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="f843f940-bb91-4557-bd1b-16f0d0c2135c" containerName="nova-cell1-novncproxy-novncproxy" Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.855170 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7a1ce21-885b-475b-a111-4ca0ce117d4d" containerName="nova-metadata-metadata" Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.855216 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="16b77a07-02ee-4ec1-84ed-4d3e6203e93f" containerName="dnsmasq-dns" Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.855240 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7a1ce21-885b-475b-a111-4ca0ce117d4d" containerName="nova-metadata-log" Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.856844 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.859029 4825 scope.go:117] "RemoveContainer" containerID="9e332074f3e89ed80e91ef8f24d5df1560b12f375ccab19be12dae5e7e06b6de" Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.859715 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.859889 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.867298 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.869274 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.875440 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.882853 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.883044 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.883200 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.886378 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.930549 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/04673a37-4f3a-4129-9c02-16126e1a072e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"04673a37-4f3a-4129-9c02-16126e1a072e\") " pod="openstack/nova-metadata-0" Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.930644 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04673a37-4f3a-4129-9c02-16126e1a072e-config-data\") pod \"nova-metadata-0\" (UID: \"04673a37-4f3a-4129-9c02-16126e1a072e\") " pod="openstack/nova-metadata-0" Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.930665 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fvt7\" (UniqueName: \"kubernetes.io/projected/04673a37-4f3a-4129-9c02-16126e1a072e-kube-api-access-4fvt7\") pod \"nova-metadata-0\" (UID: \"04673a37-4f3a-4129-9c02-16126e1a072e\") " pod="openstack/nova-metadata-0" Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.930689 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ef73e54-5497-4af4-a1c3-89a47c67bcb8-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8ef73e54-5497-4af4-a1c3-89a47c67bcb8\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.930715 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04673a37-4f3a-4129-9c02-16126e1a072e-logs\") pod \"nova-metadata-0\" (UID: \"04673a37-4f3a-4129-9c02-16126e1a072e\") " pod="openstack/nova-metadata-0" Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.930738 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8ef73e54-5497-4af4-a1c3-89a47c67bcb8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8ef73e54-5497-4af4-a1c3-89a47c67bcb8\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.930757 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ef73e54-5497-4af4-a1c3-89a47c67bcb8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8ef73e54-5497-4af4-a1c3-89a47c67bcb8\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.930779 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/04673a37-4f3a-4129-9c02-16126e1a072e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"04673a37-4f3a-4129-9c02-16126e1a072e\") " pod="openstack/nova-metadata-0" Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.930798 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwjss\" (UniqueName: \"kubernetes.io/projected/8ef73e54-5497-4af4-a1c3-89a47c67bcb8-kube-api-access-cwjss\") pod \"nova-cell1-novncproxy-0\" (UID: \"8ef73e54-5497-4af4-a1c3-89a47c67bcb8\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 08:19:03 crc kubenswrapper[4825]: I0310 08:19:03.930844 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ef73e54-5497-4af4-a1c3-89a47c67bcb8-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8ef73e54-5497-4af4-a1c3-89a47c67bcb8\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 08:19:04 crc kubenswrapper[4825]: I0310 08:19:04.033458 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/04673a37-4f3a-4129-9c02-16126e1a072e-config-data\") pod \"nova-metadata-0\" (UID: \"04673a37-4f3a-4129-9c02-16126e1a072e\") " pod="openstack/nova-metadata-0" Mar 10 08:19:04 crc kubenswrapper[4825]: I0310 08:19:04.033516 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fvt7\" (UniqueName: \"kubernetes.io/projected/04673a37-4f3a-4129-9c02-16126e1a072e-kube-api-access-4fvt7\") pod \"nova-metadata-0\" (UID: \"04673a37-4f3a-4129-9c02-16126e1a072e\") " pod="openstack/nova-metadata-0" Mar 10 08:19:04 crc kubenswrapper[4825]: I0310 08:19:04.033557 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ef73e54-5497-4af4-a1c3-89a47c67bcb8-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8ef73e54-5497-4af4-a1c3-89a47c67bcb8\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 08:19:04 crc kubenswrapper[4825]: I0310 08:19:04.033598 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04673a37-4f3a-4129-9c02-16126e1a072e-logs\") pod \"nova-metadata-0\" (UID: \"04673a37-4f3a-4129-9c02-16126e1a072e\") " pod="openstack/nova-metadata-0" Mar 10 08:19:04 crc kubenswrapper[4825]: I0310 08:19:04.033637 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ef73e54-5497-4af4-a1c3-89a47c67bcb8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8ef73e54-5497-4af4-a1c3-89a47c67bcb8\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 08:19:04 crc kubenswrapper[4825]: I0310 08:19:04.033666 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ef73e54-5497-4af4-a1c3-89a47c67bcb8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"8ef73e54-5497-4af4-a1c3-89a47c67bcb8\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 08:19:04 crc kubenswrapper[4825]: I0310 08:19:04.033701 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/04673a37-4f3a-4129-9c02-16126e1a072e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"04673a37-4f3a-4129-9c02-16126e1a072e\") " pod="openstack/nova-metadata-0" Mar 10 08:19:04 crc kubenswrapper[4825]: I0310 08:19:04.033733 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwjss\" (UniqueName: \"kubernetes.io/projected/8ef73e54-5497-4af4-a1c3-89a47c67bcb8-kube-api-access-cwjss\") pod \"nova-cell1-novncproxy-0\" (UID: \"8ef73e54-5497-4af4-a1c3-89a47c67bcb8\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 08:19:04 crc kubenswrapper[4825]: I0310 08:19:04.033800 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ef73e54-5497-4af4-a1c3-89a47c67bcb8-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8ef73e54-5497-4af4-a1c3-89a47c67bcb8\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 08:19:04 crc kubenswrapper[4825]: I0310 08:19:04.033861 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04673a37-4f3a-4129-9c02-16126e1a072e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"04673a37-4f3a-4129-9c02-16126e1a072e\") " pod="openstack/nova-metadata-0" Mar 10 08:19:04 crc kubenswrapper[4825]: I0310 08:19:04.034669 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04673a37-4f3a-4129-9c02-16126e1a072e-logs\") pod \"nova-metadata-0\" (UID: \"04673a37-4f3a-4129-9c02-16126e1a072e\") " pod="openstack/nova-metadata-0" Mar 10 08:19:04 crc kubenswrapper[4825]: I0310 
08:19:04.039154 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ef73e54-5497-4af4-a1c3-89a47c67bcb8-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8ef73e54-5497-4af4-a1c3-89a47c67bcb8\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 08:19:04 crc kubenswrapper[4825]: I0310 08:19:04.039196 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04673a37-4f3a-4129-9c02-16126e1a072e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"04673a37-4f3a-4129-9c02-16126e1a072e\") " pod="openstack/nova-metadata-0" Mar 10 08:19:04 crc kubenswrapper[4825]: I0310 08:19:04.039789 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04673a37-4f3a-4129-9c02-16126e1a072e-config-data\") pod \"nova-metadata-0\" (UID: \"04673a37-4f3a-4129-9c02-16126e1a072e\") " pod="openstack/nova-metadata-0" Mar 10 08:19:04 crc kubenswrapper[4825]: I0310 08:19:04.042599 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ef73e54-5497-4af4-a1c3-89a47c67bcb8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8ef73e54-5497-4af4-a1c3-89a47c67bcb8\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 08:19:04 crc kubenswrapper[4825]: I0310 08:19:04.043969 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ef73e54-5497-4af4-a1c3-89a47c67bcb8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8ef73e54-5497-4af4-a1c3-89a47c67bcb8\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 08:19:04 crc kubenswrapper[4825]: I0310 08:19:04.045053 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8ef73e54-5497-4af4-a1c3-89a47c67bcb8-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8ef73e54-5497-4af4-a1c3-89a47c67bcb8\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 08:19:04 crc kubenswrapper[4825]: I0310 08:19:04.046655 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/04673a37-4f3a-4129-9c02-16126e1a072e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"04673a37-4f3a-4129-9c02-16126e1a072e\") " pod="openstack/nova-metadata-0" Mar 10 08:19:04 crc kubenswrapper[4825]: I0310 08:19:04.050189 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fvt7\" (UniqueName: \"kubernetes.io/projected/04673a37-4f3a-4129-9c02-16126e1a072e-kube-api-access-4fvt7\") pod \"nova-metadata-0\" (UID: \"04673a37-4f3a-4129-9c02-16126e1a072e\") " pod="openstack/nova-metadata-0" Mar 10 08:19:04 crc kubenswrapper[4825]: I0310 08:19:04.051467 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwjss\" (UniqueName: \"kubernetes.io/projected/8ef73e54-5497-4af4-a1c3-89a47c67bcb8-kube-api-access-cwjss\") pod \"nova-cell1-novncproxy-0\" (UID: \"8ef73e54-5497-4af4-a1c3-89a47c67bcb8\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 08:19:04 crc kubenswrapper[4825]: I0310 08:19:04.196554 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 08:19:04 crc kubenswrapper[4825]: I0310 08:19:04.217180 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 08:19:04 crc kubenswrapper[4825]: I0310 08:19:04.657445 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 08:19:04 crc kubenswrapper[4825]: W0310 08:19:04.659591 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04673a37_4f3a_4129_9c02_16126e1a072e.slice/crio-db4485f878668ba6b46de7141880ffc39259d1f0a6249726da53938342a162ed WatchSource:0}: Error finding container db4485f878668ba6b46de7141880ffc39259d1f0a6249726da53938342a162ed: Status 404 returned error can't find the container with id db4485f878668ba6b46de7141880ffc39259d1f0a6249726da53938342a162ed Mar 10 08:19:04 crc kubenswrapper[4825]: I0310 08:19:04.722933 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 08:19:04 crc kubenswrapper[4825]: W0310 08:19:04.725004 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ef73e54_5497_4af4_a1c3_89a47c67bcb8.slice/crio-9601ca175443c74cf57dda99ef13684ca0c59770a95c7c610d798ee05d1fa89e WatchSource:0}: Error finding container 9601ca175443c74cf57dda99ef13684ca0c59770a95c7c610d798ee05d1fa89e: Status 404 returned error can't find the container with id 9601ca175443c74cf57dda99ef13684ca0c59770a95c7c610d798ee05d1fa89e Mar 10 08:19:04 crc kubenswrapper[4825]: I0310 08:19:04.806048 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8ef73e54-5497-4af4-a1c3-89a47c67bcb8","Type":"ContainerStarted","Data":"9601ca175443c74cf57dda99ef13684ca0c59770a95c7c610d798ee05d1fa89e"} Mar 10 08:19:04 crc kubenswrapper[4825]: I0310 08:19:04.806928 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"04673a37-4f3a-4129-9c02-16126e1a072e","Type":"ContainerStarted","Data":"db4485f878668ba6b46de7141880ffc39259d1f0a6249726da53938342a162ed"} Mar 10 08:19:05 crc kubenswrapper[4825]: I0310 08:19:05.247972 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7a1ce21-885b-475b-a111-4ca0ce117d4d" path="/var/lib/kubelet/pods/d7a1ce21-885b-475b-a111-4ca0ce117d4d/volumes" Mar 10 08:19:05 crc kubenswrapper[4825]: I0310 08:19:05.249067 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f843f940-bb91-4557-bd1b-16f0d0c2135c" path="/var/lib/kubelet/pods/f843f940-bb91-4557-bd1b-16f0d0c2135c/volumes" Mar 10 08:19:05 crc kubenswrapper[4825]: I0310 08:19:05.819142 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8ef73e54-5497-4af4-a1c3-89a47c67bcb8","Type":"ContainerStarted","Data":"e68c633d50b127a488b6866d586381c0fc54fff2cbfd10b7d299a88df034b5a9"} Mar 10 08:19:05 crc kubenswrapper[4825]: I0310 08:19:05.830433 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"04673a37-4f3a-4129-9c02-16126e1a072e","Type":"ContainerStarted","Data":"3ca87529f407c049b8b4aab44f73dc17631384da28a48cf142fe4ef94af92232"} Mar 10 08:19:05 crc kubenswrapper[4825]: I0310 08:19:05.830489 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"04673a37-4f3a-4129-9c02-16126e1a072e","Type":"ContainerStarted","Data":"3ee24d6f700523a4adaf41742256a65e57beb3d27789b99b283e4a144ded3530"} Mar 10 08:19:05 crc kubenswrapper[4825]: I0310 08:19:05.843651 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.843633015 podStartE2EDuration="2.843633015s" podCreationTimestamp="2026-03-10 08:19:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 
08:19:05.84075125 +0000 UTC m=+5698.870531865" watchObservedRunningTime="2026-03-10 08:19:05.843633015 +0000 UTC m=+5698.873413630" Mar 10 08:19:05 crc kubenswrapper[4825]: I0310 08:19:05.861528 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.861508355 podStartE2EDuration="2.861508355s" podCreationTimestamp="2026-03-10 08:19:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:19:05.861237387 +0000 UTC m=+5698.891018012" watchObservedRunningTime="2026-03-10 08:19:05.861508355 +0000 UTC m=+5698.891288970" Mar 10 08:19:06 crc kubenswrapper[4825]: I0310 08:19:06.842012 4825 generic.go:334] "Generic (PLEG): container finished" podID="b68a6e58-9a3c-434e-9f2f-f81e4e508f79" containerID="af451d6c609d8d92167a2839ec34e6668fc58d0ccb31df53930bac041f9ac6c1" exitCode=137 Mar 10 08:19:06 crc kubenswrapper[4825]: I0310 08:19:06.842176 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b68a6e58-9a3c-434e-9f2f-f81e4e508f79","Type":"ContainerDied","Data":"af451d6c609d8d92167a2839ec34e6668fc58d0ccb31df53930bac041f9ac6c1"} Mar 10 08:19:07 crc kubenswrapper[4825]: I0310 08:19:07.026733 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 08:19:07 crc kubenswrapper[4825]: I0310 08:19:07.092278 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b68a6e58-9a3c-434e-9f2f-f81e4e508f79-combined-ca-bundle\") pod \"b68a6e58-9a3c-434e-9f2f-f81e4e508f79\" (UID: \"b68a6e58-9a3c-434e-9f2f-f81e4e508f79\") " Mar 10 08:19:07 crc kubenswrapper[4825]: I0310 08:19:07.092358 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b68a6e58-9a3c-434e-9f2f-f81e4e508f79-config-data\") pod \"b68a6e58-9a3c-434e-9f2f-f81e4e508f79\" (UID: \"b68a6e58-9a3c-434e-9f2f-f81e4e508f79\") " Mar 10 08:19:07 crc kubenswrapper[4825]: I0310 08:19:07.092523 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rvtt\" (UniqueName: \"kubernetes.io/projected/b68a6e58-9a3c-434e-9f2f-f81e4e508f79-kube-api-access-5rvtt\") pod \"b68a6e58-9a3c-434e-9f2f-f81e4e508f79\" (UID: \"b68a6e58-9a3c-434e-9f2f-f81e4e508f79\") " Mar 10 08:19:07 crc kubenswrapper[4825]: I0310 08:19:07.097551 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b68a6e58-9a3c-434e-9f2f-f81e4e508f79-kube-api-access-5rvtt" (OuterVolumeSpecName: "kube-api-access-5rvtt") pod "b68a6e58-9a3c-434e-9f2f-f81e4e508f79" (UID: "b68a6e58-9a3c-434e-9f2f-f81e4e508f79"). InnerVolumeSpecName "kube-api-access-5rvtt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:19:07 crc kubenswrapper[4825]: I0310 08:19:07.119046 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b68a6e58-9a3c-434e-9f2f-f81e4e508f79-config-data" (OuterVolumeSpecName: "config-data") pod "b68a6e58-9a3c-434e-9f2f-f81e4e508f79" (UID: "b68a6e58-9a3c-434e-9f2f-f81e4e508f79"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:19:07 crc kubenswrapper[4825]: I0310 08:19:07.122289 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b68a6e58-9a3c-434e-9f2f-f81e4e508f79-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b68a6e58-9a3c-434e-9f2f-f81e4e508f79" (UID: "b68a6e58-9a3c-434e-9f2f-f81e4e508f79"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:19:07 crc kubenswrapper[4825]: I0310 08:19:07.194414 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rvtt\" (UniqueName: \"kubernetes.io/projected/b68a6e58-9a3c-434e-9f2f-f81e4e508f79-kube-api-access-5rvtt\") on node \"crc\" DevicePath \"\"" Mar 10 08:19:07 crc kubenswrapper[4825]: I0310 08:19:07.194459 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b68a6e58-9a3c-434e-9f2f-f81e4e508f79-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:19:07 crc kubenswrapper[4825]: I0310 08:19:07.194473 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b68a6e58-9a3c-434e-9f2f-f81e4e508f79-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 08:19:07 crc kubenswrapper[4825]: I0310 08:19:07.854151 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b68a6e58-9a3c-434e-9f2f-f81e4e508f79","Type":"ContainerDied","Data":"a470d665993a035e02ecb21ae135f447f537fb62b6d00cb3a03194132d8117a9"} Mar 10 08:19:07 crc kubenswrapper[4825]: I0310 08:19:07.854199 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 08:19:07 crc kubenswrapper[4825]: I0310 08:19:07.854511 4825 scope.go:117] "RemoveContainer" containerID="af451d6c609d8d92167a2839ec34e6668fc58d0ccb31df53930bac041f9ac6c1" Mar 10 08:19:07 crc kubenswrapper[4825]: I0310 08:19:07.883295 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 08:19:07 crc kubenswrapper[4825]: I0310 08:19:07.894464 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 08:19:07 crc kubenswrapper[4825]: I0310 08:19:07.903108 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 08:19:07 crc kubenswrapper[4825]: E0310 08:19:07.903527 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b68a6e58-9a3c-434e-9f2f-f81e4e508f79" containerName="nova-scheduler-scheduler" Mar 10 08:19:07 crc kubenswrapper[4825]: I0310 08:19:07.903549 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b68a6e58-9a3c-434e-9f2f-f81e4e508f79" containerName="nova-scheduler-scheduler" Mar 10 08:19:07 crc kubenswrapper[4825]: I0310 08:19:07.903701 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="b68a6e58-9a3c-434e-9f2f-f81e4e508f79" containerName="nova-scheduler-scheduler" Mar 10 08:19:07 crc kubenswrapper[4825]: I0310 08:19:07.905531 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 08:19:07 crc kubenswrapper[4825]: I0310 08:19:07.906553 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 10 08:19:07 crc kubenswrapper[4825]: I0310 08:19:07.906640 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 10 08:19:07 crc kubenswrapper[4825]: I0310 08:19:07.909418 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 10 08:19:07 crc kubenswrapper[4825]: I0310 08:19:07.917700 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 08:19:08 crc kubenswrapper[4825]: I0310 08:19:08.006056 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60-config-data\") pod \"nova-scheduler-0\" (UID: \"360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60\") " pod="openstack/nova-scheduler-0" Mar 10 08:19:08 crc kubenswrapper[4825]: I0310 08:19:08.006197 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60\") " pod="openstack/nova-scheduler-0" Mar 10 08:19:08 crc kubenswrapper[4825]: I0310 08:19:08.006259 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px8v8\" (UniqueName: \"kubernetes.io/projected/360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60-kube-api-access-px8v8\") pod \"nova-scheduler-0\" (UID: \"360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60\") " pod="openstack/nova-scheduler-0" Mar 10 08:19:08 crc kubenswrapper[4825]: I0310 08:19:08.107832 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-px8v8\" (UniqueName: \"kubernetes.io/projected/360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60-kube-api-access-px8v8\") pod \"nova-scheduler-0\" (UID: \"360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60\") " pod="openstack/nova-scheduler-0" Mar 10 08:19:08 crc kubenswrapper[4825]: I0310 08:19:08.107965 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60-config-data\") pod \"nova-scheduler-0\" (UID: \"360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60\") " pod="openstack/nova-scheduler-0" Mar 10 08:19:08 crc kubenswrapper[4825]: I0310 08:19:08.108058 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60\") " pod="openstack/nova-scheduler-0" Mar 10 08:19:08 crc kubenswrapper[4825]: I0310 08:19:08.112522 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60\") " pod="openstack/nova-scheduler-0" Mar 10 08:19:08 crc kubenswrapper[4825]: I0310 08:19:08.116271 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60-config-data\") pod \"nova-scheduler-0\" (UID: \"360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60\") " pod="openstack/nova-scheduler-0" Mar 10 08:19:08 crc kubenswrapper[4825]: I0310 08:19:08.123313 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px8v8\" (UniqueName: \"kubernetes.io/projected/360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60-kube-api-access-px8v8\") pod \"nova-scheduler-0\" (UID: 
\"360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60\") " pod="openstack/nova-scheduler-0" Mar 10 08:19:08 crc kubenswrapper[4825]: I0310 08:19:08.231360 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 08:19:08 crc kubenswrapper[4825]: I0310 08:19:08.713062 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 08:19:08 crc kubenswrapper[4825]: W0310 08:19:08.717192 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod360a0a5a_bfd5_481d_8f0a_bd6a24ec4a60.slice/crio-5a74c611f00214c9d1f633747bc7850fb58339d013708a0fdae8bfec0180558f WatchSource:0}: Error finding container 5a74c611f00214c9d1f633747bc7850fb58339d013708a0fdae8bfec0180558f: Status 404 returned error can't find the container with id 5a74c611f00214c9d1f633747bc7850fb58339d013708a0fdae8bfec0180558f Mar 10 08:19:08 crc kubenswrapper[4825]: I0310 08:19:08.866326 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60","Type":"ContainerStarted","Data":"5a74c611f00214c9d1f633747bc7850fb58339d013708a0fdae8bfec0180558f"} Mar 10 08:19:08 crc kubenswrapper[4825]: I0310 08:19:08.989389 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8eea587f-9e48-49e9-ad29-4f71b911aa63" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.125:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 08:19:08 crc kubenswrapper[4825]: I0310 08:19:08.989485 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8eea587f-9e48-49e9-ad29-4f71b911aa63" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.125:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 08:19:09 crc 
kubenswrapper[4825]: I0310 08:19:09.196826 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 10 08:19:09 crc kubenswrapper[4825]: I0310 08:19:09.196878 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 10 08:19:09 crc kubenswrapper[4825]: I0310 08:19:09.218277 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Mar 10 08:19:09 crc kubenswrapper[4825]: I0310 08:19:09.248426 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b68a6e58-9a3c-434e-9f2f-f81e4e508f79" path="/var/lib/kubelet/pods/b68a6e58-9a3c-434e-9f2f-f81e4e508f79/volumes"
Mar 10 08:19:09 crc kubenswrapper[4825]: I0310 08:19:09.876459 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60","Type":"ContainerStarted","Data":"a64b689235496f66774f1a8cb304a843e375f0afa782f0fbe9dad0ffef637a94"}
Mar 10 08:19:09 crc kubenswrapper[4825]: I0310 08:19:09.904547 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.904529009 podStartE2EDuration="2.904529009s" podCreationTimestamp="2026-03-10 08:19:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:19:09.894116036 +0000 UTC m=+5702.923896651" watchObservedRunningTime="2026-03-10 08:19:09.904529009 +0000 UTC m=+5702.934309624"
Mar 10 08:19:13 crc kubenswrapper[4825]: I0310 08:19:13.232344 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 10 08:19:14 crc kubenswrapper[4825]: I0310 08:19:14.197703 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 10 08:19:14 crc kubenswrapper[4825]: I0310 08:19:14.198343 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 10 08:19:14 crc kubenswrapper[4825]: I0310 08:19:14.218291 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Mar 10 08:19:14 crc kubenswrapper[4825]: I0310 08:19:14.236377 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Mar 10 08:19:14 crc kubenswrapper[4825]: I0310 08:19:14.953293 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Mar 10 08:19:15 crc kubenswrapper[4825]: I0310 08:19:15.148090 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-mwgq4"]
Mar 10 08:19:15 crc kubenswrapper[4825]: I0310 08:19:15.150437 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mwgq4"
Mar 10 08:19:15 crc kubenswrapper[4825]: I0310 08:19:15.153471 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Mar 10 08:19:15 crc kubenswrapper[4825]: I0310 08:19:15.153588 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Mar 10 08:19:15 crc kubenswrapper[4825]: I0310 08:19:15.158304 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-mwgq4"]
Mar 10 08:19:15 crc kubenswrapper[4825]: I0310 08:19:15.212396 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="04673a37-4f3a-4129-9c02-16126e1a072e" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.126:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 10 08:19:15 crc kubenswrapper[4825]: I0310 08:19:15.212399 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="04673a37-4f3a-4129-9c02-16126e1a072e" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.126:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 10 08:19:15 crc kubenswrapper[4825]: I0310 08:19:15.256156 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc3963c8-93c2-4495-96f5-24227a0db90d-scripts\") pod \"nova-cell1-cell-mapping-mwgq4\" (UID: \"dc3963c8-93c2-4495-96f5-24227a0db90d\") " pod="openstack/nova-cell1-cell-mapping-mwgq4"
Mar 10 08:19:15 crc kubenswrapper[4825]: I0310 08:19:15.256226 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc3963c8-93c2-4495-96f5-24227a0db90d-config-data\") pod \"nova-cell1-cell-mapping-mwgq4\" (UID: \"dc3963c8-93c2-4495-96f5-24227a0db90d\") " pod="openstack/nova-cell1-cell-mapping-mwgq4"
Mar 10 08:19:15 crc kubenswrapper[4825]: I0310 08:19:15.256497 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m5l6\" (UniqueName: \"kubernetes.io/projected/dc3963c8-93c2-4495-96f5-24227a0db90d-kube-api-access-8m5l6\") pod \"nova-cell1-cell-mapping-mwgq4\" (UID: \"dc3963c8-93c2-4495-96f5-24227a0db90d\") " pod="openstack/nova-cell1-cell-mapping-mwgq4"
Mar 10 08:19:15 crc kubenswrapper[4825]: I0310 08:19:15.256640 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc3963c8-93c2-4495-96f5-24227a0db90d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mwgq4\" (UID: \"dc3963c8-93c2-4495-96f5-24227a0db90d\") " pod="openstack/nova-cell1-cell-mapping-mwgq4"
Mar 10 08:19:15 crc kubenswrapper[4825]: I0310 08:19:15.357978 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc3963c8-93c2-4495-96f5-24227a0db90d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mwgq4\" (UID: \"dc3963c8-93c2-4495-96f5-24227a0db90d\") " pod="openstack/nova-cell1-cell-mapping-mwgq4"
Mar 10 08:19:15 crc kubenswrapper[4825]: I0310 08:19:15.358069 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc3963c8-93c2-4495-96f5-24227a0db90d-scripts\") pod \"nova-cell1-cell-mapping-mwgq4\" (UID: \"dc3963c8-93c2-4495-96f5-24227a0db90d\") " pod="openstack/nova-cell1-cell-mapping-mwgq4"
Mar 10 08:19:15 crc kubenswrapper[4825]: I0310 08:19:15.358111 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc3963c8-93c2-4495-96f5-24227a0db90d-config-data\") pod \"nova-cell1-cell-mapping-mwgq4\" (UID: \"dc3963c8-93c2-4495-96f5-24227a0db90d\") " pod="openstack/nova-cell1-cell-mapping-mwgq4"
Mar 10 08:19:15 crc kubenswrapper[4825]: I0310 08:19:15.358219 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8m5l6\" (UniqueName: \"kubernetes.io/projected/dc3963c8-93c2-4495-96f5-24227a0db90d-kube-api-access-8m5l6\") pod \"nova-cell1-cell-mapping-mwgq4\" (UID: \"dc3963c8-93c2-4495-96f5-24227a0db90d\") " pod="openstack/nova-cell1-cell-mapping-mwgq4"
Mar 10 08:19:15 crc kubenswrapper[4825]: I0310 08:19:15.365595 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc3963c8-93c2-4495-96f5-24227a0db90d-scripts\") pod \"nova-cell1-cell-mapping-mwgq4\" (UID: \"dc3963c8-93c2-4495-96f5-24227a0db90d\") " pod="openstack/nova-cell1-cell-mapping-mwgq4"
Mar 10 08:19:15 crc kubenswrapper[4825]: I0310 08:19:15.365905 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc3963c8-93c2-4495-96f5-24227a0db90d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mwgq4\" (UID: \"dc3963c8-93c2-4495-96f5-24227a0db90d\") " pod="openstack/nova-cell1-cell-mapping-mwgq4"
Mar 10 08:19:15 crc kubenswrapper[4825]: I0310 08:19:15.366016 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc3963c8-93c2-4495-96f5-24227a0db90d-config-data\") pod \"nova-cell1-cell-mapping-mwgq4\" (UID: \"dc3963c8-93c2-4495-96f5-24227a0db90d\") " pod="openstack/nova-cell1-cell-mapping-mwgq4"
Mar 10 08:19:15 crc kubenswrapper[4825]: I0310 08:19:15.384062 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m5l6\" (UniqueName: \"kubernetes.io/projected/dc3963c8-93c2-4495-96f5-24227a0db90d-kube-api-access-8m5l6\") pod \"nova-cell1-cell-mapping-mwgq4\" (UID: \"dc3963c8-93c2-4495-96f5-24227a0db90d\") " pod="openstack/nova-cell1-cell-mapping-mwgq4"
Mar 10 08:19:15 crc kubenswrapper[4825]: I0310 08:19:15.470519 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mwgq4"
Mar 10 08:19:15 crc kubenswrapper[4825]: I0310 08:19:15.797321 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-mwgq4"]
Mar 10 08:19:15 crc kubenswrapper[4825]: W0310 08:19:15.817601 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc3963c8_93c2_4495_96f5_24227a0db90d.slice/crio-e29eeaeceff0676704f958482fc9c2ab955ca623827d420fd9a7b66b5c980060 WatchSource:0}: Error finding container e29eeaeceff0676704f958482fc9c2ab955ca623827d420fd9a7b66b5c980060: Status 404 returned error can't find the container with id e29eeaeceff0676704f958482fc9c2ab955ca623827d420fd9a7b66b5c980060
Mar 10 08:19:15 crc kubenswrapper[4825]: I0310 08:19:15.941041 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mwgq4" event={"ID":"dc3963c8-93c2-4495-96f5-24227a0db90d","Type":"ContainerStarted","Data":"e29eeaeceff0676704f958482fc9c2ab955ca623827d420fd9a7b66b5c980060"}
Mar 10 08:19:16 crc kubenswrapper[4825]: I0310 08:19:16.954033 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mwgq4" event={"ID":"dc3963c8-93c2-4495-96f5-24227a0db90d","Type":"ContainerStarted","Data":"60937d0ffd4b47821e2bfc207c85886292910aaf52e0dbd87838e3cb21ba27f7"}
Mar 10 08:19:16 crc kubenswrapper[4825]: I0310 08:19:16.971046 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-mwgq4" podStartSLOduration=1.9710271320000001 podStartE2EDuration="1.971027132s" podCreationTimestamp="2026-03-10 08:19:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:19:16.964723556 +0000 UTC m=+5709.994504171" watchObservedRunningTime="2026-03-10 08:19:16.971027132 +0000 UTC m=+5710.000807747"
Mar 10 08:19:18 crc kubenswrapper[4825]: I0310 08:19:18.231991 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 10 08:19:18 crc kubenswrapper[4825]: I0310 08:19:18.260863 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 10 08:19:18 crc kubenswrapper[4825]: I0310 08:19:18.987458 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8eea587f-9e48-49e9-ad29-4f71b911aa63" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.125:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 10 08:19:18 crc kubenswrapper[4825]: I0310 08:19:18.987567 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8eea587f-9e48-49e9-ad29-4f71b911aa63" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.125:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 10 08:19:18 crc kubenswrapper[4825]: I0310 08:19:18.996662 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 10 08:19:20 crc kubenswrapper[4825]: I0310 08:19:20.991835 4825 generic.go:334] "Generic (PLEG): container finished" podID="dc3963c8-93c2-4495-96f5-24227a0db90d" containerID="60937d0ffd4b47821e2bfc207c85886292910aaf52e0dbd87838e3cb21ba27f7" exitCode=0
Mar 10 08:19:20 crc kubenswrapper[4825]: I0310 08:19:20.991884 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mwgq4" event={"ID":"dc3963c8-93c2-4495-96f5-24227a0db90d","Type":"ContainerDied","Data":"60937d0ffd4b47821e2bfc207c85886292910aaf52e0dbd87838e3cb21ba27f7"}
Mar 10 08:19:22 crc kubenswrapper[4825]: I0310 08:19:22.345296 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mwgq4"
Mar 10 08:19:22 crc kubenswrapper[4825]: I0310 08:19:22.501819 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc3963c8-93c2-4495-96f5-24227a0db90d-combined-ca-bundle\") pod \"dc3963c8-93c2-4495-96f5-24227a0db90d\" (UID: \"dc3963c8-93c2-4495-96f5-24227a0db90d\") "
Mar 10 08:19:22 crc kubenswrapper[4825]: I0310 08:19:22.501979 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8m5l6\" (UniqueName: \"kubernetes.io/projected/dc3963c8-93c2-4495-96f5-24227a0db90d-kube-api-access-8m5l6\") pod \"dc3963c8-93c2-4495-96f5-24227a0db90d\" (UID: \"dc3963c8-93c2-4495-96f5-24227a0db90d\") "
Mar 10 08:19:22 crc kubenswrapper[4825]: I0310 08:19:22.502028 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc3963c8-93c2-4495-96f5-24227a0db90d-config-data\") pod \"dc3963c8-93c2-4495-96f5-24227a0db90d\" (UID: \"dc3963c8-93c2-4495-96f5-24227a0db90d\") "
Mar 10 08:19:22 crc kubenswrapper[4825]: I0310 08:19:22.502103 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc3963c8-93c2-4495-96f5-24227a0db90d-scripts\") pod \"dc3963c8-93c2-4495-96f5-24227a0db90d\" (UID: \"dc3963c8-93c2-4495-96f5-24227a0db90d\") "
Mar 10 08:19:22 crc kubenswrapper[4825]: I0310 08:19:22.515944 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc3963c8-93c2-4495-96f5-24227a0db90d-kube-api-access-8m5l6" (OuterVolumeSpecName: "kube-api-access-8m5l6") pod "dc3963c8-93c2-4495-96f5-24227a0db90d" (UID: "dc3963c8-93c2-4495-96f5-24227a0db90d"). InnerVolumeSpecName "kube-api-access-8m5l6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 08:19:22 crc kubenswrapper[4825]: I0310 08:19:22.518855 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc3963c8-93c2-4495-96f5-24227a0db90d-scripts" (OuterVolumeSpecName: "scripts") pod "dc3963c8-93c2-4495-96f5-24227a0db90d" (UID: "dc3963c8-93c2-4495-96f5-24227a0db90d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 08:19:22 crc kubenswrapper[4825]: I0310 08:19:22.539121 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc3963c8-93c2-4495-96f5-24227a0db90d-config-data" (OuterVolumeSpecName: "config-data") pod "dc3963c8-93c2-4495-96f5-24227a0db90d" (UID: "dc3963c8-93c2-4495-96f5-24227a0db90d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 08:19:22 crc kubenswrapper[4825]: I0310 08:19:22.549254 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc3963c8-93c2-4495-96f5-24227a0db90d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc3963c8-93c2-4495-96f5-24227a0db90d" (UID: "dc3963c8-93c2-4495-96f5-24227a0db90d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 08:19:22 crc kubenswrapper[4825]: I0310 08:19:22.604830 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc3963c8-93c2-4495-96f5-24227a0db90d-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 08:19:22 crc kubenswrapper[4825]: I0310 08:19:22.604887 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc3963c8-93c2-4495-96f5-24227a0db90d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 08:19:22 crc kubenswrapper[4825]: I0310 08:19:22.604903 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8m5l6\" (UniqueName: \"kubernetes.io/projected/dc3963c8-93c2-4495-96f5-24227a0db90d-kube-api-access-8m5l6\") on node \"crc\" DevicePath \"\""
Mar 10 08:19:22 crc kubenswrapper[4825]: I0310 08:19:22.604916 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc3963c8-93c2-4495-96f5-24227a0db90d-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 08:19:23 crc kubenswrapper[4825]: I0310 08:19:23.062777 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mwgq4" event={"ID":"dc3963c8-93c2-4495-96f5-24227a0db90d","Type":"ContainerDied","Data":"e29eeaeceff0676704f958482fc9c2ab955ca623827d420fd9a7b66b5c980060"}
Mar 10 08:19:23 crc kubenswrapper[4825]: I0310 08:19:23.063186 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e29eeaeceff0676704f958482fc9c2ab955ca623827d420fd9a7b66b5c980060"
Mar 10 08:19:23 crc kubenswrapper[4825]: I0310 08:19:23.062793 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mwgq4"
Mar 10 08:19:23 crc kubenswrapper[4825]: I0310 08:19:23.204239 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 10 08:19:23 crc kubenswrapper[4825]: I0310 08:19:23.204492 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60" containerName="nova-scheduler-scheduler" containerID="cri-o://a64b689235496f66774f1a8cb304a843e375f0afa782f0fbe9dad0ffef637a94" gracePeriod=30
Mar 10 08:19:23 crc kubenswrapper[4825]: E0310 08:19:23.234316 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a64b689235496f66774f1a8cb304a843e375f0afa782f0fbe9dad0ffef637a94" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 10 08:19:23 crc kubenswrapper[4825]: E0310 08:19:23.238588 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a64b689235496f66774f1a8cb304a843e375f0afa782f0fbe9dad0ffef637a94" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 10 08:19:23 crc kubenswrapper[4825]: E0310 08:19:23.243317 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a64b689235496f66774f1a8cb304a843e375f0afa782f0fbe9dad0ffef637a94" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 10 08:19:23 crc kubenswrapper[4825]: E0310 08:19:23.243383 4825 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60" containerName="nova-scheduler-scheduler"
Mar 10 08:19:23 crc kubenswrapper[4825]: I0310 08:19:23.293660 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 10 08:19:23 crc kubenswrapper[4825]: I0310 08:19:23.294185 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8eea587f-9e48-49e9-ad29-4f71b911aa63" containerName="nova-api-log" containerID="cri-o://c16e9d3476199bb5aa6b9e33b5eec6f3e512e172a85f8fc62e5f64552fd8239b" gracePeriod=30
Mar 10 08:19:23 crc kubenswrapper[4825]: I0310 08:19:23.294244 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8eea587f-9e48-49e9-ad29-4f71b911aa63" containerName="nova-api-api" containerID="cri-o://858f7c177d82012dc13403454b29f81fdd129a40b95e41f69c5a4203358ff6ac" gracePeriod=30
Mar 10 08:19:23 crc kubenswrapper[4825]: I0310 08:19:23.317052 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 10 08:19:23 crc kubenswrapper[4825]: I0310 08:19:23.317374 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="04673a37-4f3a-4129-9c02-16126e1a072e" containerName="nova-metadata-log" containerID="cri-o://3ee24d6f700523a4adaf41742256a65e57beb3d27789b99b283e4a144ded3530" gracePeriod=30
Mar 10 08:19:23 crc kubenswrapper[4825]: I0310 08:19:23.317480 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="04673a37-4f3a-4129-9c02-16126e1a072e" containerName="nova-metadata-metadata" containerID="cri-o://3ca87529f407c049b8b4aab44f73dc17631384da28a48cf142fe4ef94af92232" gracePeriod=30
Mar 10 08:19:24 crc kubenswrapper[4825]: I0310 08:19:24.074283 4825 generic.go:334] "Generic (PLEG): container finished" podID="8eea587f-9e48-49e9-ad29-4f71b911aa63" containerID="c16e9d3476199bb5aa6b9e33b5eec6f3e512e172a85f8fc62e5f64552fd8239b" exitCode=143
Mar 10 08:19:24 crc kubenswrapper[4825]: I0310 08:19:24.074388 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8eea587f-9e48-49e9-ad29-4f71b911aa63","Type":"ContainerDied","Data":"c16e9d3476199bb5aa6b9e33b5eec6f3e512e172a85f8fc62e5f64552fd8239b"}
Mar 10 08:19:24 crc kubenswrapper[4825]: I0310 08:19:24.076438 4825 generic.go:334] "Generic (PLEG): container finished" podID="04673a37-4f3a-4129-9c02-16126e1a072e" containerID="3ee24d6f700523a4adaf41742256a65e57beb3d27789b99b283e4a144ded3530" exitCode=143
Mar 10 08:19:24 crc kubenswrapper[4825]: I0310 08:19:24.076468 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"04673a37-4f3a-4129-9c02-16126e1a072e","Type":"ContainerDied","Data":"3ee24d6f700523a4adaf41742256a65e57beb3d27789b99b283e4a144ded3530"}
Mar 10 08:19:28 crc kubenswrapper[4825]: E0310 08:19:28.234078 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a64b689235496f66774f1a8cb304a843e375f0afa782f0fbe9dad0ffef637a94" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 10 08:19:28 crc kubenswrapper[4825]: E0310 08:19:28.236849 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a64b689235496f66774f1a8cb304a843e375f0afa782f0fbe9dad0ffef637a94" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 10 08:19:28 crc kubenswrapper[4825]: E0310 08:19:28.238461 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a64b689235496f66774f1a8cb304a843e375f0afa782f0fbe9dad0ffef637a94" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 10 08:19:28 crc kubenswrapper[4825]: E0310 08:19:28.238517 4825 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60" containerName="nova-scheduler-scheduler"
Mar 10 08:19:33 crc kubenswrapper[4825]: E0310 08:19:33.233829 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a64b689235496f66774f1a8cb304a843e375f0afa782f0fbe9dad0ffef637a94" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 10 08:19:33 crc kubenswrapper[4825]: E0310 08:19:33.237696 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a64b689235496f66774f1a8cb304a843e375f0afa782f0fbe9dad0ffef637a94" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 10 08:19:33 crc kubenswrapper[4825]: E0310 08:19:33.239333 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a64b689235496f66774f1a8cb304a843e375f0afa782f0fbe9dad0ffef637a94" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 10 08:19:33 crc kubenswrapper[4825]: E0310 08:19:33.239399 4825 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60" containerName="nova-scheduler-scheduler"
Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.135436 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.139825 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.193347 4825 generic.go:334] "Generic (PLEG): container finished" podID="8eea587f-9e48-49e9-ad29-4f71b911aa63" containerID="858f7c177d82012dc13403454b29f81fdd129a40b95e41f69c5a4203358ff6ac" exitCode=0
Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.193523 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.193561 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8eea587f-9e48-49e9-ad29-4f71b911aa63","Type":"ContainerDied","Data":"858f7c177d82012dc13403454b29f81fdd129a40b95e41f69c5a4203358ff6ac"}
Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.193606 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8eea587f-9e48-49e9-ad29-4f71b911aa63","Type":"ContainerDied","Data":"970f1273b5fcfe04ee4c291c98de1323ad40a590775e4d714b5d7f8b8ab6ab53"}
Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.193628 4825 scope.go:117] "RemoveContainer" containerID="858f7c177d82012dc13403454b29f81fdd129a40b95e41f69c5a4203358ff6ac"
Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.196633 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.196739 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"04673a37-4f3a-4129-9c02-16126e1a072e","Type":"ContainerDied","Data":"3ca87529f407c049b8b4aab44f73dc17631384da28a48cf142fe4ef94af92232"}
Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.196834 4825 generic.go:334] "Generic (PLEG): container finished" podID="04673a37-4f3a-4129-9c02-16126e1a072e" containerID="3ca87529f407c049b8b4aab44f73dc17631384da28a48cf142fe4ef94af92232" exitCode=0
Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.196944 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"04673a37-4f3a-4129-9c02-16126e1a072e","Type":"ContainerDied","Data":"db4485f878668ba6b46de7141880ffc39259d1f0a6249726da53938342a162ed"}
Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.231711 4825 scope.go:117] "RemoveContainer" containerID="c16e9d3476199bb5aa6b9e33b5eec6f3e512e172a85f8fc62e5f64552fd8239b"
Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.250715 4825 scope.go:117] "RemoveContainer" containerID="858f7c177d82012dc13403454b29f81fdd129a40b95e41f69c5a4203358ff6ac"
Mar 10 08:19:37 crc kubenswrapper[4825]: E0310 08:19:37.251157 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"858f7c177d82012dc13403454b29f81fdd129a40b95e41f69c5a4203358ff6ac\": container with ID starting with 858f7c177d82012dc13403454b29f81fdd129a40b95e41f69c5a4203358ff6ac not found: ID does not exist" containerID="858f7c177d82012dc13403454b29f81fdd129a40b95e41f69c5a4203358ff6ac"
Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.251197 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"858f7c177d82012dc13403454b29f81fdd129a40b95e41f69c5a4203358ff6ac"} err="failed to get container status \"858f7c177d82012dc13403454b29f81fdd129a40b95e41f69c5a4203358ff6ac\": rpc error: code = NotFound desc = could not find container \"858f7c177d82012dc13403454b29f81fdd129a40b95e41f69c5a4203358ff6ac\": container with ID starting with 858f7c177d82012dc13403454b29f81fdd129a40b95e41f69c5a4203358ff6ac not found: ID does not exist"
Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.251221 4825 scope.go:117] "RemoveContainer" containerID="c16e9d3476199bb5aa6b9e33b5eec6f3e512e172a85f8fc62e5f64552fd8239b"
Mar 10 08:19:37 crc kubenswrapper[4825]: E0310 08:19:37.251522 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c16e9d3476199bb5aa6b9e33b5eec6f3e512e172a85f8fc62e5f64552fd8239b\": container with ID starting with c16e9d3476199bb5aa6b9e33b5eec6f3e512e172a85f8fc62e5f64552fd8239b not found: ID does not exist" containerID="c16e9d3476199bb5aa6b9e33b5eec6f3e512e172a85f8fc62e5f64552fd8239b"
Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.251550 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c16e9d3476199bb5aa6b9e33b5eec6f3e512e172a85f8fc62e5f64552fd8239b"} err="failed to get container status \"c16e9d3476199bb5aa6b9e33b5eec6f3e512e172a85f8fc62e5f64552fd8239b\": rpc error: code = NotFound desc = could not find container \"c16e9d3476199bb5aa6b9e33b5eec6f3e512e172a85f8fc62e5f64552fd8239b\": container with ID starting with c16e9d3476199bb5aa6b9e33b5eec6f3e512e172a85f8fc62e5f64552fd8239b not found: ID does not exist"
Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.251568 4825 scope.go:117] "RemoveContainer" containerID="3ca87529f407c049b8b4aab44f73dc17631384da28a48cf142fe4ef94af92232"
Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.271535 4825 scope.go:117] "RemoveContainer" containerID="3ee24d6f700523a4adaf41742256a65e57beb3d27789b99b283e4a144ded3530"
Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.284533 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eea587f-9e48-49e9-ad29-4f71b911aa63-config-data\") pod \"8eea587f-9e48-49e9-ad29-4f71b911aa63\" (UID: \"8eea587f-9e48-49e9-ad29-4f71b911aa63\") "
Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.284616 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9smj\" (UniqueName: \"kubernetes.io/projected/8eea587f-9e48-49e9-ad29-4f71b911aa63-kube-api-access-f9smj\") pod \"8eea587f-9e48-49e9-ad29-4f71b911aa63\" (UID: \"8eea587f-9e48-49e9-ad29-4f71b911aa63\") "
Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.284718 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eea587f-9e48-49e9-ad29-4f71b911aa63-combined-ca-bundle\") pod \"8eea587f-9e48-49e9-ad29-4f71b911aa63\" (UID: \"8eea587f-9e48-49e9-ad29-4f71b911aa63\") "
Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.284742 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fvt7\" (UniqueName: \"kubernetes.io/projected/04673a37-4f3a-4129-9c02-16126e1a072e-kube-api-access-4fvt7\") pod \"04673a37-4f3a-4129-9c02-16126e1a072e\" (UID: \"04673a37-4f3a-4129-9c02-16126e1a072e\") "
Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.284759 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04673a37-4f3a-4129-9c02-16126e1a072e-logs\") pod \"04673a37-4f3a-4129-9c02-16126e1a072e\" (UID: \"04673a37-4f3a-4129-9c02-16126e1a072e\") "
Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.284847 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04673a37-4f3a-4129-9c02-16126e1a072e-combined-ca-bundle\") pod \"04673a37-4f3a-4129-9c02-16126e1a072e\" (UID: \"04673a37-4f3a-4129-9c02-16126e1a072e\") "
Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.284902 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8eea587f-9e48-49e9-ad29-4f71b911aa63-logs\") pod \"8eea587f-9e48-49e9-ad29-4f71b911aa63\" (UID: \"8eea587f-9e48-49e9-ad29-4f71b911aa63\") "
Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.284917 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04673a37-4f3a-4129-9c02-16126e1a072e-config-data\") pod \"04673a37-4f3a-4129-9c02-16126e1a072e\" (UID: \"04673a37-4f3a-4129-9c02-16126e1a072e\") "
Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.285012 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/04673a37-4f3a-4129-9c02-16126e1a072e-nova-metadata-tls-certs\") pod \"04673a37-4f3a-4129-9c02-16126e1a072e\" (UID: \"04673a37-4f3a-4129-9c02-16126e1a072e\") "
Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.285619 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8eea587f-9e48-49e9-ad29-4f71b911aa63-logs" (OuterVolumeSpecName: "logs") pod "8eea587f-9e48-49e9-ad29-4f71b911aa63" (UID: "8eea587f-9e48-49e9-ad29-4f71b911aa63"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.285782 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04673a37-4f3a-4129-9c02-16126e1a072e-logs" (OuterVolumeSpecName: "logs") pod "04673a37-4f3a-4129-9c02-16126e1a072e" (UID: "04673a37-4f3a-4129-9c02-16126e1a072e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.286195 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04673a37-4f3a-4129-9c02-16126e1a072e-logs\") on node \"crc\" DevicePath \"\""
Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.286213 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8eea587f-9e48-49e9-ad29-4f71b911aa63-logs\") on node \"crc\" DevicePath \"\""
Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.290298 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8eea587f-9e48-49e9-ad29-4f71b911aa63-kube-api-access-f9smj" (OuterVolumeSpecName: "kube-api-access-f9smj") pod "8eea587f-9e48-49e9-ad29-4f71b911aa63" (UID: "8eea587f-9e48-49e9-ad29-4f71b911aa63"). InnerVolumeSpecName "kube-api-access-f9smj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.290742 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04673a37-4f3a-4129-9c02-16126e1a072e-kube-api-access-4fvt7" (OuterVolumeSpecName: "kube-api-access-4fvt7") pod "04673a37-4f3a-4129-9c02-16126e1a072e" (UID: "04673a37-4f3a-4129-9c02-16126e1a072e"). InnerVolumeSpecName "kube-api-access-4fvt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.294217 4825 scope.go:117] "RemoveContainer" containerID="3ca87529f407c049b8b4aab44f73dc17631384da28a48cf142fe4ef94af92232"
Mar 10 08:19:37 crc kubenswrapper[4825]: E0310 08:19:37.295973 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ca87529f407c049b8b4aab44f73dc17631384da28a48cf142fe4ef94af92232\": container with ID starting with 3ca87529f407c049b8b4aab44f73dc17631384da28a48cf142fe4ef94af92232 not found: ID does not exist" containerID="3ca87529f407c049b8b4aab44f73dc17631384da28a48cf142fe4ef94af92232"
Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.296017 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ca87529f407c049b8b4aab44f73dc17631384da28a48cf142fe4ef94af92232"} err="failed to get container status \"3ca87529f407c049b8b4aab44f73dc17631384da28a48cf142fe4ef94af92232\": rpc error: code = NotFound desc = could not find container \"3ca87529f407c049b8b4aab44f73dc17631384da28a48cf142fe4ef94af92232\": container with ID starting with 3ca87529f407c049b8b4aab44f73dc17631384da28a48cf142fe4ef94af92232 not found: ID does not exist"
Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.296046 4825 scope.go:117] "RemoveContainer" containerID="3ee24d6f700523a4adaf41742256a65e57beb3d27789b99b283e4a144ded3530"
Mar 10 08:19:37 crc kubenswrapper[4825]: E0310 08:19:37.296668 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ee24d6f700523a4adaf41742256a65e57beb3d27789b99b283e4a144ded3530\": container with ID starting with 3ee24d6f700523a4adaf41742256a65e57beb3d27789b99b283e4a144ded3530 not found: ID does not exist" containerID="3ee24d6f700523a4adaf41742256a65e57beb3d27789b99b283e4a144ded3530"
Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.296798
4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ee24d6f700523a4adaf41742256a65e57beb3d27789b99b283e4a144ded3530"} err="failed to get container status \"3ee24d6f700523a4adaf41742256a65e57beb3d27789b99b283e4a144ded3530\": rpc error: code = NotFound desc = could not find container \"3ee24d6f700523a4adaf41742256a65e57beb3d27789b99b283e4a144ded3530\": container with ID starting with 3ee24d6f700523a4adaf41742256a65e57beb3d27789b99b283e4a144ded3530 not found: ID does not exist" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.311504 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04673a37-4f3a-4129-9c02-16126e1a072e-config-data" (OuterVolumeSpecName: "config-data") pod "04673a37-4f3a-4129-9c02-16126e1a072e" (UID: "04673a37-4f3a-4129-9c02-16126e1a072e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.318188 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04673a37-4f3a-4129-9c02-16126e1a072e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04673a37-4f3a-4129-9c02-16126e1a072e" (UID: "04673a37-4f3a-4129-9c02-16126e1a072e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.320406 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eea587f-9e48-49e9-ad29-4f71b911aa63-config-data" (OuterVolumeSpecName: "config-data") pod "8eea587f-9e48-49e9-ad29-4f71b911aa63" (UID: "8eea587f-9e48-49e9-ad29-4f71b911aa63"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.322669 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eea587f-9e48-49e9-ad29-4f71b911aa63-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8eea587f-9e48-49e9-ad29-4f71b911aa63" (UID: "8eea587f-9e48-49e9-ad29-4f71b911aa63"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.347660 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04673a37-4f3a-4129-9c02-16126e1a072e-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "04673a37-4f3a-4129-9c02-16126e1a072e" (UID: "04673a37-4f3a-4129-9c02-16126e1a072e"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.388965 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04673a37-4f3a-4129-9c02-16126e1a072e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.389010 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04673a37-4f3a-4129-9c02-16126e1a072e-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.389028 4825 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/04673a37-4f3a-4129-9c02-16126e1a072e-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.389049 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8eea587f-9e48-49e9-ad29-4f71b911aa63-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.389065 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9smj\" (UniqueName: \"kubernetes.io/projected/8eea587f-9e48-49e9-ad29-4f71b911aa63-kube-api-access-f9smj\") on node \"crc\" DevicePath \"\"" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.389079 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eea587f-9e48-49e9-ad29-4f71b911aa63-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.389095 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fvt7\" (UniqueName: \"kubernetes.io/projected/04673a37-4f3a-4129-9c02-16126e1a072e-kube-api-access-4fvt7\") on node \"crc\" DevicePath \"\"" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.528087 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.542057 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.564863 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.575059 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.584490 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 10 08:19:37 crc kubenswrapper[4825]: E0310 08:19:37.584932 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc3963c8-93c2-4495-96f5-24227a0db90d" containerName="nova-manage" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.584952 4825 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="dc3963c8-93c2-4495-96f5-24227a0db90d" containerName="nova-manage" Mar 10 08:19:37 crc kubenswrapper[4825]: E0310 08:19:37.584971 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04673a37-4f3a-4129-9c02-16126e1a072e" containerName="nova-metadata-log" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.584978 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="04673a37-4f3a-4129-9c02-16126e1a072e" containerName="nova-metadata-log" Mar 10 08:19:37 crc kubenswrapper[4825]: E0310 08:19:37.584994 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eea587f-9e48-49e9-ad29-4f71b911aa63" containerName="nova-api-api" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.585001 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eea587f-9e48-49e9-ad29-4f71b911aa63" containerName="nova-api-api" Mar 10 08:19:37 crc kubenswrapper[4825]: E0310 08:19:37.585010 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04673a37-4f3a-4129-9c02-16126e1a072e" containerName="nova-metadata-metadata" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.585016 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="04673a37-4f3a-4129-9c02-16126e1a072e" containerName="nova-metadata-metadata" Mar 10 08:19:37 crc kubenswrapper[4825]: E0310 08:19:37.585029 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eea587f-9e48-49e9-ad29-4f71b911aa63" containerName="nova-api-log" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.585034 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eea587f-9e48-49e9-ad29-4f71b911aa63" containerName="nova-api-log" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.585219 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="8eea587f-9e48-49e9-ad29-4f71b911aa63" containerName="nova-api-log" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.585234 4825 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="dc3963c8-93c2-4495-96f5-24227a0db90d" containerName="nova-manage" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.585244 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="04673a37-4f3a-4129-9c02-16126e1a072e" containerName="nova-metadata-metadata" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.585255 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="04673a37-4f3a-4129-9c02-16126e1a072e" containerName="nova-metadata-log" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.585263 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="8eea587f-9e48-49e9-ad29-4f71b911aa63" containerName="nova-api-api" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.586226 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.587600 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.591174 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xff5l\" (UniqueName: \"kubernetes.io/projected/125572ab-4336-41da-900d-a0649d7d378d-kube-api-access-xff5l\") pod \"nova-api-0\" (UID: \"125572ab-4336-41da-900d-a0649d7d378d\") " pod="openstack/nova-api-0" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.591244 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/125572ab-4336-41da-900d-a0649d7d378d-config-data\") pod \"nova-api-0\" (UID: \"125572ab-4336-41da-900d-a0649d7d378d\") " pod="openstack/nova-api-0" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.591534 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/125572ab-4336-41da-900d-a0649d7d378d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"125572ab-4336-41da-900d-a0649d7d378d\") " pod="openstack/nova-api-0" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.591597 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/125572ab-4336-41da-900d-a0649d7d378d-logs\") pod \"nova-api-0\" (UID: \"125572ab-4336-41da-900d-a0649d7d378d\") " pod="openstack/nova-api-0" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.601364 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.603089 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.604028 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.609782 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.609794 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.614329 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.692659 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac546983-f13f-4257-ad1d-f0ff7398a28b-config-data\") pod \"nova-metadata-0\" (UID: \"ac546983-f13f-4257-ad1d-f0ff7398a28b\") " pod="openstack/nova-metadata-0" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.692710 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac546983-f13f-4257-ad1d-f0ff7398a28b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ac546983-f13f-4257-ad1d-f0ff7398a28b\") " pod="openstack/nova-metadata-0" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.692752 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125572ab-4336-41da-900d-a0649d7d378d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"125572ab-4336-41da-900d-a0649d7d378d\") " pod="openstack/nova-api-0" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.692774 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/125572ab-4336-41da-900d-a0649d7d378d-logs\") pod \"nova-api-0\" (UID: \"125572ab-4336-41da-900d-a0649d7d378d\") " pod="openstack/nova-api-0" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.692857 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72nms\" (UniqueName: \"kubernetes.io/projected/ac546983-f13f-4257-ad1d-f0ff7398a28b-kube-api-access-72nms\") pod \"nova-metadata-0\" (UID: \"ac546983-f13f-4257-ad1d-f0ff7398a28b\") " pod="openstack/nova-metadata-0" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.692911 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xff5l\" (UniqueName: \"kubernetes.io/projected/125572ab-4336-41da-900d-a0649d7d378d-kube-api-access-xff5l\") pod \"nova-api-0\" (UID: \"125572ab-4336-41da-900d-a0649d7d378d\") " pod="openstack/nova-api-0" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.692950 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ac546983-f13f-4257-ad1d-f0ff7398a28b-logs\") pod \"nova-metadata-0\" (UID: \"ac546983-f13f-4257-ad1d-f0ff7398a28b\") " pod="openstack/nova-metadata-0" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.692969 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac546983-f13f-4257-ad1d-f0ff7398a28b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ac546983-f13f-4257-ad1d-f0ff7398a28b\") " pod="openstack/nova-metadata-0" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.693002 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/125572ab-4336-41da-900d-a0649d7d378d-config-data\") pod \"nova-api-0\" (UID: \"125572ab-4336-41da-900d-a0649d7d378d\") " pod="openstack/nova-api-0" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.693200 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/125572ab-4336-41da-900d-a0649d7d378d-logs\") pod \"nova-api-0\" (UID: \"125572ab-4336-41da-900d-a0649d7d378d\") " pod="openstack/nova-api-0" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.697049 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125572ab-4336-41da-900d-a0649d7d378d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"125572ab-4336-41da-900d-a0649d7d378d\") " pod="openstack/nova-api-0" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.697641 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/125572ab-4336-41da-900d-a0649d7d378d-config-data\") pod \"nova-api-0\" (UID: \"125572ab-4336-41da-900d-a0649d7d378d\") " pod="openstack/nova-api-0" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.716260 4825 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xff5l\" (UniqueName: \"kubernetes.io/projected/125572ab-4336-41da-900d-a0649d7d378d-kube-api-access-xff5l\") pod \"nova-api-0\" (UID: \"125572ab-4336-41da-900d-a0649d7d378d\") " pod="openstack/nova-api-0" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.798393 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac546983-f13f-4257-ad1d-f0ff7398a28b-config-data\") pod \"nova-metadata-0\" (UID: \"ac546983-f13f-4257-ad1d-f0ff7398a28b\") " pod="openstack/nova-metadata-0" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.798480 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac546983-f13f-4257-ad1d-f0ff7398a28b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ac546983-f13f-4257-ad1d-f0ff7398a28b\") " pod="openstack/nova-metadata-0" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.798624 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72nms\" (UniqueName: \"kubernetes.io/projected/ac546983-f13f-4257-ad1d-f0ff7398a28b-kube-api-access-72nms\") pod \"nova-metadata-0\" (UID: \"ac546983-f13f-4257-ad1d-f0ff7398a28b\") " pod="openstack/nova-metadata-0" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.798709 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac546983-f13f-4257-ad1d-f0ff7398a28b-logs\") pod \"nova-metadata-0\" (UID: \"ac546983-f13f-4257-ad1d-f0ff7398a28b\") " pod="openstack/nova-metadata-0" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.798739 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ac546983-f13f-4257-ad1d-f0ff7398a28b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ac546983-f13f-4257-ad1d-f0ff7398a28b\") " pod="openstack/nova-metadata-0" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.802104 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac546983-f13f-4257-ad1d-f0ff7398a28b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ac546983-f13f-4257-ad1d-f0ff7398a28b\") " pod="openstack/nova-metadata-0" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.802770 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac546983-f13f-4257-ad1d-f0ff7398a28b-logs\") pod \"nova-metadata-0\" (UID: \"ac546983-f13f-4257-ad1d-f0ff7398a28b\") " pod="openstack/nova-metadata-0" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.803429 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac546983-f13f-4257-ad1d-f0ff7398a28b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ac546983-f13f-4257-ad1d-f0ff7398a28b\") " pod="openstack/nova-metadata-0" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.813280 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac546983-f13f-4257-ad1d-f0ff7398a28b-config-data\") pod \"nova-metadata-0\" (UID: \"ac546983-f13f-4257-ad1d-f0ff7398a28b\") " pod="openstack/nova-metadata-0" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.819555 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72nms\" (UniqueName: \"kubernetes.io/projected/ac546983-f13f-4257-ad1d-f0ff7398a28b-kube-api-access-72nms\") pod \"nova-metadata-0\" (UID: \"ac546983-f13f-4257-ad1d-f0ff7398a28b\") " pod="openstack/nova-metadata-0" Mar 10 08:19:37 crc 
kubenswrapper[4825]: I0310 08:19:37.922104 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 08:19:37 crc kubenswrapper[4825]: I0310 08:19:37.936682 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 08:19:38 crc kubenswrapper[4825]: E0310 08:19:38.234252 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a64b689235496f66774f1a8cb304a843e375f0afa782f0fbe9dad0ffef637a94" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 08:19:38 crc kubenswrapper[4825]: E0310 08:19:38.236356 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a64b689235496f66774f1a8cb304a843e375f0afa782f0fbe9dad0ffef637a94" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 08:19:38 crc kubenswrapper[4825]: E0310 08:19:38.238575 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a64b689235496f66774f1a8cb304a843e375f0afa782f0fbe9dad0ffef637a94" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 08:19:38 crc kubenswrapper[4825]: E0310 08:19:38.238676 4825 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60" containerName="nova-scheduler-scheduler" Mar 10 08:19:38 crc kubenswrapper[4825]: I0310 08:19:38.380893 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-api-0"] Mar 10 08:19:38 crc kubenswrapper[4825]: W0310 08:19:38.437297 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac546983_f13f_4257_ad1d_f0ff7398a28b.slice/crio-d046535f77908c0e136ed66785af0ff9ed53a3a9cad56fe3ded368e7723b0bb0 WatchSource:0}: Error finding container d046535f77908c0e136ed66785af0ff9ed53a3a9cad56fe3ded368e7723b0bb0: Status 404 returned error can't find the container with id d046535f77908c0e136ed66785af0ff9ed53a3a9cad56fe3ded368e7723b0bb0 Mar 10 08:19:38 crc kubenswrapper[4825]: I0310 08:19:38.438989 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 08:19:39 crc kubenswrapper[4825]: I0310 08:19:39.224154 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ac546983-f13f-4257-ad1d-f0ff7398a28b","Type":"ContainerStarted","Data":"a150344ed6b966cabe77567d2b52d46ec4894b221ec2780fc56ef6d0b3b79aea"} Mar 10 08:19:39 crc kubenswrapper[4825]: I0310 08:19:39.224199 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ac546983-f13f-4257-ad1d-f0ff7398a28b","Type":"ContainerStarted","Data":"9e25877e7cb42aa5c8c8b1f77338e31beeeb7dfb90d6daa03b3a197a82f65e77"} Mar 10 08:19:39 crc kubenswrapper[4825]: I0310 08:19:39.224209 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ac546983-f13f-4257-ad1d-f0ff7398a28b","Type":"ContainerStarted","Data":"d046535f77908c0e136ed66785af0ff9ed53a3a9cad56fe3ded368e7723b0bb0"} Mar 10 08:19:39 crc kubenswrapper[4825]: I0310 08:19:39.227780 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"125572ab-4336-41da-900d-a0649d7d378d","Type":"ContainerStarted","Data":"c136487bcbacd9b215a10bacabe846100c2b90633f10d9513f3a49349032a85b"} Mar 10 08:19:39 crc kubenswrapper[4825]: I0310 08:19:39.228020 4825 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"125572ab-4336-41da-900d-a0649d7d378d","Type":"ContainerStarted","Data":"5ecde6232d4d92fb21f8fce82c558c0bf2c8a5c77f788596d29a02bfd6d5b55e"} Mar 10 08:19:39 crc kubenswrapper[4825]: I0310 08:19:39.228114 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"125572ab-4336-41da-900d-a0649d7d378d","Type":"ContainerStarted","Data":"06d8b1b3c0e75bdec21fb01016a619499ac425777e97fd23221dc640f0f0440b"} Mar 10 08:19:39 crc kubenswrapper[4825]: I0310 08:19:39.247508 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04673a37-4f3a-4129-9c02-16126e1a072e" path="/var/lib/kubelet/pods/04673a37-4f3a-4129-9c02-16126e1a072e/volumes" Mar 10 08:19:39 crc kubenswrapper[4825]: I0310 08:19:39.248307 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8eea587f-9e48-49e9-ad29-4f71b911aa63" path="/var/lib/kubelet/pods/8eea587f-9e48-49e9-ad29-4f71b911aa63/volumes" Mar 10 08:19:39 crc kubenswrapper[4825]: I0310 08:19:39.250407 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.25038986 podStartE2EDuration="2.25038986s" podCreationTimestamp="2026-03-10 08:19:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:19:39.24584662 +0000 UTC m=+5732.275627255" watchObservedRunningTime="2026-03-10 08:19:39.25038986 +0000 UTC m=+5732.280170475" Mar 10 08:19:39 crc kubenswrapper[4825]: I0310 08:19:39.268742 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.268722791 podStartE2EDuration="2.268722791s" podCreationTimestamp="2026-03-10 08:19:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 
08:19:39.260247649 +0000 UTC m=+5732.290028274" watchObservedRunningTime="2026-03-10 08:19:39.268722791 +0000 UTC m=+5732.298503406" Mar 10 08:19:42 crc kubenswrapper[4825]: I0310 08:19:42.937312 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 10 08:19:42 crc kubenswrapper[4825]: I0310 08:19:42.937851 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 10 08:19:43 crc kubenswrapper[4825]: I0310 08:19:43.038474 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-ccee-account-create-update-dnx7s"] Mar 10 08:19:43 crc kubenswrapper[4825]: I0310 08:19:43.048418 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-rn9vj"] Mar 10 08:19:43 crc kubenswrapper[4825]: I0310 08:19:43.058440 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-ccee-account-create-update-dnx7s"] Mar 10 08:19:43 crc kubenswrapper[4825]: I0310 08:19:43.067115 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-rn9vj"] Mar 10 08:19:43 crc kubenswrapper[4825]: E0310 08:19:43.233660 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a64b689235496f66774f1a8cb304a843e375f0afa782f0fbe9dad0ffef637a94" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 08:19:43 crc kubenswrapper[4825]: E0310 08:19:43.237326 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a64b689235496f66774f1a8cb304a843e375f0afa782f0fbe9dad0ffef637a94" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 08:19:43 crc kubenswrapper[4825]: E0310 08:19:43.238698 4825 log.go:32] 
"ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a64b689235496f66774f1a8cb304a843e375f0afa782f0fbe9dad0ffef637a94" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 08:19:43 crc kubenswrapper[4825]: E0310 08:19:43.238735 4825 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60" containerName="nova-scheduler-scheduler" Mar 10 08:19:43 crc kubenswrapper[4825]: I0310 08:19:43.250193 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71b648ef-f8a8-475f-a831-8ebb97d1d57e" path="/var/lib/kubelet/pods/71b648ef-f8a8-475f-a831-8ebb97d1d57e/volumes" Mar 10 08:19:43 crc kubenswrapper[4825]: I0310 08:19:43.251175 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95bf6167-d4ef-425d-a6ea-7678a3fb8956" path="/var/lib/kubelet/pods/95bf6167-d4ef-425d-a6ea-7678a3fb8956/volumes" Mar 10 08:19:43 crc kubenswrapper[4825]: I0310 08:19:43.423438 4825 scope.go:117] "RemoveContainer" containerID="1748fcb25c55190745bc6107d4e046b63ffd4821e2b74fdf4c5ec5c198c3087f" Mar 10 08:19:43 crc kubenswrapper[4825]: I0310 08:19:43.451157 4825 scope.go:117] "RemoveContainer" containerID="5fc0fd428da0df49f8a2e5b863ca67ba564a9dcca824b1d01642d3c5a44ba489" Mar 10 08:19:47 crc kubenswrapper[4825]: I0310 08:19:47.923421 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 08:19:47 crc kubenswrapper[4825]: I0310 08:19:47.924400 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 08:19:47 crc kubenswrapper[4825]: I0310 08:19:47.937412 4825 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 10 08:19:47 crc kubenswrapper[4825]: I0310 08:19:47.937462 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 10 08:19:48 crc kubenswrapper[4825]: E0310 08:19:48.234120 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a64b689235496f66774f1a8cb304a843e375f0afa782f0fbe9dad0ffef637a94" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 08:19:48 crc kubenswrapper[4825]: E0310 08:19:48.235517 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a64b689235496f66774f1a8cb304a843e375f0afa782f0fbe9dad0ffef637a94" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 08:19:48 crc kubenswrapper[4825]: E0310 08:19:48.236469 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a64b689235496f66774f1a8cb304a843e375f0afa782f0fbe9dad0ffef637a94" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 08:19:48 crc kubenswrapper[4825]: E0310 08:19:48.236543 4825 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60" containerName="nova-scheduler-scheduler" Mar 10 08:19:49 crc kubenswrapper[4825]: I0310 08:19:49.005427 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="125572ab-4336-41da-900d-a0649d7d378d" 
containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.130:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 08:19:49 crc kubenswrapper[4825]: I0310 08:19:49.019366 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="125572ab-4336-41da-900d-a0649d7d378d" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.130:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 08:19:49 crc kubenswrapper[4825]: I0310 08:19:49.019377 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ac546983-f13f-4257-ad1d-f0ff7398a28b" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.131:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 08:19:49 crc kubenswrapper[4825]: I0310 08:19:49.019539 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ac546983-f13f-4257-ad1d-f0ff7398a28b" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.131:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 08:19:53 crc kubenswrapper[4825]: E0310 08:19:53.231978 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a64b689235496f66774f1a8cb304a843e375f0afa782f0fbe9dad0ffef637a94 is running failed: container process not found" containerID="a64b689235496f66774f1a8cb304a843e375f0afa782f0fbe9dad0ffef637a94" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 08:19:53 crc kubenswrapper[4825]: E0310 08:19:53.232353 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
a64b689235496f66774f1a8cb304a843e375f0afa782f0fbe9dad0ffef637a94 is running failed: container process not found" containerID="a64b689235496f66774f1a8cb304a843e375f0afa782f0fbe9dad0ffef637a94" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 08:19:53 crc kubenswrapper[4825]: E0310 08:19:53.232661 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a64b689235496f66774f1a8cb304a843e375f0afa782f0fbe9dad0ffef637a94 is running failed: container process not found" containerID="a64b689235496f66774f1a8cb304a843e375f0afa782f0fbe9dad0ffef637a94" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 08:19:53 crc kubenswrapper[4825]: E0310 08:19:53.232693 4825 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a64b689235496f66774f1a8cb304a843e375f0afa782f0fbe9dad0ffef637a94 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60" containerName="nova-scheduler-scheduler" Mar 10 08:19:53 crc kubenswrapper[4825]: I0310 08:19:53.370810 4825 generic.go:334] "Generic (PLEG): container finished" podID="360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60" containerID="a64b689235496f66774f1a8cb304a843e375f0afa782f0fbe9dad0ffef637a94" exitCode=137 Mar 10 08:19:53 crc kubenswrapper[4825]: I0310 08:19:53.372343 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60","Type":"ContainerDied","Data":"a64b689235496f66774f1a8cb304a843e375f0afa782f0fbe9dad0ffef637a94"} Mar 10 08:19:53 crc kubenswrapper[4825]: I0310 08:19:53.593570 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 08:19:53 crc kubenswrapper[4825]: I0310 08:19:53.708431 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60-combined-ca-bundle\") pod \"360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60\" (UID: \"360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60\") " Mar 10 08:19:53 crc kubenswrapper[4825]: I0310 08:19:53.708563 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60-config-data\") pod \"360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60\" (UID: \"360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60\") " Mar 10 08:19:53 crc kubenswrapper[4825]: I0310 08:19:53.708772 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-px8v8\" (UniqueName: \"kubernetes.io/projected/360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60-kube-api-access-px8v8\") pod \"360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60\" (UID: \"360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60\") " Mar 10 08:19:53 crc kubenswrapper[4825]: I0310 08:19:53.713057 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60-kube-api-access-px8v8" (OuterVolumeSpecName: "kube-api-access-px8v8") pod "360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60" (UID: "360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60"). InnerVolumeSpecName "kube-api-access-px8v8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:19:53 crc kubenswrapper[4825]: I0310 08:19:53.738498 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60" (UID: "360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:19:53 crc kubenswrapper[4825]: I0310 08:19:53.741450 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60-config-data" (OuterVolumeSpecName: "config-data") pod "360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60" (UID: "360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:19:53 crc kubenswrapper[4825]: I0310 08:19:53.811226 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-px8v8\" (UniqueName: \"kubernetes.io/projected/360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60-kube-api-access-px8v8\") on node \"crc\" DevicePath \"\"" Mar 10 08:19:53 crc kubenswrapper[4825]: I0310 08:19:53.811257 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:19:53 crc kubenswrapper[4825]: I0310 08:19:53.811269 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 08:19:54 crc kubenswrapper[4825]: I0310 08:19:54.381091 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60","Type":"ContainerDied","Data":"5a74c611f00214c9d1f633747bc7850fb58339d013708a0fdae8bfec0180558f"} Mar 10 08:19:54 crc kubenswrapper[4825]: I0310 08:19:54.381161 4825 scope.go:117] "RemoveContainer" containerID="a64b689235496f66774f1a8cb304a843e375f0afa782f0fbe9dad0ffef637a94" Mar 10 08:19:54 crc kubenswrapper[4825]: I0310 08:19:54.381226 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 08:19:54 crc kubenswrapper[4825]: I0310 08:19:54.412443 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 08:19:54 crc kubenswrapper[4825]: I0310 08:19:54.424651 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 08:19:54 crc kubenswrapper[4825]: I0310 08:19:54.436840 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 08:19:54 crc kubenswrapper[4825]: E0310 08:19:54.437379 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60" containerName="nova-scheduler-scheduler" Mar 10 08:19:54 crc kubenswrapper[4825]: I0310 08:19:54.437402 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60" containerName="nova-scheduler-scheduler" Mar 10 08:19:54 crc kubenswrapper[4825]: I0310 08:19:54.437634 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60" containerName="nova-scheduler-scheduler" Mar 10 08:19:54 crc kubenswrapper[4825]: I0310 08:19:54.439673 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 08:19:54 crc kubenswrapper[4825]: I0310 08:19:54.441411 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 10 08:19:54 crc kubenswrapper[4825]: I0310 08:19:54.462546 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 08:19:54 crc kubenswrapper[4825]: I0310 08:19:54.523066 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5189c199-0d2b-40c2-8fe8-04bdea46a84c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5189c199-0d2b-40c2-8fe8-04bdea46a84c\") " pod="openstack/nova-scheduler-0" Mar 10 08:19:54 crc kubenswrapper[4825]: I0310 08:19:54.523311 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sfdc\" (UniqueName: \"kubernetes.io/projected/5189c199-0d2b-40c2-8fe8-04bdea46a84c-kube-api-access-2sfdc\") pod \"nova-scheduler-0\" (UID: \"5189c199-0d2b-40c2-8fe8-04bdea46a84c\") " pod="openstack/nova-scheduler-0" Mar 10 08:19:54 crc kubenswrapper[4825]: I0310 08:19:54.523357 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5189c199-0d2b-40c2-8fe8-04bdea46a84c-config-data\") pod \"nova-scheduler-0\" (UID: \"5189c199-0d2b-40c2-8fe8-04bdea46a84c\") " pod="openstack/nova-scheduler-0" Mar 10 08:19:54 crc kubenswrapper[4825]: I0310 08:19:54.625361 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sfdc\" (UniqueName: \"kubernetes.io/projected/5189c199-0d2b-40c2-8fe8-04bdea46a84c-kube-api-access-2sfdc\") pod \"nova-scheduler-0\" (UID: \"5189c199-0d2b-40c2-8fe8-04bdea46a84c\") " pod="openstack/nova-scheduler-0" Mar 10 08:19:54 crc kubenswrapper[4825]: I0310 08:19:54.625414 4825 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5189c199-0d2b-40c2-8fe8-04bdea46a84c-config-data\") pod \"nova-scheduler-0\" (UID: \"5189c199-0d2b-40c2-8fe8-04bdea46a84c\") " pod="openstack/nova-scheduler-0" Mar 10 08:19:54 crc kubenswrapper[4825]: I0310 08:19:54.625446 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5189c199-0d2b-40c2-8fe8-04bdea46a84c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5189c199-0d2b-40c2-8fe8-04bdea46a84c\") " pod="openstack/nova-scheduler-0" Mar 10 08:19:54 crc kubenswrapper[4825]: I0310 08:19:54.629903 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5189c199-0d2b-40c2-8fe8-04bdea46a84c-config-data\") pod \"nova-scheduler-0\" (UID: \"5189c199-0d2b-40c2-8fe8-04bdea46a84c\") " pod="openstack/nova-scheduler-0" Mar 10 08:19:54 crc kubenswrapper[4825]: I0310 08:19:54.630658 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5189c199-0d2b-40c2-8fe8-04bdea46a84c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5189c199-0d2b-40c2-8fe8-04bdea46a84c\") " pod="openstack/nova-scheduler-0" Mar 10 08:19:54 crc kubenswrapper[4825]: I0310 08:19:54.640957 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sfdc\" (UniqueName: \"kubernetes.io/projected/5189c199-0d2b-40c2-8fe8-04bdea46a84c-kube-api-access-2sfdc\") pod \"nova-scheduler-0\" (UID: \"5189c199-0d2b-40c2-8fe8-04bdea46a84c\") " pod="openstack/nova-scheduler-0" Mar 10 08:19:54 crc kubenswrapper[4825]: I0310 08:19:54.758392 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 08:19:55 crc kubenswrapper[4825]: I0310 08:19:55.028616 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-ksp8z"] Mar 10 08:19:55 crc kubenswrapper[4825]: I0310 08:19:55.038648 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-ksp8z"] Mar 10 08:19:55 crc kubenswrapper[4825]: I0310 08:19:55.187942 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 08:19:55 crc kubenswrapper[4825]: I0310 08:19:55.248180 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60" path="/var/lib/kubelet/pods/360a0a5a-bfd5-481d-8f0a-bd6a24ec4a60/volumes" Mar 10 08:19:55 crc kubenswrapper[4825]: I0310 08:19:55.248885 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8565e58-689e-417e-a072-cc3a26fed9a3" path="/var/lib/kubelet/pods/d8565e58-689e-417e-a072-cc3a26fed9a3/volumes" Mar 10 08:19:55 crc kubenswrapper[4825]: I0310 08:19:55.391934 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5189c199-0d2b-40c2-8fe8-04bdea46a84c","Type":"ContainerStarted","Data":"2b8ba41409430104450ede717f08bf58654090af49362e9caf0f65c4d13c1060"} Mar 10 08:19:55 crc kubenswrapper[4825]: I0310 08:19:55.392245 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5189c199-0d2b-40c2-8fe8-04bdea46a84c","Type":"ContainerStarted","Data":"2d942202374c08919f0649968ad5599ab4bd43c83ca5e92e24d199bfe87fdbc6"} Mar 10 08:19:55 crc kubenswrapper[4825]: I0310 08:19:55.415098 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.415076557 podStartE2EDuration="1.415076557s" podCreationTimestamp="2026-03-10 08:19:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:19:55.407469347 +0000 UTC m=+5748.437249962" watchObservedRunningTime="2026-03-10 08:19:55.415076557 +0000 UTC m=+5748.444857172" Mar 10 08:19:57 crc kubenswrapper[4825]: I0310 08:19:57.927925 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 10 08:19:57 crc kubenswrapper[4825]: I0310 08:19:57.928959 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 10 08:19:57 crc kubenswrapper[4825]: I0310 08:19:57.929355 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 10 08:19:57 crc kubenswrapper[4825]: I0310 08:19:57.929591 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 10 08:19:57 crc kubenswrapper[4825]: I0310 08:19:57.932455 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 10 08:19:57 crc kubenswrapper[4825]: I0310 08:19:57.932624 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 10 08:19:57 crc kubenswrapper[4825]: I0310 08:19:57.941660 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 10 08:19:57 crc kubenswrapper[4825]: I0310 08:19:57.944064 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 10 08:19:57 crc kubenswrapper[4825]: I0310 08:19:57.948558 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 10 08:19:58 crc kubenswrapper[4825]: I0310 08:19:58.150236 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-66cbbcd49c-svhss"] Mar 10 08:19:58 crc kubenswrapper[4825]: I0310 08:19:58.157098 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66cbbcd49c-svhss" Mar 10 08:19:58 crc kubenswrapper[4825]: I0310 08:19:58.172325 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66cbbcd49c-svhss"] Mar 10 08:19:58 crc kubenswrapper[4825]: I0310 08:19:58.298207 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6985674-aa5a-40c5-a384-04bac7fd0a1b-dns-svc\") pod \"dnsmasq-dns-66cbbcd49c-svhss\" (UID: \"a6985674-aa5a-40c5-a384-04bac7fd0a1b\") " pod="openstack/dnsmasq-dns-66cbbcd49c-svhss" Mar 10 08:19:58 crc kubenswrapper[4825]: I0310 08:19:58.298298 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vczwv\" (UniqueName: \"kubernetes.io/projected/a6985674-aa5a-40c5-a384-04bac7fd0a1b-kube-api-access-vczwv\") pod \"dnsmasq-dns-66cbbcd49c-svhss\" (UID: \"a6985674-aa5a-40c5-a384-04bac7fd0a1b\") " pod="openstack/dnsmasq-dns-66cbbcd49c-svhss" Mar 10 08:19:58 crc kubenswrapper[4825]: I0310 08:19:58.298364 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6985674-aa5a-40c5-a384-04bac7fd0a1b-ovsdbserver-nb\") pod \"dnsmasq-dns-66cbbcd49c-svhss\" (UID: \"a6985674-aa5a-40c5-a384-04bac7fd0a1b\") " pod="openstack/dnsmasq-dns-66cbbcd49c-svhss" Mar 10 08:19:58 crc kubenswrapper[4825]: I0310 08:19:58.298406 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6985674-aa5a-40c5-a384-04bac7fd0a1b-ovsdbserver-sb\") pod \"dnsmasq-dns-66cbbcd49c-svhss\" (UID: \"a6985674-aa5a-40c5-a384-04bac7fd0a1b\") " pod="openstack/dnsmasq-dns-66cbbcd49c-svhss" Mar 10 08:19:58 crc kubenswrapper[4825]: I0310 08:19:58.298486 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6985674-aa5a-40c5-a384-04bac7fd0a1b-config\") pod \"dnsmasq-dns-66cbbcd49c-svhss\" (UID: \"a6985674-aa5a-40c5-a384-04bac7fd0a1b\") " pod="openstack/dnsmasq-dns-66cbbcd49c-svhss" Mar 10 08:19:58 crc kubenswrapper[4825]: I0310 08:19:58.400043 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6985674-aa5a-40c5-a384-04bac7fd0a1b-ovsdbserver-nb\") pod \"dnsmasq-dns-66cbbcd49c-svhss\" (UID: \"a6985674-aa5a-40c5-a384-04bac7fd0a1b\") " pod="openstack/dnsmasq-dns-66cbbcd49c-svhss" Mar 10 08:19:58 crc kubenswrapper[4825]: I0310 08:19:58.400535 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6985674-aa5a-40c5-a384-04bac7fd0a1b-ovsdbserver-sb\") pod \"dnsmasq-dns-66cbbcd49c-svhss\" (UID: \"a6985674-aa5a-40c5-a384-04bac7fd0a1b\") " pod="openstack/dnsmasq-dns-66cbbcd49c-svhss" Mar 10 08:19:58 crc kubenswrapper[4825]: I0310 08:19:58.400845 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6985674-aa5a-40c5-a384-04bac7fd0a1b-config\") pod \"dnsmasq-dns-66cbbcd49c-svhss\" (UID: \"a6985674-aa5a-40c5-a384-04bac7fd0a1b\") " pod="openstack/dnsmasq-dns-66cbbcd49c-svhss" Mar 10 08:19:58 crc kubenswrapper[4825]: I0310 08:19:58.400892 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6985674-aa5a-40c5-a384-04bac7fd0a1b-ovsdbserver-nb\") pod \"dnsmasq-dns-66cbbcd49c-svhss\" (UID: \"a6985674-aa5a-40c5-a384-04bac7fd0a1b\") " pod="openstack/dnsmasq-dns-66cbbcd49c-svhss" Mar 10 08:19:58 crc kubenswrapper[4825]: I0310 08:19:58.401192 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/a6985674-aa5a-40c5-a384-04bac7fd0a1b-dns-svc\") pod \"dnsmasq-dns-66cbbcd49c-svhss\" (UID: \"a6985674-aa5a-40c5-a384-04bac7fd0a1b\") " pod="openstack/dnsmasq-dns-66cbbcd49c-svhss" Mar 10 08:19:58 crc kubenswrapper[4825]: I0310 08:19:58.401685 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6985674-aa5a-40c5-a384-04bac7fd0a1b-config\") pod \"dnsmasq-dns-66cbbcd49c-svhss\" (UID: \"a6985674-aa5a-40c5-a384-04bac7fd0a1b\") " pod="openstack/dnsmasq-dns-66cbbcd49c-svhss" Mar 10 08:19:58 crc kubenswrapper[4825]: I0310 08:19:58.401727 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6985674-aa5a-40c5-a384-04bac7fd0a1b-dns-svc\") pod \"dnsmasq-dns-66cbbcd49c-svhss\" (UID: \"a6985674-aa5a-40c5-a384-04bac7fd0a1b\") " pod="openstack/dnsmasq-dns-66cbbcd49c-svhss" Mar 10 08:19:58 crc kubenswrapper[4825]: I0310 08:19:58.401747 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6985674-aa5a-40c5-a384-04bac7fd0a1b-ovsdbserver-sb\") pod \"dnsmasq-dns-66cbbcd49c-svhss\" (UID: \"a6985674-aa5a-40c5-a384-04bac7fd0a1b\") " pod="openstack/dnsmasq-dns-66cbbcd49c-svhss" Mar 10 08:19:58 crc kubenswrapper[4825]: I0310 08:19:58.401854 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vczwv\" (UniqueName: \"kubernetes.io/projected/a6985674-aa5a-40c5-a384-04bac7fd0a1b-kube-api-access-vczwv\") pod \"dnsmasq-dns-66cbbcd49c-svhss\" (UID: \"a6985674-aa5a-40c5-a384-04bac7fd0a1b\") " pod="openstack/dnsmasq-dns-66cbbcd49c-svhss" Mar 10 08:19:58 crc kubenswrapper[4825]: I0310 08:19:58.430120 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vczwv\" (UniqueName: \"kubernetes.io/projected/a6985674-aa5a-40c5-a384-04bac7fd0a1b-kube-api-access-vczwv\") pod 
\"dnsmasq-dns-66cbbcd49c-svhss\" (UID: \"a6985674-aa5a-40c5-a384-04bac7fd0a1b\") " pod="openstack/dnsmasq-dns-66cbbcd49c-svhss" Mar 10 08:19:58 crc kubenswrapper[4825]: I0310 08:19:58.431428 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 10 08:19:58 crc kubenswrapper[4825]: I0310 08:19:58.487275 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66cbbcd49c-svhss" Mar 10 08:19:58 crc kubenswrapper[4825]: W0310 08:19:58.979700 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6985674_aa5a_40c5_a384_04bac7fd0a1b.slice/crio-9ab226bffea0e69b9ed41e55cc33656c9a911b8d37b0e82b97a235a3f448ba20 WatchSource:0}: Error finding container 9ab226bffea0e69b9ed41e55cc33656c9a911b8d37b0e82b97a235a3f448ba20: Status 404 returned error can't find the container with id 9ab226bffea0e69b9ed41e55cc33656c9a911b8d37b0e82b97a235a3f448ba20 Mar 10 08:19:58 crc kubenswrapper[4825]: I0310 08:19:58.987990 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66cbbcd49c-svhss"] Mar 10 08:19:59 crc kubenswrapper[4825]: I0310 08:19:59.432893 4825 generic.go:334] "Generic (PLEG): container finished" podID="a6985674-aa5a-40c5-a384-04bac7fd0a1b" containerID="d1ac0f7ac6e21940d961dad9396df6d9371d828f3f536516a27f2f707bb954de" exitCode=0 Mar 10 08:19:59 crc kubenswrapper[4825]: I0310 08:19:59.432950 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66cbbcd49c-svhss" event={"ID":"a6985674-aa5a-40c5-a384-04bac7fd0a1b","Type":"ContainerDied","Data":"d1ac0f7ac6e21940d961dad9396df6d9371d828f3f536516a27f2f707bb954de"} Mar 10 08:19:59 crc kubenswrapper[4825]: I0310 08:19:59.433306 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66cbbcd49c-svhss" 
event={"ID":"a6985674-aa5a-40c5-a384-04bac7fd0a1b","Type":"ContainerStarted","Data":"9ab226bffea0e69b9ed41e55cc33656c9a911b8d37b0e82b97a235a3f448ba20"} Mar 10 08:19:59 crc kubenswrapper[4825]: I0310 08:19:59.758719 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 10 08:20:00 crc kubenswrapper[4825]: I0310 08:20:00.133699 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552180-xsj9w"] Mar 10 08:20:00 crc kubenswrapper[4825]: I0310 08:20:00.135492 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552180-xsj9w" Mar 10 08:20:00 crc kubenswrapper[4825]: I0310 08:20:00.141789 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 08:20:00 crc kubenswrapper[4825]: I0310 08:20:00.142041 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 08:20:00 crc kubenswrapper[4825]: I0310 08:20:00.142536 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 08:20:00 crc kubenswrapper[4825]: I0310 08:20:00.147009 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552180-xsj9w"] Mar 10 08:20:00 crc kubenswrapper[4825]: I0310 08:20:00.150855 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nnw7\" (UniqueName: \"kubernetes.io/projected/d7bac17e-a1d4-4938-b9cc-7e9590710245-kube-api-access-9nnw7\") pod \"auto-csr-approver-29552180-xsj9w\" (UID: \"d7bac17e-a1d4-4938-b9cc-7e9590710245\") " pod="openshift-infra/auto-csr-approver-29552180-xsj9w" Mar 10 08:20:00 crc kubenswrapper[4825]: I0310 08:20:00.253255 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nnw7\" 
(UniqueName: \"kubernetes.io/projected/d7bac17e-a1d4-4938-b9cc-7e9590710245-kube-api-access-9nnw7\") pod \"auto-csr-approver-29552180-xsj9w\" (UID: \"d7bac17e-a1d4-4938-b9cc-7e9590710245\") " pod="openshift-infra/auto-csr-approver-29552180-xsj9w" Mar 10 08:20:00 crc kubenswrapper[4825]: I0310 08:20:00.273803 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nnw7\" (UniqueName: \"kubernetes.io/projected/d7bac17e-a1d4-4938-b9cc-7e9590710245-kube-api-access-9nnw7\") pod \"auto-csr-approver-29552180-xsj9w\" (UID: \"d7bac17e-a1d4-4938-b9cc-7e9590710245\") " pod="openshift-infra/auto-csr-approver-29552180-xsj9w" Mar 10 08:20:00 crc kubenswrapper[4825]: I0310 08:20:00.442316 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66cbbcd49c-svhss" event={"ID":"a6985674-aa5a-40c5-a384-04bac7fd0a1b","Type":"ContainerStarted","Data":"6862e0585f5a70635aa6d6682db784c76f978351ae935abc4d0460550c6fd034"} Mar 10 08:20:00 crc kubenswrapper[4825]: I0310 08:20:00.469608 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-66cbbcd49c-svhss" podStartSLOduration=2.469587857 podStartE2EDuration="2.469587857s" podCreationTimestamp="2026-03-10 08:19:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:20:00.46740936 +0000 UTC m=+5753.497189975" watchObservedRunningTime="2026-03-10 08:20:00.469587857 +0000 UTC m=+5753.499368472" Mar 10 08:20:00 crc kubenswrapper[4825]: I0310 08:20:00.470918 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552180-xsj9w" Mar 10 08:20:00 crc kubenswrapper[4825]: I0310 08:20:00.933058 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552180-xsj9w"] Mar 10 08:20:00 crc kubenswrapper[4825]: W0310 08:20:00.943108 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7bac17e_a1d4_4938_b9cc_7e9590710245.slice/crio-e639e5519e740dc3f474aad7d259493b8cba98e1eec40f2f87fd1863dd792bf7 WatchSource:0}: Error finding container e639e5519e740dc3f474aad7d259493b8cba98e1eec40f2f87fd1863dd792bf7: Status 404 returned error can't find the container with id e639e5519e740dc3f474aad7d259493b8cba98e1eec40f2f87fd1863dd792bf7 Mar 10 08:20:00 crc kubenswrapper[4825]: I0310 08:20:00.945776 4825 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 08:20:01 crc kubenswrapper[4825]: I0310 08:20:01.451766 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552180-xsj9w" event={"ID":"d7bac17e-a1d4-4938-b9cc-7e9590710245","Type":"ContainerStarted","Data":"e639e5519e740dc3f474aad7d259493b8cba98e1eec40f2f87fd1863dd792bf7"} Mar 10 08:20:01 crc kubenswrapper[4825]: I0310 08:20:01.452074 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-66cbbcd49c-svhss" Mar 10 08:20:01 crc kubenswrapper[4825]: I0310 08:20:01.685596 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 08:20:01 crc kubenswrapper[4825]: I0310 08:20:01.685803 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="125572ab-4336-41da-900d-a0649d7d378d" containerName="nova-api-log" containerID="cri-o://5ecde6232d4d92fb21f8fce82c558c0bf2c8a5c77f788596d29a02bfd6d5b55e" gracePeriod=30 Mar 10 08:20:01 crc kubenswrapper[4825]: I0310 
08:20:01.686012 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="125572ab-4336-41da-900d-a0649d7d378d" containerName="nova-api-api" containerID="cri-o://c136487bcbacd9b215a10bacabe846100c2b90633f10d9513f3a49349032a85b" gracePeriod=30 Mar 10 08:20:02 crc kubenswrapper[4825]: I0310 08:20:02.462579 4825 generic.go:334] "Generic (PLEG): container finished" podID="125572ab-4336-41da-900d-a0649d7d378d" containerID="5ecde6232d4d92fb21f8fce82c558c0bf2c8a5c77f788596d29a02bfd6d5b55e" exitCode=143 Mar 10 08:20:02 crc kubenswrapper[4825]: I0310 08:20:02.462761 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"125572ab-4336-41da-900d-a0649d7d378d","Type":"ContainerDied","Data":"5ecde6232d4d92fb21f8fce82c558c0bf2c8a5c77f788596d29a02bfd6d5b55e"} Mar 10 08:20:02 crc kubenswrapper[4825]: I0310 08:20:02.464641 4825 generic.go:334] "Generic (PLEG): container finished" podID="d7bac17e-a1d4-4938-b9cc-7e9590710245" containerID="3db06b76d50c0663942827bc9ceda05627ed36845b18c3c0d089df96a737b90d" exitCode=0 Mar 10 08:20:02 crc kubenswrapper[4825]: I0310 08:20:02.464774 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552180-xsj9w" event={"ID":"d7bac17e-a1d4-4938-b9cc-7e9590710245","Type":"ContainerDied","Data":"3db06b76d50c0663942827bc9ceda05627ed36845b18c3c0d089df96a737b90d"} Mar 10 08:20:03 crc kubenswrapper[4825]: I0310 08:20:03.795928 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552180-xsj9w" Mar 10 08:20:03 crc kubenswrapper[4825]: I0310 08:20:03.817708 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nnw7\" (UniqueName: \"kubernetes.io/projected/d7bac17e-a1d4-4938-b9cc-7e9590710245-kube-api-access-9nnw7\") pod \"d7bac17e-a1d4-4938-b9cc-7e9590710245\" (UID: \"d7bac17e-a1d4-4938-b9cc-7e9590710245\") " Mar 10 08:20:03 crc kubenswrapper[4825]: I0310 08:20:03.823911 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7bac17e-a1d4-4938-b9cc-7e9590710245-kube-api-access-9nnw7" (OuterVolumeSpecName: "kube-api-access-9nnw7") pod "d7bac17e-a1d4-4938-b9cc-7e9590710245" (UID: "d7bac17e-a1d4-4938-b9cc-7e9590710245"). InnerVolumeSpecName "kube-api-access-9nnw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:20:03 crc kubenswrapper[4825]: I0310 08:20:03.920394 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nnw7\" (UniqueName: \"kubernetes.io/projected/d7bac17e-a1d4-4938-b9cc-7e9590710245-kube-api-access-9nnw7\") on node \"crc\" DevicePath \"\"" Mar 10 08:20:04 crc kubenswrapper[4825]: I0310 08:20:04.481513 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552180-xsj9w" event={"ID":"d7bac17e-a1d4-4938-b9cc-7e9590710245","Type":"ContainerDied","Data":"e639e5519e740dc3f474aad7d259493b8cba98e1eec40f2f87fd1863dd792bf7"} Mar 10 08:20:04 crc kubenswrapper[4825]: I0310 08:20:04.481746 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e639e5519e740dc3f474aad7d259493b8cba98e1eec40f2f87fd1863dd792bf7" Mar 10 08:20:04 crc kubenswrapper[4825]: I0310 08:20:04.481588 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552180-xsj9w" Mar 10 08:20:04 crc kubenswrapper[4825]: I0310 08:20:04.758827 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 10 08:20:04 crc kubenswrapper[4825]: I0310 08:20:04.785704 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 10 08:20:04 crc kubenswrapper[4825]: I0310 08:20:04.870190 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552174-xcsb5"] Mar 10 08:20:04 crc kubenswrapper[4825]: I0310 08:20:04.881338 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552174-xcsb5"] Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.246970 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa9e162c-76d6-4e53-9cf3-72544f8a1399" path="/var/lib/kubelet/pods/fa9e162c-76d6-4e53-9cf3-72544f8a1399/volumes" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.312564 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.348887 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/125572ab-4336-41da-900d-a0649d7d378d-logs\") pod \"125572ab-4336-41da-900d-a0649d7d378d\" (UID: \"125572ab-4336-41da-900d-a0649d7d378d\") " Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.349205 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xff5l\" (UniqueName: \"kubernetes.io/projected/125572ab-4336-41da-900d-a0649d7d378d-kube-api-access-xff5l\") pod \"125572ab-4336-41da-900d-a0649d7d378d\" (UID: \"125572ab-4336-41da-900d-a0649d7d378d\") " Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.349310 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125572ab-4336-41da-900d-a0649d7d378d-combined-ca-bundle\") pod \"125572ab-4336-41da-900d-a0649d7d378d\" (UID: \"125572ab-4336-41da-900d-a0649d7d378d\") " Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.349421 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/125572ab-4336-41da-900d-a0649d7d378d-config-data\") pod \"125572ab-4336-41da-900d-a0649d7d378d\" (UID: \"125572ab-4336-41da-900d-a0649d7d378d\") " Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.350001 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/125572ab-4336-41da-900d-a0649d7d378d-logs" (OuterVolumeSpecName: "logs") pod "125572ab-4336-41da-900d-a0649d7d378d" (UID: "125572ab-4336-41da-900d-a0649d7d378d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.350249 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/125572ab-4336-41da-900d-a0649d7d378d-logs\") on node \"crc\" DevicePath \"\"" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.355612 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/125572ab-4336-41da-900d-a0649d7d378d-kube-api-access-xff5l" (OuterVolumeSpecName: "kube-api-access-xff5l") pod "125572ab-4336-41da-900d-a0649d7d378d" (UID: "125572ab-4336-41da-900d-a0649d7d378d"). InnerVolumeSpecName "kube-api-access-xff5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.377070 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/125572ab-4336-41da-900d-a0649d7d378d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "125572ab-4336-41da-900d-a0649d7d378d" (UID: "125572ab-4336-41da-900d-a0649d7d378d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.386318 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/125572ab-4336-41da-900d-a0649d7d378d-config-data" (OuterVolumeSpecName: "config-data") pod "125572ab-4336-41da-900d-a0649d7d378d" (UID: "125572ab-4336-41da-900d-a0649d7d378d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.451581 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xff5l\" (UniqueName: \"kubernetes.io/projected/125572ab-4336-41da-900d-a0649d7d378d-kube-api-access-xff5l\") on node \"crc\" DevicePath \"\"" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.451622 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125572ab-4336-41da-900d-a0649d7d378d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.451634 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/125572ab-4336-41da-900d-a0649d7d378d-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.491204 4825 generic.go:334] "Generic (PLEG): container finished" podID="125572ab-4336-41da-900d-a0649d7d378d" containerID="c136487bcbacd9b215a10bacabe846100c2b90633f10d9513f3a49349032a85b" exitCode=0 Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.492220 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.496339 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"125572ab-4336-41da-900d-a0649d7d378d","Type":"ContainerDied","Data":"c136487bcbacd9b215a10bacabe846100c2b90633f10d9513f3a49349032a85b"} Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.496397 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"125572ab-4336-41da-900d-a0649d7d378d","Type":"ContainerDied","Data":"06d8b1b3c0e75bdec21fb01016a619499ac425777e97fd23221dc640f0f0440b"} Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.496421 4825 scope.go:117] "RemoveContainer" containerID="c136487bcbacd9b215a10bacabe846100c2b90633f10d9513f3a49349032a85b" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.529543 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.529932 4825 scope.go:117] "RemoveContainer" containerID="5ecde6232d4d92fb21f8fce82c558c0bf2c8a5c77f788596d29a02bfd6d5b55e" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.543466 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.556641 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.568536 4825 scope.go:117] "RemoveContainer" containerID="c136487bcbacd9b215a10bacabe846100c2b90633f10d9513f3a49349032a85b" Mar 10 08:20:05 crc kubenswrapper[4825]: E0310 08:20:05.569624 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c136487bcbacd9b215a10bacabe846100c2b90633f10d9513f3a49349032a85b\": container with ID starting with c136487bcbacd9b215a10bacabe846100c2b90633f10d9513f3a49349032a85b not 
found: ID does not exist" containerID="c136487bcbacd9b215a10bacabe846100c2b90633f10d9513f3a49349032a85b" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.569661 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c136487bcbacd9b215a10bacabe846100c2b90633f10d9513f3a49349032a85b"} err="failed to get container status \"c136487bcbacd9b215a10bacabe846100c2b90633f10d9513f3a49349032a85b\": rpc error: code = NotFound desc = could not find container \"c136487bcbacd9b215a10bacabe846100c2b90633f10d9513f3a49349032a85b\": container with ID starting with c136487bcbacd9b215a10bacabe846100c2b90633f10d9513f3a49349032a85b not found: ID does not exist" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.569687 4825 scope.go:117] "RemoveContainer" containerID="5ecde6232d4d92fb21f8fce82c558c0bf2c8a5c77f788596d29a02bfd6d5b55e" Mar 10 08:20:05 crc kubenswrapper[4825]: E0310 08:20:05.578037 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ecde6232d4d92fb21f8fce82c558c0bf2c8a5c77f788596d29a02bfd6d5b55e\": container with ID starting with 5ecde6232d4d92fb21f8fce82c558c0bf2c8a5c77f788596d29a02bfd6d5b55e not found: ID does not exist" containerID="5ecde6232d4d92fb21f8fce82c558c0bf2c8a5c77f788596d29a02bfd6d5b55e" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.578098 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ecde6232d4d92fb21f8fce82c558c0bf2c8a5c77f788596d29a02bfd6d5b55e"} err="failed to get container status \"5ecde6232d4d92fb21f8fce82c558c0bf2c8a5c77f788596d29a02bfd6d5b55e\": rpc error: code = NotFound desc = could not find container \"5ecde6232d4d92fb21f8fce82c558c0bf2c8a5c77f788596d29a02bfd6d5b55e\": container with ID starting with 5ecde6232d4d92fb21f8fce82c558c0bf2c8a5c77f788596d29a02bfd6d5b55e not found: ID does not exist" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.589931 
4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 10 08:20:05 crc kubenswrapper[4825]: E0310 08:20:05.590518 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7bac17e-a1d4-4938-b9cc-7e9590710245" containerName="oc" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.590537 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7bac17e-a1d4-4938-b9cc-7e9590710245" containerName="oc" Mar 10 08:20:05 crc kubenswrapper[4825]: E0310 08:20:05.590551 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="125572ab-4336-41da-900d-a0649d7d378d" containerName="nova-api-api" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.590558 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="125572ab-4336-41da-900d-a0649d7d378d" containerName="nova-api-api" Mar 10 08:20:05 crc kubenswrapper[4825]: E0310 08:20:05.590569 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="125572ab-4336-41da-900d-a0649d7d378d" containerName="nova-api-log" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.590575 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="125572ab-4336-41da-900d-a0649d7d378d" containerName="nova-api-log" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.590831 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="125572ab-4336-41da-900d-a0649d7d378d" containerName="nova-api-api" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.590849 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="125572ab-4336-41da-900d-a0649d7d378d" containerName="nova-api-log" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.590870 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7bac17e-a1d4-4938-b9cc-7e9590710245" containerName="oc" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.591902 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.593907 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.596293 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.596960 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.619841 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.658225 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k8jz\" (UniqueName: \"kubernetes.io/projected/62dad1be-8413-4a7e-b097-b6d2e1ec3246-kube-api-access-7k8jz\") pod \"nova-api-0\" (UID: \"62dad1be-8413-4a7e-b097-b6d2e1ec3246\") " pod="openstack/nova-api-0" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.658322 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62dad1be-8413-4a7e-b097-b6d2e1ec3246-internal-tls-certs\") pod \"nova-api-0\" (UID: \"62dad1be-8413-4a7e-b097-b6d2e1ec3246\") " pod="openstack/nova-api-0" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.658395 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62dad1be-8413-4a7e-b097-b6d2e1ec3246-config-data\") pod \"nova-api-0\" (UID: \"62dad1be-8413-4a7e-b097-b6d2e1ec3246\") " pod="openstack/nova-api-0" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.658418 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62dad1be-8413-4a7e-b097-b6d2e1ec3246-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"62dad1be-8413-4a7e-b097-b6d2e1ec3246\") " pod="openstack/nova-api-0" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.658448 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62dad1be-8413-4a7e-b097-b6d2e1ec3246-public-tls-certs\") pod \"nova-api-0\" (UID: \"62dad1be-8413-4a7e-b097-b6d2e1ec3246\") " pod="openstack/nova-api-0" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.658480 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62dad1be-8413-4a7e-b097-b6d2e1ec3246-logs\") pod \"nova-api-0\" (UID: \"62dad1be-8413-4a7e-b097-b6d2e1ec3246\") " pod="openstack/nova-api-0" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.760907 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7k8jz\" (UniqueName: \"kubernetes.io/projected/62dad1be-8413-4a7e-b097-b6d2e1ec3246-kube-api-access-7k8jz\") pod \"nova-api-0\" (UID: \"62dad1be-8413-4a7e-b097-b6d2e1ec3246\") " pod="openstack/nova-api-0" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.761324 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62dad1be-8413-4a7e-b097-b6d2e1ec3246-internal-tls-certs\") pod \"nova-api-0\" (UID: \"62dad1be-8413-4a7e-b097-b6d2e1ec3246\") " pod="openstack/nova-api-0" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.761397 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62dad1be-8413-4a7e-b097-b6d2e1ec3246-config-data\") pod \"nova-api-0\" (UID: \"62dad1be-8413-4a7e-b097-b6d2e1ec3246\") " 
pod="openstack/nova-api-0" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.761422 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62dad1be-8413-4a7e-b097-b6d2e1ec3246-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"62dad1be-8413-4a7e-b097-b6d2e1ec3246\") " pod="openstack/nova-api-0" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.761444 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62dad1be-8413-4a7e-b097-b6d2e1ec3246-public-tls-certs\") pod \"nova-api-0\" (UID: \"62dad1be-8413-4a7e-b097-b6d2e1ec3246\") " pod="openstack/nova-api-0" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.761469 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62dad1be-8413-4a7e-b097-b6d2e1ec3246-logs\") pod \"nova-api-0\" (UID: \"62dad1be-8413-4a7e-b097-b6d2e1ec3246\") " pod="openstack/nova-api-0" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.761941 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62dad1be-8413-4a7e-b097-b6d2e1ec3246-logs\") pod \"nova-api-0\" (UID: \"62dad1be-8413-4a7e-b097-b6d2e1ec3246\") " pod="openstack/nova-api-0" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.765717 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62dad1be-8413-4a7e-b097-b6d2e1ec3246-internal-tls-certs\") pod \"nova-api-0\" (UID: \"62dad1be-8413-4a7e-b097-b6d2e1ec3246\") " pod="openstack/nova-api-0" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.766340 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62dad1be-8413-4a7e-b097-b6d2e1ec3246-public-tls-certs\") 
pod \"nova-api-0\" (UID: \"62dad1be-8413-4a7e-b097-b6d2e1ec3246\") " pod="openstack/nova-api-0" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.766800 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62dad1be-8413-4a7e-b097-b6d2e1ec3246-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"62dad1be-8413-4a7e-b097-b6d2e1ec3246\") " pod="openstack/nova-api-0" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.768842 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62dad1be-8413-4a7e-b097-b6d2e1ec3246-config-data\") pod \"nova-api-0\" (UID: \"62dad1be-8413-4a7e-b097-b6d2e1ec3246\") " pod="openstack/nova-api-0" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.778941 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k8jz\" (UniqueName: \"kubernetes.io/projected/62dad1be-8413-4a7e-b097-b6d2e1ec3246-kube-api-access-7k8jz\") pod \"nova-api-0\" (UID: \"62dad1be-8413-4a7e-b097-b6d2e1ec3246\") " pod="openstack/nova-api-0" Mar 10 08:20:05 crc kubenswrapper[4825]: I0310 08:20:05.911582 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 08:20:06 crc kubenswrapper[4825]: I0310 08:20:06.372929 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 08:20:06 crc kubenswrapper[4825]: I0310 08:20:06.501814 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"62dad1be-8413-4a7e-b097-b6d2e1ec3246","Type":"ContainerStarted","Data":"d61052bd14c88a6e5a4b74270347a1d3f4fc6df0f055339d88a561100f399a65"} Mar 10 08:20:07 crc kubenswrapper[4825]: I0310 08:20:07.249830 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="125572ab-4336-41da-900d-a0649d7d378d" path="/var/lib/kubelet/pods/125572ab-4336-41da-900d-a0649d7d378d/volumes" Mar 10 08:20:07 crc kubenswrapper[4825]: I0310 08:20:07.513397 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"62dad1be-8413-4a7e-b097-b6d2e1ec3246","Type":"ContainerStarted","Data":"53501f735e5e94a19464b8ed3190c4c9cc118c8695bb55f561845902951cfdbc"} Mar 10 08:20:07 crc kubenswrapper[4825]: I0310 08:20:07.513433 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"62dad1be-8413-4a7e-b097-b6d2e1ec3246","Type":"ContainerStarted","Data":"e58f953a04ebd72b37b1f49cf6ed5a60c52afd420da2fa4ea715f458184f4621"} Mar 10 08:20:07 crc kubenswrapper[4825]: I0310 08:20:07.530930 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.5309016140000002 podStartE2EDuration="2.530901614s" podCreationTimestamp="2026-03-10 08:20:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:20:07.529174638 +0000 UTC m=+5760.558955253" watchObservedRunningTime="2026-03-10 08:20:07.530901614 +0000 UTC m=+5760.560682229" Mar 10 08:20:08 crc kubenswrapper[4825]: I0310 08:20:08.031175 4825 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/keystone-bootstrap-d4mm8"] Mar 10 08:20:08 crc kubenswrapper[4825]: I0310 08:20:08.042619 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-d4mm8"] Mar 10 08:20:08 crc kubenswrapper[4825]: I0310 08:20:08.489881 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-66cbbcd49c-svhss" Mar 10 08:20:08 crc kubenswrapper[4825]: I0310 08:20:08.550667 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b65f5864c-2nkbg"] Mar 10 08:20:08 crc kubenswrapper[4825]: I0310 08:20:08.550940 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b65f5864c-2nkbg" podUID="4e09d2ea-af50-4fff-ba76-270737793800" containerName="dnsmasq-dns" containerID="cri-o://493ca5094ecb817da7adae4f0b754addd1f62b6347c54a114fccb0f800dae09a" gracePeriod=10 Mar 10 08:20:09 crc kubenswrapper[4825]: I0310 08:20:09.052713 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b65f5864c-2nkbg" Mar 10 08:20:09 crc kubenswrapper[4825]: I0310 08:20:09.126776 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcqbl\" (UniqueName: \"kubernetes.io/projected/4e09d2ea-af50-4fff-ba76-270737793800-kube-api-access-zcqbl\") pod \"4e09d2ea-af50-4fff-ba76-270737793800\" (UID: \"4e09d2ea-af50-4fff-ba76-270737793800\") " Mar 10 08:20:09 crc kubenswrapper[4825]: I0310 08:20:09.126811 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e09d2ea-af50-4fff-ba76-270737793800-config\") pod \"4e09d2ea-af50-4fff-ba76-270737793800\" (UID: \"4e09d2ea-af50-4fff-ba76-270737793800\") " Mar 10 08:20:09 crc kubenswrapper[4825]: I0310 08:20:09.126940 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e09d2ea-af50-4fff-ba76-270737793800-ovsdbserver-nb\") pod \"4e09d2ea-af50-4fff-ba76-270737793800\" (UID: \"4e09d2ea-af50-4fff-ba76-270737793800\") " Mar 10 08:20:09 crc kubenswrapper[4825]: I0310 08:20:09.126983 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e09d2ea-af50-4fff-ba76-270737793800-ovsdbserver-sb\") pod \"4e09d2ea-af50-4fff-ba76-270737793800\" (UID: \"4e09d2ea-af50-4fff-ba76-270737793800\") " Mar 10 08:20:09 crc kubenswrapper[4825]: I0310 08:20:09.127051 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e09d2ea-af50-4fff-ba76-270737793800-dns-svc\") pod \"4e09d2ea-af50-4fff-ba76-270737793800\" (UID: \"4e09d2ea-af50-4fff-ba76-270737793800\") " Mar 10 08:20:09 crc kubenswrapper[4825]: I0310 08:20:09.133523 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/4e09d2ea-af50-4fff-ba76-270737793800-kube-api-access-zcqbl" (OuterVolumeSpecName: "kube-api-access-zcqbl") pod "4e09d2ea-af50-4fff-ba76-270737793800" (UID: "4e09d2ea-af50-4fff-ba76-270737793800"). InnerVolumeSpecName "kube-api-access-zcqbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:20:09 crc kubenswrapper[4825]: I0310 08:20:09.176663 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e09d2ea-af50-4fff-ba76-270737793800-config" (OuterVolumeSpecName: "config") pod "4e09d2ea-af50-4fff-ba76-270737793800" (UID: "4e09d2ea-af50-4fff-ba76-270737793800"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:20:09 crc kubenswrapper[4825]: I0310 08:20:09.188853 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e09d2ea-af50-4fff-ba76-270737793800-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4e09d2ea-af50-4fff-ba76-270737793800" (UID: "4e09d2ea-af50-4fff-ba76-270737793800"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:20:09 crc kubenswrapper[4825]: I0310 08:20:09.195221 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e09d2ea-af50-4fff-ba76-270737793800-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4e09d2ea-af50-4fff-ba76-270737793800" (UID: "4e09d2ea-af50-4fff-ba76-270737793800"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:20:09 crc kubenswrapper[4825]: I0310 08:20:09.211352 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e09d2ea-af50-4fff-ba76-270737793800-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4e09d2ea-af50-4fff-ba76-270737793800" (UID: "4e09d2ea-af50-4fff-ba76-270737793800"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:20:09 crc kubenswrapper[4825]: I0310 08:20:09.229187 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e09d2ea-af50-4fff-ba76-270737793800-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 08:20:09 crc kubenswrapper[4825]: I0310 08:20:09.229369 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcqbl\" (UniqueName: \"kubernetes.io/projected/4e09d2ea-af50-4fff-ba76-270737793800-kube-api-access-zcqbl\") on node \"crc\" DevicePath \"\"" Mar 10 08:20:09 crc kubenswrapper[4825]: I0310 08:20:09.229409 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e09d2ea-af50-4fff-ba76-270737793800-config\") on node \"crc\" DevicePath \"\"" Mar 10 08:20:09 crc kubenswrapper[4825]: I0310 08:20:09.229423 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e09d2ea-af50-4fff-ba76-270737793800-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 08:20:09 crc kubenswrapper[4825]: I0310 08:20:09.229436 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e09d2ea-af50-4fff-ba76-270737793800-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 08:20:09 crc kubenswrapper[4825]: I0310 08:20:09.246608 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c94d5e99-c6af-43b2-9000-bc78fb053d79" path="/var/lib/kubelet/pods/c94d5e99-c6af-43b2-9000-bc78fb053d79/volumes" Mar 10 08:20:09 crc kubenswrapper[4825]: I0310 08:20:09.551523 4825 generic.go:334] "Generic (PLEG): container finished" podID="4e09d2ea-af50-4fff-ba76-270737793800" containerID="493ca5094ecb817da7adae4f0b754addd1f62b6347c54a114fccb0f800dae09a" exitCode=0 Mar 10 08:20:09 crc kubenswrapper[4825]: I0310 08:20:09.551587 4825 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/dnsmasq-dns-7b65f5864c-2nkbg" event={"ID":"4e09d2ea-af50-4fff-ba76-270737793800","Type":"ContainerDied","Data":"493ca5094ecb817da7adae4f0b754addd1f62b6347c54a114fccb0f800dae09a"} Mar 10 08:20:09 crc kubenswrapper[4825]: I0310 08:20:09.551615 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b65f5864c-2nkbg" event={"ID":"4e09d2ea-af50-4fff-ba76-270737793800","Type":"ContainerDied","Data":"6fa927544056305e29afa011575339da3d3364c56e9bc879549ebd8e277b5cbc"} Mar 10 08:20:09 crc kubenswrapper[4825]: I0310 08:20:09.551667 4825 scope.go:117] "RemoveContainer" containerID="493ca5094ecb817da7adae4f0b754addd1f62b6347c54a114fccb0f800dae09a" Mar 10 08:20:09 crc kubenswrapper[4825]: I0310 08:20:09.552478 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b65f5864c-2nkbg" Mar 10 08:20:09 crc kubenswrapper[4825]: I0310 08:20:09.575015 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b65f5864c-2nkbg"] Mar 10 08:20:09 crc kubenswrapper[4825]: I0310 08:20:09.584400 4825 scope.go:117] "RemoveContainer" containerID="1aa04356f697b3fcdd3359ee9e10b7ab3948b624433309f62f5ac8f761d1e1a7" Mar 10 08:20:09 crc kubenswrapper[4825]: I0310 08:20:09.585556 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b65f5864c-2nkbg"] Mar 10 08:20:09 crc kubenswrapper[4825]: I0310 08:20:09.601899 4825 scope.go:117] "RemoveContainer" containerID="493ca5094ecb817da7adae4f0b754addd1f62b6347c54a114fccb0f800dae09a" Mar 10 08:20:09 crc kubenswrapper[4825]: E0310 08:20:09.602429 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"493ca5094ecb817da7adae4f0b754addd1f62b6347c54a114fccb0f800dae09a\": container with ID starting with 493ca5094ecb817da7adae4f0b754addd1f62b6347c54a114fccb0f800dae09a not found: ID does not exist" 
containerID="493ca5094ecb817da7adae4f0b754addd1f62b6347c54a114fccb0f800dae09a" Mar 10 08:20:09 crc kubenswrapper[4825]: I0310 08:20:09.602473 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"493ca5094ecb817da7adae4f0b754addd1f62b6347c54a114fccb0f800dae09a"} err="failed to get container status \"493ca5094ecb817da7adae4f0b754addd1f62b6347c54a114fccb0f800dae09a\": rpc error: code = NotFound desc = could not find container \"493ca5094ecb817da7adae4f0b754addd1f62b6347c54a114fccb0f800dae09a\": container with ID starting with 493ca5094ecb817da7adae4f0b754addd1f62b6347c54a114fccb0f800dae09a not found: ID does not exist" Mar 10 08:20:09 crc kubenswrapper[4825]: I0310 08:20:09.602498 4825 scope.go:117] "RemoveContainer" containerID="1aa04356f697b3fcdd3359ee9e10b7ab3948b624433309f62f5ac8f761d1e1a7" Mar 10 08:20:09 crc kubenswrapper[4825]: E0310 08:20:09.602856 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1aa04356f697b3fcdd3359ee9e10b7ab3948b624433309f62f5ac8f761d1e1a7\": container with ID starting with 1aa04356f697b3fcdd3359ee9e10b7ab3948b624433309f62f5ac8f761d1e1a7 not found: ID does not exist" containerID="1aa04356f697b3fcdd3359ee9e10b7ab3948b624433309f62f5ac8f761d1e1a7" Mar 10 08:20:09 crc kubenswrapper[4825]: I0310 08:20:09.602916 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1aa04356f697b3fcdd3359ee9e10b7ab3948b624433309f62f5ac8f761d1e1a7"} err="failed to get container status \"1aa04356f697b3fcdd3359ee9e10b7ab3948b624433309f62f5ac8f761d1e1a7\": rpc error: code = NotFound desc = could not find container \"1aa04356f697b3fcdd3359ee9e10b7ab3948b624433309f62f5ac8f761d1e1a7\": container with ID starting with 1aa04356f697b3fcdd3359ee9e10b7ab3948b624433309f62f5ac8f761d1e1a7 not found: ID does not exist" Mar 10 08:20:09 crc kubenswrapper[4825]: I0310 08:20:09.970123 4825 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-llnvt"] Mar 10 08:20:09 crc kubenswrapper[4825]: E0310 08:20:09.970784 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e09d2ea-af50-4fff-ba76-270737793800" containerName="init" Mar 10 08:20:09 crc kubenswrapper[4825]: I0310 08:20:09.970800 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e09d2ea-af50-4fff-ba76-270737793800" containerName="init" Mar 10 08:20:09 crc kubenswrapper[4825]: E0310 08:20:09.970826 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e09d2ea-af50-4fff-ba76-270737793800" containerName="dnsmasq-dns" Mar 10 08:20:09 crc kubenswrapper[4825]: I0310 08:20:09.970832 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e09d2ea-af50-4fff-ba76-270737793800" containerName="dnsmasq-dns" Mar 10 08:20:09 crc kubenswrapper[4825]: I0310 08:20:09.970998 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e09d2ea-af50-4fff-ba76-270737793800" containerName="dnsmasq-dns" Mar 10 08:20:09 crc kubenswrapper[4825]: I0310 08:20:09.972258 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-llnvt" Mar 10 08:20:09 crc kubenswrapper[4825]: I0310 08:20:09.991342 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-llnvt"] Mar 10 08:20:10 crc kubenswrapper[4825]: I0310 08:20:10.049564 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31dbfbab-84ac-4f05-bfbf-b658aca1b300-catalog-content\") pod \"redhat-marketplace-llnvt\" (UID: \"31dbfbab-84ac-4f05-bfbf-b658aca1b300\") " pod="openshift-marketplace/redhat-marketplace-llnvt" Mar 10 08:20:10 crc kubenswrapper[4825]: I0310 08:20:10.049786 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57c2d\" (UniqueName: \"kubernetes.io/projected/31dbfbab-84ac-4f05-bfbf-b658aca1b300-kube-api-access-57c2d\") pod \"redhat-marketplace-llnvt\" (UID: \"31dbfbab-84ac-4f05-bfbf-b658aca1b300\") " pod="openshift-marketplace/redhat-marketplace-llnvt" Mar 10 08:20:10 crc kubenswrapper[4825]: I0310 08:20:10.049875 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31dbfbab-84ac-4f05-bfbf-b658aca1b300-utilities\") pod \"redhat-marketplace-llnvt\" (UID: \"31dbfbab-84ac-4f05-bfbf-b658aca1b300\") " pod="openshift-marketplace/redhat-marketplace-llnvt" Mar 10 08:20:10 crc kubenswrapper[4825]: I0310 08:20:10.151414 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31dbfbab-84ac-4f05-bfbf-b658aca1b300-catalog-content\") pod \"redhat-marketplace-llnvt\" (UID: \"31dbfbab-84ac-4f05-bfbf-b658aca1b300\") " pod="openshift-marketplace/redhat-marketplace-llnvt" Mar 10 08:20:10 crc kubenswrapper[4825]: I0310 08:20:10.151522 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-57c2d\" (UniqueName: \"kubernetes.io/projected/31dbfbab-84ac-4f05-bfbf-b658aca1b300-kube-api-access-57c2d\") pod \"redhat-marketplace-llnvt\" (UID: \"31dbfbab-84ac-4f05-bfbf-b658aca1b300\") " pod="openshift-marketplace/redhat-marketplace-llnvt" Mar 10 08:20:10 crc kubenswrapper[4825]: I0310 08:20:10.151559 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31dbfbab-84ac-4f05-bfbf-b658aca1b300-utilities\") pod \"redhat-marketplace-llnvt\" (UID: \"31dbfbab-84ac-4f05-bfbf-b658aca1b300\") " pod="openshift-marketplace/redhat-marketplace-llnvt" Mar 10 08:20:10 crc kubenswrapper[4825]: I0310 08:20:10.152520 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31dbfbab-84ac-4f05-bfbf-b658aca1b300-catalog-content\") pod \"redhat-marketplace-llnvt\" (UID: \"31dbfbab-84ac-4f05-bfbf-b658aca1b300\") " pod="openshift-marketplace/redhat-marketplace-llnvt" Mar 10 08:20:10 crc kubenswrapper[4825]: I0310 08:20:10.152622 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31dbfbab-84ac-4f05-bfbf-b658aca1b300-utilities\") pod \"redhat-marketplace-llnvt\" (UID: \"31dbfbab-84ac-4f05-bfbf-b658aca1b300\") " pod="openshift-marketplace/redhat-marketplace-llnvt" Mar 10 08:20:10 crc kubenswrapper[4825]: I0310 08:20:10.205939 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57c2d\" (UniqueName: \"kubernetes.io/projected/31dbfbab-84ac-4f05-bfbf-b658aca1b300-kube-api-access-57c2d\") pod \"redhat-marketplace-llnvt\" (UID: \"31dbfbab-84ac-4f05-bfbf-b658aca1b300\") " pod="openshift-marketplace/redhat-marketplace-llnvt" Mar 10 08:20:10 crc kubenswrapper[4825]: I0310 08:20:10.288712 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-llnvt" Mar 10 08:20:10 crc kubenswrapper[4825]: I0310 08:20:10.724636 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-llnvt"] Mar 10 08:20:11 crc kubenswrapper[4825]: I0310 08:20:11.246515 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e09d2ea-af50-4fff-ba76-270737793800" path="/var/lib/kubelet/pods/4e09d2ea-af50-4fff-ba76-270737793800/volumes" Mar 10 08:20:11 crc kubenswrapper[4825]: I0310 08:20:11.573460 4825 generic.go:334] "Generic (PLEG): container finished" podID="31dbfbab-84ac-4f05-bfbf-b658aca1b300" containerID="719346f0d07d4e2ec21c464caaf1c5aa8778b5b142b202e2baeff590a4d03609" exitCode=0 Mar 10 08:20:11 crc kubenswrapper[4825]: I0310 08:20:11.573541 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-llnvt" event={"ID":"31dbfbab-84ac-4f05-bfbf-b658aca1b300","Type":"ContainerDied","Data":"719346f0d07d4e2ec21c464caaf1c5aa8778b5b142b202e2baeff590a4d03609"} Mar 10 08:20:11 crc kubenswrapper[4825]: I0310 08:20:11.573619 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-llnvt" event={"ID":"31dbfbab-84ac-4f05-bfbf-b658aca1b300","Type":"ContainerStarted","Data":"a0b97fb698dc01fddae29aa4e4e7aefd4b9efbb7b5f386e49ef7756d76de0514"} Mar 10 08:20:12 crc kubenswrapper[4825]: I0310 08:20:12.604148 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-llnvt" event={"ID":"31dbfbab-84ac-4f05-bfbf-b658aca1b300","Type":"ContainerStarted","Data":"fdd3aadf9e0388431925d595e1d184ebf46fd4250cd563fbcb532ee7e9d98ba7"} Mar 10 08:20:13 crc kubenswrapper[4825]: I0310 08:20:13.612803 4825 generic.go:334] "Generic (PLEG): container finished" podID="31dbfbab-84ac-4f05-bfbf-b658aca1b300" containerID="fdd3aadf9e0388431925d595e1d184ebf46fd4250cd563fbcb532ee7e9d98ba7" exitCode=0 Mar 10 08:20:13 crc 
kubenswrapper[4825]: I0310 08:20:13.612838 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-llnvt" event={"ID":"31dbfbab-84ac-4f05-bfbf-b658aca1b300","Type":"ContainerDied","Data":"fdd3aadf9e0388431925d595e1d184ebf46fd4250cd563fbcb532ee7e9d98ba7"} Mar 10 08:20:14 crc kubenswrapper[4825]: I0310 08:20:14.622268 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-llnvt" event={"ID":"31dbfbab-84ac-4f05-bfbf-b658aca1b300","Type":"ContainerStarted","Data":"6147fc6f54537e63fd4df90a965f051176a8fe3f0190bb2ac25746ad5ae87071"} Mar 10 08:20:14 crc kubenswrapper[4825]: I0310 08:20:14.654212 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-llnvt" podStartSLOduration=3.174009752 podStartE2EDuration="5.654197187s" podCreationTimestamp="2026-03-10 08:20:09 +0000 UTC" firstStartedPulling="2026-03-10 08:20:11.575612513 +0000 UTC m=+5764.605393128" lastFinishedPulling="2026-03-10 08:20:14.055799928 +0000 UTC m=+5767.085580563" observedRunningTime="2026-03-10 08:20:14.649186536 +0000 UTC m=+5767.678967161" watchObservedRunningTime="2026-03-10 08:20:14.654197187 +0000 UTC m=+5767.683977792" Mar 10 08:20:15 crc kubenswrapper[4825]: I0310 08:20:15.912608 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 08:20:15 crc kubenswrapper[4825]: I0310 08:20:15.912889 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 08:20:16 crc kubenswrapper[4825]: I0310 08:20:16.927287 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="62dad1be-8413-4a7e-b097-b6d2e1ec3246" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.135:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 08:20:16 crc kubenswrapper[4825]: I0310 
08:20:16.927328 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="62dad1be-8413-4a7e-b097-b6d2e1ec3246" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.135:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 08:20:20 crc kubenswrapper[4825]: I0310 08:20:20.289274 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-llnvt" Mar 10 08:20:20 crc kubenswrapper[4825]: I0310 08:20:20.289695 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-llnvt" Mar 10 08:20:20 crc kubenswrapper[4825]: I0310 08:20:20.340442 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-llnvt" Mar 10 08:20:20 crc kubenswrapper[4825]: I0310 08:20:20.741444 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-llnvt" Mar 10 08:20:20 crc kubenswrapper[4825]: I0310 08:20:20.805478 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-llnvt"] Mar 10 08:20:22 crc kubenswrapper[4825]: I0310 08:20:22.698570 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-llnvt" podUID="31dbfbab-84ac-4f05-bfbf-b658aca1b300" containerName="registry-server" containerID="cri-o://6147fc6f54537e63fd4df90a965f051176a8fe3f0190bb2ac25746ad5ae87071" gracePeriod=2 Mar 10 08:20:23 crc kubenswrapper[4825]: I0310 08:20:23.197061 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-llnvt" Mar 10 08:20:23 crc kubenswrapper[4825]: I0310 08:20:23.220347 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31dbfbab-84ac-4f05-bfbf-b658aca1b300-catalog-content\") pod \"31dbfbab-84ac-4f05-bfbf-b658aca1b300\" (UID: \"31dbfbab-84ac-4f05-bfbf-b658aca1b300\") " Mar 10 08:20:23 crc kubenswrapper[4825]: I0310 08:20:23.220542 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31dbfbab-84ac-4f05-bfbf-b658aca1b300-utilities\") pod \"31dbfbab-84ac-4f05-bfbf-b658aca1b300\" (UID: \"31dbfbab-84ac-4f05-bfbf-b658aca1b300\") " Mar 10 08:20:23 crc kubenswrapper[4825]: I0310 08:20:23.220614 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57c2d\" (UniqueName: \"kubernetes.io/projected/31dbfbab-84ac-4f05-bfbf-b658aca1b300-kube-api-access-57c2d\") pod \"31dbfbab-84ac-4f05-bfbf-b658aca1b300\" (UID: \"31dbfbab-84ac-4f05-bfbf-b658aca1b300\") " Mar 10 08:20:23 crc kubenswrapper[4825]: I0310 08:20:23.221434 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31dbfbab-84ac-4f05-bfbf-b658aca1b300-utilities" (OuterVolumeSpecName: "utilities") pod "31dbfbab-84ac-4f05-bfbf-b658aca1b300" (UID: "31dbfbab-84ac-4f05-bfbf-b658aca1b300"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:20:23 crc kubenswrapper[4825]: I0310 08:20:23.227445 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31dbfbab-84ac-4f05-bfbf-b658aca1b300-kube-api-access-57c2d" (OuterVolumeSpecName: "kube-api-access-57c2d") pod "31dbfbab-84ac-4f05-bfbf-b658aca1b300" (UID: "31dbfbab-84ac-4f05-bfbf-b658aca1b300"). InnerVolumeSpecName "kube-api-access-57c2d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:20:23 crc kubenswrapper[4825]: I0310 08:20:23.255417 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31dbfbab-84ac-4f05-bfbf-b658aca1b300-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "31dbfbab-84ac-4f05-bfbf-b658aca1b300" (UID: "31dbfbab-84ac-4f05-bfbf-b658aca1b300"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:20:23 crc kubenswrapper[4825]: I0310 08:20:23.327031 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31dbfbab-84ac-4f05-bfbf-b658aca1b300-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 08:20:23 crc kubenswrapper[4825]: I0310 08:20:23.327068 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31dbfbab-84ac-4f05-bfbf-b658aca1b300-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 08:20:23 crc kubenswrapper[4825]: I0310 08:20:23.327084 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57c2d\" (UniqueName: \"kubernetes.io/projected/31dbfbab-84ac-4f05-bfbf-b658aca1b300-kube-api-access-57c2d\") on node \"crc\" DevicePath \"\"" Mar 10 08:20:23 crc kubenswrapper[4825]: I0310 08:20:23.709495 4825 generic.go:334] "Generic (PLEG): container finished" podID="31dbfbab-84ac-4f05-bfbf-b658aca1b300" containerID="6147fc6f54537e63fd4df90a965f051176a8fe3f0190bb2ac25746ad5ae87071" exitCode=0 Mar 10 08:20:23 crc kubenswrapper[4825]: I0310 08:20:23.709578 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-llnvt" event={"ID":"31dbfbab-84ac-4f05-bfbf-b658aca1b300","Type":"ContainerDied","Data":"6147fc6f54537e63fd4df90a965f051176a8fe3f0190bb2ac25746ad5ae87071"} Mar 10 08:20:23 crc kubenswrapper[4825]: I0310 08:20:23.709625 4825 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-llnvt" event={"ID":"31dbfbab-84ac-4f05-bfbf-b658aca1b300","Type":"ContainerDied","Data":"a0b97fb698dc01fddae29aa4e4e7aefd4b9efbb7b5f386e49ef7756d76de0514"} Mar 10 08:20:23 crc kubenswrapper[4825]: I0310 08:20:23.709645 4825 scope.go:117] "RemoveContainer" containerID="6147fc6f54537e63fd4df90a965f051176a8fe3f0190bb2ac25746ad5ae87071" Mar 10 08:20:23 crc kubenswrapper[4825]: I0310 08:20:23.709642 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-llnvt" Mar 10 08:20:23 crc kubenswrapper[4825]: I0310 08:20:23.742951 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-llnvt"] Mar 10 08:20:23 crc kubenswrapper[4825]: I0310 08:20:23.745755 4825 scope.go:117] "RemoveContainer" containerID="fdd3aadf9e0388431925d595e1d184ebf46fd4250cd563fbcb532ee7e9d98ba7" Mar 10 08:20:23 crc kubenswrapper[4825]: I0310 08:20:23.753457 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-llnvt"] Mar 10 08:20:23 crc kubenswrapper[4825]: I0310 08:20:23.764709 4825 scope.go:117] "RemoveContainer" containerID="719346f0d07d4e2ec21c464caaf1c5aa8778b5b142b202e2baeff590a4d03609" Mar 10 08:20:23 crc kubenswrapper[4825]: I0310 08:20:23.816764 4825 scope.go:117] "RemoveContainer" containerID="6147fc6f54537e63fd4df90a965f051176a8fe3f0190bb2ac25746ad5ae87071" Mar 10 08:20:23 crc kubenswrapper[4825]: E0310 08:20:23.817658 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6147fc6f54537e63fd4df90a965f051176a8fe3f0190bb2ac25746ad5ae87071\": container with ID starting with 6147fc6f54537e63fd4df90a965f051176a8fe3f0190bb2ac25746ad5ae87071 not found: ID does not exist" containerID="6147fc6f54537e63fd4df90a965f051176a8fe3f0190bb2ac25746ad5ae87071" Mar 10 08:20:23 crc kubenswrapper[4825]: I0310 08:20:23.817707 4825 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6147fc6f54537e63fd4df90a965f051176a8fe3f0190bb2ac25746ad5ae87071"} err="failed to get container status \"6147fc6f54537e63fd4df90a965f051176a8fe3f0190bb2ac25746ad5ae87071\": rpc error: code = NotFound desc = could not find container \"6147fc6f54537e63fd4df90a965f051176a8fe3f0190bb2ac25746ad5ae87071\": container with ID starting with 6147fc6f54537e63fd4df90a965f051176a8fe3f0190bb2ac25746ad5ae87071 not found: ID does not exist" Mar 10 08:20:23 crc kubenswrapper[4825]: I0310 08:20:23.817761 4825 scope.go:117] "RemoveContainer" containerID="fdd3aadf9e0388431925d595e1d184ebf46fd4250cd563fbcb532ee7e9d98ba7" Mar 10 08:20:23 crc kubenswrapper[4825]: E0310 08:20:23.818217 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdd3aadf9e0388431925d595e1d184ebf46fd4250cd563fbcb532ee7e9d98ba7\": container with ID starting with fdd3aadf9e0388431925d595e1d184ebf46fd4250cd563fbcb532ee7e9d98ba7 not found: ID does not exist" containerID="fdd3aadf9e0388431925d595e1d184ebf46fd4250cd563fbcb532ee7e9d98ba7" Mar 10 08:20:23 crc kubenswrapper[4825]: I0310 08:20:23.818262 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdd3aadf9e0388431925d595e1d184ebf46fd4250cd563fbcb532ee7e9d98ba7"} err="failed to get container status \"fdd3aadf9e0388431925d595e1d184ebf46fd4250cd563fbcb532ee7e9d98ba7\": rpc error: code = NotFound desc = could not find container \"fdd3aadf9e0388431925d595e1d184ebf46fd4250cd563fbcb532ee7e9d98ba7\": container with ID starting with fdd3aadf9e0388431925d595e1d184ebf46fd4250cd563fbcb532ee7e9d98ba7 not found: ID does not exist" Mar 10 08:20:23 crc kubenswrapper[4825]: I0310 08:20:23.818302 4825 scope.go:117] "RemoveContainer" containerID="719346f0d07d4e2ec21c464caaf1c5aa8778b5b142b202e2baeff590a4d03609" Mar 10 08:20:23 crc kubenswrapper[4825]: E0310 
08:20:23.818696 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"719346f0d07d4e2ec21c464caaf1c5aa8778b5b142b202e2baeff590a4d03609\": container with ID starting with 719346f0d07d4e2ec21c464caaf1c5aa8778b5b142b202e2baeff590a4d03609 not found: ID does not exist" containerID="719346f0d07d4e2ec21c464caaf1c5aa8778b5b142b202e2baeff590a4d03609" Mar 10 08:20:23 crc kubenswrapper[4825]: I0310 08:20:23.818726 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"719346f0d07d4e2ec21c464caaf1c5aa8778b5b142b202e2baeff590a4d03609"} err="failed to get container status \"719346f0d07d4e2ec21c464caaf1c5aa8778b5b142b202e2baeff590a4d03609\": rpc error: code = NotFound desc = could not find container \"719346f0d07d4e2ec21c464caaf1c5aa8778b5b142b202e2baeff590a4d03609\": container with ID starting with 719346f0d07d4e2ec21c464caaf1c5aa8778b5b142b202e2baeff590a4d03609 not found: ID does not exist" Mar 10 08:20:25 crc kubenswrapper[4825]: I0310 08:20:25.252483 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31dbfbab-84ac-4f05-bfbf-b658aca1b300" path="/var/lib/kubelet/pods/31dbfbab-84ac-4f05-bfbf-b658aca1b300/volumes" Mar 10 08:20:25 crc kubenswrapper[4825]: I0310 08:20:25.919694 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 10 08:20:25 crc kubenswrapper[4825]: I0310 08:20:25.921388 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 10 08:20:25 crc kubenswrapper[4825]: I0310 08:20:25.924597 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 10 08:20:25 crc kubenswrapper[4825]: I0310 08:20:25.930782 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 10 08:20:26 crc kubenswrapper[4825]: I0310 08:20:26.739934 4825 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 10 08:20:26 crc kubenswrapper[4825]: I0310 08:20:26.747975 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 10 08:20:37 crc kubenswrapper[4825]: I0310 08:20:37.853736 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-58dcdb7cc-ccx9h"] Mar 10 08:20:37 crc kubenswrapper[4825]: E0310 08:20:37.854817 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31dbfbab-84ac-4f05-bfbf-b658aca1b300" containerName="extract-content" Mar 10 08:20:37 crc kubenswrapper[4825]: I0310 08:20:37.854839 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="31dbfbab-84ac-4f05-bfbf-b658aca1b300" containerName="extract-content" Mar 10 08:20:37 crc kubenswrapper[4825]: E0310 08:20:37.854876 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31dbfbab-84ac-4f05-bfbf-b658aca1b300" containerName="extract-utilities" Mar 10 08:20:37 crc kubenswrapper[4825]: I0310 08:20:37.854888 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="31dbfbab-84ac-4f05-bfbf-b658aca1b300" containerName="extract-utilities" Mar 10 08:20:37 crc kubenswrapper[4825]: E0310 08:20:37.854906 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31dbfbab-84ac-4f05-bfbf-b658aca1b300" containerName="registry-server" Mar 10 08:20:37 crc kubenswrapper[4825]: I0310 08:20:37.854914 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="31dbfbab-84ac-4f05-bfbf-b658aca1b300" containerName="registry-server" Mar 10 08:20:37 crc kubenswrapper[4825]: I0310 08:20:37.855122 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="31dbfbab-84ac-4f05-bfbf-b658aca1b300" containerName="registry-server" Mar 10 08:20:37 crc kubenswrapper[4825]: I0310 08:20:37.856082 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-58dcdb7cc-ccx9h" Mar 10 08:20:37 crc kubenswrapper[4825]: I0310 08:20:37.859494 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 10 08:20:37 crc kubenswrapper[4825]: I0310 08:20:37.859730 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 10 08:20:37 crc kubenswrapper[4825]: I0310 08:20:37.859886 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-cqncb" Mar 10 08:20:37 crc kubenswrapper[4825]: I0310 08:20:37.860056 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 10 08:20:37 crc kubenswrapper[4825]: I0310 08:20:37.895498 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-58dcdb7cc-ccx9h"] Mar 10 08:20:37 crc kubenswrapper[4825]: I0310 08:20:37.908388 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 08:20:37 crc kubenswrapper[4825]: I0310 08:20:37.908642 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ec1c3fd1-11dc-49a3-9869-95438b93ab08" containerName="glance-log" containerID="cri-o://1d3e4e3e439cd0f6b04d665a1435005ce50c78d9a937614f5ef8a5ade7a959a5" gracePeriod=30 Mar 10 08:20:37 crc kubenswrapper[4825]: I0310 08:20:37.908776 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ec1c3fd1-11dc-49a3-9869-95438b93ab08" containerName="glance-httpd" containerID="cri-o://e8b549f3aca00d92c110d0fabf012350fc907bc04b117fdc2dc18e6faf605abe" gracePeriod=30 Mar 10 08:20:37 crc kubenswrapper[4825]: I0310 08:20:37.952469 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-77b77cfcdf-n62md"] Mar 10 08:20:37 crc kubenswrapper[4825]: I0310 08:20:37.956285 4825 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-77b77cfcdf-n62md" Mar 10 08:20:37 crc kubenswrapper[4825]: I0310 08:20:37.981307 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-77b77cfcdf-n62md"] Mar 10 08:20:38 crc kubenswrapper[4825]: I0310 08:20:38.002439 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 08:20:38 crc kubenswrapper[4825]: I0310 08:20:38.002709 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ee73a326-c85e-4756-b836-3fc323ef11ac" containerName="glance-log" containerID="cri-o://2f3556bb6af608a5cfdaa3a62737229b2129fac11fe4cdf0a59d6331965d4f05" gracePeriod=30 Mar 10 08:20:38 crc kubenswrapper[4825]: I0310 08:20:38.002921 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ee73a326-c85e-4756-b836-3fc323ef11ac" containerName="glance-httpd" containerID="cri-o://6cf54bd96acb81570c21132944a0bdf95aaa4372936cf091dfc290a59d872175" gracePeriod=30 Mar 10 08:20:38 crc kubenswrapper[4825]: I0310 08:20:38.028028 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421-config-data\") pod \"horizon-58dcdb7cc-ccx9h\" (UID: \"a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421\") " pod="openstack/horizon-58dcdb7cc-ccx9h" Mar 10 08:20:38 crc kubenswrapper[4825]: I0310 08:20:38.028089 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421-scripts\") pod \"horizon-58dcdb7cc-ccx9h\" (UID: \"a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421\") " pod="openstack/horizon-58dcdb7cc-ccx9h" Mar 10 08:20:38 crc kubenswrapper[4825]: I0310 08:20:38.028157 
4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421-logs\") pod \"horizon-58dcdb7cc-ccx9h\" (UID: \"a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421\") " pod="openstack/horizon-58dcdb7cc-ccx9h" Mar 10 08:20:38 crc kubenswrapper[4825]: I0310 08:20:38.028174 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvs2z\" (UniqueName: \"kubernetes.io/projected/a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421-kube-api-access-jvs2z\") pod \"horizon-58dcdb7cc-ccx9h\" (UID: \"a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421\") " pod="openstack/horizon-58dcdb7cc-ccx9h" Mar 10 08:20:38 crc kubenswrapper[4825]: I0310 08:20:38.028312 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421-horizon-secret-key\") pod \"horizon-58dcdb7cc-ccx9h\" (UID: \"a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421\") " pod="openstack/horizon-58dcdb7cc-ccx9h" Mar 10 08:20:38 crc kubenswrapper[4825]: I0310 08:20:38.129577 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/385dc976-f03b-4c63-8cdc-73d5e8481597-logs\") pod \"horizon-77b77cfcdf-n62md\" (UID: \"385dc976-f03b-4c63-8cdc-73d5e8481597\") " pod="openstack/horizon-77b77cfcdf-n62md" Mar 10 08:20:38 crc kubenswrapper[4825]: I0310 08:20:38.129615 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/385dc976-f03b-4c63-8cdc-73d5e8481597-horizon-secret-key\") pod \"horizon-77b77cfcdf-n62md\" (UID: \"385dc976-f03b-4c63-8cdc-73d5e8481597\") " pod="openstack/horizon-77b77cfcdf-n62md" Mar 10 08:20:38 crc kubenswrapper[4825]: I0310 08:20:38.129646 4825 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421-horizon-secret-key\") pod \"horizon-58dcdb7cc-ccx9h\" (UID: \"a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421\") " pod="openstack/horizon-58dcdb7cc-ccx9h" Mar 10 08:20:38 crc kubenswrapper[4825]: I0310 08:20:38.129717 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-625pk\" (UniqueName: \"kubernetes.io/projected/385dc976-f03b-4c63-8cdc-73d5e8481597-kube-api-access-625pk\") pod \"horizon-77b77cfcdf-n62md\" (UID: \"385dc976-f03b-4c63-8cdc-73d5e8481597\") " pod="openstack/horizon-77b77cfcdf-n62md" Mar 10 08:20:38 crc kubenswrapper[4825]: I0310 08:20:38.129737 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421-config-data\") pod \"horizon-58dcdb7cc-ccx9h\" (UID: \"a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421\") " pod="openstack/horizon-58dcdb7cc-ccx9h" Mar 10 08:20:38 crc kubenswrapper[4825]: I0310 08:20:38.129763 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421-scripts\") pod \"horizon-58dcdb7cc-ccx9h\" (UID: \"a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421\") " pod="openstack/horizon-58dcdb7cc-ccx9h" Mar 10 08:20:38 crc kubenswrapper[4825]: I0310 08:20:38.129789 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/385dc976-f03b-4c63-8cdc-73d5e8481597-config-data\") pod \"horizon-77b77cfcdf-n62md\" (UID: \"385dc976-f03b-4c63-8cdc-73d5e8481597\") " pod="openstack/horizon-77b77cfcdf-n62md" Mar 10 08:20:38 crc kubenswrapper[4825]: I0310 08:20:38.129813 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/385dc976-f03b-4c63-8cdc-73d5e8481597-scripts\") pod \"horizon-77b77cfcdf-n62md\" (UID: \"385dc976-f03b-4c63-8cdc-73d5e8481597\") " pod="openstack/horizon-77b77cfcdf-n62md" Mar 10 08:20:38 crc kubenswrapper[4825]: I0310 08:20:38.129828 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421-logs\") pod \"horizon-58dcdb7cc-ccx9h\" (UID: \"a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421\") " pod="openstack/horizon-58dcdb7cc-ccx9h" Mar 10 08:20:38 crc kubenswrapper[4825]: I0310 08:20:38.129845 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvs2z\" (UniqueName: \"kubernetes.io/projected/a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421-kube-api-access-jvs2z\") pod \"horizon-58dcdb7cc-ccx9h\" (UID: \"a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421\") " pod="openstack/horizon-58dcdb7cc-ccx9h" Mar 10 08:20:38 crc kubenswrapper[4825]: I0310 08:20:38.130860 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421-scripts\") pod \"horizon-58dcdb7cc-ccx9h\" (UID: \"a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421\") " pod="openstack/horizon-58dcdb7cc-ccx9h" Mar 10 08:20:38 crc kubenswrapper[4825]: I0310 08:20:38.131097 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421-logs\") pod \"horizon-58dcdb7cc-ccx9h\" (UID: \"a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421\") " pod="openstack/horizon-58dcdb7cc-ccx9h" Mar 10 08:20:38 crc kubenswrapper[4825]: I0310 08:20:38.131445 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421-config-data\") pod 
\"horizon-58dcdb7cc-ccx9h\" (UID: \"a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421\") " pod="openstack/horizon-58dcdb7cc-ccx9h" Mar 10 08:20:38 crc kubenswrapper[4825]: I0310 08:20:38.142266 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421-horizon-secret-key\") pod \"horizon-58dcdb7cc-ccx9h\" (UID: \"a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421\") " pod="openstack/horizon-58dcdb7cc-ccx9h" Mar 10 08:20:38 crc kubenswrapper[4825]: I0310 08:20:38.147867 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvs2z\" (UniqueName: \"kubernetes.io/projected/a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421-kube-api-access-jvs2z\") pod \"horizon-58dcdb7cc-ccx9h\" (UID: \"a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421\") " pod="openstack/horizon-58dcdb7cc-ccx9h" Mar 10 08:20:38 crc kubenswrapper[4825]: I0310 08:20:38.202839 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-58dcdb7cc-ccx9h" Mar 10 08:20:38 crc kubenswrapper[4825]: I0310 08:20:38.232077 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/385dc976-f03b-4c63-8cdc-73d5e8481597-logs\") pod \"horizon-77b77cfcdf-n62md\" (UID: \"385dc976-f03b-4c63-8cdc-73d5e8481597\") " pod="openstack/horizon-77b77cfcdf-n62md" Mar 10 08:20:38 crc kubenswrapper[4825]: I0310 08:20:38.232156 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/385dc976-f03b-4c63-8cdc-73d5e8481597-horizon-secret-key\") pod \"horizon-77b77cfcdf-n62md\" (UID: \"385dc976-f03b-4c63-8cdc-73d5e8481597\") " pod="openstack/horizon-77b77cfcdf-n62md" Mar 10 08:20:38 crc kubenswrapper[4825]: I0310 08:20:38.232284 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-625pk\" (UniqueName: 
\"kubernetes.io/projected/385dc976-f03b-4c63-8cdc-73d5e8481597-kube-api-access-625pk\") pod \"horizon-77b77cfcdf-n62md\" (UID: \"385dc976-f03b-4c63-8cdc-73d5e8481597\") " pod="openstack/horizon-77b77cfcdf-n62md" Mar 10 08:20:38 crc kubenswrapper[4825]: I0310 08:20:38.232356 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/385dc976-f03b-4c63-8cdc-73d5e8481597-config-data\") pod \"horizon-77b77cfcdf-n62md\" (UID: \"385dc976-f03b-4c63-8cdc-73d5e8481597\") " pod="openstack/horizon-77b77cfcdf-n62md" Mar 10 08:20:38 crc kubenswrapper[4825]: I0310 08:20:38.232404 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/385dc976-f03b-4c63-8cdc-73d5e8481597-scripts\") pod \"horizon-77b77cfcdf-n62md\" (UID: \"385dc976-f03b-4c63-8cdc-73d5e8481597\") " pod="openstack/horizon-77b77cfcdf-n62md" Mar 10 08:20:38 crc kubenswrapper[4825]: I0310 08:20:38.232794 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/385dc976-f03b-4c63-8cdc-73d5e8481597-logs\") pod \"horizon-77b77cfcdf-n62md\" (UID: \"385dc976-f03b-4c63-8cdc-73d5e8481597\") " pod="openstack/horizon-77b77cfcdf-n62md" Mar 10 08:20:38 crc kubenswrapper[4825]: I0310 08:20:38.233684 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/385dc976-f03b-4c63-8cdc-73d5e8481597-config-data\") pod \"horizon-77b77cfcdf-n62md\" (UID: \"385dc976-f03b-4c63-8cdc-73d5e8481597\") " pod="openstack/horizon-77b77cfcdf-n62md" Mar 10 08:20:38 crc kubenswrapper[4825]: I0310 08:20:38.234284 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/385dc976-f03b-4c63-8cdc-73d5e8481597-scripts\") pod \"horizon-77b77cfcdf-n62md\" (UID: \"385dc976-f03b-4c63-8cdc-73d5e8481597\") " 
pod="openstack/horizon-77b77cfcdf-n62md" Mar 10 08:20:38 crc kubenswrapper[4825]: I0310 08:20:38.235693 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/385dc976-f03b-4c63-8cdc-73d5e8481597-horizon-secret-key\") pod \"horizon-77b77cfcdf-n62md\" (UID: \"385dc976-f03b-4c63-8cdc-73d5e8481597\") " pod="openstack/horizon-77b77cfcdf-n62md" Mar 10 08:20:38 crc kubenswrapper[4825]: I0310 08:20:38.249051 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-625pk\" (UniqueName: \"kubernetes.io/projected/385dc976-f03b-4c63-8cdc-73d5e8481597-kube-api-access-625pk\") pod \"horizon-77b77cfcdf-n62md\" (UID: \"385dc976-f03b-4c63-8cdc-73d5e8481597\") " pod="openstack/horizon-77b77cfcdf-n62md" Mar 10 08:20:38 crc kubenswrapper[4825]: I0310 08:20:38.297053 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-77b77cfcdf-n62md" Mar 10 08:20:38 crc kubenswrapper[4825]: I0310 08:20:38.654748 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-58dcdb7cc-ccx9h"] Mar 10 08:20:38 crc kubenswrapper[4825]: W0310 08:20:38.661419 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2ee8d7c_1256_4a21_8f30_2a3f9d3f1421.slice/crio-ba4a2da7d97965aa9be94de1d01b90744244674a63fed17dd94acda6ff94d3af WatchSource:0}: Error finding container ba4a2da7d97965aa9be94de1d01b90744244674a63fed17dd94acda6ff94d3af: Status 404 returned error can't find the container with id ba4a2da7d97965aa9be94de1d01b90744244674a63fed17dd94acda6ff94d3af Mar 10 08:20:38 crc kubenswrapper[4825]: W0310 08:20:38.775151 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod385dc976_f03b_4c63_8cdc_73d5e8481597.slice/crio-7bcbc4fdd5fec9c3ff27451dfa08602e7574850e8a3fbc3bbe148a32a079dcc7 
WatchSource:0}: Error finding container 7bcbc4fdd5fec9c3ff27451dfa08602e7574850e8a3fbc3bbe148a32a079dcc7: Status 404 returned error can't find the container with id 7bcbc4fdd5fec9c3ff27451dfa08602e7574850e8a3fbc3bbe148a32a079dcc7 Mar 10 08:20:38 crc kubenswrapper[4825]: I0310 08:20:38.775683 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-77b77cfcdf-n62md"] Mar 10 08:20:38 crc kubenswrapper[4825]: I0310 08:20:38.865342 4825 generic.go:334] "Generic (PLEG): container finished" podID="ee73a326-c85e-4756-b836-3fc323ef11ac" containerID="2f3556bb6af608a5cfdaa3a62737229b2129fac11fe4cdf0a59d6331965d4f05" exitCode=143 Mar 10 08:20:38 crc kubenswrapper[4825]: I0310 08:20:38.865415 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ee73a326-c85e-4756-b836-3fc323ef11ac","Type":"ContainerDied","Data":"2f3556bb6af608a5cfdaa3a62737229b2129fac11fe4cdf0a59d6331965d4f05"} Mar 10 08:20:38 crc kubenswrapper[4825]: I0310 08:20:38.867767 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58dcdb7cc-ccx9h" event={"ID":"a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421","Type":"ContainerStarted","Data":"ba4a2da7d97965aa9be94de1d01b90744244674a63fed17dd94acda6ff94d3af"} Mar 10 08:20:38 crc kubenswrapper[4825]: I0310 08:20:38.879264 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77b77cfcdf-n62md" event={"ID":"385dc976-f03b-4c63-8cdc-73d5e8481597","Type":"ContainerStarted","Data":"7bcbc4fdd5fec9c3ff27451dfa08602e7574850e8a3fbc3bbe148a32a079dcc7"} Mar 10 08:20:38 crc kubenswrapper[4825]: I0310 08:20:38.882292 4825 generic.go:334] "Generic (PLEG): container finished" podID="ec1c3fd1-11dc-49a3-9869-95438b93ab08" containerID="1d3e4e3e439cd0f6b04d665a1435005ce50c78d9a937614f5ef8a5ade7a959a5" exitCode=143 Mar 10 08:20:38 crc kubenswrapper[4825]: I0310 08:20:38.882381 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"ec1c3fd1-11dc-49a3-9869-95438b93ab08","Type":"ContainerDied","Data":"1d3e4e3e439cd0f6b04d665a1435005ce50c78d9a937614f5ef8a5ade7a959a5"} Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.587511 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-58dcdb7cc-ccx9h"] Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.636952 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-8677c986cb-mlrhz"] Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.640250 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8677c986cb-mlrhz" Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.645638 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.651977 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8677c986cb-mlrhz"] Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.687658 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-77b77cfcdf-n62md"] Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.718952 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-67b47658d-rd7tt"] Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.720553 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-67b47658d-rd7tt" Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.750342 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-67b47658d-rd7tt"] Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.779161 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/53591cd3-2c72-4a0d-ac6b-a904476b2dc4-scripts\") pod \"horizon-8677c986cb-mlrhz\" (UID: \"53591cd3-2c72-4a0d-ac6b-a904476b2dc4\") " pod="openstack/horizon-8677c986cb-mlrhz" Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.779253 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/53591cd3-2c72-4a0d-ac6b-a904476b2dc4-horizon-secret-key\") pod \"horizon-8677c986cb-mlrhz\" (UID: \"53591cd3-2c72-4a0d-ac6b-a904476b2dc4\") " pod="openstack/horizon-8677c986cb-mlrhz" Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.779313 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtjvh\" (UniqueName: \"kubernetes.io/projected/53591cd3-2c72-4a0d-ac6b-a904476b2dc4-kube-api-access-mtjvh\") pod \"horizon-8677c986cb-mlrhz\" (UID: \"53591cd3-2c72-4a0d-ac6b-a904476b2dc4\") " pod="openstack/horizon-8677c986cb-mlrhz" Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.779415 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53591cd3-2c72-4a0d-ac6b-a904476b2dc4-combined-ca-bundle\") pod \"horizon-8677c986cb-mlrhz\" (UID: \"53591cd3-2c72-4a0d-ac6b-a904476b2dc4\") " pod="openstack/horizon-8677c986cb-mlrhz" Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.779523 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/53591cd3-2c72-4a0d-ac6b-a904476b2dc4-config-data\") pod \"horizon-8677c986cb-mlrhz\" (UID: \"53591cd3-2c72-4a0d-ac6b-a904476b2dc4\") " pod="openstack/horizon-8677c986cb-mlrhz" Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.779609 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/53591cd3-2c72-4a0d-ac6b-a904476b2dc4-horizon-tls-certs\") pod \"horizon-8677c986cb-mlrhz\" (UID: \"53591cd3-2c72-4a0d-ac6b-a904476b2dc4\") " pod="openstack/horizon-8677c986cb-mlrhz" Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.779653 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53591cd3-2c72-4a0d-ac6b-a904476b2dc4-logs\") pod \"horizon-8677c986cb-mlrhz\" (UID: \"53591cd3-2c72-4a0d-ac6b-a904476b2dc4\") " pod="openstack/horizon-8677c986cb-mlrhz" Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.881651 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/402e8d63-8847-44af-9a62-9536d0f513f8-horizon-secret-key\") pod \"horizon-67b47658d-rd7tt\" (UID: \"402e8d63-8847-44af-9a62-9536d0f513f8\") " pod="openstack/horizon-67b47658d-rd7tt" Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.881712 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53591cd3-2c72-4a0d-ac6b-a904476b2dc4-combined-ca-bundle\") pod \"horizon-8677c986cb-mlrhz\" (UID: \"53591cd3-2c72-4a0d-ac6b-a904476b2dc4\") " pod="openstack/horizon-8677c986cb-mlrhz" Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.881736 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/402e8d63-8847-44af-9a62-9536d0f513f8-horizon-tls-certs\") pod \"horizon-67b47658d-rd7tt\" (UID: \"402e8d63-8847-44af-9a62-9536d0f513f8\") " pod="openstack/horizon-67b47658d-rd7tt" Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.881995 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/402e8d63-8847-44af-9a62-9536d0f513f8-config-data\") pod \"horizon-67b47658d-rd7tt\" (UID: \"402e8d63-8847-44af-9a62-9536d0f513f8\") " pod="openstack/horizon-67b47658d-rd7tt" Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.882100 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/53591cd3-2c72-4a0d-ac6b-a904476b2dc4-config-data\") pod \"horizon-8677c986cb-mlrhz\" (UID: \"53591cd3-2c72-4a0d-ac6b-a904476b2dc4\") " pod="openstack/horizon-8677c986cb-mlrhz" Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.882243 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/53591cd3-2c72-4a0d-ac6b-a904476b2dc4-horizon-tls-certs\") pod \"horizon-8677c986cb-mlrhz\" (UID: \"53591cd3-2c72-4a0d-ac6b-a904476b2dc4\") " pod="openstack/horizon-8677c986cb-mlrhz" Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.882308 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53591cd3-2c72-4a0d-ac6b-a904476b2dc4-logs\") pod \"horizon-8677c986cb-mlrhz\" (UID: \"53591cd3-2c72-4a0d-ac6b-a904476b2dc4\") " pod="openstack/horizon-8677c986cb-mlrhz" Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.882404 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/402e8d63-8847-44af-9a62-9536d0f513f8-combined-ca-bundle\") pod \"horizon-67b47658d-rd7tt\" (UID: \"402e8d63-8847-44af-9a62-9536d0f513f8\") " pod="openstack/horizon-67b47658d-rd7tt" Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.882457 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/402e8d63-8847-44af-9a62-9536d0f513f8-logs\") pod \"horizon-67b47658d-rd7tt\" (UID: \"402e8d63-8847-44af-9a62-9536d0f513f8\") " pod="openstack/horizon-67b47658d-rd7tt" Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.882515 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/402e8d63-8847-44af-9a62-9536d0f513f8-scripts\") pod \"horizon-67b47658d-rd7tt\" (UID: \"402e8d63-8847-44af-9a62-9536d0f513f8\") " pod="openstack/horizon-67b47658d-rd7tt" Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.882559 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txh6n\" (UniqueName: \"kubernetes.io/projected/402e8d63-8847-44af-9a62-9536d0f513f8-kube-api-access-txh6n\") pod \"horizon-67b47658d-rd7tt\" (UID: \"402e8d63-8847-44af-9a62-9536d0f513f8\") " pod="openstack/horizon-67b47658d-rd7tt" Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.882659 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/53591cd3-2c72-4a0d-ac6b-a904476b2dc4-scripts\") pod \"horizon-8677c986cb-mlrhz\" (UID: \"53591cd3-2c72-4a0d-ac6b-a904476b2dc4\") " pod="openstack/horizon-8677c986cb-mlrhz" Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.882721 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/53591cd3-2c72-4a0d-ac6b-a904476b2dc4-horizon-secret-key\") pod \"horizon-8677c986cb-mlrhz\" (UID: \"53591cd3-2c72-4a0d-ac6b-a904476b2dc4\") " pod="openstack/horizon-8677c986cb-mlrhz" Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.882784 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtjvh\" (UniqueName: \"kubernetes.io/projected/53591cd3-2c72-4a0d-ac6b-a904476b2dc4-kube-api-access-mtjvh\") pod \"horizon-8677c986cb-mlrhz\" (UID: \"53591cd3-2c72-4a0d-ac6b-a904476b2dc4\") " pod="openstack/horizon-8677c986cb-mlrhz" Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.882806 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53591cd3-2c72-4a0d-ac6b-a904476b2dc4-logs\") pod \"horizon-8677c986cb-mlrhz\" (UID: \"53591cd3-2c72-4a0d-ac6b-a904476b2dc4\") " pod="openstack/horizon-8677c986cb-mlrhz" Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.883424 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/53591cd3-2c72-4a0d-ac6b-a904476b2dc4-scripts\") pod \"horizon-8677c986cb-mlrhz\" (UID: \"53591cd3-2c72-4a0d-ac6b-a904476b2dc4\") " pod="openstack/horizon-8677c986cb-mlrhz" Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.883618 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/53591cd3-2c72-4a0d-ac6b-a904476b2dc4-config-data\") pod \"horizon-8677c986cb-mlrhz\" (UID: \"53591cd3-2c72-4a0d-ac6b-a904476b2dc4\") " pod="openstack/horizon-8677c986cb-mlrhz" Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.889916 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/53591cd3-2c72-4a0d-ac6b-a904476b2dc4-horizon-secret-key\") pod \"horizon-8677c986cb-mlrhz\" (UID: 
\"53591cd3-2c72-4a0d-ac6b-a904476b2dc4\") " pod="openstack/horizon-8677c986cb-mlrhz" Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.889950 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/53591cd3-2c72-4a0d-ac6b-a904476b2dc4-horizon-tls-certs\") pod \"horizon-8677c986cb-mlrhz\" (UID: \"53591cd3-2c72-4a0d-ac6b-a904476b2dc4\") " pod="openstack/horizon-8677c986cb-mlrhz" Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.890106 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53591cd3-2c72-4a0d-ac6b-a904476b2dc4-combined-ca-bundle\") pod \"horizon-8677c986cb-mlrhz\" (UID: \"53591cd3-2c72-4a0d-ac6b-a904476b2dc4\") " pod="openstack/horizon-8677c986cb-mlrhz" Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.897721 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtjvh\" (UniqueName: \"kubernetes.io/projected/53591cd3-2c72-4a0d-ac6b-a904476b2dc4-kube-api-access-mtjvh\") pod \"horizon-8677c986cb-mlrhz\" (UID: \"53591cd3-2c72-4a0d-ac6b-a904476b2dc4\") " pod="openstack/horizon-8677c986cb-mlrhz" Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.985016 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/402e8d63-8847-44af-9a62-9536d0f513f8-combined-ca-bundle\") pod \"horizon-67b47658d-rd7tt\" (UID: \"402e8d63-8847-44af-9a62-9536d0f513f8\") " pod="openstack/horizon-67b47658d-rd7tt" Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.985064 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/402e8d63-8847-44af-9a62-9536d0f513f8-logs\") pod \"horizon-67b47658d-rd7tt\" (UID: \"402e8d63-8847-44af-9a62-9536d0f513f8\") " pod="openstack/horizon-67b47658d-rd7tt" Mar 10 08:20:39 crc 
kubenswrapper[4825]: I0310 08:20:39.985088 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/402e8d63-8847-44af-9a62-9536d0f513f8-scripts\") pod \"horizon-67b47658d-rd7tt\" (UID: \"402e8d63-8847-44af-9a62-9536d0f513f8\") " pod="openstack/horizon-67b47658d-rd7tt" Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.985119 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txh6n\" (UniqueName: \"kubernetes.io/projected/402e8d63-8847-44af-9a62-9536d0f513f8-kube-api-access-txh6n\") pod \"horizon-67b47658d-rd7tt\" (UID: \"402e8d63-8847-44af-9a62-9536d0f513f8\") " pod="openstack/horizon-67b47658d-rd7tt" Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.985234 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/402e8d63-8847-44af-9a62-9536d0f513f8-horizon-secret-key\") pod \"horizon-67b47658d-rd7tt\" (UID: \"402e8d63-8847-44af-9a62-9536d0f513f8\") " pod="openstack/horizon-67b47658d-rd7tt" Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.985272 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/402e8d63-8847-44af-9a62-9536d0f513f8-horizon-tls-certs\") pod \"horizon-67b47658d-rd7tt\" (UID: \"402e8d63-8847-44af-9a62-9536d0f513f8\") " pod="openstack/horizon-67b47658d-rd7tt" Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.985310 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/402e8d63-8847-44af-9a62-9536d0f513f8-config-data\") pod \"horizon-67b47658d-rd7tt\" (UID: \"402e8d63-8847-44af-9a62-9536d0f513f8\") " pod="openstack/horizon-67b47658d-rd7tt" Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.986668 4825 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/402e8d63-8847-44af-9a62-9536d0f513f8-config-data\") pod \"horizon-67b47658d-rd7tt\" (UID: \"402e8d63-8847-44af-9a62-9536d0f513f8\") " pod="openstack/horizon-67b47658d-rd7tt" Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.987596 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8677c986cb-mlrhz" Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.987605 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/402e8d63-8847-44af-9a62-9536d0f513f8-logs\") pod \"horizon-67b47658d-rd7tt\" (UID: \"402e8d63-8847-44af-9a62-9536d0f513f8\") " pod="openstack/horizon-67b47658d-rd7tt" Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.987951 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/402e8d63-8847-44af-9a62-9536d0f513f8-scripts\") pod \"horizon-67b47658d-rd7tt\" (UID: \"402e8d63-8847-44af-9a62-9536d0f513f8\") " pod="openstack/horizon-67b47658d-rd7tt" Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.991161 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/402e8d63-8847-44af-9a62-9536d0f513f8-horizon-tls-certs\") pod \"horizon-67b47658d-rd7tt\" (UID: \"402e8d63-8847-44af-9a62-9536d0f513f8\") " pod="openstack/horizon-67b47658d-rd7tt" Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.993761 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/402e8d63-8847-44af-9a62-9536d0f513f8-combined-ca-bundle\") pod \"horizon-67b47658d-rd7tt\" (UID: \"402e8d63-8847-44af-9a62-9536d0f513f8\") " pod="openstack/horizon-67b47658d-rd7tt" Mar 10 08:20:39 crc kubenswrapper[4825]: I0310 08:20:39.994051 4825 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/402e8d63-8847-44af-9a62-9536d0f513f8-horizon-secret-key\") pod \"horizon-67b47658d-rd7tt\" (UID: \"402e8d63-8847-44af-9a62-9536d0f513f8\") " pod="openstack/horizon-67b47658d-rd7tt" Mar 10 08:20:40 crc kubenswrapper[4825]: I0310 08:20:40.001690 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txh6n\" (UniqueName: \"kubernetes.io/projected/402e8d63-8847-44af-9a62-9536d0f513f8-kube-api-access-txh6n\") pod \"horizon-67b47658d-rd7tt\" (UID: \"402e8d63-8847-44af-9a62-9536d0f513f8\") " pod="openstack/horizon-67b47658d-rd7tt" Mar 10 08:20:40 crc kubenswrapper[4825]: I0310 08:20:40.039797 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67b47658d-rd7tt" Mar 10 08:20:40 crc kubenswrapper[4825]: I0310 08:20:40.470932 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8677c986cb-mlrhz"] Mar 10 08:20:40 crc kubenswrapper[4825]: W0310 08:20:40.470972 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53591cd3_2c72_4a0d_ac6b_a904476b2dc4.slice/crio-e93fe6bafe88a6a808073f78a47244777e53ccd22807bdbdc771c17a4d8be925 WatchSource:0}: Error finding container e93fe6bafe88a6a808073f78a47244777e53ccd22807bdbdc771c17a4d8be925: Status 404 returned error can't find the container with id e93fe6bafe88a6a808073f78a47244777e53ccd22807bdbdc771c17a4d8be925 Mar 10 08:20:40 crc kubenswrapper[4825]: W0310 08:20:40.587447 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod402e8d63_8847_44af_9a62_9536d0f513f8.slice/crio-7e36d87b4e68d8d7adc8fb6d4ef04caffc17639225cb0019e06415c10c2ecd8b WatchSource:0}: Error finding container 7e36d87b4e68d8d7adc8fb6d4ef04caffc17639225cb0019e06415c10c2ecd8b: Status 404 returned error can't find 
the container with id 7e36d87b4e68d8d7adc8fb6d4ef04caffc17639225cb0019e06415c10c2ecd8b Mar 10 08:20:40 crc kubenswrapper[4825]: I0310 08:20:40.588668 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-67b47658d-rd7tt"] Mar 10 08:20:40 crc kubenswrapper[4825]: I0310 08:20:40.916609 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67b47658d-rd7tt" event={"ID":"402e8d63-8847-44af-9a62-9536d0f513f8","Type":"ContainerStarted","Data":"7e36d87b4e68d8d7adc8fb6d4ef04caffc17639225cb0019e06415c10c2ecd8b"} Mar 10 08:20:40 crc kubenswrapper[4825]: I0310 08:20:40.919378 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8677c986cb-mlrhz" event={"ID":"53591cd3-2c72-4a0d-ac6b-a904476b2dc4","Type":"ContainerStarted","Data":"e93fe6bafe88a6a808073f78a47244777e53ccd22807bdbdc771c17a4d8be925"} Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.652371 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.721500 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.729308 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec1c3fd1-11dc-49a3-9869-95438b93ab08-config-data\") pod \"ec1c3fd1-11dc-49a3-9869-95438b93ab08\" (UID: \"ec1c3fd1-11dc-49a3-9869-95438b93ab08\") " Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.729372 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ec1c3fd1-11dc-49a3-9869-95438b93ab08-httpd-run\") pod \"ec1c3fd1-11dc-49a3-9869-95438b93ab08\" (UID: \"ec1c3fd1-11dc-49a3-9869-95438b93ab08\") " Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.729400 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kqgb\" (UniqueName: \"kubernetes.io/projected/ec1c3fd1-11dc-49a3-9869-95438b93ab08-kube-api-access-8kqgb\") pod \"ec1c3fd1-11dc-49a3-9869-95438b93ab08\" (UID: \"ec1c3fd1-11dc-49a3-9869-95438b93ab08\") " Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.729425 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec1c3fd1-11dc-49a3-9869-95438b93ab08-logs\") pod \"ec1c3fd1-11dc-49a3-9869-95438b93ab08\" (UID: \"ec1c3fd1-11dc-49a3-9869-95438b93ab08\") " Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.729466 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec1c3fd1-11dc-49a3-9869-95438b93ab08-scripts\") pod \"ec1c3fd1-11dc-49a3-9869-95438b93ab08\" (UID: \"ec1c3fd1-11dc-49a3-9869-95438b93ab08\") " Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.729591 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ec1c3fd1-11dc-49a3-9869-95438b93ab08-public-tls-certs\") pod \"ec1c3fd1-11dc-49a3-9869-95438b93ab08\" (UID: \"ec1c3fd1-11dc-49a3-9869-95438b93ab08\") " Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.729622 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec1c3fd1-11dc-49a3-9869-95438b93ab08-combined-ca-bundle\") pod \"ec1c3fd1-11dc-49a3-9869-95438b93ab08\" (UID: \"ec1c3fd1-11dc-49a3-9869-95438b93ab08\") " Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.732319 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec1c3fd1-11dc-49a3-9869-95438b93ab08-logs" (OuterVolumeSpecName: "logs") pod "ec1c3fd1-11dc-49a3-9869-95438b93ab08" (UID: "ec1c3fd1-11dc-49a3-9869-95438b93ab08"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.733529 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec1c3fd1-11dc-49a3-9869-95438b93ab08-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ec1c3fd1-11dc-49a3-9869-95438b93ab08" (UID: "ec1c3fd1-11dc-49a3-9869-95438b93ab08"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.736824 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec1c3fd1-11dc-49a3-9869-95438b93ab08-kube-api-access-8kqgb" (OuterVolumeSpecName: "kube-api-access-8kqgb") pod "ec1c3fd1-11dc-49a3-9869-95438b93ab08" (UID: "ec1c3fd1-11dc-49a3-9869-95438b93ab08"). InnerVolumeSpecName "kube-api-access-8kqgb". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.740215 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec1c3fd1-11dc-49a3-9869-95438b93ab08-scripts" (OuterVolumeSpecName: "scripts") pod "ec1c3fd1-11dc-49a3-9869-95438b93ab08" (UID: "ec1c3fd1-11dc-49a3-9869-95438b93ab08"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.795154 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec1c3fd1-11dc-49a3-9869-95438b93ab08-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec1c3fd1-11dc-49a3-9869-95438b93ab08" (UID: "ec1c3fd1-11dc-49a3-9869-95438b93ab08"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.812684 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec1c3fd1-11dc-49a3-9869-95438b93ab08-config-data" (OuterVolumeSpecName: "config-data") pod "ec1c3fd1-11dc-49a3-9869-95438b93ab08" (UID: "ec1c3fd1-11dc-49a3-9869-95438b93ab08"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.843820 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee73a326-c85e-4756-b836-3fc323ef11ac-logs\") pod \"ee73a326-c85e-4756-b836-3fc323ef11ac\" (UID: \"ee73a326-c85e-4756-b836-3fc323ef11ac\") "
Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.843879 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7qb5\" (UniqueName: \"kubernetes.io/projected/ee73a326-c85e-4756-b836-3fc323ef11ac-kube-api-access-t7qb5\") pod \"ee73a326-c85e-4756-b836-3fc323ef11ac\" (UID: \"ee73a326-c85e-4756-b836-3fc323ef11ac\") "
Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.844037 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee73a326-c85e-4756-b836-3fc323ef11ac-combined-ca-bundle\") pod \"ee73a326-c85e-4756-b836-3fc323ef11ac\" (UID: \"ee73a326-c85e-4756-b836-3fc323ef11ac\") "
Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.844110 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee73a326-c85e-4756-b836-3fc323ef11ac-internal-tls-certs\") pod \"ee73a326-c85e-4756-b836-3fc323ef11ac\" (UID: \"ee73a326-c85e-4756-b836-3fc323ef11ac\") "
Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.844168 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ee73a326-c85e-4756-b836-3fc323ef11ac-httpd-run\") pod \"ee73a326-c85e-4756-b836-3fc323ef11ac\" (UID: \"ee73a326-c85e-4756-b836-3fc323ef11ac\") "
Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.844207 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee73a326-c85e-4756-b836-3fc323ef11ac-scripts\") pod \"ee73a326-c85e-4756-b836-3fc323ef11ac\" (UID: \"ee73a326-c85e-4756-b836-3fc323ef11ac\") "
Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.844244 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee73a326-c85e-4756-b836-3fc323ef11ac-config-data\") pod \"ee73a326-c85e-4756-b836-3fc323ef11ac\" (UID: \"ee73a326-c85e-4756-b836-3fc323ef11ac\") "
Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.844645 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec1c3fd1-11dc-49a3-9869-95438b93ab08-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.844662 4825 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ec1c3fd1-11dc-49a3-9869-95438b93ab08-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.844671 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kqgb\" (UniqueName: \"kubernetes.io/projected/ec1c3fd1-11dc-49a3-9869-95438b93ab08-kube-api-access-8kqgb\") on node \"crc\" DevicePath \"\""
Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.844681 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec1c3fd1-11dc-49a3-9869-95438b93ab08-logs\") on node \"crc\" DevicePath \"\""
Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.844690 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec1c3fd1-11dc-49a3-9869-95438b93ab08-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.844677 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee73a326-c85e-4756-b836-3fc323ef11ac-logs" (OuterVolumeSpecName: "logs") pod "ee73a326-c85e-4756-b836-3fc323ef11ac" (UID: "ee73a326-c85e-4756-b836-3fc323ef11ac"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.844698 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec1c3fd1-11dc-49a3-9869-95438b93ab08-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.845052 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee73a326-c85e-4756-b836-3fc323ef11ac-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ee73a326-c85e-4756-b836-3fc323ef11ac" (UID: "ee73a326-c85e-4756-b836-3fc323ef11ac"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.856110 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee73a326-c85e-4756-b836-3fc323ef11ac-scripts" (OuterVolumeSpecName: "scripts") pod "ee73a326-c85e-4756-b836-3fc323ef11ac" (UID: "ee73a326-c85e-4756-b836-3fc323ef11ac"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.857010 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee73a326-c85e-4756-b836-3fc323ef11ac-kube-api-access-t7qb5" (OuterVolumeSpecName: "kube-api-access-t7qb5") pod "ee73a326-c85e-4756-b836-3fc323ef11ac" (UID: "ee73a326-c85e-4756-b836-3fc323ef11ac"). InnerVolumeSpecName "kube-api-access-t7qb5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.865853 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec1c3fd1-11dc-49a3-9869-95438b93ab08-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ec1c3fd1-11dc-49a3-9869-95438b93ab08" (UID: "ec1c3fd1-11dc-49a3-9869-95438b93ab08"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.889534 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee73a326-c85e-4756-b836-3fc323ef11ac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee73a326-c85e-4756-b836-3fc323ef11ac" (UID: "ee73a326-c85e-4756-b836-3fc323ef11ac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.913530 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee73a326-c85e-4756-b836-3fc323ef11ac-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ee73a326-c85e-4756-b836-3fc323ef11ac" (UID: "ee73a326-c85e-4756-b836-3fc323ef11ac"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.919962 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee73a326-c85e-4756-b836-3fc323ef11ac-config-data" (OuterVolumeSpecName: "config-data") pod "ee73a326-c85e-4756-b836-3fc323ef11ac" (UID: "ee73a326-c85e-4756-b836-3fc323ef11ac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.936431 4825 generic.go:334] "Generic (PLEG): container finished" podID="ee73a326-c85e-4756-b836-3fc323ef11ac" containerID="6cf54bd96acb81570c21132944a0bdf95aaa4372936cf091dfc290a59d872175" exitCode=0
Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.936511 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ee73a326-c85e-4756-b836-3fc323ef11ac","Type":"ContainerDied","Data":"6cf54bd96acb81570c21132944a0bdf95aaa4372936cf091dfc290a59d872175"}
Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.936559 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ee73a326-c85e-4756-b836-3fc323ef11ac","Type":"ContainerDied","Data":"960d8532115364cc8bf7f43ff151e2645c7b102a81e55beb189bcac54787383d"}
Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.936580 4825 scope.go:117] "RemoveContainer" containerID="6cf54bd96acb81570c21132944a0bdf95aaa4372936cf091dfc290a59d872175"
Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.936752 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.941290 4825 generic.go:334] "Generic (PLEG): container finished" podID="ec1c3fd1-11dc-49a3-9869-95438b93ab08" containerID="e8b549f3aca00d92c110d0fabf012350fc907bc04b117fdc2dc18e6faf605abe" exitCode=0
Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.941324 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ec1c3fd1-11dc-49a3-9869-95438b93ab08","Type":"ContainerDied","Data":"e8b549f3aca00d92c110d0fabf012350fc907bc04b117fdc2dc18e6faf605abe"}
Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.941345 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ec1c3fd1-11dc-49a3-9869-95438b93ab08","Type":"ContainerDied","Data":"24de60a2db0176c4af9a50dfd92a80a4cd3dca3a99c757d808115cef634ca6e9"}
Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.941377 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.946016 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee73a326-c85e-4756-b836-3fc323ef11ac-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.946043 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee73a326-c85e-4756-b836-3fc323ef11ac-logs\") on node \"crc\" DevicePath \"\""
Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.946052 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7qb5\" (UniqueName: \"kubernetes.io/projected/ee73a326-c85e-4756-b836-3fc323ef11ac-kube-api-access-t7qb5\") on node \"crc\" DevicePath \"\""
Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.946067 4825 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec1c3fd1-11dc-49a3-9869-95438b93ab08-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.946079 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee73a326-c85e-4756-b836-3fc323ef11ac-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.946090 4825 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee73a326-c85e-4756-b836-3fc323ef11ac-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.946114 4825 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ee73a326-c85e-4756-b836-3fc323ef11ac-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.946121 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee73a326-c85e-4756-b836-3fc323ef11ac-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.973059 4825 scope.go:117] "RemoveContainer" containerID="2f3556bb6af608a5cfdaa3a62737229b2129fac11fe4cdf0a59d6331965d4f05"
Mar 10 08:20:41 crc kubenswrapper[4825]: I0310 08:20:41.996532 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.012798 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.038400 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.065835 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.073338 4825 scope.go:117] "RemoveContainer" containerID="6cf54bd96acb81570c21132944a0bdf95aaa4372936cf091dfc290a59d872175"
Mar 10 08:20:42 crc kubenswrapper[4825]: E0310 08:20:42.073688 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cf54bd96acb81570c21132944a0bdf95aaa4372936cf091dfc290a59d872175\": container with ID starting with 6cf54bd96acb81570c21132944a0bdf95aaa4372936cf091dfc290a59d872175 not found: ID does not exist" containerID="6cf54bd96acb81570c21132944a0bdf95aaa4372936cf091dfc290a59d872175"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.073731 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cf54bd96acb81570c21132944a0bdf95aaa4372936cf091dfc290a59d872175"} err="failed to get container status \"6cf54bd96acb81570c21132944a0bdf95aaa4372936cf091dfc290a59d872175\": rpc error: code = NotFound desc = could not find container \"6cf54bd96acb81570c21132944a0bdf95aaa4372936cf091dfc290a59d872175\": container with ID starting with 6cf54bd96acb81570c21132944a0bdf95aaa4372936cf091dfc290a59d872175 not found: ID does not exist"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.073765 4825 scope.go:117] "RemoveContainer" containerID="2f3556bb6af608a5cfdaa3a62737229b2129fac11fe4cdf0a59d6331965d4f05"
Mar 10 08:20:42 crc kubenswrapper[4825]: E0310 08:20:42.074106 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f3556bb6af608a5cfdaa3a62737229b2129fac11fe4cdf0a59d6331965d4f05\": container with ID starting with 2f3556bb6af608a5cfdaa3a62737229b2129fac11fe4cdf0a59d6331965d4f05 not found: ID does not exist" containerID="2f3556bb6af608a5cfdaa3a62737229b2129fac11fe4cdf0a59d6331965d4f05"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.074195 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f3556bb6af608a5cfdaa3a62737229b2129fac11fe4cdf0a59d6331965d4f05"} err="failed to get container status \"2f3556bb6af608a5cfdaa3a62737229b2129fac11fe4cdf0a59d6331965d4f05\": rpc error: code = NotFound desc = could not find container \"2f3556bb6af608a5cfdaa3a62737229b2129fac11fe4cdf0a59d6331965d4f05\": container with ID starting with 2f3556bb6af608a5cfdaa3a62737229b2129fac11fe4cdf0a59d6331965d4f05 not found: ID does not exist"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.074211 4825 scope.go:117] "RemoveContainer" containerID="e8b549f3aca00d92c110d0fabf012350fc907bc04b117fdc2dc18e6faf605abe"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.077220 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 10 08:20:42 crc kubenswrapper[4825]: E0310 08:20:42.077665 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec1c3fd1-11dc-49a3-9869-95438b93ab08" containerName="glance-httpd"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.077682 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec1c3fd1-11dc-49a3-9869-95438b93ab08" containerName="glance-httpd"
Mar 10 08:20:42 crc kubenswrapper[4825]: E0310 08:20:42.077700 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec1c3fd1-11dc-49a3-9869-95438b93ab08" containerName="glance-log"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.077708 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec1c3fd1-11dc-49a3-9869-95438b93ab08" containerName="glance-log"
Mar 10 08:20:42 crc kubenswrapper[4825]: E0310 08:20:42.077730 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee73a326-c85e-4756-b836-3fc323ef11ac" containerName="glance-httpd"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.077735 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee73a326-c85e-4756-b836-3fc323ef11ac" containerName="glance-httpd"
Mar 10 08:20:42 crc kubenswrapper[4825]: E0310 08:20:42.077751 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee73a326-c85e-4756-b836-3fc323ef11ac" containerName="glance-log"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.077758 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee73a326-c85e-4756-b836-3fc323ef11ac" containerName="glance-log"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.077920 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec1c3fd1-11dc-49a3-9869-95438b93ab08" containerName="glance-httpd"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.077935 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee73a326-c85e-4756-b836-3fc323ef11ac" containerName="glance-log"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.077958 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee73a326-c85e-4756-b836-3fc323ef11ac" containerName="glance-httpd"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.077973 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec1c3fd1-11dc-49a3-9869-95438b93ab08" containerName="glance-log"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.079789 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.081739 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.082096 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.091765 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-vs878"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.091959 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.101260 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.106578 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.108702 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.109426 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.110433 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.124198 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.131273 4825 scope.go:117] "RemoveContainer" containerID="1d3e4e3e439cd0f6b04d665a1435005ce50c78d9a937614f5ef8a5ade7a959a5"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.171832 4825 scope.go:117] "RemoveContainer" containerID="e8b549f3aca00d92c110d0fabf012350fc907bc04b117fdc2dc18e6faf605abe"
Mar 10 08:20:42 crc kubenswrapper[4825]: E0310 08:20:42.172210 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8b549f3aca00d92c110d0fabf012350fc907bc04b117fdc2dc18e6faf605abe\": container with ID starting with e8b549f3aca00d92c110d0fabf012350fc907bc04b117fdc2dc18e6faf605abe not found: ID does not exist" containerID="e8b549f3aca00d92c110d0fabf012350fc907bc04b117fdc2dc18e6faf605abe"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.172244 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8b549f3aca00d92c110d0fabf012350fc907bc04b117fdc2dc18e6faf605abe"} err="failed to get container status \"e8b549f3aca00d92c110d0fabf012350fc907bc04b117fdc2dc18e6faf605abe\": rpc error: code = NotFound desc = could not find container \"e8b549f3aca00d92c110d0fabf012350fc907bc04b117fdc2dc18e6faf605abe\": container with ID starting with e8b549f3aca00d92c110d0fabf012350fc907bc04b117fdc2dc18e6faf605abe not found: ID does not exist"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.172269 4825 scope.go:117] "RemoveContainer" containerID="1d3e4e3e439cd0f6b04d665a1435005ce50c78d9a937614f5ef8a5ade7a959a5"
Mar 10 08:20:42 crc kubenswrapper[4825]: E0310 08:20:42.172531 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d3e4e3e439cd0f6b04d665a1435005ce50c78d9a937614f5ef8a5ade7a959a5\": container with ID starting with 1d3e4e3e439cd0f6b04d665a1435005ce50c78d9a937614f5ef8a5ade7a959a5 not found: ID does not exist" containerID="1d3e4e3e439cd0f6b04d665a1435005ce50c78d9a937614f5ef8a5ade7a959a5"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.172550 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d3e4e3e439cd0f6b04d665a1435005ce50c78d9a937614f5ef8a5ade7a959a5"} err="failed to get container status \"1d3e4e3e439cd0f6b04d665a1435005ce50c78d9a937614f5ef8a5ade7a959a5\": rpc error: code = NotFound desc = could not find container \"1d3e4e3e439cd0f6b04d665a1435005ce50c78d9a937614f5ef8a5ade7a959a5\": container with ID starting with 1d3e4e3e439cd0f6b04d665a1435005ce50c78d9a937614f5ef8a5ade7a959a5 not found: ID does not exist"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.263238 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/961053b2-4438-497f-8056-0f9d1b6f058b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"961053b2-4438-497f-8056-0f9d1b6f058b\") " pod="openstack/glance-default-external-api-0"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.263291 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/160cbdfa-ab8d-448a-85ae-38fcf6315509-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"160cbdfa-ab8d-448a-85ae-38fcf6315509\") " pod="openstack/glance-default-internal-api-0"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.263379 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/961053b2-4438-497f-8056-0f9d1b6f058b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"961053b2-4438-497f-8056-0f9d1b6f058b\") " pod="openstack/glance-default-external-api-0"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.263433 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/160cbdfa-ab8d-448a-85ae-38fcf6315509-config-data\") pod \"glance-default-internal-api-0\" (UID: \"160cbdfa-ab8d-448a-85ae-38fcf6315509\") " pod="openstack/glance-default-internal-api-0"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.263458 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/961053b2-4438-497f-8056-0f9d1b6f058b-scripts\") pod \"glance-default-external-api-0\" (UID: \"961053b2-4438-497f-8056-0f9d1b6f058b\") " pod="openstack/glance-default-external-api-0"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.263522 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/961053b2-4438-497f-8056-0f9d1b6f058b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"961053b2-4438-497f-8056-0f9d1b6f058b\") " pod="openstack/glance-default-external-api-0"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.263561 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/961053b2-4438-497f-8056-0f9d1b6f058b-config-data\") pod \"glance-default-external-api-0\" (UID: \"961053b2-4438-497f-8056-0f9d1b6f058b\") " pod="openstack/glance-default-external-api-0"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.263588 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/160cbdfa-ab8d-448a-85ae-38fcf6315509-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"160cbdfa-ab8d-448a-85ae-38fcf6315509\") " pod="openstack/glance-default-internal-api-0"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.263648 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/160cbdfa-ab8d-448a-85ae-38fcf6315509-scripts\") pod \"glance-default-internal-api-0\" (UID: \"160cbdfa-ab8d-448a-85ae-38fcf6315509\") " pod="openstack/glance-default-internal-api-0"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.263672 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shtk8\" (UniqueName: \"kubernetes.io/projected/961053b2-4438-497f-8056-0f9d1b6f058b-kube-api-access-shtk8\") pod \"glance-default-external-api-0\" (UID: \"961053b2-4438-497f-8056-0f9d1b6f058b\") " pod="openstack/glance-default-external-api-0"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.263692 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdmjw\" (UniqueName: \"kubernetes.io/projected/160cbdfa-ab8d-448a-85ae-38fcf6315509-kube-api-access-sdmjw\") pod \"glance-default-internal-api-0\" (UID: \"160cbdfa-ab8d-448a-85ae-38fcf6315509\") " pod="openstack/glance-default-internal-api-0"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.263762 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/160cbdfa-ab8d-448a-85ae-38fcf6315509-logs\") pod \"glance-default-internal-api-0\" (UID: \"160cbdfa-ab8d-448a-85ae-38fcf6315509\") " pod="openstack/glance-default-internal-api-0"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.264237 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/160cbdfa-ab8d-448a-85ae-38fcf6315509-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"160cbdfa-ab8d-448a-85ae-38fcf6315509\") " pod="openstack/glance-default-internal-api-0"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.264330 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/961053b2-4438-497f-8056-0f9d1b6f058b-logs\") pod \"glance-default-external-api-0\" (UID: \"961053b2-4438-497f-8056-0f9d1b6f058b\") " pod="openstack/glance-default-external-api-0"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.372148 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/160cbdfa-ab8d-448a-85ae-38fcf6315509-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"160cbdfa-ab8d-448a-85ae-38fcf6315509\") " pod="openstack/glance-default-internal-api-0"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.372241 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/961053b2-4438-497f-8056-0f9d1b6f058b-logs\") pod \"glance-default-external-api-0\" (UID: \"961053b2-4438-497f-8056-0f9d1b6f058b\") " pod="openstack/glance-default-external-api-0"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.372386 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/961053b2-4438-497f-8056-0f9d1b6f058b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"961053b2-4438-497f-8056-0f9d1b6f058b\") " pod="openstack/glance-default-external-api-0"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.372417 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/160cbdfa-ab8d-448a-85ae-38fcf6315509-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"160cbdfa-ab8d-448a-85ae-38fcf6315509\") " pod="openstack/glance-default-internal-api-0"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.372455 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/961053b2-4438-497f-8056-0f9d1b6f058b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"961053b2-4438-497f-8056-0f9d1b6f058b\") " pod="openstack/glance-default-external-api-0"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.372483 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/160cbdfa-ab8d-448a-85ae-38fcf6315509-config-data\") pod \"glance-default-internal-api-0\" (UID: \"160cbdfa-ab8d-448a-85ae-38fcf6315509\") " pod="openstack/glance-default-internal-api-0"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.372511 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/961053b2-4438-497f-8056-0f9d1b6f058b-scripts\") pod \"glance-default-external-api-0\" (UID: \"961053b2-4438-497f-8056-0f9d1b6f058b\") " pod="openstack/glance-default-external-api-0"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.372575 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/961053b2-4438-497f-8056-0f9d1b6f058b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"961053b2-4438-497f-8056-0f9d1b6f058b\") " pod="openstack/glance-default-external-api-0"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.372602 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/961053b2-4438-497f-8056-0f9d1b6f058b-config-data\") pod \"glance-default-external-api-0\" (UID: \"961053b2-4438-497f-8056-0f9d1b6f058b\") " pod="openstack/glance-default-external-api-0"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.372634 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/160cbdfa-ab8d-448a-85ae-38fcf6315509-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"160cbdfa-ab8d-448a-85ae-38fcf6315509\") " pod="openstack/glance-default-internal-api-0"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.372709 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/160cbdfa-ab8d-448a-85ae-38fcf6315509-scripts\") pod \"glance-default-internal-api-0\" (UID: \"160cbdfa-ab8d-448a-85ae-38fcf6315509\") " pod="openstack/glance-default-internal-api-0"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.372740 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shtk8\" (UniqueName: \"kubernetes.io/projected/961053b2-4438-497f-8056-0f9d1b6f058b-kube-api-access-shtk8\") pod \"glance-default-external-api-0\" (UID: \"961053b2-4438-497f-8056-0f9d1b6f058b\") " pod="openstack/glance-default-external-api-0"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.372743 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/961053b2-4438-497f-8056-0f9d1b6f058b-logs\") pod \"glance-default-external-api-0\" (UID: \"961053b2-4438-497f-8056-0f9d1b6f058b\") " pod="openstack/glance-default-external-api-0"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.372765 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdmjw\" (UniqueName: \"kubernetes.io/projected/160cbdfa-ab8d-448a-85ae-38fcf6315509-kube-api-access-sdmjw\") pod \"glance-default-internal-api-0\" (UID: \"160cbdfa-ab8d-448a-85ae-38fcf6315509\") " pod="openstack/glance-default-internal-api-0"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.372809 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/160cbdfa-ab8d-448a-85ae-38fcf6315509-logs\") pod \"glance-default-internal-api-0\" (UID: \"160cbdfa-ab8d-448a-85ae-38fcf6315509\") " pod="openstack/glance-default-internal-api-0"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.373261 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/160cbdfa-ab8d-448a-85ae-38fcf6315509-logs\") pod \"glance-default-internal-api-0\" (UID: \"160cbdfa-ab8d-448a-85ae-38fcf6315509\") " pod="openstack/glance-default-internal-api-0"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.373586 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/160cbdfa-ab8d-448a-85ae-38fcf6315509-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"160cbdfa-ab8d-448a-85ae-38fcf6315509\") " pod="openstack/glance-default-internal-api-0"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.375345 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/961053b2-4438-497f-8056-0f9d1b6f058b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"961053b2-4438-497f-8056-0f9d1b6f058b\") " pod="openstack/glance-default-external-api-0"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.379585 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/160cbdfa-ab8d-448a-85ae-38fcf6315509-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"160cbdfa-ab8d-448a-85ae-38fcf6315509\") " pod="openstack/glance-default-internal-api-0"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.380649 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/160cbdfa-ab8d-448a-85ae-38fcf6315509-scripts\") pod \"glance-default-internal-api-0\" (UID: \"160cbdfa-ab8d-448a-85ae-38fcf6315509\") " pod="openstack/glance-default-internal-api-0"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.382518 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/961053b2-4438-497f-8056-0f9d1b6f058b-scripts\") pod \"glance-default-external-api-0\" (UID: \"961053b2-4438-497f-8056-0f9d1b6f058b\") " pod="openstack/glance-default-external-api-0"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.383491 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/961053b2-4438-497f-8056-0f9d1b6f058b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"961053b2-4438-497f-8056-0f9d1b6f058b\") " pod="openstack/glance-default-external-api-0"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.384226 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/160cbdfa-ab8d-448a-85ae-38fcf6315509-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"160cbdfa-ab8d-448a-85ae-38fcf6315509\") " pod="openstack/glance-default-internal-api-0"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.385006 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/961053b2-4438-497f-8056-0f9d1b6f058b-config-data\") pod \"glance-default-external-api-0\" (UID: \"961053b2-4438-497f-8056-0f9d1b6f058b\") " pod="openstack/glance-default-external-api-0"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.389058 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/961053b2-4438-497f-8056-0f9d1b6f058b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"961053b2-4438-497f-8056-0f9d1b6f058b\") " pod="openstack/glance-default-external-api-0"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.392470 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/160cbdfa-ab8d-448a-85ae-38fcf6315509-config-data\") pod \"glance-default-internal-api-0\" (UID: \"160cbdfa-ab8d-448a-85ae-38fcf6315509\") " pod="openstack/glance-default-internal-api-0"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.392758 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdmjw\" (UniqueName: \"kubernetes.io/projected/160cbdfa-ab8d-448a-85ae-38fcf6315509-kube-api-access-sdmjw\") pod \"glance-default-internal-api-0\" (UID: \"160cbdfa-ab8d-448a-85ae-38fcf6315509\") " pod="openstack/glance-default-internal-api-0"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.394216 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shtk8\" (UniqueName: \"kubernetes.io/projected/961053b2-4438-497f-8056-0f9d1b6f058b-kube-api-access-shtk8\") pod \"glance-default-external-api-0\" (UID: \"961053b2-4438-497f-8056-0f9d1b6f058b\") " pod="openstack/glance-default-external-api-0"
Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.409751 4825 util.go:30]
"No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 08:20:42 crc kubenswrapper[4825]: I0310 08:20:42.425871 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 08:20:43 crc kubenswrapper[4825]: I0310 08:20:43.251548 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec1c3fd1-11dc-49a3-9869-95438b93ab08" path="/var/lib/kubelet/pods/ec1c3fd1-11dc-49a3-9869-95438b93ab08/volumes" Mar 10 08:20:43 crc kubenswrapper[4825]: I0310 08:20:43.252714 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee73a326-c85e-4756-b836-3fc323ef11ac" path="/var/lib/kubelet/pods/ee73a326-c85e-4756-b836-3fc323ef11ac/volumes" Mar 10 08:20:43 crc kubenswrapper[4825]: I0310 08:20:43.601368 4825 scope.go:117] "RemoveContainer" containerID="8ff48b0e845c119b201f42252ec7936b2bc34895932f700db12f398de5e90390" Mar 10 08:20:46 crc kubenswrapper[4825]: I0310 08:20:46.836489 4825 scope.go:117] "RemoveContainer" containerID="9bc38ee5df0de7e7cb535b07fa19f7882db88fdfd7fd3c142ef84476287cf80d" Mar 10 08:20:46 crc kubenswrapper[4825]: I0310 08:20:46.961988 4825 scope.go:117] "RemoveContainer" containerID="63fa391deabef5224165acf41e5f511070e87347af64375ff1f1bdb5a91b2773" Mar 10 08:20:47 crc kubenswrapper[4825]: I0310 08:20:47.080853 4825 scope.go:117] "RemoveContainer" containerID="7e47c76ed5bdceee305106f6805f553116458fd19154c222ac8de20aaadc5130" Mar 10 08:20:47 crc kubenswrapper[4825]: I0310 08:20:47.122272 4825 scope.go:117] "RemoveContainer" containerID="0445a81d70caf56e23376649a443336447a0a30059364373d8bd1773e80c6108" Mar 10 08:20:47 crc kubenswrapper[4825]: I0310 08:20:47.174521 4825 scope.go:117] "RemoveContainer" containerID="2eb2b8f487a346d97915c6289d53b6f34478daa1b1a82707defd22b2970a4eab" Mar 10 08:20:47 crc kubenswrapper[4825]: I0310 08:20:47.253047 4825 scope.go:117] "RemoveContainer" 
containerID="a0ef714ed80b9877b5e762ee73cb01ae9e769071f74dc2a2689e7de7628babf6" Mar 10 08:20:47 crc kubenswrapper[4825]: I0310 08:20:47.482063 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 08:20:47 crc kubenswrapper[4825]: I0310 08:20:47.700481 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 08:20:48 crc kubenswrapper[4825]: I0310 08:20:48.055018 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67b47658d-rd7tt" event={"ID":"402e8d63-8847-44af-9a62-9536d0f513f8","Type":"ContainerStarted","Data":"4734b56c5e54339cc3180b88d104ea545423352854d359b2160cb88f7ae1ab80"} Mar 10 08:20:48 crc kubenswrapper[4825]: I0310 08:20:48.055332 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67b47658d-rd7tt" event={"ID":"402e8d63-8847-44af-9a62-9536d0f513f8","Type":"ContainerStarted","Data":"39d78a2a9b3841e3d9ee90de8594f18c0fb81678d34f6df1cd41a80bb869fbf6"} Mar 10 08:20:48 crc kubenswrapper[4825]: I0310 08:20:48.060028 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77b77cfcdf-n62md" event={"ID":"385dc976-f03b-4c63-8cdc-73d5e8481597","Type":"ContainerStarted","Data":"d3210a09d486c69125e220164c363046db732c68b7ca447aa3d1f76428f6c7fa"} Mar 10 08:20:48 crc kubenswrapper[4825]: I0310 08:20:48.060070 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77b77cfcdf-n62md" event={"ID":"385dc976-f03b-4c63-8cdc-73d5e8481597","Type":"ContainerStarted","Data":"f1f8f63b853b5acfdbc6b971206d55184e2529eee741b097ed23ff2fb7591917"} Mar 10 08:20:48 crc kubenswrapper[4825]: I0310 08:20:48.060189 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-77b77cfcdf-n62md" podUID="385dc976-f03b-4c63-8cdc-73d5e8481597" containerName="horizon-log" containerID="cri-o://f1f8f63b853b5acfdbc6b971206d55184e2529eee741b097ed23ff2fb7591917" 
gracePeriod=30 Mar 10 08:20:48 crc kubenswrapper[4825]: I0310 08:20:48.060249 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-77b77cfcdf-n62md" podUID="385dc976-f03b-4c63-8cdc-73d5e8481597" containerName="horizon" containerID="cri-o://d3210a09d486c69125e220164c363046db732c68b7ca447aa3d1f76428f6c7fa" gracePeriod=30 Mar 10 08:20:48 crc kubenswrapper[4825]: I0310 08:20:48.064505 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8677c986cb-mlrhz" event={"ID":"53591cd3-2c72-4a0d-ac6b-a904476b2dc4","Type":"ContainerStarted","Data":"85df29cb665ff0979047be5868bad23878444fa169690ff6e59018560293c3c2"} Mar 10 08:20:48 crc kubenswrapper[4825]: I0310 08:20:48.064532 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8677c986cb-mlrhz" event={"ID":"53591cd3-2c72-4a0d-ac6b-a904476b2dc4","Type":"ContainerStarted","Data":"7ca55dea021557f84473516973ad447b63260d9147ae4bea8645ea07f90ad565"} Mar 10 08:20:48 crc kubenswrapper[4825]: I0310 08:20:48.066421 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"160cbdfa-ab8d-448a-85ae-38fcf6315509","Type":"ContainerStarted","Data":"c7dc61442a176d0d7c3a890059a8c94b5dd34d0e030f84658f4452548b79da84"} Mar 10 08:20:48 crc kubenswrapper[4825]: I0310 08:20:48.068405 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"961053b2-4438-497f-8056-0f9d1b6f058b","Type":"ContainerStarted","Data":"de4b8cb8a59f9abe18bdb8df5a263e81a365c6a8720f5b163c18c6d92cade1f9"} Mar 10 08:20:48 crc kubenswrapper[4825]: I0310 08:20:48.071530 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58dcdb7cc-ccx9h" event={"ID":"a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421","Type":"ContainerStarted","Data":"9a7c68948eed8e14f0534f2eb32f7295a8b53e6ad7635be3e35eedc08c44b496"} Mar 10 08:20:48 crc kubenswrapper[4825]: I0310 08:20:48.071554 4825 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58dcdb7cc-ccx9h" event={"ID":"a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421","Type":"ContainerStarted","Data":"1d436d89e872fde65218671e25e0d1c7020e43085900f4b67ba050ae015e678e"} Mar 10 08:20:48 crc kubenswrapper[4825]: I0310 08:20:48.071651 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-58dcdb7cc-ccx9h" podUID="a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421" containerName="horizon-log" containerID="cri-o://1d436d89e872fde65218671e25e0d1c7020e43085900f4b67ba050ae015e678e" gracePeriod=30 Mar 10 08:20:48 crc kubenswrapper[4825]: I0310 08:20:48.071697 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-58dcdb7cc-ccx9h" podUID="a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421" containerName="horizon" containerID="cri-o://9a7c68948eed8e14f0534f2eb32f7295a8b53e6ad7635be3e35eedc08c44b496" gracePeriod=30 Mar 10 08:20:48 crc kubenswrapper[4825]: I0310 08:20:48.081900 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-67b47658d-rd7tt" podStartSLOduration=2.647775582 podStartE2EDuration="9.081879321s" podCreationTimestamp="2026-03-10 08:20:39 +0000 UTC" firstStartedPulling="2026-03-10 08:20:40.589882458 +0000 UTC m=+5793.619663073" lastFinishedPulling="2026-03-10 08:20:47.023986197 +0000 UTC m=+5800.053766812" observedRunningTime="2026-03-10 08:20:48.073504481 +0000 UTC m=+5801.103285096" watchObservedRunningTime="2026-03-10 08:20:48.081879321 +0000 UTC m=+5801.111659936" Mar 10 08:20:48 crc kubenswrapper[4825]: I0310 08:20:48.110035 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-77b77cfcdf-n62md" podStartSLOduration=2.820171781 podStartE2EDuration="11.11001219s" podCreationTimestamp="2026-03-10 08:20:37 +0000 UTC" firstStartedPulling="2026-03-10 08:20:38.777118236 +0000 UTC m=+5791.806898851" lastFinishedPulling="2026-03-10 08:20:47.066958645 
+0000 UTC m=+5800.096739260" observedRunningTime="2026-03-10 08:20:48.102348289 +0000 UTC m=+5801.132128904" watchObservedRunningTime="2026-03-10 08:20:48.11001219 +0000 UTC m=+5801.139792815" Mar 10 08:20:48 crc kubenswrapper[4825]: I0310 08:20:48.133458 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-8677c986cb-mlrhz" podStartSLOduration=2.5965840079999998 podStartE2EDuration="9.133435345s" podCreationTimestamp="2026-03-10 08:20:39 +0000 UTC" firstStartedPulling="2026-03-10 08:20:40.474376065 +0000 UTC m=+5793.504156680" lastFinishedPulling="2026-03-10 08:20:47.011227402 +0000 UTC m=+5800.041008017" observedRunningTime="2026-03-10 08:20:48.121507812 +0000 UTC m=+5801.151288437" watchObservedRunningTime="2026-03-10 08:20:48.133435345 +0000 UTC m=+5801.163215970" Mar 10 08:20:48 crc kubenswrapper[4825]: I0310 08:20:48.143334 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-58dcdb7cc-ccx9h" podStartSLOduration=2.797843705 podStartE2EDuration="11.143317654s" podCreationTimestamp="2026-03-10 08:20:37 +0000 UTC" firstStartedPulling="2026-03-10 08:20:38.66335467 +0000 UTC m=+5791.693135285" lastFinishedPulling="2026-03-10 08:20:47.008828599 +0000 UTC m=+5800.038609234" observedRunningTime="2026-03-10 08:20:48.142353369 +0000 UTC m=+5801.172133974" watchObservedRunningTime="2026-03-10 08:20:48.143317654 +0000 UTC m=+5801.173098269" Mar 10 08:20:48 crc kubenswrapper[4825]: I0310 08:20:48.203352 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-58dcdb7cc-ccx9h" Mar 10 08:20:48 crc kubenswrapper[4825]: I0310 08:20:48.298978 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-77b77cfcdf-n62md" Mar 10 08:20:49 crc kubenswrapper[4825]: I0310 08:20:49.084956 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"160cbdfa-ab8d-448a-85ae-38fcf6315509","Type":"ContainerStarted","Data":"d325399696ede672748d64a225cbda9e31a4a5c42501db2699470250d4fa654e"} Mar 10 08:20:49 crc kubenswrapper[4825]: I0310 08:20:49.085007 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"160cbdfa-ab8d-448a-85ae-38fcf6315509","Type":"ContainerStarted","Data":"6bbc92af924e3d9e0f8dcdc523135954370c0902253aa7dd2e967223ca4c9584"} Mar 10 08:20:49 crc kubenswrapper[4825]: I0310 08:20:49.087889 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"961053b2-4438-497f-8056-0f9d1b6f058b","Type":"ContainerStarted","Data":"a59f0d349c63cd089f2e04b08618ef68a752840e68e74ac2b61796755527bc9c"} Mar 10 08:20:49 crc kubenswrapper[4825]: I0310 08:20:49.087941 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"961053b2-4438-497f-8056-0f9d1b6f058b","Type":"ContainerStarted","Data":"f66986083fbbbd56d86251943b6d6c2908ab7bfc5805c5245850afb0e4f78366"} Mar 10 08:20:49 crc kubenswrapper[4825]: I0310 08:20:49.122402 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=8.122380609 podStartE2EDuration="8.122380609s" podCreationTimestamp="2026-03-10 08:20:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:20:49.106512922 +0000 UTC m=+5802.136293537" watchObservedRunningTime="2026-03-10 08:20:49.122380609 +0000 UTC m=+5802.152161224" Mar 10 08:20:49 crc kubenswrapper[4825]: I0310 08:20:49.133965 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.133944852 podStartE2EDuration="8.133944852s" podCreationTimestamp="2026-03-10 08:20:41 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:20:49.132470253 +0000 UTC m=+5802.162250878" watchObservedRunningTime="2026-03-10 08:20:49.133944852 +0000 UTC m=+5802.163725467" Mar 10 08:20:49 crc kubenswrapper[4825]: I0310 08:20:49.988217 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-8677c986cb-mlrhz" Mar 10 08:20:49 crc kubenswrapper[4825]: I0310 08:20:49.988267 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-8677c986cb-mlrhz" Mar 10 08:20:50 crc kubenswrapper[4825]: I0310 08:20:50.040100 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-67b47658d-rd7tt" Mar 10 08:20:50 crc kubenswrapper[4825]: I0310 08:20:50.040165 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-67b47658d-rd7tt" Mar 10 08:20:52 crc kubenswrapper[4825]: I0310 08:20:52.410194 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 10 08:20:52 crc kubenswrapper[4825]: I0310 08:20:52.410481 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 10 08:20:52 crc kubenswrapper[4825]: I0310 08:20:52.427004 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 10 08:20:52 crc kubenswrapper[4825]: I0310 08:20:52.427048 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 10 08:20:52 crc kubenswrapper[4825]: I0310 08:20:52.450478 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 10 08:20:52 crc kubenswrapper[4825]: I0310 08:20:52.462522 4825 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 10 08:20:52 crc kubenswrapper[4825]: I0310 08:20:52.469292 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 10 08:20:52 crc kubenswrapper[4825]: I0310 08:20:52.478052 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 10 08:20:53 crc kubenswrapper[4825]: I0310 08:20:53.181334 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 10 08:20:53 crc kubenswrapper[4825]: I0310 08:20:53.181420 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 10 08:20:53 crc kubenswrapper[4825]: I0310 08:20:53.181437 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 10 08:20:53 crc kubenswrapper[4825]: I0310 08:20:53.181448 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 10 08:20:55 crc kubenswrapper[4825]: I0310 08:20:55.198116 4825 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 08:20:55 crc kubenswrapper[4825]: I0310 08:20:55.198420 4825 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 08:20:55 crc kubenswrapper[4825]: I0310 08:20:55.513249 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 10 08:20:55 crc kubenswrapper[4825]: I0310 08:20:55.513402 4825 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 08:20:55 crc kubenswrapper[4825]: I0310 08:20:55.521160 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 10 08:20:55 crc kubenswrapper[4825]: I0310 
08:20:55.524533 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 10 08:20:55 crc kubenswrapper[4825]: I0310 08:20:55.666970 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 10 08:20:59 crc kubenswrapper[4825]: I0310 08:20:59.990978 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-8677c986cb-mlrhz" podUID="53591cd3-2c72-4a0d-ac6b-a904476b2dc4" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.139:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.139:8443: connect: connection refused" Mar 10 08:21:00 crc kubenswrapper[4825]: I0310 08:21:00.044317 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-67b47658d-rd7tt" podUID="402e8d63-8847-44af-9a62-9536d0f513f8" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.140:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.140:8443: connect: connection refused" Mar 10 08:21:11 crc kubenswrapper[4825]: I0310 08:21:11.848354 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-8677c986cb-mlrhz" Mar 10 08:21:11 crc kubenswrapper[4825]: I0310 08:21:11.875579 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-67b47658d-rd7tt" Mar 10 08:21:13 crc kubenswrapper[4825]: I0310 08:21:13.584047 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-8677c986cb-mlrhz" Mar 10 08:21:13 crc kubenswrapper[4825]: I0310 08:21:13.709723 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-67b47658d-rd7tt" Mar 10 08:21:13 crc kubenswrapper[4825]: I0310 08:21:13.773911 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8677c986cb-mlrhz"] Mar 10 08:21:14 crc 
kubenswrapper[4825]: I0310 08:21:14.361178 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-8677c986cb-mlrhz" podUID="53591cd3-2c72-4a0d-ac6b-a904476b2dc4" containerName="horizon-log" containerID="cri-o://7ca55dea021557f84473516973ad447b63260d9147ae4bea8645ea07f90ad565" gracePeriod=30 Mar 10 08:21:14 crc kubenswrapper[4825]: I0310 08:21:14.361218 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-8677c986cb-mlrhz" podUID="53591cd3-2c72-4a0d-ac6b-a904476b2dc4" containerName="horizon" containerID="cri-o://85df29cb665ff0979047be5868bad23878444fa169690ff6e59018560293c3c2" gracePeriod=30 Mar 10 08:21:16 crc kubenswrapper[4825]: I0310 08:21:16.888105 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 08:21:16 crc kubenswrapper[4825]: I0310 08:21:16.888401 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 08:21:18 crc kubenswrapper[4825]: E0310 08:21:18.370943 4825 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2ee8d7c_1256_4a21_8f30_2a3f9d3f1421.slice/crio-conmon-9a7c68948eed8e14f0534f2eb32f7295a8b53e6ad7635be3e35eedc08c44b496.scope\": RecentStats: unable to find data in memory cache]" Mar 10 08:21:18 crc kubenswrapper[4825]: I0310 08:21:18.416294 4825 generic.go:334] "Generic (PLEG): container finished" 
podID="a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421" containerID="9a7c68948eed8e14f0534f2eb32f7295a8b53e6ad7635be3e35eedc08c44b496" exitCode=137 Mar 10 08:21:18 crc kubenswrapper[4825]: I0310 08:21:18.416338 4825 generic.go:334] "Generic (PLEG): container finished" podID="a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421" containerID="1d436d89e872fde65218671e25e0d1c7020e43085900f4b67ba050ae015e678e" exitCode=137 Mar 10 08:21:18 crc kubenswrapper[4825]: I0310 08:21:18.416414 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58dcdb7cc-ccx9h" event={"ID":"a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421","Type":"ContainerDied","Data":"9a7c68948eed8e14f0534f2eb32f7295a8b53e6ad7635be3e35eedc08c44b496"} Mar 10 08:21:18 crc kubenswrapper[4825]: I0310 08:21:18.416459 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58dcdb7cc-ccx9h" event={"ID":"a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421","Type":"ContainerDied","Data":"1d436d89e872fde65218671e25e0d1c7020e43085900f4b67ba050ae015e678e"} Mar 10 08:21:18 crc kubenswrapper[4825]: I0310 08:21:18.419371 4825 generic.go:334] "Generic (PLEG): container finished" podID="385dc976-f03b-4c63-8cdc-73d5e8481597" containerID="d3210a09d486c69125e220164c363046db732c68b7ca447aa3d1f76428f6c7fa" exitCode=137 Mar 10 08:21:18 crc kubenswrapper[4825]: I0310 08:21:18.419419 4825 generic.go:334] "Generic (PLEG): container finished" podID="385dc976-f03b-4c63-8cdc-73d5e8481597" containerID="f1f8f63b853b5acfdbc6b971206d55184e2529eee741b097ed23ff2fb7591917" exitCode=137 Mar 10 08:21:18 crc kubenswrapper[4825]: I0310 08:21:18.419440 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77b77cfcdf-n62md" event={"ID":"385dc976-f03b-4c63-8cdc-73d5e8481597","Type":"ContainerDied","Data":"d3210a09d486c69125e220164c363046db732c68b7ca447aa3d1f76428f6c7fa"} Mar 10 08:21:18 crc kubenswrapper[4825]: I0310 08:21:18.419486 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77b77cfcdf-n62md" 
event={"ID":"385dc976-f03b-4c63-8cdc-73d5e8481597","Type":"ContainerDied","Data":"f1f8f63b853b5acfdbc6b971206d55184e2529eee741b097ed23ff2fb7591917"} Mar 10 08:21:18 crc kubenswrapper[4825]: I0310 08:21:18.421952 4825 generic.go:334] "Generic (PLEG): container finished" podID="53591cd3-2c72-4a0d-ac6b-a904476b2dc4" containerID="85df29cb665ff0979047be5868bad23878444fa169690ff6e59018560293c3c2" exitCode=0 Mar 10 08:21:18 crc kubenswrapper[4825]: I0310 08:21:18.421978 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8677c986cb-mlrhz" event={"ID":"53591cd3-2c72-4a0d-ac6b-a904476b2dc4","Type":"ContainerDied","Data":"85df29cb665ff0979047be5868bad23878444fa169690ff6e59018560293c3c2"} Mar 10 08:21:18 crc kubenswrapper[4825]: I0310 08:21:18.585516 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-77b77cfcdf-n62md" Mar 10 08:21:18 crc kubenswrapper[4825]: I0310 08:21:18.591494 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-58dcdb7cc-ccx9h" Mar 10 08:21:18 crc kubenswrapper[4825]: I0310 08:21:18.749094 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421-horizon-secret-key\") pod \"a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421\" (UID: \"a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421\") " Mar 10 08:21:18 crc kubenswrapper[4825]: I0310 08:21:18.749196 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/385dc976-f03b-4c63-8cdc-73d5e8481597-config-data\") pod \"385dc976-f03b-4c63-8cdc-73d5e8481597\" (UID: \"385dc976-f03b-4c63-8cdc-73d5e8481597\") " Mar 10 08:21:18 crc kubenswrapper[4825]: I0310 08:21:18.749235 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-625pk\" (UniqueName: \"kubernetes.io/projected/385dc976-f03b-4c63-8cdc-73d5e8481597-kube-api-access-625pk\") pod \"385dc976-f03b-4c63-8cdc-73d5e8481597\" (UID: \"385dc976-f03b-4c63-8cdc-73d5e8481597\") " Mar 10 08:21:18 crc kubenswrapper[4825]: I0310 08:21:18.749356 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421-logs\") pod \"a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421\" (UID: \"a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421\") " Mar 10 08:21:18 crc kubenswrapper[4825]: I0310 08:21:18.749382 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421-config-data\") pod \"a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421\" (UID: \"a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421\") " Mar 10 08:21:18 crc kubenswrapper[4825]: I0310 08:21:18.749407 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/385dc976-f03b-4c63-8cdc-73d5e8481597-logs\") pod \"385dc976-f03b-4c63-8cdc-73d5e8481597\" (UID: \"385dc976-f03b-4c63-8cdc-73d5e8481597\") " Mar 10 08:21:18 crc kubenswrapper[4825]: I0310 08:21:18.749460 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421-scripts\") pod \"a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421\" (UID: \"a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421\") " Mar 10 08:21:18 crc kubenswrapper[4825]: I0310 08:21:18.749549 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvs2z\" (UniqueName: \"kubernetes.io/projected/a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421-kube-api-access-jvs2z\") pod \"a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421\" (UID: \"a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421\") " Mar 10 08:21:18 crc kubenswrapper[4825]: I0310 08:21:18.749589 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/385dc976-f03b-4c63-8cdc-73d5e8481597-horizon-secret-key\") pod \"385dc976-f03b-4c63-8cdc-73d5e8481597\" (UID: \"385dc976-f03b-4c63-8cdc-73d5e8481597\") " Mar 10 08:21:18 crc kubenswrapper[4825]: I0310 08:21:18.749659 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/385dc976-f03b-4c63-8cdc-73d5e8481597-scripts\") pod \"385dc976-f03b-4c63-8cdc-73d5e8481597\" (UID: \"385dc976-f03b-4c63-8cdc-73d5e8481597\") " Mar 10 08:21:18 crc kubenswrapper[4825]: I0310 08:21:18.750438 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/385dc976-f03b-4c63-8cdc-73d5e8481597-logs" (OuterVolumeSpecName: "logs") pod "385dc976-f03b-4c63-8cdc-73d5e8481597" (UID: "385dc976-f03b-4c63-8cdc-73d5e8481597"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:21:18 crc kubenswrapper[4825]: I0310 08:21:18.750802 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421-logs" (OuterVolumeSpecName: "logs") pod "a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421" (UID: "a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:21:18 crc kubenswrapper[4825]: I0310 08:21:18.755538 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421" (UID: "a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:21:18 crc kubenswrapper[4825]: I0310 08:21:18.763288 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421-kube-api-access-jvs2z" (OuterVolumeSpecName: "kube-api-access-jvs2z") pod "a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421" (UID: "a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421"). InnerVolumeSpecName "kube-api-access-jvs2z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:21:18 crc kubenswrapper[4825]: I0310 08:21:18.763372 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/385dc976-f03b-4c63-8cdc-73d5e8481597-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "385dc976-f03b-4c63-8cdc-73d5e8481597" (UID: "385dc976-f03b-4c63-8cdc-73d5e8481597"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:21:18 crc kubenswrapper[4825]: I0310 08:21:18.763440 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/385dc976-f03b-4c63-8cdc-73d5e8481597-kube-api-access-625pk" (OuterVolumeSpecName: "kube-api-access-625pk") pod "385dc976-f03b-4c63-8cdc-73d5e8481597" (UID: "385dc976-f03b-4c63-8cdc-73d5e8481597"). InnerVolumeSpecName "kube-api-access-625pk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:21:18 crc kubenswrapper[4825]: I0310 08:21:18.773305 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421-scripts" (OuterVolumeSpecName: "scripts") pod "a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421" (UID: "a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:21:18 crc kubenswrapper[4825]: I0310 08:21:18.775725 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/385dc976-f03b-4c63-8cdc-73d5e8481597-config-data" (OuterVolumeSpecName: "config-data") pod "385dc976-f03b-4c63-8cdc-73d5e8481597" (UID: "385dc976-f03b-4c63-8cdc-73d5e8481597"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:21:18 crc kubenswrapper[4825]: I0310 08:21:18.778633 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/385dc976-f03b-4c63-8cdc-73d5e8481597-scripts" (OuterVolumeSpecName: "scripts") pod "385dc976-f03b-4c63-8cdc-73d5e8481597" (UID: "385dc976-f03b-4c63-8cdc-73d5e8481597"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:21:18 crc kubenswrapper[4825]: I0310 08:21:18.787501 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421-config-data" (OuterVolumeSpecName: "config-data") pod "a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421" (UID: "a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:21:18 crc kubenswrapper[4825]: I0310 08:21:18.851253 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/385dc976-f03b-4c63-8cdc-73d5e8481597-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 08:21:18 crc kubenswrapper[4825]: I0310 08:21:18.851292 4825 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 10 08:21:18 crc kubenswrapper[4825]: I0310 08:21:18.851304 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/385dc976-f03b-4c63-8cdc-73d5e8481597-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 08:21:18 crc kubenswrapper[4825]: I0310 08:21:18.851314 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-625pk\" (UniqueName: \"kubernetes.io/projected/385dc976-f03b-4c63-8cdc-73d5e8481597-kube-api-access-625pk\") on node \"crc\" DevicePath \"\"" Mar 10 08:21:18 crc kubenswrapper[4825]: I0310 08:21:18.851325 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421-logs\") on node \"crc\" DevicePath \"\"" Mar 10 08:21:18 crc kubenswrapper[4825]: I0310 08:21:18.851335 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 08:21:18 crc kubenswrapper[4825]: I0310 08:21:18.851345 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/385dc976-f03b-4c63-8cdc-73d5e8481597-logs\") on node \"crc\" DevicePath \"\"" Mar 10 08:21:18 crc kubenswrapper[4825]: I0310 08:21:18.851354 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 08:21:18 crc kubenswrapper[4825]: I0310 08:21:18.851365 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvs2z\" (UniqueName: \"kubernetes.io/projected/a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421-kube-api-access-jvs2z\") on node \"crc\" DevicePath \"\"" Mar 10 08:21:18 crc kubenswrapper[4825]: I0310 08:21:18.851373 4825 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/385dc976-f03b-4c63-8cdc-73d5e8481597-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 10 08:21:19 crc kubenswrapper[4825]: I0310 08:21:19.433092 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-77b77cfcdf-n62md" Mar 10 08:21:19 crc kubenswrapper[4825]: I0310 08:21:19.433088 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77b77cfcdf-n62md" event={"ID":"385dc976-f03b-4c63-8cdc-73d5e8481597","Type":"ContainerDied","Data":"7bcbc4fdd5fec9c3ff27451dfa08602e7574850e8a3fbc3bbe148a32a079dcc7"} Mar 10 08:21:19 crc kubenswrapper[4825]: I0310 08:21:19.433576 4825 scope.go:117] "RemoveContainer" containerID="d3210a09d486c69125e220164c363046db732c68b7ca447aa3d1f76428f6c7fa" Mar 10 08:21:19 crc kubenswrapper[4825]: I0310 08:21:19.435057 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58dcdb7cc-ccx9h" event={"ID":"a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421","Type":"ContainerDied","Data":"ba4a2da7d97965aa9be94de1d01b90744244674a63fed17dd94acda6ff94d3af"} Mar 10 08:21:19 crc kubenswrapper[4825]: I0310 08:21:19.435107 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-58dcdb7cc-ccx9h" Mar 10 08:21:19 crc kubenswrapper[4825]: I0310 08:21:19.468727 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-58dcdb7cc-ccx9h"] Mar 10 08:21:19 crc kubenswrapper[4825]: I0310 08:21:19.479141 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-58dcdb7cc-ccx9h"] Mar 10 08:21:19 crc kubenswrapper[4825]: I0310 08:21:19.493233 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-77b77cfcdf-n62md"] Mar 10 08:21:19 crc kubenswrapper[4825]: I0310 08:21:19.501545 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-77b77cfcdf-n62md"] Mar 10 08:21:19 crc kubenswrapper[4825]: I0310 08:21:19.628312 4825 scope.go:117] "RemoveContainer" containerID="f1f8f63b853b5acfdbc6b971206d55184e2529eee741b097ed23ff2fb7591917" Mar 10 08:21:19 crc kubenswrapper[4825]: I0310 08:21:19.648411 4825 scope.go:117] "RemoveContainer" 
containerID="9a7c68948eed8e14f0534f2eb32f7295a8b53e6ad7635be3e35eedc08c44b496" Mar 10 08:21:19 crc kubenswrapper[4825]: I0310 08:21:19.832745 4825 scope.go:117] "RemoveContainer" containerID="1d436d89e872fde65218671e25e0d1c7020e43085900f4b67ba050ae015e678e" Mar 10 08:21:19 crc kubenswrapper[4825]: I0310 08:21:19.988803 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-8677c986cb-mlrhz" podUID="53591cd3-2c72-4a0d-ac6b-a904476b2dc4" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.139:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.139:8443: connect: connection refused" Mar 10 08:21:21 crc kubenswrapper[4825]: I0310 08:21:21.251472 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="385dc976-f03b-4c63-8cdc-73d5e8481597" path="/var/lib/kubelet/pods/385dc976-f03b-4c63-8cdc-73d5e8481597/volumes" Mar 10 08:21:21 crc kubenswrapper[4825]: I0310 08:21:21.253341 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421" path="/var/lib/kubelet/pods/a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421/volumes" Mar 10 08:21:29 crc kubenswrapper[4825]: I0310 08:21:29.988981 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-8677c986cb-mlrhz" podUID="53591cd3-2c72-4a0d-ac6b-a904476b2dc4" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.139:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.139:8443: connect: connection refused" Mar 10 08:21:39 crc kubenswrapper[4825]: I0310 08:21:39.988244 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-8677c986cb-mlrhz" podUID="53591cd3-2c72-4a0d-ac6b-a904476b2dc4" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.139:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.139:8443: connect: connection refused" Mar 10 08:21:39 crc kubenswrapper[4825]: I0310 08:21:39.988941 
4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-8677c986cb-mlrhz" Mar 10 08:21:44 crc kubenswrapper[4825]: I0310 08:21:44.702385 4825 generic.go:334] "Generic (PLEG): container finished" podID="53591cd3-2c72-4a0d-ac6b-a904476b2dc4" containerID="7ca55dea021557f84473516973ad447b63260d9147ae4bea8645ea07f90ad565" exitCode=137 Mar 10 08:21:44 crc kubenswrapper[4825]: I0310 08:21:44.702412 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8677c986cb-mlrhz" event={"ID":"53591cd3-2c72-4a0d-ac6b-a904476b2dc4","Type":"ContainerDied","Data":"7ca55dea021557f84473516973ad447b63260d9147ae4bea8645ea07f90ad565"} Mar 10 08:21:44 crc kubenswrapper[4825]: I0310 08:21:44.703028 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8677c986cb-mlrhz" event={"ID":"53591cd3-2c72-4a0d-ac6b-a904476b2dc4","Type":"ContainerDied","Data":"e93fe6bafe88a6a808073f78a47244777e53ccd22807bdbdc771c17a4d8be925"} Mar 10 08:21:44 crc kubenswrapper[4825]: I0310 08:21:44.703050 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e93fe6bafe88a6a808073f78a47244777e53ccd22807bdbdc771c17a4d8be925" Mar 10 08:21:44 crc kubenswrapper[4825]: I0310 08:21:44.729115 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8677c986cb-mlrhz" Mar 10 08:21:44 crc kubenswrapper[4825]: I0310 08:21:44.794962 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtjvh\" (UniqueName: \"kubernetes.io/projected/53591cd3-2c72-4a0d-ac6b-a904476b2dc4-kube-api-access-mtjvh\") pod \"53591cd3-2c72-4a0d-ac6b-a904476b2dc4\" (UID: \"53591cd3-2c72-4a0d-ac6b-a904476b2dc4\") " Mar 10 08:21:44 crc kubenswrapper[4825]: I0310 08:21:44.795080 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/53591cd3-2c72-4a0d-ac6b-a904476b2dc4-config-data\") pod \"53591cd3-2c72-4a0d-ac6b-a904476b2dc4\" (UID: \"53591cd3-2c72-4a0d-ac6b-a904476b2dc4\") " Mar 10 08:21:44 crc kubenswrapper[4825]: I0310 08:21:44.795230 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53591cd3-2c72-4a0d-ac6b-a904476b2dc4-logs\") pod \"53591cd3-2c72-4a0d-ac6b-a904476b2dc4\" (UID: \"53591cd3-2c72-4a0d-ac6b-a904476b2dc4\") " Mar 10 08:21:44 crc kubenswrapper[4825]: I0310 08:21:44.795280 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/53591cd3-2c72-4a0d-ac6b-a904476b2dc4-scripts\") pod \"53591cd3-2c72-4a0d-ac6b-a904476b2dc4\" (UID: \"53591cd3-2c72-4a0d-ac6b-a904476b2dc4\") " Mar 10 08:21:44 crc kubenswrapper[4825]: I0310 08:21:44.795406 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/53591cd3-2c72-4a0d-ac6b-a904476b2dc4-horizon-secret-key\") pod \"53591cd3-2c72-4a0d-ac6b-a904476b2dc4\" (UID: \"53591cd3-2c72-4a0d-ac6b-a904476b2dc4\") " Mar 10 08:21:44 crc kubenswrapper[4825]: I0310 08:21:44.795443 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/53591cd3-2c72-4a0d-ac6b-a904476b2dc4-horizon-tls-certs\") pod \"53591cd3-2c72-4a0d-ac6b-a904476b2dc4\" (UID: \"53591cd3-2c72-4a0d-ac6b-a904476b2dc4\") " Mar 10 08:21:44 crc kubenswrapper[4825]: I0310 08:21:44.795467 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53591cd3-2c72-4a0d-ac6b-a904476b2dc4-combined-ca-bundle\") pod \"53591cd3-2c72-4a0d-ac6b-a904476b2dc4\" (UID: \"53591cd3-2c72-4a0d-ac6b-a904476b2dc4\") " Mar 10 08:21:44 crc kubenswrapper[4825]: I0310 08:21:44.796475 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53591cd3-2c72-4a0d-ac6b-a904476b2dc4-logs" (OuterVolumeSpecName: "logs") pod "53591cd3-2c72-4a0d-ac6b-a904476b2dc4" (UID: "53591cd3-2c72-4a0d-ac6b-a904476b2dc4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:21:44 crc kubenswrapper[4825]: I0310 08:21:44.806059 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53591cd3-2c72-4a0d-ac6b-a904476b2dc4-kube-api-access-mtjvh" (OuterVolumeSpecName: "kube-api-access-mtjvh") pod "53591cd3-2c72-4a0d-ac6b-a904476b2dc4" (UID: "53591cd3-2c72-4a0d-ac6b-a904476b2dc4"). InnerVolumeSpecName "kube-api-access-mtjvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:21:44 crc kubenswrapper[4825]: I0310 08:21:44.806048 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53591cd3-2c72-4a0d-ac6b-a904476b2dc4-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "53591cd3-2c72-4a0d-ac6b-a904476b2dc4" (UID: "53591cd3-2c72-4a0d-ac6b-a904476b2dc4"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:21:44 crc kubenswrapper[4825]: I0310 08:21:44.823055 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53591cd3-2c72-4a0d-ac6b-a904476b2dc4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53591cd3-2c72-4a0d-ac6b-a904476b2dc4" (UID: "53591cd3-2c72-4a0d-ac6b-a904476b2dc4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:21:44 crc kubenswrapper[4825]: I0310 08:21:44.826043 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53591cd3-2c72-4a0d-ac6b-a904476b2dc4-scripts" (OuterVolumeSpecName: "scripts") pod "53591cd3-2c72-4a0d-ac6b-a904476b2dc4" (UID: "53591cd3-2c72-4a0d-ac6b-a904476b2dc4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:21:44 crc kubenswrapper[4825]: I0310 08:21:44.832787 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53591cd3-2c72-4a0d-ac6b-a904476b2dc4-config-data" (OuterVolumeSpecName: "config-data") pod "53591cd3-2c72-4a0d-ac6b-a904476b2dc4" (UID: "53591cd3-2c72-4a0d-ac6b-a904476b2dc4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:21:44 crc kubenswrapper[4825]: I0310 08:21:44.854072 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53591cd3-2c72-4a0d-ac6b-a904476b2dc4-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "53591cd3-2c72-4a0d-ac6b-a904476b2dc4" (UID: "53591cd3-2c72-4a0d-ac6b-a904476b2dc4"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:21:44 crc kubenswrapper[4825]: I0310 08:21:44.898312 4825 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/53591cd3-2c72-4a0d-ac6b-a904476b2dc4-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 10 08:21:44 crc kubenswrapper[4825]: I0310 08:21:44.898363 4825 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/53591cd3-2c72-4a0d-ac6b-a904476b2dc4-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 08:21:44 crc kubenswrapper[4825]: I0310 08:21:44.898377 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53591cd3-2c72-4a0d-ac6b-a904476b2dc4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:21:44 crc kubenswrapper[4825]: I0310 08:21:44.898389 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtjvh\" (UniqueName: \"kubernetes.io/projected/53591cd3-2c72-4a0d-ac6b-a904476b2dc4-kube-api-access-mtjvh\") on node \"crc\" DevicePath \"\"" Mar 10 08:21:44 crc kubenswrapper[4825]: I0310 08:21:44.898401 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/53591cd3-2c72-4a0d-ac6b-a904476b2dc4-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 08:21:44 crc kubenswrapper[4825]: I0310 08:21:44.898412 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53591cd3-2c72-4a0d-ac6b-a904476b2dc4-logs\") on node \"crc\" DevicePath \"\"" Mar 10 08:21:44 crc kubenswrapper[4825]: I0310 08:21:44.898424 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/53591cd3-2c72-4a0d-ac6b-a904476b2dc4-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 08:21:45 crc kubenswrapper[4825]: I0310 08:21:45.711448 4825 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8677c986cb-mlrhz" Mar 10 08:21:45 crc kubenswrapper[4825]: I0310 08:21:45.742802 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8677c986cb-mlrhz"] Mar 10 08:21:45 crc kubenswrapper[4825]: I0310 08:21:45.751531 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-8677c986cb-mlrhz"] Mar 10 08:21:46 crc kubenswrapper[4825]: I0310 08:21:46.889034 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 08:21:46 crc kubenswrapper[4825]: I0310 08:21:46.889100 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 08:21:47 crc kubenswrapper[4825]: I0310 08:21:47.248372 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53591cd3-2c72-4a0d-ac6b-a904476b2dc4" path="/var/lib/kubelet/pods/53591cd3-2c72-4a0d-ac6b-a904476b2dc4/volumes" Mar 10 08:21:47 crc kubenswrapper[4825]: I0310 08:21:47.719837 4825 scope.go:117] "RemoveContainer" containerID="31afbfa85d1b8c8b7ab676c6ab3b0efb6487c66e1418dd92d1207a9c8c547043" Mar 10 08:21:47 crc kubenswrapper[4825]: I0310 08:21:47.749786 4825 scope.go:117] "RemoveContainer" containerID="93e50380a27201dc777ba257a73ff60ccbd7ddb727bf3ff5f02acec9cac182e3" Mar 10 08:21:55 crc kubenswrapper[4825]: I0310 08:21:55.305382 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-fdb9f75f4-fr95n"] Mar 10 08:21:55 crc kubenswrapper[4825]: E0310 08:21:55.306323 
4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421" containerName="horizon-log" Mar 10 08:21:55 crc kubenswrapper[4825]: I0310 08:21:55.306340 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421" containerName="horizon-log" Mar 10 08:21:55 crc kubenswrapper[4825]: E0310 08:21:55.306360 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53591cd3-2c72-4a0d-ac6b-a904476b2dc4" containerName="horizon-log" Mar 10 08:21:55 crc kubenswrapper[4825]: I0310 08:21:55.306368 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="53591cd3-2c72-4a0d-ac6b-a904476b2dc4" containerName="horizon-log" Mar 10 08:21:55 crc kubenswrapper[4825]: E0310 08:21:55.306378 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="385dc976-f03b-4c63-8cdc-73d5e8481597" containerName="horizon" Mar 10 08:21:55 crc kubenswrapper[4825]: I0310 08:21:55.306386 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="385dc976-f03b-4c63-8cdc-73d5e8481597" containerName="horizon" Mar 10 08:21:55 crc kubenswrapper[4825]: E0310 08:21:55.306401 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="385dc976-f03b-4c63-8cdc-73d5e8481597" containerName="horizon-log" Mar 10 08:21:55 crc kubenswrapper[4825]: I0310 08:21:55.306408 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="385dc976-f03b-4c63-8cdc-73d5e8481597" containerName="horizon-log" Mar 10 08:21:55 crc kubenswrapper[4825]: E0310 08:21:55.306436 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421" containerName="horizon" Mar 10 08:21:55 crc kubenswrapper[4825]: I0310 08:21:55.306444 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421" containerName="horizon" Mar 10 08:21:55 crc kubenswrapper[4825]: E0310 08:21:55.306464 4825 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="53591cd3-2c72-4a0d-ac6b-a904476b2dc4" containerName="horizon" Mar 10 08:21:55 crc kubenswrapper[4825]: I0310 08:21:55.306473 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="53591cd3-2c72-4a0d-ac6b-a904476b2dc4" containerName="horizon" Mar 10 08:21:55 crc kubenswrapper[4825]: I0310 08:21:55.306672 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="385dc976-f03b-4c63-8cdc-73d5e8481597" containerName="horizon-log" Mar 10 08:21:55 crc kubenswrapper[4825]: I0310 08:21:55.306690 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="53591cd3-2c72-4a0d-ac6b-a904476b2dc4" containerName="horizon-log" Mar 10 08:21:55 crc kubenswrapper[4825]: I0310 08:21:55.306702 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421" containerName="horizon" Mar 10 08:21:55 crc kubenswrapper[4825]: I0310 08:21:55.306715 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="53591cd3-2c72-4a0d-ac6b-a904476b2dc4" containerName="horizon" Mar 10 08:21:55 crc kubenswrapper[4825]: I0310 08:21:55.306724 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2ee8d7c-1256-4a21-8f30-2a3f9d3f1421" containerName="horizon-log" Mar 10 08:21:55 crc kubenswrapper[4825]: I0310 08:21:55.306748 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="385dc976-f03b-4c63-8cdc-73d5e8481597" containerName="horizon" Mar 10 08:21:55 crc kubenswrapper[4825]: I0310 08:21:55.307963 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-fdb9f75f4-fr95n" Mar 10 08:21:55 crc kubenswrapper[4825]: I0310 08:21:55.329307 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-fdb9f75f4-fr95n"] Mar 10 08:21:55 crc kubenswrapper[4825]: I0310 08:21:55.355165 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/75ad0a08-756c-40ea-85ec-3cd497c96680-scripts\") pod \"horizon-fdb9f75f4-fr95n\" (UID: \"75ad0a08-756c-40ea-85ec-3cd497c96680\") " pod="openstack/horizon-fdb9f75f4-fr95n" Mar 10 08:21:55 crc kubenswrapper[4825]: I0310 08:21:55.355233 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75ad0a08-756c-40ea-85ec-3cd497c96680-combined-ca-bundle\") pod \"horizon-fdb9f75f4-fr95n\" (UID: \"75ad0a08-756c-40ea-85ec-3cd497c96680\") " pod="openstack/horizon-fdb9f75f4-fr95n" Mar 10 08:21:55 crc kubenswrapper[4825]: I0310 08:21:55.355270 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zk2k\" (UniqueName: \"kubernetes.io/projected/75ad0a08-756c-40ea-85ec-3cd497c96680-kube-api-access-4zk2k\") pod \"horizon-fdb9f75f4-fr95n\" (UID: \"75ad0a08-756c-40ea-85ec-3cd497c96680\") " pod="openstack/horizon-fdb9f75f4-fr95n" Mar 10 08:21:55 crc kubenswrapper[4825]: I0310 08:21:55.355326 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75ad0a08-756c-40ea-85ec-3cd497c96680-logs\") pod \"horizon-fdb9f75f4-fr95n\" (UID: \"75ad0a08-756c-40ea-85ec-3cd497c96680\") " pod="openstack/horizon-fdb9f75f4-fr95n" Mar 10 08:21:55 crc kubenswrapper[4825]: I0310 08:21:55.355363 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/75ad0a08-756c-40ea-85ec-3cd497c96680-horizon-secret-key\") pod \"horizon-fdb9f75f4-fr95n\" (UID: \"75ad0a08-756c-40ea-85ec-3cd497c96680\") " pod="openstack/horizon-fdb9f75f4-fr95n" Mar 10 08:21:55 crc kubenswrapper[4825]: I0310 08:21:55.355385 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/75ad0a08-756c-40ea-85ec-3cd497c96680-horizon-tls-certs\") pod \"horizon-fdb9f75f4-fr95n\" (UID: \"75ad0a08-756c-40ea-85ec-3cd497c96680\") " pod="openstack/horizon-fdb9f75f4-fr95n" Mar 10 08:21:55 crc kubenswrapper[4825]: I0310 08:21:55.355404 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/75ad0a08-756c-40ea-85ec-3cd497c96680-config-data\") pod \"horizon-fdb9f75f4-fr95n\" (UID: \"75ad0a08-756c-40ea-85ec-3cd497c96680\") " pod="openstack/horizon-fdb9f75f4-fr95n" Mar 10 08:21:55 crc kubenswrapper[4825]: I0310 08:21:55.457525 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75ad0a08-756c-40ea-85ec-3cd497c96680-combined-ca-bundle\") pod \"horizon-fdb9f75f4-fr95n\" (UID: \"75ad0a08-756c-40ea-85ec-3cd497c96680\") " pod="openstack/horizon-fdb9f75f4-fr95n" Mar 10 08:21:55 crc kubenswrapper[4825]: I0310 08:21:55.457566 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zk2k\" (UniqueName: \"kubernetes.io/projected/75ad0a08-756c-40ea-85ec-3cd497c96680-kube-api-access-4zk2k\") pod \"horizon-fdb9f75f4-fr95n\" (UID: \"75ad0a08-756c-40ea-85ec-3cd497c96680\") " pod="openstack/horizon-fdb9f75f4-fr95n" Mar 10 08:21:55 crc kubenswrapper[4825]: I0310 08:21:55.457617 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/75ad0a08-756c-40ea-85ec-3cd497c96680-logs\") pod \"horizon-fdb9f75f4-fr95n\" (UID: \"75ad0a08-756c-40ea-85ec-3cd497c96680\") " pod="openstack/horizon-fdb9f75f4-fr95n" Mar 10 08:21:55 crc kubenswrapper[4825]: I0310 08:21:55.457660 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/75ad0a08-756c-40ea-85ec-3cd497c96680-horizon-secret-key\") pod \"horizon-fdb9f75f4-fr95n\" (UID: \"75ad0a08-756c-40ea-85ec-3cd497c96680\") " pod="openstack/horizon-fdb9f75f4-fr95n" Mar 10 08:21:55 crc kubenswrapper[4825]: I0310 08:21:55.457688 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/75ad0a08-756c-40ea-85ec-3cd497c96680-horizon-tls-certs\") pod \"horizon-fdb9f75f4-fr95n\" (UID: \"75ad0a08-756c-40ea-85ec-3cd497c96680\") " pod="openstack/horizon-fdb9f75f4-fr95n" Mar 10 08:21:55 crc kubenswrapper[4825]: I0310 08:21:55.457710 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/75ad0a08-756c-40ea-85ec-3cd497c96680-config-data\") pod \"horizon-fdb9f75f4-fr95n\" (UID: \"75ad0a08-756c-40ea-85ec-3cd497c96680\") " pod="openstack/horizon-fdb9f75f4-fr95n" Mar 10 08:21:55 crc kubenswrapper[4825]: I0310 08:21:55.457816 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/75ad0a08-756c-40ea-85ec-3cd497c96680-scripts\") pod \"horizon-fdb9f75f4-fr95n\" (UID: \"75ad0a08-756c-40ea-85ec-3cd497c96680\") " pod="openstack/horizon-fdb9f75f4-fr95n" Mar 10 08:21:55 crc kubenswrapper[4825]: I0310 08:21:55.458283 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75ad0a08-756c-40ea-85ec-3cd497c96680-logs\") pod \"horizon-fdb9f75f4-fr95n\" (UID: 
\"75ad0a08-756c-40ea-85ec-3cd497c96680\") " pod="openstack/horizon-fdb9f75f4-fr95n" Mar 10 08:21:55 crc kubenswrapper[4825]: I0310 08:21:55.458595 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/75ad0a08-756c-40ea-85ec-3cd497c96680-scripts\") pod \"horizon-fdb9f75f4-fr95n\" (UID: \"75ad0a08-756c-40ea-85ec-3cd497c96680\") " pod="openstack/horizon-fdb9f75f4-fr95n" Mar 10 08:21:55 crc kubenswrapper[4825]: I0310 08:21:55.459214 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/75ad0a08-756c-40ea-85ec-3cd497c96680-config-data\") pod \"horizon-fdb9f75f4-fr95n\" (UID: \"75ad0a08-756c-40ea-85ec-3cd497c96680\") " pod="openstack/horizon-fdb9f75f4-fr95n" Mar 10 08:21:55 crc kubenswrapper[4825]: I0310 08:21:55.463227 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/75ad0a08-756c-40ea-85ec-3cd497c96680-horizon-secret-key\") pod \"horizon-fdb9f75f4-fr95n\" (UID: \"75ad0a08-756c-40ea-85ec-3cd497c96680\") " pod="openstack/horizon-fdb9f75f4-fr95n" Mar 10 08:21:55 crc kubenswrapper[4825]: I0310 08:21:55.463464 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/75ad0a08-756c-40ea-85ec-3cd497c96680-horizon-tls-certs\") pod \"horizon-fdb9f75f4-fr95n\" (UID: \"75ad0a08-756c-40ea-85ec-3cd497c96680\") " pod="openstack/horizon-fdb9f75f4-fr95n" Mar 10 08:21:55 crc kubenswrapper[4825]: I0310 08:21:55.471447 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75ad0a08-756c-40ea-85ec-3cd497c96680-combined-ca-bundle\") pod \"horizon-fdb9f75f4-fr95n\" (UID: \"75ad0a08-756c-40ea-85ec-3cd497c96680\") " pod="openstack/horizon-fdb9f75f4-fr95n" Mar 10 08:21:55 crc kubenswrapper[4825]: I0310 08:21:55.476546 4825 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zk2k\" (UniqueName: \"kubernetes.io/projected/75ad0a08-756c-40ea-85ec-3cd497c96680-kube-api-access-4zk2k\") pod \"horizon-fdb9f75f4-fr95n\" (UID: \"75ad0a08-756c-40ea-85ec-3cd497c96680\") " pod="openstack/horizon-fdb9f75f4-fr95n" Mar 10 08:21:55 crc kubenswrapper[4825]: I0310 08:21:55.629395 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-fdb9f75f4-fr95n" Mar 10 08:21:56 crc kubenswrapper[4825]: I0310 08:21:56.127187 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-fdb9f75f4-fr95n"] Mar 10 08:21:56 crc kubenswrapper[4825]: I0310 08:21:56.530041 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-pv5lc"] Mar 10 08:21:56 crc kubenswrapper[4825]: I0310 08:21:56.531830 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-pv5lc" Mar 10 08:21:56 crc kubenswrapper[4825]: I0310 08:21:56.543289 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-pv5lc"] Mar 10 08:21:56 crc kubenswrapper[4825]: I0310 08:21:56.637315 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-2379-account-create-update-bq46z"] Mar 10 08:21:56 crc kubenswrapper[4825]: I0310 08:21:56.638892 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-2379-account-create-update-bq46z" Mar 10 08:21:56 crc kubenswrapper[4825]: I0310 08:21:56.646567 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Mar 10 08:21:56 crc kubenswrapper[4825]: I0310 08:21:56.650247 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-2379-account-create-update-bq46z"] Mar 10 08:21:56 crc kubenswrapper[4825]: I0310 08:21:56.679561 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjq7j\" (UniqueName: \"kubernetes.io/projected/320db231-7a1a-4a9c-aa42-dbc5593be19b-kube-api-access-xjq7j\") pod \"heat-db-create-pv5lc\" (UID: \"320db231-7a1a-4a9c-aa42-dbc5593be19b\") " pod="openstack/heat-db-create-pv5lc" Mar 10 08:21:56 crc kubenswrapper[4825]: I0310 08:21:56.679870 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/320db231-7a1a-4a9c-aa42-dbc5593be19b-operator-scripts\") pod \"heat-db-create-pv5lc\" (UID: \"320db231-7a1a-4a9c-aa42-dbc5593be19b\") " pod="openstack/heat-db-create-pv5lc" Mar 10 08:21:56 crc kubenswrapper[4825]: I0310 08:21:56.781332 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbwsx\" (UniqueName: \"kubernetes.io/projected/7e1976b7-f398-46fb-b376-64939c737883-kube-api-access-nbwsx\") pod \"heat-2379-account-create-update-bq46z\" (UID: \"7e1976b7-f398-46fb-b376-64939c737883\") " pod="openstack/heat-2379-account-create-update-bq46z" Mar 10 08:21:56 crc kubenswrapper[4825]: I0310 08:21:56.781420 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/320db231-7a1a-4a9c-aa42-dbc5593be19b-operator-scripts\") pod \"heat-db-create-pv5lc\" (UID: \"320db231-7a1a-4a9c-aa42-dbc5593be19b\") " 
pod="openstack/heat-db-create-pv5lc" Mar 10 08:21:56 crc kubenswrapper[4825]: I0310 08:21:56.781595 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjq7j\" (UniqueName: \"kubernetes.io/projected/320db231-7a1a-4a9c-aa42-dbc5593be19b-kube-api-access-xjq7j\") pod \"heat-db-create-pv5lc\" (UID: \"320db231-7a1a-4a9c-aa42-dbc5593be19b\") " pod="openstack/heat-db-create-pv5lc" Mar 10 08:21:56 crc kubenswrapper[4825]: I0310 08:21:56.781904 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e1976b7-f398-46fb-b376-64939c737883-operator-scripts\") pod \"heat-2379-account-create-update-bq46z\" (UID: \"7e1976b7-f398-46fb-b376-64939c737883\") " pod="openstack/heat-2379-account-create-update-bq46z" Mar 10 08:21:56 crc kubenswrapper[4825]: I0310 08:21:56.782256 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/320db231-7a1a-4a9c-aa42-dbc5593be19b-operator-scripts\") pod \"heat-db-create-pv5lc\" (UID: \"320db231-7a1a-4a9c-aa42-dbc5593be19b\") " pod="openstack/heat-db-create-pv5lc" Mar 10 08:21:56 crc kubenswrapper[4825]: I0310 08:21:56.797066 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjq7j\" (UniqueName: \"kubernetes.io/projected/320db231-7a1a-4a9c-aa42-dbc5593be19b-kube-api-access-xjq7j\") pod \"heat-db-create-pv5lc\" (UID: \"320db231-7a1a-4a9c-aa42-dbc5593be19b\") " pod="openstack/heat-db-create-pv5lc" Mar 10 08:21:56 crc kubenswrapper[4825]: I0310 08:21:56.815607 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-fdb9f75f4-fr95n" event={"ID":"75ad0a08-756c-40ea-85ec-3cd497c96680","Type":"ContainerStarted","Data":"98d21c61535d8f06f0d5e1f25bf1797e4da30481932c3b7e4f6be14da85f9bc1"} Mar 10 08:21:56 crc kubenswrapper[4825]: I0310 08:21:56.815655 4825 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/horizon-fdb9f75f4-fr95n" event={"ID":"75ad0a08-756c-40ea-85ec-3cd497c96680","Type":"ContainerStarted","Data":"e91661b6526a47b4f4a51ce33bc34d47fa0202e1851a145521dcd13c03f42b62"} Mar 10 08:21:56 crc kubenswrapper[4825]: I0310 08:21:56.815670 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-fdb9f75f4-fr95n" event={"ID":"75ad0a08-756c-40ea-85ec-3cd497c96680","Type":"ContainerStarted","Data":"16dfdd9b12f16a98d40c921996d63f690f122f9a4248cddaa09548a5576f468a"} Mar 10 08:21:56 crc kubenswrapper[4825]: I0310 08:21:56.839929 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-fdb9f75f4-fr95n" podStartSLOduration=1.839910875 podStartE2EDuration="1.839910875s" podCreationTimestamp="2026-03-10 08:21:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:21:56.836261139 +0000 UTC m=+5869.866041774" watchObservedRunningTime="2026-03-10 08:21:56.839910875 +0000 UTC m=+5869.869691490" Mar 10 08:21:56 crc kubenswrapper[4825]: I0310 08:21:56.872438 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-pv5lc" Mar 10 08:21:56 crc kubenswrapper[4825]: I0310 08:21:56.884039 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e1976b7-f398-46fb-b376-64939c737883-operator-scripts\") pod \"heat-2379-account-create-update-bq46z\" (UID: \"7e1976b7-f398-46fb-b376-64939c737883\") " pod="openstack/heat-2379-account-create-update-bq46z" Mar 10 08:21:56 crc kubenswrapper[4825]: I0310 08:21:56.884200 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbwsx\" (UniqueName: \"kubernetes.io/projected/7e1976b7-f398-46fb-b376-64939c737883-kube-api-access-nbwsx\") pod \"heat-2379-account-create-update-bq46z\" (UID: \"7e1976b7-f398-46fb-b376-64939c737883\") " pod="openstack/heat-2379-account-create-update-bq46z" Mar 10 08:21:56 crc kubenswrapper[4825]: I0310 08:21:56.885058 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e1976b7-f398-46fb-b376-64939c737883-operator-scripts\") pod \"heat-2379-account-create-update-bq46z\" (UID: \"7e1976b7-f398-46fb-b376-64939c737883\") " pod="openstack/heat-2379-account-create-update-bq46z" Mar 10 08:21:56 crc kubenswrapper[4825]: I0310 08:21:56.901600 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbwsx\" (UniqueName: \"kubernetes.io/projected/7e1976b7-f398-46fb-b376-64939c737883-kube-api-access-nbwsx\") pod \"heat-2379-account-create-update-bq46z\" (UID: \"7e1976b7-f398-46fb-b376-64939c737883\") " pod="openstack/heat-2379-account-create-update-bq46z" Mar 10 08:21:56 crc kubenswrapper[4825]: I0310 08:21:56.974962 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-2379-account-create-update-bq46z" Mar 10 08:21:57 crc kubenswrapper[4825]: I0310 08:21:57.307326 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-pv5lc"] Mar 10 08:21:57 crc kubenswrapper[4825]: W0310 08:21:57.311059 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod320db231_7a1a_4a9c_aa42_dbc5593be19b.slice/crio-026f983a7b1383241a9809caac6189515c4c9ce9783bb99149a21d28adb4af9f WatchSource:0}: Error finding container 026f983a7b1383241a9809caac6189515c4c9ce9783bb99149a21d28adb4af9f: Status 404 returned error can't find the container with id 026f983a7b1383241a9809caac6189515c4c9ce9783bb99149a21d28adb4af9f Mar 10 08:21:57 crc kubenswrapper[4825]: I0310 08:21:57.465698 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-2379-account-create-update-bq46z"] Mar 10 08:21:57 crc kubenswrapper[4825]: W0310 08:21:57.472432 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e1976b7_f398_46fb_b376_64939c737883.slice/crio-a3a07e23b0805b03bc4dd38994af55b356cdbe9d58f96b95e681e09f638bc13f WatchSource:0}: Error finding container a3a07e23b0805b03bc4dd38994af55b356cdbe9d58f96b95e681e09f638bc13f: Status 404 returned error can't find the container with id a3a07e23b0805b03bc4dd38994af55b356cdbe9d58f96b95e681e09f638bc13f Mar 10 08:21:57 crc kubenswrapper[4825]: I0310 08:21:57.827311 4825 generic.go:334] "Generic (PLEG): container finished" podID="320db231-7a1a-4a9c-aa42-dbc5593be19b" containerID="c7a2cc348757ad3a8facf25c670353306edff3a7e1faa407a581d0eb6786ec65" exitCode=0 Mar 10 08:21:57 crc kubenswrapper[4825]: I0310 08:21:57.828782 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-pv5lc" 
event={"ID":"320db231-7a1a-4a9c-aa42-dbc5593be19b","Type":"ContainerDied","Data":"c7a2cc348757ad3a8facf25c670353306edff3a7e1faa407a581d0eb6786ec65"} Mar 10 08:21:57 crc kubenswrapper[4825]: I0310 08:21:57.828898 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-pv5lc" event={"ID":"320db231-7a1a-4a9c-aa42-dbc5593be19b","Type":"ContainerStarted","Data":"026f983a7b1383241a9809caac6189515c4c9ce9783bb99149a21d28adb4af9f"} Mar 10 08:21:57 crc kubenswrapper[4825]: I0310 08:21:57.832663 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-2379-account-create-update-bq46z" event={"ID":"7e1976b7-f398-46fb-b376-64939c737883","Type":"ContainerStarted","Data":"7c3510e3af6f6a5910d61e21a734847a5d4ade589767e8f52829f5b891c450b7"} Mar 10 08:21:57 crc kubenswrapper[4825]: I0310 08:21:57.832849 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-2379-account-create-update-bq46z" event={"ID":"7e1976b7-f398-46fb-b376-64939c737883","Type":"ContainerStarted","Data":"a3a07e23b0805b03bc4dd38994af55b356cdbe9d58f96b95e681e09f638bc13f"} Mar 10 08:21:57 crc kubenswrapper[4825]: I0310 08:21:57.871505 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-2379-account-create-update-bq46z" podStartSLOduration=1.871475968 podStartE2EDuration="1.871475968s" podCreationTimestamp="2026-03-10 08:21:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:21:57.869353122 +0000 UTC m=+5870.899133757" watchObservedRunningTime="2026-03-10 08:21:57.871475968 +0000 UTC m=+5870.901256583" Mar 10 08:21:58 crc kubenswrapper[4825]: I0310 08:21:58.847206 4825 generic.go:334] "Generic (PLEG): container finished" podID="7e1976b7-f398-46fb-b376-64939c737883" containerID="7c3510e3af6f6a5910d61e21a734847a5d4ade589767e8f52829f5b891c450b7" exitCode=0 Mar 10 08:21:58 crc kubenswrapper[4825]: I0310 
08:21:58.847278 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-2379-account-create-update-bq46z" event={"ID":"7e1976b7-f398-46fb-b376-64939c737883","Type":"ContainerDied","Data":"7c3510e3af6f6a5910d61e21a734847a5d4ade589767e8f52829f5b891c450b7"} Mar 10 08:21:59 crc kubenswrapper[4825]: I0310 08:21:59.185800 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-pv5lc" Mar 10 08:21:59 crc kubenswrapper[4825]: I0310 08:21:59.230810 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/320db231-7a1a-4a9c-aa42-dbc5593be19b-operator-scripts\") pod \"320db231-7a1a-4a9c-aa42-dbc5593be19b\" (UID: \"320db231-7a1a-4a9c-aa42-dbc5593be19b\") " Mar 10 08:21:59 crc kubenswrapper[4825]: I0310 08:21:59.230885 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjq7j\" (UniqueName: \"kubernetes.io/projected/320db231-7a1a-4a9c-aa42-dbc5593be19b-kube-api-access-xjq7j\") pod \"320db231-7a1a-4a9c-aa42-dbc5593be19b\" (UID: \"320db231-7a1a-4a9c-aa42-dbc5593be19b\") " Mar 10 08:21:59 crc kubenswrapper[4825]: I0310 08:21:59.233021 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/320db231-7a1a-4a9c-aa42-dbc5593be19b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "320db231-7a1a-4a9c-aa42-dbc5593be19b" (UID: "320db231-7a1a-4a9c-aa42-dbc5593be19b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:21:59 crc kubenswrapper[4825]: I0310 08:21:59.254087 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/320db231-7a1a-4a9c-aa42-dbc5593be19b-kube-api-access-xjq7j" (OuterVolumeSpecName: "kube-api-access-xjq7j") pod "320db231-7a1a-4a9c-aa42-dbc5593be19b" (UID: "320db231-7a1a-4a9c-aa42-dbc5593be19b"). 
InnerVolumeSpecName "kube-api-access-xjq7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:21:59 crc kubenswrapper[4825]: I0310 08:21:59.332556 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/320db231-7a1a-4a9c-aa42-dbc5593be19b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 08:21:59 crc kubenswrapper[4825]: I0310 08:21:59.332588 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjq7j\" (UniqueName: \"kubernetes.io/projected/320db231-7a1a-4a9c-aa42-dbc5593be19b-kube-api-access-xjq7j\") on node \"crc\" DevicePath \"\"" Mar 10 08:21:59 crc kubenswrapper[4825]: I0310 08:21:59.858761 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-pv5lc" Mar 10 08:21:59 crc kubenswrapper[4825]: I0310 08:21:59.859851 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-pv5lc" event={"ID":"320db231-7a1a-4a9c-aa42-dbc5593be19b","Type":"ContainerDied","Data":"026f983a7b1383241a9809caac6189515c4c9ce9783bb99149a21d28adb4af9f"} Mar 10 08:21:59 crc kubenswrapper[4825]: I0310 08:21:59.859882 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="026f983a7b1383241a9809caac6189515c4c9ce9783bb99149a21d28adb4af9f" Mar 10 08:22:00 crc kubenswrapper[4825]: I0310 08:22:00.145718 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552182-jvs86"] Mar 10 08:22:00 crc kubenswrapper[4825]: E0310 08:22:00.146497 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="320db231-7a1a-4a9c-aa42-dbc5593be19b" containerName="mariadb-database-create" Mar 10 08:22:00 crc kubenswrapper[4825]: I0310 08:22:00.146517 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="320db231-7a1a-4a9c-aa42-dbc5593be19b" containerName="mariadb-database-create" Mar 10 08:22:00 crc kubenswrapper[4825]: I0310 
08:22:00.146935 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="320db231-7a1a-4a9c-aa42-dbc5593be19b" containerName="mariadb-database-create" Mar 10 08:22:00 crc kubenswrapper[4825]: I0310 08:22:00.147918 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552182-jvs86" Mar 10 08:22:00 crc kubenswrapper[4825]: I0310 08:22:00.150428 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 08:22:00 crc kubenswrapper[4825]: I0310 08:22:00.150595 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 08:22:00 crc kubenswrapper[4825]: I0310 08:22:00.150705 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 08:22:00 crc kubenswrapper[4825]: I0310 08:22:00.156681 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552182-jvs86"] Mar 10 08:22:00 crc kubenswrapper[4825]: I0310 08:22:00.247468 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-2379-account-create-update-bq46z" Mar 10 08:22:00 crc kubenswrapper[4825]: I0310 08:22:00.248346 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk5fq\" (UniqueName: \"kubernetes.io/projected/fa2c9226-c27e-44da-881d-2495b13dbffe-kube-api-access-hk5fq\") pod \"auto-csr-approver-29552182-jvs86\" (UID: \"fa2c9226-c27e-44da-881d-2495b13dbffe\") " pod="openshift-infra/auto-csr-approver-29552182-jvs86" Mar 10 08:22:00 crc kubenswrapper[4825]: I0310 08:22:00.349584 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e1976b7-f398-46fb-b376-64939c737883-operator-scripts\") pod \"7e1976b7-f398-46fb-b376-64939c737883\" (UID: \"7e1976b7-f398-46fb-b376-64939c737883\") " Mar 10 08:22:00 crc kubenswrapper[4825]: I0310 08:22:00.349701 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbwsx\" (UniqueName: \"kubernetes.io/projected/7e1976b7-f398-46fb-b376-64939c737883-kube-api-access-nbwsx\") pod \"7e1976b7-f398-46fb-b376-64939c737883\" (UID: \"7e1976b7-f398-46fb-b376-64939c737883\") " Mar 10 08:22:00 crc kubenswrapper[4825]: I0310 08:22:00.349946 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hk5fq\" (UniqueName: \"kubernetes.io/projected/fa2c9226-c27e-44da-881d-2495b13dbffe-kube-api-access-hk5fq\") pod \"auto-csr-approver-29552182-jvs86\" (UID: \"fa2c9226-c27e-44da-881d-2495b13dbffe\") " pod="openshift-infra/auto-csr-approver-29552182-jvs86" Mar 10 08:22:00 crc kubenswrapper[4825]: I0310 08:22:00.350386 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e1976b7-f398-46fb-b376-64939c737883-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7e1976b7-f398-46fb-b376-64939c737883" (UID: 
"7e1976b7-f398-46fb-b376-64939c737883"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:22:00 crc kubenswrapper[4825]: I0310 08:22:00.355320 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e1976b7-f398-46fb-b376-64939c737883-kube-api-access-nbwsx" (OuterVolumeSpecName: "kube-api-access-nbwsx") pod "7e1976b7-f398-46fb-b376-64939c737883" (UID: "7e1976b7-f398-46fb-b376-64939c737883"). InnerVolumeSpecName "kube-api-access-nbwsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:22:00 crc kubenswrapper[4825]: I0310 08:22:00.365385 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk5fq\" (UniqueName: \"kubernetes.io/projected/fa2c9226-c27e-44da-881d-2495b13dbffe-kube-api-access-hk5fq\") pod \"auto-csr-approver-29552182-jvs86\" (UID: \"fa2c9226-c27e-44da-881d-2495b13dbffe\") " pod="openshift-infra/auto-csr-approver-29552182-jvs86" Mar 10 08:22:00 crc kubenswrapper[4825]: I0310 08:22:00.452449 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e1976b7-f398-46fb-b376-64939c737883-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 08:22:00 crc kubenswrapper[4825]: I0310 08:22:00.452486 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbwsx\" (UniqueName: \"kubernetes.io/projected/7e1976b7-f398-46fb-b376-64939c737883-kube-api-access-nbwsx\") on node \"crc\" DevicePath \"\"" Mar 10 08:22:00 crc kubenswrapper[4825]: I0310 08:22:00.562783 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552182-jvs86" Mar 10 08:22:00 crc kubenswrapper[4825]: I0310 08:22:00.868680 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-2379-account-create-update-bq46z" event={"ID":"7e1976b7-f398-46fb-b376-64939c737883","Type":"ContainerDied","Data":"a3a07e23b0805b03bc4dd38994af55b356cdbe9d58f96b95e681e09f638bc13f"} Mar 10 08:22:00 crc kubenswrapper[4825]: I0310 08:22:00.869005 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3a07e23b0805b03bc4dd38994af55b356cdbe9d58f96b95e681e09f638bc13f" Mar 10 08:22:00 crc kubenswrapper[4825]: I0310 08:22:00.868730 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-2379-account-create-update-bq46z" Mar 10 08:22:01 crc kubenswrapper[4825]: W0310 08:22:00.999112 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa2c9226_c27e_44da_881d_2495b13dbffe.slice/crio-b836800781e27ceacbac915c71e1a162b75affc4005902bf2b6af5911a259076 WatchSource:0}: Error finding container b836800781e27ceacbac915c71e1a162b75affc4005902bf2b6af5911a259076: Status 404 returned error can't find the container with id b836800781e27ceacbac915c71e1a162b75affc4005902bf2b6af5911a259076 Mar 10 08:22:01 crc kubenswrapper[4825]: I0310 08:22:01.003237 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552182-jvs86"] Mar 10 08:22:01 crc kubenswrapper[4825]: I0310 08:22:01.887054 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552182-jvs86" event={"ID":"fa2c9226-c27e-44da-881d-2495b13dbffe","Type":"ContainerStarted","Data":"b836800781e27ceacbac915c71e1a162b75affc4005902bf2b6af5911a259076"} Mar 10 08:22:01 crc kubenswrapper[4825]: I0310 08:22:01.912927 4825 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/heat-db-sync-fv2c7"] Mar 10 08:22:01 crc kubenswrapper[4825]: E0310 08:22:01.914095 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e1976b7-f398-46fb-b376-64939c737883" containerName="mariadb-account-create-update" Mar 10 08:22:01 crc kubenswrapper[4825]: I0310 08:22:01.914120 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e1976b7-f398-46fb-b376-64939c737883" containerName="mariadb-account-create-update" Mar 10 08:22:01 crc kubenswrapper[4825]: I0310 08:22:01.914346 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e1976b7-f398-46fb-b376-64939c737883" containerName="mariadb-account-create-update" Mar 10 08:22:01 crc kubenswrapper[4825]: I0310 08:22:01.918162 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-fv2c7" Mar 10 08:22:01 crc kubenswrapper[4825]: I0310 08:22:01.920985 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Mar 10 08:22:01 crc kubenswrapper[4825]: I0310 08:22:01.921291 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-2shjq" Mar 10 08:22:01 crc kubenswrapper[4825]: I0310 08:22:01.925414 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-fv2c7"] Mar 10 08:22:01 crc kubenswrapper[4825]: I0310 08:22:01.982938 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca-config-data\") pod \"heat-db-sync-fv2c7\" (UID: \"d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca\") " pod="openstack/heat-db-sync-fv2c7" Mar 10 08:22:01 crc kubenswrapper[4825]: I0310 08:22:01.983161 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-586jk\" (UniqueName: 
\"kubernetes.io/projected/d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca-kube-api-access-586jk\") pod \"heat-db-sync-fv2c7\" (UID: \"d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca\") " pod="openstack/heat-db-sync-fv2c7" Mar 10 08:22:01 crc kubenswrapper[4825]: I0310 08:22:01.983258 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca-combined-ca-bundle\") pod \"heat-db-sync-fv2c7\" (UID: \"d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca\") " pod="openstack/heat-db-sync-fv2c7" Mar 10 08:22:02 crc kubenswrapper[4825]: I0310 08:22:02.084165 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca-config-data\") pod \"heat-db-sync-fv2c7\" (UID: \"d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca\") " pod="openstack/heat-db-sync-fv2c7" Mar 10 08:22:02 crc kubenswrapper[4825]: I0310 08:22:02.084223 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-586jk\" (UniqueName: \"kubernetes.io/projected/d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca-kube-api-access-586jk\") pod \"heat-db-sync-fv2c7\" (UID: \"d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca\") " pod="openstack/heat-db-sync-fv2c7" Mar 10 08:22:02 crc kubenswrapper[4825]: I0310 08:22:02.084257 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca-combined-ca-bundle\") pod \"heat-db-sync-fv2c7\" (UID: \"d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca\") " pod="openstack/heat-db-sync-fv2c7" Mar 10 08:22:02 crc kubenswrapper[4825]: I0310 08:22:02.090520 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca-config-data\") pod \"heat-db-sync-fv2c7\" (UID: 
\"d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca\") " pod="openstack/heat-db-sync-fv2c7" Mar 10 08:22:02 crc kubenswrapper[4825]: I0310 08:22:02.090791 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca-combined-ca-bundle\") pod \"heat-db-sync-fv2c7\" (UID: \"d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca\") " pod="openstack/heat-db-sync-fv2c7" Mar 10 08:22:02 crc kubenswrapper[4825]: I0310 08:22:02.122655 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-586jk\" (UniqueName: \"kubernetes.io/projected/d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca-kube-api-access-586jk\") pod \"heat-db-sync-fv2c7\" (UID: \"d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca\") " pod="openstack/heat-db-sync-fv2c7" Mar 10 08:22:02 crc kubenswrapper[4825]: I0310 08:22:02.245655 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-fv2c7" Mar 10 08:22:02 crc kubenswrapper[4825]: I0310 08:22:02.771908 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-fv2c7"] Mar 10 08:22:02 crc kubenswrapper[4825]: I0310 08:22:02.897064 4825 generic.go:334] "Generic (PLEG): container finished" podID="fa2c9226-c27e-44da-881d-2495b13dbffe" containerID="4ffa109e6f375d70cdfc3ad26b17ca5dcedd80479761ada3645788f7f2246e93" exitCode=0 Mar 10 08:22:02 crc kubenswrapper[4825]: I0310 08:22:02.897150 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552182-jvs86" event={"ID":"fa2c9226-c27e-44da-881d-2495b13dbffe","Type":"ContainerDied","Data":"4ffa109e6f375d70cdfc3ad26b17ca5dcedd80479761ada3645788f7f2246e93"} Mar 10 08:22:02 crc kubenswrapper[4825]: I0310 08:22:02.898775 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-fv2c7" 
event={"ID":"d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca","Type":"ContainerStarted","Data":"fbe025e3678c6e6df1f0f6c173df1d5e0c195636c4b6828cd57633597f21567b"} Mar 10 08:22:04 crc kubenswrapper[4825]: I0310 08:22:04.260115 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552182-jvs86" Mar 10 08:22:04 crc kubenswrapper[4825]: I0310 08:22:04.427271 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hk5fq\" (UniqueName: \"kubernetes.io/projected/fa2c9226-c27e-44da-881d-2495b13dbffe-kube-api-access-hk5fq\") pod \"fa2c9226-c27e-44da-881d-2495b13dbffe\" (UID: \"fa2c9226-c27e-44da-881d-2495b13dbffe\") " Mar 10 08:22:04 crc kubenswrapper[4825]: I0310 08:22:04.432927 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa2c9226-c27e-44da-881d-2495b13dbffe-kube-api-access-hk5fq" (OuterVolumeSpecName: "kube-api-access-hk5fq") pod "fa2c9226-c27e-44da-881d-2495b13dbffe" (UID: "fa2c9226-c27e-44da-881d-2495b13dbffe"). InnerVolumeSpecName "kube-api-access-hk5fq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:22:04 crc kubenswrapper[4825]: I0310 08:22:04.530643 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hk5fq\" (UniqueName: \"kubernetes.io/projected/fa2c9226-c27e-44da-881d-2495b13dbffe-kube-api-access-hk5fq\") on node \"crc\" DevicePath \"\"" Mar 10 08:22:04 crc kubenswrapper[4825]: I0310 08:22:04.923240 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552182-jvs86" event={"ID":"fa2c9226-c27e-44da-881d-2495b13dbffe","Type":"ContainerDied","Data":"b836800781e27ceacbac915c71e1a162b75affc4005902bf2b6af5911a259076"} Mar 10 08:22:04 crc kubenswrapper[4825]: I0310 08:22:04.923570 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b836800781e27ceacbac915c71e1a162b75affc4005902bf2b6af5911a259076" Mar 10 08:22:04 crc kubenswrapper[4825]: I0310 08:22:04.923289 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552182-jvs86" Mar 10 08:22:05 crc kubenswrapper[4825]: I0310 08:22:05.332803 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552176-gkhkl"] Mar 10 08:22:05 crc kubenswrapper[4825]: I0310 08:22:05.342096 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552176-gkhkl"] Mar 10 08:22:05 crc kubenswrapper[4825]: I0310 08:22:05.629506 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-fdb9f75f4-fr95n" Mar 10 08:22:05 crc kubenswrapper[4825]: I0310 08:22:05.629611 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-fdb9f75f4-fr95n" Mar 10 08:22:07 crc kubenswrapper[4825]: I0310 08:22:07.255693 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b6e77c2-7c47-47cb-b6ab-d44cbba3ade6" 
path="/var/lib/kubelet/pods/2b6e77c2-7c47-47cb-b6ab-d44cbba3ade6/volumes" Mar 10 08:22:12 crc kubenswrapper[4825]: I0310 08:22:12.000912 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-fv2c7" event={"ID":"d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca","Type":"ContainerStarted","Data":"b1ef5efcc7904aa7c5ade0c1281756ff3bbc47d6ded7eb42771dd36a36df9448"} Mar 10 08:22:12 crc kubenswrapper[4825]: I0310 08:22:12.021861 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-fv2c7" podStartSLOduration=2.054736198 podStartE2EDuration="11.021845579s" podCreationTimestamp="2026-03-10 08:22:01 +0000 UTC" firstStartedPulling="2026-03-10 08:22:02.775409384 +0000 UTC m=+5875.805189999" lastFinishedPulling="2026-03-10 08:22:11.742518765 +0000 UTC m=+5884.772299380" observedRunningTime="2026-03-10 08:22:12.019785694 +0000 UTC m=+5885.049566309" watchObservedRunningTime="2026-03-10 08:22:12.021845579 +0000 UTC m=+5885.051626184" Mar 10 08:22:15 crc kubenswrapper[4825]: I0310 08:22:15.042440 4825 generic.go:334] "Generic (PLEG): container finished" podID="d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca" containerID="b1ef5efcc7904aa7c5ade0c1281756ff3bbc47d6ded7eb42771dd36a36df9448" exitCode=0 Mar 10 08:22:15 crc kubenswrapper[4825]: I0310 08:22:15.042930 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-fv2c7" event={"ID":"d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca","Type":"ContainerDied","Data":"b1ef5efcc7904aa7c5ade0c1281756ff3bbc47d6ded7eb42771dd36a36df9448"} Mar 10 08:22:15 crc kubenswrapper[4825]: I0310 08:22:15.631526 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-fdb9f75f4-fr95n" podUID="75ad0a08-756c-40ea-85ec-3cd497c96680" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.143:8443: connect: connection refused" Mar 10 08:22:16 crc kubenswrapper[4825]: I0310 
08:22:16.415385 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-fv2c7" Mar 10 08:22:16 crc kubenswrapper[4825]: I0310 08:22:16.584648 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca-combined-ca-bundle\") pod \"d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca\" (UID: \"d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca\") " Mar 10 08:22:16 crc kubenswrapper[4825]: I0310 08:22:16.584779 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-586jk\" (UniqueName: \"kubernetes.io/projected/d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca-kube-api-access-586jk\") pod \"d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca\" (UID: \"d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca\") " Mar 10 08:22:16 crc kubenswrapper[4825]: I0310 08:22:16.584822 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca-config-data\") pod \"d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca\" (UID: \"d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca\") " Mar 10 08:22:16 crc kubenswrapper[4825]: I0310 08:22:16.608252 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca-kube-api-access-586jk" (OuterVolumeSpecName: "kube-api-access-586jk") pod "d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca" (UID: "d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca"). InnerVolumeSpecName "kube-api-access-586jk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:22:16 crc kubenswrapper[4825]: I0310 08:22:16.622959 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca" (UID: "d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:22:16 crc kubenswrapper[4825]: I0310 08:22:16.688459 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca-config-data" (OuterVolumeSpecName: "config-data") pod "d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca" (UID: "d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:22:16 crc kubenswrapper[4825]: I0310 08:22:16.689286 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca-config-data\") pod \"d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca\" (UID: \"d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca\") " Mar 10 08:22:16 crc kubenswrapper[4825]: W0310 08:22:16.689425 4825 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca/volumes/kubernetes.io~secret/config-data Mar 10 08:22:16 crc kubenswrapper[4825]: I0310 08:22:16.689452 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca-config-data" (OuterVolumeSpecName: "config-data") pod "d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca" (UID: "d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:22:16 crc kubenswrapper[4825]: I0310 08:22:16.689837 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:22:16 crc kubenswrapper[4825]: I0310 08:22:16.689863 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-586jk\" (UniqueName: \"kubernetes.io/projected/d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca-kube-api-access-586jk\") on node \"crc\" DevicePath \"\"" Mar 10 08:22:16 crc kubenswrapper[4825]: I0310 08:22:16.689876 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 08:22:16 crc kubenswrapper[4825]: I0310 08:22:16.888877 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 08:22:16 crc kubenswrapper[4825]: I0310 08:22:16.889434 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 08:22:16 crc kubenswrapper[4825]: I0310 08:22:16.889619 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" Mar 10 08:22:16 crc kubenswrapper[4825]: I0310 08:22:16.891632 4825 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"527769f1bd8bc09217383522eb96db2b542ebee6be919f1442be652e8e45cf86"} pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 08:22:16 crc kubenswrapper[4825]: I0310 08:22:16.891748 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" containerID="cri-o://527769f1bd8bc09217383522eb96db2b542ebee6be919f1442be652e8e45cf86" gracePeriod=600 Mar 10 08:22:17 crc kubenswrapper[4825]: I0310 08:22:17.082185 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-fv2c7" event={"ID":"d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca","Type":"ContainerDied","Data":"fbe025e3678c6e6df1f0f6c173df1d5e0c195636c4b6828cd57633597f21567b"} Mar 10 08:22:17 crc kubenswrapper[4825]: I0310 08:22:17.082224 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbe025e3678c6e6df1f0f6c173df1d5e0c195636c4b6828cd57633597f21567b" Mar 10 08:22:17 crc kubenswrapper[4825]: I0310 08:22:17.082281 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-fv2c7" Mar 10 08:22:17 crc kubenswrapper[4825]: I0310 08:22:17.088183 4825 generic.go:334] "Generic (PLEG): container finished" podID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerID="527769f1bd8bc09217383522eb96db2b542ebee6be919f1442be652e8e45cf86" exitCode=0 Mar 10 08:22:17 crc kubenswrapper[4825]: I0310 08:22:17.088517 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerDied","Data":"527769f1bd8bc09217383522eb96db2b542ebee6be919f1442be652e8e45cf86"} Mar 10 08:22:17 crc kubenswrapper[4825]: I0310 08:22:17.088568 4825 scope.go:117] "RemoveContainer" containerID="cc40bbe7e5127cd8ca03c26dd788c9bfc628fd248484a1f5f73946709b5b4414" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.108573 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerStarted","Data":"e66b7b78375632673167d52203536b8000b1c9a9d311efe989c77daf24bd4cfc"} Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.139996 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-775b55954d-5d5kw"] Mar 10 08:22:18 crc kubenswrapper[4825]: E0310 08:22:18.140477 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca" containerName="heat-db-sync" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.140494 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca" containerName="heat-db-sync" Mar 10 08:22:18 crc kubenswrapper[4825]: E0310 08:22:18.140506 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa2c9226-c27e-44da-881d-2495b13dbffe" containerName="oc" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.140513 4825 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="fa2c9226-c27e-44da-881d-2495b13dbffe" containerName="oc" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.140717 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca" containerName="heat-db-sync" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.140754 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa2c9226-c27e-44da-881d-2495b13dbffe" containerName="oc" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.141624 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-775b55954d-5d5kw" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.149654 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-2shjq" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.149780 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.149685 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.178998 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-775b55954d-5d5kw"] Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.320319 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d4b56d5-815b-43ef-851c-78be8efed9d8-combined-ca-bundle\") pod \"heat-engine-775b55954d-5d5kw\" (UID: \"6d4b56d5-815b-43ef-851c-78be8efed9d8\") " pod="openstack/heat-engine-775b55954d-5d5kw" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.320498 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/6d4b56d5-815b-43ef-851c-78be8efed9d8-config-data-custom\") pod \"heat-engine-775b55954d-5d5kw\" (UID: \"6d4b56d5-815b-43ef-851c-78be8efed9d8\") " pod="openstack/heat-engine-775b55954d-5d5kw" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.320552 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d4b56d5-815b-43ef-851c-78be8efed9d8-config-data\") pod \"heat-engine-775b55954d-5d5kw\" (UID: \"6d4b56d5-815b-43ef-851c-78be8efed9d8\") " pod="openstack/heat-engine-775b55954d-5d5kw" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.320593 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2lz4\" (UniqueName: \"kubernetes.io/projected/6d4b56d5-815b-43ef-851c-78be8efed9d8-kube-api-access-p2lz4\") pod \"heat-engine-775b55954d-5d5kw\" (UID: \"6d4b56d5-815b-43ef-851c-78be8efed9d8\") " pod="openstack/heat-engine-775b55954d-5d5kw" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.334471 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-6947b5fb54-lvrgg"] Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.335648 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6947b5fb54-lvrgg" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.341884 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.347063 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6947b5fb54-lvrgg"] Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.356699 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-9c4964b68-jlxvc"] Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.358444 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-9c4964b68-jlxvc" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.362436 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.369184 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-9c4964b68-jlxvc"] Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.422952 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2424da63-7441-4f0f-bbd3-532e759a5b36-config-data\") pod \"heat-api-6947b5fb54-lvrgg\" (UID: \"2424da63-7441-4f0f-bbd3-532e759a5b36\") " pod="openstack/heat-api-6947b5fb54-lvrgg" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.423025 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2424da63-7441-4f0f-bbd3-532e759a5b36-config-data-custom\") pod \"heat-api-6947b5fb54-lvrgg\" (UID: \"2424da63-7441-4f0f-bbd3-532e759a5b36\") " pod="openstack/heat-api-6947b5fb54-lvrgg" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.423068 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d4b56d5-815b-43ef-851c-78be8efed9d8-config-data\") pod \"heat-engine-775b55954d-5d5kw\" (UID: \"6d4b56d5-815b-43ef-851c-78be8efed9d8\") " pod="openstack/heat-engine-775b55954d-5d5kw" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.423120 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2lz4\" (UniqueName: \"kubernetes.io/projected/6d4b56d5-815b-43ef-851c-78be8efed9d8-kube-api-access-p2lz4\") pod \"heat-engine-775b55954d-5d5kw\" (UID: \"6d4b56d5-815b-43ef-851c-78be8efed9d8\") " pod="openstack/heat-engine-775b55954d-5d5kw" Mar 10 08:22:18 
crc kubenswrapper[4825]: I0310 08:22:18.423187 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2424da63-7441-4f0f-bbd3-532e759a5b36-combined-ca-bundle\") pod \"heat-api-6947b5fb54-lvrgg\" (UID: \"2424da63-7441-4f0f-bbd3-532e759a5b36\") " pod="openstack/heat-api-6947b5fb54-lvrgg" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.423264 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk4g2\" (UniqueName: \"kubernetes.io/projected/2424da63-7441-4f0f-bbd3-532e759a5b36-kube-api-access-sk4g2\") pod \"heat-api-6947b5fb54-lvrgg\" (UID: \"2424da63-7441-4f0f-bbd3-532e759a5b36\") " pod="openstack/heat-api-6947b5fb54-lvrgg" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.423307 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d4b56d5-815b-43ef-851c-78be8efed9d8-combined-ca-bundle\") pod \"heat-engine-775b55954d-5d5kw\" (UID: \"6d4b56d5-815b-43ef-851c-78be8efed9d8\") " pod="openstack/heat-engine-775b55954d-5d5kw" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.423426 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d4b56d5-815b-43ef-851c-78be8efed9d8-config-data-custom\") pod \"heat-engine-775b55954d-5d5kw\" (UID: \"6d4b56d5-815b-43ef-851c-78be8efed9d8\") " pod="openstack/heat-engine-775b55954d-5d5kw" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.431081 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d4b56d5-815b-43ef-851c-78be8efed9d8-config-data\") pod \"heat-engine-775b55954d-5d5kw\" (UID: \"6d4b56d5-815b-43ef-851c-78be8efed9d8\") " pod="openstack/heat-engine-775b55954d-5d5kw" Mar 10 08:22:18 crc 
kubenswrapper[4825]: I0310 08:22:18.441408 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d4b56d5-815b-43ef-851c-78be8efed9d8-config-data-custom\") pod \"heat-engine-775b55954d-5d5kw\" (UID: \"6d4b56d5-815b-43ef-851c-78be8efed9d8\") " pod="openstack/heat-engine-775b55954d-5d5kw" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.441804 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2lz4\" (UniqueName: \"kubernetes.io/projected/6d4b56d5-815b-43ef-851c-78be8efed9d8-kube-api-access-p2lz4\") pod \"heat-engine-775b55954d-5d5kw\" (UID: \"6d4b56d5-815b-43ef-851c-78be8efed9d8\") " pod="openstack/heat-engine-775b55954d-5d5kw" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.449077 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d4b56d5-815b-43ef-851c-78be8efed9d8-combined-ca-bundle\") pod \"heat-engine-775b55954d-5d5kw\" (UID: \"6d4b56d5-815b-43ef-851c-78be8efed9d8\") " pod="openstack/heat-engine-775b55954d-5d5kw" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.473918 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-775b55954d-5d5kw" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.525496 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2424da63-7441-4f0f-bbd3-532e759a5b36-config-data\") pod \"heat-api-6947b5fb54-lvrgg\" (UID: \"2424da63-7441-4f0f-bbd3-532e759a5b36\") " pod="openstack/heat-api-6947b5fb54-lvrgg" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.525546 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2424da63-7441-4f0f-bbd3-532e759a5b36-config-data-custom\") pod \"heat-api-6947b5fb54-lvrgg\" (UID: \"2424da63-7441-4f0f-bbd3-532e759a5b36\") " pod="openstack/heat-api-6947b5fb54-lvrgg" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.525568 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58f58fb9-cd6f-49e4-945f-a3fd33bf45f2-config-data\") pod \"heat-cfnapi-9c4964b68-jlxvc\" (UID: \"58f58fb9-cd6f-49e4-945f-a3fd33bf45f2\") " pod="openstack/heat-cfnapi-9c4964b68-jlxvc" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.525603 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz97s\" (UniqueName: \"kubernetes.io/projected/58f58fb9-cd6f-49e4-945f-a3fd33bf45f2-kube-api-access-bz97s\") pod \"heat-cfnapi-9c4964b68-jlxvc\" (UID: \"58f58fb9-cd6f-49e4-945f-a3fd33bf45f2\") " pod="openstack/heat-cfnapi-9c4964b68-jlxvc" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.525623 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2424da63-7441-4f0f-bbd3-532e759a5b36-combined-ca-bundle\") pod \"heat-api-6947b5fb54-lvrgg\" (UID: \"2424da63-7441-4f0f-bbd3-532e759a5b36\") " 
pod="openstack/heat-api-6947b5fb54-lvrgg" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.525661 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58f58fb9-cd6f-49e4-945f-a3fd33bf45f2-config-data-custom\") pod \"heat-cfnapi-9c4964b68-jlxvc\" (UID: \"58f58fb9-cd6f-49e4-945f-a3fd33bf45f2\") " pod="openstack/heat-cfnapi-9c4964b68-jlxvc" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.525678 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58f58fb9-cd6f-49e4-945f-a3fd33bf45f2-combined-ca-bundle\") pod \"heat-cfnapi-9c4964b68-jlxvc\" (UID: \"58f58fb9-cd6f-49e4-945f-a3fd33bf45f2\") " pod="openstack/heat-cfnapi-9c4964b68-jlxvc" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.525703 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sk4g2\" (UniqueName: \"kubernetes.io/projected/2424da63-7441-4f0f-bbd3-532e759a5b36-kube-api-access-sk4g2\") pod \"heat-api-6947b5fb54-lvrgg\" (UID: \"2424da63-7441-4f0f-bbd3-532e759a5b36\") " pod="openstack/heat-api-6947b5fb54-lvrgg" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.530475 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2424da63-7441-4f0f-bbd3-532e759a5b36-config-data\") pod \"heat-api-6947b5fb54-lvrgg\" (UID: \"2424da63-7441-4f0f-bbd3-532e759a5b36\") " pod="openstack/heat-api-6947b5fb54-lvrgg" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.532040 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2424da63-7441-4f0f-bbd3-532e759a5b36-config-data-custom\") pod \"heat-api-6947b5fb54-lvrgg\" (UID: \"2424da63-7441-4f0f-bbd3-532e759a5b36\") " 
pod="openstack/heat-api-6947b5fb54-lvrgg" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.537155 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2424da63-7441-4f0f-bbd3-532e759a5b36-combined-ca-bundle\") pod \"heat-api-6947b5fb54-lvrgg\" (UID: \"2424da63-7441-4f0f-bbd3-532e759a5b36\") " pod="openstack/heat-api-6947b5fb54-lvrgg" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.567422 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk4g2\" (UniqueName: \"kubernetes.io/projected/2424da63-7441-4f0f-bbd3-532e759a5b36-kube-api-access-sk4g2\") pod \"heat-api-6947b5fb54-lvrgg\" (UID: \"2424da63-7441-4f0f-bbd3-532e759a5b36\") " pod="openstack/heat-api-6947b5fb54-lvrgg" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.627755 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58f58fb9-cd6f-49e4-945f-a3fd33bf45f2-config-data\") pod \"heat-cfnapi-9c4964b68-jlxvc\" (UID: \"58f58fb9-cd6f-49e4-945f-a3fd33bf45f2\") " pod="openstack/heat-cfnapi-9c4964b68-jlxvc" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.627811 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz97s\" (UniqueName: \"kubernetes.io/projected/58f58fb9-cd6f-49e4-945f-a3fd33bf45f2-kube-api-access-bz97s\") pod \"heat-cfnapi-9c4964b68-jlxvc\" (UID: \"58f58fb9-cd6f-49e4-945f-a3fd33bf45f2\") " pod="openstack/heat-cfnapi-9c4964b68-jlxvc" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.627859 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58f58fb9-cd6f-49e4-945f-a3fd33bf45f2-config-data-custom\") pod \"heat-cfnapi-9c4964b68-jlxvc\" (UID: \"58f58fb9-cd6f-49e4-945f-a3fd33bf45f2\") " pod="openstack/heat-cfnapi-9c4964b68-jlxvc" Mar 10 08:22:18 
crc kubenswrapper[4825]: I0310 08:22:18.627877 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58f58fb9-cd6f-49e4-945f-a3fd33bf45f2-combined-ca-bundle\") pod \"heat-cfnapi-9c4964b68-jlxvc\" (UID: \"58f58fb9-cd6f-49e4-945f-a3fd33bf45f2\") " pod="openstack/heat-cfnapi-9c4964b68-jlxvc" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.633018 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58f58fb9-cd6f-49e4-945f-a3fd33bf45f2-config-data\") pod \"heat-cfnapi-9c4964b68-jlxvc\" (UID: \"58f58fb9-cd6f-49e4-945f-a3fd33bf45f2\") " pod="openstack/heat-cfnapi-9c4964b68-jlxvc" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.636044 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58f58fb9-cd6f-49e4-945f-a3fd33bf45f2-combined-ca-bundle\") pod \"heat-cfnapi-9c4964b68-jlxvc\" (UID: \"58f58fb9-cd6f-49e4-945f-a3fd33bf45f2\") " pod="openstack/heat-cfnapi-9c4964b68-jlxvc" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.649689 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58f58fb9-cd6f-49e4-945f-a3fd33bf45f2-config-data-custom\") pod \"heat-cfnapi-9c4964b68-jlxvc\" (UID: \"58f58fb9-cd6f-49e4-945f-a3fd33bf45f2\") " pod="openstack/heat-cfnapi-9c4964b68-jlxvc" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.652300 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-6947b5fb54-lvrgg" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.652332 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz97s\" (UniqueName: \"kubernetes.io/projected/58f58fb9-cd6f-49e4-945f-a3fd33bf45f2-kube-api-access-bz97s\") pod \"heat-cfnapi-9c4964b68-jlxvc\" (UID: \"58f58fb9-cd6f-49e4-945f-a3fd33bf45f2\") " pod="openstack/heat-cfnapi-9c4964b68-jlxvc" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.684601 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-9c4964b68-jlxvc" Mar 10 08:22:18 crc kubenswrapper[4825]: I0310 08:22:18.973230 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-775b55954d-5d5kw"] Mar 10 08:22:19 crc kubenswrapper[4825]: I0310 08:22:19.127643 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-775b55954d-5d5kw" event={"ID":"6d4b56d5-815b-43ef-851c-78be8efed9d8","Type":"ContainerStarted","Data":"524797310f525b71621f2312224546c45dd783789efb288458aec0626edfa4a4"} Mar 10 08:22:19 crc kubenswrapper[4825]: I0310 08:22:19.163152 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6947b5fb54-lvrgg"] Mar 10 08:22:19 crc kubenswrapper[4825]: W0310 08:22:19.165232 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2424da63_7441_4f0f_bbd3_532e759a5b36.slice/crio-2494aaee240fd33d943d0ed60739e800a66dae8af5f8182cf4ed4a8e13c0b7ca WatchSource:0}: Error finding container 2494aaee240fd33d943d0ed60739e800a66dae8af5f8182cf4ed4a8e13c0b7ca: Status 404 returned error can't find the container with id 2494aaee240fd33d943d0ed60739e800a66dae8af5f8182cf4ed4a8e13c0b7ca Mar 10 08:22:19 crc kubenswrapper[4825]: I0310 08:22:19.300483 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-9c4964b68-jlxvc"] Mar 10 
08:22:20 crc kubenswrapper[4825]: I0310 08:22:20.135744 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6947b5fb54-lvrgg" event={"ID":"2424da63-7441-4f0f-bbd3-532e759a5b36","Type":"ContainerStarted","Data":"2494aaee240fd33d943d0ed60739e800a66dae8af5f8182cf4ed4a8e13c0b7ca"} Mar 10 08:22:20 crc kubenswrapper[4825]: I0310 08:22:20.138106 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-9c4964b68-jlxvc" event={"ID":"58f58fb9-cd6f-49e4-945f-a3fd33bf45f2","Type":"ContainerStarted","Data":"ed669b30fd049193bd293f7717ba2434f0f7cb1b9260ebc62c76086d6c9a2cff"} Mar 10 08:22:20 crc kubenswrapper[4825]: I0310 08:22:20.139524 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-775b55954d-5d5kw" event={"ID":"6d4b56d5-815b-43ef-851c-78be8efed9d8","Type":"ContainerStarted","Data":"28e40db98698092639f721ad1ded3003afb324fea9d67293d24ba0080e2411ac"} Mar 10 08:22:20 crc kubenswrapper[4825]: I0310 08:22:20.139722 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-775b55954d-5d5kw" Mar 10 08:22:20 crc kubenswrapper[4825]: I0310 08:22:20.165576 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-775b55954d-5d5kw" podStartSLOduration=2.165553081 podStartE2EDuration="2.165553081s" podCreationTimestamp="2026-03-10 08:22:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:22:20.161193276 +0000 UTC m=+5893.190973911" watchObservedRunningTime="2026-03-10 08:22:20.165553081 +0000 UTC m=+5893.195333706" Mar 10 08:22:22 crc kubenswrapper[4825]: I0310 08:22:22.158699 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-9c4964b68-jlxvc" event={"ID":"58f58fb9-cd6f-49e4-945f-a3fd33bf45f2","Type":"ContainerStarted","Data":"dae7e40147c2bcd681126ae2735ab3f28238ffad3d65193b495530daca49ef30"} Mar 
10 08:22:22 crc kubenswrapper[4825]: I0310 08:22:22.159213 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-9c4964b68-jlxvc" Mar 10 08:22:22 crc kubenswrapper[4825]: I0310 08:22:22.161587 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6947b5fb54-lvrgg" event={"ID":"2424da63-7441-4f0f-bbd3-532e759a5b36","Type":"ContainerStarted","Data":"a520736295f231bcb9741cf3ff1ac386c24ebda890527271778549fb8552554f"} Mar 10 08:22:22 crc kubenswrapper[4825]: I0310 08:22:22.161782 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-6947b5fb54-lvrgg" Mar 10 08:22:22 crc kubenswrapper[4825]: I0310 08:22:22.183123 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-9c4964b68-jlxvc" podStartSLOduration=2.320182522 podStartE2EDuration="4.18310566s" podCreationTimestamp="2026-03-10 08:22:18 +0000 UTC" firstStartedPulling="2026-03-10 08:22:19.292347627 +0000 UTC m=+5892.322128242" lastFinishedPulling="2026-03-10 08:22:21.155270775 +0000 UTC m=+5894.185051380" observedRunningTime="2026-03-10 08:22:22.17548755 +0000 UTC m=+5895.205268165" watchObservedRunningTime="2026-03-10 08:22:22.18310566 +0000 UTC m=+5895.212886265" Mar 10 08:22:22 crc kubenswrapper[4825]: I0310 08:22:22.204490 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-6947b5fb54-lvrgg" podStartSLOduration=2.223892064 podStartE2EDuration="4.204472191s" podCreationTimestamp="2026-03-10 08:22:18 +0000 UTC" firstStartedPulling="2026-03-10 08:22:19.167938741 +0000 UTC m=+5892.197719356" lastFinishedPulling="2026-03-10 08:22:21.148518868 +0000 UTC m=+5894.178299483" observedRunningTime="2026-03-10 08:22:22.198010251 +0000 UTC m=+5895.227790876" watchObservedRunningTime="2026-03-10 08:22:22.204472191 +0000 UTC m=+5895.234252806" Mar 10 08:22:25 crc kubenswrapper[4825]: I0310 08:22:25.216024 4825 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack/heat-engine-55d8bdb9c6-bvvnc"] Mar 10 08:22:25 crc kubenswrapper[4825]: I0310 08:22:25.218233 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-55d8bdb9c6-bvvnc" Mar 10 08:22:25 crc kubenswrapper[4825]: I0310 08:22:25.253574 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-6b9bd96c95-wvxct"] Mar 10 08:22:25 crc kubenswrapper[4825]: I0310 08:22:25.255110 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6b9bd96c95-wvxct" Mar 10 08:22:25 crc kubenswrapper[4825]: I0310 08:22:25.273194 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-55d8bdb9c6-bvvnc"] Mar 10 08:22:25 crc kubenswrapper[4825]: I0310 08:22:25.294815 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-56569cbcb6-tnnsc"] Mar 10 08:22:25 crc kubenswrapper[4825]: I0310 08:22:25.296461 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-56569cbcb6-tnnsc" Mar 10 08:22:25 crc kubenswrapper[4825]: I0310 08:22:25.308936 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c39bb33-53f2-4a5d-b37e-eb2e5feeda72-combined-ca-bundle\") pod \"heat-api-56569cbcb6-tnnsc\" (UID: \"7c39bb33-53f2-4a5d-b37e-eb2e5feeda72\") " pod="openstack/heat-api-56569cbcb6-tnnsc" Mar 10 08:22:25 crc kubenswrapper[4825]: I0310 08:22:25.309099 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7c39bb33-53f2-4a5d-b37e-eb2e5feeda72-config-data-custom\") pod \"heat-api-56569cbcb6-tnnsc\" (UID: \"7c39bb33-53f2-4a5d-b37e-eb2e5feeda72\") " pod="openstack/heat-api-56569cbcb6-tnnsc" Mar 10 08:22:25 crc kubenswrapper[4825]: I0310 08:22:25.309158 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/575df1da-f60a-4891-b163-459c3af78f95-config-data-custom\") pod \"heat-cfnapi-6b9bd96c95-wvxct\" (UID: \"575df1da-f60a-4891-b163-459c3af78f95\") " pod="openstack/heat-cfnapi-6b9bd96c95-wvxct" Mar 10 08:22:25 crc kubenswrapper[4825]: I0310 08:22:25.309182 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98ks5\" (UniqueName: \"kubernetes.io/projected/7c39bb33-53f2-4a5d-b37e-eb2e5feeda72-kube-api-access-98ks5\") pod \"heat-api-56569cbcb6-tnnsc\" (UID: \"7c39bb33-53f2-4a5d-b37e-eb2e5feeda72\") " pod="openstack/heat-api-56569cbcb6-tnnsc" Mar 10 08:22:25 crc kubenswrapper[4825]: I0310 08:22:25.309248 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f395cfa-be06-4d10-b6ca-2584f3588882-combined-ca-bundle\") pod 
\"heat-engine-55d8bdb9c6-bvvnc\" (UID: \"0f395cfa-be06-4d10-b6ca-2584f3588882\") " pod="openstack/heat-engine-55d8bdb9c6-bvvnc" Mar 10 08:22:25 crc kubenswrapper[4825]: I0310 08:22:25.309337 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4qp4\" (UniqueName: \"kubernetes.io/projected/0f395cfa-be06-4d10-b6ca-2584f3588882-kube-api-access-d4qp4\") pod \"heat-engine-55d8bdb9c6-bvvnc\" (UID: \"0f395cfa-be06-4d10-b6ca-2584f3588882\") " pod="openstack/heat-engine-55d8bdb9c6-bvvnc" Mar 10 08:22:25 crc kubenswrapper[4825]: I0310 08:22:25.309361 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/575df1da-f60a-4891-b163-459c3af78f95-config-data\") pod \"heat-cfnapi-6b9bd96c95-wvxct\" (UID: \"575df1da-f60a-4891-b163-459c3af78f95\") " pod="openstack/heat-cfnapi-6b9bd96c95-wvxct" Mar 10 08:22:25 crc kubenswrapper[4825]: I0310 08:22:25.309403 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/575df1da-f60a-4891-b163-459c3af78f95-combined-ca-bundle\") pod \"heat-cfnapi-6b9bd96c95-wvxct\" (UID: \"575df1da-f60a-4891-b163-459c3af78f95\") " pod="openstack/heat-cfnapi-6b9bd96c95-wvxct" Mar 10 08:22:25 crc kubenswrapper[4825]: I0310 08:22:25.309428 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f395cfa-be06-4d10-b6ca-2584f3588882-config-data-custom\") pod \"heat-engine-55d8bdb9c6-bvvnc\" (UID: \"0f395cfa-be06-4d10-b6ca-2584f3588882\") " pod="openstack/heat-engine-55d8bdb9c6-bvvnc" Mar 10 08:22:25 crc kubenswrapper[4825]: I0310 08:22:25.309458 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0f395cfa-be06-4d10-b6ca-2584f3588882-config-data\") pod \"heat-engine-55d8bdb9c6-bvvnc\" (UID: \"0f395cfa-be06-4d10-b6ca-2584f3588882\") " pod="openstack/heat-engine-55d8bdb9c6-bvvnc" Mar 10 08:22:25 crc kubenswrapper[4825]: I0310 08:22:25.309497 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8m57\" (UniqueName: \"kubernetes.io/projected/575df1da-f60a-4891-b163-459c3af78f95-kube-api-access-z8m57\") pod \"heat-cfnapi-6b9bd96c95-wvxct\" (UID: \"575df1da-f60a-4891-b163-459c3af78f95\") " pod="openstack/heat-cfnapi-6b9bd96c95-wvxct" Mar 10 08:22:25 crc kubenswrapper[4825]: I0310 08:22:25.309518 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c39bb33-53f2-4a5d-b37e-eb2e5feeda72-config-data\") pod \"heat-api-56569cbcb6-tnnsc\" (UID: \"7c39bb33-53f2-4a5d-b37e-eb2e5feeda72\") " pod="openstack/heat-api-56569cbcb6-tnnsc" Mar 10 08:22:25 crc kubenswrapper[4825]: I0310 08:22:25.331211 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6b9bd96c95-wvxct"] Mar 10 08:22:25 crc kubenswrapper[4825]: I0310 08:22:25.442988 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f395cfa-be06-4d10-b6ca-2584f3588882-config-data\") pod \"heat-engine-55d8bdb9c6-bvvnc\" (UID: \"0f395cfa-be06-4d10-b6ca-2584f3588882\") " pod="openstack/heat-engine-55d8bdb9c6-bvvnc" Mar 10 08:22:25 crc kubenswrapper[4825]: I0310 08:22:25.443037 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8m57\" (UniqueName: \"kubernetes.io/projected/575df1da-f60a-4891-b163-459c3af78f95-kube-api-access-z8m57\") pod \"heat-cfnapi-6b9bd96c95-wvxct\" (UID: \"575df1da-f60a-4891-b163-459c3af78f95\") " pod="openstack/heat-cfnapi-6b9bd96c95-wvxct" Mar 10 08:22:25 crc 
kubenswrapper[4825]: I0310 08:22:25.443062 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c39bb33-53f2-4a5d-b37e-eb2e5feeda72-config-data\") pod \"heat-api-56569cbcb6-tnnsc\" (UID: \"7c39bb33-53f2-4a5d-b37e-eb2e5feeda72\") " pod="openstack/heat-api-56569cbcb6-tnnsc" Mar 10 08:22:25 crc kubenswrapper[4825]: I0310 08:22:25.443102 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c39bb33-53f2-4a5d-b37e-eb2e5feeda72-combined-ca-bundle\") pod \"heat-api-56569cbcb6-tnnsc\" (UID: \"7c39bb33-53f2-4a5d-b37e-eb2e5feeda72\") " pod="openstack/heat-api-56569cbcb6-tnnsc" Mar 10 08:22:25 crc kubenswrapper[4825]: I0310 08:22:25.443204 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7c39bb33-53f2-4a5d-b37e-eb2e5feeda72-config-data-custom\") pod \"heat-api-56569cbcb6-tnnsc\" (UID: \"7c39bb33-53f2-4a5d-b37e-eb2e5feeda72\") " pod="openstack/heat-api-56569cbcb6-tnnsc" Mar 10 08:22:25 crc kubenswrapper[4825]: I0310 08:22:25.443238 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/575df1da-f60a-4891-b163-459c3af78f95-config-data-custom\") pod \"heat-cfnapi-6b9bd96c95-wvxct\" (UID: \"575df1da-f60a-4891-b163-459c3af78f95\") " pod="openstack/heat-cfnapi-6b9bd96c95-wvxct" Mar 10 08:22:25 crc kubenswrapper[4825]: I0310 08:22:25.443268 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98ks5\" (UniqueName: \"kubernetes.io/projected/7c39bb33-53f2-4a5d-b37e-eb2e5feeda72-kube-api-access-98ks5\") pod \"heat-api-56569cbcb6-tnnsc\" (UID: \"7c39bb33-53f2-4a5d-b37e-eb2e5feeda72\") " pod="openstack/heat-api-56569cbcb6-tnnsc" Mar 10 08:22:25 crc kubenswrapper[4825]: I0310 08:22:25.443300 4825 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f395cfa-be06-4d10-b6ca-2584f3588882-combined-ca-bundle\") pod \"heat-engine-55d8bdb9c6-bvvnc\" (UID: \"0f395cfa-be06-4d10-b6ca-2584f3588882\") " pod="openstack/heat-engine-55d8bdb9c6-bvvnc" Mar 10 08:22:25 crc kubenswrapper[4825]: I0310 08:22:25.443337 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4qp4\" (UniqueName: \"kubernetes.io/projected/0f395cfa-be06-4d10-b6ca-2584f3588882-kube-api-access-d4qp4\") pod \"heat-engine-55d8bdb9c6-bvvnc\" (UID: \"0f395cfa-be06-4d10-b6ca-2584f3588882\") " pod="openstack/heat-engine-55d8bdb9c6-bvvnc" Mar 10 08:22:25 crc kubenswrapper[4825]: I0310 08:22:25.443354 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/575df1da-f60a-4891-b163-459c3af78f95-config-data\") pod \"heat-cfnapi-6b9bd96c95-wvxct\" (UID: \"575df1da-f60a-4891-b163-459c3af78f95\") " pod="openstack/heat-cfnapi-6b9bd96c95-wvxct" Mar 10 08:22:25 crc kubenswrapper[4825]: I0310 08:22:25.443387 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/575df1da-f60a-4891-b163-459c3af78f95-combined-ca-bundle\") pod \"heat-cfnapi-6b9bd96c95-wvxct\" (UID: \"575df1da-f60a-4891-b163-459c3af78f95\") " pod="openstack/heat-cfnapi-6b9bd96c95-wvxct" Mar 10 08:22:25 crc kubenswrapper[4825]: I0310 08:22:25.443407 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f395cfa-be06-4d10-b6ca-2584f3588882-config-data-custom\") pod \"heat-engine-55d8bdb9c6-bvvnc\" (UID: \"0f395cfa-be06-4d10-b6ca-2584f3588882\") " pod="openstack/heat-engine-55d8bdb9c6-bvvnc" Mar 10 08:22:25 crc kubenswrapper[4825]: I0310 08:22:25.444282 4825 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/heat-api-56569cbcb6-tnnsc"] Mar 10 08:22:25 crc kubenswrapper[4825]: I0310 08:22:25.489001 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/575df1da-f60a-4891-b163-459c3af78f95-combined-ca-bundle\") pod \"heat-cfnapi-6b9bd96c95-wvxct\" (UID: \"575df1da-f60a-4891-b163-459c3af78f95\") " pod="openstack/heat-cfnapi-6b9bd96c95-wvxct" Mar 10 08:22:25 crc kubenswrapper[4825]: I0310 08:22:25.527196 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f395cfa-be06-4d10-b6ca-2584f3588882-config-data\") pod \"heat-engine-55d8bdb9c6-bvvnc\" (UID: \"0f395cfa-be06-4d10-b6ca-2584f3588882\") " pod="openstack/heat-engine-55d8bdb9c6-bvvnc" Mar 10 08:22:25 crc kubenswrapper[4825]: I0310 08:22:25.531670 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4qp4\" (UniqueName: \"kubernetes.io/projected/0f395cfa-be06-4d10-b6ca-2584f3588882-kube-api-access-d4qp4\") pod \"heat-engine-55d8bdb9c6-bvvnc\" (UID: \"0f395cfa-be06-4d10-b6ca-2584f3588882\") " pod="openstack/heat-engine-55d8bdb9c6-bvvnc" Mar 10 08:22:25 crc kubenswrapper[4825]: I0310 08:22:25.531699 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f395cfa-be06-4d10-b6ca-2584f3588882-config-data-custom\") pod \"heat-engine-55d8bdb9c6-bvvnc\" (UID: \"0f395cfa-be06-4d10-b6ca-2584f3588882\") " pod="openstack/heat-engine-55d8bdb9c6-bvvnc" Mar 10 08:22:25 crc kubenswrapper[4825]: I0310 08:22:25.532088 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c39bb33-53f2-4a5d-b37e-eb2e5feeda72-combined-ca-bundle\") pod \"heat-api-56569cbcb6-tnnsc\" (UID: \"7c39bb33-53f2-4a5d-b37e-eb2e5feeda72\") " pod="openstack/heat-api-56569cbcb6-tnnsc" Mar 10 08:22:25 crc 
kubenswrapper[4825]: I0310 08:22:25.532272 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f395cfa-be06-4d10-b6ca-2584f3588882-combined-ca-bundle\") pod \"heat-engine-55d8bdb9c6-bvvnc\" (UID: \"0f395cfa-be06-4d10-b6ca-2584f3588882\") " pod="openstack/heat-engine-55d8bdb9c6-bvvnc" Mar 10 08:22:25 crc kubenswrapper[4825]: I0310 08:22:25.532316 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/575df1da-f60a-4891-b163-459c3af78f95-config-data\") pod \"heat-cfnapi-6b9bd96c95-wvxct\" (UID: \"575df1da-f60a-4891-b163-459c3af78f95\") " pod="openstack/heat-cfnapi-6b9bd96c95-wvxct" Mar 10 08:22:25 crc kubenswrapper[4825]: I0310 08:22:25.532901 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98ks5\" (UniqueName: \"kubernetes.io/projected/7c39bb33-53f2-4a5d-b37e-eb2e5feeda72-kube-api-access-98ks5\") pod \"heat-api-56569cbcb6-tnnsc\" (UID: \"7c39bb33-53f2-4a5d-b37e-eb2e5feeda72\") " pod="openstack/heat-api-56569cbcb6-tnnsc" Mar 10 08:22:25 crc kubenswrapper[4825]: I0310 08:22:25.532953 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7c39bb33-53f2-4a5d-b37e-eb2e5feeda72-config-data-custom\") pod \"heat-api-56569cbcb6-tnnsc\" (UID: \"7c39bb33-53f2-4a5d-b37e-eb2e5feeda72\") " pod="openstack/heat-api-56569cbcb6-tnnsc" Mar 10 08:22:25 crc kubenswrapper[4825]: I0310 08:22:25.533722 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c39bb33-53f2-4a5d-b37e-eb2e5feeda72-config-data\") pod \"heat-api-56569cbcb6-tnnsc\" (UID: \"7c39bb33-53f2-4a5d-b37e-eb2e5feeda72\") " pod="openstack/heat-api-56569cbcb6-tnnsc" Mar 10 08:22:25 crc kubenswrapper[4825]: I0310 08:22:25.535789 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-55d8bdb9c6-bvvnc" Mar 10 08:22:25 crc kubenswrapper[4825]: I0310 08:22:25.536813 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8m57\" (UniqueName: \"kubernetes.io/projected/575df1da-f60a-4891-b163-459c3af78f95-kube-api-access-z8m57\") pod \"heat-cfnapi-6b9bd96c95-wvxct\" (UID: \"575df1da-f60a-4891-b163-459c3af78f95\") " pod="openstack/heat-cfnapi-6b9bd96c95-wvxct" Mar 10 08:22:25 crc kubenswrapper[4825]: I0310 08:22:25.539645 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/575df1da-f60a-4891-b163-459c3af78f95-config-data-custom\") pod \"heat-cfnapi-6b9bd96c95-wvxct\" (UID: \"575df1da-f60a-4891-b163-459c3af78f95\") " pod="openstack/heat-cfnapi-6b9bd96c95-wvxct" Mar 10 08:22:25 crc kubenswrapper[4825]: I0310 08:22:25.569764 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6b9bd96c95-wvxct" Mar 10 08:22:25 crc kubenswrapper[4825]: I0310 08:22:25.652521 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-56569cbcb6-tnnsc" Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.040699 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-25ac-account-create-update-hn6w9"] Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.053274 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-krslz"] Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.072601 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-25ac-account-create-update-hn6w9"] Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.091056 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-krslz"] Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.103445 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-55d8bdb9c6-bvvnc"] Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.201368 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6b9bd96c95-wvxct"] Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.203505 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-55d8bdb9c6-bvvnc" event={"ID":"0f395cfa-be06-4d10-b6ca-2584f3588882","Type":"ContainerStarted","Data":"8fe4bfeb0794d4892d35426692eb77d825fdf3d9d30079d0f73d305028e084eb"} Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.219046 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-56569cbcb6-tnnsc"] Mar 10 08:22:26 crc kubenswrapper[4825]: W0310 08:22:26.219411 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod575df1da_f60a_4891_b163_459c3af78f95.slice/crio-7ef6fb6e7dcbbc015dea6b205776cb97eab5313491f763ebade9ce5c94b9eb72 WatchSource:0}: Error finding container 7ef6fb6e7dcbbc015dea6b205776cb97eab5313491f763ebade9ce5c94b9eb72: Status 404 returned error can't find 
the container with id 7ef6fb6e7dcbbc015dea6b205776cb97eab5313491f763ebade9ce5c94b9eb72 Mar 10 08:22:26 crc kubenswrapper[4825]: W0310 08:22:26.240361 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c39bb33_53f2_4a5d_b37e_eb2e5feeda72.slice/crio-484eb1e5347cc11caa798a22b9cf33e26e95bbcb971e23d615e8b2e992ec7ac5 WatchSource:0}: Error finding container 484eb1e5347cc11caa798a22b9cf33e26e95bbcb971e23d615e8b2e992ec7ac5: Status 404 returned error can't find the container with id 484eb1e5347cc11caa798a22b9cf33e26e95bbcb971e23d615e8b2e992ec7ac5 Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.615348 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-6947b5fb54-lvrgg"] Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.615757 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-6947b5fb54-lvrgg" podUID="2424da63-7441-4f0f-bbd3-532e759a5b36" containerName="heat-api" containerID="cri-o://a520736295f231bcb9741cf3ff1ac386c24ebda890527271778549fb8552554f" gracePeriod=60 Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.632975 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-9c4964b68-jlxvc"] Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.633244 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-9c4964b68-jlxvc" podUID="58f58fb9-cd6f-49e4-945f-a3fd33bf45f2" containerName="heat-cfnapi" containerID="cri-o://dae7e40147c2bcd681126ae2735ab3f28238ffad3d65193b495530daca49ef30" gracePeriod=60 Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.646008 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-6947b5fb54-lvrgg" podUID="2424da63-7441-4f0f-bbd3-532e759a5b36" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.1.149:8004/healthcheck\": EOF" Mar 10 08:22:26 crc 
kubenswrapper[4825]: I0310 08:22:26.670185 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-65988cc597-hxfff"] Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.671942 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-65988cc597-hxfff" Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.678512 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.678783 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.679435 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a394b38-8efc-4b86-a0a9-5a748e4d56e3-combined-ca-bundle\") pod \"heat-cfnapi-65988cc597-hxfff\" (UID: \"3a394b38-8efc-4b86-a0a9-5a748e4d56e3\") " pod="openstack/heat-cfnapi-65988cc597-hxfff" Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.679516 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a394b38-8efc-4b86-a0a9-5a748e4d56e3-config-data\") pod \"heat-cfnapi-65988cc597-hxfff\" (UID: \"3a394b38-8efc-4b86-a0a9-5a748e4d56e3\") " pod="openstack/heat-cfnapi-65988cc597-hxfff" Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.679582 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knmn9\" (UniqueName: \"kubernetes.io/projected/3a394b38-8efc-4b86-a0a9-5a748e4d56e3-kube-api-access-knmn9\") pod \"heat-cfnapi-65988cc597-hxfff\" (UID: \"3a394b38-8efc-4b86-a0a9-5a748e4d56e3\") " pod="openstack/heat-cfnapi-65988cc597-hxfff" Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.679618 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a394b38-8efc-4b86-a0a9-5a748e4d56e3-config-data-custom\") pod \"heat-cfnapi-65988cc597-hxfff\" (UID: \"3a394b38-8efc-4b86-a0a9-5a748e4d56e3\") " pod="openstack/heat-cfnapi-65988cc597-hxfff" Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.679681 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a394b38-8efc-4b86-a0a9-5a748e4d56e3-public-tls-certs\") pod \"heat-cfnapi-65988cc597-hxfff\" (UID: \"3a394b38-8efc-4b86-a0a9-5a748e4d56e3\") " pod="openstack/heat-cfnapi-65988cc597-hxfff" Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.679731 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a394b38-8efc-4b86-a0a9-5a748e4d56e3-internal-tls-certs\") pod \"heat-cfnapi-65988cc597-hxfff\" (UID: \"3a394b38-8efc-4b86-a0a9-5a748e4d56e3\") " pod="openstack/heat-cfnapi-65988cc597-hxfff" Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.690216 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-7fcb64c775-kvwk8"] Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.691539 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-7fcb64c775-kvwk8" Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.695108 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.695357 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.739646 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-65988cc597-hxfff"] Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.757832 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7fcb64c775-kvwk8"] Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.781003 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a394b38-8efc-4b86-a0a9-5a748e4d56e3-internal-tls-certs\") pod \"heat-cfnapi-65988cc597-hxfff\" (UID: \"3a394b38-8efc-4b86-a0a9-5a748e4d56e3\") " pod="openstack/heat-cfnapi-65988cc597-hxfff" Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.781171 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a394b38-8efc-4b86-a0a9-5a748e4d56e3-combined-ca-bundle\") pod \"heat-cfnapi-65988cc597-hxfff\" (UID: \"3a394b38-8efc-4b86-a0a9-5a748e4d56e3\") " pod="openstack/heat-cfnapi-65988cc597-hxfff" Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.781229 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a394b38-8efc-4b86-a0a9-5a748e4d56e3-config-data\") pod \"heat-cfnapi-65988cc597-hxfff\" (UID: \"3a394b38-8efc-4b86-a0a9-5a748e4d56e3\") " pod="openstack/heat-cfnapi-65988cc597-hxfff" Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.781290 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-knmn9\" (UniqueName: \"kubernetes.io/projected/3a394b38-8efc-4b86-a0a9-5a748e4d56e3-kube-api-access-knmn9\") pod \"heat-cfnapi-65988cc597-hxfff\" (UID: \"3a394b38-8efc-4b86-a0a9-5a748e4d56e3\") " pod="openstack/heat-cfnapi-65988cc597-hxfff" Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.781328 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a394b38-8efc-4b86-a0a9-5a748e4d56e3-config-data-custom\") pod \"heat-cfnapi-65988cc597-hxfff\" (UID: \"3a394b38-8efc-4b86-a0a9-5a748e4d56e3\") " pod="openstack/heat-cfnapi-65988cc597-hxfff" Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.781388 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a394b38-8efc-4b86-a0a9-5a748e4d56e3-public-tls-certs\") pod \"heat-cfnapi-65988cc597-hxfff\" (UID: \"3a394b38-8efc-4b86-a0a9-5a748e4d56e3\") " pod="openstack/heat-cfnapi-65988cc597-hxfff" Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.786566 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a394b38-8efc-4b86-a0a9-5a748e4d56e3-internal-tls-certs\") pod \"heat-cfnapi-65988cc597-hxfff\" (UID: \"3a394b38-8efc-4b86-a0a9-5a748e4d56e3\") " pod="openstack/heat-cfnapi-65988cc597-hxfff" Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.786672 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a394b38-8efc-4b86-a0a9-5a748e4d56e3-public-tls-certs\") pod \"heat-cfnapi-65988cc597-hxfff\" (UID: \"3a394b38-8efc-4b86-a0a9-5a748e4d56e3\") " pod="openstack/heat-cfnapi-65988cc597-hxfff" Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.787973 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a394b38-8efc-4b86-a0a9-5a748e4d56e3-combined-ca-bundle\") pod \"heat-cfnapi-65988cc597-hxfff\" (UID: \"3a394b38-8efc-4b86-a0a9-5a748e4d56e3\") " pod="openstack/heat-cfnapi-65988cc597-hxfff" Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.796033 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a394b38-8efc-4b86-a0a9-5a748e4d56e3-config-data-custom\") pod \"heat-cfnapi-65988cc597-hxfff\" (UID: \"3a394b38-8efc-4b86-a0a9-5a748e4d56e3\") " pod="openstack/heat-cfnapi-65988cc597-hxfff" Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.796473 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a394b38-8efc-4b86-a0a9-5a748e4d56e3-config-data\") pod \"heat-cfnapi-65988cc597-hxfff\" (UID: \"3a394b38-8efc-4b86-a0a9-5a748e4d56e3\") " pod="openstack/heat-cfnapi-65988cc597-hxfff" Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.804854 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knmn9\" (UniqueName: \"kubernetes.io/projected/3a394b38-8efc-4b86-a0a9-5a748e4d56e3-kube-api-access-knmn9\") pod \"heat-cfnapi-65988cc597-hxfff\" (UID: \"3a394b38-8efc-4b86-a0a9-5a748e4d56e3\") " pod="openstack/heat-cfnapi-65988cc597-hxfff" Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.883867 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8350adde-7bc4-4b8e-8e61-da40714ad2c2-public-tls-certs\") pod \"heat-api-7fcb64c775-kvwk8\" (UID: \"8350adde-7bc4-4b8e-8e61-da40714ad2c2\") " pod="openstack/heat-api-7fcb64c775-kvwk8" Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.883979 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/8350adde-7bc4-4b8e-8e61-da40714ad2c2-config-data-custom\") pod \"heat-api-7fcb64c775-kvwk8\" (UID: \"8350adde-7bc4-4b8e-8e61-da40714ad2c2\") " pod="openstack/heat-api-7fcb64c775-kvwk8" Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.884795 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8350adde-7bc4-4b8e-8e61-da40714ad2c2-internal-tls-certs\") pod \"heat-api-7fcb64c775-kvwk8\" (UID: \"8350adde-7bc4-4b8e-8e61-da40714ad2c2\") " pod="openstack/heat-api-7fcb64c775-kvwk8" Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.884827 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7cb6\" (UniqueName: \"kubernetes.io/projected/8350adde-7bc4-4b8e-8e61-da40714ad2c2-kube-api-access-m7cb6\") pod \"heat-api-7fcb64c775-kvwk8\" (UID: \"8350adde-7bc4-4b8e-8e61-da40714ad2c2\") " pod="openstack/heat-api-7fcb64c775-kvwk8" Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.884972 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8350adde-7bc4-4b8e-8e61-da40714ad2c2-combined-ca-bundle\") pod \"heat-api-7fcb64c775-kvwk8\" (UID: \"8350adde-7bc4-4b8e-8e61-da40714ad2c2\") " pod="openstack/heat-api-7fcb64c775-kvwk8" Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.885025 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8350adde-7bc4-4b8e-8e61-da40714ad2c2-config-data\") pod \"heat-api-7fcb64c775-kvwk8\" (UID: \"8350adde-7bc4-4b8e-8e61-da40714ad2c2\") " pod="openstack/heat-api-7fcb64c775-kvwk8" Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.986574 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8350adde-7bc4-4b8e-8e61-da40714ad2c2-public-tls-certs\") pod \"heat-api-7fcb64c775-kvwk8\" (UID: \"8350adde-7bc4-4b8e-8e61-da40714ad2c2\") " pod="openstack/heat-api-7fcb64c775-kvwk8" Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.986631 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8350adde-7bc4-4b8e-8e61-da40714ad2c2-config-data-custom\") pod \"heat-api-7fcb64c775-kvwk8\" (UID: \"8350adde-7bc4-4b8e-8e61-da40714ad2c2\") " pod="openstack/heat-api-7fcb64c775-kvwk8" Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.986683 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8350adde-7bc4-4b8e-8e61-da40714ad2c2-internal-tls-certs\") pod \"heat-api-7fcb64c775-kvwk8\" (UID: \"8350adde-7bc4-4b8e-8e61-da40714ad2c2\") " pod="openstack/heat-api-7fcb64c775-kvwk8" Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.986706 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7cb6\" (UniqueName: \"kubernetes.io/projected/8350adde-7bc4-4b8e-8e61-da40714ad2c2-kube-api-access-m7cb6\") pod \"heat-api-7fcb64c775-kvwk8\" (UID: \"8350adde-7bc4-4b8e-8e61-da40714ad2c2\") " pod="openstack/heat-api-7fcb64c775-kvwk8" Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.986767 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8350adde-7bc4-4b8e-8e61-da40714ad2c2-combined-ca-bundle\") pod \"heat-api-7fcb64c775-kvwk8\" (UID: \"8350adde-7bc4-4b8e-8e61-da40714ad2c2\") " pod="openstack/heat-api-7fcb64c775-kvwk8" Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.986798 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8350adde-7bc4-4b8e-8e61-da40714ad2c2-config-data\") pod \"heat-api-7fcb64c775-kvwk8\" (UID: \"8350adde-7bc4-4b8e-8e61-da40714ad2c2\") " pod="openstack/heat-api-7fcb64c775-kvwk8" Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.992904 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8350adde-7bc4-4b8e-8e61-da40714ad2c2-public-tls-certs\") pod \"heat-api-7fcb64c775-kvwk8\" (UID: \"8350adde-7bc4-4b8e-8e61-da40714ad2c2\") " pod="openstack/heat-api-7fcb64c775-kvwk8" Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.993460 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8350adde-7bc4-4b8e-8e61-da40714ad2c2-combined-ca-bundle\") pod \"heat-api-7fcb64c775-kvwk8\" (UID: \"8350adde-7bc4-4b8e-8e61-da40714ad2c2\") " pod="openstack/heat-api-7fcb64c775-kvwk8" Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.994204 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8350adde-7bc4-4b8e-8e61-da40714ad2c2-config-data-custom\") pod \"heat-api-7fcb64c775-kvwk8\" (UID: \"8350adde-7bc4-4b8e-8e61-da40714ad2c2\") " pod="openstack/heat-api-7fcb64c775-kvwk8" Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.994904 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8350adde-7bc4-4b8e-8e61-da40714ad2c2-config-data\") pod \"heat-api-7fcb64c775-kvwk8\" (UID: \"8350adde-7bc4-4b8e-8e61-da40714ad2c2\") " pod="openstack/heat-api-7fcb64c775-kvwk8" Mar 10 08:22:26 crc kubenswrapper[4825]: I0310 08:22:26.995630 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8350adde-7bc4-4b8e-8e61-da40714ad2c2-internal-tls-certs\") pod \"heat-api-7fcb64c775-kvwk8\" (UID: 
\"8350adde-7bc4-4b8e-8e61-da40714ad2c2\") " pod="openstack/heat-api-7fcb64c775-kvwk8" Mar 10 08:22:27 crc kubenswrapper[4825]: I0310 08:22:27.004450 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7cb6\" (UniqueName: \"kubernetes.io/projected/8350adde-7bc4-4b8e-8e61-da40714ad2c2-kube-api-access-m7cb6\") pod \"heat-api-7fcb64c775-kvwk8\" (UID: \"8350adde-7bc4-4b8e-8e61-da40714ad2c2\") " pod="openstack/heat-api-7fcb64c775-kvwk8" Mar 10 08:22:27 crc kubenswrapper[4825]: I0310 08:22:27.011473 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-65988cc597-hxfff" Mar 10 08:22:27 crc kubenswrapper[4825]: I0310 08:22:27.031345 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7fcb64c775-kvwk8" Mar 10 08:22:27 crc kubenswrapper[4825]: I0310 08:22:27.195389 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-9c4964b68-jlxvc" podUID="58f58fb9-cd6f-49e4-945f-a3fd33bf45f2" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.1.150:8000/healthcheck\": read tcp 10.217.0.2:57524->10.217.1.150:8000: read: connection reset by peer" Mar 10 08:22:27 crc kubenswrapper[4825]: I0310 08:22:27.224180 4825 generic.go:334] "Generic (PLEG): container finished" podID="7c39bb33-53f2-4a5d-b37e-eb2e5feeda72" containerID="3b16d36c696114a0e43cd02a301685f04fc968b5264ccc9ad9f9c7969a70a39d" exitCode=1 Mar 10 08:22:27 crc kubenswrapper[4825]: I0310 08:22:27.224293 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-56569cbcb6-tnnsc" event={"ID":"7c39bb33-53f2-4a5d-b37e-eb2e5feeda72","Type":"ContainerDied","Data":"3b16d36c696114a0e43cd02a301685f04fc968b5264ccc9ad9f9c7969a70a39d"} Mar 10 08:22:27 crc kubenswrapper[4825]: I0310 08:22:27.224330 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-56569cbcb6-tnnsc" 
event={"ID":"7c39bb33-53f2-4a5d-b37e-eb2e5feeda72","Type":"ContainerStarted","Data":"484eb1e5347cc11caa798a22b9cf33e26e95bbcb971e23d615e8b2e992ec7ac5"} Mar 10 08:22:27 crc kubenswrapper[4825]: I0310 08:22:27.224961 4825 scope.go:117] "RemoveContainer" containerID="3b16d36c696114a0e43cd02a301685f04fc968b5264ccc9ad9f9c7969a70a39d" Mar 10 08:22:27 crc kubenswrapper[4825]: I0310 08:22:27.266491 4825 generic.go:334] "Generic (PLEG): container finished" podID="575df1da-f60a-4891-b163-459c3af78f95" containerID="bebbfd8ef866e8017390d7e3d92fc2367ff25f3a387ebe08d632993bd42cca5e" exitCode=1 Mar 10 08:22:27 crc kubenswrapper[4825]: I0310 08:22:27.267255 4825 scope.go:117] "RemoveContainer" containerID="bebbfd8ef866e8017390d7e3d92fc2367ff25f3a387ebe08d632993bd42cca5e" Mar 10 08:22:27 crc kubenswrapper[4825]: I0310 08:22:27.272030 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7e5180d-ef78-40ef-9813-da06bc220bc9" path="/var/lib/kubelet/pods/a7e5180d-ef78-40ef-9813-da06bc220bc9/volumes" Mar 10 08:22:27 crc kubenswrapper[4825]: I0310 08:22:27.272878 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8e45726-486b-47db-8c50-1f7bf24c83dc" path="/var/lib/kubelet/pods/d8e45726-486b-47db-8c50-1f7bf24c83dc/volumes" Mar 10 08:22:27 crc kubenswrapper[4825]: I0310 08:22:27.274574 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6b9bd96c95-wvxct" event={"ID":"575df1da-f60a-4891-b163-459c3af78f95","Type":"ContainerDied","Data":"bebbfd8ef866e8017390d7e3d92fc2367ff25f3a387ebe08d632993bd42cca5e"} Mar 10 08:22:27 crc kubenswrapper[4825]: I0310 08:22:27.274614 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6b9bd96c95-wvxct" event={"ID":"575df1da-f60a-4891-b163-459c3af78f95","Type":"ContainerStarted","Data":"7ef6fb6e7dcbbc015dea6b205776cb97eab5313491f763ebade9ce5c94b9eb72"} Mar 10 08:22:27 crc kubenswrapper[4825]: I0310 08:22:27.275788 4825 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/heat-engine-55d8bdb9c6-bvvnc" event={"ID":"0f395cfa-be06-4d10-b6ca-2584f3588882","Type":"ContainerStarted","Data":"d78b3c914f035f316c634e93d9ee499579cfdd69af06231196b405deb9cb6bd4"} Mar 10 08:22:27 crc kubenswrapper[4825]: I0310 08:22:27.277203 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-55d8bdb9c6-bvvnc" Mar 10 08:22:27 crc kubenswrapper[4825]: I0310 08:22:27.366892 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-55d8bdb9c6-bvvnc" podStartSLOduration=2.366868104 podStartE2EDuration="2.366868104s" podCreationTimestamp="2026-03-10 08:22:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:22:27.314615882 +0000 UTC m=+5900.344396507" watchObservedRunningTime="2026-03-10 08:22:27.366868104 +0000 UTC m=+5900.396648719" Mar 10 08:22:27 crc kubenswrapper[4825]: I0310 08:22:27.389552 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-65988cc597-hxfff"] Mar 10 08:22:27 crc kubenswrapper[4825]: W0310 08:22:27.438299 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a394b38_8efc_4b86_a0a9_5a748e4d56e3.slice/crio-06822d7f0acdb60785c5bb1ce85eb62e0bb7a065345cc952c7a59e6f9fba1acb WatchSource:0}: Error finding container 06822d7f0acdb60785c5bb1ce85eb62e0bb7a065345cc952c7a59e6f9fba1acb: Status 404 returned error can't find the container with id 06822d7f0acdb60785c5bb1ce85eb62e0bb7a065345cc952c7a59e6f9fba1acb Mar 10 08:22:27 crc kubenswrapper[4825]: I0310 08:22:27.659045 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7fcb64c775-kvwk8"] Mar 10 08:22:27 crc kubenswrapper[4825]: W0310 08:22:27.660926 4825 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8350adde_7bc4_4b8e_8e61_da40714ad2c2.slice/crio-30ea630fe4d2a689ed1f3ab7bde8f1ac9cd77e7d2304b3b54bfe16210d69a2c7 WatchSource:0}: Error finding container 30ea630fe4d2a689ed1f3ab7bde8f1ac9cd77e7d2304b3b54bfe16210d69a2c7: Status 404 returned error can't find the container with id 30ea630fe4d2a689ed1f3ab7bde8f1ac9cd77e7d2304b3b54bfe16210d69a2c7 Mar 10 08:22:27 crc kubenswrapper[4825]: I0310 08:22:27.895071 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-9c4964b68-jlxvc" Mar 10 08:22:27 crc kubenswrapper[4825]: I0310 08:22:27.924476 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58f58fb9-cd6f-49e4-945f-a3fd33bf45f2-config-data\") pod \"58f58fb9-cd6f-49e4-945f-a3fd33bf45f2\" (UID: \"58f58fb9-cd6f-49e4-945f-a3fd33bf45f2\") " Mar 10 08:22:27 crc kubenswrapper[4825]: I0310 08:22:27.924581 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bz97s\" (UniqueName: \"kubernetes.io/projected/58f58fb9-cd6f-49e4-945f-a3fd33bf45f2-kube-api-access-bz97s\") pod \"58f58fb9-cd6f-49e4-945f-a3fd33bf45f2\" (UID: \"58f58fb9-cd6f-49e4-945f-a3fd33bf45f2\") " Mar 10 08:22:27 crc kubenswrapper[4825]: I0310 08:22:27.924660 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58f58fb9-cd6f-49e4-945f-a3fd33bf45f2-config-data-custom\") pod \"58f58fb9-cd6f-49e4-945f-a3fd33bf45f2\" (UID: \"58f58fb9-cd6f-49e4-945f-a3fd33bf45f2\") " Mar 10 08:22:27 crc kubenswrapper[4825]: I0310 08:22:27.924832 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58f58fb9-cd6f-49e4-945f-a3fd33bf45f2-combined-ca-bundle\") pod \"58f58fb9-cd6f-49e4-945f-a3fd33bf45f2\" (UID: 
\"58f58fb9-cd6f-49e4-945f-a3fd33bf45f2\") " Mar 10 08:22:27 crc kubenswrapper[4825]: I0310 08:22:27.949329 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58f58fb9-cd6f-49e4-945f-a3fd33bf45f2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "58f58fb9-cd6f-49e4-945f-a3fd33bf45f2" (UID: "58f58fb9-cd6f-49e4-945f-a3fd33bf45f2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:22:27 crc kubenswrapper[4825]: I0310 08:22:27.958747 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58f58fb9-cd6f-49e4-945f-a3fd33bf45f2-kube-api-access-bz97s" (OuterVolumeSpecName: "kube-api-access-bz97s") pod "58f58fb9-cd6f-49e4-945f-a3fd33bf45f2" (UID: "58f58fb9-cd6f-49e4-945f-a3fd33bf45f2"). InnerVolumeSpecName "kube-api-access-bz97s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:22:28 crc kubenswrapper[4825]: I0310 08:22:28.026934 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bz97s\" (UniqueName: \"kubernetes.io/projected/58f58fb9-cd6f-49e4-945f-a3fd33bf45f2-kube-api-access-bz97s\") on node \"crc\" DevicePath \"\"" Mar 10 08:22:28 crc kubenswrapper[4825]: I0310 08:22:28.026958 4825 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58f58fb9-cd6f-49e4-945f-a3fd33bf45f2-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 08:22:28 crc kubenswrapper[4825]: I0310 08:22:28.068511 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58f58fb9-cd6f-49e4-945f-a3fd33bf45f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58f58fb9-cd6f-49e4-945f-a3fd33bf45f2" (UID: "58f58fb9-cd6f-49e4-945f-a3fd33bf45f2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:22:28 crc kubenswrapper[4825]: I0310 08:22:28.124528 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58f58fb9-cd6f-49e4-945f-a3fd33bf45f2-config-data" (OuterVolumeSpecName: "config-data") pod "58f58fb9-cd6f-49e4-945f-a3fd33bf45f2" (UID: "58f58fb9-cd6f-49e4-945f-a3fd33bf45f2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:22:28 crc kubenswrapper[4825]: I0310 08:22:28.129210 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58f58fb9-cd6f-49e4-945f-a3fd33bf45f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:22:28 crc kubenswrapper[4825]: I0310 08:22:28.129245 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58f58fb9-cd6f-49e4-945f-a3fd33bf45f2-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 08:22:28 crc kubenswrapper[4825]: I0310 08:22:28.240507 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-fdb9f75f4-fr95n" Mar 10 08:22:28 crc kubenswrapper[4825]: I0310 08:22:28.289176 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7fcb64c775-kvwk8" event={"ID":"8350adde-7bc4-4b8e-8e61-da40714ad2c2","Type":"ContainerStarted","Data":"bbf6ba6aa0b67a75353350ffebb788875ebbab8109f38e1597df3448010f1bec"} Mar 10 08:22:28 crc kubenswrapper[4825]: I0310 08:22:28.289222 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7fcb64c775-kvwk8" event={"ID":"8350adde-7bc4-4b8e-8e61-da40714ad2c2","Type":"ContainerStarted","Data":"30ea630fe4d2a689ed1f3ab7bde8f1ac9cd77e7d2304b3b54bfe16210d69a2c7"} Mar 10 08:22:28 crc kubenswrapper[4825]: I0310 08:22:28.289335 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-7fcb64c775-kvwk8" Mar 10 
08:22:28 crc kubenswrapper[4825]: I0310 08:22:28.292471 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6b9bd96c95-wvxct" event={"ID":"575df1da-f60a-4891-b163-459c3af78f95","Type":"ContainerStarted","Data":"196cc46eba22ec39b0ad1c24de8a56f85a59e98235ca3e10b53d94586f3cd214"} Mar 10 08:22:28 crc kubenswrapper[4825]: I0310 08:22:28.292612 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-6b9bd96c95-wvxct" Mar 10 08:22:28 crc kubenswrapper[4825]: I0310 08:22:28.297145 4825 generic.go:334] "Generic (PLEG): container finished" podID="58f58fb9-cd6f-49e4-945f-a3fd33bf45f2" containerID="dae7e40147c2bcd681126ae2735ab3f28238ffad3d65193b495530daca49ef30" exitCode=0 Mar 10 08:22:28 crc kubenswrapper[4825]: I0310 08:22:28.297183 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-9c4964b68-jlxvc" event={"ID":"58f58fb9-cd6f-49e4-945f-a3fd33bf45f2","Type":"ContainerDied","Data":"dae7e40147c2bcd681126ae2735ab3f28238ffad3d65193b495530daca49ef30"} Mar 10 08:22:28 crc kubenswrapper[4825]: I0310 08:22:28.297223 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-9c4964b68-jlxvc" event={"ID":"58f58fb9-cd6f-49e4-945f-a3fd33bf45f2","Type":"ContainerDied","Data":"ed669b30fd049193bd293f7717ba2434f0f7cb1b9260ebc62c76086d6c9a2cff"} Mar 10 08:22:28 crc kubenswrapper[4825]: I0310 08:22:28.297228 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-9c4964b68-jlxvc" Mar 10 08:22:28 crc kubenswrapper[4825]: I0310 08:22:28.297245 4825 scope.go:117] "RemoveContainer" containerID="dae7e40147c2bcd681126ae2735ab3f28238ffad3d65193b495530daca49ef30" Mar 10 08:22:28 crc kubenswrapper[4825]: I0310 08:22:28.302672 4825 generic.go:334] "Generic (PLEG): container finished" podID="7c39bb33-53f2-4a5d-b37e-eb2e5feeda72" containerID="9c797d4fc902e575dcd01d129baf8bd762f5c56479f0d5c5fc6855dfa2533def" exitCode=1 Mar 10 08:22:28 crc kubenswrapper[4825]: I0310 08:22:28.302750 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-56569cbcb6-tnnsc" event={"ID":"7c39bb33-53f2-4a5d-b37e-eb2e5feeda72","Type":"ContainerDied","Data":"9c797d4fc902e575dcd01d129baf8bd762f5c56479f0d5c5fc6855dfa2533def"} Mar 10 08:22:28 crc kubenswrapper[4825]: I0310 08:22:28.303524 4825 scope.go:117] "RemoveContainer" containerID="9c797d4fc902e575dcd01d129baf8bd762f5c56479f0d5c5fc6855dfa2533def" Mar 10 08:22:28 crc kubenswrapper[4825]: E0310 08:22:28.303867 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-56569cbcb6-tnnsc_openstack(7c39bb33-53f2-4a5d-b37e-eb2e5feeda72)\"" pod="openstack/heat-api-56569cbcb6-tnnsc" podUID="7c39bb33-53f2-4a5d-b37e-eb2e5feeda72" Mar 10 08:22:28 crc kubenswrapper[4825]: I0310 08:22:28.306974 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-65988cc597-hxfff" event={"ID":"3a394b38-8efc-4b86-a0a9-5a748e4d56e3","Type":"ContainerStarted","Data":"aeed500819d452ea110038f2ae2c6f5ae78d0ba291426371195f5c5eaee9f76b"} Mar 10 08:22:28 crc kubenswrapper[4825]: I0310 08:22:28.307122 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-65988cc597-hxfff" Mar 10 08:22:28 crc kubenswrapper[4825]: I0310 08:22:28.307219 4825 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/heat-cfnapi-65988cc597-hxfff" event={"ID":"3a394b38-8efc-4b86-a0a9-5a748e4d56e3","Type":"ContainerStarted","Data":"06822d7f0acdb60785c5bb1ce85eb62e0bb7a065345cc952c7a59e6f9fba1acb"} Mar 10 08:22:28 crc kubenswrapper[4825]: I0310 08:22:28.333164 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-7fcb64c775-kvwk8" podStartSLOduration=2.328714487 podStartE2EDuration="2.328714487s" podCreationTimestamp="2026-03-10 08:22:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:22:28.320707146 +0000 UTC m=+5901.350487761" watchObservedRunningTime="2026-03-10 08:22:28.328714487 +0000 UTC m=+5901.358495102" Mar 10 08:22:28 crc kubenswrapper[4825]: I0310 08:22:28.358025 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-65988cc597-hxfff" podStartSLOduration=2.357997555 podStartE2EDuration="2.357997555s" podCreationTimestamp="2026-03-10 08:22:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:22:28.353652371 +0000 UTC m=+5901.383432986" watchObservedRunningTime="2026-03-10 08:22:28.357997555 +0000 UTC m=+5901.387778170" Mar 10 08:22:28 crc kubenswrapper[4825]: I0310 08:22:28.370995 4825 scope.go:117] "RemoveContainer" containerID="dae7e40147c2bcd681126ae2735ab3f28238ffad3d65193b495530daca49ef30" Mar 10 08:22:28 crc kubenswrapper[4825]: E0310 08:22:28.372281 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dae7e40147c2bcd681126ae2735ab3f28238ffad3d65193b495530daca49ef30\": container with ID starting with dae7e40147c2bcd681126ae2735ab3f28238ffad3d65193b495530daca49ef30 not found: ID does not exist" 
containerID="dae7e40147c2bcd681126ae2735ab3f28238ffad3d65193b495530daca49ef30" Mar 10 08:22:28 crc kubenswrapper[4825]: I0310 08:22:28.372318 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dae7e40147c2bcd681126ae2735ab3f28238ffad3d65193b495530daca49ef30"} err="failed to get container status \"dae7e40147c2bcd681126ae2735ab3f28238ffad3d65193b495530daca49ef30\": rpc error: code = NotFound desc = could not find container \"dae7e40147c2bcd681126ae2735ab3f28238ffad3d65193b495530daca49ef30\": container with ID starting with dae7e40147c2bcd681126ae2735ab3f28238ffad3d65193b495530daca49ef30 not found: ID does not exist" Mar 10 08:22:28 crc kubenswrapper[4825]: I0310 08:22:28.372342 4825 scope.go:117] "RemoveContainer" containerID="3b16d36c696114a0e43cd02a301685f04fc968b5264ccc9ad9f9c7969a70a39d" Mar 10 08:22:28 crc kubenswrapper[4825]: I0310 08:22:28.455209 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-9c4964b68-jlxvc"] Mar 10 08:22:28 crc kubenswrapper[4825]: I0310 08:22:28.466932 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-9c4964b68-jlxvc"] Mar 10 08:22:28 crc kubenswrapper[4825]: I0310 08:22:28.476466 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-6b9bd96c95-wvxct" podStartSLOduration=3.476442915 podStartE2EDuration="3.476442915s" podCreationTimestamp="2026-03-10 08:22:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:22:28.42862489 +0000 UTC m=+5901.458405505" watchObservedRunningTime="2026-03-10 08:22:28.476442915 +0000 UTC m=+5901.506223530" Mar 10 08:22:29 crc kubenswrapper[4825]: I0310 08:22:29.249179 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58f58fb9-cd6f-49e4-945f-a3fd33bf45f2" path="/var/lib/kubelet/pods/58f58fb9-cd6f-49e4-945f-a3fd33bf45f2/volumes" Mar 
10 08:22:29 crc kubenswrapper[4825]: I0310 08:22:29.319997 4825 generic.go:334] "Generic (PLEG): container finished" podID="575df1da-f60a-4891-b163-459c3af78f95" containerID="196cc46eba22ec39b0ad1c24de8a56f85a59e98235ca3e10b53d94586f3cd214" exitCode=1 Mar 10 08:22:29 crc kubenswrapper[4825]: I0310 08:22:29.320234 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6b9bd96c95-wvxct" event={"ID":"575df1da-f60a-4891-b163-459c3af78f95","Type":"ContainerDied","Data":"196cc46eba22ec39b0ad1c24de8a56f85a59e98235ca3e10b53d94586f3cd214"} Mar 10 08:22:29 crc kubenswrapper[4825]: I0310 08:22:29.320312 4825 scope.go:117] "RemoveContainer" containerID="bebbfd8ef866e8017390d7e3d92fc2367ff25f3a387ebe08d632993bd42cca5e" Mar 10 08:22:29 crc kubenswrapper[4825]: I0310 08:22:29.320972 4825 scope.go:117] "RemoveContainer" containerID="196cc46eba22ec39b0ad1c24de8a56f85a59e98235ca3e10b53d94586f3cd214" Mar 10 08:22:29 crc kubenswrapper[4825]: E0310 08:22:29.321266 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-6b9bd96c95-wvxct_openstack(575df1da-f60a-4891-b163-459c3af78f95)\"" pod="openstack/heat-cfnapi-6b9bd96c95-wvxct" podUID="575df1da-f60a-4891-b163-459c3af78f95" Mar 10 08:22:29 crc kubenswrapper[4825]: I0310 08:22:29.336954 4825 scope.go:117] "RemoveContainer" containerID="9c797d4fc902e575dcd01d129baf8bd762f5c56479f0d5c5fc6855dfa2533def" Mar 10 08:22:29 crc kubenswrapper[4825]: E0310 08:22:29.337690 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-56569cbcb6-tnnsc_openstack(7c39bb33-53f2-4a5d-b37e-eb2e5feeda72)\"" pod="openstack/heat-api-56569cbcb6-tnnsc" podUID="7c39bb33-53f2-4a5d-b37e-eb2e5feeda72" Mar 10 08:22:30 crc kubenswrapper[4825]: I0310 
08:22:30.346079 4825 scope.go:117] "RemoveContainer" containerID="196cc46eba22ec39b0ad1c24de8a56f85a59e98235ca3e10b53d94586f3cd214" Mar 10 08:22:30 crc kubenswrapper[4825]: E0310 08:22:30.346484 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-6b9bd96c95-wvxct_openstack(575df1da-f60a-4891-b163-459c3af78f95)\"" pod="openstack/heat-cfnapi-6b9bd96c95-wvxct" podUID="575df1da-f60a-4891-b163-459c3af78f95" Mar 10 08:22:30 crc kubenswrapper[4825]: I0310 08:22:30.424329 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-fdb9f75f4-fr95n" Mar 10 08:22:30 crc kubenswrapper[4825]: I0310 08:22:30.495515 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-67b47658d-rd7tt"] Mar 10 08:22:30 crc kubenswrapper[4825]: I0310 08:22:30.495791 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-67b47658d-rd7tt" podUID="402e8d63-8847-44af-9a62-9536d0f513f8" containerName="horizon-log" containerID="cri-o://39d78a2a9b3841e3d9ee90de8594f18c0fb81678d34f6df1cd41a80bb869fbf6" gracePeriod=30 Mar 10 08:22:30 crc kubenswrapper[4825]: I0310 08:22:30.495937 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-67b47658d-rd7tt" podUID="402e8d63-8847-44af-9a62-9536d0f513f8" containerName="horizon" containerID="cri-o://4734b56c5e54339cc3180b88d104ea545423352854d359b2160cb88f7ae1ab80" gracePeriod=30 Mar 10 08:22:30 crc kubenswrapper[4825]: I0310 08:22:30.570694 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-6b9bd96c95-wvxct" Mar 10 08:22:30 crc kubenswrapper[4825]: I0310 08:22:30.652904 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-56569cbcb6-tnnsc" Mar 10 08:22:30 crc kubenswrapper[4825]: 
I0310 08:22:30.653659 4825 scope.go:117] "RemoveContainer" containerID="9c797d4fc902e575dcd01d129baf8bd762f5c56479f0d5c5fc6855dfa2533def" Mar 10 08:22:30 crc kubenswrapper[4825]: E0310 08:22:30.653877 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-56569cbcb6-tnnsc_openstack(7c39bb33-53f2-4a5d-b37e-eb2e5feeda72)\"" pod="openstack/heat-api-56569cbcb6-tnnsc" podUID="7c39bb33-53f2-4a5d-b37e-eb2e5feeda72" Mar 10 08:22:30 crc kubenswrapper[4825]: I0310 08:22:30.653897 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-56569cbcb6-tnnsc" Mar 10 08:22:31 crc kubenswrapper[4825]: I0310 08:22:31.354447 4825 scope.go:117] "RemoveContainer" containerID="9c797d4fc902e575dcd01d129baf8bd762f5c56479f0d5c5fc6855dfa2533def" Mar 10 08:22:31 crc kubenswrapper[4825]: I0310 08:22:31.355045 4825 scope.go:117] "RemoveContainer" containerID="196cc46eba22ec39b0ad1c24de8a56f85a59e98235ca3e10b53d94586f3cd214" Mar 10 08:22:31 crc kubenswrapper[4825]: E0310 08:22:31.355295 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-6b9bd96c95-wvxct_openstack(575df1da-f60a-4891-b163-459c3af78f95)\"" pod="openstack/heat-cfnapi-6b9bd96c95-wvxct" podUID="575df1da-f60a-4891-b163-459c3af78f95" Mar 10 08:22:31 crc kubenswrapper[4825]: E0310 08:22:31.355298 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-56569cbcb6-tnnsc_openstack(7c39bb33-53f2-4a5d-b37e-eb2e5feeda72)\"" pod="openstack/heat-api-56569cbcb6-tnnsc" podUID="7c39bb33-53f2-4a5d-b37e-eb2e5feeda72" Mar 10 08:22:32 crc kubenswrapper[4825]: I0310 08:22:32.035221 4825 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-6947b5fb54-lvrgg" podUID="2424da63-7441-4f0f-bbd3-532e759a5b36" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.1.149:8004/healthcheck\": read tcp 10.217.0.2:58858->10.217.1.149:8004: read: connection reset by peer" Mar 10 08:22:32 crc kubenswrapper[4825]: I0310 08:22:32.368344 4825 generic.go:334] "Generic (PLEG): container finished" podID="2424da63-7441-4f0f-bbd3-532e759a5b36" containerID="a520736295f231bcb9741cf3ff1ac386c24ebda890527271778549fb8552554f" exitCode=0 Mar 10 08:22:32 crc kubenswrapper[4825]: I0310 08:22:32.368386 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6947b5fb54-lvrgg" event={"ID":"2424da63-7441-4f0f-bbd3-532e759a5b36","Type":"ContainerDied","Data":"a520736295f231bcb9741cf3ff1ac386c24ebda890527271778549fb8552554f"} Mar 10 08:22:32 crc kubenswrapper[4825]: I0310 08:22:32.504245 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-6947b5fb54-lvrgg" Mar 10 08:22:32 crc kubenswrapper[4825]: I0310 08:22:32.628995 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2424da63-7441-4f0f-bbd3-532e759a5b36-config-data-custom\") pod \"2424da63-7441-4f0f-bbd3-532e759a5b36\" (UID: \"2424da63-7441-4f0f-bbd3-532e759a5b36\") " Mar 10 08:22:32 crc kubenswrapper[4825]: I0310 08:22:32.629336 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2424da63-7441-4f0f-bbd3-532e759a5b36-combined-ca-bundle\") pod \"2424da63-7441-4f0f-bbd3-532e759a5b36\" (UID: \"2424da63-7441-4f0f-bbd3-532e759a5b36\") " Mar 10 08:22:32 crc kubenswrapper[4825]: I0310 08:22:32.629493 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sk4g2\" (UniqueName: \"kubernetes.io/projected/2424da63-7441-4f0f-bbd3-532e759a5b36-kube-api-access-sk4g2\") pod \"2424da63-7441-4f0f-bbd3-532e759a5b36\" (UID: \"2424da63-7441-4f0f-bbd3-532e759a5b36\") " Mar 10 08:22:32 crc kubenswrapper[4825]: I0310 08:22:32.629610 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2424da63-7441-4f0f-bbd3-532e759a5b36-config-data\") pod \"2424da63-7441-4f0f-bbd3-532e759a5b36\" (UID: \"2424da63-7441-4f0f-bbd3-532e759a5b36\") " Mar 10 08:22:32 crc kubenswrapper[4825]: I0310 08:22:32.637058 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2424da63-7441-4f0f-bbd3-532e759a5b36-kube-api-access-sk4g2" (OuterVolumeSpecName: "kube-api-access-sk4g2") pod "2424da63-7441-4f0f-bbd3-532e759a5b36" (UID: "2424da63-7441-4f0f-bbd3-532e759a5b36"). InnerVolumeSpecName "kube-api-access-sk4g2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:22:32 crc kubenswrapper[4825]: I0310 08:22:32.639493 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2424da63-7441-4f0f-bbd3-532e759a5b36-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2424da63-7441-4f0f-bbd3-532e759a5b36" (UID: "2424da63-7441-4f0f-bbd3-532e759a5b36"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:22:32 crc kubenswrapper[4825]: I0310 08:22:32.670760 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2424da63-7441-4f0f-bbd3-532e759a5b36-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2424da63-7441-4f0f-bbd3-532e759a5b36" (UID: "2424da63-7441-4f0f-bbd3-532e759a5b36"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:22:32 crc kubenswrapper[4825]: I0310 08:22:32.678391 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2424da63-7441-4f0f-bbd3-532e759a5b36-config-data" (OuterVolumeSpecName: "config-data") pod "2424da63-7441-4f0f-bbd3-532e759a5b36" (UID: "2424da63-7441-4f0f-bbd3-532e759a5b36"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:22:32 crc kubenswrapper[4825]: I0310 08:22:32.734535 4825 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2424da63-7441-4f0f-bbd3-532e759a5b36-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 08:22:32 crc kubenswrapper[4825]: I0310 08:22:32.734592 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2424da63-7441-4f0f-bbd3-532e759a5b36-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:22:32 crc kubenswrapper[4825]: I0310 08:22:32.734604 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sk4g2\" (UniqueName: \"kubernetes.io/projected/2424da63-7441-4f0f-bbd3-532e759a5b36-kube-api-access-sk4g2\") on node \"crc\" DevicePath \"\"" Mar 10 08:22:32 crc kubenswrapper[4825]: I0310 08:22:32.734622 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2424da63-7441-4f0f-bbd3-532e759a5b36-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 08:22:33 crc kubenswrapper[4825]: I0310 08:22:33.382296 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6947b5fb54-lvrgg" event={"ID":"2424da63-7441-4f0f-bbd3-532e759a5b36","Type":"ContainerDied","Data":"2494aaee240fd33d943d0ed60739e800a66dae8af5f8182cf4ed4a8e13c0b7ca"} Mar 10 08:22:33 crc kubenswrapper[4825]: I0310 08:22:33.382675 4825 scope.go:117] "RemoveContainer" containerID="a520736295f231bcb9741cf3ff1ac386c24ebda890527271778549fb8552554f" Mar 10 08:22:33 crc kubenswrapper[4825]: I0310 08:22:33.382376 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-6947b5fb54-lvrgg" Mar 10 08:22:33 crc kubenswrapper[4825]: I0310 08:22:33.415564 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-6947b5fb54-lvrgg"] Mar 10 08:22:33 crc kubenswrapper[4825]: I0310 08:22:33.423651 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-6947b5fb54-lvrgg"] Mar 10 08:22:34 crc kubenswrapper[4825]: I0310 08:22:34.401449 4825 generic.go:334] "Generic (PLEG): container finished" podID="402e8d63-8847-44af-9a62-9536d0f513f8" containerID="4734b56c5e54339cc3180b88d104ea545423352854d359b2160cb88f7ae1ab80" exitCode=0 Mar 10 08:22:34 crc kubenswrapper[4825]: I0310 08:22:34.401548 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67b47658d-rd7tt" event={"ID":"402e8d63-8847-44af-9a62-9536d0f513f8","Type":"ContainerDied","Data":"4734b56c5e54339cc3180b88d104ea545423352854d359b2160cb88f7ae1ab80"} Mar 10 08:22:35 crc kubenswrapper[4825]: I0310 08:22:35.247286 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2424da63-7441-4f0f-bbd3-532e759a5b36" path="/var/lib/kubelet/pods/2424da63-7441-4f0f-bbd3-532e759a5b36/volumes" Mar 10 08:22:37 crc kubenswrapper[4825]: I0310 08:22:37.046021 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-m6vdt"] Mar 10 08:22:37 crc kubenswrapper[4825]: I0310 08:22:37.056582 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-m6vdt"] Mar 10 08:22:37 crc kubenswrapper[4825]: I0310 08:22:37.246940 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf14f62c-18f3-4696-abd2-ea72a47e6773" path="/var/lib/kubelet/pods/bf14f62c-18f3-4696-abd2-ea72a47e6773/volumes" Mar 10 08:22:38 crc kubenswrapper[4825]: I0310 08:22:38.329242 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-65988cc597-hxfff" Mar 10 08:22:38 crc kubenswrapper[4825]: I0310 
08:22:38.342770 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-7fcb64c775-kvwk8" Mar 10 08:22:38 crc kubenswrapper[4825]: I0310 08:22:38.399496 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6b9bd96c95-wvxct"] Mar 10 08:22:38 crc kubenswrapper[4825]: I0310 08:22:38.430425 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-56569cbcb6-tnnsc"] Mar 10 08:22:38 crc kubenswrapper[4825]: I0310 08:22:38.538667 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-775b55954d-5d5kw" Mar 10 08:22:38 crc kubenswrapper[4825]: I0310 08:22:38.908412 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6b9bd96c95-wvxct" Mar 10 08:22:38 crc kubenswrapper[4825]: I0310 08:22:38.917287 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-56569cbcb6-tnnsc" Mar 10 08:22:39 crc kubenswrapper[4825]: I0310 08:22:39.057681 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c39bb33-53f2-4a5d-b37e-eb2e5feeda72-config-data\") pod \"7c39bb33-53f2-4a5d-b37e-eb2e5feeda72\" (UID: \"7c39bb33-53f2-4a5d-b37e-eb2e5feeda72\") " Mar 10 08:22:39 crc kubenswrapper[4825]: I0310 08:22:39.057804 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8m57\" (UniqueName: \"kubernetes.io/projected/575df1da-f60a-4891-b163-459c3af78f95-kube-api-access-z8m57\") pod \"575df1da-f60a-4891-b163-459c3af78f95\" (UID: \"575df1da-f60a-4891-b163-459c3af78f95\") " Mar 10 08:22:39 crc kubenswrapper[4825]: I0310 08:22:39.057858 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/575df1da-f60a-4891-b163-459c3af78f95-config-data-custom\") pod 
\"575df1da-f60a-4891-b163-459c3af78f95\" (UID: \"575df1da-f60a-4891-b163-459c3af78f95\") " Mar 10 08:22:39 crc kubenswrapper[4825]: I0310 08:22:39.057882 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c39bb33-53f2-4a5d-b37e-eb2e5feeda72-combined-ca-bundle\") pod \"7c39bb33-53f2-4a5d-b37e-eb2e5feeda72\" (UID: \"7c39bb33-53f2-4a5d-b37e-eb2e5feeda72\") " Mar 10 08:22:39 crc kubenswrapper[4825]: I0310 08:22:39.058719 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98ks5\" (UniqueName: \"kubernetes.io/projected/7c39bb33-53f2-4a5d-b37e-eb2e5feeda72-kube-api-access-98ks5\") pod \"7c39bb33-53f2-4a5d-b37e-eb2e5feeda72\" (UID: \"7c39bb33-53f2-4a5d-b37e-eb2e5feeda72\") " Mar 10 08:22:39 crc kubenswrapper[4825]: I0310 08:22:39.058840 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/575df1da-f60a-4891-b163-459c3af78f95-config-data\") pod \"575df1da-f60a-4891-b163-459c3af78f95\" (UID: \"575df1da-f60a-4891-b163-459c3af78f95\") " Mar 10 08:22:39 crc kubenswrapper[4825]: I0310 08:22:39.058920 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/575df1da-f60a-4891-b163-459c3af78f95-combined-ca-bundle\") pod \"575df1da-f60a-4891-b163-459c3af78f95\" (UID: \"575df1da-f60a-4891-b163-459c3af78f95\") " Mar 10 08:22:39 crc kubenswrapper[4825]: I0310 08:22:39.058983 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7c39bb33-53f2-4a5d-b37e-eb2e5feeda72-config-data-custom\") pod \"7c39bb33-53f2-4a5d-b37e-eb2e5feeda72\" (UID: \"7c39bb33-53f2-4a5d-b37e-eb2e5feeda72\") " Mar 10 08:22:39 crc kubenswrapper[4825]: I0310 08:22:39.063935 4825 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/7c39bb33-53f2-4a5d-b37e-eb2e5feeda72-kube-api-access-98ks5" (OuterVolumeSpecName: "kube-api-access-98ks5") pod "7c39bb33-53f2-4a5d-b37e-eb2e5feeda72" (UID: "7c39bb33-53f2-4a5d-b37e-eb2e5feeda72"). InnerVolumeSpecName "kube-api-access-98ks5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:22:39 crc kubenswrapper[4825]: I0310 08:22:39.064149 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/575df1da-f60a-4891-b163-459c3af78f95-kube-api-access-z8m57" (OuterVolumeSpecName: "kube-api-access-z8m57") pod "575df1da-f60a-4891-b163-459c3af78f95" (UID: "575df1da-f60a-4891-b163-459c3af78f95"). InnerVolumeSpecName "kube-api-access-z8m57". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:22:39 crc kubenswrapper[4825]: I0310 08:22:39.064543 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/575df1da-f60a-4891-b163-459c3af78f95-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "575df1da-f60a-4891-b163-459c3af78f95" (UID: "575df1da-f60a-4891-b163-459c3af78f95"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:22:39 crc kubenswrapper[4825]: I0310 08:22:39.065383 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c39bb33-53f2-4a5d-b37e-eb2e5feeda72-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7c39bb33-53f2-4a5d-b37e-eb2e5feeda72" (UID: "7c39bb33-53f2-4a5d-b37e-eb2e5feeda72"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:22:39 crc kubenswrapper[4825]: I0310 08:22:39.090654 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c39bb33-53f2-4a5d-b37e-eb2e5feeda72-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7c39bb33-53f2-4a5d-b37e-eb2e5feeda72" (UID: "7c39bb33-53f2-4a5d-b37e-eb2e5feeda72"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:22:39 crc kubenswrapper[4825]: I0310 08:22:39.093446 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/575df1da-f60a-4891-b163-459c3af78f95-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "575df1da-f60a-4891-b163-459c3af78f95" (UID: "575df1da-f60a-4891-b163-459c3af78f95"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:22:39 crc kubenswrapper[4825]: I0310 08:22:39.129354 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c39bb33-53f2-4a5d-b37e-eb2e5feeda72-config-data" (OuterVolumeSpecName: "config-data") pod "7c39bb33-53f2-4a5d-b37e-eb2e5feeda72" (UID: "7c39bb33-53f2-4a5d-b37e-eb2e5feeda72"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:22:39 crc kubenswrapper[4825]: I0310 08:22:39.135733 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/575df1da-f60a-4891-b163-459c3af78f95-config-data" (OuterVolumeSpecName: "config-data") pod "575df1da-f60a-4891-b163-459c3af78f95" (UID: "575df1da-f60a-4891-b163-459c3af78f95"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:22:39 crc kubenswrapper[4825]: I0310 08:22:39.161259 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/575df1da-f60a-4891-b163-459c3af78f95-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 08:22:39 crc kubenswrapper[4825]: I0310 08:22:39.161312 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/575df1da-f60a-4891-b163-459c3af78f95-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:22:39 crc kubenswrapper[4825]: I0310 08:22:39.161326 4825 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7c39bb33-53f2-4a5d-b37e-eb2e5feeda72-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 08:22:39 crc kubenswrapper[4825]: I0310 08:22:39.161335 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c39bb33-53f2-4a5d-b37e-eb2e5feeda72-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 08:22:39 crc kubenswrapper[4825]: I0310 08:22:39.161344 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8m57\" (UniqueName: \"kubernetes.io/projected/575df1da-f60a-4891-b163-459c3af78f95-kube-api-access-z8m57\") on node \"crc\" DevicePath \"\"" Mar 10 08:22:39 crc kubenswrapper[4825]: I0310 08:22:39.161353 4825 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/575df1da-f60a-4891-b163-459c3af78f95-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 08:22:39 crc kubenswrapper[4825]: I0310 08:22:39.161361 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c39bb33-53f2-4a5d-b37e-eb2e5feeda72-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:22:39 crc kubenswrapper[4825]: I0310 
08:22:39.161369 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98ks5\" (UniqueName: \"kubernetes.io/projected/7c39bb33-53f2-4a5d-b37e-eb2e5feeda72-kube-api-access-98ks5\") on node \"crc\" DevicePath \"\"" Mar 10 08:22:39 crc kubenswrapper[4825]: I0310 08:22:39.491487 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-56569cbcb6-tnnsc" event={"ID":"7c39bb33-53f2-4a5d-b37e-eb2e5feeda72","Type":"ContainerDied","Data":"484eb1e5347cc11caa798a22b9cf33e26e95bbcb971e23d615e8b2e992ec7ac5"} Mar 10 08:22:39 crc kubenswrapper[4825]: I0310 08:22:39.491783 4825 scope.go:117] "RemoveContainer" containerID="9c797d4fc902e575dcd01d129baf8bd762f5c56479f0d5c5fc6855dfa2533def" Mar 10 08:22:39 crc kubenswrapper[4825]: I0310 08:22:39.491500 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-56569cbcb6-tnnsc" Mar 10 08:22:39 crc kubenswrapper[4825]: I0310 08:22:39.492640 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6b9bd96c95-wvxct" event={"ID":"575df1da-f60a-4891-b163-459c3af78f95","Type":"ContainerDied","Data":"7ef6fb6e7dcbbc015dea6b205776cb97eab5313491f763ebade9ce5c94b9eb72"} Mar 10 08:22:39 crc kubenswrapper[4825]: I0310 08:22:39.492698 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-6b9bd96c95-wvxct" Mar 10 08:22:39 crc kubenswrapper[4825]: I0310 08:22:39.521143 4825 scope.go:117] "RemoveContainer" containerID="196cc46eba22ec39b0ad1c24de8a56f85a59e98235ca3e10b53d94586f3cd214" Mar 10 08:22:39 crc kubenswrapper[4825]: I0310 08:22:39.523959 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6b9bd96c95-wvxct"] Mar 10 08:22:39 crc kubenswrapper[4825]: I0310 08:22:39.535878 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-6b9bd96c95-wvxct"] Mar 10 08:22:39 crc kubenswrapper[4825]: I0310 08:22:39.544917 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-56569cbcb6-tnnsc"] Mar 10 08:22:39 crc kubenswrapper[4825]: I0310 08:22:39.552400 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-56569cbcb6-tnnsc"] Mar 10 08:22:40 crc kubenswrapper[4825]: I0310 08:22:40.040851 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-67b47658d-rd7tt" podUID="402e8d63-8847-44af-9a62-9536d0f513f8" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.140:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.140:8443: connect: connection refused" Mar 10 08:22:41 crc kubenswrapper[4825]: I0310 08:22:41.246219 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="575df1da-f60a-4891-b163-459c3af78f95" path="/var/lib/kubelet/pods/575df1da-f60a-4891-b163-459c3af78f95/volumes" Mar 10 08:22:41 crc kubenswrapper[4825]: I0310 08:22:41.247005 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c39bb33-53f2-4a5d-b37e-eb2e5feeda72" path="/var/lib/kubelet/pods/7c39bb33-53f2-4a5d-b37e-eb2e5feeda72/volumes" Mar 10 08:22:45 crc kubenswrapper[4825]: I0310 08:22:45.565919 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-55d8bdb9c6-bvvnc" Mar 10 08:22:45 crc 
kubenswrapper[4825]: I0310 08:22:45.619930 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-775b55954d-5d5kw"] Mar 10 08:22:45 crc kubenswrapper[4825]: I0310 08:22:45.620555 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-775b55954d-5d5kw" podUID="6d4b56d5-815b-43ef-851c-78be8efed9d8" containerName="heat-engine" containerID="cri-o://28e40db98698092639f721ad1ded3003afb324fea9d67293d24ba0080e2411ac" gracePeriod=60 Mar 10 08:22:47 crc kubenswrapper[4825]: I0310 08:22:47.924851 4825 scope.go:117] "RemoveContainer" containerID="458b51ca9dd944447c170a4ec933cff17927330de8315141fc288440a8bda21c" Mar 10 08:22:47 crc kubenswrapper[4825]: I0310 08:22:47.964756 4825 scope.go:117] "RemoveContainer" containerID="d1333c08c6dfe951c98f1b355b7accf634ac2610ca31ec9e74c16efe2999b96b" Mar 10 08:22:48 crc kubenswrapper[4825]: I0310 08:22:48.001704 4825 scope.go:117] "RemoveContainer" containerID="57c2b272221ddc23f212408754974013bc3a6e357f2917374ae738af261e9f88" Mar 10 08:22:48 crc kubenswrapper[4825]: I0310 08:22:48.073830 4825 scope.go:117] "RemoveContainer" containerID="036c536c50bcd54816ef85eb5048b562a0550d4afc523cd440acef00376c965c" Mar 10 08:22:48 crc kubenswrapper[4825]: I0310 08:22:48.099528 4825 scope.go:117] "RemoveContainer" containerID="750740d74dc7ce86ef078cec1e5b2c8959a59bb22ae559a45faba7e7d535b367" Mar 10 08:22:48 crc kubenswrapper[4825]: I0310 08:22:48.123000 4825 scope.go:117] "RemoveContainer" containerID="0aac67a9c34adc135dd6601faadaecb8b5564bc2c31f1a368b7f6aa72a870b3a" Mar 10 08:22:48 crc kubenswrapper[4825]: E0310 08:22:48.477194 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="28e40db98698092639f721ad1ded3003afb324fea9d67293d24ba0080e2411ac" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 10 08:22:48 crc 
kubenswrapper[4825]: E0310 08:22:48.478473 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="28e40db98698092639f721ad1ded3003afb324fea9d67293d24ba0080e2411ac" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 10 08:22:48 crc kubenswrapper[4825]: E0310 08:22:48.480106 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="28e40db98698092639f721ad1ded3003afb324fea9d67293d24ba0080e2411ac" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 10 08:22:48 crc kubenswrapper[4825]: E0310 08:22:48.480171 4825 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-775b55954d-5d5kw" podUID="6d4b56d5-815b-43ef-851c-78be8efed9d8" containerName="heat-engine" Mar 10 08:22:50 crc kubenswrapper[4825]: I0310 08:22:50.041577 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-67b47658d-rd7tt" podUID="402e8d63-8847-44af-9a62-9536d0f513f8" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.140:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.140:8443: connect: connection refused" Mar 10 08:22:51 crc kubenswrapper[4825]: I0310 08:22:51.400581 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-775b55954d-5d5kw" Mar 10 08:22:51 crc kubenswrapper[4825]: I0310 08:22:51.523949 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d4b56d5-815b-43ef-851c-78be8efed9d8-config-data\") pod \"6d4b56d5-815b-43ef-851c-78be8efed9d8\" (UID: \"6d4b56d5-815b-43ef-851c-78be8efed9d8\") " Mar 10 08:22:51 crc kubenswrapper[4825]: I0310 08:22:51.524023 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2lz4\" (UniqueName: \"kubernetes.io/projected/6d4b56d5-815b-43ef-851c-78be8efed9d8-kube-api-access-p2lz4\") pod \"6d4b56d5-815b-43ef-851c-78be8efed9d8\" (UID: \"6d4b56d5-815b-43ef-851c-78be8efed9d8\") " Mar 10 08:22:51 crc kubenswrapper[4825]: I0310 08:22:51.524188 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d4b56d5-815b-43ef-851c-78be8efed9d8-combined-ca-bundle\") pod \"6d4b56d5-815b-43ef-851c-78be8efed9d8\" (UID: \"6d4b56d5-815b-43ef-851c-78be8efed9d8\") " Mar 10 08:22:51 crc kubenswrapper[4825]: I0310 08:22:51.524259 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d4b56d5-815b-43ef-851c-78be8efed9d8-config-data-custom\") pod \"6d4b56d5-815b-43ef-851c-78be8efed9d8\" (UID: \"6d4b56d5-815b-43ef-851c-78be8efed9d8\") " Mar 10 08:22:51 crc kubenswrapper[4825]: I0310 08:22:51.531985 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d4b56d5-815b-43ef-851c-78be8efed9d8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6d4b56d5-815b-43ef-851c-78be8efed9d8" (UID: "6d4b56d5-815b-43ef-851c-78be8efed9d8"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:22:51 crc kubenswrapper[4825]: I0310 08:22:51.532508 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d4b56d5-815b-43ef-851c-78be8efed9d8-kube-api-access-p2lz4" (OuterVolumeSpecName: "kube-api-access-p2lz4") pod "6d4b56d5-815b-43ef-851c-78be8efed9d8" (UID: "6d4b56d5-815b-43ef-851c-78be8efed9d8"). InnerVolumeSpecName "kube-api-access-p2lz4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:22:51 crc kubenswrapper[4825]: I0310 08:22:51.574359 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d4b56d5-815b-43ef-851c-78be8efed9d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d4b56d5-815b-43ef-851c-78be8efed9d8" (UID: "6d4b56d5-815b-43ef-851c-78be8efed9d8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:22:51 crc kubenswrapper[4825]: I0310 08:22:51.599983 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d4b56d5-815b-43ef-851c-78be8efed9d8-config-data" (OuterVolumeSpecName: "config-data") pod "6d4b56d5-815b-43ef-851c-78be8efed9d8" (UID: "6d4b56d5-815b-43ef-851c-78be8efed9d8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:22:51 crc kubenswrapper[4825]: I0310 08:22:51.610867 4825 generic.go:334] "Generic (PLEG): container finished" podID="6d4b56d5-815b-43ef-851c-78be8efed9d8" containerID="28e40db98698092639f721ad1ded3003afb324fea9d67293d24ba0080e2411ac" exitCode=0 Mar 10 08:22:51 crc kubenswrapper[4825]: I0310 08:22:51.610932 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-775b55954d-5d5kw" event={"ID":"6d4b56d5-815b-43ef-851c-78be8efed9d8","Type":"ContainerDied","Data":"28e40db98698092639f721ad1ded3003afb324fea9d67293d24ba0080e2411ac"} Mar 10 08:22:51 crc kubenswrapper[4825]: I0310 08:22:51.610983 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-775b55954d-5d5kw" event={"ID":"6d4b56d5-815b-43ef-851c-78be8efed9d8","Type":"ContainerDied","Data":"524797310f525b71621f2312224546c45dd783789efb288458aec0626edfa4a4"} Mar 10 08:22:51 crc kubenswrapper[4825]: I0310 08:22:51.611014 4825 scope.go:117] "RemoveContainer" containerID="28e40db98698092639f721ad1ded3003afb324fea9d67293d24ba0080e2411ac" Mar 10 08:22:51 crc kubenswrapper[4825]: I0310 08:22:51.611223 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-775b55954d-5d5kw" Mar 10 08:22:51 crc kubenswrapper[4825]: I0310 08:22:51.626814 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d4b56d5-815b-43ef-851c-78be8efed9d8-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 08:22:51 crc kubenswrapper[4825]: I0310 08:22:51.626850 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2lz4\" (UniqueName: \"kubernetes.io/projected/6d4b56d5-815b-43ef-851c-78be8efed9d8-kube-api-access-p2lz4\") on node \"crc\" DevicePath \"\"" Mar 10 08:22:51 crc kubenswrapper[4825]: I0310 08:22:51.626863 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d4b56d5-815b-43ef-851c-78be8efed9d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:22:51 crc kubenswrapper[4825]: I0310 08:22:51.626874 4825 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d4b56d5-815b-43ef-851c-78be8efed9d8-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 08:22:51 crc kubenswrapper[4825]: I0310 08:22:51.685964 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-775b55954d-5d5kw"] Mar 10 08:22:51 crc kubenswrapper[4825]: I0310 08:22:51.691998 4825 scope.go:117] "RemoveContainer" containerID="28e40db98698092639f721ad1ded3003afb324fea9d67293d24ba0080e2411ac" Mar 10 08:22:51 crc kubenswrapper[4825]: I0310 08:22:51.694444 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-775b55954d-5d5kw"] Mar 10 08:22:51 crc kubenswrapper[4825]: E0310 08:22:51.695055 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28e40db98698092639f721ad1ded3003afb324fea9d67293d24ba0080e2411ac\": container with ID starting with 
28e40db98698092639f721ad1ded3003afb324fea9d67293d24ba0080e2411ac not found: ID does not exist" containerID="28e40db98698092639f721ad1ded3003afb324fea9d67293d24ba0080e2411ac" Mar 10 08:22:51 crc kubenswrapper[4825]: I0310 08:22:51.695103 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28e40db98698092639f721ad1ded3003afb324fea9d67293d24ba0080e2411ac"} err="failed to get container status \"28e40db98698092639f721ad1ded3003afb324fea9d67293d24ba0080e2411ac\": rpc error: code = NotFound desc = could not find container \"28e40db98698092639f721ad1ded3003afb324fea9d67293d24ba0080e2411ac\": container with ID starting with 28e40db98698092639f721ad1ded3003afb324fea9d67293d24ba0080e2411ac not found: ID does not exist" Mar 10 08:22:53 crc kubenswrapper[4825]: I0310 08:22:53.249691 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d4b56d5-815b-43ef-851c-78be8efed9d8" path="/var/lib/kubelet/pods/6d4b56d5-815b-43ef-851c-78be8efed9d8/volumes" Mar 10 08:23:00 crc kubenswrapper[4825]: I0310 08:23:00.041366 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-67b47658d-rd7tt" podUID="402e8d63-8847-44af-9a62-9536d0f513f8" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.140:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.140:8443: connect: connection refused" Mar 10 08:23:00 crc kubenswrapper[4825]: I0310 08:23:00.041949 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-67b47658d-rd7tt" Mar 10 08:23:00 crc kubenswrapper[4825]: I0310 08:23:00.699102 4825 generic.go:334] "Generic (PLEG): container finished" podID="402e8d63-8847-44af-9a62-9536d0f513f8" containerID="39d78a2a9b3841e3d9ee90de8594f18c0fb81678d34f6df1cd41a80bb869fbf6" exitCode=137 Mar 10 08:23:00 crc kubenswrapper[4825]: I0310 08:23:00.699158 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67b47658d-rd7tt" 
event={"ID":"402e8d63-8847-44af-9a62-9536d0f513f8","Type":"ContainerDied","Data":"39d78a2a9b3841e3d9ee90de8594f18c0fb81678d34f6df1cd41a80bb869fbf6"} Mar 10 08:23:00 crc kubenswrapper[4825]: I0310 08:23:00.924955 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67b47658d-rd7tt" Mar 10 08:23:01 crc kubenswrapper[4825]: I0310 08:23:01.012429 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/402e8d63-8847-44af-9a62-9536d0f513f8-horizon-tls-certs\") pod \"402e8d63-8847-44af-9a62-9536d0f513f8\" (UID: \"402e8d63-8847-44af-9a62-9536d0f513f8\") " Mar 10 08:23:01 crc kubenswrapper[4825]: I0310 08:23:01.012485 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/402e8d63-8847-44af-9a62-9536d0f513f8-horizon-secret-key\") pod \"402e8d63-8847-44af-9a62-9536d0f513f8\" (UID: \"402e8d63-8847-44af-9a62-9536d0f513f8\") " Mar 10 08:23:01 crc kubenswrapper[4825]: I0310 08:23:01.012531 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/402e8d63-8847-44af-9a62-9536d0f513f8-scripts\") pod \"402e8d63-8847-44af-9a62-9536d0f513f8\" (UID: \"402e8d63-8847-44af-9a62-9536d0f513f8\") " Mar 10 08:23:01 crc kubenswrapper[4825]: I0310 08:23:01.012594 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/402e8d63-8847-44af-9a62-9536d0f513f8-combined-ca-bundle\") pod \"402e8d63-8847-44af-9a62-9536d0f513f8\" (UID: \"402e8d63-8847-44af-9a62-9536d0f513f8\") " Mar 10 08:23:01 crc kubenswrapper[4825]: I0310 08:23:01.012658 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/402e8d63-8847-44af-9a62-9536d0f513f8-logs\") pod 
\"402e8d63-8847-44af-9a62-9536d0f513f8\" (UID: \"402e8d63-8847-44af-9a62-9536d0f513f8\") " Mar 10 08:23:01 crc kubenswrapper[4825]: I0310 08:23:01.012686 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/402e8d63-8847-44af-9a62-9536d0f513f8-config-data\") pod \"402e8d63-8847-44af-9a62-9536d0f513f8\" (UID: \"402e8d63-8847-44af-9a62-9536d0f513f8\") " Mar 10 08:23:01 crc kubenswrapper[4825]: I0310 08:23:01.013008 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txh6n\" (UniqueName: \"kubernetes.io/projected/402e8d63-8847-44af-9a62-9536d0f513f8-kube-api-access-txh6n\") pod \"402e8d63-8847-44af-9a62-9536d0f513f8\" (UID: \"402e8d63-8847-44af-9a62-9536d0f513f8\") " Mar 10 08:23:01 crc kubenswrapper[4825]: I0310 08:23:01.013807 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/402e8d63-8847-44af-9a62-9536d0f513f8-logs" (OuterVolumeSpecName: "logs") pod "402e8d63-8847-44af-9a62-9536d0f513f8" (UID: "402e8d63-8847-44af-9a62-9536d0f513f8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:23:01 crc kubenswrapper[4825]: I0310 08:23:01.018170 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/402e8d63-8847-44af-9a62-9536d0f513f8-kube-api-access-txh6n" (OuterVolumeSpecName: "kube-api-access-txh6n") pod "402e8d63-8847-44af-9a62-9536d0f513f8" (UID: "402e8d63-8847-44af-9a62-9536d0f513f8"). InnerVolumeSpecName "kube-api-access-txh6n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:23:01 crc kubenswrapper[4825]: I0310 08:23:01.024463 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/402e8d63-8847-44af-9a62-9536d0f513f8-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "402e8d63-8847-44af-9a62-9536d0f513f8" (UID: "402e8d63-8847-44af-9a62-9536d0f513f8"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:23:01 crc kubenswrapper[4825]: I0310 08:23:01.040595 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/402e8d63-8847-44af-9a62-9536d0f513f8-config-data" (OuterVolumeSpecName: "config-data") pod "402e8d63-8847-44af-9a62-9536d0f513f8" (UID: "402e8d63-8847-44af-9a62-9536d0f513f8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:23:01 crc kubenswrapper[4825]: I0310 08:23:01.041580 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/402e8d63-8847-44af-9a62-9536d0f513f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "402e8d63-8847-44af-9a62-9536d0f513f8" (UID: "402e8d63-8847-44af-9a62-9536d0f513f8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:23:01 crc kubenswrapper[4825]: I0310 08:23:01.054957 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/402e8d63-8847-44af-9a62-9536d0f513f8-scripts" (OuterVolumeSpecName: "scripts") pod "402e8d63-8847-44af-9a62-9536d0f513f8" (UID: "402e8d63-8847-44af-9a62-9536d0f513f8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:23:01 crc kubenswrapper[4825]: I0310 08:23:01.065684 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/402e8d63-8847-44af-9a62-9536d0f513f8-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "402e8d63-8847-44af-9a62-9536d0f513f8" (UID: "402e8d63-8847-44af-9a62-9536d0f513f8"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:23:01 crc kubenswrapper[4825]: I0310 08:23:01.115506 4825 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/402e8d63-8847-44af-9a62-9536d0f513f8-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 10 08:23:01 crc kubenswrapper[4825]: I0310 08:23:01.115592 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/402e8d63-8847-44af-9a62-9536d0f513f8-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 08:23:01 crc kubenswrapper[4825]: I0310 08:23:01.115604 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/402e8d63-8847-44af-9a62-9536d0f513f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:23:01 crc kubenswrapper[4825]: I0310 08:23:01.115616 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/402e8d63-8847-44af-9a62-9536d0f513f8-logs\") on node \"crc\" DevicePath \"\"" Mar 10 08:23:01 crc kubenswrapper[4825]: I0310 08:23:01.115625 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/402e8d63-8847-44af-9a62-9536d0f513f8-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 08:23:01 crc kubenswrapper[4825]: I0310 08:23:01.115636 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txh6n\" (UniqueName: 
\"kubernetes.io/projected/402e8d63-8847-44af-9a62-9536d0f513f8-kube-api-access-txh6n\") on node \"crc\" DevicePath \"\"" Mar 10 08:23:01 crc kubenswrapper[4825]: I0310 08:23:01.115650 4825 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/402e8d63-8847-44af-9a62-9536d0f513f8-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 08:23:01 crc kubenswrapper[4825]: I0310 08:23:01.710460 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67b47658d-rd7tt" event={"ID":"402e8d63-8847-44af-9a62-9536d0f513f8","Type":"ContainerDied","Data":"7e36d87b4e68d8d7adc8fb6d4ef04caffc17639225cb0019e06415c10c2ecd8b"} Mar 10 08:23:01 crc kubenswrapper[4825]: I0310 08:23:01.710522 4825 scope.go:117] "RemoveContainer" containerID="4734b56c5e54339cc3180b88d104ea545423352854d359b2160cb88f7ae1ab80" Mar 10 08:23:01 crc kubenswrapper[4825]: I0310 08:23:01.710625 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67b47658d-rd7tt" Mar 10 08:23:01 crc kubenswrapper[4825]: I0310 08:23:01.736886 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-67b47658d-rd7tt"] Mar 10 08:23:01 crc kubenswrapper[4825]: I0310 08:23:01.745856 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-67b47658d-rd7tt"] Mar 10 08:23:01 crc kubenswrapper[4825]: I0310 08:23:01.900814 4825 scope.go:117] "RemoveContainer" containerID="39d78a2a9b3841e3d9ee90de8594f18c0fb81678d34f6df1cd41a80bb869fbf6" Mar 10 08:23:03 crc kubenswrapper[4825]: I0310 08:23:03.251422 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="402e8d63-8847-44af-9a62-9536d0f513f8" path="/var/lib/kubelet/pods/402e8d63-8847-44af-9a62-9536d0f513f8/volumes" Mar 10 08:23:04 crc kubenswrapper[4825]: I0310 08:23:04.082278 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-acb9-account-create-update-s4m8d"] Mar 10 
08:23:04 crc kubenswrapper[4825]: I0310 08:23:04.090996 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-g6wcl"] Mar 10 08:23:04 crc kubenswrapper[4825]: I0310 08:23:04.104536 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-acb9-account-create-update-s4m8d"] Mar 10 08:23:04 crc kubenswrapper[4825]: I0310 08:23:04.115183 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-g6wcl"] Mar 10 08:23:05 crc kubenswrapper[4825]: I0310 08:23:05.246488 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29d7bb79-c212-470e-927d-9415a2e9f206" path="/var/lib/kubelet/pods/29d7bb79-c212-470e-927d-9415a2e9f206/volumes" Mar 10 08:23:05 crc kubenswrapper[4825]: I0310 08:23:05.247436 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d38f516f-6216-4ed2-9efe-95492dfc61e8" path="/var/lib/kubelet/pods/d38f516f-6216-4ed2-9efe-95492dfc61e8/volumes" Mar 10 08:23:11 crc kubenswrapper[4825]: I0310 08:23:11.189021 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-69dtf"] Mar 10 08:23:11 crc kubenswrapper[4825]: E0310 08:23:11.189976 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d4b56d5-815b-43ef-851c-78be8efed9d8" containerName="heat-engine" Mar 10 08:23:11 crc kubenswrapper[4825]: I0310 08:23:11.189991 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d4b56d5-815b-43ef-851c-78be8efed9d8" containerName="heat-engine" Mar 10 08:23:11 crc kubenswrapper[4825]: E0310 08:23:11.190014 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58f58fb9-cd6f-49e4-945f-a3fd33bf45f2" containerName="heat-cfnapi" Mar 10 08:23:11 crc kubenswrapper[4825]: I0310 08:23:11.190022 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="58f58fb9-cd6f-49e4-945f-a3fd33bf45f2" containerName="heat-cfnapi" Mar 10 08:23:11 crc kubenswrapper[4825]: E0310 08:23:11.190042 
4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c39bb33-53f2-4a5d-b37e-eb2e5feeda72" containerName="heat-api" Mar 10 08:23:11 crc kubenswrapper[4825]: I0310 08:23:11.190052 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c39bb33-53f2-4a5d-b37e-eb2e5feeda72" containerName="heat-api" Mar 10 08:23:11 crc kubenswrapper[4825]: E0310 08:23:11.190063 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="575df1da-f60a-4891-b163-459c3af78f95" containerName="heat-cfnapi" Mar 10 08:23:11 crc kubenswrapper[4825]: I0310 08:23:11.190071 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="575df1da-f60a-4891-b163-459c3af78f95" containerName="heat-cfnapi" Mar 10 08:23:11 crc kubenswrapper[4825]: E0310 08:23:11.190082 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c39bb33-53f2-4a5d-b37e-eb2e5feeda72" containerName="heat-api" Mar 10 08:23:11 crc kubenswrapper[4825]: I0310 08:23:11.190090 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c39bb33-53f2-4a5d-b37e-eb2e5feeda72" containerName="heat-api" Mar 10 08:23:11 crc kubenswrapper[4825]: E0310 08:23:11.190109 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="402e8d63-8847-44af-9a62-9536d0f513f8" containerName="horizon" Mar 10 08:23:11 crc kubenswrapper[4825]: I0310 08:23:11.190117 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="402e8d63-8847-44af-9a62-9536d0f513f8" containerName="horizon" Mar 10 08:23:11 crc kubenswrapper[4825]: E0310 08:23:11.190140 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2424da63-7441-4f0f-bbd3-532e759a5b36" containerName="heat-api" Mar 10 08:23:11 crc kubenswrapper[4825]: I0310 08:23:11.190166 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="2424da63-7441-4f0f-bbd3-532e759a5b36" containerName="heat-api" Mar 10 08:23:11 crc kubenswrapper[4825]: E0310 08:23:11.190181 4825 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="402e8d63-8847-44af-9a62-9536d0f513f8" containerName="horizon-log" Mar 10 08:23:11 crc kubenswrapper[4825]: I0310 08:23:11.190192 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="402e8d63-8847-44af-9a62-9536d0f513f8" containerName="horizon-log" Mar 10 08:23:11 crc kubenswrapper[4825]: E0310 08:23:11.190214 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="575df1da-f60a-4891-b163-459c3af78f95" containerName="heat-cfnapi" Mar 10 08:23:11 crc kubenswrapper[4825]: I0310 08:23:11.190221 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="575df1da-f60a-4891-b163-459c3af78f95" containerName="heat-cfnapi" Mar 10 08:23:11 crc kubenswrapper[4825]: I0310 08:23:11.190429 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c39bb33-53f2-4a5d-b37e-eb2e5feeda72" containerName="heat-api" Mar 10 08:23:11 crc kubenswrapper[4825]: I0310 08:23:11.190453 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="575df1da-f60a-4891-b163-459c3af78f95" containerName="heat-cfnapi" Mar 10 08:23:11 crc kubenswrapper[4825]: I0310 08:23:11.190467 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="2424da63-7441-4f0f-bbd3-532e759a5b36" containerName="heat-api" Mar 10 08:23:11 crc kubenswrapper[4825]: I0310 08:23:11.190481 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="58f58fb9-cd6f-49e4-945f-a3fd33bf45f2" containerName="heat-cfnapi" Mar 10 08:23:11 crc kubenswrapper[4825]: I0310 08:23:11.190496 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="575df1da-f60a-4891-b163-459c3af78f95" containerName="heat-cfnapi" Mar 10 08:23:11 crc kubenswrapper[4825]: I0310 08:23:11.190504 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d4b56d5-815b-43ef-851c-78be8efed9d8" containerName="heat-engine" Mar 10 08:23:11 crc kubenswrapper[4825]: I0310 08:23:11.190521 4825 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="402e8d63-8847-44af-9a62-9536d0f513f8" containerName="horizon" Mar 10 08:23:11 crc kubenswrapper[4825]: I0310 08:23:11.190536 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="402e8d63-8847-44af-9a62-9536d0f513f8" containerName="horizon-log" Mar 10 08:23:11 crc kubenswrapper[4825]: I0310 08:23:11.191022 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c39bb33-53f2-4a5d-b37e-eb2e5feeda72" containerName="heat-api" Mar 10 08:23:11 crc kubenswrapper[4825]: I0310 08:23:11.192388 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-69dtf" Mar 10 08:23:11 crc kubenswrapper[4825]: I0310 08:23:11.217295 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-69dtf"] Mar 10 08:23:11 crc kubenswrapper[4825]: I0310 08:23:11.325836 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecfeb4dd-9935-411d-890c-57fede23310e-utilities\") pod \"certified-operators-69dtf\" (UID: \"ecfeb4dd-9935-411d-890c-57fede23310e\") " pod="openshift-marketplace/certified-operators-69dtf" Mar 10 08:23:11 crc kubenswrapper[4825]: I0310 08:23:11.326243 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecfeb4dd-9935-411d-890c-57fede23310e-catalog-content\") pod \"certified-operators-69dtf\" (UID: \"ecfeb4dd-9935-411d-890c-57fede23310e\") " pod="openshift-marketplace/certified-operators-69dtf" Mar 10 08:23:11 crc kubenswrapper[4825]: I0310 08:23:11.326455 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4v8m\" (UniqueName: \"kubernetes.io/projected/ecfeb4dd-9935-411d-890c-57fede23310e-kube-api-access-k4v8m\") pod \"certified-operators-69dtf\" (UID: 
\"ecfeb4dd-9935-411d-890c-57fede23310e\") " pod="openshift-marketplace/certified-operators-69dtf" Mar 10 08:23:11 crc kubenswrapper[4825]: I0310 08:23:11.428095 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4v8m\" (UniqueName: \"kubernetes.io/projected/ecfeb4dd-9935-411d-890c-57fede23310e-kube-api-access-k4v8m\") pod \"certified-operators-69dtf\" (UID: \"ecfeb4dd-9935-411d-890c-57fede23310e\") " pod="openshift-marketplace/certified-operators-69dtf" Mar 10 08:23:11 crc kubenswrapper[4825]: I0310 08:23:11.428268 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecfeb4dd-9935-411d-890c-57fede23310e-utilities\") pod \"certified-operators-69dtf\" (UID: \"ecfeb4dd-9935-411d-890c-57fede23310e\") " pod="openshift-marketplace/certified-operators-69dtf" Mar 10 08:23:11 crc kubenswrapper[4825]: I0310 08:23:11.428323 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecfeb4dd-9935-411d-890c-57fede23310e-catalog-content\") pod \"certified-operators-69dtf\" (UID: \"ecfeb4dd-9935-411d-890c-57fede23310e\") " pod="openshift-marketplace/certified-operators-69dtf" Mar 10 08:23:11 crc kubenswrapper[4825]: I0310 08:23:11.428839 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecfeb4dd-9935-411d-890c-57fede23310e-utilities\") pod \"certified-operators-69dtf\" (UID: \"ecfeb4dd-9935-411d-890c-57fede23310e\") " pod="openshift-marketplace/certified-operators-69dtf" Mar 10 08:23:11 crc kubenswrapper[4825]: I0310 08:23:11.428865 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecfeb4dd-9935-411d-890c-57fede23310e-catalog-content\") pod \"certified-operators-69dtf\" (UID: \"ecfeb4dd-9935-411d-890c-57fede23310e\") 
" pod="openshift-marketplace/certified-operators-69dtf" Mar 10 08:23:11 crc kubenswrapper[4825]: I0310 08:23:11.462094 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4v8m\" (UniqueName: \"kubernetes.io/projected/ecfeb4dd-9935-411d-890c-57fede23310e-kube-api-access-k4v8m\") pod \"certified-operators-69dtf\" (UID: \"ecfeb4dd-9935-411d-890c-57fede23310e\") " pod="openshift-marketplace/certified-operators-69dtf" Mar 10 08:23:11 crc kubenswrapper[4825]: I0310 08:23:11.513331 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-69dtf" Mar 10 08:23:11 crc kubenswrapper[4825]: I0310 08:23:11.966220 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-69dtf"] Mar 10 08:23:11 crc kubenswrapper[4825]: W0310 08:23:11.968645 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecfeb4dd_9935_411d_890c_57fede23310e.slice/crio-275d933f979bd4c08b06ff24ac1ea7435c39bf800ccfa8de85f9bf56aebc2f76 WatchSource:0}: Error finding container 275d933f979bd4c08b06ff24ac1ea7435c39bf800ccfa8de85f9bf56aebc2f76: Status 404 returned error can't find the container with id 275d933f979bd4c08b06ff24ac1ea7435c39bf800ccfa8de85f9bf56aebc2f76 Mar 10 08:23:12 crc kubenswrapper[4825]: I0310 08:23:12.040179 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-s9c9n"] Mar 10 08:23:12 crc kubenswrapper[4825]: I0310 08:23:12.057147 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-s9c9n"] Mar 10 08:23:12 crc kubenswrapper[4825]: I0310 08:23:12.805680 4825 generic.go:334] "Generic (PLEG): container finished" podID="ecfeb4dd-9935-411d-890c-57fede23310e" containerID="80beef6fa43fc467850eca7180aa43af989c42886ed3ebb3904c8709d4a7d624" exitCode=0 Mar 10 08:23:12 crc kubenswrapper[4825]: I0310 08:23:12.805747 4825 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-69dtf" event={"ID":"ecfeb4dd-9935-411d-890c-57fede23310e","Type":"ContainerDied","Data":"80beef6fa43fc467850eca7180aa43af989c42886ed3ebb3904c8709d4a7d624"} Mar 10 08:23:12 crc kubenswrapper[4825]: I0310 08:23:12.805975 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-69dtf" event={"ID":"ecfeb4dd-9935-411d-890c-57fede23310e","Type":"ContainerStarted","Data":"275d933f979bd4c08b06ff24ac1ea7435c39bf800ccfa8de85f9bf56aebc2f76"} Mar 10 08:23:13 crc kubenswrapper[4825]: I0310 08:23:13.251850 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="311dddb8-bb79-4100-8497-ee38e452b266" path="/var/lib/kubelet/pods/311dddb8-bb79-4100-8497-ee38e452b266/volumes" Mar 10 08:23:13 crc kubenswrapper[4825]: I0310 08:23:13.816101 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-69dtf" event={"ID":"ecfeb4dd-9935-411d-890c-57fede23310e","Type":"ContainerStarted","Data":"f687b767bc18bf72fe88ba7475b43133d090ff4d30833617a752856651ba3c29"} Mar 10 08:23:15 crc kubenswrapper[4825]: I0310 08:23:15.835624 4825 generic.go:334] "Generic (PLEG): container finished" podID="ecfeb4dd-9935-411d-890c-57fede23310e" containerID="f687b767bc18bf72fe88ba7475b43133d090ff4d30833617a752856651ba3c29" exitCode=0 Mar 10 08:23:15 crc kubenswrapper[4825]: I0310 08:23:15.835775 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-69dtf" event={"ID":"ecfeb4dd-9935-411d-890c-57fede23310e","Type":"ContainerDied","Data":"f687b767bc18bf72fe88ba7475b43133d090ff4d30833617a752856651ba3c29"} Mar 10 08:23:16 crc kubenswrapper[4825]: I0310 08:23:16.861942 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-69dtf" 
event={"ID":"ecfeb4dd-9935-411d-890c-57fede23310e","Type":"ContainerStarted","Data":"3dd157622019997ba2b1be899c4f3f6bb8aca2c173c548d961796eacd86782b5"} Mar 10 08:23:16 crc kubenswrapper[4825]: I0310 08:23:16.903506 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-69dtf" podStartSLOduration=2.459025699 podStartE2EDuration="5.903481559s" podCreationTimestamp="2026-03-10 08:23:11 +0000 UTC" firstStartedPulling="2026-03-10 08:23:12.808554242 +0000 UTC m=+5945.838334857" lastFinishedPulling="2026-03-10 08:23:16.253010102 +0000 UTC m=+5949.282790717" observedRunningTime="2026-03-10 08:23:16.879644464 +0000 UTC m=+5949.909425109" watchObservedRunningTime="2026-03-10 08:23:16.903481559 +0000 UTC m=+5949.933262174" Mar 10 08:23:17 crc kubenswrapper[4825]: I0310 08:23:17.797613 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkk6b"] Mar 10 08:23:17 crc kubenswrapper[4825]: I0310 08:23:17.800201 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkk6b" Mar 10 08:23:17 crc kubenswrapper[4825]: I0310 08:23:17.802577 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 10 08:23:17 crc kubenswrapper[4825]: I0310 08:23:17.809518 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkk6b"] Mar 10 08:23:17 crc kubenswrapper[4825]: I0310 08:23:17.868100 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb28j\" (UniqueName: \"kubernetes.io/projected/84d59e89-a282-426f-9173-22b57c51522a-kube-api-access-pb28j\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkk6b\" (UID: \"84d59e89-a282-426f-9173-22b57c51522a\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkk6b" Mar 10 08:23:17 crc kubenswrapper[4825]: I0310 08:23:17.868810 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/84d59e89-a282-426f-9173-22b57c51522a-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkk6b\" (UID: \"84d59e89-a282-426f-9173-22b57c51522a\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkk6b" Mar 10 08:23:17 crc kubenswrapper[4825]: I0310 08:23:17.868961 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/84d59e89-a282-426f-9173-22b57c51522a-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkk6b\" (UID: \"84d59e89-a282-426f-9173-22b57c51522a\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkk6b" Mar 10 08:23:17 crc kubenswrapper[4825]: 
I0310 08:23:17.970513 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pb28j\" (UniqueName: \"kubernetes.io/projected/84d59e89-a282-426f-9173-22b57c51522a-kube-api-access-pb28j\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkk6b\" (UID: \"84d59e89-a282-426f-9173-22b57c51522a\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkk6b" Mar 10 08:23:17 crc kubenswrapper[4825]: I0310 08:23:17.970615 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/84d59e89-a282-426f-9173-22b57c51522a-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkk6b\" (UID: \"84d59e89-a282-426f-9173-22b57c51522a\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkk6b" Mar 10 08:23:17 crc kubenswrapper[4825]: I0310 08:23:17.970714 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/84d59e89-a282-426f-9173-22b57c51522a-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkk6b\" (UID: \"84d59e89-a282-426f-9173-22b57c51522a\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkk6b" Mar 10 08:23:17 crc kubenswrapper[4825]: I0310 08:23:17.971213 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/84d59e89-a282-426f-9173-22b57c51522a-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkk6b\" (UID: \"84d59e89-a282-426f-9173-22b57c51522a\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkk6b" Mar 10 08:23:17 crc kubenswrapper[4825]: I0310 08:23:17.971270 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/84d59e89-a282-426f-9173-22b57c51522a-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkk6b\" (UID: \"84d59e89-a282-426f-9173-22b57c51522a\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkk6b" Mar 10 08:23:17 crc kubenswrapper[4825]: I0310 08:23:17.989866 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb28j\" (UniqueName: \"kubernetes.io/projected/84d59e89-a282-426f-9173-22b57c51522a-kube-api-access-pb28j\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkk6b\" (UID: \"84d59e89-a282-426f-9173-22b57c51522a\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkk6b" Mar 10 08:23:18 crc kubenswrapper[4825]: I0310 08:23:18.123660 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkk6b" Mar 10 08:23:18 crc kubenswrapper[4825]: I0310 08:23:18.584344 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkk6b"] Mar 10 08:23:18 crc kubenswrapper[4825]: W0310 08:23:18.593393 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84d59e89_a282_426f_9173_22b57c51522a.slice/crio-274e7e27c899273ced199bfdff3d6045e2493c911f768378b9f60f94e113b989 WatchSource:0}: Error finding container 274e7e27c899273ced199bfdff3d6045e2493c911f768378b9f60f94e113b989: Status 404 returned error can't find the container with id 274e7e27c899273ced199bfdff3d6045e2493c911f768378b9f60f94e113b989 Mar 10 08:23:18 crc kubenswrapper[4825]: I0310 08:23:18.883562 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkk6b" 
event={"ID":"84d59e89-a282-426f-9173-22b57c51522a","Type":"ContainerStarted","Data":"0880bcd369921d1c1029e68757c17ecfb845eb938c94d5edc729ac96b22268aa"} Mar 10 08:23:18 crc kubenswrapper[4825]: I0310 08:23:18.883631 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkk6b" event={"ID":"84d59e89-a282-426f-9173-22b57c51522a","Type":"ContainerStarted","Data":"274e7e27c899273ced199bfdff3d6045e2493c911f768378b9f60f94e113b989"} Mar 10 08:23:19 crc kubenswrapper[4825]: I0310 08:23:19.899842 4825 generic.go:334] "Generic (PLEG): container finished" podID="84d59e89-a282-426f-9173-22b57c51522a" containerID="0880bcd369921d1c1029e68757c17ecfb845eb938c94d5edc729ac96b22268aa" exitCode=0 Mar 10 08:23:19 crc kubenswrapper[4825]: I0310 08:23:19.900070 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkk6b" event={"ID":"84d59e89-a282-426f-9173-22b57c51522a","Type":"ContainerDied","Data":"0880bcd369921d1c1029e68757c17ecfb845eb938c94d5edc729ac96b22268aa"} Mar 10 08:23:21 crc kubenswrapper[4825]: I0310 08:23:21.513447 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-69dtf" Mar 10 08:23:21 crc kubenswrapper[4825]: I0310 08:23:21.513924 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-69dtf" Mar 10 08:23:21 crc kubenswrapper[4825]: I0310 08:23:21.921883 4825 generic.go:334] "Generic (PLEG): container finished" podID="84d59e89-a282-426f-9173-22b57c51522a" containerID="180f5943588d03da587191c2588bfef63a98f971c8fd51d33f8334d14385aca3" exitCode=0 Mar 10 08:23:21 crc kubenswrapper[4825]: I0310 08:23:21.921957 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkk6b" 
event={"ID":"84d59e89-a282-426f-9173-22b57c51522a","Type":"ContainerDied","Data":"180f5943588d03da587191c2588bfef63a98f971c8fd51d33f8334d14385aca3"} Mar 10 08:23:22 crc kubenswrapper[4825]: I0310 08:23:22.558565 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-69dtf" podUID="ecfeb4dd-9935-411d-890c-57fede23310e" containerName="registry-server" probeResult="failure" output=< Mar 10 08:23:22 crc kubenswrapper[4825]: timeout: failed to connect service ":50051" within 1s Mar 10 08:23:22 crc kubenswrapper[4825]: > Mar 10 08:23:22 crc kubenswrapper[4825]: I0310 08:23:22.935854 4825 generic.go:334] "Generic (PLEG): container finished" podID="84d59e89-a282-426f-9173-22b57c51522a" containerID="502d610c1e11a65f6177b28d889e16ed541d1b553c1668668744ef5ecea0a5c4" exitCode=0 Mar 10 08:23:22 crc kubenswrapper[4825]: I0310 08:23:22.935894 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkk6b" event={"ID":"84d59e89-a282-426f-9173-22b57c51522a","Type":"ContainerDied","Data":"502d610c1e11a65f6177b28d889e16ed541d1b553c1668668744ef5ecea0a5c4"} Mar 10 08:23:24 crc kubenswrapper[4825]: I0310 08:23:24.254947 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkk6b" Mar 10 08:23:24 crc kubenswrapper[4825]: I0310 08:23:24.335504 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/84d59e89-a282-426f-9173-22b57c51522a-bundle\") pod \"84d59e89-a282-426f-9173-22b57c51522a\" (UID: \"84d59e89-a282-426f-9173-22b57c51522a\") " Mar 10 08:23:24 crc kubenswrapper[4825]: I0310 08:23:24.335661 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pb28j\" (UniqueName: \"kubernetes.io/projected/84d59e89-a282-426f-9173-22b57c51522a-kube-api-access-pb28j\") pod \"84d59e89-a282-426f-9173-22b57c51522a\" (UID: \"84d59e89-a282-426f-9173-22b57c51522a\") " Mar 10 08:23:24 crc kubenswrapper[4825]: I0310 08:23:24.335727 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/84d59e89-a282-426f-9173-22b57c51522a-util\") pod \"84d59e89-a282-426f-9173-22b57c51522a\" (UID: \"84d59e89-a282-426f-9173-22b57c51522a\") " Mar 10 08:23:24 crc kubenswrapper[4825]: I0310 08:23:24.338479 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84d59e89-a282-426f-9173-22b57c51522a-bundle" (OuterVolumeSpecName: "bundle") pod "84d59e89-a282-426f-9173-22b57c51522a" (UID: "84d59e89-a282-426f-9173-22b57c51522a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:23:24 crc kubenswrapper[4825]: I0310 08:23:24.342154 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84d59e89-a282-426f-9173-22b57c51522a-kube-api-access-pb28j" (OuterVolumeSpecName: "kube-api-access-pb28j") pod "84d59e89-a282-426f-9173-22b57c51522a" (UID: "84d59e89-a282-426f-9173-22b57c51522a"). InnerVolumeSpecName "kube-api-access-pb28j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:23:24 crc kubenswrapper[4825]: I0310 08:23:24.347875 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84d59e89-a282-426f-9173-22b57c51522a-util" (OuterVolumeSpecName: "util") pod "84d59e89-a282-426f-9173-22b57c51522a" (UID: "84d59e89-a282-426f-9173-22b57c51522a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:23:24 crc kubenswrapper[4825]: I0310 08:23:24.440739 4825 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/84d59e89-a282-426f-9173-22b57c51522a-util\") on node \"crc\" DevicePath \"\"" Mar 10 08:23:24 crc kubenswrapper[4825]: I0310 08:23:24.440850 4825 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/84d59e89-a282-426f-9173-22b57c51522a-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:23:24 crc kubenswrapper[4825]: I0310 08:23:24.440873 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pb28j\" (UniqueName: \"kubernetes.io/projected/84d59e89-a282-426f-9173-22b57c51522a-kube-api-access-pb28j\") on node \"crc\" DevicePath \"\"" Mar 10 08:23:24 crc kubenswrapper[4825]: I0310 08:23:24.955111 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkk6b" event={"ID":"84d59e89-a282-426f-9173-22b57c51522a","Type":"ContainerDied","Data":"274e7e27c899273ced199bfdff3d6045e2493c911f768378b9f60f94e113b989"} Mar 10 08:23:24 crc kubenswrapper[4825]: I0310 08:23:24.955433 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="274e7e27c899273ced199bfdff3d6045e2493c911f768378b9f60f94e113b989" Mar 10 08:23:24 crc kubenswrapper[4825]: I0310 08:23:24.955252 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkk6b" Mar 10 08:23:31 crc kubenswrapper[4825]: I0310 08:23:31.603435 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-69dtf" Mar 10 08:23:31 crc kubenswrapper[4825]: I0310 08:23:31.700725 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-69dtf" Mar 10 08:23:33 crc kubenswrapper[4825]: I0310 08:23:33.750905 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-69dtf"] Mar 10 08:23:33 crc kubenswrapper[4825]: I0310 08:23:33.751339 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-69dtf" podUID="ecfeb4dd-9935-411d-890c-57fede23310e" containerName="registry-server" containerID="cri-o://3dd157622019997ba2b1be899c4f3f6bb8aca2c173c548d961796eacd86782b5" gracePeriod=2 Mar 10 08:23:34 crc kubenswrapper[4825]: I0310 08:23:34.040668 4825 generic.go:334] "Generic (PLEG): container finished" podID="ecfeb4dd-9935-411d-890c-57fede23310e" containerID="3dd157622019997ba2b1be899c4f3f6bb8aca2c173c548d961796eacd86782b5" exitCode=0 Mar 10 08:23:34 crc kubenswrapper[4825]: I0310 08:23:34.040744 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-69dtf" event={"ID":"ecfeb4dd-9935-411d-890c-57fede23310e","Type":"ContainerDied","Data":"3dd157622019997ba2b1be899c4f3f6bb8aca2c173c548d961796eacd86782b5"} Mar 10 08:23:34 crc kubenswrapper[4825]: I0310 08:23:34.578934 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-69dtf" Mar 10 08:23:34 crc kubenswrapper[4825]: I0310 08:23:34.655738 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4v8m\" (UniqueName: \"kubernetes.io/projected/ecfeb4dd-9935-411d-890c-57fede23310e-kube-api-access-k4v8m\") pod \"ecfeb4dd-9935-411d-890c-57fede23310e\" (UID: \"ecfeb4dd-9935-411d-890c-57fede23310e\") " Mar 10 08:23:34 crc kubenswrapper[4825]: I0310 08:23:34.655817 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecfeb4dd-9935-411d-890c-57fede23310e-catalog-content\") pod \"ecfeb4dd-9935-411d-890c-57fede23310e\" (UID: \"ecfeb4dd-9935-411d-890c-57fede23310e\") " Mar 10 08:23:34 crc kubenswrapper[4825]: I0310 08:23:34.655872 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecfeb4dd-9935-411d-890c-57fede23310e-utilities\") pod \"ecfeb4dd-9935-411d-890c-57fede23310e\" (UID: \"ecfeb4dd-9935-411d-890c-57fede23310e\") " Mar 10 08:23:34 crc kubenswrapper[4825]: I0310 08:23:34.657097 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecfeb4dd-9935-411d-890c-57fede23310e-utilities" (OuterVolumeSpecName: "utilities") pod "ecfeb4dd-9935-411d-890c-57fede23310e" (UID: "ecfeb4dd-9935-411d-890c-57fede23310e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:23:34 crc kubenswrapper[4825]: I0310 08:23:34.683475 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecfeb4dd-9935-411d-890c-57fede23310e-kube-api-access-k4v8m" (OuterVolumeSpecName: "kube-api-access-k4v8m") pod "ecfeb4dd-9935-411d-890c-57fede23310e" (UID: "ecfeb4dd-9935-411d-890c-57fede23310e"). InnerVolumeSpecName "kube-api-access-k4v8m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:23:34 crc kubenswrapper[4825]: I0310 08:23:34.756377 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecfeb4dd-9935-411d-890c-57fede23310e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ecfeb4dd-9935-411d-890c-57fede23310e" (UID: "ecfeb4dd-9935-411d-890c-57fede23310e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:23:34 crc kubenswrapper[4825]: I0310 08:23:34.758092 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4v8m\" (UniqueName: \"kubernetes.io/projected/ecfeb4dd-9935-411d-890c-57fede23310e-kube-api-access-k4v8m\") on node \"crc\" DevicePath \"\"" Mar 10 08:23:34 crc kubenswrapper[4825]: I0310 08:23:34.758149 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecfeb4dd-9935-411d-890c-57fede23310e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 08:23:34 crc kubenswrapper[4825]: I0310 08:23:34.758162 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecfeb4dd-9935-411d-890c-57fede23310e-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.056770 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-69dtf" event={"ID":"ecfeb4dd-9935-411d-890c-57fede23310e","Type":"ContainerDied","Data":"275d933f979bd4c08b06ff24ac1ea7435c39bf800ccfa8de85f9bf56aebc2f76"} Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.056823 4825 scope.go:117] "RemoveContainer" containerID="3dd157622019997ba2b1be899c4f3f6bb8aca2c173c548d961796eacd86782b5" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.056843 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-69dtf" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.088774 4825 scope.go:117] "RemoveContainer" containerID="f687b767bc18bf72fe88ba7475b43133d090ff4d30833617a752856651ba3c29" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.119855 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-69dtf"] Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.157412 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-69dtf"] Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.166372 4825 scope.go:117] "RemoveContainer" containerID="80beef6fa43fc467850eca7180aa43af989c42886ed3ebb3904c8709d4a7d624" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.178331 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-xhhcv"] Mar 10 08:23:35 crc kubenswrapper[4825]: E0310 08:23:35.178866 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecfeb4dd-9935-411d-890c-57fede23310e" containerName="extract-content" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.178890 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecfeb4dd-9935-411d-890c-57fede23310e" containerName="extract-content" Mar 10 08:23:35 crc kubenswrapper[4825]: E0310 08:23:35.178927 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecfeb4dd-9935-411d-890c-57fede23310e" containerName="extract-utilities" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.178936 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecfeb4dd-9935-411d-890c-57fede23310e" containerName="extract-utilities" Mar 10 08:23:35 crc kubenswrapper[4825]: E0310 08:23:35.178954 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecfeb4dd-9935-411d-890c-57fede23310e" containerName="registry-server" Mar 10 08:23:35 crc 
kubenswrapper[4825]: I0310 08:23:35.178961 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecfeb4dd-9935-411d-890c-57fede23310e" containerName="registry-server" Mar 10 08:23:35 crc kubenswrapper[4825]: E0310 08:23:35.178978 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d59e89-a282-426f-9173-22b57c51522a" containerName="util" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.178986 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d59e89-a282-426f-9173-22b57c51522a" containerName="util" Mar 10 08:23:35 crc kubenswrapper[4825]: E0310 08:23:35.179004 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d59e89-a282-426f-9173-22b57c51522a" containerName="pull" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.179012 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d59e89-a282-426f-9173-22b57c51522a" containerName="pull" Mar 10 08:23:35 crc kubenswrapper[4825]: E0310 08:23:35.179028 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d59e89-a282-426f-9173-22b57c51522a" containerName="extract" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.179036 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d59e89-a282-426f-9173-22b57c51522a" containerName="extract" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.179275 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecfeb4dd-9935-411d-890c-57fede23310e" containerName="registry-server" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.179304 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="84d59e89-a282-426f-9173-22b57c51522a" containerName="extract" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.180201 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-xhhcv" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.185361 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-98s7l" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.185556 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.197571 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.226951 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-xhhcv"] Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.253560 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecfeb4dd-9935-411d-890c-57fede23310e" path="/var/lib/kubelet/pods/ecfeb4dd-9935-411d-890c-57fede23310e/volumes" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.254293 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-bbfbfd454-rcqmb"] Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.255634 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bbfbfd454-rcqmb" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.259248 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-bbfbfd454-md4ll"] Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.262570 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-nb6nm" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.262795 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.265319 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bbfbfd454-md4ll" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.272300 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knwgv\" (UniqueName: \"kubernetes.io/projected/66cf4082-239b-4875-b3cc-4f83e75f3c41-kube-api-access-knwgv\") pod \"obo-prometheus-operator-68bc856cb9-xhhcv\" (UID: \"66cf4082-239b-4875-b3cc-4f83e75f3c41\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-xhhcv" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.287285 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-bbfbfd454-md4ll"] Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.303710 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-bbfbfd454-rcqmb"] Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.376919 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knwgv\" (UniqueName: 
\"kubernetes.io/projected/66cf4082-239b-4875-b3cc-4f83e75f3c41-kube-api-access-knwgv\") pod \"obo-prometheus-operator-68bc856cb9-xhhcv\" (UID: \"66cf4082-239b-4875-b3cc-4f83e75f3c41\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-xhhcv" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.377337 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3d3e4992-fd3b-42ba-af1c-65278a4b277e-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-bbfbfd454-md4ll\" (UID: \"3d3e4992-fd3b-42ba-af1c-65278a4b277e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bbfbfd454-md4ll" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.377497 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a291a88f-48bf-45a9-80f1-e558286ab74a-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-bbfbfd454-rcqmb\" (UID: \"a291a88f-48bf-45a9-80f1-e558286ab74a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bbfbfd454-rcqmb" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.377589 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3d3e4992-fd3b-42ba-af1c-65278a4b277e-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-bbfbfd454-md4ll\" (UID: \"3d3e4992-fd3b-42ba-af1c-65278a4b277e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bbfbfd454-md4ll" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.377664 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a291a88f-48bf-45a9-80f1-e558286ab74a-apiservice-cert\") pod 
\"obo-prometheus-operator-admission-webhook-bbfbfd454-rcqmb\" (UID: \"a291a88f-48bf-45a9-80f1-e558286ab74a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bbfbfd454-rcqmb" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.377841 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-rtvkd"] Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.379181 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-rtvkd" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.386197 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-rtvkd"] Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.386670 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.386854 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-hd27x" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.398026 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knwgv\" (UniqueName: \"kubernetes.io/projected/66cf4082-239b-4875-b3cc-4f83e75f3c41-kube-api-access-knwgv\") pod \"obo-prometheus-operator-68bc856cb9-xhhcv\" (UID: \"66cf4082-239b-4875-b3cc-4f83e75f3c41\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-xhhcv" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.479632 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zzds\" (UniqueName: \"kubernetes.io/projected/0b126417-0421-4901-bbf0-e8c75dffa4d5-kube-api-access-5zzds\") pod \"observability-operator-59bdc8b94-rtvkd\" (UID: \"0b126417-0421-4901-bbf0-e8c75dffa4d5\") " 
pod="openshift-operators/observability-operator-59bdc8b94-rtvkd" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.479721 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3d3e4992-fd3b-42ba-af1c-65278a4b277e-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-bbfbfd454-md4ll\" (UID: \"3d3e4992-fd3b-42ba-af1c-65278a4b277e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bbfbfd454-md4ll" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.479794 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a291a88f-48bf-45a9-80f1-e558286ab74a-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-bbfbfd454-rcqmb\" (UID: \"a291a88f-48bf-45a9-80f1-e558286ab74a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bbfbfd454-rcqmb" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.479837 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3d3e4992-fd3b-42ba-af1c-65278a4b277e-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-bbfbfd454-md4ll\" (UID: \"3d3e4992-fd3b-42ba-af1c-65278a4b277e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bbfbfd454-md4ll" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.479856 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a291a88f-48bf-45a9-80f1-e558286ab74a-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-bbfbfd454-rcqmb\" (UID: \"a291a88f-48bf-45a9-80f1-e558286ab74a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bbfbfd454-rcqmb" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.479875 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/0b126417-0421-4901-bbf0-e8c75dffa4d5-observability-operator-tls\") pod \"observability-operator-59bdc8b94-rtvkd\" (UID: \"0b126417-0421-4901-bbf0-e8c75dffa4d5\") " pod="openshift-operators/observability-operator-59bdc8b94-rtvkd" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.482760 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a291a88f-48bf-45a9-80f1-e558286ab74a-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-bbfbfd454-rcqmb\" (UID: \"a291a88f-48bf-45a9-80f1-e558286ab74a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bbfbfd454-rcqmb" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.485842 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3d3e4992-fd3b-42ba-af1c-65278a4b277e-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-bbfbfd454-md4ll\" (UID: \"3d3e4992-fd3b-42ba-af1c-65278a4b277e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bbfbfd454-md4ll" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.485947 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3d3e4992-fd3b-42ba-af1c-65278a4b277e-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-bbfbfd454-md4ll\" (UID: \"3d3e4992-fd3b-42ba-af1c-65278a4b277e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bbfbfd454-md4ll" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.492669 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a291a88f-48bf-45a9-80f1-e558286ab74a-webhook-cert\") pod 
\"obo-prometheus-operator-admission-webhook-bbfbfd454-rcqmb\" (UID: \"a291a88f-48bf-45a9-80f1-e558286ab74a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bbfbfd454-rcqmb" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.537914 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-xhhcv" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.543510 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-jxdsc"] Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.552578 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-jxdsc" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.559955 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-zp5mk" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.575204 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-jxdsc"] Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.577232 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bbfbfd454-rcqmb" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.583726 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqkc8\" (UniqueName: \"kubernetes.io/projected/e9441b13-c0dc-478c-90c1-43abb52482af-kube-api-access-kqkc8\") pod \"perses-operator-5bf474d74f-jxdsc\" (UID: \"e9441b13-c0dc-478c-90c1-43abb52482af\") " pod="openshift-operators/perses-operator-5bf474d74f-jxdsc" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.583783 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zzds\" (UniqueName: \"kubernetes.io/projected/0b126417-0421-4901-bbf0-e8c75dffa4d5-kube-api-access-5zzds\") pod \"observability-operator-59bdc8b94-rtvkd\" (UID: \"0b126417-0421-4901-bbf0-e8c75dffa4d5\") " pod="openshift-operators/observability-operator-59bdc8b94-rtvkd" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.583879 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/0b126417-0421-4901-bbf0-e8c75dffa4d5-observability-operator-tls\") pod \"observability-operator-59bdc8b94-rtvkd\" (UID: \"0b126417-0421-4901-bbf0-e8c75dffa4d5\") " pod="openshift-operators/observability-operator-59bdc8b94-rtvkd" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.584024 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/e9441b13-c0dc-478c-90c1-43abb52482af-openshift-service-ca\") pod \"perses-operator-5bf474d74f-jxdsc\" (UID: \"e9441b13-c0dc-478c-90c1-43abb52482af\") " pod="openshift-operators/perses-operator-5bf474d74f-jxdsc" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.588906 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/0b126417-0421-4901-bbf0-e8c75dffa4d5-observability-operator-tls\") pod \"observability-operator-59bdc8b94-rtvkd\" (UID: \"0b126417-0421-4901-bbf0-e8c75dffa4d5\") " pod="openshift-operators/observability-operator-59bdc8b94-rtvkd" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.591659 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bbfbfd454-md4ll" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.609604 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zzds\" (UniqueName: \"kubernetes.io/projected/0b126417-0421-4901-bbf0-e8c75dffa4d5-kube-api-access-5zzds\") pod \"observability-operator-59bdc8b94-rtvkd\" (UID: \"0b126417-0421-4901-bbf0-e8c75dffa4d5\") " pod="openshift-operators/observability-operator-59bdc8b94-rtvkd" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.688470 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/e9441b13-c0dc-478c-90c1-43abb52482af-openshift-service-ca\") pod \"perses-operator-5bf474d74f-jxdsc\" (UID: \"e9441b13-c0dc-478c-90c1-43abb52482af\") " pod="openshift-operators/perses-operator-5bf474d74f-jxdsc" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.688603 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqkc8\" (UniqueName: \"kubernetes.io/projected/e9441b13-c0dc-478c-90c1-43abb52482af-kube-api-access-kqkc8\") pod \"perses-operator-5bf474d74f-jxdsc\" (UID: \"e9441b13-c0dc-478c-90c1-43abb52482af\") " pod="openshift-operators/perses-operator-5bf474d74f-jxdsc" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.689697 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/e9441b13-c0dc-478c-90c1-43abb52482af-openshift-service-ca\") pod \"perses-operator-5bf474d74f-jxdsc\" (UID: \"e9441b13-c0dc-478c-90c1-43abb52482af\") " pod="openshift-operators/perses-operator-5bf474d74f-jxdsc" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.707270 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqkc8\" (UniqueName: \"kubernetes.io/projected/e9441b13-c0dc-478c-90c1-43abb52482af-kube-api-access-kqkc8\") pod \"perses-operator-5bf474d74f-jxdsc\" (UID: \"e9441b13-c0dc-478c-90c1-43abb52482af\") " pod="openshift-operators/perses-operator-5bf474d74f-jxdsc" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.748126 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-rtvkd" Mar 10 08:23:35 crc kubenswrapper[4825]: I0310 08:23:35.906438 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-jxdsc" Mar 10 08:23:36 crc kubenswrapper[4825]: I0310 08:23:36.126337 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-xhhcv"] Mar 10 08:23:36 crc kubenswrapper[4825]: W0310 08:23:36.135535 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66cf4082_239b_4875_b3cc_4f83e75f3c41.slice/crio-4dfab80b49aa27e223b4a62cd3fb5654adbd00a186bb4aa93053457990145da7 WatchSource:0}: Error finding container 4dfab80b49aa27e223b4a62cd3fb5654adbd00a186bb4aa93053457990145da7: Status 404 returned error can't find the container with id 4dfab80b49aa27e223b4a62cd3fb5654adbd00a186bb4aa93053457990145da7 Mar 10 08:23:36 crc kubenswrapper[4825]: W0310 08:23:36.343784 4825 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda291a88f_48bf_45a9_80f1_e558286ab74a.slice/crio-54c6bdf1eddfb76bb357737c79bab60a062bb751984ea5ff501533e30b77309c WatchSource:0}: Error finding container 54c6bdf1eddfb76bb357737c79bab60a062bb751984ea5ff501533e30b77309c: Status 404 returned error can't find the container with id 54c6bdf1eddfb76bb357737c79bab60a062bb751984ea5ff501533e30b77309c Mar 10 08:23:36 crc kubenswrapper[4825]: I0310 08:23:36.345525 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-bbfbfd454-rcqmb"] Mar 10 08:23:36 crc kubenswrapper[4825]: I0310 08:23:36.393255 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-bbfbfd454-md4ll"] Mar 10 08:23:36 crc kubenswrapper[4825]: W0310 08:23:36.400681 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d3e4992_fd3b_42ba_af1c_65278a4b277e.slice/crio-f5670e63a6585eee110bc1e611905637a442bc90fca2009ae6d7c9b661da5b12 WatchSource:0}: Error finding container f5670e63a6585eee110bc1e611905637a442bc90fca2009ae6d7c9b661da5b12: Status 404 returned error can't find the container with id f5670e63a6585eee110bc1e611905637a442bc90fca2009ae6d7c9b661da5b12 Mar 10 08:23:36 crc kubenswrapper[4825]: I0310 08:23:36.467161 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-rtvkd"] Mar 10 08:23:36 crc kubenswrapper[4825]: I0310 08:23:36.645636 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-jxdsc"] Mar 10 08:23:37 crc kubenswrapper[4825]: I0310 08:23:37.116443 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-xhhcv" 
event={"ID":"66cf4082-239b-4875-b3cc-4f83e75f3c41","Type":"ContainerStarted","Data":"4dfab80b49aa27e223b4a62cd3fb5654adbd00a186bb4aa93053457990145da7"} Mar 10 08:23:37 crc kubenswrapper[4825]: I0310 08:23:37.121751 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bbfbfd454-md4ll" event={"ID":"3d3e4992-fd3b-42ba-af1c-65278a4b277e","Type":"ContainerStarted","Data":"f5670e63a6585eee110bc1e611905637a442bc90fca2009ae6d7c9b661da5b12"} Mar 10 08:23:37 crc kubenswrapper[4825]: I0310 08:23:37.127213 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-rtvkd" event={"ID":"0b126417-0421-4901-bbf0-e8c75dffa4d5","Type":"ContainerStarted","Data":"82565836c98b9b8593eb617721041c372204f50d6aa733c9dacff393a9bf9270"} Mar 10 08:23:37 crc kubenswrapper[4825]: I0310 08:23:37.129490 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bbfbfd454-rcqmb" event={"ID":"a291a88f-48bf-45a9-80f1-e558286ab74a","Type":"ContainerStarted","Data":"54c6bdf1eddfb76bb357737c79bab60a062bb751984ea5ff501533e30b77309c"} Mar 10 08:23:37 crc kubenswrapper[4825]: I0310 08:23:37.131399 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-jxdsc" event={"ID":"e9441b13-c0dc-478c-90c1-43abb52482af","Type":"ContainerStarted","Data":"121ba134f702d4c71068c50309146d3f2d8f43f68a4137e03d53dc7a16006fbc"} Mar 10 08:23:48 crc kubenswrapper[4825]: I0310 08:23:48.377347 4825 scope.go:117] "RemoveContainer" containerID="c57527c0af5115273139dc4418c1d2b2e55ae48862c11d14b4fe0cce50780c5a" Mar 10 08:23:49 crc kubenswrapper[4825]: I0310 08:23:49.777040 4825 scope.go:117] "RemoveContainer" containerID="2e9fa9c64694d31a7607ba8d638cb7990e44d0ec8780f6c393152feaa46ef579" Mar 10 08:23:49 crc kubenswrapper[4825]: I0310 08:23:49.831808 4825 scope.go:117] "RemoveContainer" 
containerID="40edc66acaea18a4331bbb40750d16617c40c50f75e69141af2f7dbbdc2b1274" Mar 10 08:23:50 crc kubenswrapper[4825]: I0310 08:23:50.302708 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bbfbfd454-md4ll" event={"ID":"3d3e4992-fd3b-42ba-af1c-65278a4b277e","Type":"ContainerStarted","Data":"9c629a716bd41ec6e2c40a862c9e43984bef8032b58525a37b2f4a8d86382119"} Mar 10 08:23:50 crc kubenswrapper[4825]: I0310 08:23:50.335655 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-jxdsc" Mar 10 08:23:50 crc kubenswrapper[4825]: I0310 08:23:50.341253 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bbfbfd454-md4ll" podStartSLOduration=1.968741098 podStartE2EDuration="15.341234678s" podCreationTimestamp="2026-03-10 08:23:35 +0000 UTC" firstStartedPulling="2026-03-10 08:23:36.404923266 +0000 UTC m=+5969.434703881" lastFinishedPulling="2026-03-10 08:23:49.777416846 +0000 UTC m=+5982.807197461" observedRunningTime="2026-03-10 08:23:50.335930119 +0000 UTC m=+5983.365710754" watchObservedRunningTime="2026-03-10 08:23:50.341234678 +0000 UTC m=+5983.371015293" Mar 10 08:23:50 crc kubenswrapper[4825]: I0310 08:23:50.378952 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-jxdsc" podStartSLOduration=2.242821854 podStartE2EDuration="15.378930678s" podCreationTimestamp="2026-03-10 08:23:35 +0000 UTC" firstStartedPulling="2026-03-10 08:23:36.650947915 +0000 UTC m=+5969.680728530" lastFinishedPulling="2026-03-10 08:23:49.787056739 +0000 UTC m=+5982.816837354" observedRunningTime="2026-03-10 08:23:50.364285103 +0000 UTC m=+5983.394065738" watchObservedRunningTime="2026-03-10 08:23:50.378930678 +0000 UTC m=+5983.408711293" Mar 10 08:23:51 crc kubenswrapper[4825]: I0310 08:23:51.344766 4825 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-rtvkd" event={"ID":"0b126417-0421-4901-bbf0-e8c75dffa4d5","Type":"ContainerStarted","Data":"9df716d57d2130706f49291ecc30f0961368852c403a56487f5b7b0e19dd0779"} Mar 10 08:23:51 crc kubenswrapper[4825]: I0310 08:23:51.345171 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-rtvkd" Mar 10 08:23:51 crc kubenswrapper[4825]: I0310 08:23:51.346796 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-jxdsc" event={"ID":"e9441b13-c0dc-478c-90c1-43abb52482af","Type":"ContainerStarted","Data":"037ef432e30aa21dc1a3b0396c14f575470fffc7e0f01779857064da495c0918"} Mar 10 08:23:51 crc kubenswrapper[4825]: I0310 08:23:51.348495 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bbfbfd454-rcqmb" event={"ID":"a291a88f-48bf-45a9-80f1-e558286ab74a","Type":"ContainerStarted","Data":"6e7cd2cca3527aa32b1621ce7f0815926439ebeda0710a6a7440924db5d11c57"} Mar 10 08:23:51 crc kubenswrapper[4825]: I0310 08:23:51.350438 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-xhhcv" event={"ID":"66cf4082-239b-4875-b3cc-4f83e75f3c41","Type":"ContainerStarted","Data":"55813c3ef1ada890c00fe9690990b1ff5afac642a27ae9bd4f1fabf0486bc64f"} Mar 10 08:23:51 crc kubenswrapper[4825]: I0310 08:23:51.356434 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-rtvkd" Mar 10 08:23:51 crc kubenswrapper[4825]: I0310 08:23:51.374805 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-rtvkd" podStartSLOduration=3.080637901 podStartE2EDuration="16.374784393s" podCreationTimestamp="2026-03-10 08:23:35 
+0000 UTC" firstStartedPulling="2026-03-10 08:23:36.492054344 +0000 UTC m=+5969.521834949" lastFinishedPulling="2026-03-10 08:23:49.786200826 +0000 UTC m=+5982.815981441" observedRunningTime="2026-03-10 08:23:51.369016182 +0000 UTC m=+5984.398796797" watchObservedRunningTime="2026-03-10 08:23:51.374784393 +0000 UTC m=+5984.404565008" Mar 10 08:23:51 crc kubenswrapper[4825]: I0310 08:23:51.398624 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-xhhcv" podStartSLOduration=2.742559644 podStartE2EDuration="16.398607389s" podCreationTimestamp="2026-03-10 08:23:35 +0000 UTC" firstStartedPulling="2026-03-10 08:23:36.151109412 +0000 UTC m=+5969.180890027" lastFinishedPulling="2026-03-10 08:23:49.807157147 +0000 UTC m=+5982.836937772" observedRunningTime="2026-03-10 08:23:51.39218467 +0000 UTC m=+5984.421965275" watchObservedRunningTime="2026-03-10 08:23:51.398607389 +0000 UTC m=+5984.428388004" Mar 10 08:23:51 crc kubenswrapper[4825]: I0310 08:23:51.424912 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bbfbfd454-rcqmb" podStartSLOduration=2.999067399 podStartE2EDuration="16.424891289s" podCreationTimestamp="2026-03-10 08:23:35 +0000 UTC" firstStartedPulling="2026-03-10 08:23:36.351410311 +0000 UTC m=+5969.381190926" lastFinishedPulling="2026-03-10 08:23:49.777234201 +0000 UTC m=+5982.807014816" observedRunningTime="2026-03-10 08:23:51.4150208 +0000 UTC m=+5984.444801445" watchObservedRunningTime="2026-03-10 08:23:51.424891289 +0000 UTC m=+5984.454671904" Mar 10 08:23:55 crc kubenswrapper[4825]: I0310 08:23:55.910701 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-jxdsc" Mar 10 08:23:58 crc kubenswrapper[4825]: I0310 08:23:58.694474 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 
10 08:23:58 crc kubenswrapper[4825]: I0310 08:23:58.694943 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de" containerName="openstackclient" containerID="cri-o://e7ff1f930e87f5d9e3cabeec960ef7b6904a7905154d6c779781b767eab69638" gracePeriod=2 Mar 10 08:23:58 crc kubenswrapper[4825]: I0310 08:23:58.772010 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 10 08:23:58 crc kubenswrapper[4825]: I0310 08:23:58.792057 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 10 08:23:58 crc kubenswrapper[4825]: E0310 08:23:58.792558 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de" containerName="openstackclient" Mar 10 08:23:58 crc kubenswrapper[4825]: I0310 08:23:58.792587 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de" containerName="openstackclient" Mar 10 08:23:58 crc kubenswrapper[4825]: I0310 08:23:58.792810 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de" containerName="openstackclient" Mar 10 08:23:58 crc kubenswrapper[4825]: I0310 08:23:58.793775 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 10 08:23:58 crc kubenswrapper[4825]: I0310 08:23:58.812687 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 10 08:23:58 crc kubenswrapper[4825]: I0310 08:23:58.836053 4825 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de" podUID="6942d3f9-7a95-4c4f-9c04-49b338e4f82f" Mar 10 08:23:58 crc kubenswrapper[4825]: I0310 08:23:58.938793 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6942d3f9-7a95-4c4f-9c04-49b338e4f82f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6942d3f9-7a95-4c4f-9c04-49b338e4f82f\") " pod="openstack/openstackclient" Mar 10 08:23:58 crc kubenswrapper[4825]: I0310 08:23:58.938852 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6942d3f9-7a95-4c4f-9c04-49b338e4f82f-openstack-config\") pod \"openstackclient\" (UID: \"6942d3f9-7a95-4c4f-9c04-49b338e4f82f\") " pod="openstack/openstackclient" Mar 10 08:23:58 crc kubenswrapper[4825]: I0310 08:23:58.938892 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h5nr\" (UniqueName: \"kubernetes.io/projected/6942d3f9-7a95-4c4f-9c04-49b338e4f82f-kube-api-access-2h5nr\") pod \"openstackclient\" (UID: \"6942d3f9-7a95-4c4f-9c04-49b338e4f82f\") " pod="openstack/openstackclient" Mar 10 08:23:58 crc kubenswrapper[4825]: I0310 08:23:58.938953 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6942d3f9-7a95-4c4f-9c04-49b338e4f82f-openstack-config-secret\") pod \"openstackclient\" (UID: 
\"6942d3f9-7a95-4c4f-9c04-49b338e4f82f\") " pod="openstack/openstackclient" Mar 10 08:23:58 crc kubenswrapper[4825]: I0310 08:23:58.948398 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 08:23:58 crc kubenswrapper[4825]: I0310 08:23:58.950842 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 08:23:58 crc kubenswrapper[4825]: I0310 08:23:58.958757 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-rdldg" Mar 10 08:23:58 crc kubenswrapper[4825]: I0310 08:23:58.978603 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 08:23:59 crc kubenswrapper[4825]: I0310 08:23:59.045244 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6942d3f9-7a95-4c4f-9c04-49b338e4f82f-openstack-config\") pod \"openstackclient\" (UID: \"6942d3f9-7a95-4c4f-9c04-49b338e4f82f\") " pod="openstack/openstackclient" Mar 10 08:23:59 crc kubenswrapper[4825]: I0310 08:23:59.045315 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h5nr\" (UniqueName: \"kubernetes.io/projected/6942d3f9-7a95-4c4f-9c04-49b338e4f82f-kube-api-access-2h5nr\") pod \"openstackclient\" (UID: \"6942d3f9-7a95-4c4f-9c04-49b338e4f82f\") " pod="openstack/openstackclient" Mar 10 08:23:59 crc kubenswrapper[4825]: I0310 08:23:59.045384 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lfq6\" (UniqueName: \"kubernetes.io/projected/350fa04a-653d-4ab6-95bb-4315e8de019c-kube-api-access-8lfq6\") pod \"kube-state-metrics-0\" (UID: \"350fa04a-653d-4ab6-95bb-4315e8de019c\") " pod="openstack/kube-state-metrics-0" Mar 10 08:23:59 crc kubenswrapper[4825]: I0310 08:23:59.045409 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6942d3f9-7a95-4c4f-9c04-49b338e4f82f-openstack-config-secret\") pod \"openstackclient\" (UID: \"6942d3f9-7a95-4c4f-9c04-49b338e4f82f\") " pod="openstack/openstackclient" Mar 10 08:23:59 crc kubenswrapper[4825]: I0310 08:23:59.045500 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6942d3f9-7a95-4c4f-9c04-49b338e4f82f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6942d3f9-7a95-4c4f-9c04-49b338e4f82f\") " pod="openstack/openstackclient" Mar 10 08:23:59 crc kubenswrapper[4825]: I0310 08:23:59.057847 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6942d3f9-7a95-4c4f-9c04-49b338e4f82f-openstack-config\") pod \"openstackclient\" (UID: \"6942d3f9-7a95-4c4f-9c04-49b338e4f82f\") " pod="openstack/openstackclient" Mar 10 08:23:59 crc kubenswrapper[4825]: I0310 08:23:59.061893 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6942d3f9-7a95-4c4f-9c04-49b338e4f82f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6942d3f9-7a95-4c4f-9c04-49b338e4f82f\") " pod="openstack/openstackclient" Mar 10 08:23:59 crc kubenswrapper[4825]: I0310 08:23:59.073170 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6942d3f9-7a95-4c4f-9c04-49b338e4f82f-openstack-config-secret\") pod \"openstackclient\" (UID: \"6942d3f9-7a95-4c4f-9c04-49b338e4f82f\") " pod="openstack/openstackclient" Mar 10 08:23:59 crc kubenswrapper[4825]: I0310 08:23:59.137041 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h5nr\" (UniqueName: 
\"kubernetes.io/projected/6942d3f9-7a95-4c4f-9c04-49b338e4f82f-kube-api-access-2h5nr\") pod \"openstackclient\" (UID: \"6942d3f9-7a95-4c4f-9c04-49b338e4f82f\") " pod="openstack/openstackclient" Mar 10 08:23:59 crc kubenswrapper[4825]: I0310 08:23:59.156475 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lfq6\" (UniqueName: \"kubernetes.io/projected/350fa04a-653d-4ab6-95bb-4315e8de019c-kube-api-access-8lfq6\") pod \"kube-state-metrics-0\" (UID: \"350fa04a-653d-4ab6-95bb-4315e8de019c\") " pod="openstack/kube-state-metrics-0" Mar 10 08:23:59 crc kubenswrapper[4825]: I0310 08:23:59.240943 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lfq6\" (UniqueName: \"kubernetes.io/projected/350fa04a-653d-4ab6-95bb-4315e8de019c-kube-api-access-8lfq6\") pod \"kube-state-metrics-0\" (UID: \"350fa04a-653d-4ab6-95bb-4315e8de019c\") " pod="openstack/kube-state-metrics-0" Mar 10 08:23:59 crc kubenswrapper[4825]: I0310 08:23:59.276421 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 08:23:59 crc kubenswrapper[4825]: I0310 08:23:59.429661 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.166091 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552184-rl66d"] Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.202467 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552184-rl66d"] Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.202559 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552184-rl66d" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.208015 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.208426 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.208666 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.281059 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 08:24:00 crc kubenswrapper[4825]: W0310 08:24:00.294205 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod350fa04a_653d_4ab6_95bb_4315e8de019c.slice/crio-3f09959c253ec7490a48d69fe16399449612aacdb47aaa4bbceb273568e2bf9b WatchSource:0}: Error finding container 3f09959c253ec7490a48d69fe16399449612aacdb47aaa4bbceb273568e2bf9b: Status 404 returned error can't find the container with id 3f09959c253ec7490a48d69fe16399449612aacdb47aaa4bbceb273568e2bf9b Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.351113 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.372750 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.382145 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.382356 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.382454 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.382575 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.383273 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-4h77c" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.390257 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.404656 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8gjf\" (UniqueName: \"kubernetes.io/projected/455f8f89-31ed-457d-ad40-5719de71514a-kube-api-access-l8gjf\") pod \"auto-csr-approver-29552184-rl66d\" (UID: \"455f8f89-31ed-457d-ad40-5719de71514a\") " pod="openshift-infra/auto-csr-approver-29552184-rl66d" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.502459 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"350fa04a-653d-4ab6-95bb-4315e8de019c","Type":"ContainerStarted","Data":"3f09959c253ec7490a48d69fe16399449612aacdb47aaa4bbceb273568e2bf9b"} Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.506337 4825 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/9ea0c17e-7d29-45c4-85f4-66836e8860fd-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"9ea0c17e-7d29-45c4-85f4-66836e8860fd\") " pod="openstack/alertmanager-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.506396 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9ea0c17e-7d29-45c4-85f4-66836e8860fd-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"9ea0c17e-7d29-45c4-85f4-66836e8860fd\") " pod="openstack/alertmanager-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.506418 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9ea0c17e-7d29-45c4-85f4-66836e8860fd-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"9ea0c17e-7d29-45c4-85f4-66836e8860fd\") " pod="openstack/alertmanager-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.506449 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9ea0c17e-7d29-45c4-85f4-66836e8860fd-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"9ea0c17e-7d29-45c4-85f4-66836e8860fd\") " pod="openstack/alertmanager-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.506482 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9ea0c17e-7d29-45c4-85f4-66836e8860fd-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"9ea0c17e-7d29-45c4-85f4-66836e8860fd\") " pod="openstack/alertmanager-metric-storage-0" Mar 
10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.506525 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8gjf\" (UniqueName: \"kubernetes.io/projected/455f8f89-31ed-457d-ad40-5719de71514a-kube-api-access-l8gjf\") pod \"auto-csr-approver-29552184-rl66d\" (UID: \"455f8f89-31ed-457d-ad40-5719de71514a\") " pod="openshift-infra/auto-csr-approver-29552184-rl66d" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.506542 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9ea0c17e-7d29-45c4-85f4-66836e8860fd-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"9ea0c17e-7d29-45c4-85f4-66836e8860fd\") " pod="openstack/alertmanager-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.506606 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cn4z2\" (UniqueName: \"kubernetes.io/projected/9ea0c17e-7d29-45c4-85f4-66836e8860fd-kube-api-access-cn4z2\") pod \"alertmanager-metric-storage-0\" (UID: \"9ea0c17e-7d29-45c4-85f4-66836e8860fd\") " pod="openstack/alertmanager-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.541738 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8gjf\" (UniqueName: \"kubernetes.io/projected/455f8f89-31ed-457d-ad40-5719de71514a-kube-api-access-l8gjf\") pod \"auto-csr-approver-29552184-rl66d\" (UID: \"455f8f89-31ed-457d-ad40-5719de71514a\") " pod="openshift-infra/auto-csr-approver-29552184-rl66d" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.550874 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552184-rl66d" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.603922 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.608918 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9ea0c17e-7d29-45c4-85f4-66836e8860fd-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"9ea0c17e-7d29-45c4-85f4-66836e8860fd\") " pod="openstack/alertmanager-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.608986 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9ea0c17e-7d29-45c4-85f4-66836e8860fd-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"9ea0c17e-7d29-45c4-85f4-66836e8860fd\") " pod="openstack/alertmanager-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.609025 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9ea0c17e-7d29-45c4-85f4-66836e8860fd-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"9ea0c17e-7d29-45c4-85f4-66836e8860fd\") " pod="openstack/alertmanager-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.609074 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9ea0c17e-7d29-45c4-85f4-66836e8860fd-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"9ea0c17e-7d29-45c4-85f4-66836e8860fd\") " pod="openstack/alertmanager-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.609147 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/9ea0c17e-7d29-45c4-85f4-66836e8860fd-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"9ea0c17e-7d29-45c4-85f4-66836e8860fd\") " pod="openstack/alertmanager-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.609243 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cn4z2\" (UniqueName: \"kubernetes.io/projected/9ea0c17e-7d29-45c4-85f4-66836e8860fd-kube-api-access-cn4z2\") pod \"alertmanager-metric-storage-0\" (UID: \"9ea0c17e-7d29-45c4-85f4-66836e8860fd\") " pod="openstack/alertmanager-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.609354 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/9ea0c17e-7d29-45c4-85f4-66836e8860fd-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"9ea0c17e-7d29-45c4-85f4-66836e8860fd\") " pod="openstack/alertmanager-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.610172 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/9ea0c17e-7d29-45c4-85f4-66836e8860fd-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"9ea0c17e-7d29-45c4-85f4-66836e8860fd\") " pod="openstack/alertmanager-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.621796 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9ea0c17e-7d29-45c4-85f4-66836e8860fd-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"9ea0c17e-7d29-45c4-85f4-66836e8860fd\") " pod="openstack/alertmanager-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.623107 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/secret/9ea0c17e-7d29-45c4-85f4-66836e8860fd-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"9ea0c17e-7d29-45c4-85f4-66836e8860fd\") " pod="openstack/alertmanager-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.628686 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9ea0c17e-7d29-45c4-85f4-66836e8860fd-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"9ea0c17e-7d29-45c4-85f4-66836e8860fd\") " pod="openstack/alertmanager-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.636049 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9ea0c17e-7d29-45c4-85f4-66836e8860fd-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"9ea0c17e-7d29-45c4-85f4-66836e8860fd\") " pod="openstack/alertmanager-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.647673 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9ea0c17e-7d29-45c4-85f4-66836e8860fd-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"9ea0c17e-7d29-45c4-85f4-66836e8860fd\") " pod="openstack/alertmanager-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.652485 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cn4z2\" (UniqueName: \"kubernetes.io/projected/9ea0c17e-7d29-45c4-85f4-66836e8860fd-kube-api-access-cn4z2\") pod \"alertmanager-metric-storage-0\" (UID: \"9ea0c17e-7d29-45c4-85f4-66836e8860fd\") " pod="openstack/alertmanager-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.669282 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.672270 4825 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.676217 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.676551 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.676780 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.676951 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.677109 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.677305 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-fsmn2" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.677471 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.677585 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.704390 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.745670 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.813942 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/504bd427-27b5-4b10-99cb-285d970bf02e-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"504bd427-27b5-4b10-99cb-285d970bf02e\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.814241 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/504bd427-27b5-4b10-99cb-285d970bf02e-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"504bd427-27b5-4b10-99cb-285d970bf02e\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.814267 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd9vr\" (UniqueName: \"kubernetes.io/projected/504bd427-27b5-4b10-99cb-285d970bf02e-kube-api-access-sd9vr\") pod \"prometheus-metric-storage-0\" (UID: \"504bd427-27b5-4b10-99cb-285d970bf02e\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.814312 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/504bd427-27b5-4b10-99cb-285d970bf02e-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"504bd427-27b5-4b10-99cb-285d970bf02e\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.814349 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4277677a-0949-4a04-b057-6e93d3b54e65\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4277677a-0949-4a04-b057-6e93d3b54e65\") pod \"prometheus-metric-storage-0\" (UID: \"504bd427-27b5-4b10-99cb-285d970bf02e\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.814366 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/504bd427-27b5-4b10-99cb-285d970bf02e-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"504bd427-27b5-4b10-99cb-285d970bf02e\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.814406 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/504bd427-27b5-4b10-99cb-285d970bf02e-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"504bd427-27b5-4b10-99cb-285d970bf02e\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.814459 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/504bd427-27b5-4b10-99cb-285d970bf02e-config\") pod \"prometheus-metric-storage-0\" (UID: \"504bd427-27b5-4b10-99cb-285d970bf02e\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.814485 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/504bd427-27b5-4b10-99cb-285d970bf02e-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"504bd427-27b5-4b10-99cb-285d970bf02e\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.814505 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/504bd427-27b5-4b10-99cb-285d970bf02e-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"504bd427-27b5-4b10-99cb-285d970bf02e\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.916159 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4277677a-0949-4a04-b057-6e93d3b54e65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4277677a-0949-4a04-b057-6e93d3b54e65\") pod \"prometheus-metric-storage-0\" (UID: \"504bd427-27b5-4b10-99cb-285d970bf02e\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.916201 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/504bd427-27b5-4b10-99cb-285d970bf02e-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"504bd427-27b5-4b10-99cb-285d970bf02e\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.916249 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/504bd427-27b5-4b10-99cb-285d970bf02e-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"504bd427-27b5-4b10-99cb-285d970bf02e\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.916305 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/504bd427-27b5-4b10-99cb-285d970bf02e-config\") pod \"prometheus-metric-storage-0\" (UID: \"504bd427-27b5-4b10-99cb-285d970bf02e\") " 
pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.916336 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/504bd427-27b5-4b10-99cb-285d970bf02e-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"504bd427-27b5-4b10-99cb-285d970bf02e\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.916372 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/504bd427-27b5-4b10-99cb-285d970bf02e-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"504bd427-27b5-4b10-99cb-285d970bf02e\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.916401 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/504bd427-27b5-4b10-99cb-285d970bf02e-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"504bd427-27b5-4b10-99cb-285d970bf02e\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.916434 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/504bd427-27b5-4b10-99cb-285d970bf02e-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"504bd427-27b5-4b10-99cb-285d970bf02e\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.916458 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd9vr\" (UniqueName: \"kubernetes.io/projected/504bd427-27b5-4b10-99cb-285d970bf02e-kube-api-access-sd9vr\") pod \"prometheus-metric-storage-0\" (UID: 
\"504bd427-27b5-4b10-99cb-285d970bf02e\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.916491 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/504bd427-27b5-4b10-99cb-285d970bf02e-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"504bd427-27b5-4b10-99cb-285d970bf02e\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.917178 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/504bd427-27b5-4b10-99cb-285d970bf02e-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"504bd427-27b5-4b10-99cb-285d970bf02e\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.917790 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/504bd427-27b5-4b10-99cb-285d970bf02e-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"504bd427-27b5-4b10-99cb-285d970bf02e\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.919584 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/504bd427-27b5-4b10-99cb-285d970bf02e-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"504bd427-27b5-4b10-99cb-285d970bf02e\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.922533 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/504bd427-27b5-4b10-99cb-285d970bf02e-config-out\") pod 
\"prometheus-metric-storage-0\" (UID: \"504bd427-27b5-4b10-99cb-285d970bf02e\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.934147 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/504bd427-27b5-4b10-99cb-285d970bf02e-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"504bd427-27b5-4b10-99cb-285d970bf02e\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.934129 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/504bd427-27b5-4b10-99cb-285d970bf02e-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"504bd427-27b5-4b10-99cb-285d970bf02e\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.934473 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/504bd427-27b5-4b10-99cb-285d970bf02e-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"504bd427-27b5-4b10-99cb-285d970bf02e\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.934568 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/504bd427-27b5-4b10-99cb-285d970bf02e-config\") pod \"prometheus-metric-storage-0\" (UID: \"504bd427-27b5-4b10-99cb-285d970bf02e\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.944230 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd9vr\" (UniqueName: \"kubernetes.io/projected/504bd427-27b5-4b10-99cb-285d970bf02e-kube-api-access-sd9vr\") pod \"prometheus-metric-storage-0\" (UID: \"504bd427-27b5-4b10-99cb-285d970bf02e\") " 
pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.946581 4825 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 10 08:24:00 crc kubenswrapper[4825]: I0310 08:24:00.946631 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4277677a-0949-4a04-b057-6e93d3b54e65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4277677a-0949-4a04-b057-6e93d3b54e65\") pod \"prometheus-metric-storage-0\" (UID: \"504bd427-27b5-4b10-99cb-285d970bf02e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/837f8e712531f2ad959ef263db73d503df17159a57bcfd27bb8598f273a4236d/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:01 crc kubenswrapper[4825]: I0310 08:24:01.054593 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4277677a-0949-4a04-b057-6e93d3b54e65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4277677a-0949-4a04-b057-6e93d3b54e65\") pod \"prometheus-metric-storage-0\" (UID: \"504bd427-27b5-4b10-99cb-285d970bf02e\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:01 crc kubenswrapper[4825]: I0310 08:24:01.189487 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 10 08:24:01 crc kubenswrapper[4825]: I0310 08:24:01.191877 4825 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de" podUID="6942d3f9-7a95-4c4f-9c04-49b338e4f82f" Mar 10 08:24:01 crc kubenswrapper[4825]: I0310 08:24:01.331638 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de-combined-ca-bundle\") pod \"a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de\" (UID: \"a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de\") " Mar 10 08:24:01 crc kubenswrapper[4825]: I0310 08:24:01.332707 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de-openstack-config-secret\") pod \"a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de\" (UID: \"a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de\") " Mar 10 08:24:01 crc kubenswrapper[4825]: I0310 08:24:01.333046 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8s8q2\" (UniqueName: \"kubernetes.io/projected/a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de-kube-api-access-8s8q2\") pod \"a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de\" (UID: \"a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de\") " Mar 10 08:24:01 crc kubenswrapper[4825]: I0310 08:24:01.333258 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de-openstack-config\") pod \"a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de\" (UID: \"a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de\") " Mar 10 08:24:01 crc kubenswrapper[4825]: I0310 08:24:01.332988 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:01 crc kubenswrapper[4825]: I0310 08:24:01.371396 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de-kube-api-access-8s8q2" (OuterVolumeSpecName: "kube-api-access-8s8q2") pod "a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de" (UID: "a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de"). InnerVolumeSpecName "kube-api-access-8s8q2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:24:01 crc kubenswrapper[4825]: I0310 08:24:01.374899 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552184-rl66d"] Mar 10 08:24:01 crc kubenswrapper[4825]: I0310 08:24:01.386701 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de" (UID: "a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:24:01 crc kubenswrapper[4825]: I0310 08:24:01.406457 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de" (UID: "a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:24:01 crc kubenswrapper[4825]: I0310 08:24:01.435370 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:24:01 crc kubenswrapper[4825]: I0310 08:24:01.435705 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8s8q2\" (UniqueName: \"kubernetes.io/projected/a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de-kube-api-access-8s8q2\") on node \"crc\" DevicePath \"\"" Mar 10 08:24:01 crc kubenswrapper[4825]: I0310 08:24:01.435795 4825 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 10 08:24:01 crc kubenswrapper[4825]: I0310 08:24:01.455347 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de" (UID: "a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:24:01 crc kubenswrapper[4825]: I0310 08:24:01.518404 4825 generic.go:334] "Generic (PLEG): container finished" podID="a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de" containerID="e7ff1f930e87f5d9e3cabeec960ef7b6904a7905154d6c779781b767eab69638" exitCode=137 Mar 10 08:24:01 crc kubenswrapper[4825]: I0310 08:24:01.518505 4825 scope.go:117] "RemoveContainer" containerID="e7ff1f930e87f5d9e3cabeec960ef7b6904a7905154d6c779781b767eab69638" Mar 10 08:24:01 crc kubenswrapper[4825]: I0310 08:24:01.518658 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 10 08:24:01 crc kubenswrapper[4825]: I0310 08:24:01.522889 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"350fa04a-653d-4ab6-95bb-4315e8de019c","Type":"ContainerStarted","Data":"122e829c5b1b27499da2b33d659bd3bb16caace45fff2c715ff82c2c7b4d980d"} Mar 10 08:24:01 crc kubenswrapper[4825]: I0310 08:24:01.523032 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 10 08:24:01 crc kubenswrapper[4825]: I0310 08:24:01.523741 4825 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de" podUID="6942d3f9-7a95-4c4f-9c04-49b338e4f82f" Mar 10 08:24:01 crc kubenswrapper[4825]: I0310 08:24:01.525755 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552184-rl66d" event={"ID":"455f8f89-31ed-457d-ad40-5719de71514a","Type":"ContainerStarted","Data":"599ef9ed2fdcb156081aa6c8695582d324b2b675a1a298b3b164c5b4fbde7214"} Mar 10 08:24:01 crc kubenswrapper[4825]: I0310 08:24:01.527468 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"6942d3f9-7a95-4c4f-9c04-49b338e4f82f","Type":"ContainerStarted","Data":"30a71937c7e64786c07e2d0ce2025a14d75797e4a94e7ca8d63ac313cb19ff57"} Mar 10 08:24:01 crc kubenswrapper[4825]: I0310 08:24:01.527513 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"6942d3f9-7a95-4c4f-9c04-49b338e4f82f","Type":"ContainerStarted","Data":"7aad89d60511ccb7cdfd2c17719a950e1bacd77c5d2dad1e2b1d1f447f1471be"} Mar 10 08:24:01 crc kubenswrapper[4825]: I0310 08:24:01.541771 4825 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de-openstack-config-secret\") on node 
\"crc\" DevicePath \"\"" Mar 10 08:24:01 crc kubenswrapper[4825]: I0310 08:24:01.552042 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.975685393 podStartE2EDuration="3.552021554s" podCreationTimestamp="2026-03-10 08:23:58 +0000 UTC" firstStartedPulling="2026-03-10 08:24:00.32500861 +0000 UTC m=+5993.354789235" lastFinishedPulling="2026-03-10 08:24:00.901344791 +0000 UTC m=+5993.931125396" observedRunningTime="2026-03-10 08:24:01.539896736 +0000 UTC m=+5994.569677361" watchObservedRunningTime="2026-03-10 08:24:01.552021554 +0000 UTC m=+5994.581802169" Mar 10 08:24:01 crc kubenswrapper[4825]: I0310 08:24:01.563244 4825 scope.go:117] "RemoveContainer" containerID="e7ff1f930e87f5d9e3cabeec960ef7b6904a7905154d6c779781b767eab69638" Mar 10 08:24:01 crc kubenswrapper[4825]: E0310 08:24:01.564299 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7ff1f930e87f5d9e3cabeec960ef7b6904a7905154d6c779781b767eab69638\": container with ID starting with e7ff1f930e87f5d9e3cabeec960ef7b6904a7905154d6c779781b767eab69638 not found: ID does not exist" containerID="e7ff1f930e87f5d9e3cabeec960ef7b6904a7905154d6c779781b767eab69638" Mar 10 08:24:01 crc kubenswrapper[4825]: I0310 08:24:01.564333 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7ff1f930e87f5d9e3cabeec960ef7b6904a7905154d6c779781b767eab69638"} err="failed to get container status \"e7ff1f930e87f5d9e3cabeec960ef7b6904a7905154d6c779781b767eab69638\": rpc error: code = NotFound desc = could not find container \"e7ff1f930e87f5d9e3cabeec960ef7b6904a7905154d6c779781b767eab69638\": container with ID starting with e7ff1f930e87f5d9e3cabeec960ef7b6904a7905154d6c779781b767eab69638 not found: ID does not exist" Mar 10 08:24:01 crc kubenswrapper[4825]: I0310 08:24:01.577540 4825 status_manager.go:861] "Pod was deleted and then 
recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de" podUID="6942d3f9-7a95-4c4f-9c04-49b338e4f82f" Mar 10 08:24:01 crc kubenswrapper[4825]: I0310 08:24:01.591294 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Mar 10 08:24:01 crc kubenswrapper[4825]: W0310 08:24:01.591837 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ea0c17e_7d29_45c4_85f4_66836e8860fd.slice/crio-97d5df12f4314140304344de921ac3fb95091d6b44fbce66999ba43dba3eaf85 WatchSource:0}: Error finding container 97d5df12f4314140304344de921ac3fb95091d6b44fbce66999ba43dba3eaf85: Status 404 returned error can't find the container with id 97d5df12f4314140304344de921ac3fb95091d6b44fbce66999ba43dba3eaf85 Mar 10 08:24:01 crc kubenswrapper[4825]: I0310 08:24:01.600376 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.600356133 podStartE2EDuration="3.600356133s" podCreationTimestamp="2026-03-10 08:23:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:24:01.574995587 +0000 UTC m=+5994.604776202" watchObservedRunningTime="2026-03-10 08:24:01.600356133 +0000 UTC m=+5994.630136748" Mar 10 08:24:01 crc kubenswrapper[4825]: I0310 08:24:01.908919 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 10 08:24:02 crc kubenswrapper[4825]: I0310 08:24:02.550085 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"504bd427-27b5-4b10-99cb-285d970bf02e","Type":"ContainerStarted","Data":"a9917bb6daa9b6905186750ccc4e9ec5cdfe017bb2d4db9e288d8d0ffcc22efb"} Mar 10 08:24:02 crc kubenswrapper[4825]: I0310 08:24:02.552602 4825 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"9ea0c17e-7d29-45c4-85f4-66836e8860fd","Type":"ContainerStarted","Data":"97d5df12f4314140304344de921ac3fb95091d6b44fbce66999ba43dba3eaf85"} Mar 10 08:24:03 crc kubenswrapper[4825]: I0310 08:24:03.249359 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de" path="/var/lib/kubelet/pods/a242ba70-a3b1-4f9d-8bf3-0673e8a7c2de/volumes" Mar 10 08:24:03 crc kubenswrapper[4825]: I0310 08:24:03.565855 4825 generic.go:334] "Generic (PLEG): container finished" podID="455f8f89-31ed-457d-ad40-5719de71514a" containerID="9c66832fb69ed2b1d1cd3a56d2e3225e31beb613c4c6545f10fcc37a148ede04" exitCode=0 Mar 10 08:24:03 crc kubenswrapper[4825]: I0310 08:24:03.567043 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552184-rl66d" event={"ID":"455f8f89-31ed-457d-ad40-5719de71514a","Type":"ContainerDied","Data":"9c66832fb69ed2b1d1cd3a56d2e3225e31beb613c4c6545f10fcc37a148ede04"} Mar 10 08:24:04 crc kubenswrapper[4825]: I0310 08:24:04.922357 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552184-rl66d" Mar 10 08:24:05 crc kubenswrapper[4825]: I0310 08:24:05.021549 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8gjf\" (UniqueName: \"kubernetes.io/projected/455f8f89-31ed-457d-ad40-5719de71514a-kube-api-access-l8gjf\") pod \"455f8f89-31ed-457d-ad40-5719de71514a\" (UID: \"455f8f89-31ed-457d-ad40-5719de71514a\") " Mar 10 08:24:05 crc kubenswrapper[4825]: I0310 08:24:05.025644 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/455f8f89-31ed-457d-ad40-5719de71514a-kube-api-access-l8gjf" (OuterVolumeSpecName: "kube-api-access-l8gjf") pod "455f8f89-31ed-457d-ad40-5719de71514a" (UID: "455f8f89-31ed-457d-ad40-5719de71514a"). 
InnerVolumeSpecName "kube-api-access-l8gjf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:24:05 crc kubenswrapper[4825]: I0310 08:24:05.125558 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8gjf\" (UniqueName: \"kubernetes.io/projected/455f8f89-31ed-457d-ad40-5719de71514a-kube-api-access-l8gjf\") on node \"crc\" DevicePath \"\"" Mar 10 08:24:05 crc kubenswrapper[4825]: I0310 08:24:05.595643 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552184-rl66d" event={"ID":"455f8f89-31ed-457d-ad40-5719de71514a","Type":"ContainerDied","Data":"599ef9ed2fdcb156081aa6c8695582d324b2b675a1a298b3b164c5b4fbde7214"} Mar 10 08:24:05 crc kubenswrapper[4825]: I0310 08:24:05.595688 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="599ef9ed2fdcb156081aa6c8695582d324b2b675a1a298b3b164c5b4fbde7214" Mar 10 08:24:05 crc kubenswrapper[4825]: I0310 08:24:05.595686 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552184-rl66d" Mar 10 08:24:05 crc kubenswrapper[4825]: I0310 08:24:05.980698 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552178-zgtl6"] Mar 10 08:24:05 crc kubenswrapper[4825]: I0310 08:24:05.991476 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552178-zgtl6"] Mar 10 08:24:07 crc kubenswrapper[4825]: I0310 08:24:07.247490 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b94e20e-d08c-4e38-989b-715a5b1d4365" path="/var/lib/kubelet/pods/5b94e20e-d08c-4e38-989b-715a5b1d4365/volumes" Mar 10 08:24:07 crc kubenswrapper[4825]: I0310 08:24:07.616444 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"504bd427-27b5-4b10-99cb-285d970bf02e","Type":"ContainerStarted","Data":"ac5da213dba0b9785a5ead9c0f3ea7084f81a54f664eb9952011ef5f7024b42d"} Mar 10 08:24:07 crc kubenswrapper[4825]: I0310 08:24:07.619675 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"9ea0c17e-7d29-45c4-85f4-66836e8860fd","Type":"ContainerStarted","Data":"6f989b46d30a4aaa6e4505209a018ffc803db03b9db51d6defa90c0c9748ac2b"} Mar 10 08:24:09 crc kubenswrapper[4825]: I0310 08:24:09.281670 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 10 08:24:14 crc kubenswrapper[4825]: I0310 08:24:14.691119 4825 generic.go:334] "Generic (PLEG): container finished" podID="504bd427-27b5-4b10-99cb-285d970bf02e" containerID="ac5da213dba0b9785a5ead9c0f3ea7084f81a54f664eb9952011ef5f7024b42d" exitCode=0 Mar 10 08:24:14 crc kubenswrapper[4825]: I0310 08:24:14.691211 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"504bd427-27b5-4b10-99cb-285d970bf02e","Type":"ContainerDied","Data":"ac5da213dba0b9785a5ead9c0f3ea7084f81a54f664eb9952011ef5f7024b42d"} Mar 10 08:24:14 crc kubenswrapper[4825]: I0310 08:24:14.693568 4825 generic.go:334] "Generic (PLEG): container finished" podID="9ea0c17e-7d29-45c4-85f4-66836e8860fd" containerID="6f989b46d30a4aaa6e4505209a018ffc803db03b9db51d6defa90c0c9748ac2b" exitCode=0 Mar 10 08:24:14 crc kubenswrapper[4825]: I0310 08:24:14.693612 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"9ea0c17e-7d29-45c4-85f4-66836e8860fd","Type":"ContainerDied","Data":"6f989b46d30a4aaa6e4505209a018ffc803db03b9db51d6defa90c0c9748ac2b"} Mar 10 08:24:17 crc kubenswrapper[4825]: I0310 08:24:17.727747 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"9ea0c17e-7d29-45c4-85f4-66836e8860fd","Type":"ContainerStarted","Data":"0061482f64b4640b77c7a0b2f71a546cbd18901a7755f479d1b4489a2134670e"} Mar 10 08:24:20 crc kubenswrapper[4825]: I0310 08:24:20.034649 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-14b7-account-create-update-lmmrz"] Mar 10 08:24:20 crc kubenswrapper[4825]: I0310 08:24:20.054504 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-dw5mh"] Mar 10 08:24:20 crc kubenswrapper[4825]: I0310 08:24:20.074109 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-dw5mh"] Mar 10 08:24:20 crc kubenswrapper[4825]: I0310 08:24:20.086025 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-14b7-account-create-update-lmmrz"] Mar 10 08:24:21 crc kubenswrapper[4825]: I0310 08:24:21.247777 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d16dea2-c672-4fc5-932e-734973e299dd" path="/var/lib/kubelet/pods/0d16dea2-c672-4fc5-932e-734973e299dd/volumes" Mar 10 08:24:21 crc kubenswrapper[4825]: I0310 
08:24:21.248650 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="986cc5c1-8756-4ea7-ae05-97ce9a4182b1" path="/var/lib/kubelet/pods/986cc5c1-8756-4ea7-ae05-97ce9a4182b1/volumes" Mar 10 08:24:21 crc kubenswrapper[4825]: I0310 08:24:21.766375 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"9ea0c17e-7d29-45c4-85f4-66836e8860fd","Type":"ContainerStarted","Data":"207d504ac238a29131ed01e10ee410b9380fae3331616cb0d0f4b98938e81a5a"} Mar 10 08:24:22 crc kubenswrapper[4825]: I0310 08:24:22.776218 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"504bd427-27b5-4b10-99cb-285d970bf02e","Type":"ContainerStarted","Data":"51adc52051557da29ea0059305b1fabbdf2dd4b898fb143e341eb9eeb701d7bf"} Mar 10 08:24:22 crc kubenswrapper[4825]: I0310 08:24:22.777090 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Mar 10 08:24:22 crc kubenswrapper[4825]: I0310 08:24:22.779173 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Mar 10 08:24:22 crc kubenswrapper[4825]: I0310 08:24:22.814383 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=7.6076771690000005 podStartE2EDuration="22.814364782s" podCreationTimestamp="2026-03-10 08:24:00 +0000 UTC" firstStartedPulling="2026-03-10 08:24:01.594435118 +0000 UTC m=+5994.624215723" lastFinishedPulling="2026-03-10 08:24:16.801122721 +0000 UTC m=+6009.830903336" observedRunningTime="2026-03-10 08:24:22.799083961 +0000 UTC m=+6015.828864586" watchObservedRunningTime="2026-03-10 08:24:22.814364782 +0000 UTC m=+6015.844145397" Mar 10 08:24:25 crc kubenswrapper[4825]: I0310 08:24:25.805538 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"504bd427-27b5-4b10-99cb-285d970bf02e","Type":"ContainerStarted","Data":"30645e8fd4a42db208e6a279c8f8487dd402eb2a9f5bc17c540a72805ff74ce9"} Mar 10 08:24:29 crc kubenswrapper[4825]: I0310 08:24:29.842638 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"504bd427-27b5-4b10-99cb-285d970bf02e","Type":"ContainerStarted","Data":"6a5b561f9fd0cbac987fb383d3048754b2b5c5ee65fa72dc6dbda5c54688ad36"} Mar 10 08:24:31 crc kubenswrapper[4825]: I0310 08:24:31.333933 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:31 crc kubenswrapper[4825]: I0310 08:24:31.333997 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:31 crc kubenswrapper[4825]: I0310 08:24:31.336963 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:31 crc kubenswrapper[4825]: I0310 08:24:31.364286 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=4.8299753 podStartE2EDuration="32.36426788s" podCreationTimestamp="2026-03-10 08:23:59 +0000 UTC" firstStartedPulling="2026-03-10 08:24:01.926428194 +0000 UTC m=+5994.956208809" lastFinishedPulling="2026-03-10 08:24:29.460720774 +0000 UTC m=+6022.490501389" observedRunningTime="2026-03-10 08:24:29.86927993 +0000 UTC m=+6022.899060545" watchObservedRunningTime="2026-03-10 08:24:31.36426788 +0000 UTC m=+6024.394048495" Mar 10 08:24:31 crc kubenswrapper[4825]: I0310 08:24:31.863689 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:32 crc kubenswrapper[4825]: I0310 08:24:32.821041 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 08:24:32 crc kubenswrapper[4825]: E0310 
08:24:32.821550 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="455f8f89-31ed-457d-ad40-5719de71514a" containerName="oc" Mar 10 08:24:32 crc kubenswrapper[4825]: I0310 08:24:32.821566 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="455f8f89-31ed-457d-ad40-5719de71514a" containerName="oc" Mar 10 08:24:32 crc kubenswrapper[4825]: I0310 08:24:32.821800 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="455f8f89-31ed-457d-ad40-5719de71514a" containerName="oc" Mar 10 08:24:32 crc kubenswrapper[4825]: I0310 08:24:32.824029 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 08:24:32 crc kubenswrapper[4825]: I0310 08:24:32.832180 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 08:24:32 crc kubenswrapper[4825]: I0310 08:24:32.837701 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 08:24:32 crc kubenswrapper[4825]: I0310 08:24:32.843572 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 08:24:32 crc kubenswrapper[4825]: I0310 08:24:32.972616 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3fb738d-ab2f-4e60-9089-0e00c094843f-log-httpd\") pod \"ceilometer-0\" (UID: \"e3fb738d-ab2f-4e60-9089-0e00c094843f\") " pod="openstack/ceilometer-0" Mar 10 08:24:32 crc kubenswrapper[4825]: I0310 08:24:32.972696 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3fb738d-ab2f-4e60-9089-0e00c094843f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e3fb738d-ab2f-4e60-9089-0e00c094843f\") " pod="openstack/ceilometer-0" Mar 10 08:24:32 crc kubenswrapper[4825]: I0310 08:24:32.972732 4825 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e3fb738d-ab2f-4e60-9089-0e00c094843f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e3fb738d-ab2f-4e60-9089-0e00c094843f\") " pod="openstack/ceilometer-0" Mar 10 08:24:32 crc kubenswrapper[4825]: I0310 08:24:32.972778 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3fb738d-ab2f-4e60-9089-0e00c094843f-run-httpd\") pod \"ceilometer-0\" (UID: \"e3fb738d-ab2f-4e60-9089-0e00c094843f\") " pod="openstack/ceilometer-0" Mar 10 08:24:32 crc kubenswrapper[4825]: I0310 08:24:32.972836 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3fb738d-ab2f-4e60-9089-0e00c094843f-scripts\") pod \"ceilometer-0\" (UID: \"e3fb738d-ab2f-4e60-9089-0e00c094843f\") " pod="openstack/ceilometer-0" Mar 10 08:24:32 crc kubenswrapper[4825]: I0310 08:24:32.972872 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8qz6\" (UniqueName: \"kubernetes.io/projected/e3fb738d-ab2f-4e60-9089-0e00c094843f-kube-api-access-p8qz6\") pod \"ceilometer-0\" (UID: \"e3fb738d-ab2f-4e60-9089-0e00c094843f\") " pod="openstack/ceilometer-0" Mar 10 08:24:32 crc kubenswrapper[4825]: I0310 08:24:32.972909 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3fb738d-ab2f-4e60-9089-0e00c094843f-config-data\") pod \"ceilometer-0\" (UID: \"e3fb738d-ab2f-4e60-9089-0e00c094843f\") " pod="openstack/ceilometer-0" Mar 10 08:24:33 crc kubenswrapper[4825]: I0310 08:24:33.075040 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/e3fb738d-ab2f-4e60-9089-0e00c094843f-log-httpd\") pod \"ceilometer-0\" (UID: \"e3fb738d-ab2f-4e60-9089-0e00c094843f\") " pod="openstack/ceilometer-0" Mar 10 08:24:33 crc kubenswrapper[4825]: I0310 08:24:33.075177 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3fb738d-ab2f-4e60-9089-0e00c094843f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e3fb738d-ab2f-4e60-9089-0e00c094843f\") " pod="openstack/ceilometer-0" Mar 10 08:24:33 crc kubenswrapper[4825]: I0310 08:24:33.075334 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e3fb738d-ab2f-4e60-9089-0e00c094843f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e3fb738d-ab2f-4e60-9089-0e00c094843f\") " pod="openstack/ceilometer-0" Mar 10 08:24:33 crc kubenswrapper[4825]: I0310 08:24:33.076021 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3fb738d-ab2f-4e60-9089-0e00c094843f-log-httpd\") pod \"ceilometer-0\" (UID: \"e3fb738d-ab2f-4e60-9089-0e00c094843f\") " pod="openstack/ceilometer-0" Mar 10 08:24:33 crc kubenswrapper[4825]: I0310 08:24:33.076087 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3fb738d-ab2f-4e60-9089-0e00c094843f-run-httpd\") pod \"ceilometer-0\" (UID: \"e3fb738d-ab2f-4e60-9089-0e00c094843f\") " pod="openstack/ceilometer-0" Mar 10 08:24:33 crc kubenswrapper[4825]: I0310 08:24:33.076247 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3fb738d-ab2f-4e60-9089-0e00c094843f-scripts\") pod \"ceilometer-0\" (UID: \"e3fb738d-ab2f-4e60-9089-0e00c094843f\") " pod="openstack/ceilometer-0" Mar 10 08:24:33 crc kubenswrapper[4825]: I0310 08:24:33.076364 4825 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8qz6\" (UniqueName: \"kubernetes.io/projected/e3fb738d-ab2f-4e60-9089-0e00c094843f-kube-api-access-p8qz6\") pod \"ceilometer-0\" (UID: \"e3fb738d-ab2f-4e60-9089-0e00c094843f\") " pod="openstack/ceilometer-0" Mar 10 08:24:33 crc kubenswrapper[4825]: I0310 08:24:33.076430 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3fb738d-ab2f-4e60-9089-0e00c094843f-config-data\") pod \"ceilometer-0\" (UID: \"e3fb738d-ab2f-4e60-9089-0e00c094843f\") " pod="openstack/ceilometer-0" Mar 10 08:24:33 crc kubenswrapper[4825]: I0310 08:24:33.076489 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3fb738d-ab2f-4e60-9089-0e00c094843f-run-httpd\") pod \"ceilometer-0\" (UID: \"e3fb738d-ab2f-4e60-9089-0e00c094843f\") " pod="openstack/ceilometer-0" Mar 10 08:24:33 crc kubenswrapper[4825]: I0310 08:24:33.082802 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3fb738d-ab2f-4e60-9089-0e00c094843f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e3fb738d-ab2f-4e60-9089-0e00c094843f\") " pod="openstack/ceilometer-0" Mar 10 08:24:33 crc kubenswrapper[4825]: I0310 08:24:33.082940 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e3fb738d-ab2f-4e60-9089-0e00c094843f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e3fb738d-ab2f-4e60-9089-0e00c094843f\") " pod="openstack/ceilometer-0" Mar 10 08:24:33 crc kubenswrapper[4825]: I0310 08:24:33.083727 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3fb738d-ab2f-4e60-9089-0e00c094843f-scripts\") pod \"ceilometer-0\" (UID: \"e3fb738d-ab2f-4e60-9089-0e00c094843f\") " 
pod="openstack/ceilometer-0" Mar 10 08:24:33 crc kubenswrapper[4825]: I0310 08:24:33.085288 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3fb738d-ab2f-4e60-9089-0e00c094843f-config-data\") pod \"ceilometer-0\" (UID: \"e3fb738d-ab2f-4e60-9089-0e00c094843f\") " pod="openstack/ceilometer-0" Mar 10 08:24:33 crc kubenswrapper[4825]: I0310 08:24:33.102394 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8qz6\" (UniqueName: \"kubernetes.io/projected/e3fb738d-ab2f-4e60-9089-0e00c094843f-kube-api-access-p8qz6\") pod \"ceilometer-0\" (UID: \"e3fb738d-ab2f-4e60-9089-0e00c094843f\") " pod="openstack/ceilometer-0" Mar 10 08:24:33 crc kubenswrapper[4825]: I0310 08:24:33.156427 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 08:24:33 crc kubenswrapper[4825]: I0310 08:24:33.320340 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 10 08:24:33 crc kubenswrapper[4825]: I0310 08:24:33.320983 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="6942d3f9-7a95-4c4f-9c04-49b338e4f82f" containerName="openstackclient" containerID="cri-o://30a71937c7e64786c07e2d0ce2025a14d75797e4a94e7ca8d63ac313cb19ff57" gracePeriod=2 Mar 10 08:24:33 crc kubenswrapper[4825]: I0310 08:24:33.335328 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 10 08:24:33 crc kubenswrapper[4825]: I0310 08:24:33.352728 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 10 08:24:33 crc kubenswrapper[4825]: E0310 08:24:33.353219 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6942d3f9-7a95-4c4f-9c04-49b338e4f82f" containerName="openstackclient" Mar 10 08:24:33 crc kubenswrapper[4825]: I0310 08:24:33.353234 4825 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="6942d3f9-7a95-4c4f-9c04-49b338e4f82f" containerName="openstackclient" Mar 10 08:24:33 crc kubenswrapper[4825]: I0310 08:24:33.353438 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="6942d3f9-7a95-4c4f-9c04-49b338e4f82f" containerName="openstackclient" Mar 10 08:24:33 crc kubenswrapper[4825]: I0310 08:24:33.354198 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 10 08:24:33 crc kubenswrapper[4825]: I0310 08:24:33.358705 4825 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="6942d3f9-7a95-4c4f-9c04-49b338e4f82f" podUID="4a7d5fd4-1799-4d5d-9016-4ee3de9ed205" Mar 10 08:24:33 crc kubenswrapper[4825]: I0310 08:24:33.364447 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 10 08:24:33 crc kubenswrapper[4825]: I0310 08:24:33.486219 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4a7d5fd4-1799-4d5d-9016-4ee3de9ed205-openstack-config-secret\") pod \"openstackclient\" (UID: \"4a7d5fd4-1799-4d5d-9016-4ee3de9ed205\") " pod="openstack/openstackclient" Mar 10 08:24:33 crc kubenswrapper[4825]: I0310 08:24:33.486290 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a7d5fd4-1799-4d5d-9016-4ee3de9ed205-combined-ca-bundle\") pod \"openstackclient\" (UID: \"4a7d5fd4-1799-4d5d-9016-4ee3de9ed205\") " pod="openstack/openstackclient" Mar 10 08:24:33 crc kubenswrapper[4825]: I0310 08:24:33.486340 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4a7d5fd4-1799-4d5d-9016-4ee3de9ed205-openstack-config\") pod \"openstackclient\" (UID: 
\"4a7d5fd4-1799-4d5d-9016-4ee3de9ed205\") " pod="openstack/openstackclient" Mar 10 08:24:33 crc kubenswrapper[4825]: I0310 08:24:33.486471 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfwcd\" (UniqueName: \"kubernetes.io/projected/4a7d5fd4-1799-4d5d-9016-4ee3de9ed205-kube-api-access-lfwcd\") pod \"openstackclient\" (UID: \"4a7d5fd4-1799-4d5d-9016-4ee3de9ed205\") " pod="openstack/openstackclient" Mar 10 08:24:33 crc kubenswrapper[4825]: I0310 08:24:33.595035 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a7d5fd4-1799-4d5d-9016-4ee3de9ed205-combined-ca-bundle\") pod \"openstackclient\" (UID: \"4a7d5fd4-1799-4d5d-9016-4ee3de9ed205\") " pod="openstack/openstackclient" Mar 10 08:24:33 crc kubenswrapper[4825]: I0310 08:24:33.595307 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4a7d5fd4-1799-4d5d-9016-4ee3de9ed205-openstack-config\") pod \"openstackclient\" (UID: \"4a7d5fd4-1799-4d5d-9016-4ee3de9ed205\") " pod="openstack/openstackclient" Mar 10 08:24:33 crc kubenswrapper[4825]: I0310 08:24:33.595413 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfwcd\" (UniqueName: \"kubernetes.io/projected/4a7d5fd4-1799-4d5d-9016-4ee3de9ed205-kube-api-access-lfwcd\") pod \"openstackclient\" (UID: \"4a7d5fd4-1799-4d5d-9016-4ee3de9ed205\") " pod="openstack/openstackclient" Mar 10 08:24:33 crc kubenswrapper[4825]: I0310 08:24:33.595494 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4a7d5fd4-1799-4d5d-9016-4ee3de9ed205-openstack-config-secret\") pod \"openstackclient\" (UID: \"4a7d5fd4-1799-4d5d-9016-4ee3de9ed205\") " pod="openstack/openstackclient" Mar 10 08:24:33 crc 
kubenswrapper[4825]: I0310 08:24:33.597066 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4a7d5fd4-1799-4d5d-9016-4ee3de9ed205-openstack-config\") pod \"openstackclient\" (UID: \"4a7d5fd4-1799-4d5d-9016-4ee3de9ed205\") " pod="openstack/openstackclient" Mar 10 08:24:33 crc kubenswrapper[4825]: I0310 08:24:33.606897 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4a7d5fd4-1799-4d5d-9016-4ee3de9ed205-openstack-config-secret\") pod \"openstackclient\" (UID: \"4a7d5fd4-1799-4d5d-9016-4ee3de9ed205\") " pod="openstack/openstackclient" Mar 10 08:24:33 crc kubenswrapper[4825]: I0310 08:24:33.615506 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a7d5fd4-1799-4d5d-9016-4ee3de9ed205-combined-ca-bundle\") pod \"openstackclient\" (UID: \"4a7d5fd4-1799-4d5d-9016-4ee3de9ed205\") " pod="openstack/openstackclient" Mar 10 08:24:33 crc kubenswrapper[4825]: I0310 08:24:33.630632 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfwcd\" (UniqueName: \"kubernetes.io/projected/4a7d5fd4-1799-4d5d-9016-4ee3de9ed205-kube-api-access-lfwcd\") pod \"openstackclient\" (UID: \"4a7d5fd4-1799-4d5d-9016-4ee3de9ed205\") " pod="openstack/openstackclient" Mar 10 08:24:33 crc kubenswrapper[4825]: I0310 08:24:33.690431 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 10 08:24:34 crc kubenswrapper[4825]: I0310 08:24:34.087527 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 08:24:34 crc kubenswrapper[4825]: W0310 08:24:34.092306 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3fb738d_ab2f_4e60_9089_0e00c094843f.slice/crio-9d81c16222768031c63d5f215aa3fb863b0e97dded7ba57497d3942ae7ce5da6 WatchSource:0}: Error finding container 9d81c16222768031c63d5f215aa3fb863b0e97dded7ba57497d3942ae7ce5da6: Status 404 returned error can't find the container with id 9d81c16222768031c63d5f215aa3fb863b0e97dded7ba57497d3942ae7ce5da6 Mar 10 08:24:34 crc kubenswrapper[4825]: I0310 08:24:34.316927 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 10 08:24:34 crc kubenswrapper[4825]: I0310 08:24:34.912347 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e3fb738d-ab2f-4e60-9089-0e00c094843f","Type":"ContainerStarted","Data":"9d81c16222768031c63d5f215aa3fb863b0e97dded7ba57497d3942ae7ce5da6"} Mar 10 08:24:34 crc kubenswrapper[4825]: I0310 08:24:34.915278 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"4a7d5fd4-1799-4d5d-9016-4ee3de9ed205","Type":"ContainerStarted","Data":"6793a1b760c99d382843a93fb66c47c450b4ed1e903d11cf1b98411f26e7a259"} Mar 10 08:24:34 crc kubenswrapper[4825]: I0310 08:24:34.915415 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"4a7d5fd4-1799-4d5d-9016-4ee3de9ed205","Type":"ContainerStarted","Data":"0a2a290265beb7c82f7089dae65f7bff16d7ba2cac6345651e5b7d2769fbacab"} Mar 10 08:24:34 crc kubenswrapper[4825]: I0310 08:24:34.932601 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.932583972 
podStartE2EDuration="1.932583972s" podCreationTimestamp="2026-03-10 08:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:24:34.930958699 +0000 UTC m=+6027.960739304" watchObservedRunningTime="2026-03-10 08:24:34.932583972 +0000 UTC m=+6027.962364577" Mar 10 08:24:35 crc kubenswrapper[4825]: I0310 08:24:35.011263 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 10 08:24:35 crc kubenswrapper[4825]: I0310 08:24:35.011571 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="504bd427-27b5-4b10-99cb-285d970bf02e" containerName="prometheus" containerID="cri-o://51adc52051557da29ea0059305b1fabbdf2dd4b898fb143e341eb9eeb701d7bf" gracePeriod=600 Mar 10 08:24:35 crc kubenswrapper[4825]: I0310 08:24:35.011619 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="504bd427-27b5-4b10-99cb-285d970bf02e" containerName="thanos-sidecar" containerID="cri-o://6a5b561f9fd0cbac987fb383d3048754b2b5c5ee65fa72dc6dbda5c54688ad36" gracePeriod=600 Mar 10 08:24:35 crc kubenswrapper[4825]: I0310 08:24:35.011674 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="504bd427-27b5-4b10-99cb-285d970bf02e" containerName="config-reloader" containerID="cri-o://30645e8fd4a42db208e6a279c8f8487dd402eb2a9f5bc17c540a72805ff74ce9" gracePeriod=600 Mar 10 08:24:35 crc kubenswrapper[4825]: I0310 08:24:35.793169 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 10 08:24:35 crc kubenswrapper[4825]: I0310 08:24:35.895791 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6942d3f9-7a95-4c4f-9c04-49b338e4f82f-openstack-config\") pod \"6942d3f9-7a95-4c4f-9c04-49b338e4f82f\" (UID: \"6942d3f9-7a95-4c4f-9c04-49b338e4f82f\") " Mar 10 08:24:35 crc kubenswrapper[4825]: I0310 08:24:35.896119 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2h5nr\" (UniqueName: \"kubernetes.io/projected/6942d3f9-7a95-4c4f-9c04-49b338e4f82f-kube-api-access-2h5nr\") pod \"6942d3f9-7a95-4c4f-9c04-49b338e4f82f\" (UID: \"6942d3f9-7a95-4c4f-9c04-49b338e4f82f\") " Mar 10 08:24:35 crc kubenswrapper[4825]: I0310 08:24:35.896637 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6942d3f9-7a95-4c4f-9c04-49b338e4f82f-openstack-config-secret\") pod \"6942d3f9-7a95-4c4f-9c04-49b338e4f82f\" (UID: \"6942d3f9-7a95-4c4f-9c04-49b338e4f82f\") " Mar 10 08:24:35 crc kubenswrapper[4825]: I0310 08:24:35.896675 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6942d3f9-7a95-4c4f-9c04-49b338e4f82f-combined-ca-bundle\") pod \"6942d3f9-7a95-4c4f-9c04-49b338e4f82f\" (UID: \"6942d3f9-7a95-4c4f-9c04-49b338e4f82f\") " Mar 10 08:24:35 crc kubenswrapper[4825]: I0310 08:24:35.916467 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6942d3f9-7a95-4c4f-9c04-49b338e4f82f-kube-api-access-2h5nr" (OuterVolumeSpecName: "kube-api-access-2h5nr") pod "6942d3f9-7a95-4c4f-9c04-49b338e4f82f" (UID: "6942d3f9-7a95-4c4f-9c04-49b338e4f82f"). InnerVolumeSpecName "kube-api-access-2h5nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:24:35 crc kubenswrapper[4825]: I0310 08:24:35.934733 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6942d3f9-7a95-4c4f-9c04-49b338e4f82f-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "6942d3f9-7a95-4c4f-9c04-49b338e4f82f" (UID: "6942d3f9-7a95-4c4f-9c04-49b338e4f82f"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:24:35 crc kubenswrapper[4825]: I0310 08:24:35.954462 4825 generic.go:334] "Generic (PLEG): container finished" podID="6942d3f9-7a95-4c4f-9c04-49b338e4f82f" containerID="30a71937c7e64786c07e2d0ce2025a14d75797e4a94e7ca8d63ac313cb19ff57" exitCode=137 Mar 10 08:24:35 crc kubenswrapper[4825]: I0310 08:24:35.954616 4825 scope.go:117] "RemoveContainer" containerID="30a71937c7e64786c07e2d0ce2025a14d75797e4a94e7ca8d63ac313cb19ff57" Mar 10 08:24:35 crc kubenswrapper[4825]: I0310 08:24:35.954854 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 10 08:24:35 crc kubenswrapper[4825]: I0310 08:24:35.964931 4825 generic.go:334] "Generic (PLEG): container finished" podID="504bd427-27b5-4b10-99cb-285d970bf02e" containerID="6a5b561f9fd0cbac987fb383d3048754b2b5c5ee65fa72dc6dbda5c54688ad36" exitCode=0 Mar 10 08:24:35 crc kubenswrapper[4825]: I0310 08:24:35.964958 4825 generic.go:334] "Generic (PLEG): container finished" podID="504bd427-27b5-4b10-99cb-285d970bf02e" containerID="30645e8fd4a42db208e6a279c8f8487dd402eb2a9f5bc17c540a72805ff74ce9" exitCode=0 Mar 10 08:24:35 crc kubenswrapper[4825]: I0310 08:24:35.964966 4825 generic.go:334] "Generic (PLEG): container finished" podID="504bd427-27b5-4b10-99cb-285d970bf02e" containerID="51adc52051557da29ea0059305b1fabbdf2dd4b898fb143e341eb9eeb701d7bf" exitCode=0 Mar 10 08:24:35 crc kubenswrapper[4825]: I0310 08:24:35.965973 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"504bd427-27b5-4b10-99cb-285d970bf02e","Type":"ContainerDied","Data":"6a5b561f9fd0cbac987fb383d3048754b2b5c5ee65fa72dc6dbda5c54688ad36"} Mar 10 08:24:35 crc kubenswrapper[4825]: I0310 08:24:35.966008 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"504bd427-27b5-4b10-99cb-285d970bf02e","Type":"ContainerDied","Data":"30645e8fd4a42db208e6a279c8f8487dd402eb2a9f5bc17c540a72805ff74ce9"} Mar 10 08:24:35 crc kubenswrapper[4825]: I0310 08:24:35.966018 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"504bd427-27b5-4b10-99cb-285d970bf02e","Type":"ContainerDied","Data":"51adc52051557da29ea0059305b1fabbdf2dd4b898fb143e341eb9eeb701d7bf"} Mar 10 08:24:35 crc kubenswrapper[4825]: I0310 08:24:35.978356 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6942d3f9-7a95-4c4f-9c04-49b338e4f82f-combined-ca-bundle" (OuterVolumeSpecName: 
"combined-ca-bundle") pod "6942d3f9-7a95-4c4f-9c04-49b338e4f82f" (UID: "6942d3f9-7a95-4c4f-9c04-49b338e4f82f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:24:35 crc kubenswrapper[4825]: I0310 08:24:35.998539 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6942d3f9-7a95-4c4f-9c04-49b338e4f82f-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "6942d3f9-7a95-4c4f-9c04-49b338e4f82f" (UID: "6942d3f9-7a95-4c4f-9c04-49b338e4f82f"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:24:35 crc kubenswrapper[4825]: I0310 08:24:35.999859 4825 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6942d3f9-7a95-4c4f-9c04-49b338e4f82f-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 10 08:24:35 crc kubenswrapper[4825]: I0310 08:24:35.999896 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6942d3f9-7a95-4c4f-9c04-49b338e4f82f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:24:35 crc kubenswrapper[4825]: I0310 08:24:35.999909 4825 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6942d3f9-7a95-4c4f-9c04-49b338e4f82f-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 10 08:24:36 crc kubenswrapper[4825]: I0310 08:24:35.999920 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2h5nr\" (UniqueName: \"kubernetes.io/projected/6942d3f9-7a95-4c4f-9c04-49b338e4f82f-kube-api-access-2h5nr\") on node \"crc\" DevicePath \"\"" Mar 10 08:24:36 crc kubenswrapper[4825]: I0310 08:24:36.021562 4825 scope.go:117] "RemoveContainer" containerID="30a71937c7e64786c07e2d0ce2025a14d75797e4a94e7ca8d63ac313cb19ff57" Mar 10 08:24:36 crc kubenswrapper[4825]: 
E0310 08:24:36.022160 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30a71937c7e64786c07e2d0ce2025a14d75797e4a94e7ca8d63ac313cb19ff57\": container with ID starting with 30a71937c7e64786c07e2d0ce2025a14d75797e4a94e7ca8d63ac313cb19ff57 not found: ID does not exist" containerID="30a71937c7e64786c07e2d0ce2025a14d75797e4a94e7ca8d63ac313cb19ff57" Mar 10 08:24:36 crc kubenswrapper[4825]: I0310 08:24:36.022193 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30a71937c7e64786c07e2d0ce2025a14d75797e4a94e7ca8d63ac313cb19ff57"} err="failed to get container status \"30a71937c7e64786c07e2d0ce2025a14d75797e4a94e7ca8d63ac313cb19ff57\": rpc error: code = NotFound desc = could not find container \"30a71937c7e64786c07e2d0ce2025a14d75797e4a94e7ca8d63ac313cb19ff57\": container with ID starting with 30a71937c7e64786c07e2d0ce2025a14d75797e4a94e7ca8d63ac313cb19ff57 not found: ID does not exist" Mar 10 08:24:36 crc kubenswrapper[4825]: I0310 08:24:36.348840 4825 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="6942d3f9-7a95-4c4f-9c04-49b338e4f82f" podUID="4a7d5fd4-1799-4d5d-9016-4ee3de9ed205" Mar 10 08:24:36 crc kubenswrapper[4825]: I0310 08:24:36.350254 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:36 crc kubenswrapper[4825]: I0310 08:24:36.410152 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/504bd427-27b5-4b10-99cb-285d970bf02e-tls-assets\") pod \"504bd427-27b5-4b10-99cb-285d970bf02e\" (UID: \"504bd427-27b5-4b10-99cb-285d970bf02e\") " Mar 10 08:24:36 crc kubenswrapper[4825]: I0310 08:24:36.410266 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4277677a-0949-4a04-b057-6e93d3b54e65\") pod \"504bd427-27b5-4b10-99cb-285d970bf02e\" (UID: \"504bd427-27b5-4b10-99cb-285d970bf02e\") " Mar 10 08:24:36 crc kubenswrapper[4825]: I0310 08:24:36.410302 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sd9vr\" (UniqueName: \"kubernetes.io/projected/504bd427-27b5-4b10-99cb-285d970bf02e-kube-api-access-sd9vr\") pod \"504bd427-27b5-4b10-99cb-285d970bf02e\" (UID: \"504bd427-27b5-4b10-99cb-285d970bf02e\") " Mar 10 08:24:36 crc kubenswrapper[4825]: I0310 08:24:36.410338 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/504bd427-27b5-4b10-99cb-285d970bf02e-prometheus-metric-storage-rulefiles-2\") pod \"504bd427-27b5-4b10-99cb-285d970bf02e\" (UID: \"504bd427-27b5-4b10-99cb-285d970bf02e\") " Mar 10 08:24:36 crc kubenswrapper[4825]: I0310 08:24:36.410385 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/504bd427-27b5-4b10-99cb-285d970bf02e-thanos-prometheus-http-client-file\") pod \"504bd427-27b5-4b10-99cb-285d970bf02e\" (UID: \"504bd427-27b5-4b10-99cb-285d970bf02e\") " Mar 10 08:24:36 crc kubenswrapper[4825]: 
I0310 08:24:36.410464 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/504bd427-27b5-4b10-99cb-285d970bf02e-prometheus-metric-storage-rulefiles-1\") pod \"504bd427-27b5-4b10-99cb-285d970bf02e\" (UID: \"504bd427-27b5-4b10-99cb-285d970bf02e\") " Mar 10 08:24:36 crc kubenswrapper[4825]: I0310 08:24:36.410493 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/504bd427-27b5-4b10-99cb-285d970bf02e-prometheus-metric-storage-rulefiles-0\") pod \"504bd427-27b5-4b10-99cb-285d970bf02e\" (UID: \"504bd427-27b5-4b10-99cb-285d970bf02e\") " Mar 10 08:24:36 crc kubenswrapper[4825]: I0310 08:24:36.410571 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/504bd427-27b5-4b10-99cb-285d970bf02e-config-out\") pod \"504bd427-27b5-4b10-99cb-285d970bf02e\" (UID: \"504bd427-27b5-4b10-99cb-285d970bf02e\") " Mar 10 08:24:36 crc kubenswrapper[4825]: I0310 08:24:36.410606 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/504bd427-27b5-4b10-99cb-285d970bf02e-web-config\") pod \"504bd427-27b5-4b10-99cb-285d970bf02e\" (UID: \"504bd427-27b5-4b10-99cb-285d970bf02e\") " Mar 10 08:24:36 crc kubenswrapper[4825]: I0310 08:24:36.410705 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/504bd427-27b5-4b10-99cb-285d970bf02e-config\") pod \"504bd427-27b5-4b10-99cb-285d970bf02e\" (UID: \"504bd427-27b5-4b10-99cb-285d970bf02e\") " Mar 10 08:24:36 crc kubenswrapper[4825]: I0310 08:24:36.413740 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/504bd427-27b5-4b10-99cb-285d970bf02e-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "504bd427-27b5-4b10-99cb-285d970bf02e" (UID: "504bd427-27b5-4b10-99cb-285d970bf02e"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:24:36 crc kubenswrapper[4825]: I0310 08:24:36.413832 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/504bd427-27b5-4b10-99cb-285d970bf02e-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "504bd427-27b5-4b10-99cb-285d970bf02e" (UID: "504bd427-27b5-4b10-99cb-285d970bf02e"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:24:36 crc kubenswrapper[4825]: I0310 08:24:36.414113 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/504bd427-27b5-4b10-99cb-285d970bf02e-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "504bd427-27b5-4b10-99cb-285d970bf02e" (UID: "504bd427-27b5-4b10-99cb-285d970bf02e"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:24:36 crc kubenswrapper[4825]: I0310 08:24:36.419398 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/504bd427-27b5-4b10-99cb-285d970bf02e-config-out" (OuterVolumeSpecName: "config-out") pod "504bd427-27b5-4b10-99cb-285d970bf02e" (UID: "504bd427-27b5-4b10-99cb-285d970bf02e"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:24:36 crc kubenswrapper[4825]: I0310 08:24:36.419625 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/504bd427-27b5-4b10-99cb-285d970bf02e-config" (OuterVolumeSpecName: "config") pod "504bd427-27b5-4b10-99cb-285d970bf02e" (UID: "504bd427-27b5-4b10-99cb-285d970bf02e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:24:36 crc kubenswrapper[4825]: I0310 08:24:36.421618 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/504bd427-27b5-4b10-99cb-285d970bf02e-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "504bd427-27b5-4b10-99cb-285d970bf02e" (UID: "504bd427-27b5-4b10-99cb-285d970bf02e"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:24:36 crc kubenswrapper[4825]: I0310 08:24:36.422332 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/504bd427-27b5-4b10-99cb-285d970bf02e-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "504bd427-27b5-4b10-99cb-285d970bf02e" (UID: "504bd427-27b5-4b10-99cb-285d970bf02e"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:24:36 crc kubenswrapper[4825]: I0310 08:24:36.438554 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/504bd427-27b5-4b10-99cb-285d970bf02e-kube-api-access-sd9vr" (OuterVolumeSpecName: "kube-api-access-sd9vr") pod "504bd427-27b5-4b10-99cb-285d970bf02e" (UID: "504bd427-27b5-4b10-99cb-285d970bf02e"). InnerVolumeSpecName "kube-api-access-sd9vr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:24:36 crc kubenswrapper[4825]: I0310 08:24:36.442881 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4277677a-0949-4a04-b057-6e93d3b54e65" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "504bd427-27b5-4b10-99cb-285d970bf02e" (UID: "504bd427-27b5-4b10-99cb-285d970bf02e"). InnerVolumeSpecName "pvc-4277677a-0949-4a04-b057-6e93d3b54e65". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 10 08:24:36 crc kubenswrapper[4825]: I0310 08:24:36.467043 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/504bd427-27b5-4b10-99cb-285d970bf02e-web-config" (OuterVolumeSpecName: "web-config") pod "504bd427-27b5-4b10-99cb-285d970bf02e" (UID: "504bd427-27b5-4b10-99cb-285d970bf02e"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:24:36 crc kubenswrapper[4825]: I0310 08:24:36.512951 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/504bd427-27b5-4b10-99cb-285d970bf02e-config\") on node \"crc\" DevicePath \"\"" Mar 10 08:24:36 crc kubenswrapper[4825]: I0310 08:24:36.512988 4825 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/504bd427-27b5-4b10-99cb-285d970bf02e-tls-assets\") on node \"crc\" DevicePath \"\"" Mar 10 08:24:36 crc kubenswrapper[4825]: I0310 08:24:36.513023 4825 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-4277677a-0949-4a04-b057-6e93d3b54e65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4277677a-0949-4a04-b057-6e93d3b54e65\") on node \"crc\" " Mar 10 08:24:36 crc kubenswrapper[4825]: I0310 08:24:36.513035 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sd9vr\" (UniqueName: 
\"kubernetes.io/projected/504bd427-27b5-4b10-99cb-285d970bf02e-kube-api-access-sd9vr\") on node \"crc\" DevicePath \"\"" Mar 10 08:24:36 crc kubenswrapper[4825]: I0310 08:24:36.513046 4825 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/504bd427-27b5-4b10-99cb-285d970bf02e-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Mar 10 08:24:36 crc kubenswrapper[4825]: I0310 08:24:36.513056 4825 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/504bd427-27b5-4b10-99cb-285d970bf02e-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Mar 10 08:24:36 crc kubenswrapper[4825]: I0310 08:24:36.513065 4825 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/504bd427-27b5-4b10-99cb-285d970bf02e-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Mar 10 08:24:36 crc kubenswrapper[4825]: I0310 08:24:36.513073 4825 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/504bd427-27b5-4b10-99cb-285d970bf02e-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Mar 10 08:24:36 crc kubenswrapper[4825]: I0310 08:24:36.513082 4825 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/504bd427-27b5-4b10-99cb-285d970bf02e-config-out\") on node \"crc\" DevicePath \"\"" Mar 10 08:24:36 crc kubenswrapper[4825]: I0310 08:24:36.513093 4825 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/504bd427-27b5-4b10-99cb-285d970bf02e-web-config\") on node \"crc\" DevicePath \"\"" Mar 10 08:24:36 crc kubenswrapper[4825]: I0310 08:24:36.550967 4825 csi_attacher.go:630] kubernetes.io/csi: 
attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 10 08:24:36 crc kubenswrapper[4825]: I0310 08:24:36.551101 4825 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-4277677a-0949-4a04-b057-6e93d3b54e65" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4277677a-0949-4a04-b057-6e93d3b54e65") on node "crc" Mar 10 08:24:36 crc kubenswrapper[4825]: I0310 08:24:36.614952 4825 reconciler_common.go:293] "Volume detached for volume \"pvc-4277677a-0949-4a04-b057-6e93d3b54e65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4277677a-0949-4a04-b057-6e93d3b54e65\") on node \"crc\" DevicePath \"\"" Mar 10 08:24:36 crc kubenswrapper[4825]: I0310 08:24:36.992478 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"504bd427-27b5-4b10-99cb-285d970bf02e","Type":"ContainerDied","Data":"a9917bb6daa9b6905186750ccc4e9ec5cdfe017bb2d4db9e288d8d0ffcc22efb"} Mar 10 08:24:36 crc kubenswrapper[4825]: I0310 08:24:36.992557 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:36 crc kubenswrapper[4825]: I0310 08:24:36.992748 4825 scope.go:117] "RemoveContainer" containerID="6a5b561f9fd0cbac987fb383d3048754b2b5c5ee65fa72dc6dbda5c54688ad36" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.033568 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.044821 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.085214 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 10 08:24:37 crc kubenswrapper[4825]: E0310 08:24:37.085862 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="504bd427-27b5-4b10-99cb-285d970bf02e" containerName="thanos-sidecar" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.085883 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="504bd427-27b5-4b10-99cb-285d970bf02e" containerName="thanos-sidecar" Mar 10 08:24:37 crc kubenswrapper[4825]: E0310 08:24:37.085893 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="504bd427-27b5-4b10-99cb-285d970bf02e" containerName="prometheus" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.085899 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="504bd427-27b5-4b10-99cb-285d970bf02e" containerName="prometheus" Mar 10 08:24:37 crc kubenswrapper[4825]: E0310 08:24:37.085914 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="504bd427-27b5-4b10-99cb-285d970bf02e" containerName="init-config-reloader" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.085922 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="504bd427-27b5-4b10-99cb-285d970bf02e" containerName="init-config-reloader" Mar 10 08:24:37 crc kubenswrapper[4825]: E0310 08:24:37.085950 4825 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="504bd427-27b5-4b10-99cb-285d970bf02e" containerName="config-reloader" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.085955 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="504bd427-27b5-4b10-99cb-285d970bf02e" containerName="config-reloader" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.086376 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="504bd427-27b5-4b10-99cb-285d970bf02e" containerName="prometheus" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.086404 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="504bd427-27b5-4b10-99cb-285d970bf02e" containerName="config-reloader" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.086412 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="504bd427-27b5-4b10-99cb-285d970bf02e" containerName="thanos-sidecar" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.088361 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.097867 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.098212 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.098328 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.098448 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-fsmn2" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.098587 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.098696 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.098781 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.102037 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.102203 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.114113 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.122742 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f4caed09-3fd0-43fc-8e28-776e103343bd-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"f4caed09-3fd0-43fc-8e28-776e103343bd\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.122825 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f4caed09-3fd0-43fc-8e28-776e103343bd-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"f4caed09-3fd0-43fc-8e28-776e103343bd\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.122848 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f4caed09-3fd0-43fc-8e28-776e103343bd-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"f4caed09-3fd0-43fc-8e28-776e103343bd\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.122888 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/f4caed09-3fd0-43fc-8e28-776e103343bd-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"f4caed09-3fd0-43fc-8e28-776e103343bd\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.122908 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4caed09-3fd0-43fc-8e28-776e103343bd-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: 
\"f4caed09-3fd0-43fc-8e28-776e103343bd\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.122931 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/f4caed09-3fd0-43fc-8e28-776e103343bd-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"f4caed09-3fd0-43fc-8e28-776e103343bd\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.122954 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/f4caed09-3fd0-43fc-8e28-776e103343bd-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"f4caed09-3fd0-43fc-8e28-776e103343bd\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.122973 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f4caed09-3fd0-43fc-8e28-776e103343bd-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"f4caed09-3fd0-43fc-8e28-776e103343bd\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.123005 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/f4caed09-3fd0-43fc-8e28-776e103343bd-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"f4caed09-3fd0-43fc-8e28-776e103343bd\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.123028 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f4caed09-3fd0-43fc-8e28-776e103343bd-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"f4caed09-3fd0-43fc-8e28-776e103343bd\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.123045 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6tkp\" (UniqueName: \"kubernetes.io/projected/f4caed09-3fd0-43fc-8e28-776e103343bd-kube-api-access-w6tkp\") pod \"prometheus-metric-storage-0\" (UID: \"f4caed09-3fd0-43fc-8e28-776e103343bd\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.123094 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f4caed09-3fd0-43fc-8e28-776e103343bd-config\") pod \"prometheus-metric-storage-0\" (UID: \"f4caed09-3fd0-43fc-8e28-776e103343bd\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.123118 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4277677a-0949-4a04-b057-6e93d3b54e65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4277677a-0949-4a04-b057-6e93d3b54e65\") pod \"prometheus-metric-storage-0\" (UID: \"f4caed09-3fd0-43fc-8e28-776e103343bd\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.225240 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f4caed09-3fd0-43fc-8e28-776e103343bd-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"f4caed09-3fd0-43fc-8e28-776e103343bd\") " 
pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.225353 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f4caed09-3fd0-43fc-8e28-776e103343bd-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"f4caed09-3fd0-43fc-8e28-776e103343bd\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.225374 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f4caed09-3fd0-43fc-8e28-776e103343bd-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"f4caed09-3fd0-43fc-8e28-776e103343bd\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.225428 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/f4caed09-3fd0-43fc-8e28-776e103343bd-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"f4caed09-3fd0-43fc-8e28-776e103343bd\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.225456 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4caed09-3fd0-43fc-8e28-776e103343bd-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"f4caed09-3fd0-43fc-8e28-776e103343bd\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.225480 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/f4caed09-3fd0-43fc-8e28-776e103343bd-prometheus-metric-storage-rulefiles-2\") pod 
\"prometheus-metric-storage-0\" (UID: \"f4caed09-3fd0-43fc-8e28-776e103343bd\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.225512 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/f4caed09-3fd0-43fc-8e28-776e103343bd-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"f4caed09-3fd0-43fc-8e28-776e103343bd\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.225534 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f4caed09-3fd0-43fc-8e28-776e103343bd-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"f4caed09-3fd0-43fc-8e28-776e103343bd\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.225566 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/f4caed09-3fd0-43fc-8e28-776e103343bd-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"f4caed09-3fd0-43fc-8e28-776e103343bd\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.225596 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f4caed09-3fd0-43fc-8e28-776e103343bd-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"f4caed09-3fd0-43fc-8e28-776e103343bd\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.225618 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-w6tkp\" (UniqueName: \"kubernetes.io/projected/f4caed09-3fd0-43fc-8e28-776e103343bd-kube-api-access-w6tkp\") pod \"prometheus-metric-storage-0\" (UID: \"f4caed09-3fd0-43fc-8e28-776e103343bd\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.225690 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f4caed09-3fd0-43fc-8e28-776e103343bd-config\") pod \"prometheus-metric-storage-0\" (UID: \"f4caed09-3fd0-43fc-8e28-776e103343bd\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.225727 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4277677a-0949-4a04-b057-6e93d3b54e65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4277677a-0949-4a04-b057-6e93d3b54e65\") pod \"prometheus-metric-storage-0\" (UID: \"f4caed09-3fd0-43fc-8e28-776e103343bd\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.227474 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/f4caed09-3fd0-43fc-8e28-776e103343bd-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"f4caed09-3fd0-43fc-8e28-776e103343bd\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.227903 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/f4caed09-3fd0-43fc-8e28-776e103343bd-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"f4caed09-3fd0-43fc-8e28-776e103343bd\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.228159 4825 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f4caed09-3fd0-43fc-8e28-776e103343bd-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"f4caed09-3fd0-43fc-8e28-776e103343bd\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.231039 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f4caed09-3fd0-43fc-8e28-776e103343bd-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"f4caed09-3fd0-43fc-8e28-776e103343bd\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.231557 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f4caed09-3fd0-43fc-8e28-776e103343bd-config\") pod \"prometheus-metric-storage-0\" (UID: \"f4caed09-3fd0-43fc-8e28-776e103343bd\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.232702 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f4caed09-3fd0-43fc-8e28-776e103343bd-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"f4caed09-3fd0-43fc-8e28-776e103343bd\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.233006 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4caed09-3fd0-43fc-8e28-776e103343bd-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"f4caed09-3fd0-43fc-8e28-776e103343bd\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.234508 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/f4caed09-3fd0-43fc-8e28-776e103343bd-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"f4caed09-3fd0-43fc-8e28-776e103343bd\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.237837 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f4caed09-3fd0-43fc-8e28-776e103343bd-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"f4caed09-3fd0-43fc-8e28-776e103343bd\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.238003 4825 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.238039 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4277677a-0949-4a04-b057-6e93d3b54e65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4277677a-0949-4a04-b057-6e93d3b54e65\") pod \"prometheus-metric-storage-0\" (UID: \"f4caed09-3fd0-43fc-8e28-776e103343bd\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/837f8e712531f2ad959ef263db73d503df17159a57bcfd27bb8598f273a4236d/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.250950 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f4caed09-3fd0-43fc-8e28-776e103343bd-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"f4caed09-3fd0-43fc-8e28-776e103343bd\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.251709 4825 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6tkp\" (UniqueName: \"kubernetes.io/projected/f4caed09-3fd0-43fc-8e28-776e103343bd-kube-api-access-w6tkp\") pod \"prometheus-metric-storage-0\" (UID: \"f4caed09-3fd0-43fc-8e28-776e103343bd\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.255943 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/f4caed09-3fd0-43fc-8e28-776e103343bd-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"f4caed09-3fd0-43fc-8e28-776e103343bd\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.258970 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="504bd427-27b5-4b10-99cb-285d970bf02e" path="/var/lib/kubelet/pods/504bd427-27b5-4b10-99cb-285d970bf02e/volumes" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.261708 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6942d3f9-7a95-4c4f-9c04-49b338e4f82f" path="/var/lib/kubelet/pods/6942d3f9-7a95-4c4f-9c04-49b338e4f82f/volumes" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.310678 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4277677a-0949-4a04-b057-6e93d3b54e65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4277677a-0949-4a04-b057-6e93d3b54e65\") pod \"prometheus-metric-storage-0\" (UID: \"f4caed09-3fd0-43fc-8e28-776e103343bd\") " pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:37 crc kubenswrapper[4825]: I0310 08:24:37.420466 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 10 08:24:38 crc kubenswrapper[4825]: I0310 08:24:38.561420 4825 scope.go:117] "RemoveContainer" containerID="30645e8fd4a42db208e6a279c8f8487dd402eb2a9f5bc17c540a72805ff74ce9" Mar 10 08:24:38 crc kubenswrapper[4825]: I0310 08:24:38.946910 4825 scope.go:117] "RemoveContainer" containerID="51adc52051557da29ea0059305b1fabbdf2dd4b898fb143e341eb9eeb701d7bf" Mar 10 08:24:39 crc kubenswrapper[4825]: I0310 08:24:39.022477 4825 scope.go:117] "RemoveContainer" containerID="ac5da213dba0b9785a5ead9c0f3ea7084f81a54f664eb9952011ef5f7024b42d" Mar 10 08:24:39 crc kubenswrapper[4825]: I0310 08:24:39.334249 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="504bd427-27b5-4b10-99cb-285d970bf02e" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.1.167:9090/-/ready\": dial tcp 10.217.1.167:9090: i/o timeout (Client.Timeout exceeded while awaiting headers)" Mar 10 08:24:40 crc kubenswrapper[4825]: I0310 08:24:39.503037 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 10 08:24:40 crc kubenswrapper[4825]: W0310 08:24:39.509366 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4caed09_3fd0_43fc_8e28_776e103343bd.slice/crio-2c76b8f8848e69e45efacb530bf5b8be5e30de09cfbea7f536cac97f43344e0b WatchSource:0}: Error finding container 2c76b8f8848e69e45efacb530bf5b8be5e30de09cfbea7f536cac97f43344e0b: Status 404 returned error can't find the container with id 2c76b8f8848e69e45efacb530bf5b8be5e30de09cfbea7f536cac97f43344e0b Mar 10 08:24:40 crc kubenswrapper[4825]: I0310 08:24:40.089652 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"e3fb738d-ab2f-4e60-9089-0e00c094843f","Type":"ContainerStarted","Data":"05957ec238ef9b0191d882b6eb08a62cbc40efce2a9684bb8638d063980f0d78"}
Mar 10 08:24:40 crc kubenswrapper[4825]: I0310 08:24:40.091182 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f4caed09-3fd0-43fc-8e28-776e103343bd","Type":"ContainerStarted","Data":"2c76b8f8848e69e45efacb530bf5b8be5e30de09cfbea7f536cac97f43344e0b"}
Mar 10 08:24:41 crc kubenswrapper[4825]: I0310 08:24:41.134996 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e3fb738d-ab2f-4e60-9089-0e00c094843f","Type":"ContainerStarted","Data":"e22e2c68bfc406e6120b38bbe5a17a5b2d9e1e26cc7f412c181ed18ccdb48aba"}
Mar 10 08:24:42 crc kubenswrapper[4825]: I0310 08:24:42.145845 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e3fb738d-ab2f-4e60-9089-0e00c094843f","Type":"ContainerStarted","Data":"4fdde4c509cacc7ae2888137ff45ed17da2d69b5d12202ed5fbbff3478ba82e2"}
Mar 10 08:24:43 crc kubenswrapper[4825]: I0310 08:24:43.156699 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e3fb738d-ab2f-4e60-9089-0e00c094843f","Type":"ContainerStarted","Data":"33b23ac26b28c5034ced51b07431debf29256982eae426d3799a7f0ce000023a"}
Mar 10 08:24:43 crc kubenswrapper[4825]: I0310 08:24:43.156857 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 10 08:24:43 crc kubenswrapper[4825]: I0310 08:24:43.159086 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f4caed09-3fd0-43fc-8e28-776e103343bd","Type":"ContainerStarted","Data":"942054483f21e9b23cfe2b15eda27bf35df25c0d9f3008011f4a49c92444fefa"}
Mar 10 08:24:43 crc kubenswrapper[4825]: I0310 08:24:43.185577 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.4733856149999998 podStartE2EDuration="11.185546462s" podCreationTimestamp="2026-03-10 08:24:32 +0000 UTC" firstStartedPulling="2026-03-10 08:24:34.095770782 +0000 UTC m=+6027.125551397" lastFinishedPulling="2026-03-10 08:24:42.807931619 +0000 UTC m=+6035.837712244" observedRunningTime="2026-03-10 08:24:43.180374237 +0000 UTC m=+6036.210154862" watchObservedRunningTime="2026-03-10 08:24:43.185546462 +0000 UTC m=+6036.215327087"
Mar 10 08:24:46 crc kubenswrapper[4825]: I0310 08:24:46.888038 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 08:24:46 crc kubenswrapper[4825]: I0310 08:24:46.888425 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 08:24:48 crc kubenswrapper[4825]: I0310 08:24:48.051075 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-xf7vk"]
Mar 10 08:24:48 crc kubenswrapper[4825]: I0310 08:24:48.060083 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-xf7vk"]
Mar 10 08:24:49 crc kubenswrapper[4825]: I0310 08:24:49.247403 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50272a2f-4ff6-4c66-b7de-80bf6ce236c3" path="/var/lib/kubelet/pods/50272a2f-4ff6-4c66-b7de-80bf6ce236c3/volumes"
Mar 10 08:24:49 crc kubenswrapper[4825]: I0310 08:24:49.810484 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-hfkl9"]
Mar 10 08:24:49 crc kubenswrapper[4825]: I0310 08:24:49.812436 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-hfkl9"
Mar 10 08:24:49 crc kubenswrapper[4825]: I0310 08:24:49.830173 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-hfkl9"]
Mar 10 08:24:49 crc kubenswrapper[4825]: I0310 08:24:49.911042 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a04468f-d0fe-40da-81ab-85ff0e683d0a-operator-scripts\") pod \"aodh-db-create-hfkl9\" (UID: \"5a04468f-d0fe-40da-81ab-85ff0e683d0a\") " pod="openstack/aodh-db-create-hfkl9"
Mar 10 08:24:49 crc kubenswrapper[4825]: I0310 08:24:49.911121 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwh22\" (UniqueName: \"kubernetes.io/projected/5a04468f-d0fe-40da-81ab-85ff0e683d0a-kube-api-access-zwh22\") pod \"aodh-db-create-hfkl9\" (UID: \"5a04468f-d0fe-40da-81ab-85ff0e683d0a\") " pod="openstack/aodh-db-create-hfkl9"
Mar 10 08:24:49 crc kubenswrapper[4825]: I0310 08:24:49.919892 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-170a-account-create-update-dm2z4"]
Mar 10 08:24:49 crc kubenswrapper[4825]: I0310 08:24:49.921138 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-170a-account-create-update-dm2z4"
Mar 10 08:24:49 crc kubenswrapper[4825]: I0310 08:24:49.923281 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret"
Mar 10 08:24:49 crc kubenswrapper[4825]: I0310 08:24:49.939228 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-170a-account-create-update-dm2z4"]
Mar 10 08:24:50 crc kubenswrapper[4825]: I0310 08:24:50.013538 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a04468f-d0fe-40da-81ab-85ff0e683d0a-operator-scripts\") pod \"aodh-db-create-hfkl9\" (UID: \"5a04468f-d0fe-40da-81ab-85ff0e683d0a\") " pod="openstack/aodh-db-create-hfkl9"
Mar 10 08:24:50 crc kubenswrapper[4825]: I0310 08:24:50.013651 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwh22\" (UniqueName: \"kubernetes.io/projected/5a04468f-d0fe-40da-81ab-85ff0e683d0a-kube-api-access-zwh22\") pod \"aodh-db-create-hfkl9\" (UID: \"5a04468f-d0fe-40da-81ab-85ff0e683d0a\") " pod="openstack/aodh-db-create-hfkl9"
Mar 10 08:24:50 crc kubenswrapper[4825]: I0310 08:24:50.013744 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb9c8dbc-8d08-41bc-92b9-807b45181044-operator-scripts\") pod \"aodh-170a-account-create-update-dm2z4\" (UID: \"fb9c8dbc-8d08-41bc-92b9-807b45181044\") " pod="openstack/aodh-170a-account-create-update-dm2z4"
Mar 10 08:24:50 crc kubenswrapper[4825]: I0310 08:24:50.013787 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t6dn\" (UniqueName: \"kubernetes.io/projected/fb9c8dbc-8d08-41bc-92b9-807b45181044-kube-api-access-4t6dn\") pod \"aodh-170a-account-create-update-dm2z4\" (UID: \"fb9c8dbc-8d08-41bc-92b9-807b45181044\") " pod="openstack/aodh-170a-account-create-update-dm2z4"
Mar 10 08:24:50 crc kubenswrapper[4825]: I0310 08:24:50.014382 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a04468f-d0fe-40da-81ab-85ff0e683d0a-operator-scripts\") pod \"aodh-db-create-hfkl9\" (UID: \"5a04468f-d0fe-40da-81ab-85ff0e683d0a\") " pod="openstack/aodh-db-create-hfkl9"
Mar 10 08:24:50 crc kubenswrapper[4825]: I0310 08:24:50.030772 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwh22\" (UniqueName: \"kubernetes.io/projected/5a04468f-d0fe-40da-81ab-85ff0e683d0a-kube-api-access-zwh22\") pod \"aodh-db-create-hfkl9\" (UID: \"5a04468f-d0fe-40da-81ab-85ff0e683d0a\") " pod="openstack/aodh-db-create-hfkl9"
Mar 10 08:24:50 crc kubenswrapper[4825]: I0310 08:24:50.115598 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb9c8dbc-8d08-41bc-92b9-807b45181044-operator-scripts\") pod \"aodh-170a-account-create-update-dm2z4\" (UID: \"fb9c8dbc-8d08-41bc-92b9-807b45181044\") " pod="openstack/aodh-170a-account-create-update-dm2z4"
Mar 10 08:24:50 crc kubenswrapper[4825]: I0310 08:24:50.115957 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t6dn\" (UniqueName: \"kubernetes.io/projected/fb9c8dbc-8d08-41bc-92b9-807b45181044-kube-api-access-4t6dn\") pod \"aodh-170a-account-create-update-dm2z4\" (UID: \"fb9c8dbc-8d08-41bc-92b9-807b45181044\") " pod="openstack/aodh-170a-account-create-update-dm2z4"
Mar 10 08:24:50 crc kubenswrapper[4825]: I0310 08:24:50.117905 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb9c8dbc-8d08-41bc-92b9-807b45181044-operator-scripts\") pod \"aodh-170a-account-create-update-dm2z4\" (UID: \"fb9c8dbc-8d08-41bc-92b9-807b45181044\") " pod="openstack/aodh-170a-account-create-update-dm2z4"
Mar 10 08:24:50 crc kubenswrapper[4825]: I0310 08:24:50.135291 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-hfkl9"
Mar 10 08:24:50 crc kubenswrapper[4825]: I0310 08:24:50.138154 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t6dn\" (UniqueName: \"kubernetes.io/projected/fb9c8dbc-8d08-41bc-92b9-807b45181044-kube-api-access-4t6dn\") pod \"aodh-170a-account-create-update-dm2z4\" (UID: \"fb9c8dbc-8d08-41bc-92b9-807b45181044\") " pod="openstack/aodh-170a-account-create-update-dm2z4"
Mar 10 08:24:50 crc kubenswrapper[4825]: I0310 08:24:50.216936 4825 scope.go:117] "RemoveContainer" containerID="0f554e63e5c766240e80d6b35cb69ff0af9a63cce450ff6a93312478f01708a4"
Mar 10 08:24:50 crc kubenswrapper[4825]: I0310 08:24:50.242184 4825 generic.go:334] "Generic (PLEG): container finished" podID="f4caed09-3fd0-43fc-8e28-776e103343bd" containerID="942054483f21e9b23cfe2b15eda27bf35df25c0d9f3008011f4a49c92444fefa" exitCode=0
Mar 10 08:24:50 crc kubenswrapper[4825]: I0310 08:24:50.242224 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f4caed09-3fd0-43fc-8e28-776e103343bd","Type":"ContainerDied","Data":"942054483f21e9b23cfe2b15eda27bf35df25c0d9f3008011f4a49c92444fefa"}
Mar 10 08:24:50 crc kubenswrapper[4825]: I0310 08:24:50.287665 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-170a-account-create-update-dm2z4"
Mar 10 08:24:50 crc kubenswrapper[4825]: I0310 08:24:50.302418 4825 scope.go:117] "RemoveContainer" containerID="7de10519b9e1b6f98c1a1256dab9a133f0e65a322025cc77b71087bb11943577"
Mar 10 08:24:50 crc kubenswrapper[4825]: I0310 08:24:50.441603 4825 scope.go:117] "RemoveContainer" containerID="38bd83551c0dac70d6198bf0e2a000e2d03b88269c897a0e0b841de5a64a9f02"
Mar 10 08:24:50 crc kubenswrapper[4825]: I0310 08:24:50.526780 4825 scope.go:117] "RemoveContainer" containerID="010a911e5e9ec3ed61ba006b2584f13f6e8b789e247d40f22a396878a027d480"
Mar 10 08:24:50 crc kubenswrapper[4825]: I0310 08:24:50.696778 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-hfkl9"]
Mar 10 08:24:50 crc kubenswrapper[4825]: I0310 08:24:50.844578 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-170a-account-create-update-dm2z4"]
Mar 10 08:24:50 crc kubenswrapper[4825]: W0310 08:24:50.872216 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb9c8dbc_8d08_41bc_92b9_807b45181044.slice/crio-5f87b0e0547e75b91b71ac06359cb7912edc5df567333b6e7ee68fb60e331f20 WatchSource:0}: Error finding container 5f87b0e0547e75b91b71ac06359cb7912edc5df567333b6e7ee68fb60e331f20: Status 404 returned error can't find the container with id 5f87b0e0547e75b91b71ac06359cb7912edc5df567333b6e7ee68fb60e331f20
Mar 10 08:24:51 crc kubenswrapper[4825]: I0310 08:24:51.251829 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f4caed09-3fd0-43fc-8e28-776e103343bd","Type":"ContainerStarted","Data":"07b10c0a5ef9df7836b57d881141168f542a190545beda86d290e9e018946e64"}
Mar 10 08:24:51 crc kubenswrapper[4825]: I0310 08:24:51.253641 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-170a-account-create-update-dm2z4" event={"ID":"fb9c8dbc-8d08-41bc-92b9-807b45181044","Type":"ContainerStarted","Data":"f5d7ae71e5203c3d98973148eb54aec5743a00074e6c8fb3f5df6ec4097c8f6b"}
Mar 10 08:24:51 crc kubenswrapper[4825]: I0310 08:24:51.253765 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-170a-account-create-update-dm2z4" event={"ID":"fb9c8dbc-8d08-41bc-92b9-807b45181044","Type":"ContainerStarted","Data":"5f87b0e0547e75b91b71ac06359cb7912edc5df567333b6e7ee68fb60e331f20"}
Mar 10 08:24:51 crc kubenswrapper[4825]: I0310 08:24:51.255796 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-hfkl9" event={"ID":"5a04468f-d0fe-40da-81ab-85ff0e683d0a","Type":"ContainerStarted","Data":"b77c91bb00dc24963b87c5e53479c9a9343f56474be1f973da5436cbf8377f2a"}
Mar 10 08:24:51 crc kubenswrapper[4825]: I0310 08:24:51.255843 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-hfkl9" event={"ID":"5a04468f-d0fe-40da-81ab-85ff0e683d0a","Type":"ContainerStarted","Data":"1c776f52b2593ffc2cde7fa3ba2e66e8da1cfed04ce14d4853b1682046338325"}
Mar 10 08:24:51 crc kubenswrapper[4825]: I0310 08:24:51.278287 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-170a-account-create-update-dm2z4" podStartSLOduration=2.278267267 podStartE2EDuration="2.278267267s" podCreationTimestamp="2026-03-10 08:24:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:24:51.266525779 +0000 UTC m=+6044.296306424" watchObservedRunningTime="2026-03-10 08:24:51.278267267 +0000 UTC m=+6044.308047882"
Mar 10 08:24:51 crc kubenswrapper[4825]: I0310 08:24:51.297838 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-create-hfkl9" podStartSLOduration=2.2978219810000002 podStartE2EDuration="2.297821981s" podCreationTimestamp="2026-03-10 08:24:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:24:51.288996479 +0000 UTC m=+6044.318777094" watchObservedRunningTime="2026-03-10 08:24:51.297821981 +0000 UTC m=+6044.327602586"
Mar 10 08:24:52 crc kubenswrapper[4825]: I0310 08:24:52.277002 4825 generic.go:334] "Generic (PLEG): container finished" podID="fb9c8dbc-8d08-41bc-92b9-807b45181044" containerID="f5d7ae71e5203c3d98973148eb54aec5743a00074e6c8fb3f5df6ec4097c8f6b" exitCode=0
Mar 10 08:24:52 crc kubenswrapper[4825]: I0310 08:24:52.278044 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-170a-account-create-update-dm2z4" event={"ID":"fb9c8dbc-8d08-41bc-92b9-807b45181044","Type":"ContainerDied","Data":"f5d7ae71e5203c3d98973148eb54aec5743a00074e6c8fb3f5df6ec4097c8f6b"}
Mar 10 08:24:52 crc kubenswrapper[4825]: I0310 08:24:52.282309 4825 generic.go:334] "Generic (PLEG): container finished" podID="5a04468f-d0fe-40da-81ab-85ff0e683d0a" containerID="b77c91bb00dc24963b87c5e53479c9a9343f56474be1f973da5436cbf8377f2a" exitCode=0
Mar 10 08:24:52 crc kubenswrapper[4825]: I0310 08:24:52.282356 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-hfkl9" event={"ID":"5a04468f-d0fe-40da-81ab-85ff0e683d0a","Type":"ContainerDied","Data":"b77c91bb00dc24963b87c5e53479c9a9343f56474be1f973da5436cbf8377f2a"}
Mar 10 08:24:54 crc kubenswrapper[4825]: I0310 08:24:54.022419 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-hfkl9"
Mar 10 08:24:54 crc kubenswrapper[4825]: I0310 08:24:54.030630 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-170a-account-create-update-dm2z4"
Mar 10 08:24:54 crc kubenswrapper[4825]: I0310 08:24:54.117990 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwh22\" (UniqueName: \"kubernetes.io/projected/5a04468f-d0fe-40da-81ab-85ff0e683d0a-kube-api-access-zwh22\") pod \"5a04468f-d0fe-40da-81ab-85ff0e683d0a\" (UID: \"5a04468f-d0fe-40da-81ab-85ff0e683d0a\") "
Mar 10 08:24:54 crc kubenswrapper[4825]: I0310 08:24:54.118061 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t6dn\" (UniqueName: \"kubernetes.io/projected/fb9c8dbc-8d08-41bc-92b9-807b45181044-kube-api-access-4t6dn\") pod \"fb9c8dbc-8d08-41bc-92b9-807b45181044\" (UID: \"fb9c8dbc-8d08-41bc-92b9-807b45181044\") "
Mar 10 08:24:54 crc kubenswrapper[4825]: I0310 08:24:54.118096 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a04468f-d0fe-40da-81ab-85ff0e683d0a-operator-scripts\") pod \"5a04468f-d0fe-40da-81ab-85ff0e683d0a\" (UID: \"5a04468f-d0fe-40da-81ab-85ff0e683d0a\") "
Mar 10 08:24:54 crc kubenswrapper[4825]: I0310 08:24:54.118262 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb9c8dbc-8d08-41bc-92b9-807b45181044-operator-scripts\") pod \"fb9c8dbc-8d08-41bc-92b9-807b45181044\" (UID: \"fb9c8dbc-8d08-41bc-92b9-807b45181044\") "
Mar 10 08:24:54 crc kubenswrapper[4825]: I0310 08:24:54.118709 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb9c8dbc-8d08-41bc-92b9-807b45181044-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fb9c8dbc-8d08-41bc-92b9-807b45181044" (UID: "fb9c8dbc-8d08-41bc-92b9-807b45181044"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 08:24:54 crc kubenswrapper[4825]: I0310 08:24:54.119209 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a04468f-d0fe-40da-81ab-85ff0e683d0a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5a04468f-d0fe-40da-81ab-85ff0e683d0a" (UID: "5a04468f-d0fe-40da-81ab-85ff0e683d0a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 08:24:54 crc kubenswrapper[4825]: I0310 08:24:54.124334 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb9c8dbc-8d08-41bc-92b9-807b45181044-kube-api-access-4t6dn" (OuterVolumeSpecName: "kube-api-access-4t6dn") pod "fb9c8dbc-8d08-41bc-92b9-807b45181044" (UID: "fb9c8dbc-8d08-41bc-92b9-807b45181044"). InnerVolumeSpecName "kube-api-access-4t6dn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 08:24:54 crc kubenswrapper[4825]: I0310 08:24:54.124391 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a04468f-d0fe-40da-81ab-85ff0e683d0a-kube-api-access-zwh22" (OuterVolumeSpecName: "kube-api-access-zwh22") pod "5a04468f-d0fe-40da-81ab-85ff0e683d0a" (UID: "5a04468f-d0fe-40da-81ab-85ff0e683d0a"). InnerVolumeSpecName "kube-api-access-zwh22". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 08:24:54 crc kubenswrapper[4825]: I0310 08:24:54.220820 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwh22\" (UniqueName: \"kubernetes.io/projected/5a04468f-d0fe-40da-81ab-85ff0e683d0a-kube-api-access-zwh22\") on node \"crc\" DevicePath \"\""
Mar 10 08:24:54 crc kubenswrapper[4825]: I0310 08:24:54.220866 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t6dn\" (UniqueName: \"kubernetes.io/projected/fb9c8dbc-8d08-41bc-92b9-807b45181044-kube-api-access-4t6dn\") on node \"crc\" DevicePath \"\""
Mar 10 08:24:54 crc kubenswrapper[4825]: I0310 08:24:54.220878 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a04468f-d0fe-40da-81ab-85ff0e683d0a-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 08:24:54 crc kubenswrapper[4825]: I0310 08:24:54.220890 4825 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb9c8dbc-8d08-41bc-92b9-807b45181044-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 08:24:54 crc kubenswrapper[4825]: I0310 08:24:54.304926 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-hfkl9" event={"ID":"5a04468f-d0fe-40da-81ab-85ff0e683d0a","Type":"ContainerDied","Data":"1c776f52b2593ffc2cde7fa3ba2e66e8da1cfed04ce14d4853b1682046338325"}
Mar 10 08:24:54 crc kubenswrapper[4825]: I0310 08:24:54.304966 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c776f52b2593ffc2cde7fa3ba2e66e8da1cfed04ce14d4853b1682046338325"
Mar 10 08:24:54 crc kubenswrapper[4825]: I0310 08:24:54.304968 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-hfkl9"
Mar 10 08:24:54 crc kubenswrapper[4825]: I0310 08:24:54.308611 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f4caed09-3fd0-43fc-8e28-776e103343bd","Type":"ContainerStarted","Data":"7d984af948f6a4e75ab2391eb684087520d2eb27c11372ec4ad04652ece60c4b"}
Mar 10 08:24:54 crc kubenswrapper[4825]: I0310 08:24:54.308659 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f4caed09-3fd0-43fc-8e28-776e103343bd","Type":"ContainerStarted","Data":"00b6d835cd490eb95a731201e6f666e30ca274980122ef556029fb020890f04c"}
Mar 10 08:24:54 crc kubenswrapper[4825]: I0310 08:24:54.311487 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-170a-account-create-update-dm2z4" event={"ID":"fb9c8dbc-8d08-41bc-92b9-807b45181044","Type":"ContainerDied","Data":"5f87b0e0547e75b91b71ac06359cb7912edc5df567333b6e7ee68fb60e331f20"}
Mar 10 08:24:54 crc kubenswrapper[4825]: I0310 08:24:54.311515 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-170a-account-create-update-dm2z4"
Mar 10 08:24:54 crc kubenswrapper[4825]: I0310 08:24:54.311534 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f87b0e0547e75b91b71ac06359cb7912edc5df567333b6e7ee68fb60e331f20"
Mar 10 08:24:54 crc kubenswrapper[4825]: I0310 08:24:54.350434 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=17.350413973 podStartE2EDuration="17.350413973s" podCreationTimestamp="2026-03-10 08:24:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:24:54.337624617 +0000 UTC m=+6047.367405232" watchObservedRunningTime="2026-03-10 08:24:54.350413973 +0000 UTC m=+6047.380194588"
Mar 10 08:24:55 crc kubenswrapper[4825]: I0310 08:24:55.206759 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-whppm"]
Mar 10 08:24:55 crc kubenswrapper[4825]: E0310 08:24:55.207624 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a04468f-d0fe-40da-81ab-85ff0e683d0a" containerName="mariadb-database-create"
Mar 10 08:24:55 crc kubenswrapper[4825]: I0310 08:24:55.207645 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a04468f-d0fe-40da-81ab-85ff0e683d0a" containerName="mariadb-database-create"
Mar 10 08:24:55 crc kubenswrapper[4825]: E0310 08:24:55.207673 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb9c8dbc-8d08-41bc-92b9-807b45181044" containerName="mariadb-account-create-update"
Mar 10 08:24:55 crc kubenswrapper[4825]: I0310 08:24:55.207681 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb9c8dbc-8d08-41bc-92b9-807b45181044" containerName="mariadb-account-create-update"
Mar 10 08:24:55 crc kubenswrapper[4825]: I0310 08:24:55.207959 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a04468f-d0fe-40da-81ab-85ff0e683d0a" containerName="mariadb-database-create"
Mar 10 08:24:55 crc kubenswrapper[4825]: I0310 08:24:55.207973 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb9c8dbc-8d08-41bc-92b9-807b45181044" containerName="mariadb-account-create-update"
Mar 10 08:24:55 crc kubenswrapper[4825]: I0310 08:24:55.208879 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-whppm"
Mar 10 08:24:55 crc kubenswrapper[4825]: I0310 08:24:55.211775 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data"
Mar 10 08:24:55 crc kubenswrapper[4825]: I0310 08:24:55.211983 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Mar 10 08:24:55 crc kubenswrapper[4825]: I0310 08:24:55.212725 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts"
Mar 10 08:24:55 crc kubenswrapper[4825]: I0310 08:24:55.213024 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-pkvxq"
Mar 10 08:24:55 crc kubenswrapper[4825]: I0310 08:24:55.219169 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-whppm"]
Mar 10 08:24:55 crc kubenswrapper[4825]: I0310 08:24:55.239447 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5737928d-0f6e-441a-a129-f40bb9a984f6-combined-ca-bundle\") pod \"aodh-db-sync-whppm\" (UID: \"5737928d-0f6e-441a-a129-f40bb9a984f6\") " pod="openstack/aodh-db-sync-whppm"
Mar 10 08:24:55 crc kubenswrapper[4825]: I0310 08:24:55.239640 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5737928d-0f6e-441a-a129-f40bb9a984f6-scripts\") pod \"aodh-db-sync-whppm\" (UID: \"5737928d-0f6e-441a-a129-f40bb9a984f6\") " pod="openstack/aodh-db-sync-whppm"
Mar 10 08:24:55 crc kubenswrapper[4825]: I0310 08:24:55.239813 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf4lg\" (UniqueName: \"kubernetes.io/projected/5737928d-0f6e-441a-a129-f40bb9a984f6-kube-api-access-wf4lg\") pod \"aodh-db-sync-whppm\" (UID: \"5737928d-0f6e-441a-a129-f40bb9a984f6\") " pod="openstack/aodh-db-sync-whppm"
Mar 10 08:24:55 crc kubenswrapper[4825]: I0310 08:24:55.239914 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5737928d-0f6e-441a-a129-f40bb9a984f6-config-data\") pod \"aodh-db-sync-whppm\" (UID: \"5737928d-0f6e-441a-a129-f40bb9a984f6\") " pod="openstack/aodh-db-sync-whppm"
Mar 10 08:24:55 crc kubenswrapper[4825]: I0310 08:24:55.341730 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wf4lg\" (UniqueName: \"kubernetes.io/projected/5737928d-0f6e-441a-a129-f40bb9a984f6-kube-api-access-wf4lg\") pod \"aodh-db-sync-whppm\" (UID: \"5737928d-0f6e-441a-a129-f40bb9a984f6\") " pod="openstack/aodh-db-sync-whppm"
Mar 10 08:24:55 crc kubenswrapper[4825]: I0310 08:24:55.341852 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5737928d-0f6e-441a-a129-f40bb9a984f6-config-data\") pod \"aodh-db-sync-whppm\" (UID: \"5737928d-0f6e-441a-a129-f40bb9a984f6\") " pod="openstack/aodh-db-sync-whppm"
Mar 10 08:24:55 crc kubenswrapper[4825]: I0310 08:24:55.341898 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5737928d-0f6e-441a-a129-f40bb9a984f6-combined-ca-bundle\") pod \"aodh-db-sync-whppm\" (UID: \"5737928d-0f6e-441a-a129-f40bb9a984f6\") " pod="openstack/aodh-db-sync-whppm"
Mar 10 08:24:55 crc kubenswrapper[4825]: I0310 08:24:55.341926 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5737928d-0f6e-441a-a129-f40bb9a984f6-scripts\") pod \"aodh-db-sync-whppm\" (UID: \"5737928d-0f6e-441a-a129-f40bb9a984f6\") " pod="openstack/aodh-db-sync-whppm"
Mar 10 08:24:55 crc kubenswrapper[4825]: I0310 08:24:55.347839 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5737928d-0f6e-441a-a129-f40bb9a984f6-scripts\") pod \"aodh-db-sync-whppm\" (UID: \"5737928d-0f6e-441a-a129-f40bb9a984f6\") " pod="openstack/aodh-db-sync-whppm"
Mar 10 08:24:55 crc kubenswrapper[4825]: I0310 08:24:55.348420 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5737928d-0f6e-441a-a129-f40bb9a984f6-config-data\") pod \"aodh-db-sync-whppm\" (UID: \"5737928d-0f6e-441a-a129-f40bb9a984f6\") " pod="openstack/aodh-db-sync-whppm"
Mar 10 08:24:55 crc kubenswrapper[4825]: I0310 08:24:55.356780 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5737928d-0f6e-441a-a129-f40bb9a984f6-combined-ca-bundle\") pod \"aodh-db-sync-whppm\" (UID: \"5737928d-0f6e-441a-a129-f40bb9a984f6\") " pod="openstack/aodh-db-sync-whppm"
Mar 10 08:24:55 crc kubenswrapper[4825]: I0310 08:24:55.363548 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf4lg\" (UniqueName: \"kubernetes.io/projected/5737928d-0f6e-441a-a129-f40bb9a984f6-kube-api-access-wf4lg\") pod \"aodh-db-sync-whppm\" (UID: \"5737928d-0f6e-441a-a129-f40bb9a984f6\") " pod="openstack/aodh-db-sync-whppm"
Mar 10 08:24:55 crc kubenswrapper[4825]: I0310 08:24:55.532303 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-whppm"
Mar 10 08:24:56 crc kubenswrapper[4825]: I0310 08:24:56.050648 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-whppm"]
Mar 10 08:24:56 crc kubenswrapper[4825]: I0310 08:24:56.330926 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-whppm" event={"ID":"5737928d-0f6e-441a-a129-f40bb9a984f6","Type":"ContainerStarted","Data":"d7928cab4bf4960292cd9bfe2152f8e2826dc6cd364293dc2ff1bed7a402584e"}
Mar 10 08:24:57 crc kubenswrapper[4825]: I0310 08:24:57.420723 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0"
Mar 10 08:25:03 crc kubenswrapper[4825]: I0310 08:25:03.163518 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 10 08:25:03 crc kubenswrapper[4825]: I0310 08:25:03.401381 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-whppm" event={"ID":"5737928d-0f6e-441a-a129-f40bb9a984f6","Type":"ContainerStarted","Data":"3591ea9da8f8ff9ac66ee64d0cdc12fcbef2e5cdb74c1d2df5e52e9ccbeb7187"}
Mar 10 08:25:03 crc kubenswrapper[4825]: I0310 08:25:03.428697 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-whppm" podStartSLOduration=1.936265922 podStartE2EDuration="8.428677203s" podCreationTimestamp="2026-03-10 08:24:55 +0000 UTC" firstStartedPulling="2026-03-10 08:24:56.055521889 +0000 UTC m=+6049.085302504" lastFinishedPulling="2026-03-10 08:25:02.54793317 +0000 UTC m=+6055.577713785" observedRunningTime="2026-03-10 08:25:03.414781918 +0000 UTC m=+6056.444562563" watchObservedRunningTime="2026-03-10 08:25:03.428677203 +0000 UTC m=+6056.458457828"
Mar 10 08:25:05 crc kubenswrapper[4825]: I0310 08:25:05.423302 4825 generic.go:334] "Generic (PLEG): container finished" podID="5737928d-0f6e-441a-a129-f40bb9a984f6" containerID="3591ea9da8f8ff9ac66ee64d0cdc12fcbef2e5cdb74c1d2df5e52e9ccbeb7187" exitCode=0
Mar 10 08:25:05 crc kubenswrapper[4825]: I0310 08:25:05.423381 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-whppm" event={"ID":"5737928d-0f6e-441a-a129-f40bb9a984f6","Type":"ContainerDied","Data":"3591ea9da8f8ff9ac66ee64d0cdc12fcbef2e5cdb74c1d2df5e52e9ccbeb7187"}
Mar 10 08:25:06 crc kubenswrapper[4825]: I0310 08:25:06.871065 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-whppm"
Mar 10 08:25:07 crc kubenswrapper[4825]: I0310 08:25:07.003266 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5737928d-0f6e-441a-a129-f40bb9a984f6-combined-ca-bundle\") pod \"5737928d-0f6e-441a-a129-f40bb9a984f6\" (UID: \"5737928d-0f6e-441a-a129-f40bb9a984f6\") "
Mar 10 08:25:07 crc kubenswrapper[4825]: I0310 08:25:07.003326 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5737928d-0f6e-441a-a129-f40bb9a984f6-scripts\") pod \"5737928d-0f6e-441a-a129-f40bb9a984f6\" (UID: \"5737928d-0f6e-441a-a129-f40bb9a984f6\") "
Mar 10 08:25:07 crc kubenswrapper[4825]: I0310 08:25:07.003363 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wf4lg\" (UniqueName: \"kubernetes.io/projected/5737928d-0f6e-441a-a129-f40bb9a984f6-kube-api-access-wf4lg\") pod \"5737928d-0f6e-441a-a129-f40bb9a984f6\" (UID: \"5737928d-0f6e-441a-a129-f40bb9a984f6\") "
Mar 10 08:25:07 crc kubenswrapper[4825]: I0310 08:25:07.003400 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5737928d-0f6e-441a-a129-f40bb9a984f6-config-data\") pod \"5737928d-0f6e-441a-a129-f40bb9a984f6\" (UID: \"5737928d-0f6e-441a-a129-f40bb9a984f6\") "
Mar 10 08:25:07 crc kubenswrapper[4825]: I0310 08:25:07.017386 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5737928d-0f6e-441a-a129-f40bb9a984f6-scripts" (OuterVolumeSpecName: "scripts") pod "5737928d-0f6e-441a-a129-f40bb9a984f6" (UID: "5737928d-0f6e-441a-a129-f40bb9a984f6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 08:25:07 crc kubenswrapper[4825]: I0310 08:25:07.017395 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5737928d-0f6e-441a-a129-f40bb9a984f6-kube-api-access-wf4lg" (OuterVolumeSpecName: "kube-api-access-wf4lg") pod "5737928d-0f6e-441a-a129-f40bb9a984f6" (UID: "5737928d-0f6e-441a-a129-f40bb9a984f6"). InnerVolumeSpecName "kube-api-access-wf4lg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 08:25:07 crc kubenswrapper[4825]: I0310 08:25:07.033834 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5737928d-0f6e-441a-a129-f40bb9a984f6-config-data" (OuterVolumeSpecName: "config-data") pod "5737928d-0f6e-441a-a129-f40bb9a984f6" (UID: "5737928d-0f6e-441a-a129-f40bb9a984f6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 08:25:07 crc kubenswrapper[4825]: I0310 08:25:07.033887 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5737928d-0f6e-441a-a129-f40bb9a984f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5737928d-0f6e-441a-a129-f40bb9a984f6" (UID: "5737928d-0f6e-441a-a129-f40bb9a984f6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 08:25:07 crc kubenswrapper[4825]: I0310 08:25:07.106249 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wf4lg\" (UniqueName: \"kubernetes.io/projected/5737928d-0f6e-441a-a129-f40bb9a984f6-kube-api-access-wf4lg\") on node \"crc\" DevicePath \"\""
Mar 10 08:25:07 crc kubenswrapper[4825]: I0310 08:25:07.106292 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5737928d-0f6e-441a-a129-f40bb9a984f6-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 08:25:07 crc kubenswrapper[4825]: I0310 08:25:07.106305 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5737928d-0f6e-441a-a129-f40bb9a984f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 08:25:07 crc kubenswrapper[4825]: I0310 08:25:07.106315 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5737928d-0f6e-441a-a129-f40bb9a984f6-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 08:25:07 crc kubenswrapper[4825]: I0310 08:25:07.395533 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 10 08:25:07 crc kubenswrapper[4825]: I0310 08:25:07.395789 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="350fa04a-653d-4ab6-95bb-4315e8de019c" containerName="kube-state-metrics" containerID="cri-o://122e829c5b1b27499da2b33d659bd3bb16caace45fff2c715ff82c2c7b4d980d" gracePeriod=30
Mar 10 08:25:07 crc kubenswrapper[4825]: I0310 08:25:07.421251 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0"
Mar 10 08:25:07 crc kubenswrapper[4825]: I0310 08:25:07.432464 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0"
Mar
10 08:25:07 crc kubenswrapper[4825]: I0310 08:25:07.442758 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-whppm" event={"ID":"5737928d-0f6e-441a-a129-f40bb9a984f6","Type":"ContainerDied","Data":"d7928cab4bf4960292cd9bfe2152f8e2826dc6cd364293dc2ff1bed7a402584e"} Mar 10 08:25:07 crc kubenswrapper[4825]: I0310 08:25:07.442934 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7928cab4bf4960292cd9bfe2152f8e2826dc6cd364293dc2ff1bed7a402584e" Mar 10 08:25:07 crc kubenswrapper[4825]: I0310 08:25:07.442936 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-whppm" Mar 10 08:25:07 crc kubenswrapper[4825]: I0310 08:25:07.446549 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 10 08:25:08 crc kubenswrapper[4825]: I0310 08:25:08.030479 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 08:25:08 crc kubenswrapper[4825]: I0310 08:25:08.145968 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lfq6\" (UniqueName: \"kubernetes.io/projected/350fa04a-653d-4ab6-95bb-4315e8de019c-kube-api-access-8lfq6\") pod \"350fa04a-653d-4ab6-95bb-4315e8de019c\" (UID: \"350fa04a-653d-4ab6-95bb-4315e8de019c\") " Mar 10 08:25:08 crc kubenswrapper[4825]: I0310 08:25:08.157873 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/350fa04a-653d-4ab6-95bb-4315e8de019c-kube-api-access-8lfq6" (OuterVolumeSpecName: "kube-api-access-8lfq6") pod "350fa04a-653d-4ab6-95bb-4315e8de019c" (UID: "350fa04a-653d-4ab6-95bb-4315e8de019c"). InnerVolumeSpecName "kube-api-access-8lfq6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:25:08 crc kubenswrapper[4825]: I0310 08:25:08.248512 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lfq6\" (UniqueName: \"kubernetes.io/projected/350fa04a-653d-4ab6-95bb-4315e8de019c-kube-api-access-8lfq6\") on node \"crc\" DevicePath \"\"" Mar 10 08:25:08 crc kubenswrapper[4825]: I0310 08:25:08.453503 4825 generic.go:334] "Generic (PLEG): container finished" podID="350fa04a-653d-4ab6-95bb-4315e8de019c" containerID="122e829c5b1b27499da2b33d659bd3bb16caace45fff2c715ff82c2c7b4d980d" exitCode=2 Mar 10 08:25:08 crc kubenswrapper[4825]: I0310 08:25:08.453553 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"350fa04a-653d-4ab6-95bb-4315e8de019c","Type":"ContainerDied","Data":"122e829c5b1b27499da2b33d659bd3bb16caace45fff2c715ff82c2c7b4d980d"} Mar 10 08:25:08 crc kubenswrapper[4825]: I0310 08:25:08.453598 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"350fa04a-653d-4ab6-95bb-4315e8de019c","Type":"ContainerDied","Data":"3f09959c253ec7490a48d69fe16399449612aacdb47aaa4bbceb273568e2bf9b"} Mar 10 08:25:08 crc kubenswrapper[4825]: I0310 08:25:08.453609 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 08:25:08 crc kubenswrapper[4825]: I0310 08:25:08.453622 4825 scope.go:117] "RemoveContainer" containerID="122e829c5b1b27499da2b33d659bd3bb16caace45fff2c715ff82c2c7b4d980d" Mar 10 08:25:08 crc kubenswrapper[4825]: I0310 08:25:08.482518 4825 scope.go:117] "RemoveContainer" containerID="122e829c5b1b27499da2b33d659bd3bb16caace45fff2c715ff82c2c7b4d980d" Mar 10 08:25:08 crc kubenswrapper[4825]: E0310 08:25:08.483051 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"122e829c5b1b27499da2b33d659bd3bb16caace45fff2c715ff82c2c7b4d980d\": container with ID starting with 122e829c5b1b27499da2b33d659bd3bb16caace45fff2c715ff82c2c7b4d980d not found: ID does not exist" containerID="122e829c5b1b27499da2b33d659bd3bb16caace45fff2c715ff82c2c7b4d980d" Mar 10 08:25:08 crc kubenswrapper[4825]: I0310 08:25:08.483104 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"122e829c5b1b27499da2b33d659bd3bb16caace45fff2c715ff82c2c7b4d980d"} err="failed to get container status \"122e829c5b1b27499da2b33d659bd3bb16caace45fff2c715ff82c2c7b4d980d\": rpc error: code = NotFound desc = could not find container \"122e829c5b1b27499da2b33d659bd3bb16caace45fff2c715ff82c2c7b4d980d\": container with ID starting with 122e829c5b1b27499da2b33d659bd3bb16caace45fff2c715ff82c2c7b4d980d not found: ID does not exist" Mar 10 08:25:08 crc kubenswrapper[4825]: I0310 08:25:08.491765 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 08:25:08 crc kubenswrapper[4825]: I0310 08:25:08.502001 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 08:25:08 crc kubenswrapper[4825]: I0310 08:25:08.518303 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 08:25:08 crc kubenswrapper[4825]: E0310 
08:25:08.518780 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="350fa04a-653d-4ab6-95bb-4315e8de019c" containerName="kube-state-metrics" Mar 10 08:25:08 crc kubenswrapper[4825]: I0310 08:25:08.518795 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="350fa04a-653d-4ab6-95bb-4315e8de019c" containerName="kube-state-metrics" Mar 10 08:25:08 crc kubenswrapper[4825]: E0310 08:25:08.518819 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5737928d-0f6e-441a-a129-f40bb9a984f6" containerName="aodh-db-sync" Mar 10 08:25:08 crc kubenswrapper[4825]: I0310 08:25:08.518843 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="5737928d-0f6e-441a-a129-f40bb9a984f6" containerName="aodh-db-sync" Mar 10 08:25:08 crc kubenswrapper[4825]: I0310 08:25:08.519077 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="5737928d-0f6e-441a-a129-f40bb9a984f6" containerName="aodh-db-sync" Mar 10 08:25:08 crc kubenswrapper[4825]: I0310 08:25:08.519110 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="350fa04a-653d-4ab6-95bb-4315e8de019c" containerName="kube-state-metrics" Mar 10 08:25:08 crc kubenswrapper[4825]: I0310 08:25:08.519825 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 08:25:08 crc kubenswrapper[4825]: I0310 08:25:08.522126 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 10 08:25:08 crc kubenswrapper[4825]: I0310 08:25:08.522702 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 10 08:25:08 crc kubenswrapper[4825]: I0310 08:25:08.530553 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 08:25:08 crc kubenswrapper[4825]: I0310 08:25:08.658304 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb5h4\" (UniqueName: \"kubernetes.io/projected/9c2d694b-0214-4e6e-b719-f002b7d58c2d-kube-api-access-nb5h4\") pod \"kube-state-metrics-0\" (UID: \"9c2d694b-0214-4e6e-b719-f002b7d58c2d\") " pod="openstack/kube-state-metrics-0" Mar 10 08:25:08 crc kubenswrapper[4825]: I0310 08:25:08.658411 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/9c2d694b-0214-4e6e-b719-f002b7d58c2d-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"9c2d694b-0214-4e6e-b719-f002b7d58c2d\") " pod="openstack/kube-state-metrics-0" Mar 10 08:25:08 crc kubenswrapper[4825]: I0310 08:25:08.658451 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c2d694b-0214-4e6e-b719-f002b7d58c2d-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"9c2d694b-0214-4e6e-b719-f002b7d58c2d\") " pod="openstack/kube-state-metrics-0" Mar 10 08:25:08 crc kubenswrapper[4825]: I0310 08:25:08.658512 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/9c2d694b-0214-4e6e-b719-f002b7d58c2d-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"9c2d694b-0214-4e6e-b719-f002b7d58c2d\") " pod="openstack/kube-state-metrics-0" Mar 10 08:25:08 crc kubenswrapper[4825]: I0310 08:25:08.759946 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nb5h4\" (UniqueName: \"kubernetes.io/projected/9c2d694b-0214-4e6e-b719-f002b7d58c2d-kube-api-access-nb5h4\") pod \"kube-state-metrics-0\" (UID: \"9c2d694b-0214-4e6e-b719-f002b7d58c2d\") " pod="openstack/kube-state-metrics-0" Mar 10 08:25:08 crc kubenswrapper[4825]: I0310 08:25:08.760046 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/9c2d694b-0214-4e6e-b719-f002b7d58c2d-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"9c2d694b-0214-4e6e-b719-f002b7d58c2d\") " pod="openstack/kube-state-metrics-0" Mar 10 08:25:08 crc kubenswrapper[4825]: I0310 08:25:08.760078 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c2d694b-0214-4e6e-b719-f002b7d58c2d-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"9c2d694b-0214-4e6e-b719-f002b7d58c2d\") " pod="openstack/kube-state-metrics-0" Mar 10 08:25:08 crc kubenswrapper[4825]: I0310 08:25:08.760123 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c2d694b-0214-4e6e-b719-f002b7d58c2d-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"9c2d694b-0214-4e6e-b719-f002b7d58c2d\") " pod="openstack/kube-state-metrics-0" Mar 10 08:25:08 crc kubenswrapper[4825]: I0310 08:25:08.763741 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/9c2d694b-0214-4e6e-b719-f002b7d58c2d-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"9c2d694b-0214-4e6e-b719-f002b7d58c2d\") " pod="openstack/kube-state-metrics-0" Mar 10 08:25:08 crc kubenswrapper[4825]: I0310 08:25:08.763771 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c2d694b-0214-4e6e-b719-f002b7d58c2d-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"9c2d694b-0214-4e6e-b719-f002b7d58c2d\") " pod="openstack/kube-state-metrics-0" Mar 10 08:25:08 crc kubenswrapper[4825]: I0310 08:25:08.766163 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c2d694b-0214-4e6e-b719-f002b7d58c2d-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"9c2d694b-0214-4e6e-b719-f002b7d58c2d\") " pod="openstack/kube-state-metrics-0" Mar 10 08:25:08 crc kubenswrapper[4825]: I0310 08:25:08.781778 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb5h4\" (UniqueName: \"kubernetes.io/projected/9c2d694b-0214-4e6e-b719-f002b7d58c2d-kube-api-access-nb5h4\") pod \"kube-state-metrics-0\" (UID: \"9c2d694b-0214-4e6e-b719-f002b7d58c2d\") " pod="openstack/kube-state-metrics-0" Mar 10 08:25:08 crc kubenswrapper[4825]: I0310 08:25:08.840342 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 08:25:09 crc kubenswrapper[4825]: I0310 08:25:09.249203 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="350fa04a-653d-4ab6-95bb-4315e8de019c" path="/var/lib/kubelet/pods/350fa04a-653d-4ab6-95bb-4315e8de019c/volumes" Mar 10 08:25:09 crc kubenswrapper[4825]: I0310 08:25:09.317057 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 08:25:09 crc kubenswrapper[4825]: I0310 08:25:09.317431 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e3fb738d-ab2f-4e60-9089-0e00c094843f" containerName="ceilometer-central-agent" containerID="cri-o://05957ec238ef9b0191d882b6eb08a62cbc40efce2a9684bb8638d063980f0d78" gracePeriod=30 Mar 10 08:25:09 crc kubenswrapper[4825]: I0310 08:25:09.317943 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e3fb738d-ab2f-4e60-9089-0e00c094843f" containerName="proxy-httpd" containerID="cri-o://33b23ac26b28c5034ced51b07431debf29256982eae426d3799a7f0ce000023a" gracePeriod=30 Mar 10 08:25:09 crc kubenswrapper[4825]: I0310 08:25:09.318316 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e3fb738d-ab2f-4e60-9089-0e00c094843f" containerName="ceilometer-notification-agent" containerID="cri-o://e22e2c68bfc406e6120b38bbe5a17a5b2d9e1e26cc7f412c181ed18ccdb48aba" gracePeriod=30 Mar 10 08:25:09 crc kubenswrapper[4825]: I0310 08:25:09.318342 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e3fb738d-ab2f-4e60-9089-0e00c094843f" containerName="sg-core" containerID="cri-o://4fdde4c509cacc7ae2888137ff45ed17da2d69b5d12202ed5fbbff3478ba82e2" gracePeriod=30 Mar 10 08:25:09 crc kubenswrapper[4825]: I0310 08:25:09.334396 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/kube-state-metrics-0"] Mar 10 08:25:09 crc kubenswrapper[4825]: I0310 08:25:09.349612 4825 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 08:25:09 crc kubenswrapper[4825]: I0310 08:25:09.463594 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9c2d694b-0214-4e6e-b719-f002b7d58c2d","Type":"ContainerStarted","Data":"7cf17859085ac07b391492f5e63770c8187f77e57787032f12a24b2f64266bca"} Mar 10 08:25:09 crc kubenswrapper[4825]: I0310 08:25:09.876410 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Mar 10 08:25:09 crc kubenswrapper[4825]: I0310 08:25:09.879655 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 10 08:25:09 crc kubenswrapper[4825]: I0310 08:25:09.881815 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 10 08:25:09 crc kubenswrapper[4825]: I0310 08:25:09.881815 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 10 08:25:09 crc kubenswrapper[4825]: I0310 08:25:09.882012 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-pkvxq" Mar 10 08:25:09 crc kubenswrapper[4825]: I0310 08:25:09.895768 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 10 08:25:09 crc kubenswrapper[4825]: I0310 08:25:09.993889 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/258f3503-574c-4911-a806-832d9ee659d5-scripts\") pod \"aodh-0\" (UID: \"258f3503-574c-4911-a806-832d9ee659d5\") " pod="openstack/aodh-0" Mar 10 08:25:09 crc kubenswrapper[4825]: I0310 08:25:09.994073 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/258f3503-574c-4911-a806-832d9ee659d5-combined-ca-bundle\") pod \"aodh-0\" (UID: \"258f3503-574c-4911-a806-832d9ee659d5\") " pod="openstack/aodh-0" Mar 10 08:25:09 crc kubenswrapper[4825]: I0310 08:25:09.994177 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/258f3503-574c-4911-a806-832d9ee659d5-config-data\") pod \"aodh-0\" (UID: \"258f3503-574c-4911-a806-832d9ee659d5\") " pod="openstack/aodh-0" Mar 10 08:25:09 crc kubenswrapper[4825]: I0310 08:25:09.994254 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gck5t\" (UniqueName: \"kubernetes.io/projected/258f3503-574c-4911-a806-832d9ee659d5-kube-api-access-gck5t\") pod \"aodh-0\" (UID: \"258f3503-574c-4911-a806-832d9ee659d5\") " pod="openstack/aodh-0" Mar 10 08:25:10 crc kubenswrapper[4825]: I0310 08:25:10.096071 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/258f3503-574c-4911-a806-832d9ee659d5-combined-ca-bundle\") pod \"aodh-0\" (UID: \"258f3503-574c-4911-a806-832d9ee659d5\") " pod="openstack/aodh-0" Mar 10 08:25:10 crc kubenswrapper[4825]: I0310 08:25:10.096534 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/258f3503-574c-4911-a806-832d9ee659d5-config-data\") pod \"aodh-0\" (UID: \"258f3503-574c-4911-a806-832d9ee659d5\") " pod="openstack/aodh-0" Mar 10 08:25:10 crc kubenswrapper[4825]: I0310 08:25:10.096689 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gck5t\" (UniqueName: \"kubernetes.io/projected/258f3503-574c-4911-a806-832d9ee659d5-kube-api-access-gck5t\") pod \"aodh-0\" (UID: \"258f3503-574c-4911-a806-832d9ee659d5\") " pod="openstack/aodh-0" Mar 10 08:25:10 crc kubenswrapper[4825]: I0310 
08:25:10.097083 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/258f3503-574c-4911-a806-832d9ee659d5-scripts\") pod \"aodh-0\" (UID: \"258f3503-574c-4911-a806-832d9ee659d5\") " pod="openstack/aodh-0" Mar 10 08:25:10 crc kubenswrapper[4825]: I0310 08:25:10.102434 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/258f3503-574c-4911-a806-832d9ee659d5-scripts\") pod \"aodh-0\" (UID: \"258f3503-574c-4911-a806-832d9ee659d5\") " pod="openstack/aodh-0" Mar 10 08:25:10 crc kubenswrapper[4825]: I0310 08:25:10.102494 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/258f3503-574c-4911-a806-832d9ee659d5-config-data\") pod \"aodh-0\" (UID: \"258f3503-574c-4911-a806-832d9ee659d5\") " pod="openstack/aodh-0" Mar 10 08:25:10 crc kubenswrapper[4825]: I0310 08:25:10.112460 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/258f3503-574c-4911-a806-832d9ee659d5-combined-ca-bundle\") pod \"aodh-0\" (UID: \"258f3503-574c-4911-a806-832d9ee659d5\") " pod="openstack/aodh-0" Mar 10 08:25:10 crc kubenswrapper[4825]: I0310 08:25:10.114038 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gck5t\" (UniqueName: \"kubernetes.io/projected/258f3503-574c-4911-a806-832d9ee659d5-kube-api-access-gck5t\") pod \"aodh-0\" (UID: \"258f3503-574c-4911-a806-832d9ee659d5\") " pod="openstack/aodh-0" Mar 10 08:25:10 crc kubenswrapper[4825]: I0310 08:25:10.198750 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 10 08:25:10 crc kubenswrapper[4825]: I0310 08:25:10.475849 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9c2d694b-0214-4e6e-b719-f002b7d58c2d","Type":"ContainerStarted","Data":"c7749e50f76a3ffc6486ab5c7551136a74f202aceaba094d573c28c55b06d48b"} Mar 10 08:25:10 crc kubenswrapper[4825]: I0310 08:25:10.477099 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 10 08:25:10 crc kubenswrapper[4825]: I0310 08:25:10.484221 4825 generic.go:334] "Generic (PLEG): container finished" podID="e3fb738d-ab2f-4e60-9089-0e00c094843f" containerID="33b23ac26b28c5034ced51b07431debf29256982eae426d3799a7f0ce000023a" exitCode=0 Mar 10 08:25:10 crc kubenswrapper[4825]: I0310 08:25:10.484252 4825 generic.go:334] "Generic (PLEG): container finished" podID="e3fb738d-ab2f-4e60-9089-0e00c094843f" containerID="4fdde4c509cacc7ae2888137ff45ed17da2d69b5d12202ed5fbbff3478ba82e2" exitCode=2 Mar 10 08:25:10 crc kubenswrapper[4825]: I0310 08:25:10.484261 4825 generic.go:334] "Generic (PLEG): container finished" podID="e3fb738d-ab2f-4e60-9089-0e00c094843f" containerID="05957ec238ef9b0191d882b6eb08a62cbc40efce2a9684bb8638d063980f0d78" exitCode=0 Mar 10 08:25:10 crc kubenswrapper[4825]: I0310 08:25:10.484280 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e3fb738d-ab2f-4e60-9089-0e00c094843f","Type":"ContainerDied","Data":"33b23ac26b28c5034ced51b07431debf29256982eae426d3799a7f0ce000023a"} Mar 10 08:25:10 crc kubenswrapper[4825]: I0310 08:25:10.484304 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e3fb738d-ab2f-4e60-9089-0e00c094843f","Type":"ContainerDied","Data":"4fdde4c509cacc7ae2888137ff45ed17da2d69b5d12202ed5fbbff3478ba82e2"} Mar 10 08:25:10 crc kubenswrapper[4825]: I0310 08:25:10.484314 4825 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ceilometer-0" event={"ID":"e3fb738d-ab2f-4e60-9089-0e00c094843f","Type":"ContainerDied","Data":"05957ec238ef9b0191d882b6eb08a62cbc40efce2a9684bb8638d063980f0d78"} Mar 10 08:25:10 crc kubenswrapper[4825]: I0310 08:25:10.504750 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.153764022 podStartE2EDuration="2.504727306s" podCreationTimestamp="2026-03-10 08:25:08 +0000 UTC" firstStartedPulling="2026-03-10 08:25:09.349315762 +0000 UTC m=+6062.379096377" lastFinishedPulling="2026-03-10 08:25:09.700279046 +0000 UTC m=+6062.730059661" observedRunningTime="2026-03-10 08:25:10.492375972 +0000 UTC m=+6063.522156587" watchObservedRunningTime="2026-03-10 08:25:10.504727306 +0000 UTC m=+6063.534507921" Mar 10 08:25:10 crc kubenswrapper[4825]: I0310 08:25:10.679096 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 10 08:25:10 crc kubenswrapper[4825]: W0310 08:25:10.684352 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod258f3503_574c_4911_a806_832d9ee659d5.slice/crio-2f2144752805fb52dab369fbe0caca512e134821383dca1eccbf6406b2c6b652 WatchSource:0}: Error finding container 2f2144752805fb52dab369fbe0caca512e134821383dca1eccbf6406b2c6b652: Status 404 returned error can't find the container with id 2f2144752805fb52dab369fbe0caca512e134821383dca1eccbf6406b2c6b652 Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.476084 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.496518 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"258f3503-574c-4911-a806-832d9ee659d5","Type":"ContainerStarted","Data":"183d219b90afdf6d8d19644ab103a96a3d59fbb32ee78b40f888b4857427cbfa"} Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.496561 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"258f3503-574c-4911-a806-832d9ee659d5","Type":"ContainerStarted","Data":"2f2144752805fb52dab369fbe0caca512e134821383dca1eccbf6406b2c6b652"} Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.499698 4825 generic.go:334] "Generic (PLEG): container finished" podID="e3fb738d-ab2f-4e60-9089-0e00c094843f" containerID="e22e2c68bfc406e6120b38bbe5a17a5b2d9e1e26cc7f412c181ed18ccdb48aba" exitCode=0 Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.500679 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.500868 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e3fb738d-ab2f-4e60-9089-0e00c094843f","Type":"ContainerDied","Data":"e22e2c68bfc406e6120b38bbe5a17a5b2d9e1e26cc7f412c181ed18ccdb48aba"} Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.500896 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e3fb738d-ab2f-4e60-9089-0e00c094843f","Type":"ContainerDied","Data":"9d81c16222768031c63d5f215aa3fb863b0e97dded7ba57497d3942ae7ce5da6"} Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.500914 4825 scope.go:117] "RemoveContainer" containerID="33b23ac26b28c5034ced51b07431debf29256982eae426d3799a7f0ce000023a" Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.536557 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3fb738d-ab2f-4e60-9089-0e00c094843f-combined-ca-bundle\") pod \"e3fb738d-ab2f-4e60-9089-0e00c094843f\" (UID: \"e3fb738d-ab2f-4e60-9089-0e00c094843f\") " Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.536658 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3fb738d-ab2f-4e60-9089-0e00c094843f-run-httpd\") pod \"e3fb738d-ab2f-4e60-9089-0e00c094843f\" (UID: \"e3fb738d-ab2f-4e60-9089-0e00c094843f\") " Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.536714 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e3fb738d-ab2f-4e60-9089-0e00c094843f-sg-core-conf-yaml\") pod \"e3fb738d-ab2f-4e60-9089-0e00c094843f\" (UID: \"e3fb738d-ab2f-4e60-9089-0e00c094843f\") " Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.536743 4825 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-p8qz6\" (UniqueName: \"kubernetes.io/projected/e3fb738d-ab2f-4e60-9089-0e00c094843f-kube-api-access-p8qz6\") pod \"e3fb738d-ab2f-4e60-9089-0e00c094843f\" (UID: \"e3fb738d-ab2f-4e60-9089-0e00c094843f\") " Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.536758 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3fb738d-ab2f-4e60-9089-0e00c094843f-config-data\") pod \"e3fb738d-ab2f-4e60-9089-0e00c094843f\" (UID: \"e3fb738d-ab2f-4e60-9089-0e00c094843f\") " Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.537975 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3fb738d-ab2f-4e60-9089-0e00c094843f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e3fb738d-ab2f-4e60-9089-0e00c094843f" (UID: "e3fb738d-ab2f-4e60-9089-0e00c094843f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.540334 4825 scope.go:117] "RemoveContainer" containerID="4fdde4c509cacc7ae2888137ff45ed17da2d69b5d12202ed5fbbff3478ba82e2" Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.553253 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3fb738d-ab2f-4e60-9089-0e00c094843f-kube-api-access-p8qz6" (OuterVolumeSpecName: "kube-api-access-p8qz6") pod "e3fb738d-ab2f-4e60-9089-0e00c094843f" (UID: "e3fb738d-ab2f-4e60-9089-0e00c094843f"). InnerVolumeSpecName "kube-api-access-p8qz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.584260 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3fb738d-ab2f-4e60-9089-0e00c094843f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e3fb738d-ab2f-4e60-9089-0e00c094843f" (UID: "e3fb738d-ab2f-4e60-9089-0e00c094843f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.626368 4825 scope.go:117] "RemoveContainer" containerID="e22e2c68bfc406e6120b38bbe5a17a5b2d9e1e26cc7f412c181ed18ccdb48aba" Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.640781 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3fb738d-ab2f-4e60-9089-0e00c094843f-scripts\") pod \"e3fb738d-ab2f-4e60-9089-0e00c094843f\" (UID: \"e3fb738d-ab2f-4e60-9089-0e00c094843f\") " Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.641350 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3fb738d-ab2f-4e60-9089-0e00c094843f-log-httpd\") pod \"e3fb738d-ab2f-4e60-9089-0e00c094843f\" (UID: \"e3fb738d-ab2f-4e60-9089-0e00c094843f\") " Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.641871 4825 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3fb738d-ab2f-4e60-9089-0e00c094843f-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.641888 4825 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e3fb738d-ab2f-4e60-9089-0e00c094843f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.641899 4825 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-p8qz6\" (UniqueName: \"kubernetes.io/projected/e3fb738d-ab2f-4e60-9089-0e00c094843f-kube-api-access-p8qz6\") on node \"crc\" DevicePath \"\"" Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.642462 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3fb738d-ab2f-4e60-9089-0e00c094843f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e3fb738d-ab2f-4e60-9089-0e00c094843f" (UID: "e3fb738d-ab2f-4e60-9089-0e00c094843f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.648051 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3fb738d-ab2f-4e60-9089-0e00c094843f-scripts" (OuterVolumeSpecName: "scripts") pod "e3fb738d-ab2f-4e60-9089-0e00c094843f" (UID: "e3fb738d-ab2f-4e60-9089-0e00c094843f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.657652 4825 scope.go:117] "RemoveContainer" containerID="05957ec238ef9b0191d882b6eb08a62cbc40efce2a9684bb8638d063980f0d78" Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.666945 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3fb738d-ab2f-4e60-9089-0e00c094843f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3fb738d-ab2f-4e60-9089-0e00c094843f" (UID: "e3fb738d-ab2f-4e60-9089-0e00c094843f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.721219 4825 scope.go:117] "RemoveContainer" containerID="33b23ac26b28c5034ced51b07431debf29256982eae426d3799a7f0ce000023a" Mar 10 08:25:11 crc kubenswrapper[4825]: E0310 08:25:11.724362 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33b23ac26b28c5034ced51b07431debf29256982eae426d3799a7f0ce000023a\": container with ID starting with 33b23ac26b28c5034ced51b07431debf29256982eae426d3799a7f0ce000023a not found: ID does not exist" containerID="33b23ac26b28c5034ced51b07431debf29256982eae426d3799a7f0ce000023a" Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.724404 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33b23ac26b28c5034ced51b07431debf29256982eae426d3799a7f0ce000023a"} err="failed to get container status \"33b23ac26b28c5034ced51b07431debf29256982eae426d3799a7f0ce000023a\": rpc error: code = NotFound desc = could not find container \"33b23ac26b28c5034ced51b07431debf29256982eae426d3799a7f0ce000023a\": container with ID starting with 33b23ac26b28c5034ced51b07431debf29256982eae426d3799a7f0ce000023a not found: ID does not exist" Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.724425 4825 scope.go:117] "RemoveContainer" containerID="4fdde4c509cacc7ae2888137ff45ed17da2d69b5d12202ed5fbbff3478ba82e2" Mar 10 08:25:11 crc kubenswrapper[4825]: E0310 08:25:11.727512 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fdde4c509cacc7ae2888137ff45ed17da2d69b5d12202ed5fbbff3478ba82e2\": container with ID starting with 4fdde4c509cacc7ae2888137ff45ed17da2d69b5d12202ed5fbbff3478ba82e2 not found: ID does not exist" containerID="4fdde4c509cacc7ae2888137ff45ed17da2d69b5d12202ed5fbbff3478ba82e2" Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.727545 
4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fdde4c509cacc7ae2888137ff45ed17da2d69b5d12202ed5fbbff3478ba82e2"} err="failed to get container status \"4fdde4c509cacc7ae2888137ff45ed17da2d69b5d12202ed5fbbff3478ba82e2\": rpc error: code = NotFound desc = could not find container \"4fdde4c509cacc7ae2888137ff45ed17da2d69b5d12202ed5fbbff3478ba82e2\": container with ID starting with 4fdde4c509cacc7ae2888137ff45ed17da2d69b5d12202ed5fbbff3478ba82e2 not found: ID does not exist" Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.727562 4825 scope.go:117] "RemoveContainer" containerID="e22e2c68bfc406e6120b38bbe5a17a5b2d9e1e26cc7f412c181ed18ccdb48aba" Mar 10 08:25:11 crc kubenswrapper[4825]: E0310 08:25:11.729398 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e22e2c68bfc406e6120b38bbe5a17a5b2d9e1e26cc7f412c181ed18ccdb48aba\": container with ID starting with e22e2c68bfc406e6120b38bbe5a17a5b2d9e1e26cc7f412c181ed18ccdb48aba not found: ID does not exist" containerID="e22e2c68bfc406e6120b38bbe5a17a5b2d9e1e26cc7f412c181ed18ccdb48aba" Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.729508 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e22e2c68bfc406e6120b38bbe5a17a5b2d9e1e26cc7f412c181ed18ccdb48aba"} err="failed to get container status \"e22e2c68bfc406e6120b38bbe5a17a5b2d9e1e26cc7f412c181ed18ccdb48aba\": rpc error: code = NotFound desc = could not find container \"e22e2c68bfc406e6120b38bbe5a17a5b2d9e1e26cc7f412c181ed18ccdb48aba\": container with ID starting with e22e2c68bfc406e6120b38bbe5a17a5b2d9e1e26cc7f412c181ed18ccdb48aba not found: ID does not exist" Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.729597 4825 scope.go:117] "RemoveContainer" containerID="05957ec238ef9b0191d882b6eb08a62cbc40efce2a9684bb8638d063980f0d78" Mar 10 08:25:11 crc kubenswrapper[4825]: E0310 
08:25:11.730816 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05957ec238ef9b0191d882b6eb08a62cbc40efce2a9684bb8638d063980f0d78\": container with ID starting with 05957ec238ef9b0191d882b6eb08a62cbc40efce2a9684bb8638d063980f0d78 not found: ID does not exist" containerID="05957ec238ef9b0191d882b6eb08a62cbc40efce2a9684bb8638d063980f0d78" Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.730869 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05957ec238ef9b0191d882b6eb08a62cbc40efce2a9684bb8638d063980f0d78"} err="failed to get container status \"05957ec238ef9b0191d882b6eb08a62cbc40efce2a9684bb8638d063980f0d78\": rpc error: code = NotFound desc = could not find container \"05957ec238ef9b0191d882b6eb08a62cbc40efce2a9684bb8638d063980f0d78\": container with ID starting with 05957ec238ef9b0191d882b6eb08a62cbc40efce2a9684bb8638d063980f0d78 not found: ID does not exist" Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.743842 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3fb738d-ab2f-4e60-9089-0e00c094843f-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.744387 4825 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3fb738d-ab2f-4e60-9089-0e00c094843f-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.744469 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3fb738d-ab2f-4e60-9089-0e00c094843f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.744593 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/e3fb738d-ab2f-4e60-9089-0e00c094843f-config-data" (OuterVolumeSpecName: "config-data") pod "e3fb738d-ab2f-4e60-9089-0e00c094843f" (UID: "e3fb738d-ab2f-4e60-9089-0e00c094843f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.837625 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.846370 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.846759 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3fb738d-ab2f-4e60-9089-0e00c094843f-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.857311 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 08:25:11 crc kubenswrapper[4825]: E0310 08:25:11.857683 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3fb738d-ab2f-4e60-9089-0e00c094843f" containerName="sg-core" Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.857700 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3fb738d-ab2f-4e60-9089-0e00c094843f" containerName="sg-core" Mar 10 08:25:11 crc kubenswrapper[4825]: E0310 08:25:11.857827 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3fb738d-ab2f-4e60-9089-0e00c094843f" containerName="proxy-httpd" Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.857839 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3fb738d-ab2f-4e60-9089-0e00c094843f" containerName="proxy-httpd" Mar 10 08:25:11 crc kubenswrapper[4825]: E0310 08:25:11.857869 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3fb738d-ab2f-4e60-9089-0e00c094843f" containerName="ceilometer-notification-agent" Mar 10 08:25:11 crc 
kubenswrapper[4825]: I0310 08:25:11.857963 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3fb738d-ab2f-4e60-9089-0e00c094843f" containerName="ceilometer-notification-agent" Mar 10 08:25:11 crc kubenswrapper[4825]: E0310 08:25:11.858000 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3fb738d-ab2f-4e60-9089-0e00c094843f" containerName="ceilometer-central-agent" Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.858588 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3fb738d-ab2f-4e60-9089-0e00c094843f" containerName="ceilometer-central-agent" Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.858772 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3fb738d-ab2f-4e60-9089-0e00c094843f" containerName="proxy-httpd" Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.858785 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3fb738d-ab2f-4e60-9089-0e00c094843f" containerName="ceilometer-central-agent" Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.858802 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3fb738d-ab2f-4e60-9089-0e00c094843f" containerName="ceilometer-notification-agent" Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.858810 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3fb738d-ab2f-4e60-9089-0e00c094843f" containerName="sg-core" Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.862029 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.863982 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.864227 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.864539 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 08:25:11 crc kubenswrapper[4825]: I0310 08:25:11.886076 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.054223 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/55816f33-bc34-4d82-b162-e5503b542d33-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"55816f33-bc34-4d82-b162-e5503b542d33\") " pod="openstack/ceilometer-0" Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.054277 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/55816f33-bc34-4d82-b162-e5503b542d33-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"55816f33-bc34-4d82-b162-e5503b542d33\") " pod="openstack/ceilometer-0" Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.054317 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55816f33-bc34-4d82-b162-e5503b542d33-run-httpd\") pod \"ceilometer-0\" (UID: \"55816f33-bc34-4d82-b162-e5503b542d33\") " pod="openstack/ceilometer-0" Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.054578 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/55816f33-bc34-4d82-b162-e5503b542d33-scripts\") pod \"ceilometer-0\" (UID: \"55816f33-bc34-4d82-b162-e5503b542d33\") " pod="openstack/ceilometer-0" Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.054770 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55816f33-bc34-4d82-b162-e5503b542d33-config-data\") pod \"ceilometer-0\" (UID: \"55816f33-bc34-4d82-b162-e5503b542d33\") " pod="openstack/ceilometer-0" Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.054891 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55816f33-bc34-4d82-b162-e5503b542d33-log-httpd\") pod \"ceilometer-0\" (UID: \"55816f33-bc34-4d82-b162-e5503b542d33\") " pod="openstack/ceilometer-0" Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.055063 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55816f33-bc34-4d82-b162-e5503b542d33-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"55816f33-bc34-4d82-b162-e5503b542d33\") " pod="openstack/ceilometer-0" Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.055244 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5v7k\" (UniqueName: \"kubernetes.io/projected/55816f33-bc34-4d82-b162-e5503b542d33-kube-api-access-q5v7k\") pod \"ceilometer-0\" (UID: \"55816f33-bc34-4d82-b162-e5503b542d33\") " pod="openstack/ceilometer-0" Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.127358 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 08:25:12 crc kubenswrapper[4825]: E0310 08:25:12.128273 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted 
volumes=[ceilometer-tls-certs combined-ca-bundle config-data kube-api-access-q5v7k log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="55816f33-bc34-4d82-b162-e5503b542d33" Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.157479 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/55816f33-bc34-4d82-b162-e5503b542d33-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"55816f33-bc34-4d82-b162-e5503b542d33\") " pod="openstack/ceilometer-0" Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.157558 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55816f33-bc34-4d82-b162-e5503b542d33-run-httpd\") pod \"ceilometer-0\" (UID: \"55816f33-bc34-4d82-b162-e5503b542d33\") " pod="openstack/ceilometer-0" Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.157619 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55816f33-bc34-4d82-b162-e5503b542d33-scripts\") pod \"ceilometer-0\" (UID: \"55816f33-bc34-4d82-b162-e5503b542d33\") " pod="openstack/ceilometer-0" Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.157669 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55816f33-bc34-4d82-b162-e5503b542d33-config-data\") pod \"ceilometer-0\" (UID: \"55816f33-bc34-4d82-b162-e5503b542d33\") " pod="openstack/ceilometer-0" Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.157704 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55816f33-bc34-4d82-b162-e5503b542d33-log-httpd\") pod \"ceilometer-0\" (UID: \"55816f33-bc34-4d82-b162-e5503b542d33\") " 
pod="openstack/ceilometer-0" Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.157764 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55816f33-bc34-4d82-b162-e5503b542d33-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"55816f33-bc34-4d82-b162-e5503b542d33\") " pod="openstack/ceilometer-0" Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.157822 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5v7k\" (UniqueName: \"kubernetes.io/projected/55816f33-bc34-4d82-b162-e5503b542d33-kube-api-access-q5v7k\") pod \"ceilometer-0\" (UID: \"55816f33-bc34-4d82-b162-e5503b542d33\") " pod="openstack/ceilometer-0" Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.157890 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/55816f33-bc34-4d82-b162-e5503b542d33-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"55816f33-bc34-4d82-b162-e5503b542d33\") " pod="openstack/ceilometer-0" Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.158290 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55816f33-bc34-4d82-b162-e5503b542d33-run-httpd\") pod \"ceilometer-0\" (UID: \"55816f33-bc34-4d82-b162-e5503b542d33\") " pod="openstack/ceilometer-0" Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.158349 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55816f33-bc34-4d82-b162-e5503b542d33-log-httpd\") pod \"ceilometer-0\" (UID: \"55816f33-bc34-4d82-b162-e5503b542d33\") " pod="openstack/ceilometer-0" Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.164814 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/55816f33-bc34-4d82-b162-e5503b542d33-scripts\") pod \"ceilometer-0\" (UID: \"55816f33-bc34-4d82-b162-e5503b542d33\") " pod="openstack/ceilometer-0" Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.167726 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55816f33-bc34-4d82-b162-e5503b542d33-config-data\") pod \"ceilometer-0\" (UID: \"55816f33-bc34-4d82-b162-e5503b542d33\") " pod="openstack/ceilometer-0" Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.168516 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/55816f33-bc34-4d82-b162-e5503b542d33-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"55816f33-bc34-4d82-b162-e5503b542d33\") " pod="openstack/ceilometer-0" Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.169851 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55816f33-bc34-4d82-b162-e5503b542d33-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"55816f33-bc34-4d82-b162-e5503b542d33\") " pod="openstack/ceilometer-0" Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.170255 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/55816f33-bc34-4d82-b162-e5503b542d33-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"55816f33-bc34-4d82-b162-e5503b542d33\") " pod="openstack/ceilometer-0" Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.176047 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5v7k\" (UniqueName: \"kubernetes.io/projected/55816f33-bc34-4d82-b162-e5503b542d33-kube-api-access-q5v7k\") pod \"ceilometer-0\" (UID: \"55816f33-bc34-4d82-b162-e5503b542d33\") " pod="openstack/ceilometer-0" Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 
08:25:12.519946 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.540207 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.668489 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55816f33-bc34-4d82-b162-e5503b542d33-log-httpd\") pod \"55816f33-bc34-4d82-b162-e5503b542d33\" (UID: \"55816f33-bc34-4d82-b162-e5503b542d33\") " Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.668572 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55816f33-bc34-4d82-b162-e5503b542d33-scripts\") pod \"55816f33-bc34-4d82-b162-e5503b542d33\" (UID: \"55816f33-bc34-4d82-b162-e5503b542d33\") " Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.668693 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/55816f33-bc34-4d82-b162-e5503b542d33-sg-core-conf-yaml\") pod \"55816f33-bc34-4d82-b162-e5503b542d33\" (UID: \"55816f33-bc34-4d82-b162-e5503b542d33\") " Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.668740 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55816f33-bc34-4d82-b162-e5503b542d33-combined-ca-bundle\") pod \"55816f33-bc34-4d82-b162-e5503b542d33\" (UID: \"55816f33-bc34-4d82-b162-e5503b542d33\") " Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.668873 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55816f33-bc34-4d82-b162-e5503b542d33-run-httpd\") pod \"55816f33-bc34-4d82-b162-e5503b542d33\" (UID: 
\"55816f33-bc34-4d82-b162-e5503b542d33\") " Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.668905 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5v7k\" (UniqueName: \"kubernetes.io/projected/55816f33-bc34-4d82-b162-e5503b542d33-kube-api-access-q5v7k\") pod \"55816f33-bc34-4d82-b162-e5503b542d33\" (UID: \"55816f33-bc34-4d82-b162-e5503b542d33\") " Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.668952 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55816f33-bc34-4d82-b162-e5503b542d33-config-data\") pod \"55816f33-bc34-4d82-b162-e5503b542d33\" (UID: \"55816f33-bc34-4d82-b162-e5503b542d33\") " Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.668984 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/55816f33-bc34-4d82-b162-e5503b542d33-ceilometer-tls-certs\") pod \"55816f33-bc34-4d82-b162-e5503b542d33\" (UID: \"55816f33-bc34-4d82-b162-e5503b542d33\") " Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.671517 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55816f33-bc34-4d82-b162-e5503b542d33-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "55816f33-bc34-4d82-b162-e5503b542d33" (UID: "55816f33-bc34-4d82-b162-e5503b542d33"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.677385 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55816f33-bc34-4d82-b162-e5503b542d33-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "55816f33-bc34-4d82-b162-e5503b542d33" (UID: "55816f33-bc34-4d82-b162-e5503b542d33"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.679339 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55816f33-bc34-4d82-b162-e5503b542d33-config-data" (OuterVolumeSpecName: "config-data") pod "55816f33-bc34-4d82-b162-e5503b542d33" (UID: "55816f33-bc34-4d82-b162-e5503b542d33"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.679890 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55816f33-bc34-4d82-b162-e5503b542d33-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "55816f33-bc34-4d82-b162-e5503b542d33" (UID: "55816f33-bc34-4d82-b162-e5503b542d33"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.681419 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55816f33-bc34-4d82-b162-e5503b542d33-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "55816f33-bc34-4d82-b162-e5503b542d33" (UID: "55816f33-bc34-4d82-b162-e5503b542d33"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.682868 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55816f33-bc34-4d82-b162-e5503b542d33-scripts" (OuterVolumeSpecName: "scripts") pod "55816f33-bc34-4d82-b162-e5503b542d33" (UID: "55816f33-bc34-4d82-b162-e5503b542d33"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.683012 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55816f33-bc34-4d82-b162-e5503b542d33-kube-api-access-q5v7k" (OuterVolumeSpecName: "kube-api-access-q5v7k") pod "55816f33-bc34-4d82-b162-e5503b542d33" (UID: "55816f33-bc34-4d82-b162-e5503b542d33"). InnerVolumeSpecName "kube-api-access-q5v7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.683999 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55816f33-bc34-4d82-b162-e5503b542d33-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "55816f33-bc34-4d82-b162-e5503b542d33" (UID: "55816f33-bc34-4d82-b162-e5503b542d33"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.772649 4825 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/55816f33-bc34-4d82-b162-e5503b542d33-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.772687 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55816f33-bc34-4d82-b162-e5503b542d33-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.772701 4825 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55816f33-bc34-4d82-b162-e5503b542d33-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.772716 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5v7k\" (UniqueName: 
\"kubernetes.io/projected/55816f33-bc34-4d82-b162-e5503b542d33-kube-api-access-q5v7k\") on node \"crc\" DevicePath \"\"" Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.772731 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55816f33-bc34-4d82-b162-e5503b542d33-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.772742 4825 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/55816f33-bc34-4d82-b162-e5503b542d33-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.772753 4825 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55816f33-bc34-4d82-b162-e5503b542d33-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 08:25:12 crc kubenswrapper[4825]: I0310 08:25:12.772763 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55816f33-bc34-4d82-b162-e5503b542d33-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 08:25:13 crc kubenswrapper[4825]: I0310 08:25:13.250771 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3fb738d-ab2f-4e60-9089-0e00c094843f" path="/var/lib/kubelet/pods/e3fb738d-ab2f-4e60-9089-0e00c094843f/volumes" Mar 10 08:25:13 crc kubenswrapper[4825]: I0310 08:25:13.491525 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 10 08:25:13 crc kubenswrapper[4825]: I0310 08:25:13.545558 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 08:25:13 crc kubenswrapper[4825]: I0310 08:25:13.546546 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"258f3503-574c-4911-a806-832d9ee659d5","Type":"ContainerStarted","Data":"e5cb52ad5df695461d8439c7be4a75a9cf18a77259d8f465603682554441252e"} Mar 10 08:25:13 crc kubenswrapper[4825]: I0310 08:25:13.609248 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 08:25:13 crc kubenswrapper[4825]: I0310 08:25:13.628170 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 08:25:13 crc kubenswrapper[4825]: I0310 08:25:13.663706 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 08:25:13 crc kubenswrapper[4825]: I0310 08:25:13.666295 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 08:25:13 crc kubenswrapper[4825]: I0310 08:25:13.674810 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 10 08:25:13 crc kubenswrapper[4825]: I0310 08:25:13.675070 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 08:25:13 crc kubenswrapper[4825]: I0310 08:25:13.675243 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 08:25:13 crc kubenswrapper[4825]: I0310 08:25:13.680154 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 08:25:13 crc kubenswrapper[4825]: I0310 08:25:13.797175 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slfgd\" (UniqueName: \"kubernetes.io/projected/1ca43042-5830-4c74-9412-9964fdc9ea93-kube-api-access-slfgd\") pod \"ceilometer-0\" (UID: \"1ca43042-5830-4c74-9412-9964fdc9ea93\") " pod="openstack/ceilometer-0" 
Mar 10 08:25:13 crc kubenswrapper[4825]: I0310 08:25:13.797238 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ca43042-5830-4c74-9412-9964fdc9ea93-scripts\") pod \"ceilometer-0\" (UID: \"1ca43042-5830-4c74-9412-9964fdc9ea93\") " pod="openstack/ceilometer-0" Mar 10 08:25:13 crc kubenswrapper[4825]: I0310 08:25:13.797298 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ca43042-5830-4c74-9412-9964fdc9ea93-log-httpd\") pod \"ceilometer-0\" (UID: \"1ca43042-5830-4c74-9412-9964fdc9ea93\") " pod="openstack/ceilometer-0" Mar 10 08:25:13 crc kubenswrapper[4825]: I0310 08:25:13.797348 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ca43042-5830-4c74-9412-9964fdc9ea93-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1ca43042-5830-4c74-9412-9964fdc9ea93\") " pod="openstack/ceilometer-0" Mar 10 08:25:13 crc kubenswrapper[4825]: I0310 08:25:13.797370 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ca43042-5830-4c74-9412-9964fdc9ea93-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1ca43042-5830-4c74-9412-9964fdc9ea93\") " pod="openstack/ceilometer-0" Mar 10 08:25:13 crc kubenswrapper[4825]: I0310 08:25:13.797401 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ca43042-5830-4c74-9412-9964fdc9ea93-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1ca43042-5830-4c74-9412-9964fdc9ea93\") " pod="openstack/ceilometer-0" Mar 10 08:25:13 crc kubenswrapper[4825]: I0310 08:25:13.797428 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ca43042-5830-4c74-9412-9964fdc9ea93-run-httpd\") pod \"ceilometer-0\" (UID: \"1ca43042-5830-4c74-9412-9964fdc9ea93\") " pod="openstack/ceilometer-0" Mar 10 08:25:13 crc kubenswrapper[4825]: I0310 08:25:13.797452 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ca43042-5830-4c74-9412-9964fdc9ea93-config-data\") pod \"ceilometer-0\" (UID: \"1ca43042-5830-4c74-9412-9964fdc9ea93\") " pod="openstack/ceilometer-0" Mar 10 08:25:13 crc kubenswrapper[4825]: I0310 08:25:13.903641 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ca43042-5830-4c74-9412-9964fdc9ea93-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1ca43042-5830-4c74-9412-9964fdc9ea93\") " pod="openstack/ceilometer-0" Mar 10 08:25:13 crc kubenswrapper[4825]: I0310 08:25:13.903681 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ca43042-5830-4c74-9412-9964fdc9ea93-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1ca43042-5830-4c74-9412-9964fdc9ea93\") " pod="openstack/ceilometer-0" Mar 10 08:25:13 crc kubenswrapper[4825]: I0310 08:25:13.903717 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ca43042-5830-4c74-9412-9964fdc9ea93-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1ca43042-5830-4c74-9412-9964fdc9ea93\") " pod="openstack/ceilometer-0" Mar 10 08:25:13 crc kubenswrapper[4825]: I0310 08:25:13.903746 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ca43042-5830-4c74-9412-9964fdc9ea93-run-httpd\") pod \"ceilometer-0\" 
(UID: \"1ca43042-5830-4c74-9412-9964fdc9ea93\") " pod="openstack/ceilometer-0" Mar 10 08:25:13 crc kubenswrapper[4825]: I0310 08:25:13.903770 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ca43042-5830-4c74-9412-9964fdc9ea93-config-data\") pod \"ceilometer-0\" (UID: \"1ca43042-5830-4c74-9412-9964fdc9ea93\") " pod="openstack/ceilometer-0" Mar 10 08:25:13 crc kubenswrapper[4825]: I0310 08:25:13.903845 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slfgd\" (UniqueName: \"kubernetes.io/projected/1ca43042-5830-4c74-9412-9964fdc9ea93-kube-api-access-slfgd\") pod \"ceilometer-0\" (UID: \"1ca43042-5830-4c74-9412-9964fdc9ea93\") " pod="openstack/ceilometer-0" Mar 10 08:25:13 crc kubenswrapper[4825]: I0310 08:25:13.903886 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ca43042-5830-4c74-9412-9964fdc9ea93-scripts\") pod \"ceilometer-0\" (UID: \"1ca43042-5830-4c74-9412-9964fdc9ea93\") " pod="openstack/ceilometer-0" Mar 10 08:25:13 crc kubenswrapper[4825]: I0310 08:25:13.903913 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ca43042-5830-4c74-9412-9964fdc9ea93-log-httpd\") pod \"ceilometer-0\" (UID: \"1ca43042-5830-4c74-9412-9964fdc9ea93\") " pod="openstack/ceilometer-0" Mar 10 08:25:13 crc kubenswrapper[4825]: I0310 08:25:13.909719 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ca43042-5830-4c74-9412-9964fdc9ea93-log-httpd\") pod \"ceilometer-0\" (UID: \"1ca43042-5830-4c74-9412-9964fdc9ea93\") " pod="openstack/ceilometer-0" Mar 10 08:25:13 crc kubenswrapper[4825]: I0310 08:25:13.912783 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/1ca43042-5830-4c74-9412-9964fdc9ea93-run-httpd\") pod \"ceilometer-0\" (UID: \"1ca43042-5830-4c74-9412-9964fdc9ea93\") " pod="openstack/ceilometer-0" Mar 10 08:25:13 crc kubenswrapper[4825]: I0310 08:25:13.918858 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ca43042-5830-4c74-9412-9964fdc9ea93-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1ca43042-5830-4c74-9412-9964fdc9ea93\") " pod="openstack/ceilometer-0" Mar 10 08:25:13 crc kubenswrapper[4825]: I0310 08:25:13.921482 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ca43042-5830-4c74-9412-9964fdc9ea93-scripts\") pod \"ceilometer-0\" (UID: \"1ca43042-5830-4c74-9412-9964fdc9ea93\") " pod="openstack/ceilometer-0" Mar 10 08:25:13 crc kubenswrapper[4825]: I0310 08:25:13.921536 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ca43042-5830-4c74-9412-9964fdc9ea93-config-data\") pod \"ceilometer-0\" (UID: \"1ca43042-5830-4c74-9412-9964fdc9ea93\") " pod="openstack/ceilometer-0" Mar 10 08:25:13 crc kubenswrapper[4825]: I0310 08:25:13.931905 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ca43042-5830-4c74-9412-9964fdc9ea93-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1ca43042-5830-4c74-9412-9964fdc9ea93\") " pod="openstack/ceilometer-0" Mar 10 08:25:13 crc kubenswrapper[4825]: I0310 08:25:13.932331 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ca43042-5830-4c74-9412-9964fdc9ea93-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1ca43042-5830-4c74-9412-9964fdc9ea93\") " pod="openstack/ceilometer-0" Mar 10 08:25:13 crc kubenswrapper[4825]: I0310 08:25:13.937853 4825 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slfgd\" (UniqueName: \"kubernetes.io/projected/1ca43042-5830-4c74-9412-9964fdc9ea93-kube-api-access-slfgd\") pod \"ceilometer-0\" (UID: \"1ca43042-5830-4c74-9412-9964fdc9ea93\") " pod="openstack/ceilometer-0" Mar 10 08:25:14 crc kubenswrapper[4825]: I0310 08:25:14.002670 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 08:25:14 crc kubenswrapper[4825]: I0310 08:25:14.974662 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 08:25:15 crc kubenswrapper[4825]: I0310 08:25:15.253715 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55816f33-bc34-4d82-b162-e5503b542d33" path="/var/lib/kubelet/pods/55816f33-bc34-4d82-b162-e5503b542d33/volumes" Mar 10 08:25:15 crc kubenswrapper[4825]: I0310 08:25:15.605978 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ca43042-5830-4c74-9412-9964fdc9ea93","Type":"ContainerStarted","Data":"a97a34dae3765d608598733050a6d835e62fac6f99be1d7d21cb10a603a718b4"} Mar 10 08:25:15 crc kubenswrapper[4825]: I0310 08:25:15.610153 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"258f3503-574c-4911-a806-832d9ee659d5","Type":"ContainerStarted","Data":"600a775a7de0ec710aa71c98d27e8e893d63c0000768c7b317271150fabdcbc8"} Mar 10 08:25:16 crc kubenswrapper[4825]: I0310 08:25:16.626098 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ca43042-5830-4c74-9412-9964fdc9ea93","Type":"ContainerStarted","Data":"cf2ad0dbd8c5374800f1c0819666b1102fb3ab48a216ff03f1d348f6feb9fe23"} Mar 10 08:25:16 crc kubenswrapper[4825]: I0310 08:25:16.887861 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 08:25:16 crc kubenswrapper[4825]: I0310 08:25:16.887918 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 08:25:17 crc kubenswrapper[4825]: I0310 08:25:17.113198 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 08:25:17 crc kubenswrapper[4825]: I0310 08:25:17.636085 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ca43042-5830-4c74-9412-9964fdc9ea93","Type":"ContainerStarted","Data":"1804e585a210046980d35169283e8a2ff36832180dbfd287936c490ce70891ba"} Mar 10 08:25:17 crc kubenswrapper[4825]: I0310 08:25:17.639380 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"258f3503-574c-4911-a806-832d9ee659d5","Type":"ContainerStarted","Data":"a8e474bd7cc82c3d7c90eacbfdf8f2e82018cf2e44d98298ee08cd745db5abc9"} Mar 10 08:25:17 crc kubenswrapper[4825]: I0310 08:25:17.639563 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="258f3503-574c-4911-a806-832d9ee659d5" containerName="aodh-api" containerID="cri-o://183d219b90afdf6d8d19644ab103a96a3d59fbb32ee78b40f888b4857427cbfa" gracePeriod=30 Mar 10 08:25:17 crc kubenswrapper[4825]: I0310 08:25:17.639644 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="258f3503-574c-4911-a806-832d9ee659d5" containerName="aodh-notifier" containerID="cri-o://600a775a7de0ec710aa71c98d27e8e893d63c0000768c7b317271150fabdcbc8" gracePeriod=30 Mar 10 08:25:17 crc kubenswrapper[4825]: I0310 08:25:17.639684 4825 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="258f3503-574c-4911-a806-832d9ee659d5" containerName="aodh-evaluator" containerID="cri-o://e5cb52ad5df695461d8439c7be4a75a9cf18a77259d8f465603682554441252e" gracePeriod=30 Mar 10 08:25:17 crc kubenswrapper[4825]: I0310 08:25:17.639654 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="258f3503-574c-4911-a806-832d9ee659d5" containerName="aodh-listener" containerID="cri-o://a8e474bd7cc82c3d7c90eacbfdf8f2e82018cf2e44d98298ee08cd745db5abc9" gracePeriod=30 Mar 10 08:25:17 crc kubenswrapper[4825]: I0310 08:25:17.668618 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.838910564 podStartE2EDuration="8.668597975s" podCreationTimestamp="2026-03-10 08:25:09 +0000 UTC" firstStartedPulling="2026-03-10 08:25:10.688328996 +0000 UTC m=+6063.718109621" lastFinishedPulling="2026-03-10 08:25:16.518016417 +0000 UTC m=+6069.547797032" observedRunningTime="2026-03-10 08:25:17.665719059 +0000 UTC m=+6070.695499674" watchObservedRunningTime="2026-03-10 08:25:17.668597975 +0000 UTC m=+6070.698378590" Mar 10 08:25:18 crc kubenswrapper[4825]: I0310 08:25:18.652471 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ca43042-5830-4c74-9412-9964fdc9ea93","Type":"ContainerStarted","Data":"2a1122cf36c8c1d6ce93a3dedbe1cdcf2376aba292a99bb71479120dd6087db1"} Mar 10 08:25:18 crc kubenswrapper[4825]: I0310 08:25:18.655872 4825 generic.go:334] "Generic (PLEG): container finished" podID="258f3503-574c-4911-a806-832d9ee659d5" containerID="600a775a7de0ec710aa71c98d27e8e893d63c0000768c7b317271150fabdcbc8" exitCode=0 Mar 10 08:25:18 crc kubenswrapper[4825]: I0310 08:25:18.655917 4825 generic.go:334] "Generic (PLEG): container finished" podID="258f3503-574c-4911-a806-832d9ee659d5" 
containerID="e5cb52ad5df695461d8439c7be4a75a9cf18a77259d8f465603682554441252e" exitCode=0 Mar 10 08:25:18 crc kubenswrapper[4825]: I0310 08:25:18.655967 4825 generic.go:334] "Generic (PLEG): container finished" podID="258f3503-574c-4911-a806-832d9ee659d5" containerID="183d219b90afdf6d8d19644ab103a96a3d59fbb32ee78b40f888b4857427cbfa" exitCode=0 Mar 10 08:25:18 crc kubenswrapper[4825]: I0310 08:25:18.655931 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"258f3503-574c-4911-a806-832d9ee659d5","Type":"ContainerDied","Data":"600a775a7de0ec710aa71c98d27e8e893d63c0000768c7b317271150fabdcbc8"} Mar 10 08:25:18 crc kubenswrapper[4825]: I0310 08:25:18.656050 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"258f3503-574c-4911-a806-832d9ee659d5","Type":"ContainerDied","Data":"e5cb52ad5df695461d8439c7be4a75a9cf18a77259d8f465603682554441252e"} Mar 10 08:25:18 crc kubenswrapper[4825]: I0310 08:25:18.656072 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"258f3503-574c-4911-a806-832d9ee659d5","Type":"ContainerDied","Data":"183d219b90afdf6d8d19644ab103a96a3d59fbb32ee78b40f888b4857427cbfa"} Mar 10 08:25:18 crc kubenswrapper[4825]: I0310 08:25:18.873816 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 10 08:25:20 crc kubenswrapper[4825]: I0310 08:25:20.674730 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ca43042-5830-4c74-9412-9964fdc9ea93","Type":"ContainerStarted","Data":"3a41812a6d5d301948770c309281e294664a6b62b20bab18441060fbb67c1787"} Mar 10 08:25:20 crc kubenswrapper[4825]: I0310 08:25:20.675123 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1ca43042-5830-4c74-9412-9964fdc9ea93" containerName="ceilometer-central-agent" 
containerID="cri-o://cf2ad0dbd8c5374800f1c0819666b1102fb3ab48a216ff03f1d348f6feb9fe23" gracePeriod=30 Mar 10 08:25:20 crc kubenswrapper[4825]: I0310 08:25:20.675321 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 08:25:20 crc kubenswrapper[4825]: I0310 08:25:20.675426 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1ca43042-5830-4c74-9412-9964fdc9ea93" containerName="proxy-httpd" containerID="cri-o://3a41812a6d5d301948770c309281e294664a6b62b20bab18441060fbb67c1787" gracePeriod=30 Mar 10 08:25:20 crc kubenswrapper[4825]: I0310 08:25:20.675579 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1ca43042-5830-4c74-9412-9964fdc9ea93" containerName="sg-core" containerID="cri-o://2a1122cf36c8c1d6ce93a3dedbe1cdcf2376aba292a99bb71479120dd6087db1" gracePeriod=30 Mar 10 08:25:20 crc kubenswrapper[4825]: I0310 08:25:20.675667 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1ca43042-5830-4c74-9412-9964fdc9ea93" containerName="ceilometer-notification-agent" containerID="cri-o://1804e585a210046980d35169283e8a2ff36832180dbfd287936c490ce70891ba" gracePeriod=30 Mar 10 08:25:20 crc kubenswrapper[4825]: I0310 08:25:20.714996 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.376936084 podStartE2EDuration="7.714973253s" podCreationTimestamp="2026-03-10 08:25:13 +0000 UTC" firstStartedPulling="2026-03-10 08:25:15.003017173 +0000 UTC m=+6068.032797788" lastFinishedPulling="2026-03-10 08:25:19.341054342 +0000 UTC m=+6072.370834957" observedRunningTime="2026-03-10 08:25:20.706715377 +0000 UTC m=+6073.736496022" watchObservedRunningTime="2026-03-10 08:25:20.714973253 +0000 UTC m=+6073.744753868" Mar 10 08:25:21 crc kubenswrapper[4825]: I0310 08:25:21.689442 4825 generic.go:334] 
"Generic (PLEG): container finished" podID="1ca43042-5830-4c74-9412-9964fdc9ea93" containerID="3a41812a6d5d301948770c309281e294664a6b62b20bab18441060fbb67c1787" exitCode=0 Mar 10 08:25:21 crc kubenswrapper[4825]: I0310 08:25:21.689885 4825 generic.go:334] "Generic (PLEG): container finished" podID="1ca43042-5830-4c74-9412-9964fdc9ea93" containerID="2a1122cf36c8c1d6ce93a3dedbe1cdcf2376aba292a99bb71479120dd6087db1" exitCode=2 Mar 10 08:25:21 crc kubenswrapper[4825]: I0310 08:25:21.689918 4825 generic.go:334] "Generic (PLEG): container finished" podID="1ca43042-5830-4c74-9412-9964fdc9ea93" containerID="1804e585a210046980d35169283e8a2ff36832180dbfd287936c490ce70891ba" exitCode=0 Mar 10 08:25:21 crc kubenswrapper[4825]: I0310 08:25:21.689517 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ca43042-5830-4c74-9412-9964fdc9ea93","Type":"ContainerDied","Data":"3a41812a6d5d301948770c309281e294664a6b62b20bab18441060fbb67c1787"} Mar 10 08:25:21 crc kubenswrapper[4825]: I0310 08:25:21.689982 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ca43042-5830-4c74-9412-9964fdc9ea93","Type":"ContainerDied","Data":"2a1122cf36c8c1d6ce93a3dedbe1cdcf2376aba292a99bb71479120dd6087db1"} Mar 10 08:25:21 crc kubenswrapper[4825]: I0310 08:25:21.689995 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ca43042-5830-4c74-9412-9964fdc9ea93","Type":"ContainerDied","Data":"1804e585a210046980d35169283e8a2ff36832180dbfd287936c490ce70891ba"} Mar 10 08:25:22 crc kubenswrapper[4825]: I0310 08:25:22.705992 4825 generic.go:334] "Generic (PLEG): container finished" podID="1ca43042-5830-4c74-9412-9964fdc9ea93" containerID="cf2ad0dbd8c5374800f1c0819666b1102fb3ab48a216ff03f1d348f6feb9fe23" exitCode=0 Mar 10 08:25:22 crc kubenswrapper[4825]: I0310 08:25:22.706185 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"1ca43042-5830-4c74-9412-9964fdc9ea93","Type":"ContainerDied","Data":"cf2ad0dbd8c5374800f1c0819666b1102fb3ab48a216ff03f1d348f6feb9fe23"} Mar 10 08:25:22 crc kubenswrapper[4825]: I0310 08:25:22.975186 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.127507 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ca43042-5830-4c74-9412-9964fdc9ea93-combined-ca-bundle\") pod \"1ca43042-5830-4c74-9412-9964fdc9ea93\" (UID: \"1ca43042-5830-4c74-9412-9964fdc9ea93\") " Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.127586 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ca43042-5830-4c74-9412-9964fdc9ea93-run-httpd\") pod \"1ca43042-5830-4c74-9412-9964fdc9ea93\" (UID: \"1ca43042-5830-4c74-9412-9964fdc9ea93\") " Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.127652 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ca43042-5830-4c74-9412-9964fdc9ea93-log-httpd\") pod \"1ca43042-5830-4c74-9412-9964fdc9ea93\" (UID: \"1ca43042-5830-4c74-9412-9964fdc9ea93\") " Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.127670 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slfgd\" (UniqueName: \"kubernetes.io/projected/1ca43042-5830-4c74-9412-9964fdc9ea93-kube-api-access-slfgd\") pod \"1ca43042-5830-4c74-9412-9964fdc9ea93\" (UID: \"1ca43042-5830-4c74-9412-9964fdc9ea93\") " Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.127751 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1ca43042-5830-4c74-9412-9964fdc9ea93-ceilometer-tls-certs\") pod \"1ca43042-5830-4c74-9412-9964fdc9ea93\" (UID: \"1ca43042-5830-4c74-9412-9964fdc9ea93\") " Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.127784 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ca43042-5830-4c74-9412-9964fdc9ea93-scripts\") pod \"1ca43042-5830-4c74-9412-9964fdc9ea93\" (UID: \"1ca43042-5830-4c74-9412-9964fdc9ea93\") " Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.127849 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ca43042-5830-4c74-9412-9964fdc9ea93-config-data\") pod \"1ca43042-5830-4c74-9412-9964fdc9ea93\" (UID: \"1ca43042-5830-4c74-9412-9964fdc9ea93\") " Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.127864 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ca43042-5830-4c74-9412-9964fdc9ea93-sg-core-conf-yaml\") pod \"1ca43042-5830-4c74-9412-9964fdc9ea93\" (UID: \"1ca43042-5830-4c74-9412-9964fdc9ea93\") " Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.129204 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ca43042-5830-4c74-9412-9964fdc9ea93-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1ca43042-5830-4c74-9412-9964fdc9ea93" (UID: "1ca43042-5830-4c74-9412-9964fdc9ea93"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.129503 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ca43042-5830-4c74-9412-9964fdc9ea93-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1ca43042-5830-4c74-9412-9964fdc9ea93" (UID: "1ca43042-5830-4c74-9412-9964fdc9ea93"). 
InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.133445 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ca43042-5830-4c74-9412-9964fdc9ea93-scripts" (OuterVolumeSpecName: "scripts") pod "1ca43042-5830-4c74-9412-9964fdc9ea93" (UID: "1ca43042-5830-4c74-9412-9964fdc9ea93"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.134612 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ca43042-5830-4c74-9412-9964fdc9ea93-kube-api-access-slfgd" (OuterVolumeSpecName: "kube-api-access-slfgd") pod "1ca43042-5830-4c74-9412-9964fdc9ea93" (UID: "1ca43042-5830-4c74-9412-9964fdc9ea93"). InnerVolumeSpecName "kube-api-access-slfgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.168420 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ca43042-5830-4c74-9412-9964fdc9ea93-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1ca43042-5830-4c74-9412-9964fdc9ea93" (UID: "1ca43042-5830-4c74-9412-9964fdc9ea93"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.203895 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ca43042-5830-4c74-9412-9964fdc9ea93-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "1ca43042-5830-4c74-9412-9964fdc9ea93" (UID: "1ca43042-5830-4c74-9412-9964fdc9ea93"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.217516 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ca43042-5830-4c74-9412-9964fdc9ea93-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ca43042-5830-4c74-9412-9964fdc9ea93" (UID: "1ca43042-5830-4c74-9412-9964fdc9ea93"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.230232 4825 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ca43042-5830-4c74-9412-9964fdc9ea93-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.230269 4825 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ca43042-5830-4c74-9412-9964fdc9ea93-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.230281 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slfgd\" (UniqueName: \"kubernetes.io/projected/1ca43042-5830-4c74-9412-9964fdc9ea93-kube-api-access-slfgd\") on node \"crc\" DevicePath \"\"" Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.230291 4825 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ca43042-5830-4c74-9412-9964fdc9ea93-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.230299 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ca43042-5830-4c74-9412-9964fdc9ea93-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.230307 4825 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/1ca43042-5830-4c74-9412-9964fdc9ea93-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.230342 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ca43042-5830-4c74-9412-9964fdc9ea93-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.230506 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ca43042-5830-4c74-9412-9964fdc9ea93-config-data" (OuterVolumeSpecName: "config-data") pod "1ca43042-5830-4c74-9412-9964fdc9ea93" (UID: "1ca43042-5830-4c74-9412-9964fdc9ea93"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.334333 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ca43042-5830-4c74-9412-9964fdc9ea93-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.721422 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ca43042-5830-4c74-9412-9964fdc9ea93","Type":"ContainerDied","Data":"a97a34dae3765d608598733050a6d835e62fac6f99be1d7d21cb10a603a718b4"} Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.721493 4825 scope.go:117] "RemoveContainer" containerID="3a41812a6d5d301948770c309281e294664a6b62b20bab18441060fbb67c1787" Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.721530 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.753393 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.755185 4825 scope.go:117] "RemoveContainer" containerID="2a1122cf36c8c1d6ce93a3dedbe1cdcf2376aba292a99bb71479120dd6087db1" Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.778250 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.782301 4825 scope.go:117] "RemoveContainer" containerID="1804e585a210046980d35169283e8a2ff36832180dbfd287936c490ce70891ba" Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.797148 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 08:25:23 crc kubenswrapper[4825]: E0310 08:25:23.797672 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ca43042-5830-4c74-9412-9964fdc9ea93" containerName="sg-core" Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.797690 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ca43042-5830-4c74-9412-9964fdc9ea93" containerName="sg-core" Mar 10 08:25:23 crc kubenswrapper[4825]: E0310 08:25:23.797732 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ca43042-5830-4c74-9412-9964fdc9ea93" containerName="proxy-httpd" Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.797740 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ca43042-5830-4c74-9412-9964fdc9ea93" containerName="proxy-httpd" Mar 10 08:25:23 crc kubenswrapper[4825]: E0310 08:25:23.797749 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ca43042-5830-4c74-9412-9964fdc9ea93" containerName="ceilometer-central-agent" Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.797755 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ca43042-5830-4c74-9412-9964fdc9ea93" 
containerName="ceilometer-central-agent" Mar 10 08:25:23 crc kubenswrapper[4825]: E0310 08:25:23.797769 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ca43042-5830-4c74-9412-9964fdc9ea93" containerName="ceilometer-notification-agent" Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.797775 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ca43042-5830-4c74-9412-9964fdc9ea93" containerName="ceilometer-notification-agent" Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.797936 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ca43042-5830-4c74-9412-9964fdc9ea93" containerName="ceilometer-notification-agent" Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.797948 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ca43042-5830-4c74-9412-9964fdc9ea93" containerName="proxy-httpd" Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.797969 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ca43042-5830-4c74-9412-9964fdc9ea93" containerName="sg-core" Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.797982 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ca43042-5830-4c74-9412-9964fdc9ea93" containerName="ceilometer-central-agent" Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.806238 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.811742 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.811869 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.812284 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.816358 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.818502 4825 scope.go:117] "RemoveContainer" containerID="cf2ad0dbd8c5374800f1c0819666b1102fb3ab48a216ff03f1d348f6feb9fe23" Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.948900 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0f712d86-b1c5-44e8-8096-2a9d37b7a792-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0f712d86-b1c5-44e8-8096-2a9d37b7a792\") " pod="openstack/ceilometer-0" Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.948955 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f712d86-b1c5-44e8-8096-2a9d37b7a792-log-httpd\") pod \"ceilometer-0\" (UID: \"0f712d86-b1c5-44e8-8096-2a9d37b7a792\") " pod="openstack/ceilometer-0" Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.948997 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f712d86-b1c5-44e8-8096-2a9d37b7a792-scripts\") pod \"ceilometer-0\" (UID: \"0f712d86-b1c5-44e8-8096-2a9d37b7a792\") " pod="openstack/ceilometer-0" Mar 10 
08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.949200 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f712d86-b1c5-44e8-8096-2a9d37b7a792-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0f712d86-b1c5-44e8-8096-2a9d37b7a792\") " pod="openstack/ceilometer-0" Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.949260 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f712d86-b1c5-44e8-8096-2a9d37b7a792-config-data\") pod \"ceilometer-0\" (UID: \"0f712d86-b1c5-44e8-8096-2a9d37b7a792\") " pod="openstack/ceilometer-0" Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.949366 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f712d86-b1c5-44e8-8096-2a9d37b7a792-run-httpd\") pod \"ceilometer-0\" (UID: \"0f712d86-b1c5-44e8-8096-2a9d37b7a792\") " pod="openstack/ceilometer-0" Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.949428 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf7tr\" (UniqueName: \"kubernetes.io/projected/0f712d86-b1c5-44e8-8096-2a9d37b7a792-kube-api-access-wf7tr\") pod \"ceilometer-0\" (UID: \"0f712d86-b1c5-44e8-8096-2a9d37b7a792\") " pod="openstack/ceilometer-0" Mar 10 08:25:23 crc kubenswrapper[4825]: I0310 08:25:23.949525 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f712d86-b1c5-44e8-8096-2a9d37b7a792-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0f712d86-b1c5-44e8-8096-2a9d37b7a792\") " pod="openstack/ceilometer-0" Mar 10 08:25:24 crc kubenswrapper[4825]: I0310 08:25:24.052242 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0f712d86-b1c5-44e8-8096-2a9d37b7a792-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0f712d86-b1c5-44e8-8096-2a9d37b7a792\") " pod="openstack/ceilometer-0" Mar 10 08:25:24 crc kubenswrapper[4825]: I0310 08:25:24.052309 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f712d86-b1c5-44e8-8096-2a9d37b7a792-log-httpd\") pod \"ceilometer-0\" (UID: \"0f712d86-b1c5-44e8-8096-2a9d37b7a792\") " pod="openstack/ceilometer-0" Mar 10 08:25:24 crc kubenswrapper[4825]: I0310 08:25:24.052351 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f712d86-b1c5-44e8-8096-2a9d37b7a792-scripts\") pod \"ceilometer-0\" (UID: \"0f712d86-b1c5-44e8-8096-2a9d37b7a792\") " pod="openstack/ceilometer-0" Mar 10 08:25:24 crc kubenswrapper[4825]: I0310 08:25:24.052394 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f712d86-b1c5-44e8-8096-2a9d37b7a792-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0f712d86-b1c5-44e8-8096-2a9d37b7a792\") " pod="openstack/ceilometer-0" Mar 10 08:25:24 crc kubenswrapper[4825]: I0310 08:25:24.052413 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f712d86-b1c5-44e8-8096-2a9d37b7a792-config-data\") pod \"ceilometer-0\" (UID: \"0f712d86-b1c5-44e8-8096-2a9d37b7a792\") " pod="openstack/ceilometer-0" Mar 10 08:25:24 crc kubenswrapper[4825]: I0310 08:25:24.052455 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f712d86-b1c5-44e8-8096-2a9d37b7a792-run-httpd\") pod \"ceilometer-0\" (UID: \"0f712d86-b1c5-44e8-8096-2a9d37b7a792\") " 
pod="openstack/ceilometer-0" Mar 10 08:25:24 crc kubenswrapper[4825]: I0310 08:25:24.052482 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wf7tr\" (UniqueName: \"kubernetes.io/projected/0f712d86-b1c5-44e8-8096-2a9d37b7a792-kube-api-access-wf7tr\") pod \"ceilometer-0\" (UID: \"0f712d86-b1c5-44e8-8096-2a9d37b7a792\") " pod="openstack/ceilometer-0" Mar 10 08:25:24 crc kubenswrapper[4825]: I0310 08:25:24.052534 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f712d86-b1c5-44e8-8096-2a9d37b7a792-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0f712d86-b1c5-44e8-8096-2a9d37b7a792\") " pod="openstack/ceilometer-0" Mar 10 08:25:24 crc kubenswrapper[4825]: I0310 08:25:24.053455 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f712d86-b1c5-44e8-8096-2a9d37b7a792-run-httpd\") pod \"ceilometer-0\" (UID: \"0f712d86-b1c5-44e8-8096-2a9d37b7a792\") " pod="openstack/ceilometer-0" Mar 10 08:25:24 crc kubenswrapper[4825]: I0310 08:25:24.053475 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f712d86-b1c5-44e8-8096-2a9d37b7a792-log-httpd\") pod \"ceilometer-0\" (UID: \"0f712d86-b1c5-44e8-8096-2a9d37b7a792\") " pod="openstack/ceilometer-0" Mar 10 08:25:24 crc kubenswrapper[4825]: I0310 08:25:24.057395 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0f712d86-b1c5-44e8-8096-2a9d37b7a792-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0f712d86-b1c5-44e8-8096-2a9d37b7a792\") " pod="openstack/ceilometer-0" Mar 10 08:25:24 crc kubenswrapper[4825]: I0310 08:25:24.058119 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/0f712d86-b1c5-44e8-8096-2a9d37b7a792-scripts\") pod \"ceilometer-0\" (UID: \"0f712d86-b1c5-44e8-8096-2a9d37b7a792\") " pod="openstack/ceilometer-0" Mar 10 08:25:24 crc kubenswrapper[4825]: I0310 08:25:24.058997 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f712d86-b1c5-44e8-8096-2a9d37b7a792-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0f712d86-b1c5-44e8-8096-2a9d37b7a792\") " pod="openstack/ceilometer-0" Mar 10 08:25:24 crc kubenswrapper[4825]: I0310 08:25:24.059515 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f712d86-b1c5-44e8-8096-2a9d37b7a792-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0f712d86-b1c5-44e8-8096-2a9d37b7a792\") " pod="openstack/ceilometer-0" Mar 10 08:25:24 crc kubenswrapper[4825]: I0310 08:25:24.072084 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f712d86-b1c5-44e8-8096-2a9d37b7a792-config-data\") pod \"ceilometer-0\" (UID: \"0f712d86-b1c5-44e8-8096-2a9d37b7a792\") " pod="openstack/ceilometer-0" Mar 10 08:25:24 crc kubenswrapper[4825]: I0310 08:25:24.073269 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf7tr\" (UniqueName: \"kubernetes.io/projected/0f712d86-b1c5-44e8-8096-2a9d37b7a792-kube-api-access-wf7tr\") pod \"ceilometer-0\" (UID: \"0f712d86-b1c5-44e8-8096-2a9d37b7a792\") " pod="openstack/ceilometer-0" Mar 10 08:25:24 crc kubenswrapper[4825]: I0310 08:25:24.129606 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 08:25:24 crc kubenswrapper[4825]: I0310 08:25:24.619479 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 08:25:24 crc kubenswrapper[4825]: W0310 08:25:24.623934 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f712d86_b1c5_44e8_8096_2a9d37b7a792.slice/crio-a10e4c29a6fd358b40a2b5cee2a08894994ec240d0ee450703ddd0daaa259e3d WatchSource:0}: Error finding container a10e4c29a6fd358b40a2b5cee2a08894994ec240d0ee450703ddd0daaa259e3d: Status 404 returned error can't find the container with id a10e4c29a6fd358b40a2b5cee2a08894994ec240d0ee450703ddd0daaa259e3d Mar 10 08:25:24 crc kubenswrapper[4825]: I0310 08:25:24.731750 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f712d86-b1c5-44e8-8096-2a9d37b7a792","Type":"ContainerStarted","Data":"a10e4c29a6fd358b40a2b5cee2a08894994ec240d0ee450703ddd0daaa259e3d"} Mar 10 08:25:25 crc kubenswrapper[4825]: I0310 08:25:25.251979 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ca43042-5830-4c74-9412-9964fdc9ea93" path="/var/lib/kubelet/pods/1ca43042-5830-4c74-9412-9964fdc9ea93/volumes" Mar 10 08:25:25 crc kubenswrapper[4825]: I0310 08:25:25.744264 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f712d86-b1c5-44e8-8096-2a9d37b7a792","Type":"ContainerStarted","Data":"54057ac22ed07b87d5df27d9290624f04177942ec555b654cce5342645d01e74"} Mar 10 08:25:25 crc kubenswrapper[4825]: I0310 08:25:25.744327 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f712d86-b1c5-44e8-8096-2a9d37b7a792","Type":"ContainerStarted","Data":"0fc2f525553ed3a823423151957d55cb82de512e9b8b3a0f3280408c190b728b"} Mar 10 08:25:26 crc kubenswrapper[4825]: I0310 08:25:26.754106 4825 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceilometer-0" event={"ID":"0f712d86-b1c5-44e8-8096-2a9d37b7a792","Type":"ContainerStarted","Data":"48a70710a63cd26b5b7b526de5d1f87327d0547cf40b95bfec9e8d3d104dbad5"} Mar 10 08:25:28 crc kubenswrapper[4825]: I0310 08:25:28.780822 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f712d86-b1c5-44e8-8096-2a9d37b7a792","Type":"ContainerStarted","Data":"2245f76d0509ec131b5ec825b1009bfc267ccb9154c83b5141869431dcc6a3e2"} Mar 10 08:25:28 crc kubenswrapper[4825]: I0310 08:25:28.781578 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 08:25:28 crc kubenswrapper[4825]: I0310 08:25:28.816443 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.64895195 podStartE2EDuration="5.816415467s" podCreationTimestamp="2026-03-10 08:25:23 +0000 UTC" firstStartedPulling="2026-03-10 08:25:24.626725192 +0000 UTC m=+6077.656505807" lastFinishedPulling="2026-03-10 08:25:27.794188679 +0000 UTC m=+6080.823969324" observedRunningTime="2026-03-10 08:25:28.801869885 +0000 UTC m=+6081.831650500" watchObservedRunningTime="2026-03-10 08:25:28.816415467 +0000 UTC m=+6081.846196082" Mar 10 08:25:46 crc kubenswrapper[4825]: I0310 08:25:46.889202 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 08:25:46 crc kubenswrapper[4825]: I0310 08:25:46.889742 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 10 08:25:46 crc kubenswrapper[4825]: I0310 08:25:46.889791 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" Mar 10 08:25:46 crc kubenswrapper[4825]: I0310 08:25:46.890500 4825 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e66b7b78375632673167d52203536b8000b1c9a9d311efe989c77daf24bd4cfc"} pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 08:25:46 crc kubenswrapper[4825]: I0310 08:25:46.890563 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" containerID="cri-o://e66b7b78375632673167d52203536b8000b1c9a9d311efe989c77daf24bd4cfc" gracePeriod=600 Mar 10 08:25:47 crc kubenswrapper[4825]: E0310 08:25:47.019616 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:25:47 crc kubenswrapper[4825]: I0310 08:25:47.993642 4825 generic.go:334] "Generic (PLEG): container finished" podID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerID="e66b7b78375632673167d52203536b8000b1c9a9d311efe989c77daf24bd4cfc" exitCode=0 Mar 10 08:25:47 crc kubenswrapper[4825]: I0310 08:25:47.993736 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" 
event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerDied","Data":"e66b7b78375632673167d52203536b8000b1c9a9d311efe989c77daf24bd4cfc"} Mar 10 08:25:47 crc kubenswrapper[4825]: I0310 08:25:47.994031 4825 scope.go:117] "RemoveContainer" containerID="527769f1bd8bc09217383522eb96db2b542ebee6be919f1442be652e8e45cf86" Mar 10 08:25:47 crc kubenswrapper[4825]: I0310 08:25:47.994876 4825 scope.go:117] "RemoveContainer" containerID="e66b7b78375632673167d52203536b8000b1c9a9d311efe989c77daf24bd4cfc" Mar 10 08:25:47 crc kubenswrapper[4825]: E0310 08:25:47.995213 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:25:48 crc kubenswrapper[4825]: I0310 08:25:48.000628 4825 generic.go:334] "Generic (PLEG): container finished" podID="258f3503-574c-4911-a806-832d9ee659d5" containerID="a8e474bd7cc82c3d7c90eacbfdf8f2e82018cf2e44d98298ee08cd745db5abc9" exitCode=137 Mar 10 08:25:48 crc kubenswrapper[4825]: I0310 08:25:48.000676 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"258f3503-574c-4911-a806-832d9ee659d5","Type":"ContainerDied","Data":"a8e474bd7cc82c3d7c90eacbfdf8f2e82018cf2e44d98298ee08cd745db5abc9"} Mar 10 08:25:48 crc kubenswrapper[4825]: I0310 08:25:48.107175 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 10 08:25:48 crc kubenswrapper[4825]: I0310 08:25:48.176868 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gck5t\" (UniqueName: \"kubernetes.io/projected/258f3503-574c-4911-a806-832d9ee659d5-kube-api-access-gck5t\") pod \"258f3503-574c-4911-a806-832d9ee659d5\" (UID: \"258f3503-574c-4911-a806-832d9ee659d5\") " Mar 10 08:25:48 crc kubenswrapper[4825]: I0310 08:25:48.177570 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/258f3503-574c-4911-a806-832d9ee659d5-combined-ca-bundle\") pod \"258f3503-574c-4911-a806-832d9ee659d5\" (UID: \"258f3503-574c-4911-a806-832d9ee659d5\") " Mar 10 08:25:48 crc kubenswrapper[4825]: I0310 08:25:48.177650 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/258f3503-574c-4911-a806-832d9ee659d5-scripts\") pod \"258f3503-574c-4911-a806-832d9ee659d5\" (UID: \"258f3503-574c-4911-a806-832d9ee659d5\") " Mar 10 08:25:48 crc kubenswrapper[4825]: I0310 08:25:48.178119 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/258f3503-574c-4911-a806-832d9ee659d5-config-data\") pod \"258f3503-574c-4911-a806-832d9ee659d5\" (UID: \"258f3503-574c-4911-a806-832d9ee659d5\") " Mar 10 08:25:48 crc kubenswrapper[4825]: I0310 08:25:48.187256 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/258f3503-574c-4911-a806-832d9ee659d5-scripts" (OuterVolumeSpecName: "scripts") pod "258f3503-574c-4911-a806-832d9ee659d5" (UID: "258f3503-574c-4911-a806-832d9ee659d5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:25:48 crc kubenswrapper[4825]: I0310 08:25:48.198652 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/258f3503-574c-4911-a806-832d9ee659d5-kube-api-access-gck5t" (OuterVolumeSpecName: "kube-api-access-gck5t") pod "258f3503-574c-4911-a806-832d9ee659d5" (UID: "258f3503-574c-4911-a806-832d9ee659d5"). InnerVolumeSpecName "kube-api-access-gck5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:25:48 crc kubenswrapper[4825]: I0310 08:25:48.281015 4825 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/258f3503-574c-4911-a806-832d9ee659d5-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 08:25:48 crc kubenswrapper[4825]: I0310 08:25:48.281231 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gck5t\" (UniqueName: \"kubernetes.io/projected/258f3503-574c-4911-a806-832d9ee659d5-kube-api-access-gck5t\") on node \"crc\" DevicePath \"\"" Mar 10 08:25:48 crc kubenswrapper[4825]: I0310 08:25:48.303075 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/258f3503-574c-4911-a806-832d9ee659d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "258f3503-574c-4911-a806-832d9ee659d5" (UID: "258f3503-574c-4911-a806-832d9ee659d5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:25:48 crc kubenswrapper[4825]: I0310 08:25:48.322342 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/258f3503-574c-4911-a806-832d9ee659d5-config-data" (OuterVolumeSpecName: "config-data") pod "258f3503-574c-4911-a806-832d9ee659d5" (UID: "258f3503-574c-4911-a806-832d9ee659d5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:25:48 crc kubenswrapper[4825]: I0310 08:25:48.383521 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/258f3503-574c-4911-a806-832d9ee659d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:25:48 crc kubenswrapper[4825]: I0310 08:25:48.383559 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/258f3503-574c-4911-a806-832d9ee659d5-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 08:25:49 crc kubenswrapper[4825]: I0310 08:25:49.016056 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"258f3503-574c-4911-a806-832d9ee659d5","Type":"ContainerDied","Data":"2f2144752805fb52dab369fbe0caca512e134821383dca1eccbf6406b2c6b652"} Mar 10 08:25:49 crc kubenswrapper[4825]: I0310 08:25:49.016113 4825 scope.go:117] "RemoveContainer" containerID="a8e474bd7cc82c3d7c90eacbfdf8f2e82018cf2e44d98298ee08cd745db5abc9" Mar 10 08:25:49 crc kubenswrapper[4825]: I0310 08:25:49.016119 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 10 08:25:49 crc kubenswrapper[4825]: I0310 08:25:49.072552 4825 scope.go:117] "RemoveContainer" containerID="600a775a7de0ec710aa71c98d27e8e893d63c0000768c7b317271150fabdcbc8" Mar 10 08:25:49 crc kubenswrapper[4825]: I0310 08:25:49.079679 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 10 08:25:49 crc kubenswrapper[4825]: I0310 08:25:49.093027 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Mar 10 08:25:49 crc kubenswrapper[4825]: I0310 08:25:49.105796 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Mar 10 08:25:49 crc kubenswrapper[4825]: E0310 08:25:49.106373 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="258f3503-574c-4911-a806-832d9ee659d5" containerName="aodh-notifier" Mar 10 08:25:49 crc kubenswrapper[4825]: I0310 08:25:49.106397 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="258f3503-574c-4911-a806-832d9ee659d5" containerName="aodh-notifier" Mar 10 08:25:49 crc kubenswrapper[4825]: E0310 08:25:49.106420 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="258f3503-574c-4911-a806-832d9ee659d5" containerName="aodh-api" Mar 10 08:25:49 crc kubenswrapper[4825]: I0310 08:25:49.106432 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="258f3503-574c-4911-a806-832d9ee659d5" containerName="aodh-api" Mar 10 08:25:49 crc kubenswrapper[4825]: E0310 08:25:49.106464 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="258f3503-574c-4911-a806-832d9ee659d5" containerName="aodh-evaluator" Mar 10 08:25:49 crc kubenswrapper[4825]: I0310 08:25:49.106472 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="258f3503-574c-4911-a806-832d9ee659d5" containerName="aodh-evaluator" Mar 10 08:25:49 crc kubenswrapper[4825]: E0310 08:25:49.106497 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="258f3503-574c-4911-a806-832d9ee659d5" 
containerName="aodh-listener" Mar 10 08:25:49 crc kubenswrapper[4825]: I0310 08:25:49.106505 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="258f3503-574c-4911-a806-832d9ee659d5" containerName="aodh-listener" Mar 10 08:25:49 crc kubenswrapper[4825]: I0310 08:25:49.106724 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="258f3503-574c-4911-a806-832d9ee659d5" containerName="aodh-notifier" Mar 10 08:25:49 crc kubenswrapper[4825]: I0310 08:25:49.106758 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="258f3503-574c-4911-a806-832d9ee659d5" containerName="aodh-listener" Mar 10 08:25:49 crc kubenswrapper[4825]: I0310 08:25:49.106777 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="258f3503-574c-4911-a806-832d9ee659d5" containerName="aodh-evaluator" Mar 10 08:25:49 crc kubenswrapper[4825]: I0310 08:25:49.106796 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="258f3503-574c-4911-a806-832d9ee659d5" containerName="aodh-api" Mar 10 08:25:49 crc kubenswrapper[4825]: I0310 08:25:49.108693 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 10 08:25:49 crc kubenswrapper[4825]: I0310 08:25:49.110440 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-pkvxq" Mar 10 08:25:49 crc kubenswrapper[4825]: I0310 08:25:49.110864 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 10 08:25:49 crc kubenswrapper[4825]: I0310 08:25:49.111100 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Mar 10 08:25:49 crc kubenswrapper[4825]: I0310 08:25:49.111241 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 10 08:25:49 crc kubenswrapper[4825]: I0310 08:25:49.123834 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Mar 10 08:25:49 crc kubenswrapper[4825]: I0310 08:25:49.124024 4825 scope.go:117] "RemoveContainer" containerID="e5cb52ad5df695461d8439c7be4a75a9cf18a77259d8f465603682554441252e" Mar 10 08:25:49 crc kubenswrapper[4825]: I0310 08:25:49.145084 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 10 08:25:49 crc kubenswrapper[4825]: I0310 08:25:49.168474 4825 scope.go:117] "RemoveContainer" containerID="183d219b90afdf6d8d19644ab103a96a3d59fbb32ee78b40f888b4857427cbfa" Mar 10 08:25:49 crc kubenswrapper[4825]: I0310 08:25:49.199334 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/72b82233-02bf-4511-9c8b-6b9c744a552a-internal-tls-certs\") pod \"aodh-0\" (UID: \"72b82233-02bf-4511-9c8b-6b9c744a552a\") " pod="openstack/aodh-0" Mar 10 08:25:49 crc kubenswrapper[4825]: I0310 08:25:49.199427 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwtnc\" (UniqueName: 
\"kubernetes.io/projected/72b82233-02bf-4511-9c8b-6b9c744a552a-kube-api-access-fwtnc\") pod \"aodh-0\" (UID: \"72b82233-02bf-4511-9c8b-6b9c744a552a\") " pod="openstack/aodh-0" Mar 10 08:25:49 crc kubenswrapper[4825]: I0310 08:25:49.199457 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b82233-02bf-4511-9c8b-6b9c744a552a-combined-ca-bundle\") pod \"aodh-0\" (UID: \"72b82233-02bf-4511-9c8b-6b9c744a552a\") " pod="openstack/aodh-0" Mar 10 08:25:49 crc kubenswrapper[4825]: I0310 08:25:49.199564 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72b82233-02bf-4511-9c8b-6b9c744a552a-scripts\") pod \"aodh-0\" (UID: \"72b82233-02bf-4511-9c8b-6b9c744a552a\") " pod="openstack/aodh-0" Mar 10 08:25:49 crc kubenswrapper[4825]: I0310 08:25:49.199667 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72b82233-02bf-4511-9c8b-6b9c744a552a-config-data\") pod \"aodh-0\" (UID: \"72b82233-02bf-4511-9c8b-6b9c744a552a\") " pod="openstack/aodh-0" Mar 10 08:25:49 crc kubenswrapper[4825]: I0310 08:25:49.199793 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/72b82233-02bf-4511-9c8b-6b9c744a552a-public-tls-certs\") pod \"aodh-0\" (UID: \"72b82233-02bf-4511-9c8b-6b9c744a552a\") " pod="openstack/aodh-0" Mar 10 08:25:49 crc kubenswrapper[4825]: I0310 08:25:49.247831 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="258f3503-574c-4911-a806-832d9ee659d5" path="/var/lib/kubelet/pods/258f3503-574c-4911-a806-832d9ee659d5/volumes" Mar 10 08:25:49 crc kubenswrapper[4825]: I0310 08:25:49.301338 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b82233-02bf-4511-9c8b-6b9c744a552a-combined-ca-bundle\") pod \"aodh-0\" (UID: \"72b82233-02bf-4511-9c8b-6b9c744a552a\") " pod="openstack/aodh-0" Mar 10 08:25:49 crc kubenswrapper[4825]: I0310 08:25:49.301391 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72b82233-02bf-4511-9c8b-6b9c744a552a-scripts\") pod \"aodh-0\" (UID: \"72b82233-02bf-4511-9c8b-6b9c744a552a\") " pod="openstack/aodh-0" Mar 10 08:25:49 crc kubenswrapper[4825]: I0310 08:25:49.301422 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72b82233-02bf-4511-9c8b-6b9c744a552a-config-data\") pod \"aodh-0\" (UID: \"72b82233-02bf-4511-9c8b-6b9c744a552a\") " pod="openstack/aodh-0" Mar 10 08:25:49 crc kubenswrapper[4825]: I0310 08:25:49.301463 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/72b82233-02bf-4511-9c8b-6b9c744a552a-public-tls-certs\") pod \"aodh-0\" (UID: \"72b82233-02bf-4511-9c8b-6b9c744a552a\") " pod="openstack/aodh-0" Mar 10 08:25:49 crc kubenswrapper[4825]: I0310 08:25:49.301569 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/72b82233-02bf-4511-9c8b-6b9c744a552a-internal-tls-certs\") pod \"aodh-0\" (UID: \"72b82233-02bf-4511-9c8b-6b9c744a552a\") " pod="openstack/aodh-0" Mar 10 08:25:49 crc kubenswrapper[4825]: I0310 08:25:49.301658 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwtnc\" (UniqueName: \"kubernetes.io/projected/72b82233-02bf-4511-9c8b-6b9c744a552a-kube-api-access-fwtnc\") pod \"aodh-0\" (UID: \"72b82233-02bf-4511-9c8b-6b9c744a552a\") " pod="openstack/aodh-0" Mar 10 08:25:49 crc kubenswrapper[4825]: I0310 08:25:49.308738 4825 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/72b82233-02bf-4511-9c8b-6b9c744a552a-public-tls-certs\") pod \"aodh-0\" (UID: \"72b82233-02bf-4511-9c8b-6b9c744a552a\") " pod="openstack/aodh-0" Mar 10 08:25:49 crc kubenswrapper[4825]: I0310 08:25:49.308866 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b82233-02bf-4511-9c8b-6b9c744a552a-combined-ca-bundle\") pod \"aodh-0\" (UID: \"72b82233-02bf-4511-9c8b-6b9c744a552a\") " pod="openstack/aodh-0" Mar 10 08:25:49 crc kubenswrapper[4825]: I0310 08:25:49.309929 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72b82233-02bf-4511-9c8b-6b9c744a552a-config-data\") pod \"aodh-0\" (UID: \"72b82233-02bf-4511-9c8b-6b9c744a552a\") " pod="openstack/aodh-0" Mar 10 08:25:49 crc kubenswrapper[4825]: I0310 08:25:49.311123 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72b82233-02bf-4511-9c8b-6b9c744a552a-scripts\") pod \"aodh-0\" (UID: \"72b82233-02bf-4511-9c8b-6b9c744a552a\") " pod="openstack/aodh-0" Mar 10 08:25:49 crc kubenswrapper[4825]: I0310 08:25:49.311993 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/72b82233-02bf-4511-9c8b-6b9c744a552a-internal-tls-certs\") pod \"aodh-0\" (UID: \"72b82233-02bf-4511-9c8b-6b9c744a552a\") " pod="openstack/aodh-0" Mar 10 08:25:49 crc kubenswrapper[4825]: I0310 08:25:49.319662 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwtnc\" (UniqueName: \"kubernetes.io/projected/72b82233-02bf-4511-9c8b-6b9c744a552a-kube-api-access-fwtnc\") pod \"aodh-0\" (UID: \"72b82233-02bf-4511-9c8b-6b9c744a552a\") " pod="openstack/aodh-0" Mar 10 08:25:49 crc kubenswrapper[4825]: I0310 
08:25:49.441873 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 10 08:25:49 crc kubenswrapper[4825]: I0310 08:25:49.901757 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 10 08:25:49 crc kubenswrapper[4825]: W0310 08:25:49.909702 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72b82233_02bf_4511_9c8b_6b9c744a552a.slice/crio-45c9841b14ac6550c5a6db643dab26e34a9a2f159ae3acdd5f872fdf8e5abb48 WatchSource:0}: Error finding container 45c9841b14ac6550c5a6db643dab26e34a9a2f159ae3acdd5f872fdf8e5abb48: Status 404 returned error can't find the container with id 45c9841b14ac6550c5a6db643dab26e34a9a2f159ae3acdd5f872fdf8e5abb48 Mar 10 08:25:50 crc kubenswrapper[4825]: I0310 08:25:50.028894 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"72b82233-02bf-4511-9c8b-6b9c744a552a","Type":"ContainerStarted","Data":"45c9841b14ac6550c5a6db643dab26e34a9a2f159ae3acdd5f872fdf8e5abb48"} Mar 10 08:25:51 crc kubenswrapper[4825]: I0310 08:25:51.042570 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"72b82233-02bf-4511-9c8b-6b9c744a552a","Type":"ContainerStarted","Data":"851de1c3f5b50551d6e487aecf0fd511b8127fac67a7b7adef840bf455a25c20"} Mar 10 08:25:51 crc kubenswrapper[4825]: I0310 08:25:51.043297 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"72b82233-02bf-4511-9c8b-6b9c744a552a","Type":"ContainerStarted","Data":"5fde08cff722c65c5d6388dac0c623b22a0eda395e6d395c09b209e3479be34f"} Mar 10 08:25:51 crc kubenswrapper[4825]: I0310 08:25:51.613245 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76cb956b75-tw8ht"] Mar 10 08:25:51 crc kubenswrapper[4825]: I0310 08:25:51.614945 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76cb956b75-tw8ht" Mar 10 08:25:51 crc kubenswrapper[4825]: I0310 08:25:51.617293 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1" Mar 10 08:25:51 crc kubenswrapper[4825]: I0310 08:25:51.627297 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76cb956b75-tw8ht"] Mar 10 08:25:51 crc kubenswrapper[4825]: I0310 08:25:51.644092 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35-dns-svc\") pod \"dnsmasq-dns-76cb956b75-tw8ht\" (UID: \"9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35\") " pod="openstack/dnsmasq-dns-76cb956b75-tw8ht" Mar 10 08:25:51 crc kubenswrapper[4825]: I0310 08:25:51.644383 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35-openstack-cell1\") pod \"dnsmasq-dns-76cb956b75-tw8ht\" (UID: \"9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35\") " pod="openstack/dnsmasq-dns-76cb956b75-tw8ht" Mar 10 08:25:51 crc kubenswrapper[4825]: I0310 08:25:51.644539 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35-config\") pod \"dnsmasq-dns-76cb956b75-tw8ht\" (UID: \"9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35\") " pod="openstack/dnsmasq-dns-76cb956b75-tw8ht" Mar 10 08:25:51 crc kubenswrapper[4825]: I0310 08:25:51.644733 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35-ovsdbserver-sb\") pod \"dnsmasq-dns-76cb956b75-tw8ht\" (UID: \"9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35\") " 
pod="openstack/dnsmasq-dns-76cb956b75-tw8ht" Mar 10 08:25:51 crc kubenswrapper[4825]: I0310 08:25:51.644795 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htfdf\" (UniqueName: \"kubernetes.io/projected/9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35-kube-api-access-htfdf\") pod \"dnsmasq-dns-76cb956b75-tw8ht\" (UID: \"9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35\") " pod="openstack/dnsmasq-dns-76cb956b75-tw8ht" Mar 10 08:25:51 crc kubenswrapper[4825]: I0310 08:25:51.644814 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35-ovsdbserver-nb\") pod \"dnsmasq-dns-76cb956b75-tw8ht\" (UID: \"9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35\") " pod="openstack/dnsmasq-dns-76cb956b75-tw8ht" Mar 10 08:25:51 crc kubenswrapper[4825]: I0310 08:25:51.746577 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35-config\") pod \"dnsmasq-dns-76cb956b75-tw8ht\" (UID: \"9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35\") " pod="openstack/dnsmasq-dns-76cb956b75-tw8ht" Mar 10 08:25:51 crc kubenswrapper[4825]: I0310 08:25:51.746930 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35-ovsdbserver-sb\") pod \"dnsmasq-dns-76cb956b75-tw8ht\" (UID: \"9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35\") " pod="openstack/dnsmasq-dns-76cb956b75-tw8ht" Mar 10 08:25:51 crc kubenswrapper[4825]: I0310 08:25:51.747028 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35-ovsdbserver-nb\") pod \"dnsmasq-dns-76cb956b75-tw8ht\" (UID: \"9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35\") " 
pod="openstack/dnsmasq-dns-76cb956b75-tw8ht" Mar 10 08:25:51 crc kubenswrapper[4825]: I0310 08:25:51.747061 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htfdf\" (UniqueName: \"kubernetes.io/projected/9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35-kube-api-access-htfdf\") pod \"dnsmasq-dns-76cb956b75-tw8ht\" (UID: \"9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35\") " pod="openstack/dnsmasq-dns-76cb956b75-tw8ht" Mar 10 08:25:51 crc kubenswrapper[4825]: I0310 08:25:51.747606 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35-config\") pod \"dnsmasq-dns-76cb956b75-tw8ht\" (UID: \"9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35\") " pod="openstack/dnsmasq-dns-76cb956b75-tw8ht" Mar 10 08:25:51 crc kubenswrapper[4825]: I0310 08:25:51.747806 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35-ovsdbserver-sb\") pod \"dnsmasq-dns-76cb956b75-tw8ht\" (UID: \"9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35\") " pod="openstack/dnsmasq-dns-76cb956b75-tw8ht" Mar 10 08:25:51 crc kubenswrapper[4825]: I0310 08:25:51.747818 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35-ovsdbserver-nb\") pod \"dnsmasq-dns-76cb956b75-tw8ht\" (UID: \"9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35\") " pod="openstack/dnsmasq-dns-76cb956b75-tw8ht" Mar 10 08:25:51 crc kubenswrapper[4825]: I0310 08:25:51.747962 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35-dns-svc\") pod \"dnsmasq-dns-76cb956b75-tw8ht\" (UID: \"9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35\") " pod="openstack/dnsmasq-dns-76cb956b75-tw8ht" Mar 10 08:25:51 crc kubenswrapper[4825]: I0310 
08:25:51.747998 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35-openstack-cell1\") pod \"dnsmasq-dns-76cb956b75-tw8ht\" (UID: \"9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35\") " pod="openstack/dnsmasq-dns-76cb956b75-tw8ht" Mar 10 08:25:51 crc kubenswrapper[4825]: I0310 08:25:51.748740 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35-dns-svc\") pod \"dnsmasq-dns-76cb956b75-tw8ht\" (UID: \"9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35\") " pod="openstack/dnsmasq-dns-76cb956b75-tw8ht" Mar 10 08:25:51 crc kubenswrapper[4825]: I0310 08:25:51.748767 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35-openstack-cell1\") pod \"dnsmasq-dns-76cb956b75-tw8ht\" (UID: \"9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35\") " pod="openstack/dnsmasq-dns-76cb956b75-tw8ht" Mar 10 08:25:51 crc kubenswrapper[4825]: I0310 08:25:51.765804 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htfdf\" (UniqueName: \"kubernetes.io/projected/9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35-kube-api-access-htfdf\") pod \"dnsmasq-dns-76cb956b75-tw8ht\" (UID: \"9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35\") " pod="openstack/dnsmasq-dns-76cb956b75-tw8ht" Mar 10 08:25:51 crc kubenswrapper[4825]: I0310 08:25:51.934781 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76cb956b75-tw8ht" Mar 10 08:25:52 crc kubenswrapper[4825]: I0310 08:25:52.078720 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"72b82233-02bf-4511-9c8b-6b9c744a552a","Type":"ContainerStarted","Data":"b0ff2cb3defa219f13fcb771ffb0794561836e9097461f3deb7201003b79e974"} Mar 10 08:25:52 crc kubenswrapper[4825]: I0310 08:25:52.079039 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"72b82233-02bf-4511-9c8b-6b9c744a552a","Type":"ContainerStarted","Data":"c7cdd56e657f1da01b0931fce8a5ac188a76ce987545c2583118e2ed3c0da04a"} Mar 10 08:25:52 crc kubenswrapper[4825]: I0310 08:25:52.118661 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=1.592124644 podStartE2EDuration="3.11864185s" podCreationTimestamp="2026-03-10 08:25:49 +0000 UTC" firstStartedPulling="2026-03-10 08:25:49.914398711 +0000 UTC m=+6102.944179346" lastFinishedPulling="2026-03-10 08:25:51.440915937 +0000 UTC m=+6104.470696552" observedRunningTime="2026-03-10 08:25:52.102376393 +0000 UTC m=+6105.132157028" watchObservedRunningTime="2026-03-10 08:25:52.11864185 +0000 UTC m=+6105.148422465" Mar 10 08:25:52 crc kubenswrapper[4825]: W0310 08:25:52.488503 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ac4bc4c_ab7c_45e0_b862_4c33bb46ae35.slice/crio-47a3e9ef233ee652d78c97e9b1bb36316e4d25fb8097775ea6d919a17330e4b0 WatchSource:0}: Error finding container 47a3e9ef233ee652d78c97e9b1bb36316e4d25fb8097775ea6d919a17330e4b0: Status 404 returned error can't find the container with id 47a3e9ef233ee652d78c97e9b1bb36316e4d25fb8097775ea6d919a17330e4b0 Mar 10 08:25:52 crc kubenswrapper[4825]: I0310 08:25:52.505979 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76cb956b75-tw8ht"] Mar 10 08:25:53 crc 
kubenswrapper[4825]: I0310 08:25:53.091450 4825 generic.go:334] "Generic (PLEG): container finished" podID="9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35" containerID="993c49019b0e0ca2b1132e5af69a90c51894e140101cb45bfd177ce3edf919d8" exitCode=0 Mar 10 08:25:53 crc kubenswrapper[4825]: I0310 08:25:53.091508 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76cb956b75-tw8ht" event={"ID":"9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35","Type":"ContainerDied","Data":"993c49019b0e0ca2b1132e5af69a90c51894e140101cb45bfd177ce3edf919d8"} Mar 10 08:25:53 crc kubenswrapper[4825]: I0310 08:25:53.091833 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76cb956b75-tw8ht" event={"ID":"9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35","Type":"ContainerStarted","Data":"47a3e9ef233ee652d78c97e9b1bb36316e4d25fb8097775ea6d919a17330e4b0"} Mar 10 08:25:54 crc kubenswrapper[4825]: I0310 08:25:54.050192 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-4078-account-create-update-tbhx6"] Mar 10 08:25:54 crc kubenswrapper[4825]: I0310 08:25:54.062651 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-nlxld"] Mar 10 08:25:54 crc kubenswrapper[4825]: I0310 08:25:54.072681 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-4078-account-create-update-tbhx6"] Mar 10 08:25:54 crc kubenswrapper[4825]: I0310 08:25:54.082396 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-nlxld"] Mar 10 08:25:54 crc kubenswrapper[4825]: I0310 08:25:54.101786 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76cb956b75-tw8ht" event={"ID":"9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35","Type":"ContainerStarted","Data":"9e15f6b822363005b80d867b0eb4d4d79f35358c003ef050180166e4bbe5edf3"} Mar 10 08:25:54 crc kubenswrapper[4825]: I0310 08:25:54.101988 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-76cb956b75-tw8ht" Mar 10 08:25:54 crc kubenswrapper[4825]: I0310 08:25:54.122285 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76cb956b75-tw8ht" podStartSLOduration=3.122265023 podStartE2EDuration="3.122265023s" podCreationTimestamp="2026-03-10 08:25:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:25:54.120256671 +0000 UTC m=+6107.150037326" watchObservedRunningTime="2026-03-10 08:25:54.122265023 +0000 UTC m=+6107.152045658" Mar 10 08:25:54 crc kubenswrapper[4825]: I0310 08:25:54.141765 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 10 08:25:55 crc kubenswrapper[4825]: I0310 08:25:55.247563 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16d6fbed-6e1d-42c0-b023-2b2f0bef5889" path="/var/lib/kubelet/pods/16d6fbed-6e1d-42c0-b023-2b2f0bef5889/volumes" Mar 10 08:25:55 crc kubenswrapper[4825]: I0310 08:25:55.248683 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9667b429-21a2-47d3-8874-5cc660a9e1d6" path="/var/lib/kubelet/pods/9667b429-21a2-47d3-8874-5cc660a9e1d6/volumes" Mar 10 08:26:00 crc kubenswrapper[4825]: I0310 08:26:00.196627 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552186-z88l7"] Mar 10 08:26:00 crc kubenswrapper[4825]: I0310 08:26:00.199510 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552186-z88l7" Mar 10 08:26:00 crc kubenswrapper[4825]: I0310 08:26:00.202529 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 08:26:00 crc kubenswrapper[4825]: I0310 08:26:00.202793 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 08:26:00 crc kubenswrapper[4825]: I0310 08:26:00.204614 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 08:26:00 crc kubenswrapper[4825]: I0310 08:26:00.207562 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552186-z88l7"] Mar 10 08:26:00 crc kubenswrapper[4825]: I0310 08:26:00.245561 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4xdm\" (UniqueName: \"kubernetes.io/projected/8fc5e5dd-4c5e-4729-8750-b65c6e70e83d-kube-api-access-d4xdm\") pod \"auto-csr-approver-29552186-z88l7\" (UID: \"8fc5e5dd-4c5e-4729-8750-b65c6e70e83d\") " pod="openshift-infra/auto-csr-approver-29552186-z88l7" Mar 10 08:26:00 crc kubenswrapper[4825]: I0310 08:26:00.348863 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4xdm\" (UniqueName: \"kubernetes.io/projected/8fc5e5dd-4c5e-4729-8750-b65c6e70e83d-kube-api-access-d4xdm\") pod \"auto-csr-approver-29552186-z88l7\" (UID: \"8fc5e5dd-4c5e-4729-8750-b65c6e70e83d\") " pod="openshift-infra/auto-csr-approver-29552186-z88l7" Mar 10 08:26:00 crc kubenswrapper[4825]: I0310 08:26:00.381514 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4xdm\" (UniqueName: \"kubernetes.io/projected/8fc5e5dd-4c5e-4729-8750-b65c6e70e83d-kube-api-access-d4xdm\") pod \"auto-csr-approver-29552186-z88l7\" (UID: \"8fc5e5dd-4c5e-4729-8750-b65c6e70e83d\") " 
pod="openshift-infra/auto-csr-approver-29552186-z88l7" Mar 10 08:26:00 crc kubenswrapper[4825]: I0310 08:26:00.536248 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552186-z88l7" Mar 10 08:26:01 crc kubenswrapper[4825]: I0310 08:26:01.170070 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552186-z88l7"] Mar 10 08:26:01 crc kubenswrapper[4825]: I0310 08:26:01.219539 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552186-z88l7" event={"ID":"8fc5e5dd-4c5e-4729-8750-b65c6e70e83d","Type":"ContainerStarted","Data":"b16c5e8977c54d5f7b4775761af5d2d373740417c68133b0f3f56e49561cf9f1"} Mar 10 08:26:01 crc kubenswrapper[4825]: I0310 08:26:01.936303 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76cb956b75-tw8ht" Mar 10 08:26:02 crc kubenswrapper[4825]: I0310 08:26:02.015344 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66cbbcd49c-svhss"] Mar 10 08:26:02 crc kubenswrapper[4825]: I0310 08:26:02.015594 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-66cbbcd49c-svhss" podUID="a6985674-aa5a-40c5-a384-04bac7fd0a1b" containerName="dnsmasq-dns" containerID="cri-o://6862e0585f5a70635aa6d6682db784c76f978351ae935abc4d0460550c6fd034" gracePeriod=10 Mar 10 08:26:02 crc kubenswrapper[4825]: I0310 08:26:02.237945 4825 scope.go:117] "RemoveContainer" containerID="e66b7b78375632673167d52203536b8000b1c9a9d311efe989c77daf24bd4cfc" Mar 10 08:26:02 crc kubenswrapper[4825]: E0310 08:26:02.238273 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:26:02 crc kubenswrapper[4825]: I0310 08:26:02.259296 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5958856c47-pd5nn"] Mar 10 08:26:02 crc kubenswrapper[4825]: I0310 08:26:02.261515 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5958856c47-pd5nn" Mar 10 08:26:02 crc kubenswrapper[4825]: I0310 08:26:02.283006 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5958856c47-pd5nn"] Mar 10 08:26:02 crc kubenswrapper[4825]: I0310 08:26:02.292460 4825 generic.go:334] "Generic (PLEG): container finished" podID="a6985674-aa5a-40c5-a384-04bac7fd0a1b" containerID="6862e0585f5a70635aa6d6682db784c76f978351ae935abc4d0460550c6fd034" exitCode=0 Mar 10 08:26:02 crc kubenswrapper[4825]: I0310 08:26:02.292507 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66cbbcd49c-svhss" event={"ID":"a6985674-aa5a-40c5-a384-04bac7fd0a1b","Type":"ContainerDied","Data":"6862e0585f5a70635aa6d6682db784c76f978351ae935abc4d0460550c6fd034"} Mar 10 08:26:02 crc kubenswrapper[4825]: I0310 08:26:02.302405 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e59d4d2c-39bd-4900-8777-be4beac0f1c5-ovsdbserver-sb\") pod \"dnsmasq-dns-5958856c47-pd5nn\" (UID: \"e59d4d2c-39bd-4900-8777-be4beac0f1c5\") " pod="openstack/dnsmasq-dns-5958856c47-pd5nn" Mar 10 08:26:02 crc kubenswrapper[4825]: I0310 08:26:02.302499 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw24m\" (UniqueName: \"kubernetes.io/projected/e59d4d2c-39bd-4900-8777-be4beac0f1c5-kube-api-access-vw24m\") pod 
\"dnsmasq-dns-5958856c47-pd5nn\" (UID: \"e59d4d2c-39bd-4900-8777-be4beac0f1c5\") " pod="openstack/dnsmasq-dns-5958856c47-pd5nn" Mar 10 08:26:02 crc kubenswrapper[4825]: I0310 08:26:02.302556 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e59d4d2c-39bd-4900-8777-be4beac0f1c5-ovsdbserver-nb\") pod \"dnsmasq-dns-5958856c47-pd5nn\" (UID: \"e59d4d2c-39bd-4900-8777-be4beac0f1c5\") " pod="openstack/dnsmasq-dns-5958856c47-pd5nn" Mar 10 08:26:02 crc kubenswrapper[4825]: I0310 08:26:02.302584 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e59d4d2c-39bd-4900-8777-be4beac0f1c5-dns-svc\") pod \"dnsmasq-dns-5958856c47-pd5nn\" (UID: \"e59d4d2c-39bd-4900-8777-be4beac0f1c5\") " pod="openstack/dnsmasq-dns-5958856c47-pd5nn" Mar 10 08:26:02 crc kubenswrapper[4825]: I0310 08:26:02.302689 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/e59d4d2c-39bd-4900-8777-be4beac0f1c5-openstack-cell1\") pod \"dnsmasq-dns-5958856c47-pd5nn\" (UID: \"e59d4d2c-39bd-4900-8777-be4beac0f1c5\") " pod="openstack/dnsmasq-dns-5958856c47-pd5nn" Mar 10 08:26:02 crc kubenswrapper[4825]: I0310 08:26:02.302731 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e59d4d2c-39bd-4900-8777-be4beac0f1c5-config\") pod \"dnsmasq-dns-5958856c47-pd5nn\" (UID: \"e59d4d2c-39bd-4900-8777-be4beac0f1c5\") " pod="openstack/dnsmasq-dns-5958856c47-pd5nn" Mar 10 08:26:02 crc kubenswrapper[4825]: I0310 08:26:02.404959 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/e59d4d2c-39bd-4900-8777-be4beac0f1c5-openstack-cell1\") pod 
\"dnsmasq-dns-5958856c47-pd5nn\" (UID: \"e59d4d2c-39bd-4900-8777-be4beac0f1c5\") " pod="openstack/dnsmasq-dns-5958856c47-pd5nn" Mar 10 08:26:02 crc kubenswrapper[4825]: I0310 08:26:02.405025 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e59d4d2c-39bd-4900-8777-be4beac0f1c5-config\") pod \"dnsmasq-dns-5958856c47-pd5nn\" (UID: \"e59d4d2c-39bd-4900-8777-be4beac0f1c5\") " pod="openstack/dnsmasq-dns-5958856c47-pd5nn" Mar 10 08:26:02 crc kubenswrapper[4825]: I0310 08:26:02.405077 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e59d4d2c-39bd-4900-8777-be4beac0f1c5-ovsdbserver-sb\") pod \"dnsmasq-dns-5958856c47-pd5nn\" (UID: \"e59d4d2c-39bd-4900-8777-be4beac0f1c5\") " pod="openstack/dnsmasq-dns-5958856c47-pd5nn" Mar 10 08:26:02 crc kubenswrapper[4825]: I0310 08:26:02.405149 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw24m\" (UniqueName: \"kubernetes.io/projected/e59d4d2c-39bd-4900-8777-be4beac0f1c5-kube-api-access-vw24m\") pod \"dnsmasq-dns-5958856c47-pd5nn\" (UID: \"e59d4d2c-39bd-4900-8777-be4beac0f1c5\") " pod="openstack/dnsmasq-dns-5958856c47-pd5nn" Mar 10 08:26:02 crc kubenswrapper[4825]: I0310 08:26:02.405187 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e59d4d2c-39bd-4900-8777-be4beac0f1c5-ovsdbserver-nb\") pod \"dnsmasq-dns-5958856c47-pd5nn\" (UID: \"e59d4d2c-39bd-4900-8777-be4beac0f1c5\") " pod="openstack/dnsmasq-dns-5958856c47-pd5nn" Mar 10 08:26:02 crc kubenswrapper[4825]: I0310 08:26:02.405204 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e59d4d2c-39bd-4900-8777-be4beac0f1c5-dns-svc\") pod \"dnsmasq-dns-5958856c47-pd5nn\" (UID: 
\"e59d4d2c-39bd-4900-8777-be4beac0f1c5\") " pod="openstack/dnsmasq-dns-5958856c47-pd5nn" Mar 10 08:26:02 crc kubenswrapper[4825]: I0310 08:26:02.405872 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/e59d4d2c-39bd-4900-8777-be4beac0f1c5-openstack-cell1\") pod \"dnsmasq-dns-5958856c47-pd5nn\" (UID: \"e59d4d2c-39bd-4900-8777-be4beac0f1c5\") " pod="openstack/dnsmasq-dns-5958856c47-pd5nn" Mar 10 08:26:02 crc kubenswrapper[4825]: I0310 08:26:02.406112 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e59d4d2c-39bd-4900-8777-be4beac0f1c5-config\") pod \"dnsmasq-dns-5958856c47-pd5nn\" (UID: \"e59d4d2c-39bd-4900-8777-be4beac0f1c5\") " pod="openstack/dnsmasq-dns-5958856c47-pd5nn" Mar 10 08:26:02 crc kubenswrapper[4825]: I0310 08:26:02.406838 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e59d4d2c-39bd-4900-8777-be4beac0f1c5-ovsdbserver-sb\") pod \"dnsmasq-dns-5958856c47-pd5nn\" (UID: \"e59d4d2c-39bd-4900-8777-be4beac0f1c5\") " pod="openstack/dnsmasq-dns-5958856c47-pd5nn" Mar 10 08:26:02 crc kubenswrapper[4825]: I0310 08:26:02.406892 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e59d4d2c-39bd-4900-8777-be4beac0f1c5-ovsdbserver-nb\") pod \"dnsmasq-dns-5958856c47-pd5nn\" (UID: \"e59d4d2c-39bd-4900-8777-be4beac0f1c5\") " pod="openstack/dnsmasq-dns-5958856c47-pd5nn" Mar 10 08:26:02 crc kubenswrapper[4825]: I0310 08:26:02.407355 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e59d4d2c-39bd-4900-8777-be4beac0f1c5-dns-svc\") pod \"dnsmasq-dns-5958856c47-pd5nn\" (UID: \"e59d4d2c-39bd-4900-8777-be4beac0f1c5\") " pod="openstack/dnsmasq-dns-5958856c47-pd5nn" Mar 10 08:26:02 crc 
kubenswrapper[4825]: I0310 08:26:02.425525 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw24m\" (UniqueName: \"kubernetes.io/projected/e59d4d2c-39bd-4900-8777-be4beac0f1c5-kube-api-access-vw24m\") pod \"dnsmasq-dns-5958856c47-pd5nn\" (UID: \"e59d4d2c-39bd-4900-8777-be4beac0f1c5\") " pod="openstack/dnsmasq-dns-5958856c47-pd5nn" Mar 10 08:26:02 crc kubenswrapper[4825]: I0310 08:26:02.572953 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66cbbcd49c-svhss" Mar 10 08:26:02 crc kubenswrapper[4825]: I0310 08:26:02.604869 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5958856c47-pd5nn" Mar 10 08:26:02 crc kubenswrapper[4825]: I0310 08:26:02.608893 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6985674-aa5a-40c5-a384-04bac7fd0a1b-config\") pod \"a6985674-aa5a-40c5-a384-04bac7fd0a1b\" (UID: \"a6985674-aa5a-40c5-a384-04bac7fd0a1b\") " Mar 10 08:26:02 crc kubenswrapper[4825]: I0310 08:26:02.608979 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6985674-aa5a-40c5-a384-04bac7fd0a1b-ovsdbserver-sb\") pod \"a6985674-aa5a-40c5-a384-04bac7fd0a1b\" (UID: \"a6985674-aa5a-40c5-a384-04bac7fd0a1b\") " Mar 10 08:26:02 crc kubenswrapper[4825]: I0310 08:26:02.609280 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vczwv\" (UniqueName: \"kubernetes.io/projected/a6985674-aa5a-40c5-a384-04bac7fd0a1b-kube-api-access-vczwv\") pod \"a6985674-aa5a-40c5-a384-04bac7fd0a1b\" (UID: \"a6985674-aa5a-40c5-a384-04bac7fd0a1b\") " Mar 10 08:26:02 crc kubenswrapper[4825]: I0310 08:26:02.609317 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/a6985674-aa5a-40c5-a384-04bac7fd0a1b-ovsdbserver-nb\") pod \"a6985674-aa5a-40c5-a384-04bac7fd0a1b\" (UID: \"a6985674-aa5a-40c5-a384-04bac7fd0a1b\") " Mar 10 08:26:02 crc kubenswrapper[4825]: I0310 08:26:02.609348 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6985674-aa5a-40c5-a384-04bac7fd0a1b-dns-svc\") pod \"a6985674-aa5a-40c5-a384-04bac7fd0a1b\" (UID: \"a6985674-aa5a-40c5-a384-04bac7fd0a1b\") " Mar 10 08:26:02 crc kubenswrapper[4825]: I0310 08:26:02.631257 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6985674-aa5a-40c5-a384-04bac7fd0a1b-kube-api-access-vczwv" (OuterVolumeSpecName: "kube-api-access-vczwv") pod "a6985674-aa5a-40c5-a384-04bac7fd0a1b" (UID: "a6985674-aa5a-40c5-a384-04bac7fd0a1b"). InnerVolumeSpecName "kube-api-access-vczwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:26:02 crc kubenswrapper[4825]: I0310 08:26:02.674575 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6985674-aa5a-40c5-a384-04bac7fd0a1b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a6985674-aa5a-40c5-a384-04bac7fd0a1b" (UID: "a6985674-aa5a-40c5-a384-04bac7fd0a1b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:26:02 crc kubenswrapper[4825]: I0310 08:26:02.685077 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6985674-aa5a-40c5-a384-04bac7fd0a1b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a6985674-aa5a-40c5-a384-04bac7fd0a1b" (UID: "a6985674-aa5a-40c5-a384-04bac7fd0a1b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:26:02 crc kubenswrapper[4825]: I0310 08:26:02.696593 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6985674-aa5a-40c5-a384-04bac7fd0a1b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a6985674-aa5a-40c5-a384-04bac7fd0a1b" (UID: "a6985674-aa5a-40c5-a384-04bac7fd0a1b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:26:02 crc kubenswrapper[4825]: I0310 08:26:02.698874 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6985674-aa5a-40c5-a384-04bac7fd0a1b-config" (OuterVolumeSpecName: "config") pod "a6985674-aa5a-40c5-a384-04bac7fd0a1b" (UID: "a6985674-aa5a-40c5-a384-04bac7fd0a1b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:26:02 crc kubenswrapper[4825]: I0310 08:26:02.711939 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vczwv\" (UniqueName: \"kubernetes.io/projected/a6985674-aa5a-40c5-a384-04bac7fd0a1b-kube-api-access-vczwv\") on node \"crc\" DevicePath \"\"" Mar 10 08:26:02 crc kubenswrapper[4825]: I0310 08:26:02.712224 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6985674-aa5a-40c5-a384-04bac7fd0a1b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 08:26:02 crc kubenswrapper[4825]: I0310 08:26:02.712283 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6985674-aa5a-40c5-a384-04bac7fd0a1b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 08:26:02 crc kubenswrapper[4825]: I0310 08:26:02.712334 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6985674-aa5a-40c5-a384-04bac7fd0a1b-config\") on node \"crc\" DevicePath \"\"" Mar 10 08:26:02 crc 
kubenswrapper[4825]: I0310 08:26:02.712406 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6985674-aa5a-40c5-a384-04bac7fd0a1b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 08:26:03 crc kubenswrapper[4825]: I0310 08:26:03.100117 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5958856c47-pd5nn"] Mar 10 08:26:03 crc kubenswrapper[4825]: I0310 08:26:03.306581 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552186-z88l7" event={"ID":"8fc5e5dd-4c5e-4729-8750-b65c6e70e83d","Type":"ContainerStarted","Data":"10d28f408366d3a98aa2961943ac9fe1c11d4640f9c9e39ef75970fa87943e77"} Mar 10 08:26:03 crc kubenswrapper[4825]: I0310 08:26:03.309591 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5958856c47-pd5nn" event={"ID":"e59d4d2c-39bd-4900-8777-be4beac0f1c5","Type":"ContainerStarted","Data":"53946abfb9bc5cf260f743f5918533d79647f69250cc0ac1d6ff04293d023c3d"} Mar 10 08:26:03 crc kubenswrapper[4825]: I0310 08:26:03.312149 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66cbbcd49c-svhss" event={"ID":"a6985674-aa5a-40c5-a384-04bac7fd0a1b","Type":"ContainerDied","Data":"9ab226bffea0e69b9ed41e55cc33656c9a911b8d37b0e82b97a235a3f448ba20"} Mar 10 08:26:03 crc kubenswrapper[4825]: I0310 08:26:03.312211 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66cbbcd49c-svhss" Mar 10 08:26:03 crc kubenswrapper[4825]: I0310 08:26:03.312272 4825 scope.go:117] "RemoveContainer" containerID="6862e0585f5a70635aa6d6682db784c76f978351ae935abc4d0460550c6fd034" Mar 10 08:26:03 crc kubenswrapper[4825]: I0310 08:26:03.326239 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552186-z88l7" podStartSLOduration=2.4336082279999998 podStartE2EDuration="3.326221902s" podCreationTimestamp="2026-03-10 08:26:00 +0000 UTC" firstStartedPulling="2026-03-10 08:26:01.171849341 +0000 UTC m=+6114.201629956" lastFinishedPulling="2026-03-10 08:26:02.064463015 +0000 UTC m=+6115.094243630" observedRunningTime="2026-03-10 08:26:03.319634739 +0000 UTC m=+6116.349415354" watchObservedRunningTime="2026-03-10 08:26:03.326221902 +0000 UTC m=+6116.356002517" Mar 10 08:26:03 crc kubenswrapper[4825]: I0310 08:26:03.345496 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66cbbcd49c-svhss"] Mar 10 08:26:03 crc kubenswrapper[4825]: I0310 08:26:03.348556 4825 scope.go:117] "RemoveContainer" containerID="d1ac0f7ac6e21940d961dad9396df6d9371d828f3f536516a27f2f707bb954de" Mar 10 08:26:03 crc kubenswrapper[4825]: I0310 08:26:03.354940 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-66cbbcd49c-svhss"] Mar 10 08:26:04 crc kubenswrapper[4825]: I0310 08:26:04.329898 4825 generic.go:334] "Generic (PLEG): container finished" podID="e59d4d2c-39bd-4900-8777-be4beac0f1c5" containerID="e524cf3a65249e910dd3993552844c83160b48ba3994859b492fc69d62743679" exitCode=0 Mar 10 08:26:04 crc kubenswrapper[4825]: I0310 08:26:04.329966 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5958856c47-pd5nn" event={"ID":"e59d4d2c-39bd-4900-8777-be4beac0f1c5","Type":"ContainerDied","Data":"e524cf3a65249e910dd3993552844c83160b48ba3994859b492fc69d62743679"} Mar 10 08:26:04 crc 
kubenswrapper[4825]: I0310 08:26:04.335445 4825 generic.go:334] "Generic (PLEG): container finished" podID="8fc5e5dd-4c5e-4729-8750-b65c6e70e83d" containerID="10d28f408366d3a98aa2961943ac9fe1c11d4640f9c9e39ef75970fa87943e77" exitCode=0 Mar 10 08:26:04 crc kubenswrapper[4825]: I0310 08:26:04.335528 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552186-z88l7" event={"ID":"8fc5e5dd-4c5e-4729-8750-b65c6e70e83d","Type":"ContainerDied","Data":"10d28f408366d3a98aa2961943ac9fe1c11d4640f9c9e39ef75970fa87943e77"} Mar 10 08:26:05 crc kubenswrapper[4825]: I0310 08:26:05.249791 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6985674-aa5a-40c5-a384-04bac7fd0a1b" path="/var/lib/kubelet/pods/a6985674-aa5a-40c5-a384-04bac7fd0a1b/volumes" Mar 10 08:26:05 crc kubenswrapper[4825]: I0310 08:26:05.355340 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5958856c47-pd5nn" event={"ID":"e59d4d2c-39bd-4900-8777-be4beac0f1c5","Type":"ContainerStarted","Data":"bb4c6756e674f83794d5facc32e0f427c9031ffca78d6f57267c49fcefc61a8b"} Mar 10 08:26:05 crc kubenswrapper[4825]: I0310 08:26:05.355438 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5958856c47-pd5nn" Mar 10 08:26:05 crc kubenswrapper[4825]: I0310 08:26:05.381849 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5958856c47-pd5nn" podStartSLOduration=3.381832349 podStartE2EDuration="3.381832349s" podCreationTimestamp="2026-03-10 08:26:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 08:26:05.379061486 +0000 UTC m=+6118.408842121" watchObservedRunningTime="2026-03-10 08:26:05.381832349 +0000 UTC m=+6118.411612964" Mar 10 08:26:05 crc kubenswrapper[4825]: I0310 08:26:05.735099 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552186-z88l7" Mar 10 08:26:05 crc kubenswrapper[4825]: I0310 08:26:05.774956 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4xdm\" (UniqueName: \"kubernetes.io/projected/8fc5e5dd-4c5e-4729-8750-b65c6e70e83d-kube-api-access-d4xdm\") pod \"8fc5e5dd-4c5e-4729-8750-b65c6e70e83d\" (UID: \"8fc5e5dd-4c5e-4729-8750-b65c6e70e83d\") " Mar 10 08:26:05 crc kubenswrapper[4825]: I0310 08:26:05.810328 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fc5e5dd-4c5e-4729-8750-b65c6e70e83d-kube-api-access-d4xdm" (OuterVolumeSpecName: "kube-api-access-d4xdm") pod "8fc5e5dd-4c5e-4729-8750-b65c6e70e83d" (UID: "8fc5e5dd-4c5e-4729-8750-b65c6e70e83d"). InnerVolumeSpecName "kube-api-access-d4xdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:26:05 crc kubenswrapper[4825]: I0310 08:26:05.877870 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4xdm\" (UniqueName: \"kubernetes.io/projected/8fc5e5dd-4c5e-4729-8750-b65c6e70e83d-kube-api-access-d4xdm\") on node \"crc\" DevicePath \"\"" Mar 10 08:26:06 crc kubenswrapper[4825]: I0310 08:26:06.368267 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552186-z88l7" event={"ID":"8fc5e5dd-4c5e-4729-8750-b65c6e70e83d","Type":"ContainerDied","Data":"b16c5e8977c54d5f7b4775761af5d2d373740417c68133b0f3f56e49561cf9f1"} Mar 10 08:26:06 crc kubenswrapper[4825]: I0310 08:26:06.368336 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b16c5e8977c54d5f7b4775761af5d2d373740417c68133b0f3f56e49561cf9f1" Mar 10 08:26:06 crc kubenswrapper[4825]: I0310 08:26:06.369758 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552186-z88l7" Mar 10 08:26:06 crc kubenswrapper[4825]: I0310 08:26:06.415260 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552180-xsj9w"] Mar 10 08:26:06 crc kubenswrapper[4825]: I0310 08:26:06.423650 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552180-xsj9w"] Mar 10 08:26:07 crc kubenswrapper[4825]: I0310 08:26:07.259574 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7bac17e-a1d4-4938-b9cc-7e9590710245" path="/var/lib/kubelet/pods/d7bac17e-a1d4-4938-b9cc-7e9590710245/volumes" Mar 10 08:26:12 crc kubenswrapper[4825]: I0310 08:26:12.606297 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5958856c47-pd5nn" Mar 10 08:26:12 crc kubenswrapper[4825]: I0310 08:26:12.711122 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76cb956b75-tw8ht"] Mar 10 08:26:12 crc kubenswrapper[4825]: I0310 08:26:12.711420 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76cb956b75-tw8ht" podUID="9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35" containerName="dnsmasq-dns" containerID="cri-o://9e15f6b822363005b80d867b0eb4d4d79f35358c003ef050180166e4bbe5edf3" gracePeriod=10 Mar 10 08:26:13 crc kubenswrapper[4825]: I0310 08:26:13.174793 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76cb956b75-tw8ht" Mar 10 08:26:13 crc kubenswrapper[4825]: I0310 08:26:13.246720 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35-config\") pod \"9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35\" (UID: \"9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35\") " Mar 10 08:26:13 crc kubenswrapper[4825]: I0310 08:26:13.248644 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35-ovsdbserver-sb\") pod \"9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35\" (UID: \"9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35\") " Mar 10 08:26:13 crc kubenswrapper[4825]: I0310 08:26:13.248745 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfdf\" (UniqueName: \"kubernetes.io/projected/9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35-kube-api-access-htfdf\") pod \"9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35\" (UID: \"9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35\") " Mar 10 08:26:13 crc kubenswrapper[4825]: I0310 08:26:13.248833 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35-ovsdbserver-nb\") pod \"9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35\" (UID: \"9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35\") " Mar 10 08:26:13 crc kubenswrapper[4825]: I0310 08:26:13.248856 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35-dns-svc\") pod \"9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35\" (UID: \"9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35\") " Mar 10 08:26:13 crc kubenswrapper[4825]: I0310 08:26:13.249002 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" 
(UniqueName: \"kubernetes.io/configmap/9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35-openstack-cell1\") pod \"9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35\" (UID: \"9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35\") " Mar 10 08:26:13 crc kubenswrapper[4825]: I0310 08:26:13.258488 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35-kube-api-access-htfdf" (OuterVolumeSpecName: "kube-api-access-htfdf") pod "9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35" (UID: "9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35"). InnerVolumeSpecName "kube-api-access-htfdf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:26:13 crc kubenswrapper[4825]: I0310 08:26:13.317494 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35" (UID: "9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:26:13 crc kubenswrapper[4825]: I0310 08:26:13.324287 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35" (UID: "9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:26:13 crc kubenswrapper[4825]: I0310 08:26:13.334226 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35" (UID: "9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35"). InnerVolumeSpecName "openstack-cell1". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:26:13 crc kubenswrapper[4825]: I0310 08:26:13.341614 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35" (UID: "9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:26:13 crc kubenswrapper[4825]: I0310 08:26:13.350788 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35-config" (OuterVolumeSpecName: "config") pod "9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35" (UID: "9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:26:13 crc kubenswrapper[4825]: I0310 08:26:13.351212 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35-config\") pod \"9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35\" (UID: \"9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35\") " Mar 10 08:26:13 crc kubenswrapper[4825]: W0310 08:26:13.351534 4825 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35/volumes/kubernetes.io~configmap/config Mar 10 08:26:13 crc kubenswrapper[4825]: I0310 08:26:13.351570 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35-config" (OuterVolumeSpecName: "config") pod "9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35" (UID: "9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:26:13 crc kubenswrapper[4825]: I0310 08:26:13.353103 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfdf\" (UniqueName: \"kubernetes.io/projected/9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35-kube-api-access-htfdf\") on node \"crc\" DevicePath \"\"" Mar 10 08:26:13 crc kubenswrapper[4825]: I0310 08:26:13.353122 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 08:26:13 crc kubenswrapper[4825]: I0310 08:26:13.353160 4825 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 08:26:13 crc kubenswrapper[4825]: I0310 08:26:13.353170 4825 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 10 08:26:13 crc kubenswrapper[4825]: I0310 08:26:13.353178 4825 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35-config\") on node \"crc\" DevicePath \"\"" Mar 10 08:26:13 crc kubenswrapper[4825]: I0310 08:26:13.353185 4825 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 08:26:13 crc kubenswrapper[4825]: I0310 08:26:13.444843 4825 generic.go:334] "Generic (PLEG): container finished" podID="9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35" containerID="9e15f6b822363005b80d867b0eb4d4d79f35358c003ef050180166e4bbe5edf3" exitCode=0 Mar 10 08:26:13 crc kubenswrapper[4825]: I0310 08:26:13.444884 4825 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76cb956b75-tw8ht" event={"ID":"9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35","Type":"ContainerDied","Data":"9e15f6b822363005b80d867b0eb4d4d79f35358c003ef050180166e4bbe5edf3"} Mar 10 08:26:13 crc kubenswrapper[4825]: I0310 08:26:13.444912 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76cb956b75-tw8ht" event={"ID":"9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35","Type":"ContainerDied","Data":"47a3e9ef233ee652d78c97e9b1bb36316e4d25fb8097775ea6d919a17330e4b0"} Mar 10 08:26:13 crc kubenswrapper[4825]: I0310 08:26:13.444928 4825 scope.go:117] "RemoveContainer" containerID="9e15f6b822363005b80d867b0eb4d4d79f35358c003ef050180166e4bbe5edf3" Mar 10 08:26:13 crc kubenswrapper[4825]: I0310 08:26:13.444947 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76cb956b75-tw8ht" Mar 10 08:26:13 crc kubenswrapper[4825]: I0310 08:26:13.476067 4825 scope.go:117] "RemoveContainer" containerID="993c49019b0e0ca2b1132e5af69a90c51894e140101cb45bfd177ce3edf919d8" Mar 10 08:26:13 crc kubenswrapper[4825]: I0310 08:26:13.480801 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76cb956b75-tw8ht"] Mar 10 08:26:13 crc kubenswrapper[4825]: I0310 08:26:13.490088 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76cb956b75-tw8ht"] Mar 10 08:26:13 crc kubenswrapper[4825]: I0310 08:26:13.498567 4825 scope.go:117] "RemoveContainer" containerID="9e15f6b822363005b80d867b0eb4d4d79f35358c003ef050180166e4bbe5edf3" Mar 10 08:26:13 crc kubenswrapper[4825]: E0310 08:26:13.499120 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e15f6b822363005b80d867b0eb4d4d79f35358c003ef050180166e4bbe5edf3\": container with ID starting with 9e15f6b822363005b80d867b0eb4d4d79f35358c003ef050180166e4bbe5edf3 not found: ID does not exist" 
containerID="9e15f6b822363005b80d867b0eb4d4d79f35358c003ef050180166e4bbe5edf3" Mar 10 08:26:13 crc kubenswrapper[4825]: I0310 08:26:13.499190 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e15f6b822363005b80d867b0eb4d4d79f35358c003ef050180166e4bbe5edf3"} err="failed to get container status \"9e15f6b822363005b80d867b0eb4d4d79f35358c003ef050180166e4bbe5edf3\": rpc error: code = NotFound desc = could not find container \"9e15f6b822363005b80d867b0eb4d4d79f35358c003ef050180166e4bbe5edf3\": container with ID starting with 9e15f6b822363005b80d867b0eb4d4d79f35358c003ef050180166e4bbe5edf3 not found: ID does not exist" Mar 10 08:26:13 crc kubenswrapper[4825]: I0310 08:26:13.499217 4825 scope.go:117] "RemoveContainer" containerID="993c49019b0e0ca2b1132e5af69a90c51894e140101cb45bfd177ce3edf919d8" Mar 10 08:26:13 crc kubenswrapper[4825]: E0310 08:26:13.499701 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"993c49019b0e0ca2b1132e5af69a90c51894e140101cb45bfd177ce3edf919d8\": container with ID starting with 993c49019b0e0ca2b1132e5af69a90c51894e140101cb45bfd177ce3edf919d8 not found: ID does not exist" containerID="993c49019b0e0ca2b1132e5af69a90c51894e140101cb45bfd177ce3edf919d8" Mar 10 08:26:13 crc kubenswrapper[4825]: I0310 08:26:13.499730 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"993c49019b0e0ca2b1132e5af69a90c51894e140101cb45bfd177ce3edf919d8"} err="failed to get container status \"993c49019b0e0ca2b1132e5af69a90c51894e140101cb45bfd177ce3edf919d8\": rpc error: code = NotFound desc = could not find container \"993c49019b0e0ca2b1132e5af69a90c51894e140101cb45bfd177ce3edf919d8\": container with ID starting with 993c49019b0e0ca2b1132e5af69a90c51894e140101cb45bfd177ce3edf919d8 not found: ID does not exist" Mar 10 08:26:15 crc kubenswrapper[4825]: I0310 08:26:15.236974 4825 scope.go:117] 
"RemoveContainer" containerID="e66b7b78375632673167d52203536b8000b1c9a9d311efe989c77daf24bd4cfc" Mar 10 08:26:15 crc kubenswrapper[4825]: E0310 08:26:15.238072 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:26:15 crc kubenswrapper[4825]: I0310 08:26:15.253647 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35" path="/var/lib/kubelet/pods/9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35/volumes" Mar 10 08:26:19 crc kubenswrapper[4825]: I0310 08:26:19.034325 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-zhwf4"] Mar 10 08:26:19 crc kubenswrapper[4825]: I0310 08:26:19.048165 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-zhwf4"] Mar 10 08:26:19 crc kubenswrapper[4825]: I0310 08:26:19.256988 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ed01571-e0e3-438d-ab0e-3e0a85675f29" path="/var/lib/kubelet/pods/7ed01571-e0e3-438d-ab0e-3e0a85675f29/volumes" Mar 10 08:26:23 crc kubenswrapper[4825]: I0310 08:26:23.105614 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cfsb8x"] Mar 10 08:26:23 crc kubenswrapper[4825]: E0310 08:26:23.116651 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6985674-aa5a-40c5-a384-04bac7fd0a1b" containerName="dnsmasq-dns" Mar 10 08:26:23 crc kubenswrapper[4825]: I0310 08:26:23.116690 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6985674-aa5a-40c5-a384-04bac7fd0a1b" containerName="dnsmasq-dns" Mar 10 
08:26:23 crc kubenswrapper[4825]: E0310 08:26:23.116698 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fc5e5dd-4c5e-4729-8750-b65c6e70e83d" containerName="oc" Mar 10 08:26:23 crc kubenswrapper[4825]: I0310 08:26:23.116704 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fc5e5dd-4c5e-4729-8750-b65c6e70e83d" containerName="oc" Mar 10 08:26:23 crc kubenswrapper[4825]: E0310 08:26:23.116717 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6985674-aa5a-40c5-a384-04bac7fd0a1b" containerName="init" Mar 10 08:26:23 crc kubenswrapper[4825]: I0310 08:26:23.116722 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6985674-aa5a-40c5-a384-04bac7fd0a1b" containerName="init" Mar 10 08:26:23 crc kubenswrapper[4825]: E0310 08:26:23.116730 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35" containerName="dnsmasq-dns" Mar 10 08:26:23 crc kubenswrapper[4825]: I0310 08:26:23.116737 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35" containerName="dnsmasq-dns" Mar 10 08:26:23 crc kubenswrapper[4825]: E0310 08:26:23.116749 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35" containerName="init" Mar 10 08:26:23 crc kubenswrapper[4825]: I0310 08:26:23.116756 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35" containerName="init" Mar 10 08:26:23 crc kubenswrapper[4825]: I0310 08:26:23.117022 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6985674-aa5a-40c5-a384-04bac7fd0a1b" containerName="dnsmasq-dns" Mar 10 08:26:23 crc kubenswrapper[4825]: I0310 08:26:23.117034 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fc5e5dd-4c5e-4729-8750-b65c6e70e83d" containerName="oc" Mar 10 08:26:23 crc kubenswrapper[4825]: I0310 08:26:23.117055 4825 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="9ac4bc4c-ab7c-45e0-b862-4c33bb46ae35" containerName="dnsmasq-dns" Mar 10 08:26:23 crc kubenswrapper[4825]: I0310 08:26:23.117712 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cfsb8x"] Mar 10 08:26:23 crc kubenswrapper[4825]: I0310 08:26:23.117792 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cfsb8x" Mar 10 08:26:23 crc kubenswrapper[4825]: I0310 08:26:23.121201 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 08:26:23 crc kubenswrapper[4825]: I0310 08:26:23.121641 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 10 08:26:23 crc kubenswrapper[4825]: I0310 08:26:23.121872 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 10 08:26:23 crc kubenswrapper[4825]: I0310 08:26:23.122610 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-5twvv" Mar 10 08:26:23 crc kubenswrapper[4825]: I0310 08:26:23.186688 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04196610-0f20-46dc-b6e9-2e0d1b62342d-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cfsb8x\" (UID: \"04196610-0f20-46dc-b6e9-2e0d1b62342d\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cfsb8x" Mar 10 08:26:23 crc kubenswrapper[4825]: I0310 08:26:23.186800 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpxt5\" (UniqueName: 
\"kubernetes.io/projected/04196610-0f20-46dc-b6e9-2e0d1b62342d-kube-api-access-kpxt5\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cfsb8x\" (UID: \"04196610-0f20-46dc-b6e9-2e0d1b62342d\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cfsb8x" Mar 10 08:26:23 crc kubenswrapper[4825]: I0310 08:26:23.186833 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04196610-0f20-46dc-b6e9-2e0d1b62342d-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cfsb8x\" (UID: \"04196610-0f20-46dc-b6e9-2e0d1b62342d\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cfsb8x" Mar 10 08:26:23 crc kubenswrapper[4825]: I0310 08:26:23.187010 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/04196610-0f20-46dc-b6e9-2e0d1b62342d-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cfsb8x\" (UID: \"04196610-0f20-46dc-b6e9-2e0d1b62342d\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cfsb8x" Mar 10 08:26:23 crc kubenswrapper[4825]: I0310 08:26:23.289208 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04196610-0f20-46dc-b6e9-2e0d1b62342d-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cfsb8x\" (UID: \"04196610-0f20-46dc-b6e9-2e0d1b62342d\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cfsb8x" Mar 10 08:26:23 crc kubenswrapper[4825]: I0310 08:26:23.289391 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpxt5\" (UniqueName: 
\"kubernetes.io/projected/04196610-0f20-46dc-b6e9-2e0d1b62342d-kube-api-access-kpxt5\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cfsb8x\" (UID: \"04196610-0f20-46dc-b6e9-2e0d1b62342d\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cfsb8x" Mar 10 08:26:23 crc kubenswrapper[4825]: I0310 08:26:23.289440 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04196610-0f20-46dc-b6e9-2e0d1b62342d-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cfsb8x\" (UID: \"04196610-0f20-46dc-b6e9-2e0d1b62342d\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cfsb8x" Mar 10 08:26:23 crc kubenswrapper[4825]: I0310 08:26:23.289554 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/04196610-0f20-46dc-b6e9-2e0d1b62342d-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cfsb8x\" (UID: \"04196610-0f20-46dc-b6e9-2e0d1b62342d\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cfsb8x" Mar 10 08:26:23 crc kubenswrapper[4825]: I0310 08:26:23.295999 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/04196610-0f20-46dc-b6e9-2e0d1b62342d-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cfsb8x\" (UID: \"04196610-0f20-46dc-b6e9-2e0d1b62342d\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cfsb8x" Mar 10 08:26:23 crc kubenswrapper[4825]: I0310 08:26:23.296221 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04196610-0f20-46dc-b6e9-2e0d1b62342d-pre-adoption-validation-combined-ca-bundle\") pod 
\"pre-adoption-validation-openstack-pre-adoption-openstack-cfsb8x\" (UID: \"04196610-0f20-46dc-b6e9-2e0d1b62342d\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cfsb8x" Mar 10 08:26:23 crc kubenswrapper[4825]: I0310 08:26:23.299265 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04196610-0f20-46dc-b6e9-2e0d1b62342d-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cfsb8x\" (UID: \"04196610-0f20-46dc-b6e9-2e0d1b62342d\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cfsb8x" Mar 10 08:26:23 crc kubenswrapper[4825]: I0310 08:26:23.308536 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpxt5\" (UniqueName: \"kubernetes.io/projected/04196610-0f20-46dc-b6e9-2e0d1b62342d-kube-api-access-kpxt5\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cfsb8x\" (UID: \"04196610-0f20-46dc-b6e9-2e0d1b62342d\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cfsb8x" Mar 10 08:26:23 crc kubenswrapper[4825]: I0310 08:26:23.472786 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cfsb8x" Mar 10 08:26:24 crc kubenswrapper[4825]: I0310 08:26:24.534865 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cfsb8x"] Mar 10 08:26:24 crc kubenswrapper[4825]: W0310 08:26:24.538526 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04196610_0f20_46dc_b6e9_2e0d1b62342d.slice/crio-e1ebf909729d445d9b170430dce36047dfd5d5295e1b8b421dc8aaf51b09b3f9 WatchSource:0}: Error finding container e1ebf909729d445d9b170430dce36047dfd5d5295e1b8b421dc8aaf51b09b3f9: Status 404 returned error can't find the container with id e1ebf909729d445d9b170430dce36047dfd5d5295e1b8b421dc8aaf51b09b3f9 Mar 10 08:26:24 crc kubenswrapper[4825]: I0310 08:26:24.572042 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cfsb8x" event={"ID":"04196610-0f20-46dc-b6e9-2e0d1b62342d","Type":"ContainerStarted","Data":"e1ebf909729d445d9b170430dce36047dfd5d5295e1b8b421dc8aaf51b09b3f9"} Mar 10 08:26:28 crc kubenswrapper[4825]: I0310 08:26:28.236749 4825 scope.go:117] "RemoveContainer" containerID="e66b7b78375632673167d52203536b8000b1c9a9d311efe989c77daf24bd4cfc" Mar 10 08:26:28 crc kubenswrapper[4825]: E0310 08:26:28.237365 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:26:33 crc kubenswrapper[4825]: I0310 08:26:33.656196 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cfsb8x" event={"ID":"04196610-0f20-46dc-b6e9-2e0d1b62342d","Type":"ContainerStarted","Data":"803d7dc4424ffd38b4938c3ec806c833c4350d7175b619e440f55d9b49e209e3"} Mar 10 08:26:33 crc kubenswrapper[4825]: I0310 08:26:33.678491 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cfsb8x" podStartSLOduration=2.51980819 podStartE2EDuration="10.678441345s" podCreationTimestamp="2026-03-10 08:26:23 +0000 UTC" firstStartedPulling="2026-03-10 08:26:24.541564917 +0000 UTC m=+6137.571345562" lastFinishedPulling="2026-03-10 08:26:32.700198102 +0000 UTC m=+6145.729978717" observedRunningTime="2026-03-10 08:26:33.674005609 +0000 UTC m=+6146.703786224" watchObservedRunningTime="2026-03-10 08:26:33.678441345 +0000 UTC m=+6146.708221960" Mar 10 08:26:40 crc kubenswrapper[4825]: I0310 08:26:40.237087 4825 scope.go:117] "RemoveContainer" containerID="e66b7b78375632673167d52203536b8000b1c9a9d311efe989c77daf24bd4cfc" Mar 10 08:26:40 crc kubenswrapper[4825]: E0310 08:26:40.238068 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:26:40 crc kubenswrapper[4825]: I0310 08:26:40.249495 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wv7xp"] Mar 10 08:26:40 crc kubenswrapper[4825]: I0310 08:26:40.251831 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wv7xp" Mar 10 08:26:40 crc kubenswrapper[4825]: I0310 08:26:40.270287 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wv7xp"] Mar 10 08:26:40 crc kubenswrapper[4825]: I0310 08:26:40.345832 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abefed0f-b590-471d-97d3-e6ee3ad9e148-catalog-content\") pod \"redhat-operators-wv7xp\" (UID: \"abefed0f-b590-471d-97d3-e6ee3ad9e148\") " pod="openshift-marketplace/redhat-operators-wv7xp" Mar 10 08:26:40 crc kubenswrapper[4825]: I0310 08:26:40.345876 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkb9q\" (UniqueName: \"kubernetes.io/projected/abefed0f-b590-471d-97d3-e6ee3ad9e148-kube-api-access-bkb9q\") pod \"redhat-operators-wv7xp\" (UID: \"abefed0f-b590-471d-97d3-e6ee3ad9e148\") " pod="openshift-marketplace/redhat-operators-wv7xp" Mar 10 08:26:40 crc kubenswrapper[4825]: I0310 08:26:40.345930 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abefed0f-b590-471d-97d3-e6ee3ad9e148-utilities\") pod \"redhat-operators-wv7xp\" (UID: \"abefed0f-b590-471d-97d3-e6ee3ad9e148\") " pod="openshift-marketplace/redhat-operators-wv7xp" Mar 10 08:26:40 crc kubenswrapper[4825]: I0310 08:26:40.448482 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abefed0f-b590-471d-97d3-e6ee3ad9e148-catalog-content\") pod \"redhat-operators-wv7xp\" (UID: \"abefed0f-b590-471d-97d3-e6ee3ad9e148\") " pod="openshift-marketplace/redhat-operators-wv7xp" Mar 10 08:26:40 crc kubenswrapper[4825]: I0310 08:26:40.449085 4825 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-bkb9q\" (UniqueName: \"kubernetes.io/projected/abefed0f-b590-471d-97d3-e6ee3ad9e148-kube-api-access-bkb9q\") pod \"redhat-operators-wv7xp\" (UID: \"abefed0f-b590-471d-97d3-e6ee3ad9e148\") " pod="openshift-marketplace/redhat-operators-wv7xp" Mar 10 08:26:40 crc kubenswrapper[4825]: I0310 08:26:40.449205 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abefed0f-b590-471d-97d3-e6ee3ad9e148-utilities\") pod \"redhat-operators-wv7xp\" (UID: \"abefed0f-b590-471d-97d3-e6ee3ad9e148\") " pod="openshift-marketplace/redhat-operators-wv7xp" Mar 10 08:26:40 crc kubenswrapper[4825]: I0310 08:26:40.449355 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abefed0f-b590-471d-97d3-e6ee3ad9e148-catalog-content\") pod \"redhat-operators-wv7xp\" (UID: \"abefed0f-b590-471d-97d3-e6ee3ad9e148\") " pod="openshift-marketplace/redhat-operators-wv7xp" Mar 10 08:26:40 crc kubenswrapper[4825]: I0310 08:26:40.449870 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abefed0f-b590-471d-97d3-e6ee3ad9e148-utilities\") pod \"redhat-operators-wv7xp\" (UID: \"abefed0f-b590-471d-97d3-e6ee3ad9e148\") " pod="openshift-marketplace/redhat-operators-wv7xp" Mar 10 08:26:40 crc kubenswrapper[4825]: I0310 08:26:40.471498 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkb9q\" (UniqueName: \"kubernetes.io/projected/abefed0f-b590-471d-97d3-e6ee3ad9e148-kube-api-access-bkb9q\") pod \"redhat-operators-wv7xp\" (UID: \"abefed0f-b590-471d-97d3-e6ee3ad9e148\") " pod="openshift-marketplace/redhat-operators-wv7xp" Mar 10 08:26:40 crc kubenswrapper[4825]: I0310 08:26:40.579157 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wv7xp" Mar 10 08:26:41 crc kubenswrapper[4825]: I0310 08:26:41.037170 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wv7xp"] Mar 10 08:26:41 crc kubenswrapper[4825]: W0310 08:26:41.043648 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabefed0f_b590_471d_97d3_e6ee3ad9e148.slice/crio-1c42ca67d6c222a138a626496cbebbb6c4434053f208b71403576b04ce53fff5 WatchSource:0}: Error finding container 1c42ca67d6c222a138a626496cbebbb6c4434053f208b71403576b04ce53fff5: Status 404 returned error can't find the container with id 1c42ca67d6c222a138a626496cbebbb6c4434053f208b71403576b04ce53fff5 Mar 10 08:26:41 crc kubenswrapper[4825]: I0310 08:26:41.736528 4825 generic.go:334] "Generic (PLEG): container finished" podID="abefed0f-b590-471d-97d3-e6ee3ad9e148" containerID="a45447c1843cefb0e47ff3d5d67ede0930cfc0b3a282d4594d7c2c4dda6f4b45" exitCode=0 Mar 10 08:26:41 crc kubenswrapper[4825]: I0310 08:26:41.736567 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wv7xp" event={"ID":"abefed0f-b590-471d-97d3-e6ee3ad9e148","Type":"ContainerDied","Data":"a45447c1843cefb0e47ff3d5d67ede0930cfc0b3a282d4594d7c2c4dda6f4b45"} Mar 10 08:26:41 crc kubenswrapper[4825]: I0310 08:26:41.736830 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wv7xp" event={"ID":"abefed0f-b590-471d-97d3-e6ee3ad9e148","Type":"ContainerStarted","Data":"1c42ca67d6c222a138a626496cbebbb6c4434053f208b71403576b04ce53fff5"} Mar 10 08:26:42 crc kubenswrapper[4825]: I0310 08:26:42.747237 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wv7xp" 
event={"ID":"abefed0f-b590-471d-97d3-e6ee3ad9e148","Type":"ContainerStarted","Data":"24650c0f5099e08ef50bc1c32eb1d791dd1b15c54c0c1ff4da0650a3c64da8cf"} Mar 10 08:26:46 crc kubenswrapper[4825]: I0310 08:26:46.785575 4825 generic.go:334] "Generic (PLEG): container finished" podID="04196610-0f20-46dc-b6e9-2e0d1b62342d" containerID="803d7dc4424ffd38b4938c3ec806c833c4350d7175b619e440f55d9b49e209e3" exitCode=0 Mar 10 08:26:46 crc kubenswrapper[4825]: I0310 08:26:46.786161 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cfsb8x" event={"ID":"04196610-0f20-46dc-b6e9-2e0d1b62342d","Type":"ContainerDied","Data":"803d7dc4424ffd38b4938c3ec806c833c4350d7175b619e440f55d9b49e209e3"} Mar 10 08:26:48 crc kubenswrapper[4825]: I0310 08:26:48.041858 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-hbbvh"] Mar 10 08:26:48 crc kubenswrapper[4825]: I0310 08:26:48.051651 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-71b1-account-create-update-w4phr"] Mar 10 08:26:48 crc kubenswrapper[4825]: I0310 08:26:48.062530 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-hbbvh"] Mar 10 08:26:48 crc kubenswrapper[4825]: I0310 08:26:48.071301 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-71b1-account-create-update-w4phr"] Mar 10 08:26:48 crc kubenswrapper[4825]: I0310 08:26:48.382934 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cfsb8x" Mar 10 08:26:48 crc kubenswrapper[4825]: I0310 08:26:48.488524 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04196610-0f20-46dc-b6e9-2e0d1b62342d-pre-adoption-validation-combined-ca-bundle\") pod \"04196610-0f20-46dc-b6e9-2e0d1b62342d\" (UID: \"04196610-0f20-46dc-b6e9-2e0d1b62342d\") " Mar 10 08:26:48 crc kubenswrapper[4825]: I0310 08:26:48.488650 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04196610-0f20-46dc-b6e9-2e0d1b62342d-inventory\") pod \"04196610-0f20-46dc-b6e9-2e0d1b62342d\" (UID: \"04196610-0f20-46dc-b6e9-2e0d1b62342d\") " Mar 10 08:26:48 crc kubenswrapper[4825]: I0310 08:26:48.488725 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/04196610-0f20-46dc-b6e9-2e0d1b62342d-ssh-key-openstack-cell1\") pod \"04196610-0f20-46dc-b6e9-2e0d1b62342d\" (UID: \"04196610-0f20-46dc-b6e9-2e0d1b62342d\") " Mar 10 08:26:48 crc kubenswrapper[4825]: I0310 08:26:48.488871 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpxt5\" (UniqueName: \"kubernetes.io/projected/04196610-0f20-46dc-b6e9-2e0d1b62342d-kube-api-access-kpxt5\") pod \"04196610-0f20-46dc-b6e9-2e0d1b62342d\" (UID: \"04196610-0f20-46dc-b6e9-2e0d1b62342d\") " Mar 10 08:26:48 crc kubenswrapper[4825]: I0310 08:26:48.494485 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04196610-0f20-46dc-b6e9-2e0d1b62342d-kube-api-access-kpxt5" (OuterVolumeSpecName: "kube-api-access-kpxt5") pod "04196610-0f20-46dc-b6e9-2e0d1b62342d" (UID: "04196610-0f20-46dc-b6e9-2e0d1b62342d"). InnerVolumeSpecName "kube-api-access-kpxt5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:26:48 crc kubenswrapper[4825]: I0310 08:26:48.496248 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04196610-0f20-46dc-b6e9-2e0d1b62342d-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "04196610-0f20-46dc-b6e9-2e0d1b62342d" (UID: "04196610-0f20-46dc-b6e9-2e0d1b62342d"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:26:48 crc kubenswrapper[4825]: I0310 08:26:48.530340 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04196610-0f20-46dc-b6e9-2e0d1b62342d-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "04196610-0f20-46dc-b6e9-2e0d1b62342d" (UID: "04196610-0f20-46dc-b6e9-2e0d1b62342d"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:26:48 crc kubenswrapper[4825]: I0310 08:26:48.537427 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04196610-0f20-46dc-b6e9-2e0d1b62342d-inventory" (OuterVolumeSpecName: "inventory") pod "04196610-0f20-46dc-b6e9-2e0d1b62342d" (UID: "04196610-0f20-46dc-b6e9-2e0d1b62342d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:26:48 crc kubenswrapper[4825]: I0310 08:26:48.591549 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpxt5\" (UniqueName: \"kubernetes.io/projected/04196610-0f20-46dc-b6e9-2e0d1b62342d-kube-api-access-kpxt5\") on node \"crc\" DevicePath \"\"" Mar 10 08:26:48 crc kubenswrapper[4825]: I0310 08:26:48.591591 4825 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04196610-0f20-46dc-b6e9-2e0d1b62342d-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:26:48 crc kubenswrapper[4825]: I0310 08:26:48.591604 4825 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04196610-0f20-46dc-b6e9-2e0d1b62342d-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 08:26:48 crc kubenswrapper[4825]: I0310 08:26:48.591613 4825 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/04196610-0f20-46dc-b6e9-2e0d1b62342d-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 10 08:26:48 crc kubenswrapper[4825]: I0310 08:26:48.805887 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cfsb8x" Mar 10 08:26:48 crc kubenswrapper[4825]: I0310 08:26:48.805903 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cfsb8x" event={"ID":"04196610-0f20-46dc-b6e9-2e0d1b62342d","Type":"ContainerDied","Data":"e1ebf909729d445d9b170430dce36047dfd5d5295e1b8b421dc8aaf51b09b3f9"} Mar 10 08:26:48 crc kubenswrapper[4825]: I0310 08:26:48.805940 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1ebf909729d445d9b170430dce36047dfd5d5295e1b8b421dc8aaf51b09b3f9" Mar 10 08:26:48 crc kubenswrapper[4825]: I0310 08:26:48.809501 4825 generic.go:334] "Generic (PLEG): container finished" podID="abefed0f-b590-471d-97d3-e6ee3ad9e148" containerID="24650c0f5099e08ef50bc1c32eb1d791dd1b15c54c0c1ff4da0650a3c64da8cf" exitCode=0 Mar 10 08:26:48 crc kubenswrapper[4825]: I0310 08:26:48.809534 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wv7xp" event={"ID":"abefed0f-b590-471d-97d3-e6ee3ad9e148","Type":"ContainerDied","Data":"24650c0f5099e08ef50bc1c32eb1d791dd1b15c54c0c1ff4da0650a3c64da8cf"} Mar 10 08:26:49 crc kubenswrapper[4825]: I0310 08:26:49.247824 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f9c69e7-8f45-4b9e-abe4-54f811f4227d" path="/var/lib/kubelet/pods/6f9c69e7-8f45-4b9e-abe4-54f811f4227d/volumes" Mar 10 08:26:49 crc kubenswrapper[4825]: I0310 08:26:49.248922 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7c1a03f-33d6-4b82-92c2-05a42b793e86" path="/var/lib/kubelet/pods/c7c1a03f-33d6-4b82-92c2-05a42b793e86/volumes" Mar 10 08:26:50 crc kubenswrapper[4825]: I0310 08:26:50.840667 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wv7xp" 
event={"ID":"abefed0f-b590-471d-97d3-e6ee3ad9e148","Type":"ContainerStarted","Data":"e1b9cd4f84eeb5e090a05ecb2d7f9131591ae2fd8e55043e0e8ae67c2c626c14"} Mar 10 08:26:50 crc kubenswrapper[4825]: I0310 08:26:50.869410 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wv7xp" podStartSLOduration=2.372096906 podStartE2EDuration="10.869385029s" podCreationTimestamp="2026-03-10 08:26:40 +0000 UTC" firstStartedPulling="2026-03-10 08:26:41.738994481 +0000 UTC m=+6154.768775096" lastFinishedPulling="2026-03-10 08:26:50.236282604 +0000 UTC m=+6163.266063219" observedRunningTime="2026-03-10 08:26:50.855796245 +0000 UTC m=+6163.885576890" watchObservedRunningTime="2026-03-10 08:26:50.869385029 +0000 UTC m=+6163.899165684" Mar 10 08:26:51 crc kubenswrapper[4825]: I0310 08:26:51.018432 4825 scope.go:117] "RemoveContainer" containerID="8b6d793b02a03d2aa4759b0c90664b47e6964de5a55678386bfc749d3b063e35" Mar 10 08:26:51 crc kubenswrapper[4825]: I0310 08:26:51.078954 4825 scope.go:117] "RemoveContainer" containerID="b10ac4feb1521b1355959eafff1beaf457f3ea9f88c00c84f17ec1a7addda1d9" Mar 10 08:26:51 crc kubenswrapper[4825]: I0310 08:26:51.113455 4825 scope.go:117] "RemoveContainer" containerID="89a0d60e3882cad80ee3fc44a51053fcea4ea3f6161e8c265fd82229de0d004c" Mar 10 08:26:51 crc kubenswrapper[4825]: I0310 08:26:51.183945 4825 scope.go:117] "RemoveContainer" containerID="86985ca1303771554b9fa682859ad2aff5f3f8c84e7991b815c56606593d3700" Mar 10 08:26:51 crc kubenswrapper[4825]: I0310 08:26:51.222818 4825 scope.go:117] "RemoveContainer" containerID="5b9928f32ac768c9046299d5bb4a58704c39fc658abb72c248435f228cbe8e56" Mar 10 08:26:51 crc kubenswrapper[4825]: I0310 08:26:51.303823 4825 scope.go:117] "RemoveContainer" containerID="85df29cb665ff0979047be5868bad23878444fa169690ff6e59018560293c3c2" Mar 10 08:26:51 crc kubenswrapper[4825]: I0310 08:26:51.512780 4825 scope.go:117] "RemoveContainer" 
containerID="3db06b76d50c0663942827bc9ceda05627ed36845b18c3c0d089df96a737b90d" Mar 10 08:26:51 crc kubenswrapper[4825]: I0310 08:26:51.576108 4825 scope.go:117] "RemoveContainer" containerID="7ca55dea021557f84473516973ad447b63260d9147ae4bea8645ea07f90ad565" Mar 10 08:26:55 crc kubenswrapper[4825]: I0310 08:26:55.236879 4825 scope.go:117] "RemoveContainer" containerID="e66b7b78375632673167d52203536b8000b1c9a9d311efe989c77daf24bd4cfc" Mar 10 08:26:55 crc kubenswrapper[4825]: E0310 08:26:55.237667 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:26:56 crc kubenswrapper[4825]: I0310 08:26:56.255431 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-wv5bk"] Mar 10 08:26:56 crc kubenswrapper[4825]: E0310 08:26:56.256284 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04196610-0f20-46dc-b6e9-2e0d1b62342d" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Mar 10 08:26:56 crc kubenswrapper[4825]: I0310 08:26:56.256309 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="04196610-0f20-46dc-b6e9-2e0d1b62342d" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Mar 10 08:26:56 crc kubenswrapper[4825]: I0310 08:26:56.256604 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="04196610-0f20-46dc-b6e9-2e0d1b62342d" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Mar 10 08:26:56 crc kubenswrapper[4825]: I0310 08:26:56.257572 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-wv5bk" Mar 10 08:26:56 crc kubenswrapper[4825]: I0310 08:26:56.266080 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-wv5bk"] Mar 10 08:26:56 crc kubenswrapper[4825]: I0310 08:26:56.292960 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 08:26:56 crc kubenswrapper[4825]: I0310 08:26:56.293032 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 10 08:26:56 crc kubenswrapper[4825]: I0310 08:26:56.293077 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 10 08:26:56 crc kubenswrapper[4825]: I0310 08:26:56.293249 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-5twvv" Mar 10 08:26:56 crc kubenswrapper[4825]: I0310 08:26:56.348467 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98107d82-46d1-4863-8c79-a10abec1737f-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-wv5bk\" (UID: \"98107d82-46d1-4863-8c79-a10abec1737f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-wv5bk" Mar 10 08:26:56 crc kubenswrapper[4825]: I0310 08:26:56.348563 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpk85\" (UniqueName: \"kubernetes.io/projected/98107d82-46d1-4863-8c79-a10abec1737f-kube-api-access-tpk85\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-wv5bk\" (UID: \"98107d82-46d1-4863-8c79-a10abec1737f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-wv5bk" Mar 10 08:26:56 crc kubenswrapper[4825]: I0310 08:26:56.348673 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98107d82-46d1-4863-8c79-a10abec1737f-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-wv5bk\" (UID: \"98107d82-46d1-4863-8c79-a10abec1737f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-wv5bk" Mar 10 08:26:56 crc kubenswrapper[4825]: I0310 08:26:56.348734 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/98107d82-46d1-4863-8c79-a10abec1737f-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-wv5bk\" (UID: \"98107d82-46d1-4863-8c79-a10abec1737f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-wv5bk" Mar 10 08:26:56 crc kubenswrapper[4825]: I0310 08:26:56.451619 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98107d82-46d1-4863-8c79-a10abec1737f-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-wv5bk\" (UID: \"98107d82-46d1-4863-8c79-a10abec1737f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-wv5bk" Mar 10 08:26:56 crc kubenswrapper[4825]: I0310 08:26:56.451722 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/98107d82-46d1-4863-8c79-a10abec1737f-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-wv5bk\" (UID: \"98107d82-46d1-4863-8c79-a10abec1737f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-wv5bk" Mar 10 08:26:56 crc kubenswrapper[4825]: I0310 08:26:56.451779 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/98107d82-46d1-4863-8c79-a10abec1737f-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-wv5bk\" (UID: \"98107d82-46d1-4863-8c79-a10abec1737f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-wv5bk" Mar 10 08:26:56 crc kubenswrapper[4825]: I0310 08:26:56.451846 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpk85\" (UniqueName: \"kubernetes.io/projected/98107d82-46d1-4863-8c79-a10abec1737f-kube-api-access-tpk85\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-wv5bk\" (UID: \"98107d82-46d1-4863-8c79-a10abec1737f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-wv5bk" Mar 10 08:26:56 crc kubenswrapper[4825]: I0310 08:26:56.458036 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98107d82-46d1-4863-8c79-a10abec1737f-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-wv5bk\" (UID: \"98107d82-46d1-4863-8c79-a10abec1737f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-wv5bk" Mar 10 08:26:56 crc kubenswrapper[4825]: I0310 08:26:56.461252 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98107d82-46d1-4863-8c79-a10abec1737f-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-wv5bk\" (UID: \"98107d82-46d1-4863-8c79-a10abec1737f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-wv5bk" Mar 10 08:26:56 crc kubenswrapper[4825]: I0310 08:26:56.462046 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/98107d82-46d1-4863-8c79-a10abec1737f-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-wv5bk\" (UID: \"98107d82-46d1-4863-8c79-a10abec1737f\") " 
pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-wv5bk" Mar 10 08:26:56 crc kubenswrapper[4825]: I0310 08:26:56.468984 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpk85\" (UniqueName: \"kubernetes.io/projected/98107d82-46d1-4863-8c79-a10abec1737f-kube-api-access-tpk85\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-wv5bk\" (UID: \"98107d82-46d1-4863-8c79-a10abec1737f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-wv5bk" Mar 10 08:26:56 crc kubenswrapper[4825]: I0310 08:26:56.616582 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-wv5bk" Mar 10 08:26:57 crc kubenswrapper[4825]: I0310 08:26:57.035759 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-2sjk7"] Mar 10 08:26:57 crc kubenswrapper[4825]: I0310 08:26:57.047327 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-2sjk7"] Mar 10 08:26:57 crc kubenswrapper[4825]: I0310 08:26:57.152515 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-wv5bk"] Mar 10 08:26:57 crc kubenswrapper[4825]: I0310 08:26:57.248959 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bd32bd9-5073-4d95-994e-a51836a51f67" path="/var/lib/kubelet/pods/3bd32bd9-5073-4d95-994e-a51836a51f67/volumes" Mar 10 08:26:57 crc kubenswrapper[4825]: I0310 08:26:57.915537 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-wv5bk" event={"ID":"98107d82-46d1-4863-8c79-a10abec1737f","Type":"ContainerStarted","Data":"453bac1b39ec329db42b048fca923fd22c77bac1b508358901caa62b14cd9720"} Mar 10 08:26:58 crc kubenswrapper[4825]: I0310 08:26:58.927043 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-wv5bk" event={"ID":"98107d82-46d1-4863-8c79-a10abec1737f","Type":"ContainerStarted","Data":"763e694dbcd38f37b0c74ddda85320c761e232c0be5811a4f7d05b3d8748ddfb"} Mar 10 08:26:58 crc kubenswrapper[4825]: I0310 08:26:58.961040 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-wv5bk" podStartSLOduration=2.362020744 podStartE2EDuration="2.961018485s" podCreationTimestamp="2026-03-10 08:26:56 +0000 UTC" firstStartedPulling="2026-03-10 08:26:57.158112353 +0000 UTC m=+6170.187892968" lastFinishedPulling="2026-03-10 08:26:57.757110094 +0000 UTC m=+6170.786890709" observedRunningTime="2026-03-10 08:26:58.947524164 +0000 UTC m=+6171.977304779" watchObservedRunningTime="2026-03-10 08:26:58.961018485 +0000 UTC m=+6171.990799100" Mar 10 08:27:00 crc kubenswrapper[4825]: I0310 08:27:00.580528 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wv7xp" Mar 10 08:27:00 crc kubenswrapper[4825]: I0310 08:27:00.580598 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wv7xp" Mar 10 08:27:01 crc kubenswrapper[4825]: I0310 08:27:01.626720 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wv7xp" podUID="abefed0f-b590-471d-97d3-e6ee3ad9e148" containerName="registry-server" probeResult="failure" output=< Mar 10 08:27:01 crc kubenswrapper[4825]: timeout: failed to connect service ":50051" within 1s Mar 10 08:27:01 crc kubenswrapper[4825]: > Mar 10 08:27:06 crc kubenswrapper[4825]: I0310 08:27:06.236835 4825 scope.go:117] "RemoveContainer" containerID="e66b7b78375632673167d52203536b8000b1c9a9d311efe989c77daf24bd4cfc" Mar 10 08:27:06 crc kubenswrapper[4825]: E0310 08:27:06.237536 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:27:11 crc kubenswrapper[4825]: I0310 08:27:11.630498 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wv7xp" podUID="abefed0f-b590-471d-97d3-e6ee3ad9e148" containerName="registry-server" probeResult="failure" output=< Mar 10 08:27:11 crc kubenswrapper[4825]: timeout: failed to connect service ":50051" within 1s Mar 10 08:27:11 crc kubenswrapper[4825]: > Mar 10 08:27:17 crc kubenswrapper[4825]: I0310 08:27:17.236922 4825 scope.go:117] "RemoveContainer" containerID="e66b7b78375632673167d52203536b8000b1c9a9d311efe989c77daf24bd4cfc" Mar 10 08:27:17 crc kubenswrapper[4825]: E0310 08:27:17.237624 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:27:20 crc kubenswrapper[4825]: I0310 08:27:20.627979 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wv7xp" Mar 10 08:27:20 crc kubenswrapper[4825]: I0310 08:27:20.689438 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wv7xp" Mar 10 08:27:20 crc kubenswrapper[4825]: I0310 08:27:20.867384 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wv7xp"] Mar 10 08:27:22 crc 
kubenswrapper[4825]: I0310 08:27:22.140181 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wv7xp" podUID="abefed0f-b590-471d-97d3-e6ee3ad9e148" containerName="registry-server" containerID="cri-o://e1b9cd4f84eeb5e090a05ecb2d7f9131591ae2fd8e55043e0e8ae67c2c626c14" gracePeriod=2 Mar 10 08:27:22 crc kubenswrapper[4825]: I0310 08:27:22.602063 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wv7xp" Mar 10 08:27:22 crc kubenswrapper[4825]: I0310 08:27:22.713843 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkb9q\" (UniqueName: \"kubernetes.io/projected/abefed0f-b590-471d-97d3-e6ee3ad9e148-kube-api-access-bkb9q\") pod \"abefed0f-b590-471d-97d3-e6ee3ad9e148\" (UID: \"abefed0f-b590-471d-97d3-e6ee3ad9e148\") " Mar 10 08:27:22 crc kubenswrapper[4825]: I0310 08:27:22.714069 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abefed0f-b590-471d-97d3-e6ee3ad9e148-utilities\") pod \"abefed0f-b590-471d-97d3-e6ee3ad9e148\" (UID: \"abefed0f-b590-471d-97d3-e6ee3ad9e148\") " Mar 10 08:27:22 crc kubenswrapper[4825]: I0310 08:27:22.714127 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abefed0f-b590-471d-97d3-e6ee3ad9e148-catalog-content\") pod \"abefed0f-b590-471d-97d3-e6ee3ad9e148\" (UID: \"abefed0f-b590-471d-97d3-e6ee3ad9e148\") " Mar 10 08:27:22 crc kubenswrapper[4825]: I0310 08:27:22.714753 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abefed0f-b590-471d-97d3-e6ee3ad9e148-utilities" (OuterVolumeSpecName: "utilities") pod "abefed0f-b590-471d-97d3-e6ee3ad9e148" (UID: "abefed0f-b590-471d-97d3-e6ee3ad9e148"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:27:22 crc kubenswrapper[4825]: I0310 08:27:22.719789 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abefed0f-b590-471d-97d3-e6ee3ad9e148-kube-api-access-bkb9q" (OuterVolumeSpecName: "kube-api-access-bkb9q") pod "abefed0f-b590-471d-97d3-e6ee3ad9e148" (UID: "abefed0f-b590-471d-97d3-e6ee3ad9e148"). InnerVolumeSpecName "kube-api-access-bkb9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:27:22 crc kubenswrapper[4825]: I0310 08:27:22.817211 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkb9q\" (UniqueName: \"kubernetes.io/projected/abefed0f-b590-471d-97d3-e6ee3ad9e148-kube-api-access-bkb9q\") on node \"crc\" DevicePath \"\"" Mar 10 08:27:22 crc kubenswrapper[4825]: I0310 08:27:22.817250 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abefed0f-b590-471d-97d3-e6ee3ad9e148-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 08:27:22 crc kubenswrapper[4825]: I0310 08:27:22.856250 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abefed0f-b590-471d-97d3-e6ee3ad9e148-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "abefed0f-b590-471d-97d3-e6ee3ad9e148" (UID: "abefed0f-b590-471d-97d3-e6ee3ad9e148"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:27:22 crc kubenswrapper[4825]: I0310 08:27:22.919358 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abefed0f-b590-471d-97d3-e6ee3ad9e148-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 08:27:23 crc kubenswrapper[4825]: I0310 08:27:23.151667 4825 generic.go:334] "Generic (PLEG): container finished" podID="abefed0f-b590-471d-97d3-e6ee3ad9e148" containerID="e1b9cd4f84eeb5e090a05ecb2d7f9131591ae2fd8e55043e0e8ae67c2c626c14" exitCode=0 Mar 10 08:27:23 crc kubenswrapper[4825]: I0310 08:27:23.151717 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wv7xp" event={"ID":"abefed0f-b590-471d-97d3-e6ee3ad9e148","Type":"ContainerDied","Data":"e1b9cd4f84eeb5e090a05ecb2d7f9131591ae2fd8e55043e0e8ae67c2c626c14"} Mar 10 08:27:23 crc kubenswrapper[4825]: I0310 08:27:23.151756 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wv7xp" event={"ID":"abefed0f-b590-471d-97d3-e6ee3ad9e148","Type":"ContainerDied","Data":"1c42ca67d6c222a138a626496cbebbb6c4434053f208b71403576b04ce53fff5"} Mar 10 08:27:23 crc kubenswrapper[4825]: I0310 08:27:23.151775 4825 scope.go:117] "RemoveContainer" containerID="e1b9cd4f84eeb5e090a05ecb2d7f9131591ae2fd8e55043e0e8ae67c2c626c14" Mar 10 08:27:23 crc kubenswrapper[4825]: I0310 08:27:23.151826 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wv7xp" Mar 10 08:27:23 crc kubenswrapper[4825]: I0310 08:27:23.195270 4825 scope.go:117] "RemoveContainer" containerID="24650c0f5099e08ef50bc1c32eb1d791dd1b15c54c0c1ff4da0650a3c64da8cf" Mar 10 08:27:23 crc kubenswrapper[4825]: I0310 08:27:23.204791 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wv7xp"] Mar 10 08:27:23 crc kubenswrapper[4825]: I0310 08:27:23.214068 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wv7xp"] Mar 10 08:27:23 crc kubenswrapper[4825]: I0310 08:27:23.241017 4825 scope.go:117] "RemoveContainer" containerID="a45447c1843cefb0e47ff3d5d67ede0930cfc0b3a282d4594d7c2c4dda6f4b45" Mar 10 08:27:23 crc kubenswrapper[4825]: I0310 08:27:23.257037 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abefed0f-b590-471d-97d3-e6ee3ad9e148" path="/var/lib/kubelet/pods/abefed0f-b590-471d-97d3-e6ee3ad9e148/volumes" Mar 10 08:27:23 crc kubenswrapper[4825]: I0310 08:27:23.297723 4825 scope.go:117] "RemoveContainer" containerID="e1b9cd4f84eeb5e090a05ecb2d7f9131591ae2fd8e55043e0e8ae67c2c626c14" Mar 10 08:27:23 crc kubenswrapper[4825]: E0310 08:27:23.298457 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1b9cd4f84eeb5e090a05ecb2d7f9131591ae2fd8e55043e0e8ae67c2c626c14\": container with ID starting with e1b9cd4f84eeb5e090a05ecb2d7f9131591ae2fd8e55043e0e8ae67c2c626c14 not found: ID does not exist" containerID="e1b9cd4f84eeb5e090a05ecb2d7f9131591ae2fd8e55043e0e8ae67c2c626c14" Mar 10 08:27:23 crc kubenswrapper[4825]: I0310 08:27:23.298498 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1b9cd4f84eeb5e090a05ecb2d7f9131591ae2fd8e55043e0e8ae67c2c626c14"} err="failed to get container status 
\"e1b9cd4f84eeb5e090a05ecb2d7f9131591ae2fd8e55043e0e8ae67c2c626c14\": rpc error: code = NotFound desc = could not find container \"e1b9cd4f84eeb5e090a05ecb2d7f9131591ae2fd8e55043e0e8ae67c2c626c14\": container with ID starting with e1b9cd4f84eeb5e090a05ecb2d7f9131591ae2fd8e55043e0e8ae67c2c626c14 not found: ID does not exist" Mar 10 08:27:23 crc kubenswrapper[4825]: I0310 08:27:23.298523 4825 scope.go:117] "RemoveContainer" containerID="24650c0f5099e08ef50bc1c32eb1d791dd1b15c54c0c1ff4da0650a3c64da8cf" Mar 10 08:27:23 crc kubenswrapper[4825]: E0310 08:27:23.299019 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24650c0f5099e08ef50bc1c32eb1d791dd1b15c54c0c1ff4da0650a3c64da8cf\": container with ID starting with 24650c0f5099e08ef50bc1c32eb1d791dd1b15c54c0c1ff4da0650a3c64da8cf not found: ID does not exist" containerID="24650c0f5099e08ef50bc1c32eb1d791dd1b15c54c0c1ff4da0650a3c64da8cf" Mar 10 08:27:23 crc kubenswrapper[4825]: I0310 08:27:23.299053 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24650c0f5099e08ef50bc1c32eb1d791dd1b15c54c0c1ff4da0650a3c64da8cf"} err="failed to get container status \"24650c0f5099e08ef50bc1c32eb1d791dd1b15c54c0c1ff4da0650a3c64da8cf\": rpc error: code = NotFound desc = could not find container \"24650c0f5099e08ef50bc1c32eb1d791dd1b15c54c0c1ff4da0650a3c64da8cf\": container with ID starting with 24650c0f5099e08ef50bc1c32eb1d791dd1b15c54c0c1ff4da0650a3c64da8cf not found: ID does not exist" Mar 10 08:27:23 crc kubenswrapper[4825]: I0310 08:27:23.299074 4825 scope.go:117] "RemoveContainer" containerID="a45447c1843cefb0e47ff3d5d67ede0930cfc0b3a282d4594d7c2c4dda6f4b45" Mar 10 08:27:23 crc kubenswrapper[4825]: E0310 08:27:23.299514 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a45447c1843cefb0e47ff3d5d67ede0930cfc0b3a282d4594d7c2c4dda6f4b45\": container with ID starting with a45447c1843cefb0e47ff3d5d67ede0930cfc0b3a282d4594d7c2c4dda6f4b45 not found: ID does not exist" containerID="a45447c1843cefb0e47ff3d5d67ede0930cfc0b3a282d4594d7c2c4dda6f4b45" Mar 10 08:27:23 crc kubenswrapper[4825]: I0310 08:27:23.299567 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a45447c1843cefb0e47ff3d5d67ede0930cfc0b3a282d4594d7c2c4dda6f4b45"} err="failed to get container status \"a45447c1843cefb0e47ff3d5d67ede0930cfc0b3a282d4594d7c2c4dda6f4b45\": rpc error: code = NotFound desc = could not find container \"a45447c1843cefb0e47ff3d5d67ede0930cfc0b3a282d4594d7c2c4dda6f4b45\": container with ID starting with a45447c1843cefb0e47ff3d5d67ede0930cfc0b3a282d4594d7c2c4dda6f4b45 not found: ID does not exist" Mar 10 08:27:31 crc kubenswrapper[4825]: I0310 08:27:31.237623 4825 scope.go:117] "RemoveContainer" containerID="e66b7b78375632673167d52203536b8000b1c9a9d311efe989c77daf24bd4cfc" Mar 10 08:27:31 crc kubenswrapper[4825]: E0310 08:27:31.238950 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:27:42 crc kubenswrapper[4825]: I0310 08:27:42.236920 4825 scope.go:117] "RemoveContainer" containerID="e66b7b78375632673167d52203536b8000b1c9a9d311efe989c77daf24bd4cfc" Mar 10 08:27:42 crc kubenswrapper[4825]: E0310 08:27:42.237634 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:27:51 crc kubenswrapper[4825]: I0310 08:27:51.768415 4825 scope.go:117] "RemoveContainer" containerID="398bf2634549284d1287df5167b546153037e8d96bd52c1f6cbf3a716089f156" Mar 10 08:27:57 crc kubenswrapper[4825]: I0310 08:27:57.049836 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-dtd22"] Mar 10 08:27:57 crc kubenswrapper[4825]: I0310 08:27:57.069788 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-mng4f"] Mar 10 08:27:57 crc kubenswrapper[4825]: I0310 08:27:57.080763 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-mng4f"] Mar 10 08:27:57 crc kubenswrapper[4825]: I0310 08:27:57.091106 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-dtd22"] Mar 10 08:27:57 crc kubenswrapper[4825]: I0310 08:27:57.100315 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-9ac4-account-create-update-fms75"] Mar 10 08:27:57 crc kubenswrapper[4825]: I0310 08:27:57.108390 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-9ac4-account-create-update-fms75"] Mar 10 08:27:57 crc kubenswrapper[4825]: I0310 08:27:57.237774 4825 scope.go:117] "RemoveContainer" containerID="e66b7b78375632673167d52203536b8000b1c9a9d311efe989c77daf24bd4cfc" Mar 10 08:27:57 crc kubenswrapper[4825]: E0310 08:27:57.238110 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:27:57 crc kubenswrapper[4825]: I0310 08:27:57.250564 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bba32ca3-b10c-41ec-9154-65c3dfd60c48" path="/var/lib/kubelet/pods/bba32ca3-b10c-41ec-9154-65c3dfd60c48/volumes" Mar 10 08:27:57 crc kubenswrapper[4825]: I0310 08:27:57.251175 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eab59002-2587-4875-acb8-52330470fc45" path="/var/lib/kubelet/pods/eab59002-2587-4875-acb8-52330470fc45/volumes" Mar 10 08:27:57 crc kubenswrapper[4825]: I0310 08:27:57.251749 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1172cce-7e11-4cd0-8c82-f4baa5a15bcc" path="/var/lib/kubelet/pods/f1172cce-7e11-4cd0-8c82-f4baa5a15bcc/volumes" Mar 10 08:27:58 crc kubenswrapper[4825]: I0310 08:27:58.039015 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-k7f4d"] Mar 10 08:27:58 crc kubenswrapper[4825]: I0310 08:27:58.054098 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-30de-account-create-update-wgghb"] Mar 10 08:27:58 crc kubenswrapper[4825]: I0310 08:27:58.067932 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-655f-account-create-update-wpvkq"] Mar 10 08:27:58 crc kubenswrapper[4825]: I0310 08:27:58.080720 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-k7f4d"] Mar 10 08:27:58 crc kubenswrapper[4825]: I0310 08:27:58.088469 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-30de-account-create-update-wgghb"] Mar 10 08:27:58 crc kubenswrapper[4825]: I0310 08:27:58.096596 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-655f-account-create-update-wpvkq"] Mar 10 08:27:59 crc kubenswrapper[4825]: I0310 08:27:59.265468 4825 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="29f630f8-cffd-4cd9-8ad6-42b5647e57a6" path="/var/lib/kubelet/pods/29f630f8-cffd-4cd9-8ad6-42b5647e57a6/volumes" Mar 10 08:27:59 crc kubenswrapper[4825]: I0310 08:27:59.266906 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5aed51f5-8b74-479f-a86c-1dd1e51d7bb5" path="/var/lib/kubelet/pods/5aed51f5-8b74-479f-a86c-1dd1e51d7bb5/volumes" Mar 10 08:27:59 crc kubenswrapper[4825]: I0310 08:27:59.267736 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f00f896-c7ba-4380-9b97-da67e92c1741" path="/var/lib/kubelet/pods/5f00f896-c7ba-4380-9b97-da67e92c1741/volumes" Mar 10 08:28:00 crc kubenswrapper[4825]: I0310 08:28:00.148417 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552188-hq4rj"] Mar 10 08:28:00 crc kubenswrapper[4825]: E0310 08:28:00.148926 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abefed0f-b590-471d-97d3-e6ee3ad9e148" containerName="extract-utilities" Mar 10 08:28:00 crc kubenswrapper[4825]: I0310 08:28:00.148944 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="abefed0f-b590-471d-97d3-e6ee3ad9e148" containerName="extract-utilities" Mar 10 08:28:00 crc kubenswrapper[4825]: E0310 08:28:00.148955 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abefed0f-b590-471d-97d3-e6ee3ad9e148" containerName="registry-server" Mar 10 08:28:00 crc kubenswrapper[4825]: I0310 08:28:00.148962 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="abefed0f-b590-471d-97d3-e6ee3ad9e148" containerName="registry-server" Mar 10 08:28:00 crc kubenswrapper[4825]: E0310 08:28:00.148973 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abefed0f-b590-471d-97d3-e6ee3ad9e148" containerName="extract-content" Mar 10 08:28:00 crc kubenswrapper[4825]: I0310 08:28:00.148980 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="abefed0f-b590-471d-97d3-e6ee3ad9e148" 
containerName="extract-content" Mar 10 08:28:00 crc kubenswrapper[4825]: I0310 08:28:00.149241 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="abefed0f-b590-471d-97d3-e6ee3ad9e148" containerName="registry-server" Mar 10 08:28:00 crc kubenswrapper[4825]: I0310 08:28:00.150005 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552188-hq4rj" Mar 10 08:28:00 crc kubenswrapper[4825]: I0310 08:28:00.153374 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 08:28:00 crc kubenswrapper[4825]: I0310 08:28:00.153816 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 08:28:00 crc kubenswrapper[4825]: I0310 08:28:00.154277 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 08:28:00 crc kubenswrapper[4825]: I0310 08:28:00.164268 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552188-hq4rj"] Mar 10 08:28:00 crc kubenswrapper[4825]: I0310 08:28:00.279981 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5q2h\" (UniqueName: \"kubernetes.io/projected/c9d54311-e96f-4179-a4c7-061199831901-kube-api-access-n5q2h\") pod \"auto-csr-approver-29552188-hq4rj\" (UID: \"c9d54311-e96f-4179-a4c7-061199831901\") " pod="openshift-infra/auto-csr-approver-29552188-hq4rj" Mar 10 08:28:00 crc kubenswrapper[4825]: I0310 08:28:00.383668 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5q2h\" (UniqueName: \"kubernetes.io/projected/c9d54311-e96f-4179-a4c7-061199831901-kube-api-access-n5q2h\") pod \"auto-csr-approver-29552188-hq4rj\" (UID: \"c9d54311-e96f-4179-a4c7-061199831901\") " pod="openshift-infra/auto-csr-approver-29552188-hq4rj" Mar 10 
08:28:00 crc kubenswrapper[4825]: I0310 08:28:00.403164 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5q2h\" (UniqueName: \"kubernetes.io/projected/c9d54311-e96f-4179-a4c7-061199831901-kube-api-access-n5q2h\") pod \"auto-csr-approver-29552188-hq4rj\" (UID: \"c9d54311-e96f-4179-a4c7-061199831901\") " pod="openshift-infra/auto-csr-approver-29552188-hq4rj" Mar 10 08:28:00 crc kubenswrapper[4825]: I0310 08:28:00.491233 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552188-hq4rj" Mar 10 08:28:00 crc kubenswrapper[4825]: I0310 08:28:00.968264 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552188-hq4rj"] Mar 10 08:28:01 crc kubenswrapper[4825]: I0310 08:28:01.521324 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552188-hq4rj" event={"ID":"c9d54311-e96f-4179-a4c7-061199831901","Type":"ContainerStarted","Data":"4042e41da60636cdfc9f97ce222b9dbe8ba81fc0dbf4c64afa7c1c87c26b5214"} Mar 10 08:28:02 crc kubenswrapper[4825]: I0310 08:28:02.540494 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552188-hq4rj" event={"ID":"c9d54311-e96f-4179-a4c7-061199831901","Type":"ContainerStarted","Data":"a83c12e06053a922044a9d7a9029364af4ef7ca9dd542c82769661855d1c8dc8"} Mar 10 08:28:02 crc kubenswrapper[4825]: I0310 08:28:02.560964 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552188-hq4rj" podStartSLOduration=1.379769741 podStartE2EDuration="2.560942774s" podCreationTimestamp="2026-03-10 08:28:00 +0000 UTC" firstStartedPulling="2026-03-10 08:28:00.970178932 +0000 UTC m=+6233.999959547" lastFinishedPulling="2026-03-10 08:28:02.151351965 +0000 UTC m=+6235.181132580" observedRunningTime="2026-03-10 08:28:02.55445264 +0000 UTC m=+6235.584233265" 
watchObservedRunningTime="2026-03-10 08:28:02.560942774 +0000 UTC m=+6235.590723389" Mar 10 08:28:03 crc kubenswrapper[4825]: I0310 08:28:03.565695 4825 generic.go:334] "Generic (PLEG): container finished" podID="c9d54311-e96f-4179-a4c7-061199831901" containerID="a83c12e06053a922044a9d7a9029364af4ef7ca9dd542c82769661855d1c8dc8" exitCode=0 Mar 10 08:28:03 crc kubenswrapper[4825]: I0310 08:28:03.565837 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552188-hq4rj" event={"ID":"c9d54311-e96f-4179-a4c7-061199831901","Type":"ContainerDied","Data":"a83c12e06053a922044a9d7a9029364af4ef7ca9dd542c82769661855d1c8dc8"} Mar 10 08:28:04 crc kubenswrapper[4825]: I0310 08:28:04.953340 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552188-hq4rj" Mar 10 08:28:04 crc kubenswrapper[4825]: I0310 08:28:04.983750 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5q2h\" (UniqueName: \"kubernetes.io/projected/c9d54311-e96f-4179-a4c7-061199831901-kube-api-access-n5q2h\") pod \"c9d54311-e96f-4179-a4c7-061199831901\" (UID: \"c9d54311-e96f-4179-a4c7-061199831901\") " Mar 10 08:28:04 crc kubenswrapper[4825]: I0310 08:28:04.992309 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9d54311-e96f-4179-a4c7-061199831901-kube-api-access-n5q2h" (OuterVolumeSpecName: "kube-api-access-n5q2h") pod "c9d54311-e96f-4179-a4c7-061199831901" (UID: "c9d54311-e96f-4179-a4c7-061199831901"). InnerVolumeSpecName "kube-api-access-n5q2h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:28:05 crc kubenswrapper[4825]: I0310 08:28:05.085853 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5q2h\" (UniqueName: \"kubernetes.io/projected/c9d54311-e96f-4179-a4c7-061199831901-kube-api-access-n5q2h\") on node \"crc\" DevicePath \"\"" Mar 10 08:28:05 crc kubenswrapper[4825]: I0310 08:28:05.595823 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552188-hq4rj" event={"ID":"c9d54311-e96f-4179-a4c7-061199831901","Type":"ContainerDied","Data":"4042e41da60636cdfc9f97ce222b9dbe8ba81fc0dbf4c64afa7c1c87c26b5214"} Mar 10 08:28:05 crc kubenswrapper[4825]: I0310 08:28:05.596181 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4042e41da60636cdfc9f97ce222b9dbe8ba81fc0dbf4c64afa7c1c87c26b5214" Mar 10 08:28:05 crc kubenswrapper[4825]: I0310 08:28:05.595954 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552188-hq4rj" Mar 10 08:28:05 crc kubenswrapper[4825]: I0310 08:28:05.646073 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552182-jvs86"] Mar 10 08:28:05 crc kubenswrapper[4825]: I0310 08:28:05.655006 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552182-jvs86"] Mar 10 08:28:07 crc kubenswrapper[4825]: I0310 08:28:07.251323 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa2c9226-c27e-44da-881d-2495b13dbffe" path="/var/lib/kubelet/pods/fa2c9226-c27e-44da-881d-2495b13dbffe/volumes" Mar 10 08:28:09 crc kubenswrapper[4825]: I0310 08:28:09.242797 4825 scope.go:117] "RemoveContainer" containerID="e66b7b78375632673167d52203536b8000b1c9a9d311efe989c77daf24bd4cfc" Mar 10 08:28:09 crc kubenswrapper[4825]: E0310 08:28:09.243430 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:28:16 crc kubenswrapper[4825]: I0310 08:28:16.041617 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-66fg7"] Mar 10 08:28:16 crc kubenswrapper[4825]: I0310 08:28:16.056401 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-66fg7"] Mar 10 08:28:17 crc kubenswrapper[4825]: I0310 08:28:17.253461 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77ec0a62-9440-49c2-87c1-2d9d9d30b7ff" path="/var/lib/kubelet/pods/77ec0a62-9440-49c2-87c1-2d9d9d30b7ff/volumes" Mar 10 08:28:23 crc kubenswrapper[4825]: I0310 08:28:23.237408 4825 scope.go:117] "RemoveContainer" containerID="e66b7b78375632673167d52203536b8000b1c9a9d311efe989c77daf24bd4cfc" Mar 10 08:28:23 crc kubenswrapper[4825]: E0310 08:28:23.238289 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:28:27 crc kubenswrapper[4825]: I0310 08:28:27.763651 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p8fxj"] Mar 10 08:28:27 crc kubenswrapper[4825]: E0310 08:28:27.764434 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9d54311-e96f-4179-a4c7-061199831901" containerName="oc" Mar 10 08:28:27 
crc kubenswrapper[4825]: I0310 08:28:27.764452 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9d54311-e96f-4179-a4c7-061199831901" containerName="oc" Mar 10 08:28:27 crc kubenswrapper[4825]: I0310 08:28:27.764743 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9d54311-e96f-4179-a4c7-061199831901" containerName="oc" Mar 10 08:28:27 crc kubenswrapper[4825]: I0310 08:28:27.767151 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p8fxj" Mar 10 08:28:27 crc kubenswrapper[4825]: I0310 08:28:27.776754 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p8fxj"] Mar 10 08:28:27 crc kubenswrapper[4825]: I0310 08:28:27.829076 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7-catalog-content\") pod \"community-operators-p8fxj\" (UID: \"b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7\") " pod="openshift-marketplace/community-operators-p8fxj" Mar 10 08:28:27 crc kubenswrapper[4825]: I0310 08:28:27.829530 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqnhw\" (UniqueName: \"kubernetes.io/projected/b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7-kube-api-access-dqnhw\") pod \"community-operators-p8fxj\" (UID: \"b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7\") " pod="openshift-marketplace/community-operators-p8fxj" Mar 10 08:28:27 crc kubenswrapper[4825]: I0310 08:28:27.829815 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7-utilities\") pod \"community-operators-p8fxj\" (UID: \"b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7\") " pod="openshift-marketplace/community-operators-p8fxj" Mar 10 08:28:27 crc 
kubenswrapper[4825]: I0310 08:28:27.931551 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqnhw\" (UniqueName: \"kubernetes.io/projected/b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7-kube-api-access-dqnhw\") pod \"community-operators-p8fxj\" (UID: \"b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7\") " pod="openshift-marketplace/community-operators-p8fxj" Mar 10 08:28:27 crc kubenswrapper[4825]: I0310 08:28:27.931773 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7-utilities\") pod \"community-operators-p8fxj\" (UID: \"b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7\") " pod="openshift-marketplace/community-operators-p8fxj" Mar 10 08:28:27 crc kubenswrapper[4825]: I0310 08:28:27.931938 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7-catalog-content\") pod \"community-operators-p8fxj\" (UID: \"b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7\") " pod="openshift-marketplace/community-operators-p8fxj" Mar 10 08:28:27 crc kubenswrapper[4825]: I0310 08:28:27.932595 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7-utilities\") pod \"community-operators-p8fxj\" (UID: \"b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7\") " pod="openshift-marketplace/community-operators-p8fxj" Mar 10 08:28:27 crc kubenswrapper[4825]: I0310 08:28:27.932704 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7-catalog-content\") pod \"community-operators-p8fxj\" (UID: \"b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7\") " pod="openshift-marketplace/community-operators-p8fxj" Mar 10 08:28:27 crc kubenswrapper[4825]: I0310 08:28:27.957595 
4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqnhw\" (UniqueName: \"kubernetes.io/projected/b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7-kube-api-access-dqnhw\") pod \"community-operators-p8fxj\" (UID: \"b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7\") " pod="openshift-marketplace/community-operators-p8fxj" Mar 10 08:28:28 crc kubenswrapper[4825]: I0310 08:28:28.101695 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p8fxj" Mar 10 08:28:28 crc kubenswrapper[4825]: I0310 08:28:28.602010 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p8fxj"] Mar 10 08:28:28 crc kubenswrapper[4825]: I0310 08:28:28.832536 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p8fxj" event={"ID":"b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7","Type":"ContainerStarted","Data":"8938760c5da3d9f675d539ab1b32f2985d0cfe91735356bf36ea87b88da59eac"} Mar 10 08:28:28 crc kubenswrapper[4825]: I0310 08:28:28.832595 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p8fxj" event={"ID":"b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7","Type":"ContainerStarted","Data":"7281b5f1cf97c7419c04e2dab9ffffff52b055afe7ca5fff4db0c109db5dbcb9"} Mar 10 08:28:29 crc kubenswrapper[4825]: I0310 08:28:29.842379 4825 generic.go:334] "Generic (PLEG): container finished" podID="b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7" containerID="8938760c5da3d9f675d539ab1b32f2985d0cfe91735356bf36ea87b88da59eac" exitCode=0 Mar 10 08:28:29 crc kubenswrapper[4825]: I0310 08:28:29.842467 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p8fxj" event={"ID":"b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7","Type":"ContainerDied","Data":"8938760c5da3d9f675d539ab1b32f2985d0cfe91735356bf36ea87b88da59eac"} Mar 10 08:28:30 crc kubenswrapper[4825]: I0310 08:28:30.863837 4825 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p8fxj" event={"ID":"b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7","Type":"ContainerStarted","Data":"371685bb7ededbf92a50d0b621b6ffc17a48bdd1122adc2036507b7675afb411"} Mar 10 08:28:32 crc kubenswrapper[4825]: I0310 08:28:32.882289 4825 generic.go:334] "Generic (PLEG): container finished" podID="b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7" containerID="371685bb7ededbf92a50d0b621b6ffc17a48bdd1122adc2036507b7675afb411" exitCode=0 Mar 10 08:28:32 crc kubenswrapper[4825]: I0310 08:28:32.882518 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p8fxj" event={"ID":"b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7","Type":"ContainerDied","Data":"371685bb7ededbf92a50d0b621b6ffc17a48bdd1122adc2036507b7675afb411"} Mar 10 08:28:33 crc kubenswrapper[4825]: I0310 08:28:33.892232 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p8fxj" event={"ID":"b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7","Type":"ContainerStarted","Data":"b9bee371a1f0d3d600f7de6179a64814230ef88a56ab05f25df3531b621a4242"} Mar 10 08:28:33 crc kubenswrapper[4825]: I0310 08:28:33.924667 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-p8fxj" podStartSLOduration=3.406740106 podStartE2EDuration="6.924649528s" podCreationTimestamp="2026-03-10 08:28:27 +0000 UTC" firstStartedPulling="2026-03-10 08:28:29.846310837 +0000 UTC m=+6262.876091452" lastFinishedPulling="2026-03-10 08:28:33.364220259 +0000 UTC m=+6266.394000874" observedRunningTime="2026-03-10 08:28:33.913050417 +0000 UTC m=+6266.942831032" watchObservedRunningTime="2026-03-10 08:28:33.924649528 +0000 UTC m=+6266.954430143" Mar 10 08:28:34 crc kubenswrapper[4825]: I0310 08:28:34.236299 4825 scope.go:117] "RemoveContainer" containerID="e66b7b78375632673167d52203536b8000b1c9a9d311efe989c77daf24bd4cfc" Mar 10 08:28:34 crc 
kubenswrapper[4825]: E0310 08:28:34.236610 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:28:35 crc kubenswrapper[4825]: I0310 08:28:35.056413 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-lvqlw"] Mar 10 08:28:35 crc kubenswrapper[4825]: I0310 08:28:35.066741 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-lvqlw"] Mar 10 08:28:35 crc kubenswrapper[4825]: I0310 08:28:35.249429 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="429f49f0-f549-4a10-a6d4-9dff065e5258" path="/var/lib/kubelet/pods/429f49f0-f549-4a10-a6d4-9dff065e5258/volumes" Mar 10 08:28:36 crc kubenswrapper[4825]: I0310 08:28:36.032618 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-twtvd"] Mar 10 08:28:36 crc kubenswrapper[4825]: I0310 08:28:36.042250 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-twtvd"] Mar 10 08:28:37 crc kubenswrapper[4825]: I0310 08:28:37.249290 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="564e7588-ac0e-48e5-80c5-3e4592e06601" path="/var/lib/kubelet/pods/564e7588-ac0e-48e5-80c5-3e4592e06601/volumes" Mar 10 08:28:38 crc kubenswrapper[4825]: I0310 08:28:38.101816 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-p8fxj" Mar 10 08:28:38 crc kubenswrapper[4825]: I0310 08:28:38.101857 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-p8fxj" Mar 10 08:28:38 crc kubenswrapper[4825]: I0310 08:28:38.149250 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-p8fxj" Mar 10 08:28:38 crc kubenswrapper[4825]: I0310 08:28:38.999495 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-p8fxj" Mar 10 08:28:39 crc kubenswrapper[4825]: I0310 08:28:39.051747 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p8fxj"] Mar 10 08:28:40 crc kubenswrapper[4825]: I0310 08:28:40.971070 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-p8fxj" podUID="b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7" containerName="registry-server" containerID="cri-o://b9bee371a1f0d3d600f7de6179a64814230ef88a56ab05f25df3531b621a4242" gracePeriod=2 Mar 10 08:28:41 crc kubenswrapper[4825]: I0310 08:28:41.547200 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p8fxj" Mar 10 08:28:41 crc kubenswrapper[4825]: I0310 08:28:41.628783 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7-catalog-content\") pod \"b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7\" (UID: \"b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7\") " Mar 10 08:28:41 crc kubenswrapper[4825]: I0310 08:28:41.629081 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqnhw\" (UniqueName: \"kubernetes.io/projected/b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7-kube-api-access-dqnhw\") pod \"b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7\" (UID: \"b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7\") " Mar 10 08:28:41 crc kubenswrapper[4825]: I0310 08:28:41.629247 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7-utilities\") pod \"b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7\" (UID: \"b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7\") " Mar 10 08:28:41 crc kubenswrapper[4825]: I0310 08:28:41.629994 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7-utilities" (OuterVolumeSpecName: "utilities") pod "b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7" (UID: "b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:28:41 crc kubenswrapper[4825]: I0310 08:28:41.641034 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7-kube-api-access-dqnhw" (OuterVolumeSpecName: "kube-api-access-dqnhw") pod "b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7" (UID: "b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7"). InnerVolumeSpecName "kube-api-access-dqnhw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:28:41 crc kubenswrapper[4825]: I0310 08:28:41.732276 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqnhw\" (UniqueName: \"kubernetes.io/projected/b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7-kube-api-access-dqnhw\") on node \"crc\" DevicePath \"\"" Mar 10 08:28:41 crc kubenswrapper[4825]: I0310 08:28:41.732307 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 08:28:41 crc kubenswrapper[4825]: I0310 08:28:41.768586 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7" (UID: "b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:28:41 crc kubenswrapper[4825]: I0310 08:28:41.834095 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 08:28:41 crc kubenswrapper[4825]: I0310 08:28:41.983393 4825 generic.go:334] "Generic (PLEG): container finished" podID="b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7" containerID="b9bee371a1f0d3d600f7de6179a64814230ef88a56ab05f25df3531b621a4242" exitCode=0 Mar 10 08:28:41 crc kubenswrapper[4825]: I0310 08:28:41.983434 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p8fxj" Mar 10 08:28:41 crc kubenswrapper[4825]: I0310 08:28:41.983430 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p8fxj" event={"ID":"b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7","Type":"ContainerDied","Data":"b9bee371a1f0d3d600f7de6179a64814230ef88a56ab05f25df3531b621a4242"} Mar 10 08:28:41 crc kubenswrapper[4825]: I0310 08:28:41.985239 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p8fxj" event={"ID":"b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7","Type":"ContainerDied","Data":"7281b5f1cf97c7419c04e2dab9ffffff52b055afe7ca5fff4db0c109db5dbcb9"} Mar 10 08:28:41 crc kubenswrapper[4825]: I0310 08:28:41.985318 4825 scope.go:117] "RemoveContainer" containerID="b9bee371a1f0d3d600f7de6179a64814230ef88a56ab05f25df3531b621a4242" Mar 10 08:28:42 crc kubenswrapper[4825]: I0310 08:28:42.019308 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p8fxj"] Mar 10 08:28:42 crc kubenswrapper[4825]: I0310 08:28:42.027243 4825 scope.go:117] "RemoveContainer" containerID="371685bb7ededbf92a50d0b621b6ffc17a48bdd1122adc2036507b7675afb411" Mar 10 08:28:42 crc kubenswrapper[4825]: I0310 08:28:42.029401 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-p8fxj"] Mar 10 08:28:42 crc kubenswrapper[4825]: I0310 08:28:42.049194 4825 scope.go:117] "RemoveContainer" containerID="8938760c5da3d9f675d539ab1b32f2985d0cfe91735356bf36ea87b88da59eac" Mar 10 08:28:42 crc kubenswrapper[4825]: I0310 08:28:42.093806 4825 scope.go:117] "RemoveContainer" containerID="b9bee371a1f0d3d600f7de6179a64814230ef88a56ab05f25df3531b621a4242" Mar 10 08:28:42 crc kubenswrapper[4825]: E0310 08:28:42.094393 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b9bee371a1f0d3d600f7de6179a64814230ef88a56ab05f25df3531b621a4242\": container with ID starting with b9bee371a1f0d3d600f7de6179a64814230ef88a56ab05f25df3531b621a4242 not found: ID does not exist" containerID="b9bee371a1f0d3d600f7de6179a64814230ef88a56ab05f25df3531b621a4242" Mar 10 08:28:42 crc kubenswrapper[4825]: I0310 08:28:42.094438 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9bee371a1f0d3d600f7de6179a64814230ef88a56ab05f25df3531b621a4242"} err="failed to get container status \"b9bee371a1f0d3d600f7de6179a64814230ef88a56ab05f25df3531b621a4242\": rpc error: code = NotFound desc = could not find container \"b9bee371a1f0d3d600f7de6179a64814230ef88a56ab05f25df3531b621a4242\": container with ID starting with b9bee371a1f0d3d600f7de6179a64814230ef88a56ab05f25df3531b621a4242 not found: ID does not exist" Mar 10 08:28:42 crc kubenswrapper[4825]: I0310 08:28:42.094468 4825 scope.go:117] "RemoveContainer" containerID="371685bb7ededbf92a50d0b621b6ffc17a48bdd1122adc2036507b7675afb411" Mar 10 08:28:42 crc kubenswrapper[4825]: E0310 08:28:42.094757 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"371685bb7ededbf92a50d0b621b6ffc17a48bdd1122adc2036507b7675afb411\": container with ID starting with 371685bb7ededbf92a50d0b621b6ffc17a48bdd1122adc2036507b7675afb411 not found: ID does not exist" containerID="371685bb7ededbf92a50d0b621b6ffc17a48bdd1122adc2036507b7675afb411" Mar 10 08:28:42 crc kubenswrapper[4825]: I0310 08:28:42.094794 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"371685bb7ededbf92a50d0b621b6ffc17a48bdd1122adc2036507b7675afb411"} err="failed to get container status \"371685bb7ededbf92a50d0b621b6ffc17a48bdd1122adc2036507b7675afb411\": rpc error: code = NotFound desc = could not find container \"371685bb7ededbf92a50d0b621b6ffc17a48bdd1122adc2036507b7675afb411\": container with ID 
starting with 371685bb7ededbf92a50d0b621b6ffc17a48bdd1122adc2036507b7675afb411 not found: ID does not exist" Mar 10 08:28:42 crc kubenswrapper[4825]: I0310 08:28:42.094817 4825 scope.go:117] "RemoveContainer" containerID="8938760c5da3d9f675d539ab1b32f2985d0cfe91735356bf36ea87b88da59eac" Mar 10 08:28:42 crc kubenswrapper[4825]: E0310 08:28:42.095363 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8938760c5da3d9f675d539ab1b32f2985d0cfe91735356bf36ea87b88da59eac\": container with ID starting with 8938760c5da3d9f675d539ab1b32f2985d0cfe91735356bf36ea87b88da59eac not found: ID does not exist" containerID="8938760c5da3d9f675d539ab1b32f2985d0cfe91735356bf36ea87b88da59eac" Mar 10 08:28:42 crc kubenswrapper[4825]: I0310 08:28:42.095394 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8938760c5da3d9f675d539ab1b32f2985d0cfe91735356bf36ea87b88da59eac"} err="failed to get container status \"8938760c5da3d9f675d539ab1b32f2985d0cfe91735356bf36ea87b88da59eac\": rpc error: code = NotFound desc = could not find container \"8938760c5da3d9f675d539ab1b32f2985d0cfe91735356bf36ea87b88da59eac\": container with ID starting with 8938760c5da3d9f675d539ab1b32f2985d0cfe91735356bf36ea87b88da59eac not found: ID does not exist" Mar 10 08:28:43 crc kubenswrapper[4825]: I0310 08:28:43.261295 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7" path="/var/lib/kubelet/pods/b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7/volumes" Mar 10 08:28:48 crc kubenswrapper[4825]: I0310 08:28:48.236951 4825 scope.go:117] "RemoveContainer" containerID="e66b7b78375632673167d52203536b8000b1c9a9d311efe989c77daf24bd4cfc" Mar 10 08:28:48 crc kubenswrapper[4825]: E0310 08:28:48.237713 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:28:51 crc kubenswrapper[4825]: I0310 08:28:51.880712 4825 scope.go:117] "RemoveContainer" containerID="eb723b605616917b814449b23fa7a3c157a29e1774bf0a241b81b8cc3d0e5019" Mar 10 08:28:51 crc kubenswrapper[4825]: I0310 08:28:51.907487 4825 scope.go:117] "RemoveContainer" containerID="1a24f86eb00e34063c60a38ed8ee69df6405c36aec606e1ca5aa9ff3f787d50e" Mar 10 08:28:51 crc kubenswrapper[4825]: I0310 08:28:51.977597 4825 scope.go:117] "RemoveContainer" containerID="4ffa109e6f375d70cdfc3ad26b17ca5dcedd80479761ada3645788f7f2246e93" Mar 10 08:28:52 crc kubenswrapper[4825]: I0310 08:28:52.011861 4825 scope.go:117] "RemoveContainer" containerID="8986ab6e16d2366472f7d644734591d7e9ee495d766effd1ccdc55397060c039" Mar 10 08:28:52 crc kubenswrapper[4825]: I0310 08:28:52.086007 4825 scope.go:117] "RemoveContainer" containerID="6e150d3c7d3d12a275b1e406a13d7afbda25a1272de07c8ffcce956747b50b02" Mar 10 08:28:52 crc kubenswrapper[4825]: I0310 08:28:52.152515 4825 scope.go:117] "RemoveContainer" containerID="482a3fc01815cfcbe356cbc95fff4e50562a60743144e06ac6aafcad018c33c7" Mar 10 08:28:52 crc kubenswrapper[4825]: I0310 08:28:52.181949 4825 scope.go:117] "RemoveContainer" containerID="7cc2331bec61d3810da58bcf1342b6a173d0fe899374b5aa53cff13a063ead98" Mar 10 08:28:52 crc kubenswrapper[4825]: I0310 08:28:52.203017 4825 scope.go:117] "RemoveContainer" containerID="19d325794cf2ba66bd034f1af37d6240febd0a7a28b7089c1d5d329c476cfae6" Mar 10 08:28:52 crc kubenswrapper[4825]: I0310 08:28:52.230085 4825 scope.go:117] "RemoveContainer" containerID="664e6a99164eb7653d0d2fed78814ac77a88f3ae4a846b851f9cd0c3392231b3" Mar 10 08:28:52 crc kubenswrapper[4825]: I0310 08:28:52.252972 4825 scope.go:117] "RemoveContainer" 
containerID="668ea68a80ec250c4f30de40c68994b40e473c435dde0111f63ac540cea48e9d" Mar 10 08:29:03 crc kubenswrapper[4825]: I0310 08:29:03.236731 4825 scope.go:117] "RemoveContainer" containerID="e66b7b78375632673167d52203536b8000b1c9a9d311efe989c77daf24bd4cfc" Mar 10 08:29:03 crc kubenswrapper[4825]: E0310 08:29:03.237522 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:29:18 crc kubenswrapper[4825]: I0310 08:29:18.236706 4825 scope.go:117] "RemoveContainer" containerID="e66b7b78375632673167d52203536b8000b1c9a9d311efe989c77daf24bd4cfc" Mar 10 08:29:18 crc kubenswrapper[4825]: E0310 08:29:18.237576 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:29:23 crc kubenswrapper[4825]: I0310 08:29:23.054167 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-mwgq4"] Mar 10 08:29:23 crc kubenswrapper[4825]: I0310 08:29:23.068563 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-mwgq4"] Mar 10 08:29:23 crc kubenswrapper[4825]: I0310 08:29:23.248201 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc3963c8-93c2-4495-96f5-24227a0db90d" 
path="/var/lib/kubelet/pods/dc3963c8-93c2-4495-96f5-24227a0db90d/volumes" Mar 10 08:29:33 crc kubenswrapper[4825]: I0310 08:29:33.236508 4825 scope.go:117] "RemoveContainer" containerID="e66b7b78375632673167d52203536b8000b1c9a9d311efe989c77daf24bd4cfc" Mar 10 08:29:33 crc kubenswrapper[4825]: E0310 08:29:33.238218 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:29:46 crc kubenswrapper[4825]: I0310 08:29:46.236497 4825 scope.go:117] "RemoveContainer" containerID="e66b7b78375632673167d52203536b8000b1c9a9d311efe989c77daf24bd4cfc" Mar 10 08:29:46 crc kubenswrapper[4825]: E0310 08:29:46.237421 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:29:52 crc kubenswrapper[4825]: I0310 08:29:52.444160 4825 scope.go:117] "RemoveContainer" containerID="60937d0ffd4b47821e2bfc207c85886292910aaf52e0dbd87838e3cb21ba27f7" Mar 10 08:30:00 crc kubenswrapper[4825]: I0310 08:30:00.162390 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552190-tdnwv"] Mar 10 08:30:00 crc kubenswrapper[4825]: E0310 08:30:00.163225 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7" containerName="registry-server" Mar 10 08:30:00 crc 
kubenswrapper[4825]: I0310 08:30:00.163242 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7" containerName="registry-server" Mar 10 08:30:00 crc kubenswrapper[4825]: E0310 08:30:00.163266 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7" containerName="extract-utilities" Mar 10 08:30:00 crc kubenswrapper[4825]: I0310 08:30:00.163274 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7" containerName="extract-utilities" Mar 10 08:30:00 crc kubenswrapper[4825]: E0310 08:30:00.163292 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7" containerName="extract-content" Mar 10 08:30:00 crc kubenswrapper[4825]: I0310 08:30:00.163299 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7" containerName="extract-content" Mar 10 08:30:00 crc kubenswrapper[4825]: I0310 08:30:00.163530 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1f8dc8a-46b8-4f11-9f81-40dbaef4f8a7" containerName="registry-server" Mar 10 08:30:00 crc kubenswrapper[4825]: I0310 08:30:00.164407 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552190-tdnwv" Mar 10 08:30:00 crc kubenswrapper[4825]: I0310 08:30:00.171044 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 08:30:00 crc kubenswrapper[4825]: I0310 08:30:00.171393 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 08:30:00 crc kubenswrapper[4825]: I0310 08:30:00.171498 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 08:30:00 crc kubenswrapper[4825]: I0310 08:30:00.176009 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552190-lbl4m"] Mar 10 08:30:00 crc kubenswrapper[4825]: I0310 08:30:00.178112 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552190-lbl4m" Mar 10 08:30:00 crc kubenswrapper[4825]: I0310 08:30:00.182916 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 08:30:00 crc kubenswrapper[4825]: I0310 08:30:00.183254 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 08:30:00 crc kubenswrapper[4825]: I0310 08:30:00.186755 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552190-tdnwv"] Mar 10 08:30:00 crc kubenswrapper[4825]: I0310 08:30:00.203719 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552190-lbl4m"] Mar 10 08:30:00 crc kubenswrapper[4825]: I0310 08:30:00.238675 4825 scope.go:117] "RemoveContainer" containerID="e66b7b78375632673167d52203536b8000b1c9a9d311efe989c77daf24bd4cfc" Mar 10 08:30:00 crc 
kubenswrapper[4825]: E0310 08:30:00.239016 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:30:00 crc kubenswrapper[4825]: I0310 08:30:00.245408 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ba915940-2e22-4fcb-92f5-b99de841c175-secret-volume\") pod \"collect-profiles-29552190-lbl4m\" (UID: \"ba915940-2e22-4fcb-92f5-b99de841c175\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552190-lbl4m" Mar 10 08:30:00 crc kubenswrapper[4825]: I0310 08:30:00.245489 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ba915940-2e22-4fcb-92f5-b99de841c175-config-volume\") pod \"collect-profiles-29552190-lbl4m\" (UID: \"ba915940-2e22-4fcb-92f5-b99de841c175\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552190-lbl4m" Mar 10 08:30:00 crc kubenswrapper[4825]: I0310 08:30:00.245566 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzztg\" (UniqueName: \"kubernetes.io/projected/ba915940-2e22-4fcb-92f5-b99de841c175-kube-api-access-gzztg\") pod \"collect-profiles-29552190-lbl4m\" (UID: \"ba915940-2e22-4fcb-92f5-b99de841c175\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552190-lbl4m" Mar 10 08:30:00 crc kubenswrapper[4825]: I0310 08:30:00.245590 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h4vn\" 
(UniqueName: \"kubernetes.io/projected/b6bf8e71-0bdf-4597-8b70-f15499e6d4f8-kube-api-access-4h4vn\") pod \"auto-csr-approver-29552190-tdnwv\" (UID: \"b6bf8e71-0bdf-4597-8b70-f15499e6d4f8\") " pod="openshift-infra/auto-csr-approver-29552190-tdnwv" Mar 10 08:30:00 crc kubenswrapper[4825]: I0310 08:30:00.348116 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ba915940-2e22-4fcb-92f5-b99de841c175-secret-volume\") pod \"collect-profiles-29552190-lbl4m\" (UID: \"ba915940-2e22-4fcb-92f5-b99de841c175\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552190-lbl4m" Mar 10 08:30:00 crc kubenswrapper[4825]: I0310 08:30:00.348291 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ba915940-2e22-4fcb-92f5-b99de841c175-config-volume\") pod \"collect-profiles-29552190-lbl4m\" (UID: \"ba915940-2e22-4fcb-92f5-b99de841c175\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552190-lbl4m" Mar 10 08:30:00 crc kubenswrapper[4825]: I0310 08:30:00.348544 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzztg\" (UniqueName: \"kubernetes.io/projected/ba915940-2e22-4fcb-92f5-b99de841c175-kube-api-access-gzztg\") pod \"collect-profiles-29552190-lbl4m\" (UID: \"ba915940-2e22-4fcb-92f5-b99de841c175\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552190-lbl4m" Mar 10 08:30:00 crc kubenswrapper[4825]: I0310 08:30:00.348589 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h4vn\" (UniqueName: \"kubernetes.io/projected/b6bf8e71-0bdf-4597-8b70-f15499e6d4f8-kube-api-access-4h4vn\") pod \"auto-csr-approver-29552190-tdnwv\" (UID: \"b6bf8e71-0bdf-4597-8b70-f15499e6d4f8\") " pod="openshift-infra/auto-csr-approver-29552190-tdnwv" Mar 10 08:30:00 crc kubenswrapper[4825]: I0310 08:30:00.350289 
4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ba915940-2e22-4fcb-92f5-b99de841c175-config-volume\") pod \"collect-profiles-29552190-lbl4m\" (UID: \"ba915940-2e22-4fcb-92f5-b99de841c175\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552190-lbl4m" Mar 10 08:30:00 crc kubenswrapper[4825]: I0310 08:30:00.355868 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ba915940-2e22-4fcb-92f5-b99de841c175-secret-volume\") pod \"collect-profiles-29552190-lbl4m\" (UID: \"ba915940-2e22-4fcb-92f5-b99de841c175\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552190-lbl4m" Mar 10 08:30:00 crc kubenswrapper[4825]: I0310 08:30:00.367204 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h4vn\" (UniqueName: \"kubernetes.io/projected/b6bf8e71-0bdf-4597-8b70-f15499e6d4f8-kube-api-access-4h4vn\") pod \"auto-csr-approver-29552190-tdnwv\" (UID: \"b6bf8e71-0bdf-4597-8b70-f15499e6d4f8\") " pod="openshift-infra/auto-csr-approver-29552190-tdnwv" Mar 10 08:30:00 crc kubenswrapper[4825]: I0310 08:30:00.367552 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzztg\" (UniqueName: \"kubernetes.io/projected/ba915940-2e22-4fcb-92f5-b99de841c175-kube-api-access-gzztg\") pod \"collect-profiles-29552190-lbl4m\" (UID: \"ba915940-2e22-4fcb-92f5-b99de841c175\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552190-lbl4m" Mar 10 08:30:00 crc kubenswrapper[4825]: I0310 08:30:00.500633 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552190-tdnwv" Mar 10 08:30:00 crc kubenswrapper[4825]: I0310 08:30:00.514023 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552190-lbl4m" Mar 10 08:30:01 crc kubenswrapper[4825]: I0310 08:30:01.047797 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552190-tdnwv"] Mar 10 08:30:01 crc kubenswrapper[4825]: I0310 08:30:01.119676 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552190-lbl4m"] Mar 10 08:30:01 crc kubenswrapper[4825]: W0310 08:30:01.121027 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba915940_2e22_4fcb_92f5_b99de841c175.slice/crio-ee6fc287e79be9f1f3cc7dd364363aa8475df752fd9e46f21015d385a9897d1e WatchSource:0}: Error finding container ee6fc287e79be9f1f3cc7dd364363aa8475df752fd9e46f21015d385a9897d1e: Status 404 returned error can't find the container with id ee6fc287e79be9f1f3cc7dd364363aa8475df752fd9e46f21015d385a9897d1e Mar 10 08:30:01 crc kubenswrapper[4825]: I0310 08:30:01.757994 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552190-tdnwv" event={"ID":"b6bf8e71-0bdf-4597-8b70-f15499e6d4f8","Type":"ContainerStarted","Data":"80a32fe7ff69b9769ff442c5d6247824d95779495b0598511fe184ebfa01e6c3"} Mar 10 08:30:01 crc kubenswrapper[4825]: I0310 08:30:01.759466 4825 generic.go:334] "Generic (PLEG): container finished" podID="ba915940-2e22-4fcb-92f5-b99de841c175" containerID="b9a400ab978e78670446ab782d68132aa6af9bb7536e009a894c3e487358d0b7" exitCode=0 Mar 10 08:30:01 crc kubenswrapper[4825]: I0310 08:30:01.759495 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552190-lbl4m" event={"ID":"ba915940-2e22-4fcb-92f5-b99de841c175","Type":"ContainerDied","Data":"b9a400ab978e78670446ab782d68132aa6af9bb7536e009a894c3e487358d0b7"} Mar 10 08:30:01 crc kubenswrapper[4825]: I0310 08:30:01.759525 4825 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552190-lbl4m" event={"ID":"ba915940-2e22-4fcb-92f5-b99de841c175","Type":"ContainerStarted","Data":"ee6fc287e79be9f1f3cc7dd364363aa8475df752fd9e46f21015d385a9897d1e"} Mar 10 08:30:02 crc kubenswrapper[4825]: I0310 08:30:02.771979 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552190-tdnwv" event={"ID":"b6bf8e71-0bdf-4597-8b70-f15499e6d4f8","Type":"ContainerStarted","Data":"70bd5cad43c296e34cea39f55c9b8fbaad4124b8648275278dff6be36f9b0955"} Mar 10 08:30:02 crc kubenswrapper[4825]: I0310 08:30:02.790639 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552190-tdnwv" podStartSLOduration=1.489508603 podStartE2EDuration="2.790619497s" podCreationTimestamp="2026-03-10 08:30:00 +0000 UTC" firstStartedPulling="2026-03-10 08:30:01.048828321 +0000 UTC m=+6354.078608936" lastFinishedPulling="2026-03-10 08:30:02.349939215 +0000 UTC m=+6355.379719830" observedRunningTime="2026-03-10 08:30:02.784327248 +0000 UTC m=+6355.814107863" watchObservedRunningTime="2026-03-10 08:30:02.790619497 +0000 UTC m=+6355.820400122" Mar 10 08:30:03 crc kubenswrapper[4825]: I0310 08:30:03.113967 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552190-lbl4m" Mar 10 08:30:03 crc kubenswrapper[4825]: I0310 08:30:03.219115 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzztg\" (UniqueName: \"kubernetes.io/projected/ba915940-2e22-4fcb-92f5-b99de841c175-kube-api-access-gzztg\") pod \"ba915940-2e22-4fcb-92f5-b99de841c175\" (UID: \"ba915940-2e22-4fcb-92f5-b99de841c175\") " Mar 10 08:30:03 crc kubenswrapper[4825]: I0310 08:30:03.219298 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ba915940-2e22-4fcb-92f5-b99de841c175-config-volume\") pod \"ba915940-2e22-4fcb-92f5-b99de841c175\" (UID: \"ba915940-2e22-4fcb-92f5-b99de841c175\") " Mar 10 08:30:03 crc kubenswrapper[4825]: I0310 08:30:03.219374 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ba915940-2e22-4fcb-92f5-b99de841c175-secret-volume\") pod \"ba915940-2e22-4fcb-92f5-b99de841c175\" (UID: \"ba915940-2e22-4fcb-92f5-b99de841c175\") " Mar 10 08:30:03 crc kubenswrapper[4825]: I0310 08:30:03.219971 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba915940-2e22-4fcb-92f5-b99de841c175-config-volume" (OuterVolumeSpecName: "config-volume") pod "ba915940-2e22-4fcb-92f5-b99de841c175" (UID: "ba915940-2e22-4fcb-92f5-b99de841c175"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:30:03 crc kubenswrapper[4825]: I0310 08:30:03.224675 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba915940-2e22-4fcb-92f5-b99de841c175-kube-api-access-gzztg" (OuterVolumeSpecName: "kube-api-access-gzztg") pod "ba915940-2e22-4fcb-92f5-b99de841c175" (UID: "ba915940-2e22-4fcb-92f5-b99de841c175"). 
InnerVolumeSpecName "kube-api-access-gzztg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:30:03 crc kubenswrapper[4825]: I0310 08:30:03.226283 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba915940-2e22-4fcb-92f5-b99de841c175-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ba915940-2e22-4fcb-92f5-b99de841c175" (UID: "ba915940-2e22-4fcb-92f5-b99de841c175"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:30:03 crc kubenswrapper[4825]: I0310 08:30:03.321261 4825 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ba915940-2e22-4fcb-92f5-b99de841c175-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 08:30:03 crc kubenswrapper[4825]: I0310 08:30:03.321296 4825 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ba915940-2e22-4fcb-92f5-b99de841c175-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 08:30:03 crc kubenswrapper[4825]: I0310 08:30:03.321306 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzztg\" (UniqueName: \"kubernetes.io/projected/ba915940-2e22-4fcb-92f5-b99de841c175-kube-api-access-gzztg\") on node \"crc\" DevicePath \"\"" Mar 10 08:30:03 crc kubenswrapper[4825]: I0310 08:30:03.778156 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552190-lbl4m" event={"ID":"ba915940-2e22-4fcb-92f5-b99de841c175","Type":"ContainerDied","Data":"ee6fc287e79be9f1f3cc7dd364363aa8475df752fd9e46f21015d385a9897d1e"} Mar 10 08:30:03 crc kubenswrapper[4825]: I0310 08:30:03.778472 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee6fc287e79be9f1f3cc7dd364363aa8475df752fd9e46f21015d385a9897d1e" Mar 10 08:30:03 crc kubenswrapper[4825]: I0310 08:30:03.778540 4825 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552190-lbl4m" Mar 10 08:30:03 crc kubenswrapper[4825]: I0310 08:30:03.779901 4825 generic.go:334] "Generic (PLEG): container finished" podID="b6bf8e71-0bdf-4597-8b70-f15499e6d4f8" containerID="70bd5cad43c296e34cea39f55c9b8fbaad4124b8648275278dff6be36f9b0955" exitCode=0 Mar 10 08:30:03 crc kubenswrapper[4825]: I0310 08:30:03.779966 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552190-tdnwv" event={"ID":"b6bf8e71-0bdf-4597-8b70-f15499e6d4f8","Type":"ContainerDied","Data":"70bd5cad43c296e34cea39f55c9b8fbaad4124b8648275278dff6be36f9b0955"} Mar 10 08:30:04 crc kubenswrapper[4825]: I0310 08:30:04.204772 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552145-pm7sj"] Mar 10 08:30:04 crc kubenswrapper[4825]: I0310 08:30:04.213581 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552145-pm7sj"] Mar 10 08:30:05 crc kubenswrapper[4825]: I0310 08:30:05.167866 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552190-tdnwv" Mar 10 08:30:05 crc kubenswrapper[4825]: I0310 08:30:05.258039 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4h4vn\" (UniqueName: \"kubernetes.io/projected/b6bf8e71-0bdf-4597-8b70-f15499e6d4f8-kube-api-access-4h4vn\") pod \"b6bf8e71-0bdf-4597-8b70-f15499e6d4f8\" (UID: \"b6bf8e71-0bdf-4597-8b70-f15499e6d4f8\") " Mar 10 08:30:05 crc kubenswrapper[4825]: I0310 08:30:05.259204 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e10f0b6f-4e14-4510-aac9-936b916abda5" path="/var/lib/kubelet/pods/e10f0b6f-4e14-4510-aac9-936b916abda5/volumes" Mar 10 08:30:05 crc kubenswrapper[4825]: I0310 08:30:05.263718 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6bf8e71-0bdf-4597-8b70-f15499e6d4f8-kube-api-access-4h4vn" (OuterVolumeSpecName: "kube-api-access-4h4vn") pod "b6bf8e71-0bdf-4597-8b70-f15499e6d4f8" (UID: "b6bf8e71-0bdf-4597-8b70-f15499e6d4f8"). InnerVolumeSpecName "kube-api-access-4h4vn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:30:05 crc kubenswrapper[4825]: I0310 08:30:05.360713 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4h4vn\" (UniqueName: \"kubernetes.io/projected/b6bf8e71-0bdf-4597-8b70-f15499e6d4f8-kube-api-access-4h4vn\") on node \"crc\" DevicePath \"\"" Mar 10 08:30:05 crc kubenswrapper[4825]: I0310 08:30:05.800304 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552190-tdnwv" event={"ID":"b6bf8e71-0bdf-4597-8b70-f15499e6d4f8","Type":"ContainerDied","Data":"80a32fe7ff69b9769ff442c5d6247824d95779495b0598511fe184ebfa01e6c3"} Mar 10 08:30:05 crc kubenswrapper[4825]: I0310 08:30:05.800343 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80a32fe7ff69b9769ff442c5d6247824d95779495b0598511fe184ebfa01e6c3" Mar 10 08:30:05 crc kubenswrapper[4825]: I0310 08:30:05.800747 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552190-tdnwv" Mar 10 08:30:05 crc kubenswrapper[4825]: I0310 08:30:05.846292 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552184-rl66d"] Mar 10 08:30:05 crc kubenswrapper[4825]: I0310 08:30:05.853922 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552184-rl66d"] Mar 10 08:30:07 crc kubenswrapper[4825]: I0310 08:30:07.247608 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="455f8f89-31ed-457d-ad40-5719de71514a" path="/var/lib/kubelet/pods/455f8f89-31ed-457d-ad40-5719de71514a/volumes" Mar 10 08:30:09 crc kubenswrapper[4825]: I0310 08:30:09.521270 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-plzcw"] Mar 10 08:30:09 crc kubenswrapper[4825]: E0310 08:30:09.521917 4825 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ba915940-2e22-4fcb-92f5-b99de841c175" containerName="collect-profiles" Mar 10 08:30:09 crc kubenswrapper[4825]: I0310 08:30:09.521928 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba915940-2e22-4fcb-92f5-b99de841c175" containerName="collect-profiles" Mar 10 08:30:09 crc kubenswrapper[4825]: E0310 08:30:09.521940 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6bf8e71-0bdf-4597-8b70-f15499e6d4f8" containerName="oc" Mar 10 08:30:09 crc kubenswrapper[4825]: I0310 08:30:09.521946 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6bf8e71-0bdf-4597-8b70-f15499e6d4f8" containerName="oc" Mar 10 08:30:09 crc kubenswrapper[4825]: I0310 08:30:09.524827 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba915940-2e22-4fcb-92f5-b99de841c175" containerName="collect-profiles" Mar 10 08:30:09 crc kubenswrapper[4825]: I0310 08:30:09.524894 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6bf8e71-0bdf-4597-8b70-f15499e6d4f8" containerName="oc" Mar 10 08:30:09 crc kubenswrapper[4825]: I0310 08:30:09.527028 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-plzcw" Mar 10 08:30:09 crc kubenswrapper[4825]: I0310 08:30:09.544527 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-plzcw"] Mar 10 08:30:09 crc kubenswrapper[4825]: I0310 08:30:09.684945 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sq7d\" (UniqueName: \"kubernetes.io/projected/69c8a03d-771a-4b7b-a3c5-14cfe89337fc-kube-api-access-5sq7d\") pod \"redhat-marketplace-plzcw\" (UID: \"69c8a03d-771a-4b7b-a3c5-14cfe89337fc\") " pod="openshift-marketplace/redhat-marketplace-plzcw" Mar 10 08:30:09 crc kubenswrapper[4825]: I0310 08:30:09.685195 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69c8a03d-771a-4b7b-a3c5-14cfe89337fc-utilities\") pod \"redhat-marketplace-plzcw\" (UID: \"69c8a03d-771a-4b7b-a3c5-14cfe89337fc\") " pod="openshift-marketplace/redhat-marketplace-plzcw" Mar 10 08:30:09 crc kubenswrapper[4825]: I0310 08:30:09.685557 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69c8a03d-771a-4b7b-a3c5-14cfe89337fc-catalog-content\") pod \"redhat-marketplace-plzcw\" (UID: \"69c8a03d-771a-4b7b-a3c5-14cfe89337fc\") " pod="openshift-marketplace/redhat-marketplace-plzcw" Mar 10 08:30:09 crc kubenswrapper[4825]: I0310 08:30:09.787073 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69c8a03d-771a-4b7b-a3c5-14cfe89337fc-utilities\") pod \"redhat-marketplace-plzcw\" (UID: \"69c8a03d-771a-4b7b-a3c5-14cfe89337fc\") " pod="openshift-marketplace/redhat-marketplace-plzcw" Mar 10 08:30:09 crc kubenswrapper[4825]: I0310 08:30:09.787228 4825 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69c8a03d-771a-4b7b-a3c5-14cfe89337fc-catalog-content\") pod \"redhat-marketplace-plzcw\" (UID: \"69c8a03d-771a-4b7b-a3c5-14cfe89337fc\") " pod="openshift-marketplace/redhat-marketplace-plzcw" Mar 10 08:30:09 crc kubenswrapper[4825]: I0310 08:30:09.787295 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sq7d\" (UniqueName: \"kubernetes.io/projected/69c8a03d-771a-4b7b-a3c5-14cfe89337fc-kube-api-access-5sq7d\") pod \"redhat-marketplace-plzcw\" (UID: \"69c8a03d-771a-4b7b-a3c5-14cfe89337fc\") " pod="openshift-marketplace/redhat-marketplace-plzcw" Mar 10 08:30:09 crc kubenswrapper[4825]: I0310 08:30:09.787599 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69c8a03d-771a-4b7b-a3c5-14cfe89337fc-utilities\") pod \"redhat-marketplace-plzcw\" (UID: \"69c8a03d-771a-4b7b-a3c5-14cfe89337fc\") " pod="openshift-marketplace/redhat-marketplace-plzcw" Mar 10 08:30:09 crc kubenswrapper[4825]: I0310 08:30:09.787801 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69c8a03d-771a-4b7b-a3c5-14cfe89337fc-catalog-content\") pod \"redhat-marketplace-plzcw\" (UID: \"69c8a03d-771a-4b7b-a3c5-14cfe89337fc\") " pod="openshift-marketplace/redhat-marketplace-plzcw" Mar 10 08:30:09 crc kubenswrapper[4825]: I0310 08:30:09.805802 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sq7d\" (UniqueName: \"kubernetes.io/projected/69c8a03d-771a-4b7b-a3c5-14cfe89337fc-kube-api-access-5sq7d\") pod \"redhat-marketplace-plzcw\" (UID: \"69c8a03d-771a-4b7b-a3c5-14cfe89337fc\") " pod="openshift-marketplace/redhat-marketplace-plzcw" Mar 10 08:30:09 crc kubenswrapper[4825]: I0310 08:30:09.849733 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-plzcw" Mar 10 08:30:10 crc kubenswrapper[4825]: I0310 08:30:10.341152 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-plzcw"] Mar 10 08:30:10 crc kubenswrapper[4825]: I0310 08:30:10.841222 4825 generic.go:334] "Generic (PLEG): container finished" podID="69c8a03d-771a-4b7b-a3c5-14cfe89337fc" containerID="fea154872d18245251c5ec97c6189a83abbcf5ff495ad22dc8d36b4c663a86e1" exitCode=0 Mar 10 08:30:10 crc kubenswrapper[4825]: I0310 08:30:10.841308 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-plzcw" event={"ID":"69c8a03d-771a-4b7b-a3c5-14cfe89337fc","Type":"ContainerDied","Data":"fea154872d18245251c5ec97c6189a83abbcf5ff495ad22dc8d36b4c663a86e1"} Mar 10 08:30:10 crc kubenswrapper[4825]: I0310 08:30:10.842481 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-plzcw" event={"ID":"69c8a03d-771a-4b7b-a3c5-14cfe89337fc","Type":"ContainerStarted","Data":"289f2e31cab609b69aa6dce3bed3115bd5fbe5e60eb761c91bd45a0d6c180dfe"} Mar 10 08:30:10 crc kubenswrapper[4825]: I0310 08:30:10.843845 4825 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 08:30:11 crc kubenswrapper[4825]: I0310 08:30:11.236601 4825 scope.go:117] "RemoveContainer" containerID="e66b7b78375632673167d52203536b8000b1c9a9d311efe989c77daf24bd4cfc" Mar 10 08:30:11 crc kubenswrapper[4825]: E0310 08:30:11.236837 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 
08:30:12 crc kubenswrapper[4825]: I0310 08:30:12.861685 4825 generic.go:334] "Generic (PLEG): container finished" podID="69c8a03d-771a-4b7b-a3c5-14cfe89337fc" containerID="ebbed588bcf55313c83f0f9c6063394ce912a2cb4af9d57f9f2a629bde197844" exitCode=0 Mar 10 08:30:12 crc kubenswrapper[4825]: I0310 08:30:12.861761 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-plzcw" event={"ID":"69c8a03d-771a-4b7b-a3c5-14cfe89337fc","Type":"ContainerDied","Data":"ebbed588bcf55313c83f0f9c6063394ce912a2cb4af9d57f9f2a629bde197844"} Mar 10 08:30:13 crc kubenswrapper[4825]: I0310 08:30:13.883714 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-plzcw" event={"ID":"69c8a03d-771a-4b7b-a3c5-14cfe89337fc","Type":"ContainerStarted","Data":"0b72b90a1b4dea17637c00990b4e0058e2b7f8880d6a6e5650e1ee65965d4d11"} Mar 10 08:30:13 crc kubenswrapper[4825]: I0310 08:30:13.906309 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-plzcw" podStartSLOduration=2.476195361 podStartE2EDuration="4.906292451s" podCreationTimestamp="2026-03-10 08:30:09 +0000 UTC" firstStartedPulling="2026-03-10 08:30:10.843584929 +0000 UTC m=+6363.873365544" lastFinishedPulling="2026-03-10 08:30:13.273682029 +0000 UTC m=+6366.303462634" observedRunningTime="2026-03-10 08:30:13.902297234 +0000 UTC m=+6366.932077869" watchObservedRunningTime="2026-03-10 08:30:13.906292451 +0000 UTC m=+6366.936073066" Mar 10 08:30:19 crc kubenswrapper[4825]: I0310 08:30:19.850190 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-plzcw" Mar 10 08:30:19 crc kubenswrapper[4825]: I0310 08:30:19.850679 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-plzcw" Mar 10 08:30:19 crc kubenswrapper[4825]: I0310 08:30:19.902682 4825 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-plzcw" Mar 10 08:30:19 crc kubenswrapper[4825]: I0310 08:30:19.984158 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-plzcw" Mar 10 08:30:20 crc kubenswrapper[4825]: I0310 08:30:20.138421 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-plzcw"] Mar 10 08:30:21 crc kubenswrapper[4825]: I0310 08:30:21.955955 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-plzcw" podUID="69c8a03d-771a-4b7b-a3c5-14cfe89337fc" containerName="registry-server" containerID="cri-o://0b72b90a1b4dea17637c00990b4e0058e2b7f8880d6a6e5650e1ee65965d4d11" gracePeriod=2 Mar 10 08:30:22 crc kubenswrapper[4825]: E0310 08:30:22.164557 4825 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69c8a03d_771a_4b7b_a3c5_14cfe89337fc.slice/crio-0b72b90a1b4dea17637c00990b4e0058e2b7f8880d6a6e5650e1ee65965d4d11.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69c8a03d_771a_4b7b_a3c5_14cfe89337fc.slice/crio-conmon-0b72b90a1b4dea17637c00990b4e0058e2b7f8880d6a6e5650e1ee65965d4d11.scope\": RecentStats: unable to find data in memory cache]" Mar 10 08:30:22 crc kubenswrapper[4825]: I0310 08:30:22.395328 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-plzcw" Mar 10 08:30:22 crc kubenswrapper[4825]: I0310 08:30:22.475285 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69c8a03d-771a-4b7b-a3c5-14cfe89337fc-catalog-content\") pod \"69c8a03d-771a-4b7b-a3c5-14cfe89337fc\" (UID: \"69c8a03d-771a-4b7b-a3c5-14cfe89337fc\") " Mar 10 08:30:22 crc kubenswrapper[4825]: I0310 08:30:22.475351 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sq7d\" (UniqueName: \"kubernetes.io/projected/69c8a03d-771a-4b7b-a3c5-14cfe89337fc-kube-api-access-5sq7d\") pod \"69c8a03d-771a-4b7b-a3c5-14cfe89337fc\" (UID: \"69c8a03d-771a-4b7b-a3c5-14cfe89337fc\") " Mar 10 08:30:22 crc kubenswrapper[4825]: I0310 08:30:22.475386 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69c8a03d-771a-4b7b-a3c5-14cfe89337fc-utilities\") pod \"69c8a03d-771a-4b7b-a3c5-14cfe89337fc\" (UID: \"69c8a03d-771a-4b7b-a3c5-14cfe89337fc\") " Mar 10 08:30:22 crc kubenswrapper[4825]: I0310 08:30:22.477541 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69c8a03d-771a-4b7b-a3c5-14cfe89337fc-utilities" (OuterVolumeSpecName: "utilities") pod "69c8a03d-771a-4b7b-a3c5-14cfe89337fc" (UID: "69c8a03d-771a-4b7b-a3c5-14cfe89337fc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:30:22 crc kubenswrapper[4825]: I0310 08:30:22.481215 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69c8a03d-771a-4b7b-a3c5-14cfe89337fc-kube-api-access-5sq7d" (OuterVolumeSpecName: "kube-api-access-5sq7d") pod "69c8a03d-771a-4b7b-a3c5-14cfe89337fc" (UID: "69c8a03d-771a-4b7b-a3c5-14cfe89337fc"). InnerVolumeSpecName "kube-api-access-5sq7d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:30:22 crc kubenswrapper[4825]: I0310 08:30:22.578223 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sq7d\" (UniqueName: \"kubernetes.io/projected/69c8a03d-771a-4b7b-a3c5-14cfe89337fc-kube-api-access-5sq7d\") on node \"crc\" DevicePath \"\"" Mar 10 08:30:22 crc kubenswrapper[4825]: I0310 08:30:22.578399 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69c8a03d-771a-4b7b-a3c5-14cfe89337fc-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 08:30:22 crc kubenswrapper[4825]: I0310 08:30:22.754700 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69c8a03d-771a-4b7b-a3c5-14cfe89337fc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "69c8a03d-771a-4b7b-a3c5-14cfe89337fc" (UID: "69c8a03d-771a-4b7b-a3c5-14cfe89337fc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:30:22 crc kubenswrapper[4825]: I0310 08:30:22.816341 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69c8a03d-771a-4b7b-a3c5-14cfe89337fc-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 08:30:22 crc kubenswrapper[4825]: I0310 08:30:22.970591 4825 generic.go:334] "Generic (PLEG): container finished" podID="69c8a03d-771a-4b7b-a3c5-14cfe89337fc" containerID="0b72b90a1b4dea17637c00990b4e0058e2b7f8880d6a6e5650e1ee65965d4d11" exitCode=0 Mar 10 08:30:22 crc kubenswrapper[4825]: I0310 08:30:22.970645 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-plzcw" event={"ID":"69c8a03d-771a-4b7b-a3c5-14cfe89337fc","Type":"ContainerDied","Data":"0b72b90a1b4dea17637c00990b4e0058e2b7f8880d6a6e5650e1ee65965d4d11"} Mar 10 08:30:22 crc kubenswrapper[4825]: I0310 08:30:22.970673 4825 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-plzcw" event={"ID":"69c8a03d-771a-4b7b-a3c5-14cfe89337fc","Type":"ContainerDied","Data":"289f2e31cab609b69aa6dce3bed3115bd5fbe5e60eb761c91bd45a0d6c180dfe"} Mar 10 08:30:22 crc kubenswrapper[4825]: I0310 08:30:22.970680 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-plzcw" Mar 10 08:30:22 crc kubenswrapper[4825]: I0310 08:30:22.970695 4825 scope.go:117] "RemoveContainer" containerID="0b72b90a1b4dea17637c00990b4e0058e2b7f8880d6a6e5650e1ee65965d4d11" Mar 10 08:30:23 crc kubenswrapper[4825]: I0310 08:30:23.007439 4825 scope.go:117] "RemoveContainer" containerID="ebbed588bcf55313c83f0f9c6063394ce912a2cb4af9d57f9f2a629bde197844" Mar 10 08:30:23 crc kubenswrapper[4825]: I0310 08:30:23.019264 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-plzcw"] Mar 10 08:30:23 crc kubenswrapper[4825]: I0310 08:30:23.033476 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-plzcw"] Mar 10 08:30:23 crc kubenswrapper[4825]: I0310 08:30:23.043868 4825 scope.go:117] "RemoveContainer" containerID="fea154872d18245251c5ec97c6189a83abbcf5ff495ad22dc8d36b4c663a86e1" Mar 10 08:30:23 crc kubenswrapper[4825]: I0310 08:30:23.100088 4825 scope.go:117] "RemoveContainer" containerID="0b72b90a1b4dea17637c00990b4e0058e2b7f8880d6a6e5650e1ee65965d4d11" Mar 10 08:30:23 crc kubenswrapper[4825]: E0310 08:30:23.100551 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b72b90a1b4dea17637c00990b4e0058e2b7f8880d6a6e5650e1ee65965d4d11\": container with ID starting with 0b72b90a1b4dea17637c00990b4e0058e2b7f8880d6a6e5650e1ee65965d4d11 not found: ID does not exist" containerID="0b72b90a1b4dea17637c00990b4e0058e2b7f8880d6a6e5650e1ee65965d4d11" Mar 10 08:30:23 crc kubenswrapper[4825]: I0310 08:30:23.100591 4825 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b72b90a1b4dea17637c00990b4e0058e2b7f8880d6a6e5650e1ee65965d4d11"} err="failed to get container status \"0b72b90a1b4dea17637c00990b4e0058e2b7f8880d6a6e5650e1ee65965d4d11\": rpc error: code = NotFound desc = could not find container \"0b72b90a1b4dea17637c00990b4e0058e2b7f8880d6a6e5650e1ee65965d4d11\": container with ID starting with 0b72b90a1b4dea17637c00990b4e0058e2b7f8880d6a6e5650e1ee65965d4d11 not found: ID does not exist" Mar 10 08:30:23 crc kubenswrapper[4825]: I0310 08:30:23.100618 4825 scope.go:117] "RemoveContainer" containerID="ebbed588bcf55313c83f0f9c6063394ce912a2cb4af9d57f9f2a629bde197844" Mar 10 08:30:23 crc kubenswrapper[4825]: E0310 08:30:23.101126 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebbed588bcf55313c83f0f9c6063394ce912a2cb4af9d57f9f2a629bde197844\": container with ID starting with ebbed588bcf55313c83f0f9c6063394ce912a2cb4af9d57f9f2a629bde197844 not found: ID does not exist" containerID="ebbed588bcf55313c83f0f9c6063394ce912a2cb4af9d57f9f2a629bde197844" Mar 10 08:30:23 crc kubenswrapper[4825]: I0310 08:30:23.101165 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebbed588bcf55313c83f0f9c6063394ce912a2cb4af9d57f9f2a629bde197844"} err="failed to get container status \"ebbed588bcf55313c83f0f9c6063394ce912a2cb4af9d57f9f2a629bde197844\": rpc error: code = NotFound desc = could not find container \"ebbed588bcf55313c83f0f9c6063394ce912a2cb4af9d57f9f2a629bde197844\": container with ID starting with ebbed588bcf55313c83f0f9c6063394ce912a2cb4af9d57f9f2a629bde197844 not found: ID does not exist" Mar 10 08:30:23 crc kubenswrapper[4825]: I0310 08:30:23.101181 4825 scope.go:117] "RemoveContainer" containerID="fea154872d18245251c5ec97c6189a83abbcf5ff495ad22dc8d36b4c663a86e1" Mar 10 08:30:23 crc kubenswrapper[4825]: E0310 
08:30:23.101426 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fea154872d18245251c5ec97c6189a83abbcf5ff495ad22dc8d36b4c663a86e1\": container with ID starting with fea154872d18245251c5ec97c6189a83abbcf5ff495ad22dc8d36b4c663a86e1 not found: ID does not exist" containerID="fea154872d18245251c5ec97c6189a83abbcf5ff495ad22dc8d36b4c663a86e1" Mar 10 08:30:23 crc kubenswrapper[4825]: I0310 08:30:23.101447 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fea154872d18245251c5ec97c6189a83abbcf5ff495ad22dc8d36b4c663a86e1"} err="failed to get container status \"fea154872d18245251c5ec97c6189a83abbcf5ff495ad22dc8d36b4c663a86e1\": rpc error: code = NotFound desc = could not find container \"fea154872d18245251c5ec97c6189a83abbcf5ff495ad22dc8d36b4c663a86e1\": container with ID starting with fea154872d18245251c5ec97c6189a83abbcf5ff495ad22dc8d36b4c663a86e1 not found: ID does not exist" Mar 10 08:30:23 crc kubenswrapper[4825]: I0310 08:30:23.247746 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69c8a03d-771a-4b7b-a3c5-14cfe89337fc" path="/var/lib/kubelet/pods/69c8a03d-771a-4b7b-a3c5-14cfe89337fc/volumes" Mar 10 08:30:26 crc kubenswrapper[4825]: I0310 08:30:26.237094 4825 scope.go:117] "RemoveContainer" containerID="e66b7b78375632673167d52203536b8000b1c9a9d311efe989c77daf24bd4cfc" Mar 10 08:30:26 crc kubenswrapper[4825]: E0310 08:30:26.237675 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:30:40 crc kubenswrapper[4825]: I0310 08:30:40.236647 
4825 scope.go:117] "RemoveContainer" containerID="e66b7b78375632673167d52203536b8000b1c9a9d311efe989c77daf24bd4cfc" Mar 10 08:30:40 crc kubenswrapper[4825]: E0310 08:30:40.237520 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:30:51 crc kubenswrapper[4825]: I0310 08:30:51.236993 4825 scope.go:117] "RemoveContainer" containerID="e66b7b78375632673167d52203536b8000b1c9a9d311efe989c77daf24bd4cfc" Mar 10 08:30:52 crc kubenswrapper[4825]: I0310 08:30:52.279246 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerStarted","Data":"ba716d5199aaa639d1807a8298ac7530f73bce5b83b5cee2bf9ea8fb07b40b32"} Mar 10 08:30:52 crc kubenswrapper[4825]: I0310 08:30:52.525384 4825 scope.go:117] "RemoveContainer" containerID="9c66832fb69ed2b1d1cd3a56d2e3225e31beb613c4c6545f10fcc37a148ede04" Mar 10 08:30:52 crc kubenswrapper[4825]: I0310 08:30:52.577417 4825 scope.go:117] "RemoveContainer" containerID="282fd61201a2bb2509a065ff2805533500df406771d4f343ac95615c7c66cf18" Mar 10 08:32:00 crc kubenswrapper[4825]: I0310 08:32:00.043623 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-pv5lc"] Mar 10 08:32:00 crc kubenswrapper[4825]: I0310 08:32:00.056665 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-pv5lc"] Mar 10 08:32:00 crc kubenswrapper[4825]: I0310 08:32:00.142535 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552192-7hvcs"] Mar 10 08:32:00 crc 
kubenswrapper[4825]: E0310 08:32:00.142909 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69c8a03d-771a-4b7b-a3c5-14cfe89337fc" containerName="extract-content"
Mar 10 08:32:00 crc kubenswrapper[4825]: I0310 08:32:00.142925 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="69c8a03d-771a-4b7b-a3c5-14cfe89337fc" containerName="extract-content"
Mar 10 08:32:00 crc kubenswrapper[4825]: E0310 08:32:00.142946 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69c8a03d-771a-4b7b-a3c5-14cfe89337fc" containerName="registry-server"
Mar 10 08:32:00 crc kubenswrapper[4825]: I0310 08:32:00.142953 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="69c8a03d-771a-4b7b-a3c5-14cfe89337fc" containerName="registry-server"
Mar 10 08:32:00 crc kubenswrapper[4825]: E0310 08:32:00.142969 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69c8a03d-771a-4b7b-a3c5-14cfe89337fc" containerName="extract-utilities"
Mar 10 08:32:00 crc kubenswrapper[4825]: I0310 08:32:00.142976 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="69c8a03d-771a-4b7b-a3c5-14cfe89337fc" containerName="extract-utilities"
Mar 10 08:32:00 crc kubenswrapper[4825]: I0310 08:32:00.143160 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="69c8a03d-771a-4b7b-a3c5-14cfe89337fc" containerName="registry-server"
Mar 10 08:32:00 crc kubenswrapper[4825]: I0310 08:32:00.143823 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552192-7hvcs"
Mar 10 08:32:00 crc kubenswrapper[4825]: I0310 08:32:00.146041 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 08:32:00 crc kubenswrapper[4825]: I0310 08:32:00.146185 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn"
Mar 10 08:32:00 crc kubenswrapper[4825]: I0310 08:32:00.146223 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 08:32:00 crc kubenswrapper[4825]: I0310 08:32:00.158569 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552192-7hvcs"]
Mar 10 08:32:00 crc kubenswrapper[4825]: I0310 08:32:00.243328 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54kvg\" (UniqueName: \"kubernetes.io/projected/39980f78-dbfe-469c-935f-83a2acb5e039-kube-api-access-54kvg\") pod \"auto-csr-approver-29552192-7hvcs\" (UID: \"39980f78-dbfe-469c-935f-83a2acb5e039\") " pod="openshift-infra/auto-csr-approver-29552192-7hvcs"
Mar 10 08:32:00 crc kubenswrapper[4825]: I0310 08:32:00.345933 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54kvg\" (UniqueName: \"kubernetes.io/projected/39980f78-dbfe-469c-935f-83a2acb5e039-kube-api-access-54kvg\") pod \"auto-csr-approver-29552192-7hvcs\" (UID: \"39980f78-dbfe-469c-935f-83a2acb5e039\") " pod="openshift-infra/auto-csr-approver-29552192-7hvcs"
Mar 10 08:32:00 crc kubenswrapper[4825]: I0310 08:32:00.368009 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54kvg\" (UniqueName: \"kubernetes.io/projected/39980f78-dbfe-469c-935f-83a2acb5e039-kube-api-access-54kvg\") pod \"auto-csr-approver-29552192-7hvcs\" (UID: \"39980f78-dbfe-469c-935f-83a2acb5e039\") " pod="openshift-infra/auto-csr-approver-29552192-7hvcs"
Mar 10 08:32:00 crc kubenswrapper[4825]: I0310 08:32:00.468813 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552192-7hvcs"
Mar 10 08:32:00 crc kubenswrapper[4825]: I0310 08:32:00.904926 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552192-7hvcs"]
Mar 10 08:32:01 crc kubenswrapper[4825]: I0310 08:32:01.025827 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-2379-account-create-update-bq46z"]
Mar 10 08:32:01 crc kubenswrapper[4825]: I0310 08:32:01.035078 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-2379-account-create-update-bq46z"]
Mar 10 08:32:01 crc kubenswrapper[4825]: I0310 08:32:01.256762 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="320db231-7a1a-4a9c-aa42-dbc5593be19b" path="/var/lib/kubelet/pods/320db231-7a1a-4a9c-aa42-dbc5593be19b/volumes"
Mar 10 08:32:01 crc kubenswrapper[4825]: I0310 08:32:01.257782 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e1976b7-f398-46fb-b376-64939c737883" path="/var/lib/kubelet/pods/7e1976b7-f398-46fb-b376-64939c737883/volumes"
Mar 10 08:32:01 crc kubenswrapper[4825]: I0310 08:32:01.912860 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552192-7hvcs" event={"ID":"39980f78-dbfe-469c-935f-83a2acb5e039","Type":"ContainerStarted","Data":"e309b26d7b8ec79da8a24a43f9e6299b9fca460a52addc88b44cb5eec15c8d0d"}
Mar 10 08:32:02 crc kubenswrapper[4825]: I0310 08:32:02.924982 4825 generic.go:334] "Generic (PLEG): container finished" podID="39980f78-dbfe-469c-935f-83a2acb5e039" containerID="acfaf4e3804aaa9fe787732f6d0f2dc4f6949e094c5eec80aefd94fb535cf683" exitCode=0
Mar 10 08:32:02 crc kubenswrapper[4825]: I0310 08:32:02.925046 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552192-7hvcs" event={"ID":"39980f78-dbfe-469c-935f-83a2acb5e039","Type":"ContainerDied","Data":"acfaf4e3804aaa9fe787732f6d0f2dc4f6949e094c5eec80aefd94fb535cf683"}
Mar 10 08:32:04 crc kubenswrapper[4825]: I0310 08:32:04.264047 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552192-7hvcs"
Mar 10 08:32:04 crc kubenswrapper[4825]: I0310 08:32:04.329584 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54kvg\" (UniqueName: \"kubernetes.io/projected/39980f78-dbfe-469c-935f-83a2acb5e039-kube-api-access-54kvg\") pod \"39980f78-dbfe-469c-935f-83a2acb5e039\" (UID: \"39980f78-dbfe-469c-935f-83a2acb5e039\") "
Mar 10 08:32:04 crc kubenswrapper[4825]: I0310 08:32:04.335423 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39980f78-dbfe-469c-935f-83a2acb5e039-kube-api-access-54kvg" (OuterVolumeSpecName: "kube-api-access-54kvg") pod "39980f78-dbfe-469c-935f-83a2acb5e039" (UID: "39980f78-dbfe-469c-935f-83a2acb5e039"). InnerVolumeSpecName "kube-api-access-54kvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 08:32:04 crc kubenswrapper[4825]: I0310 08:32:04.432220 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54kvg\" (UniqueName: \"kubernetes.io/projected/39980f78-dbfe-469c-935f-83a2acb5e039-kube-api-access-54kvg\") on node \"crc\" DevicePath \"\""
Mar 10 08:32:04 crc kubenswrapper[4825]: I0310 08:32:04.942756 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552192-7hvcs" event={"ID":"39980f78-dbfe-469c-935f-83a2acb5e039","Type":"ContainerDied","Data":"e309b26d7b8ec79da8a24a43f9e6299b9fca460a52addc88b44cb5eec15c8d0d"}
Mar 10 08:32:04 crc kubenswrapper[4825]: I0310 08:32:04.942804 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e309b26d7b8ec79da8a24a43f9e6299b9fca460a52addc88b44cb5eec15c8d0d"
Mar 10 08:32:04 crc kubenswrapper[4825]: I0310 08:32:04.942859 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552192-7hvcs"
Mar 10 08:32:05 crc kubenswrapper[4825]: I0310 08:32:05.326504 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552186-z88l7"]
Mar 10 08:32:05 crc kubenswrapper[4825]: I0310 08:32:05.335510 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552186-z88l7"]
Mar 10 08:32:07 crc kubenswrapper[4825]: I0310 08:32:07.257415 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fc5e5dd-4c5e-4729-8750-b65c6e70e83d" path="/var/lib/kubelet/pods/8fc5e5dd-4c5e-4729-8750-b65c6e70e83d/volumes"
Mar 10 08:32:17 crc kubenswrapper[4825]: I0310 08:32:17.040588 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-fv2c7"]
Mar 10 08:32:17 crc kubenswrapper[4825]: I0310 08:32:17.047792 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-fv2c7"]
Mar 10 08:32:17 crc kubenswrapper[4825]: I0310 08:32:17.251960 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca" path="/var/lib/kubelet/pods/d4e5f6c8-a2c4-416a-8f63-d88e5b1709ca/volumes"
Mar 10 08:32:52 crc kubenswrapper[4825]: I0310 08:32:52.717291 4825 scope.go:117] "RemoveContainer" containerID="c7a2cc348757ad3a8facf25c670353306edff3a7e1faa407a581d0eb6786ec65"
Mar 10 08:32:52 crc kubenswrapper[4825]: I0310 08:32:52.754446 4825 scope.go:117] "RemoveContainer" containerID="7c3510e3af6f6a5910d61e21a734847a5d4ade589767e8f52829f5b891c450b7"
Mar 10 08:32:52 crc kubenswrapper[4825]: I0310 08:32:52.822300 4825 scope.go:117] "RemoveContainer" containerID="10d28f408366d3a98aa2961943ac9fe1c11d4640f9c9e39ef75970fa87943e77"
Mar 10 08:32:52 crc kubenswrapper[4825]: I0310 08:32:52.890313 4825 scope.go:117] "RemoveContainer" containerID="b1ef5efcc7904aa7c5ade0c1281756ff3bbc47d6ded7eb42771dd36a36df9448"
Mar 10 08:33:16 crc kubenswrapper[4825]: I0310 08:33:16.888383 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 08:33:16 crc kubenswrapper[4825]: I0310 08:33:16.889758 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 08:33:46 crc kubenswrapper[4825]: I0310 08:33:46.888724 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 08:33:46 crc kubenswrapper[4825]: I0310 08:33:46.889363 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 08:34:00 crc kubenswrapper[4825]: I0310 08:34:00.163014 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552194-t8xbz"]
Mar 10 08:34:00 crc kubenswrapper[4825]: E0310 08:34:00.164343 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39980f78-dbfe-469c-935f-83a2acb5e039" containerName="oc"
Mar 10 08:34:00 crc kubenswrapper[4825]: I0310 08:34:00.164367 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="39980f78-dbfe-469c-935f-83a2acb5e039" containerName="oc"
Mar 10 08:34:00 crc kubenswrapper[4825]: I0310 08:34:00.164820 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="39980f78-dbfe-469c-935f-83a2acb5e039" containerName="oc"
Mar 10 08:34:00 crc kubenswrapper[4825]: I0310 08:34:00.166317 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552194-t8xbz"
Mar 10 08:34:00 crc kubenswrapper[4825]: I0310 08:34:00.169587 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn"
Mar 10 08:34:00 crc kubenswrapper[4825]: I0310 08:34:00.169850 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 08:34:00 crc kubenswrapper[4825]: I0310 08:34:00.170475 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 08:34:00 crc kubenswrapper[4825]: I0310 08:34:00.175209 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552194-t8xbz"]
Mar 10 08:34:00 crc kubenswrapper[4825]: I0310 08:34:00.362728 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zq67\" (UniqueName: \"kubernetes.io/projected/a13eabac-e76b-4096-a5de-9bb438d19a1e-kube-api-access-6zq67\") pod \"auto-csr-approver-29552194-t8xbz\" (UID: \"a13eabac-e76b-4096-a5de-9bb438d19a1e\") " pod="openshift-infra/auto-csr-approver-29552194-t8xbz"
Mar 10 08:34:00 crc kubenswrapper[4825]: I0310 08:34:00.466282 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zq67\" (UniqueName: \"kubernetes.io/projected/a13eabac-e76b-4096-a5de-9bb438d19a1e-kube-api-access-6zq67\") pod \"auto-csr-approver-29552194-t8xbz\" (UID: \"a13eabac-e76b-4096-a5de-9bb438d19a1e\") " pod="openshift-infra/auto-csr-approver-29552194-t8xbz"
Mar 10 08:34:00 crc kubenswrapper[4825]: I0310 08:34:00.490108 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zq67\" (UniqueName: \"kubernetes.io/projected/a13eabac-e76b-4096-a5de-9bb438d19a1e-kube-api-access-6zq67\") pod \"auto-csr-approver-29552194-t8xbz\" (UID: \"a13eabac-e76b-4096-a5de-9bb438d19a1e\") " pod="openshift-infra/auto-csr-approver-29552194-t8xbz"
Mar 10 08:34:00 crc kubenswrapper[4825]: I0310 08:34:00.715647 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pm5bb"]
Mar 10 08:34:00 crc kubenswrapper[4825]: I0310 08:34:00.718053 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pm5bb"
Mar 10 08:34:00 crc kubenswrapper[4825]: I0310 08:34:00.736876 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pm5bb"]
Mar 10 08:34:00 crc kubenswrapper[4825]: I0310 08:34:00.789950 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552194-t8xbz"
Mar 10 08:34:00 crc kubenswrapper[4825]: I0310 08:34:00.875571 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b12b6a6-2681-4223-8add-f86ed3fac540-utilities\") pod \"certified-operators-pm5bb\" (UID: \"8b12b6a6-2681-4223-8add-f86ed3fac540\") " pod="openshift-marketplace/certified-operators-pm5bb"
Mar 10 08:34:00 crc kubenswrapper[4825]: I0310 08:34:00.875991 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5jbz\" (UniqueName: \"kubernetes.io/projected/8b12b6a6-2681-4223-8add-f86ed3fac540-kube-api-access-n5jbz\") pod \"certified-operators-pm5bb\" (UID: \"8b12b6a6-2681-4223-8add-f86ed3fac540\") " pod="openshift-marketplace/certified-operators-pm5bb"
Mar 10 08:34:00 crc kubenswrapper[4825]: I0310 08:34:00.876032 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b12b6a6-2681-4223-8add-f86ed3fac540-catalog-content\") pod \"certified-operators-pm5bb\" (UID: \"8b12b6a6-2681-4223-8add-f86ed3fac540\") " pod="openshift-marketplace/certified-operators-pm5bb"
Mar 10 08:34:00 crc kubenswrapper[4825]: I0310 08:34:00.979302 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b12b6a6-2681-4223-8add-f86ed3fac540-utilities\") pod \"certified-operators-pm5bb\" (UID: \"8b12b6a6-2681-4223-8add-f86ed3fac540\") " pod="openshift-marketplace/certified-operators-pm5bb"
Mar 10 08:34:00 crc kubenswrapper[4825]: I0310 08:34:00.979388 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5jbz\" (UniqueName: \"kubernetes.io/projected/8b12b6a6-2681-4223-8add-f86ed3fac540-kube-api-access-n5jbz\") pod \"certified-operators-pm5bb\" (UID: \"8b12b6a6-2681-4223-8add-f86ed3fac540\") " pod="openshift-marketplace/certified-operators-pm5bb"
Mar 10 08:34:00 crc kubenswrapper[4825]: I0310 08:34:00.979409 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b12b6a6-2681-4223-8add-f86ed3fac540-catalog-content\") pod \"certified-operators-pm5bb\" (UID: \"8b12b6a6-2681-4223-8add-f86ed3fac540\") " pod="openshift-marketplace/certified-operators-pm5bb"
Mar 10 08:34:00 crc kubenswrapper[4825]: I0310 08:34:00.981849 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b12b6a6-2681-4223-8add-f86ed3fac540-catalog-content\") pod \"certified-operators-pm5bb\" (UID: \"8b12b6a6-2681-4223-8add-f86ed3fac540\") " pod="openshift-marketplace/certified-operators-pm5bb"
Mar 10 08:34:00 crc kubenswrapper[4825]: I0310 08:34:00.981873 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b12b6a6-2681-4223-8add-f86ed3fac540-utilities\") pod \"certified-operators-pm5bb\" (UID: \"8b12b6a6-2681-4223-8add-f86ed3fac540\") " pod="openshift-marketplace/certified-operators-pm5bb"
Mar 10 08:34:00 crc kubenswrapper[4825]: I0310 08:34:00.999634 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5jbz\" (UniqueName: \"kubernetes.io/projected/8b12b6a6-2681-4223-8add-f86ed3fac540-kube-api-access-n5jbz\") pod \"certified-operators-pm5bb\" (UID: \"8b12b6a6-2681-4223-8add-f86ed3fac540\") " pod="openshift-marketplace/certified-operators-pm5bb"
Mar 10 08:34:01 crc kubenswrapper[4825]: I0310 08:34:01.051564 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pm5bb"
Mar 10 08:34:01 crc kubenswrapper[4825]: I0310 08:34:01.297465 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552194-t8xbz"]
Mar 10 08:34:01 crc kubenswrapper[4825]: I0310 08:34:01.582438 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pm5bb"]
Mar 10 08:34:02 crc kubenswrapper[4825]: I0310 08:34:02.054685 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552194-t8xbz" event={"ID":"a13eabac-e76b-4096-a5de-9bb438d19a1e","Type":"ContainerStarted","Data":"adaadab900e3baf15f6db006319a85e53f898f1bd9d43fd7b472fc97d4dfe774"}
Mar 10 08:34:02 crc kubenswrapper[4825]: I0310 08:34:02.056702 4825 generic.go:334] "Generic (PLEG): container finished" podID="8b12b6a6-2681-4223-8add-f86ed3fac540" containerID="6aa441167a98d05449474b288c1577f9f99bef7e70b7c23f5e46664f029f9678" exitCode=0
Mar 10 08:34:02 crc kubenswrapper[4825]: I0310 08:34:02.056737 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pm5bb" event={"ID":"8b12b6a6-2681-4223-8add-f86ed3fac540","Type":"ContainerDied","Data":"6aa441167a98d05449474b288c1577f9f99bef7e70b7c23f5e46664f029f9678"}
Mar 10 08:34:02 crc kubenswrapper[4825]: I0310 08:34:02.056904 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pm5bb" event={"ID":"8b12b6a6-2681-4223-8add-f86ed3fac540","Type":"ContainerStarted","Data":"5930e438aeb590bd9e1820b3ac31d48eb9904efbf5a79bfeb5abf069f8bbc6fb"}
Mar 10 08:34:03 crc kubenswrapper[4825]: I0310 08:34:03.066455 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pm5bb" event={"ID":"8b12b6a6-2681-4223-8add-f86ed3fac540","Type":"ContainerStarted","Data":"ce39e4eaec358098ae387081e6b2f7cc7f1afe0e87804ea4b96d83d9512d6bce"}
Mar 10 08:34:03 crc kubenswrapper[4825]: I0310 08:34:03.068749 4825 generic.go:334] "Generic (PLEG): container finished" podID="a13eabac-e76b-4096-a5de-9bb438d19a1e" containerID="5c0e1fdc5ff89d22e308b9fb880226d71a12c2676799a9486b6f2a3d3322e033" exitCode=0
Mar 10 08:34:03 crc kubenswrapper[4825]: I0310 08:34:03.068777 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552194-t8xbz" event={"ID":"a13eabac-e76b-4096-a5de-9bb438d19a1e","Type":"ContainerDied","Data":"5c0e1fdc5ff89d22e308b9fb880226d71a12c2676799a9486b6f2a3d3322e033"}
Mar 10 08:34:04 crc kubenswrapper[4825]: I0310 08:34:04.467270 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552194-t8xbz"
Mar 10 08:34:04 crc kubenswrapper[4825]: I0310 08:34:04.569649 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zq67\" (UniqueName: \"kubernetes.io/projected/a13eabac-e76b-4096-a5de-9bb438d19a1e-kube-api-access-6zq67\") pod \"a13eabac-e76b-4096-a5de-9bb438d19a1e\" (UID: \"a13eabac-e76b-4096-a5de-9bb438d19a1e\") "
Mar 10 08:34:04 crc kubenswrapper[4825]: I0310 08:34:04.574446 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a13eabac-e76b-4096-a5de-9bb438d19a1e-kube-api-access-6zq67" (OuterVolumeSpecName: "kube-api-access-6zq67") pod "a13eabac-e76b-4096-a5de-9bb438d19a1e" (UID: "a13eabac-e76b-4096-a5de-9bb438d19a1e"). InnerVolumeSpecName "kube-api-access-6zq67". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 08:34:04 crc kubenswrapper[4825]: I0310 08:34:04.672651 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zq67\" (UniqueName: \"kubernetes.io/projected/a13eabac-e76b-4096-a5de-9bb438d19a1e-kube-api-access-6zq67\") on node \"crc\" DevicePath \"\""
Mar 10 08:34:05 crc kubenswrapper[4825]: I0310 08:34:05.088797 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552194-t8xbz"
Mar 10 08:34:05 crc kubenswrapper[4825]: I0310 08:34:05.088777 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552194-t8xbz" event={"ID":"a13eabac-e76b-4096-a5de-9bb438d19a1e","Type":"ContainerDied","Data":"adaadab900e3baf15f6db006319a85e53f898f1bd9d43fd7b472fc97d4dfe774"}
Mar 10 08:34:05 crc kubenswrapper[4825]: I0310 08:34:05.089294 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adaadab900e3baf15f6db006319a85e53f898f1bd9d43fd7b472fc97d4dfe774"
Mar 10 08:34:05 crc kubenswrapper[4825]: I0310 08:34:05.091194 4825 generic.go:334] "Generic (PLEG): container finished" podID="8b12b6a6-2681-4223-8add-f86ed3fac540" containerID="ce39e4eaec358098ae387081e6b2f7cc7f1afe0e87804ea4b96d83d9512d6bce" exitCode=0
Mar 10 08:34:05 crc kubenswrapper[4825]: I0310 08:34:05.091230 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pm5bb" event={"ID":"8b12b6a6-2681-4223-8add-f86ed3fac540","Type":"ContainerDied","Data":"ce39e4eaec358098ae387081e6b2f7cc7f1afe0e87804ea4b96d83d9512d6bce"}
Mar 10 08:34:05 crc kubenswrapper[4825]: I0310 08:34:05.550081 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552188-hq4rj"]
Mar 10 08:34:05 crc kubenswrapper[4825]: I0310 08:34:05.564945 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552188-hq4rj"]
Mar 10 08:34:06 crc kubenswrapper[4825]: I0310 08:34:06.107487 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pm5bb" event={"ID":"8b12b6a6-2681-4223-8add-f86ed3fac540","Type":"ContainerStarted","Data":"8c0c735b2731aa181b3860728eeda341fa0bd9c15fc2f69d8579c45d3fb59fb6"}
Mar 10 08:34:06 crc kubenswrapper[4825]: I0310 08:34:06.131354 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pm5bb" podStartSLOduration=2.634833335 podStartE2EDuration="6.131330883s" podCreationTimestamp="2026-03-10 08:34:00 +0000 UTC" firstStartedPulling="2026-03-10 08:34:02.058994323 +0000 UTC m=+6595.088774938" lastFinishedPulling="2026-03-10 08:34:05.555491861 +0000 UTC m=+6598.585272486" observedRunningTime="2026-03-10 08:34:06.130159801 +0000 UTC m=+6599.159940456" watchObservedRunningTime="2026-03-10 08:34:06.131330883 +0000 UTC m=+6599.161111538"
Mar 10 08:34:07 crc kubenswrapper[4825]: I0310 08:34:07.253481 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9d54311-e96f-4179-a4c7-061199831901" path="/var/lib/kubelet/pods/c9d54311-e96f-4179-a4c7-061199831901/volumes"
Mar 10 08:34:11 crc kubenswrapper[4825]: I0310 08:34:11.052064 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pm5bb"
Mar 10 08:34:11 crc kubenswrapper[4825]: I0310 08:34:11.052623 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pm5bb"
Mar 10 08:34:11 crc kubenswrapper[4825]: I0310 08:34:11.100841 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pm5bb"
Mar 10 08:34:11 crc kubenswrapper[4825]: I0310 08:34:11.191203 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pm5bb"
Mar 10 08:34:11 crc kubenswrapper[4825]: I0310 08:34:11.341335 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pm5bb"]
Mar 10 08:34:13 crc kubenswrapper[4825]: I0310 08:34:13.174876 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pm5bb" podUID="8b12b6a6-2681-4223-8add-f86ed3fac540" containerName="registry-server" containerID="cri-o://8c0c735b2731aa181b3860728eeda341fa0bd9c15fc2f69d8579c45d3fb59fb6" gracePeriod=2
Mar 10 08:34:13 crc kubenswrapper[4825]: I0310 08:34:13.727260 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pm5bb"
Mar 10 08:34:13 crc kubenswrapper[4825]: I0310 08:34:13.876299 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5jbz\" (UniqueName: \"kubernetes.io/projected/8b12b6a6-2681-4223-8add-f86ed3fac540-kube-api-access-n5jbz\") pod \"8b12b6a6-2681-4223-8add-f86ed3fac540\" (UID: \"8b12b6a6-2681-4223-8add-f86ed3fac540\") "
Mar 10 08:34:13 crc kubenswrapper[4825]: I0310 08:34:13.876592 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b12b6a6-2681-4223-8add-f86ed3fac540-catalog-content\") pod \"8b12b6a6-2681-4223-8add-f86ed3fac540\" (UID: \"8b12b6a6-2681-4223-8add-f86ed3fac540\") "
Mar 10 08:34:13 crc kubenswrapper[4825]: I0310 08:34:13.876825 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b12b6a6-2681-4223-8add-f86ed3fac540-utilities\") pod \"8b12b6a6-2681-4223-8add-f86ed3fac540\" (UID: \"8b12b6a6-2681-4223-8add-f86ed3fac540\") "
Mar 10 08:34:13 crc kubenswrapper[4825]: I0310 08:34:13.879031 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b12b6a6-2681-4223-8add-f86ed3fac540-utilities" (OuterVolumeSpecName: "utilities") pod "8b12b6a6-2681-4223-8add-f86ed3fac540" (UID: "8b12b6a6-2681-4223-8add-f86ed3fac540"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 08:34:13 crc kubenswrapper[4825]: I0310 08:34:13.897051 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b12b6a6-2681-4223-8add-f86ed3fac540-kube-api-access-n5jbz" (OuterVolumeSpecName: "kube-api-access-n5jbz") pod "8b12b6a6-2681-4223-8add-f86ed3fac540" (UID: "8b12b6a6-2681-4223-8add-f86ed3fac540"). InnerVolumeSpecName "kube-api-access-n5jbz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 08:34:13 crc kubenswrapper[4825]: I0310 08:34:13.967927 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b12b6a6-2681-4223-8add-f86ed3fac540-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8b12b6a6-2681-4223-8add-f86ed3fac540" (UID: "8b12b6a6-2681-4223-8add-f86ed3fac540"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 08:34:13 crc kubenswrapper[4825]: I0310 08:34:13.980480 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b12b6a6-2681-4223-8add-f86ed3fac540-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 08:34:13 crc kubenswrapper[4825]: I0310 08:34:13.980536 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5jbz\" (UniqueName: \"kubernetes.io/projected/8b12b6a6-2681-4223-8add-f86ed3fac540-kube-api-access-n5jbz\") on node \"crc\" DevicePath \"\""
Mar 10 08:34:13 crc kubenswrapper[4825]: I0310 08:34:13.980557 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b12b6a6-2681-4223-8add-f86ed3fac540-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 08:34:14 crc kubenswrapper[4825]: I0310 08:34:14.185674 4825 generic.go:334] "Generic (PLEG): container finished" podID="8b12b6a6-2681-4223-8add-f86ed3fac540" containerID="8c0c735b2731aa181b3860728eeda341fa0bd9c15fc2f69d8579c45d3fb59fb6" exitCode=0
Mar 10 08:34:14 crc kubenswrapper[4825]: I0310 08:34:14.185792 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pm5bb"
Mar 10 08:34:14 crc kubenswrapper[4825]: I0310 08:34:14.185807 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pm5bb" event={"ID":"8b12b6a6-2681-4223-8add-f86ed3fac540","Type":"ContainerDied","Data":"8c0c735b2731aa181b3860728eeda341fa0bd9c15fc2f69d8579c45d3fb59fb6"}
Mar 10 08:34:14 crc kubenswrapper[4825]: I0310 08:34:14.186197 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pm5bb" event={"ID":"8b12b6a6-2681-4223-8add-f86ed3fac540","Type":"ContainerDied","Data":"5930e438aeb590bd9e1820b3ac31d48eb9904efbf5a79bfeb5abf069f8bbc6fb"}
Mar 10 08:34:14 crc kubenswrapper[4825]: I0310 08:34:14.186229 4825 scope.go:117] "RemoveContainer" containerID="8c0c735b2731aa181b3860728eeda341fa0bd9c15fc2f69d8579c45d3fb59fb6"
Mar 10 08:34:14 crc kubenswrapper[4825]: I0310 08:34:14.207395 4825 scope.go:117] "RemoveContainer" containerID="ce39e4eaec358098ae387081e6b2f7cc7f1afe0e87804ea4b96d83d9512d6bce"
Mar 10 08:34:14 crc kubenswrapper[4825]: I0310 08:34:14.240249 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pm5bb"]
Mar 10 08:34:14 crc kubenswrapper[4825]: I0310 08:34:14.252823 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pm5bb"]
Mar 10 08:34:14 crc kubenswrapper[4825]: I0310 08:34:14.260518 4825 scope.go:117] "RemoveContainer" containerID="6aa441167a98d05449474b288c1577f9f99bef7e70b7c23f5e46664f029f9678"
Mar 10 08:34:14 crc kubenswrapper[4825]: I0310 08:34:14.303053 4825 scope.go:117] "RemoveContainer" containerID="8c0c735b2731aa181b3860728eeda341fa0bd9c15fc2f69d8579c45d3fb59fb6"
Mar 10 08:34:14 crc kubenswrapper[4825]: E0310 08:34:14.303656 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c0c735b2731aa181b3860728eeda341fa0bd9c15fc2f69d8579c45d3fb59fb6\": container with ID starting with 8c0c735b2731aa181b3860728eeda341fa0bd9c15fc2f69d8579c45d3fb59fb6 not found: ID does not exist" containerID="8c0c735b2731aa181b3860728eeda341fa0bd9c15fc2f69d8579c45d3fb59fb6"
Mar 10 08:34:14 crc kubenswrapper[4825]: I0310 08:34:14.303702 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c0c735b2731aa181b3860728eeda341fa0bd9c15fc2f69d8579c45d3fb59fb6"} err="failed to get container status \"8c0c735b2731aa181b3860728eeda341fa0bd9c15fc2f69d8579c45d3fb59fb6\": rpc error: code = NotFound desc = could not find container \"8c0c735b2731aa181b3860728eeda341fa0bd9c15fc2f69d8579c45d3fb59fb6\": container with ID starting with 8c0c735b2731aa181b3860728eeda341fa0bd9c15fc2f69d8579c45d3fb59fb6 not found: ID does not exist"
Mar 10 08:34:14 crc kubenswrapper[4825]: I0310 08:34:14.303730 4825 scope.go:117] "RemoveContainer" containerID="ce39e4eaec358098ae387081e6b2f7cc7f1afe0e87804ea4b96d83d9512d6bce"
Mar 10 08:34:14 crc kubenswrapper[4825]: E0310 08:34:14.304203 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce39e4eaec358098ae387081e6b2f7cc7f1afe0e87804ea4b96d83d9512d6bce\": container with ID starting with ce39e4eaec358098ae387081e6b2f7cc7f1afe0e87804ea4b96d83d9512d6bce not found: ID does not exist" containerID="ce39e4eaec358098ae387081e6b2f7cc7f1afe0e87804ea4b96d83d9512d6bce"
Mar 10 08:34:14 crc kubenswrapper[4825]: I0310 08:34:14.304239 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce39e4eaec358098ae387081e6b2f7cc7f1afe0e87804ea4b96d83d9512d6bce"} err="failed to get container status \"ce39e4eaec358098ae387081e6b2f7cc7f1afe0e87804ea4b96d83d9512d6bce\": rpc error: code = NotFound desc = could not find container \"ce39e4eaec358098ae387081e6b2f7cc7f1afe0e87804ea4b96d83d9512d6bce\": container with ID starting with ce39e4eaec358098ae387081e6b2f7cc7f1afe0e87804ea4b96d83d9512d6bce not found: ID does not exist"
Mar 10 08:34:14 crc kubenswrapper[4825]: I0310 08:34:14.304268 4825 scope.go:117] "RemoveContainer" containerID="6aa441167a98d05449474b288c1577f9f99bef7e70b7c23f5e46664f029f9678"
Mar 10 08:34:14 crc kubenswrapper[4825]: E0310 08:34:14.304606 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6aa441167a98d05449474b288c1577f9f99bef7e70b7c23f5e46664f029f9678\": container with ID starting with 6aa441167a98d05449474b288c1577f9f99bef7e70b7c23f5e46664f029f9678 not found: ID does not exist" containerID="6aa441167a98d05449474b288c1577f9f99bef7e70b7c23f5e46664f029f9678"
Mar 10 08:34:14 crc kubenswrapper[4825]: I0310 08:34:14.304632 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6aa441167a98d05449474b288c1577f9f99bef7e70b7c23f5e46664f029f9678"} err="failed to get container status \"6aa441167a98d05449474b288c1577f9f99bef7e70b7c23f5e46664f029f9678\": rpc error: code = NotFound desc = could not find container \"6aa441167a98d05449474b288c1577f9f99bef7e70b7c23f5e46664f029f9678\": container with ID starting with 6aa441167a98d05449474b288c1577f9f99bef7e70b7c23f5e46664f029f9678 not found: ID does not exist"
Mar 10 08:34:15 crc kubenswrapper[4825]: I0310 08:34:15.282424 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b12b6a6-2681-4223-8add-f86ed3fac540" path="/var/lib/kubelet/pods/8b12b6a6-2681-4223-8add-f86ed3fac540/volumes"
Mar 10 08:34:16 crc kubenswrapper[4825]: I0310 08:34:16.888747 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 08:34:16 crc kubenswrapper[4825]: I0310 08:34:16.889059 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 08:34:16 crc kubenswrapper[4825]: I0310 08:34:16.889107 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j"
Mar 10 08:34:16 crc kubenswrapper[4825]: I0310 08:34:16.889945 4825 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ba716d5199aaa639d1807a8298ac7530f73bce5b83b5cee2bf9ea8fb07b40b32"} pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 10 08:34:16 crc kubenswrapper[4825]: I0310 08:34:16.889993 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" containerID="cri-o://ba716d5199aaa639d1807a8298ac7530f73bce5b83b5cee2bf9ea8fb07b40b32" gracePeriod=600
Mar 10 08:34:17 crc kubenswrapper[4825]: I0310 08:34:17.306567 4825 generic.go:334] "Generic (PLEG): container finished" podID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerID="ba716d5199aaa639d1807a8298ac7530f73bce5b83b5cee2bf9ea8fb07b40b32" exitCode=0
Mar 10 08:34:17 crc kubenswrapper[4825]: I0310 08:34:17.306732 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerDied","Data":"ba716d5199aaa639d1807a8298ac7530f73bce5b83b5cee2bf9ea8fb07b40b32"}
Mar 10 08:34:17 crc kubenswrapper[4825]: I0310 08:34:17.307293 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerStarted","Data":"e35b101378bc0043b3ed9549d6a5f8744f6f01d05a4da2d7392043a03928772d"}
Mar 10 08:34:17 crc kubenswrapper[4825]: I0310 08:34:17.307389 4825 scope.go:117] "RemoveContainer" containerID="e66b7b78375632673167d52203536b8000b1c9a9d311efe989c77daf24bd4cfc"
Mar 10 08:34:53 crc kubenswrapper[4825]: I0310 08:34:53.020531 4825 scope.go:117] "RemoveContainer" containerID="a83c12e06053a922044a9d7a9029364af4ef7ca9dd542c82769661855d1c8dc8"
Mar 10 08:34:54 crc kubenswrapper[4825]: I0310 08:34:54.056831 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-hfkl9"]
Mar 10 08:34:54 crc kubenswrapper[4825]: I0310 08:34:54.072218 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-hfkl9"]
Mar 10 08:34:54 crc kubenswrapper[4825]: I0310 08:34:54.079187 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-170a-account-create-update-dm2z4"]
Mar 10 08:34:54 crc kubenswrapper[4825]: I0310 08:34:54.086690 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-170a-account-create-update-dm2z4"]
Mar 10 08:34:55 crc kubenswrapper[4825]: I0310 08:34:55.248491 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a04468f-d0fe-40da-81ab-85ff0e683d0a" path="/var/lib/kubelet/pods/5a04468f-d0fe-40da-81ab-85ff0e683d0a/volumes"
Mar 10 08:34:55 crc kubenswrapper[4825]: I0310 08:34:55.249063 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb9c8dbc-8d08-41bc-92b9-807b45181044"
path="/var/lib/kubelet/pods/fb9c8dbc-8d08-41bc-92b9-807b45181044/volumes" Mar 10 08:35:07 crc kubenswrapper[4825]: I0310 08:35:07.046743 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-whppm"] Mar 10 08:35:07 crc kubenswrapper[4825]: I0310 08:35:07.058890 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-whppm"] Mar 10 08:35:07 crc kubenswrapper[4825]: I0310 08:35:07.252825 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5737928d-0f6e-441a-a129-f40bb9a984f6" path="/var/lib/kubelet/pods/5737928d-0f6e-441a-a129-f40bb9a984f6/volumes" Mar 10 08:35:53 crc kubenswrapper[4825]: I0310 08:35:53.132741 4825 scope.go:117] "RemoveContainer" containerID="3591ea9da8f8ff9ac66ee64d0cdc12fcbef2e5cdb74c1d2df5e52e9ccbeb7187" Mar 10 08:35:53 crc kubenswrapper[4825]: I0310 08:35:53.179555 4825 scope.go:117] "RemoveContainer" containerID="b77c91bb00dc24963b87c5e53479c9a9343f56474be1f973da5436cbf8377f2a" Mar 10 08:35:53 crc kubenswrapper[4825]: I0310 08:35:53.231561 4825 scope.go:117] "RemoveContainer" containerID="f5d7ae71e5203c3d98973148eb54aec5743a00074e6c8fb3f5df6ec4097c8f6b" Mar 10 08:36:00 crc kubenswrapper[4825]: I0310 08:36:00.146060 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552196-xvh7m"] Mar 10 08:36:00 crc kubenswrapper[4825]: E0310 08:36:00.146905 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b12b6a6-2681-4223-8add-f86ed3fac540" containerName="registry-server" Mar 10 08:36:00 crc kubenswrapper[4825]: I0310 08:36:00.146917 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b12b6a6-2681-4223-8add-f86ed3fac540" containerName="registry-server" Mar 10 08:36:00 crc kubenswrapper[4825]: E0310 08:36:00.146947 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b12b6a6-2681-4223-8add-f86ed3fac540" containerName="extract-content" Mar 10 08:36:00 crc kubenswrapper[4825]: I0310 
08:36:00.146953 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b12b6a6-2681-4223-8add-f86ed3fac540" containerName="extract-content" Mar 10 08:36:00 crc kubenswrapper[4825]: E0310 08:36:00.146969 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a13eabac-e76b-4096-a5de-9bb438d19a1e" containerName="oc" Mar 10 08:36:00 crc kubenswrapper[4825]: I0310 08:36:00.146976 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a13eabac-e76b-4096-a5de-9bb438d19a1e" containerName="oc" Mar 10 08:36:00 crc kubenswrapper[4825]: E0310 08:36:00.146985 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b12b6a6-2681-4223-8add-f86ed3fac540" containerName="extract-utilities" Mar 10 08:36:00 crc kubenswrapper[4825]: I0310 08:36:00.146991 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b12b6a6-2681-4223-8add-f86ed3fac540" containerName="extract-utilities" Mar 10 08:36:00 crc kubenswrapper[4825]: I0310 08:36:00.147200 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b12b6a6-2681-4223-8add-f86ed3fac540" containerName="registry-server" Mar 10 08:36:00 crc kubenswrapper[4825]: I0310 08:36:00.147213 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="a13eabac-e76b-4096-a5de-9bb438d19a1e" containerName="oc" Mar 10 08:36:00 crc kubenswrapper[4825]: I0310 08:36:00.147884 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552196-xvh7m" Mar 10 08:36:00 crc kubenswrapper[4825]: I0310 08:36:00.153022 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 08:36:00 crc kubenswrapper[4825]: I0310 08:36:00.153304 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 08:36:00 crc kubenswrapper[4825]: I0310 08:36:00.153581 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 08:36:00 crc kubenswrapper[4825]: I0310 08:36:00.171160 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552196-xvh7m"] Mar 10 08:36:00 crc kubenswrapper[4825]: I0310 08:36:00.185414 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw8rg\" (UniqueName: \"kubernetes.io/projected/e0d88a47-f079-456f-a0e7-847c942efa88-kube-api-access-vw8rg\") pod \"auto-csr-approver-29552196-xvh7m\" (UID: \"e0d88a47-f079-456f-a0e7-847c942efa88\") " pod="openshift-infra/auto-csr-approver-29552196-xvh7m" Mar 10 08:36:00 crc kubenswrapper[4825]: I0310 08:36:00.287740 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw8rg\" (UniqueName: \"kubernetes.io/projected/e0d88a47-f079-456f-a0e7-847c942efa88-kube-api-access-vw8rg\") pod \"auto-csr-approver-29552196-xvh7m\" (UID: \"e0d88a47-f079-456f-a0e7-847c942efa88\") " pod="openshift-infra/auto-csr-approver-29552196-xvh7m" Mar 10 08:36:00 crc kubenswrapper[4825]: I0310 08:36:00.312444 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw8rg\" (UniqueName: \"kubernetes.io/projected/e0d88a47-f079-456f-a0e7-847c942efa88-kube-api-access-vw8rg\") pod \"auto-csr-approver-29552196-xvh7m\" (UID: \"e0d88a47-f079-456f-a0e7-847c942efa88\") " 
pod="openshift-infra/auto-csr-approver-29552196-xvh7m" Mar 10 08:36:00 crc kubenswrapper[4825]: I0310 08:36:00.478358 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552196-xvh7m" Mar 10 08:36:00 crc kubenswrapper[4825]: I0310 08:36:00.986940 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552196-xvh7m"] Mar 10 08:36:00 crc kubenswrapper[4825]: I0310 08:36:00.991603 4825 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 08:36:01 crc kubenswrapper[4825]: I0310 08:36:01.308123 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552196-xvh7m" event={"ID":"e0d88a47-f079-456f-a0e7-847c942efa88","Type":"ContainerStarted","Data":"c215986b709b48bdc8a14796eaf4e449bea53551e430246baa00143bcbbaa61e"} Mar 10 08:36:03 crc kubenswrapper[4825]: I0310 08:36:03.328748 4825 generic.go:334] "Generic (PLEG): container finished" podID="e0d88a47-f079-456f-a0e7-847c942efa88" containerID="bc53c3987e986ccb6550724d2078ea1663265c7fa6db439abe7d2690277125f5" exitCode=0 Mar 10 08:36:03 crc kubenswrapper[4825]: I0310 08:36:03.328792 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552196-xvh7m" event={"ID":"e0d88a47-f079-456f-a0e7-847c942efa88","Type":"ContainerDied","Data":"bc53c3987e986ccb6550724d2078ea1663265c7fa6db439abe7d2690277125f5"} Mar 10 08:36:04 crc kubenswrapper[4825]: I0310 08:36:04.633616 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552196-xvh7m" Mar 10 08:36:04 crc kubenswrapper[4825]: I0310 08:36:04.784306 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vw8rg\" (UniqueName: \"kubernetes.io/projected/e0d88a47-f079-456f-a0e7-847c942efa88-kube-api-access-vw8rg\") pod \"e0d88a47-f079-456f-a0e7-847c942efa88\" (UID: \"e0d88a47-f079-456f-a0e7-847c942efa88\") " Mar 10 08:36:04 crc kubenswrapper[4825]: I0310 08:36:04.790729 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0d88a47-f079-456f-a0e7-847c942efa88-kube-api-access-vw8rg" (OuterVolumeSpecName: "kube-api-access-vw8rg") pod "e0d88a47-f079-456f-a0e7-847c942efa88" (UID: "e0d88a47-f079-456f-a0e7-847c942efa88"). InnerVolumeSpecName "kube-api-access-vw8rg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:36:04 crc kubenswrapper[4825]: I0310 08:36:04.887831 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vw8rg\" (UniqueName: \"kubernetes.io/projected/e0d88a47-f079-456f-a0e7-847c942efa88-kube-api-access-vw8rg\") on node \"crc\" DevicePath \"\"" Mar 10 08:36:05 crc kubenswrapper[4825]: I0310 08:36:05.346167 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552196-xvh7m" event={"ID":"e0d88a47-f079-456f-a0e7-847c942efa88","Type":"ContainerDied","Data":"c215986b709b48bdc8a14796eaf4e449bea53551e430246baa00143bcbbaa61e"} Mar 10 08:36:05 crc kubenswrapper[4825]: I0310 08:36:05.346213 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c215986b709b48bdc8a14796eaf4e449bea53551e430246baa00143bcbbaa61e" Mar 10 08:36:05 crc kubenswrapper[4825]: I0310 08:36:05.346519 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552196-xvh7m" Mar 10 08:36:05 crc kubenswrapper[4825]: I0310 08:36:05.708029 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552190-tdnwv"] Mar 10 08:36:05 crc kubenswrapper[4825]: I0310 08:36:05.722602 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552190-tdnwv"] Mar 10 08:36:07 crc kubenswrapper[4825]: I0310 08:36:07.247106 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6bf8e71-0bdf-4597-8b70-f15499e6d4f8" path="/var/lib/kubelet/pods/b6bf8e71-0bdf-4597-8b70-f15499e6d4f8/volumes" Mar 10 08:36:46 crc kubenswrapper[4825]: I0310 08:36:46.892784 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 08:36:46 crc kubenswrapper[4825]: I0310 08:36:46.893215 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 08:36:53 crc kubenswrapper[4825]: I0310 08:36:53.379434 4825 scope.go:117] "RemoveContainer" containerID="70bd5cad43c296e34cea39f55c9b8fbaad4124b8648275278dff6be36f9b0955" Mar 10 08:37:07 crc kubenswrapper[4825]: I0310 08:37:07.945231 4825 generic.go:334] "Generic (PLEG): container finished" podID="98107d82-46d1-4863-8c79-a10abec1737f" containerID="763e694dbcd38f37b0c74ddda85320c761e232c0be5811a4f7d05b3d8748ddfb" exitCode=0 Mar 10 08:37:07 crc kubenswrapper[4825]: I0310 08:37:07.945971 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-wv5bk" event={"ID":"98107d82-46d1-4863-8c79-a10abec1737f","Type":"ContainerDied","Data":"763e694dbcd38f37b0c74ddda85320c761e232c0be5811a4f7d05b3d8748ddfb"} Mar 10 08:37:09 crc kubenswrapper[4825]: I0310 08:37:09.403514 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-wv5bk" Mar 10 08:37:09 crc kubenswrapper[4825]: I0310 08:37:09.527514 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/98107d82-46d1-4863-8c79-a10abec1737f-ssh-key-openstack-cell1\") pod \"98107d82-46d1-4863-8c79-a10abec1737f\" (UID: \"98107d82-46d1-4863-8c79-a10abec1737f\") " Mar 10 08:37:09 crc kubenswrapper[4825]: I0310 08:37:09.527602 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98107d82-46d1-4863-8c79-a10abec1737f-tripleo-cleanup-combined-ca-bundle\") pod \"98107d82-46d1-4863-8c79-a10abec1737f\" (UID: \"98107d82-46d1-4863-8c79-a10abec1737f\") " Mar 10 08:37:09 crc kubenswrapper[4825]: I0310 08:37:09.527747 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98107d82-46d1-4863-8c79-a10abec1737f-inventory\") pod \"98107d82-46d1-4863-8c79-a10abec1737f\" (UID: \"98107d82-46d1-4863-8c79-a10abec1737f\") " Mar 10 08:37:09 crc kubenswrapper[4825]: I0310 08:37:09.527880 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpk85\" (UniqueName: \"kubernetes.io/projected/98107d82-46d1-4863-8c79-a10abec1737f-kube-api-access-tpk85\") pod \"98107d82-46d1-4863-8c79-a10abec1737f\" (UID: \"98107d82-46d1-4863-8c79-a10abec1737f\") " Mar 10 08:37:09 crc kubenswrapper[4825]: I0310 08:37:09.533398 4825 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98107d82-46d1-4863-8c79-a10abec1737f-kube-api-access-tpk85" (OuterVolumeSpecName: "kube-api-access-tpk85") pod "98107d82-46d1-4863-8c79-a10abec1737f" (UID: "98107d82-46d1-4863-8c79-a10abec1737f"). InnerVolumeSpecName "kube-api-access-tpk85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:37:09 crc kubenswrapper[4825]: I0310 08:37:09.534245 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98107d82-46d1-4863-8c79-a10abec1737f-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "98107d82-46d1-4863-8c79-a10abec1737f" (UID: "98107d82-46d1-4863-8c79-a10abec1737f"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:37:09 crc kubenswrapper[4825]: I0310 08:37:09.559997 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98107d82-46d1-4863-8c79-a10abec1737f-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "98107d82-46d1-4863-8c79-a10abec1737f" (UID: "98107d82-46d1-4863-8c79-a10abec1737f"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:37:09 crc kubenswrapper[4825]: I0310 08:37:09.564336 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98107d82-46d1-4863-8c79-a10abec1737f-inventory" (OuterVolumeSpecName: "inventory") pod "98107d82-46d1-4863-8c79-a10abec1737f" (UID: "98107d82-46d1-4863-8c79-a10abec1737f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:37:09 crc kubenswrapper[4825]: I0310 08:37:09.632299 4825 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98107d82-46d1-4863-8c79-a10abec1737f-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 08:37:09 crc kubenswrapper[4825]: I0310 08:37:09.632346 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpk85\" (UniqueName: \"kubernetes.io/projected/98107d82-46d1-4863-8c79-a10abec1737f-kube-api-access-tpk85\") on node \"crc\" DevicePath \"\"" Mar 10 08:37:09 crc kubenswrapper[4825]: I0310 08:37:09.632366 4825 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/98107d82-46d1-4863-8c79-a10abec1737f-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 10 08:37:09 crc kubenswrapper[4825]: I0310 08:37:09.632401 4825 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98107d82-46d1-4863-8c79-a10abec1737f-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:37:09 crc kubenswrapper[4825]: I0310 08:37:09.971637 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-wv5bk" event={"ID":"98107d82-46d1-4863-8c79-a10abec1737f","Type":"ContainerDied","Data":"453bac1b39ec329db42b048fca923fd22c77bac1b508358901caa62b14cd9720"} Mar 10 08:37:09 crc kubenswrapper[4825]: I0310 08:37:09.971684 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="453bac1b39ec329db42b048fca923fd22c77bac1b508358901caa62b14cd9720" Mar 10 08:37:09 crc kubenswrapper[4825]: I0310 08:37:09.971754 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-wv5bk" Mar 10 08:37:16 crc kubenswrapper[4825]: I0310 08:37:16.888634 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 08:37:16 crc kubenswrapper[4825]: I0310 08:37:16.889205 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 08:37:19 crc kubenswrapper[4825]: I0310 08:37:19.281419 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-grqk8"] Mar 10 08:37:19 crc kubenswrapper[4825]: E0310 08:37:19.282244 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0d88a47-f079-456f-a0e7-847c942efa88" containerName="oc" Mar 10 08:37:19 crc kubenswrapper[4825]: I0310 08:37:19.282263 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0d88a47-f079-456f-a0e7-847c942efa88" containerName="oc" Mar 10 08:37:19 crc kubenswrapper[4825]: E0310 08:37:19.282281 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98107d82-46d1-4863-8c79-a10abec1737f" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Mar 10 08:37:19 crc kubenswrapper[4825]: I0310 08:37:19.282290 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="98107d82-46d1-4863-8c79-a10abec1737f" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Mar 10 08:37:19 crc kubenswrapper[4825]: I0310 08:37:19.282730 4825 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e0d88a47-f079-456f-a0e7-847c942efa88" containerName="oc" Mar 10 08:37:19 crc kubenswrapper[4825]: I0310 08:37:19.282766 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="98107d82-46d1-4863-8c79-a10abec1737f" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Mar 10 08:37:19 crc kubenswrapper[4825]: I0310 08:37:19.283672 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-grqk8" Mar 10 08:37:19 crc kubenswrapper[4825]: I0310 08:37:19.286452 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-5twvv" Mar 10 08:37:19 crc kubenswrapper[4825]: I0310 08:37:19.286716 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 08:37:19 crc kubenswrapper[4825]: I0310 08:37:19.286836 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 10 08:37:19 crc kubenswrapper[4825]: I0310 08:37:19.290065 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-grqk8"] Mar 10 08:37:19 crc kubenswrapper[4825]: I0310 08:37:19.293261 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 10 08:37:19 crc kubenswrapper[4825]: I0310 08:37:19.434662 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xthf8\" (UniqueName: \"kubernetes.io/projected/547aa46f-b19d-4704-89c0-4c27e28ba30e-kube-api-access-xthf8\") pod \"bootstrap-openstack-openstack-cell1-grqk8\" (UID: \"547aa46f-b19d-4704-89c0-4c27e28ba30e\") " pod="openstack/bootstrap-openstack-openstack-cell1-grqk8" Mar 10 08:37:19 crc kubenswrapper[4825]: I0310 08:37:19.434803 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/547aa46f-b19d-4704-89c0-4c27e28ba30e-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-grqk8\" (UID: \"547aa46f-b19d-4704-89c0-4c27e28ba30e\") " pod="openstack/bootstrap-openstack-openstack-cell1-grqk8" Mar 10 08:37:19 crc kubenswrapper[4825]: I0310 08:37:19.435005 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/547aa46f-b19d-4704-89c0-4c27e28ba30e-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-grqk8\" (UID: \"547aa46f-b19d-4704-89c0-4c27e28ba30e\") " pod="openstack/bootstrap-openstack-openstack-cell1-grqk8" Mar 10 08:37:19 crc kubenswrapper[4825]: I0310 08:37:19.435283 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/547aa46f-b19d-4704-89c0-4c27e28ba30e-inventory\") pod \"bootstrap-openstack-openstack-cell1-grqk8\" (UID: \"547aa46f-b19d-4704-89c0-4c27e28ba30e\") " pod="openstack/bootstrap-openstack-openstack-cell1-grqk8" Mar 10 08:37:19 crc kubenswrapper[4825]: I0310 08:37:19.536686 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/547aa46f-b19d-4704-89c0-4c27e28ba30e-inventory\") pod \"bootstrap-openstack-openstack-cell1-grqk8\" (UID: \"547aa46f-b19d-4704-89c0-4c27e28ba30e\") " pod="openstack/bootstrap-openstack-openstack-cell1-grqk8" Mar 10 08:37:19 crc kubenswrapper[4825]: I0310 08:37:19.536807 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xthf8\" (UniqueName: \"kubernetes.io/projected/547aa46f-b19d-4704-89c0-4c27e28ba30e-kube-api-access-xthf8\") pod \"bootstrap-openstack-openstack-cell1-grqk8\" (UID: \"547aa46f-b19d-4704-89c0-4c27e28ba30e\") " pod="openstack/bootstrap-openstack-openstack-cell1-grqk8" 
Mar 10 08:37:19 crc kubenswrapper[4825]: I0310 08:37:19.536841 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/547aa46f-b19d-4704-89c0-4c27e28ba30e-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-grqk8\" (UID: \"547aa46f-b19d-4704-89c0-4c27e28ba30e\") " pod="openstack/bootstrap-openstack-openstack-cell1-grqk8" Mar 10 08:37:19 crc kubenswrapper[4825]: I0310 08:37:19.536871 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/547aa46f-b19d-4704-89c0-4c27e28ba30e-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-grqk8\" (UID: \"547aa46f-b19d-4704-89c0-4c27e28ba30e\") " pod="openstack/bootstrap-openstack-openstack-cell1-grqk8" Mar 10 08:37:19 crc kubenswrapper[4825]: I0310 08:37:19.542783 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/547aa46f-b19d-4704-89c0-4c27e28ba30e-inventory\") pod \"bootstrap-openstack-openstack-cell1-grqk8\" (UID: \"547aa46f-b19d-4704-89c0-4c27e28ba30e\") " pod="openstack/bootstrap-openstack-openstack-cell1-grqk8" Mar 10 08:37:19 crc kubenswrapper[4825]: I0310 08:37:19.543856 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/547aa46f-b19d-4704-89c0-4c27e28ba30e-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-grqk8\" (UID: \"547aa46f-b19d-4704-89c0-4c27e28ba30e\") " pod="openstack/bootstrap-openstack-openstack-cell1-grqk8" Mar 10 08:37:19 crc kubenswrapper[4825]: I0310 08:37:19.545930 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/547aa46f-b19d-4704-89c0-4c27e28ba30e-bootstrap-combined-ca-bundle\") pod 
\"bootstrap-openstack-openstack-cell1-grqk8\" (UID: \"547aa46f-b19d-4704-89c0-4c27e28ba30e\") " pod="openstack/bootstrap-openstack-openstack-cell1-grqk8" Mar 10 08:37:19 crc kubenswrapper[4825]: I0310 08:37:19.552124 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xthf8\" (UniqueName: \"kubernetes.io/projected/547aa46f-b19d-4704-89c0-4c27e28ba30e-kube-api-access-xthf8\") pod \"bootstrap-openstack-openstack-cell1-grqk8\" (UID: \"547aa46f-b19d-4704-89c0-4c27e28ba30e\") " pod="openstack/bootstrap-openstack-openstack-cell1-grqk8" Mar 10 08:37:19 crc kubenswrapper[4825]: I0310 08:37:19.601355 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-grqk8" Mar 10 08:37:20 crc kubenswrapper[4825]: I0310 08:37:20.108914 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-grqk8"] Mar 10 08:37:21 crc kubenswrapper[4825]: I0310 08:37:21.090184 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-grqk8" event={"ID":"547aa46f-b19d-4704-89c0-4c27e28ba30e","Type":"ContainerStarted","Data":"9fe387207ff94d7615d3c98234bfaef5dc7540cdc69e4664faec6d1771c95d79"} Mar 10 08:37:21 crc kubenswrapper[4825]: I0310 08:37:21.090561 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-grqk8" event={"ID":"547aa46f-b19d-4704-89c0-4c27e28ba30e","Type":"ContainerStarted","Data":"aa6a73da9501b0fbaaca9be8d96c3215288723a02fe258e8e672667c549bc14b"} Mar 10 08:37:21 crc kubenswrapper[4825]: I0310 08:37:21.110327 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell1-grqk8" podStartSLOduration=1.6105300630000001 podStartE2EDuration="2.110308901s" podCreationTimestamp="2026-03-10 08:37:19 +0000 UTC" firstStartedPulling="2026-03-10 08:37:20.112475174 +0000 UTC 
m=+6793.142255789" lastFinishedPulling="2026-03-10 08:37:20.612254002 +0000 UTC m=+6793.642034627" observedRunningTime="2026-03-10 08:37:21.107386578 +0000 UTC m=+6794.137167193" watchObservedRunningTime="2026-03-10 08:37:21.110308901 +0000 UTC m=+6794.140089516" Mar 10 08:37:46 crc kubenswrapper[4825]: I0310 08:37:46.890628 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 08:37:46 crc kubenswrapper[4825]: I0310 08:37:46.891161 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 08:37:46 crc kubenswrapper[4825]: I0310 08:37:46.891209 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" Mar 10 08:37:46 crc kubenswrapper[4825]: I0310 08:37:46.891917 4825 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e35b101378bc0043b3ed9549d6a5f8744f6f01d05a4da2d7392043a03928772d"} pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 08:37:46 crc kubenswrapper[4825]: I0310 08:37:46.891973 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" 
containerID="cri-o://e35b101378bc0043b3ed9549d6a5f8744f6f01d05a4da2d7392043a03928772d" gracePeriod=600 Mar 10 08:37:47 crc kubenswrapper[4825]: E0310 08:37:47.019873 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:37:47 crc kubenswrapper[4825]: I0310 08:37:47.323848 4825 generic.go:334] "Generic (PLEG): container finished" podID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerID="e35b101378bc0043b3ed9549d6a5f8744f6f01d05a4da2d7392043a03928772d" exitCode=0 Mar 10 08:37:47 crc kubenswrapper[4825]: I0310 08:37:47.323925 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerDied","Data":"e35b101378bc0043b3ed9549d6a5f8744f6f01d05a4da2d7392043a03928772d"} Mar 10 08:37:47 crc kubenswrapper[4825]: I0310 08:37:47.324073 4825 scope.go:117] "RemoveContainer" containerID="ba716d5199aaa639d1807a8298ac7530f73bce5b83b5cee2bf9ea8fb07b40b32" Mar 10 08:37:47 crc kubenswrapper[4825]: I0310 08:37:47.324963 4825 scope.go:117] "RemoveContainer" containerID="e35b101378bc0043b3ed9549d6a5f8744f6f01d05a4da2d7392043a03928772d" Mar 10 08:37:47 crc kubenswrapper[4825]: E0310 08:37:47.325390 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" 
podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:37:51 crc kubenswrapper[4825]: I0310 08:37:51.996892 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gbxbv"] Mar 10 08:37:52 crc kubenswrapper[4825]: I0310 08:37:52.001066 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gbxbv" Mar 10 08:37:52 crc kubenswrapper[4825]: I0310 08:37:52.015429 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gbxbv"] Mar 10 08:37:52 crc kubenswrapper[4825]: I0310 08:37:52.156417 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0496816-ab14-4d86-a5b5-d2812b62d3fb-catalog-content\") pod \"redhat-operators-gbxbv\" (UID: \"a0496816-ab14-4d86-a5b5-d2812b62d3fb\") " pod="openshift-marketplace/redhat-operators-gbxbv" Mar 10 08:37:52 crc kubenswrapper[4825]: I0310 08:37:52.156474 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0496816-ab14-4d86-a5b5-d2812b62d3fb-utilities\") pod \"redhat-operators-gbxbv\" (UID: \"a0496816-ab14-4d86-a5b5-d2812b62d3fb\") " pod="openshift-marketplace/redhat-operators-gbxbv" Mar 10 08:37:52 crc kubenswrapper[4825]: I0310 08:37:52.156521 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q27d8\" (UniqueName: \"kubernetes.io/projected/a0496816-ab14-4d86-a5b5-d2812b62d3fb-kube-api-access-q27d8\") pod \"redhat-operators-gbxbv\" (UID: \"a0496816-ab14-4d86-a5b5-d2812b62d3fb\") " pod="openshift-marketplace/redhat-operators-gbxbv" Mar 10 08:37:52 crc kubenswrapper[4825]: I0310 08:37:52.258809 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a0496816-ab14-4d86-a5b5-d2812b62d3fb-catalog-content\") pod \"redhat-operators-gbxbv\" (UID: \"a0496816-ab14-4d86-a5b5-d2812b62d3fb\") " pod="openshift-marketplace/redhat-operators-gbxbv" Mar 10 08:37:52 crc kubenswrapper[4825]: I0310 08:37:52.258878 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0496816-ab14-4d86-a5b5-d2812b62d3fb-utilities\") pod \"redhat-operators-gbxbv\" (UID: \"a0496816-ab14-4d86-a5b5-d2812b62d3fb\") " pod="openshift-marketplace/redhat-operators-gbxbv" Mar 10 08:37:52 crc kubenswrapper[4825]: I0310 08:37:52.258939 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q27d8\" (UniqueName: \"kubernetes.io/projected/a0496816-ab14-4d86-a5b5-d2812b62d3fb-kube-api-access-q27d8\") pod \"redhat-operators-gbxbv\" (UID: \"a0496816-ab14-4d86-a5b5-d2812b62d3fb\") " pod="openshift-marketplace/redhat-operators-gbxbv" Mar 10 08:37:52 crc kubenswrapper[4825]: I0310 08:37:52.259508 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0496816-ab14-4d86-a5b5-d2812b62d3fb-catalog-content\") pod \"redhat-operators-gbxbv\" (UID: \"a0496816-ab14-4d86-a5b5-d2812b62d3fb\") " pod="openshift-marketplace/redhat-operators-gbxbv" Mar 10 08:37:52 crc kubenswrapper[4825]: I0310 08:37:52.259640 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0496816-ab14-4d86-a5b5-d2812b62d3fb-utilities\") pod \"redhat-operators-gbxbv\" (UID: \"a0496816-ab14-4d86-a5b5-d2812b62d3fb\") " pod="openshift-marketplace/redhat-operators-gbxbv" Mar 10 08:37:52 crc kubenswrapper[4825]: I0310 08:37:52.281572 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q27d8\" (UniqueName: 
\"kubernetes.io/projected/a0496816-ab14-4d86-a5b5-d2812b62d3fb-kube-api-access-q27d8\") pod \"redhat-operators-gbxbv\" (UID: \"a0496816-ab14-4d86-a5b5-d2812b62d3fb\") " pod="openshift-marketplace/redhat-operators-gbxbv" Mar 10 08:37:52 crc kubenswrapper[4825]: I0310 08:37:52.372809 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gbxbv" Mar 10 08:37:52 crc kubenswrapper[4825]: I0310 08:37:52.847828 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gbxbv"] Mar 10 08:37:53 crc kubenswrapper[4825]: I0310 08:37:53.375438 4825 generic.go:334] "Generic (PLEG): container finished" podID="a0496816-ab14-4d86-a5b5-d2812b62d3fb" containerID="5d38a2cc3a7229b2d87753bae968d6a76338cf152f82ba5eb1728cb1c478a1b2" exitCode=0 Mar 10 08:37:53 crc kubenswrapper[4825]: I0310 08:37:53.375547 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gbxbv" event={"ID":"a0496816-ab14-4d86-a5b5-d2812b62d3fb","Type":"ContainerDied","Data":"5d38a2cc3a7229b2d87753bae968d6a76338cf152f82ba5eb1728cb1c478a1b2"} Mar 10 08:37:53 crc kubenswrapper[4825]: I0310 08:37:53.375757 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gbxbv" event={"ID":"a0496816-ab14-4d86-a5b5-d2812b62d3fb","Type":"ContainerStarted","Data":"aa588c62f5a24563c36c0a805f584fbe529f11d80a2e0d5d69d313289cfb40e6"} Mar 10 08:37:54 crc kubenswrapper[4825]: I0310 08:37:54.385709 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gbxbv" event={"ID":"a0496816-ab14-4d86-a5b5-d2812b62d3fb","Type":"ContainerStarted","Data":"6a94ecc01c2d67baf5845b0b6338e4701c8a896ba1a96b735ad836019c537047"} Mar 10 08:37:59 crc kubenswrapper[4825]: I0310 08:37:59.244916 4825 scope.go:117] "RemoveContainer" containerID="e35b101378bc0043b3ed9549d6a5f8744f6f01d05a4da2d7392043a03928772d" Mar 10 08:37:59 crc 
kubenswrapper[4825]: E0310 08:37:59.245374 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:38:00 crc kubenswrapper[4825]: I0310 08:38:00.160104 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552198-bvsh5"] Mar 10 08:38:00 crc kubenswrapper[4825]: I0310 08:38:00.162000 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552198-bvsh5" Mar 10 08:38:00 crc kubenswrapper[4825]: I0310 08:38:00.164151 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 08:38:00 crc kubenswrapper[4825]: I0310 08:38:00.164219 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 08:38:00 crc kubenswrapper[4825]: I0310 08:38:00.165490 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 08:38:00 crc kubenswrapper[4825]: I0310 08:38:00.172505 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552198-bvsh5"] Mar 10 08:38:00 crc kubenswrapper[4825]: I0310 08:38:00.242349 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfqlj\" (UniqueName: \"kubernetes.io/projected/4eec7146-26d1-4fd7-99a8-d4f910750686-kube-api-access-vfqlj\") pod \"auto-csr-approver-29552198-bvsh5\" (UID: \"4eec7146-26d1-4fd7-99a8-d4f910750686\") " pod="openshift-infra/auto-csr-approver-29552198-bvsh5" Mar 10 08:38:00 crc 
kubenswrapper[4825]: I0310 08:38:00.344844 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfqlj\" (UniqueName: \"kubernetes.io/projected/4eec7146-26d1-4fd7-99a8-d4f910750686-kube-api-access-vfqlj\") pod \"auto-csr-approver-29552198-bvsh5\" (UID: \"4eec7146-26d1-4fd7-99a8-d4f910750686\") " pod="openshift-infra/auto-csr-approver-29552198-bvsh5" Mar 10 08:38:00 crc kubenswrapper[4825]: I0310 08:38:00.368901 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfqlj\" (UniqueName: \"kubernetes.io/projected/4eec7146-26d1-4fd7-99a8-d4f910750686-kube-api-access-vfqlj\") pod \"auto-csr-approver-29552198-bvsh5\" (UID: \"4eec7146-26d1-4fd7-99a8-d4f910750686\") " pod="openshift-infra/auto-csr-approver-29552198-bvsh5" Mar 10 08:38:00 crc kubenswrapper[4825]: I0310 08:38:00.486105 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552198-bvsh5" Mar 10 08:38:00 crc kubenswrapper[4825]: I0310 08:38:00.957039 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552198-bvsh5"] Mar 10 08:38:01 crc kubenswrapper[4825]: I0310 08:38:01.443520 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552198-bvsh5" event={"ID":"4eec7146-26d1-4fd7-99a8-d4f910750686","Type":"ContainerStarted","Data":"fe4274470c446e08831fc39dfe90dda0149a66756078068cb735e58a54f9a66e"} Mar 10 08:38:02 crc kubenswrapper[4825]: I0310 08:38:02.455436 4825 generic.go:334] "Generic (PLEG): container finished" podID="a0496816-ab14-4d86-a5b5-d2812b62d3fb" containerID="6a94ecc01c2d67baf5845b0b6338e4701c8a896ba1a96b735ad836019c537047" exitCode=0 Mar 10 08:38:02 crc kubenswrapper[4825]: I0310 08:38:02.455501 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gbxbv" 
event={"ID":"a0496816-ab14-4d86-a5b5-d2812b62d3fb","Type":"ContainerDied","Data":"6a94ecc01c2d67baf5845b0b6338e4701c8a896ba1a96b735ad836019c537047"} Mar 10 08:38:03 crc kubenswrapper[4825]: I0310 08:38:03.467718 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gbxbv" event={"ID":"a0496816-ab14-4d86-a5b5-d2812b62d3fb","Type":"ContainerStarted","Data":"aabba286fdbcd1acd607efe79540643fdd1a99369c0832e6060ef60d26cfbcd4"} Mar 10 08:38:03 crc kubenswrapper[4825]: I0310 08:38:03.470236 4825 generic.go:334] "Generic (PLEG): container finished" podID="4eec7146-26d1-4fd7-99a8-d4f910750686" containerID="f99e33e974093644f16af2af3f8441900d225f4d483a56c6a8ff632972b03215" exitCode=0 Mar 10 08:38:03 crc kubenswrapper[4825]: I0310 08:38:03.470286 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552198-bvsh5" event={"ID":"4eec7146-26d1-4fd7-99a8-d4f910750686","Type":"ContainerDied","Data":"f99e33e974093644f16af2af3f8441900d225f4d483a56c6a8ff632972b03215"} Mar 10 08:38:03 crc kubenswrapper[4825]: I0310 08:38:03.491681 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gbxbv" podStartSLOduration=2.895669337 podStartE2EDuration="12.491663246s" podCreationTimestamp="2026-03-10 08:37:51 +0000 UTC" firstStartedPulling="2026-03-10 08:37:53.377078575 +0000 UTC m=+6826.406859190" lastFinishedPulling="2026-03-10 08:38:02.973072484 +0000 UTC m=+6836.002853099" observedRunningTime="2026-03-10 08:38:03.485491901 +0000 UTC m=+6836.515272516" watchObservedRunningTime="2026-03-10 08:38:03.491663246 +0000 UTC m=+6836.521443861" Mar 10 08:38:04 crc kubenswrapper[4825]: I0310 08:38:04.853847 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552198-bvsh5" Mar 10 08:38:04 crc kubenswrapper[4825]: I0310 08:38:04.962092 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfqlj\" (UniqueName: \"kubernetes.io/projected/4eec7146-26d1-4fd7-99a8-d4f910750686-kube-api-access-vfqlj\") pod \"4eec7146-26d1-4fd7-99a8-d4f910750686\" (UID: \"4eec7146-26d1-4fd7-99a8-d4f910750686\") " Mar 10 08:38:04 crc kubenswrapper[4825]: I0310 08:38:04.967338 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4eec7146-26d1-4fd7-99a8-d4f910750686-kube-api-access-vfqlj" (OuterVolumeSpecName: "kube-api-access-vfqlj") pod "4eec7146-26d1-4fd7-99a8-d4f910750686" (UID: "4eec7146-26d1-4fd7-99a8-d4f910750686"). InnerVolumeSpecName "kube-api-access-vfqlj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:38:05 crc kubenswrapper[4825]: I0310 08:38:05.064762 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfqlj\" (UniqueName: \"kubernetes.io/projected/4eec7146-26d1-4fd7-99a8-d4f910750686-kube-api-access-vfqlj\") on node \"crc\" DevicePath \"\"" Mar 10 08:38:05 crc kubenswrapper[4825]: I0310 08:38:05.486666 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552198-bvsh5" event={"ID":"4eec7146-26d1-4fd7-99a8-d4f910750686","Type":"ContainerDied","Data":"fe4274470c446e08831fc39dfe90dda0149a66756078068cb735e58a54f9a66e"} Mar 10 08:38:05 crc kubenswrapper[4825]: I0310 08:38:05.486967 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe4274470c446e08831fc39dfe90dda0149a66756078068cb735e58a54f9a66e" Mar 10 08:38:05 crc kubenswrapper[4825]: I0310 08:38:05.486716 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552198-bvsh5" Mar 10 08:38:05 crc kubenswrapper[4825]: I0310 08:38:05.938586 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552192-7hvcs"] Mar 10 08:38:05 crc kubenswrapper[4825]: I0310 08:38:05.946535 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552192-7hvcs"] Mar 10 08:38:07 crc kubenswrapper[4825]: I0310 08:38:07.250456 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39980f78-dbfe-469c-935f-83a2acb5e039" path="/var/lib/kubelet/pods/39980f78-dbfe-469c-935f-83a2acb5e039/volumes" Mar 10 08:38:12 crc kubenswrapper[4825]: I0310 08:38:12.373874 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gbxbv" Mar 10 08:38:12 crc kubenswrapper[4825]: I0310 08:38:12.375325 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gbxbv" Mar 10 08:38:13 crc kubenswrapper[4825]: I0310 08:38:13.425322 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gbxbv" podUID="a0496816-ab14-4d86-a5b5-d2812b62d3fb" containerName="registry-server" probeResult="failure" output=< Mar 10 08:38:13 crc kubenswrapper[4825]: timeout: failed to connect service ":50051" within 1s Mar 10 08:38:13 crc kubenswrapper[4825]: > Mar 10 08:38:14 crc kubenswrapper[4825]: I0310 08:38:14.236121 4825 scope.go:117] "RemoveContainer" containerID="e35b101378bc0043b3ed9549d6a5f8744f6f01d05a4da2d7392043a03928772d" Mar 10 08:38:14 crc kubenswrapper[4825]: E0310 08:38:14.236413 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:38:23 crc kubenswrapper[4825]: I0310 08:38:23.426546 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gbxbv" podUID="a0496816-ab14-4d86-a5b5-d2812b62d3fb" containerName="registry-server" probeResult="failure" output=< Mar 10 08:38:23 crc kubenswrapper[4825]: timeout: failed to connect service ":50051" within 1s Mar 10 08:38:23 crc kubenswrapper[4825]: > Mar 10 08:38:26 crc kubenswrapper[4825]: I0310 08:38:26.236686 4825 scope.go:117] "RemoveContainer" containerID="e35b101378bc0043b3ed9549d6a5f8744f6f01d05a4da2d7392043a03928772d" Mar 10 08:38:26 crc kubenswrapper[4825]: E0310 08:38:26.238789 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:38:32 crc kubenswrapper[4825]: I0310 08:38:32.422100 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gbxbv" Mar 10 08:38:32 crc kubenswrapper[4825]: I0310 08:38:32.489888 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gbxbv" Mar 10 08:38:32 crc kubenswrapper[4825]: I0310 08:38:32.666476 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gbxbv"] Mar 10 08:38:33 crc kubenswrapper[4825]: I0310 08:38:33.769789 4825 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-gbxbv" podUID="a0496816-ab14-4d86-a5b5-d2812b62d3fb" containerName="registry-server" containerID="cri-o://aabba286fdbcd1acd607efe79540643fdd1a99369c0832e6060ef60d26cfbcd4" gracePeriod=2 Mar 10 08:38:34 crc kubenswrapper[4825]: I0310 08:38:34.209417 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gbxbv" Mar 10 08:38:34 crc kubenswrapper[4825]: I0310 08:38:34.288507 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0496816-ab14-4d86-a5b5-d2812b62d3fb-catalog-content\") pod \"a0496816-ab14-4d86-a5b5-d2812b62d3fb\" (UID: \"a0496816-ab14-4d86-a5b5-d2812b62d3fb\") " Mar 10 08:38:34 crc kubenswrapper[4825]: I0310 08:38:34.292809 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q27d8\" (UniqueName: \"kubernetes.io/projected/a0496816-ab14-4d86-a5b5-d2812b62d3fb-kube-api-access-q27d8\") pod \"a0496816-ab14-4d86-a5b5-d2812b62d3fb\" (UID: \"a0496816-ab14-4d86-a5b5-d2812b62d3fb\") " Mar 10 08:38:34 crc kubenswrapper[4825]: I0310 08:38:34.292895 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0496816-ab14-4d86-a5b5-d2812b62d3fb-utilities\") pod \"a0496816-ab14-4d86-a5b5-d2812b62d3fb\" (UID: \"a0496816-ab14-4d86-a5b5-d2812b62d3fb\") " Mar 10 08:38:34 crc kubenswrapper[4825]: I0310 08:38:34.305170 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0496816-ab14-4d86-a5b5-d2812b62d3fb-kube-api-access-q27d8" (OuterVolumeSpecName: "kube-api-access-q27d8") pod "a0496816-ab14-4d86-a5b5-d2812b62d3fb" (UID: "a0496816-ab14-4d86-a5b5-d2812b62d3fb"). InnerVolumeSpecName "kube-api-access-q27d8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:38:34 crc kubenswrapper[4825]: I0310 08:38:34.311023 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0496816-ab14-4d86-a5b5-d2812b62d3fb-utilities" (OuterVolumeSpecName: "utilities") pod "a0496816-ab14-4d86-a5b5-d2812b62d3fb" (UID: "a0496816-ab14-4d86-a5b5-d2812b62d3fb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:38:34 crc kubenswrapper[4825]: I0310 08:38:34.395400 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q27d8\" (UniqueName: \"kubernetes.io/projected/a0496816-ab14-4d86-a5b5-d2812b62d3fb-kube-api-access-q27d8\") on node \"crc\" DevicePath \"\"" Mar 10 08:38:34 crc kubenswrapper[4825]: I0310 08:38:34.395434 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0496816-ab14-4d86-a5b5-d2812b62d3fb-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 08:38:34 crc kubenswrapper[4825]: I0310 08:38:34.460260 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0496816-ab14-4d86-a5b5-d2812b62d3fb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a0496816-ab14-4d86-a5b5-d2812b62d3fb" (UID: "a0496816-ab14-4d86-a5b5-d2812b62d3fb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:38:34 crc kubenswrapper[4825]: I0310 08:38:34.496576 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0496816-ab14-4d86-a5b5-d2812b62d3fb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 08:38:34 crc kubenswrapper[4825]: I0310 08:38:34.785489 4825 generic.go:334] "Generic (PLEG): container finished" podID="a0496816-ab14-4d86-a5b5-d2812b62d3fb" containerID="aabba286fdbcd1acd607efe79540643fdd1a99369c0832e6060ef60d26cfbcd4" exitCode=0 Mar 10 08:38:34 crc kubenswrapper[4825]: I0310 08:38:34.785544 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gbxbv" event={"ID":"a0496816-ab14-4d86-a5b5-d2812b62d3fb","Type":"ContainerDied","Data":"aabba286fdbcd1acd607efe79540643fdd1a99369c0832e6060ef60d26cfbcd4"} Mar 10 08:38:34 crc kubenswrapper[4825]: I0310 08:38:34.785577 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gbxbv" event={"ID":"a0496816-ab14-4d86-a5b5-d2812b62d3fb","Type":"ContainerDied","Data":"aa588c62f5a24563c36c0a805f584fbe529f11d80a2e0d5d69d313289cfb40e6"} Mar 10 08:38:34 crc kubenswrapper[4825]: I0310 08:38:34.785576 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gbxbv" Mar 10 08:38:34 crc kubenswrapper[4825]: I0310 08:38:34.785597 4825 scope.go:117] "RemoveContainer" containerID="aabba286fdbcd1acd607efe79540643fdd1a99369c0832e6060ef60d26cfbcd4" Mar 10 08:38:34 crc kubenswrapper[4825]: I0310 08:38:34.815272 4825 scope.go:117] "RemoveContainer" containerID="6a94ecc01c2d67baf5845b0b6338e4701c8a896ba1a96b735ad836019c537047" Mar 10 08:38:34 crc kubenswrapper[4825]: I0310 08:38:34.828452 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gbxbv"] Mar 10 08:38:34 crc kubenswrapper[4825]: I0310 08:38:34.843343 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gbxbv"] Mar 10 08:38:34 crc kubenswrapper[4825]: I0310 08:38:34.849733 4825 scope.go:117] "RemoveContainer" containerID="5d38a2cc3a7229b2d87753bae968d6a76338cf152f82ba5eb1728cb1c478a1b2" Mar 10 08:38:34 crc kubenswrapper[4825]: I0310 08:38:34.896009 4825 scope.go:117] "RemoveContainer" containerID="aabba286fdbcd1acd607efe79540643fdd1a99369c0832e6060ef60d26cfbcd4" Mar 10 08:38:34 crc kubenswrapper[4825]: E0310 08:38:34.896720 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aabba286fdbcd1acd607efe79540643fdd1a99369c0832e6060ef60d26cfbcd4\": container with ID starting with aabba286fdbcd1acd607efe79540643fdd1a99369c0832e6060ef60d26cfbcd4 not found: ID does not exist" containerID="aabba286fdbcd1acd607efe79540643fdd1a99369c0832e6060ef60d26cfbcd4" Mar 10 08:38:34 crc kubenswrapper[4825]: I0310 08:38:34.896773 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aabba286fdbcd1acd607efe79540643fdd1a99369c0832e6060ef60d26cfbcd4"} err="failed to get container status \"aabba286fdbcd1acd607efe79540643fdd1a99369c0832e6060ef60d26cfbcd4\": rpc error: code = NotFound desc = could not find container 
\"aabba286fdbcd1acd607efe79540643fdd1a99369c0832e6060ef60d26cfbcd4\": container with ID starting with aabba286fdbcd1acd607efe79540643fdd1a99369c0832e6060ef60d26cfbcd4 not found: ID does not exist" Mar 10 08:38:34 crc kubenswrapper[4825]: I0310 08:38:34.896809 4825 scope.go:117] "RemoveContainer" containerID="6a94ecc01c2d67baf5845b0b6338e4701c8a896ba1a96b735ad836019c537047" Mar 10 08:38:34 crc kubenswrapper[4825]: E0310 08:38:34.897584 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a94ecc01c2d67baf5845b0b6338e4701c8a896ba1a96b735ad836019c537047\": container with ID starting with 6a94ecc01c2d67baf5845b0b6338e4701c8a896ba1a96b735ad836019c537047 not found: ID does not exist" containerID="6a94ecc01c2d67baf5845b0b6338e4701c8a896ba1a96b735ad836019c537047" Mar 10 08:38:34 crc kubenswrapper[4825]: I0310 08:38:34.897995 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a94ecc01c2d67baf5845b0b6338e4701c8a896ba1a96b735ad836019c537047"} err="failed to get container status \"6a94ecc01c2d67baf5845b0b6338e4701c8a896ba1a96b735ad836019c537047\": rpc error: code = NotFound desc = could not find container \"6a94ecc01c2d67baf5845b0b6338e4701c8a896ba1a96b735ad836019c537047\": container with ID starting with 6a94ecc01c2d67baf5845b0b6338e4701c8a896ba1a96b735ad836019c537047 not found: ID does not exist" Mar 10 08:38:34 crc kubenswrapper[4825]: I0310 08:38:34.898304 4825 scope.go:117] "RemoveContainer" containerID="5d38a2cc3a7229b2d87753bae968d6a76338cf152f82ba5eb1728cb1c478a1b2" Mar 10 08:38:34 crc kubenswrapper[4825]: E0310 08:38:34.898772 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d38a2cc3a7229b2d87753bae968d6a76338cf152f82ba5eb1728cb1c478a1b2\": container with ID starting with 5d38a2cc3a7229b2d87753bae968d6a76338cf152f82ba5eb1728cb1c478a1b2 not found: ID does not exist" 
containerID="5d38a2cc3a7229b2d87753bae968d6a76338cf152f82ba5eb1728cb1c478a1b2" Mar 10 08:38:34 crc kubenswrapper[4825]: I0310 08:38:34.898810 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d38a2cc3a7229b2d87753bae968d6a76338cf152f82ba5eb1728cb1c478a1b2"} err="failed to get container status \"5d38a2cc3a7229b2d87753bae968d6a76338cf152f82ba5eb1728cb1c478a1b2\": rpc error: code = NotFound desc = could not find container \"5d38a2cc3a7229b2d87753bae968d6a76338cf152f82ba5eb1728cb1c478a1b2\": container with ID starting with 5d38a2cc3a7229b2d87753bae968d6a76338cf152f82ba5eb1728cb1c478a1b2 not found: ID does not exist" Mar 10 08:38:35 crc kubenswrapper[4825]: I0310 08:38:35.248489 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0496816-ab14-4d86-a5b5-d2812b62d3fb" path="/var/lib/kubelet/pods/a0496816-ab14-4d86-a5b5-d2812b62d3fb/volumes" Mar 10 08:38:38 crc kubenswrapper[4825]: I0310 08:38:38.236992 4825 scope.go:117] "RemoveContainer" containerID="e35b101378bc0043b3ed9549d6a5f8744f6f01d05a4da2d7392043a03928772d" Mar 10 08:38:38 crc kubenswrapper[4825]: E0310 08:38:38.237726 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:38:50 crc kubenswrapper[4825]: I0310 08:38:50.239244 4825 scope.go:117] "RemoveContainer" containerID="e35b101378bc0043b3ed9549d6a5f8744f6f01d05a4da2d7392043a03928772d" Mar 10 08:38:50 crc kubenswrapper[4825]: E0310 08:38:50.239989 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:38:53 crc kubenswrapper[4825]: I0310 08:38:53.512989 4825 scope.go:117] "RemoveContainer" containerID="acfaf4e3804aaa9fe787732f6d0f2dc4f6949e094c5eec80aefd94fb535cf683" Mar 10 08:39:05 crc kubenswrapper[4825]: I0310 08:39:05.237020 4825 scope.go:117] "RemoveContainer" containerID="e35b101378bc0043b3ed9549d6a5f8744f6f01d05a4da2d7392043a03928772d" Mar 10 08:39:05 crc kubenswrapper[4825]: E0310 08:39:05.238231 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:39:17 crc kubenswrapper[4825]: I0310 08:39:17.237115 4825 scope.go:117] "RemoveContainer" containerID="e35b101378bc0043b3ed9549d6a5f8744f6f01d05a4da2d7392043a03928772d" Mar 10 08:39:17 crc kubenswrapper[4825]: E0310 08:39:17.238310 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:39:32 crc kubenswrapper[4825]: I0310 08:39:32.236973 4825 scope.go:117] "RemoveContainer" containerID="e35b101378bc0043b3ed9549d6a5f8744f6f01d05a4da2d7392043a03928772d" Mar 10 08:39:32 crc 
kubenswrapper[4825]: E0310 08:39:32.237716 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:39:46 crc kubenswrapper[4825]: I0310 08:39:46.242607 4825 scope.go:117] "RemoveContainer" containerID="e35b101378bc0043b3ed9549d6a5f8744f6f01d05a4da2d7392043a03928772d" Mar 10 08:39:46 crc kubenswrapper[4825]: E0310 08:39:46.243789 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:39:59 crc kubenswrapper[4825]: I0310 08:39:59.240530 4825 scope.go:117] "RemoveContainer" containerID="e35b101378bc0043b3ed9549d6a5f8744f6f01d05a4da2d7392043a03928772d" Mar 10 08:39:59 crc kubenswrapper[4825]: E0310 08:39:59.241550 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:40:00 crc kubenswrapper[4825]: I0310 08:40:00.145615 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552200-8hw26"] Mar 10 
08:40:00 crc kubenswrapper[4825]: E0310 08:40:00.146037 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eec7146-26d1-4fd7-99a8-d4f910750686" containerName="oc" Mar 10 08:40:00 crc kubenswrapper[4825]: I0310 08:40:00.146052 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eec7146-26d1-4fd7-99a8-d4f910750686" containerName="oc" Mar 10 08:40:00 crc kubenswrapper[4825]: E0310 08:40:00.146075 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0496816-ab14-4d86-a5b5-d2812b62d3fb" containerName="extract-content" Mar 10 08:40:00 crc kubenswrapper[4825]: I0310 08:40:00.146082 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0496816-ab14-4d86-a5b5-d2812b62d3fb" containerName="extract-content" Mar 10 08:40:00 crc kubenswrapper[4825]: E0310 08:40:00.146096 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0496816-ab14-4d86-a5b5-d2812b62d3fb" containerName="extract-utilities" Mar 10 08:40:00 crc kubenswrapper[4825]: I0310 08:40:00.146103 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0496816-ab14-4d86-a5b5-d2812b62d3fb" containerName="extract-utilities" Mar 10 08:40:00 crc kubenswrapper[4825]: E0310 08:40:00.146112 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0496816-ab14-4d86-a5b5-d2812b62d3fb" containerName="registry-server" Mar 10 08:40:00 crc kubenswrapper[4825]: I0310 08:40:00.146118 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0496816-ab14-4d86-a5b5-d2812b62d3fb" containerName="registry-server" Mar 10 08:40:00 crc kubenswrapper[4825]: I0310 08:40:00.146314 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0496816-ab14-4d86-a5b5-d2812b62d3fb" containerName="registry-server" Mar 10 08:40:00 crc kubenswrapper[4825]: I0310 08:40:00.146324 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="4eec7146-26d1-4fd7-99a8-d4f910750686" containerName="oc" Mar 10 08:40:00 crc kubenswrapper[4825]: I0310 
08:40:00.148426 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552200-8hw26" Mar 10 08:40:00 crc kubenswrapper[4825]: I0310 08:40:00.155079 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 08:40:00 crc kubenswrapper[4825]: I0310 08:40:00.155446 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 08:40:00 crc kubenswrapper[4825]: I0310 08:40:00.155691 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 08:40:00 crc kubenswrapper[4825]: I0310 08:40:00.180348 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552200-8hw26"] Mar 10 08:40:00 crc kubenswrapper[4825]: I0310 08:40:00.218404 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j46jz\" (UniqueName: \"kubernetes.io/projected/5e370149-7b09-44f2-a363-38d1fddc8c60-kube-api-access-j46jz\") pod \"auto-csr-approver-29552200-8hw26\" (UID: \"5e370149-7b09-44f2-a363-38d1fddc8c60\") " pod="openshift-infra/auto-csr-approver-29552200-8hw26" Mar 10 08:40:00 crc kubenswrapper[4825]: I0310 08:40:00.320892 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j46jz\" (UniqueName: \"kubernetes.io/projected/5e370149-7b09-44f2-a363-38d1fddc8c60-kube-api-access-j46jz\") pod \"auto-csr-approver-29552200-8hw26\" (UID: \"5e370149-7b09-44f2-a363-38d1fddc8c60\") " pod="openshift-infra/auto-csr-approver-29552200-8hw26" Mar 10 08:40:00 crc kubenswrapper[4825]: I0310 08:40:00.342325 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j46jz\" (UniqueName: \"kubernetes.io/projected/5e370149-7b09-44f2-a363-38d1fddc8c60-kube-api-access-j46jz\") pod 
\"auto-csr-approver-29552200-8hw26\" (UID: \"5e370149-7b09-44f2-a363-38d1fddc8c60\") " pod="openshift-infra/auto-csr-approver-29552200-8hw26" Mar 10 08:40:00 crc kubenswrapper[4825]: I0310 08:40:00.478580 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552200-8hw26" Mar 10 08:40:00 crc kubenswrapper[4825]: I0310 08:40:00.949002 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552200-8hw26"] Mar 10 08:40:01 crc kubenswrapper[4825]: I0310 08:40:01.670816 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552200-8hw26" event={"ID":"5e370149-7b09-44f2-a363-38d1fddc8c60","Type":"ContainerStarted","Data":"c48eef2d300ed7458c3a8728db31a3ead9eb1c16867c4f94aac6d2dbf7693780"} Mar 10 08:40:02 crc kubenswrapper[4825]: I0310 08:40:02.680223 4825 generic.go:334] "Generic (PLEG): container finished" podID="5e370149-7b09-44f2-a363-38d1fddc8c60" containerID="5bbb4b60ecd62a2318f6533e7219a6a2242036b3dae5a4e0430b38f439a078c2" exitCode=0 Mar 10 08:40:02 crc kubenswrapper[4825]: I0310 08:40:02.680321 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552200-8hw26" event={"ID":"5e370149-7b09-44f2-a363-38d1fddc8c60","Type":"ContainerDied","Data":"5bbb4b60ecd62a2318f6533e7219a6a2242036b3dae5a4e0430b38f439a078c2"} Mar 10 08:40:04 crc kubenswrapper[4825]: I0310 08:40:04.002578 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552200-8hw26" Mar 10 08:40:04 crc kubenswrapper[4825]: I0310 08:40:04.109543 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j46jz\" (UniqueName: \"kubernetes.io/projected/5e370149-7b09-44f2-a363-38d1fddc8c60-kube-api-access-j46jz\") pod \"5e370149-7b09-44f2-a363-38d1fddc8c60\" (UID: \"5e370149-7b09-44f2-a363-38d1fddc8c60\") " Mar 10 08:40:04 crc kubenswrapper[4825]: I0310 08:40:04.114857 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e370149-7b09-44f2-a363-38d1fddc8c60-kube-api-access-j46jz" (OuterVolumeSpecName: "kube-api-access-j46jz") pod "5e370149-7b09-44f2-a363-38d1fddc8c60" (UID: "5e370149-7b09-44f2-a363-38d1fddc8c60"). InnerVolumeSpecName "kube-api-access-j46jz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:40:04 crc kubenswrapper[4825]: I0310 08:40:04.212370 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j46jz\" (UniqueName: \"kubernetes.io/projected/5e370149-7b09-44f2-a363-38d1fddc8c60-kube-api-access-j46jz\") on node \"crc\" DevicePath \"\"" Mar 10 08:40:04 crc kubenswrapper[4825]: I0310 08:40:04.698751 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552200-8hw26" event={"ID":"5e370149-7b09-44f2-a363-38d1fddc8c60","Type":"ContainerDied","Data":"c48eef2d300ed7458c3a8728db31a3ead9eb1c16867c4f94aac6d2dbf7693780"} Mar 10 08:40:04 crc kubenswrapper[4825]: I0310 08:40:04.698790 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c48eef2d300ed7458c3a8728db31a3ead9eb1c16867c4f94aac6d2dbf7693780" Mar 10 08:40:04 crc kubenswrapper[4825]: I0310 08:40:04.698799 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552200-8hw26" Mar 10 08:40:05 crc kubenswrapper[4825]: I0310 08:40:05.084072 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552194-t8xbz"] Mar 10 08:40:05 crc kubenswrapper[4825]: I0310 08:40:05.097342 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552194-t8xbz"] Mar 10 08:40:05 crc kubenswrapper[4825]: I0310 08:40:05.253741 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a13eabac-e76b-4096-a5de-9bb438d19a1e" path="/var/lib/kubelet/pods/a13eabac-e76b-4096-a5de-9bb438d19a1e/volumes" Mar 10 08:40:13 crc kubenswrapper[4825]: I0310 08:40:13.238811 4825 scope.go:117] "RemoveContainer" containerID="e35b101378bc0043b3ed9549d6a5f8744f6f01d05a4da2d7392043a03928772d" Mar 10 08:40:13 crc kubenswrapper[4825]: E0310 08:40:13.239559 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:40:26 crc kubenswrapper[4825]: I0310 08:40:26.237299 4825 scope.go:117] "RemoveContainer" containerID="e35b101378bc0043b3ed9549d6a5f8744f6f01d05a4da2d7392043a03928772d" Mar 10 08:40:26 crc kubenswrapper[4825]: E0310 08:40:26.238603 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" 
podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:40:29 crc kubenswrapper[4825]: I0310 08:40:29.935371 4825 generic.go:334] "Generic (PLEG): container finished" podID="547aa46f-b19d-4704-89c0-4c27e28ba30e" containerID="9fe387207ff94d7615d3c98234bfaef5dc7540cdc69e4664faec6d1771c95d79" exitCode=0 Mar 10 08:40:29 crc kubenswrapper[4825]: I0310 08:40:29.935448 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-grqk8" event={"ID":"547aa46f-b19d-4704-89c0-4c27e28ba30e","Type":"ContainerDied","Data":"9fe387207ff94d7615d3c98234bfaef5dc7540cdc69e4664faec6d1771c95d79"} Mar 10 08:40:31 crc kubenswrapper[4825]: I0310 08:40:31.394653 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-grqk8" Mar 10 08:40:31 crc kubenswrapper[4825]: I0310 08:40:31.516087 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/547aa46f-b19d-4704-89c0-4c27e28ba30e-bootstrap-combined-ca-bundle\") pod \"547aa46f-b19d-4704-89c0-4c27e28ba30e\" (UID: \"547aa46f-b19d-4704-89c0-4c27e28ba30e\") " Mar 10 08:40:31 crc kubenswrapper[4825]: I0310 08:40:31.516347 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/547aa46f-b19d-4704-89c0-4c27e28ba30e-ssh-key-openstack-cell1\") pod \"547aa46f-b19d-4704-89c0-4c27e28ba30e\" (UID: \"547aa46f-b19d-4704-89c0-4c27e28ba30e\") " Mar 10 08:40:31 crc kubenswrapper[4825]: I0310 08:40:31.516509 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xthf8\" (UniqueName: \"kubernetes.io/projected/547aa46f-b19d-4704-89c0-4c27e28ba30e-kube-api-access-xthf8\") pod \"547aa46f-b19d-4704-89c0-4c27e28ba30e\" (UID: \"547aa46f-b19d-4704-89c0-4c27e28ba30e\") " Mar 10 08:40:31 crc kubenswrapper[4825]: I0310 
08:40:31.516556 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/547aa46f-b19d-4704-89c0-4c27e28ba30e-inventory\") pod \"547aa46f-b19d-4704-89c0-4c27e28ba30e\" (UID: \"547aa46f-b19d-4704-89c0-4c27e28ba30e\") " Mar 10 08:40:31 crc kubenswrapper[4825]: I0310 08:40:31.522061 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/547aa46f-b19d-4704-89c0-4c27e28ba30e-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "547aa46f-b19d-4704-89c0-4c27e28ba30e" (UID: "547aa46f-b19d-4704-89c0-4c27e28ba30e"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:40:31 crc kubenswrapper[4825]: I0310 08:40:31.522129 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/547aa46f-b19d-4704-89c0-4c27e28ba30e-kube-api-access-xthf8" (OuterVolumeSpecName: "kube-api-access-xthf8") pod "547aa46f-b19d-4704-89c0-4c27e28ba30e" (UID: "547aa46f-b19d-4704-89c0-4c27e28ba30e"). InnerVolumeSpecName "kube-api-access-xthf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:40:31 crc kubenswrapper[4825]: I0310 08:40:31.543339 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/547aa46f-b19d-4704-89c0-4c27e28ba30e-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "547aa46f-b19d-4704-89c0-4c27e28ba30e" (UID: "547aa46f-b19d-4704-89c0-4c27e28ba30e"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:40:31 crc kubenswrapper[4825]: I0310 08:40:31.544010 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/547aa46f-b19d-4704-89c0-4c27e28ba30e-inventory" (OuterVolumeSpecName: "inventory") pod "547aa46f-b19d-4704-89c0-4c27e28ba30e" (UID: "547aa46f-b19d-4704-89c0-4c27e28ba30e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:40:31 crc kubenswrapper[4825]: I0310 08:40:31.618970 4825 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/547aa46f-b19d-4704-89c0-4c27e28ba30e-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 10 08:40:31 crc kubenswrapper[4825]: I0310 08:40:31.618998 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xthf8\" (UniqueName: \"kubernetes.io/projected/547aa46f-b19d-4704-89c0-4c27e28ba30e-kube-api-access-xthf8\") on node \"crc\" DevicePath \"\"" Mar 10 08:40:31 crc kubenswrapper[4825]: I0310 08:40:31.619008 4825 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/547aa46f-b19d-4704-89c0-4c27e28ba30e-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 08:40:31 crc kubenswrapper[4825]: I0310 08:40:31.619037 4825 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/547aa46f-b19d-4704-89c0-4c27e28ba30e-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:40:31 crc kubenswrapper[4825]: I0310 08:40:31.954314 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-grqk8" event={"ID":"547aa46f-b19d-4704-89c0-4c27e28ba30e","Type":"ContainerDied","Data":"aa6a73da9501b0fbaaca9be8d96c3215288723a02fe258e8e672667c549bc14b"} Mar 10 08:40:31 crc kubenswrapper[4825]: I0310 08:40:31.954361 4825 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa6a73da9501b0fbaaca9be8d96c3215288723a02fe258e8e672667c549bc14b" Mar 10 08:40:31 crc kubenswrapper[4825]: I0310 08:40:31.954399 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-grqk8" Mar 10 08:40:32 crc kubenswrapper[4825]: I0310 08:40:32.035179 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-mhc2l"] Mar 10 08:40:32 crc kubenswrapper[4825]: E0310 08:40:32.035579 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="547aa46f-b19d-4704-89c0-4c27e28ba30e" containerName="bootstrap-openstack-openstack-cell1" Mar 10 08:40:32 crc kubenswrapper[4825]: I0310 08:40:32.035597 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="547aa46f-b19d-4704-89c0-4c27e28ba30e" containerName="bootstrap-openstack-openstack-cell1" Mar 10 08:40:32 crc kubenswrapper[4825]: E0310 08:40:32.035612 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e370149-7b09-44f2-a363-38d1fddc8c60" containerName="oc" Mar 10 08:40:32 crc kubenswrapper[4825]: I0310 08:40:32.035620 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e370149-7b09-44f2-a363-38d1fddc8c60" containerName="oc" Mar 10 08:40:32 crc kubenswrapper[4825]: I0310 08:40:32.035821 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="547aa46f-b19d-4704-89c0-4c27e28ba30e" containerName="bootstrap-openstack-openstack-cell1" Mar 10 08:40:32 crc kubenswrapper[4825]: I0310 08:40:32.035848 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e370149-7b09-44f2-a363-38d1fddc8c60" containerName="oc" Mar 10 08:40:32 crc kubenswrapper[4825]: I0310 08:40:32.036603 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-mhc2l" Mar 10 08:40:32 crc kubenswrapper[4825]: I0310 08:40:32.040396 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-5twvv" Mar 10 08:40:32 crc kubenswrapper[4825]: I0310 08:40:32.040630 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 10 08:40:32 crc kubenswrapper[4825]: I0310 08:40:32.040797 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 08:40:32 crc kubenswrapper[4825]: I0310 08:40:32.041270 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 10 08:40:32 crc kubenswrapper[4825]: I0310 08:40:32.045749 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-mhc2l"] Mar 10 08:40:32 crc kubenswrapper[4825]: I0310 08:40:32.230052 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ae858185-87e8-423e-86ff-cc5b55199c37-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-mhc2l\" (UID: \"ae858185-87e8-423e-86ff-cc5b55199c37\") " pod="openstack/download-cache-openstack-openstack-cell1-mhc2l" Mar 10 08:40:32 crc kubenswrapper[4825]: I0310 08:40:32.230310 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ntxl\" (UniqueName: \"kubernetes.io/projected/ae858185-87e8-423e-86ff-cc5b55199c37-kube-api-access-4ntxl\") pod \"download-cache-openstack-openstack-cell1-mhc2l\" (UID: \"ae858185-87e8-423e-86ff-cc5b55199c37\") " pod="openstack/download-cache-openstack-openstack-cell1-mhc2l" Mar 10 08:40:32 crc kubenswrapper[4825]: I0310 08:40:32.230625 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae858185-87e8-423e-86ff-cc5b55199c37-inventory\") pod \"download-cache-openstack-openstack-cell1-mhc2l\" (UID: \"ae858185-87e8-423e-86ff-cc5b55199c37\") " pod="openstack/download-cache-openstack-openstack-cell1-mhc2l" Mar 10 08:40:32 crc kubenswrapper[4825]: I0310 08:40:32.333274 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ntxl\" (UniqueName: \"kubernetes.io/projected/ae858185-87e8-423e-86ff-cc5b55199c37-kube-api-access-4ntxl\") pod \"download-cache-openstack-openstack-cell1-mhc2l\" (UID: \"ae858185-87e8-423e-86ff-cc5b55199c37\") " pod="openstack/download-cache-openstack-openstack-cell1-mhc2l" Mar 10 08:40:32 crc kubenswrapper[4825]: I0310 08:40:32.333428 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae858185-87e8-423e-86ff-cc5b55199c37-inventory\") pod \"download-cache-openstack-openstack-cell1-mhc2l\" (UID: \"ae858185-87e8-423e-86ff-cc5b55199c37\") " pod="openstack/download-cache-openstack-openstack-cell1-mhc2l" Mar 10 08:40:32 crc kubenswrapper[4825]: I0310 08:40:32.333517 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ae858185-87e8-423e-86ff-cc5b55199c37-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-mhc2l\" (UID: \"ae858185-87e8-423e-86ff-cc5b55199c37\") " pod="openstack/download-cache-openstack-openstack-cell1-mhc2l" Mar 10 08:40:32 crc kubenswrapper[4825]: I0310 08:40:32.345198 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae858185-87e8-423e-86ff-cc5b55199c37-inventory\") pod \"download-cache-openstack-openstack-cell1-mhc2l\" (UID: \"ae858185-87e8-423e-86ff-cc5b55199c37\") " 
pod="openstack/download-cache-openstack-openstack-cell1-mhc2l" Mar 10 08:40:32 crc kubenswrapper[4825]: I0310 08:40:32.345434 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ae858185-87e8-423e-86ff-cc5b55199c37-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-mhc2l\" (UID: \"ae858185-87e8-423e-86ff-cc5b55199c37\") " pod="openstack/download-cache-openstack-openstack-cell1-mhc2l" Mar 10 08:40:32 crc kubenswrapper[4825]: I0310 08:40:32.361060 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ntxl\" (UniqueName: \"kubernetes.io/projected/ae858185-87e8-423e-86ff-cc5b55199c37-kube-api-access-4ntxl\") pod \"download-cache-openstack-openstack-cell1-mhc2l\" (UID: \"ae858185-87e8-423e-86ff-cc5b55199c37\") " pod="openstack/download-cache-openstack-openstack-cell1-mhc2l" Mar 10 08:40:32 crc kubenswrapper[4825]: I0310 08:40:32.655951 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-mhc2l" Mar 10 08:40:33 crc kubenswrapper[4825]: I0310 08:40:33.157355 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-mhc2l"] Mar 10 08:40:33 crc kubenswrapper[4825]: I0310 08:40:33.971178 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-mhc2l" event={"ID":"ae858185-87e8-423e-86ff-cc5b55199c37","Type":"ContainerStarted","Data":"60a1d28f21c6e2b291ecf30a00eb414a390aee48352e4738d2fac69872baea94"} Mar 10 08:40:34 crc kubenswrapper[4825]: I0310 08:40:34.980540 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-mhc2l" event={"ID":"ae858185-87e8-423e-86ff-cc5b55199c37","Type":"ContainerStarted","Data":"fca582afa2e4cd6d6c0a95ab8044f8a3322374b08624c4502e67ce59f2d83f0b"} Mar 10 08:40:35 crc kubenswrapper[4825]: I0310 08:40:35.001252 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-cell1-mhc2l" podStartSLOduration=2.433406926 podStartE2EDuration="3.001230342s" podCreationTimestamp="2026-03-10 08:40:32 +0000 UTC" firstStartedPulling="2026-03-10 08:40:33.16081239 +0000 UTC m=+6986.190593005" lastFinishedPulling="2026-03-10 08:40:33.728635806 +0000 UTC m=+6986.758416421" observedRunningTime="2026-03-10 08:40:34.995314386 +0000 UTC m=+6988.025095011" watchObservedRunningTime="2026-03-10 08:40:35.001230342 +0000 UTC m=+6988.031010987" Mar 10 08:40:40 crc kubenswrapper[4825]: I0310 08:40:40.238894 4825 scope.go:117] "RemoveContainer" containerID="e35b101378bc0043b3ed9549d6a5f8744f6f01d05a4da2d7392043a03928772d" Mar 10 08:40:40 crc kubenswrapper[4825]: E0310 08:40:40.239697 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:40:53 crc kubenswrapper[4825]: I0310 08:40:53.630700 4825 scope.go:117] "RemoveContainer" containerID="5c0e1fdc5ff89d22e308b9fb880226d71a12c2676799a9486b6f2a3d3322e033" Mar 10 08:40:55 crc kubenswrapper[4825]: I0310 08:40:55.236305 4825 scope.go:117] "RemoveContainer" containerID="e35b101378bc0043b3ed9549d6a5f8744f6f01d05a4da2d7392043a03928772d" Mar 10 08:40:55 crc kubenswrapper[4825]: E0310 08:40:55.236906 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:41:07 crc kubenswrapper[4825]: I0310 08:41:07.237698 4825 scope.go:117] "RemoveContainer" containerID="e35b101378bc0043b3ed9549d6a5f8744f6f01d05a4da2d7392043a03928772d" Mar 10 08:41:07 crc kubenswrapper[4825]: E0310 08:41:07.238500 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:41:21 crc kubenswrapper[4825]: I0310 08:41:21.241398 4825 scope.go:117] "RemoveContainer" containerID="e35b101378bc0043b3ed9549d6a5f8744f6f01d05a4da2d7392043a03928772d" Mar 10 08:41:21 crc kubenswrapper[4825]: 
E0310 08:41:21.242118 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:41:34 crc kubenswrapper[4825]: I0310 08:41:34.237186 4825 scope.go:117] "RemoveContainer" containerID="e35b101378bc0043b3ed9549d6a5f8744f6f01d05a4da2d7392043a03928772d" Mar 10 08:41:34 crc kubenswrapper[4825]: E0310 08:41:34.237996 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:41:34 crc kubenswrapper[4825]: I0310 08:41:34.794990 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-94qvt"] Mar 10 08:41:34 crc kubenswrapper[4825]: I0310 08:41:34.797820 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-94qvt" Mar 10 08:41:34 crc kubenswrapper[4825]: I0310 08:41:34.809585 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-94qvt"] Mar 10 08:41:34 crc kubenswrapper[4825]: I0310 08:41:34.902396 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks84p\" (UniqueName: \"kubernetes.io/projected/250a7fd8-0822-460a-99fc-e92cb2ebc34f-kube-api-access-ks84p\") pod \"redhat-marketplace-94qvt\" (UID: \"250a7fd8-0822-460a-99fc-e92cb2ebc34f\") " pod="openshift-marketplace/redhat-marketplace-94qvt" Mar 10 08:41:34 crc kubenswrapper[4825]: I0310 08:41:34.902564 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/250a7fd8-0822-460a-99fc-e92cb2ebc34f-utilities\") pod \"redhat-marketplace-94qvt\" (UID: \"250a7fd8-0822-460a-99fc-e92cb2ebc34f\") " pod="openshift-marketplace/redhat-marketplace-94qvt" Mar 10 08:41:34 crc kubenswrapper[4825]: I0310 08:41:34.902595 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/250a7fd8-0822-460a-99fc-e92cb2ebc34f-catalog-content\") pod \"redhat-marketplace-94qvt\" (UID: \"250a7fd8-0822-460a-99fc-e92cb2ebc34f\") " pod="openshift-marketplace/redhat-marketplace-94qvt" Mar 10 08:41:35 crc kubenswrapper[4825]: I0310 08:41:35.004838 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks84p\" (UniqueName: \"kubernetes.io/projected/250a7fd8-0822-460a-99fc-e92cb2ebc34f-kube-api-access-ks84p\") pod \"redhat-marketplace-94qvt\" (UID: \"250a7fd8-0822-460a-99fc-e92cb2ebc34f\") " pod="openshift-marketplace/redhat-marketplace-94qvt" Mar 10 08:41:35 crc kubenswrapper[4825]: I0310 08:41:35.004971 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/250a7fd8-0822-460a-99fc-e92cb2ebc34f-utilities\") pod \"redhat-marketplace-94qvt\" (UID: \"250a7fd8-0822-460a-99fc-e92cb2ebc34f\") " pod="openshift-marketplace/redhat-marketplace-94qvt" Mar 10 08:41:35 crc kubenswrapper[4825]: I0310 08:41:35.004994 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/250a7fd8-0822-460a-99fc-e92cb2ebc34f-catalog-content\") pod \"redhat-marketplace-94qvt\" (UID: \"250a7fd8-0822-460a-99fc-e92cb2ebc34f\") " pod="openshift-marketplace/redhat-marketplace-94qvt" Mar 10 08:41:35 crc kubenswrapper[4825]: I0310 08:41:35.005512 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/250a7fd8-0822-460a-99fc-e92cb2ebc34f-utilities\") pod \"redhat-marketplace-94qvt\" (UID: \"250a7fd8-0822-460a-99fc-e92cb2ebc34f\") " pod="openshift-marketplace/redhat-marketplace-94qvt" Mar 10 08:41:35 crc kubenswrapper[4825]: I0310 08:41:35.005550 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/250a7fd8-0822-460a-99fc-e92cb2ebc34f-catalog-content\") pod \"redhat-marketplace-94qvt\" (UID: \"250a7fd8-0822-460a-99fc-e92cb2ebc34f\") " pod="openshift-marketplace/redhat-marketplace-94qvt" Mar 10 08:41:35 crc kubenswrapper[4825]: I0310 08:41:35.025874 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks84p\" (UniqueName: \"kubernetes.io/projected/250a7fd8-0822-460a-99fc-e92cb2ebc34f-kube-api-access-ks84p\") pod \"redhat-marketplace-94qvt\" (UID: \"250a7fd8-0822-460a-99fc-e92cb2ebc34f\") " pod="openshift-marketplace/redhat-marketplace-94qvt" Mar 10 08:41:35 crc kubenswrapper[4825]: I0310 08:41:35.118474 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-94qvt" Mar 10 08:41:35 crc kubenswrapper[4825]: I0310 08:41:35.609124 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-94qvt"] Mar 10 08:41:35 crc kubenswrapper[4825]: W0310 08:41:35.609304 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod250a7fd8_0822_460a_99fc_e92cb2ebc34f.slice/crio-58f816a7148e9ae7a17111a91c7bdb57b67b0092ce1384827bc47876ec0e1215 WatchSource:0}: Error finding container 58f816a7148e9ae7a17111a91c7bdb57b67b0092ce1384827bc47876ec0e1215: Status 404 returned error can't find the container with id 58f816a7148e9ae7a17111a91c7bdb57b67b0092ce1384827bc47876ec0e1215 Mar 10 08:41:36 crc kubenswrapper[4825]: I0310 08:41:36.532281 4825 generic.go:334] "Generic (PLEG): container finished" podID="250a7fd8-0822-460a-99fc-e92cb2ebc34f" containerID="2a611bdc9e780bbfcb41f152be710f6ae91bf7e67be28dd0900c7098c844435a" exitCode=0 Mar 10 08:41:36 crc kubenswrapper[4825]: I0310 08:41:36.532447 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-94qvt" event={"ID":"250a7fd8-0822-460a-99fc-e92cb2ebc34f","Type":"ContainerDied","Data":"2a611bdc9e780bbfcb41f152be710f6ae91bf7e67be28dd0900c7098c844435a"} Mar 10 08:41:36 crc kubenswrapper[4825]: I0310 08:41:36.532636 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-94qvt" event={"ID":"250a7fd8-0822-460a-99fc-e92cb2ebc34f","Type":"ContainerStarted","Data":"58f816a7148e9ae7a17111a91c7bdb57b67b0092ce1384827bc47876ec0e1215"} Mar 10 08:41:36 crc kubenswrapper[4825]: I0310 08:41:36.535474 4825 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 08:41:37 crc kubenswrapper[4825]: I0310 08:41:37.543839 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-94qvt" event={"ID":"250a7fd8-0822-460a-99fc-e92cb2ebc34f","Type":"ContainerStarted","Data":"17388bcf63cfac3188d2dddc269649cf8021570fe8c90349c8edccd438781933"} Mar 10 08:41:38 crc kubenswrapper[4825]: I0310 08:41:38.556108 4825 generic.go:334] "Generic (PLEG): container finished" podID="250a7fd8-0822-460a-99fc-e92cb2ebc34f" containerID="17388bcf63cfac3188d2dddc269649cf8021570fe8c90349c8edccd438781933" exitCode=0 Mar 10 08:41:38 crc kubenswrapper[4825]: I0310 08:41:38.556215 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-94qvt" event={"ID":"250a7fd8-0822-460a-99fc-e92cb2ebc34f","Type":"ContainerDied","Data":"17388bcf63cfac3188d2dddc269649cf8021570fe8c90349c8edccd438781933"} Mar 10 08:41:39 crc kubenswrapper[4825]: I0310 08:41:39.569861 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-94qvt" event={"ID":"250a7fd8-0822-460a-99fc-e92cb2ebc34f","Type":"ContainerStarted","Data":"3190b82f60a01569d31a0d3d89359f7a3ab771e6eba4fc5f95fdc8fe0c84f231"} Mar 10 08:41:39 crc kubenswrapper[4825]: I0310 08:41:39.586624 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-94qvt" podStartSLOduration=3.161457397 podStartE2EDuration="5.586603543s" podCreationTimestamp="2026-03-10 08:41:34 +0000 UTC" firstStartedPulling="2026-03-10 08:41:36.534956243 +0000 UTC m=+7049.564736878" lastFinishedPulling="2026-03-10 08:41:38.960102389 +0000 UTC m=+7051.989883024" observedRunningTime="2026-03-10 08:41:39.585396961 +0000 UTC m=+7052.615177586" watchObservedRunningTime="2026-03-10 08:41:39.586603543 +0000 UTC m=+7052.616384158" Mar 10 08:41:45 crc kubenswrapper[4825]: I0310 08:41:45.119517 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-94qvt" Mar 10 08:41:45 crc kubenswrapper[4825]: I0310 08:41:45.120066 4825 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-94qvt" Mar 10 08:41:45 crc kubenswrapper[4825]: I0310 08:41:45.173403 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-94qvt" Mar 10 08:41:45 crc kubenswrapper[4825]: I0310 08:41:45.669519 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-94qvt" Mar 10 08:41:45 crc kubenswrapper[4825]: I0310 08:41:45.727933 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-94qvt"] Mar 10 08:41:47 crc kubenswrapper[4825]: I0310 08:41:47.646147 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-94qvt" podUID="250a7fd8-0822-460a-99fc-e92cb2ebc34f" containerName="registry-server" containerID="cri-o://3190b82f60a01569d31a0d3d89359f7a3ab771e6eba4fc5f95fdc8fe0c84f231" gracePeriod=2 Mar 10 08:41:48 crc kubenswrapper[4825]: I0310 08:41:48.157022 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-94qvt" Mar 10 08:41:48 crc kubenswrapper[4825]: I0310 08:41:48.267068 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/250a7fd8-0822-460a-99fc-e92cb2ebc34f-catalog-content\") pod \"250a7fd8-0822-460a-99fc-e92cb2ebc34f\" (UID: \"250a7fd8-0822-460a-99fc-e92cb2ebc34f\") " Mar 10 08:41:48 crc kubenswrapper[4825]: I0310 08:41:48.267170 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ks84p\" (UniqueName: \"kubernetes.io/projected/250a7fd8-0822-460a-99fc-e92cb2ebc34f-kube-api-access-ks84p\") pod \"250a7fd8-0822-460a-99fc-e92cb2ebc34f\" (UID: \"250a7fd8-0822-460a-99fc-e92cb2ebc34f\") " Mar 10 08:41:48 crc kubenswrapper[4825]: I0310 08:41:48.267436 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/250a7fd8-0822-460a-99fc-e92cb2ebc34f-utilities\") pod \"250a7fd8-0822-460a-99fc-e92cb2ebc34f\" (UID: \"250a7fd8-0822-460a-99fc-e92cb2ebc34f\") " Mar 10 08:41:48 crc kubenswrapper[4825]: I0310 08:41:48.268249 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/250a7fd8-0822-460a-99fc-e92cb2ebc34f-utilities" (OuterVolumeSpecName: "utilities") pod "250a7fd8-0822-460a-99fc-e92cb2ebc34f" (UID: "250a7fd8-0822-460a-99fc-e92cb2ebc34f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:41:48 crc kubenswrapper[4825]: I0310 08:41:48.272718 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/250a7fd8-0822-460a-99fc-e92cb2ebc34f-kube-api-access-ks84p" (OuterVolumeSpecName: "kube-api-access-ks84p") pod "250a7fd8-0822-460a-99fc-e92cb2ebc34f" (UID: "250a7fd8-0822-460a-99fc-e92cb2ebc34f"). InnerVolumeSpecName "kube-api-access-ks84p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:41:48 crc kubenswrapper[4825]: I0310 08:41:48.310860 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/250a7fd8-0822-460a-99fc-e92cb2ebc34f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "250a7fd8-0822-460a-99fc-e92cb2ebc34f" (UID: "250a7fd8-0822-460a-99fc-e92cb2ebc34f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:41:48 crc kubenswrapper[4825]: I0310 08:41:48.370245 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ks84p\" (UniqueName: \"kubernetes.io/projected/250a7fd8-0822-460a-99fc-e92cb2ebc34f-kube-api-access-ks84p\") on node \"crc\" DevicePath \"\"" Mar 10 08:41:48 crc kubenswrapper[4825]: I0310 08:41:48.370287 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/250a7fd8-0822-460a-99fc-e92cb2ebc34f-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 08:41:48 crc kubenswrapper[4825]: I0310 08:41:48.370297 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/250a7fd8-0822-460a-99fc-e92cb2ebc34f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 08:41:48 crc kubenswrapper[4825]: I0310 08:41:48.660299 4825 generic.go:334] "Generic (PLEG): container finished" podID="250a7fd8-0822-460a-99fc-e92cb2ebc34f" containerID="3190b82f60a01569d31a0d3d89359f7a3ab771e6eba4fc5f95fdc8fe0c84f231" exitCode=0 Mar 10 08:41:48 crc kubenswrapper[4825]: I0310 08:41:48.660352 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-94qvt" event={"ID":"250a7fd8-0822-460a-99fc-e92cb2ebc34f","Type":"ContainerDied","Data":"3190b82f60a01569d31a0d3d89359f7a3ab771e6eba4fc5f95fdc8fe0c84f231"} Mar 10 08:41:48 crc kubenswrapper[4825]: I0310 08:41:48.660385 4825 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-94qvt" event={"ID":"250a7fd8-0822-460a-99fc-e92cb2ebc34f","Type":"ContainerDied","Data":"58f816a7148e9ae7a17111a91c7bdb57b67b0092ce1384827bc47876ec0e1215"} Mar 10 08:41:48 crc kubenswrapper[4825]: I0310 08:41:48.660406 4825 scope.go:117] "RemoveContainer" containerID="3190b82f60a01569d31a0d3d89359f7a3ab771e6eba4fc5f95fdc8fe0c84f231" Mar 10 08:41:48 crc kubenswrapper[4825]: I0310 08:41:48.660570 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-94qvt" Mar 10 08:41:48 crc kubenswrapper[4825]: I0310 08:41:48.693610 4825 scope.go:117] "RemoveContainer" containerID="17388bcf63cfac3188d2dddc269649cf8021570fe8c90349c8edccd438781933" Mar 10 08:41:48 crc kubenswrapper[4825]: I0310 08:41:48.707161 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-94qvt"] Mar 10 08:41:48 crc kubenswrapper[4825]: I0310 08:41:48.716520 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-94qvt"] Mar 10 08:41:48 crc kubenswrapper[4825]: I0310 08:41:48.721111 4825 scope.go:117] "RemoveContainer" containerID="2a611bdc9e780bbfcb41f152be710f6ae91bf7e67be28dd0900c7098c844435a" Mar 10 08:41:48 crc kubenswrapper[4825]: I0310 08:41:48.780349 4825 scope.go:117] "RemoveContainer" containerID="3190b82f60a01569d31a0d3d89359f7a3ab771e6eba4fc5f95fdc8fe0c84f231" Mar 10 08:41:48 crc kubenswrapper[4825]: E0310 08:41:48.780792 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3190b82f60a01569d31a0d3d89359f7a3ab771e6eba4fc5f95fdc8fe0c84f231\": container with ID starting with 3190b82f60a01569d31a0d3d89359f7a3ab771e6eba4fc5f95fdc8fe0c84f231 not found: ID does not exist" containerID="3190b82f60a01569d31a0d3d89359f7a3ab771e6eba4fc5f95fdc8fe0c84f231" Mar 10 08:41:48 crc kubenswrapper[4825]: I0310 08:41:48.780834 4825 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3190b82f60a01569d31a0d3d89359f7a3ab771e6eba4fc5f95fdc8fe0c84f231"} err="failed to get container status \"3190b82f60a01569d31a0d3d89359f7a3ab771e6eba4fc5f95fdc8fe0c84f231\": rpc error: code = NotFound desc = could not find container \"3190b82f60a01569d31a0d3d89359f7a3ab771e6eba4fc5f95fdc8fe0c84f231\": container with ID starting with 3190b82f60a01569d31a0d3d89359f7a3ab771e6eba4fc5f95fdc8fe0c84f231 not found: ID does not exist" Mar 10 08:41:48 crc kubenswrapper[4825]: I0310 08:41:48.780860 4825 scope.go:117] "RemoveContainer" containerID="17388bcf63cfac3188d2dddc269649cf8021570fe8c90349c8edccd438781933" Mar 10 08:41:48 crc kubenswrapper[4825]: E0310 08:41:48.781099 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17388bcf63cfac3188d2dddc269649cf8021570fe8c90349c8edccd438781933\": container with ID starting with 17388bcf63cfac3188d2dddc269649cf8021570fe8c90349c8edccd438781933 not found: ID does not exist" containerID="17388bcf63cfac3188d2dddc269649cf8021570fe8c90349c8edccd438781933" Mar 10 08:41:48 crc kubenswrapper[4825]: I0310 08:41:48.781158 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17388bcf63cfac3188d2dddc269649cf8021570fe8c90349c8edccd438781933"} err="failed to get container status \"17388bcf63cfac3188d2dddc269649cf8021570fe8c90349c8edccd438781933\": rpc error: code = NotFound desc = could not find container \"17388bcf63cfac3188d2dddc269649cf8021570fe8c90349c8edccd438781933\": container with ID starting with 17388bcf63cfac3188d2dddc269649cf8021570fe8c90349c8edccd438781933 not found: ID does not exist" Mar 10 08:41:48 crc kubenswrapper[4825]: I0310 08:41:48.781178 4825 scope.go:117] "RemoveContainer" containerID="2a611bdc9e780bbfcb41f152be710f6ae91bf7e67be28dd0900c7098c844435a" Mar 10 08:41:48 crc kubenswrapper[4825]: E0310 
08:41:48.781413 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a611bdc9e780bbfcb41f152be710f6ae91bf7e67be28dd0900c7098c844435a\": container with ID starting with 2a611bdc9e780bbfcb41f152be710f6ae91bf7e67be28dd0900c7098c844435a not found: ID does not exist" containerID="2a611bdc9e780bbfcb41f152be710f6ae91bf7e67be28dd0900c7098c844435a" Mar 10 08:41:48 crc kubenswrapper[4825]: I0310 08:41:48.781442 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a611bdc9e780bbfcb41f152be710f6ae91bf7e67be28dd0900c7098c844435a"} err="failed to get container status \"2a611bdc9e780bbfcb41f152be710f6ae91bf7e67be28dd0900c7098c844435a\": rpc error: code = NotFound desc = could not find container \"2a611bdc9e780bbfcb41f152be710f6ae91bf7e67be28dd0900c7098c844435a\": container with ID starting with 2a611bdc9e780bbfcb41f152be710f6ae91bf7e67be28dd0900c7098c844435a not found: ID does not exist" Mar 10 08:41:49 crc kubenswrapper[4825]: I0310 08:41:49.244733 4825 scope.go:117] "RemoveContainer" containerID="e35b101378bc0043b3ed9549d6a5f8744f6f01d05a4da2d7392043a03928772d" Mar 10 08:41:49 crc kubenswrapper[4825]: E0310 08:41:49.245093 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:41:49 crc kubenswrapper[4825]: I0310 08:41:49.257828 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="250a7fd8-0822-460a-99fc-e92cb2ebc34f" path="/var/lib/kubelet/pods/250a7fd8-0822-460a-99fc-e92cb2ebc34f/volumes" Mar 10 08:42:00 crc kubenswrapper[4825]: I0310 08:42:00.168418 
4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552202-r7794"] Mar 10 08:42:00 crc kubenswrapper[4825]: E0310 08:42:00.169306 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="250a7fd8-0822-460a-99fc-e92cb2ebc34f" containerName="extract-utilities" Mar 10 08:42:00 crc kubenswrapper[4825]: I0310 08:42:00.169321 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="250a7fd8-0822-460a-99fc-e92cb2ebc34f" containerName="extract-utilities" Mar 10 08:42:00 crc kubenswrapper[4825]: E0310 08:42:00.169353 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="250a7fd8-0822-460a-99fc-e92cb2ebc34f" containerName="registry-server" Mar 10 08:42:00 crc kubenswrapper[4825]: I0310 08:42:00.169361 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="250a7fd8-0822-460a-99fc-e92cb2ebc34f" containerName="registry-server" Mar 10 08:42:00 crc kubenswrapper[4825]: E0310 08:42:00.169378 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="250a7fd8-0822-460a-99fc-e92cb2ebc34f" containerName="extract-content" Mar 10 08:42:00 crc kubenswrapper[4825]: I0310 08:42:00.169386 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="250a7fd8-0822-460a-99fc-e92cb2ebc34f" containerName="extract-content" Mar 10 08:42:00 crc kubenswrapper[4825]: I0310 08:42:00.169610 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="250a7fd8-0822-460a-99fc-e92cb2ebc34f" containerName="registry-server" Mar 10 08:42:00 crc kubenswrapper[4825]: I0310 08:42:00.170425 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552202-r7794" Mar 10 08:42:00 crc kubenswrapper[4825]: I0310 08:42:00.188107 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 08:42:00 crc kubenswrapper[4825]: I0310 08:42:00.188197 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 08:42:00 crc kubenswrapper[4825]: I0310 08:42:00.188118 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 08:42:00 crc kubenswrapper[4825]: I0310 08:42:00.193932 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552202-r7794"] Mar 10 08:42:00 crc kubenswrapper[4825]: I0310 08:42:00.266414 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wjb2\" (UniqueName: \"kubernetes.io/projected/4bdfaf77-bd08-4483-a7c4-ebb450e783f9-kube-api-access-7wjb2\") pod \"auto-csr-approver-29552202-r7794\" (UID: \"4bdfaf77-bd08-4483-a7c4-ebb450e783f9\") " pod="openshift-infra/auto-csr-approver-29552202-r7794" Mar 10 08:42:00 crc kubenswrapper[4825]: I0310 08:42:00.368485 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wjb2\" (UniqueName: \"kubernetes.io/projected/4bdfaf77-bd08-4483-a7c4-ebb450e783f9-kube-api-access-7wjb2\") pod \"auto-csr-approver-29552202-r7794\" (UID: \"4bdfaf77-bd08-4483-a7c4-ebb450e783f9\") " pod="openshift-infra/auto-csr-approver-29552202-r7794" Mar 10 08:42:00 crc kubenswrapper[4825]: I0310 08:42:00.391111 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wjb2\" (UniqueName: \"kubernetes.io/projected/4bdfaf77-bd08-4483-a7c4-ebb450e783f9-kube-api-access-7wjb2\") pod \"auto-csr-approver-29552202-r7794\" (UID: \"4bdfaf77-bd08-4483-a7c4-ebb450e783f9\") " 
pod="openshift-infra/auto-csr-approver-29552202-r7794" Mar 10 08:42:00 crc kubenswrapper[4825]: I0310 08:42:00.502403 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552202-r7794" Mar 10 08:42:00 crc kubenswrapper[4825]: I0310 08:42:00.986672 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552202-r7794"] Mar 10 08:42:01 crc kubenswrapper[4825]: I0310 08:42:01.238180 4825 scope.go:117] "RemoveContainer" containerID="e35b101378bc0043b3ed9549d6a5f8744f6f01d05a4da2d7392043a03928772d" Mar 10 08:42:01 crc kubenswrapper[4825]: E0310 08:42:01.238420 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:42:01 crc kubenswrapper[4825]: I0310 08:42:01.800103 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552202-r7794" event={"ID":"4bdfaf77-bd08-4483-a7c4-ebb450e783f9","Type":"ContainerStarted","Data":"cea22fcf25458327339aee89c1cd841608da49cdbb4f224e3d161bf6ff7f462a"} Mar 10 08:42:02 crc kubenswrapper[4825]: I0310 08:42:02.814947 4825 generic.go:334] "Generic (PLEG): container finished" podID="4bdfaf77-bd08-4483-a7c4-ebb450e783f9" containerID="74e65c5cd7f3cbe836e4758447779e6aacc9fa2c0e2484ed78819f676958437d" exitCode=0 Mar 10 08:42:02 crc kubenswrapper[4825]: I0310 08:42:02.815103 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552202-r7794" event={"ID":"4bdfaf77-bd08-4483-a7c4-ebb450e783f9","Type":"ContainerDied","Data":"74e65c5cd7f3cbe836e4758447779e6aacc9fa2c0e2484ed78819f676958437d"} 
Mar 10 08:42:04 crc kubenswrapper[4825]: I0310 08:42:04.170509 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552202-r7794" Mar 10 08:42:04 crc kubenswrapper[4825]: I0310 08:42:04.257833 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wjb2\" (UniqueName: \"kubernetes.io/projected/4bdfaf77-bd08-4483-a7c4-ebb450e783f9-kube-api-access-7wjb2\") pod \"4bdfaf77-bd08-4483-a7c4-ebb450e783f9\" (UID: \"4bdfaf77-bd08-4483-a7c4-ebb450e783f9\") " Mar 10 08:42:04 crc kubenswrapper[4825]: I0310 08:42:04.263968 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bdfaf77-bd08-4483-a7c4-ebb450e783f9-kube-api-access-7wjb2" (OuterVolumeSpecName: "kube-api-access-7wjb2") pod "4bdfaf77-bd08-4483-a7c4-ebb450e783f9" (UID: "4bdfaf77-bd08-4483-a7c4-ebb450e783f9"). InnerVolumeSpecName "kube-api-access-7wjb2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:42:04 crc kubenswrapper[4825]: I0310 08:42:04.360467 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wjb2\" (UniqueName: \"kubernetes.io/projected/4bdfaf77-bd08-4483-a7c4-ebb450e783f9-kube-api-access-7wjb2\") on node \"crc\" DevicePath \"\"" Mar 10 08:42:04 crc kubenswrapper[4825]: I0310 08:42:04.839076 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552202-r7794" event={"ID":"4bdfaf77-bd08-4483-a7c4-ebb450e783f9","Type":"ContainerDied","Data":"cea22fcf25458327339aee89c1cd841608da49cdbb4f224e3d161bf6ff7f462a"} Mar 10 08:42:04 crc kubenswrapper[4825]: I0310 08:42:04.839148 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cea22fcf25458327339aee89c1cd841608da49cdbb4f224e3d161bf6ff7f462a" Mar 10 08:42:04 crc kubenswrapper[4825]: I0310 08:42:04.839781 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552202-r7794" Mar 10 08:42:05 crc kubenswrapper[4825]: I0310 08:42:05.247605 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552196-xvh7m"] Mar 10 08:42:05 crc kubenswrapper[4825]: I0310 08:42:05.254235 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552196-xvh7m"] Mar 10 08:42:07 crc kubenswrapper[4825]: I0310 08:42:07.252290 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0d88a47-f079-456f-a0e7-847c942efa88" path="/var/lib/kubelet/pods/e0d88a47-f079-456f-a0e7-847c942efa88/volumes" Mar 10 08:42:16 crc kubenswrapper[4825]: I0310 08:42:16.236821 4825 scope.go:117] "RemoveContainer" containerID="e35b101378bc0043b3ed9549d6a5f8744f6f01d05a4da2d7392043a03928772d" Mar 10 08:42:16 crc kubenswrapper[4825]: E0310 08:42:16.238512 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:42:18 crc kubenswrapper[4825]: I0310 08:42:18.606159 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9pl5q"] Mar 10 08:42:18 crc kubenswrapper[4825]: E0310 08:42:18.607234 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bdfaf77-bd08-4483-a7c4-ebb450e783f9" containerName="oc" Mar 10 08:42:18 crc kubenswrapper[4825]: I0310 08:42:18.607251 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bdfaf77-bd08-4483-a7c4-ebb450e783f9" containerName="oc" Mar 10 08:42:18 crc kubenswrapper[4825]: I0310 08:42:18.607495 4825 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="4bdfaf77-bd08-4483-a7c4-ebb450e783f9" containerName="oc" Mar 10 08:42:18 crc kubenswrapper[4825]: I0310 08:42:18.608967 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9pl5q" Mar 10 08:42:18 crc kubenswrapper[4825]: I0310 08:42:18.623492 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9pl5q"] Mar 10 08:42:18 crc kubenswrapper[4825]: I0310 08:42:18.643564 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f9ff4a6-484a-44ba-8d2e-1f43790ea38f-catalog-content\") pod \"community-operators-9pl5q\" (UID: \"0f9ff4a6-484a-44ba-8d2e-1f43790ea38f\") " pod="openshift-marketplace/community-operators-9pl5q" Mar 10 08:42:18 crc kubenswrapper[4825]: I0310 08:42:18.643632 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f9ff4a6-484a-44ba-8d2e-1f43790ea38f-utilities\") pod \"community-operators-9pl5q\" (UID: \"0f9ff4a6-484a-44ba-8d2e-1f43790ea38f\") " pod="openshift-marketplace/community-operators-9pl5q" Mar 10 08:42:18 crc kubenswrapper[4825]: I0310 08:42:18.643672 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4qtd\" (UniqueName: \"kubernetes.io/projected/0f9ff4a6-484a-44ba-8d2e-1f43790ea38f-kube-api-access-n4qtd\") pod \"community-operators-9pl5q\" (UID: \"0f9ff4a6-484a-44ba-8d2e-1f43790ea38f\") " pod="openshift-marketplace/community-operators-9pl5q" Mar 10 08:42:18 crc kubenswrapper[4825]: I0310 08:42:18.745331 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f9ff4a6-484a-44ba-8d2e-1f43790ea38f-catalog-content\") pod \"community-operators-9pl5q\" (UID: 
\"0f9ff4a6-484a-44ba-8d2e-1f43790ea38f\") " pod="openshift-marketplace/community-operators-9pl5q" Mar 10 08:42:18 crc kubenswrapper[4825]: I0310 08:42:18.745666 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f9ff4a6-484a-44ba-8d2e-1f43790ea38f-utilities\") pod \"community-operators-9pl5q\" (UID: \"0f9ff4a6-484a-44ba-8d2e-1f43790ea38f\") " pod="openshift-marketplace/community-operators-9pl5q" Mar 10 08:42:18 crc kubenswrapper[4825]: I0310 08:42:18.745814 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4qtd\" (UniqueName: \"kubernetes.io/projected/0f9ff4a6-484a-44ba-8d2e-1f43790ea38f-kube-api-access-n4qtd\") pod \"community-operators-9pl5q\" (UID: \"0f9ff4a6-484a-44ba-8d2e-1f43790ea38f\") " pod="openshift-marketplace/community-operators-9pl5q" Mar 10 08:42:18 crc kubenswrapper[4825]: I0310 08:42:18.745820 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f9ff4a6-484a-44ba-8d2e-1f43790ea38f-catalog-content\") pod \"community-operators-9pl5q\" (UID: \"0f9ff4a6-484a-44ba-8d2e-1f43790ea38f\") " pod="openshift-marketplace/community-operators-9pl5q" Mar 10 08:42:18 crc kubenswrapper[4825]: I0310 08:42:18.746144 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f9ff4a6-484a-44ba-8d2e-1f43790ea38f-utilities\") pod \"community-operators-9pl5q\" (UID: \"0f9ff4a6-484a-44ba-8d2e-1f43790ea38f\") " pod="openshift-marketplace/community-operators-9pl5q" Mar 10 08:42:18 crc kubenswrapper[4825]: I0310 08:42:18.766030 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4qtd\" (UniqueName: \"kubernetes.io/projected/0f9ff4a6-484a-44ba-8d2e-1f43790ea38f-kube-api-access-n4qtd\") pod \"community-operators-9pl5q\" (UID: 
\"0f9ff4a6-484a-44ba-8d2e-1f43790ea38f\") " pod="openshift-marketplace/community-operators-9pl5q" Mar 10 08:42:18 crc kubenswrapper[4825]: I0310 08:42:18.951821 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9pl5q" Mar 10 08:42:19 crc kubenswrapper[4825]: I0310 08:42:19.498734 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9pl5q"] Mar 10 08:42:19 crc kubenswrapper[4825]: I0310 08:42:19.981179 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9pl5q" event={"ID":"0f9ff4a6-484a-44ba-8d2e-1f43790ea38f","Type":"ContainerDied","Data":"12cdea19031daa6fef93809781d8e5bbfb7fe0da43f6c31f2d536ab45c6b7ce1"} Mar 10 08:42:19 crc kubenswrapper[4825]: I0310 08:42:19.981092 4825 generic.go:334] "Generic (PLEG): container finished" podID="0f9ff4a6-484a-44ba-8d2e-1f43790ea38f" containerID="12cdea19031daa6fef93809781d8e5bbfb7fe0da43f6c31f2d536ab45c6b7ce1" exitCode=0 Mar 10 08:42:19 crc kubenswrapper[4825]: I0310 08:42:19.981496 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9pl5q" event={"ID":"0f9ff4a6-484a-44ba-8d2e-1f43790ea38f","Type":"ContainerStarted","Data":"d15d0c824cf761872f5bce72d1b4c69ff1b35a569639e372924979b61559d703"} Mar 10 08:42:20 crc kubenswrapper[4825]: I0310 08:42:20.990572 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9pl5q" event={"ID":"0f9ff4a6-484a-44ba-8d2e-1f43790ea38f","Type":"ContainerStarted","Data":"cec32be00e1bf17241a0db637d27d8e9f27571f751d32119eccbc72bf63220d5"} Mar 10 08:42:22 crc kubenswrapper[4825]: I0310 08:42:22.002579 4825 generic.go:334] "Generic (PLEG): container finished" podID="0f9ff4a6-484a-44ba-8d2e-1f43790ea38f" containerID="cec32be00e1bf17241a0db637d27d8e9f27571f751d32119eccbc72bf63220d5" exitCode=0 Mar 10 08:42:22 crc kubenswrapper[4825]: I0310 
08:42:22.002656 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9pl5q" event={"ID":"0f9ff4a6-484a-44ba-8d2e-1f43790ea38f","Type":"ContainerDied","Data":"cec32be00e1bf17241a0db637d27d8e9f27571f751d32119eccbc72bf63220d5"} Mar 10 08:42:23 crc kubenswrapper[4825]: I0310 08:42:23.012719 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9pl5q" event={"ID":"0f9ff4a6-484a-44ba-8d2e-1f43790ea38f","Type":"ContainerStarted","Data":"26902234de12f0815508c15bc9a860621f9196ac3c230b2c91a282159c5e85de"} Mar 10 08:42:23 crc kubenswrapper[4825]: I0310 08:42:23.032881 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9pl5q" podStartSLOduration=2.6025684030000003 podStartE2EDuration="5.032861703s" podCreationTimestamp="2026-03-10 08:42:18 +0000 UTC" firstStartedPulling="2026-03-10 08:42:19.984306516 +0000 UTC m=+7093.014087141" lastFinishedPulling="2026-03-10 08:42:22.414599826 +0000 UTC m=+7095.444380441" observedRunningTime="2026-03-10 08:42:23.027497922 +0000 UTC m=+7096.057278557" watchObservedRunningTime="2026-03-10 08:42:23.032861703 +0000 UTC m=+7096.062642318" Mar 10 08:42:28 crc kubenswrapper[4825]: I0310 08:42:28.236533 4825 scope.go:117] "RemoveContainer" containerID="e35b101378bc0043b3ed9549d6a5f8744f6f01d05a4da2d7392043a03928772d" Mar 10 08:42:28 crc kubenswrapper[4825]: E0310 08:42:28.238822 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:42:28 crc kubenswrapper[4825]: I0310 08:42:28.952955 4825 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9pl5q" Mar 10 08:42:28 crc kubenswrapper[4825]: I0310 08:42:28.953282 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9pl5q" Mar 10 08:42:28 crc kubenswrapper[4825]: I0310 08:42:28.999688 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9pl5q" Mar 10 08:42:29 crc kubenswrapper[4825]: I0310 08:42:29.105442 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9pl5q" Mar 10 08:42:30 crc kubenswrapper[4825]: I0310 08:42:30.069359 4825 generic.go:334] "Generic (PLEG): container finished" podID="ae858185-87e8-423e-86ff-cc5b55199c37" containerID="fca582afa2e4cd6d6c0a95ab8044f8a3322374b08624c4502e67ce59f2d83f0b" exitCode=0 Mar 10 08:42:30 crc kubenswrapper[4825]: I0310 08:42:30.069434 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-mhc2l" event={"ID":"ae858185-87e8-423e-86ff-cc5b55199c37","Type":"ContainerDied","Data":"fca582afa2e4cd6d6c0a95ab8044f8a3322374b08624c4502e67ce59f2d83f0b"} Mar 10 08:42:31 crc kubenswrapper[4825]: I0310 08:42:31.399172 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9pl5q"] Mar 10 08:42:31 crc kubenswrapper[4825]: I0310 08:42:31.400833 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9pl5q" podUID="0f9ff4a6-484a-44ba-8d2e-1f43790ea38f" containerName="registry-server" containerID="cri-o://26902234de12f0815508c15bc9a860621f9196ac3c230b2c91a282159c5e85de" gracePeriod=2 Mar 10 08:42:31 crc kubenswrapper[4825]: I0310 08:42:31.678222 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-mhc2l" Mar 10 08:42:31 crc kubenswrapper[4825]: I0310 08:42:31.765448 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ae858185-87e8-423e-86ff-cc5b55199c37-ssh-key-openstack-cell1\") pod \"ae858185-87e8-423e-86ff-cc5b55199c37\" (UID: \"ae858185-87e8-423e-86ff-cc5b55199c37\") " Mar 10 08:42:31 crc kubenswrapper[4825]: I0310 08:42:31.765553 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ntxl\" (UniqueName: \"kubernetes.io/projected/ae858185-87e8-423e-86ff-cc5b55199c37-kube-api-access-4ntxl\") pod \"ae858185-87e8-423e-86ff-cc5b55199c37\" (UID: \"ae858185-87e8-423e-86ff-cc5b55199c37\") " Mar 10 08:42:31 crc kubenswrapper[4825]: I0310 08:42:31.765597 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae858185-87e8-423e-86ff-cc5b55199c37-inventory\") pod \"ae858185-87e8-423e-86ff-cc5b55199c37\" (UID: \"ae858185-87e8-423e-86ff-cc5b55199c37\") " Mar 10 08:42:31 crc kubenswrapper[4825]: I0310 08:42:31.778254 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae858185-87e8-423e-86ff-cc5b55199c37-kube-api-access-4ntxl" (OuterVolumeSpecName: "kube-api-access-4ntxl") pod "ae858185-87e8-423e-86ff-cc5b55199c37" (UID: "ae858185-87e8-423e-86ff-cc5b55199c37"). InnerVolumeSpecName "kube-api-access-4ntxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:42:31 crc kubenswrapper[4825]: I0310 08:42:31.800175 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae858185-87e8-423e-86ff-cc5b55199c37-inventory" (OuterVolumeSpecName: "inventory") pod "ae858185-87e8-423e-86ff-cc5b55199c37" (UID: "ae858185-87e8-423e-86ff-cc5b55199c37"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:42:31 crc kubenswrapper[4825]: I0310 08:42:31.803093 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae858185-87e8-423e-86ff-cc5b55199c37-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "ae858185-87e8-423e-86ff-cc5b55199c37" (UID: "ae858185-87e8-423e-86ff-cc5b55199c37"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:42:31 crc kubenswrapper[4825]: I0310 08:42:31.868551 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ntxl\" (UniqueName: \"kubernetes.io/projected/ae858185-87e8-423e-86ff-cc5b55199c37-kube-api-access-4ntxl\") on node \"crc\" DevicePath \"\"" Mar 10 08:42:31 crc kubenswrapper[4825]: I0310 08:42:31.870239 4825 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae858185-87e8-423e-86ff-cc5b55199c37-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 08:42:31 crc kubenswrapper[4825]: I0310 08:42:31.870255 4825 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ae858185-87e8-423e-86ff-cc5b55199c37-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 10 08:42:31 crc kubenswrapper[4825]: I0310 08:42:31.895100 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9pl5q" Mar 10 08:42:31 crc kubenswrapper[4825]: I0310 08:42:31.970666 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f9ff4a6-484a-44ba-8d2e-1f43790ea38f-utilities\") pod \"0f9ff4a6-484a-44ba-8d2e-1f43790ea38f\" (UID: \"0f9ff4a6-484a-44ba-8d2e-1f43790ea38f\") " Mar 10 08:42:31 crc kubenswrapper[4825]: I0310 08:42:31.970822 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f9ff4a6-484a-44ba-8d2e-1f43790ea38f-catalog-content\") pod \"0f9ff4a6-484a-44ba-8d2e-1f43790ea38f\" (UID: \"0f9ff4a6-484a-44ba-8d2e-1f43790ea38f\") " Mar 10 08:42:31 crc kubenswrapper[4825]: I0310 08:42:31.970880 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4qtd\" (UniqueName: \"kubernetes.io/projected/0f9ff4a6-484a-44ba-8d2e-1f43790ea38f-kube-api-access-n4qtd\") pod \"0f9ff4a6-484a-44ba-8d2e-1f43790ea38f\" (UID: \"0f9ff4a6-484a-44ba-8d2e-1f43790ea38f\") " Mar 10 08:42:31 crc kubenswrapper[4825]: I0310 08:42:31.972487 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f9ff4a6-484a-44ba-8d2e-1f43790ea38f-utilities" (OuterVolumeSpecName: "utilities") pod "0f9ff4a6-484a-44ba-8d2e-1f43790ea38f" (UID: "0f9ff4a6-484a-44ba-8d2e-1f43790ea38f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:42:31 crc kubenswrapper[4825]: I0310 08:42:31.974555 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f9ff4a6-484a-44ba-8d2e-1f43790ea38f-kube-api-access-n4qtd" (OuterVolumeSpecName: "kube-api-access-n4qtd") pod "0f9ff4a6-484a-44ba-8d2e-1f43790ea38f" (UID: "0f9ff4a6-484a-44ba-8d2e-1f43790ea38f"). InnerVolumeSpecName "kube-api-access-n4qtd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:42:32 crc kubenswrapper[4825]: I0310 08:42:32.024423 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f9ff4a6-484a-44ba-8d2e-1f43790ea38f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0f9ff4a6-484a-44ba-8d2e-1f43790ea38f" (UID: "0f9ff4a6-484a-44ba-8d2e-1f43790ea38f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:42:32 crc kubenswrapper[4825]: I0310 08:42:32.072956 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f9ff4a6-484a-44ba-8d2e-1f43790ea38f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 08:42:32 crc kubenswrapper[4825]: I0310 08:42:32.072989 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4qtd\" (UniqueName: \"kubernetes.io/projected/0f9ff4a6-484a-44ba-8d2e-1f43790ea38f-kube-api-access-n4qtd\") on node \"crc\" DevicePath \"\"" Mar 10 08:42:32 crc kubenswrapper[4825]: I0310 08:42:32.073001 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f9ff4a6-484a-44ba-8d2e-1f43790ea38f-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 08:42:32 crc kubenswrapper[4825]: I0310 08:42:32.092098 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-mhc2l" event={"ID":"ae858185-87e8-423e-86ff-cc5b55199c37","Type":"ContainerDied","Data":"60a1d28f21c6e2b291ecf30a00eb414a390aee48352e4738d2fac69872baea94"} Mar 10 08:42:32 crc kubenswrapper[4825]: I0310 08:42:32.092232 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60a1d28f21c6e2b291ecf30a00eb414a390aee48352e4738d2fac69872baea94" Mar 10 08:42:32 crc kubenswrapper[4825]: I0310 08:42:32.092206 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-mhc2l" Mar 10 08:42:32 crc kubenswrapper[4825]: I0310 08:42:32.101558 4825 generic.go:334] "Generic (PLEG): container finished" podID="0f9ff4a6-484a-44ba-8d2e-1f43790ea38f" containerID="26902234de12f0815508c15bc9a860621f9196ac3c230b2c91a282159c5e85de" exitCode=0 Mar 10 08:42:32 crc kubenswrapper[4825]: I0310 08:42:32.101657 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9pl5q" event={"ID":"0f9ff4a6-484a-44ba-8d2e-1f43790ea38f","Type":"ContainerDied","Data":"26902234de12f0815508c15bc9a860621f9196ac3c230b2c91a282159c5e85de"} Mar 10 08:42:32 crc kubenswrapper[4825]: I0310 08:42:32.101733 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9pl5q" event={"ID":"0f9ff4a6-484a-44ba-8d2e-1f43790ea38f","Type":"ContainerDied","Data":"d15d0c824cf761872f5bce72d1b4c69ff1b35a569639e372924979b61559d703"} Mar 10 08:42:32 crc kubenswrapper[4825]: I0310 08:42:32.101765 4825 scope.go:117] "RemoveContainer" containerID="26902234de12f0815508c15bc9a860621f9196ac3c230b2c91a282159c5e85de" Mar 10 08:42:32 crc kubenswrapper[4825]: I0310 08:42:32.102083 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9pl5q" Mar 10 08:42:32 crc kubenswrapper[4825]: I0310 08:42:32.152637 4825 scope.go:117] "RemoveContainer" containerID="cec32be00e1bf17241a0db637d27d8e9f27571f751d32119eccbc72bf63220d5" Mar 10 08:42:32 crc kubenswrapper[4825]: I0310 08:42:32.159469 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9pl5q"] Mar 10 08:42:32 crc kubenswrapper[4825]: I0310 08:42:32.169999 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9pl5q"] Mar 10 08:42:32 crc kubenswrapper[4825]: I0310 08:42:32.182319 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-x88hw"] Mar 10 08:42:32 crc kubenswrapper[4825]: E0310 08:42:32.182814 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f9ff4a6-484a-44ba-8d2e-1f43790ea38f" containerName="extract-content" Mar 10 08:42:32 crc kubenswrapper[4825]: I0310 08:42:32.182843 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f9ff4a6-484a-44ba-8d2e-1f43790ea38f" containerName="extract-content" Mar 10 08:42:32 crc kubenswrapper[4825]: E0310 08:42:32.182869 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f9ff4a6-484a-44ba-8d2e-1f43790ea38f" containerName="extract-utilities" Mar 10 08:42:32 crc kubenswrapper[4825]: I0310 08:42:32.182880 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f9ff4a6-484a-44ba-8d2e-1f43790ea38f" containerName="extract-utilities" Mar 10 08:42:32 crc kubenswrapper[4825]: E0310 08:42:32.182896 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f9ff4a6-484a-44ba-8d2e-1f43790ea38f" containerName="registry-server" Mar 10 08:42:32 crc kubenswrapper[4825]: I0310 08:42:32.182902 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f9ff4a6-484a-44ba-8d2e-1f43790ea38f" containerName="registry-server" Mar 10 08:42:32 
crc kubenswrapper[4825]: E0310 08:42:32.182946 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae858185-87e8-423e-86ff-cc5b55199c37" containerName="download-cache-openstack-openstack-cell1" Mar 10 08:42:32 crc kubenswrapper[4825]: I0310 08:42:32.182959 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae858185-87e8-423e-86ff-cc5b55199c37" containerName="download-cache-openstack-openstack-cell1" Mar 10 08:42:32 crc kubenswrapper[4825]: I0310 08:42:32.183197 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f9ff4a6-484a-44ba-8d2e-1f43790ea38f" containerName="registry-server" Mar 10 08:42:32 crc kubenswrapper[4825]: I0310 08:42:32.183221 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae858185-87e8-423e-86ff-cc5b55199c37" containerName="download-cache-openstack-openstack-cell1" Mar 10 08:42:32 crc kubenswrapper[4825]: I0310 08:42:32.184271 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-x88hw" Mar 10 08:42:32 crc kubenswrapper[4825]: I0310 08:42:32.184887 4825 scope.go:117] "RemoveContainer" containerID="12cdea19031daa6fef93809781d8e5bbfb7fe0da43f6c31f2d536ab45c6b7ce1" Mar 10 08:42:32 crc kubenswrapper[4825]: I0310 08:42:32.189425 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 10 08:42:32 crc kubenswrapper[4825]: I0310 08:42:32.189622 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 10 08:42:32 crc kubenswrapper[4825]: I0310 08:42:32.189749 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-5twvv" Mar 10 08:42:32 crc kubenswrapper[4825]: I0310 08:42:32.189820 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 08:42:32 crc kubenswrapper[4825]: I0310 
08:42:32.196217 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-x88hw"] Mar 10 08:42:32 crc kubenswrapper[4825]: I0310 08:42:32.235024 4825 scope.go:117] "RemoveContainer" containerID="26902234de12f0815508c15bc9a860621f9196ac3c230b2c91a282159c5e85de" Mar 10 08:42:32 crc kubenswrapper[4825]: E0310 08:42:32.235536 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26902234de12f0815508c15bc9a860621f9196ac3c230b2c91a282159c5e85de\": container with ID starting with 26902234de12f0815508c15bc9a860621f9196ac3c230b2c91a282159c5e85de not found: ID does not exist" containerID="26902234de12f0815508c15bc9a860621f9196ac3c230b2c91a282159c5e85de" Mar 10 08:42:32 crc kubenswrapper[4825]: I0310 08:42:32.235586 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26902234de12f0815508c15bc9a860621f9196ac3c230b2c91a282159c5e85de"} err="failed to get container status \"26902234de12f0815508c15bc9a860621f9196ac3c230b2c91a282159c5e85de\": rpc error: code = NotFound desc = could not find container \"26902234de12f0815508c15bc9a860621f9196ac3c230b2c91a282159c5e85de\": container with ID starting with 26902234de12f0815508c15bc9a860621f9196ac3c230b2c91a282159c5e85de not found: ID does not exist" Mar 10 08:42:32 crc kubenswrapper[4825]: I0310 08:42:32.235626 4825 scope.go:117] "RemoveContainer" containerID="cec32be00e1bf17241a0db637d27d8e9f27571f751d32119eccbc72bf63220d5" Mar 10 08:42:32 crc kubenswrapper[4825]: E0310 08:42:32.236067 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cec32be00e1bf17241a0db637d27d8e9f27571f751d32119eccbc72bf63220d5\": container with ID starting with cec32be00e1bf17241a0db637d27d8e9f27571f751d32119eccbc72bf63220d5 not found: ID does not exist" 
containerID="cec32be00e1bf17241a0db637d27d8e9f27571f751d32119eccbc72bf63220d5" Mar 10 08:42:32 crc kubenswrapper[4825]: I0310 08:42:32.236094 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cec32be00e1bf17241a0db637d27d8e9f27571f751d32119eccbc72bf63220d5"} err="failed to get container status \"cec32be00e1bf17241a0db637d27d8e9f27571f751d32119eccbc72bf63220d5\": rpc error: code = NotFound desc = could not find container \"cec32be00e1bf17241a0db637d27d8e9f27571f751d32119eccbc72bf63220d5\": container with ID starting with cec32be00e1bf17241a0db637d27d8e9f27571f751d32119eccbc72bf63220d5 not found: ID does not exist" Mar 10 08:42:32 crc kubenswrapper[4825]: I0310 08:42:32.236111 4825 scope.go:117] "RemoveContainer" containerID="12cdea19031daa6fef93809781d8e5bbfb7fe0da43f6c31f2d536ab45c6b7ce1" Mar 10 08:42:32 crc kubenswrapper[4825]: E0310 08:42:32.236552 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12cdea19031daa6fef93809781d8e5bbfb7fe0da43f6c31f2d536ab45c6b7ce1\": container with ID starting with 12cdea19031daa6fef93809781d8e5bbfb7fe0da43f6c31f2d536ab45c6b7ce1 not found: ID does not exist" containerID="12cdea19031daa6fef93809781d8e5bbfb7fe0da43f6c31f2d536ab45c6b7ce1" Mar 10 08:42:32 crc kubenswrapper[4825]: I0310 08:42:32.236582 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12cdea19031daa6fef93809781d8e5bbfb7fe0da43f6c31f2d536ab45c6b7ce1"} err="failed to get container status \"12cdea19031daa6fef93809781d8e5bbfb7fe0da43f6c31f2d536ab45c6b7ce1\": rpc error: code = NotFound desc = could not find container \"12cdea19031daa6fef93809781d8e5bbfb7fe0da43f6c31f2d536ab45c6b7ce1\": container with ID starting with 12cdea19031daa6fef93809781d8e5bbfb7fe0da43f6c31f2d536ab45c6b7ce1 not found: ID does not exist" Mar 10 08:42:32 crc kubenswrapper[4825]: I0310 08:42:32.378092 4825 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/c9229dc9-6790-4cd4-bbc3-0e6e156cc076-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-x88hw\" (UID: \"c9229dc9-6790-4cd4-bbc3-0e6e156cc076\") " pod="openstack/configure-network-openstack-openstack-cell1-x88hw" Mar 10 08:42:32 crc kubenswrapper[4825]: I0310 08:42:32.378229 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c9229dc9-6790-4cd4-bbc3-0e6e156cc076-inventory\") pod \"configure-network-openstack-openstack-cell1-x88hw\" (UID: \"c9229dc9-6790-4cd4-bbc3-0e6e156cc076\") " pod="openstack/configure-network-openstack-openstack-cell1-x88hw" Mar 10 08:42:32 crc kubenswrapper[4825]: I0310 08:42:32.378287 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkqsh\" (UniqueName: \"kubernetes.io/projected/c9229dc9-6790-4cd4-bbc3-0e6e156cc076-kube-api-access-xkqsh\") pod \"configure-network-openstack-openstack-cell1-x88hw\" (UID: \"c9229dc9-6790-4cd4-bbc3-0e6e156cc076\") " pod="openstack/configure-network-openstack-openstack-cell1-x88hw" Mar 10 08:42:32 crc kubenswrapper[4825]: I0310 08:42:32.480549 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/c9229dc9-6790-4cd4-bbc3-0e6e156cc076-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-x88hw\" (UID: \"c9229dc9-6790-4cd4-bbc3-0e6e156cc076\") " pod="openstack/configure-network-openstack-openstack-cell1-x88hw" Mar 10 08:42:32 crc kubenswrapper[4825]: I0310 08:42:32.481187 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c9229dc9-6790-4cd4-bbc3-0e6e156cc076-inventory\") pod 
\"configure-network-openstack-openstack-cell1-x88hw\" (UID: \"c9229dc9-6790-4cd4-bbc3-0e6e156cc076\") " pod="openstack/configure-network-openstack-openstack-cell1-x88hw" Mar 10 08:42:32 crc kubenswrapper[4825]: I0310 08:42:32.481612 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkqsh\" (UniqueName: \"kubernetes.io/projected/c9229dc9-6790-4cd4-bbc3-0e6e156cc076-kube-api-access-xkqsh\") pod \"configure-network-openstack-openstack-cell1-x88hw\" (UID: \"c9229dc9-6790-4cd4-bbc3-0e6e156cc076\") " pod="openstack/configure-network-openstack-openstack-cell1-x88hw" Mar 10 08:42:32 crc kubenswrapper[4825]: I0310 08:42:32.487168 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/c9229dc9-6790-4cd4-bbc3-0e6e156cc076-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-x88hw\" (UID: \"c9229dc9-6790-4cd4-bbc3-0e6e156cc076\") " pod="openstack/configure-network-openstack-openstack-cell1-x88hw" Mar 10 08:42:32 crc kubenswrapper[4825]: I0310 08:42:32.491629 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c9229dc9-6790-4cd4-bbc3-0e6e156cc076-inventory\") pod \"configure-network-openstack-openstack-cell1-x88hw\" (UID: \"c9229dc9-6790-4cd4-bbc3-0e6e156cc076\") " pod="openstack/configure-network-openstack-openstack-cell1-x88hw" Mar 10 08:42:32 crc kubenswrapper[4825]: I0310 08:42:32.497849 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkqsh\" (UniqueName: \"kubernetes.io/projected/c9229dc9-6790-4cd4-bbc3-0e6e156cc076-kube-api-access-xkqsh\") pod \"configure-network-openstack-openstack-cell1-x88hw\" (UID: \"c9229dc9-6790-4cd4-bbc3-0e6e156cc076\") " pod="openstack/configure-network-openstack-openstack-cell1-x88hw" Mar 10 08:42:32 crc kubenswrapper[4825]: I0310 08:42:32.513370 4825 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-x88hw" Mar 10 08:42:33 crc kubenswrapper[4825]: I0310 08:42:33.052959 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-x88hw"] Mar 10 08:42:33 crc kubenswrapper[4825]: W0310 08:42:33.056614 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9229dc9_6790_4cd4_bbc3_0e6e156cc076.slice/crio-b998a5e785fd8d16cb5cc2515d257ededfbe825724e2e08521c1cfe8e2449781 WatchSource:0}: Error finding container b998a5e785fd8d16cb5cc2515d257ededfbe825724e2e08521c1cfe8e2449781: Status 404 returned error can't find the container with id b998a5e785fd8d16cb5cc2515d257ededfbe825724e2e08521c1cfe8e2449781 Mar 10 08:42:33 crc kubenswrapper[4825]: I0310 08:42:33.110494 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-x88hw" event={"ID":"c9229dc9-6790-4cd4-bbc3-0e6e156cc076","Type":"ContainerStarted","Data":"b998a5e785fd8d16cb5cc2515d257ededfbe825724e2e08521c1cfe8e2449781"} Mar 10 08:42:33 crc kubenswrapper[4825]: I0310 08:42:33.253156 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f9ff4a6-484a-44ba-8d2e-1f43790ea38f" path="/var/lib/kubelet/pods/0f9ff4a6-484a-44ba-8d2e-1f43790ea38f/volumes" Mar 10 08:42:34 crc kubenswrapper[4825]: I0310 08:42:34.120760 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-x88hw" event={"ID":"c9229dc9-6790-4cd4-bbc3-0e6e156cc076","Type":"ContainerStarted","Data":"f86f3afc5c331dc93ab4cb45793088e2aeda53327eda518d1bc40754b2f89788"} Mar 10 08:42:34 crc kubenswrapper[4825]: I0310 08:42:34.144257 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-cell1-x88hw" podStartSLOduration=1.665778508 
podStartE2EDuration="2.144232246s" podCreationTimestamp="2026-03-10 08:42:32 +0000 UTC" firstStartedPulling="2026-03-10 08:42:33.059524148 +0000 UTC m=+7106.089304763" lastFinishedPulling="2026-03-10 08:42:33.537977886 +0000 UTC m=+7106.567758501" observedRunningTime="2026-03-10 08:42:34.140520378 +0000 UTC m=+7107.170300993" watchObservedRunningTime="2026-03-10 08:42:34.144232246 +0000 UTC m=+7107.174012871" Mar 10 08:42:41 crc kubenswrapper[4825]: I0310 08:42:41.236980 4825 scope.go:117] "RemoveContainer" containerID="e35b101378bc0043b3ed9549d6a5f8744f6f01d05a4da2d7392043a03928772d" Mar 10 08:42:41 crc kubenswrapper[4825]: E0310 08:42:41.238888 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:42:52 crc kubenswrapper[4825]: I0310 08:42:52.236857 4825 scope.go:117] "RemoveContainer" containerID="e35b101378bc0043b3ed9549d6a5f8744f6f01d05a4da2d7392043a03928772d" Mar 10 08:42:53 crc kubenswrapper[4825]: I0310 08:42:53.320581 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerStarted","Data":"a77ec3023a65c7544ad73bb812378227e2a33544cb0a071029aa0eecd3038079"} Mar 10 08:42:53 crc kubenswrapper[4825]: I0310 08:42:53.768362 4825 scope.go:117] "RemoveContainer" containerID="bc53c3987e986ccb6550724d2078ea1663265c7fa6db439abe7d2690277125f5" Mar 10 08:43:51 crc kubenswrapper[4825]: I0310 08:43:51.893397 4825 generic.go:334] "Generic (PLEG): container finished" podID="c9229dc9-6790-4cd4-bbc3-0e6e156cc076" 
containerID="f86f3afc5c331dc93ab4cb45793088e2aeda53327eda518d1bc40754b2f89788" exitCode=0 Mar 10 08:43:51 crc kubenswrapper[4825]: I0310 08:43:51.893906 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-x88hw" event={"ID":"c9229dc9-6790-4cd4-bbc3-0e6e156cc076","Type":"ContainerDied","Data":"f86f3afc5c331dc93ab4cb45793088e2aeda53327eda518d1bc40754b2f89788"} Mar 10 08:43:53 crc kubenswrapper[4825]: I0310 08:43:53.294400 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-x88hw" Mar 10 08:43:53 crc kubenswrapper[4825]: I0310 08:43:53.395671 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c9229dc9-6790-4cd4-bbc3-0e6e156cc076-inventory\") pod \"c9229dc9-6790-4cd4-bbc3-0e6e156cc076\" (UID: \"c9229dc9-6790-4cd4-bbc3-0e6e156cc076\") " Mar 10 08:43:53 crc kubenswrapper[4825]: I0310 08:43:53.395911 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/c9229dc9-6790-4cd4-bbc3-0e6e156cc076-ssh-key-openstack-cell1\") pod \"c9229dc9-6790-4cd4-bbc3-0e6e156cc076\" (UID: \"c9229dc9-6790-4cd4-bbc3-0e6e156cc076\") " Mar 10 08:43:53 crc kubenswrapper[4825]: I0310 08:43:53.396023 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkqsh\" (UniqueName: \"kubernetes.io/projected/c9229dc9-6790-4cd4-bbc3-0e6e156cc076-kube-api-access-xkqsh\") pod \"c9229dc9-6790-4cd4-bbc3-0e6e156cc076\" (UID: \"c9229dc9-6790-4cd4-bbc3-0e6e156cc076\") " Mar 10 08:43:53 crc kubenswrapper[4825]: I0310 08:43:53.402180 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9229dc9-6790-4cd4-bbc3-0e6e156cc076-kube-api-access-xkqsh" (OuterVolumeSpecName: "kube-api-access-xkqsh") pod 
"c9229dc9-6790-4cd4-bbc3-0e6e156cc076" (UID: "c9229dc9-6790-4cd4-bbc3-0e6e156cc076"). InnerVolumeSpecName "kube-api-access-xkqsh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:43:53 crc kubenswrapper[4825]: I0310 08:43:53.426928 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9229dc9-6790-4cd4-bbc3-0e6e156cc076-inventory" (OuterVolumeSpecName: "inventory") pod "c9229dc9-6790-4cd4-bbc3-0e6e156cc076" (UID: "c9229dc9-6790-4cd4-bbc3-0e6e156cc076"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:43:53 crc kubenswrapper[4825]: I0310 08:43:53.427485 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9229dc9-6790-4cd4-bbc3-0e6e156cc076-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "c9229dc9-6790-4cd4-bbc3-0e6e156cc076" (UID: "c9229dc9-6790-4cd4-bbc3-0e6e156cc076"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:43:53 crc kubenswrapper[4825]: I0310 08:43:53.498815 4825 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/c9229dc9-6790-4cd4-bbc3-0e6e156cc076-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 10 08:43:53 crc kubenswrapper[4825]: I0310 08:43:53.498855 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkqsh\" (UniqueName: \"kubernetes.io/projected/c9229dc9-6790-4cd4-bbc3-0e6e156cc076-kube-api-access-xkqsh\") on node \"crc\" DevicePath \"\"" Mar 10 08:43:53 crc kubenswrapper[4825]: I0310 08:43:53.498865 4825 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c9229dc9-6790-4cd4-bbc3-0e6e156cc076-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 08:43:53 crc kubenswrapper[4825]: I0310 08:43:53.914996 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-x88hw" event={"ID":"c9229dc9-6790-4cd4-bbc3-0e6e156cc076","Type":"ContainerDied","Data":"b998a5e785fd8d16cb5cc2515d257ededfbe825724e2e08521c1cfe8e2449781"} Mar 10 08:43:53 crc kubenswrapper[4825]: I0310 08:43:53.915042 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b998a5e785fd8d16cb5cc2515d257ededfbe825724e2e08521c1cfe8e2449781" Mar 10 08:43:53 crc kubenswrapper[4825]: I0310 08:43:53.915102 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-x88hw" Mar 10 08:43:53 crc kubenswrapper[4825]: I0310 08:43:53.999516 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-c2hw6"] Mar 10 08:43:54 crc kubenswrapper[4825]: E0310 08:43:54.000080 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9229dc9-6790-4cd4-bbc3-0e6e156cc076" containerName="configure-network-openstack-openstack-cell1" Mar 10 08:43:54 crc kubenswrapper[4825]: I0310 08:43:54.000106 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9229dc9-6790-4cd4-bbc3-0e6e156cc076" containerName="configure-network-openstack-openstack-cell1" Mar 10 08:43:54 crc kubenswrapper[4825]: I0310 08:43:54.000388 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9229dc9-6790-4cd4-bbc3-0e6e156cc076" containerName="configure-network-openstack-openstack-cell1" Mar 10 08:43:54 crc kubenswrapper[4825]: I0310 08:43:54.001181 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-c2hw6" Mar 10 08:43:54 crc kubenswrapper[4825]: I0310 08:43:54.011109 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-c2hw6"] Mar 10 08:43:54 crc kubenswrapper[4825]: I0310 08:43:54.018478 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 08:43:54 crc kubenswrapper[4825]: I0310 08:43:54.018713 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-5twvv" Mar 10 08:43:54 crc kubenswrapper[4825]: I0310 08:43:54.018867 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 10 08:43:54 crc kubenswrapper[4825]: I0310 08:43:54.019034 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 10 08:43:54 crc kubenswrapper[4825]: I0310 08:43:54.112410 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lddfv\" (UniqueName: \"kubernetes.io/projected/3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a-kube-api-access-lddfv\") pod \"validate-network-openstack-openstack-cell1-c2hw6\" (UID: \"3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a\") " pod="openstack/validate-network-openstack-openstack-cell1-c2hw6" Mar 10 08:43:54 crc kubenswrapper[4825]: I0310 08:43:54.112663 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-c2hw6\" (UID: \"3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a\") " pod="openstack/validate-network-openstack-openstack-cell1-c2hw6" Mar 10 08:43:54 crc kubenswrapper[4825]: I0310 08:43:54.112701 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a-inventory\") pod \"validate-network-openstack-openstack-cell1-c2hw6\" (UID: \"3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a\") " pod="openstack/validate-network-openstack-openstack-cell1-c2hw6" Mar 10 08:43:54 crc kubenswrapper[4825]: I0310 08:43:54.217005 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lddfv\" (UniqueName: \"kubernetes.io/projected/3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a-kube-api-access-lddfv\") pod \"validate-network-openstack-openstack-cell1-c2hw6\" (UID: \"3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a\") " pod="openstack/validate-network-openstack-openstack-cell1-c2hw6" Mar 10 08:43:54 crc kubenswrapper[4825]: I0310 08:43:54.217107 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-c2hw6\" (UID: \"3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a\") " pod="openstack/validate-network-openstack-openstack-cell1-c2hw6" Mar 10 08:43:54 crc kubenswrapper[4825]: I0310 08:43:54.217163 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a-inventory\") pod \"validate-network-openstack-openstack-cell1-c2hw6\" (UID: \"3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a\") " pod="openstack/validate-network-openstack-openstack-cell1-c2hw6" Mar 10 08:43:54 crc kubenswrapper[4825]: I0310 08:43:54.226256 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a-inventory\") pod \"validate-network-openstack-openstack-cell1-c2hw6\" (UID: \"3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a\") " 
pod="openstack/validate-network-openstack-openstack-cell1-c2hw6" Mar 10 08:43:54 crc kubenswrapper[4825]: I0310 08:43:54.226859 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-c2hw6\" (UID: \"3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a\") " pod="openstack/validate-network-openstack-openstack-cell1-c2hw6" Mar 10 08:43:54 crc kubenswrapper[4825]: I0310 08:43:54.238966 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lddfv\" (UniqueName: \"kubernetes.io/projected/3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a-kube-api-access-lddfv\") pod \"validate-network-openstack-openstack-cell1-c2hw6\" (UID: \"3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a\") " pod="openstack/validate-network-openstack-openstack-cell1-c2hw6" Mar 10 08:43:54 crc kubenswrapper[4825]: I0310 08:43:54.353226 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-c2hw6" Mar 10 08:43:54 crc kubenswrapper[4825]: I0310 08:43:54.882793 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-c2hw6"] Mar 10 08:43:54 crc kubenswrapper[4825]: I0310 08:43:54.927968 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-c2hw6" event={"ID":"3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a","Type":"ContainerStarted","Data":"6efc177b22700105b64840170c58579a17d7da040d3095792a1165cc61f3bbc0"} Mar 10 08:43:55 crc kubenswrapper[4825]: I0310 08:43:55.937887 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-c2hw6" event={"ID":"3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a","Type":"ContainerStarted","Data":"de262b3c2d94194608c2de6e90087fc520f7d2e98df421bbde1c2207e12ac143"} Mar 10 08:43:55 crc kubenswrapper[4825]: I0310 08:43:55.955913 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-openstack-openstack-cell1-c2hw6" podStartSLOduration=2.491060234 podStartE2EDuration="2.95589482s" podCreationTimestamp="2026-03-10 08:43:53 +0000 UTC" firstStartedPulling="2026-03-10 08:43:54.88699358 +0000 UTC m=+7187.916774195" lastFinishedPulling="2026-03-10 08:43:55.351828166 +0000 UTC m=+7188.381608781" observedRunningTime="2026-03-10 08:43:55.951035432 +0000 UTC m=+7188.980816067" watchObservedRunningTime="2026-03-10 08:43:55.95589482 +0000 UTC m=+7188.985675435" Mar 10 08:44:00 crc kubenswrapper[4825]: I0310 08:44:00.143255 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552204-c79wc"] Mar 10 08:44:00 crc kubenswrapper[4825]: I0310 08:44:00.145277 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552204-c79wc" Mar 10 08:44:00 crc kubenswrapper[4825]: I0310 08:44:00.148284 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 08:44:00 crc kubenswrapper[4825]: I0310 08:44:00.148440 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 08:44:00 crc kubenswrapper[4825]: I0310 08:44:00.148599 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 08:44:00 crc kubenswrapper[4825]: I0310 08:44:00.152014 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552204-c79wc"] Mar 10 08:44:00 crc kubenswrapper[4825]: I0310 08:44:00.252672 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgtr5\" (UniqueName: \"kubernetes.io/projected/65c0626b-a45c-46bc-99fc-b889bad71c1b-kube-api-access-hgtr5\") pod \"auto-csr-approver-29552204-c79wc\" (UID: \"65c0626b-a45c-46bc-99fc-b889bad71c1b\") " pod="openshift-infra/auto-csr-approver-29552204-c79wc" Mar 10 08:44:00 crc kubenswrapper[4825]: I0310 08:44:00.354657 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgtr5\" (UniqueName: \"kubernetes.io/projected/65c0626b-a45c-46bc-99fc-b889bad71c1b-kube-api-access-hgtr5\") pod \"auto-csr-approver-29552204-c79wc\" (UID: \"65c0626b-a45c-46bc-99fc-b889bad71c1b\") " pod="openshift-infra/auto-csr-approver-29552204-c79wc" Mar 10 08:44:00 crc kubenswrapper[4825]: I0310 08:44:00.372102 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgtr5\" (UniqueName: \"kubernetes.io/projected/65c0626b-a45c-46bc-99fc-b889bad71c1b-kube-api-access-hgtr5\") pod \"auto-csr-approver-29552204-c79wc\" (UID: \"65c0626b-a45c-46bc-99fc-b889bad71c1b\") " 
pod="openshift-infra/auto-csr-approver-29552204-c79wc" Mar 10 08:44:00 crc kubenswrapper[4825]: I0310 08:44:00.478614 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552204-c79wc" Mar 10 08:44:00 crc kubenswrapper[4825]: I0310 08:44:00.959425 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552204-c79wc"] Mar 10 08:44:00 crc kubenswrapper[4825]: W0310 08:44:00.962675 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65c0626b_a45c_46bc_99fc_b889bad71c1b.slice/crio-42d856ad0bce580c9876d4513f4c7bd79491f9db7816ebf3e288289527d8fb17 WatchSource:0}: Error finding container 42d856ad0bce580c9876d4513f4c7bd79491f9db7816ebf3e288289527d8fb17: Status 404 returned error can't find the container with id 42d856ad0bce580c9876d4513f4c7bd79491f9db7816ebf3e288289527d8fb17 Mar 10 08:44:00 crc kubenswrapper[4825]: I0310 08:44:00.983076 4825 generic.go:334] "Generic (PLEG): container finished" podID="3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a" containerID="de262b3c2d94194608c2de6e90087fc520f7d2e98df421bbde1c2207e12ac143" exitCode=0 Mar 10 08:44:00 crc kubenswrapper[4825]: I0310 08:44:00.983169 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-c2hw6" event={"ID":"3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a","Type":"ContainerDied","Data":"de262b3c2d94194608c2de6e90087fc520f7d2e98df421bbde1c2207e12ac143"} Mar 10 08:44:00 crc kubenswrapper[4825]: I0310 08:44:00.984424 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552204-c79wc" event={"ID":"65c0626b-a45c-46bc-99fc-b889bad71c1b","Type":"ContainerStarted","Data":"42d856ad0bce580c9876d4513f4c7bd79491f9db7816ebf3e288289527d8fb17"} Mar 10 08:44:02 crc kubenswrapper[4825]: I0310 08:44:02.481215 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-c2hw6" Mar 10 08:44:02 crc kubenswrapper[4825]: I0310 08:44:02.605184 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lddfv\" (UniqueName: \"kubernetes.io/projected/3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a-kube-api-access-lddfv\") pod \"3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a\" (UID: \"3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a\") " Mar 10 08:44:02 crc kubenswrapper[4825]: I0310 08:44:02.605812 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a-ssh-key-openstack-cell1\") pod \"3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a\" (UID: \"3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a\") " Mar 10 08:44:02 crc kubenswrapper[4825]: I0310 08:44:02.605946 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a-inventory\") pod \"3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a\" (UID: \"3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a\") " Mar 10 08:44:02 crc kubenswrapper[4825]: I0310 08:44:02.612914 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a-kube-api-access-lddfv" (OuterVolumeSpecName: "kube-api-access-lddfv") pod "3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a" (UID: "3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a"). InnerVolumeSpecName "kube-api-access-lddfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:44:02 crc kubenswrapper[4825]: I0310 08:44:02.657330 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a" (UID: "3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a"). 
InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:44:02 crc kubenswrapper[4825]: I0310 08:44:02.665222 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a-inventory" (OuterVolumeSpecName: "inventory") pod "3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a" (UID: "3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:44:02 crc kubenswrapper[4825]: I0310 08:44:02.708059 4825 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 08:44:02 crc kubenswrapper[4825]: I0310 08:44:02.708293 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lddfv\" (UniqueName: \"kubernetes.io/projected/3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a-kube-api-access-lddfv\") on node \"crc\" DevicePath \"\"" Mar 10 08:44:02 crc kubenswrapper[4825]: I0310 08:44:02.708392 4825 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 10 08:44:03 crc kubenswrapper[4825]: I0310 08:44:03.004391 4825 generic.go:334] "Generic (PLEG): container finished" podID="65c0626b-a45c-46bc-99fc-b889bad71c1b" containerID="60f8546bda46f78c8d14c36cacc9fc5642d9f3fbce1b310ded87717df4648c9c" exitCode=0 Mar 10 08:44:03 crc kubenswrapper[4825]: I0310 08:44:03.004481 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552204-c79wc" event={"ID":"65c0626b-a45c-46bc-99fc-b889bad71c1b","Type":"ContainerDied","Data":"60f8546bda46f78c8d14c36cacc9fc5642d9f3fbce1b310ded87717df4648c9c"} Mar 10 08:44:03 crc kubenswrapper[4825]: I0310 08:44:03.006905 4825 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-c2hw6" event={"ID":"3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a","Type":"ContainerDied","Data":"6efc177b22700105b64840170c58579a17d7da040d3095792a1165cc61f3bbc0"} Mar 10 08:44:03 crc kubenswrapper[4825]: I0310 08:44:03.006949 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6efc177b22700105b64840170c58579a17d7da040d3095792a1165cc61f3bbc0" Mar 10 08:44:03 crc kubenswrapper[4825]: I0310 08:44:03.006977 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-c2hw6" Mar 10 08:44:03 crc kubenswrapper[4825]: I0310 08:44:03.087808 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-cell1-sl2sb"] Mar 10 08:44:03 crc kubenswrapper[4825]: E0310 08:44:03.088252 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a" containerName="validate-network-openstack-openstack-cell1" Mar 10 08:44:03 crc kubenswrapper[4825]: I0310 08:44:03.088274 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a" containerName="validate-network-openstack-openstack-cell1" Mar 10 08:44:03 crc kubenswrapper[4825]: I0310 08:44:03.088558 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a" containerName="validate-network-openstack-openstack-cell1" Mar 10 08:44:03 crc kubenswrapper[4825]: I0310 08:44:03.089421 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-sl2sb" Mar 10 08:44:03 crc kubenswrapper[4825]: I0310 08:44:03.092974 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-5twvv" Mar 10 08:44:03 crc kubenswrapper[4825]: I0310 08:44:03.093404 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 10 08:44:03 crc kubenswrapper[4825]: I0310 08:44:03.093755 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 08:44:03 crc kubenswrapper[4825]: I0310 08:44:03.096175 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 10 08:44:03 crc kubenswrapper[4825]: I0310 08:44:03.120614 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-sl2sb"] Mar 10 08:44:03 crc kubenswrapper[4825]: I0310 08:44:03.218983 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d150047-2232-43ae-990a-23bf03421efa-inventory\") pod \"install-os-openstack-openstack-cell1-sl2sb\" (UID: \"3d150047-2232-43ae-990a-23bf03421efa\") " pod="openstack/install-os-openstack-openstack-cell1-sl2sb" Mar 10 08:44:03 crc kubenswrapper[4825]: I0310 08:44:03.219611 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg95r\" (UniqueName: \"kubernetes.io/projected/3d150047-2232-43ae-990a-23bf03421efa-kube-api-access-jg95r\") pod \"install-os-openstack-openstack-cell1-sl2sb\" (UID: \"3d150047-2232-43ae-990a-23bf03421efa\") " pod="openstack/install-os-openstack-openstack-cell1-sl2sb" Mar 10 08:44:03 crc kubenswrapper[4825]: I0310 08:44:03.219817 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3d150047-2232-43ae-990a-23bf03421efa-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-sl2sb\" (UID: \"3d150047-2232-43ae-990a-23bf03421efa\") " pod="openstack/install-os-openstack-openstack-cell1-sl2sb" Mar 10 08:44:03 crc kubenswrapper[4825]: I0310 08:44:03.322116 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3d150047-2232-43ae-990a-23bf03421efa-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-sl2sb\" (UID: \"3d150047-2232-43ae-990a-23bf03421efa\") " pod="openstack/install-os-openstack-openstack-cell1-sl2sb" Mar 10 08:44:03 crc kubenswrapper[4825]: I0310 08:44:03.322450 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d150047-2232-43ae-990a-23bf03421efa-inventory\") pod \"install-os-openstack-openstack-cell1-sl2sb\" (UID: \"3d150047-2232-43ae-990a-23bf03421efa\") " pod="openstack/install-os-openstack-openstack-cell1-sl2sb" Mar 10 08:44:03 crc kubenswrapper[4825]: I0310 08:44:03.322484 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jg95r\" (UniqueName: \"kubernetes.io/projected/3d150047-2232-43ae-990a-23bf03421efa-kube-api-access-jg95r\") pod \"install-os-openstack-openstack-cell1-sl2sb\" (UID: \"3d150047-2232-43ae-990a-23bf03421efa\") " pod="openstack/install-os-openstack-openstack-cell1-sl2sb" Mar 10 08:44:03 crc kubenswrapper[4825]: I0310 08:44:03.327033 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d150047-2232-43ae-990a-23bf03421efa-inventory\") pod \"install-os-openstack-openstack-cell1-sl2sb\" (UID: \"3d150047-2232-43ae-990a-23bf03421efa\") " pod="openstack/install-os-openstack-openstack-cell1-sl2sb" Mar 10 08:44:03 crc kubenswrapper[4825]: I0310 
08:44:03.327612 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3d150047-2232-43ae-990a-23bf03421efa-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-sl2sb\" (UID: \"3d150047-2232-43ae-990a-23bf03421efa\") " pod="openstack/install-os-openstack-openstack-cell1-sl2sb" Mar 10 08:44:03 crc kubenswrapper[4825]: I0310 08:44:03.339205 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg95r\" (UniqueName: \"kubernetes.io/projected/3d150047-2232-43ae-990a-23bf03421efa-kube-api-access-jg95r\") pod \"install-os-openstack-openstack-cell1-sl2sb\" (UID: \"3d150047-2232-43ae-990a-23bf03421efa\") " pod="openstack/install-os-openstack-openstack-cell1-sl2sb" Mar 10 08:44:03 crc kubenswrapper[4825]: I0310 08:44:03.417245 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-sl2sb" Mar 10 08:44:03 crc kubenswrapper[4825]: I0310 08:44:03.968281 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-sl2sb"] Mar 10 08:44:04 crc kubenswrapper[4825]: I0310 08:44:04.016508 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-sl2sb" event={"ID":"3d150047-2232-43ae-990a-23bf03421efa","Type":"ContainerStarted","Data":"82fbf6f303bc764e9ef07ebabbfa0c45532c2caa89a4ddc0e15b383a22824b31"} Mar 10 08:44:04 crc kubenswrapper[4825]: I0310 08:44:04.248586 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552204-c79wc" Mar 10 08:44:04 crc kubenswrapper[4825]: I0310 08:44:04.342846 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgtr5\" (UniqueName: \"kubernetes.io/projected/65c0626b-a45c-46bc-99fc-b889bad71c1b-kube-api-access-hgtr5\") pod \"65c0626b-a45c-46bc-99fc-b889bad71c1b\" (UID: \"65c0626b-a45c-46bc-99fc-b889bad71c1b\") " Mar 10 08:44:04 crc kubenswrapper[4825]: I0310 08:44:04.349544 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65c0626b-a45c-46bc-99fc-b889bad71c1b-kube-api-access-hgtr5" (OuterVolumeSpecName: "kube-api-access-hgtr5") pod "65c0626b-a45c-46bc-99fc-b889bad71c1b" (UID: "65c0626b-a45c-46bc-99fc-b889bad71c1b"). InnerVolumeSpecName "kube-api-access-hgtr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:44:04 crc kubenswrapper[4825]: I0310 08:44:04.445972 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgtr5\" (UniqueName: \"kubernetes.io/projected/65c0626b-a45c-46bc-99fc-b889bad71c1b-kube-api-access-hgtr5\") on node \"crc\" DevicePath \"\"" Mar 10 08:44:05 crc kubenswrapper[4825]: I0310 08:44:05.026942 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552204-c79wc" Mar 10 08:44:05 crc kubenswrapper[4825]: I0310 08:44:05.026947 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552204-c79wc" event={"ID":"65c0626b-a45c-46bc-99fc-b889bad71c1b","Type":"ContainerDied","Data":"42d856ad0bce580c9876d4513f4c7bd79491f9db7816ebf3e288289527d8fb17"} Mar 10 08:44:05 crc kubenswrapper[4825]: I0310 08:44:05.027228 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42d856ad0bce580c9876d4513f4c7bd79491f9db7816ebf3e288289527d8fb17" Mar 10 08:44:05 crc kubenswrapper[4825]: I0310 08:44:05.029061 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-sl2sb" event={"ID":"3d150047-2232-43ae-990a-23bf03421efa","Type":"ContainerStarted","Data":"f8690f73641f434ff6ae6dbf91d7a18e80c2c4d91c80d4fbbe7087ac08ebf387"} Mar 10 08:44:05 crc kubenswrapper[4825]: I0310 08:44:05.050699 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-cell1-sl2sb" podStartSLOduration=1.677072775 podStartE2EDuration="2.050677432s" podCreationTimestamp="2026-03-10 08:44:03 +0000 UTC" firstStartedPulling="2026-03-10 08:44:03.981372211 +0000 UTC m=+7197.011152826" lastFinishedPulling="2026-03-10 08:44:04.354976878 +0000 UTC m=+7197.384757483" observedRunningTime="2026-03-10 08:44:05.04715574 +0000 UTC m=+7198.076936355" watchObservedRunningTime="2026-03-10 08:44:05.050677432 +0000 UTC m=+7198.080458047" Mar 10 08:44:05 crc kubenswrapper[4825]: I0310 08:44:05.310583 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552198-bvsh5"] Mar 10 08:44:05 crc kubenswrapper[4825]: I0310 08:44:05.318002 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552198-bvsh5"] Mar 10 08:44:07 crc kubenswrapper[4825]: I0310 08:44:07.249873 
4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4eec7146-26d1-4fd7-99a8-d4f910750686" path="/var/lib/kubelet/pods/4eec7146-26d1-4fd7-99a8-d4f910750686/volumes" Mar 10 08:44:48 crc kubenswrapper[4825]: I0310 08:44:48.487021 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vsq9k"] Mar 10 08:44:48 crc kubenswrapper[4825]: E0310 08:44:48.488097 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65c0626b-a45c-46bc-99fc-b889bad71c1b" containerName="oc" Mar 10 08:44:48 crc kubenswrapper[4825]: I0310 08:44:48.488114 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="65c0626b-a45c-46bc-99fc-b889bad71c1b" containerName="oc" Mar 10 08:44:48 crc kubenswrapper[4825]: I0310 08:44:48.488397 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="65c0626b-a45c-46bc-99fc-b889bad71c1b" containerName="oc" Mar 10 08:44:48 crc kubenswrapper[4825]: I0310 08:44:48.490378 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vsq9k" Mar 10 08:44:48 crc kubenswrapper[4825]: I0310 08:44:48.507219 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vsq9k"] Mar 10 08:44:48 crc kubenswrapper[4825]: I0310 08:44:48.571580 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kf26\" (UniqueName: \"kubernetes.io/projected/c7688266-98db-4cb5-ac55-013e7faa02e2-kube-api-access-2kf26\") pod \"certified-operators-vsq9k\" (UID: \"c7688266-98db-4cb5-ac55-013e7faa02e2\") " pod="openshift-marketplace/certified-operators-vsq9k" Mar 10 08:44:48 crc kubenswrapper[4825]: I0310 08:44:48.573930 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7688266-98db-4cb5-ac55-013e7faa02e2-catalog-content\") pod \"certified-operators-vsq9k\" (UID: \"c7688266-98db-4cb5-ac55-013e7faa02e2\") " pod="openshift-marketplace/certified-operators-vsq9k" Mar 10 08:44:48 crc kubenswrapper[4825]: I0310 08:44:48.574357 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7688266-98db-4cb5-ac55-013e7faa02e2-utilities\") pod \"certified-operators-vsq9k\" (UID: \"c7688266-98db-4cb5-ac55-013e7faa02e2\") " pod="openshift-marketplace/certified-operators-vsq9k" Mar 10 08:44:48 crc kubenswrapper[4825]: I0310 08:44:48.677224 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7688266-98db-4cb5-ac55-013e7faa02e2-catalog-content\") pod \"certified-operators-vsq9k\" (UID: \"c7688266-98db-4cb5-ac55-013e7faa02e2\") " pod="openshift-marketplace/certified-operators-vsq9k" Mar 10 08:44:48 crc kubenswrapper[4825]: I0310 08:44:48.677299 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7688266-98db-4cb5-ac55-013e7faa02e2-utilities\") pod \"certified-operators-vsq9k\" (UID: \"c7688266-98db-4cb5-ac55-013e7faa02e2\") " pod="openshift-marketplace/certified-operators-vsq9k" Mar 10 08:44:48 crc kubenswrapper[4825]: I0310 08:44:48.677389 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kf26\" (UniqueName: \"kubernetes.io/projected/c7688266-98db-4cb5-ac55-013e7faa02e2-kube-api-access-2kf26\") pod \"certified-operators-vsq9k\" (UID: \"c7688266-98db-4cb5-ac55-013e7faa02e2\") " pod="openshift-marketplace/certified-operators-vsq9k" Mar 10 08:44:48 crc kubenswrapper[4825]: I0310 08:44:48.677957 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7688266-98db-4cb5-ac55-013e7faa02e2-catalog-content\") pod \"certified-operators-vsq9k\" (UID: \"c7688266-98db-4cb5-ac55-013e7faa02e2\") " pod="openshift-marketplace/certified-operators-vsq9k" Mar 10 08:44:48 crc kubenswrapper[4825]: I0310 08:44:48.678188 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7688266-98db-4cb5-ac55-013e7faa02e2-utilities\") pod \"certified-operators-vsq9k\" (UID: \"c7688266-98db-4cb5-ac55-013e7faa02e2\") " pod="openshift-marketplace/certified-operators-vsq9k" Mar 10 08:44:48 crc kubenswrapper[4825]: I0310 08:44:48.702356 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kf26\" (UniqueName: \"kubernetes.io/projected/c7688266-98db-4cb5-ac55-013e7faa02e2-kube-api-access-2kf26\") pod \"certified-operators-vsq9k\" (UID: \"c7688266-98db-4cb5-ac55-013e7faa02e2\") " pod="openshift-marketplace/certified-operators-vsq9k" Mar 10 08:44:48 crc kubenswrapper[4825]: I0310 08:44:48.814904 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vsq9k" Mar 10 08:44:49 crc kubenswrapper[4825]: I0310 08:44:49.360073 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vsq9k"] Mar 10 08:44:49 crc kubenswrapper[4825]: I0310 08:44:49.455460 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vsq9k" event={"ID":"c7688266-98db-4cb5-ac55-013e7faa02e2","Type":"ContainerStarted","Data":"123c20d03335d1ab612dc763b1b9374a55c8bacc284649399b6229481c8c5a0d"} Mar 10 08:44:49 crc kubenswrapper[4825]: I0310 08:44:49.462161 4825 generic.go:334] "Generic (PLEG): container finished" podID="3d150047-2232-43ae-990a-23bf03421efa" containerID="f8690f73641f434ff6ae6dbf91d7a18e80c2c4d91c80d4fbbe7087ac08ebf387" exitCode=0 Mar 10 08:44:49 crc kubenswrapper[4825]: I0310 08:44:49.462207 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-sl2sb" event={"ID":"3d150047-2232-43ae-990a-23bf03421efa","Type":"ContainerDied","Data":"f8690f73641f434ff6ae6dbf91d7a18e80c2c4d91c80d4fbbe7087ac08ebf387"} Mar 10 08:44:50 crc kubenswrapper[4825]: I0310 08:44:50.479345 4825 generic.go:334] "Generic (PLEG): container finished" podID="c7688266-98db-4cb5-ac55-013e7faa02e2" containerID="8aa5f53205e01fa83f66dc3dd59d9c263800bf08af79f490f1852096bac647c5" exitCode=0 Mar 10 08:44:50 crc kubenswrapper[4825]: I0310 08:44:50.479437 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vsq9k" event={"ID":"c7688266-98db-4cb5-ac55-013e7faa02e2","Type":"ContainerDied","Data":"8aa5f53205e01fa83f66dc3dd59d9c263800bf08af79f490f1852096bac647c5"} Mar 10 08:44:50 crc kubenswrapper[4825]: I0310 08:44:50.924434 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-sl2sb" Mar 10 08:44:51 crc kubenswrapper[4825]: I0310 08:44:51.025171 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jg95r\" (UniqueName: \"kubernetes.io/projected/3d150047-2232-43ae-990a-23bf03421efa-kube-api-access-jg95r\") pod \"3d150047-2232-43ae-990a-23bf03421efa\" (UID: \"3d150047-2232-43ae-990a-23bf03421efa\") " Mar 10 08:44:51 crc kubenswrapper[4825]: I0310 08:44:51.025301 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3d150047-2232-43ae-990a-23bf03421efa-ssh-key-openstack-cell1\") pod \"3d150047-2232-43ae-990a-23bf03421efa\" (UID: \"3d150047-2232-43ae-990a-23bf03421efa\") " Mar 10 08:44:51 crc kubenswrapper[4825]: I0310 08:44:51.025420 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d150047-2232-43ae-990a-23bf03421efa-inventory\") pod \"3d150047-2232-43ae-990a-23bf03421efa\" (UID: \"3d150047-2232-43ae-990a-23bf03421efa\") " Mar 10 08:44:51 crc kubenswrapper[4825]: I0310 08:44:51.031194 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d150047-2232-43ae-990a-23bf03421efa-kube-api-access-jg95r" (OuterVolumeSpecName: "kube-api-access-jg95r") pod "3d150047-2232-43ae-990a-23bf03421efa" (UID: "3d150047-2232-43ae-990a-23bf03421efa"). InnerVolumeSpecName "kube-api-access-jg95r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:44:51 crc kubenswrapper[4825]: I0310 08:44:51.054854 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d150047-2232-43ae-990a-23bf03421efa-inventory" (OuterVolumeSpecName: "inventory") pod "3d150047-2232-43ae-990a-23bf03421efa" (UID: "3d150047-2232-43ae-990a-23bf03421efa"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:44:51 crc kubenswrapper[4825]: I0310 08:44:51.058327 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d150047-2232-43ae-990a-23bf03421efa-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "3d150047-2232-43ae-990a-23bf03421efa" (UID: "3d150047-2232-43ae-990a-23bf03421efa"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:44:51 crc kubenswrapper[4825]: I0310 08:44:51.127341 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jg95r\" (UniqueName: \"kubernetes.io/projected/3d150047-2232-43ae-990a-23bf03421efa-kube-api-access-jg95r\") on node \"crc\" DevicePath \"\"" Mar 10 08:44:51 crc kubenswrapper[4825]: I0310 08:44:51.127650 4825 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3d150047-2232-43ae-990a-23bf03421efa-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 10 08:44:51 crc kubenswrapper[4825]: I0310 08:44:51.127661 4825 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d150047-2232-43ae-990a-23bf03421efa-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 08:44:51 crc kubenswrapper[4825]: I0310 08:44:51.492921 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-sl2sb" event={"ID":"3d150047-2232-43ae-990a-23bf03421efa","Type":"ContainerDied","Data":"82fbf6f303bc764e9ef07ebabbfa0c45532c2caa89a4ddc0e15b383a22824b31"} Mar 10 08:44:51 crc kubenswrapper[4825]: I0310 08:44:51.492984 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82fbf6f303bc764e9ef07ebabbfa0c45532c2caa89a4ddc0e15b383a22824b31" Mar 10 08:44:51 crc kubenswrapper[4825]: I0310 08:44:51.493002 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-sl2sb" Mar 10 08:44:51 crc kubenswrapper[4825]: I0310 08:44:51.495979 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vsq9k" event={"ID":"c7688266-98db-4cb5-ac55-013e7faa02e2","Type":"ContainerStarted","Data":"20f8fa0995fb31c63c7e00fccbea2c765a6d2a596e749c0449226931e5318b05"} Mar 10 08:44:51 crc kubenswrapper[4825]: I0310 08:44:51.593723 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-wghgg"] Mar 10 08:44:51 crc kubenswrapper[4825]: E0310 08:44:51.594627 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d150047-2232-43ae-990a-23bf03421efa" containerName="install-os-openstack-openstack-cell1" Mar 10 08:44:51 crc kubenswrapper[4825]: I0310 08:44:51.594678 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d150047-2232-43ae-990a-23bf03421efa" containerName="install-os-openstack-openstack-cell1" Mar 10 08:44:51 crc kubenswrapper[4825]: I0310 08:44:51.595186 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d150047-2232-43ae-990a-23bf03421efa" containerName="install-os-openstack-openstack-cell1" Mar 10 08:44:51 crc kubenswrapper[4825]: I0310 08:44:51.596871 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-wghgg" Mar 10 08:44:51 crc kubenswrapper[4825]: I0310 08:44:51.599125 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 08:44:51 crc kubenswrapper[4825]: I0310 08:44:51.600610 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 10 08:44:51 crc kubenswrapper[4825]: I0310 08:44:51.600687 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-5twvv" Mar 10 08:44:51 crc kubenswrapper[4825]: I0310 08:44:51.600667 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 10 08:44:51 crc kubenswrapper[4825]: I0310 08:44:51.606936 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-wghgg"] Mar 10 08:44:51 crc kubenswrapper[4825]: I0310 08:44:51.745654 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/596aaaec-e224-450d-886b-7e7477c7f221-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-wghgg\" (UID: \"596aaaec-e224-450d-886b-7e7477c7f221\") " pod="openstack/configure-os-openstack-openstack-cell1-wghgg" Mar 10 08:44:51 crc kubenswrapper[4825]: I0310 08:44:51.745785 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/596aaaec-e224-450d-886b-7e7477c7f221-inventory\") pod \"configure-os-openstack-openstack-cell1-wghgg\" (UID: \"596aaaec-e224-450d-886b-7e7477c7f221\") " pod="openstack/configure-os-openstack-openstack-cell1-wghgg" Mar 10 08:44:51 crc kubenswrapper[4825]: I0310 08:44:51.745897 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-4wlfq\" (UniqueName: \"kubernetes.io/projected/596aaaec-e224-450d-886b-7e7477c7f221-kube-api-access-4wlfq\") pod \"configure-os-openstack-openstack-cell1-wghgg\" (UID: \"596aaaec-e224-450d-886b-7e7477c7f221\") " pod="openstack/configure-os-openstack-openstack-cell1-wghgg" Mar 10 08:44:51 crc kubenswrapper[4825]: I0310 08:44:51.847264 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wlfq\" (UniqueName: \"kubernetes.io/projected/596aaaec-e224-450d-886b-7e7477c7f221-kube-api-access-4wlfq\") pod \"configure-os-openstack-openstack-cell1-wghgg\" (UID: \"596aaaec-e224-450d-886b-7e7477c7f221\") " pod="openstack/configure-os-openstack-openstack-cell1-wghgg" Mar 10 08:44:51 crc kubenswrapper[4825]: I0310 08:44:51.847374 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/596aaaec-e224-450d-886b-7e7477c7f221-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-wghgg\" (UID: \"596aaaec-e224-450d-886b-7e7477c7f221\") " pod="openstack/configure-os-openstack-openstack-cell1-wghgg" Mar 10 08:44:51 crc kubenswrapper[4825]: I0310 08:44:51.847462 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/596aaaec-e224-450d-886b-7e7477c7f221-inventory\") pod \"configure-os-openstack-openstack-cell1-wghgg\" (UID: \"596aaaec-e224-450d-886b-7e7477c7f221\") " pod="openstack/configure-os-openstack-openstack-cell1-wghgg" Mar 10 08:44:51 crc kubenswrapper[4825]: I0310 08:44:51.852666 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/596aaaec-e224-450d-886b-7e7477c7f221-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-wghgg\" (UID: \"596aaaec-e224-450d-886b-7e7477c7f221\") " 
pod="openstack/configure-os-openstack-openstack-cell1-wghgg" Mar 10 08:44:51 crc kubenswrapper[4825]: I0310 08:44:51.852685 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/596aaaec-e224-450d-886b-7e7477c7f221-inventory\") pod \"configure-os-openstack-openstack-cell1-wghgg\" (UID: \"596aaaec-e224-450d-886b-7e7477c7f221\") " pod="openstack/configure-os-openstack-openstack-cell1-wghgg" Mar 10 08:44:51 crc kubenswrapper[4825]: I0310 08:44:51.862594 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wlfq\" (UniqueName: \"kubernetes.io/projected/596aaaec-e224-450d-886b-7e7477c7f221-kube-api-access-4wlfq\") pod \"configure-os-openstack-openstack-cell1-wghgg\" (UID: \"596aaaec-e224-450d-886b-7e7477c7f221\") " pod="openstack/configure-os-openstack-openstack-cell1-wghgg" Mar 10 08:44:51 crc kubenswrapper[4825]: I0310 08:44:51.926751 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-wghgg" Mar 10 08:44:52 crc kubenswrapper[4825]: I0310 08:44:52.471578 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-wghgg"] Mar 10 08:44:52 crc kubenswrapper[4825]: I0310 08:44:52.505690 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-wghgg" event={"ID":"596aaaec-e224-450d-886b-7e7477c7f221","Type":"ContainerStarted","Data":"9b90b7adb6d2cca4f067cf37707a46f28341d06279815b86cd0f4beb8062ea5f"} Mar 10 08:44:53 crc kubenswrapper[4825]: I0310 08:44:53.516491 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-wghgg" event={"ID":"596aaaec-e224-450d-886b-7e7477c7f221","Type":"ContainerStarted","Data":"7af9c190cdb4b44f626a44df30ca6609fe36c8b4a82796c4fa40e8e1c704b965"} Mar 10 08:44:53 crc kubenswrapper[4825]: I0310 08:44:53.518714 4825 generic.go:334] "Generic (PLEG): container finished" podID="c7688266-98db-4cb5-ac55-013e7faa02e2" containerID="20f8fa0995fb31c63c7e00fccbea2c765a6d2a596e749c0449226931e5318b05" exitCode=0 Mar 10 08:44:53 crc kubenswrapper[4825]: I0310 08:44:53.518748 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vsq9k" event={"ID":"c7688266-98db-4cb5-ac55-013e7faa02e2","Type":"ContainerDied","Data":"20f8fa0995fb31c63c7e00fccbea2c765a6d2a596e749c0449226931e5318b05"} Mar 10 08:44:53 crc kubenswrapper[4825]: I0310 08:44:53.543271 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-cell1-wghgg" podStartSLOduration=2.094199689 podStartE2EDuration="2.543248808s" podCreationTimestamp="2026-03-10 08:44:51 +0000 UTC" firstStartedPulling="2026-03-10 08:44:52.482411691 +0000 UTC m=+7245.512192306" lastFinishedPulling="2026-03-10 08:44:52.93146081 +0000 UTC m=+7245.961241425" 
observedRunningTime="2026-03-10 08:44:53.532123104 +0000 UTC m=+7246.561903719" watchObservedRunningTime="2026-03-10 08:44:53.543248808 +0000 UTC m=+7246.573029413" Mar 10 08:44:53 crc kubenswrapper[4825]: I0310 08:44:53.880997 4825 scope.go:117] "RemoveContainer" containerID="f99e33e974093644f16af2af3f8441900d225f4d483a56c6a8ff632972b03215" Mar 10 08:44:54 crc kubenswrapper[4825]: I0310 08:44:54.528683 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vsq9k" event={"ID":"c7688266-98db-4cb5-ac55-013e7faa02e2","Type":"ContainerStarted","Data":"0a5532a08462026a7e5f07e1f92aa9e8f4dcfff219eaf6c6c994b6220718447a"} Mar 10 08:44:54 crc kubenswrapper[4825]: I0310 08:44:54.549457 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vsq9k" podStartSLOduration=2.8303337859999997 podStartE2EDuration="6.549436712s" podCreationTimestamp="2026-03-10 08:44:48 +0000 UTC" firstStartedPulling="2026-03-10 08:44:50.482043278 +0000 UTC m=+7243.511823893" lastFinishedPulling="2026-03-10 08:44:54.201146204 +0000 UTC m=+7247.230926819" observedRunningTime="2026-03-10 08:44:54.547886861 +0000 UTC m=+7247.577667476" watchObservedRunningTime="2026-03-10 08:44:54.549436712 +0000 UTC m=+7247.579217327" Mar 10 08:44:58 crc kubenswrapper[4825]: I0310 08:44:58.815048 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vsq9k" Mar 10 08:44:58 crc kubenswrapper[4825]: I0310 08:44:58.815597 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vsq9k" Mar 10 08:44:59 crc kubenswrapper[4825]: I0310 08:44:59.863609 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-vsq9k" podUID="c7688266-98db-4cb5-ac55-013e7faa02e2" containerName="registry-server" probeResult="failure" output=< Mar 10 08:44:59 crc 
kubenswrapper[4825]: timeout: failed to connect service ":50051" within 1s Mar 10 08:44:59 crc kubenswrapper[4825]: > Mar 10 08:45:00 crc kubenswrapper[4825]: I0310 08:45:00.143593 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552205-bxgcw"] Mar 10 08:45:00 crc kubenswrapper[4825]: I0310 08:45:00.145073 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552205-bxgcw" Mar 10 08:45:00 crc kubenswrapper[4825]: I0310 08:45:00.152004 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 08:45:00 crc kubenswrapper[4825]: I0310 08:45:00.152214 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552205-bxgcw"] Mar 10 08:45:00 crc kubenswrapper[4825]: I0310 08:45:00.152325 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 08:45:00 crc kubenswrapper[4825]: I0310 08:45:00.310929 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/749d1021-e2d5-4c3a-a300-ea500dd37438-config-volume\") pod \"collect-profiles-29552205-bxgcw\" (UID: \"749d1021-e2d5-4c3a-a300-ea500dd37438\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552205-bxgcw" Mar 10 08:45:00 crc kubenswrapper[4825]: I0310 08:45:00.310988 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tgsd\" (UniqueName: \"kubernetes.io/projected/749d1021-e2d5-4c3a-a300-ea500dd37438-kube-api-access-9tgsd\") pod \"collect-profiles-29552205-bxgcw\" (UID: \"749d1021-e2d5-4c3a-a300-ea500dd37438\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29552205-bxgcw" Mar 10 08:45:00 crc kubenswrapper[4825]: I0310 08:45:00.311037 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/749d1021-e2d5-4c3a-a300-ea500dd37438-secret-volume\") pod \"collect-profiles-29552205-bxgcw\" (UID: \"749d1021-e2d5-4c3a-a300-ea500dd37438\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552205-bxgcw" Mar 10 08:45:00 crc kubenswrapper[4825]: I0310 08:45:00.412469 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/749d1021-e2d5-4c3a-a300-ea500dd37438-config-volume\") pod \"collect-profiles-29552205-bxgcw\" (UID: \"749d1021-e2d5-4c3a-a300-ea500dd37438\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552205-bxgcw" Mar 10 08:45:00 crc kubenswrapper[4825]: I0310 08:45:00.412547 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tgsd\" (UniqueName: \"kubernetes.io/projected/749d1021-e2d5-4c3a-a300-ea500dd37438-kube-api-access-9tgsd\") pod \"collect-profiles-29552205-bxgcw\" (UID: \"749d1021-e2d5-4c3a-a300-ea500dd37438\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552205-bxgcw" Mar 10 08:45:00 crc kubenswrapper[4825]: I0310 08:45:00.412598 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/749d1021-e2d5-4c3a-a300-ea500dd37438-secret-volume\") pod \"collect-profiles-29552205-bxgcw\" (UID: \"749d1021-e2d5-4c3a-a300-ea500dd37438\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552205-bxgcw" Mar 10 08:45:00 crc kubenswrapper[4825]: I0310 08:45:00.413567 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/749d1021-e2d5-4c3a-a300-ea500dd37438-config-volume\") pod \"collect-profiles-29552205-bxgcw\" (UID: \"749d1021-e2d5-4c3a-a300-ea500dd37438\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552205-bxgcw" Mar 10 08:45:00 crc kubenswrapper[4825]: I0310 08:45:00.421824 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/749d1021-e2d5-4c3a-a300-ea500dd37438-secret-volume\") pod \"collect-profiles-29552205-bxgcw\" (UID: \"749d1021-e2d5-4c3a-a300-ea500dd37438\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552205-bxgcw" Mar 10 08:45:00 crc kubenswrapper[4825]: I0310 08:45:00.433297 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tgsd\" (UniqueName: \"kubernetes.io/projected/749d1021-e2d5-4c3a-a300-ea500dd37438-kube-api-access-9tgsd\") pod \"collect-profiles-29552205-bxgcw\" (UID: \"749d1021-e2d5-4c3a-a300-ea500dd37438\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552205-bxgcw" Mar 10 08:45:00 crc kubenswrapper[4825]: I0310 08:45:00.488462 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552205-bxgcw" Mar 10 08:45:00 crc kubenswrapper[4825]: I0310 08:45:00.977312 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552205-bxgcw"] Mar 10 08:45:00 crc kubenswrapper[4825]: W0310 08:45:00.990317 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod749d1021_e2d5_4c3a_a300_ea500dd37438.slice/crio-a1ec8773914528190a6e5057948c14625a0ae684fabb253e307137be0df95944 WatchSource:0}: Error finding container a1ec8773914528190a6e5057948c14625a0ae684fabb253e307137be0df95944: Status 404 returned error can't find the container with id a1ec8773914528190a6e5057948c14625a0ae684fabb253e307137be0df95944 Mar 10 08:45:01 crc kubenswrapper[4825]: I0310 08:45:01.596559 4825 generic.go:334] "Generic (PLEG): container finished" podID="749d1021-e2d5-4c3a-a300-ea500dd37438" containerID="7ae228cadb05a6fe4bceec6c595529ca06ba12722a7dbcd3b8fc270ed110488e" exitCode=0 Mar 10 08:45:01 crc kubenswrapper[4825]: I0310 08:45:01.596673 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552205-bxgcw" event={"ID":"749d1021-e2d5-4c3a-a300-ea500dd37438","Type":"ContainerDied","Data":"7ae228cadb05a6fe4bceec6c595529ca06ba12722a7dbcd3b8fc270ed110488e"} Mar 10 08:45:01 crc kubenswrapper[4825]: I0310 08:45:01.596958 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552205-bxgcw" event={"ID":"749d1021-e2d5-4c3a-a300-ea500dd37438","Type":"ContainerStarted","Data":"a1ec8773914528190a6e5057948c14625a0ae684fabb253e307137be0df95944"} Mar 10 08:45:02 crc kubenswrapper[4825]: I0310 08:45:02.939585 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552205-bxgcw" Mar 10 08:45:03 crc kubenswrapper[4825]: I0310 08:45:03.071942 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/749d1021-e2d5-4c3a-a300-ea500dd37438-secret-volume\") pod \"749d1021-e2d5-4c3a-a300-ea500dd37438\" (UID: \"749d1021-e2d5-4c3a-a300-ea500dd37438\") " Mar 10 08:45:03 crc kubenswrapper[4825]: I0310 08:45:03.072149 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tgsd\" (UniqueName: \"kubernetes.io/projected/749d1021-e2d5-4c3a-a300-ea500dd37438-kube-api-access-9tgsd\") pod \"749d1021-e2d5-4c3a-a300-ea500dd37438\" (UID: \"749d1021-e2d5-4c3a-a300-ea500dd37438\") " Mar 10 08:45:03 crc kubenswrapper[4825]: I0310 08:45:03.072365 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/749d1021-e2d5-4c3a-a300-ea500dd37438-config-volume\") pod \"749d1021-e2d5-4c3a-a300-ea500dd37438\" (UID: \"749d1021-e2d5-4c3a-a300-ea500dd37438\") " Mar 10 08:45:03 crc kubenswrapper[4825]: I0310 08:45:03.072922 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/749d1021-e2d5-4c3a-a300-ea500dd37438-config-volume" (OuterVolumeSpecName: "config-volume") pod "749d1021-e2d5-4c3a-a300-ea500dd37438" (UID: "749d1021-e2d5-4c3a-a300-ea500dd37438"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:45:03 crc kubenswrapper[4825]: I0310 08:45:03.081326 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/749d1021-e2d5-4c3a-a300-ea500dd37438-kube-api-access-9tgsd" (OuterVolumeSpecName: "kube-api-access-9tgsd") pod "749d1021-e2d5-4c3a-a300-ea500dd37438" (UID: "749d1021-e2d5-4c3a-a300-ea500dd37438"). 
InnerVolumeSpecName "kube-api-access-9tgsd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:45:03 crc kubenswrapper[4825]: I0310 08:45:03.081856 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/749d1021-e2d5-4c3a-a300-ea500dd37438-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "749d1021-e2d5-4c3a-a300-ea500dd37438" (UID: "749d1021-e2d5-4c3a-a300-ea500dd37438"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:45:03 crc kubenswrapper[4825]: I0310 08:45:03.174266 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tgsd\" (UniqueName: \"kubernetes.io/projected/749d1021-e2d5-4c3a-a300-ea500dd37438-kube-api-access-9tgsd\") on node \"crc\" DevicePath \"\"" Mar 10 08:45:03 crc kubenswrapper[4825]: I0310 08:45:03.174306 4825 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/749d1021-e2d5-4c3a-a300-ea500dd37438-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 08:45:03 crc kubenswrapper[4825]: I0310 08:45:03.174318 4825 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/749d1021-e2d5-4c3a-a300-ea500dd37438-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 08:45:03 crc kubenswrapper[4825]: I0310 08:45:03.620337 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552205-bxgcw" event={"ID":"749d1021-e2d5-4c3a-a300-ea500dd37438","Type":"ContainerDied","Data":"a1ec8773914528190a6e5057948c14625a0ae684fabb253e307137be0df95944"} Mar 10 08:45:03 crc kubenswrapper[4825]: I0310 08:45:03.620923 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1ec8773914528190a6e5057948c14625a0ae684fabb253e307137be0df95944" Mar 10 08:45:03 crc kubenswrapper[4825]: I0310 08:45:03.620422 4825 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552205-bxgcw" Mar 10 08:45:04 crc kubenswrapper[4825]: I0310 08:45:04.007262 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552160-tc9rj"] Mar 10 08:45:04 crc kubenswrapper[4825]: I0310 08:45:04.015921 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552160-tc9rj"] Mar 10 08:45:05 crc kubenswrapper[4825]: I0310 08:45:05.249104 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="535e3e1e-6cd1-4558-8eb1-26d354d84aab" path="/var/lib/kubelet/pods/535e3e1e-6cd1-4558-8eb1-26d354d84aab/volumes" Mar 10 08:45:08 crc kubenswrapper[4825]: I0310 08:45:08.860232 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vsq9k" Mar 10 08:45:08 crc kubenswrapper[4825]: I0310 08:45:08.924577 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vsq9k" Mar 10 08:45:09 crc kubenswrapper[4825]: I0310 08:45:09.100967 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vsq9k"] Mar 10 08:45:10 crc kubenswrapper[4825]: I0310 08:45:10.683280 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vsq9k" podUID="c7688266-98db-4cb5-ac55-013e7faa02e2" containerName="registry-server" containerID="cri-o://0a5532a08462026a7e5f07e1f92aa9e8f4dcfff219eaf6c6c994b6220718447a" gracePeriod=2 Mar 10 08:45:11 crc kubenswrapper[4825]: I0310 08:45:11.128727 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vsq9k" Mar 10 08:45:11 crc kubenswrapper[4825]: I0310 08:45:11.249382 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7688266-98db-4cb5-ac55-013e7faa02e2-catalog-content\") pod \"c7688266-98db-4cb5-ac55-013e7faa02e2\" (UID: \"c7688266-98db-4cb5-ac55-013e7faa02e2\") " Mar 10 08:45:11 crc kubenswrapper[4825]: I0310 08:45:11.249771 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7688266-98db-4cb5-ac55-013e7faa02e2-utilities\") pod \"c7688266-98db-4cb5-ac55-013e7faa02e2\" (UID: \"c7688266-98db-4cb5-ac55-013e7faa02e2\") " Mar 10 08:45:11 crc kubenswrapper[4825]: I0310 08:45:11.250584 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7688266-98db-4cb5-ac55-013e7faa02e2-utilities" (OuterVolumeSpecName: "utilities") pod "c7688266-98db-4cb5-ac55-013e7faa02e2" (UID: "c7688266-98db-4cb5-ac55-013e7faa02e2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:45:11 crc kubenswrapper[4825]: I0310 08:45:11.250795 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kf26\" (UniqueName: \"kubernetes.io/projected/c7688266-98db-4cb5-ac55-013e7faa02e2-kube-api-access-2kf26\") pod \"c7688266-98db-4cb5-ac55-013e7faa02e2\" (UID: \"c7688266-98db-4cb5-ac55-013e7faa02e2\") " Mar 10 08:45:11 crc kubenswrapper[4825]: I0310 08:45:11.251354 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7688266-98db-4cb5-ac55-013e7faa02e2-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 08:45:11 crc kubenswrapper[4825]: I0310 08:45:11.256040 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7688266-98db-4cb5-ac55-013e7faa02e2-kube-api-access-2kf26" (OuterVolumeSpecName: "kube-api-access-2kf26") pod "c7688266-98db-4cb5-ac55-013e7faa02e2" (UID: "c7688266-98db-4cb5-ac55-013e7faa02e2"). InnerVolumeSpecName "kube-api-access-2kf26". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:45:11 crc kubenswrapper[4825]: I0310 08:45:11.308070 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7688266-98db-4cb5-ac55-013e7faa02e2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c7688266-98db-4cb5-ac55-013e7faa02e2" (UID: "c7688266-98db-4cb5-ac55-013e7faa02e2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:45:11 crc kubenswrapper[4825]: I0310 08:45:11.354359 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kf26\" (UniqueName: \"kubernetes.io/projected/c7688266-98db-4cb5-ac55-013e7faa02e2-kube-api-access-2kf26\") on node \"crc\" DevicePath \"\"" Mar 10 08:45:11 crc kubenswrapper[4825]: I0310 08:45:11.354569 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7688266-98db-4cb5-ac55-013e7faa02e2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 08:45:11 crc kubenswrapper[4825]: I0310 08:45:11.694259 4825 generic.go:334] "Generic (PLEG): container finished" podID="c7688266-98db-4cb5-ac55-013e7faa02e2" containerID="0a5532a08462026a7e5f07e1f92aa9e8f4dcfff219eaf6c6c994b6220718447a" exitCode=0 Mar 10 08:45:11 crc kubenswrapper[4825]: I0310 08:45:11.694311 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vsq9k" event={"ID":"c7688266-98db-4cb5-ac55-013e7faa02e2","Type":"ContainerDied","Data":"0a5532a08462026a7e5f07e1f92aa9e8f4dcfff219eaf6c6c994b6220718447a"} Mar 10 08:45:11 crc kubenswrapper[4825]: I0310 08:45:11.694374 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vsq9k" event={"ID":"c7688266-98db-4cb5-ac55-013e7faa02e2","Type":"ContainerDied","Data":"123c20d03335d1ab612dc763b1b9374a55c8bacc284649399b6229481c8c5a0d"} Mar 10 08:45:11 crc kubenswrapper[4825]: I0310 08:45:11.694398 4825 scope.go:117] "RemoveContainer" containerID="0a5532a08462026a7e5f07e1f92aa9e8f4dcfff219eaf6c6c994b6220718447a" Mar 10 08:45:11 crc kubenswrapper[4825]: I0310 08:45:11.694329 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vsq9k" Mar 10 08:45:11 crc kubenswrapper[4825]: I0310 08:45:11.715685 4825 scope.go:117] "RemoveContainer" containerID="20f8fa0995fb31c63c7e00fccbea2c765a6d2a596e749c0449226931e5318b05" Mar 10 08:45:11 crc kubenswrapper[4825]: I0310 08:45:11.736872 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vsq9k"] Mar 10 08:45:11 crc kubenswrapper[4825]: I0310 08:45:11.747167 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vsq9k"] Mar 10 08:45:11 crc kubenswrapper[4825]: I0310 08:45:11.756171 4825 scope.go:117] "RemoveContainer" containerID="8aa5f53205e01fa83f66dc3dd59d9c263800bf08af79f490f1852096bac647c5" Mar 10 08:45:11 crc kubenswrapper[4825]: I0310 08:45:11.785726 4825 scope.go:117] "RemoveContainer" containerID="0a5532a08462026a7e5f07e1f92aa9e8f4dcfff219eaf6c6c994b6220718447a" Mar 10 08:45:11 crc kubenswrapper[4825]: E0310 08:45:11.786058 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a5532a08462026a7e5f07e1f92aa9e8f4dcfff219eaf6c6c994b6220718447a\": container with ID starting with 0a5532a08462026a7e5f07e1f92aa9e8f4dcfff219eaf6c6c994b6220718447a not found: ID does not exist" containerID="0a5532a08462026a7e5f07e1f92aa9e8f4dcfff219eaf6c6c994b6220718447a" Mar 10 08:45:11 crc kubenswrapper[4825]: I0310 08:45:11.786102 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a5532a08462026a7e5f07e1f92aa9e8f4dcfff219eaf6c6c994b6220718447a"} err="failed to get container status \"0a5532a08462026a7e5f07e1f92aa9e8f4dcfff219eaf6c6c994b6220718447a\": rpc error: code = NotFound desc = could not find container \"0a5532a08462026a7e5f07e1f92aa9e8f4dcfff219eaf6c6c994b6220718447a\": container with ID starting with 0a5532a08462026a7e5f07e1f92aa9e8f4dcfff219eaf6c6c994b6220718447a not 
found: ID does not exist" Mar 10 08:45:11 crc kubenswrapper[4825]: I0310 08:45:11.786144 4825 scope.go:117] "RemoveContainer" containerID="20f8fa0995fb31c63c7e00fccbea2c765a6d2a596e749c0449226931e5318b05" Mar 10 08:45:11 crc kubenswrapper[4825]: E0310 08:45:11.786586 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20f8fa0995fb31c63c7e00fccbea2c765a6d2a596e749c0449226931e5318b05\": container with ID starting with 20f8fa0995fb31c63c7e00fccbea2c765a6d2a596e749c0449226931e5318b05 not found: ID does not exist" containerID="20f8fa0995fb31c63c7e00fccbea2c765a6d2a596e749c0449226931e5318b05" Mar 10 08:45:11 crc kubenswrapper[4825]: I0310 08:45:11.786615 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20f8fa0995fb31c63c7e00fccbea2c765a6d2a596e749c0449226931e5318b05"} err="failed to get container status \"20f8fa0995fb31c63c7e00fccbea2c765a6d2a596e749c0449226931e5318b05\": rpc error: code = NotFound desc = could not find container \"20f8fa0995fb31c63c7e00fccbea2c765a6d2a596e749c0449226931e5318b05\": container with ID starting with 20f8fa0995fb31c63c7e00fccbea2c765a6d2a596e749c0449226931e5318b05 not found: ID does not exist" Mar 10 08:45:11 crc kubenswrapper[4825]: I0310 08:45:11.786636 4825 scope.go:117] "RemoveContainer" containerID="8aa5f53205e01fa83f66dc3dd59d9c263800bf08af79f490f1852096bac647c5" Mar 10 08:45:11 crc kubenswrapper[4825]: E0310 08:45:11.787034 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8aa5f53205e01fa83f66dc3dd59d9c263800bf08af79f490f1852096bac647c5\": container with ID starting with 8aa5f53205e01fa83f66dc3dd59d9c263800bf08af79f490f1852096bac647c5 not found: ID does not exist" containerID="8aa5f53205e01fa83f66dc3dd59d9c263800bf08af79f490f1852096bac647c5" Mar 10 08:45:11 crc kubenswrapper[4825]: I0310 08:45:11.787183 4825 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8aa5f53205e01fa83f66dc3dd59d9c263800bf08af79f490f1852096bac647c5"} err="failed to get container status \"8aa5f53205e01fa83f66dc3dd59d9c263800bf08af79f490f1852096bac647c5\": rpc error: code = NotFound desc = could not find container \"8aa5f53205e01fa83f66dc3dd59d9c263800bf08af79f490f1852096bac647c5\": container with ID starting with 8aa5f53205e01fa83f66dc3dd59d9c263800bf08af79f490f1852096bac647c5 not found: ID does not exist" Mar 10 08:45:13 crc kubenswrapper[4825]: I0310 08:45:13.251475 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7688266-98db-4cb5-ac55-013e7faa02e2" path="/var/lib/kubelet/pods/c7688266-98db-4cb5-ac55-013e7faa02e2/volumes" Mar 10 08:45:16 crc kubenswrapper[4825]: I0310 08:45:16.888728 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 08:45:16 crc kubenswrapper[4825]: I0310 08:45:16.889023 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 08:45:36 crc kubenswrapper[4825]: I0310 08:45:36.948621 4825 generic.go:334] "Generic (PLEG): container finished" podID="596aaaec-e224-450d-886b-7e7477c7f221" containerID="7af9c190cdb4b44f626a44df30ca6609fe36c8b4a82796c4fa40e8e1c704b965" exitCode=0 Mar 10 08:45:36 crc kubenswrapper[4825]: I0310 08:45:36.948710 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-wghgg" 
event={"ID":"596aaaec-e224-450d-886b-7e7477c7f221","Type":"ContainerDied","Data":"7af9c190cdb4b44f626a44df30ca6609fe36c8b4a82796c4fa40e8e1c704b965"} Mar 10 08:45:38 crc kubenswrapper[4825]: I0310 08:45:38.449613 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-wghgg" Mar 10 08:45:38 crc kubenswrapper[4825]: I0310 08:45:38.571215 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/596aaaec-e224-450d-886b-7e7477c7f221-ssh-key-openstack-cell1\") pod \"596aaaec-e224-450d-886b-7e7477c7f221\" (UID: \"596aaaec-e224-450d-886b-7e7477c7f221\") " Mar 10 08:45:38 crc kubenswrapper[4825]: I0310 08:45:38.571347 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wlfq\" (UniqueName: \"kubernetes.io/projected/596aaaec-e224-450d-886b-7e7477c7f221-kube-api-access-4wlfq\") pod \"596aaaec-e224-450d-886b-7e7477c7f221\" (UID: \"596aaaec-e224-450d-886b-7e7477c7f221\") " Mar 10 08:45:38 crc kubenswrapper[4825]: I0310 08:45:38.571594 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/596aaaec-e224-450d-886b-7e7477c7f221-inventory\") pod \"596aaaec-e224-450d-886b-7e7477c7f221\" (UID: \"596aaaec-e224-450d-886b-7e7477c7f221\") " Mar 10 08:45:38 crc kubenswrapper[4825]: I0310 08:45:38.578587 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/596aaaec-e224-450d-886b-7e7477c7f221-kube-api-access-4wlfq" (OuterVolumeSpecName: "kube-api-access-4wlfq") pod "596aaaec-e224-450d-886b-7e7477c7f221" (UID: "596aaaec-e224-450d-886b-7e7477c7f221"). InnerVolumeSpecName "kube-api-access-4wlfq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:45:38 crc kubenswrapper[4825]: I0310 08:45:38.601650 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/596aaaec-e224-450d-886b-7e7477c7f221-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "596aaaec-e224-450d-886b-7e7477c7f221" (UID: "596aaaec-e224-450d-886b-7e7477c7f221"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:45:38 crc kubenswrapper[4825]: I0310 08:45:38.603616 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/596aaaec-e224-450d-886b-7e7477c7f221-inventory" (OuterVolumeSpecName: "inventory") pod "596aaaec-e224-450d-886b-7e7477c7f221" (UID: "596aaaec-e224-450d-886b-7e7477c7f221"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:45:38 crc kubenswrapper[4825]: I0310 08:45:38.674760 4825 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/596aaaec-e224-450d-886b-7e7477c7f221-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 10 08:45:38 crc kubenswrapper[4825]: I0310 08:45:38.674799 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wlfq\" (UniqueName: \"kubernetes.io/projected/596aaaec-e224-450d-886b-7e7477c7f221-kube-api-access-4wlfq\") on node \"crc\" DevicePath \"\"" Mar 10 08:45:38 crc kubenswrapper[4825]: I0310 08:45:38.674810 4825 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/596aaaec-e224-450d-886b-7e7477c7f221-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 08:45:38 crc kubenswrapper[4825]: I0310 08:45:38.967365 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-wghgg" 
event={"ID":"596aaaec-e224-450d-886b-7e7477c7f221","Type":"ContainerDied","Data":"9b90b7adb6d2cca4f067cf37707a46f28341d06279815b86cd0f4beb8062ea5f"} Mar 10 08:45:38 crc kubenswrapper[4825]: I0310 08:45:38.967408 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b90b7adb6d2cca4f067cf37707a46f28341d06279815b86cd0f4beb8062ea5f" Mar 10 08:45:38 crc kubenswrapper[4825]: I0310 08:45:38.967478 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-wghgg" Mar 10 08:45:39 crc kubenswrapper[4825]: I0310 08:45:39.059060 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-openstack-jtd8b"] Mar 10 08:45:39 crc kubenswrapper[4825]: E0310 08:45:39.059660 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7688266-98db-4cb5-ac55-013e7faa02e2" containerName="extract-utilities" Mar 10 08:45:39 crc kubenswrapper[4825]: I0310 08:45:39.059695 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7688266-98db-4cb5-ac55-013e7faa02e2" containerName="extract-utilities" Mar 10 08:45:39 crc kubenswrapper[4825]: E0310 08:45:39.059721 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7688266-98db-4cb5-ac55-013e7faa02e2" containerName="extract-content" Mar 10 08:45:39 crc kubenswrapper[4825]: I0310 08:45:39.059730 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7688266-98db-4cb5-ac55-013e7faa02e2" containerName="extract-content" Mar 10 08:45:39 crc kubenswrapper[4825]: E0310 08:45:39.059752 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="749d1021-e2d5-4c3a-a300-ea500dd37438" containerName="collect-profiles" Mar 10 08:45:39 crc kubenswrapper[4825]: I0310 08:45:39.059771 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="749d1021-e2d5-4c3a-a300-ea500dd37438" containerName="collect-profiles" Mar 10 08:45:39 crc kubenswrapper[4825]: E0310 08:45:39.059800 
4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="596aaaec-e224-450d-886b-7e7477c7f221" containerName="configure-os-openstack-openstack-cell1" Mar 10 08:45:39 crc kubenswrapper[4825]: I0310 08:45:39.059812 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="596aaaec-e224-450d-886b-7e7477c7f221" containerName="configure-os-openstack-openstack-cell1" Mar 10 08:45:39 crc kubenswrapper[4825]: E0310 08:45:39.059833 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7688266-98db-4cb5-ac55-013e7faa02e2" containerName="registry-server" Mar 10 08:45:39 crc kubenswrapper[4825]: I0310 08:45:39.059844 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7688266-98db-4cb5-ac55-013e7faa02e2" containerName="registry-server" Mar 10 08:45:39 crc kubenswrapper[4825]: I0310 08:45:39.060176 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="596aaaec-e224-450d-886b-7e7477c7f221" containerName="configure-os-openstack-openstack-cell1" Mar 10 08:45:39 crc kubenswrapper[4825]: I0310 08:45:39.060231 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="749d1021-e2d5-4c3a-a300-ea500dd37438" containerName="collect-profiles" Mar 10 08:45:39 crc kubenswrapper[4825]: I0310 08:45:39.060275 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7688266-98db-4cb5-ac55-013e7faa02e2" containerName="registry-server" Mar 10 08:45:39 crc kubenswrapper[4825]: I0310 08:45:39.061195 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-jtd8b" Mar 10 08:45:39 crc kubenswrapper[4825]: I0310 08:45:39.063950 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 10 08:45:39 crc kubenswrapper[4825]: I0310 08:45:39.064664 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 08:45:39 crc kubenswrapper[4825]: I0310 08:45:39.067184 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 10 08:45:39 crc kubenswrapper[4825]: I0310 08:45:39.067778 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-5twvv" Mar 10 08:45:39 crc kubenswrapper[4825]: I0310 08:45:39.070296 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-jtd8b"] Mar 10 08:45:39 crc kubenswrapper[4825]: I0310 08:45:39.189966 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f7c77abe-85f2-42bf-a43c-0fb55031e37d-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-jtd8b\" (UID: \"f7c77abe-85f2-42bf-a43c-0fb55031e37d\") " pod="openstack/ssh-known-hosts-openstack-jtd8b" Mar 10 08:45:39 crc kubenswrapper[4825]: I0310 08:45:39.190061 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f7c77abe-85f2-42bf-a43c-0fb55031e37d-inventory-0\") pod \"ssh-known-hosts-openstack-jtd8b\" (UID: \"f7c77abe-85f2-42bf-a43c-0fb55031e37d\") " pod="openstack/ssh-known-hosts-openstack-jtd8b" Mar 10 08:45:39 crc kubenswrapper[4825]: I0310 08:45:39.190335 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tvhc\" (UniqueName: 
\"kubernetes.io/projected/f7c77abe-85f2-42bf-a43c-0fb55031e37d-kube-api-access-4tvhc\") pod \"ssh-known-hosts-openstack-jtd8b\" (UID: \"f7c77abe-85f2-42bf-a43c-0fb55031e37d\") " pod="openstack/ssh-known-hosts-openstack-jtd8b" Mar 10 08:45:39 crc kubenswrapper[4825]: I0310 08:45:39.292332 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tvhc\" (UniqueName: \"kubernetes.io/projected/f7c77abe-85f2-42bf-a43c-0fb55031e37d-kube-api-access-4tvhc\") pod \"ssh-known-hosts-openstack-jtd8b\" (UID: \"f7c77abe-85f2-42bf-a43c-0fb55031e37d\") " pod="openstack/ssh-known-hosts-openstack-jtd8b" Mar 10 08:45:39 crc kubenswrapper[4825]: I0310 08:45:39.292435 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f7c77abe-85f2-42bf-a43c-0fb55031e37d-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-jtd8b\" (UID: \"f7c77abe-85f2-42bf-a43c-0fb55031e37d\") " pod="openstack/ssh-known-hosts-openstack-jtd8b" Mar 10 08:45:39 crc kubenswrapper[4825]: I0310 08:45:39.292495 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f7c77abe-85f2-42bf-a43c-0fb55031e37d-inventory-0\") pod \"ssh-known-hosts-openstack-jtd8b\" (UID: \"f7c77abe-85f2-42bf-a43c-0fb55031e37d\") " pod="openstack/ssh-known-hosts-openstack-jtd8b" Mar 10 08:45:39 crc kubenswrapper[4825]: I0310 08:45:39.298724 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f7c77abe-85f2-42bf-a43c-0fb55031e37d-inventory-0\") pod \"ssh-known-hosts-openstack-jtd8b\" (UID: \"f7c77abe-85f2-42bf-a43c-0fb55031e37d\") " pod="openstack/ssh-known-hosts-openstack-jtd8b" Mar 10 08:45:39 crc kubenswrapper[4825]: I0310 08:45:39.298814 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/f7c77abe-85f2-42bf-a43c-0fb55031e37d-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-jtd8b\" (UID: \"f7c77abe-85f2-42bf-a43c-0fb55031e37d\") " pod="openstack/ssh-known-hosts-openstack-jtd8b" Mar 10 08:45:39 crc kubenswrapper[4825]: I0310 08:45:39.313018 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tvhc\" (UniqueName: \"kubernetes.io/projected/f7c77abe-85f2-42bf-a43c-0fb55031e37d-kube-api-access-4tvhc\") pod \"ssh-known-hosts-openstack-jtd8b\" (UID: \"f7c77abe-85f2-42bf-a43c-0fb55031e37d\") " pod="openstack/ssh-known-hosts-openstack-jtd8b" Mar 10 08:45:39 crc kubenswrapper[4825]: I0310 08:45:39.379101 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-jtd8b" Mar 10 08:45:39 crc kubenswrapper[4825]: I0310 08:45:39.896763 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-jtd8b"] Mar 10 08:45:39 crc kubenswrapper[4825]: I0310 08:45:39.976936 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-jtd8b" event={"ID":"f7c77abe-85f2-42bf-a43c-0fb55031e37d","Type":"ContainerStarted","Data":"a6fd9a4afda8d028843b6882347d8b44ae7b6206b3dff5570759277ade070fb1"} Mar 10 08:45:40 crc kubenswrapper[4825]: I0310 08:45:40.988438 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-jtd8b" event={"ID":"f7c77abe-85f2-42bf-a43c-0fb55031e37d","Type":"ContainerStarted","Data":"4d1be249d6481b27a5e7ff06bd3b95579d646b4ace7ad4dac9b19214b856f105"} Mar 10 08:45:41 crc kubenswrapper[4825]: I0310 08:45:41.008306 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-openstack-jtd8b" podStartSLOduration=1.265743686 podStartE2EDuration="2.008287967s" podCreationTimestamp="2026-03-10 08:45:39 +0000 UTC" firstStartedPulling="2026-03-10 08:45:39.904413043 +0000 UTC m=+7292.934193658" 
lastFinishedPulling="2026-03-10 08:45:40.646957324 +0000 UTC m=+7293.676737939" observedRunningTime="2026-03-10 08:45:41.006376497 +0000 UTC m=+7294.036157122" watchObservedRunningTime="2026-03-10 08:45:41.008287967 +0000 UTC m=+7294.038068572" Mar 10 08:45:46 crc kubenswrapper[4825]: I0310 08:45:46.888671 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 08:45:46 crc kubenswrapper[4825]: I0310 08:45:46.889243 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 08:45:50 crc kubenswrapper[4825]: I0310 08:45:50.075382 4825 generic.go:334] "Generic (PLEG): container finished" podID="f7c77abe-85f2-42bf-a43c-0fb55031e37d" containerID="4d1be249d6481b27a5e7ff06bd3b95579d646b4ace7ad4dac9b19214b856f105" exitCode=0 Mar 10 08:45:50 crc kubenswrapper[4825]: I0310 08:45:50.075552 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-jtd8b" event={"ID":"f7c77abe-85f2-42bf-a43c-0fb55031e37d","Type":"ContainerDied","Data":"4d1be249d6481b27a5e7ff06bd3b95579d646b4ace7ad4dac9b19214b856f105"} Mar 10 08:45:51 crc kubenswrapper[4825]: I0310 08:45:51.495872 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-jtd8b" Mar 10 08:45:51 crc kubenswrapper[4825]: I0310 08:45:51.577747 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f7c77abe-85f2-42bf-a43c-0fb55031e37d-ssh-key-openstack-cell1\") pod \"f7c77abe-85f2-42bf-a43c-0fb55031e37d\" (UID: \"f7c77abe-85f2-42bf-a43c-0fb55031e37d\") " Mar 10 08:45:51 crc kubenswrapper[4825]: I0310 08:45:51.577871 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tvhc\" (UniqueName: \"kubernetes.io/projected/f7c77abe-85f2-42bf-a43c-0fb55031e37d-kube-api-access-4tvhc\") pod \"f7c77abe-85f2-42bf-a43c-0fb55031e37d\" (UID: \"f7c77abe-85f2-42bf-a43c-0fb55031e37d\") " Mar 10 08:45:51 crc kubenswrapper[4825]: I0310 08:45:51.577990 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f7c77abe-85f2-42bf-a43c-0fb55031e37d-inventory-0\") pod \"f7c77abe-85f2-42bf-a43c-0fb55031e37d\" (UID: \"f7c77abe-85f2-42bf-a43c-0fb55031e37d\") " Mar 10 08:45:51 crc kubenswrapper[4825]: I0310 08:45:51.584429 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7c77abe-85f2-42bf-a43c-0fb55031e37d-kube-api-access-4tvhc" (OuterVolumeSpecName: "kube-api-access-4tvhc") pod "f7c77abe-85f2-42bf-a43c-0fb55031e37d" (UID: "f7c77abe-85f2-42bf-a43c-0fb55031e37d"). InnerVolumeSpecName "kube-api-access-4tvhc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:45:51 crc kubenswrapper[4825]: I0310 08:45:51.608897 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7c77abe-85f2-42bf-a43c-0fb55031e37d-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "f7c77abe-85f2-42bf-a43c-0fb55031e37d" (UID: "f7c77abe-85f2-42bf-a43c-0fb55031e37d"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:45:51 crc kubenswrapper[4825]: I0310 08:45:51.615890 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7c77abe-85f2-42bf-a43c-0fb55031e37d-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "f7c77abe-85f2-42bf-a43c-0fb55031e37d" (UID: "f7c77abe-85f2-42bf-a43c-0fb55031e37d"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:45:51 crc kubenswrapper[4825]: I0310 08:45:51.680068 4825 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f7c77abe-85f2-42bf-a43c-0fb55031e37d-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 10 08:45:51 crc kubenswrapper[4825]: I0310 08:45:51.680100 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tvhc\" (UniqueName: \"kubernetes.io/projected/f7c77abe-85f2-42bf-a43c-0fb55031e37d-kube-api-access-4tvhc\") on node \"crc\" DevicePath \"\"" Mar 10 08:45:51 crc kubenswrapper[4825]: I0310 08:45:51.680116 4825 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f7c77abe-85f2-42bf-a43c-0fb55031e37d-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 10 08:45:52 crc kubenswrapper[4825]: I0310 08:45:52.096874 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-jtd8b" event={"ID":"f7c77abe-85f2-42bf-a43c-0fb55031e37d","Type":"ContainerDied","Data":"a6fd9a4afda8d028843b6882347d8b44ae7b6206b3dff5570759277ade070fb1"} Mar 10 08:45:52 crc kubenswrapper[4825]: I0310 08:45:52.096922 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6fd9a4afda8d028843b6882347d8b44ae7b6206b3dff5570759277ade070fb1" Mar 10 08:45:52 crc kubenswrapper[4825]: I0310 08:45:52.097551 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-jtd8b" Mar 10 08:45:52 crc kubenswrapper[4825]: I0310 08:45:52.177907 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-cell1-7v9mc"] Mar 10 08:45:52 crc kubenswrapper[4825]: E0310 08:45:52.178628 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7c77abe-85f2-42bf-a43c-0fb55031e37d" containerName="ssh-known-hosts-openstack" Mar 10 08:45:52 crc kubenswrapper[4825]: I0310 08:45:52.178661 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7c77abe-85f2-42bf-a43c-0fb55031e37d" containerName="ssh-known-hosts-openstack" Mar 10 08:45:52 crc kubenswrapper[4825]: I0310 08:45:52.178959 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7c77abe-85f2-42bf-a43c-0fb55031e37d" containerName="ssh-known-hosts-openstack" Mar 10 08:45:52 crc kubenswrapper[4825]: I0310 08:45:52.180060 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-7v9mc" Mar 10 08:45:52 crc kubenswrapper[4825]: I0310 08:45:52.183126 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 10 08:45:52 crc kubenswrapper[4825]: I0310 08:45:52.183746 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 08:45:52 crc kubenswrapper[4825]: I0310 08:45:52.183979 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 10 08:45:52 crc kubenswrapper[4825]: I0310 08:45:52.184009 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-5twvv" Mar 10 08:45:52 crc kubenswrapper[4825]: I0310 08:45:52.195337 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-7v9mc"] Mar 10 08:45:52 crc kubenswrapper[4825]: I0310 
08:45:52.290648 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sfgf\" (UniqueName: \"kubernetes.io/projected/9f21dc2a-195c-4bf3-82fe-be871a275ae3-kube-api-access-4sfgf\") pod \"run-os-openstack-openstack-cell1-7v9mc\" (UID: \"9f21dc2a-195c-4bf3-82fe-be871a275ae3\") " pod="openstack/run-os-openstack-openstack-cell1-7v9mc" Mar 10 08:45:52 crc kubenswrapper[4825]: I0310 08:45:52.290698 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/9f21dc2a-195c-4bf3-82fe-be871a275ae3-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-7v9mc\" (UID: \"9f21dc2a-195c-4bf3-82fe-be871a275ae3\") " pod="openstack/run-os-openstack-openstack-cell1-7v9mc" Mar 10 08:45:52 crc kubenswrapper[4825]: I0310 08:45:52.291599 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f21dc2a-195c-4bf3-82fe-be871a275ae3-inventory\") pod \"run-os-openstack-openstack-cell1-7v9mc\" (UID: \"9f21dc2a-195c-4bf3-82fe-be871a275ae3\") " pod="openstack/run-os-openstack-openstack-cell1-7v9mc" Mar 10 08:45:52 crc kubenswrapper[4825]: I0310 08:45:52.394019 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sfgf\" (UniqueName: \"kubernetes.io/projected/9f21dc2a-195c-4bf3-82fe-be871a275ae3-kube-api-access-4sfgf\") pod \"run-os-openstack-openstack-cell1-7v9mc\" (UID: \"9f21dc2a-195c-4bf3-82fe-be871a275ae3\") " pod="openstack/run-os-openstack-openstack-cell1-7v9mc" Mar 10 08:45:52 crc kubenswrapper[4825]: I0310 08:45:52.394073 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/9f21dc2a-195c-4bf3-82fe-be871a275ae3-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-7v9mc\" 
(UID: \"9f21dc2a-195c-4bf3-82fe-be871a275ae3\") " pod="openstack/run-os-openstack-openstack-cell1-7v9mc" Mar 10 08:45:52 crc kubenswrapper[4825]: I0310 08:45:52.394239 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f21dc2a-195c-4bf3-82fe-be871a275ae3-inventory\") pod \"run-os-openstack-openstack-cell1-7v9mc\" (UID: \"9f21dc2a-195c-4bf3-82fe-be871a275ae3\") " pod="openstack/run-os-openstack-openstack-cell1-7v9mc" Mar 10 08:45:52 crc kubenswrapper[4825]: I0310 08:45:52.399390 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/9f21dc2a-195c-4bf3-82fe-be871a275ae3-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-7v9mc\" (UID: \"9f21dc2a-195c-4bf3-82fe-be871a275ae3\") " pod="openstack/run-os-openstack-openstack-cell1-7v9mc" Mar 10 08:45:52 crc kubenswrapper[4825]: I0310 08:45:52.401168 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f21dc2a-195c-4bf3-82fe-be871a275ae3-inventory\") pod \"run-os-openstack-openstack-cell1-7v9mc\" (UID: \"9f21dc2a-195c-4bf3-82fe-be871a275ae3\") " pod="openstack/run-os-openstack-openstack-cell1-7v9mc" Mar 10 08:45:52 crc kubenswrapper[4825]: I0310 08:45:52.409597 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sfgf\" (UniqueName: \"kubernetes.io/projected/9f21dc2a-195c-4bf3-82fe-be871a275ae3-kube-api-access-4sfgf\") pod \"run-os-openstack-openstack-cell1-7v9mc\" (UID: \"9f21dc2a-195c-4bf3-82fe-be871a275ae3\") " pod="openstack/run-os-openstack-openstack-cell1-7v9mc" Mar 10 08:45:52 crc kubenswrapper[4825]: I0310 08:45:52.512955 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-7v9mc" Mar 10 08:45:53 crc kubenswrapper[4825]: I0310 08:45:53.080681 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-7v9mc"] Mar 10 08:45:53 crc kubenswrapper[4825]: I0310 08:45:53.107677 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-7v9mc" event={"ID":"9f21dc2a-195c-4bf3-82fe-be871a275ae3","Type":"ContainerStarted","Data":"d6bfd4b9b07759ccfc2fb7acfe0342e3b0fc58a50f94f5b39e8e904975cded2f"} Mar 10 08:45:53 crc kubenswrapper[4825]: I0310 08:45:53.973281 4825 scope.go:117] "RemoveContainer" containerID="f69710812475557cff5b28a89e252f8422664974d758dfa8bd70815e8bee731c" Mar 10 08:45:54 crc kubenswrapper[4825]: I0310 08:45:54.118073 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-7v9mc" event={"ID":"9f21dc2a-195c-4bf3-82fe-be871a275ae3","Type":"ContainerStarted","Data":"98723ee1f9807d14378b325224e6b0bd801764706724afa88e1bbf0abf2cc3d7"} Mar 10 08:45:54 crc kubenswrapper[4825]: I0310 08:45:54.148888 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-openstack-openstack-cell1-7v9mc" podStartSLOduration=1.5538151980000001 podStartE2EDuration="2.148869703s" podCreationTimestamp="2026-03-10 08:45:52 +0000 UTC" firstStartedPulling="2026-03-10 08:45:53.088034606 +0000 UTC m=+7306.117815221" lastFinishedPulling="2026-03-10 08:45:53.683089111 +0000 UTC m=+7306.712869726" observedRunningTime="2026-03-10 08:45:54.143336547 +0000 UTC m=+7307.173117182" watchObservedRunningTime="2026-03-10 08:45:54.148869703 +0000 UTC m=+7307.178650318" Mar 10 08:46:00 crc kubenswrapper[4825]: I0310 08:46:00.137095 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552206-q6ncf"] Mar 10 08:46:00 crc kubenswrapper[4825]: I0310 08:46:00.138926 4825 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552206-q6ncf" Mar 10 08:46:00 crc kubenswrapper[4825]: I0310 08:46:00.142359 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 08:46:00 crc kubenswrapper[4825]: I0310 08:46:00.142369 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 08:46:00 crc kubenswrapper[4825]: I0310 08:46:00.142405 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 08:46:00 crc kubenswrapper[4825]: I0310 08:46:00.148673 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552206-q6ncf"] Mar 10 08:46:00 crc kubenswrapper[4825]: I0310 08:46:00.149595 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwkhx\" (UniqueName: \"kubernetes.io/projected/0f9fcab9-d38b-401c-96da-163b9077f329-kube-api-access-hwkhx\") pod \"auto-csr-approver-29552206-q6ncf\" (UID: \"0f9fcab9-d38b-401c-96da-163b9077f329\") " pod="openshift-infra/auto-csr-approver-29552206-q6ncf" Mar 10 08:46:00 crc kubenswrapper[4825]: I0310 08:46:00.252440 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwkhx\" (UniqueName: \"kubernetes.io/projected/0f9fcab9-d38b-401c-96da-163b9077f329-kube-api-access-hwkhx\") pod \"auto-csr-approver-29552206-q6ncf\" (UID: \"0f9fcab9-d38b-401c-96da-163b9077f329\") " pod="openshift-infra/auto-csr-approver-29552206-q6ncf" Mar 10 08:46:00 crc kubenswrapper[4825]: I0310 08:46:00.271918 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwkhx\" (UniqueName: \"kubernetes.io/projected/0f9fcab9-d38b-401c-96da-163b9077f329-kube-api-access-hwkhx\") pod \"auto-csr-approver-29552206-q6ncf\" (UID: 
\"0f9fcab9-d38b-401c-96da-163b9077f329\") " pod="openshift-infra/auto-csr-approver-29552206-q6ncf" Mar 10 08:46:00 crc kubenswrapper[4825]: I0310 08:46:00.467271 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552206-q6ncf" Mar 10 08:46:00 crc kubenswrapper[4825]: I0310 08:46:00.895827 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552206-q6ncf"] Mar 10 08:46:01 crc kubenswrapper[4825]: I0310 08:46:01.189065 4825 generic.go:334] "Generic (PLEG): container finished" podID="9f21dc2a-195c-4bf3-82fe-be871a275ae3" containerID="98723ee1f9807d14378b325224e6b0bd801764706724afa88e1bbf0abf2cc3d7" exitCode=0 Mar 10 08:46:01 crc kubenswrapper[4825]: I0310 08:46:01.189164 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-7v9mc" event={"ID":"9f21dc2a-195c-4bf3-82fe-be871a275ae3","Type":"ContainerDied","Data":"98723ee1f9807d14378b325224e6b0bd801764706724afa88e1bbf0abf2cc3d7"} Mar 10 08:46:01 crc kubenswrapper[4825]: I0310 08:46:01.190654 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552206-q6ncf" event={"ID":"0f9fcab9-d38b-401c-96da-163b9077f329","Type":"ContainerStarted","Data":"071879c4382f0e1b7044b8fe109dc1690b558521101f01256a5e15a9feeb8a11"} Mar 10 08:46:02 crc kubenswrapper[4825]: I0310 08:46:02.637727 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-7v9mc" Mar 10 08:46:02 crc kubenswrapper[4825]: I0310 08:46:02.810113 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/9f21dc2a-195c-4bf3-82fe-be871a275ae3-ssh-key-openstack-cell1\") pod \"9f21dc2a-195c-4bf3-82fe-be871a275ae3\" (UID: \"9f21dc2a-195c-4bf3-82fe-be871a275ae3\") " Mar 10 08:46:02 crc kubenswrapper[4825]: I0310 08:46:02.810192 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f21dc2a-195c-4bf3-82fe-be871a275ae3-inventory\") pod \"9f21dc2a-195c-4bf3-82fe-be871a275ae3\" (UID: \"9f21dc2a-195c-4bf3-82fe-be871a275ae3\") " Mar 10 08:46:02 crc kubenswrapper[4825]: I0310 08:46:02.810466 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4sfgf\" (UniqueName: \"kubernetes.io/projected/9f21dc2a-195c-4bf3-82fe-be871a275ae3-kube-api-access-4sfgf\") pod \"9f21dc2a-195c-4bf3-82fe-be871a275ae3\" (UID: \"9f21dc2a-195c-4bf3-82fe-be871a275ae3\") " Mar 10 08:46:02 crc kubenswrapper[4825]: I0310 08:46:02.814956 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f21dc2a-195c-4bf3-82fe-be871a275ae3-kube-api-access-4sfgf" (OuterVolumeSpecName: "kube-api-access-4sfgf") pod "9f21dc2a-195c-4bf3-82fe-be871a275ae3" (UID: "9f21dc2a-195c-4bf3-82fe-be871a275ae3"). InnerVolumeSpecName "kube-api-access-4sfgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:46:02 crc kubenswrapper[4825]: I0310 08:46:02.839333 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f21dc2a-195c-4bf3-82fe-be871a275ae3-inventory" (OuterVolumeSpecName: "inventory") pod "9f21dc2a-195c-4bf3-82fe-be871a275ae3" (UID: "9f21dc2a-195c-4bf3-82fe-be871a275ae3"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:46:02 crc kubenswrapper[4825]: I0310 08:46:02.839579 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f21dc2a-195c-4bf3-82fe-be871a275ae3-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "9f21dc2a-195c-4bf3-82fe-be871a275ae3" (UID: "9f21dc2a-195c-4bf3-82fe-be871a275ae3"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:46:02 crc kubenswrapper[4825]: I0310 08:46:02.913157 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4sfgf\" (UniqueName: \"kubernetes.io/projected/9f21dc2a-195c-4bf3-82fe-be871a275ae3-kube-api-access-4sfgf\") on node \"crc\" DevicePath \"\"" Mar 10 08:46:02 crc kubenswrapper[4825]: I0310 08:46:02.913190 4825 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/9f21dc2a-195c-4bf3-82fe-be871a275ae3-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 10 08:46:02 crc kubenswrapper[4825]: I0310 08:46:02.913200 4825 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f21dc2a-195c-4bf3-82fe-be871a275ae3-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 08:46:03 crc kubenswrapper[4825]: I0310 08:46:03.209076 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-7v9mc" event={"ID":"9f21dc2a-195c-4bf3-82fe-be871a275ae3","Type":"ContainerDied","Data":"d6bfd4b9b07759ccfc2fb7acfe0342e3b0fc58a50f94f5b39e8e904975cded2f"} Mar 10 08:46:03 crc kubenswrapper[4825]: I0310 08:46:03.209112 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6bfd4b9b07759ccfc2fb7acfe0342e3b0fc58a50f94f5b39e8e904975cded2f" Mar 10 08:46:03 crc kubenswrapper[4825]: I0310 08:46:03.209090 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-7v9mc" Mar 10 08:46:03 crc kubenswrapper[4825]: I0310 08:46:03.210758 4825 generic.go:334] "Generic (PLEG): container finished" podID="0f9fcab9-d38b-401c-96da-163b9077f329" containerID="a99816d89d8ad7aa4564437141d90d96a4829083d63382270ac6a376cdb2c0f3" exitCode=0 Mar 10 08:46:03 crc kubenswrapper[4825]: I0310 08:46:03.210784 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552206-q6ncf" event={"ID":"0f9fcab9-d38b-401c-96da-163b9077f329","Type":"ContainerDied","Data":"a99816d89d8ad7aa4564437141d90d96a4829083d63382270ac6a376cdb2c0f3"} Mar 10 08:46:03 crc kubenswrapper[4825]: I0310 08:46:03.308594 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-ljb4q"] Mar 10 08:46:03 crc kubenswrapper[4825]: E0310 08:46:03.309123 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f21dc2a-195c-4bf3-82fe-be871a275ae3" containerName="run-os-openstack-openstack-cell1" Mar 10 08:46:03 crc kubenswrapper[4825]: I0310 08:46:03.309480 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f21dc2a-195c-4bf3-82fe-be871a275ae3" containerName="run-os-openstack-openstack-cell1" Mar 10 08:46:03 crc kubenswrapper[4825]: I0310 08:46:03.309734 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f21dc2a-195c-4bf3-82fe-be871a275ae3" containerName="run-os-openstack-openstack-cell1" Mar 10 08:46:03 crc kubenswrapper[4825]: I0310 08:46:03.310546 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-ljb4q" Mar 10 08:46:03 crc kubenswrapper[4825]: I0310 08:46:03.314546 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 08:46:03 crc kubenswrapper[4825]: I0310 08:46:03.314768 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 10 08:46:03 crc kubenswrapper[4825]: I0310 08:46:03.314942 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 10 08:46:03 crc kubenswrapper[4825]: I0310 08:46:03.315266 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-5twvv" Mar 10 08:46:03 crc kubenswrapper[4825]: I0310 08:46:03.320433 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-ljb4q"] Mar 10 08:46:03 crc kubenswrapper[4825]: I0310 08:46:03.422282 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8-inventory\") pod \"reboot-os-openstack-openstack-cell1-ljb4q\" (UID: \"5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8\") " pod="openstack/reboot-os-openstack-openstack-cell1-ljb4q" Mar 10 08:46:03 crc kubenswrapper[4825]: I0310 08:46:03.422327 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-ljb4q\" (UID: \"5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8\") " pod="openstack/reboot-os-openstack-openstack-cell1-ljb4q" Mar 10 08:46:03 crc kubenswrapper[4825]: I0310 08:46:03.422457 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-k8blf\" (UniqueName: \"kubernetes.io/projected/5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8-kube-api-access-k8blf\") pod \"reboot-os-openstack-openstack-cell1-ljb4q\" (UID: \"5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8\") " pod="openstack/reboot-os-openstack-openstack-cell1-ljb4q" Mar 10 08:46:03 crc kubenswrapper[4825]: I0310 08:46:03.525215 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8-inventory\") pod \"reboot-os-openstack-openstack-cell1-ljb4q\" (UID: \"5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8\") " pod="openstack/reboot-os-openstack-openstack-cell1-ljb4q" Mar 10 08:46:03 crc kubenswrapper[4825]: I0310 08:46:03.525511 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-ljb4q\" (UID: \"5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8\") " pod="openstack/reboot-os-openstack-openstack-cell1-ljb4q" Mar 10 08:46:03 crc kubenswrapper[4825]: I0310 08:46:03.525702 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8blf\" (UniqueName: \"kubernetes.io/projected/5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8-kube-api-access-k8blf\") pod \"reboot-os-openstack-openstack-cell1-ljb4q\" (UID: \"5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8\") " pod="openstack/reboot-os-openstack-openstack-cell1-ljb4q" Mar 10 08:46:03 crc kubenswrapper[4825]: I0310 08:46:03.529389 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8-inventory\") pod \"reboot-os-openstack-openstack-cell1-ljb4q\" (UID: \"5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8\") " pod="openstack/reboot-os-openstack-openstack-cell1-ljb4q" Mar 10 08:46:03 crc kubenswrapper[4825]: I0310 
08:46:03.538793 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-ljb4q\" (UID: \"5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8\") " pod="openstack/reboot-os-openstack-openstack-cell1-ljb4q" Mar 10 08:46:03 crc kubenswrapper[4825]: I0310 08:46:03.542278 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8blf\" (UniqueName: \"kubernetes.io/projected/5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8-kube-api-access-k8blf\") pod \"reboot-os-openstack-openstack-cell1-ljb4q\" (UID: \"5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8\") " pod="openstack/reboot-os-openstack-openstack-cell1-ljb4q" Mar 10 08:46:03 crc kubenswrapper[4825]: I0310 08:46:03.634956 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-ljb4q" Mar 10 08:46:04 crc kubenswrapper[4825]: I0310 08:46:04.140121 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-ljb4q"] Mar 10 08:46:04 crc kubenswrapper[4825]: I0310 08:46:04.221421 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-ljb4q" event={"ID":"5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8","Type":"ContainerStarted","Data":"8552f76b88e13f343478227690bf6814ec5738fcf6f662f7fba794da5a1dfeb7"} Mar 10 08:46:04 crc kubenswrapper[4825]: I0310 08:46:04.451608 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552206-q6ncf" Mar 10 08:46:04 crc kubenswrapper[4825]: I0310 08:46:04.547509 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwkhx\" (UniqueName: \"kubernetes.io/projected/0f9fcab9-d38b-401c-96da-163b9077f329-kube-api-access-hwkhx\") pod \"0f9fcab9-d38b-401c-96da-163b9077f329\" (UID: \"0f9fcab9-d38b-401c-96da-163b9077f329\") " Mar 10 08:46:04 crc kubenswrapper[4825]: I0310 08:46:04.553265 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f9fcab9-d38b-401c-96da-163b9077f329-kube-api-access-hwkhx" (OuterVolumeSpecName: "kube-api-access-hwkhx") pod "0f9fcab9-d38b-401c-96da-163b9077f329" (UID: "0f9fcab9-d38b-401c-96da-163b9077f329"). InnerVolumeSpecName "kube-api-access-hwkhx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:46:04 crc kubenswrapper[4825]: I0310 08:46:04.649707 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwkhx\" (UniqueName: \"kubernetes.io/projected/0f9fcab9-d38b-401c-96da-163b9077f329-kube-api-access-hwkhx\") on node \"crc\" DevicePath \"\"" Mar 10 08:46:05 crc kubenswrapper[4825]: I0310 08:46:05.233932 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552206-q6ncf" event={"ID":"0f9fcab9-d38b-401c-96da-163b9077f329","Type":"ContainerDied","Data":"071879c4382f0e1b7044b8fe109dc1690b558521101f01256a5e15a9feeb8a11"} Mar 10 08:46:05 crc kubenswrapper[4825]: I0310 08:46:05.233977 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552206-q6ncf" Mar 10 08:46:05 crc kubenswrapper[4825]: I0310 08:46:05.233988 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="071879c4382f0e1b7044b8fe109dc1690b558521101f01256a5e15a9feeb8a11" Mar 10 08:46:05 crc kubenswrapper[4825]: I0310 08:46:05.235379 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-ljb4q" event={"ID":"5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8","Type":"ContainerStarted","Data":"f8bab8da5279a5824e6d5470a22b992e9be0ab0f52ed3f16e0fb821153bcf8c8"} Mar 10 08:46:05 crc kubenswrapper[4825]: I0310 08:46:05.259995 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-openstack-openstack-cell1-ljb4q" podStartSLOduration=1.542025587 podStartE2EDuration="2.259977558s" podCreationTimestamp="2026-03-10 08:46:03 +0000 UTC" firstStartedPulling="2026-03-10 08:46:04.142520355 +0000 UTC m=+7317.172300970" lastFinishedPulling="2026-03-10 08:46:04.860472326 +0000 UTC m=+7317.890252941" observedRunningTime="2026-03-10 08:46:05.250448426 +0000 UTC m=+7318.280229051" watchObservedRunningTime="2026-03-10 08:46:05.259977558 +0000 UTC m=+7318.289758163" Mar 10 08:46:05 crc kubenswrapper[4825]: I0310 08:46:05.528387 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552200-8hw26"] Mar 10 08:46:05 crc kubenswrapper[4825]: I0310 08:46:05.542836 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552200-8hw26"] Mar 10 08:46:07 crc kubenswrapper[4825]: I0310 08:46:07.250932 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e370149-7b09-44f2-a363-38d1fddc8c60" path="/var/lib/kubelet/pods/5e370149-7b09-44f2-a363-38d1fddc8c60/volumes" Mar 10 08:46:16 crc kubenswrapper[4825]: I0310 08:46:16.891383 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 08:46:16 crc kubenswrapper[4825]: I0310 08:46:16.891898 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 08:46:16 crc kubenswrapper[4825]: I0310 08:46:16.891942 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" Mar 10 08:46:16 crc kubenswrapper[4825]: I0310 08:46:16.892794 4825 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a77ec3023a65c7544ad73bb812378227e2a33544cb0a071029aa0eecd3038079"} pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 08:46:16 crc kubenswrapper[4825]: I0310 08:46:16.892841 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" containerID="cri-o://a77ec3023a65c7544ad73bb812378227e2a33544cb0a071029aa0eecd3038079" gracePeriod=600 Mar 10 08:46:17 crc kubenswrapper[4825]: I0310 08:46:17.408322 4825 generic.go:334] "Generic (PLEG): container finished" podID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerID="a77ec3023a65c7544ad73bb812378227e2a33544cb0a071029aa0eecd3038079" exitCode=0 Mar 10 08:46:17 crc kubenswrapper[4825]: I0310 08:46:17.408404 4825 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerDied","Data":"a77ec3023a65c7544ad73bb812378227e2a33544cb0a071029aa0eecd3038079"} Mar 10 08:46:17 crc kubenswrapper[4825]: I0310 08:46:17.408679 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerStarted","Data":"2d7f4d948979ee71550b4af3d706ff94e76881dcdf895270cec84c86069dd8fe"} Mar 10 08:46:17 crc kubenswrapper[4825]: I0310 08:46:17.408709 4825 scope.go:117] "RemoveContainer" containerID="e35b101378bc0043b3ed9549d6a5f8744f6f01d05a4da2d7392043a03928772d" Mar 10 08:46:21 crc kubenswrapper[4825]: I0310 08:46:21.449033 4825 generic.go:334] "Generic (PLEG): container finished" podID="5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8" containerID="f8bab8da5279a5824e6d5470a22b992e9be0ab0f52ed3f16e0fb821153bcf8c8" exitCode=0 Mar 10 08:46:21 crc kubenswrapper[4825]: I0310 08:46:21.449184 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-ljb4q" event={"ID":"5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8","Type":"ContainerDied","Data":"f8bab8da5279a5824e6d5470a22b992e9be0ab0f52ed3f16e0fb821153bcf8c8"} Mar 10 08:46:22 crc kubenswrapper[4825]: I0310 08:46:22.879756 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-ljb4q" Mar 10 08:46:22 crc kubenswrapper[4825]: I0310 08:46:22.960227 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8-ssh-key-openstack-cell1\") pod \"5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8\" (UID: \"5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8\") " Mar 10 08:46:22 crc kubenswrapper[4825]: I0310 08:46:22.960336 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8-inventory\") pod \"5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8\" (UID: \"5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8\") " Mar 10 08:46:22 crc kubenswrapper[4825]: I0310 08:46:22.960438 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8blf\" (UniqueName: \"kubernetes.io/projected/5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8-kube-api-access-k8blf\") pod \"5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8\" (UID: \"5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8\") " Mar 10 08:46:22 crc kubenswrapper[4825]: I0310 08:46:22.967167 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8-kube-api-access-k8blf" (OuterVolumeSpecName: "kube-api-access-k8blf") pod "5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8" (UID: "5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8"). InnerVolumeSpecName "kube-api-access-k8blf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:46:22 crc kubenswrapper[4825]: I0310 08:46:22.988988 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8" (UID: "5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8"). 
InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:46:22 crc kubenswrapper[4825]: I0310 08:46:22.992758 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8-inventory" (OuterVolumeSpecName: "inventory") pod "5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8" (UID: "5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.062596 4825 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.062634 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8blf\" (UniqueName: \"kubernetes.io/projected/5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8-kube-api-access-k8blf\") on node \"crc\" DevicePath \"\"" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.062673 4825 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.492419 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-ljb4q" event={"ID":"5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8","Type":"ContainerDied","Data":"8552f76b88e13f343478227690bf6814ec5738fcf6f662f7fba794da5a1dfeb7"} Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.492758 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8552f76b88e13f343478227690bf6814ec5738fcf6f662f7fba794da5a1dfeb7" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.492846 4825 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-ljb4q" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.559822 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-p52j5"] Mar 10 08:46:23 crc kubenswrapper[4825]: E0310 08:46:23.560287 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f9fcab9-d38b-401c-96da-163b9077f329" containerName="oc" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.560304 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f9fcab9-d38b-401c-96da-163b9077f329" containerName="oc" Mar 10 08:46:23 crc kubenswrapper[4825]: E0310 08:46:23.560328 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8" containerName="reboot-os-openstack-openstack-cell1" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.560336 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8" containerName="reboot-os-openstack-openstack-cell1" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.560505 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8" containerName="reboot-os-openstack-openstack-cell1" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.560524 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f9fcab9-d38b-401c-96da-163b9077f329" containerName="oc" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.561291 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.565223 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.565599 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-libvirt-default-certs-0" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.565777 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-5twvv" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.565917 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-telemetry-default-certs-0" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.566022 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-neutron-metadata-default-certs-0" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.566118 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-ovn-default-certs-0" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.566234 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.566444 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.592096 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-p52j5"] Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.682803 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/bd05ab73-33e3-4441-80f1-47ecc62c610e-openstack-cell1-libvirt-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-p52j5\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.682874 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-p52j5\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.682915 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-p52j5\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.682947 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-inventory\") pod \"install-certs-openstack-openstack-cell1-p52j5\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.683023 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-p52j5\" 
(UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.683073 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bd05ab73-33e3-4441-80f1-47ecc62c610e-openstack-cell1-ovn-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-p52j5\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.683310 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-p52j5\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.683445 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rb2p\" (UniqueName: \"kubernetes.io/projected/bd05ab73-33e3-4441-80f1-47ecc62c610e-kube-api-access-5rb2p\") pod \"install-certs-openstack-openstack-cell1-p52j5\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.683571 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-p52j5\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:23 crc 
kubenswrapper[4825]: I0310 08:46:23.683668 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-p52j5\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.683714 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bd05ab73-33e3-4441-80f1-47ecc62c610e-openstack-cell1-telemetry-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-p52j5\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.683803 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bd05ab73-33e3-4441-80f1-47ecc62c610e-openstack-cell1-neutron-metadata-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-p52j5\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.683864 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-p52j5\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.683891 4825 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-p52j5\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.683946 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-p52j5\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.785709 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-p52j5\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.785758 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bd05ab73-33e3-4441-80f1-47ecc62c610e-openstack-cell1-telemetry-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-p52j5\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.785837 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/bd05ab73-33e3-4441-80f1-47ecc62c610e-openstack-cell1-neutron-metadata-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-p52j5\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.785858 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-p52j5\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.785890 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-p52j5\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.785952 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-p52j5\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.785973 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bd05ab73-33e3-4441-80f1-47ecc62c610e-openstack-cell1-libvirt-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-p52j5\" 
(UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.785997 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-p52j5\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.786020 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-p52j5\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.786042 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-inventory\") pod \"install-certs-openstack-openstack-cell1-p52j5\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.786069 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-p52j5\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.786105 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bd05ab73-33e3-4441-80f1-47ecc62c610e-openstack-cell1-ovn-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-p52j5\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.786177 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-p52j5\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.786207 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rb2p\" (UniqueName: \"kubernetes.io/projected/bd05ab73-33e3-4441-80f1-47ecc62c610e-kube-api-access-5rb2p\") pod \"install-certs-openstack-openstack-cell1-p52j5\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.786239 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-p52j5\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.791278 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bd05ab73-33e3-4441-80f1-47ecc62c610e-openstack-cell1-neutron-metadata-default-certs-0\") 
pod \"install-certs-openstack-openstack-cell1-p52j5\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.792537 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-p52j5\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.792993 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-p52j5\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.793411 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bd05ab73-33e3-4441-80f1-47ecc62c610e-openstack-cell1-telemetry-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-p52j5\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.793597 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-p52j5\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 
08:46:23.795308 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-p52j5\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.795630 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-p52j5\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.796287 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bd05ab73-33e3-4441-80f1-47ecc62c610e-openstack-cell1-ovn-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-p52j5\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.796491 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-p52j5\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.796677 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-p52j5\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.797250 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-p52j5\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.802548 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-inventory\") pod \"install-certs-openstack-openstack-cell1-p52j5\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.803961 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rb2p\" (UniqueName: \"kubernetes.io/projected/bd05ab73-33e3-4441-80f1-47ecc62c610e-kube-api-access-5rb2p\") pod \"install-certs-openstack-openstack-cell1-p52j5\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.804828 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-p52j5\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:23 crc 
kubenswrapper[4825]: I0310 08:46:23.805134 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bd05ab73-33e3-4441-80f1-47ecc62c610e-openstack-cell1-libvirt-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-p52j5\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:23 crc kubenswrapper[4825]: I0310 08:46:23.880196 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:46:24 crc kubenswrapper[4825]: I0310 08:46:24.467644 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-p52j5"] Mar 10 08:46:24 crc kubenswrapper[4825]: I0310 08:46:24.501754 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-p52j5" event={"ID":"bd05ab73-33e3-4441-80f1-47ecc62c610e","Type":"ContainerStarted","Data":"a6987d266a06a7388aa8ecf4a6cbe16d8ab170881fec3ec77584a9d0a6d4021e"} Mar 10 08:46:25 crc kubenswrapper[4825]: I0310 08:46:25.513292 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-p52j5" event={"ID":"bd05ab73-33e3-4441-80f1-47ecc62c610e","Type":"ContainerStarted","Data":"7e0d0e08a9f290a81e86c913ccaa8ee450a5af06e20edc4d9ee74f82d8e5583c"} Mar 10 08:46:25 crc kubenswrapper[4825]: I0310 08:46:25.538356 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-cell1-p52j5" podStartSLOduration=1.983315671 podStartE2EDuration="2.53833223s" podCreationTimestamp="2026-03-10 08:46:23 +0000 UTC" firstStartedPulling="2026-03-10 08:46:24.476688 +0000 UTC m=+7337.506468615" lastFinishedPulling="2026-03-10 08:46:25.031704549 +0000 UTC m=+7338.061485174" observedRunningTime="2026-03-10 
08:46:25.535517625 +0000 UTC m=+7338.565298260" watchObservedRunningTime="2026-03-10 08:46:25.53833223 +0000 UTC m=+7338.568112845" Mar 10 08:46:54 crc kubenswrapper[4825]: I0310 08:46:54.072201 4825 scope.go:117] "RemoveContainer" containerID="5bbb4b60ecd62a2318f6533e7219a6a2242036b3dae5a4e0430b38f439a078c2" Mar 10 08:47:01 crc kubenswrapper[4825]: I0310 08:47:01.985694 4825 generic.go:334] "Generic (PLEG): container finished" podID="bd05ab73-33e3-4441-80f1-47ecc62c610e" containerID="7e0d0e08a9f290a81e86c913ccaa8ee450a5af06e20edc4d9ee74f82d8e5583c" exitCode=0 Mar 10 08:47:01 crc kubenswrapper[4825]: I0310 08:47:01.985867 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-p52j5" event={"ID":"bd05ab73-33e3-4441-80f1-47ecc62c610e","Type":"ContainerDied","Data":"7e0d0e08a9f290a81e86c913ccaa8ee450a5af06e20edc4d9ee74f82d8e5583c"} Mar 10 08:47:03 crc kubenswrapper[4825]: I0310 08:47:03.410250 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:47:03 crc kubenswrapper[4825]: I0310 08:47:03.476680 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bd05ab73-33e3-4441-80f1-47ecc62c610e-openstack-cell1-neutron-metadata-default-certs-0\") pod \"bd05ab73-33e3-4441-80f1-47ecc62c610e\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " Mar 10 08:47:03 crc kubenswrapper[4825]: I0310 08:47:03.476734 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rb2p\" (UniqueName: \"kubernetes.io/projected/bd05ab73-33e3-4441-80f1-47ecc62c610e-kube-api-access-5rb2p\") pod \"bd05ab73-33e3-4441-80f1-47ecc62c610e\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " Mar 10 08:47:03 crc kubenswrapper[4825]: I0310 08:47:03.476766 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-libvirt-combined-ca-bundle\") pod \"bd05ab73-33e3-4441-80f1-47ecc62c610e\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " Mar 10 08:47:03 crc kubenswrapper[4825]: I0310 08:47:03.476832 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-bootstrap-combined-ca-bundle\") pod \"bd05ab73-33e3-4441-80f1-47ecc62c610e\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " Mar 10 08:47:03 crc kubenswrapper[4825]: I0310 08:47:03.476857 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-ssh-key-openstack-cell1\") pod \"bd05ab73-33e3-4441-80f1-47ecc62c610e\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " 
Mar 10 08:47:03 crc kubenswrapper[4825]: I0310 08:47:03.476881 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-ovn-combined-ca-bundle\") pod \"bd05ab73-33e3-4441-80f1-47ecc62c610e\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " Mar 10 08:47:03 crc kubenswrapper[4825]: I0310 08:47:03.476961 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bd05ab73-33e3-4441-80f1-47ecc62c610e-openstack-cell1-telemetry-default-certs-0\") pod \"bd05ab73-33e3-4441-80f1-47ecc62c610e\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " Mar 10 08:47:03 crc kubenswrapper[4825]: I0310 08:47:03.477069 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-neutron-dhcp-combined-ca-bundle\") pod \"bd05ab73-33e3-4441-80f1-47ecc62c610e\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " Mar 10 08:47:03 crc kubenswrapper[4825]: I0310 08:47:03.477102 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-nova-combined-ca-bundle\") pod \"bd05ab73-33e3-4441-80f1-47ecc62c610e\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " Mar 10 08:47:03 crc kubenswrapper[4825]: I0310 08:47:03.477202 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bd05ab73-33e3-4441-80f1-47ecc62c610e-openstack-cell1-ovn-default-certs-0\") pod \"bd05ab73-33e3-4441-80f1-47ecc62c610e\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " Mar 10 08:47:03 crc kubenswrapper[4825]: I0310 08:47:03.477263 4825 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-neutron-metadata-combined-ca-bundle\") pod \"bd05ab73-33e3-4441-80f1-47ecc62c610e\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " Mar 10 08:47:03 crc kubenswrapper[4825]: I0310 08:47:03.477290 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-telemetry-combined-ca-bundle\") pod \"bd05ab73-33e3-4441-80f1-47ecc62c610e\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " Mar 10 08:47:03 crc kubenswrapper[4825]: I0310 08:47:03.477309 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-neutron-sriov-combined-ca-bundle\") pod \"bd05ab73-33e3-4441-80f1-47ecc62c610e\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " Mar 10 08:47:03 crc kubenswrapper[4825]: I0310 08:47:03.477335 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bd05ab73-33e3-4441-80f1-47ecc62c610e-openstack-cell1-libvirt-default-certs-0\") pod \"bd05ab73-33e3-4441-80f1-47ecc62c610e\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " Mar 10 08:47:03 crc kubenswrapper[4825]: I0310 08:47:03.477370 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-inventory\") pod \"bd05ab73-33e3-4441-80f1-47ecc62c610e\" (UID: \"bd05ab73-33e3-4441-80f1-47ecc62c610e\") " Mar 10 08:47:03 crc kubenswrapper[4825]: I0310 08:47:03.483015 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/bd05ab73-33e3-4441-80f1-47ecc62c610e-openstack-cell1-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-cell1-ovn-default-certs-0") pod "bd05ab73-33e3-4441-80f1-47ecc62c610e" (UID: "bd05ab73-33e3-4441-80f1-47ecc62c610e"). InnerVolumeSpecName "openstack-cell1-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:47:03 crc kubenswrapper[4825]: I0310 08:47:03.484193 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "bd05ab73-33e3-4441-80f1-47ecc62c610e" (UID: "bd05ab73-33e3-4441-80f1-47ecc62c610e"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:47:03 crc kubenswrapper[4825]: I0310 08:47:03.484474 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd05ab73-33e3-4441-80f1-47ecc62c610e-openstack-cell1-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-cell1-neutron-metadata-default-certs-0") pod "bd05ab73-33e3-4441-80f1-47ecc62c610e" (UID: "bd05ab73-33e3-4441-80f1-47ecc62c610e"). InnerVolumeSpecName "openstack-cell1-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:47:03 crc kubenswrapper[4825]: I0310 08:47:03.485972 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd05ab73-33e3-4441-80f1-47ecc62c610e-kube-api-access-5rb2p" (OuterVolumeSpecName: "kube-api-access-5rb2p") pod "bd05ab73-33e3-4441-80f1-47ecc62c610e" (UID: "bd05ab73-33e3-4441-80f1-47ecc62c610e"). InnerVolumeSpecName "kube-api-access-5rb2p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:47:03 crc kubenswrapper[4825]: I0310 08:47:03.486419 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "bd05ab73-33e3-4441-80f1-47ecc62c610e" (UID: "bd05ab73-33e3-4441-80f1-47ecc62c610e"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:47:03 crc kubenswrapper[4825]: I0310 08:47:03.486902 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "bd05ab73-33e3-4441-80f1-47ecc62c610e" (UID: "bd05ab73-33e3-4441-80f1-47ecc62c610e"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:47:03 crc kubenswrapper[4825]: I0310 08:47:03.487611 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "bd05ab73-33e3-4441-80f1-47ecc62c610e" (UID: "bd05ab73-33e3-4441-80f1-47ecc62c610e"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:47:03 crc kubenswrapper[4825]: I0310 08:47:03.488290 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "bd05ab73-33e3-4441-80f1-47ecc62c610e" (UID: "bd05ab73-33e3-4441-80f1-47ecc62c610e"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:47:03 crc kubenswrapper[4825]: I0310 08:47:03.488458 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "bd05ab73-33e3-4441-80f1-47ecc62c610e" (UID: "bd05ab73-33e3-4441-80f1-47ecc62c610e"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:47:03 crc kubenswrapper[4825]: I0310 08:47:03.488539 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "bd05ab73-33e3-4441-80f1-47ecc62c610e" (UID: "bd05ab73-33e3-4441-80f1-47ecc62c610e"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:47:03 crc kubenswrapper[4825]: I0310 08:47:03.489388 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd05ab73-33e3-4441-80f1-47ecc62c610e-openstack-cell1-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-cell1-telemetry-default-certs-0") pod "bd05ab73-33e3-4441-80f1-47ecc62c610e" (UID: "bd05ab73-33e3-4441-80f1-47ecc62c610e"). InnerVolumeSpecName "openstack-cell1-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:47:03 crc kubenswrapper[4825]: I0310 08:47:03.503199 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd05ab73-33e3-4441-80f1-47ecc62c610e-openstack-cell1-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-cell1-libvirt-default-certs-0") pod "bd05ab73-33e3-4441-80f1-47ecc62c610e" (UID: "bd05ab73-33e3-4441-80f1-47ecc62c610e"). 
InnerVolumeSpecName "openstack-cell1-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:47:03 crc kubenswrapper[4825]: I0310 08:47:03.504419 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "bd05ab73-33e3-4441-80f1-47ecc62c610e" (UID: "bd05ab73-33e3-4441-80f1-47ecc62c610e"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:47:03 crc kubenswrapper[4825]: I0310 08:47:03.513199 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-inventory" (OuterVolumeSpecName: "inventory") pod "bd05ab73-33e3-4441-80f1-47ecc62c610e" (UID: "bd05ab73-33e3-4441-80f1-47ecc62c610e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:47:03 crc kubenswrapper[4825]: I0310 08:47:03.531237 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "bd05ab73-33e3-4441-80f1-47ecc62c610e" (UID: "bd05ab73-33e3-4441-80f1-47ecc62c610e"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:47:03 crc kubenswrapper[4825]: I0310 08:47:03.580249 4825 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:47:03 crc kubenswrapper[4825]: I0310 08:47:03.580291 4825 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:47:03 crc kubenswrapper[4825]: I0310 08:47:03.580303 4825 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bd05ab73-33e3-4441-80f1-47ecc62c610e-openstack-cell1-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 10 08:47:03 crc kubenswrapper[4825]: I0310 08:47:03.580321 4825 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:47:03 crc kubenswrapper[4825]: I0310 08:47:03.580342 4825 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:47:03 crc kubenswrapper[4825]: I0310 08:47:03.580353 4825 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bd05ab73-33e3-4441-80f1-47ecc62c610e-openstack-cell1-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 10 08:47:03 crc kubenswrapper[4825]: I0310 08:47:03.580363 4825 reconciler_common.go:293] "Volume detached for volume 
\"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:47:03 crc kubenswrapper[4825]: I0310 08:47:03.580374 4825 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 08:47:03 crc kubenswrapper[4825]: I0310 08:47:03.580385 4825 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bd05ab73-33e3-4441-80f1-47ecc62c610e-openstack-cell1-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 10 08:47:03 crc kubenswrapper[4825]: I0310 08:47:03.580395 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rb2p\" (UniqueName: \"kubernetes.io/projected/bd05ab73-33e3-4441-80f1-47ecc62c610e-kube-api-access-5rb2p\") on node \"crc\" DevicePath \"\"" Mar 10 08:47:03 crc kubenswrapper[4825]: I0310 08:47:03.580405 4825 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:47:03 crc kubenswrapper[4825]: I0310 08:47:03.580414 4825 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:47:03 crc kubenswrapper[4825]: I0310 08:47:03.580422 4825 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 10 08:47:03 crc kubenswrapper[4825]: I0310 08:47:03.580432 4825 
reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd05ab73-33e3-4441-80f1-47ecc62c610e-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:47:03 crc kubenswrapper[4825]: I0310 08:47:03.580443 4825 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/bd05ab73-33e3-4441-80f1-47ecc62c610e-openstack-cell1-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 10 08:47:04 crc kubenswrapper[4825]: I0310 08:47:04.008838 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-p52j5" event={"ID":"bd05ab73-33e3-4441-80f1-47ecc62c610e","Type":"ContainerDied","Data":"a6987d266a06a7388aa8ecf4a6cbe16d8ab170881fec3ec77584a9d0a6d4021e"} Mar 10 08:47:04 crc kubenswrapper[4825]: I0310 08:47:04.008879 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6987d266a06a7388aa8ecf4a6cbe16d8ab170881fec3ec77584a9d0a6d4021e" Mar 10 08:47:04 crc kubenswrapper[4825]: I0310 08:47:04.008925 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-p52j5" Mar 10 08:47:04 crc kubenswrapper[4825]: I0310 08:47:04.114120 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-cell1-lwdhg"] Mar 10 08:47:04 crc kubenswrapper[4825]: E0310 08:47:04.114552 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd05ab73-33e3-4441-80f1-47ecc62c610e" containerName="install-certs-openstack-openstack-cell1" Mar 10 08:47:04 crc kubenswrapper[4825]: I0310 08:47:04.114574 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd05ab73-33e3-4441-80f1-47ecc62c610e" containerName="install-certs-openstack-openstack-cell1" Mar 10 08:47:04 crc kubenswrapper[4825]: I0310 08:47:04.114870 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd05ab73-33e3-4441-80f1-47ecc62c610e" containerName="install-certs-openstack-openstack-cell1" Mar 10 08:47:04 crc kubenswrapper[4825]: I0310 08:47:04.116354 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-lwdhg" Mar 10 08:47:04 crc kubenswrapper[4825]: I0310 08:47:04.121765 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-5twvv" Mar 10 08:47:04 crc kubenswrapper[4825]: I0310 08:47:04.121946 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 08:47:04 crc kubenswrapper[4825]: I0310 08:47:04.122079 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 10 08:47:04 crc kubenswrapper[4825]: I0310 08:47:04.122203 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 10 08:47:04 crc kubenswrapper[4825]: I0310 08:47:04.122396 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 10 08:47:04 crc kubenswrapper[4825]: I0310 08:47:04.133569 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-lwdhg"] Mar 10 08:47:04 crc kubenswrapper[4825]: I0310 08:47:04.192098 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-lwdhg\" (UID: \"0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46\") " pod="openstack/ovn-openstack-openstack-cell1-lwdhg" Mar 10 08:47:04 crc kubenswrapper[4825]: I0310 08:47:04.192272 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-lwdhg\" (UID: \"0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46\") " pod="openstack/ovn-openstack-openstack-cell1-lwdhg" Mar 10 08:47:04 
crc kubenswrapper[4825]: I0310 08:47:04.192316 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-lwdhg\" (UID: \"0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46\") " pod="openstack/ovn-openstack-openstack-cell1-lwdhg" Mar 10 08:47:04 crc kubenswrapper[4825]: I0310 08:47:04.192346 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46-inventory\") pod \"ovn-openstack-openstack-cell1-lwdhg\" (UID: \"0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46\") " pod="openstack/ovn-openstack-openstack-cell1-lwdhg" Mar 10 08:47:04 crc kubenswrapper[4825]: I0310 08:47:04.192367 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bskf7\" (UniqueName: \"kubernetes.io/projected/0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46-kube-api-access-bskf7\") pod \"ovn-openstack-openstack-cell1-lwdhg\" (UID: \"0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46\") " pod="openstack/ovn-openstack-openstack-cell1-lwdhg" Mar 10 08:47:04 crc kubenswrapper[4825]: I0310 08:47:04.294125 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-lwdhg\" (UID: \"0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46\") " pod="openstack/ovn-openstack-openstack-cell1-lwdhg" Mar 10 08:47:04 crc kubenswrapper[4825]: I0310 08:47:04.294242 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-lwdhg\" 
(UID: \"0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46\") " pod="openstack/ovn-openstack-openstack-cell1-lwdhg" Mar 10 08:47:04 crc kubenswrapper[4825]: I0310 08:47:04.294310 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46-inventory\") pod \"ovn-openstack-openstack-cell1-lwdhg\" (UID: \"0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46\") " pod="openstack/ovn-openstack-openstack-cell1-lwdhg" Mar 10 08:47:04 crc kubenswrapper[4825]: I0310 08:47:04.294338 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bskf7\" (UniqueName: \"kubernetes.io/projected/0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46-kube-api-access-bskf7\") pod \"ovn-openstack-openstack-cell1-lwdhg\" (UID: \"0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46\") " pod="openstack/ovn-openstack-openstack-cell1-lwdhg" Mar 10 08:47:04 crc kubenswrapper[4825]: I0310 08:47:04.294469 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-lwdhg\" (UID: \"0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46\") " pod="openstack/ovn-openstack-openstack-cell1-lwdhg" Mar 10 08:47:04 crc kubenswrapper[4825]: I0310 08:47:04.295339 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-lwdhg\" (UID: \"0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46\") " pod="openstack/ovn-openstack-openstack-cell1-lwdhg" Mar 10 08:47:04 crc kubenswrapper[4825]: I0310 08:47:04.298808 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-lwdhg\" (UID: \"0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46\") " pod="openstack/ovn-openstack-openstack-cell1-lwdhg" Mar 10 08:47:04 crc kubenswrapper[4825]: I0310 08:47:04.300609 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46-inventory\") pod \"ovn-openstack-openstack-cell1-lwdhg\" (UID: \"0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46\") " pod="openstack/ovn-openstack-openstack-cell1-lwdhg" Mar 10 08:47:04 crc kubenswrapper[4825]: I0310 08:47:04.306938 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-lwdhg\" (UID: \"0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46\") " pod="openstack/ovn-openstack-openstack-cell1-lwdhg" Mar 10 08:47:04 crc kubenswrapper[4825]: I0310 08:47:04.310309 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bskf7\" (UniqueName: \"kubernetes.io/projected/0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46-kube-api-access-bskf7\") pod \"ovn-openstack-openstack-cell1-lwdhg\" (UID: \"0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46\") " pod="openstack/ovn-openstack-openstack-cell1-lwdhg" Mar 10 08:47:04 crc kubenswrapper[4825]: I0310 08:47:04.443696 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-lwdhg" Mar 10 08:47:05 crc kubenswrapper[4825]: I0310 08:47:05.096249 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-lwdhg"] Mar 10 08:47:05 crc kubenswrapper[4825]: I0310 08:47:05.103196 4825 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 08:47:06 crc kubenswrapper[4825]: I0310 08:47:06.032425 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-lwdhg" event={"ID":"0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46","Type":"ContainerStarted","Data":"b32d06a2383c505c169e5d4cd8f992cbf306b2b5a266773f0bd17dcd9a20d91d"} Mar 10 08:47:06 crc kubenswrapper[4825]: I0310 08:47:06.032692 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-lwdhg" event={"ID":"0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46","Type":"ContainerStarted","Data":"57b48e71a5a1ea03fee3d0cb1dccc9b3eb559555ae4a2f80d787841e0771dc4f"} Mar 10 08:47:06 crc kubenswrapper[4825]: I0310 08:47:06.049992 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-cell1-lwdhg" podStartSLOduration=1.476780701 podStartE2EDuration="2.049973189s" podCreationTimestamp="2026-03-10 08:47:04 +0000 UTC" firstStartedPulling="2026-03-10 08:47:05.102848635 +0000 UTC m=+7378.132629250" lastFinishedPulling="2026-03-10 08:47:05.676041113 +0000 UTC m=+7378.705821738" observedRunningTime="2026-03-10 08:47:06.048407928 +0000 UTC m=+7379.078188553" watchObservedRunningTime="2026-03-10 08:47:06.049973189 +0000 UTC m=+7379.079753804" Mar 10 08:47:55 crc kubenswrapper[4825]: I0310 08:47:55.055191 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kgbjg"] Mar 10 08:47:55 crc kubenswrapper[4825]: I0310 08:47:55.058451 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kgbjg" Mar 10 08:47:55 crc kubenswrapper[4825]: I0310 08:47:55.067556 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kgbjg"] Mar 10 08:47:55 crc kubenswrapper[4825]: I0310 08:47:55.196098 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lwvv\" (UniqueName: \"kubernetes.io/projected/42d935d7-95d6-4915-b5f5-99a5c54e7341-kube-api-access-5lwvv\") pod \"redhat-operators-kgbjg\" (UID: \"42d935d7-95d6-4915-b5f5-99a5c54e7341\") " pod="openshift-marketplace/redhat-operators-kgbjg" Mar 10 08:47:55 crc kubenswrapper[4825]: I0310 08:47:55.196242 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42d935d7-95d6-4915-b5f5-99a5c54e7341-utilities\") pod \"redhat-operators-kgbjg\" (UID: \"42d935d7-95d6-4915-b5f5-99a5c54e7341\") " pod="openshift-marketplace/redhat-operators-kgbjg" Mar 10 08:47:55 crc kubenswrapper[4825]: I0310 08:47:55.196294 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42d935d7-95d6-4915-b5f5-99a5c54e7341-catalog-content\") pod \"redhat-operators-kgbjg\" (UID: \"42d935d7-95d6-4915-b5f5-99a5c54e7341\") " pod="openshift-marketplace/redhat-operators-kgbjg" Mar 10 08:47:55 crc kubenswrapper[4825]: I0310 08:47:55.300787 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lwvv\" (UniqueName: \"kubernetes.io/projected/42d935d7-95d6-4915-b5f5-99a5c54e7341-kube-api-access-5lwvv\") pod \"redhat-operators-kgbjg\" (UID: \"42d935d7-95d6-4915-b5f5-99a5c54e7341\") " pod="openshift-marketplace/redhat-operators-kgbjg" Mar 10 08:47:55 crc kubenswrapper[4825]: I0310 08:47:55.301047 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42d935d7-95d6-4915-b5f5-99a5c54e7341-utilities\") pod \"redhat-operators-kgbjg\" (UID: \"42d935d7-95d6-4915-b5f5-99a5c54e7341\") " pod="openshift-marketplace/redhat-operators-kgbjg" Mar 10 08:47:55 crc kubenswrapper[4825]: I0310 08:47:55.301137 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42d935d7-95d6-4915-b5f5-99a5c54e7341-catalog-content\") pod \"redhat-operators-kgbjg\" (UID: \"42d935d7-95d6-4915-b5f5-99a5c54e7341\") " pod="openshift-marketplace/redhat-operators-kgbjg" Mar 10 08:47:55 crc kubenswrapper[4825]: I0310 08:47:55.301782 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42d935d7-95d6-4915-b5f5-99a5c54e7341-catalog-content\") pod \"redhat-operators-kgbjg\" (UID: \"42d935d7-95d6-4915-b5f5-99a5c54e7341\") " pod="openshift-marketplace/redhat-operators-kgbjg" Mar 10 08:47:55 crc kubenswrapper[4825]: I0310 08:47:55.301877 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42d935d7-95d6-4915-b5f5-99a5c54e7341-utilities\") pod \"redhat-operators-kgbjg\" (UID: \"42d935d7-95d6-4915-b5f5-99a5c54e7341\") " pod="openshift-marketplace/redhat-operators-kgbjg" Mar 10 08:47:55 crc kubenswrapper[4825]: I0310 08:47:55.325218 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lwvv\" (UniqueName: \"kubernetes.io/projected/42d935d7-95d6-4915-b5f5-99a5c54e7341-kube-api-access-5lwvv\") pod \"redhat-operators-kgbjg\" (UID: \"42d935d7-95d6-4915-b5f5-99a5c54e7341\") " pod="openshift-marketplace/redhat-operators-kgbjg" Mar 10 08:47:55 crc kubenswrapper[4825]: I0310 08:47:55.381596 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kgbjg" Mar 10 08:47:55 crc kubenswrapper[4825]: I0310 08:47:55.886786 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kgbjg"] Mar 10 08:47:56 crc kubenswrapper[4825]: I0310 08:47:56.496441 4825 generic.go:334] "Generic (PLEG): container finished" podID="42d935d7-95d6-4915-b5f5-99a5c54e7341" containerID="323d8d51442098deb8f9c840d455f6575fc284b17e8bf6e51aca528cd4f0ab52" exitCode=0 Mar 10 08:47:56 crc kubenswrapper[4825]: I0310 08:47:56.496496 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kgbjg" event={"ID":"42d935d7-95d6-4915-b5f5-99a5c54e7341","Type":"ContainerDied","Data":"323d8d51442098deb8f9c840d455f6575fc284b17e8bf6e51aca528cd4f0ab52"} Mar 10 08:47:56 crc kubenswrapper[4825]: I0310 08:47:56.496791 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kgbjg" event={"ID":"42d935d7-95d6-4915-b5f5-99a5c54e7341","Type":"ContainerStarted","Data":"472a22fac775b2f931eec559920c88fbb827794594ce64bcc1995df7344831af"} Mar 10 08:47:58 crc kubenswrapper[4825]: I0310 08:47:58.521710 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kgbjg" event={"ID":"42d935d7-95d6-4915-b5f5-99a5c54e7341","Type":"ContainerStarted","Data":"5d6c38182daa5f6954e2fdab1f45e8a5c574320c7ab33b3ca96af892208645ac"} Mar 10 08:48:00 crc kubenswrapper[4825]: I0310 08:48:00.160659 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552208-sh75f"] Mar 10 08:48:00 crc kubenswrapper[4825]: I0310 08:48:00.162580 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552208-sh75f" Mar 10 08:48:00 crc kubenswrapper[4825]: I0310 08:48:00.165351 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 08:48:00 crc kubenswrapper[4825]: I0310 08:48:00.168610 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 08:48:00 crc kubenswrapper[4825]: I0310 08:48:00.168928 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 08:48:00 crc kubenswrapper[4825]: I0310 08:48:00.171964 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552208-sh75f"] Mar 10 08:48:00 crc kubenswrapper[4825]: I0310 08:48:00.238352 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6hhh\" (UniqueName: \"kubernetes.io/projected/3671c762-4fd8-4783-89f4-9d1b298717ae-kube-api-access-h6hhh\") pod \"auto-csr-approver-29552208-sh75f\" (UID: \"3671c762-4fd8-4783-89f4-9d1b298717ae\") " pod="openshift-infra/auto-csr-approver-29552208-sh75f" Mar 10 08:48:00 crc kubenswrapper[4825]: I0310 08:48:00.340757 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6hhh\" (UniqueName: \"kubernetes.io/projected/3671c762-4fd8-4783-89f4-9d1b298717ae-kube-api-access-h6hhh\") pod \"auto-csr-approver-29552208-sh75f\" (UID: \"3671c762-4fd8-4783-89f4-9d1b298717ae\") " pod="openshift-infra/auto-csr-approver-29552208-sh75f" Mar 10 08:48:00 crc kubenswrapper[4825]: I0310 08:48:00.363785 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6hhh\" (UniqueName: \"kubernetes.io/projected/3671c762-4fd8-4783-89f4-9d1b298717ae-kube-api-access-h6hhh\") pod \"auto-csr-approver-29552208-sh75f\" (UID: \"3671c762-4fd8-4783-89f4-9d1b298717ae\") " 
pod="openshift-infra/auto-csr-approver-29552208-sh75f" Mar 10 08:48:00 crc kubenswrapper[4825]: I0310 08:48:00.497828 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552208-sh75f" Mar 10 08:48:01 crc kubenswrapper[4825]: I0310 08:48:01.014104 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552208-sh75f"] Mar 10 08:48:01 crc kubenswrapper[4825]: I0310 08:48:01.550205 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552208-sh75f" event={"ID":"3671c762-4fd8-4783-89f4-9d1b298717ae","Type":"ContainerStarted","Data":"044ff8d071787cde624f14ae857e601fb630e0a2b6e655bdab12939dd2317c6d"} Mar 10 08:48:02 crc kubenswrapper[4825]: I0310 08:48:02.561953 4825 generic.go:334] "Generic (PLEG): container finished" podID="42d935d7-95d6-4915-b5f5-99a5c54e7341" containerID="5d6c38182daa5f6954e2fdab1f45e8a5c574320c7ab33b3ca96af892208645ac" exitCode=0 Mar 10 08:48:02 crc kubenswrapper[4825]: I0310 08:48:02.562007 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kgbjg" event={"ID":"42d935d7-95d6-4915-b5f5-99a5c54e7341","Type":"ContainerDied","Data":"5d6c38182daa5f6954e2fdab1f45e8a5c574320c7ab33b3ca96af892208645ac"} Mar 10 08:48:02 crc kubenswrapper[4825]: I0310 08:48:02.564928 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552208-sh75f" event={"ID":"3671c762-4fd8-4783-89f4-9d1b298717ae","Type":"ContainerStarted","Data":"46a96044ed1d1ebef3961cb16e8d0e815d88992e28ff3f052c3d3cc15824f7d5"} Mar 10 08:48:02 crc kubenswrapper[4825]: I0310 08:48:02.602223 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552208-sh75f" podStartSLOduration=1.5171615520000001 podStartE2EDuration="2.602202499s" podCreationTimestamp="2026-03-10 08:48:00 +0000 UTC" firstStartedPulling="2026-03-10 
08:48:01.026782551 +0000 UTC m=+7434.056563166" lastFinishedPulling="2026-03-10 08:48:02.111823478 +0000 UTC m=+7435.141604113" observedRunningTime="2026-03-10 08:48:02.595239335 +0000 UTC m=+7435.625019970" watchObservedRunningTime="2026-03-10 08:48:02.602202499 +0000 UTC m=+7435.631983124" Mar 10 08:48:03 crc kubenswrapper[4825]: I0310 08:48:03.574152 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kgbjg" event={"ID":"42d935d7-95d6-4915-b5f5-99a5c54e7341","Type":"ContainerStarted","Data":"3159239c0c64c453f5ca83ece0c9cafb9dc0caa7c16f4f2a82d328c51f8eaedf"} Mar 10 08:48:03 crc kubenswrapper[4825]: I0310 08:48:03.576042 4825 generic.go:334] "Generic (PLEG): container finished" podID="3671c762-4fd8-4783-89f4-9d1b298717ae" containerID="46a96044ed1d1ebef3961cb16e8d0e815d88992e28ff3f052c3d3cc15824f7d5" exitCode=0 Mar 10 08:48:03 crc kubenswrapper[4825]: I0310 08:48:03.576071 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552208-sh75f" event={"ID":"3671c762-4fd8-4783-89f4-9d1b298717ae","Type":"ContainerDied","Data":"46a96044ed1d1ebef3961cb16e8d0e815d88992e28ff3f052c3d3cc15824f7d5"} Mar 10 08:48:03 crc kubenswrapper[4825]: I0310 08:48:03.594920 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kgbjg" podStartSLOduration=2.106202374 podStartE2EDuration="8.594897157s" podCreationTimestamp="2026-03-10 08:47:55 +0000 UTC" firstStartedPulling="2026-03-10 08:47:56.498895075 +0000 UTC m=+7429.528675690" lastFinishedPulling="2026-03-10 08:48:02.987589858 +0000 UTC m=+7436.017370473" observedRunningTime="2026-03-10 08:48:03.590157322 +0000 UTC m=+7436.619937957" watchObservedRunningTime="2026-03-10 08:48:03.594897157 +0000 UTC m=+7436.624677792" Mar 10 08:48:04 crc kubenswrapper[4825]: I0310 08:48:04.984097 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552208-sh75f" Mar 10 08:48:05 crc kubenswrapper[4825]: I0310 08:48:05.145755 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6hhh\" (UniqueName: \"kubernetes.io/projected/3671c762-4fd8-4783-89f4-9d1b298717ae-kube-api-access-h6hhh\") pod \"3671c762-4fd8-4783-89f4-9d1b298717ae\" (UID: \"3671c762-4fd8-4783-89f4-9d1b298717ae\") " Mar 10 08:48:05 crc kubenswrapper[4825]: I0310 08:48:05.152177 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3671c762-4fd8-4783-89f4-9d1b298717ae-kube-api-access-h6hhh" (OuterVolumeSpecName: "kube-api-access-h6hhh") pod "3671c762-4fd8-4783-89f4-9d1b298717ae" (UID: "3671c762-4fd8-4783-89f4-9d1b298717ae"). InnerVolumeSpecName "kube-api-access-h6hhh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:48:05 crc kubenswrapper[4825]: I0310 08:48:05.247899 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6hhh\" (UniqueName: \"kubernetes.io/projected/3671c762-4fd8-4783-89f4-9d1b298717ae-kube-api-access-h6hhh\") on node \"crc\" DevicePath \"\"" Mar 10 08:48:05 crc kubenswrapper[4825]: I0310 08:48:05.381795 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kgbjg" Mar 10 08:48:05 crc kubenswrapper[4825]: I0310 08:48:05.382087 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kgbjg" Mar 10 08:48:05 crc kubenswrapper[4825]: I0310 08:48:05.593625 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552208-sh75f" Mar 10 08:48:05 crc kubenswrapper[4825]: I0310 08:48:05.593650 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552208-sh75f" event={"ID":"3671c762-4fd8-4783-89f4-9d1b298717ae","Type":"ContainerDied","Data":"044ff8d071787cde624f14ae857e601fb630e0a2b6e655bdab12939dd2317c6d"} Mar 10 08:48:05 crc kubenswrapper[4825]: I0310 08:48:05.593747 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="044ff8d071787cde624f14ae857e601fb630e0a2b6e655bdab12939dd2317c6d" Mar 10 08:48:05 crc kubenswrapper[4825]: I0310 08:48:05.674387 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552202-r7794"] Mar 10 08:48:05 crc kubenswrapper[4825]: I0310 08:48:05.683808 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552202-r7794"] Mar 10 08:48:06 crc kubenswrapper[4825]: I0310 08:48:06.427244 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kgbjg" podUID="42d935d7-95d6-4915-b5f5-99a5c54e7341" containerName="registry-server" probeResult="failure" output=< Mar 10 08:48:06 crc kubenswrapper[4825]: timeout: failed to connect service ":50051" within 1s Mar 10 08:48:06 crc kubenswrapper[4825]: > Mar 10 08:48:07 crc kubenswrapper[4825]: I0310 08:48:07.252336 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bdfaf77-bd08-4483-a7c4-ebb450e783f9" path="/var/lib/kubelet/pods/4bdfaf77-bd08-4483-a7c4-ebb450e783f9/volumes" Mar 10 08:48:07 crc kubenswrapper[4825]: I0310 08:48:07.611326 4825 generic.go:334] "Generic (PLEG): container finished" podID="0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46" containerID="b32d06a2383c505c169e5d4cd8f992cbf306b2b5a266773f0bd17dcd9a20d91d" exitCode=0 Mar 10 08:48:07 crc kubenswrapper[4825]: I0310 08:48:07.611372 4825 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ovn-openstack-openstack-cell1-lwdhg" event={"ID":"0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46","Type":"ContainerDied","Data":"b32d06a2383c505c169e5d4cd8f992cbf306b2b5a266773f0bd17dcd9a20d91d"} Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.073879 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-lwdhg" Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.230711 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46-ssh-key-openstack-cell1\") pod \"0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46\" (UID: \"0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46\") " Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.230786 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46-ovncontroller-config-0\") pod \"0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46\" (UID: \"0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46\") " Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.231407 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46-ovn-combined-ca-bundle\") pod \"0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46\" (UID: \"0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46\") " Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.231484 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46-inventory\") pod \"0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46\" (UID: \"0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46\") " Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.231622 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-bskf7\" (UniqueName: \"kubernetes.io/projected/0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46-kube-api-access-bskf7\") pod \"0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46\" (UID: \"0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46\") " Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.239506 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46" (UID: "0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.239871 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46-kube-api-access-bskf7" (OuterVolumeSpecName: "kube-api-access-bskf7") pod "0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46" (UID: "0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46"). InnerVolumeSpecName "kube-api-access-bskf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.278671 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46-inventory" (OuterVolumeSpecName: "inventory") pod "0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46" (UID: "0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.286393 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46" (UID: "0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.286462 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46" (UID: "0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.341876 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bskf7\" (UniqueName: \"kubernetes.io/projected/0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46-kube-api-access-bskf7\") on node \"crc\" DevicePath \"\"" Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.341915 4825 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.341929 4825 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.341942 4825 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.341955 4825 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.638899 4825 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-lwdhg" event={"ID":"0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46","Type":"ContainerDied","Data":"57b48e71a5a1ea03fee3d0cb1dccc9b3eb559555ae4a2f80d787841e0771dc4f"} Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.638943 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57b48e71a5a1ea03fee3d0cb1dccc9b3eb559555ae4a2f80d787841e0771dc4f" Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.639278 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-lwdhg" Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.743121 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-xcjgb"] Mar 10 08:48:09 crc kubenswrapper[4825]: E0310 08:48:09.743650 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3671c762-4fd8-4783-89f4-9d1b298717ae" containerName="oc" Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.743671 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="3671c762-4fd8-4783-89f4-9d1b298717ae" containerName="oc" Mar 10 08:48:09 crc kubenswrapper[4825]: E0310 08:48:09.743710 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46" containerName="ovn-openstack-openstack-cell1" Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.743717 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46" containerName="ovn-openstack-openstack-cell1" Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.743884 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="3671c762-4fd8-4783-89f4-9d1b298717ae" containerName="oc" Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.743898 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46" 
containerName="ovn-openstack-openstack-cell1" Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.744687 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-xcjgb" Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.749099 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.749191 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.749291 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.749428 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.749644 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-5twvv" Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.749780 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.753770 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-xcjgb"] Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.852640 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/67401e8d-825e-461a-a88d-7573c48c5918-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-xcjgb\" (UID: \"67401e8d-825e-461a-a88d-7573c48c5918\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-xcjgb" Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.853008 4825 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/67401e8d-825e-461a-a88d-7573c48c5918-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-xcjgb\" (UID: \"67401e8d-825e-461a-a88d-7573c48c5918\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-xcjgb" Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.853189 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67401e8d-825e-461a-a88d-7573c48c5918-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-xcjgb\" (UID: \"67401e8d-825e-461a-a88d-7573c48c5918\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-xcjgb" Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.853446 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/67401e8d-825e-461a-a88d-7573c48c5918-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-xcjgb\" (UID: \"67401e8d-825e-461a-a88d-7573c48c5918\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-xcjgb" Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.853597 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/67401e8d-825e-461a-a88d-7573c48c5918-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-xcjgb\" (UID: \"67401e8d-825e-461a-a88d-7573c48c5918\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-xcjgb" Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.853854 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msg59\" (UniqueName: \"kubernetes.io/projected/67401e8d-825e-461a-a88d-7573c48c5918-kube-api-access-msg59\") pod \"neutron-metadata-openstack-openstack-cell1-xcjgb\" (UID: \"67401e8d-825e-461a-a88d-7573c48c5918\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-xcjgb" Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.956051 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/67401e8d-825e-461a-a88d-7573c48c5918-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-xcjgb\" (UID: \"67401e8d-825e-461a-a88d-7573c48c5918\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-xcjgb" Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.956388 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/67401e8d-825e-461a-a88d-7573c48c5918-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-xcjgb\" (UID: \"67401e8d-825e-461a-a88d-7573c48c5918\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-xcjgb" Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.956438 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67401e8d-825e-461a-a88d-7573c48c5918-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-xcjgb\" (UID: \"67401e8d-825e-461a-a88d-7573c48c5918\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-xcjgb" Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.956483 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/67401e8d-825e-461a-a88d-7573c48c5918-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-xcjgb\" (UID: \"67401e8d-825e-461a-a88d-7573c48c5918\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-xcjgb" Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.956522 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/67401e8d-825e-461a-a88d-7573c48c5918-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-xcjgb\" (UID: \"67401e8d-825e-461a-a88d-7573c48c5918\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-xcjgb" Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.956592 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msg59\" (UniqueName: \"kubernetes.io/projected/67401e8d-825e-461a-a88d-7573c48c5918-kube-api-access-msg59\") pod \"neutron-metadata-openstack-openstack-cell1-xcjgb\" (UID: \"67401e8d-825e-461a-a88d-7573c48c5918\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-xcjgb" Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.960335 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/67401e8d-825e-461a-a88d-7573c48c5918-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-xcjgb\" (UID: \"67401e8d-825e-461a-a88d-7573c48c5918\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-xcjgb" Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.960647 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/67401e8d-825e-461a-a88d-7573c48c5918-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-xcjgb\" (UID: \"67401e8d-825e-461a-a88d-7573c48c5918\") " 
pod="openstack/neutron-metadata-openstack-openstack-cell1-xcjgb" Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.961312 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67401e8d-825e-461a-a88d-7573c48c5918-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-xcjgb\" (UID: \"67401e8d-825e-461a-a88d-7573c48c5918\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-xcjgb" Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.964761 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/67401e8d-825e-461a-a88d-7573c48c5918-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-xcjgb\" (UID: \"67401e8d-825e-461a-a88d-7573c48c5918\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-xcjgb" Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.965182 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/67401e8d-825e-461a-a88d-7573c48c5918-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-xcjgb\" (UID: \"67401e8d-825e-461a-a88d-7573c48c5918\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-xcjgb" Mar 10 08:48:09 crc kubenswrapper[4825]: I0310 08:48:09.974236 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msg59\" (UniqueName: \"kubernetes.io/projected/67401e8d-825e-461a-a88d-7573c48c5918-kube-api-access-msg59\") pod \"neutron-metadata-openstack-openstack-cell1-xcjgb\" (UID: \"67401e8d-825e-461a-a88d-7573c48c5918\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-xcjgb" Mar 10 08:48:10 crc kubenswrapper[4825]: I0310 08:48:10.082877 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-xcjgb" Mar 10 08:48:10 crc kubenswrapper[4825]: I0310 08:48:10.600987 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-xcjgb"] Mar 10 08:48:10 crc kubenswrapper[4825]: I0310 08:48:10.658441 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-xcjgb" event={"ID":"67401e8d-825e-461a-a88d-7573c48c5918","Type":"ContainerStarted","Data":"1242b4fd9b6ec925e2202ba58cb59ad611031426e6e51ea89e40a1b585068e06"} Mar 10 08:48:11 crc kubenswrapper[4825]: I0310 08:48:11.668828 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-xcjgb" event={"ID":"67401e8d-825e-461a-a88d-7573c48c5918","Type":"ContainerStarted","Data":"11743399ae8ea9e673bf5faecfef8f32957f11f857457eb023b6254e14fafb11"} Mar 10 08:48:11 crc kubenswrapper[4825]: I0310 08:48:11.692465 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-cell1-xcjgb" podStartSLOduration=2.066818768 podStartE2EDuration="2.692447202s" podCreationTimestamp="2026-03-10 08:48:09 +0000 UTC" firstStartedPulling="2026-03-10 08:48:10.609244733 +0000 UTC m=+7443.639025348" lastFinishedPulling="2026-03-10 08:48:11.234873167 +0000 UTC m=+7444.264653782" observedRunningTime="2026-03-10 08:48:11.689561136 +0000 UTC m=+7444.719341751" watchObservedRunningTime="2026-03-10 08:48:11.692447202 +0000 UTC m=+7444.722227817" Mar 10 08:48:16 crc kubenswrapper[4825]: I0310 08:48:16.426607 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kgbjg" podUID="42d935d7-95d6-4915-b5f5-99a5c54e7341" containerName="registry-server" probeResult="failure" output=< Mar 10 08:48:16 crc kubenswrapper[4825]: timeout: failed to connect service ":50051" within 1s Mar 10 08:48:16 crc 
kubenswrapper[4825]: > Mar 10 08:48:26 crc kubenswrapper[4825]: I0310 08:48:26.424168 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kgbjg" podUID="42d935d7-95d6-4915-b5f5-99a5c54e7341" containerName="registry-server" probeResult="failure" output=< Mar 10 08:48:26 crc kubenswrapper[4825]: timeout: failed to connect service ":50051" within 1s Mar 10 08:48:26 crc kubenswrapper[4825]: > Mar 10 08:48:35 crc kubenswrapper[4825]: I0310 08:48:35.457427 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kgbjg" Mar 10 08:48:35 crc kubenswrapper[4825]: I0310 08:48:35.519975 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kgbjg" Mar 10 08:48:35 crc kubenswrapper[4825]: I0310 08:48:35.691425 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kgbjg"] Mar 10 08:48:36 crc kubenswrapper[4825]: I0310 08:48:36.890373 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kgbjg" podUID="42d935d7-95d6-4915-b5f5-99a5c54e7341" containerName="registry-server" containerID="cri-o://3159239c0c64c453f5ca83ece0c9cafb9dc0caa7c16f4f2a82d328c51f8eaedf" gracePeriod=2 Mar 10 08:48:37 crc kubenswrapper[4825]: I0310 08:48:37.340276 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kgbjg" Mar 10 08:48:37 crc kubenswrapper[4825]: I0310 08:48:37.441922 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42d935d7-95d6-4915-b5f5-99a5c54e7341-catalog-content\") pod \"42d935d7-95d6-4915-b5f5-99a5c54e7341\" (UID: \"42d935d7-95d6-4915-b5f5-99a5c54e7341\") " Mar 10 08:48:37 crc kubenswrapper[4825]: I0310 08:48:37.442230 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42d935d7-95d6-4915-b5f5-99a5c54e7341-utilities\") pod \"42d935d7-95d6-4915-b5f5-99a5c54e7341\" (UID: \"42d935d7-95d6-4915-b5f5-99a5c54e7341\") " Mar 10 08:48:37 crc kubenswrapper[4825]: I0310 08:48:37.442286 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lwvv\" (UniqueName: \"kubernetes.io/projected/42d935d7-95d6-4915-b5f5-99a5c54e7341-kube-api-access-5lwvv\") pod \"42d935d7-95d6-4915-b5f5-99a5c54e7341\" (UID: \"42d935d7-95d6-4915-b5f5-99a5c54e7341\") " Mar 10 08:48:37 crc kubenswrapper[4825]: I0310 08:48:37.443211 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42d935d7-95d6-4915-b5f5-99a5c54e7341-utilities" (OuterVolumeSpecName: "utilities") pod "42d935d7-95d6-4915-b5f5-99a5c54e7341" (UID: "42d935d7-95d6-4915-b5f5-99a5c54e7341"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:48:37 crc kubenswrapper[4825]: I0310 08:48:37.451656 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42d935d7-95d6-4915-b5f5-99a5c54e7341-kube-api-access-5lwvv" (OuterVolumeSpecName: "kube-api-access-5lwvv") pod "42d935d7-95d6-4915-b5f5-99a5c54e7341" (UID: "42d935d7-95d6-4915-b5f5-99a5c54e7341"). InnerVolumeSpecName "kube-api-access-5lwvv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:48:37 crc kubenswrapper[4825]: I0310 08:48:37.544577 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42d935d7-95d6-4915-b5f5-99a5c54e7341-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 08:48:37 crc kubenswrapper[4825]: I0310 08:48:37.544608 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lwvv\" (UniqueName: \"kubernetes.io/projected/42d935d7-95d6-4915-b5f5-99a5c54e7341-kube-api-access-5lwvv\") on node \"crc\" DevicePath \"\"" Mar 10 08:48:37 crc kubenswrapper[4825]: I0310 08:48:37.570051 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42d935d7-95d6-4915-b5f5-99a5c54e7341-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "42d935d7-95d6-4915-b5f5-99a5c54e7341" (UID: "42d935d7-95d6-4915-b5f5-99a5c54e7341"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:48:37 crc kubenswrapper[4825]: I0310 08:48:37.646900 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42d935d7-95d6-4915-b5f5-99a5c54e7341-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 08:48:37 crc kubenswrapper[4825]: I0310 08:48:37.904536 4825 generic.go:334] "Generic (PLEG): container finished" podID="42d935d7-95d6-4915-b5f5-99a5c54e7341" containerID="3159239c0c64c453f5ca83ece0c9cafb9dc0caa7c16f4f2a82d328c51f8eaedf" exitCode=0 Mar 10 08:48:37 crc kubenswrapper[4825]: I0310 08:48:37.904646 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kgbjg" Mar 10 08:48:37 crc kubenswrapper[4825]: I0310 08:48:37.904681 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kgbjg" event={"ID":"42d935d7-95d6-4915-b5f5-99a5c54e7341","Type":"ContainerDied","Data":"3159239c0c64c453f5ca83ece0c9cafb9dc0caa7c16f4f2a82d328c51f8eaedf"} Mar 10 08:48:37 crc kubenswrapper[4825]: I0310 08:48:37.905275 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kgbjg" event={"ID":"42d935d7-95d6-4915-b5f5-99a5c54e7341","Type":"ContainerDied","Data":"472a22fac775b2f931eec559920c88fbb827794594ce64bcc1995df7344831af"} Mar 10 08:48:37 crc kubenswrapper[4825]: I0310 08:48:37.905297 4825 scope.go:117] "RemoveContainer" containerID="3159239c0c64c453f5ca83ece0c9cafb9dc0caa7c16f4f2a82d328c51f8eaedf" Mar 10 08:48:37 crc kubenswrapper[4825]: I0310 08:48:37.937322 4825 scope.go:117] "RemoveContainer" containerID="5d6c38182daa5f6954e2fdab1f45e8a5c574320c7ab33b3ca96af892208645ac" Mar 10 08:48:37 crc kubenswrapper[4825]: I0310 08:48:37.946889 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kgbjg"] Mar 10 08:48:37 crc kubenswrapper[4825]: I0310 08:48:37.960075 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kgbjg"] Mar 10 08:48:37 crc kubenswrapper[4825]: I0310 08:48:37.965150 4825 scope.go:117] "RemoveContainer" containerID="323d8d51442098deb8f9c840d455f6575fc284b17e8bf6e51aca528cd4f0ab52" Mar 10 08:48:38 crc kubenswrapper[4825]: I0310 08:48:38.018060 4825 scope.go:117] "RemoveContainer" containerID="3159239c0c64c453f5ca83ece0c9cafb9dc0caa7c16f4f2a82d328c51f8eaedf" Mar 10 08:48:38 crc kubenswrapper[4825]: E0310 08:48:38.020052 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"3159239c0c64c453f5ca83ece0c9cafb9dc0caa7c16f4f2a82d328c51f8eaedf\": container with ID starting with 3159239c0c64c453f5ca83ece0c9cafb9dc0caa7c16f4f2a82d328c51f8eaedf not found: ID does not exist" containerID="3159239c0c64c453f5ca83ece0c9cafb9dc0caa7c16f4f2a82d328c51f8eaedf" Mar 10 08:48:38 crc kubenswrapper[4825]: I0310 08:48:38.020101 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3159239c0c64c453f5ca83ece0c9cafb9dc0caa7c16f4f2a82d328c51f8eaedf"} err="failed to get container status \"3159239c0c64c453f5ca83ece0c9cafb9dc0caa7c16f4f2a82d328c51f8eaedf\": rpc error: code = NotFound desc = could not find container \"3159239c0c64c453f5ca83ece0c9cafb9dc0caa7c16f4f2a82d328c51f8eaedf\": container with ID starting with 3159239c0c64c453f5ca83ece0c9cafb9dc0caa7c16f4f2a82d328c51f8eaedf not found: ID does not exist" Mar 10 08:48:38 crc kubenswrapper[4825]: I0310 08:48:38.020147 4825 scope.go:117] "RemoveContainer" containerID="5d6c38182daa5f6954e2fdab1f45e8a5c574320c7ab33b3ca96af892208645ac" Mar 10 08:48:38 crc kubenswrapper[4825]: E0310 08:48:38.020416 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d6c38182daa5f6954e2fdab1f45e8a5c574320c7ab33b3ca96af892208645ac\": container with ID starting with 5d6c38182daa5f6954e2fdab1f45e8a5c574320c7ab33b3ca96af892208645ac not found: ID does not exist" containerID="5d6c38182daa5f6954e2fdab1f45e8a5c574320c7ab33b3ca96af892208645ac" Mar 10 08:48:38 crc kubenswrapper[4825]: I0310 08:48:38.020442 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d6c38182daa5f6954e2fdab1f45e8a5c574320c7ab33b3ca96af892208645ac"} err="failed to get container status \"5d6c38182daa5f6954e2fdab1f45e8a5c574320c7ab33b3ca96af892208645ac\": rpc error: code = NotFound desc = could not find container \"5d6c38182daa5f6954e2fdab1f45e8a5c574320c7ab33b3ca96af892208645ac\": container with ID 
starting with 5d6c38182daa5f6954e2fdab1f45e8a5c574320c7ab33b3ca96af892208645ac not found: ID does not exist" Mar 10 08:48:38 crc kubenswrapper[4825]: I0310 08:48:38.020459 4825 scope.go:117] "RemoveContainer" containerID="323d8d51442098deb8f9c840d455f6575fc284b17e8bf6e51aca528cd4f0ab52" Mar 10 08:48:38 crc kubenswrapper[4825]: E0310 08:48:38.020675 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"323d8d51442098deb8f9c840d455f6575fc284b17e8bf6e51aca528cd4f0ab52\": container with ID starting with 323d8d51442098deb8f9c840d455f6575fc284b17e8bf6e51aca528cd4f0ab52 not found: ID does not exist" containerID="323d8d51442098deb8f9c840d455f6575fc284b17e8bf6e51aca528cd4f0ab52" Mar 10 08:48:38 crc kubenswrapper[4825]: I0310 08:48:38.020702 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"323d8d51442098deb8f9c840d455f6575fc284b17e8bf6e51aca528cd4f0ab52"} err="failed to get container status \"323d8d51442098deb8f9c840d455f6575fc284b17e8bf6e51aca528cd4f0ab52\": rpc error: code = NotFound desc = could not find container \"323d8d51442098deb8f9c840d455f6575fc284b17e8bf6e51aca528cd4f0ab52\": container with ID starting with 323d8d51442098deb8f9c840d455f6575fc284b17e8bf6e51aca528cd4f0ab52 not found: ID does not exist" Mar 10 08:48:39 crc kubenswrapper[4825]: I0310 08:48:39.250660 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42d935d7-95d6-4915-b5f5-99a5c54e7341" path="/var/lib/kubelet/pods/42d935d7-95d6-4915-b5f5-99a5c54e7341/volumes" Mar 10 08:48:46 crc kubenswrapper[4825]: I0310 08:48:46.888542 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 08:48:46 crc kubenswrapper[4825]: I0310 
08:48:46.889388 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 08:48:54 crc kubenswrapper[4825]: I0310 08:48:54.166635 4825 scope.go:117] "RemoveContainer" containerID="74e65c5cd7f3cbe836e4758447779e6aacc9fa2c0e2484ed78819f676958437d" Mar 10 08:49:01 crc kubenswrapper[4825]: I0310 08:49:01.121548 4825 generic.go:334] "Generic (PLEG): container finished" podID="67401e8d-825e-461a-a88d-7573c48c5918" containerID="11743399ae8ea9e673bf5faecfef8f32957f11f857457eb023b6254e14fafb11" exitCode=0 Mar 10 08:49:01 crc kubenswrapper[4825]: I0310 08:49:01.121632 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-xcjgb" event={"ID":"67401e8d-825e-461a-a88d-7573c48c5918","Type":"ContainerDied","Data":"11743399ae8ea9e673bf5faecfef8f32957f11f857457eb023b6254e14fafb11"} Mar 10 08:49:02 crc kubenswrapper[4825]: I0310 08:49:02.558637 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-xcjgb" Mar 10 08:49:02 crc kubenswrapper[4825]: I0310 08:49:02.577888 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67401e8d-825e-461a-a88d-7573c48c5918-neutron-metadata-combined-ca-bundle\") pod \"67401e8d-825e-461a-a88d-7573c48c5918\" (UID: \"67401e8d-825e-461a-a88d-7573c48c5918\") " Mar 10 08:49:02 crc kubenswrapper[4825]: I0310 08:49:02.577961 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/67401e8d-825e-461a-a88d-7573c48c5918-neutron-ovn-metadata-agent-neutron-config-0\") pod \"67401e8d-825e-461a-a88d-7573c48c5918\" (UID: \"67401e8d-825e-461a-a88d-7573c48c5918\") " Mar 10 08:49:02 crc kubenswrapper[4825]: I0310 08:49:02.577996 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msg59\" (UniqueName: \"kubernetes.io/projected/67401e8d-825e-461a-a88d-7573c48c5918-kube-api-access-msg59\") pod \"67401e8d-825e-461a-a88d-7573c48c5918\" (UID: \"67401e8d-825e-461a-a88d-7573c48c5918\") " Mar 10 08:49:02 crc kubenswrapper[4825]: I0310 08:49:02.578290 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/67401e8d-825e-461a-a88d-7573c48c5918-ssh-key-openstack-cell1\") pod \"67401e8d-825e-461a-a88d-7573c48c5918\" (UID: \"67401e8d-825e-461a-a88d-7573c48c5918\") " Mar 10 08:49:02 crc kubenswrapper[4825]: I0310 08:49:02.578322 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/67401e8d-825e-461a-a88d-7573c48c5918-inventory\") pod \"67401e8d-825e-461a-a88d-7573c48c5918\" (UID: \"67401e8d-825e-461a-a88d-7573c48c5918\") " Mar 10 08:49:02 crc 
kubenswrapper[4825]: I0310 08:49:02.578364 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/67401e8d-825e-461a-a88d-7573c48c5918-nova-metadata-neutron-config-0\") pod \"67401e8d-825e-461a-a88d-7573c48c5918\" (UID: \"67401e8d-825e-461a-a88d-7573c48c5918\") " Mar 10 08:49:02 crc kubenswrapper[4825]: I0310 08:49:02.584538 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67401e8d-825e-461a-a88d-7573c48c5918-kube-api-access-msg59" (OuterVolumeSpecName: "kube-api-access-msg59") pod "67401e8d-825e-461a-a88d-7573c48c5918" (UID: "67401e8d-825e-461a-a88d-7573c48c5918"). InnerVolumeSpecName "kube-api-access-msg59". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:49:02 crc kubenswrapper[4825]: I0310 08:49:02.587483 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67401e8d-825e-461a-a88d-7573c48c5918-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "67401e8d-825e-461a-a88d-7573c48c5918" (UID: "67401e8d-825e-461a-a88d-7573c48c5918"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:49:02 crc kubenswrapper[4825]: I0310 08:49:02.618455 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67401e8d-825e-461a-a88d-7573c48c5918-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "67401e8d-825e-461a-a88d-7573c48c5918" (UID: "67401e8d-825e-461a-a88d-7573c48c5918"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:49:02 crc kubenswrapper[4825]: I0310 08:49:02.640954 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67401e8d-825e-461a-a88d-7573c48c5918-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "67401e8d-825e-461a-a88d-7573c48c5918" (UID: "67401e8d-825e-461a-a88d-7573c48c5918"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:49:02 crc kubenswrapper[4825]: I0310 08:49:02.643328 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67401e8d-825e-461a-a88d-7573c48c5918-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "67401e8d-825e-461a-a88d-7573c48c5918" (UID: "67401e8d-825e-461a-a88d-7573c48c5918"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:49:02 crc kubenswrapper[4825]: I0310 08:49:02.648603 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67401e8d-825e-461a-a88d-7573c48c5918-inventory" (OuterVolumeSpecName: "inventory") pod "67401e8d-825e-461a-a88d-7573c48c5918" (UID: "67401e8d-825e-461a-a88d-7573c48c5918"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:49:02 crc kubenswrapper[4825]: I0310 08:49:02.680919 4825 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/67401e8d-825e-461a-a88d-7573c48c5918-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 10 08:49:02 crc kubenswrapper[4825]: I0310 08:49:02.680956 4825 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/67401e8d-825e-461a-a88d-7573c48c5918-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 08:49:02 crc kubenswrapper[4825]: I0310 08:49:02.680965 4825 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/67401e8d-825e-461a-a88d-7573c48c5918-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 10 08:49:02 crc kubenswrapper[4825]: I0310 08:49:02.680978 4825 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67401e8d-825e-461a-a88d-7573c48c5918-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:49:02 crc kubenswrapper[4825]: I0310 08:49:02.680990 4825 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/67401e8d-825e-461a-a88d-7573c48c5918-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 10 08:49:02 crc kubenswrapper[4825]: I0310 08:49:02.681001 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msg59\" (UniqueName: \"kubernetes.io/projected/67401e8d-825e-461a-a88d-7573c48c5918-kube-api-access-msg59\") on node \"crc\" DevicePath \"\"" Mar 10 08:49:03 crc kubenswrapper[4825]: I0310 08:49:03.139714 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-xcjgb" 
event={"ID":"67401e8d-825e-461a-a88d-7573c48c5918","Type":"ContainerDied","Data":"1242b4fd9b6ec925e2202ba58cb59ad611031426e6e51ea89e40a1b585068e06"} Mar 10 08:49:03 crc kubenswrapper[4825]: I0310 08:49:03.139763 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1242b4fd9b6ec925e2202ba58cb59ad611031426e6e51ea89e40a1b585068e06" Mar 10 08:49:03 crc kubenswrapper[4825]: I0310 08:49:03.140097 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-xcjgb" Mar 10 08:49:03 crc kubenswrapper[4825]: I0310 08:49:03.250705 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-gd2sq"] Mar 10 08:49:03 crc kubenswrapper[4825]: E0310 08:49:03.251054 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67401e8d-825e-461a-a88d-7573c48c5918" containerName="neutron-metadata-openstack-openstack-cell1" Mar 10 08:49:03 crc kubenswrapper[4825]: I0310 08:49:03.251068 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="67401e8d-825e-461a-a88d-7573c48c5918" containerName="neutron-metadata-openstack-openstack-cell1" Mar 10 08:49:03 crc kubenswrapper[4825]: E0310 08:49:03.251077 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42d935d7-95d6-4915-b5f5-99a5c54e7341" containerName="extract-content" Mar 10 08:49:03 crc kubenswrapper[4825]: I0310 08:49:03.251083 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="42d935d7-95d6-4915-b5f5-99a5c54e7341" containerName="extract-content" Mar 10 08:49:03 crc kubenswrapper[4825]: E0310 08:49:03.251108 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42d935d7-95d6-4915-b5f5-99a5c54e7341" containerName="extract-utilities" Mar 10 08:49:03 crc kubenswrapper[4825]: I0310 08:49:03.251115 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="42d935d7-95d6-4915-b5f5-99a5c54e7341" containerName="extract-utilities" 
Mar 10 08:49:03 crc kubenswrapper[4825]: E0310 08:49:03.251164 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42d935d7-95d6-4915-b5f5-99a5c54e7341" containerName="registry-server" Mar 10 08:49:03 crc kubenswrapper[4825]: I0310 08:49:03.251218 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="42d935d7-95d6-4915-b5f5-99a5c54e7341" containerName="registry-server" Mar 10 08:49:03 crc kubenswrapper[4825]: I0310 08:49:03.251415 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="67401e8d-825e-461a-a88d-7573c48c5918" containerName="neutron-metadata-openstack-openstack-cell1" Mar 10 08:49:03 crc kubenswrapper[4825]: I0310 08:49:03.251441 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="42d935d7-95d6-4915-b5f5-99a5c54e7341" containerName="registry-server" Mar 10 08:49:03 crc kubenswrapper[4825]: I0310 08:49:03.252856 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-gd2sq"] Mar 10 08:49:03 crc kubenswrapper[4825]: I0310 08:49:03.252936 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-gd2sq" Mar 10 08:49:03 crc kubenswrapper[4825]: I0310 08:49:03.255210 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 10 08:49:03 crc kubenswrapper[4825]: I0310 08:49:03.255389 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 10 08:49:03 crc kubenswrapper[4825]: I0310 08:49:03.255545 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 10 08:49:03 crc kubenswrapper[4825]: I0310 08:49:03.255636 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-5twvv" Mar 10 08:49:03 crc kubenswrapper[4825]: I0310 08:49:03.255990 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 08:49:03 crc kubenswrapper[4825]: I0310 08:49:03.319321 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8da704d5-ded8-4714-936b-91ac018532dd-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-gd2sq\" (UID: \"8da704d5-ded8-4714-936b-91ac018532dd\") " pod="openstack/libvirt-openstack-openstack-cell1-gd2sq" Mar 10 08:49:03 crc kubenswrapper[4825]: I0310 08:49:03.319487 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw8nk\" (UniqueName: \"kubernetes.io/projected/8da704d5-ded8-4714-936b-91ac018532dd-kube-api-access-vw8nk\") pod \"libvirt-openstack-openstack-cell1-gd2sq\" (UID: \"8da704d5-ded8-4714-936b-91ac018532dd\") " pod="openstack/libvirt-openstack-openstack-cell1-gd2sq" Mar 10 08:49:03 crc kubenswrapper[4825]: I0310 08:49:03.319526 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/8da704d5-ded8-4714-936b-91ac018532dd-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-gd2sq\" (UID: \"8da704d5-ded8-4714-936b-91ac018532dd\") " pod="openstack/libvirt-openstack-openstack-cell1-gd2sq" Mar 10 08:49:03 crc kubenswrapper[4825]: I0310 08:49:03.319610 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8da704d5-ded8-4714-936b-91ac018532dd-inventory\") pod \"libvirt-openstack-openstack-cell1-gd2sq\" (UID: \"8da704d5-ded8-4714-936b-91ac018532dd\") " pod="openstack/libvirt-openstack-openstack-cell1-gd2sq" Mar 10 08:49:03 crc kubenswrapper[4825]: I0310 08:49:03.319769 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8da704d5-ded8-4714-936b-91ac018532dd-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-gd2sq\" (UID: \"8da704d5-ded8-4714-936b-91ac018532dd\") " pod="openstack/libvirt-openstack-openstack-cell1-gd2sq" Mar 10 08:49:03 crc kubenswrapper[4825]: I0310 08:49:03.422073 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8da704d5-ded8-4714-936b-91ac018532dd-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-gd2sq\" (UID: \"8da704d5-ded8-4714-936b-91ac018532dd\") " pod="openstack/libvirt-openstack-openstack-cell1-gd2sq" Mar 10 08:49:03 crc kubenswrapper[4825]: I0310 08:49:03.422454 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw8nk\" (UniqueName: \"kubernetes.io/projected/8da704d5-ded8-4714-936b-91ac018532dd-kube-api-access-vw8nk\") pod \"libvirt-openstack-openstack-cell1-gd2sq\" (UID: \"8da704d5-ded8-4714-936b-91ac018532dd\") " pod="openstack/libvirt-openstack-openstack-cell1-gd2sq" Mar 10 
08:49:03 crc kubenswrapper[4825]: I0310 08:49:03.422553 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/8da704d5-ded8-4714-936b-91ac018532dd-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-gd2sq\" (UID: \"8da704d5-ded8-4714-936b-91ac018532dd\") " pod="openstack/libvirt-openstack-openstack-cell1-gd2sq" Mar 10 08:49:03 crc kubenswrapper[4825]: I0310 08:49:03.422639 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8da704d5-ded8-4714-936b-91ac018532dd-inventory\") pod \"libvirt-openstack-openstack-cell1-gd2sq\" (UID: \"8da704d5-ded8-4714-936b-91ac018532dd\") " pod="openstack/libvirt-openstack-openstack-cell1-gd2sq" Mar 10 08:49:03 crc kubenswrapper[4825]: I0310 08:49:03.422751 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8da704d5-ded8-4714-936b-91ac018532dd-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-gd2sq\" (UID: \"8da704d5-ded8-4714-936b-91ac018532dd\") " pod="openstack/libvirt-openstack-openstack-cell1-gd2sq" Mar 10 08:49:03 crc kubenswrapper[4825]: I0310 08:49:03.427341 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8da704d5-ded8-4714-936b-91ac018532dd-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-gd2sq\" (UID: \"8da704d5-ded8-4714-936b-91ac018532dd\") " pod="openstack/libvirt-openstack-openstack-cell1-gd2sq" Mar 10 08:49:03 crc kubenswrapper[4825]: I0310 08:49:03.427545 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8da704d5-ded8-4714-936b-91ac018532dd-inventory\") pod \"libvirt-openstack-openstack-cell1-gd2sq\" (UID: \"8da704d5-ded8-4714-936b-91ac018532dd\") " 
pod="openstack/libvirt-openstack-openstack-cell1-gd2sq" Mar 10 08:49:03 crc kubenswrapper[4825]: I0310 08:49:03.427570 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/8da704d5-ded8-4714-936b-91ac018532dd-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-gd2sq\" (UID: \"8da704d5-ded8-4714-936b-91ac018532dd\") " pod="openstack/libvirt-openstack-openstack-cell1-gd2sq" Mar 10 08:49:03 crc kubenswrapper[4825]: I0310 08:49:03.428078 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8da704d5-ded8-4714-936b-91ac018532dd-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-gd2sq\" (UID: \"8da704d5-ded8-4714-936b-91ac018532dd\") " pod="openstack/libvirt-openstack-openstack-cell1-gd2sq" Mar 10 08:49:03 crc kubenswrapper[4825]: I0310 08:49:03.439796 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw8nk\" (UniqueName: \"kubernetes.io/projected/8da704d5-ded8-4714-936b-91ac018532dd-kube-api-access-vw8nk\") pod \"libvirt-openstack-openstack-cell1-gd2sq\" (UID: \"8da704d5-ded8-4714-936b-91ac018532dd\") " pod="openstack/libvirt-openstack-openstack-cell1-gd2sq" Mar 10 08:49:03 crc kubenswrapper[4825]: I0310 08:49:03.584153 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-gd2sq" Mar 10 08:49:04 crc kubenswrapper[4825]: I0310 08:49:04.122555 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-gd2sq"] Mar 10 08:49:04 crc kubenswrapper[4825]: I0310 08:49:04.159612 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-gd2sq" event={"ID":"8da704d5-ded8-4714-936b-91ac018532dd","Type":"ContainerStarted","Data":"6f0909a62388bc28867dbdb2a18f5d6629bbdabbd9b5c5936ef1a61b31c7abcd"} Mar 10 08:49:05 crc kubenswrapper[4825]: I0310 08:49:05.174749 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-gd2sq" event={"ID":"8da704d5-ded8-4714-936b-91ac018532dd","Type":"ContainerStarted","Data":"780544b72239d6a913663c7f17629f5e7430069f0ad1822d58cc05fc261c0d34"} Mar 10 08:49:05 crc kubenswrapper[4825]: I0310 08:49:05.197600 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-openstack-openstack-cell1-gd2sq" podStartSLOduration=1.752968251 podStartE2EDuration="2.197580763s" podCreationTimestamp="2026-03-10 08:49:03 +0000 UTC" firstStartedPulling="2026-03-10 08:49:04.133627064 +0000 UTC m=+7497.163407679" lastFinishedPulling="2026-03-10 08:49:04.578239586 +0000 UTC m=+7497.608020191" observedRunningTime="2026-03-10 08:49:05.196271479 +0000 UTC m=+7498.226052094" watchObservedRunningTime="2026-03-10 08:49:05.197580763 +0000 UTC m=+7498.227361378" Mar 10 08:49:16 crc kubenswrapper[4825]: I0310 08:49:16.888437 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 08:49:16 crc kubenswrapper[4825]: I0310 08:49:16.888962 4825 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 08:49:46 crc kubenswrapper[4825]: I0310 08:49:46.887944 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 08:49:46 crc kubenswrapper[4825]: I0310 08:49:46.888417 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 08:49:46 crc kubenswrapper[4825]: I0310 08:49:46.888458 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" Mar 10 08:49:46 crc kubenswrapper[4825]: I0310 08:49:46.889235 4825 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2d7f4d948979ee71550b4af3d706ff94e76881dcdf895270cec84c86069dd8fe"} pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 08:49:46 crc kubenswrapper[4825]: I0310 08:49:46.889297 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" 
containerID="cri-o://2d7f4d948979ee71550b4af3d706ff94e76881dcdf895270cec84c86069dd8fe" gracePeriod=600 Mar 10 08:49:47 crc kubenswrapper[4825]: E0310 08:49:47.018295 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:49:47 crc kubenswrapper[4825]: I0310 08:49:47.596057 4825 generic.go:334] "Generic (PLEG): container finished" podID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerID="2d7f4d948979ee71550b4af3d706ff94e76881dcdf895270cec84c86069dd8fe" exitCode=0 Mar 10 08:49:47 crc kubenswrapper[4825]: I0310 08:49:47.596183 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerDied","Data":"2d7f4d948979ee71550b4af3d706ff94e76881dcdf895270cec84c86069dd8fe"} Mar 10 08:49:47 crc kubenswrapper[4825]: I0310 08:49:47.596560 4825 scope.go:117] "RemoveContainer" containerID="a77ec3023a65c7544ad73bb812378227e2a33544cb0a071029aa0eecd3038079" Mar 10 08:49:47 crc kubenswrapper[4825]: I0310 08:49:47.597351 4825 scope.go:117] "RemoveContainer" containerID="2d7f4d948979ee71550b4af3d706ff94e76881dcdf895270cec84c86069dd8fe" Mar 10 08:49:47 crc kubenswrapper[4825]: E0310 08:49:47.597761 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" 
podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:50:00 crc kubenswrapper[4825]: I0310 08:50:00.174860 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552210-mkgwn"] Mar 10 08:50:00 crc kubenswrapper[4825]: I0310 08:50:00.177284 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552210-mkgwn" Mar 10 08:50:00 crc kubenswrapper[4825]: I0310 08:50:00.182342 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 08:50:00 crc kubenswrapper[4825]: I0310 08:50:00.182590 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 08:50:00 crc kubenswrapper[4825]: I0310 08:50:00.184635 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 08:50:00 crc kubenswrapper[4825]: I0310 08:50:00.184752 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552210-mkgwn"] Mar 10 08:50:00 crc kubenswrapper[4825]: I0310 08:50:00.236829 4825 scope.go:117] "RemoveContainer" containerID="2d7f4d948979ee71550b4af3d706ff94e76881dcdf895270cec84c86069dd8fe" Mar 10 08:50:00 crc kubenswrapper[4825]: E0310 08:50:00.242712 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:50:00 crc kubenswrapper[4825]: I0310 08:50:00.284282 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d64xf\" (UniqueName: 
\"kubernetes.io/projected/5b87021f-252d-48d8-87a2-6d48161f495e-kube-api-access-d64xf\") pod \"auto-csr-approver-29552210-mkgwn\" (UID: \"5b87021f-252d-48d8-87a2-6d48161f495e\") " pod="openshift-infra/auto-csr-approver-29552210-mkgwn" Mar 10 08:50:00 crc kubenswrapper[4825]: I0310 08:50:00.386817 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d64xf\" (UniqueName: \"kubernetes.io/projected/5b87021f-252d-48d8-87a2-6d48161f495e-kube-api-access-d64xf\") pod \"auto-csr-approver-29552210-mkgwn\" (UID: \"5b87021f-252d-48d8-87a2-6d48161f495e\") " pod="openshift-infra/auto-csr-approver-29552210-mkgwn" Mar 10 08:50:00 crc kubenswrapper[4825]: I0310 08:50:00.404813 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d64xf\" (UniqueName: \"kubernetes.io/projected/5b87021f-252d-48d8-87a2-6d48161f495e-kube-api-access-d64xf\") pod \"auto-csr-approver-29552210-mkgwn\" (UID: \"5b87021f-252d-48d8-87a2-6d48161f495e\") " pod="openshift-infra/auto-csr-approver-29552210-mkgwn" Mar 10 08:50:00 crc kubenswrapper[4825]: I0310 08:50:00.499024 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552210-mkgwn" Mar 10 08:50:00 crc kubenswrapper[4825]: I0310 08:50:00.965311 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552210-mkgwn"] Mar 10 08:50:00 crc kubenswrapper[4825]: W0310 08:50:00.972602 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b87021f_252d_48d8_87a2_6d48161f495e.slice/crio-f11bae8508c29f5cc7bfb01920223bb895877c076198ea81bdcba460500e2e21 WatchSource:0}: Error finding container f11bae8508c29f5cc7bfb01920223bb895877c076198ea81bdcba460500e2e21: Status 404 returned error can't find the container with id f11bae8508c29f5cc7bfb01920223bb895877c076198ea81bdcba460500e2e21 Mar 10 08:50:01 crc kubenswrapper[4825]: I0310 08:50:01.749562 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552210-mkgwn" event={"ID":"5b87021f-252d-48d8-87a2-6d48161f495e","Type":"ContainerStarted","Data":"f11bae8508c29f5cc7bfb01920223bb895877c076198ea81bdcba460500e2e21"} Mar 10 08:50:06 crc kubenswrapper[4825]: I0310 08:50:06.818003 4825 generic.go:334] "Generic (PLEG): container finished" podID="5b87021f-252d-48d8-87a2-6d48161f495e" containerID="4175ef55e72869e7c45ae2499480e2aa86c30afaf9cec0c37bc229506563d9ed" exitCode=0 Mar 10 08:50:06 crc kubenswrapper[4825]: I0310 08:50:06.818122 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552210-mkgwn" event={"ID":"5b87021f-252d-48d8-87a2-6d48161f495e","Type":"ContainerDied","Data":"4175ef55e72869e7c45ae2499480e2aa86c30afaf9cec0c37bc229506563d9ed"} Mar 10 08:50:08 crc kubenswrapper[4825]: I0310 08:50:08.207657 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552210-mkgwn" Mar 10 08:50:08 crc kubenswrapper[4825]: I0310 08:50:08.264921 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d64xf\" (UniqueName: \"kubernetes.io/projected/5b87021f-252d-48d8-87a2-6d48161f495e-kube-api-access-d64xf\") pod \"5b87021f-252d-48d8-87a2-6d48161f495e\" (UID: \"5b87021f-252d-48d8-87a2-6d48161f495e\") " Mar 10 08:50:08 crc kubenswrapper[4825]: I0310 08:50:08.273425 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b87021f-252d-48d8-87a2-6d48161f495e-kube-api-access-d64xf" (OuterVolumeSpecName: "kube-api-access-d64xf") pod "5b87021f-252d-48d8-87a2-6d48161f495e" (UID: "5b87021f-252d-48d8-87a2-6d48161f495e"). InnerVolumeSpecName "kube-api-access-d64xf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:50:08 crc kubenswrapper[4825]: I0310 08:50:08.368200 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d64xf\" (UniqueName: \"kubernetes.io/projected/5b87021f-252d-48d8-87a2-6d48161f495e-kube-api-access-d64xf\") on node \"crc\" DevicePath \"\"" Mar 10 08:50:08 crc kubenswrapper[4825]: I0310 08:50:08.839526 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552210-mkgwn" event={"ID":"5b87021f-252d-48d8-87a2-6d48161f495e","Type":"ContainerDied","Data":"f11bae8508c29f5cc7bfb01920223bb895877c076198ea81bdcba460500e2e21"} Mar 10 08:50:08 crc kubenswrapper[4825]: I0310 08:50:08.839577 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f11bae8508c29f5cc7bfb01920223bb895877c076198ea81bdcba460500e2e21" Mar 10 08:50:08 crc kubenswrapper[4825]: I0310 08:50:08.839645 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552210-mkgwn" Mar 10 08:50:09 crc kubenswrapper[4825]: I0310 08:50:09.287854 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552204-c79wc"] Mar 10 08:50:09 crc kubenswrapper[4825]: I0310 08:50:09.296929 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552204-c79wc"] Mar 10 08:50:11 crc kubenswrapper[4825]: I0310 08:50:11.237784 4825 scope.go:117] "RemoveContainer" containerID="2d7f4d948979ee71550b4af3d706ff94e76881dcdf895270cec84c86069dd8fe" Mar 10 08:50:11 crc kubenswrapper[4825]: E0310 08:50:11.238416 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:50:11 crc kubenswrapper[4825]: I0310 08:50:11.254355 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65c0626b-a45c-46bc-99fc-b889bad71c1b" path="/var/lib/kubelet/pods/65c0626b-a45c-46bc-99fc-b889bad71c1b/volumes" Mar 10 08:50:24 crc kubenswrapper[4825]: I0310 08:50:24.238781 4825 scope.go:117] "RemoveContainer" containerID="2d7f4d948979ee71550b4af3d706ff94e76881dcdf895270cec84c86069dd8fe" Mar 10 08:50:24 crc kubenswrapper[4825]: E0310 08:50:24.239596 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" 
podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:50:39 crc kubenswrapper[4825]: I0310 08:50:39.245948 4825 scope.go:117] "RemoveContainer" containerID="2d7f4d948979ee71550b4af3d706ff94e76881dcdf895270cec84c86069dd8fe" Mar 10 08:50:39 crc kubenswrapper[4825]: E0310 08:50:39.246726 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:50:50 crc kubenswrapper[4825]: I0310 08:50:50.236899 4825 scope.go:117] "RemoveContainer" containerID="2d7f4d948979ee71550b4af3d706ff94e76881dcdf895270cec84c86069dd8fe" Mar 10 08:50:50 crc kubenswrapper[4825]: E0310 08:50:50.237775 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:50:54 crc kubenswrapper[4825]: I0310 08:50:54.287085 4825 scope.go:117] "RemoveContainer" containerID="60f8546bda46f78c8d14c36cacc9fc5642d9f3fbce1b310ded87717df4648c9c" Mar 10 08:51:05 crc kubenswrapper[4825]: I0310 08:51:05.237271 4825 scope.go:117] "RemoveContainer" containerID="2d7f4d948979ee71550b4af3d706ff94e76881dcdf895270cec84c86069dd8fe" Mar 10 08:51:05 crc kubenswrapper[4825]: E0310 08:51:05.237988 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:51:17 crc kubenswrapper[4825]: I0310 08:51:17.237423 4825 scope.go:117] "RemoveContainer" containerID="2d7f4d948979ee71550b4af3d706ff94e76881dcdf895270cec84c86069dd8fe" Mar 10 08:51:17 crc kubenswrapper[4825]: E0310 08:51:17.238293 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:51:29 crc kubenswrapper[4825]: I0310 08:51:29.244356 4825 scope.go:117] "RemoveContainer" containerID="2d7f4d948979ee71550b4af3d706ff94e76881dcdf895270cec84c86069dd8fe" Mar 10 08:51:29 crc kubenswrapper[4825]: E0310 08:51:29.245092 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:51:40 crc kubenswrapper[4825]: I0310 08:51:40.237195 4825 scope.go:117] "RemoveContainer" containerID="2d7f4d948979ee71550b4af3d706ff94e76881dcdf895270cec84c86069dd8fe" Mar 10 08:51:40 crc kubenswrapper[4825]: E0310 08:51:40.238502 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:51:51 crc kubenswrapper[4825]: I0310 08:51:51.237522 4825 scope.go:117] "RemoveContainer" containerID="2d7f4d948979ee71550b4af3d706ff94e76881dcdf895270cec84c86069dd8fe" Mar 10 08:51:51 crc kubenswrapper[4825]: E0310 08:51:51.238522 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:52:00 crc kubenswrapper[4825]: I0310 08:52:00.150202 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552212-bgwk7"] Mar 10 08:52:00 crc kubenswrapper[4825]: E0310 08:52:00.151203 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b87021f-252d-48d8-87a2-6d48161f495e" containerName="oc" Mar 10 08:52:00 crc kubenswrapper[4825]: I0310 08:52:00.151218 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b87021f-252d-48d8-87a2-6d48161f495e" containerName="oc" Mar 10 08:52:00 crc kubenswrapper[4825]: I0310 08:52:00.151416 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b87021f-252d-48d8-87a2-6d48161f495e" containerName="oc" Mar 10 08:52:00 crc kubenswrapper[4825]: I0310 08:52:00.152181 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552212-bgwk7" Mar 10 08:52:00 crc kubenswrapper[4825]: I0310 08:52:00.154238 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 08:52:00 crc kubenswrapper[4825]: I0310 08:52:00.158465 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 08:52:00 crc kubenswrapper[4825]: I0310 08:52:00.158490 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 08:52:00 crc kubenswrapper[4825]: I0310 08:52:00.161282 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552212-bgwk7"] Mar 10 08:52:00 crc kubenswrapper[4825]: I0310 08:52:00.272349 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvphn\" (UniqueName: \"kubernetes.io/projected/5a6a81f5-a188-4e8e-85d3-008ca6f74e22-kube-api-access-fvphn\") pod \"auto-csr-approver-29552212-bgwk7\" (UID: \"5a6a81f5-a188-4e8e-85d3-008ca6f74e22\") " pod="openshift-infra/auto-csr-approver-29552212-bgwk7" Mar 10 08:52:00 crc kubenswrapper[4825]: I0310 08:52:00.375386 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvphn\" (UniqueName: \"kubernetes.io/projected/5a6a81f5-a188-4e8e-85d3-008ca6f74e22-kube-api-access-fvphn\") pod \"auto-csr-approver-29552212-bgwk7\" (UID: \"5a6a81f5-a188-4e8e-85d3-008ca6f74e22\") " pod="openshift-infra/auto-csr-approver-29552212-bgwk7" Mar 10 08:52:00 crc kubenswrapper[4825]: I0310 08:52:00.398330 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvphn\" (UniqueName: \"kubernetes.io/projected/5a6a81f5-a188-4e8e-85d3-008ca6f74e22-kube-api-access-fvphn\") pod \"auto-csr-approver-29552212-bgwk7\" (UID: \"5a6a81f5-a188-4e8e-85d3-008ca6f74e22\") " 
pod="openshift-infra/auto-csr-approver-29552212-bgwk7" Mar 10 08:52:00 crc kubenswrapper[4825]: I0310 08:52:00.471791 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552212-bgwk7" Mar 10 08:52:00 crc kubenswrapper[4825]: I0310 08:52:00.944571 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552212-bgwk7"] Mar 10 08:52:01 crc kubenswrapper[4825]: I0310 08:52:01.001203 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552212-bgwk7" event={"ID":"5a6a81f5-a188-4e8e-85d3-008ca6f74e22","Type":"ContainerStarted","Data":"fdfaf7836eea9c9b75479b6435d38ed8d5d23f4e4965d5492b6b51bdf6f214f6"} Mar 10 08:52:03 crc kubenswrapper[4825]: I0310 08:52:03.020475 4825 generic.go:334] "Generic (PLEG): container finished" podID="5a6a81f5-a188-4e8e-85d3-008ca6f74e22" containerID="65b0c56c065f618de483013f8ef40ff01d54d0de4572f8f2c3e767467417ee2a" exitCode=0 Mar 10 08:52:03 crc kubenswrapper[4825]: I0310 08:52:03.020545 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552212-bgwk7" event={"ID":"5a6a81f5-a188-4e8e-85d3-008ca6f74e22","Type":"ContainerDied","Data":"65b0c56c065f618de483013f8ef40ff01d54d0de4572f8f2c3e767467417ee2a"} Mar 10 08:52:04 crc kubenswrapper[4825]: I0310 08:52:04.236733 4825 scope.go:117] "RemoveContainer" containerID="2d7f4d948979ee71550b4af3d706ff94e76881dcdf895270cec84c86069dd8fe" Mar 10 08:52:04 crc kubenswrapper[4825]: E0310 08:52:04.237328 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" 
Mar 10 08:52:04 crc kubenswrapper[4825]: I0310 08:52:04.376849 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552212-bgwk7" Mar 10 08:52:04 crc kubenswrapper[4825]: I0310 08:52:04.458338 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvphn\" (UniqueName: \"kubernetes.io/projected/5a6a81f5-a188-4e8e-85d3-008ca6f74e22-kube-api-access-fvphn\") pod \"5a6a81f5-a188-4e8e-85d3-008ca6f74e22\" (UID: \"5a6a81f5-a188-4e8e-85d3-008ca6f74e22\") " Mar 10 08:52:04 crc kubenswrapper[4825]: I0310 08:52:04.468439 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a6a81f5-a188-4e8e-85d3-008ca6f74e22-kube-api-access-fvphn" (OuterVolumeSpecName: "kube-api-access-fvphn") pod "5a6a81f5-a188-4e8e-85d3-008ca6f74e22" (UID: "5a6a81f5-a188-4e8e-85d3-008ca6f74e22"). InnerVolumeSpecName "kube-api-access-fvphn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:52:04 crc kubenswrapper[4825]: I0310 08:52:04.560361 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvphn\" (UniqueName: \"kubernetes.io/projected/5a6a81f5-a188-4e8e-85d3-008ca6f74e22-kube-api-access-fvphn\") on node \"crc\" DevicePath \"\"" Mar 10 08:52:05 crc kubenswrapper[4825]: I0310 08:52:05.044460 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552212-bgwk7" event={"ID":"5a6a81f5-a188-4e8e-85d3-008ca6f74e22","Type":"ContainerDied","Data":"fdfaf7836eea9c9b75479b6435d38ed8d5d23f4e4965d5492b6b51bdf6f214f6"} Mar 10 08:52:05 crc kubenswrapper[4825]: I0310 08:52:05.044801 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdfaf7836eea9c9b75479b6435d38ed8d5d23f4e4965d5492b6b51bdf6f214f6" Mar 10 08:52:05 crc kubenswrapper[4825]: I0310 08:52:05.044517 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552212-bgwk7" Mar 10 08:52:05 crc kubenswrapper[4825]: E0310 08:52:05.125688 4825 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a6a81f5_a188_4e8e_85d3_008ca6f74e22.slice\": RecentStats: unable to find data in memory cache]" Mar 10 08:52:05 crc kubenswrapper[4825]: I0310 08:52:05.453930 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552206-q6ncf"] Mar 10 08:52:05 crc kubenswrapper[4825]: I0310 08:52:05.466747 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552206-q6ncf"] Mar 10 08:52:07 crc kubenswrapper[4825]: I0310 08:52:07.248034 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f9fcab9-d38b-401c-96da-163b9077f329" path="/var/lib/kubelet/pods/0f9fcab9-d38b-401c-96da-163b9077f329/volumes" Mar 10 08:52:18 crc kubenswrapper[4825]: I0310 08:52:18.236674 4825 scope.go:117] "RemoveContainer" containerID="2d7f4d948979ee71550b4af3d706ff94e76881dcdf895270cec84c86069dd8fe" Mar 10 08:52:18 crc kubenswrapper[4825]: E0310 08:52:18.237473 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:52:29 crc kubenswrapper[4825]: I0310 08:52:29.246499 4825 scope.go:117] "RemoveContainer" containerID="2d7f4d948979ee71550b4af3d706ff94e76881dcdf895270cec84c86069dd8fe" Mar 10 08:52:29 crc kubenswrapper[4825]: E0310 08:52:29.247918 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:52:44 crc kubenswrapper[4825]: I0310 08:52:44.237314 4825 scope.go:117] "RemoveContainer" containerID="2d7f4d948979ee71550b4af3d706ff94e76881dcdf895270cec84c86069dd8fe" Mar 10 08:52:44 crc kubenswrapper[4825]: E0310 08:52:44.238226 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:52:54 crc kubenswrapper[4825]: I0310 08:52:54.374013 4825 scope.go:117] "RemoveContainer" containerID="a99816d89d8ad7aa4564437141d90d96a4829083d63382270ac6a376cdb2c0f3" Mar 10 08:52:57 crc kubenswrapper[4825]: I0310 08:52:57.236574 4825 scope.go:117] "RemoveContainer" containerID="2d7f4d948979ee71550b4af3d706ff94e76881dcdf895270cec84c86069dd8fe" Mar 10 08:52:57 crc kubenswrapper[4825]: E0310 08:52:57.237406 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:53:12 crc kubenswrapper[4825]: I0310 08:53:12.237662 4825 scope.go:117] "RemoveContainer" 
containerID="2d7f4d948979ee71550b4af3d706ff94e76881dcdf895270cec84c86069dd8fe" Mar 10 08:53:12 crc kubenswrapper[4825]: E0310 08:53:12.239054 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:53:21 crc kubenswrapper[4825]: I0310 08:53:21.771879 4825 generic.go:334] "Generic (PLEG): container finished" podID="8da704d5-ded8-4714-936b-91ac018532dd" containerID="780544b72239d6a913663c7f17629f5e7430069f0ad1822d58cc05fc261c0d34" exitCode=0 Mar 10 08:53:21 crc kubenswrapper[4825]: I0310 08:53:21.772376 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-gd2sq" event={"ID":"8da704d5-ded8-4714-936b-91ac018532dd","Type":"ContainerDied","Data":"780544b72239d6a913663c7f17629f5e7430069f0ad1822d58cc05fc261c0d34"} Mar 10 08:53:23 crc kubenswrapper[4825]: I0310 08:53:23.235360 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-gd2sq" Mar 10 08:53:23 crc kubenswrapper[4825]: I0310 08:53:23.254498 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/8da704d5-ded8-4714-936b-91ac018532dd-libvirt-secret-0\") pod \"8da704d5-ded8-4714-936b-91ac018532dd\" (UID: \"8da704d5-ded8-4714-936b-91ac018532dd\") " Mar 10 08:53:23 crc kubenswrapper[4825]: I0310 08:53:23.254547 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8da704d5-ded8-4714-936b-91ac018532dd-libvirt-combined-ca-bundle\") pod \"8da704d5-ded8-4714-936b-91ac018532dd\" (UID: \"8da704d5-ded8-4714-936b-91ac018532dd\") " Mar 10 08:53:23 crc kubenswrapper[4825]: I0310 08:53:23.254628 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8da704d5-ded8-4714-936b-91ac018532dd-inventory\") pod \"8da704d5-ded8-4714-936b-91ac018532dd\" (UID: \"8da704d5-ded8-4714-936b-91ac018532dd\") " Mar 10 08:53:23 crc kubenswrapper[4825]: I0310 08:53:23.254728 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vw8nk\" (UniqueName: \"kubernetes.io/projected/8da704d5-ded8-4714-936b-91ac018532dd-kube-api-access-vw8nk\") pod \"8da704d5-ded8-4714-936b-91ac018532dd\" (UID: \"8da704d5-ded8-4714-936b-91ac018532dd\") " Mar 10 08:53:23 crc kubenswrapper[4825]: I0310 08:53:23.254877 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8da704d5-ded8-4714-936b-91ac018532dd-ssh-key-openstack-cell1\") pod \"8da704d5-ded8-4714-936b-91ac018532dd\" (UID: \"8da704d5-ded8-4714-936b-91ac018532dd\") " Mar 10 08:53:23 crc kubenswrapper[4825]: I0310 08:53:23.263037 4825 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8da704d5-ded8-4714-936b-91ac018532dd-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "8da704d5-ded8-4714-936b-91ac018532dd" (UID: "8da704d5-ded8-4714-936b-91ac018532dd"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:53:23 crc kubenswrapper[4825]: I0310 08:53:23.264392 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8da704d5-ded8-4714-936b-91ac018532dd-kube-api-access-vw8nk" (OuterVolumeSpecName: "kube-api-access-vw8nk") pod "8da704d5-ded8-4714-936b-91ac018532dd" (UID: "8da704d5-ded8-4714-936b-91ac018532dd"). InnerVolumeSpecName "kube-api-access-vw8nk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:53:23 crc kubenswrapper[4825]: I0310 08:53:23.288309 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8da704d5-ded8-4714-936b-91ac018532dd-inventory" (OuterVolumeSpecName: "inventory") pod "8da704d5-ded8-4714-936b-91ac018532dd" (UID: "8da704d5-ded8-4714-936b-91ac018532dd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:53:23 crc kubenswrapper[4825]: I0310 08:53:23.317063 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8da704d5-ded8-4714-936b-91ac018532dd-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "8da704d5-ded8-4714-936b-91ac018532dd" (UID: "8da704d5-ded8-4714-936b-91ac018532dd"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:53:23 crc kubenswrapper[4825]: I0310 08:53:23.321555 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8da704d5-ded8-4714-936b-91ac018532dd-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "8da704d5-ded8-4714-936b-91ac018532dd" (UID: "8da704d5-ded8-4714-936b-91ac018532dd"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:53:23 crc kubenswrapper[4825]: I0310 08:53:23.359714 4825 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8da704d5-ded8-4714-936b-91ac018532dd-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:53:23 crc kubenswrapper[4825]: I0310 08:53:23.359749 4825 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/8da704d5-ded8-4714-936b-91ac018532dd-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 10 08:53:23 crc kubenswrapper[4825]: I0310 08:53:23.359761 4825 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8da704d5-ded8-4714-936b-91ac018532dd-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 08:53:23 crc kubenswrapper[4825]: I0310 08:53:23.359772 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vw8nk\" (UniqueName: \"kubernetes.io/projected/8da704d5-ded8-4714-936b-91ac018532dd-kube-api-access-vw8nk\") on node \"crc\" DevicePath \"\"" Mar 10 08:53:23 crc kubenswrapper[4825]: I0310 08:53:23.359782 4825 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8da704d5-ded8-4714-936b-91ac018532dd-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 10 08:53:23 crc kubenswrapper[4825]: I0310 08:53:23.789212 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-openstack-openstack-cell1-gd2sq" event={"ID":"8da704d5-ded8-4714-936b-91ac018532dd","Type":"ContainerDied","Data":"6f0909a62388bc28867dbdb2a18f5d6629bbdabbd9b5c5936ef1a61b31c7abcd"} Mar 10 08:53:23 crc kubenswrapper[4825]: I0310 08:53:23.789261 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f0909a62388bc28867dbdb2a18f5d6629bbdabbd9b5c5936ef1a61b31c7abcd" Mar 10 08:53:23 crc kubenswrapper[4825]: I0310 08:53:23.789261 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-gd2sq" Mar 10 08:53:23 crc kubenswrapper[4825]: I0310 08:53:23.893990 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-dvswf"] Mar 10 08:53:23 crc kubenswrapper[4825]: E0310 08:53:23.894637 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8da704d5-ded8-4714-936b-91ac018532dd" containerName="libvirt-openstack-openstack-cell1" Mar 10 08:53:23 crc kubenswrapper[4825]: I0310 08:53:23.894656 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="8da704d5-ded8-4714-936b-91ac018532dd" containerName="libvirt-openstack-openstack-cell1" Mar 10 08:53:23 crc kubenswrapper[4825]: E0310 08:53:23.894693 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a6a81f5-a188-4e8e-85d3-008ca6f74e22" containerName="oc" Mar 10 08:53:23 crc kubenswrapper[4825]: I0310 08:53:23.894716 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a6a81f5-a188-4e8e-85d3-008ca6f74e22" containerName="oc" Mar 10 08:53:23 crc kubenswrapper[4825]: I0310 08:53:23.894970 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="8da704d5-ded8-4714-936b-91ac018532dd" containerName="libvirt-openstack-openstack-cell1" Mar 10 08:53:23 crc kubenswrapper[4825]: I0310 08:53:23.895001 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a6a81f5-a188-4e8e-85d3-008ca6f74e22" 
containerName="oc" Mar 10 08:53:23 crc kubenswrapper[4825]: I0310 08:53:23.895786 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-dvswf" Mar 10 08:53:23 crc kubenswrapper[4825]: I0310 08:53:23.897888 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-5twvv" Mar 10 08:53:23 crc kubenswrapper[4825]: I0310 08:53:23.898787 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 10 08:53:23 crc kubenswrapper[4825]: I0310 08:53:23.901072 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 10 08:53:23 crc kubenswrapper[4825]: I0310 08:53:23.901487 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 08:53:23 crc kubenswrapper[4825]: I0310 08:53:23.901691 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 10 08:53:23 crc kubenswrapper[4825]: I0310 08:53:23.901944 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Mar 10 08:53:23 crc kubenswrapper[4825]: I0310 08:53:23.902160 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 10 08:53:23 crc kubenswrapper[4825]: I0310 08:53:23.916063 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-dvswf"] Mar 10 08:53:23 crc kubenswrapper[4825]: I0310 08:53:23.971688 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgcv2\" (UniqueName: \"kubernetes.io/projected/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-kube-api-access-mgcv2\") pod \"nova-cell1-openstack-openstack-cell1-dvswf\" (UID: \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\") " 
pod="openstack/nova-cell1-openstack-openstack-cell1-dvswf" Mar 10 08:53:23 crc kubenswrapper[4825]: I0310 08:53:23.971754 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-dvswf\" (UID: \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-dvswf" Mar 10 08:53:23 crc kubenswrapper[4825]: I0310 08:53:23.971969 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-dvswf\" (UID: \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-dvswf" Mar 10 08:53:23 crc kubenswrapper[4825]: I0310 08:53:23.972020 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-inventory\") pod \"nova-cell1-openstack-openstack-cell1-dvswf\" (UID: \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-dvswf" Mar 10 08:53:23 crc kubenswrapper[4825]: I0310 08:53:23.972092 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-dvswf\" (UID: \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-dvswf" Mar 10 08:53:23 crc kubenswrapper[4825]: I0310 08:53:23.972207 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-dvswf\" (UID: \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-dvswf" Mar 10 08:53:23 crc kubenswrapper[4825]: I0310 08:53:23.972285 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-dvswf\" (UID: \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-dvswf" Mar 10 08:53:23 crc kubenswrapper[4825]: I0310 08:53:23.972332 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-dvswf\" (UID: \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-dvswf" Mar 10 08:53:23 crc kubenswrapper[4825]: I0310 08:53:23.972396 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-dvswf\" (UID: \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-dvswf" Mar 10 08:53:23 crc kubenswrapper[4825]: I0310 08:53:23.972429 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-cell1-compute-config-2\") pod 
\"nova-cell1-openstack-openstack-cell1-dvswf\" (UID: \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-dvswf" Mar 10 08:53:23 crc kubenswrapper[4825]: I0310 08:53:23.972492 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-dvswf\" (UID: \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-dvswf" Mar 10 08:53:24 crc kubenswrapper[4825]: I0310 08:53:24.074589 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-dvswf\" (UID: \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-dvswf" Mar 10 08:53:24 crc kubenswrapper[4825]: I0310 08:53:24.074698 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-dvswf\" (UID: \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-dvswf" Mar 10 08:53:24 crc kubenswrapper[4825]: I0310 08:53:24.074732 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-inventory\") pod \"nova-cell1-openstack-openstack-cell1-dvswf\" (UID: \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-dvswf" Mar 10 08:53:24 crc kubenswrapper[4825]: I0310 08:53:24.074773 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-dvswf\" (UID: \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-dvswf" Mar 10 08:53:24 crc kubenswrapper[4825]: I0310 08:53:24.074804 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-dvswf\" (UID: \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-dvswf" Mar 10 08:53:24 crc kubenswrapper[4825]: I0310 08:53:24.074839 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-dvswf\" (UID: \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-dvswf" Mar 10 08:53:24 crc kubenswrapper[4825]: I0310 08:53:24.074894 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-dvswf\" (UID: \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-dvswf" Mar 10 08:53:24 crc kubenswrapper[4825]: I0310 08:53:24.074939 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-dvswf\" (UID: 
\"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-dvswf" Mar 10 08:53:24 crc kubenswrapper[4825]: I0310 08:53:24.074977 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-openstack-cell1-dvswf\" (UID: \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-dvswf" Mar 10 08:53:24 crc kubenswrapper[4825]: I0310 08:53:24.075167 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-dvswf\" (UID: \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-dvswf" Mar 10 08:53:24 crc kubenswrapper[4825]: I0310 08:53:24.075273 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgcv2\" (UniqueName: \"kubernetes.io/projected/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-kube-api-access-mgcv2\") pod \"nova-cell1-openstack-openstack-cell1-dvswf\" (UID: \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-dvswf" Mar 10 08:53:24 crc kubenswrapper[4825]: I0310 08:53:24.076119 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-dvswf\" (UID: \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-dvswf" Mar 10 08:53:24 crc kubenswrapper[4825]: I0310 08:53:24.080226 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-openstack-cell1-dvswf\" (UID: \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-dvswf" Mar 10 08:53:24 crc kubenswrapper[4825]: I0310 08:53:24.080445 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-dvswf\" (UID: \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-dvswf" Mar 10 08:53:24 crc kubenswrapper[4825]: I0310 08:53:24.080987 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-dvswf\" (UID: \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-dvswf" Mar 10 08:53:24 crc kubenswrapper[4825]: I0310 08:53:24.081058 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-dvswf\" (UID: \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-dvswf" Mar 10 08:53:24 crc kubenswrapper[4825]: I0310 08:53:24.081290 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-dvswf\" (UID: \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\") " 
pod="openstack/nova-cell1-openstack-openstack-cell1-dvswf" Mar 10 08:53:24 crc kubenswrapper[4825]: I0310 08:53:24.081460 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-dvswf\" (UID: \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-dvswf" Mar 10 08:53:24 crc kubenswrapper[4825]: I0310 08:53:24.081915 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-dvswf\" (UID: \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-dvswf" Mar 10 08:53:24 crc kubenswrapper[4825]: I0310 08:53:24.082299 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-inventory\") pod \"nova-cell1-openstack-openstack-cell1-dvswf\" (UID: \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-dvswf" Mar 10 08:53:24 crc kubenswrapper[4825]: I0310 08:53:24.082923 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-dvswf\" (UID: \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-dvswf" Mar 10 08:53:24 crc kubenswrapper[4825]: I0310 08:53:24.093600 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgcv2\" (UniqueName: 
\"kubernetes.io/projected/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-kube-api-access-mgcv2\") pod \"nova-cell1-openstack-openstack-cell1-dvswf\" (UID: \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-dvswf" Mar 10 08:53:24 crc kubenswrapper[4825]: I0310 08:53:24.229234 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-dvswf" Mar 10 08:53:24 crc kubenswrapper[4825]: I0310 08:53:24.826621 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-dvswf"] Mar 10 08:53:24 crc kubenswrapper[4825]: I0310 08:53:24.828633 4825 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 08:53:25 crc kubenswrapper[4825]: I0310 08:53:25.236521 4825 scope.go:117] "RemoveContainer" containerID="2d7f4d948979ee71550b4af3d706ff94e76881dcdf895270cec84c86069dd8fe" Mar 10 08:53:25 crc kubenswrapper[4825]: E0310 08:53:25.237053 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:53:25 crc kubenswrapper[4825]: I0310 08:53:25.809697 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-dvswf" event={"ID":"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7","Type":"ContainerStarted","Data":"6e359258a7dea541056e2079b68e158683b84e730fe1e87547a2ba124bb1b6b3"} Mar 10 08:53:25 crc kubenswrapper[4825]: I0310 08:53:25.809982 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-dvswf" 
event={"ID":"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7","Type":"ContainerStarted","Data":"6631c4d3fab0d8b9351360e0cf531c489ee9f223a646c716ccf934efbf85f36b"} Mar 10 08:53:25 crc kubenswrapper[4825]: I0310 08:53:25.845486 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-openstack-cell1-dvswf" podStartSLOduration=2.314951722 podStartE2EDuration="2.845467116s" podCreationTimestamp="2026-03-10 08:53:23 +0000 UTC" firstStartedPulling="2026-03-10 08:53:24.82841609 +0000 UTC m=+7757.858196695" lastFinishedPulling="2026-03-10 08:53:25.358931474 +0000 UTC m=+7758.388712089" observedRunningTime="2026-03-10 08:53:25.831438132 +0000 UTC m=+7758.861218807" watchObservedRunningTime="2026-03-10 08:53:25.845467116 +0000 UTC m=+7758.875247731" Mar 10 08:53:40 crc kubenswrapper[4825]: I0310 08:53:40.237108 4825 scope.go:117] "RemoveContainer" containerID="2d7f4d948979ee71550b4af3d706ff94e76881dcdf895270cec84c86069dd8fe" Mar 10 08:53:40 crc kubenswrapper[4825]: E0310 08:53:40.238252 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:53:53 crc kubenswrapper[4825]: I0310 08:53:53.236717 4825 scope.go:117] "RemoveContainer" containerID="2d7f4d948979ee71550b4af3d706ff94e76881dcdf895270cec84c86069dd8fe" Mar 10 08:53:53 crc kubenswrapper[4825]: E0310 08:53:53.237390 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:54:00 crc kubenswrapper[4825]: I0310 08:54:00.151388 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552214-82k5m"] Mar 10 08:54:00 crc kubenswrapper[4825]: I0310 08:54:00.154587 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552214-82k5m" Mar 10 08:54:00 crc kubenswrapper[4825]: I0310 08:54:00.158432 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 08:54:00 crc kubenswrapper[4825]: I0310 08:54:00.158585 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 08:54:00 crc kubenswrapper[4825]: I0310 08:54:00.158616 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 08:54:00 crc kubenswrapper[4825]: I0310 08:54:00.162776 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552214-82k5m"] Mar 10 08:54:00 crc kubenswrapper[4825]: I0310 08:54:00.283299 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mctjc\" (UniqueName: \"kubernetes.io/projected/e0332297-f277-4a15-966b-d529656eb1b6-kube-api-access-mctjc\") pod \"auto-csr-approver-29552214-82k5m\" (UID: \"e0332297-f277-4a15-966b-d529656eb1b6\") " pod="openshift-infra/auto-csr-approver-29552214-82k5m" Mar 10 08:54:00 crc kubenswrapper[4825]: I0310 08:54:00.384576 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mctjc\" (UniqueName: \"kubernetes.io/projected/e0332297-f277-4a15-966b-d529656eb1b6-kube-api-access-mctjc\") 
pod \"auto-csr-approver-29552214-82k5m\" (UID: \"e0332297-f277-4a15-966b-d529656eb1b6\") " pod="openshift-infra/auto-csr-approver-29552214-82k5m" Mar 10 08:54:00 crc kubenswrapper[4825]: I0310 08:54:00.410879 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mctjc\" (UniqueName: \"kubernetes.io/projected/e0332297-f277-4a15-966b-d529656eb1b6-kube-api-access-mctjc\") pod \"auto-csr-approver-29552214-82k5m\" (UID: \"e0332297-f277-4a15-966b-d529656eb1b6\") " pod="openshift-infra/auto-csr-approver-29552214-82k5m" Mar 10 08:54:00 crc kubenswrapper[4825]: I0310 08:54:00.487392 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552214-82k5m" Mar 10 08:54:00 crc kubenswrapper[4825]: I0310 08:54:00.941489 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552214-82k5m"] Mar 10 08:54:00 crc kubenswrapper[4825]: W0310 08:54:00.944787 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0332297_f277_4a15_966b_d529656eb1b6.slice/crio-63dbfd0fe73dc0f4e93d61d80473e5f8bd843d2949677b1f80cdeb741b14e027 WatchSource:0}: Error finding container 63dbfd0fe73dc0f4e93d61d80473e5f8bd843d2949677b1f80cdeb741b14e027: Status 404 returned error can't find the container with id 63dbfd0fe73dc0f4e93d61d80473e5f8bd843d2949677b1f80cdeb741b14e027 Mar 10 08:54:01 crc kubenswrapper[4825]: I0310 08:54:01.192524 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552214-82k5m" event={"ID":"e0332297-f277-4a15-966b-d529656eb1b6","Type":"ContainerStarted","Data":"63dbfd0fe73dc0f4e93d61d80473e5f8bd843d2949677b1f80cdeb741b14e027"} Mar 10 08:54:02 crc kubenswrapper[4825]: I0310 08:54:02.203872 4825 generic.go:334] "Generic (PLEG): container finished" podID="e0332297-f277-4a15-966b-d529656eb1b6" 
containerID="f319bafbe0385242850f13f1baaf7c73d8b2f7b3e5f67bb20c6a571dbe6448c4" exitCode=0 Mar 10 08:54:02 crc kubenswrapper[4825]: I0310 08:54:02.204038 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552214-82k5m" event={"ID":"e0332297-f277-4a15-966b-d529656eb1b6","Type":"ContainerDied","Data":"f319bafbe0385242850f13f1baaf7c73d8b2f7b3e5f67bb20c6a571dbe6448c4"} Mar 10 08:54:03 crc kubenswrapper[4825]: I0310 08:54:03.629328 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552214-82k5m" Mar 10 08:54:03 crc kubenswrapper[4825]: I0310 08:54:03.758688 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mctjc\" (UniqueName: \"kubernetes.io/projected/e0332297-f277-4a15-966b-d529656eb1b6-kube-api-access-mctjc\") pod \"e0332297-f277-4a15-966b-d529656eb1b6\" (UID: \"e0332297-f277-4a15-966b-d529656eb1b6\") " Mar 10 08:54:03 crc kubenswrapper[4825]: I0310 08:54:03.766859 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0332297-f277-4a15-966b-d529656eb1b6-kube-api-access-mctjc" (OuterVolumeSpecName: "kube-api-access-mctjc") pod "e0332297-f277-4a15-966b-d529656eb1b6" (UID: "e0332297-f277-4a15-966b-d529656eb1b6"). InnerVolumeSpecName "kube-api-access-mctjc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:54:03 crc kubenswrapper[4825]: I0310 08:54:03.862831 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mctjc\" (UniqueName: \"kubernetes.io/projected/e0332297-f277-4a15-966b-d529656eb1b6-kube-api-access-mctjc\") on node \"crc\" DevicePath \"\"" Mar 10 08:54:04 crc kubenswrapper[4825]: I0310 08:54:04.226646 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552214-82k5m" event={"ID":"e0332297-f277-4a15-966b-d529656eb1b6","Type":"ContainerDied","Data":"63dbfd0fe73dc0f4e93d61d80473e5f8bd843d2949677b1f80cdeb741b14e027"} Mar 10 08:54:04 crc kubenswrapper[4825]: I0310 08:54:04.226689 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552214-82k5m" Mar 10 08:54:04 crc kubenswrapper[4825]: I0310 08:54:04.226705 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63dbfd0fe73dc0f4e93d61d80473e5f8bd843d2949677b1f80cdeb741b14e027" Mar 10 08:54:04 crc kubenswrapper[4825]: I0310 08:54:04.236507 4825 scope.go:117] "RemoveContainer" containerID="2d7f4d948979ee71550b4af3d706ff94e76881dcdf895270cec84c86069dd8fe" Mar 10 08:54:04 crc kubenswrapper[4825]: E0310 08:54:04.236888 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:54:04 crc kubenswrapper[4825]: I0310 08:54:04.724437 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552208-sh75f"] Mar 10 08:54:04 crc kubenswrapper[4825]: I0310 08:54:04.732799 4825 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552208-sh75f"] Mar 10 08:54:05 crc kubenswrapper[4825]: I0310 08:54:05.250646 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3671c762-4fd8-4783-89f4-9d1b298717ae" path="/var/lib/kubelet/pods/3671c762-4fd8-4783-89f4-9d1b298717ae/volumes" Mar 10 08:54:18 crc kubenswrapper[4825]: I0310 08:54:18.237108 4825 scope.go:117] "RemoveContainer" containerID="2d7f4d948979ee71550b4af3d706ff94e76881dcdf895270cec84c86069dd8fe" Mar 10 08:54:18 crc kubenswrapper[4825]: E0310 08:54:18.238301 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:54:31 crc kubenswrapper[4825]: I0310 08:54:31.236996 4825 scope.go:117] "RemoveContainer" containerID="2d7f4d948979ee71550b4af3d706ff94e76881dcdf895270cec84c86069dd8fe" Mar 10 08:54:31 crc kubenswrapper[4825]: E0310 08:54:31.237934 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:54:37 crc kubenswrapper[4825]: I0310 08:54:37.283501 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-clfzb"] Mar 10 08:54:37 crc kubenswrapper[4825]: E0310 08:54:37.284458 4825 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e0332297-f277-4a15-966b-d529656eb1b6" containerName="oc" Mar 10 08:54:37 crc kubenswrapper[4825]: I0310 08:54:37.284472 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0332297-f277-4a15-966b-d529656eb1b6" containerName="oc" Mar 10 08:54:37 crc kubenswrapper[4825]: I0310 08:54:37.284697 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0332297-f277-4a15-966b-d529656eb1b6" containerName="oc" Mar 10 08:54:37 crc kubenswrapper[4825]: I0310 08:54:37.286478 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-clfzb" Mar 10 08:54:37 crc kubenswrapper[4825]: I0310 08:54:37.299239 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-clfzb"] Mar 10 08:54:37 crc kubenswrapper[4825]: I0310 08:54:37.386041 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faca2616-e3b8-4342-98c2-8c945703cc1f-utilities\") pod \"community-operators-clfzb\" (UID: \"faca2616-e3b8-4342-98c2-8c945703cc1f\") " pod="openshift-marketplace/community-operators-clfzb" Mar 10 08:54:37 crc kubenswrapper[4825]: I0310 08:54:37.386282 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hrwq\" (UniqueName: \"kubernetes.io/projected/faca2616-e3b8-4342-98c2-8c945703cc1f-kube-api-access-8hrwq\") pod \"community-operators-clfzb\" (UID: \"faca2616-e3b8-4342-98c2-8c945703cc1f\") " pod="openshift-marketplace/community-operators-clfzb" Mar 10 08:54:37 crc kubenswrapper[4825]: I0310 08:54:37.386830 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faca2616-e3b8-4342-98c2-8c945703cc1f-catalog-content\") pod \"community-operators-clfzb\" (UID: \"faca2616-e3b8-4342-98c2-8c945703cc1f\") " 
pod="openshift-marketplace/community-operators-clfzb" Mar 10 08:54:37 crc kubenswrapper[4825]: I0310 08:54:37.476904 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8brl8"] Mar 10 08:54:37 crc kubenswrapper[4825]: I0310 08:54:37.479523 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8brl8" Mar 10 08:54:37 crc kubenswrapper[4825]: I0310 08:54:37.488492 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faca2616-e3b8-4342-98c2-8c945703cc1f-catalog-content\") pod \"community-operators-clfzb\" (UID: \"faca2616-e3b8-4342-98c2-8c945703cc1f\") " pod="openshift-marketplace/community-operators-clfzb" Mar 10 08:54:37 crc kubenswrapper[4825]: I0310 08:54:37.488576 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faca2616-e3b8-4342-98c2-8c945703cc1f-utilities\") pod \"community-operators-clfzb\" (UID: \"faca2616-e3b8-4342-98c2-8c945703cc1f\") " pod="openshift-marketplace/community-operators-clfzb" Mar 10 08:54:37 crc kubenswrapper[4825]: I0310 08:54:37.488647 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hrwq\" (UniqueName: \"kubernetes.io/projected/faca2616-e3b8-4342-98c2-8c945703cc1f-kube-api-access-8hrwq\") pod \"community-operators-clfzb\" (UID: \"faca2616-e3b8-4342-98c2-8c945703cc1f\") " pod="openshift-marketplace/community-operators-clfzb" Mar 10 08:54:37 crc kubenswrapper[4825]: I0310 08:54:37.489103 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faca2616-e3b8-4342-98c2-8c945703cc1f-catalog-content\") pod \"community-operators-clfzb\" (UID: \"faca2616-e3b8-4342-98c2-8c945703cc1f\") " pod="openshift-marketplace/community-operators-clfzb" Mar 
10 08:54:37 crc kubenswrapper[4825]: I0310 08:54:37.489181 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faca2616-e3b8-4342-98c2-8c945703cc1f-utilities\") pod \"community-operators-clfzb\" (UID: \"faca2616-e3b8-4342-98c2-8c945703cc1f\") " pod="openshift-marketplace/community-operators-clfzb" Mar 10 08:54:37 crc kubenswrapper[4825]: I0310 08:54:37.495819 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8brl8"] Mar 10 08:54:37 crc kubenswrapper[4825]: I0310 08:54:37.518582 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hrwq\" (UniqueName: \"kubernetes.io/projected/faca2616-e3b8-4342-98c2-8c945703cc1f-kube-api-access-8hrwq\") pod \"community-operators-clfzb\" (UID: \"faca2616-e3b8-4342-98c2-8c945703cc1f\") " pod="openshift-marketplace/community-operators-clfzb" Mar 10 08:54:37 crc kubenswrapper[4825]: I0310 08:54:37.590563 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkfwv\" (UniqueName: \"kubernetes.io/projected/b1c0ba11-41d9-435e-a895-0b7976356d86-kube-api-access-mkfwv\") pod \"redhat-marketplace-8brl8\" (UID: \"b1c0ba11-41d9-435e-a895-0b7976356d86\") " pod="openshift-marketplace/redhat-marketplace-8brl8" Mar 10 08:54:37 crc kubenswrapper[4825]: I0310 08:54:37.590683 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1c0ba11-41d9-435e-a895-0b7976356d86-utilities\") pod \"redhat-marketplace-8brl8\" (UID: \"b1c0ba11-41d9-435e-a895-0b7976356d86\") " pod="openshift-marketplace/redhat-marketplace-8brl8" Mar 10 08:54:37 crc kubenswrapper[4825]: I0310 08:54:37.590817 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b1c0ba11-41d9-435e-a895-0b7976356d86-catalog-content\") pod \"redhat-marketplace-8brl8\" (UID: \"b1c0ba11-41d9-435e-a895-0b7976356d86\") " pod="openshift-marketplace/redhat-marketplace-8brl8" Mar 10 08:54:37 crc kubenswrapper[4825]: I0310 08:54:37.611579 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-clfzb" Mar 10 08:54:37 crc kubenswrapper[4825]: I0310 08:54:37.692423 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1c0ba11-41d9-435e-a895-0b7976356d86-catalog-content\") pod \"redhat-marketplace-8brl8\" (UID: \"b1c0ba11-41d9-435e-a895-0b7976356d86\") " pod="openshift-marketplace/redhat-marketplace-8brl8" Mar 10 08:54:37 crc kubenswrapper[4825]: I0310 08:54:37.692548 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkfwv\" (UniqueName: \"kubernetes.io/projected/b1c0ba11-41d9-435e-a895-0b7976356d86-kube-api-access-mkfwv\") pod \"redhat-marketplace-8brl8\" (UID: \"b1c0ba11-41d9-435e-a895-0b7976356d86\") " pod="openshift-marketplace/redhat-marketplace-8brl8" Mar 10 08:54:37 crc kubenswrapper[4825]: I0310 08:54:37.692599 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1c0ba11-41d9-435e-a895-0b7976356d86-utilities\") pod \"redhat-marketplace-8brl8\" (UID: \"b1c0ba11-41d9-435e-a895-0b7976356d86\") " pod="openshift-marketplace/redhat-marketplace-8brl8" Mar 10 08:54:37 crc kubenswrapper[4825]: I0310 08:54:37.693034 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1c0ba11-41d9-435e-a895-0b7976356d86-utilities\") pod \"redhat-marketplace-8brl8\" (UID: \"b1c0ba11-41d9-435e-a895-0b7976356d86\") " pod="openshift-marketplace/redhat-marketplace-8brl8" Mar 10 08:54:37 crc 
kubenswrapper[4825]: I0310 08:54:37.693316 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1c0ba11-41d9-435e-a895-0b7976356d86-catalog-content\") pod \"redhat-marketplace-8brl8\" (UID: \"b1c0ba11-41d9-435e-a895-0b7976356d86\") " pod="openshift-marketplace/redhat-marketplace-8brl8" Mar 10 08:54:37 crc kubenswrapper[4825]: I0310 08:54:37.716200 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkfwv\" (UniqueName: \"kubernetes.io/projected/b1c0ba11-41d9-435e-a895-0b7976356d86-kube-api-access-mkfwv\") pod \"redhat-marketplace-8brl8\" (UID: \"b1c0ba11-41d9-435e-a895-0b7976356d86\") " pod="openshift-marketplace/redhat-marketplace-8brl8" Mar 10 08:54:37 crc kubenswrapper[4825]: I0310 08:54:37.800106 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8brl8" Mar 10 08:54:37 crc kubenswrapper[4825]: I0310 08:54:37.991357 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-clfzb"] Mar 10 08:54:38 crc kubenswrapper[4825]: I0310 08:54:38.208891 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8brl8"] Mar 10 08:54:38 crc kubenswrapper[4825]: I0310 08:54:38.558364 4825 generic.go:334] "Generic (PLEG): container finished" podID="faca2616-e3b8-4342-98c2-8c945703cc1f" containerID="243c39eb64537c476018d926d8bc8cf1216032a444e8b25cbf51e0791c0284dd" exitCode=0 Mar 10 08:54:38 crc kubenswrapper[4825]: I0310 08:54:38.558425 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-clfzb" event={"ID":"faca2616-e3b8-4342-98c2-8c945703cc1f","Type":"ContainerDied","Data":"243c39eb64537c476018d926d8bc8cf1216032a444e8b25cbf51e0791c0284dd"} Mar 10 08:54:38 crc kubenswrapper[4825]: I0310 08:54:38.558448 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-clfzb" event={"ID":"faca2616-e3b8-4342-98c2-8c945703cc1f","Type":"ContainerStarted","Data":"018ad67ac1c1460ffd89b74279ca7f3f055a4fa7224ba51fbe9f30169cd7a147"} Mar 10 08:54:38 crc kubenswrapper[4825]: I0310 08:54:38.561823 4825 generic.go:334] "Generic (PLEG): container finished" podID="b1c0ba11-41d9-435e-a895-0b7976356d86" containerID="3cc1d38cd27005526056bf35e7c09c51503c15038709fd3eb7aef1e851c19813" exitCode=0 Mar 10 08:54:38 crc kubenswrapper[4825]: I0310 08:54:38.561887 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8brl8" event={"ID":"b1c0ba11-41d9-435e-a895-0b7976356d86","Type":"ContainerDied","Data":"3cc1d38cd27005526056bf35e7c09c51503c15038709fd3eb7aef1e851c19813"} Mar 10 08:54:38 crc kubenswrapper[4825]: I0310 08:54:38.562128 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8brl8" event={"ID":"b1c0ba11-41d9-435e-a895-0b7976356d86","Type":"ContainerStarted","Data":"8cc66641f2a5a25a3e2e00714e5d2583d48c45ed2a3834b423b4f1c853cd8ec8"} Mar 10 08:54:39 crc kubenswrapper[4825]: I0310 08:54:39.571985 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-clfzb" event={"ID":"faca2616-e3b8-4342-98c2-8c945703cc1f","Type":"ContainerStarted","Data":"f49ff00db0039bd29103b0e7bdf709e17388126a1d86078153c9db7db28c1034"} Mar 10 08:54:39 crc kubenswrapper[4825]: I0310 08:54:39.574195 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8brl8" event={"ID":"b1c0ba11-41d9-435e-a895-0b7976356d86","Type":"ContainerStarted","Data":"3cdd34b4ff15cd8014e216a7044b3bf7777e890ee2a9df623a7badc4e23130f6"} Mar 10 08:54:41 crc kubenswrapper[4825]: I0310 08:54:41.599774 4825 generic.go:334] "Generic (PLEG): container finished" podID="faca2616-e3b8-4342-98c2-8c945703cc1f" containerID="f49ff00db0039bd29103b0e7bdf709e17388126a1d86078153c9db7db28c1034" 
exitCode=0 Mar 10 08:54:41 crc kubenswrapper[4825]: I0310 08:54:41.599913 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-clfzb" event={"ID":"faca2616-e3b8-4342-98c2-8c945703cc1f","Type":"ContainerDied","Data":"f49ff00db0039bd29103b0e7bdf709e17388126a1d86078153c9db7db28c1034"} Mar 10 08:54:41 crc kubenswrapper[4825]: I0310 08:54:41.606082 4825 generic.go:334] "Generic (PLEG): container finished" podID="b1c0ba11-41d9-435e-a895-0b7976356d86" containerID="3cdd34b4ff15cd8014e216a7044b3bf7777e890ee2a9df623a7badc4e23130f6" exitCode=0 Mar 10 08:54:41 crc kubenswrapper[4825]: I0310 08:54:41.606175 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8brl8" event={"ID":"b1c0ba11-41d9-435e-a895-0b7976356d86","Type":"ContainerDied","Data":"3cdd34b4ff15cd8014e216a7044b3bf7777e890ee2a9df623a7badc4e23130f6"} Mar 10 08:54:42 crc kubenswrapper[4825]: I0310 08:54:42.617083 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8brl8" event={"ID":"b1c0ba11-41d9-435e-a895-0b7976356d86","Type":"ContainerStarted","Data":"2342c0598789590ea15abd9423546c59e7a3d63f1e1ee06a576487a919a9ee6e"} Mar 10 08:54:42 crc kubenswrapper[4825]: I0310 08:54:42.619339 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-clfzb" event={"ID":"faca2616-e3b8-4342-98c2-8c945703cc1f","Type":"ContainerStarted","Data":"40028a20cb8808da99c36073065e04d3c390fd82fc3dadf7174cff93b9fe5ada"} Mar 10 08:54:42 crc kubenswrapper[4825]: I0310 08:54:42.636112 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8brl8" podStartSLOduration=2.158466707 podStartE2EDuration="5.636095544s" podCreationTimestamp="2026-03-10 08:54:37 +0000 UTC" firstStartedPulling="2026-03-10 08:54:38.564689466 +0000 UTC m=+7831.594470081" lastFinishedPulling="2026-03-10 08:54:42.042318303 
+0000 UTC m=+7835.072098918" observedRunningTime="2026-03-10 08:54:42.634906903 +0000 UTC m=+7835.664687548" watchObservedRunningTime="2026-03-10 08:54:42.636095544 +0000 UTC m=+7835.665876159" Mar 10 08:54:42 crc kubenswrapper[4825]: I0310 08:54:42.661641 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-clfzb" podStartSLOduration=2.174379852 podStartE2EDuration="5.661619065s" podCreationTimestamp="2026-03-10 08:54:37 +0000 UTC" firstStartedPulling="2026-03-10 08:54:38.561418439 +0000 UTC m=+7831.591199054" lastFinishedPulling="2026-03-10 08:54:42.048657652 +0000 UTC m=+7835.078438267" observedRunningTime="2026-03-10 08:54:42.653526839 +0000 UTC m=+7835.683307454" watchObservedRunningTime="2026-03-10 08:54:42.661619065 +0000 UTC m=+7835.691399680" Mar 10 08:54:46 crc kubenswrapper[4825]: I0310 08:54:46.236887 4825 scope.go:117] "RemoveContainer" containerID="2d7f4d948979ee71550b4af3d706ff94e76881dcdf895270cec84c86069dd8fe" Mar 10 08:54:46 crc kubenswrapper[4825]: E0310 08:54:46.240303 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 08:54:47 crc kubenswrapper[4825]: I0310 08:54:47.612814 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-clfzb" Mar 10 08:54:47 crc kubenswrapper[4825]: I0310 08:54:47.612873 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-clfzb" Mar 10 08:54:47 crc kubenswrapper[4825]: I0310 08:54:47.659105 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/community-operators-clfzb" Mar 10 08:54:47 crc kubenswrapper[4825]: I0310 08:54:47.719081 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-clfzb" Mar 10 08:54:47 crc kubenswrapper[4825]: I0310 08:54:47.801300 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8brl8" Mar 10 08:54:47 crc kubenswrapper[4825]: I0310 08:54:47.801529 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8brl8" Mar 10 08:54:48 crc kubenswrapper[4825]: I0310 08:54:48.480874 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-clfzb"] Mar 10 08:54:48 crc kubenswrapper[4825]: I0310 08:54:48.848755 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-8brl8" podUID="b1c0ba11-41d9-435e-a895-0b7976356d86" containerName="registry-server" probeResult="failure" output=< Mar 10 08:54:48 crc kubenswrapper[4825]: timeout: failed to connect service ":50051" within 1s Mar 10 08:54:48 crc kubenswrapper[4825]: > Mar 10 08:54:49 crc kubenswrapper[4825]: I0310 08:54:49.687747 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-clfzb" podUID="faca2616-e3b8-4342-98c2-8c945703cc1f" containerName="registry-server" containerID="cri-o://40028a20cb8808da99c36073065e04d3c390fd82fc3dadf7174cff93b9fe5ada" gracePeriod=2 Mar 10 08:54:50 crc kubenswrapper[4825]: I0310 08:54:50.169817 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-clfzb" Mar 10 08:54:50 crc kubenswrapper[4825]: I0310 08:54:50.281885 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faca2616-e3b8-4342-98c2-8c945703cc1f-catalog-content\") pod \"faca2616-e3b8-4342-98c2-8c945703cc1f\" (UID: \"faca2616-e3b8-4342-98c2-8c945703cc1f\") " Mar 10 08:54:50 crc kubenswrapper[4825]: I0310 08:54:50.282032 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faca2616-e3b8-4342-98c2-8c945703cc1f-utilities\") pod \"faca2616-e3b8-4342-98c2-8c945703cc1f\" (UID: \"faca2616-e3b8-4342-98c2-8c945703cc1f\") " Mar 10 08:54:50 crc kubenswrapper[4825]: I0310 08:54:50.282146 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hrwq\" (UniqueName: \"kubernetes.io/projected/faca2616-e3b8-4342-98c2-8c945703cc1f-kube-api-access-8hrwq\") pod \"faca2616-e3b8-4342-98c2-8c945703cc1f\" (UID: \"faca2616-e3b8-4342-98c2-8c945703cc1f\") " Mar 10 08:54:50 crc kubenswrapper[4825]: I0310 08:54:50.282879 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/faca2616-e3b8-4342-98c2-8c945703cc1f-utilities" (OuterVolumeSpecName: "utilities") pod "faca2616-e3b8-4342-98c2-8c945703cc1f" (UID: "faca2616-e3b8-4342-98c2-8c945703cc1f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:54:50 crc kubenswrapper[4825]: I0310 08:54:50.289517 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faca2616-e3b8-4342-98c2-8c945703cc1f-kube-api-access-8hrwq" (OuterVolumeSpecName: "kube-api-access-8hrwq") pod "faca2616-e3b8-4342-98c2-8c945703cc1f" (UID: "faca2616-e3b8-4342-98c2-8c945703cc1f"). InnerVolumeSpecName "kube-api-access-8hrwq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:54:50 crc kubenswrapper[4825]: I0310 08:54:50.338896 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/faca2616-e3b8-4342-98c2-8c945703cc1f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "faca2616-e3b8-4342-98c2-8c945703cc1f" (UID: "faca2616-e3b8-4342-98c2-8c945703cc1f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:54:50 crc kubenswrapper[4825]: I0310 08:54:50.384813 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hrwq\" (UniqueName: \"kubernetes.io/projected/faca2616-e3b8-4342-98c2-8c945703cc1f-kube-api-access-8hrwq\") on node \"crc\" DevicePath \"\"" Mar 10 08:54:50 crc kubenswrapper[4825]: I0310 08:54:50.384843 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faca2616-e3b8-4342-98c2-8c945703cc1f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 08:54:50 crc kubenswrapper[4825]: I0310 08:54:50.384851 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faca2616-e3b8-4342-98c2-8c945703cc1f-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 08:54:50 crc kubenswrapper[4825]: I0310 08:54:50.697697 4825 generic.go:334] "Generic (PLEG): container finished" podID="faca2616-e3b8-4342-98c2-8c945703cc1f" containerID="40028a20cb8808da99c36073065e04d3c390fd82fc3dadf7174cff93b9fe5ada" exitCode=0 Mar 10 08:54:50 crc kubenswrapper[4825]: I0310 08:54:50.697991 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-clfzb" event={"ID":"faca2616-e3b8-4342-98c2-8c945703cc1f","Type":"ContainerDied","Data":"40028a20cb8808da99c36073065e04d3c390fd82fc3dadf7174cff93b9fe5ada"} Mar 10 08:54:50 crc kubenswrapper[4825]: I0310 08:54:50.698016 4825 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-clfzb" event={"ID":"faca2616-e3b8-4342-98c2-8c945703cc1f","Type":"ContainerDied","Data":"018ad67ac1c1460ffd89b74279ca7f3f055a4fa7224ba51fbe9f30169cd7a147"} Mar 10 08:54:50 crc kubenswrapper[4825]: I0310 08:54:50.698033 4825 scope.go:117] "RemoveContainer" containerID="40028a20cb8808da99c36073065e04d3c390fd82fc3dadf7174cff93b9fe5ada" Mar 10 08:54:50 crc kubenswrapper[4825]: I0310 08:54:50.698171 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-clfzb" Mar 10 08:54:50 crc kubenswrapper[4825]: I0310 08:54:50.725500 4825 scope.go:117] "RemoveContainer" containerID="f49ff00db0039bd29103b0e7bdf709e17388126a1d86078153c9db7db28c1034" Mar 10 08:54:50 crc kubenswrapper[4825]: I0310 08:54:50.734722 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-clfzb"] Mar 10 08:54:50 crc kubenswrapper[4825]: I0310 08:54:50.742357 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-clfzb"] Mar 10 08:54:50 crc kubenswrapper[4825]: I0310 08:54:50.746722 4825 scope.go:117] "RemoveContainer" containerID="243c39eb64537c476018d926d8bc8cf1216032a444e8b25cbf51e0791c0284dd" Mar 10 08:54:50 crc kubenswrapper[4825]: I0310 08:54:50.795907 4825 scope.go:117] "RemoveContainer" containerID="40028a20cb8808da99c36073065e04d3c390fd82fc3dadf7174cff93b9fe5ada" Mar 10 08:54:50 crc kubenswrapper[4825]: E0310 08:54:50.801604 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40028a20cb8808da99c36073065e04d3c390fd82fc3dadf7174cff93b9fe5ada\": container with ID starting with 40028a20cb8808da99c36073065e04d3c390fd82fc3dadf7174cff93b9fe5ada not found: ID does not exist" containerID="40028a20cb8808da99c36073065e04d3c390fd82fc3dadf7174cff93b9fe5ada" Mar 10 08:54:50 crc kubenswrapper[4825]: I0310 
08:54:50.801711 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40028a20cb8808da99c36073065e04d3c390fd82fc3dadf7174cff93b9fe5ada"} err="failed to get container status \"40028a20cb8808da99c36073065e04d3c390fd82fc3dadf7174cff93b9fe5ada\": rpc error: code = NotFound desc = could not find container \"40028a20cb8808da99c36073065e04d3c390fd82fc3dadf7174cff93b9fe5ada\": container with ID starting with 40028a20cb8808da99c36073065e04d3c390fd82fc3dadf7174cff93b9fe5ada not found: ID does not exist" Mar 10 08:54:50 crc kubenswrapper[4825]: I0310 08:54:50.801756 4825 scope.go:117] "RemoveContainer" containerID="f49ff00db0039bd29103b0e7bdf709e17388126a1d86078153c9db7db28c1034" Mar 10 08:54:50 crc kubenswrapper[4825]: E0310 08:54:50.802261 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f49ff00db0039bd29103b0e7bdf709e17388126a1d86078153c9db7db28c1034\": container with ID starting with f49ff00db0039bd29103b0e7bdf709e17388126a1d86078153c9db7db28c1034 not found: ID does not exist" containerID="f49ff00db0039bd29103b0e7bdf709e17388126a1d86078153c9db7db28c1034" Mar 10 08:54:50 crc kubenswrapper[4825]: I0310 08:54:50.802311 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f49ff00db0039bd29103b0e7bdf709e17388126a1d86078153c9db7db28c1034"} err="failed to get container status \"f49ff00db0039bd29103b0e7bdf709e17388126a1d86078153c9db7db28c1034\": rpc error: code = NotFound desc = could not find container \"f49ff00db0039bd29103b0e7bdf709e17388126a1d86078153c9db7db28c1034\": container with ID starting with f49ff00db0039bd29103b0e7bdf709e17388126a1d86078153c9db7db28c1034 not found: ID does not exist" Mar 10 08:54:50 crc kubenswrapper[4825]: I0310 08:54:50.802337 4825 scope.go:117] "RemoveContainer" containerID="243c39eb64537c476018d926d8bc8cf1216032a444e8b25cbf51e0791c0284dd" Mar 10 08:54:50 crc 
kubenswrapper[4825]: E0310 08:54:50.802671 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"243c39eb64537c476018d926d8bc8cf1216032a444e8b25cbf51e0791c0284dd\": container with ID starting with 243c39eb64537c476018d926d8bc8cf1216032a444e8b25cbf51e0791c0284dd not found: ID does not exist" containerID="243c39eb64537c476018d926d8bc8cf1216032a444e8b25cbf51e0791c0284dd" Mar 10 08:54:50 crc kubenswrapper[4825]: I0310 08:54:50.802727 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"243c39eb64537c476018d926d8bc8cf1216032a444e8b25cbf51e0791c0284dd"} err="failed to get container status \"243c39eb64537c476018d926d8bc8cf1216032a444e8b25cbf51e0791c0284dd\": rpc error: code = NotFound desc = could not find container \"243c39eb64537c476018d926d8bc8cf1216032a444e8b25cbf51e0791c0284dd\": container with ID starting with 243c39eb64537c476018d926d8bc8cf1216032a444e8b25cbf51e0791c0284dd not found: ID does not exist" Mar 10 08:54:51 crc kubenswrapper[4825]: I0310 08:54:51.247631 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="faca2616-e3b8-4342-98c2-8c945703cc1f" path="/var/lib/kubelet/pods/faca2616-e3b8-4342-98c2-8c945703cc1f/volumes" Mar 10 08:54:54 crc kubenswrapper[4825]: I0310 08:54:54.474217 4825 scope.go:117] "RemoveContainer" containerID="46a96044ed1d1ebef3961cb16e8d0e815d88992e28ff3f052c3d3cc15824f7d5" Mar 10 08:54:54 crc kubenswrapper[4825]: I0310 08:54:54.680776 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-b2qvf"] Mar 10 08:54:54 crc kubenswrapper[4825]: E0310 08:54:54.681249 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faca2616-e3b8-4342-98c2-8c945703cc1f" containerName="extract-content" Mar 10 08:54:54 crc kubenswrapper[4825]: I0310 08:54:54.681270 4825 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="faca2616-e3b8-4342-98c2-8c945703cc1f" containerName="extract-content" Mar 10 08:54:54 crc kubenswrapper[4825]: E0310 08:54:54.681300 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faca2616-e3b8-4342-98c2-8c945703cc1f" containerName="registry-server" Mar 10 08:54:54 crc kubenswrapper[4825]: I0310 08:54:54.681309 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="faca2616-e3b8-4342-98c2-8c945703cc1f" containerName="registry-server" Mar 10 08:54:54 crc kubenswrapper[4825]: E0310 08:54:54.681334 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faca2616-e3b8-4342-98c2-8c945703cc1f" containerName="extract-utilities" Mar 10 08:54:54 crc kubenswrapper[4825]: I0310 08:54:54.681342 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="faca2616-e3b8-4342-98c2-8c945703cc1f" containerName="extract-utilities" Mar 10 08:54:54 crc kubenswrapper[4825]: I0310 08:54:54.681604 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="faca2616-e3b8-4342-98c2-8c945703cc1f" containerName="registry-server" Mar 10 08:54:54 crc kubenswrapper[4825]: I0310 08:54:54.684024 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b2qvf" Mar 10 08:54:54 crc kubenswrapper[4825]: I0310 08:54:54.696626 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b2qvf"] Mar 10 08:54:54 crc kubenswrapper[4825]: I0310 08:54:54.788021 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6-catalog-content\") pod \"certified-operators-b2qvf\" (UID: \"499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6\") " pod="openshift-marketplace/certified-operators-b2qvf" Mar 10 08:54:54 crc kubenswrapper[4825]: I0310 08:54:54.788155 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6-utilities\") pod \"certified-operators-b2qvf\" (UID: \"499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6\") " pod="openshift-marketplace/certified-operators-b2qvf" Mar 10 08:54:54 crc kubenswrapper[4825]: I0310 08:54:54.788210 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhjvz\" (UniqueName: \"kubernetes.io/projected/499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6-kube-api-access-nhjvz\") pod \"certified-operators-b2qvf\" (UID: \"499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6\") " pod="openshift-marketplace/certified-operators-b2qvf" Mar 10 08:54:54 crc kubenswrapper[4825]: I0310 08:54:54.889989 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6-catalog-content\") pod \"certified-operators-b2qvf\" (UID: \"499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6\") " pod="openshift-marketplace/certified-operators-b2qvf" Mar 10 08:54:54 crc kubenswrapper[4825]: I0310 08:54:54.890164 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6-utilities\") pod \"certified-operators-b2qvf\" (UID: \"499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6\") " pod="openshift-marketplace/certified-operators-b2qvf" Mar 10 08:54:54 crc kubenswrapper[4825]: I0310 08:54:54.890557 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6-catalog-content\") pod \"certified-operators-b2qvf\" (UID: \"499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6\") " pod="openshift-marketplace/certified-operators-b2qvf" Mar 10 08:54:54 crc kubenswrapper[4825]: I0310 08:54:54.890651 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6-utilities\") pod \"certified-operators-b2qvf\" (UID: \"499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6\") " pod="openshift-marketplace/certified-operators-b2qvf" Mar 10 08:54:54 crc kubenswrapper[4825]: I0310 08:54:54.890729 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhjvz\" (UniqueName: \"kubernetes.io/projected/499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6-kube-api-access-nhjvz\") pod \"certified-operators-b2qvf\" (UID: \"499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6\") " pod="openshift-marketplace/certified-operators-b2qvf" Mar 10 08:54:54 crc kubenswrapper[4825]: I0310 08:54:54.911351 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhjvz\" (UniqueName: \"kubernetes.io/projected/499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6-kube-api-access-nhjvz\") pod \"certified-operators-b2qvf\" (UID: \"499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6\") " pod="openshift-marketplace/certified-operators-b2qvf" Mar 10 08:54:55 crc kubenswrapper[4825]: I0310 08:54:55.006479 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b2qvf" Mar 10 08:54:55 crc kubenswrapper[4825]: I0310 08:54:55.535476 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b2qvf"] Mar 10 08:54:55 crc kubenswrapper[4825]: I0310 08:54:55.759788 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b2qvf" event={"ID":"499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6","Type":"ContainerStarted","Data":"0e9bb90b0851861d1f30fd40e09bd218db862f453d23e71e76eb44850d9af8f6"} Mar 10 08:54:56 crc kubenswrapper[4825]: I0310 08:54:56.777069 4825 generic.go:334] "Generic (PLEG): container finished" podID="499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6" containerID="519d41ce8171fde9c493c61e748ac0a598c4bd79dcc9dfa37a03953f898e7bf7" exitCode=0 Mar 10 08:54:56 crc kubenswrapper[4825]: I0310 08:54:56.777183 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b2qvf" event={"ID":"499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6","Type":"ContainerDied","Data":"519d41ce8171fde9c493c61e748ac0a598c4bd79dcc9dfa37a03953f898e7bf7"} Mar 10 08:54:57 crc kubenswrapper[4825]: I0310 08:54:57.788855 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b2qvf" event={"ID":"499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6","Type":"ContainerStarted","Data":"d32c5fb5de3bfc758de4628b2f4ec9c29627e05025bb7c87e01697bbe3998ea4"} Mar 10 08:54:57 crc kubenswrapper[4825]: I0310 08:54:57.851042 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8brl8" Mar 10 08:54:57 crc kubenswrapper[4825]: I0310 08:54:57.905643 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8brl8" Mar 10 08:54:59 crc kubenswrapper[4825]: I0310 08:54:59.810329 4825 generic.go:334] "Generic (PLEG): container finished" 
podID="499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6" containerID="d32c5fb5de3bfc758de4628b2f4ec9c29627e05025bb7c87e01697bbe3998ea4" exitCode=0 Mar 10 08:54:59 crc kubenswrapper[4825]: I0310 08:54:59.811003 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b2qvf" event={"ID":"499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6","Type":"ContainerDied","Data":"d32c5fb5de3bfc758de4628b2f4ec9c29627e05025bb7c87e01697bbe3998ea4"} Mar 10 08:55:00 crc kubenswrapper[4825]: I0310 08:55:00.237467 4825 scope.go:117] "RemoveContainer" containerID="2d7f4d948979ee71550b4af3d706ff94e76881dcdf895270cec84c86069dd8fe" Mar 10 08:55:00 crc kubenswrapper[4825]: I0310 08:55:00.276468 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8brl8"] Mar 10 08:55:00 crc kubenswrapper[4825]: I0310 08:55:00.276733 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8brl8" podUID="b1c0ba11-41d9-435e-a895-0b7976356d86" containerName="registry-server" containerID="cri-o://2342c0598789590ea15abd9423546c59e7a3d63f1e1ee06a576487a919a9ee6e" gracePeriod=2 Mar 10 08:55:00 crc kubenswrapper[4825]: I0310 08:55:00.768660 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8brl8" Mar 10 08:55:00 crc kubenswrapper[4825]: I0310 08:55:00.824256 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b2qvf" event={"ID":"499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6","Type":"ContainerStarted","Data":"8db9472115525c48ff0dc3a8df3eb2d986ef4d4a5375eb045fe28689eb486509"} Mar 10 08:55:00 crc kubenswrapper[4825]: I0310 08:55:00.827053 4825 generic.go:334] "Generic (PLEG): container finished" podID="b1c0ba11-41d9-435e-a895-0b7976356d86" containerID="2342c0598789590ea15abd9423546c59e7a3d63f1e1ee06a576487a919a9ee6e" exitCode=0 Mar 10 08:55:00 crc kubenswrapper[4825]: I0310 08:55:00.827100 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8brl8" Mar 10 08:55:00 crc kubenswrapper[4825]: I0310 08:55:00.827105 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8brl8" event={"ID":"b1c0ba11-41d9-435e-a895-0b7976356d86","Type":"ContainerDied","Data":"2342c0598789590ea15abd9423546c59e7a3d63f1e1ee06a576487a919a9ee6e"} Mar 10 08:55:00 crc kubenswrapper[4825]: I0310 08:55:00.827160 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8brl8" event={"ID":"b1c0ba11-41d9-435e-a895-0b7976356d86","Type":"ContainerDied","Data":"8cc66641f2a5a25a3e2e00714e5d2583d48c45ed2a3834b423b4f1c853cd8ec8"} Mar 10 08:55:00 crc kubenswrapper[4825]: I0310 08:55:00.827180 4825 scope.go:117] "RemoveContainer" containerID="2342c0598789590ea15abd9423546c59e7a3d63f1e1ee06a576487a919a9ee6e" Mar 10 08:55:00 crc kubenswrapper[4825]: I0310 08:55:00.831821 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" 
event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerStarted","Data":"81b5ce52c08a356df2c6362cda9b87f444da49d61e188354e699ec5130fe40a8"} Mar 10 08:55:00 crc kubenswrapper[4825]: I0310 08:55:00.848224 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-b2qvf" podStartSLOduration=3.25444318 podStartE2EDuration="6.848202863s" podCreationTimestamp="2026-03-10 08:54:54 +0000 UTC" firstStartedPulling="2026-03-10 08:54:56.779487627 +0000 UTC m=+7849.809268252" lastFinishedPulling="2026-03-10 08:55:00.37324732 +0000 UTC m=+7853.403027935" observedRunningTime="2026-03-10 08:55:00.841440413 +0000 UTC m=+7853.871221048" watchObservedRunningTime="2026-03-10 08:55:00.848202863 +0000 UTC m=+7853.877983478" Mar 10 08:55:00 crc kubenswrapper[4825]: I0310 08:55:00.852486 4825 scope.go:117] "RemoveContainer" containerID="3cdd34b4ff15cd8014e216a7044b3bf7777e890ee2a9df623a7badc4e23130f6" Mar 10 08:55:00 crc kubenswrapper[4825]: I0310 08:55:00.887480 4825 scope.go:117] "RemoveContainer" containerID="3cc1d38cd27005526056bf35e7c09c51503c15038709fd3eb7aef1e851c19813" Mar 10 08:55:00 crc kubenswrapper[4825]: I0310 08:55:00.929096 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1c0ba11-41d9-435e-a895-0b7976356d86-catalog-content\") pod \"b1c0ba11-41d9-435e-a895-0b7976356d86\" (UID: \"b1c0ba11-41d9-435e-a895-0b7976356d86\") " Mar 10 08:55:00 crc kubenswrapper[4825]: I0310 08:55:00.929329 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkfwv\" (UniqueName: \"kubernetes.io/projected/b1c0ba11-41d9-435e-a895-0b7976356d86-kube-api-access-mkfwv\") pod \"b1c0ba11-41d9-435e-a895-0b7976356d86\" (UID: \"b1c0ba11-41d9-435e-a895-0b7976356d86\") " Mar 10 08:55:00 crc kubenswrapper[4825]: I0310 08:55:00.929452 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1c0ba11-41d9-435e-a895-0b7976356d86-utilities\") pod \"b1c0ba11-41d9-435e-a895-0b7976356d86\" (UID: \"b1c0ba11-41d9-435e-a895-0b7976356d86\") " Mar 10 08:55:00 crc kubenswrapper[4825]: I0310 08:55:00.933351 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1c0ba11-41d9-435e-a895-0b7976356d86-utilities" (OuterVolumeSpecName: "utilities") pod "b1c0ba11-41d9-435e-a895-0b7976356d86" (UID: "b1c0ba11-41d9-435e-a895-0b7976356d86"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:55:00 crc kubenswrapper[4825]: I0310 08:55:00.937926 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1c0ba11-41d9-435e-a895-0b7976356d86-kube-api-access-mkfwv" (OuterVolumeSpecName: "kube-api-access-mkfwv") pod "b1c0ba11-41d9-435e-a895-0b7976356d86" (UID: "b1c0ba11-41d9-435e-a895-0b7976356d86"). InnerVolumeSpecName "kube-api-access-mkfwv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:55:00 crc kubenswrapper[4825]: I0310 08:55:00.938951 4825 scope.go:117] "RemoveContainer" containerID="2342c0598789590ea15abd9423546c59e7a3d63f1e1ee06a576487a919a9ee6e" Mar 10 08:55:00 crc kubenswrapper[4825]: E0310 08:55:00.939677 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2342c0598789590ea15abd9423546c59e7a3d63f1e1ee06a576487a919a9ee6e\": container with ID starting with 2342c0598789590ea15abd9423546c59e7a3d63f1e1ee06a576487a919a9ee6e not found: ID does not exist" containerID="2342c0598789590ea15abd9423546c59e7a3d63f1e1ee06a576487a919a9ee6e" Mar 10 08:55:00 crc kubenswrapper[4825]: I0310 08:55:00.939710 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2342c0598789590ea15abd9423546c59e7a3d63f1e1ee06a576487a919a9ee6e"} err="failed to get container status \"2342c0598789590ea15abd9423546c59e7a3d63f1e1ee06a576487a919a9ee6e\": rpc error: code = NotFound desc = could not find container \"2342c0598789590ea15abd9423546c59e7a3d63f1e1ee06a576487a919a9ee6e\": container with ID starting with 2342c0598789590ea15abd9423546c59e7a3d63f1e1ee06a576487a919a9ee6e not found: ID does not exist" Mar 10 08:55:00 crc kubenswrapper[4825]: I0310 08:55:00.939731 4825 scope.go:117] "RemoveContainer" containerID="3cdd34b4ff15cd8014e216a7044b3bf7777e890ee2a9df623a7badc4e23130f6" Mar 10 08:55:00 crc kubenswrapper[4825]: E0310 08:55:00.940053 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cdd34b4ff15cd8014e216a7044b3bf7777e890ee2a9df623a7badc4e23130f6\": container with ID starting with 3cdd34b4ff15cd8014e216a7044b3bf7777e890ee2a9df623a7badc4e23130f6 not found: ID does not exist" containerID="3cdd34b4ff15cd8014e216a7044b3bf7777e890ee2a9df623a7badc4e23130f6" Mar 10 08:55:00 crc kubenswrapper[4825]: I0310 08:55:00.940075 
4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cdd34b4ff15cd8014e216a7044b3bf7777e890ee2a9df623a7badc4e23130f6"} err="failed to get container status \"3cdd34b4ff15cd8014e216a7044b3bf7777e890ee2a9df623a7badc4e23130f6\": rpc error: code = NotFound desc = could not find container \"3cdd34b4ff15cd8014e216a7044b3bf7777e890ee2a9df623a7badc4e23130f6\": container with ID starting with 3cdd34b4ff15cd8014e216a7044b3bf7777e890ee2a9df623a7badc4e23130f6 not found: ID does not exist" Mar 10 08:55:00 crc kubenswrapper[4825]: I0310 08:55:00.940088 4825 scope.go:117] "RemoveContainer" containerID="3cc1d38cd27005526056bf35e7c09c51503c15038709fd3eb7aef1e851c19813" Mar 10 08:55:00 crc kubenswrapper[4825]: E0310 08:55:00.940326 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cc1d38cd27005526056bf35e7c09c51503c15038709fd3eb7aef1e851c19813\": container with ID starting with 3cc1d38cd27005526056bf35e7c09c51503c15038709fd3eb7aef1e851c19813 not found: ID does not exist" containerID="3cc1d38cd27005526056bf35e7c09c51503c15038709fd3eb7aef1e851c19813" Mar 10 08:55:00 crc kubenswrapper[4825]: I0310 08:55:00.940349 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cc1d38cd27005526056bf35e7c09c51503c15038709fd3eb7aef1e851c19813"} err="failed to get container status \"3cc1d38cd27005526056bf35e7c09c51503c15038709fd3eb7aef1e851c19813\": rpc error: code = NotFound desc = could not find container \"3cc1d38cd27005526056bf35e7c09c51503c15038709fd3eb7aef1e851c19813\": container with ID starting with 3cc1d38cd27005526056bf35e7c09c51503c15038709fd3eb7aef1e851c19813 not found: ID does not exist" Mar 10 08:55:00 crc kubenswrapper[4825]: I0310 08:55:00.958163 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1c0ba11-41d9-435e-a895-0b7976356d86-catalog-content" 
(OuterVolumeSpecName: "catalog-content") pod "b1c0ba11-41d9-435e-a895-0b7976356d86" (UID: "b1c0ba11-41d9-435e-a895-0b7976356d86"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:55:01 crc kubenswrapper[4825]: I0310 08:55:01.032282 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkfwv\" (UniqueName: \"kubernetes.io/projected/b1c0ba11-41d9-435e-a895-0b7976356d86-kube-api-access-mkfwv\") on node \"crc\" DevicePath \"\"" Mar 10 08:55:01 crc kubenswrapper[4825]: I0310 08:55:01.032317 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1c0ba11-41d9-435e-a895-0b7976356d86-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 08:55:01 crc kubenswrapper[4825]: I0310 08:55:01.032328 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1c0ba11-41d9-435e-a895-0b7976356d86-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 08:55:01 crc kubenswrapper[4825]: I0310 08:55:01.162247 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8brl8"] Mar 10 08:55:01 crc kubenswrapper[4825]: I0310 08:55:01.175223 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8brl8"] Mar 10 08:55:01 crc kubenswrapper[4825]: I0310 08:55:01.248592 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1c0ba11-41d9-435e-a895-0b7976356d86" path="/var/lib/kubelet/pods/b1c0ba11-41d9-435e-a895-0b7976356d86/volumes" Mar 10 08:55:05 crc kubenswrapper[4825]: I0310 08:55:05.007419 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-b2qvf" Mar 10 08:55:05 crc kubenswrapper[4825]: I0310 08:55:05.008079 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-b2qvf" Mar 10 08:55:06 crc kubenswrapper[4825]: I0310 08:55:06.072813 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-b2qvf" podUID="499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6" containerName="registry-server" probeResult="failure" output=< Mar 10 08:55:06 crc kubenswrapper[4825]: timeout: failed to connect service ":50051" within 1s Mar 10 08:55:06 crc kubenswrapper[4825]: > Mar 10 08:55:15 crc kubenswrapper[4825]: I0310 08:55:15.067825 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-b2qvf" Mar 10 08:55:15 crc kubenswrapper[4825]: I0310 08:55:15.114560 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-b2qvf" Mar 10 08:55:15 crc kubenswrapper[4825]: I0310 08:55:15.309875 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b2qvf"] Mar 10 08:55:17 crc kubenswrapper[4825]: I0310 08:55:17.892629 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-b2qvf" podUID="499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6" containerName="registry-server" containerID="cri-o://8db9472115525c48ff0dc3a8df3eb2d986ef4d4a5375eb045fe28689eb486509" gracePeriod=2 Mar 10 08:55:18 crc kubenswrapper[4825]: I0310 08:55:18.423270 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b2qvf" Mar 10 08:55:18 crc kubenswrapper[4825]: I0310 08:55:18.506410 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6-catalog-content\") pod \"499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6\" (UID: \"499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6\") " Mar 10 08:55:18 crc kubenswrapper[4825]: I0310 08:55:18.506455 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhjvz\" (UniqueName: \"kubernetes.io/projected/499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6-kube-api-access-nhjvz\") pod \"499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6\" (UID: \"499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6\") " Mar 10 08:55:18 crc kubenswrapper[4825]: I0310 08:55:18.506537 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6-utilities\") pod \"499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6\" (UID: \"499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6\") " Mar 10 08:55:18 crc kubenswrapper[4825]: I0310 08:55:18.507951 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6-utilities" (OuterVolumeSpecName: "utilities") pod "499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6" (UID: "499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:55:18 crc kubenswrapper[4825]: I0310 08:55:18.516397 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6-kube-api-access-nhjvz" (OuterVolumeSpecName: "kube-api-access-nhjvz") pod "499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6" (UID: "499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6"). InnerVolumeSpecName "kube-api-access-nhjvz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:55:18 crc kubenswrapper[4825]: I0310 08:55:18.560786 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6" (UID: "499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:55:18 crc kubenswrapper[4825]: I0310 08:55:18.609223 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 08:55:18 crc kubenswrapper[4825]: I0310 08:55:18.609258 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhjvz\" (UniqueName: \"kubernetes.io/projected/499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6-kube-api-access-nhjvz\") on node \"crc\" DevicePath \"\"" Mar 10 08:55:18 crc kubenswrapper[4825]: I0310 08:55:18.609267 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 08:55:18 crc kubenswrapper[4825]: I0310 08:55:18.892411 4825 generic.go:334] "Generic (PLEG): container finished" podID="499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6" containerID="8db9472115525c48ff0dc3a8df3eb2d986ef4d4a5375eb045fe28689eb486509" exitCode=0 Mar 10 08:55:18 crc kubenswrapper[4825]: I0310 08:55:18.892456 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b2qvf" event={"ID":"499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6","Type":"ContainerDied","Data":"8db9472115525c48ff0dc3a8df3eb2d986ef4d4a5375eb045fe28689eb486509"} Mar 10 08:55:18 crc kubenswrapper[4825]: I0310 08:55:18.892505 4825 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b2qvf" Mar 10 08:55:18 crc kubenswrapper[4825]: I0310 08:55:18.892527 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b2qvf" event={"ID":"499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6","Type":"ContainerDied","Data":"0e9bb90b0851861d1f30fd40e09bd218db862f453d23e71e76eb44850d9af8f6"} Mar 10 08:55:18 crc kubenswrapper[4825]: I0310 08:55:18.892547 4825 scope.go:117] "RemoveContainer" containerID="8db9472115525c48ff0dc3a8df3eb2d986ef4d4a5375eb045fe28689eb486509" Mar 10 08:55:18 crc kubenswrapper[4825]: I0310 08:55:18.929363 4825 scope.go:117] "RemoveContainer" containerID="d32c5fb5de3bfc758de4628b2f4ec9c29627e05025bb7c87e01697bbe3998ea4" Mar 10 08:55:18 crc kubenswrapper[4825]: I0310 08:55:18.935967 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b2qvf"] Mar 10 08:55:18 crc kubenswrapper[4825]: I0310 08:55:18.947425 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-b2qvf"] Mar 10 08:55:18 crc kubenswrapper[4825]: I0310 08:55:18.953366 4825 scope.go:117] "RemoveContainer" containerID="519d41ce8171fde9c493c61e748ac0a598c4bd79dcc9dfa37a03953f898e7bf7" Mar 10 08:55:18 crc kubenswrapper[4825]: I0310 08:55:18.998992 4825 scope.go:117] "RemoveContainer" containerID="8db9472115525c48ff0dc3a8df3eb2d986ef4d4a5375eb045fe28689eb486509" Mar 10 08:55:18 crc kubenswrapper[4825]: E0310 08:55:18.999335 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8db9472115525c48ff0dc3a8df3eb2d986ef4d4a5375eb045fe28689eb486509\": container with ID starting with 8db9472115525c48ff0dc3a8df3eb2d986ef4d4a5375eb045fe28689eb486509 not found: ID does not exist" containerID="8db9472115525c48ff0dc3a8df3eb2d986ef4d4a5375eb045fe28689eb486509" Mar 10 08:55:18 crc kubenswrapper[4825]: I0310 08:55:18.999366 
4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8db9472115525c48ff0dc3a8df3eb2d986ef4d4a5375eb045fe28689eb486509"} err="failed to get container status \"8db9472115525c48ff0dc3a8df3eb2d986ef4d4a5375eb045fe28689eb486509\": rpc error: code = NotFound desc = could not find container \"8db9472115525c48ff0dc3a8df3eb2d986ef4d4a5375eb045fe28689eb486509\": container with ID starting with 8db9472115525c48ff0dc3a8df3eb2d986ef4d4a5375eb045fe28689eb486509 not found: ID does not exist" Mar 10 08:55:18 crc kubenswrapper[4825]: I0310 08:55:18.999406 4825 scope.go:117] "RemoveContainer" containerID="d32c5fb5de3bfc758de4628b2f4ec9c29627e05025bb7c87e01697bbe3998ea4" Mar 10 08:55:18 crc kubenswrapper[4825]: E0310 08:55:18.999610 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d32c5fb5de3bfc758de4628b2f4ec9c29627e05025bb7c87e01697bbe3998ea4\": container with ID starting with d32c5fb5de3bfc758de4628b2f4ec9c29627e05025bb7c87e01697bbe3998ea4 not found: ID does not exist" containerID="d32c5fb5de3bfc758de4628b2f4ec9c29627e05025bb7c87e01697bbe3998ea4" Mar 10 08:55:18 crc kubenswrapper[4825]: I0310 08:55:18.999626 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d32c5fb5de3bfc758de4628b2f4ec9c29627e05025bb7c87e01697bbe3998ea4"} err="failed to get container status \"d32c5fb5de3bfc758de4628b2f4ec9c29627e05025bb7c87e01697bbe3998ea4\": rpc error: code = NotFound desc = could not find container \"d32c5fb5de3bfc758de4628b2f4ec9c29627e05025bb7c87e01697bbe3998ea4\": container with ID starting with d32c5fb5de3bfc758de4628b2f4ec9c29627e05025bb7c87e01697bbe3998ea4 not found: ID does not exist" Mar 10 08:55:18 crc kubenswrapper[4825]: I0310 08:55:18.999640 4825 scope.go:117] "RemoveContainer" containerID="519d41ce8171fde9c493c61e748ac0a598c4bd79dcc9dfa37a03953f898e7bf7" Mar 10 08:55:18 crc kubenswrapper[4825]: E0310 
08:55:18.999821 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"519d41ce8171fde9c493c61e748ac0a598c4bd79dcc9dfa37a03953f898e7bf7\": container with ID starting with 519d41ce8171fde9c493c61e748ac0a598c4bd79dcc9dfa37a03953f898e7bf7 not found: ID does not exist" containerID="519d41ce8171fde9c493c61e748ac0a598c4bd79dcc9dfa37a03953f898e7bf7" Mar 10 08:55:18 crc kubenswrapper[4825]: I0310 08:55:18.999836 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"519d41ce8171fde9c493c61e748ac0a598c4bd79dcc9dfa37a03953f898e7bf7"} err="failed to get container status \"519d41ce8171fde9c493c61e748ac0a598c4bd79dcc9dfa37a03953f898e7bf7\": rpc error: code = NotFound desc = could not find container \"519d41ce8171fde9c493c61e748ac0a598c4bd79dcc9dfa37a03953f898e7bf7\": container with ID starting with 519d41ce8171fde9c493c61e748ac0a598c4bd79dcc9dfa37a03953f898e7bf7 not found: ID does not exist" Mar 10 08:55:19 crc kubenswrapper[4825]: I0310 08:55:19.247723 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6" path="/var/lib/kubelet/pods/499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6/volumes" Mar 10 08:56:00 crc kubenswrapper[4825]: I0310 08:56:00.167353 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552216-jw2q2"] Mar 10 08:56:00 crc kubenswrapper[4825]: E0310 08:56:00.169190 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1c0ba11-41d9-435e-a895-0b7976356d86" containerName="extract-utilities" Mar 10 08:56:00 crc kubenswrapper[4825]: I0310 08:56:00.169260 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1c0ba11-41d9-435e-a895-0b7976356d86" containerName="extract-utilities" Mar 10 08:56:00 crc kubenswrapper[4825]: E0310 08:56:00.169276 4825 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b1c0ba11-41d9-435e-a895-0b7976356d86" containerName="extract-content" Mar 10 08:56:00 crc kubenswrapper[4825]: I0310 08:56:00.169290 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1c0ba11-41d9-435e-a895-0b7976356d86" containerName="extract-content" Mar 10 08:56:00 crc kubenswrapper[4825]: E0310 08:56:00.169336 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6" containerName="extract-utilities" Mar 10 08:56:00 crc kubenswrapper[4825]: I0310 08:56:00.169349 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6" containerName="extract-utilities" Mar 10 08:56:00 crc kubenswrapper[4825]: E0310 08:56:00.169375 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1c0ba11-41d9-435e-a895-0b7976356d86" containerName="registry-server" Mar 10 08:56:00 crc kubenswrapper[4825]: I0310 08:56:00.169388 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1c0ba11-41d9-435e-a895-0b7976356d86" containerName="registry-server" Mar 10 08:56:00 crc kubenswrapper[4825]: E0310 08:56:00.169419 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6" containerName="registry-server" Mar 10 08:56:00 crc kubenswrapper[4825]: I0310 08:56:00.169431 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6" containerName="registry-server" Mar 10 08:56:00 crc kubenswrapper[4825]: E0310 08:56:00.169471 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6" containerName="extract-content" Mar 10 08:56:00 crc kubenswrapper[4825]: I0310 08:56:00.169483 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6" containerName="extract-content" Mar 10 08:56:00 crc kubenswrapper[4825]: I0310 08:56:00.169827 4825 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="499a0f4a-9ae5-418b-8700-c2e1b2d3c2a6" containerName="registry-server" Mar 10 08:56:00 crc kubenswrapper[4825]: I0310 08:56:00.169873 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1c0ba11-41d9-435e-a895-0b7976356d86" containerName="registry-server" Mar 10 08:56:00 crc kubenswrapper[4825]: I0310 08:56:00.171342 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552216-jw2q2" Mar 10 08:56:00 crc kubenswrapper[4825]: I0310 08:56:00.177022 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 08:56:00 crc kubenswrapper[4825]: I0310 08:56:00.177074 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 08:56:00 crc kubenswrapper[4825]: I0310 08:56:00.177316 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 08:56:00 crc kubenswrapper[4825]: I0310 08:56:00.185189 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552216-jw2q2"] Mar 10 08:56:00 crc kubenswrapper[4825]: I0310 08:56:00.331644 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgt2b\" (UniqueName: \"kubernetes.io/projected/aa238966-2baf-447e-a928-fd65e87c30b1-kube-api-access-wgt2b\") pod \"auto-csr-approver-29552216-jw2q2\" (UID: \"aa238966-2baf-447e-a928-fd65e87c30b1\") " pod="openshift-infra/auto-csr-approver-29552216-jw2q2" Mar 10 08:56:00 crc kubenswrapper[4825]: I0310 08:56:00.433680 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgt2b\" (UniqueName: \"kubernetes.io/projected/aa238966-2baf-447e-a928-fd65e87c30b1-kube-api-access-wgt2b\") pod \"auto-csr-approver-29552216-jw2q2\" (UID: \"aa238966-2baf-447e-a928-fd65e87c30b1\") " 
pod="openshift-infra/auto-csr-approver-29552216-jw2q2" Mar 10 08:56:00 crc kubenswrapper[4825]: I0310 08:56:00.452824 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgt2b\" (UniqueName: \"kubernetes.io/projected/aa238966-2baf-447e-a928-fd65e87c30b1-kube-api-access-wgt2b\") pod \"auto-csr-approver-29552216-jw2q2\" (UID: \"aa238966-2baf-447e-a928-fd65e87c30b1\") " pod="openshift-infra/auto-csr-approver-29552216-jw2q2" Mar 10 08:56:00 crc kubenswrapper[4825]: I0310 08:56:00.506020 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552216-jw2q2" Mar 10 08:56:00 crc kubenswrapper[4825]: W0310 08:56:00.953206 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa238966_2baf_447e_a928_fd65e87c30b1.slice/crio-bca24544ecd6dd18717cd4acaf420260382dd7f87f982ac55fedbb8f5336748f WatchSource:0}: Error finding container bca24544ecd6dd18717cd4acaf420260382dd7f87f982ac55fedbb8f5336748f: Status 404 returned error can't find the container with id bca24544ecd6dd18717cd4acaf420260382dd7f87f982ac55fedbb8f5336748f Mar 10 08:56:00 crc kubenswrapper[4825]: I0310 08:56:00.954504 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552216-jw2q2"] Mar 10 08:56:01 crc kubenswrapper[4825]: I0310 08:56:01.299621 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552216-jw2q2" event={"ID":"aa238966-2baf-447e-a928-fd65e87c30b1","Type":"ContainerStarted","Data":"bca24544ecd6dd18717cd4acaf420260382dd7f87f982ac55fedbb8f5336748f"} Mar 10 08:56:02 crc kubenswrapper[4825]: I0310 08:56:02.309796 4825 generic.go:334] "Generic (PLEG): container finished" podID="aa238966-2baf-447e-a928-fd65e87c30b1" containerID="69882d7986da8f262c50e1ddef78696c7150e81dc9f5e3cfa16d79a989865d79" exitCode=0 Mar 10 08:56:02 crc kubenswrapper[4825]: 
I0310 08:56:02.310024 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552216-jw2q2" event={"ID":"aa238966-2baf-447e-a928-fd65e87c30b1","Type":"ContainerDied","Data":"69882d7986da8f262c50e1ddef78696c7150e81dc9f5e3cfa16d79a989865d79"} Mar 10 08:56:03 crc kubenswrapper[4825]: I0310 08:56:03.633736 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552216-jw2q2" Mar 10 08:56:03 crc kubenswrapper[4825]: I0310 08:56:03.699805 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgt2b\" (UniqueName: \"kubernetes.io/projected/aa238966-2baf-447e-a928-fd65e87c30b1-kube-api-access-wgt2b\") pod \"aa238966-2baf-447e-a928-fd65e87c30b1\" (UID: \"aa238966-2baf-447e-a928-fd65e87c30b1\") " Mar 10 08:56:03 crc kubenswrapper[4825]: I0310 08:56:03.707885 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa238966-2baf-447e-a928-fd65e87c30b1-kube-api-access-wgt2b" (OuterVolumeSpecName: "kube-api-access-wgt2b") pod "aa238966-2baf-447e-a928-fd65e87c30b1" (UID: "aa238966-2baf-447e-a928-fd65e87c30b1"). InnerVolumeSpecName "kube-api-access-wgt2b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:56:03 crc kubenswrapper[4825]: I0310 08:56:03.803110 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgt2b\" (UniqueName: \"kubernetes.io/projected/aa238966-2baf-447e-a928-fd65e87c30b1-kube-api-access-wgt2b\") on node \"crc\" DevicePath \"\"" Mar 10 08:56:04 crc kubenswrapper[4825]: I0310 08:56:04.328944 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552216-jw2q2" event={"ID":"aa238966-2baf-447e-a928-fd65e87c30b1","Type":"ContainerDied","Data":"bca24544ecd6dd18717cd4acaf420260382dd7f87f982ac55fedbb8f5336748f"} Mar 10 08:56:04 crc kubenswrapper[4825]: I0310 08:56:04.329294 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bca24544ecd6dd18717cd4acaf420260382dd7f87f982ac55fedbb8f5336748f" Mar 10 08:56:04 crc kubenswrapper[4825]: I0310 08:56:04.329023 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552216-jw2q2" Mar 10 08:56:04 crc kubenswrapper[4825]: I0310 08:56:04.723704 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552210-mkgwn"] Mar 10 08:56:04 crc kubenswrapper[4825]: I0310 08:56:04.735066 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552210-mkgwn"] Mar 10 08:56:05 crc kubenswrapper[4825]: I0310 08:56:05.252666 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b87021f-252d-48d8-87a2-6d48161f495e" path="/var/lib/kubelet/pods/5b87021f-252d-48d8-87a2-6d48161f495e/volumes" Mar 10 08:56:05 crc kubenswrapper[4825]: I0310 08:56:05.340549 4825 generic.go:334] "Generic (PLEG): container finished" podID="49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7" containerID="6e359258a7dea541056e2079b68e158683b84e730fe1e87547a2ba124bb1b6b3" exitCode=0 Mar 10 08:56:05 crc kubenswrapper[4825]: I0310 08:56:05.340585 4825 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-dvswf" event={"ID":"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7","Type":"ContainerDied","Data":"6e359258a7dea541056e2079b68e158683b84e730fe1e87547a2ba124bb1b6b3"} Mar 10 08:56:06 crc kubenswrapper[4825]: I0310 08:56:06.788914 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-dvswf" Mar 10 08:56:06 crc kubenswrapper[4825]: I0310 08:56:06.880860 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-migration-ssh-key-0\") pod \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\" (UID: \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\") " Mar 10 08:56:06 crc kubenswrapper[4825]: I0310 08:56:06.880920 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-inventory\") pod \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\" (UID: \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\") " Mar 10 08:56:06 crc kubenswrapper[4825]: I0310 08:56:06.880995 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-ssh-key-openstack-cell1\") pod \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\" (UID: \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\") " Mar 10 08:56:06 crc kubenswrapper[4825]: I0310 08:56:06.881080 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-cell1-compute-config-3\") pod \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\" (UID: \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\") " Mar 10 08:56:06 crc kubenswrapper[4825]: I0310 08:56:06.881294 4825 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-cell1-compute-config-2\") pod \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\" (UID: \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\") " Mar 10 08:56:06 crc kubenswrapper[4825]: I0310 08:56:06.881347 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgcv2\" (UniqueName: \"kubernetes.io/projected/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-kube-api-access-mgcv2\") pod \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\" (UID: \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\") " Mar 10 08:56:06 crc kubenswrapper[4825]: I0310 08:56:06.881400 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-cell1-combined-ca-bundle\") pod \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\" (UID: \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\") " Mar 10 08:56:06 crc kubenswrapper[4825]: I0310 08:56:06.881473 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-cell1-compute-config-1\") pod \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\" (UID: \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\") " Mar 10 08:56:06 crc kubenswrapper[4825]: I0310 08:56:06.881499 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-migration-ssh-key-1\") pod \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\" (UID: \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\") " Mar 10 08:56:06 crc kubenswrapper[4825]: I0310 08:56:06.881534 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: 
\"kubernetes.io/configmap/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-cells-global-config-0\") pod \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\" (UID: \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\") " Mar 10 08:56:06 crc kubenswrapper[4825]: I0310 08:56:06.881573 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-cell1-compute-config-0\") pod \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\" (UID: \"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7\") " Mar 10 08:56:06 crc kubenswrapper[4825]: I0310 08:56:06.895084 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-kube-api-access-mgcv2" (OuterVolumeSpecName: "kube-api-access-mgcv2") pod "49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7" (UID: "49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7"). InnerVolumeSpecName "kube-api-access-mgcv2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:56:06 crc kubenswrapper[4825]: I0310 08:56:06.895438 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7" (UID: "49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:56:06 crc kubenswrapper[4825]: I0310 08:56:06.915239 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7" (UID: "49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7"). InnerVolumeSpecName "nova-cells-global-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 08:56:06 crc kubenswrapper[4825]: I0310 08:56:06.916273 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7" (UID: "49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:56:06 crc kubenswrapper[4825]: I0310 08:56:06.916705 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-inventory" (OuterVolumeSpecName: "inventory") pod "49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7" (UID: "49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:56:06 crc kubenswrapper[4825]: I0310 08:56:06.917806 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7" (UID: "49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:56:06 crc kubenswrapper[4825]: I0310 08:56:06.918792 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7" (UID: "49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7"). InnerVolumeSpecName "nova-cell1-compute-config-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:56:06 crc kubenswrapper[4825]: I0310 08:56:06.919615 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7" (UID: "49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:56:06 crc kubenswrapper[4825]: I0310 08:56:06.921104 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7" (UID: "49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:56:06 crc kubenswrapper[4825]: I0310 08:56:06.925046 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7" (UID: "49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:56:06 crc kubenswrapper[4825]: I0310 08:56:06.927893 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7" (UID: "49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:56:06 crc kubenswrapper[4825]: I0310 08:56:06.984404 4825 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 10 08:56:06 crc kubenswrapper[4825]: I0310 08:56:06.984446 4825 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 10 08:56:06 crc kubenswrapper[4825]: I0310 08:56:06.984459 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgcv2\" (UniqueName: \"kubernetes.io/projected/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-kube-api-access-mgcv2\") on node \"crc\" DevicePath \"\"" Mar 10 08:56:06 crc kubenswrapper[4825]: I0310 08:56:06.984471 4825 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:56:06 crc kubenswrapper[4825]: I0310 08:56:06.984484 4825 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 10 08:56:06 crc kubenswrapper[4825]: I0310 08:56:06.984498 4825 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 10 08:56:06 crc kubenswrapper[4825]: I0310 08:56:06.984510 4825 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: 
\"kubernetes.io/configmap/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Mar 10 08:56:06 crc kubenswrapper[4825]: I0310 08:56:06.984522 4825 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 10 08:56:06 crc kubenswrapper[4825]: I0310 08:56:06.984534 4825 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 10 08:56:06 crc kubenswrapper[4825]: I0310 08:56:06.984547 4825 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 08:56:06 crc kubenswrapper[4825]: I0310 08:56:06.984561 4825 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 10 08:56:07 crc kubenswrapper[4825]: I0310 08:56:07.369428 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-dvswf" event={"ID":"49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7","Type":"ContainerDied","Data":"6631c4d3fab0d8b9351360e0cf531c489ee9f223a646c716ccf934efbf85f36b"} Mar 10 08:56:07 crc kubenswrapper[4825]: I0310 08:56:07.369477 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6631c4d3fab0d8b9351360e0cf531c489ee9f223a646c716ccf934efbf85f36b" Mar 10 08:56:07 crc kubenswrapper[4825]: I0310 08:56:07.369539 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-dvswf" Mar 10 08:56:07 crc kubenswrapper[4825]: I0310 08:56:07.494444 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-9zrj6"] Mar 10 08:56:07 crc kubenswrapper[4825]: E0310 08:56:07.495096 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7" containerName="nova-cell1-openstack-openstack-cell1" Mar 10 08:56:07 crc kubenswrapper[4825]: I0310 08:56:07.495145 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7" containerName="nova-cell1-openstack-openstack-cell1" Mar 10 08:56:07 crc kubenswrapper[4825]: E0310 08:56:07.495163 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa238966-2baf-447e-a928-fd65e87c30b1" containerName="oc" Mar 10 08:56:07 crc kubenswrapper[4825]: I0310 08:56:07.495174 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa238966-2baf-447e-a928-fd65e87c30b1" containerName="oc" Mar 10 08:56:07 crc kubenswrapper[4825]: I0310 08:56:07.495462 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7" containerName="nova-cell1-openstack-openstack-cell1" Mar 10 08:56:07 crc kubenswrapper[4825]: I0310 08:56:07.495497 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa238966-2baf-447e-a928-fd65e87c30b1" containerName="oc" Mar 10 08:56:07 crc kubenswrapper[4825]: I0310 08:56:07.496425 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-9zrj6" Mar 10 08:56:07 crc kubenswrapper[4825]: I0310 08:56:07.506033 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-9zrj6"] Mar 10 08:56:07 crc kubenswrapper[4825]: I0310 08:56:07.538292 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 10 08:56:07 crc kubenswrapper[4825]: I0310 08:56:07.538989 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-5twvv" Mar 10 08:56:07 crc kubenswrapper[4825]: I0310 08:56:07.539162 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 10 08:56:07 crc kubenswrapper[4825]: I0310 08:56:07.539274 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 08:56:07 crc kubenswrapper[4825]: I0310 08:56:07.539281 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 10 08:56:07 crc kubenswrapper[4825]: I0310 08:56:07.599532 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d5874626-6e2f-4545-a7f4-225c36f183f4-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-9zrj6\" (UID: \"d5874626-6e2f-4545-a7f4-225c36f183f4\") " pod="openstack/telemetry-openstack-openstack-cell1-9zrj6" Mar 10 08:56:07 crc kubenswrapper[4825]: I0310 08:56:07.599649 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d5874626-6e2f-4545-a7f4-225c36f183f4-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-9zrj6\" (UID: \"d5874626-6e2f-4545-a7f4-225c36f183f4\") " 
pod="openstack/telemetry-openstack-openstack-cell1-9zrj6" Mar 10 08:56:07 crc kubenswrapper[4825]: I0310 08:56:07.599713 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d5874626-6e2f-4545-a7f4-225c36f183f4-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-9zrj6\" (UID: \"d5874626-6e2f-4545-a7f4-225c36f183f4\") " pod="openstack/telemetry-openstack-openstack-cell1-9zrj6" Mar 10 08:56:07 crc kubenswrapper[4825]: I0310 08:56:07.599769 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5874626-6e2f-4545-a7f4-225c36f183f4-inventory\") pod \"telemetry-openstack-openstack-cell1-9zrj6\" (UID: \"d5874626-6e2f-4545-a7f4-225c36f183f4\") " pod="openstack/telemetry-openstack-openstack-cell1-9zrj6" Mar 10 08:56:07 crc kubenswrapper[4825]: I0310 08:56:07.600106 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d5874626-6e2f-4545-a7f4-225c36f183f4-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-9zrj6\" (UID: \"d5874626-6e2f-4545-a7f4-225c36f183f4\") " pod="openstack/telemetry-openstack-openstack-cell1-9zrj6" Mar 10 08:56:07 crc kubenswrapper[4825]: I0310 08:56:07.600245 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mwgd\" (UniqueName: \"kubernetes.io/projected/d5874626-6e2f-4545-a7f4-225c36f183f4-kube-api-access-8mwgd\") pod \"telemetry-openstack-openstack-cell1-9zrj6\" (UID: \"d5874626-6e2f-4545-a7f4-225c36f183f4\") " pod="openstack/telemetry-openstack-openstack-cell1-9zrj6" Mar 10 08:56:07 crc kubenswrapper[4825]: I0310 08:56:07.600365 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5874626-6e2f-4545-a7f4-225c36f183f4-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-9zrj6\" (UID: \"d5874626-6e2f-4545-a7f4-225c36f183f4\") " pod="openstack/telemetry-openstack-openstack-cell1-9zrj6" Mar 10 08:56:07 crc kubenswrapper[4825]: I0310 08:56:07.702688 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5874626-6e2f-4545-a7f4-225c36f183f4-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-9zrj6\" (UID: \"d5874626-6e2f-4545-a7f4-225c36f183f4\") " pod="openstack/telemetry-openstack-openstack-cell1-9zrj6" Mar 10 08:56:07 crc kubenswrapper[4825]: I0310 08:56:07.702832 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d5874626-6e2f-4545-a7f4-225c36f183f4-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-9zrj6\" (UID: \"d5874626-6e2f-4545-a7f4-225c36f183f4\") " pod="openstack/telemetry-openstack-openstack-cell1-9zrj6" Mar 10 08:56:07 crc kubenswrapper[4825]: I0310 08:56:07.702953 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d5874626-6e2f-4545-a7f4-225c36f183f4-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-9zrj6\" (UID: \"d5874626-6e2f-4545-a7f4-225c36f183f4\") " pod="openstack/telemetry-openstack-openstack-cell1-9zrj6" Mar 10 08:56:07 crc kubenswrapper[4825]: I0310 08:56:07.703029 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d5874626-6e2f-4545-a7f4-225c36f183f4-ceilometer-compute-config-data-0\") pod 
\"telemetry-openstack-openstack-cell1-9zrj6\" (UID: \"d5874626-6e2f-4545-a7f4-225c36f183f4\") " pod="openstack/telemetry-openstack-openstack-cell1-9zrj6" Mar 10 08:56:07 crc kubenswrapper[4825]: I0310 08:56:07.703072 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5874626-6e2f-4545-a7f4-225c36f183f4-inventory\") pod \"telemetry-openstack-openstack-cell1-9zrj6\" (UID: \"d5874626-6e2f-4545-a7f4-225c36f183f4\") " pod="openstack/telemetry-openstack-openstack-cell1-9zrj6" Mar 10 08:56:07 crc kubenswrapper[4825]: I0310 08:56:07.703186 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d5874626-6e2f-4545-a7f4-225c36f183f4-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-9zrj6\" (UID: \"d5874626-6e2f-4545-a7f4-225c36f183f4\") " pod="openstack/telemetry-openstack-openstack-cell1-9zrj6" Mar 10 08:56:07 crc kubenswrapper[4825]: I0310 08:56:07.704242 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mwgd\" (UniqueName: \"kubernetes.io/projected/d5874626-6e2f-4545-a7f4-225c36f183f4-kube-api-access-8mwgd\") pod \"telemetry-openstack-openstack-cell1-9zrj6\" (UID: \"d5874626-6e2f-4545-a7f4-225c36f183f4\") " pod="openstack/telemetry-openstack-openstack-cell1-9zrj6" Mar 10 08:56:07 crc kubenswrapper[4825]: I0310 08:56:07.707521 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d5874626-6e2f-4545-a7f4-225c36f183f4-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-9zrj6\" (UID: \"d5874626-6e2f-4545-a7f4-225c36f183f4\") " pod="openstack/telemetry-openstack-openstack-cell1-9zrj6" Mar 10 08:56:07 crc kubenswrapper[4825]: I0310 08:56:07.707571 4825 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d5874626-6e2f-4545-a7f4-225c36f183f4-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-9zrj6\" (UID: \"d5874626-6e2f-4545-a7f4-225c36f183f4\") " pod="openstack/telemetry-openstack-openstack-cell1-9zrj6" Mar 10 08:56:07 crc kubenswrapper[4825]: I0310 08:56:07.707758 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5874626-6e2f-4545-a7f4-225c36f183f4-inventory\") pod \"telemetry-openstack-openstack-cell1-9zrj6\" (UID: \"d5874626-6e2f-4545-a7f4-225c36f183f4\") " pod="openstack/telemetry-openstack-openstack-cell1-9zrj6" Mar 10 08:56:07 crc kubenswrapper[4825]: I0310 08:56:07.707911 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5874626-6e2f-4545-a7f4-225c36f183f4-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-9zrj6\" (UID: \"d5874626-6e2f-4545-a7f4-225c36f183f4\") " pod="openstack/telemetry-openstack-openstack-cell1-9zrj6" Mar 10 08:56:07 crc kubenswrapper[4825]: I0310 08:56:07.709430 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d5874626-6e2f-4545-a7f4-225c36f183f4-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-9zrj6\" (UID: \"d5874626-6e2f-4545-a7f4-225c36f183f4\") " pod="openstack/telemetry-openstack-openstack-cell1-9zrj6" Mar 10 08:56:07 crc kubenswrapper[4825]: I0310 08:56:07.710071 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d5874626-6e2f-4545-a7f4-225c36f183f4-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-9zrj6\" (UID: \"d5874626-6e2f-4545-a7f4-225c36f183f4\") " 
pod="openstack/telemetry-openstack-openstack-cell1-9zrj6" Mar 10 08:56:07 crc kubenswrapper[4825]: I0310 08:56:07.724057 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mwgd\" (UniqueName: \"kubernetes.io/projected/d5874626-6e2f-4545-a7f4-225c36f183f4-kube-api-access-8mwgd\") pod \"telemetry-openstack-openstack-cell1-9zrj6\" (UID: \"d5874626-6e2f-4545-a7f4-225c36f183f4\") " pod="openstack/telemetry-openstack-openstack-cell1-9zrj6" Mar 10 08:56:07 crc kubenswrapper[4825]: I0310 08:56:07.871823 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-9zrj6" Mar 10 08:56:08 crc kubenswrapper[4825]: I0310 08:56:08.433576 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-9zrj6"] Mar 10 08:56:09 crc kubenswrapper[4825]: I0310 08:56:09.388124 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-9zrj6" event={"ID":"d5874626-6e2f-4545-a7f4-225c36f183f4","Type":"ContainerStarted","Data":"e49f8683cc2e7938197cb5ae284d8356dfc3a693e7de6d41037c6e4675c6cd4a"} Mar 10 08:56:09 crc kubenswrapper[4825]: I0310 08:56:09.388488 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-9zrj6" event={"ID":"d5874626-6e2f-4545-a7f4-225c36f183f4","Type":"ContainerStarted","Data":"5b44a3e4642f49c533c8cc3fd24235f91b789a293453fa2580c287a80e8723b2"} Mar 10 08:56:09 crc kubenswrapper[4825]: I0310 08:56:09.410786 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-openstack-openstack-cell1-9zrj6" podStartSLOduration=1.93086425 podStartE2EDuration="2.410766574s" podCreationTimestamp="2026-03-10 08:56:07 +0000 UTC" firstStartedPulling="2026-03-10 08:56:08.44472948 +0000 UTC m=+7921.474510095" lastFinishedPulling="2026-03-10 08:56:08.924631764 +0000 UTC m=+7921.954412419" 
observedRunningTime="2026-03-10 08:56:09.40647538 +0000 UTC m=+7922.436256015" watchObservedRunningTime="2026-03-10 08:56:09.410766574 +0000 UTC m=+7922.440547189" Mar 10 08:56:54 crc kubenswrapper[4825]: I0310 08:56:54.637411 4825 scope.go:117] "RemoveContainer" containerID="4175ef55e72869e7c45ae2499480e2aa86c30afaf9cec0c37bc229506563d9ed" Mar 10 08:57:16 crc kubenswrapper[4825]: I0310 08:57:16.887793 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 08:57:16 crc kubenswrapper[4825]: I0310 08:57:16.889957 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 08:57:46 crc kubenswrapper[4825]: I0310 08:57:46.887885 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 08:57:46 crc kubenswrapper[4825]: I0310 08:57:46.889473 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 08:58:00 crc kubenswrapper[4825]: I0310 08:58:00.156411 4825 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29552218-8bncf"] Mar 10 08:58:00 crc kubenswrapper[4825]: I0310 08:58:00.158225 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552218-8bncf" Mar 10 08:58:00 crc kubenswrapper[4825]: I0310 08:58:00.161976 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 08:58:00 crc kubenswrapper[4825]: I0310 08:58:00.164439 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 08:58:00 crc kubenswrapper[4825]: I0310 08:58:00.164715 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 08:58:00 crc kubenswrapper[4825]: I0310 08:58:00.176529 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552218-8bncf"] Mar 10 08:58:00 crc kubenswrapper[4825]: I0310 08:58:00.286063 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnwq6\" (UniqueName: \"kubernetes.io/projected/da625163-95ad-4099-92f2-ae375b811efb-kube-api-access-wnwq6\") pod \"auto-csr-approver-29552218-8bncf\" (UID: \"da625163-95ad-4099-92f2-ae375b811efb\") " pod="openshift-infra/auto-csr-approver-29552218-8bncf" Mar 10 08:58:00 crc kubenswrapper[4825]: I0310 08:58:00.388473 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnwq6\" (UniqueName: \"kubernetes.io/projected/da625163-95ad-4099-92f2-ae375b811efb-kube-api-access-wnwq6\") pod \"auto-csr-approver-29552218-8bncf\" (UID: \"da625163-95ad-4099-92f2-ae375b811efb\") " pod="openshift-infra/auto-csr-approver-29552218-8bncf" Mar 10 08:58:00 crc kubenswrapper[4825]: I0310 08:58:00.408291 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnwq6\" (UniqueName: 
\"kubernetes.io/projected/da625163-95ad-4099-92f2-ae375b811efb-kube-api-access-wnwq6\") pod \"auto-csr-approver-29552218-8bncf\" (UID: \"da625163-95ad-4099-92f2-ae375b811efb\") " pod="openshift-infra/auto-csr-approver-29552218-8bncf" Mar 10 08:58:00 crc kubenswrapper[4825]: I0310 08:58:00.491751 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552218-8bncf" Mar 10 08:58:00 crc kubenswrapper[4825]: I0310 08:58:00.941990 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552218-8bncf"] Mar 10 08:58:01 crc kubenswrapper[4825]: I0310 08:58:01.552847 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552218-8bncf" event={"ID":"da625163-95ad-4099-92f2-ae375b811efb","Type":"ContainerStarted","Data":"2e81bb813ba8e4a8a85d9a61f46e05de89e613c51a13b7910d0addf2184701b4"} Mar 10 08:58:02 crc kubenswrapper[4825]: I0310 08:58:02.563420 4825 generic.go:334] "Generic (PLEG): container finished" podID="da625163-95ad-4099-92f2-ae375b811efb" containerID="35c18eb0721446c0e6f18c45f1589f97a64a864a87f6738ca82bdae190c666bb" exitCode=0 Mar 10 08:58:02 crc kubenswrapper[4825]: I0310 08:58:02.563473 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552218-8bncf" event={"ID":"da625163-95ad-4099-92f2-ae375b811efb","Type":"ContainerDied","Data":"35c18eb0721446c0e6f18c45f1589f97a64a864a87f6738ca82bdae190c666bb"} Mar 10 08:58:03 crc kubenswrapper[4825]: I0310 08:58:03.905501 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552218-8bncf" Mar 10 08:58:04 crc kubenswrapper[4825]: I0310 08:58:04.069976 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnwq6\" (UniqueName: \"kubernetes.io/projected/da625163-95ad-4099-92f2-ae375b811efb-kube-api-access-wnwq6\") pod \"da625163-95ad-4099-92f2-ae375b811efb\" (UID: \"da625163-95ad-4099-92f2-ae375b811efb\") " Mar 10 08:58:04 crc kubenswrapper[4825]: I0310 08:58:04.077107 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da625163-95ad-4099-92f2-ae375b811efb-kube-api-access-wnwq6" (OuterVolumeSpecName: "kube-api-access-wnwq6") pod "da625163-95ad-4099-92f2-ae375b811efb" (UID: "da625163-95ad-4099-92f2-ae375b811efb"). InnerVolumeSpecName "kube-api-access-wnwq6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:58:04 crc kubenswrapper[4825]: I0310 08:58:04.172619 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnwq6\" (UniqueName: \"kubernetes.io/projected/da625163-95ad-4099-92f2-ae375b811efb-kube-api-access-wnwq6\") on node \"crc\" DevicePath \"\"" Mar 10 08:58:04 crc kubenswrapper[4825]: I0310 08:58:04.590408 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552218-8bncf" event={"ID":"da625163-95ad-4099-92f2-ae375b811efb","Type":"ContainerDied","Data":"2e81bb813ba8e4a8a85d9a61f46e05de89e613c51a13b7910d0addf2184701b4"} Mar 10 08:58:04 crc kubenswrapper[4825]: I0310 08:58:04.590648 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e81bb813ba8e4a8a85d9a61f46e05de89e613c51a13b7910d0addf2184701b4" Mar 10 08:58:04 crc kubenswrapper[4825]: I0310 08:58:04.590502 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552218-8bncf" Mar 10 08:58:04 crc kubenswrapper[4825]: I0310 08:58:04.972057 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552212-bgwk7"] Mar 10 08:58:04 crc kubenswrapper[4825]: I0310 08:58:04.983632 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552212-bgwk7"] Mar 10 08:58:05 crc kubenswrapper[4825]: I0310 08:58:05.247391 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a6a81f5-a188-4e8e-85d3-008ca6f74e22" path="/var/lib/kubelet/pods/5a6a81f5-a188-4e8e-85d3-008ca6f74e22/volumes" Mar 10 08:58:16 crc kubenswrapper[4825]: I0310 08:58:16.888714 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 08:58:16 crc kubenswrapper[4825]: I0310 08:58:16.889392 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 08:58:16 crc kubenswrapper[4825]: I0310 08:58:16.889439 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" Mar 10 08:58:16 crc kubenswrapper[4825]: I0310 08:58:16.890291 4825 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"81b5ce52c08a356df2c6362cda9b87f444da49d61e188354e699ec5130fe40a8"} pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 08:58:16 crc kubenswrapper[4825]: I0310 08:58:16.890360 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" containerID="cri-o://81b5ce52c08a356df2c6362cda9b87f444da49d61e188354e699ec5130fe40a8" gracePeriod=600 Mar 10 08:58:17 crc kubenswrapper[4825]: I0310 08:58:17.852350 4825 generic.go:334] "Generic (PLEG): container finished" podID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerID="81b5ce52c08a356df2c6362cda9b87f444da49d61e188354e699ec5130fe40a8" exitCode=0 Mar 10 08:58:17 crc kubenswrapper[4825]: I0310 08:58:17.852410 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerDied","Data":"81b5ce52c08a356df2c6362cda9b87f444da49d61e188354e699ec5130fe40a8"} Mar 10 08:58:17 crc kubenswrapper[4825]: I0310 08:58:17.852863 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerStarted","Data":"390899e144160760e4121c06e9f2e317e0be0012bf795558da227f948f1f1b1f"} Mar 10 08:58:17 crc kubenswrapper[4825]: I0310 08:58:17.852885 4825 scope.go:117] "RemoveContainer" containerID="2d7f4d948979ee71550b4af3d706ff94e76881dcdf895270cec84c86069dd8fe" Mar 10 08:58:54 crc kubenswrapper[4825]: I0310 08:58:54.729743 4825 scope.go:117] "RemoveContainer" containerID="65b0c56c065f618de483013f8ef40ff01d54d0de4572f8f2c3e767467417ee2a" Mar 10 08:58:56 crc kubenswrapper[4825]: I0310 08:58:56.916816 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-v8fbx"] Mar 10 08:58:56 crc kubenswrapper[4825]: E0310 
08:58:56.918425 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da625163-95ad-4099-92f2-ae375b811efb" containerName="oc" Mar 10 08:58:56 crc kubenswrapper[4825]: I0310 08:58:56.918438 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="da625163-95ad-4099-92f2-ae375b811efb" containerName="oc" Mar 10 08:58:56 crc kubenswrapper[4825]: I0310 08:58:56.918654 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="da625163-95ad-4099-92f2-ae375b811efb" containerName="oc" Mar 10 08:58:56 crc kubenswrapper[4825]: I0310 08:58:56.919994 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v8fbx" Mar 10 08:58:56 crc kubenswrapper[4825]: I0310 08:58:56.951018 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v8fbx"] Mar 10 08:58:57 crc kubenswrapper[4825]: I0310 08:58:57.024332 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gjt8\" (UniqueName: \"kubernetes.io/projected/246ff60d-084c-4575-bab8-a0c1084f56b1-kube-api-access-2gjt8\") pod \"redhat-operators-v8fbx\" (UID: \"246ff60d-084c-4575-bab8-a0c1084f56b1\") " pod="openshift-marketplace/redhat-operators-v8fbx" Mar 10 08:58:57 crc kubenswrapper[4825]: I0310 08:58:57.024833 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/246ff60d-084c-4575-bab8-a0c1084f56b1-catalog-content\") pod \"redhat-operators-v8fbx\" (UID: \"246ff60d-084c-4575-bab8-a0c1084f56b1\") " pod="openshift-marketplace/redhat-operators-v8fbx" Mar 10 08:58:57 crc kubenswrapper[4825]: I0310 08:58:57.024906 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/246ff60d-084c-4575-bab8-a0c1084f56b1-utilities\") pod 
\"redhat-operators-v8fbx\" (UID: \"246ff60d-084c-4575-bab8-a0c1084f56b1\") " pod="openshift-marketplace/redhat-operators-v8fbx" Mar 10 08:58:57 crc kubenswrapper[4825]: I0310 08:58:57.126996 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gjt8\" (UniqueName: \"kubernetes.io/projected/246ff60d-084c-4575-bab8-a0c1084f56b1-kube-api-access-2gjt8\") pod \"redhat-operators-v8fbx\" (UID: \"246ff60d-084c-4575-bab8-a0c1084f56b1\") " pod="openshift-marketplace/redhat-operators-v8fbx" Mar 10 08:58:57 crc kubenswrapper[4825]: I0310 08:58:57.127157 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/246ff60d-084c-4575-bab8-a0c1084f56b1-catalog-content\") pod \"redhat-operators-v8fbx\" (UID: \"246ff60d-084c-4575-bab8-a0c1084f56b1\") " pod="openshift-marketplace/redhat-operators-v8fbx" Mar 10 08:58:57 crc kubenswrapper[4825]: I0310 08:58:57.127187 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/246ff60d-084c-4575-bab8-a0c1084f56b1-utilities\") pod \"redhat-operators-v8fbx\" (UID: \"246ff60d-084c-4575-bab8-a0c1084f56b1\") " pod="openshift-marketplace/redhat-operators-v8fbx" Mar 10 08:58:57 crc kubenswrapper[4825]: I0310 08:58:57.127811 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/246ff60d-084c-4575-bab8-a0c1084f56b1-utilities\") pod \"redhat-operators-v8fbx\" (UID: \"246ff60d-084c-4575-bab8-a0c1084f56b1\") " pod="openshift-marketplace/redhat-operators-v8fbx" Mar 10 08:58:57 crc kubenswrapper[4825]: I0310 08:58:57.127816 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/246ff60d-084c-4575-bab8-a0c1084f56b1-catalog-content\") pod \"redhat-operators-v8fbx\" (UID: 
\"246ff60d-084c-4575-bab8-a0c1084f56b1\") " pod="openshift-marketplace/redhat-operators-v8fbx" Mar 10 08:58:57 crc kubenswrapper[4825]: I0310 08:58:57.147248 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gjt8\" (UniqueName: \"kubernetes.io/projected/246ff60d-084c-4575-bab8-a0c1084f56b1-kube-api-access-2gjt8\") pod \"redhat-operators-v8fbx\" (UID: \"246ff60d-084c-4575-bab8-a0c1084f56b1\") " pod="openshift-marketplace/redhat-operators-v8fbx" Mar 10 08:58:57 crc kubenswrapper[4825]: I0310 08:58:57.281865 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v8fbx" Mar 10 08:58:57 crc kubenswrapper[4825]: I0310 08:58:57.733666 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v8fbx"] Mar 10 08:58:58 crc kubenswrapper[4825]: I0310 08:58:58.254968 4825 generic.go:334] "Generic (PLEG): container finished" podID="246ff60d-084c-4575-bab8-a0c1084f56b1" containerID="419b2d164dc820827ef0a21dfbc547c428623e2aae544b21ce0a48e975df36cf" exitCode=0 Mar 10 08:58:58 crc kubenswrapper[4825]: I0310 08:58:58.255008 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v8fbx" event={"ID":"246ff60d-084c-4575-bab8-a0c1084f56b1","Type":"ContainerDied","Data":"419b2d164dc820827ef0a21dfbc547c428623e2aae544b21ce0a48e975df36cf"} Mar 10 08:58:58 crc kubenswrapper[4825]: I0310 08:58:58.255281 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v8fbx" event={"ID":"246ff60d-084c-4575-bab8-a0c1084f56b1","Type":"ContainerStarted","Data":"c29db7ed53254412daa5ab416cb0cda23be157b90bfb67c4f7a9a2a8554e6af3"} Mar 10 08:58:58 crc kubenswrapper[4825]: I0310 08:58:58.256811 4825 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 08:58:59 crc kubenswrapper[4825]: I0310 08:58:59.263655 4825 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v8fbx" event={"ID":"246ff60d-084c-4575-bab8-a0c1084f56b1","Type":"ContainerStarted","Data":"a215873007e3ef4c0d16ee1958e7756bf4b3e53b988c921ec4f163b90509651f"} Mar 10 08:59:04 crc kubenswrapper[4825]: I0310 08:59:04.307033 4825 generic.go:334] "Generic (PLEG): container finished" podID="246ff60d-084c-4575-bab8-a0c1084f56b1" containerID="a215873007e3ef4c0d16ee1958e7756bf4b3e53b988c921ec4f163b90509651f" exitCode=0 Mar 10 08:59:04 crc kubenswrapper[4825]: I0310 08:59:04.307057 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v8fbx" event={"ID":"246ff60d-084c-4575-bab8-a0c1084f56b1","Type":"ContainerDied","Data":"a215873007e3ef4c0d16ee1958e7756bf4b3e53b988c921ec4f163b90509651f"} Mar 10 08:59:05 crc kubenswrapper[4825]: I0310 08:59:05.317622 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v8fbx" event={"ID":"246ff60d-084c-4575-bab8-a0c1084f56b1","Type":"ContainerStarted","Data":"c97c0062f4648a491afe75e3de369b40be89a21b869264cee150873d0d5b9e46"} Mar 10 08:59:05 crc kubenswrapper[4825]: I0310 08:59:05.342066 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-v8fbx" podStartSLOduration=2.905372741 podStartE2EDuration="9.342042999s" podCreationTimestamp="2026-03-10 08:58:56 +0000 UTC" firstStartedPulling="2026-03-10 08:58:58.256516402 +0000 UTC m=+8091.286297017" lastFinishedPulling="2026-03-10 08:59:04.69318665 +0000 UTC m=+8097.722967275" observedRunningTime="2026-03-10 08:59:05.335285449 +0000 UTC m=+8098.365066064" watchObservedRunningTime="2026-03-10 08:59:05.342042999 +0000 UTC m=+8098.371823624" Mar 10 08:59:07 crc kubenswrapper[4825]: I0310 08:59:07.282289 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-v8fbx" Mar 10 08:59:07 crc kubenswrapper[4825]: I0310 
08:59:07.282856 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v8fbx" Mar 10 08:59:08 crc kubenswrapper[4825]: I0310 08:59:08.372890 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v8fbx" podUID="246ff60d-084c-4575-bab8-a0c1084f56b1" containerName="registry-server" probeResult="failure" output=< Mar 10 08:59:08 crc kubenswrapper[4825]: timeout: failed to connect service ":50051" within 1s Mar 10 08:59:08 crc kubenswrapper[4825]: > Mar 10 08:59:18 crc kubenswrapper[4825]: I0310 08:59:18.336060 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v8fbx" podUID="246ff60d-084c-4575-bab8-a0c1084f56b1" containerName="registry-server" probeResult="failure" output=< Mar 10 08:59:18 crc kubenswrapper[4825]: timeout: failed to connect service ":50051" within 1s Mar 10 08:59:18 crc kubenswrapper[4825]: > Mar 10 08:59:25 crc kubenswrapper[4825]: I0310 08:59:25.534443 4825 generic.go:334] "Generic (PLEG): container finished" podID="d5874626-6e2f-4545-a7f4-225c36f183f4" containerID="e49f8683cc2e7938197cb5ae284d8356dfc3a693e7de6d41037c6e4675c6cd4a" exitCode=0 Mar 10 08:59:25 crc kubenswrapper[4825]: I0310 08:59:25.534512 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-9zrj6" event={"ID":"d5874626-6e2f-4545-a7f4-225c36f183f4","Type":"ContainerDied","Data":"e49f8683cc2e7938197cb5ae284d8356dfc3a693e7de6d41037c6e4675c6cd4a"} Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.008243 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-9zrj6" Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.095802 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d5874626-6e2f-4545-a7f4-225c36f183f4-ceilometer-compute-config-data-1\") pod \"d5874626-6e2f-4545-a7f4-225c36f183f4\" (UID: \"d5874626-6e2f-4545-a7f4-225c36f183f4\") " Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.095870 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5874626-6e2f-4545-a7f4-225c36f183f4-inventory\") pod \"d5874626-6e2f-4545-a7f4-225c36f183f4\" (UID: \"d5874626-6e2f-4545-a7f4-225c36f183f4\") " Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.095894 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mwgd\" (UniqueName: \"kubernetes.io/projected/d5874626-6e2f-4545-a7f4-225c36f183f4-kube-api-access-8mwgd\") pod \"d5874626-6e2f-4545-a7f4-225c36f183f4\" (UID: \"d5874626-6e2f-4545-a7f4-225c36f183f4\") " Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.096087 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d5874626-6e2f-4545-a7f4-225c36f183f4-ceilometer-compute-config-data-0\") pod \"d5874626-6e2f-4545-a7f4-225c36f183f4\" (UID: \"d5874626-6e2f-4545-a7f4-225c36f183f4\") " Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.096129 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d5874626-6e2f-4545-a7f4-225c36f183f4-ssh-key-openstack-cell1\") pod \"d5874626-6e2f-4545-a7f4-225c36f183f4\" (UID: \"d5874626-6e2f-4545-a7f4-225c36f183f4\") " Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.096334 
4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5874626-6e2f-4545-a7f4-225c36f183f4-telemetry-combined-ca-bundle\") pod \"d5874626-6e2f-4545-a7f4-225c36f183f4\" (UID: \"d5874626-6e2f-4545-a7f4-225c36f183f4\") " Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.096391 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d5874626-6e2f-4545-a7f4-225c36f183f4-ceilometer-compute-config-data-2\") pod \"d5874626-6e2f-4545-a7f4-225c36f183f4\" (UID: \"d5874626-6e2f-4545-a7f4-225c36f183f4\") " Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.102575 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5874626-6e2f-4545-a7f4-225c36f183f4-kube-api-access-8mwgd" (OuterVolumeSpecName: "kube-api-access-8mwgd") pod "d5874626-6e2f-4545-a7f4-225c36f183f4" (UID: "d5874626-6e2f-4545-a7f4-225c36f183f4"). InnerVolumeSpecName "kube-api-access-8mwgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.105010 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5874626-6e2f-4545-a7f4-225c36f183f4-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "d5874626-6e2f-4545-a7f4-225c36f183f4" (UID: "d5874626-6e2f-4545-a7f4-225c36f183f4"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.127221 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5874626-6e2f-4545-a7f4-225c36f183f4-inventory" (OuterVolumeSpecName: "inventory") pod "d5874626-6e2f-4545-a7f4-225c36f183f4" (UID: "d5874626-6e2f-4545-a7f4-225c36f183f4"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.128792 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5874626-6e2f-4545-a7f4-225c36f183f4-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "d5874626-6e2f-4545-a7f4-225c36f183f4" (UID: "d5874626-6e2f-4545-a7f4-225c36f183f4"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.131813 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5874626-6e2f-4545-a7f4-225c36f183f4-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "d5874626-6e2f-4545-a7f4-225c36f183f4" (UID: "d5874626-6e2f-4545-a7f4-225c36f183f4"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.144152 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5874626-6e2f-4545-a7f4-225c36f183f4-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "d5874626-6e2f-4545-a7f4-225c36f183f4" (UID: "d5874626-6e2f-4545-a7f4-225c36f183f4"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.145682 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5874626-6e2f-4545-a7f4-225c36f183f4-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "d5874626-6e2f-4545-a7f4-225c36f183f4" (UID: "d5874626-6e2f-4545-a7f4-225c36f183f4"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.198858 4825 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d5874626-6e2f-4545-a7f4-225c36f183f4-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.198895 4825 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d5874626-6e2f-4545-a7f4-225c36f183f4-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.198939 4825 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5874626-6e2f-4545-a7f4-225c36f183f4-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.198948 4825 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d5874626-6e2f-4545-a7f4-225c36f183f4-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.198958 4825 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d5874626-6e2f-4545-a7f4-225c36f183f4-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.199002 4825 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5874626-6e2f-4545-a7f4-225c36f183f4-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.199049 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mwgd\" (UniqueName: 
\"kubernetes.io/projected/d5874626-6e2f-4545-a7f4-225c36f183f4-kube-api-access-8mwgd\") on node \"crc\" DevicePath \"\"" Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.555863 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-9zrj6" event={"ID":"d5874626-6e2f-4545-a7f4-225c36f183f4","Type":"ContainerDied","Data":"5b44a3e4642f49c533c8cc3fd24235f91b789a293453fa2580c287a80e8723b2"} Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.555910 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b44a3e4642f49c533c8cc3fd24235f91b789a293453fa2580c287a80e8723b2" Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.555926 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-9zrj6" Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.670843 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-fpdt8"] Mar 10 08:59:27 crc kubenswrapper[4825]: E0310 08:59:27.671508 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5874626-6e2f-4545-a7f4-225c36f183f4" containerName="telemetry-openstack-openstack-cell1" Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.671539 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5874626-6e2f-4545-a7f4-225c36f183f4" containerName="telemetry-openstack-openstack-cell1" Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.671855 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5874626-6e2f-4545-a7f4-225c36f183f4" containerName="telemetry-openstack-openstack-cell1" Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.672773 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-fpdt8" Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.674818 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.675244 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.675422 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-sriov-agent-neutron-config" Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.675577 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.677765 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-5twvv" Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.680490 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-fpdt8"] Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.708782 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/be72d3b9-862b-4b9b-83a6-b8a991ddd51a-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-fpdt8\" (UID: \"be72d3b9-862b-4b9b-83a6-b8a991ddd51a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-fpdt8" Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.708833 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be72d3b9-862b-4b9b-83a6-b8a991ddd51a-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-fpdt8\" (UID: 
\"be72d3b9-862b-4b9b-83a6-b8a991ddd51a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-fpdt8" Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.709239 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx7j2\" (UniqueName: \"kubernetes.io/projected/be72d3b9-862b-4b9b-83a6-b8a991ddd51a-kube-api-access-hx7j2\") pod \"neutron-sriov-openstack-openstack-cell1-fpdt8\" (UID: \"be72d3b9-862b-4b9b-83a6-b8a991ddd51a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-fpdt8" Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.709323 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be72d3b9-862b-4b9b-83a6-b8a991ddd51a-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-fpdt8\" (UID: \"be72d3b9-862b-4b9b-83a6-b8a991ddd51a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-fpdt8" Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.709495 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/be72d3b9-862b-4b9b-83a6-b8a991ddd51a-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-fpdt8\" (UID: \"be72d3b9-862b-4b9b-83a6-b8a991ddd51a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-fpdt8" Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.811399 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/be72d3b9-862b-4b9b-83a6-b8a991ddd51a-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-fpdt8\" (UID: \"be72d3b9-862b-4b9b-83a6-b8a991ddd51a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-fpdt8" Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.811769 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/be72d3b9-862b-4b9b-83a6-b8a991ddd51a-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-fpdt8\" (UID: \"be72d3b9-862b-4b9b-83a6-b8a991ddd51a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-fpdt8" Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.811853 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be72d3b9-862b-4b9b-83a6-b8a991ddd51a-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-fpdt8\" (UID: \"be72d3b9-862b-4b9b-83a6-b8a991ddd51a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-fpdt8" Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.811986 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx7j2\" (UniqueName: \"kubernetes.io/projected/be72d3b9-862b-4b9b-83a6-b8a991ddd51a-kube-api-access-hx7j2\") pod \"neutron-sriov-openstack-openstack-cell1-fpdt8\" (UID: \"be72d3b9-862b-4b9b-83a6-b8a991ddd51a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-fpdt8" Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.812063 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be72d3b9-862b-4b9b-83a6-b8a991ddd51a-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-fpdt8\" (UID: \"be72d3b9-862b-4b9b-83a6-b8a991ddd51a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-fpdt8" Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.816587 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be72d3b9-862b-4b9b-83a6-b8a991ddd51a-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-fpdt8\" (UID: 
\"be72d3b9-862b-4b9b-83a6-b8a991ddd51a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-fpdt8" Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.816791 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/be72d3b9-862b-4b9b-83a6-b8a991ddd51a-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-fpdt8\" (UID: \"be72d3b9-862b-4b9b-83a6-b8a991ddd51a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-fpdt8" Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.818091 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be72d3b9-862b-4b9b-83a6-b8a991ddd51a-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-fpdt8\" (UID: \"be72d3b9-862b-4b9b-83a6-b8a991ddd51a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-fpdt8" Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.826888 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/be72d3b9-862b-4b9b-83a6-b8a991ddd51a-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-fpdt8\" (UID: \"be72d3b9-862b-4b9b-83a6-b8a991ddd51a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-fpdt8" Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.829105 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx7j2\" (UniqueName: \"kubernetes.io/projected/be72d3b9-862b-4b9b-83a6-b8a991ddd51a-kube-api-access-hx7j2\") pod \"neutron-sriov-openstack-openstack-cell1-fpdt8\" (UID: \"be72d3b9-862b-4b9b-83a6-b8a991ddd51a\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-fpdt8" Mar 10 08:59:27 crc kubenswrapper[4825]: I0310 08:59:27.988031 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-fpdt8" Mar 10 08:59:28 crc kubenswrapper[4825]: I0310 08:59:28.333533 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v8fbx" podUID="246ff60d-084c-4575-bab8-a0c1084f56b1" containerName="registry-server" probeResult="failure" output=< Mar 10 08:59:28 crc kubenswrapper[4825]: timeout: failed to connect service ":50051" within 1s Mar 10 08:59:28 crc kubenswrapper[4825]: > Mar 10 08:59:28 crc kubenswrapper[4825]: I0310 08:59:28.603389 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-fpdt8"] Mar 10 08:59:28 crc kubenswrapper[4825]: W0310 08:59:28.604730 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe72d3b9_862b_4b9b_83a6_b8a991ddd51a.slice/crio-a1996d6562b3a8d09f53fbe39b13b1fbf22e1f1baca70a85f00e3926ea8b0678 WatchSource:0}: Error finding container a1996d6562b3a8d09f53fbe39b13b1fbf22e1f1baca70a85f00e3926ea8b0678: Status 404 returned error can't find the container with id a1996d6562b3a8d09f53fbe39b13b1fbf22e1f1baca70a85f00e3926ea8b0678 Mar 10 08:59:29 crc kubenswrapper[4825]: I0310 08:59:29.575129 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-fpdt8" event={"ID":"be72d3b9-862b-4b9b-83a6-b8a991ddd51a","Type":"ContainerStarted","Data":"69c39c4a46b343de1d15dcfee51d1b4f6b3c4ac1b0008f6c1fcc643b5048fbe1"} Mar 10 08:59:29 crc kubenswrapper[4825]: I0310 08:59:29.575481 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-fpdt8" event={"ID":"be72d3b9-862b-4b9b-83a6-b8a991ddd51a","Type":"ContainerStarted","Data":"a1996d6562b3a8d09f53fbe39b13b1fbf22e1f1baca70a85f00e3926ea8b0678"} Mar 10 08:59:29 crc kubenswrapper[4825]: I0310 08:59:29.594279 4825 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/neutron-sriov-openstack-openstack-cell1-fpdt8" podStartSLOduration=2.12481125 podStartE2EDuration="2.594258226s" podCreationTimestamp="2026-03-10 08:59:27 +0000 UTC" firstStartedPulling="2026-03-10 08:59:28.607943241 +0000 UTC m=+8121.637723876" lastFinishedPulling="2026-03-10 08:59:29.077390237 +0000 UTC m=+8122.107170852" observedRunningTime="2026-03-10 08:59:29.592562201 +0000 UTC m=+8122.622342836" watchObservedRunningTime="2026-03-10 08:59:29.594258226 +0000 UTC m=+8122.624038841" Mar 10 08:59:37 crc kubenswrapper[4825]: I0310 08:59:37.334703 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v8fbx" Mar 10 08:59:37 crc kubenswrapper[4825]: I0310 08:59:37.385748 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-v8fbx" Mar 10 08:59:37 crc kubenswrapper[4825]: I0310 08:59:37.571104 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v8fbx"] Mar 10 08:59:38 crc kubenswrapper[4825]: I0310 08:59:38.658547 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-v8fbx" podUID="246ff60d-084c-4575-bab8-a0c1084f56b1" containerName="registry-server" containerID="cri-o://c97c0062f4648a491afe75e3de369b40be89a21b869264cee150873d0d5b9e46" gracePeriod=2 Mar 10 08:59:39 crc kubenswrapper[4825]: I0310 08:59:39.168887 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v8fbx" Mar 10 08:59:39 crc kubenswrapper[4825]: I0310 08:59:39.365278 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gjt8\" (UniqueName: \"kubernetes.io/projected/246ff60d-084c-4575-bab8-a0c1084f56b1-kube-api-access-2gjt8\") pod \"246ff60d-084c-4575-bab8-a0c1084f56b1\" (UID: \"246ff60d-084c-4575-bab8-a0c1084f56b1\") " Mar 10 08:59:39 crc kubenswrapper[4825]: I0310 08:59:39.365328 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/246ff60d-084c-4575-bab8-a0c1084f56b1-catalog-content\") pod \"246ff60d-084c-4575-bab8-a0c1084f56b1\" (UID: \"246ff60d-084c-4575-bab8-a0c1084f56b1\") " Mar 10 08:59:39 crc kubenswrapper[4825]: I0310 08:59:39.365390 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/246ff60d-084c-4575-bab8-a0c1084f56b1-utilities\") pod \"246ff60d-084c-4575-bab8-a0c1084f56b1\" (UID: \"246ff60d-084c-4575-bab8-a0c1084f56b1\") " Mar 10 08:59:39 crc kubenswrapper[4825]: I0310 08:59:39.366179 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/246ff60d-084c-4575-bab8-a0c1084f56b1-utilities" (OuterVolumeSpecName: "utilities") pod "246ff60d-084c-4575-bab8-a0c1084f56b1" (UID: "246ff60d-084c-4575-bab8-a0c1084f56b1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:59:39 crc kubenswrapper[4825]: I0310 08:59:39.371406 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/246ff60d-084c-4575-bab8-a0c1084f56b1-kube-api-access-2gjt8" (OuterVolumeSpecName: "kube-api-access-2gjt8") pod "246ff60d-084c-4575-bab8-a0c1084f56b1" (UID: "246ff60d-084c-4575-bab8-a0c1084f56b1"). InnerVolumeSpecName "kube-api-access-2gjt8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 08:59:39 crc kubenswrapper[4825]: I0310 08:59:39.468305 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gjt8\" (UniqueName: \"kubernetes.io/projected/246ff60d-084c-4575-bab8-a0c1084f56b1-kube-api-access-2gjt8\") on node \"crc\" DevicePath \"\"" Mar 10 08:59:39 crc kubenswrapper[4825]: I0310 08:59:39.468341 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/246ff60d-084c-4575-bab8-a0c1084f56b1-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 08:59:39 crc kubenswrapper[4825]: I0310 08:59:39.495248 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/246ff60d-084c-4575-bab8-a0c1084f56b1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "246ff60d-084c-4575-bab8-a0c1084f56b1" (UID: "246ff60d-084c-4575-bab8-a0c1084f56b1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 08:59:39 crc kubenswrapper[4825]: I0310 08:59:39.571161 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/246ff60d-084c-4575-bab8-a0c1084f56b1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 08:59:39 crc kubenswrapper[4825]: I0310 08:59:39.673648 4825 generic.go:334] "Generic (PLEG): container finished" podID="246ff60d-084c-4575-bab8-a0c1084f56b1" containerID="c97c0062f4648a491afe75e3de369b40be89a21b869264cee150873d0d5b9e46" exitCode=0 Mar 10 08:59:39 crc kubenswrapper[4825]: I0310 08:59:39.673704 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v8fbx" event={"ID":"246ff60d-084c-4575-bab8-a0c1084f56b1","Type":"ContainerDied","Data":"c97c0062f4648a491afe75e3de369b40be89a21b869264cee150873d0d5b9e46"} Mar 10 08:59:39 crc kubenswrapper[4825]: I0310 08:59:39.673745 4825 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-v8fbx" event={"ID":"246ff60d-084c-4575-bab8-a0c1084f56b1","Type":"ContainerDied","Data":"c29db7ed53254412daa5ab416cb0cda23be157b90bfb67c4f7a9a2a8554e6af3"} Mar 10 08:59:39 crc kubenswrapper[4825]: I0310 08:59:39.673765 4825 scope.go:117] "RemoveContainer" containerID="c97c0062f4648a491afe75e3de369b40be89a21b869264cee150873d0d5b9e46" Mar 10 08:59:39 crc kubenswrapper[4825]: I0310 08:59:39.673800 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v8fbx" Mar 10 08:59:39 crc kubenswrapper[4825]: I0310 08:59:39.705531 4825 scope.go:117] "RemoveContainer" containerID="a215873007e3ef4c0d16ee1958e7756bf4b3e53b988c921ec4f163b90509651f" Mar 10 08:59:39 crc kubenswrapper[4825]: I0310 08:59:39.731371 4825 scope.go:117] "RemoveContainer" containerID="419b2d164dc820827ef0a21dfbc547c428623e2aae544b21ce0a48e975df36cf" Mar 10 08:59:39 crc kubenswrapper[4825]: I0310 08:59:39.734928 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v8fbx"] Mar 10 08:59:39 crc kubenswrapper[4825]: I0310 08:59:39.746708 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-v8fbx"] Mar 10 08:59:39 crc kubenswrapper[4825]: I0310 08:59:39.790968 4825 scope.go:117] "RemoveContainer" containerID="c97c0062f4648a491afe75e3de369b40be89a21b869264cee150873d0d5b9e46" Mar 10 08:59:39 crc kubenswrapper[4825]: E0310 08:59:39.791527 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c97c0062f4648a491afe75e3de369b40be89a21b869264cee150873d0d5b9e46\": container with ID starting with c97c0062f4648a491afe75e3de369b40be89a21b869264cee150873d0d5b9e46 not found: ID does not exist" containerID="c97c0062f4648a491afe75e3de369b40be89a21b869264cee150873d0d5b9e46" Mar 10 08:59:39 crc kubenswrapper[4825]: I0310 08:59:39.791858 4825 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c97c0062f4648a491afe75e3de369b40be89a21b869264cee150873d0d5b9e46"} err="failed to get container status \"c97c0062f4648a491afe75e3de369b40be89a21b869264cee150873d0d5b9e46\": rpc error: code = NotFound desc = could not find container \"c97c0062f4648a491afe75e3de369b40be89a21b869264cee150873d0d5b9e46\": container with ID starting with c97c0062f4648a491afe75e3de369b40be89a21b869264cee150873d0d5b9e46 not found: ID does not exist" Mar 10 08:59:39 crc kubenswrapper[4825]: I0310 08:59:39.791930 4825 scope.go:117] "RemoveContainer" containerID="a215873007e3ef4c0d16ee1958e7756bf4b3e53b988c921ec4f163b90509651f" Mar 10 08:59:39 crc kubenswrapper[4825]: E0310 08:59:39.792389 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a215873007e3ef4c0d16ee1958e7756bf4b3e53b988c921ec4f163b90509651f\": container with ID starting with a215873007e3ef4c0d16ee1958e7756bf4b3e53b988c921ec4f163b90509651f not found: ID does not exist" containerID="a215873007e3ef4c0d16ee1958e7756bf4b3e53b988c921ec4f163b90509651f" Mar 10 08:59:39 crc kubenswrapper[4825]: I0310 08:59:39.792429 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a215873007e3ef4c0d16ee1958e7756bf4b3e53b988c921ec4f163b90509651f"} err="failed to get container status \"a215873007e3ef4c0d16ee1958e7756bf4b3e53b988c921ec4f163b90509651f\": rpc error: code = NotFound desc = could not find container \"a215873007e3ef4c0d16ee1958e7756bf4b3e53b988c921ec4f163b90509651f\": container with ID starting with a215873007e3ef4c0d16ee1958e7756bf4b3e53b988c921ec4f163b90509651f not found: ID does not exist" Mar 10 08:59:39 crc kubenswrapper[4825]: I0310 08:59:39.792480 4825 scope.go:117] "RemoveContainer" containerID="419b2d164dc820827ef0a21dfbc547c428623e2aae544b21ce0a48e975df36cf" Mar 10 08:59:39 crc kubenswrapper[4825]: E0310 
08:59:39.792810 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"419b2d164dc820827ef0a21dfbc547c428623e2aae544b21ce0a48e975df36cf\": container with ID starting with 419b2d164dc820827ef0a21dfbc547c428623e2aae544b21ce0a48e975df36cf not found: ID does not exist" containerID="419b2d164dc820827ef0a21dfbc547c428623e2aae544b21ce0a48e975df36cf" Mar 10 08:59:39 crc kubenswrapper[4825]: I0310 08:59:39.792845 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"419b2d164dc820827ef0a21dfbc547c428623e2aae544b21ce0a48e975df36cf"} err="failed to get container status \"419b2d164dc820827ef0a21dfbc547c428623e2aae544b21ce0a48e975df36cf\": rpc error: code = NotFound desc = could not find container \"419b2d164dc820827ef0a21dfbc547c428623e2aae544b21ce0a48e975df36cf\": container with ID starting with 419b2d164dc820827ef0a21dfbc547c428623e2aae544b21ce0a48e975df36cf not found: ID does not exist" Mar 10 08:59:41 crc kubenswrapper[4825]: I0310 08:59:41.249108 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="246ff60d-084c-4575-bab8-a0c1084f56b1" path="/var/lib/kubelet/pods/246ff60d-084c-4575-bab8-a0c1084f56b1/volumes" Mar 10 09:00:00 crc kubenswrapper[4825]: I0310 09:00:00.165371 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552220-rn9kv"] Mar 10 09:00:00 crc kubenswrapper[4825]: E0310 09:00:00.166392 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="246ff60d-084c-4575-bab8-a0c1084f56b1" containerName="extract-utilities" Mar 10 09:00:00 crc kubenswrapper[4825]: I0310 09:00:00.166408 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="246ff60d-084c-4575-bab8-a0c1084f56b1" containerName="extract-utilities" Mar 10 09:00:00 crc kubenswrapper[4825]: E0310 09:00:00.166426 4825 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="246ff60d-084c-4575-bab8-a0c1084f56b1" containerName="extract-content" Mar 10 09:00:00 crc kubenswrapper[4825]: I0310 09:00:00.166434 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="246ff60d-084c-4575-bab8-a0c1084f56b1" containerName="extract-content" Mar 10 09:00:00 crc kubenswrapper[4825]: E0310 09:00:00.166446 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="246ff60d-084c-4575-bab8-a0c1084f56b1" containerName="registry-server" Mar 10 09:00:00 crc kubenswrapper[4825]: I0310 09:00:00.166455 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="246ff60d-084c-4575-bab8-a0c1084f56b1" containerName="registry-server" Mar 10 09:00:00 crc kubenswrapper[4825]: I0310 09:00:00.166715 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="246ff60d-084c-4575-bab8-a0c1084f56b1" containerName="registry-server" Mar 10 09:00:00 crc kubenswrapper[4825]: I0310 09:00:00.167636 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552220-rn9kv" Mar 10 09:00:00 crc kubenswrapper[4825]: I0310 09:00:00.170167 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 09:00:00 crc kubenswrapper[4825]: I0310 09:00:00.171943 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:00:00 crc kubenswrapper[4825]: I0310 09:00:00.173337 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:00:00 crc kubenswrapper[4825]: I0310 09:00:00.182363 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552220-kfpr2"] Mar 10 09:00:00 crc kubenswrapper[4825]: I0310 09:00:00.184031 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-kfpr2" Mar 10 09:00:00 crc kubenswrapper[4825]: I0310 09:00:00.196459 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 09:00:00 crc kubenswrapper[4825]: I0310 09:00:00.196487 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 09:00:00 crc kubenswrapper[4825]: I0310 09:00:00.202487 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552220-rn9kv"] Mar 10 09:00:00 crc kubenswrapper[4825]: I0310 09:00:00.219335 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552220-kfpr2"] Mar 10 09:00:00 crc kubenswrapper[4825]: I0310 09:00:00.317198 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fb9043c4-3531-40f5-8349-e79f0d6705ab-secret-volume\") pod \"collect-profiles-29552220-kfpr2\" (UID: \"fb9043c4-3531-40f5-8349-e79f0d6705ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-kfpr2" Mar 10 09:00:00 crc kubenswrapper[4825]: I0310 09:00:00.317411 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb9043c4-3531-40f5-8349-e79f0d6705ab-config-volume\") pod \"collect-profiles-29552220-kfpr2\" (UID: \"fb9043c4-3531-40f5-8349-e79f0d6705ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-kfpr2" Mar 10 09:00:00 crc kubenswrapper[4825]: I0310 09:00:00.317508 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x745\" (UniqueName: 
\"kubernetes.io/projected/c7834d68-a65e-4887-9b5a-3608b5778b4d-kube-api-access-8x745\") pod \"auto-csr-approver-29552220-rn9kv\" (UID: \"c7834d68-a65e-4887-9b5a-3608b5778b4d\") " pod="openshift-infra/auto-csr-approver-29552220-rn9kv" Mar 10 09:00:00 crc kubenswrapper[4825]: I0310 09:00:00.317686 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc7m8\" (UniqueName: \"kubernetes.io/projected/fb9043c4-3531-40f5-8349-e79f0d6705ab-kube-api-access-tc7m8\") pod \"collect-profiles-29552220-kfpr2\" (UID: \"fb9043c4-3531-40f5-8349-e79f0d6705ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-kfpr2" Mar 10 09:00:00 crc kubenswrapper[4825]: I0310 09:00:00.419563 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fb9043c4-3531-40f5-8349-e79f0d6705ab-secret-volume\") pod \"collect-profiles-29552220-kfpr2\" (UID: \"fb9043c4-3531-40f5-8349-e79f0d6705ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-kfpr2" Mar 10 09:00:00 crc kubenswrapper[4825]: I0310 09:00:00.420290 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb9043c4-3531-40f5-8349-e79f0d6705ab-config-volume\") pod \"collect-profiles-29552220-kfpr2\" (UID: \"fb9043c4-3531-40f5-8349-e79f0d6705ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-kfpr2" Mar 10 09:00:00 crc kubenswrapper[4825]: I0310 09:00:00.420372 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x745\" (UniqueName: \"kubernetes.io/projected/c7834d68-a65e-4887-9b5a-3608b5778b4d-kube-api-access-8x745\") pod \"auto-csr-approver-29552220-rn9kv\" (UID: \"c7834d68-a65e-4887-9b5a-3608b5778b4d\") " pod="openshift-infra/auto-csr-approver-29552220-rn9kv" Mar 10 09:00:00 crc kubenswrapper[4825]: I0310 
09:00:00.420616 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tc7m8\" (UniqueName: \"kubernetes.io/projected/fb9043c4-3531-40f5-8349-e79f0d6705ab-kube-api-access-tc7m8\") pod \"collect-profiles-29552220-kfpr2\" (UID: \"fb9043c4-3531-40f5-8349-e79f0d6705ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-kfpr2" Mar 10 09:00:00 crc kubenswrapper[4825]: I0310 09:00:00.421642 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb9043c4-3531-40f5-8349-e79f0d6705ab-config-volume\") pod \"collect-profiles-29552220-kfpr2\" (UID: \"fb9043c4-3531-40f5-8349-e79f0d6705ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-kfpr2" Mar 10 09:00:00 crc kubenswrapper[4825]: I0310 09:00:00.426378 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fb9043c4-3531-40f5-8349-e79f0d6705ab-secret-volume\") pod \"collect-profiles-29552220-kfpr2\" (UID: \"fb9043c4-3531-40f5-8349-e79f0d6705ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-kfpr2" Mar 10 09:00:00 crc kubenswrapper[4825]: I0310 09:00:00.436870 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x745\" (UniqueName: \"kubernetes.io/projected/c7834d68-a65e-4887-9b5a-3608b5778b4d-kube-api-access-8x745\") pod \"auto-csr-approver-29552220-rn9kv\" (UID: \"c7834d68-a65e-4887-9b5a-3608b5778b4d\") " pod="openshift-infra/auto-csr-approver-29552220-rn9kv" Mar 10 09:00:00 crc kubenswrapper[4825]: I0310 09:00:00.441058 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc7m8\" (UniqueName: \"kubernetes.io/projected/fb9043c4-3531-40f5-8349-e79f0d6705ab-kube-api-access-tc7m8\") pod \"collect-profiles-29552220-kfpr2\" (UID: \"fb9043c4-3531-40f5-8349-e79f0d6705ab\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-kfpr2" Mar 10 09:00:00 crc kubenswrapper[4825]: I0310 09:00:00.491952 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552220-rn9kv" Mar 10 09:00:00 crc kubenswrapper[4825]: I0310 09:00:00.514018 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-kfpr2" Mar 10 09:00:00 crc kubenswrapper[4825]: I0310 09:00:00.995025 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552220-rn9kv"] Mar 10 09:00:01 crc kubenswrapper[4825]: I0310 09:00:01.005336 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552220-kfpr2"] Mar 10 09:00:01 crc kubenswrapper[4825]: W0310 09:00:01.008734 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb9043c4_3531_40f5_8349_e79f0d6705ab.slice/crio-70de139ed82a160eb86127bf58bc84612e4d7942071d1cbbe5e270b90c550c9a WatchSource:0}: Error finding container 70de139ed82a160eb86127bf58bc84612e4d7942071d1cbbe5e270b90c550c9a: Status 404 returned error can't find the container with id 70de139ed82a160eb86127bf58bc84612e4d7942071d1cbbe5e270b90c550c9a Mar 10 09:00:01 crc kubenswrapper[4825]: I0310 09:00:01.885418 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552220-rn9kv" event={"ID":"c7834d68-a65e-4887-9b5a-3608b5778b4d","Type":"ContainerStarted","Data":"e970242ef56168ea3ca1ccb76e949e1a82446a9bc4854a4e5b8dd12c3ab45e6c"} Mar 10 09:00:01 crc kubenswrapper[4825]: I0310 09:00:01.887787 4825 generic.go:334] "Generic (PLEG): container finished" podID="fb9043c4-3531-40f5-8349-e79f0d6705ab" containerID="4f96e8f13f806522d207a97d6587d61e721f26dc61279f61fb000a9c432385fb" exitCode=0 Mar 10 09:00:01 crc 
kubenswrapper[4825]: I0310 09:00:01.887813 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-kfpr2" event={"ID":"fb9043c4-3531-40f5-8349-e79f0d6705ab","Type":"ContainerDied","Data":"4f96e8f13f806522d207a97d6587d61e721f26dc61279f61fb000a9c432385fb"} Mar 10 09:00:01 crc kubenswrapper[4825]: I0310 09:00:01.887828 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-kfpr2" event={"ID":"fb9043c4-3531-40f5-8349-e79f0d6705ab","Type":"ContainerStarted","Data":"70de139ed82a160eb86127bf58bc84612e4d7942071d1cbbe5e270b90c550c9a"} Mar 10 09:00:03 crc kubenswrapper[4825]: I0310 09:00:03.303916 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-kfpr2" Mar 10 09:00:03 crc kubenswrapper[4825]: I0310 09:00:03.491249 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fb9043c4-3531-40f5-8349-e79f0d6705ab-secret-volume\") pod \"fb9043c4-3531-40f5-8349-e79f0d6705ab\" (UID: \"fb9043c4-3531-40f5-8349-e79f0d6705ab\") " Mar 10 09:00:03 crc kubenswrapper[4825]: I0310 09:00:03.491360 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb9043c4-3531-40f5-8349-e79f0d6705ab-config-volume\") pod \"fb9043c4-3531-40f5-8349-e79f0d6705ab\" (UID: \"fb9043c4-3531-40f5-8349-e79f0d6705ab\") " Mar 10 09:00:03 crc kubenswrapper[4825]: I0310 09:00:03.491412 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tc7m8\" (UniqueName: \"kubernetes.io/projected/fb9043c4-3531-40f5-8349-e79f0d6705ab-kube-api-access-tc7m8\") pod \"fb9043c4-3531-40f5-8349-e79f0d6705ab\" (UID: \"fb9043c4-3531-40f5-8349-e79f0d6705ab\") " Mar 10 09:00:03 crc kubenswrapper[4825]: 
I0310 09:00:03.491957 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb9043c4-3531-40f5-8349-e79f0d6705ab-config-volume" (OuterVolumeSpecName: "config-volume") pod "fb9043c4-3531-40f5-8349-e79f0d6705ab" (UID: "fb9043c4-3531-40f5-8349-e79f0d6705ab"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:00:03 crc kubenswrapper[4825]: I0310 09:00:03.496705 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb9043c4-3531-40f5-8349-e79f0d6705ab-kube-api-access-tc7m8" (OuterVolumeSpecName: "kube-api-access-tc7m8") pod "fb9043c4-3531-40f5-8349-e79f0d6705ab" (UID: "fb9043c4-3531-40f5-8349-e79f0d6705ab"). InnerVolumeSpecName "kube-api-access-tc7m8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:00:03 crc kubenswrapper[4825]: I0310 09:00:03.503905 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb9043c4-3531-40f5-8349-e79f0d6705ab-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fb9043c4-3531-40f5-8349-e79f0d6705ab" (UID: "fb9043c4-3531-40f5-8349-e79f0d6705ab"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:00:03 crc kubenswrapper[4825]: I0310 09:00:03.593564 4825 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fb9043c4-3531-40f5-8349-e79f0d6705ab-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 09:00:03 crc kubenswrapper[4825]: I0310 09:00:03.593607 4825 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb9043c4-3531-40f5-8349-e79f0d6705ab-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 09:00:03 crc kubenswrapper[4825]: I0310 09:00:03.593621 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tc7m8\" (UniqueName: \"kubernetes.io/projected/fb9043c4-3531-40f5-8349-e79f0d6705ab-kube-api-access-tc7m8\") on node \"crc\" DevicePath \"\"" Mar 10 09:00:03 crc kubenswrapper[4825]: I0310 09:00:03.909346 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-kfpr2" event={"ID":"fb9043c4-3531-40f5-8349-e79f0d6705ab","Type":"ContainerDied","Data":"70de139ed82a160eb86127bf58bc84612e4d7942071d1cbbe5e270b90c550c9a"} Mar 10 09:00:03 crc kubenswrapper[4825]: I0310 09:00:03.909404 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70de139ed82a160eb86127bf58bc84612e4d7942071d1cbbe5e270b90c550c9a" Mar 10 09:00:03 crc kubenswrapper[4825]: I0310 09:00:03.909473 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552220-kfpr2" Mar 10 09:00:04 crc kubenswrapper[4825]: I0310 09:00:04.384911 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552175-b5k8l"] Mar 10 09:00:04 crc kubenswrapper[4825]: I0310 09:00:04.398679 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552175-b5k8l"] Mar 10 09:00:04 crc kubenswrapper[4825]: I0310 09:00:04.919505 4825 generic.go:334] "Generic (PLEG): container finished" podID="c7834d68-a65e-4887-9b5a-3608b5778b4d" containerID="1e6103c3e35e9fe8fd9b3775bf57c6f318a307bedfb973e85bcbd8d9de3aeb5a" exitCode=0 Mar 10 09:00:04 crc kubenswrapper[4825]: I0310 09:00:04.919552 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552220-rn9kv" event={"ID":"c7834d68-a65e-4887-9b5a-3608b5778b4d","Type":"ContainerDied","Data":"1e6103c3e35e9fe8fd9b3775bf57c6f318a307bedfb973e85bcbd8d9de3aeb5a"} Mar 10 09:00:05 crc kubenswrapper[4825]: I0310 09:00:05.255597 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="634b3cd8-d25a-4786-8206-d0ab216d359d" path="/var/lib/kubelet/pods/634b3cd8-d25a-4786-8206-d0ab216d359d/volumes" Mar 10 09:00:06 crc kubenswrapper[4825]: I0310 09:00:06.305605 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552220-rn9kv" Mar 10 09:00:06 crc kubenswrapper[4825]: I0310 09:00:06.451051 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8x745\" (UniqueName: \"kubernetes.io/projected/c7834d68-a65e-4887-9b5a-3608b5778b4d-kube-api-access-8x745\") pod \"c7834d68-a65e-4887-9b5a-3608b5778b4d\" (UID: \"c7834d68-a65e-4887-9b5a-3608b5778b4d\") " Mar 10 09:00:06 crc kubenswrapper[4825]: I0310 09:00:06.456334 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7834d68-a65e-4887-9b5a-3608b5778b4d-kube-api-access-8x745" (OuterVolumeSpecName: "kube-api-access-8x745") pod "c7834d68-a65e-4887-9b5a-3608b5778b4d" (UID: "c7834d68-a65e-4887-9b5a-3608b5778b4d"). InnerVolumeSpecName "kube-api-access-8x745". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:00:06 crc kubenswrapper[4825]: I0310 09:00:06.554119 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8x745\" (UniqueName: \"kubernetes.io/projected/c7834d68-a65e-4887-9b5a-3608b5778b4d-kube-api-access-8x745\") on node \"crc\" DevicePath \"\"" Mar 10 09:00:06 crc kubenswrapper[4825]: I0310 09:00:06.940637 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552220-rn9kv" event={"ID":"c7834d68-a65e-4887-9b5a-3608b5778b4d","Type":"ContainerDied","Data":"e970242ef56168ea3ca1ccb76e949e1a82446a9bc4854a4e5b8dd12c3ab45e6c"} Mar 10 09:00:06 crc kubenswrapper[4825]: I0310 09:00:06.940684 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e970242ef56168ea3ca1ccb76e949e1a82446a9bc4854a4e5b8dd12c3ab45e6c" Mar 10 09:00:06 crc kubenswrapper[4825]: I0310 09:00:06.940710 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552220-rn9kv" Mar 10 09:00:07 crc kubenswrapper[4825]: I0310 09:00:07.367364 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552214-82k5m"] Mar 10 09:00:07 crc kubenswrapper[4825]: I0310 09:00:07.376417 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552214-82k5m"] Mar 10 09:00:09 crc kubenswrapper[4825]: I0310 09:00:09.252509 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0332297-f277-4a15-966b-d529656eb1b6" path="/var/lib/kubelet/pods/e0332297-f277-4a15-966b-d529656eb1b6/volumes" Mar 10 09:00:26 crc kubenswrapper[4825]: I0310 09:00:26.150892 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-fpdt8" event={"ID":"be72d3b9-862b-4b9b-83a6-b8a991ddd51a","Type":"ContainerDied","Data":"69c39c4a46b343de1d15dcfee51d1b4f6b3c4ac1b0008f6c1fcc643b5048fbe1"} Mar 10 09:00:26 crc kubenswrapper[4825]: I0310 09:00:26.150977 4825 generic.go:334] "Generic (PLEG): container finished" podID="be72d3b9-862b-4b9b-83a6-b8a991ddd51a" containerID="69c39c4a46b343de1d15dcfee51d1b4f6b3c4ac1b0008f6c1fcc643b5048fbe1" exitCode=0 Mar 10 09:00:27 crc kubenswrapper[4825]: I0310 09:00:27.635875 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-fpdt8" Mar 10 09:00:27 crc kubenswrapper[4825]: I0310 09:00:27.682570 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be72d3b9-862b-4b9b-83a6-b8a991ddd51a-inventory\") pod \"be72d3b9-862b-4b9b-83a6-b8a991ddd51a\" (UID: \"be72d3b9-862b-4b9b-83a6-b8a991ddd51a\") " Mar 10 09:00:27 crc kubenswrapper[4825]: I0310 09:00:27.682652 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/be72d3b9-862b-4b9b-83a6-b8a991ddd51a-neutron-sriov-agent-neutron-config-0\") pod \"be72d3b9-862b-4b9b-83a6-b8a991ddd51a\" (UID: \"be72d3b9-862b-4b9b-83a6-b8a991ddd51a\") " Mar 10 09:00:27 crc kubenswrapper[4825]: I0310 09:00:27.682809 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be72d3b9-862b-4b9b-83a6-b8a991ddd51a-neutron-sriov-combined-ca-bundle\") pod \"be72d3b9-862b-4b9b-83a6-b8a991ddd51a\" (UID: \"be72d3b9-862b-4b9b-83a6-b8a991ddd51a\") " Mar 10 09:00:27 crc kubenswrapper[4825]: I0310 09:00:27.682859 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/be72d3b9-862b-4b9b-83a6-b8a991ddd51a-ssh-key-openstack-cell1\") pod \"be72d3b9-862b-4b9b-83a6-b8a991ddd51a\" (UID: \"be72d3b9-862b-4b9b-83a6-b8a991ddd51a\") " Mar 10 09:00:27 crc kubenswrapper[4825]: I0310 09:00:27.682982 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hx7j2\" (UniqueName: \"kubernetes.io/projected/be72d3b9-862b-4b9b-83a6-b8a991ddd51a-kube-api-access-hx7j2\") pod \"be72d3b9-862b-4b9b-83a6-b8a991ddd51a\" (UID: \"be72d3b9-862b-4b9b-83a6-b8a991ddd51a\") " Mar 10 09:00:27 crc kubenswrapper[4825]: I0310 
09:00:27.744855 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be72d3b9-862b-4b9b-83a6-b8a991ddd51a-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "be72d3b9-862b-4b9b-83a6-b8a991ddd51a" (UID: "be72d3b9-862b-4b9b-83a6-b8a991ddd51a"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:00:27 crc kubenswrapper[4825]: I0310 09:00:27.756706 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be72d3b9-862b-4b9b-83a6-b8a991ddd51a-kube-api-access-hx7j2" (OuterVolumeSpecName: "kube-api-access-hx7j2") pod "be72d3b9-862b-4b9b-83a6-b8a991ddd51a" (UID: "be72d3b9-862b-4b9b-83a6-b8a991ddd51a"). InnerVolumeSpecName "kube-api-access-hx7j2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:00:27 crc kubenswrapper[4825]: I0310 09:00:27.780355 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be72d3b9-862b-4b9b-83a6-b8a991ddd51a-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "be72d3b9-862b-4b9b-83a6-b8a991ddd51a" (UID: "be72d3b9-862b-4b9b-83a6-b8a991ddd51a"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:00:27 crc kubenswrapper[4825]: I0310 09:00:27.785993 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hx7j2\" (UniqueName: \"kubernetes.io/projected/be72d3b9-862b-4b9b-83a6-b8a991ddd51a-kube-api-access-hx7j2\") on node \"crc\" DevicePath \"\"" Mar 10 09:00:27 crc kubenswrapper[4825]: I0310 09:00:27.786032 4825 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be72d3b9-862b-4b9b-83a6-b8a991ddd51a-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:00:27 crc kubenswrapper[4825]: I0310 09:00:27.786042 4825 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/be72d3b9-862b-4b9b-83a6-b8a991ddd51a-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 10 09:00:27 crc kubenswrapper[4825]: I0310 09:00:27.800288 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be72d3b9-862b-4b9b-83a6-b8a991ddd51a-inventory" (OuterVolumeSpecName: "inventory") pod "be72d3b9-862b-4b9b-83a6-b8a991ddd51a" (UID: "be72d3b9-862b-4b9b-83a6-b8a991ddd51a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:00:27 crc kubenswrapper[4825]: I0310 09:00:27.830261 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be72d3b9-862b-4b9b-83a6-b8a991ddd51a-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "be72d3b9-862b-4b9b-83a6-b8a991ddd51a" (UID: "be72d3b9-862b-4b9b-83a6-b8a991ddd51a"). InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:00:27 crc kubenswrapper[4825]: I0310 09:00:27.888015 4825 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be72d3b9-862b-4b9b-83a6-b8a991ddd51a-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 09:00:27 crc kubenswrapper[4825]: I0310 09:00:27.888051 4825 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/be72d3b9-862b-4b9b-83a6-b8a991ddd51a-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 10 09:00:28 crc kubenswrapper[4825]: I0310 09:00:28.169639 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-fpdt8" event={"ID":"be72d3b9-862b-4b9b-83a6-b8a991ddd51a","Type":"ContainerDied","Data":"a1996d6562b3a8d09f53fbe39b13b1fbf22e1f1baca70a85f00e3926ea8b0678"} Mar 10 09:00:28 crc kubenswrapper[4825]: I0310 09:00:28.169677 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1996d6562b3a8d09f53fbe39b13b1fbf22e1f1baca70a85f00e3926ea8b0678" Mar 10 09:00:28 crc kubenswrapper[4825]: I0310 09:00:28.169741 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-fpdt8" Mar 10 09:00:28 crc kubenswrapper[4825]: I0310 09:00:28.271539 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-7nh2n"] Mar 10 09:00:28 crc kubenswrapper[4825]: E0310 09:00:28.272052 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7834d68-a65e-4887-9b5a-3608b5778b4d" containerName="oc" Mar 10 09:00:28 crc kubenswrapper[4825]: I0310 09:00:28.272077 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7834d68-a65e-4887-9b5a-3608b5778b4d" containerName="oc" Mar 10 09:00:28 crc kubenswrapper[4825]: E0310 09:00:28.272114 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb9043c4-3531-40f5-8349-e79f0d6705ab" containerName="collect-profiles" Mar 10 09:00:28 crc kubenswrapper[4825]: I0310 09:00:28.272124 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb9043c4-3531-40f5-8349-e79f0d6705ab" containerName="collect-profiles" Mar 10 09:00:28 crc kubenswrapper[4825]: E0310 09:00:28.272185 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be72d3b9-862b-4b9b-83a6-b8a991ddd51a" containerName="neutron-sriov-openstack-openstack-cell1" Mar 10 09:00:28 crc kubenswrapper[4825]: I0310 09:00:28.272195 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="be72d3b9-862b-4b9b-83a6-b8a991ddd51a" containerName="neutron-sriov-openstack-openstack-cell1" Mar 10 09:00:28 crc kubenswrapper[4825]: I0310 09:00:28.272430 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb9043c4-3531-40f5-8349-e79f0d6705ab" containerName="collect-profiles" Mar 10 09:00:28 crc kubenswrapper[4825]: I0310 09:00:28.272456 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7834d68-a65e-4887-9b5a-3608b5778b4d" containerName="oc" Mar 10 09:00:28 crc kubenswrapper[4825]: I0310 09:00:28.272475 4825 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="be72d3b9-862b-4b9b-83a6-b8a991ddd51a" containerName="neutron-sriov-openstack-openstack-cell1" Mar 10 09:00:28 crc kubenswrapper[4825]: I0310 09:00:28.273384 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-7nh2n" Mar 10 09:00:28 crc kubenswrapper[4825]: I0310 09:00:28.275880 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-5twvv" Mar 10 09:00:28 crc kubenswrapper[4825]: I0310 09:00:28.276091 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 09:00:28 crc kubenswrapper[4825]: I0310 09:00:28.276901 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-dhcp-agent-neutron-config" Mar 10 09:00:28 crc kubenswrapper[4825]: I0310 09:00:28.277064 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 10 09:00:28 crc kubenswrapper[4825]: I0310 09:00:28.282222 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-7nh2n"] Mar 10 09:00:28 crc kubenswrapper[4825]: I0310 09:00:28.306874 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 10 09:00:28 crc kubenswrapper[4825]: I0310 09:00:28.409860 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/21a36d26-5d6d-4c12-869e-2b09e847886a-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-7nh2n\" (UID: \"21a36d26-5d6d-4c12-869e-2b09e847886a\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-7nh2n" Mar 10 09:00:28 crc kubenswrapper[4825]: I0310 09:00:28.409961 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/21a36d26-5d6d-4c12-869e-2b09e847886a-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-7nh2n\" (UID: \"21a36d26-5d6d-4c12-869e-2b09e847886a\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-7nh2n" Mar 10 09:00:28 crc kubenswrapper[4825]: I0310 09:00:28.410044 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm4hh\" (UniqueName: \"kubernetes.io/projected/21a36d26-5d6d-4c12-869e-2b09e847886a-kube-api-access-rm4hh\") pod \"neutron-dhcp-openstack-openstack-cell1-7nh2n\" (UID: \"21a36d26-5d6d-4c12-869e-2b09e847886a\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-7nh2n" Mar 10 09:00:28 crc kubenswrapper[4825]: I0310 09:00:28.410247 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21a36d26-5d6d-4c12-869e-2b09e847886a-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-7nh2n\" (UID: \"21a36d26-5d6d-4c12-869e-2b09e847886a\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-7nh2n" Mar 10 09:00:28 crc kubenswrapper[4825]: I0310 09:00:28.410346 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/21a36d26-5d6d-4c12-869e-2b09e847886a-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-7nh2n\" (UID: \"21a36d26-5d6d-4c12-869e-2b09e847886a\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-7nh2n" Mar 10 09:00:28 crc kubenswrapper[4825]: I0310 09:00:28.512629 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/21a36d26-5d6d-4c12-869e-2b09e847886a-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-7nh2n\" (UID: 
\"21a36d26-5d6d-4c12-869e-2b09e847886a\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-7nh2n" Mar 10 09:00:28 crc kubenswrapper[4825]: I0310 09:00:28.512740 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21a36d26-5d6d-4c12-869e-2b09e847886a-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-7nh2n\" (UID: \"21a36d26-5d6d-4c12-869e-2b09e847886a\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-7nh2n" Mar 10 09:00:28 crc kubenswrapper[4825]: I0310 09:00:28.512802 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rm4hh\" (UniqueName: \"kubernetes.io/projected/21a36d26-5d6d-4c12-869e-2b09e847886a-kube-api-access-rm4hh\") pod \"neutron-dhcp-openstack-openstack-cell1-7nh2n\" (UID: \"21a36d26-5d6d-4c12-869e-2b09e847886a\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-7nh2n" Mar 10 09:00:28 crc kubenswrapper[4825]: I0310 09:00:28.513009 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21a36d26-5d6d-4c12-869e-2b09e847886a-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-7nh2n\" (UID: \"21a36d26-5d6d-4c12-869e-2b09e847886a\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-7nh2n" Mar 10 09:00:28 crc kubenswrapper[4825]: I0310 09:00:28.513122 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/21a36d26-5d6d-4c12-869e-2b09e847886a-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-7nh2n\" (UID: \"21a36d26-5d6d-4c12-869e-2b09e847886a\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-7nh2n" Mar 10 09:00:28 crc kubenswrapper[4825]: I0310 09:00:28.516644 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/21a36d26-5d6d-4c12-869e-2b09e847886a-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-7nh2n\" (UID: \"21a36d26-5d6d-4c12-869e-2b09e847886a\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-7nh2n" Mar 10 09:00:28 crc kubenswrapper[4825]: I0310 09:00:28.518695 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21a36d26-5d6d-4c12-869e-2b09e847886a-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-7nh2n\" (UID: \"21a36d26-5d6d-4c12-869e-2b09e847886a\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-7nh2n" Mar 10 09:00:28 crc kubenswrapper[4825]: I0310 09:00:28.520287 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21a36d26-5d6d-4c12-869e-2b09e847886a-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-7nh2n\" (UID: \"21a36d26-5d6d-4c12-869e-2b09e847886a\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-7nh2n" Mar 10 09:00:28 crc kubenswrapper[4825]: I0310 09:00:28.520923 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/21a36d26-5d6d-4c12-869e-2b09e847886a-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-7nh2n\" (UID: \"21a36d26-5d6d-4c12-869e-2b09e847886a\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-7nh2n" Mar 10 09:00:28 crc kubenswrapper[4825]: I0310 09:00:28.536216 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm4hh\" (UniqueName: \"kubernetes.io/projected/21a36d26-5d6d-4c12-869e-2b09e847886a-kube-api-access-rm4hh\") pod \"neutron-dhcp-openstack-openstack-cell1-7nh2n\" (UID: \"21a36d26-5d6d-4c12-869e-2b09e847886a\") " 
pod="openstack/neutron-dhcp-openstack-openstack-cell1-7nh2n" Mar 10 09:00:28 crc kubenswrapper[4825]: I0310 09:00:28.624104 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-7nh2n" Mar 10 09:00:29 crc kubenswrapper[4825]: I0310 09:00:29.161498 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-7nh2n"] Mar 10 09:00:29 crc kubenswrapper[4825]: I0310 09:00:29.192661 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-7nh2n" event={"ID":"21a36d26-5d6d-4c12-869e-2b09e847886a","Type":"ContainerStarted","Data":"511236ca9871b941f9b5b8b16ae1ae97b096998f259329a2a021df80abd09763"} Mar 10 09:00:31 crc kubenswrapper[4825]: I0310 09:00:31.217797 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-7nh2n" event={"ID":"21a36d26-5d6d-4c12-869e-2b09e847886a","Type":"ContainerStarted","Data":"e776ef31b883df46e920d8481019fc3e890e9f6483e209cab325f71ab137b1f1"} Mar 10 09:00:31 crc kubenswrapper[4825]: I0310 09:00:31.248177 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dhcp-openstack-openstack-cell1-7nh2n" podStartSLOduration=2.2958923110000002 podStartE2EDuration="3.248104018s" podCreationTimestamp="2026-03-10 09:00:28 +0000 UTC" firstStartedPulling="2026-03-10 09:00:29.178024728 +0000 UTC m=+8182.207805343" lastFinishedPulling="2026-03-10 09:00:30.130236395 +0000 UTC m=+8183.160017050" observedRunningTime="2026-03-10 09:00:31.241502562 +0000 UTC m=+8184.271283177" watchObservedRunningTime="2026-03-10 09:00:31.248104018 +0000 UTC m=+8184.277884663" Mar 10 09:00:46 crc kubenswrapper[4825]: I0310 09:00:46.887883 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:00:46 crc kubenswrapper[4825]: I0310 09:00:46.888495 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:00:54 crc kubenswrapper[4825]: I0310 09:00:54.848670 4825 scope.go:117] "RemoveContainer" containerID="f319bafbe0385242850f13f1baaf7c73d8b2f7b3e5f67bb20c6a571dbe6448c4" Mar 10 09:00:54 crc kubenswrapper[4825]: I0310 09:00:54.916421 4825 scope.go:117] "RemoveContainer" containerID="69d280806fb41c5c27e18f2c606cad1ef2f2991f0cf454bc8d1d590fb4946a5e" Mar 10 09:01:00 crc kubenswrapper[4825]: I0310 09:01:00.168679 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29552221-sbsfl"] Mar 10 09:01:00 crc kubenswrapper[4825]: I0310 09:01:00.170607 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29552221-sbsfl" Mar 10 09:01:00 crc kubenswrapper[4825]: I0310 09:01:00.183455 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29552221-sbsfl"] Mar 10 09:01:00 crc kubenswrapper[4825]: I0310 09:01:00.284911 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfrf2\" (UniqueName: \"kubernetes.io/projected/1304cf5f-c346-4ccd-b858-c8e63f3d0056-kube-api-access-rfrf2\") pod \"keystone-cron-29552221-sbsfl\" (UID: \"1304cf5f-c346-4ccd-b858-c8e63f3d0056\") " pod="openstack/keystone-cron-29552221-sbsfl" Mar 10 09:01:00 crc kubenswrapper[4825]: I0310 09:01:00.284969 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1304cf5f-c346-4ccd-b858-c8e63f3d0056-combined-ca-bundle\") pod \"keystone-cron-29552221-sbsfl\" (UID: \"1304cf5f-c346-4ccd-b858-c8e63f3d0056\") " pod="openstack/keystone-cron-29552221-sbsfl" Mar 10 09:01:00 crc kubenswrapper[4825]: I0310 09:01:00.285349 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1304cf5f-c346-4ccd-b858-c8e63f3d0056-config-data\") pod \"keystone-cron-29552221-sbsfl\" (UID: \"1304cf5f-c346-4ccd-b858-c8e63f3d0056\") " pod="openstack/keystone-cron-29552221-sbsfl" Mar 10 09:01:00 crc kubenswrapper[4825]: I0310 09:01:00.285480 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1304cf5f-c346-4ccd-b858-c8e63f3d0056-fernet-keys\") pod \"keystone-cron-29552221-sbsfl\" (UID: \"1304cf5f-c346-4ccd-b858-c8e63f3d0056\") " pod="openstack/keystone-cron-29552221-sbsfl" Mar 10 09:01:00 crc kubenswrapper[4825]: I0310 09:01:00.387944 4825 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-rfrf2\" (UniqueName: \"kubernetes.io/projected/1304cf5f-c346-4ccd-b858-c8e63f3d0056-kube-api-access-rfrf2\") pod \"keystone-cron-29552221-sbsfl\" (UID: \"1304cf5f-c346-4ccd-b858-c8e63f3d0056\") " pod="openstack/keystone-cron-29552221-sbsfl" Mar 10 09:01:00 crc kubenswrapper[4825]: I0310 09:01:00.387998 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1304cf5f-c346-4ccd-b858-c8e63f3d0056-combined-ca-bundle\") pod \"keystone-cron-29552221-sbsfl\" (UID: \"1304cf5f-c346-4ccd-b858-c8e63f3d0056\") " pod="openstack/keystone-cron-29552221-sbsfl" Mar 10 09:01:00 crc kubenswrapper[4825]: I0310 09:01:00.388052 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1304cf5f-c346-4ccd-b858-c8e63f3d0056-config-data\") pod \"keystone-cron-29552221-sbsfl\" (UID: \"1304cf5f-c346-4ccd-b858-c8e63f3d0056\") " pod="openstack/keystone-cron-29552221-sbsfl" Mar 10 09:01:00 crc kubenswrapper[4825]: I0310 09:01:00.388081 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1304cf5f-c346-4ccd-b858-c8e63f3d0056-fernet-keys\") pod \"keystone-cron-29552221-sbsfl\" (UID: \"1304cf5f-c346-4ccd-b858-c8e63f3d0056\") " pod="openstack/keystone-cron-29552221-sbsfl" Mar 10 09:01:00 crc kubenswrapper[4825]: I0310 09:01:00.396162 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1304cf5f-c346-4ccd-b858-c8e63f3d0056-combined-ca-bundle\") pod \"keystone-cron-29552221-sbsfl\" (UID: \"1304cf5f-c346-4ccd-b858-c8e63f3d0056\") " pod="openstack/keystone-cron-29552221-sbsfl" Mar 10 09:01:00 crc kubenswrapper[4825]: I0310 09:01:00.396458 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/1304cf5f-c346-4ccd-b858-c8e63f3d0056-fernet-keys\") pod \"keystone-cron-29552221-sbsfl\" (UID: \"1304cf5f-c346-4ccd-b858-c8e63f3d0056\") " pod="openstack/keystone-cron-29552221-sbsfl" Mar 10 09:01:00 crc kubenswrapper[4825]: I0310 09:01:00.396981 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1304cf5f-c346-4ccd-b858-c8e63f3d0056-config-data\") pod \"keystone-cron-29552221-sbsfl\" (UID: \"1304cf5f-c346-4ccd-b858-c8e63f3d0056\") " pod="openstack/keystone-cron-29552221-sbsfl" Mar 10 09:01:00 crc kubenswrapper[4825]: I0310 09:01:00.405427 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfrf2\" (UniqueName: \"kubernetes.io/projected/1304cf5f-c346-4ccd-b858-c8e63f3d0056-kube-api-access-rfrf2\") pod \"keystone-cron-29552221-sbsfl\" (UID: \"1304cf5f-c346-4ccd-b858-c8e63f3d0056\") " pod="openstack/keystone-cron-29552221-sbsfl" Mar 10 09:01:00 crc kubenswrapper[4825]: I0310 09:01:00.542927 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29552221-sbsfl" Mar 10 09:01:01 crc kubenswrapper[4825]: I0310 09:01:01.057290 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29552221-sbsfl"] Mar 10 09:01:01 crc kubenswrapper[4825]: I0310 09:01:01.516659 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29552221-sbsfl" event={"ID":"1304cf5f-c346-4ccd-b858-c8e63f3d0056","Type":"ContainerStarted","Data":"c349ed28b83898cca5b4a7f6cb4b10be8551e203a3a337245c31d6ca540b3dcf"} Mar 10 09:01:01 crc kubenswrapper[4825]: I0310 09:01:01.516989 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29552221-sbsfl" event={"ID":"1304cf5f-c346-4ccd-b858-c8e63f3d0056","Type":"ContainerStarted","Data":"4fdc6c82cb57a4f6a25d92d91b06ba55de9edebfcce2014489dd545015a7a345"} Mar 10 09:01:01 crc kubenswrapper[4825]: I0310 09:01:01.534990 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29552221-sbsfl" podStartSLOduration=1.534965538 podStartE2EDuration="1.534965538s" podCreationTimestamp="2026-03-10 09:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:01:01.531919817 +0000 UTC m=+8214.561700432" watchObservedRunningTime="2026-03-10 09:01:01.534965538 +0000 UTC m=+8214.564746153" Mar 10 09:01:04 crc kubenswrapper[4825]: I0310 09:01:04.554805 4825 generic.go:334] "Generic (PLEG): container finished" podID="1304cf5f-c346-4ccd-b858-c8e63f3d0056" containerID="c349ed28b83898cca5b4a7f6cb4b10be8551e203a3a337245c31d6ca540b3dcf" exitCode=0 Mar 10 09:01:04 crc kubenswrapper[4825]: I0310 09:01:04.554853 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29552221-sbsfl" 
event={"ID":"1304cf5f-c346-4ccd-b858-c8e63f3d0056","Type":"ContainerDied","Data":"c349ed28b83898cca5b4a7f6cb4b10be8551e203a3a337245c31d6ca540b3dcf"} Mar 10 09:01:05 crc kubenswrapper[4825]: I0310 09:01:05.997440 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29552221-sbsfl" Mar 10 09:01:06 crc kubenswrapper[4825]: I0310 09:01:06.111013 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfrf2\" (UniqueName: \"kubernetes.io/projected/1304cf5f-c346-4ccd-b858-c8e63f3d0056-kube-api-access-rfrf2\") pod \"1304cf5f-c346-4ccd-b858-c8e63f3d0056\" (UID: \"1304cf5f-c346-4ccd-b858-c8e63f3d0056\") " Mar 10 09:01:06 crc kubenswrapper[4825]: I0310 09:01:06.111152 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1304cf5f-c346-4ccd-b858-c8e63f3d0056-fernet-keys\") pod \"1304cf5f-c346-4ccd-b858-c8e63f3d0056\" (UID: \"1304cf5f-c346-4ccd-b858-c8e63f3d0056\") " Mar 10 09:01:06 crc kubenswrapper[4825]: I0310 09:01:06.111226 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1304cf5f-c346-4ccd-b858-c8e63f3d0056-config-data\") pod \"1304cf5f-c346-4ccd-b858-c8e63f3d0056\" (UID: \"1304cf5f-c346-4ccd-b858-c8e63f3d0056\") " Mar 10 09:01:06 crc kubenswrapper[4825]: I0310 09:01:06.111340 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1304cf5f-c346-4ccd-b858-c8e63f3d0056-combined-ca-bundle\") pod \"1304cf5f-c346-4ccd-b858-c8e63f3d0056\" (UID: \"1304cf5f-c346-4ccd-b858-c8e63f3d0056\") " Mar 10 09:01:06 crc kubenswrapper[4825]: I0310 09:01:06.117433 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1304cf5f-c346-4ccd-b858-c8e63f3d0056-kube-api-access-rfrf2" 
(OuterVolumeSpecName: "kube-api-access-rfrf2") pod "1304cf5f-c346-4ccd-b858-c8e63f3d0056" (UID: "1304cf5f-c346-4ccd-b858-c8e63f3d0056"). InnerVolumeSpecName "kube-api-access-rfrf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:01:06 crc kubenswrapper[4825]: I0310 09:01:06.117534 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1304cf5f-c346-4ccd-b858-c8e63f3d0056-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "1304cf5f-c346-4ccd-b858-c8e63f3d0056" (UID: "1304cf5f-c346-4ccd-b858-c8e63f3d0056"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:01:06 crc kubenswrapper[4825]: I0310 09:01:06.139631 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1304cf5f-c346-4ccd-b858-c8e63f3d0056-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1304cf5f-c346-4ccd-b858-c8e63f3d0056" (UID: "1304cf5f-c346-4ccd-b858-c8e63f3d0056"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:01:06 crc kubenswrapper[4825]: I0310 09:01:06.194128 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1304cf5f-c346-4ccd-b858-c8e63f3d0056-config-data" (OuterVolumeSpecName: "config-data") pod "1304cf5f-c346-4ccd-b858-c8e63f3d0056" (UID: "1304cf5f-c346-4ccd-b858-c8e63f3d0056"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:01:06 crc kubenswrapper[4825]: I0310 09:01:06.214034 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfrf2\" (UniqueName: \"kubernetes.io/projected/1304cf5f-c346-4ccd-b858-c8e63f3d0056-kube-api-access-rfrf2\") on node \"crc\" DevicePath \"\"" Mar 10 09:01:06 crc kubenswrapper[4825]: I0310 09:01:06.214081 4825 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1304cf5f-c346-4ccd-b858-c8e63f3d0056-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 10 09:01:06 crc kubenswrapper[4825]: I0310 09:01:06.214094 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1304cf5f-c346-4ccd-b858-c8e63f3d0056-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:01:06 crc kubenswrapper[4825]: I0310 09:01:06.214108 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1304cf5f-c346-4ccd-b858-c8e63f3d0056-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:01:06 crc kubenswrapper[4825]: I0310 09:01:06.585360 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29552221-sbsfl" event={"ID":"1304cf5f-c346-4ccd-b858-c8e63f3d0056","Type":"ContainerDied","Data":"4fdc6c82cb57a4f6a25d92d91b06ba55de9edebfcce2014489dd545015a7a345"} Mar 10 09:01:06 crc kubenswrapper[4825]: I0310 09:01:06.585425 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fdc6c82cb57a4f6a25d92d91b06ba55de9edebfcce2014489dd545015a7a345" Mar 10 09:01:06 crc kubenswrapper[4825]: I0310 09:01:06.585570 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29552221-sbsfl" Mar 10 09:01:16 crc kubenswrapper[4825]: I0310 09:01:16.888255 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:01:16 crc kubenswrapper[4825]: I0310 09:01:16.888809 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:01:43 crc kubenswrapper[4825]: I0310 09:01:43.960897 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-7nh2n" event={"ID":"21a36d26-5d6d-4c12-869e-2b09e847886a","Type":"ContainerDied","Data":"e776ef31b883df46e920d8481019fc3e890e9f6483e209cab325f71ab137b1f1"} Mar 10 09:01:43 crc kubenswrapper[4825]: I0310 09:01:43.960984 4825 generic.go:334] "Generic (PLEG): container finished" podID="21a36d26-5d6d-4c12-869e-2b09e847886a" containerID="e776ef31b883df46e920d8481019fc3e890e9f6483e209cab325f71ab137b1f1" exitCode=0 Mar 10 09:01:45 crc kubenswrapper[4825]: I0310 09:01:45.509253 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-7nh2n" Mar 10 09:01:45 crc kubenswrapper[4825]: I0310 09:01:45.650364 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21a36d26-5d6d-4c12-869e-2b09e847886a-neutron-dhcp-combined-ca-bundle\") pod \"21a36d26-5d6d-4c12-869e-2b09e847886a\" (UID: \"21a36d26-5d6d-4c12-869e-2b09e847886a\") " Mar 10 09:01:45 crc kubenswrapper[4825]: I0310 09:01:45.650431 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/21a36d26-5d6d-4c12-869e-2b09e847886a-neutron-dhcp-agent-neutron-config-0\") pod \"21a36d26-5d6d-4c12-869e-2b09e847886a\" (UID: \"21a36d26-5d6d-4c12-869e-2b09e847886a\") " Mar 10 09:01:45 crc kubenswrapper[4825]: I0310 09:01:45.650540 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21a36d26-5d6d-4c12-869e-2b09e847886a-inventory\") pod \"21a36d26-5d6d-4c12-869e-2b09e847886a\" (UID: \"21a36d26-5d6d-4c12-869e-2b09e847886a\") " Mar 10 09:01:45 crc kubenswrapper[4825]: I0310 09:01:45.650611 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/21a36d26-5d6d-4c12-869e-2b09e847886a-ssh-key-openstack-cell1\") pod \"21a36d26-5d6d-4c12-869e-2b09e847886a\" (UID: \"21a36d26-5d6d-4c12-869e-2b09e847886a\") " Mar 10 09:01:45 crc kubenswrapper[4825]: I0310 09:01:45.650816 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rm4hh\" (UniqueName: \"kubernetes.io/projected/21a36d26-5d6d-4c12-869e-2b09e847886a-kube-api-access-rm4hh\") pod \"21a36d26-5d6d-4c12-869e-2b09e847886a\" (UID: \"21a36d26-5d6d-4c12-869e-2b09e847886a\") " Mar 10 09:01:45 crc kubenswrapper[4825]: I0310 
09:01:45.657321 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21a36d26-5d6d-4c12-869e-2b09e847886a-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "21a36d26-5d6d-4c12-869e-2b09e847886a" (UID: "21a36d26-5d6d-4c12-869e-2b09e847886a"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:01:45 crc kubenswrapper[4825]: I0310 09:01:45.660336 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21a36d26-5d6d-4c12-869e-2b09e847886a-kube-api-access-rm4hh" (OuterVolumeSpecName: "kube-api-access-rm4hh") pod "21a36d26-5d6d-4c12-869e-2b09e847886a" (UID: "21a36d26-5d6d-4c12-869e-2b09e847886a"). InnerVolumeSpecName "kube-api-access-rm4hh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:01:45 crc kubenswrapper[4825]: I0310 09:01:45.685158 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21a36d26-5d6d-4c12-869e-2b09e847886a-inventory" (OuterVolumeSpecName: "inventory") pod "21a36d26-5d6d-4c12-869e-2b09e847886a" (UID: "21a36d26-5d6d-4c12-869e-2b09e847886a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:01:45 crc kubenswrapper[4825]: I0310 09:01:45.688714 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21a36d26-5d6d-4c12-869e-2b09e847886a-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "21a36d26-5d6d-4c12-869e-2b09e847886a" (UID: "21a36d26-5d6d-4c12-869e-2b09e847886a"). InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:01:45 crc kubenswrapper[4825]: I0310 09:01:45.698587 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21a36d26-5d6d-4c12-869e-2b09e847886a-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "21a36d26-5d6d-4c12-869e-2b09e847886a" (UID: "21a36d26-5d6d-4c12-869e-2b09e847886a"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:01:45 crc kubenswrapper[4825]: I0310 09:01:45.755235 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rm4hh\" (UniqueName: \"kubernetes.io/projected/21a36d26-5d6d-4c12-869e-2b09e847886a-kube-api-access-rm4hh\") on node \"crc\" DevicePath \"\"" Mar 10 09:01:45 crc kubenswrapper[4825]: I0310 09:01:45.755274 4825 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21a36d26-5d6d-4c12-869e-2b09e847886a-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:01:45 crc kubenswrapper[4825]: I0310 09:01:45.755287 4825 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/21a36d26-5d6d-4c12-869e-2b09e847886a-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 10 09:01:45 crc kubenswrapper[4825]: I0310 09:01:45.755302 4825 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21a36d26-5d6d-4c12-869e-2b09e847886a-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 09:01:45 crc kubenswrapper[4825]: I0310 09:01:45.755313 4825 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/21a36d26-5d6d-4c12-869e-2b09e847886a-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 10 09:01:45 crc kubenswrapper[4825]: I0310 
09:01:45.992485 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-7nh2n" event={"ID":"21a36d26-5d6d-4c12-869e-2b09e847886a","Type":"ContainerDied","Data":"511236ca9871b941f9b5b8b16ae1ae97b096998f259329a2a021df80abd09763"} Mar 10 09:01:45 crc kubenswrapper[4825]: I0310 09:01:45.992532 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="511236ca9871b941f9b5b8b16ae1ae97b096998f259329a2a021df80abd09763" Mar 10 09:01:45 crc kubenswrapper[4825]: I0310 09:01:45.992558 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-7nh2n" Mar 10 09:01:46 crc kubenswrapper[4825]: I0310 09:01:46.887994 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:01:46 crc kubenswrapper[4825]: I0310 09:01:46.888061 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:01:46 crc kubenswrapper[4825]: I0310 09:01:46.888109 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" Mar 10 09:01:46 crc kubenswrapper[4825]: I0310 09:01:46.888933 4825 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"390899e144160760e4121c06e9f2e317e0be0012bf795558da227f948f1f1b1f"} 
pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 09:01:46 crc kubenswrapper[4825]: I0310 09:01:46.889003 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" containerID="cri-o://390899e144160760e4121c06e9f2e317e0be0012bf795558da227f948f1f1b1f" gracePeriod=600 Mar 10 09:01:47 crc kubenswrapper[4825]: E0310 09:01:47.020466 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:01:48 crc kubenswrapper[4825]: I0310 09:01:48.016258 4825 generic.go:334] "Generic (PLEG): container finished" podID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerID="390899e144160760e4121c06e9f2e317e0be0012bf795558da227f948f1f1b1f" exitCode=0 Mar 10 09:01:48 crc kubenswrapper[4825]: I0310 09:01:48.016350 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerDied","Data":"390899e144160760e4121c06e9f2e317e0be0012bf795558da227f948f1f1b1f"} Mar 10 09:01:48 crc kubenswrapper[4825]: I0310 09:01:48.017420 4825 scope.go:117] "RemoveContainer" containerID="81b5ce52c08a356df2c6362cda9b87f444da49d61e188354e699ec5130fe40a8" Mar 10 09:01:48 crc kubenswrapper[4825]: I0310 09:01:48.018435 4825 scope.go:117] "RemoveContainer" containerID="390899e144160760e4121c06e9f2e317e0be0012bf795558da227f948f1f1b1f" Mar 
10 09:01:48 crc kubenswrapper[4825]: E0310 09:01:48.018751 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:01:59 crc kubenswrapper[4825]: I0310 09:01:59.243999 4825 scope.go:117] "RemoveContainer" containerID="390899e144160760e4121c06e9f2e317e0be0012bf795558da227f948f1f1b1f" Mar 10 09:01:59 crc kubenswrapper[4825]: E0310 09:01:59.244814 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:02:00 crc kubenswrapper[4825]: I0310 09:02:00.150211 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552222-w2fmn"] Mar 10 09:02:00 crc kubenswrapper[4825]: E0310 09:02:00.151066 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1304cf5f-c346-4ccd-b858-c8e63f3d0056" containerName="keystone-cron" Mar 10 09:02:00 crc kubenswrapper[4825]: I0310 09:02:00.151090 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="1304cf5f-c346-4ccd-b858-c8e63f3d0056" containerName="keystone-cron" Mar 10 09:02:00 crc kubenswrapper[4825]: E0310 09:02:00.151113 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21a36d26-5d6d-4c12-869e-2b09e847886a" containerName="neutron-dhcp-openstack-openstack-cell1" Mar 10 09:02:00 crc kubenswrapper[4825]: I0310 
09:02:00.151123 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="21a36d26-5d6d-4c12-869e-2b09e847886a" containerName="neutron-dhcp-openstack-openstack-cell1" Mar 10 09:02:00 crc kubenswrapper[4825]: I0310 09:02:00.151518 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="1304cf5f-c346-4ccd-b858-c8e63f3d0056" containerName="keystone-cron" Mar 10 09:02:00 crc kubenswrapper[4825]: I0310 09:02:00.151556 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="21a36d26-5d6d-4c12-869e-2b09e847886a" containerName="neutron-dhcp-openstack-openstack-cell1" Mar 10 09:02:00 crc kubenswrapper[4825]: I0310 09:02:00.153211 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552222-w2fmn" Mar 10 09:02:00 crc kubenswrapper[4825]: I0310 09:02:00.156001 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:02:00 crc kubenswrapper[4825]: I0310 09:02:00.156386 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:02:00 crc kubenswrapper[4825]: I0310 09:02:00.156571 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 09:02:00 crc kubenswrapper[4825]: I0310 09:02:00.183957 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552222-w2fmn"] Mar 10 09:02:00 crc kubenswrapper[4825]: I0310 09:02:00.206549 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmtdw\" (UniqueName: \"kubernetes.io/projected/7219dcd8-f33b-4734-9f06-5d232060adcd-kube-api-access-wmtdw\") pod \"auto-csr-approver-29552222-w2fmn\" (UID: \"7219dcd8-f33b-4734-9f06-5d232060adcd\") " pod="openshift-infra/auto-csr-approver-29552222-w2fmn" Mar 10 09:02:00 crc kubenswrapper[4825]: I0310 09:02:00.309293 4825 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmtdw\" (UniqueName: \"kubernetes.io/projected/7219dcd8-f33b-4734-9f06-5d232060adcd-kube-api-access-wmtdw\") pod \"auto-csr-approver-29552222-w2fmn\" (UID: \"7219dcd8-f33b-4734-9f06-5d232060adcd\") " pod="openshift-infra/auto-csr-approver-29552222-w2fmn" Mar 10 09:02:00 crc kubenswrapper[4825]: I0310 09:02:00.332027 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmtdw\" (UniqueName: \"kubernetes.io/projected/7219dcd8-f33b-4734-9f06-5d232060adcd-kube-api-access-wmtdw\") pod \"auto-csr-approver-29552222-w2fmn\" (UID: \"7219dcd8-f33b-4734-9f06-5d232060adcd\") " pod="openshift-infra/auto-csr-approver-29552222-w2fmn" Mar 10 09:02:00 crc kubenswrapper[4825]: I0310 09:02:00.483974 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552222-w2fmn" Mar 10 09:02:00 crc kubenswrapper[4825]: W0310 09:02:00.938068 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7219dcd8_f33b_4734_9f06_5d232060adcd.slice/crio-6e6f35c4bf2076604c1e406a89ea1e292dbfeea1cd281897e19f726e99c2803e WatchSource:0}: Error finding container 6e6f35c4bf2076604c1e406a89ea1e292dbfeea1cd281897e19f726e99c2803e: Status 404 returned error can't find the container with id 6e6f35c4bf2076604c1e406a89ea1e292dbfeea1cd281897e19f726e99c2803e Mar 10 09:02:00 crc kubenswrapper[4825]: I0310 09:02:00.940701 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552222-w2fmn"] Mar 10 09:02:01 crc kubenswrapper[4825]: I0310 09:02:01.167541 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552222-w2fmn" event={"ID":"7219dcd8-f33b-4734-9f06-5d232060adcd","Type":"ContainerStarted","Data":"6e6f35c4bf2076604c1e406a89ea1e292dbfeea1cd281897e19f726e99c2803e"} Mar 10 
09:02:03 crc kubenswrapper[4825]: I0310 09:02:03.192754 4825 generic.go:334] "Generic (PLEG): container finished" podID="7219dcd8-f33b-4734-9f06-5d232060adcd" containerID="714206142d3ba3d6d1b5114fe11faf0404775b17a7a0dbcb4ebe4f83ff657c7b" exitCode=0 Mar 10 09:02:03 crc kubenswrapper[4825]: I0310 09:02:03.192846 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552222-w2fmn" event={"ID":"7219dcd8-f33b-4734-9f06-5d232060adcd","Type":"ContainerDied","Data":"714206142d3ba3d6d1b5114fe11faf0404775b17a7a0dbcb4ebe4f83ff657c7b"} Mar 10 09:02:03 crc kubenswrapper[4825]: I0310 09:02:03.781451 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 09:02:03 crc kubenswrapper[4825]: I0310 09:02:03.781686 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="81060137-51ef-4ed2-a740-aeb94217f912" containerName="nova-cell0-conductor-conductor" containerID="cri-o://78f7952655702f7d5aee9e67af3b70c3c97c4a73047658e3a8e794fcc744b3ae" gracePeriod=30 Mar 10 09:02:04 crc kubenswrapper[4825]: I0310 09:02:04.306064 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 09:02:04 crc kubenswrapper[4825]: I0310 09:02:04.306311 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="e5972ff3-b652-4ed2-8414-404f2d7b24e0" containerName="nova-cell1-conductor-conductor" containerID="cri-o://5dd6b79a640eb1c780a7157a5aa0793fc089f2ff62c548e00d338a55ea63a00c" gracePeriod=30 Mar 10 09:02:04 crc kubenswrapper[4825]: I0310 09:02:04.520277 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 09:02:04 crc kubenswrapper[4825]: I0310 09:02:04.520516 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="5189c199-0d2b-40c2-8fe8-04bdea46a84c" 
containerName="nova-scheduler-scheduler" containerID="cri-o://2b8ba41409430104450ede717f08bf58654090af49362e9caf0f65c4d13c1060" gracePeriod=30 Mar 10 09:02:04 crc kubenswrapper[4825]: I0310 09:02:04.545309 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 09:02:04 crc kubenswrapper[4825]: I0310 09:02:04.545567 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="62dad1be-8413-4a7e-b097-b6d2e1ec3246" containerName="nova-api-log" containerID="cri-o://e58f953a04ebd72b37b1f49cf6ed5a60c52afd420da2fa4ea715f458184f4621" gracePeriod=30 Mar 10 09:02:04 crc kubenswrapper[4825]: I0310 09:02:04.545990 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="62dad1be-8413-4a7e-b097-b6d2e1ec3246" containerName="nova-api-api" containerID="cri-o://53501f735e5e94a19464b8ed3190c4c9cc118c8695bb55f561845902951cfdbc" gracePeriod=30 Mar 10 09:02:04 crc kubenswrapper[4825]: I0310 09:02:04.561744 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 09:02:04 crc kubenswrapper[4825]: I0310 09:02:04.562229 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ac546983-f13f-4257-ad1d-f0ff7398a28b" containerName="nova-metadata-log" containerID="cri-o://9e25877e7cb42aa5c8c8b1f77338e31beeeb7dfb90d6daa03b3a197a82f65e77" gracePeriod=30 Mar 10 09:02:04 crc kubenswrapper[4825]: I0310 09:02:04.562644 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ac546983-f13f-4257-ad1d-f0ff7398a28b" containerName="nova-metadata-metadata" containerID="cri-o://a150344ed6b966cabe77567d2b52d46ec4894b221ec2780fc56ef6d0b3b79aea" gracePeriod=30 Mar 10 09:02:04 crc kubenswrapper[4825]: I0310 09:02:04.612870 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552222-w2fmn" Mar 10 09:02:04 crc kubenswrapper[4825]: I0310 09:02:04.717626 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmtdw\" (UniqueName: \"kubernetes.io/projected/7219dcd8-f33b-4734-9f06-5d232060adcd-kube-api-access-wmtdw\") pod \"7219dcd8-f33b-4734-9f06-5d232060adcd\" (UID: \"7219dcd8-f33b-4734-9f06-5d232060adcd\") " Mar 10 09:02:04 crc kubenswrapper[4825]: I0310 09:02:04.727682 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7219dcd8-f33b-4734-9f06-5d232060adcd-kube-api-access-wmtdw" (OuterVolumeSpecName: "kube-api-access-wmtdw") pod "7219dcd8-f33b-4734-9f06-5d232060adcd" (UID: "7219dcd8-f33b-4734-9f06-5d232060adcd"). InnerVolumeSpecName "kube-api-access-wmtdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:02:04 crc kubenswrapper[4825]: E0310 09:02:04.768031 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2b8ba41409430104450ede717f08bf58654090af49362e9caf0f65c4d13c1060" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 09:02:04 crc kubenswrapper[4825]: E0310 09:02:04.804495 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2b8ba41409430104450ede717f08bf58654090af49362e9caf0f65c4d13c1060" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 09:02:04 crc kubenswrapper[4825]: E0310 09:02:04.806675 4825 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="2b8ba41409430104450ede717f08bf58654090af49362e9caf0f65c4d13c1060" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 09:02:04 crc kubenswrapper[4825]: E0310 09:02:04.806733 4825 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="5189c199-0d2b-40c2-8fe8-04bdea46a84c" containerName="nova-scheduler-scheduler" Mar 10 09:02:04 crc kubenswrapper[4825]: I0310 09:02:04.821821 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmtdw\" (UniqueName: \"kubernetes.io/projected/7219dcd8-f33b-4734-9f06-5d232060adcd-kube-api-access-wmtdw\") on node \"crc\" DevicePath \"\"" Mar 10 09:02:05 crc kubenswrapper[4825]: I0310 09:02:05.214729 4825 generic.go:334] "Generic (PLEG): container finished" podID="62dad1be-8413-4a7e-b097-b6d2e1ec3246" containerID="e58f953a04ebd72b37b1f49cf6ed5a60c52afd420da2fa4ea715f458184f4621" exitCode=143 Mar 10 09:02:05 crc kubenswrapper[4825]: I0310 09:02:05.214843 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"62dad1be-8413-4a7e-b097-b6d2e1ec3246","Type":"ContainerDied","Data":"e58f953a04ebd72b37b1f49cf6ed5a60c52afd420da2fa4ea715f458184f4621"} Mar 10 09:02:05 crc kubenswrapper[4825]: I0310 09:02:05.217658 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552222-w2fmn" event={"ID":"7219dcd8-f33b-4734-9f06-5d232060adcd","Type":"ContainerDied","Data":"6e6f35c4bf2076604c1e406a89ea1e292dbfeea1cd281897e19f726e99c2803e"} Mar 10 09:02:05 crc kubenswrapper[4825]: I0310 09:02:05.217813 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e6f35c4bf2076604c1e406a89ea1e292dbfeea1cd281897e19f726e99c2803e" Mar 10 09:02:05 crc kubenswrapper[4825]: I0310 09:02:05.217954 4825 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552222-w2fmn" Mar 10 09:02:05 crc kubenswrapper[4825]: I0310 09:02:05.226520 4825 generic.go:334] "Generic (PLEG): container finished" podID="ac546983-f13f-4257-ad1d-f0ff7398a28b" containerID="9e25877e7cb42aa5c8c8b1f77338e31beeeb7dfb90d6daa03b3a197a82f65e77" exitCode=143 Mar 10 09:02:05 crc kubenswrapper[4825]: I0310 09:02:05.226561 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ac546983-f13f-4257-ad1d-f0ff7398a28b","Type":"ContainerDied","Data":"9e25877e7cb42aa5c8c8b1f77338e31beeeb7dfb90d6daa03b3a197a82f65e77"} Mar 10 09:02:05 crc kubenswrapper[4825]: I0310 09:02:05.730859 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552216-jw2q2"] Mar 10 09:02:05 crc kubenswrapper[4825]: I0310 09:02:05.739860 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552216-jw2q2"] Mar 10 09:02:05 crc kubenswrapper[4825]: I0310 09:02:05.770985 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 10 09:02:05 crc kubenswrapper[4825]: I0310 09:02:05.925790 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 10 09:02:05 crc kubenswrapper[4825]: I0310 09:02:05.944241 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5972ff3-b652-4ed2-8414-404f2d7b24e0-config-data\") pod \"e5972ff3-b652-4ed2-8414-404f2d7b24e0\" (UID: \"e5972ff3-b652-4ed2-8414-404f2d7b24e0\") " Mar 10 09:02:05 crc kubenswrapper[4825]: I0310 09:02:05.944504 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84p4l\" (UniqueName: \"kubernetes.io/projected/e5972ff3-b652-4ed2-8414-404f2d7b24e0-kube-api-access-84p4l\") pod \"e5972ff3-b652-4ed2-8414-404f2d7b24e0\" (UID: \"e5972ff3-b652-4ed2-8414-404f2d7b24e0\") " Mar 10 09:02:05 crc kubenswrapper[4825]: I0310 09:02:05.944541 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5972ff3-b652-4ed2-8414-404f2d7b24e0-combined-ca-bundle\") pod \"e5972ff3-b652-4ed2-8414-404f2d7b24e0\" (UID: \"e5972ff3-b652-4ed2-8414-404f2d7b24e0\") " Mar 10 09:02:05 crc kubenswrapper[4825]: I0310 09:02:05.949070 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5972ff3-b652-4ed2-8414-404f2d7b24e0-kube-api-access-84p4l" (OuterVolumeSpecName: "kube-api-access-84p4l") pod "e5972ff3-b652-4ed2-8414-404f2d7b24e0" (UID: "e5972ff3-b652-4ed2-8414-404f2d7b24e0"). InnerVolumeSpecName "kube-api-access-84p4l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:02:05 crc kubenswrapper[4825]: I0310 09:02:05.986671 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5972ff3-b652-4ed2-8414-404f2d7b24e0-config-data" (OuterVolumeSpecName: "config-data") pod "e5972ff3-b652-4ed2-8414-404f2d7b24e0" (UID: "e5972ff3-b652-4ed2-8414-404f2d7b24e0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:02:05 crc kubenswrapper[4825]: I0310 09:02:05.994737 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5972ff3-b652-4ed2-8414-404f2d7b24e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5972ff3-b652-4ed2-8414-404f2d7b24e0" (UID: "e5972ff3-b652-4ed2-8414-404f2d7b24e0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.046264 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81060137-51ef-4ed2-a740-aeb94217f912-combined-ca-bundle\") pod \"81060137-51ef-4ed2-a740-aeb94217f912\" (UID: \"81060137-51ef-4ed2-a740-aeb94217f912\") " Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.046530 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7z5gv\" (UniqueName: \"kubernetes.io/projected/81060137-51ef-4ed2-a740-aeb94217f912-kube-api-access-7z5gv\") pod \"81060137-51ef-4ed2-a740-aeb94217f912\" (UID: \"81060137-51ef-4ed2-a740-aeb94217f912\") " Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.046564 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81060137-51ef-4ed2-a740-aeb94217f912-config-data\") pod \"81060137-51ef-4ed2-a740-aeb94217f912\" (UID: \"81060137-51ef-4ed2-a740-aeb94217f912\") " Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.046990 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5972ff3-b652-4ed2-8414-404f2d7b24e0-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.047010 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84p4l\" (UniqueName: 
\"kubernetes.io/projected/e5972ff3-b652-4ed2-8414-404f2d7b24e0-kube-api-access-84p4l\") on node \"crc\" DevicePath \"\"" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.047021 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5972ff3-b652-4ed2-8414-404f2d7b24e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.049274 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81060137-51ef-4ed2-a740-aeb94217f912-kube-api-access-7z5gv" (OuterVolumeSpecName: "kube-api-access-7z5gv") pod "81060137-51ef-4ed2-a740-aeb94217f912" (UID: "81060137-51ef-4ed2-a740-aeb94217f912"). InnerVolumeSpecName "kube-api-access-7z5gv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.072767 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81060137-51ef-4ed2-a740-aeb94217f912-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "81060137-51ef-4ed2-a740-aeb94217f912" (UID: "81060137-51ef-4ed2-a740-aeb94217f912"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.074289 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81060137-51ef-4ed2-a740-aeb94217f912-config-data" (OuterVolumeSpecName: "config-data") pod "81060137-51ef-4ed2-a740-aeb94217f912" (UID: "81060137-51ef-4ed2-a740-aeb94217f912"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.148472 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7z5gv\" (UniqueName: \"kubernetes.io/projected/81060137-51ef-4ed2-a740-aeb94217f912-kube-api-access-7z5gv\") on node \"crc\" DevicePath \"\"" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.148512 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81060137-51ef-4ed2-a740-aeb94217f912-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.148523 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81060137-51ef-4ed2-a740-aeb94217f912-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.236191 4825 generic.go:334] "Generic (PLEG): container finished" podID="81060137-51ef-4ed2-a740-aeb94217f912" containerID="78f7952655702f7d5aee9e67af3b70c3c97c4a73047658e3a8e794fcc744b3ae" exitCode=0 Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.236248 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.236243 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"81060137-51ef-4ed2-a740-aeb94217f912","Type":"ContainerDied","Data":"78f7952655702f7d5aee9e67af3b70c3c97c4a73047658e3a8e794fcc744b3ae"} Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.236303 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"81060137-51ef-4ed2-a740-aeb94217f912","Type":"ContainerDied","Data":"88b99cd0a4864516413d8b98dc28ab7468aa1e5bb28924fc9a81cb4cc0243992"} Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.236321 4825 scope.go:117] "RemoveContainer" containerID="78f7952655702f7d5aee9e67af3b70c3c97c4a73047658e3a8e794fcc744b3ae" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.238509 4825 generic.go:334] "Generic (PLEG): container finished" podID="e5972ff3-b652-4ed2-8414-404f2d7b24e0" containerID="5dd6b79a640eb1c780a7157a5aa0793fc089f2ff62c548e00d338a55ea63a00c" exitCode=0 Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.238548 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e5972ff3-b652-4ed2-8414-404f2d7b24e0","Type":"ContainerDied","Data":"5dd6b79a640eb1c780a7157a5aa0793fc089f2ff62c548e00d338a55ea63a00c"} Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.238576 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e5972ff3-b652-4ed2-8414-404f2d7b24e0","Type":"ContainerDied","Data":"6b6670fac27d4102b421612ff592d554e6ff035e7168df4573e2483c3179f626"} Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.238576 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.268406 4825 scope.go:117] "RemoveContainer" containerID="78f7952655702f7d5aee9e67af3b70c3c97c4a73047658e3a8e794fcc744b3ae" Mar 10 09:02:06 crc kubenswrapper[4825]: E0310 09:02:06.269805 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78f7952655702f7d5aee9e67af3b70c3c97c4a73047658e3a8e794fcc744b3ae\": container with ID starting with 78f7952655702f7d5aee9e67af3b70c3c97c4a73047658e3a8e794fcc744b3ae not found: ID does not exist" containerID="78f7952655702f7d5aee9e67af3b70c3c97c4a73047658e3a8e794fcc744b3ae" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.269841 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78f7952655702f7d5aee9e67af3b70c3c97c4a73047658e3a8e794fcc744b3ae"} err="failed to get container status \"78f7952655702f7d5aee9e67af3b70c3c97c4a73047658e3a8e794fcc744b3ae\": rpc error: code = NotFound desc = could not find container \"78f7952655702f7d5aee9e67af3b70c3c97c4a73047658e3a8e794fcc744b3ae\": container with ID starting with 78f7952655702f7d5aee9e67af3b70c3c97c4a73047658e3a8e794fcc744b3ae not found: ID does not exist" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.269863 4825 scope.go:117] "RemoveContainer" containerID="5dd6b79a640eb1c780a7157a5aa0793fc089f2ff62c548e00d338a55ea63a00c" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.303172 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.316172 4825 scope.go:117] "RemoveContainer" containerID="5dd6b79a640eb1c780a7157a5aa0793fc089f2ff62c548e00d338a55ea63a00c" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.317755 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 09:02:06 crc 
kubenswrapper[4825]: E0310 09:02:06.317923 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dd6b79a640eb1c780a7157a5aa0793fc089f2ff62c548e00d338a55ea63a00c\": container with ID starting with 5dd6b79a640eb1c780a7157a5aa0793fc089f2ff62c548e00d338a55ea63a00c not found: ID does not exist" containerID="5dd6b79a640eb1c780a7157a5aa0793fc089f2ff62c548e00d338a55ea63a00c" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.317971 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dd6b79a640eb1c780a7157a5aa0793fc089f2ff62c548e00d338a55ea63a00c"} err="failed to get container status \"5dd6b79a640eb1c780a7157a5aa0793fc089f2ff62c548e00d338a55ea63a00c\": rpc error: code = NotFound desc = could not find container \"5dd6b79a640eb1c780a7157a5aa0793fc089f2ff62c548e00d338a55ea63a00c\": container with ID starting with 5dd6b79a640eb1c780a7157a5aa0793fc089f2ff62c548e00d338a55ea63a00c not found: ID does not exist" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.337592 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.346422 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.355163 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 09:02:06 crc kubenswrapper[4825]: E0310 09:02:06.355580 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5972ff3-b652-4ed2-8414-404f2d7b24e0" containerName="nova-cell1-conductor-conductor" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.355598 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5972ff3-b652-4ed2-8414-404f2d7b24e0" containerName="nova-cell1-conductor-conductor" Mar 10 09:02:06 crc kubenswrapper[4825]: E0310 09:02:06.355612 
4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7219dcd8-f33b-4734-9f06-5d232060adcd" containerName="oc" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.355618 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="7219dcd8-f33b-4734-9f06-5d232060adcd" containerName="oc" Mar 10 09:02:06 crc kubenswrapper[4825]: E0310 09:02:06.355651 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81060137-51ef-4ed2-a740-aeb94217f912" containerName="nova-cell0-conductor-conductor" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.355657 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="81060137-51ef-4ed2-a740-aeb94217f912" containerName="nova-cell0-conductor-conductor" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.355829 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="81060137-51ef-4ed2-a740-aeb94217f912" containerName="nova-cell0-conductor-conductor" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.355858 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5972ff3-b652-4ed2-8414-404f2d7b24e0" containerName="nova-cell1-conductor-conductor" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.355870 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="7219dcd8-f33b-4734-9f06-5d232060adcd" containerName="oc" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.356593 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.358535 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.369649 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.371337 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.373476 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.378768 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.398760 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.455513 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7daff41-c98a-45c9-b2d3-7c4c58f4921f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a7daff41-c98a-45c9-b2d3-7c4c58f4921f\") " pod="openstack/nova-cell0-conductor-0" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.455574 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nsfm\" (UniqueName: \"kubernetes.io/projected/a7daff41-c98a-45c9-b2d3-7c4c58f4921f-kube-api-access-7nsfm\") pod \"nova-cell0-conductor-0\" (UID: \"a7daff41-c98a-45c9-b2d3-7c4c58f4921f\") " pod="openstack/nova-cell0-conductor-0" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.455623 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eb509b3-8b71-4ed1-9ce1-0a0b535b723a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7eb509b3-8b71-4ed1-9ce1-0a0b535b723a\") " pod="openstack/nova-cell1-conductor-0" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.455986 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7eb509b3-8b71-4ed1-9ce1-0a0b535b723a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7eb509b3-8b71-4ed1-9ce1-0a0b535b723a\") " pod="openstack/nova-cell1-conductor-0" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.456236 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7daff41-c98a-45c9-b2d3-7c4c58f4921f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a7daff41-c98a-45c9-b2d3-7c4c58f4921f\") " pod="openstack/nova-cell0-conductor-0" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.456279 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pv4r\" (UniqueName: \"kubernetes.io/projected/7eb509b3-8b71-4ed1-9ce1-0a0b535b723a-kube-api-access-8pv4r\") pod \"nova-cell1-conductor-0\" (UID: \"7eb509b3-8b71-4ed1-9ce1-0a0b535b723a\") " pod="openstack/nova-cell1-conductor-0" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.558698 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7daff41-c98a-45c9-b2d3-7c4c58f4921f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a7daff41-c98a-45c9-b2d3-7c4c58f4921f\") " pod="openstack/nova-cell0-conductor-0" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.558973 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nsfm\" (UniqueName: \"kubernetes.io/projected/a7daff41-c98a-45c9-b2d3-7c4c58f4921f-kube-api-access-7nsfm\") pod \"nova-cell0-conductor-0\" (UID: \"a7daff41-c98a-45c9-b2d3-7c4c58f4921f\") " pod="openstack/nova-cell0-conductor-0" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.559200 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7eb509b3-8b71-4ed1-9ce1-0a0b535b723a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7eb509b3-8b71-4ed1-9ce1-0a0b535b723a\") " pod="openstack/nova-cell1-conductor-0" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.559502 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eb509b3-8b71-4ed1-9ce1-0a0b535b723a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7eb509b3-8b71-4ed1-9ce1-0a0b535b723a\") " pod="openstack/nova-cell1-conductor-0" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.559881 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7daff41-c98a-45c9-b2d3-7c4c58f4921f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a7daff41-c98a-45c9-b2d3-7c4c58f4921f\") " pod="openstack/nova-cell0-conductor-0" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.560012 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pv4r\" (UniqueName: \"kubernetes.io/projected/7eb509b3-8b71-4ed1-9ce1-0a0b535b723a-kube-api-access-8pv4r\") pod \"nova-cell1-conductor-0\" (UID: \"7eb509b3-8b71-4ed1-9ce1-0a0b535b723a\") " pod="openstack/nova-cell1-conductor-0" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.564102 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7daff41-c98a-45c9-b2d3-7c4c58f4921f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a7daff41-c98a-45c9-b2d3-7c4c58f4921f\") " pod="openstack/nova-cell0-conductor-0" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.565338 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eb509b3-8b71-4ed1-9ce1-0a0b535b723a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: 
\"7eb509b3-8b71-4ed1-9ce1-0a0b535b723a\") " pod="openstack/nova-cell1-conductor-0" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.565645 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7daff41-c98a-45c9-b2d3-7c4c58f4921f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a7daff41-c98a-45c9-b2d3-7c4c58f4921f\") " pod="openstack/nova-cell0-conductor-0" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.565887 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eb509b3-8b71-4ed1-9ce1-0a0b535b723a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7eb509b3-8b71-4ed1-9ce1-0a0b535b723a\") " pod="openstack/nova-cell1-conductor-0" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.578695 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pv4r\" (UniqueName: \"kubernetes.io/projected/7eb509b3-8b71-4ed1-9ce1-0a0b535b723a-kube-api-access-8pv4r\") pod \"nova-cell1-conductor-0\" (UID: \"7eb509b3-8b71-4ed1-9ce1-0a0b535b723a\") " pod="openstack/nova-cell1-conductor-0" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.579989 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nsfm\" (UniqueName: \"kubernetes.io/projected/a7daff41-c98a-45c9-b2d3-7c4c58f4921f-kube-api-access-7nsfm\") pod \"nova-cell0-conductor-0\" (UID: \"a7daff41-c98a-45c9-b2d3-7c4c58f4921f\") " pod="openstack/nova-cell0-conductor-0" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.676672 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 10 09:02:06 crc kubenswrapper[4825]: I0310 09:02:06.690567 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 10 09:02:07 crc kubenswrapper[4825]: I0310 09:02:07.169506 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 09:02:07 crc kubenswrapper[4825]: W0310 09:02:07.248933 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7eb509b3_8b71_4ed1_9ce1_0a0b535b723a.slice/crio-2f3dd5467cab53b12af5c949f7a35dafedcc816f6901058f69f868c0ad6745d8 WatchSource:0}: Error finding container 2f3dd5467cab53b12af5c949f7a35dafedcc816f6901058f69f868c0ad6745d8: Status 404 returned error can't find the container with id 2f3dd5467cab53b12af5c949f7a35dafedcc816f6901058f69f868c0ad6745d8 Mar 10 09:02:07 crc kubenswrapper[4825]: I0310 09:02:07.253896 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81060137-51ef-4ed2-a740-aeb94217f912" path="/var/lib/kubelet/pods/81060137-51ef-4ed2-a740-aeb94217f912/volumes" Mar 10 09:02:07 crc kubenswrapper[4825]: I0310 09:02:07.254885 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa238966-2baf-447e-a928-fd65e87c30b1" path="/var/lib/kubelet/pods/aa238966-2baf-447e-a928-fd65e87c30b1/volumes" Mar 10 09:02:07 crc kubenswrapper[4825]: I0310 09:02:07.255878 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5972ff3-b652-4ed2-8414-404f2d7b24e0" path="/var/lib/kubelet/pods/e5972ff3-b652-4ed2-8414-404f2d7b24e0/volumes" Mar 10 09:02:07 crc kubenswrapper[4825]: I0310 09:02:07.256740 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 09:02:07 crc kubenswrapper[4825]: I0310 09:02:07.256783 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a7daff41-c98a-45c9-b2d3-7c4c58f4921f","Type":"ContainerStarted","Data":"16a0ab2942d1de38ba56b4d61d6ab24953a97dae4481d191f2ebfb69d80bcc91"} Mar 10 09:02:07 crc 
kubenswrapper[4825]: I0310 09:02:07.965187 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="62dad1be-8413-4a7e-b097-b6d2e1ec3246" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.135:8774/\": read tcp 10.217.0.2:58948->10.217.1.135:8774: read: connection reset by peer" Mar 10 09:02:07 crc kubenswrapper[4825]: I0310 09:02:07.965376 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="62dad1be-8413-4a7e-b097-b6d2e1ec3246" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.135:8774/\": read tcp 10.217.0.2:58960->10.217.1.135:8774: read: connection reset by peer" Mar 10 09:02:08 crc kubenswrapper[4825]: I0310 09:02:08.042579 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="ac546983-f13f-4257-ad1d-f0ff7398a28b" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.131:8775/\": read tcp 10.217.0.2:35560->10.217.1.131:8775: read: connection reset by peer" Mar 10 09:02:08 crc kubenswrapper[4825]: I0310 09:02:08.042629 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="ac546983-f13f-4257-ad1d-f0ff7398a28b" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.131:8775/\": read tcp 10.217.0.2:35576->10.217.1.131:8775: read: connection reset by peer" Mar 10 09:02:08 crc kubenswrapper[4825]: I0310 09:02:08.270795 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a7daff41-c98a-45c9-b2d3-7c4c58f4921f","Type":"ContainerStarted","Data":"42cc807f47ce612dbeb9fa3c5974d550b8f852e4afb66ea98bdb0b86803b89e2"} Mar 10 09:02:08 crc kubenswrapper[4825]: I0310 09:02:08.271223 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 10 09:02:08 crc kubenswrapper[4825]: I0310 
09:02:08.273707 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"7eb509b3-8b71-4ed1-9ce1-0a0b535b723a","Type":"ContainerStarted","Data":"44e8b2ad2f43dee7b87345fd55afe0cf42fddc8c5a2ce44204e8fadbb479491e"} Mar 10 09:02:08 crc kubenswrapper[4825]: I0310 09:02:08.273739 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"7eb509b3-8b71-4ed1-9ce1-0a0b535b723a","Type":"ContainerStarted","Data":"2f3dd5467cab53b12af5c949f7a35dafedcc816f6901058f69f868c0ad6745d8"} Mar 10 09:02:08 crc kubenswrapper[4825]: I0310 09:02:08.274512 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 10 09:02:08 crc kubenswrapper[4825]: I0310 09:02:08.277108 4825 generic.go:334] "Generic (PLEG): container finished" podID="62dad1be-8413-4a7e-b097-b6d2e1ec3246" containerID="53501f735e5e94a19464b8ed3190c4c9cc118c8695bb55f561845902951cfdbc" exitCode=0 Mar 10 09:02:08 crc kubenswrapper[4825]: I0310 09:02:08.277195 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"62dad1be-8413-4a7e-b097-b6d2e1ec3246","Type":"ContainerDied","Data":"53501f735e5e94a19464b8ed3190c4c9cc118c8695bb55f561845902951cfdbc"} Mar 10 09:02:08 crc kubenswrapper[4825]: I0310 09:02:08.278556 4825 generic.go:334] "Generic (PLEG): container finished" podID="ac546983-f13f-4257-ad1d-f0ff7398a28b" containerID="a150344ed6b966cabe77567d2b52d46ec4894b221ec2780fc56ef6d0b3b79aea" exitCode=0 Mar 10 09:02:08 crc kubenswrapper[4825]: I0310 09:02:08.278582 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ac546983-f13f-4257-ad1d-f0ff7398a28b","Type":"ContainerDied","Data":"a150344ed6b966cabe77567d2b52d46ec4894b221ec2780fc56ef6d0b3b79aea"} Mar 10 09:02:08 crc kubenswrapper[4825]: I0310 09:02:08.298053 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.298031821 podStartE2EDuration="2.298031821s" podCreationTimestamp="2026-03-10 09:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:02:08.289547944 +0000 UTC m=+8281.319328579" watchObservedRunningTime="2026-03-10 09:02:08.298031821 +0000 UTC m=+8281.327812446" Mar 10 09:02:08 crc kubenswrapper[4825]: I0310 09:02:08.321974 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.321948869 podStartE2EDuration="2.321948869s" podCreationTimestamp="2026-03-10 09:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:02:08.306352583 +0000 UTC m=+8281.336133208" watchObservedRunningTime="2026-03-10 09:02:08.321948869 +0000 UTC m=+8281.351729484" Mar 10 09:02:08 crc kubenswrapper[4825]: I0310 09:02:08.737087 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 09:02:08 crc kubenswrapper[4825]: I0310 09:02:08.763728 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 09:02:08 crc kubenswrapper[4825]: I0310 09:02:08.916614 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac546983-f13f-4257-ad1d-f0ff7398a28b-config-data\") pod \"ac546983-f13f-4257-ad1d-f0ff7398a28b\" (UID: \"ac546983-f13f-4257-ad1d-f0ff7398a28b\") " Mar 10 09:02:08 crc kubenswrapper[4825]: I0310 09:02:08.916679 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62dad1be-8413-4a7e-b097-b6d2e1ec3246-public-tls-certs\") pod \"62dad1be-8413-4a7e-b097-b6d2e1ec3246\" (UID: \"62dad1be-8413-4a7e-b097-b6d2e1ec3246\") " Mar 10 09:02:08 crc kubenswrapper[4825]: I0310 09:02:08.916769 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac546983-f13f-4257-ad1d-f0ff7398a28b-nova-metadata-tls-certs\") pod \"ac546983-f13f-4257-ad1d-f0ff7398a28b\" (UID: \"ac546983-f13f-4257-ad1d-f0ff7398a28b\") " Mar 10 09:02:08 crc kubenswrapper[4825]: I0310 09:02:08.916822 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72nms\" (UniqueName: \"kubernetes.io/projected/ac546983-f13f-4257-ad1d-f0ff7398a28b-kube-api-access-72nms\") pod \"ac546983-f13f-4257-ad1d-f0ff7398a28b\" (UID: \"ac546983-f13f-4257-ad1d-f0ff7398a28b\") " Mar 10 09:02:08 crc kubenswrapper[4825]: I0310 09:02:08.916892 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62dad1be-8413-4a7e-b097-b6d2e1ec3246-internal-tls-certs\") pod \"62dad1be-8413-4a7e-b097-b6d2e1ec3246\" (UID: \"62dad1be-8413-4a7e-b097-b6d2e1ec3246\") " Mar 10 09:02:08 crc kubenswrapper[4825]: I0310 09:02:08.916945 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac546983-f13f-4257-ad1d-f0ff7398a28b-combined-ca-bundle\") pod \"ac546983-f13f-4257-ad1d-f0ff7398a28b\" (UID: \"ac546983-f13f-4257-ad1d-f0ff7398a28b\") " Mar 10 09:02:08 crc kubenswrapper[4825]: I0310 09:02:08.916964 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac546983-f13f-4257-ad1d-f0ff7398a28b-logs\") pod \"ac546983-f13f-4257-ad1d-f0ff7398a28b\" (UID: \"ac546983-f13f-4257-ad1d-f0ff7398a28b\") " Mar 10 09:02:08 crc kubenswrapper[4825]: I0310 09:02:08.917047 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62dad1be-8413-4a7e-b097-b6d2e1ec3246-combined-ca-bundle\") pod \"62dad1be-8413-4a7e-b097-b6d2e1ec3246\" (UID: \"62dad1be-8413-4a7e-b097-b6d2e1ec3246\") " Mar 10 09:02:08 crc kubenswrapper[4825]: I0310 09:02:08.917064 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62dad1be-8413-4a7e-b097-b6d2e1ec3246-config-data\") pod \"62dad1be-8413-4a7e-b097-b6d2e1ec3246\" (UID: \"62dad1be-8413-4a7e-b097-b6d2e1ec3246\") " Mar 10 09:02:08 crc kubenswrapper[4825]: I0310 09:02:08.917106 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7k8jz\" (UniqueName: \"kubernetes.io/projected/62dad1be-8413-4a7e-b097-b6d2e1ec3246-kube-api-access-7k8jz\") pod \"62dad1be-8413-4a7e-b097-b6d2e1ec3246\" (UID: \"62dad1be-8413-4a7e-b097-b6d2e1ec3246\") " Mar 10 09:02:08 crc kubenswrapper[4825]: I0310 09:02:08.917143 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62dad1be-8413-4a7e-b097-b6d2e1ec3246-logs\") pod \"62dad1be-8413-4a7e-b097-b6d2e1ec3246\" (UID: \"62dad1be-8413-4a7e-b097-b6d2e1ec3246\") " Mar 10 09:02:08 crc kubenswrapper[4825]: I0310 
09:02:08.918210 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62dad1be-8413-4a7e-b097-b6d2e1ec3246-logs" (OuterVolumeSpecName: "logs") pod "62dad1be-8413-4a7e-b097-b6d2e1ec3246" (UID: "62dad1be-8413-4a7e-b097-b6d2e1ec3246"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:02:08 crc kubenswrapper[4825]: I0310 09:02:08.918546 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac546983-f13f-4257-ad1d-f0ff7398a28b-logs" (OuterVolumeSpecName: "logs") pod "ac546983-f13f-4257-ad1d-f0ff7398a28b" (UID: "ac546983-f13f-4257-ad1d-f0ff7398a28b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:02:08 crc kubenswrapper[4825]: I0310 09:02:08.935044 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62dad1be-8413-4a7e-b097-b6d2e1ec3246-kube-api-access-7k8jz" (OuterVolumeSpecName: "kube-api-access-7k8jz") pod "62dad1be-8413-4a7e-b097-b6d2e1ec3246" (UID: "62dad1be-8413-4a7e-b097-b6d2e1ec3246"). InnerVolumeSpecName "kube-api-access-7k8jz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:02:08 crc kubenswrapper[4825]: I0310 09:02:08.941034 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac546983-f13f-4257-ad1d-f0ff7398a28b-kube-api-access-72nms" (OuterVolumeSpecName: "kube-api-access-72nms") pod "ac546983-f13f-4257-ad1d-f0ff7398a28b" (UID: "ac546983-f13f-4257-ad1d-f0ff7398a28b"). InnerVolumeSpecName "kube-api-access-72nms". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:02:08 crc kubenswrapper[4825]: I0310 09:02:08.992296 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac546983-f13f-4257-ad1d-f0ff7398a28b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ac546983-f13f-4257-ad1d-f0ff7398a28b" (UID: "ac546983-f13f-4257-ad1d-f0ff7398a28b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:02:08 crc kubenswrapper[4825]: I0310 09:02:08.997086 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62dad1be-8413-4a7e-b097-b6d2e1ec3246-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62dad1be-8413-4a7e-b097-b6d2e1ec3246" (UID: "62dad1be-8413-4a7e-b097-b6d2e1ec3246"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.006911 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac546983-f13f-4257-ad1d-f0ff7398a28b-config-data" (OuterVolumeSpecName: "config-data") pod "ac546983-f13f-4257-ad1d-f0ff7398a28b" (UID: "ac546983-f13f-4257-ad1d-f0ff7398a28b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.016404 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62dad1be-8413-4a7e-b097-b6d2e1ec3246-config-data" (OuterVolumeSpecName: "config-data") pod "62dad1be-8413-4a7e-b097-b6d2e1ec3246" (UID: "62dad1be-8413-4a7e-b097-b6d2e1ec3246"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.019999 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72nms\" (UniqueName: \"kubernetes.io/projected/ac546983-f13f-4257-ad1d-f0ff7398a28b-kube-api-access-72nms\") on node \"crc\" DevicePath \"\"" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.020041 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac546983-f13f-4257-ad1d-f0ff7398a28b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.020056 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac546983-f13f-4257-ad1d-f0ff7398a28b-logs\") on node \"crc\" DevicePath \"\"" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.020068 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62dad1be-8413-4a7e-b097-b6d2e1ec3246-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.020079 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62dad1be-8413-4a7e-b097-b6d2e1ec3246-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.020091 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7k8jz\" (UniqueName: \"kubernetes.io/projected/62dad1be-8413-4a7e-b097-b6d2e1ec3246-kube-api-access-7k8jz\") on node \"crc\" DevicePath \"\"" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.020102 4825 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62dad1be-8413-4a7e-b097-b6d2e1ec3246-logs\") on node \"crc\" DevicePath \"\"" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.020116 4825 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac546983-f13f-4257-ad1d-f0ff7398a28b-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.035521 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac546983-f13f-4257-ad1d-f0ff7398a28b-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "ac546983-f13f-4257-ad1d-f0ff7398a28b" (UID: "ac546983-f13f-4257-ad1d-f0ff7398a28b"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.045875 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62dad1be-8413-4a7e-b097-b6d2e1ec3246-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "62dad1be-8413-4a7e-b097-b6d2e1ec3246" (UID: "62dad1be-8413-4a7e-b097-b6d2e1ec3246"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.074868 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62dad1be-8413-4a7e-b097-b6d2e1ec3246-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "62dad1be-8413-4a7e-b097-b6d2e1ec3246" (UID: "62dad1be-8413-4a7e-b097-b6d2e1ec3246"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.076938 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.153723 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5189c199-0d2b-40c2-8fe8-04bdea46a84c-combined-ca-bundle\") pod \"5189c199-0d2b-40c2-8fe8-04bdea46a84c\" (UID: \"5189c199-0d2b-40c2-8fe8-04bdea46a84c\") " Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.153907 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sfdc\" (UniqueName: \"kubernetes.io/projected/5189c199-0d2b-40c2-8fe8-04bdea46a84c-kube-api-access-2sfdc\") pod \"5189c199-0d2b-40c2-8fe8-04bdea46a84c\" (UID: \"5189c199-0d2b-40c2-8fe8-04bdea46a84c\") " Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.153930 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5189c199-0d2b-40c2-8fe8-04bdea46a84c-config-data\") pod \"5189c199-0d2b-40c2-8fe8-04bdea46a84c\" (UID: \"5189c199-0d2b-40c2-8fe8-04bdea46a84c\") " Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.154304 4825 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62dad1be-8413-4a7e-b097-b6d2e1ec3246-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.154322 4825 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac546983-f13f-4257-ad1d-f0ff7398a28b-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.154331 4825 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62dad1be-8413-4a7e-b097-b6d2e1ec3246-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:02:09 crc kubenswrapper[4825]: 
I0310 09:02:09.168871 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5189c199-0d2b-40c2-8fe8-04bdea46a84c-kube-api-access-2sfdc" (OuterVolumeSpecName: "kube-api-access-2sfdc") pod "5189c199-0d2b-40c2-8fe8-04bdea46a84c" (UID: "5189c199-0d2b-40c2-8fe8-04bdea46a84c"). InnerVolumeSpecName "kube-api-access-2sfdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.193249 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5189c199-0d2b-40c2-8fe8-04bdea46a84c-config-data" (OuterVolumeSpecName: "config-data") pod "5189c199-0d2b-40c2-8fe8-04bdea46a84c" (UID: "5189c199-0d2b-40c2-8fe8-04bdea46a84c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.202352 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5189c199-0d2b-40c2-8fe8-04bdea46a84c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5189c199-0d2b-40c2-8fe8-04bdea46a84c" (UID: "5189c199-0d2b-40c2-8fe8-04bdea46a84c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.256652 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2sfdc\" (UniqueName: \"kubernetes.io/projected/5189c199-0d2b-40c2-8fe8-04bdea46a84c-kube-api-access-2sfdc\") on node \"crc\" DevicePath \"\"" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.256879 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5189c199-0d2b-40c2-8fe8-04bdea46a84c-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.256938 4825 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5189c199-0d2b-40c2-8fe8-04bdea46a84c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.299164 4825 generic.go:334] "Generic (PLEG): container finished" podID="5189c199-0d2b-40c2-8fe8-04bdea46a84c" containerID="2b8ba41409430104450ede717f08bf58654090af49362e9caf0f65c4d13c1060" exitCode=0 Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.299249 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5189c199-0d2b-40c2-8fe8-04bdea46a84c","Type":"ContainerDied","Data":"2b8ba41409430104450ede717f08bf58654090af49362e9caf0f65c4d13c1060"} Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.299309 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5189c199-0d2b-40c2-8fe8-04bdea46a84c","Type":"ContainerDied","Data":"2d942202374c08919f0649968ad5599ab4bd43c83ca5e92e24d199bfe87fdbc6"} Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.299346 4825 scope.go:117] "RemoveContainer" containerID="2b8ba41409430104450ede717f08bf58654090af49362e9caf0f65c4d13c1060" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.300162 4825 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.303824 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ac546983-f13f-4257-ad1d-f0ff7398a28b","Type":"ContainerDied","Data":"d046535f77908c0e136ed66785af0ff9ed53a3a9cad56fe3ded368e7723b0bb0"} Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.303830 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.308528 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"62dad1be-8413-4a7e-b097-b6d2e1ec3246","Type":"ContainerDied","Data":"d61052bd14c88a6e5a4b74270347a1d3f4fc6df0f055339d88a561100f399a65"} Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.308653 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.330417 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.343515 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.344941 4825 scope.go:117] "RemoveContainer" containerID="2b8ba41409430104450ede717f08bf58654090af49362e9caf0f65c4d13c1060" Mar 10 09:02:09 crc kubenswrapper[4825]: E0310 09:02:09.348491 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b8ba41409430104450ede717f08bf58654090af49362e9caf0f65c4d13c1060\": container with ID starting with 2b8ba41409430104450ede717f08bf58654090af49362e9caf0f65c4d13c1060 not found: ID does not exist" containerID="2b8ba41409430104450ede717f08bf58654090af49362e9caf0f65c4d13c1060" Mar 10 09:02:09 crc 
kubenswrapper[4825]: I0310 09:02:09.348546 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b8ba41409430104450ede717f08bf58654090af49362e9caf0f65c4d13c1060"} err="failed to get container status \"2b8ba41409430104450ede717f08bf58654090af49362e9caf0f65c4d13c1060\": rpc error: code = NotFound desc = could not find container \"2b8ba41409430104450ede717f08bf58654090af49362e9caf0f65c4d13c1060\": container with ID starting with 2b8ba41409430104450ede717f08bf58654090af49362e9caf0f65c4d13c1060 not found: ID does not exist" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.348592 4825 scope.go:117] "RemoveContainer" containerID="a150344ed6b966cabe77567d2b52d46ec4894b221ec2780fc56ef6d0b3b79aea" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.385310 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.390863 4825 scope.go:117] "RemoveContainer" containerID="9e25877e7cb42aa5c8c8b1f77338e31beeeb7dfb90d6daa03b3a197a82f65e77" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.410180 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.416538 4825 scope.go:117] "RemoveContainer" containerID="53501f735e5e94a19464b8ed3190c4c9cc118c8695bb55f561845902951cfdbc" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.435617 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 09:02:09 crc kubenswrapper[4825]: E0310 09:02:09.436062 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac546983-f13f-4257-ad1d-f0ff7398a28b" containerName="nova-metadata-metadata" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.436081 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac546983-f13f-4257-ad1d-f0ff7398a28b" containerName="nova-metadata-metadata" Mar 10 09:02:09 crc 
kubenswrapper[4825]: E0310 09:02:09.436123 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5189c199-0d2b-40c2-8fe8-04bdea46a84c" containerName="nova-scheduler-scheduler" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.436140 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="5189c199-0d2b-40c2-8fe8-04bdea46a84c" containerName="nova-scheduler-scheduler" Mar 10 09:02:09 crc kubenswrapper[4825]: E0310 09:02:09.436153 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62dad1be-8413-4a7e-b097-b6d2e1ec3246" containerName="nova-api-log" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.436159 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="62dad1be-8413-4a7e-b097-b6d2e1ec3246" containerName="nova-api-log" Mar 10 09:02:09 crc kubenswrapper[4825]: E0310 09:02:09.436174 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac546983-f13f-4257-ad1d-f0ff7398a28b" containerName="nova-metadata-log" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.436180 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac546983-f13f-4257-ad1d-f0ff7398a28b" containerName="nova-metadata-log" Mar 10 09:02:09 crc kubenswrapper[4825]: E0310 09:02:09.436194 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62dad1be-8413-4a7e-b097-b6d2e1ec3246" containerName="nova-api-api" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.436200 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="62dad1be-8413-4a7e-b097-b6d2e1ec3246" containerName="nova-api-api" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.436390 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="5189c199-0d2b-40c2-8fe8-04bdea46a84c" containerName="nova-scheduler-scheduler" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.436408 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="62dad1be-8413-4a7e-b097-b6d2e1ec3246" containerName="nova-api-api" Mar 10 
09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.436422 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac546983-f13f-4257-ad1d-f0ff7398a28b" containerName="nova-metadata-metadata" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.436435 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="62dad1be-8413-4a7e-b097-b6d2e1ec3246" containerName="nova-api-log" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.436447 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac546983-f13f-4257-ad1d-f0ff7398a28b" containerName="nova-metadata-log" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.437111 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.438858 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.445080 4825 scope.go:117] "RemoveContainer" containerID="e58f953a04ebd72b37b1f49cf6ed5a60c52afd420da2fa4ea715f458184f4621" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.451958 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.460239 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx97t\" (UniqueName: \"kubernetes.io/projected/aee794fa-32be-4d93-ad70-f9c4837f2a66-kube-api-access-gx97t\") pod \"nova-scheduler-0\" (UID: \"aee794fa-32be-4d93-ad70-f9c4837f2a66\") " pod="openstack/nova-scheduler-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.460278 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aee794fa-32be-4d93-ad70-f9c4837f2a66-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"aee794fa-32be-4d93-ad70-f9c4837f2a66\") " pod="openstack/nova-scheduler-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.460371 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aee794fa-32be-4d93-ad70-f9c4837f2a66-config-data\") pod \"nova-scheduler-0\" (UID: \"aee794fa-32be-4d93-ad70-f9c4837f2a66\") " pod="openstack/nova-scheduler-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.472778 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.474403 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.477443 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.477587 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.477723 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.486367 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.510388 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.518306 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.529899 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.532206 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.534555 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.534890 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.542095 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.562939 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecdac00d-be5b-4b28-9c67-30e3249ac5b0-public-tls-certs\") pod \"nova-api-0\" (UID: \"ecdac00d-be5b-4b28-9c67-30e3249ac5b0\") " pod="openstack/nova-api-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.562987 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecdac00d-be5b-4b28-9c67-30e3249ac5b0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ecdac00d-be5b-4b28-9c67-30e3249ac5b0\") " pod="openstack/nova-api-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.563224 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e429567-63da-4536-88f6-4a52cf840573-config-data\") pod \"nova-metadata-0\" (UID: \"2e429567-63da-4536-88f6-4a52cf840573\") " pod="openstack/nova-metadata-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.563262 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e429567-63da-4536-88f6-4a52cf840573-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"2e429567-63da-4536-88f6-4a52cf840573\") " pod="openstack/nova-metadata-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.563340 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e429567-63da-4536-88f6-4a52cf840573-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2e429567-63da-4536-88f6-4a52cf840573\") " pod="openstack/nova-metadata-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.563376 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqqpk\" (UniqueName: \"kubernetes.io/projected/ecdac00d-be5b-4b28-9c67-30e3249ac5b0-kube-api-access-dqqpk\") pod \"nova-api-0\" (UID: \"ecdac00d-be5b-4b28-9c67-30e3249ac5b0\") " pod="openstack/nova-api-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.563401 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gx97t\" (UniqueName: \"kubernetes.io/projected/aee794fa-32be-4d93-ad70-f9c4837f2a66-kube-api-access-gx97t\") pod \"nova-scheduler-0\" (UID: \"aee794fa-32be-4d93-ad70-f9c4837f2a66\") " pod="openstack/nova-scheduler-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.563472 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxqmk\" (UniqueName: \"kubernetes.io/projected/2e429567-63da-4536-88f6-4a52cf840573-kube-api-access-wxqmk\") pod \"nova-metadata-0\" (UID: \"2e429567-63da-4536-88f6-4a52cf840573\") " pod="openstack/nova-metadata-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.563496 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aee794fa-32be-4d93-ad70-f9c4837f2a66-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"aee794fa-32be-4d93-ad70-f9c4837f2a66\") " pod="openstack/nova-scheduler-0" Mar 
10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.563546 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecdac00d-be5b-4b28-9c67-30e3249ac5b0-logs\") pod \"nova-api-0\" (UID: \"ecdac00d-be5b-4b28-9c67-30e3249ac5b0\") " pod="openstack/nova-api-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.564264 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecdac00d-be5b-4b28-9c67-30e3249ac5b0-config-data\") pod \"nova-api-0\" (UID: \"ecdac00d-be5b-4b28-9c67-30e3249ac5b0\") " pod="openstack/nova-api-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.564321 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e429567-63da-4536-88f6-4a52cf840573-logs\") pod \"nova-metadata-0\" (UID: \"2e429567-63da-4536-88f6-4a52cf840573\") " pod="openstack/nova-metadata-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.564339 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aee794fa-32be-4d93-ad70-f9c4837f2a66-config-data\") pod \"nova-scheduler-0\" (UID: \"aee794fa-32be-4d93-ad70-f9c4837f2a66\") " pod="openstack/nova-scheduler-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.564356 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecdac00d-be5b-4b28-9c67-30e3249ac5b0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ecdac00d-be5b-4b28-9c67-30e3249ac5b0\") " pod="openstack/nova-api-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.567815 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/aee794fa-32be-4d93-ad70-f9c4837f2a66-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"aee794fa-32be-4d93-ad70-f9c4837f2a66\") " pod="openstack/nova-scheduler-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.568791 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aee794fa-32be-4d93-ad70-f9c4837f2a66-config-data\") pod \"nova-scheduler-0\" (UID: \"aee794fa-32be-4d93-ad70-f9c4837f2a66\") " pod="openstack/nova-scheduler-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.582857 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx97t\" (UniqueName: \"kubernetes.io/projected/aee794fa-32be-4d93-ad70-f9c4837f2a66-kube-api-access-gx97t\") pod \"nova-scheduler-0\" (UID: \"aee794fa-32be-4d93-ad70-f9c4837f2a66\") " pod="openstack/nova-scheduler-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.666166 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqqpk\" (UniqueName: \"kubernetes.io/projected/ecdac00d-be5b-4b28-9c67-30e3249ac5b0-kube-api-access-dqqpk\") pod \"nova-api-0\" (UID: \"ecdac00d-be5b-4b28-9c67-30e3249ac5b0\") " pod="openstack/nova-api-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.666215 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxqmk\" (UniqueName: \"kubernetes.io/projected/2e429567-63da-4536-88f6-4a52cf840573-kube-api-access-wxqmk\") pod \"nova-metadata-0\" (UID: \"2e429567-63da-4536-88f6-4a52cf840573\") " pod="openstack/nova-metadata-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.666258 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecdac00d-be5b-4b28-9c67-30e3249ac5b0-logs\") pod \"nova-api-0\" (UID: \"ecdac00d-be5b-4b28-9c67-30e3249ac5b0\") " pod="openstack/nova-api-0" Mar 10 
09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.666340 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecdac00d-be5b-4b28-9c67-30e3249ac5b0-config-data\") pod \"nova-api-0\" (UID: \"ecdac00d-be5b-4b28-9c67-30e3249ac5b0\") " pod="openstack/nova-api-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.666399 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e429567-63da-4536-88f6-4a52cf840573-logs\") pod \"nova-metadata-0\" (UID: \"2e429567-63da-4536-88f6-4a52cf840573\") " pod="openstack/nova-metadata-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.666417 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecdac00d-be5b-4b28-9c67-30e3249ac5b0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ecdac00d-be5b-4b28-9c67-30e3249ac5b0\") " pod="openstack/nova-api-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.666455 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecdac00d-be5b-4b28-9c67-30e3249ac5b0-public-tls-certs\") pod \"nova-api-0\" (UID: \"ecdac00d-be5b-4b28-9c67-30e3249ac5b0\") " pod="openstack/nova-api-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.666487 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecdac00d-be5b-4b28-9c67-30e3249ac5b0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ecdac00d-be5b-4b28-9c67-30e3249ac5b0\") " pod="openstack/nova-api-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.666522 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e429567-63da-4536-88f6-4a52cf840573-config-data\") 
pod \"nova-metadata-0\" (UID: \"2e429567-63da-4536-88f6-4a52cf840573\") " pod="openstack/nova-metadata-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.666537 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e429567-63da-4536-88f6-4a52cf840573-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2e429567-63da-4536-88f6-4a52cf840573\") " pod="openstack/nova-metadata-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.666606 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e429567-63da-4536-88f6-4a52cf840573-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2e429567-63da-4536-88f6-4a52cf840573\") " pod="openstack/nova-metadata-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.666848 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecdac00d-be5b-4b28-9c67-30e3249ac5b0-logs\") pod \"nova-api-0\" (UID: \"ecdac00d-be5b-4b28-9c67-30e3249ac5b0\") " pod="openstack/nova-api-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.667410 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e429567-63da-4536-88f6-4a52cf840573-logs\") pod \"nova-metadata-0\" (UID: \"2e429567-63da-4536-88f6-4a52cf840573\") " pod="openstack/nova-metadata-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.670459 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecdac00d-be5b-4b28-9c67-30e3249ac5b0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ecdac00d-be5b-4b28-9c67-30e3249ac5b0\") " pod="openstack/nova-api-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.671509 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/2e429567-63da-4536-88f6-4a52cf840573-config-data\") pod \"nova-metadata-0\" (UID: \"2e429567-63da-4536-88f6-4a52cf840573\") " pod="openstack/nova-metadata-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.670473 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecdac00d-be5b-4b28-9c67-30e3249ac5b0-config-data\") pod \"nova-api-0\" (UID: \"ecdac00d-be5b-4b28-9c67-30e3249ac5b0\") " pod="openstack/nova-api-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.670720 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecdac00d-be5b-4b28-9c67-30e3249ac5b0-public-tls-certs\") pod \"nova-api-0\" (UID: \"ecdac00d-be5b-4b28-9c67-30e3249ac5b0\") " pod="openstack/nova-api-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.670462 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e429567-63da-4536-88f6-4a52cf840573-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2e429567-63da-4536-88f6-4a52cf840573\") " pod="openstack/nova-metadata-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.672890 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecdac00d-be5b-4b28-9c67-30e3249ac5b0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ecdac00d-be5b-4b28-9c67-30e3249ac5b0\") " pod="openstack/nova-api-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.674573 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e429567-63da-4536-88f6-4a52cf840573-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2e429567-63da-4536-88f6-4a52cf840573\") " pod="openstack/nova-metadata-0" Mar 10 09:02:09 crc 
kubenswrapper[4825]: I0310 09:02:09.685003 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqqpk\" (UniqueName: \"kubernetes.io/projected/ecdac00d-be5b-4b28-9c67-30e3249ac5b0-kube-api-access-dqqpk\") pod \"nova-api-0\" (UID: \"ecdac00d-be5b-4b28-9c67-30e3249ac5b0\") " pod="openstack/nova-api-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.685281 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxqmk\" (UniqueName: \"kubernetes.io/projected/2e429567-63da-4536-88f6-4a52cf840573-kube-api-access-wxqmk\") pod \"nova-metadata-0\" (UID: \"2e429567-63da-4536-88f6-4a52cf840573\") " pod="openstack/nova-metadata-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.752484 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.794614 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 09:02:09 crc kubenswrapper[4825]: I0310 09:02:09.953224 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 09:02:10 crc kubenswrapper[4825]: W0310 09:02:10.232591 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaee794fa_32be_4d93_ad70_f9c4837f2a66.slice/crio-9cf2093b615864eeed90ffe1ce0e567101b7171e262c2c616760fbd6adcb8688 WatchSource:0}: Error finding container 9cf2093b615864eeed90ffe1ce0e567101b7171e262c2c616760fbd6adcb8688: Status 404 returned error can't find the container with id 9cf2093b615864eeed90ffe1ce0e567101b7171e262c2c616760fbd6adcb8688 Mar 10 09:02:10 crc kubenswrapper[4825]: I0310 09:02:10.234952 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 09:02:10 crc kubenswrapper[4825]: I0310 09:02:10.300030 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 09:02:10 crc kubenswrapper[4825]: W0310 09:02:10.306307 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podecdac00d_be5b_4b28_9c67_30e3249ac5b0.slice/crio-f155f717af3ab1baa8f0c5717f335c1f911349c426d356b6008e73858002a924 WatchSource:0}: Error finding container f155f717af3ab1baa8f0c5717f335c1f911349c426d356b6008e73858002a924: Status 404 returned error can't find the container with id f155f717af3ab1baa8f0c5717f335c1f911349c426d356b6008e73858002a924 Mar 10 09:02:10 crc kubenswrapper[4825]: I0310 09:02:10.321353 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"aee794fa-32be-4d93-ad70-f9c4837f2a66","Type":"ContainerStarted","Data":"9cf2093b615864eeed90ffe1ce0e567101b7171e262c2c616760fbd6adcb8688"} Mar 10 09:02:10 crc kubenswrapper[4825]: I0310 09:02:10.323657 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"ecdac00d-be5b-4b28-9c67-30e3249ac5b0","Type":"ContainerStarted","Data":"f155f717af3ab1baa8f0c5717f335c1f911349c426d356b6008e73858002a924"} Mar 10 09:02:10 crc kubenswrapper[4825]: W0310 09:02:10.420244 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e429567_63da_4536_88f6_4a52cf840573.slice/crio-96424c89224d289f52e967404c12ecefdce966371b5e1dd98d208f0da2365de4 WatchSource:0}: Error finding container 96424c89224d289f52e967404c12ecefdce966371b5e1dd98d208f0da2365de4: Status 404 returned error can't find the container with id 96424c89224d289f52e967404c12ecefdce966371b5e1dd98d208f0da2365de4 Mar 10 09:02:10 crc kubenswrapper[4825]: I0310 09:02:10.422611 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 09:02:11 crc kubenswrapper[4825]: I0310 09:02:11.250624 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5189c199-0d2b-40c2-8fe8-04bdea46a84c" path="/var/lib/kubelet/pods/5189c199-0d2b-40c2-8fe8-04bdea46a84c/volumes" Mar 10 09:02:11 crc kubenswrapper[4825]: I0310 09:02:11.252299 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62dad1be-8413-4a7e-b097-b6d2e1ec3246" path="/var/lib/kubelet/pods/62dad1be-8413-4a7e-b097-b6d2e1ec3246/volumes" Mar 10 09:02:11 crc kubenswrapper[4825]: I0310 09:02:11.253433 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac546983-f13f-4257-ad1d-f0ff7398a28b" path="/var/lib/kubelet/pods/ac546983-f13f-4257-ad1d-f0ff7398a28b/volumes" Mar 10 09:02:11 crc kubenswrapper[4825]: I0310 09:02:11.336247 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"aee794fa-32be-4d93-ad70-f9c4837f2a66","Type":"ContainerStarted","Data":"90899ace5197e6c36acc03e78391c1afdff2a80749b67e08a834051e86b1752f"} Mar 10 09:02:11 crc kubenswrapper[4825]: I0310 09:02:11.349757 4825 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-api-0" event={"ID":"ecdac00d-be5b-4b28-9c67-30e3249ac5b0","Type":"ContainerStarted","Data":"2ea4839a0eea16a03e1263c73fd63ea0d7013becea6622d4421ec2994baa2d7f"} Mar 10 09:02:11 crc kubenswrapper[4825]: I0310 09:02:11.349797 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ecdac00d-be5b-4b28-9c67-30e3249ac5b0","Type":"ContainerStarted","Data":"a041112ac1a2aaac7638639c2f868745cddd941bf984fb257ad489d9600bf035"} Mar 10 09:02:11 crc kubenswrapper[4825]: I0310 09:02:11.352349 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2e429567-63da-4536-88f6-4a52cf840573","Type":"ContainerStarted","Data":"928140e057187dfe97c7ebfcaac63a46504fd258e14b20017a2d243a122ab331"} Mar 10 09:02:11 crc kubenswrapper[4825]: I0310 09:02:11.352501 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2e429567-63da-4536-88f6-4a52cf840573","Type":"ContainerStarted","Data":"81c0dc4e9e6f80f43a1c0f800be2326bd5994cc443eb99a91af3830a3b0245ed"} Mar 10 09:02:11 crc kubenswrapper[4825]: I0310 09:02:11.352578 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2e429567-63da-4536-88f6-4a52cf840573","Type":"ContainerStarted","Data":"96424c89224d289f52e967404c12ecefdce966371b5e1dd98d208f0da2365de4"} Mar 10 09:02:11 crc kubenswrapper[4825]: I0310 09:02:11.368336 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.368315097 podStartE2EDuration="2.368315097s" podCreationTimestamp="2026-03-10 09:02:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:02:11.361535576 +0000 UTC m=+8284.391316201" watchObservedRunningTime="2026-03-10 09:02:11.368315097 +0000 UTC m=+8284.398095712" Mar 10 09:02:11 crc kubenswrapper[4825]: 
I0310 09:02:11.390246 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.390226753 podStartE2EDuration="2.390226753s" podCreationTimestamp="2026-03-10 09:02:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:02:11.389795301 +0000 UTC m=+8284.419575996" watchObservedRunningTime="2026-03-10 09:02:11.390226753 +0000 UTC m=+8284.420007368" Mar 10 09:02:11 crc kubenswrapper[4825]: I0310 09:02:11.422840 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.422818783 podStartE2EDuration="2.422818783s" podCreationTimestamp="2026-03-10 09:02:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 09:02:11.411990114 +0000 UTC m=+8284.441770739" watchObservedRunningTime="2026-03-10 09:02:11.422818783 +0000 UTC m=+8284.452599398" Mar 10 09:02:13 crc kubenswrapper[4825]: I0310 09:02:13.238600 4825 scope.go:117] "RemoveContainer" containerID="390899e144160760e4121c06e9f2e317e0be0012bf795558da227f948f1f1b1f" Mar 10 09:02:13 crc kubenswrapper[4825]: E0310 09:02:13.239620 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:02:14 crc kubenswrapper[4825]: I0310 09:02:14.753239 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 10 09:02:14 crc kubenswrapper[4825]: I0310 09:02:14.954256 4825 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 10 09:02:14 crc kubenswrapper[4825]: I0310 09:02:14.954337 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 10 09:02:16 crc kubenswrapper[4825]: I0310 09:02:16.721825 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 10 09:02:16 crc kubenswrapper[4825]: I0310 09:02:16.731464 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 10 09:02:19 crc kubenswrapper[4825]: I0310 09:02:19.547363 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/metallb-operator-webhook-server-74cf7b6d9d-m2z87" podUID="944a661d-8d36-464b-a9d3-b2477f6e4663" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.48:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 09:02:19 crc kubenswrapper[4825]: I0310 09:02:19.590637 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-webhook-server-74cf7b6d9d-m2z87" podUID="944a661d-8d36-464b-a9d3-b2477f6e4663" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.48:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 09:02:20 crc kubenswrapper[4825]: I0310 09:02:20.138463 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-s68rr" podUID="3e18f017-e70d-45b7-a7fc-a9398f698980" containerName="hostpath-provisioner" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 09:02:20 crc kubenswrapper[4825]: I0310 09:02:20.195079 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 10 09:02:20 crc kubenswrapper[4825]: I0310 09:02:20.195127 4825 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 10 09:02:20 crc kubenswrapper[4825]: I0310 09:02:20.195160 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 10 09:02:20 crc kubenswrapper[4825]: I0310 09:02:20.195174 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 09:02:20 crc kubenswrapper[4825]: I0310 09:02:20.195187 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 09:02:20 crc kubenswrapper[4825]: I0310 09:02:20.276394 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 10 09:02:20 crc kubenswrapper[4825]: I0310 09:02:20.966296 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2e429567-63da-4536-88f6-4a52cf840573" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.241:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 09:02:20 crc kubenswrapper[4825]: I0310 09:02:20.966296 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2e429567-63da-4536-88f6-4a52cf840573" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.241:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 09:02:20 crc kubenswrapper[4825]: I0310 09:02:20.979297 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ecdac00d-be5b-4b28-9c67-30e3249ac5b0" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.240:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 09:02:20 crc kubenswrapper[4825]: I0310 09:02:20.979357 4825 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/nova-api-0" podUID="ecdac00d-be5b-4b28-9c67-30e3249ac5b0" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.240:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 09:02:21 crc kubenswrapper[4825]: I0310 09:02:21.211329 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 10 09:02:24 crc kubenswrapper[4825]: I0310 09:02:24.236919 4825 scope.go:117] "RemoveContainer" containerID="390899e144160760e4121c06e9f2e317e0be0012bf795558da227f948f1f1b1f" Mar 10 09:02:24 crc kubenswrapper[4825]: E0310 09:02:24.237695 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:02:29 crc kubenswrapper[4825]: I0310 09:02:29.805816 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 10 09:02:29 crc kubenswrapper[4825]: I0310 09:02:29.807035 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 10 09:02:29 crc kubenswrapper[4825]: I0310 09:02:29.807841 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 10 09:02:29 crc kubenswrapper[4825]: I0310 09:02:29.822608 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 10 09:02:29 crc kubenswrapper[4825]: I0310 09:02:29.961853 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 10 09:02:29 crc kubenswrapper[4825]: I0310 09:02:29.961942 4825 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 10 09:02:29 crc kubenswrapper[4825]: I0310 09:02:29.966332 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 10 09:02:29 crc kubenswrapper[4825]: I0310 09:02:29.973121 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 10 09:02:30 crc kubenswrapper[4825]: I0310 09:02:30.280663 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 10 09:02:30 crc kubenswrapper[4825]: I0310 09:02:30.290594 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 10 09:02:31 crc kubenswrapper[4825]: I0310 09:02:31.616740 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc"] Mar 10 09:02:31 crc kubenswrapper[4825]: I0310 09:02:31.618412 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc" Mar 10 09:02:31 crc kubenswrapper[4825]: I0310 09:02:31.627265 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 10 09:02:31 crc kubenswrapper[4825]: I0310 09:02:31.627441 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc"] Mar 10 09:02:31 crc kubenswrapper[4825]: I0310 09:02:31.627576 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 09:02:31 crc kubenswrapper[4825]: I0310 09:02:31.627659 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 10 09:02:31 crc kubenswrapper[4825]: I0310 09:02:31.627854 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 10 09:02:31 crc kubenswrapper[4825]: I0310 09:02:31.627855 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-5twvv" Mar 10 09:02:31 crc kubenswrapper[4825]: I0310 09:02:31.627930 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 10 09:02:31 crc kubenswrapper[4825]: I0310 09:02:31.627976 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Mar 10 09:02:31 crc kubenswrapper[4825]: I0310 09:02:31.775963 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmsgx\" (UniqueName: \"kubernetes.io/projected/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-kube-api-access-nmsgx\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc\" (UID: \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc" Mar 10 09:02:31 crc kubenswrapper[4825]: I0310 09:02:31.776023 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc\" (UID: \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc" Mar 10 09:02:31 crc kubenswrapper[4825]: I0310 09:02:31.776175 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc\" (UID: \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc" Mar 10 09:02:31 crc kubenswrapper[4825]: I0310 09:02:31.776238 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc\" (UID: \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc" Mar 10 09:02:31 crc kubenswrapper[4825]: I0310 09:02:31.776276 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc\" (UID: \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc" Mar 10 09:02:31 crc kubenswrapper[4825]: I0310 09:02:31.776323 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc\" (UID: \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc" Mar 10 09:02:31 crc kubenswrapper[4825]: I0310 09:02:31.776390 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc\" (UID: \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc" Mar 10 09:02:31 crc kubenswrapper[4825]: I0310 09:02:31.776487 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc\" (UID: \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc" Mar 10 09:02:31 crc kubenswrapper[4825]: I0310 09:02:31.776552 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc\" (UID: 
\"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc" Mar 10 09:02:31 crc kubenswrapper[4825]: I0310 09:02:31.776771 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc\" (UID: \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc" Mar 10 09:02:31 crc kubenswrapper[4825]: I0310 09:02:31.776819 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc\" (UID: \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc" Mar 10 09:02:31 crc kubenswrapper[4825]: I0310 09:02:31.878777 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc\" (UID: \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc" Mar 10 09:02:31 crc kubenswrapper[4825]: I0310 09:02:31.878845 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc\" (UID: \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\") 
" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc" Mar 10 09:02:31 crc kubenswrapper[4825]: I0310 09:02:31.878869 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc\" (UID: \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc" Mar 10 09:02:31 crc kubenswrapper[4825]: I0310 09:02:31.878901 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc\" (UID: \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc" Mar 10 09:02:31 crc kubenswrapper[4825]: I0310 09:02:31.878937 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc\" (UID: \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc" Mar 10 09:02:31 crc kubenswrapper[4825]: I0310 09:02:31.878976 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc\" (UID: \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc" Mar 10 09:02:31 crc 
kubenswrapper[4825]: I0310 09:02:31.879014 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc\" (UID: \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc" Mar 10 09:02:31 crc kubenswrapper[4825]: I0310 09:02:31.879274 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc\" (UID: \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc" Mar 10 09:02:31 crc kubenswrapper[4825]: I0310 09:02:31.879294 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc\" (UID: \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc" Mar 10 09:02:31 crc kubenswrapper[4825]: I0310 09:02:31.879340 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmsgx\" (UniqueName: \"kubernetes.io/projected/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-kube-api-access-nmsgx\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc\" (UID: \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc" Mar 10 09:02:31 crc kubenswrapper[4825]: I0310 09:02:31.879357 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc\" (UID: \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc" Mar 10 09:02:31 crc kubenswrapper[4825]: I0310 09:02:31.880402 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc\" (UID: \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc" Mar 10 09:02:31 crc kubenswrapper[4825]: I0310 09:02:31.892221 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc\" (UID: \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc" Mar 10 09:02:31 crc kubenswrapper[4825]: I0310 09:02:31.896843 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc\" (UID: \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc" Mar 10 09:02:31 crc kubenswrapper[4825]: I0310 09:02:31.897823 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc\" (UID: \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc" Mar 10 09:02:31 crc kubenswrapper[4825]: I0310 09:02:31.898484 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc\" (UID: \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc" Mar 10 09:02:31 crc kubenswrapper[4825]: I0310 09:02:31.898635 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc\" (UID: \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc" Mar 10 09:02:31 crc kubenswrapper[4825]: I0310 09:02:31.912495 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc\" (UID: \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc" Mar 10 09:02:31 crc kubenswrapper[4825]: I0310 09:02:31.930277 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-cell1-combined-ca-bundle\") pod 
\"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc\" (UID: \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc" Mar 10 09:02:31 crc kubenswrapper[4825]: I0310 09:02:31.948672 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc\" (UID: \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc" Mar 10 09:02:31 crc kubenswrapper[4825]: I0310 09:02:31.948763 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmsgx\" (UniqueName: \"kubernetes.io/projected/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-kube-api-access-nmsgx\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc\" (UID: \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc" Mar 10 09:02:31 crc kubenswrapper[4825]: I0310 09:02:31.953539 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc\" (UID: \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc" Mar 10 09:02:32 crc kubenswrapper[4825]: I0310 09:02:32.251773 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc" Mar 10 09:02:32 crc kubenswrapper[4825]: I0310 09:02:32.801419 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc"] Mar 10 09:02:33 crc kubenswrapper[4825]: I0310 09:02:33.307513 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc" event={"ID":"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019","Type":"ContainerStarted","Data":"ab515bd090b427006138e935db5d3d8c08c988e985e2ee8c28988b7306d53759"} Mar 10 09:02:34 crc kubenswrapper[4825]: I0310 09:02:34.318645 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc" event={"ID":"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019","Type":"ContainerStarted","Data":"5f73dd636b78b13e3eee16f822ed9aecddb1ba7e755062fcc2d080276f761388"} Mar 10 09:02:34 crc kubenswrapper[4825]: I0310 09:02:34.341651 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc" podStartSLOduration=2.610361271 podStartE2EDuration="3.341629763s" podCreationTimestamp="2026-03-10 09:02:31 +0000 UTC" firstStartedPulling="2026-03-10 09:02:32.800987012 +0000 UTC m=+8305.830767627" lastFinishedPulling="2026-03-10 09:02:33.532255514 +0000 UTC m=+8306.562036119" observedRunningTime="2026-03-10 09:02:34.335643383 +0000 UTC m=+8307.365424038" watchObservedRunningTime="2026-03-10 09:02:34.341629763 +0000 UTC m=+8307.371410388" Mar 10 09:02:35 crc kubenswrapper[4825]: I0310 09:02:35.261792 4825 scope.go:117] "RemoveContainer" containerID="390899e144160760e4121c06e9f2e317e0be0012bf795558da227f948f1f1b1f" Mar 10 09:02:35 crc kubenswrapper[4825]: E0310 09:02:35.262837 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:02:48 crc kubenswrapper[4825]: I0310 09:02:48.236615 4825 scope.go:117] "RemoveContainer" containerID="390899e144160760e4121c06e9f2e317e0be0012bf795558da227f948f1f1b1f" Mar 10 09:02:48 crc kubenswrapper[4825]: E0310 09:02:48.237350 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:02:55 crc kubenswrapper[4825]: I0310 09:02:55.022994 4825 scope.go:117] "RemoveContainer" containerID="69882d7986da8f262c50e1ddef78696c7150e81dc9f5e3cfa16d79a989865d79" Mar 10 09:03:03 crc kubenswrapper[4825]: I0310 09:03:03.236323 4825 scope.go:117] "RemoveContainer" containerID="390899e144160760e4121c06e9f2e317e0be0012bf795558da227f948f1f1b1f" Mar 10 09:03:03 crc kubenswrapper[4825]: E0310 09:03:03.239066 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:03:16 crc kubenswrapper[4825]: I0310 09:03:16.236728 4825 scope.go:117] "RemoveContainer" 
containerID="390899e144160760e4121c06e9f2e317e0be0012bf795558da227f948f1f1b1f" Mar 10 09:03:16 crc kubenswrapper[4825]: E0310 09:03:16.237891 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:03:28 crc kubenswrapper[4825]: I0310 09:03:28.237795 4825 scope.go:117] "RemoveContainer" containerID="390899e144160760e4121c06e9f2e317e0be0012bf795558da227f948f1f1b1f" Mar 10 09:03:28 crc kubenswrapper[4825]: E0310 09:03:28.239013 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:03:41 crc kubenswrapper[4825]: I0310 09:03:41.237326 4825 scope.go:117] "RemoveContainer" containerID="390899e144160760e4121c06e9f2e317e0be0012bf795558da227f948f1f1b1f" Mar 10 09:03:41 crc kubenswrapper[4825]: E0310 09:03:41.238240 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:03:53 crc kubenswrapper[4825]: I0310 09:03:53.238423 4825 scope.go:117] 
"RemoveContainer" containerID="390899e144160760e4121c06e9f2e317e0be0012bf795558da227f948f1f1b1f" Mar 10 09:03:53 crc kubenswrapper[4825]: E0310 09:03:53.239296 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:04:00 crc kubenswrapper[4825]: I0310 09:04:00.158048 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552224-wn565"] Mar 10 09:04:00 crc kubenswrapper[4825]: I0310 09:04:00.160548 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552224-wn565" Mar 10 09:04:00 crc kubenswrapper[4825]: I0310 09:04:00.164811 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:04:00 crc kubenswrapper[4825]: I0310 09:04:00.165206 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 09:04:00 crc kubenswrapper[4825]: I0310 09:04:00.170526 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552224-wn565"] Mar 10 09:04:00 crc kubenswrapper[4825]: I0310 09:04:00.175056 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:04:00 crc kubenswrapper[4825]: I0310 09:04:00.253917 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89w8b\" (UniqueName: \"kubernetes.io/projected/e8b6e419-d1ea-4849-9beb-63fff48ebe84-kube-api-access-89w8b\") pod \"auto-csr-approver-29552224-wn565\" (UID: 
\"e8b6e419-d1ea-4849-9beb-63fff48ebe84\") " pod="openshift-infra/auto-csr-approver-29552224-wn565" Mar 10 09:04:00 crc kubenswrapper[4825]: I0310 09:04:00.356181 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89w8b\" (UniqueName: \"kubernetes.io/projected/e8b6e419-d1ea-4849-9beb-63fff48ebe84-kube-api-access-89w8b\") pod \"auto-csr-approver-29552224-wn565\" (UID: \"e8b6e419-d1ea-4849-9beb-63fff48ebe84\") " pod="openshift-infra/auto-csr-approver-29552224-wn565" Mar 10 09:04:00 crc kubenswrapper[4825]: I0310 09:04:00.382962 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89w8b\" (UniqueName: \"kubernetes.io/projected/e8b6e419-d1ea-4849-9beb-63fff48ebe84-kube-api-access-89w8b\") pod \"auto-csr-approver-29552224-wn565\" (UID: \"e8b6e419-d1ea-4849-9beb-63fff48ebe84\") " pod="openshift-infra/auto-csr-approver-29552224-wn565" Mar 10 09:04:00 crc kubenswrapper[4825]: I0310 09:04:00.511517 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552224-wn565" Mar 10 09:04:00 crc kubenswrapper[4825]: I0310 09:04:00.990344 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552224-wn565"] Mar 10 09:04:00 crc kubenswrapper[4825]: I0310 09:04:00.996530 4825 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 09:04:01 crc kubenswrapper[4825]: I0310 09:04:01.291825 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552224-wn565" event={"ID":"e8b6e419-d1ea-4849-9beb-63fff48ebe84","Type":"ContainerStarted","Data":"f7d54487c458ebd7a3ffb53de76becf058508e708327bedbb69732338db3b138"} Mar 10 09:04:03 crc kubenswrapper[4825]: I0310 09:04:03.325834 4825 generic.go:334] "Generic (PLEG): container finished" podID="e8b6e419-d1ea-4849-9beb-63fff48ebe84" containerID="b23a2c769c7415aa1704d759934a888fec13771b0003794db53d47b5b3922b05" exitCode=0 Mar 10 09:04:03 crc kubenswrapper[4825]: I0310 09:04:03.326293 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552224-wn565" event={"ID":"e8b6e419-d1ea-4849-9beb-63fff48ebe84","Type":"ContainerDied","Data":"b23a2c769c7415aa1704d759934a888fec13771b0003794db53d47b5b3922b05"} Mar 10 09:04:04 crc kubenswrapper[4825]: I0310 09:04:04.684707 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552224-wn565" Mar 10 09:04:04 crc kubenswrapper[4825]: I0310 09:04:04.751215 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89w8b\" (UniqueName: \"kubernetes.io/projected/e8b6e419-d1ea-4849-9beb-63fff48ebe84-kube-api-access-89w8b\") pod \"e8b6e419-d1ea-4849-9beb-63fff48ebe84\" (UID: \"e8b6e419-d1ea-4849-9beb-63fff48ebe84\") " Mar 10 09:04:04 crc kubenswrapper[4825]: I0310 09:04:04.758549 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8b6e419-d1ea-4849-9beb-63fff48ebe84-kube-api-access-89w8b" (OuterVolumeSpecName: "kube-api-access-89w8b") pod "e8b6e419-d1ea-4849-9beb-63fff48ebe84" (UID: "e8b6e419-d1ea-4849-9beb-63fff48ebe84"). InnerVolumeSpecName "kube-api-access-89w8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:04:04 crc kubenswrapper[4825]: I0310 09:04:04.854219 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89w8b\" (UniqueName: \"kubernetes.io/projected/e8b6e419-d1ea-4849-9beb-63fff48ebe84-kube-api-access-89w8b\") on node \"crc\" DevicePath \"\"" Mar 10 09:04:05 crc kubenswrapper[4825]: I0310 09:04:05.348398 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552224-wn565" event={"ID":"e8b6e419-d1ea-4849-9beb-63fff48ebe84","Type":"ContainerDied","Data":"f7d54487c458ebd7a3ffb53de76becf058508e708327bedbb69732338db3b138"} Mar 10 09:04:05 crc kubenswrapper[4825]: I0310 09:04:05.348447 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7d54487c458ebd7a3ffb53de76becf058508e708327bedbb69732338db3b138" Mar 10 09:04:05 crc kubenswrapper[4825]: I0310 09:04:05.348474 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552224-wn565" Mar 10 09:04:05 crc kubenswrapper[4825]: I0310 09:04:05.772402 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552218-8bncf"] Mar 10 09:04:05 crc kubenswrapper[4825]: I0310 09:04:05.790194 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552218-8bncf"] Mar 10 09:04:06 crc kubenswrapper[4825]: I0310 09:04:06.236780 4825 scope.go:117] "RemoveContainer" containerID="390899e144160760e4121c06e9f2e317e0be0012bf795558da227f948f1f1b1f" Mar 10 09:04:06 crc kubenswrapper[4825]: E0310 09:04:06.237598 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:04:07 crc kubenswrapper[4825]: I0310 09:04:07.247747 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da625163-95ad-4099-92f2-ae375b811efb" path="/var/lib/kubelet/pods/da625163-95ad-4099-92f2-ae375b811efb/volumes" Mar 10 09:04:18 crc kubenswrapper[4825]: I0310 09:04:18.237444 4825 scope.go:117] "RemoveContainer" containerID="390899e144160760e4121c06e9f2e317e0be0012bf795558da227f948f1f1b1f" Mar 10 09:04:18 crc kubenswrapper[4825]: E0310 09:04:18.238406 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" 
podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:04:29 crc kubenswrapper[4825]: I0310 09:04:29.243162 4825 scope.go:117] "RemoveContainer" containerID="390899e144160760e4121c06e9f2e317e0be0012bf795558da227f948f1f1b1f" Mar 10 09:04:29 crc kubenswrapper[4825]: E0310 09:04:29.245653 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:04:42 crc kubenswrapper[4825]: I0310 09:04:42.237392 4825 scope.go:117] "RemoveContainer" containerID="390899e144160760e4121c06e9f2e317e0be0012bf795558da227f948f1f1b1f" Mar 10 09:04:42 crc kubenswrapper[4825]: E0310 09:04:42.238158 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:04:53 crc kubenswrapper[4825]: I0310 09:04:53.610427 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-flhmm"] Mar 10 09:04:53 crc kubenswrapper[4825]: E0310 09:04:53.611440 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8b6e419-d1ea-4849-9beb-63fff48ebe84" containerName="oc" Mar 10 09:04:53 crc kubenswrapper[4825]: I0310 09:04:53.611455 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8b6e419-d1ea-4849-9beb-63fff48ebe84" containerName="oc" Mar 10 09:04:53 crc kubenswrapper[4825]: I0310 09:04:53.611705 4825 
memory_manager.go:354] "RemoveStaleState removing state" podUID="e8b6e419-d1ea-4849-9beb-63fff48ebe84" containerName="oc" Mar 10 09:04:53 crc kubenswrapper[4825]: I0310 09:04:53.613634 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-flhmm" Mar 10 09:04:53 crc kubenswrapper[4825]: I0310 09:04:53.638209 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-flhmm"] Mar 10 09:04:53 crc kubenswrapper[4825]: I0310 09:04:53.757849 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f804f4b-92a6-493e-a86d-b2487a92f194-catalog-content\") pod \"redhat-marketplace-flhmm\" (UID: \"4f804f4b-92a6-493e-a86d-b2487a92f194\") " pod="openshift-marketplace/redhat-marketplace-flhmm" Mar 10 09:04:53 crc kubenswrapper[4825]: I0310 09:04:53.758000 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r9pr\" (UniqueName: \"kubernetes.io/projected/4f804f4b-92a6-493e-a86d-b2487a92f194-kube-api-access-9r9pr\") pod \"redhat-marketplace-flhmm\" (UID: \"4f804f4b-92a6-493e-a86d-b2487a92f194\") " pod="openshift-marketplace/redhat-marketplace-flhmm" Mar 10 09:04:53 crc kubenswrapper[4825]: I0310 09:04:53.758098 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f804f4b-92a6-493e-a86d-b2487a92f194-utilities\") pod \"redhat-marketplace-flhmm\" (UID: \"4f804f4b-92a6-493e-a86d-b2487a92f194\") " pod="openshift-marketplace/redhat-marketplace-flhmm" Mar 10 09:04:53 crc kubenswrapper[4825]: I0310 09:04:53.859577 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f804f4b-92a6-493e-a86d-b2487a92f194-utilities\") pod 
\"redhat-marketplace-flhmm\" (UID: \"4f804f4b-92a6-493e-a86d-b2487a92f194\") " pod="openshift-marketplace/redhat-marketplace-flhmm" Mar 10 09:04:53 crc kubenswrapper[4825]: I0310 09:04:53.859880 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f804f4b-92a6-493e-a86d-b2487a92f194-catalog-content\") pod \"redhat-marketplace-flhmm\" (UID: \"4f804f4b-92a6-493e-a86d-b2487a92f194\") " pod="openshift-marketplace/redhat-marketplace-flhmm" Mar 10 09:04:53 crc kubenswrapper[4825]: I0310 09:04:53.860000 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9r9pr\" (UniqueName: \"kubernetes.io/projected/4f804f4b-92a6-493e-a86d-b2487a92f194-kube-api-access-9r9pr\") pod \"redhat-marketplace-flhmm\" (UID: \"4f804f4b-92a6-493e-a86d-b2487a92f194\") " pod="openshift-marketplace/redhat-marketplace-flhmm" Mar 10 09:04:53 crc kubenswrapper[4825]: I0310 09:04:53.860193 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f804f4b-92a6-493e-a86d-b2487a92f194-utilities\") pod \"redhat-marketplace-flhmm\" (UID: \"4f804f4b-92a6-493e-a86d-b2487a92f194\") " pod="openshift-marketplace/redhat-marketplace-flhmm" Mar 10 09:04:53 crc kubenswrapper[4825]: I0310 09:04:53.860472 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f804f4b-92a6-493e-a86d-b2487a92f194-catalog-content\") pod \"redhat-marketplace-flhmm\" (UID: \"4f804f4b-92a6-493e-a86d-b2487a92f194\") " pod="openshift-marketplace/redhat-marketplace-flhmm" Mar 10 09:04:53 crc kubenswrapper[4825]: I0310 09:04:53.893388 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r9pr\" (UniqueName: \"kubernetes.io/projected/4f804f4b-92a6-493e-a86d-b2487a92f194-kube-api-access-9r9pr\") pod \"redhat-marketplace-flhmm\" (UID: 
\"4f804f4b-92a6-493e-a86d-b2487a92f194\") " pod="openshift-marketplace/redhat-marketplace-flhmm" Mar 10 09:04:53 crc kubenswrapper[4825]: I0310 09:04:53.936554 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-flhmm" Mar 10 09:04:54 crc kubenswrapper[4825]: I0310 09:04:54.417924 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-flhmm"] Mar 10 09:04:54 crc kubenswrapper[4825]: I0310 09:04:54.852351 4825 generic.go:334] "Generic (PLEG): container finished" podID="4f804f4b-92a6-493e-a86d-b2487a92f194" containerID="15d41c595643ef511f9d024bc20cf9c35acead6838632ad9ea85446296fcf3c3" exitCode=0 Mar 10 09:04:54 crc kubenswrapper[4825]: I0310 09:04:54.852481 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-flhmm" event={"ID":"4f804f4b-92a6-493e-a86d-b2487a92f194","Type":"ContainerDied","Data":"15d41c595643ef511f9d024bc20cf9c35acead6838632ad9ea85446296fcf3c3"} Mar 10 09:04:54 crc kubenswrapper[4825]: I0310 09:04:54.854000 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-flhmm" event={"ID":"4f804f4b-92a6-493e-a86d-b2487a92f194","Type":"ContainerStarted","Data":"f1c74491a424b8aeeb62406034b19e4e8b41cd41091356073dbfad33a7d6bb84"} Mar 10 09:04:55 crc kubenswrapper[4825]: I0310 09:04:55.279839 4825 scope.go:117] "RemoveContainer" containerID="35c18eb0721446c0e6f18c45f1589f97a64a864a87f6738ca82bdae190c666bb" Mar 10 09:04:55 crc kubenswrapper[4825]: I0310 09:04:55.872615 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-flhmm" event={"ID":"4f804f4b-92a6-493e-a86d-b2487a92f194","Type":"ContainerStarted","Data":"cf0ef309a011f17292a8882fc506a4d43caa5e2ade902a19793daefb494238b4"} Mar 10 09:04:56 crc kubenswrapper[4825]: I0310 09:04:56.236811 4825 scope.go:117] "RemoveContainer" 
containerID="390899e144160760e4121c06e9f2e317e0be0012bf795558da227f948f1f1b1f" Mar 10 09:04:56 crc kubenswrapper[4825]: E0310 09:04:56.237066 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:04:56 crc kubenswrapper[4825]: I0310 09:04:56.883390 4825 generic.go:334] "Generic (PLEG): container finished" podID="4f804f4b-92a6-493e-a86d-b2487a92f194" containerID="cf0ef309a011f17292a8882fc506a4d43caa5e2ade902a19793daefb494238b4" exitCode=0 Mar 10 09:04:56 crc kubenswrapper[4825]: I0310 09:04:56.883458 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-flhmm" event={"ID":"4f804f4b-92a6-493e-a86d-b2487a92f194","Type":"ContainerDied","Data":"cf0ef309a011f17292a8882fc506a4d43caa5e2ade902a19793daefb494238b4"} Mar 10 09:04:57 crc kubenswrapper[4825]: I0310 09:04:57.899841 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-flhmm" event={"ID":"4f804f4b-92a6-493e-a86d-b2487a92f194","Type":"ContainerStarted","Data":"cbca5b1d029437d9bb57bb4472474534e97a00dbe27c7af3ea95f9f470d9d28d"} Mar 10 09:05:03 crc kubenswrapper[4825]: I0310 09:05:03.937577 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-flhmm" Mar 10 09:05:03 crc kubenswrapper[4825]: I0310 09:05:03.938049 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-flhmm" Mar 10 09:05:03 crc kubenswrapper[4825]: I0310 09:05:03.987821 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-flhmm" Mar 10 09:05:04 crc kubenswrapper[4825]: I0310 09:05:04.005745 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-flhmm" podStartSLOduration=8.348136946 podStartE2EDuration="11.005724859s" podCreationTimestamp="2026-03-10 09:04:53 +0000 UTC" firstStartedPulling="2026-03-10 09:04:54.85443482 +0000 UTC m=+8447.884215435" lastFinishedPulling="2026-03-10 09:04:57.512022733 +0000 UTC m=+8450.541803348" observedRunningTime="2026-03-10 09:04:57.931624181 +0000 UTC m=+8450.961404796" watchObservedRunningTime="2026-03-10 09:05:04.005724859 +0000 UTC m=+8457.035505484" Mar 10 09:05:04 crc kubenswrapper[4825]: I0310 09:05:04.062515 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-flhmm" Mar 10 09:05:04 crc kubenswrapper[4825]: I0310 09:05:04.227841 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-flhmm"] Mar 10 09:05:05 crc kubenswrapper[4825]: I0310 09:05:05.983681 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-flhmm" podUID="4f804f4b-92a6-493e-a86d-b2487a92f194" containerName="registry-server" containerID="cri-o://cbca5b1d029437d9bb57bb4472474534e97a00dbe27c7af3ea95f9f470d9d28d" gracePeriod=2 Mar 10 09:05:06 crc kubenswrapper[4825]: I0310 09:05:06.487351 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-flhmm" Mar 10 09:05:06 crc kubenswrapper[4825]: I0310 09:05:06.586693 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f804f4b-92a6-493e-a86d-b2487a92f194-catalog-content\") pod \"4f804f4b-92a6-493e-a86d-b2487a92f194\" (UID: \"4f804f4b-92a6-493e-a86d-b2487a92f194\") " Mar 10 09:05:06 crc kubenswrapper[4825]: I0310 09:05:06.587042 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9r9pr\" (UniqueName: \"kubernetes.io/projected/4f804f4b-92a6-493e-a86d-b2487a92f194-kube-api-access-9r9pr\") pod \"4f804f4b-92a6-493e-a86d-b2487a92f194\" (UID: \"4f804f4b-92a6-493e-a86d-b2487a92f194\") " Mar 10 09:05:06 crc kubenswrapper[4825]: I0310 09:05:06.587158 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f804f4b-92a6-493e-a86d-b2487a92f194-utilities\") pod \"4f804f4b-92a6-493e-a86d-b2487a92f194\" (UID: \"4f804f4b-92a6-493e-a86d-b2487a92f194\") " Mar 10 09:05:06 crc kubenswrapper[4825]: I0310 09:05:06.587812 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f804f4b-92a6-493e-a86d-b2487a92f194-utilities" (OuterVolumeSpecName: "utilities") pod "4f804f4b-92a6-493e-a86d-b2487a92f194" (UID: "4f804f4b-92a6-493e-a86d-b2487a92f194"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:05:06 crc kubenswrapper[4825]: I0310 09:05:06.592703 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f804f4b-92a6-493e-a86d-b2487a92f194-kube-api-access-9r9pr" (OuterVolumeSpecName: "kube-api-access-9r9pr") pod "4f804f4b-92a6-493e-a86d-b2487a92f194" (UID: "4f804f4b-92a6-493e-a86d-b2487a92f194"). InnerVolumeSpecName "kube-api-access-9r9pr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:05:06 crc kubenswrapper[4825]: I0310 09:05:06.613366 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f804f4b-92a6-493e-a86d-b2487a92f194-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4f804f4b-92a6-493e-a86d-b2487a92f194" (UID: "4f804f4b-92a6-493e-a86d-b2487a92f194"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:05:06 crc kubenswrapper[4825]: I0310 09:05:06.691303 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9r9pr\" (UniqueName: \"kubernetes.io/projected/4f804f4b-92a6-493e-a86d-b2487a92f194-kube-api-access-9r9pr\") on node \"crc\" DevicePath \"\"" Mar 10 09:05:06 crc kubenswrapper[4825]: I0310 09:05:06.691353 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f804f4b-92a6-493e-a86d-b2487a92f194-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:05:06 crc kubenswrapper[4825]: I0310 09:05:06.691370 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f804f4b-92a6-493e-a86d-b2487a92f194-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:05:06 crc kubenswrapper[4825]: I0310 09:05:06.996411 4825 generic.go:334] "Generic (PLEG): container finished" podID="4f804f4b-92a6-493e-a86d-b2487a92f194" containerID="cbca5b1d029437d9bb57bb4472474534e97a00dbe27c7af3ea95f9f470d9d28d" exitCode=0 Mar 10 09:05:06 crc kubenswrapper[4825]: I0310 09:05:06.996454 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-flhmm" event={"ID":"4f804f4b-92a6-493e-a86d-b2487a92f194","Type":"ContainerDied","Data":"cbca5b1d029437d9bb57bb4472474534e97a00dbe27c7af3ea95f9f470d9d28d"} Mar 10 09:05:06 crc kubenswrapper[4825]: I0310 09:05:06.996488 4825 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-flhmm" event={"ID":"4f804f4b-92a6-493e-a86d-b2487a92f194","Type":"ContainerDied","Data":"f1c74491a424b8aeeb62406034b19e4e8b41cd41091356073dbfad33a7d6bb84"} Mar 10 09:05:06 crc kubenswrapper[4825]: I0310 09:05:06.996504 4825 scope.go:117] "RemoveContainer" containerID="cbca5b1d029437d9bb57bb4472474534e97a00dbe27c7af3ea95f9f470d9d28d" Mar 10 09:05:06 crc kubenswrapper[4825]: I0310 09:05:06.996514 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-flhmm" Mar 10 09:05:07 crc kubenswrapper[4825]: I0310 09:05:07.037706 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-flhmm"] Mar 10 09:05:07 crc kubenswrapper[4825]: I0310 09:05:07.039536 4825 scope.go:117] "RemoveContainer" containerID="cf0ef309a011f17292a8882fc506a4d43caa5e2ade902a19793daefb494238b4" Mar 10 09:05:07 crc kubenswrapper[4825]: I0310 09:05:07.048930 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-flhmm"] Mar 10 09:05:07 crc kubenswrapper[4825]: I0310 09:05:07.067084 4825 scope.go:117] "RemoveContainer" containerID="15d41c595643ef511f9d024bc20cf9c35acead6838632ad9ea85446296fcf3c3" Mar 10 09:05:07 crc kubenswrapper[4825]: I0310 09:05:07.118925 4825 scope.go:117] "RemoveContainer" containerID="cbca5b1d029437d9bb57bb4472474534e97a00dbe27c7af3ea95f9f470d9d28d" Mar 10 09:05:07 crc kubenswrapper[4825]: E0310 09:05:07.119682 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbca5b1d029437d9bb57bb4472474534e97a00dbe27c7af3ea95f9f470d9d28d\": container with ID starting with cbca5b1d029437d9bb57bb4472474534e97a00dbe27c7af3ea95f9f470d9d28d not found: ID does not exist" containerID="cbca5b1d029437d9bb57bb4472474534e97a00dbe27c7af3ea95f9f470d9d28d" Mar 10 09:05:07 crc kubenswrapper[4825]: I0310 09:05:07.119749 4825 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbca5b1d029437d9bb57bb4472474534e97a00dbe27c7af3ea95f9f470d9d28d"} err="failed to get container status \"cbca5b1d029437d9bb57bb4472474534e97a00dbe27c7af3ea95f9f470d9d28d\": rpc error: code = NotFound desc = could not find container \"cbca5b1d029437d9bb57bb4472474534e97a00dbe27c7af3ea95f9f470d9d28d\": container with ID starting with cbca5b1d029437d9bb57bb4472474534e97a00dbe27c7af3ea95f9f470d9d28d not found: ID does not exist" Mar 10 09:05:07 crc kubenswrapper[4825]: I0310 09:05:07.119782 4825 scope.go:117] "RemoveContainer" containerID="cf0ef309a011f17292a8882fc506a4d43caa5e2ade902a19793daefb494238b4" Mar 10 09:05:07 crc kubenswrapper[4825]: E0310 09:05:07.121918 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf0ef309a011f17292a8882fc506a4d43caa5e2ade902a19793daefb494238b4\": container with ID starting with cf0ef309a011f17292a8882fc506a4d43caa5e2ade902a19793daefb494238b4 not found: ID does not exist" containerID="cf0ef309a011f17292a8882fc506a4d43caa5e2ade902a19793daefb494238b4" Mar 10 09:05:07 crc kubenswrapper[4825]: I0310 09:05:07.121971 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf0ef309a011f17292a8882fc506a4d43caa5e2ade902a19793daefb494238b4"} err="failed to get container status \"cf0ef309a011f17292a8882fc506a4d43caa5e2ade902a19793daefb494238b4\": rpc error: code = NotFound desc = could not find container \"cf0ef309a011f17292a8882fc506a4d43caa5e2ade902a19793daefb494238b4\": container with ID starting with cf0ef309a011f17292a8882fc506a4d43caa5e2ade902a19793daefb494238b4 not found: ID does not exist" Mar 10 09:05:07 crc kubenswrapper[4825]: I0310 09:05:07.122006 4825 scope.go:117] "RemoveContainer" containerID="15d41c595643ef511f9d024bc20cf9c35acead6838632ad9ea85446296fcf3c3" Mar 10 09:05:07 crc kubenswrapper[4825]: E0310 
09:05:07.122578 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15d41c595643ef511f9d024bc20cf9c35acead6838632ad9ea85446296fcf3c3\": container with ID starting with 15d41c595643ef511f9d024bc20cf9c35acead6838632ad9ea85446296fcf3c3 not found: ID does not exist" containerID="15d41c595643ef511f9d024bc20cf9c35acead6838632ad9ea85446296fcf3c3" Mar 10 09:05:07 crc kubenswrapper[4825]: I0310 09:05:07.122607 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15d41c595643ef511f9d024bc20cf9c35acead6838632ad9ea85446296fcf3c3"} err="failed to get container status \"15d41c595643ef511f9d024bc20cf9c35acead6838632ad9ea85446296fcf3c3\": rpc error: code = NotFound desc = could not find container \"15d41c595643ef511f9d024bc20cf9c35acead6838632ad9ea85446296fcf3c3\": container with ID starting with 15d41c595643ef511f9d024bc20cf9c35acead6838632ad9ea85446296fcf3c3 not found: ID does not exist" Mar 10 09:05:07 crc kubenswrapper[4825]: I0310 09:05:07.236762 4825 scope.go:117] "RemoveContainer" containerID="390899e144160760e4121c06e9f2e317e0be0012bf795558da227f948f1f1b1f" Mar 10 09:05:07 crc kubenswrapper[4825]: E0310 09:05:07.237050 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:05:07 crc kubenswrapper[4825]: I0310 09:05:07.247273 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f804f4b-92a6-493e-a86d-b2487a92f194" path="/var/lib/kubelet/pods/4f804f4b-92a6-493e-a86d-b2487a92f194/volumes" Mar 10 09:05:18 crc kubenswrapper[4825]: I0310 09:05:18.236980 
4825 scope.go:117] "RemoveContainer" containerID="390899e144160760e4121c06e9f2e317e0be0012bf795558da227f948f1f1b1f" Mar 10 09:05:18 crc kubenswrapper[4825]: E0310 09:05:18.237958 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:05:20 crc kubenswrapper[4825]: I0310 09:05:20.572122 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4tn5s"] Mar 10 09:05:20 crc kubenswrapper[4825]: E0310 09:05:20.572783 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f804f4b-92a6-493e-a86d-b2487a92f194" containerName="registry-server" Mar 10 09:05:20 crc kubenswrapper[4825]: I0310 09:05:20.572794 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f804f4b-92a6-493e-a86d-b2487a92f194" containerName="registry-server" Mar 10 09:05:20 crc kubenswrapper[4825]: E0310 09:05:20.572812 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f804f4b-92a6-493e-a86d-b2487a92f194" containerName="extract-content" Mar 10 09:05:20 crc kubenswrapper[4825]: I0310 09:05:20.572818 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f804f4b-92a6-493e-a86d-b2487a92f194" containerName="extract-content" Mar 10 09:05:20 crc kubenswrapper[4825]: E0310 09:05:20.572839 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f804f4b-92a6-493e-a86d-b2487a92f194" containerName="extract-utilities" Mar 10 09:05:20 crc kubenswrapper[4825]: I0310 09:05:20.572845 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f804f4b-92a6-493e-a86d-b2487a92f194" containerName="extract-utilities" Mar 10 09:05:20 crc 
kubenswrapper[4825]: I0310 09:05:20.573038 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f804f4b-92a6-493e-a86d-b2487a92f194" containerName="registry-server" Mar 10 09:05:20 crc kubenswrapper[4825]: I0310 09:05:20.574495 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4tn5s" Mar 10 09:05:20 crc kubenswrapper[4825]: I0310 09:05:20.599123 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4tn5s"] Mar 10 09:05:20 crc kubenswrapper[4825]: I0310 09:05:20.694634 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/872ce701-95ef-49d7-a511-8b89f637f562-utilities\") pod \"certified-operators-4tn5s\" (UID: \"872ce701-95ef-49d7-a511-8b89f637f562\") " pod="openshift-marketplace/certified-operators-4tn5s" Mar 10 09:05:20 crc kubenswrapper[4825]: I0310 09:05:20.694757 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxj4p\" (UniqueName: \"kubernetes.io/projected/872ce701-95ef-49d7-a511-8b89f637f562-kube-api-access-zxj4p\") pod \"certified-operators-4tn5s\" (UID: \"872ce701-95ef-49d7-a511-8b89f637f562\") " pod="openshift-marketplace/certified-operators-4tn5s" Mar 10 09:05:20 crc kubenswrapper[4825]: I0310 09:05:20.695336 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/872ce701-95ef-49d7-a511-8b89f637f562-catalog-content\") pod \"certified-operators-4tn5s\" (UID: \"872ce701-95ef-49d7-a511-8b89f637f562\") " pod="openshift-marketplace/certified-operators-4tn5s" Mar 10 09:05:20 crc kubenswrapper[4825]: I0310 09:05:20.797797 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/872ce701-95ef-49d7-a511-8b89f637f562-catalog-content\") pod \"certified-operators-4tn5s\" (UID: \"872ce701-95ef-49d7-a511-8b89f637f562\") " pod="openshift-marketplace/certified-operators-4tn5s" Mar 10 09:05:20 crc kubenswrapper[4825]: I0310 09:05:20.797894 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/872ce701-95ef-49d7-a511-8b89f637f562-utilities\") pod \"certified-operators-4tn5s\" (UID: \"872ce701-95ef-49d7-a511-8b89f637f562\") " pod="openshift-marketplace/certified-operators-4tn5s" Mar 10 09:05:20 crc kubenswrapper[4825]: I0310 09:05:20.797955 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxj4p\" (UniqueName: \"kubernetes.io/projected/872ce701-95ef-49d7-a511-8b89f637f562-kube-api-access-zxj4p\") pod \"certified-operators-4tn5s\" (UID: \"872ce701-95ef-49d7-a511-8b89f637f562\") " pod="openshift-marketplace/certified-operators-4tn5s" Mar 10 09:05:20 crc kubenswrapper[4825]: I0310 09:05:20.798716 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/872ce701-95ef-49d7-a511-8b89f637f562-catalog-content\") pod \"certified-operators-4tn5s\" (UID: \"872ce701-95ef-49d7-a511-8b89f637f562\") " pod="openshift-marketplace/certified-operators-4tn5s" Mar 10 09:05:20 crc kubenswrapper[4825]: I0310 09:05:20.798735 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/872ce701-95ef-49d7-a511-8b89f637f562-utilities\") pod \"certified-operators-4tn5s\" (UID: \"872ce701-95ef-49d7-a511-8b89f637f562\") " pod="openshift-marketplace/certified-operators-4tn5s" Mar 10 09:05:20 crc kubenswrapper[4825]: I0310 09:05:20.819407 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxj4p\" (UniqueName: 
\"kubernetes.io/projected/872ce701-95ef-49d7-a511-8b89f637f562-kube-api-access-zxj4p\") pod \"certified-operators-4tn5s\" (UID: \"872ce701-95ef-49d7-a511-8b89f637f562\") " pod="openshift-marketplace/certified-operators-4tn5s" Mar 10 09:05:20 crc kubenswrapper[4825]: I0310 09:05:20.952969 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4tn5s" Mar 10 09:05:21 crc kubenswrapper[4825]: I0310 09:05:21.468177 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4tn5s"] Mar 10 09:05:22 crc kubenswrapper[4825]: I0310 09:05:22.189835 4825 generic.go:334] "Generic (PLEG): container finished" podID="872ce701-95ef-49d7-a511-8b89f637f562" containerID="700ca5ae800a97b89ed229f35ba99f7452792e2f1dde78252691820e29e95529" exitCode=0 Mar 10 09:05:22 crc kubenswrapper[4825]: I0310 09:05:22.189930 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4tn5s" event={"ID":"872ce701-95ef-49d7-a511-8b89f637f562","Type":"ContainerDied","Data":"700ca5ae800a97b89ed229f35ba99f7452792e2f1dde78252691820e29e95529"} Mar 10 09:05:22 crc kubenswrapper[4825]: I0310 09:05:22.190176 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4tn5s" event={"ID":"872ce701-95ef-49d7-a511-8b89f637f562","Type":"ContainerStarted","Data":"f963f4c945f6f13f2d4b6e77548c691ac8c580f23a932fc495eb8ba0a48487b0"} Mar 10 09:05:24 crc kubenswrapper[4825]: I0310 09:05:24.253299 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4tn5s" event={"ID":"872ce701-95ef-49d7-a511-8b89f637f562","Type":"ContainerStarted","Data":"f806e1b7e0712d3318e36990b7dc4b4e2a08507bc04a4c6fa5211384dc2f0be6"} Mar 10 09:05:25 crc kubenswrapper[4825]: E0310 09:05:25.481283 4825 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod872ce701_95ef_49d7_a511_8b89f637f562.slice/crio-f806e1b7e0712d3318e36990b7dc4b4e2a08507bc04a4c6fa5211384dc2f0be6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod872ce701_95ef_49d7_a511_8b89f637f562.slice/crio-conmon-f806e1b7e0712d3318e36990b7dc4b4e2a08507bc04a4c6fa5211384dc2f0be6.scope\": RecentStats: unable to find data in memory cache]" Mar 10 09:05:26 crc kubenswrapper[4825]: I0310 09:05:26.271088 4825 generic.go:334] "Generic (PLEG): container finished" podID="872ce701-95ef-49d7-a511-8b89f637f562" containerID="f806e1b7e0712d3318e36990b7dc4b4e2a08507bc04a4c6fa5211384dc2f0be6" exitCode=0 Mar 10 09:05:26 crc kubenswrapper[4825]: I0310 09:05:26.271153 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4tn5s" event={"ID":"872ce701-95ef-49d7-a511-8b89f637f562","Type":"ContainerDied","Data":"f806e1b7e0712d3318e36990b7dc4b4e2a08507bc04a4c6fa5211384dc2f0be6"} Mar 10 09:05:27 crc kubenswrapper[4825]: I0310 09:05:27.283602 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4tn5s" event={"ID":"872ce701-95ef-49d7-a511-8b89f637f562","Type":"ContainerStarted","Data":"20a516f884f3d025b52ad5b741e7c25e023c56c0b14eac18e1c3f8b0bfcb3a6a"} Mar 10 09:05:27 crc kubenswrapper[4825]: I0310 09:05:27.304620 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4tn5s" podStartSLOduration=2.820911019 podStartE2EDuration="7.304604709s" podCreationTimestamp="2026-03-10 09:05:20 +0000 UTC" firstStartedPulling="2026-03-10 09:05:22.192428852 +0000 UTC m=+8475.222209497" lastFinishedPulling="2026-03-10 09:05:26.676122552 +0000 UTC m=+8479.705903187" observedRunningTime="2026-03-10 09:05:27.301986709 +0000 UTC m=+8480.331767364" watchObservedRunningTime="2026-03-10 09:05:27.304604709 +0000 
UTC m=+8480.334385324" Mar 10 09:05:30 crc kubenswrapper[4825]: I0310 09:05:30.953304 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4tn5s" Mar 10 09:05:30 crc kubenswrapper[4825]: I0310 09:05:30.953842 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4tn5s" Mar 10 09:05:31 crc kubenswrapper[4825]: I0310 09:05:31.018582 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4tn5s" Mar 10 09:05:31 crc kubenswrapper[4825]: I0310 09:05:31.397525 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4tn5s" Mar 10 09:05:33 crc kubenswrapper[4825]: I0310 09:05:33.237375 4825 scope.go:117] "RemoveContainer" containerID="390899e144160760e4121c06e9f2e317e0be0012bf795558da227f948f1f1b1f" Mar 10 09:05:33 crc kubenswrapper[4825]: E0310 09:05:33.238064 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:05:34 crc kubenswrapper[4825]: I0310 09:05:34.561022 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4tn5s"] Mar 10 09:05:34 crc kubenswrapper[4825]: I0310 09:05:34.561416 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4tn5s" podUID="872ce701-95ef-49d7-a511-8b89f637f562" containerName="registry-server" containerID="cri-o://20a516f884f3d025b52ad5b741e7c25e023c56c0b14eac18e1c3f8b0bfcb3a6a" gracePeriod=2 
Mar 10 09:05:35 crc kubenswrapper[4825]: I0310 09:05:35.094429 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4tn5s" Mar 10 09:05:35 crc kubenswrapper[4825]: I0310 09:05:35.218412 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/872ce701-95ef-49d7-a511-8b89f637f562-utilities\") pod \"872ce701-95ef-49d7-a511-8b89f637f562\" (UID: \"872ce701-95ef-49d7-a511-8b89f637f562\") " Mar 10 09:05:35 crc kubenswrapper[4825]: I0310 09:05:35.218775 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxj4p\" (UniqueName: \"kubernetes.io/projected/872ce701-95ef-49d7-a511-8b89f637f562-kube-api-access-zxj4p\") pod \"872ce701-95ef-49d7-a511-8b89f637f562\" (UID: \"872ce701-95ef-49d7-a511-8b89f637f562\") " Mar 10 09:05:35 crc kubenswrapper[4825]: I0310 09:05:35.218876 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/872ce701-95ef-49d7-a511-8b89f637f562-catalog-content\") pod \"872ce701-95ef-49d7-a511-8b89f637f562\" (UID: \"872ce701-95ef-49d7-a511-8b89f637f562\") " Mar 10 09:05:35 crc kubenswrapper[4825]: I0310 09:05:35.222625 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/872ce701-95ef-49d7-a511-8b89f637f562-utilities" (OuterVolumeSpecName: "utilities") pod "872ce701-95ef-49d7-a511-8b89f637f562" (UID: "872ce701-95ef-49d7-a511-8b89f637f562"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:05:35 crc kubenswrapper[4825]: I0310 09:05:35.229347 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/872ce701-95ef-49d7-a511-8b89f637f562-kube-api-access-zxj4p" (OuterVolumeSpecName: "kube-api-access-zxj4p") pod "872ce701-95ef-49d7-a511-8b89f637f562" (UID: "872ce701-95ef-49d7-a511-8b89f637f562"). InnerVolumeSpecName "kube-api-access-zxj4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:05:35 crc kubenswrapper[4825]: I0310 09:05:35.271391 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/872ce701-95ef-49d7-a511-8b89f637f562-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "872ce701-95ef-49d7-a511-8b89f637f562" (UID: "872ce701-95ef-49d7-a511-8b89f637f562"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:05:35 crc kubenswrapper[4825]: I0310 09:05:35.321221 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxj4p\" (UniqueName: \"kubernetes.io/projected/872ce701-95ef-49d7-a511-8b89f637f562-kube-api-access-zxj4p\") on node \"crc\" DevicePath \"\"" Mar 10 09:05:35 crc kubenswrapper[4825]: I0310 09:05:35.321440 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/872ce701-95ef-49d7-a511-8b89f637f562-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:05:35 crc kubenswrapper[4825]: I0310 09:05:35.321500 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/872ce701-95ef-49d7-a511-8b89f637f562-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:05:35 crc kubenswrapper[4825]: I0310 09:05:35.370849 4825 generic.go:334] "Generic (PLEG): container finished" podID="872ce701-95ef-49d7-a511-8b89f637f562" 
containerID="20a516f884f3d025b52ad5b741e7c25e023c56c0b14eac18e1c3f8b0bfcb3a6a" exitCode=0 Mar 10 09:05:35 crc kubenswrapper[4825]: I0310 09:05:35.370912 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4tn5s" event={"ID":"872ce701-95ef-49d7-a511-8b89f637f562","Type":"ContainerDied","Data":"20a516f884f3d025b52ad5b741e7c25e023c56c0b14eac18e1c3f8b0bfcb3a6a"} Mar 10 09:05:35 crc kubenswrapper[4825]: I0310 09:05:35.370985 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4tn5s" event={"ID":"872ce701-95ef-49d7-a511-8b89f637f562","Type":"ContainerDied","Data":"f963f4c945f6f13f2d4b6e77548c691ac8c580f23a932fc495eb8ba0a48487b0"} Mar 10 09:05:35 crc kubenswrapper[4825]: I0310 09:05:35.371005 4825 scope.go:117] "RemoveContainer" containerID="20a516f884f3d025b52ad5b741e7c25e023c56c0b14eac18e1c3f8b0bfcb3a6a" Mar 10 09:05:35 crc kubenswrapper[4825]: I0310 09:05:35.371683 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4tn5s" Mar 10 09:05:35 crc kubenswrapper[4825]: I0310 09:05:35.401802 4825 scope.go:117] "RemoveContainer" containerID="f806e1b7e0712d3318e36990b7dc4b4e2a08507bc04a4c6fa5211384dc2f0be6" Mar 10 09:05:35 crc kubenswrapper[4825]: I0310 09:05:35.425319 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4tn5s"] Mar 10 09:05:35 crc kubenswrapper[4825]: I0310 09:05:35.442714 4825 scope.go:117] "RemoveContainer" containerID="700ca5ae800a97b89ed229f35ba99f7452792e2f1dde78252691820e29e95529" Mar 10 09:05:35 crc kubenswrapper[4825]: I0310 09:05:35.445417 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4tn5s"] Mar 10 09:05:35 crc kubenswrapper[4825]: I0310 09:05:35.489625 4825 scope.go:117] "RemoveContainer" containerID="20a516f884f3d025b52ad5b741e7c25e023c56c0b14eac18e1c3f8b0bfcb3a6a" Mar 10 09:05:35 crc kubenswrapper[4825]: E0310 09:05:35.490052 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20a516f884f3d025b52ad5b741e7c25e023c56c0b14eac18e1c3f8b0bfcb3a6a\": container with ID starting with 20a516f884f3d025b52ad5b741e7c25e023c56c0b14eac18e1c3f8b0bfcb3a6a not found: ID does not exist" containerID="20a516f884f3d025b52ad5b741e7c25e023c56c0b14eac18e1c3f8b0bfcb3a6a" Mar 10 09:05:35 crc kubenswrapper[4825]: I0310 09:05:35.490103 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20a516f884f3d025b52ad5b741e7c25e023c56c0b14eac18e1c3f8b0bfcb3a6a"} err="failed to get container status \"20a516f884f3d025b52ad5b741e7c25e023c56c0b14eac18e1c3f8b0bfcb3a6a\": rpc error: code = NotFound desc = could not find container \"20a516f884f3d025b52ad5b741e7c25e023c56c0b14eac18e1c3f8b0bfcb3a6a\": container with ID starting with 20a516f884f3d025b52ad5b741e7c25e023c56c0b14eac18e1c3f8b0bfcb3a6a not 
found: ID does not exist" Mar 10 09:05:35 crc kubenswrapper[4825]: I0310 09:05:35.490161 4825 scope.go:117] "RemoveContainer" containerID="f806e1b7e0712d3318e36990b7dc4b4e2a08507bc04a4c6fa5211384dc2f0be6" Mar 10 09:05:35 crc kubenswrapper[4825]: E0310 09:05:35.490393 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f806e1b7e0712d3318e36990b7dc4b4e2a08507bc04a4c6fa5211384dc2f0be6\": container with ID starting with f806e1b7e0712d3318e36990b7dc4b4e2a08507bc04a4c6fa5211384dc2f0be6 not found: ID does not exist" containerID="f806e1b7e0712d3318e36990b7dc4b4e2a08507bc04a4c6fa5211384dc2f0be6" Mar 10 09:05:35 crc kubenswrapper[4825]: I0310 09:05:35.490452 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f806e1b7e0712d3318e36990b7dc4b4e2a08507bc04a4c6fa5211384dc2f0be6"} err="failed to get container status \"f806e1b7e0712d3318e36990b7dc4b4e2a08507bc04a4c6fa5211384dc2f0be6\": rpc error: code = NotFound desc = could not find container \"f806e1b7e0712d3318e36990b7dc4b4e2a08507bc04a4c6fa5211384dc2f0be6\": container with ID starting with f806e1b7e0712d3318e36990b7dc4b4e2a08507bc04a4c6fa5211384dc2f0be6 not found: ID does not exist" Mar 10 09:05:35 crc kubenswrapper[4825]: I0310 09:05:35.490479 4825 scope.go:117] "RemoveContainer" containerID="700ca5ae800a97b89ed229f35ba99f7452792e2f1dde78252691820e29e95529" Mar 10 09:05:35 crc kubenswrapper[4825]: E0310 09:05:35.490832 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"700ca5ae800a97b89ed229f35ba99f7452792e2f1dde78252691820e29e95529\": container with ID starting with 700ca5ae800a97b89ed229f35ba99f7452792e2f1dde78252691820e29e95529 not found: ID does not exist" containerID="700ca5ae800a97b89ed229f35ba99f7452792e2f1dde78252691820e29e95529" Mar 10 09:05:35 crc kubenswrapper[4825]: I0310 09:05:35.490871 4825 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"700ca5ae800a97b89ed229f35ba99f7452792e2f1dde78252691820e29e95529"} err="failed to get container status \"700ca5ae800a97b89ed229f35ba99f7452792e2f1dde78252691820e29e95529\": rpc error: code = NotFound desc = could not find container \"700ca5ae800a97b89ed229f35ba99f7452792e2f1dde78252691820e29e95529\": container with ID starting with 700ca5ae800a97b89ed229f35ba99f7452792e2f1dde78252691820e29e95529 not found: ID does not exist" Mar 10 09:05:37 crc kubenswrapper[4825]: I0310 09:05:37.267316 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="872ce701-95ef-49d7-a511-8b89f637f562" path="/var/lib/kubelet/pods/872ce701-95ef-49d7-a511-8b89f637f562/volumes" Mar 10 09:05:44 crc kubenswrapper[4825]: I0310 09:05:44.184567 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-25pqp"] Mar 10 09:05:44 crc kubenswrapper[4825]: E0310 09:05:44.185582 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="872ce701-95ef-49d7-a511-8b89f637f562" containerName="registry-server" Mar 10 09:05:44 crc kubenswrapper[4825]: I0310 09:05:44.185607 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="872ce701-95ef-49d7-a511-8b89f637f562" containerName="registry-server" Mar 10 09:05:44 crc kubenswrapper[4825]: E0310 09:05:44.185635 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="872ce701-95ef-49d7-a511-8b89f637f562" containerName="extract-content" Mar 10 09:05:44 crc kubenswrapper[4825]: I0310 09:05:44.185644 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="872ce701-95ef-49d7-a511-8b89f637f562" containerName="extract-content" Mar 10 09:05:44 crc kubenswrapper[4825]: E0310 09:05:44.185675 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="872ce701-95ef-49d7-a511-8b89f637f562" containerName="extract-utilities" Mar 10 09:05:44 crc kubenswrapper[4825]: I0310 
09:05:44.185683 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="872ce701-95ef-49d7-a511-8b89f637f562" containerName="extract-utilities" Mar 10 09:05:44 crc kubenswrapper[4825]: I0310 09:05:44.185932 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="872ce701-95ef-49d7-a511-8b89f637f562" containerName="registry-server" Mar 10 09:05:44 crc kubenswrapper[4825]: I0310 09:05:44.187426 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-25pqp" Mar 10 09:05:44 crc kubenswrapper[4825]: I0310 09:05:44.198458 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-25pqp"] Mar 10 09:05:44 crc kubenswrapper[4825]: I0310 09:05:44.221921 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bedb31f-5d1b-45ce-9d9d-06d235d3213b-catalog-content\") pod \"community-operators-25pqp\" (UID: \"9bedb31f-5d1b-45ce-9d9d-06d235d3213b\") " pod="openshift-marketplace/community-operators-25pqp" Mar 10 09:05:44 crc kubenswrapper[4825]: I0310 09:05:44.222116 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pctk\" (UniqueName: \"kubernetes.io/projected/9bedb31f-5d1b-45ce-9d9d-06d235d3213b-kube-api-access-7pctk\") pod \"community-operators-25pqp\" (UID: \"9bedb31f-5d1b-45ce-9d9d-06d235d3213b\") " pod="openshift-marketplace/community-operators-25pqp" Mar 10 09:05:44 crc kubenswrapper[4825]: I0310 09:05:44.222197 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bedb31f-5d1b-45ce-9d9d-06d235d3213b-utilities\") pod \"community-operators-25pqp\" (UID: \"9bedb31f-5d1b-45ce-9d9d-06d235d3213b\") " pod="openshift-marketplace/community-operators-25pqp" Mar 10 09:05:44 crc 
kubenswrapper[4825]: I0310 09:05:44.237590 4825 scope.go:117] "RemoveContainer" containerID="390899e144160760e4121c06e9f2e317e0be0012bf795558da227f948f1f1b1f" Mar 10 09:05:44 crc kubenswrapper[4825]: E0310 09:05:44.237793 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:05:44 crc kubenswrapper[4825]: I0310 09:05:44.324374 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bedb31f-5d1b-45ce-9d9d-06d235d3213b-utilities\") pod \"community-operators-25pqp\" (UID: \"9bedb31f-5d1b-45ce-9d9d-06d235d3213b\") " pod="openshift-marketplace/community-operators-25pqp" Mar 10 09:05:44 crc kubenswrapper[4825]: I0310 09:05:44.324443 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bedb31f-5d1b-45ce-9d9d-06d235d3213b-catalog-content\") pod \"community-operators-25pqp\" (UID: \"9bedb31f-5d1b-45ce-9d9d-06d235d3213b\") " pod="openshift-marketplace/community-operators-25pqp" Mar 10 09:05:44 crc kubenswrapper[4825]: I0310 09:05:44.324582 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pctk\" (UniqueName: \"kubernetes.io/projected/9bedb31f-5d1b-45ce-9d9d-06d235d3213b-kube-api-access-7pctk\") pod \"community-operators-25pqp\" (UID: \"9bedb31f-5d1b-45ce-9d9d-06d235d3213b\") " pod="openshift-marketplace/community-operators-25pqp" Mar 10 09:05:44 crc kubenswrapper[4825]: I0310 09:05:44.325155 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/9bedb31f-5d1b-45ce-9d9d-06d235d3213b-utilities\") pod \"community-operators-25pqp\" (UID: \"9bedb31f-5d1b-45ce-9d9d-06d235d3213b\") " pod="openshift-marketplace/community-operators-25pqp" Mar 10 09:05:44 crc kubenswrapper[4825]: I0310 09:05:44.325225 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bedb31f-5d1b-45ce-9d9d-06d235d3213b-catalog-content\") pod \"community-operators-25pqp\" (UID: \"9bedb31f-5d1b-45ce-9d9d-06d235d3213b\") " pod="openshift-marketplace/community-operators-25pqp" Mar 10 09:05:44 crc kubenswrapper[4825]: I0310 09:05:44.352245 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pctk\" (UniqueName: \"kubernetes.io/projected/9bedb31f-5d1b-45ce-9d9d-06d235d3213b-kube-api-access-7pctk\") pod \"community-operators-25pqp\" (UID: \"9bedb31f-5d1b-45ce-9d9d-06d235d3213b\") " pod="openshift-marketplace/community-operators-25pqp" Mar 10 09:05:44 crc kubenswrapper[4825]: I0310 09:05:44.533290 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-25pqp" Mar 10 09:05:45 crc kubenswrapper[4825]: I0310 09:05:45.103536 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-25pqp"] Mar 10 09:05:45 crc kubenswrapper[4825]: I0310 09:05:45.485060 4825 generic.go:334] "Generic (PLEG): container finished" podID="9bedb31f-5d1b-45ce-9d9d-06d235d3213b" containerID="5b0ba5f83999c24f337b2687e98d769ff2a37647e96dd37d823acecc8998a50c" exitCode=0 Mar 10 09:05:45 crc kubenswrapper[4825]: I0310 09:05:45.485199 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25pqp" event={"ID":"9bedb31f-5d1b-45ce-9d9d-06d235d3213b","Type":"ContainerDied","Data":"5b0ba5f83999c24f337b2687e98d769ff2a37647e96dd37d823acecc8998a50c"} Mar 10 09:05:45 crc kubenswrapper[4825]: I0310 09:05:45.485438 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25pqp" event={"ID":"9bedb31f-5d1b-45ce-9d9d-06d235d3213b","Type":"ContainerStarted","Data":"b01fb0022ca8d5adb8b3b6961633456e6f69c3982fb12a33caeafa489d3b27b4"} Mar 10 09:05:46 crc kubenswrapper[4825]: I0310 09:05:46.500370 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25pqp" event={"ID":"9bedb31f-5d1b-45ce-9d9d-06d235d3213b","Type":"ContainerStarted","Data":"2c2674394a78a3f4205b1ca5fe4f140061af05c7dd28181ce251132e1b556a9e"} Mar 10 09:05:47 crc kubenswrapper[4825]: I0310 09:05:47.515643 4825 generic.go:334] "Generic (PLEG): container finished" podID="9bedb31f-5d1b-45ce-9d9d-06d235d3213b" containerID="2c2674394a78a3f4205b1ca5fe4f140061af05c7dd28181ce251132e1b556a9e" exitCode=0 Mar 10 09:05:47 crc kubenswrapper[4825]: I0310 09:05:47.515723 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25pqp" 
event={"ID":"9bedb31f-5d1b-45ce-9d9d-06d235d3213b","Type":"ContainerDied","Data":"2c2674394a78a3f4205b1ca5fe4f140061af05c7dd28181ce251132e1b556a9e"} Mar 10 09:05:49 crc kubenswrapper[4825]: I0310 09:05:49.534780 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25pqp" event={"ID":"9bedb31f-5d1b-45ce-9d9d-06d235d3213b","Type":"ContainerStarted","Data":"6a995efe68a69db809bbeb1a620d0d37bb0376615a600b7c9176732b527312e6"} Mar 10 09:05:49 crc kubenswrapper[4825]: I0310 09:05:49.563121 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-25pqp" podStartSLOduration=2.372316822 podStartE2EDuration="5.563093528s" podCreationTimestamp="2026-03-10 09:05:44 +0000 UTC" firstStartedPulling="2026-03-10 09:05:45.487925021 +0000 UTC m=+8498.517705636" lastFinishedPulling="2026-03-10 09:05:48.678701727 +0000 UTC m=+8501.708482342" observedRunningTime="2026-03-10 09:05:49.554089608 +0000 UTC m=+8502.583870243" watchObservedRunningTime="2026-03-10 09:05:49.563093528 +0000 UTC m=+8502.592874143" Mar 10 09:05:54 crc kubenswrapper[4825]: I0310 09:05:54.533559 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-25pqp" Mar 10 09:05:54 crc kubenswrapper[4825]: I0310 09:05:54.534077 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-25pqp" Mar 10 09:05:54 crc kubenswrapper[4825]: I0310 09:05:54.578014 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-25pqp" Mar 10 09:05:54 crc kubenswrapper[4825]: I0310 09:05:54.639184 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-25pqp" Mar 10 09:05:54 crc kubenswrapper[4825]: I0310 09:05:54.809795 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-25pqp"] Mar 10 09:05:56 crc kubenswrapper[4825]: I0310 09:05:56.604329 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-25pqp" podUID="9bedb31f-5d1b-45ce-9d9d-06d235d3213b" containerName="registry-server" containerID="cri-o://6a995efe68a69db809bbeb1a620d0d37bb0376615a600b7c9176732b527312e6" gracePeriod=2 Mar 10 09:05:57 crc kubenswrapper[4825]: I0310 09:05:57.617547 4825 generic.go:334] "Generic (PLEG): container finished" podID="9bedb31f-5d1b-45ce-9d9d-06d235d3213b" containerID="6a995efe68a69db809bbeb1a620d0d37bb0376615a600b7c9176732b527312e6" exitCode=0 Mar 10 09:05:57 crc kubenswrapper[4825]: I0310 09:05:57.617621 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25pqp" event={"ID":"9bedb31f-5d1b-45ce-9d9d-06d235d3213b","Type":"ContainerDied","Data":"6a995efe68a69db809bbeb1a620d0d37bb0376615a600b7c9176732b527312e6"} Mar 10 09:05:57 crc kubenswrapper[4825]: I0310 09:05:57.617855 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-25pqp" event={"ID":"9bedb31f-5d1b-45ce-9d9d-06d235d3213b","Type":"ContainerDied","Data":"b01fb0022ca8d5adb8b3b6961633456e6f69c3982fb12a33caeafa489d3b27b4"} Mar 10 09:05:57 crc kubenswrapper[4825]: I0310 09:05:57.617871 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b01fb0022ca8d5adb8b3b6961633456e6f69c3982fb12a33caeafa489d3b27b4" Mar 10 09:05:57 crc kubenswrapper[4825]: I0310 09:05:57.641541 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-25pqp" Mar 10 09:05:57 crc kubenswrapper[4825]: I0310 09:05:57.743064 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pctk\" (UniqueName: \"kubernetes.io/projected/9bedb31f-5d1b-45ce-9d9d-06d235d3213b-kube-api-access-7pctk\") pod \"9bedb31f-5d1b-45ce-9d9d-06d235d3213b\" (UID: \"9bedb31f-5d1b-45ce-9d9d-06d235d3213b\") " Mar 10 09:05:57 crc kubenswrapper[4825]: I0310 09:05:57.743329 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bedb31f-5d1b-45ce-9d9d-06d235d3213b-catalog-content\") pod \"9bedb31f-5d1b-45ce-9d9d-06d235d3213b\" (UID: \"9bedb31f-5d1b-45ce-9d9d-06d235d3213b\") " Mar 10 09:05:57 crc kubenswrapper[4825]: I0310 09:05:57.743389 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bedb31f-5d1b-45ce-9d9d-06d235d3213b-utilities\") pod \"9bedb31f-5d1b-45ce-9d9d-06d235d3213b\" (UID: \"9bedb31f-5d1b-45ce-9d9d-06d235d3213b\") " Mar 10 09:05:57 crc kubenswrapper[4825]: I0310 09:05:57.745400 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bedb31f-5d1b-45ce-9d9d-06d235d3213b-utilities" (OuterVolumeSpecName: "utilities") pod "9bedb31f-5d1b-45ce-9d9d-06d235d3213b" (UID: "9bedb31f-5d1b-45ce-9d9d-06d235d3213b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:05:57 crc kubenswrapper[4825]: I0310 09:05:57.749893 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bedb31f-5d1b-45ce-9d9d-06d235d3213b-kube-api-access-7pctk" (OuterVolumeSpecName: "kube-api-access-7pctk") pod "9bedb31f-5d1b-45ce-9d9d-06d235d3213b" (UID: "9bedb31f-5d1b-45ce-9d9d-06d235d3213b"). InnerVolumeSpecName "kube-api-access-7pctk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:05:57 crc kubenswrapper[4825]: I0310 09:05:57.804277 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bedb31f-5d1b-45ce-9d9d-06d235d3213b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9bedb31f-5d1b-45ce-9d9d-06d235d3213b" (UID: "9bedb31f-5d1b-45ce-9d9d-06d235d3213b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:05:57 crc kubenswrapper[4825]: I0310 09:05:57.846691 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bedb31f-5d1b-45ce-9d9d-06d235d3213b-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:05:57 crc kubenswrapper[4825]: I0310 09:05:57.846756 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pctk\" (UniqueName: \"kubernetes.io/projected/9bedb31f-5d1b-45ce-9d9d-06d235d3213b-kube-api-access-7pctk\") on node \"crc\" DevicePath \"\"" Mar 10 09:05:57 crc kubenswrapper[4825]: I0310 09:05:57.846776 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bedb31f-5d1b-45ce-9d9d-06d235d3213b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:05:58 crc kubenswrapper[4825]: I0310 09:05:58.265860 4825 scope.go:117] "RemoveContainer" containerID="390899e144160760e4121c06e9f2e317e0be0012bf795558da227f948f1f1b1f" Mar 10 09:05:58 crc kubenswrapper[4825]: E0310 09:05:58.266529 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:05:58 
crc kubenswrapper[4825]: I0310 09:05:58.630159 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-25pqp" Mar 10 09:05:58 crc kubenswrapper[4825]: I0310 09:05:58.673980 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-25pqp"] Mar 10 09:05:58 crc kubenswrapper[4825]: I0310 09:05:58.686321 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-25pqp"] Mar 10 09:05:59 crc kubenswrapper[4825]: I0310 09:05:59.264682 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bedb31f-5d1b-45ce-9d9d-06d235d3213b" path="/var/lib/kubelet/pods/9bedb31f-5d1b-45ce-9d9d-06d235d3213b/volumes" Mar 10 09:05:59 crc kubenswrapper[4825]: I0310 09:05:59.642998 4825 generic.go:334] "Generic (PLEG): container finished" podID="2ef9f6e3-ec3c-40b2-9072-8f6859aa4019" containerID="5f73dd636b78b13e3eee16f822ed9aecddb1ba7e755062fcc2d080276f761388" exitCode=0 Mar 10 09:05:59 crc kubenswrapper[4825]: I0310 09:05:59.643049 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc" event={"ID":"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019","Type":"ContainerDied","Data":"5f73dd636b78b13e3eee16f822ed9aecddb1ba7e755062fcc2d080276f761388"} Mar 10 09:06:00 crc kubenswrapper[4825]: I0310 09:06:00.163577 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552226-kcfrv"] Mar 10 09:06:00 crc kubenswrapper[4825]: E0310 09:06:00.164179 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bedb31f-5d1b-45ce-9d9d-06d235d3213b" containerName="extract-content" Mar 10 09:06:00 crc kubenswrapper[4825]: I0310 09:06:00.164205 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bedb31f-5d1b-45ce-9d9d-06d235d3213b" containerName="extract-content" Mar 10 09:06:00 crc kubenswrapper[4825]: E0310 
09:06:00.164230 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bedb31f-5d1b-45ce-9d9d-06d235d3213b" containerName="extract-utilities" Mar 10 09:06:00 crc kubenswrapper[4825]: I0310 09:06:00.164239 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bedb31f-5d1b-45ce-9d9d-06d235d3213b" containerName="extract-utilities" Mar 10 09:06:00 crc kubenswrapper[4825]: E0310 09:06:00.164272 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bedb31f-5d1b-45ce-9d9d-06d235d3213b" containerName="registry-server" Mar 10 09:06:00 crc kubenswrapper[4825]: I0310 09:06:00.164283 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bedb31f-5d1b-45ce-9d9d-06d235d3213b" containerName="registry-server" Mar 10 09:06:00 crc kubenswrapper[4825]: I0310 09:06:00.164555 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bedb31f-5d1b-45ce-9d9d-06d235d3213b" containerName="registry-server" Mar 10 09:06:00 crc kubenswrapper[4825]: I0310 09:06:00.165489 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552226-kcfrv" Mar 10 09:06:00 crc kubenswrapper[4825]: I0310 09:06:00.169438 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:06:00 crc kubenswrapper[4825]: I0310 09:06:00.169634 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 09:06:00 crc kubenswrapper[4825]: I0310 09:06:00.169859 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:06:00 crc kubenswrapper[4825]: I0310 09:06:00.181247 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552226-kcfrv"] Mar 10 09:06:00 crc kubenswrapper[4825]: I0310 09:06:00.309246 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x8pd\" (UniqueName: \"kubernetes.io/projected/3dad9b10-003f-4255-bc72-577855c4d316-kube-api-access-9x8pd\") pod \"auto-csr-approver-29552226-kcfrv\" (UID: \"3dad9b10-003f-4255-bc72-577855c4d316\") " pod="openshift-infra/auto-csr-approver-29552226-kcfrv" Mar 10 09:06:00 crc kubenswrapper[4825]: I0310 09:06:00.414158 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9x8pd\" (UniqueName: \"kubernetes.io/projected/3dad9b10-003f-4255-bc72-577855c4d316-kube-api-access-9x8pd\") pod \"auto-csr-approver-29552226-kcfrv\" (UID: \"3dad9b10-003f-4255-bc72-577855c4d316\") " pod="openshift-infra/auto-csr-approver-29552226-kcfrv" Mar 10 09:06:00 crc kubenswrapper[4825]: I0310 09:06:00.435084 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x8pd\" (UniqueName: \"kubernetes.io/projected/3dad9b10-003f-4255-bc72-577855c4d316-kube-api-access-9x8pd\") pod \"auto-csr-approver-29552226-kcfrv\" (UID: \"3dad9b10-003f-4255-bc72-577855c4d316\") " 
pod="openshift-infra/auto-csr-approver-29552226-kcfrv" Mar 10 09:06:00 crc kubenswrapper[4825]: I0310 09:06:00.513807 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552226-kcfrv" Mar 10 09:06:01 crc kubenswrapper[4825]: I0310 09:06:01.138680 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552226-kcfrv"] Mar 10 09:06:01 crc kubenswrapper[4825]: I0310 09:06:01.189697 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc" Mar 10 09:06:01 crc kubenswrapper[4825]: I0310 09:06:01.233919 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-inventory\") pod \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\" (UID: \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\") " Mar 10 09:06:01 crc kubenswrapper[4825]: I0310 09:06:01.233980 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-cell1-compute-config-3\") pod \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\" (UID: \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\") " Mar 10 09:06:01 crc kubenswrapper[4825]: I0310 09:06:01.234015 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmsgx\" (UniqueName: \"kubernetes.io/projected/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-kube-api-access-nmsgx\") pod \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\" (UID: \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\") " Mar 10 09:06:01 crc kubenswrapper[4825]: I0310 09:06:01.234059 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-cell1-combined-ca-bundle\") pod \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\" (UID: \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\") " Mar 10 09:06:01 crc kubenswrapper[4825]: I0310 09:06:01.234082 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-cell1-compute-config-2\") pod \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\" (UID: \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\") " Mar 10 09:06:01 crc kubenswrapper[4825]: I0310 09:06:01.234102 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-cells-global-config-0\") pod \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\" (UID: \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\") " Mar 10 09:06:01 crc kubenswrapper[4825]: I0310 09:06:01.234177 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-ssh-key-openstack-cell1\") pod \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\" (UID: \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\") " Mar 10 09:06:01 crc kubenswrapper[4825]: I0310 09:06:01.234204 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-migration-ssh-key-0\") pod \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\" (UID: \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\") " Mar 10 09:06:01 crc kubenswrapper[4825]: I0310 09:06:01.234239 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-migration-ssh-key-1\") pod 
\"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\" (UID: \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\") " Mar 10 09:06:01 crc kubenswrapper[4825]: I0310 09:06:01.234264 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-cell1-compute-config-1\") pod \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\" (UID: \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\") " Mar 10 09:06:01 crc kubenswrapper[4825]: I0310 09:06:01.234295 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-cell1-compute-config-0\") pod \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\" (UID: \"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019\") " Mar 10 09:06:01 crc kubenswrapper[4825]: I0310 09:06:01.240675 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "2ef9f6e3-ec3c-40b2-9072-8f6859aa4019" (UID: "2ef9f6e3-ec3c-40b2-9072-8f6859aa4019"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:06:01 crc kubenswrapper[4825]: I0310 09:06:01.243033 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-kube-api-access-nmsgx" (OuterVolumeSpecName: "kube-api-access-nmsgx") pod "2ef9f6e3-ec3c-40b2-9072-8f6859aa4019" (UID: "2ef9f6e3-ec3c-40b2-9072-8f6859aa4019"). InnerVolumeSpecName "kube-api-access-nmsgx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:06:01 crc kubenswrapper[4825]: I0310 09:06:01.266608 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "2ef9f6e3-ec3c-40b2-9072-8f6859aa4019" (UID: "2ef9f6e3-ec3c-40b2-9072-8f6859aa4019"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:06:01 crc kubenswrapper[4825]: I0310 09:06:01.266999 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "2ef9f6e3-ec3c-40b2-9072-8f6859aa4019" (UID: "2ef9f6e3-ec3c-40b2-9072-8f6859aa4019"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:06:01 crc kubenswrapper[4825]: I0310 09:06:01.268728 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-inventory" (OuterVolumeSpecName: "inventory") pod "2ef9f6e3-ec3c-40b2-9072-8f6859aa4019" (UID: "2ef9f6e3-ec3c-40b2-9072-8f6859aa4019"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:06:01 crc kubenswrapper[4825]: I0310 09:06:01.269820 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "2ef9f6e3-ec3c-40b2-9072-8f6859aa4019" (UID: "2ef9f6e3-ec3c-40b2-9072-8f6859aa4019"). InnerVolumeSpecName "nova-cells-global-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:06:01 crc kubenswrapper[4825]: I0310 09:06:01.271679 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "2ef9f6e3-ec3c-40b2-9072-8f6859aa4019" (UID: "2ef9f6e3-ec3c-40b2-9072-8f6859aa4019"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:06:01 crc kubenswrapper[4825]: I0310 09:06:01.272881 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "2ef9f6e3-ec3c-40b2-9072-8f6859aa4019" (UID: "2ef9f6e3-ec3c-40b2-9072-8f6859aa4019"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:06:01 crc kubenswrapper[4825]: I0310 09:06:01.273386 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "2ef9f6e3-ec3c-40b2-9072-8f6859aa4019" (UID: "2ef9f6e3-ec3c-40b2-9072-8f6859aa4019"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:06:01 crc kubenswrapper[4825]: I0310 09:06:01.273843 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "2ef9f6e3-ec3c-40b2-9072-8f6859aa4019" (UID: "2ef9f6e3-ec3c-40b2-9072-8f6859aa4019"). InnerVolumeSpecName "nova-cell1-compute-config-3". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:06:01 crc kubenswrapper[4825]: I0310 09:06:01.284325 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "2ef9f6e3-ec3c-40b2-9072-8f6859aa4019" (UID: "2ef9f6e3-ec3c-40b2-9072-8f6859aa4019"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:06:01 crc kubenswrapper[4825]: I0310 09:06:01.336874 4825 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 10 09:06:01 crc kubenswrapper[4825]: I0310 09:06:01.336940 4825 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 09:06:01 crc kubenswrapper[4825]: I0310 09:06:01.336957 4825 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 10 09:06:01 crc kubenswrapper[4825]: I0310 09:06:01.336970 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmsgx\" (UniqueName: \"kubernetes.io/projected/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-kube-api-access-nmsgx\") on node \"crc\" DevicePath \"\"" Mar 10 09:06:01 crc kubenswrapper[4825]: I0310 09:06:01.336982 4825 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 09:06:01 crc kubenswrapper[4825]: I0310 
09:06:01.336995 4825 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 10 09:06:01 crc kubenswrapper[4825]: I0310 09:06:01.337033 4825 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Mar 10 09:06:01 crc kubenswrapper[4825]: I0310 09:06:01.337045 4825 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 10 09:06:01 crc kubenswrapper[4825]: I0310 09:06:01.337059 4825 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 10 09:06:01 crc kubenswrapper[4825]: I0310 09:06:01.337070 4825 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 10 09:06:01 crc kubenswrapper[4825]: I0310 09:06:01.337082 4825 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/2ef9f6e3-ec3c-40b2-9072-8f6859aa4019-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 10 09:06:01 crc kubenswrapper[4825]: I0310 09:06:01.682266 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc" 
event={"ID":"2ef9f6e3-ec3c-40b2-9072-8f6859aa4019","Type":"ContainerDied","Data":"ab515bd090b427006138e935db5d3d8c08c988e985e2ee8c28988b7306d53759"} Mar 10 09:06:01 crc kubenswrapper[4825]: I0310 09:06:01.682594 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab515bd090b427006138e935db5d3d8c08c988e985e2ee8c28988b7306d53759" Mar 10 09:06:01 crc kubenswrapper[4825]: I0310 09:06:01.682301 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc" Mar 10 09:06:01 crc kubenswrapper[4825]: I0310 09:06:01.683888 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552226-kcfrv" event={"ID":"3dad9b10-003f-4255-bc72-577855c4d316","Type":"ContainerStarted","Data":"9a3a70de2662cd3a82fceaec7d4b04891f194b17a23f1b8929d413e8121e2b4f"} Mar 10 09:06:03 crc kubenswrapper[4825]: I0310 09:06:03.703055 4825 generic.go:334] "Generic (PLEG): container finished" podID="3dad9b10-003f-4255-bc72-577855c4d316" containerID="85a3c232edaabb6a2ec5602b4e9d02bcfaed7293b5fb03838d4d7dd6ae609235" exitCode=0 Mar 10 09:06:03 crc kubenswrapper[4825]: I0310 09:06:03.703179 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552226-kcfrv" event={"ID":"3dad9b10-003f-4255-bc72-577855c4d316","Type":"ContainerDied","Data":"85a3c232edaabb6a2ec5602b4e9d02bcfaed7293b5fb03838d4d7dd6ae609235"} Mar 10 09:06:05 crc kubenswrapper[4825]: I0310 09:06:05.099035 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552226-kcfrv" Mar 10 09:06:05 crc kubenswrapper[4825]: I0310 09:06:05.119888 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9x8pd\" (UniqueName: \"kubernetes.io/projected/3dad9b10-003f-4255-bc72-577855c4d316-kube-api-access-9x8pd\") pod \"3dad9b10-003f-4255-bc72-577855c4d316\" (UID: \"3dad9b10-003f-4255-bc72-577855c4d316\") " Mar 10 09:06:05 crc kubenswrapper[4825]: I0310 09:06:05.131453 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dad9b10-003f-4255-bc72-577855c4d316-kube-api-access-9x8pd" (OuterVolumeSpecName: "kube-api-access-9x8pd") pod "3dad9b10-003f-4255-bc72-577855c4d316" (UID: "3dad9b10-003f-4255-bc72-577855c4d316"). InnerVolumeSpecName "kube-api-access-9x8pd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:06:05 crc kubenswrapper[4825]: I0310 09:06:05.223265 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9x8pd\" (UniqueName: \"kubernetes.io/projected/3dad9b10-003f-4255-bc72-577855c4d316-kube-api-access-9x8pd\") on node \"crc\" DevicePath \"\"" Mar 10 09:06:05 crc kubenswrapper[4825]: I0310 09:06:05.737670 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552226-kcfrv" event={"ID":"3dad9b10-003f-4255-bc72-577855c4d316","Type":"ContainerDied","Data":"9a3a70de2662cd3a82fceaec7d4b04891f194b17a23f1b8929d413e8121e2b4f"} Mar 10 09:06:05 crc kubenswrapper[4825]: I0310 09:06:05.737986 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a3a70de2662cd3a82fceaec7d4b04891f194b17a23f1b8929d413e8121e2b4f" Mar 10 09:06:05 crc kubenswrapper[4825]: I0310 09:06:05.737724 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552226-kcfrv" Mar 10 09:06:06 crc kubenswrapper[4825]: I0310 09:06:06.160220 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552220-rn9kv"] Mar 10 09:06:06 crc kubenswrapper[4825]: I0310 09:06:06.168646 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552220-rn9kv"] Mar 10 09:06:07 crc kubenswrapper[4825]: I0310 09:06:07.255156 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7834d68-a65e-4887-9b5a-3608b5778b4d" path="/var/lib/kubelet/pods/c7834d68-a65e-4887-9b5a-3608b5778b4d/volumes" Mar 10 09:06:12 crc kubenswrapper[4825]: I0310 09:06:12.236107 4825 scope.go:117] "RemoveContainer" containerID="390899e144160760e4121c06e9f2e317e0be0012bf795558da227f948f1f1b1f" Mar 10 09:06:12 crc kubenswrapper[4825]: E0310 09:06:12.236730 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:06:27 crc kubenswrapper[4825]: I0310 09:06:27.236400 4825 scope.go:117] "RemoveContainer" containerID="390899e144160760e4121c06e9f2e317e0be0012bf795558da227f948f1f1b1f" Mar 10 09:06:27 crc kubenswrapper[4825]: E0310 09:06:27.238034 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" 
podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:06:42 crc kubenswrapper[4825]: I0310 09:06:42.237033 4825 scope.go:117] "RemoveContainer" containerID="390899e144160760e4121c06e9f2e317e0be0012bf795558da227f948f1f1b1f" Mar 10 09:06:42 crc kubenswrapper[4825]: E0310 09:06:42.238771 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:06:54 crc kubenswrapper[4825]: I0310 09:06:54.237324 4825 scope.go:117] "RemoveContainer" containerID="390899e144160760e4121c06e9f2e317e0be0012bf795558da227f948f1f1b1f" Mar 10 09:06:55 crc kubenswrapper[4825]: I0310 09:06:55.248290 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerStarted","Data":"251bcbde3d587c53e13f80f1dc3248b62b51ef9f6f569d929f4325376c5948f9"} Mar 10 09:06:55 crc kubenswrapper[4825]: I0310 09:06:55.408446 4825 scope.go:117] "RemoveContainer" containerID="1e6103c3e35e9fe8fd9b3775bf57c6f318a307bedfb973e85bcbd8d9de3aeb5a" Mar 10 09:07:55 crc kubenswrapper[4825]: I0310 09:07:55.520271 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Mar 10 09:07:55 crc kubenswrapper[4825]: I0310 09:07:55.522426 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-copy-data" podUID="808cc76b-6582-4007-8504-5c11f9d43ba4" containerName="adoption" containerID="cri-o://8b01dc2d22588b22bd3853e41c894a4cb9414e35aa1a1773d48381c1a43c3440" gracePeriod=30 Mar 10 09:08:00 crc kubenswrapper[4825]: I0310 09:08:00.167606 4825 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552228-v247f"] Mar 10 09:08:00 crc kubenswrapper[4825]: E0310 09:08:00.169213 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ef9f6e3-ec3c-40b2-9072-8f6859aa4019" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Mar 10 09:08:00 crc kubenswrapper[4825]: I0310 09:08:00.169248 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ef9f6e3-ec3c-40b2-9072-8f6859aa4019" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Mar 10 09:08:00 crc kubenswrapper[4825]: E0310 09:08:00.169318 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dad9b10-003f-4255-bc72-577855c4d316" containerName="oc" Mar 10 09:08:00 crc kubenswrapper[4825]: I0310 09:08:00.169339 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dad9b10-003f-4255-bc72-577855c4d316" containerName="oc" Mar 10 09:08:00 crc kubenswrapper[4825]: I0310 09:08:00.169846 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ef9f6e3-ec3c-40b2-9072-8f6859aa4019" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Mar 10 09:08:00 crc kubenswrapper[4825]: I0310 09:08:00.169959 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dad9b10-003f-4255-bc72-577855c4d316" containerName="oc" Mar 10 09:08:00 crc kubenswrapper[4825]: I0310 09:08:00.171789 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552228-v247f" Mar 10 09:08:00 crc kubenswrapper[4825]: I0310 09:08:00.176835 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 09:08:00 crc kubenswrapper[4825]: I0310 09:08:00.177383 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:08:00 crc kubenswrapper[4825]: I0310 09:08:00.181541 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:08:00 crc kubenswrapper[4825]: I0310 09:08:00.185057 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552228-v247f"] Mar 10 09:08:00 crc kubenswrapper[4825]: I0310 09:08:00.233783 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsb25\" (UniqueName: \"kubernetes.io/projected/03b01350-dc59-49db-a606-9abd90e1bf11-kube-api-access-xsb25\") pod \"auto-csr-approver-29552228-v247f\" (UID: \"03b01350-dc59-49db-a606-9abd90e1bf11\") " pod="openshift-infra/auto-csr-approver-29552228-v247f" Mar 10 09:08:00 crc kubenswrapper[4825]: I0310 09:08:00.336297 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsb25\" (UniqueName: \"kubernetes.io/projected/03b01350-dc59-49db-a606-9abd90e1bf11-kube-api-access-xsb25\") pod \"auto-csr-approver-29552228-v247f\" (UID: \"03b01350-dc59-49db-a606-9abd90e1bf11\") " pod="openshift-infra/auto-csr-approver-29552228-v247f" Mar 10 09:08:00 crc kubenswrapper[4825]: I0310 09:08:00.356207 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsb25\" (UniqueName: \"kubernetes.io/projected/03b01350-dc59-49db-a606-9abd90e1bf11-kube-api-access-xsb25\") pod \"auto-csr-approver-29552228-v247f\" (UID: \"03b01350-dc59-49db-a606-9abd90e1bf11\") " 
pod="openshift-infra/auto-csr-approver-29552228-v247f" Mar 10 09:08:00 crc kubenswrapper[4825]: I0310 09:08:00.516379 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552228-v247f" Mar 10 09:08:01 crc kubenswrapper[4825]: I0310 09:08:01.015550 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552228-v247f" event={"ID":"03b01350-dc59-49db-a606-9abd90e1bf11","Type":"ContainerStarted","Data":"1ff81352c0afff3412d017054d1a19e6c2d7b47ad57b45b88cc1275db3d533c0"} Mar 10 09:08:01 crc kubenswrapper[4825]: I0310 09:08:01.022070 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552228-v247f"] Mar 10 09:08:03 crc kubenswrapper[4825]: I0310 09:08:03.041448 4825 generic.go:334] "Generic (PLEG): container finished" podID="03b01350-dc59-49db-a606-9abd90e1bf11" containerID="29138918703f351cfacad81471f690ce74f3560f53ba0eac482c466e26a1194b" exitCode=0 Mar 10 09:08:03 crc kubenswrapper[4825]: I0310 09:08:03.041909 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552228-v247f" event={"ID":"03b01350-dc59-49db-a606-9abd90e1bf11","Type":"ContainerDied","Data":"29138918703f351cfacad81471f690ce74f3560f53ba0eac482c466e26a1194b"} Mar 10 09:08:04 crc kubenswrapper[4825]: I0310 09:08:04.416026 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552228-v247f" Mar 10 09:08:04 crc kubenswrapper[4825]: I0310 09:08:04.538592 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsb25\" (UniqueName: \"kubernetes.io/projected/03b01350-dc59-49db-a606-9abd90e1bf11-kube-api-access-xsb25\") pod \"03b01350-dc59-49db-a606-9abd90e1bf11\" (UID: \"03b01350-dc59-49db-a606-9abd90e1bf11\") " Mar 10 09:08:04 crc kubenswrapper[4825]: I0310 09:08:04.546656 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03b01350-dc59-49db-a606-9abd90e1bf11-kube-api-access-xsb25" (OuterVolumeSpecName: "kube-api-access-xsb25") pod "03b01350-dc59-49db-a606-9abd90e1bf11" (UID: "03b01350-dc59-49db-a606-9abd90e1bf11"). InnerVolumeSpecName "kube-api-access-xsb25". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:08:04 crc kubenswrapper[4825]: I0310 09:08:04.642577 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsb25\" (UniqueName: \"kubernetes.io/projected/03b01350-dc59-49db-a606-9abd90e1bf11-kube-api-access-xsb25\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:05 crc kubenswrapper[4825]: I0310 09:08:05.073347 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552228-v247f" Mar 10 09:08:05 crc kubenswrapper[4825]: I0310 09:08:05.073418 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552228-v247f" event={"ID":"03b01350-dc59-49db-a606-9abd90e1bf11","Type":"ContainerDied","Data":"1ff81352c0afff3412d017054d1a19e6c2d7b47ad57b45b88cc1275db3d533c0"} Mar 10 09:08:05 crc kubenswrapper[4825]: I0310 09:08:05.073520 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ff81352c0afff3412d017054d1a19e6c2d7b47ad57b45b88cc1275db3d533c0" Mar 10 09:08:05 crc kubenswrapper[4825]: I0310 09:08:05.512990 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552222-w2fmn"] Mar 10 09:08:05 crc kubenswrapper[4825]: I0310 09:08:05.524927 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552222-w2fmn"] Mar 10 09:08:07 crc kubenswrapper[4825]: I0310 09:08:07.259295 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7219dcd8-f33b-4734-9f06-5d232060adcd" path="/var/lib/kubelet/pods/7219dcd8-f33b-4734-9f06-5d232060adcd/volumes" Mar 10 09:08:26 crc kubenswrapper[4825]: I0310 09:08:26.023051 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Mar 10 09:08:26 crc kubenswrapper[4825]: I0310 09:08:26.150746 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q79n6\" (UniqueName: \"kubernetes.io/projected/808cc76b-6582-4007-8504-5c11f9d43ba4-kube-api-access-q79n6\") pod \"808cc76b-6582-4007-8504-5c11f9d43ba4\" (UID: \"808cc76b-6582-4007-8504-5c11f9d43ba4\") " Mar 10 09:08:26 crc kubenswrapper[4825]: I0310 09:08:26.151529 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mariadb-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-acd5ef12-e6ba-4571-9bdc-07cbc77dbb5b\") pod \"808cc76b-6582-4007-8504-5c11f9d43ba4\" (UID: \"808cc76b-6582-4007-8504-5c11f9d43ba4\") " Mar 10 09:08:26 crc kubenswrapper[4825]: I0310 09:08:26.156334 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/808cc76b-6582-4007-8504-5c11f9d43ba4-kube-api-access-q79n6" (OuterVolumeSpecName: "kube-api-access-q79n6") pod "808cc76b-6582-4007-8504-5c11f9d43ba4" (UID: "808cc76b-6582-4007-8504-5c11f9d43ba4"). InnerVolumeSpecName "kube-api-access-q79n6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:08:26 crc kubenswrapper[4825]: I0310 09:08:26.171379 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-acd5ef12-e6ba-4571-9bdc-07cbc77dbb5b" (OuterVolumeSpecName: "mariadb-data") pod "808cc76b-6582-4007-8504-5c11f9d43ba4" (UID: "808cc76b-6582-4007-8504-5c11f9d43ba4"). InnerVolumeSpecName "pvc-acd5ef12-e6ba-4571-9bdc-07cbc77dbb5b". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 10 09:08:26 crc kubenswrapper[4825]: I0310 09:08:26.253309 4825 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-acd5ef12-e6ba-4571-9bdc-07cbc77dbb5b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-acd5ef12-e6ba-4571-9bdc-07cbc77dbb5b\") on node \"crc\" " Mar 10 09:08:26 crc kubenswrapper[4825]: I0310 09:08:26.253357 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q79n6\" (UniqueName: \"kubernetes.io/projected/808cc76b-6582-4007-8504-5c11f9d43ba4-kube-api-access-q79n6\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:26 crc kubenswrapper[4825]: I0310 09:08:26.275680 4825 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 10 09:08:26 crc kubenswrapper[4825]: I0310 09:08:26.275829 4825 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-acd5ef12-e6ba-4571-9bdc-07cbc77dbb5b" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-acd5ef12-e6ba-4571-9bdc-07cbc77dbb5b") on node "crc" Mar 10 09:08:26 crc kubenswrapper[4825]: I0310 09:08:26.320579 4825 generic.go:334] "Generic (PLEG): container finished" podID="808cc76b-6582-4007-8504-5c11f9d43ba4" containerID="8b01dc2d22588b22bd3853e41c894a4cb9414e35aa1a1773d48381c1a43c3440" exitCode=137 Mar 10 09:08:26 crc kubenswrapper[4825]: I0310 09:08:26.320623 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"808cc76b-6582-4007-8504-5c11f9d43ba4","Type":"ContainerDied","Data":"8b01dc2d22588b22bd3853e41c894a4cb9414e35aa1a1773d48381c1a43c3440"} Mar 10 09:08:26 crc kubenswrapper[4825]: I0310 09:08:26.320669 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" 
event={"ID":"808cc76b-6582-4007-8504-5c11f9d43ba4","Type":"ContainerDied","Data":"11a7b84ddee9aad62b933d0dcc7a5a47e936aab10adef93d82aa6fc3b9183508"} Mar 10 09:08:26 crc kubenswrapper[4825]: I0310 09:08:26.320689 4825 scope.go:117] "RemoveContainer" containerID="8b01dc2d22588b22bd3853e41c894a4cb9414e35aa1a1773d48381c1a43c3440" Mar 10 09:08:26 crc kubenswrapper[4825]: I0310 09:08:26.320632 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Mar 10 09:08:26 crc kubenswrapper[4825]: I0310 09:08:26.343874 4825 scope.go:117] "RemoveContainer" containerID="8b01dc2d22588b22bd3853e41c894a4cb9414e35aa1a1773d48381c1a43c3440" Mar 10 09:08:26 crc kubenswrapper[4825]: E0310 09:08:26.344666 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b01dc2d22588b22bd3853e41c894a4cb9414e35aa1a1773d48381c1a43c3440\": container with ID starting with 8b01dc2d22588b22bd3853e41c894a4cb9414e35aa1a1773d48381c1a43c3440 not found: ID does not exist" containerID="8b01dc2d22588b22bd3853e41c894a4cb9414e35aa1a1773d48381c1a43c3440" Mar 10 09:08:26 crc kubenswrapper[4825]: I0310 09:08:26.344700 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b01dc2d22588b22bd3853e41c894a4cb9414e35aa1a1773d48381c1a43c3440"} err="failed to get container status \"8b01dc2d22588b22bd3853e41c894a4cb9414e35aa1a1773d48381c1a43c3440\": rpc error: code = NotFound desc = could not find container \"8b01dc2d22588b22bd3853e41c894a4cb9414e35aa1a1773d48381c1a43c3440\": container with ID starting with 8b01dc2d22588b22bd3853e41c894a4cb9414e35aa1a1773d48381c1a43c3440 not found: ID does not exist" Mar 10 09:08:26 crc kubenswrapper[4825]: I0310 09:08:26.351319 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Mar 10 09:08:26 crc kubenswrapper[4825]: I0310 09:08:26.355383 4825 reconciler_common.go:293] "Volume 
detached for volume \"pvc-acd5ef12-e6ba-4571-9bdc-07cbc77dbb5b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-acd5ef12-e6ba-4571-9bdc-07cbc77dbb5b\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:26 crc kubenswrapper[4825]: I0310 09:08:26.362396 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-copy-data"] Mar 10 09:08:26 crc kubenswrapper[4825]: I0310 09:08:26.992383 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Mar 10 09:08:26 crc kubenswrapper[4825]: I0310 09:08:26.992581 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-copy-data" podUID="45279de6-a8ae-49b0-8182-9b576a1f5e11" containerName="adoption" containerID="cri-o://a06eea67b494d75e4c30481fc42af7aea77ef71a8057552e179619bfc049e25c" gracePeriod=30 Mar 10 09:08:27 crc kubenswrapper[4825]: I0310 09:08:27.246259 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="808cc76b-6582-4007-8504-5c11f9d43ba4" path="/var/lib/kubelet/pods/808cc76b-6582-4007-8504-5c11f9d43ba4/volumes" Mar 10 09:08:55 crc kubenswrapper[4825]: I0310 09:08:55.527417 4825 scope.go:117] "RemoveContainer" containerID="714206142d3ba3d6d1b5114fe11faf0404775b17a7a0dbcb4ebe4f83ff657c7b" Mar 10 09:08:57 crc kubenswrapper[4825]: I0310 09:08:57.487338 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Mar 10 09:08:57 crc kubenswrapper[4825]: I0310 09:08:57.634612 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wm8h\" (UniqueName: \"kubernetes.io/projected/45279de6-a8ae-49b0-8182-9b576a1f5e11-kube-api-access-7wm8h\") pod \"45279de6-a8ae-49b0-8182-9b576a1f5e11\" (UID: \"45279de6-a8ae-49b0-8182-9b576a1f5e11\") " Mar 10 09:08:57 crc kubenswrapper[4825]: I0310 09:08:57.635051 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/45279de6-a8ae-49b0-8182-9b576a1f5e11-ovn-data-cert\") pod \"45279de6-a8ae-49b0-8182-9b576a1f5e11\" (UID: \"45279de6-a8ae-49b0-8182-9b576a1f5e11\") " Mar 10 09:08:57 crc kubenswrapper[4825]: I0310 09:08:57.636267 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6d1180a2-690c-4b45-b239-887bef574278\") pod \"45279de6-a8ae-49b0-8182-9b576a1f5e11\" (UID: \"45279de6-a8ae-49b0-8182-9b576a1f5e11\") " Mar 10 09:08:57 crc kubenswrapper[4825]: I0310 09:08:57.640583 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45279de6-a8ae-49b0-8182-9b576a1f5e11-ovn-data-cert" (OuterVolumeSpecName: "ovn-data-cert") pod "45279de6-a8ae-49b0-8182-9b576a1f5e11" (UID: "45279de6-a8ae-49b0-8182-9b576a1f5e11"). InnerVolumeSpecName "ovn-data-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:08:57 crc kubenswrapper[4825]: I0310 09:08:57.640916 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45279de6-a8ae-49b0-8182-9b576a1f5e11-kube-api-access-7wm8h" (OuterVolumeSpecName: "kube-api-access-7wm8h") pod "45279de6-a8ae-49b0-8182-9b576a1f5e11" (UID: "45279de6-a8ae-49b0-8182-9b576a1f5e11"). InnerVolumeSpecName "kube-api-access-7wm8h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:08:57 crc kubenswrapper[4825]: I0310 09:08:57.666290 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6d1180a2-690c-4b45-b239-887bef574278" (OuterVolumeSpecName: "ovn-data") pod "45279de6-a8ae-49b0-8182-9b576a1f5e11" (UID: "45279de6-a8ae-49b0-8182-9b576a1f5e11"). InnerVolumeSpecName "pvc-6d1180a2-690c-4b45-b239-887bef574278". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 10 09:08:57 crc kubenswrapper[4825]: I0310 09:08:57.729583 4825 generic.go:334] "Generic (PLEG): container finished" podID="45279de6-a8ae-49b0-8182-9b576a1f5e11" containerID="a06eea67b494d75e4c30481fc42af7aea77ef71a8057552e179619bfc049e25c" exitCode=137 Mar 10 09:08:57 crc kubenswrapper[4825]: I0310 09:08:57.729692 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Mar 10 09:08:57 crc kubenswrapper[4825]: I0310 09:08:57.729725 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"45279de6-a8ae-49b0-8182-9b576a1f5e11","Type":"ContainerDied","Data":"a06eea67b494d75e4c30481fc42af7aea77ef71a8057552e179619bfc049e25c"} Mar 10 09:08:57 crc kubenswrapper[4825]: I0310 09:08:57.731169 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"45279de6-a8ae-49b0-8182-9b576a1f5e11","Type":"ContainerDied","Data":"81577592660eb68339e1ce86026b02b416d949e0244f3c76122f6912028c4f57"} Mar 10 09:08:57 crc kubenswrapper[4825]: I0310 09:08:57.731204 4825 scope.go:117] "RemoveContainer" containerID="a06eea67b494d75e4c30481fc42af7aea77ef71a8057552e179619bfc049e25c" Mar 10 09:08:57 crc kubenswrapper[4825]: I0310 09:08:57.738181 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wm8h\" (UniqueName: \"kubernetes.io/projected/45279de6-a8ae-49b0-8182-9b576a1f5e11-kube-api-access-7wm8h\") on node \"crc\" 
DevicePath \"\"" Mar 10 09:08:57 crc kubenswrapper[4825]: I0310 09:08:57.738212 4825 reconciler_common.go:293] "Volume detached for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/45279de6-a8ae-49b0-8182-9b576a1f5e11-ovn-data-cert\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:57 crc kubenswrapper[4825]: I0310 09:08:57.738253 4825 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-6d1180a2-690c-4b45-b239-887bef574278\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6d1180a2-690c-4b45-b239-887bef574278\") on node \"crc\" " Mar 10 09:08:57 crc kubenswrapper[4825]: I0310 09:08:57.755636 4825 scope.go:117] "RemoveContainer" containerID="a06eea67b494d75e4c30481fc42af7aea77ef71a8057552e179619bfc049e25c" Mar 10 09:08:57 crc kubenswrapper[4825]: E0310 09:08:57.756057 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a06eea67b494d75e4c30481fc42af7aea77ef71a8057552e179619bfc049e25c\": container with ID starting with a06eea67b494d75e4c30481fc42af7aea77ef71a8057552e179619bfc049e25c not found: ID does not exist" containerID="a06eea67b494d75e4c30481fc42af7aea77ef71a8057552e179619bfc049e25c" Mar 10 09:08:57 crc kubenswrapper[4825]: I0310 09:08:57.756120 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a06eea67b494d75e4c30481fc42af7aea77ef71a8057552e179619bfc049e25c"} err="failed to get container status \"a06eea67b494d75e4c30481fc42af7aea77ef71a8057552e179619bfc049e25c\": rpc error: code = NotFound desc = could not find container \"a06eea67b494d75e4c30481fc42af7aea77ef71a8057552e179619bfc049e25c\": container with ID starting with a06eea67b494d75e4c30481fc42af7aea77ef71a8057552e179619bfc049e25c not found: ID does not exist" Mar 10 09:08:57 crc kubenswrapper[4825]: I0310 09:08:57.768956 4825 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability 
not set. Skipping UnmountDevice... Mar 10 09:08:57 crc kubenswrapper[4825]: I0310 09:08:57.769292 4825 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-6d1180a2-690c-4b45-b239-887bef574278" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6d1180a2-690c-4b45-b239-887bef574278") on node "crc" Mar 10 09:08:57 crc kubenswrapper[4825]: I0310 09:08:57.774902 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Mar 10 09:08:57 crc kubenswrapper[4825]: I0310 09:08:57.782934 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-copy-data"] Mar 10 09:08:57 crc kubenswrapper[4825]: I0310 09:08:57.840560 4825 reconciler_common.go:293] "Volume detached for volume \"pvc-6d1180a2-690c-4b45-b239-887bef574278\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6d1180a2-690c-4b45-b239-887bef574278\") on node \"crc\" DevicePath \"\"" Mar 10 09:08:59 crc kubenswrapper[4825]: I0310 09:08:59.249929 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45279de6-a8ae-49b0-8182-9b576a1f5e11" path="/var/lib/kubelet/pods/45279de6-a8ae-49b0-8182-9b576a1f5e11/volumes" Mar 10 09:09:15 crc kubenswrapper[4825]: I0310 09:09:15.394441 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-497j6"] Mar 10 09:09:15 crc kubenswrapper[4825]: E0310 09:09:15.395708 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45279de6-a8ae-49b0-8182-9b576a1f5e11" containerName="adoption" Mar 10 09:09:15 crc kubenswrapper[4825]: I0310 09:09:15.395731 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="45279de6-a8ae-49b0-8182-9b576a1f5e11" containerName="adoption" Mar 10 09:09:15 crc kubenswrapper[4825]: E0310 09:09:15.395751 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="808cc76b-6582-4007-8504-5c11f9d43ba4" containerName="adoption" Mar 10 09:09:15 crc kubenswrapper[4825]: I0310 09:09:15.395766 
4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="808cc76b-6582-4007-8504-5c11f9d43ba4" containerName="adoption" Mar 10 09:09:15 crc kubenswrapper[4825]: E0310 09:09:15.395806 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03b01350-dc59-49db-a606-9abd90e1bf11" containerName="oc" Mar 10 09:09:15 crc kubenswrapper[4825]: I0310 09:09:15.395820 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="03b01350-dc59-49db-a606-9abd90e1bf11" containerName="oc" Mar 10 09:09:15 crc kubenswrapper[4825]: I0310 09:09:15.396290 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="45279de6-a8ae-49b0-8182-9b576a1f5e11" containerName="adoption" Mar 10 09:09:15 crc kubenswrapper[4825]: I0310 09:09:15.396324 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="808cc76b-6582-4007-8504-5c11f9d43ba4" containerName="adoption" Mar 10 09:09:15 crc kubenswrapper[4825]: I0310 09:09:15.396365 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="03b01350-dc59-49db-a606-9abd90e1bf11" containerName="oc" Mar 10 09:09:15 crc kubenswrapper[4825]: I0310 09:09:15.399016 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-497j6" Mar 10 09:09:15 crc kubenswrapper[4825]: I0310 09:09:15.410082 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-497j6"] Mar 10 09:09:15 crc kubenswrapper[4825]: I0310 09:09:15.545296 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a489cee-f32c-4cfe-a53b-09a00a8f33dc-catalog-content\") pod \"redhat-operators-497j6\" (UID: \"8a489cee-f32c-4cfe-a53b-09a00a8f33dc\") " pod="openshift-marketplace/redhat-operators-497j6" Mar 10 09:09:15 crc kubenswrapper[4825]: I0310 09:09:15.545373 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhrw9\" (UniqueName: \"kubernetes.io/projected/8a489cee-f32c-4cfe-a53b-09a00a8f33dc-kube-api-access-vhrw9\") pod \"redhat-operators-497j6\" (UID: \"8a489cee-f32c-4cfe-a53b-09a00a8f33dc\") " pod="openshift-marketplace/redhat-operators-497j6" Mar 10 09:09:15 crc kubenswrapper[4825]: I0310 09:09:15.545495 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a489cee-f32c-4cfe-a53b-09a00a8f33dc-utilities\") pod \"redhat-operators-497j6\" (UID: \"8a489cee-f32c-4cfe-a53b-09a00a8f33dc\") " pod="openshift-marketplace/redhat-operators-497j6" Mar 10 09:09:15 crc kubenswrapper[4825]: I0310 09:09:15.647663 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a489cee-f32c-4cfe-a53b-09a00a8f33dc-catalog-content\") pod \"redhat-operators-497j6\" (UID: \"8a489cee-f32c-4cfe-a53b-09a00a8f33dc\") " pod="openshift-marketplace/redhat-operators-497j6" Mar 10 09:09:15 crc kubenswrapper[4825]: I0310 09:09:15.648036 4825 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-vhrw9\" (UniqueName: \"kubernetes.io/projected/8a489cee-f32c-4cfe-a53b-09a00a8f33dc-kube-api-access-vhrw9\") pod \"redhat-operators-497j6\" (UID: \"8a489cee-f32c-4cfe-a53b-09a00a8f33dc\") " pod="openshift-marketplace/redhat-operators-497j6" Mar 10 09:09:15 crc kubenswrapper[4825]: I0310 09:09:15.648062 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a489cee-f32c-4cfe-a53b-09a00a8f33dc-utilities\") pod \"redhat-operators-497j6\" (UID: \"8a489cee-f32c-4cfe-a53b-09a00a8f33dc\") " pod="openshift-marketplace/redhat-operators-497j6" Mar 10 09:09:15 crc kubenswrapper[4825]: I0310 09:09:15.648593 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a489cee-f32c-4cfe-a53b-09a00a8f33dc-catalog-content\") pod \"redhat-operators-497j6\" (UID: \"8a489cee-f32c-4cfe-a53b-09a00a8f33dc\") " pod="openshift-marketplace/redhat-operators-497j6" Mar 10 09:09:15 crc kubenswrapper[4825]: I0310 09:09:15.648604 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a489cee-f32c-4cfe-a53b-09a00a8f33dc-utilities\") pod \"redhat-operators-497j6\" (UID: \"8a489cee-f32c-4cfe-a53b-09a00a8f33dc\") " pod="openshift-marketplace/redhat-operators-497j6" Mar 10 09:09:15 crc kubenswrapper[4825]: I0310 09:09:15.677981 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhrw9\" (UniqueName: \"kubernetes.io/projected/8a489cee-f32c-4cfe-a53b-09a00a8f33dc-kube-api-access-vhrw9\") pod \"redhat-operators-497j6\" (UID: \"8a489cee-f32c-4cfe-a53b-09a00a8f33dc\") " pod="openshift-marketplace/redhat-operators-497j6" Mar 10 09:09:15 crc kubenswrapper[4825]: I0310 09:09:15.722603 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-497j6" Mar 10 09:09:16 crc kubenswrapper[4825]: I0310 09:09:16.223936 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-497j6"] Mar 10 09:09:16 crc kubenswrapper[4825]: I0310 09:09:16.887680 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:09:16 crc kubenswrapper[4825]: I0310 09:09:16.887736 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:09:16 crc kubenswrapper[4825]: I0310 09:09:16.978200 4825 generic.go:334] "Generic (PLEG): container finished" podID="8a489cee-f32c-4cfe-a53b-09a00a8f33dc" containerID="0c6d239c2dc63a3ce31dc597a0f2ee670a7b7fb0cbd94c06b59c7cbcb386f672" exitCode=0 Mar 10 09:09:16 crc kubenswrapper[4825]: I0310 09:09:16.978248 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-497j6" event={"ID":"8a489cee-f32c-4cfe-a53b-09a00a8f33dc","Type":"ContainerDied","Data":"0c6d239c2dc63a3ce31dc597a0f2ee670a7b7fb0cbd94c06b59c7cbcb386f672"} Mar 10 09:09:16 crc kubenswrapper[4825]: I0310 09:09:16.978276 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-497j6" event={"ID":"8a489cee-f32c-4cfe-a53b-09a00a8f33dc","Type":"ContainerStarted","Data":"bc3cfc2693aeb743ae75f049a65a6d5cb23b8b4f5d5d587dde304686b302a245"} Mar 10 09:09:16 crc kubenswrapper[4825]: I0310 09:09:16.980054 4825 provider.go:102] Refreshing cache for 
provider: *credentialprovider.defaultDockerConfigProvider Mar 10 09:09:17 crc kubenswrapper[4825]: I0310 09:09:17.569264 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 10 09:09:17 crc kubenswrapper[4825]: I0310 09:09:17.571969 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 10 09:09:17 crc kubenswrapper[4825]: I0310 09:09:17.582103 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 10 09:09:17 crc kubenswrapper[4825]: I0310 09:09:17.582516 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 10 09:09:17 crc kubenswrapper[4825]: I0310 09:09:17.582926 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 10 09:09:17 crc kubenswrapper[4825]: I0310 09:09:17.582961 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-grwd4" Mar 10 09:09:17 crc kubenswrapper[4825]: I0310 09:09:17.598012 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 10 09:09:17 crc kubenswrapper[4825]: I0310 09:09:17.724873 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f26d1ad6-86b3-4fef-84d4-78cd5df47576-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"f26d1ad6-86b3-4fef-84d4-78cd5df47576\") " pod="openstack/tempest-tests-tempest" Mar 10 09:09:17 crc kubenswrapper[4825]: I0310 09:09:17.724931 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f26d1ad6-86b3-4fef-84d4-78cd5df47576-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"f26d1ad6-86b3-4fef-84d4-78cd5df47576\") " 
pod="openstack/tempest-tests-tempest" Mar 10 09:09:17 crc kubenswrapper[4825]: I0310 09:09:17.725064 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"f26d1ad6-86b3-4fef-84d4-78cd5df47576\") " pod="openstack/tempest-tests-tempest" Mar 10 09:09:17 crc kubenswrapper[4825]: I0310 09:09:17.725283 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f26d1ad6-86b3-4fef-84d4-78cd5df47576-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"f26d1ad6-86b3-4fef-84d4-78cd5df47576\") " pod="openstack/tempest-tests-tempest" Mar 10 09:09:17 crc kubenswrapper[4825]: I0310 09:09:17.725550 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/f26d1ad6-86b3-4fef-84d4-78cd5df47576-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"f26d1ad6-86b3-4fef-84d4-78cd5df47576\") " pod="openstack/tempest-tests-tempest" Mar 10 09:09:17 crc kubenswrapper[4825]: I0310 09:09:17.725794 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/f26d1ad6-86b3-4fef-84d4-78cd5df47576-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"f26d1ad6-86b3-4fef-84d4-78cd5df47576\") " pod="openstack/tempest-tests-tempest" Mar 10 09:09:17 crc kubenswrapper[4825]: I0310 09:09:17.725885 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/f26d1ad6-86b3-4fef-84d4-78cd5df47576-ca-certs\") pod \"tempest-tests-tempest\" (UID: 
\"f26d1ad6-86b3-4fef-84d4-78cd5df47576\") " pod="openstack/tempest-tests-tempest" Mar 10 09:09:17 crc kubenswrapper[4825]: I0310 09:09:17.725930 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f26d1ad6-86b3-4fef-84d4-78cd5df47576-config-data\") pod \"tempest-tests-tempest\" (UID: \"f26d1ad6-86b3-4fef-84d4-78cd5df47576\") " pod="openstack/tempest-tests-tempest" Mar 10 09:09:17 crc kubenswrapper[4825]: I0310 09:09:17.725979 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgb25\" (UniqueName: \"kubernetes.io/projected/f26d1ad6-86b3-4fef-84d4-78cd5df47576-kube-api-access-hgb25\") pod \"tempest-tests-tempest\" (UID: \"f26d1ad6-86b3-4fef-84d4-78cd5df47576\") " pod="openstack/tempest-tests-tempest" Mar 10 09:09:17 crc kubenswrapper[4825]: I0310 09:09:17.827571 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/f26d1ad6-86b3-4fef-84d4-78cd5df47576-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"f26d1ad6-86b3-4fef-84d4-78cd5df47576\") " pod="openstack/tempest-tests-tempest" Mar 10 09:09:17 crc kubenswrapper[4825]: I0310 09:09:17.827634 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f26d1ad6-86b3-4fef-84d4-78cd5df47576-config-data\") pod \"tempest-tests-tempest\" (UID: \"f26d1ad6-86b3-4fef-84d4-78cd5df47576\") " pod="openstack/tempest-tests-tempest" Mar 10 09:09:17 crc kubenswrapper[4825]: I0310 09:09:17.827671 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgb25\" (UniqueName: \"kubernetes.io/projected/f26d1ad6-86b3-4fef-84d4-78cd5df47576-kube-api-access-hgb25\") pod \"tempest-tests-tempest\" (UID: \"f26d1ad6-86b3-4fef-84d4-78cd5df47576\") " pod="openstack/tempest-tests-tempest" Mar 
10 09:09:17 crc kubenswrapper[4825]: I0310 09:09:17.827721 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f26d1ad6-86b3-4fef-84d4-78cd5df47576-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"f26d1ad6-86b3-4fef-84d4-78cd5df47576\") " pod="openstack/tempest-tests-tempest" Mar 10 09:09:17 crc kubenswrapper[4825]: I0310 09:09:17.827758 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f26d1ad6-86b3-4fef-84d4-78cd5df47576-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"f26d1ad6-86b3-4fef-84d4-78cd5df47576\") " pod="openstack/tempest-tests-tempest" Mar 10 09:09:17 crc kubenswrapper[4825]: I0310 09:09:17.827782 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"f26d1ad6-86b3-4fef-84d4-78cd5df47576\") " pod="openstack/tempest-tests-tempest" Mar 10 09:09:17 crc kubenswrapper[4825]: I0310 09:09:17.827830 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f26d1ad6-86b3-4fef-84d4-78cd5df47576-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"f26d1ad6-86b3-4fef-84d4-78cd5df47576\") " pod="openstack/tempest-tests-tempest" Mar 10 09:09:17 crc kubenswrapper[4825]: I0310 09:09:17.827902 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/f26d1ad6-86b3-4fef-84d4-78cd5df47576-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"f26d1ad6-86b3-4fef-84d4-78cd5df47576\") " pod="openstack/tempest-tests-tempest" Mar 10 09:09:17 crc kubenswrapper[4825]: I0310 09:09:17.827981 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/f26d1ad6-86b3-4fef-84d4-78cd5df47576-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"f26d1ad6-86b3-4fef-84d4-78cd5df47576\") " pod="openstack/tempest-tests-tempest" Mar 10 09:09:17 crc kubenswrapper[4825]: I0310 09:09:17.828447 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/f26d1ad6-86b3-4fef-84d4-78cd5df47576-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"f26d1ad6-86b3-4fef-84d4-78cd5df47576\") " pod="openstack/tempest-tests-tempest" Mar 10 09:09:17 crc kubenswrapper[4825]: I0310 09:09:17.828764 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/f26d1ad6-86b3-4fef-84d4-78cd5df47576-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"f26d1ad6-86b3-4fef-84d4-78cd5df47576\") " pod="openstack/tempest-tests-tempest" Mar 10 09:09:17 crc kubenswrapper[4825]: I0310 09:09:17.828887 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"f26d1ad6-86b3-4fef-84d4-78cd5df47576\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/tempest-tests-tempest" Mar 10 09:09:17 crc kubenswrapper[4825]: I0310 09:09:17.830152 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f26d1ad6-86b3-4fef-84d4-78cd5df47576-config-data\") pod \"tempest-tests-tempest\" (UID: \"f26d1ad6-86b3-4fef-84d4-78cd5df47576\") " pod="openstack/tempest-tests-tempest" Mar 10 09:09:17 crc kubenswrapper[4825]: I0310 09:09:17.831062 4825 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f26d1ad6-86b3-4fef-84d4-78cd5df47576-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"f26d1ad6-86b3-4fef-84d4-78cd5df47576\") " pod="openstack/tempest-tests-tempest" Mar 10 09:09:17 crc kubenswrapper[4825]: I0310 09:09:17.835501 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/f26d1ad6-86b3-4fef-84d4-78cd5df47576-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"f26d1ad6-86b3-4fef-84d4-78cd5df47576\") " pod="openstack/tempest-tests-tempest" Mar 10 09:09:17 crc kubenswrapper[4825]: I0310 09:09:17.835530 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f26d1ad6-86b3-4fef-84d4-78cd5df47576-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"f26d1ad6-86b3-4fef-84d4-78cd5df47576\") " pod="openstack/tempest-tests-tempest" Mar 10 09:09:17 crc kubenswrapper[4825]: I0310 09:09:17.836524 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f26d1ad6-86b3-4fef-84d4-78cd5df47576-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"f26d1ad6-86b3-4fef-84d4-78cd5df47576\") " pod="openstack/tempest-tests-tempest" Mar 10 09:09:17 crc kubenswrapper[4825]: I0310 09:09:17.846800 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgb25\" (UniqueName: \"kubernetes.io/projected/f26d1ad6-86b3-4fef-84d4-78cd5df47576-kube-api-access-hgb25\") pod \"tempest-tests-tempest\" (UID: \"f26d1ad6-86b3-4fef-84d4-78cd5df47576\") " pod="openstack/tempest-tests-tempest" Mar 10 09:09:17 crc kubenswrapper[4825]: I0310 09:09:17.861067 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: 
\"f26d1ad6-86b3-4fef-84d4-78cd5df47576\") " pod="openstack/tempest-tests-tempest" Mar 10 09:09:17 crc kubenswrapper[4825]: I0310 09:09:17.895153 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 10 09:09:18 crc kubenswrapper[4825]: W0310 09:09:18.378305 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf26d1ad6_86b3_4fef_84d4_78cd5df47576.slice/crio-b8df03aebcfa752f76528ce6dd9e3a98cf412093e569f74d4eae9ea3557a7114 WatchSource:0}: Error finding container b8df03aebcfa752f76528ce6dd9e3a98cf412093e569f74d4eae9ea3557a7114: Status 404 returned error can't find the container with id b8df03aebcfa752f76528ce6dd9e3a98cf412093e569f74d4eae9ea3557a7114 Mar 10 09:09:18 crc kubenswrapper[4825]: I0310 09:09:18.389109 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 10 09:09:19 crc kubenswrapper[4825]: I0310 09:09:19.000864 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"f26d1ad6-86b3-4fef-84d4-78cd5df47576","Type":"ContainerStarted","Data":"b8df03aebcfa752f76528ce6dd9e3a98cf412093e569f74d4eae9ea3557a7114"} Mar 10 09:09:19 crc kubenswrapper[4825]: I0310 09:09:19.003940 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-497j6" event={"ID":"8a489cee-f32c-4cfe-a53b-09a00a8f33dc","Type":"ContainerStarted","Data":"1e86e213c8681eb2d1a29e8bc9c5d59be1b8d5614b3e174adb6349e50543bbe7"} Mar 10 09:09:23 crc kubenswrapper[4825]: I0310 09:09:23.044162 4825 generic.go:334] "Generic (PLEG): container finished" podID="8a489cee-f32c-4cfe-a53b-09a00a8f33dc" containerID="1e86e213c8681eb2d1a29e8bc9c5d59be1b8d5614b3e174adb6349e50543bbe7" exitCode=0 Mar 10 09:09:23 crc kubenswrapper[4825]: I0310 09:09:23.044201 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-497j6" event={"ID":"8a489cee-f32c-4cfe-a53b-09a00a8f33dc","Type":"ContainerDied","Data":"1e86e213c8681eb2d1a29e8bc9c5d59be1b8d5614b3e174adb6349e50543bbe7"} Mar 10 09:09:24 crc kubenswrapper[4825]: I0310 09:09:24.058619 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-497j6" event={"ID":"8a489cee-f32c-4cfe-a53b-09a00a8f33dc","Type":"ContainerStarted","Data":"270ebcb4d4869eaf2b80f8a66f092cf391ad49f8569e00aa05b33ce5eff68008"} Mar 10 09:09:24 crc kubenswrapper[4825]: I0310 09:09:24.083172 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-497j6" podStartSLOduration=2.442477366 podStartE2EDuration="9.083150948s" podCreationTimestamp="2026-03-10 09:09:15 +0000 UTC" firstStartedPulling="2026-03-10 09:09:16.979820189 +0000 UTC m=+8710.009600804" lastFinishedPulling="2026-03-10 09:09:23.620493761 +0000 UTC m=+8716.650274386" observedRunningTime="2026-03-10 09:09:24.076536281 +0000 UTC m=+8717.106316896" watchObservedRunningTime="2026-03-10 09:09:24.083150948 +0000 UTC m=+8717.112931563" Mar 10 09:09:25 crc kubenswrapper[4825]: I0310 09:09:25.722925 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-497j6" Mar 10 09:09:25 crc kubenswrapper[4825]: I0310 09:09:25.723250 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-497j6" Mar 10 09:09:26 crc kubenswrapper[4825]: I0310 09:09:26.771572 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-497j6" podUID="8a489cee-f32c-4cfe-a53b-09a00a8f33dc" containerName="registry-server" probeResult="failure" output=< Mar 10 09:09:26 crc kubenswrapper[4825]: timeout: failed to connect service ":50051" within 1s Mar 10 09:09:26 crc kubenswrapper[4825]: > Mar 10 09:09:36 crc kubenswrapper[4825]: I0310 
09:09:36.777814 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-497j6" podUID="8a489cee-f32c-4cfe-a53b-09a00a8f33dc" containerName="registry-server" probeResult="failure" output=< Mar 10 09:09:36 crc kubenswrapper[4825]: timeout: failed to connect service ":50051" within 1s Mar 10 09:09:36 crc kubenswrapper[4825]: > Mar 10 09:09:45 crc kubenswrapper[4825]: I0310 09:09:45.771582 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-497j6" Mar 10 09:09:45 crc kubenswrapper[4825]: I0310 09:09:45.825172 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-497j6" Mar 10 09:09:46 crc kubenswrapper[4825]: I0310 09:09:46.584100 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-497j6"] Mar 10 09:09:46 crc kubenswrapper[4825]: I0310 09:09:46.888688 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:09:46 crc kubenswrapper[4825]: I0310 09:09:46.889035 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:09:47 crc kubenswrapper[4825]: I0310 09:09:47.286678 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-497j6" podUID="8a489cee-f32c-4cfe-a53b-09a00a8f33dc" containerName="registry-server" 
containerID="cri-o://270ebcb4d4869eaf2b80f8a66f092cf391ad49f8569e00aa05b33ce5eff68008" gracePeriod=2 Mar 10 09:09:47 crc kubenswrapper[4825]: I0310 09:09:47.733502 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-497j6" Mar 10 09:09:47 crc kubenswrapper[4825]: I0310 09:09:47.795246 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhrw9\" (UniqueName: \"kubernetes.io/projected/8a489cee-f32c-4cfe-a53b-09a00a8f33dc-kube-api-access-vhrw9\") pod \"8a489cee-f32c-4cfe-a53b-09a00a8f33dc\" (UID: \"8a489cee-f32c-4cfe-a53b-09a00a8f33dc\") " Mar 10 09:09:47 crc kubenswrapper[4825]: I0310 09:09:47.795407 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a489cee-f32c-4cfe-a53b-09a00a8f33dc-catalog-content\") pod \"8a489cee-f32c-4cfe-a53b-09a00a8f33dc\" (UID: \"8a489cee-f32c-4cfe-a53b-09a00a8f33dc\") " Mar 10 09:09:47 crc kubenswrapper[4825]: I0310 09:09:47.795529 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a489cee-f32c-4cfe-a53b-09a00a8f33dc-utilities\") pod \"8a489cee-f32c-4cfe-a53b-09a00a8f33dc\" (UID: \"8a489cee-f32c-4cfe-a53b-09a00a8f33dc\") " Mar 10 09:09:47 crc kubenswrapper[4825]: I0310 09:09:47.796228 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a489cee-f32c-4cfe-a53b-09a00a8f33dc-utilities" (OuterVolumeSpecName: "utilities") pod "8a489cee-f32c-4cfe-a53b-09a00a8f33dc" (UID: "8a489cee-f32c-4cfe-a53b-09a00a8f33dc"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:09:47 crc kubenswrapper[4825]: I0310 09:09:47.803528 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a489cee-f32c-4cfe-a53b-09a00a8f33dc-kube-api-access-vhrw9" (OuterVolumeSpecName: "kube-api-access-vhrw9") pod "8a489cee-f32c-4cfe-a53b-09a00a8f33dc" (UID: "8a489cee-f32c-4cfe-a53b-09a00a8f33dc"). InnerVolumeSpecName "kube-api-access-vhrw9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:09:47 crc kubenswrapper[4825]: I0310 09:09:47.897758 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a489cee-f32c-4cfe-a53b-09a00a8f33dc-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:09:47 crc kubenswrapper[4825]: I0310 09:09:47.897821 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhrw9\" (UniqueName: \"kubernetes.io/projected/8a489cee-f32c-4cfe-a53b-09a00a8f33dc-kube-api-access-vhrw9\") on node \"crc\" DevicePath \"\"" Mar 10 09:09:47 crc kubenswrapper[4825]: I0310 09:09:47.949560 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a489cee-f32c-4cfe-a53b-09a00a8f33dc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8a489cee-f32c-4cfe-a53b-09a00a8f33dc" (UID: "8a489cee-f32c-4cfe-a53b-09a00a8f33dc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:09:48 crc kubenswrapper[4825]: I0310 09:09:48.000994 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a489cee-f32c-4cfe-a53b-09a00a8f33dc-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:09:48 crc kubenswrapper[4825]: I0310 09:09:48.299223 4825 generic.go:334] "Generic (PLEG): container finished" podID="8a489cee-f32c-4cfe-a53b-09a00a8f33dc" containerID="270ebcb4d4869eaf2b80f8a66f092cf391ad49f8569e00aa05b33ce5eff68008" exitCode=0 Mar 10 09:09:48 crc kubenswrapper[4825]: I0310 09:09:48.299271 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-497j6" event={"ID":"8a489cee-f32c-4cfe-a53b-09a00a8f33dc","Type":"ContainerDied","Data":"270ebcb4d4869eaf2b80f8a66f092cf391ad49f8569e00aa05b33ce5eff68008"} Mar 10 09:09:48 crc kubenswrapper[4825]: I0310 09:09:48.299296 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-497j6" event={"ID":"8a489cee-f32c-4cfe-a53b-09a00a8f33dc","Type":"ContainerDied","Data":"bc3cfc2693aeb743ae75f049a65a6d5cb23b8b4f5d5d587dde304686b302a245"} Mar 10 09:09:48 crc kubenswrapper[4825]: I0310 09:09:48.299312 4825 scope.go:117] "RemoveContainer" containerID="270ebcb4d4869eaf2b80f8a66f092cf391ad49f8569e00aa05b33ce5eff68008" Mar 10 09:09:48 crc kubenswrapper[4825]: I0310 09:09:48.299312 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-497j6" Mar 10 09:09:48 crc kubenswrapper[4825]: I0310 09:09:48.338316 4825 scope.go:117] "RemoveContainer" containerID="1e86e213c8681eb2d1a29e8bc9c5d59be1b8d5614b3e174adb6349e50543bbe7" Mar 10 09:09:48 crc kubenswrapper[4825]: I0310 09:09:48.343896 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-497j6"] Mar 10 09:09:48 crc kubenswrapper[4825]: I0310 09:09:48.355359 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-497j6"] Mar 10 09:09:48 crc kubenswrapper[4825]: I0310 09:09:48.363039 4825 scope.go:117] "RemoveContainer" containerID="0c6d239c2dc63a3ce31dc597a0f2ee670a7b7fb0cbd94c06b59c7cbcb386f672" Mar 10 09:09:48 crc kubenswrapper[4825]: I0310 09:09:48.412841 4825 scope.go:117] "RemoveContainer" containerID="270ebcb4d4869eaf2b80f8a66f092cf391ad49f8569e00aa05b33ce5eff68008" Mar 10 09:09:48 crc kubenswrapper[4825]: E0310 09:09:48.413451 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"270ebcb4d4869eaf2b80f8a66f092cf391ad49f8569e00aa05b33ce5eff68008\": container with ID starting with 270ebcb4d4869eaf2b80f8a66f092cf391ad49f8569e00aa05b33ce5eff68008 not found: ID does not exist" containerID="270ebcb4d4869eaf2b80f8a66f092cf391ad49f8569e00aa05b33ce5eff68008" Mar 10 09:09:48 crc kubenswrapper[4825]: I0310 09:09:48.413484 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"270ebcb4d4869eaf2b80f8a66f092cf391ad49f8569e00aa05b33ce5eff68008"} err="failed to get container status \"270ebcb4d4869eaf2b80f8a66f092cf391ad49f8569e00aa05b33ce5eff68008\": rpc error: code = NotFound desc = could not find container \"270ebcb4d4869eaf2b80f8a66f092cf391ad49f8569e00aa05b33ce5eff68008\": container with ID starting with 270ebcb4d4869eaf2b80f8a66f092cf391ad49f8569e00aa05b33ce5eff68008 not found: ID does 
not exist" Mar 10 09:09:48 crc kubenswrapper[4825]: I0310 09:09:48.413508 4825 scope.go:117] "RemoveContainer" containerID="1e86e213c8681eb2d1a29e8bc9c5d59be1b8d5614b3e174adb6349e50543bbe7" Mar 10 09:09:48 crc kubenswrapper[4825]: E0310 09:09:48.413819 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e86e213c8681eb2d1a29e8bc9c5d59be1b8d5614b3e174adb6349e50543bbe7\": container with ID starting with 1e86e213c8681eb2d1a29e8bc9c5d59be1b8d5614b3e174adb6349e50543bbe7 not found: ID does not exist" containerID="1e86e213c8681eb2d1a29e8bc9c5d59be1b8d5614b3e174adb6349e50543bbe7" Mar 10 09:09:48 crc kubenswrapper[4825]: I0310 09:09:48.413871 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e86e213c8681eb2d1a29e8bc9c5d59be1b8d5614b3e174adb6349e50543bbe7"} err="failed to get container status \"1e86e213c8681eb2d1a29e8bc9c5d59be1b8d5614b3e174adb6349e50543bbe7\": rpc error: code = NotFound desc = could not find container \"1e86e213c8681eb2d1a29e8bc9c5d59be1b8d5614b3e174adb6349e50543bbe7\": container with ID starting with 1e86e213c8681eb2d1a29e8bc9c5d59be1b8d5614b3e174adb6349e50543bbe7 not found: ID does not exist" Mar 10 09:09:48 crc kubenswrapper[4825]: I0310 09:09:48.413888 4825 scope.go:117] "RemoveContainer" containerID="0c6d239c2dc63a3ce31dc597a0f2ee670a7b7fb0cbd94c06b59c7cbcb386f672" Mar 10 09:09:48 crc kubenswrapper[4825]: E0310 09:09:48.414201 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c6d239c2dc63a3ce31dc597a0f2ee670a7b7fb0cbd94c06b59c7cbcb386f672\": container with ID starting with 0c6d239c2dc63a3ce31dc597a0f2ee670a7b7fb0cbd94c06b59c7cbcb386f672 not found: ID does not exist" containerID="0c6d239c2dc63a3ce31dc597a0f2ee670a7b7fb0cbd94c06b59c7cbcb386f672" Mar 10 09:09:48 crc kubenswrapper[4825]: I0310 09:09:48.414251 4825 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c6d239c2dc63a3ce31dc597a0f2ee670a7b7fb0cbd94c06b59c7cbcb386f672"} err="failed to get container status \"0c6d239c2dc63a3ce31dc597a0f2ee670a7b7fb0cbd94c06b59c7cbcb386f672\": rpc error: code = NotFound desc = could not find container \"0c6d239c2dc63a3ce31dc597a0f2ee670a7b7fb0cbd94c06b59c7cbcb386f672\": container with ID starting with 0c6d239c2dc63a3ce31dc597a0f2ee670a7b7fb0cbd94c06b59c7cbcb386f672 not found: ID does not exist" Mar 10 09:09:49 crc kubenswrapper[4825]: I0310 09:09:49.250631 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a489cee-f32c-4cfe-a53b-09a00a8f33dc" path="/var/lib/kubelet/pods/8a489cee-f32c-4cfe-a53b-09a00a8f33dc/volumes" Mar 10 09:10:00 crc kubenswrapper[4825]: I0310 09:10:00.145047 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552230-g4qn2"] Mar 10 09:10:00 crc kubenswrapper[4825]: E0310 09:10:00.146027 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a489cee-f32c-4cfe-a53b-09a00a8f33dc" containerName="extract-content" Mar 10 09:10:00 crc kubenswrapper[4825]: I0310 09:10:00.146039 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a489cee-f32c-4cfe-a53b-09a00a8f33dc" containerName="extract-content" Mar 10 09:10:00 crc kubenswrapper[4825]: E0310 09:10:00.146059 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a489cee-f32c-4cfe-a53b-09a00a8f33dc" containerName="extract-utilities" Mar 10 09:10:00 crc kubenswrapper[4825]: I0310 09:10:00.146066 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a489cee-f32c-4cfe-a53b-09a00a8f33dc" containerName="extract-utilities" Mar 10 09:10:00 crc kubenswrapper[4825]: E0310 09:10:00.146097 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a489cee-f32c-4cfe-a53b-09a00a8f33dc" containerName="registry-server" Mar 10 09:10:00 crc kubenswrapper[4825]: I0310 09:10:00.146103 4825 
state_mem.go:107] "Deleted CPUSet assignment" podUID="8a489cee-f32c-4cfe-a53b-09a00a8f33dc" containerName="registry-server" Mar 10 09:10:00 crc kubenswrapper[4825]: I0310 09:10:00.146329 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a489cee-f32c-4cfe-a53b-09a00a8f33dc" containerName="registry-server" Mar 10 09:10:00 crc kubenswrapper[4825]: I0310 09:10:00.147172 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552230-g4qn2" Mar 10 09:10:00 crc kubenswrapper[4825]: I0310 09:10:00.150196 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:10:00 crc kubenswrapper[4825]: I0310 09:10:00.150324 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 09:10:00 crc kubenswrapper[4825]: I0310 09:10:00.150325 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:10:00 crc kubenswrapper[4825]: I0310 09:10:00.153772 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552230-g4qn2"] Mar 10 09:10:00 crc kubenswrapper[4825]: I0310 09:10:00.169676 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftdfr\" (UniqueName: \"kubernetes.io/projected/e70206fe-4c86-4fb9-987d-2770ffc62adc-kube-api-access-ftdfr\") pod \"auto-csr-approver-29552230-g4qn2\" (UID: \"e70206fe-4c86-4fb9-987d-2770ffc62adc\") " pod="openshift-infra/auto-csr-approver-29552230-g4qn2" Mar 10 09:10:00 crc kubenswrapper[4825]: I0310 09:10:00.271617 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftdfr\" (UniqueName: \"kubernetes.io/projected/e70206fe-4c86-4fb9-987d-2770ffc62adc-kube-api-access-ftdfr\") pod \"auto-csr-approver-29552230-g4qn2\" (UID: 
\"e70206fe-4c86-4fb9-987d-2770ffc62adc\") " pod="openshift-infra/auto-csr-approver-29552230-g4qn2" Mar 10 09:10:00 crc kubenswrapper[4825]: I0310 09:10:00.295760 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftdfr\" (UniqueName: \"kubernetes.io/projected/e70206fe-4c86-4fb9-987d-2770ffc62adc-kube-api-access-ftdfr\") pod \"auto-csr-approver-29552230-g4qn2\" (UID: \"e70206fe-4c86-4fb9-987d-2770ffc62adc\") " pod="openshift-infra/auto-csr-approver-29552230-g4qn2" Mar 10 09:10:00 crc kubenswrapper[4825]: I0310 09:10:00.477923 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552230-g4qn2" Mar 10 09:10:07 crc kubenswrapper[4825]: E0310 09:10:07.453329 4825 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:e43235cb19da04699a53f42b6a75afe9" Mar 10 09:10:07 crc kubenswrapper[4825]: E0310 09:10:07.453807 4825 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:e43235cb19da04699a53f42b6a75afe9" Mar 10 09:10:07 crc kubenswrapper[4825]: E0310 09:10:07.453940 4825 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:e43235cb19da04699a53f42b6a75afe9,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hgb25,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},Livene
ssProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(f26d1ad6-86b3-4fef-84d4-78cd5df47576): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 09:10:07 crc kubenswrapper[4825]: E0310 09:10:07.455379 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="f26d1ad6-86b3-4fef-84d4-78cd5df47576" Mar 10 09:10:07 crc kubenswrapper[4825]: E0310 09:10:07.528623 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:e43235cb19da04699a53f42b6a75afe9\\\"\"" pod="openstack/tempest-tests-tempest" 
podUID="f26d1ad6-86b3-4fef-84d4-78cd5df47576" Mar 10 09:10:08 crc kubenswrapper[4825]: I0310 09:10:08.030345 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552230-g4qn2"] Mar 10 09:10:08 crc kubenswrapper[4825]: W0310 09:10:08.047919 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode70206fe_4c86_4fb9_987d_2770ffc62adc.slice/crio-18ccd0d1a44854414e85b6898af8a5ef23755c196363e5e994a53136bf5561a1 WatchSource:0}: Error finding container 18ccd0d1a44854414e85b6898af8a5ef23755c196363e5e994a53136bf5561a1: Status 404 returned error can't find the container with id 18ccd0d1a44854414e85b6898af8a5ef23755c196363e5e994a53136bf5561a1 Mar 10 09:10:08 crc kubenswrapper[4825]: I0310 09:10:08.540553 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552230-g4qn2" event={"ID":"e70206fe-4c86-4fb9-987d-2770ffc62adc","Type":"ContainerStarted","Data":"18ccd0d1a44854414e85b6898af8a5ef23755c196363e5e994a53136bf5561a1"} Mar 10 09:10:10 crc kubenswrapper[4825]: I0310 09:10:10.562214 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552230-g4qn2" event={"ID":"e70206fe-4c86-4fb9-987d-2770ffc62adc","Type":"ContainerStarted","Data":"06b15fe51986006d832f657be9e352652bf33765297c2bc5720c736697189ba9"} Mar 10 09:10:10 crc kubenswrapper[4825]: I0310 09:10:10.587740 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552230-g4qn2" podStartSLOduration=9.762330276 podStartE2EDuration="10.587723098s" podCreationTimestamp="2026-03-10 09:10:00 +0000 UTC" firstStartedPulling="2026-03-10 09:10:08.055343073 +0000 UTC m=+8761.085123688" lastFinishedPulling="2026-03-10 09:10:08.880735855 +0000 UTC m=+8761.910516510" observedRunningTime="2026-03-10 09:10:10.583161756 +0000 UTC m=+8763.612942371" watchObservedRunningTime="2026-03-10 
09:10:10.587723098 +0000 UTC m=+8763.617503713" Mar 10 09:10:11 crc kubenswrapper[4825]: I0310 09:10:11.572966 4825 generic.go:334] "Generic (PLEG): container finished" podID="e70206fe-4c86-4fb9-987d-2770ffc62adc" containerID="06b15fe51986006d832f657be9e352652bf33765297c2bc5720c736697189ba9" exitCode=0 Mar 10 09:10:11 crc kubenswrapper[4825]: I0310 09:10:11.573103 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552230-g4qn2" event={"ID":"e70206fe-4c86-4fb9-987d-2770ffc62adc","Type":"ContainerDied","Data":"06b15fe51986006d832f657be9e352652bf33765297c2bc5720c736697189ba9"} Mar 10 09:10:12 crc kubenswrapper[4825]: I0310 09:10:12.972912 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552230-g4qn2" Mar 10 09:10:13 crc kubenswrapper[4825]: I0310 09:10:13.043005 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftdfr\" (UniqueName: \"kubernetes.io/projected/e70206fe-4c86-4fb9-987d-2770ffc62adc-kube-api-access-ftdfr\") pod \"e70206fe-4c86-4fb9-987d-2770ffc62adc\" (UID: \"e70206fe-4c86-4fb9-987d-2770ffc62adc\") " Mar 10 09:10:13 crc kubenswrapper[4825]: I0310 09:10:13.047956 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e70206fe-4c86-4fb9-987d-2770ffc62adc-kube-api-access-ftdfr" (OuterVolumeSpecName: "kube-api-access-ftdfr") pod "e70206fe-4c86-4fb9-987d-2770ffc62adc" (UID: "e70206fe-4c86-4fb9-987d-2770ffc62adc"). InnerVolumeSpecName "kube-api-access-ftdfr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:10:13 crc kubenswrapper[4825]: I0310 09:10:13.146231 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftdfr\" (UniqueName: \"kubernetes.io/projected/e70206fe-4c86-4fb9-987d-2770ffc62adc-kube-api-access-ftdfr\") on node \"crc\" DevicePath \"\"" Mar 10 09:10:13 crc kubenswrapper[4825]: I0310 09:10:13.595632 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552230-g4qn2" event={"ID":"e70206fe-4c86-4fb9-987d-2770ffc62adc","Type":"ContainerDied","Data":"18ccd0d1a44854414e85b6898af8a5ef23755c196363e5e994a53136bf5561a1"} Mar 10 09:10:13 crc kubenswrapper[4825]: I0310 09:10:13.595963 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18ccd0d1a44854414e85b6898af8a5ef23755c196363e5e994a53136bf5561a1" Mar 10 09:10:13 crc kubenswrapper[4825]: I0310 09:10:13.595747 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552230-g4qn2" Mar 10 09:10:13 crc kubenswrapper[4825]: I0310 09:10:13.680518 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552224-wn565"] Mar 10 09:10:13 crc kubenswrapper[4825]: I0310 09:10:13.693886 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552224-wn565"] Mar 10 09:10:15 crc kubenswrapper[4825]: I0310 09:10:15.255041 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8b6e419-d1ea-4849-9beb-63fff48ebe84" path="/var/lib/kubelet/pods/e8b6e419-d1ea-4849-9beb-63fff48ebe84/volumes" Mar 10 09:10:16 crc kubenswrapper[4825]: I0310 09:10:16.891714 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 10 09:10:16 crc kubenswrapper[4825]: I0310 09:10:16.892109 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:10:16 crc kubenswrapper[4825]: I0310 09:10:16.892190 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" Mar 10 09:10:16 crc kubenswrapper[4825]: I0310 09:10:16.893033 4825 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"251bcbde3d587c53e13f80f1dc3248b62b51ef9f6f569d929f4325376c5948f9"} pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 09:10:16 crc kubenswrapper[4825]: I0310 09:10:16.893091 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" containerID="cri-o://251bcbde3d587c53e13f80f1dc3248b62b51ef9f6f569d929f4325376c5948f9" gracePeriod=600 Mar 10 09:10:17 crc kubenswrapper[4825]: I0310 09:10:17.642361 4825 generic.go:334] "Generic (PLEG): container finished" podID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerID="251bcbde3d587c53e13f80f1dc3248b62b51ef9f6f569d929f4325376c5948f9" exitCode=0 Mar 10 09:10:17 crc kubenswrapper[4825]: I0310 09:10:17.642467 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" 
event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerDied","Data":"251bcbde3d587c53e13f80f1dc3248b62b51ef9f6f569d929f4325376c5948f9"} Mar 10 09:10:17 crc kubenswrapper[4825]: I0310 09:10:17.642856 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerStarted","Data":"f110d54ce6d62165e047609d778f750d4e0c9b3a0b3dce9f99cbfaef3a90305f"} Mar 10 09:10:17 crc kubenswrapper[4825]: I0310 09:10:17.642910 4825 scope.go:117] "RemoveContainer" containerID="390899e144160760e4121c06e9f2e317e0be0012bf795558da227f948f1f1b1f" Mar 10 09:10:20 crc kubenswrapper[4825]: I0310 09:10:20.463671 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 10 09:10:22 crc kubenswrapper[4825]: I0310 09:10:22.714721 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"f26d1ad6-86b3-4fef-84d4-78cd5df47576","Type":"ContainerStarted","Data":"a0cfb3e5f662ad76d0b02bdf85ceb7a19f5c0a6ffb0a5f8f43e9f1c5c2ae9c9c"} Mar 10 09:10:22 crc kubenswrapper[4825]: I0310 09:10:22.754087 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.675171369 podStartE2EDuration="1m6.754064052s" podCreationTimestamp="2026-03-10 09:09:16 +0000 UTC" firstStartedPulling="2026-03-10 09:09:18.381531299 +0000 UTC m=+8711.411311914" lastFinishedPulling="2026-03-10 09:10:20.460423952 +0000 UTC m=+8773.490204597" observedRunningTime="2026-03-10 09:10:22.746738007 +0000 UTC m=+8775.776518632" watchObservedRunningTime="2026-03-10 09:10:22.754064052 +0000 UTC m=+8775.783844667" Mar 10 09:10:35 crc kubenswrapper[4825]: I0310 09:10:35.383869 4825 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-6d6b57675b-q5h4k" podUID="8b67c6af-7ab6-4359-b2c7-c3e8b57fc722" 
containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Mar 10 09:11:07 crc kubenswrapper[4825]: I0310 09:11:07.392693 4825 scope.go:117] "RemoveContainer" containerID="b23a2c769c7415aa1704d759934a888fec13771b0003794db53d47b5b3922b05" Mar 10 09:12:00 crc kubenswrapper[4825]: I0310 09:12:00.146709 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552232-tqmr6"] Mar 10 09:12:00 crc kubenswrapper[4825]: E0310 09:12:00.147679 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e70206fe-4c86-4fb9-987d-2770ffc62adc" containerName="oc" Mar 10 09:12:00 crc kubenswrapper[4825]: I0310 09:12:00.147696 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="e70206fe-4c86-4fb9-987d-2770ffc62adc" containerName="oc" Mar 10 09:12:00 crc kubenswrapper[4825]: I0310 09:12:00.147943 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="e70206fe-4c86-4fb9-987d-2770ffc62adc" containerName="oc" Mar 10 09:12:00 crc kubenswrapper[4825]: I0310 09:12:00.148735 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552232-tqmr6" Mar 10 09:12:00 crc kubenswrapper[4825]: I0310 09:12:00.152318 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:12:00 crc kubenswrapper[4825]: I0310 09:12:00.152933 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:12:00 crc kubenswrapper[4825]: I0310 09:12:00.153092 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 09:12:00 crc kubenswrapper[4825]: I0310 09:12:00.174054 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552232-tqmr6"] Mar 10 09:12:00 crc kubenswrapper[4825]: I0310 09:12:00.253418 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b4b4\" (UniqueName: \"kubernetes.io/projected/35e63c43-e67b-4d24-ad9a-428cac91eb29-kube-api-access-2b4b4\") pod \"auto-csr-approver-29552232-tqmr6\" (UID: \"35e63c43-e67b-4d24-ad9a-428cac91eb29\") " pod="openshift-infra/auto-csr-approver-29552232-tqmr6" Mar 10 09:12:00 crc kubenswrapper[4825]: I0310 09:12:00.355469 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2b4b4\" (UniqueName: \"kubernetes.io/projected/35e63c43-e67b-4d24-ad9a-428cac91eb29-kube-api-access-2b4b4\") pod \"auto-csr-approver-29552232-tqmr6\" (UID: \"35e63c43-e67b-4d24-ad9a-428cac91eb29\") " pod="openshift-infra/auto-csr-approver-29552232-tqmr6" Mar 10 09:12:00 crc kubenswrapper[4825]: I0310 09:12:00.373594 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b4b4\" (UniqueName: \"kubernetes.io/projected/35e63c43-e67b-4d24-ad9a-428cac91eb29-kube-api-access-2b4b4\") pod \"auto-csr-approver-29552232-tqmr6\" (UID: \"35e63c43-e67b-4d24-ad9a-428cac91eb29\") " 
pod="openshift-infra/auto-csr-approver-29552232-tqmr6" Mar 10 09:12:00 crc kubenswrapper[4825]: I0310 09:12:00.468331 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552232-tqmr6" Mar 10 09:12:01 crc kubenswrapper[4825]: I0310 09:12:01.111391 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552232-tqmr6"] Mar 10 09:12:01 crc kubenswrapper[4825]: I0310 09:12:01.667446 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552232-tqmr6" event={"ID":"35e63c43-e67b-4d24-ad9a-428cac91eb29","Type":"ContainerStarted","Data":"7a5608e736f722c08ae2a87fdebafd58df7d3f178f1b5f6272bc95c20b409085"} Mar 10 09:12:03 crc kubenswrapper[4825]: I0310 09:12:03.685965 4825 generic.go:334] "Generic (PLEG): container finished" podID="35e63c43-e67b-4d24-ad9a-428cac91eb29" containerID="9b34c7a02375c1810baf0f58f483f3f053ac244c575a043379de66cf22235952" exitCode=0 Mar 10 09:12:03 crc kubenswrapper[4825]: I0310 09:12:03.686009 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552232-tqmr6" event={"ID":"35e63c43-e67b-4d24-ad9a-428cac91eb29","Type":"ContainerDied","Data":"9b34c7a02375c1810baf0f58f483f3f053ac244c575a043379de66cf22235952"} Mar 10 09:12:05 crc kubenswrapper[4825]: I0310 09:12:05.323791 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552232-tqmr6" Mar 10 09:12:05 crc kubenswrapper[4825]: I0310 09:12:05.355914 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2b4b4\" (UniqueName: \"kubernetes.io/projected/35e63c43-e67b-4d24-ad9a-428cac91eb29-kube-api-access-2b4b4\") pod \"35e63c43-e67b-4d24-ad9a-428cac91eb29\" (UID: \"35e63c43-e67b-4d24-ad9a-428cac91eb29\") " Mar 10 09:12:05 crc kubenswrapper[4825]: I0310 09:12:05.362390 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35e63c43-e67b-4d24-ad9a-428cac91eb29-kube-api-access-2b4b4" (OuterVolumeSpecName: "kube-api-access-2b4b4") pod "35e63c43-e67b-4d24-ad9a-428cac91eb29" (UID: "35e63c43-e67b-4d24-ad9a-428cac91eb29"). InnerVolumeSpecName "kube-api-access-2b4b4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:12:05 crc kubenswrapper[4825]: I0310 09:12:05.459092 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2b4b4\" (UniqueName: \"kubernetes.io/projected/35e63c43-e67b-4d24-ad9a-428cac91eb29-kube-api-access-2b4b4\") on node \"crc\" DevicePath \"\"" Mar 10 09:12:05 crc kubenswrapper[4825]: I0310 09:12:05.714557 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552232-tqmr6" event={"ID":"35e63c43-e67b-4d24-ad9a-428cac91eb29","Type":"ContainerDied","Data":"7a5608e736f722c08ae2a87fdebafd58df7d3f178f1b5f6272bc95c20b409085"} Mar 10 09:12:05 crc kubenswrapper[4825]: I0310 09:12:05.714818 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a5608e736f722c08ae2a87fdebafd58df7d3f178f1b5f6272bc95c20b409085" Mar 10 09:12:05 crc kubenswrapper[4825]: I0310 09:12:05.714786 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552232-tqmr6" Mar 10 09:12:06 crc kubenswrapper[4825]: I0310 09:12:06.396242 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552226-kcfrv"] Mar 10 09:12:06 crc kubenswrapper[4825]: I0310 09:12:06.405392 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552226-kcfrv"] Mar 10 09:12:07 crc kubenswrapper[4825]: I0310 09:12:07.247699 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dad9b10-003f-4255-bc72-577855c4d316" path="/var/lib/kubelet/pods/3dad9b10-003f-4255-bc72-577855c4d316/volumes" Mar 10 09:12:07 crc kubenswrapper[4825]: I0310 09:12:07.560482 4825 scope.go:117] "RemoveContainer" containerID="6a995efe68a69db809bbeb1a620d0d37bb0376615a600b7c9176732b527312e6" Mar 10 09:12:07 crc kubenswrapper[4825]: I0310 09:12:07.587230 4825 scope.go:117] "RemoveContainer" containerID="2c2674394a78a3f4205b1ca5fe4f140061af05c7dd28181ce251132e1b556a9e" Mar 10 09:12:07 crc kubenswrapper[4825]: I0310 09:12:07.607728 4825 scope.go:117] "RemoveContainer" containerID="85a3c232edaabb6a2ec5602b4e9d02bcfaed7293b5fb03838d4d7dd6ae609235" Mar 10 09:12:07 crc kubenswrapper[4825]: I0310 09:12:07.671968 4825 scope.go:117] "RemoveContainer" containerID="5b0ba5f83999c24f337b2687e98d769ff2a37647e96dd37d823acecc8998a50c" Mar 10 09:12:46 crc kubenswrapper[4825]: I0310 09:12:46.887772 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:12:46 crc kubenswrapper[4825]: I0310 09:12:46.888320 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:13:16 crc kubenswrapper[4825]: I0310 09:13:16.890597 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:13:16 crc kubenswrapper[4825]: I0310 09:13:16.891083 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:13:46 crc kubenswrapper[4825]: I0310 09:13:46.888621 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:13:46 crc kubenswrapper[4825]: I0310 09:13:46.889127 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:13:46 crc kubenswrapper[4825]: I0310 09:13:46.889191 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" Mar 10 09:13:46 crc kubenswrapper[4825]: I0310 09:13:46.889875 4825 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f110d54ce6d62165e047609d778f750d4e0c9b3a0b3dce9f99cbfaef3a90305f"} pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 09:13:46 crc kubenswrapper[4825]: I0310 09:13:46.889919 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" containerID="cri-o://f110d54ce6d62165e047609d778f750d4e0c9b3a0b3dce9f99cbfaef3a90305f" gracePeriod=600 Mar 10 09:13:47 crc kubenswrapper[4825]: E0310 09:13:47.011163 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:13:47 crc kubenswrapper[4825]: I0310 09:13:47.695120 4825 generic.go:334] "Generic (PLEG): container finished" podID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerID="f110d54ce6d62165e047609d778f750d4e0c9b3a0b3dce9f99cbfaef3a90305f" exitCode=0 Mar 10 09:13:47 crc kubenswrapper[4825]: I0310 09:13:47.695507 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerDied","Data":"f110d54ce6d62165e047609d778f750d4e0c9b3a0b3dce9f99cbfaef3a90305f"} Mar 10 09:13:47 crc kubenswrapper[4825]: I0310 09:13:47.695545 4825 scope.go:117] "RemoveContainer" containerID="251bcbde3d587c53e13f80f1dc3248b62b51ef9f6f569d929f4325376c5948f9" Mar 10 09:13:47 crc 
kubenswrapper[4825]: I0310 09:13:47.696251 4825 scope.go:117] "RemoveContainer" containerID="f110d54ce6d62165e047609d778f750d4e0c9b3a0b3dce9f99cbfaef3a90305f" Mar 10 09:13:47 crc kubenswrapper[4825]: E0310 09:13:47.696706 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:13:59 crc kubenswrapper[4825]: I0310 09:13:59.243210 4825 scope.go:117] "RemoveContainer" containerID="f110d54ce6d62165e047609d778f750d4e0c9b3a0b3dce9f99cbfaef3a90305f" Mar 10 09:13:59 crc kubenswrapper[4825]: E0310 09:13:59.244058 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:14:00 crc kubenswrapper[4825]: I0310 09:14:00.185381 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552234-8shqx"] Mar 10 09:14:00 crc kubenswrapper[4825]: E0310 09:14:00.186159 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35e63c43-e67b-4d24-ad9a-428cac91eb29" containerName="oc" Mar 10 09:14:00 crc kubenswrapper[4825]: I0310 09:14:00.186182 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="35e63c43-e67b-4d24-ad9a-428cac91eb29" containerName="oc" Mar 10 09:14:00 crc kubenswrapper[4825]: I0310 09:14:00.186404 4825 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="35e63c43-e67b-4d24-ad9a-428cac91eb29" containerName="oc" Mar 10 09:14:00 crc kubenswrapper[4825]: I0310 09:14:00.187125 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552234-8shqx" Mar 10 09:14:00 crc kubenswrapper[4825]: I0310 09:14:00.189462 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:14:00 crc kubenswrapper[4825]: I0310 09:14:00.189810 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:14:00 crc kubenswrapper[4825]: I0310 09:14:00.190050 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 09:14:00 crc kubenswrapper[4825]: I0310 09:14:00.201848 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552234-8shqx"] Mar 10 09:14:00 crc kubenswrapper[4825]: I0310 09:14:00.349405 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcknn\" (UniqueName: \"kubernetes.io/projected/fc431574-9341-47a0-88f3-00ff2fd2ff18-kube-api-access-lcknn\") pod \"auto-csr-approver-29552234-8shqx\" (UID: \"fc431574-9341-47a0-88f3-00ff2fd2ff18\") " pod="openshift-infra/auto-csr-approver-29552234-8shqx" Mar 10 09:14:00 crc kubenswrapper[4825]: I0310 09:14:00.451371 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcknn\" (UniqueName: \"kubernetes.io/projected/fc431574-9341-47a0-88f3-00ff2fd2ff18-kube-api-access-lcknn\") pod \"auto-csr-approver-29552234-8shqx\" (UID: \"fc431574-9341-47a0-88f3-00ff2fd2ff18\") " pod="openshift-infra/auto-csr-approver-29552234-8shqx" Mar 10 09:14:00 crc kubenswrapper[4825]: I0310 09:14:00.471474 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcknn\" (UniqueName: 
\"kubernetes.io/projected/fc431574-9341-47a0-88f3-00ff2fd2ff18-kube-api-access-lcknn\") pod \"auto-csr-approver-29552234-8shqx\" (UID: \"fc431574-9341-47a0-88f3-00ff2fd2ff18\") " pod="openshift-infra/auto-csr-approver-29552234-8shqx" Mar 10 09:14:00 crc kubenswrapper[4825]: I0310 09:14:00.505077 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552234-8shqx" Mar 10 09:14:01 crc kubenswrapper[4825]: I0310 09:14:01.043564 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552234-8shqx"] Mar 10 09:14:01 crc kubenswrapper[4825]: I0310 09:14:01.841669 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552234-8shqx" event={"ID":"fc431574-9341-47a0-88f3-00ff2fd2ff18","Type":"ContainerStarted","Data":"356cd0e6aa869f4900dd70c932501de8d116342b98a08174a1b195647b1b29c7"} Mar 10 09:14:02 crc kubenswrapper[4825]: I0310 09:14:02.851770 4825 generic.go:334] "Generic (PLEG): container finished" podID="fc431574-9341-47a0-88f3-00ff2fd2ff18" containerID="d32c990b52e32d4dec37a049a4bb75bfb514d34005170a26c9ba9eac6d3013d1" exitCode=0 Mar 10 09:14:02 crc kubenswrapper[4825]: I0310 09:14:02.851862 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552234-8shqx" event={"ID":"fc431574-9341-47a0-88f3-00ff2fd2ff18","Type":"ContainerDied","Data":"d32c990b52e32d4dec37a049a4bb75bfb514d34005170a26c9ba9eac6d3013d1"} Mar 10 09:14:04 crc kubenswrapper[4825]: I0310 09:14:04.383579 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552234-8shqx" Mar 10 09:14:04 crc kubenswrapper[4825]: I0310 09:14:04.539792 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcknn\" (UniqueName: \"kubernetes.io/projected/fc431574-9341-47a0-88f3-00ff2fd2ff18-kube-api-access-lcknn\") pod \"fc431574-9341-47a0-88f3-00ff2fd2ff18\" (UID: \"fc431574-9341-47a0-88f3-00ff2fd2ff18\") " Mar 10 09:14:04 crc kubenswrapper[4825]: I0310 09:14:04.548784 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc431574-9341-47a0-88f3-00ff2fd2ff18-kube-api-access-lcknn" (OuterVolumeSpecName: "kube-api-access-lcknn") pod "fc431574-9341-47a0-88f3-00ff2fd2ff18" (UID: "fc431574-9341-47a0-88f3-00ff2fd2ff18"). InnerVolumeSpecName "kube-api-access-lcknn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:14:04 crc kubenswrapper[4825]: I0310 09:14:04.642297 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcknn\" (UniqueName: \"kubernetes.io/projected/fc431574-9341-47a0-88f3-00ff2fd2ff18-kube-api-access-lcknn\") on node \"crc\" DevicePath \"\"" Mar 10 09:14:04 crc kubenswrapper[4825]: I0310 09:14:04.902593 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552234-8shqx" event={"ID":"fc431574-9341-47a0-88f3-00ff2fd2ff18","Type":"ContainerDied","Data":"356cd0e6aa869f4900dd70c932501de8d116342b98a08174a1b195647b1b29c7"} Mar 10 09:14:04 crc kubenswrapper[4825]: I0310 09:14:04.902878 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="356cd0e6aa869f4900dd70c932501de8d116342b98a08174a1b195647b1b29c7" Mar 10 09:14:04 crc kubenswrapper[4825]: I0310 09:14:04.902805 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552234-8shqx" Mar 10 09:14:05 crc kubenswrapper[4825]: I0310 09:14:05.455053 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552228-v247f"] Mar 10 09:14:05 crc kubenswrapper[4825]: I0310 09:14:05.464904 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552228-v247f"] Mar 10 09:14:07 crc kubenswrapper[4825]: I0310 09:14:07.246595 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03b01350-dc59-49db-a606-9abd90e1bf11" path="/var/lib/kubelet/pods/03b01350-dc59-49db-a606-9abd90e1bf11/volumes" Mar 10 09:14:07 crc kubenswrapper[4825]: I0310 09:14:07.794509 4825 scope.go:117] "RemoveContainer" containerID="29138918703f351cfacad81471f690ce74f3560f53ba0eac482c466e26a1194b" Mar 10 09:14:10 crc kubenswrapper[4825]: I0310 09:14:10.237186 4825 scope.go:117] "RemoveContainer" containerID="f110d54ce6d62165e047609d778f750d4e0c9b3a0b3dce9f99cbfaef3a90305f" Mar 10 09:14:10 crc kubenswrapper[4825]: E0310 09:14:10.238203 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:14:22 crc kubenswrapper[4825]: I0310 09:14:22.236601 4825 scope.go:117] "RemoveContainer" containerID="f110d54ce6d62165e047609d778f750d4e0c9b3a0b3dce9f99cbfaef3a90305f" Mar 10 09:14:22 crc kubenswrapper[4825]: E0310 09:14:22.237365 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:14:34 crc kubenswrapper[4825]: I0310 09:14:34.236521 4825 scope.go:117] "RemoveContainer" containerID="f110d54ce6d62165e047609d778f750d4e0c9b3a0b3dce9f99cbfaef3a90305f" Mar 10 09:14:34 crc kubenswrapper[4825]: E0310 09:14:34.237275 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:14:46 crc kubenswrapper[4825]: I0310 09:14:46.236935 4825 scope.go:117] "RemoveContainer" containerID="f110d54ce6d62165e047609d778f750d4e0c9b3a0b3dce9f99cbfaef3a90305f" Mar 10 09:14:46 crc kubenswrapper[4825]: E0310 09:14:46.237883 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:15:00 crc kubenswrapper[4825]: I0310 09:15:00.164819 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552235-q8vnf"] Mar 10 09:15:00 crc kubenswrapper[4825]: E0310 09:15:00.165748 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc431574-9341-47a0-88f3-00ff2fd2ff18" containerName="oc" Mar 10 09:15:00 crc kubenswrapper[4825]: I0310 
09:15:00.165762 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc431574-9341-47a0-88f3-00ff2fd2ff18" containerName="oc" Mar 10 09:15:00 crc kubenswrapper[4825]: I0310 09:15:00.166049 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc431574-9341-47a0-88f3-00ff2fd2ff18" containerName="oc" Mar 10 09:15:00 crc kubenswrapper[4825]: I0310 09:15:00.166863 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-q8vnf" Mar 10 09:15:00 crc kubenswrapper[4825]: I0310 09:15:00.168817 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 09:15:00 crc kubenswrapper[4825]: I0310 09:15:00.171410 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 09:15:00 crc kubenswrapper[4825]: I0310 09:15:00.187025 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552235-q8vnf"] Mar 10 09:15:00 crc kubenswrapper[4825]: I0310 09:15:00.324486 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9958\" (UniqueName: \"kubernetes.io/projected/3a3adfc2-3d10-417d-bfac-a029f7ba2525-kube-api-access-c9958\") pod \"collect-profiles-29552235-q8vnf\" (UID: \"3a3adfc2-3d10-417d-bfac-a029f7ba2525\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-q8vnf" Mar 10 09:15:00 crc kubenswrapper[4825]: I0310 09:15:00.324624 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a3adfc2-3d10-417d-bfac-a029f7ba2525-config-volume\") pod \"collect-profiles-29552235-q8vnf\" (UID: \"3a3adfc2-3d10-417d-bfac-a029f7ba2525\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-q8vnf" Mar 10 09:15:00 crc kubenswrapper[4825]: I0310 09:15:00.324808 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a3adfc2-3d10-417d-bfac-a029f7ba2525-secret-volume\") pod \"collect-profiles-29552235-q8vnf\" (UID: \"3a3adfc2-3d10-417d-bfac-a029f7ba2525\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-q8vnf" Mar 10 09:15:00 crc kubenswrapper[4825]: I0310 09:15:00.426990 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9958\" (UniqueName: \"kubernetes.io/projected/3a3adfc2-3d10-417d-bfac-a029f7ba2525-kube-api-access-c9958\") pod \"collect-profiles-29552235-q8vnf\" (UID: \"3a3adfc2-3d10-417d-bfac-a029f7ba2525\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-q8vnf" Mar 10 09:15:00 crc kubenswrapper[4825]: I0310 09:15:00.427074 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a3adfc2-3d10-417d-bfac-a029f7ba2525-config-volume\") pod \"collect-profiles-29552235-q8vnf\" (UID: \"3a3adfc2-3d10-417d-bfac-a029f7ba2525\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-q8vnf" Mar 10 09:15:00 crc kubenswrapper[4825]: I0310 09:15:00.427164 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a3adfc2-3d10-417d-bfac-a029f7ba2525-secret-volume\") pod \"collect-profiles-29552235-q8vnf\" (UID: \"3a3adfc2-3d10-417d-bfac-a029f7ba2525\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-q8vnf" Mar 10 09:15:00 crc kubenswrapper[4825]: I0310 09:15:00.428305 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/3a3adfc2-3d10-417d-bfac-a029f7ba2525-config-volume\") pod \"collect-profiles-29552235-q8vnf\" (UID: \"3a3adfc2-3d10-417d-bfac-a029f7ba2525\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-q8vnf" Mar 10 09:15:00 crc kubenswrapper[4825]: I0310 09:15:00.439929 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a3adfc2-3d10-417d-bfac-a029f7ba2525-secret-volume\") pod \"collect-profiles-29552235-q8vnf\" (UID: \"3a3adfc2-3d10-417d-bfac-a029f7ba2525\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-q8vnf" Mar 10 09:15:00 crc kubenswrapper[4825]: I0310 09:15:00.452100 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9958\" (UniqueName: \"kubernetes.io/projected/3a3adfc2-3d10-417d-bfac-a029f7ba2525-kube-api-access-c9958\") pod \"collect-profiles-29552235-q8vnf\" (UID: \"3a3adfc2-3d10-417d-bfac-a029f7ba2525\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-q8vnf" Mar 10 09:15:00 crc kubenswrapper[4825]: I0310 09:15:00.497878 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-q8vnf" Mar 10 09:15:01 crc kubenswrapper[4825]: I0310 09:15:01.040374 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552235-q8vnf"] Mar 10 09:15:01 crc kubenswrapper[4825]: I0310 09:15:01.237920 4825 scope.go:117] "RemoveContainer" containerID="f110d54ce6d62165e047609d778f750d4e0c9b3a0b3dce9f99cbfaef3a90305f" Mar 10 09:15:01 crc kubenswrapper[4825]: E0310 09:15:01.238344 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:15:01 crc kubenswrapper[4825]: W0310 09:15:01.465119 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a3adfc2_3d10_417d_bfac_a029f7ba2525.slice/crio-a17d10491c55a7ad096da3fa8792ca073406dc2c95371e38c7c728a8124f4bdc WatchSource:0}: Error finding container a17d10491c55a7ad096da3fa8792ca073406dc2c95371e38c7c728a8124f4bdc: Status 404 returned error can't find the container with id a17d10491c55a7ad096da3fa8792ca073406dc2c95371e38c7c728a8124f4bdc Mar 10 09:15:02 crc kubenswrapper[4825]: I0310 09:15:02.445050 4825 generic.go:334] "Generic (PLEG): container finished" podID="3a3adfc2-3d10-417d-bfac-a029f7ba2525" containerID="c73ac7d99bc1990df9794d25d4a5b253127dafdd3e69ac4ad21c7d1556005cb7" exitCode=0 Mar 10 09:15:02 crc kubenswrapper[4825]: I0310 09:15:02.445149 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-q8vnf" 
event={"ID":"3a3adfc2-3d10-417d-bfac-a029f7ba2525","Type":"ContainerDied","Data":"c73ac7d99bc1990df9794d25d4a5b253127dafdd3e69ac4ad21c7d1556005cb7"} Mar 10 09:15:02 crc kubenswrapper[4825]: I0310 09:15:02.445382 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-q8vnf" event={"ID":"3a3adfc2-3d10-417d-bfac-a029f7ba2525","Type":"ContainerStarted","Data":"a17d10491c55a7ad096da3fa8792ca073406dc2c95371e38c7c728a8124f4bdc"} Mar 10 09:15:04 crc kubenswrapper[4825]: I0310 09:15:04.045382 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-q8vnf" Mar 10 09:15:04 crc kubenswrapper[4825]: I0310 09:15:04.125480 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9958\" (UniqueName: \"kubernetes.io/projected/3a3adfc2-3d10-417d-bfac-a029f7ba2525-kube-api-access-c9958\") pod \"3a3adfc2-3d10-417d-bfac-a029f7ba2525\" (UID: \"3a3adfc2-3d10-417d-bfac-a029f7ba2525\") " Mar 10 09:15:04 crc kubenswrapper[4825]: I0310 09:15:04.125610 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a3adfc2-3d10-417d-bfac-a029f7ba2525-secret-volume\") pod \"3a3adfc2-3d10-417d-bfac-a029f7ba2525\" (UID: \"3a3adfc2-3d10-417d-bfac-a029f7ba2525\") " Mar 10 09:15:04 crc kubenswrapper[4825]: I0310 09:15:04.126636 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a3adfc2-3d10-417d-bfac-a029f7ba2525-config-volume\") pod \"3a3adfc2-3d10-417d-bfac-a029f7ba2525\" (UID: \"3a3adfc2-3d10-417d-bfac-a029f7ba2525\") " Mar 10 09:15:04 crc kubenswrapper[4825]: I0310 09:15:04.127785 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a3adfc2-3d10-417d-bfac-a029f7ba2525-config-volume" 
(OuterVolumeSpecName: "config-volume") pod "3a3adfc2-3d10-417d-bfac-a029f7ba2525" (UID: "3a3adfc2-3d10-417d-bfac-a029f7ba2525"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:15:04 crc kubenswrapper[4825]: I0310 09:15:04.143703 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a3adfc2-3d10-417d-bfac-a029f7ba2525-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3a3adfc2-3d10-417d-bfac-a029f7ba2525" (UID: "3a3adfc2-3d10-417d-bfac-a029f7ba2525"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:15:04 crc kubenswrapper[4825]: I0310 09:15:04.144036 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a3adfc2-3d10-417d-bfac-a029f7ba2525-kube-api-access-c9958" (OuterVolumeSpecName: "kube-api-access-c9958") pod "3a3adfc2-3d10-417d-bfac-a029f7ba2525" (UID: "3a3adfc2-3d10-417d-bfac-a029f7ba2525"). InnerVolumeSpecName "kube-api-access-c9958". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:15:04 crc kubenswrapper[4825]: I0310 09:15:04.229831 4825 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a3adfc2-3d10-417d-bfac-a029f7ba2525-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 09:15:04 crc kubenswrapper[4825]: I0310 09:15:04.229889 4825 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a3adfc2-3d10-417d-bfac-a029f7ba2525-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 09:15:04 crc kubenswrapper[4825]: I0310 09:15:04.229903 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9958\" (UniqueName: \"kubernetes.io/projected/3a3adfc2-3d10-417d-bfac-a029f7ba2525-kube-api-access-c9958\") on node \"crc\" DevicePath \"\"" Mar 10 09:15:04 crc kubenswrapper[4825]: I0310 09:15:04.468393 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-q8vnf" event={"ID":"3a3adfc2-3d10-417d-bfac-a029f7ba2525","Type":"ContainerDied","Data":"a17d10491c55a7ad096da3fa8792ca073406dc2c95371e38c7c728a8124f4bdc"} Mar 10 09:15:04 crc kubenswrapper[4825]: I0310 09:15:04.468731 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a17d10491c55a7ad096da3fa8792ca073406dc2c95371e38c7c728a8124f4bdc" Mar 10 09:15:04 crc kubenswrapper[4825]: I0310 09:15:04.468562 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552235-q8vnf" Mar 10 09:15:05 crc kubenswrapper[4825]: I0310 09:15:05.129497 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552190-lbl4m"] Mar 10 09:15:05 crc kubenswrapper[4825]: I0310 09:15:05.138900 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552190-lbl4m"] Mar 10 09:15:05 crc kubenswrapper[4825]: I0310 09:15:05.247039 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba915940-2e22-4fcb-92f5-b99de841c175" path="/var/lib/kubelet/pods/ba915940-2e22-4fcb-92f5-b99de841c175/volumes" Mar 10 09:15:07 crc kubenswrapper[4825]: I0310 09:15:07.931059 4825 scope.go:117] "RemoveContainer" containerID="b9a400ab978e78670446ab782d68132aa6af9bb7536e009a894c3e487358d0b7" Mar 10 09:15:13 crc kubenswrapper[4825]: I0310 09:15:13.236861 4825 scope.go:117] "RemoveContainer" containerID="f110d54ce6d62165e047609d778f750d4e0c9b3a0b3dce9f99cbfaef3a90305f" Mar 10 09:15:13 crc kubenswrapper[4825]: E0310 09:15:13.237629 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:15:25 crc kubenswrapper[4825]: I0310 09:15:25.238245 4825 scope.go:117] "RemoveContainer" containerID="f110d54ce6d62165e047609d778f750d4e0c9b3a0b3dce9f99cbfaef3a90305f" Mar 10 09:15:25 crc kubenswrapper[4825]: E0310 09:15:25.240385 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:15:40 crc kubenswrapper[4825]: I0310 09:15:40.237041 4825 scope.go:117] "RemoveContainer" containerID="f110d54ce6d62165e047609d778f750d4e0c9b3a0b3dce9f99cbfaef3a90305f" Mar 10 09:15:40 crc kubenswrapper[4825]: E0310 09:15:40.237796 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:15:47 crc kubenswrapper[4825]: I0310 09:15:47.319278 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ssljg"] Mar 10 09:15:47 crc kubenswrapper[4825]: E0310 09:15:47.320383 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a3adfc2-3d10-417d-bfac-a029f7ba2525" containerName="collect-profiles" Mar 10 09:15:47 crc kubenswrapper[4825]: I0310 09:15:47.320398 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a3adfc2-3d10-417d-bfac-a029f7ba2525" containerName="collect-profiles" Mar 10 09:15:47 crc kubenswrapper[4825]: I0310 09:15:47.320655 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a3adfc2-3d10-417d-bfac-a029f7ba2525" containerName="collect-profiles" Mar 10 09:15:47 crc kubenswrapper[4825]: I0310 09:15:47.322330 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ssljg" Mar 10 09:15:47 crc kubenswrapper[4825]: I0310 09:15:47.363676 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ssljg"] Mar 10 09:15:47 crc kubenswrapper[4825]: I0310 09:15:47.452063 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ed0e09d-4c76-4ebf-b9b1-f7f581de225c-catalog-content\") pod \"redhat-marketplace-ssljg\" (UID: \"0ed0e09d-4c76-4ebf-b9b1-f7f581de225c\") " pod="openshift-marketplace/redhat-marketplace-ssljg" Mar 10 09:15:47 crc kubenswrapper[4825]: I0310 09:15:47.452303 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ed0e09d-4c76-4ebf-b9b1-f7f581de225c-utilities\") pod \"redhat-marketplace-ssljg\" (UID: \"0ed0e09d-4c76-4ebf-b9b1-f7f581de225c\") " pod="openshift-marketplace/redhat-marketplace-ssljg" Mar 10 09:15:47 crc kubenswrapper[4825]: I0310 09:15:47.452331 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nmr4\" (UniqueName: \"kubernetes.io/projected/0ed0e09d-4c76-4ebf-b9b1-f7f581de225c-kube-api-access-8nmr4\") pod \"redhat-marketplace-ssljg\" (UID: \"0ed0e09d-4c76-4ebf-b9b1-f7f581de225c\") " pod="openshift-marketplace/redhat-marketplace-ssljg" Mar 10 09:15:47 crc kubenswrapper[4825]: I0310 09:15:47.554440 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ed0e09d-4c76-4ebf-b9b1-f7f581de225c-catalog-content\") pod \"redhat-marketplace-ssljg\" (UID: \"0ed0e09d-4c76-4ebf-b9b1-f7f581de225c\") " pod="openshift-marketplace/redhat-marketplace-ssljg" Mar 10 09:15:47 crc kubenswrapper[4825]: I0310 09:15:47.554687 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ed0e09d-4c76-4ebf-b9b1-f7f581de225c-utilities\") pod \"redhat-marketplace-ssljg\" (UID: \"0ed0e09d-4c76-4ebf-b9b1-f7f581de225c\") " pod="openshift-marketplace/redhat-marketplace-ssljg" Mar 10 09:15:47 crc kubenswrapper[4825]: I0310 09:15:47.554726 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nmr4\" (UniqueName: \"kubernetes.io/projected/0ed0e09d-4c76-4ebf-b9b1-f7f581de225c-kube-api-access-8nmr4\") pod \"redhat-marketplace-ssljg\" (UID: \"0ed0e09d-4c76-4ebf-b9b1-f7f581de225c\") " pod="openshift-marketplace/redhat-marketplace-ssljg" Mar 10 09:15:47 crc kubenswrapper[4825]: I0310 09:15:47.555061 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ed0e09d-4c76-4ebf-b9b1-f7f581de225c-catalog-content\") pod \"redhat-marketplace-ssljg\" (UID: \"0ed0e09d-4c76-4ebf-b9b1-f7f581de225c\") " pod="openshift-marketplace/redhat-marketplace-ssljg" Mar 10 09:15:47 crc kubenswrapper[4825]: I0310 09:15:47.555286 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ed0e09d-4c76-4ebf-b9b1-f7f581de225c-utilities\") pod \"redhat-marketplace-ssljg\" (UID: \"0ed0e09d-4c76-4ebf-b9b1-f7f581de225c\") " pod="openshift-marketplace/redhat-marketplace-ssljg" Mar 10 09:15:47 crc kubenswrapper[4825]: I0310 09:15:47.658994 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nmr4\" (UniqueName: \"kubernetes.io/projected/0ed0e09d-4c76-4ebf-b9b1-f7f581de225c-kube-api-access-8nmr4\") pod \"redhat-marketplace-ssljg\" (UID: \"0ed0e09d-4c76-4ebf-b9b1-f7f581de225c\") " pod="openshift-marketplace/redhat-marketplace-ssljg" Mar 10 09:15:47 crc kubenswrapper[4825]: I0310 09:15:47.949854 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ssljg" Mar 10 09:15:48 crc kubenswrapper[4825]: I0310 09:15:48.509402 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ssljg"] Mar 10 09:15:48 crc kubenswrapper[4825]: I0310 09:15:48.909154 4825 generic.go:334] "Generic (PLEG): container finished" podID="0ed0e09d-4c76-4ebf-b9b1-f7f581de225c" containerID="d73931dfe12b0f2146de4337cbaefb7e3e973ba650bd1e4007fd55373a8f5347" exitCode=0 Mar 10 09:15:48 crc kubenswrapper[4825]: I0310 09:15:48.909224 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ssljg" event={"ID":"0ed0e09d-4c76-4ebf-b9b1-f7f581de225c","Type":"ContainerDied","Data":"d73931dfe12b0f2146de4337cbaefb7e3e973ba650bd1e4007fd55373a8f5347"} Mar 10 09:15:48 crc kubenswrapper[4825]: I0310 09:15:48.909433 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ssljg" event={"ID":"0ed0e09d-4c76-4ebf-b9b1-f7f581de225c","Type":"ContainerStarted","Data":"b6e102fab3e2269c8d29213a52124786089a174e810c8f235a571c452f3bd091"} Mar 10 09:15:48 crc kubenswrapper[4825]: I0310 09:15:48.913651 4825 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 09:15:50 crc kubenswrapper[4825]: I0310 09:15:50.928295 4825 generic.go:334] "Generic (PLEG): container finished" podID="0ed0e09d-4c76-4ebf-b9b1-f7f581de225c" containerID="39d38d6d219f88be6324ba288969eb5c1fc8de61ba6aae4210396568b712b876" exitCode=0 Mar 10 09:15:50 crc kubenswrapper[4825]: I0310 09:15:50.928512 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ssljg" event={"ID":"0ed0e09d-4c76-4ebf-b9b1-f7f581de225c","Type":"ContainerDied","Data":"39d38d6d219f88be6324ba288969eb5c1fc8de61ba6aae4210396568b712b876"} Mar 10 09:15:51 crc kubenswrapper[4825]: I0310 09:15:51.938100 4825 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-ssljg" event={"ID":"0ed0e09d-4c76-4ebf-b9b1-f7f581de225c","Type":"ContainerStarted","Data":"279fb1adf0df0ce4014cf9003a141306a11d15cf8d01b20e71fbef7cf3b74426"} Mar 10 09:15:53 crc kubenswrapper[4825]: I0310 09:15:53.237459 4825 scope.go:117] "RemoveContainer" containerID="f110d54ce6d62165e047609d778f750d4e0c9b3a0b3dce9f99cbfaef3a90305f" Mar 10 09:15:53 crc kubenswrapper[4825]: E0310 09:15:53.238009 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:15:57 crc kubenswrapper[4825]: I0310 09:15:57.950536 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ssljg" Mar 10 09:15:57 crc kubenswrapper[4825]: I0310 09:15:57.952580 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ssljg" Mar 10 09:15:58 crc kubenswrapper[4825]: I0310 09:15:58.002484 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ssljg" Mar 10 09:15:58 crc kubenswrapper[4825]: I0310 09:15:58.029219 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ssljg" podStartSLOduration=8.514384472 podStartE2EDuration="11.029193608s" podCreationTimestamp="2026-03-10 09:15:47 +0000 UTC" firstStartedPulling="2026-03-10 09:15:48.91223272 +0000 UTC m=+9101.942013335" lastFinishedPulling="2026-03-10 09:15:51.427041856 +0000 UTC m=+9104.456822471" observedRunningTime="2026-03-10 09:15:51.958628437 +0000 UTC 
m=+9104.988409052" watchObservedRunningTime="2026-03-10 09:15:58.029193608 +0000 UTC m=+9111.058974243" Mar 10 09:15:59 crc kubenswrapper[4825]: I0310 09:15:59.055969 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ssljg" Mar 10 09:15:59 crc kubenswrapper[4825]: I0310 09:15:59.134391 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ssljg"] Mar 10 09:16:00 crc kubenswrapper[4825]: I0310 09:16:00.146201 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552236-q5rtj"] Mar 10 09:16:00 crc kubenswrapper[4825]: I0310 09:16:00.147767 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552236-q5rtj" Mar 10 09:16:00 crc kubenswrapper[4825]: I0310 09:16:00.149932 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:16:00 crc kubenswrapper[4825]: I0310 09:16:00.150169 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:16:00 crc kubenswrapper[4825]: I0310 09:16:00.150443 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 09:16:00 crc kubenswrapper[4825]: I0310 09:16:00.180693 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552236-q5rtj"] Mar 10 09:16:00 crc kubenswrapper[4825]: I0310 09:16:00.306995 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hngf\" (UniqueName: \"kubernetes.io/projected/c3d4edf3-4f2f-4238-a12a-1c5d41805acf-kube-api-access-7hngf\") pod \"auto-csr-approver-29552236-q5rtj\" (UID: \"c3d4edf3-4f2f-4238-a12a-1c5d41805acf\") " pod="openshift-infra/auto-csr-approver-29552236-q5rtj" Mar 10 09:16:00 crc 
kubenswrapper[4825]: I0310 09:16:00.409106 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hngf\" (UniqueName: \"kubernetes.io/projected/c3d4edf3-4f2f-4238-a12a-1c5d41805acf-kube-api-access-7hngf\") pod \"auto-csr-approver-29552236-q5rtj\" (UID: \"c3d4edf3-4f2f-4238-a12a-1c5d41805acf\") " pod="openshift-infra/auto-csr-approver-29552236-q5rtj" Mar 10 09:16:00 crc kubenswrapper[4825]: I0310 09:16:00.435054 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hngf\" (UniqueName: \"kubernetes.io/projected/c3d4edf3-4f2f-4238-a12a-1c5d41805acf-kube-api-access-7hngf\") pod \"auto-csr-approver-29552236-q5rtj\" (UID: \"c3d4edf3-4f2f-4238-a12a-1c5d41805acf\") " pod="openshift-infra/auto-csr-approver-29552236-q5rtj" Mar 10 09:16:00 crc kubenswrapper[4825]: I0310 09:16:00.481163 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552236-q5rtj" Mar 10 09:16:00 crc kubenswrapper[4825]: W0310 09:16:00.943743 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3d4edf3_4f2f_4238_a12a_1c5d41805acf.slice/crio-aea625c08452db507ea7d667965fc5c1447e2cbd6dcd3c9dbd8beaff1cef850a WatchSource:0}: Error finding container aea625c08452db507ea7d667965fc5c1447e2cbd6dcd3c9dbd8beaff1cef850a: Status 404 returned error can't find the container with id aea625c08452db507ea7d667965fc5c1447e2cbd6dcd3c9dbd8beaff1cef850a Mar 10 09:16:00 crc kubenswrapper[4825]: I0310 09:16:00.950940 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552236-q5rtj"] Mar 10 09:16:01 crc kubenswrapper[4825]: I0310 09:16:01.014607 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552236-q5rtj" 
event={"ID":"c3d4edf3-4f2f-4238-a12a-1c5d41805acf","Type":"ContainerStarted","Data":"aea625c08452db507ea7d667965fc5c1447e2cbd6dcd3c9dbd8beaff1cef850a"} Mar 10 09:16:01 crc kubenswrapper[4825]: I0310 09:16:01.014780 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ssljg" podUID="0ed0e09d-4c76-4ebf-b9b1-f7f581de225c" containerName="registry-server" containerID="cri-o://279fb1adf0df0ce4014cf9003a141306a11d15cf8d01b20e71fbef7cf3b74426" gracePeriod=2 Mar 10 09:16:01 crc kubenswrapper[4825]: I0310 09:16:01.655229 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ssljg" Mar 10 09:16:01 crc kubenswrapper[4825]: I0310 09:16:01.736115 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nmr4\" (UniqueName: \"kubernetes.io/projected/0ed0e09d-4c76-4ebf-b9b1-f7f581de225c-kube-api-access-8nmr4\") pod \"0ed0e09d-4c76-4ebf-b9b1-f7f581de225c\" (UID: \"0ed0e09d-4c76-4ebf-b9b1-f7f581de225c\") " Mar 10 09:16:01 crc kubenswrapper[4825]: I0310 09:16:01.736462 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ed0e09d-4c76-4ebf-b9b1-f7f581de225c-catalog-content\") pod \"0ed0e09d-4c76-4ebf-b9b1-f7f581de225c\" (UID: \"0ed0e09d-4c76-4ebf-b9b1-f7f581de225c\") " Mar 10 09:16:01 crc kubenswrapper[4825]: I0310 09:16:01.736503 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ed0e09d-4c76-4ebf-b9b1-f7f581de225c-utilities\") pod \"0ed0e09d-4c76-4ebf-b9b1-f7f581de225c\" (UID: \"0ed0e09d-4c76-4ebf-b9b1-f7f581de225c\") " Mar 10 09:16:01 crc kubenswrapper[4825]: I0310 09:16:01.738538 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/0ed0e09d-4c76-4ebf-b9b1-f7f581de225c-utilities" (OuterVolumeSpecName: "utilities") pod "0ed0e09d-4c76-4ebf-b9b1-f7f581de225c" (UID: "0ed0e09d-4c76-4ebf-b9b1-f7f581de225c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:16:01 crc kubenswrapper[4825]: I0310 09:16:01.756944 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ed0e09d-4c76-4ebf-b9b1-f7f581de225c-kube-api-access-8nmr4" (OuterVolumeSpecName: "kube-api-access-8nmr4") pod "0ed0e09d-4c76-4ebf-b9b1-f7f581de225c" (UID: "0ed0e09d-4c76-4ebf-b9b1-f7f581de225c"). InnerVolumeSpecName "kube-api-access-8nmr4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:16:01 crc kubenswrapper[4825]: I0310 09:16:01.771700 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ed0e09d-4c76-4ebf-b9b1-f7f581de225c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0ed0e09d-4c76-4ebf-b9b1-f7f581de225c" (UID: "0ed0e09d-4c76-4ebf-b9b1-f7f581de225c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:16:01 crc kubenswrapper[4825]: I0310 09:16:01.838465 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nmr4\" (UniqueName: \"kubernetes.io/projected/0ed0e09d-4c76-4ebf-b9b1-f7f581de225c-kube-api-access-8nmr4\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:01 crc kubenswrapper[4825]: I0310 09:16:01.838505 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ed0e09d-4c76-4ebf-b9b1-f7f581de225c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:01 crc kubenswrapper[4825]: I0310 09:16:01.838518 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ed0e09d-4c76-4ebf-b9b1-f7f581de225c-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:02 crc kubenswrapper[4825]: I0310 09:16:02.077417 4825 generic.go:334] "Generic (PLEG): container finished" podID="0ed0e09d-4c76-4ebf-b9b1-f7f581de225c" containerID="279fb1adf0df0ce4014cf9003a141306a11d15cf8d01b20e71fbef7cf3b74426" exitCode=0 Mar 10 09:16:02 crc kubenswrapper[4825]: I0310 09:16:02.077467 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ssljg" event={"ID":"0ed0e09d-4c76-4ebf-b9b1-f7f581de225c","Type":"ContainerDied","Data":"279fb1adf0df0ce4014cf9003a141306a11d15cf8d01b20e71fbef7cf3b74426"} Mar 10 09:16:02 crc kubenswrapper[4825]: I0310 09:16:02.077502 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ssljg" event={"ID":"0ed0e09d-4c76-4ebf-b9b1-f7f581de225c","Type":"ContainerDied","Data":"b6e102fab3e2269c8d29213a52124786089a174e810c8f235a571c452f3bd091"} Mar 10 09:16:02 crc kubenswrapper[4825]: I0310 09:16:02.077513 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ssljg" Mar 10 09:16:02 crc kubenswrapper[4825]: I0310 09:16:02.077520 4825 scope.go:117] "RemoveContainer" containerID="279fb1adf0df0ce4014cf9003a141306a11d15cf8d01b20e71fbef7cf3b74426" Mar 10 09:16:02 crc kubenswrapper[4825]: I0310 09:16:02.105771 4825 scope.go:117] "RemoveContainer" containerID="39d38d6d219f88be6324ba288969eb5c1fc8de61ba6aae4210396568b712b876" Mar 10 09:16:02 crc kubenswrapper[4825]: I0310 09:16:02.134880 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ssljg"] Mar 10 09:16:02 crc kubenswrapper[4825]: I0310 09:16:02.141639 4825 scope.go:117] "RemoveContainer" containerID="d73931dfe12b0f2146de4337cbaefb7e3e973ba650bd1e4007fd55373a8f5347" Mar 10 09:16:02 crc kubenswrapper[4825]: I0310 09:16:02.150496 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ssljg"] Mar 10 09:16:02 crc kubenswrapper[4825]: I0310 09:16:02.188216 4825 scope.go:117] "RemoveContainer" containerID="279fb1adf0df0ce4014cf9003a141306a11d15cf8d01b20e71fbef7cf3b74426" Mar 10 09:16:02 crc kubenswrapper[4825]: E0310 09:16:02.189066 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"279fb1adf0df0ce4014cf9003a141306a11d15cf8d01b20e71fbef7cf3b74426\": container with ID starting with 279fb1adf0df0ce4014cf9003a141306a11d15cf8d01b20e71fbef7cf3b74426 not found: ID does not exist" containerID="279fb1adf0df0ce4014cf9003a141306a11d15cf8d01b20e71fbef7cf3b74426" Mar 10 09:16:02 crc kubenswrapper[4825]: I0310 09:16:02.189099 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"279fb1adf0df0ce4014cf9003a141306a11d15cf8d01b20e71fbef7cf3b74426"} err="failed to get container status \"279fb1adf0df0ce4014cf9003a141306a11d15cf8d01b20e71fbef7cf3b74426\": rpc error: code = NotFound desc = could not find container 
\"279fb1adf0df0ce4014cf9003a141306a11d15cf8d01b20e71fbef7cf3b74426\": container with ID starting with 279fb1adf0df0ce4014cf9003a141306a11d15cf8d01b20e71fbef7cf3b74426 not found: ID does not exist" Mar 10 09:16:02 crc kubenswrapper[4825]: I0310 09:16:02.189121 4825 scope.go:117] "RemoveContainer" containerID="39d38d6d219f88be6324ba288969eb5c1fc8de61ba6aae4210396568b712b876" Mar 10 09:16:02 crc kubenswrapper[4825]: E0310 09:16:02.189586 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39d38d6d219f88be6324ba288969eb5c1fc8de61ba6aae4210396568b712b876\": container with ID starting with 39d38d6d219f88be6324ba288969eb5c1fc8de61ba6aae4210396568b712b876 not found: ID does not exist" containerID="39d38d6d219f88be6324ba288969eb5c1fc8de61ba6aae4210396568b712b876" Mar 10 09:16:02 crc kubenswrapper[4825]: I0310 09:16:02.189609 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39d38d6d219f88be6324ba288969eb5c1fc8de61ba6aae4210396568b712b876"} err="failed to get container status \"39d38d6d219f88be6324ba288969eb5c1fc8de61ba6aae4210396568b712b876\": rpc error: code = NotFound desc = could not find container \"39d38d6d219f88be6324ba288969eb5c1fc8de61ba6aae4210396568b712b876\": container with ID starting with 39d38d6d219f88be6324ba288969eb5c1fc8de61ba6aae4210396568b712b876 not found: ID does not exist" Mar 10 09:16:02 crc kubenswrapper[4825]: I0310 09:16:02.189622 4825 scope.go:117] "RemoveContainer" containerID="d73931dfe12b0f2146de4337cbaefb7e3e973ba650bd1e4007fd55373a8f5347" Mar 10 09:16:02 crc kubenswrapper[4825]: E0310 09:16:02.189824 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d73931dfe12b0f2146de4337cbaefb7e3e973ba650bd1e4007fd55373a8f5347\": container with ID starting with d73931dfe12b0f2146de4337cbaefb7e3e973ba650bd1e4007fd55373a8f5347 not found: ID does not exist" 
containerID="d73931dfe12b0f2146de4337cbaefb7e3e973ba650bd1e4007fd55373a8f5347" Mar 10 09:16:02 crc kubenswrapper[4825]: I0310 09:16:02.189843 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d73931dfe12b0f2146de4337cbaefb7e3e973ba650bd1e4007fd55373a8f5347"} err="failed to get container status \"d73931dfe12b0f2146de4337cbaefb7e3e973ba650bd1e4007fd55373a8f5347\": rpc error: code = NotFound desc = could not find container \"d73931dfe12b0f2146de4337cbaefb7e3e973ba650bd1e4007fd55373a8f5347\": container with ID starting with d73931dfe12b0f2146de4337cbaefb7e3e973ba650bd1e4007fd55373a8f5347 not found: ID does not exist" Mar 10 09:16:02 crc kubenswrapper[4825]: E0310 09:16:02.323492 4825 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ed0e09d_4c76_4ebf_b9b1_f7f581de225c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ed0e09d_4c76_4ebf_b9b1_f7f581de225c.slice/crio-b6e102fab3e2269c8d29213a52124786089a174e810c8f235a571c452f3bd091\": RecentStats: unable to find data in memory cache]" Mar 10 09:16:03 crc kubenswrapper[4825]: I0310 09:16:03.089182 4825 generic.go:334] "Generic (PLEG): container finished" podID="c3d4edf3-4f2f-4238-a12a-1c5d41805acf" containerID="8a3491189e86f604eaaceeeac539fe1143157dd166c76852131ec38dc5671da4" exitCode=0 Mar 10 09:16:03 crc kubenswrapper[4825]: I0310 09:16:03.089259 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552236-q5rtj" event={"ID":"c3d4edf3-4f2f-4238-a12a-1c5d41805acf","Type":"ContainerDied","Data":"8a3491189e86f604eaaceeeac539fe1143157dd166c76852131ec38dc5671da4"} Mar 10 09:16:03 crc kubenswrapper[4825]: I0310 09:16:03.256861 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ed0e09d-4c76-4ebf-b9b1-f7f581de225c" 
path="/var/lib/kubelet/pods/0ed0e09d-4c76-4ebf-b9b1-f7f581de225c/volumes" Mar 10 09:16:04 crc kubenswrapper[4825]: I0310 09:16:04.236174 4825 scope.go:117] "RemoveContainer" containerID="f110d54ce6d62165e047609d778f750d4e0c9b3a0b3dce9f99cbfaef3a90305f" Mar 10 09:16:04 crc kubenswrapper[4825]: E0310 09:16:04.236486 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:16:04 crc kubenswrapper[4825]: I0310 09:16:04.611379 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552236-q5rtj" Mar 10 09:16:04 crc kubenswrapper[4825]: I0310 09:16:04.696537 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hngf\" (UniqueName: \"kubernetes.io/projected/c3d4edf3-4f2f-4238-a12a-1c5d41805acf-kube-api-access-7hngf\") pod \"c3d4edf3-4f2f-4238-a12a-1c5d41805acf\" (UID: \"c3d4edf3-4f2f-4238-a12a-1c5d41805acf\") " Mar 10 09:16:05 crc kubenswrapper[4825]: I0310 09:16:05.114449 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552236-q5rtj" event={"ID":"c3d4edf3-4f2f-4238-a12a-1c5d41805acf","Type":"ContainerDied","Data":"aea625c08452db507ea7d667965fc5c1447e2cbd6dcd3c9dbd8beaff1cef850a"} Mar 10 09:16:05 crc kubenswrapper[4825]: I0310 09:16:05.114711 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aea625c08452db507ea7d667965fc5c1447e2cbd6dcd3c9dbd8beaff1cef850a" Mar 10 09:16:05 crc kubenswrapper[4825]: I0310 09:16:05.114469 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552236-q5rtj" Mar 10 09:16:05 crc kubenswrapper[4825]: I0310 09:16:05.268957 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3d4edf3-4f2f-4238-a12a-1c5d41805acf-kube-api-access-7hngf" (OuterVolumeSpecName: "kube-api-access-7hngf") pod "c3d4edf3-4f2f-4238-a12a-1c5d41805acf" (UID: "c3d4edf3-4f2f-4238-a12a-1c5d41805acf"). InnerVolumeSpecName "kube-api-access-7hngf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:16:05 crc kubenswrapper[4825]: I0310 09:16:05.310476 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hngf\" (UniqueName: \"kubernetes.io/projected/c3d4edf3-4f2f-4238-a12a-1c5d41805acf-kube-api-access-7hngf\") on node \"crc\" DevicePath \"\"" Mar 10 09:16:05 crc kubenswrapper[4825]: I0310 09:16:05.677666 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552230-g4qn2"] Mar 10 09:16:05 crc kubenswrapper[4825]: I0310 09:16:05.687478 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552230-g4qn2"] Mar 10 09:16:07 crc kubenswrapper[4825]: I0310 09:16:07.246812 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e70206fe-4c86-4fb9-987d-2770ffc62adc" path="/var/lib/kubelet/pods/e70206fe-4c86-4fb9-987d-2770ffc62adc/volumes" Mar 10 09:16:18 crc kubenswrapper[4825]: I0310 09:16:18.237239 4825 scope.go:117] "RemoveContainer" containerID="f110d54ce6d62165e047609d778f750d4e0c9b3a0b3dce9f99cbfaef3a90305f" Mar 10 09:16:18 crc kubenswrapper[4825]: E0310 09:16:18.238539 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:16:30 crc kubenswrapper[4825]: I0310 09:16:30.236539 4825 scope.go:117] "RemoveContainer" containerID="f110d54ce6d62165e047609d778f750d4e0c9b3a0b3dce9f99cbfaef3a90305f" Mar 10 09:16:30 crc kubenswrapper[4825]: E0310 09:16:30.237426 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:16:32 crc kubenswrapper[4825]: I0310 09:16:32.215608 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-897vw"] Mar 10 09:16:32 crc kubenswrapper[4825]: E0310 09:16:32.216115 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ed0e09d-4c76-4ebf-b9b1-f7f581de225c" containerName="extract-utilities" Mar 10 09:16:32 crc kubenswrapper[4825]: I0310 09:16:32.216162 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ed0e09d-4c76-4ebf-b9b1-f7f581de225c" containerName="extract-utilities" Mar 10 09:16:32 crc kubenswrapper[4825]: E0310 09:16:32.216191 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ed0e09d-4c76-4ebf-b9b1-f7f581de225c" containerName="extract-content" Mar 10 09:16:32 crc kubenswrapper[4825]: I0310 09:16:32.216200 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ed0e09d-4c76-4ebf-b9b1-f7f581de225c" containerName="extract-content" Mar 10 09:16:32 crc kubenswrapper[4825]: E0310 09:16:32.216226 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ed0e09d-4c76-4ebf-b9b1-f7f581de225c" containerName="registry-server" Mar 10 09:16:32 crc kubenswrapper[4825]: 
I0310 09:16:32.216234 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ed0e09d-4c76-4ebf-b9b1-f7f581de225c" containerName="registry-server" Mar 10 09:16:32 crc kubenswrapper[4825]: E0310 09:16:32.216256 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3d4edf3-4f2f-4238-a12a-1c5d41805acf" containerName="oc" Mar 10 09:16:32 crc kubenswrapper[4825]: I0310 09:16:32.216264 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3d4edf3-4f2f-4238-a12a-1c5d41805acf" containerName="oc" Mar 10 09:16:32 crc kubenswrapper[4825]: I0310 09:16:32.217620 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ed0e09d-4c76-4ebf-b9b1-f7f581de225c" containerName="registry-server" Mar 10 09:16:32 crc kubenswrapper[4825]: I0310 09:16:32.217649 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3d4edf3-4f2f-4238-a12a-1c5d41805acf" containerName="oc" Mar 10 09:16:32 crc kubenswrapper[4825]: I0310 09:16:32.219995 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-897vw" Mar 10 09:16:32 crc kubenswrapper[4825]: I0310 09:16:32.242005 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-897vw"] Mar 10 09:16:32 crc kubenswrapper[4825]: I0310 09:16:32.246985 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k96zt\" (UniqueName: \"kubernetes.io/projected/1f78541c-142d-4820-b871-d84ed6878888-kube-api-access-k96zt\") pod \"community-operators-897vw\" (UID: \"1f78541c-142d-4820-b871-d84ed6878888\") " pod="openshift-marketplace/community-operators-897vw" Mar 10 09:16:32 crc kubenswrapper[4825]: I0310 09:16:32.247170 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f78541c-142d-4820-b871-d84ed6878888-catalog-content\") pod \"community-operators-897vw\" (UID: \"1f78541c-142d-4820-b871-d84ed6878888\") " pod="openshift-marketplace/community-operators-897vw" Mar 10 09:16:32 crc kubenswrapper[4825]: I0310 09:16:32.247342 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f78541c-142d-4820-b871-d84ed6878888-utilities\") pod \"community-operators-897vw\" (UID: \"1f78541c-142d-4820-b871-d84ed6878888\") " pod="openshift-marketplace/community-operators-897vw" Mar 10 09:16:32 crc kubenswrapper[4825]: I0310 09:16:32.348967 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f78541c-142d-4820-b871-d84ed6878888-utilities\") pod \"community-operators-897vw\" (UID: \"1f78541c-142d-4820-b871-d84ed6878888\") " pod="openshift-marketplace/community-operators-897vw" Mar 10 09:16:32 crc kubenswrapper[4825]: I0310 09:16:32.349024 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-k96zt\" (UniqueName: \"kubernetes.io/projected/1f78541c-142d-4820-b871-d84ed6878888-kube-api-access-k96zt\") pod \"community-operators-897vw\" (UID: \"1f78541c-142d-4820-b871-d84ed6878888\") " pod="openshift-marketplace/community-operators-897vw" Mar 10 09:16:32 crc kubenswrapper[4825]: I0310 09:16:32.349103 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f78541c-142d-4820-b871-d84ed6878888-catalog-content\") pod \"community-operators-897vw\" (UID: \"1f78541c-142d-4820-b871-d84ed6878888\") " pod="openshift-marketplace/community-operators-897vw" Mar 10 09:16:32 crc kubenswrapper[4825]: I0310 09:16:32.349495 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f78541c-142d-4820-b871-d84ed6878888-utilities\") pod \"community-operators-897vw\" (UID: \"1f78541c-142d-4820-b871-d84ed6878888\") " pod="openshift-marketplace/community-operators-897vw" Mar 10 09:16:32 crc kubenswrapper[4825]: I0310 09:16:32.349562 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f78541c-142d-4820-b871-d84ed6878888-catalog-content\") pod \"community-operators-897vw\" (UID: \"1f78541c-142d-4820-b871-d84ed6878888\") " pod="openshift-marketplace/community-operators-897vw" Mar 10 09:16:32 crc kubenswrapper[4825]: I0310 09:16:32.379882 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k96zt\" (UniqueName: \"kubernetes.io/projected/1f78541c-142d-4820-b871-d84ed6878888-kube-api-access-k96zt\") pod \"community-operators-897vw\" (UID: \"1f78541c-142d-4820-b871-d84ed6878888\") " pod="openshift-marketplace/community-operators-897vw" Mar 10 09:16:32 crc kubenswrapper[4825]: I0310 09:16:32.546025 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-897vw" Mar 10 09:16:33 crc kubenswrapper[4825]: I0310 09:16:33.104928 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-897vw"] Mar 10 09:16:33 crc kubenswrapper[4825]: I0310 09:16:33.375801 4825 generic.go:334] "Generic (PLEG): container finished" podID="1f78541c-142d-4820-b871-d84ed6878888" containerID="3b2d39a52086bf47e6edf3edea6fcc60e4881963cea7d3b07bc08056202e5fc2" exitCode=0 Mar 10 09:16:33 crc kubenswrapper[4825]: I0310 09:16:33.375904 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-897vw" event={"ID":"1f78541c-142d-4820-b871-d84ed6878888","Type":"ContainerDied","Data":"3b2d39a52086bf47e6edf3edea6fcc60e4881963cea7d3b07bc08056202e5fc2"} Mar 10 09:16:33 crc kubenswrapper[4825]: I0310 09:16:33.376112 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-897vw" event={"ID":"1f78541c-142d-4820-b871-d84ed6878888","Type":"ContainerStarted","Data":"03d3daed2bc7ebb743d3874ee59378dab6ffcc6452aa5008928195066e848adb"} Mar 10 09:16:34 crc kubenswrapper[4825]: I0310 09:16:34.387502 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-897vw" event={"ID":"1f78541c-142d-4820-b871-d84ed6878888","Type":"ContainerStarted","Data":"c5cdb9dc9ccb8018b9a63d56c134ff208b4e5b6e1bac704624252b74f012a787"} Mar 10 09:16:36 crc kubenswrapper[4825]: I0310 09:16:36.409183 4825 generic.go:334] "Generic (PLEG): container finished" podID="1f78541c-142d-4820-b871-d84ed6878888" containerID="c5cdb9dc9ccb8018b9a63d56c134ff208b4e5b6e1bac704624252b74f012a787" exitCode=0 Mar 10 09:16:36 crc kubenswrapper[4825]: I0310 09:16:36.409246 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-897vw" 
event={"ID":"1f78541c-142d-4820-b871-d84ed6878888","Type":"ContainerDied","Data":"c5cdb9dc9ccb8018b9a63d56c134ff208b4e5b6e1bac704624252b74f012a787"} Mar 10 09:16:38 crc kubenswrapper[4825]: I0310 09:16:38.432404 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-897vw" event={"ID":"1f78541c-142d-4820-b871-d84ed6878888","Type":"ContainerStarted","Data":"ae43662ab35badf54cc6aa6b08dde982452299396a3e2853df063522e3d4c357"} Mar 10 09:16:38 crc kubenswrapper[4825]: I0310 09:16:38.461602 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-897vw" podStartSLOduration=2.998786837 podStartE2EDuration="6.461576723s" podCreationTimestamp="2026-03-10 09:16:32 +0000 UTC" firstStartedPulling="2026-03-10 09:16:33.382290386 +0000 UTC m=+9146.412071001" lastFinishedPulling="2026-03-10 09:16:36.845080272 +0000 UTC m=+9149.874860887" observedRunningTime="2026-03-10 09:16:38.45095447 +0000 UTC m=+9151.480735095" watchObservedRunningTime="2026-03-10 09:16:38.461576723 +0000 UTC m=+9151.491357348" Mar 10 09:16:42 crc kubenswrapper[4825]: I0310 09:16:42.236962 4825 scope.go:117] "RemoveContainer" containerID="f110d54ce6d62165e047609d778f750d4e0c9b3a0b3dce9f99cbfaef3a90305f" Mar 10 09:16:42 crc kubenswrapper[4825]: E0310 09:16:42.237608 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:16:42 crc kubenswrapper[4825]: I0310 09:16:42.546489 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-897vw" Mar 10 09:16:42 crc 
kubenswrapper[4825]: I0310 09:16:42.546882 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-897vw"
Mar 10 09:16:42 crc kubenswrapper[4825]: I0310 09:16:42.629734 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-897vw"
Mar 10 09:16:43 crc kubenswrapper[4825]: I0310 09:16:43.529978 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-897vw"
Mar 10 09:16:43 crc kubenswrapper[4825]: I0310 09:16:43.597726 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-897vw"]
Mar 10 09:16:45 crc kubenswrapper[4825]: I0310 09:16:45.491902 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-897vw" podUID="1f78541c-142d-4820-b871-d84ed6878888" containerName="registry-server" containerID="cri-o://ae43662ab35badf54cc6aa6b08dde982452299396a3e2853df063522e3d4c357" gracePeriod=2
Mar 10 09:16:46 crc kubenswrapper[4825]: I0310 09:16:46.179109 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-897vw"
Mar 10 09:16:46 crc kubenswrapper[4825]: I0310 09:16:46.261401 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f78541c-142d-4820-b871-d84ed6878888-utilities\") pod \"1f78541c-142d-4820-b871-d84ed6878888\" (UID: \"1f78541c-142d-4820-b871-d84ed6878888\") "
Mar 10 09:16:46 crc kubenswrapper[4825]: I0310 09:16:46.261575 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k96zt\" (UniqueName: \"kubernetes.io/projected/1f78541c-142d-4820-b871-d84ed6878888-kube-api-access-k96zt\") pod \"1f78541c-142d-4820-b871-d84ed6878888\" (UID: \"1f78541c-142d-4820-b871-d84ed6878888\") "
Mar 10 09:16:46 crc kubenswrapper[4825]: I0310 09:16:46.261701 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f78541c-142d-4820-b871-d84ed6878888-catalog-content\") pod \"1f78541c-142d-4820-b871-d84ed6878888\" (UID: \"1f78541c-142d-4820-b871-d84ed6878888\") "
Mar 10 09:16:46 crc kubenswrapper[4825]: I0310 09:16:46.262616 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f78541c-142d-4820-b871-d84ed6878888-utilities" (OuterVolumeSpecName: "utilities") pod "1f78541c-142d-4820-b871-d84ed6878888" (UID: "1f78541c-142d-4820-b871-d84ed6878888"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:16:46 crc kubenswrapper[4825]: I0310 09:16:46.343392 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f78541c-142d-4820-b871-d84ed6878888-kube-api-access-k96zt" (OuterVolumeSpecName: "kube-api-access-k96zt") pod "1f78541c-142d-4820-b871-d84ed6878888" (UID: "1f78541c-142d-4820-b871-d84ed6878888"). InnerVolumeSpecName "kube-api-access-k96zt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:16:46 crc kubenswrapper[4825]: I0310 09:16:46.372844 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k96zt\" (UniqueName: \"kubernetes.io/projected/1f78541c-142d-4820-b871-d84ed6878888-kube-api-access-k96zt\") on node \"crc\" DevicePath \"\""
Mar 10 09:16:46 crc kubenswrapper[4825]: I0310 09:16:46.372886 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f78541c-142d-4820-b871-d84ed6878888-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 09:16:46 crc kubenswrapper[4825]: I0310 09:16:46.386358 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f78541c-142d-4820-b871-d84ed6878888-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1f78541c-142d-4820-b871-d84ed6878888" (UID: "1f78541c-142d-4820-b871-d84ed6878888"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 09:16:46 crc kubenswrapper[4825]: I0310 09:16:46.474795 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f78541c-142d-4820-b871-d84ed6878888-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 09:16:46 crc kubenswrapper[4825]: I0310 09:16:46.501573 4825 generic.go:334] "Generic (PLEG): container finished" podID="1f78541c-142d-4820-b871-d84ed6878888" containerID="ae43662ab35badf54cc6aa6b08dde982452299396a3e2853df063522e3d4c357" exitCode=0
Mar 10 09:16:46 crc kubenswrapper[4825]: I0310 09:16:46.501633 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-897vw" event={"ID":"1f78541c-142d-4820-b871-d84ed6878888","Type":"ContainerDied","Data":"ae43662ab35badf54cc6aa6b08dde982452299396a3e2853df063522e3d4c357"}
Mar 10 09:16:46 crc kubenswrapper[4825]: I0310 09:16:46.501670 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-897vw"
Mar 10 09:16:46 crc kubenswrapper[4825]: I0310 09:16:46.501716 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-897vw" event={"ID":"1f78541c-142d-4820-b871-d84ed6878888","Type":"ContainerDied","Data":"03d3daed2bc7ebb743d3874ee59378dab6ffcc6452aa5008928195066e848adb"}
Mar 10 09:16:46 crc kubenswrapper[4825]: I0310 09:16:46.501741 4825 scope.go:117] "RemoveContainer" containerID="ae43662ab35badf54cc6aa6b08dde982452299396a3e2853df063522e3d4c357"
Mar 10 09:16:46 crc kubenswrapper[4825]: I0310 09:16:46.522413 4825 scope.go:117] "RemoveContainer" containerID="c5cdb9dc9ccb8018b9a63d56c134ff208b4e5b6e1bac704624252b74f012a787"
Mar 10 09:16:46 crc kubenswrapper[4825]: I0310 09:16:46.537900 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-897vw"]
Mar 10 09:16:46 crc kubenswrapper[4825]: I0310 09:16:46.557793 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-897vw"]
Mar 10 09:16:46 crc kubenswrapper[4825]: I0310 09:16:46.566121 4825 scope.go:117] "RemoveContainer" containerID="3b2d39a52086bf47e6edf3edea6fcc60e4881963cea7d3b07bc08056202e5fc2"
Mar 10 09:16:47 crc kubenswrapper[4825]: I0310 09:16:47.246874 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f78541c-142d-4820-b871-d84ed6878888" path="/var/lib/kubelet/pods/1f78541c-142d-4820-b871-d84ed6878888/volumes"
Mar 10 09:16:47 crc kubenswrapper[4825]: I0310 09:16:47.279014 4825 scope.go:117] "RemoveContainer" containerID="ae43662ab35badf54cc6aa6b08dde982452299396a3e2853df063522e3d4c357"
Mar 10 09:16:47 crc kubenswrapper[4825]: E0310 09:16:47.279386 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae43662ab35badf54cc6aa6b08dde982452299396a3e2853df063522e3d4c357\": container with ID starting with ae43662ab35badf54cc6aa6b08dde982452299396a3e2853df063522e3d4c357 not found: ID does not exist" containerID="ae43662ab35badf54cc6aa6b08dde982452299396a3e2853df063522e3d4c357"
Mar 10 09:16:47 crc kubenswrapper[4825]: I0310 09:16:47.279419 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae43662ab35badf54cc6aa6b08dde982452299396a3e2853df063522e3d4c357"} err="failed to get container status \"ae43662ab35badf54cc6aa6b08dde982452299396a3e2853df063522e3d4c357\": rpc error: code = NotFound desc = could not find container \"ae43662ab35badf54cc6aa6b08dde982452299396a3e2853df063522e3d4c357\": container with ID starting with ae43662ab35badf54cc6aa6b08dde982452299396a3e2853df063522e3d4c357 not found: ID does not exist"
Mar 10 09:16:47 crc kubenswrapper[4825]: I0310 09:16:47.279443 4825 scope.go:117] "RemoveContainer" containerID="c5cdb9dc9ccb8018b9a63d56c134ff208b4e5b6e1bac704624252b74f012a787"
Mar 10 09:16:47 crc kubenswrapper[4825]: E0310 09:16:47.280306 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5cdb9dc9ccb8018b9a63d56c134ff208b4e5b6e1bac704624252b74f012a787\": container with ID starting with c5cdb9dc9ccb8018b9a63d56c134ff208b4e5b6e1bac704624252b74f012a787 not found: ID does not exist" containerID="c5cdb9dc9ccb8018b9a63d56c134ff208b4e5b6e1bac704624252b74f012a787"
Mar 10 09:16:47 crc kubenswrapper[4825]: I0310 09:16:47.280347 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5cdb9dc9ccb8018b9a63d56c134ff208b4e5b6e1bac704624252b74f012a787"} err="failed to get container status \"c5cdb9dc9ccb8018b9a63d56c134ff208b4e5b6e1bac704624252b74f012a787\": rpc error: code = NotFound desc = could not find container \"c5cdb9dc9ccb8018b9a63d56c134ff208b4e5b6e1bac704624252b74f012a787\": container with ID starting with c5cdb9dc9ccb8018b9a63d56c134ff208b4e5b6e1bac704624252b74f012a787 not found: ID does not exist"
Mar 10 09:16:47 crc kubenswrapper[4825]: I0310 09:16:47.280376 4825 scope.go:117] "RemoveContainer" containerID="3b2d39a52086bf47e6edf3edea6fcc60e4881963cea7d3b07bc08056202e5fc2"
Mar 10 09:16:47 crc kubenswrapper[4825]: E0310 09:16:47.280722 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b2d39a52086bf47e6edf3edea6fcc60e4881963cea7d3b07bc08056202e5fc2\": container with ID starting with 3b2d39a52086bf47e6edf3edea6fcc60e4881963cea7d3b07bc08056202e5fc2 not found: ID does not exist" containerID="3b2d39a52086bf47e6edf3edea6fcc60e4881963cea7d3b07bc08056202e5fc2"
Mar 10 09:16:47 crc kubenswrapper[4825]: I0310 09:16:47.280753 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b2d39a52086bf47e6edf3edea6fcc60e4881963cea7d3b07bc08056202e5fc2"} err="failed to get container status \"3b2d39a52086bf47e6edf3edea6fcc60e4881963cea7d3b07bc08056202e5fc2\": rpc error: code = NotFound desc = could not find container \"3b2d39a52086bf47e6edf3edea6fcc60e4881963cea7d3b07bc08056202e5fc2\": container with ID starting with 3b2d39a52086bf47e6edf3edea6fcc60e4881963cea7d3b07bc08056202e5fc2 not found: ID does not exist"
Mar 10 09:16:57 crc kubenswrapper[4825]: I0310 09:16:57.236956 4825 scope.go:117] "RemoveContainer" containerID="f110d54ce6d62165e047609d778f750d4e0c9b3a0b3dce9f99cbfaef3a90305f"
Mar 10 09:16:57 crc kubenswrapper[4825]: E0310 09:16:57.237722 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3"
Mar 10 09:17:08 crc kubenswrapper[4825]: I0310 09:17:08.040638 4825 scope.go:117] "RemoveContainer" containerID="06b15fe51986006d832f657be9e352652bf33765297c2bc5720c736697189ba9"
Mar 10 09:17:10 crc kubenswrapper[4825]: I0310 09:17:10.237377 4825 scope.go:117] "RemoveContainer" containerID="f110d54ce6d62165e047609d778f750d4e0c9b3a0b3dce9f99cbfaef3a90305f"
Mar 10 09:17:10 crc kubenswrapper[4825]: E0310 09:17:10.237968 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3"
Mar 10 09:17:24 crc kubenswrapper[4825]: I0310 09:17:24.242112 4825 scope.go:117] "RemoveContainer" containerID="f110d54ce6d62165e047609d778f750d4e0c9b3a0b3dce9f99cbfaef3a90305f"
Mar 10 09:17:24 crc kubenswrapper[4825]: E0310 09:17:24.242936 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3"
Mar 10 09:17:35 crc kubenswrapper[4825]: I0310 09:17:35.236606 4825 scope.go:117] "RemoveContainer" containerID="f110d54ce6d62165e047609d778f750d4e0c9b3a0b3dce9f99cbfaef3a90305f"
Mar 10 09:17:35 crc kubenswrapper[4825]: E0310 09:17:35.237449 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3"
Mar 10 09:17:46 crc kubenswrapper[4825]: I0310 09:17:46.236711 4825 scope.go:117] "RemoveContainer" containerID="f110d54ce6d62165e047609d778f750d4e0c9b3a0b3dce9f99cbfaef3a90305f"
Mar 10 09:17:46 crc kubenswrapper[4825]: E0310 09:17:46.237497 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3"
Mar 10 09:17:59 crc kubenswrapper[4825]: I0310 09:17:59.243436 4825 scope.go:117] "RemoveContainer" containerID="f110d54ce6d62165e047609d778f750d4e0c9b3a0b3dce9f99cbfaef3a90305f"
Mar 10 09:17:59 crc kubenswrapper[4825]: E0310 09:17:59.245246 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3"
Mar 10 09:18:00 crc kubenswrapper[4825]: I0310 09:18:00.161192 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552238-ghhg6"]
Mar 10 09:18:00 crc kubenswrapper[4825]: E0310 09:18:00.162108 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f78541c-142d-4820-b871-d84ed6878888" containerName="extract-content"
Mar 10 09:18:00 crc kubenswrapper[4825]: I0310 09:18:00.162150 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f78541c-142d-4820-b871-d84ed6878888" containerName="extract-content"
Mar 10 09:18:00 crc kubenswrapper[4825]: E0310 09:18:00.162181 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f78541c-142d-4820-b871-d84ed6878888" containerName="registry-server"
Mar 10 09:18:00 crc kubenswrapper[4825]: I0310 09:18:00.162189 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f78541c-142d-4820-b871-d84ed6878888" containerName="registry-server"
Mar 10 09:18:00 crc kubenswrapper[4825]: E0310 09:18:00.162226 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f78541c-142d-4820-b871-d84ed6878888" containerName="extract-utilities"
Mar 10 09:18:00 crc kubenswrapper[4825]: I0310 09:18:00.162237 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f78541c-142d-4820-b871-d84ed6878888" containerName="extract-utilities"
Mar 10 09:18:00 crc kubenswrapper[4825]: I0310 09:18:00.162773 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f78541c-142d-4820-b871-d84ed6878888" containerName="registry-server"
Mar 10 09:18:00 crc kubenswrapper[4825]: I0310 09:18:00.174082 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552238-ghhg6"
Mar 10 09:18:00 crc kubenswrapper[4825]: I0310 09:18:00.202616 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn"
Mar 10 09:18:00 crc kubenswrapper[4825]: I0310 09:18:00.202971 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 09:18:00 crc kubenswrapper[4825]: I0310 09:18:00.202946 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 09:18:00 crc kubenswrapper[4825]: I0310 09:18:00.217188 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552238-ghhg6"]
Mar 10 09:18:00 crc kubenswrapper[4825]: I0310 09:18:00.237503 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czznj\" (UniqueName: \"kubernetes.io/projected/c6d7b9e0-a38d-4886-ab3a-b4ba7e6efe57-kube-api-access-czznj\") pod \"auto-csr-approver-29552238-ghhg6\" (UID: \"c6d7b9e0-a38d-4886-ab3a-b4ba7e6efe57\") " pod="openshift-infra/auto-csr-approver-29552238-ghhg6"
Mar 10 09:18:00 crc kubenswrapper[4825]: I0310 09:18:00.339774 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czznj\" (UniqueName: \"kubernetes.io/projected/c6d7b9e0-a38d-4886-ab3a-b4ba7e6efe57-kube-api-access-czznj\") pod \"auto-csr-approver-29552238-ghhg6\" (UID: \"c6d7b9e0-a38d-4886-ab3a-b4ba7e6efe57\") " pod="openshift-infra/auto-csr-approver-29552238-ghhg6"
Mar 10 09:18:00 crc kubenswrapper[4825]: I0310 09:18:00.385933 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czznj\" (UniqueName: \"kubernetes.io/projected/c6d7b9e0-a38d-4886-ab3a-b4ba7e6efe57-kube-api-access-czznj\") pod \"auto-csr-approver-29552238-ghhg6\" (UID: \"c6d7b9e0-a38d-4886-ab3a-b4ba7e6efe57\") " pod="openshift-infra/auto-csr-approver-29552238-ghhg6"
Mar 10 09:18:00 crc kubenswrapper[4825]: I0310 09:18:00.523818 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552238-ghhg6"
Mar 10 09:18:01 crc kubenswrapper[4825]: I0310 09:18:01.092031 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552238-ghhg6"]
Mar 10 09:18:01 crc kubenswrapper[4825]: I0310 09:18:01.479883 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552238-ghhg6" event={"ID":"c6d7b9e0-a38d-4886-ab3a-b4ba7e6efe57","Type":"ContainerStarted","Data":"971f2b42dcc810ade009f2cb98d2e4e582f5c8909068af177eb3bc56b192b549"}
Mar 10 09:18:02 crc kubenswrapper[4825]: I0310 09:18:02.491413 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552238-ghhg6" event={"ID":"c6d7b9e0-a38d-4886-ab3a-b4ba7e6efe57","Type":"ContainerStarted","Data":"45254e6e4d180b8c1a60b4de026befc988fdc76fd1f1729262dafab5013ece42"}
Mar 10 09:18:02 crc kubenswrapper[4825]: I0310 09:18:02.515037 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552238-ghhg6" podStartSLOduration=1.537725793 podStartE2EDuration="2.515018575s" podCreationTimestamp="2026-03-10 09:18:00 +0000 UTC" firstStartedPulling="2026-03-10 09:18:01.096267385 +0000 UTC m=+9234.126048000" lastFinishedPulling="2026-03-10 09:18:02.073560147 +0000 UTC m=+9235.103340782" observedRunningTime="2026-03-10 09:18:02.508388158 +0000 UTC m=+9235.538168773" watchObservedRunningTime="2026-03-10 09:18:02.515018575 +0000 UTC m=+9235.544799190"
Mar 10 09:18:03 crc kubenswrapper[4825]: I0310 09:18:03.503664 4825 generic.go:334] "Generic (PLEG): container finished" podID="c6d7b9e0-a38d-4886-ab3a-b4ba7e6efe57" containerID="45254e6e4d180b8c1a60b4de026befc988fdc76fd1f1729262dafab5013ece42" exitCode=0
Mar 10 09:18:03 crc kubenswrapper[4825]: I0310 09:18:03.504002 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552238-ghhg6" event={"ID":"c6d7b9e0-a38d-4886-ab3a-b4ba7e6efe57","Type":"ContainerDied","Data":"45254e6e4d180b8c1a60b4de026befc988fdc76fd1f1729262dafab5013ece42"}
Mar 10 09:18:05 crc kubenswrapper[4825]: I0310 09:18:05.055784 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552238-ghhg6"
Mar 10 09:18:05 crc kubenswrapper[4825]: I0310 09:18:05.236744 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czznj\" (UniqueName: \"kubernetes.io/projected/c6d7b9e0-a38d-4886-ab3a-b4ba7e6efe57-kube-api-access-czznj\") pod \"c6d7b9e0-a38d-4886-ab3a-b4ba7e6efe57\" (UID: \"c6d7b9e0-a38d-4886-ab3a-b4ba7e6efe57\") "
Mar 10 09:18:05 crc kubenswrapper[4825]: I0310 09:18:05.242506 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6d7b9e0-a38d-4886-ab3a-b4ba7e6efe57-kube-api-access-czznj" (OuterVolumeSpecName: "kube-api-access-czznj") pod "c6d7b9e0-a38d-4886-ab3a-b4ba7e6efe57" (UID: "c6d7b9e0-a38d-4886-ab3a-b4ba7e6efe57"). InnerVolumeSpecName "kube-api-access-czznj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 09:18:05 crc kubenswrapper[4825]: I0310 09:18:05.338827 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czznj\" (UniqueName: \"kubernetes.io/projected/c6d7b9e0-a38d-4886-ab3a-b4ba7e6efe57-kube-api-access-czznj\") on node \"crc\" DevicePath \"\""
Mar 10 09:18:05 crc kubenswrapper[4825]: I0310 09:18:05.520056 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552238-ghhg6" event={"ID":"c6d7b9e0-a38d-4886-ab3a-b4ba7e6efe57","Type":"ContainerDied","Data":"971f2b42dcc810ade009f2cb98d2e4e582f5c8909068af177eb3bc56b192b549"}
Mar 10 09:18:05 crc kubenswrapper[4825]: I0310 09:18:05.520094 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="971f2b42dcc810ade009f2cb98d2e4e582f5c8909068af177eb3bc56b192b549"
Mar 10 09:18:05 crc kubenswrapper[4825]: I0310 09:18:05.520118 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552238-ghhg6"
Mar 10 09:18:05 crc kubenswrapper[4825]: I0310 09:18:05.587929 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552232-tqmr6"]
Mar 10 09:18:05 crc kubenswrapper[4825]: I0310 09:18:05.600894 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552232-tqmr6"]
Mar 10 09:18:07 crc kubenswrapper[4825]: I0310 09:18:07.248872 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35e63c43-e67b-4d24-ad9a-428cac91eb29" path="/var/lib/kubelet/pods/35e63c43-e67b-4d24-ad9a-428cac91eb29/volumes"
Mar 10 09:18:08 crc kubenswrapper[4825]: I0310 09:18:08.137678 4825 scope.go:117] "RemoveContainer" containerID="9b34c7a02375c1810baf0f58f483f3f053ac244c575a043379de66cf22235952"
Mar 10 09:18:11 crc kubenswrapper[4825]: I0310 09:18:11.236751 4825 scope.go:117] "RemoveContainer" containerID="f110d54ce6d62165e047609d778f750d4e0c9b3a0b3dce9f99cbfaef3a90305f"
Mar 10 09:18:11 crc kubenswrapper[4825]: E0310 09:18:11.237792 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3"
Mar 10 09:18:23 crc kubenswrapper[4825]: I0310 09:18:23.237338 4825 scope.go:117] "RemoveContainer" containerID="f110d54ce6d62165e047609d778f750d4e0c9b3a0b3dce9f99cbfaef3a90305f"
Mar 10 09:18:23 crc kubenswrapper[4825]: E0310 09:18:23.238221 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3"
Mar 10 09:18:34 crc kubenswrapper[4825]: I0310 09:18:34.237430 4825 scope.go:117] "RemoveContainer" containerID="f110d54ce6d62165e047609d778f750d4e0c9b3a0b3dce9f99cbfaef3a90305f"
Mar 10 09:18:34 crc kubenswrapper[4825]: E0310 09:18:34.238325 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3"
Mar 10 09:18:46 crc kubenswrapper[4825]: I0310 09:18:46.237409 4825 scope.go:117] "RemoveContainer" containerID="f110d54ce6d62165e047609d778f750d4e0c9b3a0b3dce9f99cbfaef3a90305f"
Mar 10 09:18:46 crc kubenswrapper[4825]: E0310 09:18:46.238075 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3"
Mar 10 09:18:59 crc kubenswrapper[4825]: I0310 09:18:59.242686 4825 scope.go:117] "RemoveContainer" containerID="f110d54ce6d62165e047609d778f750d4e0c9b3a0b3dce9f99cbfaef3a90305f"
Mar 10 09:19:00 crc kubenswrapper[4825]: I0310 09:19:00.109438 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerStarted","Data":"b92fb551f45e393908a43b65868ed01517bc29380c8beb54ff37ab5e2099e73d"}
Mar 10 09:19:19 crc kubenswrapper[4825]: I0310 09:19:19.043577 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-l2spv"]
Mar 10 09:19:19 crc kubenswrapper[4825]: E0310 09:19:19.045070 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6d7b9e0-a38d-4886-ab3a-b4ba7e6efe57" containerName="oc"
Mar 10 09:19:19 crc kubenswrapper[4825]: I0310 09:19:19.045086 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6d7b9e0-a38d-4886-ab3a-b4ba7e6efe57" containerName="oc"
Mar 10 09:19:19 crc kubenswrapper[4825]: I0310 09:19:19.045338 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6d7b9e0-a38d-4886-ab3a-b4ba7e6efe57" containerName="oc"
Mar 10 09:19:19 crc kubenswrapper[4825]: I0310 09:19:19.046645 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l2spv"
Mar 10 09:19:19 crc kubenswrapper[4825]: I0310 09:19:19.059766 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l2spv"]
Mar 10 09:19:19 crc kubenswrapper[4825]: I0310 09:19:19.165986 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw6gj\" (UniqueName: \"kubernetes.io/projected/02af0e88-84e1-4b02-967d-9ef9317bd815-kube-api-access-nw6gj\") pod \"certified-operators-l2spv\" (UID: \"02af0e88-84e1-4b02-967d-9ef9317bd815\") " pod="openshift-marketplace/certified-operators-l2spv"
Mar 10 09:19:19 crc kubenswrapper[4825]: I0310 09:19:19.166363 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02af0e88-84e1-4b02-967d-9ef9317bd815-utilities\") pod \"certified-operators-l2spv\" (UID: \"02af0e88-84e1-4b02-967d-9ef9317bd815\") " pod="openshift-marketplace/certified-operators-l2spv"
Mar 10 09:19:19 crc kubenswrapper[4825]: I0310 09:19:19.166648 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02af0e88-84e1-4b02-967d-9ef9317bd815-catalog-content\") pod \"certified-operators-l2spv\" (UID: \"02af0e88-84e1-4b02-967d-9ef9317bd815\") " pod="openshift-marketplace/certified-operators-l2spv"
Mar 10 09:19:19 crc kubenswrapper[4825]: I0310 09:19:19.269606 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw6gj\" (UniqueName: \"kubernetes.io/projected/02af0e88-84e1-4b02-967d-9ef9317bd815-kube-api-access-nw6gj\") pod \"certified-operators-l2spv\" (UID: \"02af0e88-84e1-4b02-967d-9ef9317bd815\") " pod="openshift-marketplace/certified-operators-l2spv"
Mar 10 09:19:19 crc kubenswrapper[4825]: I0310 09:19:19.269925 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02af0e88-84e1-4b02-967d-9ef9317bd815-utilities\") pod \"certified-operators-l2spv\" (UID: \"02af0e88-84e1-4b02-967d-9ef9317bd815\") " pod="openshift-marketplace/certified-operators-l2spv"
Mar 10 09:19:19 crc kubenswrapper[4825]: I0310 09:19:19.270009 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02af0e88-84e1-4b02-967d-9ef9317bd815-catalog-content\") pod \"certified-operators-l2spv\" (UID: \"02af0e88-84e1-4b02-967d-9ef9317bd815\") " pod="openshift-marketplace/certified-operators-l2spv"
Mar 10 09:19:19 crc kubenswrapper[4825]: I0310 09:19:19.270463 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02af0e88-84e1-4b02-967d-9ef9317bd815-utilities\") pod \"certified-operators-l2spv\" (UID: \"02af0e88-84e1-4b02-967d-9ef9317bd815\") " pod="openshift-marketplace/certified-operators-l2spv"
Mar 10 09:19:19 crc kubenswrapper[4825]: I0310 09:19:19.270557 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02af0e88-84e1-4b02-967d-9ef9317bd815-catalog-content\") pod \"certified-operators-l2spv\" (UID: \"02af0e88-84e1-4b02-967d-9ef9317bd815\") " pod="openshift-marketplace/certified-operators-l2spv"
Mar 10 09:19:19 crc kubenswrapper[4825]: I0310 09:19:19.296973 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw6gj\" (UniqueName: \"kubernetes.io/projected/02af0e88-84e1-4b02-967d-9ef9317bd815-kube-api-access-nw6gj\") pod \"certified-operators-l2spv\" (UID: \"02af0e88-84e1-4b02-967d-9ef9317bd815\") " pod="openshift-marketplace/certified-operators-l2spv"
Mar 10 09:19:19 crc kubenswrapper[4825]: I0310 09:19:19.371496 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l2spv"
Mar 10 09:19:19 crc kubenswrapper[4825]: I0310 09:19:19.831017 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l2spv"]
Mar 10 09:19:20 crc kubenswrapper[4825]: I0310 09:19:20.318153 4825 generic.go:334] "Generic (PLEG): container finished" podID="02af0e88-84e1-4b02-967d-9ef9317bd815" containerID="26796b5c13477d70dc8b1709e73182385130225c51915829227fd00653e31835" exitCode=0
Mar 10 09:19:20 crc kubenswrapper[4825]: I0310 09:19:20.318205 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l2spv" event={"ID":"02af0e88-84e1-4b02-967d-9ef9317bd815","Type":"ContainerDied","Data":"26796b5c13477d70dc8b1709e73182385130225c51915829227fd00653e31835"}
Mar 10 09:19:20 crc kubenswrapper[4825]: I0310 09:19:20.318461 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l2spv" event={"ID":"02af0e88-84e1-4b02-967d-9ef9317bd815","Type":"ContainerStarted","Data":"a4c56f47a259fe07b5a5f400c73bfdb871ba9dfef379e9b9b4a17c91ef4d55c5"}
Mar 10 09:19:21 crc kubenswrapper[4825]: I0310 09:19:21.327865 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l2spv" event={"ID":"02af0e88-84e1-4b02-967d-9ef9317bd815","Type":"ContainerStarted","Data":"03c75dc577f231622d32b7237e5551adf1d1af3b0b8c81599361c2af8dee05e9"}
Mar 10 09:19:21 crc kubenswrapper[4825]: I0310 09:19:21.444182 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2v7fx"]
Mar 10 09:19:21 crc kubenswrapper[4825]: I0310 09:19:21.446761 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2v7fx"
Mar 10 09:19:21 crc kubenswrapper[4825]: I0310 09:19:21.456928 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2v7fx"]
Mar 10 09:19:21 crc kubenswrapper[4825]: I0310 09:19:21.625889 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d20de7b-2e50-487c-822c-d477400971b4-utilities\") pod \"redhat-operators-2v7fx\" (UID: \"0d20de7b-2e50-487c-822c-d477400971b4\") " pod="openshift-marketplace/redhat-operators-2v7fx"
Mar 10 09:19:21 crc kubenswrapper[4825]: I0310 09:19:21.625938 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8jfw\" (UniqueName: \"kubernetes.io/projected/0d20de7b-2e50-487c-822c-d477400971b4-kube-api-access-w8jfw\") pod \"redhat-operators-2v7fx\" (UID: \"0d20de7b-2e50-487c-822c-d477400971b4\") " pod="openshift-marketplace/redhat-operators-2v7fx"
Mar 10 09:19:21 crc kubenswrapper[4825]: I0310 09:19:21.626010 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d20de7b-2e50-487c-822c-d477400971b4-catalog-content\") pod \"redhat-operators-2v7fx\" (UID: \"0d20de7b-2e50-487c-822c-d477400971b4\") " pod="openshift-marketplace/redhat-operators-2v7fx"
Mar 10 09:19:21 crc kubenswrapper[4825]: I0310 09:19:21.727727 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d20de7b-2e50-487c-822c-d477400971b4-catalog-content\") pod \"redhat-operators-2v7fx\" (UID: \"0d20de7b-2e50-487c-822c-d477400971b4\") " pod="openshift-marketplace/redhat-operators-2v7fx"
Mar 10 09:19:21 crc kubenswrapper[4825]: I0310 09:19:21.727955 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d20de7b-2e50-487c-822c-d477400971b4-utilities\") pod \"redhat-operators-2v7fx\" (UID: \"0d20de7b-2e50-487c-822c-d477400971b4\") " pod="openshift-marketplace/redhat-operators-2v7fx"
Mar 10 09:19:21 crc kubenswrapper[4825]: I0310 09:19:21.727980 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8jfw\" (UniqueName: \"kubernetes.io/projected/0d20de7b-2e50-487c-822c-d477400971b4-kube-api-access-w8jfw\") pod \"redhat-operators-2v7fx\" (UID: \"0d20de7b-2e50-487c-822c-d477400971b4\") " pod="openshift-marketplace/redhat-operators-2v7fx"
Mar 10 09:19:21 crc kubenswrapper[4825]: I0310 09:19:21.728318 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d20de7b-2e50-487c-822c-d477400971b4-catalog-content\") pod \"redhat-operators-2v7fx\" (UID: \"0d20de7b-2e50-487c-822c-d477400971b4\") " pod="openshift-marketplace/redhat-operators-2v7fx"
Mar 10 09:19:21 crc kubenswrapper[4825]: I0310 09:19:21.728422 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d20de7b-2e50-487c-822c-d477400971b4-utilities\") pod \"redhat-operators-2v7fx\" (UID: \"0d20de7b-2e50-487c-822c-d477400971b4\") " pod="openshift-marketplace/redhat-operators-2v7fx"
Mar 10 09:19:21 crc kubenswrapper[4825]: I0310 09:19:21.748847 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8jfw\" (UniqueName: \"kubernetes.io/projected/0d20de7b-2e50-487c-822c-d477400971b4-kube-api-access-w8jfw\") pod \"redhat-operators-2v7fx\" (UID: \"0d20de7b-2e50-487c-822c-d477400971b4\") " pod="openshift-marketplace/redhat-operators-2v7fx"
Mar 10 09:19:21 crc kubenswrapper[4825]: I0310 09:19:21.771820 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2v7fx"
Mar 10 09:19:22 crc kubenswrapper[4825]: I0310 09:19:22.290591 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2v7fx"]
Mar 10 09:19:22 crc kubenswrapper[4825]: W0310 09:19:22.765148 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d20de7b_2e50_487c_822c_d477400971b4.slice/crio-340144dd1bd2768f5294b3dffa635290a5599e788314bc489f18057fe15693a9 WatchSource:0}: Error finding container 340144dd1bd2768f5294b3dffa635290a5599e788314bc489f18057fe15693a9: Status 404 returned error can't find the container with id 340144dd1bd2768f5294b3dffa635290a5599e788314bc489f18057fe15693a9
Mar 10 09:19:23 crc kubenswrapper[4825]: I0310 09:19:23.346592 4825 generic.go:334] "Generic (PLEG): container finished" podID="0d20de7b-2e50-487c-822c-d477400971b4" containerID="8c1cfb2b90cc107d4029cc60e810f98e6c41075437f7140920e18bb5be96ce6b" exitCode=0
Mar 10 09:19:23 crc kubenswrapper[4825]: I0310 09:19:23.346705 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2v7fx" event={"ID":"0d20de7b-2e50-487c-822c-d477400971b4","Type":"ContainerDied","Data":"8c1cfb2b90cc107d4029cc60e810f98e6c41075437f7140920e18bb5be96ce6b"}
Mar 10 09:19:23 crc kubenswrapper[4825]: I0310 09:19:23.347024 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2v7fx" event={"ID":"0d20de7b-2e50-487c-822c-d477400971b4","Type":"ContainerStarted","Data":"340144dd1bd2768f5294b3dffa635290a5599e788314bc489f18057fe15693a9"}
Mar 10 09:19:23 crc kubenswrapper[4825]: I0310 09:19:23.351424 4825 generic.go:334] "Generic (PLEG): container finished" podID="02af0e88-84e1-4b02-967d-9ef9317bd815" containerID="03c75dc577f231622d32b7237e5551adf1d1af3b0b8c81599361c2af8dee05e9" exitCode=0
Mar 10 09:19:23 crc kubenswrapper[4825]: I0310 09:19:23.351485
4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l2spv" event={"ID":"02af0e88-84e1-4b02-967d-9ef9317bd815","Type":"ContainerDied","Data":"03c75dc577f231622d32b7237e5551adf1d1af3b0b8c81599361c2af8dee05e9"} Mar 10 09:19:24 crc kubenswrapper[4825]: I0310 09:19:24.365259 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2v7fx" event={"ID":"0d20de7b-2e50-487c-822c-d477400971b4","Type":"ContainerStarted","Data":"b5fb9367c4094634e9c303d3b469f5aca2664db8adbd7231d9393a454d58af0e"} Mar 10 09:19:24 crc kubenswrapper[4825]: I0310 09:19:24.369360 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l2spv" event={"ID":"02af0e88-84e1-4b02-967d-9ef9317bd815","Type":"ContainerStarted","Data":"523ed57250a4bf8d70a6ada6ec5ace05f4bc89d6ad375b28fdd557d17edd36b3"} Mar 10 09:19:24 crc kubenswrapper[4825]: I0310 09:19:24.414183 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-l2spv" podStartSLOduration=1.950752531 podStartE2EDuration="5.414165701s" podCreationTimestamp="2026-03-10 09:19:19 +0000 UTC" firstStartedPulling="2026-03-10 09:19:20.319676837 +0000 UTC m=+9313.349457452" lastFinishedPulling="2026-03-10 09:19:23.783090007 +0000 UTC m=+9316.812870622" observedRunningTime="2026-03-10 09:19:24.408997874 +0000 UTC m=+9317.438778489" watchObservedRunningTime="2026-03-10 09:19:24.414165701 +0000 UTC m=+9317.443946316" Mar 10 09:19:29 crc kubenswrapper[4825]: I0310 09:19:29.372542 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-l2spv" Mar 10 09:19:29 crc kubenswrapper[4825]: I0310 09:19:29.372910 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-l2spv" Mar 10 09:19:29 crc kubenswrapper[4825]: I0310 09:19:29.442742 4825 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-l2spv" Mar 10 09:19:29 crc kubenswrapper[4825]: I0310 09:19:29.447390 4825 generic.go:334] "Generic (PLEG): container finished" podID="0d20de7b-2e50-487c-822c-d477400971b4" containerID="b5fb9367c4094634e9c303d3b469f5aca2664db8adbd7231d9393a454d58af0e" exitCode=0 Mar 10 09:19:29 crc kubenswrapper[4825]: I0310 09:19:29.447520 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2v7fx" event={"ID":"0d20de7b-2e50-487c-822c-d477400971b4","Type":"ContainerDied","Data":"b5fb9367c4094634e9c303d3b469f5aca2664db8adbd7231d9393a454d58af0e"} Mar 10 09:19:29 crc kubenswrapper[4825]: I0310 09:19:29.520311 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-l2spv" Mar 10 09:19:30 crc kubenswrapper[4825]: I0310 09:19:30.459045 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2v7fx" event={"ID":"0d20de7b-2e50-487c-822c-d477400971b4","Type":"ContainerStarted","Data":"397612ff7be6b26e953d64888eca36c12e5015477454139f8ef004b6c8a036c1"} Mar 10 09:19:30 crc kubenswrapper[4825]: I0310 09:19:30.485779 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2v7fx" podStartSLOduration=2.964102759 podStartE2EDuration="9.485757243s" podCreationTimestamp="2026-03-10 09:19:21 +0000 UTC" firstStartedPulling="2026-03-10 09:19:23.348544168 +0000 UTC m=+9316.378324783" lastFinishedPulling="2026-03-10 09:19:29.870198642 +0000 UTC m=+9322.899979267" observedRunningTime="2026-03-10 09:19:30.477907474 +0000 UTC m=+9323.507688099" watchObservedRunningTime="2026-03-10 09:19:30.485757243 +0000 UTC m=+9323.515537858" Mar 10 09:19:31 crc kubenswrapper[4825]: I0310 09:19:31.772652 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-2v7fx" Mar 10 09:19:31 crc kubenswrapper[4825]: I0310 09:19:31.772939 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2v7fx" Mar 10 09:19:31 crc kubenswrapper[4825]: I0310 09:19:31.829833 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l2spv"] Mar 10 09:19:31 crc kubenswrapper[4825]: I0310 09:19:31.830062 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-l2spv" podUID="02af0e88-84e1-4b02-967d-9ef9317bd815" containerName="registry-server" containerID="cri-o://523ed57250a4bf8d70a6ada6ec5ace05f4bc89d6ad375b28fdd557d17edd36b3" gracePeriod=2 Mar 10 09:19:32 crc kubenswrapper[4825]: I0310 09:19:32.368840 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l2spv" Mar 10 09:19:32 crc kubenswrapper[4825]: I0310 09:19:32.468422 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02af0e88-84e1-4b02-967d-9ef9317bd815-utilities\") pod \"02af0e88-84e1-4b02-967d-9ef9317bd815\" (UID: \"02af0e88-84e1-4b02-967d-9ef9317bd815\") " Mar 10 09:19:32 crc kubenswrapper[4825]: I0310 09:19:32.468531 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02af0e88-84e1-4b02-967d-9ef9317bd815-catalog-content\") pod \"02af0e88-84e1-4b02-967d-9ef9317bd815\" (UID: \"02af0e88-84e1-4b02-967d-9ef9317bd815\") " Mar 10 09:19:32 crc kubenswrapper[4825]: I0310 09:19:32.468660 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nw6gj\" (UniqueName: \"kubernetes.io/projected/02af0e88-84e1-4b02-967d-9ef9317bd815-kube-api-access-nw6gj\") pod 
\"02af0e88-84e1-4b02-967d-9ef9317bd815\" (UID: \"02af0e88-84e1-4b02-967d-9ef9317bd815\") " Mar 10 09:19:32 crc kubenswrapper[4825]: I0310 09:19:32.477017 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02af0e88-84e1-4b02-967d-9ef9317bd815-kube-api-access-nw6gj" (OuterVolumeSpecName: "kube-api-access-nw6gj") pod "02af0e88-84e1-4b02-967d-9ef9317bd815" (UID: "02af0e88-84e1-4b02-967d-9ef9317bd815"). InnerVolumeSpecName "kube-api-access-nw6gj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:19:32 crc kubenswrapper[4825]: I0310 09:19:32.477654 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02af0e88-84e1-4b02-967d-9ef9317bd815-utilities" (OuterVolumeSpecName: "utilities") pod "02af0e88-84e1-4b02-967d-9ef9317bd815" (UID: "02af0e88-84e1-4b02-967d-9ef9317bd815"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:19:32 crc kubenswrapper[4825]: I0310 09:19:32.484895 4825 generic.go:334] "Generic (PLEG): container finished" podID="02af0e88-84e1-4b02-967d-9ef9317bd815" containerID="523ed57250a4bf8d70a6ada6ec5ace05f4bc89d6ad375b28fdd557d17edd36b3" exitCode=0 Mar 10 09:19:32 crc kubenswrapper[4825]: I0310 09:19:32.484956 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l2spv" event={"ID":"02af0e88-84e1-4b02-967d-9ef9317bd815","Type":"ContainerDied","Data":"523ed57250a4bf8d70a6ada6ec5ace05f4bc89d6ad375b28fdd557d17edd36b3"} Mar 10 09:19:32 crc kubenswrapper[4825]: I0310 09:19:32.484989 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l2spv" event={"ID":"02af0e88-84e1-4b02-967d-9ef9317bd815","Type":"ContainerDied","Data":"a4c56f47a259fe07b5a5f400c73bfdb871ba9dfef379e9b9b4a17c91ef4d55c5"} Mar 10 09:19:32 crc kubenswrapper[4825]: I0310 09:19:32.485011 4825 scope.go:117] "RemoveContainer" 
containerID="523ed57250a4bf8d70a6ada6ec5ace05f4bc89d6ad375b28fdd557d17edd36b3" Mar 10 09:19:32 crc kubenswrapper[4825]: I0310 09:19:32.485119 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l2spv" Mar 10 09:19:32 crc kubenswrapper[4825]: I0310 09:19:32.553555 4825 scope.go:117] "RemoveContainer" containerID="03c75dc577f231622d32b7237e5551adf1d1af3b0b8c81599361c2af8dee05e9" Mar 10 09:19:32 crc kubenswrapper[4825]: I0310 09:19:32.571420 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nw6gj\" (UniqueName: \"kubernetes.io/projected/02af0e88-84e1-4b02-967d-9ef9317bd815-kube-api-access-nw6gj\") on node \"crc\" DevicePath \"\"" Mar 10 09:19:32 crc kubenswrapper[4825]: I0310 09:19:32.571453 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02af0e88-84e1-4b02-967d-9ef9317bd815-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:19:32 crc kubenswrapper[4825]: I0310 09:19:32.579194 4825 scope.go:117] "RemoveContainer" containerID="26796b5c13477d70dc8b1709e73182385130225c51915829227fd00653e31835" Mar 10 09:19:32 crc kubenswrapper[4825]: I0310 09:19:32.629304 4825 scope.go:117] "RemoveContainer" containerID="523ed57250a4bf8d70a6ada6ec5ace05f4bc89d6ad375b28fdd557d17edd36b3" Mar 10 09:19:32 crc kubenswrapper[4825]: E0310 09:19:32.630422 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"523ed57250a4bf8d70a6ada6ec5ace05f4bc89d6ad375b28fdd557d17edd36b3\": container with ID starting with 523ed57250a4bf8d70a6ada6ec5ace05f4bc89d6ad375b28fdd557d17edd36b3 not found: ID does not exist" containerID="523ed57250a4bf8d70a6ada6ec5ace05f4bc89d6ad375b28fdd557d17edd36b3" Mar 10 09:19:32 crc kubenswrapper[4825]: I0310 09:19:32.630479 4825 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"523ed57250a4bf8d70a6ada6ec5ace05f4bc89d6ad375b28fdd557d17edd36b3"} err="failed to get container status \"523ed57250a4bf8d70a6ada6ec5ace05f4bc89d6ad375b28fdd557d17edd36b3\": rpc error: code = NotFound desc = could not find container \"523ed57250a4bf8d70a6ada6ec5ace05f4bc89d6ad375b28fdd557d17edd36b3\": container with ID starting with 523ed57250a4bf8d70a6ada6ec5ace05f4bc89d6ad375b28fdd557d17edd36b3 not found: ID does not exist" Mar 10 09:19:32 crc kubenswrapper[4825]: I0310 09:19:32.630509 4825 scope.go:117] "RemoveContainer" containerID="03c75dc577f231622d32b7237e5551adf1d1af3b0b8c81599361c2af8dee05e9" Mar 10 09:19:32 crc kubenswrapper[4825]: E0310 09:19:32.631044 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03c75dc577f231622d32b7237e5551adf1d1af3b0b8c81599361c2af8dee05e9\": container with ID starting with 03c75dc577f231622d32b7237e5551adf1d1af3b0b8c81599361c2af8dee05e9 not found: ID does not exist" containerID="03c75dc577f231622d32b7237e5551adf1d1af3b0b8c81599361c2af8dee05e9" Mar 10 09:19:32 crc kubenswrapper[4825]: I0310 09:19:32.631074 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03c75dc577f231622d32b7237e5551adf1d1af3b0b8c81599361c2af8dee05e9"} err="failed to get container status \"03c75dc577f231622d32b7237e5551adf1d1af3b0b8c81599361c2af8dee05e9\": rpc error: code = NotFound desc = could not find container \"03c75dc577f231622d32b7237e5551adf1d1af3b0b8c81599361c2af8dee05e9\": container with ID starting with 03c75dc577f231622d32b7237e5551adf1d1af3b0b8c81599361c2af8dee05e9 not found: ID does not exist" Mar 10 09:19:32 crc kubenswrapper[4825]: I0310 09:19:32.631089 4825 scope.go:117] "RemoveContainer" containerID="26796b5c13477d70dc8b1709e73182385130225c51915829227fd00653e31835" Mar 10 09:19:32 crc kubenswrapper[4825]: E0310 09:19:32.631638 4825 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"26796b5c13477d70dc8b1709e73182385130225c51915829227fd00653e31835\": container with ID starting with 26796b5c13477d70dc8b1709e73182385130225c51915829227fd00653e31835 not found: ID does not exist" containerID="26796b5c13477d70dc8b1709e73182385130225c51915829227fd00653e31835" Mar 10 09:19:32 crc kubenswrapper[4825]: I0310 09:19:32.631693 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26796b5c13477d70dc8b1709e73182385130225c51915829227fd00653e31835"} err="failed to get container status \"26796b5c13477d70dc8b1709e73182385130225c51915829227fd00653e31835\": rpc error: code = NotFound desc = could not find container \"26796b5c13477d70dc8b1709e73182385130225c51915829227fd00653e31835\": container with ID starting with 26796b5c13477d70dc8b1709e73182385130225c51915829227fd00653e31835 not found: ID does not exist" Mar 10 09:19:32 crc kubenswrapper[4825]: I0310 09:19:32.827655 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2v7fx" podUID="0d20de7b-2e50-487c-822c-d477400971b4" containerName="registry-server" probeResult="failure" output=< Mar 10 09:19:32 crc kubenswrapper[4825]: timeout: failed to connect service ":50051" within 1s Mar 10 09:19:32 crc kubenswrapper[4825]: > Mar 10 09:19:33 crc kubenswrapper[4825]: I0310 09:19:33.073381 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02af0e88-84e1-4b02-967d-9ef9317bd815-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "02af0e88-84e1-4b02-967d-9ef9317bd815" (UID: "02af0e88-84e1-4b02-967d-9ef9317bd815"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:19:33 crc kubenswrapper[4825]: I0310 09:19:33.083324 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02af0e88-84e1-4b02-967d-9ef9317bd815-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:19:33 crc kubenswrapper[4825]: I0310 09:19:33.128362 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l2spv"] Mar 10 09:19:33 crc kubenswrapper[4825]: I0310 09:19:33.139720 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-l2spv"] Mar 10 09:19:33 crc kubenswrapper[4825]: I0310 09:19:33.251837 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02af0e88-84e1-4b02-967d-9ef9317bd815" path="/var/lib/kubelet/pods/02af0e88-84e1-4b02-967d-9ef9317bd815/volumes" Mar 10 09:19:41 crc kubenswrapper[4825]: I0310 09:19:41.841284 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2v7fx" Mar 10 09:19:41 crc kubenswrapper[4825]: I0310 09:19:41.899489 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2v7fx" Mar 10 09:19:42 crc kubenswrapper[4825]: I0310 09:19:42.098956 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2v7fx"] Mar 10 09:19:43 crc kubenswrapper[4825]: I0310 09:19:43.603850 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2v7fx" podUID="0d20de7b-2e50-487c-822c-d477400971b4" containerName="registry-server" containerID="cri-o://397612ff7be6b26e953d64888eca36c12e5015477454139f8ef004b6c8a036c1" gracePeriod=2 Mar 10 09:19:44 crc kubenswrapper[4825]: I0310 09:19:44.196874 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2v7fx" Mar 10 09:19:44 crc kubenswrapper[4825]: I0310 09:19:44.261298 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8jfw\" (UniqueName: \"kubernetes.io/projected/0d20de7b-2e50-487c-822c-d477400971b4-kube-api-access-w8jfw\") pod \"0d20de7b-2e50-487c-822c-d477400971b4\" (UID: \"0d20de7b-2e50-487c-822c-d477400971b4\") " Mar 10 09:19:44 crc kubenswrapper[4825]: I0310 09:19:44.261356 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d20de7b-2e50-487c-822c-d477400971b4-catalog-content\") pod \"0d20de7b-2e50-487c-822c-d477400971b4\" (UID: \"0d20de7b-2e50-487c-822c-d477400971b4\") " Mar 10 09:19:44 crc kubenswrapper[4825]: I0310 09:19:44.261477 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d20de7b-2e50-487c-822c-d477400971b4-utilities\") pod \"0d20de7b-2e50-487c-822c-d477400971b4\" (UID: \"0d20de7b-2e50-487c-822c-d477400971b4\") " Mar 10 09:19:44 crc kubenswrapper[4825]: I0310 09:19:44.262457 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d20de7b-2e50-487c-822c-d477400971b4-utilities" (OuterVolumeSpecName: "utilities") pod "0d20de7b-2e50-487c-822c-d477400971b4" (UID: "0d20de7b-2e50-487c-822c-d477400971b4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:19:44 crc kubenswrapper[4825]: I0310 09:19:44.287312 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d20de7b-2e50-487c-822c-d477400971b4-kube-api-access-w8jfw" (OuterVolumeSpecName: "kube-api-access-w8jfw") pod "0d20de7b-2e50-487c-822c-d477400971b4" (UID: "0d20de7b-2e50-487c-822c-d477400971b4"). InnerVolumeSpecName "kube-api-access-w8jfw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:19:44 crc kubenswrapper[4825]: I0310 09:19:44.362939 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8jfw\" (UniqueName: \"kubernetes.io/projected/0d20de7b-2e50-487c-822c-d477400971b4-kube-api-access-w8jfw\") on node \"crc\" DevicePath \"\"" Mar 10 09:19:44 crc kubenswrapper[4825]: I0310 09:19:44.362969 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d20de7b-2e50-487c-822c-d477400971b4-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:19:44 crc kubenswrapper[4825]: I0310 09:19:44.385232 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d20de7b-2e50-487c-822c-d477400971b4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0d20de7b-2e50-487c-822c-d477400971b4" (UID: "0d20de7b-2e50-487c-822c-d477400971b4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:19:44 crc kubenswrapper[4825]: I0310 09:19:44.464047 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d20de7b-2e50-487c-822c-d477400971b4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:19:44 crc kubenswrapper[4825]: I0310 09:19:44.614351 4825 generic.go:334] "Generic (PLEG): container finished" podID="0d20de7b-2e50-487c-822c-d477400971b4" containerID="397612ff7be6b26e953d64888eca36c12e5015477454139f8ef004b6c8a036c1" exitCode=0 Mar 10 09:19:44 crc kubenswrapper[4825]: I0310 09:19:44.614412 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2v7fx" event={"ID":"0d20de7b-2e50-487c-822c-d477400971b4","Type":"ContainerDied","Data":"397612ff7be6b26e953d64888eca36c12e5015477454139f8ef004b6c8a036c1"} Mar 10 09:19:44 crc kubenswrapper[4825]: I0310 09:19:44.614451 4825 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2v7fx" Mar 10 09:19:44 crc kubenswrapper[4825]: I0310 09:19:44.614465 4825 scope.go:117] "RemoveContainer" containerID="397612ff7be6b26e953d64888eca36c12e5015477454139f8ef004b6c8a036c1" Mar 10 09:19:44 crc kubenswrapper[4825]: I0310 09:19:44.614450 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2v7fx" event={"ID":"0d20de7b-2e50-487c-822c-d477400971b4","Type":"ContainerDied","Data":"340144dd1bd2768f5294b3dffa635290a5599e788314bc489f18057fe15693a9"} Mar 10 09:19:44 crc kubenswrapper[4825]: I0310 09:19:44.640296 4825 scope.go:117] "RemoveContainer" containerID="b5fb9367c4094634e9c303d3b469f5aca2664db8adbd7231d9393a454d58af0e" Mar 10 09:19:44 crc kubenswrapper[4825]: I0310 09:19:44.658670 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2v7fx"] Mar 10 09:19:44 crc kubenswrapper[4825]: I0310 09:19:44.669048 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2v7fx"] Mar 10 09:19:44 crc kubenswrapper[4825]: I0310 09:19:44.675376 4825 scope.go:117] "RemoveContainer" containerID="8c1cfb2b90cc107d4029cc60e810f98e6c41075437f7140920e18bb5be96ce6b" Mar 10 09:19:44 crc kubenswrapper[4825]: I0310 09:19:44.737267 4825 scope.go:117] "RemoveContainer" containerID="397612ff7be6b26e953d64888eca36c12e5015477454139f8ef004b6c8a036c1" Mar 10 09:19:44 crc kubenswrapper[4825]: E0310 09:19:44.738020 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"397612ff7be6b26e953d64888eca36c12e5015477454139f8ef004b6c8a036c1\": container with ID starting with 397612ff7be6b26e953d64888eca36c12e5015477454139f8ef004b6c8a036c1 not found: ID does not exist" containerID="397612ff7be6b26e953d64888eca36c12e5015477454139f8ef004b6c8a036c1" Mar 10 09:19:44 crc kubenswrapper[4825]: I0310 09:19:44.738184 4825 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"397612ff7be6b26e953d64888eca36c12e5015477454139f8ef004b6c8a036c1"} err="failed to get container status \"397612ff7be6b26e953d64888eca36c12e5015477454139f8ef004b6c8a036c1\": rpc error: code = NotFound desc = could not find container \"397612ff7be6b26e953d64888eca36c12e5015477454139f8ef004b6c8a036c1\": container with ID starting with 397612ff7be6b26e953d64888eca36c12e5015477454139f8ef004b6c8a036c1 not found: ID does not exist" Mar 10 09:19:44 crc kubenswrapper[4825]: I0310 09:19:44.738227 4825 scope.go:117] "RemoveContainer" containerID="b5fb9367c4094634e9c303d3b469f5aca2664db8adbd7231d9393a454d58af0e" Mar 10 09:19:44 crc kubenswrapper[4825]: E0310 09:19:44.738610 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5fb9367c4094634e9c303d3b469f5aca2664db8adbd7231d9393a454d58af0e\": container with ID starting with b5fb9367c4094634e9c303d3b469f5aca2664db8adbd7231d9393a454d58af0e not found: ID does not exist" containerID="b5fb9367c4094634e9c303d3b469f5aca2664db8adbd7231d9393a454d58af0e" Mar 10 09:19:44 crc kubenswrapper[4825]: I0310 09:19:44.738662 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5fb9367c4094634e9c303d3b469f5aca2664db8adbd7231d9393a454d58af0e"} err="failed to get container status \"b5fb9367c4094634e9c303d3b469f5aca2664db8adbd7231d9393a454d58af0e\": rpc error: code = NotFound desc = could not find container \"b5fb9367c4094634e9c303d3b469f5aca2664db8adbd7231d9393a454d58af0e\": container with ID starting with b5fb9367c4094634e9c303d3b469f5aca2664db8adbd7231d9393a454d58af0e not found: ID does not exist" Mar 10 09:19:44 crc kubenswrapper[4825]: I0310 09:19:44.738696 4825 scope.go:117] "RemoveContainer" containerID="8c1cfb2b90cc107d4029cc60e810f98e6c41075437f7140920e18bb5be96ce6b" Mar 10 09:19:44 crc kubenswrapper[4825]: E0310 
09:19:44.739032 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c1cfb2b90cc107d4029cc60e810f98e6c41075437f7140920e18bb5be96ce6b\": container with ID starting with 8c1cfb2b90cc107d4029cc60e810f98e6c41075437f7140920e18bb5be96ce6b not found: ID does not exist" containerID="8c1cfb2b90cc107d4029cc60e810f98e6c41075437f7140920e18bb5be96ce6b" Mar 10 09:19:44 crc kubenswrapper[4825]: I0310 09:19:44.739086 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c1cfb2b90cc107d4029cc60e810f98e6c41075437f7140920e18bb5be96ce6b"} err="failed to get container status \"8c1cfb2b90cc107d4029cc60e810f98e6c41075437f7140920e18bb5be96ce6b\": rpc error: code = NotFound desc = could not find container \"8c1cfb2b90cc107d4029cc60e810f98e6c41075437f7140920e18bb5be96ce6b\": container with ID starting with 8c1cfb2b90cc107d4029cc60e810f98e6c41075437f7140920e18bb5be96ce6b not found: ID does not exist" Mar 10 09:19:45 crc kubenswrapper[4825]: I0310 09:19:45.247387 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d20de7b-2e50-487c-822c-d477400971b4" path="/var/lib/kubelet/pods/0d20de7b-2e50-487c-822c-d477400971b4/volumes" Mar 10 09:20:00 crc kubenswrapper[4825]: I0310 09:20:00.160300 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552240-pr7kz"] Mar 10 09:20:00 crc kubenswrapper[4825]: E0310 09:20:00.161253 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02af0e88-84e1-4b02-967d-9ef9317bd815" containerName="registry-server" Mar 10 09:20:00 crc kubenswrapper[4825]: I0310 09:20:00.161270 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="02af0e88-84e1-4b02-967d-9ef9317bd815" containerName="registry-server" Mar 10 09:20:00 crc kubenswrapper[4825]: E0310 09:20:00.161290 4825 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0d20de7b-2e50-487c-822c-d477400971b4" containerName="extract-utilities" Mar 10 09:20:00 crc kubenswrapper[4825]: I0310 09:20:00.161300 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d20de7b-2e50-487c-822c-d477400971b4" containerName="extract-utilities" Mar 10 09:20:00 crc kubenswrapper[4825]: E0310 09:20:00.161308 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d20de7b-2e50-487c-822c-d477400971b4" containerName="extract-content" Mar 10 09:20:00 crc kubenswrapper[4825]: I0310 09:20:00.161316 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d20de7b-2e50-487c-822c-d477400971b4" containerName="extract-content" Mar 10 09:20:00 crc kubenswrapper[4825]: E0310 09:20:00.161335 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02af0e88-84e1-4b02-967d-9ef9317bd815" containerName="extract-utilities" Mar 10 09:20:00 crc kubenswrapper[4825]: I0310 09:20:00.161342 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="02af0e88-84e1-4b02-967d-9ef9317bd815" containerName="extract-utilities" Mar 10 09:20:00 crc kubenswrapper[4825]: E0310 09:20:00.161358 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02af0e88-84e1-4b02-967d-9ef9317bd815" containerName="extract-content" Mar 10 09:20:00 crc kubenswrapper[4825]: I0310 09:20:00.161367 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="02af0e88-84e1-4b02-967d-9ef9317bd815" containerName="extract-content" Mar 10 09:20:00 crc kubenswrapper[4825]: E0310 09:20:00.161385 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d20de7b-2e50-487c-822c-d477400971b4" containerName="registry-server" Mar 10 09:20:00 crc kubenswrapper[4825]: I0310 09:20:00.161393 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d20de7b-2e50-487c-822c-d477400971b4" containerName="registry-server" Mar 10 09:20:00 crc kubenswrapper[4825]: I0310 09:20:00.161662 4825 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="0d20de7b-2e50-487c-822c-d477400971b4" containerName="registry-server" Mar 10 09:20:00 crc kubenswrapper[4825]: I0310 09:20:00.161690 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="02af0e88-84e1-4b02-967d-9ef9317bd815" containerName="registry-server" Mar 10 09:20:00 crc kubenswrapper[4825]: I0310 09:20:00.163826 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552240-pr7kz" Mar 10 09:20:00 crc kubenswrapper[4825]: I0310 09:20:00.168801 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 09:20:00 crc kubenswrapper[4825]: I0310 09:20:00.169329 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:20:00 crc kubenswrapper[4825]: I0310 09:20:00.170055 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:20:00 crc kubenswrapper[4825]: I0310 09:20:00.180370 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552240-pr7kz"] Mar 10 09:20:00 crc kubenswrapper[4825]: I0310 09:20:00.267766 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tvmr\" (UniqueName: \"kubernetes.io/projected/9f2d034b-b5fe-43dd-bb3e-f0e6abcaf576-kube-api-access-9tvmr\") pod \"auto-csr-approver-29552240-pr7kz\" (UID: \"9f2d034b-b5fe-43dd-bb3e-f0e6abcaf576\") " pod="openshift-infra/auto-csr-approver-29552240-pr7kz" Mar 10 09:20:00 crc kubenswrapper[4825]: I0310 09:20:00.370253 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tvmr\" (UniqueName: \"kubernetes.io/projected/9f2d034b-b5fe-43dd-bb3e-f0e6abcaf576-kube-api-access-9tvmr\") pod \"auto-csr-approver-29552240-pr7kz\" (UID: \"9f2d034b-b5fe-43dd-bb3e-f0e6abcaf576\") " 
pod="openshift-infra/auto-csr-approver-29552240-pr7kz" Mar 10 09:20:00 crc kubenswrapper[4825]: I0310 09:20:00.961156 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tvmr\" (UniqueName: \"kubernetes.io/projected/9f2d034b-b5fe-43dd-bb3e-f0e6abcaf576-kube-api-access-9tvmr\") pod \"auto-csr-approver-29552240-pr7kz\" (UID: \"9f2d034b-b5fe-43dd-bb3e-f0e6abcaf576\") " pod="openshift-infra/auto-csr-approver-29552240-pr7kz" Mar 10 09:20:01 crc kubenswrapper[4825]: I0310 09:20:01.105972 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552240-pr7kz" Mar 10 09:20:01 crc kubenswrapper[4825]: I0310 09:20:01.532723 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552240-pr7kz"] Mar 10 09:20:01 crc kubenswrapper[4825]: W0310 09:20:01.538197 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f2d034b_b5fe_43dd_bb3e_f0e6abcaf576.slice/crio-8d505977a64bcc733c86292d34277177e112959ccfe0b2185aa1d882af2e5c39 WatchSource:0}: Error finding container 8d505977a64bcc733c86292d34277177e112959ccfe0b2185aa1d882af2e5c39: Status 404 returned error can't find the container with id 8d505977a64bcc733c86292d34277177e112959ccfe0b2185aa1d882af2e5c39 Mar 10 09:20:01 crc kubenswrapper[4825]: I0310 09:20:01.795825 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552240-pr7kz" event={"ID":"9f2d034b-b5fe-43dd-bb3e-f0e6abcaf576","Type":"ContainerStarted","Data":"8d505977a64bcc733c86292d34277177e112959ccfe0b2185aa1d882af2e5c39"} Mar 10 09:20:03 crc kubenswrapper[4825]: I0310 09:20:03.818742 4825 generic.go:334] "Generic (PLEG): container finished" podID="9f2d034b-b5fe-43dd-bb3e-f0e6abcaf576" containerID="832c2d8c4318ea276ca58758b3333554ac72c5ba86987707c7c5d68b4624661e" exitCode=0 Mar 10 09:20:03 crc kubenswrapper[4825]: 
I0310 09:20:03.818958 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552240-pr7kz" event={"ID":"9f2d034b-b5fe-43dd-bb3e-f0e6abcaf576","Type":"ContainerDied","Data":"832c2d8c4318ea276ca58758b3333554ac72c5ba86987707c7c5d68b4624661e"} Mar 10 09:20:05 crc kubenswrapper[4825]: I0310 09:20:05.281631 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552240-pr7kz" Mar 10 09:20:05 crc kubenswrapper[4825]: I0310 09:20:05.371052 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tvmr\" (UniqueName: \"kubernetes.io/projected/9f2d034b-b5fe-43dd-bb3e-f0e6abcaf576-kube-api-access-9tvmr\") pod \"9f2d034b-b5fe-43dd-bb3e-f0e6abcaf576\" (UID: \"9f2d034b-b5fe-43dd-bb3e-f0e6abcaf576\") " Mar 10 09:20:05 crc kubenswrapper[4825]: I0310 09:20:05.378475 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f2d034b-b5fe-43dd-bb3e-f0e6abcaf576-kube-api-access-9tvmr" (OuterVolumeSpecName: "kube-api-access-9tvmr") pod "9f2d034b-b5fe-43dd-bb3e-f0e6abcaf576" (UID: "9f2d034b-b5fe-43dd-bb3e-f0e6abcaf576"). InnerVolumeSpecName "kube-api-access-9tvmr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:20:05 crc kubenswrapper[4825]: I0310 09:20:05.474196 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tvmr\" (UniqueName: \"kubernetes.io/projected/9f2d034b-b5fe-43dd-bb3e-f0e6abcaf576-kube-api-access-9tvmr\") on node \"crc\" DevicePath \"\"" Mar 10 09:20:05 crc kubenswrapper[4825]: I0310 09:20:05.844699 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552240-pr7kz" event={"ID":"9f2d034b-b5fe-43dd-bb3e-f0e6abcaf576","Type":"ContainerDied","Data":"8d505977a64bcc733c86292d34277177e112959ccfe0b2185aa1d882af2e5c39"} Mar 10 09:20:05 crc kubenswrapper[4825]: I0310 09:20:05.845059 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d505977a64bcc733c86292d34277177e112959ccfe0b2185aa1d882af2e5c39" Mar 10 09:20:05 crc kubenswrapper[4825]: I0310 09:20:05.845215 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552240-pr7kz" Mar 10 09:20:06 crc kubenswrapper[4825]: I0310 09:20:06.376053 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552234-8shqx"] Mar 10 09:20:06 crc kubenswrapper[4825]: I0310 09:20:06.387537 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552234-8shqx"] Mar 10 09:20:07 crc kubenswrapper[4825]: I0310 09:20:07.251347 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc431574-9341-47a0-88f3-00ff2fd2ff18" path="/var/lib/kubelet/pods/fc431574-9341-47a0-88f3-00ff2fd2ff18/volumes" Mar 10 09:20:08 crc kubenswrapper[4825]: I0310 09:20:08.655740 4825 scope.go:117] "RemoveContainer" containerID="d32c990b52e32d4dec37a049a4bb75bfb514d34005170a26c9ba9eac6d3013d1" Mar 10 09:20:38 crc kubenswrapper[4825]: I0310 09:20:38.204353 4825 generic.go:334] "Generic (PLEG): container finished" 
podID="f26d1ad6-86b3-4fef-84d4-78cd5df47576" containerID="a0cfb3e5f662ad76d0b02bdf85ceb7a19f5c0a6ffb0a5f8f43e9f1c5c2ae9c9c" exitCode=0 Mar 10 09:20:38 crc kubenswrapper[4825]: I0310 09:20:38.204431 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"f26d1ad6-86b3-4fef-84d4-78cd5df47576","Type":"ContainerDied","Data":"a0cfb3e5f662ad76d0b02bdf85ceb7a19f5c0a6ffb0a5f8f43e9f1c5c2ae9c9c"} Mar 10 09:20:39 crc kubenswrapper[4825]: I0310 09:20:39.649204 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 10 09:20:39 crc kubenswrapper[4825]: I0310 09:20:39.748532 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgb25\" (UniqueName: \"kubernetes.io/projected/f26d1ad6-86b3-4fef-84d4-78cd5df47576-kube-api-access-hgb25\") pod \"f26d1ad6-86b3-4fef-84d4-78cd5df47576\" (UID: \"f26d1ad6-86b3-4fef-84d4-78cd5df47576\") " Mar 10 09:20:39 crc kubenswrapper[4825]: I0310 09:20:39.748596 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/f26d1ad6-86b3-4fef-84d4-78cd5df47576-test-operator-ephemeral-workdir\") pod \"f26d1ad6-86b3-4fef-84d4-78cd5df47576\" (UID: \"f26d1ad6-86b3-4fef-84d4-78cd5df47576\") " Mar 10 09:20:39 crc kubenswrapper[4825]: I0310 09:20:39.748711 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/f26d1ad6-86b3-4fef-84d4-78cd5df47576-ca-certs\") pod \"f26d1ad6-86b3-4fef-84d4-78cd5df47576\" (UID: \"f26d1ad6-86b3-4fef-84d4-78cd5df47576\") " Mar 10 09:20:39 crc kubenswrapper[4825]: I0310 09:20:39.748781 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/f26d1ad6-86b3-4fef-84d4-78cd5df47576-test-operator-ephemeral-temporary\") pod \"f26d1ad6-86b3-4fef-84d4-78cd5df47576\" (UID: \"f26d1ad6-86b3-4fef-84d4-78cd5df47576\") " Mar 10 09:20:39 crc kubenswrapper[4825]: I0310 09:20:39.748821 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f26d1ad6-86b3-4fef-84d4-78cd5df47576-config-data\") pod \"f26d1ad6-86b3-4fef-84d4-78cd5df47576\" (UID: \"f26d1ad6-86b3-4fef-84d4-78cd5df47576\") " Mar 10 09:20:39 crc kubenswrapper[4825]: I0310 09:20:39.748837 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f26d1ad6-86b3-4fef-84d4-78cd5df47576-openstack-config-secret\") pod \"f26d1ad6-86b3-4fef-84d4-78cd5df47576\" (UID: \"f26d1ad6-86b3-4fef-84d4-78cd5df47576\") " Mar 10 09:20:39 crc kubenswrapper[4825]: I0310 09:20:39.748878 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f26d1ad6-86b3-4fef-84d4-78cd5df47576-openstack-config\") pod \"f26d1ad6-86b3-4fef-84d4-78cd5df47576\" (UID: \"f26d1ad6-86b3-4fef-84d4-78cd5df47576\") " Mar 10 09:20:39 crc kubenswrapper[4825]: I0310 09:20:39.748893 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f26d1ad6-86b3-4fef-84d4-78cd5df47576-ssh-key\") pod \"f26d1ad6-86b3-4fef-84d4-78cd5df47576\" (UID: \"f26d1ad6-86b3-4fef-84d4-78cd5df47576\") " Mar 10 09:20:39 crc kubenswrapper[4825]: I0310 09:20:39.749395 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"f26d1ad6-86b3-4fef-84d4-78cd5df47576\" (UID: \"f26d1ad6-86b3-4fef-84d4-78cd5df47576\") " Mar 10 09:20:39 crc kubenswrapper[4825]: I0310 09:20:39.750208 
4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f26d1ad6-86b3-4fef-84d4-78cd5df47576-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "f26d1ad6-86b3-4fef-84d4-78cd5df47576" (UID: "f26d1ad6-86b3-4fef-84d4-78cd5df47576"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:20:39 crc kubenswrapper[4825]: I0310 09:20:39.750820 4825 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/f26d1ad6-86b3-4fef-84d4-78cd5df47576-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 10 09:20:39 crc kubenswrapper[4825]: I0310 09:20:39.751002 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f26d1ad6-86b3-4fef-84d4-78cd5df47576-config-data" (OuterVolumeSpecName: "config-data") pod "f26d1ad6-86b3-4fef-84d4-78cd5df47576" (UID: "f26d1ad6-86b3-4fef-84d4-78cd5df47576"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:20:39 crc kubenswrapper[4825]: I0310 09:20:39.755402 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "test-operator-logs") pod "f26d1ad6-86b3-4fef-84d4-78cd5df47576" (UID: "f26d1ad6-86b3-4fef-84d4-78cd5df47576"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 09:20:39 crc kubenswrapper[4825]: I0310 09:20:39.755956 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f26d1ad6-86b3-4fef-84d4-78cd5df47576-kube-api-access-hgb25" (OuterVolumeSpecName: "kube-api-access-hgb25") pod "f26d1ad6-86b3-4fef-84d4-78cd5df47576" (UID: "f26d1ad6-86b3-4fef-84d4-78cd5df47576"). 
InnerVolumeSpecName "kube-api-access-hgb25". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:20:39 crc kubenswrapper[4825]: I0310 09:20:39.780158 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f26d1ad6-86b3-4fef-84d4-78cd5df47576-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f26d1ad6-86b3-4fef-84d4-78cd5df47576" (UID: "f26d1ad6-86b3-4fef-84d4-78cd5df47576"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:20:39 crc kubenswrapper[4825]: I0310 09:20:39.781756 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f26d1ad6-86b3-4fef-84d4-78cd5df47576-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "f26d1ad6-86b3-4fef-84d4-78cd5df47576" (UID: "f26d1ad6-86b3-4fef-84d4-78cd5df47576"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:20:39 crc kubenswrapper[4825]: I0310 09:20:39.786191 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f26d1ad6-86b3-4fef-84d4-78cd5df47576-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "f26d1ad6-86b3-4fef-84d4-78cd5df47576" (UID: "f26d1ad6-86b3-4fef-84d4-78cd5df47576"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:20:39 crc kubenswrapper[4825]: I0310 09:20:39.800939 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f26d1ad6-86b3-4fef-84d4-78cd5df47576-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "f26d1ad6-86b3-4fef-84d4-78cd5df47576" (UID: "f26d1ad6-86b3-4fef-84d4-78cd5df47576"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:20:39 crc kubenswrapper[4825]: I0310 09:20:39.810189 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f26d1ad6-86b3-4fef-84d4-78cd5df47576-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "f26d1ad6-86b3-4fef-84d4-78cd5df47576" (UID: "f26d1ad6-86b3-4fef-84d4-78cd5df47576"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:20:39 crc kubenswrapper[4825]: I0310 09:20:39.853166 4825 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/f26d1ad6-86b3-4fef-84d4-78cd5df47576-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 10 09:20:39 crc kubenswrapper[4825]: I0310 09:20:39.854391 4825 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f26d1ad6-86b3-4fef-84d4-78cd5df47576-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 09:20:39 crc kubenswrapper[4825]: I0310 09:20:39.854587 4825 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f26d1ad6-86b3-4fef-84d4-78cd5df47576-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 10 09:20:39 crc kubenswrapper[4825]: I0310 09:20:39.854691 4825 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f26d1ad6-86b3-4fef-84d4-78cd5df47576-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 10 09:20:39 crc kubenswrapper[4825]: I0310 09:20:39.854911 4825 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f26d1ad6-86b3-4fef-84d4-78cd5df47576-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 10 09:20:39 crc kubenswrapper[4825]: I0310 09:20:39.855833 4825 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 10 09:20:39 crc kubenswrapper[4825]: I0310 09:20:39.855932 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgb25\" (UniqueName: \"kubernetes.io/projected/f26d1ad6-86b3-4fef-84d4-78cd5df47576-kube-api-access-hgb25\") on node \"crc\" DevicePath \"\"" Mar 10 09:20:39 crc kubenswrapper[4825]: I0310 09:20:39.856024 4825 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/f26d1ad6-86b3-4fef-84d4-78cd5df47576-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 10 09:20:39 crc kubenswrapper[4825]: I0310 09:20:39.882539 4825 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 10 09:20:39 crc kubenswrapper[4825]: I0310 09:20:39.958173 4825 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 10 09:20:40 crc kubenswrapper[4825]: I0310 09:20:40.226493 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"f26d1ad6-86b3-4fef-84d4-78cd5df47576","Type":"ContainerDied","Data":"b8df03aebcfa752f76528ce6dd9e3a98cf412093e569f74d4eae9ea3557a7114"} Mar 10 09:20:40 crc kubenswrapper[4825]: I0310 09:20:40.226561 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8df03aebcfa752f76528ce6dd9e3a98cf412093e569f74d4eae9ea3557a7114" Mar 10 09:20:40 crc kubenswrapper[4825]: I0310 09:20:40.226583 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 10 09:20:42 crc kubenswrapper[4825]: I0310 09:20:42.075186 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 10 09:20:42 crc kubenswrapper[4825]: E0310 09:20:42.076275 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f2d034b-b5fe-43dd-bb3e-f0e6abcaf576" containerName="oc" Mar 10 09:20:42 crc kubenswrapper[4825]: I0310 09:20:42.076343 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f2d034b-b5fe-43dd-bb3e-f0e6abcaf576" containerName="oc" Mar 10 09:20:42 crc kubenswrapper[4825]: E0310 09:20:42.076401 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f26d1ad6-86b3-4fef-84d4-78cd5df47576" containerName="tempest-tests-tempest-tests-runner" Mar 10 09:20:42 crc kubenswrapper[4825]: I0310 09:20:42.076418 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="f26d1ad6-86b3-4fef-84d4-78cd5df47576" containerName="tempest-tests-tempest-tests-runner" Mar 10 09:20:42 crc kubenswrapper[4825]: I0310 09:20:42.076885 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f2d034b-b5fe-43dd-bb3e-f0e6abcaf576" containerName="oc" Mar 10 09:20:42 crc kubenswrapper[4825]: I0310 09:20:42.076946 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="f26d1ad6-86b3-4fef-84d4-78cd5df47576" containerName="tempest-tests-tempest-tests-runner" Mar 10 09:20:42 crc kubenswrapper[4825]: I0310 09:20:42.078516 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 10 09:20:42 crc kubenswrapper[4825]: I0310 09:20:42.080805 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-grwd4" Mar 10 09:20:42 crc kubenswrapper[4825]: I0310 09:20:42.088526 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 10 09:20:42 crc kubenswrapper[4825]: I0310 09:20:42.211781 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dg8w\" (UniqueName: \"kubernetes.io/projected/7e511dae-571d-4450-a56e-9d9dfe4cc83a-kube-api-access-5dg8w\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"7e511dae-571d-4450-a56e-9d9dfe4cc83a\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 10 09:20:42 crc kubenswrapper[4825]: I0310 09:20:42.212347 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"7e511dae-571d-4450-a56e-9d9dfe4cc83a\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 10 09:20:42 crc kubenswrapper[4825]: I0310 09:20:42.314309 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dg8w\" (UniqueName: \"kubernetes.io/projected/7e511dae-571d-4450-a56e-9d9dfe4cc83a-kube-api-access-5dg8w\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"7e511dae-571d-4450-a56e-9d9dfe4cc83a\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 10 09:20:42 crc kubenswrapper[4825]: I0310 09:20:42.314496 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"7e511dae-571d-4450-a56e-9d9dfe4cc83a\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 10 09:20:42 crc kubenswrapper[4825]: I0310 09:20:42.315015 4825 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"7e511dae-571d-4450-a56e-9d9dfe4cc83a\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 10 09:20:42 crc kubenswrapper[4825]: I0310 09:20:42.330273 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dg8w\" (UniqueName: \"kubernetes.io/projected/7e511dae-571d-4450-a56e-9d9dfe4cc83a-kube-api-access-5dg8w\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"7e511dae-571d-4450-a56e-9d9dfe4cc83a\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 10 09:20:42 crc kubenswrapper[4825]: I0310 09:20:42.347918 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"7e511dae-571d-4450-a56e-9d9dfe4cc83a\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 10 09:20:42 crc kubenswrapper[4825]: I0310 09:20:42.412523 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 10 09:20:42 crc kubenswrapper[4825]: I0310 09:20:42.944936 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 10 09:20:43 crc kubenswrapper[4825]: I0310 09:20:43.263041 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"7e511dae-571d-4450-a56e-9d9dfe4cc83a","Type":"ContainerStarted","Data":"45e138c56d888a21559e8b9aa2f4e5c1f58d31b094f32ce690312bc85b7cf405"} Mar 10 09:20:44 crc kubenswrapper[4825]: I0310 09:20:44.283201 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"7e511dae-571d-4450-a56e-9d9dfe4cc83a","Type":"ContainerStarted","Data":"69ddb3b377de24a70d90f1eb153aa1ac28f981ff6f5b4d5acee61c909d560602"} Mar 10 09:20:44 crc kubenswrapper[4825]: I0310 09:20:44.300688 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.322451362 podStartE2EDuration="2.300669675s" podCreationTimestamp="2026-03-10 09:20:42 +0000 UTC" firstStartedPulling="2026-03-10 09:20:42.949971298 +0000 UTC m=+9395.979751953" lastFinishedPulling="2026-03-10 09:20:43.928189641 +0000 UTC m=+9396.957970266" observedRunningTime="2026-03-10 09:20:44.298083006 +0000 UTC m=+9397.327863641" watchObservedRunningTime="2026-03-10 09:20:44.300669675 +0000 UTC m=+9397.330450300" Mar 10 09:21:16 crc kubenswrapper[4825]: I0310 09:21:16.889080 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:21:16 crc 
kubenswrapper[4825]: I0310 09:21:16.889600 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:21:44 crc kubenswrapper[4825]: I0310 09:21:44.145961 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-r6wjf/must-gather-kkm2h"] Mar 10 09:21:44 crc kubenswrapper[4825]: I0310 09:21:44.148543 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r6wjf/must-gather-kkm2h" Mar 10 09:21:44 crc kubenswrapper[4825]: I0310 09:21:44.155037 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-r6wjf"/"openshift-service-ca.crt" Mar 10 09:21:44 crc kubenswrapper[4825]: I0310 09:21:44.155324 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-r6wjf"/"kube-root-ca.crt" Mar 10 09:21:44 crc kubenswrapper[4825]: I0310 09:21:44.155453 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-r6wjf"/"default-dockercfg-mgvp6" Mar 10 09:21:44 crc kubenswrapper[4825]: I0310 09:21:44.162527 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-r6wjf/must-gather-kkm2h"] Mar 10 09:21:44 crc kubenswrapper[4825]: I0310 09:21:44.288819 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/605d8ac1-592a-4163-a750-ecdfec37d34d-must-gather-output\") pod \"must-gather-kkm2h\" (UID: \"605d8ac1-592a-4163-a750-ecdfec37d34d\") " pod="openshift-must-gather-r6wjf/must-gather-kkm2h" Mar 10 09:21:44 crc kubenswrapper[4825]: I0310 09:21:44.289259 4825 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwz4x\" (UniqueName: \"kubernetes.io/projected/605d8ac1-592a-4163-a750-ecdfec37d34d-kube-api-access-zwz4x\") pod \"must-gather-kkm2h\" (UID: \"605d8ac1-592a-4163-a750-ecdfec37d34d\") " pod="openshift-must-gather-r6wjf/must-gather-kkm2h" Mar 10 09:21:44 crc kubenswrapper[4825]: I0310 09:21:44.390988 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/605d8ac1-592a-4163-a750-ecdfec37d34d-must-gather-output\") pod \"must-gather-kkm2h\" (UID: \"605d8ac1-592a-4163-a750-ecdfec37d34d\") " pod="openshift-must-gather-r6wjf/must-gather-kkm2h" Mar 10 09:21:44 crc kubenswrapper[4825]: I0310 09:21:44.391178 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwz4x\" (UniqueName: \"kubernetes.io/projected/605d8ac1-592a-4163-a750-ecdfec37d34d-kube-api-access-zwz4x\") pod \"must-gather-kkm2h\" (UID: \"605d8ac1-592a-4163-a750-ecdfec37d34d\") " pod="openshift-must-gather-r6wjf/must-gather-kkm2h" Mar 10 09:21:44 crc kubenswrapper[4825]: I0310 09:21:44.392437 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/605d8ac1-592a-4163-a750-ecdfec37d34d-must-gather-output\") pod \"must-gather-kkm2h\" (UID: \"605d8ac1-592a-4163-a750-ecdfec37d34d\") " pod="openshift-must-gather-r6wjf/must-gather-kkm2h" Mar 10 09:21:44 crc kubenswrapper[4825]: I0310 09:21:44.412452 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwz4x\" (UniqueName: \"kubernetes.io/projected/605d8ac1-592a-4163-a750-ecdfec37d34d-kube-api-access-zwz4x\") pod \"must-gather-kkm2h\" (UID: \"605d8ac1-592a-4163-a750-ecdfec37d34d\") " pod="openshift-must-gather-r6wjf/must-gather-kkm2h" Mar 10 09:21:44 crc kubenswrapper[4825]: I0310 09:21:44.512230 4825 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-must-gather-r6wjf/must-gather-kkm2h" Mar 10 09:21:45 crc kubenswrapper[4825]: I0310 09:21:45.105382 4825 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 09:21:45 crc kubenswrapper[4825]: I0310 09:21:45.116074 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-r6wjf/must-gather-kkm2h"] Mar 10 09:21:45 crc kubenswrapper[4825]: I0310 09:21:45.987352 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r6wjf/must-gather-kkm2h" event={"ID":"605d8ac1-592a-4163-a750-ecdfec37d34d","Type":"ContainerStarted","Data":"35a51e1ea7e0011529c98ca75f2927b20a0de0312c9be2eaac7ed532efa7410c"} Mar 10 09:21:46 crc kubenswrapper[4825]: I0310 09:21:46.887951 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:21:46 crc kubenswrapper[4825]: I0310 09:21:46.888003 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:21:52 crc kubenswrapper[4825]: I0310 09:21:52.042048 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r6wjf/must-gather-kkm2h" event={"ID":"605d8ac1-592a-4163-a750-ecdfec37d34d","Type":"ContainerStarted","Data":"dd3c3fdff94a987089f78265fc591a0367f6c5967711ec969b7ddaed28860fa2"} Mar 10 09:21:53 crc kubenswrapper[4825]: I0310 09:21:53.050831 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r6wjf/must-gather-kkm2h" 
event={"ID":"605d8ac1-592a-4163-a750-ecdfec37d34d","Type":"ContainerStarted","Data":"f643552ef8085f2c4326cbbd990eae6f91c9ca2c7fbd46a93530ac2986257360"} Mar 10 09:21:58 crc kubenswrapper[4825]: I0310 09:21:58.423968 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-r6wjf/must-gather-kkm2h" podStartSLOduration=7.840252116 podStartE2EDuration="14.423949704s" podCreationTimestamp="2026-03-10 09:21:44 +0000 UTC" firstStartedPulling="2026-03-10 09:21:45.105094254 +0000 UTC m=+9458.134874879" lastFinishedPulling="2026-03-10 09:21:51.688791852 +0000 UTC m=+9464.718572467" observedRunningTime="2026-03-10 09:21:53.071834692 +0000 UTC m=+9466.101615327" watchObservedRunningTime="2026-03-10 09:21:58.423949704 +0000 UTC m=+9471.453730319" Mar 10 09:21:58 crc kubenswrapper[4825]: I0310 09:21:58.427014 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-r6wjf/crc-debug-rf4rp"] Mar 10 09:21:58 crc kubenswrapper[4825]: I0310 09:21:58.429175 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r6wjf/crc-debug-rf4rp" Mar 10 09:21:58 crc kubenswrapper[4825]: I0310 09:21:58.502698 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e26ea3b5-2d23-47aa-aaae-b676e270c3a9-host\") pod \"crc-debug-rf4rp\" (UID: \"e26ea3b5-2d23-47aa-aaae-b676e270c3a9\") " pod="openshift-must-gather-r6wjf/crc-debug-rf4rp" Mar 10 09:21:58 crc kubenswrapper[4825]: I0310 09:21:58.502882 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wwlp\" (UniqueName: \"kubernetes.io/projected/e26ea3b5-2d23-47aa-aaae-b676e270c3a9-kube-api-access-2wwlp\") pod \"crc-debug-rf4rp\" (UID: \"e26ea3b5-2d23-47aa-aaae-b676e270c3a9\") " pod="openshift-must-gather-r6wjf/crc-debug-rf4rp" Mar 10 09:21:58 crc kubenswrapper[4825]: I0310 09:21:58.604885 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e26ea3b5-2d23-47aa-aaae-b676e270c3a9-host\") pod \"crc-debug-rf4rp\" (UID: \"e26ea3b5-2d23-47aa-aaae-b676e270c3a9\") " pod="openshift-must-gather-r6wjf/crc-debug-rf4rp" Mar 10 09:21:58 crc kubenswrapper[4825]: I0310 09:21:58.605058 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e26ea3b5-2d23-47aa-aaae-b676e270c3a9-host\") pod \"crc-debug-rf4rp\" (UID: \"e26ea3b5-2d23-47aa-aaae-b676e270c3a9\") " pod="openshift-must-gather-r6wjf/crc-debug-rf4rp" Mar 10 09:21:58 crc kubenswrapper[4825]: I0310 09:21:58.605101 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wwlp\" (UniqueName: \"kubernetes.io/projected/e26ea3b5-2d23-47aa-aaae-b676e270c3a9-kube-api-access-2wwlp\") pod \"crc-debug-rf4rp\" (UID: \"e26ea3b5-2d23-47aa-aaae-b676e270c3a9\") " pod="openshift-must-gather-r6wjf/crc-debug-rf4rp" Mar 10 09:21:58 crc 
kubenswrapper[4825]: I0310 09:21:58.633831 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wwlp\" (UniqueName: \"kubernetes.io/projected/e26ea3b5-2d23-47aa-aaae-b676e270c3a9-kube-api-access-2wwlp\") pod \"crc-debug-rf4rp\" (UID: \"e26ea3b5-2d23-47aa-aaae-b676e270c3a9\") " pod="openshift-must-gather-r6wjf/crc-debug-rf4rp" Mar 10 09:21:58 crc kubenswrapper[4825]: I0310 09:21:58.747799 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r6wjf/crc-debug-rf4rp" Mar 10 09:21:59 crc kubenswrapper[4825]: I0310 09:21:59.117322 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r6wjf/crc-debug-rf4rp" event={"ID":"e26ea3b5-2d23-47aa-aaae-b676e270c3a9","Type":"ContainerStarted","Data":"381b82e0b51b947f0bab10b692fc4d5b5b3d461ce66d224dc56779e0c9483b4e"} Mar 10 09:22:00 crc kubenswrapper[4825]: I0310 09:22:00.149620 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552242-f7v75"] Mar 10 09:22:00 crc kubenswrapper[4825]: I0310 09:22:00.151316 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552242-f7v75" Mar 10 09:22:00 crc kubenswrapper[4825]: I0310 09:22:00.153045 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 09:22:00 crc kubenswrapper[4825]: I0310 09:22:00.153948 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:22:00 crc kubenswrapper[4825]: I0310 09:22:00.154161 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:22:00 crc kubenswrapper[4825]: I0310 09:22:00.170544 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552242-f7v75"] Mar 10 09:22:00 crc kubenswrapper[4825]: I0310 09:22:00.237974 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nvxn\" (UniqueName: \"kubernetes.io/projected/2ac7fce3-efdc-4ce8-85e3-ddf075cc33a3-kube-api-access-6nvxn\") pod \"auto-csr-approver-29552242-f7v75\" (UID: \"2ac7fce3-efdc-4ce8-85e3-ddf075cc33a3\") " pod="openshift-infra/auto-csr-approver-29552242-f7v75" Mar 10 09:22:00 crc kubenswrapper[4825]: I0310 09:22:00.344516 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nvxn\" (UniqueName: \"kubernetes.io/projected/2ac7fce3-efdc-4ce8-85e3-ddf075cc33a3-kube-api-access-6nvxn\") pod \"auto-csr-approver-29552242-f7v75\" (UID: \"2ac7fce3-efdc-4ce8-85e3-ddf075cc33a3\") " pod="openshift-infra/auto-csr-approver-29552242-f7v75" Mar 10 09:22:00 crc kubenswrapper[4825]: I0310 09:22:00.364163 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nvxn\" (UniqueName: \"kubernetes.io/projected/2ac7fce3-efdc-4ce8-85e3-ddf075cc33a3-kube-api-access-6nvxn\") pod \"auto-csr-approver-29552242-f7v75\" (UID: \"2ac7fce3-efdc-4ce8-85e3-ddf075cc33a3\") " 
pod="openshift-infra/auto-csr-approver-29552242-f7v75" Mar 10 09:22:00 crc kubenswrapper[4825]: I0310 09:22:00.478188 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552242-f7v75" Mar 10 09:22:01 crc kubenswrapper[4825]: I0310 09:22:01.089618 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552242-f7v75"] Mar 10 09:22:01 crc kubenswrapper[4825]: W0310 09:22:01.100537 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ac7fce3_efdc_4ce8_85e3_ddf075cc33a3.slice/crio-ae09c65b0996a935926a1204f13f5c4c7249022249170cca8f5c82c7a1f478f3 WatchSource:0}: Error finding container ae09c65b0996a935926a1204f13f5c4c7249022249170cca8f5c82c7a1f478f3: Status 404 returned error can't find the container with id ae09c65b0996a935926a1204f13f5c4c7249022249170cca8f5c82c7a1f478f3 Mar 10 09:22:01 crc kubenswrapper[4825]: I0310 09:22:01.156278 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552242-f7v75" event={"ID":"2ac7fce3-efdc-4ce8-85e3-ddf075cc33a3","Type":"ContainerStarted","Data":"ae09c65b0996a935926a1204f13f5c4c7249022249170cca8f5c82c7a1f478f3"} Mar 10 09:22:03 crc kubenswrapper[4825]: I0310 09:22:03.179230 4825 generic.go:334] "Generic (PLEG): container finished" podID="2ac7fce3-efdc-4ce8-85e3-ddf075cc33a3" containerID="90c406be1ae8517dd7245a11cc253022e43e0d50f31061a1a4396f4babe843bc" exitCode=0 Mar 10 09:22:03 crc kubenswrapper[4825]: I0310 09:22:03.179606 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552242-f7v75" event={"ID":"2ac7fce3-efdc-4ce8-85e3-ddf075cc33a3","Type":"ContainerDied","Data":"90c406be1ae8517dd7245a11cc253022e43e0d50f31061a1a4396f4babe843bc"} Mar 10 09:22:04 crc kubenswrapper[4825]: I0310 09:22:04.583070 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552242-f7v75" Mar 10 09:22:04 crc kubenswrapper[4825]: I0310 09:22:04.662284 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nvxn\" (UniqueName: \"kubernetes.io/projected/2ac7fce3-efdc-4ce8-85e3-ddf075cc33a3-kube-api-access-6nvxn\") pod \"2ac7fce3-efdc-4ce8-85e3-ddf075cc33a3\" (UID: \"2ac7fce3-efdc-4ce8-85e3-ddf075cc33a3\") " Mar 10 09:22:04 crc kubenswrapper[4825]: I0310 09:22:04.669985 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ac7fce3-efdc-4ce8-85e3-ddf075cc33a3-kube-api-access-6nvxn" (OuterVolumeSpecName: "kube-api-access-6nvxn") pod "2ac7fce3-efdc-4ce8-85e3-ddf075cc33a3" (UID: "2ac7fce3-efdc-4ce8-85e3-ddf075cc33a3"). InnerVolumeSpecName "kube-api-access-6nvxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:22:04 crc kubenswrapper[4825]: I0310 09:22:04.768157 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nvxn\" (UniqueName: \"kubernetes.io/projected/2ac7fce3-efdc-4ce8-85e3-ddf075cc33a3-kube-api-access-6nvxn\") on node \"crc\" DevicePath \"\"" Mar 10 09:22:05 crc kubenswrapper[4825]: I0310 09:22:05.201669 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552242-f7v75" event={"ID":"2ac7fce3-efdc-4ce8-85e3-ddf075cc33a3","Type":"ContainerDied","Data":"ae09c65b0996a935926a1204f13f5c4c7249022249170cca8f5c82c7a1f478f3"} Mar 10 09:22:05 crc kubenswrapper[4825]: I0310 09:22:05.201715 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae09c65b0996a935926a1204f13f5c4c7249022249170cca8f5c82c7a1f478f3" Mar 10 09:22:05 crc kubenswrapper[4825]: I0310 09:22:05.201775 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552242-f7v75" Mar 10 09:22:05 crc kubenswrapper[4825]: I0310 09:22:05.671899 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552236-q5rtj"] Mar 10 09:22:05 crc kubenswrapper[4825]: I0310 09:22:05.681040 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552236-q5rtj"] Mar 10 09:22:07 crc kubenswrapper[4825]: I0310 09:22:07.251919 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3d4edf3-4f2f-4238-a12a-1c5d41805acf" path="/var/lib/kubelet/pods/c3d4edf3-4f2f-4238-a12a-1c5d41805acf/volumes" Mar 10 09:22:08 crc kubenswrapper[4825]: I0310 09:22:08.787582 4825 scope.go:117] "RemoveContainer" containerID="8a3491189e86f604eaaceeeac539fe1143157dd166c76852131ec38dc5671da4" Mar 10 09:22:11 crc kubenswrapper[4825]: I0310 09:22:11.266118 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r6wjf/crc-debug-rf4rp" event={"ID":"e26ea3b5-2d23-47aa-aaae-b676e270c3a9","Type":"ContainerStarted","Data":"1273d86ae42bf16ed9c6dd6bb37e44c23a47cf2f5c02db1eff173a1412a7d6f8"} Mar 10 09:22:11 crc kubenswrapper[4825]: I0310 09:22:11.282070 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-r6wjf/crc-debug-rf4rp" podStartSLOduration=1.6759437240000001 podStartE2EDuration="13.282048297s" podCreationTimestamp="2026-03-10 09:21:58 +0000 UTC" firstStartedPulling="2026-03-10 09:21:58.790526462 +0000 UTC m=+9471.820307077" lastFinishedPulling="2026-03-10 09:22:10.396631035 +0000 UTC m=+9483.426411650" observedRunningTime="2026-03-10 09:22:11.279457758 +0000 UTC m=+9484.309238373" watchObservedRunningTime="2026-03-10 09:22:11.282048297 +0000 UTC m=+9484.311828912" Mar 10 09:22:16 crc kubenswrapper[4825]: I0310 09:22:16.888437 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:22:16 crc kubenswrapper[4825]: I0310 09:22:16.890626 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:22:16 crc kubenswrapper[4825]: I0310 09:22:16.890858 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" Mar 10 09:22:16 crc kubenswrapper[4825]: I0310 09:22:16.891970 4825 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b92fb551f45e393908a43b65868ed01517bc29380c8beb54ff37ab5e2099e73d"} pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 09:22:16 crc kubenswrapper[4825]: I0310 09:22:16.892229 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" containerID="cri-o://b92fb551f45e393908a43b65868ed01517bc29380c8beb54ff37ab5e2099e73d" gracePeriod=600 Mar 10 09:22:17 crc kubenswrapper[4825]: I0310 09:22:17.321897 4825 generic.go:334] "Generic (PLEG): container finished" podID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerID="b92fb551f45e393908a43b65868ed01517bc29380c8beb54ff37ab5e2099e73d" exitCode=0 Mar 10 09:22:17 crc kubenswrapper[4825]: I0310 09:22:17.322191 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerDied","Data":"b92fb551f45e393908a43b65868ed01517bc29380c8beb54ff37ab5e2099e73d"} Mar 10 09:22:17 crc kubenswrapper[4825]: I0310 09:22:17.322343 4825 scope.go:117] "RemoveContainer" containerID="f110d54ce6d62165e047609d778f750d4e0c9b3a0b3dce9f99cbfaef3a90305f" Mar 10 09:22:20 crc kubenswrapper[4825]: I0310 09:22:20.351751 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerStarted","Data":"15cd81bfb1d58ce096026917f6cfb5d65358bfb886f1c139a5a4b17f5671d715"} Mar 10 09:23:05 crc kubenswrapper[4825]: I0310 09:23:05.864438 4825 generic.go:334] "Generic (PLEG): container finished" podID="e26ea3b5-2d23-47aa-aaae-b676e270c3a9" containerID="1273d86ae42bf16ed9c6dd6bb37e44c23a47cf2f5c02db1eff173a1412a7d6f8" exitCode=0 Mar 10 09:23:05 crc kubenswrapper[4825]: I0310 09:23:05.864999 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r6wjf/crc-debug-rf4rp" event={"ID":"e26ea3b5-2d23-47aa-aaae-b676e270c3a9","Type":"ContainerDied","Data":"1273d86ae42bf16ed9c6dd6bb37e44c23a47cf2f5c02db1eff173a1412a7d6f8"} Mar 10 09:23:07 crc kubenswrapper[4825]: I0310 09:23:07.390921 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r6wjf/crc-debug-rf4rp" Mar 10 09:23:07 crc kubenswrapper[4825]: I0310 09:23:07.436602 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-r6wjf/crc-debug-rf4rp"] Mar 10 09:23:07 crc kubenswrapper[4825]: I0310 09:23:07.448384 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-r6wjf/crc-debug-rf4rp"] Mar 10 09:23:07 crc kubenswrapper[4825]: I0310 09:23:07.504967 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e26ea3b5-2d23-47aa-aaae-b676e270c3a9-host\") pod \"e26ea3b5-2d23-47aa-aaae-b676e270c3a9\" (UID: \"e26ea3b5-2d23-47aa-aaae-b676e270c3a9\") " Mar 10 09:23:07 crc kubenswrapper[4825]: I0310 09:23:07.505112 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e26ea3b5-2d23-47aa-aaae-b676e270c3a9-host" (OuterVolumeSpecName: "host") pod "e26ea3b5-2d23-47aa-aaae-b676e270c3a9" (UID: "e26ea3b5-2d23-47aa-aaae-b676e270c3a9"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:23:07 crc kubenswrapper[4825]: I0310 09:23:07.505226 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wwlp\" (UniqueName: \"kubernetes.io/projected/e26ea3b5-2d23-47aa-aaae-b676e270c3a9-kube-api-access-2wwlp\") pod \"e26ea3b5-2d23-47aa-aaae-b676e270c3a9\" (UID: \"e26ea3b5-2d23-47aa-aaae-b676e270c3a9\") " Mar 10 09:23:07 crc kubenswrapper[4825]: I0310 09:23:07.505771 4825 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e26ea3b5-2d23-47aa-aaae-b676e270c3a9-host\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:07 crc kubenswrapper[4825]: I0310 09:23:07.511513 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e26ea3b5-2d23-47aa-aaae-b676e270c3a9-kube-api-access-2wwlp" (OuterVolumeSpecName: "kube-api-access-2wwlp") pod "e26ea3b5-2d23-47aa-aaae-b676e270c3a9" (UID: "e26ea3b5-2d23-47aa-aaae-b676e270c3a9"). InnerVolumeSpecName "kube-api-access-2wwlp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:23:07 crc kubenswrapper[4825]: I0310 09:23:07.607330 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wwlp\" (UniqueName: \"kubernetes.io/projected/e26ea3b5-2d23-47aa-aaae-b676e270c3a9-kube-api-access-2wwlp\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:07 crc kubenswrapper[4825]: I0310 09:23:07.890438 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="381b82e0b51b947f0bab10b692fc4d5b5b3d461ce66d224dc56779e0c9483b4e" Mar 10 09:23:07 crc kubenswrapper[4825]: I0310 09:23:07.890492 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r6wjf/crc-debug-rf4rp" Mar 10 09:23:08 crc kubenswrapper[4825]: I0310 09:23:08.606030 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-r6wjf/crc-debug-nvvl6"] Mar 10 09:23:08 crc kubenswrapper[4825]: E0310 09:23:08.606533 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e26ea3b5-2d23-47aa-aaae-b676e270c3a9" containerName="container-00" Mar 10 09:23:08 crc kubenswrapper[4825]: I0310 09:23:08.606546 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="e26ea3b5-2d23-47aa-aaae-b676e270c3a9" containerName="container-00" Mar 10 09:23:08 crc kubenswrapper[4825]: E0310 09:23:08.606564 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ac7fce3-efdc-4ce8-85e3-ddf075cc33a3" containerName="oc" Mar 10 09:23:08 crc kubenswrapper[4825]: I0310 09:23:08.606570 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ac7fce3-efdc-4ce8-85e3-ddf075cc33a3" containerName="oc" Mar 10 09:23:08 crc kubenswrapper[4825]: I0310 09:23:08.606811 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="e26ea3b5-2d23-47aa-aaae-b676e270c3a9" containerName="container-00" Mar 10 09:23:08 crc kubenswrapper[4825]: I0310 09:23:08.606827 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ac7fce3-efdc-4ce8-85e3-ddf075cc33a3" containerName="oc" Mar 10 09:23:08 crc kubenswrapper[4825]: I0310 09:23:08.607505 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r6wjf/crc-debug-nvvl6" Mar 10 09:23:08 crc kubenswrapper[4825]: I0310 09:23:08.727117 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qj7x\" (UniqueName: \"kubernetes.io/projected/3073a1be-6fa5-430a-9a40-ee8d0b73eb60-kube-api-access-6qj7x\") pod \"crc-debug-nvvl6\" (UID: \"3073a1be-6fa5-430a-9a40-ee8d0b73eb60\") " pod="openshift-must-gather-r6wjf/crc-debug-nvvl6" Mar 10 09:23:08 crc kubenswrapper[4825]: I0310 09:23:08.727478 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3073a1be-6fa5-430a-9a40-ee8d0b73eb60-host\") pod \"crc-debug-nvvl6\" (UID: \"3073a1be-6fa5-430a-9a40-ee8d0b73eb60\") " pod="openshift-must-gather-r6wjf/crc-debug-nvvl6" Mar 10 09:23:08 crc kubenswrapper[4825]: I0310 09:23:08.829856 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qj7x\" (UniqueName: \"kubernetes.io/projected/3073a1be-6fa5-430a-9a40-ee8d0b73eb60-kube-api-access-6qj7x\") pod \"crc-debug-nvvl6\" (UID: \"3073a1be-6fa5-430a-9a40-ee8d0b73eb60\") " pod="openshift-must-gather-r6wjf/crc-debug-nvvl6" Mar 10 09:23:08 crc kubenswrapper[4825]: I0310 09:23:08.830002 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3073a1be-6fa5-430a-9a40-ee8d0b73eb60-host\") pod \"crc-debug-nvvl6\" (UID: \"3073a1be-6fa5-430a-9a40-ee8d0b73eb60\") " pod="openshift-must-gather-r6wjf/crc-debug-nvvl6" Mar 10 09:23:08 crc kubenswrapper[4825]: I0310 09:23:08.830092 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3073a1be-6fa5-430a-9a40-ee8d0b73eb60-host\") pod \"crc-debug-nvvl6\" (UID: \"3073a1be-6fa5-430a-9a40-ee8d0b73eb60\") " pod="openshift-must-gather-r6wjf/crc-debug-nvvl6" Mar 10 09:23:08 crc 
kubenswrapper[4825]: I0310 09:23:08.852863 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qj7x\" (UniqueName: \"kubernetes.io/projected/3073a1be-6fa5-430a-9a40-ee8d0b73eb60-kube-api-access-6qj7x\") pod \"crc-debug-nvvl6\" (UID: \"3073a1be-6fa5-430a-9a40-ee8d0b73eb60\") " pod="openshift-must-gather-r6wjf/crc-debug-nvvl6" Mar 10 09:23:08 crc kubenswrapper[4825]: I0310 09:23:08.925056 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r6wjf/crc-debug-nvvl6" Mar 10 09:23:09 crc kubenswrapper[4825]: I0310 09:23:09.266418 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e26ea3b5-2d23-47aa-aaae-b676e270c3a9" path="/var/lib/kubelet/pods/e26ea3b5-2d23-47aa-aaae-b676e270c3a9/volumes" Mar 10 09:23:09 crc kubenswrapper[4825]: I0310 09:23:09.909625 4825 generic.go:334] "Generic (PLEG): container finished" podID="3073a1be-6fa5-430a-9a40-ee8d0b73eb60" containerID="07357a2c16a1b56b39857ba18d933ee0c6ff40073fd7805dcc7520effcdac0ff" exitCode=0 Mar 10 09:23:09 crc kubenswrapper[4825]: I0310 09:23:09.909662 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r6wjf/crc-debug-nvvl6" event={"ID":"3073a1be-6fa5-430a-9a40-ee8d0b73eb60","Type":"ContainerDied","Data":"07357a2c16a1b56b39857ba18d933ee0c6ff40073fd7805dcc7520effcdac0ff"} Mar 10 09:23:09 crc kubenswrapper[4825]: I0310 09:23:09.909692 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r6wjf/crc-debug-nvvl6" event={"ID":"3073a1be-6fa5-430a-9a40-ee8d0b73eb60","Type":"ContainerStarted","Data":"b578317506424caf5b1143a2f722454328418ea8eb10369201e9e1d017a137c3"} Mar 10 09:23:11 crc kubenswrapper[4825]: I0310 09:23:11.020294 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r6wjf/crc-debug-nvvl6" Mar 10 09:23:11 crc kubenswrapper[4825]: I0310 09:23:11.220665 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3073a1be-6fa5-430a-9a40-ee8d0b73eb60-host\") pod \"3073a1be-6fa5-430a-9a40-ee8d0b73eb60\" (UID: \"3073a1be-6fa5-430a-9a40-ee8d0b73eb60\") " Mar 10 09:23:11 crc kubenswrapper[4825]: I0310 09:23:11.220742 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3073a1be-6fa5-430a-9a40-ee8d0b73eb60-host" (OuterVolumeSpecName: "host") pod "3073a1be-6fa5-430a-9a40-ee8d0b73eb60" (UID: "3073a1be-6fa5-430a-9a40-ee8d0b73eb60"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:23:11 crc kubenswrapper[4825]: I0310 09:23:11.220890 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qj7x\" (UniqueName: \"kubernetes.io/projected/3073a1be-6fa5-430a-9a40-ee8d0b73eb60-kube-api-access-6qj7x\") pod \"3073a1be-6fa5-430a-9a40-ee8d0b73eb60\" (UID: \"3073a1be-6fa5-430a-9a40-ee8d0b73eb60\") " Mar 10 09:23:11 crc kubenswrapper[4825]: I0310 09:23:11.221460 4825 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3073a1be-6fa5-430a-9a40-ee8d0b73eb60-host\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:11 crc kubenswrapper[4825]: I0310 09:23:11.228508 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3073a1be-6fa5-430a-9a40-ee8d0b73eb60-kube-api-access-6qj7x" (OuterVolumeSpecName: "kube-api-access-6qj7x") pod "3073a1be-6fa5-430a-9a40-ee8d0b73eb60" (UID: "3073a1be-6fa5-430a-9a40-ee8d0b73eb60"). InnerVolumeSpecName "kube-api-access-6qj7x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:23:11 crc kubenswrapper[4825]: I0310 09:23:11.324541 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qj7x\" (UniqueName: \"kubernetes.io/projected/3073a1be-6fa5-430a-9a40-ee8d0b73eb60-kube-api-access-6qj7x\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:11 crc kubenswrapper[4825]: I0310 09:23:11.927668 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r6wjf/crc-debug-nvvl6" event={"ID":"3073a1be-6fa5-430a-9a40-ee8d0b73eb60","Type":"ContainerDied","Data":"b578317506424caf5b1143a2f722454328418ea8eb10369201e9e1d017a137c3"} Mar 10 09:23:11 crc kubenswrapper[4825]: I0310 09:23:11.927717 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b578317506424caf5b1143a2f722454328418ea8eb10369201e9e1d017a137c3" Mar 10 09:23:11 crc kubenswrapper[4825]: I0310 09:23:11.927760 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r6wjf/crc-debug-nvvl6" Mar 10 09:23:12 crc kubenswrapper[4825]: I0310 09:23:12.618187 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-r6wjf/crc-debug-nvvl6"] Mar 10 09:23:12 crc kubenswrapper[4825]: I0310 09:23:12.630978 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-r6wjf/crc-debug-nvvl6"] Mar 10 09:23:13 crc kubenswrapper[4825]: I0310 09:23:13.260477 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3073a1be-6fa5-430a-9a40-ee8d0b73eb60" path="/var/lib/kubelet/pods/3073a1be-6fa5-430a-9a40-ee8d0b73eb60/volumes" Mar 10 09:23:13 crc kubenswrapper[4825]: I0310 09:23:13.788625 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-r6wjf/crc-debug-jrmmf"] Mar 10 09:23:13 crc kubenswrapper[4825]: E0310 09:23:13.789031 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3073a1be-6fa5-430a-9a40-ee8d0b73eb60" 
containerName="container-00" Mar 10 09:23:13 crc kubenswrapper[4825]: I0310 09:23:13.789044 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="3073a1be-6fa5-430a-9a40-ee8d0b73eb60" containerName="container-00" Mar 10 09:23:13 crc kubenswrapper[4825]: I0310 09:23:13.789243 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="3073a1be-6fa5-430a-9a40-ee8d0b73eb60" containerName="container-00" Mar 10 09:23:13 crc kubenswrapper[4825]: I0310 09:23:13.789882 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r6wjf/crc-debug-jrmmf" Mar 10 09:23:13 crc kubenswrapper[4825]: I0310 09:23:13.989683 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f375732-c472-4db8-a30a-ad47341213a0-host\") pod \"crc-debug-jrmmf\" (UID: \"5f375732-c472-4db8-a30a-ad47341213a0\") " pod="openshift-must-gather-r6wjf/crc-debug-jrmmf" Mar 10 09:23:13 crc kubenswrapper[4825]: I0310 09:23:13.990716 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcb2b\" (UniqueName: \"kubernetes.io/projected/5f375732-c472-4db8-a30a-ad47341213a0-kube-api-access-hcb2b\") pod \"crc-debug-jrmmf\" (UID: \"5f375732-c472-4db8-a30a-ad47341213a0\") " pod="openshift-must-gather-r6wjf/crc-debug-jrmmf" Mar 10 09:23:14 crc kubenswrapper[4825]: I0310 09:23:14.092826 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcb2b\" (UniqueName: \"kubernetes.io/projected/5f375732-c472-4db8-a30a-ad47341213a0-kube-api-access-hcb2b\") pod \"crc-debug-jrmmf\" (UID: \"5f375732-c472-4db8-a30a-ad47341213a0\") " pod="openshift-must-gather-r6wjf/crc-debug-jrmmf" Mar 10 09:23:14 crc kubenswrapper[4825]: I0310 09:23:14.092935 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/5f375732-c472-4db8-a30a-ad47341213a0-host\") pod \"crc-debug-jrmmf\" (UID: \"5f375732-c472-4db8-a30a-ad47341213a0\") " pod="openshift-must-gather-r6wjf/crc-debug-jrmmf" Mar 10 09:23:14 crc kubenswrapper[4825]: I0310 09:23:14.093029 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f375732-c472-4db8-a30a-ad47341213a0-host\") pod \"crc-debug-jrmmf\" (UID: \"5f375732-c472-4db8-a30a-ad47341213a0\") " pod="openshift-must-gather-r6wjf/crc-debug-jrmmf" Mar 10 09:23:14 crc kubenswrapper[4825]: I0310 09:23:14.113299 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcb2b\" (UniqueName: \"kubernetes.io/projected/5f375732-c472-4db8-a30a-ad47341213a0-kube-api-access-hcb2b\") pod \"crc-debug-jrmmf\" (UID: \"5f375732-c472-4db8-a30a-ad47341213a0\") " pod="openshift-must-gather-r6wjf/crc-debug-jrmmf" Mar 10 09:23:14 crc kubenswrapper[4825]: I0310 09:23:14.408577 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r6wjf/crc-debug-jrmmf" Mar 10 09:23:14 crc kubenswrapper[4825]: I0310 09:23:14.973219 4825 generic.go:334] "Generic (PLEG): container finished" podID="5f375732-c472-4db8-a30a-ad47341213a0" containerID="1800f172731021ff078e692a587a47a450ebfe73b61aeba6477e42f41a80baae" exitCode=0 Mar 10 09:23:14 crc kubenswrapper[4825]: I0310 09:23:14.973315 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r6wjf/crc-debug-jrmmf" event={"ID":"5f375732-c472-4db8-a30a-ad47341213a0","Type":"ContainerDied","Data":"1800f172731021ff078e692a587a47a450ebfe73b61aeba6477e42f41a80baae"} Mar 10 09:23:14 crc kubenswrapper[4825]: I0310 09:23:14.973497 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r6wjf/crc-debug-jrmmf" event={"ID":"5f375732-c472-4db8-a30a-ad47341213a0","Type":"ContainerStarted","Data":"943db5b264b8224fd765938964c4294c0a54cbc306ef3c6bd0286219e8f4d10d"} Mar 10 09:23:15 crc kubenswrapper[4825]: I0310 09:23:15.021606 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-r6wjf/crc-debug-jrmmf"] Mar 10 09:23:15 crc kubenswrapper[4825]: I0310 09:23:15.033427 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-r6wjf/crc-debug-jrmmf"] Mar 10 09:23:16 crc kubenswrapper[4825]: I0310 09:23:16.286028 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r6wjf/crc-debug-jrmmf" Mar 10 09:23:16 crc kubenswrapper[4825]: I0310 09:23:16.458446 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f375732-c472-4db8-a30a-ad47341213a0-host\") pod \"5f375732-c472-4db8-a30a-ad47341213a0\" (UID: \"5f375732-c472-4db8-a30a-ad47341213a0\") " Mar 10 09:23:16 crc kubenswrapper[4825]: I0310 09:23:16.458627 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcb2b\" (UniqueName: \"kubernetes.io/projected/5f375732-c472-4db8-a30a-ad47341213a0-kube-api-access-hcb2b\") pod \"5f375732-c472-4db8-a30a-ad47341213a0\" (UID: \"5f375732-c472-4db8-a30a-ad47341213a0\") " Mar 10 09:23:16 crc kubenswrapper[4825]: I0310 09:23:16.460901 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f375732-c472-4db8-a30a-ad47341213a0-host" (OuterVolumeSpecName: "host") pod "5f375732-c472-4db8-a30a-ad47341213a0" (UID: "5f375732-c472-4db8-a30a-ad47341213a0"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 09:23:16 crc kubenswrapper[4825]: I0310 09:23:16.471782 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f375732-c472-4db8-a30a-ad47341213a0-kube-api-access-hcb2b" (OuterVolumeSpecName: "kube-api-access-hcb2b") pod "5f375732-c472-4db8-a30a-ad47341213a0" (UID: "5f375732-c472-4db8-a30a-ad47341213a0"). InnerVolumeSpecName "kube-api-access-hcb2b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:23:16 crc kubenswrapper[4825]: I0310 09:23:16.562491 4825 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f375732-c472-4db8-a30a-ad47341213a0-host\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:16 crc kubenswrapper[4825]: I0310 09:23:16.562535 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcb2b\" (UniqueName: \"kubernetes.io/projected/5f375732-c472-4db8-a30a-ad47341213a0-kube-api-access-hcb2b\") on node \"crc\" DevicePath \"\"" Mar 10 09:23:16 crc kubenswrapper[4825]: I0310 09:23:16.992525 4825 scope.go:117] "RemoveContainer" containerID="1800f172731021ff078e692a587a47a450ebfe73b61aeba6477e42f41a80baae" Mar 10 09:23:16 crc kubenswrapper[4825]: I0310 09:23:16.992606 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r6wjf/crc-debug-jrmmf" Mar 10 09:23:17 crc kubenswrapper[4825]: I0310 09:23:17.248397 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f375732-c472-4db8-a30a-ad47341213a0" path="/var/lib/kubelet/pods/5f375732-c472-4db8-a30a-ad47341213a0/volumes" Mar 10 09:24:00 crc kubenswrapper[4825]: I0310 09:24:00.158962 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552244-bnptt"] Mar 10 09:24:00 crc kubenswrapper[4825]: E0310 09:24:00.160380 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f375732-c472-4db8-a30a-ad47341213a0" containerName="container-00" Mar 10 09:24:00 crc kubenswrapper[4825]: I0310 09:24:00.160400 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f375732-c472-4db8-a30a-ad47341213a0" containerName="container-00" Mar 10 09:24:00 crc kubenswrapper[4825]: I0310 09:24:00.160960 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f375732-c472-4db8-a30a-ad47341213a0" containerName="container-00" Mar 10 09:24:00 crc 
kubenswrapper[4825]: I0310 09:24:00.162123 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552244-bnptt" Mar 10 09:24:00 crc kubenswrapper[4825]: I0310 09:24:00.165651 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:24:00 crc kubenswrapper[4825]: I0310 09:24:00.165927 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:24:00 crc kubenswrapper[4825]: I0310 09:24:00.166112 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 09:24:00 crc kubenswrapper[4825]: I0310 09:24:00.173364 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vjbz\" (UniqueName: \"kubernetes.io/projected/b0e3a09a-8fb1-440b-97d1-60abea92baac-kube-api-access-5vjbz\") pod \"auto-csr-approver-29552244-bnptt\" (UID: \"b0e3a09a-8fb1-440b-97d1-60abea92baac\") " pod="openshift-infra/auto-csr-approver-29552244-bnptt" Mar 10 09:24:00 crc kubenswrapper[4825]: I0310 09:24:00.176429 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552244-bnptt"] Mar 10 09:24:00 crc kubenswrapper[4825]: I0310 09:24:00.276297 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vjbz\" (UniqueName: \"kubernetes.io/projected/b0e3a09a-8fb1-440b-97d1-60abea92baac-kube-api-access-5vjbz\") pod \"auto-csr-approver-29552244-bnptt\" (UID: \"b0e3a09a-8fb1-440b-97d1-60abea92baac\") " pod="openshift-infra/auto-csr-approver-29552244-bnptt" Mar 10 09:24:00 crc kubenswrapper[4825]: I0310 09:24:00.302936 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vjbz\" (UniqueName: \"kubernetes.io/projected/b0e3a09a-8fb1-440b-97d1-60abea92baac-kube-api-access-5vjbz\") pod 
\"auto-csr-approver-29552244-bnptt\" (UID: \"b0e3a09a-8fb1-440b-97d1-60abea92baac\") " pod="openshift-infra/auto-csr-approver-29552244-bnptt" Mar 10 09:24:00 crc kubenswrapper[4825]: I0310 09:24:00.496112 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552244-bnptt" Mar 10 09:24:01 crc kubenswrapper[4825]: I0310 09:24:01.001784 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552244-bnptt"] Mar 10 09:24:01 crc kubenswrapper[4825]: I0310 09:24:01.526211 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552244-bnptt" event={"ID":"b0e3a09a-8fb1-440b-97d1-60abea92baac","Type":"ContainerStarted","Data":"c8e05b90e58327a7fc1724d359f0a0c06a5288b8aa716a6fd971aad3416873ce"} Mar 10 09:24:03 crc kubenswrapper[4825]: I0310 09:24:03.547547 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552244-bnptt" event={"ID":"b0e3a09a-8fb1-440b-97d1-60abea92baac","Type":"ContainerStarted","Data":"a9c32a5920a7d76803f7f322f3b6273ce17ece3db73976faa0b96d093b7971a2"} Mar 10 09:24:03 crc kubenswrapper[4825]: I0310 09:24:03.575035 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552244-bnptt" podStartSLOduration=2.196577155 podStartE2EDuration="3.575014292s" podCreationTimestamp="2026-03-10 09:24:00 +0000 UTC" firstStartedPulling="2026-03-10 09:24:01.003834695 +0000 UTC m=+9594.033615310" lastFinishedPulling="2026-03-10 09:24:02.382271792 +0000 UTC m=+9595.412052447" observedRunningTime="2026-03-10 09:24:03.566923487 +0000 UTC m=+9596.596704102" watchObservedRunningTime="2026-03-10 09:24:03.575014292 +0000 UTC m=+9596.604794907" Mar 10 09:24:04 crc kubenswrapper[4825]: I0310 09:24:04.568572 4825 generic.go:334] "Generic (PLEG): container finished" podID="b0e3a09a-8fb1-440b-97d1-60abea92baac" 
containerID="a9c32a5920a7d76803f7f322f3b6273ce17ece3db73976faa0b96d093b7971a2" exitCode=0 Mar 10 09:24:04 crc kubenswrapper[4825]: I0310 09:24:04.568798 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552244-bnptt" event={"ID":"b0e3a09a-8fb1-440b-97d1-60abea92baac","Type":"ContainerDied","Data":"a9c32a5920a7d76803f7f322f3b6273ce17ece3db73976faa0b96d093b7971a2"} Mar 10 09:24:06 crc kubenswrapper[4825]: I0310 09:24:05.929583 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552244-bnptt" Mar 10 09:24:06 crc kubenswrapper[4825]: I0310 09:24:06.106578 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vjbz\" (UniqueName: \"kubernetes.io/projected/b0e3a09a-8fb1-440b-97d1-60abea92baac-kube-api-access-5vjbz\") pod \"b0e3a09a-8fb1-440b-97d1-60abea92baac\" (UID: \"b0e3a09a-8fb1-440b-97d1-60abea92baac\") " Mar 10 09:24:06 crc kubenswrapper[4825]: I0310 09:24:06.112995 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0e3a09a-8fb1-440b-97d1-60abea92baac-kube-api-access-5vjbz" (OuterVolumeSpecName: "kube-api-access-5vjbz") pod "b0e3a09a-8fb1-440b-97d1-60abea92baac" (UID: "b0e3a09a-8fb1-440b-97d1-60abea92baac"). InnerVolumeSpecName "kube-api-access-5vjbz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:24:06 crc kubenswrapper[4825]: I0310 09:24:06.209386 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vjbz\" (UniqueName: \"kubernetes.io/projected/b0e3a09a-8fb1-440b-97d1-60abea92baac-kube-api-access-5vjbz\") on node \"crc\" DevicePath \"\"" Mar 10 09:24:06 crc kubenswrapper[4825]: I0310 09:24:06.591899 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552244-bnptt" event={"ID":"b0e3a09a-8fb1-440b-97d1-60abea92baac","Type":"ContainerDied","Data":"c8e05b90e58327a7fc1724d359f0a0c06a5288b8aa716a6fd971aad3416873ce"} Mar 10 09:24:06 crc kubenswrapper[4825]: I0310 09:24:06.591943 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552244-bnptt" Mar 10 09:24:06 crc kubenswrapper[4825]: I0310 09:24:06.591954 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8e05b90e58327a7fc1724d359f0a0c06a5288b8aa716a6fd971aad3416873ce" Mar 10 09:24:06 crc kubenswrapper[4825]: I0310 09:24:06.641799 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552238-ghhg6"] Mar 10 09:24:06 crc kubenswrapper[4825]: I0310 09:24:06.650853 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552238-ghhg6"] Mar 10 09:24:07 crc kubenswrapper[4825]: I0310 09:24:07.257644 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6d7b9e0-a38d-4886-ab3a-b4ba7e6efe57" path="/var/lib/kubelet/pods/c6d7b9e0-a38d-4886-ab3a-b4ba7e6efe57/volumes" Mar 10 09:24:10 crc kubenswrapper[4825]: I0310 09:24:10.388175 4825 scope.go:117] "RemoveContainer" containerID="45254e6e4d180b8c1a60b4de026befc988fdc76fd1f1729262dafab5013ece42" Mar 10 09:24:46 crc kubenswrapper[4825]: I0310 09:24:46.887814 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:24:46 crc kubenswrapper[4825]: I0310 09:24:46.888390 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:25:03 crc kubenswrapper[4825]: I0310 09:25:03.116772 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_9ea0c17e-7d29-45c4-85f4-66836e8860fd/init-config-reloader/0.log" Mar 10 09:25:03 crc kubenswrapper[4825]: I0310 09:25:03.336591 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_9ea0c17e-7d29-45c4-85f4-66836e8860fd/alertmanager/0.log" Mar 10 09:25:03 crc kubenswrapper[4825]: I0310 09:25:03.358035 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_9ea0c17e-7d29-45c4-85f4-66836e8860fd/config-reloader/0.log" Mar 10 09:25:03 crc kubenswrapper[4825]: I0310 09:25:03.363939 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_9ea0c17e-7d29-45c4-85f4-66836e8860fd/init-config-reloader/0.log" Mar 10 09:25:04 crc kubenswrapper[4825]: I0310 09:25:04.291385 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_72b82233-02bf-4511-9c8b-6b9c744a552a/aodh-evaluator/0.log" Mar 10 09:25:04 crc kubenswrapper[4825]: I0310 09:25:04.298710 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_72b82233-02bf-4511-9c8b-6b9c744a552a/aodh-listener/0.log" Mar 10 09:25:04 crc kubenswrapper[4825]: I0310 09:25:04.377290 
4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_72b82233-02bf-4511-9c8b-6b9c744a552a/aodh-api/0.log" Mar 10 09:25:04 crc kubenswrapper[4825]: I0310 09:25:04.532571 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_72b82233-02bf-4511-9c8b-6b9c744a552a/aodh-notifier/0.log" Mar 10 09:25:04 crc kubenswrapper[4825]: I0310 09:25:04.575727 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-59f9c97db-j6zhw_bacaf889-e480-4e5d-b5c8-d4496d119261/barbican-api/0.log" Mar 10 09:25:04 crc kubenswrapper[4825]: I0310 09:25:04.648530 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-59f9c97db-j6zhw_bacaf889-e480-4e5d-b5c8-d4496d119261/barbican-api-log/0.log" Mar 10 09:25:04 crc kubenswrapper[4825]: I0310 09:25:04.759130 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-644cfb5dc8-ldc2h_0ac2019c-71a9-4006-b1e3-a3d5104834c7/barbican-keystone-listener/0.log" Mar 10 09:25:05 crc kubenswrapper[4825]: I0310 09:25:05.060990 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7c98ff7-wbdq9_057a22f7-8752-4d5b-b896-5e1c5b46ce91/barbican-worker/0.log" Mar 10 09:25:05 crc kubenswrapper[4825]: I0310 09:25:05.124813 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7c98ff7-wbdq9_057a22f7-8752-4d5b-b896-5e1c5b46ce91/barbican-worker-log/0.log" Mar 10 09:25:05 crc kubenswrapper[4825]: I0310 09:25:05.197467 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-644cfb5dc8-ldc2h_0ac2019c-71a9-4006-b1e3-a3d5104834c7/barbican-keystone-listener-log/0.log" Mar 10 09:25:05 crc kubenswrapper[4825]: I0310 09:25:05.297341 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell1-grqk8_547aa46f-b19d-4704-89c0-4c27e28ba30e/bootstrap-openstack-openstack-cell1/0.log" Mar 10 09:25:05 crc kubenswrapper[4825]: I0310 09:25:05.462067 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0f712d86-b1c5-44e8-8096-2a9d37b7a792/ceilometer-central-agent/0.log" Mar 10 09:25:05 crc kubenswrapper[4825]: I0310 09:25:05.558071 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0f712d86-b1c5-44e8-8096-2a9d37b7a792/ceilometer-notification-agent/0.log" Mar 10 09:25:05 crc kubenswrapper[4825]: I0310 09:25:05.589355 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0f712d86-b1c5-44e8-8096-2a9d37b7a792/proxy-httpd/0.log" Mar 10 09:25:05 crc kubenswrapper[4825]: I0310 09:25:05.681299 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0f712d86-b1c5-44e8-8096-2a9d37b7a792/sg-core/0.log" Mar 10 09:25:06 crc kubenswrapper[4825]: I0310 09:25:06.417165 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_f6ed5b03-6418-446b-8bb0-88ed213149d1/cinder-api-log/0.log" Mar 10 09:25:06 crc kubenswrapper[4825]: I0310 09:25:06.478342 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_f6ed5b03-6418-446b-8bb0-88ed213149d1/cinder-api/0.log" Mar 10 09:25:06 crc kubenswrapper[4825]: I0310 09:25:06.677801 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_3edaef1f-39d6-4e94-a143-486f77278314/cinder-scheduler/0.log" Mar 10 09:25:06 crc kubenswrapper[4825]: I0310 09:25:06.777495 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_3edaef1f-39d6-4e94-a143-486f77278314/probe/0.log" Mar 10 09:25:06 crc kubenswrapper[4825]: I0310 09:25:06.822305 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-openstack-openstack-cell1-x88hw_c9229dc9-6790-4cd4-bbc3-0e6e156cc076/configure-network-openstack-openstack-cell1/0.log" Mar 10 09:25:07 crc kubenswrapper[4825]: I0310 09:25:07.048022 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-cell1-wghgg_596aaaec-e224-450d-886b-7e7477c7f221/configure-os-openstack-openstack-cell1/0.log" Mar 10 09:25:07 crc kubenswrapper[4825]: I0310 09:25:07.065251 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5958856c47-pd5nn_e59d4d2c-39bd-4900-8777-be4beac0f1c5/init/0.log" Mar 10 09:25:07 crc kubenswrapper[4825]: I0310 09:25:07.298989 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5958856c47-pd5nn_e59d4d2c-39bd-4900-8777-be4beac0f1c5/init/0.log" Mar 10 09:25:07 crc kubenswrapper[4825]: I0310 09:25:07.358630 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5958856c47-pd5nn_e59d4d2c-39bd-4900-8777-be4beac0f1c5/dnsmasq-dns/0.log" Mar 10 09:25:07 crc kubenswrapper[4825]: I0310 09:25:07.362250 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-cell1-mhc2l_ae858185-87e8-423e-86ff-cc5b55199c37/download-cache-openstack-openstack-cell1/0.log" Mar 10 09:25:07 crc kubenswrapper[4825]: I0310 09:25:07.546603 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_961053b2-4438-497f-8056-0f9d1b6f058b/glance-log/0.log" Mar 10 09:25:07 crc kubenswrapper[4825]: I0310 09:25:07.571725 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_961053b2-4438-497f-8056-0f9d1b6f058b/glance-httpd/0.log" Mar 10 09:25:07 crc kubenswrapper[4825]: I0310 09:25:07.676835 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_160cbdfa-ab8d-448a-85ae-38fcf6315509/glance-httpd/0.log" Mar 10 09:25:07 crc kubenswrapper[4825]: I0310 09:25:07.770692 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_160cbdfa-ab8d-448a-85ae-38fcf6315509/glance-log/0.log" Mar 10 09:25:08 crc kubenswrapper[4825]: I0310 09:25:08.123674 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-55d8bdb9c6-bvvnc_0f395cfa-be06-4d10-b6ca-2584f3588882/heat-engine/0.log" Mar 10 09:25:08 crc kubenswrapper[4825]: I0310 09:25:08.373989 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-7fcb64c775-kvwk8_8350adde-7bc4-4b8e-8e61-da40714ad2c2/heat-api/0.log" Mar 10 09:25:08 crc kubenswrapper[4825]: I0310 09:25:08.412874 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-fdb9f75f4-fr95n_75ad0a08-756c-40ea-85ec-3cd497c96680/horizon/0.log" Mar 10 09:25:08 crc kubenswrapper[4825]: I0310 09:25:08.602929 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-openstack-openstack-cell1-p52j5_bd05ab73-33e3-4441-80f1-47ecc62c610e/install-certs-openstack-openstack-cell1/0.log" Mar 10 09:25:08 crc kubenswrapper[4825]: I0310 09:25:08.618425 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-65988cc597-hxfff_3a394b38-8efc-4b86-a0a9-5a748e4d56e3/heat-cfnapi/0.log" Mar 10 09:25:08 crc kubenswrapper[4825]: I0310 09:25:08.825035 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-cell1-sl2sb_3d150047-2232-43ae-990a-23bf03421efa/install-os-openstack-openstack-cell1/0.log" Mar 10 09:25:08 crc kubenswrapper[4825]: I0310 09:25:08.919579 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-fdb9f75f4-fr95n_75ad0a08-756c-40ea-85ec-3cd497c96680/horizon-log/0.log" Mar 10 09:25:09 crc kubenswrapper[4825]: 
I0310 09:25:09.090468 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29552221-sbsfl_1304cf5f-c346-4ccd-b858-c8e63f3d0056/keystone-cron/0.log" Mar 10 09:25:09 crc kubenswrapper[4825]: I0310 09:25:09.370918 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_9c2d694b-0214-4e6e-b719-f002b7d58c2d/kube-state-metrics/0.log" Mar 10 09:25:09 crc kubenswrapper[4825]: I0310 09:25:09.510861 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-openstack-openstack-cell1-gd2sq_8da704d5-ded8-4714-936b-91ac018532dd/libvirt-openstack-openstack-cell1/0.log" Mar 10 09:25:09 crc kubenswrapper[4825]: I0310 09:25:09.537578 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-586b5b5d4c-4gh87_f663222e-3075-4730-a6d8-74a90a27c152/keystone-api/0.log" Mar 10 09:25:09 crc kubenswrapper[4825]: I0310 09:25:09.945283 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-cd8d6ccb7-tj99q_b23b1e12-ba16-4288-857c-116ccab84267/neutron-httpd/0.log" Mar 10 09:25:10 crc kubenswrapper[4825]: I0310 09:25:10.063623 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-dhcp-openstack-openstack-cell1-7nh2n_21a36d26-5d6d-4c12-869e-2b09e847886a/neutron-dhcp-openstack-openstack-cell1/0.log" Mar 10 09:25:10 crc kubenswrapper[4825]: I0310 09:25:10.272612 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-cell1-xcjgb_67401e8d-825e-461a-a88d-7573c48c5918/neutron-metadata-openstack-openstack-cell1/0.log" Mar 10 09:25:10 crc kubenswrapper[4825]: I0310 09:25:10.311489 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-cd8d6ccb7-tj99q_b23b1e12-ba16-4288-857c-116ccab84267/neutron-api/0.log" Mar 10 09:25:10 crc kubenswrapper[4825]: I0310 09:25:10.416509 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-sriov-openstack-openstack-cell1-fpdt8_be72d3b9-862b-4b9b-83a6-b8a991ddd51a/neutron-sriov-openstack-openstack-cell1/0.log" Mar 10 09:25:10 crc kubenswrapper[4825]: I0310 09:25:10.889553 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_ecdac00d-be5b-4b28-9c67-30e3249ac5b0/nova-api-log/0.log" Mar 10 09:25:10 crc kubenswrapper[4825]: I0310 09:25:10.942978 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_a7daff41-c98a-45c9-b2d3-7c4c58f4921f/nova-cell0-conductor-conductor/0.log" Mar 10 09:25:11 crc kubenswrapper[4825]: I0310 09:25:11.218948 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_ecdac00d-be5b-4b28-9c67-30e3249ac5b0/nova-api-api/0.log" Mar 10 09:25:11 crc kubenswrapper[4825]: I0310 09:25:11.228658 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_7eb509b3-8b71-4ed1-9ce1-0a0b535b723a/nova-cell1-conductor-conductor/0.log" Mar 10 09:25:11 crc kubenswrapper[4825]: I0310 09:25:11.285288 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_8ef73e54-5497-4af4-a1c3-89a47c67bcb8/nova-cell1-novncproxy-novncproxy/0.log" Mar 10 09:25:11 crc kubenswrapper[4825]: I0310 09:25:11.450220 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellcqjbc_2ef9f6e3-ec3c-40b2-9072-8f6859aa4019/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1/0.log" Mar 10 09:25:11 crc kubenswrapper[4825]: I0310 09:25:11.576676 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-openstack-cell1-dvswf_49b44b5f-c93d-4721-b90d-1bb8ff9d3cb7/nova-cell1-openstack-openstack-cell1/0.log" Mar 10 09:25:11 crc kubenswrapper[4825]: I0310 09:25:11.777761 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-metadata-0_2e429567-63da-4536-88f6-4a52cf840573/nova-metadata-log/0.log" Mar 10 09:25:12 crc kubenswrapper[4825]: I0310 09:25:12.071795 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_aee794fa-32be-4d93-ad70-f9c4837f2a66/nova-scheduler-scheduler/0.log" Mar 10 09:25:12 crc kubenswrapper[4825]: I0310 09:25:12.099236 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b069d964-330f-4562-9916-81cae7d0e72f/mysql-bootstrap/0.log" Mar 10 09:25:12 crc kubenswrapper[4825]: I0310 09:25:12.270509 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b069d964-330f-4562-9916-81cae7d0e72f/mysql-bootstrap/0.log" Mar 10 09:25:12 crc kubenswrapper[4825]: I0310 09:25:12.352437 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b069d964-330f-4562-9916-81cae7d0e72f/galera/0.log" Mar 10 09:25:12 crc kubenswrapper[4825]: I0310 09:25:12.398126 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_2e429567-63da-4536-88f6-4a52cf840573/nova-metadata-metadata/0.log" Mar 10 09:25:12 crc kubenswrapper[4825]: I0310 09:25:12.525256 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_47fdbdcc-cae4-4261-9695-645201acdc61/mysql-bootstrap/0.log" Mar 10 09:25:12 crc kubenswrapper[4825]: I0310 09:25:12.692232 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_47fdbdcc-cae4-4261-9695-645201acdc61/mysql-bootstrap/0.log" Mar 10 09:25:12 crc kubenswrapper[4825]: I0310 09:25:12.704458 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_47fdbdcc-cae4-4261-9695-645201acdc61/galera/0.log" Mar 10 09:25:12 crc kubenswrapper[4825]: I0310 09:25:12.866224 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstackclient_4a7d5fd4-1799-4d5d-9016-4ee3de9ed205/openstackclient/0.log" Mar 10 09:25:12 crc kubenswrapper[4825]: I0310 09:25:12.949828 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6dddf17b-b47a-40c1-b6e9-99de860dd2bf/openstack-network-exporter/0.log" Mar 10 09:25:13 crc kubenswrapper[4825]: I0310 09:25:13.107204 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6dddf17b-b47a-40c1-b6e9-99de860dd2bf/ovn-northd/0.log" Mar 10 09:25:13 crc kubenswrapper[4825]: I0310 09:25:13.283153 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-cell1-lwdhg_0fa4e7d5-3d32-40e6-9ea8-b9697ada4d46/ovn-openstack-openstack-cell1/0.log" Mar 10 09:25:13 crc kubenswrapper[4825]: I0310 09:25:13.328369 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_eee9953c-6698-4bdf-bda4-d8c49476fe3c/openstack-network-exporter/0.log" Mar 10 09:25:13 crc kubenswrapper[4825]: I0310 09:25:13.399357 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_eee9953c-6698-4bdf-bda4-d8c49476fe3c/ovsdbserver-nb/0.log" Mar 10 09:25:13 crc kubenswrapper[4825]: I0310 09:25:13.546532 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_624ef9e0-5a07-4d11-b58e-b4b2be2a25cc/ovsdbserver-nb/0.log" Mar 10 09:25:13 crc kubenswrapper[4825]: I0310 09:25:13.554235 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_624ef9e0-5a07-4d11-b58e-b4b2be2a25cc/openstack-network-exporter/0.log" Mar 10 09:25:13 crc kubenswrapper[4825]: I0310 09:25:13.791474 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_bf24e25c-e247-40c5-ab10-ecc60c1b83db/openstack-network-exporter/0.log" Mar 10 09:25:13 crc kubenswrapper[4825]: I0310 09:25:13.830201 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-nb-2_bf24e25c-e247-40c5-ab10-ecc60c1b83db/ovsdbserver-nb/0.log" Mar 10 09:25:13 crc kubenswrapper[4825]: I0310 09:25:13.926695 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_83412342-2b66-41e4-a4c6-c9715ab28427/openstack-network-exporter/0.log" Mar 10 09:25:14 crc kubenswrapper[4825]: I0310 09:25:14.031320 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_83412342-2b66-41e4-a4c6-c9715ab28427/ovsdbserver-sb/0.log" Mar 10 09:25:14 crc kubenswrapper[4825]: I0310 09:25:14.661842 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_ef0e8e14-a34c-4e12-814d-9c50d99cd5fb/ovsdbserver-sb/0.log" Mar 10 09:25:14 crc kubenswrapper[4825]: I0310 09:25:14.679269 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_ef0e8e14-a34c-4e12-814d-9c50d99cd5fb/openstack-network-exporter/0.log" Mar 10 09:25:14 crc kubenswrapper[4825]: I0310 09:25:14.745287 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_cbaf4a7a-c60d-42a1-8751-849bab562b68/openstack-network-exporter/0.log" Mar 10 09:25:14 crc kubenswrapper[4825]: I0310 09:25:14.942964 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_cbaf4a7a-c60d-42a1-8751-849bab562b68/ovsdbserver-sb/0.log" Mar 10 09:25:15 crc kubenswrapper[4825]: I0310 09:25:15.040930 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7bd6645db8-zxmsk_61eaa20d-9a8d-4edb-97c7-3d9ce430bab0/placement-api/0.log" Mar 10 09:25:15 crc kubenswrapper[4825]: I0310 09:25:15.142057 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7bd6645db8-zxmsk_61eaa20d-9a8d-4edb-97c7-3d9ce430bab0/placement-log/0.log" Mar 10 09:25:15 crc kubenswrapper[4825]: I0310 09:25:15.211456 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-cfsb8x_04196610-0f20-46dc-b6e9-2e0d1b62342d/pre-adoption-validation-openstack-pre-adoption-openstack-cell1/0.log" Mar 10 09:25:15 crc kubenswrapper[4825]: I0310 09:25:15.406620 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_f4caed09-3fd0-43fc-8e28-776e103343bd/init-config-reloader/0.log" Mar 10 09:25:15 crc kubenswrapper[4825]: I0310 09:25:15.560158 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_f4caed09-3fd0-43fc-8e28-776e103343bd/config-reloader/0.log" Mar 10 09:25:15 crc kubenswrapper[4825]: I0310 09:25:15.583455 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_f4caed09-3fd0-43fc-8e28-776e103343bd/init-config-reloader/0.log" Mar 10 09:25:15 crc kubenswrapper[4825]: I0310 09:25:15.595749 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_f4caed09-3fd0-43fc-8e28-776e103343bd/prometheus/0.log" Mar 10 09:25:15 crc kubenswrapper[4825]: I0310 09:25:15.615669 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_f4caed09-3fd0-43fc-8e28-776e103343bd/thanos-sidecar/0.log" Mar 10 09:25:16 crc kubenswrapper[4825]: I0310 09:25:16.281595 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a/setup-container/0.log" Mar 10 09:25:16 crc kubenswrapper[4825]: I0310 09:25:16.468127 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a/setup-container/0.log" Mar 10 09:25:16 crc kubenswrapper[4825]: I0310 09:25:16.483230 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b9e3f0a6-4a2e-4363-ab86-a4ac4ad4b65a/rabbitmq/0.log" Mar 10 
09:25:16 crc kubenswrapper[4825]: I0310 09:25:16.582241 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_67f57387-e031-49bf-9895-efa6796a98cd/setup-container/0.log" Mar 10 09:25:16 crc kubenswrapper[4825]: I0310 09:25:16.816157 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_67f57387-e031-49bf-9895-efa6796a98cd/setup-container/0.log" Mar 10 09:25:16 crc kubenswrapper[4825]: I0310 09:25:16.890563 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:25:16 crc kubenswrapper[4825]: I0310 09:25:16.890619 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:25:16 crc kubenswrapper[4825]: I0310 09:25:16.922309 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_67f57387-e031-49bf-9895-efa6796a98cd/rabbitmq/0.log" Mar 10 09:25:16 crc kubenswrapper[4825]: I0310 09:25:16.935627 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-cell1-ljb4q_5e0b6047-b6c6-4e6e-983d-c2640c8c4ba8/reboot-os-openstack-openstack-cell1/0.log" Mar 10 09:25:17 crc kubenswrapper[4825]: I0310 09:25:17.128461 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-cell1-7v9mc_9f21dc2a-195c-4bf3-82fe-be871a275ae3/run-os-openstack-openstack-cell1/0.log" Mar 10 09:25:17 crc kubenswrapper[4825]: I0310 09:25:17.188904 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ssh-known-hosts-openstack-jtd8b_f7c77abe-85f2-42bf-a43c-0fb55031e37d/ssh-known-hosts-openstack/0.log" Mar 10 09:25:17 crc kubenswrapper[4825]: I0310 09:25:17.395611 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6d6b57675b-q5h4k_8b67c6af-7ab6-4359-b2c7-c3e8b57fc722/proxy-server/0.log" Mar 10 09:25:17 crc kubenswrapper[4825]: I0310 09:25:17.596893 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6d6b57675b-q5h4k_8b67c6af-7ab6-4359-b2c7-c3e8b57fc722/proxy-httpd/0.log" Mar 10 09:25:17 crc kubenswrapper[4825]: I0310 09:25:17.601278 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-th79c_72b5083e-156e-42a5-abbd-1d71521f8ed4/swift-ring-rebalance/0.log" Mar 10 09:25:17 crc kubenswrapper[4825]: I0310 09:25:17.912112 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-openstack-openstack-cell1-9zrj6_d5874626-6e2f-4545-a7f4-225c36f183f4/telemetry-openstack-openstack-cell1/0.log" Mar 10 09:25:17 crc kubenswrapper[4825]: I0310 09:25:17.959765 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_f26d1ad6-86b3-4fef-84d4-78cd5df47576/tempest-tests-tempest-tests-runner/0.log" Mar 10 09:25:18 crc kubenswrapper[4825]: I0310 09:25:18.122294 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_7e511dae-571d-4450-a56e-9d9dfe4cc83a/test-operator-logs-container/0.log" Mar 10 09:25:18 crc kubenswrapper[4825]: I0310 09:25:18.231853 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-cell1-wv5bk_98107d82-46d1-4863-8c79-a10abec1737f/tripleo-cleanup-tripleo-cleanup-openstack-cell1/0.log" Mar 10 09:25:18 crc kubenswrapper[4825]: I0310 09:25:18.365937 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_validate-network-openstack-openstack-cell1-c2hw6_3d18dc7e-210e-4b9b-8d00-a7dc2d9dac8a/validate-network-openstack-openstack-cell1/0.log" Mar 10 09:25:25 crc kubenswrapper[4825]: I0310 09:25:25.022886 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_a6ce6631-6a7f-447f-9ea1-036ab13eec97/memcached/0.log" Mar 10 09:25:46 crc kubenswrapper[4825]: I0310 09:25:46.888680 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:25:46 crc kubenswrapper[4825]: I0310 09:25:46.889183 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:25:46 crc kubenswrapper[4825]: I0310 09:25:46.889227 4825 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" Mar 10 09:25:46 crc kubenswrapper[4825]: I0310 09:25:46.890020 4825 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"15cd81bfb1d58ce096026917f6cfb5d65358bfb886f1c139a5a4b17f5671d715"} pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 09:25:46 crc kubenswrapper[4825]: I0310 09:25:46.890113 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" 
podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" containerID="cri-o://15cd81bfb1d58ce096026917f6cfb5d65358bfb886f1c139a5a4b17f5671d715" gracePeriod=600 Mar 10 09:25:47 crc kubenswrapper[4825]: E0310 09:25:47.018985 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:25:47 crc kubenswrapper[4825]: I0310 09:25:47.594327 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-5d87c9d997-qqskl_7ed2e288-eed6-49bf-8eb8-be9a1e8415a3/manager/0.log" Mar 10 09:25:47 crc kubenswrapper[4825]: I0310 09:25:47.722651 4825 generic.go:334] "Generic (PLEG): container finished" podID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerID="15cd81bfb1d58ce096026917f6cfb5d65358bfb886f1c139a5a4b17f5671d715" exitCode=0 Mar 10 09:25:47 crc kubenswrapper[4825]: I0310 09:25:47.722702 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerDied","Data":"15cd81bfb1d58ce096026917f6cfb5d65358bfb886f1c139a5a4b17f5671d715"} Mar 10 09:25:47 crc kubenswrapper[4825]: I0310 09:25:47.722748 4825 scope.go:117] "RemoveContainer" containerID="b92fb551f45e393908a43b65868ed01517bc29380c8beb54ff37ab5e2099e73d" Mar 10 09:25:47 crc kubenswrapper[4825]: I0310 09:25:47.723945 4825 scope.go:117] "RemoveContainer" containerID="15cd81bfb1d58ce096026917f6cfb5d65358bfb886f1c139a5a4b17f5671d715" Mar 10 09:25:47 crc kubenswrapper[4825]: E0310 09:25:47.724290 4825 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:25:47 crc kubenswrapper[4825]: I0310 09:25:47.853327 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfak69dl_fe29f2e8-8710-42c7-b36b-820eb611fd11/util/0.log" Mar 10 09:25:48 crc kubenswrapper[4825]: I0310 09:25:48.040016 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfak69dl_fe29f2e8-8710-42c7-b36b-820eb611fd11/pull/0.log" Mar 10 09:25:48 crc kubenswrapper[4825]: I0310 09:25:48.057988 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfak69dl_fe29f2e8-8710-42c7-b36b-820eb611fd11/util/0.log" Mar 10 09:25:48 crc kubenswrapper[4825]: I0310 09:25:48.250610 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfak69dl_fe29f2e8-8710-42c7-b36b-820eb611fd11/pull/0.log" Mar 10 09:25:48 crc kubenswrapper[4825]: I0310 09:25:48.509418 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfak69dl_fe29f2e8-8710-42c7-b36b-820eb611fd11/util/0.log" Mar 10 09:25:48 crc kubenswrapper[4825]: I0310 09:25:48.533407 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfak69dl_fe29f2e8-8710-42c7-b36b-820eb611fd11/pull/0.log" Mar 10 09:25:48 crc kubenswrapper[4825]: I0310 
09:25:48.773670 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfak69dl_fe29f2e8-8710-42c7-b36b-820eb611fd11/extract/0.log" Mar 10 09:25:48 crc kubenswrapper[4825]: I0310 09:25:48.830347 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6db6876945-nl9ph_53bfb88f-ffff-4945-acc2-ae245147edcb/manager/0.log" Mar 10 09:25:49 crc kubenswrapper[4825]: I0310 09:25:49.165517 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-64db6967f8-6jm6b_3ee3f9b2-09ba-4f80-823f-a4fd8639a14a/manager/0.log" Mar 10 09:25:49 crc kubenswrapper[4825]: I0310 09:25:49.166455 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-cf99c678f-7lfrk_a16a88fb-4aab-48c8-a540-f999a66f712b/manager/0.log" Mar 10 09:25:49 crc kubenswrapper[4825]: I0310 09:25:49.441550 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-78bc7f9bd9-zbvvz_2c8e75e0-135a-4d58-9230-a0f18ad89b25/manager/0.log" Mar 10 09:25:49 crc kubenswrapper[4825]: I0310 09:25:49.732711 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-545456dc4-bpf8z_ff530b27-be8c-40d4-9af7-ed6200fcdcac/manager/0.log" Mar 10 09:25:50 crc kubenswrapper[4825]: I0310 09:25:50.145567 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7c789f89c6-pnfls_c0c7b817-9526-41fb-9a08-f2951ef9db20/manager/0.log" Mar 10 09:25:50 crc kubenswrapper[4825]: I0310 09:25:50.262268 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-f7fcc58b9-b5zsk_adbbc5e4-169b-4b18-819f-2ca9136edf9b/manager/0.log" Mar 10 09:25:50 crc 
kubenswrapper[4825]: I0310 09:25:50.390034 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-67d996989d-cmg8t_c947e95f-2771-4a5e-8adc-adf86fbbcc48/manager/0.log" Mar 10 09:25:50 crc kubenswrapper[4825]: I0310 09:25:50.567935 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-7b6bfb6475-npxgg_18fccb01-59df-4020-88c5-9c70b9f0edec/manager/0.log" Mar 10 09:25:50 crc kubenswrapper[4825]: I0310 09:25:50.819035 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-54688575f-qhwdw_d1dffe25-5486-4aba-932a-41725895d7cf/manager/0.log" Mar 10 09:25:51 crc kubenswrapper[4825]: I0310 09:25:51.083113 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5d86c7ddb7-z8726_406e912a-4ca3-4760-ba95-e63b4d342849/manager/0.log" Mar 10 09:25:51 crc kubenswrapper[4825]: I0310 09:25:51.109250 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-55d77d7b5c-sd5jw_acc23278-44ed-4cee-bd08-4562e17175db/manager/0.log" Mar 10 09:25:51 crc kubenswrapper[4825]: I0310 09:25:51.188256 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-74b6b5dc96-dn5wn_6667b99e-cca6-4450-b44d-03edafac3e7a/manager/0.log" Mar 10 09:25:51 crc kubenswrapper[4825]: I0310 09:25:51.318628 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6f64bd8c75rmqrc_0db55365-104a-42c2-ba9b-1c084fdd08cf/manager/0.log" Mar 10 09:25:51 crc kubenswrapper[4825]: I0310 09:25:51.464070 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-controller-init-568b7cf6db-7vlc2_5e253cf5-24cb-4d27-ab16-7aaa2cafa25b/operator/0.log" Mar 10 09:25:51 crc kubenswrapper[4825]: I0310 09:25:51.868199 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-75684d597f-4fxt4_7004ec9c-96b9-41e9-85f2-f57a2ca785ef/manager/0.log" Mar 10 09:25:51 crc kubenswrapper[4825]: I0310 09:25:51.883561 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-f9ksf_38fb683b-3108-46e6-9532-3d971e047bde/registry-server/0.log" Mar 10 09:25:51 crc kubenswrapper[4825]: I0310 09:25:51.912886 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-648564c9fc-wx259_4753d4b8-4f42-480b-bed5-54aa4dd2e77b/manager/0.log" Mar 10 09:25:52 crc kubenswrapper[4825]: I0310 09:25:52.156093 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-k2l9r_41dde8f5-d43b-4cdf-beb8-56e67290e024/operator/0.log" Mar 10 09:25:52 crc kubenswrapper[4825]: I0310 09:25:52.207622 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9b9ff9f4d-bdxtj_e5de5a57-9f17-4101-952a-cc33147fc220/manager/0.log" Mar 10 09:25:52 crc kubenswrapper[4825]: I0310 09:25:52.458528 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-55b5ff4dbb-s56gc_6a3b19ba-bd45-4cae-a466-a4f380a10cd0/manager/0.log" Mar 10 09:25:52 crc kubenswrapper[4825]: I0310 09:25:52.662960 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-bccc79885-t88nq_ab1a7215-c090-4f4e-a994-603bbbafb22e/manager/0.log" Mar 10 09:25:52 crc kubenswrapper[4825]: I0310 09:25:52.684372 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5fdb694969-6bw6n_88d2e717-1281-45d1-a787-74fafe29c95b/manager/0.log" Mar 10 09:25:53 crc kubenswrapper[4825]: I0310 09:25:53.753725 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-59b6c9788f-n2gmk_5e92a016-c5ae-4ae4-a853-f69e701639fd/manager/0.log" Mar 10 09:25:59 crc kubenswrapper[4825]: I0310 09:25:59.243402 4825 scope.go:117] "RemoveContainer" containerID="15cd81bfb1d58ce096026917f6cfb5d65358bfb886f1c139a5a4b17f5671d715" Mar 10 09:25:59 crc kubenswrapper[4825]: E0310 09:25:59.244241 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:26:00 crc kubenswrapper[4825]: I0310 09:26:00.144648 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552246-gx7b6"] Mar 10 09:26:00 crc kubenswrapper[4825]: E0310 09:26:00.145385 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0e3a09a-8fb1-440b-97d1-60abea92baac" containerName="oc" Mar 10 09:26:00 crc kubenswrapper[4825]: I0310 09:26:00.145404 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0e3a09a-8fb1-440b-97d1-60abea92baac" containerName="oc" Mar 10 09:26:00 crc kubenswrapper[4825]: I0310 09:26:00.145606 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0e3a09a-8fb1-440b-97d1-60abea92baac" containerName="oc" Mar 10 09:26:00 crc kubenswrapper[4825]: I0310 09:26:00.146422 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552246-gx7b6" Mar 10 09:26:00 crc kubenswrapper[4825]: I0310 09:26:00.151114 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 09:26:00 crc kubenswrapper[4825]: I0310 09:26:00.151340 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:26:00 crc kubenswrapper[4825]: I0310 09:26:00.151444 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:26:00 crc kubenswrapper[4825]: I0310 09:26:00.156373 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552246-gx7b6"] Mar 10 09:26:00 crc kubenswrapper[4825]: I0310 09:26:00.203266 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9jm2\" (UniqueName: \"kubernetes.io/projected/6771d618-87af-4f42-bbfd-fa0882be021b-kube-api-access-j9jm2\") pod \"auto-csr-approver-29552246-gx7b6\" (UID: \"6771d618-87af-4f42-bbfd-fa0882be021b\") " pod="openshift-infra/auto-csr-approver-29552246-gx7b6" Mar 10 09:26:00 crc kubenswrapper[4825]: I0310 09:26:00.305915 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9jm2\" (UniqueName: \"kubernetes.io/projected/6771d618-87af-4f42-bbfd-fa0882be021b-kube-api-access-j9jm2\") pod \"auto-csr-approver-29552246-gx7b6\" (UID: \"6771d618-87af-4f42-bbfd-fa0882be021b\") " pod="openshift-infra/auto-csr-approver-29552246-gx7b6" Mar 10 09:26:00 crc kubenswrapper[4825]: I0310 09:26:00.331882 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9jm2\" (UniqueName: \"kubernetes.io/projected/6771d618-87af-4f42-bbfd-fa0882be021b-kube-api-access-j9jm2\") pod \"auto-csr-approver-29552246-gx7b6\" (UID: \"6771d618-87af-4f42-bbfd-fa0882be021b\") " 
pod="openshift-infra/auto-csr-approver-29552246-gx7b6" Mar 10 09:26:00 crc kubenswrapper[4825]: I0310 09:26:00.474835 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552246-gx7b6" Mar 10 09:26:00 crc kubenswrapper[4825]: I0310 09:26:00.927845 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552246-gx7b6"] Mar 10 09:26:01 crc kubenswrapper[4825]: I0310 09:26:01.858665 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552246-gx7b6" event={"ID":"6771d618-87af-4f42-bbfd-fa0882be021b","Type":"ContainerStarted","Data":"958f69a975d3aea2307e11e4673e17180330fc57ef99690d7140c5be24787416"} Mar 10 09:26:02 crc kubenswrapper[4825]: I0310 09:26:02.877082 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552246-gx7b6" event={"ID":"6771d618-87af-4f42-bbfd-fa0882be021b","Type":"ContainerStarted","Data":"91db7283cc7c88f6dbb5ebaa7e4456911dcb4cc23d28b2e206e9a13ad56dcd51"} Mar 10 09:26:02 crc kubenswrapper[4825]: I0310 09:26:02.899595 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552246-gx7b6" podStartSLOduration=1.2956641709999999 podStartE2EDuration="2.899574635s" podCreationTimestamp="2026-03-10 09:26:00 +0000 UTC" firstStartedPulling="2026-03-10 09:26:00.932052243 +0000 UTC m=+9713.961832858" lastFinishedPulling="2026-03-10 09:26:02.535962707 +0000 UTC m=+9715.565743322" observedRunningTime="2026-03-10 09:26:02.892916207 +0000 UTC m=+9715.922696832" watchObservedRunningTime="2026-03-10 09:26:02.899574635 +0000 UTC m=+9715.929355250" Mar 10 09:26:03 crc kubenswrapper[4825]: I0310 09:26:03.890835 4825 generic.go:334] "Generic (PLEG): container finished" podID="6771d618-87af-4f42-bbfd-fa0882be021b" containerID="91db7283cc7c88f6dbb5ebaa7e4456911dcb4cc23d28b2e206e9a13ad56dcd51" exitCode=0 Mar 10 09:26:03 crc 
kubenswrapper[4825]: I0310 09:26:03.891031 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552246-gx7b6" event={"ID":"6771d618-87af-4f42-bbfd-fa0882be021b","Type":"ContainerDied","Data":"91db7283cc7c88f6dbb5ebaa7e4456911dcb4cc23d28b2e206e9a13ad56dcd51"} Mar 10 09:26:05 crc kubenswrapper[4825]: I0310 09:26:05.287521 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552246-gx7b6" Mar 10 09:26:05 crc kubenswrapper[4825]: I0310 09:26:05.309434 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9jm2\" (UniqueName: \"kubernetes.io/projected/6771d618-87af-4f42-bbfd-fa0882be021b-kube-api-access-j9jm2\") pod \"6771d618-87af-4f42-bbfd-fa0882be021b\" (UID: \"6771d618-87af-4f42-bbfd-fa0882be021b\") " Mar 10 09:26:05 crc kubenswrapper[4825]: I0310 09:26:05.357243 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6771d618-87af-4f42-bbfd-fa0882be021b-kube-api-access-j9jm2" (OuterVolumeSpecName: "kube-api-access-j9jm2") pod "6771d618-87af-4f42-bbfd-fa0882be021b" (UID: "6771d618-87af-4f42-bbfd-fa0882be021b"). InnerVolumeSpecName "kube-api-access-j9jm2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:26:05 crc kubenswrapper[4825]: I0310 09:26:05.412807 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9jm2\" (UniqueName: \"kubernetes.io/projected/6771d618-87af-4f42-bbfd-fa0882be021b-kube-api-access-j9jm2\") on node \"crc\" DevicePath \"\"" Mar 10 09:26:05 crc kubenswrapper[4825]: I0310 09:26:05.940303 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552246-gx7b6" event={"ID":"6771d618-87af-4f42-bbfd-fa0882be021b","Type":"ContainerDied","Data":"958f69a975d3aea2307e11e4673e17180330fc57ef99690d7140c5be24787416"} Mar 10 09:26:05 crc kubenswrapper[4825]: I0310 09:26:05.940337 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552246-gx7b6" Mar 10 09:26:05 crc kubenswrapper[4825]: I0310 09:26:05.940349 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="958f69a975d3aea2307e11e4673e17180330fc57ef99690d7140c5be24787416" Mar 10 09:26:05 crc kubenswrapper[4825]: I0310 09:26:05.970466 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552240-pr7kz"] Mar 10 09:26:05 crc kubenswrapper[4825]: I0310 09:26:05.979733 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552240-pr7kz"] Mar 10 09:26:07 crc kubenswrapper[4825]: I0310 09:26:07.247324 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f2d034b-b5fe-43dd-bb3e-f0e6abcaf576" path="/var/lib/kubelet/pods/9f2d034b-b5fe-43dd-bb3e-f0e6abcaf576/volumes" Mar 10 09:26:10 crc kubenswrapper[4825]: I0310 09:26:10.557463 4825 scope.go:117] "RemoveContainer" containerID="832c2d8c4318ea276ca58758b3333554ac72c5ba86987707c7c5d68b4624661e" Mar 10 09:26:13 crc kubenswrapper[4825]: I0310 09:26:13.237017 4825 scope.go:117] "RemoveContainer" 
containerID="15cd81bfb1d58ce096026917f6cfb5d65358bfb886f1c139a5a4b17f5671d715" Mar 10 09:26:13 crc kubenswrapper[4825]: E0310 09:26:13.237815 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:26:16 crc kubenswrapper[4825]: I0310 09:26:16.964596 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-wp526_14642521-187c-45cf-aa34-cfd4fa40e632/control-plane-machine-set-operator/0.log" Mar 10 09:26:17 crc kubenswrapper[4825]: I0310 09:26:17.016530 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-2kdt2_25ad6c89-d2e0-408e-8c6f-a49da5a55bdd/kube-rbac-proxy/0.log" Mar 10 09:26:17 crc kubenswrapper[4825]: I0310 09:26:17.166738 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-2kdt2_25ad6c89-d2e0-408e-8c6f-a49da5a55bdd/machine-api-operator/0.log" Mar 10 09:26:28 crc kubenswrapper[4825]: I0310 09:26:28.236688 4825 scope.go:117] "RemoveContainer" containerID="15cd81bfb1d58ce096026917f6cfb5d65358bfb886f1c139a5a4b17f5671d715" Mar 10 09:26:28 crc kubenswrapper[4825]: E0310 09:26:28.237626 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" 
podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:26:30 crc kubenswrapper[4825]: I0310 09:26:30.910216 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-f568q_2d2dbde8-ed02-48ce-9b9e-103834db8e3a/cert-manager-controller/0.log" Mar 10 09:26:31 crc kubenswrapper[4825]: I0310 09:26:31.142735 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-7sr7h_11b4eb4d-13b1-4244-ae96-c77df6e04d59/cert-manager-webhook/0.log" Mar 10 09:26:31 crc kubenswrapper[4825]: I0310 09:26:31.161051 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-hlzgp_a5f3c59e-41fb-4805-b696-47f2095503e1/cert-manager-cainjector/0.log" Mar 10 09:26:43 crc kubenswrapper[4825]: I0310 09:26:43.239597 4825 scope.go:117] "RemoveContainer" containerID="15cd81bfb1d58ce096026917f6cfb5d65358bfb886f1c139a5a4b17f5671d715" Mar 10 09:26:43 crc kubenswrapper[4825]: E0310 09:26:43.240364 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:26:45 crc kubenswrapper[4825]: I0310 09:26:45.817533 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-q777z_b267da99-b608-441b-a6de-7a74d1923a8b/nmstate-console-plugin/0.log" Mar 10 09:26:46 crc kubenswrapper[4825]: I0310 09:26:46.550066 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-k8c9x_d4991d4a-6815-4cc2-84e6-b7be04db45bf/nmstate-handler/0.log" Mar 10 09:26:46 crc kubenswrapper[4825]: I0310 09:26:46.590843 4825 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-5hjqp_3cb1cbce-e37d-43fb-8b62-fc1b414cf7b9/kube-rbac-proxy/0.log" Mar 10 09:26:46 crc kubenswrapper[4825]: I0310 09:26:46.607078 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-5hjqp_3cb1cbce-e37d-43fb-8b62-fc1b414cf7b9/nmstate-metrics/0.log" Mar 10 09:26:46 crc kubenswrapper[4825]: I0310 09:26:46.779951 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-qttr9_8e892956-8421-4579-a5e1-bef91a563c26/nmstate-operator/0.log" Mar 10 09:26:46 crc kubenswrapper[4825]: I0310 09:26:46.835773 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-d8h2s_d75451be-5545-4c15-ae11-5c795d29494c/nmstate-webhook/0.log" Mar 10 09:26:53 crc kubenswrapper[4825]: I0310 09:26:53.553301 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ntmtl"] Mar 10 09:26:53 crc kubenswrapper[4825]: E0310 09:26:53.554417 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6771d618-87af-4f42-bbfd-fa0882be021b" containerName="oc" Mar 10 09:26:53 crc kubenswrapper[4825]: I0310 09:26:53.554433 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="6771d618-87af-4f42-bbfd-fa0882be021b" containerName="oc" Mar 10 09:26:53 crc kubenswrapper[4825]: I0310 09:26:53.554699 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="6771d618-87af-4f42-bbfd-fa0882be021b" containerName="oc" Mar 10 09:26:53 crc kubenswrapper[4825]: I0310 09:26:53.556640 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ntmtl" Mar 10 09:26:53 crc kubenswrapper[4825]: I0310 09:26:53.563536 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ntmtl"] Mar 10 09:26:53 crc kubenswrapper[4825]: I0310 09:26:53.617973 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwlbc\" (UniqueName: \"kubernetes.io/projected/8361d3a4-00fc-4fe7-ab1b-a90452a1b73e-kube-api-access-pwlbc\") pod \"redhat-marketplace-ntmtl\" (UID: \"8361d3a4-00fc-4fe7-ab1b-a90452a1b73e\") " pod="openshift-marketplace/redhat-marketplace-ntmtl" Mar 10 09:26:53 crc kubenswrapper[4825]: I0310 09:26:53.618028 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8361d3a4-00fc-4fe7-ab1b-a90452a1b73e-catalog-content\") pod \"redhat-marketplace-ntmtl\" (UID: \"8361d3a4-00fc-4fe7-ab1b-a90452a1b73e\") " pod="openshift-marketplace/redhat-marketplace-ntmtl" Mar 10 09:26:53 crc kubenswrapper[4825]: I0310 09:26:53.618057 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8361d3a4-00fc-4fe7-ab1b-a90452a1b73e-utilities\") pod \"redhat-marketplace-ntmtl\" (UID: \"8361d3a4-00fc-4fe7-ab1b-a90452a1b73e\") " pod="openshift-marketplace/redhat-marketplace-ntmtl" Mar 10 09:26:53 crc kubenswrapper[4825]: I0310 09:26:53.720535 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8361d3a4-00fc-4fe7-ab1b-a90452a1b73e-utilities\") pod \"redhat-marketplace-ntmtl\" (UID: \"8361d3a4-00fc-4fe7-ab1b-a90452a1b73e\") " pod="openshift-marketplace/redhat-marketplace-ntmtl" Mar 10 09:26:53 crc kubenswrapper[4825]: I0310 09:26:53.721005 4825 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8361d3a4-00fc-4fe7-ab1b-a90452a1b73e-utilities\") pod \"redhat-marketplace-ntmtl\" (UID: \"8361d3a4-00fc-4fe7-ab1b-a90452a1b73e\") " pod="openshift-marketplace/redhat-marketplace-ntmtl" Mar 10 09:26:53 crc kubenswrapper[4825]: I0310 09:26:53.721323 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwlbc\" (UniqueName: \"kubernetes.io/projected/8361d3a4-00fc-4fe7-ab1b-a90452a1b73e-kube-api-access-pwlbc\") pod \"redhat-marketplace-ntmtl\" (UID: \"8361d3a4-00fc-4fe7-ab1b-a90452a1b73e\") " pod="openshift-marketplace/redhat-marketplace-ntmtl" Mar 10 09:26:53 crc kubenswrapper[4825]: I0310 09:26:53.721382 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8361d3a4-00fc-4fe7-ab1b-a90452a1b73e-catalog-content\") pod \"redhat-marketplace-ntmtl\" (UID: \"8361d3a4-00fc-4fe7-ab1b-a90452a1b73e\") " pod="openshift-marketplace/redhat-marketplace-ntmtl" Mar 10 09:26:53 crc kubenswrapper[4825]: I0310 09:26:53.721677 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8361d3a4-00fc-4fe7-ab1b-a90452a1b73e-catalog-content\") pod \"redhat-marketplace-ntmtl\" (UID: \"8361d3a4-00fc-4fe7-ab1b-a90452a1b73e\") " pod="openshift-marketplace/redhat-marketplace-ntmtl" Mar 10 09:26:53 crc kubenswrapper[4825]: I0310 09:26:53.740130 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwlbc\" (UniqueName: \"kubernetes.io/projected/8361d3a4-00fc-4fe7-ab1b-a90452a1b73e-kube-api-access-pwlbc\") pod \"redhat-marketplace-ntmtl\" (UID: \"8361d3a4-00fc-4fe7-ab1b-a90452a1b73e\") " pod="openshift-marketplace/redhat-marketplace-ntmtl" Mar 10 09:26:53 crc kubenswrapper[4825]: I0310 09:26:53.898837 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ntmtl" Mar 10 09:26:54 crc kubenswrapper[4825]: I0310 09:26:54.394243 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ntmtl"] Mar 10 09:26:55 crc kubenswrapper[4825]: I0310 09:26:55.405613 4825 generic.go:334] "Generic (PLEG): container finished" podID="8361d3a4-00fc-4fe7-ab1b-a90452a1b73e" containerID="f265ff9d4713ad8fcf735395712ccfe786a773be6bf74d5f9e99e0a76501d785" exitCode=0 Mar 10 09:26:55 crc kubenswrapper[4825]: I0310 09:26:55.405700 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ntmtl" event={"ID":"8361d3a4-00fc-4fe7-ab1b-a90452a1b73e","Type":"ContainerDied","Data":"f265ff9d4713ad8fcf735395712ccfe786a773be6bf74d5f9e99e0a76501d785"} Mar 10 09:26:55 crc kubenswrapper[4825]: I0310 09:26:55.406017 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ntmtl" event={"ID":"8361d3a4-00fc-4fe7-ab1b-a90452a1b73e","Type":"ContainerStarted","Data":"5a72514d39da468c3345602bf36b4d681c9aad77cc56f426792faa1f188b0ca9"} Mar 10 09:26:55 crc kubenswrapper[4825]: I0310 09:26:55.409087 4825 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 09:26:57 crc kubenswrapper[4825]: I0310 09:26:57.236609 4825 scope.go:117] "RemoveContainer" containerID="15cd81bfb1d58ce096026917f6cfb5d65358bfb886f1c139a5a4b17f5671d715" Mar 10 09:26:57 crc kubenswrapper[4825]: E0310 09:26:57.236919 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 
09:26:57 crc kubenswrapper[4825]: I0310 09:26:57.437882 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ntmtl" event={"ID":"8361d3a4-00fc-4fe7-ab1b-a90452a1b73e","Type":"ContainerStarted","Data":"f3ae0416ce9703b4050fbe396ec54cabd62d7255d8cc4973f2e970a313df1f6d"} Mar 10 09:26:58 crc kubenswrapper[4825]: I0310 09:26:58.454064 4825 generic.go:334] "Generic (PLEG): container finished" podID="8361d3a4-00fc-4fe7-ab1b-a90452a1b73e" containerID="f3ae0416ce9703b4050fbe396ec54cabd62d7255d8cc4973f2e970a313df1f6d" exitCode=0 Mar 10 09:26:58 crc kubenswrapper[4825]: I0310 09:26:58.454107 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ntmtl" event={"ID":"8361d3a4-00fc-4fe7-ab1b-a90452a1b73e","Type":"ContainerDied","Data":"f3ae0416ce9703b4050fbe396ec54cabd62d7255d8cc4973f2e970a313df1f6d"} Mar 10 09:26:59 crc kubenswrapper[4825]: I0310 09:26:59.465404 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ntmtl" event={"ID":"8361d3a4-00fc-4fe7-ab1b-a90452a1b73e","Type":"ContainerStarted","Data":"700a5b2a5b579f04d9a05c207112066421cf6a265681c9aff9b47513d85060bb"} Mar 10 09:26:59 crc kubenswrapper[4825]: I0310 09:26:59.495491 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ntmtl" podStartSLOduration=2.9535714459999998 podStartE2EDuration="6.495471043s" podCreationTimestamp="2026-03-10 09:26:53 +0000 UTC" firstStartedPulling="2026-03-10 09:26:55.408665253 +0000 UTC m=+9768.438445888" lastFinishedPulling="2026-03-10 09:26:58.95056483 +0000 UTC m=+9771.980345485" observedRunningTime="2026-03-10 09:26:59.485641111 +0000 UTC m=+9772.515421726" watchObservedRunningTime="2026-03-10 09:26:59.495471043 +0000 UTC m=+9772.525251658" Mar 10 09:27:01 crc kubenswrapper[4825]: I0310 09:27:01.122500 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-xhhcv_66cf4082-239b-4875-b3cc-4f83e75f3c41/prometheus-operator/0.log" Mar 10 09:27:01 crc kubenswrapper[4825]: I0310 09:27:01.331440 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-bbfbfd454-md4ll_3d3e4992-fd3b-42ba-af1c-65278a4b277e/prometheus-operator-admission-webhook/0.log" Mar 10 09:27:01 crc kubenswrapper[4825]: I0310 09:27:01.400323 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-bbfbfd454-rcqmb_a291a88f-48bf-45a9-80f1-e558286ab74a/prometheus-operator-admission-webhook/0.log" Mar 10 09:27:01 crc kubenswrapper[4825]: I0310 09:27:01.537234 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-rtvkd_0b126417-0421-4901-bbf0-e8c75dffa4d5/operator/0.log" Mar 10 09:27:01 crc kubenswrapper[4825]: I0310 09:27:01.608444 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-jxdsc_e9441b13-c0dc-478c-90c1-43abb52482af/perses-operator/0.log" Mar 10 09:27:03 crc kubenswrapper[4825]: I0310 09:27:03.899073 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ntmtl" Mar 10 09:27:03 crc kubenswrapper[4825]: I0310 09:27:03.899344 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ntmtl" Mar 10 09:27:03 crc kubenswrapper[4825]: I0310 09:27:03.951603 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ntmtl" Mar 10 09:27:04 crc kubenswrapper[4825]: I0310 09:27:04.589344 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ntmtl" Mar 10 09:27:04 crc kubenswrapper[4825]: 
I0310 09:27:04.650843 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ntmtl"] Mar 10 09:27:06 crc kubenswrapper[4825]: I0310 09:27:06.527018 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ntmtl" podUID="8361d3a4-00fc-4fe7-ab1b-a90452a1b73e" containerName="registry-server" containerID="cri-o://700a5b2a5b579f04d9a05c207112066421cf6a265681c9aff9b47513d85060bb" gracePeriod=2 Mar 10 09:27:07 crc kubenswrapper[4825]: I0310 09:27:07.545518 4825 generic.go:334] "Generic (PLEG): container finished" podID="8361d3a4-00fc-4fe7-ab1b-a90452a1b73e" containerID="700a5b2a5b579f04d9a05c207112066421cf6a265681c9aff9b47513d85060bb" exitCode=0 Mar 10 09:27:07 crc kubenswrapper[4825]: I0310 09:27:07.545573 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ntmtl" event={"ID":"8361d3a4-00fc-4fe7-ab1b-a90452a1b73e","Type":"ContainerDied","Data":"700a5b2a5b579f04d9a05c207112066421cf6a265681c9aff9b47513d85060bb"} Mar 10 09:27:07 crc kubenswrapper[4825]: I0310 09:27:07.786503 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ntmtl" Mar 10 09:27:07 crc kubenswrapper[4825]: I0310 09:27:07.937549 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8361d3a4-00fc-4fe7-ab1b-a90452a1b73e-catalog-content\") pod \"8361d3a4-00fc-4fe7-ab1b-a90452a1b73e\" (UID: \"8361d3a4-00fc-4fe7-ab1b-a90452a1b73e\") " Mar 10 09:27:07 crc kubenswrapper[4825]: I0310 09:27:07.937639 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwlbc\" (UniqueName: \"kubernetes.io/projected/8361d3a4-00fc-4fe7-ab1b-a90452a1b73e-kube-api-access-pwlbc\") pod \"8361d3a4-00fc-4fe7-ab1b-a90452a1b73e\" (UID: \"8361d3a4-00fc-4fe7-ab1b-a90452a1b73e\") " Mar 10 09:27:07 crc kubenswrapper[4825]: I0310 09:27:07.937691 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8361d3a4-00fc-4fe7-ab1b-a90452a1b73e-utilities\") pod \"8361d3a4-00fc-4fe7-ab1b-a90452a1b73e\" (UID: \"8361d3a4-00fc-4fe7-ab1b-a90452a1b73e\") " Mar 10 09:27:07 crc kubenswrapper[4825]: I0310 09:27:07.938378 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8361d3a4-00fc-4fe7-ab1b-a90452a1b73e-utilities" (OuterVolumeSpecName: "utilities") pod "8361d3a4-00fc-4fe7-ab1b-a90452a1b73e" (UID: "8361d3a4-00fc-4fe7-ab1b-a90452a1b73e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:27:07 crc kubenswrapper[4825]: I0310 09:27:07.948287 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8361d3a4-00fc-4fe7-ab1b-a90452a1b73e-kube-api-access-pwlbc" (OuterVolumeSpecName: "kube-api-access-pwlbc") pod "8361d3a4-00fc-4fe7-ab1b-a90452a1b73e" (UID: "8361d3a4-00fc-4fe7-ab1b-a90452a1b73e"). InnerVolumeSpecName "kube-api-access-pwlbc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:27:07 crc kubenswrapper[4825]: I0310 09:27:07.979699 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8361d3a4-00fc-4fe7-ab1b-a90452a1b73e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8361d3a4-00fc-4fe7-ab1b-a90452a1b73e" (UID: "8361d3a4-00fc-4fe7-ab1b-a90452a1b73e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:27:08 crc kubenswrapper[4825]: I0310 09:27:08.040898 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8361d3a4-00fc-4fe7-ab1b-a90452a1b73e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:27:08 crc kubenswrapper[4825]: I0310 09:27:08.041454 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwlbc\" (UniqueName: \"kubernetes.io/projected/8361d3a4-00fc-4fe7-ab1b-a90452a1b73e-kube-api-access-pwlbc\") on node \"crc\" DevicePath \"\"" Mar 10 09:27:08 crc kubenswrapper[4825]: I0310 09:27:08.041810 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8361d3a4-00fc-4fe7-ab1b-a90452a1b73e-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:27:08 crc kubenswrapper[4825]: I0310 09:27:08.558599 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ntmtl" event={"ID":"8361d3a4-00fc-4fe7-ab1b-a90452a1b73e","Type":"ContainerDied","Data":"5a72514d39da468c3345602bf36b4d681c9aad77cc56f426792faa1f188b0ca9"} Mar 10 09:27:08 crc kubenswrapper[4825]: I0310 09:27:08.558679 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ntmtl" Mar 10 09:27:08 crc kubenswrapper[4825]: I0310 09:27:08.559628 4825 scope.go:117] "RemoveContainer" containerID="700a5b2a5b579f04d9a05c207112066421cf6a265681c9aff9b47513d85060bb" Mar 10 09:27:08 crc kubenswrapper[4825]: I0310 09:27:08.587170 4825 scope.go:117] "RemoveContainer" containerID="f3ae0416ce9703b4050fbe396ec54cabd62d7255d8cc4973f2e970a313df1f6d" Mar 10 09:27:08 crc kubenswrapper[4825]: I0310 09:27:08.610359 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ntmtl"] Mar 10 09:27:08 crc kubenswrapper[4825]: I0310 09:27:08.613083 4825 scope.go:117] "RemoveContainer" containerID="f265ff9d4713ad8fcf735395712ccfe786a773be6bf74d5f9e99e0a76501d785" Mar 10 09:27:08 crc kubenswrapper[4825]: I0310 09:27:08.626806 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ntmtl"] Mar 10 09:27:09 crc kubenswrapper[4825]: I0310 09:27:09.251890 4825 scope.go:117] "RemoveContainer" containerID="15cd81bfb1d58ce096026917f6cfb5d65358bfb886f1c139a5a4b17f5671d715" Mar 10 09:27:09 crc kubenswrapper[4825]: E0310 09:27:09.253067 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:27:09 crc kubenswrapper[4825]: I0310 09:27:09.255430 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8361d3a4-00fc-4fe7-ab1b-a90452a1b73e" path="/var/lib/kubelet/pods/8361d3a4-00fc-4fe7-ab1b-a90452a1b73e/volumes" Mar 10 09:27:17 crc kubenswrapper[4825]: I0310 09:27:17.114712 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-86ddb6bd46-sb9dl_e3a458e6-d3de-498f-83c2-215eb477a030/kube-rbac-proxy/0.log" Mar 10 09:27:17 crc kubenswrapper[4825]: I0310 09:27:17.388877 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lrjtj_876b5cff-b150-487c-8f79-752c845a44da/cp-frr-files/0.log" Mar 10 09:27:17 crc kubenswrapper[4825]: I0310 09:27:17.445790 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-sb9dl_e3a458e6-d3de-498f-83c2-215eb477a030/controller/0.log" Mar 10 09:27:17 crc kubenswrapper[4825]: I0310 09:27:17.616217 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lrjtj_876b5cff-b150-487c-8f79-752c845a44da/cp-metrics/0.log" Mar 10 09:27:17 crc kubenswrapper[4825]: I0310 09:27:17.625439 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lrjtj_876b5cff-b150-487c-8f79-752c845a44da/cp-frr-files/0.log" Mar 10 09:27:17 crc kubenswrapper[4825]: I0310 09:27:17.627808 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lrjtj_876b5cff-b150-487c-8f79-752c845a44da/cp-reloader/0.log" Mar 10 09:27:17 crc kubenswrapper[4825]: I0310 09:27:17.656398 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lrjtj_876b5cff-b150-487c-8f79-752c845a44da/cp-reloader/0.log" Mar 10 09:27:18 crc kubenswrapper[4825]: I0310 09:27:18.406954 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lrjtj_876b5cff-b150-487c-8f79-752c845a44da/cp-frr-files/0.log" Mar 10 09:27:18 crc kubenswrapper[4825]: I0310 09:27:18.478575 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lrjtj_876b5cff-b150-487c-8f79-752c845a44da/cp-reloader/0.log" Mar 10 09:27:18 crc kubenswrapper[4825]: I0310 09:27:18.491000 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-lrjtj_876b5cff-b150-487c-8f79-752c845a44da/cp-metrics/0.log" Mar 10 09:27:18 crc kubenswrapper[4825]: I0310 09:27:18.498968 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lrjtj_876b5cff-b150-487c-8f79-752c845a44da/cp-metrics/0.log" Mar 10 09:27:18 crc kubenswrapper[4825]: I0310 09:27:18.665811 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lrjtj_876b5cff-b150-487c-8f79-752c845a44da/cp-reloader/0.log" Mar 10 09:27:18 crc kubenswrapper[4825]: I0310 09:27:18.665840 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lrjtj_876b5cff-b150-487c-8f79-752c845a44da/cp-metrics/0.log" Mar 10 09:27:18 crc kubenswrapper[4825]: I0310 09:27:18.670653 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lrjtj_876b5cff-b150-487c-8f79-752c845a44da/cp-frr-files/0.log" Mar 10 09:27:18 crc kubenswrapper[4825]: I0310 09:27:18.743008 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lrjtj_876b5cff-b150-487c-8f79-752c845a44da/controller/0.log" Mar 10 09:27:18 crc kubenswrapper[4825]: I0310 09:27:18.867689 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lrjtj_876b5cff-b150-487c-8f79-752c845a44da/frr-metrics/0.log" Mar 10 09:27:18 crc kubenswrapper[4825]: I0310 09:27:18.880880 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lrjtj_876b5cff-b150-487c-8f79-752c845a44da/kube-rbac-proxy/0.log" Mar 10 09:27:19 crc kubenswrapper[4825]: I0310 09:27:19.004807 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lrjtj_876b5cff-b150-487c-8f79-752c845a44da/kube-rbac-proxy-frr/0.log" Mar 10 09:27:19 crc kubenswrapper[4825]: I0310 09:27:19.119405 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-lrjtj_876b5cff-b150-487c-8f79-752c845a44da/reloader/0.log" Mar 10 09:27:19 crc kubenswrapper[4825]: I0310 09:27:19.317426 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-v6sfm_0f74a263-0c43-4b78-8d8e-6a3b66166658/frr-k8s-webhook-server/0.log" Mar 10 09:27:19 crc kubenswrapper[4825]: I0310 09:27:19.384820 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5488d4f4f7-kjl8v_1876f466-8750-4a15-bbe2-e03da6d0df87/manager/0.log" Mar 10 09:27:19 crc kubenswrapper[4825]: I0310 09:27:19.644877 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-74cf7b6d9d-m2z87_944a661d-8d36-464b-a9d3-b2477f6e4663/webhook-server/0.log" Mar 10 09:27:19 crc kubenswrapper[4825]: I0310 09:27:19.814457 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-tsj92_f9daa544-4d9f-4106-a729-d330dc8b6cc3/kube-rbac-proxy/0.log" Mar 10 09:27:20 crc kubenswrapper[4825]: I0310 09:27:20.800412 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-tsj92_f9daa544-4d9f-4106-a729-d330dc8b6cc3/speaker/0.log" Mar 10 09:27:21 crc kubenswrapper[4825]: I0310 09:27:21.235763 4825 scope.go:117] "RemoveContainer" containerID="15cd81bfb1d58ce096026917f6cfb5d65358bfb886f1c139a5a4b17f5671d715" Mar 10 09:27:21 crc kubenswrapper[4825]: E0310 09:27:21.236030 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:27:22 crc kubenswrapper[4825]: I0310 
09:27:22.656182 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lrjtj_876b5cff-b150-487c-8f79-752c845a44da/frr/0.log" Mar 10 09:27:34 crc kubenswrapper[4825]: I0310 09:27:34.384858 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822p8k6_174bce7a-2e4f-4dfa-b6a4-d57d028d00de/util/0.log" Mar 10 09:27:34 crc kubenswrapper[4825]: I0310 09:27:34.606265 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822p8k6_174bce7a-2e4f-4dfa-b6a4-d57d028d00de/pull/0.log" Mar 10 09:27:34 crc kubenswrapper[4825]: I0310 09:27:34.634346 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822p8k6_174bce7a-2e4f-4dfa-b6a4-d57d028d00de/util/0.log" Mar 10 09:27:34 crc kubenswrapper[4825]: I0310 09:27:34.679551 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822p8k6_174bce7a-2e4f-4dfa-b6a4-d57d028d00de/pull/0.log" Mar 10 09:27:34 crc kubenswrapper[4825]: I0310 09:27:34.873953 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822p8k6_174bce7a-2e4f-4dfa-b6a4-d57d028d00de/util/0.log" Mar 10 09:27:34 crc kubenswrapper[4825]: I0310 09:27:34.923592 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822p8k6_174bce7a-2e4f-4dfa-b6a4-d57d028d00de/extract/0.log" Mar 10 09:27:34 crc kubenswrapper[4825]: I0310 09:27:34.953032 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a822p8k6_174bce7a-2e4f-4dfa-b6a4-d57d028d00de/pull/0.log" Mar 
10 09:27:35 crc kubenswrapper[4825]: I0310 09:27:35.084174 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5zw7qp_91c44c21-0e53-4940-a438-d4e4761d50e0/util/0.log" Mar 10 09:27:35 crc kubenswrapper[4825]: I0310 09:27:35.240067 4825 scope.go:117] "RemoveContainer" containerID="15cd81bfb1d58ce096026917f6cfb5d65358bfb886f1c139a5a4b17f5671d715" Mar 10 09:27:35 crc kubenswrapper[4825]: E0310 09:27:35.240388 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:27:35 crc kubenswrapper[4825]: I0310 09:27:35.304161 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5zw7qp_91c44c21-0e53-4940-a438-d4e4761d50e0/pull/0.log" Mar 10 09:27:35 crc kubenswrapper[4825]: I0310 09:27:35.319654 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5zw7qp_91c44c21-0e53-4940-a438-d4e4761d50e0/util/0.log" Mar 10 09:27:35 crc kubenswrapper[4825]: I0310 09:27:35.349785 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5zw7qp_91c44c21-0e53-4940-a438-d4e4761d50e0/pull/0.log" Mar 10 09:27:35 crc kubenswrapper[4825]: I0310 09:27:35.469720 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5zw7qp_91c44c21-0e53-4940-a438-d4e4761d50e0/util/0.log" Mar 10 09:27:35 crc 
kubenswrapper[4825]: I0310 09:27:35.509805 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5zw7qp_91c44c21-0e53-4940-a438-d4e4761d50e0/pull/0.log" Mar 10 09:27:35 crc kubenswrapper[4825]: I0310 09:27:35.584689 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5zw7qp_91c44c21-0e53-4940-a438-d4e4761d50e0/extract/0.log" Mar 10 09:27:35 crc kubenswrapper[4825]: I0310 09:27:35.731169 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkk6b_84d59e89-a282-426f-9173-22b57c51522a/util/0.log" Mar 10 09:27:35 crc kubenswrapper[4825]: I0310 09:27:35.987551 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkk6b_84d59e89-a282-426f-9173-22b57c51522a/pull/0.log" Mar 10 09:27:35 crc kubenswrapper[4825]: I0310 09:27:35.998103 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkk6b_84d59e89-a282-426f-9173-22b57c51522a/pull/0.log" Mar 10 09:27:36 crc kubenswrapper[4825]: I0310 09:27:36.003934 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkk6b_84d59e89-a282-426f-9173-22b57c51522a/util/0.log" Mar 10 09:27:36 crc kubenswrapper[4825]: I0310 09:27:36.163174 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkk6b_84d59e89-a282-426f-9173-22b57c51522a/extract/0.log" Mar 10 09:27:36 crc kubenswrapper[4825]: I0310 09:27:36.212376 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkk6b_84d59e89-a282-426f-9173-22b57c51522a/util/0.log" Mar 10 09:27:36 crc kubenswrapper[4825]: I0310 09:27:36.222367 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08lkk6b_84d59e89-a282-426f-9173-22b57c51522a/pull/0.log" Mar 10 09:27:36 crc kubenswrapper[4825]: I0310 09:27:36.384449 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g5lhn_9028cc8e-79e8-4c44-8bc0-2db0c871a8e4/extract-utilities/0.log" Mar 10 09:27:36 crc kubenswrapper[4825]: I0310 09:27:36.581255 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g5lhn_9028cc8e-79e8-4c44-8bc0-2db0c871a8e4/extract-utilities/0.log" Mar 10 09:27:36 crc kubenswrapper[4825]: I0310 09:27:36.586366 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g5lhn_9028cc8e-79e8-4c44-8bc0-2db0c871a8e4/extract-content/0.log" Mar 10 09:27:36 crc kubenswrapper[4825]: I0310 09:27:36.589588 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g5lhn_9028cc8e-79e8-4c44-8bc0-2db0c871a8e4/extract-content/0.log" Mar 10 09:27:36 crc kubenswrapper[4825]: I0310 09:27:36.760253 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g5lhn_9028cc8e-79e8-4c44-8bc0-2db0c871a8e4/extract-utilities/0.log" Mar 10 09:27:36 crc kubenswrapper[4825]: I0310 09:27:36.819543 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g5lhn_9028cc8e-79e8-4c44-8bc0-2db0c871a8e4/extract-content/0.log" Mar 10 09:27:37 crc kubenswrapper[4825]: I0310 09:27:37.028077 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-bjjzc_4818c2a2-e616-4b5f-87c5-5c1e5c0cea22/extract-utilities/0.log" Mar 10 09:27:37 crc kubenswrapper[4825]: I0310 09:27:37.204548 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bjjzc_4818c2a2-e616-4b5f-87c5-5c1e5c0cea22/extract-utilities/0.log" Mar 10 09:27:37 crc kubenswrapper[4825]: I0310 09:27:37.282773 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bjjzc_4818c2a2-e616-4b5f-87c5-5c1e5c0cea22/extract-content/0.log" Mar 10 09:27:37 crc kubenswrapper[4825]: I0310 09:27:37.326592 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bjjzc_4818c2a2-e616-4b5f-87c5-5c1e5c0cea22/extract-content/0.log" Mar 10 09:27:37 crc kubenswrapper[4825]: I0310 09:27:37.530949 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bjjzc_4818c2a2-e616-4b5f-87c5-5c1e5c0cea22/extract-content/0.log" Mar 10 09:27:37 crc kubenswrapper[4825]: I0310 09:27:37.557742 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bjjzc_4818c2a2-e616-4b5f-87c5-5c1e5c0cea22/extract-utilities/0.log" Mar 10 09:27:37 crc kubenswrapper[4825]: I0310 09:27:37.690620 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g5lhn_9028cc8e-79e8-4c44-8bc0-2db0c871a8e4/registry-server/0.log" Mar 10 09:27:37 crc kubenswrapper[4825]: I0310 09:27:37.801240 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44gb79_b30ba77f-d659-4501-94e6-cc3e980b3f41/util/0.log" Mar 10 09:27:38 crc kubenswrapper[4825]: I0310 09:27:38.103181 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44gb79_b30ba77f-d659-4501-94e6-cc3e980b3f41/pull/0.log" Mar 10 09:27:38 crc kubenswrapper[4825]: I0310 09:27:38.153671 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44gb79_b30ba77f-d659-4501-94e6-cc3e980b3f41/util/0.log" Mar 10 09:27:38 crc kubenswrapper[4825]: I0310 09:27:38.157057 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44gb79_b30ba77f-d659-4501-94e6-cc3e980b3f41/pull/0.log" Mar 10 09:27:38 crc kubenswrapper[4825]: I0310 09:27:38.308530 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44gb79_b30ba77f-d659-4501-94e6-cc3e980b3f41/util/0.log" Mar 10 09:27:38 crc kubenswrapper[4825]: I0310 09:27:38.408501 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44gb79_b30ba77f-d659-4501-94e6-cc3e980b3f41/pull/0.log" Mar 10 09:27:38 crc kubenswrapper[4825]: I0310 09:27:38.462856 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f44gb79_b30ba77f-d659-4501-94e6-cc3e980b3f41/extract/0.log" Mar 10 09:27:38 crc kubenswrapper[4825]: I0310 09:27:38.613713 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-jrdg7_fdaf5903-c728-4085-9c33-30359e2c9af3/marketplace-operator/0.log" Mar 10 09:27:38 crc kubenswrapper[4825]: I0310 09:27:38.915885 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-twdqr_1b90190d-cc9e-4a1e-9e24-cb85d5e4fd63/extract-utilities/0.log" Mar 10 09:27:39 crc kubenswrapper[4825]: 
I0310 09:27:39.061274 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bjjzc_4818c2a2-e616-4b5f-87c5-5c1e5c0cea22/registry-server/0.log" Mar 10 09:27:39 crc kubenswrapper[4825]: I0310 09:27:39.087893 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-twdqr_1b90190d-cc9e-4a1e-9e24-cb85d5e4fd63/extract-content/0.log" Mar 10 09:27:39 crc kubenswrapper[4825]: I0310 09:27:39.125832 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-twdqr_1b90190d-cc9e-4a1e-9e24-cb85d5e4fd63/extract-utilities/0.log" Mar 10 09:27:39 crc kubenswrapper[4825]: I0310 09:27:39.143651 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-twdqr_1b90190d-cc9e-4a1e-9e24-cb85d5e4fd63/extract-content/0.log" Mar 10 09:27:39 crc kubenswrapper[4825]: I0310 09:27:39.283780 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-twdqr_1b90190d-cc9e-4a1e-9e24-cb85d5e4fd63/extract-utilities/0.log" Mar 10 09:27:39 crc kubenswrapper[4825]: I0310 09:27:39.352086 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jnnl2_72e0dde1-2305-4348-b207-6809040bc665/extract-utilities/0.log" Mar 10 09:27:39 crc kubenswrapper[4825]: I0310 09:27:39.635502 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-twdqr_1b90190d-cc9e-4a1e-9e24-cb85d5e4fd63/extract-content/0.log" Mar 10 09:27:39 crc kubenswrapper[4825]: I0310 09:27:39.646329 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-twdqr_1b90190d-cc9e-4a1e-9e24-cb85d5e4fd63/registry-server/0.log" Mar 10 09:27:39 crc kubenswrapper[4825]: I0310 09:27:39.828797 4825 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-jnnl2_72e0dde1-2305-4348-b207-6809040bc665/extract-content/0.log" Mar 10 09:27:39 crc kubenswrapper[4825]: I0310 09:27:39.874971 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jnnl2_72e0dde1-2305-4348-b207-6809040bc665/extract-content/0.log" Mar 10 09:27:39 crc kubenswrapper[4825]: I0310 09:27:39.880046 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jnnl2_72e0dde1-2305-4348-b207-6809040bc665/extract-utilities/0.log" Mar 10 09:27:40 crc kubenswrapper[4825]: I0310 09:27:40.012928 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jnnl2_72e0dde1-2305-4348-b207-6809040bc665/extract-utilities/0.log" Mar 10 09:27:40 crc kubenswrapper[4825]: I0310 09:27:40.025165 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jnnl2_72e0dde1-2305-4348-b207-6809040bc665/extract-content/0.log" Mar 10 09:27:40 crc kubenswrapper[4825]: I0310 09:27:40.797917 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-jnnl2_72e0dde1-2305-4348-b207-6809040bc665/registry-server/0.log" Mar 10 09:27:49 crc kubenswrapper[4825]: I0310 09:27:49.238755 4825 scope.go:117] "RemoveContainer" containerID="15cd81bfb1d58ce096026917f6cfb5d65358bfb886f1c139a5a4b17f5671d715" Mar 10 09:27:49 crc kubenswrapper[4825]: E0310 09:27:49.239495 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:27:50 crc 
kubenswrapper[4825]: I0310 09:27:50.196710 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7bkkx"] Mar 10 09:27:50 crc kubenswrapper[4825]: E0310 09:27:50.197174 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8361d3a4-00fc-4fe7-ab1b-a90452a1b73e" containerName="extract-content" Mar 10 09:27:50 crc kubenswrapper[4825]: I0310 09:27:50.197194 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="8361d3a4-00fc-4fe7-ab1b-a90452a1b73e" containerName="extract-content" Mar 10 09:27:50 crc kubenswrapper[4825]: E0310 09:27:50.197216 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8361d3a4-00fc-4fe7-ab1b-a90452a1b73e" containerName="extract-utilities" Mar 10 09:27:50 crc kubenswrapper[4825]: I0310 09:27:50.197223 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="8361d3a4-00fc-4fe7-ab1b-a90452a1b73e" containerName="extract-utilities" Mar 10 09:27:50 crc kubenswrapper[4825]: E0310 09:27:50.197248 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8361d3a4-00fc-4fe7-ab1b-a90452a1b73e" containerName="registry-server" Mar 10 09:27:50 crc kubenswrapper[4825]: I0310 09:27:50.197254 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="8361d3a4-00fc-4fe7-ab1b-a90452a1b73e" containerName="registry-server" Mar 10 09:27:50 crc kubenswrapper[4825]: I0310 09:27:50.197466 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="8361d3a4-00fc-4fe7-ab1b-a90452a1b73e" containerName="registry-server" Mar 10 09:27:50 crc kubenswrapper[4825]: I0310 09:27:50.199073 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7bkkx" Mar 10 09:27:50 crc kubenswrapper[4825]: I0310 09:27:50.218936 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7bkkx"] Mar 10 09:27:50 crc kubenswrapper[4825]: I0310 09:27:50.352448 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5d8x\" (UniqueName: \"kubernetes.io/projected/763b5859-4985-4e4d-b2ed-a310cdaf5f3a-kube-api-access-d5d8x\") pod \"community-operators-7bkkx\" (UID: \"763b5859-4985-4e4d-b2ed-a310cdaf5f3a\") " pod="openshift-marketplace/community-operators-7bkkx" Mar 10 09:27:50 crc kubenswrapper[4825]: I0310 09:27:50.353751 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/763b5859-4985-4e4d-b2ed-a310cdaf5f3a-utilities\") pod \"community-operators-7bkkx\" (UID: \"763b5859-4985-4e4d-b2ed-a310cdaf5f3a\") " pod="openshift-marketplace/community-operators-7bkkx" Mar 10 09:27:50 crc kubenswrapper[4825]: I0310 09:27:50.354264 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/763b5859-4985-4e4d-b2ed-a310cdaf5f3a-catalog-content\") pod \"community-operators-7bkkx\" (UID: \"763b5859-4985-4e4d-b2ed-a310cdaf5f3a\") " pod="openshift-marketplace/community-operators-7bkkx" Mar 10 09:27:50 crc kubenswrapper[4825]: I0310 09:27:50.457067 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/763b5859-4985-4e4d-b2ed-a310cdaf5f3a-catalog-content\") pod \"community-operators-7bkkx\" (UID: \"763b5859-4985-4e4d-b2ed-a310cdaf5f3a\") " pod="openshift-marketplace/community-operators-7bkkx" Mar 10 09:27:50 crc kubenswrapper[4825]: I0310 09:27:50.456479 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/763b5859-4985-4e4d-b2ed-a310cdaf5f3a-catalog-content\") pod \"community-operators-7bkkx\" (UID: \"763b5859-4985-4e4d-b2ed-a310cdaf5f3a\") " pod="openshift-marketplace/community-operators-7bkkx" Mar 10 09:27:50 crc kubenswrapper[4825]: I0310 09:27:50.457600 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5d8x\" (UniqueName: \"kubernetes.io/projected/763b5859-4985-4e4d-b2ed-a310cdaf5f3a-kube-api-access-d5d8x\") pod \"community-operators-7bkkx\" (UID: \"763b5859-4985-4e4d-b2ed-a310cdaf5f3a\") " pod="openshift-marketplace/community-operators-7bkkx" Mar 10 09:27:50 crc kubenswrapper[4825]: I0310 09:27:50.458204 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/763b5859-4985-4e4d-b2ed-a310cdaf5f3a-utilities\") pod \"community-operators-7bkkx\" (UID: \"763b5859-4985-4e4d-b2ed-a310cdaf5f3a\") " pod="openshift-marketplace/community-operators-7bkkx" Mar 10 09:27:50 crc kubenswrapper[4825]: I0310 09:27:50.458719 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/763b5859-4985-4e4d-b2ed-a310cdaf5f3a-utilities\") pod \"community-operators-7bkkx\" (UID: \"763b5859-4985-4e4d-b2ed-a310cdaf5f3a\") " pod="openshift-marketplace/community-operators-7bkkx" Mar 10 09:27:50 crc kubenswrapper[4825]: I0310 09:27:50.476746 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5d8x\" (UniqueName: \"kubernetes.io/projected/763b5859-4985-4e4d-b2ed-a310cdaf5f3a-kube-api-access-d5d8x\") pod \"community-operators-7bkkx\" (UID: \"763b5859-4985-4e4d-b2ed-a310cdaf5f3a\") " pod="openshift-marketplace/community-operators-7bkkx" Mar 10 09:27:50 crc kubenswrapper[4825]: I0310 09:27:50.519592 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7bkkx" Mar 10 09:27:51 crc kubenswrapper[4825]: I0310 09:27:51.068995 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7bkkx"] Mar 10 09:27:52 crc kubenswrapper[4825]: I0310 09:27:52.020632 4825 generic.go:334] "Generic (PLEG): container finished" podID="763b5859-4985-4e4d-b2ed-a310cdaf5f3a" containerID="f05fdd3060b6a82f8664b39c6d51e6475dc1998ddd0c82646e201c26a4cc0478" exitCode=0 Mar 10 09:27:52 crc kubenswrapper[4825]: I0310 09:27:52.020791 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7bkkx" event={"ID":"763b5859-4985-4e4d-b2ed-a310cdaf5f3a","Type":"ContainerDied","Data":"f05fdd3060b6a82f8664b39c6d51e6475dc1998ddd0c82646e201c26a4cc0478"} Mar 10 09:27:52 crc kubenswrapper[4825]: I0310 09:27:52.020951 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7bkkx" event={"ID":"763b5859-4985-4e4d-b2ed-a310cdaf5f3a","Type":"ContainerStarted","Data":"d41b55b98794c978eda134f2ac4245a94c15c028af5ef97a7f9d1596eae646b9"} Mar 10 09:27:54 crc kubenswrapper[4825]: I0310 09:27:54.041694 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7bkkx" event={"ID":"763b5859-4985-4e4d-b2ed-a310cdaf5f3a","Type":"ContainerStarted","Data":"ac451f7a7e9cf0bf46428e7ca37f378992d11c6626dee82ee5ae14fabd7b62f6"} Mar 10 09:27:55 crc kubenswrapper[4825]: I0310 09:27:55.053849 4825 generic.go:334] "Generic (PLEG): container finished" podID="763b5859-4985-4e4d-b2ed-a310cdaf5f3a" containerID="ac451f7a7e9cf0bf46428e7ca37f378992d11c6626dee82ee5ae14fabd7b62f6" exitCode=0 Mar 10 09:27:55 crc kubenswrapper[4825]: I0310 09:27:55.053900 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7bkkx" 
event={"ID":"763b5859-4985-4e4d-b2ed-a310cdaf5f3a","Type":"ContainerDied","Data":"ac451f7a7e9cf0bf46428e7ca37f378992d11c6626dee82ee5ae14fabd7b62f6"} Mar 10 09:27:55 crc kubenswrapper[4825]: I0310 09:27:55.660206 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-xhhcv_66cf4082-239b-4875-b3cc-4f83e75f3c41/prometheus-operator/0.log" Mar 10 09:27:55 crc kubenswrapper[4825]: I0310 09:27:55.705574 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-bbfbfd454-md4ll_3d3e4992-fd3b-42ba-af1c-65278a4b277e/prometheus-operator-admission-webhook/0.log" Mar 10 09:27:55 crc kubenswrapper[4825]: I0310 09:27:55.720651 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-rtvkd_0b126417-0421-4901-bbf0-e8c75dffa4d5/operator/0.log" Mar 10 09:27:55 crc kubenswrapper[4825]: I0310 09:27:55.785896 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-bbfbfd454-rcqmb_a291a88f-48bf-45a9-80f1-e558286ab74a/prometheus-operator-admission-webhook/0.log" Mar 10 09:27:55 crc kubenswrapper[4825]: I0310 09:27:55.894470 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-jxdsc_e9441b13-c0dc-478c-90c1-43abb52482af/perses-operator/0.log" Mar 10 09:27:56 crc kubenswrapper[4825]: I0310 09:27:56.065463 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7bkkx" event={"ID":"763b5859-4985-4e4d-b2ed-a310cdaf5f3a","Type":"ContainerStarted","Data":"c1a992559f491f587bd906742c4de491f5c80d44cad7cb4b751777b1f5fe554e"} Mar 10 09:27:56 crc kubenswrapper[4825]: I0310 09:27:56.110090 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7bkkx" podStartSLOduration=2.676709291 
podStartE2EDuration="6.110072296s" podCreationTimestamp="2026-03-10 09:27:50 +0000 UTC" firstStartedPulling="2026-03-10 09:27:52.022762032 +0000 UTC m=+9825.052542647" lastFinishedPulling="2026-03-10 09:27:55.456125027 +0000 UTC m=+9828.485905652" observedRunningTime="2026-03-10 09:27:56.108307579 +0000 UTC m=+9829.138088204" watchObservedRunningTime="2026-03-10 09:27:56.110072296 +0000 UTC m=+9829.139852911" Mar 10 09:28:00 crc kubenswrapper[4825]: I0310 09:28:00.156657 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552248-k9m67"] Mar 10 09:28:00 crc kubenswrapper[4825]: I0310 09:28:00.158218 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552248-k9m67" Mar 10 09:28:00 crc kubenswrapper[4825]: I0310 09:28:00.160315 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 09:28:00 crc kubenswrapper[4825]: I0310 09:28:00.161007 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:28:00 crc kubenswrapper[4825]: I0310 09:28:00.161588 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:28:00 crc kubenswrapper[4825]: I0310 09:28:00.169677 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552248-k9m67"] Mar 10 09:28:00 crc kubenswrapper[4825]: I0310 09:28:00.261813 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skm6j\" (UniqueName: \"kubernetes.io/projected/c6ca031d-d17f-4b5d-891d-3d30afd18ac5-kube-api-access-skm6j\") pod \"auto-csr-approver-29552248-k9m67\" (UID: \"c6ca031d-d17f-4b5d-891d-3d30afd18ac5\") " pod="openshift-infra/auto-csr-approver-29552248-k9m67" Mar 10 09:28:00 crc kubenswrapper[4825]: I0310 09:28:00.365957 4825 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skm6j\" (UniqueName: \"kubernetes.io/projected/c6ca031d-d17f-4b5d-891d-3d30afd18ac5-kube-api-access-skm6j\") pod \"auto-csr-approver-29552248-k9m67\" (UID: \"c6ca031d-d17f-4b5d-891d-3d30afd18ac5\") " pod="openshift-infra/auto-csr-approver-29552248-k9m67" Mar 10 09:28:00 crc kubenswrapper[4825]: I0310 09:28:00.398112 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skm6j\" (UniqueName: \"kubernetes.io/projected/c6ca031d-d17f-4b5d-891d-3d30afd18ac5-kube-api-access-skm6j\") pod \"auto-csr-approver-29552248-k9m67\" (UID: \"c6ca031d-d17f-4b5d-891d-3d30afd18ac5\") " pod="openshift-infra/auto-csr-approver-29552248-k9m67" Mar 10 09:28:00 crc kubenswrapper[4825]: I0310 09:28:00.493166 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552248-k9m67" Mar 10 09:28:00 crc kubenswrapper[4825]: I0310 09:28:00.520687 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7bkkx" Mar 10 09:28:00 crc kubenswrapper[4825]: I0310 09:28:00.520726 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7bkkx" Mar 10 09:28:00 crc kubenswrapper[4825]: I0310 09:28:00.597162 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7bkkx" Mar 10 09:28:01 crc kubenswrapper[4825]: W0310 09:28:01.038828 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6ca031d_d17f_4b5d_891d_3d30afd18ac5.slice/crio-13617bcf2e82b264c7eda16ddde46e3ebe324830c05c7e06b0a7b849af38d6c8 WatchSource:0}: Error finding container 13617bcf2e82b264c7eda16ddde46e3ebe324830c05c7e06b0a7b849af38d6c8: Status 404 returned error can't find the container with id 
13617bcf2e82b264c7eda16ddde46e3ebe324830c05c7e06b0a7b849af38d6c8 Mar 10 09:28:01 crc kubenswrapper[4825]: I0310 09:28:01.042901 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552248-k9m67"] Mar 10 09:28:01 crc kubenswrapper[4825]: I0310 09:28:01.122750 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552248-k9m67" event={"ID":"c6ca031d-d17f-4b5d-891d-3d30afd18ac5","Type":"ContainerStarted","Data":"13617bcf2e82b264c7eda16ddde46e3ebe324830c05c7e06b0a7b849af38d6c8"} Mar 10 09:28:01 crc kubenswrapper[4825]: I0310 09:28:01.195725 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7bkkx" Mar 10 09:28:01 crc kubenswrapper[4825]: I0310 09:28:01.263060 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7bkkx"] Mar 10 09:28:03 crc kubenswrapper[4825]: I0310 09:28:03.154896 4825 generic.go:334] "Generic (PLEG): container finished" podID="c6ca031d-d17f-4b5d-891d-3d30afd18ac5" containerID="52ef8b0202f8209f3fe0bc09de0cd458bf9b56ea081c8dd7e83d180fbbef112e" exitCode=0 Mar 10 09:28:03 crc kubenswrapper[4825]: I0310 09:28:03.155062 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552248-k9m67" event={"ID":"c6ca031d-d17f-4b5d-891d-3d30afd18ac5","Type":"ContainerDied","Data":"52ef8b0202f8209f3fe0bc09de0cd458bf9b56ea081c8dd7e83d180fbbef112e"} Mar 10 09:28:03 crc kubenswrapper[4825]: I0310 09:28:03.155759 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7bkkx" podUID="763b5859-4985-4e4d-b2ed-a310cdaf5f3a" containerName="registry-server" containerID="cri-o://c1a992559f491f587bd906742c4de491f5c80d44cad7cb4b751777b1f5fe554e" gracePeriod=2 Mar 10 09:28:03 crc kubenswrapper[4825]: I0310 09:28:03.650344 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7bkkx" Mar 10 09:28:03 crc kubenswrapper[4825]: I0310 09:28:03.843814 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/763b5859-4985-4e4d-b2ed-a310cdaf5f3a-catalog-content\") pod \"763b5859-4985-4e4d-b2ed-a310cdaf5f3a\" (UID: \"763b5859-4985-4e4d-b2ed-a310cdaf5f3a\") " Mar 10 09:28:03 crc kubenswrapper[4825]: I0310 09:28:03.843924 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/763b5859-4985-4e4d-b2ed-a310cdaf5f3a-utilities\") pod \"763b5859-4985-4e4d-b2ed-a310cdaf5f3a\" (UID: \"763b5859-4985-4e4d-b2ed-a310cdaf5f3a\") " Mar 10 09:28:03 crc kubenswrapper[4825]: I0310 09:28:03.844056 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5d8x\" (UniqueName: \"kubernetes.io/projected/763b5859-4985-4e4d-b2ed-a310cdaf5f3a-kube-api-access-d5d8x\") pod \"763b5859-4985-4e4d-b2ed-a310cdaf5f3a\" (UID: \"763b5859-4985-4e4d-b2ed-a310cdaf5f3a\") " Mar 10 09:28:03 crc kubenswrapper[4825]: I0310 09:28:03.845070 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/763b5859-4985-4e4d-b2ed-a310cdaf5f3a-utilities" (OuterVolumeSpecName: "utilities") pod "763b5859-4985-4e4d-b2ed-a310cdaf5f3a" (UID: "763b5859-4985-4e4d-b2ed-a310cdaf5f3a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:28:03 crc kubenswrapper[4825]: I0310 09:28:03.850046 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/763b5859-4985-4e4d-b2ed-a310cdaf5f3a-kube-api-access-d5d8x" (OuterVolumeSpecName: "kube-api-access-d5d8x") pod "763b5859-4985-4e4d-b2ed-a310cdaf5f3a" (UID: "763b5859-4985-4e4d-b2ed-a310cdaf5f3a"). InnerVolumeSpecName "kube-api-access-d5d8x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:28:03 crc kubenswrapper[4825]: I0310 09:28:03.925706 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/763b5859-4985-4e4d-b2ed-a310cdaf5f3a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "763b5859-4985-4e4d-b2ed-a310cdaf5f3a" (UID: "763b5859-4985-4e4d-b2ed-a310cdaf5f3a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:28:03 crc kubenswrapper[4825]: I0310 09:28:03.945805 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/763b5859-4985-4e4d-b2ed-a310cdaf5f3a-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:28:03 crc kubenswrapper[4825]: I0310 09:28:03.945836 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5d8x\" (UniqueName: \"kubernetes.io/projected/763b5859-4985-4e4d-b2ed-a310cdaf5f3a-kube-api-access-d5d8x\") on node \"crc\" DevicePath \"\"" Mar 10 09:28:03 crc kubenswrapper[4825]: I0310 09:28:03.945846 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/763b5859-4985-4e4d-b2ed-a310cdaf5f3a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:28:04 crc kubenswrapper[4825]: I0310 09:28:04.188912 4825 generic.go:334] "Generic (PLEG): container finished" podID="763b5859-4985-4e4d-b2ed-a310cdaf5f3a" containerID="c1a992559f491f587bd906742c4de491f5c80d44cad7cb4b751777b1f5fe554e" exitCode=0 Mar 10 09:28:04 crc kubenswrapper[4825]: I0310 09:28:04.188976 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7bkkx" Mar 10 09:28:04 crc kubenswrapper[4825]: I0310 09:28:04.189032 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7bkkx" event={"ID":"763b5859-4985-4e4d-b2ed-a310cdaf5f3a","Type":"ContainerDied","Data":"c1a992559f491f587bd906742c4de491f5c80d44cad7cb4b751777b1f5fe554e"} Mar 10 09:28:04 crc kubenswrapper[4825]: I0310 09:28:04.189069 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7bkkx" event={"ID":"763b5859-4985-4e4d-b2ed-a310cdaf5f3a","Type":"ContainerDied","Data":"d41b55b98794c978eda134f2ac4245a94c15c028af5ef97a7f9d1596eae646b9"} Mar 10 09:28:04 crc kubenswrapper[4825]: I0310 09:28:04.189089 4825 scope.go:117] "RemoveContainer" containerID="c1a992559f491f587bd906742c4de491f5c80d44cad7cb4b751777b1f5fe554e" Mar 10 09:28:04 crc kubenswrapper[4825]: I0310 09:28:04.238032 4825 scope.go:117] "RemoveContainer" containerID="ac451f7a7e9cf0bf46428e7ca37f378992d11c6626dee82ee5ae14fabd7b62f6" Mar 10 09:28:04 crc kubenswrapper[4825]: I0310 09:28:04.240562 4825 scope.go:117] "RemoveContainer" containerID="15cd81bfb1d58ce096026917f6cfb5d65358bfb886f1c139a5a4b17f5671d715" Mar 10 09:28:04 crc kubenswrapper[4825]: E0310 09:28:04.240995 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:28:04 crc kubenswrapper[4825]: I0310 09:28:04.245187 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7bkkx"] Mar 10 09:28:04 crc kubenswrapper[4825]: I0310 09:28:04.252020 4825 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7bkkx"] Mar 10 09:28:04 crc kubenswrapper[4825]: I0310 09:28:04.299805 4825 scope.go:117] "RemoveContainer" containerID="f05fdd3060b6a82f8664b39c6d51e6475dc1998ddd0c82646e201c26a4cc0478" Mar 10 09:28:04 crc kubenswrapper[4825]: I0310 09:28:04.345434 4825 scope.go:117] "RemoveContainer" containerID="c1a992559f491f587bd906742c4de491f5c80d44cad7cb4b751777b1f5fe554e" Mar 10 09:28:04 crc kubenswrapper[4825]: E0310 09:28:04.347460 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1a992559f491f587bd906742c4de491f5c80d44cad7cb4b751777b1f5fe554e\": container with ID starting with c1a992559f491f587bd906742c4de491f5c80d44cad7cb4b751777b1f5fe554e not found: ID does not exist" containerID="c1a992559f491f587bd906742c4de491f5c80d44cad7cb4b751777b1f5fe554e" Mar 10 09:28:04 crc kubenswrapper[4825]: I0310 09:28:04.347501 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1a992559f491f587bd906742c4de491f5c80d44cad7cb4b751777b1f5fe554e"} err="failed to get container status \"c1a992559f491f587bd906742c4de491f5c80d44cad7cb4b751777b1f5fe554e\": rpc error: code = NotFound desc = could not find container \"c1a992559f491f587bd906742c4de491f5c80d44cad7cb4b751777b1f5fe554e\": container with ID starting with c1a992559f491f587bd906742c4de491f5c80d44cad7cb4b751777b1f5fe554e not found: ID does not exist" Mar 10 09:28:04 crc kubenswrapper[4825]: I0310 09:28:04.347527 4825 scope.go:117] "RemoveContainer" containerID="ac451f7a7e9cf0bf46428e7ca37f378992d11c6626dee82ee5ae14fabd7b62f6" Mar 10 09:28:04 crc kubenswrapper[4825]: E0310 09:28:04.347843 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac451f7a7e9cf0bf46428e7ca37f378992d11c6626dee82ee5ae14fabd7b62f6\": container with ID starting with 
ac451f7a7e9cf0bf46428e7ca37f378992d11c6626dee82ee5ae14fabd7b62f6 not found: ID does not exist" containerID="ac451f7a7e9cf0bf46428e7ca37f378992d11c6626dee82ee5ae14fabd7b62f6" Mar 10 09:28:04 crc kubenswrapper[4825]: I0310 09:28:04.347865 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac451f7a7e9cf0bf46428e7ca37f378992d11c6626dee82ee5ae14fabd7b62f6"} err="failed to get container status \"ac451f7a7e9cf0bf46428e7ca37f378992d11c6626dee82ee5ae14fabd7b62f6\": rpc error: code = NotFound desc = could not find container \"ac451f7a7e9cf0bf46428e7ca37f378992d11c6626dee82ee5ae14fabd7b62f6\": container with ID starting with ac451f7a7e9cf0bf46428e7ca37f378992d11c6626dee82ee5ae14fabd7b62f6 not found: ID does not exist" Mar 10 09:28:04 crc kubenswrapper[4825]: I0310 09:28:04.347877 4825 scope.go:117] "RemoveContainer" containerID="f05fdd3060b6a82f8664b39c6d51e6475dc1998ddd0c82646e201c26a4cc0478" Mar 10 09:28:04 crc kubenswrapper[4825]: E0310 09:28:04.348326 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f05fdd3060b6a82f8664b39c6d51e6475dc1998ddd0c82646e201c26a4cc0478\": container with ID starting with f05fdd3060b6a82f8664b39c6d51e6475dc1998ddd0c82646e201c26a4cc0478 not found: ID does not exist" containerID="f05fdd3060b6a82f8664b39c6d51e6475dc1998ddd0c82646e201c26a4cc0478" Mar 10 09:28:04 crc kubenswrapper[4825]: I0310 09:28:04.348373 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f05fdd3060b6a82f8664b39c6d51e6475dc1998ddd0c82646e201c26a4cc0478"} err="failed to get container status \"f05fdd3060b6a82f8664b39c6d51e6475dc1998ddd0c82646e201c26a4cc0478\": rpc error: code = NotFound desc = could not find container \"f05fdd3060b6a82f8664b39c6d51e6475dc1998ddd0c82646e201c26a4cc0478\": container with ID starting with f05fdd3060b6a82f8664b39c6d51e6475dc1998ddd0c82646e201c26a4cc0478 not found: ID does not 
exist" Mar 10 09:28:04 crc kubenswrapper[4825]: I0310 09:28:04.645490 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552248-k9m67" Mar 10 09:28:04 crc kubenswrapper[4825]: I0310 09:28:04.772083 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skm6j\" (UniqueName: \"kubernetes.io/projected/c6ca031d-d17f-4b5d-891d-3d30afd18ac5-kube-api-access-skm6j\") pod \"c6ca031d-d17f-4b5d-891d-3d30afd18ac5\" (UID: \"c6ca031d-d17f-4b5d-891d-3d30afd18ac5\") " Mar 10 09:28:04 crc kubenswrapper[4825]: I0310 09:28:04.782005 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6ca031d-d17f-4b5d-891d-3d30afd18ac5-kube-api-access-skm6j" (OuterVolumeSpecName: "kube-api-access-skm6j") pod "c6ca031d-d17f-4b5d-891d-3d30afd18ac5" (UID: "c6ca031d-d17f-4b5d-891d-3d30afd18ac5"). InnerVolumeSpecName "kube-api-access-skm6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:28:04 crc kubenswrapper[4825]: I0310 09:28:04.874715 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skm6j\" (UniqueName: \"kubernetes.io/projected/c6ca031d-d17f-4b5d-891d-3d30afd18ac5-kube-api-access-skm6j\") on node \"crc\" DevicePath \"\"" Mar 10 09:28:05 crc kubenswrapper[4825]: I0310 09:28:05.204376 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552248-k9m67" event={"ID":"c6ca031d-d17f-4b5d-891d-3d30afd18ac5","Type":"ContainerDied","Data":"13617bcf2e82b264c7eda16ddde46e3ebe324830c05c7e06b0a7b849af38d6c8"} Mar 10 09:28:05 crc kubenswrapper[4825]: I0310 09:28:05.204415 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13617bcf2e82b264c7eda16ddde46e3ebe324830c05c7e06b0a7b849af38d6c8" Mar 10 09:28:05 crc kubenswrapper[4825]: I0310 09:28:05.204413 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552248-k9m67" Mar 10 09:28:05 crc kubenswrapper[4825]: I0310 09:28:05.246618 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="763b5859-4985-4e4d-b2ed-a310cdaf5f3a" path="/var/lib/kubelet/pods/763b5859-4985-4e4d-b2ed-a310cdaf5f3a/volumes" Mar 10 09:28:05 crc kubenswrapper[4825]: I0310 09:28:05.712678 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552242-f7v75"] Mar 10 09:28:05 crc kubenswrapper[4825]: I0310 09:28:05.722411 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552242-f7v75"] Mar 10 09:28:07 crc kubenswrapper[4825]: I0310 09:28:07.246496 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ac7fce3-efdc-4ce8-85e3-ddf075cc33a3" path="/var/lib/kubelet/pods/2ac7fce3-efdc-4ce8-85e3-ddf075cc33a3/volumes" Mar 10 09:28:10 crc kubenswrapper[4825]: I0310 09:28:10.672166 4825 scope.go:117] "RemoveContainer" containerID="90c406be1ae8517dd7245a11cc253022e43e0d50f31061a1a4396f4babe843bc" Mar 10 09:28:10 crc kubenswrapper[4825]: I0310 09:28:10.717644 4825 scope.go:117] "RemoveContainer" containerID="1273d86ae42bf16ed9c6dd6bb37e44c23a47cf2f5c02db1eff173a1412a7d6f8" Mar 10 09:28:16 crc kubenswrapper[4825]: E0310 09:28:16.230840 4825 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.222:46136->38.102.83.222:42289: write tcp 38.102.83.222:46136->38.102.83.222:42289: write: broken pipe Mar 10 09:28:19 crc kubenswrapper[4825]: I0310 09:28:19.242485 4825 scope.go:117] "RemoveContainer" containerID="15cd81bfb1d58ce096026917f6cfb5d65358bfb886f1c139a5a4b17f5671d715" Mar 10 09:28:19 crc kubenswrapper[4825]: E0310 09:28:19.243453 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:28:34 crc kubenswrapper[4825]: I0310 09:28:34.236543 4825 scope.go:117] "RemoveContainer" containerID="15cd81bfb1d58ce096026917f6cfb5d65358bfb886f1c139a5a4b17f5671d715" Mar 10 09:28:34 crc kubenswrapper[4825]: E0310 09:28:34.237341 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:28:46 crc kubenswrapper[4825]: I0310 09:28:46.236303 4825 scope.go:117] "RemoveContainer" containerID="15cd81bfb1d58ce096026917f6cfb5d65358bfb886f1c139a5a4b17f5671d715" Mar 10 09:28:46 crc kubenswrapper[4825]: E0310 09:28:46.237000 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:29:00 crc kubenswrapper[4825]: I0310 09:29:00.237059 4825 scope.go:117] "RemoveContainer" containerID="15cd81bfb1d58ce096026917f6cfb5d65358bfb886f1c139a5a4b17f5671d715" Mar 10 09:29:00 crc kubenswrapper[4825]: E0310 09:29:00.237912 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:29:10 crc kubenswrapper[4825]: I0310 09:29:10.810330 4825 scope.go:117] "RemoveContainer" containerID="07357a2c16a1b56b39857ba18d933ee0c6ff40073fd7805dcc7520effcdac0ff" Mar 10 09:29:11 crc kubenswrapper[4825]: I0310 09:29:11.238047 4825 scope.go:117] "RemoveContainer" containerID="15cd81bfb1d58ce096026917f6cfb5d65358bfb886f1c139a5a4b17f5671d715" Mar 10 09:29:11 crc kubenswrapper[4825]: E0310 09:29:11.238372 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:29:25 crc kubenswrapper[4825]: I0310 09:29:25.236526 4825 scope.go:117] "RemoveContainer" containerID="15cd81bfb1d58ce096026917f6cfb5d65358bfb886f1c139a5a4b17f5671d715" Mar 10 09:29:25 crc kubenswrapper[4825]: E0310 09:29:25.237264 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:29:37 crc kubenswrapper[4825]: I0310 09:29:37.237225 4825 scope.go:117] "RemoveContainer" containerID="15cd81bfb1d58ce096026917f6cfb5d65358bfb886f1c139a5a4b17f5671d715" Mar 10 09:29:37 crc kubenswrapper[4825]: 
E0310 09:29:37.237987 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:29:50 crc kubenswrapper[4825]: I0310 09:29:50.236630 4825 scope.go:117] "RemoveContainer" containerID="15cd81bfb1d58ce096026917f6cfb5d65358bfb886f1c139a5a4b17f5671d715" Mar 10 09:29:50 crc kubenswrapper[4825]: E0310 09:29:50.238620 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:30:00 crc kubenswrapper[4825]: I0310 09:30:00.183200 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552250-5k2zp"] Mar 10 09:30:00 crc kubenswrapper[4825]: E0310 09:30:00.184206 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="763b5859-4985-4e4d-b2ed-a310cdaf5f3a" containerName="registry-server" Mar 10 09:30:00 crc kubenswrapper[4825]: I0310 09:30:00.184219 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="763b5859-4985-4e4d-b2ed-a310cdaf5f3a" containerName="registry-server" Mar 10 09:30:00 crc kubenswrapper[4825]: E0310 09:30:00.184241 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="763b5859-4985-4e4d-b2ed-a310cdaf5f3a" containerName="extract-content" Mar 10 09:30:00 crc kubenswrapper[4825]: I0310 09:30:00.184248 4825 
state_mem.go:107] "Deleted CPUSet assignment" podUID="763b5859-4985-4e4d-b2ed-a310cdaf5f3a" containerName="extract-content" Mar 10 09:30:00 crc kubenswrapper[4825]: E0310 09:30:00.184261 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6ca031d-d17f-4b5d-891d-3d30afd18ac5" containerName="oc" Mar 10 09:30:00 crc kubenswrapper[4825]: I0310 09:30:00.184267 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6ca031d-d17f-4b5d-891d-3d30afd18ac5" containerName="oc" Mar 10 09:30:00 crc kubenswrapper[4825]: E0310 09:30:00.184286 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="763b5859-4985-4e4d-b2ed-a310cdaf5f3a" containerName="extract-utilities" Mar 10 09:30:00 crc kubenswrapper[4825]: I0310 09:30:00.184293 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="763b5859-4985-4e4d-b2ed-a310cdaf5f3a" containerName="extract-utilities" Mar 10 09:30:00 crc kubenswrapper[4825]: I0310 09:30:00.184489 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6ca031d-d17f-4b5d-891d-3d30afd18ac5" containerName="oc" Mar 10 09:30:00 crc kubenswrapper[4825]: I0310 09:30:00.184503 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="763b5859-4985-4e4d-b2ed-a310cdaf5f3a" containerName="registry-server" Mar 10 09:30:00 crc kubenswrapper[4825]: I0310 09:30:00.185235 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-5k2zp" Mar 10 09:30:00 crc kubenswrapper[4825]: I0310 09:30:00.197738 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 09:30:00 crc kubenswrapper[4825]: I0310 09:30:00.198373 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 09:30:00 crc kubenswrapper[4825]: I0310 09:30:00.223536 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552250-tmwbw"] Mar 10 09:30:00 crc kubenswrapper[4825]: I0310 09:30:00.225174 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552250-tmwbw" Mar 10 09:30:00 crc kubenswrapper[4825]: I0310 09:30:00.227529 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:30:00 crc kubenswrapper[4825]: I0310 09:30:00.227697 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:30:00 crc kubenswrapper[4825]: I0310 09:30:00.228311 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 09:30:00 crc kubenswrapper[4825]: I0310 09:30:00.234304 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552250-tmwbw"] Mar 10 09:30:00 crc kubenswrapper[4825]: I0310 09:30:00.244315 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552250-5k2zp"] Mar 10 09:30:00 crc kubenswrapper[4825]: I0310 09:30:00.365277 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/16247d70-8ed3-4c74-98bb-51d4fb5c8194-config-volume\") pod \"collect-profiles-29552250-5k2zp\" (UID: \"16247d70-8ed3-4c74-98bb-51d4fb5c8194\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-5k2zp" Mar 10 09:30:00 crc kubenswrapper[4825]: I0310 09:30:00.365321 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhw9j\" (UniqueName: \"kubernetes.io/projected/16247d70-8ed3-4c74-98bb-51d4fb5c8194-kube-api-access-zhw9j\") pod \"collect-profiles-29552250-5k2zp\" (UID: \"16247d70-8ed3-4c74-98bb-51d4fb5c8194\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-5k2zp" Mar 10 09:30:00 crc kubenswrapper[4825]: I0310 09:30:00.365434 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/16247d70-8ed3-4c74-98bb-51d4fb5c8194-secret-volume\") pod \"collect-profiles-29552250-5k2zp\" (UID: \"16247d70-8ed3-4c74-98bb-51d4fb5c8194\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-5k2zp" Mar 10 09:30:00 crc kubenswrapper[4825]: I0310 09:30:00.365474 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsfd8\" (UniqueName: \"kubernetes.io/projected/3265220c-20d9-42f2-9b4c-f0751565ee20-kube-api-access-tsfd8\") pod \"auto-csr-approver-29552250-tmwbw\" (UID: \"3265220c-20d9-42f2-9b4c-f0751565ee20\") " pod="openshift-infra/auto-csr-approver-29552250-tmwbw" Mar 10 09:30:00 crc kubenswrapper[4825]: I0310 09:30:00.467282 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/16247d70-8ed3-4c74-98bb-51d4fb5c8194-secret-volume\") pod \"collect-profiles-29552250-5k2zp\" (UID: \"16247d70-8ed3-4c74-98bb-51d4fb5c8194\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-5k2zp" Mar 10 
09:30:00 crc kubenswrapper[4825]: I0310 09:30:00.467347 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsfd8\" (UniqueName: \"kubernetes.io/projected/3265220c-20d9-42f2-9b4c-f0751565ee20-kube-api-access-tsfd8\") pod \"auto-csr-approver-29552250-tmwbw\" (UID: \"3265220c-20d9-42f2-9b4c-f0751565ee20\") " pod="openshift-infra/auto-csr-approver-29552250-tmwbw" Mar 10 09:30:00 crc kubenswrapper[4825]: I0310 09:30:00.467431 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/16247d70-8ed3-4c74-98bb-51d4fb5c8194-config-volume\") pod \"collect-profiles-29552250-5k2zp\" (UID: \"16247d70-8ed3-4c74-98bb-51d4fb5c8194\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-5k2zp" Mar 10 09:30:00 crc kubenswrapper[4825]: I0310 09:30:00.467454 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhw9j\" (UniqueName: \"kubernetes.io/projected/16247d70-8ed3-4c74-98bb-51d4fb5c8194-kube-api-access-zhw9j\") pod \"collect-profiles-29552250-5k2zp\" (UID: \"16247d70-8ed3-4c74-98bb-51d4fb5c8194\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-5k2zp" Mar 10 09:30:00 crc kubenswrapper[4825]: I0310 09:30:00.468467 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/16247d70-8ed3-4c74-98bb-51d4fb5c8194-config-volume\") pod \"collect-profiles-29552250-5k2zp\" (UID: \"16247d70-8ed3-4c74-98bb-51d4fb5c8194\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-5k2zp" Mar 10 09:30:01 crc kubenswrapper[4825]: I0310 09:30:01.059502 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/16247d70-8ed3-4c74-98bb-51d4fb5c8194-secret-volume\") pod \"collect-profiles-29552250-5k2zp\" (UID: 
\"16247d70-8ed3-4c74-98bb-51d4fb5c8194\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-5k2zp" Mar 10 09:30:01 crc kubenswrapper[4825]: I0310 09:30:01.076555 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsfd8\" (UniqueName: \"kubernetes.io/projected/3265220c-20d9-42f2-9b4c-f0751565ee20-kube-api-access-tsfd8\") pod \"auto-csr-approver-29552250-tmwbw\" (UID: \"3265220c-20d9-42f2-9b4c-f0751565ee20\") " pod="openshift-infra/auto-csr-approver-29552250-tmwbw" Mar 10 09:30:01 crc kubenswrapper[4825]: I0310 09:30:01.076972 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhw9j\" (UniqueName: \"kubernetes.io/projected/16247d70-8ed3-4c74-98bb-51d4fb5c8194-kube-api-access-zhw9j\") pod \"collect-profiles-29552250-5k2zp\" (UID: \"16247d70-8ed3-4c74-98bb-51d4fb5c8194\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-5k2zp" Mar 10 09:30:01 crc kubenswrapper[4825]: I0310 09:30:01.106267 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-5k2zp" Mar 10 09:30:01 crc kubenswrapper[4825]: I0310 09:30:01.144562 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552250-tmwbw" Mar 10 09:30:01 crc kubenswrapper[4825]: I0310 09:30:01.594343 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552250-5k2zp"] Mar 10 09:30:01 crc kubenswrapper[4825]: W0310 09:30:01.596109 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16247d70_8ed3_4c74_98bb_51d4fb5c8194.slice/crio-c2e91e7fb530bcc59cf169e993dba87c4c9e5b468403138005e3005aa2c8e21c WatchSource:0}: Error finding container c2e91e7fb530bcc59cf169e993dba87c4c9e5b468403138005e3005aa2c8e21c: Status 404 returned error can't find the container with id c2e91e7fb530bcc59cf169e993dba87c4c9e5b468403138005e3005aa2c8e21c Mar 10 09:30:01 crc kubenswrapper[4825]: I0310 09:30:01.665882 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552250-tmwbw"] Mar 10 09:30:01 crc kubenswrapper[4825]: W0310 09:30:01.666475 4825 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3265220c_20d9_42f2_9b4c_f0751565ee20.slice/crio-d4fb0fbc715895609e13676e9020fd92ceb64ffc9827c23726ab126609dc382b WatchSource:0}: Error finding container d4fb0fbc715895609e13676e9020fd92ceb64ffc9827c23726ab126609dc382b: Status 404 returned error can't find the container with id d4fb0fbc715895609e13676e9020fd92ceb64ffc9827c23726ab126609dc382b Mar 10 09:30:02 crc kubenswrapper[4825]: I0310 09:30:02.583744 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552250-tmwbw" event={"ID":"3265220c-20d9-42f2-9b4c-f0751565ee20","Type":"ContainerStarted","Data":"d4fb0fbc715895609e13676e9020fd92ceb64ffc9827c23726ab126609dc382b"} Mar 10 09:30:02 crc kubenswrapper[4825]: I0310 09:30:02.586584 4825 generic.go:334] "Generic (PLEG): container finished" 
podID="16247d70-8ed3-4c74-98bb-51d4fb5c8194" containerID="f7df5c98b4b3f4d866ca47a886d543aa1424b7ac696215c843415bf84698452a" exitCode=0 Mar 10 09:30:02 crc kubenswrapper[4825]: I0310 09:30:02.586632 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-5k2zp" event={"ID":"16247d70-8ed3-4c74-98bb-51d4fb5c8194","Type":"ContainerDied","Data":"f7df5c98b4b3f4d866ca47a886d543aa1424b7ac696215c843415bf84698452a"} Mar 10 09:30:02 crc kubenswrapper[4825]: I0310 09:30:02.586662 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-5k2zp" event={"ID":"16247d70-8ed3-4c74-98bb-51d4fb5c8194","Type":"ContainerStarted","Data":"c2e91e7fb530bcc59cf169e993dba87c4c9e5b468403138005e3005aa2c8e21c"} Mar 10 09:30:03 crc kubenswrapper[4825]: I0310 09:30:03.962383 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-5k2zp" Mar 10 09:30:04 crc kubenswrapper[4825]: I0310 09:30:04.152660 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/16247d70-8ed3-4c74-98bb-51d4fb5c8194-config-volume\") pod \"16247d70-8ed3-4c74-98bb-51d4fb5c8194\" (UID: \"16247d70-8ed3-4c74-98bb-51d4fb5c8194\") " Mar 10 09:30:04 crc kubenswrapper[4825]: I0310 09:30:04.152783 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/16247d70-8ed3-4c74-98bb-51d4fb5c8194-secret-volume\") pod \"16247d70-8ed3-4c74-98bb-51d4fb5c8194\" (UID: \"16247d70-8ed3-4c74-98bb-51d4fb5c8194\") " Mar 10 09:30:04 crc kubenswrapper[4825]: I0310 09:30:04.152890 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhw9j\" (UniqueName: 
\"kubernetes.io/projected/16247d70-8ed3-4c74-98bb-51d4fb5c8194-kube-api-access-zhw9j\") pod \"16247d70-8ed3-4c74-98bb-51d4fb5c8194\" (UID: \"16247d70-8ed3-4c74-98bb-51d4fb5c8194\") " Mar 10 09:30:04 crc kubenswrapper[4825]: I0310 09:30:04.153092 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16247d70-8ed3-4c74-98bb-51d4fb5c8194-config-volume" (OuterVolumeSpecName: "config-volume") pod "16247d70-8ed3-4c74-98bb-51d4fb5c8194" (UID: "16247d70-8ed3-4c74-98bb-51d4fb5c8194"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 09:30:04 crc kubenswrapper[4825]: I0310 09:30:04.153871 4825 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/16247d70-8ed3-4c74-98bb-51d4fb5c8194-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 09:30:04 crc kubenswrapper[4825]: I0310 09:30:04.159158 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16247d70-8ed3-4c74-98bb-51d4fb5c8194-kube-api-access-zhw9j" (OuterVolumeSpecName: "kube-api-access-zhw9j") pod "16247d70-8ed3-4c74-98bb-51d4fb5c8194" (UID: "16247d70-8ed3-4c74-98bb-51d4fb5c8194"). InnerVolumeSpecName "kube-api-access-zhw9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:30:04 crc kubenswrapper[4825]: I0310 09:30:04.160962 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16247d70-8ed3-4c74-98bb-51d4fb5c8194-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "16247d70-8ed3-4c74-98bb-51d4fb5c8194" (UID: "16247d70-8ed3-4c74-98bb-51d4fb5c8194"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 09:30:04 crc kubenswrapper[4825]: I0310 09:30:04.256867 4825 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/16247d70-8ed3-4c74-98bb-51d4fb5c8194-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 09:30:04 crc kubenswrapper[4825]: I0310 09:30:04.257198 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhw9j\" (UniqueName: \"kubernetes.io/projected/16247d70-8ed3-4c74-98bb-51d4fb5c8194-kube-api-access-zhw9j\") on node \"crc\" DevicePath \"\"" Mar 10 09:30:04 crc kubenswrapper[4825]: I0310 09:30:04.610006 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-5k2zp" Mar 10 09:30:04 crc kubenswrapper[4825]: I0310 09:30:04.610300 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552250-5k2zp" event={"ID":"16247d70-8ed3-4c74-98bb-51d4fb5c8194","Type":"ContainerDied","Data":"c2e91e7fb530bcc59cf169e993dba87c4c9e5b468403138005e3005aa2c8e21c"} Mar 10 09:30:04 crc kubenswrapper[4825]: I0310 09:30:04.610332 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2e91e7fb530bcc59cf169e993dba87c4c9e5b468403138005e3005aa2c8e21c" Mar 10 09:30:04 crc kubenswrapper[4825]: I0310 09:30:04.613182 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552250-tmwbw" event={"ID":"3265220c-20d9-42f2-9b4c-f0751565ee20","Type":"ContainerStarted","Data":"1af828e7266b44d3e3c53eef6291e3ec109e470bb43741d6f83fc44430bac6cb"} Mar 10 09:30:04 crc kubenswrapper[4825]: I0310 09:30:04.659926 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552250-tmwbw" podStartSLOduration=2.033139764 podStartE2EDuration="4.659905201s" podCreationTimestamp="2026-03-10 
09:30:00 +0000 UTC" firstStartedPulling="2026-03-10 09:30:01.668451934 +0000 UTC m=+9954.698232549" lastFinishedPulling="2026-03-10 09:30:04.295217371 +0000 UTC m=+9957.324997986" observedRunningTime="2026-03-10 09:30:04.629507281 +0000 UTC m=+9957.659287916" watchObservedRunningTime="2026-03-10 09:30:04.659905201 +0000 UTC m=+9957.689685816" Mar 10 09:30:05 crc kubenswrapper[4825]: I0310 09:30:05.044091 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552205-bxgcw"] Mar 10 09:30:05 crc kubenswrapper[4825]: I0310 09:30:05.054910 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552205-bxgcw"] Mar 10 09:30:05 crc kubenswrapper[4825]: I0310 09:30:05.238720 4825 scope.go:117] "RemoveContainer" containerID="15cd81bfb1d58ce096026917f6cfb5d65358bfb886f1c139a5a4b17f5671d715" Mar 10 09:30:05 crc kubenswrapper[4825]: E0310 09:30:05.240633 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:30:05 crc kubenswrapper[4825]: I0310 09:30:05.247933 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="749d1021-e2d5-4c3a-a300-ea500dd37438" path="/var/lib/kubelet/pods/749d1021-e2d5-4c3a-a300-ea500dd37438/volumes" Mar 10 09:30:05 crc kubenswrapper[4825]: I0310 09:30:05.627155 4825 generic.go:334] "Generic (PLEG): container finished" podID="3265220c-20d9-42f2-9b4c-f0751565ee20" containerID="1af828e7266b44d3e3c53eef6291e3ec109e470bb43741d6f83fc44430bac6cb" exitCode=0 Mar 10 09:30:05 crc kubenswrapper[4825]: I0310 09:30:05.627205 4825 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552250-tmwbw" event={"ID":"3265220c-20d9-42f2-9b4c-f0751565ee20","Type":"ContainerDied","Data":"1af828e7266b44d3e3c53eef6291e3ec109e470bb43741d6f83fc44430bac6cb"} Mar 10 09:30:07 crc kubenswrapper[4825]: I0310 09:30:07.043127 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552250-tmwbw" Mar 10 09:30:07 crc kubenswrapper[4825]: I0310 09:30:07.232863 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsfd8\" (UniqueName: \"kubernetes.io/projected/3265220c-20d9-42f2-9b4c-f0751565ee20-kube-api-access-tsfd8\") pod \"3265220c-20d9-42f2-9b4c-f0751565ee20\" (UID: \"3265220c-20d9-42f2-9b4c-f0751565ee20\") " Mar 10 09:30:07 crc kubenswrapper[4825]: I0310 09:30:07.246357 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3265220c-20d9-42f2-9b4c-f0751565ee20-kube-api-access-tsfd8" (OuterVolumeSpecName: "kube-api-access-tsfd8") pod "3265220c-20d9-42f2-9b4c-f0751565ee20" (UID: "3265220c-20d9-42f2-9b4c-f0751565ee20"). InnerVolumeSpecName "kube-api-access-tsfd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:30:07 crc kubenswrapper[4825]: I0310 09:30:07.336210 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsfd8\" (UniqueName: \"kubernetes.io/projected/3265220c-20d9-42f2-9b4c-f0751565ee20-kube-api-access-tsfd8\") on node \"crc\" DevicePath \"\"" Mar 10 09:30:07 crc kubenswrapper[4825]: I0310 09:30:07.653006 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552250-tmwbw" event={"ID":"3265220c-20d9-42f2-9b4c-f0751565ee20","Type":"ContainerDied","Data":"d4fb0fbc715895609e13676e9020fd92ceb64ffc9827c23726ab126609dc382b"} Mar 10 09:30:07 crc kubenswrapper[4825]: I0310 09:30:07.653054 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4fb0fbc715895609e13676e9020fd92ceb64ffc9827c23726ab126609dc382b" Mar 10 09:30:07 crc kubenswrapper[4825]: I0310 09:30:07.653084 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552250-tmwbw" Mar 10 09:30:07 crc kubenswrapper[4825]: I0310 09:30:07.694358 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552244-bnptt"] Mar 10 09:30:07 crc kubenswrapper[4825]: I0310 09:30:07.702838 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552244-bnptt"] Mar 10 09:30:09 crc kubenswrapper[4825]: I0310 09:30:09.246674 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0e3a09a-8fb1-440b-97d1-60abea92baac" path="/var/lib/kubelet/pods/b0e3a09a-8fb1-440b-97d1-60abea92baac/volumes" Mar 10 09:30:10 crc kubenswrapper[4825]: I0310 09:30:10.905497 4825 scope.go:117] "RemoveContainer" containerID="7ae228cadb05a6fe4bceec6c595529ca06ba12722a7dbcd3b8fc270ed110488e" Mar 10 09:30:10 crc kubenswrapper[4825]: I0310 09:30:10.949449 4825 scope.go:117] "RemoveContainer" 
containerID="a9c32a5920a7d76803f7f322f3b6273ce17ece3db73976faa0b96d093b7971a2" Mar 10 09:30:11 crc kubenswrapper[4825]: I0310 09:30:11.696475 4825 generic.go:334] "Generic (PLEG): container finished" podID="605d8ac1-592a-4163-a750-ecdfec37d34d" containerID="dd3c3fdff94a987089f78265fc591a0367f6c5967711ec969b7ddaed28860fa2" exitCode=0 Mar 10 09:30:11 crc kubenswrapper[4825]: I0310 09:30:11.696526 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r6wjf/must-gather-kkm2h" event={"ID":"605d8ac1-592a-4163-a750-ecdfec37d34d","Type":"ContainerDied","Data":"dd3c3fdff94a987089f78265fc591a0367f6c5967711ec969b7ddaed28860fa2"} Mar 10 09:30:11 crc kubenswrapper[4825]: I0310 09:30:11.698001 4825 scope.go:117] "RemoveContainer" containerID="dd3c3fdff94a987089f78265fc591a0367f6c5967711ec969b7ddaed28860fa2" Mar 10 09:30:12 crc kubenswrapper[4825]: I0310 09:30:12.174811 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-r6wjf_must-gather-kkm2h_605d8ac1-592a-4163-a750-ecdfec37d34d/gather/0.log" Mar 10 09:30:17 crc kubenswrapper[4825]: I0310 09:30:17.237105 4825 scope.go:117] "RemoveContainer" containerID="15cd81bfb1d58ce096026917f6cfb5d65358bfb886f1c139a5a4b17f5671d715" Mar 10 09:30:17 crc kubenswrapper[4825]: E0310 09:30:17.238527 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:30:21 crc kubenswrapper[4825]: I0310 09:30:21.416330 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-r6wjf/must-gather-kkm2h"] Mar 10 09:30:21 crc kubenswrapper[4825]: I0310 09:30:21.417271 4825 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-must-gather-r6wjf/must-gather-kkm2h" podUID="605d8ac1-592a-4163-a750-ecdfec37d34d" containerName="copy" containerID="cri-o://f643552ef8085f2c4326cbbd990eae6f91c9ca2c7fbd46a93530ac2986257360" gracePeriod=2 Mar 10 09:30:21 crc kubenswrapper[4825]: I0310 09:30:21.426276 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-r6wjf/must-gather-kkm2h"] Mar 10 09:30:21 crc kubenswrapper[4825]: I0310 09:30:21.812598 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-r6wjf_must-gather-kkm2h_605d8ac1-592a-4163-a750-ecdfec37d34d/copy/0.log" Mar 10 09:30:21 crc kubenswrapper[4825]: I0310 09:30:21.813337 4825 generic.go:334] "Generic (PLEG): container finished" podID="605d8ac1-592a-4163-a750-ecdfec37d34d" containerID="f643552ef8085f2c4326cbbd990eae6f91c9ca2c7fbd46a93530ac2986257360" exitCode=143 Mar 10 09:30:21 crc kubenswrapper[4825]: I0310 09:30:21.926295 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-r6wjf_must-gather-kkm2h_605d8ac1-592a-4163-a750-ecdfec37d34d/copy/0.log" Mar 10 09:30:21 crc kubenswrapper[4825]: I0310 09:30:21.926695 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r6wjf/must-gather-kkm2h" Mar 10 09:30:22 crc kubenswrapper[4825]: I0310 09:30:22.091629 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwz4x\" (UniqueName: \"kubernetes.io/projected/605d8ac1-592a-4163-a750-ecdfec37d34d-kube-api-access-zwz4x\") pod \"605d8ac1-592a-4163-a750-ecdfec37d34d\" (UID: \"605d8ac1-592a-4163-a750-ecdfec37d34d\") " Mar 10 09:30:22 crc kubenswrapper[4825]: I0310 09:30:22.091689 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/605d8ac1-592a-4163-a750-ecdfec37d34d-must-gather-output\") pod \"605d8ac1-592a-4163-a750-ecdfec37d34d\" (UID: \"605d8ac1-592a-4163-a750-ecdfec37d34d\") " Mar 10 09:30:22 crc kubenswrapper[4825]: I0310 09:30:22.098540 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/605d8ac1-592a-4163-a750-ecdfec37d34d-kube-api-access-zwz4x" (OuterVolumeSpecName: "kube-api-access-zwz4x") pod "605d8ac1-592a-4163-a750-ecdfec37d34d" (UID: "605d8ac1-592a-4163-a750-ecdfec37d34d"). InnerVolumeSpecName "kube-api-access-zwz4x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:30:22 crc kubenswrapper[4825]: I0310 09:30:22.193598 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwz4x\" (UniqueName: \"kubernetes.io/projected/605d8ac1-592a-4163-a750-ecdfec37d34d-kube-api-access-zwz4x\") on node \"crc\" DevicePath \"\"" Mar 10 09:30:22 crc kubenswrapper[4825]: I0310 09:30:22.277118 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/605d8ac1-592a-4163-a750-ecdfec37d34d-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "605d8ac1-592a-4163-a750-ecdfec37d34d" (UID: "605d8ac1-592a-4163-a750-ecdfec37d34d"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:30:22 crc kubenswrapper[4825]: I0310 09:30:22.296697 4825 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/605d8ac1-592a-4163-a750-ecdfec37d34d-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 10 09:30:22 crc kubenswrapper[4825]: I0310 09:30:22.823054 4825 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-r6wjf_must-gather-kkm2h_605d8ac1-592a-4163-a750-ecdfec37d34d/copy/0.log" Mar 10 09:30:22 crc kubenswrapper[4825]: I0310 09:30:22.823722 4825 scope.go:117] "RemoveContainer" containerID="f643552ef8085f2c4326cbbd990eae6f91c9ca2c7fbd46a93530ac2986257360" Mar 10 09:30:22 crc kubenswrapper[4825]: I0310 09:30:22.823788 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r6wjf/must-gather-kkm2h" Mar 10 09:30:22 crc kubenswrapper[4825]: I0310 09:30:22.845291 4825 scope.go:117] "RemoveContainer" containerID="dd3c3fdff94a987089f78265fc591a0367f6c5967711ec969b7ddaed28860fa2" Mar 10 09:30:23 crc kubenswrapper[4825]: I0310 09:30:23.247977 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="605d8ac1-592a-4163-a750-ecdfec37d34d" path="/var/lib/kubelet/pods/605d8ac1-592a-4163-a750-ecdfec37d34d/volumes" Mar 10 09:30:30 crc kubenswrapper[4825]: I0310 09:30:30.236573 4825 scope.go:117] "RemoveContainer" containerID="15cd81bfb1d58ce096026917f6cfb5d65358bfb886f1c139a5a4b17f5671d715" Mar 10 09:30:30 crc kubenswrapper[4825]: E0310 09:30:30.237426 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" 
podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:30:41 crc kubenswrapper[4825]: I0310 09:30:41.236494 4825 scope.go:117] "RemoveContainer" containerID="15cd81bfb1d58ce096026917f6cfb5d65358bfb886f1c139a5a4b17f5671d715" Mar 10 09:30:41 crc kubenswrapper[4825]: E0310 09:30:41.237175 4825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bvt9j_openshift-machine-config-operator(9beb5814-89d0-47c0-8b0e-24376a358fc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" Mar 10 09:30:41 crc kubenswrapper[4825]: I0310 09:30:41.456255 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-p7r65"] Mar 10 09:30:41 crc kubenswrapper[4825]: E0310 09:30:41.456782 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3265220c-20d9-42f2-9b4c-f0751565ee20" containerName="oc" Mar 10 09:30:41 crc kubenswrapper[4825]: I0310 09:30:41.456807 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="3265220c-20d9-42f2-9b4c-f0751565ee20" containerName="oc" Mar 10 09:30:41 crc kubenswrapper[4825]: E0310 09:30:41.456822 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="605d8ac1-592a-4163-a750-ecdfec37d34d" containerName="copy" Mar 10 09:30:41 crc kubenswrapper[4825]: I0310 09:30:41.456831 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="605d8ac1-592a-4163-a750-ecdfec37d34d" containerName="copy" Mar 10 09:30:41 crc kubenswrapper[4825]: E0310 09:30:41.456852 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16247d70-8ed3-4c74-98bb-51d4fb5c8194" containerName="collect-profiles" Mar 10 09:30:41 crc kubenswrapper[4825]: I0310 09:30:41.456860 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="16247d70-8ed3-4c74-98bb-51d4fb5c8194" 
containerName="collect-profiles" Mar 10 09:30:41 crc kubenswrapper[4825]: E0310 09:30:41.456889 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="605d8ac1-592a-4163-a750-ecdfec37d34d" containerName="gather" Mar 10 09:30:41 crc kubenswrapper[4825]: I0310 09:30:41.456897 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="605d8ac1-592a-4163-a750-ecdfec37d34d" containerName="gather" Mar 10 09:30:41 crc kubenswrapper[4825]: I0310 09:30:41.457125 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="605d8ac1-592a-4163-a750-ecdfec37d34d" containerName="gather" Mar 10 09:30:41 crc kubenswrapper[4825]: I0310 09:30:41.457183 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="3265220c-20d9-42f2-9b4c-f0751565ee20" containerName="oc" Mar 10 09:30:41 crc kubenswrapper[4825]: I0310 09:30:41.457207 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="16247d70-8ed3-4c74-98bb-51d4fb5c8194" containerName="collect-profiles" Mar 10 09:30:41 crc kubenswrapper[4825]: I0310 09:30:41.457227 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="605d8ac1-592a-4163-a750-ecdfec37d34d" containerName="copy" Mar 10 09:30:41 crc kubenswrapper[4825]: I0310 09:30:41.459395 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p7r65" Mar 10 09:30:41 crc kubenswrapper[4825]: I0310 09:30:41.476672 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p7r65"] Mar 10 09:30:41 crc kubenswrapper[4825]: I0310 09:30:41.617839 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b56b504-a6f9-4aa8-a404-617687c5e887-utilities\") pod \"certified-operators-p7r65\" (UID: \"0b56b504-a6f9-4aa8-a404-617687c5e887\") " pod="openshift-marketplace/certified-operators-p7r65" Mar 10 09:30:41 crc kubenswrapper[4825]: I0310 09:30:41.618060 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw6jh\" (UniqueName: \"kubernetes.io/projected/0b56b504-a6f9-4aa8-a404-617687c5e887-kube-api-access-tw6jh\") pod \"certified-operators-p7r65\" (UID: \"0b56b504-a6f9-4aa8-a404-617687c5e887\") " pod="openshift-marketplace/certified-operators-p7r65" Mar 10 09:30:41 crc kubenswrapper[4825]: I0310 09:30:41.618270 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b56b504-a6f9-4aa8-a404-617687c5e887-catalog-content\") pod \"certified-operators-p7r65\" (UID: \"0b56b504-a6f9-4aa8-a404-617687c5e887\") " pod="openshift-marketplace/certified-operators-p7r65" Mar 10 09:30:41 crc kubenswrapper[4825]: I0310 09:30:41.720492 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b56b504-a6f9-4aa8-a404-617687c5e887-utilities\") pod \"certified-operators-p7r65\" (UID: \"0b56b504-a6f9-4aa8-a404-617687c5e887\") " pod="openshift-marketplace/certified-operators-p7r65" Mar 10 09:30:41 crc kubenswrapper[4825]: I0310 09:30:41.720596 4825 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-tw6jh\" (UniqueName: \"kubernetes.io/projected/0b56b504-a6f9-4aa8-a404-617687c5e887-kube-api-access-tw6jh\") pod \"certified-operators-p7r65\" (UID: \"0b56b504-a6f9-4aa8-a404-617687c5e887\") " pod="openshift-marketplace/certified-operators-p7r65" Mar 10 09:30:41 crc kubenswrapper[4825]: I0310 09:30:41.720640 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b56b504-a6f9-4aa8-a404-617687c5e887-catalog-content\") pod \"certified-operators-p7r65\" (UID: \"0b56b504-a6f9-4aa8-a404-617687c5e887\") " pod="openshift-marketplace/certified-operators-p7r65" Mar 10 09:30:41 crc kubenswrapper[4825]: I0310 09:30:41.720947 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b56b504-a6f9-4aa8-a404-617687c5e887-utilities\") pod \"certified-operators-p7r65\" (UID: \"0b56b504-a6f9-4aa8-a404-617687c5e887\") " pod="openshift-marketplace/certified-operators-p7r65" Mar 10 09:30:41 crc kubenswrapper[4825]: I0310 09:30:41.721153 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b56b504-a6f9-4aa8-a404-617687c5e887-catalog-content\") pod \"certified-operators-p7r65\" (UID: \"0b56b504-a6f9-4aa8-a404-617687c5e887\") " pod="openshift-marketplace/certified-operators-p7r65" Mar 10 09:30:41 crc kubenswrapper[4825]: I0310 09:30:41.739840 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw6jh\" (UniqueName: \"kubernetes.io/projected/0b56b504-a6f9-4aa8-a404-617687c5e887-kube-api-access-tw6jh\") pod \"certified-operators-p7r65\" (UID: \"0b56b504-a6f9-4aa8-a404-617687c5e887\") " pod="openshift-marketplace/certified-operators-p7r65" Mar 10 09:30:41 crc kubenswrapper[4825]: I0310 09:30:41.782771 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p7r65" Mar 10 09:30:42 crc kubenswrapper[4825]: I0310 09:30:42.287605 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p7r65"] Mar 10 09:30:43 crc kubenswrapper[4825]: I0310 09:30:43.025442 4825 generic.go:334] "Generic (PLEG): container finished" podID="0b56b504-a6f9-4aa8-a404-617687c5e887" containerID="4447ea2e7c49cb5daa7ae85ab00418d27a0ab27e5bf7fb8e0b3585110715e058" exitCode=0 Mar 10 09:30:43 crc kubenswrapper[4825]: I0310 09:30:43.025483 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7r65" event={"ID":"0b56b504-a6f9-4aa8-a404-617687c5e887","Type":"ContainerDied","Data":"4447ea2e7c49cb5daa7ae85ab00418d27a0ab27e5bf7fb8e0b3585110715e058"} Mar 10 09:30:43 crc kubenswrapper[4825]: I0310 09:30:43.025512 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7r65" event={"ID":"0b56b504-a6f9-4aa8-a404-617687c5e887","Type":"ContainerStarted","Data":"1a5dcfc8dc6efce41c59831ab69865b990fb4375c34e456254f94682ab67cd73"} Mar 10 09:30:45 crc kubenswrapper[4825]: I0310 09:30:45.044011 4825 generic.go:334] "Generic (PLEG): container finished" podID="0b56b504-a6f9-4aa8-a404-617687c5e887" containerID="54fe6abdebdcfcab8408cf826ba2ae70241b72c2c4a6fecc99fbe5ad15ce07c3" exitCode=0 Mar 10 09:30:45 crc kubenswrapper[4825]: I0310 09:30:45.044086 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7r65" event={"ID":"0b56b504-a6f9-4aa8-a404-617687c5e887","Type":"ContainerDied","Data":"54fe6abdebdcfcab8408cf826ba2ae70241b72c2c4a6fecc99fbe5ad15ce07c3"} Mar 10 09:30:46 crc kubenswrapper[4825]: I0310 09:30:46.057988 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7r65" 
event={"ID":"0b56b504-a6f9-4aa8-a404-617687c5e887","Type":"ContainerStarted","Data":"4f82d5ea1806ca12e0014ff998a1dfa087783a58019fe678f6fd06891cf304d8"} Mar 10 09:30:46 crc kubenswrapper[4825]: I0310 09:30:46.106054 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p7r65" podStartSLOduration=2.676078366 podStartE2EDuration="5.106033378s" podCreationTimestamp="2026-03-10 09:30:41 +0000 UTC" firstStartedPulling="2026-03-10 09:30:43.029021761 +0000 UTC m=+9996.058802376" lastFinishedPulling="2026-03-10 09:30:45.458976733 +0000 UTC m=+9998.488757388" observedRunningTime="2026-03-10 09:30:46.097579803 +0000 UTC m=+9999.127360418" watchObservedRunningTime="2026-03-10 09:30:46.106033378 +0000 UTC m=+9999.135813993" Mar 10 09:30:51 crc kubenswrapper[4825]: I0310 09:30:51.783702 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-p7r65" Mar 10 09:30:51 crc kubenswrapper[4825]: I0310 09:30:51.784203 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p7r65" Mar 10 09:30:51 crc kubenswrapper[4825]: I0310 09:30:51.832036 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p7r65" Mar 10 09:30:52 crc kubenswrapper[4825]: I0310 09:30:52.183347 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-p7r65" Mar 10 09:30:52 crc kubenswrapper[4825]: I0310 09:30:52.225421 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p7r65"] Mar 10 09:30:52 crc kubenswrapper[4825]: I0310 09:30:52.236196 4825 scope.go:117] "RemoveContainer" containerID="15cd81bfb1d58ce096026917f6cfb5d65358bfb886f1c139a5a4b17f5671d715" Mar 10 09:30:53 crc kubenswrapper[4825]: I0310 09:30:53.137901 4825 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" event={"ID":"9beb5814-89d0-47c0-8b0e-24376a358fc3","Type":"ContainerStarted","Data":"de2d5ce1f1803378177d19a0a77add8e873c8417c865f42437c59df82015ffd2"} Mar 10 09:30:54 crc kubenswrapper[4825]: I0310 09:30:54.146889 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-p7r65" podUID="0b56b504-a6f9-4aa8-a404-617687c5e887" containerName="registry-server" containerID="cri-o://4f82d5ea1806ca12e0014ff998a1dfa087783a58019fe678f6fd06891cf304d8" gracePeriod=2 Mar 10 09:30:54 crc kubenswrapper[4825]: I0310 09:30:54.685216 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p7r65" Mar 10 09:30:54 crc kubenswrapper[4825]: I0310 09:30:54.819178 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b56b504-a6f9-4aa8-a404-617687c5e887-utilities\") pod \"0b56b504-a6f9-4aa8-a404-617687c5e887\" (UID: \"0b56b504-a6f9-4aa8-a404-617687c5e887\") " Mar 10 09:30:54 crc kubenswrapper[4825]: I0310 09:30:54.819476 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tw6jh\" (UniqueName: \"kubernetes.io/projected/0b56b504-a6f9-4aa8-a404-617687c5e887-kube-api-access-tw6jh\") pod \"0b56b504-a6f9-4aa8-a404-617687c5e887\" (UID: \"0b56b504-a6f9-4aa8-a404-617687c5e887\") " Mar 10 09:30:54 crc kubenswrapper[4825]: I0310 09:30:54.819519 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b56b504-a6f9-4aa8-a404-617687c5e887-catalog-content\") pod \"0b56b504-a6f9-4aa8-a404-617687c5e887\" (UID: \"0b56b504-a6f9-4aa8-a404-617687c5e887\") " Mar 10 09:30:54 crc kubenswrapper[4825]: I0310 09:30:54.820654 4825 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/0b56b504-a6f9-4aa8-a404-617687c5e887-utilities" (OuterVolumeSpecName: "utilities") pod "0b56b504-a6f9-4aa8-a404-617687c5e887" (UID: "0b56b504-a6f9-4aa8-a404-617687c5e887"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:30:54 crc kubenswrapper[4825]: I0310 09:30:54.834494 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b56b504-a6f9-4aa8-a404-617687c5e887-kube-api-access-tw6jh" (OuterVolumeSpecName: "kube-api-access-tw6jh") pod "0b56b504-a6f9-4aa8-a404-617687c5e887" (UID: "0b56b504-a6f9-4aa8-a404-617687c5e887"). InnerVolumeSpecName "kube-api-access-tw6jh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:30:54 crc kubenswrapper[4825]: I0310 09:30:54.921800 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tw6jh\" (UniqueName: \"kubernetes.io/projected/0b56b504-a6f9-4aa8-a404-617687c5e887-kube-api-access-tw6jh\") on node \"crc\" DevicePath \"\"" Mar 10 09:30:54 crc kubenswrapper[4825]: I0310 09:30:54.921848 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b56b504-a6f9-4aa8-a404-617687c5e887-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:30:55 crc kubenswrapper[4825]: I0310 09:30:55.027730 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b56b504-a6f9-4aa8-a404-617687c5e887-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0b56b504-a6f9-4aa8-a404-617687c5e887" (UID: "0b56b504-a6f9-4aa8-a404-617687c5e887"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:30:55 crc kubenswrapper[4825]: I0310 09:30:55.127005 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b56b504-a6f9-4aa8-a404-617687c5e887-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:30:55 crc kubenswrapper[4825]: I0310 09:30:55.161733 4825 generic.go:334] "Generic (PLEG): container finished" podID="0b56b504-a6f9-4aa8-a404-617687c5e887" containerID="4f82d5ea1806ca12e0014ff998a1dfa087783a58019fe678f6fd06891cf304d8" exitCode=0 Mar 10 09:30:55 crc kubenswrapper[4825]: I0310 09:30:55.161799 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7r65" event={"ID":"0b56b504-a6f9-4aa8-a404-617687c5e887","Type":"ContainerDied","Data":"4f82d5ea1806ca12e0014ff998a1dfa087783a58019fe678f6fd06891cf304d8"} Mar 10 09:30:55 crc kubenswrapper[4825]: I0310 09:30:55.161936 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7r65" event={"ID":"0b56b504-a6f9-4aa8-a404-617687c5e887","Type":"ContainerDied","Data":"1a5dcfc8dc6efce41c59831ab69865b990fb4375c34e456254f94682ab67cd73"} Mar 10 09:30:55 crc kubenswrapper[4825]: I0310 09:30:55.161840 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p7r65" Mar 10 09:30:55 crc kubenswrapper[4825]: I0310 09:30:55.162017 4825 scope.go:117] "RemoveContainer" containerID="4f82d5ea1806ca12e0014ff998a1dfa087783a58019fe678f6fd06891cf304d8" Mar 10 09:30:55 crc kubenswrapper[4825]: I0310 09:30:55.204060 4825 scope.go:117] "RemoveContainer" containerID="54fe6abdebdcfcab8408cf826ba2ae70241b72c2c4a6fecc99fbe5ad15ce07c3" Mar 10 09:30:55 crc kubenswrapper[4825]: I0310 09:30:55.223491 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p7r65"] Mar 10 09:30:55 crc kubenswrapper[4825]: I0310 09:30:55.252666 4825 scope.go:117] "RemoveContainer" containerID="4447ea2e7c49cb5daa7ae85ab00418d27a0ab27e5bf7fb8e0b3585110715e058" Mar 10 09:30:55 crc kubenswrapper[4825]: I0310 09:30:55.269413 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-p7r65"] Mar 10 09:30:55 crc kubenswrapper[4825]: I0310 09:30:55.314062 4825 scope.go:117] "RemoveContainer" containerID="4f82d5ea1806ca12e0014ff998a1dfa087783a58019fe678f6fd06891cf304d8" Mar 10 09:30:55 crc kubenswrapper[4825]: E0310 09:30:55.314729 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f82d5ea1806ca12e0014ff998a1dfa087783a58019fe678f6fd06891cf304d8\": container with ID starting with 4f82d5ea1806ca12e0014ff998a1dfa087783a58019fe678f6fd06891cf304d8 not found: ID does not exist" containerID="4f82d5ea1806ca12e0014ff998a1dfa087783a58019fe678f6fd06891cf304d8" Mar 10 09:30:55 crc kubenswrapper[4825]: I0310 09:30:55.314786 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f82d5ea1806ca12e0014ff998a1dfa087783a58019fe678f6fd06891cf304d8"} err="failed to get container status \"4f82d5ea1806ca12e0014ff998a1dfa087783a58019fe678f6fd06891cf304d8\": rpc error: code = NotFound desc = could not find 
container \"4f82d5ea1806ca12e0014ff998a1dfa087783a58019fe678f6fd06891cf304d8\": container with ID starting with 4f82d5ea1806ca12e0014ff998a1dfa087783a58019fe678f6fd06891cf304d8 not found: ID does not exist" Mar 10 09:30:55 crc kubenswrapper[4825]: I0310 09:30:55.314823 4825 scope.go:117] "RemoveContainer" containerID="54fe6abdebdcfcab8408cf826ba2ae70241b72c2c4a6fecc99fbe5ad15ce07c3" Mar 10 09:30:55 crc kubenswrapper[4825]: E0310 09:30:55.315249 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54fe6abdebdcfcab8408cf826ba2ae70241b72c2c4a6fecc99fbe5ad15ce07c3\": container with ID starting with 54fe6abdebdcfcab8408cf826ba2ae70241b72c2c4a6fecc99fbe5ad15ce07c3 not found: ID does not exist" containerID="54fe6abdebdcfcab8408cf826ba2ae70241b72c2c4a6fecc99fbe5ad15ce07c3" Mar 10 09:30:55 crc kubenswrapper[4825]: I0310 09:30:55.315285 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54fe6abdebdcfcab8408cf826ba2ae70241b72c2c4a6fecc99fbe5ad15ce07c3"} err="failed to get container status \"54fe6abdebdcfcab8408cf826ba2ae70241b72c2c4a6fecc99fbe5ad15ce07c3\": rpc error: code = NotFound desc = could not find container \"54fe6abdebdcfcab8408cf826ba2ae70241b72c2c4a6fecc99fbe5ad15ce07c3\": container with ID starting with 54fe6abdebdcfcab8408cf826ba2ae70241b72c2c4a6fecc99fbe5ad15ce07c3 not found: ID does not exist" Mar 10 09:30:55 crc kubenswrapper[4825]: I0310 09:30:55.315310 4825 scope.go:117] "RemoveContainer" containerID="4447ea2e7c49cb5daa7ae85ab00418d27a0ab27e5bf7fb8e0b3585110715e058" Mar 10 09:30:55 crc kubenswrapper[4825]: E0310 09:30:55.315860 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4447ea2e7c49cb5daa7ae85ab00418d27a0ab27e5bf7fb8e0b3585110715e058\": container with ID starting with 4447ea2e7c49cb5daa7ae85ab00418d27a0ab27e5bf7fb8e0b3585110715e058 not found: ID does 
not exist" containerID="4447ea2e7c49cb5daa7ae85ab00418d27a0ab27e5bf7fb8e0b3585110715e058" Mar 10 09:30:55 crc kubenswrapper[4825]: I0310 09:30:55.315899 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4447ea2e7c49cb5daa7ae85ab00418d27a0ab27e5bf7fb8e0b3585110715e058"} err="failed to get container status \"4447ea2e7c49cb5daa7ae85ab00418d27a0ab27e5bf7fb8e0b3585110715e058\": rpc error: code = NotFound desc = could not find container \"4447ea2e7c49cb5daa7ae85ab00418d27a0ab27e5bf7fb8e0b3585110715e058\": container with ID starting with 4447ea2e7c49cb5daa7ae85ab00418d27a0ab27e5bf7fb8e0b3585110715e058 not found: ID does not exist" Mar 10 09:30:57 crc kubenswrapper[4825]: I0310 09:30:57.254953 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b56b504-a6f9-4aa8-a404-617687c5e887" path="/var/lib/kubelet/pods/0b56b504-a6f9-4aa8-a404-617687c5e887/volumes" Mar 10 09:31:42 crc kubenswrapper[4825]: I0310 09:31:42.324569 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-s8wlv"] Mar 10 09:31:42 crc kubenswrapper[4825]: E0310 09:31:42.325491 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b56b504-a6f9-4aa8-a404-617687c5e887" containerName="extract-utilities" Mar 10 09:31:42 crc kubenswrapper[4825]: I0310 09:31:42.325506 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b56b504-a6f9-4aa8-a404-617687c5e887" containerName="extract-utilities" Mar 10 09:31:42 crc kubenswrapper[4825]: E0310 09:31:42.325534 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b56b504-a6f9-4aa8-a404-617687c5e887" containerName="registry-server" Mar 10 09:31:42 crc kubenswrapper[4825]: I0310 09:31:42.325544 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b56b504-a6f9-4aa8-a404-617687c5e887" containerName="registry-server" Mar 10 09:31:42 crc kubenswrapper[4825]: E0310 09:31:42.325566 4825 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="0b56b504-a6f9-4aa8-a404-617687c5e887" containerName="extract-content" Mar 10 09:31:42 crc kubenswrapper[4825]: I0310 09:31:42.325574 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b56b504-a6f9-4aa8-a404-617687c5e887" containerName="extract-content" Mar 10 09:31:42 crc kubenswrapper[4825]: I0310 09:31:42.325828 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b56b504-a6f9-4aa8-a404-617687c5e887" containerName="registry-server" Mar 10 09:31:42 crc kubenswrapper[4825]: I0310 09:31:42.327788 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s8wlv" Mar 10 09:31:42 crc kubenswrapper[4825]: I0310 09:31:42.345264 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s8wlv"] Mar 10 09:31:42 crc kubenswrapper[4825]: I0310 09:31:42.465591 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/434afba3-bdfc-44d7-b0b9-9b9c309221d9-catalog-content\") pod \"redhat-operators-s8wlv\" (UID: \"434afba3-bdfc-44d7-b0b9-9b9c309221d9\") " pod="openshift-marketplace/redhat-operators-s8wlv" Mar 10 09:31:42 crc kubenswrapper[4825]: I0310 09:31:42.465638 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh87n\" (UniqueName: \"kubernetes.io/projected/434afba3-bdfc-44d7-b0b9-9b9c309221d9-kube-api-access-fh87n\") pod \"redhat-operators-s8wlv\" (UID: \"434afba3-bdfc-44d7-b0b9-9b9c309221d9\") " pod="openshift-marketplace/redhat-operators-s8wlv" Mar 10 09:31:42 crc kubenswrapper[4825]: I0310 09:31:42.465837 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/434afba3-bdfc-44d7-b0b9-9b9c309221d9-utilities\") pod 
\"redhat-operators-s8wlv\" (UID: \"434afba3-bdfc-44d7-b0b9-9b9c309221d9\") " pod="openshift-marketplace/redhat-operators-s8wlv" Mar 10 09:31:42 crc kubenswrapper[4825]: I0310 09:31:42.567850 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/434afba3-bdfc-44d7-b0b9-9b9c309221d9-utilities\") pod \"redhat-operators-s8wlv\" (UID: \"434afba3-bdfc-44d7-b0b9-9b9c309221d9\") " pod="openshift-marketplace/redhat-operators-s8wlv" Mar 10 09:31:42 crc kubenswrapper[4825]: I0310 09:31:42.568009 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/434afba3-bdfc-44d7-b0b9-9b9c309221d9-catalog-content\") pod \"redhat-operators-s8wlv\" (UID: \"434afba3-bdfc-44d7-b0b9-9b9c309221d9\") " pod="openshift-marketplace/redhat-operators-s8wlv" Mar 10 09:31:42 crc kubenswrapper[4825]: I0310 09:31:42.568034 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh87n\" (UniqueName: \"kubernetes.io/projected/434afba3-bdfc-44d7-b0b9-9b9c309221d9-kube-api-access-fh87n\") pod \"redhat-operators-s8wlv\" (UID: \"434afba3-bdfc-44d7-b0b9-9b9c309221d9\") " pod="openshift-marketplace/redhat-operators-s8wlv" Mar 10 09:31:42 crc kubenswrapper[4825]: I0310 09:31:42.568641 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/434afba3-bdfc-44d7-b0b9-9b9c309221d9-utilities\") pod \"redhat-operators-s8wlv\" (UID: \"434afba3-bdfc-44d7-b0b9-9b9c309221d9\") " pod="openshift-marketplace/redhat-operators-s8wlv" Mar 10 09:31:42 crc kubenswrapper[4825]: I0310 09:31:42.568694 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/434afba3-bdfc-44d7-b0b9-9b9c309221d9-catalog-content\") pod \"redhat-operators-s8wlv\" (UID: 
\"434afba3-bdfc-44d7-b0b9-9b9c309221d9\") " pod="openshift-marketplace/redhat-operators-s8wlv" Mar 10 09:31:43 crc kubenswrapper[4825]: I0310 09:31:43.064718 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh87n\" (UniqueName: \"kubernetes.io/projected/434afba3-bdfc-44d7-b0b9-9b9c309221d9-kube-api-access-fh87n\") pod \"redhat-operators-s8wlv\" (UID: \"434afba3-bdfc-44d7-b0b9-9b9c309221d9\") " pod="openshift-marketplace/redhat-operators-s8wlv" Mar 10 09:31:43 crc kubenswrapper[4825]: I0310 09:31:43.253870 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s8wlv" Mar 10 09:31:43 crc kubenswrapper[4825]: I0310 09:31:43.729941 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s8wlv"] Mar 10 09:31:44 crc kubenswrapper[4825]: I0310 09:31:44.701202 4825 generic.go:334] "Generic (PLEG): container finished" podID="434afba3-bdfc-44d7-b0b9-9b9c309221d9" containerID="276427d40da455a78757a30950f65440e205e3e2d0e095057080c4ab74504198" exitCode=0 Mar 10 09:31:44 crc kubenswrapper[4825]: I0310 09:31:44.701334 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8wlv" event={"ID":"434afba3-bdfc-44d7-b0b9-9b9c309221d9","Type":"ContainerDied","Data":"276427d40da455a78757a30950f65440e205e3e2d0e095057080c4ab74504198"} Mar 10 09:31:44 crc kubenswrapper[4825]: I0310 09:31:44.701463 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8wlv" event={"ID":"434afba3-bdfc-44d7-b0b9-9b9c309221d9","Type":"ContainerStarted","Data":"88eefb09e45d72d8c47311d0d8ff3339c21f726561ceabd7bbbed136dce53ecc"} Mar 10 09:31:45 crc kubenswrapper[4825]: I0310 09:31:45.713667 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8wlv" 
event={"ID":"434afba3-bdfc-44d7-b0b9-9b9c309221d9","Type":"ContainerStarted","Data":"a60edfd590a9633149929f3620e10aec62806909513d88ed54ec618303f65001"} Mar 10 09:31:48 crc kubenswrapper[4825]: I0310 09:31:48.740638 4825 generic.go:334] "Generic (PLEG): container finished" podID="434afba3-bdfc-44d7-b0b9-9b9c309221d9" containerID="a60edfd590a9633149929f3620e10aec62806909513d88ed54ec618303f65001" exitCode=0 Mar 10 09:31:48 crc kubenswrapper[4825]: I0310 09:31:48.740698 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8wlv" event={"ID":"434afba3-bdfc-44d7-b0b9-9b9c309221d9","Type":"ContainerDied","Data":"a60edfd590a9633149929f3620e10aec62806909513d88ed54ec618303f65001"} Mar 10 09:31:49 crc kubenswrapper[4825]: I0310 09:31:49.753227 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8wlv" event={"ID":"434afba3-bdfc-44d7-b0b9-9b9c309221d9","Type":"ContainerStarted","Data":"bab2c7c2bcbf7f2ec026bdd57eeb1510ddf573a3f7b389b7187272594a1de746"} Mar 10 09:31:49 crc kubenswrapper[4825]: I0310 09:31:49.777556 4825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-s8wlv" podStartSLOduration=3.331478501 podStartE2EDuration="7.777538646s" podCreationTimestamp="2026-03-10 09:31:42 +0000 UTC" firstStartedPulling="2026-03-10 09:31:44.705400046 +0000 UTC m=+10057.735180661" lastFinishedPulling="2026-03-10 09:31:49.151460191 +0000 UTC m=+10062.181240806" observedRunningTime="2026-03-10 09:31:49.772110631 +0000 UTC m=+10062.801891236" watchObservedRunningTime="2026-03-10 09:31:49.777538646 +0000 UTC m=+10062.807319261" Mar 10 09:31:53 crc kubenswrapper[4825]: I0310 09:31:53.255395 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-s8wlv" Mar 10 09:31:53 crc kubenswrapper[4825]: I0310 09:31:53.256149 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-operators-s8wlv" Mar 10 09:31:54 crc kubenswrapper[4825]: I0310 09:31:54.306411 4825 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-s8wlv" podUID="434afba3-bdfc-44d7-b0b9-9b9c309221d9" containerName="registry-server" probeResult="failure" output=< Mar 10 09:31:54 crc kubenswrapper[4825]: timeout: failed to connect service ":50051" within 1s Mar 10 09:31:54 crc kubenswrapper[4825]: > Mar 10 09:32:00 crc kubenswrapper[4825]: I0310 09:32:00.166516 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552252-97txb"] Mar 10 09:32:00 crc kubenswrapper[4825]: I0310 09:32:00.168561 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552252-97txb" Mar 10 09:32:00 crc kubenswrapper[4825]: I0310 09:32:00.171815 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:32:00 crc kubenswrapper[4825]: I0310 09:32:00.172388 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 09:32:00 crc kubenswrapper[4825]: I0310 09:32:00.176622 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:32:00 crc kubenswrapper[4825]: I0310 09:32:00.184448 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552252-97txb"] Mar 10 09:32:00 crc kubenswrapper[4825]: I0310 09:32:00.268862 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbkql\" (UniqueName: \"kubernetes.io/projected/fd97779e-5468-4f3e-bf07-0d4dad56170f-kube-api-access-qbkql\") pod \"auto-csr-approver-29552252-97txb\" (UID: \"fd97779e-5468-4f3e-bf07-0d4dad56170f\") " pod="openshift-infra/auto-csr-approver-29552252-97txb" Mar 10 09:32:00 
crc kubenswrapper[4825]: I0310 09:32:00.370808 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbkql\" (UniqueName: \"kubernetes.io/projected/fd97779e-5468-4f3e-bf07-0d4dad56170f-kube-api-access-qbkql\") pod \"auto-csr-approver-29552252-97txb\" (UID: \"fd97779e-5468-4f3e-bf07-0d4dad56170f\") " pod="openshift-infra/auto-csr-approver-29552252-97txb" Mar 10 09:32:00 crc kubenswrapper[4825]: I0310 09:32:00.389154 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbkql\" (UniqueName: \"kubernetes.io/projected/fd97779e-5468-4f3e-bf07-0d4dad56170f-kube-api-access-qbkql\") pod \"auto-csr-approver-29552252-97txb\" (UID: \"fd97779e-5468-4f3e-bf07-0d4dad56170f\") " pod="openshift-infra/auto-csr-approver-29552252-97txb" Mar 10 09:32:00 crc kubenswrapper[4825]: I0310 09:32:00.493440 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552252-97txb" Mar 10 09:32:01 crc kubenswrapper[4825]: I0310 09:32:01.038032 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552252-97txb"] Mar 10 09:32:01 crc kubenswrapper[4825]: I0310 09:32:01.045439 4825 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 09:32:01 crc kubenswrapper[4825]: I0310 09:32:01.870024 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552252-97txb" event={"ID":"fd97779e-5468-4f3e-bf07-0d4dad56170f","Type":"ContainerStarted","Data":"7ca276a5c259aec93cf3f82ed388eff0ecc78649c8a537a1d8504debd290cc27"} Mar 10 09:32:02 crc kubenswrapper[4825]: I0310 09:32:02.880375 4825 generic.go:334] "Generic (PLEG): container finished" podID="fd97779e-5468-4f3e-bf07-0d4dad56170f" containerID="e2d54f31171f83cb13371af25f242e24f451cabb60aae0598adfc66f8e869c81" exitCode=0 Mar 10 09:32:02 crc kubenswrapper[4825]: I0310 09:32:02.880449 4825 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552252-97txb" event={"ID":"fd97779e-5468-4f3e-bf07-0d4dad56170f","Type":"ContainerDied","Data":"e2d54f31171f83cb13371af25f242e24f451cabb60aae0598adfc66f8e869c81"} Mar 10 09:32:03 crc kubenswrapper[4825]: I0310 09:32:03.342021 4825 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-s8wlv" Mar 10 09:32:03 crc kubenswrapper[4825]: I0310 09:32:03.389883 4825 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-s8wlv" Mar 10 09:32:03 crc kubenswrapper[4825]: I0310 09:32:03.592242 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s8wlv"] Mar 10 09:32:04 crc kubenswrapper[4825]: I0310 09:32:04.345466 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552252-97txb" Mar 10 09:32:04 crc kubenswrapper[4825]: I0310 09:32:04.466609 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbkql\" (UniqueName: \"kubernetes.io/projected/fd97779e-5468-4f3e-bf07-0d4dad56170f-kube-api-access-qbkql\") pod \"fd97779e-5468-4f3e-bf07-0d4dad56170f\" (UID: \"fd97779e-5468-4f3e-bf07-0d4dad56170f\") " Mar 10 09:32:04 crc kubenswrapper[4825]: I0310 09:32:04.472435 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd97779e-5468-4f3e-bf07-0d4dad56170f-kube-api-access-qbkql" (OuterVolumeSpecName: "kube-api-access-qbkql") pod "fd97779e-5468-4f3e-bf07-0d4dad56170f" (UID: "fd97779e-5468-4f3e-bf07-0d4dad56170f"). InnerVolumeSpecName "kube-api-access-qbkql". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:32:04 crc kubenswrapper[4825]: I0310 09:32:04.569760 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbkql\" (UniqueName: \"kubernetes.io/projected/fd97779e-5468-4f3e-bf07-0d4dad56170f-kube-api-access-qbkql\") on node \"crc\" DevicePath \"\"" Mar 10 09:32:04 crc kubenswrapper[4825]: I0310 09:32:04.913566 4825 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-s8wlv" podUID="434afba3-bdfc-44d7-b0b9-9b9c309221d9" containerName="registry-server" containerID="cri-o://bab2c7c2bcbf7f2ec026bdd57eeb1510ddf573a3f7b389b7187272594a1de746" gracePeriod=2 Mar 10 09:32:04 crc kubenswrapper[4825]: I0310 09:32:04.914040 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552252-97txb" Mar 10 09:32:04 crc kubenswrapper[4825]: I0310 09:32:04.914278 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552252-97txb" event={"ID":"fd97779e-5468-4f3e-bf07-0d4dad56170f","Type":"ContainerDied","Data":"7ca276a5c259aec93cf3f82ed388eff0ecc78649c8a537a1d8504debd290cc27"} Mar 10 09:32:04 crc kubenswrapper[4825]: I0310 09:32:04.914310 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ca276a5c259aec93cf3f82ed388eff0ecc78649c8a537a1d8504debd290cc27" Mar 10 09:32:05 crc kubenswrapper[4825]: I0310 09:32:05.416498 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552246-gx7b6"] Mar 10 09:32:05 crc kubenswrapper[4825]: I0310 09:32:05.425598 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552246-gx7b6"] Mar 10 09:32:05 crc kubenswrapper[4825]: I0310 09:32:05.450570 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s8wlv" Mar 10 09:32:05 crc kubenswrapper[4825]: I0310 09:32:05.593680 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/434afba3-bdfc-44d7-b0b9-9b9c309221d9-utilities\") pod \"434afba3-bdfc-44d7-b0b9-9b9c309221d9\" (UID: \"434afba3-bdfc-44d7-b0b9-9b9c309221d9\") " Mar 10 09:32:05 crc kubenswrapper[4825]: I0310 09:32:05.593779 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fh87n\" (UniqueName: \"kubernetes.io/projected/434afba3-bdfc-44d7-b0b9-9b9c309221d9-kube-api-access-fh87n\") pod \"434afba3-bdfc-44d7-b0b9-9b9c309221d9\" (UID: \"434afba3-bdfc-44d7-b0b9-9b9c309221d9\") " Mar 10 09:32:05 crc kubenswrapper[4825]: I0310 09:32:05.593896 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/434afba3-bdfc-44d7-b0b9-9b9c309221d9-catalog-content\") pod \"434afba3-bdfc-44d7-b0b9-9b9c309221d9\" (UID: \"434afba3-bdfc-44d7-b0b9-9b9c309221d9\") " Mar 10 09:32:05 crc kubenswrapper[4825]: I0310 09:32:05.594574 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/434afba3-bdfc-44d7-b0b9-9b9c309221d9-utilities" (OuterVolumeSpecName: "utilities") pod "434afba3-bdfc-44d7-b0b9-9b9c309221d9" (UID: "434afba3-bdfc-44d7-b0b9-9b9c309221d9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:32:05 crc kubenswrapper[4825]: I0310 09:32:05.598365 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/434afba3-bdfc-44d7-b0b9-9b9c309221d9-kube-api-access-fh87n" (OuterVolumeSpecName: "kube-api-access-fh87n") pod "434afba3-bdfc-44d7-b0b9-9b9c309221d9" (UID: "434afba3-bdfc-44d7-b0b9-9b9c309221d9"). InnerVolumeSpecName "kube-api-access-fh87n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:32:05 crc kubenswrapper[4825]: I0310 09:32:05.696208 4825 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/434afba3-bdfc-44d7-b0b9-9b9c309221d9-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 09:32:05 crc kubenswrapper[4825]: I0310 09:32:05.696240 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fh87n\" (UniqueName: \"kubernetes.io/projected/434afba3-bdfc-44d7-b0b9-9b9c309221d9-kube-api-access-fh87n\") on node \"crc\" DevicePath \"\"" Mar 10 09:32:05 crc kubenswrapper[4825]: I0310 09:32:05.734248 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/434afba3-bdfc-44d7-b0b9-9b9c309221d9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "434afba3-bdfc-44d7-b0b9-9b9c309221d9" (UID: "434afba3-bdfc-44d7-b0b9-9b9c309221d9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 09:32:05 crc kubenswrapper[4825]: I0310 09:32:05.797585 4825 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/434afba3-bdfc-44d7-b0b9-9b9c309221d9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 09:32:05 crc kubenswrapper[4825]: I0310 09:32:05.928273 4825 generic.go:334] "Generic (PLEG): container finished" podID="434afba3-bdfc-44d7-b0b9-9b9c309221d9" containerID="bab2c7c2bcbf7f2ec026bdd57eeb1510ddf573a3f7b389b7187272594a1de746" exitCode=0 Mar 10 09:32:05 crc kubenswrapper[4825]: I0310 09:32:05.928339 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8wlv" event={"ID":"434afba3-bdfc-44d7-b0b9-9b9c309221d9","Type":"ContainerDied","Data":"bab2c7c2bcbf7f2ec026bdd57eeb1510ddf573a3f7b389b7187272594a1de746"} Mar 10 09:32:05 crc kubenswrapper[4825]: I0310 09:32:05.928385 4825 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-s8wlv" event={"ID":"434afba3-bdfc-44d7-b0b9-9b9c309221d9","Type":"ContainerDied","Data":"88eefb09e45d72d8c47311d0d8ff3339c21f726561ceabd7bbbed136dce53ecc"} Mar 10 09:32:05 crc kubenswrapper[4825]: I0310 09:32:05.928418 4825 scope.go:117] "RemoveContainer" containerID="bab2c7c2bcbf7f2ec026bdd57eeb1510ddf573a3f7b389b7187272594a1de746" Mar 10 09:32:05 crc kubenswrapper[4825]: I0310 09:32:05.928346 4825 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s8wlv" Mar 10 09:32:05 crc kubenswrapper[4825]: I0310 09:32:05.952919 4825 scope.go:117] "RemoveContainer" containerID="a60edfd590a9633149929f3620e10aec62806909513d88ed54ec618303f65001" Mar 10 09:32:05 crc kubenswrapper[4825]: I0310 09:32:05.975989 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s8wlv"] Mar 10 09:32:05 crc kubenswrapper[4825]: I0310 09:32:05.980716 4825 scope.go:117] "RemoveContainer" containerID="276427d40da455a78757a30950f65440e205e3e2d0e095057080c4ab74504198" Mar 10 09:32:05 crc kubenswrapper[4825]: I0310 09:32:05.988254 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-s8wlv"] Mar 10 09:32:06 crc kubenswrapper[4825]: I0310 09:32:06.030039 4825 scope.go:117] "RemoveContainer" containerID="bab2c7c2bcbf7f2ec026bdd57eeb1510ddf573a3f7b389b7187272594a1de746" Mar 10 09:32:06 crc kubenswrapper[4825]: E0310 09:32:06.030548 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bab2c7c2bcbf7f2ec026bdd57eeb1510ddf573a3f7b389b7187272594a1de746\": container with ID starting with bab2c7c2bcbf7f2ec026bdd57eeb1510ddf573a3f7b389b7187272594a1de746 not found: ID does not exist" containerID="bab2c7c2bcbf7f2ec026bdd57eeb1510ddf573a3f7b389b7187272594a1de746" Mar 10 09:32:06 crc kubenswrapper[4825]: I0310 09:32:06.030605 4825 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bab2c7c2bcbf7f2ec026bdd57eeb1510ddf573a3f7b389b7187272594a1de746"} err="failed to get container status \"bab2c7c2bcbf7f2ec026bdd57eeb1510ddf573a3f7b389b7187272594a1de746\": rpc error: code = NotFound desc = could not find container \"bab2c7c2bcbf7f2ec026bdd57eeb1510ddf573a3f7b389b7187272594a1de746\": container with ID starting with bab2c7c2bcbf7f2ec026bdd57eeb1510ddf573a3f7b389b7187272594a1de746 not found: ID does not exist" Mar 10 09:32:06 crc kubenswrapper[4825]: I0310 09:32:06.030636 4825 scope.go:117] "RemoveContainer" containerID="a60edfd590a9633149929f3620e10aec62806909513d88ed54ec618303f65001" Mar 10 09:32:06 crc kubenswrapper[4825]: E0310 09:32:06.031077 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a60edfd590a9633149929f3620e10aec62806909513d88ed54ec618303f65001\": container with ID starting with a60edfd590a9633149929f3620e10aec62806909513d88ed54ec618303f65001 not found: ID does not exist" containerID="a60edfd590a9633149929f3620e10aec62806909513d88ed54ec618303f65001" Mar 10 09:32:06 crc kubenswrapper[4825]: I0310 09:32:06.031112 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a60edfd590a9633149929f3620e10aec62806909513d88ed54ec618303f65001"} err="failed to get container status \"a60edfd590a9633149929f3620e10aec62806909513d88ed54ec618303f65001\": rpc error: code = NotFound desc = could not find container \"a60edfd590a9633149929f3620e10aec62806909513d88ed54ec618303f65001\": container with ID starting with a60edfd590a9633149929f3620e10aec62806909513d88ed54ec618303f65001 not found: ID does not exist" Mar 10 09:32:06 crc kubenswrapper[4825]: I0310 09:32:06.031150 4825 scope.go:117] "RemoveContainer" containerID="276427d40da455a78757a30950f65440e205e3e2d0e095057080c4ab74504198" Mar 10 09:32:06 crc kubenswrapper[4825]: E0310 
09:32:06.031530 4825 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"276427d40da455a78757a30950f65440e205e3e2d0e095057080c4ab74504198\": container with ID starting with 276427d40da455a78757a30950f65440e205e3e2d0e095057080c4ab74504198 not found: ID does not exist" containerID="276427d40da455a78757a30950f65440e205e3e2d0e095057080c4ab74504198" Mar 10 09:32:06 crc kubenswrapper[4825]: I0310 09:32:06.031565 4825 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"276427d40da455a78757a30950f65440e205e3e2d0e095057080c4ab74504198"} err="failed to get container status \"276427d40da455a78757a30950f65440e205e3e2d0e095057080c4ab74504198\": rpc error: code = NotFound desc = could not find container \"276427d40da455a78757a30950f65440e205e3e2d0e095057080c4ab74504198\": container with ID starting with 276427d40da455a78757a30950f65440e205e3e2d0e095057080c4ab74504198 not found: ID does not exist" Mar 10 09:32:07 crc kubenswrapper[4825]: I0310 09:32:07.251490 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="434afba3-bdfc-44d7-b0b9-9b9c309221d9" path="/var/lib/kubelet/pods/434afba3-bdfc-44d7-b0b9-9b9c309221d9/volumes" Mar 10 09:32:07 crc kubenswrapper[4825]: I0310 09:32:07.253696 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6771d618-87af-4f42-bbfd-fa0882be021b" path="/var/lib/kubelet/pods/6771d618-87af-4f42-bbfd-fa0882be021b/volumes" Mar 10 09:32:11 crc kubenswrapper[4825]: I0310 09:32:11.191208 4825 scope.go:117] "RemoveContainer" containerID="91db7283cc7c88f6dbb5ebaa7e4456911dcb4cc23d28b2e206e9a13ad56dcd51" Mar 10 09:33:16 crc kubenswrapper[4825]: I0310 09:33:16.888416 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Mar 10 09:33:16 crc kubenswrapper[4825]: I0310 09:33:16.888976 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:33:46 crc kubenswrapper[4825]: I0310 09:33:46.889024 4825 patch_prober.go:28] interesting pod/machine-config-daemon-bvt9j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 09:33:46 crc kubenswrapper[4825]: I0310 09:33:46.889902 4825 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bvt9j" podUID="9beb5814-89d0-47c0-8b0e-24376a358fc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 09:34:00 crc kubenswrapper[4825]: I0310 09:34:00.178359 4825 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552254-w5mvb"] Mar 10 09:34:00 crc kubenswrapper[4825]: E0310 09:34:00.179605 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="434afba3-bdfc-44d7-b0b9-9b9c309221d9" containerName="registry-server" Mar 10 09:34:00 crc kubenswrapper[4825]: I0310 09:34:00.179630 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="434afba3-bdfc-44d7-b0b9-9b9c309221d9" containerName="registry-server" Mar 10 09:34:00 crc kubenswrapper[4825]: E0310 09:34:00.179663 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd97779e-5468-4f3e-bf07-0d4dad56170f" containerName="oc" Mar 10 09:34:00 crc kubenswrapper[4825]: I0310 
09:34:00.179677 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd97779e-5468-4f3e-bf07-0d4dad56170f" containerName="oc" Mar 10 09:34:00 crc kubenswrapper[4825]: E0310 09:34:00.179718 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="434afba3-bdfc-44d7-b0b9-9b9c309221d9" containerName="extract-utilities" Mar 10 09:34:00 crc kubenswrapper[4825]: I0310 09:34:00.179734 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="434afba3-bdfc-44d7-b0b9-9b9c309221d9" containerName="extract-utilities" Mar 10 09:34:00 crc kubenswrapper[4825]: E0310 09:34:00.179781 4825 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="434afba3-bdfc-44d7-b0b9-9b9c309221d9" containerName="extract-content" Mar 10 09:34:00 crc kubenswrapper[4825]: I0310 09:34:00.179792 4825 state_mem.go:107] "Deleted CPUSet assignment" podUID="434afba3-bdfc-44d7-b0b9-9b9c309221d9" containerName="extract-content" Mar 10 09:34:00 crc kubenswrapper[4825]: I0310 09:34:00.180118 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="434afba3-bdfc-44d7-b0b9-9b9c309221d9" containerName="registry-server" Mar 10 09:34:00 crc kubenswrapper[4825]: I0310 09:34:00.180168 4825 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd97779e-5468-4f3e-bf07-0d4dad56170f" containerName="oc" Mar 10 09:34:00 crc kubenswrapper[4825]: I0310 09:34:00.181281 4825 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552254-w5mvb" Mar 10 09:34:00 crc kubenswrapper[4825]: I0310 09:34:00.183827 4825 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-nhthn" Mar 10 09:34:00 crc kubenswrapper[4825]: I0310 09:34:00.184350 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 09:34:00 crc kubenswrapper[4825]: I0310 09:34:00.184631 4825 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 09:34:00 crc kubenswrapper[4825]: I0310 09:34:00.190319 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552254-w5mvb"] Mar 10 09:34:00 crc kubenswrapper[4825]: I0310 09:34:00.333330 4825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tkff\" (UniqueName: \"kubernetes.io/projected/41e008c4-1638-48d2-a926-e2a1802d2504-kube-api-access-6tkff\") pod \"auto-csr-approver-29552254-w5mvb\" (UID: \"41e008c4-1638-48d2-a926-e2a1802d2504\") " pod="openshift-infra/auto-csr-approver-29552254-w5mvb" Mar 10 09:34:00 crc kubenswrapper[4825]: I0310 09:34:00.435801 4825 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tkff\" (UniqueName: \"kubernetes.io/projected/41e008c4-1638-48d2-a926-e2a1802d2504-kube-api-access-6tkff\") pod \"auto-csr-approver-29552254-w5mvb\" (UID: \"41e008c4-1638-48d2-a926-e2a1802d2504\") " pod="openshift-infra/auto-csr-approver-29552254-w5mvb" Mar 10 09:34:00 crc kubenswrapper[4825]: I0310 09:34:00.462688 4825 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tkff\" (UniqueName: \"kubernetes.io/projected/41e008c4-1638-48d2-a926-e2a1802d2504-kube-api-access-6tkff\") pod \"auto-csr-approver-29552254-w5mvb\" (UID: \"41e008c4-1638-48d2-a926-e2a1802d2504\") " 
pod="openshift-infra/auto-csr-approver-29552254-w5mvb" Mar 10 09:34:00 crc kubenswrapper[4825]: I0310 09:34:00.535336 4825 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552254-w5mvb" Mar 10 09:34:01 crc kubenswrapper[4825]: I0310 09:34:01.058491 4825 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552254-w5mvb"] Mar 10 09:34:01 crc kubenswrapper[4825]: I0310 09:34:01.323649 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552254-w5mvb" event={"ID":"41e008c4-1638-48d2-a926-e2a1802d2504","Type":"ContainerStarted","Data":"4f9f5c8a8dc89e3402b2cefdb3770a40138cc98dd919b170e6298db1192f460d"} Mar 10 09:34:03 crc kubenswrapper[4825]: I0310 09:34:03.342611 4825 generic.go:334] "Generic (PLEG): container finished" podID="41e008c4-1638-48d2-a926-e2a1802d2504" containerID="a0075ddb49ed59a165666d724997ace8ffe546014544f19246c98794377beafc" exitCode=0 Mar 10 09:34:03 crc kubenswrapper[4825]: I0310 09:34:03.342725 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552254-w5mvb" event={"ID":"41e008c4-1638-48d2-a926-e2a1802d2504","Type":"ContainerDied","Data":"a0075ddb49ed59a165666d724997ace8ffe546014544f19246c98794377beafc"} Mar 10 09:34:04 crc kubenswrapper[4825]: I0310 09:34:04.989760 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552254-w5mvb" Mar 10 09:34:05 crc kubenswrapper[4825]: I0310 09:34:05.143066 4825 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tkff\" (UniqueName: \"kubernetes.io/projected/41e008c4-1638-48d2-a926-e2a1802d2504-kube-api-access-6tkff\") pod \"41e008c4-1638-48d2-a926-e2a1802d2504\" (UID: \"41e008c4-1638-48d2-a926-e2a1802d2504\") " Mar 10 09:34:05 crc kubenswrapper[4825]: I0310 09:34:05.148896 4825 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41e008c4-1638-48d2-a926-e2a1802d2504-kube-api-access-6tkff" (OuterVolumeSpecName: "kube-api-access-6tkff") pod "41e008c4-1638-48d2-a926-e2a1802d2504" (UID: "41e008c4-1638-48d2-a926-e2a1802d2504"). InnerVolumeSpecName "kube-api-access-6tkff". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 09:34:05 crc kubenswrapper[4825]: I0310 09:34:05.245104 4825 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tkff\" (UniqueName: \"kubernetes.io/projected/41e008c4-1638-48d2-a926-e2a1802d2504-kube-api-access-6tkff\") on node \"crc\" DevicePath \"\"" Mar 10 09:34:05 crc kubenswrapper[4825]: I0310 09:34:05.362496 4825 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552254-w5mvb" event={"ID":"41e008c4-1638-48d2-a926-e2a1802d2504","Type":"ContainerDied","Data":"4f9f5c8a8dc89e3402b2cefdb3770a40138cc98dd919b170e6298db1192f460d"} Mar 10 09:34:05 crc kubenswrapper[4825]: I0310 09:34:05.362589 4825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f9f5c8a8dc89e3402b2cefdb3770a40138cc98dd919b170e6298db1192f460d" Mar 10 09:34:05 crc kubenswrapper[4825]: I0310 09:34:05.362602 4825 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552254-w5mvb" Mar 10 09:34:06 crc kubenswrapper[4825]: I0310 09:34:06.062484 4825 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552248-k9m67"] Mar 10 09:34:06 crc kubenswrapper[4825]: I0310 09:34:06.070951 4825 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552248-k9m67"] Mar 10 09:34:07 crc kubenswrapper[4825]: I0310 09:34:07.253577 4825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6ca031d-d17f-4b5d-891d-3d30afd18ac5" path="/var/lib/kubelet/pods/c6ca031d-d17f-4b5d-891d-3d30afd18ac5/volumes"